'Contradictions and Chaos' - 5/10/18
Episode Stats
Length
1 hour and 51 minutes
Words per Minute
160.84
Summary
Glenn Beck gives his thoughts on the release of three American hostages from North Korea and Iran's latest attack on Israel. He also talks about abortion and the upcoming referendum in Ireland, and much, much more.
Transcript
00:00:00.000
The Blaze Radio Network, on demand, Glenn Beck.
00:00:13.020
This was the sound last night about 3 a.m. Eastern Time.
00:00:22.640
I think the award for the busiest and most important job in the world today goes to Mike Pompeo.
00:00:28.260
He has been the Secretary of State for 14 days.
00:00:33.300
And yet he's manning the post of America's top diplomat at a time when war is a very real possibility in multiple places.
00:00:42.260
In Asia, he has already traveled to North Korea not once, but twice.
00:00:53.900
That's two more times than anybody from the previous two administrations.
00:01:03.080
Now the fruits of these labors in the last year or two with President Trump have been huge.
00:01:09.780
President Trump may become the first sitting U.S. president to sit down with a North Korean leader.
00:01:22.540
Pompeo returned from North Korea just last night.
00:01:25.040
This time, he arrived with three American citizens who had been held captive for years in a North Korean labor camp.
00:01:32.820
The last time the United States negotiated the release of U.S. captives, it involved the Obama administration, Iran, and a bribe of over $400 million in cash.
00:01:52.660
While the potential conflict in Asia seems to be going well, there's a second conflict.
00:02:04.840
As Pompeo was arriving in the United States last night, Iran was doing something they've never done before.
00:02:21.480
You're listening to the sound of Iranian rockets taking off from Syria on their way to northern Israel.
00:02:27.980
Last night, the Iranian Quds Force launched their first attack ever on Israel from inside Syria.
00:02:36.220
The Iron Dome surface-to-air defense system was very busy, engaging up to 20 missiles at a time.
00:02:42.820
The IDF has been anticipating this attack and immediately launched a counterattack.
00:02:48.020
There was an all-out strike on every Iranian Quds Force location inside of Syria.
00:03:01.100
You can bank on the fact that this is merely the opening salvo from both sides.
00:03:06.220
The next war in the Middle East will be between Israel and Iran.
00:03:11.660
The question is, did it officially begin last night?
00:03:16.580
So, to Mike Pompeo, probably the busiest and most important man in the world right now, good luck.
00:03:28.480
And no pressure or anything, but the fate of the world may rest in your hands.
00:03:56.520
I mean, there's a lot of truth that we can argue about.
00:04:11.680
Well, we're, you know, we're living in a multiverse.
00:04:19.240
So, there's some things that we can argue about, but we can never prove.
00:04:22.680
And we all have to stop pretending that we know the answer.
00:04:27.080
We know the answer that we're comfortable with.
00:04:29.500
But there are some things where the truth matters.
00:04:38.560
And we are now living in a postmodern world that teaches us there is no truth.
00:04:47.200
You know, one man's terrorist is another man's freedom fighter.
00:04:50.700
Well, I think Israel was wrong in this.
00:04:57.360
Well, no, we can take the facts of a situation.
00:05:03.240
And we can look at where is the poison coming from?
00:05:11.800
But nobody's going to give you an objective truth on that.
00:05:16.460
In Ireland, they are now voting on whether to lift the ban on abortion.
00:05:26.200
Well, YouTube has decided to ban all pro-life videos leading up to this vote in Ireland.
00:05:35.660
So, if you have a pro-life point of view, it's too dangerous.
00:05:47.740
I'm sure they'll resolve this, you know, once the vote is over.
00:05:51.560
In the meantime, on YouTube, there is a new channel that is showing parents talking to their children about gender.
00:06:02.080
So, if a person with a wiener says, hey, I'm a girl, what is that person?
00:06:20.980
When you went to kindergarten, you went by she, her.
00:06:48.720
So, the surgery that I'm going to have, all of this is going to be gone.
00:07:26.080
Remember, a Chinese dress on a white girl in high school?
00:07:37.180
Or a Vietnamese sandwich at a college cafeteria?
00:07:48.300
Does anybody notice that this is all becoming a laughingstock?
00:07:52.120
This is going to be remembered in history as the moment the world went insane.
00:08:00.800
The hunt began with trace amounts of legitimacy, with outrage at understandably offensive things.
00:08:08.720
But then it just kind of spiraled into an ever-growing list of offenses.
00:08:13.480
So numerous were these offenses, and so minor, that they had to create a term for them.
00:08:23.740
And now the entire world is a micro-aggression.
00:08:27.200
You get up in the morning, you are performing a micro-aggression on somebody.
00:08:32.660
The Atlantic has just published a 1900-word examination of the concept of cultural appropriation,
00:08:40.120
which lies at the heart of all of these identity politics and the PC culture.
00:08:44.980
In other words, it is the ultimate micro-aggression.
00:08:49.580
One startling revelation: the Chinese dress that the woman wore as her prom dress,
00:08:55.760
in what has become a firestorm of political correctness,
00:08:59.220
that dress, in fact, was co-opted from European culture.
00:09:09.800
The article is titled "Every Culture Appropriates,"
00:09:13.920
and it examines many examples of the so-called cultural appropriation throughout history.
00:09:22.020
The policemen of cultural appropriation don't think that way.
00:09:29.820
One Western victimization of non-Western peoples,
00:09:33.700
a victimization so extreme that it is triggered by a Western girl's purchase of a Chinese dress
00:09:39.160
designed precisely so that Chinese girls could live more like Western girls.
00:09:48.660
The policemen of cultural appropriation must crush and deform much of the truth of cultural history,
00:10:02.100
and make babies out of the people they supposedly champion.
00:10:18.740
By accusing people of using power to intimidate less powerful people...
00:10:33.260
because the web of violations and aggressions and offenses has grown so thick
00:10:38.840
and so wide, it has ensnared the very people who contrived it in the first place.
00:10:46.440
The people who were teaching it are now starting to fall into their own trap.
00:10:51.040
The culture police who have devoted their entire lives to exposing the injustices
00:10:56.720
have themselves become the insensitive and ignorant bullies that they supposedly pursued.
00:11:03.020
The behavior incredibly sharp and massively intelligent.
00:11:16.880
Their world, of course, finds its basis in theories.
00:11:48.100
What we have learned is there's tremendous power in accusations of racism and privilege.
00:11:53.140
And with these accusations, they become powerful.
00:12:04.220
is the philosophical basis of postmodernism, relativism.
00:12:14.040
you have nothing but a group of ideas that disagree with themselves.
00:12:20.340
And if you argue those ideas that are in conflict with one another,
00:12:26.300
you would be laughed away if it wasn't for the foundational premise
00:12:30.740
that there is no objective truth, only your truth.
00:12:35.360
Saul Alinsky wrote on the subject of relativism:
00:12:43.120
I'm not concerned if this faith in people is regarded as a prime truth
00:12:46.980
and therefore a contradiction of what I've already written.
00:13:16.000
and you have to track that wire all the way back
00:13:47.560
To them, their voices deserve to be the loudest,
00:13:56.800
Except there is a new politics that is forming.
00:14:07.060
People who have always considered themselves liberal
00:14:09.380
but can no longer suffer through the hysteria of the left.
00:14:30.960
Well, give me your tired, your poor, your huddled masses,
00:15:59.500
I think we need a new definition of friendship.
01:08:45.640
Sarah, can we play the Google announcement?
01:09:10.000
You know, it's like talking to a computer.
01:09:39.200
It's for four people, a week from Wednesday, and the one trying to get the
01:09:47.500
reservation is the computer. The guy is the computer.
01:09:51.500
Let's say you want to call a restaurant, but maybe it's a small restaurant which is
01:09:55.660
not easily available to book online. The call actually goes a bit...
01:10:02.800
For seven people? Um, it's for four people. For four people? When?
01:10:19.320
Next Wednesday at 6 p.m. Oh, actually, we leave here for, like...
01:10:27.460
After, like, five people. For four people, you can come.
01:10:42.360
Oh, no, it's not too busy. You can count for four people. Okay.
01:10:53.080
It's like improv-ing. Unbelievable. So, this is all good.
01:10:58.520
They also announced that soon Google's going to be able to write emails
01:11:02.100
for you, in your voice. I mean, imagine the scandals you'll have, you know,
01:11:08.200
because, like, the Anthony Weiner excuse, when he started texting, you know,
01:11:11.680
pictures to nine-year-olds, was like, oh, I got hacked. That's kind of a
01:11:15.980
standard excuse now. Of course, hacking is real, and sometimes you could
01:11:19.840
theoretically be hacked, but it's usually just an excuse.
01:11:22.520
The same thing is going to apply here: oh, Google wrote that. I didn't even...
01:11:31.040
I read about five of his books, all in like a week.
01:11:34.120
A week and a half, yeah. I think it was the Avogadro...
01:11:41.200
It's the first in the series, and it starts...
01:11:48.480
...email and what links it will go to to accomplish its goals, which are your...
01:12:08.420
...is what you want. It has to read everything. It has to read all of your email,
01:12:13.460
has to listen to your conversations, has to read your Facebook, your
01:12:18.280
interactions with people on Facebook. I mean, you want to talk about...
01:12:30.800
Now they're going to openly admit it. When they say, oh, by the way, they can write
01:12:38.480
emails for you, well, it's got to read all of your email. It's reading it, it's
01:12:43.220
understanding it, it's understanding the person on the
01:12:46.560
other end. It's all connected. I mean, it is a brave new world that is
01:12:52.060
coming our way. We have Franklin Foer on. He is a guy who wrote a really
01:12:58.620
pretty bone-chilling article about deep fakes and how you're not going to have
01:13:04.060
real privacy, and you're not really going to be able to believe your eyes
01:13:08.240
at all, on anything, very, very soon. The name of his book is World Without Mind: The
01:13:14.460
Existential Threat of Big Tech. He joins us to talk about this next.
01:13:19.780
In the high-tech world of espionage, sometimes the best way to steal
01:13:25.580
information may still be the old-school spycraft, like, you know, talking to humans
01:13:30.520
and writing things down on paper. Of course, it's also the old-school way of
01:13:35.220
getting caught, and that is exactly what happened to
01:13:37.620
former CIA agent Jerry Chun Shing Lee. It hadn't been easy to catch him. He's a 53-
01:13:43.380
year-old guy, a naturalized U.S. citizen. He joined the CIA in 1994, he left in
01:13:48.460
2007, and his main job for the CIA was recruiting
01:13:52.120
clandestine human intelligence sources from his base in Hong Kong. Now, Lee is
01:13:57.540
suspected of giving the Chinese information that caused the death or
01:14:00.920
imprisonment of 20 American agents in China. Starting in 2010, the CIA began to
01:14:08.080
notice that all of our agents started to just disappear in China. The CIA
01:14:13.080
suspected that there was a traitor, and so they asked the FBI to investigate.
01:14:17.520
Two years later, the FBI suspected Lee, so they lured him with a phony job offer
01:14:23.140
to get him to fly to the U.S. from his home in Hong Kong. During the trip, they
01:14:28.480
searched his hotel rooms in Hawaii and Virginia, and they found two notebooks
01:14:32.880
containing handwritten lists of names and phone numbers of the covert CIA agents
01:14:37.880
and informants in China. The notebooks also had notes from asset meetings,
01:14:42.440
meeting locations, and locations of covert facilities in China. So they went to
01:14:49.100
work and they built their case. It took several years. The FBI finally got their
01:14:53.620
chance to arrest Lee in January, when he took a commercial flight from Hong Kong
01:14:57.980
to New York City. And yesterday, a federal grand jury charged him with
01:15:02.460
illegally possessing classified information. Prosecutors say Chinese agents
01:15:07.520
offered Lee a gift of a hundred thousand dollars in exchange for his
01:15:11.960
cooperation, and a promise that they would take care of him for life.
01:15:15.520
All they wanted were the names of people that worked for the CIA. Imagine
01:15:22.380
giving up 20 lives for a hundred thousand dollars.
01:15:27.060
Prosecutors say Lee prepared written reports for the Chinese, deposited hundreds of
01:15:32.820
thousands of dollars in cash in his personal accounts, and lied to the FBI about
01:15:37.400
his activities overseas. This is a devastating betrayal for our CIA and our
01:15:44.160
intelligence agencies, foremost because of the executed agents, but also because of
01:15:49.520
the massive amounts of time it takes to groom new sources and informants. This is
01:15:53.640
not something that the United States is going to recover from quickly. If it is
01:15:59.120
confirmed that Lee's information was the cause of the death and imprisonment of 20
01:16:03.400
agents in China, it will be the worst intelligence breach since Robert Hanssen. Do you
01:16:10.320
remember him? He was caught passing secrets to Russia in the 1990s, and you remember what a big deal
01:16:15.720
that was. Now, has anybody even heard of this guy? So far, Lee has only been charged with possessing
01:16:24.240
classified information, but he still could get life if he's convicted. Here's what you need to know
01:16:30.180
today: make no mistake about it, we are deep into some sort of a new war, and perhaps it's a cold war
01:16:38.060
with China. But this time, China has eclipsed Russia as our chief rival in that cold war.
01:16:46.380
It's Thursday, May 10th. You're listening to the Glenn Beck Program.
01:16:54.840
So, on this program, we've been talking about technology quite a bit over the last few
01:17:01.900
years: the good things and the bad things. The good things that are coming our way, the profound
01:17:07.260
opportunities that are coming our way because of technology, the problems that that technology
01:17:12.800
may cause. For instance, you know, jobs. We're looking at the job numbers upside down. People who are
01:17:20.680
working on AI and robotics are looking at how we can have 100% unemployment, so everybody can enjoy their
01:17:27.880
lives. Well, with that comes some ethical questions and some philosophical questions, and the question
01:17:33.900
of how we navigate that transition. There are also privacy concerns, and concerns about what's real
01:17:40.980
and what's not, way beyond fake news. There's a new book out called World Without Mind. It was
01:17:49.100
written by Franklin Foer, and he wrote an article that caught my eye a few weeks ago in The Atlantic:
01:17:55.240
The Era of Fake News Begins. And we wanted to have a conversation with him. Franklin, welcome to the program.
01:18:01.300
Thank you so much. So, Franklin, we talked about deep fakes, I think, before anybody, at least in our
01:18:08.240
audience, had really even heard of them. And at the time, I remember talking to people, even on my own
01:18:14.140
staff, who said, Glenn, this is so far away. And it's really not. You know, I kept saying to them, I think in,
01:18:19.940
you know, 2018, 2020, definitely 2024, this is going to be a very big problem. It's happening now, and it's
01:18:30.120
just happening in the dark corners of the internet, and largely in the field of pornography, where
01:18:36.860
so much of the internet actually begins before it migrates to the mainstream. And it's this
01:18:42.820
phenomenon where the average person now has access to technology that allows them to take
01:18:51.440
a picture of someone's face and very seamlessly transpose it onto a body. And so you're creating these
01:18:59.980
manipulated images, which to the naked eye are very hard to detect as fakes. And so we just know
01:19:08.840
that a technology like this is not going to stay bottled up. It's going to be exploited by bad
01:19:15.720
people for bad ends. And the fact that it's so democratized, and so accessible to every creep,
01:19:24.620
everybody who's got an ex who's out for revenge, to co-workers... I mean, it really is a dystopian world
01:19:33.180
which we stand on the cusp of. So, you know, you can't put the genie back in the bottle, and
01:19:40.900
I am really, truly excited about the world of tomorrow. But I'm split, Franklin, between...
01:19:49.340
If we don't do it right, if we are not paying attention every step of the way, and we don't
01:19:55.380
have some sort of, I don't know, self-control, which we don't display in anything that we do,
01:20:06.040
we could become slaves to this really quickly. Well, I mean, I think everyone experiences that sense
01:20:15.760
of both possibility and enslavement when it comes to that device that we're all attached to now, which
01:20:22.100
is our phone. We're addicted to our phones. We're constantly manipulated by our phones,
01:20:28.780
which are being used by corporations and by media to try to commandeer our attention on, you know,
01:20:38.440
an hourly basis. We're always being dinged and pinged. And yet, we know that our phones
01:20:45.460
offer us great possibilities that make our lives better, more efficient, in all sorts of ways.
01:20:50.420
And you raise this question of moderation, and I do think that that kind of gets to the core
01:20:56.340
challenge to each and every one of us. Which is that, when it comes to things like food and drink,
01:21:01.940
which are also addictive, right? Like, if you stuck a bag of Doritos in front of most human beings,
01:21:09.620
they'd be tempted to polish it off. And yet, we still manage to teach our kids how to practice
01:21:17.320
moderation when it comes to those addictive things. We know how to enjoy food without...
01:21:22.900
But we've... yeah, but we've had millions of years to come to that, and we've had a shortage of food.
01:21:30.820
This is, for instance, the Google Assistant that was announced earlier this week,
01:21:34.880
which is revolutionary, in good ways, in so many ways. But if you look at what
01:21:42.340
they're talking about, on being able to have it now write all of your emails, well, the logical thing
01:21:47.780
is... we're right now saying, I don't want Google reading my mail. Well, Google's going to be reading
01:21:53.460
your mail and your Facebook. And it's reading it now. Sorry. Yeah, but it's going to read your
01:21:59.660
Facebook, it's going to read everything about your friends, how you communicate. It's going to learn
01:22:04.580
how to speak like you and write like you and interact with your friends. Now you're in a
01:22:11.980
completely different world. Yeah, no, I think that that is true. And so, you know, I think what
01:22:19.960
you're saying is that we're actually in this world now, and there's a way in which we can't stop it,
01:22:26.340
that a lot of these things that we worry about as distant possibilities are actually happening
01:22:31.000
today. That Google's machines are reading your email now, and they're using it to serve ads up to you.
01:22:37.120
That this complete dossier about you, this very intimate portrait of the inside of your head,
01:22:43.920
has actually already been compiled by Facebook, by Google. I mean, people are just very blasé
01:22:51.600
about data, but data is really an x-ray of your soul. These companies have amassed these
01:22:57.960
portraits, where they know everything that you've read, everything that you've bought. It's
01:23:02.020
integrated with data from the outside world, from the physical world, about your shopping, you know, the CVS
01:23:07.900
loyalty cards. And it's a very, very powerful thing, this portrait. It can be manipulated really easily,
01:23:14.880
and in fact it is manipulated, by the likes of Facebook. Facebook knows the things that
01:23:22.380
give you pleasure. They know the things that cause you anxiety. And, you know, your news feed
01:23:28.980
is shaped so that the things that you are reading are given a hierarchy by Facebook, based on those
01:23:36.560
pleasure points and those anxieties, because Facebook wants to keep you engaged on their site for as long
01:23:42.640
as possible. And so, you know, those technologies and that portrait of the inside of your head
01:23:50.060
that comes from data just leave us tremendously vulnerable. You brought something up in your book
01:23:57.540
that was a new way of looking at this, at least for me. I've never heard anybody else express it
01:24:03.600
this way. And, you know, nothing is for free. When Google says, here's this, it's free,
01:24:14.080
there's this free product... it's not free. In fact, it's not a product. You are. And the way these companies
01:24:24.220
are now viewing us as a product, the information, is really important. Can you go into the productization of
01:24:35.900
people? Right. So, it's your attention, which you're handing over to these companies. It's your
01:24:45.560
personal information, which you inadvertently hand over to these companies. They then get
01:24:51.440
marketed to advertisers. And, you know, everyone says, well, you know what, it's this great deal: you get
01:24:59.300
Google's email for free, and sure, you're surrendering your privacy in order to get
01:25:05.560
that awesome service, but that's a price that you seem to be willing to pay. But this question
01:25:12.780
of consent: how much do we really understand about what we hand over to these companies when we click
01:25:19.240
accept on those terms of service agreements? Those terms of service agreements are legalistic, and
01:25:26.320
they're long, and I haven't met a single person who's ever read one of those. We just click
01:25:32.480
accept, because we treat these companies as if they're acting in good faith. We feel like
01:25:41.240
we have to accept their services, because you need to be on Facebook in order to be a citizen of
01:25:48.400
the world. Sometimes your employer tells you you have to be on Facebook in order to be engaged in
01:25:56.140
business. And so I think that we just need to... I mean, part of it is... yeah, it is on us. It is
01:26:03.060
our own fault for not taking the threats to our own privacy seriously, and it's on us
01:26:10.440
for not thinking through these things in a more rigorous sort of way. But on the
01:26:17.680
other hand, we're just left kind of unprotected and vulnerable. And these companies actually behave so
01:26:23.200
much worse than we think that they do. I mean, all these companies are selling data to outside
01:26:30.140
vendors. They're giving access to our data to outside vendors, and once it leaves the control of
01:26:35.920
Facebook, who knows where it ends up, and who knows who's exploiting it. And when you talk about
01:26:43.160
exploitation of this, I want to take you to another place. We can take a quick break and then come back,
01:26:49.340
and I want Franklin to describe the exploitation that is on the horizon, on how easily
01:26:57.640
you can be manipulated, especially with things like virtual reality. You're not going to
01:27:04.440
realize they're playing you. You're not playing it; they're playing you. When we come back... FEMA: not a
01:27:13.840
first responder. That is what the top administrator from FEMA said in a speech recently.
01:27:19.280
We're not the first responder. So who is? When there is a problem, who is the first one to respond?
01:27:25.360
Well, we're finally returning to this truth: you are. I am. We all are the first responders. So it doesn't
01:27:34.240
matter what happened, you know, whether there's a problem in your local area, you know, your neighborhood,
01:27:38.800
your house, or there's, you know, floods, hurricanes, fires, mudslides, whatever it is. You're the first
01:27:45.880
responder. My Patriot Supply has the top-rated food kit that millions of Americans have chosen to get
01:27:51.920
prepared with for, you know, a disaster that could happen in their area. The popular four-week emergency food supply,
01:27:57.400
right now, this week, is $99. It's a super low price for security. It's shipped free to your door, but you've
01:28:04.500
got to call now. The number is 800-200-7163, or you can go online to preparewithglenn.com. That's
01:28:10.960
preparewithglenn.com. It lasts 25 years, it's easy to store, it's easy to grab and go. No matter what the
01:28:17.220
emergency is, you are the first responder for your family. Call now. Be prepared. 800-200-7163. 800-200-7163,
01:28:26.840
or preparewithglenn.com. Franklin Foer: he's a correspondent with The Atlantic and author
01:28:34.780
of a book called World Without Mind. He wrote an article called The Era of Fake News Begins. And,
01:28:42.060
you know, we're complaining back and forth about, you know... the right is saying fake news, the left is
01:28:48.800
saying fake news, we're worried about algorithms, etc., etc. But we're about to enter a very different world.
01:28:54.960
Can we talk a little bit about... first, let's explain what a deep fake is, and how that could
01:29:03.800
be used to manipulate. And then I really want to go into virtual reality, because that can change
01:29:09.600
everything, if we're being manipulated. Right, right. And so, a deep fake is a technology that uses
01:29:18.840
machine learning, which is a form of artificial intelligence, in order to stitch together an image.
01:29:25.600
So, for instance, it's possible to create an image of Barack Obama talking, where you're taking a
01:29:35.600
catalog of images of Barack Obama's face, which you're able to then manipulate into
01:29:44.800
a video where his lips are moving, as if Barack Obama is actually talking. Yet it's really just a
01:29:54.400
computer that's manipulating those images of his lips talking. And then there's the
01:30:00.000
possibility that a voice can be manipulated by artificial intelligence to then
01:30:05.820
say whatever we program it to say. And, in fact, I recommend it: there's a pretty hilarious public
01:30:13.040
service announcement that's done by Jordan Peele, the comedian, where he does this with Barack Obama, where
01:30:19.220
Barack Obama looks like he's delivering a sermon about the dangers of deep fakes. And in
01:30:27.000
the middle of this video clip, they then cut to the comedian Jordan Peele, and we see that he's actually
01:30:34.400
impersonating Obama, and that Obama isn't saying the things that we think that he's saying. It's being said
01:30:39.900
by this kind of puppeteer, who's using Obama as a ventriloquist. And the dangers of that should be
01:30:48.140
obvious. Right, right. Well, and once you get the voice right... just voice duplication: the
01:30:56.140
ground that has been covered, and how good it is, from, you know, two years ago to today, is remarkable.
01:31:03.400
We're not far away from being able to recreate people's voices. But when you watch that video, the only
01:31:09.360
thing that's off is the voice. Once you get the voice, you would swear that that was Barack
01:31:16.140
Obama. Right, exactly. And the nature of our machines is that the more data we feed to them,
01:31:23.560
the better they get. Correct. They're constantly teaching themselves to improve. And so, just given
01:31:31.160
the data that we're... it does go back to the conversation before the break. When we give these
01:31:36.640
companies our data, every time we click on Facebook, every time we click on Google, every time you
01:31:44.480
do all these little things on the internet, you're actually supplying the data that makes
01:31:48.760
these machines better, that makes it possible to get to this kind of new dystopian age that we're
01:31:55.860
talking about. How far away do you think we are from this new dystopian age? Well, I think
01:32:02.320
we're kind of in it now. We're just in the earliest days, and it's just hard to see a lot of the
01:32:09.900
consequences. When it comes to something like deep fakes, and rampant manipulation of video, I
01:32:15.640
think we're still maybe, you know, two or three years away from it becoming something that populates
01:32:21.840
people's social media feeds and actually has real impacts, yeah, in our politics and in our social
01:32:28.020
lives. But, you know, the question that everybody always is asking is, you know, have we
01:32:36.860
already left the age where... is privacy over? Is there any chance that we can
01:32:43.780
salvage this distinction between a public life and a private life? Can we take a break
01:32:51.040
here? Let's take a break here, and when we come back, I want you to explain that. I also want to go into
01:32:56.880
virtual reality a little bit, on what is private anymore. Will you have privacy anymore? We'll wrap up our
01:33:06.060
conversation with Franklin Foer. World Without Mind is the name of his book. Coming up: it's that
01:33:12.640
time of year. The housing market is getting active. People are venturing out of their homes, now that it's
01:33:18.580
not snowing on their heads all the time, and people like to use this time of year to go buy a new house.
01:33:23.760
If you're thinking about selling yours, this is a great time to do it. But you've got to do it the right
01:33:27.900
way. You've got to find somebody who knows what they're doing, somebody you can trust, somebody who has been in
01:33:33.960
the business a long time, that has all the experience needed, knows how to do the advertising, isn't just
01:33:39.500
putting up a couple balloons out in front of your house and then they're done. You need someone
01:33:44.920
from realestateagentsitrust.com. realestateagentsitrust.com is the place to go when you
01:33:50.380
want to search through thousands and thousands and thousands of real estate agents across the country
01:33:54.580
and narrow it down to the best 1,200 or so, and honestly, for you, the best in your area. That's all you
01:34:00.840
need to know. You need one real estate agent that knows what they're doing. realestateagentsitrust.com
01:34:05.880
is the place to go to find that person. Make that experience a good one. realestateagentsitrust.com.
01:34:11.920
the era of fake news franklin for he is a uh he's a writer has a book out called the world without
01:34:21.260
mind the existential threat of big tech uh and uh he is also a correspondent for the atlantic where he
01:34:29.400
wrote that article the era of fake news begins um we're just talking about we left the conversation
01:34:35.180
with with privacy and and franklin people will say well i'm not doing anything wrong so i don't mind if
01:34:41.380
they have it um however everything is starting to be connected and there is no end or beginning of your
01:34:51.320
private life and your public life and we are seeing not some dystopian movie but we're actually seeing
01:35:00.240
a government implement all of these dystopian ideas in china yes yes yes and that's i mean there are two
01:35:09.600
things that really concern me the first is that um that over time when we feel like somebody's always
01:35:17.200
looking over our shoulder we'll cease to be free-thinking independent people that in order to
01:35:24.100
formulate an opinion in order to arrive at your own view of the world um you need to be guided by
01:35:30.680
your conscience and in order to do that you need to be able to turn over ideas in your head so if i'm
01:35:37.820
going to explore an idea i may go read people who i really disagree with in order to figure it out i need
01:35:44.140
to try on things i need to i need to engage i may engage with some odious people in in the course of
01:35:50.500
formulating my opinion i shouldn't be i shouldn't have to worry about that but we've already i mean
01:35:56.200
without technology we have already passed that i mean that's kind of this you know intellectual dark
01:36:02.340
web thing that is so appealing to me is i i we have to hear ideas we have ideas are not dangerous
01:36:11.960
you know they they can be if they're implemented but we need to hear all viewpoints and be allowed
01:36:19.240
to say wait a minute i think that's a crazy idea we have to have that yeah well and it's i mean but
01:36:27.040
but we're really not in an especially bad place yet relative to where we could be so right now you can
01:36:34.280
still if you want to if you want to find somebody's book you can find it on amazon you're not going to
01:36:38.740
lose your job you don't at least i don't think you should fear losing your job because you go buy
01:36:43.720
a book on amazon or you go to check out a piece on google or you listen to somebody's youtube channel
01:36:48.820
but you can easily imagine that once that information becomes kind of more transparently
01:36:54.160
available to everyone then you actually will that chilling effect will will take place and you will be
01:37:01.040
scared to do something like that and one of the things that i mean the the the atlantic is the place
01:37:06.660
that fired uh kevin uh williams uh i mean it was williamson i mean that was pretty chilling he had
01:37:13.660
an idea and said something and it was you know to to put a thought out there and it was i you know not
01:37:19.460
a thought i agree with but he lost his job it was publicly available and it was it's more complicated
01:37:26.280
than just uh it's an idea it had to do with the way that it was it was expressed in communications
01:37:33.680
between the editor and writer which is a question where there's trust involved and so that's that's a
01:37:39.140
it's a different a different question um but but you're right i mean we're not that far off yeah
01:37:45.880
from this universe and in the china example you just gave at the beginning i find especially chilling
01:37:52.360
very much this question of like this question of how do we deal with these companies how do we deal
01:37:56.760
with this data the idea that ultimately we could end up having a relationship brokered between big
01:38:04.080
companies and the government which is what what happened plenty of times in our past history like
01:38:09.320
at&t yep was a monopoly that basically cut a deal with the government and uh you know zuckerberg was
01:38:16.200
asked by uh one of the senators um you know you're a monopoly uh maybe we should just regulate you
01:38:23.020
and he said you know what the internet needs to be regulated and it was lindsey graham and lindsey
01:38:28.000
graham said well can you submit can you submit a set of regulations that might work to us so we can
01:38:33.000
consider them and just this idea that you could have this uh brokered relationship between these
01:38:39.820
companies that control so much data that have so much manipulative power and the government is
01:38:44.380
something that i think that we actually need to be fearful of in the long run and so my my approach
01:38:50.660
and i think this is an interesting place where um we're seeing ideology fracture and strange
01:38:56.120
bedfellows start to come up is that we're starting to talk about monopoly again in this country that
01:39:02.660
you know liberals typically uh fear private power conservatives typically fear public power
01:39:10.140
um but we're in this realm where you know our founders were worried about dangerous concentrations
01:39:15.740
of power in all forms and they were especially concerned about dangerous concentrations of power
01:39:21.960
when it came to areas of communication so um and it yeah go ahead and finish up well i just wanted
01:39:30.060
to say one thing which is that you know at the beginning you made the point which i agree with which
01:39:35.900
is that you know this technology is both empowering and it's dangerous uh but i want you know there's
01:39:42.840
something you know technology is something that's defined us as a human species our ability to affect
01:39:48.380
the environment is something that kind of rises above all the other species on this planet
01:39:53.620
and we've always used tools and technology and they've automated hammers automate part of uh what we do with
01:40:01.600
our with our bodies factories automated upper body strength but these machines are different because
01:40:07.420
they're intellectual technologies they automate mental functions and we're merging with them as a species
01:40:13.820
that you know man and machine is are becoming one sergey brin who's the co-founder of google
01:40:19.580
talks about the day when google will be implanted in your brain and i don't think that that's necessarily
01:40:25.800
such a distant fantasy no i think it's what it's what stephen hawking referred to uh as the end of
01:40:32.580
homo sapiens as we know it by 2050 he just means that we are going to integrate with technology
01:40:38.300
franklin i've got to go but i i'd love to have you back to finish this conversation uh another time
01:40:43.840
world without mind the name of the book the existential threat of big tech thank you so much franklin i
01:40:49.700
appreciate it thank you so much you bet you know one of the things that we you know i disagree with
01:40:58.980
on the on the atlantic uh point with with kevin williamson um but we are getting closer and closer
01:41:06.080
to the the government taking control of everything look at what president uh trump is talking about
01:41:15.620
now i mean he's got a it just has a really great thing he just did with south korea and on the heels
01:41:21.540
of that we're talking about uh you know picking and choosing which reporters are going to be able
01:41:29.040
to ask questions of the president that's extraordinarily dangerous welcome back to the
01:41:33.180
program pat gray thank you yeah he he uh tweeted out fake news is working overtime just reported that
01:41:39.100
despite the tremendous success we're having with the economy and all things else 91 percent of network news
01:41:45.400
about me is negative fake why do we work so hard in working with the media when it is corrupt
01:41:51.460
take away credentials uh no no no and and by the way negative doesn't necessarily mean fake
01:42:00.040
uh it right no so that's not necessarily the same thing and i mean he is right about the coverage
01:42:07.800
though because the uh media research center just did a study of abc cbs and uh nbc and the evening
01:42:15.760
news nine out of ten things that weren't uh neutral were negative so about 91 percent which is incredible and
01:42:24.060
unfair it is incredible you know i mean like we've had such we've had a few good things today obviously
01:42:29.020
the north korea thing also five of the top isis leaders have been captured today another really good
01:42:36.260
thing i mean the split should certainly not be 91 percent now you know you can't fill a room of reporters
01:42:42.980
that are all trump supporters and get your news either no we didn't like it when they they filled
01:42:49.240
a room full of obama supporters because not because they were doing anything funky just because everybody
01:42:56.700
except the fox news guy and jake tapper were in the bag you know what i mean hey how are you enchanted
01:43:03.320
we can't have that the press must be adversarial to power it has to be and when even matt drudge
01:43:12.020
is worried about it i mean matt drudge is there anybody who's been more supportive of donald trump
01:43:17.260
steve bannon drudge maybe not even i mean apparently yeah you're right i mean trump never fired drudge
01:43:24.440
right but drudge said anyone um he said i fear in the future uh the result of trump's crusade on fake
01:43:32.320
news will be licensing of all reporters dems already floated this in the senate pre-trump
01:43:37.380
the mop up on this issue is going to be excruciating and he also brought up what trump might start now
01:43:44.060
his successor could finish oh yeah in a really bad way oh yeah and we remember i talked a million times yes
01:43:51.660
yes i talked to i talked to ted koppel before trump was in and he said you know we need to have
01:43:57.400
license and i said no we don't no no we don't he said we can't just have anybody yes we can right yes
01:44:03.520
we can it requires us as citizens to do our homework that's the thing nobody wants to do any work well
01:44:09.660
well then fine you're going to become a prisoner because somebody is going to go down this road and
01:44:16.660
we must be the side that says no i disagree with them i think they're dishonest but we're not we are
01:44:25.800
not going to violate the first amendment and the free press you you have a right you know this this all
01:44:33.040
came from uh uh from pamphleteers you know at the time you didn't have radio you didn't have television
01:44:41.080
you didn't have talk shows you had people who were given speeches and they'd write those speeches down and
01:44:46.220
then they'd sell them on the street and well they weren't licensed that wasn't journalism that's
01:44:51.880
not in the paper no because those were the guys telling the truth and that's why that is part of
01:44:58.060
they were the blog sites of the revolutionary war and they're the ones who changed it you cannot
01:45:05.580
start to license people to express a point of view or report news and on the other side because
01:45:12.220
democrats are trying to uh do a preemptive strike on the other side they're trying to make laws that
01:45:17.440
define the profession of journalism that's also dangerous yes you're gonna screw this thing
01:45:21.140
completely up we already have everything we need in the first amendment congress shall make no law
01:45:27.040
abridging the freedom of speech or the press that's it period there you go you're already covered
01:45:34.340
completely and we should look at that the same way even with the press even when they're annoying
01:45:38.060
the same way we look at gun rights i mean no law right there is not you cannot abridge you you
01:45:44.640
can't you can't do anything that would limit these uh limit the access especially with the government
01:45:50.280
now you know like you mentioned the atlantic with with our guests and it's true like what i look at
01:45:55.260
that is no that's a completely different thing right it's distasteful what they did with kevin
01:45:58.720
different thing yeah no that wasn't that wasn't the same at all well yeah well yeah actually there's a
01:46:03.960
lot of different things to do it it wasn't well it's not the same as what we're talking about here
01:46:07.640
though no the atlantic can make what i view as a really a really bad decision if they want to
01:46:12.600
they're a private business government this is the government doing it and there is that line that's
01:46:16.240
a big one i mean you can't do those things i mean there's someone who was suing donald trump
01:46:20.400
for uh blocking them on twitter and obviously it's ridiculous right but i mean in reality if the
01:46:28.520
president of the united states is making statements should not every citizen have access to those statements
01:46:34.720
right i mean i think you can make a real argument that you really probably can't block people on
01:46:39.520
twitter at least from the official accounts maybe he can from his private accounts but from official
01:46:43.600
government accounts you probably can't you can't um and that's because it's the same way we argue
01:46:48.700
with guns when people like well they just took it away in a situation that was it was it was really
01:46:52.360
obvious they shouldn't have had a gun no no that's not the way it works no even if you think it's
01:46:55.920
obvious you gotta have a real legal basis for those things yeah and we've completely lost sight of that
01:47:01.080
we really have and we have to be first second third fourth well all the way all of them
01:47:07.040
absolutists we have to be absolutists on all the the bill of rights and the amendments and on all
01:47:12.780
these issues that's what we need to refer to because that's where we have uh that's our that's
01:47:18.740
our guiding light yeah it's a u.s constitution and it's the guiding light it's the only thing that
01:47:23.060
will save us yeah is the constitution in the end if we don't stand firmly on that and tell
01:47:28.920
everybody who says yeah but no i i know i'm with you i'm frustrated but there is no but here
01:47:35.080
shall not be abridged period end of story good to have you back pat one quick follow-up question
01:47:45.260
from twitter uh is pat still alive this must be like game seven of a seven game series he lives
01:47:51.340
or dies on today's pat gray unleashed is that true yes today the day yes today we find out it could be
01:47:58.000
monday or it could be monday it could or tomorrow or tomorrow or tomorrow it could be today wow
01:48:05.060
friday or this coming wow thank you for telling me yeah this late in the game that it's not friday
01:48:11.080
you're welcome because it means god thank you for that the only way to be really certain is to never
01:48:16.660
miss a moment of pat gray thank you that's exactly right because i'm not going to heal over at any
01:48:21.840
moment it die yeah anytime die at any time all right pat uh do some jumping jacks or whatever
01:48:27.960
it is it makes you a little more vulnerable a little more on the edge good to have you back
01:48:32.640
all right let me tell you about uh 1-800 flowers what are you getting your mom
01:48:37.280
1-800 flowers assuming i remember as soon as we go to the next commercial to call
01:48:43.180
i've got my mother-in-law coming in to stay with stay with us this weekend too well you know the
01:48:48.100
other thing about that is this is a little tip for you you call 1-800 flowers you get a nice
01:48:53.160
bouquet sent to the house but you get two cards you get one you put your wife's on there and then you
01:48:59.680
just swap it out because your your mother-in-law's going to be there anyway and you when you say oh look
01:49:03.840
these flowers and whenever they're in the room you just swap the card i got a better idea i get 24
01:49:08.040
multi-colored roses okay so that's two dozen i just split them in two vases and i don't have to
01:49:15.900
switch the cards i'm like hey look at you guys huh yeah you're worthy you both get a dozen roses
01:49:21.620
yeah the other part about this is don't talk about it on national radio before you do it
01:49:27.020
that's the only other the only other thing you don't want to do that probably is where you should
01:49:31.860
have started okay yeah okay all right 24 multi-colored roses plus a free glass vase starting at 29.99
01:49:37.680
it ends today mother's day is sunday today's the last day you can get that deal 24 multicolored
01:49:47.400
roses 29.99 it's 1-800flowers.com click on the radio icon enter the promo code beck it's 1-800flowers.com
01:49:55.900
promo code beck offer ends today welcome to the program we have a lot to talk about uh on tomorrow's
01:50:03.440
show tomorrow is friday i'm gonna tell you about the uh cat that will that walked 12 miles to come
01:50:09.820
home to his family the family had given it away to another family 12 miles away but the cat missed
01:50:17.460
the family so much it walked 12 miles back home may i uh ask a question did the other family just
01:50:25.500
bring the cat back and put it in the front yard no it actually did walk 12 miles to get back home
01:50:29.960
did someone have a camera on the cat how do we know i don't know how do we know the cat i don't
01:50:34.620
know but apparently you could be a member of the family because the family uh saw the cat and they
01:50:38.700
were like oh how sweet brought it to the vet and had it euthanized oh my god yeah yeah they really
01:50:45.080
didn't want the cat you know the cat's not a good judge of character wow okay god glenn beck mercury