Best of the Program with Peter Schweizer | 10/3/18
Episode Stats
Words per Minute
152.5355
Summary
Glenn Beck goes back to basics and explains the difference between a victim and an accuser in the Brett Kavanaugh and Christine Blasey Ford case. He also talks about Donald Trump's comments about the "victim" in the case.
Transcript
00:00:09.800
Thursday, Thursday, Thursday, Thursday, Thursday.
00:00:17.340
In fact, we're going to go back to basics today.
00:00:25.800
Do you know the difference between a victim and an accuser?
00:00:35.180
And they don't understand the difference between mocking and relaying evidence.
00:00:40.300
I'm going to go ahead and let them say they do understand both of those things.
00:00:44.740
Well, the American people need a brush-up on that.
00:00:57.300
It doesn't seem like we've learned those lessons yet.
00:01:03.160
He did the book Clinton Cash, among many other great ones.
00:01:06.380
But he's got a documentary out about what Google is doing, manipulating search results.
00:01:11.260
And Peter's not a guy who's just, like, making claims.
00:01:15.300
Yeah, he's got a guy from Harvard University, you know, big egghead, who actually is a Clinton supporter and was all for Clinton.
00:01:26.340
And as he was doing the research and speaking out about it, guess who got kicked off of Google?
00:01:33.700
So, it's something that you really need to pay attention to because there's some really frightening things happening.
00:01:43.920
You're listening to The Best of the Glenn Beck Program.
00:02:07.900
I got up this morning and I've been doing a whole bunch of research on the show.
00:02:15.020
The last thing I got to was the Donald Trump thing.
00:02:19.180
I was on our affiliate in Tulsa and I was asked, what about Donald Trump?
00:02:25.600
And I should have just said, I don't know, I haven't seen it yet.
00:02:29.420
But I had read about it and I'm glad I didn't comment on it.
00:02:34.580
All I said was, I'm not going to comment on it because it's just, it's ridiculous to focus everything on Donald Trump.
00:02:49.020
The media has said, did you hear Donald Trump mocking the victim?
00:03:03.600
But in case you have only read about it or you've only seen the headlines, I would like to play the audio.
00:03:11.920
Here is Donald Trump, according to the press, mocking the victim.
00:03:19.000
Well, do you think it was, nope, it was one beer.
00:03:49.900
This is Donald Trump stating the facts in the Kavanaugh case.
00:03:57.520
Facts that MSNBC and everybody else don't seem to care about.
00:04:20.460
I, so-and-so, am a current resident of California.
00:04:22.940
I first met Christine Blasey, now Christine Blasey Ford, in 1989 or 1990 in California.
00:04:33.560
From approximately 1992 to 1998, I was in a relationship with her.
00:04:37.600
So, from 1989 to 1998, nine years, this person knew and was very, very close to her.
00:04:48.400
I found her truthful and maintained no animus toward her.
00:04:53.760
During our time dating, Dr. Ford never brought up anything regarding her experience as a victim of sexual assault, harassment, or misconduct.
00:05:04.440
During some of the time we were dating, Dr. Ford lived with Monica L. McClain, who I understood to be her lifelong best friend.
00:05:15.700
During that time, it was my understanding that McClain was interviewing for jobs with the FBI and the U.S. Attorney's Office.
00:05:23.620
I witnessed Dr. Ford help McClain prepare for a potential polygraph exam.
00:05:30.300
Dr. Ford explained in detail what to expect, how polygraphs work, and helped McClain become familiar and less nervous about the exam.
00:05:43.340
Dr. Ford was able to help because of her background in psychology.
00:05:48.940
Now, this is interesting because I do remember, while she was under oath, a very strange line of questioning that went a little something like this.
00:06:00.300
Have you ever had discussions with anyone, besides your attorneys, on how to take a polygraph?
00:06:12.260
And I don't just mean countermeasures, but I mean just any sort of tips or anything like that.
00:06:25.980
But I was comfortable that I could tell the information and the test would reveal whatever it was going to reveal.
00:06:34.460
I didn't expect it to be as long as it was going to be, so it was a little bit stressful.
00:06:38.400
Have you ever given tips or advice to somebody who was looking to take a polygraph test?
00:06:47.620
I demand an FBI investigation on Monica L. McClain, who is a lifetime friend of Dr. Ford.
00:06:58.320
Because there is, now I want to use this word carefully, an accuser, we have to define that here in a second.
00:07:07.140
An accuser saying that Dr. Ford helped Monica McClain, who was interviewing for jobs with the FBI and the U.S. Attorney's Office.
00:07:19.740
I witnessed Dr. Ford help McClain prepare for a potential polygraph exam.
00:07:26.600
Dr. Ford explained in detail what to expect, how polygraphs work, and helped McClain become familiar and less nervous about the exam.
00:07:37.400
Let me play this audio again of what she said under oath.
00:07:42.220
Have you ever had discussions with anyone besides your attorneys on how to take a polygraph?
00:07:54.020
And I don't just mean countermeasures, but I mean just any sort of tips or anything like that.
00:08:06.600
But I was comfortable that I could tell the information and the test would reveal whatever it was going to reveal.
00:08:16.200
I didn't expect it to be as long as it was going to be, so it was a little bit stressful.
00:08:20.380
Have you ever given tips or advice to somebody who was looking to take a polygraph test?
00:08:40.780
We have somebody who has accused her of teaching someone else about a polygraph.
00:08:51.800
I think that's important that we keep that standard.
00:08:57.020
So if we're going to use the same standard that the left is applying, she is a liar.
00:09:09.860
Can she even be allowed to work at a fast food restaurant?
00:09:12.740
How can you, how can you possibly believe a liar on anything she says?
00:09:24.540
This is the standard that we're now running to embrace.
00:09:28.840
This is the standard that our children, this is the standard that we ran from.
00:09:37.700
America, because in every other country, this was a new idea.
00:09:42.340
You cannot come into my house and just take me.
00:09:56.860
I have a right to know who my accuser is and address my accuser.
00:10:02.900
I have a right to be presumed innocent until proven guilty.
00:10:35.520
Now listen, it is so imperative that you understand what this is.
00:10:47.880
If you do not understand what you're fighting, you cannot win. Do you think we could have won World War II without naming the Nazis?
00:10:57.020
Do you think we will ever win this war on terror without naming what it is about?
00:11:16.580
That Sharia law is the prevailing law, and if you're not under Sharia law, you're an infidel, which means I can kill you, I can rape you, I can turn you into a slave.
00:11:34.720
And we will never win it unless we name our enemy.
00:11:39.680
We would have never won World War II if we were fighting the Germans.
00:11:55.060
We would not have won the Civil War had we been fighting the South.
00:12:03.600
We were fighting people who didn't believe in the Constitution.
00:12:16.840
And by the way, if you don't think that's true, we lost every single battle up until the point that Abraham Lincoln said,
00:12:31.300
We wouldn't have won the American Revolution if it wasn't against tyranny.
00:12:42.860
And it was for certain ideas, like the idea that you are innocent until proven guilty.
00:12:56.940
And until the American people understand what post-modernism is, you will lose.
00:13:05.040
You will lose every battle because you will only grow frustrated and angry, which will play directly into what they want to happen.
00:13:24.640
They want us just to start swinging in blind rage.
00:13:31.980
And until you understand what they're doing, until you understand that this isn't really about Ford, this isn't about the charges.
00:13:50.880
This is all about the idea that white men have put together, in this case, a rape culture.
00:14:00.360
And it doesn't matter if he really did it, because other white men have.
00:14:05.180
It doesn't matter if she was really a victim, because other women have been victims.
00:14:12.620
This is about collective justice, currently titled social justice.
00:14:20.280
But make no mistake, this is collective justice.
00:14:25.300
And collective justice, to put it into the terms that a Christian will understand, is anti-Christ.
00:14:59.720
You cannot balance the scales by convicting someone who is not guilty, because someone who looked like them has done it anyway.
00:15:23.560
I don't know if your neighbors do, but I think you feel it.
00:15:26.180
We are extraordinarily close to the edge of the abyss.
00:15:37.180
I promised when it came to that time, and I asked you to do the same, I would stand and say, don't go there.
00:16:18.220
And I've been explaining it on TV, and I've explained it on radio on Thursday.
00:16:55.420
People are not going to want to hear the facts.
00:17:07.440
The only things that matter are reason and facts.
00:17:46.960
Al, I have to tell you, I don't even think he attacked her.
00:18:14.500
Now, that might look like an attack to some, but it ain't.
00:18:17.780
Al, do you want a copy of the book or the audio book?
00:18:21.060
I'm going to make one out while we're talking here.
00:18:29.920
We're just giving books away to everyone who actually gets on the air?
00:18:36.080
I figure I've penciled in five books for Al for the rest of the year.
00:18:57.400
Play the audio real quick as we go into the bottom of the hour.
00:19:38.280
If you're not a subscriber, become one now on iTunes.
00:19:41.520
And while you're there, do us a favor and rate the show.
00:19:46.360
Home Title Lock is awesome because they take some real worry off your plate.
00:19:50.820
Like right now, if I were to ask you, is someone else borrowing money against your name?
00:19:54.980
You cannot answer yes or no unless you have Home Title Lock, because Home Title Lock prevents it.
00:20:01.980
In fact, they have a $100 search to see if it's already happened to you.
00:20:04.640
You get that for free when you sign up for Home Title Lock and they'll protect you on
00:20:11.700
This is just a guy who got out of prison and learned how to do this.
00:20:14.860
It is really easy to steal your home right out from underneath you.
00:20:18.200
The only people that can protect you: Home Title Lock.
00:20:25.440
Home Title Lock puts a barrier around your title and mortgage.
00:20:33.340
We are entering a new time and everything's being redesigned right now.
00:20:40.620
People aren't really talking about big fundamental things that are changing.
00:20:43.960
For instance, America was based on life, liberty, and the pursuit of happiness.
00:20:48.800
Nobody's talking about pursuit of happiness right now.
00:20:50.740
Pursuit of happiness was defined by our founders in terms of ownership, things that you could own.
00:20:58.700
And ownership is a big part of capitalism and a big part of America.
00:21:07.820
When you buy a book on Kindle, do you own the book?
00:21:11.580
When you buy a movie from iTunes, do you own the movie?
00:21:25.980
Aaron, and I want to get this right, Perzanowski, say it for me, Aaron.
00:21:38.520
We can't pronounce easy words, so that was going to be difficult.
00:21:47.840
I'm really fascinated by how we make the turns in our society for the future, and ownership
00:21:58.060
is a big part of this, because in the future, I don't know how many people will even own
00:22:05.680
But do we really own things when we buy them online?
00:22:09.640
So I think there's a real concern here that consumers go into transactions when they're
00:22:17.380
buying things, digital goods, especially digital books, movies, music.
00:22:22.620
They go into those transactions assuming they work the same way as they do in the world of
00:22:27.760
tangible goods, where if you buy a book, you can give it away to a friend, you can lend
00:22:32.720
it to someone, you can leave it in your will in the future and leave your book collection
00:22:41.300
And the rules that control these digital transactions when you buy something on your Kindle or from
00:22:47.620
iTunes are very different from the rules that we expect in the physical world.
00:22:52.780
And consumers don't really understand that distinction.
00:22:56.880
And I think that causes a real disconnect between what we all expect to happen and what happens
00:23:03.780
So to give you a quick example, just a couple of weeks ago, a consumer, a customer of the
00:23:14.820
Apple iTunes movie store found that three movies that he had purchased had been deleted from his library.
00:23:27.000
Those of us that have been following these issues closely for years would remember 10
00:23:32.240
years ago when Amazon remotely deleted books off of people's Kindles, including, ironically, George Orwell's 1984.
00:23:42.020
So these issues have been happening for a long time, but I think people are now
00:23:46.660
starting to really sit up and take notice of it.
00:23:49.500
So I remember it, because it's easier for me to read everything on Kindle.
00:23:54.060
But I have a large collection in my library of hardcover books.
00:24:03.600
I read it all on Kindle, but I have recently really been concerned, not just because I don't
00:24:09.540
actually own it and I can't have it in my library and I can't pass it on, but also because
00:24:16.340
If you're in China, I mean, at first they wouldn't sell the book, but
00:24:20.000
if they did sell the book, the government can just deem that you don't need that book.
00:24:25.500
You could just, overnight, take every copy of that book out of circulation.
00:24:30.640
If it's only digital, that's really disturbing to me.
00:24:38.020
It's a concern from the perspective of censorship, as you've just described it.
00:24:44.060
It's also a real concern from the perspective of preservation and sort of archiving our cultural heritage.
00:24:52.900
If these books are stored on centralized servers and only in the hands of, you know,
00:24:59.920
the two or three companies that dominate these markets, then there's a real risk that
00:25:07.200
we aren't going to be able to ensure the kind of widespread distribution of copies
00:25:14.060
that will allow us to archive and preserve these works.
00:25:20.160
And Aaron, with the movie, it wasn't because they found it objectionable or anything.
00:25:25.380
It's because that particular provider lost the rights to that movie, right?
00:25:31.300
And so they had to pull it from people's libraries because their rights had expired.
00:25:37.860
So there are a number of ways that this can happen.
00:25:40.380
In this most recent example, I don't know that the facts are totally clear on exactly what happened.
00:25:45.040
So one way this can happen is that, as you described, the deal between the digital retailer,
00:25:52.920
Apple or Amazon, and the copyright holder expires, and they no longer have the rights to sell that content.
00:25:59.680
It can also happen when a record label or a movie studio decides that they want to put out
00:26:05.680
the new updated, remastered director's cut edition of a movie.
00:26:11.000
And when they do that, they pull the old version to help drive the sale of the new.
00:26:17.700
So they almost force you to buy the new one. I mean, they've always done this, where,
00:26:21.900
you know, it's the masterpiece collection and it's, you know, additional footage
00:26:25.980
and fully restored, but you still had the old copy.
00:26:31.600
You can't, I mean, think of this, even just for comparison, you
00:26:36.800
can't, if they change something in a movie, imagine, remember when George Lucas changed Star Wars?
00:26:42.060
Well, I want to see what it was like when it originally came out.
00:26:47.560
Unless the movie company decided to allow you to do that.
00:26:52.420
I mean, the problem in this most recent case, in part, was that the consumer
00:26:56.200
didn't have a local copy stored on their computer or their device.
00:27:01.440
And this is just a practical tip for people.
00:27:04.160
You should always try to store as much as you can locally.
00:27:07.960
Now, these services are often trying to encourage consumers to rely not on their own storage, but on the cloud.
00:27:19.260
And sometimes, with the Apple TV, for example, the Apple TV doesn't allow you to store local copies.
00:27:28.680
You have to access it through their cloud servers.
00:27:34.680
I think that makes a big difference in your relationship.
00:27:37.700
If I downloaded something on Kindle, could I download it to another cloud and still be able to read it?
00:27:46.060
So the Kindle allows you to store those files locally on your own device, but
00:27:57.020
because the Kindle is tethered through software and network connections to Amazon, Amazon has
00:28:04.740
the ability, as they showed 10 years ago, to remove those files from your device.
00:28:15.820
Well, we saw this several years ago too, in a very different way.
00:28:18.920
I'm sure some of your listeners may remember when they woke up and found a U2 album on their phones.
00:28:32.060
You write about this a little bit, and it's an interesting change in the way we think about ownership.
00:28:37.380
In the past, you had a transaction where you'd go into a store and you'd buy something.
00:28:42.820
With these digital purchases that we're making from iTunes or Amazon, we're actually, like, entering into a relationship.
00:28:50.900
It's sort of an open-ended thing where they're constantly knowing what you do.
00:28:56.800
And you have that ongoing relationship where they can cancel that at any time without your consent.
00:29:02.760
Can you talk a bit, a little bit about the change there?
00:29:08.540
That the switch to the digital platform offers convenience, but also makes consumers vulnerable.
00:29:13.840
Unlike a purchase at a brick-and-mortar bookstore, a digital media transaction is continuous, linking buyer
00:29:18.440
and seller and giving the seller a post-transaction power impossible in physical markets.
00:29:25.040
So I think this is important for a number of reasons.
00:29:29.620
It leads to these scenarios that we were talking about earlier, where the seller of the good
00:29:35.380
has the ability not only to sort of reclaim or recall the good, but they also have some
00:29:43.040
ability to control how and when, and under what circumstances you make use of that product
00:29:50.180
So that's just not something that you could do in the tangible world, right?
00:29:55.040
Your, your local bookstore, put aside the publisher, your local bookstore can't tell you
00:30:02.920
They can't tell you, um, you know, how many times you get to read it.
00:30:07.520
They can't tell you who you get to lend that book to.
00:30:10.720
And they certainly can't keep records of all of those interactions.
00:30:13.980
And the digital world allows for that form of control.
00:30:20.760
And importantly, it's not limited just to digital media.
00:30:25.440
We have all these smart devices in our homes, on our bodies.
00:30:30.560
Um, you know, we've got our voice assistants and our fitness trackers and, you know, even
00:30:39.640
They all have software, they all have network connections, and all of these sorts of
00:30:45.660
problems that I've been describing are going to play out in that space as well, where device
00:30:52.520
makers are not only going to be able to track your behavior, but they're also going to be
00:30:57.680
able to limit the ways in which you can use the products that you think you have purchased.
00:31:04.080
So let me interrupt here and just ask you this.
00:31:08.080
When I go to iTunes, I see a movie I want to watch.
00:31:22.540
So I think there's a really good case to be made here that companies like Amazon and
00:31:27.000
Apple that use language like own and buy, words that have real meaning for people in their
00:31:34.600
everyday lives, are misstating the nature of those transactions.
00:31:39.960
So my coauthor, Chris Hoofnagle, and I wrote a paper a couple of
00:31:47.180
years ago now, called What We Buy When We Buy Now, that did a survey of about 1,500 consumers
00:31:54.500
to figure out what people think this language means.
00:31:58.300
And it turns out that a significant percentage of consumers incorrectly believe that they
00:32:06.840
do have true ownership rights and they get to keep these goods, that they can lend them,
00:32:13.080
And we think that there is an opportunity here to correct this misinformation in the marketplace.
00:32:20.420
But think about the companies that we're talking about. You know, Apple and Amazon are two of
00:32:25.280
the biggest corporations the world has ever seen, and getting them to, convincing them
00:32:34.820
to communicate in a more clear and fair way is a real challenge.
00:32:43.360
So I think there is a possibility for class action litigation here.
00:32:50.560
There are a bunch of legal and practical hurdles to making that happen.
00:32:58.640
I think the Federal Trade Commission has a role to play here.
00:33:02.660
This is squarely within their area of expertise and obligation
00:33:12.740
to police the market to make sure that consumers have accurate information.
00:33:22.000
The way the market works depends on consumers being informed.
00:33:28.940
People can't decide where to spend their money if they're being misled about the products that they're buying.
00:33:34.680
So I think that it's crucial for the functioning of the market to have that information available.
00:33:40.580
Have you done any looking into what a society without real ownership means? I mean, we're down to, you
00:33:51.200
And that's only going to get stronger as we move forward.
00:33:54.580
Have you looked into what that means for a capitalist society, and for America in particular?
00:34:04.420
So my biggest concern here is the way this changes kind of our conception of ourselves and
00:34:14.060
the way we think about ourselves as individuals in a society.
00:34:38.480
One of my favorite guys, because he does his own homework.
00:34:43.440
He looks and he tells the truth as he finds it.
00:34:47.800
He's the president of the Government Accountability Institute and a producer of a new documentary called The Creepy Line.
00:34:59.260
And actually, the creepy line comes from a speech that Eric Schmidt, the CEO of Google, gave.
00:35:05.400
In fact, he was asked, how do you make these ethical judgments about how far you're going to go?
00:35:10.440
And the, the interviewer actually asks Schmidt, are you going to implant things in our brain?
00:35:15.460
And Eric Schmidt's response was, well, we like to go right up to the creepy line, but not cross it.
00:35:22.060
And he said, we're not going to implant anything in your brain.
00:35:35.600
I've interviewed him a couple of times, and it is fascinating because he's very candid.
00:35:44.140
And I think it's his background as an engineer; he's sort of very direct.
00:35:49.380
I mean, one of the other things we quote him saying in the film is that Google has, and
00:35:54.000
takes very seriously, its responsibility to change the values of the American people.
00:36:00.060
You know, Google's mantra has always been, they are more than just a company out to make money.
00:36:06.120
They have a certain ethos, a certain worldview.
00:36:09.160
And part of the reason that they structured the company the way they did in which the founders
00:36:14.140
always have controlling shares is that that sense of social mission is part of it.
00:36:18.300
And Schmidt has been always very direct about saying it.
00:36:21.160
Part of our mission as a company has been to try to shape and change the values of the American people.
00:36:27.000
And that's sort of one of the premises of this film that it's not just about privacy.
00:36:31.620
It's not just that they're taking all this information, Glenn. They're using that information against
00:36:36.680
us to try to nudge us or to move us into directions that we wouldn't ordinarily want to go.
00:36:42.760
So let's, can you tie this all to Kavanaugh and what we've seen with the Kavanaugh
00:36:50.040
case, and how, for instance, you know, there is this overwhelming
00:36:57.200
understanding from half the country that he is absolutely guilty and she is a victim.
00:37:06.660
And there's a lot of information on the other side.
00:37:10.620
In fact, more information on the other side, but you're not really seeing that.
00:37:14.660
It's very hard, because this is happening in real time right now, to sort of monitor what's happening.
00:37:22.140
In fact, one of the things we feature in the film is a study done by Robert Epstein.
00:37:29.240
He's a Harvard PhD in psychology who studied under B.F. Skinner and was a former editor in chief of Psychology Today.
00:37:36.600
And by the way, and this is very relevant, was a Hillary Clinton supporter in 2016.
00:37:41.720
Well, one of the things he did in the 2016 election was he had 2000 people around the
00:37:47.160
country doing Google searches, uh, and they monitored the results that people were getting.
00:37:52.700
This is a very, you know, clear academic study, and this research was peer reviewed.
00:37:59.340
And what came back was that Google was systematically skewing search results in favor of Hillary Clinton.
00:38:06.320
In other words, they were suppressing negative stories about Hillary in the algorithm,
00:38:11.860
and they were pushing negative stories about Donald Trump.
00:38:14.480
And Epstein's point was, I actually supported Hillary Clinton, I thought she was more qualified,
00:38:18.740
but the bottom line is a company should not be doing this.
00:38:28.020
They're assuming the results in the list that they're getting are representative of some objective reality.
00:38:40.380
Well, if you Google it, the algorithm is giving you an answer that is skewed, right?
00:38:46.780
That's like going to a dictionary that will always change the definitions of things as
00:38:53.520
it applies to whatever's happening in the world.
00:38:59.320
And so in the context of Kavanaugh, I mean, I don't know exactly, because it's occurring
00:39:03.580
in real time, but the bottom line is there is a history here of Google doing this.
00:39:11.200
Tucker Carlson talked on Fox about these internal emails where you actually had Google employees saying,
00:39:19.520
We don't like, you know, Trump's policy on immigration.
00:39:22.340
So we want to sort of suppress certain stories.
00:39:28.640
And here's the point that we try to make, Glenn, in this film and in general:
00:39:33.300
the whole conversation that Google wants to have is about fake news and this debate about
00:39:41.820
If you and I are having a disagreement about something, I put up my fake news story and
00:39:47.080
you say, Oh yeah, I'm going to put up my fake news story.
00:39:52.780
And by the way, fake news doesn't really convince anybody.
00:39:56.020
You know, if you like Hillary Clinton, that fake news ad that the Russians ran of Jesus
00:40:01.940
and, and Hillary arm wrestling is probably not going to convince you to vote a different
00:40:06.880
That wasn't the, that wasn't a real arm wrestling competition.
00:40:09.640
But you know, the point is that that's not going to convince anybody, because
00:40:16.440
You know, people tend to look for information they want.
00:40:19.020
What Google's doing is different because we don't know what we don't know.
00:40:23.860
The question that we should be asking Google and Facebook is, why will you not show us the algorithm?
00:40:32.520
I've never believed in, you know, those dystopian movies.
00:40:35.680
I've always made fun of them and said, yeah, this is crazy.
00:40:38.960
You know, the corporations out to get you because of their algorithms, because
00:40:44.740
they are so all-encompassing, that is the world we're headed towards.
00:40:49.980
What do they tell you when they say algorithms?
00:40:54.440
Oh, no, we have to keep that top secret because.
00:40:57.460
What they argue is it's for reasons of, you know, state secrets.
00:41:01.620
And, you know, that they need to protect their trade secrets.
00:41:05.320
They need to be, you know, making sure that nobody gets access to it.
00:41:09.040
There's some truth to that, but there are a lot of things that they could do to demonstrate,
00:41:13.200
that they're offering a fair product and service to people.
00:41:17.460
And here's the thing, Glenn, they have lied about this before. You know, 10 years ago or
00:41:22.380
so, you had other companies like TripAdvisor and Yelp who were saying that Google was artificially
00:41:29.220
suppressing their rankings in Google in favor of Google-owned companies, which, okay, you could say is their right.
00:41:36.400
But here's the thing, Google flat out lied and said, absolutely not.
00:41:43.360
The best results are organically at the top.
00:41:48.300
The Federal Trade Commission, the European Union, professors at Harvard University looked
00:41:52.640
at this and said, BS, you are fiddling with the algorithm.
00:41:55.700
You are screwing these other competitors and you're lying.
00:41:59.080
So the point is when Google says you can trust the algorithm, you can trust us.
00:42:06.020
And I think the only question that remains really is how are we going to deal with this?
00:42:10.600
You know, there's an old story that Henry Kissinger told when he was on the National Security
00:42:14.960
Council: you give a president three choices, do nothing, take my solution, or thermonuclear war.
00:42:25.720
We can try to deal with some of the regulatory issues related to Google, or we can break them up.
00:42:34.720
And I think we're really at the point of point number three, because this is not a monopoly
00:42:42.760
This is controlling the news flow in the United States.
00:42:48.260
This is, in the end, Peter, controlling everything.
00:42:52.900
Google is the most likely company in America to come up with AI.
00:43:02.660
Whoever, whoever gets to AI first controls everything.
00:43:12.820
This company is the most likely in the free world to come up with it.
00:43:17.720
If we don't have them contained in some way or another, when they get to AI, we're toast.
00:43:29.040
It's not just Google, the company, a lot of people don't realize this.
00:43:34.640
If you use Safari on your Apple product, you're actually using the Google algorithm.
00:43:41.680
If you are using Yahoo, you're using Google.
00:43:49.580
All of these entities are using the Google algorithm.
00:43:52.440
So even if you say, I am not going to use Google.com, you're using it, unless you are making
00:44:00.740
If you're using any of those others, Google is the one that's dominating it.
00:44:04.080
And by the way, Google pays Apple $9 billion a year.
00:44:09.040
Google actually pays Apple to be the algorithm of choice for Safari.
00:44:13.640
That's how much they value this information and want to dominate this space.