Trump mocks the victim. Christine Blasey Ford's ex-boyfriend describes a strange line of questioning during a polygraph. Glenn explains why this is not only disturbing but also calls into question testimony given under oath.
00:03:46.380I witnessed Dr. Ford help McClain prepare for a potential polygraph exam.
00:03:53.100Dr. Ford explained in detail what to expect, how polygraphs work, and helped McClain become
00:04:01.860familiar and less nervous about the exam.
00:04:06.200Dr. Ford was able to help because of her background in psychology.
00:04:10.660Now, this is interesting because I do remember, while she was under oath, a very strange line
00:04:18.700of questioning that went a little something like this.
00:04:23.300Have you ever had discussions with anyone, besides your attorneys, on how to take a polygraph?
00:04:32.980Dr. Ford said, no, I was scared of the test itself, but I was comfortable that I could tell the information and the test would reveal whatever it was going to reveal.
00:04:56.700I didn't expect it to be as long as it was going to be, so it was a little bit stressful.
00:05:10.460I demand an FBI investigation on Monica L. McClain, who is a lifetime friend of Dr. Ford, because there is, now I want to use this word carefully, an accuser.
00:05:28.260We have to define that here in a second.
00:05:30.000An accuser saying that, while Monica McClain was interviewing for jobs with the FBI and the U.S. Attorney's Office, Dr. Ford helped her.
00:05:42.580I witnessed Dr. Ford help McClain prepare for a potential polygraph exam.
00:05:49.440Dr. Ford explained in detail what to expect, how polygraphs work, and helped McClain become familiar and less nervous about the exam.
00:06:00.260Let me play this audio again of what she said under oath.
00:06:05.060Have you ever had discussions with anyone besides your attorneys on how to take a polygraph?
00:06:16.860And I don't just mean countermeasures, but I mean just any sort of tips or anything like that.
00:06:25.500No, I was scared of the test itself, but I was comfortable that I could tell the information and the test would reveal whatever it was going to reveal.
00:06:38.780I didn't expect it to be as long as it was going to be, so it was a little bit stressful.
00:06:42.300Have you ever given tips or advice to somebody who was looking to take a polygraph test?
00:17:33.120Plus, they're made here in America, and if you're like me and you never remember, you just set up a schedule with them.
00:17:40.100You say, you know, every six months, every eight months, whatever your HVAC system calls for, and they just automatically send it to you, and you'll save 5% when you do that.
00:18:24.780He merely attacked her testimony, and after all of us have been sifting out the inconsistencies in her testimony,
00:18:33.400he merely voiced what we're all now thinking but don't want to express because we don't want to seem insensitive to a woman that was sexually assaulted.
00:18:44.120Al, I have to tell you, I don't even think he attacked her testimony.
00:21:27.920He is instead stating the facts and the absurdity of this case.
00:21:34.700And people are cheering because finally someone is saying it.
00:21:39.320In his case, to give President Trump bonus points here, he's saying things that I don't think any other president would have the balls to say.
00:21:47.620At least no other Republican president.
00:29:35.240That is Brett Kavanaugh, who now can't teach up at Harvard because all of the graduate students are saying we're not having a rapist teach, and who can't coach his basketball games anymore because they can't have a rapist.
00:31:09.860And I think there are people who vote Democrat, and people who vote Republican,
00:31:18.000people who voted for Kavanaugh, who actually feel bad for Christine Ford.
00:31:24.140I think they actually feel bad for her because they don't know; maybe something happened.
00:31:30.220I was willing to say last week that I think something happened and I don't know.
00:31:36.240It's becoming less and less clear to me that even something happened, because of all of the other testimony now that is coming out saying, no, I was her boyfriend for ten years, nine years.
00:32:52.900The documents from her doctor conflict with her report.
00:32:58.000She has changed the time frame four times in the last two months.
00:33:02.740And that is sketchy, because it appears as though she kept moving it closer, because Brett Kavanaugh would have been out of state if it had been when she originally claimed.
00:34:28.560We'll give you one of the fifteen hundred agents in your area or the area that you're moving to.
00:34:33.660And they will be able to help find the right house for you and negotiate the right price.
00:34:38.620If you're looking to sell your house, we probably have a real estate agent in your neighborhood as well that can help sell your house on time and for the most money.
00:34:47.780These people have been handpicked for the team for their knowledge, their skill and their track record.
00:34:53.360Thousands of families have already put this to the test.
00:41:51.020I'm really fascinated by how we make the turns in our society for the future.
00:42:00.200And ownership is a big part of this, because in the future, I don't know how many people will even own cars.
00:42:06.100I mean, it's just all changing.
00:42:08.580But do we really own things when we buy them online?
00:42:12.700So I think there's a real concern here that consumers go into transactions when they're buying things, digital goods, especially digital books, movies, music.
00:42:25.280They go into those transactions assuming they work the same way they do in the world of tangible goods, where if you buy a book, you can give it away to a friend.
00:42:36.480You can leave it in your will in the future and leave your book collection to your loved ones.
00:42:43.580And the rules that control these digital transactions when you buy something on your Kindle or from iTunes are very different from the rules that we expect in the physical world.
00:42:55.860And consumers don't really understand that distinction.
00:42:59.940And I think that causes a real disconnect between what we all expect to happen and what happens in fact.
00:43:06.700So to give you a quick example, just a couple of weeks ago, a customer of the Apple iTunes movie store found that three movies he had purchased had been deleted from his account.
00:43:28.500And I think that shocked a lot of people.
00:43:30.060Those of us who have been following these issues closely for years would remember 10 years ago when Amazon remotely deleted a book off of people's Kindles, including, ironically, George Orwell's 1984.
00:43:45.060So these issues have been happening for a long time.
00:43:48.260But I think people are now starting to really sit up and take notice.
00:43:52.220OK, so I remember this, because it's easier for me to read everything on Kindle. I have a large collection of hardcover books in my library, and I read so much.
00:44:06.980I read it all on Kindle, but I have recently really been concerned, not just because I don't actually own it and I can't have it in my library and I can't pass it on, but also because you watch things like what is happening in China.
00:44:19.100If you're in China, I mean, at first they wouldn't sell the book, but if they did sell the book, the government can just deem that book forbidden. You don't need to burn books.
00:44:28.380You could just, overnight, take every copy of that book out of circulation if it's only digital.
00:44:41.460It's a concern from the perspective of censorship, as you've just described it.
00:44:47.120It's also a real concern from the perspective of preservation and sort of archiving our cultural history.
00:44:55.620If these books are stored on centralized servers, only in the hands of the two or three companies that dominate these markets, then there's a real risk that we aren't going to be able to ensure the kind of widespread distribution of copies that will allow us to archive and preserve these works.
00:45:23.180And Aaron, with the movie, it wasn't because they found it objectionable or anything else.
00:45:28.720It's because that particular provider, they lost the rights to that movie, right?
00:45:34.440And so they had to pull it from people's libraries because their rights had expired.
00:45:39.940So there are a number of ways that this can happen.
00:45:43.460This most recent example, I don't know that the facts are totally clear on exactly what went on.
00:45:48.560So one way this can happen is that, as you described, the deal between the digital retailer, Apple or Amazon, and the copyright holder expires.
00:45:59.360They no longer have the rights to sell that product.
00:46:02.320But it can also happen when a record label or a movie studio decides that they want to put out the new, updated, remastered, director's cut edition of a movie.
00:46:14.480And when they do that, they pull the old version to help drive the sale of the new.
00:46:22.700I mean, because they've always done this where, you know, it's the masterpiece collection and it's, you know, additional footage and, you know, fully restored.
00:46:55.500I mean, and the problem in this most recent case, in part, was that the consumer didn't have a local copy stored on their computer or their device.
00:47:04.900And this is just a practical tip for people.
00:47:07.220You should always try to store as much as you can locally.
00:47:10.220Now, these services are often trying to encourage consumers to rely on the company's own sort of cloud storage solution.
00:47:22.580And sometimes, with the Apple TV, for example, it doesn't allow you to permanently download a copy of a movie.
00:47:31.760You have to access it through their cloud servers.
00:47:34.220So, I think that makes a big difference in your relationship with those goods.
00:47:41.080If I downloaded something on Kindle, could I download it to another cloud and still be able to read it on Kindle?
00:47:48.560So, the Kindle allows you to store those files locally on your own device.
00:47:59.840But because the Kindle is tethered through software and network connections to Amazon, Amazon has the ability, as they showed 10 years ago, to remove those files from your device.
00:50:07.380If there ever is a time of catastrophic change, there's going to be a time when we're all going to have to kind of work together and figure things out.
00:50:15.860Because things aren't going to work the same with dollars.
00:50:20.680May I suggest that the world always returns to gold?
00:50:23.440And for barter, they have the maple flex coin, which is this bar of silver about the size of a credit card that you can carry around.
00:50:53.660Is the world hurtling towards fiscal sanity or insanity?
00:51:01.480As soon as the stars start rolling the other direction, I'll stop talking about this, but I don't trust that we have anything that's going to bring us back into sanity.
00:51:13.300Other than some sort of catastrophic event.
00:51:42.800The switch to the digital platform offers convenience, but also makes consumer access more contingent.
00:51:47.740Unlike a purchase at a bookstore, a digital media transaction is continuous, linking buyer and seller and giving the seller a post-transaction power impossible in physical markets.
00:51:58.920So, I think this is important for a number of reasons.
00:52:03.400It leads to these scenarios that we were talking about earlier where the seller of the good has the ability not only to sort of reclaim or recall the good, but they also have some ability to control how and when and under what circumstances you make use of that product after the sale.
00:52:23.980So, that's just not something that you could do in the tangible world, right?
00:52:28.840Your local bookstore, put aside the publisher, your local bookstore can't tell you what country you're allowed to read a book in.
00:52:36.900They can't tell you, you know, how many times you get to read it.
00:52:41.320They can't tell you who you get to lend that book to.
00:52:44.720And they certainly can't keep records of all of those interactions.
00:52:47.800And the digital world allows for that form of control.
00:52:54.680And importantly, it's not limited just to digital media.
00:52:59.380We have all these smart devices in our homes, on our bodies.
00:53:05.120You know, we've got our voice assistants and our fitness trackers and, you know, even our home appliances and cars.
00:53:17.800And all of these sort of problems that I've been describing are going to play out in that space as well, where device makers are not only going to be able to track your behavior, but they're also going to be able to limit the ways in which you can use the products that you think you have purchased.
00:53:37.880So, let me interrupt here and just ask you this.
00:53:41.860I see when I go to iTunes, I see a movie I want to watch.
00:53:54.940So, I think there's a really good case to be made here that companies like Amazon and Apple that use language like own and buy, words that have real meaning for people in their everyday lives, are misstating the nature of those transactions.
00:54:13.760So, my co-author, Chris Hoofnagle, and I wrote a paper a couple of years ago now, called What We Buy When We Buy Now, that did a survey of about 1,500 consumers to figure out what people think this language means.
00:54:32.100And it turns out that a significant percentage of consumers incorrectly believe that they do have true ownership rights and they get to keep these goods, that they can lend them, that they can give them away.
00:54:46.960And we think that there is an opportunity here to correct this misinformation in the marketplace.
00:54:54.500But think about the company that we're talking about.
00:54:56.720Apple and Amazon are two of the biggest corporations the world has ever seen.
00:55:02.100And convincing them to communicate in a more clear and fair way is a real challenge.
00:56:02.980People can't decide where to spend their money if they're being misled about the products that they're getting.
00:56:08.120So, I think that it's crucial for the functioning of the market to have that information be correct.
00:56:14.160Have you done any looking into what a society without real ownership would mean?
00:56:21.900I mean, we're down to, you know, renting clothes and everything else.
00:56:24.980And that's only going to get stronger as we move forward.
00:56:28.360Have you looked into what that means for a capitalist society, and for America in particular, which has always been about ownership?
00:56:36.580So, my biggest concern here is the way this changes kind of our conception of ourselves and the way we think about ourselves as individuals in a society.
00:57:47.340Will it even change the way we view things and change some fundamental concepts of what it means here in America of individual rights?
00:57:58.380We have Aaron Perzanowski on with us, professor of law and co-author of the book The End of Ownership.
00:58:08.220And you can find more information at TheEndOfOwnership.com.
00:58:11.700Aaron, so tell me how you've been looking at this.
00:58:16.260So I think in the short term, what we're likely to see are more changes in the way our commercial interactions occur.
00:58:24.500In the way that commercial transactions are structured, we're going to start to see people become more and more accustomed to paying for temporary access to resources rather than owning them.
00:58:38.660And in some ways, I think that makes some degree of sense.
00:58:41.900There are some people for whom owning a car isn't necessary.
00:58:44.920They'd rather be able to take a Lyft or use some sort of car share application.
00:58:49.600And I think that makes a lot of sense.
00:58:51.560What I'm worried about is the long term set of implications for a shift away from ownership and towards temporary access, a shift away from independent control of resources to one where we have to rely on permission or the sort of goodwill of the companies that control access.
00:59:17.200So may I give you an example and see if I'm on the right track?
00:59:22.760I buy a car and I love this car and I want to keep it.
00:59:26.520And it's a classic car, but I don't own the software that runs the car.
00:59:31.480And if at any time the software company says, no, we're not going to support that, or we want to discontinue it or whatever, I have a heap of junk.
00:59:43.120I can't do anything with it because I don't own the software that runs it.
00:59:49.900We see this issue come up in the motor vehicle context, but the way it's come up most recently and most often is actually not with cars, but with tractors.
01:00:02.100John Deere, the long running American farm equipment company, makes exactly this argument that they own the software in the tractors that they sell to American farmers.
01:00:15.200And that means that farmers can only get their tractors repaired by authorized John Deere dealers.
01:00:24.920They can't go to their local mom and pop, you know, farm repair shop.
01:00:29.600I think those kinds of changes are really troubling because they go to this sense of independence and this sense of autonomy that we're all independent actors in the world who can make our own decisions, who can decide what's best for us.
01:00:47.020Do we want to keep this tractor as it is?
01:00:51.440Those decisions are being taken away from individual consumers and you're being forced to play by a set of rules dictated by the companies who, quote unquote, sell you these products.
01:01:03.600And doesn't that also stop innovation?
01:01:06.740I mean, sometimes the guy who takes something and then tinkers with it comes up with a better system.
01:01:12.920But if I'm locked out of tinkering on my own property, it almost creates this feeling of, oh, well, that's just the way it is.
01:01:25.100And that's the way it always is going to be.
01:01:30.560I think it has the real risk of doing that.
01:01:33.080It discourages people from being creative.
01:01:36.840It discourages people from, as you say, tinkering with the things that they own.
01:01:42.300We have a lot of incredible innovations that have been made over the centuries in this country that didn't come from giant corporate R&D departments.
01:01:51.940They came from individuals messing around with things that they own in their garage.
01:01:56.440And there is a risk that we're foreclosing those kinds of opportunities.
01:01:59.900But even more broadly than that, if we're discouraged from thinking of ourselves as independent actors in the world, you know, I worry that that creates a sort of complacency in our population, in our country.
01:02:15.520And, you know, not to zoom out to too wide a level here, but for a democracy to function, people have to feel, and they have to be, in charge of their own lives.
01:02:27.100They have to be invested in making informed decisions.
01:02:30.760And I worry that, you know, this lack of control over the everyday decisions might play into a much broader set of problems when it comes to people feeling like active participants in society and democracy.
01:02:47.040I couldn't agree with you more.
01:02:51.760I just don't think this is the way society is thinking anymore.
01:02:55.880Everything is about the collective and very little is about the individual.
01:03:00.480And, you know, I think you understated the case of tinkerers.
01:03:05.320I mean, if you look at the inventions in America, a lot of them, a lot of our progress came from what used to be called tinkerers, people who just did things in their own garage.
01:03:17.100And now, whether it's the government or these corporations, everyone is being told that's the way it is.
01:03:25.040Sit down. Shut up. You can't do anything about it.
01:03:27.920And I think that's extraordinary. I mean, in many ways, that's what created China over the decades.
01:03:35.720They don't think of things the same way that we do.
01:03:40.260They don't have that spirit of invention that America has always been known for.
01:03:48.220So I agree with you that the history of innovation in this country has benefited greatly from individual creators.
01:04:01.140And we need to keep an environment in which people have that ability to experiment, to innovate and ultimately to share that progress with the rest of the country and the rest of the world.
01:04:16.360So I worry that we're moving in a direction where people aren't able to build those skill sets because they live in a world of sort of locked down digital devices.
01:04:31.260So let me ask you one more question and I'll let you go. I know you've spent twice the amount of time here that you probably planned on.
01:04:37.340I am really concerned about copyrights, patents, trademarks.
01:04:43.800We seem to be entering a world where people don't take somebody's intellectual property seriously on the other side of this.
01:04:54.260They just feel that, well, I can download it. I can just take it.
01:04:59.180And we shouldn't have intellectual property rights.
01:05:02.400That is frightening, because, again, that was the second piece of the American experiment: you have a right to that intellectual property for a period of time so you can make money on it, which encourages other people to come up with their own ideas.
01:05:19.380Do you see this fading, and is this trouble on the horizon as well?
01:05:26.960So I write and teach about intellectual property and it's something that I take very seriously.
01:05:32.920And one of the things that I always try to communicate to my students is that the intellectual property system functions best when there is a balance between the interests of the public and the interests of creators.
01:05:49.380And the history of intellectual property, copyright in particular, is a history of a struggle to find and maintain that appropriate balance.
01:05:58.640And I think we're going through and have been going through kind of since the widespread adoption of the Internet, a period where we're struggling with how to answer some of those questions.
01:06:12.260There are certainly areas in which copyright holders have legitimate concerns about their works being exploited without compensation.
01:06:24.500And on the other hand, we live in a culture in which copyrighted works are sort of increasingly being distributed within these environments like Apple and Amazon, for example, where consumers can't do the things that they think they're entitled or should be entitled to do with them.
01:06:47.900So I think part of the solution here is providing consumers a strong incentive to pay for these works.
01:06:58.240That's one of the things that streaming services, I think, have gotten right, which is that they offer a really attractive deal to consumers.
01:07:06.760So people learn that if they're going to access the world's library of music, they have to pay for the privilege of doing that.
01:07:14.900But figuring out how that money gets distributed and what the right price point is, I think, is one of the sticking points.
01:07:20.800So it's an important set of questions and one that I probably can't do justice to.
01:09:01.600The trend on homes has gone down here in the last couple of months.
01:09:06.520And so now you probably have a better chance of negotiating, because everybody really wants to sell their home before the holidays and close before the holidays.
01:09:16.220So if you are looking for a home, you need to be qualified.
01:09:19.480American Financing gets you qualified in just a couple of minutes.
01:09:22.020Just go to AmericanFinancing.net or call one of the operators at eight hundred, nine zero six, twenty four forty.
01:09:27.740And they'll put you in touch with somebody that can help you.
01:09:30.400Eight hundred nine zero six twenty four forty.
01:09:33.720They will help you get a loan or refinance.
01:09:38.000If you are looking to consolidate all of your loans and refinance under your mortgage,
01:09:44.160you don't have to add extra years or anything like that.
01:09:47.100If you're looking to get out of a variable loan and get into a fixed mortgage, please do that.
01:09:53.660AmericanFinancing.net can help you.
01:09:55.620Now, these are salary based mortgage consultants.
01:11:06.140We have the author of this coming up in just a second, and it's staggering.
01:11:11.640And I don't think people realize it. What people are looking at is de-platforming and things like that.
01:11:19.420They are not thinking about the subtle moves.
01:11:23.780You know, if I controlled the information you had and I controlled what you saw and read first and you had to really dig down to find other things.
01:12:19.920I can go into the public and I can select the unstable and I can wind them up.
01:12:27.540Now, I am not saying that Google or Facebook is doing that.
01:12:31.160I am not saying that; do not connect this to them. But that is what they have the ability to do, as the governments of the world also have the ability to do.
01:12:44.560What they are doing is they are shaping us by putting through their algorithms, putting information in front of us that they prefer.
01:15:44.260She is also on record against Trump and on the board of directors of a group whose mission was to block Trump's Supreme Court nominees.
01:16:39.140I found over one hundred text and Twitter messages and a video almost two minutes long that showed Keith Ellison dragging my mother off the bed by her feet,
01:16:49.180screaming and calling her an effing B, telling her to get the F out of the house.
01:16:56.960The messages I found were mixed with him consistently telling my mom he wanted her back.
01:17:03.020He knew that he had screwed up and he wished he could do things differently.
01:17:07.640He would victim shame, bully her and threaten her if she went public.
01:17:12.180I texted him and told him, I know what you did to my mother and a few other things.
01:17:18.940The woman was forced to come out and say, yes, that is true.
01:17:23.440It's the most difficult form of abuse to articulate.
01:17:26.440I didn't want it to come out, but this is a slow, insidious form of abuse.
01:17:30.860You don't realize it is happening until it's too late.
01:17:33.620The accuser wrote, four people, including my supervisor at the time, stated that I have come to them after and shared the exact story I shared publicly.
01:17:46.300I shared multiple texts between me and Keith Ellison, where I discussed the abuse with him and much more.
01:17:54.720She said, I knew I would not be believed.
01:17:58.380In 2005, Ellison also faced accusations of domestic abuse for making harassing phone calls in which he threatened to, quote, destroy a woman.
01:18:09.340She threatened to file a restraining order.
01:18:11.940The woman wrote in an affidavit that she and Ellison had been in a romantic relationship,
01:18:19.220that he had pushed, shoved and verbally abused her and had a lawyer intimidate and threaten her.
01:18:24.780However, this particular woman in Minnesota, the ex-girlfriend from 2016, did go to a doctor.
01:18:33.800The doctor has since released the notes from the time.
01:18:39.460All of these claims are consistent with what she told the doctor at the time.
01:18:45.320The doctor was treating her for abuse.
01:19:51.640If there is a preponderance of evidence, then I will judge that person either innocent or guilty.
01:20:00.420But after I have seen facts, if this kind of stuff was going on, this should all be in the court of law.
01:20:11.360If you are a victim, society will do nothing for you because we cannot do anything for you if you haven't gone to the police and reported it.
01:20:23.600If you believe that we live in a rape culture, then you have a responsibility to go to the police and document everything that has been done.
01:20:37.380Then we as a society need to do everything we can to make sure that justice is served, not on a collective basis, but judging it by the individual case.
01:22:08.780I've interviewed him a couple of times, and it is fascinating.
01:22:13.400Yes, because he's just telling you he doesn't sugarcoat it.
01:22:17.360And I think it's his background as an engineer; he's sort of very direct.
01:22:22.560I mean, one of the other things we quote him in the film as saying is that Google has, and takes very seriously, its responsibility to change the values of the American people.
01:22:33.220You know, Google's mantra has always been they are more than just a company to make money.
01:22:39.440They have a certain ethos, a certain worldview.
01:22:42.320And part of the reason that they structured the company the way they did, in which the founders always have controlling shares, is that that sense of social mission is part of it.
01:22:51.780And Schmidt has been always very direct about saying it.
01:22:54.300Part of our mission as a company has been to try to shape and change the values of the United States.
01:23:00.240And that's sort of one of the premises of this film, that it's not just about privacy.
01:23:04.780It's not that there's taking all this information.
01:23:07.540Glenn, they're using that information against us to try to nudge us or to move us into directions that we wouldn't ordinarily want to go.
01:23:15.300OK, so let's, can you tie this all to Kavanaugh and what we've seen with the Kavanaugh case? How, for instance, you know, there is this overwhelming understanding from half the country that he is absolutely guilty and she is a victim.
01:23:39.060Right. And there's a lot of information on the other side.
01:23:43.780In fact, more information on the other side.
01:24:02.320He's a Harvard Ph.D. in psychology who studied under B.F. Skinner and was a former editor in chief of Psychology Today magazine.
01:24:10.440And by the way, and this is very relevant, was a Hillary Clinton supporter in 2016.
01:24:14.900Well, one of the things he did in the 2016 election was he had 2000 people around the country doing Google searches and they monitored the results that people were getting.
01:24:25.860This is a very, you know, clear academic study, and this research was peer reviewed, as his other work was.
01:24:32.820And what came back was that Google was systematically skewing search results in favor of Hillary Clinton.
01:24:39.940They were, in other words, suppressing negative stories about Hillary in the algorithm, and they were pushing negative stories about Donald Trump.
01:24:47.540And Epstein's point was, I actually supported Hillary Clinton, thought she was more qualified, but the bottom line is, a company should not be doing this.
01:25:13.160Well, if you Google it and the algorithm is giving you the answer that is skewed, that's like going to a dictionary that will always change the definitions of things as it applies to whatever's happening in the world.
01:26:15.300If you and I are having a disagreement about something, I put up my fake news story and you say, oh, yeah, I'm going to put up my fake news story.
01:26:25.960And by the way, fake news doesn't really convince anybody.
01:26:29.180You know, if you like Hillary Clinton, that fake news ad that the Russians ran of Jesus and Hillary arm wrestling is probably not going to convince you to vote a different way.
01:26:40.060That wasn't a real arm wrestling competition.
01:26:44.080But, you know, the point is, is that that's not going to convince anybody because of confirmation bias.
01:26:49.820You know, people tend to look for information they want.
01:26:52.260What Google's doing is different because we don't know what we don't know.
01:26:57.020The question that we should be asking people, Google and Facebook, is why will you not make your algorithm transparent?
01:27:20.300The name of the documentary is The Creepy Line, thecreepyline.com.
01:27:24.020Peter Schweizer is with us, and we have a lot to discuss because of deplatforming, kind of a roll-in from our last conversation about information.
01:27:37.140How do you know that it's true, and will you be allowed to see or keep true, actual information in the future?
01:27:48.040First, let me tell you about our sponsor this half hour.
01:28:37.540I mean, that really, I care for CarShield, but, uh, it affects the whole insurance kind of program, you know, like, that's not my problem.
01:29:33.200We, um, are talking to Peter Schweizer.
01:29:37.400He is the president of the Government Accountability Institute and the producer of a documentary called The Creepy Line, TheCreepyLine.com.
01:29:44.760Um, we're talking about Google, and, Peter, I've never believed in, you know, those dystopian movies.
01:29:52.500I've always made fun of them and said, yeah, this is, this is crazy.
01:29:56.120You know, the corporation's out to get you.
01:29:58.740Because of their algorithms, because they are so all-encompassing, that is the world we're headed towards.
01:30:06.780What do they tell you when you ask about the algorithms and they say, oh, no, we have to keep that top secret? Because why?
01:30:14.160Yeah, what they argue is that it's for reasons of, uh, you know, state secrets.
01:30:18.980Um, and, you know, that they need to protect their trade secrets.
01:30:22.300They need to, uh, you know, make sure that nobody gets access to it.
01:30:26.140There's some truth to that, but there are a lot of things that they could do to demonstrate, um, that they're offering a fair product and service to people.
01:30:34.440And here's the thing, Glenn, they have lied about this before.
01:30:38.080You know, 10 years ago or so, you had other companies like TripAdvisor and Yelp saying that Google was artificially suppressing their rankings in search results in favor of Google-owned companies.
01:30:51.060Which, okay, you know, Google has the right to do that.
01:30:53.400But here's the thing, Google flat out lied and said, absolutely not.
01:31:05.280The Federal Trade Commission, the European Union, professors at Harvard University looked at this and said, BS, you are fiddling with the algorithm.
01:31:12.860You are screwing these other competitors and you're lying.
01:31:15.860So the point is when Google says you can trust the algorithm, you can trust us, they've lied before and they're lying now.
01:31:23.100And I think the only question that remains really is how are we going to deal with this?
01:31:28.040Um, you know, there's an old story about what Henry Kissinger said when he was on the National Security Council.
01:31:32.620You give a president three choices: do nothing, take my solution, or thermonuclear war.
01:31:39.260In this case, it's kind of like that: we can do nothing, we can try to deal with some sort of regulatory fix related to Google, or we can break up these companies.
01:31:49.720Those are the three options that we have.
01:31:51.720And I think we're really at the point of option number three, because this is not a monopoly like Standard Oil, which dominated the oil market.
01:31:59.720This is controlling the news flow in the United States.
01:32:03.160This is, in the end, Peter, controlling everything.
01:48:05.180Now, this all matters, and the new LifeLock Identity Theft Protection adds the power of Norton Security to protect you against the threats to your identity and your devices.
01:48:16.420Now, nobody can stop all cyber threats, prevent all identity theft, or monitor all transactions at all businesses, but the new LifeLock with Norton Security sees the threats that you're going to miss.