Dr. Christine Blasey Ford has come forward with a new accusation against Supreme Court nominee Brett Kavanaugh. She claims that she was sexually assaulted by Kavanaugh at a high school party in the early 1980s.
00:03:33.760Democrats want this delayed as close to the November midterms as possible.
00:03:38.520They study history just like everyone else.
00:03:41.300When this nearly identical scenario happened with Clarence Thomas and Anita Hill back in 1991, it triggered what is now known as the year of the woman in the 1992 elections.
00:03:54.100Feinstein, the ranking Democrat on the Judiciary Committee, was a product of that surge.
00:04:03.440Feinstein, you know, the woman who got the letter, whose office leaked the letter.
00:04:11.780She got into office because of Anita Hill.
00:04:16.140The Ford accusation reveals a lot more than just that.
00:04:21.440Back in August, a large group of left-leaning groups co-signed a letter to both Senator Feinstein and Grassley demanding Kavanaugh's records.
00:04:31.340It was basically the same narrative that Cory Booker and Kamala Harris were using during the opening day of the confirmation hearing.
00:04:37.760Okay, this group making these demands had organized to stop Kavanaugh.
00:04:44.880One of the groups that co-signed this letter was called the Project on Government Oversight, or POGO.
00:06:50.020So, Stu, I want to give you the choice of where we begin today.
00:06:55.480We can, I've got more on Kavanaugh and some really good stuff on, you know, understanding what's really happening.
00:07:03.340But I also have a woman who works at a Maine lobster pound who is now offering to sedate the lobsters with marijuana before they're boiled.
00:07:25.480So, she wants to get the lobsters baked before they're boiled.
00:07:32.840Because surely it would become an enjoyable experience to be boiled alive after you smoke a little pot.
00:09:18.900And I have news also on that from NPR, which is very, very interesting.
00:09:24.100Okay, so there's cage-free chickens, cage-free eggs, range-free chickens, grass-fed cows, meat eaters who, you know, are doing everything they can to, you know, lessen the guilt about eating a chicken.
00:09:41.820Well, I want to eat this chicken, but how many square feet did it grow up in?
00:10:01.060Could you just keep that cow in a box from the day it was born for me?
00:10:05.960I mean, I don't want to be cruel, but, okay, so here is, in Southwest Harbor, Maine, Charlotte Gill, that's an interesting name for a woman who runs a fish store.
00:11:30.520I bet you this was her father's or her family's thing, you think?
00:11:36.360And she's grown up her whole life, you know, being torn.
00:11:41.420I mean, look, you know, this is, you've been saying the veal thing ever since I've known you,
00:11:46.880that you don't eat it for that reason.
00:11:48.400And as America's only conservative vegetarian, I would say that most of the time I, you know, whatever, you know, people like to goad me into these conversations about this stuff because it's fun.
00:13:49.120Let's put them all in a cage and let's look nice and close to the little tank and we'll meet them all before we throw them in the boiling water.
00:13:55.840We, we as a society despise those things.
00:13:59.960Let me ask you this, though, Stu, seriously.
00:14:02.100Okay, let's just say you're, I don't know what your plan is.
00:15:05.960And again, who said, you know what would make this goose liver a little better is if we jam all this food and then we put a rope around its neck, let it live, and it'll become diseased.
00:15:23.240It's like, as if we decided that they were, like, responsible for the Nazi movement and we're just, like, exacting revenge over multiple decades.
00:15:37.440It might have been, it might have been, you know, one of the lesser known Nazi doctors that were like, how can I make diseased liver into something yummy?
00:15:44.660That does sound like a Nazi experiment.
00:16:06.560My Patriot Supply is, uh, going to be one of these things where they're going to stop delivering the specials soon, I think, because everybody's going to go, yeah, you know, maybe I should do that.
00:16:19.040What happens when, uh, what happens when the Democrats win, uh, the House?
00:18:07.560Or I could give you the story that has in it the statement from the chief of police about, quote, putting that dead whale in the dumpster.
00:20:30.020Now, the sheriff said it was a mistake.
00:20:32.220Sorry, we were trying to put this whale in a dumpster, and, you know, that was a mistake.
00:20:38.680And the reason why he's saying this is because people were videotaping them taking this big bulldozer and picking up this whale off the beach and driving it to, like, you know, behind the grocery store and just dumping it into the dumpster.
00:20:55.460Now, it's clear the whale has rigor mortis because it's stiff as a board and does not move.
00:24:17.280I hate it when people put whales in dumpsters.
00:24:21.080Have you ever – like, if you – think of right now in your head, Glenn, and if you're listening to the show, think of this number in your head.
00:24:29.380How wide is a dumpster, a normal dumpster?
00:24:33.140How wide is it from left to right as you're facing it?
00:24:56.820She said all baby whales in the dumpsters.
00:24:59.940The only thing I can think of is I think they may have thought when they placed it on top, it would just fold itself into the dumpster because it's dead and, like, maybe it was so –
00:25:10.360Okay, but – okay, I thought of that too, but look at it in the, you know, the dumpster – I mean, the, you know, the shovel thing.
00:45:57.360I mean, there are, you know, garbage bags sitting lining the hallways because he says, you know,
00:46:04.020that he had a dispute with his medical waste company, and the bags contained fetuses.
00:46:10.880I mean, they were just sitting in the hallway.
00:46:13.060Some of them were stuck in the freezer.
00:46:14.620Some of them were stuck in milk cartons with name tags on them, and the place was filthy.
00:46:21.920There were cats running around, and rats, and, I mean, it really was—you know, I don't think we could have made it look as bad as it really was.
00:46:32.940So, I mean, I was really struck by how filthy everything was.
00:46:39.820I mean, how filthy it was, and you could almost smell it through the screen, you know,
00:46:48.540when you've got, you know, rotting body parts in the hallway and cats all over the place.
00:46:54.720I don't know how—I mean, how did anybody, anybody, think I shouldn't report this place?
00:47:04.280Yeah, well, I wanted to shoot the film in Smell-O-Vision, but I got shut down on that.
00:47:09.600Yeah, well, you did it with just imagination.
00:47:12.600But, you know, I think that what happened and part of the story is that this clinic was not inspected from, I guess, 1993 until 2010.
00:47:25.600There were no inspections done by the Board of Health at this clinic because of the political climate.
00:47:32.540You know, the governor, Tom Ridge, back then, who—he did not want to appear to be anti-reproductive rights, quote-unquote, or anti-woman.
00:48:31.500I mean, the infection, the disease, and the—let alone not just the kids that were killed, but also the patients that died.
00:48:42.240Yeah, and the reason that happened is exactly what you were talking about before, about not being able to talk honestly about these issues because we demonize each other.
00:48:54.920I mean, that's sort of part of the story is that abortion is so politically charged that you can't even have a rational discussion about it.
00:49:04.480And even when we agree on things, the other side is afraid to agree with you about the slightest little thing because they think they might be helping the pro-life movement.
00:50:01.020Well, part of the way Gosnell operated was that he did not have actual trained registered nurses working in his clinic.
00:50:10.340And I think probably because if he had, they would have challenged him.
00:50:14.920And so he basically surrounded himself with, you know, yes-men and sort of stooges that he could make do whatever he wanted them to do.
00:50:26.040And so he basically took, in many cases, high school girls and trained them to give the anesthesia and trained them to do some of the procedures so that he wouldn't have to be there even when some of the abortions were being done.
00:50:41.580And also so that he wouldn't have to answer to anybody.
00:50:44.920So you have these—in many cases, I thought the nurses were as much a victim as anybody else because they were kind of just doing what they were told to do.
00:50:52.760And since they'd never been trained medically, they just thought this was normal.
00:50:57.280They thought this was the way things were done.
00:51:02.140Anybody who says, oh, this is going to go back to backroom, back alley abortions.
00:51:08.660Yeah, that's what this guy was running.
00:51:11.420And anybody, even if you are—even if you're somebody who says, oh, I'm absolutely pro-choice, this—the state refusing to do any kind of inspections on abortion clinics
00:51:29.140is allowing back alley abortions to happen right now—not out of humanity, to help these girls, but strictly for money.
00:51:41.520This guy was sick beyond your imagination.
00:51:44.940We'll continue our conversation here in just a second.
00:51:47.940First, let me tell you about our sponsor this half hour.
00:51:50.060It is SimpliSafe, home security, SimpliSafe, fantastic protection, really easy to use, and you're going to save a buttload of money.
00:52:02.780First of all, when you go to their website, and I urge you to do this, you go and just kind of scroll down,
00:52:08.180and you're going to see where they talk about the savings that you're going to have over the year.
00:52:12.620Now, you compare it to anything that you've ever had for home security, you know, ADT or, you know, Wells Fargo or wherever is, you know, putting that little sign in your yard.
00:52:23.420You look at the amount of money that they charge you every single month, you know, because that—well, they were putting in a big security system.
00:53:47.520We're talking to Nick Searcy, the director of Gosnell: The Trial of America's Biggest Serial Killer.
00:53:54.520It comes out on October 12th, and you really need to see it.
00:53:57.840You need to—this is going to be one of those movies that is—they're going to do everything they can to keep this from going mainstream.
00:54:03.160You need to, you know, organize yourselves into groups and churches and whatever.
00:54:09.940Anybody who will go see this movie with you, it is—it's not shocking.
00:55:06.640And they honestly thought this guy was a monster and couldn't get anyone to help.
00:55:14.800Yeah, I mean, that's one of the main parts of the movie that I found so fascinating.
00:55:19.240I mean, when I read the script, there was just all this information in there about the procedure of abortion, about the laws surrounding it, you know, what constituted a legal abortion and what made it illegal.
00:55:32.060All that information in there, I was just like, wow, I didn't know this.
00:55:38.500And I think that's the way the story unfolds in the movie: through the detectives; they're the ones carrying that ball.
00:55:47.440Detective Woods is like discovering all this stuff and goes into this abortion clinic and just goes, wow, this can't be what they're all like, is it?
00:57:12.340He's one of those faces that you've seen.
00:57:13.600You're like, oh, my gosh, I've seen him a million times.
00:57:15.900He was in Moneyball, The Last Song, The Ugly Truth, Dead Girl, Runaway, Cast Away, Head of State, Fried Green Tomatoes, Three Billboards Outside Ebbing, Missouri, which is great.
00:57:30.180He's also, you know, recurring and, you know, a star in Justified and in 11.22.63, which is also great.
00:57:42.620Seven Days, American Gothic, HBO's From the Earth to the Moon, The Mentalist, Lie to Me, Without a Trace, The West Wing, CSI, NCIS.
00:57:52.980And now that he's the director of Gosnell and stars in the movie Gosnell, I believe he'll be playing Willy Loman in Death of a Salesman at the Waffle House just outside of Atlanta.
00:58:05.300Are you going to be able to get another job after this, Nick?
00:58:43.000One of the executive producers for the movie Gosnell: The Trial of America's Biggest Serial Killer reached out to NPR to purchase a sponsorship for an interview show, Fresh Air.
00:58:53.640He was told he couldn't use the term abortionist to describe Gosnell in the ad.
00:58:59.200The emails obtained say, let's see, support for this NPR program comes from the film Gosnell, the trial of America's biggest serial killer.
00:59:13.020The film is a true story of an abortionist, Kermit Gosnell, the story mainstream media tried to cover up because it reveals the truth about abortion.
00:59:21.120They said they couldn't use the word abortionist, but they could use the word doctor.
00:59:26.600That, he was, they went back and said, okay, can we use abortion doctor?
01:01:20.980And that's one of the things that's been most shocking to me since we made the film is how resistant everybody is to just having a conversation about the truth of what happened in this situation.
01:01:34.540They're so afraid of it that they don't even want to talk about it, and they still want to pretend that it's not there.
01:01:40.200Was he like that in real life? You've seen the videotapes, you know; it was bizarre when they first come into his house, and he seems, I don't know, just casual about everything.
01:02:01.160Almost, almost mentally handicapped, you know, calm, just, I don't know what, just completely detached.
01:02:14.180Yeah, I mean, he had this sort of air of superiority about him, and he still to this day thinks that he did nothing wrong, and that he's going to be exonerated.
01:02:27.400So he really, when they came into his house, he had no fear.
01:02:31.320I mean, he was not afraid that they were going to discover something bad about him, because he didn't think that he'd done anything wrong, except for the money.
01:02:42.120He was trying to hide the money wherever he could, because most of his, especially his late-term abortions that he did on the weekends, that was where he took only cash, and it was, it was, there were big piles of cash, stuffed all around the house, and hidden out at his beach house, and I mean, he was, I think he was concerned more about hiding the cash than anything else.
01:03:03.800And he didn't seem to live large, I mean, you know, he didn't, you know, hire a housekeeper, and his clinic was an absolute dump.
01:03:22.100But he did have, he did have, I'll say this, he did have extensive real estate holdings.
01:03:25.840He invested, you know, he had beach houses and over 17 properties, I believe.
01:03:31.940Tell me about, you know, there's one scene where you go into the judge, and the judge is like, I'm not making this about abortion, and, I mean, that's kind of what this is, he's an abortion doctor, and, you know, the prosecution, who couldn't get a doctor, at first at least, to testify against him.
01:04:04.060Well, I think it's that sort of, you know, I don't really know, I mean, I'd be guessing, but I think it's that sort of medical brotherhood kind of thing.
01:04:12.940There's, there's, you know, it's, it's just bad form to testify against one of your own.
01:04:17.740And would you not testify against Mengele?
01:04:20.020If you, I mean, he wasn't doing experiments on people, but just honest to God, just in cleanliness, this guy was a horror show, let alone what he was doing to girls.
01:04:34.180Right. Well, I think also the politics entered into it.
01:04:39.160I mean, you, you, that, the doctors, I think, had the idea, probably correctly, that if they testified in this, it was going to become a political circus, and they didn't want to be part of it.
01:04:50.120They just didn't want to spend their time, or endanger their reputations, by jumping into the middle of something this controversial, especially when they weren't involved, really.
01:05:01.400They would just be testifying as expert witnesses, and, you know, it wasn't their ax to grind, necessarily.
01:05:08.420So, the, the, one of the shocking scenes in the movie is, as they're pulling up to the courthouse, one of them says in the car, okay, prepare for a zoo, listen, don't, you know, don't answer any questions, just keep moving forward.
01:05:24.500And they open the car door, and there is no one there, and there is no one in the, in the trial, in the courtroom, nobody, nobody from the press.
01:05:36.660And it's a little shocking, especially for this time, you know, an age where, look, this is, this is one of the greatest, most horrific stories in American history, and no one was there to document it.
01:05:56.420I think, Glenn, they just couldn't figure out a way to spin it.
01:06:00.220You know, they just couldn't figure out a way to actually cover the story and not risk helping tell the truth, not, not risk helping the pro-life movement.
01:06:12.360That, that had to be what was going on.
01:06:14.880And they, the, the press stayed away from the trial until Kirsten Powers, I believe it was, wrote an op-ed saying, why isn't anybody covering this?
01:06:48.220It really has that kind of creepy feeling to it.
01:06:51.220Uh, but without any kind of sensationalism and without any gore, because, you know, you say, oh, you want to go see the abortion movie?
01:07:00.700No, nobody wants to go see an abortion movie.
01:07:03.260This was so shocking in its coldness, and in how much you just had no idea happened, that it is strangely not one of those movies that you want to look away from.
01:07:25.520Yeah, I mean, and we really consciously made that effort.
01:07:28.480We said, we want to make this something that is palatable, that people can watch, and let the truth, let the facts speak for themselves.
01:07:37.900So we really, really worked hard to try to make a PG-13 movie that everybody could see.
01:07:43.580And I think the way that we did that is that we focused on the detectives and the lawyers.
01:07:49.300We focused on the heroes of the piece, which were the people who caught Gosnell and who took him to trial and convicted him.
01:07:58.880And by doing that, it made, it made the movie more of a thriller and a courtroom drama than any sort of, you know, shock, gore movie.
01:08:08.320What's truly amazing to me is, you'd see this on Sherlock.
01:08:14.740You've seen worse than this on Sherlock, and really how many shows—I mean, geez, man, you were, you know, part of CSI.
01:09:16.260Uh, is there anything else that people can do to help get the, spread the word?
01:09:20.960Well, you know, if you go to gosnellmovie.com, we have a number of advance screenings that we're doing around the country before, uh, before the release.
01:09:30.700Uh, and I think they're playing in Dallas tonight.
01:14:20.560The book is called Addicted to Outrage.
01:14:23.180And, you know, I'm very concerned about the outrage that's happening politically, but I am equally concerned about technology that is coming our way.
01:14:35.440We are standing at the best of times and the worst of times, and it's going to be up to us whether technology and our own human instincts, the worst of us, bring us this dark future or a good future.
01:15:27.960And the way this is happening now in our society, everything is becoming political, and we're starting to divide each other and call each other names.
01:15:38.820And whether you just woke up to this or you've always known this, you've got to start changing behavior and speaking to people differently and checking yourself on social media.
01:16:01.360You wrote the book The Social Machine: Designs for Living Online.
01:16:07.140You were also part of the MIT Media Lab's Sociable Media Group.
01:16:11.760I quote you in my book saying every ping could be a social, sexual, or professional opportunity.
01:16:17.720And we get a mini-reward, a squirt of dopamine, for answering the bell.
01:16:23.180These rewards serve as jolts of energy that recharge the compulsion engine, much like the frisson a gambler receives when a new card hits the table.
01:16:36.580Cumulatively, the effect is potent and hard to resist.
01:16:40.960Tell me how—I don't think people really believe that we're dopamine addicts.
01:16:50.820I don't think in the original quote I had said that we get a jolt of dopamine, but I don't think it's really important what the exact neurology behind it is.
01:17:00.540You know, I think most of us are aware of that feeling that, for instance, if you post something, a comment, you're always interested in seeing that people have liked it.
01:17:11.240A lot of this was actually on the positive side.
01:17:13.940The addiction is not necessarily about outrage.
01:17:16.420I think at the time I was thinking more of the issue around people posting pictures of kittens online and how popular cats had gotten.
01:17:25.700But I'm not, I'm not trying to tell you—
01:17:27.420So, you know, I'm sorry if you felt I was presenting it this way; I'm not using you to prove my theory on outrage at all.
01:17:37.600I'm talking specifically just about social media and how social media is affecting us.
01:17:44.280And I think you're right, at first.
01:17:48.460And I think even in some ways now, even if it's in a negative way, we still do get that hit and that high from people saying, I like this.
01:18:29.900But if you take them and make food that's just about those things and just about those tastes and is designed just to get you to keep eating, then it's really unhealthy.
01:18:40.900And the desires both to be liked by others.
01:18:45.040If we did not care what other people think of us, that's the mentality of a psychopath.
01:18:49.900You know, you want people who care about each other, who care, you know, am I doing things that other people think are acceptable?
01:18:59.520But if you start distilling that out into a space where everything you say gives you a little measurement of how many likes you get and you can measure it against the other things you said or what other people have said, it starts getting into the realm of social junk food.
01:19:15.300Do you think it's I mean, are you I mean, I know you're studying the media now.
01:19:21.280Do you think we're at the point of social junk food, or that we're social junkies? Because, you know, you say if you don't care what people think, you're a sociopath.
01:19:47.120There's a group of people, no matter what side you're on, no matter what the topic is, there's a group of people that have been deemed the enemy.
01:19:54.060And so you can tweet whatever you want, because you'll get all of the applause from your crowd, whoever your crowd happens to be.
01:20:06.560Well, I mean, and those are deeper issues that have been exacerbated by social media.
01:20:13.080But I think you can look in history at, you know, the rise of a fascistic government in the past or any, you know, there's a long history of war in human history.
01:20:23.000So the fact that you have a country that's deeply divided by groups who think the other one is the spawn of the devil is not actually new.
01:20:31.320But we're seeing a particular version of it with social media.
01:20:34.780Partly we get to see it played out in public all the time.
01:20:37.860And I think it's also very easy to blame the technology for it without looking at some of the deeper causes.
01:20:44.220And the issues around the attention are both when it's negative, like being able to rally people to your side by saying political things that are really outrageous.
01:20:58.660But it's also a problem when it's much more positive things, like shaping all of your views in terms of what people will like.
01:21:11.900And I think from the political stand, though, I think where there's a little bit of a difference on the right and the left.
01:21:19.900And perhaps this is where we may disagree, because I think that on the right, or on the more authoritarian side—
01:21:31.800And I think there's an authoritarian left also.
01:21:34.660But where you have people who feel very, very strongly that they are absolutely right and that all the outsiders are just wrong is where you get the phenomenon you're talking about where they can or they will tweet something or post things that are not only outrageous, but not true.
01:21:55.760And it will get a great deal of approval from the others on their side and outrage the outsiders, which is what they're seeking.
01:22:04.720And that's a particularly dangerous phenomenon online.
01:22:08.560So, Judith, I think that's a dangerous thing anywhere.
01:22:13.420Myself, it doesn't have to be right or left.
01:22:15.360And, you know, I write in my book that certitude is probably our biggest threat right now.
01:22:27.440Everyone on each side is absolutely certain that their side is right, as long as you agree with it 100 percent.
01:24:06.660And I think a lot of the danger of the present moment is that ability, once you stop seeing others as human.
01:24:18.700And part of the issue is if you look at the history of highly authoritarian movements, a lot of it is about trying to portray those who are outsiders as very, very dangerous and subhuman.
01:24:36.620And so you can do anything and say anything about them and it only strengthens your inner group.
01:24:44.800And this is, you know, a phenomenon we're seeing much more now than we did even 10 years ago, and not just here, but throughout the world.
01:24:53.700There is an arrogance, in some way, to technology right now, or to those who are developing technology. I'm concerned about AI, AGI, ASI.
01:25:19.000I don't claim to know and I don't think anybody can claim to know with any kind of certainty, you know, when or if that can happen.
01:25:27.100But it is something to think about: the upgrading, the transhumanism, the upgrading of ourselves, the enhancements that are coming.
01:25:38.260We're messing with things that we don't really even understand because we don't even understand ourselves yet.
01:25:46.740We haven't mastered our own self-control.
01:25:51.740I mean, I'm not a technophobe and I'm not afraid of technology.
01:25:57.820I am concerned about the goals of some of the technology, how those programs are written, and what we teach.
01:26:07.040Are you concerned at all about how some of this stuff will change us in ways we can't then reverse?
01:26:18.080Well, I think there are a number of things to be concerned about with artificial intelligence.
01:26:22.520I think the immediate issue is the ability of machines to imitate humans in ways that we can't recognize.
01:26:33.220You know, that's something that I think a lot of people are starting to be familiar with on Twitter, where it's very hard to tell if something was written by a human or by a bot.
01:26:42.720And the issue there is that, again, especially as a lot of our conversation occurs online, if you think you're speaking with another human, one of the important parts in what happens when we communicate with others, hopefully leaving out the extremes of anger, is that there's a level of empathy underneath.
01:27:05.120Even if you're trying to persuade someone else, it's because you care what they think, and often you care what they think of you.
01:27:11.420And that's really sort of the fundamental part of our connection with others.
01:27:16.300But if you're conversing with a bot, there's no connection there.
01:27:20.460It's simply something that has been programmed to affect some means, some end.
01:27:26.520And so it may be a lot more effective and much more persuasive than people are.
01:27:32.940While the people don't recognize what they're dealing with, or even if they do, you know, if it's something like an Alexa, it becomes your friend, it's in your house, it chats with you, you know, it asks you how your day was.
01:27:46.740But you don't know what is actually going on in the program and what its internal motivations are, which is likely to be, you know, something that's beneficial to the maker of it.
01:27:59.560Yes. So this is something that is deeply concerning, that, you know, Alexa will be everywhere.
01:28:11.980Google, you know, controls so much information and placement.
01:28:15.020Just slight changes to algorithms can change people and, you know, the most likely scenario is get them to spend more, spend more time, do something that the company wants.
01:28:29.820Is it concerning to you? I've always been a capitalist. I've always been less worried about the companies.
01:28:37.980I'm concerned about the government and the companies. I'm concerned about anyone having this kind of power in our lives.
01:28:47.820Yes. I mean, I had always been mostly concerned with the companies. I thought the government was less worrisome. I'm now worried about both.
01:28:56.260But I think that, yes, the companies are quite dangerous, partly because, and we may again differ here, they want you to continue to consume things, which is not necessarily good for your pocketbook.
01:29:13.320It's terrible for the environment, you know. So if you look at, you know, even a simple case where we leave out the government, we leave out worries about, you know, fascist governments controlling people, just companies doing what they need to do to make more money.
01:29:29.500If you can turn people into even more rabid consumers than they are now, you know, what does that do to our society? It's not a particularly healthy outlook.
01:29:43.100Well, I think we've seen this already play out with, you know, Bernays in the 1920s, the first king of advertising, and how he could subtly move people in a different direction.
01:29:57.600I mean, it's why we have, you know, ham and eggs for breakfast. That wasn't anything except advertising. Very, very clever advertising at the time through our doctors.
01:30:10.620And I think we're kind of seeing just a modern version of that. Judith, I have to cut you loose.
01:30:14.880I thank you so much for your time. And thanks for being out there thinking about ethics and what's happening with technology.
01:30:23.340Judith Donath, a fellow of Harvard Berkman Klein Center. Thanks for being on the program.
01:31:24.280X chair is my new office chair and I just ordered some for the studio or I've asked somebody to order some for the studio and they haven't been ordered or something is wrong.
01:31:32.800But I know it's not with the X chair people because I talked to the X chair people and they're like, we'll ship it tomorrow.
01:32:59.820844-4XCHAIR, or xchairbeck.com.
01:33:03.880You know, here's a good example of the principle that I talk about in the book, and people say, oh, Glenn, you just want to surrender.
01:33:13.180You just want us to reconcile with everybody.
01:34:46.160That would bring us to the bottom of the post-1990 decline in the murder rate.
01:34:52.080So the lowest murder rate in about 30 years, and crime overall—cities will experience the lowest crime rate since at least 1990.
01:36:03.300Because, you know, those people that are living there are freaking out, and you would want the politics to stop. Imagine, for instance, Parkland.
01:36:15.440You've got kids in school in Parkland and nothing has been done except this big national debate about guns.
01:36:31.020Think of the people that are in Chicago that are living in those neighborhoods and they see that they're just being used by the media one way or the other, one side or the other in this stupid national argument.
01:37:43.420Now, yesterday another witness came out, and they were presenting this as, see, there's another witness who has come forward.
01:37:52.340The witness that came forward yesterday said, yeah, I remember there was a buzz in high school about this.
01:39:51.260So, I have a very low opinion, particularly of people in Washington.
01:39:54.800And I try, you try to look at people as individuals.
01:39:58.560And I just have to believe that, you know, you go home at night after a day of going on television and saying things like, they want to silence this accuser by inviting her to testify in front of America about her story and get more attention than it possibly could have ever had.
01:40:16.100When you say things like that, you have to know what you're saying is a lie.
01:41:57.820The left does not believe the things they're saying right now.
01:42:00.920And how do you, and, you know, this all comes down to the idea of the ends justifying the means.
01:42:08.300We need to stop him because we think Kavanaugh's bad.
01:42:11.580And we, if we can get past this election, we might win and we might have a chance to do our thing.
01:42:16.020So, based on that, say or do anything to stop him.
01:42:20.960Even if what you have to say is blatantly false, even if you have to accuse him of terrible, terrible things, even if you have to make arguments that are blatantly nonsensical, go ahead and do it.
01:42:34.980And that, you know, that's the position you should never put yourself in. I heard a woman, I think we actually have the audio of this, but she said the Republicans are telling the accuser she will never be able to tell her story.
01:42:55.140Dianne Feinstein handled this matter in the way that she handled it.
01:42:58.020That has nothing to do with Dr. Ford and her right to be given a fair process, and nothing can be fair when she's being rushed by an arbitrary deadline.
01:43:07.720There are no deadlines set for when this confirmation vote has to take place.
01:43:14.020They've said to her, you provide us with your statement by Friday, you show up here by Monday, and if you don't, you will forever be forbidden from telling your story, your truth about me.
01:45:05.180If she didn't experience it, someone else has experienced that very same thing.
01:45:11.320So she's just a voice for all of those who have experienced this.
01:45:17.180So we must believe her because we're no longer talking about her.
01:45:21.920We're talking about the collective and we must stop men from doing this.
01:45:27.720So we support her because she is going after the system, the patriarchy, the system that has repressed these women and kept women silent for so many years.
01:47:32.680Why did she feel empowered to say these things, only to retreat the very next day, when people took her seriously and she saw the ramifications of her words?
01:47:45.000What made her so empowered to say things that could destroy people one day, and the next day go, wait, wait, wait, I don't even know if that's true?
01:47:57.140That's a pretty powerful, dare I say it, hit of dopamine, a jolt of dopamine.
01:48:06.060There's some kind of drug that empowers you to do that.
01:48:10.620You go into a lot of that in the book.
01:48:12.020Addicted to Outrage, it is available right now at bookstores everywhere.
01:48:20.600I want to talk to you about American financing.
01:48:23.660We've brought this up three times today, that it looks like the House is going to go to the Democrats, and it is the House that, you know, files for impeachment, and impeachment doesn't mean removal from office.