The Glenn Beck Program - September 20, 2018


Best of the Program with Movie Director, Nick Searcy | 9/20/18


Episode Stats

Length

56 minutes

Words per Minute

152.2

Word Count

8,666

Sentence Count

670

Misogynist Sentences

21

Hate Speech Sentences

6


Summary

Coming up on the Glenn Beck Program today, we look at an interesting set of connections behind the hit job that's going on with Brett Kavanaugh. We also get into how not to dispose of a dead whale, and some great audio from a Parkland student, once a gun control ally of David Hogg, who has changed his tune.


Transcript

00:00:00.000 The Blaze Radio Network, on demand.
00:00:06.060 Coming up on the podcast today, we look at an interesting set of connections
00:00:11.140 as far as the kind of hit job that's going on with Brett Kavanaugh right now.
00:00:16.780 We get into that. We also get into how not to dispose of a dead whale.
00:00:22.580 We learned some important lessons on that one.
00:00:24.880 There's some great audio from one of the Parkland students
00:00:28.040 who actually was one of the people who were really against guns
00:00:33.560 and kind of partnering with David Hogg
00:00:36.780 who has kind of changed his tune and learned something interesting about America
00:00:42.060 as he's gone around it.
00:00:44.140 It's a pretty amazing moment.
00:00:46.740 We'll have that for you today as well as an interview with Nick Searcy.
00:00:49.800 He directs the new movie Gosnell: The Trial of America's Biggest Serial Killer,
00:00:53.960 which is a horror show but an important one for you to know about.
00:00:58.040 And we talk about Kavanaugh as well.
00:01:01.640 What's going on, the latest developments.
00:01:03.440 Another person came forward, said they had heard about it.
00:01:06.280 Now they've reversed their claims.
00:01:07.680 We'll give you all the details on today's podcast.
00:01:09.540 You're listening to The Best of the Glenn Beck Program.
00:01:23.080 It's Thursday, September 20th.
00:01:25.820 Glenn Beck.
00:01:27.200 All right.
00:01:30.740 You know, I'm torn.
00:01:36.100 Some days I want to live.
00:01:38.560 Some days I don't.
00:01:41.020 And, you know, the thought is, you know, it's really kind of nice.
00:01:45.220 The sun's out.
00:01:46.240 It's beautiful.
00:01:47.240 But then I have to look at the people that I live around.
00:01:51.700 Has everybody gone crazy?
00:01:58.720 Democrats, during the Kavanaugh hearings, at least are being transparent here.
00:02:08.020 Let's look at what's happening.
00:02:10.300 I'm amazed how the left is giving the entire country a direct insider look on how they go about business.
00:02:18.060 Look at the confirmation charade here.
00:02:21.680 It has everything.
00:02:23.140 It has mic check, mic check, to shout down your opponents, political grandstanding to gain favor for future elections.
00:02:32.680 I mean, if you want to talk about virtue signaling, I am Spartacus.
00:02:39.140 Then, shady plots involving big money donors that are just advancing progressive agendas.
00:02:47.100 False accusations that we, at least, at least accusations that there's no way to prove them.
00:02:55.360 This is a look at social justice, period.
00:03:00.320 This is the way America is going to work.
00:03:03.220 Look, if you fall in line with the left and you think, you know what, we got some real hope and change coming, this is it.
00:03:10.500 This is what your life is going to be like on all levels.
00:03:14.380 It's open for everyone to see.
00:03:17.560 They don't really even appear to, you know, give a crap about hiding it anymore.
00:03:22.380 Let's take the allegation from, you know, Dr. Ford.
00:03:28.140 Grassley has given Ford until tomorrow to commit to a testimony that will be conducted in the next 72 hours.
00:03:36.880 He wants it complete by Monday so they can move on with a committee vote.
00:03:42.380 But Ford and her lawyer are now delaying.
00:03:45.940 They want the FBI to do an investigation.
00:03:50.120 Okay, well, that's not the way the law works.
00:03:53.240 And by the way, maybe that's what should have happened in the first place.
00:03:57.160 Maybe when you had this accusation, you should have gone to the Maryland police before you wrote a letter to the good senator.
00:04:06.740 The good senator, when she got the letter, she should have said, Maryland state police, you should look into this.
00:04:13.640 It's a state crime, not a federal crime.
00:04:18.320 By the way, it's a state crime that appears to have a statute of limitations of one year.
00:04:24.060 But that's the way it works.
00:04:25.520 You don't come out and make this public accusation, lay out all of these charges, have zero evidence, and then say, oh, by the way, I'm going to let that smear hang out here.
00:04:39.880 I'm not going to answer a single question until the FBI do their job, which, by the way, this is not the FBI's job.
00:04:47.260 It's a delay tactic.
00:04:49.880 So why?
00:04:51.120 Of course we know why.
00:04:53.160 Democrats want this delayed as close to the November midterms as possible.
00:04:57.920 They study history just like everyone else.
00:05:00.820 When this nearly identical scenario happened with Clarence Thomas and Anita Hill back in 1991,
00:05:06.840 it triggered what is now known as the Year of the Woman in the 1992 elections.
00:05:14.180 Feinstein, the ranking Democrat on the Judiciary Committee, was a product of that surge.
00:05:20.360 Let me say that one again.
00:05:22.860 Feinstein, you know, the woman who got the letter, whose office, it appears, leaked the letter.
00:05:29.820 However, she got into office because of Anita Hill.
00:05:37.320 The Ford accusation reveals a lot more than just that.
00:05:40.720 Back in August, a large group of left-leaning groups co-signed a letter to both Senator Feinstein and Grassley demanding Kavanaugh's records.
00:05:50.640 It was basically the same narrative that Cory Booker and Kamala Harris were using during the opening day of the confirmation hearing.
00:05:57.140 Okay, this group making the demands had organized to stop Kavanaugh.
00:06:04.180 One of the groups that co-signed this letter was called the Project on Government Oversight, or POGO.
00:06:13.120 Who's the vice chairperson of POGO?
00:06:17.660 That's a woman named Debra Katz.
00:06:21.200 Yes, the same Debra Katz.
00:06:24.980 She's now the lawyer for Kavanaugh's accuser.
00:06:29.220 Don't worry, this gets even more ridiculous.
00:06:32.540 POGO?
00:06:34.000 Directly funded by, guess who?
00:06:37.080 Spooky Dude.
00:06:38.160 George Soros, the Open Society Foundation.
00:06:41.880 Soros has his fingerprints all over this.
00:06:44.340 In June, the Daily Caller reported that a group called Demand Justice had launched an effort to try to stop Kavanaugh's confirmation.
00:06:53.720 They allocated $5 million to the project.
00:06:58.680 Want to take a guess where they get their money?
00:07:01.200 Yeah.
00:07:02.720 Well, no.
00:07:03.720 Uh-uh.
00:07:04.100 Not Soros.
00:07:05.060 They get it from the 1630 Fund.
00:07:08.080 The 1630 Fund gets their money directly from George Soros and the Open Society Foundations.
00:07:15.620 So that's, you know, it's, uh, what was that, uh, money laundering, uh, play?
00:07:20.720 Oh, yeah, uh, Tides Foundation.
00:07:22.780 They're used to this kind of stuff.
00:07:24.600 Recently, they've received over $2 million from George Soros' funds.
00:07:30.020 We're, we're seeing play-by-play, almost in slow motion, everyone can see it, a live hit job.
00:07:41.300 For anybody who thought that Netflix House of Cards was fiction, was over the top, was ridiculous,
00:07:47.620 I urge you to go back and watch it again and tell me that doesn't look like Little House
00:07:54.000 on the Prairie compared to what's happening in real life today.
00:08:01.100 This is the best of the Glenn Beck Program.
00:08:03.900 Okay, so there's cage-free chickens, cage-free eggs, range-free chickens, grass-fed cows,
00:08:17.180 meat-eaters who, um, who, you know, are doing everything they can to, you know, lessen the
00:08:25.220 guilt about eating a chicken.
00:08:27.180 Well, I, I want to eat this chicken, but how many square feet did it grow up in?
00:08:35.800 Yes, I agree.
00:08:37.180 I mean, I don't eat veal for that.
00:08:38.700 I don't want my, I don't want my cows tortured so they're a little more tender, you know?
00:08:43.900 I don't know.
00:08:45.020 Not quite tender enough.
00:08:46.420 Could you just keep that cow in a box from the day it was born for me?
00:08:51.540 I mean, I don't want to be cruel, but, uh, okay.
00:08:54.740 So here is, uh, in, in Southwest Harbor, Maine, Charlotte Gill, that's an interesting name
00:09:02.040 for a woman who runs a fish store.
00:09:03.580 Charlotte Gill, uh, runs Charlotte's Legendary Lobster Pound, uh, and she is now getting the
00:09:13.000 lobsters high before boiling them alive.
00:09:16.820 She said, I feel bad when lobsters come here and there's no exit strategy.
00:09:22.620 Okay.
00:09:23.220 First of all, hang on just a second.
00:09:24.520 Charlotte, the lobsters aren't coming there.
00:09:29.060 They've been captured.
00:09:31.120 It's not like a lobster walks in the door, ding, ding, the door, the doorbell goes off.
00:09:35.800 They walk in.
00:09:36.580 Oh, we've got customers.
00:09:37.660 No, they're lobsters.
00:09:39.320 They've come in for something else.
00:09:41.200 It's like, it's like the late 1700s.
00:09:43.480 There's a lot of African tourists that are coming to our land.
00:09:46.420 Have you noticed that lately?
00:09:47.640 These, these tour ships are showing up.
00:09:50.460 I'm like, no, that's not why, that's not what those are.
00:09:52.480 I feel bad when lobsters come here.
00:09:56.020 She owns this place.
00:09:57.700 I feel bad when lobsters come here and there is no exit strategy.
00:10:04.160 I got news for you.
00:10:05.620 There is an exit strategy.
00:10:07.060 You boil them and we eat them.
00:10:10.120 That's the exit strategy.
00:10:12.760 Can I tell you something?
00:10:13.700 I'll bet you she's inherited.
00:10:15.880 I bet you this was her father's or her family's thing.
00:10:20.840 Do you think?
00:10:21.260 And she's grown up her whole life, you know, being torn.
00:10:26.720 I mean, look, I, you know, this is, you've been saying the veal thing ever since I've known you, that you don't eat it for that reason.
00:10:33.760 And as, uh, America's only conservative vegetarian, uh, I would say that, uh, most of the time I, you know, whatever, you know, people like to goad me into these conversations about this stuff because it's fun.
00:10:45.140 But the lobster thing's insane.
00:10:47.840 Guys, we're throwing them alive into boiling water.
00:10:51.360 It's completely nuts.
00:10:53.360 Yes, I, there are a million.
00:10:55.360 What are you going to do?
00:10:56.280 Shoot them?
00:10:57.060 Yes.
00:10:58.160 Anything.
00:10:58.740 Like, first of all, I would argue, of course, the answer to that would be no, but still, if you're going to kill them, putting them into boiling water is completely nuts.
00:11:08.380 Do we have some vendetta against these things?
00:11:11.140 Like we, dude, did they, did, are they responsible for like the Adam and Eve thing?
00:11:15.680 And I'm not aware of it.
00:11:16.520 Were they in?
00:11:17.160 No, here's the thing.
00:11:18.000 Come on.
00:11:18.400 You know this.
00:11:19.040 If they weren't living under cover of water, we'd all be exterminating them.
00:11:24.800 We'd be terrified.
00:11:25.740 If they were crawling out.
00:11:26.460 Yeah.
00:11:26.740 They, if they were crawling out from underneath your refrigerator, we'd not be eating them.
00:11:32.000 We'd be exterminating them.
00:11:33.140 Which is another weird thing.
00:11:34.960 If it's, if you had a freaking red bug walking through your house like that, you're not bringing it.
00:11:39.580 You're not going to, oh, let's put it in the oven or let's boil it and eat it.
00:11:43.060 That would be weird.
00:11:44.360 I never, my, my daughter, Mary, when she was very young, I went on vacation up like, you know,
00:11:48.980 Nantucket or Cape Cod or someplace like that.
00:11:51.720 And, uh, and I, we went, we bought lobsters and I put them down on the floor and let them
00:11:56.240 crawl.
00:11:56.660 And Mary freaked out.
00:11:59.220 She was like, you're not going to make me eat bugs.
00:12:02.040 I won't eat bugs.
00:12:03.440 I won't eat bugs.
00:12:05.900 That's really what they are, man.
00:12:07.280 How hungry were you to go?
00:12:09.380 I don't know.
00:12:09.920 That big thing that just crawled out of the water.
00:12:12.040 What do you say?
00:12:12.560 We eat that.
00:12:13.740 I know.
00:12:14.300 And I think that's it.
00:12:14.980 I think because they're so ugly and creepy, we're like, sure.
00:12:18.360 We can go all Hannibal Lecter on them.
00:12:19.900 Let's just boil them.
00:12:21.060 Like there, there's every, we hear this, you know, the, oh, I don't want to, uh, we're
00:12:25.320 going to hunt, uh, with, uh, you know, we don't want to be cruel.
00:12:29.220 And, and when you put all these things in place and like lobsters were like, ah, just
00:12:33.380 rip them out.
00:12:34.120 You know what?
00:12:34.480 Let's put them all in a cage and let's look at them nice and close in the little tank and we'll
00:12:38.520 meet them all before we throw them in the boiling water.
00:12:40.640 We, we, as a society despise those things.
00:12:45.180 Let me, let me ask you this though, Stu, seriously.
00:12:47.640 Okay.
00:12:48.140 Let's just say you're, I don't know what your plan is.
00:12:51.440 We electrocute them.
00:12:52.980 What is your plan?
00:12:54.160 I would, my plan would be not to eat them at all, as I think you're aware, but still,
00:12:58.180 if you're going to do it, it needs to be something else other than boiling them.
00:13:01.640 We don't boil any, anything else.
00:13:03.240 We don't be like, Hey cows, here's a giant vat of water.
00:13:05.940 Get in.
00:13:06.800 And then, oh, they're nice and boiled.
00:13:08.160 Let's put them in stew.
00:13:09.140 We don't handle anything else like that.
00:13:11.380 I don't understand why they're like, it's like you could hear them making all the noises
00:13:15.420 and they're trying to climb out.
00:13:17.260 And we're like, oh, this is okay.
00:13:20.180 It's a weird thing as a society that we do.
00:13:23.460 There's a few of them.
00:13:24.400 You've pointed out the veal thing.
00:13:25.800 The foie gras is.
00:13:27.520 Foie gras is another thing.
00:13:28.940 I love foie gras.
00:13:30.240 Will not eat it.
00:13:31.180 That is just horrible.
00:13:33.360 There's a few of them.
00:13:33.960 If anybody doesn't know how they make foie gras, they tie, they, they force feed a goose and
00:13:41.740 then they tie their neck closed.
00:13:44.320 And so their liver becomes diseased.
00:13:47.560 So they're, I mean, it's just, it's horrible.
00:13:50.520 Right.
00:13:50.640 They force feed them.
00:13:51.220 And again, who said, you know, what would make this goose liver a little better is if
00:13:56.900 we jam all this food and then we put a rope around its neck, let it live and it will become
00:14:03.260 diseased.
00:14:04.940 Ick.
00:14:05.420 Yeah, that is, uh, it's like, it's as if we decided, it's like, as if we decided that
00:14:10.160 they were like responsible for the Nazi movement and we're just like exacting revenge over
00:14:15.500 multiple decades.
00:14:16.420 It's just, it might've been the Nazis.
00:14:19.100 It might've been the Nazis.
00:14:20.280 There is.
00:14:21.740 That's right.
00:14:22.500 Maybe it might've been, it might've been, you know, one of the lesser known Nazi doctors
00:14:26.660 that were like, how can I make diseased liver into something yummy?
00:14:30.020 That does sound like a Nazi experiment.
00:14:32.300 It really does.
00:14:33.560 It really does.
00:14:35.420 This is the best of the Glenn Beck program.
00:14:40.900 Hi, it's Glenn.
00:14:52.720 If you're a subscriber to the podcast, can you do us a favor and rate us on iTunes?
00:14:57.300 If you're not a subscriber, become one today and listen on your own time.
00:15:01.360 You can subscribe on iTunes.
00:15:03.000 Thanks.
00:15:03.300 Stu, I, I just tweeted the lobster, the, no, sorry, not the lobster story, the whale
00:15:07.700 story.
00:15:09.340 Um, and have you seen the video?
00:15:11.360 Go to, go to my, uh, Twitter account and at Glenn Beck.
00:15:15.020 Mm-hmm.
00:15:15.340 Mm-hmm.
00:15:16.080 And, uh, just, just look at the whale story.
00:15:18.400 Now here's the thing.
00:15:19.160 They've taken a whale.
00:15:22.480 Now the sheriff said it was a mistake.
00:15:24.920 Sorry, we were trying to put this whale in a dumpster and, uh, you know, that was a mistake.
00:15:31.380 And the reason why he's saying this is because people were videotaping them taking this big bulldozer
00:15:37.060 and picking up this whale off the beach and driving it to, like, you know, behind the grocery store
00:15:42.800 and just dumping it into the, into the dumpster.
00:15:46.960 Now, it's clear the whale has rigor mortis because it's flat as a board and does not move.
00:15:54.100 Right?
00:15:55.300 Okay.
00:15:55.680 Have you seen it yet?
00:15:56.780 Well, uh, let's see.
00:15:58.040 I'm getting commercial, uh, I'm getting a computer issue is what I'm getting.
00:16:02.220 Thank you.
00:16:03.180 Awesome.
00:16:03.480 Would you please buy Apple, please?
00:16:05.840 This isn't Apple.
00:16:06.740 It's not, it's not Apple's fault.
00:16:07.880 It's the website.
00:16:08.440 It's just playing the commercial over and over again.
00:16:10.300 Oh, okay.
00:16:10.820 Uh, so it's going to take me a minute, but go ahead.
00:16:12.680 Uh, so they, cause they are taking the, uh, they're taking, I can see the picture of it
00:16:16.100 and it does not look like it would have made any sense to attempt this.
00:16:19.820 Right?
00:16:20.720 It's, it's in, it's bigger than the bulldozer's loader.
00:16:25.220 Yes.
00:16:25.620 All right.
00:16:26.400 It's, it's barely in that.
00:16:28.180 It's like, it's like, it looks like a, a, a, a mini dealership recommended one of their
00:16:32.860 cars to Michael Moore is what it looks like.
00:16:35.160 Exactly right.
00:16:36.000 This car is not for you, Mike.
00:16:37.080 It's not going to fit.
00:16:37.900 It's not, you're not going to fit.
00:16:38.940 It's not going to fit as they're squeezing him in all the, all of the,
00:16:42.680 the salespeople are just pushing his fat into the car.
00:16:45.900 It's not going to fit guys.
00:16:47.560 It's not going to fit.
00:16:49.540 This is flapping out the back windows.
00:16:51.860 It's like pushed up against the windshield.
00:16:53.500 It looks good on you.
00:16:55.020 It looks good on you.
00:16:56.400 Okay.
00:16:56.680 So, so it won't even fit into the loader.
00:17:00.980 And then they drive up to the, to the dumpster and they just let the loader go plop.
00:17:09.440 And the thing, the poor thing, I mean, it's horrible.
00:17:11.740 It's really horrible.
00:17:12.900 I mean, we have to remember it is dead, but it, it's horrible.
00:17:16.860 It just kind of lies across the dumpster for a second and then falls down.
00:17:22.100 Yeah.
00:17:22.440 Cause there's no way it's going to fit.
00:17:24.360 It's not even remotely close.
00:17:26.820 Right.
00:17:27.600 Okay.
00:17:28.060 So this is how, this is how, how sheep like we are.
00:17:34.680 How did that happen?
00:17:37.220 Well, they, somebody said, Hey, there's a, there's a baby whale.
00:17:43.280 It's died.
00:17:44.060 It's not a baby whale.
00:17:45.020 This is a small whale, baby whale.
00:17:47.140 It's died washed up on shore.
00:17:49.680 So, so, so the, I don't know the beach pickup police or whatever they are, they call and
00:17:57.360 say, what do we do?
00:17:58.260 We got a baby whale.
00:17:59.560 What do we do with it?
00:18:01.020 And the, the sheriff said, Oh, it's a baby whale.
00:18:03.300 Just throw it in the dumpster.
00:18:05.320 That's not a baby.
00:18:07.140 You should, there's more to that conversation.
00:18:09.520 The guy's like, I, he just said the baby whales go in the dumpster.
00:18:13.280 But nobody said, I don't think it'll fit in the dumpster.
00:18:17.800 It's a pretty big baby.
00:18:19.380 And I don't, I mean, I, I am not a waste management engineer.
00:18:24.780 However, I would, you know, you got to think past step one here.
00:18:29.720 Like it's in the dumpster.
00:18:31.460 Can, can a, I mean, maybe it can, but can one like garbage, garbage truck that picks up
00:18:35.560 those dumpsters, can they lift a 4,000 pound whale?
00:18:40.120 You've never, excuse me.
00:18:41.420 You've never thrown a little fish out into the garbage.
00:18:44.260 I will know, but I, I would assume this is not, this is not a little fish.
00:18:48.700 It's a baby whale.
00:18:50.660 It is a baby whale.
00:18:52.020 All baby whales should fit into a dumpster, right?
00:18:55.220 You know, that's the other thing I thought of.
00:18:56.860 First of all, I mean, who owns the dumpster?
00:18:59.520 Is it, is it like the, is it like the grocery store is like, Oh crap.
00:19:03.600 Who put the whale in the dumpster?
00:19:06.040 Now we don't have any room.
00:19:08.160 What are we going to do now?
00:19:09.520 Yeah.
00:19:09.980 I hate it when people put whales in dumpsters.
00:19:13.520 Have you ever, um, uh, like if you think of, think of right now in your head, Glenn, and
00:19:18.760 if you're listening to the show, think of this number in your head.
00:19:22.040 How wide is a dumpster, a normal dumpster?
00:19:25.960 How wide is it from left to right as you're facing it?
00:19:28.340 10 feet.
00:19:31.100 Okay.
00:19:31.500 I mean, I'm good.
00:19:32.280 I was thinking about six or seven.
00:19:34.660 Seven, maybe seven.
00:19:35.920 Yeah.
00:19:36.400 The whale was four feet.
00:19:37.960 The whale is 27 feet.
00:19:45.040 Who would look at it?
00:19:48.360 The only thing I can think of is, I think they may have thought when they placed it on top,
00:19:56.020 it would just fold itself into, into the, you know, the dumpster because it's dead.
00:20:01.620 And like, maybe it was so.
00:20:03.120 Okay.
00:20:03.240 But, okay.
00:20:04.060 I thought of that too, but look at it in the, you know, the, the dumpster.
00:20:08.760 I mean, the, uh, the, the, you know, the, the shovel thing, what do you call it?
00:20:13.600 The bulldozer.
00:20:14.440 That's such a man.
00:20:15.280 The bulldozer's loader or whatever that thing, you know, the big shovel thing in the front.
00:20:21.480 Right.
00:20:21.880 If you look at it, it's not bending.
00:20:25.480 No, it doesn't fit into that either.
00:20:28.340 And it's not bending.
00:20:29.760 So who was like, Hey, it's not bending now, but when we put it in the dumpster, uh, sheriff
00:20:36.700 said, baby whales go in the dumpster.
00:20:39.960 There's no brain power here at all.
00:20:43.480 I would, I would tend to agree with this analysis, uh, but I, I, I feel like we're going to have
00:20:50.840 very few of these, uh, situations going forward.
00:20:53.100 I feel like they've now proven this is not the most reliable way to dispose.
00:20:58.180 I mean, it was only a few years ago.
00:20:59.600 They tried to blow one up.
00:21:00.560 Remember when they blew one up on the beach?
00:21:02.280 Remember that one?
00:21:03.000 They just like filled it with TNT and just made it explode.
00:21:05.820 Okay.
00:21:05.980 So the baby whale, cause those are in the dumpsters.
00:21:08.740 Big whales, uh, sheriff says, uh, blow it up.
00:21:15.900 Like that's good.
00:21:16.880 Hell think of that.
00:21:18.180 We got to, what are you going to do to dispose of this?
00:21:21.300 Put a bunch of dynamite in his mouth.
00:21:24.340 It's going to scatter whale everywhere.
00:21:28.340 We are a weird.
00:21:29.280 It just disappears.
00:21:30.460 I see it in, I watched in a $6 million man once he blew stuff up and there wasn't anything
00:21:36.060 left.
00:21:38.740 We are too stupid to run a country.
00:21:43.120 We really, we should, we should just give up.
00:21:44.660 We really should.
00:21:45.280 We really should.
00:21:46.800 You know what, America?
00:21:47.560 Put your tools down.
00:21:48.580 I think we're done as a country.
00:21:50.200 Just, just walk away.
00:21:51.980 I don't know if you're working the bandsaw.
00:21:53.580 You probably shouldn't be working that.
00:21:55.240 Just put the tools down, turn off the machine.
00:21:58.440 Let's go home.
00:21:59.240 This is why I argue for the matrix and I've been doing it for a long time.
00:22:02.480 If we can just be fuel for some alien culture and then they just, we just lay down in like
00:22:07.760 a pool of, you know, some sort of jelly.
00:22:10.280 Everyone's like, oh, we need a red pill.
00:22:11.460 I'm kind of okay with that.
00:22:12.020 You had the author or the director of that documentary, Red Pill, right?
00:22:16.360 Red Pill, yeah.
00:22:17.160 On, Cassie Jaye, the other day.
00:22:19.800 And I was thinking, I kind of, kind of want to go blue pill.
00:22:22.980 I kind of want to go blue pill and get into the gel.
00:22:25.500 And then in, like, in my mind, I'd think things were kind of normal.
00:22:29.920 And, you know, I-
00:22:30.640 Except I don't like the matrix because they made it real enough to where, you know, if
00:22:35.860 I, if I swear to, if I find out that we're in a matrix and I'm actually in a pod in gel
00:22:43.120 and I still am fat in my matrix life and in my matrix life, I can only eat the crap
00:22:55.020 like kale and I still get fat and I still have to exercise, I'm telling you, I am taking
00:23:02.980 the entire matrix down because that pisses me off.
00:23:08.260 It's a fair point.
00:23:09.380 Well, they said that in that documentary, The Matrix, in which they discuss, they tried
00:23:13.660 to make it perfect for humans and make everybody happy, but then we were such whiners.
00:23:17.420 We're so addicted to outrage.
00:23:18.960 We couldn't handle the perfect life they designed for us.
00:23:21.320 I can handle it.
00:23:21.840 I can handle it.
00:23:22.280 I can handle it too.
00:23:23.380 I am not on that bandwagon.
00:23:25.020 I am not on that bandwagon.
00:23:26.280 Give me the blue pill.
00:23:27.940 I will tell you, I met somebody here in Southern California yesterday and said, so what's Texas
00:23:32.420 like?
00:23:32.920 And I said, you know, California?
00:23:36.780 Yes.
00:23:37.400 Not like that.
00:23:38.300 No.
00:23:38.740 It's not like that.
00:23:40.380 And I said, oh, it's beautiful.
00:23:42.320 I said, in a different way.
00:23:43.560 You know, California has mountains and we don't.
00:23:46.780 You have trees and, well, we don't.
00:23:48.820 I said, you know, you have beautiful green grasslands.
00:23:52.360 We don't.
00:23:52.740 We don't.
00:23:53.760 But we have sky.
00:23:55.600 The sky.
00:23:57.180 Actually, I'm down to the sky.
00:23:59.120 Sky is beautiful in Texas.
00:24:01.120 It's beautiful.
00:24:02.000 And I said, some of the most beautiful sunrises and sunset.
00:24:04.780 I said, it's like Arizona in a way where it's just beautiful.
00:24:08.540 These cloud formations and they're huge and blah, blah, blah.
00:24:11.660 And she said, man, clouds.
00:24:15.840 You know, we don't even get clouds here.
00:24:18.400 Sometimes I just look up in the sky and there's not a cloud.
00:24:21.680 And I'm like, can we at least get a cloud?
00:24:24.400 And I wanted to say, shut up.
00:24:28.060 Shut up.
00:24:30.140 What did you need it to say?
00:24:31.080 We don't even get clouds.
00:24:32.340 It's always 75 degrees and perfectly blue.
00:24:37.040 Oh, I am so tired of it.
00:24:40.720 Shut up.
00:24:42.060 It is.
00:24:42.700 It is frustrating.
00:24:43.680 And in Texas, we don't have everything that California has, starting with a 13% state income tax.
00:24:49.040 That's right.
00:24:49.880 That's right.
00:24:50.680 You can.
00:24:51.100 Yeah, that's that.
00:24:52.260 There's your cloudy day.
00:24:54.700 Every April 15th.
00:24:57.200 OK.
00:24:58.860 When we come back.
00:25:01.480 Either Kavanaugh.
00:25:05.340 Either Kavanaugh.
00:25:07.040 Or the woman who's getting the lobsters high before she puts them into a tank.
00:25:13.500 Because there is some controversy surrounding that act of mercy.
00:25:23.820 This is the best of the Glenn Beck program.
00:25:25.840 And we really want to thank you for listening.
00:25:27.540 Stu, I just heard some amazing audio.
00:25:39.800 And it came out, I think, last night about, what's his name in Florida?
00:25:45.100 He was part of the March for Our Lives.
00:25:48.360 Yeah.
00:25:48.780 Cameron Kasky.
00:25:50.460 OK.
00:25:51.280 He's one of the guys that was one of the big movers and shakers in the March for Our Lives.
00:25:57.020 You know, the kids of Parkland.
00:26:00.020 He said yesterday he really regrets what he said to Marco Rubio.
00:26:03.860 And, you know, he was part of the group that was like, Marco Rubio, you know, you are taking money from the NRA.
00:26:09.260 And you're just want kids to die.
00:26:11.720 I want you to listen to what I think is one of the bravest people I have heard.
00:26:20.980 Especially considering his age.
00:26:23.960 Here's a high school kid who is now in Parkland.
00:26:28.260 If you're part of this, you see how the media has torn apart anybody who disagrees.
00:26:36.540 You've seen how people have torn the, you know, the conservative kids, how they've just been dismantled.
00:26:48.500 And you're willing to say this?
00:26:51.780 Listen to this interview.
00:26:53.060 This summer when March for Our Lives went on the summer tour that we embarked on, I met that person in Texas who's got that semi-automatic weapon because that's how they like to protect their family.
00:27:04.820 I met the 50 some odd percent of women who are pro-life, even though I thought that it was preposterous that a woman could be pro-life and not pro-choice at the time.
00:27:12.460 I learned that a lot of our issues politically come from a lack of understanding of the perspectives and also just the fact that so often young conservatives and young liberals will go into debate, like I said earlier, trying to beat the other one as opposed to come to an agreement.
00:27:32.460 And, you know, that's natural.
00:27:33.480 It's important for things to be a bit competitive because I think competition is very important for everything.
00:27:38.680 But it comes to a point where all we're doing right now is drag each other apart.
00:27:42.780 I mean, the people who were OK with Trump will forgive him for anything.
00:27:46.660 And the people who didn't like Trump will pretend that every single thing he does is pure, utter evil.
00:27:51.740 And it's a direction we need to head away from.
00:27:54.840 So I'm working on some efforts to encourage bipartisanship or at least discussion that is productive and help a lot of people avoid the mistakes that I make.
00:28:04.820 Is this, is this, is this kid unbelievable?
00:28:11.280 This is unbelievable.
00:28:13.520 What is he saying?
00:28:15.280 He's saying absolutely everything that so many people are fighting against.
00:28:22.180 He is making the message of my book.
00:28:25.640 Look what he did.
00:28:26.760 He was part of an angry, I think, almost a mob, an angry mob of kids that were, were at least portrayed on television as all David Hogg.
00:28:40.500 David Hogg's not listening to anybody.
00:28:42.640 He is full of certitude.
00:28:44.720 He's right.
00:28:45.460 You're wrong.
00:28:46.440 No ifs, ands, or buts.
00:28:48.840 This kid was part of that.
00:28:52.200 And then what did he say?
00:28:53.660 When I was in Texas, I met that person that uses an AR to protect their family.
00:28:58.880 Well, now, how did he meet that person?
00:29:00.920 Do you think he met that person because they were holding up a sign that said, you kids are idiots?
00:29:08.300 Was holding up a sign that said, you have to be stopped?
00:29:12.020 Was screaming names at them?
00:29:16.460 Was, was, was tweeting horrible things about these kids?
00:29:21.240 I highly doubt it.
00:29:23.660 He met a woman who was pro-life.
00:29:26.420 He said, at the time, I couldn't believe a woman would be pro-life.
00:29:30.140 He said, I met her and I talked to her.
00:29:34.040 What does that mean?
00:29:35.720 That means there were reasonable conservatives, and I think we all try to be, reasonable conservatives
00:29:43.600 that were calm enough, rational enough to find the one, not to go to David Hogg, but to find the one in the group that was honestly listening.
00:29:55.360 And they changed his heart.
00:29:58.360 That's exactly the point of the book, Addicted to Outrage.
00:30:06.780 And it's exactly the point I've been saying.
00:30:09.700 There, there are people, they are not necessarily the David Hoggs.
00:30:13.940 They're not necessarily the ones you see on TV all the time.
00:30:17.180 But there are people who are truly sick of this.
00:30:21.560 They know this doesn't work.
00:30:23.880 They don't want to do it anymore.
00:30:25.780 But they can't find anybody reasonable to talk to.
00:30:29.440 I congratulate this guy.
00:30:35.280 You are, you are my hero of the week, dude.
00:30:40.560 Congratulations.
00:30:41.900 And if there is anything we can do to help you meet with other people of different perspective, I'm all in.
00:30:56.140 The best of the Glenn Beck program.
00:30:59.440 I want to talk to you about something serious here.
00:31:14.260 And I, I, I, I, it's a really, it's a really strange thing.
00:31:20.160 I am fascinated by this story and repulsed by the story at the same time.
00:31:24.860 But the parts of the story that really fascinate me.
00:31:27.740 Are the parts how you don't know this story.
00:31:32.000 You, even conservatives think they know this story.
00:31:35.980 They have absolutely no idea.
00:31:38.580 I watched this movie a couple of weeks ago.
00:31:41.060 The trial of America's biggest serial killer.
00:31:43.800 And it blew me away.
00:31:47.140 It doesn't even have to, it is, but it doesn't even have to be a good movie.
00:31:51.040 The story, the story, the facts of this story are so incomprehensible that it has happened recently and no one found this interesting to cover.
00:32:05.240 It is one of the craziest stories you've ever seen.
00:32:09.960 And they bring it to life in a new movie that had to be impossible to make.
00:32:15.200 Gosnell: The Trial of America's Biggest Serial Killer.
00:32:19.020 If you don't remember who Dr. Gosnell is, let us remind you.
00:32:25.920 The director, Nick Searcy is, is on, Searcy is on with us now.
00:32:30.140 Hi, Nick.
00:32:30.560 How are you?
00:32:31.000 Hey, Glenn.
00:32:33.360 I'm great.
00:32:33.920 How are you?
00:32:34.520 I was hoping that I was going to be hero of the week, but I guess I'll set it up.
00:32:39.420 Well, you can be unpopular with all the girls just because you made this movie.
00:32:47.320 Nick, I have to tell you, I thought I knew this story.
00:32:51.880 I mean, we covered it.
00:32:52.700 I thought I knew this story, but until I saw your movie, I didn't connect visually with what that place was like.
00:33:03.960 And I also didn't know that this was a local, they were looking for doctor shoppers.
00:33:10.820 They stumbled into this guy.
00:33:12.680 Yeah, I mean, that's one of the fascinating aspects of the story is that they went after him because he was writing prescriptions for opioids and selling them.
00:33:24.040 And it was a drug case, basically.
00:33:26.000 And so when they raided the clinic, it's when the lead detective, James Wood, who we call Woody in the film, is he's just appalled by what he sees in the clinic.
00:33:37.540 And he just goes back to the DA's office and says, I don't know what's going on in there, but it can't be legal.
00:33:43.080 So it really was stumbled upon because of looking for the drugs.
00:33:50.300 So, Nick, was it – I mean, look, I understand dramatic storytelling and everything else.
00:33:57.580 Was the clinic really in that kind of shape?
00:34:05.720 Absolutely.
00:34:06.620 I mean, it's depicted as well as we could in the film.
00:34:09.240 But when you see the real photographs and the real footage that James Wood took when he went into that clinic, it's incredible.
00:34:17.800 I mean, there are, you know, garbage bags sitting lining the hallways because he says, you know, that he had a dispute with his medical waste company.
00:34:27.300 And the bags contained fetuses.
00:34:30.940 I mean, they were just sitting in the hallway.
00:34:33.100 Some of them were stuck in the freezer.
00:34:34.500 Some of them were stuck in milk cartons with name tags on them.
00:34:40.000 And the place was filthy.
00:34:42.300 It was cats running around and rats.
00:34:45.380 And, I mean, it really was – you know, I don't think we could have made it look as bad as it really was.
00:34:52.980 So, I mean, I was really struck by how filthy everything was.
00:34:59.880 I mean, filthy it was.
00:35:02.760 And you could almost smell it through the screen, the, you know, when you've got, you know, rotting body parts in the hallway and cats all over the place.
00:35:14.920 I don't know how – I mean, how did anybody, anybody think I shouldn't report this place?
00:35:23.260 Yeah, well, I wanted to shoot the film in Smell-O-Vision, but I got shut down on that.
00:35:29.680 Yeah, well, you did it with just imagination.
00:35:35.000 But, you know, I think that what happened and part of the story is that this clinic was not inspected from, I guess, 1993 until 2010.
00:35:45.220 There were no inspections done by the Board of Health at this clinic because of the political climate.
00:35:52.860 You know, the governor, Tom Ridge, back then, who – he did not want to appear to be anti-reproductive rights, quote, unquote, or anti-woman.
00:36:05.040 And so he told them to stand down.
00:36:08.600 That's my understanding.
00:36:09.880 He told them to stand down and don't inspect these clinics.
00:36:12.800 I have to tell you, if you – I mean, a simple inspection of that clinic would have shut it down years before.
00:36:22.240 Yeah, I have to tell you, whether he was doing abortions or not, which he clearly was, and I'm just trying to make a point here.
00:36:28.260 Any clinic, any clinic for male, female, for dogs, any clinic that was in that kind of shape, it is an insult to say you're just against reproductive rights.
00:36:44.880 You're anti-women for inspecting or closing that place down or testifying against it.
00:36:49.760 Are you kidding me?
00:36:51.700 I mean, the infection, the disease, and the – let alone not just the kids that were killed, but also the patients that died.
00:37:02.300 Yeah, and the reason that happened is exactly what you were talking about before, about not being able to talk honestly about these issues because we demonize each other.
00:37:14.960 I mean, that's sort of part of the story is that abortion is so politically charged that you can't even have a rational discussion about it.
00:37:24.520 And even when we agree on things, the other side is afraid to agree with you about the slightest little thing because they think they might be helping the pro-life movement or something.
00:37:34.860 They think they might be betraying their own cause if they even concede for a moment that Gosnell was a monster.
00:37:41.040 So to give some perspective here on how much of a monster he was, we'll get into that here in a second.
00:37:50.200 Let me just do – let me just ask you this.
00:37:53.640 Explain how he had, quote, nurses performing things that they shouldn't ever – even if they were nurses, they should not be doing.
00:38:06.680 And how one woman died because he wasn't even there during the procedure.
00:38:15.760 Somebody he had trained for a few hours did it.
00:38:20.220 Yes.
00:38:21.080 Well, part of the way Gosnell operated was that he did not have actual trained registered nurses working in his clinic.
00:38:30.380 And I think probably because if he had, they would have challenged him.
00:38:35.380 And so he basically surrounded himself with, you know, yes men and sort of stooges that he could make do whatever he wanted them to do.
00:38:45.980 And so he basically took, in many cases, high school girls and trained them to give the anesthesia and trained them to do some of the procedures so that he wouldn't have to be there even when some of the abortions were being done and also so that he wouldn't have to answer to anybody.
00:39:04.120 So you have these – in many cases, I thought the nurses were as much a victim as anybody else because they were kind of just doing what they were told to do.
00:39:12.820 And since they'd never been trained medically, they just thought this was normal.
00:39:17.340 They thought this was the way things were done.
00:39:19.940 So he victimized them as well.
00:39:22.200 Anybody who – anybody who says, oh, this is going to go back to backroom, back alley abortions, yeah, that's what this – that's what this guy was running.
00:39:31.980 And anybody, even if you are – even if you're somebody who says, oh, I'm absolutely pro-choice, this – the state refusing to do any kind of inspections on abortion clinics
00:39:49.180 is allowing back alley abortions to happen right now for the – not for the humanity to help these poor little girls, but strictly for money.
00:40:01.560 This guy was sick beyond your imagination.
00:40:09.540 You're listening to the best of the Glenn Beck Program.
00:40:19.180 So, I mean, unless you, you know, just happen on this show for the first time today, I've got a new book out.
00:40:38.980 Surprise, surprise.
00:40:40.180 The book is called Addicted to Outrage.
00:40:42.740 And, you know, I'm very concerned about the outrage that's happening politically, but I am equally concerned about technology that is coming our way.
00:40:55.220 We are standing at the best of times and the worst of times, and it's going to be up to us on whether technology and our own human instincts
00:41:08.660 and the worst of us bring us this dark future or a good future.
00:41:14.400 I'm an optimistic catastrophist, but it is up to us.
00:41:20.340 And the only reason why I'm optimistic is because I know who we are when the chips are down.
00:41:26.020 But where is our bottom?
00:41:28.420 Most people will tell you, I don't have a problem with social media.
00:41:31.500 I'm not addicted.
00:41:32.380 Yeah, you really are.
00:41:34.040 And it's been designed to addict us.
00:41:36.980 I mean, what company sets out and says, you know what, I want to design something that people don't really want to check all the time.
00:41:45.060 It's designed for that.
00:41:47.440 And the way this is happening now in our society and everything is becoming political and we're starting to be, we're starting to divide each other and call each other names.
00:41:56.100 This is not good.
00:41:57.380 And whether you just woke up to this or you've always known this, you've got to start changing behavior and speaking to people differently and checking yourself and social media.
00:42:15.060 Judith Donath is with us now.
00:42:17.640 Judith, how are you?
00:42:18.780 Good.
00:42:19.220 How are you?
00:42:20.340 Very good.
00:42:20.920 You wrote the book, The Social Machine Designs for Living Online.
00:42:27.060 You were also part of the MIT Media Lab Sociable Media Group.
00:42:31.520 I quote you in my book saying, every ping could be a social, sexual, or professional opportunity, and we get a mini reward, a squirt of dopamine, for answering the bell.
00:42:42.720 These rewards serve as jolts of energy that recharge the compulsion engine, much like the frisson a gambler receives when a new card hits the table.
00:42:56.120 Cumulatively, the effect is potent and hard to resist.
00:43:00.960 Tell me how, I don't think people really believe that we are, that we're dope addicts.
00:43:07.320 A quick backup, I don't think in the original quote I had said that we get a jolt of dopamine, but I don't think it's really important what the exact sort of neurology behind it is.
00:43:19.240 Okay.
00:43:20.100 But, you know, I think most of us are aware of that feeling that, for instance, if you post something, a comment, you're always interested in seeing that people have liked it.
00:43:30.720 A lot of this was actually on the positive side.
00:43:33.500 The addiction is not necessarily about outrage.
00:43:35.980 I think at the time I was thinking more of the issue around people posting pictures of kittens online and how popular cats had gotten.
00:43:45.280 I'm not trying to tell you, so you know, I'm not, sorry if you felt I was presenting it this way, I'm not presenting you to prove my theory at all on outrage at all.
00:43:57.120 I'm talking specifically just about social media and how social media is affecting us.
00:44:03.840 And I think you're right at first, and I think even some ways now, even if it's in a negative way, we still do get that hit and that high from people saying,
00:44:17.800 I like this, say more, do more, whatever it is, a kitty cat video or an outrageous remark, people are getting high off of this.
00:44:30.620 People like me.
00:44:31.620 People are talking about me.
00:44:33.480 I've got something, you know, to say that people want to hear.
00:44:36.800 Right, and in some ways it's a little bit like the story of junk food.
00:44:42.000 You know, we evolved to want to have particular things, and sugar is useful for energy, salt is really useful.
00:44:49.380 But if you take them and make food that's just about those things and just about those tastes and is designed just to get you to keep eating, then it's really unhealthy.
00:44:59.680 And the desires both to be liked by others, if we did not care what other people think of us, that's the mentality of a psychopath.
00:45:10.600 You know, you want people who care about each other, who care, you know, am I doing things that other people think are acceptable?
00:45:17.480 That's how we have community.
00:45:18.700 But if you start distilling that out into a space where everything you say gives you a little measurement of how many likes you get, and you can measure it against the other things you said or what other people have said, it starts getting into the realm of social junk food.
00:45:36.280 Do you think it's – I mean, are you – I mean, I know you're studying the media now.
00:45:40.660 Do you think we're at the point of social junk food or we're social junkies to where – because we're – it's – you know, you say, you know, if you don't care what people think, you're, you know, you're a sociopath.
00:45:54.100 Agree.
00:45:55.760 However, we have, on both right and left, decided we don't care what half of the country thinks of us.
00:46:03.440 So we are, what, a half a sociopath.
00:46:06.340 There's a group of people, no matter what side you're on, no matter what the topic is, there's a group of people that have been deemed the enemy.
00:46:14.320 And so you can tweet whatever you want because you'll get all of the applause from your crowd, whoever your crowd happens to be.
00:46:24.120 Right. Well, I mean, and those are deeper issues that have been exacerbated by social media.
00:46:32.740 But I think you can look in history at, you know, the rise of fascistic governments in the past or any – you know, there's a long history of war in human history.
00:46:42.560 So the fact that you have a country that's deeply divided by groups who think the other one is the spawn of the devil is not actually new.
00:46:50.880 But we're seeing a particular version of it with social media.
00:46:54.340 Partly we get to see it played out in public all the time.
00:46:57.420 And I think it's also very easy to blame the technology for it without looking at some of the deeper causes.
00:47:03.800 And the issues around the attention are both, when it's negative, like being able to rally people to your side by saying political things that are really outrageous, but they're also – it's also a problem when it's much more even positive things, like worrying about saying – shaping all of your views in terms of what will people like.
00:47:31.440 I think from the political stand, though, I think where there's a little bit of a difference on the right and the left, and perhaps this is where we may disagree, because I think that on the right you have a – or on the more authoritarian side, and I think there's an authoritarian left also.
00:47:53.720 But where you have people who feel very, very strongly that they are absolutely right and that all the outsiders are just wrong is where you get the phenomenon you're talking about, where they can or they will tweet something or post things that are not only outrageous but not true.
00:48:15.320 And it will get a great deal of approval from the others on their side and outrage the outsiders, which is what they're seeking.
00:48:24.280 And that's a particularly dangerous phenomenon online.
00:48:27.580 So, Judith, I think that's a dangerous thing anywhere.
00:48:32.980 Myself, it doesn't have to be right or left.
00:48:35.180 And, you know, I write in my book that certitude is probably our biggest threat right now.
00:48:47.000 Everyone on each side is absolutely certain that their side is right, as long as you agree with it 100%.
00:48:56.360 You know, their side is right, the other side is absolutely wrong, and this certainty, and I think that it does come from the extremes, and I know that, you know, it's the thing that I read in the book.
00:49:15.340 The thing that one time made me popular was the thing that everybody wanted, I guess, and I just, I just, I was right.
00:49:27.040 I was right.
00:49:27.800 Everybody else was wrong.
00:49:28.720 I was right.
00:49:29.260 I was certain of it.
00:49:30.240 Now, the less certain I become of things, the more I hear the pain in people on all sides of the front, and the more I'm noticing that it is, it's the certitude in the extremes on both sides that are killing us.
00:49:51.880 I mean, you can't say, you know, you can't say that, you know, all people that want, you know, a bigger welfare state are communists.
00:50:02.520 That's ridiculous.
00:50:03.840 And you can't say that all people that voted for Donald Trump are deplorables.
00:50:07.880 Both of those are wrong.
00:50:11.020 And it seems as though we are only playing to those certainties at the extreme, and that's stopping us from being human beings and recognizing others as human beings.
00:50:24.480 Mm-hmm.
00:50:25.620 Yes, and I think that's a lot of the danger of the present moment is that ability to, once you start seeing others as human,
00:50:38.260 and part of the issue is if you look at the history of highly authoritarian movements, a lot of it is about trying to portray those who are outsiders as very, very dangerous and subhuman.
00:50:56.380 And so you can do anything and say anything about them, and it only strengthens your inner group.
00:51:03.940 And this is, you know, this is a phenomenon we're seeing much more now than we did, you know, even 10 years ago, and not just here, but throughout the world.
00:51:15.960 There is a, there is a, an arrogance in some way to technology right now, or those who are developing technology.
00:51:30.060 With, I'm, I'm, I'm concerned about, um, uh, AI, AGI, ASI.
00:51:38.720 I don't claim to know, and I don't think anybody can claim to know with any kind of certainty, you know, when or if that can happen.
00:51:45.840 Um, but it is something to think about, the upgrading, the transhumanism, the upgrading, uh, of, of ourselves, uh, the enhancements that are, that are coming.
00:51:57.180 Uh, we, we're messing with, um, things that we don't really even understand because we don't even understand ourselves yet.
00:52:06.340 We haven't mastered our own self-control.
00:52:08.520 Are you concerned at all, I mean, I'm not, I'm, I'm not a technophobe, and I'm not afraid of technology.
00:52:17.380 I'm, uh, I am concerned about the goals of some of the technology, uh, and how that, those programs are written in what we teach.
00:52:26.700 Are you concerned at all about, uh, how some of this stuff will change us that we don't, we can't then reverse?
00:52:36.260 I, well, I think there are a number of things to be concerned about with artificial intelligence.
00:52:42.080 I think the immediate issue is the ability of machines to imitate humans in ways that we can't recognize.
00:52:52.160 Um, you know, that's something that I think a lot of people are starting to be familiar with on Twitter, where it's very hard to tell if something was written by a human or by a bot.
00:53:02.280 And the issue there is that, again, especially as a lot of our conversation occurs online, if you think you're speaking with another human, one of the important parts in what happens when we communicate with others, hopefully, and when leaving out the extremes of anger, is that there's a level of empathy underneath.
00:53:24.680 Even if you're trying to persuade someone else, it's because you care what they think, and often you care what they think of you.
00:53:30.980 And that's really sort of the fundamental part of our connection with others.
00:53:36.020 But if you're conversing with a bot, there's no connection there.
00:53:39.900 It's simply something that has been programmed to affect some means, some end.
00:53:46.080 And so they can be made to be a lot more effective and much more persuasive than people are, while the people don't recognize what they're dealing with.
00:53:56.120 Or even if they do, you know, if it's something like an Alexa, it becomes your friend.
00:54:01.580 It's in your house, it chats with you, you know, it asks you how your day was, but you don't know what is actually going on in the program and what its internal motivations are, which is likely to be, you know, something that's beneficial to the maker of it, not to you.
00:54:19.760 Yes. So this is something that is deeply concerning, that, you know, Alexa will be everywhere.
00:54:31.560 Google, you know, controls so much information and placement.
00:54:35.100 Just slight changes to algorithms can change people.
00:54:41.960 And, you know, the most likely scenario is get them to spend more, spend more time, do something that the company wants.
00:54:49.760 Is it concerning to you?
00:54:52.740 I've always been a capitalist.
00:54:54.240 I've always been less worried about the companies.
00:54:57.540 I'm concerned about the government and the companies.
00:55:01.560 I'm concerned about anyone having this kind of power in our lives.
00:55:08.000 Yes.
00:55:08.640 I mean, I had always been mostly concerned with the companies.
00:55:12.060 I thought the government was less worrisome.
00:55:13.920 I'm now worried about both.
00:55:15.820 Right.
00:55:15.960 I think that, yes, I think the companies are quite dangerous, partly because, I mean, and we may, again, differ here, is that it's both that they want you to continue to consume things, which is not necessarily good for your pocketbook.
00:55:32.880 It's terrible for the environment, you know.
00:55:37.800 So if you look at, you know, even a simple case where we leave out the government, we leave out worries about, you know, fascist governments controlling people, just companies doing what they need to do to make more money.
00:55:49.060 If you can turn people into even more rabid consumers than they are now, you know, what does that do to our society?
00:55:58.700 It's not a particularly healthy outlook.
00:56:02.620 Well, I think we've seen this already play out with, you know, Bernays in the 1920s of, you know, the first, you know, king of advertising and how he could subtly move people in a different direction.
00:56:17.160 I mean, it's why we have, you know, ham and eggs for breakfast.
00:56:22.900 That wasn't anything except advertising.
00:56:25.560 Very, very clever advertising at the time through our doctors.
00:56:30.400 And I think we're kind of seeing just a modern version of that.
00:56:32.760 Judith, I have to cut you loose.
00:56:34.440 I thank you so much for your time.
00:56:36.440 And thanks for being out there thinking about ethics and what's happening with technology.
00:56:42.880 Judith Donath, a fellow at Harvard's Berkman Klein Center.
00:56:49.020 Thanks for being on the program.
00:56:50.140 Thank you.
00:56:50.840 The Blaze Radio Network.
00:56:55.040 On demand.