Coming up on the Glenn Beck Program today, we look at an interesting set of connections behind the kind of hit job that's going on with Brett Kavanaugh. We also get into how not to dispose of a dead whale, and some great audio from a Parkland student who was once a gun control advocate.
00:04:25.520 You don't come out and make this public accusation, lay out all of these charges, have zero evidence, and then say, oh, by the way, I'm going to let that smear hang out here.
00:04:39.880 I'm not going to answer a single question until the FBI does its job, which, by the way, this is not the FBI's job.
00:05:22.860 Feinstein, you know, the woman who got the letter, whose office leaked the letter.
00:05:29.820 However, she got into office because of Anita Hill.
00:05:37.320 The Ford accusation reveals a lot more than just that.
00:05:40.720 Back in August, a large coalition of left-leaning groups co-signed a letter to both Senator Feinstein and Senator Grassley demanding Kavanaugh's records.
00:05:50.640 It was basically the same narrative that Cory Booker and Kamala Harris were using during the opening day of the confirmation hearing.
00:05:57.140 Okay, this group that made the demand had organized to stop Kavanaugh.
00:06:04.180 One of the groups that co-signed this letter was called the Project on Government Oversight, or POGO.
00:10:21.260 And she's grown up her whole life, you know, being torn.
00:10:26.720 I mean, look, you've been saying the veal thing ever since I've known you, that you don't eat it for that reason.
00:10:33.760 And as America's only conservative vegetarian, I would say that most of the time, you know, people like to goad me into these conversations about this stuff because it's fun.
00:10:58.740 Like, first of all, I would argue, of course, the answer to that would be no, but still, if you're going to kill them, putting them into boiling water is completely nuts.
00:11:08.380 Do we have some vendetta against these things?
00:11:11.140 Like, dude, are they responsible for, like, the Adam and Eve thing?
00:26:53.060 This summer, when March for Our Lives went on the tour that we embarked on, I met that person in Texas who's got that semi-automatic weapon because that's how they like to protect their family.
00:27:04.820 I met the 50-some-odd percent of women who are pro-life, even though at the time I thought it was preposterous that a woman could be pro-life and not pro-choice.
00:27:12.460 I learned that a lot of our issues politically come from a lack of understanding of other perspectives, and also just the fact that so often young conservatives and young liberals will go into a debate, like I said earlier, trying to beat the other one as opposed to coming to an agreement.
00:27:33.480 It's important for things to be a bit competitive, because I think competition is very important for everything.
00:27:38.680 But it comes to a point where all we're doing right now is dragging each other apart.
00:27:42.780 I mean, the people who were OK with Trump will forgive him for anything.
00:27:46.660 And the people who didn't like Trump will pretend that every single thing he does is pure, utter evil.
00:27:51.740 And it's a direction we need to head away from.
00:27:54.840 So I'm working on some efforts to encourage bipartisanship, or at least discussion that is productive, and to help a lot of people avoid the mistakes that I made.
00:28:04.820 Is this kid unbelievable?
00:29:35.720 That means there were reasonable conservatives, and I think we all try to be reasonable conservatives,
00:29:43.600 that were calm enough, rational enough to find the one, not to go to David Hogg, but to find the one in the group who was honestly listening.
00:31:47.140 It is a good movie, but it doesn't even have to be.
00:31:51.040 The facts of this story are so incredible that it is incomprehensible that it happened recently and no one found it interesting enough to cover.
00:32:05.240 It is one of the craziest stories you've ever seen.
00:32:09.960 And they bring it to life in a new movie that had to be nearly impossible to make.
00:32:15.200 Gosnell: The Trial of America's Biggest Serial Killer.
00:32:19.020 If you don't remember who Dr. Gosnell is, let us remind you.
00:32:25.920 The director, Nick Searcy, is on with us now.
00:33:12.680 Yeah, I mean, that's one of the fascinating aspects of the story: they went after him because he was writing prescriptions for opioids and selling them.
00:33:26.000 And so when they raided the clinic, the lead detective, James Woods, who we call Woody in the film, is just appalled by what he sees in the clinic.
00:33:37.540 And he just goes back to the DA's office and says, I don't know what's going on in there, but it can't be legal.
00:33:43.080 So it really was stumbled upon while they were looking for the drugs.
00:33:50.300 So, Nick, was it – I mean, look, I understand dramatic storytelling and everything else.
00:33:57.580 Was the clinic really in that kind of shape?
00:34:06.620 I mean, it's depicted as well as we could in the film.
00:34:09.240 But when you see the real photographs and the real footage that James Woods took when he went into that clinic, it's incredible.
00:34:17.800 I mean, there are garbage bags lining the hallways because, he says, he had a dispute with his medical waste company.
00:35:02.760 And you could almost smell it through the screen, you know, when you've got rotting body parts in the hallway and cats all over the place.
00:35:14.920 I don't know how – I mean, how did anybody think, I shouldn't report this place?
00:35:23.260 Yeah, well, I wanted to shoot the film in Smell-O-Vision, but I got shut down on that.
00:35:29.680 Yeah, well, you did it with just imagination.
00:35:35.000 But, you know, I think that what happened, and part of the story, is that this clinic was not inspected from, I guess, 1993 until 2010.
00:35:45.220 There were no inspections done by the Board of Health at this clinic because of the political climate.
00:35:52.860 You know, the governor back then, Tom Ridge, did not want to appear to be anti-reproductive-rights, quote unquote, or anti-woman.
00:36:09.880 He told them to stand down and not inspect these clinics.
00:36:12.800 I have to tell you – I mean, a simple inspection of that clinic would have shut it down years before.
00:36:22.240 Yeah, I have to tell you, whether he was doing abortions or not, which he clearly was, I'm just trying to make a point here.
00:36:28.260 Any clinic, any clinic for men, for women, for dogs, any clinic that was in that kind of shape, it is an insult to say you're just against reproductive rights
00:36:44.880 or anti-women for inspecting that place, closing it down, or testifying against it.
00:36:51.700 I mean, the infection, the disease, let alone not just the kids that were killed, but also the patients who died.
00:37:02.300 Yeah, and the reason that happened is exactly what you were talking about before, about not being able to talk honestly about these issues because we demonize each other.
00:37:14.960 I mean, that's sort of part of the story, is that abortion is so politically charged that you can't even have a rational discussion about it.
00:37:24.520 And even when we agree on things, the other side is afraid to agree with you about the slightest little thing because they think they might be helping the pro-life movement or something.
00:37:34.860 They think they might be betraying their own cause if they even concede for a moment that Gosnell was a monster.
00:37:41.040 So to give some perspective here on how much of a monster he was, we'll get into that here in a second.
00:37:50.200 Let me just ask you this.
00:37:53.640 Explain how he had, quote, nurses performing things that they shouldn't ever be doing, even if they were real nurses.
00:38:06.680 And how one woman died because he wasn't even there during the procedure.
00:38:15.760 Somebody he had trained for a few hours did it.
00:38:21.080 Well, part of the way Gosnell operated was that he did not have actual trained registered nurses working in his clinic.
00:38:30.380 And I think probably because if he had, they would have challenged him.
00:38:35.380 And so he basically surrounded himself with, you know, yes men and sort of stooges that he could make do whatever he wanted them to do.
00:38:45.980 And so he basically took, in many cases, high school girls and trained them to give the anesthesia and trained them to do some of the procedures so that he wouldn't have to be there even when some of the abortions were being done, and also so that he wouldn't have to answer to anybody.
00:39:04.120 So you have these – in many cases, I thought the nurses were as much a victim as anybody else because they were kind of just doing what they were told to do.
00:39:12.820 And since they'd never been trained medically, they just thought this was normal.
00:39:17.340 They thought this was the way things were done.
00:39:22.200 Anybody who says, oh, this is going to go back to backroom, back-alley abortions: yeah, that's what this guy was running.
00:39:31.980 And even if you're somebody who says, oh, I'm absolutely pro-choice, the state refusing to do any kind of inspections on abortion clinics
00:39:49.180 is allowing back-alley abortions to happen right now, not out of humanity, not to help these poor little girls, but strictly for money.
00:40:01.560 This guy was sick beyond your imagination.
00:40:09.540 You're listening to the best of the Glenn Beck Program.
00:40:19.180 So, I mean, unless you, you know, just happened upon this show for the first time today, you know I've got a new book out.
00:40:40.180 The book is called Addicted to Outrage.
00:40:42.740 And, you know, I'm very concerned about the outrage that's happening politically, but I am equally concerned about the technology that is coming our way.
00:40:55.220 We are standing at the best of times and the worst of times, and it's going to be up to us whether technology and our own human instincts,
00:41:08.660 and the worst of us, bring us a dark future or a good future.
00:41:14.400 I'm an optimistic catastrophist, but it is up to us.
00:41:20.340 And the only reason why I'm optimistic is because I know who we are when the chips are down.
00:41:47.440 And the way this is happening now in our society, everything is becoming political, and we're starting to divide from each other and call each other names.
00:41:57.380 And whether you just woke up to this or you've always known this, you've got to start changing behavior, speaking to people differently, and checking yourself on social media.
00:42:20.920 You wrote the book The Social Machine: Designs for Living Online.
00:42:27.060 You were also part of the MIT Media Lab's Sociable Media Group.
00:42:31.520 I quote you in my book saying, every ping could be a social, sexual, or professional opportunity, and we get a mini reward, a squirt of dopamine, for answering the bell.
00:42:42.720 These rewards serve as jolts of energy that recharge the compulsion engine, much like the frisson a gambler receives when a new card hits the table.
00:42:56.120 Cumulatively, the effect is potent and hard to resist.
00:43:00.960 Tell me how – I don't think people really believe that we're dope addicts.
00:43:07.320 A quick backup: I don't think in the original quote I said that we get a jolt of dopamine, but I don't think it's really important what the exact neurology behind it is.
00:43:20.100 But, you know, I think most of us are aware of that feeling that, for instance, if you post something, a comment, you're always interested in seeing that people have liked it.
00:43:30.720 A lot of this was actually on the positive side.
00:43:33.500 The addiction is not necessarily about outrage.
00:43:35.980 I think at the time I was thinking more of the issue around people posting pictures of kittens online and how popular cats had gotten.
00:43:45.280 Just so you know, and sorry if you felt I was presenting it this way, I'm not presenting you to prove my theory on outrage at all.
00:43:57.120 I'm talking specifically just about social media and how social media is affecting us.
00:44:03.840 And I think you were right at first, and I think even in some ways now, even if it's in a negative way, we still do get that hit and that high from people saying,
00:44:17.800 I like this, say more, do more. Whatever it is, a kitty cat video or an outrageous remark, people are getting high off of this.
00:44:33.480 I've got something, you know, to say that people want to hear.
00:44:36.800 Right, and in some ways it's a little bit like the story of junk food.
00:44:42.000 You know, we evolved to want particular things, and sugar is useful for energy, salt is really useful.
00:44:49.380 But if you take them and make food that's just about those things, just about those tastes, and designed just to get you to keep eating, then it's really unhealthy.
00:44:59.680 And the same goes for the desire to be liked by others. If we did not care what other people think of us, that would be the mentality of a psychopath.
00:45:10.600 You know, you want people who care about each other, who care, you know, am I doing things that other people think are acceptable?
00:45:18.700 But if you start distilling that out into a space where everything you say gives you a little measurement of how many likes you get, and you can measure it against the other things you said or what other people have said, it starts getting into the realm of social junk food.
00:45:36.280 Do you think it's – I mean, I know you're studying the media now.
00:45:40.660 Do you think we're at the point of social junk food, that we're social junkies? Because, you know, you say if you don't care what people think, you're a sociopath.
00:46:06.340 There's a group of people, no matter what side you're on, no matter what the topic is, there's a group of people that have been deemed the enemy.
00:46:14.320 And so you can tweet whatever you want, because you'll get all of the applause from your crowd, whoever your crowd happens to be.
00:46:24.120 Right. Well, I mean, those are deeper issues that have been exacerbated by social media.
00:46:32.740 But I think you can look in history at, you know, the rise of fascistic governments in the past – you know, there's a long history of war in human history.
00:46:42.560 So the fact that you have a country that's deeply divided by groups who think the other one is the spawn of the devil is not actually new.
00:46:50.880 But we're seeing a particular version of it with social media.
00:46:54.340 Partly, we get to see it played out in public all the time.
00:46:57.420 And I think it's also very easy to blame the technology for it without looking at some of the deeper causes.
00:47:03.800 And the issues around attention cut both ways: it's a problem when it's negative, like being able to rally people to your side by saying political things that are really outrageous, but it's also a problem with much more positive things, like shaping all of your views in terms of what people will like.
00:47:31.440 I think from the political standpoint, though, there's a little bit of a difference on the right and the left, and perhaps this is where we may disagree, because I think that on the right – or on the more authoritarian side, and I think there's an authoritarian left also.
00:47:53.720 But where you have people who feel very, very strongly that they are absolutely right and that all the outsiders are just wrong is where you get the phenomenon you're talking about, where they can and will tweet or post things that are not only outrageous but not true.
00:48:15.320 And it will get a great deal of approval from the others on their side and outrage the outsiders, which is what they're seeking.
00:48:24.280 And that's a particularly dangerous phenomenon online.
00:48:27.580 So, Judith, I think that's a dangerous thing anywhere.
00:48:32.980 To me, it doesn't have to be right or left.
00:48:35.180 And, you know, I write in my book that certitude is probably our biggest threat right now.
00:48:47.000 Everyone on each side is absolutely certain that their side is right, as long as you agree with it 100%.
00:48:56.360 You know, their side is right, the other side is absolutely wrong, and this certainty, I think, does come from the extremes, and it's the thing that I write about in the book.
00:49:15.340 The thing that at one time made me popular was, I guess, the thing that everybody wanted: I was just certain I was right.
00:49:30.240 Now, the less certain I become of things, the more I hear the pain in people on all sides, and the more I'm noticing that it's the certitude in the extremes on both sides that is killing us.
00:49:51.880 I mean, you can't say, you know, that all people who want a bigger welfare state are communists.
00:50:11.020 And it seems as though we are only playing to those certainties at the extremes, and that's stopping us from being human beings and recognizing others as human beings.
00:50:25.620 Yes, and I think a lot of the danger of the present moment is that ability to stop seeing others as human.
00:50:38.260 And part of the issue is, if you look at the history of highly authoritarian movements, a lot of it is about trying to portray those who are outsiders as very, very dangerous and subhuman.
00:50:56.380 And so you can do anything and say anything about them, and it only strengthens your inner group.
00:51:03.940 And this is, you know, a phenomenon we're seeing much more now than we did even 10 years ago, and not just here, but throughout the world.
00:51:15.960 There is an arrogance, in some ways, to technology right now, or to those who are developing technology.
00:51:38.720 I don't claim to know, and I don't think anybody can claim to know with any kind of certainty, you know, when or if that can happen.
00:51:45.840 But it is something to think about: the transhumanism, the upgrading of ourselves, the enhancements that are coming.
00:51:57.180 We're messing with things that we don't really even understand, because we don't even understand ourselves yet.
00:52:06.340 We haven't mastered our own self-control.
00:52:08.520 Are you concerned at all – I mean, I'm not a technophobe, and I'm not afraid of technology.
00:52:17.380 I am concerned about the goals of some of the technology, and how those programs are written, and what we teach them.
00:52:26.700 Are you concerned at all about how some of this stuff will change us in ways we can't then reverse?
00:52:36.260 Well, I think there are a number of things to be concerned about with artificial intelligence.
00:52:42.080 I think the immediate issue is the ability of machines to imitate humans in ways that we can't recognize.
00:52:52.160 You know, that's something that I think a lot of people are starting to be familiar with on Twitter, where it's very hard to tell if something was written by a human or by a bot.
00:53:02.280 And the issue there is that, again, especially as a lot of our conversation occurs online, if you think you're speaking with another human, one of the important parts of what happens when we communicate with others, hopefully, leaving out the extremes of anger, is that there's a level of empathy underneath.
00:53:24.680 Even if you're trying to persuade someone else, it's because you care what they think, and often you care what they think of you.
00:53:30.980 And that's really sort of the fundamental part of our connection with others.
00:53:36.020 But if you're conversing with a bot, there's no connection there.
00:53:39.900 It's simply something that has been programmed to effect some end.
00:53:46.080 And so they can be made to be a lot more effective and much more persuasive than people are, while the people don't recognize what they're dealing with.
00:53:56.120 Or even if they do, you know, if it's something like an Alexa, it becomes your friend.
00:54:01.580 It's in your house, it chats with you, you know, it asks you how your day was, but you don't know what is actually going on in the program and what its internal motivations are, which are likely to be, you know, something that's beneficial to its maker, not to you.
00:54:19.760 Yes. So this is something that is deeply concerning: you know, Alexa will be everywhere.
00:54:31.560 Google, you know, controls so much information and placement.
00:54:35.100 Just slight changes to algorithms can change people.
00:54:41.960 And, you know, the most likely scenario is to get them to spend more money, spend more time, do something that the company wants.
00:55:15.960 I think that, yes, the companies are quite dangerous, partly because, and we may again differ here, they want you to continue to consume things, which is not necessarily good for your pocketbook.
00:55:32.880 It's terrible for the environment, you know.
00:55:37.800 So if you look at, you know, even a simple case where we leave out the government, we leave out worries about, you know, fascist governments controlling people, just companies doing what they need to do to make more money.
00:55:49.060 If you can turn people into even more rabid consumers than they are now, you know, what does that do to our society?
00:55:58.700 It's not a particularly healthy outlook.
00:56:02.620 Well, I think we've seen this already play out with, you know, Bernays in the 1920s, the first king of advertising, and how he could subtly move people in a different direction.
00:56:17.160 I mean, it's why we have, you know, ham and eggs for breakfast.