The Glenn Beck Program - August 21, 2019


Best of the Program | Guests: Dr. Robert Epstein & Kevin Williamson | 8/21/19


Episode Stats

Length

50 minutes

Words per Minute

177.18

Word Count

9,018

Sentence Count

676

Misogynist Sentences

8

Hate Speech Sentences

15


Summary

On this episode of the podcast, we have a special guest, Dr. Robert Epstein. Dr. Epstein has been looking into Google and thinks we are in trouble. He says 2020 is going to be even worse than 2016. Also, we are going to talk about the mobocracy of red flag laws, and some fundraising efforts from a political party on gun control.


Transcript

00:00:00.000 Hey, welcome to the podcast. I've got a great one for you today. Dr. Robert Epstein is going to be
00:00:05.740 joining us. He's a guy that's been looking into Google. He has been warning we are in real
00:00:13.680 trouble. We were in trouble in 2016 because Google is manipulating algorithms and search results
00:00:19.320 and it can change a large number of voters. And he says 2020 is going to be even worse.
00:00:24.640 Well, now he's under attack only because Donald Trump tweeted about him. This guy is a Clinton supporter,
00:00:31.700 and Hillary Clinton has come out against him. And now he is quite concerned about what's going to
00:00:38.460 happen to him. Another one caught in the rat trap, if you will, of political correctness and
00:00:44.620 the mobocracy of America. Also, we're going to talk about the mobocracy of red flag laws and
00:00:50.500 and some some fundraising notices that are coming out from a political party on gun control. And
00:00:57.180 you'll never guess which political party is behind it. Then you don't want to miss Kevin
00:01:01.980 Williamson. Kevin Williamson, one of the best writers, author of the new book,
00:01:05.840 The Smallest Minority: Independent Thinking in the Age of Mob Politics, joins us all on today's podcast.
00:01:11.900 The numbers are beyond horrific. Every day in the United States, a hundred people are killed with
00:01:32.040 guns and hundreds more are shot and injured. Is that true? Hundreds, hundreds of people are shot and
00:01:37.340 injured by guns every day in America. Is that true? Hundreds? Well, I mean, you have what,
00:01:45.120 40,000 deaths per year from guns. So I guess it would be about that. Of course,
00:01:51.760 that also includes suicides and all the disclaimers, but that's over a hundred
00:01:57.820 a day in deaths alone. So it's probably true. You know, we should probably separate these out from
00:02:02.720 violent criminal kind of things. You know what I mean? Oh yeah. Like, these death numbers
00:02:07.340 include a guy who's about to, you know, slit a woman's throat, and a police
00:02:13.580 officer shoots him. Like, that's included in that number. Right. A lot of these are
00:02:17.720 just... you know, can we separate those out? Let's just have a separate count, because, I mean, you know,
00:02:21.520 one of the main things here, well, let me just get back to this. One of the main things is the next
00:02:25.980 line. Nearly two thirds of all yearly gun deaths in this country are suicides. Now, do you really
00:02:33.760 think that taking away a gun (by the way, that's not an AR)... Two thirds of all of the gun-related deaths,
00:02:41.880 two thirds, are suicides? Yeah. And if you think that that is a gun problem, you're going
00:02:51.580 to have to explain why countries like Japan with no guns have a much higher suicide rate than us or
00:02:57.280 Russia, which has a gun ownership rate of one-tenth of the United States yet has a much higher murder
00:03:03.880 rate and a much higher suicide rate. In Pennsylvania, gun violence claims over 1500 lives every year
00:03:10.180 with gun suicides comprising 63% of all firearms deaths in Pennsylvania. 65% of veteran suicides in
00:03:18.220 our state involve a gun. Oh, we should take our guns away from veterans. What an argument
00:03:24.380 to make. I mean, yeah, let's argue to the people who've been defending our country,
00:03:30.260 who have trained with these weapons and done everything that they could to protect
00:03:34.680 these liberties. Yeah. Let's take their guns away. Let's take veterans' guns away. Wow. What a wonderful
00:03:40.040 idea that is. The everyday toll of gun violence in America is utterly heartbreaking. And this violence
00:03:45.120 routinely shocks the collective soul of our nation. We look for solutions to end America's
00:03:50.020 epidemic of gun violence. One sure way to reduce firearms death is through the implementation of
00:03:55.340 red flag laws, also known as extreme risk protection order laws. To date, these laws have been enacted in
00:04:02.700 17 states and the District of Columbia. Oh, well then that's a good idea. Back on February 14th,
00:04:08.220 there was the one-year anniversary of Parkland, blah, blah, blah. My proposal, Senate Bill 90, would
00:04:14.060 allow our judges to temporarily remove firearms from people in crisis who pose an imminent threat
00:04:19.460 of harming themselves or others. Red flag laws. Now, which party sent this out? You're setting us up
00:04:28.120 here. This is Senator Tom Killion of the Ninth District, a Republican in Pennsylvania,
00:04:36.660 sending to all of his supporters. Republicans. Is this who you are? I mean, there's a lot of support
00:04:45.360 for red flag laws among Republicans. Yeah. Well, then, good for you. Good for you. You are
00:04:50.000 no longer a constitutionalist. You cannot consider yourself
00:04:57.400 a small-government constitutionalist if you believe in gun control and infringement. We're not
00:05:05.460 talking about, look, I have no problem. If you are, um, deemed dangerous after you've had the chance
00:05:15.860 to testify, you are innocent until proven guilty. We cannot cross this line. You are innocent until
00:05:26.500 proven guilty. The red flag law says some accuser can go to court and say, you know what? They're very,
00:05:33.780 very dangerous. They're very dangerous. I've been, look, I was married to them for a very long time
00:05:41.380 and, uh, the threats. Oh, you would not believe the threats that they're making. You don't think
00:05:47.980 vengeance, uh, just, just pettiness would get involved. You don't think that there's someone
00:05:54.360 in your family, perhaps. Perhaps not. Maybe you're lucky. I know in my extended family,
00:06:03.020 I would not count it out that somebody in my extended family or relations would say,
00:06:11.880 oh yeah, yeah, I was, I was at a family reunion. The guy's unstable. Oh, he's absolutely unstable.
00:06:18.000 You could have people sitting at this desk with you that would make claims like that to the courts.
00:06:22.580 I mean, if you've ever been through a horrible divorce. Oh my God. Go back to, uh,
00:06:29.420 you don't even have to go. You could go to one of the brothers or sisters or uncles or aunts or
00:06:33.900 anybody else that just was angry. Why was Barack Obama our president? Why? Well, he won a Senate
00:06:39.740 race in Illinois. Why did he win the Senate race in Illinois? Well, he was in a very tight race against
00:06:44.400 Jack, Jack, I want to say Jack Ryan, but that's the character, right? Yeah. Who was the guy? I can't
00:06:48.760 remember his name. Ryan was his last name. He went through a very nasty divorce with an
00:06:55.240 actress, who in the divorce proceedings accused him of all sorts of crazy stuff that
00:07:02.360 somehow Barack Obama's people got unsealed. And then eventually his opponent, Obama's opponent
00:07:07.420 had to drop out of the race. It is Jack Ryan. Thank you. Did it end up being true?
00:07:13.200 You know, I don't think so. Obviously I wasn't there. And afterwards,
00:07:20.620 you know, after the divorce happened, there was a cooling-off period, and, oh, well,
00:07:26.720 he's not that person and blah, blah, blah. You're going through a divorce. This happens all the time.
00:07:30.960 Your husband loves guns, loves guns. And you want to bilk him for everything, every penny he's got.
00:07:41.520 Okay. That's not an unheard-of scenario, because you're being pushed by the
00:07:49.540 attorneys. You're angry, whatever it is. You're telling me that in America, you can't see a husband
00:07:57.280 or a wife, doesn't matter, saying: you know what? You're going to give this to me or I'm going to
00:08:05.140 talk about your guns. It's objectively worse the other way, right? Let's say a woman is maybe having
00:08:11.040 an affair and the guy's very angry about it. And she has a gun to protect herself against an angry
00:08:16.220 man. And the man goes and says, you know what? She's nuts. She's been threatening people. I think
00:08:21.020 my children are in danger. Go take her guns. They go do that. Then she's vulnerable from him.
00:08:26.820 Right. I mean, it's a terrible thing. And we've tried this red flag law thing out
00:08:32.200 recently on another issue. It's called Me Too accusations, where we've just been like,
00:08:37.960 you know what? Yeah. Yeah. Just take all their power away, take their jobs away, throw them out
00:08:41.580 of society. And then we'll figure out whether they actually did it or not. But then this,
00:08:45.420 what makes this worse is that's not involving the government or the court system. Right. That's
00:08:51.040 just public opinion. This is public opinion. That's bad enough. This one is saying, no,
00:08:56.280 we deem you guilty before you even have a chance to answer the charge. Yeah.
00:09:03.020 And your reputation being of high quality is not a constitutional right. People can
00:09:10.160 have all sorts of terrible opinions about you. I don't know if you've noticed that some
00:09:13.120 people have terrible opinions about you, even Glenn. It's true. Um, but you know, a constitutional
00:09:18.620 right is your right to bear arms. So we're taking away a constitutionally guaranteed thing
00:09:22.740 with no, I mean, cause like red flag laws already exist. Red flag laws are when you go and you say,
00:09:30.440 hey, you know, this person is really erratic, and we need to
00:09:35.400 have him committed, right? Involuntary commitment. Um, these things already exist. The only thing
00:09:40.920 that the new brand of red flag laws does is it makes it so they take the guns before they figure
00:09:47.400 it out. Yeah. Right. Like this is like, I don't know if he's crazy. Let's just take the guns and
00:09:51.580 then we'll figure it out and see if it's true. That is craziness upside down. That is craziness.
00:09:57.220 That is against everything we stand for. Remember, you're taking away a person's
00:10:03.860 right to be innocent until proven guilty. What you're doing is you're starting down this slippery
00:10:14.680 slope that look, things happen. So we've got it. We're going to look at you as guilty and everybody
00:10:20.820 will know you're guilty and we're going to take your guns and good luck getting them back. By the way,
00:10:25.660 we're going to take your guns. But if you prove yourself to be responsible, excuse me,
00:10:33.200 Personally, I think that is civil war. These red flag laws are very popular
00:10:43.160 though. I mean, they poll very well. Well, maybe not. I don't think people understand
00:10:47.780 exactly what they are. I don't think so either. You know, it's like, you have
00:10:52.060 a situation where the research on the red flag laws, where they've been implemented, shows no effect
00:10:57.840 on homicide rate. It shows a very slight effect on suicide rate. And we've seen, you know, in some of
00:11:04.920 these states, we're talking about a third of cases later found out to be frivolous. A third?
00:11:12.000 You're taking away the constitutional right from a third of the people you're accusing? You can't do
00:11:17.280 that. That is not an American principle whatsoever. I know, we all have these people
00:11:23.480 around us where we're like, oh man, that guy seems dangerous. The thing is, most of those people don't go
00:11:28.140 out and shoot people. This is why I am not for the death penalty. It's not because of life; innocent
00:11:34.600 life. I'm opposed to taking innocent life. And that is why I'm torn on the death penalty,
00:11:42.360 but where I finally come down on the death penalty is, I'm against it because we can make mistakes and I
00:11:49.340 don't want to be responsible for taking innocent life. So put them in jail, stop all this nonsense of,
00:11:57.920 you know, racking everything up. If you're going to do it, then do it, but you better make sure you're
00:12:04.120 right. You want to execute somebody? Look how many people have been cleared just from DNA tests. Now we may get to a
00:12:09.100 point to where you've got it, because everything's on camera. But then you've got deep
00:12:13.580 fakes. Are you sure you don't want to put an innocent man to death? You don't want an
00:12:21.140 innocent man in jail. You're going to destroy people just on what one person, or one side of an opinion, says?
00:12:32.860 There are two sides, and we must have the presumption of innocence for American citizens,
00:12:41.820 not the presumption of guilt. That is what leads to Stalin, Nazis, Mussolini, Mao, whoever you want,
00:12:52.420 when they can scoop you up or take your stuff or you lose your job through a court system
00:13:00.100 that says, yeah, well, we will get back to you. We'll fix your life
00:13:05.640 if we were wrong. Really? Where do you go? Where do you go after the sheriff or the FBI
00:13:12.780 are at the front of your house, taking out your stuff because you've been deemed unstable? Where do
00:13:20.060 you go to get that reputation back? Where do you go in this time where there is no forgetting because
00:13:26.760 of the internet? Where do you go when you go in for a job interview and they Google your
00:13:32.580 name? Oh, and they see the pictures of the guns leaving your house because you might be unstable.
00:13:37.480 Where do you go to get that erased? Where? So, uh, the Washington Post has just written an article,
00:13:46.100 "Anthony Scaramucci and the nine biggest one-eighties on Trump, ranked." Uh, number 10 is Scaramucci,
00:13:52.920 I think. Uh, number nine is Ann Coulter. Now this is, yeah, cause she's the
00:13:59.600 "I was for him, now I'm against him" one. Yeah, no, no, no. She wrote a book, In Trump We Trust. Okay.
00:14:05.600 And you know what? I think he's been pretty decent on the border. He hasn't been great, and he hasn't,
00:14:11.340 you know, done the border wall, but I think he's tried. She reportedly at least co-wrote his
00:14:18.200 initial border proposal. So she's very tied into the details of that. Uh, and so she's maybe not
00:14:25.700 excited that he has not been able to get the wall across the finish line. Well, she says he
00:14:29.880 deserves to lose reelection. Okay. Uh, Mike Pompeo is the next one. Jason Chaffetz is number
00:14:37.460 seven. Coming in at number six, Mick Mulvaney. Number five, Andrew Napolitano. I didn't know that.
00:14:43.520 Cause he was very anti-Trump, wasn't he? He's a libertarian, right? So he's not going to like
00:14:48.180 the executive sort of actions. But I have not heard him flip on that. I mean,
00:14:53.960 has he changed that? Oh, actually no, he's gotten worse. I think he's gotten worse. Uh,
00:15:00.900 once purveyor deep sea, Sarah Greger, he recently accused Trump of unleashing a torrent of hatred
00:15:07.500 in a FoxNews.com op-ed. Trump claims this is because he declined to nominate Napolitano for the
00:15:13.300 Supreme Court. Then Anthony Scaramucci, and coming in at number three, Glenn Beck.
00:15:20.320 Hey, listen, you did well on the list. Or did I? There's no way to win
00:15:28.260 in either direction. No, you don't win in either direction. However, if you remember, right,
00:15:32.880 it was just, I was only saying things cause I was failing. Yeah. It's weird how that's happened.
00:15:37.580 Yeah. Now, uh, this is from the Washington Post.
00:15:43.060 Glenn Beck staked out principled ground against someone. I love this. Really? Now retroactive
00:15:48.300 admiration. Where was the principled-ground support of Glenn Beck in 2016 from these
00:15:53.660 people? None. Uh, as someone who said, quote, he could be one of the most dangerous presidents to
00:15:59.300 ever come in the Oval Office. End quote. Yeah. He could have been. He hasn't been. He could be
00:16:07.880 in the future, but he hasn't been. You know, when we had no evidence of what he would do
00:16:15.360 in office, yeah, he could have been. I was very nervous based on his past performance.
00:16:22.880 He hasn't been, uh, after Trump's election, he pulled out Hitler comparisons saying he saw the
00:16:29.900 seeds of what happened in Germany in 1933. Still see them. See them in the Republican Party,
00:16:34.660 see them in the Democratic Party, see them everywhere. If you're not seeing fascism,
00:16:39.240 communism on the horizon, uh, well, you're blind. You're blind. You can go any way, any direction
00:16:47.200 with any of these people. They call every Republican a Hitler every day. I know that's a problem.
00:16:52.380 Today, even as Trump has stoked racial divisions and split the country in a way Beck once decried.
00:16:58.520 No, I still decry that. I still think racial division is really bad. I still think the,
00:17:05.620 the way the president, you know, says things, I'm like, oh, please don't say
00:17:12.160 that. But look at the media. Trump is the one doing it? I know. When you're calling
00:17:18.520 literally every one of his supporters a racist, yeah, who's stoking racial
00:17:23.660 divides. I mean, that's incredible. Uh, Beck now says if Trump loses in 2020, I think we're officially
00:17:30.440 at the end of the country as we know it. Yes. Yeah. Have you looked at the other side?
00:17:38.300 Have you looked at who the Democrats are running? Yeah. I think when they say, yeah,
00:17:44.860 we're going to get rid of the free market system, you know, and I'll just do executive
00:17:49.480 orders on the Constitution. Sounds like the end of the Republic to me.
00:17:52.920 Hey, it's Glenn. And if you like what you hear on the program, you should check out Pat Gray
00:18:08.300 unleashed. His podcast is available wherever you download your favorite podcast.
00:18:13.440 We have Robert Epstein. He is an author, editor, a longtime psychology researcher
00:18:18.860 and professor, distinguished scientist who is passionate about educating the public about
00:18:24.260 advances in mental health and behavioral sciences, former editor-in-chief of Psychology Today.
00:18:30.560 He is now the senior research psychologist at the American Institute for Behavioral Research
00:18:36.320 and Technology and contributing editor for Scientific American Mind. Yeah. He sounds like a dope
00:18:43.420 and turning the crazy level up to 10. Dr. Robert Epstein. How are you, sir?
00:18:50.880 Well, it's been a rough couple of days, to be honest with you. Yeah.
00:18:55.300 So, I mean, here's a group of people that you probably politically would agree
00:19:01.300 with more than not getting trashed by the person you voted for on Twitter.
00:19:07.460 Sure. Well, I think Hillary Clinton should be ashamed of herself. I mean, you know, I've really
00:19:13.220 just gotten caught in the crossfire here between Trump and Hillary. And, you know, the president
00:19:22.800 sometimes, as you know, his tweets are not exactly entirely accurate.
00:19:28.120 And he did get a couple of things slightly wrong when I tweeted about my testimony before Congress,
00:19:37.000 which was in July. And so, yeah, I can tell you, you know, what he did is slightly wrong.
00:19:44.060 But what Hillary did is reprehensible, especially since I've been a strong supporter of the Clintons
00:19:50.740 for 20 years. I mean, I have a signed letter from Bill up above my desk here.
00:19:58.540 Wow. And what she did is shameful. It's shameful.
00:20:02.900 What did she what did she do? What did she do?
00:20:06.520 Well, she was replying to the president. The president said that, according to some guy,
00:20:13.660 you know, some researcher, Google shifted between two point six and 16 million votes to
00:20:21.860 Hillary in 2016. Well, I'm the researcher. Yes, I testified before the Senate Judiciary Committee
00:20:28.520 in, you know, in July. And yes, I gave estimates of between two point six and ten point four million,
00:20:37.620 not 16 million. And also Trump said that I said that Google manipulated the election.
00:20:46.020 I didn't. I have never said that. I said I found pro-Clinton bias in their search results sufficient
00:20:54.060 to impact undecided voters in a way that would shift that many votes. I've never said they manipulated
00:21:03.340 the election. What I found, what I measured, was the bias, which, I mean, was indisputable.
00:21:09.980 Hillary replied to him saying that study has been debunked, which is absolutely false.
00:21:18.860 And then saying and the whole study was based on 21 voters. What I actually studied, what I actually
00:21:24.780 captured and analyzed, which no one has ever done before, is 13,207 election-related searches
00:21:34.460 on Google, Bing and Yahoo and the 98,000 web pages to which the search results linked. That's what allowed
00:21:44.140 me to measure the bias that people were seeing in search results. And there was substantial bias in
00:21:50.220 favor of Clinton, whom I supported, in all 10 search positions on the first page of Google search results,
00:22:00.460 but not any bias in Bing or Yahoo. So, you know, I found the bias. And now, based on experiments that
00:22:10.140 I've been doing for six and a half years, I was able to estimate with that level of bias how many votes
00:22:17.980 could be shifted. I know that from, again, extensive experimental research, which now has involved
00:22:24.300 tens of thousands of participants, five national elections in four countries. I know precisely
00:22:31.260 how bias can shift opinions and votes. And that's where I got those numbers from. So, again, Trump got
00:22:39.820 things slightly wrong. But what Hillary did was outrageous. My research has never been debunked
00:22:46.620 at all. And then there's this slew of stories that have turned up. I mean, literally, I'm not
00:22:54.700 kidding you. There are hundreds of them all over the world. And a few conservative sources basically
00:23:01.580 just kind of report the facts. And then all the mainstream sources are basically saying,
00:23:08.380 that I'm incompetent, which I've never been accused of being my whole career. That, again,
00:23:16.300 my study was debunked, et cetera, et cetera. I mean, it's terrifying that,
00:23:22.780 you know, there could be so much bad information out there. It is. I think this is why
00:23:30.300 things are slowing down, not towards socialism, not even towards nationalism. Those,
00:23:37.500 I think, are going to speed up still. But the mob mentality, that you have to be all in on
00:23:44.300 somebody or you're a traitor, I think that is actually starting to swing back to a normal kind of
00:23:50.860 feeling, because average people are feeling what people like me have felt, and others, like the Tea Party;
00:23:57.640 we have been feeling this for about 10 years, and it is terrifying. Me too. It is.
00:24:05.160 It's a terrifying witch hunt. And you don't, where do you go to get your
00:24:14.020 reputation back? Bob, where do you go? I don't know. And, you know, what I
00:24:19.760 really accomplished in 2016 was setting up the first-ever monitoring system to see what
00:24:27.080 big tech companies were showing people. No one's ever done that before, because tech's power to
00:24:33.480 shift opinions and votes and purchases and attitudes and beliefs around the world derives from what
00:24:39.960 they internally call ephemeral experiences, like search results. They're generated on the fly.
00:24:46.860 They have an impact on your thinking. They disappear. They're gone. They're not stored anywhere.
00:24:51.080 That's called an ephemeral experience. That's what Google people call it. And it's
00:24:56.240 extremely powerful in shifting votes and opinions. I've shown in multiple experiments, you
00:25:03.000 know, published in peer-reviewed journals, that you can easily shift 20% or more of undecided voters, up to
00:25:09.720 80% in some. So how would they do that, if they are doing it? How would that happen?
00:25:17.100 Explain that to the average person who has not heard this before. Okay. First of all,
00:25:23.700 it can happen just because they're not paying attention to their algorithm. And their
00:25:28.720 algorithm, of course, always puts one dog food ahead of the other and, you know, one vacation spot ahead
00:25:34.840 of the other and one candidate ahead of the other. It has no equal-time rule built into it. And once
00:25:41.880 it puts one candidate ahead of the other, then that starts to have a dramatic impact on undecided voters.
00:25:49.540 And as more undecided voters shift, the bias in search results gets stronger, that shifts more
00:25:57.800 undecided voters, et cetera, et cetera. It's a bandwagon effect, what I call a digital bandwagon
00:26:03.600 effect. And I've measured these things very precisely. And again, 2016 was a tremendous
00:26:09.700 milestone year for us because we actually built a Nielsen-type system to look over people's shoulders
00:26:17.720 with their permission and see what these companies were showing them. Then we built a bigger system
00:26:22.640 in 2018. And in 2020, we're trying to raise the money to build a much bigger monitoring system,
00:26:28.880 because you will never know why the next presidential candidate wins unless there has been extensive
00:26:37.620 monitoring of all this ephemeral stuff, news feeds, email suppression, shadow banning,
00:26:44.980 search suggestions, search results, et cetera, et cetera. And I'm the only person in the world
00:26:51.020 who's ever built such systems. And we have to have these systems, or we will not understand
00:26:57.520 what is going on and why somebody won or lost an election.
00:27:03.320 All right. So, Dr. Robert Epstein, senior research psychologist, American Institute for
00:27:09.760 Behavioral Research and Technology. He is talking about Google and these algorithms that are changing
00:27:16.280 the way we behave, the way we look, the way we think, all underground, and how they impact our
00:27:22.360 elections. Is it mygoogleresearch.com where people could donate if they wanted to donate?
00:27:29.280 Yes. And yes, I'm glad. And you have been more helpful to me in that regard than just about
00:27:35.160 anybody. You've actually raised, without you knowing, just because you keep giving out that
00:27:39.200 link, you've raised a lot of money for this research.
00:27:42.440 I want you to know that I believe this is one of the most important things that we can do.
00:27:49.020 You are actually, and I think you'll agree with this, way behind where
00:27:54.260 you should be, but you're, you are light years ahead of anyone else on the planet. Would you agree
00:28:00.020 with that? Positively. And I've been slaughtered now by mainstream media, which is my media. That's
00:28:08.440 my media. I'm not a conservative. This is an incredible story. Okay. We have some,
00:28:14.420 detailed questions. And then I want to talk to you a little bit about what Google may be
00:28:19.840 doing, not only during this election, but also with ICE. We'll get into that. I'd just
00:28:27.420 like to pick your brain on theories, if you have any. You can donate. And I urge
00:28:34.620 you in the strongest of terms, if you have money that you can donate, five bucks or, you know, a hundred
00:28:43.060 thousand dollars, that you would consider this project. There is nothing more important than
00:28:49.680 getting your arms around the algorithms at Google. Mygoogleresearch.com. Mygoogleresearch.com.
00:28:57.440 Every day, it seems, somebody pops up in my world, like what Dr. Robert Epstein just said, with, "Wait,
00:29:05.820 they've turned the guns on me. And this is my side." Oh, look at who they turned the guns on
00:29:10.660 here. Peer review. Yeah. Now peer review doesn't matter. I thought that was the end-all, be-all.
00:29:15.580 We're supposed to now peer review those studies. They're gone. How about, um, the fact that all
00:29:19.500 of a sudden we're supposed to trust gigantic companies making decisions for us. I mean,
00:29:23.360 I thought this was the exact opposite of what the left wanted. You know what, you want companies
00:29:26.880 controlling all this information? Well, it benefits them in this particular moment. Ignoring peer review
00:29:31.940 benefits them in this particular moment. So they'll take out a guy who voted for them,
00:29:36.900 which doesn't matter. Doesn't matter. Doesn't matter. Apparently he's garbage now.
00:29:40.840 It's really horrible. Reprehensible. I urge you to donate at mygoogleresearch.com.
00:29:46.340 Okay. Um, may I call you Bob or Robert? Robert, if you like, sure. Robert. Um, so Robert, we have
00:29:54.120 some questions. We, we went through the vanity fair article and if you read this article, if that's the
00:29:59.260 way you were doing the research, it's crazy. It's crazy. Would you agree with that? I did. I can't
00:30:07.180 even read these things. There are hundreds of them. Okay. So we're going to, we're going to go
00:30:10.660 through it piece by piece, and you just tell us: is this how you do it? If not, tell us how you
00:30:15.520 do it. Go ahead. The first accusation in here, Robert, is they basically say that the reason
00:30:21.440 why you're going after Google is because you have a vendetta against them because in 2012,
00:30:26.520 they warned visitors to your website that it had been hacked and was serving malware to people
00:30:32.400 who were reading it. Okay. I have no vendetta against Google. I am probably Google's biggest admirer
00:30:39.300 in the world. I have friends at Google. Yes, my website was hacked in 2012, as everyone's
00:30:47.020 is eventually. And I got notified of this by Google, and that caught my eye. I said,
00:30:54.800 why is Google notifying me and not a government agency or a nonprofit organization? And then as
00:31:01.020 a programmer, I get intrigued too, because Google was now blocking access to my website, not just
00:31:07.400 through Google.com or through Chrome, which they own, but even through Safari, even through Firefox,
00:31:12.460 which is, you know, a nonprofit run browser. And I got curious, but how is that happening? How can
00:31:18.840 that be? How can Google block you through, you know, Apple Safari? And so I started to kind of just
00:31:24.720 look at Google more seriously. That I have a vendetta against them? That's absurd. And then later that
00:31:31.040 year, there was research, uh, the early marketing research on the power of search rankings, uh, that
00:31:37.080 the power that search rankings have to influence people's clicks and purchases. And that, that made
00:31:43.400 me think, well, if that's true, then maybe, uh, search results could be used to shift opinions or even
00:31:51.920 shift voting preferences. And so I started my first series of experiments looking at that.
00:31:56.500 And it is, it is not like, it's not like you're Glenn Beck doing this. You are the former editor
00:32:02.180 in chief of Psychology Today. You are also the senior research psychologist of the American
00:32:07.880 Institute for Behavioral Research and Technology. Your job revolves around how people make decisions
00:32:15.880 and what causes people to behave in certain ways. So of course you would be curious about this.
00:32:22.240 Of course you would investigate it because this is probably, I think in correct me if I'm, if I'm
00:32:28.260 wrong, doctor, but I believe that in maybe five years, but in the next 10 years, we're going to have
00:32:34.300 to have a serious discussion on, on if you actually have free will because of what they're doing in
00:32:42.480 nudging and, and how they will use and manipulate data.
00:32:46.280 Well, we're, we're, we're past that, uh, Glenn, uh, you know, in, in, in looking at, at, uh, Hillary's,
00:32:54.940 uh, horrendous, uh, tweet, uh, attacking me and, and telling blatant lies about me. I mean,
00:33:02.300 we're way past that point of, of, uh, having any free will left because you have to understand Hillary's
00:33:08.740 tweet by understanding how dependent she has been for years and years for votes and money from
00:33:16.100 Google. Google was her largest, uh, donor. Her, her, her chief technology officer, Stephanie
00:33:22.600 Hannon was a former Google executive. I mean, I could go on and on and on and on about that
00:33:27.040 relationship. And she got that information that she tweeted from Google.
00:33:36.000 Wow.
00:33:36.600 That's a great point.
00:33:38.060 Wow.
00:33:38.340 It's a little, a little circuitous. There's a little, there's a little bit of incestual feeling
00:33:44.960 here to some of this. Um, we, uh, do you have a yes or no question? We have about 35, 40 seconds.
00:33:50.840 Yeah. I mean, I don't think we can get into any of those in 30 seconds. Okay. So we want to get
00:33:54.740 into a couple more though. A couple more. And then I want to ask you, I don't know if you saw this,
00:33:58.420 that the, the Google employees yesterday, they don't have a problem with China apparently,
00:34:02.160 but Google employees were protesting in front of Google. They want them to stop helping and working
00:34:08.460 with ICE and the Border Patrol. And my first question, uh, was, wait, what is Google doing
00:34:18.200 with ICE? And are they providing information to the government? Because I don't think that's kind
00:34:25.660 of in the game plan for anybody. Is it? This is the Best of the Glenn Beck Program.
00:34:43.600 Like listening to this podcast? If you're not a subscriber, become one now on iTunes. And while
00:34:48.920 you're there, do us a favor and rate the show. A man who began his journalism career
00:34:54.180 in the Bombay-based Indian Express newspaper group. I don't even know how you would go about
00:35:00.400 doing that. Kevin Williamson is here. Hello, Kevin. How are you?
00:35:04.060 It helps to have a friend in college whose family lives there and, uh,
00:35:06.660 really introduce you to some people. Yeah. So I ended up there. Okay. I didn't know the first
00:35:09.720 thing about it. So you were, uh, you were a, uh, you started there in Bombay. Yeah.
00:35:14.340 Had you ever been to Bombay? I'd never been outside of the United States except for Mexican border
00:35:18.700 towns growing up in Texas. Yeah. So it's a little different than Texas, the U.S. or Mexico.
00:35:24.180 Yeah. Um, you know, when I got there, um, no one even knew what the population of Bombay was at
00:35:28.980 that point in the nineties, because it was such a crazy chaotic place. They thought maybe 20 million,
00:35:32.960 maybe 25 million. It was a, but it's a great place to be a newspaper guy because everyone
00:35:37.080 read newspapers there. You know, typical household would get four or five newspapers a day. So
00:35:40.840 wow. Tremendously fun place to, uh, start a newspaper. Yeah. And then you became a theater
00:35:45.160 critic. Uh, some years later. Yeah. When I was living in New York, I wrote the theater column
00:35:49.080 for The New Criterion for a while. But aren't you, I mean, don't you have to be an old cantankerous
00:35:54.460 bitter man to do that? Or are, are you, I've been an old cantankerous bitter man since I was about
00:36:00.240 11. And, uh, so I, I'm kind of growing into it. Okay. Yeah. Your body is starting to catch up
00:36:07.020 with your mind. Unfortunately, I've been waiting for my hair to turn gray for years. Yeah. Don't wish
00:36:12.180 for that. Cause I, I always wanted my hair cause everybody in my family by 30, they're white.
00:36:17.140 Yeah. And it took me to 50 and now, and I was like, well, everybody's got the great white hair.
00:36:22.340 And now I have it. And I'm like, good God, you look like you're a thousand years old. Um, all right.
00:36:27.820 So, um, you've written, you've written a new book, The Smallest Minority. A borderline unpublishable,
00:36:34.620 angry, profane book. Uh, so I, um, I just, I, you can't read from this book on the radio. Oh yes.
00:36:40.980 There's one paragraph, there's one paragraph that I'm, I'm not going to read, but I'd like
00:36:46.160 you to read, uh, because it is one of the greatest screeds of all time. Oh no, I don't
00:36:54.240 know if I can find it here. It was, uh, you describing, you describing people, uh, uh, that
00:37:05.860 you, you know, you had a, Oh shoot. Where is that? Uh, Oh, here it is. Here it is. Here
00:37:10.040 it is. Uh, pick it up right here. Um, let's see. It's like testimony. You're like, actually,
00:37:15.460 did you write this? "Everyone knows I'm a monster." I know my own work. Can I give this book
00:37:20.360 to you? From "Everyone knows I'm a monster," uh, to the end of that paragraph on the
00:37:25.560 next page. I would much rather you read it. No, no, no. I would not. It's got words in
00:37:29.820 it. I can't pronounce. It's strange to read your own work. No, it's not. Um, this is like
00:37:34.020 when you got to read it the way you meant it too. Well, um, I should sometime tell you the
00:37:39.360 sentences that got left out. They edited this thing? Are you saying? Oh man, you wouldn't
00:37:44.780 believe the original version of this book. Oh my gosh. This is after edits. This is the
00:37:49.860 bowdlerized version. Oh my gosh. Because this is the first paragraph. Listen to this. So
00:37:54.760 it starts, everyone knows I'm a monster. And by everybody, I mean all good, decent, serious
00:37:59.060 newspaper analog reading people. And by all good, decent, serious newspaper analog reading
00:38:03.160 people, I mean you sad, atavistic, masturbatory specimens out there in the woolly wilds of
00:38:07.800 America. By which I mean you pud-pounding nobodies in Brooklyn or Guymon, Oklahoma, depending on
00:38:12.460 your tribe. Obsessively following intra-media squabbles on social media, cheering for what
00:38:16.880 you imagine to be your side, like a bunch of marginally employed and past-their-prime NFL-
00:38:21.300 cheering, leg-tattooed douche rockets at some ghastly exurban sports bar, and enjoying
00:38:26.180 a nice bottle of the warm and comforting illusion of solidarity as though Tom Brady or Le'Veon
00:38:30.360 Bell would have taken a voluminous equine piss on you from a great height if you were smoldering
00:38:35.200 crackling on the sidelines like a sizzling plate of Kansas City burnt ends.
00:38:39.860 Now, the question is, that would have taken me a week to write that. That is just brilliant.
00:38:47.020 How long did that take you to write? Did that just pour out?
00:38:51.860 It took about as long as it did to read.
00:38:51.860 So, okay, funny thing about this book is, you know, it's got a bunch of footnotes in it.
00:38:56.280 Yeah. And most of the book is sort of halfway like a normal political book. And then the
00:39:01.940 footnotes, which are about maybe a quarter of the book, are the kind of running commentary
00:39:06.680 of what I'm actually thinking as I write this stuff. And the footnotes were the part that
00:39:10.840 were problematic for some of the editors, I think. There are a few of them that didn't
00:39:15.820 make it in. I'll, I'll share with you off. Yeah. Yeah. Maybe you could share in our
00:39:19.160 podcast. Yeah. I don't think I can even share on your podcast. Really? Wow. Did you think
00:39:23.880 that it would go in or you were just like, I don't care? I figured I would just give it
00:39:26.980 a shot. You know, um, Regnery, I like working with Regnery, but when they put out the press
00:39:31.360 release for this book, they called it, you know, hilarious and profane. This was before
00:39:34.580 the book was done. Yeah. I figured if they're going to put profane in the press release,
00:39:37.560 that's license, right? That's right. I can do what I want. Right. Right. And you did.
00:39:41.780 Kinda. Yeah. Yeah. And you did. So, so take me, take me through it. I'm going to, we're
00:39:47.300 going to talk about it. We're going to do a podcast, uh, today for a broadcast in a couple
00:39:51.400 of weeks, but, and, and we'll go through all of it, but take me through the, the premise
00:39:56.740 of the, uh, of the book. Yeah. I started writing. And don't leave out any of the, uh, acerbic
00:40:02.560 or. I started writing the book in, in 2015 after, you know, witnessing a number of these
00:40:07.460 dumb, you know, kind of Twitter mob freak outs. The, uh, a lot of them weren't really
00:40:10.960 exactly political or political people. I, you know, the Justine Sacco business and
00:40:14.380 the, uh, guy getting canned from Google and all that. And, um, there was something
00:40:19.280 to me that seemed weird and, uh, kind of ritualistic about this stuff. It was a kind
00:40:23.560 of public ceremony. It wasn't really something that was about the issues that it pretended
00:40:27.980 to be about. And so I wrote part of the book at that time and a book proposal and I sent
00:40:32.020 it around and nobody wanted it. And then a couple of years later, I went to work for the
00:40:35.820 Atlantic for three days and got fired. And my phone started to ring before I got to
00:40:39.680 the airport literally. And I was waiting on the plane to come home to Texas. And, uh,
00:40:44.340 people were suddenly interested in the book. So go figure. Um, it's an ill
00:40:49.920 wind that blows nobody any good. But, um, so the book is about, um, some of the social
00:40:56.820 and political reasons for why people have become so hysterical and theatrical in terms
00:41:03.240 of their political engagement. And what I really ultimately argue is it's not really
00:41:06.340 about politics. It's that people have a certain emptiness in their lives in a sense that they
00:41:11.400 lack connectedness and, um, these media mob phenomena and social media, you know, this
00:41:17.660 kind of performative theatrical hysterical politics gives them a false sense of having
00:41:22.900 been involved in something important. It gives them a sense that they've been involved
00:41:27.580 in something meaningful when they're not. Um, but it, it, it kind of feels good. And so
00:41:31.880 people go to it in this weird, addictive, compulsive way. So it's not really politics
00:41:36.080 that's happening on Twitter. It's this weird, embarrassing public group therapy session.
00:41:40.160 And that's essentially what the book is about.
00:41:42.000 So, um, it, it, but it is, it's being used by politics.
00:41:46.860 Sure. Yeah. Um, we just had, I said to Stu almost every day now I meet somebody, uh, and
00:41:54.200 usually now from the other side that is, has just been affected, lost their job, lost their
00:41:59.660 credit. We just had a really brilliant, uh, psychologist on with us a few, a few minutes
00:42:04.280 ago. And he, he's now being targeted by Google and Clinton, whom he voted for.
00:42:10.260 Yeah. He's a fan of, of Google. I mean, he has respect for Google. Um, and, and Clinton,
00:42:17.020 he said, I've, I've got a letter from Bill Clinton hanging above my desk and now they're
00:42:22.260 taking me out and saying that I'm a monster.
00:42:24.080 Yeah. Um, there's a bit in Coriolanus about that, how, you know, you're the favorite one
00:42:28.160 day and the villain the next day. And that's, that seems to be the case of it. Um, one of
00:42:31.680 the things I get into the book a lot is the emergence of that very thing of the use of
00:42:35.220 employment as a weapon of political coercion. And I think that's a really interesting, um,
00:42:40.320 subject to follow up on because this phenomenon of, uh, demand for homogeneity and conformity
00:42:46.340 is not really so much a problem for people like you or me. I mean, we're in the, in the
00:42:50.100 controversy business. It's what we do. Uh, you know, maybe you lose an advertiser here
00:42:53.940 and then maybe you lose a gig here or there, but you know, that's kind of what we do. It's
00:42:57.880 a much, much bigger problem if you're someone who's trying to manage a Starbucks in Philadelphia
00:43:01.100 and you're going to lose your job because you're enforcing company policy, but it becomes this
00:43:05.200 viral, uh, Twitter phenomenon. Or if you're a programmer at Google or you're someone who
00:43:09.400 works at a bank or you're someone who's a hairdresser and make examples of these people
00:43:14.200 and that kind of, you know, psychic terrorism is effective. And now people know just not to voice
00:43:19.380 opinions in the first place if they're in any way afraid that it might be unpopular or
00:43:23.500 nonconforming. I will tell you, I think that, um, I would have agreed with you just a few
00:43:29.660 years ago, but I believe my voice and I didn't feel this way at Fox. Okay. And they were coming
00:43:36.100 after me like crazy. I do believe my voice could be silenced. I could be erased now from
00:43:41.700 history and just, you're just gone. You think? Yeah. You don't? I think, well, not to flatter you,
00:43:48.980 but you sell an awful lot of books and have an awful lot of listeners. I think it'd be hard
00:43:51.960 to do that. Um, but a lot of that, and I think maybe, um, we remind the, the, the Navy SEALs
00:43:58.380 when they turn dark and they're working for the corporations and come to get me at night.
00:44:02.600 But a lot of this stuff, when it comes to people like you though, I think, I think you saw this
00:44:06.060 really in the Roseanne Barr case where the public Twitter mob phenomenon is really a pretext for
00:44:11.880 things that are going on inside the company. You know, no one at ABC is making multimillion dollar
00:44:17.480 programming decisions based on what @Caitlin321Vegan on Twitter has to say
00:44:23.900 about, you know, Roseanne Barr. Right. Right. Um, I didn't lose my job at the Atlantic because
00:44:30.020 people were freaking out on Twitter. I lost my job at the Atlantic because of things that were going on
00:44:33.680 on staff and in the company. And, uh, that tends to be the case, I think more for people like us.
00:44:38.620 And you've seen this in the, in the positive outcomes too, with, uh, I'll say a kind word for
00:44:42.960 the New York Times, which has had several of its writers and people targeted in this way. And they
00:44:47.520 said, no, we're the New York Times. We hire who we like, and we're going to keep Bret Stephens on
00:44:50.240 the staff and they don't like it. Yeah. That's the way Premiere radio is. That's a reason why we're
00:44:53.980 still on radio is because iHeart is an amazing company that just is like, I don't care where I
00:45:00.960 don't care. We'll put any voice on. And as long as they don't lose our license and they're generally
00:45:06.300 responsible, we don't care what their opinion is. We'll put them on. And we don't care what the mob
00:45:12.420 says. Yeah. And it's going to be up to institutions to stand up to this kind of thing. Uh, because,
00:45:16.940 you know, individuals, even, you know, ones that have some outlet like I do, um, really rely on
00:45:21.540 institutions to, to be the ones who are going to stand and run guard on this. But that's what I
00:45:25.300 mean. I think that, you know, you could erase because Google is quickly becoming every outlet,
00:45:32.280 right? It's becoming the, I mean, if you're not with Google, you're not around. And that's one of
00:45:37.560 the misunderstood things about this, you know, like that, that freak out about James Damore at Google
00:45:41.200 was not about some nobody programmer that no one cares about. It was about Google. It's not about,
00:45:46.700 we can get this guy fired. It's, we can make Google jump when we say jump. And we can make Facebook
00:45:50.540 jump when we say jump. And we can make the New York Times rewrite a headline when we say the
00:45:54.000 New York Times is going to rewrite a headline. So that the people involved in this who, you know,
00:45:57.860 get fired or otherwise are really just sort of instruments. They're, they're props for this,
00:46:02.420 this great act of theater. It's more about controlling the institutions. And that's where institutions
00:46:07.060 really have to stand up for themselves. And that's the shame of particularly the university
00:46:10.340 culture, where you've got a bunch of academics who depend on intellectual honesty and intellectual
00:46:16.020 freedom, but will not stand up for it in their own institutions.
00:46:18.320 So I'd like to, may I change the subject a bit here, Kevin, with you and go to red flagging.
00:46:22.880 Sure.
00:46:23.360 Uh, I just got a, uh, a listener sent in a, uh, uh, uh, fundraising piece from a, uh, a senator in,
00:46:34.920 uh, uh, where was it? Pennsylvania that was making the case that we must have red flagging.
00:46:44.580 Your thoughts on what's happening?
00:46:46.900 I don't understand the basic case for red flag laws. So David French at National Review and I've
00:46:51.340 debated about this a little bit on the Corner. Um, I kind of distrust the, the, the whole premise
00:46:55.760 of it, but what's used as the, as the example is that we have these laws for involuntarily committing
00:47:01.280 people, uh, for mental care when they seem to present an immediate threat to themselves or
00:47:06.340 others. And there's a process there by which a judge and a doctor are involved, and we've got this way of
00:47:10.580 doing it. So they want to use that as the basis of the red flag laws. I think
00:47:16.200 that ought to be the red flag law, that if you think someone actually is a danger to himself
00:47:20.560 or someone else, rather than messing around with whether this person can buy a gun, uh,
00:47:25.220 then we should probably, you know, ensure that this person is under, is under, is under
00:47:28.580 mental health care. So I think to that extent, we've already got the red flag law that we need
00:47:32.640 and people will say, well, it's, it's, it's very onerous and it's hard to get through this
00:47:36.240 process and it's hard to do. It should be. Yeah, exactly. That's how we want it. We're talking
00:47:40.140 about the bill of rights here. And, um, I'm, I'm always pretty queasy about the idea of suspending
00:47:46.020 anyone's civil rights, uh, when they haven't been charged with a crime or convicted of a crime or
00:47:50.680 even arrested for a crime. We're taking, look, if, if somebody comes in and if I, you know, I come in
00:47:56.360 in the day and I say, I got Pop-Tarts in my pants. I had Pop-Tarts in my pants and I'm eating Pop-Tarts
00:48:02.340 from my pants. You might say, Glenn, you might, you know what, why don't you take the day off?
00:48:07.020 You might call Tanya and say, he might need to see a doctor. Um, but if I'm coming in with a gun or I'm
00:48:12.940 dangerous, then you might call police and I should be taken to the doctor and a doctor and a judge
00:48:18.700 should decide with my wife if maybe I, you know, have more than Pop-Tarts, uh, I may have guns and
00:48:24.280 maybe that's a danger. That's the way you deal with it. What they're trying to do is with this
00:48:29.800 red flag law, nobody will take it that far. Nobody will take it that far, but I I'm telling you right
00:48:37.540 now. You can't tell me that there aren't a lot of people who have been divorced that in that divorce
00:48:44.760 proceeding, somebody might say, you know what, man, he's dangerous and he's got a lot of guns.
00:48:50.660 Yeah. One other thing about having been a small town newspaper editor is you spend a lot of time
00:48:54.720 reading court records of those very things, you know, divorce cases, custody cases and stuff like
00:48:59.080 that. And probably half of the death threats I ever got in my life were from, uh, editing a
00:49:04.500 small town newspaper and writing a DUI story about some guy who was in a custody dispute with his
00:49:08.540 wife and thinks he's going to lose his kids, uh, because this thing comes up and the sorts of
00:49:13.300 accusations that are made in those situations tend to be often irresponsible. And there's not much of
00:49:18.440 a downside for doing it. We don't really retaliate against people for that sort of thing. I, um, don't
00:49:23.820 have as much faith as a lot of conservatives do in the law enforcement and prosecutorial apparatus.
00:49:28.820 Although I think that the prosecutors are a bigger problem than the police officers are for the most part.
00:49:33.600 Yeah. And I don't really trust them with the power they already have and to give them additional
00:49:39.840 power on top of that to essentially make an end run around the bill of rights. Um, you know,
00:49:45.300 I'm going to take some convincing on that. Yeah. I'm going to take a little convincing.
00:49:48.920 But I don't think, I don't think a lot of America's needing a lot of convincing.
00:49:52.440 The really maddening thing about this is that, um, you know, if you look at, say, the U.S. Attorney's
00:49:57.000 office for, for Illinois, they will not prosecute a straw buyer case. They just won't. Uh,
00:50:01.440 they don't think it's worth their time unless it's part of a big organized crime investigation.
00:50:04.740 The current conviction rate for, uh, illegal handgun cases in the Chicago area is about 14%.
00:50:11.100 Um, we've got laws on the books that we ought to be enforcing. We really ought to be going
00:50:15.840 after straw buyers. We ought to be going after people for minor weapons charges before they
00:50:20.020 become homicides. Um, we've got a lot of things that we could be doing on this front that we just
00:50:24.580 simply refuse to do because law enforcement is basically lazy. Um, if you look, all of our gun
00:50:29.960 control proposals are targeted at licensed gun dealers and the people who do business with them.
00:50:36.020 They've got addresses and business hours and records. They're really easy to police.
00:50:40.060 Whereas guys who are selling, you know, Glocks out of the trunk of their car off the interstate
00:50:43.820 somewhere are a lot harder to catch. All right. More with Kevin Williamson on
00:50:47.620 the Blaze Radio Network on demand.