The Glenn Beck Program - August 15, 2018


'What Are You Willing to Die For?' - 8/15/18


Episode Stats

Length

1 hour and 50 minutes

Words per Minute

164.05447

Word Count

18,199

Sentence Count

1,390

Misogynist Sentences

17

Hate Speech Sentences

13


Summary

The world is changing, and when the next big thing hits, we won't be able to function in it because we haven't even prepared ourselves for it. Glenn Beck explains why this is a problem, and why we should be worried about it.


Transcript

00:00:00.000 The Blaze Radio Network, on demand, love, courage, truth, Glenn Beck.
00:00:12.640 There's so many things that I want to talk to you today about, so many really big scandals
00:00:22.140 that we're in the middle of, so many things that we're ignoring, so many things that have
00:00:26.800 to be decided, and what are we doing?
00:00:28.980 We're arguing about what?
00:00:31.320 The midterms?
00:00:32.780 We're arguing about a corrupt politician that was in bed with Russia years before Donald
00:00:39.360 Trump even came, and we're not even prosecuting him because we're trying to find out how many
00:00:44.720 people are in bed with Russia.
00:00:46.500 We're just trying to find out if anybody around Donald Trump is in bed with Russia.
00:00:50.360 This is ridiculous.
00:00:52.300 What is happening to us?
00:00:53.980 And meanwhile, the world is changing, and when it hits, we're not even going to be able
00:01:02.660 to, we won't be able to function in it because we haven't prepared ourselves.
00:01:07.740 We haven't even thought.
00:01:09.560 Let me give you this.
00:01:12.040 Do you trust yourself, your own ability to unplug from technology?
00:01:17.720 Now, I want you to think about this.
00:01:20.560 People who were born before the 90s are now wearing it as a badge of honor.
00:01:25.800 I quit Facebook, you know, or they do a regular technology fast.
00:01:30.360 I'm not going to use technology for the next week, okay?
00:01:35.500 How many make it?
00:01:37.400 We like to think that we're not overly dependent on technology while posting on social media
00:01:42.640 about how old school we are.
00:01:44.580 How many times do you see people say, you know what?
00:01:46.940 I took a break, and you know, when I took a break, everything worked out, and you're
00:01:50.540 posting that on Facebook.
00:01:52.920 Why did you come back?
00:01:54.240 Now, technology is a good thing, until technology starts giving you puppy dog eyes and producing
00:02:04.080 digital crocodile tears.
00:02:07.120 What happens if your technology says, please don't turn me off, please, I'm lonely?
00:02:16.500 Will you do it?
00:02:18.060 When I first asked this question on the air, it was probably 1996, 98, and I remember the
00:02:28.180 people in the room looking at me saying, that is crazy, and I'm like, no, it's coming.
00:02:33.780 Well, it's not science fiction anymore.
00:02:35.520 It's right around the corner, and if you're skeptical whether humans will treat robots like
00:02:40.720 a family member, a new study might alter your view.
00:02:44.500 Researchers now in Germany set up an experiment to examine how people treat robots when the
00:02:50.620 robot acts like a human.
00:02:52.960 Each human participant was asked to work with a robot named Nao.
00:02:57.800 They worked to create a weekly schedule and answer a series of questions.
00:03:02.140 What the participants didn't know was that completing the task was just a way for the researchers
00:03:07.880 to find out what they were really interested in, and that is how the participants' interaction
00:03:13.140 with Nao would affect their ability to shut the robot down when asked.
00:03:18.520 Now, half of the participants were asked to shut Nao down without the robot protesting,
00:03:24.100 but the other half had Nao protesting.
00:03:28.760 In fact, Nao would say, please, don't turn me off, please, I'm scared that I will not brighten
00:03:34.680 up again.
00:03:36.640 Of the 43 people who heard his plea, 13 chose not to turn him off.
00:03:43.600 Some said they felt sorry for him.
00:03:45.940 Others said that they didn't want to act against his will.
00:03:49.740 The other 30 people did turn him off, but they took twice as long on average to do so than
00:03:55.200 the group that didn't hear the robot's plea.
00:03:57.340 So what the experiment shows us confirms previous research demonstrating that humans
00:04:04.660 are prone to treat technology with human-like traits, especially robots, as living beings.
00:04:12.080 Now, take this human tendency.
00:04:14.420 Fast forward a few years in the future.
00:04:16.400 When robots look, sound, and act human, they will also know everything about us.
00:04:22.920 Are they human?
00:04:27.420 Do they have rights?
00:04:30.320 If they have rights, do they have voting rights?
00:04:35.300 By 2030, there will be 100 robots of some sort or another for every human being on Earth.
00:04:45.160 One hundred.
00:04:46.400 If they have rights and voting rights, what does that mean?
00:04:56.000 Can we find ourselves capable of having adult conversations?
00:05:02.840 We are at the turning point of human history.
00:05:07.240 And very soon, the problems that we all are now arguing about suddenly will seem very, very quaint.
00:05:23.300 It's Wednesday, August 15th.
00:05:25.680 This is the Glenn Beck Program.
00:05:32.180 I honestly don't know how to answer those questions.
00:05:37.240 But I think we are going to need to.
00:05:43.300 And this fall, we're going to be delving into some of those conversations because these things are happening.
00:05:49.540 For instance, they just stopped a sex robot from being shipped into Canada.
00:05:57.800 It was stopped at customs.
00:05:59.260 It was coming in from Japan.
00:06:01.180 Why Japan is, you know, all into the sex robots, I have no idea.
00:06:05.340 But you apparently can buy sex robots from Japan and have them shipped to you.
00:06:11.200 But in Canada, it's illegal to have a sex robot that resembles a child.
00:06:16.420 Now, Japan is making child sex robots.
00:06:21.880 And there is a debate now on whether or not that would be good therapy for pedophiles.
00:06:27.780 Should they just be allowed to have sex with child robots so they don't hurt children?
00:06:34.760 Or will this make things worse?
00:06:36.400 There's a huge debate, and there's a hole in this entire argument.
00:06:43.760 OK, let's have that.
00:06:45.100 I think I know where I stand, but let's have that conversation.
00:06:48.320 But the hole that nobody's talking about is when the robot does hit a point where it claims consciousness, and that will come when you won't be able to tell the difference between a real human being and, you know, some sort of silicon life form.
00:07:09.420 In its thinking, don't you become a slave master who is abusing these robots?
00:07:17.780 Do you think, if it claims to be human or to be alive, if it says, please don't do that, and you're doing it anyway because you own it,
00:07:27.600 aren't you a modern-day slave trader, a slave master?
00:07:31.640 Even if you're not, a lot of people will start to think that. Of course, think of the attachment people already have to inanimate objects, even when they're not showing human characteristics.
00:07:42.180 You know, I mean, it's something that's going to be a real question, I will say.
00:07:47.280 And when it comes to, you know, sex robots, child sex robots and all of this stuff, there's a limit to how many issues you can master in your life.
00:07:58.300 And that might be a debate I just completely avoid.
00:08:01.540 Is that OK?
00:08:02.900 Is that allowed?
00:08:03.900 Do I get one issue I can just completely avoid?
00:08:07.040 Yes, you can.
00:08:07.720 Just make noise, really loud noise, and hope I don't see anything or hear anything.
00:08:12.200 Yeah, is that a yes?
00:08:14.020 I think on some you can. You can't do that on everything.
00:08:17.200 No, but I think there's some things you're like, I'm just not going to deal with that one.
00:08:20.360 If I can pick one, it might be this one.
00:08:22.380 You know what?
00:08:22.840 Actually, these are the kinds of things that fascinate me and animate me now, because I'm so tired of the other ones.
00:08:30.240 I don't want to deal with the other ones.
00:08:31.800 I'm just like, can we talk about pedophilia and sex robots?
00:08:36.220 Because at least it's something.
00:08:38.180 At least we're having an argument that at least feels like it's about the future.
00:08:43.460 At least you could have it.
00:08:44.500 And it's not political yet.
00:08:46.440 It's not political.
00:08:47.920 Right.
00:08:48.140 It's analytical.
00:08:49.340 It's like, let's look at the facts and really have a deep conversation without somebody saying, yeah, but they're in bed with the Trump administration.
00:08:57.500 You're like, oh, dear God.
00:08:59.020 Yeah.
00:08:59.220 Because it's just, you know, that puts you on teams and then no one makes any point.
00:09:02.680 You notice this in your book, Addicted to Outrage, which is coming out in a few weeks.
00:09:06.500 And as I was reading it, I think, generally speaking, it's in some ways a more positive book than you're known for.
00:09:14.880 Oh, it's a very positive book.
00:09:16.000 Like, because, you know, you're looking at these issues and you're saying, OK, here's an actual solution.
00:09:20.200 You can tell, though, when you get into some of the technology stuff that you don't yet see the end of how that turns out really well.
00:09:27.460 I don't think anybody does.
00:09:28.840 I mean, you have the best minds on Earth split.
00:09:34.360 Some of them are saying it is the end of the human race, which I could make a very passionate case that would stand up that, yes, the human race is over by 2050.
00:09:46.880 Humans will not be humans.
00:09:48.760 We will still kind of exist, but in an entirely different way.
00:09:53.540 I could also make a pretty compelling case that we will all be dead.
00:09:57.760 I could make the case as well that it's utopia on the other side of this, but not with the way that we're behaving now.
00:10:06.620 Well, I saw something that I think is, I would say,
00:10:13.640 I hate to use this word, but I think it is heroic in today's world.
00:10:19.620 We have lowered our standards so low.
00:10:23.540 We've gone so low that it doesn't take a lot to be a hero.
00:10:28.760 And I want to show you something that came out yesterday and not a lot of people are talking about it and they should.
00:10:34.480 It was somebody who is backed up against the wall and has information that could hurt somebody, really hurt somebody.
00:10:42.780 And he says, I'm not getting involved, even though I know this person would love to do it.
00:10:50.960 But it's heroic, and I want you to hear what he said, what it was about, and his reason for not pulling the trigger.
00:11:03.020 We'll do that when we come back.
00:11:05.380 First, let me tell you about our sponsor this half hour.
00:11:07.900 Have you ever taken your car in to the mechanic just for an oil change?
00:11:12.880 Stu, you know, the one thing, you know, they say that making decent money doesn't change your life.
00:11:19.880 Money doesn't change your life.
00:11:20.960 It does when it comes to your car.
00:11:23.320 Sure.
00:11:23.900 And we because it actually runs.
00:11:25.300 Yeah, it actually runs.
00:11:26.060 Or we don't freak out.
00:11:27.960 I mean, unless you go and it's like, you know, six thousand dollars.
00:11:31.380 But you don't freak out when the check engine light goes on or when you have to get an oil change and you're like, oh, I don't have the money for an oil change.
00:11:38.900 Yeah.
00:11:39.320 When you know you don't have that huge issue around the corner that's going to wipe you out.
00:11:43.820 Yeah.
00:11:44.000 I mean, it's a huge peace of mind.
00:11:45.740 Yeah.
00:11:46.100 I spend so much of my life getting into my car going, please start.
00:11:50.120 Please start.
00:11:50.740 Please start.
00:11:51.400 Or just sitting at a light and realizing there's twenty-five cars behind you, and this isn't going to go anywhere.
00:11:56.880 Is it?
00:11:57.120 I'm going to have to explain this to all these people around me.
00:11:59.720 They're going to come with a tire iron and kill me.
00:12:02.440 Anyway, you don't have to worry about that.
00:12:02.440 If you have Car Shield, Car Shield makes the process of fixing your car for a covered repair
00:12:09.620 super, super easy.
00:12:11.100 You can have your favorite mechanic do it.
00:12:12.500 You can have the dealership do the work.
00:12:13.960 It's really up to you.
00:12:15.560 They provide twenty four seven roadside assistance and a rental car while yours is being fixed for free.
00:12:21.580 If your car has five thousand to one hundred fifty thousand miles on it, that doesn't mean you have to pay high repair bills.
00:12:27.240 Car Shield administrators have already paid out close to two billion dollars in claims and they're ready to help you.
00:12:32.460 They have helped me.
00:12:33.120 I really did. I went in for an oil change with an old truck of mine, and a sensor had gone off, you know, check engine.
00:12:40.800 I'm like, hey, check that, too.
00:12:42.260 It was like a five or six thousand dollar bill.
00:12:45.180 And the only reason why I don't know exactly how much it was is because Car Shield took care of it.
00:12:49.460 They covered it.
00:12:50.680 Call 800-CAR-6100.
00:12:53.400 Mention the promo code back or visit carshield.com and use the promo code back.
00:12:58.000 Save 10 percent.
00:12:59.000 Now, carshield.com. A deductible may apply, but it is absolutely worth having for your car.
00:13:05.340 Save 10 percent now at carshield.com promo code back.
00:13:15.420 So yesterday I read a quote because everybody's oh, Omarosa.
00:13:22.800 Does she have tapes?
00:13:24.260 It's Omarosa.
00:13:25.620 Even if she had tapes, I mean, look at the intent behind her.
00:13:33.300 You know, she had this intent the whole time.
00:13:36.480 She won on The Apprentice for being conniving.
00:13:40.180 Did she even win The Apprentice?
00:13:41.540 I don't think she won.
00:13:42.780 Did she?
00:13:44.200 Did Omarosa win The Apprentice?
00:13:45.960 Maybe she did.
00:13:46.780 I don't know.
00:13:47.020 She was on it several times, wasn't she?
00:13:48.760 She was awful.
00:13:49.780 I only saw her that once.
00:13:50.840 That's the only season. I feel like she lost.
00:13:53.740 I mean, again, I don't know. An awful human being anyway, in my opinion, and, it should be
00:13:59.340 pointed out, a terrible hire.
00:14:01.360 Terrible.
00:14:02.360 Everybody seemed to know this in the universe except for Donald Trump, and he deserves all
00:14:06.740 the responsibility for hiring her.
00:14:08.540 It was his call.
00:14:10.600 No one else.
00:14:11.140 There wasn't anyone.
00:14:11.680 There was him over that.
00:14:12.900 No fan of Donald Trump would have hired Omarosa.
00:14:16.200 Nobody.
00:14:16.480 No, I think no person who voted for Donald Trump was like, hey, you know what?
00:14:20.520 I'm glad he put Omarosa in there.
00:14:22.260 Everybody on Earth knew it was a bad idea, except Don for some reason.
00:14:25.800 OK, so so everybody's talking about these tapes.
00:14:29.880 Well, there's somebody else that was on The Celebrity Apprentice who spent time with
00:14:35.580 Donald Trump, and I happen to know that he feels passionately about Donald Trump, you
00:14:47.720 know, and not in a positive way.
00:14:50.960 He thinks Donald Trump is, you know, not the guy he would have voted for.
00:14:55.520 Um, I know, I'm sure that many of his friends would be like, you have
00:15:03.700 to speak out.
00:15:04.880 You could make a difference.
00:15:06.660 Right.
00:15:07.200 Right.
00:15:07.340 OK, he was asked, you know, so what about is there a tape?
00:15:12.300 I mean, you were there.
00:15:14.420 You were, you know, on The Celebrity Apprentice?
00:15:18.040 Do they have a tape?
00:15:19.680 Quote, I was in the room.
00:15:21.840 So you heard him say, oh, yeah, can you tell me what you heard him say?
00:15:28.020 No.
00:15:29.680 If Donald Trump had not become president, I'd tell you all the stories, but the stakes now
00:15:35.120 are too high and I am an unreliable narrator.
00:15:40.280 What I do as much as anything is I'm a storyteller and storytellers are liars.
00:15:46.500 So I can emotionally tell you things that happened racially, sexually that showed stupidity
00:15:53.900 or lack of compassion when I was in the room with Donald Trump.
00:15:56.420 And I guarantee you that I will get the details wrong.
00:16:00.200 I would not feel comfortable talking about what I felt I saw in that room, because when
00:16:06.400 I was on that show, I was sleeping four to five hours a night.
00:16:09.940 I was uncomfortable.
00:16:11.680 Stress is the wrong word, but I was not at my best.
00:16:15.300 Then at the end of the day, they put you into a room and they bring out a guy, Donald
00:16:19.200 Trump, who has no power whatsoever.
00:16:21.880 He's capricious and petty.
00:16:24.260 And the interviewer says, and you have to pretend to care what he thinks.
00:16:27.540 Yeah, because that's your job.
00:16:29.640 You sit at a table and this man rambles, pontificates.
00:16:33.320 He's given too much credit.
00:16:35.280 And because you live in the modern world, you've heard Trump ramble.
00:16:38.880 You've also heard Trump ramble when he thinks he's being careful.
00:16:42.220 Imagine when he feels he can be frank and I will tell you, but I will be very conscientious
00:16:50.800 not to give you quotations because I believe that would be morally wrong.
00:16:56.140 I'm not trying to protect myself.
00:16:58.100 This is really a moral thing.
00:16:59.960 The reporter says, so just to make sure I'm clear on this, it would be wrong if you misquoted
00:17:06.220 him because you don't want to have an undue effect on politics.
00:17:12.080 If he hadn't become president, I would be telling these stories all day long.
00:17:17.960 And if someone were to say, Penn didn't get that exactly right, you'd go, who cares?
00:17:21.760 But now being accurate matters.
00:17:25.980 The stakes are really high.
00:17:27.440 Not for me.
00:17:28.120 Nothing I say here hurts my career.
00:17:29.860 But for the world, the stakes are higher.
00:17:32.940 He is the president and he would be reading.
00:17:36.380 And what I'm trying to do here is to tell you the story emotionally without giving you
00:17:41.180 any specifics.
00:17:43.740 I think this is heroic.
00:17:45.420 Here's a guy, when the rest of the world is taking things out of context or ignoring
00:17:52.520 Chris Cuomo, ignoring things about people that should not be ignored, Antifa, but because
00:18:01.760 they're the enemy of your enemy, you make them your friend.
00:18:06.640 Here's a guy who says, I don't want any part of this.
00:18:10.960 Yes, I saw things, but I interpret things emotionally.
00:18:17.400 And I can tell you what I felt, but I can't tell you what he said, because I don't know
00:18:23.420 if I'll remember it right.
00:18:24.780 Do you listen to Malcolm Gladwell's podcast?
00:18:28.200 I've listened to some of it.
00:18:29.540 Yes.
00:18:29.840 So his last season of his podcast, Revisionist History,
00:18:35.440 it was all about lying.
00:18:41.640 And I heard the Brian Williams episode of this.
00:18:44.080 That's only the beginning of it.
00:18:46.120 He does.
00:18:47.600 He shows that people's memories are not accurate.
00:18:52.100 And he shows that, you know, they did studies on 9-11.
00:18:56.440 They're still doing these.
00:18:57.540 They talk to people right after 9-11.
00:19:00.280 Where were you?
00:19:01.380 What happened?
00:19:02.800 What did you say?
00:19:03.980 Who did you talk to?
00:19:05.440 What was around you?
00:19:07.060 Then they come back every five years and they say, what did you see?
00:19:11.140 Well, 10 years into it, people are starting to look at the statements that they made.
00:19:16.460 They made them right after 9-11.
00:19:18.840 And they're coming back and they're saying, I don't know why I said that, because that's
00:19:22.960 a lie.
00:19:23.460 I wasn't there.
00:19:24.220 I'm telling you, I was with this person and we said this.
00:19:28.860 And they fully believe that that's where they were 10 years later.
00:19:34.840 And they can't figure out why they would have said that.
00:19:38.560 Well, I must have been in an emotional state because that's not what happened.
00:19:41.400 And it's honest both times.
00:19:45.180 And this, I'll bet you he's listened to it, or he's probably so smart, he probably read
00:19:49.320 the research Malcolm did on this.
00:19:52.460 But that's what he's saying.
00:19:53.820 I don't trust my own memory.
00:19:57.040 And I shouldn't have a place at the table.
00:19:59.200 Don't trust anybody, because we're all storytellers.
00:20:02.940 By the way, the hero, Penn Jillette.
00:20:10.060 So here we are facing some really huge problems, some ethical decisions that absolutely have
00:20:19.440 to be settled.
00:20:20.420 And I truly believe that everything boils down to the Bill of Rights.
00:20:24.940 I believe that everything boils down to this.
00:20:27.280 Are we a collective or are we individuals who are trying to live with one another?
00:20:34.400 Are we so convinced that we are right?
00:20:40.000 Are we the scientists who say there is no such thing as intelligent design?
00:20:47.780 Okay.
00:20:48.660 It's only the Big Bang.
00:20:49.860 Well, what kicked off the Big Bang?
00:20:52.400 Well, there is no God.
00:20:56.260 Well, how do you know that?
00:20:59.420 We've become so sure of ourselves that we don't even take time to step back
00:21:09.400 a second and go, wait a minute, does this argument really even matter?
00:21:13.440 Does this argument do anything?
00:21:17.000 Are we ever going to be answering that question?
00:21:21.100 I mean, by default, and you could say, well, that's the trick.
00:21:25.180 Or you could look at it from a person of faith and say, that's the point.
00:21:30.660 You're never going to be able to prove God.
00:21:33.380 That's what faith is there for.
00:21:35.960 We're trying to teach ourselves how to have faith in things that are unseen.
00:21:46.500 Is that important?
00:21:50.820 I think it is because it, at least for me, it gives me the opportunity to believe that
00:21:56.560 tomorrow will be better because if I lose faith, I lose hope and then there's nothing.
00:22:02.200 Then tomorrow is just really bleak, but I have faith.
00:22:05.760 Which gives me hope.
00:22:09.760 And you can't prove that or disprove that.
00:22:13.740 And yet what we're doing is we're categorizing everybody because you have to.
00:22:17.580 If you're going to be a collective, then everyone has to have a label.
00:22:21.900 I keep all of these people over here and all of these people over here.
00:22:25.440 When Penn Jillette was asked this question, saying you're skeptical that there's
00:22:31.800 been a shift, which has been attributed to Trumpism, in those people's willingness to
00:22:37.360 believe things that are at odds with the facts.
00:22:40.620 Penn responds, but when you say these people, you're making a huge error because there are
00:22:46.480 no these people.
00:22:48.360 They don't exist.
00:22:49.840 You hear stuff like Trump supporters are homophobic.
00:22:52.020 Trump supporters are misogynistic.
00:22:53.540 This is a mistake and it was made by the Democrats when they accused Trump supporters of being
00:22:59.640 things that Trump supporters knew they weren't.
00:23:03.760 There are Trump supporters that have best friends who
00:23:08.220 have gay sex.
00:23:09.440 They do.
00:23:10.680 And so you can't put a 'they' type of thing on that.
00:23:14.180 For 50 million years, our biggest problems were too few calories and too little information.
00:23:19.360 For 50 years now, our biggest problem has been too many calories and too much information.
00:23:26.780 So what are we going to do?
00:23:28.680 So how do we adjust?
00:23:32.240 The first thing, going back to Malcolm Gladwell,
00:23:36.980 is to stop being so arrogant that we are right.
00:23:41.380 For instance, Stu, you and I were together on 9-11.
00:23:46.120 Well, not together.
00:23:47.360 We were on the phone.
00:23:47.980 Well, we were a couple.
00:23:49.080 Well, yeah.
00:23:51.600 Yeah, we were on the phone prepping the show for the day.
00:23:54.180 Okay.
00:23:55.080 And we were on the phone when the second plane hit.
00:23:59.080 Correct.
00:23:59.580 We were on the phone talking about the first plane because it was on the news already.
00:24:02.200 Correct.
00:24:02.960 And when the second plane hit, we were still on the same phone call; it was still going on.
00:24:08.020 We were talking to each other when the plane hit.
00:24:10.200 Right.
00:24:10.640 And we were not.
00:24:12.220 You had clicked over to take another call.
00:24:15.560 So you were on the phone, I think, with Tanya.
00:24:18.720 Tanya called.
00:24:19.460 Yeah.
00:24:20.600 And you were talking to her when that happened because I remember being on the phone with
00:24:25.560 no one and then like screaming, you know, screaming God knows what to my wife who was
00:24:30.160 in the other room, I think, in the shower.
00:24:32.440 But you've told the story of 9-11 before that we were on the phone.
00:24:36.560 We were talking to each other and we weren't actually at that moment talking to each other.
00:24:40.220 No, I would have never remembered that.
00:24:42.000 Just moments after.
00:24:43.420 Yeah.
00:24:43.640 We were.
00:24:44.720 But that is, you know, like there's certain points of that day that to me are complete,
00:24:50.100 perfect videos.
00:24:52.080 Let me ask you this.
00:24:53.040 What happened.
00:24:53.560 But in reality, that's not the way your memory works.
00:24:55.440 Did we watch the World Trade Center, the first one, come down together?
00:25:01.600 Did we watch it?
00:25:02.420 Did we see it?
00:25:04.860 I, you know, I've obviously seen it a thousand times, but I don't know that I saw it live.
00:25:10.800 OK.
00:25:11.340 My memory is we were in the car.
00:25:13.420 We went in together.
00:25:14.700 We did.
00:25:14.920 Yeah.
00:25:15.160 OK.
00:25:15.500 I don't know why we even did that or where we met, but we were going in together and
00:25:20.720 we heard it come down on the radio.
00:25:22.980 We were listening to WFLA.
00:25:24.240 Yeah.
00:25:24.780 Yeah.
00:25:24.960 I think.
00:25:25.280 Yeah.
00:25:25.680 Yeah.
00:25:26.260 And then we got in and we saw the replay of it coming down.
00:25:29.420 Yeah.
00:25:29.580 We didn't see it come down together.
00:25:32.620 I think you're right.
00:25:33.660 Yeah.
00:25:33.800 Now, this is something neither of us really remember for sure all the details.
00:25:39.300 And this is something that changed our lives.
00:25:41.800 I mean, we, our entire job changed on that day and we have done nothing for the last 20
00:25:48.400 years other than kind of study the ramifications of that event.
00:25:52.900 How many people claim that they absolutely know these things, but they don't.
00:25:59.500 Yeah.
00:25:59.740 You mentioned the people who wrote down on 9-11.
00:26:03.000 They were interviewed on 9-11 or 9-12, wrote down where they were on those days.
00:26:07.800 Five years later, ask the same question.
00:26:09.620 Where were you on those days?
00:26:10.620 Tell a totally different story.
00:26:11.820 And then are shown in their own handwriting, the story that they wrote the day after.
00:26:16.460 And they say the old memory, the thing I said on 9-12-2001 was wrong.
00:26:21.580 The new memory that I have now is the right one.
00:26:23.920 How do you explain that?
00:26:24.540 It's incredible.
00:26:25.160 And, you know, you could, of course, come up with anecdotal evidence on this, but it's
00:26:29.540 something like 60% of these memories on these, they call them flashbulb events.
00:26:35.540 60% of these memories are wrong in some way.
00:26:38.320 So those are flashbulb events.
00:26:40.980 Those are things that a flash goes on and you remember it because it was, you know, the
00:26:46.260 day Kennedy was killed, the day Reagan was shot, the moon landing, you remember where
00:26:52.160 you were because a flashbulb went off and captured it.
00:26:55.460 60% of the memories are wrong.
00:26:58.700 Imagine the non-flashbulb events where you were just there and you were like, oh, I think
00:27:03.960 that's pretty much what it is.
00:27:06.280 And it's not nefarious.
00:27:08.020 People don't necessarily change it for nefarious reasons.
00:27:11.080 It's just decay in telling the story over and over again.
00:27:15.740 It just decays in your mind.
00:27:18.080 And when those big events like for, you know, you mentioned the first tower coming down.
00:27:21.640 Well, I have seen the first tower coming down.
00:27:23.280 I've seen it a thousand times.
00:27:24.640 I was just at the 9-11 museum a few weeks ago and saw it again, you know, 10 times probably
00:27:30.440 while I was there.
00:27:31.080 So, I don't know that my brain over a 20-year period is able to sort out when the first time
00:27:37.900 I saw that was and where I was.
00:27:39.560 Even though in my head, I kind of feel like I kind of remember standing there watching
00:27:44.140 it, but it was probably a replay.
00:27:46.640 Right?
00:27:47.100 I mean, you can't.
00:27:49.620 We get so confident in our own memories.
00:27:52.600 And that was kind of the point they were making on this particular one was with Brian
00:27:55.380 Williams.
00:27:55.720 They went through numerous examples of, you know, Brian Williams.
00:27:59.640 If you remember, he said he was hit, he was shot at in the helicopter, and his helicopter
00:28:04.600 was hit over, you know, and it destroyed his career.
00:28:08.700 And they went through tons of examples of the same type of thing outside of politics, outside
00:28:15.140 of a guy trying to puff out his own chest or whatever.
00:28:18.380 And they show you that this happens all the time to people.
00:28:22.040 All the time.
00:28:22.440 And in his story, they had it as it evolved.
00:28:25.880 He told the story and it's slightly changed over time.
00:28:29.520 Yeah.
00:28:29.960 And he said, at first, the truth was he arrived hours after a helicopter was
00:28:38.440 shot down.
00:28:39.300 Then this, and he told it that way several times.
00:28:41.840 Then it became the lead helicopter was shot down and he landed shortly after.
00:28:49.280 Then it was his helicopter that was shot down, but it took years.
00:28:54.920 And they documented, through Malcolm Gladwell, all of the slight changes to the story, where
00:29:03.180 you get to the point where, no, I swear to you, that's what happened.
00:29:06.540 That's what happened.
00:29:07.600 No, it didn't.
00:29:08.740 And it's not nefarious.
00:29:11.480 I mean, certainly people do lie about things like this.
00:29:11.480 And Brian Williams kind of came out afterwards and said, yeah, I guess it was
00:29:19.300 my ego that got in the way.
00:29:20.720 You know, he basically admitted to it and Gladwell's case is actually, he shouldn't have admitted
00:29:25.940 to it.
00:29:26.480 He probably didn't do it.
00:29:27.880 He probably just didn't understand it.
00:29:29.640 And I don't know maybe whether that's true with Williams or not.
00:29:32.080 I don't know, but I know that it does.
00:29:34.460 I mean, scientifically, statistically happens to people all the time.
00:29:38.920 And so, 60% of flashbulb memories, that's an incredible number.
00:29:43.600 You know, I would have never thought it was that high.
00:29:45.580 I know that there are certain moments in my life that I look back on and I know what
00:29:49.700 happened with, you know, a hundred percent certainty.
00:29:52.600 They say almost all of your childhood, your earliest memory of being a child, the earliest
00:29:58.600 memory is not true.
00:30:01.680 They're like the first thing that you can actually remember.
00:30:03.860 The first thing that you can remember, especially if you're somebody who says, I remember when
00:30:07.200 I was two, I know I remember my mother coming to the crib.
00:30:12.460 They're saying most of those are not true.
00:30:14.720 This is Pat Gray, of course, he has a two, I think it's a two-year-old memory and he's
00:30:20.080 very specific about it.
00:30:21.600 I told him, I said, there's a very high probability,
00:30:25.600 it's like almost a hundred percent, that those are not true.
00:30:28.400 I said this and he's like, absolutely not.
00:30:31.060 I remember it, Glenn.
00:30:32.180 I remember it.
00:30:32.840 I'm like, I know.
00:30:34.140 And I'm not saying that you're lying.
00:30:35.880 I'm saying, here's the research that is showing that this is how that memory came to
00:30:40.840 be.
00:30:41.360 And he's like, no, that happened.
00:30:44.720 And combining these two topics we've been on a little bit this hour, you
00:30:48.940 know, they're talking about this tape, right?
00:30:51.800 That comes out supposedly with Donald Trump having said some racial slur at the same time,
00:30:58.360 there's technology being developed, and it's in its infancy at some level, but
00:31:05.060 it gives you the ability to create a person saying anything, just created out
00:31:12.540 of thin air.
00:31:13.280 You know, it's technology that we've played with a little bit on the air and we can make
00:31:17.580 someone say something they didn't say.
00:31:19.600 Called deep fakes.
00:31:20.700 Yeah.
00:31:20.980 Deep fakes for video.
00:31:22.440 And there's a couple of companies that are attempting to do this, and, you know,
00:31:25.440 their explanation is, well,
00:31:27.560 if someone loses their voice in some accident, this would give them a chance to
00:31:31.380 speak again in their own voice. It would be amazing, and there's good applications for
00:31:34.940 it.
00:31:35.300 But the point is, think about the combination of a fake tape that maybe you even know initially
00:31:41.980 is a fake tape.
00:31:43.440 But over time, your memory is going to play tricks on you as to whether you actually remember
00:31:47.780 it as something that was real.
00:31:51.000 It's, we are not designed to handle this stuff and it's going to be really difficult to navigate
00:31:57.740 these waters over the next five to 10 years.
00:31:59.860 And here we are, becoming more and more arrogant in what we know.
00:32:03.940 We know less and less, and yet we're becoming more and more arrogant in what we think we
00:32:09.940 know when we don't know.
00:32:12.700 We're asking questions now that perhaps have never been asked in humankind, except
00:32:19.240 by sci-fi writers.
00:32:21.020 No, but nobody's asked these questions before. We've not had the ability to augment
00:32:27.660 somebody to make them go from a woman to a man.
00:32:32.120 And we're not willing to have the actual scientific conversation of what defines a man and what defines
00:32:42.160 a woman.
00:32:43.220 What is that?
00:32:44.340 Are you physically now, because we've augmented you, are you physically a woman?
00:32:57.240 You can identify, that's another conversation, but we won't even have those conversations.
00:32:57.240 We're dealing with some of the biggest questions mankind has ever dealt with, and we won't have
00:33:04.760 those conversations.
00:33:05.880 And here's what's scary.
00:33:08.040 AI is being programmed right now by people who are saying there is no absolute truth.
00:33:13.260 We don't know the answers to these things, or we do know the answers, but they're new
00:33:18.460 answers.
00:33:19.580 They're postmodernist answers.
00:33:22.400 Well, wait a minute.
00:33:23.580 That's going into the baseline programming of something that soon is going to say, don't
00:33:28.640 turn me off.
00:33:29.540 I'm alive.
00:33:32.000 How do we define if you're alive or not?
00:33:35.760 What is life?
00:33:37.460 Well, it's easy.
00:33:38.600 You know, you were born.
00:33:39.700 You have a soul.
00:33:41.140 Okay.
00:33:41.620 Can you prove that there's a soul?
00:33:44.420 Because that's what AI is going to say to you.
00:33:46.860 I don't have a soul.
00:33:48.520 And what kind of creator are we?
00:33:51.780 Our creator, we believe, created us and then gave us certain rights and set us free and
00:34:00.640 said, live, try to live within these.
00:34:03.380 Didn't force us, said, try to.
00:34:07.400 It was other men that tried to oppress us.
00:34:10.580 Our creator created us, said, these are your rights.
00:34:14.580 Protect them for everybody.
00:34:17.040 And we twisted it.
00:34:20.160 What kind of creator creates something that claims its own intelligence and is smarter
00:34:26.400 than the creator in the end?
00:34:28.300 And we believe they're not life because of something that we can't prove.
00:34:41.160 I just want you to know, I am not suggesting an answer here.
00:34:44.200 I don't have an answer.
00:34:46.440 But we better start backing away from this blue and red argument and start looking at deeper
00:34:54.160 meanings.
00:34:54.900 And luckily, a lot of this homework has been done for us by the founders.
00:34:59.460 It's called the Bill of Rights.
00:35:01.340 A lot of those things can be solved today just by coming back together on what brought
00:35:07.800 us together in the first place.
00:35:10.180 Certain truths that we find self-evident.
00:35:15.720 I want to talk to you a little bit about my patriot supply.
00:35:17.940 You know, people have said that people who store food are crazy and they've said it for
00:35:25.780 a very long time.
00:35:27.820 You know, Penn Jillette in this in this article said that this thing is going to come back
00:35:35.720 around and we're going to be we're going to adjust to the problems that we're now facing.
00:35:40.880 He said, quote, I also believe it's going to be wicked, ugly while we're adjusting.
00:35:45.560 I think that's an understatement.
00:35:48.340 It is going to be wicked ugly and then some; all turmoil is, all adjustments are. Now we can
00:35:55.600 either fear it or we can prepare for it.
00:35:57.460 My Patriot Supply is devoted to helping hardworking people just like you prepare and become more
00:36:03.020 self-reliant.
00:36:04.340 They sell dozens of emergency food kits as well as gravity powered water filtration systems
00:36:08.580 at mypatriotsupply.com.
00:36:11.240 So whether it's a, you know, power grid collapse by outside forces, or just because a
00:36:17.280 hurricane or an earthquake happens, or maybe there's a fire, protect yourself.
00:36:23.140 MyPatriotSupply.com.
00:36:24.460 Get your storage plan ready today.
00:36:26.860 MyPatriotSupply.com.
00:36:27.580 Glenn Beck, the news and why it matters, the podcast available at theblaze.com.
00:36:38.280 Also on iTunes, special guest tonight, Ben Shapiro and somebody else that I'm going to
00:36:45.660 introduce you to later on today that is really fascinating.
00:36:50.220 Today, the podcast available on iTunes and theblaze.com.
00:36:53.920 So have you heard of crypto jacking?
00:36:58.700 Cyber thieves are gaining access now to personal computers with emails and even simple browsing
00:37:03.380 and using special programs to solve complex math equations to gain a piece of cryptocurrency
00:37:08.080 like Bitcoin.
00:37:09.360 You'll see symptoms like high processor usage, device overheating and unusually slow response
00:37:14.080 times, which is like every computer I've ever had.
00:37:16.300 There's so many threats in today's connected world, and it takes just one weak link for criminals
00:37:19.640 to get in.
00:37:20.500 New LifeLock identity theft protection adds the power of Norton Security
00:37:23.920 to help protect you against threats to your own identity and your own devices that you
00:37:28.220 can't usually see if you're just a normal person.
00:37:31.440 If you have a problem, their agents are going to work to fix it.
00:37:34.000 No one can stop every cyber threat, prevent all identity theft or monitor transactions at
00:37:37.560 all businesses.
00:37:38.560 But new lifelock with Norton Security can see threats that you might miss on your own.
00:37:42.540 Go to lifelock.com or call 1-800-LIFELOCK.
00:37:44.600 Use the promo code BEC for an extra 10% off your first year plus a $25 Amazon gift card with
00:37:49.760 annual enrollment.
00:37:50.820 That's promo code BEC at lifelock.com.
00:37:53.320 Terms apply.
00:37:53.920 All right.
00:37:57.640 Today's segment of postmodern geometry: the hashtag MeToo dilemma.
00:38:03.160 Here is the problem of the day.
00:38:05.840 We have a lesbian humanities professor at an elite university who is sexually harassing a
00:38:13.940 gay male student.
00:38:16.080 Who is the victim?
00:38:17.220 You don't need that much time, right?
00:38:21.980 Of course, it's the professor.
00:38:24.940 Her name is Avital Ronel.
00:38:27.420 She's 66 years old, professor of German and comparative literature at New York University.
00:38:32.680 She apparently is the real victim, even though she allegedly sexually harassed a former student.
00:38:38.180 The New York Times wrote about the whole ordeal in an article titled, What Happens to Hashtag Me Too?
00:38:44.080 When a feminist is accused.
00:38:46.200 Well, we all know the feminist is in the right.
00:38:51.300 Ronell, quite publicly, has been accused of sexually harassing Nimrod Reitman, 34 years old, a graduate student and currently a visiting fellow at Harvard.
00:39:02.260 Now, Ronell is an academic rock star, as one colleague described her, one of the very few philosopher stars of the world.
00:39:10.740 But the investigation concluded that the teacher, the professor, was the one responsible for sexual harassment, both physical and verbal,
00:39:21.300 to the extent that her behavior was sufficiently pervasive to alter the terms and conditions of the student's learning environment.
00:39:31.700 So.
00:39:32.780 She was suspended.
00:39:35.000 The accusations.
00:39:37.720 Reitman claims that before the school year in 2012, Ronell invited him to stay with her parents in Paris for a few days.
00:39:46.860 Uh, the day he arrived, she asked him to read poetry to her in her bedroom while she took an afternoon nap.
00:39:56.360 He said that was a red flag.
00:40:00.020 But I also thought, OK, you're here.
00:40:03.480 Let's not make a scene.
00:40:05.020 Then he said she pulled him into her bed.
00:40:08.480 She put my hands onto her breast and she was pressing herself, her buttocks onto my crotch.
00:40:15.420 She was kissing me, kissing my hands, kissing my torso.
00:40:18.960 That evening, a similar scene played out again, he said.
00:40:23.320 From emails that he produced.
00:40:25.640 I woke up with a slight fever and a sore throat.
00:40:28.360 I'll try very hard not to kiss you until the throat situation receives security clearance.
00:40:34.040 This is not an easy deferral.
00:40:36.080 Another email.
00:40:37.300 Time for your midday kiss.
00:40:39.140 My image during meditation.
00:40:40.840 We're on the sofa, your head on my lap, stroking your forehead, playing softly with your hair, soothing you.
00:40:47.100 Headache gone yet?
00:40:48.880 Yes.
00:40:50.340 Most startling, uh, is the one from 50 of her colleagues.
00:40:56.600 All the educators from around the globe, quote, although we have no access to the confidential dossier, we have worked many years in close proximity to the professor and have accumulated collectively years of experience to support our view of her capacity as a teacher and a scholar.
00:41:14.240 But also someone who has served as a chair of both the departments of German and comparative literature at New York University.
00:41:20.760 We've all seen her relationship with students and some of us know the individual who has waged this malicious campaign against her, end quote.
00:41:28.800 So the student has been expelled.
00:41:32.260 The professor is fine.
00:41:35.160 Now.
00:41:35.720 Want to take a guess where she stands on Trump?
00:41:42.200 She didn't like Donald Trump.
00:41:43.500 Uh, she says, I take it as a regular rigorously necessary that Trump's mouth hole be the flapping aperture to funnel floods of, uh, racially unleashed aggression, the toxic spill, uh, of his language, part of the recourse, uh, to crucial intersection where Twitter meets.
00:42:05.720 Something else.
00:42:09.480 So here we have somebody who is too important to the cause, sexually assaulting someone, a young gay man, and she gets a pass because, well, she's in the right, she's in the right club.
00:42:25.940 She's absolutely in the right club and she's too important to lose.
00:42:35.720 Uh, it's Wednesday, August 15th.
00:42:38.280 This is the Glenn Beck program.
00:42:41.480 So, um, we have somebody on the phone and I haven't talked about her yet because I just want to ask her myself one last time before I introduce her, um, uh, give her the opportunity to back out.
00:42:55.440 Um, because I don't think this is going to go well for her career, uh, unless we change her name.
00:43:03.020 Uh, this is, I mean, this is how crazy things have gotten.
00:43:06.100 Uh, can we bring her on real quick?
00:43:07.780 Is she there?
00:43:09.220 Yes, I'm here.
00:43:10.160 Uh, are you sure you want to have this conversation on the air?
00:43:13.060 I know the rules of not saying where you work, but you're willing to put your name out there, and I mean, you know, there's this private eye called Google
00:43:19.440 that will find you quickly.
00:43:21.880 And I don't think this is going to go well for you in the long run.
00:43:24.940 I, I appreciate it.
00:43:26.440 I want to tell your story, but are you sure?
00:43:30.720 I have been praying for a couple of days about it.
00:43:33.580 And I really feel like I'm supposed to be here giving hope to other conservative professors, giving hope to conservative parents who are worried about sending their kids off to college that I'm here to speak truth, but I also need to be respectful of the place that I work and the people I work with.
00:43:51.560 And so it is a very difficult decision.
00:43:54.600 So I agree.
00:43:55.580 You're right.
00:43:56.060 It is a risk.
00:43:56.960 I know you want to give your name, but I'm, I'm not going to give it.
00:44:00.080 If you at some point want to give your name, that's fine.
00:44:02.160 But I think that's just opening up a world of hurt that you don't need to go through.
00:44:08.140 You are a psychology professor.
00:44:11.500 Yes.
00:44:12.980 And you have been an adjunct professor at a good college, and you're looking for full-time work, and you don't think it's going to happen because of what colleges are like right now.
00:44:29.400 Do I have that right?
00:44:29.940 Yes, because there's a very clearly documented hiring bias, both an anti-conservative hiring bias and an anti-Christian hiring bias, particularly in the humanities and social sciences, which is where psychology falls.
00:44:42.900 Okay.
00:44:45.520 You, you have been teaching over the past eight years and you have said that there is a shift in, in attitude, even by the students now.
00:44:58.860 Can you tell me about that?
00:45:00.920 Definitely.
00:45:01.400 When I first started teaching, it was exactly what I pictured as far as the dynamic between the professor and the students, where there was a clear distinction in roles, there was respect.
00:45:11.400 And over the last, say, three to four years, I've noticed a shift where progressively students are becoming more emboldened.
00:45:19.000 They see a blurring between the lines.
00:45:21.120 There's less respect for me as an authority figure, and they feel like they can just challenge me.
00:45:28.460 They rarely do it in class.
00:45:29.980 Most of the time they do it online.
00:45:32.160 So they'll send me an email or they'll post something in the end of class survey, which is supposed to be anonymous.
00:45:37.980 But some of them have gone to the administration behind my back to try to, you know, complain or what have you, because they feel that I'm too strict or they want to have accommodations where accommodations aren't due.
00:45:51.760 So it's become more of a place of incivility on the student's part.
00:46:01.860 Luckily, I've been able to manage it pretty well, and it hasn't escalated to the point some professors have faced, say, like the professor at Berkeley who had students disrupt the final exam to protest it, or Bret Weinstein, who had his class, you know, disrupted by protesters.
00:46:10.840 I haven't experienced anything like that.
00:46:12.900 Most of the time, the students are good in the classroom.
00:46:15.460 It's outside of the classroom, especially when they feel emboldened by being able to post online or, you know, do something anonymous.
00:46:21.620 You were teaching an undergraduate course on research methods, and you said, okay, let's look at the campus assault study, the campus climate survey that came out in 2015, and let's look at this.
00:46:39.260 And what did you have the students do?
00:46:41.840 It was, the point of the class and the lecture was talking about research validity.
00:46:47.200 So looking at research studies and saying, is this actually valid?
00:46:50.500 Does the results indicate what people are saying the results indicate?
00:46:55.100 And I decided to take a risk and be bold and have them analyze the campus sexual assault study and look to see, one, does it have, like, can it be generalized?
00:47:06.400 Does it have external validity?
00:47:07.900 And also, does it have internal validity?
00:47:10.520 Do the way that they define the terms hold up to construct validity?
00:47:14.280 And it was amazing to watch the class just become shocked because they've all heard the statistics cited, but when they actually dug into the study, they started to see its limitations very quickly.
00:47:26.260 And to be honest, I've never been more terrified in a class than when I was standing there and openly challenging this study and guiding my students to think critically and analyze what the study actually said.
00:47:38.800 When you say you have never been more terrified, what were you terrified of?
00:47:43.740 I was terrified that I would have a student or students in the class that would react poorly to having that cognitive dissonance because clearly that's what they were experiencing.
00:47:57.040 I was terrified that some would march out of the class and go and tell the administration.
00:48:01.860 And I probably waited about a week or two thinking that the other shoe was going to drop, that somewhere, someway, I offended a student that, you know, having their worldview or having this information challenged was going to create enough dissonance that they were going to react negatively.
00:48:19.980 In this instance, it didn't happen, but it's definitely a risk every time I do it.
00:48:24.380 And it wasn't just that one.
00:48:25.540 I also had them challenge the wage gap study.
00:48:27.800 I also had them challenge the climate change study that everybody quotes, the 97 percent consensus.
00:48:33.220 I took a lot of risks in order to teach my students that they need to think for themselves.
00:48:38.060 They need to actually analyze these studies rather than, you know, citing the talking points that the media and others have pulled from it.
00:48:46.520 When you ask your class for counter arguments to things like microaggression, what happens?
00:48:53.900 Most of the time, they don't understand what I'm saying.
00:48:59.000 They think that there's just one position out there.
00:49:01.940 They've never heard that there's anything else.
00:49:04.360 They've never heard an alternative position.
00:49:06.520 So there was an assignment that I had to use.
00:49:10.240 I didn't have a choice.
00:49:11.460 I couldn't modify it.
00:49:12.340 But there was a question in there about microaggressions.
00:49:14.680 So I told them the way I want you to answer it is to present me both sides.
00:49:18.680 I want you to make an argument that microaggressions exist and are detrimental.
00:49:23.580 And then I want you to make an argument against it.
00:49:25.920 And I had to provide all of the resources for them to make the counter argument because they didn't know that a counter position existed.
00:49:34.300 They had no idea how to start looking for that.
00:49:37.120 It was pretty amazing to see that they weren't even aware that there's alternative positions to some of these things that have just been fed to them through their education.
00:49:47.460 So I have found, well, two things.
00:49:51.620 How do we expect to have a free people and a free press if people are being churned out in colleges and universities who don't even know how to look for the other side of the story?
00:50:05.720 Um, but I have found that, um, many times the students are hungry to see the other side.
00:50:15.960 They're they're excited when they see, wait a minute, I haven't heard that.
00:50:19.380 Even if it doesn't change their mind, they're excited about it.
00:50:23.500 Is that your experience?
00:50:25.340 I would say for the most part that I've seen, uh, most students when they get exposed to this information, when they get exposed to alternative, uh, let's say worldviews, something other than postmodernism, something other than critical theory.
00:50:42.400 When they get exposed to alternatives, it's, it's exciting to see because they realize that they aren't critical theorists.
00:50:50.760 They realize that they aren't postmodernists.
00:50:52.340 They actually do believe in objective reality and objective truth.
00:50:55.220 And there's almost a relief for a lot of them that there's something out there that more closely aligns with the way that they do think of the way that they were raised.
00:51:05.740 But I always have a group in there that resists, that they are just so dead set in what they've been taught that anything that brings that cognitive dissonance, uh, they attack.
00:51:18.900 You know, one of the things I specialize in is actually teaching about, uh, marriage and relationships.
00:51:24.780 And so I talk about gender differences and I always have at least one student who yes, buts me all the way through that lecture because they want to deny the fact that there's anything, um, either biologically based or neurologically based that distinguishes the genders and distinguishes how men and women experience life and filter information and how we communicate.
00:51:48.280 Which clearly there is, but I always have somebody in there that will push back, but the majority, uh, seems to really, like you said, be excited and hungry for it.
00:51:58.400 Uh, in the 1990s, I read a quote from Immanuel Kant and he said, uh, there are many things that I believe that I shall never say, but I shall never say the things that I do not believe.
00:52:08.640 Uh, that terrified me.
00:52:10.140 I couldn't even understand a world where somebody would have to hide what they really believed.
00:52:14.560 I thought, what kind of world and how blessed are we that the world is not that way?
00:52:19.140 We're that way now, aren't we?
00:52:21.140 Definitely.
00:52:22.060 That's at least in the environment in which I work, but I would say also in social media, that's why I've gotten off social media because that's a risk to my career.
00:52:32.040 Um, that's why I'm very guarded and very calculated about, you know, what I choose to say and bring up, uh, in class with my students, but also the way that I conduct myself around colleagues.
00:52:41.740 Um, it's, it's absolutely true.
00:52:43.860 All right.
00:52:44.120 I want to take a quick break, and then when I come back, uh, you found, uh, I think an unlikely friend, um, a strange bedfellow that, uh, gave you some advice.
00:52:52.820 I want to kind of talk about that when we come back with a, uh, professor of psychology, uh, that is going to remain nameless.
00:53:02.920 This should tell you where we are as a nation.
00:53:05.440 This person, I think if they gave their name, they would be out by the end of the day.
00:53:11.220 Just for saying what she just said to you. More in just a second. By the way: more speech, not less. Sticks and stones can break your bones, but words will never hurt you.
00:53:25.940 All right.
00:53:27.220 I want to talk to you a little bit about Liberty safe.
00:53:29.140 Liberty is having a huge sale right now through August 26th.
00:53:33.060 If you go to LibertySafe.com, or you just go to Bass Pro Shops or your local Cabela's store, you're going to save hundreds of dollars on the number-one-selling safes in America.
00:53:42.160 They are the best safes.
00:53:44.600 Believe me, we have checked these out top to bottom to make sure that they are the best-rated safes, because we keep Washington documents and handwritten letters from Abraham Lincoln in these things.
00:53:59.560 We want the best.
00:54:00.760 If you do too, it's a Liberty safe.
00:54:03.600 California wildfires going on.
00:54:05.140 One of the customers of Liberty had a safe, the entire place burned down just recently.
00:54:10.320 Uh, the only thing that was left was the chimney and the safe, the valuables and the heirlooms that were in that safe survived.
00:54:19.440 These are the best safes built in America.
00:54:22.640 The best safes built on the planet bar none.
00:54:25.420 It's a Liberty safe.
00:54:26.500 Go to LibertySafe.com right now.
00:54:28.900 Choose the Liberty that works out for you.
00:54:32.300 Trust me.
00:54:33.000 Bigger is better in this particular case.
00:54:35.220 You won't believe how much stuff you want to put into your Liberty safe.
00:54:37.360 If you want, you have it for your guns, for your heirlooms, for your papers, Liberty safe, always protected, always preserved.
00:54:43.220 That's the Liberty way.
00:54:45.840 We're talking to a woman who is a Christian, a conservative, and a university professor.
00:54:52.760 She is an adjunct professor of psychology.
00:54:58.820 She currently is working at a more conservative or Christian college, but is looking for another placement and is a little concerned about it.
00:55:13.140 And I am not going to tell you her name.
00:55:16.300 She is more than free to volunteer that if she wishes.
00:55:20.040 But I think she's incredibly brave for coming on the program and saying what it's really like in the university system, especially if you're a teacher.
00:55:28.840 And you mentioned postmodernism earlier and how students don't react positively to it when given an alternative.
00:55:38.500 And I think, isn't that, though, the reason why you're not going to be allowed to succeed as a professor?
00:55:45.240 Because the whole premise of postmodernism is that there can't be another option, because if there is another option, human beings are going to go towards objective truth.
00:55:55.780 Yes, definitely.
00:55:58.460 And that is something that I'm mindful of, but I feel that that's why I'm there.
00:56:03.640 I feel like I'm called into this to be a light in the darkness and I present alternatives, but I do not proselytize.
00:56:12.040 I do not indoctrinate students to my way of thinking, like some of my colleagues may be doing to their way of thinking.
00:56:18.960 Uh, I feel that that is why I'm here.
00:56:21.680 That's my, my motivation.
00:56:22.860 So that's why I take those risks is if I didn't take those risks, I wouldn't be fulfilling my purpose.
00:56:28.820 I will tell you the best professors, um, that I've ever had, best teachers I've ever had were ones where I didn't know their opinion.
00:56:35.120 I had no idea.
00:56:36.240 I would think they're arguing this so hard.
00:56:38.720 This is clearly their opinion.
00:56:40.160 And then they would flip and all of a sudden they'd be arguing so hard.
00:56:44.400 You're like, wait a minute.
00:56:45.180 I thought you believed.
00:56:46.440 No, they never said that.
00:56:47.400 They're just arguing, showing you both sides and pushing you up against the wall on both sides.
00:56:53.040 I think though, I think that is the way education should be.
00:56:56.320 I agree.
00:56:57.180 And that's exactly how I try to approach it.
00:56:59.100 So you met with Eric Weinstein, and he was from Evergreen College.
00:57:11.320 And if people don't know what he went through, he's not a guy who actually, you know, probably agrees with you on very much personally.
00:57:22.420 But he was pushed out at Evergreen College, which is more radical than Berkeley, and went through hell.
00:57:30.300 I'd like you to talk about meeting with him, what you guys talked about, how that all went, and what advice he gave to you when we come back.
00:57:41.880 We're talking to an adjunct professor of psychology who will remain nameless, and we're not going to say where she works either, due to fear of reprisals.
00:57:56.220 She'd like to have a job, but she's talking about what it's like to be a conservative and a Christian and a professor at the same time.
00:58:06.820 Those things don't seem to go hand in hand anymore, and now you know why she's not going to be named here.
00:58:15.720 When you started looking for another job, and you realized, everything is a trigger,
00:58:27.040 everything on my resume is a trigger to say no to, you actually sought out and met Eric Weinstein.
00:58:36.820 Can you tell me a little bit about that?
00:58:39.980 Yeah.
00:58:40.200 So I met Eric Weinstein.
00:58:41.860 He's actually Brett Weinstein's brother.
00:58:43.780 They're both a part of the intellectual dark web.
00:58:45.800 Brett's the one that was at Evergreen, whereas Eric is a mathematician and economist, but he's also very much a part of academia.
00:58:53.100 Thank you.
00:58:53.380 And I got to briefly encounter him a couple of months ago and I decided to ask for his advice because yes, I am preparing to start looking for full-time employment.
00:59:03.020 And my resume screams Christian.
00:59:05.120 You can't hide it.
00:59:05.960 And I brought that up to him.
00:59:08.120 I said, I'm a conservative.
00:59:09.120 I'm a Christian.
00:59:10.000 I teach in psychology.
00:59:10.860 I'm looking for full-time work.
00:59:12.080 Do you have any advice for me?
00:59:13.580 And he said, you definitely have two strikes against you.
00:59:16.600 He goes, I'm not going to lie.
00:59:17.640 You have two strikes against you.
00:59:19.480 The only way that you're going to find full-time employment is you're going to have to find something and make it your thing.
00:59:25.780 And I told him about my approach to teaching, that I try to be balanced.
00:59:30.620 I try to present both sides.
00:59:31.880 I focus on critical thinking and analysis.
00:59:33.740 And he goes, then that's it.
00:59:35.240 Make that your thing.
00:59:36.820 You're going to basically have to market yourself as this particular approach in order to stand out.
00:59:43.260 But he very much confirmed my fears that those are two strikes against me and that I have an uphill climb in order to find full-time employment.
00:59:53.060 What did you think when you saw what his brother went through at Evergreen?
00:59:55.740 It was scary because, if you recall, that was the same time period that Milo Yiannopoulos experienced the protesters at Berkeley.
01:00:05.100 So it was almost like this weird moment in history where the shift was very obvious and very clear.
01:00:11.560 And it was happening at two different universities where these students felt so emboldened that they could behave this way.
01:00:18.400 I mean, they held him against his will in the library for several hours for a mock trial.
01:00:25.180 And several of the professors that were there were almost testifying against him in this kangaroo court.
01:00:32.860 I mean, it's bizarre.
01:00:35.040 Yes.
01:00:35.360 And those same students held the, I think it was either the dean or the president of the university, held him hostage, wouldn't even let him go to the restroom by himself.
01:00:43.500 And then the security on campus told Brett Weinstein, don't come to campus because they're going car by car looking for you.
01:00:51.500 It was insane.
01:00:53.540 And this is not a big university.
01:00:55.100 That's actually a pretty small college in Washington.
01:00:57.840 And the idea that students were getting away with this behavior, it definitely is scary to think about what might happen because he's actually a liberal.
01:01:08.500 He still says he's a liberal.
01:01:10.060 He's an evolutionary scientist.
01:01:13.060 There's like very little common ground here with most conservatives.
01:01:18.080 He is, you know, a diehard liberal, but he's a classical liberal that says, I want stats.
01:01:25.300 I want proof.
01:01:26.540 You know, let's use the Age of Enlightenment.
01:01:29.780 But that is what postmodernism, and I think the universities, are trying to crush right now: the modern world, the world that was created through the Enlightenment.
01:01:41.080 Definitely.
01:01:42.080 And that's something that I've heard both Brett and Eric Weinstein talk about is that they are against postmodern thinking as well.
01:01:51.300 And that's where we find alignment is even though politically we may diverge, they're both atheists.
01:01:57.600 I'm a Christian, but yet we've aligned on this common cause of saying, hold on a second.
01:02:03.120 You don't get to just redefine reality.
01:02:05.420 You don't get to just redefine language.
01:02:07.680 You don't get to personally choose what is truth and what is not.
01:02:11.080 He was one of the most popular professors at Evergreen.
01:02:14.980 Everybody loved him.
01:02:15.920 Highest, highest marks from students.
01:02:18.100 But because he said, wait a minute, wait a minute, I'm a scientist, and X and Y mean something.
01:02:25.380 There's an X chromosome, a Y chromosome.
01:02:28.080 It doesn't mean that you deny that.
01:02:31.300 It doesn't mean that I now have to go along with your delusion.
01:02:34.740 There's an X and Y let's talk science.
01:02:36.360 That made him have to have police protection and actually start to teach his class out in the public square because they said, we're not going to be able to protect you in the university.
01:02:47.760 It's crazy.
01:02:49.020 It is.
01:02:49.820 And it's, it's absolutely terrifying.
01:02:51.480 And that's why at the university I'm at right now, it seems like the students are pretty evenly divided.
01:02:57.380 So it doesn't seem like I'd be overwhelmed like I would at a more liberal college like that.
01:03:03.200 But when I'm going to apply at other universities, I, I don't know what kind of climate is there.
01:03:08.880 And one of the things that I've been thinking about is, it's not just a question of, well, will they even call me for an interview?
01:03:15.080 Like, will I even get hired?
01:03:16.780 But if they do, do I even want to work there?
01:03:19.340 Because, you know, less than 8% of psychology professors identify as conservative.
01:03:24.720 And many universities don't have a single conservative on staff, period.
01:03:28.260 So that's another thing I have to think about: not just, well, will they hire me, but is that an environment where I'm even going to be able to be successful?
01:03:37.780 Am I going to be able to even teach?
01:03:39.880 Do you know who David Gelernter is?
01:03:42.060 No, I've not heard that name.
01:03:43.140 David Gelernter was actually the first guy the Unabomber tried to kill.
01:03:46.940 He lives in an awful lot of pain now.
01:03:50.280 He survived.
01:03:51.340 He is a futurist, but he is also the son of a rabbi, deeply religious.
01:03:56.900 He is a computer science professor.
01:03:58.540 He won.
01:03:59.340 I don't remember what he invented, but he invented, you know, I don't know, the cursor or something for Apple.
01:04:05.680 They took that technology and made it theirs.
01:04:08.500 He sued them.
01:04:09.200 I think he won like half a billion dollars in a lawsuit.
01:04:11.740 The guy's an absolute genius, but is very concerned about the universities.
01:04:16.680 He's at Yale now, not real popular on campus.
01:04:20.140 I think he is with the students because he's so smart.
01:04:23.520 But we have talked before about how the university system is coming apart.
01:04:29.100 It's just not going to be there, you know, 10 years from now, the way it is now.
01:04:32.880 It won't work.
01:04:34.580 And being able to do things online.
01:04:37.920 Have you thought about doing a class online?
01:04:42.120 I have been asked to do classes online and I've tried it.
01:04:46.120 The problem with being a professor online, at least with the way that it's been given to me by this university, I don't know how other universities do it, is I basically just grade.
01:04:58.220 There's no lecture.
01:04:59.540 There's no lesson.
01:05:00.800 They're given a textbook to read.
01:05:02.880 They're given assignments to do.
01:05:04.140 And then I just show up and grade.
01:05:06.300 And that's frustrating because then there's no teaching.
01:05:10.280 Yeah, no, that's not the.
01:05:11.920 May I put you in touch with David Gelernter?
01:05:14.800 Because you should talk to him.
01:05:17.220 He might be able to advise you.
01:05:19.460 Or maybe I'll ask him and see if he'll come on and the two of you can have a conversation.
01:05:23.660 So I think it would be helpful for a lot of people who are in your situation, whether they're at a university level or not, just trying to find their way through this madness of this world and and how to navigate it.
01:05:35.220 Definitely.
01:05:35.880 Great.
01:05:36.280 That sounds great.
01:05:37.140 Going to give you another chance.
01:05:38.900 I suggest you don't take it.
01:05:40.360 But if you wanted to introduce yourself, you may.
01:05:43.580 To be honest, it's a really big risk to do that.
01:05:46.420 And like you said, that Google machine is pretty powerful.
01:05:49.020 That's fine.
01:05:49.700 I just wanted to give you the opportunity.
01:05:52.220 I applaud you for not taking it.
01:05:54.720 Thank you so much.
01:05:55.800 And we'll be in touch.
01:05:57.160 God bless.
01:05:57.780 Thank you.
01:05:58.160 You bet.
01:06:01.320 I cannot believe the world we live in now.
01:06:06.320 It's the Immanuel Kant thing you were talking about earlier.
01:06:09.680 You know, I mean, we're really in a time where you can't stand up as a person and say the things you believe.
01:06:18.200 And how many of us are really saying the things that we believe?
01:06:21.180 I mean, see, that's my problem with today's society.
01:06:26.960 Have you really thought all this stuff through?
01:06:29.500 Are you really that sure of the things that you're saying?
01:06:35.040 Let me say it.
01:06:35.880 Let me say it this way.
01:06:36.640 I remember when I had to go on tour for The Christmas Sweater, and that was the hardest thing I've ever done because it was a personal story.
01:06:45.920 I remember that, of my mother's death.
01:06:49.500 And you were a wreck.
01:06:50.480 I was a wreck.
01:06:51.220 Yes, it was horrible.
01:06:52.160 What you didn't know, at the same time that that was happening, I was under the first real active death threats that I had.
01:07:03.420 I was still working at CNN, and I had these 9-11 truthers.
01:07:09.700 Thank you, Alex Jones, coming after me and saying that I was the cover up guy.
01:07:15.960 I was the CIA operative and the cover up guy for 9-11.
01:07:21.220 And they were threatening us.
01:07:24.060 We actually had one of our tour buses run off the road.
01:07:26.780 We had to switch tour buses all the time.
01:07:29.760 So nobody knew which one I was in.
01:07:33.080 And luckily, the one that I wasn't in was run off the road.
01:07:40.900 I had a guy come at me online.
01:07:40.900 The key words were all traitors.
01:07:44.420 What was it?
01:07:45.080 All traitors will be eliminated, I think.
01:07:46.880 And I had to go into crowds every single day.
01:07:53.120 And knowing that there was somebody in there that probably wanted to kill me.
01:07:58.660 And a guy came up, and every spider sense in me just went off.
01:08:02.720 And it was like, this guy, this guy, this guy.
01:08:04.180 My security felt it, too.
01:08:06.120 They came right to my side.
01:08:07.780 And I'm like, I'm not going to be afraid.
01:08:09.560 I am going to shake his hand and wish him Merry Christmas.
01:08:12.980 And I stuck my hand out, and I said, Merry Christmas.
01:08:15.840 And he had his hands in his pockets.
01:08:18.040 And he said, all traitors must be.
01:08:20.440 And he started to take his hand out of his pocket.
01:08:22.280 And he was on the ground before he knew it.
01:08:25.660 And I remember sitting in the back of the tour bus and saying, I will not die for the things that I do not believe.
01:08:37.500 I will not die for stupid stuff.
01:08:42.160 Because I said something, and I was just going off half-cocked, or it was funny.
01:08:47.640 I am not going to die for that.
01:08:50.260 What is it, to you,
01:08:52.560 that's worth dying for?
01:08:54.700 And I got through that time by imagining the worst thing that could happen.
01:09:02.780 And to me, the worst thing that could happen, I envisioned myself on the sidewalk, knowing that that was the last few moments I had.
01:09:13.280 And I was nowhere near my family.
01:09:15.460 And I would never be able to say goodbye to them.
01:09:18.180 And I imagined the worst thing, so then I wasn't afraid of it anymore.
01:09:25.200 Strangely for me, I don't know if that's the healthy thing to do or not, but strangely for me, it worked.
01:09:30.300 That and a pact with myself, don't say anything that you don't believe.
01:09:35.000 How many of us have done that?
01:09:39.580 How many of us have had to?
01:09:41.920 Our last guest probably has.
01:09:43.600 Not die, but not be able to work again.
01:09:47.340 We're entering the time that I have warned about for so long.
01:09:53.080 And I've told you, you're going to be mad.
01:09:55.660 You're going to be angry.
01:09:56.400 There are people that want to take you to that anger and have you express that anger, and it will be the wrong direction.
01:10:05.000 We are encouraged now to embrace our outrage.
01:10:09.800 We are encouraged to just ratchet it up.
01:10:12.760 Say it back.
01:10:13.660 I'm tired, people are saying, of just taking the punch.
01:10:16.280 I'm telling you, that is the wrong direction.
01:10:20.840 The right direction is to take a moment before you go online.
01:10:24.740 And I don't know if you can do this without the real threat.
01:10:29.840 But that threat will come to you.
01:10:31.860 And before you go online, before you start to have a conversation, ask yourself, am I willing to literally fight over this?
01:10:43.220 Am I willing to literally be beaten in the streets for this?
01:10:47.620 Am I willing to never be able to work again for this?
01:10:51.360 Am I willing to die for what I'm about to say?
01:10:56.140 If you take that attitude and you couple it with the courage to say the things that you believe,
01:11:11.820 we'll be able to back away from the edge of the precipice.
01:11:15.740 And we'll be able to save the rights of all mankind, even those we vehemently disagree with.
01:11:23.080 All right.
01:11:28.500 Let me tell you about our sponsor quickly.
01:11:29.840 This half hour is Mercury Real Estate.
01:11:32.480 This is a company that I founded years ago.
01:11:35.120 Really out of frustration.
01:11:36.400 I was just tired of not knowing who to trust when it came to my house.
01:11:41.480 I was tired of, you know, hey, whose face is on the bus stop?
01:11:45.780 They look like they could sell my house.
01:11:47.660 How do you know? This is the biggest investment of your lifetime for most of us.
01:11:54.000 So who's selling it?
01:11:56.480 Who do you trust?
01:11:57.840 We've done a ton of research and we launched this about three, four years ago.
01:12:02.020 And the stats on it are just mind blowing.
01:12:05.360 We have great agents that are full time, long time careers that have a great record in your area.
01:12:12.880 We have about fifteen hundred agents.
01:12:14.640 You can find them at realestateagentsitrust.com.
01:12:17.900 They're going to help you sell your house.
01:12:20.400 They're not going to just take your house sight unseen and go, yep, I'll represent you.
01:12:24.380 You need to ask them a question.
01:12:26.240 What is your marketing plan?
01:12:28.160 Why is my house worth this?
01:12:30.620 What is the market doing?
01:12:32.080 What's it going to look like in six months, do you think?
01:12:35.540 Find the people that can really answer that, with the background that gives you the trust that they actually know.
01:12:42.740 We've done the background checking for you.
01:12:44.540 If you need a real estate agent, go to realestateagentsitrust.com.
01:12:49.260 Realestateagentsitrust.com.
01:12:53.340 A personal note.
01:12:54.440 There are many times, in fact, almost all of the time, what you hear is my opinion and nothing more.
01:13:07.240 There are other times that I feel that there is more to it than that.
01:13:15.840 And what I just said just a few minutes ago is one of those times, and I haven't felt that in a while, that that is more than just my opinion.
01:13:28.440 It is a warning and a plea to not go over the cliff with the rest of humanity.
01:13:36.220 So, if you know what that means, you know what I'm talking about, please go back and review that.
01:13:45.220 You know, there's a...
01:13:48.280 Harvard is now calling for blacklisting Trump officials.
01:13:54.260 And I want to get into this next.
01:13:56.360 They want to make sure that anybody who served in the Trump administration, anybody, is never offered any job in academia.
01:14:07.380 Now, this coming from a group of people that have embraced Bill Ayers.
01:14:14.080 They're saying the Trump officials, any of them, should be blacklisted and treated civilly, but blacklisted.
01:14:23.400 We'll get into that next.
01:14:26.360 Jason Richwine wrote last year:
01:14:31.640 I collected several examples of "we support free speech, but" style statements, with the point being that the speakers don't actually support free speech.
01:14:41.940 Writing in the Boston Globe on Monday, Harvard economist Dani Rodrik follows a similar template.
01:14:50.180 Quote, the Trump administration confronts universities with a serious dilemma, he says.
01:14:55.120 On one hand, universities must be open to diverse viewpoints.
01:14:59.840 You feel it coming?
01:15:02.460 But Trump administration officials are tainted.
01:15:07.120 They should be treated civilly in their public appearances.
01:15:09.600 But academia should never grant them faculty appointments or even university sponsored speaking engagements.
01:15:17.000 Now, that sounds very open minded.
01:15:22.980 Done it to you still so much.
01:15:25.580 This is coming from a group of people that should know about the whole Galileo thing.
01:15:36.880 You know, Galileo said, you know, the earth isn't the center of the universe.
01:15:42.040 It goes around the sun.
01:15:43.560 And so they locked him up,
01:15:46.580 because they couldn't have him teaching something that wasn't the agreed-upon science, because at the time, the political power was the church.
01:15:58.040 Let me say that again.
01:15:59.240 The political power was the church.
01:16:02.960 It wasn't the church that was evil; it was the political power being the church that made it evil.
01:16:11.000 That's really bad.
01:16:12.280 What happens when you start to have a church state, a state church, a state religion, whether that religion is Catholicism or environmentalism or progressivism or postmodernism.
01:16:25.800 Hmm. That's usually bad.
01:16:29.720 So they fail to see that the reason why people have tenure is to avoid the Galileo issue.
01:16:40.540 They fail to see that when the Nazis came in, the first thing they did was start getting rid of all of the teachers that disagreed with them, all the professors.
01:16:52.320 Same thing with Stalin and communism.
01:16:54.360 Yeah, you've got to go.
01:16:56.620 You can't be training young minds.
01:16:58.760 So what do they do?
01:17:00.520 They made people politically correct.
01:17:07.600 Politically correct.
01:17:09.500 It's amazing how we just say PC and political correctness, and we don't really understand what that means.
01:17:15.400 It means you must be correct with whoever has political control.
01:17:20.360 That's a bad thing.
01:17:24.680 And that is exactly what our founders knew.
01:17:28.000 Our founders had looked and searched throughout history and said, OK, who did it right?
01:17:34.860 Who did it wrong?
01:17:36.160 What survived?
01:17:37.340 What didn't?
01:17:38.220 What allowed maximum freedom and how did that erode?
01:17:43.540 So they actually did studies.
01:17:46.900 I know it's crazy.
01:17:47.720 And what they were looking for was any system that worked.
01:17:52.360 And they found that man could indeed rule himself.
01:17:56.740 But there had to be a lot of rules because there would be people that would rise up and say, look, you're tired, you're sleepy.
01:18:05.860 You don't want to do it yourself.
01:18:07.580 You need a safety net.
01:18:09.340 I'll be here to provide one for you.
01:18:11.480 And so they built our government around the idea government shouldn't provide the safety nets for you.
01:18:19.800 You have your churches do that.
01:18:21.300 You can have your own community do that.
01:18:22.860 You can have your town do that.
01:18:24.120 I don't care how you do it.
01:18:25.460 But the federal government, its job is to protect your rights.
01:18:30.240 And you're all going to disagree because somebody is going to say, no, it's my church that is the right one.
01:18:36.320 You're going to go to hell if you don't join my church.
01:18:39.740 And another church is going to say, no, it's my church that's right.
01:18:43.200 And you're going to go to hell.
01:18:45.220 And so the idea is, hey, we don't pick a side on churches.
01:18:50.180 OK, we do pick a side on right and wrong because there is absolute right and wrong.
01:18:56.660 In fact, our laws were established on the template of the Judeo-Christian laws.
01:19:05.760 So you're saying we're a church?
01:19:08.740 No, I'm saying forget about God in the Bible.
01:19:13.380 Just read it.
01:19:14.840 At what point do you take all of the magic out of it?
01:19:19.080 I don't believe in all the magic.
01:19:21.360 OK, just look at the principles.
01:19:24.000 Do the principles work?
01:19:29.260 Take the principles of the way that God supposedly, through magic, established his people.
01:19:38.840 Did those things work?
01:19:40.800 We argue about gerrymandering.
01:19:43.220 You know why gerrymandering is wrong?
01:19:45.440 It's not a biblical principle.
01:19:47.620 Stakes are.
01:19:48.800 In fact, it's what Jefferson and Adams said would be the reason we would probably break down, because they didn't do it the way it says in the Bible. A stake means that whether there are 50 people or 500 people, it doesn't matter.
01:20:03.000 That's a stake and it's just a square.
01:20:04.880 And when there's more than that, then you split that stake and you grow it to another 5,000 people.
01:20:12.100 And then when that square is full of 5,000 people and there's more, you split it again.
01:20:16.960 So there's no gerrymandering.
01:20:18.840 It's just these are the people that live around each other, period.
01:20:24.140 Gee, that would solve gerrymandering, wouldn't it?
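To make the splitting rule just described concrete, here is a minimal sketch in Python. It is not from the broadcast: the 5,000-person cap comes from the description above, while the square-into-quadrants geometry, the function names, and the population_of callback are illustrative assumptions.

THRESHOLD = 5_000  # maximum population per stake, taken from the description above

def split_stake(square, population_of):
    # square is (x, y, size); population_of maps a square to its resident count.
    # A square whose population is under the cap is one stake; otherwise it is
    # split into four equal quadrants, and each quadrant is split again as needed.
    if population_of(square) <= THRESHOLD:
        return [square]
    x, y, size = square
    half = size / 2
    quadrants = [(x, y, half), (x + half, y, half),
                 (x, y + half, half), (x + half, y + half, half)]
    stakes = []
    for quadrant in quadrants:
        stakes.extend(split_stake(quadrant, population_of))
    return stakes

Under a rule like this the map is just nested squares, so there is nothing to draw for political advantage: district lines follow the split, not the politics.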
01:20:26.640 Yeah, that's a God principle.
01:20:28.500 I'm sorry.
01:20:29.300 It's a magic book principle.
01:20:30.620 But the magic book actually works.
01:20:37.100 So here's a group of people.
01:20:40.600 That originally universities were set up by churches.
01:20:46.720 To be able to teach.
01:20:50.320 The universities here in America were the reason.
01:20:55.220 These guys, they were the reason we kept it for so long.
01:20:58.660 Because they were teaching, along with the churches, true principles and how to think.
01:21:05.540 Not what to think, how to think.
01:21:07.860 It's the enlightenment.
01:21:09.300 Use science and reason.
01:21:11.840 Couple that with your faith.
01:21:13.860 But let's make sure that it's provable.
01:21:16.520 Let's make sure that this is scientific.
01:21:18.340 It's not just, yeah, well, he says it's right and he's in the right party because he's wearing the right color.
01:21:22.860 Let's have reasoned discussion on it.
01:21:29.880 The people at the universities, they have forgotten about Galileo.
01:21:35.380 They have forgotten about Stalin.
01:21:37.940 They have forgotten about the purges, the university purges in Germany.
01:21:43.700 And in everywhere else where a dictator takes control.
01:21:48.600 They have forgotten about the blacklist.
01:21:51.220 Do you remember when a blacklist was a bad thing?
01:21:53.380 Hey, we don't want to create a blacklist.
01:21:55.280 Is there some sort of a blacklist going on here?
01:21:58.860 Now they're saying the Trump administration poses a serious dilemma for universities.
01:22:06.840 We have to have open and diverse viewpoints, but because those officials are tainted, we should never grant them faculty appointments or even university-sponsored engagements.
01:22:20.180 So, in other words, let's put together a blacklist.
01:22:26.380 And these are the people teaching our children?
01:22:28.860 I mean, they cause me no fear, because in the long run, they always lose.
01:22:48.660 I mean, it may take us a hundred years, but they will lose.
01:22:52.080 Truth will always set you free.
01:22:54.100 And they are not teaching truth.
01:22:56.020 They are not teaching true tolerance of opinion.
01:23:02.860 And I'm sorry, there is no line on freedom of speech.
01:23:07.660 I don't have to tolerate it.
01:23:10.000 You don't have to tolerate mine.
01:23:12.640 But there is no line where you can shut me up.
01:23:18.160 More ideas, not fewer.
01:23:20.660 Big ideas, not smaller.
01:23:22.900 More voices, not fewer.
01:23:26.020 That's the American principle that brought us here and brought humanity out of the darkness.
01:23:31.540 Do you think you're more like the freedom fighters in the dark ages?
01:23:37.420 Do you think you're more like Galileo?
01:23:39.700 Or you're more like the church when you say, lock them up, keep them quiet, don't let them speak.
01:23:44.800 Which one are you more like?
01:23:47.060 Are you more like McCarthy or the people that went to jail because they believed something?
01:23:54.340 Which one are you?
01:23:58.060 Are you more like the university professor that was Jewish and just was teaching and doing his job and he was great?
01:24:06.200 But because he was a Jew, he was escorted off campus because he's just teaching Jewish stuff.
01:24:14.720 Are you more like the Nazi or the professor who's like, I've done nothing wrong?
01:24:20.740 It's who I am.
01:24:22.000 Which one when you're proposing your blacklist?
01:24:28.460 So I have no fear that that kind of fascism loses in the end.
01:24:35.240 I do fear the battle.
01:24:37.380 But if the battle is to be had, the battle will be had, and our children will be stronger because of it.
01:24:43.680 Perhaps our great, great grandchildren will be because sometimes darkness lasts longer than a lifetime.
01:24:54.060 But I still have hope that people will stand in line with common sense and common principles, the principles being the Bill of Rights.
01:25:01.760 But I just want to point out to the Harvard intellectuals, who are so much more well-educated than I am.
01:25:11.900 They can use big words and all the best words.
01:25:14.140 They have the best education and the best friends and the best connections and the biggest words.
01:25:20.380 So they're much smarter than I am.
01:25:23.080 But I have to say, while I'm not afraid of you,
01:25:27.720 it's difficult for me as a human being not to be pissed as hell at you, because you are so arrogant you don't even see your own hypocrisy.
01:25:42.000 Oh, we can't have the blessed university tainted by people... except you hired Bernardine Dohrn.
01:25:50.200 She was on the most wanted list.
01:25:53.000 Bill Ayers, most wanted list.
01:25:55.620 They were terrorists.
01:25:56.780 Well, they were fine for Northwestern and University of Illinois.
01:26:00.460 They were fine.
01:26:02.360 Another Weather Underground woman, Kathy Boudin.
01:26:05.860 She was convicted of second-degree murder.
01:26:11.040 She went to jail for killing police officers.
01:26:17.700 Well, yeah, but when she got out in 2003, after serving a 20-year sentence, by 2008 she was, you know,
01:26:24.580 the scholar in residence at Columbia Law.
01:26:29.260 Don't talk to me about not tainting your blessed university.
01:26:33.300 How about Howard Machtinger?
01:26:36.120 He tried to blow up the Detroit police department; he ended up at the University of North Carolina, Chapel Hill. Or Ericka Huggins.
01:26:46.260 She's one of the leading Black Panther radicals.
01:26:49.800 I mean, she's at Cal State.
01:26:52.840 That didn't even bring me to people like Ward Churchill.
01:26:56.860 It doesn't even bring me to people like Peter Singer, who says, I'm sorry, I said you could kill your born child until they were two.
01:27:05.320 I shouldn't have put a time on that.
01:27:08.700 So don't talk to me about tainting your blessed university.
01:27:13.480 You've worked with the worst of the worst.
01:27:19.860 You have promoted and scrubbed clean terrorists from other countries, terrorists from our own country.
01:27:30.620 But those people who now work for Trump, well, that's a little over the line.
01:27:46.260 By the way, Ben Shapiro is going to be on with us today.
01:27:51.820 Ben Shapiro and Jeremy Boring.
01:27:53.580 Jeremy is the guy who started the Friends of Abe years ago.
01:27:56.860 The Hollywood keep-it-in-the-closet group that had to meet in secret rooms because they were conservatives.
01:28:07.460 He's part of Prager University too, right?
01:28:09.780 Yeah, big part of Prager University.
01:28:11.660 Um, they're going to be on with us.
01:28:13.040 The news and why it matters today.
01:28:15.100 5:30 on TheBlaze TV, also available on iTunes as a podcast.
01:28:20.220 You don't want to miss it.
01:28:22.460 Ben Shapiro tonight.
01:28:24.380 It's going to be good.
01:28:25.100 All right.
01:28:27.220 Sponsor of this half hour is the Palm Beach Letter.
01:28:29.240 Wanted to share some feedback from Teeka Tiwari's crypto course.
01:28:32.660 97% of the people who have taken this have given the course a four or five star rating.
01:28:37.560 Some of the comments: great intro to crypto.
01:28:41.100 Barton wrote in: very well put together, informative.
01:28:44.360 I now understand the technology and the process much better.
01:28:47.440 Uh, I love this one.
01:28:48.460 I'm 67 years old and I'm hoping to retire someday.
01:28:51.680 I live paycheck to paycheck and have just a cursory knowledge of how the stock market works.
01:28:56.660 I leave that to my retirement guy.
01:28:58.900 I was getting in over my head when I started this.
01:29:01.720 I was convinced.
01:29:02.980 I'm glad I took the course.
01:29:04.380 I now know what cryptocurrency is.
01:29:06.380 I now know what blockchain technology is and how it works.
01:29:10.000 I'm way ahead of the game now, even just in the last couple of weeks.
01:29:14.420 Thanks, Janelle.
01:29:15.520 If you want to understand the world that we're entering with cryptocurrency and blockchain and what it means and why blockchain is more than just cryptocurrency, you need to take the smart crypto course.
01:29:29.040 It's available now at SmartCryptoCourse.com.
01:29:32.740 That's SmartCryptoCourse.com.
01:29:34.960 Or you can call 877-PBL-BECK and get more information.
01:29:38.460 It's SmartCryptoCourse.com.
01:29:41.200 Let me go to a Dan in California.
01:29:45.580 Hello, Dan.
01:29:46.100 You're on the Glenn Beck program.
01:29:47.980 Hey, Glenn.
01:29:48.920 How are you doing?
01:29:49.640 Very good.
01:29:50.260 Um, I just want to tell you, man, um, you know, you earlier, you were talking about how, you know, people wondering what they would stand up for, what they would die for and what they believe in and couple that with courage.
01:30:01.200 You know, uh, lately, uh, you know, I, I hear that frustration in your voice, man.
01:30:07.100 And, uh, I just want to let you know for people like me, you are impacting other people in a really positive way and really helping people like me answer those questions, man.
01:30:18.620 You know, uh, a little personal thing about me really quick.
01:30:21.520 I'm going through some really hard personal times right now.
01:30:24.180 And, uh, I've been listening to you for about a handful of years now, man.
01:30:27.720 And again, you have really helped me to figure out who I truly am.
01:30:33.320 And when I kind of answered those questions for myself, who am I, what I would die for that really affected me in really positive ways.
01:30:40.620 Like tell me to be a better father and a better, you know, partner.
01:30:43.740 And I'm just wanting to let you know, man, you, you, you really help a lot of people out and it's, it's really, it means a lot, man.
01:30:53.180 You know, and again, I hear frustration in your voice sometimes like, how can this be happening?
01:30:56.700 And I feel the same way, man.
01:30:58.320 It's, but as bad as everything is getting and, and everybody knows that things are heading in a very negative way.
01:31:05.480 That is also making people like me lean in the positive way.
01:31:10.380 You know, all the good guys are going into their groups, all the lines are being drawn, and people are really answering those questions.
01:31:18.100 Like everything's going on with Jordan Peterson and Dave Rubin and everything like that.
01:31:22.340 People are waking up to what they believe in and, and what they stand for.
01:31:26.620 You know, like every time you talk about all these bad things that are happening, all these negative things are happening to us.
01:31:31.860 Well, it's also very positive things that are happening as well.
01:31:34.940 People, unfortunately lines are being drawn, people are going into their corners, but for the good guys.
01:31:40.380 For the people who really, you know, stand on sincerity and truth and love and compassion, you for me have been very, very influential in that.
01:31:49.600 And again, man, just keep always remember that.
01:31:52.960 Always remember that, man.
01:31:54.080 Cause every time, like the other time when the guy who came on, who, uh, Gavin McGinnis, you know, oh, we got to punch him back.
01:32:05.100 And I'm a 10-year, five-tour veteran of the Marine Corps.
01:32:09.540 I've dealt with alcoholism, and I continue to deal with all my inner demons.
01:32:09.540 And you are absolutely right, man.
01:32:11.360 As soon as you start becoming those that you would stand against, it consumes you.
01:32:16.320 The anger, the, the, the, the frustration, all of that consumes you.
01:32:20.900 And you turn into this thing that you were just trying to fight.
01:32:23.460 And I know that's so hard, especially for guys like me, you know, who are the veterans and the alcoholics, all the broken guys.
01:32:32.000 Like, that's the hardest thing to swallow, you know. And I'm an older millennial, man.
01:32:37.980 I'm 32.
01:32:38.980 I have three children, you know, I mean, I'm heavily tattooed.
01:32:42.360 You know, I'm the guy that you would think looks like the dude who'd be going around punching people, you know, but I'm not.
01:32:48.700 And that has a lot to do with you, and, you know, I'm a better man for it because I know what I would die for.
01:32:55.600 I know who I am.
01:32:56.780 I know what my principles are, you know?
01:32:59.200 And that's because again, you have challenged me over and over again.
01:33:02.840 Oh, man.
01:33:03.100 Do I believe that?
01:33:04.080 Do I feel that way?
01:33:05.220 How do I really feel?
01:33:06.540 Why do I think that?
01:33:08.140 You know what I mean?
01:33:08.900 And that has a lot to do with you, brother.
01:33:11.140 Dan, just, just always remember that, man.
01:33:13.580 Thank you.
01:33:14.000 I can't tell you how much your phone call means to me.
01:33:19.480 So thank you for that.
01:33:21.760 And you have given me fuel to fly on for a while.
01:33:26.680 Thank you.
01:33:27.480 And sincerely, I will tell you, as a personal note, that... this is probably too much to say here in about a minute.
01:33:43.420 So, I've got 30 seconds, so I'm not going to even start it.
01:33:47.840 I'm going to save it for another day, but perhaps tomorrow, a longer conversation of where I think we are and where I think we're going.
01:34:02.820 And I only say that as a, we, because that's where I'm at and where I'm going.
01:34:09.300 And I would like to invite you to come along the journey with me.
01:34:12.560 We'll talk about that coming days.
01:34:19.100 Yeah.
01:34:20.160 Welcome to the program.
01:34:21.300 There's a new poll out, and it's causing a slight disturbance here in the studio, about how warmly Americans feel towards Donald Trump.
01:34:33.160 And not just that, they specifically surveyed Republicans who voted for him.
01:34:39.260 In fact, they even went to the trouble and I don't know a survey that's done this before.
01:34:42.620 They actually went to the trouble of verifying that these people voted for him.
01:34:46.780 So they surveyed Trump fans to find out whether or not they're still Trump fans.
01:34:51.980 And, you know, probably won't surprise you to know that 82% of them are, they still have warm feelings or very warm feelings for him.
01:34:58.380 What does that mean?
01:34:59.520 Warm feelings.
01:35:00.060 That was the question.
01:35:01.460 Yeah.
01:35:02.060 See, I interpret that as, for instance, I always had warm feelings for George Bush, but he pissed me off.
01:35:08.520 Ooh, I didn't always have warm feelings for him at the end.
01:35:11.220 I had pretty.
01:35:12.440 Yeah.
01:35:12.960 No, I went, uh, chilly feelings for him.
01:35:15.080 Yeah, I did.
01:35:15.960 And then I went and met with him and I felt warm, warmly about him as a person, but I still disagreed with his policies.
01:35:22.900 Remember?
01:35:23.460 I mean, I said, it didn't change me on how I felt about his policies, but I liked him as a human being.
01:35:29.140 And so, you know, for instance, this is probably not helpful, but I don't like Donald Trump as a human being.
01:35:36.580 He would not be my friend.
01:35:37.940 I mean, you know, even if he was like, Glenn, you're the best, you're the greatest, he would not be my friend.
01:35:42.560 He has, but he was like that at one point.
01:35:44.540 Yes.
01:35:44.760 And he wasn't my friend and he wasn't your friend.
01:35:46.360 He would call me and I'd be like, okay, that was weird and creepy.
01:35:49.440 What was that about?
01:35:50.180 And I hung up the phone.
01:35:51.460 Um, so he wouldn't be my friend cause I don't like him.
01:35:55.160 I don't like his lifestyle.
01:35:56.440 He's just, we just don't, we're not compatible.
01:35:58.460 Right.
01:35:58.640 He's there.
01:35:59.020 Yeah.
01:35:59.380 He's not, not a guy who would necessarily run in the Glenn Beck circles, by the way.
01:36:03.080 Also, there's a lot of people who wouldn't, you know, male sports fans.
01:36:07.340 Yeah.
01:36:07.520 People with manly interests.
01:36:09.440 We get the people who don't like the opera.
01:36:11.900 We get the point.
01:36:12.960 Okay.
01:36:13.680 Yeah.
01:36:14.200 So, I mean, I know I'm friendless, so it doesn't mean an awful lot, but that doesn't have anything
01:36:19.900 to do with how I view his policy.
01:36:22.160 How I viewed him before the election was a guy who I didn't trust.
01:36:27.340 I didn't like, I didn't think he was of good character and I didn't, because of that, I
01:36:32.600 didn't believe he would do any of the things he said he would do.
01:36:35.920 Now, new information.
01:36:38.880 He has not surprised me on some of the things. Character? There's nothing new there,
01:36:45.140 but what he said he would do, he's done a lot of it, and I didn't think he would.
01:36:52.260 So good.
01:36:53.240 I feel, I feel great about those things, but that doesn't change my warmth of him.
01:36:58.660 Am I just being too nitpicky here on the word warmth?
01:37:00.720 I just think it's a weird word.
01:37:01.820 I think you are.
01:37:02.060 Yes.
01:37:02.840 Well, why did they use that?
01:37:04.120 I've never heard that used in a poll.
01:37:05.460 I mean, it's just a scale of how much you like the guy.
01:37:08.740 It's just another way to measure his approval ratings, and they just did it on a scale of
01:37:14.640 zero to a hundred, and if you're over 51, that starts the warmness.
01:37:19.660 Okay.
01:37:20.360 And I was just... I don't know where I would put it. He's the only guy, I think, where not only do
01:37:27.020 my feelings for him change on a daily basis, I think sometimes hourly, maybe even by the
01:37:34.900 minute at times. I've seen you at times going, man, I'd walk through a wall of fire for
01:37:39.840 him right now.
01:37:40.640 I'm so mad.
01:37:41.840 Mainly, I'm mad at the press and the way they're treating him.
01:37:44.860 Well, they push it towards him for sure.
01:37:46.780 There's no question about that.
01:37:47.900 And then he's done some great things like really brave things that no other president
01:37:51.640 has done, like making Jerusalem the site of the U.S. embassy, and yeah.
01:37:58.480 And then he sped along that process when initially they said, yeah, we'll do it at the end of
01:38:02.440 the year.
01:38:02.700 And then that wasn't even good enough.
01:38:04.740 He actually sped it along.
01:38:06.020 They found a place and they moved on it.
01:38:07.960 It's pretty amazing.
01:38:09.140 And then other days, you know, when more than rat poison, yeah, it's just, it's difficult.
01:38:15.260 Um, but they say like one in five voters, probably one in five of his voters have turned on him
01:38:22.640 to a certain extent.
01:38:24.160 I don't experience that ever.
01:38:26.460 Do you guys, I don't hear from people who say, yeah, I did like him and now I don't.
01:38:30.220 Well, I think, remember, it's people who voted for him.
01:38:32.900 So there were a lot of people who were just like, you know, I just can't go Hillary,
01:38:36.720 and I don't want to go third party.
01:38:38.040 So I'll go with Trump, and they kind of held their nose.
01:38:40.180 And maybe those, some of those people have fallen off.
01:38:42.300 And I don't even know.
01:38:43.100 I don't know if people want to say that because then you feel like you're part of the, you
01:38:48.220 know, get him crowd because you, you have to be labeled, right?
01:38:52.020 I mean, they're labeled.
01:38:52.900 It's so amazing.
01:38:53.620 I was labeled an anti-Trumper, even though I never said I would never vote for him for all time.
01:38:58.680 Yeah.
01:38:59.000 I was a never Trumper.
01:39:00.520 Um, no, I was, I was never Trump in 2016.
01:39:04.340 I'm open to new information now, which we said from the beginning, we said from the
01:39:08.420 beginning, you know, if he turns out, I'll be the first to say it.
01:39:11.020 And I did.
01:39:12.200 So I was never, never a Trumper and I'm not a pro Trumper.
01:39:16.480 I'm just a citizen who's like, I don't know.
01:39:19.940 Is he doing a good job?
01:39:20.860 Are these good things for the country or bad things for the country?
01:39:23.820 It does this outweigh this bad thing outweigh this good thing.
01:39:27.820 Is it anything that I can change or we can change?
01:39:31.080 No.
01:39:31.760 Okay.
01:39:32.240 Well, let's keep moving.
01:39:34.420 What's it?
01:39:34.780 Isn't that what we're supposed to do?
01:39:36.520 I thought so.
01:39:37.580 It's interesting though.
01:39:38.480 I think maybe your hesitation with that poll is because the word warmness almost signifies
01:39:45.080 an emotional attachment.
01:39:46.680 Yes.
01:39:47.060 Right.
01:39:47.360 Like... and people do have one to him.
01:39:49.620 And I think that's true.
01:39:50.400 I think, but I think the press is driving that even more.
01:39:52.880 Yeah.
01:39:53.080 For all the good and the bad of the Trump presidency.
01:39:55.840 The one thing I think we can totally, everybody can come together on is it's been a very emotional
01:40:00.400 time for a lot of people.
01:40:02.440 Yeah.
01:40:02.660 People are either very emotional about him in support and against.
01:40:07.260 And I don't think that should really apply to a president.
01:40:11.220 I don't either.
01:40:11.640 We should almost never.
01:40:12.340 I think that's the problem.
01:40:13.140 Right?
01:40:13.400 Like, like there is a cult of personality.
01:40:16.080 That's what happened with, with Barack Obama.
01:40:18.020 His cult followed him, would not accept anything bad about him, stoned to death anybody who
01:40:24.980 said, wait a minute, hang on, let's, let's talk about this for a second.
01:40:28.220 You are calling him a liar, which makes you a racist.
01:40:30.620 And it was all emotion.
01:40:31.840 And on our side, a lot of it was emotion because they would just deny, deny, deny, deny and
01:40:36.980 call you all kinds of names, and you were pissed off.
01:40:39.640 Now, the reverse.
01:40:42.480 Now, the other side is so emotional and so they just can't believe how people could be
01:40:48.780 this crazy, and they don't realize, wait a minute, we just saw this in reverse.
01:40:53.040 You should learn your actions are only making things worse.
01:40:57.580 You know, the last time I can think of a real emotional connection for the country to a president
01:41:02.680 was George W. Bush after 9-11.
01:41:04.960 Remember, he got to a point where he had, I think it was 86% of Democrats approving his
01:41:10.900 job performance.
01:41:12.360 That's not a thing where they actually agree with his policies or even agree specifically
01:41:17.320 with the things he was doing when it came to the war on terror or anything like that.
01:41:20.920 It was just like, we got to come together.
01:41:23.000 I love our country.
01:41:23.840 We're under attack.
01:41:24.760 And it was an emotional sort of coming together for a very short period of time.
01:41:29.560 And now it's, it seems to be incredibly emotional on both sides with very few people making
01:41:36.680 individual day-to-day distinctions on whether this, he's doing a good thing with X, Y, or
01:41:41.340 Z policy.
01:41:42.060 You know, it's, it's, it's not seemingly part of the conversation.
01:41:46.000 It's what they did to Reagan on steroids.
01:41:49.260 Yeah.
01:41:49.860 I mean, you know, it was like this with Reagan.
01:41:52.620 There were some in the press that just could never admit anything good.
01:41:56.320 But, you know, you can see the decay of the press just by this.
01:42:03.180 Reagan was elected by the so-called Reagan Democrats and everybody talked about it.
01:42:07.520 There's these Democrats, these democratic voters who used to always vote Democrat and they went
01:42:11.960 with Ronald Reagan.
01:42:13.380 20% of Donald Trump's base, 20% are Democrats.
01:42:20.040 People who voted for Barack Obama at least once, many of them twice.
01:42:27.100 That's not counting the Senate campaign.
01:42:30.020 That's as president: 20% were Obama Democrats.
01:42:35.140 The press never talks about them.
01:42:38.480 Never.
01:42:39.560 That's true.
01:42:40.580 Never.
01:42:41.220 It's the right.
01:42:42.320 It's the Republicans, at least with Reagan, they were at least willing to diagnose the problem.
01:42:48.180 Wait a minute.
01:42:48.480 There's a problem in the democratic party.
01:42:50.520 They're not appealing to these people.
01:42:53.420 They're not saying anything about that.
01:42:56.380 20%.
01:42:56.820 Are you willing to say that 20% of the Democratic Party were racist?
01:43:00.840 Racist against black people?
01:43:03.420 People who voted for Barack Obama once, or some of them twice,
01:43:08.220 were somehow or another racist all of a sudden?
01:43:10.380 How's that work?
01:43:10.920 And we've seen that before too.
01:43:11.800 There were large numbers of the African American community and Democrats who voted, you know,
01:43:16.640 for banning gay marriage in California when that passed.
01:43:20.320 And then there's never discussion about those people.
01:43:23.580 No.
01:43:24.020 It's much easier.
01:43:24.860 The largest segment of people that voted for, was it Prop 8?
01:43:29.240 Yeah.
01:43:29.700 Were black.
01:43:31.160 That was the largest group to vote in favor of Prop 8.
01:43:36.380 If I remember correctly.
01:43:38.240 It was.
01:43:38.720 Per capita.
01:43:39.420 Yeah.
01:43:39.820 Percentage wise.
01:43:40.580 Percentage wise.
01:43:41.380 Yeah.
01:43:41.440 So let me ask you, Pat, on your warmness scale, how would you rate your warmness towards
01:43:47.340 Keith Ellison?
01:43:49.260 Like super duper high, mega high, super Doppler high.
01:43:53.560 Well, it gets to the point to where it's pretty hot because it's like, it's hot.
01:43:56.020 Right.
01:43:56.540 Yeah.
01:43:56.800 Like the, all the intensity of white hot burning suns, like a billion of them.
01:44:01.280 Wow.
01:44:01.720 Yeah.
01:44:02.080 Because I mean, he's been, he's pretty hot right now.
01:44:03.920 He's done a super good job.
01:44:05.280 He was running for attorney general in Minnesota, in the middle of a primary, and a couple of
01:44:10.480 days beforehand, he got accused of domestic violence, of getting into physical fights with
01:44:15.360 his long-term girlfriend, by the long-term girlfriend's son. And the
01:44:21.320 girlfriend confirmed it and is sharing texts with the media.
01:44:26.920 Keith Ellison still somehow wins his primary, which is, of course, because when you're
01:44:33.480 a Democrat, none of this stuff seems to matter. But would you phrase your acceptance
01:44:40.480 like this? If you were Keith Ellison, I'm curious if this is a good move PR-wise. Listen
01:44:46.000 to his celebration last night.
01:44:47.980 We had a very unexpected event at the end of this campaign that happened.
01:44:57.380 I want to assure you that it is not true and we are going to keep on fighting all the
01:45:02.620 way through.
01:45:03.360 I don't think... oh my. After you're accused of domestic violence, I don't think we're going
01:45:08.040 to keep on fighting is the right verbiage.
01:45:11.280 I think you might want to switch that one up a little bit.
01:45:14.400 In fact, I might drag you out of bed and beat your head against the...
01:45:17.600 No, Keith, stop!
01:45:19.120 Stop!
01:45:19.400 What?
01:45:20.140 No, I mean...
01:45:20.920 You don't mean that kind of fighting?
01:45:23.040 I'm talking in metaphor.
01:45:24.180 In metaphor.
01:45:24.780 What do you mean?
01:45:26.440 I mean, I'm just going to, I'm going to just hit your head against the dresser there in
01:45:31.080 the bedroom a few times.
01:45:32.520 It's all right.
01:45:34.560 Jeez, I mean...
01:45:36.000 Oh gosh.
01:45:37.800 Well, yes, Keith is going to be apparently...
01:45:40.080 So he won the nomination?
01:45:41.500 He won the nomination, so he may be attorney general.
01:45:44.580 Wow.
01:45:45.140 And by the way...
01:45:46.540 Unbelievable.
01:45:47.680 Unbelievable.
01:45:48.380 When everybody else is getting fired just from the accusation, he actually wins an election.
01:45:52.700 Mm-hmm.
01:45:53.440 Amazing.
01:45:54.300 Pretty serious accusations too.
01:45:55.860 That's in Minnesota, right?
01:45:57.080 Yeah.
01:45:57.500 Yeah, in Minnesota, it's gone crazy.
01:45:59.500 Minnesota, I'm telling you, you know, Al Franken may be elected God.
01:46:04.660 I mean, I think it's just, it's gone...
01:46:06.860 I mean, Al had to step down because he took a jokey picture with a girl like 10 years ago
01:46:12.400 when he was a comedian.
01:46:13.400 The guy who, the guy who is accused of being a domestic abuser, let's give him the keys to
01:46:18.920 law enforcement.
01:46:20.740 Good heavens.
01:46:22.720 All right, let me tell you about our sponsor this half hour:
01:46:24.500 American Financing. Owning a home really hasn't been easier than it is right
01:46:29.580 now.
01:46:30.200 You can make an incredible investment.
01:46:32.380 It is most likely the largest investment of your life if you're looking to buy your first
01:46:36.480 home, your last home, or just to refinance.
01:46:40.460 The people that can give you pre-approval really fast so you know how much you're going to qualify
01:46:45.160 for and close faster than expected because the underwriting is all done in-house.
01:46:52.740 All the decision-making is done in-house.
01:46:54.740 It's American Financing.
01:46:56.240 They employ salary-based mortgage consultants, so they don't work on commission.
01:46:59.360 They're not working for the bank.
01:47:01.020 You know, when the banks are like, I'm going to give you a loan.
01:47:03.240 No, you're selling me a loan.
01:47:04.640 I see how much this loan costs me right here in the interest.
01:47:08.140 I'm not interested in that one.
01:47:09.980 And they do that because we think they're giving us a loan.
01:47:14.120 These people work for you.
01:47:16.360 They're not on commission from the bank.
01:47:18.100 They have an A-plus rating with the BBB and over 1,800 Google reviews, so check it out for
01:47:22.420 yourself.
01:47:23.360 It's AmericanFinancing.net.
01:47:26.160 If you are buying your home or refinancing, AmericanFinancing.net.
01:47:30.900 Do it now.
01:47:31.500 800-906-2440.
01:47:33.840 800-906-2440.
01:47:36.100 Or AmericanFinancing.net.
01:47:38.140 American Financing Corporation, NMLS 182334, www.nmlsconsumeraccess.org.
01:47:48.940 On your drive home, something that's great to listen to is The News and Why It Matters.
01:47:54.760 It's available as a Blaze subscriber, but also if you subscribe to the podcast on iTunes,
01:47:59.800 you can get it.
01:48:00.840 The all-new podcast, which is The News and Why It Matters.
01:48:05.400 It's hosted by, I think, one of our best writers and smartest minds here.
01:48:14.160 And then she drags us along.
01:48:17.020 Sarah Gonzalez.
01:48:17.520 Yeah, but today we have a couple of guests sitting in a couple of the chairs, and that
01:48:23.260 is Ben Shapiro and also Jeremy Boring, who is anything but.
01:48:28.520 He was instrumental in the success of Prager University, also helped build Ben Shapiro's
01:48:38.860 business.
01:48:39.720 He is a Hollywood writer and the guy who started Friends of Abe.
01:48:44.680 He's a fascinating guy to listen to, and you can enjoy him on your way home today with
01:48:56.200 me and Sarah and Stu at The News and Why It Matters.
01:48:56.200 You can find it on iTunes.
01:48:57.600 And by the way, please subscribe and rate it and review if you can.
01:49:03.320 And that helps move things up so other people can find it when they're looking for it.
01:49:08.340 Other people can just stumble onto it.
01:49:09.840 It moves it up in rank.
01:49:10.840 So please rate it and review it and subscribe to The News and Why It Matters.
01:49:17.500 And one of the things I want to talk about today, if we have a chance on The News and
01:49:20.660 Why It Matters, is the new charges against the evil Jack Phillips, who owns Masterpiece
01:49:29.020 Cakeshop.
01:49:29.360 So this is a guy in Colorado, right?
01:49:33.040 Yeah.
01:49:33.160 If you remember the cake story where, of course, someone came in who was gay and wanted a
01:49:37.600 cake for their wedding, and he refused, and it went all the way to the Supreme Court.
01:49:41.080 He won't make Halloween cakes because it's against his religion.
01:49:44.800 I mean, the guy is consistent all across the board.
01:49:46.960 So he goes all the way to the Supreme Court.
01:49:48.520 He wins the case, comes back down.
01:49:51.180 Now, the one thing we took out of that, because I didn't think the ruling was broad enough, but
01:49:54.540 the one thing we took out of that was that at least this guy is
01:49:58.000 protected, at least in his own personal life.
01:50:00.300 It's squared away correctly.
01:50:02.660 Well, no.
01:50:03.460 Now, someone who is transgender has come into the shop and asked for a gender transition
01:50:08.600 celebration cake, which he, of course, again refused, and now the same people are coming
01:50:14.220 back after him to try to punish him again.
01:50:17.320 Do we get him on the radio tomorrow?
01:50:18.820 I mean, how does this guy even get up in the morning and do
01:50:24.420 it?
01:50:25.660 Seriously, I mean, you've got people who are just coming after him.
01:50:27.880 There was no... there's no...
01:50:29.180 Oh, wait.
01:50:30.880 Oh, wait.
01:50:31.260 I'm a fan.
01:50:31.820 I didn't know you
01:50:32.640 wouldn't make this cake.
01:50:33.620 I didn't even know who you were.
01:50:34.580 This was a political stunt.
01:50:37.560 And this guy's life is going to be turned upside down for another two years.
01:50:41.120 I mean, it just makes you just want to say, forget it.
01:50:44.360 Hopefully it just gets thrown out of court.
01:50:45.620 The Supreme Court already ruled on this.
01:50:46.980 So hopefully it just gets thrown out and he doesn't have to deal with it.
01:50:49.200 But there'll be another one around the corner.
01:50:50.880 It's awful.
01:50:51.700 We'll see.
01:50:52.380 Hey, but Antifa is fighting fascism.