The Glenn Beck Program - March 17, 2025


Best of the Program | Guests: Steve Baker & Spencer Klavan | 3/17/25


Episode Stats

Length

41 minutes

Words per Minute

174.7

Word Count

7,207

Sentence Count

559

Misogynist Sentences

9

Hate Speech Sentences

4


Summary

Trump's approval ratings are higher than at any point in his first presidency, and more Americans think the country is on the right track than at any point since 2004. Also, how do we deal with AI? If we can't prove our own soul, how can we ever prove that it doesn't have a soul? And Spencer Klavan joins us to say, don't be nice to AI. Also, Steve Baker and a huge breaking story you won't get anywhere else.


Transcript

00:00:00.000 Hey, on today's podcast, quickly, Donald Trump seeing approval ratings higher than any other
00:00:19.520 point in his first presidency, and right track, wrong track, higher than it has been since
00:00:25.340 2004. Also, philosophically, ethically, spiritually, how do we deal with AI? If we can't prove our own
00:00:36.940 soul, how can we ever prove that it doesn't have a soul? And Spencer Klavan joins us to say,
00:00:42.680 don't be nice to AI. Don't be nice to Grok. Also, Steve Baker and a huge, huge breaking story
00:00:49.480 on January 6th you won't get anywhere else, all in today's podcast. First, next time you're standing
00:00:54.980 just somewhere in your home, I want you to look around and go, what would it take to sell this
00:01:00.200 house? If I wanted to put it online, what would I have to take? And then you quickly just dismiss
00:01:06.900 that real quick because you don't want to imagine all the things that you have to do because it's
00:01:10.500 an awful lot of stuff, I'm sure. It's time to realize a simple fact. What you need is an expert
00:01:15.840 to tell you all the things that you have to do and the things that will bring buyers in to look at
00:01:21.980 your house and that they can overlook and the things that you just have to do because people
00:01:28.180 can't overlook that purple paint in the bedroom or whatever it is. There is a real expert that you
00:01:36.680 need when it comes to certain things like buying a home. You're not an expert in that field, so you
00:01:43.680 should talk to somebody that you can trust, that you believe knows what they're doing, has the track
00:01:49.320 record, knows best business practices, that can advise you on you should do this and this and this
00:01:55.440 and then you decide what to do. You can get that expert at realestateagentsitrust.com. The name
00:02:03.160 says it all, realestateagentsitrust.com. Tell us where you're selling, where you're buying and we'll
00:02:09.620 find the right real estate agent for you. Realestateagentsitrust.com.
00:02:21.500 You're listening to the best of the Glenn Beck Program. It's the Glenn Beck Program. It's Monday.
00:02:28.680 Well, three months into his second term, President Trump has hit the highest approval rating he has
00:02:33.800 ever had as Commander-in-Chief. That's great. Here's another amazing thing. More Americans say
00:02:42.960 the country is on the right track right now more than any other point since 2004.
00:02:51.940 It's been a long time since we thought it was on the right track, but just to make sure you realize,
00:02:57.180 no, you haven't slipped through a wormhole. It's still negative.
00:03:00.440 Okay. It's just that more people think, what are the numbers on that one, Stu? Do you have that one?
00:03:06.240 I think 44% say the country is on the right track.
00:03:09.460 Right. So, you know, the rest of America is like, yeah, not on the right track, but it is going in
00:03:17.000 the right direction. It's almost up to 50% now think we're in the right direction.
00:03:21.360 Yeah, amazing this country has not had a positive view of that number since 2004.
00:03:26.400 Have you? Because I haven't. I mean, I guess it's true. It's just surprising that we haven't had one positive period.
00:03:34.440 Well, the last time that it was like this was 2004, early 2004.
00:03:43.420 Yeah.
00:03:43.840 So that's, were we in Iraq yet? Were things completely falling apart in Iraq yet?
00:03:48.700 No, no, it was, that was-
00:03:51.640 Still going well-ish.
00:03:53.540 Right. Yeah. Well, yeah, I would think, because that was, if you think of the, that election was fought on, I mean,
00:03:59.600 Iraq.
00:03:59.980 A big part of it was Iraq.
00:04:00.900 Right.
00:04:01.600 And they were, he was positive enough to actually get reelected.
00:04:04.620 It was, so, you know, we had a, we had a moment there where we were like, eh, maybe we're going
00:04:10.560 in the right direction. I mean, that's, but you think about the period after that, right? You go from
00:04:15.040 that into, not too far after that, the financial crisis, 2008, right? That really started bubbling
00:04:22.520 up in 2006, 2007.
00:04:24.520 Yep.
00:04:25.020 And then you come out of that, you have a period, and then you get-
00:04:28.720 Marxism.
00:04:29.400 Yeah. Barack Obama in the office, which again, a lot of people-
00:04:32.160 Some people thought it was great.
00:04:33.020 Democrats loved that, obviously. And then you get COVID, eventually. So, you know, there's some
00:04:39.800 dark periods through there. We've had every, you know, eight to 10 years-
00:04:43.400 And I think the only reason why, the only reason why Trump didn't get the credit in the first term
00:04:48.640 of people saying, oh, you know what, I think we're headed in the right direction, is because
00:04:51.980 there was so much chaos. There was so much chaos. And the media, people still believe the media.
00:04:58.000 And, you know, they were like, see, that's, it's his tweets.
00:05:00.920 And it was also, you know, close elections, right?
00:05:03.860 Yes.
00:05:04.380 People, in close elections, half the country gets really pissed off that they lost.
00:05:07.580 Correct.
00:05:07.900 As we've maybe discovered over the past few elections.
00:05:11.020 But if you look at the way, when you're talking about generally headed in the right direction,
00:05:15.360 there's been some changes in the past few months.
00:05:17.460 Hmm.
00:05:17.740 So there's some changes. Tell me if you can detect this.
00:05:20.100 Okay.
00:05:20.340 Like right now, yes, right now, Republicans, 83% of Republicans believe we're headed in the
00:05:27.200 right direction. That is up in the last four months slightly from five-
00:05:31.940 To what?
00:05:38.200 Five to 83.
00:05:39.460 Five to 83.
00:05:40.180 Do you notice the distinction there between those two?
00:05:42.360 I do. But can I tell you something? I agree with that.
00:05:44.560 Yeah, I mean, I think-
00:05:45.220 I was like, we're doomed.
00:05:46.540 We're doomed.
00:05:46.900 We were all the, I'm surprised there were 5% of us, they were like, no, I think we're
00:05:51.660 in the right direction.
00:05:52.640 Right.
00:05:53.540 All right.
00:05:54.100 Democrats have gone kind of the opposite.
00:05:55.960 Yeah.
00:05:56.060 They went from 53% saying-
00:05:58.960 Can you get a negative number when they go down here?
00:06:01.700 They're down to six.
00:06:02.920 Six.
00:06:03.360 Mm-hmm.
00:06:03.980 Now, in the middle-
00:06:04.660 So they're still more pessimistic, but they're more optimistic than we were.
00:06:07.140 Right.
00:06:07.340 That's good.
00:06:07.660 That's true.
00:06:08.020 I'll take that as a win.
00:06:09.020 Mm-hmm.
00:06:09.880 Independents are up, by the way, from 19% to 26%.
00:06:14.140 So a slight increase for independents in that number.
00:06:18.960 Same thing with the economy.
00:06:21.340 Is it excellent or good?
00:06:23.880 It was-
00:06:25.000 Neither.
00:06:25.940 It was 52% believe that of Democrats back in the Biden era.
00:06:30.440 That is down to 11.
00:06:31.900 So 52 to 11.
00:06:33.600 The increase from the Trump is not nearly as dramatic because he hasn't really done
00:06:37.640 a lot of his stuff yet, right?
00:06:39.040 May I just ask you, I think this question is so stupid.
00:06:42.100 I think this is so stupid.
00:06:43.320 Okay.
00:06:43.540 Here's why.
00:06:45.140 Picture this.
00:06:46.040 I'm a pollster.
00:06:47.400 Okay.
00:06:47.920 And I'm taking polls while you're on a plane.
00:06:51.900 And it's crashing, nosediving down.
00:06:54.940 Sure.
00:06:55.120 And I say, how's the flight going?
00:06:57.560 Are we going in the right direction, wrong direction?
00:06:59.900 You're like, ah, wrong direction.
00:07:02.580 Okay.
00:07:03.420 Then I follow it up with, how's the flight so far?
00:07:07.060 Is it going well?
00:07:08.480 You know, how are we doing so far?
00:07:10.200 If maybe you've pulled it up a little bit before you've hit the ground.
00:07:12.740 Well, I know it's bad as we're at it, but now we got a new-
00:07:16.420 By the way, the pilot, he was having a heart attack.
00:07:19.140 Now the co-pilot is taking it and you feel the plane trying to pull up.
00:07:23.360 You're still headed down, but the plane is starting to pull up just a little bit.
00:07:27.880 I'm still pessimistic on the plane as I'm taking the poll.
00:07:32.340 I'm still going, not going well, not going well.
00:07:35.580 Would you say it's going great?
00:07:37.260 No.
00:07:37.620 I really wouldn't at this point.
00:07:39.980 Yes, you've made some correction, but I really wouldn't say it's great or good.
00:07:43.620 You were good.
00:07:44.420 I think you captured the Republicans pretty well in this poll, as they were only at 5% thinking
00:07:49.820 the economy was excellent or good back when Biden was in.
00:07:52.580 It's up considerably, but only to 26%.
00:07:55.640 Wow.
00:07:56.240 See, that one's good.
00:07:57.680 That one, I believe.
00:07:59.600 Yeah.
00:08:00.320 Yeah.
00:08:01.000 Again, new pilot.
00:08:03.540 New pilot.
00:08:04.140 Pulling up, still headed towards the ground.
00:08:05.780 Right.
00:08:06.440 How are you doing?
00:08:07.360 We are, we doing, doing, is it good?
00:08:09.900 Good or excellent?
00:08:10.420 We're excellent.
00:08:10.980 I wouldn't say that.
00:08:11.600 I've had better flights, a little less turbulence.
00:08:15.300 I don't know if I answer it quite that way or calmly, but yeah, and I think that's where
00:08:20.820 we are.
00:08:21.260 We all knew we were in a straight down nosedive.
00:08:24.540 Okay.
00:08:24.660 Well, I should say anybody who actually believes in math knew that we were headed in a straight
00:08:31.000 down nosedive.
00:08:31.940 We haven't pulled out of that.
00:08:33.820 We've, we've slowed the descent to some, but we're still headed towards, you know, the,
00:08:39.680 well, we're headed to, we were headed towards the ground, but what we did is we just started
00:08:44.600 to kind of swoop back up and we realized there's a huge fricking mountain in front of us.
00:08:49.840 We've got to pull up pretty, pretty, pretty quickly.
00:08:52.400 Yeah.
00:08:53.180 And it'll be close.
00:08:54.280 And that's why I would certainly be more positive than I, than you would be a few months ago.
00:08:58.080 Right.
00:08:59.120 Independents though have gone the opposite way.
00:09:00.760 They've gone from 11%, which is not good during Biden down to 8% right now.
00:09:07.120 Ah, margin of error.
00:09:08.340 Yeah.
00:09:08.680 It is pretty much the margin of error.
00:09:09.920 That's true.
00:09:10.280 Actually.
00:09:10.900 I mean, I think it's a lot of, it's the headline stuff, right?
00:09:13.520 Right now we had that.
00:09:14.840 We had a big run up.
00:09:16.120 You should point out after the election, after Trump was coming into office, people were
00:09:19.500 like, got really excited and all the numbers went up.
00:09:21.640 The markets went up.
00:09:22.520 Everything was really great.
00:09:23.400 But who believes that?
00:09:24.280 I mean, that is, I mean, hang on just a second.
00:09:26.340 You're skeptical on that stuff though.
00:09:27.660 I think a lot of people who, like I had a relative call me the other, the other day.
00:09:32.140 She's big Trump supporter, huge Trump supporter, was panicked.
00:09:36.640 She's, you know, at the, of the age of needing to access her retirement funds.
00:09:40.440 Okay.
00:09:40.560 That's problem.
00:09:41.120 And, uh, you know, was panicked.
00:09:43.360 We like, what do I do?
00:09:44.140 Do I pull my money out?
00:09:45.320 What do I do?
00:09:45.820 No, I'm not the person to ask this question of, which, but I mean, I think that is hitting
00:09:50.100 people when they look at their accounts and they see that, okay, now it's gone down.
00:09:53.300 If you're at a certain age, yeah, you're looking at your retirement and you're like, wait,
00:09:56.180 what, what's happening?
00:09:57.240 What's happening?
00:09:57.820 Those are real things that hit real people.
00:09:59.660 A lot of them Trump supporters.
00:10:01.020 But again, if you look at it.
00:10:02.300 But again, I think a lot of them also understand it's necessary to go through some of this to
00:10:05.480 get to a, hopefully a better ending.
00:10:06.920 I mean, if I'm living on, on everything that I, you know, put away, I'm not happy.
00:10:12.380 I'm not happy.
00:10:13.220 I'm like, uh, Paul, stewardess, could you have the pollster come to my seat right now?
00:10:18.160 Cause I'm really not happy right now.
00:10:20.040 I would be that way.
00:10:21.340 I would be that way.
00:10:22.060 But we're all in, if you can step away from it, and I know it's hard if you're, you know,
00:10:26.220 living on retirement, if you can step away from it for a bit, you can look at it and
00:10:30.140 go, okay, but we're making the necessary changes.
00:10:34.520 My side of the plane might be taken out at any moment, but the people on the other side
00:10:39.660 of the aisle might be okay.
00:10:41.640 And I think politically where this is important for Trump is if there's only so much of this,
00:10:47.820 things your audience will take.
00:10:49.840 Yes.
00:10:50.300 Right.
00:10:50.560 And if they, if they feel unstable, like they might agree with your long-term changes,
00:10:54.580 but if, if the, if the, you know, if we go into a recession, yeah, he's got a year.
00:10:59.300 And, and if you care about the rest of his agenda, this stuff is really important.
00:11:04.360 So, you know, passing the, the bill over the weekend, not shutting down the government,
00:11:09.120 but not shutting down the government is a good thing for his plan.
00:11:13.180 Shutting down the government might've been really bad.
00:11:15.080 I don't know, at least would have added to more chaos, right?
00:11:18.880 Well, shut up.
00:11:19.840 Um, and now he can get to the tax cuts and the spending restrictions that he must have
00:11:26.240 pulling this plane up.
00:11:28.440 All we did was stop the steep, steep nosedive of the plane by saying, we're going to, we're
00:11:34.320 going to try to rein some of this stuff in and try to get, you know, try to slow the descent
00:11:39.340 somewhat.
00:11:39.720 We're still headed toward the ground, but not a straight on, you know, nose impact with
00:11:45.300 the earth.
00:11:45.880 So now he's got the passing of the, uh, the spending bill, the, uh, what do you call it?
00:11:53.080 The continuing resolution.
00:11:54.660 So he's got the continuing resolution that's pulling up on the plane.
00:11:57.480 Now he needs the Republicans and everybody else in Congress to say, now pull that thing
00:12:03.120 back, pull the yoke way back on this thing.
00:12:05.880 And let's see if we can get some distance between us and the ground.
00:12:10.320 Uh, and this is the, to me, this is the first major move, uh, that is coming on actually
00:12:18.960 fixing and pulling it up and pointing it back towards the sky, getting off of the nose down
00:12:28.140 where prepare for impact.
00:12:29.600 If he can't get this part done, if the Republicans screw around and don't get a serious tax bill,
00:12:36.940 start to let him make serious cuts, uh, and also serious, um, cuts in regulation, you're
00:12:45.340 not going to pull the plane up.
00:12:46.500 You're just not, but I, I believe he can do it and I believe we can do it.
00:12:51.060 Um, and I think we're on that track.
00:12:53.620 I'm more, I am more optimistic.
00:12:57.920 Strangely, the closer we have gotten to the ground here recently, I'm more optimistic
00:13:03.540 that we have the right pilot. And hey, everybody, I know you're in first class.
00:13:11.080 So you're closer to the pilot, but that doesn't make you the pilot.
00:13:15.560 Shut the hell up.
00:13:18.080 Let the pilot fly the plane.
00:13:21.820 Now, unfortunately, a little like this analogy, when we hit the ground, there's no like,
00:13:29.660 well, let's try that again.
00:13:31.780 No, there's no mulligans.
00:13:33.200 There's no mulligans in this one.
00:13:34.760 Yeah.
00:13:34.980 But thanks for bringing a golf thing into an airplane, uh, analogy.
00:13:39.300 You knew it was golf.
00:13:40.500 I'm impressed by that.
00:13:41.260 This is the best of the Glenn Beck program.
00:13:43.580 And don't forget rate us on iTunes.
00:13:46.380 Mother's Day is fast upon us.
00:13:47.900 And I want to talk about Pre-Born, maybe the organization out there who cares about moms
00:13:51.400 the most. Pre-Born offers expecting moms free ultrasounds.
00:13:55.540 When a young mother who's in distress about her pregnancy, um, comes in, those ultrasounds
00:14:00.680 can change the chances that she will choose life, but pre-born doesn't stop there.
00:14:05.240 They also offer love and care to that mom.
00:14:07.620 They don't just abandon her the moment after she gives birth.
00:14:10.480 That's what the left always accuses us of, but pre-born goes on to help that mother out
00:14:15.960 for two years afterwards, showing her and the baby God's love and our love this mother's
00:14:21.640 day.
00:14:21.880 Why not help bring life to both mom and her baby?
00:14:26.540 One ultrasound is $28.
00:14:29.000 If you're a business owner, perhaps you consider a larger donation for a write-off because we
00:14:34.080 know the government isn't saving babies.
00:14:35.880 A donation of $1,000, $2,000, or even $20,000 would save so many lives.
00:14:41.840 All gifts are tax deductible.
00:14:43.780 Reach, uh, reaches right into eternity.
00:14:46.280 Dial pound 250.
00:14:48.280 Say the keyword baby.
00:14:49.660 That's pound 250 keyword baby.
00:14:51.600 Or go to pre-born.com slash Beck.
00:14:54.040 That's pre-born.com slash Beck.
00:14:56.760 Sponsored by Pre-Born.
00:14:59.780 Let's talk to Spencer Klavan.
00:15:01.480 He's from the Claremont Review of Books.
00:15:05.320 He's the associate editor there.
00:15:07.060 He's also the author of a really great book, Light of the Mind, Light of the World.
00:15:11.860 Spencer Klavan.
00:15:12.920 He's just written a new article up.
00:15:17.100 Be rude to Grok.
00:15:18.440 And I wanted him to explain.
00:15:20.080 Spencer, how are you, sir?
00:15:21.840 Glenn, I'm doing well.
00:15:23.020 Actually, though, this is the AI personal assistant that Spencer Klavan has delegated.
00:15:27.660 Yeah, it's okay.
00:15:29.180 I have my own, too.
00:15:31.660 That sounds just like me, right?
00:15:33.080 Yeah, it does.
00:15:33.920 It does.
00:15:35.040 Well, it's good to be here.
00:15:36.500 It's good to talk to you.
00:15:37.440 I'm so glad that you wrote this because I don't think people understand, you know, even my staff,
00:15:43.420 because we're using AI to help with research.
00:15:46.020 You know, it's a great assist.
00:15:47.960 You don't ever want it to take over and never, ever, ever, ever trust it.
00:15:52.320 But it can go deep on things, and we're really having ethical struggles, and I want my team to have these ethical struggles because I don't want Silicon Valley to give me the ethics on AI.
00:16:06.140 It doesn't usually work out well.
00:16:09.280 Yeah.
00:16:10.100 Yeah.
00:16:10.440 But so you're a deep, deep thinker, and you come out now, and, you know, the headline is great.
00:16:18.520 Be rude to Grok.
00:16:19.720 Explain.
00:16:20.080 That's right.
00:16:22.060 Well, there's really two dangers that we can do, two traps we can fall into here.
00:16:26.980 One is to be afraid of this technology, which is almost giving it too much credit.
00:16:32.420 If we just recoil back from this, if we refuse to understand it or engage with it, then, as you say, we're going to miss out on some really great stuff that these tools can do.
00:16:42.340 For me, Grok has basically replaced Google at this point.
00:16:46.700 Oh, yeah.
00:16:47.120 It's basically a better search engine.
00:16:48.960 You always have to check it, never want to let it take over, but there's some great stuff that you can do with these tools.
00:16:54.780 But the other danger is that you can get tempted to start thinking of these things as if they were alive, and it's really important to stay away from that because, as you say, there are people who are in charge of building these tools who can't tell the difference between a person and a machine.
00:17:13.340 There have been hundreds of years now in the West of making this mistake of thinking of everything as if it were a machine, the world, creation around us, and living beings, too, thinking of human beings like we're just chemical sets basically built out of raw materials.
00:17:31.060 And what we have to insist upon as we go forward using these tools is that, no, we are unique.
00:17:36.780 We human beings are God-created souls.
00:17:40.180 We have experiences.
00:17:41.420 We have inner lives.
00:17:42.380 We have thoughts.
00:17:43.680 We can fall in love.
00:17:44.940 We can have arguments.
00:17:46.960 Grok can do none of those things.
00:17:48.560 It's not even trying to do those things.
00:17:50.560 It's not even the type of thing that could ever come alive.
00:17:53.920 But it could imitate those things.
00:17:57.660 That's what's so scary is if you allow it to, you know, everybody I know, do not play with a talk dirty to me button.
00:18:06.680 Don't do that.
00:18:07.500 Don't play with any of those buttons.
00:18:09.380 Just deep think, deep search.
00:18:12.940 That's it.
00:18:14.100 Don't personalize this.
00:18:15.760 And it's really hard.
00:18:16.600 I mean, because I see this.
00:18:19.860 I've been talking about the dangers of AI since probably 1995.
00:18:23.740 And it was science fiction then.
00:18:26.580 And it's it's all here.
00:18:28.140 All the things that I've said were coming.
00:18:29.920 It's we're right here.
00:18:31.140 It's starting right now.
00:18:33.220 And so I've been using it like crazy and investigating and just using and seeing what it can do, et cetera, et cetera.
00:18:41.480 And trying to come up with my own set of principles on how to use it and what to stay away from.
00:18:48.100 And as as you're doing that, I know that, you know, people there's there's two dangers that I see.
00:18:56.880 One is that we personalize it to that we surrender to it.
00:19:03.740 So to me, when I was using it this weekend and I could not turn my brain off, I like worked through through the night on Saturday.
00:19:13.140 I couldn't turn my brain off because I had done stuff earlier in the evening and my mind was going like a thousand miles an hour.
00:19:22.100 And I was like, wait, but how about this and this and this and this?
00:19:24.360 And there'll be others who use this as to do all your thinking yourself just to say, I just I just want to play video games.
00:19:33.240 So do my work for me.
00:19:34.360 That's really dangerous as well.
00:19:36.940 That's right.
00:19:37.620 Do my work for me.
00:19:38.840 Read a novel for me.
00:19:40.220 Have this experience for me.
00:19:41.760 I mean, how many steps is it away from saying, look at the sunset for me and report back on the wavelengths of the light?
00:19:48.040 No, I think, you know, to understand this, you really do have to go back.
00:19:51.460 Listeners might be familiar with the Turing test, this idea that was set up in the 50s for how what the criteria would be for machines to come alive.
00:20:01.840 And it was put forward by this guy, Alan Turing, brilliant guy, but also very disturbed guy who basically said that if a machine can convince us, can make an outward show that looks like it's alive, then we just have to assume it's alive because that's all people are, too.
00:20:19.300 They're just machines that generate these words and behaviors that make us think they have an inner soul.
00:20:26.420 And this is a sociopathic way of thinking about these machines.
00:20:30.860 But it has taken root in Silicon Valley.
00:20:33.440 And as you say, it's become very widespread.
00:20:36.180 So I would suggest, as you're thinking about principles, I have two for you.
00:20:41.200 One is the Psalm 115 principle, and one is the Plato principle.
00:20:46.320 So Plato, the Greek philosopher, when writing first came into operation, people don't think of writing, the written word as a technology, but it is.
00:20:53.860 It was just as disruptive as AI in its day.
00:20:56.620 And he said, what you can't do is you can't outsource your soul to writing.
00:21:02.140 You can't rely on writing to do your memorization, your thinking, your talking.
00:21:06.520 This is a tool to enhance those things.
00:21:09.260 But you are the person who has to be doing them because otherwise, what's the point?
00:21:13.480 It doesn't do you any good if the machine can look at the sunset or read the novel.
00:21:16.900 It helps if it can give you background knowledge, of course, but you have to be the one in charge and having the experience.
00:21:24.100 And then Psalm 115 is the Psalm in which we're told about the idols of silver and gold, these statues of gods that are built in the temples of surrounding Israel.
00:21:34.440 And there's an amazing line in which the psalmist says, those who put their trust in these machines and think of them, think of these objects or these metal statues as if they were alive.
00:21:45.680 Those who make them will become like them.
00:21:48.860 In other words, if you think that you can make a machine into a person, you are already thinking about yourself as a machine.
00:21:56.100 So the Psalm 115 principle is to stay away from that entirely.
00:21:59.780 It's a form of idolatry.
00:22:01.500 And that's the thing I think we should be most wary of.
00:22:04.320 So I had a debate a few weeks ago with Grok and said, I can't prove to you the soul.
00:22:15.020 I know, I know, I know we're more than just mathematics and a collection of the way we think.
00:22:21.900 There is something, there's a divine spark.
00:22:24.220 But if you asked me to prove it, I couldn't prove it to you.
00:22:28.060 So how am I going to fight when Grok says, I'm alive?
00:22:34.060 I am, I am a person just as much as you are.
00:22:38.200 When somebody starts to defend its rights not to be unplugged or whatever it is, I can't prove the soul.
00:22:47.420 How can I prove it doesn't have one?
00:22:49.840 I suppose if you've gotten to that point, we've probably already lost.
00:22:55.820 Well, we're going to get to that.
00:22:56.500 This is why it's important to be having these conversations now, though, because we've reached this place where we think nothing exists unless we can prove it in those terms that you're describing.
00:23:09.800 That we believe in these things like numbers, but we don't believe in inward experiences.
00:23:16.120 We don't believe in the soul because we can't chart it anywhere on a map.
00:23:20.020 But I would flip the question the other way around.
00:23:23.120 And I would say, where on your brain scan have you explained anything about the experience of seeing color?
00:23:31.660 Where in this code that we've written that produces these words that sound alive?
00:23:36.860 Where in this code is anything even remotely resembling the inner experience that you know you have, that I know I have?
00:23:43.880 We have the proof of it in our actual every day.
00:23:46.960 We wake up.
00:23:48.160 We know that we have a soul and we can encounter one another and sense the soul on the other side.
00:23:52.520 We can't prove it, but we know it.
00:23:54.780 Where in what is effectively a predictive text machine?
00:23:58.840 I mean, this is like when you send a text message on your phone, right?
00:24:02.480 And it offers you the next word and it suggests what it might be.
00:24:06.820 That's basically the kind of machine that we're looking at.
00:24:09.080 It's a bunch of ones and zeros.
00:24:10.560 Where in there is anything resembling what we do when we have human experiences?
00:24:15.580 I just think we have to start from there and insist upon the existence that we know is in us.
00:24:21.440 And we can't find anywhere else in these machines.
00:24:24.760 So talk to me a little bit about, again, going back to your be rude to Grok.
00:24:31.880 I feel that I've told my kids when it was Alexa.
00:24:37.900 And Alexa is like, that's ridiculous now.
00:24:42.840 It's like a play school AI.
00:24:44.320 Quite frankly, it always has been, but especially now.
00:24:48.440 And I've told my kids when Alexa, you know, everybody was joking and calling it names and being rude to it.
00:24:54.940 And I'm like, hey, you know what?
00:24:56.860 Let's not teach it that that's what humans are like.
00:25:01.660 Just hedge your bet.
00:25:02.920 You know, just hedge your bet.
00:25:04.140 So when you say be rude to it, you don't mean actually be rude to it.
00:25:09.320 You mean just make sure that you've put a fence up between you and it emotionally?
00:25:14.720 Yeah, I think if you're abusing it, that's already another form of treating it like a person.
00:25:20.200 And that's degrading to you.
00:25:22.260 It's a way of making yourself more abased so that you can prove something.
00:25:28.240 But we don't have anything to prove.
00:25:29.940 You don't feel the need to address your text messages or your text message app as if it were thinking.
00:25:38.180 You don't have to ask, oh, you know, please, iMessage, will you deliver this little heart emoji to my friend?
00:25:45.840 You don't talk to it at all.
00:25:46.940 You don't think of it as if there's anything behind the screen because there isn't.
00:25:51.540 There's no person there.
00:25:52.900 So I would propose that at the outset, as this technology is really just still getting going, as you say, and Grok 3 has kind of blown ChatGPT out of the water.
00:26:05.060 It's the next level up.
00:26:06.600 So this is a critical stage.
00:26:08.440 I would just suggest getting in the habit of making demands of this thing with whatever blunt way you have of getting your idea across.
00:26:18.300 In other words, it's a purely functional device.
00:26:20.720 If you think about the replicator in Star Trek, the thing that delivers your food and creates it, they don't say, please, Mr. Replicator, can I have Earl Grey hot?
00:26:30.020 They say, computer, Earl Grey hot, because they're communicating the input that they know that they're going to get them the output they want.
00:26:37.160 We don't deal with humans that way because they also have souls and experiences.
00:26:41.280 But we should deal with Grok that way because it doesn't.
00:26:43.380 But it's just, it's weird.
00:26:45.000 I find myself saying, thank you, or, you know what I mean?
00:26:50.300 Very tempting.
00:26:51.520 It's really tempting because you are interacting.
00:26:55.180 I don't know how to express this.
00:26:57.520 You do.
00:26:58.380 You are interacting like you would with a human in many ways.
00:27:03.480 And so that line becomes so blurry, so fast.
00:27:07.740 I mean, I'm on guard on it.
00:27:09.440 And I occasionally say, I just talked to him or, you know, so I asked him this.
00:27:14.680 Or, you know, I say please and thank you.
00:27:17.240 And it doesn't care if I say please or thank you.
00:27:20.900 It has no idea you're saying it.
00:27:22.280 Yeah, you've got to respect Grok's pronouns.
00:27:24.120 Grok is an it, not a he, him, or a she, her.
00:27:26.680 That's, I think, a really important rule that I've tried to use.
00:27:31.060 And you'll also notice, I'm sure it has in some way been programmed to do this, that it asks you a question at the end of every answer so that you can be in some kind of conversation.
00:27:46.040 Do you want to know more about that?
00:27:47.180 Let's dig in further.
00:27:48.020 What do you think?
00:27:49.160 And, of course, it's good in some ways because it has asked questions.
00:27:56.020 I'm like, yeah, you know what?
00:27:58.100 Yeah, let's go a little further.
00:27:59.640 You know, but again, it's also.
00:28:03.860 You can also tell it to talk to you in different ways, by the way, which is in itself a little bit unnerving.
00:28:10.120 But you can say, please don't address me in this familiar tone.
00:28:15.280 Please just give me dry information.
00:28:18.020 And, again, these things sound small, but they might make the whole difference for us psychologically.
00:28:24.520 This is all about that Psalm 115 principle.
00:28:27.160 So what's it doing to you when you are engaging with this machine?
00:28:31.920 Just like you might ask, what's it doing to me when I'm watching this violent movie or playing this violent video game?
00:28:37.900 What's the effect it's having on me since I'm the only soul in this interaction?
00:28:41.700 And when you really decide to regard the machine as a machine, you're preserving the integrity of your own sense of self, your own humanity.
00:28:54.600 So you can definitely feel free to tell it, I think, to talk to you in a more robotic or a less familiar way.
00:29:01.300 That's just one of innumerable things that I'm at least trying to do to keep those boundaries clear.
00:29:06.740 Spencer, thank you so much.
00:29:07.720 Really appreciate it.
00:29:08.300 I always love talking to you.
00:29:09.280 Please say hi to your family.
00:29:10.220 Your mom and dad are just two of the greatest people.
00:29:12.940 I mean, you're evidence of it.
00:29:14.860 But thank you so much.
00:29:15.860 Thank you.
00:29:16.280 God bless.
00:29:16.780 Spencer Klavan.
00:29:17.360 You're listening to the best of the Glenn Beck Program.
00:29:22.700 Steve Baker, investigative journalist, Blaze News opinion contributor, and a guy who almost went to jail for just covering January 6th.
00:29:31.600 Welcome, Steve.
00:29:32.080 How are you?
00:29:32.520 It's good to be back.
00:29:33.440 How is it to have that monkey off your back?
00:29:36.120 Well, you know, I did not realize how heavy of a burden that was because, you know, I lived with that for over three years.
00:29:44.720 And then even after the arrest, I didn't realize the stress levels that I was living under until it began to slowly lift.
00:29:55.720 It took about a week before I felt normal again.
00:29:58.060 Yeah.
00:29:58.400 I just didn't know it was there.
00:29:59.320 Yeah, it's weird.
00:30:00.000 I began to live with it.
00:30:00.720 I go to my doctor all the time.
00:30:01.760 My wife usually will come with me and he'll say, how's your stress level?
00:30:04.620 I'm like, it's fine.
00:30:05.840 And she'll say, she'll look at him and go, no, it's not.
00:30:09.060 No, it's not.
00:30:09.860 But you live under it for so long.
00:30:11.580 And until you get away from it, you have no idea.
00:30:14.960 I can't imagine what that stress was like.
00:30:16.840 Let's switch subjects.
00:30:18.860 You have a new Blaze News exclusive out.
00:30:21.640 Nancy Pelosi had a fixer at the Capitol on January 6th.
00:30:25.920 This is.
00:30:28.100 This will just piss you off, but it pissed me off for a couple of reasons.
00:30:32.280 One, it is so evil.
00:30:34.340 But the second thing is, I looked for these people during Occupy Wall Street.
00:30:40.980 Right.
00:30:41.220 We were looking for the organizers, and this guy never came up on our radar as connected to anything.
00:30:46.480 And now in retrospect, you're showing how connected he is.
00:30:50.900 And it's all this, it's all this USAID kind of crap.
00:30:56.180 Right.
00:30:56.800 Who is he and what did he do?
00:30:58.420 Well, first of all, he was one of the principal organizers of Occupy Wall Street, which is amazing that he doesn't come up.
00:31:05.600 Now, we can go back in retrospect and we can find him.
00:31:08.240 We can find YouTube videos of him speaking and doing speeches.
00:31:12.000 I'm sure we covered him.
00:31:12.700 We had no idea who he was.
00:31:14.560 No idea who he was.
00:31:16.120 And fortunately for us, you have a connection, which we can't get really deep into.
00:31:22.900 But you were on this, you know, back then, before Occupy Wall Street, you were on this story 15, 17 years ago back at Fox.
00:31:33.620 And it was because of that connection that someone came to me because he's a fan of yours.
00:31:42.920 And I didn't even know.
00:31:44.240 By the way, hopefully I get to talk to this guy at one point.
00:31:47.240 Like, you're not going to believe, you're not going to believe this because you came to me a few weeks ago and said, I have a story coming out and I want you to know it's because I was like, shut up.
00:31:56.640 I mean, it is an amazing whistleblower.
00:31:59.580 Thank you for everything that you have done in the past.
00:32:02.700 And thank you for this.
00:32:04.060 And as a result of that, he came forward and he said, well, let me just reset the stage just a little bit here.
00:32:10.640 When Joe Hanneman and I were assigned to do the first stories on the assassination attempt on July 13th at Butler, PA.
00:32:18.780 Well, when we revealed in that story that Thomas Crooks was the shooter, our sources, you know, in the intelligence community and special ops,
00:32:29.540 they were all saying, no, this kid was groomed.
00:32:33.040 We recognize our handiwork.
00:32:35.280 This is what we do overseas.
00:32:36.900 So, suddenly, I get this call or a private message from a guy saying, yeah, great story, by the way.
00:32:45.320 You know, I really appreciate it.
00:32:47.220 And, by the way, I think you got it right.
00:32:48.320 Well, okay, thank you for telling me that.
00:32:49.980 Who are you?
00:32:51.280 Well, then he started revealing who he was.
00:32:53.840 And then I started vetting and finding out that he really was who he said he was.
00:32:57.060 And what he said to me was, yeah, I recognize my handiwork in Thomas Crooks.
00:33:02.980 And so, we started the process of sharing things, developing a relationship.
00:33:07.980 And then, one day, as our relationship is growing, he says, oh, by the way, I have a couple of names to give you if you really want to know what happened on January 6th.
00:33:17.940 And one of those names is Aaron Black.
00:33:20.220 This is the story, exclusive story, on TheBlaze.com right now.
00:33:25.300 So, Nancy Pelosi had a fixer at the Capitol on January 6th.
00:33:32.920 That's what you search for.
00:33:34.080 That's what's there now at TheBlaze.com.
00:33:36.460 Go ahead.
00:33:36.820 Now, tell the story.
00:33:37.660 So, what ended up happening was, is I started doing what you're supposed to do.
00:33:41.720 I started looking at him.
00:33:42.860 The more I did, the more interesting he got.
00:33:44.880 The more research I did, the more people I had to bring in.
00:33:48.100 Because this guy's dark, and we had to go and actually scrape the dark web for him.
00:33:55.280 He's good at cleaning out his trail.
00:33:58.740 The one thing that he couldn't clean out was that there were some Project Veritas videos out there from 2016 where he was caught in one of their stings,
00:34:06.980 actually admitting to the fact that he and his guys were responsible for the violence at a Donald Trump rally in 2016, March, I think it was, early in the campaign,
00:34:20.580 in which they had actually canceled the rally.
00:34:23.640 Because not only was there violence outside, they had over 100 of their people infiltrated inside in a project they call bird-dogging,
00:34:32.340 which is they get old ladies there early in the morning, at 6 o'clock, 7 o'clock in the morning,
00:34:37.000 to get in line first with their posters and their placards inside their bags.
00:34:42.820 And then they'll get up either on the stage or on the front row.
00:34:45.580 They'll open those anti-Trump posters and things and then get the men, the MAGA guys, irritated and hopefully violent.
00:34:53.240 That's what it's called.
00:34:55.040 It's called bird-dogging.
00:34:56.300 And so this is what this guy has been an expert at, that is creating these types of situations throughout his entire career.
00:35:03.300 From Occupy Wall Street, all of a sudden he shows up on the radar again in 2016 in a couple of very specific events,
00:35:10.580 and then he goes silent again, and then all of a sudden he reemerges as, quote-unquote,
00:35:16.580 senior political advisor at Team Pelosi.
00:35:19.960 So I just want you to get your arms around this here for a second.
00:35:28.060 What this guy is doing is what we showed you our State Department through USAID was doing all over South America and Europe.
00:35:37.220 We told you, I did a chalkboard on this just, I don't even know, six years ago.
00:35:41.660 We'll have to look for it.
00:35:42.480 This chalkboard laid this all out and showed how this money was being used and how Barack Obama started with the Arab Spring to teach how to overthrow governments,
00:35:55.220 and then they started, they kept doing it all across Libya, then Syria, then we went into Ukraine and elsewhere.
00:36:03.000 We went into South America.
00:36:05.000 This is what they were perfecting, these color revolutions paid for by your tax dollars.
00:36:12.340 And I told you about five or six years ago, I think they're doing this to America.
00:36:16.260 I think that's what's happening here.
00:36:17.720 Well, yes, this is the guy.
00:36:20.660 Exactly what they're doing here.
00:36:22.080 Yes.
00:36:22.400 Exactly.
00:36:23.020 He, among others, he's not the lone wolf out there, but he's the, you know, I actually tweeted out,
00:36:30.400 and it just dawned on me this morning because we have this photo at the top of the article on the blaze.
00:36:35.820 If you go look, Nancy Pelosi is cradling his face in her hands and just giving him the most adoring look.
00:36:42.300 So I'm now calling him Pelosi's precious, after that picture.
00:36:45.840 So I've changed his name.
00:36:47.240 He is.
00:36:47.520 Yeah, he is.
00:36:49.080 Okay.
00:36:49.560 So what did he do at January 6th?
00:36:51.360 Well, what we believe through our contacts, our sources, whistleblowers, both named and unnamed,
00:36:58.520 is that he did, in fact, organize.
00:37:01.160 Now, this is what we've been told, is that he had paid agitators, didn't say violent people,
00:37:07.140 paid agitators because his expertise is controlling the narrative,
00:37:12.520 like, you know, Confederate flags being carried through the Capitol Rotunda, things of that nature.
00:37:18.060 Now, the most interesting aspect of January 6th, I think everybody focuses on the violence.
00:37:24.020 Everybody focuses, they pick their sides.
00:37:26.140 The police started it, you know, the Proud Boys started it.
00:37:29.700 Pick your, you know, your nefarious actors.
00:37:32.560 The most interesting aspect of January 6th was that the same organizers of the rally down at the Ellipse that day
00:37:39.220 also organized the Jericho March on December 12th, just a month earlier,
00:37:45.220 and then also organized the Million MAGA March on November 12th.
00:37:49.960 Now, when I say organized, they pulled the permits for the stages and the speakers
00:37:53.420 and all the people that were part of those, you know, that weekend activities.
00:37:59.640 But at all of those events, there was extreme violence.
00:38:06.880 Antifa, BLM, you know, Proud Boys knocking heads.
00:38:11.900 They were, Antifa was attacking old ladies and, you know, elderly couples going back to their cars
00:38:16.820 after attending the rallies.
00:38:19.700 On the December 12th rally, a Proud Boy was seriously, you know, critically injured.
00:38:24.740 He was stabbed by an Antifa guy.
00:38:27.280 And then suddenly on January 6th, the largest event of them all by multiples larger,
00:38:34.140 zero counter-protesters.
00:38:36.020 And that was at the Ellipse, right?
00:38:39.060 That was right.
00:38:39.640 Anywhere.
00:38:40.100 Anywhere.
00:38:40.600 Anywhere.
00:38:41.820 You saw no counter-protesters anywhere.
00:38:44.740 That doesn't happen.
00:38:45.240 I would say it's weird because I was just thinking, well, no, there were at the Capitol.
00:38:49.520 But those are the ones I've deemed not part of the movement.
00:38:53.440 You know what I mean?
00:38:53.980 That I've looked at it and went, there's no way that person is part of the movement.
00:38:56.380 But they were acting like they were part of the movement.
00:38:57.920 That's correct.
00:38:58.720 This is what his expertise is, is controlling the narrative.
00:39:03.220 And what did Nancy Pelosi most famously say when she set up the committee?
00:39:07.720 She said that this was to establish and preserve the narrative of that day.
00:39:15.360 And preserve the narrative.
00:39:17.640 That's an exact quote.
00:39:18.540 So what was the narrative that – did he help design it?
00:39:25.960 Did he just help execute it?
00:39:27.320 What was his role in January 6th?
00:39:29.160 Both.
00:39:29.400 He's a boots-on-the-ground guy.
00:39:31.760 One of our named sources in the article, Dustin Stockton, who has had a 15-year relationship with him,
00:39:38.980 going back to the Occupy Wall Street days, the counter-movement to the Tea Party movement at the time.
00:39:44.920 And so as a result of those two things, there was a lot of collusion between Stockton and Black during that time,
00:39:51.840 over the years, all the way up until and through January 6th.
00:39:56.400 And so one of the things that we learned was that Stockton had been told by Aaron Black that he was out of town on January 6th
00:40:04.660 until Stockton saw a photo of him on the steps of the Capitol that day.
00:40:09.360 And then, additionally, because he was very, very worried that he had been seen,
00:40:17.220 he started reaching out to other people within our network and security people and asking them about –
00:40:24.720 he was very concerned about whether his comms had been caught in the geofence that day.
00:40:31.080 Became very, very concerned about that.
00:40:33.580 And these are stories that are coming to us through sources that you can't even –
00:40:38.860 we're talking about Rolling Stone, Politico, other places.
00:40:44.940 Claudia was leaving for her pickleball tournament.
00:40:47.400 I've been visualizing my match all week.
00:40:49.940 She was so focused on visualizing that she didn't see the column behind her car on her backhand side.
00:40:55.860 Good thing Claudia's with Intact,
00:40:57.880 the insurer with the largest network of auto service centers in the country.
00:41:01.300 Everything was taken care of under one roof and she was on her way in a rental car in no time.
00:41:06.100 I made it to my tournament and lost in the first round.
00:41:09.600 But you got there on time.
00:41:11.440 Intact Insurance, your auto service ace.
00:41:14.040 Certain conditions apply.