The Glenn Beck Program - May 30, 2025


Best of the Program | Guest: Zachary Levi | 5/30/25


Episode Stats

Length

44 minutes

Words per Minute

170.74

Word Count

7,556

Sentence Count

542

Misogynist Sentences

3

Hate Speech Sentences

8


Summary

On today's Friday podcast, Washington State has gone so far, they are so crazy, even Mao would have to be impressed. This is what happens when you adopt the attitude that the ends justify the means. It's a really important part of the program today. Also, Zachary Levi, the actor and the guy who's starting his own studio in Austin, Texas, talks to me about AI and the new Google AI that will change the industry. And a thank you to Elon Musk as his fight against the swamp comes to an end.


Transcript

00:00:00.000 Bank more encores when you switch to a Scotiabank banking package.
00:00:07.080 Learn more at scotiabank.com slash banking packages.
00:00:10.920 Conditions apply.
00:00:12.680 Scotiabank, you're richer than you think.
00:00:14.960 On today's Friday podcast, Washington State has gotten so far.
00:00:20.380 They are so crazy.
00:00:22.460 Even Mao has to be impressed.
00:00:24.980 This is what happens when you adopt the attitude of the ends justify the means.
00:00:29.240 It's a really important part of the program today.
00:00:31.780 Also, Zachary Levi, the actor, and the guy who's starting his own studios in Austin, Wildwood Studios,
00:00:39.760 he's talking to me about AI and the new Google AI that will, it's going to change the industry.
00:00:49.840 I mean, is there a reason we will have photographers and actors and everything else?
00:00:54.320 Just what has been released this week from Google has changed things dramatically.
00:01:00.360 And a thank you to Elon Musk as his fight against the swamp comes to an end.
00:01:05.920 Don't miss a second of today's podcast.
00:01:07.700 Have you ever watched a dog try to hide a limb that's hurting?
00:01:13.660 It'll limp for a second, then it'll snap to attention like, nope, nope, totally fine.
00:01:17.160 Watch me, I'm going to run around in circles.
00:01:20.140 I do things like that sometimes, pretending my body isn't falling apart.
00:01:24.380 You know, and you have zero, you have zero to say about it, really zero, zero stars.
00:01:28.960 I don't recommend it.
00:01:30.400 It's not a good thing when it happens.
00:01:32.480 My wife made me try Relief Factor.
00:01:35.420 And when I say she made me, she made me.
00:01:37.420 She said she won't listen to me whine anymore unless I try everything.
00:01:41.720 And I'm like, Relief Factor's not going to work.
00:01:43.780 It's all natural.
00:01:44.780 I mean, Dow Chemical is not even around, you know, better living through pharmaceuticals, baby.
00:01:50.960 And she said, no, I don't think that's actually true, Glenn.
00:01:55.360 So you should try something all natural.
00:01:57.420 And I did, and I can't tell you the difference it made in my life.
00:02:03.620 It totally changed my life.
00:02:06.060 It might change your life.
00:02:07.340 Just try it for three weeks.
00:02:08.820 Get the quick start trial for three weeks and see if your aches and pains and your problems
00:02:13.540 kind of disappear over time.
00:02:15.260 $19.95, it's relieffactor.com.
00:02:18.320 Relieffactor.com.
00:02:19.660 1-800, the number four, Relief.
00:02:23.500 Hello, America.
00:02:24.740 You know we've been fighting every single day.
00:02:26.500 We push back against the lies, the censorship, the nonsense of the mainstream media that they're
00:02:31.900 trying to feed you.
00:02:32.880 We work tirelessly to bring you the unfiltered truth because you deserve it.
00:02:37.800 But to keep this fight going, we need you.
00:02:40.320 Right now, would you take a moment and rate and review the Glenn Beck podcast?
00:02:44.020 Give us five stars and leave a comment because every single review helps us break through
00:02:48.540 Big Tech's algorithm to reach more Americans who need to hear the truth.
00:02:52.820 This isn't a podcast.
00:02:53.720 This is a movement and you're part of it, a big part of it.
00:02:57.640 So if you believe in what we're doing, you want more people to wake up, help us push this
00:03:01.360 podcast to the top.
00:03:02.780 Rate, review, share.
00:03:04.340 Together, we'll make a difference.
00:03:06.460 And thanks for standing with us.
00:03:07.700 Now let's get to work.
00:03:08.680 You're listening to the best of the Glenn Beck program.
00:03:20.960 You know, for Democrats, if you don't think you're playing with communism or socialism,
00:03:25.600 talk to the people in Washington state.
00:03:27.820 Talk to anyone who is sane in Washington state.
00:03:31.100 I'll give you his number, but they are going to full-fledged communism, Marxism.
00:03:38.600 You have every giant corporation now moving out of the Seattle area in Washington state
00:03:46.180 because they're going to, I'm telling you, they're going to go to wealth confiscation.
00:03:49.940 They're going to do it.
00:03:50.700 And there is a, uh, a place Lake Washington is, you know, by Bellevue and between Bellevue
00:03:57.160 and, and, uh, Seattle.
00:03:58.700 And it is beautiful.
00:04:00.560 It is just the most beautiful place you've ever seen.
00:04:04.400 And this is where Bill Gates and everybody else.
00:04:06.600 And when I was a kid, it was, it was not like that.
00:04:09.220 It was, you know, there were still normal people that live there.
00:04:12.360 Um, and, uh, now you can't even get close to it.
00:04:16.380 And there's this place in the middle of the lake and it's called Hunts Point.
00:04:20.820 And it is where, you know, these are 60 to a hundred million dollar houses.
00:04:26.820 Uh, and, and they're not necessarily fancy.
00:04:29.800 Uh, they just happen to be in an area where there's not very much land and it is the place
00:04:36.040 to live if you like water, uh, and you're living right on the water and it's, it's just
00:04:40.640 spectacular.
00:04:41.360 It used to be that when a place would go up for sale, even when I was a kid on Hunts
00:04:47.680 Point, it would never last.
00:04:49.840 Okay.
00:04:50.460 You'd never, they'd never come up for sale.
00:04:52.660 People that would, they wouldn't want to sell it cause you couldn't replace it.
00:04:56.080 You couldn't get anything like it.
00:04:57.960 And so, uh, they would, they would come up for sale and they'd be gone before anybody
00:05:02.180 would even know.
00:05:02.900 So I am told by a friend who, uh, knows that area quite well that I think he said 17 homes
in Hunts Point are up for sale, 17.
00:05:17.860 And some of them have been up for sale now for over a year and there are no buyers.
00:05:23.720 All of these people are trying to get out of Washington state and nobody's buying their
00:05:28.640 home because nobody's going to. Are you going to buy that?
00:05:33.320 Hey, rich person, where are you going to move from?
00:05:35.840 You're going to move to Washington.
00:05:38.100 No, Washington, the property values are going to start plummeting and you've got crazy people.
00:05:46.300 Not only crazy people are all around you.
00:05:48.300 I'm telling you, I grew up in Washington state.
00:05:50.420 I grew up listening to, um, you know, hippies and everything else.
00:05:55.500 My, you know, my friends and I remember going to a friend's house and we were standing on
00:06:00.300 our front porch and, uh, you know, we were, this is the Alex P Keaton days and not politically,
00:06:08.020 but just, I mean, I guess a little politically, but my friends, not all my friends, you know,
00:06:13.160 agreed with Reagan, but we didn't talk politics.
00:06:15.360 It was just, you know, we weren't hippies.
00:06:17.240 And I remember standing on a front porch and my friend was going to open up her front door.
00:06:21.420 She had her hand on the doorknob and she, before she opened it, she turned to me and
00:06:26.500 she said, I really apologize.
00:06:27.760 My folks are probably in the living room getting stoned.
00:06:30.540 Just nevermind.
00:06:32.180 We were the adults and we opened up the door and I'm like, I get it.
00:06:36.740 And so open up the door and there they are getting stoned in the, and they're like,
00:06:39.740 Hey kids, what's going on?
00:06:41.780 Um, I mean, that's where I grew up.
00:06:43.840 Okay.
00:06:44.960 Um, and it, it was crazy back then.
00:06:47.320 And there's these people that believe in this thing called Cascadia, which is a communist
00:06:52.700 state.
00:06:53.800 Just get out of America, start a new communist country called Cascadia.
00:06:59.980 And it is, uh, Washington, Oregon.
00:07:03.060 And I think they want parts of Idaho.
00:07:05.300 Thank God Idaho hasn't gone nuts yet.
00:07:07.640 Um, but, uh, that's, what's coming.
00:07:11.400 That's what they want.
00:07:13.560 And you see people like, you know, the mayor of Seattle, do you see what happened in Seattle?
00:07:17.300 over the weekend, Stu, where there was this, yeah, yeah, go ahead.
00:07:21.700 A little bit.
00:07:22.460 You're talking about the mayor and this, this accusation going, no, no, no, no, no, no, no.
00:07:28.040 The Christian, the, the, uh, the Christians that had a revival out in a park and all of
00:07:34.520 these revolutionaries came, uh, they were threatening them.
00:07:37.940 The police came and shut down the Christians and they detained the Christians against
00:07:43.020 their will, I think.
00:07:44.320 Um, but under the direction of the mayor, shut down the Christians, excused all the radical
00:07:49.860 revolutionaries and said, you know, it's the Christians here that are causing all the
00:07:53.380 ruckus.
00:07:53.900 I mean, it was, it's crazy what's going on.
00:07:56.200 Well, now what you were talking about is the scandal that's going on with Bruce Harrell.
00:08:00.580 He's the mayor of Seattle.
00:08:03.240 Who, who is the mayor of Seattle?
00:08:05.380 Who is this guy?
00:08:06.880 Okay.
00:08:07.340 Well, he's just like you.
00:08:09.040 Uh, well, I mean, just like you, if you had been arrested in 96 for brandishing a firearm
00:08:15.740 over a parking space, uh, in 1996, this, this has been out for a while.
00:08:23.600 Um, he was a young attorney and he was, had just been appointed to the, the housing authority
00:08:29.660 board in, I think, Council Bluffs, Iowa.
00:08:33.760 And he was at a casino and he was pulling up to a parking space and this other couple
00:08:38.440 in their family, it's a husband, wife, a mom, and somebody else.
00:08:41.620 They pull up and they pull into the parking space and he gets pissed off and they say
00:08:47.680 he pointed a gun at them and they were afraid for their lives.
00:08:52.280 Um, he admitted at the time to say, yeah, I, I, I had my gun, but I wasn't pointing it
00:08:59.020 at him.
00:08:59.400 What are you doing?
00:09:00.020 Just showing it to them.
00:09:01.160 Hey, I just, I'm so proud of my gun.
00:09:03.220 I just want you to see that's called brandishing a firearm.
00:09:05.780 Okay.
00:09:05.920 You can't do that.
00:09:06.580 It's against the law.
00:09:07.600 Well, the charges, you know, fall apart or whatever.
00:09:10.180 And so he's, he's charged with it, but he does, he's not convicted of it.
00:09:14.060 Nobody says anything.
00:09:15.340 Well, it comes up again recently.
00:09:17.240 And now he's saying, no, that I didn't have a gun.
00:09:19.980 They mistook my watch for a gun.
00:09:26.020 Now I, I, I am, I'm a, I'm a watch guy.
00:09:31.640 I'm a watch collector.
00:09:32.720 I, I, I like watches and I have some big watches, but I've, I've, I've never had anyone
00:09:40.560 at any airport or on the street go, oh my gosh, you've got a gun strapped to your wrist.
00:09:47.520 No, I, it's never happened.
00:09:49.600 Has that ever happened to you, Stu, where you're like, that guy's got a gun on his wrist.
00:09:53.400 And it, and you realize, no, it's just actually a watch.
00:09:56.080 A watch?
00:09:57.860 Well, you're talking about the watch gun.
00:09:59.460 Yeah.
00:09:59.740 I mean, I try not to wear that at night because people do make that mistake.
00:10:04.340 Yeah.
00:10:04.700 I mean, besides the watch gun, you know what I mean?
00:10:07.460 No.
00:10:07.840 Besides that watch.
00:10:09.140 Okay.
00:10:09.480 So that's his excuse.
00:10:11.040 Now when it's brought up, he's like, no, I didn't have one.
00:10:13.240 He said at the time he did, but he wasn't pointing it at him.
00:10:16.820 Now he says, no, they mistook my watch for a gun.
00:10:19.980 The ends justify the means.
00:10:23.900 I mean, if you're going to, if you're going to elect radicals, if you're going to elect
00:10:28.080 people that don't, you know, they just don't care about the law, the constitution, you know,
00:10:33.220 they don't care.
00:10:34.200 But well, you know, oh, well.
00:10:37.380 People just say anything now.
00:10:39.260 There's no, there's no even attempt to, to come up with stories that even sound real.
00:10:46.200 No.
00:10:46.600 Because, because basically like, and if you think about it, there's some pragmatic sense
00:10:52.500 to it in, in our current day, which is like, in reality, like what's going to happen is
00:10:58.660 the people who already liked you are going to support you no matter what you say.
00:11:02.440 And, uh, I, I just, they just, you might as well just say something and everyone's going
00:11:08.720 to like nod along and say, well, yes, I like his other policies or I want him to, to succeed.
00:11:13.860 So therefore I believe his gun watch story.
00:11:16.760 Right.
00:11:17.520 Who is the guy?
00:11:18.600 Oh, is it Jussie Smollett?
00:11:20.680 Yeah.
00:11:21.080 Jussie Smollett.
00:11:21.580 Was that his name?
00:11:22.160 Yeah.
00:11:22.640 He's still saying that it was, you know, he was targeted.
00:11:25.920 Still saying.
00:11:26.580 Yeah.
00:11:27.300 Yeah.
00:11:27.660 There's no consequence for any of this.
00:11:29.500 No, it's true.
00:11:30.360 I mean, we, we, we talked a little bit off the air a few minutes ago about, this is the
00:11:33.660 sort of conversation we have, which is the WNBA and the situation with Caitlin Clark and
00:11:37.880 Angel Reese, where, you know, again, these are two basketball players, one white, one
00:11:42.160 black, uh, the white player fouled the black player.
00:11:44.900 There's some sort of rivalry that seems to be basically one way from Angel Reese toward
00:11:49.480 Caitlin Clark.
00:11:50.820 Uh, Caitlin Clark walks away.
00:11:52.760 Angel Reese freaks out.
00:11:54.480 Um, you know, the team, you know, Angel Reese's team lost by like 30 points in the game.
00:11:59.560 Afterwards, she's doing her press conferences.
00:12:01.540 And of course, as you 100% can just fill in the blank, if you know, if you know nothing
00:12:06.660 about the story, claim that there was racism.
00:12:09.040 That was the reason why all of this happened.
00:12:10.920 And there were people in the stands yelling racial slurs at her.
00:12:14.520 She says this to a press conference.
00:12:16.060 It's a major controversy.
00:12:17.880 Everybody's talking about it.
00:12:19.080 They're batting it back and forth.
00:12:20.760 What does this mean?
00:12:21.700 Can you believe this full investigation launched?
00:12:24.180 Blah, blah, blah, blah, blah, blah, blah.
00:12:25.200 Now, remember, this is not, this doesn't happen in the woods.
00:12:28.040 This doesn't happen like under a bridge, you know, you know, in Madagascar somewhere.
00:12:34.820 This happens in an arena where it's being televised.
00:12:40.320 So of course there are hundreds of fans around the area where this was supposedly going to
00:12:45.220 happen.
00:12:45.560 There were dozens of employees around this area.
00:12:48.320 There were cameras and microphones everywhere.
00:12:51.220 Of course they do the investigation.
00:12:52.960 Of course, no one can find any evidence that this happened at all.
00:12:58.080 No one can find one example of this occurring.
00:13:01.540 And then the end of the story is not a massive controversy about how this player could be
00:13:06.980 falsely accusing all of these people that are fans of the other team of being racist
00:13:12.580 and, and, and manufacturing claims of, of racial slurs.
00:13:17.620 No, no.
00:13:18.000 The story is a two paragraph statement from the WNBA.
00:13:22.240 Hey, we looked into it.
00:13:23.120 Couldn't find anything.
00:13:23.820 No, no follow-ups from, no follow-ups from any of the journalists who were concerned about
00:13:28.700 it at the time.
00:13:29.240 They just, it just, we just all move on.
00:13:31.660 And, and why no follow-up on the investigation of how that began?
00:13:36.420 Who started those charges?
00:13:38.020 What are they going to do?
00:13:39.320 Are they going to pay a price for starting those charges?
00:13:41.720 Shouldn't they?
00:13:42.120 And didn't I see that, that very player sitting on the bench talking about white girls?
00:13:47.660 Oh yeah.
00:13:48.060 Well, that's the, in a derogatory way.
00:13:49.540 Yeah.
00:13:49.720 It's hard to see many conversations without that phrase used by that particular player.
00:13:56.040 I mean, well, what about the racism there?
00:13:59.220 I mean, it's just, it, it doesn't, it doesn't seem to matter anymore.
00:14:03.300 The truth doesn't matter anymore.
00:14:05.600 You know, I, I, uh, somebody said to me the other day, have you seen that, uh, Donald Trump
00:14:10.280 is now saying that if you're working for the government, you have to go through, I think
00:14:13.520 it's a hundred hour class on the constitution.
00:14:16.620 And somebody said, well, wait a minute.
00:14:18.400 I, I don't want that.
00:14:19.620 I don't want that.
00:14:20.160 Cause I don't want them doing that with DEI.
00:14:21.860 No, no, no, no, no, no, no, no, no, um, the constitution is the owner's manual.
00:14:30.580 And right now we have a bunch of people that are trying to put our country together and
00:14:35.920 they've never read the instructions.
00:14:38.760 And you know, it's like, it's, it's like our country came from Ikea and you know, I can
00:14:45.020 build this.
00:14:45.420 I'll just put it together.
00:14:46.240 And it's being built upside down.
00:14:48.640 The legs are in the wrong place.
00:14:50.240 It's never, and you've got like 47 screws left over at the end.
00:14:54.720 Okay.
00:14:55.720 Read the instructions.
00:14:58.580 Okay.
00:14:59.060 They're not in Swedish.
00:15:00.940 They're in English.
00:15:02.360 Read the instructions.
00:15:04.200 This, this I think is one of the best things that the president has done so far.
00:15:08.980 You want to work in the, you want to work in the administration.
00:15:11.500 You want to work for the government.
00:15:12.560 Good.
00:15:12.840 You got to go take a course on the constitution of the United States because that's the
00:15:17.600 owner's manual and there's no excuse.
00:15:20.480 Oh, I didn't know that that was in the Constitution.
00:15:22.200 I didn't know we couldn't do that.
00:15:23.560 Even though, they're not even saying that. They're now saying, 200 people,
00:15:28.620 Democrats, are now saying, yeah, I knew that he was, I knew that he was gone, but, uh, you
00:15:33.800 know, we, we couldn't lose the election.
00:15:36.160 And, uh, wow.
00:15:38.660 Yeah.
00:15:38.780 I think you need a refresher on the constitution, uh, cause, uh, none of that is part of, uh,
00:15:44.520 our country.
00:15:45.200 No, no, none of that.
00:15:46.240 There's no place in the constitution that allows that.
00:15:48.780 Yeah.
00:15:49.240 One of the ways, uh, it's interesting.
00:15:50.760 They talk about that in the book of those decisions being made, right?
00:15:54.100 Why would you hide this from the American people?
00:15:58.620 Like, how could you justify that?
00:16:00.780 And, you know, it is exactly what you're saying.
00:16:04.100 The ends justify the means.
00:16:05.240 Right.
00:16:05.620 And they said that one of the reasons why, especially the really close group around the
00:16:11.280 Bidens, including the family and some of these advisors basically said, number one, Donald
00:16:15.920 Trump is, you know, basically Hitler, right?
00:16:18.880 Like, he's an existential threat and he's the worst thing that could ever happen to us.
00:16:22.580 So we have to do anything to beat him.
00:16:24.080 And the people really close to Biden believed the only person who could beat him was Joe Biden.
00:16:28.680 Now that part's another part.
00:16:30.080 That's another level of delusion, I suppose.
00:16:32.460 To think that Joe Biden was uniquely qualified for this victory, but he was the only person
who had actually won an election against him.
00:16:41.660 Right.
00:16:41.880 So you could, there's some, maybe some sense to that.
00:16:44.540 But as I think it was Alex Thompson, one of the authors pointed out, it's like, when
00:16:48.440 you, when you exist, when those two things are true, you can justify anything.
00:16:54.420 Yes.
00:16:55.000 Right.
00:16:55.140 Like if you believe Hitler's about to come into power and the only person who could beat
00:16:59.740 him is this old doddering fool you work with.
00:17:02.060 Well, of course you're going to justify all of this.
00:17:04.300 If you believe that Elon Musk is evil, I can, I can firebomb and terrorize anybody with a
00:17:11.620 Tesla.
00:17:12.080 Yeah.
00:17:12.340 If you believe that global warming is going to wipe the entire earth out, I can kill anyone
00:17:17.880 with an SUV.
00:17:19.100 This is why you can never adopt the ends justify the means, which is Saul Alinsky.
00:17:26.140 That is the motto now of the democratic party.
00:17:32.080 So you're in debt, you're behind the eight ball.
00:17:34.840 You've got high interest credit cards.
00:17:37.160 What do you do?
00:17:39.040 First thing you do, stop avoiding looking at it.
00:17:42.200 Okay.
00:17:42.600 There are mornings when you log into your bank account.
00:17:45.340 I mean, I, who wasn't the comedian that said, you know, I'm just thinking about, you know,
00:17:49.700 letting that thing chill for a while.
00:17:51.160 I I've seen this movie before.
00:17:52.860 I'm just not going to check my bank account.
00:17:55.060 Okay.
00:17:55.980 Um, I, I, I want you to look at things and get somebody that is an expert that can help
00:18:04.600 you.
00:18:04.800 That is not trying to sell you something.
00:18:07.720 Uh, this is American financing.
00:18:10.340 You know, your, your mortgage most likely sold to you by a bank.
00:18:14.740 And by people who got a kickback for selling you that mortgage, that's not the same with
American Financing.
00:18:22.160 They work for you.
00:18:23.380 Call them now, see how they can help you.
00:18:25.400 It's AmericanFinancing.net. AmericanFinancing.net.
00:18:29.240 Now back to the podcast.
00:18:30.980 This is the best of the Glenn Beck program.
00:18:32.680 And don't forget rate us on iTunes.
00:18:36.700 Zachary Levi is with us.
00:18:37.940 Hi Zach.
00:18:38.400 How are you?
00:18:39.640 Hey, good morning, Glenn.
00:18:40.740 I'm doing all right.
00:18:41.340 How are you doing?
00:18:41.860 Um, I'm, I'm, I'm, I'm really good.
00:18:44.440 Uh, I, but I'm not in your business.
00:18:47.020 How, how concerned are you by what Google released this week?
00:18:53.040 I mean, I'm, I'm very concerned.
00:18:55.880 I mean, you know, when I, I, you and I talked about this when I came on your show last, um,
00:19:00.460 and I hate to sound like, you know, a doomer and gloomer.
00:19:07.380 Um, but I, this is something I've been foreseeing for a really long time.
00:19:12.220 I've been banging this drum for a really long time and trying to wake people up and say,
00:19:16.040 Hey, listen, technology, it moves exponentially.
00:19:21.040 This is one of the things that I think most people just don't understand whether it's people
00:19:25.240 in my industry or other industries and, and might I say, yes, this is knocking on the
00:19:30.820 doorstep of entertainment right now, but understand that AI is knocking on the doorstep of all
00:19:37.080 of our industries, your, your industry, radio, you know, everything in entertainment, certainly
00:19:41.820 anything that can be recorded and, and, and, and broadcast, but every industry that we are,
00:19:48.040 I mean, there are, um, huge, you know, experts in, in many fields that say within a year,
00:19:54.680 two years, you certainly within five years, every white collar job will be gone.
00:19:59.380 And a lot of blue collar jobs are going to be right behind that because you have to recognize
00:20:03.360 that AI is not just moving exponentially, but also humanoid robots and the development
00:20:08.900 of humanoid robots is developing exponentially.
00:20:11.980 And exponential growth is something that people just don't understand.
00:20:14.940 And most people see growth as, you know, kind of just, you know, multiplicative meaning
00:20:20.820 like, okay, every year it gets twice as good.
00:20:23.460 No, no, no.
00:20:24.200 It doesn't get twice as good every year.
00:20:25.680 It gets 10 times as good.
00:20:26.800 And then it gets a hundred times as good.
00:20:28.200 And then a thousand times as good and so on and so forth.
00:20:31.340 And so years ago, I was telling people, guys, if what we have right now, you know, like
00:20:36.480 for example, two years ago, uh, AI was generating images and, um, you know, but, but, you
00:20:43.620 know, humans had six fingers.
00:20:45.000 And so people said, ah, this is schlock.
00:20:47.220 Look at this.
00:20:47.720 You know, this is never going to get good.
00:20:49.280 They can't even get the amount of fingers right on people's hands.
00:20:52.260 I said, yeah, yeah.
00:20:52.880 Right now it couldn't right now.
00:20:55.520 It can't do that.
00:20:56.640 But six months later, it did six months after that, you had video.
00:21:00.660 And now you've got video with audio that is almost indiscernible.
00:21:06.140 As you've been seeing with these new examples, it's almost indiscernible.
00:21:09.540 Now people say, yeah, but I can still tell.
00:21:12.300 I go, yeah, right now you can, but six months from now, a year from now, two years from now,
00:21:18.740 we're going to be living in the world.
00:21:20.680 No, probably not even that long.
00:21:22.440 No.
00:21:23.120 And, and, and so people have got to wake up.
00:21:26.360 And so for people in my industry, I think that, yes, we should all be very, very concerned,
00:21:31.780 but everyone should be very concerned.
00:21:33.460 And it's not even just, you know, like, for example, yes, this could very much replace
00:21:38.420 my job.
00:21:38.960 This is partly why I've, I am building Wildwood Studios in Austin, Texas.
00:21:43.860 It, you know, has always been, it's a 25 year plus, you know, calling that God has put
00:21:48.240 on my life to, to create a better Hollywood, to give artists a better life, a better work
00:21:53.740 life balance, to give audiences better content.
00:21:56.720 These are all things that we've deserved for a really long time.
00:21:59.280 And, but AI is really the kind of, I think, most galvanizing, galvanizing force in all
00:22:07.200 of this, because if we don't do something about it, if we don't hold the line, if we
00:22:11.500 don't build the arc, which is really kind of what I've always felt on my life, I felt
00:22:15.900 this kind of Noah calling on my life that God's like, Hey, listen, a flood is coming.
00:22:20.520 It's not going to be water.
00:22:21.520 It's going to be something entirely different.
00:22:23.820 And that is this AI.
00:22:24.780 And if, and if you can build the arc, then you can at least save as many of those jobs
00:22:28.840 two by two as you can.
00:22:30.540 But if you don't build the arc, then the flood just wipes everything out.
00:22:34.300 And so, yeah, go ahead.
00:22:36.320 Let me, let me interrupt you on that because I believe, I mean, I'm developing some things
00:22:42.680 with AI and I've been on this for a very long time as well.
00:22:45.120 Um, and, uh, I believe you're absolutely right that you have to get, you know, you have to
00:22:52.660 get into a boat because floods are coming.
00:22:55.380 Um, however you, you, you have to, you can't dismiss it.
00:23:01.600 You have to, I think, use some of the skills that it has in a positive way.
00:23:08.400 Cause I think it could, it, it will enhance as long as you don't surrender to it, it will
00:23:14.200 enhance what you can do.
00:23:16.300 So are you talking about, you know, building something that has no use for AI and it's just
00:23:22.680 this Island, or are you saying that we'll use it, but we'll use it in ethical ways and
00:23:27.540 we'll never allow it, uh, to become the master.
00:23:31.740 We will always use it as a tool.
00:23:34.440 Yes, that, that, that's exactly right.
00:23:37.180 So I'm a firm believer and have been for many years that, you know, philosophically, you
00:23:42.640 cannot stop progress.
00:23:44.060 You can only hope to guide it that, that is the bottom line, right?
00:23:47.620 Right.
00:23:47.920 So it would be folly to look at new technology that, by the way, is going to do some really
00:23:53.060 cool things in this world.
00:23:54.700 Example being, we're at the brink of nearly having, in our AirPods, you know, Apple, I think
00:24:00.700 will start, but other companies will be right behind it.
00:24:03.000 If not simultaneously, you will have real, real time language translation.
00:24:08.840 It's going to happen.
00:24:09.700 It's happening very, very soon.
00:24:11.280 Now that's incredible.
00:24:12.980 That's something that as a human race, we've all been wanting really since, I mean, since
00:24:16.520 the, I guess the Tower of Babel, right?
00:24:18.520 The ability for all of us to be able to communicate across the world, no language barriers whatsoever.
00:24:23.360 That is huge.
00:24:24.200 That's a huge leap forward for mankind.
00:24:26.000 Now, that's going to absolutely displace what is a smaller, let's say, industry of
00:24:32.820 translators, right?
00:24:34.040 It's not, there are many translators in the world, but it's not the biggest industry, let's
00:24:38.100 say.
00:24:38.460 And I feel for those people.
00:24:39.600 And I think we have to be very conscious about trying to rehome them in other jobs.
00:24:44.740 But that's it.
00:24:45.440 That's it.
00:24:45.840 You always have to ask yourself, is the juice worth the squeeze?
00:24:48.700 Is it ultimately worth it for the betterment of all of us, right?
00:24:53.120 So I don't think that we can't embrace AI.
00:24:56.140 We must embrace AI, but we must do it in as ethical a way as possible and be mindful of
00:25:02.200 what is it doing?
00:25:03.440 How is it disrupting?
00:25:04.420 And how is it displacing jobs?
00:25:06.580 Because that's the only thing that we can do.
00:25:08.900 Now, when it comes to entertainment, there's going to be all kinds of ways that we can implement
00:25:13.180 AI to make the process more efficient, more enjoyable.
00:25:16.820 And I have every intention of utilizing AI like that.
00:25:20.360 I don't vilify it writ large, but I think that we must be very mindful about how we implement
it while still holding on to human creativity. Human art and entertainment is at the brink.
00:25:36.460 But I also believe, with Wildwood example being, I think that not only is it necessary to prevent,
00:25:45.840 let's say, the extinction of human art and entertainment, but there's also a market opportunity
00:25:49.780 in this because similar to vinyl, for example, you know, once upon a time, all music, we all
00:25:57.300 listen to vinyl records.
00:25:58.240 That's what it was.
00:25:59.080 And then the cassette tape came out and everyone said, oh, well, I don't need vinyl anymore.
00:26:02.660 I'm going to go with these little, you know, rectangular plastic cassette tapes,
00:26:07.540 and I'm going to do that.
00:26:08.260 Great.
00:26:08.560 And then the CD came out.
00:26:09.620 Even more people left vinyl, and then streaming came,
00:26:11.600 and now even more people have left vinyl. But the people that held on, the people that
00:26:16.040 said, you know what?
00:26:17.040 Yes, everyone is going to zig, but I'm going to zag.
00:26:20.220 I'm going to hold on to this.
00:26:21.920 I'm going to keep printing vinyl because I believe that there's something special about
00:26:25.000 it, unique about it.
00:26:26.100 And sure enough, vinyl sales have gone up because people are looking for something that's more
00:26:30.360 human, more tangible, slightly imperfect, with a little crackle, a little, you know,
00:26:36.320 whatever.
00:26:36.660 That's what we intend to do.
00:26:39.740 We intend to hold on. I can't save the entire industry.
00:26:44.240 That's impossible, but I'm going to try and save as many jobs as I can,
00:26:47.620 and in doing so provide audiences the alternative.
00:26:50.860 And I think a lot of people are going to be looking for that alternative.
00:26:54.020 So Zach, I've been on this for a long time, and I have put
00:26:58.360 a lot of thought into it, because my job is, you know, at stake, everybody's job
00:27:04.200 is. And I've always felt, well, there's something special about humans, that we have
00:27:11.680 a different sense to us. But I don't know if you heard, there was a study done of, I
00:27:16.960 think, a hundred thousand songs, and they did, you know, what's called hook testing
00:27:23.360 to see which tested the best. I think it was seven out of the top 10 that were AI, and
00:27:31.900 people didn't know it was AI. Seven out of the top 10.
00:27:34.960 We used to say AI couldn't do art, that art could never be done by a machine.
00:27:40.440 So what is it that you think is going to be unique? Quickly.
00:27:47.380 I mean, I believe that there is going to be a huge draw back to handmade, individual work.
00:27:54.500 You know, when machines came out and you had factories and they started producing
00:28:00.320 shirts, nobody wanted a homemade shirt.
00:28:02.640 Nobody wanted a handmade shirt.
00:28:04.220 They wanted one that was from the factory, but now handmade is the best of the best.
00:28:10.520 So there's going to be a renaissance, if you will, of handmade and human-made stuff.
00:28:17.380 But what is it right now that will bridge this gap that humans can do that you don't
00:28:23.560 think AI can do?
00:28:26.800 Well, I think that, you know, obviously live performance, that's going to be huge, right?
00:28:32.580 So in this rebound effect, when everything is flooded
00:28:39.040 with ubiquitous AI content,
00:28:40.780 a lot of people are going to say, oh, I want something authentic, right?
00:28:43.340 And authenticity is the most important thing.
00:28:46.800 And in fact, there have been studies done where, you know, just on an energy level,
00:28:53.220 as humans, we produce an energy when we have various
00:28:59.280 emotions, right?
00:29:00.240 And there are lower energies
00:29:01.320 if you're sad, depressed, or angry, and higher energies when you're joyful and
00:29:06.900 happy and you feel love. But there's an energy even higher than love, as they've tested.
00:29:11.440 And it's authenticity.
00:29:12.640 That is the highest energetic level that we can all reach.
00:29:16.040 And so people yearn for that.
00:29:17.620 They really do.
00:29:19.140 Yes.
00:29:19.580 So live performance obviously is going to be that. Sports is going to be big, too. A
00:29:24.800 lot of people are, you know, investing in sports and live performance, because
00:29:30.780 that is going to hold on the longest, at least until, let's say,
00:29:37.220 robots and holograms start to kind of eat into that market a little
00:29:41.480 bit.
00:29:41.740 We'll see how long that goes, but ultimately I have to tell you...
00:29:47.080 May I say something on that?
00:29:47.620 Have you been to London and seen the ABBA experience?
00:29:51.240 I haven't, but I'm very well aware of it, and I've heard it's incredible. And that's
00:29:56.400 just the tip of the iceberg.
00:29:58.060 Yeah.
00:29:58.240 No, it's beyond incredible.
00:30:00.480 I took my son and my daughter.
00:30:02.160 And I didn't tell my daughter, who was a teenager at the time, you know, 17 years
00:30:06.120 old, that ABBA wasn't really performing.
00:30:08.280 We just didn't tell her.
00:30:09.440 And two songs into it, I said, do you think they're real?
00:30:12.600 Does it look like they're real?
00:30:13.640 And she's like, what are you talking about?
00:30:15.120 And I'm like, that's not real.
00:30:16.480 Those aren't people.
00:30:17.600 And she's like, what are you talking about?
00:30:20.400 And she couldn't believe it.
00:30:21.700 And the first couple of songs, my son, who was probably 17 or 18 at the time, kept looking
00:30:27.100 at me going, dad, this changes everything.
00:30:29.580 This is not good.
00:30:30.880 This changes everything.
00:30:32.020 And I mean, everything is about to just turn upside down.
00:30:38.220 Yeah.
00:30:38.680 Yeah.
00:30:38.980 Well, yeah, it's already happening, right in front of our eyes.
00:30:43.380 And I am not one of those people. A common
00:30:50.940 pushback that I get is people saying, well, it will never be able to fully replicate, let's
00:30:58.860 say, human emotion, or, you know, we'll always be able to tell.
00:31:03.000 And I just don't believe that.
00:31:04.440 I mean, I don't believe that.
00:31:05.560 We ourselves, you know, we are amalgamations of everything that we've taken in, right?
00:31:11.840 We ourselves are kind of LLMs.
00:31:14.920 We, we scrape our entire lives.
00:31:17.580 We scrape information from our parents, our community, people around us, the, you know,
00:31:21.660 the internet, whatever we're learning all the time.
00:31:23.940 And then we replicate from the things that we learn. AI is doing that, and it's doing
00:31:29.260 it at scale, and it's happening exponentially.
00:31:31.680 And we're very, very close to it becoming AGI, artificial general intelligence, which is then just
00:31:37.900 a few steps away from superintelligence.
00:31:39.580 And at that point, it will be more intelligent and more capable than not
00:31:44.560 just any individual human.
00:31:46.040 It will be more capable and more intelligent than the sum of all humanity.
00:31:50.160 So we're stepping into some insane, insane territory.
00:31:54.960 And when you start, you know, powering video agents like Google's and others that
00:32:02.240 will keep popping up, it's terrifying. A lot of people just don't
00:32:09.080 acknowledge that; they're kind of burying their heads in the sand and saying, no, no, no,
00:32:11.860 no, it won't happen.
00:32:12.680 It won't happen.
00:32:13.240 It's going to happen.
00:32:14.320 At that point, what I'm hoping is that Trump and the
00:32:19.880 administration are going to be working in earnest on legislation that at the very
00:32:25.300 least requires all content that is AI-generated to be watermarked, right?
00:32:31.540 So that we can say, okay, I can't tell the difference
00:32:36.720 just by looking and listening; I can't tell if it's real
00:32:41.400 humans doing this or not. The difference will be that there will be some kind of watermark
00:32:46.640 that indicates it.
00:32:47.680 And that is what people are going to be looking for.
00:32:50.020 In the same way, if you go to the supermarket and you're looking at blueberries,
00:32:54.860 the ones on the left look the same as the ones on the right, but there's packaging that
00:32:58.340 says the ones on the right are organic.
00:33:00.340 Oh, those are the ones I'm looking for.
00:33:01.840 I want the organic ones that aren't sprayed with glyphosate.
00:33:04.140 I'm trying to make certified-organic, human-made content from free-range artists.
00:33:10.400 That is what Wildwood Studios is going to be about.
00:33:13.460 And at Wildwood Studios, we're not just going to be really focused on and
00:33:18.700 dedicated to making human-made film, television, music, and video games, but also providing
00:33:24.760 amphitheaters and live performance venues so that it's a one-stop shop.
00:33:28.720 So people can really know that when they go there and support us,
00:33:31.480 they're supporting humans in that process.
00:33:34.240 Zachary, I appreciate it.
00:33:35.960 Thank you so much.
00:33:37.140 And anything we can do to help you at Wildwood, let me know, please.
00:33:40.360 Zachary Levi, Wildwood Studios owner, actor.
00:33:43.640 He was Chuck.
00:33:44.660 He was Shazam.
00:33:45.940 I mean, he's been in a ton of great movies and everything else.
00:33:48.500 So Zachary Levi.
00:33:50.480 You're streaming the best of the Glenn Beck program, and you can find full episodes
00:33:54.460 wherever you download podcasts.
00:33:56.960 So, the Elon factor. Elon Musk is leaving Washington, D.C. today.
00:34:03.380 He is not in defeat.
00:34:07.300 I don't think he's in retreat.
00:34:09.460 We all expected this.
00:34:11.020 In fact, many of us believed that there was no way this was going to end well with him
00:34:18.100 because, you know, he was a strong individual.
00:34:21.560 Trump is a strong individual.
00:34:23.220 How are these two strong individuals going to get along?
00:34:26.120 They got along famously.
00:34:29.260 And we're losing today one of the few individuals in our time that is willing and has been willing
00:34:37.980 to challenge the most sacred assumptions.
00:34:40.840 Here's a guy who, when they were talking about, I think it was the X Prize,
00:34:49.360 about making rockets for NASA, went to NASA early on, and they said,
00:34:56.180 okay, so here's what we want.
00:34:57.600 And he said, great.
00:34:58.260 And what kind of money do you want this to come in at?
00:35:01.280 I mean, what do you want?
00:35:02.000 What's your goal that I could hit that would make it affordable for you?
00:35:07.520 And they said, what are you talking about?
00:35:08.660 He said, what do you want to spend on this?
00:35:10.680 And they said, don't worry about that.
00:35:12.620 Just make it fly.
00:35:13.720 And he's like, well, there's got to be a budget.
00:35:15.280 No, don't worry about it.
00:35:16.100 It's not about budget.
00:35:17.400 You know, it is about budget.
00:35:18.960 And he knew that that was wrong.
00:35:22.100 And it really, really bothered him a great deal.
00:35:25.100 And so here's a guy who comes in, reinvents absolutely everything,
00:35:32.060 and then goes to Washington because he actually believes in something,
00:35:38.160 and he's vilified for it.
00:35:40.600 I mean, I don't know of anybody this vilified who has been
00:35:47.260 so vital to progress, to what humans are experiencing and going through,
00:35:54.260 and to solving huge problems.
00:36:03.220 I don't think I've ever seen anybody do that and be this vilified.
00:36:03.220 Here's a guy who didn't ask for the power.
00:36:05.620 He didn't seek the favor.
00:36:07.060 In fact, when he said, because he believed something,
00:36:10.640 yeah, I think I'm on the other side,
00:36:17.800 they literally tried to kill him for it.
00:36:17.800 And all he was fighting for was the freedom to invent,
00:36:22.220 the freedom to think differently,
00:36:23.500 the freedom to speak your mind,
00:36:25.960 and also the freedom to remain free by not becoming a slave
00:36:31.920 to an out-of-control government and out-of-control waste
00:36:35.900 and out-of-control spending.
00:36:38.160 He wanted just the chance to build something,
00:36:41.180 and he knew America was the place to do it.
00:36:44.220 And built he did.
00:36:45.800 He gave us the first reusable rockets.
00:36:48.380 I mean, think if I had said to you six years ago,
00:36:52.540 yeah, we're going to send up some rockets,
00:36:54.940 and instead of just casting it into the ocean,
00:36:57.360 it's going to reignite, and it's going to come down under control,
00:37:00.940 and we're going to just grab it out of the sky.
00:37:02.620 No.
00:37:03.260 No.
00:37:05.020 Not only did he do that, he thought that crap up.
00:37:11.160 Here's a guy who completely thinks out of the box.
00:37:14.400 American-made, American-launched,
00:37:17.000 restoring capability that we had already given away.
00:37:19.980 He forced the auto industry to evolve,
00:37:23.960 dragging it unwillingly into the 21st century with electric vehicles
00:37:28.380 that shattered the idea that sustainability has to come
00:37:31.720 at the cost of performance or ambition.
00:37:34.420 Here's a guy who, remember, the big three didn't want him around.
00:37:38.580 He had to break that entire system, and look what happened.
00:37:42.380 Then he took a brand-new platform for speech, one that was supposed to free us up
00:37:50.540 and had become oppressive, ossified, monopolized.
00:37:54.900 It became the public square, and what did he do?
00:37:59.580 He went in, bought it with his own money, and was like,
00:38:02.200 this can't stand.
00:38:03.120 We have to have free speech.
00:38:04.600 He cracked it open and gave us, again, raw, uncomfortable at times,
00:38:09.840 but vital free speech.
00:38:12.740 He put it all back into your hands.
00:38:15.040 And now with Grok and AI, he's fighting to ensure that the machines of tomorrow
00:38:19.140 are aligned not with centralized power, but with human liberty.
00:38:26.500 But for all of this, all of this that would earn anyone
00:38:32.720 a chapter in the history books of man,
00:38:38.220 how is he leaving Washington?
00:38:44.040 I mean, think of that.
00:38:45.680 He has endured the public efforts all around the world to ruin him,
00:38:51.640 coordinated efforts to deplatform, to demonetize, to destroy.
00:38:56.420 He's received death threats.
00:38:58.040 His companies have been targeted.
00:39:00.860 His cars have been burned.
00:39:02.320 His employees have been harassed.
00:39:03.800 His customers have been harassed.
00:39:03.800 All the while, he just keeps on doing what he does.
00:39:08.880 And that is, boys and girls, courage.
00:39:13.440 You don't see it very often.
00:39:16.140 That is what real courage looks like.
00:39:19.200 Without getting angry, without being vengeful, spiteful, any of it,
00:39:24.760 he just keeps going.
00:39:27.040 This is real courage.
00:39:30.480 This is the real thing.
00:39:32.180 Real high personal risk, high stakes, sleepless nights, relentless attacks,
00:39:38.040 and the refusal to sit down or break.
00:39:40.440 He's like, no, I believe this is right.
00:39:42.520 That's America.
00:39:44.220 He really is.
00:39:45.700 We have a few great symbols that we didn't have 20 years ago.
00:39:49.580 We have some great symbols of real leaders, real examples of courage and innovation that we didn't have.
00:40:01.520 And he's right up at the top.
00:40:05.780 I mean, history is riddled with people like this.
00:40:09.880 Nikola Tesla is probably one of them.
00:40:13.700 Penniless and mocked, at least in his later years.
00:40:17.240 Galileo was, you know, imprisoned because he was telling the truth too early.
00:40:23.080 Winston Churchill, too, because he was telling the truth too early.
00:40:26.320 I mean, he was cast aside until people realized, oh, the barbarians are at the gates.
00:40:33.260 These are people that saw over the horizon, saw the storms of life, or saw what was capable of being.
00:40:41.880 They came, they spoke up, and they paid dearly for it.
00:40:47.240 Churchill said once, you have enemies?
00:40:54.460 Oh, good.
00:40:56.200 That means you stood for something in your life.
00:40:58.840 Elon Musk stood up again and again, for technological sovereignty, for speech, for enterprise, for the radical, dangerous idea that the individual, not the institution, should shape the future.
00:41:17.280 I think there's going to be a time, and hopefully it's not too far in the future, when the heat has cooled and politics have moved on, when society will acknowledge not only what he's done and what he's given, but the sacrifice that he just went through.
00:41:36.720 But that'll happen, you know, at a time when the real effects of everything, when the future that he is helping shape right now, for better or worse, is really taking root.
00:41:55.840 That's when he'll be recognized, once this nonsense is over.
00:42:00.880 The thing I like about him, he never asked us to trust him, he never asked for our loyalty, but I think he does deserve our respect, you know?
00:42:12.020 I don't care what side of that you're on; I don't care who you voted for.
00:42:16.680 How do you not recognize what this man has done for humanity, especially if you're somebody who believes in global warming? What he's done for humanity, what he is still trying to do, the incredible strides that he has made, and the bravery it has taken for him just to stand up.
00:42:36.840 I remember, when he walked away from his side, he didn't expect his side to leave him, but once he had a different opinion from theirs, they just abandoned him.
00:42:46.600 He lost all of his friends, he lost everything.
00:42:52.880 So today, as he is leaving, I would like to say, Elon Musk, thank you.
00:42:59.100 Thank you.
00:42:59.900 You didn't play the game, you changed the game.
00:43:03.860 Thank you.
00:43:04.500 Thank you for reminding me and so many other Americans that progress has never come in polite little packages.
00:43:20.380 It's never been polite.
00:43:21.960 The truth rarely comes dressed in approval.
00:43:25.100 But I think you did some things that are absolutely remarkable, and you're going to continue to do things that are remarkable.
00:43:35.900 Go in strength.
00:43:37.360 Know that history will catch up to you.
00:43:40.040 You're way ahead of the game.
00:43:42.140 Thank you, Elon Musk.