The Glenn Beck Program - January 07, 2025


Best of the Program | 1/7/25


Episode Stats

Length

38 minutes

Words per Minute

146.2

Word Count

5,698

Sentence Count

423

Misogynist Sentences

2

Hate Speech Sentences

8


Summary

On today's show: Zuck is bringing an end to the fact-checkers in favor of something that looks a lot like X. Also, Trudeau says he's stepping down, but how long is he going to drag out the process? And with AGI and ASI rapidly approaching, what will become of us if we outlive our usefulness to it? Plus the latest breakthroughs.


Transcript

00:00:00.000 This winter, take a trip to Tampa on Porter Airlines.
00:00:05.460 Enjoy the warm Tampa Bay temperatures and warm Porter hospitality on your way there.
00:00:11.420 All Porter fares include beer, wine, and snacks and free, fast-streaming Wi-Fi on planes with no middle seats.
00:00:18.860 And your Tampa Bay vacation includes good times, relaxation, and great Gulf Coast weather.
00:00:25.240 Visit flyporter.com and actually enjoy economy.
00:00:30.000 On today's podcast, Zuck is bringing an end to the Facebook fact-checkers in favor of something that looks an awful lot like X.
00:00:38.360 What's happening there?
00:00:39.640 Also, Trudeau says he's going to step down, but how long is he going to drag out the process?
00:00:44.040 And with AGI and ASI rapidly approaching, what will become of us if we outlive our usefulness to it? And the latest breakthroughs.
00:01:00.000 You're listening to The Best of the Glenn Beck Program.
00:01:14.900 I want to talk to you.
00:01:15.700 Did you see what just happened with Facebook, Stu?
00:01:18.740 And Mark Zuckerberg?
00:01:20.360 Yeah.
00:01:21.640 Yeah?
00:01:22.600 Yeah?
00:01:23.760 Yeah.
00:01:25.120 Fascinating.
00:01:25.560 Yeah, it is.
00:01:27.620 And I'm wondering what is happening here, you know, beyond the headline.
00:01:35.560 Here it is, just so you know.
00:01:37.460 Mark Zuckerberg announced Tuesday morning that content moderation and other restrictions on speech will be lifted across Facebook, Instagram, and other platforms as Donald Trump returns to the White House.
00:01:51.460 Huh.
00:01:51.800 No way.
00:01:52.640 Hold on just a second.
00:01:53.600 What?
00:01:53.680 So, you know, our traffic decline has been as high as 85% to 90% across all Blaze Media pages.
00:02:07.020 So, when I've said, I mean, we've had, and Stu, you probably know the numbers better than I do.
00:02:12.720 We're having more success and a bigger platform, bigger voice than any time in my career in the last three years.
00:02:24.560 The show and everything is just on fire.
00:02:28.520 And yet, our traffic on social media has declined by 85% to 90%.
00:02:36.700 It's just not possible.
00:02:39.800 There's no way other than we've been severely contained, if you will.
00:02:46.240 And, you know, I'd like to ask Mark Zuckerberg, where do I go to get my audience back?
00:02:54.480 Where do I, you know, you've kind of, people have kind of fallen out of the habit.
00:02:59.660 If you get all of your news from Facebook, God forbid, or Instagram, we've been so suppressed.
00:03:07.160 All of a sudden, are we going to pop back up?
00:03:09.000 It'd be interesting to see.
00:03:11.640 By the way, this is why your direct support to Blaze TV means so much.
00:03:16.800 We wouldn't have been able to survive everything if we hadn't built the Blaze and built it the way we did.
00:03:22.820 So, thank you for your subscription.
00:03:24.680 If you haven't subscribed, please do today.
00:03:27.100 So, he's ending all of the content moderation.
00:03:33.420 Now, what he did before was to go to places that, you know, they're absolute experts.
00:03:39.740 You know, like the Southern Poverty Law Center.
00:03:42.320 They know what's going on.
00:03:43.840 The Poynter Institute.
00:03:45.600 They know what's true.
00:03:48.440 And they've decided that they are going to go back to their roots, I'm quoting, and focus on reducing mistakes.
00:03:56.840 Simplifying our policies and restoring free expression on our platforms.
00:04:00.800 More specifically, we're going to get rid of fact-checkers and replace them with community notes similar to X.
00:04:09.240 Now, hang on just a second.
00:04:13.840 Wasn't X the most dangerous platform in the world?
00:04:19.060 Weren't those community notes just not enough?
00:04:22.480 The company's third-party fact-checking program was put into place following Trump's first election to manage content and misinformation on its platforms,
00:04:35.880 which executives conceded was a result of political pressure.
00:04:40.120 But now they say they've just gone too far.
00:04:42.780 We went to independent third-party fact-checkers, says the chief global affairs officer at Meta.
00:04:51.100 It has become clear there's too much political bias in what they choose to fact-check,
00:04:56.260 because basically they get to fact-check whatever they see on the platform.
00:05:00.800 No, that can't be true.
00:05:03.640 No, it's true.
00:05:04.080 No.
00:05:04.660 Too much bias from the fact-checkers?
00:05:06.860 Yes.
00:05:07.260 But they're just checking facts, Glenn.
00:05:09.440 Yes, I know.
00:05:11.080 I know.
00:05:11.940 Isn't that weird?
00:05:13.860 Now, listen to this.
00:05:15.560 I mean, today, I think today's show is kind of based on, no, really?
00:05:26.020 You know?
00:05:27.180 The things that are happening now, wait until next hour.
00:05:30.720 I'm going to tell you a story that is just jaw-dropping in how the world works between you and the elites.
00:05:43.320 Here's a case in this hour.
00:05:45.300 We're talking about Facebook.
00:05:47.100 How does it work between you and the elites?
00:05:49.380 Well, they didn't listen to you.
00:05:51.740 They wanted to shut you up.
00:05:53.440 They went to the elites who were on the winning side last time and said, okay, what do we need to do?
00:06:00.720 What do we need to do to make sure that, you know, we're on your side and we can get all that government money and nobody's going to hassle us?
00:06:10.140 What do we need to do?
00:06:11.660 So they did it.
00:06:12.900 And they went to the elites' selection of fact-checkers.
00:06:18.340 Now that the world has changed, at least here in America, now they're still not listening to you.
00:06:24.460 This isn't because you said something.
00:06:27.340 This is because Donald Trump has changed America.
00:06:32.060 And now they see the writing on the wall.
00:06:34.960 And so, again, it's not you.
00:06:38.140 It's power.
00:06:40.040 And it's disgusting.
00:06:42.440 The company is ending their fact-checkers completely.
00:06:46.980 And it will instead rely on the platform users to flag false or misleading content.
00:06:55.560 Instead of going to some so-called...
00:06:57.880 This is Facebook saying this.
00:07:00.540 Instead of going to some so-called expert, it will rely on the community and the people on the platform to provide their own commentary to something that they've read.
00:07:12.100 This is what freedom of speech is.
00:07:17.980 There's no expert that sits around in your town that checks everything somebody says and then says,
00:07:28.820 Wait a minute, wait a minute, wait a minute.
00:07:30.520 That's untrue.
00:07:32.240 And they have a political bias.
00:07:34.720 It just doesn't happen in real life.
00:07:37.020 But, again, let's remember that social media is not real life.
00:07:43.400 But at least it's starting...
00:07:45.240 Maybe it will start reflecting it a little bit more where you have the freedom to say what you think.
00:07:52.300 Can we pause on that one point for one quick second?
00:07:55.680 Sure.
00:07:56.060 Because Zuckerberg and Elon Musk have a rivalry, right?
00:08:00.340 Like, remember they were going to have a fight.
00:08:02.100 They were going to have like a cage match a year or two ago.
00:08:04.680 So there's a rivalry here for him to come out and say this.
00:08:09.360 He said this on video, Zuckerberg, saying that they would go to community...
00:08:13.900 Not just go to community notes, not just get rid of their fact checkers, but go to the community notes.
00:08:19.240 And as he said, similar to the way X does it.
00:08:22.900 He actually admitted that basically, like, we tried something, they tried something, theirs is better, we're going with theirs.
00:08:29.740 It's like a tech bro, you know, federalism.
00:08:33.800 That's a good thing and I think a tough admission for a guy like Mark Zuckerberg.
00:08:41.120 I mean, I'm with you in that I think they've run this so poorly.
00:08:45.520 And they have taken companies and content companies and given them this impression that they could advertise to people, gain followers, and then get their content distributed.
00:08:56.800 And then pulled the rug out from underneath them years ago and destroyed dozens and dozens of websites and companies because of it.
00:09:06.820 And that being said, this does kind of seem like a good change.
00:09:11.400 Like, I don't know if it's just Glenn, them kissing Trump's butt and realizing if Trump comes in, he's going to be a different kind of president and they're in a different environment and they better change or they're going to get, you know, hammered.
00:09:27.800 Or if it's a real change.
00:09:29.440 But either way, I think it's a positive one.
00:09:31.460 Well, you know me, I always look for the best in people, honestly.
00:09:37.100 You do?
00:09:37.580 I am kind of a poor judge of character because of that.
00:09:41.900 Because I see people for who I think they could be maybe at times.
00:09:50.480 And I kind of look at it like, I think that's who they really are going to be.
00:09:54.400 And they usually disappoint because people don't become the people they could be most times.
00:10:01.780 Instead, they settle for what they are or what they've allowed themselves to become because they don't have a true center of truth.
00:10:09.460 They don't know who they are and how they relate to all eternal truths.
00:10:15.480 And so they get lost really easily.
00:10:17.480 But when I sat with Zuckerberg, this is more in line with the Zuckerberg that I sat with.
00:10:24.400 However, you know, I was, I think I was, what did we decide, Stu?
00:10:30.160 Greatly conned by Zuckerberg.
00:10:35.060 Yeah, I don't know.
00:10:36.820 I mean, we went back and forth on it, I think.
00:10:39.480 Yeah, we did.
00:10:40.360 Because I do think there's a part of him that would like to be clear of all of this.
00:10:46.320 Like, I think he has other, you know, large goals in his life other than navigating every political thing that pops up.
00:10:57.060 So I am so glad you said that.
00:11:00.660 What are his big goals?
00:11:03.220 What does he really want to do?
00:11:05.920 Do you know?
00:11:07.200 What is he focusing on?
00:11:10.700 Hmm.
00:11:11.220 I mean, I, he has gone through several phases, right?
00:11:14.680 The company started going towards the metaverse, right?
00:11:17.760 They changed the company to meta a couple of years ago.
00:11:23.300 Uh-huh.
00:11:24.460 That, that was a...
00:11:25.660 That tells you everything you need to know.
00:11:27.220 Okay.
00:11:27.560 It tells you everything to know.
00:11:29.840 Meta is all about virtual reality, correct?
00:11:32.860 Mm-hmm.
00:11:33.260 So virtual reality.
00:11:34.460 Guess who's invested billions of dollars in VR.
00:11:43.080 The United States Army, Navy, and Air Force.
00:11:48.280 All going to virtual reality.
00:11:50.600 Mm-hmm.
00:11:51.140 Okay?
00:11:53.040 Meta has lost about $50 billion in its reality labs division.
00:11:58.660 Augmented reality, virtual reality, and the metaverse.
00:12:01.500 So, what, so what, what's happening here?
00:12:07.940 I think what this is partially, I mean, I want to give him the benefit of the doubt to some degree,
00:12:13.940 but I think partially what this is about is making sure that the government contracts don't stop with meta.
00:12:22.400 Make sure that they are able to get some of that money from the United States in the use of VR,
00:12:31.440 because that's where he really, where his heart is.
00:12:35.460 That's what he really wants to do.
00:12:38.680 Now, they lost $50 billion in their reality labs.
00:12:42.280 However, if you look at Facebook's revenue, they're subsidizing all of this stuff.
00:12:48.600 Facebook, the revenue was expected to go up to $100 billion in 2024.
00:12:57.840 Facebook's advertising revenue is now expected to grow to over $127 billion by 2027.
00:13:05.960 So, that's the cash cow, but where his heart is, is VR and AR, and he wants to make sure he's not off the government teat.
00:13:20.200 Otherwise, his, his real passion is gone.
00:13:24.160 I think that's what's happening.
00:13:25.980 Speculation, but I think that's what's happening.
00:13:28.400 It's got to be part of it, right?
00:13:29.820 It's got to be part of it.
00:13:31.780 Yeah.
00:13:32.040 You know, but I wonder and hope that it is more than that.
00:13:36.300 I mean, I, because I had the same reaction to Elon Musk when he started having this transition of real skepticism.
00:13:42.100 I didn't buy it.
00:13:42.800 I mean, the guy's been like the biggest climate activist in America.
00:13:46.460 Why would we believe all of a sudden he's coming around to these sorts of ideas?
00:13:50.160 Does seem now that that's pretty legitimate from Elon Musk.
00:13:54.120 Could it be legitimate from Zuckerberg?
00:13:56.160 Zuckerberg, remember, he's done, he did some of this stuff before the election.
00:14:00.460 I know.
00:14:00.800 He did signal this stuff.
00:14:02.560 He did testimony.
00:14:04.160 He, he outed some of the government intrusions before the election happened.
00:14:09.000 He called Trump a badass after the, after the assassination attempt.
00:14:13.020 Correct.
00:14:13.620 He said that was his turning point, was the assassination attempt.
00:14:18.020 He, he said that's when he realized, oh, this guy is really a badass.
00:14:21.780 This guy is actually, you know, what he says he is to some degree, at least according to Mark Zuckerberg.
00:14:29.320 You're listening to the best of the Glenn Beck program.
00:14:32.040 Let me just ask you, Stu, you know, the president has talked about the Panama Canal, which I think he's very serious about.
00:14:45.640 If he could take it back, I think he would.
00:14:48.080 Um, but I, I don't think he can legally.
00:14:51.300 So I don't know what he's doing with Panama, but the Panama Canal, that's going to be a big thing.
00:14:57.420 In my view, could be wrong.
00:14:59.800 Uh, it's going to be a, it's going to be a, an important thing, uh, to him and his administration.
00:15:04.860 But he, again, has brought up Greenland and buying Greenland, which I'm all for, quite honestly.
00:15:12.340 I mean, how much could it be, honestly?
00:15:14.400 What is it on Zillow?
00:15:15.400 Do you know?
00:15:16.540 What's the, I don't, I don't, I don't.
00:15:19.160 There's no comparables.
00:15:20.760 Yeah.
00:15:21.360 Um, you know, the way that we waste money, that would be the best thing we could do besides buying like gold or Bitcoin.
00:15:30.680 Uh, that would be the best thing is buy land, uh, and, uh, it would be great.
00:15:36.560 And Greenland is rich in resources.
00:15:40.300 Um, you know, I, I don't know.
00:15:43.000 I mean, I know they want to get away from Denmark.
00:15:44.960 Is it Denmark or just Mark?
00:15:48.700 It's like Mark's the guy who owns Greenland.
00:15:51.760 I don't know.
00:15:52.480 I think it's Denmark that they belong to and they don't like it.
00:15:56.260 And, uh, they're told exactly how to live their lives from Denmark.
00:15:59.900 And they're like, we're closer to America than Denmark.
00:16:03.660 However, now listen, if this doesn't sound like negotiation, however, we are not for sale.
00:16:09.220 I mean, we don't want to be in bed with Denmark anymore.
00:16:13.080 Uh, you know, and we are so rich with resources right now.
00:16:17.620 I just, I mean, we're really not for sale, but Hey, we love Donald Trump in America.
00:16:22.520 Look out, you open that negotiation door and, uh, Donald Trump's going to walk through and
00:16:29.600 then, I mean, Denmark, uh, well, maybe Denmark too. Greenland will be ours.
00:16:34.360 If you open yourself to negotiation.
00:16:36.900 Um, but he's also talking about Canada being the 51st state.
00:16:40.380 And I've thought that was because of Trudeau, you know, just making him into a governor.
00:16:46.880 But isn't there any chance he really would love to have Canada as a 51st state?
00:16:53.200 Well, yeah.
00:16:53.880 I mean, I mean, there are some arguments as to why not, but I think you start with why
00:16:58.960 not?
00:16:59.300 If you could do it, sure.
00:17:01.120 It would be great because of all the resources they have and, and all of that, though there
00:17:05.760 would be a lot of negatives, you'd be importing a lot of socialists, uh, into our voting, French,
00:17:11.600 a lot of, you'd have all the French.
00:17:13.220 Yeah.
00:17:13.380 You'd have a couple of new senators, right.
00:17:15.460 That would come in.
00:17:16.300 That would be Democrats.
00:17:17.120 That would make things a little more difficult, at least in the short term.
00:17:20.940 Um, yeah, but there's a lot of, but you'd also get rid of the CBC.
00:17:24.240 And so, well, you'd be replacing with ABC, but ABC looks like the blaze compared to the
00:17:29.420 CBC.
00:17:30.040 Yes.
00:17:30.520 So, you know, anything would be an improvement, but then you also get all that socialized
00:17:35.740 medicine.
00:17:36.420 Anyway, um, Canada is in so much trouble.
00:17:40.440 Canada is in the worst shape they've ever been in, at least in my lifetime.
00:17:44.820 Uh, and it's all because of, uh, Justin Trudeau, who is just horrible.
00:17:49.800 I mean, he's an egomaniac.
00:17:52.280 He really is.
00:17:52.860 Did you hear his quote resignation speech yesterday?
00:17:57.920 It was great.
00:17:58.960 Could we make it about, could we make it about him anymore?
00:18:01.760 And just like, in fact, Sarah, we have a part of the speech.
00:18:05.560 Can we play part of his speech over the holidays?
00:18:08.760 Okay.
00:18:09.840 I've also had a chance to reflect and have had long talks with my family about our future.
00:18:15.500 Hey, stop for a second.
00:18:16.760 Stop for a second.
00:18:17.640 Hold on just a second.
00:18:18.220 I've had time to reflect on what's right for Canada.
00:18:22.240 Uh, and I didn't, I didn't spend any time talking about my 16% approval rating.
00:18:28.440 It didn't even come up.
00:18:29.820 Didn't even come up.
00:18:30.640 But anyway, go ahead.
00:18:32.160 Throughout the course of my career, any success I have personally achieved has been because
00:18:37.960 of their support and with your encouragement.
00:18:41.820 Yes.
00:18:42.640 So last night over dinner, I told my kids about the decision that I'm sharing with you today.
00:18:48.560 Okay.
00:18:50.480 I intend to resign as party leader, as prime minister, after the party selects its next leader
00:18:58.780 through a robust, nationwide, competitive process.
00:19:03.520 Oh, so much like his dad, Fidel.
00:19:06.460 Um, so he, he says here, I'm going to resign.
00:19:11.440 I'm going to resign.
00:19:12.300 Now, what does that mean in Canada?
00:19:15.420 Well, he's going to do that after there's a robust effort to find a new leader for the
00:19:21.680 liberal party.
00:19:22.600 Why?
00:19:23.160 Because currently the conservative party is beating them by 24 points.
00:19:29.560 So there's no way they're going to, there's absolutely no way they're going to win at this
00:19:35.420 point.
00:19:35.900 So not only does he say it's going to have to happen, you know, after March 1st.
00:19:41.940 Okay.
00:19:42.980 So let's see.
00:19:43.760 We have January, February, March 1st.
00:19:45.720 Okay.
00:19:45.900 So two months to get your crap together.
00:19:48.160 But not only did he say, I'm going to resign sometime after March, um, he has done something
00:19:57.220 that I think we would call this martial law maybe, or, uh, I mean, he suspended the parliament.
00:20:08.860 It's called prorogued there in Canada.
00:20:12.660 I've prorogued the, uh, parliament just until March.
00:20:17.460 Uh, and that's it.
00:20:18.700 So what does prorogued mean?
00:20:21.360 It means shutting it down.
00:20:23.560 Pastries with the, uh, with the potato and cheese.
00:20:27.060 No, that's a pierogi.
00:20:27.620 Yeah, that's a pierogi.
00:20:29.740 Prorogued, not so delicious.
00:20:32.160 Uh, prorogued means you've done everything but dissolve the parliament.
00:20:39.280 You've shut it down entirely.
00:20:41.000 They cannot, there's no reason to go to work.
00:20:44.240 They're in the middle of, you know, what parliament and Congress does, which I can't explain to
00:20:50.820 you, but they do a lot of stuff.
00:20:52.820 They're debating and, uh, parliament is starting to go the opposite direction.
00:21:00.520 So he's, can you imagine our president?
00:21:02.640 If Donald Trump just came in and said, you know what?
00:21:05.420 It's too critical of a time right now.
00:21:07.500 I'm going to, I'm going to be leaving office, you know, in four years, but for the next
00:21:12.780 three years, I'm just suspending Congress.
00:21:15.960 Can you imagine that?
00:21:17.400 No, of course not.
00:21:18.660 Because it's dumb.
00:21:19.440 That's what he, that's what he just did.
00:21:22.160 It shouldn't be available as an option to the president or the prime minister to just
00:21:27.060 stop Congress or the parliament.
00:21:29.060 No.
00:21:29.480 It's a stupid system.
00:21:30.720 No.
00:21:31.520 Okay.
00:21:32.000 So the reason why he did that is not only just to stay in power and keep his policies
00:21:39.040 exactly where they are.
00:21:40.980 Um, but also, uh, he did that so they can't have a no confidence vote.
00:21:47.180 Like today, if he would have said, I'm going to resign, the conservatives could have stood
00:21:51.940 up and said, Hey, let's have a vote of no confidence.
00:21:54.780 And they might've gotten that through.
00:21:57.320 So that means he just would be removed.
00:21:59.800 And that would mean that his party would lose to the apple eating guy.
00:22:05.300 Now I have to tell you, uh, Pierre something or other, some French name.
00:22:12.980 I know him as the apple eating guy.
00:22:15.640 And that might sound like it's, uh, you know, uh, no, not necessarily the right thing to do
00:22:23.960 to call the next prime minister, the apple eating guy, but this is how I know him.
00:22:29.980 Uh, he did an interview with, I think the CBC and there's a left-wing journalist and he's
00:22:39.020 in Vancouver, BC, and he's eating an apple and to make sure the reporter knew exactly how
00:22:46.840 little he thought of them and their questions.
00:22:49.640 He answered while still eating the apple, his body kind of half turned, not really even,
00:22:55.320 not even really recognizing this guy fully, uh, in case you've never heard it or seen
00:23:01.460 it, here's that clip on the, on the topic.
00:23:05.480 I mean, in terms of your sort of strategy currently, you're obviously taking the populist, uh, pathway.
00:23:11.580 Um, what does that mean?
00:23:12.800 Well, appealing, appealing to people's, uh, more emotional levels.
00:23:18.480 I would guess.
00:23:19.220 Um, I mean, certainly, certainly you, certainly you tap, certainly you tap, uh, very strong
00:23:24.120 ideological language quite frequently.
00:23:26.540 Like what?
00:23:28.040 Uh, left wing, you know, this and that right wing.
00:23:31.780 They, you know, I mean, it's that, that type of ideological stick.
00:23:34.580 I never really talk about left or right.
00:23:35.920 Anyways, a lot of people.
00:23:36.700 I don't really believe in that.
00:23:37.580 Okay.
00:23:38.280 A lot of people would, would say that you're simply taking a page out of the Donald Trump,
00:23:43.200 uh, book.
00:23:43.980 Like which people would say that?
00:23:45.200 Well, I'm sure a great many Canadians, but I don't know who, but well, you're the one
00:23:53.060 who asked the question.
00:23:54.040 So you must know somebody.
00:23:55.940 Okay.
00:23:56.940 I'm sure there's some out there, but anyways, the point of this, the point of this question
00:24:00.700 is, I mean, why should, why should Canadians trust you with their vote given, you know,
00:24:08.520 not, not just the sort of ideological inclination in terms of taking the page out of Donald Trump's
00:24:13.700 book, but also.
00:24:14.740 Talking about what page, what page, can you give me a page?
00:24:17.020 Give me the page.
00:24:18.280 You keep saying that.
00:24:19.200 In terms, in terms of turning things quite dramatically in terms of, of Trudeau and, and
00:24:23.660 the left wing and all of this.
00:24:25.140 I mean, you, you, you make quite a, you know, it's, it's quite a play that you make on it.
00:24:29.260 So I'm, I'm not sure.
00:24:31.000 I don't know.
00:24:31.460 I don't know what your question is.
00:24:32.900 Then forget that.
00:24:33.660 Why shouldn't Canadians trust you with their vote?
00:24:37.380 Common sense.
00:24:38.840 Common sense for, for a change.
00:24:42.000 We're going to make common sense common in this country.
00:24:45.460 We don't have any common sense in the current government.
00:24:49.720 This is so good on the topic.
00:24:52.060 I mean, in terms of your, just the crunching of the apple in the middle of the question.
00:24:56.100 It is so good.
00:24:57.200 It is so satisfying.
00:24:59.380 Here's what he said about Israel bombing Iran.
00:25:02.560 Listen to this.
00:25:03.560 Yesterday, you said that you endorse Israel proactively defending itself by hitting Iran's
00:25:08.460 nuclear sites, which is something that President Joe Biden does not endorse.
00:25:12.560 Do you not feel like this could lead to a likelihood of an all out conventional war between
00:25:16.980 Iran and Israel?
00:25:17.740 And are you, do you not agree with Joe Biden and his assessment?
00:25:23.200 I think the idea of allowing a genocidal, theocratic, unstable dictatorship that is desperate
00:25:35.000 to be, to avoid being overthrown by its own people to develop nuclear weapons is about
00:25:41.160 the most dangerous and irresponsible thing that the world could ever allow.
00:25:45.480 And if Israel were to stop that genocidal, theocratic, unstable government from acquiring
00:25:54.740 nuclear weapons, it would be a gift by the Jewish state to humanity.
00:26:02.140 Yes.
00:26:04.800 Quick, prorogue the parliament.
00:26:08.860 Prorogue them.
00:26:09.880 Put them in jail if you have to.
00:26:11.800 I'll resign eventually when I've figured out a way to rig the system.
00:26:16.780 This is the best of the Glenn Beck program.
00:26:23.820 Let me talk to you a little bit about Sam Altman.
00:26:28.320 He is the guy from OpenAI.
00:26:31.780 And yesterday, if you missed it, you should go back to the show and listen to the podcast.
00:26:37.440 Yesterday, our hour number two was on the singularity.
00:26:41.260 This is something that I've been talking about.
00:26:43.880 Stu, I was talking about this, I think, before you even joined the show.
00:26:48.080 This might be the longest running commentary that I have in my career is what's coming with
00:26:56.980 technology and AI, AGI, and ASI.
00:27:03.640 AI is artificial intelligence.
00:27:06.320 General intelligence is what you and I are.
00:27:10.000 That's what we are as humans: we're good at many different things.
00:27:14.020 AGI, artificial general intelligence, is like a human, except it's not just good at things.
00:27:20.480 It masters everything, okay?
00:27:23.180 Right after that is ASI, artificial superintelligence.
00:27:28.820 That's when AI becomes God.
00:27:32.500 It is more powerful than all of the brains alive on the planet today.
00:27:38.680 It is more powerful than any supercomputer.
00:27:43.180 It's God-like, okay?
00:27:45.300 You won't be able to keep up with it.
00:27:47.840 You won't be able to understand it.
00:27:50.240 It is so far beyond humans.
00:27:54.400 We won't be able to, you just do what it says, thinking that it's right, because you don't
00:28:02.480 know what it knows.
00:28:03.760 You don't know how it arrived at that.
00:28:06.440 Or you turn it off.
00:28:08.000 But ASI will not allow you to turn it off.
00:28:10.720 So, AI, we've had.
00:28:14.920 AGI, according to Sam Altman, we are now at, or soon will be.
00:28:21.980 We're at the singularity, which means Moore's law of doubling chip power every two years
00:28:28.660 is now over.
00:28:29.840 It's gone from that slope to a straight line up now.
00:28:33.080 That's the singularity.
00:28:35.480 Progress that is so rapid, you won't be able to keep up with it is now where we're at.
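[Editor's aside, not from the broadcast: the "doubling every two years" pacing described above compounds geometrically. A minimal sketch of that arithmetic, with an illustrative `growth_factor` helper:]

```python
# Illustrative only: compounding growth at a fixed doubling period,
# as in the "chip power doubles every two years" pacing described above.
def growth_factor(years: float, doubling_period: float = 2.0) -> float:
    """Total multiplier after `years` of doubling every `doubling_period` years."""
    return 2.0 ** (years / doubling_period)

# Ten years at a two-year doubling period is five doublings: a 32x gain.
print(growth_factor(10))  # 32.0
```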
00:28:40.720 Um, and AGI, artificial general intelligence, some people didn't think that we would get
00:28:47.340 there.
00:28:48.340 Many, if not most, computer scientists believed we could
00:28:56.260 achieve AGI.
00:28:58.280 Most people did not think we could achieve the singularity until 2050, if at all.
00:29:04.520 And most computer scientists don't think that we'll ever get to ASI, artificial superintelligence,
00:29:12.140 which could mean the end of humanity.
00:29:15.220 Okay.
00:29:17.140 Listen to this.
00:29:18.220 This is from Sam Altman, his blog that just came out.
00:29:22.660 We started OpenAI almost nine years ago because we believed that AGI, artificial general intelligence,
00:29:30.940 was possible and that it could be the most impactful technology in human history.
00:29:36.280 But we wanted to figure out how to build it and make it broadly beneficial.
00:29:40.960 We were excited to make our mark on history.
00:29:44.140 Our ambitions were extraordinarily high.
00:29:46.440 And so was our belief that the work might benefit society in an equally extraordinary
00:29:50.900 way.
00:29:52.280 At the time, very few people cared.
00:29:54.320 And if they did, it was mostly because they thought we had no chance of success.
00:29:59.180 AGI.
00:30:01.220 In 2022, OpenAI was a quiet research lab working on something temporarily called Chat With
00:30:08.800 GPT-3.5.
00:30:11.660 We are much better at research than we are at naming things.
00:30:15.160 We had been watching people use the playground feature of our API and knew that developers
00:30:20.220 would really enjoy talking to the model.
00:30:22.740 We thought building a demo around that experience would show people something important about
00:30:26.700 the future and help us make our models better and safer.
00:30:30.460 We ended up mercifully calling it ChatGPT instead and launched it on November 30th of '22.
00:30:37.300 We always knew abstractly that at some point we'd hit a tipping point and the AI revolution
00:30:44.120 would get kicked off.
00:30:45.400 But we didn't know what the moment would be.
00:30:48.580 To our surprise, it turned out to be this.
00:30:52.160 The launch of ChatGPT kicked off a growth curve unlike anything we have ever seen in our
00:30:58.420 company, our industry, and the world broadly.
00:31:01.340 We are finally seeing some of the massive upside we have always hoped for from AI, and we can
00:31:08.400 see how much more will soon come.
00:31:11.180 It hasn't been easy, the road hasn't been smooth, and the right choices haven't always
00:31:16.360 been obvious.
00:31:18.120 In the last two years, we had to build an entire company almost from scratch around this new
00:31:22.160 technology.
00:31:22.920 Now, he goes on to building the company and the technology, but I want to skip down here.
00:31:28.740 We have done what is easily some of our best research ever.
00:31:37.600 We grew from 100 million weekly active users to more than 300 million.
00:31:42.860 Most of all, we have continued to put technology out into the world that genuinely seems to be
00:31:48.860 loved by people and that solves real problems.
00:31:51.880 We are proud of our track record in research and development so far.
00:31:59.240 We are committed to continuing to advance our thinking on safety and benefits sharing.
00:32:05.980 We continue to believe that the best way to make an AI system safe is by gradually releasing
00:32:12.500 it into the world, giving society time to adapt and co-evolve with the technology, learning
00:32:18.340 from experience and continuing to make the technology safer.
00:32:21.540 We believe in the importance of being world leaders on safety and alignment research and
00:32:28.040 in guiding research with feedback from the real world applications.
00:32:32.100 So, safety and alignment research.
00:32:36.560 What is that?
00:32:37.820 Well, safety, because you saw at the beginning of ChatGPT just the hallucinations that ChatGPT
00:32:46.780 could produce.
00:32:47.740 Also, can it ever lie to us?
00:32:50.160 Can it or will it ever start to look at us as we look at insects?
00:32:57.320 Will it ever start to see that humans are the problem?
00:33:03.120 And the easiest way to solve our problems is to eliminate the humans.
00:33:09.660 So, that's what safety means.
00:33:13.040 And alignment research means keeping the AI, the AGI, and the ASI, to which we will be insects,
00:33:23.600 in check, okay, because it will have no time for us.
00:33:28.040 We will be so far, and I'm using this term clinically, so far
00:33:35.660 beyond retarded compared to ASI that it has no reason to pay attention to us at all.
00:33:46.000 So, alignment means making sure that our goals and its goals remain intact.
00:33:53.560 But how do you do that?
00:33:55.460 How do you build a fence around something?
00:33:59.080 Well, it's like this.
00:34:00.880 Imagine a baby gate, you know, the kind that goes over the stairs.
00:34:07.100 Imagine if somebody said, you know, I got to keep you out of this room.
00:34:11.760 And they put a baby gate up.
00:34:15.100 Is that going to concern you?
00:34:16.920 Are you even going to spend any time worrying about that baby gate?
00:34:21.200 You'll just step over it, okay?
00:34:23.240 It works on babies, but that's all we could do to ASI.
00:34:29.620 Anything that we would want to do, it's so far beneath ASI, they won't even have to worry
00:34:38.060 about it.
00:34:39.060 Okay.
00:34:39.940 We are now confident we know how to build AGI as we have traditionally understood it.
00:34:46.060 This is game-changing.
00:34:48.060 We believe that in 2025, we may see the first AI agents join the workforce and materially change
00:34:57.920 the output of companies.
00:35:01.480 Stu, what is an AI agent?
00:35:03.140 I mean, I don't know that I know exactly.
00:35:07.700 When you talk about AGI and artificial general intelligence, I would think it would be like
00:35:12.680 an assistant.
00:35:13.900 Like, you could have them essentially do any task.
00:35:17.000 Like, you could assign an employee, right?
00:35:19.520 Like, you don't need to program them to do a specific thing.
00:35:23.520 You could say, hey, we need you to answer the phones here.
00:35:26.440 We need you to...
00:35:27.440 I mean, it might not be directly like that, but it's that type of thing that can take a
00:35:31.520 generalized job, a role like that, and do it on its own.
00:35:36.040 Would you, if you were hiring people for a company and you had somebody that doesn't make mistakes
00:35:44.600 and was much smarter than everybody else in the room, would you have them answer the phones?
00:35:51.800 No.
00:35:52.440 You'd have them in a really high-powered position in your company.
00:35:55.840 Exactly right.
00:35:56.540 Right.
00:35:57.120 So, an AI agent. Now, the first ones will be just like that, because remember, they say they're slowly going to roll
00:36:04.140 this out so you get used to it.
00:36:05.480 The first one will be like a secretary, somebody who can take care of things, pay the bills,
00:36:11.020 handle all of this stuff, and we will love it.
00:36:13.960 The first ones to join the workforce and materially change the output of companies will be something,
00:36:22.440 and I'm just imagining this, so please excuse me if you're in this field for my being a baby
00:36:29.380 gate here, but as I imagine it, it would be someone that you would have a virtual conference
00:36:35.440 with that looks like a human, sounds like a human, you can have a conversation with,
00:36:40.260 and you can say, look, can you help us on this?
00:36:42.580 We're trying to figure this out, blah, blah, blah, and they can game-change
00:36:46.840 your approach in your company.
00:36:49.360 That's what he thinks is coming this year.
00:36:54.560 Now, he says, we're beginning to turn our aim beyond that to superintelligence in the
00:37:01.640 true sense of the word.
00:37:03.400 We love our current products, but we are here for the glorious future.
00:37:07.400 With superintelligence, we can do anything else.
00:37:13.220 Superintelligent tools could massively accelerate scientific discovery and innovation well beyond
00:37:19.740 what we're capable of doing on our own, and in turn, massively increase abundance and prosperity.
00:37:26.280 Here's the thing, and I want to get into this tomorrow, massively increasing abundance and prosperity.
00:37:36.760 How?
00:37:38.900 Well, by becoming much more efficient, by spending less to make more.
00:37:46.140 But who gets that money?
00:37:50.020 Where is that abundance?
00:37:51.980 Things will be cheaper, but if the jobs are taken by AI or AGI or ASI, how do you make money?
00:38:02.440 A 30% disruption is coming by 2030, if things play out the way we believe they're going
00:38:10.520 to play out, which is a deep unsettling of jobs and careers and everything else, at least at the beginning.
00:38:21.220 You're looking at a 30% unemployment rate, minimum, by 2030.
00:38:28.180 Now, he goes on to say, this sounds like science fiction right now and somewhat crazy to even talk about.
00:38:34.280 That's all right.
00:38:35.120 We've been here before, and we're okay with being here again.
00:38:37.960 We're pretty confident that in the next few years, everyone will see what we see, and
00:38:43.260 the need to act with great care while still maximizing broad benefit and empowerment is
00:38:48.620 so important.
00:38:49.880 Given the possibilities of our work, OpenAI cannot be a normal company.
00:38:54.820 How lucky and humbling it is to be able to play a role in this work.