Best of the Program | 1/7/25
Episode Stats
Words per Minute
146.16
Summary
On today's show: Zuck is bringing an end to the fact-checkers in favor of something that looks a lot like X. Also, Trudeau says he's stepping down, but how long is he going to drag out the process? And with AGI and ASI rapidly approaching, what will become of us if we outlive our usefulness? Plus the latest breakthroughs.
Transcript
00:00:00.000
This winter, take a trip to Tampa on Porter Airlines.
00:00:05.460
Enjoy the warm Tampa Bay temperatures and warm Porter hospitality on your way there.
00:00:11.420
All Porter fares include beer, wine, and snacks and free, fast-streaming Wi-Fi on planes with no middle seats.
00:00:18.860
And your Tampa Bay vacation includes good times, relaxation, and great Gulf Coast weather.
00:00:25.240
Visit flyporter.com and actually enjoy economy.
00:00:30.000
On today's podcast, Zuck is bringing an end to the Facebook fact-checkers in favor of something that looks an awful lot like X.
00:00:39.640
Also, Trudeau says he's going to step down, but how long is he going to drag out the process?
00:00:44.040
And with AGI and ASI rapidly approaching, what will become of us if we outlive our usefulness? Plus the latest breakthroughs.
00:01:00.000
You're listening to The Best of the Glenn Beck Program.
00:01:15.700
Did you see what just happened with Facebook, Stu?
00:01:27.620
And I'm wondering what is happening here, you know, beyond the headline.
00:01:37.460
Mark Zuckerberg announced Tuesday morning that content moderation and other restrictions on speech will be lifted across Facebook, Instagram, and other platforms as Donald Trump returns to the White House.
00:01:53.680
So, you know, our traffic decline has been as high as 85% to 90% across all Blaze Media pages.
00:02:07.020
So, when I've said, I mean, we've had, and Stu, you probably know the numbers better than I do.
00:02:12.720
In the last three years, we've had more success, a bigger platform, a bigger voice than at any time in my career.
00:02:28.520
And yet, our traffic on social media has declined by 85% to 90%.
00:02:39.800
There's no way other than we've been severely contained, if you will.
00:02:46.240
And, you know, I'd like to ask Mark Zuckerberg, where do I go to get my audience back?
00:02:54.480
Where do I, you know, you've kind of, people have kind of fallen out of the habit.
00:02:59.660
If you get all of your news from Facebook, God forbid, or Instagram, we've been so suppressed.
00:03:11.640
By the way, this is why your direct support to Blaze TV means so much.
00:03:16.800
We wouldn't have been able to survive everything if we hadn't built the Blaze and built it the way we did.
00:03:33.420
Now, what he did before was to go to places that are, you know, absolute experts.
00:03:39.740
You know, like the Southern Poverty Law Center.
00:03:48.440
And they've decided that they are going to go back to their roots, I'm quoting, and focus on reducing mistakes.
00:03:56.840
Simplifying our policies and restoring free expression on our platforms.
00:04:00.800
More specifically, we're going to get rid of fact-checkers and replace them with community notes similar to X.
00:04:13.840
Wasn't X the most dangerous platform in the world?
00:04:22.480
The company's third-party fact-checking program was put into place following Trump's first election to manage content and misinformation on its platforms,
00:04:35.880
which executives conceded was a result of political pressure.
00:04:42.780
We went to independent third-party fact-checkers, says the chief global affairs officer at Meta.
00:04:51.100
It has become clear there's too much political bias in what they choose to fact-check,
00:04:56.260
because basically they get to fact-check whatever they see on the platform.
00:05:15.560
I mean, today, I think today's show is kind of based on, no, really?
00:05:27.180
The things that are happening now, wait until next hour.
00:05:30.720
I'm going to tell you a story that is just jaw-dropping in how the world works between you and the elites.
00:05:53.440
They went to the elites who were on the winning side last time and said, okay, what do we need to do?
00:06:00.720
What do we need to do to make sure that, you know, we're on your side and we can get all that government money and nobody's going to hassle us?
00:06:12.900
And they went to the elites' selection of fact-checkers.
00:06:18.340
Now that the world has changed, at least here in America, now they're still not listening to you.
00:06:27.340
This is because Donald Trump has changed America.
00:06:42.440
The company is ending their fact-checkers completely.
00:06:46.980
And it will instead rely on the platform users to flag false or misleading content.
00:07:00.540
Instead of going to some so-called expert, it will rely on the community and the people on the platform to provide their own commentary to something that they've read.
00:07:17.980
There's no expert that sits around in your town that checks everything somebody says and then says,
00:07:37.020
But, again, let's remember that social media is not real life.
00:07:45.240
Maybe it will start reflecting it a little bit more where you have the freedom to say what you think.
00:07:52.300
Can we pause on that one point for one quick second?
00:07:56.060
Because Zuckerberg and Elon Musk have a rivalry, right?
00:08:00.340
Like, remember they were going to have a fight.
00:08:02.100
They were going to have like a cage match a year or two ago.
00:08:04.680
So there's a rivalry here for him to come out and say this.
00:08:09.360
He said this on video, Zuckerberg, saying that they would go to community...
00:08:13.900
Not just get rid of their fact-checkers, but go to community notes.
00:08:22.900
He actually admitted that basically, like, we tried something, they tried something, theirs is better, we're going with theirs.
00:08:33.800
That's a good thing and I think a tough admission for a guy like Mark Zuckerberg.
00:08:41.120
I mean, I'm with you in that I think they've run this so poorly.
00:08:45.520
And they have taken companies and content companies and given them this impression that they could advertise to people, gain followers, and then get their content distributed.
00:08:56.800
And then pulled the rug out from underneath them years ago and destroyed dozens and dozens of websites and companies because of it.
00:09:06.820
And that being said, this does kind of seem like a good change.
00:09:11.400
Like, I don't know if it's just, Glenn, them kissing Trump's butt and realizing if Trump comes in, he's going to be a different kind of president, and they're in a different environment, and they'd better change or they're going to get, you know, hammered.
00:09:31.460
Well, you know me, I always look for the best in people, honestly.
00:09:37.580
I am kind of a poor judge of character because of that.
00:09:41.900
Because I see people for who I think they could be maybe at times.
00:09:50.480
And I kind of look at it like, I think that's who they really are going to be.
00:09:54.400
And they usually disappoint because people don't become the people they could be most times.
00:10:01.780
Instead, they settle for what they are or what they've allowed themselves to become because they don't have a true center of truth.
00:10:09.460
They don't know who they are and how they relate to all eternal truths.
00:10:17.480
But when I sat with Zuckerberg, this is more in line with the Zuckerberg that I sat with.
00:10:24.400
However, you know, I was, I think I was, what did we decide, Stu?
00:10:40.360
Because I do think there's a part of him that would like to be clear of all of this.
00:10:46.320
Like, I think he has other, you know, large goals in his life other than navigating every political thing that pops up.
00:11:11.220
I mean, I, he has gone through several phases, right?
00:11:14.680
The company started going towards the metaverse, right?
00:11:17.760
They changed the company name to Meta a couple of years ago.
00:11:34.460
Guess who's invested billions of dollars in VR?
00:11:53.040
Meta has lost about $50 billion in its reality labs division.
00:11:58.660
Augmented reality, virtual reality, and the metaverse.
00:12:07.940
I think what this is partially, I mean, I want to give him the benefit of the doubt to some degree,
00:12:13.940
but I think partially what this is about is making sure that the government contracts don't stop with meta.
00:12:22.400
Make sure that they are able to get some of that money from the United States in the use of VR,
00:12:31.440
because that's where he really, where his heart is.
00:12:38.680
Now, they lost $50 billion in their reality labs.
00:12:42.280
However, if you look at Facebook's revenue, they're subsidizing all of this stuff.
00:12:48.600
Facebook's revenue was expected to reach $100 billion in 2024.
00:12:57.840
Facebook's advertising revenue is now expected to grow to over $127 billion by 2027.
00:13:05.960
So, that's the cash cow, but where his heart is, is VR and AR, and he wants to make sure that he's not cut off from the government teat.
00:13:25.980
Speculation, but I think that's what's happening.
00:13:32.040
You know, but I wonder and hope that it is more than that.
00:13:36.300
I mean, because I had the same reaction of real skepticism to Elon Musk when he started having this transition.
00:13:42.800
I mean, the guy's been like the biggest climate activist in America.
00:13:46.460
Why would we believe all of a sudden he's coming around to these sorts of ideas?
00:13:50.160
Does seem now that that's pretty legitimate from Elon Musk.
00:13:56.160
Zuckerberg, remember, he's done, he did some of this stuff before the election.
00:14:04.160
He, he outed some of the government intrusions before the election happened.
00:14:09.000
He called Trump a badass after the, after the assassination attempt.
00:14:13.620
He said that was his turning point, was the assassination attempt.
00:14:18.020
He, he said that's when he realized, oh, this guy is really a badass.
00:14:21.780
This guy is actually, you know, what he says he is to some degree, at least according to Mark Zuckerberg.
00:14:29.320
You're listening to the best of the Glenn Beck program.
00:14:32.040
Let me just ask you, Stu, you know, the president has talked about the Panama Canal, which I think he's very serious about.
00:14:51.300
So I don't know what he's doing with Panama, but the Panama Canal, that's going to be a big thing.
00:14:59.800
Uh, it's going to be an important thing, uh, to him and his administration.
00:15:04.860
But he, again, has brought up Greenland and buying Greenland, which I'm all for, quite honestly.
00:15:21.360
Um, you know, with the way that we waste money, that would be the best thing we could do besides buying, like, gold or Bitcoin.
00:15:30.680
Uh, the best thing would be to buy land, and it would be great.
00:15:43.000
I mean, I know they want to get away from Denmark.
00:15:52.480
I think it's Denmark that they belong to and they don't like it.
00:15:56.260
And, uh, they're told exactly how to live their lives from Denmark.
00:15:59.900
And they're like, we're closer to America than Denmark.
00:16:03.660
Now listen, tell me if this doesn't sound like a negotiation: however, we are not for sale.
00:16:09.220
I mean, we don't want to be in bed with Denmark anymore.
00:16:13.080
Uh, you know, and we are so rich with resources right now.
00:16:17.620
I just, I mean, we're really not for sale, but Hey, we love Donald Trump in America.
00:16:22.520
Look out, you open that negotiation door and, uh, Donald Trump's going to walk through and
00:16:29.600
then, I mean, Denmark, uh, well, maybe Denmark too. Greenland will be ours.
00:16:36.900
Um, but he's also talking about Canada being the 51st state.
00:16:40.380
And I've thought that was because of Trudeau, you know, just making him into a governor.
00:16:46.880
But isn't there any chance he really would love to have Canada as a 51st state?
00:16:53.880
I mean, I mean, there are some arguments as to why not, but I think you start with why
00:17:01.120
It would be great because of all the resources they have and, and all of that, though there
00:17:05.760
would be a lot of negatives: you'd be importing a lot of socialists, uh, into our voting rolls, and the French.
00:17:17.120
That would make things a little more difficult, at least in the short term.
00:17:20.940
Um, yeah, but there's a lot of, but you'd also get rid of the CBC.
00:17:24.240
And so, well, you'd be replacing it with ABC, but ABC looks like the Blaze compared to the CBC.
00:17:30.520
So, you know, anything would be an improvement, but then you also get all that socialism.
00:17:40.440
Canada is in the worst shape they've ever been in, at least in my lifetime.
00:17:44.820
Uh, and it's all because of, uh, Justin Trudeau, who is just horrible.
00:17:52.860
Did you hear his, quote, resignation speech yesterday?
00:17:58.960
Could we make it about, could we make it about him anymore?
00:18:01.760
And just like, in fact, Sarah, we have a part of the speech.
00:18:05.560
Can we play part of his speech over the holidays?
00:18:09.840
I've also had a chance to reflect and have had long talks with my family about our future.
00:18:18.220
I've had time to reflect on what's right for Canada.
00:18:22.240
Uh, and I didn't, I didn't spend any time talking about my 16% approval rating.
00:18:32.160
Throughout the course of my career, any success I have personally achieved has been because
00:18:42.640
So last night over dinner, I told my kids about the decision that I'm sharing with you today.
00:18:50.480
I intend to resign as party leader, as prime minister, after the party selects its next leader
00:18:58.780
through a robust, nationwide, competitive process.
00:19:15.420
Well, he's going to do that after there's a robust effort to find a new leader for the Liberal Party.
00:19:23.160
Because currently the conservative party is beating them by 24 points.
00:19:29.560
So there's no way they're going to, there's absolutely no way they're going to win at this point.
00:19:35.900
So not only does he say it's going to have to happen, you know, after March 1st.
00:19:48.160
But not only did he say, I'm going to resign sometime after March; he has done something
00:19:57.220
that I think we would call martial law, maybe. Or, uh, I mean, he suspended the parliament.
00:20:12.660
I've prorogued the, uh, parliament just until March.
00:20:23.560
Pastries with the, uh, with the potato and cheese.
00:20:32.160
Uh, prorogued means you've done everything but dissolve the parliament.
00:20:44.240
They're in the middle of, you know, what parliament and Congress does, which I can't explain to you.
00:20:52.820
They're debating and, uh, parliament is starting to go the opposite direction.
00:21:02.640
If Donald Trump just came in and said, you know what?
00:21:07.500
I'm going to, I'm going to be leaving office, you know, in four years, but for the next
00:21:22.160
It shouldn't be available as an option to the president or the prime minister to just
00:21:32.000
So the reason why he did that is not only just to stay in power and keep his policies in place.
00:21:40.980
Um, but also, uh, he did that so they can't have a no-confidence vote.
00:21:47.180
Like today, if he would have said, I'm going to resign, the conservatives could have stood
00:21:51.940
up and said, Hey, let's have a vote of no confidence.
00:21:59.800
And that would mean that his party would lose to the apple-eating guy.
00:22:05.300
Now I have to tell you, uh, Pierre something or other, some French name.
00:22:15.640
And that might sound like it's, uh, you know, not necessarily the right thing to do
00:22:23.960
to call the next prime minister the apple-eating guy, but this is how I know him.
00:22:29.980
Uh, he did an interview with, I think the CBC and there's a left-wing journalist and he's
00:22:39.020
in Vancouver, BC, and he's eating an apple, and to make sure the reporter knew exactly how little he cared.
00:22:49.640
He answered while still eating the apple, his body kind of half turned, not really even,
00:22:55.320
not even really acknowledging this guy fully. Uh, in case you've never heard it or seen it:
00:23:05.480
I mean, in terms of your sort of strategy currently, you're obviously taking the populist, uh, pathway.
00:23:12.800
Well, appealing, appealing to people's, uh, more emotional levels.
00:23:19.220
Um, I mean, certainly, certainly you, certainly you tap, certainly you tap, uh, very strong
00:23:28.040
Uh, left wing, you know, this and that right wing.
00:23:31.780
They, you know, I mean, it's that, that type of ideological stick.
00:23:38.280
A lot of people would, would say that you're simply taking a page out of the Donald Trump,
00:23:45.200
Well, I'm sure a great many Canadians, but I don't know who, but well, you're the one
00:23:56.940
I'm sure there's some out there, but anyways, the point of this, the point of this question
00:24:00.700
is, I mean, why should, why should Canadians trust you with their vote given, you know,
00:24:08.520
not, not just the sort of ideological inclination in terms of taking the page out of Donald Trump's
00:24:14.740
Talking about what page, what page, can you give me a page?
00:24:19.200
In terms, in terms of turning things quite dramatically in terms of, of Trudeau and, and
00:24:25.140
I mean, you, you, you make quite a, you know, it's, it's quite a play that you make on it.
00:24:33.660
Why shouldn't Canadians trust you with their vote?
00:24:42.000
We're going to make common sense common in this country.
00:24:45.460
We don't have any common sense in the current government.
00:24:52.060
I mean, in terms of your, just the crunching of the apple in the middle of the question.
00:25:03.560
Yesterday, you said that you endorse Israel proactively defending itself by hitting Iran's
00:25:08.460
nuclear sites, which is something that President Joe Biden does not endorse.
00:25:12.560
Do you not feel like this could lead to a likelihood of an all-out conventional war between Israel and Iran?
00:25:17.740
And are you, do you not agree with Joe Biden and his assessment?
00:25:23.200
I think the idea of allowing a genocidal, theocratic, unstable dictatorship that is desperate
00:25:35.000
to be, to avoid being overthrown by its own people to develop nuclear weapons is about
00:25:41.160
the most dangerous and irresponsible thing that the world could ever allow.
00:25:45.480
And if Israel were to stop that genocidal, theocratic, unstable government from acquiring
00:25:54.740
nuclear weapons, it would be a gift by the Jewish state to humanity.
00:26:11.800
I'll resign eventually when I've figured out a way to rig the system.
00:26:23.820
Let me talk to you a little bit about Sam Altman.
00:26:31.780
And yesterday, if you missed it, you should go back to the show and listen to the podcast.
00:26:37.440
Yesterday, our number two was on the singularity.
00:26:41.260
This is something that I've been talking about.
00:26:43.880
Stu, I was talking about this, I think, before you even joined the show.
00:26:48.080
This might be the longest-running commentary that I have in my career: what's coming with AI.
00:27:06.320
General intelligence is what you are, you and I.
00:27:10.000
That's what we are as humans: we're good at many different things.
00:27:14.020
AGI, artificial general intelligence, is like a human, except it's not just good at a few things; it's good at everything.
00:27:23.180
Right after that is ASI, artificial superintelligence.
00:27:32.500
It is more powerful than all of the brains alive on the planet today.
00:27:54.400
We won't be able to keep up. You'll just do what it says, thinking that it's right, because you won't understand it.
00:28:14.920
AGI, according to Sam Altman, is where we are now, or soon will be.
00:28:21.980
We're at the singularity, which means Moore's law of doubling the chip power every two years no longer applies.
00:28:29.840
It's gone from that slope to a straight line up now.
00:28:35.480
Progress so rapid that you won't be able to keep up with it: that's where we're at now.
00:28:40.720
Um, and AGI, artificial general intelligence, some people didn't think that we would get here at all.
00:28:48.340
Many, if not most, computer scientists believed we could achieve AGI.
00:28:58.280
Most people did not think we could achieve the singularity until 2050, if at all.
00:29:04.520
And most computer scientists don't think that we'll ever get to ASI, artificial superintelligence.
00:29:18.220
This is from Sam Altman, his blog that just came out.
00:29:22.660
We started OpenAI almost nine years ago because we believed that AGI, artificial general intelligence,
00:29:30.940
was possible and that it could be the most impactful technology in human history.
00:29:36.280
But we wanted to figure out how to build it and make it broadly beneficial.
00:29:46.440
And so was our belief that the work might benefit society in an equally extraordinary way.
00:29:54.320
And if they did, it was mostly because they thought we had no chance of success.
00:30:01.220
In 2022, OpenAI was a quiet research lab working on something temporarily called Chat With GPT-3.5.
00:30:11.660
We are much better at research than we are at naming things.
00:30:15.160
We had been watching people use the playground feature of our API and knew that developers
00:30:22.740
We thought building a demo around that experience would show people something important about
00:30:26.700
the future and help us make our models better and safer.
00:30:30.460
We ended up mercifully calling it ChatGPT instead, and launched it on November 30th of '22.
00:30:37.300
We always knew, abstractly, that at some point we'd hit a tipping point and the AI revolution would be kicked off.
00:30:52.160
The launch of ChatGPT kicked off a growth curve unlike anything we have ever seen in our company, our industry, and the world.
00:31:01.340
We are finally seeing some of the massive upside we have always hoped for from AI, and we can
00:31:11.180
It hasn't been easy, the road hasn't been smooth, and the right choices haven't always been obvious.
00:31:18.120
In the last two years, we had to build an entire company almost from scratch around this new technology.
00:31:22.920
Now, he goes on to building the company and the technology, but I want to skip down here.
00:31:28.740
We have done what is easily some of our best research ever.
00:31:37.600
We grew from 100 million weekly active users to more than 300 million.
00:31:42.860
Most of all, we have continued to put technology out into the world that genuinely seems to be
00:31:51.880
We are proud of our track record in research and development so far.
00:31:59.240
We are committed to continuing to advance our thinking on safety and benefits sharing.
00:32:05.980
We continue to believe that the best way to make an AI system safe is by gradually releasing
00:32:12.500
it into the world, giving society time to adapt and co-evolve with the technology, learning
00:32:18.340
from experience and continuing to make the technology safer.
00:32:21.540
We believe in the importance of being world leaders on safety and alignment research and
00:32:28.040
in guiding research with feedback from the real world applications.
00:32:37.820
Well, safety, because you saw at the beginning of ChatGPT just the hallucinations that ChatGPT had.
00:32:50.160
Can it or will it ever start to look at us as we look at insects?
00:32:57.320
Will it ever start to see that Americans or I'm sorry, not Americans, humans are the problem?
00:33:03.120
And the easiest way to solve our problems is to eliminate the humans.
00:33:13.040
And alignment research means making sure that the AI and AGI and ASI, which we will be insects to, remain aligned with us.
00:33:28.040
We will be so far, and I'm using this term clinically, we will be so far
00:33:35.660
beyond retarded compared to ASI that it has no reason to pay attention to us at all.
00:33:46.000
So, making sure that alignment, that our goals and its goals remain intact.
00:34:00.880
Imagine using a baby gate, you know, the kind that, you know, go over the stairs.
00:34:07.100
Imagine if somebody said, you know, I got to keep you out of this room.
00:34:16.920
Are you even going to spend any time worrying about that baby gate?
00:34:23.240
It's made for babies, but that's what we could do to ASI.
00:34:29.620
Anything that we would want to do, it's so far beneath ASI, they won't even have to worry about it.
00:34:39.940
We are now confident we know how to build AGI as we have traditionally understood it.
00:34:48.060
We believe that in 2025, we may see the first AI agents join the workforce and materially change the output of companies.
00:35:07.700
When you talk about AGI and artificial general intelligence, I would think it would be like
00:35:13.900
Like, you could have them essentially do any task.
00:35:19.520
Like, you don't need to program them to do a specific thing.
00:35:23.520
You could say, hey, we need you to answer the phones here.
00:35:27.440
I mean, it might not be directly like that, but it's that type of thing that can take a
00:35:31.520
generalized job, a role like that, and do it on its own.
00:35:36.040
Would you, if you were hiring people for a company and you had somebody that doesn't make mistakes
00:35:44.600
and was much smarter than everybody else in the room, would you have them answer the phones?
00:35:52.440
You'd have them in a really high-powered position in your company.
00:35:59.820
The first will be just like that, because remember, they say they're slowly going to roll this out.
00:36:05.480
The first one will be like a secretary, somebody who can take care of, I'll pay the bills, I'll
00:36:11.020
take care of all of this stuff, and we will love it.
00:36:13.960
The first ones to join the workforce and materially change the output of companies will be something,
00:36:22.440
and I'm just imagining this, so please excuse me if you're in this field for my being a baby
00:36:29.380
gate here, but as I imagine it, it would be someone that you would have a virtual conference
00:36:35.440
with that looks like a human, sounds like a human, you can have a conversation with,
00:36:40.260
and you can say, look, can you help us on this?
00:36:42.580
We're trying to figure this out, blah, blah, blah, and they can be a game-changer for you.
00:36:54.560
Now, he says, we're beginning to turn our aim beyond that, to superintelligence in the true sense of the word.
00:37:03.400
We love our current products, but we are here for the glorious future.
00:37:07.400
With superintelligence, we can do anything else.
00:37:13.220
Superintelligent tools could massively accelerate scientific discovery and innovation well beyond
00:37:19.740
what we're capable of doing on our own, and in turn, massively increase abundance and prosperity.
00:37:26.280
Here's the thing, and I want to get into this tomorrow, massively increasing abundance and prosperity.
00:37:38.900
Well, by becoming much more efficient, by spending less to make more.
00:37:51.980
Things will be cheaper, but if the jobs are taken by AI or AGI or ASI, how do you make money?
00:38:02.440
A 30% disruption is coming. By 2030, if things play out the way we believe they're going
00:38:10.520
to play out now, there will be a deep unsettling of jobs and careers and everything else, at least at the beginning.
00:38:21.220
You're looking at a 30% unemployment rate, minimum, by 2030.
00:38:28.180
Now, he goes on to say, this sounds like science fiction right now and somewhat crazy to even talk about.
00:38:35.120
We've been here before, and we're okay with being there again.
00:38:37.960
We're pretty confident that in the next few years, everyone will see what we see, and
00:38:43.260
the need to act with great care while still maximizing broad benefit and empowerment is so important.
00:38:49.880
Given the possibilities of our work, OpenAI cannot be a normal company.
00:38:54.820
How lucky and humbling it is to be able to play a role in this work.