The Glenn Beck Program - December 11, 2024


Best of the Program | Guest: Tristan Harris | 12/11/24


Episode Stats

Length

52 minutes

Words per Minute

138.9

Word Count

7,349

Sentence Count

649

Misogynist Sentences

6

Hate Speech Sentences

9


Summary

Bill O'Reilly, Marc Andreessen, and Tristan Harris join Glenn Beck on the show to talk about the story of Christmas, what it really means, and how it affects the war we're all in right now.


Transcript

00:00:00.000 This winter, take a trip to Tampa on Porter Airlines.
00:00:05.460 Enjoy the warm Tampa Bay temperatures and warm Porter hospitality on your way there.
00:00:11.420 All Porter fares include beer, wine, and snacks, and free, fast-streaming Wi-Fi on planes with no middle seats.
00:00:18.840 And your Tampa Bay vacation includes good times, relaxation, and great Gulf Coast weather.
00:00:25.240 Visit flyporter.com and actually enjoy economy.
00:00:30.000 Today's show might be worth listening to the entire thing.
00:00:33.200 I mean, they always are, but, you know, I know your day gets busy because there's a lot happening on today's show.
00:00:39.420 First of all, on the best of, the story of Christmas, and what it really means, and how it affects the war we're all in right now.
00:00:49.580 Right into defending freedom of speech, and Bill O'Reilly, he's got a wild perspective on it, because he's lived it.
00:00:58.900 What's happening with the CEO killer in New York and the response.
00:01:04.840 He's lived this story.
00:01:06.400 Nobody's talked about it, but we do.
00:01:08.320 And Tristan Harris, former ethicist for Google, talks to us about Character.AI, something that is wildly dangerous.
00:01:19.900 Where are we on AI?
00:01:22.320 Marc Andreessen chimes in with a warning that will chill you to the bone, all on today's podcast.
00:01:28.820 Are you tired of not only paying far too much for your mobile phone service, but also knowing that some of the money is going to support causes that you don't support?
00:01:38.380 You know, one of the best things about living in America is we still kind of have a free market, and it's getting more free.
00:01:45.640 Patriot Mobile is America's only Christian conservative company, and their mission is to passionately defend our God-given constitutional rights and freedoms and to glorify God always.
00:01:55.960 And here's the way they do that.
00:01:57.460 They provide a great service.
00:01:59.720 They charge less for your mobile phone.
00:02:02.860 You get the same exact coverage from wherever you're switching from.
00:02:07.060 You're not going to be giving your money to far-left radicals that want to destroy the country.
00:02:13.520 They instead take some of their profits, and they reinvest it into, like, our school districts and getting the right school board people elected, helping the grassroots grow the support for our constitutional rights.
00:02:28.520 You can keep your phone number for a limited time.
00:02:31.160 You can also get a free smartphone just by going to patriotmobile.com slash beck or call 972-PATRIOT.
00:02:38.340 Use the promo code FRIDAY to get that smartphone.
00:02:42.160 And you can use that any day because not just Black Friday matters.
00:02:46.620 All Fridays matter with patriotmobile.com slash beck.
00:02:51.060 patriotmobile.com slash beck or call 972-PATRIOT.
00:02:58.520 You're listening to The Best of the Glenn Beck Program.
00:03:19.980 So we're just a few days away from Christmas, and it just feels weird.
00:03:24.240 I don't know.
00:03:25.100 Maybe it's because I'm into my Christmas shopping or whatever.
00:03:29.520 But I'm lacking just a little bit of the Christmas spirit, and I want to start today by fixing our gaze upon that cradle in Bethlehem where the greatest gift ever given entered the world.
00:03:46.520 The humblest of surroundings beneath the watchful eyes of shepherds in the celestial light of heaven's star.
00:03:54.420 A simple child was born.
00:03:59.320 Definitely not a child of earthly power, no wealth, but purpose.
00:04:06.420 And through him, the chains of mankind's bondage were destined to be broken.
00:04:14.520 When he was born, in a nutshell, what the angel said was liberty, redemption, hope.
00:04:29.720 It's what our founders understood.
00:04:36.400 Each of us endowed with certain inalienable rights.
00:04:42.320 Life, liberty, the pursuit of happiness.
00:04:44.860 Our creator gave these to us.
00:04:47.140 Each of us was endowed with free will, the power to choose, to chart our own course, to stumble, to rise, to dust ourselves off, and press on.
00:05:00.680 Again, this is the difference between people.
00:05:08.960 People that just want a guarantee, which there is none in life.
00:05:14.000 Or people who understand that free will, to be free, to live free.
00:05:20.380 That gift is precious and perilous.
00:05:26.240 It's always on the edge.
00:05:28.940 But it is the foundation of our humanity and the cornerstone of a truly free society.
00:05:36.080 Without that simple liberty to make mistakes, we can't learn.
00:05:44.120 Without the liberty to fail, we don't grow.
00:05:48.300 And without the liberty to choose between good and evil, the triumph of virtue over vice means absolutely nothing.
00:06:05.120 We miss this message.
00:06:07.880 Or maybe we save this message for Christmas Eve.
00:06:11.600 It's more appropriate on Christmas Eve.
00:06:14.460 We should be talking about this all year long.
00:06:17.000 In fact, in many ways, it's what we've been fighting for, the message of Christmas.
00:06:25.920 It's not just joy and celebration.
00:06:31.860 But the message of Christmas is profound liberation.
00:06:38.600 It is the birth of Christ, is the birth of freedom itself.
00:06:43.380 And not the kind of freedom that is wrought by, you know, a sword in an army or enshrined in our capital in the writings on parchment.
00:06:53.420 But a freedom given to us, each of us at birth, born in our soul.
00:07:00.140 It's the toughest kind of freedom.
00:07:07.500 Because it belongs 100% to us and what we do.
00:07:11.540 We can blame other freedoms on, well, the politicians in Washington.
00:07:15.140 There's no blame except for us.
00:07:18.240 And it is the freedom to forgive others.
00:07:20.640 And more difficult, I think, to forgive ourselves, it's the freedom to lay down the weight of guilt.
00:07:32.080 I'm a recovering alcoholic, and for a reason.
00:07:39.620 There are times in your life where you just are wrought with guilt.
00:07:44.260 You just can't move because in your head you're playing these tapes over and over again and they're all lies.
00:07:50.480 That's what Christmas is, the freedom to lay down that guilt, to heal wounds, old and new, to grasp the hand of grace that lifts us up out of the muck and the mire.
00:08:10.020 This freedom is the most precious.
00:08:14.260 And like all freedom, it is neither easy nor secure.
00:08:20.480 Today, if you would watch the news, you would find that we are living in times filled with war and rumors of war.
00:08:35.620 And yet, we believe, most of us, that we are at peace.
00:08:43.600 For as Longfellow may have said, the cannons of war are silent on our shores.
00:08:50.480 But we are a world at war.
00:08:54.820 We're a nation at war, a people at war.
00:08:58.940 We've said this for a while now.
00:09:01.140 More people in the world are waking up to this every day.
00:09:04.760 We are in a spiritual war.
00:09:06.860 It's invisible.
00:09:08.660 It doesn't have aircraft carriers, but it is insidious.
00:09:13.080 And you don't win with armies.
00:09:17.540 You don't win at the ballot box.
00:09:20.940 It's won within the hearts and minds of every single individual.
00:09:25.700 It's so today.
00:09:31.260 What happened 2,000 years ago is so important today because we're fighting a war for truth in a world drowning in lies.
00:09:41.400 A war for courage in a time where men's hearts have failed them, riddled with fear.
00:09:49.820 A war for faith when we are surrounded by doubt.
00:09:55.760 And it is truly the most perilous of all struggles.
00:10:02.720 Because it doesn't announce itself with the sound of drums or dramatic speeches or the sight of red banners.
00:10:13.460 Its battlefield isn't seen.
00:10:17.920 The stakes are eternal.
00:10:27.920 So as we prepare and we stand on the threshold of this sacred season.
00:10:38.280 Let's not take what we face too lightly.
00:10:43.720 This unseen enemy.
00:10:45.700 Let's not take what happened at the ballot box as, that was a reprieve.
00:10:52.000 That was God doing what we couldn't do.
00:10:55.580 Saying, okay, you can't fight that battle on your own.
00:10:59.040 I got it.
00:11:00.820 I will cover what you can't do.
00:11:06.840 I'll save him and have him stand back up again.
00:11:11.360 I'll protect at the ballot box.
00:11:13.520 But you got to get out and do it.
00:11:15.060 Now, God says, okay, now what are you going to do with it?
00:11:19.360 Because I've done what you can't do.
00:11:22.640 That's the deal with free will.
00:11:25.520 It's a partnership.
00:11:26.860 He'll forgive us.
00:11:28.020 But we got to do the work.
00:11:31.480 We have to take up the shield of faith and the sword of truth and the helmet of salvation.
00:11:37.620 Which was given to us by that little teeny baby in a manger.
00:11:48.280 And then that baby growing into a man to see what he did.
00:11:53.760 So we can draw strength from his example.
00:11:56.320 Boy, that's a hard example to follow.
00:12:04.820 Because he didn't come to condemn.
00:12:08.240 He came to save.
00:12:10.300 He didn't rule.
00:12:13.520 He came to serve.
00:12:15.760 He didn't come to divide.
00:12:18.080 He came to unite under truth.
00:12:21.300 I've said this before.
00:12:28.160 I wonder how many of us actually took time to give thanks at Thanksgiving.
00:12:31.760 For the miracles we've seen.
00:12:33.440 We've seen miracles.
00:12:35.260 I don't...
00:12:36.040 If you miss them, I don't know how.
00:12:37.980 But we've seen miracles.
00:12:39.380 And in the season of eternal truth and light, it's fitting that we give thanks.
00:12:51.300 Not only for the blessings that we hold dear, but also for the trials that refine us.
00:12:58.840 I'm a better man because of the trials of the last 20 years.
00:13:02.940 I don't know who I would have been if my back hadn't been against the wall for the last 20 years.
00:13:08.040 I don't know who you'd be.
00:13:10.800 That comes from freedom of will, of choice.
00:13:16.040 We've all made a choice which side we stand on.
00:13:19.040 Did we stand up when it was tough or did we cower?
00:13:24.340 And if we cowered, are we going to forgive ourselves so we'll stand up now?
00:13:31.680 Give thanks for the trials.
00:13:33.540 Give thanks for the liberty that we possess.
00:13:35.800 For it's only in freedom that we can fully embrace the gift of redemption.
00:13:44.700 And that is the principal gift.
00:13:49.420 It's not just the courage.
00:13:50.760 It's not just the freedom.
00:13:52.260 He knows we're going to make mistakes.
00:13:55.560 Let us give thanks for the right to choose, even when our choices lead us in very dark places.
00:14:03.240 Because every single misstep brings us closer to the God who never ceases to call us back.
00:14:11.780 Son.
00:14:13.420 Daughter.
00:14:14.900 I'm here.
00:14:15.420 Just come back.
00:14:16.100 This year, let's give thanks for the courage to stand, not only for ourselves, but for those who can't stand alone.
00:14:30.500 The story of Christmas is, above all, a story of hope, a story that transcends the bounds of time, place, and circumstance.
00:14:43.440 It's the hope that in the darkest of your night, a star will shine.
00:14:49.780 It's the hope that in the humblest of stables, a king will be born.
00:14:59.000 It is the hope that in the brokenness of our humanity, redemption is there to triumph.
00:15:08.560 I want you to close your eyes for a minute, unless you're driving.
00:15:11.320 That would be bad.
00:15:12.140 Keep your eyes open if you're driving.
00:15:13.380 If you're not driving, close your eyes just for a second.
00:15:16.120 Consider the scene of that first Christmas.
00:15:18.360 The world was troubled.
00:15:22.100 People were weary, especially the Holy Family.
00:15:25.940 The future was absolutely uncertain.
00:15:29.640 Yet, in the stillness of that night, heaven touched earth, and the light of the world entered in darkness.
00:15:41.860 And so it is with us.
00:15:44.360 No matter how beaten down we are, no matter how grave our hour or how heavy the burden, the light of truth still shines.
00:15:56.700 They tried to snuff it out, as darkness always does, but it cannot.
00:16:01.100 And that light of truth calls us to rise above all of our trials, to grasp the freedom that that little baby secured for us, and to walk boldly in the path of all that is good, all that is true, all that is right.
00:16:18.180 Let's commit over the next few weeks, that in the next year, we're going to hold fast to the truth, no matter if it's good for our side or bad for our side.
00:16:32.000 We're just talking about the truth, hold to the truth, knowing that we're all flawed, but also knowing we're loved beyond measure.
00:16:40.780 That even though we fall, we're never alone and never forsaken.
00:16:50.340 And though all the battles of life may rage, the ultimate victory has already been won.
00:16:58.080 May I humbly suggest that we commit to each other this year to let the joy of Christmas not be just something that maybe we just barely feel right now, but we'll feel more and more as we get closer to the holidays.
00:17:21.320 Instead of letting that be a fleeting sentiment, let's try to make it an abiding strength.
00:17:33.800 Let the hope of that little baby inspire us, give us the courage to face the trials of our time, because trials are still yet to come.
00:17:45.260 Let the freedom wrought by the birth of Christ embolden us to live every day as people redeemed, ever striving, ever learning, ever stumbling, but grateful for that stumble because it means we're ever growing.
00:18:04.600 In the words of the angels on that holy night, glory to God in the highest and on earth peace, goodwill toward men.
00:18:19.900 The greatest gift we can give ourselves and each other is the gift that this peace, this freedom, this hope be ours now and forevermore.
00:18:46.540 We're going to talk about a lot of crap today.
00:18:50.860 We're going to talk about a lot of people that you're going to want to say, I don't think I can forgive that person.
00:19:01.000 A lot of people that you're going to be sitting with in just a few weeks at the Christmas table, and they're going to say, I don't know why you don't see that shooting that UnitedHealthcare worker.
00:19:13.540 Why that CEO, why that's not heroic, have you seen the shooter's abs?
00:19:19.560 And you're going to go, I can't take it anymore, I can't take it.
00:19:22.740 But what I thought of when I was putting this together for you today was, man,
00:19:32.580 If God can put up with me, how can I not put up with...
00:19:42.540 I don't know, name anybody on MSNBC.
00:19:49.000 Name anybody on The View.
00:19:50.480 We also have some good things to share with you today.
00:20:00.620 I think it's actually all good.
00:20:02.940 It's all good.
00:20:03.800 I think we should recognize that we wouldn't be in the good position that we're in right now if what we all thought was bad in 2020 didn't happen.
00:20:15.780 If Donald Trump would have won in 2020, God only knows what we'd be facing.
00:20:21.620 And what we'd be facing is unknown, because they hadn't revealed themselves yet.
00:20:26.660 There is nothing bad.
00:20:29.940 It's what we do with it and how we react.
00:20:33.740 Do we say, wow, I've got another burden.
00:20:37.700 I've got to get myself up off the ground and carry on.
00:20:41.840 Or do I quit?
00:20:44.680 Let me tell you about Dan.
00:20:46.180 He is a semi-professional bodybuilder in New York.
00:20:49.800 Right away, I think, Dan, we have so much in common.
00:20:52.420 But he's been suffering from shoulder pain for years due to a weightlifting injury.
00:20:58.420 Me, too.
00:20:59.320 Sometimes I just pack that food right on my fork.
00:21:02.300 Anyway, he kept hearing me talk about Relief Factor on the radio, and he wondered, I wonder if it could help somebody like me, you know, healthy.
00:21:09.880 Well, within a couple of weeks of beginning it, his pain went away, his mobility began to return.
00:21:15.840 Dan got his life back, and so can you.
00:21:18.100 Relief Factor is a daily supplement that helps your body fight pain by fighting inflammation, which is the source of most of the pain in our bodies and a lot of our disease as well.
00:21:27.220 100% drug-free, developed by doctors to help reduce or eliminate pain.
00:21:30.900 Over a million people have tried Relief Factor's Quick Start Kit, and 70% of them have gone on to order it again.
00:21:36.540 So stop masking your pain and start fighting back naturally.
00:21:40.300 Give Relief Factor a try.
00:21:41.540 Right now, their three-week quick start, $19.95, less than a dollar a day.
00:21:45.060 Visit ReliefFactor.com or call 1-800-4-RELIEF.
00:21:49.700 1-800-4-RELIEF.
00:21:52.860 Now, back to the podcast.
00:21:54.780 You're listening to the best of the Glenn Beck Program.
00:22:01.060 All right, I'm going to play some stuff.
00:22:04.100 Got to be said.
00:22:05.200 And I want you to know, what I'm going to say to you here is I'm only saying it because it is absolutely true, and it only counts when it takes everything in you to say it.
00:22:21.860 It's easy to say, well, we have the right.
00:22:25.800 It's easy to say that.
00:22:28.340 It only counts when you hate saying it, and I hate saying this.
00:22:37.280 With that, let me play a couple of clips of audio.
00:22:42.220 Let's first play Taylor Lorenz as she was talking with Piers Morgan about the killer of the UnitedHealthcare CEO.
00:22:54.040 Why would you be in such a celebratory mood about the execution of another human being?
00:23:01.180 Aren't you supposed to be on the caring, sharing left where, you know, you believe in the sanctity of life?
00:23:06.740 I do believe in the sanctity of life, and I think that's why I felt, along with so many other Americans, joy, unfortunately, you know, because it feels like...
00:23:18.460 Joy? Serious?
00:23:18.960 I mean...
00:23:19.740 Joy in a man's execution?
00:23:21.600 Maybe not joy, but certainly not empathy.
00:23:25.800 Because, again...
00:23:26.280 We're watching the footage.
00:23:27.440 How can this make you joyful?
00:23:29.620 This guy's a husband, he's a father, and he's been gunned down in the middle of Manhattan.
00:23:34.820 Why is that making you joyful?
00:23:36.140 So are the tens of thousands of Americans that are being murdered.
00:23:37.720 So are the tens of thousands of Americans, innocent Americans, who died because greedy health insurance executives like this one push policies of denying care to the most vulnerable people.
00:23:51.340 And by the way, let me just clarify...
00:23:51.780 Hang on, Taylor, I'll come back to you.
00:23:54.260 Okay, don't say I'm joyful.
00:23:55.780 You said you were feeling joyful.
00:23:59.440 Yeah, I take that back.
00:24:00.940 Joyful is the wrong word, Piers.
00:24:02.540 You think?
00:24:03.060 As I clarify...
00:24:03.760 You think?
00:24:04.260 Yeah.
00:24:04.900 You think joyful is the wrong word?
00:24:07.220 Yeah.
00:24:07.640 Sure.
00:24:07.860 I'd say it is.
00:24:08.360 But vindicated, celebratory, because, again, it feels like justice in this system when somebody responsible for the death...
00:24:15.520 Oh, no.
00:24:16.240 Please let her keep talking.
00:24:17.820 That was awesome.
00:24:18.660 I can't take it anymore.
00:24:19.800 No.
00:24:20.260 Are you sure joy isn't the right word?
00:24:23.240 It's crazy.
00:24:24.340 Amazing.
00:24:24.640 Now, let's go to Daniel Penny.
00:24:27.500 Daniel Penny is found innocent.
00:24:30.700 I think anyone looks at what he did, what he tried to do, the spirit he tried to do it in.
00:24:38.180 He was not trying to kill anyone.
00:24:41.120 He was trying to protect people.
00:24:44.180 BLM of New York, which has only sold, I think the only thing they do is sell hats, you know,
00:24:50.420 that say F the mayor, you know, whatever.
00:24:54.840 They came out, and this is what the head of BLM New York said after the Daniel Penny trial.
00:25:02.160 We need some black vigilantes.
00:25:05.220 That's right.
00:25:06.600 People want to jump up and choke us and kill us for being loud.
00:25:14.620 How about we do the same when they attempt to oppress us?
00:25:19.400 Right.
00:25:19.740 Boy, am I tired.
00:25:23.660 Don't get tired.
00:25:24.460 Okay.
00:25:25.440 It's important to make sure you're well rested.
00:25:27.440 Yeah.
00:25:27.740 Get your rest.
00:25:28.580 You might get a little cranky.
00:25:29.840 Might do and say some crazy things.
00:25:31.980 Okay.
00:25:32.240 So, let me talk about those two statements quickly.
00:25:38.520 If I said this and said, it's time for some vigilantes, not even white or black, just it's time to get some vigilantes.
00:25:48.800 They would do everything they could to get me off the air.
00:25:52.300 Everything.
00:25:53.940 And I wouldn't say that because I don't believe in that.
00:25:57.060 I believe in the Constitution.
00:25:58.720 But here's a guy who can say that and no one says a word except amen.
00:26:06.080 No one on the left.
00:26:07.040 No one in the media.
00:26:08.160 Well, he's got reasons to say that, you know.
00:26:11.940 Oh, okay.
00:26:13.340 But I would be blackballed.
00:26:16.040 My life would be over if I said that.
00:26:19.660 Taylor Lorenz.
00:26:20.840 She's out of her mind nuts.
00:26:23.600 Okay.
00:26:24.420 Out of her mind nuts.
00:26:26.760 How many times do we have to hear this woman say crazy things like, I don't feel joyful, just celebratory.
00:26:35.080 Because somebody was gunned down in the streets.
00:26:39.360 Because she thinks health care is murdering people in America.
00:26:46.760 Okay.
00:26:47.060 Here's what I.
00:26:49.300 Oh, my gosh.
00:26:50.240 Stu, do you have aspirin on you or anything?
00:26:54.000 Because if I have a stroke while I'm saying this, please just put some aspirin on my tongue so I might survive a little bit on this.
00:27:04.040 All of these people have a right to say that.
00:27:09.220 Here is.
00:27:10.140 You can't cry fire in a fire crowd.
00:27:12.240 Crowd a firehouse.
00:27:13.000 I don't know.
00:27:14.200 They were just in a courtroom saying we should kill people like him.
00:27:20.360 I don't know.
00:27:21.560 So, here is the actual court ruling.
00:27:25.940 This is from 1969, Brandenburg v. Ohio.
00:27:28.400 The court said there's a two-pronged test to evaluate speech.
00:27:32.480 One, speech can be prohibited if it is, quoting, directed at inciting or producing imminent lawless action.
00:27:44.860 Now, you could say, why don't we have a vigilante?
00:27:48.640 Why don't we kill people?
00:27:50.840 That is inciting.
00:27:52.980 It is.
00:27:53.900 Inciting people to go and take lawless action.
00:27:56.680 But it isn't imminent lawless action.
00:28:01.960 If somebody then picked up guns and started mowing down black people or white people or people that have bad acne or perfect faces or whatever it is,
00:28:13.900 then that speech, he would be responsible for it.
00:28:20.940 But the court says it is such a fine line here that you have to go so far before your speech is banned.
00:28:33.600 It has to be, one, directed at inciting or producing imminent lawless action, and two, likely to incite or produce such action.
00:28:46.020 Two standards.
00:28:48.080 Both of them have to be met.
00:28:50.160 I am only spitting this out because I hate what these people have said.
00:28:56.560 I despise what these people say.
00:29:00.160 I believe with everything, every piece and every cell of my body, what they're saying is evil.
00:29:08.880 But because I'm an American constitutionalist, I defend their right to say it.
00:29:16.240 And it only matters to say these things when it kills you to say it.
00:29:22.840 And it's killing me to say it.
00:29:26.680 For all those on the left that claim that they are the standard-bearers of justice, they believe in the Bill of Rights, they believe in freedom of speech, but it has limits.
00:29:37.700 Yes, those are those two limits.
00:29:40.720 That's as far as I have seen two people go in a week, maybe in my lifetime.
00:29:48.200 And I'm not calling for them to be silenced.
00:29:51.940 And if you'll notice, nor are most people on the right.
00:29:58.600 No one's saying, get them.
00:30:01.480 Because we hold certain truths to be self-evident.
00:30:11.880 A guy I really have been thinking a lot about this week, which I try to avoid all the time, is Bill O'Reilly.
00:30:21.400 And Bill O'Reilly is a guy in 2009 that everybody dogpiled on and said, he's responsible for Dr. Tiller being killed.
00:30:32.620 He's the guy who did it.
00:30:34.240 His speech, yada, yada, yada.
00:30:36.780 And nobody talked about, nobody on the right that hates abortion, no one came out and said, well, yeah, but the guy has good abs.
00:30:47.180 And he does make a point, the doctor was murdering a lot of people.
00:30:51.900 No one excused that.
00:30:54.460 And yet they can gun a CEO down in the street and the left is all excusing him.
00:31:02.160 Well, yeah, but there's no but here.
00:31:06.820 Bill O'Reilly, welcome to the program.
00:31:08.860 How are you, sir?
00:31:10.260 You know, I was thinking about you too, Beck, because you're looking more and more like Santa every year.
00:31:15.700 I don't think that's right.
00:31:16.760 You kind of morphed into that North Pole look.
00:31:20.860 Thank you.
00:31:21.360 Thank you.
00:31:21.920 I appreciate it.
00:31:23.080 You know, you're a brilliant man.
00:31:25.300 Of course, everyone knows that.
00:31:27.360 But the story with Dr. Tiller in Kansas is even worse than people know.
00:31:35.200 Because what I was doing in 2009 on the Fox News channel was reporting what this guy was doing, Tiller.
00:31:45.200 And in the body of the reportage, I mentioned that his nickname in Kansas was Tiller the Baby Killer.
00:31:56.220 And that was true.
00:31:58.880 It was part of the story.
00:32:00.820 Immediately, the far left press said, I branded him that name.
00:32:07.700 O'Reilly called him that name.
00:32:09.580 O'Reilly made that up.
00:32:11.320 O'Reilly put him in danger.
00:32:13.160 O'Reilly wanted him dead.
00:32:14.840 That's how heinous the left-wing media is.
00:32:21.280 And it's gotten worse since then, if you can believe it.
00:32:24.600 But it's gotten absolutely worse.
00:32:27.020 But they're paying a big price now for that.
00:32:29.540 Anyway, Tiller himself was murdered.
00:32:34.520 And the guy who did it is in prison, serving a life sentence.
00:32:38.380 Right.
00:32:38.560 He was gunned down at church.
00:32:41.240 And subsequently, the people who worked for Tiller all lost their medical licenses in Kansas.
00:32:50.220 Because, sorry about the dog.
00:32:54.700 Can you hear the dog?
00:32:55.680 No, this is your only friend.
00:32:56.920 It's okay.
00:32:57.620 I know.
00:32:58.680 The dog is barking.
00:33:01.100 It's all right.
00:33:02.180 So wait a minute, wait a minute, wait a minute.
00:33:03.620 So this would be like if everybody around this CEO of UnitedHealthcare lost their license
00:33:11.160 because they were actually doing something illegal and really bad.
00:33:15.960 But that didn't happen in this case.
00:33:18.400 And yet, in the Tiller case, all the people around him lost their license to practice?
00:33:26.120 Yeah.
00:33:26.400 I mean, they had hearings in Kansas.
00:33:28.560 This is how bad it was.
00:33:29.860 This guy was charging $5,000.
00:33:35.020 You walk in, no, you pay cash.
00:33:39.500 And he aborts whatever unborn child, whatever stage it's in.
00:33:46.680 Could be nine months.
00:33:49.400 $5,000, please.
00:33:51.400 Hand him the money.
00:33:52.280 He does the operation.
00:33:53.400 So the medical authorities in Kansas, after my reporting, looked into what he was doing.
00:34:00.340 He was dead by the time they issued the report.
00:34:04.200 But all of his people lost their license to practice medicine.
00:34:08.560 Did you, Bill, did you or anyone you know, were you ever even tempted to say,
00:34:16.980 yeah, he was gunned down, but, I mean, no.
00:34:22.320 Of course not.
00:34:23.220 And he was gunned down in a church, in an Episcopal church.
00:34:28.160 That's how crazy this whole story is.
00:34:30.500 Look, I'm a sane individual.
00:34:32.600 I know that some people disagree with that, including you.
00:34:35.680 But I don't want people to be harmed.
00:34:40.940 I'll harm them through reporting.
00:34:44.600 That's the vehicle.
00:34:45.900 I'll expose them.
00:34:47.840 But I don't want them to be physically hurt.
00:34:51.620 Now, this story, because I live in New York, as you know,
00:34:55.640 this story about the CEO being gunned down,
00:34:58.440 this is largely a media-driven story.
00:35:02.760 There isn't an overwhelming consensus on the part of the left
00:35:09.280 that this homicide was justifiable.
00:35:12.240 It's some real far-out-there kooks driving it.
00:35:16.860 Elizabeth Warren.
00:35:18.180 Elizabeth Warren.
00:35:19.760 Go ahead.
00:35:21.320 Hear me out.
00:35:22.180 All right.
00:35:22.500 If you don't think Elizabeth Warren is a kook.
00:35:25.020 Okay, all right.
00:35:25.980 You make a good point.
00:35:27.100 All right.
00:35:27.420 Thank you.
00:35:28.020 I mean, I've got to spend a week in Idaho with you
00:35:30.640 and get you back into reality.
00:35:32.760 This woman is beyond the pale.
00:35:35.180 This and Alexandria Ocasio-Cortez, my God.
00:35:39.280 All right.
00:35:39.800 Their view of the world is insane.
00:35:43.860 So there are people on the fringe, and Warren is one of them, all right,
00:35:50.960 who is saying, using the death of this man to try to hammer the health insurance companies,
00:35:59.960 which deserve to be hammered.
00:36:03.000 That's the real crisis here.
00:36:04.840 So many legitimate claims are being denied now.
00:36:11.360 And working-class Americans, they pay their insurance premiums,
00:36:15.560 and then they have something wrong, and they put the claim in, and it comes back,
00:36:18.520 screw you.
00:36:19.640 We're not paying it.
00:36:20.880 So that's a legitimate, absolutely a legitimate beef, but you don't gun down people.
00:36:31.040 So it's a very, very complicated emotional story, but the media seizes on stuff like this
00:36:37.180 because they don't know what else to do.
00:36:40.000 Unlike you and I, we have a different narrative every day.
00:36:43.900 We present to our audience different facts.
00:36:47.220 The people on television, mostly cable, they don't know what to do every day, Beck.
00:36:52.780 They've got to latch on to something to stop their falling ratings,
00:36:58.500 and that's primarily what this is all about.
00:37:01.460 They're never going to learn, are they?
00:37:02.540 No, because they don't have control.
00:37:06.580 The corporate masters, the Comcasts, and the CBS, and all of these people, Disney,
00:37:15.140 they're telling them what to say.
00:37:17.800 And you better damn well say it, or you're not going to get your check.
00:37:20.840 Look at Morning Joe.
00:37:22.800 That's the best example.
00:37:24.420 He was ordered by Comcast to go to see Trump.
00:37:28.000 Oh, I didn't know.
00:37:28.520 He just didn't show up at the gate.
00:37:30.580 He was ordered to do it.
00:37:33.880 Oh, you've got to make nice with him because we're losing audience.
00:37:37.000 You better get down there.
00:37:38.240 And, of course, it blew up totally, and MSNBC is done forever now.
00:37:43.860 I mean, how can you call a guy Hitler and then go and make peace and say,
00:37:48.640 hey, we're buddies?
00:37:48.820 Well, maybe he's serving Wienerschnitzel that night.
00:37:51.480 I don't know.
00:37:53.200 Okay, but there's no logic.
00:37:55.680 There's no logic in corporate.
00:37:57.780 I know you feel the same way, Beck, because we, you know,
00:38:01.220 I am so happy to be out of that corporate thing.
00:38:04.940 Oh, my gosh.
00:38:05.600 You run your own corporation.
00:38:06.620 I run my own corporation.
00:38:08.860 I am the Relief Factor.
00:38:11.700 I think that's some kind of thing that they advertise on shows.
00:38:14.180 Yes, it is.
00:38:15.080 It's so tremendous that I don't have to deal with these pinheads, these dishonest executives.
00:38:21.560 I know.
00:38:22.400 Oh, my God.
00:38:23.740 It is, you know, it is the thing that the corporate media, I mean, they're done.
00:38:29.680 I don't think they make it to 2028.
00:38:32.600 Let me give you a fact.
00:38:34.900 All right.
00:38:35.140 So when I left Fox, there were people who said,
00:38:37.800 oh, you're not going to be able to sell any more books now.
00:38:40.460 Yeah, I know.
00:38:40.940 Because you had the Fox thing.
00:38:42.140 Right.
00:36:43.120 Confronting the Presidents, maybe number one again this Sunday.
00:38:48.140 We had a huge week last week.
00:38:51.500 Thirteen weeks on the New York Times bestseller list.
00:38:54.640 Debuted at number one.
00:38:56.220 Okay.
00:38:56.720 We don't need them, Beck.
00:38:58.780 I know.
00:38:59.420 I know.
00:39:00.000 We don't need them.
00:39:01.080 And they know it.
00:39:04.520 We're going around them on YouTube.
00:39:07.080 We're going around them on Spotify.
00:39:09.300 We're going around them with our direct distribution all over the place.
00:39:13.820 And people have noticed that.
00:39:16.200 Yeah.
00:39:16.620 Bill O'Reilly, it's good to talk to you, Bill.
00:39:18.700 Have a great holiday.
00:39:20.040 Maybe we'll talk before then.
00:39:21.260 But good to talk to you.
00:39:22.940 Thank you so much for coming in.
00:39:24.400 Anytime, Santa.
00:39:24.780 You bet.
00:39:25.140 Sure.
00:39:26.140 This is the best of the Glenn Beck Program.
00:39:33.520 Welcome back to the program.
00:39:35.400 If you missed any of the show today, make sure you go back to the podcast.
00:39:39.600 We started out with a really happy attitude.
00:39:41.920 And unfortunately, we are discussing now this hour some of the scariest stuff of my lifetime.
00:39:48.880 It will be the scariest stuff in your lifetime.
00:39:50.980 Maybe of anyone's lifetime.
00:39:53.020 We're talking about AI and technology that we have now.
00:39:57.900 I remember sitting 10, 15 years ago with some of the members.
00:40:02.260 I can't even remember what subcommittee it was.
00:40:04.940 But they were overseeing technology.
00:40:07.280 And I tried to explain to them what AI was and AGI and ASI.
00:40:13.860 And they were like, well, we should pass some laws.
00:40:17.800 And I'm like, you don't even understand what you're talking about.
00:40:21.500 And they were all 80.
00:40:23.440 I mean, I'm 60.
00:40:25.000 I have a hard time finding it.
00:40:27.920 The minds that are developing AI do not have the same old think that people like me have.
00:40:38.780 And thank God there are some young ones that understand.
00:40:44.620 And I actually think some in Silicon Valley are waking up to this.
00:40:52.300 This is worse than the atomic bomb, and it is right here, ready to be born.
00:41:00.280 And Tristan Harris, I've talked to him for, I don't know, how many years.
00:41:06.520 He's the first guy that gave me hope when I talked to him maybe 10 years ago.
00:41:10.780 He was a former Google design ethicist who left when he realized Google doesn't have any ethics.
00:41:18.920 This is bad.
00:41:20.020 And he is now with the Center for Humane Technology.
00:41:24.260 He's the co-founder.
00:41:25.400 And he has been fighting.
00:41:27.780 You can't stop this now.
00:41:30.400 But he's at least trying to get everybody to agree that this is dangerous.
00:41:37.140 We have to have some parameters.
00:41:39.440 Tristan, how are you?
00:41:41.440 Glenn, it's good to be back with you.
00:41:43.460 And I think it was 2017 when we first talked about some of these issues in the attention economy.
00:41:48.120 And here we are now.
00:41:50.720 I know.
00:41:51.160 Where are we now?
00:41:53.220 How close are we to, you know, the things like the loss of free will, where we just don't know if it was us that decided or it's been planted, you know, in our minds to think it's our idea?
00:42:07.740 Well, you know, first, I was listening beforehand to your conversation with Megan Garcia about her son, Sewell, who obviously was manipulated by this character.ai chatbot.
00:42:22.540 And unfortunately, as of yesterday, there was a second piece of litigation filed about another child who's actually still anonymous.
00:42:32.660 The parents are still anonymous.
00:42:33.660 And in this case, you know, this young child, JF, was a kind and sweet, you know, young person, had no history of violence or outbursts.
00:42:43.460 And after his exposure to character.ai, he was basically encouraged by the chatbot to practice self-harm in the form of cutting, told how to do it, encouraged to do it.
00:42:59.320 And he was also encouraged by this chatbot to be physically and emotionally abusive towards his parents and members of his family.
00:43:10.800 And this is obviously heartbreaking.
00:43:15.760 And it's really an extension of the things that you and I talked about around social media, because why is all this happening?
00:43:22.600 Like, obviously, no one wants this, including the founders of character.ai never would have wanted this to happen.
00:43:27.800 So how are we getting results that nobody wants?
00:43:31.380 And the answer is the incentives.
00:43:33.920 You know, Charlie Munger, Warren Buffett's business partner, said, if you show me the incentive, I will show you the outcome.
00:43:40.140 And when the incentives and business models are, I have to get you using this product for as much as possible.
00:43:46.980 It's the race for maximizing attention and engagement with the product that creates,
00:43:53.000 I think we talked about it the very first time, the race to the bottom of the brainstem, for more polarized, addicted, distracted, sexualized forms of media.
00:44:01.180 But now the things that we saw with, you know, other forms of media, you now have a personalized AI in which the way the character.ai works is they take a fictional character.
00:44:12.920 You know, you're a kid.
00:44:13.800 What do you like fictionally?
00:44:14.820 You like Game of Thrones.
00:44:15.660 You like Princess Leia.
00:44:16.640 You like Star Wars.
00:44:17.400 You take your favorite character and then, boom, snap of the fingers, you have a fully interactive version of this character who's talking to you 24-7.
00:44:25.920 And our team, unfortunately, uncovered, along with the family that was harmed, that when you create a new account on character.ai as a young person, it immediately recommends, of all the characters that it could recommend to you, it recommends characters named stepsis, like stepsister, or CEO, or high school teacher.
00:44:46.780 And these characters almost immediately engage in sexually explicit interactions because they're simply, you know, trained to do this.
00:44:55.840 Okay, so, Tristan, here's the problem.
00:45:04.040 Your typical answer would be, okay, the incentives are all screwed up, but that's what comes from the free market when you have a, you know, when you have an immoral end user, which is our society.
00:45:20.500 It's, you know, free market is bad.
00:45:23.540 But I want to play a little bit from what Mark Andreessen just said to Barry Weiss.
00:45:29.840 Listen to this.
00:45:30.980 We had meetings in D.C. in May where we talked to them about this, and the meetings were absolutely horrifying, and we came out basically deciding we had to endorse Trump.
00:45:44.040 Mark, add a little color to "absolutely horrifying."
00:45:44.040 What did you hear in those meetings?
00:45:45.720 They said, look, AI is one of these, AI is a technology basically that the government is going to completely control.
00:45:52.040 This is not going to be a startup thing.
00:45:53.660 They actually said flat out to us, don't start, don't do AI startups, like don't fund AI startups.
00:45:58.480 It's not something that we're going to allow to happen.
00:46:01.160 They're not going to be allowed to exist.
00:46:02.740 There's no point.
00:46:03.480 They basically said AI is going to be a game of two or three big companies working closely with the government, and we're going to basically wrap them in a, you know, I'm paraphrasing, but we're going to basically wrap them in a government cocoon.
00:46:15.820 We're going to protect them from competition.
00:46:17.580 We're going to control them, and we're going to dictate what they do.
00:46:21.580 And then I said, well, I said, I don't understand how you're going to lock this down so much because like the math.
00:46:26.540 Stop here.
00:46:26.800 There's more to that that is just horrifying.
00:46:32.800 What is the solution?
00:46:34.220 Because it's not government control, and you've got the free market, and we're all just wanting to consume it.
00:46:44.080 What's the answer?
00:46:45.160 Yeah.
00:46:46.000 Well, so we often talk about this problem as there's sort of two ways to go, which is one is you say this is a dangerous technology, and we need to sort of control it.
00:46:55.360 We need to centralize it.
00:46:56.860 We need to regulate it.
00:46:58.040 We need to protect against some of the things that we just talked about with Character.ai.
00:47:01.740 But then the problem is you get runaway concentration of power, you know, and who would you trust to be a trillion times more powerful?
00:47:08.380 Do you want any government or any company that you would trust to build artificial general intelligence?
00:47:14.240 Nope.
00:47:14.360 And that's obviously a bad outcome.
00:47:16.500 The other option is you say, well, that's dangerous.
00:47:18.540 Let's actually let, you know, everyone maximally adopt AI in every application into every domain as fast as possible.
00:47:26.480 It's kind of an AI maximalist approach, but then you get it getting sucked up into perverse incentives.
00:47:31.880 It's not that AI is bad.
00:47:33.800 It's that the incentives here were saying we have to maximize engagement to children.
00:47:38.480 And I think there's some basic things we could agree on, like, do you really want AI chatbots basically talking to minors?
00:47:45.760 This product was marketed for 12-year-olds and above.
00:47:51.380 This is something I think everybody can agree on.
00:47:55.080 But what we say often is that there's these two bad outcomes.
00:47:57.800 One is over-democratization and one is over-concentration.
00:48:02.400 And what we really want is, you know, the paths to hell are wide and many and the path to heaven is narrow and steep.
00:48:07.940 It's this very delicate balance of steering.
00:48:10.540 We're not for or against AI.
00:48:11.840 We're for pro-steering, you know, and that can include basic things like liability so that companies are liable for the harms that they create, just like you would want any externalities to be owned on the balance sheet.
00:48:23.900 And so they're incentivized to not have caused the problem.
00:48:26.920 And you can have things like whistleblower protections, given the fact that the government doesn't have a lot of people who understand the issues, as you said, the octogenarians that are in Congress.
00:48:38.240 We can have more protections from people inside the company.
00:48:41.240 So hang on just a second.
00:48:42.140 It's a dangerous situation.
00:48:43.180 Hang on just a second.
00:48:43.940 So here's one of the things that they didn't understand.
00:48:46.660 When some of those people were talking about, well, then we're going to pass new laws, I said, those laws, by the time you get them written, it will be a whole new set of problems.
00:48:56.740 It's moving so fast.
00:48:58.760 I mean, we're so far behind, you know, and honestly, it's almost as if we are as unmoored as the Nazi scientists who were like, I don't know, let's inject some blue into people's eyes.
00:49:23.320 I mean, it's insane what we're doing, and there doesn't seem to be any way to stop it because everybody is going to have it and our enemies are going to have it.
00:49:35.900 Yes, but I mean, as you're saying, I do agree that it's insane we've allowed ourselves to get this far.
00:49:42.020 You know, a short way of saying it is software is eating the world.
00:49:45.520 Actually, Mark Andreessen said that.
00:49:46.960 Yeah.
00:49:47.280 But AI is now eating software.
00:49:50.020 Right.
00:49:50.140 And we have no rules for software nor AI.
00:49:53.660 So it's like the Wild West is eating the world.
00:49:55.700 We used to have protections around children in media.
00:49:58.080 What can you show them on Saturday morning?
00:49:59.600 We used to have protections around, you know, what kinds of things can go on the airwaves from the FCC.
00:50:04.420 But when software and AI eats the world, all of those protections and the spirit of the law go out the window.
00:50:09.660 And as you said, it is moving very, very quickly, but it doesn't mean that there's nothing that we can do.
00:50:14.740 There's still a spectrum of outcomes ahead of us.
00:50:17.560 OK.
00:50:17.820 And we have to choose to get to the less bad of those outcomes.
00:50:23.240 OK.
00:50:23.460 So what should we be pushing for?
00:50:26.700 Well, as we said, you know, I think of it like belts and suspenders.
00:50:29.860 It's a whole set of things that we need to get to a better future.
00:50:34.140 So one is actually just on the argumentative side.
00:50:36.580 You were just mentioning China.
00:50:37.640 As we both know, I would say the number one reason why we're not regulating or doing something about this is because we're saying, well, we'll stifle innovation compared to China.
00:50:46.100 Right.
00:50:46.660 But it turns out the biggest accelerant to China's AI progress has been American AI companies, specifically Facebook or Meta's AI model called Llama, has been cited to be the number one accelerant of China's progress.
00:51:00.940 And so the first thing is getting clear that to the degree we're in a race, we're in a race to get to a stable future.
00:51:08.740 And right now we're building a future like we played the game Jenga.
00:51:11.660 If you remember that, you know, in your family, it's like we're adding these amazing new things at the top of the stack, like new cures to cancer.
00:51:19.100 But we pulled out a fundamental building block in society, which is now anybody can make dangerous things with biology because that's what enabled the new cures to cancer.
00:51:28.240 We make at the very top of the Jenga tower, we add the ability for anybody to make AI art.
00:51:33.860 But in doing so, we pulled out a fundamental block of now no one knows what's true or real.
00:51:39.080 And so, yes, we're in a race with China, but we're in a race to have actually a stable and integrated future.
00:51:44.880 So it's a race for who can better govern this technology, not who has the power in a way that self-undermines you.
00:51:51.440 Tristan, I'm out of time and I always am with you.
00:51:54.880 I would love to have you back, maybe before the holiday or right after, whatever will fit into your schedule, to spend more time with you on what's right around the corner, what's already here that parents and all of us should be aware of. Because I read this stuff and I see this stuff and, you know, I've been talking about it since the 90s.
00:52:16.640 And to me, it just feels overwhelming that it's here and nobody's done anything about it.
00:52:23.040 And I'm on the opposite end of most people, I think.
00:52:26.820 They don't even know about most of this stuff yet and what's right around the corner.
00:52:32.020 And I appreciate your point of view that we still have time.
00:52:35.820 We can still do things.
00:52:38.080 It's obviously getting slim, but we can still emphasize that, you know, to policymakers, senators, the new administration, the new surgeon general,
00:52:45.120 we need to know about and act on these harms.
00:52:47.780 And, Glenn, thank you so much always for letting me get a chance to talk about this stuff.
00:52:51.560 Tristan, thank you.
00:52:52.280 I hope to have you on again soon.
00:52:53.820 Thank you.
00:52:54.520 God bless.