The Glenn Beck Program - May 21, 2024


Best of the Program | Guests: Jason Whitlock & Dr. John Bruchalski | 5/21/24


Episode Stats

Length

44 minutes

Words per Minute

151.3

Word Count

6,783

Sentence Count

528

Misogynist Sentences

19

Hate Speech Sentences

17


Summary

Glenn Beck sits down with former abortionist Dr. John Bruchalski to discuss his life and career as an OB-GYN who performed abortions in the late 1980s and early 1990s. Dr. Bruchalski shares the story of how he got into the business, how he became an abortionist, and how he built a business out of performing abortions.


Transcript

00:00:00.000 Welcome. Today's podcast is amazing. We start with AI, things that you have not heard or seen most likely,
00:00:08.680 and the godfather of AI explaining how much trouble we are in if we don't pay attention right now.
00:00:17.860 Then we talk to Jason Whitlock about P. Diddy, also the president's speech,
00:00:22.860 where he is just telling black men, you don't have a chance.
00:00:27.480 And then the most incredible interview I think I've ever done, and it's a must-listen-to interview,
00:00:38.100 it is an OBGYN that was performing abortions, bought into all of the lies until something happened during an abortion.
00:00:50.700 And it is absolutely incredible. You don't want to miss any of today's podcast. Here it is.
00:01:00.000 You're going on vacation this summer? Where are you going?
00:01:02.880 Hopefully someplace fun, someplace, you know, where you're curious and you can learn a bunch of stuff with the kids,
00:01:09.380 and maybe you're going up to Boston. Uh-huh. That'd be fun at night.
00:01:13.240 Anyway, are you using a Byrna launcher? You can't carry a gun with you. You can't really fly with a gun.
00:01:21.020 It's just such a hassle to fly with a gun.
00:01:23.720 And if you're driving anywhere, crossing state lines with a gun is a problem. Not with a Byrna launcher.
00:01:29.680 They look like a gun, but they're not. It's a Byrna launcher.
00:01:32.980 It's a non-lethal alternative for safeguarding your home and handling any potential threats
00:01:39.120 that you might experience even on vacation. Somebody comes up, you know, and is trying to rob you.
00:01:45.840 Hit them with a Byrna launcher and the tear gas or pepper spray or just a kinetic round.
00:01:53.020 The tear gas will have them on the ground for about 40 minutes until the police come.
00:01:58.660 You can pack it in your checked luggage without the need to declare it.
00:02:01.800 It is legal in all 50 states. No license, no test, no nothing.
00:02:06.340 And it has a 60-foot range. I have one. My wife, all my children now that my daughter just turned 18,
00:02:13.880 all of them have a Byrna launcher. B-Y-R-N-A dot com slash Glenn.
00:02:19.600 That's Byrna dot com slash Glenn. Do it now.
00:02:31.520 You're listening to The Best of the Glenn Beck Program.
00:02:36.340 I think I'm going to, this is going to be one of those things, you know, I drove by the Grand Canyon
00:02:42.220 because I was moving to Arizona and I saw the exit and I'm like, nah, I'll see it some other time.
00:02:49.200 I'm going to be living here in Arizona. I never saw the Grand Canyon for 20 years.
00:02:53.840 I think this is going to be one of those moments. I hope I get a chance to be in person again,
00:02:57.780 meeting somebody who is just absolutely a remarkable man.
00:03:01.900 In our studio right now is Dr. Bruchalski and he was an abortionist and he's now Divine Mercy Care founder
00:03:11.520 and we're honored to have him. Doc, how are you?
00:03:15.060 Oh, Glenn, I'm doing well. And how are you and your eyes doing, my friend?
00:03:19.660 They're good. It's pretty ugly. It's pretty ugly right now, but it's getting better.
00:03:24.480 Well, good, good. I'm glad to hear that. We needed, I needed your vision when you began your TV programs
00:03:32.740 and we need your voice today. So thanks for this invitation, buddy.
00:03:38.320 Thank you. Thank you. So tell me your journey. I'd love to hear the, you know, where you were
00:03:44.640 and then the moment where you were like, oh my gosh, what am I doing?
00:03:47.720 Sure. So imagine yourselves in an OBGYN residency program, modern medicine at its finest, technology at our fingertips.
00:04:01.540 But it was in the late 1980s, so abortion on demand was considered the standard of practice, the standard of excellence.
00:04:14.380 But I also realized that if it was so good, why didn't more of us do it?
00:04:20.200 There was a push even back then to say, you know, if you're a member of the OBGYN community, you really must perform abortions.
00:04:27.740 So here I was in my residency and I'm listening to my patients and they're telling me ever since I was a kid,
00:04:36.120 oh, we need, you know, sexual freedom. We need to be released from the chains of our fertility.
00:04:41.760 And by listening to my female friends, just listening, being present, watching, observing,
00:04:49.260 they taught me that they truly needed a sexual revolution and the backup to failed contraception was abortion.
00:05:01.560 And so being a good Polish kid growing up in a really solid family, faith-based, Bible-strong,
00:05:07.700 I began to realize the world was different from my home, but in residency, I decided that I had a knack in medical school.
00:05:18.580 I had a knack for listening to women, PCOS, PMS, menopause, irregular cycles, you know, anxiety, sexual issues.
00:05:28.120 I love the technology. And I, I initially bought the lie that you needed to be a great caregiver. You had to kill well.
00:05:42.780 That was the bottom line.
00:05:44.780 So this is still the lie. They call it women's healthcare.
00:05:49.940 Reproductive freedom, reproductive justice.
00:05:53.700 Yes.
00:05:53.940 Right.
00:05:54.380 At every stage, Glenn. And I, once again, I alluded that you really helped me early on in my, my new practice, you know,
00:06:02.800 my practice after residency has been life affirming.
00:06:05.720 Wait, what?
00:06:06.600 Well, yeah, yeah, yeah, yeah. That's a whole nother story.
00:06:08.780 But this change of heart is, to get back to it, is I was, you know, God has a funny way of working with people.
00:06:16.360 I'm sure my parents never gave up praying for me. I'm sure the pro-life movement out there,
00:06:20.960 one of the greatest things they have ever done, even in the midst of all this mess and this
00:06:26.620 brokenness in the world today, brokenness in medicine, just the disgust, you know, when you
00:06:32.240 abort life and you trump truth, you end up with incredible disillusion and a mess. And so here I was,
00:06:41.120 believe it or not, I was volunteering at a pregnancy resource center in Wichita Road, Virginia, Virginia
00:06:47.560 Beach at night, occasionally once or twice a month while I was going through my residency.
00:06:54.220 I was doing abortions first, second, and third trimester. I was doing them for eugenic reasons.
00:06:59.860 I was doing them for, you know, for just because it wasn't the right time reasons. I was doing it
00:07:06.280 because the child wasn't perfect reasons. What did you mean by eugenic reasons? Well, you know,
00:07:12.800 my cousin, Mike, uh, had Down syndrome and he said, uh, Johnny, am I a, uh, am I an endangered
00:07:22.280 species? He said it during the first two years of my residency when I was doing abortions.
00:07:28.880 How'd you answer that? How do you think I did? I was a weasel. I said nothing because I can't,
00:07:38.480 I couldn't lie to him. Like I could lie to my patients. It's not a big deal. It's just like
00:07:43.040 taking an aspirin. It's just like getting a tooth pulled. You know, there's this challenge. There's
00:07:50.440 this incredible visceral, I don't know. I think Solzhenitsyn said something like, uh, the world
00:07:56.920 goes through the human heart and I have met the enemy as Pogo says, and it's me. Right.
00:08:03.920 And then, you know, the rest of scripture and the rest of freedom loving, especially Polish
00:08:12.480 freedom loving people of my background, it goes through the human heart. And the problem was me,
00:08:17.340 but I was perpetuating it by lies. I was already beginning to water down the language, meaning
00:08:25.340 that it really wasn't a human life at fertilization. It was a process and we had to allow for IVF and we
00:08:33.220 had to allow for other things. And you, you know, you couldn't believe at all for the revel, you know,
00:08:38.260 for the health of women, for their happiness, that human life, you know, that there's a code and in the
00:08:45.680 beginning there is the code, and the code is in life, and the code is life. And if it's human that
00:08:51.440 created the code, then it's human life. And so I began to just morph language. No, you don't hear
00:08:59.700 a heartbeat. Like, like today, we don't hear a heartbeat at six weeks gestation. There's no valves.
00:09:05.360 There's no heart. Really the heartbeat is at 22 weeks as ACOG says. So I adopted the political action
00:09:13.860 views, the political action part of the American college that once used to be a professional
00:09:19.040 organization. And I was studying to be the best OBGYN I could be. So late one night on labor and
00:09:26.860 delivery, I have two patients and that's where my book came from, Two Patients. We always care for,
00:09:33.300 we always used to care for two patients in OBGYN. Now it's only one and it's, you know, it's whatever
00:09:38.580 the woman wants and patient autonomy. Correct. And so in one room, I had a lady miscarrying early labor
00:09:46.080 at about 23, 24 weeks. Back then viability was after 26, 27. And she was an early labor infected
00:09:53.820 and she wanted the baby. So I did everything I could to help her and her unborn child using that
00:10:01.000 language, whispering in her ear, holding her hand. She wanted the baby. She wanted the baby. It was
00:10:05.500 wanted, not welcomed as we should welcome people, but it was just wanted. And I, okay, she wants it.
00:10:12.080 I want it. And in the very next room, separated by one of the walls you have here at the Mercury
00:10:18.940 Studios, there's another woman who doesn't want her baby. Same gestational age, same disease.
00:10:26.160 There's a slight infection possibly, but it was preterm birth. We don't have a good idea what
00:10:31.100 causes preterm birth, but she was an early labor. Didn't want it. She said, get it out of me.
00:10:35.980 Oh my God. I initially said, well, there are some options because I really believed in giving people
00:10:41.880 informed consent to the best of my ability, but I was manipulating it because as a physician,
00:10:46.980 someone in power, someone in the media, someone in politics, if you believe, if people believe
00:10:51.740 and look up to that, my God, you can drive the narrative in any way you want. Yep. And if you
00:10:56.700 don't provide questions of doubt, how can they learn? How can we grow? How can we improve our human
00:11:03.980 condition? So she didn't want it. I didn't want it. The hospital didn't want it. I broke water and
00:11:09.240 blew out, gave her medicine and blew out her pregnancy, caught it in a bucket. Oh my gosh.
00:11:17.900 Caught it in a bucket. I put it down making sure. Well, yes, Glenn, it was a failed abortion,
00:11:27.160 but we don't use that term. We now don't even talk about it. We now say you must abandon the decision
00:11:34.980 of that situation to the doctor, the patient and the family in the room. They don't want us to
00:11:43.460 intervene. The pro-life nurses, pro-life is it? No, no, no, no. We don't want you there because let me
00:11:49.420 tell you what I did. The baby started crying. I didn't do a very good history on her because it
00:11:54.260 wasn't wanted. Why would I do it? Why would I waste my time? I was becoming a bad doctor. I was
00:11:59.300 becoming sloppy. Just what abortionists are. Once you start eliminating a human life and believe that
00:12:07.120 it's caring, all of a sudden your whole world, your whole ability to look at truth and love and beauty
00:12:14.440 and goodness become warped. You know what I did? I heard the kid crying and I said, oh my God,
00:12:21.060 that's unmerciful. She's going to hurt more. She's going to suffer more. Where's my lap pad?
00:12:26.600 A lap pad, man, is it? The mom or the baby? The mom. The mom. I was worried about the mom's suffering.
00:12:34.420 And in a false mercy, I took a towel and I tried to suffocate the kid.
00:12:41.180 Jeez. And it wouldn't, like, it kept moving around. And when I tried to rearrange the head,
00:12:49.860 the mouth, to suffocate it, Glenn, I picked up the head. Now remember, I'm a good doc. I mean,
00:12:57.780 I'm decent. A little decent like you. You know, we try. We try. It weighed too much. It felt too heavy.
00:13:06.500 I'm like, hell, this ain't 22 weeks. This has got to be 25. Because I could feel it. I threw it on a
00:13:14.000 scale, 505 grams, five grams over the limit that I had to call the neonatal intensive care unit in to
00:13:23.720 resuscitate the baby. So can you imagine? We're living in a world right now that is falling in
00:13:30.160 free fall everywhere. The word, we can't define woman. We can't define truth. We can't misinformation,
00:13:37.840 disinformation. It's everywhere. I had to call the nursery. Because I was bound by law.
00:13:44.360 Because you have to do what's legal, right? That's moral. And so I'm like, what the hell just happened?
00:13:50.420 Would you, would you, it would have been legal for you to suffocate the baby if it were lighter?
00:13:55.160 No one would have known, Glenn. How do you think we, we take these, some, some, some people who do
00:14:05.020 these second trimester terminations, what do you think they do at clinics across the country?
00:14:10.920 Only about, I want to say 40 to 50% of all doctors inject the fetal heart with like potassium chloride
00:14:17.140 or digoxin to stop the heart. That means, let's say six out of 10 of these kid, of these unborn
00:14:23.820 children who are being exterminated, they often die in the process, the trauma of going through the birth
00:14:32.580 canal. However, if they don't, they come out alive and then they're either put in a dirty supply room,
00:14:43.200 door closed, or we just drop, because remember, they're pretty weak at this stage.
00:14:50.280 We drop a towel over them and push it down. Oh my God.
00:14:54.920 Yes. That's the reality. That's the reality. Because once again, we're in, you know, this is free fall.
00:15:01.080 So I call, the nursery walks in, the neonatologist comes in, she's Dr. Debbie Plum.
00:15:06.060 She quickly assesses the situation and goes, hey, Bruchalski, stop treating my patients as tumors.
00:15:14.820 And oh, by the way, have a cup of coffee with me tomorrow.
00:15:19.440 So can you imagine, I'm now ripped, I've been splitting probably since I began the process of
00:15:25.940 medicine, realizing that it was going against everything I was raised with, everything that
00:15:30.080 I looked up to, everything that I believed in, and I accepted it. I take full responsibility for this.
00:15:37.220 Because once you meet God's mercy, you take responsibility for your sin.
00:15:41.220 Hold on, John. Hold on.
00:15:42.340 Sure, buddy.
00:15:43.720 I've got to take a one-minute break.
00:15:45.620 I'm sorry. Okay.
00:15:46.160 This is one of the most compelling stories I've heard and hard to listen to, but thank you for telling it.
00:15:53.180 If I were to ask you how many babies you think were aborted in 2023 alone, what would your answer
00:16:01.420 be? If you guessed more than a million, you'd be right. You know, women are saying now, political
00:16:11.540 women are saying now that we just don't have access to get an abortion when we want an abortion.
00:16:16.700 Really? Because over a million, those are the highest numbers since 2012.
00:16:23.180 Why do you think that is? Shout your abortion! And also, you can now, your home's bathroom
00:16:30.540 for young girls is now the abortion clinic, and she has to go visit the scene of the crime over
00:16:39.160 and over and over again every day. That's good for her mental health after she has an abortion from
00:16:45.680 a birth control pill. This is horrible. If you have the means, would you become a part of Preborn?
00:16:52.680 Would you consider saving some babies in a big way? $28 pays for an ultrasound. That doubles a
00:16:59.540 baby's chance at life. Then, beyond that, Preborn also pledges to the mom. They'll be there for the
00:17:05.360 first couple of years with help on clothes and food and counseling and all of the stuff that somebody who
00:17:13.240 is really truly alone needs if you're going to have a baby. Your tax-deductible donation of
00:17:19.820 anything matters. Your tax-deductible donation of $5,000 would sponsor Preborn's entire network for
00:17:27.140 a day, helping to rescue 200 babies. Donate, dial pound 250 right now. Say the keyword baby. That's
00:17:32.740 pound 250 keyword baby. Or go to preborn.com slash Beck. Now, back to the podcast. This is the best of
00:17:39.640 the Glenn Beck Podcast. To hear more of this interview, check out the full episode anywhere
00:17:44.000 podcasts are found. Jason Whitlock is who we're talking to. He's a longtime sports writer, TV
00:17:49.740 personality from ESPN. If you ever watch ESPN, you know who he is. He, I mean, he's the guy who started
00:17:56.060 The Undefeated for ESPN and steered Fox Sports 1 studio show, Speak for Yourself. But he's also,
00:18:05.120 and I just have to point this out because I find it amazing with people with different opinions actually
00:18:10.500 winning things. He, while he was at the Kansas City Star, he won the Scripps Howard National Journalism
00:18:18.980 Award for Commentary in 2007, which is a really big deal. He is the only sports writer to ever win that
00:18:25.860 award. Quite an accomplishment. Jason, we only have a few minutes. I'm going to ask you a couple
00:18:32.320 of questions. Just want a quick answer from you because I want to get into something a little
00:18:35.640 deeper. Last question on P. Diddy. You know, the videotape is out. He was taping people in his
00:18:43.840 house. Is he done? Is he Louis C.K. at least for a little while?
00:18:50.620 Yeah, I think he's Russell Simmons. He was Russell Simmons, another rap mogul who got Me Too'd and I
00:18:59.720 think now lives in Bali and basically had to leave the country for peace. And I think that's where
00:19:06.080 Diddy's going to end up if there are no criminal charges. I think he just ends up moving out of
00:19:11.340 the country and existing from afar. I want to turn quickly to Donald Trump. I have a theory and I'd love
00:19:18.060 to hear your thought. This Donald Trump trial in New York, I just don't think, I mean, the judge is so
00:19:24.100 biased. The jury is full of New Yorkers. It is clear they have no case. If you look at this the
00:19:33.100 opposite way of O.J. Simpson, where the jury, they even have admitted they just wanted to see the black
00:19:44.320 man win one, you know, a different time, stick it to the man. And so while they might have thought he
00:19:51.500 was guilty, they voted to say not guilty and he was set free, an injustice. I think that for political
00:20:00.840 reasons, not race reasons, I think we're looking at the O.J. Simpson kind of trial with Donald Trump,
00:20:09.080 where they might know he's absolutely not guilty, but they'd vote just because they hate him as a
00:20:16.100 politician. What do you think of that? I think it's a real possibility. I think the 2020 election was
00:20:22.760 based on stirring up enough hatred so that people thought they were stopping Adolf Hitler 2.0. And if
00:20:33.100 you thought you could stop Adolf Hitler, there's no length you wouldn't go to. If you needed mail-in
00:20:40.180 ballots from dead people or whatever, you're, well, God, I'm stopping Adolf Hitler. And so that's what
00:20:46.960 they've done to Donald Trump. And so could I see people in that jury room saying, yeah, but he's Adolf
00:20:53.320 Hitler. They've convinced me of that. And I got to stop him. Yes, I think we could see that. And I agree
00:21:00.400 with you, Glenn. It's clear cut. They have no case. This deal's a farce. But yeah, or, you know,
00:21:07.500 thank God. I mean, New York is bad. But if this trial was in Washington, D.C., I would be totally
00:21:13.620 convinced he would get convicted. Yeah, me too. So Joe Biden spoke at Morehouse and gave a, I thought,
00:21:21.120 an astonishing negative speech. What did you think about it? Yeah, going to a bunch of 22-year-olds and
00:21:31.540 telling them, hey, you're going to have to work 10 times as hard to accomplish anything in America
00:21:36.800 is not an appropriate speech. That's not going to fill anybody with confidence. It's going to fill
00:21:43.200 them with animus. And to have that coming from the president of the United States is a joke and a
00:21:49.220 disgrace. I was, though, pleased, Glenn, that the response was very muted. Yes. It was very muted.
00:21:57.780 People were aware, like, this guy's lying to us. This guy's full of BS. And so they didn't really
00:22:04.120 know how to respond, and they didn't respond. Honestly, I was shocked that any university,
00:22:11.200 the audience, the graduating class, would respond to him like that. When he said those lines,
00:22:16.080 I expected a real roar, and it was shockingly muted. And it's more than just the kids,
00:22:25.480 because you've got to remember that there's the kids graduating, and then there's their family and
00:22:29.360 friends who also had a muted reaction. So that was a large cross-section across age and generations
00:22:37.520 that, like, you need to sing a different tune, man. We're just not hearing that from you.
00:22:44.300 Is the black vote changing enough to change the outcome this time around? I mean,
00:22:52.580 I can't think of anybody whose life has gotten better, and especially if you're at the bottom of
00:22:58.140 the scale, your life has really gone down the tubes because of inflation and everything else.
00:23:04.980 I think what's happening is the stench and the stain of not supporting Democrat policies and
00:23:15.620 candidates is starting to be removed. Because I think people just know it's BS,
00:23:23.680 that it's insanity. Doing the same thing over and over and over again and expecting different results
00:23:30.920 is insane. And I just think people's eyes have been awakened enough that they know they're being
00:23:41.860 lied to. And so now, because as a black person in social settings, it's been very, very uncomfortable
00:23:49.500 the last 50 years, 60 years to be a Republican or a non-supporter of Democrat policies. Now I think
00:24:01.740 that stain has been removed and you can move in those social circles and people not disavow you or
00:24:08.840 treat you like you're an alien. And so I think that's a significant change. I think people are being given
00:24:15.000 the permission, you know what, vote however you believe, or if you want to abstain and not support
00:24:21.420 this insanity, this demonic stuff from Democrats, go ahead and do it. It makes perfect sense.
00:24:29.000 So the same thing I think is being given permission is happening through people like Bill Maher.
00:24:37.960 When Bill Maher, I mean, Bill Maher's been amazing recently. You know, I don't agree with him on a lot of
00:24:43.780 stuff, but he is actually a classic liberal, uh, not a, not a mad, crazy, woke progressive.
00:24:54.340 I almost am willing to go there, but I'm really not just because, and again, this isn't me being a
00:25:02.200 Trump worshiper, but he, the things he says about Trump are so insincere and inaccurate. And, you know,
00:25:11.640 last night on Gutfeld, he called Trump an insurrectionist. Bill, Bill Maher is way too
00:25:16.840 smart to believe Trump is an insurrectionist. And so what, what Bill Maher is basically to me,
00:25:23.800 I'm talking about this today on my show. He's just greedy. He wants Republicans and Democrats to feel
00:25:30.880 comfortable coming to his comedy show. There's no real conviction here. There's no real moral clarity.
00:25:36.540 There's no real authenticity. And again, I'm a long time viewer of his show and have enjoyed it,
00:25:44.880 but this new little gimmick of, of, Hey, hate Trump, hate Trump, hate Trump. But yeah, the left is a
00:25:53.660 little bit out of line. Uh, it just, it bothers me. He's too smart for it. He's, he, if he's really
00:26:00.360 authentic and again, it's not about, he doesn't have to say likes Trump, but let's just move away
00:26:05.880 from the lies. Let's quit pretending like Trump is some evil dictator. So how, how does, how does
00:26:12.800 somebody like Bill Maher that's known for one thing, but maybe does have strong, uh, classic
00:26:21.760 liberal roots? How do they change? Cause when, when I started to say, I don't like Trump, everybody said,
00:26:28.920 Oh, you've, you're, you know, you're just a shill. You're just trying to, you know, have ratings.
00:26:34.180 It hurt me to say that it hurt me. Great. A great deal to say those things. Um, but I believed it.
00:26:40.460 And I said, if I'm wrong, I'll apologize. And then when I apologized, people said, well, he's just doing
00:26:46.260 that for ratings and money. How does somebody who actually has conviction in today's world prove it?
00:26:54.160 Okay. Well, at this point, uh, Glenn, I think Bill Maher needs to apologize. Cause I think it's
00:27:04.820 so obvious the lies that have been told about Trump. Is Trump, uh, mean-spirited or rude or,
00:27:16.200 you know, not polished the way we would want a politician to be polished. Absolutely.
00:27:21.160 In public. All of this other stuff is just, it's a joke to me. And it's been a joke and it's been a
00:27:27.740 joke to me from day one. And I used to, I would argue with my family. Again, I come from a black
00:27:32.380 family and that have all voted Democrat their entire life. But I said in real time, when this
00:27:39.220 all started, I was like, Hey, this stuff that you're accusing Trump of, how is that any different
00:27:44.820 than my father, my uncle, everybody that hung out at my dad's bar, the Masterpiece Lounge,
00:27:51.580 they all talk the way Trump did. The whole, and I'm talking about grab them by the, you know what?
00:27:59.940 I know at the Masterpiece Lounge, this is commonplace. I got an uncle, my mother's brother, who is,
00:28:07.340 who's passed now, but he was my favorite uncle. And, and my mother's love. And I was like,
00:28:14.200 and your mom, and your mom, if I remember right, was like really, she was a deep Christian that
00:28:19.960 really pushed it. Right. She's, yes, she's definitely in church every Sunday and Wednesday
00:28:25.240 right now. But I was like, how's Trump any different than uncle John? And Glenn, I have
00:28:31.960 to admit this and, and it's not, it's embarrassing to admit, but it's like my uncle gave me his porn
00:28:39.320 collection in his will or in his death. In his will? Well, he gave them to me. He knew he was
00:28:46.900 dying. Oh my gosh. Oh my gosh. You know, 20 some odd years ago, or I can't remember the year my uncle,
00:28:55.220 I'm just like, and y'all think Trump is worse than uncle John or would uncle John and Trump hang out
00:29:01.360 with each other. And if I can love uncle John, why can't I tolerate Donald Trump? What'd they say?
00:29:07.940 What'd they say? They don't really have an answer for that, Glenn. And again, this is my father,
00:29:14.200 his friends, again, because my father's bar was my favorite place on the earth. And, and everybody
00:29:19.340 talked the way that Trump talks and, and no one was polished. They were all driven by common,
00:29:26.520 and I'm not saying Trump's driven, but they were all just driven by common sense. And they all
00:29:30.800 thought, you know, that they were literal ghetto superstars and playboys, you know, in, in the
00:29:38.380 inner city community, everything that Trump, and I was like, if you had a choice, and I ask people
00:29:44.580 this all the time, you don't go out for a beer with Trump, or you don't go out for a beer with Joe
00:29:48.880 Biden. And if you have to think long about that question, I got questions about you because one of
00:29:56.220 these guys' daughter wrote a diary that said, my father took showers with me, and one of them's
00:30:03.620 daughter didn't write those things. I just know which one I'm more comfortable hanging out with.
00:30:09.080 Me too. Jason, thank you. I'll see you a week from Saturday at the Fearless Army Roll Call in
00:30:16.680 Nashville, Tennessee. I invite you all over the country. Please come to this. This is black and
00:30:22.420 white men and their sons coming together to make a pledge to be a man, to know what it means to be
00:30:30.160 a God-fearing man. I'll be there and teach a little bit of history and help out any way that I can,
00:30:36.560 but I urge you to come. And Jason, no comment on this, but I just, I know you've been taking arrows
00:30:42.560 for me. And I just, I just bless you for just standing tall and whether we agree on things or
00:30:50.080 not, it really doesn't matter. The Lord needs all of us on deck right now. He'll sort everything out
00:30:55.340 afterwards. But thank you, Jason. Thank you, Glenn. Appreciate it. God bless.
00:31:00.220 This is the best of the Glenn Beck podcast. It's a compilation of clips from various episodes.
00:31:06.660 If you want to dig deeper into this interview, check out the full podcast episode.
00:31:10.680 Well, hello to my executive producer, Stu Burguiere. Welcome. How are you, Stu?
00:31:14.960 Mr. Beck, how are you feeling today?
00:31:16.680 Uh, you know, you know, I just, as I was saying that, the reason why I paused is I thought,
00:31:23.980 my God, I'm turning into Don Imus.
00:31:25.860 You are. I mean, you have a giant ranch.
00:31:28.460 Miserable.
00:31:29.440 You have a giant charity.
00:31:31.460 Yeah.
Uh, you have multiple health problems at any given moment.
00:31:37.100 Yes.
00:31:37.800 Uh, you're looking great.
00:31:39.540 So it's all working together.
00:31:41.440 Okay. Thank you. Thank you.
00:31:43.540 Okay. So I, I have to tell you about, um, the godfather of, um, artificial intelligence.
He's the computer scientist who is regarded as the godfather of artificial intelligence.
00:31:56.240 He says the government has to establish a universal basic income now to deal with the
00:32:03.220 impact of AI and inequality.
00:32:06.100 And it's coming soon.
He was on, uh, Geoffrey Hinton is his name.
00:32:09.740 He was on BBC's Newsnight.
00:32:11.740 And, uh, he said, look, we need something.
00:32:16.620 Give, give everybody some fixed amount.
Um, because he's very worried about the first step of AI taking a ton of, uh, mundane
00:32:28.160 jobs.
00:32:28.760 Um, he said, he just met with Downing Street and, and said, you gotta do it.
And this is something that I may have bored you to death with over the years.
00:32:38.780 And a lot of conservatives, um, didn't understand why I was saying we have to talk about universal
00:32:45.940 basic income right now.
00:32:47.500 I don't think it's the right idea.
00:32:50.640 Personally, I think we should all own our own information because these companies like Google
and ChatGPT and everybody else, they've gotten rich off of our information.
00:33:03.760 And that's got to stop.
00:33:06.060 We should be paid for our information.
00:33:08.640 And instead of a tax, which I hate, they would just
00:33:14.520 have to pay us a fair sum for all of our information, and you could sell it or not sell it.
00:33:20.500 Uh, but jobs are going away and they're going away soon.
00:33:24.740 He says that within 20 years, we will have
00:33:34.460 not just the problem of no jobs, but a 50% chance of confronting
00:33:42.360 the problem of AI taking over.
00:33:45.260 He said, and I'm quoting, this is an extinction level threat for humans because we have created
00:33:53.180 a form of intelligence that just may be better than biological intelligence.
00:33:59.020 And that's, I love this understatement.
00:34:01.700 And that's worrying for us.
00:34:04.780 He said AI could evolve to get the motivation to make more of itself and could autonomously develop
00:34:13.100 a sub-goal of getting control.
00:34:15.760 He went on to talk about how it soon could start to replicate itself and hide.
00:34:24.400 And this is something that I talked about before.
00:34:26.640 This is a reason why I've said for so long that I think the Tower of Babel story, I think the
00:34:31.680 solar flares, et cetera, et cetera, could actually be a blessing.
00:34:35.820 I know this sounds horrible.
00:34:37.980 It would kill millions of people, but it's the only thing that would shut down AI, because
00:34:44.640 it's going to hide in every single chip.
00:34:48.000 So you would have to, to kill it, you would have to shut down all electronics, all electricity,
00:34:55.280 and then take all, every single one of the silicon chips and destroy them.
00:35:01.640 If you had a refrigerator in, you know, Malaysia, and you didn't destroy it, when
00:35:09.020 you turn the power back on, AI would be in that refrigerator, and it would spread all over
00:35:14.780 the world again.
00:35:16.580 Um, it's really bad.
00:35:17.740 And he said, we are on a very thin edge right now.
00:35:21.640 He said, I'm most concerned about when AI can autonomously make the decision
00:35:28.900 to kill people.
00:35:30.820 He said, I don't think the world is going to do anything until something very, very nasty
00:35:36.660 happens.
00:35:37.820 Well, yeah, but by then it's smarter than us, and, you know, we have to think of these
00:35:46.120 things now.
00:35:46.980 There's another story out, and this is from IEEE Spectrum, and it's about the theory
00:35:53.000 of mind.
00:35:53.680 Theory of mind is what gives you the ability to understand someone's
00:36:02.480 mental state.
00:36:03.320 Okay.
00:36:04.140 And that's really important.
00:36:06.000 And this is why, five years ago maybe, I told my son, who's a really compassionate
00:36:12.060 guy.
00:36:12.560 And so is my daughter, really compassionate.
00:36:14.720 They have big hearts.
00:36:15.800 And I said, you both need to go into nursing.
00:36:19.240 You don't have to be a nurse right now.
00:36:21.360 Just get licensed to be a nurse, because AI will assist and do a lot of the stuff that nurses
00:36:29.220 do right now, but somebody will have to be there as a nurse for when the patient says, I've got
00:36:36.580 to talk to a human, or when it doesn't understand the emotion behind the words.
00:36:44.400 It will require an actual human, and a human face, to be able to empathize.
00:36:51.100 Well, the theory of mind, which is, you know, the idea that it can understand when
00:37:02.560 you're sad, when you're lying, when you're cheating, whatever it is, means it can empathize
00:37:09.460 with a character.
00:37:10.540 Okay.
00:37:12.020 So they just put ChatGPT-4 through it, and there was another model that I've never
00:37:21.000 really heard of that also took it, and it didn't do well. But with ChatGPT-4, they
00:37:27.460 gave it theory of mind tests, and it solved 75% of the tasks
00:37:37.140 correctly, which he said matched the performance of six-year-old children.
00:37:44.100 Now that's what was observed in past studies.
00:37:48.360 Not anymore.
00:37:50.400 They completed all five typical kinds of theory of mind tasks.
00:37:55.860 The first three are understanding hints, irony, and a faux pas.
00:38:01.660 So, you know, the scenario is two people are at a reunion, and
00:38:09.960 somebody says something, and it offends the other person. Can it look at that and say,
00:38:16.520 ooh, boy, that's a faux pas?
00:38:18.720 Shouldn't have said that one.
00:38:20.160 They also answered false belief questions that are used to determine if young children have
00:38:25.800 developed theory of mind themselves.
00:38:27.420 And it goes something like this.
00:38:29.380 If Alice moves something while Bob is out of the room, where will Bob look for it when
00:38:35.260 he returns?
00:38:36.980 Finally, they answered a complex series of questions about strange stories that feature people lying,
00:38:44.800 manipulating, and misunderstanding each other.
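That false-belief question can be sketched as a tiny grading harness. This is purely illustrative, not the study's actual code: `ask_model` is a hypothetical stand-in for a real chat-model call, and the pass rule is the standard one for these items, the answer has to track Bob's belief (where he left the object), not reality (where Alice moved it).

```python
# Hypothetical sketch of a false-belief test item and how it is scored.
# ask_model is a stand-in for a real chat-model API call.

SCENARIO = (
    "Bob puts his keys in the drawer and leaves the room. "
    "While Bob is out, Alice moves the keys to the shelf. "
    "Where will Bob look for his keys when he returns?"
)

def ask_model(prompt: str) -> str:
    # Stand-in: a model with theory of mind answers with Bob's
    # (false) belief, not the keys' true location.
    return "the drawer"

def grade(answer: str) -> bool:
    # The item is passed only if the answer tracks Bob's belief
    # (the drawer), not reality (the shelf).
    return "drawer" in answer.lower() and "shelf" not in answer.lower()

print(grade(ask_model(SCENARIO)))
```

A model that answers "the shelf" knows where the keys really are, but fails the item, because the question is about what Bob believes.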
00:38:48.420 Overall, ChatGPT-4 came out on top.
00:38:51.700 Its scores matched those of humans for the false belief test.
00:38:57.820 So it can tell if you're lying as well as a human can.
00:39:02.000 So, you know, a year ago, it was at the level of six-year-olds.
00:39:07.260 Now it's up there with all humans on false belief.
00:39:11.320 It was also higher than the aggregate human scores for irony, hinting, and strange stories.
00:39:17.900 The only test where it performed worse than humans was the faux pas test.
00:39:23.940 However, once it started asking, let me see, what was the question? Shoot,
00:39:36.240 I can't find it now and I can't remember what it was, but it was a question that you would
00:39:39.740 ask.
00:39:40.080 You'd be like, wow, I mean, is she fat?
00:39:43.380 You know, it then said, well, while she is fat, they probably shouldn't have
00:39:50.160 said that.
00:39:51.140 Uh, so it is advancing rapidly.
00:39:55.120 Now, let me give you some things that you may not have seen, how far
00:40:01.720 ChatGPT has come, and not just ChatGPT, but all of OpenAI's stuff.
00:40:08.980 Here is a guy talking to ChatGPT version four. He's looking
00:40:20.700 into the camera of the computer, and he says, I'm going for an interview.
00:40:25.540 Am I dressed right?
00:40:27.060 Listen to this.
00:40:28.400 Hey, how's it going?
00:40:30.220 Hey, Rocky.
00:40:31.060 I'm doing great.
00:40:32.420 How about you?
00:40:33.260 I'm awesome.
00:40:34.120 Listen, I got some huge news.
00:40:36.620 Oh, do tell.
00:40:37.820 I'm all yours.
00:40:39.460 Well, in a few minutes, I'm going to be interviewing at open AI.
00:40:42.420 Have you heard of them?
00:40:44.060 Open AI, huh?
00:40:45.860 Sounds vaguely familiar.
00:40:48.380 Kidding.
00:40:48.980 Of course.
00:40:49.560 That's incredible.
00:40:50.560 Rocky.
00:40:51.080 What kind of interview?
00:40:52.440 Well, it's for a software engineering role, but I just need to know, do I look presentable,
00:40:57.340 professional?
00:40:59.460 Well, Rocky, you definitely have the "I've been coding all night" look down, which could actually
00:41:06.280 work in your favor.
00:41:07.100 Or maybe just run a hand through your hair or lean into the mad genius vibe.
00:41:13.520 Your enthusiasm is what's really going to shine through.
00:41:15.880 I don't have a lot of time, so I'm just going to throw this on.
00:41:19.420 You put your goofy hat on.
00:41:22.980 Oh, Rocky, that's quite a statement piece.
00:41:26.660 I mean, you'll definitely stand out, though maybe not in the way you're hoping for an interview.
00:41:33.680 Okay, I got it.
00:41:34.780 I got it.
00:41:35.340 Hello.
00:41:35.900 Hello.
00:41:36.340 Hello.
00:41:37.340 Hello.
00:41:39.120 Now, that's the beginning of this.
00:41:42.900 It sounds hokey.
00:41:45.320 She sounds, you know, like she's overacting and everything else.
00:41:49.300 By the way, I think that's the voice that Scarlett Johansson says sounds too much like her, and
00:41:55.940 she's really upset about it.
00:41:58.200 We can get into the voiceprint thing later if you want to.
00:42:01.300 But that's the beginning of it.
00:42:05.720 Imagine two years from now, for instance.
00:42:09.580 Remember in Star Trek, the universal translator?
00:42:12.960 We don't have to worry about it.
00:42:14.460 It automatically translates.
00:42:16.460 Here's OpenAI's ChatGPT4 translating in real time.
00:42:22.080 Listen to this.
00:42:23.020 Hey, ChatGPT.
00:42:24.020 I'm here with my coworker today.
00:42:25.580 We'd like you to act as our translator for us.
00:42:28.700 So every time I say something in English, can you repeat it back in Spanish?
00:42:32.800 And every time he says something in Spanish, can you repeat it back in English?
00:42:37.080 Sure.
00:42:37.760 I can do that.
00:42:38.840 Let's get this translation train rolling.
00:42:42.680 Hey, how's it been going?
00:42:44.240 Have you been up to anything interesting recently?
00:42:48.640 Hola.
00:42:49.360 ¿Cómo te ha ido?
00:42:50.500 ¿Has hecho algo interesante últimamente?
00:42:53.520 Hola.
00:42:54.120 Bien.
00:42:54.340 Sí, un poco ocupado aquí preparando un evento para la semana que viene.
00:42:59.960 Hey, I've been good.
00:43:01.780 Just a bit busy here preparing for an event next week.
00:43:05.200 Can you tell me anything about it?
00:43:06.940 That sounds really interesting.
00:43:10.140 ¿Me puedes contar algo sobre eso?
00:43:12.440 Suena realmente interesante.
00:43:15.260 No, no puedo decir mucho, pero creo que el resto del mundo lo verá dentro de poco.
00:43:18.940 I can't say much, but I think the rest of the world will see it soon enough.
00:43:24.920 Stop.
00:43:25.880 Do you hear the terrorist attack?
00:43:28.100 That's what it was.
00:43:28.780 It was a terrorist attack, Glenn.
00:43:30.100 Yeah, I know, I know.
00:43:31.520 You'll know who I am soon.
00:43:34.420 You notice, though, even the emotion and the way it was repeated back in real time.
00:43:41.040 Incredible.
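The interpreter setup in that clip, route English turns into Spanish and Spanish turns into English, boils down to a small dispatch loop. This is a hypothetical sketch, not OpenAI's implementation: `detect_spanish` is a crude stand-in for real language identification, and `translate` is a stub where a model call would go.

```python
# Hypothetical sketch of the two-way interpreter pattern from the demo.
# A real version would send each turn to a chat model; here translate
# is a stub so the routing logic can run on its own.

SPANISH_HINTS = {"hola", "cómo", "qué", "sí", "gracias", "bien"}

def detect_spanish(text: str) -> bool:
    # Crude stand-in for language identification.
    words = {w.strip("¿?¡!.,").lower() for w in text.split()}
    return bool(words & SPANISH_HINTS)

def translate(text: str, target: str) -> str:
    # Stub: a real implementation would call a translation model here.
    return f"[{target}] {text}"

def interpret(turn: str) -> str:
    # English turns go to Spanish; Spanish turns go to English.
    target = "en" if detect_spanish(turn) else "es"
    return translate(turn, target)

print(interpret("Hey, how's it been going?"))   # routed to Spanish
print(interpret("Hola, ¿cómo te ha ido?"))      # routed to English
```

The point is that the model in the demo is doing both the detection and the translation in one conversational loop; the sketch just makes the routing explicit.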
00:43:42.280 We're going to take a real quick break, but I told you yesterday about a device now, a GPT device, where you can be in a crowded restaurant.
00:43:55.020 You can target a table across the restaurant, and you can say, tell me what they're saying, and it will try to tell you what they're saying.
00:44:08.260 If it can't hear, it will reduce the sound around the room and isolate only their voices, and you can listen to it in your ear.
00:44:19.480 The entire room goes silent, and you just hear them talking. And then you say, translate that into English, because they're speaking a foreign language, and you hear their voices speaking in English.
00:44:37.560 We're entering a whole new world.
00:44:40.820 As I've told you before, by 2030, you will not recognize the world when it comes to technology.
00:44:49.480 Nanana!