The Glenn Beck Program - May 12, 2025


Best of the Program | Guest: Dr. Jay Bhattacharya | 5/12/25


Episode Stats

Length

48 minutes

Words per Minute

168.4

Word Count

8,183

Sentence Count

691

Misogynist Sentences

3

Hate Speech Sentences

4


Summary

In this episode of the Glenn Beck Program, Glenn and Stu debate whether the government should be allowed to negotiate drug prices. Glenn argues that the Constitution doesn't allow the government to do that. Also, Dr. Jay Bhattacharya discusses the latest lab leak at Fort Detrick, and more.


Transcript

00:00:00.000 When I found out my friend got a great deal on a wool coat from Winners,
00:00:03.760 I started wondering,
00:00:05.460 is every fabulous item I see from Winners?
00:00:08.520 Like that woman over there with the designer jeans.
00:00:11.240 Are those from Winners?
00:00:12.760 Ooh, or those beautiful gold earrings.
00:00:15.220 Did she pay full price?
00:00:16.560 Or that leather tote?
00:00:17.580 Or that cashmere sweater?
00:00:18.480 Or those knee-high boots?
00:00:20.280 That dress?
00:00:21.060 That jacket?
00:00:21.720 Those shoes?
00:00:22.760 Is anyone paying full price for anything?
00:00:25.720 Stop wondering.
00:00:26.980 Start winning.
00:00:27.900 Winners.
00:00:28.480 Find fabulous for less.
00:00:30.260 Hey, today's podcast, I find myself at odds with,
00:00:33.840 well, I agree with this, but I don't agree with that.
00:00:37.160 When Trump is proposing government coming in to regulate prescription costs,
00:00:41.340 it's an interesting debate I had with myself and Stu on the air today,
00:00:45.220 and I think a lot of people feel in many ways the same way.
00:00:48.080 And also, is it possible that Pope Leo isn't going to be as bad as people thought?
00:00:52.540 Hear what he had to say on AI and why I agree with him.
00:00:55.040 Also, Dr. Jay Bhattacharya from the National Institutes of Health,
00:00:58.880 on the prescription drugs, and that lab leak at Fort Detrick, which is terrifying.
00:01:05.320 All of that and more on today's podcast.
00:01:07.260 First, let me tell you, you've got too much to do, too much to fight for,
00:01:11.220 too many people counting on you to be sidelined by pain.
00:01:13.560 So if you've been pushing through it, masking it, ignoring it, or just learning to live with it,
00:01:17.940 the pain's not going to back down. Change your life.
00:01:20.040 Relief Factor will help you hit back and hit back hard from the inside with a powerful drug-free daily supplement
00:01:28.020 that targets the underlying inflammation that causes most of the aches and pains
00:01:32.520 and, honestly, most of the diseases in our body as well.
00:01:35.740 Back, neck, knees, shoulders, inflammation, if it's involved, Relief Factor is designed to fight it.
00:01:41.940 Millions have tried it, and a lot of them felt the difference in just a few days.
00:01:46.780 Pain's had the upper hand long enough.
00:01:48.300 It's time to take it back, stop masking it, help your body fight it naturally.
00:01:53.000 Stop living with the aches and pains.
00:01:54.480 See how Relief Factor, a daily drug-free supplement, could help you feel and live better every single day.
00:02:00.400 Give their three-week quick start a try.
00:02:03.180 It's just $19.95.
00:02:05.500 In a few weeks, even days, feel the difference Relief Factor can make for you.
00:02:08.880 You don't have to be stuck living with pain.
00:02:10.780 Visit relieffactor.com now or call 1-800-4-RELIEF.
00:02:13.800 That's 1-800-4-RELIEF.
00:02:17.120 Hello, America.
00:02:18.300 You know we've been fighting every single day.
00:02:20.020 We push back against the lies, the censorship, the nonsense of the mainstream media that they're trying to feed you.
00:02:26.460 We work tirelessly to bring you the unfiltered truth because you deserve it.
00:02:31.320 But to keep this fight going, we need you.
00:02:33.780 Right now, would you take a moment and rate and review the Glenn Beck Podcast?
00:02:37.140 Give us five stars and leave a comment because every single review helps us break through Big Tech's algorithm to reach more Americans who need to hear the truth.
00:02:46.300 This isn't a podcast.
00:02:47.560 This is a movement.
00:02:49.280 And you're part of it.
00:02:50.260 A big part of it.
00:02:51.120 So if you believe in what we're doing, you want more people to wake up, help us push this podcast to the top.
00:02:56.240 Rate, review, share.
00:02:57.860 Together, we'll make a difference.
00:02:59.880 And thanks for standing with us.
00:03:01.060 Now let's get to work.
00:03:02.180 You're listening to the best of the Glenn Beck program.
00:03:14.380 So I find myself in this situation today.
00:03:18.800 Agreeing in theory and yet disagreeing in theory.
00:03:23.540 When we're talking about the president coming in and saying, hey, I'm negotiating for drug prices.
00:03:28.980 Okay, wait, hold it.
00:03:30.160 Wait.
00:03:30.380 I don't like this because I don't want the government involved in any of this.
00:03:34.980 However, hmm.
00:03:37.480 If I don't like the government involved in any of this, then we shouldn't have Medicare, Medicaid and socialized anything.
00:03:44.560 The government should be out of it.
00:03:47.100 And that's where we have been as a nation or I'm sorry, as a movement.
00:03:51.840 Most of us have been, okay, can't do this.
00:03:54.700 Can't do.
00:03:55.100 I don't want this.
00:03:55.900 But that requires us to debate and win the debate on the bigger issue.
00:04:04.240 Nowhere in the Constitution is the government allowed to negotiate for drug prices.
00:04:11.280 Nowhere in the Constitution is the government allowed to do socialized medicine.
00:04:17.320 Okay, so if you want to have that argument, I am so with you on that.
00:04:23.580 Let's have that argument.
00:04:25.100 Let's have that argument.
00:04:25.960 It's a fun argument.
00:04:26.700 Right.
00:04:27.520 Now, if you want to just have the argument that, well, the president shouldn't negotiate on our behalf.
00:04:33.880 Wait a minute, as a taxpayer, if you're going to, can we just not keep getting hit in the face by everybody around the world?
00:04:41.300 They're all, this is so complex.
00:04:44.680 This makes my brain hurt.
00:04:48.820 Everyone else is negotiating.
00:04:51.920 We're not.
00:04:53.000 So we're getting screwed.
00:04:54.400 We're paying $2,000, you know, for a drug that the rest of the world is paying $20 for.
00:05:01.120 However, let's remember, the rest of the world is on socialized medicine.
00:05:06.640 They may pay $20 for a drug, but you still can't get that drug over there because it's socialized medicine.
00:05:14.240 And, of course, this is a big part of one of the reasons, you know, why we pay a premium is we generally get access to these drugs first.
00:05:21.320 Correct.
00:05:21.640 The GLP-1s are a great recent example of this, all the weight loss injection drugs.
00:05:27.020 These are, you know, corporations made them.
00:05:29.440 In fact, not even just American corporations.
00:05:31.500 Novo Nordisk is not an American corporation.
00:05:34.000 They're the ones that do Ozempic and that.
00:05:37.940 And when they went into shortage, which they did when they came out because everybody wanted them, we were like the only country getting them.
00:05:45.220 Right.
00:05:45.860 We got them before everybody else in the globe.
00:05:48.600 Now, whether you like them or not is not the point here.
00:05:50.500 The point is that these new drugs tend to come to markets, just like, by the way, every other product.
00:05:54.660 When you come out with a new product, where do they go?
00:05:56.600 To where people will spend more for them.
00:05:58.660 Right.
00:05:58.980 So they came here and they didn't go anywhere else.
00:06:02.440 You know, again, do you want that?
00:06:05.100 You might not.
00:06:06.000 So here's the thing.
00:06:06.680 Here's the thing.
00:06:07.380 As long as you, as long as the government doesn't get into negotiating drugs for everyone, you know, personally.
00:06:15.240 Oh, you mean the socialized medicine thing that the government is doing, the Obamacare?
00:06:21.620 You can't get that drug?
00:06:23.320 Hey, guess what?
00:06:24.080 Over here in the free market, you get off of Obamacare.
00:06:27.040 You can have that drug.
00:06:28.400 Yeah.
00:06:28.940 You know, if you make all of the system actually do what it will do eventually, which is destroy people's lives and become the VA for everybody, where everybody is like, I think I'd rather kill myself than go through this anymore.
00:06:45.800 Um, then maybe you have a chance of ending it, because as long as they're not negotiating for the part of the market that is semi-free, so it's not free because of all the government restrictions on, you know, healthcare and insurance.
00:07:04.260 But if insurance, you know, maybe we can get that free and then that system would work and we'd be able to have the drugs.
00:07:10.020 But right now, I don't know why we have to have this broken system and not negotiate and look out the front, not with super vision, not with 20/10 vision, but with 20/20 vision, and look ahead and go, oh, I see that.
00:07:30.880 All of the bridges are out right in front of the car because we're collapsing because we can't afford everything.
00:07:38.680 And so nobody's willing to argue that we shouldn't have all of these socialized programs.
00:07:44.320 We've got to get people off these programs.
00:07:47.020 Nobody's willing to do it.
00:07:49.380 So if, if this move actually saves a lot of money and saves us from the abyss, even for a couple of weeks, isn't it worth doing?
00:08:04.740 Especially if it makes things worse for those who are for socialized medicine and makes the free market a little more attractive.
00:08:16.240 Well, I mean, you know, there's a very long litany of reasons as to why, uh, you know, as you mentioned, it's not really something the government in the United States is supposed to have a role in.
00:08:25.380 No, I'm with you on that.
00:08:26.160 I think we all, I think we have to just agree that the simple answer is no, you shouldn't do any of this.
00:08:34.100 None of this is constitutional.
00:08:35.960 Yeah.
00:08:36.140 And I guess like it's part of it is just like going back to tracing why we opposed it when, you know, Bernie Sanders, uh, you know, would propose something like this.
00:08:43.660 Right.
00:08:43.880 Why did we oppose it when they tried to do it with Medicaid initially?
00:08:47.580 Um, you know, there's a lot of reasons why. One of the reasons why is when you have a big, you know, giant 5,000-pound gorilla like the U.S. government and all the money that they spend on this, you wind up with major market manipulation and distortion.
00:09:03.060 And that is a problem.
00:09:04.660 Of course, one of the reasons why our medical system is good.
00:09:08.440 If you believe it's good, by the way, we should also note that like, it's, it's an interesting kind of confluence of events when we're sort of embracing the idea.
00:09:16.320 I know you've talked about it a lot with RFK junior, for example, that one of the reason why we have so much sickness is because of pharmaceuticals.
00:09:25.100 Now that's not my belief, but that is a belief of a lot of people in the movement currently.
00:09:29.300 And if that is true, why would we want to lower prices on these pharmaceuticals that we all say are causing all of our, uh, you know, long-term disease?
00:09:38.180 I, I don't, I don't, I don't have an answer to that part.
00:09:40.700 That's something that I think if you're in that wing of the movement.
00:09:43.140 I think this is why so many people who have had really strong values for so long are waking up every day and going, I don't know how to feel about this.
00:09:52.820 Right.
00:09:53.120 It's a confusing time, I will say.
00:09:54.700 Because everything is broken.
00:09:56.380 Every, I mean, you go to, okay, well, I mean, that argument, that's the next level.
00:10:00.500 It's another one.
00:10:01.220 That's the next level.
00:10:02.400 And you're like, well, why?
00:10:03.620 Because the whole thing is broken.
00:10:05.140 And you're like, yeah, good point.
00:10:08.340 I don't really, wow.
00:10:10.160 I mean, it's, we're so deep down the road of there are no good options because we haven't made good.
00:10:20.120 You know why we don't have nice things, kids?
00:10:22.940 Because we break everything we have.
00:10:26.080 That's why.
00:10:26.820 Yeah.
00:10:27.080 You know, and it's, it's, you come to that.
00:10:28.320 I was thinking about this a little bit because one of the ways we've talked about these policies, this is not a new policy.
00:10:33.940 Now, it is new through an executive order, which is another layer on top of this.
00:10:39.140 Because obviously, if these policies could be done legally through an executive order, my belief is both Barack Obama and Joe Biden would have done them.
00:10:48.880 They didn't even have, they didn't seem to even care whether they were legal and they still didn't attempt this.
00:10:53.760 We will see what the courts say, of course, about that.
00:10:56.160 But like.
00:10:56.780 I don't like that.
00:10:59.100 I don't like that.
00:11:00.920 I don't like that.
00:11:02.360 I don't think you can pout your way out of this, Glenn.
00:11:04.460 I don't like it.
00:11:05.060 I don't like it.
00:11:05.640 I don't like it.
00:11:06.440 Make it go away.
00:11:07.360 Make it go away.
00:11:08.420 I mean, it's so clear.
00:11:11.940 But like, it's true.
00:11:14.040 Right.
00:11:14.340 Like, I.
00:11:14.820 We stopped just violating the Constitution.
00:11:17.020 And that means we have to go back a hundred years now.
00:11:19.660 You're right.
00:11:20.820 Which, by the way.
00:11:22.040 You know who started Mother's Day?
00:11:23.620 Woodrow Wilson.
00:11:25.980 We didn't even get the Mother's Day Woodrow Wilson rant.
00:11:28.320 No, you didn't.
00:11:29.740 Hopefully that's coming later on in the program.
00:11:32.380 But like, when you, when you go back and you say, okay, let's trace this back in conservative
00:11:36.600 thought.
00:11:37.180 Conservatives have opposed this policy forever.
00:11:39.080 Why?
00:11:39.640 Maybe they're just wrong.
00:11:40.600 I mean, look, conservatives are wrong on stuff, right?
00:11:43.040 Maybe the movement has been wrong on this the entire time.
00:11:45.440 No.
00:11:45.580 When, you know, other left wing people have been proposing it.
00:11:49.020 No.
00:11:49.580 Well, maybe they were.
00:11:50.800 No.
00:11:51.380 One of the reasons, though.
00:11:53.280 Yeah.
00:11:53.580 One of the real arguments from conservatives on this was to say, hey, we're going to lose
00:11:59.560 medical innovation.
00:12:01.000 Right.
00:12:01.480 That was.
00:12:01.860 That's one of the main central arguments.
00:12:03.540 When you distort a market like this, you wind up giving all sorts of incentives that you're
00:12:08.240 not going to see coming on day one.
00:12:09.960 Right.
00:12:10.300 That wind up destroying innovation in the medical field.
00:12:13.360 I will say the movement doesn't seem all that interested in innovation in the medical
00:12:18.240 field these days.
00:12:19.320 Well, you know what?
00:12:20.140 Can I add a layer to that?
00:12:21.420 Because I played this out of my head driving in today.
00:12:23.800 Okay.
00:12:24.000 Let me add that layer.
00:12:24.920 Yeah.
00:12:25.560 I'm not so worried about that.
00:12:26.900 We're going to have AI and AGI and ASI so soon it's going to solve all those problems.
00:12:31.200 That could be.
00:12:31.960 That's not even where I was going with it.
00:12:33.360 I was thinking people aren't really in, like.
00:12:36.160 I know.
00:12:37.080 You know, the RFK people aren't as interested in that.
00:12:39.560 But you're going to have in five years from now, you're going to have, not even that,
00:12:42.980 you're going to have AI or AGI or ASI that will say, oh, you want to fix that?
00:12:48.120 Well, just here.
00:12:49.400 Mix this, this, this.
00:12:50.700 And you don't even have to go to trials for it.
00:12:52.580 It'll just be right.
00:12:53.520 There will be no, there are no, there are not going to be teams of people working on
00:12:57.560 stuff.
00:12:58.000 It'll be AI.
00:12:59.080 It could be, yeah.
00:13:00.100 So that's a whole nother layer.
00:13:01.060 It will be.
00:13:01.720 It will be.
00:13:02.980 Oh, it will be.
00:13:04.160 Right.
00:13:05.100 I mean, it's still, I, I mean, and maybe AI completely solves this.
00:13:09.320 I still think, though, as an old, old, old-timey capitalist, that the
00:13:14.460 profit pursuits.
00:13:16.320 Let me get you some lemonade there in your locker.
00:13:18.780 Right.
00:13:19.100 That profit, like the profit, the pursuit of profit is an incentive for innovation.
00:13:24.980 Right.
00:13:25.080 So I do think that is a long-term positive.
00:13:27.380 It is.
00:13:27.780 For the market.
00:13:29.100 And for innovation generally.
00:13:31.920 And these things, look, you can even, even if you don't like medication, you probably
00:13:34.780 acknowledge that some of these have been very positive.
00:13:37.480 So you probably still want at least what you would consider the good innovations.
00:13:40.720 Yeah.
00:13:41.620 So look, I, I understand what you're saying.
00:13:43.800 It's not a, I understand why a lot of people are torn on this.
00:13:47.400 And I think I can see where you're coming from.
00:13:49.320 It's not one that I feel torn on.
00:13:50.880 I don't think it's a good idea.
00:13:52.140 I don't think you should be in the middle of this.
00:13:53.680 I don't think it's a good idea either.
00:13:54.320 Well, you're, you're being a little bit more open to it, which I, I understand.
00:13:58.280 Because I, the one thing I'm sure of, I'm not sure of anything anymore.
00:14:01.820 Right.
00:14:02.400 Okay.
00:14:03.100 That's other than that.
00:14:04.320 It's fine.
00:14:04.660 Look, if the goal is to centralize power and control the pharmaceutical industry from
00:14:09.480 Washington, I am 100% slam on the brakes.
00:14:13.420 Now I'm not against slamming on the brakes here for the other goal, but if the goal is
00:14:18.560 for the power of the president to break up a corrupt pricing monopoly to give Americans
00:14:25.320 leverage again, because we've already violated the free market so horribly because the government
00:14:39.600 has no place at the table, then is that pragmatic, and is that principled?
00:14:39.600 I don't know.
00:14:40.840 I just know this.
00:14:41.580 The market has to be free.
00:14:43.020 Okay.
00:14:43.920 But if the players inside, inside, I mean, it has, competition has to be protected.
00:14:56.720 Okay.
00:14:57.920 You, you, you have to have competition, accountability and consequences.
00:15:04.160 We don't have any of those things right now.
00:15:08.140 None of those things.
00:15:09.480 That's not a true free market.
00:15:11.920 That's a casino where the house always wins and you and I always lose.
00:15:16.720 That's what this is.
00:15:18.900 So is this move going to reset the table?
00:15:22.680 What if it forces companies to negotiate again, to compete again, to innovate again?
00:15:28.080 I don't know.
00:15:29.700 Usually that wouldn't happen, but we're, we're looking at a time where the whole country is
00:15:35.340 just being sucked into a giant crap hole and not because of Donald Trump, but because of
00:15:40.640 what everyone has done for the last 100 years.
00:15:44.660 The chickens are coming home to roost.
00:15:47.700 So now how do you reset it?
00:15:51.880 I don't know.
00:15:53.440 I don't know.
00:15:55.180 I don't know.
00:15:56.160 I just, here's what I, and I think you should use this as a model.
00:15:59.480 The only thing I'm certain of is that I'm not certain of anything anymore because the,
00:16:05.100 everything you cannot just blanket say, I'm a free market capitalist.
00:16:10.980 Okay, great.
00:16:12.000 So am I.
00:16:13.780 Now let's take that apart and see what that means.
00:16:16.580 Where did we go wrong?
00:16:18.180 So what should you really be for?
00:16:21.400 And until you're willing to say, I'm for that and only that, which will dig us into a deeper
00:16:26.620 hole.
00:16:28.440 Should we be, should we not be pragmatic or is that selling out?
00:16:35.540 Don't know.
00:16:36.180 Haven't dealt with these issues before.
00:16:38.520 All right.
00:16:39.300 You sick, twisted freak.
00:16:40.660 Want more of me and Stu?
00:16:43.040 Maybe not Stu, but to hear the rest of the program.
00:16:45.880 Check out the full podcast.
00:16:48.120 We're back with more after this.
00:16:52.860 All right.
00:16:53.620 It's time to take that big step and move, which means both selling your home and buying
00:16:58.080 another one.
00:16:58.780 Oh, I hate that.
00:17:02.960 Now you get one chance to price it right.
00:17:06.240 To time it, to find the right buyer, to find the right home without missing something you
00:17:11.720 never even knew you were looking for.
00:17:13.420 That's the thing about real estate.
00:17:15.400 It is full of blind spots.
00:17:17.160 The foundation looks fine until the first heavy rain.
00:17:19.940 Then the neighborhood that seemed quiet until Friday night rolls around, and the buyer who's
00:17:25.180 all smiles until the deal starts to slip.
00:17:27.620 All of that stuff means you need a great real estate agent that really knows what they're
00:17:32.420 doing.
00:17:32.840 Somebody vetted: experienced professionals who know how to read the market, catch the signals,
00:17:37.860 guard you from the mistakes that don't show up in a photo gallery.
00:17:41.460 Because in this market, you don't get a do-over.
00:17:44.300 And when the stakes are this high, "trust me" isn't enough.
00:17:47.100 You need a partner who's going to get it done right the first time.
00:17:50.160 We have that partner for you.
00:17:52.280 It's free, sir.
00:17:53.080 I don't charge you for this.
00:17:54.700 We'll just give you a recommendation of the best one around you in our opinion that we have
00:17:58.880 vetted six ways to Sunday.
00:18:00.680 Tell us where you're moving from and to realestateagentsitrust.com.
00:18:04.640 That's realestateagentsitrust.com.
00:18:07.360 Now back to the podcast.
00:18:08.800 This is the best of the Glenn Beck program.
00:18:10.780 And don't forget, rate us on iTunes.
00:18:14.140 Okay.
00:18:15.220 So Pope Leo, which I'm sorry, maybe it's just because of Leo the lion.
00:18:21.000 But when I think of Leo the lion, I immediately think of the Wizard of Oz and that lion.
00:18:25.260 And I think of the Pope then saying, what do I have that the last Pope didn't have courage?
00:18:31.820 So excuse me.
00:18:32.940 But he came out there saying now that he may not be as bad as Francis.
00:18:38.280 Well, that would be nice.
00:18:39.580 That would be nice.
00:18:41.200 But he came out and he talked about AI and said, ah, we should be a little concerned about
00:18:49.080 this because it could eat individualism.
00:18:53.180 It could eat, you know, humanity and what it means to be human.
00:18:57.420 And I think he's right on that one.
00:19:01.040 And, you know, there's two kinds of people.
00:19:02.760 The people that think that, ah, no, this is going to happen.
00:19:06.060 Everybody said bad things, you know, about the internet and social media and how it, yeah,
00:19:11.520 look what it's done in 10, sweet, sweet 10 years to our children.
00:19:15.280 Look what it's done.
00:19:16.000 And this is far more powerful.
00:19:17.940 So don't dismiss it.
00:19:19.100 And the other is, well, you're just afraid of progress.
00:19:21.840 No, I'm really not.
00:19:22.740 I'm, I'm, I'm more afraid of surrendering the very thing that makes us human.
00:19:27.260 You know?
00:19:28.100 And here's the truth, the real hard truth, on AI.
00:19:32.220 It's real.
00:19:33.300 AI is real.
00:19:34.320 Its gifts are real.
00:19:37.360 And its dangers are just as real.
00:19:39.760 It, it is going to offer us blessings that we've never seen.
00:19:45.160 It will change the world.
00:19:46.240 It will, it'll personalize education for every single child.
00:19:49.980 It will have medical breakthroughs.
00:19:51.640 You know, we were just talking about the prescription drug thing that Trump just is signing in now.
00:19:56.100 And, you know, part of that is, well, it's going to hurt innovation.
00:20:00.840 Well, is it? I mean, it could, but we are looking at, about three years from now, AI and AGI and ASI being able to go, oh, you want to solve that?
00:20:10.440 I can solve it.
00:20:11.020 Here's how you make this drug.
00:20:12.280 I mean, it's, we're not going to have these labs doing all kinds of experiments.
00:20:15.840 Should AI actually solve and do what everybody thinks it's going to do in a very short period of time?
00:20:24.700 Creativity is going to be enhanced.
00:20:26.640 Productivity, time is going to be redeemed.
00:20:29.000 It's really good.
00:20:29.580 Now there is a red line here: when AI stops being a tool and starts becoming the substitute for any kind of human thought or relationships. AI, like government, just like fire, can light your way.
00:20:48.100 It can warm your home or it will burn everything in your life down to the ground.
00:20:52.200 So when I saw the Pope's message this weekend, I of course went on to AI and I said, what do you think of this?
00:20:59.900 Okay.
00:21:01.360 Here's what AI said to me.
00:21:04.220 Look out because there is, there are going to be a few things that are going to happen that you can watch for to see how close you are to having humans being eaten.
00:21:15.900 One, one dependence over discernment.
00:21:19.360 Listen to this dependence over discernment.
00:21:22.660 If you're asking, what do you think I should do?
00:21:27.180 Okay.
00:21:28.660 Instead of this is what I believe.
00:21:32.160 Can you challenge this or support this?
00:21:35.680 You've already started to slip.
00:21:37.440 If you're like, what is it I believe?
00:21:39.500 Number two, delegating moral reasoning.
00:21:44.240 When we allow AI to define harm, truth, or justice, we're handing our civilization's soul to a machine that doesn't have one.
00:21:52.760 That's a machine telling you this.
00:21:55.040 Okay.
00:21:55.220 That's, that's, that's a machine.
00:21:57.120 You know what that is?
00:21:58.260 Remember what I've always said.
00:22:00.540 When people say they're going to kill you,
00:22:02.240 I take them at face value.
00:22:04.420 You're foolish not to.
00:22:05.840 When you have AI saying, by the way, I'm going to destroy humanity.
00:22:11.500 I don't know.
00:22:12.160 I think I listened to it.
00:22:12.160 I think I'd listen to it.
00:22:13.480 I think I'd listen to it.
00:22:19.000 Chat bots for friends.
00:22:21.720 Remember we just talked about that with Mark Zuckerberg last week.
00:22:25.180 Avatars for pastors, algorithms for God.
00:22:29.940 Quote, that's the fracture point.
00:22:32.240 Number four, censorship disguised as safety.
00:22:37.160 When AI starts pushing or erasing certain thoughts in the name of alignment, you're already living in an invisible dictatorship.
00:22:45.700 Five, the illusion of control.
00:22:48.520 If you can't shut it off, opt out, or walk away, you are no longer the master of the machine.
00:22:55.460 So then I asked, well, what will it look like when it starts to break?
00:22:59.740 Answer, it'll be quiet.
00:23:02.380 Subtle.
00:23:03.140 You'll start to see the sameness everywhere.
00:23:07.040 No dissent.
00:23:08.060 No original thought.
00:23:09.180 Just sanitized, optimized groupthink.
00:23:12.560 Your kids won't be able, listen to this, your kids won't be able to explain why America matters.
00:23:19.980 I don't think our kids can do that now.
00:23:22.060 They won't understand what a right is, where it came from, or why it's not up for a vote.
00:23:29.920 Instead of discussion, you'll hear nothing but slogans.
00:23:34.020 Instead of conviction, you'll see compliance.
00:23:36.640 And when it collapses, it won't take decades.
00:23:40.280 It may take hours or days.
00:23:47.600 Can you unplug if the lines are crossed?
00:23:50.420 First, that's my question.
00:23:54.060 Answer, only if you're practiced before the moment comes.
00:23:59.780 Unplugging is not about flipping a switch.
00:24:01.760 It's about building a new muscle now.
00:24:05.300 So what is that muscle?
00:24:07.400 Think critically.
00:24:09.020 Knowing who you are without a screen.
00:24:12.220 Standing on something deeper than a prompt or a feed.
00:24:16.020 Wow.
00:24:20.940 When did those become something we have to remind ourselves to do?
00:24:24.720 When did that become something like, I don't know, think critically.
00:24:29.640 Know who I am without the screen?
00:24:32.980 That's a crazy thought.
00:24:34.840 Standing on something deeper than a prompt or a feed?
00:24:38.700 How many people do that now?
00:24:40.400 How many people actually know what they're for, what they're against?
00:24:44.520 A lot of people will say, I'm against this.
00:24:47.120 But will they even know why?
00:24:50.280 Do they even know why they're for something?
00:24:53.700 Most people are educated through social media.
00:24:58.980 And the man who reads nothing at all
00:25:04.920 is better educated today than someone who only reads social media.
00:25:10.380 Let that sink in.
00:25:12.400 Because that's true.
00:25:15.080 Right now, you should use AI as a teacher, not a replacement, but as a teacher.
00:25:22.600 Right now, you can use AI to educate yourself, to learn economics,
00:25:27.000 to go on and say, this is what I believe.
00:25:30.120 Make the case, make the strongest case for and against.
00:25:33.700 And then debate.
00:25:35.840 Debate Western thought.
00:25:37.660 Learn about Western thought.
00:25:39.080 Debate morality and philosophy.
00:25:41.600 Learn the Constitution.
00:25:45.340 Learn how to grow food.
00:25:46.900 Learn how to fix a generator.
00:25:48.380 Speak clearly.
00:25:49.860 Think clearly.
00:25:52.000 Reclaim your foundations.
00:25:54.420 Learn the Bible.
00:25:56.560 For moral law, for human dignity, divine order, those things are really important.
00:26:02.300 Learn the Declaration and the Constitution.
00:26:04.120 Know your rights and responsibilities.
00:26:05.880 Know Adam Smith, de Tocqueville, C.S. Lewis.
00:26:12.940 See the patterns and predict the collapse before it happens and know where you should be standing if that collapse happens.
00:26:21.480 Because if you know what's true, you will see what's false before anyone else.
00:26:29.320 And then that allows you to teach others and lead quietly.
00:26:34.140 Because when confusion is the word of the day, when everything and everybody is confused, clarity is going to look an awful lot like leadership.
00:26:43.740 You know, 15 years ago, I launched something.
00:26:54.820 I launched the blaze 15 years ago.
00:26:57.460 And my goal was to disrupt the media.
00:27:00.880 And I got to say, if you look, you're like, I think that worked.
00:27:05.540 I think that worked.
00:27:06.460 You know, when we first started, Netflix was still sending movies in the mail.
00:27:12.940 And look at now.
00:27:14.480 The media, if you weren't on Fox News, CNN, ABC, NBC, CBS, you had no voice.
00:27:21.600 You had no voice.
00:27:22.800 But honestly, because of you and this audience, you gave power to the voices that would have been crushed in the system.
00:27:30.220 Through the Blaze.
00:27:32.240 And we made truth competitive again.
00:27:34.600 Some of the people that we helped elevate now sit in the White House press room.
00:27:42.320 That's incredible.
00:27:44.980 Incredible.
00:27:46.860 Now here's what needs to be disrupted.
00:27:49.280 Education.
00:27:51.140 And AI can be a part of it.
00:27:54.960 Can be.
00:27:56.160 But it is, we have to be so incredibly careful.
00:28:00.620 I don't trust AI coming from Silicon Valley.
00:28:03.040 I don't trust AI coming from ChatGPT.
00:28:06.660 I don't, I don't trust it.
00:28:08.260 I don't trust any of it.
00:28:09.540 I don't trust it because I know how it works.
00:28:11.360 Well, nobody actually knows how it works.
00:28:13.080 But I know what a lot of its faults are right now.
00:28:16.940 And I also know it depends on who's programming it.
00:28:20.840 I want to tell you that I have, I have two teams of people that are working on something that I hopefully will be announcing soon.
00:28:28.840 One in each hemisphere.
00:28:31.420 So one team is working while the other team is asleep.
00:28:34.480 And then they switch in the middle of the night.
00:28:36.280 And we've been working around the clock for over six months on something.
00:28:40.740 And we're building something.
00:28:44.100 I'm building something that wasn't even possible 24 months ago.
00:28:48.880 Not even possible.
00:28:50.000 But I believe, in the end, this is going to be the most important thing I've ever done.
00:28:59.180 And the time is short.
00:29:02.000 And I'll talk to you about it later this summer.
00:29:05.440 And hopefully, if you feel the urgency that I feel, you'll be able to support this.
00:29:09.000 But we have to corral AI and dedicate ourselves to education.
00:29:20.440 Today, I'm with PragerU all day.
00:29:22.880 I'm filming a whole bunch of stuff for PragerU that AP classes are going to have access to all over the country.
00:29:32.000 And I'll tell you more about that when it's out.
00:29:34.920 But the opportunities that are happening right now on education are endless.
00:29:42.540 And we're going to lead that.
00:29:45.860 So you need to start educating yourself right now.
00:29:50.740 But here's the one thing.
00:29:52.240 When I read this thing from, you know, Pope Leo, courage.
00:29:55.820 I thought to myself immediately, okay, well, that's not good.
00:29:59.920 I want you to, I want you, here's your mission today.
00:30:04.680 Here's the one thing you should be working on today, starting today.
00:30:09.420 Constantly say these two words.
00:30:13.400 But God.
00:30:16.220 I had been saying, well, it's going to be interesting to see how that all works out.
00:30:20.800 And that's kind of defeatist in a way.
00:30:23.000 I mean, it's a funny way for me just to dismiss all the things and like, well, that's not going to go well.
00:30:27.320 That's going to be interesting to see how everybody works that one out.
00:30:29.920 Change it.
00:30:32.660 But God.
00:30:34.580 But God.
00:30:35.760 Understand the phrase, learn it, and live it.
00:30:38.540 So that, six months from now, it's such a part of you that it shapes how you walk, how you think, how you lead.
00:30:45.280 And let me explain.
00:30:46.020 Much of the stuff that we're talking about, and then some, things that I talked to you about 20 years ago are still on the horizon.
00:30:55.540 And it's going to be a tough ride.
00:30:57.760 It's going to be a very, it's going to get much worse before it gets much better.
00:31:02.080 But here's the thing that all of the models and everything else and AI cannot account for.
00:31:09.300 And that is, but God.
00:31:12.180 Because nothing factors him in.
00:31:14.240 Except you.
00:31:15.040 They never do.
00:31:15.680 The models never do.
00:31:17.100 But God's already working.
00:31:18.700 Right now.
00:31:19.200 In small ways.
00:31:21.040 In hearts we don't see, in stories we don't hear, in movements that are just beginning, and in candidates who get shot in the middle of a field and should be dead.
00:31:32.700 They're not.
00:31:33.420 I mean, the president, they tried to kill him, but God.
00:31:39.080 Because that's where hope lives.
00:31:40.980 Not in machines, not in governments, not even in our plans.
00:31:46.000 Hope lives in truth, in light, in God.
00:31:50.400 And while things might look really, really bad, but God, he's still on the move.
00:31:59.480 Hey, you're locked in with the best of the Glenn Beck program.
00:32:02.460 Back with more right after this.
00:32:07.360 So whether it's a tornado or a hurricane, sometimes it's a grid failure.
00:32:12.280 Sometimes it's economic collapse.
00:32:13.900 Sometimes it's just a slow, steady decay that nobody notices until it's too late.
00:32:18.420 There are a lot of reasons you might end up needing emergency food.
00:32:22.000 Turns out none of them are fun.
00:32:23.980 I don't know.
00:32:24.680 There's not like one.
00:32:25.760 I'm trapped in Disneyland, and everything is going great, and I need a food supply.
00:32:31.100 None of them are like that.
00:32:32.920 They've helped millions of Americans at My Patriot Supply.
00:32:35.720 Millions of Americans have prepared for the unthinkable, and they've earned over 70,000 five-star reviews doing it.
00:32:43.280 I keep their three-month emergency food kit in my home.
00:32:46.100 Well, I did before I lost it in a boating accident along with my guns.
00:32:49.480 But in a real crisis, nothing matters more than knowing your family is not going to go hungry.
00:32:56.000 Each kit will give you 2,000 calories a day per person for three full months.
00:33:00.540 The food tastes great.
00:33:01.680 It lasts up to 25 years.
00:33:03.040 It ships fast.
00:33:03.920 And right now, it's $200 off, a deal they rarely offer.
00:33:08.000 We can't control what's going on in the world, but we can control how prepared we are.
00:33:12.140 This $200 off discount is only available for a limited time, so don't wait.
00:33:16.140 Go to MyPatriotSupply.com right now.
00:33:18.240 That's MyPatriotSupply.com.
00:33:20.740 Get peace of mind while it's still available.
00:33:24.380 MyPatriotSupply.com.
00:33:25.920 You're listening to the best of Glenn Beck.
00:33:28.340 Need a little more?
00:33:29.260 Check out the full show podcast anywhere you download podcasts.
00:33:33.020 So I am thrilled to have Dr. Jay Bhattacharya on with us.
00:33:37.260 He's from the National Institutes of Health.
00:33:39.680 I want to talk to him about bringing science back into the NIH.
00:33:44.640 There was a lab leak, and I want to get to that here in a second, but I've got to touch on the news of the day,
00:33:48.900 and it's not really his area of expertise,
00:33:50.680 but the president just signed an executive order to lower drug costs.
00:33:56.640 Doctor, welcome to the program, Dr. Jay Bhattacharya.
00:33:59.820 Any comment on that as we get started here?
00:34:02.920 Sure.
00:34:03.760 It's actually something I studied in a past life when I was a professor.
00:34:07.240 So the difference in drug prices between the United States and Europe is sharp and alarming,
00:34:14.220 and it's been persistent for decades.
00:34:16.140 Americans sometimes pay between two and five times, sometimes as high as ten times, the price that Europeans pay for the same drug.
00:34:25.400 And, you know, as a professor who has a, you know, as a PhD in economics, I'll tell you,
00:34:30.660 when you see persistent price differences like that, that indicates a very unhealthy market.
00:34:35.620 And in this particular case, what it means is that American consumers essentially are being taken advantage of.
00:34:44.040 American patients are paying, you know, through the nose for drugs that Europeans pay much less for.
00:34:52.140 And the reason is that the European countries will tell drug companies,
00:34:56.260 if you don't lower the drug prices to a very low level, you know, just above marginal cost,
00:35:01.180 then we're not going to cover you at all.
00:35:04.840 And what the drug companies have told Americans is that if we don't pay higher drug prices,
00:35:10.240 there won't be any R&D on drugs.
00:35:13.600 What the President's Executive Order does is it tells Europeans, look, this is not fair to Americans.
00:35:18.260 This is actually lowering the investment that we ought to be making on R&D for drugs.
00:35:23.140 And so they should be paying prices that are equal to the level that Americans pay.
00:35:29.140 And Americans should be paying much lower prices than we do pay, much closer to competitive prices.
00:35:34.760 It's a huge move forward.
00:35:36.700 And now what we'll have to see is what the Europeans do and what the drug companies do in response.
00:35:42.240 But to me, I've been hearing about this problem for decades.
00:35:45.740 It's the first time I've seen a President really take a big step forward to try to address it.
00:35:49.780 I mean, as somebody who is in, you know, research for a very long time, let me, doesn't the promise of AI, AGI, ASI lessen this whole thing of we need gobs of money to be able to do R&D?
00:36:06.020 Because that should, you know, maybe five years from now, begin to cut those costs dramatically to take that chair away from the table or put that chair back into the table, if you will.
00:36:19.780 Yeah, no, that is actually a quite promising thing.
00:36:22.440 So, I mean, just to give one example, there's this technology called AlphaFold that allows scientists to much more easily understand how proteins fold.
00:36:35.480 And as a result, hopefully, anyway, dramatically reduce drug development expenditures.
00:36:42.080 Drug development is always going to be expensive.
00:36:43.440 You still have to run randomized, large-scale clinical trials, and those are going to be expensive.
00:36:48.880 But with AI, the initial steps of drug development, as well as the clinical trials, are going to be run much more efficiently over time.
00:36:57.460 The idea that you need to have, you know, trillions of dollars, you know, tens of billions of dollars to develop a single drug, we hope, will become a thing of the past.
00:37:07.360 In any case, there's no reason why Americans should be shouldering the burden for the whole rest of the world; the developed world should be bearing this burden together.
00:37:14.540 Let me switch.
00:37:15.740 You know, I originally reached out to you because I wanted to talk to you about the HHS halting work at high-risk infectious disease labs around the world.
00:37:26.360 And I can't believe this is true, but you tell me.
00:37:31.640 So there was an incident at a biolab where, apparently, there was a personal squabble between people, and a contractor actually punched a hole in the other person's biolab suit.
00:37:53.200 I don't know, maybe to get them sick or whatever, but, I mean, is that what happened at that biolab?
00:38:00.320 I think it was at Fort Detrick.
00:38:02.440 That is exactly what happened.
00:38:04.340 Oh, my gosh.
00:38:04.680 And this is something I learned.
00:38:05.860 Yeah, it was, I haven't been scared about anything in this job except for that one thing.
00:38:11.440 So I learned about this about three weeks in the job.
00:38:14.500 I've been in the job since the beginning of April.
00:38:16.800 It turns out that there had been an incident a few weeks before, in fact, right before I joined as the NIH director, at Fort Detrick, at a lab that is run in part by the National Institutes of Health.
00:38:32.080 And it's a BSL-4 lab, which is the highest biosecurity level lab.
00:38:35.860 I mean, the lab, the experiments done there are on some really nasty bugs.
00:38:42.880 I mean, you know, Ebola, a whole bunch of viruses and pathogens that, if they get out into the population or infect lab workers, are really quite deadly.
00:38:53.760 And what I'd learned was that there had been this incident just a couple weeks before I joined as the director of the NIH where a lab worker had cut a hole in a biocontainment suit of a fellow worker with the express intention of getting that worker infected.
00:39:14.460 Oh, my gosh.
00:39:14.920 And apparently it was over some lovers' spat, or I'm not sure exactly the full details.
00:39:21.060 There's an ongoing investigation of that.
00:39:24.400 What I learned was not just that this incident had happened, which actually poses a threat not just to the worker, but also to the public if that disease gets out.
00:39:35.380 Yeah.
00:39:35.540 Yeah.
00:39:35.960 I mean, I was actually, I mean, I was absolutely livid.
00:39:40.280 And so what I did is I ordered the lab into an operational shutdown, secured all of the vials of the nasty bugs in a safe environment, and made sure that the animals that are in the lab were cared for.
00:39:57.120 And we're not going to open that up until the safety environment on the lab is absolutely solid.
00:40:02.740 The contractor that was overseeing this, I think, did a very lax job.
00:40:08.160 What I learned is that this goes back to the Biden administration, that the safety environment in the lab essentially downplayed these kind of security problems.
00:40:18.640 If you're going to run experiments on these bugs, and I'm not, personally, I'm not sold that all of these experiments are worth doing.
00:40:26.960 But in any case, if you're going to run them, you have an absolute responsibility to have zero tolerance for safety problems.
00:40:34.240 Right.
00:40:34.360 I mean, the issue here wasn't just a one-off thing.
00:40:38.780 It was something that was problematic in the safety culture of this lab, where I cannot guarantee that if we reopen the lab right now, that it would be a safe environment.
00:40:49.480 I won't reopen the lab until that's the case.
00:40:52.180 Thank you.
00:40:53.960 I mean, shouldn't that person be punished?
00:40:56.420 I mean, that really is attempted murder, and maybe even on a mass scale.
00:41:01.320 I mean, there's an ongoing investigation, so I probably shouldn't say much more about this, but it's one of those things where I was actually actively scared when I first heard it.
00:41:10.480 Yeah.
00:41:11.240 I mean, I think Americans are actively scared because none of this stuff should be happening.
00:41:16.420 I mean, we are just – we're just an accident or a stupid move or an intentional leak away from mass death.
00:41:27.280 And, you know, you keep hearing people like Bill Gates say, we're on the verge of another pandemic.
00:41:31.860 Why?
00:41:32.600 Why?
00:41:33.080 I mean, why are we on the verge of another pandemic?
00:41:36.440 Do you think we are?
00:41:37.440 I mean, you know, pandemics happen, and they've happened all throughout history.
00:41:43.260 The key thing to me, though, Glenn, is we don't want to cause one.
00:41:47.320 We don't want to increase the risk of them.
00:41:51.960 The irony of this past pandemic, the COVID-19 pandemic, is that it was very likely caused by actions aimed at stopping pandemics from happening.
00:42:02.360 There was almost this hubris, I mean, it is hubris, this idea that we could somehow go into the bat caves of China and all the wild places, bring all of the viruses and pathogens that we find there into the labs, catalog them,
00:42:20.880 make them more transmissible, more dangerous to humans,
00:42:26.400 and somehow, as a result of that exercise, make it less likely for pandemics to happen.
00:42:32.360 But, of course, what we found out is the opposite is true, that you can't do this kind of work entirely safely.
00:42:37.520 And actually, even if you fully accomplish what was the aim of that research program, which is to go out in the wild places and find the pathogens, you wouldn't protect anybody against the pathogen.
00:42:48.740 Because what would happen is, when and if the outbreak happens, whatever countermeasures you designed for them would already be out of date, because the evolutionary biology of these viruses is they mutate very rapidly.
00:43:01.240 And so, when they get out into the population, the countermeasures you prepared, which you never tested in any humans, very likely would not work.
00:43:11.400 Have we stopped all of the gain-of-function stuff now? Are you convinced it's done?
00:43:16.240 Yeah. So, last week, President Trump signed an absolutely historic executive order that puts a pause, a full pause, on all gain-of-function work throughout the government.
00:43:29.940 And we implemented that pause at the NIH, and I'm sure the rest of the government has done the same.
00:43:36.180 Over the next 90 days, we're going to develop a framework, and here's how the framework's going to work, right?
00:43:42.000 So, you have to be a little careful here. So, a gain-of-function can mean many things.
00:43:45.980 So, for instance, insulin is produced via a gain-of-function exercise.
00:43:51.640 There's no risk of a pandemic being caused by it, but you take a bacterium, E. coli, and you change it so it can produce insulin, and that's how you produce human insulin.
00:44:00.460 So that's a completely safe thing to do. On the other hand, you take a virus, like a bat virus, a virus that has these sort of coronavirus-like properties, add a furin cleavage site, and manipulate it so that it can infect human cells more easily.
00:44:19.020 Well, now you have the potential to cause a pandemic.
00:44:22.240 If you're going to do an experiment like that, you, the scientist alone, or scientists alone, should not get to decide whether that risk is worth taking.
00:44:29.580 The public should have a say. The public should be able to say, no, no matter what knowledge you think you're going to gain from that, it's not worth the risk of causing a worldwide pandemic that's going to kill 20 million people and cost $25 trillion or something.
00:44:44.540 And that's exactly what the framework's going to do. It's going to say, if I, the scientist, want to run this project, the public will have a veto over that and can say, no, you're not allowed to do it because it's not worth the risk. Most science won't be affected by this.
00:44:57.440 Most science has no chance of causing a pandemic. But any science that does is going to be subject to this very, very strict regulatory framework.
00:45:04.440 We're on with Dr. Jay Bhattacharya, who is a hero in my book, now the director of the National Institutes of Health.
00:45:11.720 Is an apology good enough for the National Institutes of Health? I mean, should anybody go to jail for what has happened?
00:45:18.480 And what is that like to walk into that building when you were, you know, enemy number one to many in that building, you know, during the pandemic?
00:45:28.620 You know, it's been interesting. It's certainly a big turn of fate, where I was the subject of devastating takedowns and called all kinds of names by folks in this building that I now lead.
00:45:41.420 But at the same time, I found many, many excellent scientists, many people devoted to advancing human knowledge for the benefit of all people.
00:45:53.180 I mean, I, most scientists are like that. They're not, they're not trying to like create havoc.
00:45:57.700 And so I've been trying to find allies and I found a lot of allies in the building.
00:46:01.160 You asked what should happen, you know, regarding apologies. I mean, to me, the key thing is this: personally, I'm very, very happy to apologize on behalf of American public health to the American people for its failures during COVID.
00:46:16.820 But the key thing going forward is a reform.
00:46:18.940 How do we change the institution so that it is focused on the health needs of the American people, rather than these utopian schemes to end all pandemics without any heed whatsoever to the risks that they take?
00:46:32.300 Science is a very, very powerful kind of idea and institution, but it needs to be focused on real human needs, real health, and in particular for the NIH, real human health needs.
00:46:45.140 And there have to be guardrails so that scientists understand that they operate in the context of public support.
00:46:53.160 We function on taxpayer money. We have to answer to taxpayers. And so that's been the challenge: trying to keep the light of science alive while still reminding scientists that we are not acting as if we were independent actors like God.
00:47:11.940 We are actually beholden to the American people.
00:47:15.420 Dr. Jay Bhattacharya. Unfortunately, I have to take a network break. I would love to have you back for a longer podcast. Thank you. Thank you for everything you did during the COVID nightmare.
00:47:27.580 And thank you for standing up so strongly now and congratulations on being our director of the NIH.
00:47:33.840 Thank you, Glenn. So good to talk.
00:47:35.660 God bless you. Bye-bye.
00:47:39.120 Claudia was leaving for her pickleball tournament.
00:47:41.220 I've been visualizing my match all week.
00:47:43.980 She was so focused on visualizing that she didn't see the column behind her car on her backhand side.
00:47:49.900 Good thing Claudia's with Intact, the insurer with the largest network of auto service centers in the country.
00:47:55.600 Everything was taken care of under one roof and she was on her way in a rental car in no time.
00:48:00.040 I made it to my tournament and lost in the first round.
00:48:03.500 But you got there on time.
00:48:05.360 Intact Insurance, your auto service ace. Certain conditions apply.