Real Coffee with Scott Adams - June 08, 2021


Episode 1400 Scott Adams: It's My Birthday So Come Say Hi


Episode Stats

Length: 45 minutes
Words per Minute: 152.9
Word Count: 6,898
Sentence Count: 495
Misogynist Sentences: 5
Hate Speech Sentences: 12


Summary

Happy birthday to me, and it's good to see all of you. I was wondering how so many people knew it was my birthday today, so I looked up a list of famous people to see whose birthdays I share.


Transcript

00:00:00.000 Well, happy birthday to me, and it's good to see all of you.
00:00:09.620 You know, I was wondering how so many people knew it was my birthday today,
00:00:15.300 and I realized that famous people, their birthdays are often publicized.
00:00:21.860 And so I looked up a list of famous people to see whose birthdays are shared with me.
00:00:30.000 So, here's the list of famous people who share my birthday,
00:00:35.040 and being, you know, very famous like I am,
00:00:38.000 I'm probably pretty close to the top of this list, wouldn't you think?
00:00:41.320 Well, let's look at the list. You got Kanye West.
00:00:44.340 Happy birthday, Kanye. Good day for a birthday. Glad I'm sharing it with you.
00:00:50.240 Other famous people. Frank Grillo. Never heard of him.
00:00:55.700 Tim Berners-Lee. Oh, okay. Nice to have a birthday with a founder of the Internet.
00:01:02.180 We got Torrey DeVitto, Shilpa Shetty, Nancy Sinatra, Julianna Margulies.
00:01:07.360 I met her once. Joan Rivers. Wow. Frank Lloyd Wright. All on my birthday.
00:01:12.700 Jerry Stiller. Ashley Biden.
00:01:15.660 You got Bonnie Tyler, Keenen Ivory Wayans. Wow.
00:01:20.120 Barbara Bush, Griffin Dunne, Gabrielle Giffords, Rob Pilatus.
00:01:25.560 Where am I on this list?
00:01:27.580 There are a lot of famous people with this birthday.
00:01:29.640 Come on. Come on.
00:01:31.380 50, 100.
00:01:32.780 Well, apparently I'm not in the top 100,
00:01:34.880 but someday I'll make it to the top 100 birthdays on my own birthday.
00:01:40.200 Would you like to enjoy the Simultaneous Sip Special Birthday Edition?
00:01:46.040 Yeah, of course you would.
00:01:47.480 And all you need is a cup or mug or a glass, a tank or chalice or stein, a canteen jug or flask, a vessel of any kind.
00:01:53.300 Fill it with your favorite liquid.
00:01:54.760 I like coffee.
00:01:56.680 Thank you.
00:01:58.280 And join me now for the unparalleled pleasure.
00:02:04.440 Here, the dopamine hit of the day,
00:02:08.360 the thing that makes everything better. It's called the Simultaneous Sip, and it happens now. Go.
00:02:15.400 Oh, just right.
00:02:17.880 How do you like that sip of coffee when the temperature is just right?
00:02:22.820 It's perfect, really.
00:02:25.240 Let's talk about all the things that are coming.
00:02:27.840 So, I watch with fascination the evolution toward self-driving cars.
00:02:35.420 And thank you, everybody, for the birthday wishes.
00:02:38.180 So, there's a company called Cruise that is pretty far along.
00:02:42.460 It's a subsidiary of General Motors.
00:02:44.780 They've got a self-driving car that has just been permitted to operate in California.
00:02:52.660 An actual self-driving car.
00:02:55.260 And what's different is this one doesn't require a human operator to take the controls if something goes wrong.
00:03:03.900 So, until now, we did have self-driving cars, but it required a human to be in the passenger seat to take over.
00:03:11.740 And it looks like this is the first one that's made the next step.
00:03:15.500 And it will be driving literally autonomously.
00:03:19.340 There will be no human there.
00:03:20.840 The car will just pull up.
00:03:22.140 You get in and it goes somewhere.
00:03:23.220 Now, I don't know if I would be the first one to get in this.
00:03:27.840 The fly lady.
00:03:29.380 Well, you're much too nice.
00:03:31.360 Thank you.
00:03:32.840 I don't deserve that, literally.
00:03:35.460 But thank you.
00:03:37.160 And I've been watching this driverless car thing.
00:03:40.540 And what's interesting is there's a whole bunch of companies that are getting into it.
00:03:44.220 And one assumes that someday, I don't know, Apple and Google will have a car, whatever.
00:03:50.500 But it's not going to be like one day there are self-driving cars.
00:03:54.700 It's more like one day you'll drive past one.
00:03:57.640 And you'll think, what the hell?
00:03:59.420 Was that a car with nobody in it?
00:04:01.340 And then there'll just be more of them.
00:04:03.020 But one of the questions that I wonder about is, what's it like to be a human driver on the highway with some self-driving cars and some human-driven cars?
00:04:15.900 Because, correct me if I'm wrong, but isn't a big part of driving anticipating what the other driver will do?
00:04:22.400 When you come to a stop, let's say a four-way stop, don't you look at the faces of the drivers?
00:04:29.780 That's one of your biggest variables, right?
00:04:31.620 You look at their faces.
00:04:32.940 Because if their face is looking at you, you know they saw you.
00:04:37.120 If their face is looking the other direction and they're inching forward, you stop.
00:04:42.060 Because you're not looking at the car, you're looking at the face.
00:04:45.800 What does a self-driving car do?
00:04:48.620 Self-driving cars don't look at faces.
00:04:50.400 So they've got one disadvantage that humans don't have.
00:04:55.280 But also, I can't look at the face of a self-driving car.
00:04:59.320 How long will it take me to learn when the self-driving car recognizes me?
00:05:05.760 Because I want the self-driving car to see me.
00:05:08.320 And I want to know that they see me.
00:05:10.180 Like a human.
00:05:11.820 So what do you do?
00:05:13.000 It's almost like the self-driving cars need some other kind of signal.
00:05:18.900 So you can know that they recognize you.
00:05:21.620 For example, could there be another light?
00:05:25.120 Or maybe something that shows up in your car?
00:05:27.900 Something that tells you that this other car now has a full understanding of your existence and what you're doing.
00:05:33.720 As long as I know the other car knows what I'm up to, I'm okay.
00:05:38.720 But until I know it knows where I am and what I'm doing, I don't know.
00:05:42.780 It'd be hard to trust it.
00:05:45.280 You know, I mentioned this the other day, but it's becoming more and more hardened as a reality.
00:05:50.520 There will be two histories of the pandemic.
00:05:55.240 One history will say, hey, there was a big old pandemic.
00:05:59.380 The other history is what some members of the right are forming into their own reality, which is the pandemic didn't happen.
00:06:11.140 There was none.
00:06:12.340 It was just an illusion.
00:06:14.540 That actually nothing happened.
00:06:15.800 There was a virus, and the virus was real, and people got it and people died, so that part is not in question.
00:06:23.140 What's in question is: did anything unusual happen in terms of viruses?
00:06:29.900 Or was the unusual thing only the way we reacted to this one?
00:06:34.500 And we're just going to have to live with two realities.
00:06:37.520 We will have two completely different histories, one in which the virus never happened as a pandemic, but it happened as a normal, little extra aggressive seasonal virus that we overreacted to.
00:06:53.980 How many of you buy into the opinion that the pandemic was largely an illusion and not really much worse than a normal flu?
00:07:03.720 How many of you buy into that?
00:07:05.280 I just want to see in your comments if that's becoming a popular view or not.
00:07:12.340 I'm seeing a yes, thumbs up.
00:07:14.300 I see a scam, no.
00:07:17.020 Overreacted, yes, yes, yes, yes.
00:07:20.120 Me, me, me, no, no, no, no.
00:07:23.000 Well, so you see, there it is.
00:07:25.560 There are roughly as many people who think the pandemic didn't exist as there are who think they lived through one.
00:07:32.640 How do you reconcile two histories that are opposites?
00:07:38.440 And we both lived through it, right?
00:07:40.460 It's not like we're talking about something that was hypothetical and it's in our history books.
00:07:46.440 You and I, and everyone watching this right now, we just, we're in it still.
00:07:51.420 We're living through it in real time and still a large segment of the population thinks it isn't real.
00:08:00.680 It didn't happen.
00:08:02.380 Well, I'm in the camp that says something happened and some people were dying and whether we overcounted them or not, it's still in the millions.
00:08:10.740 Yeah, I'm fairly certain it's in the millions.
00:08:14.280 Was it worth it?
00:08:15.200 Well, we'll figure that out as we go.
00:08:18.320 There's a survey, a study.
00:08:21.560 I don't know if you'd call it a study.
00:08:22.800 Let's call it a survey because that's what it is.
00:08:25.840 A survey about whether more people got the COVID when they were wearing masks or not wearing masks.
00:08:33.060 This is from an Axios-Ipsos poll.
00:08:36.240 And what do you think?
00:08:37.000 For those of you who have seen it already, don't ruin it yet.
00:08:42.280 So many of you have not seen the results.
00:08:44.640 So they did a poll and they said, how much did you wear masks and then did you get coronavirus?
00:08:51.500 And they determined, and I won't tell you the answer yet, but what do you think?
00:08:57.280 What do you think was the result?
00:08:59.080 Did the people who wore the masks more often get fewer infections or more of them or the same?
00:09:07.940 In the comments, tell me what you think it turned out.
00:09:11.280 I see somebody says the same, no difference.
00:09:14.680 No effect, no.
00:09:17.520 Less likely to get sick, same, no difference.
00:09:21.940 So most of you have not seen this survey, obviously.
00:09:26.340 Same, blah, blah, blah, fewer.
00:09:28.460 Here's the answer.
00:09:30.280 Gigantic difference.
00:09:32.440 Which way did it go?
00:09:34.360 There was a doubling.
00:09:36.480 So one of those conditions, either wearing a mask or not wearing a mask, one of those two conditions doubled your risk of getting infection.
00:09:45.120 Doubled.
00:09:46.080 All right?
00:09:46.640 So we're not talking about a small difference.
00:09:49.260 Doubled.
00:09:49.620 Which was it?
00:09:51.540 Was it the people who wore the masks who were safer?
00:09:55.100 Or was it the people who didn't wear masks who were safer?
00:09:58.140 What do you think?
00:10:00.640 Because there are a lot of people who think the masks are causing problems.
00:10:03.500 The answer is, according to the Ipsos-Axios poll, that people who did not regularly wear masks had twice the infections.
00:10:15.300 The people who didn't wear masks were twice as likely to be infected, based on self-reporting, right?
00:10:25.740 Now, thank you.
00:10:29.280 In the comments, somebody says, this is not a gold standard of reporting, or a gold standard study.
00:10:34.860 No, it's not even a study.
00:10:36.800 It's more like a poll.
00:10:38.520 I would call it a poll, not a study.
00:10:39.940 So, this does not pass scientific rigor.
00:10:45.260 Okay, we all agree on that.
00:10:47.040 Nobody thinks that this passes any kind of scientific gold standard, right?
00:10:52.280 But there's this one survey that does show a really big difference.
00:10:58.660 Here's where I would be more skeptical.
00:11:01.400 If the masking difference showed a difference of, say, 10 or 20%, let's say, hypothetically,
00:11:09.900 I'd say to myself, I don't know that you could measure that, really.
00:11:13.320 10 to 20% sounds like, I don't know.
00:11:16.980 That's too small of a hypothetical difference to be sure there weren't other variables in there mucking up your result.
00:11:24.740 So 10 to 20%, I would have said, that doesn't tell us anything.
00:11:28.880 But what if it's double?
00:11:31.400 Well, you know, let's just say, hypothetically, let's say the polling was done well.
00:11:38.260 It's still not a scientific study.
00:11:39.980 It's not a gold standard, controlled anything.
00:11:43.460 But double, right?
00:11:46.360 When I do a Twitter poll, I know it's highly unscientific.
00:11:51.040 It's as unscientific as you can get.
00:11:53.380 Because, you know, my followers are not even close to any kind of a random sample.
00:11:57.340 And still, if I run a Twitter poll and the results are 10 to 1 in one direction,
00:12:04.340 I still think it means something.
00:12:06.100 It just isn't scientific.
00:12:07.680 But if it's, you know, a 10% difference, obviously, it means nothing.
00:12:12.500 I don't know.
00:12:13.620 I would say this is not conclusive or close to it.
00:12:16.860 But it certainly agrees with my opinion.
00:12:21.460 And the best way to know that something is true is if it agrees with your prior opinion.
00:12:28.220 Now, that's the standard I use.
00:12:31.500 So this agrees with what I already thought to be true, which is masks are more likely to work than not.
00:12:37.320 One of the obvious problems with this is that people don't wear masks for the same reasons and risks are different.
00:12:47.220 And, you know, one of the reasons you might wear masks a lot is because you perceive a risk, et cetera, et cetera, et cetera.
00:12:53.640 So there's a million problems wrong with the poll, but I'll bet it will hold up.
00:12:58.880 If I had to bet, let me put it into money.
00:13:02.120 If somebody said to you, Scott, you can't avoid making this bet.
00:13:06.000 We'll kill you if you don't participate.
00:13:08.400 So you have to participate.
00:13:09.900 And you have to put all of your money on whether masks made a difference or they didn't.
00:13:17.440 I would bet all of my money that it made a difference.
00:13:22.080 I'd bet all of it.
00:13:23.000 Now, this is under the assumption that I had to bet one way or the other.
00:13:26.560 I would prefer not betting because I'm not that confident.
00:13:30.860 You know, I'd need something like 99% confidence to bet everything I have.
00:13:36.500 But in this hypothetical where I had to bet, yeah, I would bet that they work.
00:13:42.380 Now, I'm also, am I unbiased?
00:13:49.300 Nope.
00:13:49.740 Because I have committed to a position early on in the pandemic.
00:13:55.000 Do you remember early on when the experts were saying masks don't work?
00:14:00.240 And I believe, and by the way, I would love a fact check on this.
00:14:03.380 If there's anybody who can give me a fact check.
00:14:04.960 I believe, but I can't confirm it yet, that I'm the first public figure to call bullshit on Fauci, et cetera, saying that masks don't work.
00:14:17.060 Now, I've made a very public commitment to the view that masks almost certainly work. That was my view early on, before we had data; it just seemed logical that they would.
00:14:29.280 Given that I made that commitment, I am no longer able to be objective when I look at, say, a result like this poll, because I want it to agree with me, right?
00:14:40.460 Who doesn't want to be right on their birthday?
00:14:44.480 Everybody wants to be right, especially on their birthday.
00:14:47.240 So you should not take me as a credible source when I tell you, hey, this poll looks pretty credible, because I want it to be true.
00:14:57.600 I want it to be true because that makes me feel good and me look good.
00:15:02.740 So I have no credibility on this question.
00:15:06.140 You get that, right?
00:15:06.920 That doesn't mean I'm wrong.
00:15:08.980 I actually think I'm right.
00:15:11.060 But in terms of you judging my credibility on this question, it should be zero, because I've set up the perfect condition for cognitive dissonance.
00:15:21.020 I set myself up for that.
00:15:23.260 By taking a public stand and putting my reputation behind it, and it's a big question, right?
00:15:28.640 It's not a trivial question.
00:15:30.440 If it were trivial, I'd say, ah, who cares?
00:15:33.340 But it's big.
00:15:34.100 Like, people may have lived or died, literally lived or died, on the question of whether they believed masks worked or didn't.
00:15:43.120 May have, right?
00:15:44.340 That's unclear at this point.
00:15:47.880 So, yeah, you should not believe me on that topic, but I'll report what I see on the topic.
00:15:53.540 I will report it if it says masks don't work.
00:15:55.900 If there's a big study that disagrees with me, I'm not going to like it.
00:16:00.740 But I promise you I'll report it so you've heard it.
00:16:05.420 That's the best I can do.
00:16:07.580 You know, I've been reading a lot about China lately.
00:16:10.740 You've got a fentanyl problem, a Uyghur genocide problem.
00:16:14.840 They've got a stealing IP problem, maybe not too forthcoming on the virus.
00:16:21.060 And, you know, when you put it all together, I'm starting to think that China doesn't have our best interests in mind.
00:16:33.140 All right, moving on.
00:16:36.700 CNN reports that, I guess now we have a recording of a 2019 phone call in which Rudy Giuliani was, according to the CNN headline,
00:16:46.240 this is how they said it, Rudy Giuliani cajoled and pressured Ukraine to investigate baseless conspiracies about Biden.
00:16:57.700 Now, the reporting on this is very interesting because it suggests that CNN knows how investigations turn out before they're started.
00:17:08.580 Is this a pattern?
00:17:10.320 Well, yes, it is.
00:17:11.340 It is because it turns out that CNN has done this with the election audits.
00:17:18.360 CNN is already reporting the result of an audit that hasn't happened yet, or at least isn't complete yet.
00:17:26.240 This is really happening right in front of you.
00:17:29.160 CNN is telling you the outcome of an audit before the audit's over.
00:17:36.100 And people are just accepting that.
00:17:38.760 They're accepting it as if that makes any sense.
00:17:42.080 Now, it would certainly be fair for them to say, based on everything we've seen and reported,
00:17:47.580 we're not expecting that the audit will come up with any surprises.
00:17:51.640 That's fair.
00:17:53.480 That's fair.
00:17:54.620 Completely fair.
00:17:55.960 Based on everything we've seen and reported so far,
00:17:58.760 we have no reason to believe that the audit will kick up some surprises.
00:18:03.040 I would say that's fine.
00:18:04.360 But when you say it's baseless, you're kind of suggesting you know how it's going to turn out.
00:18:11.600 And you don't.
00:18:13.240 They did, I would say, similar things with the Russian collusion.
00:18:17.240 There was a...
00:18:18.740 Obviously, the Mueller investigation was ongoing.
00:18:21.520 But it seems to me that CNN reported it as a fact all along, before it was done.
00:18:28.580 And now this Ukraine investigation that never happened,
00:18:31.920 but somehow CNN knows what the result would be for the investigation that never happened.
00:18:40.200 So these are three clear, I think, clear examples of where CNN knew the result of an investigation
00:18:47.780 before the investigation.
00:18:50.520 How do they even remain as a TV network?
00:18:57.000 They are so far into ridiculousness.
00:19:00.020 Now, let me ask you this.
00:19:01.800 Am I biased in favor of Fox News?
00:19:06.160 Or is it true that Fox News doesn't do this?
00:19:09.840 Now, I'm not saying that Fox News doesn't do their own stuff you can criticize, right?
00:19:14.860 So we're not saying Fox News good, CNN bad.
00:19:18.040 That's not what I'm saying.
00:19:18.820 But I'm saying this specific trick, where they know the outcome of something before the end,
00:19:25.100 isn't that only CNN?
00:19:27.400 I don't know if MSNBC does it.
00:19:29.860 But I can't remember any time that Fox News has ever told us the outcome of a study
00:19:36.820 before the study was over.
00:19:39.000 Do you have any memory of that?
00:19:40.700 This seems to be specific to CNN.
00:19:43.560 Here's a scary story, but fun.
00:19:46.540 It turns out that criminal organizations, mafia types, have been using an encryption technology
00:19:55.280 in which you would take a phone that had been crippled for all other purposes except for
00:20:00.940 this one encrypted app, and it could only talk to people who also had a crippled phone
00:20:06.940 with just that app on it.
00:20:08.680 So it was like a really special, super-secret encrypted thing that's better than, you know,
00:20:14.260 Telegram, it's better than Signal, it's better than WhatsApp, because those are just, you
00:20:20.220 know, encryption apps.
00:20:22.200 But in this one, you had to have an actual special phone that could only talk to the other special
00:20:27.060 phones.
00:20:27.540 It's pretty well encrypted, right?
00:20:31.500 Except law enforcement has had that encryption broken for a long time, and they've been watching
00:20:38.040 the entire criminal enterprises revealing all of their secrets for a long time until they
00:20:45.960 just did a big roll-up and arrested a bunch of people.
00:20:49.180 But here's my advice to you.
00:20:54.560 There's no such thing as an encrypted app.
00:20:58.900 Now, technically they're encrypted, in the sense that encryption happens in a, you know,
00:21:05.520 math and coding sense.
00:21:07.800 Yes.
00:21:10.740 Betsy, you skied in my hometown.
00:21:13.160 I'll be damned.
00:21:13.660 You've seen my house.
00:21:14.460 If you were on the ski slope in Windham, New York, you saw my house, because you just looked
00:21:19.440 across the valley, and there it was.
00:21:23.340 But here's the thing.
00:21:27.100 Here's why there's no such thing as a safely encrypted app.
00:21:32.120 My goodness, you, Jin Jian, you are way too nice.
00:21:36.740 I certainly appreciate it, but I'm not worth it.
00:21:40.060 But I appreciate it very much.
00:21:45.700 So there's no such thing as a really encrypted app, and here's why.
00:21:50.220 Your encrypted app ends up with a human being.
00:21:55.400 That's why you sent it, right?
00:21:56.920 You sent your message to a human being.
00:22:01.500 The human being isn't encrypted, right?
00:22:04.640 As soon as the human being gets it, it's no longer a secret message.
00:22:09.860 It's a message that some damn person can tell anybody they want.
00:22:13.620 So the moment you think, well, I'm safe now.
00:22:16.800 I got my encrypted app.
00:22:19.420 No such thing.
00:22:21.140 I used to have the Signal app.
00:22:25.220 Got rid of it.
00:22:26.260 Because the moment you think your messages are secret, you're fucked.
00:22:33.080 There's no such thing as a secret message.
00:22:35.940 That doesn't exist.
00:22:37.380 Because the person on the other end is not encrypted.
00:22:42.240 And let me point out this.
00:22:45.280 Those companies, Signal, Telegram, WhatsApp,
00:22:48.760 don't they have human beings working there?
00:22:51.320 Do you think there's no way to get an insider
00:22:54.700 to figure out how to beat an encryption?
00:22:59.040 Actually, I don't know the answer to that question.
00:23:00.840 It might be that there is no way.
00:23:02.580 But I doubt it.
00:23:04.100 I would think that if you could get an insider
00:23:06.300 who had the right skills, the right access,
00:23:08.840 they could open up any app.
00:23:11.140 How do you know that insiders haven't already gone
00:23:13.620 to the developers for whatever app you're using
00:23:16.940 and said to them, here's the deal.
00:23:19.100 If you don't give us a backdoor to your encrypted app,
00:23:22.960 we will shut your operation down one way or another.
00:23:26.800 We'll do it legally,
00:23:28.080 but you're basically out of business
00:23:29.920 unless you give us a backdoor.
00:23:31.840 Did that happen?
00:23:33.840 How would you know?
00:23:35.620 How would you know?
00:23:37.240 Is there any way you would know
00:23:38.700 that your app developer had a secret conversation with the FBI?
00:23:42.960 No.
00:23:43.940 No, you'd have no way of knowing.
00:23:45.020 So you would be a sucker to use encrypted apps.
00:23:50.400 You would be a sucker.
00:23:51.780 Because an encrypted app is exactly where people
00:23:54.180 are going to look for your secrets.
00:23:56.840 Now, it might not be that your neighbor can read your encrypted app,
00:24:01.220 but your neighbor wasn't reading your other mail either.
00:24:05.240 Your neighbor wasn't getting into your text messages.
00:24:08.140 So you only have two kinds of people.
00:24:10.720 The people who can't get into your messages at all or don't care
00:24:13.580 and the people like the government
00:24:15.860 who really do want to see your messages
00:24:18.380 and they can do it.
00:24:20.840 Right?
00:24:22.460 So this example of the mafia super secret encryption
00:24:27.400 being easily, I don't know if it was easily,
00:24:30.460 but they did crack it,
00:24:32.000 that should tell you everything you need to know
00:24:34.160 about secret information.
00:24:35.520 Don't write anything down in any form if it's bad.
00:24:41.820 Don't ever write incriminating things at all.
00:24:45.880 Ever.
00:24:46.820 Don't write anything encrypted that's private
00:24:50.500 and would get you in trouble.
00:24:52.820 All right.
00:24:54.200 And a related story.
00:24:56.340 And I'm not sure what to think about this yet,
00:24:59.380 but did you hear that the FBI
00:25:02.000 managed to somehow get back the ransom
00:25:05.860 that was paid for the Colonial Pipeline hack?
00:25:10.100 Is that still true?
00:25:11.900 I mean, I heard that yesterday,
00:25:13.100 but it looks like it wasn't in the news this morning.
00:25:16.020 Did something happen to the story?
00:25:17.820 Can somebody confirm that that happened?
00:25:20.220 Is it true that the FBI
00:25:21.940 somehow clawed back the,
00:25:24.540 I think it was Bitcoin,
00:25:25.380 and somehow clawed back that money that was paid?
00:25:28.460 Can I get a confirmation on that?
00:25:31.700 Partially, some of it.
00:25:33.580 Not all of it, somebody says.
00:25:35.340 Well, probably because some of it already
00:25:36.880 had passed through to another wallet.
00:25:39.520 They got a portion of it, somebody's saying.
00:25:41.420 Okay.
00:25:43.320 Now, you might ask yourself,
00:25:45.040 hey, I thought the whole point of Bitcoin
00:25:48.020 and cryptocurrency is that it was secret.
00:25:53.740 So how did the FBI get a hold of the secret passcode,
00:25:59.660 the secret code that led them into the wallet
00:26:01.740 of the bad guys to take their money out?
00:26:04.340 How'd they get it?
00:26:05.800 Because these crypto things
00:26:07.960 are supposed to be all secret and secure, right?
00:26:10.980 It's why criminals use them.
00:26:13.880 How secure is Bitcoin?
00:26:16.960 Well, here's the problem.
00:26:18.860 Apparently, there's a history
00:26:20.160 of the FBI hackers hacking hackers.
00:26:24.840 So in other words,
00:26:25.900 if you had already known you had a hacker group,
00:26:29.040 you might not be able to get directly
00:26:30.840 into their Bitcoin business,
00:26:32.560 but you can control their network
00:26:34.160 so that everything they do,
00:26:36.280 you can see in theory, right?
00:26:38.640 We don't know if this happened in this case,
00:26:40.300 but in theory,
00:26:41.360 you could hack their whole network
00:26:43.100 and anything they did,
00:26:44.660 including typing in a Bitcoin password,
00:26:48.500 might be known.
00:26:50.280 So you don't need to penetrate the Bitcoin itself,
00:26:54.400 you know, the blockchain, etc.
00:26:55.980 You don't need to penetrate that.
00:26:57.820 That still might be secure.
00:27:00.580 But you might be able to get their entire network
00:27:05.840 and then catch them
00:27:07.100 when they're typing in their passcode or something.
00:27:10.120 Now, I think that's the thing, right?
00:27:12.180 Can somebody who's a little more technically astute
00:27:14.780 tell me the answer to this?
00:27:16.380 If you could control somebody's computer
00:27:18.280 such that you could see every keystroke,
00:27:21.500 couldn't you get their passcode?
00:27:23.540 I mean, I think you could, right?
00:27:26.160 So, yeah, a keylogger is what it's called,
00:27:28.380 a keylogger.
00:27:29.060 So, if the good guys hacked the bad guys,
00:27:33.180 they could certainly get a hold of their passcode
00:27:35.520 and then empty their wallets.
00:27:37.200 Don't know if that's what happened.
00:27:39.060 Could have been an insider.
00:27:40.640 Could have been something else.
00:27:41.860 I don't know.
00:27:43.060 I doubt they've penetrated the blockchain
00:27:46.100 in some way that they could get a hold of.
00:27:48.380 I don't think it's possible to do it directly.
00:27:51.860 I think they would have to go after the people,
00:27:54.160 not the technology.
00:27:57.420 All right.
00:27:58.660 Keyloggers are easy to find, yeah.
00:28:00.620 You would think high-end hackers
00:28:02.260 would be protecting themselves from being hacked.
00:28:05.180 So, I doubt it's as simple as a keylogger,
00:28:07.940 but that's the basic idea.
00:28:09.320 They left it on Coinbase, which is dumb.
00:28:14.800 But being on Coinbase doesn't automatically
00:28:17.000 give somebody a backdoor into it.
00:28:20.760 That alone wouldn't make a difference.
00:28:24.340 All right.
00:28:25.220 Vice President Harris made the mistake
00:28:28.980 of trying to speak in public.
00:28:32.380 Vice President Harris speaks in public
00:28:36.060 as well as Joe Biden climbs stairs in public.
00:28:39.320 In either case, it's an adventure,
00:28:42.260 and you're probably a little nervous
00:28:43.880 when you see it if you're a supporter.
00:28:46.360 And one of these stories that we hear
00:28:49.780 is that she's quietly receiving
00:28:51.580 what's called media training.
00:28:54.460 In other words, and by the way,
00:28:56.780 did I ever tell you that this would happen?
00:28:59.660 I told you that sooner or later,
00:29:02.400 somebody was going to be training Harris
00:29:05.140 in how to talk in public
00:29:06.760 because she does not know how to talk in public.
00:29:10.220 Her biggest problem is her nervous giggle.
00:29:13.780 It's a really big problem.
00:29:15.520 And I don't know how hard it is
00:29:17.020 to teach somebody not to do that
00:29:18.640 if it's a lifelong habit.
00:29:20.420 But man, if she doesn't beat that,
00:29:22.220 I don't know how she could ever be president.
00:29:24.500 But she did an interview with Lester Holt
00:29:27.460 in which he was asking her
00:29:29.760 about having not visited the border.
00:29:33.020 And she said glibly,
00:29:36.340 but I also haven't been to Europe,
00:29:39.460 so what's your point?
00:29:42.520 What?
00:29:44.920 Somebody gave her media training
00:29:46.800 and she answered that way,
00:29:48.460 but I also haven't been to Europe,
00:29:49.980 so what's your point?
00:29:50.760 Oh my fucking God.
00:29:54.780 I don't know how you could be any worse.
00:29:57.640 Now, first of all,
00:29:58.420 I assume that she means
00:29:59.400 she's been to Europe personally,
00:30:01.360 but not on official business.
00:30:03.140 So it wasn't like she's never been to Europe,
00:30:05.020 I assume.
00:30:05.740 I mean, that was my interpretation of it.
00:30:08.100 But you need to answer that question directly, right?
00:30:12.620 That was a fair question.
00:30:14.700 Why haven't you been to the border?
00:30:16.080 A completely fair question.
00:30:17.240 It is also a question
00:30:19.160 you know you're going to be asked.
00:30:22.320 Fox News reports the number of days
00:30:24.200 she hasn't visited the border.
00:30:26.840 Every day,
00:30:28.200 the number of days she hasn't visited the border
00:30:30.720 is reported in national news.
00:30:33.420 She knew the question was coming
00:30:35.060 and she wasn't ready for it.
00:30:38.120 Are you kidding me?
00:30:39.800 She wasn't ready for the most obvious question
00:30:42.300 about her job?
00:30:44.800 I mean, that's just really bad work.
00:30:47.760 That's somebody who doesn't look like
00:30:49.560 they did the homework at all.
00:30:51.980 I mean, you can't even say
00:30:53.440 she tried hard but failed.
00:30:55.360 This doesn't look like trying at all.
00:30:58.300 This looks like literally being absent
00:31:00.900 from your duties
00:31:02.040 if you can't answer a simple question like that.
00:31:06.380 So the direct answer would have been this.
00:31:09.520 I would like to...
00:31:11.040 Let me give you the correct media answer, okay?
00:31:14.180 Vice President Harris,
00:31:16.660 people are concerned that you haven't visited the border
00:31:19.140 and yet you're in charge of making sure Central America
00:31:22.240 doesn't send more people across the border.
00:31:24.260 Don't you think you should visit the border?
00:31:26.140 Here's the correct answer.
00:31:27.580 I think it's very important
00:31:28.580 that everybody visits the border
00:31:31.200 who's involved in the decision-making.
00:31:33.840 So yes, I'm definitely going to visit the border.
00:31:36.060 But in terms of priorities,
00:31:38.420 my first priority is specifically
00:31:40.600 working with these countries
00:31:42.200 to make sure that there's less problem
00:31:44.280 on the border in the first place.
00:31:46.160 But yeah, absolutely, I'll visit the border.
00:31:48.100 We just need to work that into the schedule.
00:31:50.560 At the moment, frankly,
00:31:51.960 we have enough reporting from the border
00:31:53.820 that I know what's going on there.
00:31:55.500 But I do think that for the benefit of the public,
00:31:58.340 the public needs to see me there.
00:31:59.720 So I'm with you.
00:32:01.220 And as soon as I get done with this stuff,
00:32:03.500 we're going to schedule a trip to the border.
00:32:06.060 And I'd like to see it firsthand.
00:32:07.860 But we do have good reporting on it.
00:32:10.520 Compare my answer to,
00:30:13.860 I haven't been to Europe either.
00:32:18.540 I mean, seriously.
00:32:20.800 How bad a politician can you be
00:32:23.080 to not be able to answer
00:32:24.420 the most obvious question you'll ever be asked?
00:32:28.140 All right.
00:32:29.720 Ron DeSantis continues to do
00:32:34.800 almost everything right
00:32:36.260 for somebody who's likely to run for president.
00:32:40.780 I mean, whoever is advising Ron DeSantis,
00:32:45.720 assuming that this isn't him making up
00:32:47.900 all of his own decisions,
00:32:49.140 maybe it is, I don't know.
00:32:50.720 But he really is doing one right thing after another.
00:32:53.440 It's kind of crazy.
00:32:54.700 So the latest right thing he did
00:32:56.580 in terms of running for president,
00:32:58.300 if that's his ambition,
00:32:59.720 is he's going after China
00:33:01.240 from the perspective of a governor,
00:33:04.800 which is really smart.
00:33:07.480 Because I don't know who else has done this.
00:33:09.480 Has anybody done this as directly?
00:33:11.780 So he's done, I guess,
00:33:13.280 he's blaming the Chinese government
00:33:17.100 for the virus and the way they handled it.
00:33:19.100 So that's A+.
00:33:20.220 But that's a little bit more ordinary.
00:33:22.820 Other people are doing that.
00:33:24.740 But then he just decided,
00:33:28.040 well, he just passed some laws, I guess,
00:33:29.900 or he's promoting a bill
00:33:30.940 that would ban partnerships
00:33:33.680 between the Chinese government
00:33:35.600 and any educational entity in Florida.
00:33:38.460 So there's something called the Confucius Institute,
00:33:41.520 which, quote,
00:33:43.240 facilitates cultural exchanges
00:33:45.440 and direct deals between foreign governments
00:33:48.580 and Florida's universities and colleges.
00:33:51.400 And DeSantis is saying,
00:33:53.020 how about China get out of Florida?
00:33:54.480 Yes, that is the correct approach.
00:34:00.080 Yes, Ron DeSantis,
00:34:01.740 you are 100% on the right page here.
00:34:05.460 100%.
00:34:06.420 Not just politically,
00:34:08.140 because it's brilliant politically,
00:34:10.200 because it raises his stature
00:34:11.580 to national leader.
00:34:13.900 But it's also exactly the right thing to do.
00:34:16.740 There's no...
00:34:17.480 This one's not ambiguous at all.
00:34:19.180 This is exactly the right thing to do.
00:34:20.740 And then also, there's a second bill.
00:34:24.900 So these are bills.
00:34:25.760 These are not laws yet.
00:31:27.180 The second bill would make
00:34:28.780 theft and trafficking of trade secrets
00:34:30.800 a criminal offense under state law.
00:34:33.500 They're already federally...
00:34:34.660 Federally it's a crime,
00:34:35.900 but he's tightening up
00:34:36.720 and making it a state crime as well.
00:34:38.360 And that, of course,
00:34:39.040 is aimed more at China than anybody else.
00:34:42.620 So by Ron DeSantis packaging up,
00:34:45.420 you know,
00:34:46.060 some anti-China (anti-Chinese-government, really,
00:34:49.400 we don't want to say
00:34:50.420 it's about Chinese people;
00:34:51.720 we love the Chinese people,
00:34:53.080 not the government) legislation,
00:34:55.420 it's just brilliant politically.
00:34:57.960 So I'm glad that I have DeSantis
00:34:59.860 to talk about,
00:35:00.860 because I like to talk about
00:35:02.020 good and bad persuasion.
00:35:04.000 And he's doing it great.
00:35:05.680 Doing it great.
00:35:08.720 Obama's in the news,
00:35:09.960 and I have a big question about him.
00:35:12.880 So I guess he's got a book coming out,
00:35:14.720 so he's doing some interviews.
00:35:15.600 And he repeated the fine people hoax.
00:35:20.920 And I'm a little surprised by it.
00:35:26.760 Now, I can imagine...
00:35:27.980 It's easy to imagine that in the past
00:35:29.900 that all the Democratic figures
00:35:31.780 had repeated that hoax,
00:35:33.820 because I think a lot of them
00:35:34.780 believed it in the beginning.
00:35:37.060 But do we believe...
00:35:38.540 And I asked this in a poll.
00:35:40.980 I said,
00:35:41.400 do you think Barack Obama
00:35:42.540 believes the fine people hoax
00:35:44.120 is actually real?
00:35:46.460 As recently as yesterday?
00:35:48.900 Could somebody as well-informed
00:35:50.800 as Obama
00:35:51.900 not know that that's a hoax?
00:35:54.540 By now.
00:35:55.780 Certainly, I would forgive anybody
00:35:57.220 who didn't know it was a hoax
00:35:58.340 in the beginning.
00:35:59.500 But now?
00:36:00.700 How do you not know that now?
00:36:02.940 You know, even Biden stopped saying it.
00:36:05.000 We haven't heard Biden say it
00:36:06.240 in quite a while,
00:36:06.980 because I think he finally got the message.
00:36:08.660 It wasn't real.
00:36:10.260 But Obama, does he still believe it?
00:36:12.440 Or is he morally empty?
00:36:17.440 Which is it?
00:36:18.580 Because I actually don't know.
00:36:20.720 I'm actually a little confused.
00:36:22.780 Because it seems to me
00:36:24.060 that especially from the perspective
00:36:25.660 of an ex-president,
00:36:27.380 I don't have a memory of Obama
00:36:31.500 telling lies this big
00:36:33.920 that are so disprovable
00:36:35.860 in public.
00:36:37.580 Can somebody remind me
00:36:40.700 of any big lies
00:36:41.900 that Obama told
00:36:43.540 that were just obvious lies?
00:36:45.720 Now, of course,
00:36:46.900 they all do spin, right?
00:36:48.620 So I'm not talking about spin
00:36:50.080 or a little bit of hyperbole,
00:36:52.420 something's worse than something.
00:36:54.000 Can somebody give me...
00:36:55.340 Well, that was a promise.
00:36:56.780 Somebody says,
00:36:57.720 you know,
00:36:58.040 the promise about
00:36:59.700 if you like your doctor,
00:37:00.880 you can keep it.
00:37:01.600 My guess is that
00:37:03.480 he actually believed that.
00:37:05.600 I don't think he would say it.
00:37:07.400 I mean,
00:37:07.660 my guess is that
00:37:08.380 that wasn't a lie.
00:37:09.380 He actually believed it.
00:37:11.340 But that's a guess.
00:37:13.140 Yeah.
00:37:14.600 That's more of a
00:37:15.640 campaign promise.
00:37:19.580 Isn't it?
00:37:20.580 Yeah.
00:37:21.040 And I'm not sure
00:37:21.840 that just being wrong
00:37:22.800 is the same.
00:37:25.420 The Trayvon...
00:37:26.600 Well,
00:37:27.560 I don't want to get
00:37:28.800 into Trayvon Martin.
00:37:30.560 Yeah.
00:37:31.040 So we don't really
00:37:32.440 have a big history
00:37:33.360 of Obama saying things
00:37:34.960 that are this obviously
00:37:36.100 provably false.
00:37:39.080 You know,
00:37:39.240 it would be one thing
00:37:39.860 to say something
00:37:40.480 that you can't prove
00:37:41.420 one way or the other
00:37:42.300 or the statistics
00:37:44.520 are misleading.
00:37:45.500 You could read it this way
00:37:46.460 or you could read it that way.
00:37:47.820 That's one thing.
00:37:49.220 But when has Obama
00:37:50.520 just said
00:37:51.040 an obviously false thing
00:37:53.200 that you can just
00:37:54.280 look up yourself
00:37:55.000 and tell it's false?
00:37:56.600 Yeah.
00:37:59.220 So I have...
00:38:01.460 My guess
00:38:02.380 is that he actually
00:38:03.700 believes it.
00:38:05.480 That's my best guess.
00:38:07.300 I can't read his mind,
00:38:09.120 but I think
00:38:09.740 he actually believes it.
00:38:12.060 Amazing.
00:38:13.080 He also said
00:38:13.980 that it was laughable
00:38:15.120 because he literally laughed.
00:38:16.720 He, quote,
00:38:17.480 chuckled
00:38:18.040 when he talks about
00:38:20.900 Republicans thinking
00:38:21.960 that critical race theory
00:38:23.360 and the teaching of it
00:38:24.300 is the greatest threat
00:38:25.920 to our republic.
00:38:26.600 Now, of course,
00:38:27.400 he's exaggerating a little bit,
00:38:28.960 but he's acting
00:38:29.500 as though Republicans
00:38:30.280 think critical race theory
00:38:31.800 is the biggest threat
00:38:32.760 to our republic.
00:38:34.780 Is that...
00:38:35.920 Is that crazy?
00:38:38.940 Is it crazy for somebody
00:38:40.560 to think that critical race theory
00:38:42.520 is, in fact,
00:38:43.760 the biggest risk
00:38:44.680 to the republic?
00:38:45.820 Because I think
00:38:46.620 you could make that argument.
00:38:48.380 You could definitely
00:38:49.260 make that argument.
00:38:50.120 It goes like this.
00:38:51.800 The most important thing
00:38:53.280 that holds the country
00:38:54.120 together is...
00:38:56.080 If you're not a dictatorship,
00:38:57.900 the most important thing
00:38:59.260 that holds the country
00:39:00.040 together is...
00:39:02.160 The culture.
00:39:03.820 The set of beliefs
00:39:05.520 that everybody shares.
00:39:07.420 If you get this set
00:39:09.780 of beliefs wrong
00:39:10.580 that everybody shares
00:39:11.660 about who you are
00:39:12.660 and what your country
00:39:13.420 is about,
00:39:14.540 then the whole thing
00:39:15.200 falls apart.
00:39:17.480 Critical race theory
00:39:18.720 attacks the central premise
00:39:20.360 of America
00:39:21.060 that we're a melting pot
00:39:23.020 and that we shouldn't be
00:39:24.600 obsessing over people's race
00:39:26.840 or anything else.
00:39:28.680 So, I would say
00:39:30.440 that the DNA of America,
00:39:33.400 is that we don't
00:39:34.300 obsessively focus on race,
00:39:36.780 we try to ignore it
00:39:37.980 while still also doing
00:39:39.200 everything that
00:39:39.880 the legal system
00:39:41.120 and good people
00:39:41.840 can do to eliminate
00:39:43.020 bias and bigotry
00:39:44.680 and prejudice
00:39:46.000 and all that.
00:39:48.500 So, yeah,
00:39:49.220 you work hard
00:39:49.760 to eliminate it,
00:39:51.340 but you don't say
00:39:52.580 this is who you are.
00:39:53.920 You don't change
00:39:54.560 your identity
00:39:55.100 to a bunch of assholes
00:39:56.380 who hate each other.
00:39:57.220 Keep the melting pot.
00:39:59.960 So, I would say
00:40:01.060 that to Obama,
00:40:02.040 I don't think
00:40:02.520 you understand
00:40:03.400 what holds
00:40:04.760 the country together.
00:40:06.100 It's a shared
00:40:06.960 set of beliefs.
00:40:08.240 That's it.
00:40:09.240 You get rid of
00:40:09.980 the shared set of beliefs
00:40:10.840 and there's nothing
00:40:11.700 that would hold
00:40:12.700 the country together
00:40:13.420 at all.
00:40:15.280 You know,
00:40:15.500 the Constitution
00:40:16.200 won't do it,
00:40:17.220 the laws won't do it,
00:40:18.360 the military won't do it,
00:40:20.360 the political system
00:40:21.400 won't do it.
00:40:22.340 You've got to start
00:40:23.060 with that
00:40:23.480 or nothing works.
00:40:25.200 So, he is looking
00:40:26.360 to erase
00:40:28.000 the DNA
00:40:29.280 of the country
00:40:30.000 and I've got to think
00:40:31.240 that is the biggest
00:40:32.160 risk to the country.
00:40:35.100 And it's hard
00:40:35.800 to imagine
00:40:36.400 he doesn't see that.
00:40:38.220 Rasmussen reports
00:40:39.060 that only 42%
00:40:41.320 of moderates
00:40:42.020 think Fauci
00:40:43.460 was telling the truth
00:40:44.440 about the gain
00:40:45.340 of function
00:40:45.880 funding in Wuhan.
00:40:48.540 Now,
00:40:48.980 I tell you only
00:40:49.500 what the moderates think
00:40:50.500 because I don't have
00:40:51.800 to tell you
00:40:52.200 what the left thinks
00:40:53.000 or the right thinks.
00:40:53.780 Would you be surprised
00:40:55.460 to know
00:40:55.940 that the people
00:40:57.140 who identify
00:40:57.660 as the left
00:40:58.380 largely believe
00:40:59.740 that Fauci
00:41:00.320 is telling the truth?
00:41:01.820 Would you be surprised
00:41:02.760 to know
00:41:03.180 that people
00:41:03.760 who identify
00:41:04.280 with the right
00:41:04.900 largely believe
00:41:07.140 that he's lying?
00:41:08.640 So, I just look
00:41:09.540 at the moderates
00:41:10.180 and if you lose
00:41:11.420 the moderates,
00:41:12.920 well,
00:41:13.620 you lost the argument
00:41:14.600 and among the moderates,
00:41:16.220 only 42%
00:41:18.360 believe him.
00:41:19.980 So, even the moderates
00:41:20.920 have left
00:41:21.360 at this point.
00:41:22.240 Now,
00:41:23.280 my view
00:41:24.180 on Dr. Fauci
00:41:25.160 is a little bit
00:41:26.220 more nuanced
00:41:27.180 which is
00:41:28.700 there may in fact
00:41:30.360 have been
00:41:30.700 some cover your ass
00:41:31.760 activity going on here
00:41:33.900 but that doesn't mean
00:41:36.000 that they're guilty
00:41:36.840 of something wrong.
00:41:38.540 You can cover your ass
00:41:40.120 even if you didn't
00:41:41.440 do anything wrong
00:41:42.260 and you could
00:41:43.120 rightly be blamed
00:41:44.900 for covering your ass
00:41:46.040 and there are two reasons
00:41:48.100 to cover your ass.
00:41:49.140 One is that you're guilty
00:41:49.960 and you don't want people
00:41:50.820 to find out
00:41:51.520 and the other
00:41:52.540 is that you're not guilty
00:41:53.520 and you don't want
00:41:54.560 to be unreasonably
00:41:55.520 blamed for something
00:41:56.360 so you cover your ass.
00:41:58.100 So, was Fauci
00:41:59.120 trying to cover his ass
00:42:00.540 by the way he talked
00:42:02.280 about the gain of function
00:42:03.980 to try to downplay it
00:42:05.960 and make it look like
00:42:07.720 no, not directly
00:42:08.940 and it was only
00:42:09.680 a little bit of money
00:42:10.540 and it wasn't
00:42:11.460 that kind of gain of function.
00:42:13.420 Yeah, he did all that
00:42:14.500 but that doesn't mean
00:42:16.620 he's guilty of a crime
00:42:18.440 because whether you're guilty
00:42:20.340 or not guilty
00:42:21.160 you're going to kind of
00:42:22.180 cover your ass.
00:42:23.900 We'd expect it
00:42:24.760 in both situations.
00:42:26.880 So, I think that there
00:42:29.900 probably is something
00:42:31.160 technically true
00:42:32.300 about funding
00:42:34.080 that ended up
00:42:35.140 in the wrong place
00:42:35.960 but I don't know
00:42:37.520 that that being
00:42:38.180 technically true
00:42:39.180 makes Fauci
00:42:40.080 a bad person
00:42:41.360 or a liar
00:42:41.940 necessarily.
00:42:43.460 So, I'm not quite
00:42:44.660 of that opinion yet.
00:42:45.700 My opinion is
00:42:46.980 that he might have
00:42:47.500 been wrong a lot
00:42:48.420 like everybody.
00:42:50.700 You know?
00:42:51.140 Unfortunately,
00:42:51.740 this was not
00:42:52.280 the kind of emergency
00:42:53.320 in which we could say
00:42:55.000 well, the public is dumb
00:42:57.740 but thank goodness
00:42:59.100 we have experts.
00:43:00.720 It wasn't that kind
00:43:01.700 of problem.
00:43:03.260 It was a problem
00:43:04.040 where the experts
00:43:04.680 didn't know
00:43:05.180 but they were doing
00:43:06.680 the best they could.
00:43:08.080 So, I'm going to be
00:43:08.980 consistent with what
00:43:10.020 I said at the beginning
00:43:11.120 of the pandemic
00:43:11.920 which is that
00:43:13.460 at the end of the pandemic
00:43:14.660 we're going to be
00:43:15.480 blaming our leaders
00:43:16.540 for making bad decisions
00:43:17.820 and I think
00:43:19.440 it's a little bit
00:43:20.420 harsh.
00:43:22.220 You sort of have to
00:43:23.280 do it anyway
00:43:24.180 because people
00:43:24.900 have to be accountable
00:43:25.620 but it's a little harsh
00:43:26.920 because people
00:43:27.580 were guessing
00:43:28.120 and it was fog of war
00:43:29.140 some were going
00:43:30.160 to get it right
00:43:30.760 by luck
00:43:31.560 some were going
00:43:32.780 to get it wrong
00:43:33.420 by bad luck
00:43:34.840 and
00:43:36.020 criticizing the people
00:43:38.900 who got it wrong
00:43:39.540 there's just not much
00:43:41.360 to gain in that.
00:43:43.720 All right.
00:43:44.660 Somebody says
00:43:47.520 he actively suppressed
00:43:48.820 the lab leak theory.
00:43:50.960 Well, here's what
00:43:51.660 I ask about
00:43:52.320 the lab leak theory.
00:43:53.760 Given that we know
00:43:54.760 now that there is
00:43:55.680 a place
00:43:56.240 on the genome
00:43:58.240 where you look
00:43:59.140 specifically a place
00:44:00.920 to look
00:44:01.400 to see if something
00:44:02.740 has been engineered
00:44:03.540 all of those opinions
00:44:06.120 about it not being
00:44:07.120 engineered
00:44:07.620 they had not yet looked
00:44:10.820 at the place
00:44:12.040 that you could look
00:44:12.740 to find out
00:44:13.300 if it was.
00:44:15.100 What?
00:44:17.360 How do you
00:44:17.940 explain that?
00:44:19.380 That we had
00:44:20.080 a strong opinion
00:44:20.920 about it
00:44:21.460 before looking
00:44:23.500 at the one thing
00:44:24.300 that would actually
00:44:24.820 tell us if it
00:44:25.500 happened or not.
00:44:26.980 Well, that's your
00:44:28.180 world you live in
00:44:29.040 and
00:44:30.100 I
00:44:31.160 am going to go off
00:44:32.160 and enjoy the rest
00:44:33.460 of my birthday.
00:44:34.740 Derek
00:44:35.160 my goodness
00:44:36.340 you're much
00:44:36.960 too nice.
00:44:41.340 I do work hard
00:44:42.900 to try to make
00:44:43.720 a difference.
00:44:44.800 That is true.
00:44:45.800 So it is my
00:45:46.860 intention,
00:44:46.860 it is always my
00:44:48.460 intention to work
00:44:49.340 hard
00:44:49.620 to make something
00:44:51.680 better.
00:44:52.900 So I try
00:44:53.680 and I appreciate
00:44:55.380 very much
00:44:55.980 you recognizing
00:44:56.980 that
00:44:57.300 and thank you
00:44:58.400 so much
00:44:58.820 and I will talk
00:44:59.720 to you
00:45:00.100 tomorrow.
00:45:02.100 Okay,
00:45:02.480 here we go.
00:45:02.860 Here we go.
00:45:03.460 We are going to do this
00:45:03.660 here.
00:45:04.720 Here we go.
00:45:05.360 Here we go.