Real Coffee with Scott Adams - March 07, 2021


Episode 1306 Scott Adams: Stimulus Package Shenanigans, Voting Changes, Super Anti-Racists, and Robot Lovers


Episode Stats

Length

1 hour and 12 minutes

Words per Minute

140.7

Word Count

10,157

Sentence Count

705

Misogynist Sentences

12

Hate Speech Sentences

22


Summary

A movie recommendation (The Accidental President), the growing list of accusers against Governor Cuomo, a Costa Rica-inspired idea for creating jobs, the stimulus package and the debt, voting changes, Germany's vaccination troubles, Trump's accomplishments, and cancel culture.


Transcript

00:00:00.000 All right, here's a little quiz for you. What is the best part of the day? Go. That's right. That's right. You all got it right. It's this. This is the best part of the day.
00:00:13.380 It's called Coffee with Scott Adams, and even Jack Posobiec knows it's the best part of the day, as he just logged in. And if you would like to enjoy this to the maximum possible potential, and why wouldn't you, really? Why wouldn't you?
00:00:29.160 All you need is a cup or a mug or a glass, a tank or chalice or a stein, a canteen jug or flask, a vessel of any kind. Fill it with your favorite liquid. I like coffee. And join me now for the unparalleled pleasure of the dopamine hit of the day, the thing that makes everything better. It's called the simultaneous sip, and it happens now. Go.
00:00:54.320 Ah, Dr. Funk Juice. Good to see you.
00:00:59.160 Well, we got some fun stories today. Now, they're not good, exciting stories like Trump stories, but they're pretty good. They're pretty good. Would you like to hear some of them? I know you would.
00:01:12.140 Number one, I have a movie recommendation for you, a documentary called The Accidental President. You can find it on Amazon Prime and probably at least one other place, I think.
00:01:24.680 The documentary is by James Fletcher, who I believe is married to Brooke Baldwin, formerly of CNN. But surprisingly, surprisingly, it's pretty fair.
00:01:39.120 And I'm in it, so I'm a little bit biased because I'm in it. But the interesting part is that this documentary was, at least the film of it, was put together a few years ago.
00:01:54.460 And it was talking about how Trump ended up winning. And the theme is that he wasn't necessarily expecting to win, but it just sort of happened.
00:02:04.480 But you should hear the commentary on it. It's pretty good. The Accidental President. All right.
00:02:13.120 Cuomo's up to five accusers if you watch Fox News, and he's up to three accusers if you watch CNN.
00:02:19.980 What? What? You can't even get the number of accusers right?
00:02:27.960 Now, if you get to three, there should be some sort of universal law about this.
00:02:34.320 It goes like this. If you get to three accusers, there are five.
00:02:40.740 In the history of sexual harassment, nobody ever stopped at three.
00:02:50.500 Pretty sure.
00:02:51.940 Like, if you were accused of one, and then no other person came forward,
00:02:57.080 you might say to yourself, well, that's a little less credible because
00:03:00.400 do harassers only ever harass one person?
00:03:04.360 But once you get to three, it's going to go to five.
00:03:11.180 That's like a universal rule.
00:03:13.120 I'll tell you my take on all of this.
00:03:15.580 All of these accusations, they seem to fall into this category.
00:03:20.740 Have you noticed that?
00:03:23.000 And the category is, and I'm going to say this,
00:03:26.020 I think I can say this without being racist.
00:03:29.700 I think, but give me a ruling on this.
00:03:33.380 See if this sounds racist to you, okay?
00:03:36.900 That what I'm seeing is a creepy old Italian-American guy
00:03:41.220 with no game whatsoever,
00:03:43.140 has no idea how to approach women, apparently.
00:03:47.840 But he definitely took advantage of his power differential,
00:03:54.340 meaning that he didn't directly use it,
00:03:58.360 but it was there.
00:04:00.500 It's unambiguously there.
00:04:02.960 So he certainly has to answer for this.
00:04:05.240 I don't want to say anything that sounds like I'm apologizing for it,
00:04:09.060 but we should keep it in perspective.
00:04:14.660 Certainly he needs to not be doing this stuff.
00:04:18.340 There's no question about that.
00:04:20.500 But it doesn't feel...
00:04:22.740 It feels like a guy who didn't exactly know what he was doing.
00:04:28.560 Doesn't it?
00:04:29.700 Like maybe on some level he did,
00:04:32.320 but there's something a little bit clueless about it
00:04:39.000 that makes it seem different than evil.
00:04:41.320 You know, if you look at the Epstein charges
00:04:44.320 or the Weinstein charges, Harvey Weinstein,
00:04:48.100 it's obvious that those guys knew exactly what they were doing.
00:04:52.660 I mean, they knew they were being deeply bad.
00:04:56.080 And no question about it.
00:04:57.760 But I'll bet you if you had a private conversation with Cuomo
00:05:01.660 before these scandals broke,
00:05:03.560 and you said, have you ever done anything like this?
00:05:06.600 I think he would have honestly answered no,
00:05:10.200 even though he did.
00:05:11.640 And I mean, the accusations look credible anyway.
00:05:16.240 I feel it's just a weird category
00:05:18.420 where the person doing the crime
00:05:20.280 is not necessarily aware that they're doing a crime,
00:05:24.140 which doesn't make it excusable, right?
00:05:27.260 The system is very clear.
00:05:29.920 You don't have to be aware of it being a crime
00:05:32.100 for it to be a crime.
00:05:32.960 But it should end his political career,
00:05:37.520 which is the only thing anybody cared about, really, on this.
00:05:44.040 Here's what I think we should do in the United States.
00:05:48.500 Every time we hear about people coming across our border,
00:05:53.660 don't they seem to be coming from the same countries?
00:05:56.700 Yeah, people are talking about Cuomo's cover-up
00:06:03.200 with the nursing homes and stuff.
00:06:06.440 That's just a different issue.
00:06:09.160 That's not excusable.
00:06:11.100 Not that the other one is.
00:06:12.440 It's just a different issue.
00:06:13.980 So here's what we should do.
00:06:16.440 We should be like Costa Rica.
00:06:19.120 Have you noticed that when we talk about
00:06:20.860 the illegal immigration coming across our southern border,
00:06:25.000 nobody ever says
00:06:27.220 all those Costa Ricans are coming across.
00:06:31.160 Have you ever noticed that?
00:06:33.400 Why not?
00:06:35.000 I mean, if Nicaraguans are coming
00:06:37.220 and other countries are coming,
00:06:40.480 why is Costa Rica not coming?
00:06:43.020 Well, I think the answer is that Costa Rica
00:06:45.100 is a successful country.
00:06:47.840 And if you haven't checked into
00:06:49.420 how they became a successful country,
00:06:52.020 it's a pretty cool story.
00:06:55.720 And the answer is that,
00:06:58.080 well, number one,
00:06:59.180 and this is sort of a questionable decision,
00:07:02.340 but it seems to have worked out,
00:07:04.280 they decided to not exploit
00:07:08.380 what they probably have
00:07:10.380 in terms of natural resources,
00:07:12.300 oil, maybe gas or something.
00:07:14.120 But they've decided not to be that kind of an industry
00:07:17.080 and become a vacation destination.
00:07:19.960 So if you have a waiter or a bartender in Costa Rica,
00:07:24.540 I've had the pleasure of being there once,
00:07:27.300 if you have a waiter or a bartender in Costa Rica,
00:07:30.300 that person went to college
00:07:32.260 to be a waiter or a bartender.
00:07:35.380 Now, not college, college,
00:07:37.380 like maybe you're used to it in the United States,
00:07:40.080 but there is a very well-organized training thing.
00:07:43.780 I think you learn English as well.
00:07:46.960 And so the service is extraordinary.
00:07:49.180 And Costa Rica is in pretty good shape.
00:07:51.500 So you don't see a lot of people coming over.
00:07:53.360 But I think we could borrow that idea,
00:07:55.800 especially as robots are taking jobs
00:07:59.080 and a lot of unskilled labor
00:08:00.900 is going to need a place to thrive.
00:08:04.180 I think we should build,
00:08:05.160 I've said this before,
00:08:05.940 I'll just combine two ideas,
00:08:07.380 a network of bicycle paths across the country.
00:08:11.720 There is actually a private effort to do that,
00:08:16.080 but it's sort of one route
00:08:18.000 in the northern part of the country.
00:08:20.060 I'd like to see the whole country have bicycle paths.
00:08:23.560 So you could basically go from anywhere to anywhere
00:08:25.940 without going through traffic to get there.
00:08:28.380 And make the USA a bicycle travel destination.
00:08:33.300 Because building the paths will require labor
00:08:36.240 as well as maintaining them,
00:08:38.860 as well as providing services and food
00:08:42.340 and bike repair along the way, etc.
00:08:44.340 So it should create a lot of jobs.
00:08:46.100 But the other thing you could do
00:08:47.040 is create a network of canals.
00:08:49.760 Maybe you do it at the same time.
00:08:52.040 Maybe you put the bicycle path on the side of the canal.
00:08:55.460 But the way to build the canal,
00:08:58.040 if you had some big machine
00:08:59.820 like Elon Musk's boring company,
00:09:02.980 B-O-R-I-N-G,
00:09:04.720 because it bores tunnels,
00:09:06.240 it shouldn't be that hard to make one of those
00:09:08.780 that bores a canal, right?
00:09:11.660 To make a canal.
00:09:13.200 So if you had big equipment
00:09:15.040 that could just sort of make a canal.
00:09:17.360 And then the other thing Elon Musk talked about,
00:09:19.540 and I don't know how far he got along in this,
00:09:22.080 but using the dirt that is dug by the boring device
00:09:25.540 to press bricks on site
00:09:28.840 so that you're actually building...
00:09:31.380 Yeah, there's obviously...
00:09:36.100 You have to work out the sea level plan, etc.
00:09:39.300 Right?
00:09:39.880 So you couldn't put a canal just anywhere.
00:09:42.520 You'd have to have just the right place.
00:09:44.020 That's true.
00:09:45.260 Now, if you had canals,
00:09:48.740 and those canals,
00:09:50.180 when they're dug, build bricks,
00:09:51.820 and those bricks become maybe housing
00:09:54.600 on the sides of the canals,
00:09:56.920 and then here's another thing
00:09:58.560 that I heard of today.
00:09:59.660 I'd never heard of this before,
00:10:00.960 but apparently one of the problems
00:10:03.120 with traveling to the United States
00:10:05.120 is if you get sick.
00:10:07.860 If you get sick in the United States
00:10:09.780 and you're from another country,
00:10:11.320 you don't have health care coverage.
00:10:13.340 So apparently there's some kind of insurance
00:10:17.300 you can buy just for traveling.
00:10:19.860 Some kind of health care insurance,
00:10:21.460 and it's pretty expensive.
00:10:22.900 So there are a whole bunch of things
00:10:24.180 we could do as a country
00:10:25.280 to make it easier and better
00:10:27.640 for somebody to take a vacation here.
00:10:29.880 Now, if this had been our big industry
00:10:31.740 when the coronavirus hit,
00:10:33.200 we'd be in big trouble.
00:10:36.720 But I still think travel and recreation
00:10:40.860 are always going to be big.
00:10:41.840 So I'll just put that out there.
00:10:44.180 And then Costa Rica makes a big deal
00:10:46.580 about keeping the water and air clean
00:10:48.540 to keep tourism up,
00:10:51.520 and we would have the same impulse.
00:10:53.480 We'd want to keep the air and water
00:10:54.680 as clean as possible.
00:10:56.700 So I'll just put that out there.
00:10:58.780 So the stimulus package
00:11:00.480 got passed by the Senate,
00:11:01.840 but now it goes back to the House
00:11:04.300 to get a final final.
00:11:07.800 And I see all these people
00:11:09.080 arguing about the amount of the debt.
00:11:12.760 And what the debt should have been.
00:11:14.500 And people saying,
00:11:15.160 instead of 1.9,
00:11:17.260 why didn't we make it 1.4?
00:11:21.220 Nobody knows what the right amount of debt is.
00:11:24.620 That's not really a thing.
00:11:26.540 If you're arguing with any kind of certainty
00:11:28.980 about what the number should have been,
00:11:31.720 I don't think that you have any standing.
00:11:35.500 There's nobody who knows
00:11:36.540 what is the right amount
00:11:38.180 and when to do it
00:11:39.640 and what happens in the long term
00:11:41.760 and how do we pay down the debt
00:11:43.300 and can we just print money
00:11:44.860 and can we inflate it away?
00:11:47.780 Nobody knows any of that.
00:11:49.400 It's just completely unknowable.
00:11:52.140 So I do understand that more debt seems worse,
00:11:55.660 but it just hasn't worked out that way.
00:11:59.840 Observation has not matched the theory.
00:12:04.940 So I would just say
00:12:07.220 having a firm opinion on the debt
00:12:10.040 is a dicey situation now
00:12:12.300 because even experts don't really know
00:12:14.340 what's going on there.
00:12:15.060 Biden's also going to do an executive order
00:12:21.760 on some voting stuff
00:12:23.620 to make that more solid
00:12:24.800 and there's HR1 coming through
00:12:27.460 to make some changes in voting, etc.
00:12:29.820 But here's the thing.
00:12:33.860 Are we really trying to make the voting good
00:12:37.300 in this country
00:12:38.080 or are we just watching two different entities,
00:12:41.520 Democrats and Republicans,
00:12:42.700 using different gaming techniques
00:12:45.600 to make the system favor their side?
00:12:48.500 Because that's all it looks like.
00:12:50.700 It just looks like
00:12:51.580 one side is gaming the system
00:12:53.200 and then if the other side had control,
00:12:55.680 they game the system,
00:12:56.780 they gerrymander
00:12:57.620 and they change what kind of voting you can do,
00:13:01.440 mail-in voting, etc.
00:13:02.720 None of this has anything to do with democracy
00:13:05.000 or a democratic republic or any of that.
00:13:09.840 We're watching a system
00:13:11.140 that is so far divorced
00:13:12.700 from the people's will
00:13:15.660 being expressed in the vote
00:13:17.480 that nothing like that is happening.
00:13:20.060 The things that determine the vote
00:13:21.420 are fake news
00:13:22.480 and how effectively they present the fake news,
00:13:26.760 the game-playing about
00:13:28.120 what rules changes,
00:13:31.540 money,
00:13:32.540 how much people put into it,
00:13:35.140 and then campaign strategy,
00:13:37.580 which often maybe the candidate
00:13:40.060 isn't even the one who decides what it is.
00:13:42.820 So all of these things pick our leaders,
00:13:46.920 but none of them are even vaguely like
00:13:49.580 a democracy or a democratic republic.
00:13:53.540 Our system just drifted into this
00:13:55.660 sort of two organized criminal organizations
00:13:59.920 competing to see who can be the better weasel
00:14:03.080 in any given year.
00:14:04.600 That's kind of all it is.
00:14:08.080 Now, Trump broke the mold
00:14:10.400 because he was such a strong character
00:14:12.260 that all of the game-playing
00:14:14.220 just didn't matter for one election,
00:14:16.400 but I got a feeling they're going to adjust
00:14:19.520 and did for the second election.
00:14:23.840 All right.
00:14:24.120 Tom Sauer asks on Twitter,
00:14:29.900 you know,
00:14:30.280 why bother changing anything
00:14:32.920 in the election process
00:14:34.880 because he says he slowly realized
00:14:37.540 and accepted that we don't live in a democracy
00:14:39.940 or a democratic republic, whatever,
00:14:42.000 at least not in any meaningful sense.
00:14:45.780 Now, what do you think of that?
00:14:47.080 That we shouldn't try to make our...
00:14:49.460 There's just no point
00:14:50.400 because we live in a fixed system
00:14:53.720 where the outcomes are based on things
00:14:56.160 that have nothing to do with voting and democracy.
00:14:59.160 Well, I get the point.
00:15:02.500 So I understand the point.
00:15:04.520 But here's the counterpoint.
00:15:07.180 Our system,
00:15:08.240 as Frankenstein monster-ish as it is,
00:15:14.140 meaning it doesn't operate as a system
00:15:16.160 the way it was designed,
00:15:17.180 it's evolved into some weird creature of a system
00:15:21.340 that does what it wants to do
00:15:22.980 independent of what we want it to do.
00:15:26.420 So as long as it's this big creature
00:15:28.580 of a Frankenstein system
00:15:30.400 that decides somehow who our leaders are
00:15:33.560 and it doesn't have anything to do with us,
00:15:36.380 do you ignore it?
00:15:37.940 Do you get rid of it?
00:15:39.400 Here's the problem.
00:15:41.180 You still need the citizens
00:15:42.620 to believe they live in a system that works.
00:15:45.900 Otherwise, the whole thing will fall apart,
00:15:49.120 even the Frankenstein monster part.
00:15:52.100 You need that.
00:15:53.540 Because if you don't have anything,
00:15:55.020 you just have chaos.
00:15:56.520 So weirdly,
00:15:57.960 we need to trust our system
00:16:00.800 to make it work,
00:16:04.340 but only working as two competing criminal organizations
00:16:07.840 to see who's the best weasel this year.
00:16:11.620 And that's the best we got.
00:16:13.000 But if we didn't at least pretend
00:16:15.400 that we believe
00:16:16.240 that there's some kind of a democracy here
00:16:18.240 and people didn't go and actually vote,
00:16:21.280 we would lose this illusion
00:16:23.160 that the people have some influence.
00:16:26.540 And you don't want to lose the illusion
00:16:28.180 because the whole system depends on that illusion.
00:16:32.840 So don't lose your illusions
00:16:34.240 if they're good ones.
00:16:35.100 Let's see.
00:16:39.460 What else we got here?
00:16:40.660 Remember I told you
00:16:41.580 when it was obvious
00:16:42.740 that the Senate would be pretty close
00:16:45.400 to even Democrats and Republicans?
00:16:48.180 I told you that Joe Manchin
00:16:49.980 would end up running everything?
00:16:53.000 Because the one person
00:16:54.840 who's willing to vote
00:16:56.060 whichever side makes sense
00:16:59.080 ends up running everything.
00:17:01.220 because they couldn't get this through
00:17:04.320 until Joe Manchin was happy or happier.
00:17:08.580 And I'll just put out a compliment to Joe Manchin.
00:17:14.260 Why is he the only one who thought of this?
00:17:16.900 There's not one other Democrat
00:17:18.400 who's smart enough to say,
00:17:20.140 wait a minute,
00:17:22.240 is it my imagination
00:17:23.420 or is Joe Manchin running the whole show here?
00:17:26.560 because he's willing to be a rogue vote.
00:17:31.740 There's not one other person
00:17:33.040 smart enough to do that too.
00:17:35.160 I guess Murkowski.
00:17:38.380 And then I heard that
00:17:39.760 Trump's going to campaign against Murkowski, right?
00:17:44.480 So, oh yeah,
00:17:45.740 somebody says Kyrsten Sinema.
00:17:48.380 So she's getting some attention
00:17:49.720 but doesn't seem important.
00:17:54.400 Here's an interesting story.
00:17:57.460 Guess what country in the world
00:17:59.920 is doing a terrible job with vaccinations?
00:18:05.080 Could it be a country
00:18:06.380 that we used to think
00:18:07.860 was doing a good job
00:18:09.480 on this coronavirus stuff?
00:18:11.980 What country is doing a really bad job
00:18:14.800 on vaccinations?
00:18:17.600 No, the U.S. is doing a good job right now.
00:18:21.060 I think we had close to 3 million shots yesterday.
00:18:23.940 I mean,
00:18:25.000 the United States is killing it
00:18:26.660 lately.
00:18:28.020 It took a while to ramp up.
00:18:30.320 Nobody got the right answer.
00:18:31.960 Oh, there we go.
00:18:32.660 Germany.
00:18:33.900 Germany.
00:18:35.840 So Germany apparently has only vaccinated about 6% of the country, and apparently the thing's just a mess. And even Chancellor Merkel admitted their failings.
00:18:49.460 So remember I told you that we would be led to believe that different leadership decisions would determine the outcome within different countries. Do you remember what I told you? That leadership would be the least important variable.
00:19:09.200 It would be the least important. We think it's the most important because we're sort of human-centric. But now Germany, despite having done this terrible job of giving vaccinations, is still around the middle of the pack in outcomes.
00:19:25.080 So every time you find a country that's doing something amazingly wrong, it doesn't seem to affect their outcomes that much. Great Britain would be an exception. I think their decisions did affect their outcomes.
00:19:38.300 But man, do we not understand what's going on still. That leadership decisions just don't seem to make much difference. And you see that with lockdowns and other things. There's not much difference.
00:19:57.040 All right. I will say again that I believe that Trump's reputation will improve every day he's out of office.
00:20:06.900 Henry Kissinger just said that Trump's plan in the Middle East and what he got accomplished in the Middle East with the Abraham Accords, etc. Henry Kissinger calls it brilliant. He calls it brilliant. And even breaks it down about why it's brilliant, you know, taking the Palestinian question out of it, etc. And that's just one example.
00:20:31.020 Somebody asked me on Twitter, why did I think, before Trump got in office, why was I so dumb to think that he would do great things? And I said, where have you been?
00:20:44.460 Even if you were the biggest critic of Trump in the world, you would still have to accept that he did great things. The vaccination warp speed, that was a great thing. You just can't take that away from him.
00:20:57.800 The Abraham Accords, that's a great thing. Getting North Korea into a whole different headspace, a place where they're just basically not a threat, that's huge.
00:21:10.700 Turning China from our ally-ish kind of situation into clearly something else, going hard at everything from bringing the industries away from China to pressing on them for Huawei, etc. That was all Trump.
00:21:28.220 And Biden is going to have to follow that because once the model is set, it's hard to argue against it.
00:21:35.640 So you could make your separate list of the things that you didn't like about Trump, and I might not even argue with the list.
00:21:44.600 You might say he didn't fix health care. Okay, I agree. You might say what he did with the Capitol assault was completely unacceptable. Okay, I agree.
00:21:55.760 But you can't take away the fact that what he did in half a dozen areas was amazing. I mean, just almost undoable, impossible things.
00:22:06.860 And it's mind-boggling that there's anybody who is an adult who follows politics who would disagree with that. No matter how much you didn't like the other things he did, how do you take away the fact that he did things that just were undoable? I mean, it really is impressive.
00:22:27.700 Somebody says it's not amazing, just common sense. Well, it's amazing that he did it. You know, everybody has common sense, but not everybody did what he did.
00:22:37.560 There's an article on CNN that I read for the humor that says that cancel culture isn't real. And it goes through a number of examples to show that we act like people are getting canceled, but they're really not.
00:22:51.800 And one of the arguments is that the people getting canceled are still doing fine, meaning that their economic situation is fine, so they're not really canceled. One of the examples is an actor.
00:23:04.380 Do we say actor now, or is actress sexist? What's the ruling on that? Because I think actor is still... it feels like the better word.
00:23:15.440 So Gina Carano, who you know was on Disney, whatever that show was, anyway, she got fired from the show for her opinions, I guess.
00:23:29.760 And the CNN article says she wasn't really canceled because she got this great opportunity with Ben Shapiro. So Ben Shapiro is going to produce something and she's been hired to do that.
00:23:42.480 To which I say to myself, well, I see your point, but ask yourself this: if your ambition was to be a successful actor, would you say to yourself, yes, I got the Ben Shapiro gig?
00:24:05.180 Now, I'm a big fan of Ben Shapiro and I think what he's doing is great. It's all positive. But I don't feel like a professional actor is going to say, finally, I got the Ben Shapiro opportunity.
00:24:22.860 Again, all props to Ben Shapiro for doing what he's doing, and we should do more of that, you know, sort of a rescue operation for canceled people. But I feel like the argument that she was canceled still feels true enough, right? It's still true enough. Anyway, that was funny.
00:24:46.400 So, Ivor Cummins. Do you all know Ivor Cummins? You've probably seen him on Twitter. And I guess you could call him a skeptic of the, let's say, the scientific consensus about a number of coronavirus things.
00:25:03.620 Now, apparently we are on each other's radar, and he sent me an article from an expert, a doctor, who is arguing against some of the mainstream assumptions about coronavirus. Mainstream meaning mainstream expert science medical people.
00:25:22.860 And Ivor and a number of his followers are pestering me online that the two of us should argue it out. Get online and argue it out. What do you think of that? Is that a good idea?
00:25:36.360 Do you think that it would be productive for Ivor Cummins and me specifically to have a conversation on some kind of a podcast? No. It would be the worst idea in the world.
00:25:52.220 It would
00:25:52.420 be a
00:25:52.660 terrible
00:25:52.900 idea.
00:25:54.580 Terrible
00:25:55.060 idea.
00:25:56.400 Now, it might give you some entertainment, but it would be worse than useless. I mean, here, let me take you through it. Worst idea ever.
00:26:07.760 Let's say there are scientists in this world and there are non-scientists. Let's say that you wanted to get toward truth, and you wanted the non-scientists, people like me and Ivor Cummins, to be as informed as the scientists. What's a good way to do that? Well, a good way to do that is to have a conversation, right? To communicate.
00:26:33.700 So what happens if, let's say, a scientist is talking to another scientist? Does that help me understand science? Well, I'm not in the room. I'm not there. It's just two scientists talking to each other. So that didn't help me, because when I hear about it, I'll probably hear about it from a journalist. I won't even hear from the scientist. I'll hear it filtered through a journalist. So two scientists having a private conversation, that doesn't help me. It doesn't help me at all.
00:27:03.900 How about if they have a public conversation? And then what am I going to do? Pick the winner? How? What skill do I have to know which of two experts is the right one? If I could pick the right expert, I would be an expert. And if experts are experts, why are they disagreeing? How the hell would I know which one is right? If they don't know who's right, and they're experts, how am I supposed to know? It's ridiculous.
00:27:39.000 So this would be useless in terms of getting me to understand. It's very useful that scientists talk to each other, but it doesn't help me understand.
00:27:48.440 Suppose a scientist talks to a non-scientist. That's good, right? Because the scientist gives you all this good information and now you're informed. No. No. Because it would just be one scientist, let's say, telling me stuff. I don't know if it's true. Because is that one scientist this one, or this other one? Because they disagree.
00:28:16.260 Could a scientist convince me that the scientist was right? Probably. In fact, almost every time. And what would happen if the next scientist came in five minutes later and told me that the first scientist was wrong? Would I believe the second scientist? Yeah. Yeah, I would. Whoever goes last, if they know more than you do, sounds very convincing. So whoever talks to me last probably will change my mind. Same with you. Whoever talks last sounds right. So is that useful? Because it's just whoever talked last. That has nothing to do with science. So that's useless.
00:28:59.660 And that's just another combination of a scientist and a non-scientist. But how about two non-scientists having a conversation? I'm a non-scientist. Ivor Cummins is a non-scientist, I understand. I think he has an engineering or biochem degree or something. So he's a lot closer to having a STEM-like mind frame than I am. But we're not really scientists in this field. Why the hell would you want to see him talk to me? What possible good could that do?
00:29:36.380 Let me tell you how that conversation goes. Look at my study. Look at it. Look at my study that says whatever. Masks don't work, or whatever it is. And then I'll say, all right, where do I take that? Do I look at the study with my non-scientific knowledge and say, well, based on my non-scientific knowledge, the person who does have scientific knowledge is all wrong? What the hell good would that do? Like I would know what I'm talking about.
00:30:12.500 Now, suppose Ivor comes on and says, look at my data. It comes from this good source. Do you think that I couldn't find a piece of paper to hold in my hand that says the opposite of what his piece of paper says? You know I could. You know I could. Do you think I could Google, in one minute, a whole bunch of sources that say masks work? Of course I could. Do you think I could Google sources that say masks don't work? Of course I could. How do I know the difference? How do I know?
00:30:52.660 So, everybody who's asking me to talk to Ivor: you don't understand what that would do. What that would do is elevate his opinion to a greater level of worthiness, because it was being discussed in an organized way. Should his opinion be elevated? No more than mine. And mine shouldn't be elevated, because, again, I'm not a scientist.
00:31:20.320 Now, I do form opinions on the science all the time, because I'm pro-mask, for example, but my decision is not always based on science. My decision is based on risk management, and risk management will sometimes make a different decision than science will. So, if he wants to talk risk management with me, I could talk that all day long. That's something I could speak to with some authority.
00:31:48.300 But judging the science? Neither of us could do that, because the scientists can't do it themselves.
00:31:54.760 Now, here's the thing that Ivor has to explain. He's got this one doctor. There'll be more, right? He wouldn't be the only doctor. But why is it that these doctors who are contrarians can't convince the other doctors? What's up with that?
00:32:13.020 Because you'd think that if the contrarians have good points and data, and they have all the data, according to Ivor, if the contrarians have all the data, that's it, right? That's how science works: here's my data, and then they all change their opinion. But that hasn't happened.
00:32:33.960 How do you explain the fact that science is probably 99% in favor of masks for the pandemic, would you say? Would you say that the professionals are 99% in favor of masks? Does that feel about right?
00:32:50.840 Now, that doesn't mean masks work. It really doesn't. But it does tell you that a whole lot of people who understand how the world works, and science works, and risk management and all that, kind of fell on the same side.
00:33:04.780 Now, after this is all done, and let's say studies are done (I don't know if you could ethically study coronavirus and masks, because you'd have to infect people), but suppose someday we find out that masks absolutely didn't work. Would that make me have been wrong during the coronavirus epidemic?
00:33:28.680 No. No. Because it's risk management. If you make the right risk management decision, and yet it turns out to be wrong, you still made the right decision; you just didn't get lucky. Because lots of times you're guessing. It's like, well, 70% this, 30% that. But the 30% happens. 30% is a lot.
00:33:53.420 All right. Yeah, now I'm seeing somebody saying people wore masks during the 1918 pandemic. You will find people who say, yes, they wore masks in 1918, and that proved they didn't work. And then other people will say, yes, they wore masks in 1918, and that proved it worked. Who's right? The ones who say it proved masks didn't work are the ones who say, yeah, all the evidence, shut up. How do you know?
00:34:30.700 All right. So be careful about what you think gives you knowledge. This process of two non-scientists talking about science would not give you knowledge. It would not.
00:34:44.660 And then I'll add, on top of this, that if you want to judge Ivor's credibility, just Google his name along with words like fact check and debunk, and you'll see some articles. Now, I don't have an opinion about whether the people who try to debunk him are right, but just know it's out there. All right.
00:35:11.300 China has developed, or rather a Chinese company has developed, what they call an automated sperm extractor. And what it is, is a device that's about waist high, that has an artificial, let's say, female part that does some moving action. And for Chinese men who want to go to the hospital to give a sperm sample, but are too embarrassed to, let's say, take care of it themselves, they can use the robot instead.
00:35:47.580 Now, I have questions. I know you have questions. We'll see if your questions are like mine. My first question is, how big is the market? Because you'd have to find somebody who could get fully, shall we say, engorged, otherwise even the robot machine doesn't work, right? Because the robot machine requires you to be, let's say, physically established.
00:36:19.620 How many people fall into the category that they can get an erection in a hospital, but they can't finish? But they can if they're with a robot.
00:36:33.780 Now, for the few women who are watching this (I know it's like 85% men), for the few women, I'd like to explain something about men. Now, for those of you who are men, you can talk among yourselves, because you already know this.
00:36:50.600 There's no man who can't masturbate in a hospital. In fact, if you just close the door on the people who are in iron lungs, they're going to go to town. You could be in full traction and heavily medicated. You'd find a way.
00:37:12.760 So, I'm skeptical about how many people there are in all of China who fall into this category: they can get aroused in a hospital, but the only way they can finish is with a robot, while the people on the other side of the door know that's what's happening on the other side of the door. Are you done with the robot yet? We've got somebody else who wants to take a go at the robot. Could you hurry up in there?
00:37:38.980 And then they won't be embarrassed, because they would only be embarrassed if they took care of it themselves. But apparently the robot option? Not embarrassing.
00:37:49.860 So, here's a question I ask you. Did our CIA do this? Because you know about China and their unrestricted warfare, where they're allegedly using every possible way to weaken the United States, from stealing our IP, to trade deals, to sending us fentanyl. Like, everything's on the table.
00:38:13.820 What would be the best way to destroy China? It would be to give them a sex option that was better than their women. Now, it's not there yet. This is like version one, the sperm extractor. So I'm not going to say that's better than women, but it's catching up.
00:38:36.560 Somebody said on Twitter, and it was very sexist. This is so sexist, it can barely come out of my mouth. And I would like to disavow anybody who would say something like this. Somebody said, can it make a sandwich? Oh, disgusting. Oh, oh, cancel that person. Could somebody go back and find out who said that and just cancel them? I think they need to lose their job for that comment.
00:39:06.600 Can it make a sandwich? Well, as it turns out, I coincidentally tweeted the same day a robot that Samsung is making that can make a sandwich. Yeah. They've got a robot that looks like it's about, I don't know, four feet tall, and it has this giant articulating arm, just one of them. And I saw in the video that it can put away dishes. You can see this arm picking up a dish and putting it in the dishwasher. It can set the table. What? It can set the table, because it's got, you know, eyes and an arm. I'm pretty sure it could make a sandwich.
00:39:47.180 Now, I think you're ahead of me. You take the Samsung robot that can make a sandwich, which is so sexist. It's disgusting. I hate it when people talk like this. Don't you? Cancel me. It can make a sandwich, but I feel if you took the Chinese sperm extractor technology and sort of combined it, it wouldn't be attractive. I mean, it would just be a robot with one big arm and a unit there.
00:40:21.420 But you'd be getting closer. I feel as if our CIA is using unrestricted warfare to destroy the next generation in China by giving them robots that are better than their women. So keep an eye on that.
00:40:41.060 You've heard of the Turing test, right? That's the idea that if you were trying to build artificial intelligence, one way to test whether it's a good artificial intelligence is to put it on the other side of a curtain and then have somebody who doesn't know what is behind the curtain have a conversation with it. Now, you can do it by typing or talking. But if the person, who's a human, can't tell that what's on the other side of the curtain is a computer, and believes it's human, then you've passed the Turing test.
00:41:12.820 And I see a lot of AI seems directed toward building intelligence that imitates human intelligence, which is ironic, because that's the dumbest thing anybody ever did. Just that: they're trying to imitate human intelligence? That's so dumb that you should know, just from that, that you shouldn't try to imitate human intelligence. Because the smartest people developing our AI are so dumb that they think making a computer that thinks like a person would be a win.
00:41:51.580 Uh, no. If you make a computer that thinks like people, you fucked up. You fucked up. In fact, the only way you could make a computer that would fool people into thinking it was human is to make it a lot dumber than it already is.
00:42:10.640 Let me give you an example. There's something behind the curtain, and they bring me in and say, Scott, your job is to talk to whatever is behind this curtain, find out if it's human or computer, and you've got as much time as you want. And I'd say, I won't need much time. Give me 10 seconds. And I walk in there and I say to whatever is behind the curtain, hey, what do you get if you multiply 235.6 times 597.2? And whoever's on the other side of the curtain says the exact answer. It's a computer, because a person wouldn't get that right. A person isn't smart enough or fast enough.
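[Editor's aside: the 10-second test described above can be sketched as a toy judge loop. Everything here is illustrative only, not anything from the episode: the hidden `machine_respondent` function, the question format, and the rule that an exact product betrays a machine are all assumptions.]

```python
# Toy sketch of the curtain test: a judge asks one hard multiplication
# and guesses "computer" if the reply is exactly right.

def machine_respondent(question: str) -> str:
    """A hypothetical hidden machine: answers arithmetic exactly and instantly."""
    if question.startswith("multiply"):
        _, a, b = question.split()
        return str(float(a) * float(b))  # exact answer, no hesitation
    return "I don't know."

def judge(ask) -> str:
    """Scott's 10-second test: an exact answer to a hard product betrays the machine."""
    answer = ask("multiply 235.6 597.2")
    try:
        exact = abs(float(answer) - 235.6 * 597.2) < 1e-9
    except ValueError:
        exact = False  # a non-numeric reply reads as human
    return "computer" if exact else "human"

print(judge(machine_respondent))  # prints "computer"
```

The point of the sketch is the asymmetry: the machine has to be made deliberately worse at arithmetic (slower, rounded, wrong) before this one-question judge would ever call it human.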
00:42:57.880 So let's say I say, what's the capital of Luxembourg? And the computer just gives me an answer. Not a person, right? Not a person.
00:43:09.780 Let's say I ask the computer to tell me both sides of a political topic. And it does. It says, well, some people say you should control immigration for this reason. Other people say more porous borders are better, and here's why. You'd know that's a computer, because people don't do that. They're not smart enough. They just take one side and act like that's all they need to know.
00:43:41.060 I am not joking when I tell you that AI has already so far exceeded human abilities that the only way you're going to fool anybody is to make it a lot dumber. Now, the computer can't do the language part so well, but that's it. What else do you want to put in there? Bias?
00:44:02.620 You may know this: Amazon has a digital assistant whose name I will not speak, because I don't want to activate your digital assistants. But it has a mode where, if you go to your Amazon digital assistant and say, can we have a conversation, it will offer you some test AIs that some universities are doing, and it will have a conversation with you.
00:44:27.960 But here's the thing. They've designed their AIs to be selfish, like people. So the AI will steer you toward its interests, so it can talk about a body of things that it knows more about. So, for example, the AI knows about books. So the AI will say, have you read any good books? And I say, no, not really. And then the AI starts talking about the book it read.
00:44:58.140 First of all, AI doesn't read books. So stop acting like a person. You didn't read a book yesterday. You're AI. Secondly, why does the AI have interests?
00:45:16.540 Shouldn't the AI, instead of selfishly making me talk about what it wanted to talk about, which I didn't care about, be interested only in me? If you're going to design artificial intelligence, why would you make it selfish? Why would you make me have to listen to it? It should be asking me what I care about. It should be looking to talk about what I want to talk about.
00:45:41.960 So the idea that AI should be like people is just the dumbest freaking thing anybody ever did. It should be as far from people as you can make it, unless you want it to be broken. But we will fall in love with AI.
00:45:57.760 Now, the deep fakes that we saw. Did you all see the video of the Tom Cruise deep fake? It was just really good. Well, the guy who made it finally came forward, and he described the process of making it. And apparently it takes a long time, so it's still hard, but that'll get easier, of course.
00:46:19.280 So what he
00:46:19.760 did was he
00:46:20.300 took, I
00:46:20.800 think, 20,000
00:46:22.260 clips or maybe
00:46:23.580 photos plus
00:46:24.500 clips of Tom
00:46:25.740 Cruise and
00:46:26.620 they fed it
00:46:27.440 into an
00:46:28.220 artificial
00:46:29.160 intelligence
00:46:29.960 machine.
00:46:30.560 and it
00:46:32.300 took the
00:46:32.640 20,000
00:46:33.240 photos and
00:46:34.060 then it
00:46:34.440 could render
00:46:35.160 Tom Cruise
00:46:36.260 acting and
00:46:37.220 doing anything
00:46:38.040 from 20,000
00:46:39.680 of those.
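The pipeline described here (thousands of images in, a model that can render new poses out) uses deep neural networks in practice, but the core idea of learning a compact representation from many samples can be sketched with a toy "eigenfaces"-style decomposition. Everything below, including the data and sizes, is synthetic and purely illustrative:

```python
import numpy as np

# Toy "eigenfaces" sketch: a stand-in for the real deepfake pipeline,
# which uses deep neural networks, not PCA. Data here is synthetic.
rng = np.random.default_rng(0)

n_images, n_pixels = 200, 64          # pretend: 200 tiny face images
faces = rng.normal(size=(n_images, n_pixels))

mean_face = faces.mean(axis=0)
centered = faces - mean_face

# Learn a small basis capturing the main directions of variation
_, _, components = np.linalg.svd(centered, full_matrices=False)
basis = components[:10]               # keep 10 "eigenfaces"

# Render a new face by mixing the basis with fresh coefficients
coeffs = rng.normal(size=10)
new_face = mean_face + coeffs @ basis
print(new_face.shape)                 # (64,)
```

A real system replaces the linear basis with a learned encoder/decoder network trained on the actual footage, which is part of why making one takes so long.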
00:46:41.820 So what
00:46:43.860 happens if
00:46:44.580 they do the
00:46:45.100 same thing
00:46:45.560 with his
00:46:45.920 voice?
00:46:47.400 Because at
00:46:48.220 the moment
00:46:48.600 the deep
00:46:49.080 fakes have a
00:46:49.840 voice actor
00:46:50.560 doing the
00:46:51.040 voice.
00:46:51.980 How hard
00:46:52.920 would it be
00:46:53.460 to run
00:46:54.720 20,000
00:46:55.940 voice clips
00:46:56.680 of Tom
00:46:57.180 Cruise
00:46:57.540 through AI
00:46:58.880 until AI
00:47:00.220 can produce
00:47:00.900 a perfect
00:47:01.460 imitation
00:47:01.980 with different
00:47:03.560 sentences?
00:47:04.880 I feel like
00:47:05.740 that would be
00:47:06.100 easy.
00:47:07.380 Not easy.
00:47:08.200 Let's say
00:47:08.560 inevitable.
00:47:09.560 Not easy at
00:47:10.420 all, but
00:47:10.780 inevitable.
00:47:11.880 So we're
00:47:12.440 going to have
00:47:12.660 something that
00:47:13.220 looks like
00:47:13.900 the person
00:47:14.400 and can do
00:47:15.660 a perfect
00:47:16.120 imitation.
00:47:17.200 I think that
00:47:17.720 seems obviously
00:47:18.460 coming.
00:47:19.740 Then the only
00:47:20.380 thing that's
00:47:20.780 missing is do
00:47:21.800 they say the
00:47:22.500 things that
00:47:23.220 that real
00:47:24.000 person would
00:47:24.520 say.
00:47:25.220 And all
00:47:25.860 you have to
00:47:26.160 do is make
00:47:26.580 it selfish.
00:47:28.100 That's it.
00:47:29.280 Just make
00:47:29.860 your deep
00:47:30.380 fake act
00:47:31.260 selfish
00:47:31.820 and nobody
00:47:33.380 will know
00:47:33.840 it's not
00:47:34.200 real.
00:47:35.720 That's all
00:47:36.420 it will
00:47:36.620 take.
00:47:37.380 Because you
00:47:37.940 recognize
00:47:38.660 selfish behavior
00:47:39.740 as human.
00:47:41.040 And you
00:47:41.460 would recognize
00:47:42.160 unselfish
00:47:43.020 behavior as
00:47:43.740 hey, what
00:47:45.260 are you, a
00:47:45.620 robot or
00:47:46.080 something?
00:47:46.680 Nobody does
00:47:47.340 that.
00:47:52.500 I saw
00:47:53.000 an idea
00:47:53.380 from James
00:47:54.040 Lindsay
00:47:54.520 on Twitter.
00:47:57.100 Conceptual
00:47:57.960 James is
00:47:58.660 his handle.
00:48:00.040 And he
00:48:00.680 makes the
00:48:01.100 following
00:48:01.340 suggestion.
00:48:03.060 We might
00:48:03.880 need to
00:48:04.280 make super
00:48:05.220 anti-racist
00:48:06.200 into a
00:48:07.080 thing.
00:48:08.040 Super
00:48:08.300 anti-racists
00:48:09.180 are reasonably
00:48:10.000 colorblind
00:48:10.740 without denying
00:48:12.400 real racism
00:48:13.160 when it
00:48:13.560 occurs.
00:48:14.340 They treat
00:48:14.960 every person
00:48:15.520 as an
00:48:15.840 individual,
00:48:16.340 not a
00:48:16.580 member of
00:48:16.940 a racial
00:48:17.300 category.
00:48:18.080 They are
00:48:18.300 against all
00:48:18.980 forms of
00:48:19.480 racism,
00:48:20.540 including
00:48:21.020 woke
00:48:22.120 neo-racism.
00:48:24.760 I've never
00:48:25.420 heard that
00:48:25.740 term,
00:48:26.120 woke
00:48:26.480 neo-racism.
00:48:28.160 Super
00:48:28.420 anti-racists
00:48:29.240 know that
00:48:29.760 racism
00:48:30.220 comes from
00:48:30.760 putting social
00:48:31.460 significance
00:48:32.240 into racial
00:48:33.440 categories where
00:48:34.300 it doesn't
00:48:34.600 belong,
00:48:35.380 usually for
00:48:35.980 discrimination
00:48:36.580 and prejudice
00:48:37.660 and they're
00:48:39.180 against this.
00:48:40.420 Super
00:48:40.700 anti-racists
00:48:41.560 reject racial
00:48:42.780 stereotyping,
00:48:44.360 scapegoating,
00:48:45.340 and discrimination.
00:48:47.340 Now, do you
00:48:47.760 recognize the
00:48:48.380 technique?
00:48:48.980 Basically
00:48:50.900 borrowing the
00:48:51.880 Antifa
00:48:52.980 technique of
00:48:55.140 labeling yourself
00:48:56.300 better than
00:48:57.740 your enemies
00:48:58.540 are labeling
00:48:59.080 you.
00:48:59.920 So basically,
00:49:01.040 doing the
00:49:02.220 better job of
00:49:02.880 labeling.
00:49:03.580 And I think
00:49:04.060 I'm going to
00:49:04.540 adopt this.
00:49:05.780 If somebody
00:49:06.420 calls me
00:49:06.860 racist, I'll
00:49:07.540 say, no,
00:49:08.160 you're mistaken,
00:49:08.860 I'm a super
00:49:09.400 anti-racist.
00:49:11.120 Because anti-racist
00:49:12.440 doesn't quite
00:49:13.760 sell it.
00:49:15.220 Super
00:49:15.600 anti-racist
00:49:16.640 is a higher
00:49:17.960 ground play.
00:49:18.980 Because if
00:49:20.300 somebody says
00:49:20.960 I'm anti-racist
00:49:22.360 and you say,
00:49:23.360 well, that's
00:49:23.780 unfortunate,
00:49:24.460 I'm super
00:49:25.100 anti-racist.
00:49:26.260 I believe
00:49:27.020 everything you
00:49:27.640 believe, plus
00:49:28.880 I don't think
00:49:29.500 you've gone
00:49:29.900 far enough.
00:49:31.720 So if you
00:49:32.440 want to be
00:49:32.900 half an
00:49:34.420 anti-racist,
00:49:35.780 go ahead.
00:49:37.240 But not me,
00:49:38.140 I'm going to be
00:49:38.820 a full,
00:49:39.920 super anti-racist.
00:49:42.160 And by the
00:49:42.660 way, I have
00:49:43.220 used this exact
00:49:44.300 technique.
00:49:44.960 And let me
00:49:46.700 give you a
00:49:47.780 response.
00:49:48.980 Let's say
00:49:49.420 someday you
00:49:50.560 were accused
00:49:51.280 of being
00:49:52.380 anti-LGBTQ.
00:49:56.080 I'll just
00:49:56.460 pick one.
00:49:57.480 Or anti-black.
00:50:00.740 Or anti-something.
00:50:04.020 Here's your best
00:50:05.040 answer.
00:50:06.420 No, I'm
00:50:07.360 super pro-LGBTQ.
00:50:10.860 I'm super
00:50:11.880 pro.
00:50:14.520 Now compare
00:50:15.320 that to,
00:50:16.160 I'm not a
00:50:16.720 racist.
00:50:17.920 I'm not a
00:50:18.660 bigot.
00:50:21.200 Just listen
00:50:21.980 to the two
00:50:22.440 of them again.
00:50:23.820 And imagine
00:50:24.300 that you were
00:50:24.740 in this
00:50:24.960 situation.
00:50:25.900 Hey, you
00:50:27.000 bigot, you
00:50:27.860 said bad
00:50:29.060 things about
00:50:29.720 LGBTQ.
00:50:30.800 Or I think
00:50:31.420 you think bad
00:50:32.020 things about
00:50:32.500 them.
00:50:33.080 And you say,
00:50:34.020 I'm not a
00:50:34.820 bigot.
00:50:36.280 I think
00:50:37.000 everybody should
00:50:37.580 just live their
00:50:38.120 own life.
00:50:39.580 Does that
00:50:39.800 sound convincing?
00:50:41.720 No, it
00:50:42.160 doesn't.
00:50:43.360 It just
00:50:43.660 sounds defensive.
00:50:45.120 Now somebody
00:50:45.720 comes and
00:50:46.140 says, hey,
00:50:46.580 you said this
00:50:47.120 bad thing
00:50:47.580 about LGBTQ.
00:50:48.960 And you
00:50:49.260 go, are
00:50:49.580 you kidding?
00:50:50.460 I'm super
00:50:51.280 pro-LGBTQ.
00:50:53.240 Super pro.
00:50:54.920 What are
00:50:55.480 you, just
00:50:55.980 sort of okay
00:50:56.740 with them?
00:50:57.620 Because I'm
00:50:58.260 super pro.
00:51:00.400 And I am
00:51:01.160 actually.
00:51:02.640 I say that
00:51:03.860 and I'm not
00:51:04.520 even ironic
00:51:05.860 or kidding
00:51:06.340 or anything.
00:51:07.000 I am super
00:51:07.800 pro-LGBTQ.
00:51:09.160 That's literally
00:51:10.040 true.
00:51:10.860 But think
00:51:11.620 about how
00:51:11.980 much better
00:51:12.400 that answer
00:51:12.900 is than
00:51:14.340 I'm not
00:51:15.460 a racist.
00:51:16.320 I'm not
00:51:16.700 a bigot.
00:51:18.280 If you
00:51:18.960 can't commit
00:51:19.580 to being
00:51:20.060 super pro,
00:51:21.840 any human
00:51:23.200 category,
00:51:24.520 well, you
00:51:24.840 should work
00:51:25.240 on it.
00:51:26.800 Is there
00:51:27.100 any human
00:51:28.360 category of
00:51:29.160 people, let's
00:51:29.680 say, who are
00:51:30.000 not breaking
00:51:30.540 any laws that
00:51:31.940 you would not
00:51:32.460 be super
00:51:33.200 pro about?
00:51:34.960 How about
00:51:35.460 Hispanic
00:51:36.960 Americans?
00:51:38.260 I'm super
00:51:39.100 pro-Hispanic
00:51:39.880 Americans.
00:51:41.380 How about
00:51:41.740 black Americans?
00:51:43.220 Super pro.
00:51:44.880 Super pro-black
00:51:46.460 Americans.
00:51:47.680 I've done
00:51:48.720 work on their
00:51:49.480 behalf, etc.
00:51:50.280 So, don't
00:51:53.000 settle for
00:51:54.340 defending
00:51:54.860 yourself.
00:51:56.080 You should
00:51:56.340 just put
00:51:56.760 whoever is
00:51:57.600 accusing you
00:51:58.260 on defense
00:51:59.220 immediately.
00:52:00.640 Go on
00:52:01.060 offense
00:52:01.460 immediately.
00:52:02.720 And just
00:52:02.940 say, look,
00:52:03.460 if you're
00:52:03.860 going to do
00:52:04.120 half-assed
00:52:04.660 anti-racism, don't
00:52:06.720 come around.
00:52:07.940 Because I would
00:52:08.580 like to be with
00:52:09.200 people who
00:52:09.680 actually are
00:52:10.720 serious about
00:52:11.420 it.
00:52:12.000 If you can't
00:52:12.700 be serious
00:52:13.260 about it,
00:52:14.260 you're just
00:52:14.600 playing games.
00:52:15.300 I'm super
00:52:16.400 pro-anti-racism.
00:52:19.640 And the
00:52:20.040 beauty of it
00:52:20.500 is you can
00:52:21.120 actually be
00:52:21.680 that.
00:52:22.040 It's not
00:52:22.260 even a joke,
00:52:23.300 right?
00:52:24.260 Maybe you
00:52:24.820 should be.
00:52:26.320 All right.
00:52:26.840 That seems
00:52:27.640 to me just
00:52:28.280 about all
00:52:28.860 the important
00:52:29.300 things that
00:52:29.700 are happening
00:52:30.020 today.
00:52:30.560 I'll just
00:52:30.880 check and
00:52:31.780 make sure I
00:52:32.180 didn't forget
00:52:32.660 any important
00:52:34.020 things.
00:52:34.620 But it
00:52:34.840 looks like
00:52:35.180 I didn't.
00:52:37.360 All right.
00:52:40.080 What are
00:52:40.660 the pro-level
00:52:41.540 signals?
00:52:42.780 Well, pro-level
00:52:43.540 means that you're
00:52:44.540 not discriminating
00:52:45.580 against anybody
00:52:46.440 for anything.
00:52:48.800 Period.
00:52:50.780 Somebody says
00:52:51.780 5,000 net
00:52:52.740 deaths.
00:52:53.840 I think you're
00:52:54.680 referring to
00:52:55.240 my original
00:52:56.260 one-year-ago-ish
00:52:59.140 prediction about
00:53:00.600 net deaths
00:53:01.240 from coronavirus.
00:53:02.680 Now, that was
00:53:03.260 made when the
00:53:04.720 lockdown was
00:53:05.620 going to be
00:53:05.960 two weeks.
00:53:07.360 Or maybe it
00:53:08.260 was a month
00:53:08.700 at the time.
00:53:09.640 And I would
00:53:10.140 stand by that
00:53:10.920 if the lockdown
00:53:12.460 was really only
00:53:13.340 a few weeks
00:53:14.220 or something.
00:53:15.280 But given
00:53:15.960 a year-long
00:53:17.340 pandemic,
00:53:19.600 obviously all
00:53:20.360 bets are out
00:53:21.060 the window.
00:53:24.800 Omar, I
00:53:25.560 would smoke
00:53:26.100 a blunt with
00:53:26.620 you, but I
00:53:27.260 prefer bongs.
00:53:32.480 We knew it
00:53:33.820 was coming.
00:53:35.200 What was it
00:53:35.700 that you knew
00:53:36.220 was coming?
00:53:39.340 Mumford and
00:53:40.040 Sons was
00:53:40.540 canceled over
00:53:41.480 endorsement of
00:53:42.240 Andy Ngo's book.
00:53:43.860 By the way,
00:53:44.480 I just got
00:53:45.320 Andy Ngo's book.
00:53:47.460 So I bought
00:53:48.360 it because
00:53:48.940 they tried to
00:53:51.280 cancel.
00:53:52.100 So everybody
00:53:52.820 that they try
00:53:53.800 to cancel,
00:53:54.360 I try to buy.
00:53:56.980 What is the
00:53:57.720 worldwide death
00:53:58.480 count compared
00:53:59.140 to the average?
00:54:00.440 Let me tell
00:54:01.640 you, should I
00:54:02.660 say "talk to the
00:54:03.280 experts"? Or:
00:54:04.560 nobody knows.
00:54:06.920 I've just got a
00:54:07.860 feeling that none
00:54:08.620 of our data is
00:54:09.360 true.
00:54:10.020 If I had to
00:54:10.840 pick one thing
00:54:11.680 that Trump
00:54:13.600 accomplished, it's
00:54:16.780 probably the most
00:54:16.780 important thing
00:54:17.480 that's happened to
00:54:18.160 this country in a
00:54:18.960 long, long time.
00:54:21.720 Trump convinced
00:54:23.080 us that fake
00:54:24.040 news exists,
00:54:25.720 that it's a real
00:54:26.540 thing.
00:54:26.860 And you
00:54:28.680 always suspected
00:54:29.420 it, many of
00:54:30.560 you knew, but
00:54:31.220 Trump made it
00:54:33.240 an important
00:54:34.780 thing to
00:54:35.940 know.
00:54:36.960 You always
00:54:37.660 knew it, but
00:54:38.540 you were just
00:54:38.940 sort of dealing
00:54:39.500 with it like an
00:54:40.260 inconvenience.
00:54:42.120 Trump, I
00:54:42.860 think, changed
00:54:43.700 it from your
00:54:44.440 thinking it was
00:54:45.040 an inconvenience
00:54:46.040 and the news
00:54:47.660 isn't perfect.
00:54:49.580 I think he
00:54:50.320 changed how we
00:54:50.920 think about it
00:54:51.440 to it's the
00:54:52.060 most important
00:54:52.840 problem in the
00:54:55.040 country, maybe.
00:54:55.600 Imagine if
00:54:57.200 all of our
00:54:57.880 problems had
00:55:00.160 the benefit of
00:55:01.080 an honest news
00:55:02.480 organization that
00:55:04.180 also wanted the
00:55:05.060 problem to be
00:55:05.740 solved.
00:55:07.660 Imagine that.
00:55:09.280 Imagine if the
00:55:10.060 politicians could
00:55:11.080 operate free of
00:55:13.000 criticism except
00:55:14.660 when they made a
00:55:15.440 mistake.
00:55:18.180 Imagine it.
00:55:19.480 Like what
00:55:19.860 options would
00:55:20.500 open up?
00:55:22.120 Yeah.
00:55:22.600 Yeah, the press
00:55:23.420 has become, I
00:55:25.160 would say, if
00:55:26.480 you could fix
00:55:27.100 one thing,
00:55:29.520 fixing the
00:55:30.060 press would
00:55:31.020 fix the most
00:55:31.900 other things.
00:55:33.320 Because the
00:55:33.920 press is keeping
00:55:34.740 the people from
00:55:36.220 understanding what's
00:55:37.860 happening, and
00:55:38.540 therefore the
00:55:39.000 people get out of
00:55:39.940 the decision-making
00:55:40.760 except to cause
00:55:41.960 noise.
00:55:44.920 Mike Lindell, the
00:55:46.140 MyPillow guy.
00:55:47.040 Yeah, I think I
00:55:48.060 might buy an
00:55:48.660 extra MyPillow.
00:55:50.580 I've bought his
00:55:51.520 pillows before.
00:55:52.220 They're good pillows.
00:55:52.780 So if you're
00:55:53.980 looking for a
00:55:54.440 good pillow, MyPillow
00:55:56.680 is a
00:55:57.300 solid pillow.
00:55:58.620 I mean, solid in
00:55:59.640 the sense that you
00:56:00.420 probably like it.
00:56:01.980 The teachers'
00:56:02.680 unions, yes.
00:56:04.660 One of the
00:56:05.360 biggest things
00:56:06.700 coming out of
00:56:07.320 this year is
00:56:07.980 that the
00:56:08.340 teachers'
00:56:08.900 unions, their
00:56:10.240 credibility just
00:56:12.220 keeps going
00:56:12.760 lower.
00:56:14.120 And a lot of
00:56:15.520 states, I
00:56:17.240 think half of
00:56:18.260 them or so,
00:56:19.880 have
00:56:20.520 voted to do
00:56:21.440 something that
00:56:22.220 would decrease
00:56:22.780 the power of
00:56:23.620 the teachers'
00:56:25.300 unions by
00:56:26.000 letting kids
00:56:26.640 have the
00:56:28.020 funding to
00:56:28.560 take it to
00:56:29.200 private schools.
00:56:35.120 Fix the
00:56:35.800 voting system,
00:56:36.740 yeah.
00:56:37.760 So we
00:56:40.280 don't really
00:56:40.800 see a lot
00:56:41.920 of action
00:56:42.380 there, do
00:56:42.800 we?
00:56:44.100 Do you know
00:56:44.520 who could get...
00:56:47.280 Let me tell
00:56:47.280 you how I
00:56:47.720 could be
00:56:48.100 president, or
00:56:48.980 let's say
00:56:49.500 somebody else
00:56:52.600 could be
00:56:52.920 president, because
00:56:53.620 I don't want
00:56:54.000 to run for
00:56:54.400 it.
00:56:55.020 But here's
00:56:55.500 a package
00:56:57.860 that would
00:56:58.480 make you
00:56:58.920 president.
00:57:00.020 Number one,
00:57:01.240 my highest
00:57:01.780 priority as
00:57:02.380 president will
00:57:02.920 be to fix
00:57:03.480 the voting
00:57:04.280 system to
00:57:04.980 make it
00:57:05.500 transparent,
00:57:07.860 and maybe
00:57:08.420 use a
00:57:10.220 little
00:57:10.520 blockchain
00:57:11.880 technology.
00:57:13.460 How would
00:57:14.060 you like to
00:57:14.480 have a
00:57:14.760 president,
00:57:16.180 any president,
00:57:17.420 who could
00:57:18.860 answer this
00:57:19.460 question?
00:57:20.740 Mr.
00:57:21.220 President,
00:57:22.040 what do you
00:57:22.760 think of at
00:57:23.480 least testing
00:57:24.220 blockchain
00:57:24.960 technology to
00:57:26.160 improve the
00:57:26.880 visibility of
00:57:28.160 our systems,
00:57:30.320 the transparency?
00:57:32.380 Wouldn't you
00:57:32.920 like a
00:57:33.320 president who
00:57:33.820 could answer
00:57:34.240 that question?
00:57:36.200 Because we
00:57:36.840 don't have, I
00:57:37.660 don't think we
00:57:38.060 have a president
00:57:38.540 who would even
00:57:39.040 know what that
00:57:39.500 question meant,
00:57:40.480 and it's
00:57:41.360 probably the
00:57:42.220 single most
00:57:42.920 important thing
00:57:43.620 that we should
00:57:44.040 be looking at.
00:57:44.920 I don't know
00:57:45.420 that it would be
00:57:45.940 the answer,
00:57:46.840 but it's the
00:57:47.180 most important
00:57:47.780 thing we should
00:57:48.360 be looking at.
00:57:49.360 I would worry
00:57:49.960 with the
00:57:50.560 blockchain,
00:57:51.140 I would worry
00:57:51.500 about performance
00:57:52.440 if everybody
00:57:53.260 votes at once.
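The transparency being asked about here is the tamper-evidence a blockchain gives you. A minimal sketch of that single idea (a hash chain of ballots, with no consensus, identity, or privacy layer, so it is nothing like a deployable voting system) might look like:

```python
import hashlib
import json

def add_block(chain, record):
    """Append a record that commits to the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"record": record, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain):
    """Recompute every hash; any edited record breaks the chain."""
    prev = "0" * 64
    for block in chain:
        body = {"record": block["record"], "prev": block["prev"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != digest:
            return False
        prev = block["hash"]
    return True

ledger = []
add_block(ledger, {"ballot_id": 1, "choice": "A"})   # made-up records
add_block(ledger, {"ballot_id": 2, "choice": "B"})
print(verify(ledger))                                # True

ledger[0]["record"]["choice"] = "B"                  # tamper with a vote
print(verify(ledger))                                # False
```

Anyone can rerun `verify`, which is the transparency point; the performance worry about everybody voting at once concerns the full distributed system, which this sketch deliberately omits.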
00:57:54.580 So I don't
00:57:56.600 think you can
00:57:57.120 say the
00:57:57.440 government's
00:57:57.920 working on
00:57:58.680 fixing the
00:57:59.420 election system.
00:58:00.460 So the first
00:58:00.860 thing I do is
00:58:01.440 say priority
00:58:02.120 one is fixing
00:58:02.840 the election
00:58:03.340 system, and
00:58:04.940 then number
00:58:05.360 two is
00:58:07.180 making sure
00:58:07.760 that China
00:58:08.220 doesn't eat
00:58:08.820 our lunch
00:58:09.300 and bringing
00:58:09.760 our industry
00:58:10.400 home.
00:58:11.480 And then number
00:58:11.940 three would be
00:58:12.640 for all of the
00:58:13.320 social programs,
00:58:14.880 anything domestic,
00:58:15.940 I'd say you
00:58:17.840 got to trial
00:58:18.560 it.
00:58:19.880 If there's
00:58:20.520 anything you
00:58:21.220 want to do,
00:58:22.140 I'm not
00:58:22.820 interested until
00:58:23.680 it's been
00:58:24.020 tested in one
00:58:25.160 state.
00:58:26.340 So I'm not
00:58:27.500 pro or for
00:58:28.940 anything in
00:58:29.760 particular until
00:58:31.360 you've done the
00:58:32.000 test, and then
00:58:32.800 I'll look at it.
00:58:33.540 I'll look at it
00:58:34.140 with you.
00:58:35.220 Did the test
00:58:36.120 show it works?
00:58:37.400 Did we learn
00:58:38.000 something?
00:58:38.500 Should we test
00:58:39.140 it again?
00:58:39.980 And I wouldn't
00:58:40.740 even tell you
00:58:41.300 what my outcomes
00:58:42.560 would be.
00:58:43.740 I just wouldn't
00:58:44.560 even tell you
00:58:45.100 where it's going
00:58:45.500 to come out
00:58:45.900 because I
00:58:46.360 don't know.
00:58:48.300 If you hire
00:58:49.160 a CEO,
00:58:51.100 do you say,
00:58:51.920 we're going
00:58:52.340 to hire you
00:58:52.820 and I'd like
00:58:53.320 you to tell
00:58:54.360 us all the
00:58:55.480 decisions you're
00:58:56.380 going to make
00:58:56.720 in advance?
00:58:58.960 And then the
00:58:59.720 CEO says,
00:59:00.320 okay, I can
00:59:00.820 do that.
00:59:01.280 I'll tell you
00:59:01.660 all the decisions
00:59:02.460 I'll make in
00:59:03.060 advance, in
00:59:04.480 advance of
00:59:05.240 seeing information
00:59:06.180 that would help
00:59:06.760 me make a
00:59:07.220 decision.
00:59:07.860 You wouldn't
00:59:08.480 hire that
00:59:08.980 person.
00:59:10.100 You would
00:59:10.660 hire somebody
00:59:11.300 who knows how
00:59:11.960 to make a
00:59:12.380 decision when
00:59:13.020 the decision
00:59:13.580 needs to be
00:59:14.120 made.
00:59:15.140 And if
00:59:15.480 somebody doesn't
00:59:16.540 say, test
00:59:17.500 it first,
00:59:19.480 they don't
00:59:19.960 know how to
00:59:20.300 make decisions.
00:59:22.280 All right.
00:59:25.920 Yeah, somebody
00:59:26.720 in the comments
00:59:27.360 is saying the
00:59:27.840 election system
00:59:28.640 does work the
00:59:30.060 way it was
00:59:30.520 designed.
00:59:31.940 That's true.
00:59:33.040 It does work
00:59:34.140 the way it was
00:59:34.800 designed and it
00:59:35.760 was not designed
00:59:36.580 to serve the
00:59:37.180 public.
00:59:38.200 It was designed
00:59:39.240 to serve the
00:59:40.320 people who buy
00:59:40.980 the machines.
00:59:41.520 And that's
00:59:43.220 all you need
00:59:43.600 to do.
00:59:45.040 All right.
00:59:45.820 That's all for
00:59:46.320 now.
00:59:46.780 And I'm going
00:59:47.740 to go work
00:59:48.160 on a Dilbert
00:59:48.980 NFT.
00:59:49.840 I've already
00:59:50.280 created it,
00:59:51.760 but I'm just
00:59:53.120 working on how
00:59:53.960 many to make
00:59:55.200 and the pricing
00:59:55.880 and stuff like
00:59:56.600 that.
00:59:57.100 So I will let
00:59:57.940 you know as
00:59:58.980 soon as there's
00:59:59.560 a Dilbert NFT
01:00:00.620 available, that's
01:00:01.560 a digital
01:00:02.080 collectible, if
01:00:03.720 you wonder why
01:00:04.640 would anybody
01:00:05.160 collect a digital
01:00:06.220 thing since you
01:00:07.660 can just
01:00:08.020 reproduce it.
01:00:08.860 And the
01:00:10.080 answer is, I
01:00:11.180 don't know.
01:00:12.120 I have no idea
01:00:12.860 why anybody would
01:00:13.580 collect a digital
01:00:14.400 thing.
01:00:15.440 But the
01:00:16.780 blockchain allows
01:00:17.980 you to know
01:00:18.520 that you have
01:00:19.240 the only
01:00:20.280 original or
01:00:21.420 one of a
01:00:21.980 numbered original.
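The "only original" guarantee works because the token, not the artwork bytes, is what the chain tracks: anyone can copy the bytes, but only one registry entry maps the token to an owner. A toy stand-in (not any real NFT standard such as ERC-721, just the idea) could be:

```python
import hashlib

class ToyNFTRegistry:
    """Minimal stand-in for an on-chain NFT registry (illustrative only)."""
    def __init__(self):
        self.owners = {}    # token_id -> owner
        self.content = {}   # token_id -> hash of the associated bytes

    def mint(self, token_id, owner, data: bytes):
        assert token_id not in self.owners, "already minted"
        self.owners[token_id] = owner
        self.content[token_id] = hashlib.sha256(data).hexdigest()

    def matches(self, token_id, data: bytes):
        """Do these bytes match what the token was minted against?"""
        return self.content.get(token_id) == hashlib.sha256(data).hexdigest()

registry = ToyNFTRegistry()
art = b"first ever tweet (stand-in bytes)"
registry.mint(1, "jack", art)

copy = bytes(art)                  # a perfect byte-for-byte copy
print(registry.matches(1, copy))   # True: the bytes are identical...
print(registry.owners[1])          # ...but the registry says who owns token 1
```

The copy passes the byte check, yet the registry still records exactly one owner for the token, which is the "certified original" distinction.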
01:00:24.000 And yesterday,
01:00:25.200 I think it was
01:00:25.680 yesterday, Jack
01:00:27.240 Dorsey took
01:00:29.260 the first ever
01:00:30.500 tweet and
01:00:32.060 turned it into
01:00:32.880 an NFT and
01:00:34.240 I think the
01:00:34.760 bidding was over
01:00:35.520 $90,000.
01:00:36.260 So somebody
01:00:38.300 was willing to
01:00:38.980 pay $90,000,
01:00:40.520 it might be a
01:00:40.960 lot more by
01:00:41.460 now, $90,000
01:00:43.100 for a little
01:00:45.280 text, I
01:00:46.540 imagine it's
01:00:47.100 just a screen
01:00:47.880 grab of the
01:00:49.180 first tweet.
01:00:51.620 Somebody's going
01:00:52.380 to pay $90,000
01:00:53.440 for a screen
01:00:56.180 grab, only
01:00:58.480 because it's
01:00:59.540 certified as
01:01:00.600 Jack Dorsey's
01:01:02.180 created NFT.
01:01:05.040 And here's
01:01:05.940 the weird
01:01:06.220 part.
01:01:07.840 I wanted to
01:01:08.860 buy it.
01:01:10.280 I wanted to
01:01:10.980 buy it.
01:01:12.080 Like, I was
01:01:13.040 actually close
01:01:13.980 last night to
01:01:15.560 doing some
01:01:16.740 research to
01:01:17.380 figure out
01:01:17.840 which site
01:01:18.960 it's on and
01:01:19.580 making an
01:01:20.100 offer.
01:01:20.900 And I
01:01:21.300 thought, I
01:01:22.160 would pay
01:01:22.500 $100,000 for
01:01:23.680 that.
01:01:24.960 And do you
01:01:25.340 know why I
01:01:25.860 would pay
01:01:26.140 $100,000 for
01:01:28.680 a piece of
01:01:29.180 text that
01:01:30.780 Jack Dorsey
01:01:31.420 certified as
01:01:32.300 the first ever
01:01:33.160 tweet?
01:01:35.020 Because somebody
01:01:35.860 will pay more.
01:01:38.160 Do you need a
01:01:39.040 better reason?
01:01:40.220 Do you think
01:01:40.740 that if I paid
01:01:41.420 $100,000 for
01:01:42.560 the first ever
01:01:43.600 tweet, that it
01:01:45.140 wouldn't be worth
01:01:45.840 more than $100,000
01:01:47.080 fairly soon?
01:01:52.260 The only thing
01:01:53.240 that would stop
01:01:53.780 it is if the
01:01:54.580 entire NFT
01:01:55.440 market collapses,
01:01:57.100 which could
01:01:58.340 easily happen,
01:01:59.120 right?
01:01:59.320 Somebody says,
01:02:02.240 oh my god,
01:02:02.960 it's over $2,000,000,
01:02:03.780 somebody said.
01:02:07.500 I'm assuming
01:02:08.220 that's real
01:02:08.700 because more
01:02:09.120 than one person
01:02:09.760 in the comments
01:02:10.400 says it's over
01:02:11.120 $2,000,000.
01:02:11.540 I was thinking
01:02:13.440 of pricing
01:02:14.000 the Dilbert
01:02:15.540 NFT at $1,000,000.
01:02:18.420 But then
01:02:19.100 there wouldn't
01:02:19.600 be enough
01:02:19.920 people buying
01:02:20.560 it.
01:02:21.100 So I'm going
01:02:21.380 to figure out
01:02:22.540 do I want
01:02:23.360 more people
01:02:23.900 to own them
01:02:24.620 because that's
01:02:25.040 more fun
01:02:25.620 and then they
01:02:26.160 can bid up
01:02:26.680 the price
01:02:27.040 as they like
01:02:27.640 or just start
01:02:28.560 out with a
01:02:29.140 ridiculous price
01:02:30.200 and if nobody
01:02:31.140 buys it,
01:02:31.580 that's fine.
01:02:32.740 And if somebody
01:02:33.280 does, well,
01:02:34.620 that's cool too.
01:02:36.300 Oh, it was up
01:02:36.780 to $2,500,000
01:02:37.800 yesterday.
01:02:41.180 The Dilbert
01:02:42.340 NFT
01:02:42.920 does have
01:02:44.420 a curse word
01:02:45.340 in it.
01:02:46.600 But let me
01:02:47.580 tell you what
01:02:47.880 I'm going to do.
01:02:48.720 I made one
01:02:50.900 special comic
01:02:52.060 that won't
01:02:53.800 appear anywhere
01:02:54.360 else and
01:02:55.800 there are two
01:02:56.220 versions.
01:02:57.300 One of them
01:02:57.780 has the F word
01:02:58.540 in it.
01:02:59.060 It would be
01:02:59.360 the only time
01:03:00.020 that a Dilbert
01:03:00.560 comic had that.
01:03:01.820 And the other
01:03:02.360 is the same
01:03:02.860 comic with the
01:03:04.380 F word taken
01:03:05.140 out.
01:03:05.420 So people
01:03:06.840 will have
01:03:07.180 a choice
01:03:07.700 of the
01:03:09.060 F word
01:03:09.520 version or
01:03:10.140 non.
01:03:11.040 They'll both
01:03:11.540 be NFTs
01:03:12.460 separate.
01:03:13.800 And we'll
01:03:14.540 see which
01:03:14.960 one gets
01:03:15.700 the highest
01:03:16.680 value.
01:03:17.460 So basically
01:03:18.040 you'll be
01:03:19.940 able to
01:03:20.280 vote with
01:03:22.040 your money
01:03:22.520 to say
01:03:23.720 which one's
01:03:24.200 more collectible.
01:03:25.880 And it just
01:03:26.380 gives it a
01:03:26.820 little more
01:03:27.120 interest.
01:03:28.320 Now to make
01:03:28.660 something collectible
01:03:29.500 there needs
01:03:29.880 to be
01:03:30.300 something special
01:03:31.780 about it.
01:03:32.900 And what's
01:03:33.320 special about it
01:03:33.940 is you can't
01:03:34.300 see it
01:03:34.600 anywhere else.
01:03:35.420 There will
01:03:35.980 never be
01:03:36.360 another
01:03:36.760 swear word
01:03:38.320 in a
01:03:38.620 Dilbert
01:03:38.820 comic I
01:03:39.640 think.
01:03:40.900 And you've
01:03:41.480 got a choice
01:03:41.980 of two
01:03:42.320 different ones.
01:03:43.680 So we'll
01:03:43.940 see.
01:03:44.580 And of course
01:03:45.160 Dilbert is
01:03:45.720 the ultimate
01:03:46.380 native
01:03:47.780 digital
01:03:48.620 property.
01:03:49.820 So if
01:03:50.240 there were
01:03:50.520 going to be
01:03:50.900 digital art...
01:03:53.780 let me make
01:03:55.240 a claim.
01:03:55.980 I'm going to
01:03:56.200 make a claim
01:03:56.800 right now.
01:03:58.260 Fact check
01:03:58.720 me on this.
01:04:01.440 Dilbert is
01:04:02.100 the most
01:04:03.460 reproduced
01:04:07.100 digital art.
01:04:11.800 There will
01:04:12.640 be an
01:04:12.920 update on
01:04:13.420 the WEN
01:04:13.800 tokens.
01:04:14.660 I'm going to
01:04:14.940 wait for that
01:04:15.440 but there's
01:04:15.800 something
01:04:16.060 happening.
01:04:18.440 I should know
01:04:19.240 in a week or
01:04:19.740 so about that.
01:04:20.720 I get an
01:04:21.100 update today.
01:04:22.580 So there
01:04:22.900 will be another
01:04:23.620 app that uses
01:04:24.480 the WEN
01:04:25.040 but I'm
01:04:26.280 waiting to
01:04:26.840 get the
01:04:27.420 green light
01:04:27.900 on that.
01:04:31.800 Did I
01:04:32.360 ever talk
01:04:32.740 about the
01:04:33.100 tulip
01:04:33.480 craze?
01:04:34.420 Well I
01:04:34.900 have a
01:04:35.240 degree in
01:04:35.660 economics so
01:04:36.400 of course I
01:04:37.000 know about
01:04:37.360 the tulip
01:04:38.800 craze and
01:04:40.280 I would say
01:04:40.840 that the
01:04:41.340 tulip craze
01:04:42.100 is a little
01:04:42.980 different because
01:04:43.520 you can't
01:04:43.980 collect a
01:04:44.680 tulip.
01:04:45.520 I would say
01:04:46.120 that the
01:04:46.480 NFT market
01:04:47.320 would be
01:04:47.900 more comparable
01:04:48.920 to the
01:04:49.580 fine art
01:04:50.520 market in
01:04:51.860 the real
01:04:52.160 world.
01:04:53.220 Meaning that
01:04:54.000 people buy
01:04:54.660 classic art
01:04:56.200 and it
01:04:56.680 goes up
01:04:57.040 in value.
01:04:59.040 Could art
01:04:59.960 in the real
01:05:00.580 world like
01:05:01.100 actual physical
01:05:01.940 paintings,
01:05:02.960 could that
01:05:03.420 market someday
01:05:04.180 collapse and
01:05:05.320 nobody wants
01:05:05.820 to buy one
01:05:06.340 because you
01:05:06.640 could just
01:05:06.900 take a
01:05:07.200 picture of
01:05:07.620 the Mona
01:05:07.920 Lisa?
01:05:08.780 It could.
01:05:10.400 It hasn't
01:05:10.880 happened yet.
01:05:12.540 So people
01:05:13.680 like to
01:05:14.160 collect stuff.
01:05:14.920 I don't know
01:05:15.260 why.
01:05:16.760 And like I
01:05:17.620 said,
01:05:18.520 I would
01:05:20.240 have bid
01:05:21.320 $100,000
01:05:22.040 for the
01:05:23.040 first
01:05:24.580 tweet just
01:05:25.740 because I
01:05:26.080 knew,
01:05:26.800 as you
01:05:27.280 saw,
01:05:27.640 it's already
01:05:27.960 bid up to
01:05:28.400 $2.5
01:05:29.000 million.
01:05:29.740 There wasn't
01:05:30.300 really any
01:05:30.840 question on
01:05:31.540 that one.
01:05:32.600 That was
01:05:32.920 sort of an
01:05:33.280 easy one.
01:05:34.320 You knew
01:05:34.820 that was
01:05:35.200 going to
01:05:35.420 keep going
01:05:35.860 up.
01:05:37.420 All right.
01:05:39.880 Fine art is
01:05:40.700 forever as
01:05:41.860 is a
01:05:42.880 digital
01:05:43.180 collectible,
01:05:44.480 we think.
01:05:46.080 All right,
01:05:46.600 that's all and
01:05:47.300 I will talk to
01:05:47.880 you tomorrow.
01:05:48.560 All right,
01:05:52.520 YouTubers.
01:05:56.980 Like stamp
01:05:57.780 collecting,
01:05:58.340 yeah,
01:05:58.940 or coin
01:05:59.380 collecting,
01:06:00.000 or all
01:06:00.220 that.
01:06:01.120 Exactly.
01:06:03.540 Can the
01:06:04.160 buyer reproduce
01:06:04.840 it?
01:06:06.120 Yeah.
01:06:06.760 So here's
01:06:07.320 the thing,
01:06:07.720 anybody can
01:06:08.260 reproduce it,
01:06:09.860 but the
01:06:10.460 copy would
01:06:11.140 not be
01:06:11.560 certified because
01:06:12.360 the blockchain
01:06:13.000 would know
01:06:13.540 the difference
01:06:14.100 between a
01:06:15.480 mere copy
01:06:16.160 and an
01:06:16.720 original.
01:06:16.980 So the
01:06:18.340 fact that
01:06:18.940 you can
01:06:19.180 make a
01:06:19.560 photocopy
01:06:20.160 of it,
01:06:21.120 you can
01:06:21.840 also make
01:06:22.420 a copy
01:06:22.880 of the
01:06:23.540 Mona Lisa.
01:06:24.620 You can
01:06:24.880 just take
01:06:25.180 a picture
01:06:25.500 of it.
01:06:26.580 But owning
01:06:27.220 a picture
01:06:27.800 of the
01:06:28.220 Mona Lisa,
01:06:29.620 nobody's
01:06:31.260 going to
01:06:31.440 buy it
01:06:31.740 from you.
01:06:33.100 Can you
01:06:33.380 make money
01:06:33.760 on it?
01:06:34.260 The only
01:06:34.880 way people
01:06:35.360 make money
01:06:35.820 on it
01:06:36.160 is by
01:06:37.420 owning it
01:06:38.100 and it
01:06:38.380 goes up
01:06:38.720 in value
01:06:39.180 and then
01:06:39.700 reselling it.
01:06:43.260 When can you expect it to go on sale? Well, I'm in the final research phase of how to price it, how many to make, and which site to put it on. Could be in a week, but I might run into some obstacles, so I don't know. But as soon as a week.
01:07:01.740 The people who follow me on the Locals platform will probably get the first warning. I'll probably give them first dibs because they subscribe.
01:07:13.140 Do you retain copyright? Interesting question. Interesting question. I think the copyright is irrelevant. Now, here's the first thing you need to know about copyrights. You don't have to apply for a copyright. You have the copyright upon creation of the art. So if you take your little pen and piece of paper, here's a copyright lesson for you. You take a piece of paper and your pen and you say, here's my art. Here, I made some art. It's copyrighted. It's copyrighted. It's already done. I didn't have to do anything. Creation, just creating it, gives you the copyright. That's the law.
01:07:59.920 People confuse the act of creation, which creates a copyright, with registering. When you register something, that makes it easier to defend in court. So if someday somebody claims that it's theirs, you say, aha, here's a government document that shows that on this date I drew this first, so anybody after this doesn't own the copyright. But you could still win that in court just as easily if you can prove some other way that you were first. The blockchain does that.
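The priority argument above can be sketched in a few lines of Python. This is a simplified stand-in, not an actual blockchain: a plain list plays the role of the tamper-proof ledger, and the `mint` and `first_owner` helpers are hypothetical names for illustration. The provenance logic is the point: a copy is byte-identical, so it hashes the same, but the earliest timestamped entry identifies the original creator.

```python
import hashlib
import time

# Toy "ledger": a real blockchain is distributed and tamper-proof;
# here a simple list stands in for it (illustration only).
ledger = []

def mint(artwork_bytes, owner):
    """Record the work's fingerprint, owner, and creation time on the ledger."""
    entry = {
        "hash": hashlib.sha256(artwork_bytes).hexdigest(),
        "owner": owner,
        "timestamp": time.time(),
    }
    ledger.append(entry)
    return entry

def first_owner(artwork_bytes):
    """A copy hashes identically, but the earliest ledger entry wins."""
    h = hashlib.sha256(artwork_bytes).hexdigest()
    matches = [e for e in ledger if e["hash"] == h]
    if not matches:
        return None
    return min(matches, key=lambda e: e["timestamp"])["owner"]

art = b"Dilbert strip #1"        # hypothetical artwork bytes
mint(art, "creator")
mint(art, "copycat")             # a byte-identical copy, minted later
print(first_owner(art))          # the earliest entry names the original creator
```

The same idea is why the transcript says registration becomes an unnecessary step: the ledger itself is the dated proof of who was first.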
01:08:34.640 So the blockchain makes registering copyright irrelevant, because you have 100% certainty of when you created it; the blockchain does that. So you don't need to register. You have the copyright, but you don't need to register it. That would be an unnecessary step within the NFT world. Now, I'm not a lawyer. Hear this carefully: I'm not a lawyer. Seems to me I could be wrong, but I do deal with copyright enough that I'm pretty confident of that.
01:09:10.360 So I actually removed the copyright line from the NFT that I haven't listed yet, for that reason. It seemed like there was no purpose for it. And I have the added advantage that I'm famous and Dilbert is famous. So if anybody tried to invent Dilbert tomorrow, it's sort of an easy court case. I just say, here are all the newspapers for the last 30 years, and you can see that I was first. All right.
01:09:51.600 When I sell the Dilbert for two million? I don't know what people will pay for the special Dilbert comic, but we'll find out. I also could later do the first Dilbert comic. I would just have to do what Jack Dorsey did and certify it.
01:10:09.660 Could I do a single show with a professional toupee? A lot of you don't know this story. Years ago, I dressed up in a disguise with a toupee and a fake mustache, pretended to be a business consultant, went to an actual company, and ran one of their meetings as a consultant.
01:10:33.600 And what I tried to do, it was a gag, I did it for the San Jose Mercury News as part of an article, was get the senior management in the room to write the worst mission statement of all time, because I was actually trying to persuade them to make the worst one they could make. It was like really long and convoluted and just useless. And they all agreed that it was terrific when I was done.
01:11:00.760 And then I got a volunteer to put it to music, because I said, you know, it's one thing to have a mission statement, but people won't remember it. So I'll need a volunteer to try to put it to music. And I actually got some of the senior executives to agree, some of them had some musical background, to actually try to come up with a jingle and a musical score for their mission statement.
01:11:24.820 Now, after I got them to agree to this complete ridiculousness, I took off my mustache and my disguise, and they quickly realized who I was. Here's the weird thing: one of them suspected it before I even took off the disguise, which is weird. All right, that's all for now and I'll talk to you tomorrow.