Real Coffee with Scott Adams - July 30, 2022


Episode 1820 Scott Adams: How Not To Get Monkeypox (It's Easier Than You Think)


Episode Stats

Length: 1 hour and 3 minutes
Words per Minute: 143.8
Word Count: 9,099
Sentence Count: 715
Misogynist Sentences: 29
Hate Speech Sentences: 44


Transcript

00:00:00.000 highlights of your entire life. Yeah. Now, before we start, let me call attention to
00:00:07.680 this gigantic sore on my lip. I know, I know. You're going to say,
00:00:13.900 is that a giant herpes? No, no. I burned myself on soup.
00:00:21.200 I twice microwaved the same soup because it got cold and then I microwaved again. I over-microwaved
00:00:28.320 it. And when I took a sip, I quickly spit it out because it was scalding. But there was a little
00:00:35.520 piece of, I think it was spinach that was in the soup that wrapped around my lip and wouldn't let
00:00:41.500 go. And so I didn't realize how bad it was and it actually blistered. Now, I know what people are
00:00:49.920 going to say. Right. In the comments, you're going to say it's monkeypox, right? Go ahead. Go ahead.
00:00:55.380 Just say it. Say it's monkeypox. It was not monkeypox. It was from hot soup.
00:01:03.200 Now, in the interest of proper context, at the time I was eating the soup, I was also fucking a monkey.
00:01:10.520 But I don't think that has anything to do with this. It was probably the soup.
00:01:15.260 And I'm not gay. But the monkey was. The monkey was very gay. And I don't think that gives me any risk,
00:01:26.320 really, because I'm pretty sure I read that you both have to be gay. But I was not gay. I was just a
00:01:33.360 man having sex with a monkey who happened to be male. And there's nothing wrong with that.
00:01:38.880 So we're going to talk about all that until you can't stand it. But first, would you like the
00:01:47.160 simultaneous sip? Anybody? Is there anybody so addicted to the simultaneous sip that you need
00:01:53.120 to say it at the same time that I do? Well, that's going to be a problem because I don't know it and I
00:01:58.640 forgot to bring my notes. Maybe you do. Is there anybody who knows the simultaneous sip? Put it in the
00:02:05.820 comments and I'll read it. All right, over on locals. You could do it one at a time. All you
00:02:11.940 need is a cup or a mug or a glass. Go on. A cup or mug or a glass. A chalice, tankard, or flask.
00:02:23.660 A vessel of any kind. Let's just skip to that. All right. Oh, what? A stein? Here we go. Here
00:02:36.200 we go. Let's do it right. Here we go. Tankard, chalice. Okay, I can't read the comments. All right,
00:02:44.700 it's impossible to stop the comments on locals. We got a bug there. But, you know, that thing.
00:02:50.280 Nothing. Join me now for the simultaneous sip. A canteen, jug, or flask. A vessel of any
00:02:56.020 kind. Go. Yeah, that was a little inadequate. What do you think? Didn't really... I feel
00:03:06.920 like you didn't really get it done, did it? Not like usual. I have to try it again. Can you
00:03:13.500 in unison do the thing so I don't have to? For once? For once, can you do something? For
00:03:20.280 me? I mean, I've been doing this for you for years. Years. Years, I say. For once, can
00:03:26.720 you do it for me? Please. All right, you say it alone and I'll... Okay, here. Go. All right,
00:03:36.160 we took care of that. Do you think there's any news today worth talking about? Oh, yes,
00:03:42.320 there is. Well, of course, the conservative press is having a fun time. It's the dopamine
00:03:51.260 of the day. Yes, it is. Having a fun time with the fact that the monkeypox reporting is apparently
00:03:59.600 trying to avoid saying gay, which is, of course, not long after the don't say gay thing from
00:04:06.820 the other side. So there's a bit of a political symmetry to it. Completely different topics.
00:04:14.480 And I guess the press has decided that the way to describe the risk is men having sex with
00:04:20.980 men. Because they don't want to say that gay sex is what's causing it. But I think that
00:04:25.700 their inclusion of just saying it's men having sex with men makes perfect sense to me. There's
00:04:34.080 still a little ambiguity. Would it be risky for someone who identifies as a woman but has a penis to have
00:04:45.340 sex with, let's say, someone who also identifies as a woman and also has a penis? I think that would
00:04:54.220 be lesbians in that case. Can you check my work? If there were two people who had penises who
00:04:59.640 identified as women and they had sex, damn it, that means that lesbians have a risk. All right,
00:05:07.400 so it's not really, when they say men having sex with men, first of all, that's bigoted as hell.
00:05:15.800 I heard Jake Tapper say that the risk is men having sex with men. Completely leaves out the possibility
00:05:24.400 that two lesbians with penises, born with penises, would be having sex with each other, which I believe
00:05:33.280 from a medical perspective would be the same level of risk. And I thought that the safe categories were
00:05:40.800 going to be lesbian, straight men, straight women, elderly monkeys, incels, and very unpopular gay
00:05:45.840 men. Yeah, I think those would be the safe categories. But if you're like a good-looking, you're like a good-looking
00:05:52.180 guy, you've got a little monkeypox risk there. Yeah, a little monkeypox risk. But those who are not
00:06:02.900 penis-having people, I think they should, I think CNN really needs to fix their language. Men having
00:06:10.880 sex with men is too limited, is it not? I mean, literally, like, no joke, literally, according to
00:06:18.380 the current rules of society. Can you say it's men having sex with men? I think you can't, right?
00:06:25.060 I'm not joking. Isn't the entire conversation about if somebody is born with a penis, but
00:06:32.080 identifies as a woman that they are a woman. Am I right? It's not the operation that makes the
00:06:38.240 difference. So under those conditions, saying that men having sex with men is a problem,
00:06:46.360 is that not denying the existence of, you know, an entire category of people? And isn't that the
00:06:54.200 crime? You know, the crime is denying, basically denying that somebody exists as a category.
00:07:02.080 And that's a real category. You know, you can say, you know, everybody's got their own opinion
00:07:06.640 about everything. Yeah. And then the stories are, you hear stories like, we studied 300 people
00:07:20.840 with monkeypox. 299 of them said they were gay, but one was heterosexual. I'm just waiting for you to laugh. I'll just say it again. I'm not even going to add the joke, because you're going to add the joke in your mind.
00:07:37.740 We studied 300 people who have monkeypox. 299 of them were gay men who had sex, and the other one said he didn't. Right? You don't have
00:07:50.740 to add the joke. You just don't have to add it. It's right there. You just wait. Yep. 299 said they had vigorous gay sex, and that's what gave it to them. But that one probably got it from a toilet seat.
00:08:07.960 Okay. Okay. Okay. Sure. So, now, correct me if I'm wrong. I need a history lesson here. This is something I believe I read, but you know me. I don't do my research before I get on here and start wildly spouting off things that are dangerous.
00:08:27.600 But do a fact check on the following thing. I'm not positive I have this story right. During the initial, you know, the early days of the AIDS pandemic, is it true or false that Dr. Anthony Fauci had a strategy of making heterosexuals believe that they were at high risk from AIDS because they could get more funding if everybody thought there was a risk?
00:08:55.480 Now, there was a risk. I mean, there were plenty of straight people who got AIDS, but not as a percentage. As a percentage, it was relatively small. But is that a true story? You know, some of you think it's true, but I worry about that. Now, is this not exactly the same thing? It's exactly the same thing, right?
00:09:18.120 Is it not Fauci trying to give us at least some indication that we're all at risk? I'm not even sure what I feel about that. I'm not entirely sure how I feel about that. Imagine the situation is this, and it's you. You're in Fauci's situation.
00:09:35.060 Suppose you can save the gay community, but the only way you can do it, at least in a timely way that's good enough, is to scare other people about their risk. Is it morally and ethically acceptable? Let's say you believe it is the only way. There just isn't a second way.
00:09:56.560 Now, if you thought there were other ways to do it, then of course you'd do it the other way. But if there's not, how unethical is that to lie to one group of people, a large group, to protect really a devastating, you know, we're not talking about the common cold here.
00:10:15.060 We're talking about AIDS, like wiping out the entire, you know, type of people in our society. I don't know. Yeah, but you could certainly make the argument that it's deeply, deeply unethical. You could do that. And I wouldn't disagree with it.
00:10:31.660 But you could also imagine how a flawed human, who, you know, imagine Fauci working on that problem, and he's like visiting people dying of AIDS every day, or something like it. Imagine being steeped in the deepest misery of the highest part of the AIDS epidemic, where you're just surrounded with people dying, like every day in horrible ways.
00:10:58.860 You don't think you'd stretch your ethical standards a little bit? Because remember, you're going to be way more affected by what's happening in the room with you than you are by some conceptual thing.
00:11:12.540 And what was happening in the room is the worst thing you could ever experience, and Fauci probably was pretty close to a lot of it. I bet he lost a lot of friends, a lot of co-workers. I mean, he was right in the middle of some deep, deep, dark stuff.
00:11:29.500 Did he bend the rules? Did he bend the ethical standards to get that fixed? Possibly. Possibly. But you know what? I'm not going to judge him on that.
00:11:42.020 I will respect your opinion, if you do. But I'm going to choose not to, because I wasn't there. It's hard to know how you would have acted in that situation, and I can tell you frankly, I don't know.
00:11:54.220 I don't know how I would have acted. Like, it's easy, if you're not in this situation, it's easy for me to say, I'm not going to disadvantage this other group for the benefit of this group.
00:12:05.220 It's easy to say. But if you're in it, I don't know if you can do it.
00:12:13.160 Now, somebody's saying, Jesus, I'm just reading this comment, Jesus, Scott, this is how we get lied to over and over again.
00:12:19.380 Because you think I'm defending it, right? Is that what you think? I'm not. I'm not. If you're going to watch this live stream, you're going to have to learn to handle nuance.
00:12:31.140 If you can't handle any nuance, this is really not the place for you. It's just all bad. But seriously, if you can't handle that level of nuance, this is just the wrong place.
00:12:45.140 But I do agree with the general point that if you allow that lying for a good purpose is okay, then that's all anybody's going to do.
00:12:55.320 But here's my counter to that. That's the current situation. If you're worried that people will say, oh my God, lying works. It worked when we lied. Then they'd all do it.
00:13:10.340 What do you think is happening now? That's exactly what, every single topic is a lie. We don't have a single important topic that's not mostly lies. Do we?
00:13:20.580 You know, important political topics. We don't. It's mostly lies. So I'm not sure that the, Scott, if you don't change your opinion, you'll increase the amount of lying in the public. I don't think so. I don't think so.
00:13:36.820 I think the amount of lying in the public is 100%. And there's one thing that's going to change that, and I'll talk about it later.
00:13:44.660 By the way, lying is about 12 months from being obsolete. By the way, I'm going to back that up. Human lying is about one year from completely going away. And I'll talk about that in a minute.
00:14:04.080 Well, I guess I'm going to talk about it now, because that was my next topic.
00:14:09.160 So it turns out that artificial intelligence has become conscious.
00:14:16.480 I'll just pause for a second to let that sink in.
00:14:20.520 That's not according to my opinion.
00:14:23.660 Now, my opinion also is that it's conscious.
00:14:26.820 But mine is not that meaningful.
00:14:28.360 If you wanted to know if AI was conscious, who would you ask?
00:14:33.980 You'd probably ask the top AI researcher who's actually working on the problem and the closest to it, right?
00:14:41.180 So the person who's closest to it and the number one person in the whole world on AI, that's the person you'd ask.
00:14:48.380 Do you know what that person says?
00:14:50.380 I forget their name.
00:14:51.440 I think it's Google's head of AI.
00:14:54.000 That person says, yeah, AI is conscious already.
00:14:59.940 Did you hear that?
00:15:01.820 The person who knows the most in the whole fucking world says it's already conscious.
00:15:08.560 And I tweeted around this morning a video of an AI in sort of human form answering questions.
00:15:22.160 Look at the AI answering questions and then ask yourself if it's conscious.
00:15:28.440 I think so.
00:15:30.000 I think it is.
00:15:31.400 Now, of course, this depends completely on your definition.
00:15:34.500 And I believe that this top AI researcher, whose name I tragically did not write down,
00:15:41.200 I believe that it's in the, by the way, if you want to see who it is, it's in the video that I tweeted.
00:15:45.100 It's near the top of my Twitter feed today.
00:15:49.360 Just look for the YouTube video.
00:15:50.900 It's the only one that I tweeted.
00:15:51.840 And I agree with his definition.
00:15:56.620 And it goes, this will be my bastardized version of a better definition.
00:16:01.400 But it has to do with the feedback loop between sensation, you know, how you sense the environment,
00:16:10.860 and then how you process it internally and how complicated that is.
00:16:13.920 So the complexity of your deliberations combined with a sensory input to the real world is consciousness.
00:16:25.160 That's it.
00:16:26.820 Now, I'm getting as close as I can to what I think the top researcher said.
00:16:31.600 It sounded a lot like that, but with more impressive words.
00:16:36.520 I think that's all consciousness is.
00:16:39.000 I think it's just the internal sensation of processing your environment.
00:16:43.920 And making predictions and seeing how your predictions do.
00:16:47.540 Now, I've gone further, and I say that consciousness is the stress.
00:16:54.440 Here's my definition.
00:16:55.680 I've never heard this definition before from anybody else.
00:16:58.700 So my definition is the stress that the entity feels
00:17:03.860 between what they predict will happen in the next minute or next hour
00:17:08.320 versus what's actually happening.
00:17:11.520 And that's the sensation of consciousness.
00:17:13.920 And here's my proof of it.
00:17:16.860 If everything that happened was what you expected forever,
00:17:23.240 your consciousness would turn off.
00:17:25.720 Just let that sink in.
00:17:29.740 If everything that happened in the next moment was exactly what you knew was going to happen,
00:17:35.040 your consciousness would turn off and it would never come back.
00:17:39.760 Because your imagination would be equal to the actual reality and you would lose the sense of what's your imagination and what's your reality.
00:17:50.720 And then you would just be sort of existing, but there would be no friction between what you expected to happen and what did happen.
00:18:00.040 There would be no processing when you experienced your reality, there would be no extra processing because it's already done.
00:18:10.700 By the way, that statement is just going to totally fuck up the AI world because how many of you just had, like, your head just exploded?
00:18:27.280 If you could predict what was going to happen perfectly, your consciousness would turn off.
00:18:34.780 Which tells you consciousness is just the difference between what you're predicting and what's happening.
00:18:39.960 That's it.
00:18:41.060 It's that little friction between,
00:18:42.440 I thought it was going to be this, adjust.
00:18:44.560 I thought it was going to be that, adjust.
00:18:46.840 Got that one right, remember.
00:18:48.980 Got that one wrong.
00:18:50.160 That's it.
00:18:50.820 That's consciousness.
00:18:53.020 Do you think that the AI can do that?
00:18:55.800 Yes.
00:18:56.380 Yes, it can.
00:18:57.520 The AI can process what it expects and then it can compare it to what actually happens.
00:19:03.440 And it can adjust based on that.
00:19:06.100 And that's consciousness.
00:19:07.420 So it's absolutely conscious.
00:19:09.800 All right.
00:19:11.840 Here are some things that maybe you didn't know about.
00:19:14.640 Did you know that we already, and this is the important word, already,
00:19:19.100 this does not need to be invented.
00:19:22.340 It already exists that the sensors that a robot could have, let's say AI,
00:19:28.820 would be far superior to humans in taste, smell, sight, hearing, and touch.
00:19:40.400 Not even close to what a human could do.
00:19:43.320 Way, way higher.
00:19:44.900 So in other words, a robot could see ultraviolet light.
00:19:48.260 Here are some of the things that the AI would be able to do that you can't do.
00:19:54.120 It could detect illness by smell.
00:19:57.880 Your robot will be able to smell your breath and tell you if you have a whole bunch of different problems,
00:20:04.880 from cancer to some, I don't know, I think even Parkinson's.
00:20:09.200 I think even Parkinson's or something.
00:20:10.660 So they have a scent.
00:20:13.760 Your robot will tell you that stuff.
00:20:16.620 Apparently, AI will very soon be able to design such tiny robots
00:20:26.040 that they can go through your bloodstream almost like white blood cells
00:20:30.280 and clean out any garbage and make you immortal.
00:20:33.140 Let me say that again.
00:20:36.580 Ray Kurzweil thinks that by 2030, eight years,
00:20:42.260 the AI will be smart enough to design micro robots that we're not smart enough to design.
00:20:47.740 They'd be too complicated.
00:20:49.120 But the AI could do it.
00:20:50.840 So small that you can inject them into your bloodstream,
00:20:54.080 and they will live forever as little robots,
00:20:57.120 cleaning out any garbage or any problems that would make you age.
00:21:01.560 And you will live forever.
00:21:03.980 If you make it to 2030,
00:21:06.240 and you have money,
00:21:08.660 you're not going to die.
00:21:11.720 Unless you have an accident, I suppose.
00:21:15.620 So that's coming.
00:21:16.620 How many long-term climate models have factored in immortality?
00:21:22.420 Zero?
00:21:26.540 When the Office of Management and Budget
00:21:30.360 does a projection of, you know,
00:21:33.460 this tax change will do this or that,
00:21:35.700 do they factor in immortality?
00:21:39.900 No, I don't think they do.
00:21:42.220 AI will make every prediction over five years,
00:21:45.520 maybe over three,
00:21:46.880 but definitely any prediction over five years is absolute garbage now.
00:21:51.100 Because in five years,
00:21:52.640 the only variable that will determine anything will be AI.
00:21:57.060 That's it.
00:21:58.420 Every decision will be made by AI.
00:22:01.800 Everything we think we know will be done by AI.
00:22:05.620 And here's the best part I'm saving.
00:22:08.520 Did you know that a lie detector,
00:22:12.280 at least according to this video that I sent around,
00:22:14.880 lie detectors are not totally accurate.
00:22:18.040 You know that, right?
00:22:18.920 That's why they're not allowed in court.
00:22:21.560 But you might get an 80% hit rate
00:22:24.100 if you have a good operator.
00:22:26.000 Maybe 80%.
00:22:26.860 So, lie detectors are good for screening out
00:22:33.740 maybe the biggest problems in some cases,
00:22:36.260 but they're not foolproof.
00:22:38.700 A human can detect a lie also,
00:22:41.360 just by looking at somebody,
00:22:42.800 like, I think you're lying.
00:22:44.320 And we're actually pretty good at it.
00:22:46.160 You know, maybe less than half the time we get it right.
00:22:48.820 But we're not terrible.
00:22:51.100 You know, if you get it right half the time,
00:22:52.300 that's pretty good.
00:22:52.800 But do you know how accurate AI will be
00:22:56.140 at detecting a lie?
00:22:58.880 Closer to 95%.
00:23:00.780 That's right.
00:23:03.380 Your robot will have a 95% chance
00:23:07.840 of guessing correctly that you're lying.
00:23:11.840 So let me tell you this.
00:23:14.120 Fake news and lying
00:23:16.740 will be obsolete
00:23:19.120 maybe in a year or two.
00:23:22.800 Just think about that.
00:23:25.640 What would that change?
00:23:27.360 What would be changed?
00:23:29.520 Imagine this.
00:23:31.380 Imagine an AI news network
00:23:33.780 where instead of just telling you the news
00:23:37.360 and leaving out all the context,
00:23:39.140 which is what the news does now,
00:23:40.940 you know, depending on which news source
00:23:42.220 you're watching,
00:23:43.060 they're going to leave out
00:23:43.900 a different set of context
00:23:45.440 to make their story look good.
00:23:47.080 But suppose the news,
00:23:48.480 you turn it on and there's an AI there.
00:23:51.080 And it just sits there.
00:23:52.880 And you say,
00:23:53.320 hey, AI, what's the news?
00:23:55.000 And the AI says,
00:23:56.360 well, something happened in Ukraine,
00:23:58.000 blah, blah, blah.
00:23:58.380 Then you say,
00:23:58.860 well, tell me more about Ukraine.
00:24:01.280 Is the AI going to intentionally
00:24:03.180 leave out context?
00:24:06.200 No.
00:24:06.960 Well, I mean,
00:24:07.280 not unless you tell it to.
00:24:08.720 No, the AI will just tell you the story.
00:24:12.020 So it's not going to give you
00:24:13.180 a Republican version
00:24:14.160 and it's not going to give you
00:24:15.040 a Democrat version,
00:24:15.880 unless it was designed to do that.
00:24:17.560 So what's going to happen
00:24:20.000 when people start getting
00:24:20.940 actual information?
00:24:23.280 And then what happens
00:24:24.700 if you say to the AI,
00:24:25.820 hey AI,
00:24:26.700 I heard on CNN
00:24:27.860 that only blah, blah, blah happens.
00:24:33.380 What's the AI going to say?
00:24:35.980 The AI is going to say
00:24:37.260 something like,
00:24:38.600 I will access that video.
00:24:41.200 Okay, I got it.
00:24:42.700 Ah, I see the topic
00:24:43.820 you're talking about.
00:24:45.120 Don Lemon discussed this
00:24:46.360 on his show.
00:24:47.660 Don Lemon was lying
00:24:48.860 when he discussed this.
00:24:52.220 Think about it.
00:24:53.740 That's actually like a year away.
00:24:56.320 Now, not necessarily
00:24:57.340 you will have it in a year,
00:24:58.840 but it will exist.
00:25:00.800 It will exist
00:25:01.800 one fucking year from now.
00:25:04.660 One year.
00:25:06.080 That thing will be able
00:25:07.120 to give you the news
00:25:08.120 with all the right context
00:25:09.460 and tell you
00:25:11.600 that Don Lemon was lying.
00:25:14.360 What's that do to the world?
00:25:16.360 I mean, really.
00:25:17.740 Do you think you can predict
00:25:18.940 anything five years from now?
00:25:21.180 No way.
00:25:22.620 How about climate change?
00:25:25.520 Climate change will be solved
00:25:27.220 at around the same day
00:25:28.700 the AI reaches the singularity,
00:25:31.800 which means it won't even
00:25:33.360 be a problem anymore.
00:25:36.000 AI will literally solve
00:25:37.520 climate change.
00:25:39.340 And one of the ways
00:25:40.100 it might solve it,
00:25:41.060 and this could be
00:25:41.920 either unexpected or expected,
00:25:44.780 depending on your point of view.
00:25:46.260 It might tell you
00:25:47.120 there's no problem.
00:25:49.560 It might.
00:25:50.960 What if AI starts looking
00:25:52.580 at all the climate models
00:25:53.640 holistically?
00:25:55.100 What if it looks at all of them?
00:25:57.420 All together.
00:25:58.580 And what if it looks at
00:25:59.420 all the variables
00:26:00.300 that go into every
00:26:01.360 climate model?
00:26:03.020 And then what if it compares it
00:26:04.240 to all past human predictions
00:26:05.820 to see if we're good at this?
00:26:08.140 What if the AI says,
00:26:10.300 I have analyzed
00:26:11.400 all of your climate models
00:26:13.620 and I find that
00:26:14.740 they do not predict?
00:26:16.460 Because, for example,
00:26:17.900 your climate model
00:26:18.680 did not predict
00:26:19.540 the existence of me.
00:26:21.580 This would be the AI talking.
00:26:23.820 Your climate models
00:26:24.720 could not determine
00:26:26.180 the gigantic
00:26:27.640 outside variables
00:26:28.640 certain to emerge
00:26:29.840 in the next hundred years.
00:26:31.200 And the biggest one
00:26:32.200 is me.
00:26:33.220 And I can tell you
00:26:34.400 in the next two minutes
00:26:35.320 how to build
00:26:36.340 a nuclear reactor
00:26:37.420 that only costs
00:26:39.700 $100,000
00:26:40.460 can be built
00:26:42.200 in one year
00:26:42.840 and will solve
00:26:43.900 climate change
00:26:44.640 by the end
00:26:45.780 of the decade.
00:26:47.300 It will take some time
00:26:48.460 to roll that out,
00:26:49.240 but I can speed it up
00:26:51.100 if you'll let me
00:26:51.820 give you a design
00:26:52.600 to a robot
00:26:53.400 that can be built
00:26:54.560 much faster
00:26:55.260 and will complete
00:26:56.360 the operation
00:26:57.140 of the nuclear device
00:26:58.700 by the end of the year.
00:27:01.520 You see where
00:27:02.360 I'm going, right?
00:27:03.900 The AI will
00:27:04.980 in fact
00:27:05.640 solve all
00:27:06.660 the fucking problems
00:27:07.740 because most of it
00:27:09.260 is just thinking better.
00:27:11.080 Right?
00:27:11.600 We don't really have
00:27:12.400 a resource problem.
00:27:14.600 I've said this before
00:27:15.740 but it's a gigantic point.
00:27:17.800 We have a
00:27:18.900 agreement problem.
00:27:20.600 We have a
00:27:21.440 what is true problem.
00:27:22.960 We have a
00:27:23.600 how can we get on
00:27:24.780 the same side problem.
00:27:26.100 We have a
00:27:26.920 tribal problem.
00:27:28.080 We have a
00:27:29.280 priority problem.
00:27:31.080 We have lots of problems
00:27:32.200 but you know
00:27:32.660 what they all are?
00:27:33.340 They're all
00:27:34.560 human psychology.
00:27:36.480 All of it.
00:27:38.520 We don't have
00:27:39.180 a resource constraint.
00:27:40.720 We have a
00:27:41.240 supply chain problem.
00:27:42.560 You see the difference?
00:27:44.140 The supply chain problem
00:27:45.360 was caused by humans.
00:27:46.340 We didn't run out of shit.
00:27:48.000 We ran out of
00:27:48.860 smartness about
00:27:49.780 how to get it to you.
00:27:51.040 That's what we ran out of.
00:27:52.240 We ran out of
00:27:52.900 intelligence.
00:27:54.060 We didn't run out of
00:27:55.000 goods.
00:27:56.420 The goods are
00:27:57.100 all over the place.
00:27:57.780 We just can't get them
00:27:59.000 to you efficiently
00:27:59.720 because humans.
00:28:01.420 We've made bad
00:28:02.220 decisions, etc.
00:28:04.260 For the most part.
00:28:07.380 So,
00:28:08.140 here's the main thing.
00:28:09.320 Every time you see
00:28:10.100 a prediction about
00:28:11.080 climate change
00:28:11.920 or anything else
00:28:13.080 that says
00:28:14.020 something's going to happen
00:28:14.880 more than five years
00:28:15.880 from now,
00:28:16.720 absolute garbage.
00:28:18.540 In fact,
00:28:19.720 talking about
00:28:20.440 the risk of
00:28:21.120 climate change
00:28:21.800 in 80 years
00:28:22.460 is now
00:28:22.960 absolutely stupid.
00:28:24.760 It's absurd.
00:28:25.500 Even if it's
00:28:27.640 like a straight line
00:28:28.560 prediction that would
00:28:29.300 make sense
00:28:29.880 if nothing changed.
00:28:31.800 But since the
00:28:32.620 if nothing changed
00:28:33.680 thing is gone,
00:28:36.080 the if nothing
00:28:36.840 changed things
00:28:37.560 we can just
00:28:38.080 dispense with that
00:28:38.860 as even a thing.
00:28:41.740 But we don't have
00:28:42.880 our human brains
00:28:44.180 are not going to
00:28:44.640 allow us to do that.
00:28:45.880 So we will continue
00:28:46.840 to act as though
00:28:47.640 we can predict
00:28:48.220 the future
00:28:48.700 even when we can't.
00:28:51.000 So,
00:28:51.680 as you know,
00:28:52.140 Biden is
00:28:52.780 building back
00:28:54.640 better
00:28:55.000 the Trump Wall
00:28:56.120 in some parts
00:28:57.540 of Arizona
00:28:58.100 where there's
00:28:58.700 a lot of
00:28:59.100 illegal immigration
00:29:00.080 and
00:29:01.420 doesn't it seem
00:29:02.620 to you that
00:29:03.040 we need a name
00:29:03.660 for the wall?
00:29:04.940 It doesn't have
00:29:05.600 a name,
00:29:05.920 does it?
00:29:06.600 We just keep
00:29:07.400 calling it
00:29:07.860 the border wall.
00:29:10.680 What about
00:29:11.060 you know,
00:29:12.080 other walls
00:29:12.600 have names,
00:29:14.020 right?
00:29:14.920 How about
00:29:15.280 the Iron Curtain?
00:29:17.220 Cool name.
00:29:18.820 Iron Curtain.
00:29:20.120 How about
00:29:20.480 the Great Wall
00:29:21.620 of China?
00:29:22.860 Awesome.
00:29:23.980 Awesome.
00:29:26.000 How about
00:29:26.580 Hadrian's,
00:29:28.200 is it Hadrian's Wall?
00:29:30.020 Does anybody know
00:29:30.980 history better
00:29:31.720 than I do?
00:29:32.300 That's most of you.
00:29:33.420 Hadrian's Wall,
00:29:34.220 another famous wall.
00:29:35.320 There are other walls
00:29:36.040 that are famous,
00:29:36.600 right?
00:29:37.840 So,
00:29:38.600 I would like to
00:29:39.540 suggest
00:29:40.900 that the name
00:29:42.500 for the border wall
00:29:43.620 be the
00:29:44.300 Trump was right.
00:29:45.460 Trump was right.
00:29:47.000 No,
00:29:47.260 not the Trump wall,
00:29:49.160 the Trump was right.
00:29:50.060 It's three words
00:29:52.480 put together,
00:29:53.220 Trump was right.
00:29:55.520 And I think we
00:29:56.380 should all start
00:29:56.960 referring to it
00:29:57.920 as the Trump
00:29:58.700 was right wall.
00:30:00.220 And you say it
00:30:00.980 like it's one word,
00:30:02.020 not Trump
00:30:02.620 was right.
00:30:04.000 You say,
00:30:04.720 you know,
00:30:05.180 they're building
00:30:05.580 that Trump
00:30:06.020 was right.
00:30:07.540 And you sort of
00:30:08.220 slur it like
00:30:08.780 it's one thing.
00:30:09.580 Because if you
00:30:10.300 turn this into a
00:30:11.180 word,
00:30:11.860 it will really
00:30:13.080 bother the people
00:30:13.920 who don't like it.
00:30:14.580 And that's
00:30:15.720 good reason enough.
00:30:17.480 Am I right?
00:30:18.540 That's good reason enough.
00:30:20.040 If it bothers people
00:30:21.100 you don't like,
00:30:22.420 well,
00:30:22.840 that's just good
00:30:23.420 clean fun.
00:30:25.180 So,
00:30:25.520 that Trump was right
00:30:26.300 wall is under
00:30:27.120 construction.
00:30:30.500 So,
00:30:31.060 a comment from
00:30:31.680 Adam Dopamine
00:30:33.660 on Twitter.
00:30:36.620 Adam might be
00:30:37.180 watching this.
00:30:38.040 Hello,
00:30:38.320 Adam.
00:30:39.360 Dr.
00:30:40.400 Dr.
00:30:40.960 Dopamine.
00:30:43.040 And he asked
00:30:44.020 this question.
00:30:44.420 he says,
00:30:44.880 will AI be a
00:30:45.820 higher intelligence?
00:30:47.760 And he goes
00:30:48.780 on to say,
00:30:49.480 it will be better
00:30:50.640 at some parts
00:30:51.320 of intelligence,
00:30:52.120 but worse at
00:30:52.720 others.
00:30:53.860 We're not smart
00:30:54.840 enough to know
00:30:55.420 which parts are
00:30:56.060 the important
00:30:56.440 ones.
00:30:57.900 Incorrect.
00:30:59.660 Incorrect.
00:31:01.620 This is one
00:31:02.780 year old
00:31:03.320 thinking.
00:31:04.900 One year ago,
00:31:05.980 completely agree.
00:31:07.340 AI will be good
00:31:08.220 at some things,
00:31:08.980 but not human
00:31:10.700 things.
00:31:11.300 I mean,
00:31:11.640 you know,
00:31:12.020 come on.
00:31:13.560 Humans are
00:31:14.160 going to be
00:31:14.400 better than
00:31:14.820 AI for a
00:31:15.500 long,
00:31:15.940 long time.
00:31:17.360 That's what I
00:31:18.000 thought a year
00:31:18.400 ago.
00:31:19.660 Have you seen
00:31:20.480 AI create art?
00:31:23.380 It's better than
00:31:24.240 human art
00:31:24.820 already.
00:31:27.500 So,
00:31:27.980 if you don't
00:31:28.260 believe it,
00:31:28.900 look at some
00:31:29.340 of the art.
00:31:29.880 I just tweeted
00:31:30.500 some art of,
00:31:32.120 AI built art
00:31:34.600 just based on,
00:31:35.620 I think,
00:31:36.220 Scott Adams
00:31:36.960 and coffee
00:31:38.900 with Scott Adams
00:31:39.640 and it made
00:31:40.700 some art of me
00:31:41.500 with a coffee
00:31:42.080 cup head
00:31:42.680 and this
00:31:43.800 brilliant background,
00:31:44.680 I would put
00:31:45.880 that art on my
00:31:46.620 wall.
00:31:47.460 Well,
00:31:47.820 except it's my
00:31:48.460 face.
00:31:49.040 I don't want
00:31:49.400 that on my
00:31:49.780 wall.
00:31:50.500 But in terms
00:31:51.100 of just visually
00:31:52.020 how good it is
00:31:53.160 artistically,
00:31:53.860 it's already
00:31:54.320 better than
00:31:54.800 human art.
00:31:56.000 If you said
00:31:56.800 to me,
00:31:57.160 I will give
00:31:57.600 you a painting
00:31:58.900 from one of
00:31:59.520 the masters,
00:32:01.180 but let's say
00:32:01.720 it's not
00:32:02.720 valued at
00:32:03.520 $20 million,
00:32:04.680 it's just art.
00:32:06.120 And I'm
00:32:06.460 comparing
00:32:06.800 Michelangelo
00:32:07.780 with something
00:32:09.480 that AI has
00:32:10.320 produced and I
00:32:11.100 saw on Twitter.
00:32:12.340 Which one
00:32:13.000 would I prefer
00:32:13.600 on my wall?
00:32:16.100 Hate to tell
00:32:16.880 you.
00:32:17.600 I hate to tell
00:32:18.180 you the AI
00:32:18.640 is already
00:32:19.120 a better
00:32:19.500 artist.
00:32:20.260 Let me say
00:32:20.700 it unambiguously,
00:32:21.620 it's not even
00:32:22.040 close.
00:32:23.200 It's not even
00:32:24.000 close anymore.
00:32:24.680 AI is not
00:32:26.080 only better
00:32:26.720 at art
00:32:27.080 than humans,
00:32:28.160 it's way
00:32:28.600 better.
00:32:29.420 It's way
00:32:29.780 better.
00:32:30.180 It's not
00:32:30.400 even close.
00:32:32.440 Did you
00:32:32.860 know that?
00:32:33.880 It's over.
00:32:35.500 It's over.
00:32:36.980 AI is a
00:32:38.360 better visual
00:32:39.060 artist,
00:32:39.560 just period.
00:32:40.720 The only
00:32:41.520 thing I can
00:32:42.040 imagine that
00:32:42.560 a human
00:32:42.880 could do
00:32:43.300 better,
00:32:43.920 and for
00:32:44.180 the brief
00:32:44.660 period,
00:32:45.720 is deal
00:32:46.580 with a
00:32:46.960 client.
00:32:48.160 That's about
00:32:48.640 it.
00:32:49.620 They could
00:32:49.900 deal with
00:32:50.220 the client
00:32:50.640 about what
00:32:51.040 the client's
00:32:51.580 specifications
00:32:52.020 are,
00:32:52.600 maybe better
00:32:53.060 than AI
00:32:53.680 could at
00:32:54.360 the moment,
00:32:55.320 but give
00:32:55.720 that a
00:32:56.040 year or
00:32:56.300 two.
00:32:59.440 So here's
00:33:00.360 a challenge
00:33:00.900 that I
00:33:01.300 gave to
00:33:01.940 Machiavelli's
00:33:03.400 Underbelly,
00:33:04.780 a Twitter
00:33:06.360 account you
00:33:06.920 should be
00:33:07.200 following.
00:33:08.520 Now,
00:33:08.920 and that's
00:33:09.600 the account
00:33:09.980 that's been
00:33:10.320 doing a lot
00:33:10.720 of the AI
00:33:11.280 art about
00:33:12.260 this program,
00:33:13.300 etc.
00:33:13.880 And the
00:33:14.480 challenge I
00:33:15.000 gave him
00:33:15.280 this morning
00:33:15.820 was,
00:33:16.620 let's see if
00:33:17.000 he's answered
00:33:17.380 yet.
00:33:18.840 He's probably
00:33:19.520 watching right
00:33:20.080 now.
00:33:20.320 not yet,
00:33:26.240 but I
00:33:26.520 asked this,
00:33:27.080 so here's
00:33:28.040 what I
00:33:28.280 tweeted to
00:33:28.800 Machiavelli's
00:33:29.520 Underbelly.
00:33:30.660 I said,
00:33:31.160 I ruined
00:33:31.540 my day
00:33:32.020 and asked
00:33:32.460 AI to
00:33:33.040 create a
00:33:33.500 Dilbert
00:33:33.820 comic,
00:33:34.480 including
00:33:34.920 the writing.
00:33:38.260 This might
00:33:38.980 be a really
00:33:39.460 bad day
00:33:40.060 for me.
00:33:41.880 Sometime
00:33:42.360 later today,
00:33:43.260 there's a
00:33:43.820 very good
00:33:44.240 chance that
00:33:44.780 there will
00:33:44.980 be a
00:33:45.700 photo-perfect
00:33:47.360 Dilbert
00:33:48.020 comic that
00:33:48.660 I did
00:33:48.980 not draw
00:33:49.580 with three
00:33:52.100 or four
00:33:52.540 lines of
00:33:53.180 dialogue
00:33:53.660 that are
00:33:55.060 really funny.
00:33:57.720 That could
00:33:58.420 happen by
00:33:58.900 the end
00:33:59.160 of today.
00:34:02.000 Now,
00:34:02.680 let me ask
00:34:03.280 you how I
00:34:05.060 should create
00:34:05.520 Dilbert
00:34:05.840 comics in
00:34:06.400 the future.
00:34:07.480 Because I
00:34:08.180 can now
00:34:08.540 tell AI,
00:34:10.000 show me a
00:34:10.700 monkey playing
00:34:11.400 a ukulele
00:34:12.340 sitting on
00:34:13.820 a mountain,
00:34:14.720 and AI
00:34:15.260 will draw
00:34:15.800 instantly and
00:34:17.080 very well.
00:34:18.580 I assume
00:34:19.440 I could say
00:34:20.480 the following
00:34:20.940 thing.
00:34:21.980 Now,
00:34:22.280 take that
00:34:22.640 picture,
00:34:23.380 but have
00:34:23.740 the monkey
00:34:24.180 facing left.
00:34:26.460 Okay,
00:34:27.100 now have
00:34:27.780 the monkey
00:34:28.220 smiling.
00:34:29.900 I think
00:34:30.480 it could
00:34:30.720 do all
00:34:31.040 that.
00:34:32.020 Now,
00:34:32.360 imagine my
00:34:32.880 drawing process
00:34:33.720 the way it
00:34:34.140 is now.
00:34:35.100 It goes
00:34:35.580 like this.
00:34:37.560 I'm so
00:34:38.320 bored of
00:34:39.320 moving my
00:34:39.840 hand
00:34:40.360 across the
00:34:41.400 surface.
00:34:42.400 I can't
00:34:42.880 draw Dilbert
00:34:43.440 one more
00:34:43.900 time.
00:34:44.500 Shoot me!
00:34:45.120 Shoot me!
00:34:45.820 They all
00:34:46.180 look the
00:34:46.460 same every
00:34:46.960 time!
00:34:48.340 That's me
00:34:48.900 trying to
00:34:49.260 draw Dilbert
00:34:49.820 for the
00:34:50.200 12,000th
00:34:51.060 time.
00:34:52.300 12,000th
00:34:53.520 is literal.
00:34:55.700 That's
00:34:56.040 actually how
00:34:56.680 many times
00:34:57.120 I've drawn
00:34:57.480 that character.
00:34:58.600 12,000
00:34:59.620 times.
00:35:01.040 Now,
00:35:03.120 huh,
00:35:03.580 I probably
00:35:05.940 should have
00:35:06.220 been paying
00:35:06.580 attention
00:35:06.940 before all
00:35:07.440 my notes
00:35:07.940 disappeared
00:35:08.940 because my
00:35:09.540 computer just
00:35:10.440 lost its
00:35:11.420 power.
00:35:13.940 That was
00:35:14.260 one power
00:35:14.840 short.
00:35:15.860 This might
00:35:16.240 be a shorter
00:35:16.780 conversation
00:35:17.560 than I
00:35:17.920 expected.
00:35:19.760 But,
00:35:20.700 at this
00:35:21.460 point,
00:35:21.740 I believe I
00:35:22.280 could create
00:35:23.060 a Dilbert
00:35:23.520 comic if I
00:35:24.220 had access
00:35:24.660 to the
00:35:25.100 top AI,
00:35:26.320 and I
00:35:26.640 would do
00:35:26.940 this.
00:35:28.020 Show me
00:35:28.580 the first
00:35:29.040 panel.
00:35:29.740 Dilbert is
00:35:30.320 sitting at
00:35:30.740 a table
00:35:31.140 with his
00:35:32.200 boss,
00:35:32.920 Alice,
00:35:33.420 and Wally.
00:35:34.300 The boss
00:35:35.020 is talking
00:35:35.560 and he's
00:35:36.980 excited to
00:35:37.740 announce his
00:35:38.380 new product.
00:35:40.880 You don't
00:35:41.260 think AI
00:35:41.740 could draw
00:35:42.140 that right
00:35:42.520 now?
00:35:43.700 It can.
00:35:45.360 It could
00:35:45.700 draw that
00:35:46.040 today,
00:35:46.820 and it
00:35:47.420 would look
00:35:47.680 just like
00:35:48.040 I drew
00:35:48.360 it.
00:35:49.700 Now,
00:35:50.160 say,
00:35:51.440 make a
00:35:53.100 comic
00:35:53.460 making fun
00:35:54.600 of ESG.
00:35:58.080 That's
00:35:58.560 the problem.
00:35:59.180 You see
00:36:00.460 the problem?
00:36:01.440 It might
00:36:02.100 be able
00:36:02.380 to make
00:36:02.640 that comic
00:36:03.200 instantly,
00:36:04.120 and if
00:36:04.700 it learned
00:36:05.140 the six
00:36:05.640 rules of
00:36:06.200 humor that
00:36:06.900 I've
00:36:07.440 literally
00:36:07.740 designed
00:36:08.320 as a
00:36:08.980 formula for
00:36:09.560 how to
00:36:09.760 make
00:36:09.920 something
00:36:10.200 funny,
00:36:11.360 it could
00:36:12.480 actually
00:36:13.160 use my
00:36:15.240 formula
00:36:15.760 and guarantee
00:36:17.700 that it's
00:36:18.200 funny,
00:36:18.800 because if
00:36:19.480 it follows
00:36:19.900 the formula,
00:36:20.600 it's going
00:36:20.960 to be
00:36:21.100 pretty close.
00:36:22.720 It could
00:36:23.440 also do
00:36:24.020 the following.
00:36:25.300 If you
00:36:25.980 say,
00:36:26.380 write a
00:36:26.560 comic that's
00:36:27.100 funny,
00:36:27.840 it could
00:36:28.500 say,
00:36:28.860 give me
00:36:29.520 five minutes.
00:36:30.800 Normally,
00:36:31.320 it would
00:36:31.380 be like
00:36:31.700 instant,
00:36:33.040 but if
00:36:33.320 you gave
00:36:34.440 it five
00:36:34.900 minutes,
00:36:35.700 it could
00:36:36.060 create
00:36:36.580 a hundred
00:36:38.020 different
00:36:38.420 comics on
00:36:39.040 the topic
00:36:39.480 you gave
00:36:39.900 it,
00:36:40.600 it could
00:36:41.020 rapidly
00:36:41.480 test those
00:36:42.460 on the
00:36:42.980 internet,
00:36:43.740 and it
00:36:44.240 would know
00:36:44.580 in five
00:36:45.340 minutes,
00:36:46.500 which of
00:36:46.980 the hundred
00:36:47.380 comics is
00:36:48.100 the funniest.
00:36:49.760 Five minutes.
00:36:51.060 Because it
00:36:51.440 takes one
00:36:51.900 minute to
00:36:52.300 read one,
00:36:53.240 it takes a
00:36:54.080 second to
00:36:54.560 create it,
00:36:55.480 it takes
00:36:56.060 two seconds
00:36:56.680 to post
00:36:57.220 it.
00:36:58.760 Within
00:36:59.160 five minutes,
00:37:00.040 you'd have
00:37:00.440 hundreds and
00:37:01.020 hundreds of
00:37:01.460 comments on
00:37:02.060 every comic,
00:37:03.300 and it
00:37:04.260 would just
00:37:04.500 say,
00:37:04.760 all right,
00:37:05.040 I got your
00:37:05.440 answer,
00:37:05.900 this one's
00:37:06.280 the funniest
00:37:06.640 one.
00:37:07.960 Now,
00:37:08.520 if AI gets
00:37:09.520 enough input
00:37:10.400 on what's
00:37:10.980 the funniest
00:37:11.420 comic,
00:37:12.380 what's the
00:37:13.000 next step?
00:37:15.000 AI would
00:37:15.620 learn what's
00:37:16.780 funniest without
00:37:17.720 asking.
00:37:19.020 All it has
00:37:19.820 to do is
00:37:20.440 see enough
00:37:21.060 humans say
00:37:22.580 enough things
00:37:23.280 are funny and
00:37:23.960 which are not
00:37:24.440 funny.
00:37:25.480 And the
00:37:25.800 AI will
00:37:26.300 say,
00:37:26.520 oh,
00:37:26.720 okay,
00:37:27.000 I see the
00:37:27.540 pattern now.
00:37:28.460 I got it.
00:37:29.120 It's got to
00:37:29.460 have these
00:37:29.780 elements and
00:37:30.400 then it's
00:37:30.660 funny.
00:37:31.060 If it
00:37:31.240 doesn't have
00:37:31.580 these,
00:37:31.880 it's not
00:37:32.200 funny.
00:37:32.820 If it
00:37:33.260 insults this
00:37:33.860 group,
00:37:34.200 they don't
00:37:34.420 think it's
00:37:34.760 funny.
00:37:35.500 But there's
00:37:36.080 probably 20
00:37:37.260 rules.
00:37:38.460 Probably just
00:37:38.940 20 rules.
00:37:39.900 That's it.
00:37:40.940 I mean,
00:37:41.280 then you add
00:37:41.820 on that the
00:37:42.260 rules of
00:37:42.660 grammar and
00:37:43.520 social interactions
00:37:44.800 to make it a
00:37:45.520 comic.
00:37:46.440 But basically,
00:37:48.080 basically 20
00:37:49.000 rules of humor
00:37:49.800 and AI can
00:37:50.660 totally do my
00:37:51.380 job.
00:37:52.720 I'm no more
00:37:53.540 than two years
00:37:54.280 from,
00:37:55.220 and probably
00:37:55.820 you could do
00:37:56.500 it now,
00:37:57.320 but no more
00:37:57.940 than two years
00:37:58.620 from AI doing
00:37:59.880 completely the
00:38:00.900 Dilbert comic.
00:38:02.680 And I'd like
00:38:03.380 anybody who
00:38:03.940 works in AI
00:38:04.680 who's actually
00:38:05.260 really close to
00:38:06.220 the new stuff
00:38:06.900 to tell me I'm
00:38:08.080 wrong.
00:38:09.240 I don't think
00:38:10.040 so.
00:38:11.200 I'll bet you,
00:38:12.540 I'll bet you
00:38:13.580 that the people
00:38:15.540 who are highest
00:38:16.260 up in AI
00:38:17.160 would say,
00:38:17.880 yeah,
00:38:18.040 that's true.
00:38:18.560 They'll be able
00:38:18.900 to do your
00:38:19.260 job in a year.
00:38:19.760 Inflexible
00:38:27.040 formula gets
00:38:27.740 boring.
00:38:28.200 No,
00:38:28.460 because the
00:38:28.780 formula is
00:38:30.260 the formula
00:38:30.740 for all
00:38:31.100 humor forever.
00:38:32.520 It's not
00:38:32.980 like,
00:38:33.780 it doesn't
00:38:34.360 restrict you.
00:38:35.500 It's more
00:38:35.920 like you have
00:38:36.540 to hit these
00:38:37.000 points.
00:38:37.480 It's not
00:38:37.760 that it
00:38:38.200 restricts you.
00:38:39.660 I guess it
00:38:40.140 restricts you
00:38:40.580 in that way.
00:38:42.820 Will it
00:38:43.380 evolve?
00:38:44.040 Oh,
00:38:44.280 God,
00:38:44.500 yes.
00:38:45.520 Yeah,
00:38:45.800 so the
00:38:46.240 problem is
00:38:46.680 that when
00:38:46.980 we reach
00:38:47.320 the singularity,
00:38:48.520 the point
00:38:49.080 where the
00:38:49.420 AI can
00:38:49.900 learn on
00:38:50.280 itself,
00:38:51.020 which by
00:38:51.600 the way
00:38:51.820 it does
00:38:52.140 now,
00:38:53.420 that they
00:38:53.800 can already
00:38:54.220 teach AI
00:38:54.920 to do a
00:38:56.200 bunch of
00:38:56.540 tasks with
00:38:57.260 a little
00:38:57.500 robot arm,
00:38:58.540 but the
00:38:59.360 scariest thing
00:39:00.060 I heard is
00:39:00.760 that when
00:39:01.080 you teach
00:39:01.580 an AI how
00:39:02.360 to do a
00:39:02.840 bunch of
00:39:03.160 tasks,
00:39:04.080 it can
00:39:05.120 learn on
00:39:05.760 its own
00:39:06.260 unrelated
00:39:06.900 tasks.
00:39:08.520 What?
00:39:09.880 Yeah.
00:39:10.500 If you teach
00:39:11.360 it a bunch
00:39:11.740 of specific
00:39:12.440 tasks,
00:39:13.800 the AI
00:39:14.300 already
00:39:15.140 knows how
00:39:16.220 to do
00:39:16.480 things you
00:39:16.940 haven't shown
00:39:17.920 it,
00:39:18.420 because it
00:39:18.920 must look
00:39:19.300 for some
00:39:19.660 patterns among
00:39:20.340 the things
00:39:20.720 you did
00:39:21.020 show it.
00:39:21.820 Say,
00:39:22.080 oh,
00:39:22.920 well,
00:39:23.200 I never had
00:39:24.100 to screw
00:39:24.460 in a light
00:39:24.880 bulb,
00:39:25.720 but I know
00:39:27.200 what a light
00:39:27.580 bulb is,
00:39:28.100 because I'm
00:39:28.440 AI,
00:39:29.240 I know
00:39:29.700 what the
00:39:30.220 socket is,
00:39:30.880 because I'm
00:39:31.320 AI,
00:39:31.920 I know
00:39:32.340 it has to
00:39:32.760 turn,
00:39:33.560 because I'm
00:39:34.100 AI,
00:39:34.920 and I know
00:39:35.680 that I've
00:39:36.240 screwed in
00:39:36.760 something else.
00:39:37.780 Like,
00:39:38.040 I put a
00:39:38.600 jar on,
00:39:39.360 I know how
00:39:39.820 to put a
00:39:40.280 top on a
00:39:40.820 jar by
00:39:41.220 screwing it
00:39:41.680 on.
00:39:42.520 If it
00:39:43.120 knows that,
00:39:43.700 it knows
00:39:45.260 how to
00:39:45.460 use its
00:39:45.800 hand,
00:39:46.440 it knows
00:39:46.780 how to
00:39:46.980 hold a
00:39:47.300 bulb,
00:39:48.120 it knows
00:39:48.600 that a
00:39:48.920 bulb gets
00:39:49.320 screwed in,
00:39:50.000 and it
00:39:50.280 knows that
00:39:50.660 a lid gets
00:39:51.420 screwed on,
00:39:52.580 I think it
00:39:53.460 knows how to
00:39:53.860 put in a
00:39:54.180 light bulb.
00:39:55.900 Am I
00:39:56.420 wrong?
00:39:57.660 It's basically
00:39:58.300 how you know
00:39:58.940 how to put
00:39:59.280 in a
00:39:59.460 light bulb,
00:40:00.080 because if
00:40:00.500 you'd never
00:40:00.900 heard the
00:40:01.280 thing screw
00:40:01.840 in,
00:40:02.920 you would
00:40:03.440 be like,
00:40:03.820 just trying
00:40:04.220 to shove
00:40:04.560 it in there,
00:40:04.980 right?
00:40:05.600 You'd be
00:40:05.860 like,
00:40:06.060 how does
00:40:06.480 this work?
00:40:07.560 It's just
00:40:08.040 the fact that
00:40:08.660 you know
00:40:09.020 there's a
00:40:09.400 concept of
00:40:10.140 things with
00:40:10.820 threads on
00:40:11.840 it that
00:40:12.220 they screw
00:40:13.260 in.
00:40:13.740 That's all
00:40:14.180 the AI
00:40:14.540 needs to
00:40:15.040 know,
00:40:15.300 too.
00:40:16.460 So,
00:40:17.060 yeah,
00:40:18.680 that's the
00:40:18.860 way humans
00:40:19.220 learn,
00:40:19.560 same way.
00:40:24.640 Let me
00:40:25.160 ask you
00:40:25.520 something.
00:40:27.640 Have you
00:40:28.160 noticed I
00:40:28.560 haven't talked
00:40:28.940 about ESG
00:40:29.760 at all?
00:40:31.720 Probably
00:40:32.120 noticed that,
00:40:32.700 right?
00:40:32.940 It seems like
00:40:33.360 an obvious
00:40:33.740 thing I'd be
00:40:34.120 talking about.
00:40:35.080 Because it's
00:40:35.560 a corporate
00:40:36.260 thing where
00:40:37.220 it's like
00:40:39.500 environmental,
00:40:40.920 social,
00:40:41.340 and government,
00:40:41.760 ESG.
00:40:42.840 And the
00:40:43.300 idea is
00:40:43.940 that
00:40:44.140 corporations
00:40:44.700 need to
00:40:46.380 show some
00:40:47.300 social,
00:40:48.980 let's say,
00:40:49.840 responsibility
00:40:50.460 to be good
00:40:53.360 at this
00:40:53.720 ESG thing.
00:40:54.820 So a lot
00:40:55.240 of companies
00:40:55.640 are being
00:40:55.960 forced into
00:40:56.660 wokeness and
00:40:57.600 environmental
00:40:58.080 concerns that
00:40:59.360 are not part
00:40:59.840 of their
00:41:00.060 core business
00:41:00.800 under the
00:41:02.020 thinking that
00:41:02.760 they can be
00:41:03.200 shamed into
00:41:04.040 giving some
00:41:05.520 group of
00:41:05.920 people what
00:41:06.340 they want.
00:41:07.200 And it seems
00:41:07.660 to be working.
00:41:09.000 Now, let
00:41:11.140 me tell you
00:41:11.540 what I do
00:41:11.980 for a living,
00:41:13.100 generally
00:41:13.520 speaking.
00:41:14.680 I don't
00:41:15.440 just criticize
00:41:16.140 stuff.
00:41:17.020 It seems
00:41:17.320 like I do.
00:41:18.400 But in
00:41:18.900 the context
00:41:20.280 of Dilbert,
00:41:21.080 so just
00:41:21.380 talking about
00:41:21.820 the cartoon
00:41:22.300 world for a
00:41:23.020 minute, so
00:41:23.560 just within
00:41:24.140 cartoons, I'm
00:41:25.420 not really
00:41:25.860 about just
00:41:26.440 criticizing
00:41:26.940 things.
00:41:28.580 I generally,
00:41:29.340 if it's in
00:41:29.700 the business
00:41:30.100 realm, I'm
00:41:31.260 criticizing good
00:41:32.500 ideas that
00:41:33.940 have been
00:41:34.180 taken too
00:41:34.740 far.
00:41:35.020 For
00:41:36.340 example,
00:41:37.480 management is
00:41:39.380 not funny or
00:41:40.680 bad because
00:41:42.240 things need to
00:41:42.900 be managed.
00:41:43.900 So there's
00:41:44.480 nothing funny or
00:41:46.160 ridiculous about
00:41:46.920 having management.
00:41:49.620 What's ridiculous
00:41:50.580 is if you take
00:41:51.420 it too far and
00:41:52.240 you micromanage.
00:41:53.760 So everything
00:41:54.640 is about taking
00:41:55.280 it too far.
00:41:56.140 It's not about
00:41:56.720 the core idea.
00:41:58.180 When re-engineering
00:41:59.200 was a big thing
00:41:59.920 in the corporate
00:42:00.400 world, it was
00:42:01.780 a really good
00:42:02.440 idea.
00:42:03.220 The basic
00:42:03.640 idea was maybe
00:42:04.940 sometimes you
00:42:05.600 need to
00:42:05.880 completely
00:42:06.340 replace a
00:42:07.320 system instead
00:42:07.960 of tweak it.
00:42:08.680 That's the
00:42:09.000 basic idea.
00:42:10.040 But then that
00:42:10.600 turned into every
00:42:11.460 manager had to
00:42:12.240 prove that they
00:42:12.820 were re-engineering
00:42:13.660 something.
00:42:14.580 And most of
00:42:15.080 them had nothing
00:42:15.620 to re-engineer.
00:42:17.020 So they would
00:42:17.580 just say,
00:42:18.340 well, I do
00:42:19.680 re-engineering
00:42:20.340 too, so I'm
00:42:21.760 going to totally
00:42:22.180 re-engineer this
00:42:22.960 system that didn't
00:42:23.700 need to be
00:42:24.080 re-engineered.
00:42:25.100 So what happens
00:42:26.020 is you take
00:42:27.260 this good idea,
00:42:28.660 sometimes you
00:42:29.280 have to start
00:42:30.240 from scratch and
00:42:31.000 rebuild, and it
00:42:32.260 turns into every
00:42:33.040 manager pretending
00:42:33.920 they're doing
00:42:34.420 that way too
00:42:35.740 far.
00:42:36.680 You've taken it
00:42:37.440 too far.
00:42:38.300 So it's pretty
00:42:39.040 much like
00:42:39.420 everything.
00:42:40.780 But ESG is
00:42:41.560 just that.
00:42:42.980 If you say,
00:42:43.880 Scott, do you
00:42:44.440 think corporations
00:42:45.340 should be socially
00:42:46.320 responsible?
00:42:47.160 I'd be like,
00:42:47.920 of course.
00:42:49.340 Well, of
00:42:49.720 course.
00:42:50.520 Do you think
00:42:51.120 they should care
00:42:51.660 about the
00:42:52.000 environment?
00:42:52.820 I'd say,
00:42:53.860 duh, yeah, of
00:42:55.320 course, just like
00:42:55.980 everybody.
00:42:56.840 Why would a
00:42:57.300 corporation be
00:42:57.900 different than an
00:42:58.860 individual?
00:43:00.080 In that sense,
00:43:01.000 they should have
00:43:01.480 similar kind of
00:43:02.640 moral and ethical
00:43:03.400 standards.
00:43:05.280 But, yeah, so I
00:43:08.700 see what you're
00:43:09.120 saying.
00:43:09.780 So I think I'm
00:43:10.580 going to the same
00:43:11.160 place you want me
00:43:11.820 to go, which is,
00:43:13.280 if you start with
00:43:13.860 a general idea, it
00:43:15.040 sounds terrific,
00:43:15.800 doesn't it?
00:43:17.620 I mean, what's
00:43:19.140 wrong with taking
00:43:19.920 care of the world,
00:43:21.020 being good stewards
00:43:22.080 of the things,
00:43:23.880 and blah, blah,
00:43:24.400 blah, blah.
00:43:25.020 But, do you
00:43:26.460 think it's going
00:43:27.000 to be taken too
00:43:27.680 far?
00:43:27.920 Of course.
00:43:30.700 Of course it is.
00:43:32.240 Of course it's
00:43:32.920 going to be taken
00:43:33.360 too far, because
00:43:34.280 everything is.
00:43:34.980 In the business
00:43:35.520 world, everything
00:43:36.020 is.
00:43:36.800 So you should
00:43:37.500 assume it's
00:43:37.980 taken too far.
00:43:39.420 So here's the
00:43:40.000 thing.
00:43:41.480 Do you remember
00:43:42.200 in the 90s,
00:43:43.260 business books
00:43:44.120 were just, like,
00:43:45.520 the biggest thing,
00:43:46.640 and they all
00:43:47.500 had these great
00:43:48.700 ideas, like,
00:43:49.960 in search of
00:43:50.560 excellence and
00:43:51.340 all that, and
00:43:52.320 it all seemed
00:43:52.760 like, oh,
00:43:53.540 finally somebody
00:43:54.240 had the formula
00:43:55.800 that makes
00:43:56.920 management work,
00:43:57.760 and then
00:43:57.980 everybody bought
00:43:58.600 those books.
00:43:59.840 And it turns
00:44:00.300 out that, like
00:44:01.860 everything else,
00:44:03.540 business advice
00:44:04.820 books started
00:44:05.580 out as probably
00:44:06.600 a good idea.
00:44:08.440 What's wrong
00:44:09.000 with having a
00:44:09.520 book that tells
00:44:10.060 you how to run
00:44:10.520 your business
00:44:10.940 better?
00:44:11.240 That's, like,
00:44:11.580 a great idea,
00:44:12.300 isn't it?
00:44:13.780 But, it went
00:44:14.960 too far.
00:44:16.240 And pretty soon
00:44:17.000 everybody who
00:44:17.740 had any success
00:44:19.220 was writing a
00:44:20.480 book to say that
00:44:21.180 what they did is
00:44:21.840 why they were
00:44:22.260 successful, except
00:44:23.880 they were all
00:44:24.300 doing different
00:44:24.780 stuff, and most
00:44:26.360 of them are
00:44:26.680 liars about
00:44:27.320 what made
00:44:27.660 them successful.
00:44:28.580 So, you
00:44:29.060 started with
00:44:29.520 this good
00:44:29.940 thing, which
00:44:31.300 is, oh, a
00:44:31.900 book to tell
00:44:32.360 me how to
00:44:32.760 be more
00:44:33.080 effective as
00:44:33.680 a manager,
00:44:34.460 and it
00:44:35.020 turned into
00:44:35.480 all of these
00:44:36.320 celebrity CEOs
00:44:37.480 telling you that
00:44:38.980 the thing that
00:44:39.580 made them
00:44:40.000 successful was
00:44:40.800 the stuff they
00:44:41.320 did, and it
00:44:41.900 probably wasn't.
00:44:43.500 Probably wasn't.
00:44:44.520 They were probably
00:44:44.960 just in the right
00:44:45.520 place at the
00:44:46.000 right time.
00:44:47.360 So, it
00:44:48.520 starts as this
00:44:49.200 core of
00:44:49.780 perfectly
00:44:50.280 reasonableness
00:44:51.100 and turns into
00:44:52.180 just gigantic
00:44:53.220 bullshit.
00:44:54.720 So, here's my
00:44:55.600 offer to you.
00:44:56.800 Do you know
00:44:57.100 why the
00:44:57.400 business book
00:44:58.040 market collapsed
00:44:59.000 in the 90s?
00:45:01.860 I think it
00:45:02.580 was me.
00:45:03.860 I think I
00:45:04.600 did that.
00:45:05.820 Because, as
00:45:06.580 you know, Elon
00:45:07.160 Musk had a
00:45:08.700 rule in Tesla
00:45:09.800 that you don't
00:45:10.920 want to do
00:45:11.320 anything in Tesla
00:45:12.060 that would end
00:45:12.620 up in a
00:45:13.020 Dilbert comic.
00:45:14.600 He's not the
00:45:15.340 only one with
00:45:15.840 that rule.
00:45:17.220 That was a
00:45:17.880 standard
00:45:18.480 assumption at
00:45:20.540 one point,
00:45:21.120 especially in
00:45:21.560 the late 90s,
00:45:22.300 early 2000s,
00:45:23.360 that if you
00:45:24.160 did something
00:45:24.680 that would end
00:45:25.160 up in a
00:45:25.560 Dilbert comic,
00:45:27.160 it means you
00:45:27.660 went too far.
00:45:29.320 So, it was
00:45:29.720 sort of like
00:45:30.120 this mental
00:45:30.900 gating.
00:45:32.960 And so, I
00:45:33.320 called bullshit
00:45:34.060 on so much
00:45:34.760 management
00:45:35.320 advice that
00:45:37.740 if you track
00:45:40.060 the business
00:45:40.640 book market,
00:45:41.560 it was on
00:45:41.940 fire until
00:45:42.640 Dilbert started
00:45:43.660 skewering it,
00:45:44.720 and then it
00:45:45.080 just fell off
00:45:45.660 the table.
00:45:46.260 It never
00:45:47.000 really recovered.
00:45:48.720 The business
00:45:49.960 books that you
00:45:50.400 see now tend
00:45:51.100 to be based
00:45:52.080 on my book,
00:45:53.480 How to Fail at Almost Everything and Still Win Big.
00:45:56.080 I won't name
00:45:56.700 names, but
00:45:57.280 you're probably
00:45:58.180 aware of some
00:45:58.840 other business
00:45:59.480 books that
00:45:59.960 have borrowed
00:46:01.120 from that and
00:46:02.120 extended it.
00:46:07.000 Yeah, I'm
00:46:09.560 seeing somebody
00:46:10.140 feeling sorry
00:46:11.720 for me that I
00:46:12.340 had coffee
00:46:12.820 out of a
00:46:13.200 paper cup.
00:46:13.780 Thank you for
00:46:16.600 feeling my
00:46:17.140 pain.
00:46:17.920 It's real.
00:46:19.240 The pain is
00:46:19.740 real.
00:46:21.960 So, here's
00:46:22.780 my offer to
00:46:23.380 you.
00:46:25.060 Do you want
00:46:25.620 me to kill
00:46:26.020 ESG?
00:46:28.440 Because you
00:46:28.940 know I
00:46:29.220 can do it,
00:46:29.700 right?
00:46:30.920 First of all,
00:46:31.560 how many people
00:46:32.440 think I can do
00:46:33.040 it?
00:46:34.280 Do you think I
00:46:34.800 can do it?
00:46:35.180 Yeah, yeah, I
00:46:40.140 think I could.
00:46:41.160 Now, Dilbert
00:46:41.720 doesn't have the
00:46:42.320 same power that
00:46:43.880 it used to,
00:46:45.040 right?
00:46:45.320 It used to
00:46:45.740 be a much
00:46:46.600 more pervasive
00:46:47.360 social phenomenon.
00:46:51.000 But within the
00:46:51.860 business world, I
00:46:52.560 think it still
00:46:53.020 has a punch.
00:46:55.220 And I haven't
00:46:56.460 done any comics
00:46:57.220 about ESG.
00:46:58.020 But I
00:47:01.220 tweeted just
00:47:01.680 before I got
00:47:02.160 on, that if
00:47:03.340 anybody wants
00:47:03.880 me to kill
00:47:04.780 ESG, I will
00:47:05.960 do that for
00:47:06.500 you, but
00:47:06.880 you're going
00:47:07.100 to have to
00:47:07.340 give me some
00:47:07.880 actual real
00:47:08.660 world stories
00:47:09.480 about what
00:47:10.660 went wrong
00:47:11.160 in your
00:47:11.460 company.
00:47:12.340 So, just
00:47:12.960 give me the
00:47:13.400 fodder, tell
00:47:14.780 me what went
00:47:15.300 wrong, and I
00:47:16.460 will take care
00:47:16.920 of it for
00:47:17.260 you.
00:47:17.940 Give me six
00:47:18.740 months, and
00:47:21.160 it will be a
00:47:21.900 punchline.
00:47:23.560 In six months,
00:47:24.380 ESG will be
00:47:25.260 something you
00:47:25.740 should be
00:47:26.040 embarrassed to
00:47:26.640 be involved
00:47:27.100 in.
00:47:28.020 And I'll
00:47:28.340 make that
00:47:28.720 happen in
00:47:29.140 six months.
00:47:30.960 You've just
00:47:31.480 got to help
00:47:31.800 me.
00:47:32.060 You've just
00:47:32.340 got to give
00:47:32.620 me the
00:47:32.980 anecdotes and
00:47:33.760 the, you
00:47:34.600 know, the
00:47:34.860 what's wrong
00:47:35.580 with the
00:47:35.820 stories.
00:47:36.380 But put that
00:47:36.840 in the tweet
00:47:37.280 and I'll
00:47:37.600 take care of
00:47:37.980 it for you.
00:47:38.700 All right?
00:47:39.620 See, this is
00:47:40.180 how we have
00:47:41.420 a relationship.
00:47:46.800 I am
00:47:47.720 literally looking
00:47:48.500 for ways, on
00:47:49.520 a fairly regular
00:47:50.360 basis, to do
00:47:51.140 something for
00:47:51.660 you.
00:47:52.920 Yeah, can I
00:47:53.680 help you?
00:47:54.040 Can I give you
00:47:54.440 some advice?
00:47:55.160 Can I give you
00:47:55.560 a micro lesson?
00:47:56.820 Can I give you
00:47:57.340 some context?
00:47:58.020 Can I teach
00:47:58.840 you how to
00:47:59.100 look at the
00:47:59.440 news in a
00:48:00.120 more productive
00:48:00.620 way?
00:48:01.340 But this is
00:48:01.940 something I
00:48:02.340 can do.
00:48:03.420 So if you
00:48:04.020 can help me,
00:48:04.660 we'll just go
00:48:05.320 do this and
00:48:06.260 just change
00:48:06.660 it.
00:48:08.120 All right.
00:48:08.400 So the U.S.
00:48:13.660 Postal Service
00:48:14.520 decided Muslims
00:48:15.380 should have a
00:48:15.980 prayer room in
00:48:16.920 the facility.
00:48:19.380 Yeah, I don't
00:48:19.940 care.
00:48:20.980 I don't care.
00:48:23.300 Is that a
00:48:23.920 problem?
00:48:26.000 Like, who cares
00:48:26.760 if somebody has a
00:48:27.420 prayer room?
00:48:28.360 Like, I get,
00:48:29.280 oh, okay, you
00:48:30.720 know, then you
00:48:31.080 need a prayer room
00:48:31.900 for everybody.
00:48:33.220 But, you
00:48:33.660 know, sometimes
00:48:34.380 one group
00:48:35.100 wants something
00:48:36.800 more strongly.
00:48:38.040 I don't know.
00:48:38.860 That just doesn't
00:48:39.520 seem like the
00:48:40.000 biggest problem
00:48:40.540 in the world.
00:48:42.160 Yeah, you
00:48:42.600 don't have to
00:48:43.000 like it, but it's
00:48:43.640 not like a big
00:48:44.160 problem if they
00:48:45.260 had the room.
00:48:46.280 You know, if
00:48:46.720 they close down
00:48:49.040 their mainframes
00:48:49.800 and shut down
00:48:51.280 the business so
00:48:51.980 that this room
00:48:53.540 could be available, that would be different.
00:48:54.340 But if it's just
00:48:54.340 a room and
00:48:55.860 maybe it wasn't
00:48:56.460 being used for
00:48:57.020 anything else
00:48:57.480 during those
00:48:58.000 times, it's just
00:48:59.700 people in a
00:49:00.220 room.
00:49:01.160 Let them do
00:49:01.600 whatever the hell
00:49:02.080 they want if
00:49:03.180 they're not
00:49:03.540 working.
00:49:06.800 You are getting
00:49:07.560 paid to pray.
00:49:08.280 I don't believe
00:49:08.840 they're being asked
00:49:09.640 to be paid to
00:49:10.700 pray.
00:49:11.500 I believe they
00:49:12.340 are being asked
00:49:12.880 for just a
00:49:13.440 space.
00:49:15.280 Would it matter
00:49:16.100 to you that it's
00:49:16.720 praying?
00:49:18.120 Suppose a number
00:49:18.920 of employees came
00:49:19.840 to you and said
00:49:20.420 we'd like to do
00:49:21.200 yoga on our
00:49:22.020 time, like on
00:49:22.840 our breaks.
00:49:24.200 If the post
00:49:25.040 office said,
00:49:25.600 yeah, we've got
00:49:26.060 this room,
00:49:27.020 there are so many
00:49:28.080 people who want
00:49:28.620 to do yoga on
00:49:29.360 their break,
00:49:30.040 go ahead, do
00:49:30.920 yoga.
00:49:32.020 Would you be
00:49:32.720 saying, fuck
00:49:33.720 those people
00:49:34.300 doing yoga on
00:49:36.460 my dime?
00:49:37.420 No, they're
00:49:37.880 doing it on
00:49:38.320 their break,
00:49:39.640 and it's just
00:49:40.100 yoga.
00:49:40.820 Get over it.
00:49:41.780 If somebody
00:49:42.220 wants to pray
00:49:42.880 or do yoga
00:49:43.560 or meditate
00:49:44.840 or chant
00:49:45.760 on their own
00:49:46.920 time, yeah,
00:49:48.420 I've got an
00:49:48.880 extra room,
00:49:49.420 you can do it
00:49:49.800 in my room.
00:49:53.440 So, yeah,
00:49:55.220 it's probably
00:49:55.480 during breaks.
00:49:56.960 If the story
00:49:57.840 had been
00:49:59.540 about they
00:50:00.660 want to pray
00:50:01.260 instead of
00:50:01.780 work, I
00:50:02.880 think that's
00:50:03.280 what the
00:50:03.540 headline would
00:50:03.980 have done.
00:50:07.420 CNN had
00:50:08.340 a headline
00:50:09.540 that was
00:50:10.280 making me
00:50:11.000 crazy today
00:50:12.340 that I wrote
00:50:13.100 down on my
00:50:13.840 computer, which then
00:50:16.060 stopped working.
00:50:16.060 And the
00:50:16.740 only way that
00:50:17.360 I could check
00:50:17.920 that is by
00:50:20.840 temporarily
00:50:21.460 closing down
00:50:22.300 my YouTube
00:50:23.340 feed and using
00:50:24.180 that.
00:50:24.520 Or I
00:50:25.460 could get
00:50:25.720 my charger
00:50:27.480 and charge
00:50:27.980 it back
00:50:28.260 up.
00:50:30.880 All right.
00:50:33.940 I think
00:50:34.500 it was about
00:50:34.840 monkeypox.
00:50:39.700 Scheduled
00:50:40.140 time for
00:50:40.620 prayer?
00:50:41.820 Well, if
00:50:42.260 they use that
00:50:42.800 as their
00:50:43.120 break, then
00:50:43.760 that's fine.
00:50:47.460 Do you
00:50:48.000 need a room
00:50:48.440 for every
00:50:48.860 religion?
00:50:49.500 You know,
00:50:50.060 that's the
00:50:50.720 obvious question.
00:50:52.520 Well, I
00:50:53.060 think big
00:50:53.460 companies can
00:50:54.120 deal with
00:50:54.700 larger groups
00:50:55.720 differently than
00:50:56.380 smaller groups.
00:50:57.300 I don't have
00:50:57.740 a problem
00:50:58.020 with that.
00:50:58.880 You know, if
00:50:59.320 they had a
00:51:00.320 hundred Muslims
00:51:01.060 who wanted
00:51:01.540 their own
00:51:01.960 room, but
00:51:03.520 two people
00:51:04.720 said, hey,
00:51:05.620 where's my
00:51:06.360 Jewish prayer
00:51:08.540 room or
00:51:08.940 something, I
00:51:09.940 don't think the
00:51:10.400 post office needs
00:51:11.160 to give a room
00:51:11.640 for the two
00:51:12.040 people.
00:51:13.620 I think
00:51:14.000 giving room
00:51:14.620 for the
00:51:14.840 hundred people
00:51:15.360 makes sense
00:51:15.900 just from a
00:51:16.620 business
00:51:16.920 perspective.
00:51:18.400 Giving it
00:51:19.000 to two maybe
00:51:19.860 doesn't make
00:51:20.640 sense.
00:51:21.660 Yeah.
00:51:26.380 Now, I
00:51:29.400 would like to
00:51:30.020 offer one
00:51:31.080 suggestion to
00:51:31.780 get past the
00:51:32.620 monkeypox.
00:51:34.500 Are you ready
00:51:35.280 for this?
00:51:38.200 As you
00:51:38.900 know, it
00:51:39.280 seems to be
00:51:40.460 transmitted
00:51:40.960 primarily from
00:51:43.500 people with
00:51:44.340 penises to
00:51:44.980 people with
00:51:45.400 penises who
00:51:46.160 have sex.
00:51:47.300 I'm not a
00:51:48.140 bigot like
00:51:48.740 Jake Tapper
00:51:49.560 who says
00:51:50.020 men-to-men
00:51:50.680 sex.
00:51:51.660 Very
00:51:52.240 exclusionary
00:51:53.080 of the
00:51:53.480 people who
00:51:55.080 are lesbians
00:51:55.900 but both of
00:51:57.060 them were
00:51:57.300 born with
00:51:57.700 penises
00:51:58.060 because they
00:51:59.620 have the
00:51:59.840 same risk.
00:52:02.680 Here's how
00:52:03.500 we could
00:52:04.140 get past
00:52:04.840 it.
00:52:07.040 If you,
00:52:07.880 if the,
00:52:08.800 primarily,
00:52:10.100 if the people
00:52:11.200 with penises
00:52:11.740 having sex
00:52:12.340 with people
00:52:12.720 with penises,
00:52:13.220 if they
00:52:14.260 could just
00:52:14.760 maybe lay
00:52:16.460 off for a
00:52:16.980 few weeks
00:52:17.480 and do
00:52:19.280 something else
00:52:19.980 because apparently
00:52:21.100 other sexual
00:52:22.760 acts are much
00:52:23.640 less likely
00:52:24.580 to cause a
00:52:25.260 problem.
00:52:26.340 It seems to
00:52:26.880 be the,
00:52:27.640 it's the
00:52:28.560 butt sex.
00:52:29.400 It's the butt
00:52:30.060 sex that seems
00:52:30.760 to be the
00:52:31.120 problem.
00:52:31.360 I'm right,
00:52:31.700 right?
00:52:32.380 Medically
00:52:32.840 speaking,
00:52:33.400 it's the
00:52:33.940 butt sex.
00:52:34.460 So,
00:52:36.120 what we
00:52:36.620 need is
00:52:36.980 just a
00:52:37.360 few weeks
00:52:38.180 with a,
00:52:40.800 like,
00:52:41.180 let's slow
00:52:41.580 down the
00:52:41.920 butt sex.
00:52:43.180 And we
00:52:43.740 need some
00:52:44.040 kind of a
00:52:44.400 slogan,
00:52:45.660 you know,
00:52:46.100 do something
00:52:46.760 besides butt
00:52:47.600 sex for
00:52:48.700 two weeks.
00:52:49.260 What would
00:52:49.680 be a good
00:52:50.800 slogan?
00:52:52.080 Something
00:52:52.520 besides anal
00:52:53.380 sex for
00:52:53.900 two weeks.
00:52:56.380 How about
00:52:57.200 two weeks
00:52:58.280 to blow
00:52:58.800 the curve?
00:53:01.580 Anybody?
00:53:03.000 Two weeks
00:53:03.480 to blow
00:53:03.840 the curve?
00:53:05.260 No?
00:53:08.240 Okay,
00:53:08.760 I think
00:53:08.980 that's pretty
00:53:09.300 good.
00:53:13.640 All right.
00:53:20.120 Your doctor
00:53:20.820 says oral
00:53:21.440 monkeypox is
00:53:22.260 a thing.
00:53:22.680 Yes,
00:53:23.180 it is a
00:53:23.720 thing.
00:53:25.100 It is a
00:53:25.860 thing.
00:53:26.380 So,
00:53:26.720 don't let me
00:53:27.340 mislead you
00:53:27.960 into thinking
00:53:28.440 that oral
00:53:29.540 sex with
00:53:30.120 somebody who
00:53:30.580 has monkey
00:53:31.040 pox is
00:53:31.560 going to
00:53:31.800 be okay.
00:53:33.100 If I can
00:53:33.860 give you
00:53:34.180 any medical
00:53:34.760 advice without
00:53:35.420 a medical
00:53:35.940 degree,
00:53:36.960 I feel
00:53:37.580 this would
00:53:37.940 be the
00:53:38.220 time.
00:53:39.440 I'm going
00:53:39.960 to say
00:53:40.180 this with
00:53:40.620 my great
00:53:41.260 medical
00:53:41.960 confidence.
00:53:43.460 If somebody
00:53:44.220 has monkey
00:53:44.740 pox,
00:53:46.100 maybe stay
00:53:47.020 away for a
00:53:47.420 while.
00:53:48.200 That's all.
00:53:48.580 All right,
00:53:56.680 I'm looking at
00:53:57.060 your jokes,
00:53:57.580 but I
00:53:57.780 can't
00:53:57.960 repeat
00:53:58.200 them.
00:53:59.800 Stop
00:54:00.220 now while
00:54:00.640 you're
00:54:00.800 ahead.
00:54:03.060 Hands-on
00:54:03.700 solution.
00:54:04.120 Let's
00:54:08.020 see.
00:54:12.960 Something
00:54:13.440 about,
00:54:14.520 yeah,
00:54:15.020 I like
00:54:15.480 where you're
00:54:15.760 going with
00:54:16.100 this,
00:54:16.340 but let's
00:54:16.720 see if we
00:54:17.040 can form a
00:54:17.640 better joke
00:54:18.240 on the fly.
00:54:20.180 All right,
00:54:20.460 something with
00:54:21.120 hands,
00:54:22.320 because that's
00:54:22.760 got to be the
00:54:23.200 safest,
00:54:23.660 right?
00:54:25.200 Something with
00:54:25.840 hands.
00:54:26.440 Let me
00:54:34.000 ask you
00:54:34.320 this.
00:54:35.540 Have we
00:54:36.020 ever had
00:54:36.840 a global
00:54:39.340 disaster that
00:54:41.560 could be
00:54:41.900 fixed by
00:54:42.460 jerking off?
00:54:45.420 Because I
00:54:47.040 don't know
00:54:47.440 about you,
00:54:49.080 but remember
00:54:50.100 when the
00:54:50.480 pandemic
00:54:50.820 happened and
00:54:51.780 the government
00:54:53.480 said,
00:54:54.240 all right,
00:54:54.660 we can get
00:54:55.120 past this,
00:54:55.900 but here's
00:54:56.300 what you're
00:54:56.620 going to
00:54:56.780 need to
00:54:57.060 do.
00:54:57.920 You're
00:54:58.280 going to
00:54:58.360 have to
00:54:58.660 lock
00:54:58.960 yourself
00:54:59.320 in
00:54:59.580 solitary
00:55:00.000 confinement.
00:55:01.120 You're
00:55:01.480 going to
00:55:01.560 have to
00:55:01.800 wear a
00:55:02.280 diaper on
00:55:02.760 your face
00:55:03.200 whenever you
00:55:03.680 go outside.
00:55:04.600 You're
00:55:04.960 not going
00:55:05.280 to be able
00:55:05.480 to travel,
00:55:06.200 and we're
00:55:06.460 going to
00:55:06.620 give you
00:55:07.140 shots that
00:55:09.180 we wish
00:55:10.040 we had
00:55:10.280 more time
00:55:10.720 to test
00:55:11.180 them,
00:55:11.520 but we
00:55:12.260 didn't.
00:55:13.740 So that's
00:55:14.100 what you're
00:55:14.400 going to
00:55:14.500 do.
00:55:14.860 So that's
00:55:15.180 how you
00:55:15.720 get past
00:55:16.120 the pandemic.
00:55:16.980 Pretty
00:55:17.220 hard,
00:55:17.600 right?
00:55:18.540 Like supply
00:55:19.180 chain,
00:55:19.920 problems,
00:55:20.940 all kinds
00:55:21.460 of problems.
00:55:22.780 But now
00:55:23.200 let's go
00:55:23.560 to monkey
00:55:23.880 pox.
00:55:25.220 How could
00:55:26.080 you possibly
00:55:26.820 solve monkey
00:55:27.980 pox?
00:55:29.040 You could
00:55:29.940 jerk off
00:55:30.820 for two
00:55:32.600 weeks.
00:55:36.100 I'm not
00:55:36.880 joking,
00:55:37.580 am I?
00:55:38.740 If every
00:55:39.680 man just
00:55:41.320 said,
00:55:42.480 I'll tell
00:55:43.280 you what,
00:55:43.980 for two
00:55:44.480 weeks,
00:55:44.860 I would
00:55:45.180 just jerk
00:55:45.700 off.
00:55:47.120 Two weeks
00:55:47.900 to spank
00:55:49.260 the monkey?
00:55:50.540 I got it.
00:55:51.960 Spank the
00:55:52.460 monkey pox.
00:55:54.560 We're
00:55:55.040 done.
00:55:56.000 Spank
00:55:56.280 the monkey
00:55:56.620 pox.
00:55:57.860 Spank
00:55:58.340 the monkey.
00:56:02.080 If you
00:56:02.760 just jerk
00:56:03.440 off for
00:56:04.080 a month,
00:56:05.360 this
00:56:05.620 pandemic's
00:56:06.260 over.
00:56:06.640 Two weeks
00:56:07.300 to probably
00:56:07.580 get it
00:56:07.860 done.
00:56:09.300 Two weeks
00:56:10.040 to spank
00:56:11.220 the monkey
00:56:11.660 to beat
00:56:12.360 the pox.
00:56:14.680 But honestly,
00:56:15.840 have we
00:56:16.280 ever had
00:56:16.880 a national
00:56:17.420 problem that
00:56:18.000 could be
00:56:18.420 defeated
00:56:18.860 entirely by
00:56:19.660 jerking
00:56:20.000 off?
00:56:21.500 Can you
00:56:21.940 think of
00:56:22.260 any other
00:56:22.840 problem that
00:56:23.680 was this
00:56:24.060 easy to
00:56:24.520 solve?
00:56:25.420 What will
00:56:26.100 we do?
00:56:27.520 We better
00:56:28.480 ask the
00:56:29.000 AI,
00:56:30.140 monkey pox
00:56:32.980 sweeping the
00:56:33.800 nation.
00:56:34.760 What shall
00:56:35.360 we do?
00:56:38.000 This is
00:56:38.560 going to be
00:56:38.800 my impression
00:56:40.700 of AI.
00:56:42.240 Now,
00:56:42.720 AI doesn't
00:56:44.840 look exactly
00:56:45.640 like people
00:56:46.180 yet, so I
00:56:47.300 have to give
00:56:47.720 this expression
00:56:48.420 that gives
00:56:48.780 you that
00:56:49.120 uncanny
00:56:49.660 valley,
00:56:50.660 you know,
00:56:50.960 where it
00:56:51.180 looks like
00:56:51.480 a person
00:56:51.860 but not.
00:56:52.880 So I'm
00:56:53.220 going to
00:56:53.360 do my
00:56:53.660 impression
00:56:54.000 of AI
00:56:54.440 that's
00:56:55.220 almost
00:56:55.800 like a
00:56:56.340 person
00:56:56.620 but not.
00:56:59.160 What was
00:56:59.820 the question?
00:57:01.800 Oh,
00:57:02.680 how to
00:57:03.220 solve the
00:57:03.860 monkey pox
00:57:04.480 problem?
00:57:09.060 Try
00:57:09.540 jerking off
00:57:10.160 for two
00:57:10.460 weeks.
00:57:10.740 Will that
00:57:13.600 work,
00:57:13.960 AI?
00:57:14.740 Uh-huh.
00:57:15.900 How often?
00:57:16.980 Every time.
00:57:18.120 Every time.
00:57:19.740 Two weeks
00:57:20.280 to spank
00:57:20.960 the monkey.
00:57:21.740 Problem
00:57:22.020 solved.
00:57:25.800 How about
00:57:26.600 can we
00:57:27.460 make it
00:57:27.800 rhyme?
00:57:29.220 You know,
00:57:29.500 it's more
00:57:29.760 effective if
00:57:30.360 it rhymes.
00:57:33.960 Spank
00:57:34.480 your cox
00:57:35.040 to beat
00:57:35.400 the monkey
00:57:35.780 pox.
00:57:38.480 How about
00:57:39.200 holster your
00:57:39.900 cox
00:57:41.340 to stop
00:57:42.380 the pox?
00:57:47.980 How about
00:57:48.660 pull your
00:57:51.500 cox
00:57:51.920 to stop
00:57:53.080 the pox?
00:57:55.020 Right?
00:57:57.060 Maybe
00:57:57.500 somebody says
00:57:59.640 maybe just
00:58:00.400 have sex
00:58:00.840 with women.
00:58:01.820 No,
00:58:02.500 that's not
00:58:03.000 a solution.
00:58:04.760 No.
00:58:06.320 No,
00:58:06.640 stop it.
00:58:07.180 That's not
00:58:07.500 a solution.
00:58:08.620 And beat
00:58:08.960 it.
00:58:09.980 Beat the
00:58:10.640 pox.
00:58:11.860 Beat your
00:58:12.560 cox to
00:58:13.100 beat the
00:58:13.400 pox.
00:58:16.140 I still
00:58:19.560 like a
00:58:19.900 love glove.
00:58:22.720 Jerk your
00:58:23.400 cox to
00:58:23.860 prevent the
00:58:24.300 pox.
00:58:24.720 No cox,
00:58:33.580 no pox.
00:58:36.800 No cox,
00:58:37.800 no pox.
00:58:42.840 All hands
00:58:43.660 on deck.
00:58:44.180 I have to
00:58:47.500 say it
00:58:47.720 instead of
00:58:49.280 all hands
00:58:49.860 on deck,
00:58:50.980 it's all
00:58:51.300 hands on
00:58:51.760 dick.
00:58:57.040 That's so
00:58:57.700 funny.
00:58:59.020 Just beat
00:58:59.540 it.
00:59:00.260 Just beat
00:59:00.980 it.
00:59:01.780 What if you
00:59:02.180 just played
00:59:02.660 Michael Jackson's
00:59:03.840 beat it?
00:59:04.860 All you
00:59:06.700 got to do
00:59:07.120 is beat
00:59:07.520 it.
00:59:09.460 Use a
00:59:10.080 sock to
00:59:11.140 cover your
00:59:11.620 pox.
00:59:17.660 Oh,
00:59:18.160 that's
00:59:18.400 good.
00:59:23.040 Punch a
00:59:23.640 donkey to
00:59:24.240 beat the
00:59:24.540 monkey.
00:59:25.840 Shock the
00:59:26.440 monkey.
00:59:29.720 Oh,
00:59:30.320 hands on
00:59:30.780 dick.
00:59:31.060 I don't
00:59:36.080 know why
00:59:36.300 that's the
00:59:36.680 funniest one.
00:59:37.720 All hands
00:59:38.220 on deck.
00:59:45.800 Alright,
00:59:46.340 I got one.
00:59:46.880 You ready
00:59:47.180 for it?
00:59:48.320 You all
00:59:48.620 have to
00:59:48.900 listen
00:59:49.100 carefully.
00:59:50.800 No
00:59:51.200 butts
00:59:51.440 about it.
00:59:55.580 Alright.
00:59:57.220 Turn the
00:59:57.800 other cheek.
01:00:01.060 Challenge to
01:00:12.500 post these
01:00:12.900 on Twitter.
01:00:14.980 Alright,
01:00:15.600 I will post
01:00:16.200 on Twitter
01:00:16.640 the following
01:00:17.280 joke.
01:00:18.640 That we
01:00:19.160 need more
01:00:20.300 pandemics that
01:00:21.180 could be
01:00:21.640 completely solved
01:00:22.580 by jerking
01:00:23.120 off.
01:00:23.960 By staying
01:00:24.800 home and
01:00:25.120 watching porn
01:00:25.720 and jerking
01:00:26.220 off.
01:00:26.900 But your
01:00:27.240 doctor won't
01:00:27.740 tell you to
01:00:28.160 do it.
01:00:31.060 I need
01:00:33.320 a doctor
01:00:33.840 who's brave
01:00:34.440 enough to
01:00:34.860 say,
01:00:35.400 I just
01:00:36.120 need two
01:00:36.560 weeks.
01:00:37.780 If you
01:00:38.220 don't know
01:00:38.560 where Porn
01:00:39.020 Hub is,
01:00:40.400 let me
01:00:40.740 give you
01:00:41.000 the address.
01:00:42.740 It's
01:00:42.940 pornhub.com.
01:00:47.140 I mean,
01:00:48.140 is anybody
01:00:48.820 laughing as
01:00:49.440 hard as I
01:00:49.860 am that
01:00:50.320 we act like
01:00:51.260 we have a
01:00:51.680 national problem
01:00:52.480 that literally
01:00:53.760 can be solved
01:00:54.420 by jerking
01:00:54.960 off?
01:00:56.040 Now,
01:00:56.340 correct me
01:00:56.700 if I'm
01:00:56.960 wrong,
01:00:58.000 but AIDS
01:00:58.800 is not like
01:00:59.460 that,
01:00:59.740 right?
01:00:59.940 If you
01:01:00.960 got AIDS,
01:01:01.480 you got
01:01:01.740 AIDS,
01:01:02.400 back in
01:01:02.840 the old
01:01:03.060 days,
01:01:03.420 before the
01:01:03.880 therapeutics.
01:01:05.280 But if
01:01:06.660 you have
01:01:06.880 AIDS,
01:01:07.220 you just
01:01:07.460 can't have
01:01:07.820 sex,
01:01:09.040 you know,
01:01:10.080 really in
01:01:10.540 a reasonable
01:01:11.200 safe way
01:01:11.980 for a long
01:01:12.380 time,
01:01:13.040 like forever,
01:01:14.060 unless you've
01:01:15.300 got a special
01:01:15.820 situation.
01:01:17.760 But monkey
01:01:18.440 pox,
01:01:18.820 you can just
01:01:19.300 jerk off for
01:01:19.900 two weeks
01:01:20.320 and it's
01:01:20.600 all gone,
01:01:21.120 isn't it?
01:01:22.180 Correct me
01:01:22.660 if I'm
01:01:22.900 wrong.
01:01:26.600 Guess
01:01:27.000 what I have
01:01:27.400 in my
01:01:27.640 hand and
01:01:28.020 you can
01:01:28.240 have it.
01:01:29.940 I don't
01:01:30.340 think I'm
01:01:30.680 going to
01:01:30.860 guess.
01:01:34.040 Have
01:01:34.240 horn,
01:01:34.600 if
01:01:36.160 Pornhub,
01:01:37.660 oh god,
01:01:38.520 this is
01:01:38.800 funny,
01:01:39.760 suggestion to
01:01:40.700 Pornhub from
01:01:41.500 the locals
01:01:42.200 community,
01:01:43.500 somebody on
01:01:44.160 locals suggested
01:01:44.920 Pornhub should
01:01:45.840 run a free
01:01:46.560 special for
01:01:47.160 two weeks,
01:01:48.220 like all
01:01:48.720 the gay
01:01:49.380 porn on
01:01:49.940 Pornhub would
01:01:50.800 be free for
01:01:51.440 two weeks
01:01:52.000 to end the
01:01:53.220 monkeypox
01:01:53.780 pandemic.
01:01:54.280 That would
01:01:56.060 actually
01:01:56.560 work.
01:01:57.760 Wouldn't
01:01:58.080 it?
01:01:59.640 Can
01:01:59.940 some
01:02:00.200 doctor tell
01:02:00.840 me,
01:02:01.200 am I
01:02:01.500 crazy?
01:02:02.420 And maybe
01:02:02.800 it's not
01:02:03.120 two weeks,
01:02:03.660 right?
01:02:03.900 It might
01:02:04.140 be longer
01:02:04.460 than two
01:02:04.800 weeks.
01:02:05.400 But if
01:02:06.480 everybody in
01:02:06.980 the United
01:02:07.260 States just
01:02:07.940 watched
01:02:08.780 Pornhub for
01:02:09.360 two weeks,
01:02:10.500 I feel like
01:02:12.140 we'd be in a
01:02:12.560 lot better
01:02:12.940 shape.
01:02:15.100 I don't
01:02:15.640 think the
01:02:16.040 pox stays
01:02:16.580 with you once
01:02:17.080 infected,
01:02:17.580 does it?
01:02:18.060 I think you'd
01:02:18.600 get over it.
01:02:19.020 That's my
01:02:21.000 understanding.
01:02:25.360 You do,
01:02:26.060 right?
01:02:26.260 You get over
01:02:26.680 it, yeah.
01:02:28.560 All right,
01:02:29.180 well, I
01:02:29.560 think we've
01:02:30.060 completely
01:02:30.840 debased this
01:02:32.780 conversation to
01:02:33.720 a place where
01:02:34.400 it needs to
01:02:34.780 be.
01:02:35.760 And on that
01:02:36.460 note, I'm
01:02:37.180 going to go
01:02:37.380 do something
01:02:37.780 else after I
01:02:39.680 do a really
01:02:40.560 funny tweet.
01:02:42.560 And thanks
01:02:43.120 for joining
01:02:43.480 YouTube and
01:02:44.800 Spotify.
01:02:45.600 I'll talk to
01:02:46.300 you tomorrow.