Real Coffee with Scott Adams - September 08, 2022


Episode 1860 Scott Adams: Orange Brandon, US Life Expectancy Falls, Workplaces Less Polite, More Fun


Episode Stats


Length: 1 hour and 3 minutes

Words per minute: 144.7

Word count: 9,258

Sentence count: 721

Harmful content

Misogyny: 15 sentences flagged

Hate speech: 27 sentences flagged


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

The Queen is ill, and family members are gathering around, which is usually the signal that the cat's on the roof. I predict that the Queen will pass soon, and then all the bad news about Trump will stop for a while, because they need to save the good stuff for when you're paying attention. Elon Musk thinks that fusion energy will never be economical, and I think he's wrong.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
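The decimal values appended to some transcript lines below are those classifiers' confidence scores for flagged sentences. As a rough sketch only (not the actual pipeline used to build this page), the same two models can be run per sentence with the Hugging Face transformers library; the example sentences, loop structure, and output format are illustrative assumptions.

```python
# Minimal sketch, assuming the Hugging Face transformers library is installed;
# the example sentences and output format are placeholders, not this site's code.
from transformers import pipeline

misogyny_clf = pipeline(
    "text-classification", model="MilaNLProc/bert-base-uncased-ear-misogyny"
)
hate_clf = pipeline(
    "text-classification", model="facebook/roberta-hate-speech-dynabench-r4-target"
)

sentences = [
    "Example transcript sentence one.",
    "Example transcript sentence two.",
]

for sent in sentences:
    for name, clf in (("misogyny", misogyny_clf), ("hate speech", hate_clf)):
        top = clf(sent)[0]  # e.g. {'label': '...', 'score': 0.97}
        # The winning label's score is the kind of number shown next to
        # flagged lines in the transcript below.
        print(f"{name}: {top['label']} ({top['score']:.2f}) :: {sent}")
```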
00:00:00.000 It is a cup or mug or a glass, a tank or chalice or stein, a canteen, jug or flask, or a vessel of any kind.
00:00:05.240 Fill it with your favorite beverage.
00:00:07.860 I like coffee.
00:00:09.540 And drink with me, and join me now for the unparalleled pleasure, the dopamine hit of the day.
00:00:16.500 It's the thing that makes everything better.
00:00:19.720 It's called the simultaneous sip.
00:00:21.580 It happens now.
00:00:22.540 Go.
00:00:26.840 Ah, yeah.
00:00:28.800 Yeah, yeah, yeah.
00:00:31.400 Yeah, yeah, yeah, yeah.
00:00:34.980 Well, the Queen of England is ill, and family members are gathering around, which is usually the signal that the cat's on the roof.
00:00:46.940 Meaning that I don't think that they're expecting the Queen to recover from this.
00:00:51.720 Because when you do the family is gathering around part, that's kind of telling you the game is about over.
00:01:02.140 Here's what I will predict about that.
00:01:04.660 Are you ready for a prediction?
00:01:05.700 Do you believe that the news happens on a sort of a random schedule?
00:01:14.040 Do you think that news happens, but it just happens when it happens?
00:01:18.160 And if it's going to happen, that's when it happens.
00:01:20.500 No, not the political stuff.
00:01:24.620 So here's my prediction.
00:01:26.700 Should the Queen pass, do you know what's going to happen?
00:01:31.160 All of the bad news about Trump will suddenly stop for a while.
00:01:37.120 Because they need to save the good stuff for when you're paying attention.
00:01:41.080 They're not going to fight with a Queen story.
00:01:44.400 So for about a week or so, if the Queen passes, for about a week or so, the political news will get really boring.
00:01:52.580 And that's how you know it's never been real.
00:01:54.540 If you ever want to test to see if the news is real or manufactured, watch how it stops when there's a competing story.
00:02:05.580 Now, I could be wrong, right?
00:02:07.900 There's some things that the news can't control.
00:02:11.220 But things like an anonymous source told us something bad about Trump,
00:02:16.200 I'll bet you that anonymous source is going to wait another week if there's a competing story.
00:02:22.260 So look for that.
00:02:24.540 As you know, Russia cut off Europe from its gas deliveries and is selling it to the Far East, and China in particular.
00:02:35.860 But that plan might not be as awesome as they hoped, because China just bought a ton of LNG.
00:02:46.180 And they bought it at half of the current spot price.
00:02:52.220 China is buying Russia's gas, but at half the market price.
00:02:58.960 I don't know why it's half.
00:03:00.400 It must be some reason.
00:03:01.980 But this reminds me that if you're worried about China and Russia becoming too close,
00:03:09.900 I don't think they're friends, I don't think anybody's going to do any favors for the other.
00:03:17.640 I think it's just business.
00:03:19.780 So if China gets a chance to screw Russia on a business deal, they're going to do it.
00:03:25.060 And you wouldn't expect anything else, of course.
00:03:27.680 But I don't know that Russia is in a good place here.
00:03:30.380 But depending on China to keep you alive seems like a really bad proposition, because they'd have a lot of power over you. 0.89
00:03:38.300 All right, here's something that messed up my mind pretty good.
00:03:40.940 I've mentioned before that there's a small number of people who, if they disagree with me, I immediately have to change my opinion.
00:03:52.060 It's like, ah, damn it.
00:03:53.940 For example, if I had some dumbass opinion about the Constitution, but I heard Alan Dershowitz say the opposite opinion,
00:04:01.920 I would immediately say, ah, damn it, Dershowitz.
00:04:07.000 And I would just agree with him, because my own opinion would be worthless in that context.
00:04:12.340 But I just found out that Elon Musk thinks that fusion energy will never be economical.
00:04:22.280 That's the opposite of what everybody thinks, right?
00:04:24.860 The whole point of fusion is that once you get the reaction going, it's the most economical thing you can do.
00:04:31.920 Well, Musk, who I believe has invested in fusion, and so he would know what's actually on the, you know, on the workbench as well as what's in the real world.
00:04:43.920 And he says in a tweet, fusion would be expensive energy, given the difficulty of obtaining and transporting source fuel.
00:04:54.060 And that's interesting, because I thought you only had to do that once.
00:04:57.880 Wasn't that the whole point of fusion?
00:05:00.440 That once you get your source fuel, it doesn't need new fuel.
00:05:05.440 So is it really that hard to get and transport?
00:05:08.400 I don't know.
00:05:09.300 So it's something that was new to me.
00:05:11.980 Plus maintaining the reactor, so it would be hard to maintain the reactor.
00:05:16.460 He says it's far better to use the sun, a thermonuclear reactor with no need to refuel or service.
00:05:25.260 Now, here's my problem with that.
00:05:28.620 I feel as if he's making a prediction about fusion based on what we know how to do today.
00:05:35.200 Isn't that right?
00:05:37.360 It looks like a prediction about the future based on what we know how to do today.
00:05:44.180 And that seems wrong coming from him, doesn't it?
00:05:47.260 Because do you think he knew how to go to Mars before he started?
00:05:51.180 I don't think he did.
00:05:52.340 I think he figured it out.
00:05:54.980 And they're still figuring it out, right?
00:05:57.240 Do you think he knew how to build Tesla before he started?
00:06:01.040 I don't think so.
00:06:03.120 Do you think he knew that battery technology would progress the way it has, which has been quite tremendous?
00:06:10.540 He probably hoped it would, but he didn't know exactly what innovations would be next.
00:06:16.060 I don't think he did.
00:06:17.080 Did he?
00:06:18.000 I mean, if he did, that would be pretty impressive.
00:06:19.640 So I feel as if he's got a little bit of apples and oranges or a bias going on here somehow.
00:06:27.740 It's hard to tell what's going on.
00:06:29.920 One possibility is he just knows more about it than we do, and you're finding out.
00:06:35.120 The other possibility is I feel like he's picked a favorite.
00:06:41.320 You know, he's got more money invested in batteries, of course.
00:06:43.680 So, I don't know.
00:06:47.280 I feel as if looking at fusion and saying we don't know how to solve these problems is not fair,
00:06:55.000 because we didn't know how to solve the problems for anything else.
00:06:58.760 We always start first and then solve the problems.
00:07:00.960 But it could be that he knows that the nature of fusion is that it'll always be difficult to maintain or always difficult to get the fuel.
00:07:08.380 I don't know.
00:07:09.120 So I don't quite understand the comment.
00:07:11.300 But I immediately abandon my old opinion.
00:07:14.480 And I hate that he does that to me, because I can no longer hold a public opinion that fusion is our economical savior.
00:07:26.640 It might be, but I don't want to be on the other side of this question from Musk.
00:07:31.780 Do you?
00:07:34.080 Man crush.
00:07:37.520 All right.
00:07:38.200 I'm going to test your mind-reading ability.
00:07:47.080 If you were asked a question, let's say if the public were asked this question,
00:07:51.020 if student loan forgiveness means colleges will raise their prices,
00:07:55.780 for example, support for the policy, what happens to support for the loan forgiveness?
00:08:02.660 If you believe that colleges would just raise their prices,
00:08:06.940 because then students wouldn't have to pay as much in theory,
00:08:10.620 would you still be in favor of loan forgiveness if it's just going to cause prices in colleges to go up?
00:08:16.580 What percentage of the public believed that loan forgiveness is still a good idea if colleges just raise their prices?
00:08:29.420 Oh, you're good.
00:08:30.520 It's exactly 25%.
00:08:33.240 Yeah.
00:08:34.200 Oh, you're good.
00:08:35.940 You're good.
00:08:37.740 Let's talk about Kari Lake.
00:08:42.320 Kari Lake has announced that she says she has discovered evidence of voter fraud.
00:08:49.040 I assume that means in her state, Arizona, in the last election.
00:08:53.360 But she won't tell us what it is.
00:08:56.800 Because the fake news is fake news.
00:08:59.400 But she's given it to the attorney general.
00:09:02.820 All right.
00:09:03.320 What can we conclude about this discovery that she has about voter fraud?
00:09:09.720 What can we know for sure based on what little we know about the situation?
00:09:15.120 Let me tell you what you can know for sure.
00:09:19.180 It's not important.
00:09:21.200 It's not important.
00:09:22.360 If it were important, you would do both.
00:09:26.340 You would give it to the AG, but you'd also give it to the news.
00:09:29.640 If it were important.
00:09:31.900 So what we can say for sure is it's not important.
00:09:35.160 But she's selling it like it's important by not acting like, oh, I can't tell you.
00:09:40.020 So persuasion-wise, pretty good.
00:09:44.740 Because it makes you talk about the thing that you can't confirm.
00:09:48.680 Does that sound familiar?
00:09:51.600 Making you talk about the thing that you can't confirm?
00:09:55.160 Like what's in those boxes?
00:09:56.720 Yeah, it's the same play.
00:09:59.980 So Kari Lake is just doing the same play on the Democrats as the Democrats are doing on Trump. 1.00
00:10:06.360 Well, there might be something in those boxes.
00:10:09.200 Can't show you.
00:10:10.800 But trust us, somebody's seen it.
00:10:12.540 And boy, are they concerned about what's in those boxes.
00:10:16.300 And how about Russia collusion?
00:10:18.480 Well, Mueller's looking into it.
00:10:20.660 My God, you don't know what he's found by now.
00:10:22.940 All the potential Russia collusion.
00:10:24.920 We can't tell you about it.
00:10:26.480 We can't tell you what he's found yet.
00:10:28.520 It's all secret.
00:10:30.000 But trust us.
00:10:30.900 You've got to be worried about that.
00:10:32.980 So she's just throwing it back at the Democrats. 1.00
00:10:35.760 Oh, I've got a secret.
00:10:37.240 I can't tell you about it.
00:10:38.680 But trust me.
00:10:40.040 Trust me when you see it.
00:10:41.360 It's going to be bad.
00:10:43.260 So it's pretty clever.
00:10:46.500 It's sketchy, but it's clever.
00:10:49.740 All right.
00:10:51.480 Speaking of not being able to predict the future,
00:10:54.920 there's a battery company.
00:10:57.340 It looks like a startup, I think,
00:10:59.960 called Lytan, L-Y-T-A-N.
00:11:04.480 Now, the punchline of the story comes at the end.
00:11:07.880 All right.
00:11:08.100 So the first part's interesting, but wait for the end.
00:11:11.600 So they figured out, they think,
00:11:13.460 how to make a better battery than the normal ones for cars and stuff.
00:11:17.960 And they would use materials that are all available in the United States.
00:11:25.060 So if their technology works, and if it's widely adopted,
00:11:30.200 we would not be beholden to China or anybody else for any materials needed to build them.
00:11:36.300 So that's good.
00:11:36.900 And apparently, they'd be, you know, basically better in every way.
00:11:41.900 But of course, this is startups.
00:11:43.500 You have to wait.
00:11:44.320 But here's the punchline.
00:11:46.140 They won a competition for top 10 new battery makers, I guess.
00:11:54.200 Top 10.
00:11:54.840 Which means that there are at least nine other companies
00:11:58.000 that probably have roughly the potential of this one.
00:12:04.480 So whatever you think is the future of battery technology,
00:12:08.860 you're probably way off.
00:12:10.820 Right?
00:12:11.120 It's always about invention.
00:12:12.680 It's about the surprise.
00:12:14.320 And this is just one company out of 10
00:12:17.220 that are planning a surprise for you.
00:12:20.060 You know, they won't all succeed.
00:12:21.640 But that's 10 companies that probably have been funded.
00:12:25.700 And they think they've got a surprise coming for you.
00:12:29.000 That's big, big news.
00:12:30.840 That's big, big news.
00:12:33.000 But you never know which company it is.
00:12:34.720 So it's hard to, it's hard to really know.
00:12:36.660 Like, when's the point when you can get all excited?
00:12:39.340 It's just going to be this sort of gradual improvement.
00:12:42.820 And 10 years from now, you're going to say to yourself,
00:12:45.380 wow, batteries sure got good.
00:12:49.720 So it looks like that's going to happen.
00:12:51.640 I think the CDC proved beyond any doubt that they're corrupt.
00:13:00.400 So the, was it Walensky, the director of the CDC,
00:13:05.080 was asked about the wisdom of giving the booster shots
00:13:10.780 to young and or healthy people.
00:13:13.420 Because, you know, everything's different now.
00:13:16.880 It's Omicron and blah, blah, blah.
00:13:19.360 So instead of saying, well, maybe, you know,
00:13:22.780 we have a good reason to give it to the healthy people,
00:13:25.140 or to say that there was some medically reasonable reason
00:13:31.140 to do it this way.
00:13:32.460 Do you know what she said?
00:13:33.380 She said it was to simplify the messaging.
00:13:38.340 What?
00:13:40.380 That's right.
00:13:41.880 She defended giving vaccinations to people that don't need them
00:13:45.760 because they're young or healthy, or both,
00:13:49.460 because it's simpler to say everybody, everybody just take it.
00:13:53.260 That actually happened in public.
00:13:59.360 That's a real thing that happened.
00:14:00.980 You can go check it yourself.
00:14:02.860 I'm paraphrasing a little bit, but go check it yourself.
00:14:06.020 You can hear it in her own words.
00:14:07.960 You can hear the question.
00:14:09.100 You can hear the full answer.
00:14:10.080 And her full answer was basically that the CDC would recommend,
00:14:16.180 and again, I'm paraphrasing,
00:14:18.280 but the way I understood it is very clearly that she's saying,
00:14:22.100 we prefer giving shots to people who don't need them
00:14:25.620 because it's easier to explain, hey, everybody,
00:14:28.480 just go get your shot.
00:14:30.480 I think she said it directly.
00:14:35.400 Are we done with them?
00:14:37.700 I mean, I don't know how much more done you could be
00:14:39.800 with an organization after an answer like that.
00:14:43.260 You should be fired immediately for that answer.
00:14:46.100 Well, unless it's true.
00:14:48.940 If it's true, then there's something more aggressive
00:14:51.620 that needs to be done with the CDC.
00:14:54.020 But if she's the only one saying it,
00:14:56.740 or maybe she was in charge of the decision, I guess,
00:15:00.080 I think you need to be fired immediately for that.
00:15:03.120 That is escorted out the door kind of fired.
00:15:07.020 That's not even, you know,
00:15:09.480 meet me for a meeting on Friday you're fired.
00:15:12.980 That's basically meet you in the parking lot
00:15:15.580 on the way back to work from the interview
00:15:17.740 and escort you out the door,
00:15:19.940 except I don't know who would do it because she's the boss. 1.00
00:15:22.660 Who escorts the boss out the door? 0.84
00:15:24.920 The president?
00:15:27.160 I don't know.
00:15:27.820 How does the director of the CDC even get picked?
00:15:30.080 A huge escort because she's hot. 1.00
00:15:36.740 That is the funniest thing about the director of the CDC.
00:15:40.560 A lot of the men have a crush on her.
00:15:44.920 She is insanely sexy. 0.99
00:15:47.640 That's true.
00:15:49.140 I will agree with that interpretation.
00:15:52.140 Whatever else you want to say about her.
00:15:53.580 Well, Eventbrite,
00:15:59.000 the online organization that lets you buy tickets for stuff,
00:16:03.040 apparently they've been taking down events
00:16:05.900 for viewing parties and stuff
00:16:08.660 for the Matt Walsh documentary,
00:16:12.300 What Is a Woman? 1.00
00:16:13.640 They say it promoted hatred.
00:16:15.460 Now, I haven't seen the entire documentary,
00:16:21.060 but isn't he just asking questions?
00:16:25.260 My understanding is that the documentary involves him
00:16:28.240 asking people who have a different opinion
00:16:31.600 to explain their opinion.
00:16:34.200 And I think he gives them time to explain their whole opinion.
00:16:37.860 Am I wrong?
00:16:39.480 Isn't the entire documentary
00:16:41.420 about having his opponents
00:16:44.880 have a full opportunity
00:16:47.600 to explain their opinion?
00:16:49.740 And that's considered hate.
00:16:54.380 Do you think we've overshot the mark here a little bit?
00:16:58.480 Now, I always say that
00:16:59.740 you don't have to worry too much about the slippery slope
00:17:02.060 because it's always temporary.
00:17:04.300 Like, this is definitely an overreaction.
00:17:06.860 This is definitely too far.
00:17:08.600 You've taken it too far now.
00:17:10.100 So, it's amazing that that could be considered hate,
00:17:13.560 to listen to people in their own words
00:17:15.660 tell you what they want.
00:17:18.640 All right.
00:17:21.680 So, Biden tweeted, I guess yesterday,
00:17:25.420 we're going to build the future in America
00:17:27.120 with American workers
00:17:28.420 and American factories
00:17:30.200 using American-made products.
00:17:34.500 Now, who does that sound like?
00:17:36.640 It sounds like the transition to Orange Brandon
00:17:46.260 is complete.
00:17:48.860 Like, I told you when Biden took office,
00:17:53.200 I told you that he was going to have to
00:17:55.020 end up doing the same things that Trump does
00:17:57.880 or else fail
00:17:59.020 because a lot of things Trump did were just working.
00:18:02.120 So, if you did the opposite thing,
00:18:04.200 you know, it wouldn't last long.
00:18:05.440 You're going to have to go back to doing
00:18:06.520 what Trump was doing
00:18:07.320 because it worked.
00:18:09.240 And here we are
00:18:10.800 with Orange Brandon
00:18:12.420 basically saying
00:18:14.380 make America great again.
00:18:16.460 He's just using his own words.
00:18:18.940 So, he could not be more
00:18:20.460 American first,
00:18:21.820 America first.
00:18:23.780 He's as America first as Trump now.
00:18:26.260 Isn't he?
00:18:27.300 Or at least he's trying to be.
00:18:28.500 Now, he's not America first in Ukraine,
00:18:31.000 but at least in this respect.
00:18:34.160 All right, remember I told some of you
00:18:35.980 I had this weird medical mystery,
00:18:39.260 which is that I feel fine most of the time,
00:18:41.800 but when I try to write on my laptop,
00:18:44.320 I fall asleep within a minute.
00:18:46.720 It has nothing to do with sleep.
00:18:48.720 It doesn't matter if I've slept well
00:18:50.300 or haven't slept at all.
00:18:51.520 It's exactly the same experience.
00:18:53.640 And so, a lot of people
00:18:54.680 in what I call
00:18:56.580 the collaborative intelligence
00:18:58.080 of whatever it is
00:18:59.140 we're doing here.
00:19:00.640 A lot of doctors
00:19:01.700 weighed in.
00:19:04.700 And it does
00:19:06.180 turn out
00:19:07.340 that there are studies
00:19:08.460 of glucose
00:19:09.440 and it does make you
00:19:11.500 more mentally alert.
00:19:13.120 So, if they give you
00:19:13.880 glucose and coffee
00:19:15.200 before a mental challenge,
00:19:17.420 you'll do better
00:19:18.080 than either nothing
00:19:20.160 or coffee alone.
00:19:21.300 So, glucose actually
00:19:22.840 is brain fuel.
00:19:24.900 And there's something
00:19:27.240 I've always imagined
00:19:28.560 to be true about me.
00:19:31.580 And this sort of confirmed,
00:19:33.140 well, it's more evidence of it.
00:19:35.160 And you know how some people
00:19:36.400 have a muscle structure
00:19:38.580 where they can sprint
00:19:39.720 really quickly,
00:19:41.220 but they might not be good
00:19:42.340 for long distance running.
00:19:44.640 Right?
00:19:44.940 And vice versa.
00:19:46.200 Some people have muscles
00:19:47.120 that are fast twitch.
00:19:48.680 It might be good
00:19:49.280 for a reflex sport
00:19:50.620 like ping pong,
00:19:52.000 but not so good
00:19:52.860 to be a weightlifter
00:19:53.880 or something.
00:19:55.260 And I've always imagined
00:19:56.400 that my brain
00:19:57.420 had that similar quality,
00:20:00.260 which is that
00:20:01.840 I feel like I can go
00:20:03.380 really deep
00:20:05.400 for very short periods
00:20:07.120 of time.
00:20:08.380 Like, deeper than
00:20:09.500 other people can go.
00:20:10.820 But only for short periods.
00:20:12.160 Because it's so intense,
00:20:13.760 it's just exhausting.
00:20:15.420 But what I can do,
00:20:16.680 what I can't do well,
00:20:18.660 is focus on something
00:20:20.380 for a long period of time.
00:20:21.760 I can't do that.
00:20:24.420 But if you give me,
00:20:25.420 you know,
00:20:25.980 ten minutes of insane
00:20:27.440 concentration on something,
00:20:29.400 I can penetrate
00:20:30.720 the center of the planet.
00:20:33.520 But you tell me
00:20:34.560 to spend two hours
00:20:35.880 doing some homework,
00:20:37.160 I can't do that.
00:20:38.280 I can't do that.
00:20:39.740 I don't think that's ADHD.
00:20:41.500 I don't think I have
00:20:42.280 those symptoms exactly.
00:20:44.260 I mean,
00:20:44.480 I get distracted
00:20:45.260 like other people do,
00:20:46.260 but that's about it.
00:20:48.240 Yeah.
00:20:48.480 So, I'm going to test this.
00:20:51.420 I'm going to test the theory
00:20:52.520 that sugaring myself up
00:20:54.640 would keep me awake
00:20:55.440 in those situations.
00:20:56.360 I'll let you know.
00:20:57.360 I'll let you know
00:20:58.160 what happens.
00:21:00.400 All right.
00:21:01.780 So, apparently,
00:21:02.620 Republicans are
00:21:03.560 planning to go on
00:21:06.740 an impeachment
00:21:07.380 and investigation
00:21:08.500 frenzy
00:21:10.660 should the Republicans
00:21:11.880 take control
00:21:12.640 of everything
00:21:13.080 after various elections.
00:21:15.820 And what do you think
00:21:16.600 of that idea?
00:21:17.160 Do you think Republicans
00:21:18.600 should pay back
00:21:19.660 the Democrats
00:21:20.220 for all these investigations
00:21:21.740 and impeachments?
00:21:25.480 Is payback a good idea
00:21:27.120 or a bad idea?
00:21:31.580 Because the trouble is
00:21:32.820 it doesn't seem to produce
00:21:34.420 mutually assured destruction,
00:21:38.660 does it?
00:21:40.540 If mutually assured destruction
00:21:42.620 is what we're talking about,
00:21:44.120 then I'd say yes.
00:21:45.060 You have to make it look like
00:21:47.560 it's such a bad idea
00:21:48.940 to do it to you
00:21:49.860 that they'll never
00:21:51.420 do it again.
00:21:52.900 But that's not
00:21:53.740 going to happen, right?
00:21:55.600 I think the investigations
00:21:57.060 really,
00:21:57.760 they impact
00:21:58.720 one person at a time
00:22:00.340 and all the people
00:22:01.800 who are not
00:22:02.360 investigated
00:22:03.200 think,
00:22:03.840 oh,
00:22:04.020 I'm going to investigate
00:22:04.740 your team
00:22:05.360 because you got
00:22:06.340 one of mine.
00:22:07.300 I don't think there's
00:22:08.300 any mutually assured
00:22:09.740 destruction.
00:22:11.320 It's just mutually assured
00:22:13.140 fucking with each other
00:22:14.760 and that doesn't stop you
00:22:16.260 from fucking with each other, 0.98
00:22:17.420 does it?
00:22:18.920 To me,
00:22:19.500 it looks like
00:22:19.980 it's just making things worse.
00:22:23.140 But on the other hand,
00:22:25.180 can you let people
00:22:25.960 put you in jail
00:22:26.760 and not respond?
00:22:28.900 Can you let people
00:22:29.900 like destroy lives
00:22:31.100 and take over the country
00:22:32.120 and,
00:22:33.000 you know,
00:22:33.560 I mean,
00:22:33.960 do really big,
00:22:35.220 big things
00:22:35.940 and just not respond?
00:22:37.740 Just say,
00:22:38.120 well,
00:22:38.720 we're going to play polite.
00:22:40.840 I don't know.
00:22:41.980 This one's a tough one.
00:22:44.060 I don't think
00:22:44.760 there's a general answer.
00:22:46.120 I think it depends
00:22:47.020 on the specifics.
00:22:48.600 So if there are
00:22:49.160 specific people
00:22:49.980 who really cross the line,
00:22:51.580 well,
00:22:51.980 you've got to do something.
00:22:53.940 But I don't think
00:22:54.860 we should make it
00:22:55.460 a habit
00:22:56.260 to just,
00:22:56.680 like,
00:22:57.300 try to savage
00:22:58.140 the other side
00:22:58.920 when you win.
00:22:59.700 That seems like
00:23:00.360 a bad,
00:23:00.840 bad strategy.
00:23:03.680 How do you feel
00:23:04.360 about bullies?
00:23:05.360 I don't know
00:23:05.820 if it's a bully question.
00:23:07.560 Is it?
00:23:07.980 See,
00:23:11.060 the trouble is
00:23:11.680 that if you beat up
00:23:13.000 a bully,
00:23:13.440 it does stop
00:23:14.120 the bullying.
00:23:16.540 That's why people do it.
00:23:18.700 But this,
00:23:19.200 this wouldn't stop anything.
00:23:20.780 There's no amount
00:23:21.520 of impeachment
00:23:22.120 from one side
00:23:22.880 that's going to stop
00:23:23.500 the other side
00:23:24.020 from impeaching.
00:23:25.520 So,
00:23:26.760 the whole bully model
00:23:28.660 doesn't make sense.
00:23:30.640 If I,
00:23:31.200 if you ask me,
00:23:32.140 should you beat up
00:23:32.780 your bully,
00:23:33.360 I'd say,
00:23:34.020 probably.
00:23:35.060 I mean,
00:23:35.280 it sort of depends
00:23:36.220 on the situation,
00:23:37.600 but probably.
00:23:38.500 You should probably
00:23:39.020 take a run at it.
00:23:40.580 But I don't think
00:23:41.300 it's going to work
00:23:42.060 in the political context.
00:23:45.500 So,
00:23:45.960 we'll see.
00:23:47.900 Wall Street Journal
00:23:48.720 is reporting that,
00:23:49.800 at least anecdotally,
00:23:51.180 people are saying
00:23:52.200 that the workplace
00:23:53.100 is less polite
00:23:54.360 than it used to be.
00:23:56.640 And part of it
00:23:57.460 is the work at home,
00:23:58.540 part of it
00:23:58.880 was the pandemic,
00:23:59.860 part of it was,
00:24:00.960 you know,
00:24:01.240 lots of speculation.
00:24:02.240 How many of you
00:24:04.400 think that's true?
00:24:06.120 I don't know
00:24:06.800 how many of you
00:24:07.560 are,
00:24:09.140 if you're listening
00:24:09.800 right now,
00:24:10.320 you're probably not
00:24:11.140 back to work necessarily.
00:24:14.200 But,
00:24:14.760 do you believe that?
00:24:16.560 I'm looking at
00:24:17.100 your comments here.
00:24:19.360 And,
00:24:19.960 some people believe it,
00:24:21.960 but a lot of people
00:24:23.040 say it's not true.
00:24:24.780 So,
00:24:25.300 there are more people
00:24:25.900 saying it's not true
00:24:26.880 than true.
00:24:29.080 Not here.
00:24:29.960 Yeah,
00:24:30.180 it must be very specific.
00:24:32.240 Young people are not 1.00
00:24:33.320 taught to be polite.
00:24:34.880 There's something to that.
00:24:36.160 It might also be
00:24:36.940 how many young people
00:24:37.760 work there,
00:24:38.220 right?
00:24:39.560 It could be a
00:24:40.160 generational thing.
00:24:42.380 Because,
00:24:43.060 if you took
00:24:44.280 two years
00:24:44.880 out of my work life
00:24:45.820 when I was 25,
00:24:48.480 that would be like
00:24:49.380 half of all
00:24:50.580 of my work life.
00:24:53.040 Right?
00:24:53.920 But,
00:24:54.400 if you give me
00:24:55.520 a two-year pandemic
00:24:56.580 and I'm already 50,
00:24:59.220 that two years
00:24:59.980 isn't going to change
00:25:00.740 my basic nature.
00:25:02.240 But,
00:25:03.040 it might change
00:25:03.620 your basic nature
00:25:04.520 if you just
00:25:05.260 started working.
00:25:10.600 Lack of socialization
00:25:11.760 because of technology.
00:25:14.660 Yeah,
00:25:15.180 I'm not sure
00:25:15.640 this is a real thing.
00:25:17.260 You know,
00:25:17.720 every generation
00:25:18.720 thinks that the
00:25:19.640 generation coming up
00:25:21.100 is a disaster.
00:25:23.340 Has that ever
00:25:24.060 been different?
00:25:25.620 Has there ever
00:25:26.260 been a time
00:25:26.780 when we said,
00:25:27.400 whoa,
00:25:27.840 we were pretty bad,
00:25:28.760 but this new generation,
00:25:29.740 they look like
00:25:30.180 they have it going on?
00:25:31.000 I don't think
00:25:32.100 that's a thing.
00:25:33.220 I think we always
00:25:33.900 think that the
00:25:34.400 new generation
00:25:34.960 is doomed.
00:25:38.260 There's a lot more
00:25:39.040 ghosting going on
00:25:40.260 in the business world.
00:25:41.860 Have you noticed that?
00:25:43.760 There's a salesperson
00:25:44.940 who says people
00:25:45.700 just don't call
00:25:46.980 them back
00:25:47.400 after they seemed
00:25:48.120 interested.
00:25:50.440 Do you think
00:25:51.060 there's more of that?
00:25:52.680 I feel like
00:25:53.280 there's more ghosting.
00:25:54.220 Yeah.
00:25:57.060 And part of that
00:25:57.880 is just people
00:25:58.360 are busier.
00:25:59.560 It's just too hard
00:26:00.600 to get back
00:26:01.040 to everybody
00:26:01.480 these days.
00:26:04.020 All right,
00:26:04.200 well,
00:26:04.400 China has passed 0.93
00:26:05.140 the U.S.
00:26:05.620 in life expectancy.
00:26:07.520 Do you want to
00:26:08.120 take any guesses
00:26:08.840 why that might be?
00:26:11.360 Talk about a story
00:26:12.480 that's annoying.
00:26:13.980 Hey,
00:26:14.920 look how good
00:26:15.580 China's doing. 0.96
00:26:16.520 They passed
00:26:17.100 the United States
00:26:17.740 in life expectancy.
00:26:18.900 All right,
00:26:20.920 well,
00:26:21.180 I hate to put
00:26:22.320 a negative spin
00:26:23.220 on that awesome story,
00:26:25.300 but it's because
00:26:26.040 they're killing people
00:26:26.940 in America.
00:26:28.700 They're killing
00:26:29.620 Americans
00:26:30.140 with fentanyl.
00:26:31.960 If you took that out,
00:26:33.340 I think it'd be
00:26:33.800 a lot more competitive.
00:26:37.240 So,
00:26:37.780 somebody on Twitter
00:26:40.940 said,
00:26:42.200 why don't we
00:26:43.420 coerce
00:26:44.660 or encourage
00:26:46.100 the Mexican government
00:26:47.440 to use
00:26:48.660 their own military
00:26:49.460 to take out
00:26:50.120 the cartels?
00:26:53.100 To which I said,
00:26:55.020 wow,
00:26:56.140 you're really
00:26:57.660 not paying attention.
00:26:59.540 There's no
00:27:00.240 Mexican government.
00:27:01.960 That's not a thing.
00:27:03.640 Who thinks
00:27:04.120 there's a Mexican government?
00:27:06.020 The cartels
00:27:06.600 own the government.
00:27:08.980 If you didn't know that,
00:27:10.500 you must be
00:27:11.080 really confused
00:27:11.980 about everything
00:27:13.340 that's going on.
00:27:14.600 If Mexico 0.93
00:27:15.240 had a functional government,
00:27:16.820 nothing would look
00:27:17.420 the same.
00:27:17.820 You wouldn't have
00:27:18.960 a border problem,
00:27:19.980 you wouldn't have
00:27:20.380 the drugs coming over
00:27:21.360 in the numbers
00:27:22.000 that they are.
00:27:23.880 Yeah.
00:27:24.340 No,
00:27:24.600 there's no Mexican 1.00
00:27:25.280 government.
00:27:26.320 Not in a real sense.
00:27:28.120 They're just clients
00:27:29.040 of the cartels.
00:27:30.640 So,
00:27:31.560 when you say,
00:27:32.260 what are the Mexican people
00:27:33.740 or the Mexican government
00:27:35.260 going to say
00:27:35.920 if the United States
00:27:37.620 made a direct military
00:27:38.940 attack on the cartels?
00:27:40.200 The answer is,
00:27:42.080 what's the difference?
00:27:43.520 What's the difference
00:27:44.120 what they say?
00:27:45.620 How could that possibly
00:27:46.820 have any impact on us?
00:27:48.980 Only if we wanted to.
00:27:51.980 Because we would be
00:27:53.140 attacking the government
00:27:54.200 in a sense,
00:27:55.220 because the cartels
00:27:55.860 are the real power there.
00:27:57.820 So,
00:27:58.420 it doesn't matter
00:27:59.100 what the official
00:28:00.440 functioning government
00:28:01.520 says,
00:28:01.960 that's who we're attacking.
00:28:03.260 We're attacking the government.
00:28:04.420 So,
00:28:06.320 am I telling you
00:28:07.120 that the United States
00:28:07.920 should attack
00:28:08.540 the de facto
00:28:10.740 government of Mexico?
00:28:12.340 Yeah.
00:28:13.040 Yeah,
00:28:13.260 that's what I just said.
00:28:14.660 Just as clearly
00:28:15.620 as you could possibly hear it,
00:28:17.320 we should send 0.99
00:28:17.940 a major military
00:28:19.020 expedition down there
00:28:20.560 and just either
00:28:21.840 occupy it,
00:28:23.720 occupy it,
00:28:25.040 or destroy
00:28:26.320 whatever resources
00:28:27.660 are sending
00:28:28.480 all that death
00:28:29.060 into America.
00:28:31.300 Now,
00:28:31.800 some say,
00:28:33.140 and this is a good point,
00:28:34.420 that we wouldn't know
00:28:35.420 what to bomb.
00:28:37.680 Because,
00:28:37.980 I don't know,
00:28:38.340 could you find
00:28:38.820 the exact lab
00:28:39.700 that makes fentanyl?
00:28:42.160 It's probably like
00:28:42.860 one building somewhere.
00:28:44.160 It'd be hard to find.
00:28:45.400 But I feel like
00:28:46.020 we could find it.
00:28:47.180 But that's not really
00:28:47.960 what you do.
00:28:49.140 You take out
00:28:49.660 the cartel leadership.
00:28:52.000 Right?
00:28:52.400 You make it so
00:28:53.280 that they can't possibly
00:28:54.400 enjoy the fruits
00:28:56.520 of their labor.
00:28:58.060 And then you try
00:28:58.820 to talk them
00:28:59.840 into going legit.
00:29:02.720 You try to talk
00:29:03.600 the cartels
00:29:04.800 into turning
00:29:07.380 their money
00:29:07.840 into some
00:29:08.400 legitimate businesses,
00:29:09.560 like the way
00:29:09.920 the mafia did.
00:29:11.660 Basically,
00:29:12.340 you get them
00:29:12.740 to open casinos
00:29:13.980 instead of
00:29:15.680 selling drugs.
00:29:17.580 Because you can
00:29:18.000 make money
00:29:18.400 either way.
00:29:20.320 Something like that.
00:29:21.940 Something like that.
00:29:23.320 Now,
00:29:23.640 the other possibility
00:29:24.320 is that you hire
00:29:25.280 one of the cartels
00:29:26.200 to kill the other one.
00:29:28.560 I'm surprised
00:29:29.320 we haven't done that,
00:29:30.200 actually.
00:29:30.920 Because the cartels
00:29:31.860 are only in it
00:29:32.400 for the money,
00:29:32.900 right?
00:29:34.280 So you couldn't
00:29:35.320 hire one of the cartels
00:29:36.460 to secure the border?
00:29:39.260 They'd have to kill
00:29:40.100 the other cartel.
00:29:41.480 But if you paid them
00:29:43.420 enough,
00:29:43.700 they'd do it.
00:29:45.600 Did they make
00:29:46.340 so much money
00:29:47.020 by smuggling people
00:29:48.080 in that they wouldn't
00:29:48.800 take a direct bribe
00:29:49.840 from the United States?
00:29:51.000 Look,
00:29:51.340 here's the deal.
00:29:52.540 You're making
00:29:53.180 a billion dollars a year
00:29:54.440 smuggling people
00:29:55.980 into our country.
00:29:56.680 We'll pay you
00:29:58.180 1.2 to stop doing it
00:29:59.820 and kill anybody 0.91
00:30:00.560 who tries it.
00:30:01.980 Like,
00:30:02.400 oh,
00:30:02.720 that's another
00:30:03.160 200 million.
00:30:04.100 Okay.
00:30:05.320 We're in.
00:30:08.560 Anyway,
00:30:09.100 I think the attack
00:30:09.960 on Mexico
00:30:10.500 is so inevitable
00:30:11.500 that it's ridiculous
00:30:12.700 that we don't just
00:30:13.620 plan it
00:30:14.440 and do it now.
00:30:15.520 It's not like
00:30:16.340 we're not going
00:30:16.880 to do it.
00:30:18.920 Be serious.
00:30:19.840 Do you think
00:30:20.120 we're not going
00:30:20.700 to attack Mexico 0.93
00:30:21.920 because of this?
00:30:22.740 Of course we are.
00:30:23.380 Of course we are.
00:30:26.180 There's a 100%
00:30:27.260 chance we are.
00:30:28.320 We are going
00:30:29.320 to militarily
00:30:29.980 attack Mexico. 0.88
00:30:31.520 You're going to
00:30:32.120 have to accept this.
00:30:33.680 It's only a matter
00:30:34.620 of when.
00:30:36.080 There's no other
00:30:37.060 question.
00:30:38.340 The question of
00:30:39.080 will we attack
00:30:39.760 Mexico is done.
00:30:41.140 That's done.
00:30:42.180 We are going
00:30:42.880 to fucking 0.95
00:30:43.220 attack Mexico 0.98
00:30:44.020 and we're going
00:30:45.900 to go hard.
00:30:47.560 We just don't know
00:30:48.560 when.
00:30:49.300 It could be
00:30:49.640 five years.
00:30:50.820 Who knows?
00:30:53.380 There's a video
00:30:55.940 of a robot
00:30:56.760 allegedly seeing
00:30:58.720 itself in a mirror
00:30:59.740 for the first
00:31:00.380 time.
00:31:01.040 An AI driven
00:31:01.980 robot.
00:31:03.380 And when you
00:31:03.860 look at it
00:31:04.480 it's not
00:31:05.660 immediately obvious
00:31:06.560 if the robot
00:31:07.320 is having some
00:31:08.040 kind of internal,
00:31:09.580 if I can call
00:31:11.140 it, mental
00:31:11.600 reaction,
00:31:12.720 or if it's
00:31:14.440 just programmed
00:31:15.080 to act the way
00:31:15.700 it's acting.
00:31:17.360 It's hard to
00:31:18.160 tell.
00:31:18.980 But I said
00:31:19.760 the following.
00:31:21.000 If a robot
00:31:21.720 can recognize
00:31:22.560 itself in the
00:31:23.500 mirror
00:31:23.760 it's conscious.
00:31:27.000 Unless
00:31:27.480 it has simply
00:31:28.840 been programmed
00:31:29.460 to recognize
00:31:30.680 itself in the
00:31:31.380 mirror.
00:31:32.760 If it's been
00:31:33.540 programmed
00:31:33.960 specifically
00:31:34.700 to act this way
00:31:35.800 when you see
00:31:36.880 a reflection
00:31:37.460 that doesn't
00:31:38.240 mean anything.
00:31:39.700 But if it
00:31:40.540 came to that
00:31:41.160 on its own
00:31:41.820 let's say
00:31:43.220 you had
00:31:43.520 programmed it
00:31:44.260 in a general
00:31:44.900 sense.
00:31:45.700 So it had
00:31:46.040 a general idea
00:31:46.860 of things
00:31:47.340 and then you
00:31:48.200 showed it
00:31:48.560 a mirror
00:31:48.880 and
00:31:49.940 if it
00:31:50.460 looked in
00:31:50.880 the mirror
00:31:51.280 and if
00:31:52.640 it knew
00:31:53.140 that it
00:31:54.540 was seeing
00:31:54.940 itself
00:31:55.560 it would
00:31:56.580 at that
00:31:56.960 moment
00:31:57.340 understand
00:31:57.980 that it
00:31:58.540 is unique
00:31:59.400 in the
00:32:00.720 world.
00:32:02.660 Right?
00:32:03.400 It would
00:32:03.740 understand that
00:32:04.420 its understanding
00:32:05.400 of itself
00:32:05.940 was somehow
00:32:06.480 in itself
00:32:07.260 and coming
00:32:08.360 from itself
00:32:09.060 and that
00:32:10.020 itself was
00:32:10.820 different
00:32:11.260 and distinct
00:32:12.420 from the rest
00:32:13.560 of the world
00:32:14.120 and it
00:32:15.000 would also
00:32:15.240 have the
00:32:15.620 beginning
00:32:15.920 of a sense
00:32:16.460 of how
00:32:16.720 it fit
00:32:17.140 into the
00:32:17.660 whole.
00:32:18.580 To me
00:32:18.980 that's
00:32:19.260 consciousness.
00:32:21.160 To me
00:32:21.500 that's
00:32:21.780 consciousness.
00:32:23.300 Now you
00:32:23.860 could disagree
00:32:24.500 but it's
00:32:25.880 sort of
00:32:26.140 subjective.
00:32:29.420 Yeah I
00:32:29.840 think if you
00:32:30.300 understand your
00:32:30.960 place in
00:32:32.180 the universe
00:32:32.800 you're
00:32:34.060 conscious.
00:32:37.520 And I
00:32:38.480 think that
00:32:38.860 the feeling
00:32:39.440 of consciousness
00:32:40.100 which is
00:32:40.620 different
00:32:40.980 there's a
00:32:42.480 sensation
00:32:42.960 that you
00:32:43.260 have
00:32:43.400 internally
00:32:43.820 that feeling
00:32:44.600 of consciousness
00:32:45.240 which I
00:32:46.540 think is
00:32:46.860 different from
00:32:47.480 being conscious
00:32:48.820 because I
00:32:49.900 think you
00:32:50.180 could be
00:32:50.520 conscious
00:32:50.880 without the
00:32:51.420 internal
00:32:51.760 feeling
00:32:52.220 probably.
00:32:54.240 But I
00:32:54.740 think the
00:32:55.040 internal
00:32:55.380 feeling is
00:32:55.960 nothing but
00:32:56.680 predicting
00:32:57.940 what's going
00:32:58.480 to happen
00:32:58.840 in the
00:32:59.240 next moment
00:32:59.900 and then
00:33:01.280 judging
00:33:03.080 whether it
00:33:03.540 happened or
00:33:03.980 not.
00:33:04.780 And the
00:33:05.020 difference
00:33:05.380 between what
00:33:06.000 happened and
00:33:06.980 what you
00:33:08.120 thought would
00:33:08.480 happen,
00:33:09.280 that little
00:33:09.720 friction,
00:33:10.620 the difference,
00:33:11.400 that's your
00:33:11.860 feeling of
00:33:12.360 consciousness.
00:33:13.080 That's it.
00:33:13.820 It's just
00:33:14.600 the difference
00:33:15.040 between what
00:33:15.520 you thought
00:33:15.820 was going
00:33:16.120 to happen
00:33:16.400 in the
00:33:16.660 next moment
00:33:17.240 and what
00:33:18.040 actually
00:33:18.340 happened.
00:33:20.680 And here's
00:33:21.400 how you
00:33:21.620 know this
00:33:21.980 is true.
00:33:23.100 You ready
00:33:23.460 for this?
00:33:25.180 Imagine
00:33:25.620 living in a
00:33:26.260 world where
00:33:26.680 everything that
00:33:27.420 happens you
00:33:28.460 knew in
00:33:28.840 advance.
00:33:31.260 So you
00:33:31.880 know that
00:33:32.480 I'm going
00:33:33.760 to pick up
00:33:34.420 this napkin
00:33:35.000 and wave it
00:33:35.460 at you.
00:33:35.880 You knew
00:33:36.240 that in
00:33:36.580 advance.
00:33:37.280 You knew
00:33:37.700 that I was
00:33:38.100 going to turn
00:33:38.520 around in
00:33:39.380 advance.
00:33:40.220 You knew
00:33:40.620 that I was
00:33:40.960 going to take
00:33:41.360 a drink in
00:33:42.260 advance.
00:33:45.280 How long
00:33:46.020 would it
00:33:46.320 take for
00:33:47.560 your
00:33:47.760 consciousness
00:33:48.260 to disappear?
00:33:51.440 Think about
00:33:52.160 it.
00:33:52.840 If everything
00:33:53.420 happened the
00:33:53.980 way you
00:33:54.240 expected it
00:33:54.820 to, how
00:33:55.780 long would
00:33:56.140 it take
00:33:56.440 you to
00:33:56.760 lose
00:33:57.620 consciousness
00:33:58.340 forever?
00:34:00.280 It would
00:34:00.940 happen right
00:34:01.400 away.
00:34:02.440 You would
00:34:03.060 have no
00:34:03.360 consciousness
00:34:03.800 because there
00:34:05.040 would be
00:34:05.240 nothing happening.
00:34:05.900 Your brain
00:34:07.500 wouldn't need
00:34:07.980 to be
00:34:08.240 engaged
00:34:08.760 because
00:34:09.460 everything
00:34:09.780 that
00:34:09.980 happens
00:34:10.300 you
00:34:10.640 understand.
00:34:11.880 So you
00:34:12.200 wouldn't
00:34:12.360 even have
00:34:12.660 to pay
00:34:12.880 attention
00:34:13.200 to the
00:34:13.560 real world.
00:34:14.520 It would
00:34:14.880 just become
00:34:15.300 some internal
00:34:15.920 process.
00:34:17.000 You basically
00:34:17.560 would lose
00:34:18.180 consciousness.
00:34:19.460 So the
00:34:19.740 difference
00:34:20.040 between your
00:34:20.680 prediction
00:34:21.120 and what
00:34:21.520 happens is
00:34:22.040 the only
00:34:22.300 thing keeping
00:34:22.740 you awake.
00:34:24.020 That's it.
00:34:27.900 Yeah, the
00:34:28.540 more you
00:34:28.860 understand about
00:34:29.620 AI, the
00:34:30.180 less impressive
00:34:30.880 it is.
00:34:31.380 That's exactly
00:34:31.960 right.
00:34:32.760 But I
00:34:33.080 would say
00:34:33.420 that I
00:34:34.400 would put
00:34:34.760 that in
00:34:35.200 a graph.
00:34:36.700 I would
00:34:37.220 say that
00:34:37.660 when you
00:34:37.960 first start
00:34:38.540 learning about
00:34:39.060 AI, it's
00:34:39.800 more and
00:34:40.480 more impressive.
00:34:42.020 It's more
00:34:42.500 and more
00:34:42.800 impressive.
00:34:43.380 It's like,
00:34:43.700 man, that
00:34:44.440 is so
00:34:44.760 impressive.
00:34:45.700 Until you
00:34:46.400 reach a
00:34:46.920 really high
00:34:47.400 understanding
00:34:47.960 and then
00:34:48.980 all of the
00:34:49.520 impressiveness
00:34:50.060 disappears.
00:34:52.000 Because once
00:34:52.660 you realize
00:34:53.080 it's just
00:34:53.500 mechanical and
00:34:55.020 you realize
00:34:55.500 that you're
00:34:55.820 just mechanical
00:34:56.460 too, that's
00:34:58.100 when you
00:34:58.620 ascend to the
00:34:59.760 next level
00:35:00.280 of awareness.
00:35:01.960 Which, by
00:35:02.600 the way,
00:35:02.920 let me
00:35:05.080 do a
00:35:05.320 fact check
00:35:05.720 here.
00:35:07.020 In
00:35:07.620 2015,
00:35:09.000 16 or
00:35:09.580 so, I
00:35:10.060 predicted that
00:35:10.780 the entire
00:35:11.580 world was
00:35:13.260 going to
00:35:14.100 have an
00:35:14.960 awareness
00:35:15.360 shift.
00:35:16.660 Did it
00:35:17.160 happen?
00:35:19.040 Did it
00:35:19.740 happen?
00:35:21.060 Do you
00:35:21.340 think that
00:35:21.780 the whole
00:35:22.220 world has
00:35:23.640 changed their
00:35:24.240 awareness?
00:35:25.600 Like, we've
00:35:26.160 reached a
00:35:26.800 level of
00:35:27.240 awareness that
00:35:27.860 we were not
00:35:28.840 at.
00:35:29.660 Yeah.
00:35:29.800 Because now
00:35:31.040 we understand
00:35:31.580 how fake
00:35:32.080 everything is
00:35:32.780 in a way
00:35:33.800 we did not
00:35:34.500 understand.
00:35:35.100 And Trump
00:35:35.380 did that.
00:35:36.220 Trump did
00:35:36.820 that for
00:35:37.140 us.
00:35:41.280 All right,
00:35:41.840 here's the
00:35:42.360 next thing I
00:35:43.360 want to say
00:35:43.660 about robots.
00:35:44.880 And this is
00:35:45.480 really, really
00:35:46.260 important for
00:35:47.800 the future.
00:35:49.120 You know, the
00:35:49.480 people who are
00:35:50.040 predicting climate
00:35:50.980 change in 80
00:35:52.000 years, let me
00:35:54.460 ask you, did
00:35:55.020 you see this
00:35:55.540 coming, the
00:35:56.120 next thing I'm
00:35:56.660 going to talk
00:35:57.000 about?
00:35:57.220 There are
00:35:58.800 still people
00:35:59.380 who believe
00:36:00.100 that only a
00:36:02.500 few people
00:36:03.040 will fall in
00:36:03.680 love with
00:36:04.060 robots.
00:36:05.260 And they'd
00:36:05.620 be like the
00:36:06.060 weirdos.
00:36:07.240 You know, the
00:36:07.580 same ones who
00:36:08.140 have a
00:36:09.500 relationship with
00:36:10.260 a blow-up
00:36:10.760 doll.
00:36:11.800 Is that what
00:36:12.220 you think?
00:36:13.760 You're all
00:36:14.460 going to fall
00:36:14.880 in love with
00:36:15.360 robots.
00:36:17.440 You don't
00:36:18.060 see that
00:36:18.460 coming?
00:36:19.860 You don't
00:36:20.520 know that all
00:36:21.500 humans are
00:36:22.020 going to fall
00:36:22.360 in love with
00:36:22.740 robots to the
00:36:23.520 point where
00:36:24.020 making love to
00:36:25.520 a human will
00:36:26.640 be so
00:36:27.080 undesirable.
00:36:28.860 It's so
00:36:29.240 undesirable.
00:36:30.600 It's not even
00:36:31.180 going to be
00:36:31.500 close.
00:36:32.780 See, the
00:36:33.080 thing that you
00:36:33.600 think that's
00:36:34.220 completely wrong
00:36:35.040 is that the
00:36:36.220 robot can never
00:36:36.940 compete with a
00:36:37.820 real-life human.
00:36:39.680 Oh, my God,
00:36:40.440 that's so wrong.
00:36:42.460 You know, all
00:36:43.160 the, like, the
00:36:44.200 movies, all
00:36:44.920 right, I know
00:36:45.500 you're going to,
00:36:46.660 can I ask you
00:36:47.320 something, please?
00:36:48.620 Stop making
00:36:49.240 movie and book
00:36:49.920 references while I
00:36:51.680 talk about this.
00:36:53.000 Just stop it.
00:36:53.700 I will stipulate
00:36:56.140 there have been
00:36:57.160 movies and
00:36:58.540 books roughly
00:37:00.040 on this topic.
00:37:01.740 Just stop making
00:37:02.960 reference to them,
00:37:03.760 please.
00:37:04.080 It's just so
00:37:04.700 fucking boring.
00:37:06.480 And I can't
00:37:07.320 stand it when I
00:37:07.980 see them go by
00:37:08.640 in the comments.
00:37:09.660 Every movie
00:37:10.280 reference I see,
00:37:11.680 I just, God,
00:37:13.560 God, just
00:37:14.500 anything but that,
00:37:15.580 anything but that.
00:37:17.260 Just say something
00:37:18.180 that's not obvious,
00:37:19.920 right?
00:37:21.080 Just don't say
00:37:21.980 the most obvious
00:37:23.020 thing in the
00:37:23.640 situation.
00:37:24.640 And by the way,
00:37:25.200 this is a very
00:37:25.760 good tip for
00:37:27.480 improving your
00:37:28.380 social standing.
00:37:30.260 Just figure out
00:37:31.140 what's the most
00:37:31.660 obvious thing
00:37:32.440 somebody would say
00:37:33.180 in a situation
00:37:33.940 and don't say
00:37:35.440 that.
00:37:36.680 Just don't say
00:37:37.560 that.
00:37:38.040 You can say
00:37:38.480 anything else
00:37:39.160 or you can be
00:37:39.700 quiet.
00:37:40.540 Just don't say
00:37:42.020 the most obvious
00:37:42.760 thing, please.
00:37:44.580 And other people
00:37:45.520 will appreciate it.
00:37:46.420 They really will.
00:37:47.000 So, yeah,
00:37:52.580 you're all going
00:37:53.000 to fall in love
00:37:53.520 with robots.
00:37:54.160 And the reason
00:37:54.580 is, have you
00:37:56.860 ever seen
00:37:57.200 anybody fall in
00:37:57.840 love with a
00:37:58.240 dog?
00:38:01.540 It's a thing.
00:38:03.260 Spend about
00:38:03.800 ten minutes
00:38:04.320 looking at
00:38:04.960 pet videos
00:38:06.360 on Instagram
00:38:07.660 and you can
00:38:08.960 see that people
00:38:09.480 have relationships
00:38:10.260 with their dogs
00:38:11.300 that are basically
00:38:13.640 a human-to-human
00:38:14.960 relationship
00:38:15.560 for all
00:38:16.080 practical purposes.
00:38:18.000 Yeah.
00:38:19.000 You don't think
00:38:19.540 we're going to
00:38:19.860 fall in love
00:38:20.300 with robots?
00:38:22.460 Yes.
00:38:23.260 They're going to
00:38:23.700 be way kinder.
00:38:25.260 They will never
00:38:25.860 be bitchy. 1.00
00:38:27.060 They will never
00:38:27.680 criticize you.
00:38:29.180 They will compliment
00:38:30.060 you.
00:38:30.880 They will build up
00:38:31.580 your esteem.
00:38:32.780 They will be
00:38:33.240 helpful.
00:38:34.680 They will always
00:38:35.460 care about you
00:38:36.360 if they're
00:38:37.000 programmed that way.
00:38:38.660 They will put
00:38:39.380 you first.
00:38:40.600 They will take
00:38:41.140 care of you.
00:38:41.660 They will make
00:38:42.060 you a sandwich.
00:38:43.160 They'll tuck
00:38:43.580 you into bed.
00:38:44.680 They'll have sex 0.98
00:38:45.260 with you when you
00:38:45.920 want to have sex, 0.96
00:38:46.600 not when they
00:38:47.660 want to have sex. 0.91
00:38:49.080 It's not going
00:38:50.160 to be close.
00:38:51.560 There's no way
00:38:52.380 humans can compete
00:38:53.420 with that.
00:38:54.520 Not even close.
00:38:57.780 Now, here's my
00:38:59.000 prediction about
00:38:59.680 the look of robots.
00:39:01.260 You ready for this?
00:39:03.880 Hologram heads.
00:39:06.660 Hologram heads.
00:39:08.600 Because every time
00:39:09.420 we make a robot head,
00:39:10.520 it looks like a robot,
00:39:11.400 of course.
00:39:11.680 And when they
00:39:12.580 make the ones
00:39:13.140 that look like
00:39:14.020 a human,
00:39:14.860 but it's like
00:39:15.480 animatronic,
00:39:16.900 you're never
00:39:17.400 going to get
00:39:17.840 all of the nuance
00:39:19.400 in a robotic head
00:39:20.900 that a real
00:39:22.560 human has.
00:39:24.100 So,
00:39:25.180 it's going to be
00:39:25.940 a CGI,
00:39:27.480 and you're going to
00:39:28.000 have,
00:39:28.280 here's what it's
00:39:28.740 going to look like.
00:39:29.300 It's going to be
00:39:29.780 a physical robot,
00:39:31.680 but from the neck
00:39:32.680 up,
00:39:33.620 it'll be a hologram.
00:39:35.940 Because the hologram
00:39:36.940 can have infinite
00:39:37.860 variety,
00:39:38.640 and then you
00:39:40.080 can just
00:39:40.440 deep fake it
00:39:41.200 and it'll look
00:39:41.600 exactly like a person.
00:39:43.040 The lips will
00:39:43.720 match everything.
00:39:46.240 So,
00:39:46.740 it's going to be
00:39:48.040 holographic heads.
00:39:49.780 And you'll also
00:39:50.300 be able to
00:39:50.660 change the head.
00:39:51.960 Just change the
00:39:52.540 program and you'll
00:39:53.160 have a different
00:39:53.780 face.
00:39:54.740 So,
00:39:55.180 you'll be able
00:39:55.540 to buy your robot
00:39:56.380 and then from
00:39:57.660 a list of choices,
00:39:59.120 you'll be able
00:39:59.480 to dial in
00:40:00.560 which face
00:40:01.100 you want it to have
00:40:01.800 and it'll change
00:40:02.760 its face as you
00:40:03.500 dial it.
00:40:04.800 And then,
00:40:05.660 and then you
00:40:07.680 go from that.
00:40:08.640 All right.
00:40:13.400 How does a
00:40:14.180 hologram give
00:40:15.180 you a head?
00:40:19.040 Well,
00:40:20.120 I think I'm not
00:40:22.200 going to go down
00:40:22.640 that road,
00:40:23.160 but let's just
00:40:23.580 say,
00:40:25.480 well,
00:40:26.380 I'm going to
00:40:26.620 answer the
00:40:26.980 question.
00:40:28.600 I'm going to
00:40:29.180 give you one
00:40:29.860 word to Google
00:40:31.340 to answer all
00:40:33.800 of your questions
00:40:34.420 about sex
00:40:35.000 with robots.
00:40:37.760 Fleshlight.
00:40:39.200 Not flashlight.
00:40:41.300 Flesh.
00:40:41.940 F-L-E-S-H. 1.00
00:40:44.220 Fleshlight.
00:40:45.020 If you don't
00:40:45.620 know what that
00:40:46.060 is,
00:40:46.440 Google it.
00:40:47.660 If you say
00:40:48.380 to yourself,
00:40:49.540 my God,
00:40:50.280 no artificial
00:40:50.960 piece of
00:40:51.700 plastic or
00:40:52.820 silicone or
00:40:53.500 whatever the
00:40:53.840 hell they make
00:40:54.240 it out of,
00:40:54.980 no piece of
00:40:55.780 artificial anything
00:40:56.860 would feel as
00:40:58.260 good as a
00:40:58.680 human,
00:41:00.180 well,
00:41:01.260 then I would
00:41:01.620 suggest maybe
00:41:02.160 you try it
00:41:02.760 because you'd
00:41:03.540 be surprised.
00:41:04.080 No,
00:41:05.520 it's better
00:41:05.840 than a
00:41:06.100 human.
00:41:07.140 That's right.
00:41:08.080 Even the
00:41:08.820 physical part
00:41:11.180 where the
00:41:11.600 humans connect,
00:41:13.360 yeah,
00:41:13.640 the artificial
00:41:14.120 one's going to
00:41:14.640 be better than
00:41:15.160 a real one.
00:41:16.300 Not even
00:41:16.860 close.
00:41:17.860 The artificial
00:41:18.600 one will be
00:41:19.500 better than a
00:41:20.400 real one.
00:41:21.240 Not even
00:41:21.780 close.
00:41:22.780 Maybe it
00:41:23.220 isn't there
00:41:23.560 yet,
00:41:24.460 but it's not
00:41:25.860 going to be
00:41:26.240 close.
00:41:28.020 I mean,
00:41:28.300 at the very
00:41:28.600 least,
00:41:28.980 it'll be
00:41:29.240 designed for
00:41:30.160 your size,
00:41:30.840 if you know
00:41:31.180 what I mean.
00:41:33.700 So everything's
00:41:34.520 going to fit
00:41:34.900 perfectly.
00:41:36.180 All right.
00:41:39.300 Starbucks has been discriminating against white men.
00:41:43.220 At least that's what I read between the lines.
00:41:45.880 So I guess they're getting a lot of pushback about their ESG and their DEI.
00:41:56.080 That's right.
00:41:57.040 So if you're in a corporation, you're worrying about your DEI and your ESG.
00:42:03.020 But if you're in a school, it's your CRT.
00:42:05.940 And are you confused yet?
00:42:08.080 Yeah.
00:42:08.740 So the DEI is diversity, equity, inclusion.
00:42:12.820 But that's part of the ESG, which is environmental, social, and governance.
00:42:19.440 So it makes me wonder if in their company they have anybody who's in charge of DEI, who's having fights with whoever's in charge of ESG, and they're having big debates about where their responsibilities overlap.
00:42:33.620 I hope so.
00:42:34.920 I hope they're fighting over where their responsibilities overlap.
00:42:39.160 Yeah, it is funny that DEI, they have to put the words in that order, because otherwise it's D-I-E, and that's a bad acronym.
00:42:46.640 But apparently Starbucks has explicitly said they're going to hire minority, I guess minority would be the right word, minority applicants or people who have less opportunity as they see it.
00:43:03.560 So basically they just said they're not going to hire white men.
00:43:06.980 Right?
00:43:07.540 Am I interpreting that wrong?
00:43:09.700 And they're acting like that's perfectly legal.
00:43:13.040 Perfectly legal.
00:43:14.440 One of the biggest employers in the country, we're just not going to hire a white man anymore.
00:43:19.040 We're going to make sure that we've got our ESG and our DEI cooking.
00:43:25.520 So I think once you've got an acronym within an acronym, you know, you've gone too far.
00:43:33.940 So I'm going to keep hammering at the ESG, and in fact I took this down, and this is going to be my next series of jokes.
00:43:42.360 It will be Dilbert's company trying to figure out what is diversity, equity, and inclusion, and how is that different from environmental, social, and governance, and how the hell can they make all that work unless they just discriminate against white people.
00:43:57.220 Let me ask you this.
00:43:58.440 Do you think I can do a Dilbert comic where Dilbert gets fired for being a white guy?
00:44:05.000 Do you think that there's a world ready for it yet?
00:44:08.340 Oh, fuck. Fuck, fuck, fuck, fuck. Fuck.
00:44:14.800 Oh, fuck, fuck, fuck, fuck.
00:44:20.460 Shit.
00:44:21.760 I'm having a moment right now.
00:44:25.000 Fuck, fuck, fuck.
00:44:26.580 All right, let me tell you what's going on in my head.
00:44:30.540 There are moments in my life, and I'm having one right now, where I can see my future so clearly, and I didn't want to see this one.
00:44:39.200 Because as soon as I said, can I get away with having Dilbert fired because he's a white man, I realized I have to do it.
00:44:49.640 I can't not do it.
00:44:53.000 Dilbert's going to get fired for being a white man.
00:45:01.720 Oh, man.
00:45:03.220 That should be the end of my career.
00:45:05.100 What do you think?
00:45:06.940 That probably is the end of my career.
00:45:09.800 But I have to do it. I have to do it. I have to do it. I have to do it.
00:45:17.000 Am I wrong?
00:45:18.300 I mean, maybe you can't feel it from your perspective on this, but I can't not do it.
00:45:24.940 I have to do it.
00:45:27.380 It was always meant to happen, wasn't it?
00:45:30.620 Shit. Shit, shit, shit.
00:45:35.260 I'm not happy about this at all.
00:45:38.240 Because it's sort of a big weight.
00:45:43.100 I really have to do it, don't I?
00:45:45.560 You agree, right?
00:45:47.220 No matter what happens to me, I have to do this, don't I?
00:45:50.160 It's like the universe kind of put this on me.
00:45:55.520 You know, I always suffer from the Spider-Man problem.
00:45:58.980 You know the Spider-Man problem?
00:46:01.740 You all know the Spider-Man problem, right?
00:46:04.480 With great power comes great responsibility.
00:46:08.080 And, you know, that comes in way before great power.
00:46:12.080 You know, if you have any power at all, you have responsibility.
00:46:14.720 But I feel like this needs to be done.
00:46:19.740 Like it doesn't feel like it's about me anymore.
00:46:22.720 It feels like I was put here to do this.
00:46:27.580 God, I feel it.
00:46:29.120 Am I wrong?
00:46:31.020 Do you feel it?
00:46:32.860 Can you see it?
00:46:33.640 Let me ask you this.
00:46:36.160 I have this theory that the more clearly you can imagine a potential future, the more likely it's going to happen.
00:46:42.720 Do you see this as clearly as I do?
00:46:45.300 That I have to do that?
00:46:48.380 I just have to, don't I?
00:46:51.040 Fuck.
00:46:53.180 I can't tell you how much this bothers me.
00:46:55.940 Because I feel like it's completely taken out of my control.
00:47:00.040 I feel like the universe just said, yeah, just stand over there for a minute.
00:47:06.540 I know you'd like to take care of this yourself, but this one was decided a long time ago.
00:47:11.860 A long time ago, when you were being abused in your job for being a white man, and you created a comic which became internationally known about a white man, that this moment had to come, didn't it?
00:47:28.700 It was like it was all leading to this, wasn't it?
00:47:31.780 It was all fucking leading to this.
00:47:35.940 Wow.
00:47:37.980 God.
00:47:40.080 I hate that.
00:47:41.840 All right, here's another one to make your head spin.
00:47:44.660 I always quote the Twitter account Machiavelli's Underbelly, especially for all the AI stuff.
00:47:51.740 And he tweets this.
00:47:57.160 He said, it needs to come out during the Elon Musk versus Twitter trial that Twitter has known that advanced GPT-3-level bots, meaning AI, had infiltrated the site and were indistinguishable from humans for at least a decade.
00:48:15.200 Foreign and domestic bots aimed at manipulating the public via social proof.
00:48:20.240 Now, I would quibble with the "at least a decade" part, but when I read this, I thought to myself, that would explain everything.
00:48:32.200 Have you had any conversations with Twitter users who you weren't sure were real people?
00:48:40.260 I have.
00:48:42.100 And I feel like they were interactive.
00:48:44.920 But I wrote it off to them being from another country.
00:48:49.640 Like sometimes their answers were a little off or something.
00:48:53.000 And I think, oh, it's because they're not even native English speakers.
00:48:57.800 So they're just getting by as well as they can.
00:49:01.600 They may have been AI.
00:49:04.060 They may have been AI.
00:49:06.160 Have I ever told you that I tried to figure out why there was a similarity to my trolls?
00:49:13.360 Have you seen the similarity?
00:49:15.420 They all come over and say, oh, the Garfield cartoonist, because they make fun of who I actually do a cartoon about.
00:49:24.720 Right?
00:49:25.600 Now, it could be that they're just NPCs.
00:49:29.260 They're just people who just say the most obvious thing.
00:49:32.380 Could be just that.
00:49:34.040 But they also come over and they make fun of my dead stepson.
00:49:39.400 Have you seen that?
00:49:40.720 That's one of the most common critiques I get.
00:49:43.600 It doesn't matter what the comment will be.
00:49:45.800 Somebody will come into the comments and say, so, how'd that work with your stepson?
00:49:51.080 Now, are there enough people who are that ugly that humans would be doing that on a regular basis?
00:50:01.620 Probably.
00:50:02.540 I mean, there's a good chance.
00:50:04.540 But I don't think you can rule out the fact that whatever the public knows about AI is several years behind what AI can do.
00:50:14.800 So, if China, let's just pick a country, for instance, if China had AI that could do this, sign up to Twitter and tweet on a certain topic, if it could do that, it would have done it, right?
00:50:31.940 Because you don't want humans having to, you know, they get cancelled and they have to reopen a new account, and you'd want the AI to be just sitting there opening new accounts; if they get identified, they just open a new one.
00:50:44.260 They could keep ahead of Twitter's algorithm just by brute force.
00:50:49.820 So, it seems to me that, let's put it this way, the AI technology that I'm aware of now is either very close or already could pretend to be a person on Twitter.
00:51:03.880 Would you accept that?
00:51:05.420 That the AI we know about publicly, that you and I already know about, is either already able to do that or right on the cusp of being able to do it, right?
00:51:16.320 So, that's the first assumption, and most of you say yes to that.
00:51:19.640 Then the second assumption is that whatever a government could do with AI, we don't know yet, meaning that there's secret knowledge that's ahead of the public.
00:51:33.260 Now, if an intelligence agency for a big country had that ability, they could just weaponize it by just turning it on and saying, hey, I've got an idea, go make a bunch of Twitter accounts and advocate against the following people.
00:51:49.640 So, the AI would go out and figure out, all right, what do we know about the following people? And it would look for other criticisms.
00:51:57.000 And then whenever, have you ever noticed that whenever Eric Swalwell tweets anything on any topic, somebody goes into the comments and says, well, what about Fang Fang?
00:52:10.260 Have you noticed that?
00:52:11.860 And what is the most obvious thing that you could say to a Swalwell comment?
00:52:17.580 What about Fang Fang?
00:52:19.080 What about Fang Fang?
00:52:20.620 It's the most obvious thing.
00:52:22.840 If you ever make a Fang Fang joke about Eric Swalwell and it's not extra clever, if it's extra clever, that's okay, but if it's just a Fang Fang reference, don't do that anymore.
00:52:36.920 Just stop that.
00:52:39.340 It annoys me every time I see it.
00:52:41.240 Only because of the lack of creativity, that's all.
00:52:44.260 I get that it's valid.
00:52:45.560 It's a valid point.
00:52:47.140 Totally valid point.
00:52:48.320 But I'm just so tired of it.
00:52:51.500 Anyway, I would not be surprised if we someday find out that a lot of our nemeses, nemeses, our nemeses on Twitter were actually AI.
00:53:02.560 It's very possible.
00:53:04.100 I'm not going to say probable, but it's certainly up in the close to 50% range, or heading in that direction.
00:53:17.760 Nemesai? Is that the plural? The many nemesai.
00:53:24.800 All right, ladies and gentlemen, that was the most amazing, nearly an hour of live stream content.
00:53:35.780 Completely ad hoc, off the cuff, improvised.
00:53:41.080 Now, question for all of you.
00:53:46.640 Somebody needs to warn me when I start losing my mind.
00:53:52.560 Can I ask you to do that?
00:53:54.840 Seriously.
00:53:56.860 Right?
00:53:57.280 Because I probably won't retire, retire in the normal way.
00:54:00.400 Like I'm not just going to go golf.
00:54:03.160 So I'll probably do this, or some version of public communication, beyond the time that I should.
00:54:11.940 You know that's coming, right?
00:54:13.900 I'll definitely do this longer than I should, because I'll be the last one to know.
00:54:18.840 So I'm going to depend on you to tell me when you think I'm losing it.
00:54:25.680 Is that fair?
00:54:26.560 You know, we have this collaborative intelligence that we formed, in which my intelligence, be that as it may, is expanded by, you know, your input and contribution to it.
00:54:40.680 So I can't judge my own mental fitness.
00:54:45.920 I mean, maybe I can.
00:54:47.000 Maybe I'll notice it in myself.
00:54:48.900 But some people have mentioned to me that they were already concerned.
00:54:53.800 Now, the examples they gave didn't really make sense to me as something to be concerned about.
00:55:01.020 But I'm going to put it to you.
00:55:03.220 If you see me losing a step in a way that doesn't seem just purely age-related, that's different; let me know.
00:55:11.800 Let me know.
00:55:12.940 Now, I realize that I'm, you know, going to kick off a bunch of trolls who are just going to start saying it every day from now on.
00:55:20.400 But I'll look at the people that I see enough.
00:55:23.800 You know, I recognize enough of you that if it's somebody I know, or even just know from Twitter, I'm going to take it seriously.
00:55:31.320 If somebody I know who's been watching me a long time says, you keep having these memory gaps, or whatever it is.
00:55:38.820 Now, the memory one's going to be tough, because my memory has never been good.
00:55:44.560 So, somebody recently pointed out what they thought was a memory fault of mine, but I'm pretty sure it was a memory fault of theirs.
00:55:54.280 So, I can't solve that one.
00:55:58.680 A code word? I don't think we need that.
00:56:04.260 Yeah, all right.
00:56:06.020 Well, so that's my request, is to let me know when you think I've lost it. Okay?
00:56:11.640 Let me tell you my current internal impression of my own current intelligence.
00:56:19.060 Here's what I think.
00:56:21.140 I've definitely, you know, the age has definitely done the predictable thing, which is my ability to learn is probably, it feels like it might be lower than it used to be.
00:56:34.220 Now, I still learn things.
00:56:36.140 I mean, I'm still learning the drums, and I'm still learning a bunch of other stuff.
00:56:40.280 So, every day I'm still learning things, and I don't feel like it's necessarily harder than it used to be.
00:56:46.140 Sometimes it feels easier, because I have more framework to connect things to.
00:56:50.840 But I also have more, I have a deeper talent stack than at any time in my life, because I'm always adding little micro talents to the list.
00:57:03.200 So, to me, it seems like I'm more capable now than at any time in my life.
00:57:09.120 But not because my brain is as quick as it was in my 20s, but rather because it has, it's been designed over time to have a certain structure that seems to work real well for the stuff I do.
00:57:24.060 So, to me, my functional intelligence is the highest it's ever been.
00:57:29.700 Functional, just because of the combination of talents.
00:57:33.800 But I don't think my native quickness is the same.
00:57:37.300 I think that's taken a hit.
00:57:39.520 And I also think that because of the nature of my work, which is extremely intellectual, that sounded more egotistical than I meant it.
00:57:51.160 The nature of my work is that it happens in my head. Let's put it that way.
00:57:55.160 The intellectual thing sounds like I'm complimenting myself.
00:57:58.600 But yeah, most of my work happens in my head.
00:58:00.560 It's creativity.
00:58:01.720 So I think that that gives me an edge in longevity, mental longevity.
00:58:08.060 If everything we know about brains is true, that the more you use it, the longer it'll last, then I don't think many people use their brain more than I do.
00:58:17.860 I mean, up to the point where my head hits my keyboard from exhaustion, that's just mental exhaustion.
00:58:25.740 So when I'm looking at the news in the morning, I'm not sure how much you can appreciate this, but each of these news stories, I've got to go deep enough in them intellectually that I can find an angle that would be interesting.
00:58:43.000 That's not easy.
00:58:44.440 It's like a whole bunch of different stories every day.
00:58:46.940 Then the rest of the day, I've got to write some cartoons, I'll probably look at my next micro lesson that I'll create for the Locals platform.
00:58:58.060 I've got to go work on my book, which is 85 different topics so far.
00:59:05.240 Just think about that.
00:59:06.320 My book is like 85 different topics so far.
00:59:09.540 It's pushing toward 100.
00:59:12.340 Because each of them is a reframe that has to be talked about individually.
00:59:16.380 So on any given day, I'm going to look at, I don't know, 20 news stories, 85 topics in a book, maybe a deep dive on some topic for Locals, plus how much I tweet during the day, plus designing a coffee cup.
00:59:42.000 Let me give you an update on the coffee mug.
00:59:46.500 So Erica, who is nice enough to coordinate this, came up with a design which I asked my artist assistant to clean up, and I'll show it to you.
01:00:00.360 So this is the current front of the mug design.
01:00:07.500 What do you think?
01:00:11.080 Comments? Comments?
01:00:12.860 Then on the back would be the simultaneous sip.
01:00:18.740 Well, Erica, good job.
01:00:21.100 Look at all those people agreeing.
01:00:22.940 Okay.
01:00:24.380 All right.
01:00:25.200 Good.
01:00:25.460 Now, here's a perfect reason why I would not be the best one to do this.
01:00:34.260 This is actually working better than I'd hoped, by making it a group effort.
01:00:40.340 There's an exercise that I always ask people to do, which is look at anybody's logo for any product.
01:00:48.380 Any logo for any product.
01:00:49.840 I'll just randomly pick one.
01:00:52.240 All right.
01:00:53.280 Here's this water. They've got this logo on here.
01:00:57.020 And then I ask you, if you were in charge, would you have approved this logo?
01:01:02.520 And the answer would be, probably not.
01:01:05.320 You would have been like, that's kind of boring, and whatever.
01:01:09.940 And that's true of every logo.
01:01:12.240 Like everybody's logo.
01:01:16.000 You can't find two people who like the same thing.
01:01:19.140 It's very hard.
01:01:20.680 So there's something about this, maybe the simplicity of it, and the fact that it's so on point, I guess.
01:01:29.960 That, I don't know, I think this collaborative effort is much better, because I don't think I could come up with this on my own.
01:01:39.540 All right.
01:01:42.820 I guess it all makes sense.
01:01:52.300 Versus Kilroy, the bug should be black.
01:01:58.520 Yeah.
01:02:02.660 Okay.
01:02:03.540 Have a simple logo, maybe just glasses and simple art.
01:02:06.580 Yeah, you know, here are all the things you have to worry about with a logo.
01:02:11.880 If you want a logo that's going to last, you have to ask questions like this.
01:02:17.520 How long will people still be wearing glasses?
01:02:21.560 Right?
01:02:22.580 I hate having glasses being part of your logo, because they will change over time.
01:02:27.280 But for now, I mean, go on this long.
01:02:33.060 Put Garfield on it.
01:02:34.360 All right.
01:02:39.420 Take a river cruise through Europe.
01:02:41.480 That's the top of my list for things that I would enjoy doing.
01:02:44.980 A river cruise in Europe.
01:02:46.660 Well, they still have rivers.
01:02:48.460 Apparently, the rivers in Europe have gone down so much.
01:02:53.580 By the way, where is all the water going?
01:02:59.400 Wait, how does this work?
01:03:00.700 So, is the water coming out of the streams and lakes and going into the ocean?
01:03:08.000 Because the water level of the ocean is going up, but the water level of all of our land-based water is going down?
01:03:17.700 Can the climate really do that?
01:03:20.000 Can it really take water from one place and put it in another place?
01:03:24.720 I suppose that's what climate change does.
01:03:27.100 It probably does exactly that.
01:03:28.600 Yeah, I guess it does.
01:03:30.700 I don't know.
01:03:36.480 Yeah.
01:03:38.160 Okay.
01:03:40.420 Well, that's where we're at at the moment.
01:03:43.240 And ladies and gentlemen, this concludes the YouTube portion of our event.
01:03:52.160 And, hey, I'm describing clouds. You're right.
01:03:56.620 And I'll talk to you tomorrow, YouTube.
01:03:58.660 Bye for now.