Real Coffee with Scott Adams - April 09, 2023


Episode 2073 Scott Adams: Happy Easter, Elon Musk, Robot Babies, Cyborg Soldiers, Myocarditis, Oh My


Episode Stats

Length

48 minutes

Words per Minute

132.0

Word Count

6,357

Sentence Count

536

Misogynist Sentences

3

Hate Speech Sentences

18


Summary

It's Easter, and Scott Adams is here with the best Easter entertainment you've ever had in your whole life. He talks about the Doonesbury comic, Garfield, and the future of humanity. And he makes a prediction about robot babies.


Transcript

00:00:00.720 Ooh, I'm overexposed. Let's get a little less light on that. Yeah, that's better.
00:00:08.060 Good morning, everybody, and welcome to the Highlight of Civilization.
00:00:12.100 It's called Coffee with Scott Adams, and it's the best Easter entertainment you've ever had in your whole life.
00:00:18.640 Let me put you in the center, put me in the center of the frame. There we go.
00:00:24.860 And I think if you'd like to take this up a level, and I know you would,
00:00:30.000 all you need is a cup or a mug or a glass, a tank or a chalice or a stein,
00:00:34.140 a canteen, jug or a flask, a vessel of any kind.
00:00:37.100 Fill it with your favorite liquid. I like coffee.
00:00:41.120 And join me now for the unparalleled pleasure of the dopamine hit of the day,
00:00:44.800 the thing that makes everything better. It's called Simultaneous Sip.
00:00:49.020 It happens now. Go.
00:00:55.440 Ah.
00:00:55.880 Well, I was just alerted to the fact that the Doonesbury comic is apparently taking a...
00:01:05.400 Well, I won't say that...
00:01:07.200 Let me just read it.
00:01:09.240 You can make up your own mind about whether this is supportive or not.
00:01:14.980 So it shows an older couple, one of them reading a newspaper.
00:01:21.200 Newspapers still exist in this comic world.
00:01:24.000 And the young person comes in and...
00:01:25.980 The young person says,
00:01:31.240 Okay, so I've been thinking about the raw deal that cancelled cartoonists got.
00:01:35.760 And the mom says,
00:01:36.720 Oh, honey.
00:01:37.820 What?
00:01:38.660 Before you say anything, let me make a quick point.
00:01:41.460 All life is editing.
00:01:43.440 It's the things we may think but don't say that make it bearable to live in the world.
00:01:47.860 Civilization absolutely depends on it.
00:01:50.280 The father says,
00:01:51.600 Agreed.
00:01:52.500 Just because you have the right to spew transgressive crap doesn't mean you must,
00:01:57.440 especially if it hurts people.
00:01:59.620 So editing is in your own self-interest, son.
00:02:01.900 It protects you from your basest instincts.
00:02:04.400 And the son says,
00:02:06.740 Okay, okay, but I'll still miss Garfield.
00:02:09.640 But not from ignorance.
00:02:14.060 Now, what do you think?
00:02:17.580 Transgressive.
00:02:18.720 Now, it's interesting it didn't mention racist.
00:02:22.900 Interesting.
00:02:24.100 Anyway, I don't know what to make of that, frankly.
00:02:27.680 But happy Easter.
00:02:30.920 And because Easter is a day for a big comeback,
00:02:34.400 I thought I would do a poll on Twitter to find out,
00:02:39.200 now that some time has passed since my cancellation,
00:02:42.680 I asked,
00:02:43.840 Do people think that Dilbert was canceled because its author is a racist,
00:02:49.540 or because the people who canceled him are racists?
00:02:52.220 And a month later,
00:02:55.120 4% think the problem is that the author is a racist,
00:03:00.180 and 96% either think that racists canceled Scott,
00:03:05.340 or that everybody's a racist.
00:03:08.640 So it's 96% on my side.
00:03:13.420 So happy Easter.
00:03:17.660 I think I timed that just right.
00:03:20.020 Just right.
00:03:22.940 All right.
00:03:25.100 Well, enough about that.
00:03:27.220 I have a prediction about robots.
00:03:29.600 Robot prediction coming in.
00:03:31.300 Have you noticed how people who don't have children
00:03:34.580 sometimes will treat their animals as like their child?
00:03:40.420 Have you ever noticed that?
00:03:42.160 It's like, oh, my baby, my cat, my baby, my dog, whatever.
00:03:46.260 Now, that makes perfect sense,
00:03:48.920 because people have sort of a parental instinct,
00:03:53.440 maternal instinct, etc.
00:03:55.960 But in the context of the actual human population decreasing,
00:04:02.340 Italy being a prime example,
00:04:04.180 population is dropping fast.
00:04:06.580 And other Western countries,
00:04:07.980 our populations are going to be dropping fast.
00:04:11.220 Don't you think people are going to have robot babies?
00:04:13.600 Now, they might not be in baby form.
00:04:19.220 They might be adult robots.
00:04:21.280 But don't you think that people are going to be bringing them up
00:04:25.440 and training them to be the character and the personality
00:04:28.860 that they want them to be?
00:04:30.680 And won't you be able to customize your robot by its experience?
00:04:35.760 Won't robots all end up having different personalities?
00:04:38.980 Because they would have their own personal experience
00:04:41.640 on top of their programming.
00:04:43.600 Yeah, I think people are going to be raising robot babies,
00:04:48.280 and it's going to be the end of human civilization.
00:04:51.160 Because people are going to prefer the robot.
00:04:54.000 Because it won't ever die.
00:04:56.300 You don't have to worry about it dying.
00:04:57.900 You can just have it backed up.
00:04:59.840 All the risks of having a child could be removed,
00:05:02.860 and you could still get some kind of parental satisfaction.
00:05:06.600 I know you're saying, not me, not me.
00:05:11.120 I need the real kids.
00:05:12.520 But I'm saying that some people will do it.
00:05:14.680 Not you.
00:05:16.080 Not you, necessarily.
00:05:17.740 But I think it's going to be a big deal.
00:05:19.540 People are going to be raising robots.
00:05:21.880 Because you will actually train them to be more to your liking.
00:05:25.400 That's raising them.
00:05:26.100 All right, well, let's talk about all things...
00:05:30.560 Oh, here's some more robot stuff.
00:05:33.060 So the military is testing out some enhanced reality goggles
00:05:39.680 that their soldiers could wear.
00:05:42.300 And apparently it can do all kinds of stuff.
00:05:43.940 Like, you can see the whole field as if you're looking down on it.
00:05:47.900 So you can see where everybody is.
00:05:49.640 You can communicate.
00:05:51.020 You can do your GPS.
00:05:52.500 You can have night vision goggles.
00:05:54.480 And who knows what else.
00:05:56.960 But, man, by the time you add those vision-related things to a soldier,
00:06:03.640 you are pretty much a cyborg, aren't you?
00:06:06.960 I mean, that's almost as much technology as there is human flesh.
00:06:11.880 Almost by weight, as well.
00:06:13.940 Because the technical parts and the backpack and everything
00:06:16.640 are going to weigh 75 pounds or something crazy.
00:06:21.340 So cyborg soldiers and robot babies, that's all coming.
00:06:28.740 And you know about Musk is trying to build a city
00:06:32.040 around his own companies there in Texas.
00:06:35.180 So he'd like to have some cheap housing that's also awesome.
00:06:40.900 But it would be below market rent.
00:06:43.940 For the best little city there could be, I guess.
00:06:47.280 Now, what I didn't know is that Ye has actually talked to Musk about also building cities.
00:06:52.740 And as most of you know who have been watching me for a while,
00:06:56.160 that's also my long-term objective is to design a city.
00:07:02.020 Now, I don't have to own it, but I'd like to design one.
00:07:05.500 And so now you've got Ye thinking about designing cities.
00:07:09.840 You've got Musk thinking about designing cities.
00:07:12.300 You've got me thinking about designing cities, and probably lots of other people thinking about it.
00:07:19.060 Here's a weird prediction.
00:07:20.280 If Musk designs this city the way I think he's going to,
00:07:26.280 it could end up being his biggest business.
00:07:29.960 Now, at the moment, it's not a business at all, except for the Boxabl,
00:07:33.920 the little temporary buildings.
00:07:36.280 I think he lives in one, actually.
00:07:39.440 So that part is commercial already, and he owns some of that.
00:07:42.960 But I think what he's going to do, this is just my speculation,
00:07:46.420 I think he's going to design one city for his employees,
00:07:51.380 but he's going to do it in a way that if it works,
00:07:54.380 it could become a commercial enterprise,
00:07:57.180 meaning designing other cities and using his boring company thing to bore tunnels and stuff.
00:08:04.060 Because remember, that thing also makes bricks.
00:08:07.240 So the boring company can, you know, bore a tunnel,
00:08:10.820 but also uses the dirt that's left over to press it into bricks.
00:08:16.420 So I've always thought that he had the perfect situation to design a low-cost city.
00:08:22.180 If he does, I think, given that cities are no longer livable,
00:08:27.620 would you agree with that, by the way?
00:08:29.320 Would you agree that cities are basically dead?
00:08:32.400 There's no going back at this point.
00:08:34.340 They'll just become basically magnets for crime,
00:08:37.660 and anybody who lives there, I feel sorry for them.
00:08:40.840 But I think the biggest market is going to be building
00:08:43.860 brand-new design cities from scratch.
00:08:48.180 And I think Musk is going to be the biggest player in that,
00:08:51.840 even without trying.
00:08:53.380 I think he'll just sort of accidentally, you know,
00:08:56.760 back himself into the biggest business in the world.
00:09:00.480 Because it wouldn't make sense for him to just build the city
00:09:03.860 without also learning enough from it
00:09:06.560 that you could commercialize it as a project going forward.
00:09:11.380 I can't imagine him just doing it as a one-off.
00:09:14.080 That just doesn't seem very Musk-like.
00:09:18.140 It would be thinking too small.
00:09:21.860 All right.
00:09:25.220 I see your comments.
00:09:26.660 You know how long I've been asking you what's up with Soros?
00:09:34.420 And a lot of you in my audience say Soros is the devil
00:09:38.980 and he's behind all the bad things.
00:09:41.420 And I keep saying, but why?
00:09:45.940 Okay, if I accept that he's behind all the bad things,
00:09:48.760 because you can trace the money back to him,
00:09:52.340 but why?
00:09:53.280 Like, why would he want America to fail
00:09:57.440 from massive, unchecked crime in cities?
00:10:01.360 What would be the point of that?
00:10:03.700 And I felt I was all alone in that
00:10:07.020 until I saw that Elon Musk has a similar thought.
00:10:11.840 So I saw a tweet from Marina Medvin who said about Soros,
00:10:17.080 Soros isn't being criticized for being Jewish or a billionaire.
00:10:21.080 He's being criticized, rightly so,
00:10:24.280 for his investment in a neo-leftist takeover
00:10:26.840 of prosecutions throughout the U.S.
00:10:29.100 He invests around a million per district attorney race.
00:10:33.200 He does so openly, proudly, even wrote about it.
00:10:37.880 And then Elon Musk replied to that with this tweet,
00:10:41.180 I don't understand his goal.
00:10:45.380 That's exactly my opinion.
00:10:48.120 What?
00:10:49.740 Like, I'll accept, I do accept that you can trace the money back to Soros,
00:10:55.380 although it goes through a third party who makes the final decision.
00:10:59.180 So, you know, it's an indirect and yet strong connection,
00:11:04.180 if that makes sense.
00:11:05.540 It's indirect because it goes through a third party,
00:11:07.400 but it's a strong connection.
00:11:09.800 So we're not doubting that he's funding things
00:11:11.940 that are funding district attorneys.
00:11:14.740 But why do you think it's happening?
00:11:18.080 You can make a trillion dollars shorting America.
00:11:22.460 Do you think that 86-year-old Soros is looking to make
00:11:27.060 one big killing before he dies?
00:11:29.540 That's ridiculous.
00:11:31.680 That's ridiculous.
00:11:32.880 The last thing that anybody 86 years old wants to do
00:11:37.120 is make a really big bet.
00:11:41.260 No.
00:11:42.220 I don't think there's any chance of that.
00:11:46.080 So, some say for legacy, some say for hate.
00:11:51.000 None of those reasons check out.
00:11:54.360 So you can make any explanation.
00:11:58.660 And they'd all sort of work a little bit,
00:12:00.800 but they don't really completely work.
00:12:02.880 There's no explanation that actually completely works.
00:12:08.720 Hatred, martyr, none of those make sense.
00:12:11.800 Irrational, no.
00:12:13.880 None of those make sense.
00:12:16.120 There's something terribly missing with the hypothesis
00:12:19.580 that Soros is doing this for bad intentions.
00:12:25.760 I mean, it's a bad outcome.
00:12:27.580 Definitely a bad outcome.
00:12:30.120 It's what he's done his whole life.
00:12:31.920 No, he's never done this.
00:12:34.860 There's no example of him doing this.
00:12:37.400 There are examples of him doing things that were bad for,
00:12:40.640 let's say, England,
00:12:42.700 that he made some money on.
00:12:45.720 But, again,
00:12:47.420 none of it makes sense in any coherent way.
00:12:50.640 Even when you explain it,
00:12:52.580 you're explaining it to me as if he's crazy.
00:12:55.760 You're explaining it to me as if he's crazy.
00:12:58.980 Which he might be.
00:13:00.640 Which would actually be a good explanation.
00:13:04.100 He's crazy.
00:13:05.580 He's a true believer in what?
00:13:07.800 True believer in what?
00:13:09.920 Crime?
00:13:10.320 None of it makes sense.
00:13:14.300 Now, socialism doesn't have anything to do with DAs.
00:13:19.640 But look how many different explanations you're giving.
00:13:24.680 Equity, destroy the country, make money.
00:13:28.140 The fact that there are so many different competing explanations
00:13:31.620 is all you need to know that there's something going on.
00:13:36.580 We don't know what's going on.
00:13:39.440 Open borders has nothing to do with the DAs.
00:13:42.100 I mean, not directly.
00:13:44.240 Yeah.
00:13:44.880 I don't know.
00:13:46.160 And the thing is,
00:13:47.720 depopulation.
00:13:49.520 No, it's not about depopulation.
00:13:52.020 It's not about the New World Order.
00:13:55.080 It's something very specific about district attorneys.
00:13:59.680 But why?
00:14:01.500 Who in the world would want to make this change?
00:14:05.720 Well, I'm not asking you to mind read.
00:14:08.700 I'm asking you to give me any explanation that isn't crazy.
00:14:13.700 Which doesn't mean it's what Soros is thinking.
00:14:16.660 I'm just looking for a non-crazy explanation
00:14:19.260 of why he'd be doing it.
00:14:21.300 NGOs, part of a coup game plan
00:14:26.320 for ideological agenda globalism.
00:14:29.200 See, now that's the word salad that I know means we don't know.
00:14:33.600 Now listen to this.
00:14:34.600 I don't want to make fun of you,
00:14:35.680 but this is clearly a word salad.
00:14:38.680 Soros NGOs and State Department use this
00:14:41.160 as part of coup game plans
00:14:42.940 for ideological agenda globalism.
00:14:48.420 What does that even mean?
00:14:49.640 That's the problem.
00:14:52.500 I don't even know what those words mean.
00:14:54.040 Now, when you say globalism,
00:14:59.660 again, you're missing the why.
00:15:03.520 But why?
00:15:07.500 All right, well, I'm going to stop talking about this.
00:15:10.120 But it was useful for me to know
00:15:13.380 that there's at least one person who's confused by it,
00:15:15.960 and his name is Elon Musk.
00:15:18.940 So there were two of us who were confused by it.
00:15:21.860 So I thought time had gone by
00:15:24.060 that I could ask the medical professionals
00:15:26.560 if they're seeing a big increase in myocarditis
00:15:29.320 and sudden deaths.
00:15:31.380 What do you think happened
00:15:32.820 when I tweeted
00:15:34.380 to ask doctors,
00:15:37.040 specifically doctors,
00:15:38.300 if they're seeing a big increase in myocarditis,
00:15:42.040 like Peter McCullough says,
00:15:43.940 and sudden deaths.
00:15:46.120 What do you think the doctors said?
00:15:52.040 The answers were all over the place.
00:15:54.640 Of course.
00:15:55.760 It's exactly what you think.
00:15:57.320 I got very qualified people
00:16:00.900 who treat thousands of patients a year
00:16:03.440 who said,
00:16:04.560 no, there's no difference.
00:16:06.420 And then I got other people
00:16:07.620 who might be qualified saying,
00:16:09.280 oh, yeah, there's a difference.
00:16:12.300 Right.
00:16:13.200 So you can't even tell
00:16:16.940 from the people in the industry.
00:16:20.260 So the people who work in the industry,
00:16:24.220 McCullough mines the overall data.
00:16:27.580 McCullough,
00:16:28.280 we know he uses bad data.
00:16:31.400 And he does it knowingly,
00:16:32.900 which is the weird part.
00:16:34.640 And the bad data that McCullough uses
00:16:36.280 is the athletes that died suddenly.
00:16:39.500 Because that's been researched,
00:16:41.080 and we know that a lot of them
00:16:42.200 are children and senior citizens
00:16:44.220 and people who died before the pandemic
00:16:47.160 and all kinds of stuff.
00:16:48.960 But he still uses it.
00:16:50.760 It's the most debunked database
00:16:52.760 in all the world.
00:16:55.180 There's nothing more debunked
00:16:56.600 than that database.
00:16:57.940 And he still uses it.
00:16:59.440 Still talks about it like it's real.
00:17:01.180 So I don't trust anything he says
00:17:03.160 because there's such an obvious problem
00:17:06.660 with that specific data.
00:17:08.200 It's the most debunked
00:17:15.160 because you can look at the individual cases
00:17:17.740 and debunk them individually.
00:17:20.900 So there are accounts
00:17:22.580 that just spend all their time
00:17:23.780 debunking that database.
00:17:26.220 It is the most thoroughly debunked fact
00:17:29.640 of the whole pandemic
00:17:30.500 that athletes did not die at a higher rate.
00:17:34.400 That is absolutely something that didn't happen.
00:17:40.860 All right.
00:17:41.360 A lot of you are really bad
00:17:42.360 and say not debunked.
00:17:44.360 Interview them.
00:17:45.180 What good would that do?
00:17:47.460 What good would it do to interview them?
00:17:49.660 It would do no good at all.
00:17:52.700 Yeah.
00:17:53.340 I mean, we're still lost in that model
00:17:55.360 that if you talk to one person,
00:17:57.460 you can know something.
00:17:58.780 You can't.
00:17:59.400 That model doesn't work ever.
00:18:02.660 Talking to one person
00:18:03.800 gets you one person's opinion.
00:18:06.500 It's no value whatsoever.
00:18:09.360 All right.
00:18:10.700 So I guess that's an open question
00:18:12.620 and always will be, I suppose.
00:18:16.340 There's something weird happening
00:18:17.860 with Matt Taibbi and Elon Musk and Substack.
00:18:22.340 I'll just tell you the sequence of events,
00:18:24.900 but I don't know exactly what's going on here.
00:18:26.920 It's a little weird.
00:18:27.640 So Matt Taibbi tweeted.
00:18:31.680 He said,
00:18:32.200 Of all things,
00:18:32.980 I learned earlier today
00:18:34.020 that Substack,
00:18:35.140 now that's where people can write blog posts
00:18:38.340 and get paid for it
00:18:39.680 with subscription members.
00:18:42.800 I learned earlier today
00:18:43.840 that Substack links
00:18:44.800 were being blocked on this platform,
00:18:46.740 meaning Twitter.
00:18:47.980 When I asked why,
00:18:49.220 I was told it's a dispute
00:18:50.580 over the new Substack Notes platform.
00:18:53.640 Now Substack is adding a feature
00:18:55.940 called Notes,
00:18:56.800 where people can basically post things
00:19:00.160 like tweeting.
00:19:01.840 And there's some thought
00:19:03.080 that Twitter might see that as competition
00:19:05.380 and might want to suppress it.
00:19:08.060 So that was Matt Taibbi's claim,
00:19:10.660 and he said he talked to somebody
00:19:12.360 who must have known something.
00:19:14.820 But Elon Musk pushed back on that.
00:19:16.700 He said,
00:19:16.960 Number one,
00:19:18.520 Substack links were never blocked.
00:19:23.100 Matt's statement is false.
00:19:24.600 Then he said,
00:19:25.080 Number two,
00:19:27.060 Substack was trying to download
00:19:28.720 a massive portion of the Twitter database
00:19:30.960 to bootstrap their Twitter clone.
00:19:33.320 In other words,
00:19:33.900 they were trying to suck data
00:19:35.960 out of Twitter's API
00:19:37.400 to get enough data
00:19:39.620 that they could clone
00:19:41.320 or match Twitter's tweeting service.
00:19:44.980 So their IP address
00:19:46.460 is obviously untrusted.
00:19:48.620 So it's an untrusted site
00:19:50.720 because they tried to do things,
00:19:52.260 according to Musk,
00:19:53.260 that the API privileges
00:19:55.780 should not allow.
00:19:58.480 I don't know if there's a rule against it,
00:20:00.300 but he doesn't like it.
00:20:01.200 And then Elon says,
00:20:04.600 Turns out Matt is or was
00:20:07.100 an employee of Substack.
00:20:09.400 Now here's the fun part.
00:20:11.620 Twitter's context notes
00:20:13.400 called out Musk on a fact check.
00:20:19.420 So Twitter fact-checked its boss,
00:20:23.360 and they fact-checked him hard.
00:20:25.440 It wasn't polite.
00:20:28.960 They fact-checked him hard.
00:20:30.400 Here's what the...
00:20:32.440 Now that doesn't mean
00:20:33.260 the context notes are right, right?
00:20:35.660 Because the context notes
00:20:36.940 are a consensus of people
00:20:39.300 who do these notes.
00:20:40.900 Doesn't mean they're right,
00:20:42.180 but they have a different opinion.
00:20:44.760 And it says,
00:20:46.500 Substack links have been throttled on Twitter.
00:20:51.460 So that would be different
00:20:53.480 than being blocked,
00:20:54.540 but throttled.
00:20:55.960 So there may be some difference
00:20:57.520 between blocked and throttled.
00:20:59.100 Throttled might be true,
00:21:01.420 whereas blocked is not.
00:21:02.900 Okay?
00:21:07.060 And Substack's Twitter account
00:21:09.200 has been restricted, it says.
00:21:11.820 It also says Matt Taibbi
00:21:13.940 is not a Substack employee.
00:21:16.560 He writes a newsletter there,
00:21:18.920 and the CEO of Substack says
00:21:21.220 that he's never been an employee.
00:21:27.860 So what does that mean?
00:21:31.780 Here's what I think it means.
00:21:34.420 Here's what I think it means.
00:21:36.540 I don't believe that Matt Taibbi
00:21:38.540 ever got a paycheck from
00:21:40.260 or was technically ever an employee
00:21:43.060 of Substack.
00:21:43.840 I do believe that Substack
00:21:48.020 uses their high-profile users
00:21:51.160 to promote the system.
00:21:54.440 I do believe it's possible
00:21:56.900 that Matt Taibbi got a,
00:21:58.440 maybe a sweetheart deal
00:22:00.780 to move his business onto Substack.
00:22:02.900 And if you've got a sweet deal
00:22:06.400 in return for maybe promoting the service
00:22:10.740 or being there,
00:22:12.200 it's a little bit of an arrangement
00:22:14.060 more than just a customer.
00:22:16.580 So if Musk is speaking,
00:22:18.840 let's say, hyperbolically,
00:22:22.420 that he's an employee,
00:22:23.700 but sort of employee in quotes,
00:22:25.460 meaning that they're working together
00:22:26.940 for a mutual benefit,
00:22:28.260 it might not be 100% completely false,
00:22:35.240 but it is useful to know
00:22:36.620 he's not a W-2 employee, at least.
00:22:39.940 But I do think that probably
00:22:41.120 the big names at Substack
00:22:42.680 have some kind of understanding
00:22:45.500 for mutual benefit, I'm sure.
00:22:49.800 And of course, Matt, in protest,
00:22:52.660 I believe he's going to stop tweeting
00:22:54.180 and just use his Substack,
00:22:56.600 which is more to Musk's point, actually.
00:23:02.680 Yeah, that's more to his point.
00:23:04.620 So I don't know what's true here,
00:23:06.000 but I think this is a fascinating story
00:23:07.740 just to watch them try to fight it out there.
00:23:11.500 All right.
00:23:14.640 I have one other point I'd like to make.
00:23:18.600 I've been using on my phone
00:23:20.260 the Google search engine.
00:23:22.740 And you all know what that looks like.
00:23:28.640 And the Google search engine
00:23:30.280 also surfaces, you know, stories,
00:23:33.460 news stories.
00:23:35.000 And I'm fascinated because I noticed
00:23:37.180 that if I look at CNN,
00:23:42.780 and then I look at Fox News,
00:23:46.440 I used to think I got all the big stories.
00:23:49.240 But now I feel like both Fox News and CNN
00:23:52.480 just completely ignore major stories.
00:23:56.660 And I don't even know that they are stories
00:23:58.700 until I see them somewhere else,
00:24:00.860 such as on the Google page.
00:24:03.580 They do a different job of curating stuff.
00:24:09.520 Easter Sunday, yeah.
00:24:10.920 It's not so much on Easter,
00:24:12.100 but sometimes the news is just so different.
00:24:14.600 OpenAI's first physical robot shocks the industry.
00:24:22.900 Now, here's a story that's not on CNN or Fox News.
00:24:27.600 I didn't see it anyway.
00:24:29.540 But OpenAI has a physical robot?
00:24:34.940 That's really big news, isn't it?
00:24:39.140 Don't you think if AI actually is now in a robot,
00:24:45.180 that's maybe some of the biggest news
00:24:48.460 in the history of, oh, I don't know,
00:24:51.880 civilization?
00:24:56.140 But here it is.
00:24:57.940 It's on, at least Google found it.
00:25:02.140 What else is it finding?
00:25:03.680 Mastering Midjourney,
00:25:09.080 Google and Amazon struggle to lay off workers in Europe.
00:25:11.840 Well, who cares?
00:25:13.820 New battery tech could extend EV ranges by 10 times.
00:25:18.240 Now, I love those stories.
00:25:20.080 Let's find out.
00:25:23.120 Now, researchers from Pohang University
00:25:26.680 and Sogang University
00:25:28.800 developed a polymeric binder
00:25:33.720 for a stable, reliable, high-capacity anode material
00:25:37.620 rather than conventional anodes
00:25:40.140 made of graphite or other materials.
00:25:42.400 You don't want any of that conventional graphite stuff.
00:25:47.720 All right.
00:25:48.040 Well, we'll see.
00:25:49.160 If you can make batteries 10 times more powerful,
00:25:52.820 then I think all this green stuff
00:25:54.620 is going to work out after all.
00:25:56.580 Do we need some kind of a battery law?
00:25:58.800 Like, was it,
00:26:03.860 what was the law of microchips?
00:26:06.840 That they would double every 18 months or something?
00:26:10.140 Speed would double.
00:26:12.940 What was the name of that?
00:26:14.280 That was Moore's law.
00:26:16.720 We need a Moore's law for batteries
00:26:19.600 because it does seem like batteries are not,
00:26:22.840 they're not doubling,
00:26:23.920 but I feel like batteries are maybe
00:26:26.280 20% better every year.
00:26:30.440 Does that sound right?
00:26:32.480 Maybe 20% batteries,
00:26:34.400 the batteries are about 20% better every year,
00:26:37.140 which is gigantic.
00:26:39.380 That's gigantic.
00:26:41.340 You don't think it's 20%?
00:26:42.380 I think lately the gains have been more impressive
00:26:46.600 because we can actually fly airplanes on batteries now.
00:26:50.800 That's something you couldn't do five years ago.
00:26:54.560 20% more expensive?
00:26:56.620 All right.
00:26:57.000 Well, maybe we'll see some of these 10x things come through.
00:27:03.660 Now, how many of you are watching this live stream
00:27:06.400 because there's absolutely nothing else worth watching
00:27:09.200 because it's Easter
00:27:10.720 and you're kind of tired of watching churchy stuff?
00:27:15.020 There's just nothing else to do, right?
00:27:17.160 It's just me or nothing.
00:27:18.940 So it's a lot of pressure on me.
00:27:20.800 You're watching golf or me.
00:27:25.240 Well, does anybody have any questions
00:27:26.880 because there's no real news today?
00:27:29.820 Got your 10-mile run in.
00:27:31.140 Good for you.
00:27:33.620 Battery prices are going down.
00:27:36.120 You just watch me every morning, so.
00:27:38.360 Oh, the Masters are today?
00:27:41.660 There's a David Frum comment about what?
00:27:45.020 About me.
00:27:50.280 Any more I can share about the simulation?
00:27:53.800 I usually share all my simulation stories
00:27:56.280 as soon as I get them.
00:28:02.780 Underrated.
00:28:03.880 Where would you like to build a city?
00:28:06.240 I like to build it where there's the least amount
00:28:08.840 of weather or natural disasters
00:28:11.840 because I think you could build a city
00:28:14.980 off-grid now that's pretty far off-grid.
00:28:18.600 For example, you could build a city
00:28:21.040 where there's, say, one highway,
00:28:23.200 one superhighway to it,
00:28:25.240 but that superhighway,
00:28:26.400 let's say it's from a big airport,
00:28:28.280 but that superhighway is only self-driving cars.
00:28:32.680 Think about it.
00:28:33.580 If you had only self-driving cars on the highway,
00:28:38.200 you wouldn't need a speed limit,
00:28:41.320 and there would be no traffic congestion
00:28:44.420 because if they're self-driving,
00:28:46.660 they just adjust to all situations.
00:28:48.620 So you'd never have traffic.
00:28:51.300 You'd be able to travel, let's say, 140 miles per hour,
00:28:55.140 and you would just walk out to the curb
00:28:58.720 with your baggage,
00:28:59.940 throw it in the back of a self-driving car,
00:29:01.960 and you're boom in the middle of the city.
00:29:05.600 That's what I think is going to happen.
00:29:07.320 Yeah, there'll be some drones,
00:29:08.460 but those will be a little more expensive.
00:29:09.800 Werewolves and Vampires.
00:29:19.300 There's someone pandering to the conservative...
00:29:23.400 Here's a good question.
00:29:24.720 Scott, since your cancellation,
00:29:26.100 have you started pandering to the conservative wing
00:29:28.280 of your audience?
00:29:29.420 Well, let me ask you.
00:29:31.900 Let me ask you.
00:29:33.340 Am I pandering more,
00:29:35.700 let's say more,
00:29:37.060 to the right wing of my audience?
00:29:39.800 What do you say?
00:29:41.980 I see mostly no's on locals.
00:29:45.540 I see mostly no's.
00:29:48.080 Now, but we would agree that everybody's biased, right?
00:29:52.580 Including me.
00:29:53.540 So I'm not without bias.
00:29:55.780 And let me explain my bias.
00:29:59.320 When I was younger,
00:30:01.300 I was probably more identified with Democrats
00:30:05.460 in terms of their social stuff.
00:30:07.520 But a lot of it was about privacy.
00:30:09.800 And, you know, just leave me alone.
00:30:13.900 And now I think that the Republicans
00:30:16.980 are more the free speech,
00:30:18.580 leave me alone party.
00:30:20.300 So I've always been free speech,
00:30:22.620 leave me alone.
00:30:23.900 It's just that the parties changed.
00:30:26.040 They reversed.
00:30:27.040 So I'm kind of the same.
00:30:28.940 But I'm now more compatible
00:30:30.820 with right-leaning philosophy
00:30:33.400 because the woke stuff
00:30:35.280 is absolute, complete bullshit.
00:30:39.260 So you can't...
00:30:41.100 There's no way I can support all the woke stuff.
00:30:44.960 So if it looks like I'm more right-wing
00:30:47.640 in the context of the woke stuff going crazy,
00:30:51.580 that's probably true.
00:30:53.240 That's probably true.
00:30:53.960 All right, here's an example
00:30:56.920 of me not being more right-wing.
00:30:59.960 You ready for this?
00:31:01.260 I saw actor Bryan Cranston
00:31:03.300 explain something to me
00:31:05.400 that for some reason
00:31:06.480 I had missed.
00:31:09.580 And that was his point.
00:31:11.020 His point was that
00:31:12.240 white people, mostly,
00:31:14.020 have a blind spot
00:31:15.480 for the phrase
00:31:17.320 make America great again.
00:31:18.740 And his point was
00:31:21.660 if you're black,
00:31:23.000 when was America great?
00:31:25.940 And I wanted to...
00:31:26.900 I immediately wanted to argue.
00:31:28.140 I was like, well...
00:31:30.220 You know, my first thought was,
00:31:32.300 well, come on.
00:31:33.600 You know, America was certainly
00:31:34.740 great in the 60s, right?
00:31:37.000 But if you're black,
00:31:38.680 maybe that was not
00:31:40.880 such a great time.
00:31:42.640 You know what I mean?
00:31:44.520 And so I actually agree
00:31:46.240 with that point.
00:31:47.020 But I'd never heard it
00:31:48.040 expressed that way.
00:31:49.100 And the reason is,
00:31:50.160 when I hear
00:31:50.900 make America great again,
00:31:53.280 I don't think
00:31:54.240 so much domestically.
00:31:56.420 Or I don't think
00:31:57.200 so much about individuals.
00:31:58.820 Do you?
00:31:59.680 I don't think so much
00:32:00.820 about the specific people.
00:32:02.940 I think about the country
00:32:04.640 as it would be seen
00:32:06.360 from another country.
00:32:08.180 When I hear
00:32:08.880 make America great again,
00:32:10.600 I think, oh,
00:32:12.060 people in France
00:32:13.220 will say that's a great country.
00:32:15.140 Or people in China
00:32:16.020 will say,
00:32:17.000 well, they might be
00:32:17.680 our rivals,
00:32:18.720 but, man,
00:32:19.240 they do things right
00:32:20.000 over there.
00:32:20.780 It's a strong country.
00:32:22.640 So when I hear
00:32:25.300 make America great again,
00:32:27.800 which, by the way,
00:32:28.700 I've never embraced.
00:32:30.320 You know that, right?
00:32:31.960 You know that I've
00:32:32.760 never embraced
00:32:33.380 that slogan,
00:32:34.420 make America great again.
00:32:35.620 I've talked about it.
00:32:37.180 And I've talked about it
00:32:37.960 being one of the strongest
00:32:38.960 campaign slogans
00:32:40.260 of all time.
00:32:41.220 But I don't use it.
00:32:43.140 You know,
00:32:43.300 I don't wear the hat.
00:32:44.180 I don't use the hashtag.
00:32:47.300 And it's because
00:32:48.040 it always bothered me
00:32:49.080 a little bit.
00:32:50.720 I wasn't sure
00:32:51.700 what it was.
00:32:53.600 And it probably
00:32:54.240 wasn't this,
00:32:55.040 but this is a good
00:32:55.820 enough reason.
00:32:56.660 I think that's a good point.
00:32:58.460 I think that if you're
00:32:59.640 trying to make a slogan
00:33:00.920 that fits everybody,
00:33:03.280 and you were a black
00:33:04.200 American,
00:33:04.780 and you said,
00:33:05.360 wait a minute,
00:33:06.620 what exactly is the year
00:33:08.520 that you're going back to
00:33:09.460 that was so great
00:33:10.280 when my people
00:33:11.860 were worse off
00:33:12.660 than they are now?
00:33:15.020 That's a pretty good question.
00:33:17.700 Now,
00:33:18.180 does that feel like
00:33:18.840 I'm pandering to the right?
00:33:21.320 Because I just told
00:33:22.320 people on the right
00:33:22.960 that make America
00:33:23.980 great again
00:33:24.760 is kind of a problem.
00:33:27.500 No,
00:33:27.980 I think that I don't
00:33:28.860 have any problem
00:33:29.540 criticizing people
00:33:30.700 on the right.
00:33:32.200 And I just got done
00:33:33.220 talking about
00:33:33.740 the Soros thing
00:33:34.620 making no sense to me.
00:33:37.380 So even just today,
00:33:38.880 I've criticized
00:33:39.700 the Soros theory
00:33:41.300 that comes from the right,
00:33:42.880 make America great again,
00:33:44.600 you know,
00:33:45.380 the most famous thing
00:33:46.560 from the right.
00:33:48.080 But I'm also solidly
00:33:50.100 philosophically more aligned
00:33:52.880 with conservatives,
00:33:54.920 at least on the woke stuff,
00:33:57.640 you know,
00:33:57.820 not on religion,
00:33:59.160 because I'm not a believer.
00:34:01.080 And on the,
00:34:03.320 and you know my opinion
00:34:04.260 on abortion
00:34:05.380 is that women should
00:34:06.240 figure it out
00:34:06.800 and let us know.
00:34:07.440 So I'm way left
00:34:10.240 of the conservatives
00:34:12.960 on that.
00:34:14.320 I'm left of the liberals
00:34:15.720 on that.
00:34:19.040 All right.
00:34:21.480 Build back better.
00:34:23.080 Yeah,
00:34:23.280 build back better
00:34:24.180 at least allows
00:34:26.640 that the future
00:34:27.320 will be better
00:34:27.920 than the past.
00:34:28.980 And I also think
00:34:29.960 make America great again
00:34:31.520 is a little backwards thing,
00:34:33.240 backwards looking.
00:34:35.020 Like,
00:34:35.460 I don't want to be great
00:34:36.400 like we used to be great.
00:34:38.420 Do you?
00:34:39.580 I want to be great
00:34:40.760 in the new way
00:34:42.140 that makes sense
00:34:42.840 for the,
00:34:43.400 you know,
00:34:43.860 the future.
00:34:44.840 I want to be great
00:34:45.880 in the future.
00:34:46.440 I don't want to be great
00:34:47.180 like the past.
00:34:48.460 That's not a goal for me.
00:34:49.700 What has Bryan Cranston
00:34:58.880 done for poor black folks?
00:35:01.460 Is that a fair question?
00:35:04.560 Pays taxes?
00:35:07.020 Bryan Cranston
00:35:07.980 probably pays
00:35:08.780 a lot of taxes.
00:35:11.140 Where does that go to?
00:35:12.380 A lot of it goes
00:35:15.000 to poor people
00:35:15.680 of all colors.
00:35:17.480 So Bryan Cranston
00:35:18.880 has done
00:35:19.500 way more
00:35:20.560 than you've done
00:35:21.280 for black people.
00:35:23.340 Fact.
00:35:24.440 Assuming he pays taxes.
00:35:26.000 Because he has
00:35:26.680 an enormous income.
00:35:29.020 He probably pays,
00:35:30.420 oh, let's say
00:35:30.820 he was making
00:35:31.380 $10 million a year.
00:35:32.980 He was probably paying
00:35:33.960 $5 million a year
00:35:35.120 in taxes.
00:35:36.140 What has he done
00:35:37.020 for black people?
00:35:38.540 He paid $5 million
00:35:39.600 a year in taxes.
00:35:40.840 Probably.
00:35:41.880 Something like that.
00:35:43.080 Probably.
00:35:44.400 That's a lot.
00:35:46.280 That's a lot.
00:35:47.480 All right.
00:35:56.020 Somebody says,
00:35:56.940 what have black people
00:35:57.680 done for you?
00:35:58.360 Are you kidding?
00:36:00.240 Have you noticed
00:36:00.940 anything about sports
00:36:02.340 or music
00:36:02.860 or fashion
00:36:03.640 or comedy?
00:36:07.200 Yeah, I was in
00:36:08.140 a spaces conversation
00:36:10.700 with mostly black members.
00:36:13.400 And they challenged me
00:36:17.340 to say, you know,
00:36:18.700 what would America
00:36:19.420 look like
00:36:20.120 without the black influence?
00:36:24.860 You know,
00:36:25.520 would it be as good?
00:36:27.280 And I was like,
00:36:27.940 try to imagine
00:36:29.060 American sports,
00:36:31.120 fashion, music,
00:36:33.140 entertainment.
00:36:33.680 Just try to imagine
00:36:36.400 America without
00:36:37.620 the influence
00:36:38.420 of black America.
00:36:40.000 Yeah, even rock
00:36:40.760 wouldn't exist.
00:36:42.200 Right?
00:36:42.420 Because even
00:36:43.060 our popular music
00:36:44.320 came from,
00:36:45.280 you know,
00:36:45.540 the blues,
00:36:46.200 et cetera.
00:36:47.080 So no,
00:36:47.720 you can't even imagine.
00:36:48.580 America would be,
00:36:50.640 it would look like
00:36:51.300 Great Britain.
00:36:51.820 The best things
00:36:57.360 about America
00:36:58.140 are that you can
00:36:59.420 come over here
00:37:00.160 and even though
00:37:01.520 there's a bunch
00:37:02.240 of white people,
00:37:03.000 you can still get
00:37:03.680 good food.
00:37:05.000 That's probably
00:37:05.620 one of the best things
00:37:06.360 about America
00:37:06.980 is you have enough
00:37:08.640 diversity
00:37:09.200 that if you need
00:37:10.700 some Thai food,
00:37:12.120 well, you can get
00:37:12.660 some Thai food.
00:37:14.120 If you need some
00:37:14.660 Indian food
00:37:15.260 so you can have
00:37:15.800 some flavor for once,
00:37:17.460 you can get
00:37:17.860 some Indian food.
00:37:18.580 I like that.
00:37:23.920 Now, crime,
00:37:25.040 of course,
00:37:25.340 is a big issue
00:37:26.360 of not ignoring it,
00:37:28.240 but you can't,
00:37:29.500 you can't say
00:37:30.800 that, you know,
00:37:31.620 black people
00:37:33.100 in America
00:37:33.640 didn't make America
00:37:34.620 a better place
00:37:35.420 in a whole bunch
00:37:36.680 of areas.
00:37:37.400 That's just obvious.
00:37:39.080 But there are
00:37:39.880 other areas
00:37:40.460 where crime
00:37:41.780 in particular
00:37:42.340 where that's,
00:37:43.380 that's a pretty
00:37:44.000 big challenge.
00:37:47.780 All right.
00:37:48.580 Is Ben Carson
00:37:51.740 in the news?
00:37:53.380 I don't know.
00:37:57.380 So what is
00:37:58.220 the net benefit?
00:37:59.800 Is that the way
00:38:00.700 to do it?
00:38:02.200 Should you do
00:38:02.860 a net benefit?
00:38:04.520 That feels just
00:38:05.580 super racist.
00:38:09.540 Why would he even
00:38:10.780 want to do
00:38:11.280 a net benefit
00:38:13.020 calculation
00:38:13.720 for black people
00:38:14.760 in America
00:38:15.240 unless you're racist?
00:38:16.900 Like, what would
00:38:17.580 the possible benefit
00:38:18.640 of that be?
00:38:19.740 There's nothing
00:38:20.460 you could do
00:38:20.980 with that
00:38:21.440 except be racist.
00:38:24.740 It'd be funny.
00:38:25.680 Somebody says,
00:38:26.140 you could never
00:38:26.600 calculate it.
00:38:27.640 We'd never agree
00:38:28.480 on what's,
00:38:29.160 what are the ins
00:38:29.700 and what are the outs.
00:38:33.220 All right.
00:38:33.780 Do you believe,
00:38:38.040 do you believe
00:38:38.740 that I'm,
00:38:39.340 I'm being dishonest
00:38:41.380 to avoid cancellation?
00:38:43.860 I see some
00:38:44.740 accusations
00:38:45.260 coming over.
00:38:46.300 Do you believe
00:38:46.900 that I'm,
00:38:47.500 I'm intentionally
00:38:49.160 being more woke
00:38:51.100 than I normally
00:38:52.240 would be?
00:38:54.060 A lot of people
00:38:54.860 say yes.
00:38:56.200 I, I think
00:38:57.100 that maybe
00:38:57.520 if you had
00:38:58.420 watched me
00:38:58.940 for a longer
00:38:59.520 period of time
00:39:00.460 that that opinion
00:39:01.720 would be revised.
00:39:02.760 Because I have
00:39:04.540 a very long history
00:39:05.560 of being
00:39:06.940 left of Bernie
00:39:08.000 on social stuff.
00:39:09.920 I've been saying
00:39:10.580 it for five years
00:39:11.980 that I'm left of Bernie.
00:39:13.620 So if I pull out
00:39:14.540 something that sounds
00:39:15.340 left of Bernie,
00:39:16.460 that's not new.
00:39:19.140 That's who I've been
00:39:20.100 the whole time
00:39:20.680 in public life.
00:39:27.880 56 degrees
00:39:28.920 in Minneapolis.
00:39:29.700 Well, how about that?
00:39:32.280 Am I a libertarian?
00:39:33.640 No.
00:39:35.920 I, I,
00:39:36.760 as soon as I
00:39:37.720 label myself,
00:39:39.180 I'm less useful.
00:39:41.860 The, the moment
00:39:42.880 you say,
00:39:43.480 I am one of these,
00:39:44.940 you're, you're just
00:39:46.820 less useful
00:39:47.640 to everybody
00:39:48.300 the moment
00:39:49.340 you label yourself.
00:39:54.540 What do I think
00:39:55.460 motivates Soros?
00:39:56.460 Um, I'll give you
00:39:58.380 my best hypothesis.
00:40:00.660 My hypothesis
00:40:01.700 is that he's
00:40:03.000 not in control
00:40:03.880 of his own money
00:40:04.740 anymore.
00:40:06.320 Because
00:40:06.840 if he wanted
00:40:08.520 something specific,
00:40:10.240 he would tell you
00:40:11.540 in direct language.
00:40:13.520 He would say,
00:40:14.560 I'm trying to
00:40:15.220 accomplish X,
00:40:16.360 so I'm giving money
00:40:17.560 to make that happen,
00:40:18.880 and he doesn't.
00:40:20.340 It sounds like
00:40:21.100 he gives money
00:40:21.680 to groups
00:40:23.540 that he thinks
00:40:24.240 makes him look good.
00:40:25.340 Oh, I'll give
00:40:26.920 my money to
00:40:27.760 what's called
00:40:29.180 the Color of Change.
00:40:31.420 Because they do
00:40:32.240 a bunch of
00:40:32.660 progressive things.
00:40:34.280 And then
00:40:34.600 what they do
00:40:35.340 with the money,
00:40:36.260 I think,
00:40:37.900 is where the problem is.
00:40:39.320 So I think
00:40:39.740 the problem is
00:40:40.380 that the people
00:40:41.160 he gives money to
00:40:42.120 do bad things
00:40:43.560 with the money,
00:40:44.380 and he doesn't have
00:40:45.420 full control of that,
00:40:46.760 and maybe he isn't
00:40:47.520 too happy about it,
00:40:48.460 and maybe he doesn't
00:40:49.040 even know about it.
00:40:50.420 And maybe he's just
00:40:51.100 not fully there.
00:40:52.760 So I think
00:40:53.500 there's a competence
00:40:57.360 problem,
00:40:59.520 his competence.
00:41:00.680 I think there's
00:41:01.560 a communication
00:41:02.260 problem,
00:41:03.600 that he doesn't
00:41:04.200 know exactly
00:41:04.780 where his money
00:41:05.400 is going and
00:41:06.000 for why.
00:41:08.000 And he's probably
00:41:08.940 just trying to
00:41:09.540 make the best of it
00:41:10.420 with his limited
00:41:11.620 mental capacity
00:41:12.880 and his limited
00:41:13.740 vision about
00:41:14.480 where the money
00:41:15.120 is going.
00:41:16.640 And maybe he
00:41:17.400 doesn't even care.
00:41:18.460 I mean,
00:41:18.680 maybe he's just
00:41:19.200 so old he doesn't
00:41:19.880 even care at this
00:41:20.560 point.
00:41:26.580 Warren Buffett
00:41:27.260 has the same
00:41:27.740 problem?
00:41:28.240 Could be.
00:41:29.520 Yeah,
00:41:29.840 if you have
00:41:30.300 a lot of money,
00:41:31.620 it's really a
00:41:32.220 problem to give
00:41:32.900 it to charities
00:41:33.580 because it can
00:41:34.920 distort the charity,
00:41:36.160 first of all.
00:41:37.100 If you give a charity
00:41:37.960 too much money,
00:41:38.820 suddenly they're
00:41:39.360 going to find
00:41:40.640 reasons to spend
00:41:41.500 it.
00:41:44.000 So there's
00:41:44.580 probably no
00:41:45.020 right answer.
00:41:45.840 If you're going
00:41:46.360 to give a massive
00:41:47.080 amount of money
00:41:47.720 to charity,
00:41:48.300 you're going
00:41:49.440 to give it
00:41:49.780 to some
00:41:50.100 bad people.
00:41:55.420 All right.
00:41:59.180 You only
00:41:59.840 donate to
00:42:00.380 the Gates
00:42:00.740 Foundation.
00:42:03.100 Well,
00:42:03.600 a lot of
00:42:03.960 people think
00:42:04.420 Bill Gates
00:42:04.900 is the devil.
00:42:06.160 All right,
00:42:06.340 what do you
00:42:06.640 think?
00:42:07.000 Who is worse,
00:42:07.740 Bill Gates
00:42:08.200 or George Soros?
00:42:10.540 According to
00:42:11.140 you.
00:42:14.180 According to
00:42:14.820 you,
00:42:15.220 not according
00:42:15.700 to me.
00:42:16.160 I think
00:42:19.620 Gates is
00:42:20.240 not trying
00:42:20.980 to hurt
00:42:21.340 anybody.
00:42:22.640 I think
00:42:22.960 he's trying
00:42:23.320 to help,
00:42:23.900 period.
00:42:25.420 Soros,
00:42:25.920 I just
00:42:26.160 don't
00:42:26.400 understand.
00:42:27.400 That's
00:42:27.580 just like
00:42:28.240 a black
00:42:28.620 box to
00:42:29.140 me.
00:42:29.440 It's
00:42:29.620 just a
00:42:29.900 mystery.
00:42:34.340 Yeah.
00:42:40.400 Erica,
00:42:40.960 I will
00:42:41.220 change your
00:42:41.700 mind.
00:42:43.700 The thing
00:42:44.420 with Soros
00:42:45.400 and Gates
00:42:46.060 is that
00:42:47.720 I'm not
00:42:48.980 going to
00:42:49.240 argue
00:42:49.520 whether he
00:42:50.400 did some
00:42:50.920 things that
00:42:51.440 worked out
00:42:51.960 bad for
00:42:53.320 the world.
00:42:54.440 That's a
00:42:54.960 separate
00:42:55.220 conversation.
00:42:56.220 The only
00:42:56.600 conversation I'd
00:42:57.360 have is what
00:42:57.860 is their
00:42:58.160 motivation?
00:42:59.620 Why is it
00:43:00.520 that they do
00:43:01.000 what they
00:43:01.300 do?
00:43:02.160 And I
00:43:02.600 do not
00:43:03.760 believe that
00:43:05.160 either person
00:43:05.980 is motivated
00:43:06.640 by evil.
00:43:08.440 I don't
00:43:09.080 believe that.
00:43:10.260 There's
00:43:10.560 something going
00:43:11.140 on that we
00:43:11.900 don't understand
00:43:12.680 with Soros.
00:43:14.700 But it
00:43:15.300 doesn't look
00:43:16.020 like it's
00:43:16.460 just some
00:43:16.820 kind of
00:43:17.140 satanic
00:43:17.660 evil
00:43:18.100 take over
00:43:19.520 the world
00:43:19.900 kind of
00:43:20.260 thing.
00:43:20.640 It doesn't
00:43:21.060 look like
00:43:21.500 it.
00:43:22.180 I don't
00:43:22.640 know what
00:43:22.860 it is.
00:43:23.280 I don't
00:43:28.700 think it
00:43:29.020 has to
00:43:29.320 do with
00:43:29.560 being
00:43:29.780 Jewish.
00:43:31.060 I don't
00:43:31.400 think it
00:43:31.700 has to
00:43:32.000 do with
00:43:32.500 being,
00:43:34.020 I don't
00:43:34.520 know what
00:43:34.740 it is.
00:43:37.800 I don't
00:43:38.340 think it's
00:43:38.620 money.
00:43:39.680 I really
00:43:40.000 don't think
00:43:40.360 it's
00:43:40.520 money.
00:43:43.200 Those of
00:43:43.800 you who
00:43:44.100 have never
00:43:44.540 been rich,
00:43:45.760 you need
00:43:46.240 to take
00:43:46.560 my word
00:43:46.960 for it.
00:43:48.060 If you're
00:43:48.660 86 and
00:43:49.440 you've
00:43:49.620 been rich
00:43:50.040 for most
00:43:50.660 of your
00:43:50.980 adult life,
00:43:52.160 you're
00:43:52.420 just not
00:43:52.980 fighting for
00:43:54.240 money at
00:43:54.660 86.
00:43:55.880 That's
00:43:56.100 just not
00:43:56.520 happening.
00:43:57.700 I'm 30
00:44:00.420 years younger
00:44:01.020 than Soros.
00:44:02.480 I've already
00:44:03.200 stopped being
00:44:04.360 primarily
00:44:04.920 motivated by
00:44:05.720 money.
00:44:07.120 I'm motivated
00:44:07.980 in the sense
00:44:08.500 that more
00:44:09.860 money gives
00:44:10.340 you more
00:44:10.600 influence and
00:44:11.320 stuff like
00:44:11.740 that.
00:44:13.320 I'm kind
00:44:14.040 of over it
00:44:14.760 as my
00:44:15.200 primary
00:44:15.620 motivation,
00:44:16.880 where it
00:44:17.220 used to
00:44:17.520 be.
00:44:18.040 It used
00:44:18.380 to be
00:44:18.660 trying to
00:44:19.060 make as
00:44:19.340 much money
00:44:19.760 as I
00:44:20.020 can when
00:44:20.440 I was
00:44:20.620 young.
00:44:21.840 But I
00:44:22.460 don't
00:44:22.580 believe
00:44:22.800 it.
00:44:23.100 No.
00:44:23.520 I think
00:44:24.040 whoops.
00:44:27.860 Oh.
00:44:29.300 Well,
00:44:30.660 looks like
00:44:31.540 we've
00:44:31.860 completely
00:44:32.500 lost
00:44:33.080 our
00:44:33.340 signal
00:44:33.760 here
00:44:34.320 on
00:44:35.180 the
00:44:36.200 locals
00:44:36.520 platform
00:44:37.100 for
00:44:40.140 reasons
00:44:40.560 that
00:44:40.760 aren't
00:44:40.940 clear.
00:44:48.080 That's
00:44:48.720 weird.
00:44:49.640 Oh,
00:44:49.920 I know
00:44:50.140 what's
00:44:50.300 going
00:44:50.480 on.
00:44:54.760 Let's
00:44:55.240 know.
00:44:55.880 That doesn't
00:44:56.180 make sense.
00:44:57.580 All right.
00:44:57.900 Well,
00:44:58.080 I'm going
00:44:58.300 to have
00:44:58.500 to fix
00:45:00.280 that later.
00:45:02.880 But I'm
00:45:04.720 glad you're
00:45:05.180 here at
00:45:05.540 least.
00:45:07.960 That was
00:45:08.560 my problem.
00:45:09.240 That was
00:45:09.500 my device
00:45:10.460 ran out
00:45:11.540 of juice.
00:45:16.260 You've
00:45:16.660 merged.
00:45:17.020 Naming
00:45:23.760 the
00:45:24.020 what?
00:45:28.340 If I
00:45:29.020 only used
00:45:29.400 a
00:45:29.520 sense of
00:45:29.960 humor
00:45:29.960 to oppose
00:45:30.480 transhumanism,
00:45:31.420 it would
00:45:31.600 have zero
00:45:31.980 effect.
00:45:38.860 Locals
00:45:39.260 seems
00:45:39.640 buggy.
00:45:40.180 That was
00:45:40.480 on my
00:45:41.220 end.
00:45:41.900 My device
00:45:42.540 was low on
00:45:43.920 power.
00:45:44.220 Naming
00:45:48.600 the
00:45:48.900 what?
00:45:52.800 Read
00:45:53.240 the
00:45:53.400 Bible
00:45:53.620 about
00:45:53.920 the
00:45:54.080 resurrection?
00:45:54.800 I
00:45:54.920 probably
00:45:55.200 won't
00:45:55.420 do
00:45:55.600 that.
00:45:58.820 Well,
00:45:59.520 Locals has
00:46:00.300 a lot
00:46:00.540 more content
00:46:01.160 than you
00:46:01.620 have
00:46:01.900 anywhere
00:46:02.320 else.
00:46:03.020 So it's
00:46:03.320 the only
00:46:03.540 place you
00:46:03.940 can get
00:46:04.240 the
00:46:04.400 Dilbert
00:46:04.640 comic.
00:46:05.440 By
00:46:05.660 the way,
00:46:06.020 since I
00:46:06.400 have you
00:46:06.740 here on
00:46:07.340 YouTube,
00:46:09.400 it would
00:46:10.140 help me
00:46:10.480 out if
00:46:10.940 you
00:46:11.120 subscribed.
00:46:12.520 So there's
00:46:13.000 a subscribe
00:46:13.520 button on
00:46:14.200 there.
00:46:15.240 So if
00:46:15.560 you hit
00:46:15.740 that
00:46:15.920 button,
00:46:17.140 that would
00:46:17.840 be good
00:46:18.200 for me
00:46:18.600 if you'd
00:46:19.500 like to
00:46:19.780 do that.
00:46:23.980 Oh,
00:46:24.520 Locals is
00:46:25.020 down.
00:46:26.380 Is
00:46:26.820 Locals
00:46:27.240 down or
00:46:27.720 is it
00:46:27.960 just
00:46:28.160 I'm
00:46:28.880 down?
00:46:31.580 What
00:46:32.020 does
00:46:32.220 name the
00:46:32.720 noses
00:46:33.160 mean?
00:46:33.940 If I
00:46:34.280 see it
00:46:34.580 again,
00:46:34.860 I'm
00:46:34.980 going to
00:46:35.460 block
00:46:35.900 you.
00:46:37.480 Is
00:46:37.920 that
00:46:38.140 a
00:46:38.480 racist
00:46:38.760 thing?
00:46:40.380 Is
00:46:40.600 name the
00:46:41.040 noses
00:46:41.440 some
00:46:41.640 kind of
00:46:41.880 racist
00:46:42.120 thing?
00:46:43.520 I'm
00:46:51.840 not
00:46:52.020 depressed.
00:46:53.620 How much
00:46:54.220 do you
00:46:54.440 make on
00:46:54.840 YouTube?
00:46:55.820 It's
00:46:56.280 private,
00:46:57.260 but it's
00:46:57.880 not much.
00:47:00.300 It's
00:47:00.780 not a
00:47:01.060 big
00:47:01.220 percentage
00:47:01.620 of
00:47:01.860 anything.
00:47:05.320 All
00:47:05.760 right,
00:47:05.980 Locals is
00:47:06.460 down,
00:47:06.760 you say.
00:47:08.280 But is
00:47:09.340 it a
00:47:09.560 coincidence
00:47:09.880 that
00:47:10.260 Locals is
00:47:10.760 down at
00:47:11.140 the same
00:47:11.440 time that
00:47:11.840 my
00:47:12.080 Locals is
00:47:13.100 not
00:47:13.260 down.
00:47:18.280 And
00:47:18.840 I'm
00:47:19.060 back.
00:47:22.800 I'm
00:47:23.380 back on
00:47:23.740 my
00:47:23.920 phone.
00:47:25.540 All right
00:47:26.220 people,
00:47:27.740 I used
00:47:28.460 my phone
00:47:28.920 to sign
00:47:29.860 back on,
00:47:30.860 but I'm
00:47:31.640 just about
00:47:32.220 done.
00:47:33.480 So
00:47:33.920 don't
00:47:36.660 get too
00:47:36.960 excited.
00:47:38.300 My
00:47:38.520 iPad ran
00:47:39.380 out of
00:47:39.600 power.
00:47:40.680 User
00:47:41.140 error.
00:47:46.480 All right.
00:47:48.020 I don't
00:47:48.620 want you to
00:47:49.020 have to
00:47:49.260 look at
00:47:49.520 this angle
00:47:49.980 because it's
00:47:50.420 a terrible
00:47:50.780 angle.
00:47:52.220 It was
00:47:52.760 horrible
00:47:53.020 without me,
00:47:53.580 I know.
00:47:54.160 Yeah,
00:47:54.420 no,
00:47:54.680 it was my
00:47:54.980 fault.
00:47:55.420 Battery.
00:47:56.420 Soros got
00:47:56.980 my battery.
00:47:58.240 George
00:47:58.580 Soros made
00:47:59.220 my battery
00:47:59.720 drain.
00:48:00.060 I think
00:48:01.780 that's what
00:48:02.180 he's up
00:48:02.460 to.
00:48:03.720 All right,
00:48:04.460 YouTube,
00:48:04.920 I'm going to
00:48:05.160 see you
00:48:05.380 tomorrow.
00:48:06.600 Thanks for
00:48:07.020 joining.
00:48:07.900 Bye for
00:48:08.280 now.