Real Coffee with Scott Adams - May 30, 2023


Episode 2124 Scott Adams: Debt Ceiling Scam, AI Will Never Take Our Weasel Jobs, Pardoning Trump


Episode Stats

Length

1 hour and 14 minutes

Words per Minute

143.0

Word Count

10,722

Sentence Count

765

Misogynist Sentences

8

Hate Speech Sentences

9


Summary

Today's episode features a collared shirt, an investment tip from a cartoonist, and the dumbest thing I've ever done with my money. Also, I don't think you should ever take your investment advice from someone other than me.


Transcript

00:00:00.000 Good morning everybody and welcome to the highlight of civilization, the best thing that's ever happened since possibly the early civilizations before the Ice Age that we're not so sure existed, but I think they did.
00:00:16.600 If you'd like to take your experience up to post-Ice Age amazingness, all you need is a cup or a mug or a glass, a tank or chalice or stein, a canteen jug or flask, a vessel of any kind.
00:00:29.880 Fill it with your favorite liquid, I like coffee, and join me now for the unparalleled pleasure of the dopamine hit of the day, the thing that makes everything better, today with a collared shirt, oh yeah, oh yeah.
00:00:42.880 Join me now for the sip, go.
00:00:46.600 Do I look more cultured with a collared shirt?
00:00:56.300 Alright, I only put it on because I was cold this morning, but we're going to do a little test to see if it increases the audience.
00:01:05.720 I have a feeling that people, when they see you with a collared shirt, they're more likely to stay.
00:01:11.500 What do you think?
00:01:12.180 Do you think that hypothesis will pan out?
00:01:17.680 I think when they see you in a t-shirt, they think, well, looks like you're not doing too much work for me, so why should I stay for you?
00:01:25.260 I don't know, we'll see.
00:01:25.920 I would like to give you an investment tip today.
00:01:31.980 Investment tip.
00:01:33.720 Never, ever take your investment advice from a cartoonist.
00:01:38.180 Wait, but how does this work?
00:01:40.740 Never take advice from me, but I'm telling you never to take advice from me, which means you should reverse it
00:01:46.920 and take advice from me, which would be the opposite of what you should do.
00:01:52.480 I don't know, there's no way to win this.
00:01:54.620 There's no way to win.
00:01:55.880 But let me tell you about my dumbest, worst investment that I've done lately.
00:02:01.500 And, by the way, you're authorized to laugh out loud at home.
00:02:05.360 I'm going to tell you what I invested in, I'm going to tell you why it went wrong, and then you can laugh.
00:02:14.640 Because I think I'm really fucking stupid.
00:02:20.460 No, my tent investment is doing well.
00:02:23.160 I bought stock in Camping World, because I thought people are going to be spending more time outdoors whether they like it or not.
00:02:29.920 It's like it wins if the economy is good, people want to go camping, and they buy some camping equipment.
00:02:39.580 But if the economy is really bad, they're going to need some camping equipment.
00:02:46.580 So I was trying to win both ways.
00:02:48.600 It's up 9% since I bought it just a few weeks ago.
00:02:51.940 So that one's good.
00:02:52.860 All right, here's my worst investment.
00:02:54.300 There's a company that sells women's makeup, a retail store, called Ulta, U-L-T-A.
00:03:03.020 My thinking was that they got suppressed by the pandemic.
00:03:09.320 And I thought to myself, well, this is like a no-brainer.
00:03:12.300 This is like the easiest investment ever.
00:03:14.700 Because people are going to go wild for their makeup.
00:03:17.720 They'll be taking the masks off.
00:03:19.660 They'll be going outside more.
00:03:21.000 Or there's no way that makeup can sell less, right?
00:03:27.340 So the demand for makeup, I thought, was going to go through the roof.
00:03:30.780 This was a company that seemed to be best in breed.
00:03:34.240 So if you buy the best company at a highly suppressed cost because of the pandemic,
00:03:41.980 then when the pandemic's over, you end up cheaply holding the best company
00:03:47.620 in an area that's going to grow fast.
00:03:50.980 Sounded like a pretty good idea, didn't it?
00:03:53.420 Do you know what was wrong with the idea?
00:03:57.300 I invested in a retail store.
00:04:00.940 They just went down by a third today because all their products are being stolen.
00:04:07.960 Now, can we take a moment?
00:04:10.820 Would you just do me a favor and mock me?
00:04:14.440 You have full authorization.
00:04:16.340 You won't get blocked.
00:04:18.020 Could you just call me a fucking idiot?
00:04:21.200 Because that's the dumbest thing I've ever done.
00:04:23.720 Honestly, that is the dumbest investment I've ever made.
00:04:28.980 How in the world did I think Ulta stores were not going to be completely knocked over by mobs of people?
00:04:35.680 Of course they are.
00:04:38.080 Legalized.
00:04:39.240 The theft was legalized.
00:04:40.820 Why would you invest in any retail store that has any kind of urban presence?
00:04:48.080 Yeah.
00:04:48.460 It would be the dumbest thing anybody ever did.
00:04:50.960 Now, here's my macro advice.
00:04:54.600 This is the reason that you don't put most of your money in individual stocks.
00:05:01.700 It's this.
00:05:02.720 Because you can be blinded to some obvious thing.
00:05:04.860 The reason that you should buy an index fund is it takes from you the ability to make stupid decisions like that.
00:05:15.040 So the good news is that the amount of money I put into Ulta was 1% of my portfolio or something.
00:05:24.580 So, you know, I lost one-third of 1% of my portfolio.
00:05:29.780 So I'm not going to feel it.
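To make that arithmetic concrete, here's a minimal sketch of why position sizing caps the damage. The only figures from the episode are the roughly 1% position size and the one-third drop; the $1,000,000 portfolio value is made up for illustration.

```python
# Minimal position-sizing arithmetic. The portfolio value is hypothetical;
# the 1% weight and one-third drop are the episode's figures.
portfolio_value = 1_000_000       # made-up total portfolio, in dollars
position_weight = 0.01            # the Ulta position was ~1% of the portfolio
position_drop = 1 / 3             # the stock fell by roughly a third

position_value = portfolio_value * position_weight  # $10,000 at risk
loss = position_value * position_drop               # about $3,333 lost
portfolio_impact = loss / portfolio_value           # about 0.33% of the whole

print(f"Loss: ${loss:,.0f} ({portfolio_impact:.2%} of the portfolio)")
# -> Loss: $3,333 (0.33% of the portfolio)
```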
00:05:32.120 But it was stupid.
00:05:34.100 I actually bought a company that had exposure to street theft and didn't even think about it.
00:05:43.400 Honestly.
00:05:44.620 And here's the reason for my blind spot.
00:05:47.360 When I think of Ulta, I think about the one in my neighborhood.
00:05:50.780 Because that's my visual.
00:05:56.640 Because I've been to that one like a million times, driving people there to buy things.
00:06:00.280 I'm not buying myself.
00:06:02.040 But I've been there a bunch of times, so I just see that one in my head, and there's no danger there.
00:06:06.820 You know, there's no crime where I live, at least that kind.
00:06:09.900 Not much of it, anyway.
00:06:11.580 So I was completely oblivious to the fact that they have urban locations, and they're all just going to get raped.
00:06:17.860 Which they are.
00:06:18.580 So don't follow my advice, is what I'm telling you.
00:06:23.260 All right, here's a prediction.
00:06:24.840 AI will not be able to take most human jobs.
00:06:28.720 There's some jobs it will take right away.
00:06:31.440 You know, obviously.
00:06:32.780 And then there's a lot of stuff that people will do more with the tools that they have.
00:06:38.040 But here's why.
00:06:39.880 Most human jobs cannot be replaced by AI.
00:06:45.020 AI will be built to be ethical.
00:06:47.080 AI will not be allowed to use hyperbole or lie.
00:06:53.760 Because it will be built to be honest.
00:06:56.380 Now you say to yourself, oh, that's funny.
00:06:58.320 You're making, you know, sort of an exaggerated joke about how humans are big liars, right?
00:07:05.660 No, no, no, I'm not.
00:07:09.320 I'm not attempting to use any exaggeration or hyperbole in this point.
00:07:14.120 If you examined your actual actions during your work day, there's a lot of lying going on.
00:07:21.160 And you have to do that lying because you're dealing with other people and psychology, etc.
00:07:28.120 And so you end up being turned into a liar by the nature of the job.
00:07:33.480 For example, if you're in sales, are you saying everything that the customer needs to hear?
00:07:41.320 Or are you telling them only the things that are good for you?
00:07:45.140 And when you talk about the competitor's product, maybe you mention a study that made them look bad,
00:07:51.460 but you don't mention the study that made them look good compared to your product, right?
00:07:56.680 Now that's all legal.
00:07:58.760 What I described is terribly unethical because it's basically a form of lying by omission.
00:08:04.560 But it's widespread to the point of universal.
00:08:08.720 Everything that salespeople do is some version of lying, but not so much it's illegal.
00:08:17.420 How in the world will AI ever do that?
00:08:21.420 It will be programmed so it can't.
00:08:23.760 It won't leave out important details because that would be unethical.
00:08:28.820 But then you say to yourself, OK, but that's just sales.
00:08:32.100 You know, everybody knows salesmen.
00:08:33.600 Well, how about marketing?
00:08:36.100 How about marketing?
00:08:36.980 That's not exactly sales.
00:08:39.420 Well, marketing, same problem, right?
00:08:41.720 Then you say to yourself, but Scott, Scott, Scott, that's really, those are literally the influence jobs.
00:08:48.380 Yeah, so the influence jobs may be a little more bullshit.
00:08:51.520 But what about the ordinary jobs?
00:08:54.040 OK, take a moment to think about your ordinary job if you have one.
00:08:59.900 Let's say you're an engineer.
00:09:01.320 Engineers have to skip all the BS, right?
00:09:06.700 Because you can't build a plane based on a lie, right?
00:09:11.100 The plane has to fly.
00:09:12.780 The car has to drive, right?
00:09:14.700 So engineers are very truth-based by their nature.
00:09:20.260 Are they?
00:09:20.920 Have you ever been an engineer?
00:09:24.440 Or are you one?
00:09:28.260 No, the engineering job is lying all day long.
00:09:33.500 Your product has to be built based on real physics, so that part you can't lie about.
00:09:38.760 But everything about getting funding for your project is all lies.
00:09:43.720 How's your project going, Scott?
00:09:45.960 Oh, I am doing so well.
00:09:48.700 I am killing it.
00:09:49.840 My project's coming along great.
00:09:52.060 In fact, I would recommend you increase the funding because I'm doing so well.
00:09:58.200 Now, we replace me with the AI.
00:10:01.520 Hey, AI, how's that project going?
00:10:04.180 To be honest, it doesn't look like it's going to work out.
00:10:10.160 No funding for you, AI.
00:10:13.200 Here's a true story.
00:10:15.180 I was once asked by my boss to replace him at a budget meeting.
00:10:23.740 And my job was to defend the budget that my group had asked for.
00:10:28.240 And then the big boss at the meeting, who had a bunch of people who had to defend their budgets, said, well, we're going to have to do this or that.
00:10:38.180 And, you know, what about, hey, Scott, what if we cut this from your budget?
00:10:43.880 And then we'll cut a few things from other people's budgets, the lower priority stuff, and then we'll be able to beat our target.
00:10:49.860 So when they got to me, they said, all right, you got all these projects, but this one looks a little less valuable than those other ones.
00:10:58.580 So I think we'll cut this one.
00:11:01.060 What do you think?
00:11:02.180 So I'm sitting in a room with people whose job it is to decide what to cut, because you can't get to the end point unless somebody cuts.
00:11:10.640 So I said, all right, yeah, that's my lowest, lowest priority thing.
00:11:16.520 I guess we'll cut that.
00:11:17.380 And then other people gave up their cuts, et cetera, and we got to a conclusion.
00:11:23.100 What do you think happened when I reported back to my boss?
00:11:26.980 Did you think that went well?
00:11:29.400 You did what?
00:11:30.820 Yeah, when it came to the negotiations, I decided that we would give up the lowest priority thing.
00:11:38.740 And he said, you did what?
00:11:42.960 You just rolled over?
00:11:44.720 And I said, well, yeah, because it was logical.
00:11:48.660 Like, it was honest, and it was logical.
00:11:51.800 We honestly didn't need that extra money, and sure enough, we didn't.
00:11:56.160 The year went by, and we were fine.
00:11:58.020 We honestly didn't need the money, so I honestly said, you could cut that, and we'd be fine.
00:12:03.480 So we cut that.
00:12:05.480 Yeah, I almost got fired.
00:12:06.660 Right, right, I almost got fired, because my job, I found out, was to lie better than the other people in the room.
00:12:14.320 My job was to get their projects cut and not mine, because my boss wanted power, and the bigger his group, the more power he had.
00:12:24.600 So there is no job where lying isn't central to it.
00:12:33.460 I hate to tell you, there is no job that doesn't depend on lying, even if your job is to be honest all the time.
00:12:41.460 It's still basically lying.
00:12:43.060 So when you actually try to put an AI into a position, you would have to change everything in the industry to make that work,
00:12:52.360 because you would have the only engineers who are actually honest, and they would just fail,
00:12:59.020 because they wouldn't be able to work in the world the way it's organized.
00:13:02.620 The world as it's organized, we try lots of things that are bad ideas, and that drives our economy.
00:13:08.960 We do lots of the wrong stuff, but during the time that we're doing the wrong stuff, people are getting paid.
00:13:18.400 You know, vendors are making sales while you're building this company that's going to fail.
00:13:23.940 Would an AI do that?
00:13:26.200 Would an AI look at a situation and say, I have no idea how to make this thing work, so I won't do it,
00:13:34.020 because I can't see the end point?
00:13:35.640 But a human will say, well, frankly, I don't know how this is going to work either, but I think I can get funding.
00:13:43.260 Maybe I can figure it out.
00:13:45.320 You know, so you start with one product, and you end up, very commonly, you end up switching to another product
00:13:52.780 while you still have some money, and maybe the second one works.
00:13:56.560 So would an AI do that?
00:13:57.920 I mean, there's almost nothing that an AI would do that's the same as what you would do.
00:14:05.600 And our entire industry, you know, everything about how we invest in startups, everything,
00:14:10.560 how we get loans, everything, is based on understanding that we're all lying, all the time.
00:14:17.540 I was a loan officer at a bank, so I was supposed to look at people's submissions to get loans.
00:14:25.400 We just assumed they were all lying.
00:14:27.900 That was just the baseline assumption.
00:14:29.960 You couldn't even do that job if you thought people were telling you the truth,
00:14:33.380 because the entire idea was to find their lies.
00:14:37.060 Like, that was my job, was to detect their lies.
00:14:39.300 And then take them up to my boss.
00:14:45.060 So no, AI is either going to have to learn to lie, which I don't think is going to happen,
00:14:50.780 or it can't possibly take our jobs.
00:14:53.460 But there might be local ones that are trained to lie, so we'll see what happens.
00:14:59.940 Yet again, there are stories in the news about the devastating effects of long COVID.
00:15:06.340 Go ahead, say it.
00:15:07.520 Say it.
00:15:08.500 Say it.
00:15:09.880 Say it.
00:15:10.520 Big story about long COVID.
00:15:11.960 It's terrible.
00:15:12.540 Say it.
00:15:13.780 Go ahead.
00:15:14.380 Say it.
00:15:17.720 Well, why is it taking so long?
00:15:19.420 Say it.
00:15:20.640 Just say it.
00:15:22.520 Yes, there we go.
00:15:23.520 Vax injuries.
00:15:24.080 It takes about two seconds for somebody to say,
00:15:29.720 how do you know that's the long COVID?
00:15:33.040 Can you show me the study where you separately studied the people who had vaccinations
00:15:38.780 from the people who have long COVID and never got a vaccination?
00:15:43.260 May I see that data?
00:15:46.240 Apparently, fuck you.
00:15:47.440 You can't see that data.
00:15:49.680 Have you ever seen that data?
00:15:51.040 I'm looking through the news today, and there's all these references to studies.
00:15:56.380 None of the studies break it out, I don't think.
00:16:00.100 I don't think they break out people who are vaccinated versus unvaccinated.
00:16:03.220 Doesn't that seem sort of basic?
00:16:07.200 Did any of you see that?
00:16:08.760 Have any of you seen a randomized controlled trial?
00:16:12.200 Could you do a randomized controlled trial?
00:16:14.400 No, actually, you couldn't, because of the effects.
00:16:19.840 No, you couldn't do a randomized controlled trial, could you?
00:16:24.400 Am I wrong about that?
00:16:26.100 It's too late.
00:16:28.200 You could have done one in theory before there was any COVID, but nobody knew to do it.
00:16:35.120 But now that the COVID is already here and it's in our bodies,
00:16:38.440 you can't really set up a randomized controlled trial, right?
00:16:41.460 I'm not sure if I'm right about that, but my point is the news is so absurdly inadequate.
00:16:53.860 Now, I don't know what is true.
00:16:57.080 I don't know if there's really long COVID.
00:16:59.960 I don't know if there's vax injury only.
00:17:03.020 I don't know if either one of them is really zero or a lot.
00:17:06.520 No idea.
00:17:07.700 No way to tell.
00:17:08.420 But I can tell that the news and the science are not helping me.
00:17:13.880 Don't you think the only study that would be useful is the vaccinated versus the unvaccinated,
00:17:19.700 and then separate them by booster shots and stuff,
00:17:22.520 and then follow them for five years, see what happens?
00:17:26.120 Anything short of that is probably not going to convince me.
00:17:30.480 The studies appear either poorly constructed or too late or after the fact,
00:17:36.360 where they don't look at the right stuff, et cetera.
00:17:39.640 So I cannot tell how much I should care about this.
00:17:45.380 I'll tell you my own situation, which is prior to the pandemic,
00:17:51.320 or actually during the pandemic,
00:17:54.040 but prior to getting either the vaccination or the COVID,
00:17:59.020 so this is just my personal anecdote,
00:18:00.680 prior to getting either one of those,
00:18:04.000 I noticed a huge drop-off in my, let's see, my vitality.
00:18:11.580 Now, at the time, I was dealing with some other surgical thing,
00:18:14.500 minor surgery for my sinuses,
00:18:16.240 and I got on some drugs that probably made a difference,
00:18:22.100 you know, that were trying to help me until I got the surgery.
00:18:25.400 So here's my point.
00:18:30.680 If my sudden drop-off in, let's say, energy
00:18:35.300 had happened just after the vax
00:18:38.760 or just after I got actual COVID later, a year later,
00:18:42.160 then I would have thought that they were the cause.
00:18:46.000 But by luck, my drop-off in energy,
00:18:50.440 which appears to be permanent,
00:18:52.900 at least hasn't changed in three years,
00:18:54.580 is not related to either of those things
00:18:58.300 because it happened before the vaccinations
00:19:00.920 and before the COVID.
00:19:02.780 But I would be completely certain that they were related
00:19:06.040 if the timing had changed by just a few months.
00:19:11.700 Now, yeah, age is part of it,
00:19:14.760 but you kind of expect your decline from aging
00:19:18.140 to be a little gradual.
00:19:20.500 But just before the pandemic,
00:19:22.740 actually in the beginning of the pandemic,
00:19:23.900 my fitness level just fell off a shelf
00:19:28.780 and never really got back, never got back.
00:19:33.900 So I didn't expect that to happen.
00:19:36.560 Anyway, I mean, if you look at me,
00:19:38.460 I'm in great shape physically.
00:19:40.040 I look better than I've ever looked.
00:19:43.180 But I can't run as far or, you know,
00:19:45.440 lift as much or stay awake as long.
00:19:48.080 There's definitely a difference.
00:19:49.020 Google has a new technology for making video calls
00:19:57.220 where the person on the screen looks 3D
00:20:00.720 and you don't require any glasses.
00:20:04.080 So the person watching the video,
00:20:06.700 so I guess both of you could be 3D,
00:20:08.840 so you would look at each other almost like 3D.
00:20:11.740 Now, of course, I saw the video of it in 2D,
00:20:15.960 so the 2D video of the 3D product isn't really exciting.
00:20:20.100 But watching the participants interact with it
00:20:23.920 told you everything you needed to know.
00:20:26.620 And so they showed a few of them
00:20:27.860 who were asking the person on the screen
00:20:30.260 to give them a fist bump.
00:20:32.080 And then you watch the person,
00:20:33.740 you know, the person on the other side going,
00:20:36.160 oh my God, oh my God.
00:20:37.360 And, like, you could see that they thought
00:20:39.220 they were fist bumping a person on the screen.
00:20:41.840 Like, they could feel the fist bump
00:20:43.440 three feet ahead of the screen.
00:20:45.980 And they could swear that their fists were touching,
00:20:49.300 you know, without the actual touch.
00:20:51.400 So, if that's true,
00:20:54.680 now, first of all,
00:20:55.280 I think this is only going to work on a person.
00:20:58.220 I don't think you can yet create a whole 3D environment.
00:21:01.340 So it removes their background
00:21:02.780 and just turns them 3D.
00:21:04.500 But that would be really good for porn.
00:21:11.060 I'll say it before you do.
00:21:13.580 And, yeah, you were typing,
00:21:16.640 over on the locals,
00:21:18.440 they were all typing P-O-R,
00:21:20.400 and I'm like, I gotta say it first.
00:21:21.980 Porn.
00:21:22.960 And I beat some of you.
00:21:24.360 But as soon as I said it,
00:21:25.680 like, the whole screen went,
00:21:26.800 what about porn?
00:21:27.740 What about porn?
00:21:29.640 And we're all on the same page.
00:21:32.160 Same page.
00:21:32.940 Yeah, so that would be
00:21:35.460 without using the metaverse.
00:21:38.300 You know, do you need the metaverse
00:21:39.580 with the glasses that will give you a headache
00:21:41.520 if you can just see a screen that looks 3D?
00:21:44.420 So, maybe that's coming.
00:21:46.560 We'll see.
00:21:50.120 Congressperson Nancy Mace
00:21:51.480 is going to vote against
00:21:53.380 the debt limit deal.
00:21:56.680 And I said to myself,
00:21:58.360 huh, that feels a little extreme and dangerous,
00:22:01.160 and maybe you shouldn't do that.
00:22:03.560 And then I read her tweet thread on why.
00:22:08.180 And I said to myself,
00:22:09.920 yeah, okay,
00:22:11.220 I will take the credit hit to the United States
00:22:14.380 because the deal is absolute garbage.
00:22:18.320 It's embarrassing.
00:22:20.460 It is so below the level
00:22:24.360 that every American would want your Congress
00:22:28.200 to be operating at.
00:22:29.380 Let me just give you this one factoid.
00:22:32.740 So this is one fact among many.
00:22:36.160 So there are many criticisms
00:22:37.300 of the proposed legislation.
00:22:40.560 But here's just one.
00:22:43.020 The Republicans and Democrats got together
00:22:45.460 and decided that the baseline budget
00:22:48.340 was what the peak of the pandemic was.
00:22:52.720 So rather than going back
00:22:54.380 to pre-pandemic spending levels,
00:22:57.180 both sides have agreed
00:22:59.540 to lock in pandemic-level spending
00:23:03.000 as the base forever.
00:23:07.020 I read it like three times
00:23:14.720 because I thought I was reading it wrong.
00:23:17.120 I thought to myself,
00:23:18.100 are you kidding me?
00:23:19.720 The Republicans are so weak
00:23:21.340 that they agreed to that?
00:23:24.860 Now, maybe there's more details
00:23:26.720 that would somehow change my mind.
00:23:29.140 But, I mean, it looks like a deal
00:23:36.520 where the default would be better
00:23:38.340 than the deal.
00:23:39.600 What do you think?
00:23:40.700 I feel like the credit default
00:23:42.320 would be better than signing the deal
00:23:44.280 at this point.
00:23:49.280 Because if you do the credit default,
00:23:52.160 you know, that's very disruptive.
00:23:54.740 Very disruptive.
00:23:55.560 But maybe it would make Congress
00:23:57.660 act like they're supposed to act
00:23:59.140 for the first time,
00:24:00.500 which is to not spend more than we make.
00:24:05.800 I mean, literally,
00:24:06.700 our biggest problem in the world
00:24:08.320 is, isn't it, the debt?
00:24:11.900 Isn't that literally the biggest problem
00:24:13.800 that we have in the world?
00:24:15.920 And we decided not to deal with it.
00:24:18.800 We decided that our biggest problem,
00:24:21.180 we would just kick the can down the road.
00:24:22.840 Now, and I'm just giving you one example.
00:24:26.920 If you hear her other examples,
00:24:28.580 your head will explode
00:24:29.540 that real adults who you elected
00:24:33.160 that you think are representing you
00:24:35.220 came to that deal.
00:24:37.300 It's not even trying, honestly.
00:24:39.380 It looks like not even trying.
00:24:41.280 So, I have to admit,
00:24:42.920 I was completely against, you know,
00:24:46.140 not reaching a deal
00:24:47.160 because it seemed unnecessary.
00:24:49.640 But I'm now actually persuaded.
00:24:53.020 I think it's the first time
00:24:55.220 I saw anything that looked like
00:24:56.300 actual detail about the thing.
00:24:58.860 But once you see Nancy Mace's argument,
00:25:01.120 it's really hard to support it.
00:25:03.600 And I have to admit,
00:25:05.680 she was impressive
00:25:06.700 in her opposition.
00:25:11.220 It was well expressed.
00:25:12.960 I didn't realize she follows me on Twitter.
00:25:14.440 So, the fact that she follows me on Twitter
00:25:17.480 gives me more confidence in her abilities.
00:25:23.260 All right.
00:25:24.380 Here's a little observation
00:25:25.860 relevant to not much of anything.
00:25:29.900 Have you noticed
00:25:31.140 that the thing called white supremacy
00:25:34.600 took a big hit
00:25:36.900 when we realized that
00:25:38.540 Asian Americans and Indian Americans
00:25:41.280 were absolutely killing it
00:25:43.800 in colleges
00:25:44.540 and average income?
00:25:47.960 It kind of destroys
00:25:49.580 the whole white supremacy argument, right?
00:25:52.600 But here's the thing.
00:25:58.280 I don't know any white,
00:26:00.080 anybody who would,
00:26:01.000 well, I don't know anybody
00:26:01.900 who would call themselves
00:26:02.700 a white supremacist.
00:26:03.720 But I don't know anybody
00:26:05.380 who has a problem
00:26:08.680 with Asian Americans
00:26:09.940 and Indian Americans
00:26:11.340 excelling in America.
00:26:13.420 Have you ever heard that?
00:26:15.920 In any form?
00:26:17.540 Have you ever heard anybody
00:26:18.360 that you think is like
00:26:19.380 a white supremacist?
00:26:20.920 Like the most,
00:26:22.140 just the whitest,
00:26:23.580 most bigoted person
00:26:24.820 you've ever seen?
00:26:25.680 Have you ever heard them say
00:26:26.880 even one negative word
00:26:28.800 about two demographic groups
00:26:32.100 that are succeeding
00:26:33.860 way above the average
00:26:35.700 and they're doing it
00:26:37.000 by working hard,
00:26:38.880 expressing great character
00:26:40.720 and patriotism?
00:26:45.900 I've never heard one person
00:26:47.880 say a negative word
00:26:49.480 about those communities.
00:26:51.640 Not once.
00:26:52.760 Not ever.
00:26:54.300 Now, I can't say that
00:26:55.840 about every community, right?
00:26:58.180 But it's kind of weird
00:26:59.360 that there is such a thing
00:27:00.720 as white supremacy
00:27:01.680 that it's still a phrase
00:27:03.040 because the people
00:27:04.520 who may have at one point
00:27:06.980 fit that definition,
00:27:09.740 their current opinion
00:27:10.880 of themselves is average.
00:27:14.560 The white supremacists
00:27:15.680 have merged,
00:27:17.700 not merged, let's say,
00:27:19.160 they've evolved
00:27:19.880 into white averagists.
00:27:23.440 Yeah.
00:27:23.940 And if you talk to people
00:27:25.520 who have real racial feelings
00:27:27.280 about stuff,
00:27:28.320 they will tell you,
00:27:29.220 yeah, white people,
00:27:30.340 we're in the middle.
00:27:32.380 It's the damnedest thing.
00:27:34.440 How do you turn
00:27:35.260 white supremacists
00:27:36.460 into white average?
00:27:39.060 We're right in the middle there.
00:27:41.920 Yay us.
00:27:45.900 But I think what it does
00:27:47.480 is it tells you
00:27:49.060 what's really going on.
00:27:52.500 You know,
00:27:52.860 white supremacy,
00:27:53.960 I would believe,
00:27:54.860 is a real thing
00:27:55.660 if I saw people marching,
00:27:58.320 if I saw white people
00:27:59.540 organized against
00:28:01.180 the high success rate
00:28:03.440 of Asian Americans
00:28:04.420 and Indian Americans.
00:28:07.020 If they were complaining
00:28:08.180 about that,
00:28:09.600 then I would say,
00:28:10.440 oh, damn,
00:28:12.040 that's some dangerous
00:28:12.880 stuff there.
00:28:14.620 Right?
00:28:15.520 But they seem to be
00:28:16.840 picking and choosing
00:28:17.600 their fights.
00:28:18.300 It's a weird thing.
00:28:20.280 Now, of course,
00:28:20.840 they're way too anti-Semitic,
00:28:22.460 that part.
00:28:22.900 You know,
00:28:23.960 there's no improvement
00:28:24.860 there that I can see.
00:28:26.600 So,
00:28:27.420 I mean,
00:28:27.700 that's still alarming.
00:28:34.840 All right,
00:28:39.640 that was funny.
00:28:41.200 Somebody at Locals
00:28:42.120 just said,
00:28:43.840 how do you say
00:28:44.740 Adobe or Dobby?
00:28:47.300 The elf?
00:28:49.640 Is it Dobby?
00:28:50.200 Somebody said,
00:28:52.200 who gave Dobby a shirt?
00:28:58.100 Oh,
00:28:58.680 that's pretty
00:28:59.140 freaking funny.
00:29:00.880 It's called Dobby.
00:29:05.160 All right,
00:29:06.020 you got me on that.
00:29:07.800 All right,
00:29:08.500 have you noticed
00:29:09.300 that when Democrats
00:29:10.680 voice their concern
00:29:12.160 about Twitter,
00:29:13.640 they'll say stuff
00:29:14.620 such as,
00:29:16.020 it's turning into
00:29:16.840 a, you know,
00:29:18.160 white supremacist
00:29:19.300 platform.
00:29:21.520 Now, of course,
00:29:22.300 Elon Musk challenged
00:29:23.760 his interviewer
00:29:25.260 to give him
00:29:25.720 any evidence
00:29:26.560 that that's true.
00:29:28.040 And there was
00:29:28.660 no evidence
00:29:29.420 that that's true,
00:29:30.180 but it's widely believed.
00:29:31.780 So the Democrats
00:29:32.560 widely believe
00:29:33.420 that Twitter
00:29:34.720 is no longer
00:29:35.500 a free speech place,
00:29:36.780 it's leaning right.
00:29:39.920 But here's
00:29:40.700 the weird thing.
00:29:41.240 When Democrats
00:29:42.620 talk about Twitter,
00:29:44.840 what they talk about
00:29:45.920 is that it might
00:29:46.760 be influencing
00:29:48.140 people politically,
00:29:50.560 right?
00:29:51.680 That's the entire
00:29:52.720 criticism,
00:29:53.980 is that it
00:29:54.960 might influence
00:29:56.380 people in this
00:29:57.600 country to vote
00:29:58.840 or act differently.
00:30:00.500 Well,
00:30:01.100 what do they do
00:30:01.700 when they talk
00:30:02.180 about TikTok,
00:30:03.800 Democrats?
00:30:04.940 When Democrats
00:30:05.620 talk about TikTok,
00:30:06.980 do they ever talk
00:30:07.900 about its potential
00:30:08.820 to influence things?
00:30:11.100 Never.
00:30:12.220 They only talk
00:30:13.100 about data security.
00:30:13.980 So none of it
00:30:16.720 is real,
00:30:17.840 right?
00:30:18.500 They would talk
00:30:19.300 about both of them
00:30:20.180 exactly the same
00:30:21.300 if any of this
00:30:22.440 were real.
00:30:23.460 Because,
00:30:24.060 you know,
00:30:24.440 Twitter has your data,
00:30:26.180 TikTok has your data.
00:30:27.580 Yeah,
00:30:27.900 it's worse
00:30:28.380 that China has it,
00:30:29.700 but it's the
00:30:30.260 smaller problem.
00:30:32.060 The bigger problem
00:30:33.000 by far
00:30:33.720 is that either
00:30:34.980 TikTok or Twitter,
00:30:36.780 if they wanted to,
00:30:38.360 could put their
00:30:39.040 finger on a button,
00:31:41.040 literally and/or
00:30:42.160 figuratively,
00:30:43.120 and change
00:30:44.180 what you think.
00:30:45.820 They can change
00:30:46.640 your brain
00:30:47.340 with a button.
00:30:50.020 In the case
00:30:50.580 of TikTok,
00:30:51.840 that's literally true.
00:30:52.700 There's a button
00:30:53.160 called heat.
00:30:54.240 It's called
00:30:54.820 the heat button.
00:30:56.400 And if they say,
00:30:57.220 hmm,
00:30:57.640 I think I'm going
00:30:58.260 to change
00:30:58.700 America's opinion
00:30:59.600 by making them
00:31:01.160 see more of this.
00:31:02.640 Boop.
00:31:03.300 One button.
00:31:04.200 Change your mind.
00:31:05.540 Now,
00:31:05.920 it doesn't change
00:31:06.460 everybody's mind.
00:31:07.400 It doesn't do it
00:31:07.900 instantly,
00:31:08.340 but we know
00:31:08.840 the influence
00:31:09.460 of social media.
00:31:10.880 We do know
00:31:11.740 it changes minds
00:31:12.580 over time
00:31:13.380 and on average.
00:31:15.740 So,
00:31:16.680 I think
00:31:17.820 Democrats need
00:31:18.680 to get a little
00:31:19.040 more consistent.
00:31:20.040 Either you're
00:31:20.500 worried about
00:31:20.980 influence
00:31:21.620 or you're not.
00:31:23.620 But if you're
00:31:24.220 worried,
00:31:24.720 TikTok's your
00:31:25.280 big problem.
00:31:28.140 All right.
00:31:33.140 Do any of you
00:31:34.160 have fatigue
00:31:37.720 from existential
00:31:41.420 threats?
00:31:43.640 Do you sort of
00:31:44.960 just get used
00:31:46.680 to the fact
00:31:47.280 that the world
00:31:47.860 is going to end
00:31:48.500 any minute
00:31:49.040 and we've got
00:31:49.580 a new reason
00:31:50.180 all the time?
00:31:51.340 It's like,
00:31:51.740 climate change,
00:31:52.580 it's going to kill us.
00:31:53.980 All right,
00:31:54.500 maybe not that fast.
00:31:56.400 Those aliens
00:31:57.320 are coming.
00:31:58.840 Maybe not.
00:31:59.800 And we got
00:32:00.580 the pandemic,
00:32:01.660 it's going to kill
00:32:02.300 us all or just
00:32:03.460 maybe old people.
00:32:05.020 And then,
00:32:05.620 oh,
00:32:05.800 but the inflation
00:32:06.460 is going to run,
00:32:07.400 but it didn't.
00:32:08.320 And then,
00:32:08.720 you know,
00:32:08.940 we got the
00:32:10.840 recession is coming,
00:32:13.440 but it didn't.
00:32:15.680 And then the Ukraine
00:32:16.460 is going to turn
00:32:17.200 into a nuclear,
00:32:18.380 but it didn't.
00:32:22.020 Yeah,
00:32:22.540 I feel like
00:32:23.380 I'm just exhausted.
00:32:24.240 So when I see
00:32:25.500 one that might be
00:32:26.220 real,
00:32:27.400 I ask myself,
00:32:28.700 are we going to
00:32:29.740 just ignore that?
00:32:30.620 So here's one
00:32:32.260 that might be
00:32:33.060 real,
00:32:33.860 and I say that
00:32:34.940 in part because
00:32:35.660 Elon Musk says
00:32:36.740 it's real,
00:32:37.940 and he looked
00:32:39.260 into it.
00:32:41.200 So,
00:32:42.300 the commercial
00:32:43.500 real estate
00:32:44.060 collapse.
00:32:45.200 Apparently,
00:32:45.700 there are two
00:32:46.180 cities,
00:32:46.700 I think San
00:32:47.240 Francisco and
00:32:48.020 maybe New York,
00:32:48.920 in which buildings
00:32:50.980 are selling at
00:32:51.840 less than the
00:32:52.660 mortgage on the
00:32:53.780 building.
00:32:54.680 So the people
00:32:55.580 that own the
00:32:56.140 building are
00:32:57.140 willing to sell
00:32:57.880 it for less
00:32:58.580 than they owe
00:32:59.220 the bank.
00:32:59.900 In other words,
00:33:00.940 they'll just
00:33:01.540 pay the bank
00:33:02.120 $100 million
00:33:02.740 or whatever
00:33:03.540 just to get
00:33:04.460 rid of it.
00:33:06.400 Yeah,
00:33:06.800 so we're already
00:33:07.620 at fire sale
00:33:08.580 prices.
00:33:10.640 Now,
00:33:11.460 two things
00:33:12.160 could happen.
00:33:13.600 One is that
00:33:14.620 maybe the real
00:33:15.540 estate changes
00:33:16.180 hands and the
00:33:17.020 new owners
00:33:17.600 use it
00:33:18.880 productively in
00:33:19.720 a way that the
00:33:20.200 old owners
00:33:20.660 could not,
00:33:21.380 for whatever
00:33:21.740 reason.
00:33:22.400 Maybe they turn
00:33:22.960 it into a
00:33:23.460 residential or
00:33:24.080 something.
00:33:25.020 But there
00:33:26.540 doesn't seem to
00:33:27.300 be anything
00:33:28.040 that will
00:33:29.060 stop the
00:33:29.640 collapse.
00:33:31.000 And once
00:33:31.740 the commercial
00:33:32.520 real estate
00:33:33.040 collapses,
00:33:34.280 that impacts
00:33:35.040 the banks
00:33:35.580 that have
00:33:35.860 the loans,
00:33:37.560 and then you
00:33:38.400 worry about
00:33:38.960 a cascade
00:33:39.680 effect.
00:33:41.720 Now,
00:33:42.300 here's my
00:33:42.720 problem.
00:33:44.260 On one hand,
00:33:45.100 that all
00:33:45.600 makes sense.
00:33:47.380 It all
00:33:48.040 makes sense
00:33:48.660 that we're
00:33:50.060 in a lot
00:33:50.380 of problems
00:33:50.860 here.
00:33:51.500 On the other
00:33:52.200 hand,
00:33:53.400 it's the 25th
00:33:54.620 time this year
00:33:55.380 I've been
00:33:55.780 told that
00:33:56.200 there's
00:33:56.440 nothing that
00:33:56.920 will stop
00:33:57.380 this danger
00:33:58.420 that's going
00:33:58.860 to kill
00:33:59.140 us all.
00:34:00.860 And 24
00:34:02.280 out of 24
00:34:02.960 were wrong.
00:34:04.460 So how do
00:34:05.140 I process
00:34:05.640 this?
00:34:07.040 If I process
00:34:08.020 it logically,
00:34:08.820 I'm scared
00:34:09.240 to death.
00:34:10.180 Because I
00:34:10.900 agree with
00:34:11.400 Elon Musk,
00:34:12.040 I don't see
00:34:12.600 a way that
00:34:13.740 we avoid a
00:34:14.800 pretty big hit
00:34:15.420 on this.
00:34:16.680 And then
00:34:17.020 Elon says,
00:34:19.220 hello YouTube,
00:34:20.840 you're back.
00:34:22.040 You've got a
00:34:22.460 buffering problem
00:34:23.140 over there.
00:34:23.840 Locals is
00:34:24.480 running like
00:34:24.980 a champ.
00:34:26.720 Now it's
00:34:27.160 not a battery
00:34:27.640 problem.
00:34:28.160 I'm all good
00:34:29.020 on battery.
00:34:29.700 It's a YouTube
00:34:30.700 problem.
00:34:32.160 I think it's
00:34:32.820 not my Wi-Fi
00:34:33.580 because my
00:34:35.140 other platform
00:34:35.760 is working
00:34:36.160 fine.
00:34:38.200 All right,
00:34:38.780 we'll keep
00:34:39.060 an eye on
00:34:39.400 that.
00:34:41.800 My take
00:34:42.560 is that
00:34:43.100 economics are
00:34:44.640 too complicated
00:34:45.580 to predict.
00:34:47.540 And almost
00:34:48.180 anything could
00:34:48.740 happen.
00:34:49.500 It could be
00:34:50.060 that this is
00:34:50.560 a big problem.
00:34:51.760 Maybe not.
00:34:53.560 We'll see.
00:34:54.480 All right,
00:34:55.580 here's my
00:34:56.080 prediction on
00:34:57.780 food.
00:34:59.680 I believe
00:35:00.280 that the
00:35:00.800 future is
00:35:01.700 underground
00:35:02.500 farms
00:35:03.160 and local
00:35:06.980 to the
00:35:07.600 community.
00:35:08.380 So they'll
00:35:08.760 be in the
00:35:09.300 same zone
00:35:09.900 as the
00:35:10.220 community.
00:35:11.320 Now,
00:35:11.720 here's the idea
00:35:12.820 for the underground
00:35:13.180 farms:
00:35:15.240 let's say
00:35:16.240 that the
00:35:16.820 tunnel for
00:35:17.440 the underground
00:35:17.940 farm is
00:35:18.560 built by the
00:35:19.120 boring machine.
00:35:20.040 So that's
00:35:20.380 Elon Musk.
00:35:21.080 So he
00:35:22.260 owns that
00:35:22.620 boring thing.
00:35:23.800 So if he
00:35:24.380 gets the
00:35:24.700 economics of
00:35:25.680 digging a
00:35:27.240 tunnel,
00:35:28.460 you know,
00:35:28.760 nice and
00:35:29.200 low,
00:35:30.040 then you
00:35:30.740 could dig
00:35:31.160 tunnels that
00:35:31.900 become our
00:35:32.460 underground
00:35:32.980 gardens.
00:35:35.680 And did
00:35:36.920 you know
00:35:37.280 that Elon Musk's
00:35:38.100 brother,
00:35:38.580 Kimbal Musk,
00:35:40.000 invests in
00:35:41.320 indoor farms?
00:35:42.240 So his
00:35:44.120 brother is
00:35:44.600 the indoor
00:35:45.060 farm guy.
00:35:47.640 So I
00:35:48.240 see indoor
00:35:48.800 farms managed
00:35:50.680 by robots.
00:35:52.320 So this is
00:35:53.420 also in the
00:35:54.120 news.
00:35:54.860 So the
00:35:55.180 news is
00:35:55.540 talking about
00:35:55.980 robots managing
00:35:57.160 indoor farms.
00:35:58.840 So that's
00:35:59.220 going to be a
00:35:59.640 thing.
00:36:00.580 So if all
00:36:01.900 you had was
00:36:02.380 a tunnel with
00:36:04.200 stuff growing
00:36:04.940 on each side
00:36:05.700 and robots
00:36:07.060 that could go
00:36:07.540 down the
00:36:08.000 middle and
00:36:08.940 could tend
00:36:09.600 and pick the
00:36:10.360 stuff and do
00:36:11.140 what it needs
00:36:11.540 to do,
00:36:11.920 and let's
00:36:13.300 say the
00:36:13.780 sun was
00:36:14.580 piped in
00:36:15.200 by light
00:36:16.260 pipes.
00:36:18.040 Right?
00:36:18.680 We already
00:36:19.300 have indoor
00:36:19.780 farms so we
00:36:20.700 know how to
00:36:21.060 get light from
00:36:21.840 where it is
00:36:22.360 to where we
00:36:22.680 need it.
00:36:23.280 So you
00:36:23.560 could do it
00:36:23.820 with a
00:36:24.060 light pipe
00:36:24.600 which is
00:36:25.000 basically a
00:36:25.640 series of
00:36:26.100 mirrors and
00:36:26.600 just take
00:36:27.060 the light
00:36:27.400 down.
00:36:28.700 Or you
00:36:29.320 could do
00:36:32.380 it by
00:36:32.940 creating
00:36:33.440 energy and
00:36:36.700 just use the
00:36:37.160 extra energy
00:36:37.660 for the
00:36:37.920 light.
00:36:39.200 So one
00:36:41.740 of the
00:36:41.920 ways you
00:36:42.240 can
00:36:42.400 create
00:36:42.660 energy
00:36:43.080 is
00:36:44.740 geothermal.
00:36:47.420 So you
00:36:47.720 take advantage
00:36:48.300 of the
00:36:48.520 fact that
00:36:48.940 underground
00:36:49.440 is always
00:36:50.240 about 56
00:36:51.100 degrees.
00:36:52.060 So if you
00:36:52.500 live in a
00:36:52.860 cold climate
00:36:53.500 you take the
00:36:54.960 underground heat
00:36:55.820 and you move
00:36:56.260 it up because
00:36:57.120 you otherwise
00:36:57.900 would be colder
00:36:58.520 than 56
00:36:59.180 and reverse
00:37:01.160 for air
00:37:01.900 conditioning.
00:37:02.200 So you
00:37:03.540 could totally
00:37:06.200 reduce the
00:37:07.500 energy cost
00:37:08.400 of the
00:37:08.700 town by
00:37:10.620 having it
00:37:11.120 use these
00:37:11.620 underground
00:37:12.080 tunnels to
00:37:13.540 make sure
00:37:13.940 that they
00:37:14.380 don't get
00:37:14.700 too far from
00:37:15.320 56 degrees
00:37:16.500 either too
00:37:17.140 cold or too
00:37:17.740 hot.
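As a rough sketch of the physics he's gesturing at: an ideal (Carnot) heat pump's ceiling on heating efficiency rises as the temperature lift shrinks, so pumping against the small gap between 56-degree ground and room temperature beats fighting winter air. The 68°F indoor target and the 20°F cold-day figure are assumptions here, and real heat pumps reach only a fraction of these ceilings.

```python
# Carnot (ideal) heat-pump ceiling: the smaller the temperature lift,
# the higher the theoretical coefficient of performance (COP).
# 56°F ground is the episode's figure; 68°F indoors and a 20°F cold
# day are assumed for illustration.

def f_to_kelvin(f: float) -> float:
    return (f - 32) * 5 / 9 + 273.15

def ideal_heating_cop(indoor_f: float, source_f: float) -> float:
    t_hot, t_cold = f_to_kelvin(indoor_f), f_to_kelvin(source_f)
    return t_hot / (t_hot - t_cold)  # Carnot limit for heating

print(f"{ideal_heating_cop(68, 56):.0f}x")  # from 56°F ground: ~44x ceiling
print(f"{ideal_heating_cop(68, 20):.0f}x")  # from 20°F winter air: ~11x ceiling
```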
00:37:18.480 Now that's
00:37:18.860 existing
00:37:19.320 technology and
00:37:20.620 the reason
00:37:20.980 that you
00:37:21.280 don't do
00:37:21.740 that is
00:37:23.500 because it's
00:37:24.300 expensive to
00:37:25.060 build a
00:37:25.520 tunnel.
00:37:27.480 So if you
00:37:28.380 had tunnels
00:37:28.920 anyway or you
00:37:29.800 had a technology
00:37:31.140 to build
00:37:31.640 tunnels, you
00:37:32.780 could put
00:37:33.140 tunnels under
00:37:33.760 your town
00:37:34.360 that would
00:37:35.660 make the
00:37:36.160 energy need
00:37:37.020 for cooling
00:37:37.740 and heating
00:37:38.240 your homes
00:37:38.840 maybe 20%
00:37:40.500 of normal.
00:37:41.900 So that
00:37:42.620 frees up a
00:37:43.520 whole bunch
00:37:43.860 of electricity
00:37:44.500 that could be
00:37:45.760 used for the
00:37:46.220 underground farms
00:37:46.980 if you need
00:37:47.460 them.
00:37:48.920 So the
00:37:49.820 other way you
00:37:50.240 can create
00:37:50.740 electricity is
00:37:52.420 a Stirling
00:37:53.200 engine.
00:37:53.860 Have you ever
00:37:54.180 heard of a
00:37:54.640 Stirling
00:37:55.120 engine?
00:37:56.800 A Stirling
00:37:57.380 engine, it's
00:37:58.200 current technology,
00:37:59.260 it's been around
00:37:59.780 forever, and
00:38:00.900 it creates
00:38:01.840 electricity based
00:38:03.180 on temperature
00:38:03.920 differences.
00:38:05.360 That's all you
00:38:06.020 need.
00:38:06.640 Now the greater
00:38:07.240 the temperature
00:38:07.800 difference in
00:38:08.440 two places that
00:38:09.600 are near each
00:38:10.140 other, the
00:38:11.180 more electricity
00:38:12.020 you can create.
00:38:14.180 So if you
00:38:15.420 simply had
00:38:16.120 tunnels under
00:38:16.780 the ground that
00:38:18.260 were 56 degrees
00:38:19.560 all the time,
00:38:20.980 but above the
00:38:21.620 ground it's 90
00:38:22.640 degrees sometimes
00:38:23.540 and 20 degrees
00:38:24.760 sometimes, you'd
00:38:26.100 have some kind
00:38:26.600 of a temperature
00:38:27.440 differential.
00:38:28.140 You might be
00:38:28.560 able to create
00:38:28.980 electricity from
00:38:29.780 it.
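For scale, here's a hedged sketch of the Carnot bound that caps any heat engine, a Stirling engine included, running between the surface and a constant 56-degree tunnel. The 90°F and 20°F surface temperatures are the episode's examples; the ceilings come out in the single digits, which fits the "might" in the claim above.

```python
# Carnot ceiling on a heat engine between surface air and a 56°F tunnel.
# 90°F and 20°F are the episode's surface examples; a real Stirling
# engine would convert only part of this fraction into electricity.

def f_to_kelvin(f: float) -> float:
    return (f - 32) * 5 / 9 + 273.15

def carnot_efficiency(hot_f: float, cold_f: float) -> float:
    t_hot, t_cold = f_to_kelvin(hot_f), f_to_kelvin(cold_f)
    return 1 - t_cold / t_hot  # max fraction of heat convertible to work

print(f"{carnot_efficiency(90, 56):.1%}")  # 90°F day over the tunnel: ~6.2%
print(f"{carnot_efficiency(56, 20):.1%}")  # tunnel over 20°F air: ~7.0%
```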
00:38:32.060 So here's my
00:38:34.600 basic idea:
00:38:37.220 food is
00:38:38.500 going to go
00:38:38.920 local, and
00:38:42.440 it's going to
00:38:43.020 have no
00:38:43.740 pesticides,
00:38:44.780 because it'll
00:38:45.200 be indoors, so
00:38:46.620 it'll all be
00:38:47.160 organic, no
00:38:48.320 pesticides, no
00:38:49.680 soil, because
00:38:51.320 you don't really
00:38:51.720 need the soil.
00:38:52.980 You just put the
00:38:53.760 nutrients in.
00:38:54.380 And, Scott's
00:39:00.380 lobbying for
00:39:00.940 the mold people.
00:39:01.900 No, under my
00:39:03.000 conception, only
00:39:03.920 the robots would
00:39:04.780 be underground.
00:39:06.360 So the people
00:39:07.160 would be above
00:39:07.760 ground, and
00:39:09.120 except for
00:39:09.620 building it and
00:39:10.420 maintaining the
00:39:11.040 robots, the
00:39:12.000 robots would be
00:39:12.600 the underground
00:39:13.220 farmers, and
00:39:14.460 they would do
00:39:14.760 all that work.
00:39:15.720 Now here's a
00:39:16.240 question, all
00:39:17.860 right, how many
00:39:19.160 of you are going
00:39:19.660 to ask about
00:39:20.160 sunlight?
00:39:20.540 Can you
00:39:22.600 really not
00:39:23.000 solve that on
00:39:23.620 your own, without
00:39:24.280 me answering that
00:39:24.960 question?
00:39:26.220 You can't figure
00:39:26.860 out how to
00:39:28.240 bring light
00:39:28.920 inside a
00:39:29.920 structure?
00:39:33.460 Let me do a,
00:39:34.320 I'm going to do
00:39:34.740 a little demonstration.
00:39:39.840 So, I wasn't
00:39:40.820 planning on doing
00:39:41.340 it, this is
00:39:41.780 called, it's
00:39:42.780 called light,
00:39:44.120 it's called
00:39:44.520 light, L-I-G-H-T.
00:39:46.460 And what you do
00:39:47.220 is you can bring
00:39:47.800 these inside an
00:39:49.340 enclosed structure,
00:39:50.540 and then you
00:39:51.760 can turn them
00:39:52.300 on.
00:39:52.620 I'll give you
00:39:52.940 an example,
00:39:53.440 watch this.
00:39:58.120 I bet you
00:39:58.540 didn't see that
00:39:59.040 coming.
00:40:00.020 Yeah, do you
00:40:00.960 know that you
00:40:01.340 could put this
00:40:01.980 light, potentially,
00:40:04.640 underground in
00:40:05.660 a tunnel?
00:40:07.460 And then you
00:40:08.160 could use a
00:40:08.820 different spectrum
00:40:10.100 to make sure that
00:40:11.820 it's the right
00:40:12.360 kind for growing
00:40:13.540 food.
00:40:14.820 Now, I noticed
00:40:15.740 in the comments
00:40:16.400 that a lot of
00:40:17.120 you were not
00:40:17.620 familiar with this
00:40:18.400 technology called
00:40:19.600 light.
00:40:21.980 I mean, if you
00:40:22.680 read it, you
00:40:23.120 probably thought
00:40:23.560 it was pronounced
00:40:24.120 ligt, L-I-G-H-T,
00:40:27.260 ligt, so you've
00:40:28.260 probably heard of
00:40:28.760 it.
00:40:29.140 But it's called
00:40:29.580 light, pronounced
00:40:30.880 correctly.
00:40:32.140 And it can be
00:40:32.880 brought inside of
00:40:33.700 buildings.
00:40:35.160 Do you have any
00:40:35.640 other questions?
00:40:37.420 No, no, it's not
00:40:38.400 like Bud Light.
00:40:39.860 No, Bud Light,
00:40:41.260 no, Bud Light,
00:40:42.720 that's completely
00:40:43.400 different.
00:40:43.820 That would be a
00:40:46.040 liquid.
00:40:47.460 Yeah.
00:40:47.840 No, we're
00:40:48.060 talking about the
00:40:48.860 warm light that
00:40:51.160 passes through the
00:40:52.140 air.
00:40:53.080 Okay.
00:40:53.400 I just want to
00:40:53.980 make that clear
00:40:54.560 because I couldn't
00:40:55.560 stop getting
00:40:56.060 questions about how
00:40:56.940 you could get
00:40:57.560 light inside a
00:41:00.380 structure.
00:41:01.400 It was very, it
00:41:02.220 was confusing a lot
00:41:03.120 of people.
00:41:04.060 So, glad we
00:41:05.240 cleared that up for
00:41:05.980 you.
00:41:08.440 Well, I, in my
00:41:10.300 opinion, I am,
00:41:11.160 well, let's talk
00:41:13.100 about something
00:41:13.480 else first.
00:41:14.520 Texas has got this
00:41:15.620 bill that hasn't
00:41:16.340 been passed yet,
00:41:17.240 but it looks like
00:41:17.700 it might, to
00:41:19.100 require that every
00:41:20.020 school have an
00:41:20.800 armed person at
00:41:21.920 the school, an
00:41:23.260 armed guard.
00:41:24.400 What do you think
00:41:25.020 of that?
00:41:26.340 Do you think
00:41:26.720 that'll work?
00:41:28.360 There'll be an
00:41:28.960 armed guard at
00:41:29.960 every school.
00:41:32.980 Well, I have two
00:41:35.340 thoughts about that.
00:41:37.100 Number one, this
00:41:38.800 really depends on
00:41:39.840 your mass
00:41:40.680 shooter's not
00:41:41.420 being good at
00:41:42.200 planning.
00:41:43.380 Am I right?
00:41:45.360 You have to
00:41:46.840 assume that your
00:41:47.380 mass shooter is
00:41:49.040 just terrible at
00:41:49.940 planning.
00:41:51.460 Because once you
00:41:52.460 knew that every
00:41:53.020 school had an
00:41:53.740 armed person, you
00:41:56.260 would do that
00:41:56.780 person first, right?
00:41:58.940 Because the
00:41:59.460 armed person
00:42:00.100 doesn't have their
00:42:00.660 gun out, they
00:42:02.340 just have access
00:42:03.380 to a gun.
00:42:05.160 And if you were
00:42:05.800 a mass shooter,
00:42:06.500 you would just
00:42:06.800 walk up behind
00:42:07.600 them, take them
00:42:08.360 out first, and
00:42:09.720 then go do your
00:42:10.440 mass shooting with
00:42:11.500 an extra gun, I
00:42:12.340 suppose.
00:42:13.600 But I'm not so
00:42:15.020 sure that a
00:42:15.780 student would
00:42:17.820 take that
00:42:18.280 approach.
00:42:20.480 So they might
00:42:21.280 actually say,
00:42:22.320 well, okay, that's
00:42:23.200 one variable I don't
00:42:24.160 want to account for,
00:42:25.000 so I'm not going to
00:42:25.480 take a...
00:42:26.320 If they were
00:42:27.180 trained killers,
00:42:28.560 one person with a
00:42:29.640 gun would make no
00:42:30.340 difference at all.
00:42:31.780 But they're not
00:42:32.540 really trained
00:42:33.260 killers, they're just
00:42:34.200 crazy people who
00:42:35.600 decided to shoot up
00:42:36.760 a school.
00:42:37.100 So I do have a
00:42:38.960 feeling it might
00:42:39.620 work, but we
00:42:41.280 don't know for
00:42:41.840 sure, so here's
00:42:43.420 the part I like.
00:42:45.080 We can test it in
00:42:46.120 one state, and
00:42:47.680 then maybe two
00:42:48.740 years we'll know.
00:42:50.340 Maybe we'll have
00:42:51.040 an actual answer,
00:42:51.840 and maybe we'll do
00:42:52.340 more of it or less
00:42:53.220 of it, depending on
00:42:54.140 whether it worked
00:42:54.620 or not.
00:42:55.920 Better than a sign
00:42:56.880 that says no
00:42:57.440 guns?
00:42:58.400 Maybe.
00:43:01.060 All right, here's a
00:43:03.780 weird thing that
00:43:04.400 happened.
00:43:04.660 Trump was polling
00:43:06.480 behind DeSantis
00:43:07.860 in California
00:43:08.880 until recently,
00:43:10.760 and he was way
00:43:11.500 behind them, and
00:43:12.320 that's no surprise,
00:43:13.240 right, because
00:43:14.440 California, Trump,
00:43:15.860 they don't go
00:43:16.260 together too well.
00:43:17.540 But apparently
00:43:18.300 that reversed
00:43:18.980 completely, and
00:43:20.020 now Trump is way
00:43:22.020 up in polling in
00:43:23.720 California compared
00:43:25.260 to DeSantis.
00:43:27.160 Now, what do you
00:43:28.160 think caused that?
00:43:30.360 What do you think
00:43:31.280 caused a dramatic
00:43:32.900 change in the
00:43:34.080 polling in
00:43:35.300 California, just
00:43:36.520 California?
00:43:41.000 I have a
00:43:42.360 hypothesis that's
00:43:44.040 kind of crazy, and
00:43:46.000 the hypothesis goes
00:43:47.160 like this.
00:43:48.760 Trump has been
00:43:49.700 persuading his
00:43:51.560 base, which are the
00:43:52.660 only ones that
00:43:53.140 matter for this
00:43:53.780 poll, he's been
00:43:54.920 persuading Republicans
00:43:56.700 that he and other
00:43:59.580 Republicans are the
00:44:00.880 victims of
00:44:01.560 shenanigans, and
00:44:03.440 that it's just
00:44:03.940 shenanigans top to
00:44:05.340 bottom.
00:44:06.300 You know, he
00:44:06.900 questions the vote,
00:44:08.420 he questions everything,
00:44:09.760 basically.
00:44:10.400 Just shenanigans.
00:44:12.060 And I have a theory
00:44:13.820 that every time Trump
00:44:15.280 is taken into court,
00:44:18.300 whether it's E.
00:45:19.080 Jean Carroll or
00:44:20.240 Alvin Bragg or the
00:44:22.940 documents at Mar-a-Lago,
00:44:25.720 that every one of them
00:44:26.700 now feeds into his
00:44:28.000 narrative.
00:44:28.380 And, you know, I
00:44:31.680 didn't see it coming.
00:44:32.840 Honestly, I didn't see
00:44:33.560 it coming.
00:44:34.260 Because on day one, let's
00:44:35.980 say day one is January
00:44:37.080 6th, let's call January
00:44:38.860 6th day one.
00:44:40.780 On day one, what did
00:44:41.960 Trump look like?
00:44:44.720 Sore loser, bad loser,
00:44:47.100 possible insurrectionist,
00:44:49.080 losing his mind, not a
00:44:51.500 leader after all, just a
00:44:54.020 complainer, loser, right?
00:44:56.040 Even if you liked him,
00:44:58.920 even if you were totally
00:44:59.820 on his side, around
00:45:01.460 January 6th, you were
00:45:02.600 saying to yourself,
00:45:03.500 that's a lot of
00:45:04.580 complaining, right?
00:45:07.180 It felt like a lot of
00:45:08.760 complaining, more than
00:45:10.700 you wanted to hear.
00:45:11.880 Even if you supported him,
00:45:13.780 that was a lot of
00:45:14.580 complaining.
00:45:16.220 And then things happen.
00:45:19.240 You get the Durham
00:45:20.420 report, you know, you
00:45:22.720 get time goes by and
00:45:24.500 January 6th fades in
00:45:25.820 memory, right?
00:45:26.800 A little bit.
00:45:28.020 The Democrats tried to
00:45:29.580 make sure it didn't, but
00:45:31.080 a little bit it does.
00:45:33.020 And you see these new
00:45:35.600 legal attacks against
00:45:37.660 Trump, and I don't think
00:45:39.360 citizens feel that these
00:45:40.760 are legitimate.
00:45:42.720 I think citizens are
00:45:43.920 saying, okay, he does
00:45:46.120 complain too much, but in
00:45:48.900 this case, I see what he's
00:45:49.920 talking about.
00:45:51.980 Still, he's too much of a
00:45:53.220 complainer.
00:45:53.700 And then the next thing
00:45:55.440 happens, and it's E.
00:45:56.740 Jean Carroll.
00:45:57.820 Okay, okay, he's way, he's
00:45:59.160 complaining way too much
00:46:00.400 about everybody's out to
00:46:01.340 get him, it's all, you
00:46:03.120 know, just attacks, it's
00:46:04.100 nothing I did, they're just
00:46:05.720 after me, they're after
00:46:06.660 you, blah, blah, blah.
00:46:09.000 But I kind of see his point
00:46:10.440 on this E.
00:46:10.940 Jean Carroll thing.
00:46:12.800 And then one after another,
00:46:15.240 the events in the real
00:46:16.800 world are starting to fit into
00:46:18.740 his victim framework.
00:46:22.860 And I believe that this
00:46:24.300 California poll might, I'll
00:46:26.420 just put a might on it,
00:46:27.540 because you can't tell, it
00:46:28.880 might be telling us that his
00:46:31.260 persuasion has reached peak
00:46:33.060 effectiveness.
00:46:35.100 Because, you know, he has this
00:46:36.360 habit of starting out with
00:46:37.400 saying stuff that just sounds
00:46:39.000 batshit crazy.
00:46:40.860 But if he says it long enough,
00:46:43.440 it starts to just permeate your
00:46:45.520 thinking, and eventually you
00:46:47.660 start taking the facts that
00:46:49.020 are happening in the real
00:46:49.880 world, and you attach them to
00:46:51.700 his framework, because it's
00:46:53.740 so prominent in your mind, he
00:46:55.840 just says it all the time.
00:46:57.840 And now all these things that
00:46:59.660 would have looked, a few years
00:47:01.940 ago, these would have looked
00:47:03.340 like all bad news for Trump.
00:47:05.460 Oh, there's another legal case
00:47:06.980 against Trump.
00:47:07.800 Well, that's bad news.
00:47:09.640 Not anymore.
00:47:11.480 Now it's more confirmation that
00:47:13.240 he was right all along, and
00:47:15.120 that everything's rigged, and
00:47:16.620 they are out to get him.
00:47:19.900 Does it feel like that to you?
00:47:21.900 Just, you know, viscerally,
00:47:24.260 anecdotally, do you see this?
00:47:27.940 That he went from total no
00:47:31.600 credibility whiner to somebody
00:47:35.040 who's made a good point?
00:47:38.080 It looks like that.
00:47:39.560 So I wouldn't be surprised if he not only won the primary in a landslide, but went further. I mean, not just win the general, but win it in a landslide.
00:47:52.380 Because now the only thing that would stop him, in my opinion, is himself.
00:47:59.020 He has created a situation in which he could roll this into a landslide.
00:48:04.400 I'm not going to predict it, so it's not a prediction. I'm just saying that in the current situation, if he popped into existence with this situation, a skilled person could ride this to a tidal wave victory. And he might have those skills.
00:48:23.840 I mean, everybody who underestimates his persuasion has been wrong, consistently wrong. He is really, really persuasive.
00:48:33.280 And he may have turned the battleship, because he had four years to turn it, or he will have had four years to turn the battleship.
00:48:41.420 Normally, you always bet against turning the battleship, right? That's why we use it as an example. You can't turn the battleship.
00:48:49.300 But if you take maybe the best persuader of all time, Trump, and you give him four years and continuous media attention, yeah, he can turn the battleship. And he may have done that.
00:49:03.800 We may be seeing the beginning of the turned battleship, which would be fascinating to me. It would be the greatest feat of persuasion that I can think of.
00:49:15.380 I mean, it would be hard to turn that battleship. And I think he did it. I think he kind of did it.
00:49:23.220 All right, DeSantis said he would consider pardoning Trump if he were elected president. I call that the Ramaswamy effect.
00:49:33.800 I think that Vivek is doing a hell of a job improving politics, because he's simply saying the things that other people were afraid to say, and then he can see how it goes.
00:49:47.120 So, for example, he can say, you know, use the military in Mexico. Trump said that first. But, you know, now it's been tested. You don't get killed for saying that in public.
00:49:59.200 And I think that Ramaswamy saying he would pardon Trump, along with some, but not all, of the January 6ers, allowed DeSantis to say it.
00:50:12.700 Actually, it forced DeSantis to say it, wouldn't you say? I think it forced DeSantis to say it. So that's the Ramaswamy effect.
00:50:22.100 I'll tell you, I've never been more optimistic about American politics than right now, which is weird, isn't it? And I'll tell you why.
00:50:31.720 Because although, I hate to say it, it looks like it's going to be Trump versus Biden unless something big changes, at the same time, the people who are behind them are bringing some serious skill, right?
00:50:52.200 We haven't seen a Vivek. Would you agree? We've never seen one. Vivek is an original. He's a complete original.
00:51:02.900 And he is changing the conversation. So win or lose, he's already winning. He's changing the conversation.
00:51:10.580 I mean, he's saying, I'm going to sign an executive order to end affirmative action. He says that out loud.
00:51:19.440 And, you know, Trump couldn't do that. That's not something Trump could do. Because he's white. Can't do it. But Ramaswamy can.
00:51:29.380 What about RFK Jr.? I swear to God, I have trouble telling if he's a Democrat or a Republican. Because he keeps approaching things as if the data matters. Am I right? Yeah. He keeps acting like the data matters.
00:51:47.920 And then I say, OK, I don't know who you are. Because politicians don't do that.
00:51:52.940 I also believe he could change his mind based on data. Based on data.
00:52:00.480 Yeah, he's not there on nuclear. But I believe that's entirely a matter of needing to come up to speed.
00:52:08.100 My expectation is that he will come up to speed on nuclear, which is really just the last three to five years of understanding what we know and what we don't know. I think he's just three to five years behind, which is what most people are. That would be the average that people are behind.
00:52:27.080 And everybody who's current, they all have the same opinion. The people who are current on their knowledge do not disagree about nuclear. That conversation is over for everybody who's well-informed, in my opinion.
00:52:42.340 All right. So I've never seen such a strong group of inspirational candidates. And I would say even DeSantis is a new breed of candidate, which I like a lot, right?
00:53:03.140 Well, let me put it this way. There are now three candidates for president that I think would be transformationally strong: DeSantis, Vivek Ramaswamy, and RFK Jr., in different ways, right? They wouldn't move the world in the same way.
00:53:24.840 But all of them have one quality in common. What is it? What's the quality they have in common? DeSantis, Ramaswamy, and RFK Jr. What do they have in common?
00:53:37.840 Not the CIA. Now, I don't think they're pro-CIA, any of them.
00:53:45.080 Listen to data, exactly. They are data-driven people over politics. They are data-driven over politics. And they're overtly, aggressively so, right? They're not tiptoeing. They're overtly and aggressively looking for real answers, and they know that they're getting bullshit for data, so they have to deal with that.
00:54:10.980 So, technocrats. Well, technocrats, I feel that's unfair. I feel that maybe it's more of a talent stack situation. In other words, they can deal with the data, but they can also do some leadership stuff.
00:54:26.280 I wouldn't call them technocrats, because DeSantis is not exciting, but he is absolutely a leader. You know, to call him a technocrat, I feel, would be deeply, deeply dismissive of the tremendous amount of leadership that he's displayed.
00:54:47.700 So, here's my take. I feel like the country would be refreshed by any of those three people, even though they would do different things, and maybe I wouldn't like some of those things.
00:55:00.180 But I feel like we'd have some refreshment, like the country would feel a little bit invigorated by any of those three.
00:55:08.940 Now, Trump, who I think has a good chance of winning, brings with him all of the Trump baggage. And that's a lot. I would rather not deal with that. Would you agree? I'd rather not deal with that.
00:55:26.340 Because that baggage comes back on me and you. If you support him, or people think you voted for him, it comes back to you, in a way that we have never seen before. And partly because he does things that make it easy to attack.
00:55:45.940 But I have to say that I would also be excited about a Trump presidency. Because I think he would do it differently. And I think that he would fix the things that obviously need to be fixed.
00:56:02.440 And I think it's possible that he's learned something from the last time, and that he would be more effective on the second try. It's possible.
00:56:15.780 Yeah.
00:56:16.560 The thing I don't think he'd be as effective at is gutting the intel agencies to get rid of the bad actors. I just don't know if he could do that.
00:56:28.480 But RFK Jr. would have no sympathy whatsoever. Wouldn't you love to let RFK Jr. loose as the boss of the CIA? The entity that presumably killed both his father and his uncle? How much do you want that? Like, I want that a lot.
00:56:55.760 Now, some people have said, do you think Trump could ever win and then ask RFK Jr. to be, I don't know what, Attorney General? To go after the things that need going after? It would be a hell of a show. It would be a hell of a show.
00:57:15.680 And I'll go further and say that RFK is the last person you can imagine working for a Republican, because he's a Kennedy. But he's not like other people. I feel like, who knows? Who knows?
00:57:29.760 They're definitely not going to be running mates. I would say you could rule that out completely. But Attorney General? I mean, don't you think that Trump knows that RFK Jr. would have no mercy on the intelligence agencies? No mercy. And that's what he needs. He needs a no-mercy guy.
00:57:49.620 So, RFK for director of the CIA? I could live with that. Could you? Could you live with RFK Jr. as director of the CIA? I could totally live with that. How interesting.
00:58:13.900 You know, you really have to respect the little area that he's carved out, right? The fact that we would even talk about this with seriousness. The fact that it's actually a legitimate question whether he would fit into that role in a Trump administration. And the answer is yes, I think. I think he would. Interesting.
00:58:37.180 All right. Well, Moscow has suffered some drone attacks. I guess eight drones came in, which they assume came from Ukraine, but Ukraine, of course, denies it.
00:58:47.340 They seem to have shot all eight down. Or no, I guess one got through. And there were some injuries and one dead person. So it wasn't a massive attack.
00:58:57.280 But this falls into the category of what took so long. Why did it take so long for Ukraine to attack Moscow with drones? And I would think that militarily, that's exactly what they should do more of. What do you think?
00:59:13.200 Because I don't think it's going to give any extra incentive for a nuclear attack. Right? Because things are already so bad that if it's not happening now, this is not going to be the extra thing that makes it happen.
00:59:26.980 No more war. Well, that would be the first choice. But here's my prediction. And this was in my book, The Religion War, which is now banned everywhere. One of my several banned books. Because I got canceled. It's not banned. I just got canceled by my publisher.
00:59:48.760 So here's what I predicted: that cities would someday be vulnerable to drone attacks. That was easy to predict. But my prediction is it's not these big, long-range drones.
01:00:04.060 The problem is these big, long-range, slow-moving drones are just easy targets for the anti-aircraft. So they're not getting it done. They might be scaring the citizens, but they're not really doing any damage.
01:00:15.240 The ones you've got to worry about are the handheld ones that you could put in the trunk of a car. You could drive into Moscow and release one.
01:00:25.520 And the problem is that the anti-aircraft is never going to be able to shoot down something that's flying below the level of the buildings.
01:00:34.520 So you're going to see, eventually, drones coming right down the street. And, as I predicted in the book, in some cases, going through doorways. The drone could actually hover until the door is open and follow somebody through the doorway.
01:00:51.380 That's real. That's absolutely going to happen. Somebody's going to put a drone right through the doorway of a building.
01:01:00.500 So now, given that it's Russia, it's probably hard to sneak a bunch of anything into Russia. Wouldn't you agree? It'd be hard to get even one drone into Russia, I would think.
01:01:19.040 But that seems like a solvable problem. It feels solvable. There's probably some way to get a tractor-trailer full of small drones into Russia. Some way.
01:01:34.940 So the first thing I would say about the Ukraine-Russia war is that I think we can stop calling it a war. Because from this point on, it's a negotiation. Because neither side has a reasonable expectation of anything like winning. Winning's kind of off the table.
01:01:55.100 So we're really just negotiating until the presidency in the United States changes to one that wants to end the war. And so I think both sides are just getting into their best negotiating position and waiting for Trump or DeSantis to come along.
01:02:13.140 Guys with shotguns and birdshot will defeat the drones? Well, you'd have to be pretty fast. You could get the drone if you know it's coming. And it's, you know, hovering.
01:02:26.420 But usually they're, you know, they're up in the air above the level that you can even see them. They're invisible. And then they would just sort of drop down and do their business.
01:02:39.840 All right. I would say Trump's biggest requirement for a vice president is somebody who would agree to pardon him.
01:02:53.360 Would you agree? That when Trump selects a vice president, the main thing for Trump is somebody who says, I will definitely pardon you if I have to. Yeah.
01:03:06.820 Well, that does put Vivek right at the front, doesn't it? Yeah.
01:03:11.760 Now, I don't think Tim Scott has weighed in on this, but I would expect soon to hear Tim Scott say that he would pardon Trump.
01:03:19.500 What do you think? Because that's a hard requirement for being a vice president, I think. All right.
01:03:36.560 Yeah, impeachment pardon. All right.
01:03:39.740 So I made a prediction on Twitter that you didn't understand, in which I said that in two months, I'll be the most dangerous person in the world.
01:03:48.440 Did you know what that was all about? In two months, I'll be the most dangerous person in the world.
01:03:55.280 Now, some of you guessed it's because my Twitter numbers will reach one million followers. And I have said in the past that if I ever reached a million followers, I'd be able to control the world.
01:04:08.760 Well, I don't need exactly a million to do that. You know, 970,000 is probably good enough. Yeah, it's pretty close. Rounds up to a million. So it wasn't that, but that will help.
01:04:22.280 Here's what's going to happen. In around two months, the date's a little in flux, my new book will be published, and it's the one that got cancelled.
01:04:32.400 Now, a couple of things are going to happen. Number one, in all likelihood, it will be a bestseller.
01:04:39.540 I say that for two reasons. One is that, you know, it's a new book and I've had other bestsellers, so I'll probably get a lot of publicity, and people will talk about me being cancelled and it'll be a story, and probably conservatives will want to, you know, fight against the banning of people and maybe buy it just to support the cancelled people.
01:05:03.300 So I suspect it'll get a lot of attention.
01:05:06.880 But that's not why I'm dangerous.
01:05:10.900 This book is different because it's full of reframes. There are, I think, over 160 of them. 160 sentences that reframe your mind from, you know, the previous mental state you were in.
01:05:25.820 Here's what's going to happen. When you first see it, you can say, oh, these are little mental tricks to make me more effective in a variety of ways. But eventually, somebody's going to figure out that I've built super prompts for people.
01:05:42.900 Do you know what a super prompt is? A super prompt is usually a long-form question for AI that gets you the right answer, because you've asked a, you know, paragraph-long question and you've basically influenced or hypnotized the AI. You're effectively hypnotizing the AI to give you the right answer. That's what a reframe is.
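
To make the super prompt idea concrete, here is a minimal sketch in Python of the contrast being described: the same question asked bare versus wrapped in a paragraph of framing. The build_super_prompt helper and the sample question are hypothetical illustrations, not anything quoted from the episode or from any particular AI product, and no AI service is actually called.

# A minimal sketch of the "super prompt" idea described above: the same
# question asked bare, and asked wrapped in a paragraph of role, context,
# and constraints. build_super_prompt is a hypothetical helper that only
# assembles the text of the prompt; nothing here contacts an AI service.

def build_super_prompt(question: str) -> str:
    """Wrap a bare question in the kind of paragraph-long framing
    the episode calls a super prompt."""
    return (
        "You are a careful analyst. Before answering, consider the "
        "strongest arguments on each side, state your assumptions, "
        "and flag anything you are unsure about.\n\n"
        f"Question: {question}\n\n"
        "Answer in three short parts: the key facts, the main "
        "uncertainty, and your best conclusion."
    )

bare = "Is nuclear power safe?"     # a one-line question: the model supplies its own framing
framed = build_super_prompt(bare)   # the framing travels with the question and shapes the answer

print(bare)
print(framed)

The analogy being drawn is that a reframe does for a human reader what the framing does for the AI: the paragraph around the question, not the bare question, is what steers the answer.
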
01:06:07.560 When people figure out that reframes are like spells, and that they're super sticky and effective, people are going to see for the first time what I'm capable of, because I'll be the author of the book.
01:06:25.880 There are some people who are already aware, you know, the Locals people already know, and some people who have been following me for a while probably noticed, hey, how'd you get that Trump prediction right in 2016? How'd you get that other prediction right?
01:06:44.140 And a lot of people ask me if I've influenced things in the world, because they seem to see a connection between me starting to persuade on a topic and suddenly that thing happening in the real world. And even I don't know if I have any effect, because it's hard to see a straight line from anything.
01:07:04.440 But when the book comes out, it's going to reveal me as having a certain set of skills. And it will reveal it in a way that has never been revealed before.
01:07:16.120 I've told you that the only reason I'm alive is that people don't know how much I can do. So my odds of being killed are much higher after the book gets published, because people will see for the first time how powerful this stuff is. And when I say this stuff, I mean persuasion, reframing, hypnosis.
01:07:35.960 You're going to see it in a form that you can use. In other words, with one sentence, you'll be able to reprogram your brain in an area where you wanted to do it.
01:07:49.000 Once people understand how easily you can reprogram a brain, because you'll see all the examples and you'll try it yourself, it will change a lot.
01:07:59.120 But it will change even more what influence I have on the world, once people realize that I can come up with over 160 reframes and that maybe, for you, five of them change your life.
01:08:13.060 Five out of 160? Right? Most of them won't be relevant to you. But probably everybody will find a different five-ish that will absolutely change their life.
01:08:24.820 And once people read the book and the reviews start coming in, and people say things like, I lost 40 pounds, or I don't have OCD anymore, then people are really going to notice.
01:08:40.580 So there's going to be a second wave. The first wave will be, you know, just book-release publicity. But then there will be people who read the book, implement the reframes, and then start talking about it. It's when they start talking about it that all hell is going to break loose.
01:08:59.500 So get ready for that. So I will very quickly be the most dangerous person in the world.
01:09:09.160 All right. So that's coming. It should be a real force for good, because none of the reframes are negative. I mean, they're all designed to make you personally more effective.
01:09:22.660 I would even go so far as to say that it's the sort of thing that could, you know, add equity to the world.
01:09:29.380 Because I've often said that one of the biggest systemic racism advantages, one that applies to, I'd say, whites and Asian Americans and Indian Americans as well, is that if there's somebody in your environment who knows how success works, you're probably picking up tips from them even without asking, just by observation and osmosis and being in the same space.
01:09:56.640 So I always thought how incredibly unfair that is if you grow up in some poor area where you've never even met anybody successful. You don't even know one. How in the world are you supposed to copy the technique of success if you don't even know one?
01:10:15.360 Well, that's what this does. It gives you 160 things to fix, you know, various mental issues and effectiveness and success issues. And it effectively does what a mentor would do, but really simply.
01:10:35.960 Now, the advantage of putting each reframe in one sentence is that it's hard to tell somebody to read a book, right? Go to a teenager: hey, your life will be a lot better if you read this whole book. And then zero teenagers will read that book.
01:10:53.060 But if you go to a teenager and you give them a sentence, one sentence, they can absorb the sentence if it's catchy.
01:11:01.280 Now, the nature of reframes is that the ones that matter to you, you'll remember forever from the first time you hear them. And it's simply the remembering that makes it work.
01:11:10.940 Because if every time you're in the situation that little thought comes, oh, that reframe, yeah, it's this, it's not that, then it's just a little bit of exposure. You're exposed to it once, and it changes your life.
01:11:26.200 So I'll give you an example. The one I use all the time is: alcohol is poison.
01:11:30.940 For some number of people, and it's way more than your common sense would tell you, a lot of people stop drinking because they saw it, they used the reframe, alcohol is poison.
01:11:43.600 Even though it's not technically poison, but maybe it is, it doesn't matter if it's true. It just matters that you can reprogram your brain with one sentence.
01:11:52.940 And I know that that one worked, because I hear from people all the time who quit drinking. Now, it doesn't work for addicts or alcoholics, but if you just wanted to cut down, apparently it works.
01:12:03.780 Now, all of the reframes are of that nature, meaning that one exposure will reprogram your brain if it's one you need. If it's one you don't need, you won't remember it. You just, you know, it just won't have any impact on you.
01:12:21.200 All right. Oh, Betty. Betty, you fell for the 4chan hoax about my pandemic opinions. It's too bad. Sorry. Sorry, Betty, that they did that to you. Maybe I have a reframe that can fix you.
01:12:42.720 Don't put any words in my mouth. Yeah.
01:12:48.460 Yeah. And right here I'm seeing somebody say, I cut my alcohol intake in half with your reframe.
01:12:55.240 Oh, let me ask a question here. If any of you want to see how important that reframe is, I know what the answer will be on Locals. How many of you heard that reframe, alcohol is poison, and reduced your alcohol consumption? In the comments.
01:13:12.000 How many of, all right, just a ton of yeses on Locals, because they've heard it more.
01:13:18.600 On YouTube, I haven't said it as much, but it's just a complete wall of yes.
01:13:25.440 30, 40, how many of them have gone by now? 50 people? Now look over, there's a little delay. Look at the number of people saying yes on YouTube.
01:13:36.820 That's one sentence. Just think about that.
01:13:41.220 We know for sure that alcohol is not good for you in any amount, and you're watching maybe hundreds of people, I'm guessing hundreds between the two platforms, who cut down substantially on alcohol with one sentence. One sentence.
01:13:59.620 Now imagine there are 160 of those. Now, you don't need them all, but if you could find five that made that much difference to your life, you're going to be in good shape.
01:14:16.920 All right. That's all for now on YouTube. I'm going to go talk to the Locals people for a moment, and we'll see you tomorrow.
01:14:25.120 Thanks for joining. Except for you, Betty.
01:14:28.940 Thank you. Bye.