ManoWhisper
Real Coffee with Scott Adams
April 30, 2022
Episode 1729 Scott Adams: Stories That Involve Elon Musk, Which Means Pretty Much Everything
Episode Stats
Length
1 hour and 7 minutes
Words per Minute
150.4
Word Count
10,111
Sentence Count
755
Misogynist Sentences
5
Hate Speech Sentences
21
Summary
Summaries are generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.
Transcript
Transcript is generated with Whisper (turbo).
Misogyny classification is done with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classification is done with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000
I think so, yes. I feel the golden age is upon us. It's a little bit disguised. I'll grant you that.
00:00:10.120
But it's always the darkest before the... that's right, the dawn. And it's always the thirstiest
00:00:17.080
before the sip. That's not a saying yet, but it will be. It will be, damn it, if I have anything
00:00:24.500
to say about it. And so, are you prepared? Are you ready for an experience which will connect all
00:00:31.400
people almost as if we are a global mind? Almost as if we form a superintelligence collectively
00:00:39.000
being channeled through me? So between coffee and this shared experience, let's do something
00:00:48.380
amazing. All right. And all you need is a cup or mug or a glass, a tank or chalice or stein, a canteen,
00:00:53.960
jug or flask, a vessel of any kind. Fill it with your favorite liquid. I like coffee.
00:01:00.740
And join me now for the dopamine hit of the day. It's the tingle on the back of your neck. It's
00:01:07.700
the thing that makes you feel alive. That's right. Simultaneous sip. Go.
00:01:17.160
Oh, that's a good container of beverage right there. I hope the container of beverage you
00:01:22.180
just ingested was as good as the container of beverage I just ingested. And if that doesn't
00:01:28.960
get you going, nothing will. Well, the walls are closing in on Trump. His legal woes continue to,
00:01:36.960
oh, actually, no, nothing's happening. So it turns out that the latest rumor, so unconfirmed,
00:01:43.900
but it looks like all the Manhattan charges or the grand jury that was going to look into all the
00:01:51.040
Trump financial dealings, they've been looking and looking. They've been talking to people. They've
00:01:56.120
been investigating. They've demanded and they've received documents. And after months and months of
00:02:02.820
the grand jury stuff, the foreshadowing, not yet confirmed, is that it seems increasingly unlikely
00:02:11.600
there will be any indictments coming out of this.
00:02:14.700
Have we ever seen any President Trump witch hunts before? I feel as if it was nothing but witch
00:02:29.580
hunts. You know, I'm not going to say that Trump was an angel all of his life. And the reason I'm not
00:02:38.100
going to say it is because he told us that directly. He literally said in public, I'm, you know, I'm no
00:02:44.240
angel. But then he would tell you why, you know, he could help you as President. So if you expected
00:02:50.620
him to be an angel in however you want to define that, you shouldn't have been surprised if he
00:02:59.000
talks about grabbing people by the whatever, because he kind of signaled that as directly as you possibly
00:03:05.040
could. And I always thought that it immunized him. A good way to immunize people is to tell
00:03:10.940
people that you have the flaw that you're worried they're going to accuse you of. Because if you say
00:03:15.160
it first, it just takes all the fun out of it. You know, if Trump had said, you know, I say horrible
00:03:20.800
things in private, you should know that. And then you find out he said a horrible thing in private,
00:03:26.880
you're like, hey, you said a horrible thing. Okay, he did tell us that. And it just takes all the energy
00:03:31.600
out of it. So may I admit to you right now, I'd like to confess, I say horrible things in private.
00:03:41.460
Horrible things. Just terrible things. If any of it were presented to you out of context,
00:03:48.520
you would say to yourself, well, that's the worst person I think I've ever heard of in the world.
00:03:53.820
But here's the context. Do you have a friend like this? Now, this won't apply to all of you,
00:03:59.720
but some of you do. Do you have a friend who, if you're just alone, could be somebody you've
00:04:05.520
known a long time, usually it is, that the funniest thing you can do is to say the most inappropriate
00:04:12.020
things. Whatever is the most absolutely uncivilized thing you could say, something you would never say
00:04:21.040
in front of someone else. And the fun is how awful it is. Am I right? So I often think,
00:04:29.720
if somehow, you know, my digital devices are recording every word I say, and, you know,
00:04:35.380
somehow it all came back to me, and you played these bits, they would sound worse than anything
00:04:40.240
you've ever heard in your life. Like, you think you've heard people say bad things on hidden audio?
00:04:47.260
You should see mine. Well, I don't know, you should hear them, if such things exist. But yeah, I'll make
00:04:56.340
your head explode. But I wouldn't say it in public, right? The whole reason it's funny is because you
00:05:02.480
wouldn't say it in public. That's the entire energy of it, it's inappropriate. So it's tough to see stuff
00:05:09.600
out of context, is what I'm saying. So here's another big, gigantic story that has been haunting us
00:05:15.760
forever, this whole Manhattan possible indictments of Trump for financial chicanery or whatever,
00:05:22.980
that apparently none of it happened. None of it happened. What would happen if everybody saw
00:05:28.680
Trump's tax returns, and they were just clean? That would be the funniest thing, wouldn't it?
00:05:37.720
After all this time, like, let's just say the entire tax returns became public,
00:05:42.580
and everybody was like, oh, this is going to be good. This is going to be good. And everybody's
00:05:48.060
like salivating over it, and they're like, huh. Okay, there's nothing there. Because apparently
00:05:55.040
that's what happened with these Manhattan indictments. There was nothing there. It's what
00:05:59.200
happened with Russia collusion. You know, Russia collusion, not there. The closest they could find
00:06:06.600
is to try to confuse us, after the fact, into thinking that Russian interference in the election was the same as or
00:06:14.280
somehow adjacent to Russia collusion with somebody running for president. Very different. Very
00:06:20.620
different. But that was the closest thing you'd get to making that stick, is talking about an
00:06:25.620
unrelated topic. That's the closest thing you get. So if Trump were to run for re-election,
00:06:35.700
he would be the most vetted person of all time. I don't think I would ever worry again that he
00:06:43.980
would be caught in some illegality or blackmailed. He might be the least blackmailable president now
00:06:51.260
of all time. Am I right? I mean, then you add to that his age, right? At some age, you stop worrying
00:07:00.640
about getting blackmailed. I was thinking about this the other day. At my current age, I was
00:07:08.060
saying, what if somebody blackmailed me? Like they really had the goods, whatever that was. And they
00:07:13.680
said, okay, you're going to be so terribly embarrassed. Your career will be destroyed if I release this
00:07:19.060
information. Well, if I were 30, that would be pretty scary, wouldn't it? Because you're like, oh, my
00:07:26.080
whole life's ahead of me. They're going to let this stuff out. If it happened now, I feel like I
00:07:32.840
would think it was funny. I'm not positive. And I suppose that would depend on what it is, I guess. But I feel
00:07:40.500
like I would just laugh. Because it's hard. Nobody's going to take my money away from me. Or at least
00:07:47.680
nobody but the government, I guess. So I'm not sure what I would have to lose. It would just make me seem
00:07:53.200
more interesting, even in a bad way. And I'd say, well, okay, I'll take that trade off. If it makes
00:07:57.840
me seem more interesting. And by the way, if you ever hear bad things about me, I encourage you to
00:08:04.720
believe all of them, except the illegal ones. If you hear I did anything illegal, I
00:08:09.540
totally did not do that. Because I actually do try pretty hard to avoid illegal stuff.
00:08:14.800
But if you hear anything that's just like wildly provocative, I would encourage you to believe
00:08:23.100
it, even though it's very unlikely it's true. But if it's fun, you should
00:08:29.160
believe it. If you enjoy it. So now there's a story about, I guess, Hannity was exchanging a whole bunch
00:08:37.540
of messages, over 80 messages with Mark Meadows about the January 6th situation. And the big scandal is
00:08:43.620
that Hannity was giving advice to the administration and the president through Mark Meadows. And
00:08:52.520
I'm watching this, I'm thinking, remind me why this is a story? What is the part that's news?
00:09:01.160
Is the news that Hannity and Trump were friends? Because they both talked about that publicly all
00:09:08.960
the time. Everybody knew that. Did anybody think that Trump doesn't listen to someone who is
00:09:16.740
exactly the right person to give you exactly the right kind of advice? Who would you want advice
00:09:23.740
from if you were a Republican president in a tight spot? Who could give you the best possible advice?
00:09:33.140
Well, you know, Hannity would be near the top, I would think, right? You don't have to agree with
00:09:40.740
Hannity's opinions on anything for me to make this point. I'm just saying that Hannity's talent
00:09:47.160
stack, as I've pointed out before, uh, Hannity's just a perfect example of a talent stack. Somebody who,
00:09:53.920
if you looked at any individual thing he has a talent at, you know, speaking in public, knowing about
00:09:59.740
politics, whatever, you'd say, oh, that's, that's good. It's like, and sometimes really, really good.
00:10:05.720
But there's not one of those things that stands out as the best anybody's been at that thing, right?
00:10:11.400
He's got a look, he talks right, he's got the energy. He just has everything. So his magic is there are no
00:10:18.640
gaps. He just has everything. So it makes him, you know, very effective. So if you were going to get
00:10:24.020
somebody's advice on this exact topic, which is how you handle the public opinion of something,
00:10:31.300
I would go to somebody who is one of the best people on the planet at managing public opinion.
00:10:37.660
Hannity is exactly whose advice I'd want to hear. And how, and what kind of advice did he give?
00:10:42.600
One of them, he said after January 6th,
00:10:53.740
this is what Hannity said, that Trump should announce he will lead a nationwide effort to
00:10:57.740
reform voting integrity. Go to Florida and watch Joe mess up daily. Stay engaged. When he speaks,
00:11:05.740
people will listen. And I thought to myself, okay, that's really good advice, isn't it?
00:11:12.600
That's about as good as, if you were going to get advice, that's about as good as you could get.
00:11:20.640
Now, Trump didn't take this advice, right? Trump decided to be Trump, and maybe there's nothing
00:11:28.600
wrong with that because he's made it work so far. And so Trump decided to be, you know, fully
00:11:33.300
combative. But if I read between the lines, I think Hannity's approach was to basically shift
00:11:43.160
the argument and become the champion of election integrity, which nobody could disagree with.
00:11:49.000
Basically, it's a high ground maneuver. Have I told you that the high ground maneuver wins every
00:11:54.580
argument? It's the one that always wins. And as soon as you hear it, you're like, oh, okay,
00:11:59.740
damn it, the argument's over. That's the high ground. The high ground is not whether the election
00:12:05.300
was rigged or not rigged. That's the low ground. The high ground is what Hannity showed him. The
00:12:11.200
high ground is I'm going to lead a national effort to make sure that the next time this happens,
00:12:16.380
we're all comfortable with the outcome. National hero, right? Trump could have easily transformed
00:12:23.740
this from maybe the biggest stain on his presidency. Not maybe. The biggest stain on his
00:12:30.740
presidency. He could have easily done it by taking Hannity's advice and turning it into, all right,
00:12:38.000
I guess we'll never know what happened in 2020. I have my suspicions. And people would say, okay,
00:12:43.860
that's fair. You have your suspicions. And it's fair that we'll never know. Yeah, okay. Not everything
00:12:50.420
was audited. Can't get into the, you know, the technology part of it especially. He would have
00:12:56.720
been a national hero. And probably when the next election rolled around, unless people thought he
00:13:03.420
rigged the election, I suppose they'd spin it that way, people would say, all right, let's run this
00:13:10.460
movie again. And we'll see if the election reform actually changes the outcome. Let's see who gets 81
00:13:16.500
million votes this time with election reform. Now, even if he lost, it would still be legendary
00:13:23.160
because people wanted election reform as like, you know, the basic, most fundamental thing to protect
00:13:30.100
the republic. So the fact that there's a story that Hannity was giving advice to Mark Meadows to give to
00:13:39.200
Trump, the story should have been, why wasn't Trump listening to it? That would have been the better
00:13:45.980
story. Because this is damn good advice. In my opinion, it's damn good advice.
00:13:53.720
Maria Bartiromo is getting some similar kind of pushback because apparently she shared some of the
00:13:59.200
questions that she was going to ask the president after January 6th with Mark Meadows, I guess. And
00:14:07.620
here's the first thing you should know about that. That's not unusual. It's not unusual for an
00:14:14.200
interview guest to get questions in advance. Because it's more about the topic. And it's more about
00:14:21.420
preparing somebody to have a good show. I can tell you that in many cases when I'm interviewed on
00:14:27.400
politics, I get the questions in advance. And, you know, there's nothing unusual about that.
00:14:35.940
The reason you do it is to make the show snappy. What you don't want is a show where
00:14:40.860
somebody asks a question, and then the guest says, uh, you know, I hadn't really thought about that.
00:14:48.180
You don't want that. So you want to say, I'm going to ask you some tough questions,
00:14:52.080
or not, but you tell them what they're going to be. And then the person has thought about it,
00:14:56.960
and they give a good, quick response, as short as possible. It's good for the audience. It's good
00:15:02.140
for the show. But it's also good for the interviewee. Everybody looks good.
00:15:05.440
Now, here's a question. In this case, is Maria Bartiromo an opinion person like Hannity,
00:15:16.260
where I think Hannity is perfectly transparent, that he's an opinion person, he's friends with
00:15:22.320
the president, they talk a lot. Perfectly transparent. But do you see Maria Bartiromo as opinion or news?
00:15:30.620
I'm just going to see what your opinion of her is. Some say news, some say opinion. Okay, that's the problem.
00:15:40.260
Both. Yeah, see, that's the problem. Because now this one gets a little more murky. But apparently,
00:15:48.100
apparently she didn't use exactly the questions that she broadcast. Because I don't know if you're aware of this,
00:15:58.580
well, even when the person asking the question has a set of questions that are on their notes,
00:16:05.220
that's mostly just so they don't forget a question or, you know, run out of questions.
00:16:12.700
But they kind of ask what they think is a good question when they actually get there.
00:16:16.260
So somebody like Maria Bartiromo isn't going to ask the exact question. It's just an indication
00:16:23.260
she's going to be in that area, basically. And that's what happened. She asked, you know,
00:16:28.380
some versions of the questions. So I don't think there's anything wrong with that, necessarily.
00:16:35.760
And I wouldn't be bothered. And I'm being consistent here.
00:16:40.360
Because when Chris Cuomo was accused of softball treatment of his brother, the governor,
00:16:51.340
if you recall, I also defended Chris Cuomo. Because it's transparent. As long as it's transparent,
00:16:59.600
I don't know if there's a higher standard. If you know it's his brother,
00:17:04.060
are you going to be surprised if a brother gave a brother advice in any context? That should have been
00:17:12.980
the least surprising news and shouldn't have affected anybody, really. So I just want to be
00:17:19.700
consistent. People should be able to talk to anybody they want and get news, get advice from anybody they
00:17:24.780
want. As soon as you say, oh, this one can't talk to this one unless you've told us. No. No,
00:17:30.880
anybody can talk to anybody about anything. That's my standard. And they don't necessarily have to
00:17:37.540
tell you. You can talk to anybody about anything. And they don't necessarily have to disclose it.
00:17:43.620
But it's nice when they do. All right. Have you noticed that every story is about Elon Musk?
00:17:51.800
We'll give you some examples. So AOC tweeted this sort of long, ambiguous tweet to which Musk
00:18:00.460
responded. So AOC tweets, tired of having to collectively stress out about what explosion of
00:18:07.180
hate crimes is happening because some billionaire with an ego problem unilaterally controls a massive
00:18:13.260
communication platform and skews it because Tucker Carlson or Peter Thiel took him to dinner and made
00:18:19.680
him feel special. Now, when I read that, I thought she was talking about Elon Musk buying Twitter.
00:18:27.220
Elon Musk must have thought the same because he tweeted back hilariously, quote,
00:18:33.940
stop hitting on me. I'm really shy.
00:18:38.720
OK. Now, if you see this outside of the realm of Twitter, which a lot of people will, they'll
00:18:47.600
just see this, say, reported in a news item or something. You don't really appreciate how perfectly
00:18:54.000
Twitter-like his response is. Right. His response would maybe be inappropriate in almost any other
00:19:02.580
domain. In any other domain, it would be a little weird. But in this specific one of
00:19:10.020
Twitter, it's exactly right on point. He's hitting the target right on the head. Poink.
00:19:16.940
It's a Twitter response. So I've told you before and keep watching for this because it's fun to watch
00:19:24.000
that Elon Musk is very clear about what matters and what doesn't. And when things don't matter,
00:19:29.920
he mocks them. And when things do matter, like saving the planet or going to Mars or something,
00:19:35.720
he somehow makes that happen. So I've never seen anybody who's more clear about what's silly
00:19:40.700
and what's not. That's just one of his best qualities. And so, you know, he just makes fun of it.
00:19:47.540
And then apparently AOC tweeted, but quickly deleted. I was talking about Zuckerberg, but OK.
00:19:56.600
And then everybody had to debate whether she deleted it because it wasn't funny enough or didn't want to
00:20:04.180
engage or was it because it really wasn't about Zuckerberg or who knows. But apparently there's a
00:20:13.620
Zuckerberg version of meeting with at least Peter Thiel and there's a speculated Musk version in which
00:20:20.940
he probably met with Peter Thiel or did or something as part of, you know, deciding about Twitter.
00:20:27.160
And so I just love this little exchange. But so Elon Musk is in every part of the news. We'll keep
00:20:35.780
going on this. But first, so this really happened. A reporter for The Guardian decided to do a story
00:20:47.340
about virtual reality and so went into one of the virtual reality worlds. And I'm not sure which one
00:20:53.180
it was. I don't know if it was meta or just a virtual reality world. It doesn't matter to the
00:21:00.180
story. What matters to the story is that she was immediately assaulted with racism and actually was
00:21:09.400
groped in the VR environment. Basically sexually assaulted in virtual reality. Now, of course,
00:21:15.980
she was quick to point out that she never lost touch with the fact that it wasn't the real world.
00:21:20.820
But the problem with, or the feature of, virtual reality is it makes you feel the same
00:21:27.300
way as the real world. Knowing it's not the real world doesn't help you nearly as much as it should.
00:21:33.900
I've told you some of the story about my VR experience. I put on the glasses and I walked up to a cliff
00:21:40.680
edge so that in the virtual world, if I stepped off, it looks like I would fall to my death.
00:21:46.620
In reality, I knew I was just in a room in my house and had no danger whatsoever. I couldn't make
00:21:52.780
my legs move. I couldn't walk over the cliff in the virtual reality. Couldn't make my legs move.
00:22:01.640
My brain would say, move your legs. You're perfectly safe. I would even take the glasses off to make sure
00:22:07.900
I was still in the real room. Put them back on and say, all right, no, couldn't do it.
00:22:13.740
And so when she says she was actually assaulted, and like, I guess they cornered her and they were
00:22:20.240
doing stuff with their hands and stuff, that she felt actually assaulted. And I think that's real.
00:22:27.820
That's completely real. And so what are you going to do about that? Do you end up having all the same
00:22:35.700
laws in the virtual reality? Because the virtual reality just becomes your reality? Well, just to make
00:22:41.920
it more weird, there's a new invention that allows you to feel things while you're in virtual reality,
00:22:51.160
specifically on your mouth. And so Gizmodo had an article about this that I'm failing to find in my
00:23:00.100
notes. But apparently there's a little haptic response thing. And I saw a picture of it. You've got
00:23:08.300
the VR goggles on. And then there's some kind of sensors, or I don't know if they blow air or what
00:23:13.780
they do, hanging from the bottom of the goggles. And so they're directed at your lips and your mouth.
00:23:21.160
And the claim is that these little devices that are not touching your mouth, but I think they
00:23:27.920
might direct air or something at your mouth, they'll make you feel as if you're actually kissing
00:23:32.620
somebody, if you're kissing somebody in the VR world. Now, it did go on to say that you might be
00:23:41.260
able to feel it even internal to your mouth, such as if you had your mouth open. I'd imagine you'd feel
00:23:48.560
something on your tongue or the inside of your mouth, because that's where the haptic sensors would
00:23:54.040
be sensing. You kind of know where this is going, don't you? All you need to do is put those haptic
00:24:03.520
sensors in your belt, you know, one on your goggles and one on your belt, both shooting down, if you know
00:24:10.580
what I mean, if you know what I mean, if you could feel it, just like it's in the real world.
00:24:17.440
We're in big trouble. Big, big trouble. Big trouble. So much so that I tweeted, and people didn't
00:24:26.260
understand, that sometimes when you think you know somebody who's socially awkward, and they don't
00:24:32.740
have much of a social life, and they never seem to go out. And so that's your opinion of them.
00:24:38.260
So, oh, this is somebody with a bad social life. They don't like to go out. But I would suggest
00:24:43.400
to you there's one other possibility, that that is somebody who's really, really good at
00:24:48.580
masturbating. Like, so good, they can do it for hours, and it never gets old. To them, going out
00:24:56.720
might be the least fun thing they could possibly do. Imagine if you were bad at masturbating,
00:25:03.040
and somebody said, hey, there's a party. Or you can do this thing that's tons of fun. It's
00:25:09.180
going to last you two minutes. Two minutes of great fun. Or you can go to this party. Well,
00:25:14.780
the party sounds pretty good, doesn't it? Relative to two minutes of a good time, you'd have fun
00:25:21.660
all night. But suppose somebody was really good at it. They could keep themselves at a place for
00:25:30.580
hours at a time. Does the party look as good to them? What happens when virtual reality makes the
00:25:41.440
staying home alone just way better than going to a party? And I think probably you're already at the
00:25:49.040
point where for some people going out is the least fun thing they could possibly do. And where is that
00:25:55.980
going to take us? Because we're already there. I mean, we're knocking on that door.
00:26:02.380
I blocked Kathy Griffin today for being a racist, because she called Elon Musk a white supremacist.
00:26:08.520
And I thought, okay. You know, I'm certainly willing to put up with anything that she thinks is a joke.
00:26:18.600
Like, you know, I supported her with the severed head of Trump under, you know, under the rules of
00:26:25.800
parody and free speech and blah, blah, blah. So I didn't think that she should lose anything because
00:26:30.860
of that. I was very strongly supportive of that as a free speech thing. And as, you know, humorists take
00:26:37.580
chances, they don't always work. But you don't want them to not take chances. But calling Elon Musk a
00:26:47.740
white supremacist, I feel like that's just because the people on the right like him. I feel like that's
00:26:54.460
a little too close to home. Because I like him too. And I have no indication of anything like that.
00:26:59.960
That just feels so bigoted, essentially against white guys, basically, that I can't pretend that
00:27:11.060
somebody else is the bigot in this story. It just feels like she's the bigot in this story. Am
00:27:16.020
I right? It just feels like it's just an anti-white male thing.
00:27:21.960
All right. Here are the other stories that Elon Musk is associated with, right? Just listen to the whole list
00:27:37.840
and ask yourself, how is this even possible? Okay, I get that he's, you know, the richest person and bought
00:27:45.960
Twitter. So I get it, that gets you a lot of attention. But look at all these stories, these topics that he's
00:27:51.640
directly involved in. Ukraine, right? He sent the Starlink stuff over there.
00:28:01.520
The Amber Turd story, because they're talking about, you know, his dating her. So the Johnny Depp
00:28:08.840
story is even that he's attached to. Anything about elections, fake news, Twitter, because they're
00:28:15.160
all sort of collectively one story now. Anything about income inequality, he's in that. Anything
00:28:20.160
about taxes of the rich and progressive taxes, he's in that story. Anything about free speech,
00:28:25.720
because of Twitter. Now he's weighing in on mental health. We'll talk about that. Tweeting about it.
00:28:30.240
So he's in a story about Adderall. He's in any story about space, climate change, how the country
00:28:37.040
is getting more divided, also because of the Twitter thing. Self-driving cars.
00:28:40.920
I mean, and he believes that the simulation is real. It's my theory that people who believe the
00:28:53.340
simulation is real can author it. And the more sure you are that the simulation is actually just
00:28:59.940
software, the more clearly you can see the machinery, and the more clearly you can tweak it.
00:29:05.580
It sure looks to me like he knows he lives in the simulation, and he's just playing it like a
00:29:12.020
game. It looks like he's playing it like somebody who's just a good gamer. You know, if you found
00:29:19.700
out tomorrow that this is all a game, and that we're just characters in it who temporarily don't know
00:29:24.320
what we are, and we think we're real, and you woke up and found out that Elon Musk was only,
00:29:29.760
only, the best gamer in another dimension. And he was just an avatar. But he was an avatar of the
00:29:37.800
best gamer for the game. So he became the richest person, and he's in every story. He had seven
00:29:44.880
kids, and God knows what kind of fun he has when he's alone. Does it look exactly like just a really
00:29:52.900
good gamer playing the game really well? It's weird how much it looks like that. You know, I've looked at
00:29:58.880
my own life, because I believe I live in a simulation as well. Like, literally. Like, actually, literally,
00:30:05.200
no joke. It's the most likely possibility. I can't say for sure about that or anything else. But I treat
00:30:13.760
it like it's not real. And my experiences just don't, they just don't seem like they could possibly
00:30:20.980
be coincidental. It just doesn't seem like it. I don't know what to think of that. But those people who
00:30:28.540
have said to me, this is just anecdotal, of course, that as soon as they feel they're in the simulation,
00:30:35.300
and they start using things like affirmations to author the simulation, they report that they get
00:30:41.140
good results. But of course, that's anecdotal. So Elon Musk weighed in on this thread. So somebody
00:30:51.480
named Michael Kersey tweeted this. This is just the second part of his tweet. He said,
00:30:57.920
pharmacological dark matter, and he's talking about a phenomenon among important people.
00:31:07.640
So he's just giving it a name. Pharmacological dark matter. So basically, the stuff we don't know
00:31:12.260
is invisible heavy amphetamine and other drug use among people playing significant roles in our society.
00:31:18.600
Now, you've heard me say that, right? The people who are doing the most moving and shaping of
00:31:27.260
civilization, mostly on drugs. It wouldn't happen otherwise. And, you know, don't take drugs.
00:31:38.820
Like, and I mean it, don't take drugs. I mean, and, you know, the only time you should is if you're
00:31:45.720
under a doctor's care, and even then it's probably too much. So I'm not encouraging it. I'm just saying
00:31:51.520
it's a fact, and ignoring it feels stupid. It just feels stupid to ignore it. The fact is
00:31:58.720
that some people, and here's the dangerous part, everybody responds to chemistry differently.
00:32:05.560
So there might be a drug that simply makes one person rich, because it just makes them perform
00:32:10.840
better, and they never have a side effect, or one they care about. And then another person,
00:32:15.920
it just kills them. It just freaking kills them, or it makes them crazy, or it ruins their life one way
00:32:21.660
or the other. So don't take drugs, because you don't know which one you are. You don't know if you're
00:32:27.360
the one the drug is going to kill, or the drug is going to help you. We're just not that smart,
00:32:32.640
because, you know, we're all different. So anyway, so Michael Kersey weighs in on this about the
00:32:40.540
significant role, and then Marc Andreessen, one of the most important voices in the tech world,
00:32:52.300
tweeted this. He said, everyone thinks our present society was caused by social media. I'm wondering
00:33:00.240
whether Adderall plus ubiquitous Google searches have bigger effects. Now, I don't know about the
00:33:05.380
Google search part, but here's one of the most connected people in Silicon Valley, and then the
00:33:12.960
tech world, who would personally know the most important people in society. So, you know, this is
00:33:19.280
somebody who's been in the room and has the phone number to text pretty much anybody famous, I think,
00:33:25.720
at this point. And he's telling you that he thinks Adderall may be shaping civilization or things like
00:33:35.020
it. I'm not saying Adderall specifically, but you should take that as things like it, you know,
00:33:40.340
amphetamines. And then Alex Cohen tweeted this. He said, prescribed psychedelics will replace
00:33:47.640
amphetamines and SSRIs over the next decade, although I hope it's sooner than that, he said.
00:33:54.120
And then Musk gets into this, and he tweets, I've talked to many people who were helped by psychedelics
00:34:03.740
and ketamine, more people who were helped by psychedelics and ketamine than SSRIs and amphetamines.
00:34:11.580
And then he added, related to this, he said, and this is not me talking, so I'm just reporting what
00:34:18.720
he said, right? So these are not my words. He said, Adderall is an anger amplifier, to be avoided at
00:34:24.280
all costs. Now, here's the interesting thing. As Jeff Pilkington pointed out in some tweets,
00:34:33.740
everybody's different. That's what I said earlier, right? I'm pretty sure that Adderall has saved
00:34:38.820
lives, but I'm pretty sure it's caused some problems. I think both of those can be true.
00:34:46.720
So it's a little bit, let's say it's a really good example of free speech, both its negatives
00:34:57.300
and its positives. You don't want to be getting medical advice from Elon Musk. That might be
00:35:04.120
his weakest category. I mean, he's insanely smart on a whole variety of things that allow him
00:35:13.220
to do what he does. But I think the medical part might be the part where you go, yeah,
00:35:17.980
you get a second opinion there. When Musk says something, I'm usually done. It's like,
00:35:24.140
okay, I agree with that. There's nothing else to say. But I think in the medical domain,
00:35:31.240
let's be glad that there is free speech, because he can say this. If you looked at the comments,
00:35:36.080
there would be a whole bunch of people pushing back. And I say, okay, that's a really good example
00:35:41.880
of free speech working. Somebody's asking if I'm on Adderall. I'm not. So I've never been on any kind
00:35:49.740
of long-term stimulant, except coffee. Or sativa, I guess. But I will, I've said this before. But in college,
00:36:00.740
I had a few experiences with stimulants. And I wrote my entire senior thesis in like, I think it was like
00:36:11.300
mostly over a weekend. And then I was done with the semester. And it was actually easy. And I enjoyed it.
00:36:21.460
Think about that. I did a semester of work in four days. Got a reasonably good grade on it. I think B plus
00:36:28.400
or something. And I did it in four days. And I liked it. It was completely pleasant.
00:36:35.160
It was a horrible job. Like, just the most boring. It was like in economics. So it was a senior thesis
00:36:44.300
in economics. Do you know how boring that was? And I enjoyed the whole thing. Now, how many unicorn
00:36:52.800
companies have been created by people who had a little stimulant going on? How many? Probably a lot.
00:37:03.220
Right? Probably a lot. And it is one of the great untold stories. Anyway.
00:37:15.980
Michael Shellenberger, who is running for governor as an independent in California. I understand there
00:37:22.040
was some issue in terms of the paperwork. I've got to look into that a little bit. But how would you
00:37:30.440
like to be running for governor and part of your accomplishments just happened, which is Shellenberger
00:37:38.360
was, I think, the loudest, most effective voice for trying to save the Diablo Canyon nuclear power plant
00:37:47.320
in California? And the reasoning was it's already there. It's cost effective. And we don't have any green way
00:37:55.200
to replace it. And we'll just run out of energy if we don't keep it open. A pretty straightforward
00:38:02.080
argument. And apparently the great weight of public opinion moved Shellenberger's way.
00:38:13.120
Is that a coincidence? Or did he cause it? Because remember, he was testifying to Congress. He was writing
00:38:19.200
books and articles on it. He was the most famous voice on this topic. I think he caused it. I think
00:38:26.740
he caused it. So he's running for governor, while the current governor is saying that he's going to
00:38:32.860
look to get some money from the federal funds that were allocated by Biden, and I'll give Biden credit
00:38:38.520
for this, for keeping nuclear power plants open longer than they would have been.
00:38:43.120
So the governor is looking at implementing the plan that the governor didn't want to do,
00:38:51.900
but Shellenberger convinced the entire country that they need to do it, and now he kind of has
00:38:57.680
to do it. I would hate to be running against Shellenberger in this situation. Honestly, I've never seen a more
00:39:07.440
capable politician in terms of, you know, competence about the actual topics that matter to the state.
00:39:16.620
It's sort of breathtaking, because we've never seen it before. I'm pretty sure we've never seen this
00:39:22.080
before. We've had some presidents who were a little wonky, you know, like Jimmy Carter and stuff,
00:39:27.700
but not like this. Carter didn't have this kind of mastery over the exact topics that the
00:39:35.440
state cared about, and several of them, from homelessness to drug addiction to, you know,
00:39:41.460
he's talked about forest management. Basically, everything we care about, he has the better
00:39:47.320
solution for. All right, here's the most provocative thing that I've read lately.
00:39:52.160
We all assume at this point that Russia will have its way with Ukraine. In the comments,
00:40:00.000
where are you feeling as of today? Like, what do you feel today? Is Russia going to have its way?
00:40:10.560
All right, so I'm seeing mostly yeses. Some people saying they'll just take the east of Ukraine,
00:40:17.140
blah, blah, blah. All right. Now, I will remind you that in war, nothing is predictable.
00:40:24.120
So, among the unpredictable things that could happen, I tweeted this, and I feel terrible because
00:40:31.740
I want to mention the author and the publication, because I liked it so much, but I can't find it
00:40:37.240
in my own tweets for some reason. Maybe I imagined I tweeted it. Yeah, maybe I imagined it.
00:40:43.240
But if somebody sees it, maybe you can tell me in the comments if I did tweet it and you see it.
00:40:50.600
It was an article by somebody who definitely looked like they knew what they were doing. It was
00:40:54.060
somebody who had predicted in writing and could show the receipts that the Moskva, that ship would
00:41:02.020
be sunk by the Ukrainian anti-ship missiles. So, that's a pretty good prediction. Somebody had that
00:41:08.260
specific prediction and it happened. And then when I read the article, it seemed to know quite a bit
00:41:13.460
about the whole situation. So, I don't know how to judge military experts, because we've seen so
00:41:19.960
many of them being so wrong about so many things. But I'll give you his argument, and I'd love to see
00:41:26.480
how wrong it is. It goes like this. We're all focusing on the fact that the fighting is in this
00:41:32.960
Donbass and the other name starts with L that I can't remember. And we're taking for granted
00:41:40.860
that Russia already owns Crimea, because they got annexed in 2014, something like that. So, yeah,
00:41:49.520
so Luhansk is the other region. So, we're ignoring Crimea, right? Because that's a done deal.
00:41:55.620
Russia already owns it. Here's the part I didn't know about. They barely own it. They do own it.
00:42:04.060
Their military is theirs. But apparently, if you're looking at it from a military perspective,
00:42:11.080
their connection to Crimea is one bridge and one poorly defended area. And if you take out the bridge,
00:42:20.940
which seems totally doable, right? Ukraine can take out a bridge. They've got missiles,
00:42:27.620
drones and whatnot. And then, apparently, the Ukraine military is actually pretty good.
00:42:33.820
They have better equipment. According to this expert, they win their firefights.
00:42:39.480
So, when it's something like a fair fight, the Ukrainians almost always win, because they're
00:42:45.860
better trained and better equipped. It's only when they're, you know, overpowered by artillery or
00:42:50.260
something that they get crushed. So, the thinking is that the Ukrainian military could
00:42:58.540
take out not only the thin connection between the main Russian forces and Crimea, they could
00:43:06.460
take out the bridge, and then they could just take Crimea, because it would probably be poorly
00:43:13.080
defended, because the strongest defenders are where the fighting is. And the fighting is not
00:43:18.860
happening in Crimea. And I'm thinking, that would be a shocker. What if Ukraine just tried to hold the
00:43:27.140
line, and just keep the main Russian army pinned down, and just take Crimea back? What would that do
00:43:36.440
to the Russian psychology? It would look like Russia lost the war. I mean, the war wouldn't be over.
00:43:43.540
I mean, Russia could maybe just take it back. I mean, it wouldn't be easy. But what would that do to the
00:43:51.080
whole balance of everything? I mean, that would be a great, at least a diversion. I mean, at the very
00:43:57.520
least, it would be an interesting diversion. And the idea is that Russia would have trouble equipping
00:44:02.700
and resupplying Crimea, because they could get cut off, but that Ukraine wouldn't, you know, because
00:44:09.280
they have a border there and stuff. They'd break out the tactical nuke if that happens, somebody says.
00:44:17.240
Would they? Because it seems to me that a tactical nuke would guarantee that Putin is taken out of office.
00:44:25.520
What do you think? I would say that right now, it looks like he might be, you know, at great risk,
00:44:33.420
maybe 50-50 proposition. But if he used a tactical nuke, I think that's the end. Am I wrong? There's no way
00:44:43.280
he could survive it, politically, if not his life. Because don't you think that there's somebody who's
00:44:51.440
like number two or three in the potential takeover chain that the CIA has already said, you know,
00:45:00.380
I know you're only fourth or fifth in the potential chain of command. But if the other people
00:45:05.900
disappeared, and we made sure that you were well treated, do you think you'd like to step up to the
00:45:12.580
number one job? I don't know. I think that if a tactical nuke got used, it would be easy to turn people
00:45:19.200
within Russia really close to Putin against him. Because they would say, okay, this is a clean
00:45:24.940
win. If I take Putin out under these conditions, I'll even have internal support. Because I think
00:45:32.120
you could overthrow Putin and still have public support in Russia if he used a nuke. Or the other
00:45:40.280
possibility is that Russia is now so pro-Putin that they'd say, oh, that's strong. You know, we like it
00:45:48.140
that you were strong. Maybe. I don't know how to read the Russian mind. But it would certainly
00:45:54.820
turn enough people against Putin that he'd be in physical danger much more than he is now.
00:46:01.440
Now, here, let me throw something into the mix that feels different. So you know the Biden
00:46:08.800
administration is asking for, correct me if I'm wrong, $33 billion more for Ukraine, which
00:46:14.880
is a gigantic step up in military support. Do you know what that sounds like to me?
00:46:24.000
It sounds like the administration is playing to win, as in win the war. It doesn't look like
00:46:32.140
they're playing for a stalemate. It doesn't even look like they're playing to just, you know,
00:46:37.560
push them out of the East or something. I feel as if the Biden administration feels like
00:46:42.880
they could win. Because every day that the Russians don't, you know, have some crushing
00:46:52.740
victory on the East is a day that you say to yourself, huh, maybe that Ukrainian defense
00:46:59.380
is better than we thought. And they're not going to, the Ukrainian defense isn't going
00:47:03.140
to quit now because the good equipment is just coming in. And apparently they don't have
00:47:07.640
any manpower shortage. They had more volunteers to fight than they had equipment
00:47:12.700
to give them. So if Ukraine doesn't run out of people to fight and they're getting more
00:47:20.220
equipment, not less, does the $33 billion signal that the Biden administration thinks they could
00:47:27.660
win outright? Because I feel like it does. Now, I don't think that means that they think it's
00:47:33.780
a sure thing or anything like that. But I feel like they went from, you know, let's see
00:47:38.860
how we can make this painful for Putin to let's finish off the Russian army. I think they want
00:47:45.600
to finish the Russian army. Or at least, you know, take it down by 50%. Because Putin's
00:47:52.560
down 25%, right? I never understood if that's his entire military 25% degraded or if that
00:47:59.700
was just what's in that area. I've not heard that clarified. So about 20 billion to
00:48:11.400
replace the weapons we gave them from the army. Okay. Oh, I see. So we'd be replacing our own
00:48:18.520
weapons with the $33 billion. Okay, that does look a little different. Those numbers are wrong.
00:48:24.980
Which numbers are wrong? The $33 billion? If all it is, is replacing our own equipment,
00:48:30.080
then it doesn't look like a step up, does it? So give me a, give me a fact check on that.
00:48:36.020
That replacing our own military equipment would not look like a step up.
00:48:43.520
Somebody says not true. Are we giving it directly to Ukraine? All right. Well, I guess we'll look
00:48:48.920
into this. But the point is, if it's a major escalation, if that's what that budget is telling
00:48:54.660
us, then I think the Biden administration actually feels they could win. Define win. Put Putin
00:49:02.260
out of power. Or, and or, get the Russian military to completely get out of Ukraine. Completely.
00:49:13.760
Now, of course, nobody wins because everybody's, everybody lost in this situation. But
00:49:19.040
that would look like a win to me. All right. I was asked to tell you about how we're entering
00:49:31.400
the golden age. Well, someone has asked me why I didn't have kids and do I regret it. Let
00:49:42.500
me answer that one first. I don't regret it. But, and the reason was, I just didn't want
00:49:54.580
to bring somebody into the world. Because there are plenty of people in the world. And I never
00:50:00.780
felt a need to propagate my DNA. Now, does that make me weird? Because I feel like there's
00:50:10.420
some kind of basic human impulse that people have to propagate their DNA. But I don't have
00:50:21.260
that. Because I didn't enjoy my childhood enough that I would take any chance that I could put
00:50:29.280
anybody through that again. But if there's somebody who's already having a tough childhood, and I
00:50:36.040
could help them have a better one, that feels like that's an easy win. Like, to me, that
00:50:40.880
would be satisfying. But to bring somebody into the world and then have them have a bad
00:50:45.320
life, I couldn't handle that. Like, I'm just not built in a way that I could ever handle that.
00:50:51.120
So I take the sure thing of, you know, definitely helping humanity extend the light of consciousness,
00:51:01.560
I guess. So he says, what a load. There's somebody who thinks that I'm either lying, do you think
00:51:09.660
I'm lying to you, or lying to myself? Could be either one, right? I guess there's no way for
00:51:15.640
you to know, is there? But the thing I thought about the most is that I would rather support
00:51:24.600
the people who are already here. That's how I thought of it. It could be that that's a
00:51:29.120
rationalization of some sort. That would be pretty normal. But your child wouldn't have
00:51:36.700
a tough life. Yeah, no, I don't think, we're not talking about a tough life in terms of economics.
00:51:42.560
If it were just economics, I wouldn't have worried. But in terms of, you know, mental health
00:51:49.300
and that sort of thing. I'm not so worried about climate change ruining the world. I just
00:51:56.940
look around, I don't see a lot of happy people. Now, here's the golden age part. I'm pretty
00:52:02.920
sure that we're about to wake up, or we are waking up now, to this whole what drugs to put
00:52:10.420
in your body and whatnot. And maybe the pandemic helped with that. Because it sort of changed
00:52:16.000
how we see the medical community, you know, changed our opinion about maybe having to make
00:52:21.040
our own decisions and not depend on the experts so much, which could be good or bad. I guess
00:52:26.200
that could go both ways. But yeah, I would go with Elon Musk's thing
00:52:36.040
that the ketamine, maybe. And of course, don't take any recommendations from me. I'm just saying
00:52:43.840
that these are things that are being talked about. I'm not recommending them at all. And
00:52:47.900
the psychedelics, I do think have the potential to be civilization changing. Absolutely. And
00:52:56.200
because they don't cost much, it's, you know. Let me ask you this. Imagine everything that
00:53:04.080
we do now to make the world a safer place. Right? We've got the United Nations and, you know,
00:53:11.020
we've got the hotline to the Kremlin. So we've got all these systems and things to make things
00:53:17.740
safer and avoid war. But we still have the damn wars. Right? So I guess those systems are better
00:53:24.320
than not having them. But they're not, they're not really getting it done. But imagine this.
00:53:30.300
This is purely speculative. And it's just, it's just a mental experiment. Imagine that if instead
00:53:37.280
of all that stuff, whenever there was a dispute, or even before there was one, the heads of
00:53:43.860
state would get together and do mushrooms together. That's it. And then replace everything else
00:53:51.000
with just that. Now, of course that will never happen. Of course it will never happen. I'm
00:53:57.000
not suggesting that's even remotely possible. I'm just saying if it did, it would probably
00:54:03.040
end war. Do you think, I mean, honestly, if you just imagine, you know, it doesn't work
00:54:13.920
if your leader is 100 years old. So forget about Trump. He won't take a drug. And forget
00:54:19.380
about Biden. He's too old. But imagine DeSantis, who I imagine is anti-drug. So again, this wouldn't
00:54:26.600
happen. But at least he's young enough. So imagine a President DeSantis someday, doing
00:54:32.540
mushrooms with a President Putin who's, you know, 70 or so, and not that old. And they
00:54:40.760
just like bond and see the world differently. And the next thing you know, it just war is
00:54:47.260
a lot harder. Like it doesn't make sense suddenly. Because it would be so much easier to say, you
00:54:53.600
know, wait a minute. Are you saying that if Russia and the United States simply had to
00:54:59.280
just make friends, that the same way that Germany benefited by being friends with the US and Japan
00:55:06.300
benefited, and basically everybody, every country that said, hey, can we be your friend, has benefited
00:55:14.540
substantially. Can you imagine sitting in a room and having the right kind of, you know, right
00:55:20.760
kind of, let's say, chemical incentive? And you just look at Putin and say, why are we
00:55:24.940
even doing this? Do you want to be the greatest leader that Russia ever had? And Putin would
00:55:30.920
be, well, I thought I already was. You say, okay, okay. But do you want to continue being
00:55:35.540
the greatest leader that there ever was? And Putin would say, how do you do that? He'd say,
00:55:41.440
end war. End war and go to space with us. How about that? How about end war and help
00:55:49.760
us go to space? And we'll get some good asteroids. We'll mine some stuff. We'll free your economy
00:55:56.180
to do what it can do. We'll share our technology with you. It'll be awesome. You'll be the best
00:56:01.380
leader that Russia ever had. You'll quadruple your GDP. There'll be statues of you everywhere.
00:56:06.940
And you will have ended war. I'm telling you, the golden age, you just have to accept it. It's
00:56:17.480
right there. It's right there. The hard part is getting people to just pick it up. It's like,
00:56:22.560
here it is. Here's all of your solutions. All of your solutions. We have all the answers now.
00:56:27.660
Here they are. People are like, I don't know. I don't trust those solutions.
00:56:36.640
So here's where the golden age, I think, is going to happen. I think that the energy shock
00:56:42.040
will cause us to be pro-nuclear in a way that we had to be. So our energy thing is now on a course
00:56:49.480
for full correction. It's going to be slow. But it's now on a very definite course toward full
00:56:56.320
correction. It's going to be nuclear. Energy will go nuclear. And through Tesla-like activities,
00:57:05.520
it will also be electric. And I'm saying solar. So it's going to be pretty much just solar
00:57:12.460
and nuclear. So I would say those two things solve climate change, or at least they're likely to be
00:57:19.220
enough to mitigate the worst problems as long as we're also mitigating things as we go. It's not the
00:57:25.600
only stuff. So that's better. What about the pandemic? Horrible, horrible thing, right? But don't you
00:57:32.600
think we got a lot better at handling the next pandemic? Like a lot better? And I feel like even
00:57:39.860
though you think these new vaccinations are killing people, some of you think that, I feel like what we
00:57:45.700
learned from that could create a platform for everything from vaccinations for cancer to vaccinations
00:57:51.440
for all kinds of stuff. Or we'll find out there was some problem with it. Can't rule that out.
00:57:57.980
I'm giving you the optimist view. Then look at the Ukraine-Russia war. Does it forever end the idea
00:58:07.540
that it's a good idea to attack your neighbor with tanks? Apparently not. Well, I mean, you know, up till now,
00:58:15.660
I guess Putin still thought it was a good idea to attack his neighbor with tanks. And at this
00:58:22.120
point, don't you think this will always be the cautionary tale? It's like, okay, all right, it
00:58:27.260
doesn't work. I think Afghanistan, as bad of a situation as it was, at least will always remind
00:58:35.200
us of what kinds of things not to do again. But Ukraine didn't look like Afghanistan. They looked
00:58:42.620
like such different places that maybe the lesson didn't transfer. But now you've got an industrial
00:58:49.040
country. You've got, you know, a backwards country. And neither of them could be conquered by the
00:58:55.260
Soviet Union or Russia. That should mean something. If you couldn't conquer either kind of country and
00:59:01.020
they're so different, maybe the whole country conquering thing isn't for you. All right. What else is
00:59:09.320
good? So we're going to solve energy. I think we're safer from pandemics.
00:59:19.080
And I think war looks less likely. I think the biggest problem is inflation.
00:59:25.880
All right. Let me give you the ultimate economic safety thought. Are you ready? Yeah. Most of you
00:59:34.260
have some concern, I would think, about inflation and GDP going down and maybe food shortages and
00:59:43.000
everything else. Most of you are starting to have a little anxiety about that, right? And gas prices.
00:59:47.360
Not a little. Maybe a lot. So some of you are having a lot of anxiety. I'm having anxiety about it.
00:59:55.640
And I'm rich. You know, relatively speaking. I can't even imagine how this would feel if I was just
01:00:03.820
squeaking by. This would feel like insanely bad. But let me give you the one positive thing that I
01:00:13.120
can give you. I've told you before that economics is real complicated stuff. I have a degree in economics
01:00:19.620
and I'm confused half the time. But there was one rule that I always look to that tells me where
01:00:25.760
things are headed. There's just one metric. If you get that one metric right, all the other things can
01:00:32.400
work themselves out. But if you get the one thing wrong, nothing can work itself out. Do you know what
01:00:38.840
the one thing is? Employment. Employment rate. Not even raises, not even cost of living adjustments, not unions.
01:00:51.000
Just the pure number of people who have jobs compared to the number who want them. And the fact that we have
01:00:56.880
labor shortages now. So we could actually use more workers. Even immigrants are in demand. Apparently, with all the
01:01:05.680
immigration we have, it's still hard to get the harvest picked, I think. So we're actually
01:01:10.920
understaffed. I don't know of any situation where you had close to full employment where you couldn't
01:01:18.280
work out the other stuff. You know what I mean? And here's the math of why that is so important.
01:01:25.040
The difference between an unemployed person and an employed person is a gigantic drag on the rest of
01:01:30.820
the people, right? If somebody is employed and they're taking care of themselves and maybe adding something, you know,
01:01:37.460
to the taxes, then that's taken care of. But if they're unemployed, not only are they not adding, but they're
01:01:44.680
subtracting. You have to pay them to live or they die. So one unemployed person is really, really expensive compared to
01:01:53.340
almost any other problem. So if you get that one thing right, the other stuff can be really painful
01:02:00.500
for a while, but the odds of it working out in the long run are real good. So when you're seeing people
01:02:06.780
who seem to be the most knowledgeable and experienced about economics, and they see all the things you see,
01:02:13.080
you know, they're seeing the inflation, they're seeing the, you know, it might get worse. They're seeing
01:02:17.520
everything. Supply chain problems, you know, China problems, blah, blah, blah. They see all that,
01:02:24.760
but they don't look like they're panicking, do they? Have you noticed that? There's no tone of panic,
01:02:31.820
even though all the metrics are sort of awful. Because that one thing is right, the jobs.
01:02:39.400
And I think the people who know the most are sort of just looking at that one and saying,
01:02:43.460
you know, probably much like I do, say, okay, as long as that one thing's okay, at least our
01:02:50.560
foundation is in good shape, right? The foundation is strong. Then you can work out the rest.
01:03:01.340
Mushrooms is tulips. Yeah, I see what you're saying.
01:03:08.580
Labor participation. Yes, that's a big one as well.
01:03:13.460
Total employment is still below pre-pandemic, but it's still good.
01:03:24.740
How do we fix BlackRock buying all the single-family homes?
01:03:29.540
The solution for housing is better housing, not building the same kind of homes and reselling them
01:03:36.520
over and over again. There is definitely a way to build a home for 10% of what it costs today.
01:03:43.460
So I think that's going to be another part of the golden age.
01:03:50.220
I think that if you got rid of zoning and you turned it into a kit,
01:03:55.480
I've talked about this before, but imagine designing homes in which all the parts come in even measurements.
01:04:05.140
In other words, a room in this house of the future could be 10 feet by 12 feet, but it could never be 10 feet by 12 feet and 2 inches.
01:04:16.520
And the reason is so you'd never have to cut anything.
01:04:20.360
So if you're putting in the floor, you buy one-foot squares, and you put in as many as you need,
01:04:26.480
and nothing gets cut, and maybe it comes as a kit so everything snaps together,
01:04:30.940
so you could unsnap it and move a wall if you needed to, etc.
01:04:33.860
Now, I think that's where it has to go.
01:04:38.960
It's just that nobody has a business model to make money from making that.
01:04:43.380
The old Sears kits, so I know there's some historical examples,
01:04:49.980
but those kits I believe you still had to cut, didn't you?
01:04:54.440
Or not?
01:04:55.240
I think those were, if you imagine what we could do now compared to what they could do in those days,
01:05:02.420
I imagine that we could make a kit that would be way better, just way better.
01:05:09.860
And then if you make the homes with, so here's how I do it.
01:05:14.660
I'd design perfect rooms, and then you could design a house that used as many of those pre-set rooms
01:05:22.500
in whatever configuration they fit.
01:05:26.620
And then you could build almost any kind of a house, but you never have to cut anything.
01:05:30.420
You just get a kit and snap it together.
01:05:32.800
So I think that's where it needs to go.
01:05:34.240
The tough part would be, I mean, I think you could even do the plumbing.
01:05:40.660
In fact, I'll bet you could make a house that's self-aware.
01:05:44.420
Imagine a house that comes as a kit, and each part has a camera in it.
01:05:49.560
So every part you put becomes alive, and it can see around itself, and then it attaches to some brain.
01:05:58.380
So the house could see a leak in your wall before you knew it was there.
01:06:02.540
But it could also tell if you'd assembled it correctly.
01:06:05.640
So you put the new component on, and it becomes alive, because it's like electrically connected to everything else.
01:06:11.460
And it can see around.
01:06:12.900
And suddenly the house knows if you put the thing in the right place.
01:06:16.680
So you put the new piece on, and the house goes beep, beep, beep.
01:06:20.900
It doesn't belong there.
01:06:22.100
You're like, oh, darn, it's backwards.
01:06:24.580
Put it on, it goes beep.
01:06:26.480
And it knows you did it right.
01:06:27.940
So if you made all the components alive, the house would help you assemble itself.
01:06:34.260
And it would always watch.
01:06:35.420
It'd be watching for any defects, and it would warn you every time.
01:06:39.640
All right, just an idea.
01:06:41.280
That, ladies and gentlemen, is the best show ever.
01:06:44.000
I think I delighted and entertained you.
01:06:47.640
Probably some of you are about to start one of the best days ever.
01:06:51.700
And I think that you would agree that today is the beginning of the rest of your life.
01:06:59.340
No, it's true.
01:07:00.600
I read it in a greeting card once, so I know it's true.
01:07:03.860
And that is all I have to say to you on YouTube.
01:07:06.100
I'll talk to you tomorrow.