Episode 1414 Scott Adams: Find Out What Level of Awareness You Are at While Simultaneously Sipping
Episode Stats
Words per Minute
145.9
Summary
Scott Adams is back in New York after a trip to Greece. He talks about the crush of people at JFK Airport yesterday, and why he loves being back in America. Plus, a new micro lesson on how to determine your level of awareness.
Transcript
00:00:00.000
Well, wow, this is going to be the best coffee with Scott Adams of all time, no doubt about it.
00:00:11.340
It's the first time I tried to talk today. It's not going well, but it's going to get better.
00:00:17.860
So I've been traveling and not doing a lot of sleeping. So I just stayed up all night last
00:00:25.400
night. And today, I'm ready to go. I'm going to adjust my time zone a little bit. And here we go.
00:00:36.680
Now, I want to warn you about what's coming. There's going to be an embedded micro lesson
00:00:42.620
on determining your level of awareness. Some of you might not like it. But the magic here is that
00:00:49.700
the stories we're going to start with are all going to build toward that point. Watch it happen.
00:00:55.400
It's going to be like magic. Later, I'll pick out the micro lesson and publish it individually on
00:01:01.840
the Locals platform because subscribers on Locals get a little bit extra.
00:01:09.400
Now, before we begin, would you like to do the simultaneous sip? Yeah, I'm pretty sure you would.
00:01:14.440
And all you need is a cup or a mug or a glass, a tank or chalice or stein, a canteen jug or a flask, a vessel of any kind.
00:01:19.360
Fill it with your favorite liquid. I like coffee. And join me now for the unparalleled pleasure,
00:01:24.040
the dopamine hit of the day, the thing that makes everything better. It's called the simultaneous sip.
00:01:31.820
Let me tell you, as much as I enjoyed this trip, it was to Santorini, Greece. I love being back in
00:01:49.280
America. So it turns out, I really like this country. And I like being in it. And being back
00:01:56.300
on American soil feels really good. So let me tell you some stories.
00:02:02.080
Number one, the JetBlue terminal at JFK yesterday was more packed than I've ever seen any terminal
00:02:14.700
ever. I mean, it was just a crush of people. You couldn't get near anything. You couldn't tell
00:02:20.620
that there was a line. It was just this mass of people. And, you know, there have been some
00:02:26.960
predictions that there'll be this rubber band effect. All the people who couldn't travel last
00:02:31.560
year, you know, they're going to double up and start traveling like crazy. Now, I know that American
00:02:36.520
Airlines actually canceled flights because they couldn't get enough employees. So the airports
00:02:43.000
are not working at what I would call 100% efficiency. But boy, is there demand. So JetBlue was just
00:02:50.240
through the roof yesterday. Literally, I've never seen an airport that packed. And honestly,
00:02:57.180
it was a little difficult. Not just getting things done, which was also difficult. But the psychological
00:03:05.440
impact of being in a massive crowd after a year of social isolation, it's hard. It's actually just
00:03:15.160
psychologically hard. Not in a medical sense. You know, I wasn't worried about anything medically.
00:03:19.740
I'm all vaccinated, blah, blah. But just the social experience of being surrounded by people
00:03:29.120
was pretty intense. And it's going to take a little while for us to work our way back into
00:03:34.440
the social headspace. Well, here's another reason to miss President Trump. I think this is real,
00:03:43.260
by the way. Somebody tell me if this is fake news. But I believe he actually put out a press
00:03:48.820
release on Father's Day saying, Happy Father's Day to all, including the radical left, RINOs,
00:03:55.080
and other losers of the world. Hopefully, eventually, everyone will come together.
00:04:02.260
Now, there's just nobody else in the world who can be this interesting this consistently. So I'm
00:04:09.700
seeing some confirmations that it's real. I swear, I swear, Trump doesn't know how to be
00:04:19.280
uninteresting. You know, after years of being interesting, both good and bad, I suppose you
00:04:25.040
could say, he just doesn't know how to turn it off. He's just perpetually interesting, no matter what
00:04:29.940
you think of him. But I love the fact that he could put out a statement like this, and it's barely news.
00:04:41.160
Let's see. Oh, interesting. Roberts asked me how many precursors for the talent stack and systems
00:04:47.520
stuff there are in ancient Greece. I don't know. I'm not aware of any precursors, but one assumes that not
00:04:57.240
every idea is original, right? There's always some precursor. The Panda Tribune, if you're not
00:05:08.840
following that account on Twitter, you should. The Panda Tribune. The handle there is at Panda
00:05:16.120
Tribune. And he tweeted at me a video of my face superimposed over Chuck Norris doing a bunch of
00:05:26.760
martial arts. And here's the thing. It's really convincing for something relatively low-tech, you know,
00:05:37.720
I'm sure it was low-tech relative to, you know, say Hollywood and CGI. And it's pretty convincing.
00:05:44.920
If you didn't know that my face had been, you know, substituted for Chuck Norris, it would kind of look
00:05:52.420
like it was really me. So how close are we to being completely fooled by a video? We're there.
00:06:02.800
We're at the point where the fake video can be created by, you know, people who just have access
00:06:08.800
to ordinary equipment. And it's going to look real. How long before you don't need to hire, you know,
00:06:16.120
real actors for movies? We're there. We're there. You already don't need real actors. You could
00:06:22.220
simulate them except for obviously getting sued for doing it. But technically, there's nothing to
00:06:27.500
stop you from doing it right now. Now, I told you that each of these stories is building to a larger
00:06:35.280
point. We're building to the micro lesson. You won't see yet how they fit together. But watch me
00:06:41.700
pull this mosaic off. All right. Larry Kudlow is saying that Biden gave up energy sovereignty,
00:06:52.180
American energy sovereignty. And, you know, the argument here is that agreeing to the Paris climate
00:06:59.860
accords is going to mean closing pipelines and not drilling in Alaska and doing a bunch of things that
00:07:05.960
Trump would have aggressively done, but Biden is pulling back on. Now, here's the reframe on this
00:07:14.300
that I think needs to happen. Whoever has the best energy policy is sort of going to be the winner
00:07:23.000
in the world of economics and power and even defense. So when we think of these energy programs
00:07:32.300
and we think about being more green, we really need to think of this as a national security issue
00:07:39.540
because the richest country pretty much wins the wars, right, one way or another. They either buy off
00:07:46.280
people or they bribe people or they have a better military. But the richest country wins pretty
00:07:53.040
consistently. And if you don't have a robust energy industry, you're probably not going to be that
00:07:59.720
country forever. So energy is probably the main component of a good economy, that plus tech,
00:08:07.500
I would suppose. But if you don't have energy sovereignty, you're really giving up a lot
00:08:13.180
militarily. And we don't talk about it that way. So here's an example where the way you look at it
00:08:19.880
can completely change what you think is important. If you're looking at it as climate change, well,
00:08:26.200
you know, then Biden's got an argument. You could disagree with it, but it's an argument. If you
00:08:31.820
look at it as a homeland protection issue, there's no argument. As soon as you say this is also self
00:08:40.840
defense, we're done. You need the best energy program that you can get for self defense. That one's not
00:08:50.040
debatable. Now, I suppose you could say if you're a nuclear power, you've always got that option. But
00:08:56.440
you don't want to, you know, you don't want to be talking about that option. All right, here's a way
00:09:01.400
to know that your opinions are assigned to you, as opposed, not you necessarily, specifically, but that
00:09:09.160
the public's opinions are assigned to them. And here's a classic, perfect example of that. So apparently,
00:09:16.700
as M Hackman tweeted, the number of unaccompanied kids in CBP custody, border custody, is at 1,040,
00:09:29.480
the highest it's been since April 27th. So we've got over 1,000 kids in custody at the border.
00:09:37.980
How big of a story would that be if Trump were president? Well, it's a big story either way,
00:09:45.180
right? So I think we would agree. Yeah, it's a national headline either way. But it's not really
00:09:52.300
even close to the same amount of energy, right? The energy that the news would be putting into the
00:09:57.900
story, if Trump was behind it, would be hair on fire. The energy that's put into the story with Biden
00:10:06.200
in charge. Let's mention it. Here's some data. Here's some statistics. This is really clear evidence
00:10:18.440
that your opinions are assigned to you. Because if your opinions were not assigned to you,
00:10:24.600
it would look the same regardless of who the president was, right? So are you seeing the mosaic
00:10:31.440
coming together yet? You will in a minute. So Adam, a user on Twitter named Adam, asked me this
00:10:40.940
question today. He says, is there any scenario where the intelligence agencies don't run the
00:10:46.400
government from the shadows eventually? Is there any possibility that intelligence agencies don't end
00:10:54.020
up controlling the government? And my answer was, not in the long run. In the long run, they have to.
00:11:01.420
It's basically guaranteed in the long run. Because they would have the tools to do it.
00:11:08.200
They would have the motivation to do it perfectly within their, you know, within their mission to
00:11:14.940
keep the country safe as they see it. So there's a lot of subjectivity in it as to what is morally right
00:11:21.200
and what is just common sense and what is just a smart way to manipulate the public and what is just
00:11:27.480
a good, sensible way to influence topics. You know, there's a very fine line between a sensible way
00:11:36.060
to influence a topic and just controlling the government from your intelligence agencies.
00:11:41.500
So given that the intelligence agencies have the motivation and the tools, and here's the important
00:11:47.460
part, lots of different people involved. So if somebody tries to influence the government and
00:11:52.600
it doesn't work and they get caught or outed or fired, there's always somebody else. So sooner or
00:11:59.100
later, the intelligence agencies are going to peck away at the, you know, the levers of government
00:12:04.760
until they own it. Because they know how and they have a reason to do it. And they have all the time
00:12:10.380
in the world because, you know, they exist for years. So the part we don't know is when it happens.
00:12:17.620
We don't know if it's already happened, probably has, or it's in our future. But it's guaranteed.
00:12:25.000
You can't avoid it. It would be, I can't even imagine a scenario where it wouldn't happen.
00:12:30.740
The only caveat to that is that the intelligence agencies don't care about most topics.
00:12:35.820
So most things are not going to be influenced by intelligence agencies because they don't care.
00:12:42.880
They don't care about picking up the garbage. They don't care about the, you know, the national
00:12:46.860
bird. They just care about some specific topics. And of course, they're going to be in control
00:12:52.100
of those eventually. So here's some fake news from the Guardian. They've got this story that is just
00:12:59.420
so over the top, ridiculously fake news that it should be embarrassing to be them, but I doubt it.
00:13:08.660
So they've got this story about some guy in prison who says he's the one who gave
00:13:12.660
Trump the idea of drinking bleach for coronavirus. Now, first of all, Trump never said drink bleach
00:13:21.020
for coronavirus. That's fake news level one. Level two is that if that were, if that fake news were
00:13:30.700
true, that Trump had suggested actually drinking bleach, which never happened, there's somebody
00:13:37.140
claiming that he's the one who gave him the bleach and he actually drank some of it. Now, nothing about
00:13:46.300
this story is true. It's just so obviously, laughably, ridiculously fake, but it's published
00:13:56.240
in the Guardian. It's published just like real news. And once you see how often, you know, large
00:14:05.380
publications will print things that are just obviously made up. I mean, you don't have to be a genius to
00:14:11.380
know this one's made up. You don't have to have any inside information. You can just look at it and
00:14:16.380
say, oh, that's, that's crazy. Just keep that in mind as we continue to put the mosaic together.
00:14:26.320
Did you see that the McCloskeys, the couple who had the AR-15 and a pistol and were defending
00:14:33.560
their home against, I think it was, Black Lives Matter protesters. And I guess they must
00:14:42.440
have pled guilty to some lesser charge, which involved them giving up their weapons. So they
00:14:48.620
had to turn in their AR-15 and, I don't know, probably the pistol. But here's
00:14:57.060
the weird part. They had to turn in their weapons, but there was no prohibition about them driving to
00:15:04.280
a gun store and buying new weapons. So they did. That was their penalty. Their penalty is we're
00:15:12.660
taking your weapons away, but nothing to stop them from immediately buying replacement weapons,
00:15:19.480
which they not only did, but they took selfies. Hey, here's us buying our new replacement weapons,
00:15:25.240
which we can totally do legally. And the whole time I was looking at this, I thought to myself,
00:15:30.200
that guy's a good lawyer. I mean, can you imagine a better legal outcome than, well, we're going to
00:15:40.100
take your weapons away, but you can buy replacement weapons right away and you're going to get some
00:15:44.340
publicity and you'll be more popular than ever. He is a good, good lawyer. It's, you know, it's not a
00:15:50.740
coincidence. He's rich and lives in a big house. All right, here's a simulation alert. Apparently
00:15:57.720
there's a Democrat senator named Whitehouse. This is his actual last name, Whitehouse. Now,
00:16:06.700
of course you think he's going to run for president because his last name is Whitehouse, but it also
00:16:11.920
has white right in the name. And interestingly, the New York Post is reporting that he's been a member
00:16:19.660
for decades in an allegedly all-white private beach club. And, you know, he's a progressive. So
00:16:29.040
this is a, you know, extra hypocritical story, but I'm going to say put a pin in that one. I think this
00:16:38.520
smells a little bit too much like fake news, meaning is there really a beach club in 2021 that actually,
00:16:48.000
literally says, we're not going to let in anybody who isn't white? Is that literally
00:16:52.800
happening? Or is it simply where they live? Unless you're rich and unless you know about it, you're not
00:17:00.820
going to be in this club. So maybe it just is, you know, a natural outcome of who they are and what
00:17:06.060
they're doing. Which one is it? I'm a little skeptical about this story. Could be true. All right. So I'm not
00:17:14.740
going to rule out the possibility that's exactly right. The New York Post has a pretty good, pretty
00:17:21.000
good record lately of getting stuff right that other people are getting wrong, right? Wouldn't you agree?
00:17:26.700
So the New York Post has been kind of solid on some of their scoops, but this story doesn't quite,
00:17:35.160
quite fit. So keep an open mind about this. But the simulation alert is that his name is Whitehouse,
00:17:42.500
and he was allegedly in an all-white club. I don't believe it. All right.
00:17:51.800
Rasmussen had a poll that asked, how important is freedom of religion to a healthy society? 82% said
00:18:00.260
very or somewhat important. Remember I told you that you can always get a solid 25% or so who will
00:18:08.400
answer any poll in a way that's just crazy. And I think that 25%, you know, could be 20, could be a
00:18:17.080
little more than 25. But every poll seems to have this little solid 20, 25% people who either didn't
00:18:25.600
understand the question or were intentionally trying to ruin the poll or are amazingly stupid
00:18:32.940
or some combination of those things. It's just every poll. And so it's almost as if, if you get a
00:18:38.860
result that says 82% of people say anything, you should think of it as 100%. Because the other people,
00:18:46.800
they're just, I don't know, what's wrong? Who in the world, in the United States, would say that
00:18:53.460
freedom of religion is not important? Really? Really, you can get, you can get almost one out of five
00:19:02.040
Americans, likely voters, who will literally say the freedom of religion isn't terribly important.
00:19:11.720
I'm not sure that that's true so much as it is a snapshot of civilization, which is you can get 20%
00:19:19.400
of people to be wrong about anything. If you said oxygen isn't important for human survival,
00:19:26.600
20% of respondents would say, I don't know, I think oxygen is not real. I think oxygen is probably just a
00:19:36.980
rumor. So 80% is 100% in my view. So British doctor Peter Daszak, who got a bunch of people together to
00:19:51.140
sign a letter that got printed in The Lancet medical journal, saying that the Wuhan lab leak theory
00:19:57.540
was not credible. Turns out he got fired. He got fired from the UN Commission investigating COVID.
00:20:07.000
Because not only was he maybe not helpful, he was the opposite of helpful. And apparently,
00:20:14.520
he had long ties with the lab and basically did something that on the surface looks like the least
00:20:24.540
ethical thing I've ever seen in my life. Right? Now, again, there might be something to the story
00:20:33.160
that we don't know that would soften that opinion. But I don't know what it would be. It literally looks
00:20:39.520
like the least ethical thing ever. I mean, you'd have to go back to, you know, industry suppressing
00:20:48.340
information that smoking tobacco causes lung cancer. I mean, you'd have to go, you'd have to dig pretty
00:20:54.760
deep to find something less ethical than what happened here, apparently. So keep that in mind when
00:21:04.020
you're thinking to yourself, well, those experts, those experts told me what to think. He was an
00:21:10.540
expert, and he got a hundred other experts to sign something that wasn't even close to true. Not even
00:21:16.680
close. So here's a little story that proves the Gell-Mann amnesia. Let's see, what do you call it? A theory
00:21:27.160
or a concept. And the idea is that if you happen to be the topic of a news report, you know that it's
00:21:35.560
fake, but the people reading it don't know, because they don't know what you know. Or, more
00:21:40.080
specifically, the Gell-Mann amnesia is if you're an expert in a field, you can tell the stories about
00:21:46.260
that field are bullshit, but you can't tell that the other stories are untrue because you're not an
00:21:51.080
expert. So here's an example of that. Most of you might be familiar with Christopher Rufo, who has
00:21:59.380
been doing amazing work uncovering some of the race-related education in businesses and schools.
00:22:08.360
Naturally, he became a target for the left because he was doing such a good job of exposing the, let's
00:22:17.540
say, clumsy and/or dangerous ways that race was being taught, both in corporations and in schools.
00:22:26.840
So the Washington Post does this hit piece on him, and he basically just strangled them. I've never seen
00:22:36.940
anybody beat the media as convincingly. So here's what Christopher Rufo tweeted about it. He said,
00:22:46.320
winning. The Washington Post's hit piece against me has collapsed. They have admitted to fabricating a
00:22:52.960
timeline. Just listen to this. These are the things they've admitted. They have admitted to fabricating
00:22:59.800
a timeline, retracted or added six full paragraphs, to give the right context, reversed a key claim.
00:23:08.540
In other words, just said that it was fake news, and failed to produce evidence of a falsified quotation.
00:23:18.700
Now, does that sound bad? Now, imagine this. This is normal. If you've never had a hit piece written
00:23:27.920
about you, and I'm lucky enough to have had a hit piece written about me, several of them actually,
00:23:33.640
this is actually pretty normal, that even a respected publication, I guess that's, you know,
00:23:42.060
subjective, can just make up stuff, just totally made up, and destroy somebody's credibility
00:23:49.880
just by printing it. Now, even though Christopher Rufo basically just took the piss out of them,
00:23:59.380
I've never seen anybody just destroy an article like this so convincingly. But here's the bad news.
00:24:07.180
How many people who read the article, you know, know about the corrections? Not many. So the hit
00:24:15.680
piece still works, even though it's been uncovered as unscrupulous. It still works. So that's the world
00:24:22.500
we live in. So can you trust the news when it talks about an individual who is, let's say, obviously a
00:24:30.340
target for either the left or the right? So this isn't just about left or right. This is just the way
00:24:35.320
stuff works. No. Anytime you see a hit piece about an individual, just say to yourself, probably it's
00:24:44.080
just a hit piece. And the details might be missing a little context and the quotes might be made up.
00:24:51.300
Now, I want to see this in the comments. How many of you think that major publications routinely
00:24:59.480
make up quotes, actually put quotes on them, and publish them as if the subject of the article said
00:25:06.900
them? How often do you think that happens, that the quote is literally made up? All the time. Until it
00:25:17.200
happens to you, you just wouldn't even believe it. But I've been the subject of lots of articles, and I
00:25:25.440
see almost, I don't know, half of the time, there's a quote in there that I didn't say. It's just something
00:25:33.420
they think sounds like something I would have said. So they just come up with a quote,
00:25:39.600
put it in quotes, and just say, I said it. It happens all the time. So when you see anything in
00:25:46.700
quotes, don't believe it. It might be true, and it might not be true, but don't believe it just because
00:25:53.640
it's in quotes and it's in a major publication. It doesn't mean anything, and never has. It's not
00:25:59.880
like a new phenomenon. The things in quotes are literally, frequently made up in all kinds of
00:26:08.340
publications, you know, big and small and respectable and not. Jack Posobiec tweeted this
00:26:16.600
provocatively today. I think it was today. The New York Times in October 1903 predicted that a flying
00:26:24.400
machine would take scientists millions of years to invent. That was in 1903, same year that the Wright
00:26:31.780
brothers had their first flight. So the New York Times was off by millions of years. The smartest
00:26:42.820
people who are all over this were wrong by millions of years. And as Jack points out, neither of the Wright
00:26:52.640
brothers attended college, which is an interesting side point. Yeah, I added to this point my own little
00:26:58.980
experience. All right, so here's a true story. In 1995, I was in a meeting with top Hollywood executives
00:27:06.220
and the top agents, not the top, but among the top executives and agents, like really, you know,
00:27:14.100
A-plus people. And I proposed that if we did a Dilbert movie, which is what we were there to talk about,
00:27:20.200
that it should be sort of what I was describing as a computer-generated, like 3D movie. Today,
00:27:26.960
you would call it CGI. But back then, you know, the language wasn't quite as clear. And so I said,
00:27:33.840
we shouldn't do this typical flat animation, you know, like The Simpsons. We should do like a 3D
00:27:39.760
computer-generated thing. The experts in the movie industry sitting around the
00:27:47.120
table, unanimously told me, can't be done. We don't have the technology to do that sort of a thing
00:27:53.980
and make money at it. This was 1995. A few months later, Toy Story hit theaters.
00:28:03.720
The smartest people in Hollywood didn't know Toy Story was already well into production. It
00:28:10.240
made over $300 million. So we live in a world in which your experts sometimes are off by millions of
00:28:19.160
years. There's an interesting video I tweeted. You can see it in my Twitter feed. And it's done by
00:28:28.800
Maze, M-A-Z-E. His Twitter handle is, I think it's M-A-Z-E-M-O-O-R-E. And what's interesting about
00:28:39.580
this is that it shows, you know, the clips of Harris talking about the border in a very open
00:28:47.140
border way and then cuts to clips of her talking more recently. Don't come, don't come. And if you
00:28:52.960
come, we'll send you back. So basically it's showing a complete flip-flop of opinions about
00:28:58.480
immigration. Now, we'll often see this with vice presidents in particular. So you shouldn't make
00:29:04.800
too much of it because it's sort of a vice president thing to have to reverse some specific
00:29:10.780
topics when you're working for the top guy, or top person, let's say, let's be less
00:29:16.280
sexist. And here's what I wanted to point out: the talent stack on Maze. So if you just
00:29:27.640
look at the profile, this is someone who is a digital artist, a video editor, and a researcher
00:29:34.440
and says, you have seen my work, which is probably true. So that's a pretty strong stack, isn't it?
00:29:42.040
So I'm only pointing this out, not so much because of the Harris part, because that's sort of typical for
00:29:46.980
a vice president to do a flip-flop. But look how strong this is to be a digital artist at the same
00:29:53.420
time you do video editing and you're a researcher. So you can find stuff, that is the
00:29:59.420
content, and you're doing your own video editing. Every time I see a good talent stack, I like to point it
00:30:04.740
out. Bari Weiss on Substack has a guest essay by Abigail Shrier. And the title of it really,
00:30:13.760
really caught my attention. The books are already burning. Now you think to yourself, well, you know,
00:30:21.860
they're not really burning books in 2021, right? Because that would be the worst thing you could
00:30:26.400
imagine. When you were a kid, didn't you always hear that if you were in a world where people are
00:30:31.360
burning books, that's the worst place to be. But we're there. It's just that the books are YouTube
00:30:37.080
clips. So the burning of books is a 2021 phenomenon. The worst thing that you could imagine for the
00:30:47.300
health of society, which is to, you know, delete people because they disagree with the mainstream,
00:30:54.280
that's going to turn you into China. That's going to turn you into the country that can't innovate.
00:31:01.320
It's the people who are wrong and have the freedom to be wrong in public who drive everything. Because
00:31:08.220
sometimes the people who are completely wrong, well, sometimes they turn out to be the Wright brothers,
00:31:13.240
right? So you need this freedom to be wrong and really, really wrong and wrong a lot in order for
00:31:21.200
society to move forward. I would say one of the great systems that makes America at least
00:31:29.640
economically dominant is that you can be wrong as hell in America. You can start a company and
00:31:35.380
nobody buys your product. You can fail like crazy. You can fail on seven different things before you
00:31:40.800
succeed. I mean, you can say things in public that are just stupid, you know, when you find out what's
00:31:45.980
real, you can fail like crazy in America and have the wrong opinions and still be okay. So this is
00:31:56.700
really dangerous. And one of the topics here was the DarkHorse podcast with
00:32:03.660
Bret Weinstein. So that's getting a lot of pressure. Now I'm not saying that I'm endorsing everything
00:32:10.500
that's in that or any other content. I'm saying that if you don't allow them to be right
00:32:16.500
or wrong, I'm not the one who could judge that, but if you don't let them be right and let them be
00:32:23.080
wrong, you've just destroyed the engine of America. Being wrong is the engine of America
00:32:31.020
because that's what allows us to iterate until we hit something that works, right?
00:32:37.080
You can't hit something that works by shooting at it once, one person, you know, hit that target, ooh,
00:32:43.320
let's hit that target. It's about trial and error. This country is one trial and
00:32:49.220
error mofo. What we do best is fail in a good way. Coincidence or not, judge this. Is this a
00:33:01.760
coincidence or not? Today, Nick Gillespie of Reason, you all know, I hope you know of Reason,
00:33:10.220
the publication, he retweeted a tweet of mine about the airports being busy.
00:33:16.820
Now, is that interesting? Yeah, not especially. It's just Nick Gillespie of Reason tweeted something
00:33:23.560
about an airport that I tweeted. The weird part is that yesterday I was literally in an airport
00:33:31.300
talking about Nick Gillespie because I was in the Athens airport and Liz Wolfe, who's also with
00:33:39.560
Reason, recognized me sitting there and introduced herself. And so I said, oh, you know,
00:33:44.800
blah, blah, blah. You know, I know Nick Gillespie. And what are the odds that on one day I'd be talking
00:33:52.440
about Nick Gillespie in an airport? And the very next day, Nick Gillespie would retweet just one
00:33:59.020
of my tweets. And I don't think he's retweeted anything from me in a while. It's about an airport.
00:34:06.000
What are the odds? Coincidence? Hold that thought. Hold the thought.
00:34:13.460
And now we're going to give you the micro lesson. I'll bet you're glad you stayed around for this.
00:34:26.420
This micro lesson is not about what is true. It's not about what is true. All right. Now let me introduce
00:34:34.220
it. Here's a micro lesson on understanding what level of awareness you're operating at. Now, first of all,
00:34:43.240
this is a way of looking at the world. Don't think of it as true or false. It's simply a frame or a
00:34:49.960
filter you can put on the world that either works and helps you understand things or it doesn't. So
00:34:56.120
judge it only by whether it's useful, not whether it's true. And let me tell you what these levels
00:35:03.740
are. And this is based on my own observation. So everything here is just from me as the source.
00:35:10.620
When you are born and you're a child, I'm going to call you innocent. You believe what your parents
00:35:17.600
tell you. You believe in Santa Claus. You believe in whatever religion they tell you is the right
00:35:21.840
one. You basically believe authority. But as you get a little bit older, you become what I call a
00:35:29.300
truther. Somebody who thinks that the facts and the truth are what really matter. And you understand
00:35:36.580
that people could lie to you. Your parents could lie to you about Santa and the Easter bunny and
00:35:42.420
you know, the tooth fairy. But other people, other adults could lie to you too. So you've got to be
00:35:47.560
careful. So you're a little higher level of awareness now because you know people can lie and you know
00:35:53.260
that the facts and the truth are the most important thing. Unfortunately, once you believe that the
00:36:01.880
truth is the most important thing, you become a victim. Because the truth is not something that you
00:36:09.480
have access to. It's something that's provided to you and you tend to accept it. That is to say that
00:36:16.460
there are people in power who control what the truth is. So the moment you say, what matters to
00:36:22.760
me most, what will guide me in my decisions, my affiliations, will be the truth, you become a
00:36:30.760
victim. Because leaders will feed you a truth that you'll believe that will be putting you in a victim
00:36:37.960
category. The leader will say, hey, you're black. You need to be with all the other black people and
00:36:44.140
ask for certain things. Hey, you're a member of the LGBTQ community. You need to be a victim so that
00:36:51.020
I as a leader can get you some better stuff. You know, better stuff in terms of a better life.
00:36:58.980
So the moment facts are your most important criteria, you will almost always be dragged into
00:37:05.240
the victim level of awareness. And leaders will tell you that they have the truth. You will accept it
00:37:11.760
because you tend to affiliate with a side and accept their truth. This is a bad place to be.
00:37:19.640
Next, if you can climb out of that, maybe through experience and just thinking about things right,
00:37:25.500
and maybe you have a good mentor, you can raise your level to be a skeptic. A skeptic is someone who
00:37:32.720
rejects the assigned opinions. Doesn't mean you have the right answers. Doesn't mean you're the smart one.
00:37:38.940
It just means that you don't automatically take the approved answer. You become skeptical.
00:37:45.880
So this is a higher level of awareness, but has a limited utility. It can keep you out of trouble
00:37:51.060
by keeping you skeptical of things that might hurt you, but there's still a ways to go.
00:37:58.640
The next level is what I call the strategist level, where you say to yourself, I don't know what's true,
00:38:04.760
but I do know what works. I know that if I have more talent and build a proper talent stack that I
00:38:11.840
will be more effective. I know that if I use a system rather than a goal, I'm going to get a
00:38:17.120
better result. I know if I work hard, I'll get a better result than if I don't. I know that if I
00:38:21.960
network with lots of people, I'll get a better result than if I don't. So the strategist is not
00:38:28.260
working on so much what is true, but rather what they observe works. And then at the top level,
00:38:37.080
I call this the author level. Now, can the author actually change the simulation? Let me change my spelling here.
00:38:45.980
Terrible speller. Simulation. That's close enough. I'm not saying an author or a person operating at
00:38:57.740
that level of awareness can literally change reality, because we're not smart enough to know
00:39:02.360
that. We don't even know what reality is, much less that someone is changing it. But they will have
00:39:08.400
the experience of it. And when you observe them, it will seem as if they can. Who would be in this
00:39:15.600
level? Well, I would put Trump at that level. Trump doesn't care so much about the facts, right?
00:39:24.620
You know, he's loose with the fact-checking. He is certainly above a skeptic, and he certainly
00:39:32.140
understands strategy. So he's passed through these levels to the point where he literally just makes
00:39:38.560
stuff happen out of nothing. How about Mike Cernovich? He's an author. He does things that
00:39:47.800
you almost can't understand would be possible. He's simply authoring the reality, or it looks
00:39:53.400
that way, right? So keep in mind, I'm saying it's the appearance of changing reality that's the
00:39:59.180
part we can observe. We don't know what's really happening under the hood.
00:40:02.180
How about Naval Ravikant? In my opinion, he's operating at the author level, meaning that if
00:40:12.820
you looked at his life and what he's able to do, it just doesn't seem normal. It's almost as if
00:40:19.340
he can manipulate reality itself. So people can be in more than one group, but I would submit to you
00:40:30.080
that one way to use this is if you are in a disagreement with somebody, you might not
00:40:36.660
actually be disagreeing. And I have this problem quite a bit on Twitter. There are a number of
00:40:42.840
people on the truther level who will come at me on Twitter, and they'll say, for example,
00:40:49.360
transgender people are whatever they were born as. This is a man, and this is a woman,
00:40:57.100
and that's just a fact. That's the truther level. If somebody at the truther level, the
00:41:02.980
fact level, where the only thing that matters is the facts, gets into a debate with somebody
00:41:08.860
who's operating at a strategy level, at least, much less the higher level, these two people
00:41:15.780
will look like they're having a debate, but they're not. They're not even in the same,
00:41:22.440
I don't know, realm of reality or awareness. So for those of you who say to me, but Scott,
00:41:30.620
it's just a fact this is a man. It's just a fact that this is a woman. Well, if facts are what
00:41:37.700
matter most to you, okay, but we're not having the same conversation. I'm at a strategy level,
00:41:44.740
and I'm saying, well, what works? Given this set of uncertainties and disagreements,
00:41:51.580
what do you do about it? What's the system that makes all that work? And that's a different
00:41:57.520
conversation than is somebody definitely a man or definitely a woman. Might be fun to talk about,
00:42:04.960
doesn't have much use in the real world versus strategy. And the higher level is if you want
00:42:10.820
something to happen, make it happen. And that is your micro lesson. So notice that the stories
00:42:20.720
that I talked about were all about subjective reality. They were all about fake news, and they're
00:42:27.420
all about what you believe. Once you accept the fact that you can't really know what's true,
00:42:34.540
then you have the ability to rise up the levels of awareness and make yourself more effective.
00:42:40.160
If you're completely limited by what the facts are, then your ability to author the simulation
00:42:50.040
will just not be something you can do. But the people who understand that there's something about
00:42:58.900
this reality that doesn't quite make sense in terms of our factual or even scientific understanding,
00:43:05.740
once you see how often the same people can seem to author the simulation, because it's the same
00:43:13.100
people, right? But somebody who can do it does it more than once, right? Elon Musk is authoring the
00:43:21.700
simulation. Elon Musk is not trapped in any of these lower levels, right? I don't even think he's at the
00:43:29.800
strategy level, although clearly he understands strategy. I think he's operating at the author level, unambiguously,
00:43:36.720
in my opinion. It's hard to know from the outside. But it looks to me like he's changing reality
00:43:42.200
consistently. And when somebody can do that, you have to say to yourself, how do I do that?
00:43:50.280
And that is your live stream for today. And I'm going to go do something else. And I am so happy to be
00:43:58.680
back in America, even though vacation was terrific. And I hope you enjoyed this live stream. And if you
00:44:06.940
want to see the individual awareness level micro lesson, that'll be posted on the Locals platform,
00:44:13.460
subscription platform, a little bit later today. And I will talk to you tomorrow.