Episode 1640 Scott Adams: Joe Rogan's Video Response and How the Pandemic Changed Reality
Episode Stats
Words per Minute
144.45
Summary
In this episode of What I guarantee will be the Best Thing That Has Ever Happened to You in Your Life, Scott Adams takes you on a mind-blowing journey into the mind of Naval Ravikant, the smartest person in the world.
Transcript
00:00:00.000
Good morning, everybody, and welcome to what I guarantee will be the best thing that has ever
00:00:06.180
happened to you in your life. It's called Coffee with Scott Adams. Some say it's underrated.
00:00:14.680
They're all right. It's the best thing in the world, not the second best. And if you'd like
00:00:19.280
to take it up a notch to a level where we've never been before, all you need is a cup or mug or
00:00:27.380
a glass, a tankard, chalice or stein, a canteen, jug or flask, a vessel of any kind.
00:00:33.720
It could even be a Canadian truck. Fill it with your favorite beverage. I like coffee.
00:00:40.600
And join me now for the unparalleled pleasure. The dopamine hit of the day. You might feel a
00:00:47.700
little bit of a tingle. Chills? Anybody? Chills? It's called the simultaneous sip and it happens now. Go.
00:00:57.380
Only one word can describe this. Sublime. Let's try another word. Carrot. See, that didn't work.
00:01:12.020
There was only one word that could possibly describe that moment. Well, today's going to be a little
00:01:19.720
bit mind-blowing. I promise you. And we're going to build into it. So watch how this is not just
00:01:28.820
a series of little snippets. But by the end, you will say to yourself, my God, it formed a symphony.
00:01:36.580
At first, I thought it was just going to be the oboe and then a little timpani. But suddenly,
00:01:42.360
I realized it all came together into a symphony. That's what's going to happen today. That's how
00:01:48.100
good it is. Starting with a question that had been really on my mind lately, and I wondered if it,
00:01:55.540
is it just me? And watch what happens when I ask this question, because I did it on Twitter.
00:02:01.420
Watch what's going to happen in the comments. Is it my imagination or have people changed
00:02:07.100
because of the pandemic? I mean, basic personality changes. Big stuff. Go.
00:02:15.940
Watch the comments. Yes, yes, yes, yes, yes, yes, yes, yes, yes. Now, some no's. Some people say no.
00:02:23.640
But, oh my God, did I get a lot of response to that. And a lot of hypotheses about why that might be
00:02:32.440
the case. Now, hypothesis number one has to be what? What's the top hypothesis, if I've taught you
00:02:40.360
anything? It's just in your mind. The top hypothesis, until it's replaced by something better,
00:02:50.120
which is likely to happen. But your first thought should be, that's just in your mind.
00:02:54.480
All right. Now, that's just healthy thinking. I'm not telling you it's just in your mind. I'm telling
00:03:00.140
you that would be a healthy way to approach anything unusual. That's probably in our minds.
00:03:05.820
But let's see if we can tease it out a little bit. Here's a couple of things that smart people said,
00:03:12.300
and I'm going to put them together. One of the things that Naval said, Naval Ravikant,
00:03:18.840
for those of you new to the live stream or haven't heard his name before, smartest person in the world.
00:03:25.240
Maybe. I mean, I don't know that for sure. But if you were to just judge by things he has said and
00:03:34.140
done, maybe the smartest person in the world. All right. So you can go Google him and find out
00:03:40.020
yourself. But Naval, I'm pretty sure it was Naval. Do me a fact check, because I'm doing this by memory.
00:03:47.120
I think he said toward the beginning of the pandemic that one of the things he predicted
00:03:51.640
is that it would accelerate everything. Can you give me a fact check? He did say that, right?
00:03:59.020
He said it would accelerate everything that was going to happen anyway. So instead of 10 years,
00:04:04.060
you know, things would happen in one or two. Now, how is his prediction? How is that prediction?
00:04:10.780
It's Naval Ravikant, R-A-V-I-K-A-N-T, creator of AngelList, et cetera. So how is his prediction?
00:04:24.720
Did the pandemic speed up everything? It sped up vaccinations. It sped up commuting, you know,
00:04:33.080
going away. It's sped up online buying. It's sped up door dashing and food delivery.
00:04:46.180
It's sped up a lot of things. And I think there are probably various technologies that get a kickstart.
00:04:53.520
I could speak for myself. I would say that there are things that I had put off
00:04:58.140
that I brought forward just because, you know, I had time because we were locked down.
00:05:04.340
So even the, you know, the upgrades I did to the live stream are things that probably would have
00:05:10.100
taken longer, but I accelerated them because of the pandemic. Now, there might be other things
00:05:16.240
to slow down, like the, you know, in the short run, the supply chains. But in the long run,
00:05:23.580
you know, inflation got worse fast. Just the whole international relations changed fast.
00:05:30.960
Deaths were, yeah, even death was accelerated. It's like everything was faster. So I would say
00:05:37.160
that that was a darn good prediction. Now, I'm going to combine this with something I heard recently
00:05:42.800
that Bret Weinstein and I think Heather Heying were saying, and I wish, tell me the name of their
00:05:51.580
new book because I'm such an idiot. I was, I just looked at it and then I forgot to write it down.
00:05:58.280
In the comments, just say the name of their new book. Apparently it's pretty good. I hear good
00:06:02.180
things about it. Hunter-Gatherer's Guide. Thank you. But I don't know if this is, I think this might be
00:06:09.500
from the book, but in an interview, I heard him talk about how humans are the most adaptable
00:06:14.620
of really anything that's alive at this point. Now, and that makes sense, right? That we adapted to
00:06:23.520
all kinds of weather and all kinds of diets and all kinds of everything. And now we're finding this,
00:06:31.180
that we, that we're adapting faster and faster than we ever had to. Because the rate of change in the
00:06:37.660
external world is so fast that we're trying to keep up with the changes that are happening in the
00:06:44.320
environment. So we've gone from the, you know, the, the most adaptive creatures to having to super
00:06:53.300
adapt. And then the pandemic hits and suddenly the pandemic breaks all the laws. All the rules are
00:07:01.520
different. Like everything, everything you took for granted is in play now. Everything. Now you've
00:07:08.540
got a super adaptive species who's trying to figure out how to adapt, but we don't know what the hell is
00:07:14.560
going on. What are we adapting to? Exactly. Like everything's changing. All right. I'll adapt to
00:07:22.600
that. Wait. Oh, that's changing. Okay. I got used to it. Okay. That changed. So, so we're basically
00:07:29.640
in the state of insane flux because we're so adaptive, but that doesn't work if the environment
00:07:37.080
is changing faster than you can adapt. And that's where we are at the moment. What would you expect to
00:07:42.980
happen? What happened? Here's, here's just my personal hypothesis. I'll just throw in the pile. I think,
00:07:51.200
and a lot of you said some version of this. I think people were revealed for who they were
00:07:59.020
all along. I think that everybody became more of what they already were. Right? Everybody became
00:08:12.260
the extreme of what they started from. If you had a little bit of a weight problem, what happened
00:08:18.960
to you? A lot of people gained weight. If you were a fitness person, and I would say I would be in
00:08:26.500
that category, or even if your mind was, you know, oriented toward that way, what happened to you during
00:08:33.040
the pandemic? You got fitter. I'm at my peak fitness right now. Like, I don't want you to have to imagine
00:08:41.880
this. But naked, I look better than I've looked at any time in my life, and I'm pushing 65. And a lot
00:08:49.460
of people would say the same thing. There are a whole bunch of you on here who would say the same
00:08:52.640
thing. Leave out the naked part because we don't need to think about that. But the point is that,
00:09:01.080
let me say that lazy people became lazier. Just nod along as I say these things, because I know you're
00:09:09.000
going to agree. Lazy people became lazier. Cheaters cheated more. Cheaters cheated more.
00:09:18.260
People who were, let's say, achievement-oriented. Again, that's a category I would be in. Like,
00:09:26.800
I'm always thinking about trying to make something happen. Achievement-oriented people were even more
00:09:32.800
so. They went into hyper mode. Smart people became brilliant. People who were growing a little
00:09:42.060
grew a lot. Things that were failing slowly failed fast. You know, the little stores on Main Street
00:09:51.020
in my town, it's like they got raked away like leaves during the pandemic. But they were going
00:09:57.020
to fail anyway. It just wasn't going to be that fast. So here are some other things that happened
00:10:05.800
which would explain in many ways why we're so different. I think that people who were assholes
00:10:13.500
became more assholes. People who were nice became more nice. The people who were biased
00:10:22.360
toward helping people and empathy saw a crisis and they said, I was born for this. Literally
00:10:30.900
born for it. Because if you were born as a sort of empathy kind of a person, well, a crisis is
00:10:37.100
actually what you are born for. You know, not in the literal way, but you know what I'm talking about.
00:10:42.360
You're designed perfectly for a crisis because you care about people. So you jump right in and help.
00:10:48.180
So the people who are likely to help were very helpful. The people who are likely to be worthless
00:10:55.380
probably became more worthless than ever. Everything became more extreme. But the other
00:11:02.900
things that you have to throw in the mix is what happened to porn consumption during the pandemic?
00:11:10.960
I don't have data, but I'm going to take a guess. Anybody want to take a guess? Without the benefit
00:11:21.260
of any data, probably through the roof. Through the roof. I talked about the series on HBO, I think,
00:11:32.160
called Euphoria. It's about young people and, you know, working through the culture that's too much
00:11:39.340
drugs and too much porn and all that. And one of the things that the series, which really
00:11:43.880
tries to hit something close to reality for people in the age group, it talks about how it's an entire
00:11:50.900
generation that only learned sex from porn. Only. And no other source. Because by the time,
00:11:58.880
you know, your, I don't know, your high school or your parents got around to it, you'd already
00:12:03.000
consumed so much that it wasn't likely your opinion was going to get changed too much.
00:12:07.300
So apparently even, you know, two young people looking to hook up is going to look like porn.
00:12:14.520
Or it's going to look like their imitation of the best they can do to look like what they've seen
00:12:19.460
because we're an imitative species. So what's that doing to people? Well, something. I mean,
00:12:26.200
I don't, I'm not even going to give you an opinion, you know, how that's good or bad. You know,
00:12:31.700
you can make your own opinions. But it's definitely different. It's, you know,
00:12:35.140
if you don't think that'll change your brain, let me ask you, you all think that porn changes your
00:12:41.860
brain, right? Like it actually rewires you. You all get that, right? It's only a question of how
00:12:47.560
much you do, right? If you don't do much, it's not much of a big deal. If you do a lot, it just
00:12:53.600
becomes, it would just turn you into it. You become it. You merge with it, basically. So,
00:13:01.160
so there's that. Then there's the whole commuting thing. What happened when people were forced to no
00:13:07.820
longer be with their second family? A lot of people had two families, didn't they?
00:13:14.940
They had the work family, and then they had the home family. And then the work family went away.
00:13:20.460
What happens if you're buying stocks and you're not diversified? Anybody? Anybody? You're buying
00:13:26.720
individual stocks and you're insufficiently diversified, meaning not enough different
00:13:31.040
stocks. You're going to get wiped out sooner or later, maybe not right away. But if you're not
00:13:37.860
diversified, you're going to get wiped out. You have a 90% chance. So there are a whole bunch of people
00:13:43.580
who had their social life diversified, meaning you could have a bad day with your spouse, but at least
00:13:50.600
you go to work and there's your friends. Or you could have a bad day at work, but at least you can go
00:13:55.880
home and your spouse is nice to you. What happens when you just take away all the diversification
00:14:01.520
of your social life? And then what used to be this rich social life becomes your family members.
00:14:11.520
I'm sorry to say this, but there's nobody you can get sicker of faster than your own family members.
00:14:18.800
Right? And they're the people you love the most. You know, you still care about the most, love them,
00:14:25.940
you know, no change in that stuff. That's pretty much baked in. But oh my God, what stress to put on
00:14:33.060
marriages. I think that, you know, in the same way that all the small businesses got wiped out by the
00:14:42.460
pandemic. I think a lot of relationships got wiped out by the pandemic. I mean, I think the pandemic
00:14:49.260
just, and I don't know that we see the full result of that, you know, that's going to work through the
00:14:54.460
system. So almost everything was faster. And here's what's happened is I feel like, but let me tell you
00:15:04.900
my impression of what's different. So here's what's different for me. You know, I talk a lot about the
00:15:11.720
simulation too much, but how it feels to me is that I can see the machinery of reality in a way that
00:15:20.220
I couldn't see before, or that let's say maybe I knew about the machinery of reality intellectually,
00:15:27.960
just sort of philosophically, but I couldn't see it. I feel at this point, I can see it. It's almost
00:15:37.400
like, it's almost like there was a machine that had a solid front and now it's a glass front. The
00:15:44.500
machine is exactly the same as it was, but now I can see the mechanisms. And I believe that you're
00:15:50.660
having that experience too. And it feels as though society itself just had its software rebooted and we
00:15:59.300
all went to a higher level of awareness. I'm going to make that case with the headlines today.
00:16:05.800
So here's the theme, the theme that I just developed, that the headlines themselves, you can see the
00:16:13.800
machinery behind them like you've never seen before. Let me run through some examples.
00:16:19.500
Kyle Becker, who I tell you all the time, you should follow him on Twitter. He's got great,
00:16:30.120
sort of great reframings and lots of scoops and stuff on the news. And he gives us this little bit
00:16:37.360
of context about the January 6th situation. He says, the Democrats contested presidential elections
00:16:43.820
three times since 2001. They even argued voting machines were suspect. There were riots in DC at
00:16:52.220
Trump's inauguration. The amount of memory-holing these left-wing news networks do is truly impressive.
00:17:01.460
Now, you see the machinery, right? When I read that, you say, oh, that's a, that's a Rupar.
00:17:09.900
In other words, the entire January 6th narrative only works because, as Kyle points out, they leave
00:17:19.000
out the context. If you put the context in, if you reverse-Rupar it. Now, I'm not sure that all of
00:17:29.080
you would see this as instantly, even a few years ago. But now it's just automatic, isn't it? You just
00:17:33.900
see the machine. Here's some more examples. Lindsey Graham apparently said about Trump potentially
00:17:42.500
pardoning the January 6th rioters. He said, quote, I think it's inappropriate. So Lindsey Graham thinks
00:17:49.820
it would be inappropriate if Trump became president again to pardon those people. Here's Joel Pollak
00:17:57.340
giving you some context. He says, I want to hear him explain why the guy with the buffalo horns
00:18:04.560
got four years while an FBI lawyer who doctored an email to deceive a FISA court in the Russia
00:18:11.920
collusion probe got community service. Now, you probably haven't heard it so clearly
00:18:25.460
and well stated before. But you could see that machinery, couldn't you? We could already
00:18:32.140
see that these were political prisoners. It's just, you know, Joel helps us put it in context
00:18:37.720
there. But you can see the machinery behind the, behind the glass facade. And, you know, of
00:18:46.840
course, we've lost all trust in our institutions, as Joel says. In a, on Twitter, they have these,
00:18:54.260
what you need to know sections. Every now and then there'll be a topic that Twitter helpfully
00:18:59.980
summarizes, you know, what you need to know. It's usually some bullet points. So, um, I sure
00:19:08.920
hope I wrote that down. Oh, yes, I did. Here it is. So what you need to know. So there were
00:19:19.100
one, two, three, four bullet points. So these would be four things that are so obviously true,
00:19:27.160
they could just be putting a bullet point to straighten you out. All right. I'm going to read
00:19:32.720
them. And then tell me if you don't see the machinery behind this. What you need to know,
00:19:40.460
the Department of Justice found no evidence of voter fraud that could have changed the outcome of the
00:19:46.360
2020 election, according to former A.G. William Barr. That's one. Number two, election officials at the
00:19:54.120
Department of Homeland Security said the 2020 election was the most secure in American history.
00:20:01.900
Number three, voter fraud of any type is extremely rare in the U.S., according to AP and Reuters.
00:20:09.320
And Reuters. And Reuters. Reuters might come up again today. Remember that Reuters is one of the
00:20:19.920
sources for voter fraud of any type is extremely rare in the U.S. Reuters. Just hold that in your
00:20:29.280
mind for a while. That'll be relevant. It's called foreshadowing. Foreshadowing. All right.
00:20:38.020
And then the last one is 44 states already have in place some form of post-election audit,
00:20:46.060
the National Conference of State Legislatures' website notes. Now, do I even have to go through
00:20:54.240
what's wrong with all of these statements? You can see the machinery, right? Well, I'll do it
00:21:02.700
quickly, just in case you missed anything. The first one. All right. The Department of Justice
00:21:07.160
found no evidence of voter fraud. Right. Because they didn't look for it. That's what's
00:21:12.620
left out. They didn't look for it. They were the wrong vehicle for judging it. They could only
00:21:20.420
judge the things brought to them in too short of a time window to be useful. Right? That entire
00:21:28.780
context is left out. This is clearly propaganda. So you can see the propaganda machinery just so
00:21:35.520
clearly now. Number two. Election officials at the Department of Homeland Security said the election
00:21:41.600
was the most secure in American history. And they know that how? How do they know that?
00:21:49.460
Wouldn't that be a case of them knowing the unknown? Do they know that the election of 1940
00:21:55.280
was fraudulent? No, I think what they're saying is that they have the most, I would guess, my
00:22:04.980
interpretation would be, that they have the most, let's say, guardrails in place to keep us safe.
00:22:13.800
Okay? Now, that would be a reasonably good thing to know. We have the most, in history,
00:22:20.680
guardrails and procedures in place to keep it fair. Here's some context I'd like to know.
00:22:28.060
Is that enough? Doesn't it sort of matter? Sort of binary, isn't it? I don't care if it's the best
00:22:34.940
it's ever been. Is it enough? The most basic question is left out. Is it good enough? Are you
00:22:46.780
saying we doubled it from 10% good enough to 20% good enough? The entire context is missing?
00:22:53.980
Obviously, propaganda. Voter fraud of any type is extremely rare in the U.S., according to AP and
00:23:00.660
Reuters. Reuters. Hold that thought. Reuters. We'll do the next one. 44 states already have in place
00:23:09.600
some form of post-election audit. Is it enough? Yeah. Okay. They have some form of post-election
00:23:21.320
audit. What form? Does it include any of the digital part? Does it include somebody looking
00:23:31.100
at the code? I don't think so. Some form. Now, isn't it obvious that somebody who would write a
00:23:39.680
sentence like this is not meaning to inform? It is quite, quite clear with the four of these
00:23:46.960
that they are designed for propaganda, for manipulation. And let me ask you, was it not obvious
00:23:57.000
to every one of you when you read it? Or when I read it to you? I mean, I primed you for it, but
00:24:01.620
you saw it right away, right? At least my audience does, I think. Now, let me be clear. I am also not aware
00:24:10.960
of any fraud in the 2020 election. I have to say that, because first of all, it's true. I personally
00:24:18.040
am aware of no fraud whatsoever. I'm not even aware of any small fraud, because if there were any
00:24:24.460
stories like that, I wouldn't have paid attention anyway. Somebody says, yes, you are. No, I'm not.
00:24:30.440
No, I'm aware of small irregularities, but I don't, like, remember the details, because they weren't
00:24:36.480
important, if they were small. But I'm aware that people have reported them, so maybe that's what
00:24:43.160
you were looking for. All right. What is true? Let's get into what is true. And I'll take a little
00:24:52.080
example. Do you remember the famous incident? And of course, you know that all the news has to go
00:24:58.880
through the Joe Rogan filter now, so it doesn't matter what you're talking about. It's got to have
00:25:05.340
a Joe Rogan reference to it, and we're going to have plenty. All right. Do you remember one of the
00:25:12.620
big blow-ups was when Joe Rogan had the Australian journalist on, and they disagreed about whether
00:25:20.280
the vaccination or the virus itself would cause more myocarditis in a certain age group? And that
00:25:28.680
it looked like maybe the journalist said something wrong, but then Joe Rogan disagreed. But then on the
00:25:36.080
show, it looked like Joe Rogan saw a source that agreed with the journalist. But then when we looked
00:25:40.860
at it later, it looked like maybe Joe Rogan was right after all. But then I listened to another video
00:25:46.900
of a cardiologist who said when he really dug into it to find out which of them was right after all
00:25:52.020
that you can't tell. That's the bottom line. So is the last cardiologist that I listened to
00:26:03.220
the one who's right? Or is Joe Rogan right? Or was the Australian guy right? Or two of the three of them
00:26:12.160
right? I don't know. But I will tell you, if you listen to a YouTube video of a cardiologist
00:26:19.340
talking about how they decided, you know, that risk, and what data they had and the quality of
00:26:27.580
the data, you will walk away from it saying, I'm pretty sure we can't tell. But it also doesn't
00:26:36.060
matter. And the doesn't matter part is that whatever the risk is, it doesn't matter even
00:26:41.740
which one's bigger. It's so small, it's not part of the decision. So even something as basic as what
00:26:49.120
you thought about that story, I don't even know if we know that. So our understanding of what is true
00:26:58.000
and what can be known is completely different after the pandemic, isn't it? Everything you thought
00:27:04.180
about the experts, everything you thought about the quality of the data, it's not the same as before
00:27:10.560
the pandemic. Now you think that even the most basic, clear story, and this one should have been
00:27:16.540
one. This one should have been two people, you know, weren't sure of some data. But then after the
00:27:23.100
episode aired, the experts looked at it and said, well, here's what's going on. And then they all agreed
00:27:28.360
because we're all looking at the same data. But things aren't that clear. Apparently not.
00:27:36.120
All right, here's, let's talk about Joe Rogan's video response. So many of you have seen it, but you don't
00:27:45.140
need to have seen it in order for me to, you know, talk about it. So he did a little, you know, handheld sort
00:27:51.360
of a selfie video that I saw on Instagram, and I guess it's on all the social platforms by now,
00:27:57.760
in which he talked about the accusations that he's spreading misinformation about COVID stuff,
00:28:06.980
and the Spotify problem of blah, blah, you know, and what's his name? Neil Young.
00:28:15.360
I didn't do that intentionally, but that pretty much summed up the whole story right there.
00:28:22.900
Neil Young, wanting his music to be taken off because he thinks Joe Rogan's spreading misinformation.
00:28:29.920
All right, so I listened to Joe Rogan's thing, and my first take, which I tweeted, but I'm going
00:28:38.020
to revise in a moment, and what I'm going to revise in a moment is that it's the best response I've ever seen
00:28:42.340
to a public relations problem. That was my first response, the best response I've ever seen
00:28:52.380
to a public relations brouhaha. I now revise that opinion. It is the second best response I've ever seen,
00:29:04.320
and I don't think it's a coincidence, but that's just a guess. Now, it would be fun to hear him confirm
00:29:14.220
or deny this, so I have a hypothesis, and I'm going to tell you who number one was and see if you can
00:29:21.180
draw a connection. Number one was Steve Jobs. Steve Jobs. Steve Jobs, when he had his public relations
00:29:30.860
problem, it was one of the early iPhone models. If you put your hand in a certain place on the phone,
00:29:39.000
it would touch the antenna, and it would, you know, it would cut off the call. Imagine having a handheld
00:29:45.880
device that you couldn't hold in your hand that you'd paid, I don't know, $1,000 for or whatever the
00:29:53.640
price was. That's like the worst thing that could ever happen to a company. Well, we made a handheld
00:29:59.760
object. You just can't hold it in your hand. That's the only problem. Otherwise, it's really
00:30:05.360
spiffy. It doesn't make phone calls, and it's a phone, but otherwise, really good. That's a big
00:30:13.240
problem, right? Here's how Steve Jobs handled it, which became the stuff of legends. It was actually
00:30:19.840
written about in his biography, and it was sort of a big deal. Steve Jobs got in a call with all the
00:30:27.820
journalists, and he said, and I'm paraphrasing, but this is the basic idea. He said, all smartphones
00:30:33.980
have problems. We want to make our customers happy, and then he said, here's what we're going to do.
00:30:41.580
And the next day, because he had reframed it as all smartphones have problems, the press,
00:30:48.960
instead of killing Apple for having a phone that had a problem, they started doing stories about
00:30:53.880
all smartphones had problems. It completely worked. Now, I don't know if Apple helped to seed those
00:31:00.480
stories, but the net effect of it was it really worked. And here is the form that Steve Jobs used.
00:31:06.840
Number one, reframe. He reframed iPhone has a problem to all smartphones have problems.
00:31:14.040
Good technique. Number two, he showed empathy. We want to make our customers happy. A very direct
00:31:22.360
statement about his customers. It wasn't about the company. It wasn't about Steve Jobs.
00:31:28.880
He reframed it, and he said, we want you to be happy. Then he said, we're going to do this to make
00:31:34.460
you happy. And then he set out his solution. Very simple, perfect, perfect handling. All right?
00:31:45.020
So let me show you the frame again, because we're going to go, we're going to show you this frame a
00:31:49.960
second time. You reframe it, you show empathy, and then you give the solution. Reframe, empathy,
00:31:59.700
solution. Now, I don't believe, I think the story is that Steve Jobs did not come up with that himself.
00:32:05.140
I believe he came up with, and I forget the name of the, it was a PR executive, who was an expert at
00:32:13.620
that. Somebody will tell me in the comments, if you've read the biography, Jobs. Anyway, so Steve Jobs
00:32:22.000
has some help from an expert, but Steve Jobs was an expert, too, on this. Somebody will say the name of
00:32:30.640
it. Is it McKenzie? McKesson? McKesson? I don't know. It doesn't matter, but it was a professional
00:32:39.680
who was good at it. Now, let's talk about Joe Rogan's thing. Joe Rogan basically said,
00:32:52.120
did I actually not write that down? I don't think that's possible.
00:32:55.640
So here's how he started. His first reframe was, he talked about his show being a conversation
00:33:03.960
that grew big unexpectedly. I'm paraphrasing now, but he says, I'm just talking to people
00:33:11.320
about stuff that's interesting to me, and it grew into this big thing. And so that's the context.
00:33:20.140
The context is, I'm not the news, right? That's pretty important, because the context is that,
00:33:28.620
you know, allegedly misinformation. So the first frame is, this is just a conversation of something
00:33:34.600
interesting, not the news. Now, he didn't say, I'm not the news, but that's the context.
00:33:41.620
So he reframes it, and he said, basically, there's no agenda. It's just interesting stuff.
00:33:47.240
And he also talked about how the experts he's had on, some of them would have been banned,
00:33:53.420
he claims, for things that in the end ended up being right. So then he gives you further context
00:34:01.460
that says how many times he has specific examples of people who said things that the mainstream would
00:34:08.200
have said, no, that's dangerous, and then they turned out to be right. So that's good context.
00:34:13.560
All right, so like Jobs, Joe Rogan reframes the situation. Then he shows empathy. He basically
00:34:24.300
agrees with his critics. And then he tells you what he's going to do about it.
00:34:38.960
Now, the solution would be, he said that he probably does need to get an expert who disagrees
00:34:47.220
with, you know, some of the provocative people, get them on, you know, close to when the provocative
00:34:54.020
person was. And he said that he does his own scheduling, and that he needs to do that, you know,
00:35:01.120
more, I don't want to say better, but he just wants to pair the differing opinions so they're a little
00:35:07.920
closer together, which is, you know, is a form of what I'd been suggesting as well. Now, I thought
00:35:13.580
it's even better if they're there at the same time, but maybe that's hard to manage. But the next best
00:35:19.340
thing is to show the expert and then the counter experts, you know, as close as possible. So he's at
00:35:26.480
least acknowledged the nature of the complaints, and then he offered some solutions. And then he also
00:35:34.860
said he'd prepare better for some of the types of experts. Now, here's David Smith.
00:35:46.740
Scott loves sucking up to big pharma. Sheep. Rogan is accepting a misinformation tag on his show.
00:35:53.860
So, weak. All right. The people who only see things as, like, weak or sheep, you're like
00:36:03.880
binary idiots. We'll get rid of this binary idiot. Goodbye. All right. You've got to handle
00:36:11.700
a little bit of nuance to enjoy this live stream. All right. So, I would say, here's my speculation.
00:36:23.860
One of the things that Joe Rogan gets right is what I'll call the Norm Macdonald theory of comics.
00:36:35.320
Comedians, that is. Norm Macdonald explained once, I saw it on a video recently, that you
00:36:41.140
don't want to act smarter than your audience. You want to act dumber than your audience, but
00:36:46.620
maybe, you know, I think I'm adding this part, but maybe surprise them that your stuff hangs
00:36:52.240
together better than they'd think, right? Joe Rogan does an insanely good job of what a good comic
00:37:00.920
does. And remember, he's got this whole talent stack working, you know, stand-up comic, you know,
00:37:06.600
and then plus all the other skills, acting, blah, blah, blah. So, I don't know how much is, you know,
00:37:15.060
knowing what systems work and borrowing them. I don't know how much is natural. You can't read
00:37:19.800
minds. But when you see one of the things that makes Joe Rogan so popular is he doesn't ever let
00:37:28.240
himself look like he's smarter than you. Right? That is sort of genius, because it's a hard thing to do
00:37:38.560
if you think maybe you got some of your success because you were smart. You know, you'd have to
00:37:47.620
think that in his private moments, he might have some positive thoughts about his own intelligence.
00:37:53.520
You know, it got him where he is, right? Now, here's my take. If, and I'll make this conditional.
00:38:00.620
My guess is that when the thing blew up with Spotify and Joe Rogan, that he's now playing at a,
00:38:08.820
let's say, a corporate level. I hate to say it, but, you know, because Spotify is involved,
00:38:14.100
there's sort of a corporate element to this. It would surprise me if Spotify did not offer
00:38:21.700
to give him some professional crisis management PR advice, the way somebody like Steve Jobs got the
00:38:33.320
advice. So my guess is that in both cases, Steve Jobs and Joe Rogan got advice from the best advice
00:38:41.580
givers you could possibly get advice from. But that's not good enough, right? Because if most people got
00:38:51.060
the greatest advice in the world, they, A, wouldn't recognize it, right? They wouldn't recognize it as
00:38:57.420
good advice. You have to be pretty smart to even recognize it. And then secondly, they couldn't
00:39:02.880
implement it. Because it takes a lot, a lot of communication skill and, most important, reserve,
00:39:14.260
like to hold back all of your normal instincts to give the perfect three-part, you know,
00:39:21.060
response that both of them did. So here's the thing. If Joe Rogan got advice from an expert,
00:39:28.460
he did a really good job of following the advice, like really good. But if Joe Rogan came up with this
00:39:35.840
spontaneously, which it has the look of, it has the look of something where he'd been thinking about
00:39:43.100
it for a while, picked up his phone, and then gave you 10 minutes of perfection. That could have
00:39:49.020
happened. I don't know, and I would love to know. Because if he did that spontaneously,
00:39:56.420
you know, after thinking about it a lot, of course, but if that was one take, spontaneous,
00:40:03.560
and he hit the three elements that cleanly, that is one of the smartest things you've ever seen in
00:40:09.440
your life, that would be just insanely smart. And like the amount of skill that would go into that
00:40:17.380
would be hard to imagine. So I would just love to know. I don't know if you'll ever talk about it,
00:40:24.680
but I'd be real curious if he got expert advice, or if that was just spontaneous. That would be really
00:40:30.840
interesting. I saw a little news today from Reuters. This came from Reuters. And it reported
00:40:41.200
that ivermectin was effective against Omicron in a phase three trial. Wow. Wow, that's big news.
00:40:53.380
You know, everybody's saying bad things about ivermectin. But here's Reuters today,
00:40:56.520
saying that ivermectin, at least in a Japanese study, that is effective. It is effective against
00:41:03.700
Omicron in a phase three trial. Holy cow. Wow. That fake news lasted, I believe, less than one minute.
00:41:16.300
It's not true. It took me one minute to say, well, that's a pretty vague claim,
00:41:23.440
because you look at it, and there's no link to a study. Like, it just looked obviously untrue.
00:41:31.540
Because I could see the machinery. Now, I didn't really have to, like, break it down or anything.
00:41:39.160
I just looked at it. I just looked at the story, and I said, well, that's somewhat transparently not
00:41:45.440
true. Now, I'm not talking about ivermectin, right? This has nothing to do with ivermectin.
00:41:50.220
It's just about the truth of a story. And then it took Andreas Backhaus another, like,
00:41:57.840
five seconds to completely dismantle it. And he goes, this news fails two basic
00:42:05.360
sanity checks. One, there is no preprint or other documentation yet. And then two, assuming they
00:42:12.380
did the trial in Japan, Omicron became dominant there just one month ago. One month isn't a realistic
00:42:18.700
time frame for a whole trial. And I'm thinking, yeah, okay. And by the time I had read that,
00:42:29.980
the Reuters had already corrected the story and took out the phase three trial part, which was the
00:42:36.620
ridiculous part. Basically, they found out that ivermectin works in a lab, which we already knew.
00:42:44.880
In other words, there wasn't any news. There wasn't any news at all. Do you know what else works in a
00:42:52.260
test tube against diseases? Practically everything. Do you know what kills a virus? I don't know. You
00:43:01.720
could probably piss on it. I think pretty much everything kills it. In a lab? Yeah. Coca-Cola?
00:43:07.960
In a lab? So basically, this is Reuters reporting something that wasn't even close to being
00:43:14.340
credible or true. It was, but remember my original point? The pandemic has allowed us to see the
00:43:22.600
machinery. You could just see this one. You didn't even have to analyze it. You're just, oh, that's...
00:43:36.560
Pat Sajak had this tweet. He said, I've discovered that no matter how outlandishly over the top,
00:43:46.580
satirical, sarcastic, or ridiculous the tweet, approximately 20% of Twitter users who comment
00:43:52.840
will take it at face value, helps explain why there's so much anger out there. Well,
00:44:00.200
I think Pat was off by five percentage points. As I've been noting, 25% or so-ish people will be
00:44:11.640
wrong about anything, everything, to the point where I got a tweet just before I came on, or was it,
00:44:18.160
maybe I saw it on a local's comment? I'm forgetting where I saw it, that maybe it just might be as part
00:44:25.420
of the base rules of our reality. You know, the base rule of reality is around 25% of people have
00:44:32.580
to misunderstand everything. It's a different 25%, I hope. I hope it's not the same 25%. But there
00:44:39.520
always has to be the standard 25%, no matter what. All right.
00:44:47.080
One more thing, and then I'm going to solve the Ukraine problem.
00:44:51.520
We have to only start looking at unvaccinated people. And I'm sorry, we have to look at only
00:44:59.500
fully vaccinated deaths to make our decisions on the mandates. Fully vaccinated deaths. We've been
00:45:06.860
doing the wrong thing. We've been looking at unvaccinated deaths, but that's the group that
00:45:11.940
chose that option. Right? But if all the unvaccinated people are completely happy with their risk
00:45:20.040
management decision, and they are, they are, and all of the vaccinated people, they've seen the risk
00:45:28.000
drop to the point where it's now a baseline risk, not a pandemic risk. Why are we looking at the deaths
00:45:34.740
of the unvaccinated? They're getting exactly what they want. Not the dead ones, but the people who
00:45:41.680
lived got exactly what they wanted. And the people who died, they chose a path that they were fully
00:45:48.120
informed about. They didn't believe it, and that was their option. So if you looked at the total deaths,
00:45:55.680
vaccinated and unvaccinated, it looks like we're at a record. And that would be a bad argument for
00:46:00.500
ending mandates. But if you look at what people asked for and what they got, vaccinated people
00:46:06.980
asked for vaccinations, they got it. Unvaccinated people asked for a different risk profile,
00:46:14.740
they got it. As long as the hospitals can handle the load, and it looks like they can,
00:46:20.940
at this point it looks like they can, we're done. Tomorrow, ladies and gentlemen,
00:46:26.620
and people of all types. Regis McKenna was the person who advised Steve Jobs. I don't know if
00:46:36.100
he advised him on that question I was talking about. Thank you very much. It was Regis McKenna.
00:46:44.080
So February 1 is the date that the public takes over, because our government has not. And I believe
00:46:51.680
that the argument should be that everybody got what they wanted at this point. The vaccinated
00:46:56.280
got what they wanted. The unvaccinated got what they wanted. We are done. I would argue that the
00:47:02.700
worst way to protest at this point is with trucks. Because don't we need the stuff in those trucks?
00:47:11.140
Anybody? I think we need the stuff in the trucks. I don't think we should stop the supply chain
00:47:19.080
for anything. That's just my thinking. But here's what we should do. We should just take control.
00:47:27.720
Just take off your mask. If you go in a place that requires them, make sure that they ask you to put it
00:47:33.020
on. And then the first thing you should say is, no, after February 1, the public took control of the
00:47:39.560
mandates. And people will say, no, the government still has a mandate. And you'll say, yeah, I know.
00:47:45.160
That's why the public took control on February 1. Now, if they put up a fight, well, you can decide
00:47:53.060
to leave or put on your mask. That's up to you. But I'm just saying the default should be take the mask
00:48:00.060
off. Now, I probably won't, you know, just personally, I probably won't try, you know, going to Walmart or
00:48:07.600
Target or anything. But I'll just stay away from any place that I know will require a mask. And any
00:48:12.940
place I think is a soft target, I'll go in and take it off. And we should just see it. And some
00:48:20.800
people will be, you know, more rebellious than I am. They're going to try to get on planes and everything
00:48:25.480
else. But that would be a little bit dangerous at this point. I don't think I'd mess around with an
00:48:30.180
airport. But the point is, we have to make it a big enough deal that the press starts talking about
00:48:37.780
it. If the press doesn't talk about the public taking control of the issue, and make that a theme,
00:48:45.860
it just isn't going to happen. Right? So you need the press to understand this is a perfect story.
00:48:52.520
The press does not like dog bites man, because that's normal. The press likes man bites dog.
00:48:58.100
The press doesn't care if the government tells you what to do. That's normal. The press does care
00:49:03.960
when the public tells the government what to do. That's what makes it a story. So somebody needs to
00:49:11.220
talk about the public rebellion until the narrative catches on. And then it snowballs. But let's kick
00:49:19.080
this thing off. Now, let's tell you how to handle the Ukraine problem.
00:49:29.280
Have you ever wondered, what the hell do you do when you're dealing with a dictator?
00:49:33.880
Like, you could never, you never really solve a problem with a dictator, right?
00:49:37.940
It's very unlikely that democracies will fight. You know, two democratic countries rarely get in a
00:49:46.000
war, right? So if there is a war, it's going to be two dictators, or a dictator and a democracy,
00:49:51.980
etc. So wouldn't you love it if there were some way to solve the problem that there's no way to make
00:49:58.880
peace with a dictator? Because they kind of have to stay dictators to avoid getting killed. Am I right?
00:50:07.520
It's hard to be a dictator and then just retire. Because whoever takes over next will kill you
00:50:12.480
and wipe out your entire family. Now, what does a dictator want after a certain age? Let's say by
00:50:20.240
Putin's age, what does he want? Probably something like a legacy. Probably something like keeping his
00:50:27.480
genetic line safe. Do you feel that that would be a safe thing to say? At a certain age, they're less
00:50:36.800
about acquiring stuff, and more about making sure that what they have done becomes a permanent legacy,
00:50:43.480
both to protect the people and their family after they're gone, but also so their name will live on.
00:50:50.240
So how can you solve this problem of giving the dictator something that protects them in that way
00:50:57.660
after they're gone, but also allows you to negotiate in some productive way? So I think this is the
00:51:04.200
reframe that needs to happen. We should talk to our dictators about the fact that if they keep with
00:51:12.260
their current model, it's inevitable that their bloodline is going to get wiped out, meaning that
00:51:17.500
whoever takes over after them is going to look for your relatives and make sure they get out of there
00:51:22.580
because the relatives are the dangerous ones, right? So how can you keep your relatives and your legacy
00:51:27.940
from being erased and cancelled? And here's the way to do it. Guarantee everybody who sticks within their borders
00:51:38.260
as they exist today that they will be supported against all attacks forever. In other words, give them job security
00:51:50.340
instead of trying to depose them. It's the whole deposing them that creates the problem, right? If we're trying to
00:51:57.160
depose Putin all the time, wow, he's going to push back. Like, anybody's going to push back. So can we remove the
00:52:06.020
incentive for them to be hacking us and poking us back? I think we can. Because I'm not terribly
00:52:14.660
concerned if the Russian people have a dictator, are you? I mean, really? Because I feel like maybe
00:52:22.340
a lot of them prefer it. I think maybe a lot of them prefer it. And it's not our problem. So instead of
00:52:31.080
trying to turn anybody into, you know, into some kind of a democracy so that we can be friends,
00:52:37.920
why don't we do it the Trump way? Trump goes to North Korea and says, you know, there's no reason
00:52:43.500
we need to be enemies. How about if you want some economic development, we should talk? And then Kim
00:52:51.000
Jong-un is like, um, I can't think of a reason I need to be your enemy. And then for a while,
00:52:56.940
everything was heading in the right direction. And the thing that Trump did was he gave Kim Jong-un
00:53:03.420
job security. Think about it. That's what Trump did. He gave Kim Jong-un job security, the most he's
00:53:11.860
ever had. And as soon as he had job security, he got friendly, friendlier. Now Biden comes along,
00:53:19.140
and no longer does Kim have a relationship that's like a personal one with the president.
00:53:26.700
And now suddenly he's testing a lot of rockets, right? It's probably not an accident.
00:53:33.300
So could we say, here's the deal. As long as you stay within your international borders,
00:53:41.500
the entire world will make sure you don't get deposed. The entire world, China, U.S., you know,
00:53:48.080
we'll make sure that President Xi stays in power as long as he wants. Putin, long as you want. Kim
00:53:53.440
Jong-un, forever. But you have to, you have to stop poking us. In other words, you can't invade
00:54:04.880
your neighbors anymore, and you can't cyber attack us, and you can't be trying to undermine our currency
00:54:10.260
and stuff like that. You're going to just have to be a productive competitor in the world,
00:54:15.780
and then you can have everything you want. You just can't change your national borders anymore.
00:54:22.340
Now, would that work? I don't know. But what we're doing now doesn't work. Would you agree
00:54:30.340
that what we're doing now doesn't work? You know, we have to do the, we're stronger than them. Putin
00:54:37.120
only knows force. Well, Putin does only know force because it's the only option. What other option has
00:54:43.860
he been offered? Has he ever been offered the option?
00:54:49.780
You're going to have the job forever. Just don't be so much of an asshole. That's it. You can,
00:54:56.840
you can be our friend. I don't care. Just don't be an asshole to us, and you can have your job forever.
00:55:02.420
Now, I don't know if that would work, of course, right? And, you know, it'd be different for every
00:55:09.420
situation. No two situations are the same. But I can't see a reason that we are at some kind of
00:55:17.200
war footing with Russia. Can you? I feel like, and maybe this is an infantile position, I feel as though we lost the
00:55:28.420
reason. We lost the reason. Now, the reason is, of course, that they're going to be aggressive,
00:55:34.540
so we have to keep them in. And they're thinking, you know, U.S. is going to be aggressive,
00:55:39.580
so we have to keep them in. Well, what if we just weren't? There's no reason. It's just two people
00:55:46.980
who are locked in this model that doesn't make sense anymore. The last people we want to have a war
00:55:52.240
with is frickin' Russia. It's last. Let's have a war with anybody else but Russia. Anybody.
00:56:00.580
Literally last on my list. Of course, you know, they make it easy to gin up war because of the way
00:56:08.680
they act. But let's figure out a way to change their incentive. All right. That is my incredible
00:56:16.300
live stream program for the day. Probably the best thing you've ever seen in your entire world.
00:56:21.280
I'd like to show you one more thing to show you the machinery of... Oh, stupid phone.
00:56:36.800
Stupid, stupid damn phone. All right. I guess I don't do it the long way.
00:56:41.180
Here's a picture that the L.A. Times ran about the Joe Rogan and Neil Young controversy.
00:56:55.000
All right. And look at the picture they chose for Joe Rogan. And look at the picture that
00:57:04.440
they chose... God damn it. Chose for Neil Young. So Neil Young, he doesn't look like a heroin addict.
00:57:13.660
He looks like a thoughtful, possibly a brilliant man. Joe Rogan, the picture that they picked,
00:57:20.380
they picked the AOC picture with the big eyes. And here's the headline. This is from the L.A. Times.
00:57:28.120
Spotify CEO Daniel Ek responded to Neil Young and others removing their music from the platform
00:57:34.920
over COVID-19 misinformation spread on Joe Rogan's popular podcast. So they say it like it's a fact
00:57:43.520
that it was misinformation. Is it a fact? Is it a fact that it was misinformation? Or is it experts
00:57:52.980
disagreeing? So they treat it like it's a fact? Like you don't need to think anymore? You see
00:57:59.780
the machinery, right? You can see the machinery. So that, my friends, is the best show ever. And