Real Coffee with Scott Adams - January 02, 2022


Episode 1611 Scott Adams: Today I'll Give You a Lesson on Spotting and Avoiding Cognitive Dissonance


Episode Stats

Length

1 hour and 15 minutes

Words per Minute

147.51529

Word Count

11,108

Sentence Count

864

Misogynist Sentences

2

Hate Speech Sentences

6


Summary

A listener has a new idea that could change the world, and it involves a little bit of cognitive dissonance, a lot of money, and a whole lot of people who are willing to pay Scott to watch a three-hour interview.


Transcript

00:00:00.000 in the middle. But before we do that, how about a little simultaneous sip? Anybody? Anybody up
00:00:06.480 for the simultaneous sip? Yes, of course you are. And all you need is a cup or a mug or a glass,
00:00:10.980 a tank or chalice or stein, a canteen jug or flask, a vessel of any kind. Fill it with your
00:00:14.240 favorite liquid. I like coffee. And join me now for the unparalleled pleasure. It's called the
00:00:21.100 dopamine hit of the day. Everybody's saying it. And it's all about the simultaneous sip and it's
00:00:26.420 going to happen right now. John is willing to pay me to watch the Dr. Malone interview.
00:00:41.220 You're going to pay me to watch the whole three-hour interview.
00:00:46.700 Can we get any seconds on that? Can I start a GoFundMe? Oh, that would be interesting, wouldn't
00:00:53.160 it? My God, I think we've invented something here. I think you invented something. I'm just
00:01:00.760 repeating what you invented. How much would you like a GoFundMe to force public people to
00:01:09.020 consume the opposite opinion? And then talk about it. Actually put a show on. Huh. That's
00:01:20.080 not a bad idea at all, is it? In fact, do you ever have an idea that's so good it makes
00:01:26.500 you get goosebumps? I just got goosebumps. That is such a good idea. Isn't there a whole
00:01:34.200 bunch of problems going on where you're sure somebody didn't see the other side? Because
00:01:39.100 the reason you're begging me, not begging, I'm sorry, that's a jerk word. The reason you're
00:01:43.900 asking me, and even willing to pay money in some cases, to watch that three-hour interview
00:01:49.760 is that you think that combining those two things would cause something good for the world,
00:01:54.880 right? Or at least good for your entertainment, which is good for the world, too. That's really
00:02:01.060 good. Because I actually do think that if you induced me to, say, read a book, I would read
00:02:06.800 it. I would actually go for that. Now, you know, each person's going to have a different
00:02:11.820 level of inducement. Maybe I'd need that money to go to charity or something. You know, something
00:02:18.220 to make it more fitting for people in my situation. But I would be absolutely influenced by that.
00:02:27.560 If you're wondering if I personally would change my behavior if somebody did a GoFundMe just to get
00:02:34.520 me to consume some information they think I need that would then therefore be useful for the
00:02:39.700 public, I would do that. I would totally do that. Great idea. I think we invented something today.
00:02:46.960 I'm going to build on that idea a little bit as we go. But first, have you seen the videos of
00:02:53.920 China publicly humiliating their citizens who violated COVID restrictions? I swear to God,
00:03:02.100 it looks like a scene from Game of Thrones, you know, with the Walk of Shame. Many of you have
00:03:08.700 seen it. I won't go into the details. But apparently they do a big public event with lots of military
00:03:14.720 people. And then the violators of their COVID restrictions wear all white outfits. And it
00:03:21.960 looks like they make them put a sign around their neck with a big photo on it. And I don't know if
00:03:29.240 the photo is a picture of them, or a picture of somebody else that would shame them more than
00:03:35.900 their own picture? I don't know exactly. I couldn't tell from that. It's their face. Because I think
00:03:41.640 their actual face is covered with some kind of a masky situation, right? I'm guessing. So they need to be
00:03:51.880 able to see their full face without a mask. I guess that's why they do it. And how would you like that
00:03:57.680 process? And Viva Frei was saying this in a tweet. Is that better or worse than what Canada does?
00:04:06.280 Canada will fine you $1,000 to $6,000, according to Viva, if you violate COVID restrictions.
00:04:13.080 Which one would you prefer? Would you rather pay? Let's take an average, you know. Would you rather
00:04:20.160 pay a $3,000 or $4,000 fine? Or would you rather wear a sign around your neck with your picture on it and
00:04:30.260 be marched around by the Chinese leaders? I don't know. They're both pretty bad. I think I'd pay the
00:04:36.680 money. But your mileage might vary. All right. We're going to talk about Dr. Malone and Joe
00:04:43.260 Rogan. This will not be comfortable. I promise you, it will not be comfortable. Are you ready for
00:04:51.400 this? And I'm going to do it in the context of a lesson on cognitive dissonance. How to spot it
00:04:58.520 so you can see it in the wild. But then also how to maybe defend against it. Now, I would say first,
00:05:08.300 I do not have anything I would call solid evidence that anybody can defend against cognitive dissonance.
00:05:17.360 I don't think I can do it. But I think maybe there are some tools you could use and at least have a
00:05:23.640 chance. But I don't know how you would measure such a thing to know if it was working or not.
00:05:28.520 Now, the problem with cognitive dissonance is if you're the one who's experiencing it,
00:05:32.860 you're the only one who can't tell. You're the one who can't tell. That's the point.
00:05:38.780 Right. So anytime I'm experiencing it, by definition, there's a good chance that you could see it in me.
00:05:47.800 But I would argue like a wounded weasel that I'm not doing that at all. That's what it would look like.
00:05:54.580 All right. So if you want to know if I'm in cognitive dissonance, you want to look for the
00:05:58.960 trigger first. But also, and I'll explain the trigger, usually just means that I've done
00:06:05.040 something, usually in public in my case, that would be at odds with who I think I am.
00:06:11.740 So who I think I am is a person who is reasonable and rational. But if I were to be called out in public
00:06:18.580 for doing something the opposite, you know, very irrational, getting something really wrong,
00:06:24.320 presumably my self-image would now be out of whack with what I actually did. And then I would hallucinate
00:06:30.540 that I didn't actually do anything that bad. Oh, yeah, but you're interpreting that wrong.
00:06:35.240 So you could see it in me, because I would be peddling hard to, you know, convince you it wasn't happening.
00:06:43.100 But I'd be the only one who couldn't see it. It would be invisible to me. All right.
00:06:47.520 Now, here are some suggested ways to get out of cognitive dissonance, or at least maybe, maybe,
00:06:55.340 and that's the best I can give you on this topic, is maybe it will reduce your risk
00:06:59.520 of being in cognitive dissonance. So I will, and then I'm going to tie this to the Malone interview,
00:07:07.480 Dr. Malone interview on Joe Rogan. And it's all going to come together in ways that will make you
00:07:12.640 very uncomfortable. All right. So here's the best I can offer you for a defense for yourself
00:07:20.760 against cognitive dissonance. Number one, it would only take 15 minutes of your
00:07:29.040 entire life to do some Googling and learn what cognitive dissonance and confirmation bias are.
00:07:36.940 Now, I would argue that five years ago, a lot of you wouldn't know what those words meant.
00:07:41.440 Am I right? But in the last five years, almost all of us know what that means. It's a big deal.
00:07:49.700 Now, I think that some of you have known what they've meant forever, right? I understand that.
00:07:55.540 But I'd say for maybe a solid third of the public, maybe half, maybe more, they were learning about
00:08:02.880 these things for the first time, usually under the Trump scenario. So first of all, know what they
00:08:11.200 are. That's the starting point. If you don't know what they are, you're never going to defend against
00:08:15.900 them. Number two, practice being embarrassed in front of other people. It's probably the best
00:08:23.940 advice I could give you. If I were wrong in front of you about something really big,
00:08:30.700 could I survive it? Yeah, no problem. No problem. Because I've practiced for decades being criticized,
00:08:40.740 being wrong in public, having to deal with it, being embarrassed. I've given entire speeches
00:08:46.680 with, like, broccoli in my teeth. I mean, you name it. If it's embarrassing, oh, I've done it.
00:08:54.680 There's an embarrassing story that I desperately want to tell in public, and someday I will.
00:09:02.460 It's more embarrassing than anything any of you have ever been involved
00:09:07.740 in, and it happened to me personally. Someday I'll tell you that story, and I have to tell you
00:09:13.120 that I laughed all the way through it. It was the funniest day of my life. For most of you, it would
00:09:19.280 have been the most embarrassing thing that has ever happened to you. Someday I'll tell you the story.
00:09:24.340 But the only thing I'm offering is that practice in anything makes you better at it. So if you can
00:09:31.620 practice being embarrassed, in other words, put yourself in a situation where you're not good at
00:09:36.880 something, do something you're not good at in front of other people, and do it a lot. Watch what
00:09:43.180 happens. You just get used to it. You really do get used to the fact that other people's opinions
00:09:49.520 are really just a chemical reaction in the skull of a stranger. When somebody has a chemical reaction
00:09:55.780 in their skull, why should that bother me? Because that's all it is. My sense of embarrassment and
00:10:03.800 shame is a reaction to some chemical reaction in somebody's skull, and I don't even know them
00:10:09.860 half the time. So don't let embarrassment bother you. That'll give you a defense against cognitive
00:10:15.480 dissonance. Create a track record of admitting mistakes. You don't want to be the person that
00:10:21.160 they say, you never say you're wrong. Unfortunately, I am one of those people; one of the most
00:10:27.440 common criticisms I get is people saying, you won't admit you're wrong. Well, that's true when I'm
00:10:34.900 right. I don't admit I'm wrong when I'm right. Maybe I don't admit it when I'm wrong. I wouldn't
00:10:40.640 know. Let's get rid of Ron for being an asshole. Goodbye, Ron. Now, since I have created a track
00:10:51.780 record, in fact, wrote a whole book about all my mistakes, if I were to add another mistake in public,
00:10:58.140 it would be a small percentage of all the mistakes I've admitted to in public. So I'm a little bit
00:11:04.200 invulnerable, and it's a little bit of protection, not completely, against getting triggered by, oh,
00:11:10.220 I have to defend my rightness. I don't feel that as strongly as I used to. If you asked me when I was
00:11:16.960 16 years old, do you think you could admit you were wrong? Nope. Nope. There's no way that's going to
00:11:25.760 happen. But at my current age, could I admit to you that I was wrong on something really big?
00:11:31.920 Yeah, I could. Yeah, I could get past that. So I have a little bit of protection.
00:11:37.980 Next, well, let me get to this last. Think in terms of the odds. So never think in terms of
00:11:44.620 something is always true or always false. As soon as you do that, you're setting yourself up to be
00:11:49.720 wrong if the answer is any of those nuanced answers. So don't put yourself in the position
00:11:55.520 where you've said something that can only be estimated is 100% likely to happen.
00:12:01.440 If it can only be estimated, just put some odds on it. And then if you're wrong, you're like, well,
00:12:05.700 I thought it was a 60% chance it would happen. But, you know, 40% chance it doesn't happen is pretty
00:12:10.760 big. So it's not like I have to explain anything. So if I say there's an 80% chance of something
00:12:16.260 happening and then it doesn't happen, do I get cognitive dissonance? Maybe. But there's less
00:12:22.800 of a chance because I said there's a 20% chance it'll go the other way. And that's pretty big.
00:12:27.420 All right. Recognize the triggers. This is the most important one. And this will dovetail into
00:12:32.520 the conversation about Dr. Malone on Joe Rogan. One of the best ways you can trigger somebody
00:12:40.000 is by adding nuance into a conversation where people are taking only two sides.
00:12:46.260 If people imagine there are two sides and only two sides, and you know these people,
00:12:52.120 I call them the binaries. They can't really see anything but the two sides. And if you do
00:12:57.480 anything that's a nuance or anything that doesn't conform to exactly one of the sides,
00:13:03.020 the person who's a binary will say, well, you're on the other side because you didn't agree with
00:13:07.800 everything I say. So if you see a situation where somebody's sense of binary truth is violated,
00:13:16.880 they will be triggered into cognitive dissonance. And that's what happened with both Dr. Malone,
00:13:23.860 who was on the Joe Rogan show, and with a tweet that I made about Joe Rogan. Now, in the case of Dr.
00:13:30.540 Malone, Dr. Malone is a nuanced situation. Because he's being called an anti-vaxxer. That's what his
00:13:40.600 critics say. At the same time, he is vaccinated. He is vaccinated. And he's an expert who worked
00:13:50.240 to make vaccinations for other people. He is the most... Let me get rid of the absolute in that
00:13:58.580 sentence. It would be very difficult to make an argument he's actually anti-vax when he's vaccinated
00:14:05.660 and has worked to make vaccinations. What he is, is someone who has a nuance. He's someone who says
00:14:13.240 vaccination good in some situations. Perhaps we should be more reserved in other situations.
00:14:20.240 That's a nuance. It's not pro-vax or anti-vax. So that triggers people into cognitive dissonance.
00:14:27.400 In this case, it would be the people who are pro-vax. So the pro-vax people would be triggered
00:14:33.700 by the fact that Dr. Malone has a nuanced opinion. Everybody agree so far? You're all with me so far. This
00:14:42.140 is the easy stuff. I'm going to get to the more challenging stuff in a minute. Okay. I'm seeing a nope on
00:14:49.540 there, but only one. So I don't know what that's about. All right. Now, I tweeted this morning
00:14:56.020 something about Joe Rogan, which I was quite sure would trigger cognitive dissonance. Why? Because
00:15:03.280 Joe Rogan is the ultimate binary situation. And you're going to recognize this as soon as I say it.
00:15:10.740 You either think Joe Rogan is a national treasure. And I do, by the way. I think he's a national
00:15:17.900 treasure. Or you think he's a right-wing purveyor of misinformation. Seems to me that there are those
00:15:29.360 two impressions of him. But I tweeted a nuance in which you can love Joe Rogan to the maximum amount
00:15:38.900 and appreciate his talent stack, which is extraordinary. One of the best talent stacks,
00:15:45.560 meaning an assembly of talents that work together well. Show me a better example. Best example of work
00:15:53.400 ethic, talent stack, career strategy, and then execution. Just about the best that you've ever seen. So I can't say
00:16:03.600 enough good things about Joe Rogan and his show, which is why it's a gigantic phenomenon. But let me give you
00:16:10.280 some nuance. I'll give you an example of a kitten. Let's say you had a kitten. It's a tiny little kitten.
00:16:19.360 And you say, oh, this is so cute. And then the little kitten, because it's just a kitten,
00:16:25.300 like claws you a little bit. But it doesn't really hurt, because it's like little kitten claws.
00:16:30.800 And you just go, oh, kitten, take that hand off of me. And you've got like a little dot there,
00:16:36.320 and you don't even feel it. But it's so cute. It's so cute. You don't even mind that it's like it's got
00:16:41.960 nails and stuff. You don't mind that it's inconvenient. Like it's just so cute. Now imagine that your kitten
00:16:49.360 was the size of a dinosaur. It's still a kitten. But it's the size of a dinosaur. And now it can kill
00:16:56.280 you. Does your opinion change? Yes. And the only point is that the size of something completely
00:17:04.920 changes what it is. Obviously, right? There's a difference between a raindrop that can't hurt you
00:17:12.220 and being stranded in the middle of the ocean, which would probably kill you. Right? Now, let's get
00:17:20.400 to Joe Rogan. When Joe Rogan was a smaller show, if he had some misinformation on there, let's say a
00:17:28.260 guest said something that wasn't true, it doesn't matter. Not really. Because when it's a smaller
00:17:34.140 show, it's for entertainment. It's not changing the world in any way. It's just for fun. People say
00:17:39.820 things that aren't true. Something to talk about. It's nothing. It's a kitten. It's just a kitten.
00:17:46.560 Then the Joe Rogan show turns into the biggest phenomenon in the world. Suddenly, it's the kitten
00:17:54.480 that turned into a dinosaur. Is everything the same now? If there's some misinformation,
00:18:00.480 hypothetically, that came out on the Joe Rogan show, would you say, ah, it doesn't matter? It's cute.
00:18:06.440 It's like a little kitten. Take that little kitten paw out of my hand. No. Joe Rogan has now
00:18:13.060 surpassed, at least for people who lean right, I think, Joe Rogan has surpassed the credibility
00:18:20.260 of the major news organizations. True or false? True or false? Joe Rogan is not only gigantic in terms
00:18:30.220 of audience, but has surpassed by far the credibility of the major media publications. True? I'm seeing all
00:18:39.640 trues on locals. YouTube, you've got a little delay here. Would you agree with that? That's for the
00:18:46.360 right, for people who lean right, and most of my audience does. But for us, right? There's a general
00:18:55.080 agreement that he's far and away more credible. And one of the reasons he's more credible, and again,
00:19:01.660 I'm going to ask for a fact check on this. Would you say that he's more credible because he is not
00:19:07.720 intending to mislead you? And we can't read his mind, but we see no evidence, I mean none, that there's
00:19:15.900 any intention that is anything but positive. Right? 100% agreement. 100% agreement, I'm saying.
00:19:22.920 That there's no sense that money could buy him, right? Do you think money could buy Joe Rogan?
00:19:30.560 Like, do you think a pharma company could say, look, I know you're doing well, but I'll give you
00:19:36.000 half a billion dollars to say our drugs are good when you know they're not. Do you think Joe Rogan
00:19:42.600 would take it? What do you think? Half a billion dollars to lie to you. Do you think he'd take it?
00:19:50.160 No. I mean, we don't know, right? I mean, you know, that's in the realm of anything's possible.
00:19:57.720 But no, no, you would not believe that for a second. Because he's doing well, and why would
00:20:03.140 he suddenly turn from a force of unambiguously good to like the darkest evil? That wouldn't happen.
00:20:10.940 But would the news do that? Would any of your news platforms or even social media, would you trust
00:20:18.140 them to not take a billion dollars to say something that wasn't exactly true or maybe
00:20:25.040 de-emphasize something which ends up the same? Do you think the major platforms would turn down a
00:20:31.680 billion dollars to say something that wasn't exactly the best thing for the world? Don't know.
00:20:38.600 I don't know. But I wouldn't say it's the same standard. Like, I would feel really confident that
00:20:46.060 Joe Rogan would not take a giant bribe to lie. I would not be confident, even a little bit, that a
00:20:53.060 corporation wouldn't take money to shade the truth. So we have this situation where Joe Rogan is, at least
00:21:02.560 to a large segment of the world, the most credible person in the game, or among the top. Now, so far,
00:21:09.980 are you with me, everybody? So far? All right, because this is the setup for the cognitive dissonance
00:21:17.580 I'm about to present to you. If all of that's true, and you believe it, if I were to say something that
00:21:26.360 violated everything that we both agreed is true, what would happen? You would be triggered into cognitive
00:21:34.760 dissonance. So this is the setup, and I'll see if I can do it right now, because I'm going to read to you a bunch of
00:21:42.820 tells for cognitive dissonance in a little bit. But you're going to see them go by on the screen, too.
00:21:48.140 All right? Here it comes. I'm going to drop the trigger on you. Some of you saw this already in
00:21:52.460 Twitter. We all agree, or it seems like it, based on the comments, it seems like it, that Joe Rogan is
00:21:59.060 probably the most useful source of information we actually have in the United States. Right? We all
00:22:07.720 agree with that. Now, here's the trigger. Joe Rogan's program is ripping the country apart. It's one of
00:22:15.860 the most destructive things I've ever seen in my life. And now I'm going to back it up. It's one of
00:22:24.660 the most destructive things I've ever seen, and it's completely accidental. It has nothing to do with
00:22:31.240 Joe Rogan doing anything wrong. Okay. What happened was the show got big, but it stayed the same.
00:22:39.640 And there's the potential now that if, and can you deal with me that this is hypothetical,
00:22:46.360 right? So what comes next is not a criticism. It's a hypothetical. What if Joe Rogan, with all of
00:22:55.260 his credibility? Oh, here's the first one. I can't tell if this is a joke or not. But
00:23:00.820 market trader is coming in with the first tell. He says, cognitive dissonance again, Scott has
00:23:05.740 jumped the shark. Oh, actually, that's a different comment. But you'll see the cognitive dissonance
00:23:10.160 in a moment. So here's my argument for why Joe Rogan is ripping the country apart. Now,
00:23:18.880 remember, I'm now between the binaries. There's a binary that he's a force for good, and it's
00:23:24.420 just great. And there's a binary that says he's spreading misinformation. I just went right in
00:23:30.140 the middle. I said he's one of the best things we have in the country, complete respect. But
00:23:37.920 he's also ripping the country apart, accidentally. And here's the problem. When you get somebody with
00:23:44.900 that level of credibility, and they put one expert on, what if the expert says something that's wrong?
00:23:51.280 What's the outcome of that? Well, if you put an expert on a low-credibility show, and they said
00:23:58.180 something wrong, maybe no big deal. But what happens if you put somebody into a three-hour
00:24:03.160 interview, and hypothetically, we're not yet talking about any individual person, but hypothetically,
00:24:09.480 what if that person said something that was harmful to your health? How big a problem is that?
00:24:15.800 Or what if he said something, the expert, that would divide the country?
00:24:26.120 Is Scott having a Molyneux moment? I don't even know what that means.
00:24:31.040 So here's why Joe Rogan is destroying the country, while at the same time being one of the best
00:24:39.000 things we have in the country. And it goes like this. When you put one rogue expert, rogue meaning
00:24:47.920 doesn't agree with the mainstream of anything, or on some topic anyway, if you put one rogue on,
00:24:54.640 you are only misleading the public. That's the only thing that can come out of that. Because
00:25:00.220 you need the other side. Well, let me soften that. If the expert said only things that were true and good,
00:25:07.880 that would be the best thing in the world. Can we agree on that? If you were to put an expert on
00:25:12.660 who only said things that were true and good and useful, and you put it on the gigantic platform of
00:25:18.960 Joe Rogan, boosted by the credibility that his platform has, that would be the best thing in the
00:25:25.400 world. Right? But what if, and again, this is hypothetical, we're not yet talking about an
00:25:31.620 individual. What if he put on somebody who said something really dangerous? What's the check and
00:25:37.160 balance against that? Well, one check is you could go do your own research, right, to see if that
00:25:43.640 expert was right. Does that work? No, I always rail against that. We think we can do our own,
00:25:50.000 we think we can do our own research, but we end up just going down a trail of confirmation bias.
00:25:55.760 You know, if you do the research, you'll find out that you were right all along. And if I have the
00:25:59.900 opposite opinion, and I do my own research, what am I going to find? You did your own research and
00:26:05.300 found out you were right all along. I disagree with you, and I do my own research. Is there any
00:26:10.960 chance that I'm going to see the same thing you saw and then agree with you? No. We're both going
00:26:16.840 to do our own research and come to different conclusions. That's why doing your own research
00:26:20.920 is actually a joke for most of us. I would still say do it, right? You know, it's better than not
00:26:30.020 doing it. But you're going to mislead yourself a lot of times. So be careful about that when you do
00:26:36.320 your own research. So let me give you this proposition. How many of you would say that hearing
00:26:44.180 one side of an argument is better than hearing two sides? Nobody, right? Nobody would say that,
00:26:50.740 right? There's nobody, nobody, not even, I think this is one thing you could say as an absolute
00:26:56.020 that you could get away with. There would be no reasonable person who would say it's better to
00:27:02.160 hear one side of an argument than two. Who would ever say that? Nobody. But the Joe Rogan model,
00:27:09.380 again, didn't matter when it was a small show, but now it's really important. When he shows one
00:27:16.960 point of view and a lot of the country thinks there's a mistake there, like a big one that
00:27:24.000 could matter to people's health, that's a problem. That is a big, big problem. And that's where we are
00:27:31.460 right now. And let's talk about Dr. Malone and what he may have gotten right and what he may have
00:27:38.920 gotten wrong. One of the things I hear from his supporters and defenders is that he's an expert.
00:27:49.300 He has lots of expertise and so we should listen to him. So I did listen to the first, I don't know,
00:27:55.180 45 minutes or an hour of the Joe Rogan show. I'll watch the rest of it. You don't have to pay me.
00:28:00.540 So I do plan to finish it, but I have enough now that I can, I can make my point. So the doctor was
00:28:06.780 asked, and I love, I love this technique, by the way, instead of Joe Rogan introducing somebody,
00:28:12.620 and I use the same technique, I ask them to introduce themselves because they'll tell you
00:28:17.340 the things you need to know for the topic. So when Dr. Malone was asked for his expertise,
00:28:23.480 have you heard the list of his expertise? It's pretty damn impressive, right? I mean,
00:28:30.340 he just goes on and on about, you know, the trials he's run and the different degrees he's gotten and
00:28:36.840 the things he's worked on. And oh my God, it's really impressive. So we can all agree on that,
00:28:43.180 right? That is, that's all impressive. But what are the two things that he is
00:28:50.560 most being talked about for? I would say the two things that have come out of the interview,
00:28:56.740 and since I haven't watched the three hours, I don't know how much is buried there. But in terms
00:29:00.780 of the things that emerged out of it as the clips or the highlights, I would say two categories have
00:29:06.400 emerged. One is why did he get kicked off of Twitter? So he got booted off of Twitter. And when asked
00:29:14.420 which tweet, Joe, of course, asked the smart question, which tweet or tweets got you kicked off
00:29:20.580 Twitter? Do you know what Dr. Malone said? He said he didn't know. He didn't know. Now, I think that
00:29:29.260 could be true, because I think Twitter doesn't tell you specifically. But other people know.
00:29:36.760 Other people believe that he got kicked off the Twitter, not for something that happened within
00:29:42.720 his expertise. But when he left his expertise, and did a data analysis, because he may have been
00:29:50.440 analyzing something that was within his expertise. But data analysis, I don't believe, was part of his
00:29:58.380 stated expertise. So when he left his expertise, to say that this study is a good one or a bad one,
00:30:06.360 other people that I rely on to tell me if a study is good or bad said, no, that's a bad study.
00:30:11.300 So you have the people who are actual experts at knowing whether a study looks legitimate just by
00:30:17.820 even what they tell you about it. The people who are actually experts, and I don't have an indication
00:30:23.980 that Dr. Malone is an expert on data analysis. But if he is, let me know. I would change my opinion
00:30:29.320 if that's true. But the people like, say, Andres Backhaus, I use him as my example a lot. When he looks at
00:30:38.220 some of the things that come out of some of the rogue doctors, he may have a different opinion.
00:30:42.460 So Twitter, presumably, and again, we need to learn more about Twitter, presumably banned him because
00:30:50.020 he mentioned a study or a fact that other people say, you're reading it wrong. Okay?
00:30:58.980 So the whole getting-kicked-off-Twitter thing is about something outside of his expertise.
00:31:06.380 Who would disagree with that? Is the good doctor also an expert in data analysis, beyond being an expert on the topic?
00:31:15.180 So I'm agreeing he's an expert on the topic. I'm only talking about data analysis as a field. I'm saying he didn't claim expertise there.
00:31:23.040 Right? And Andres Backhaus could be wrong as well. That's correct. So we're getting to the point where
00:31:32.100 if you don't have two experts, you don't have much of anything. Then the second thing he talked about
00:31:37.620 that's getting a lot of attention, trending on Twitter, is mass formation psychosis. And he quotes
00:31:43.600 some experts on that. Now, did Dr. Malone say that he was an expert in this field,
00:31:51.300 the mass formation psychosis. I don't believe he made that claim, right? Can you fact check me?
00:31:59.520 I don't think there's any claim like that. So you have the good doctor who, within his area of
00:32:06.900 expertise, as far as I know, I don't have anything to disagree with him about. Does anybody have an
00:32:13.820 example? Because you've seen more of it. But I'm not familiar with anything he said within his
00:32:20.320 expertise that I disagree with. Has anybody seen any example of that? If you compare my opinions with his,
00:32:27.480 I don't think we disagree. Now, he hasn't made some speculation about why he got kicked off the
00:32:33.900 Twitter. And although he doesn't have a specific theory about why it happened, he speculates that
00:32:40.700 it might not be about just the accuracy of his information. Could be maybe political. What do you
00:32:49.000 think? Do you think that it's political why he's being taken off? It's a good argument, right? I
00:32:57.880 think you could argue that it looks political. Now, whether or not it's right or wrong can be a
00:33:03.600 separate argument from whether the reason he's being taken off is political. I think we'd agree on
00:33:08.480 that, right? So those who say, and I think Joe Rogan and Dr. Malone would agree, that free speech
00:33:19.620 has to trump accuracy. Would you agree? That free speech as a standard has got to be more important
00:33:28.600 than accuracy? Because we all get stuff wrong. You want a situation where free speech fixes the
00:33:37.500 inaccuracies, right? So you're free to say something's wrong. Other people are free to
00:33:42.680 correct you. Self-correcting system. So can we all agree that free speech as a standard needs to be
00:33:52.180 higher and more important than accuracy? I think I could get 100% agreement on that. Now, how many of
00:34:00.420 you would agree that there are limits to free speech? Now, I use the example on Twitter of yelling
00:34:06.740 fire in the theater. And I guess there's some history of that that makes that a bad example. But
00:34:12.020 can you say things that are clearly untrue that would get the public damaged, for example? Do you have
00:34:24.320 the freedom of speech to say you can cure your COVID by drinking poison? Do you have that freedom of
00:34:35.560 speech? Actually, I don't know. That's not even a rhetorical question. Do I have the freedom of
00:34:41.660 speech? Let's say I'm an expert, to make it worse, right? Let's say I'm an expert, and I say, you know,
00:34:48.440 you should all just go drink poison, because that'll cure your COVID and you'll still be alive,
00:34:54.980 of course, which is not true. Do I have the freedom of speech to do that?
00:34:59.280 The bleach thing never happened. It got a little complicated, right? Most of the people who were
00:35:08.240 answering with a yes or no were saying yes. Yeah, I do have that. I do have that freedom. But
00:35:14.320 should the social media companies, who are not exactly bound by the freedom of speech in the First
00:35:23.220 Amendment, because that applies to the government, not to private companies, does a private corporation
00:35:27.680 have an obligation to use their best judgment to get rid of something that's so dangerous
00:35:35.320 that lots of people would die from it? Does the social media platform have that obligation?
00:35:42.980 They certainly have the power, and they certainly have the right. You'd agree with that, right? They
00:35:48.480 have the power, because they do it. And they have the right, because they're not a government
00:35:53.340 organization. So they have both the power and the right to do it. Should they? So suppose,
00:36:01.620 let's make it a little tougher for you. So you could argue whether somebody's specific information
00:36:07.340 was good or bad for the public. But let me make it harder for you. Suppose you knew for sure
00:36:13.340 that the information would kill people, and there would be no upside. What if you knew
00:36:18.780 for sure? What would you do then? You're Twitter, and you see some information, and it comes from
00:36:26.520 a top expert. And let's say, this is a hypothetical, because nobody could know for sure. I think we
00:36:32.420 can agree, right? Nobody's that sure. But let's say, hypothetically, you did know for sure
00:36:36.720 it would kill people, and a lot of them. What do you do? What would you do?
00:36:42.440 Let me ask you. You're the CEO of Twitter, and there's some information you could block
00:36:50.300 or you could allow. If you allow it, you're sure, because this is a hypothetical, you're
00:36:56.640 sure a million people will die. What do you do? You're CEO of Twitter. Go. Kill the million
00:37:02.980 or save the million? You don't get to not answer this question. Yeah, you got really quiet on
00:37:09.120 that question, didn't you? Yes or no? Do you kill the people or do you let them go? A lot
00:37:15.320 of you, some would say save. Some would kill them. All right? So I appreciate that. There
00:37:21.460 are free speech absolutists who say they would let a million people, innocent people die to
00:37:28.280 preserve free speech. Now, you could agree with that or disagree with it, but that's at
00:37:33.120 least a credible opinion, right? I mean, there's plenty of room to disagree, but that is a fair
00:37:40.400 opinion. Because I always appreciate anybody who can say, here are the costs, here are the benefits,
00:37:48.540 and I chose, you know, I chose one side. So the people who say, yeah, I'd let a million people
00:37:53.840 die to preserve free speech. I respect that opinion. I disagree. But I respect the opinion
00:38:02.620 because you showed your work. Good for you. And you stood behind it. Good for you. I don't
00:38:10.200 believe you, by the way. I don't think a single one of you would make that choice if you're
00:38:14.780 actually CEO. But for our purposes today, it's a perfectly reasonable thing to say. I just
00:38:21.320 disagree. I think if any of us were in that position, you'd save the million people. I
00:38:28.280 think you say you wouldn't, and I totally respect why you say that. Totally respect that. But
00:38:34.220 I don't believe it. No way to know, but I don't believe it. So here's the situation. I believe
00:38:44.500 that Dr. Malone has presented information which is obviously misleading and dangerous.
00:38:53.420 That's what Twitter would say when it kicked him off. Now, you could argue that. But do
00:38:59.020 you argue that Twitter has the right and the responsibility, and that aren't you glad you
00:39:06.020 didn't have to make the decision? Now, a lot of people are saying to me, Scott, are you
00:39:11.000 defending Google and Twitter? No, that's cognitive dissonance. I'm just walking you through it.
00:39:17.540 You make your own decision what's good or bad. I'm not even sure I have an opinion what's
00:39:22.160 good or bad. But you got to at least look at it right. All right. So there have been claims
00:39:31.260 made that there are specific tweets from Dr. Malone that are dangerously misleading.
00:39:37.340 And that's why Twitter banned him. When Joe Rogan asked him what those were, he said he wasn't
00:39:45.020 sure. Do you believe him? Do you believe that Dr. Malone doesn't know which of his tweets
00:39:50.980 got him banned from Twitter? Do you believe he doesn't know? I'm saying mostly yeses on
00:39:57.980 locals. How about you, YouTube? Do you believe he literally doesn't know which ones were the
00:40:04.720 problems? Because remember, he got lots of replies. So he knows which ones had the factual
00:40:12.200 problem, right? So I'm going to call bullshit on that. I would say that when Dr. Malone says
00:40:21.980 he doesn't know which ones did it, that might be technically true, meaning he doesn't have
00:40:27.200 a confirmation. But he knows. I would know. You would know. Do you think that you wouldn't
00:40:35.340 know which one of your tweets got you kicked off? You would know. Yeah. No. I can't read
00:40:41.420 his mind, right? Whoever said you're mind reading, absolutely right. That's a good comment. Can't
00:40:46.680 read his mind. But you don't think he's aware of which factual data he got the pushback from
00:40:55.020 at exactly the time that Twitter banned him? I don't think there would be anything he would
00:41:00.060 have heard of more than which of those tweets he got banned for. He knows. Now, it is, I
00:41:08.500 think I'm going to, I'm not going to call him a liar, because when he says he doesn't know,
00:41:13.040 I think that's technically true. You know what I'm saying? If Twitter hasn't confirmed it,
00:41:18.640 that's true. He doesn't know. But he knows. He knows. Anybody in that situation would know.
00:41:26.880 You couldn't not know. How could you not know? Oh, here's one. There we go. Boom, boom.
00:41:35.060 So here's, I'm going to be calling out all the tells for cognitive dissonance. Here's one I've
00:41:39.680 been waiting for. Keep digging a hole for yourself. So that one, and accept the L. Those are cognitive
00:41:47.200 dissonance. And it has nothing to do with agreeing or disagreeing with me. It's just the discomfort
00:41:52.300 that I've shown you that it's not a binary, and people are just floating around trying to figure
00:41:59.400 out what to do now that they're confused. All right. I have no idea what information Dr. Malone
00:42:10.060 put out there that was the problem. I have seen things he's put out there that were debunked
00:42:17.300 in a way that I thought were just really, obviously debunked. But I could be wrong about
00:42:23.160 the debunk. So here's the question, and I guess the request. Here's a request to Joe Rogan.
00:42:31.380 His current model, which just evolved from the kitten to the dinosaur,
00:42:39.880 is damaging to the country. It's seriously damaging. Very damaging. The interview with
00:42:46.280 Dr. McCullough, I think, was pure damage to the country. Pure damage. And I think the Dr.
00:42:52.640 Malone interview is pure damage. Pure damage. Because it lacks the counterpoint.
00:42:58.760 Now, have I told you that it's always darkest before the dawn, and we can't tell the difference
00:43:05.900 between good news and bad news? Joe Rogan is this close. All he has to do is add a second
00:43:13.920 person. Ideally, a third and fourth person who are silent members to just fact-check things,
00:43:21.660 you know, and maybe give a note to the person as they're going. So maybe a couple of silent
00:43:26.000 fact-checkers. But Joe Rogan is one invitation, meaning the person on the other side of the
00:43:32.800 argument, one invitation away from ruling the whole fucking world.
00:43:41.060 Joe, if you're listening to this, or anybody describes it to you, you're so close. Take it.
00:43:48.780 There's this big golden ring that's dangling right in front of you. And maybe you have so
00:43:53.460 many golden rings already that you can't see this one. But the public is begging you, Joe.
00:43:59.020 We're begging you. Please, please, bring on the other side. Because nobody's listening
00:44:06.860 to anybody else. We don't trust anybody. We don't trust the news on the left. We don't
00:44:12.800 trust the news on the right, sometimes, if it doesn't agree with us. We don't trust social
00:44:17.580 media. But we do trust Joe Rogan. Am I wrong? So close. Just invite the other side. Oh, my
00:44:30.080 God. I don't know how to say this. Well, I don't think I can overstate this. Rogan is so
00:44:40.520 close to the model that saves the whole fucking world. And it's killing me that it doesn't go
00:44:51.060 there. So close. So he's not only, in my opinion, a national treasure, but he is right on the edge
00:45:01.420 of running the whole country. Indirectly. Because if you imagined what public opinion would do if you
00:45:11.080 saw both sides on his show at the size of his platform, and you knew that Joe himself is not
00:45:19.200 trying to screw you. When was the last time you saw a moderator for a debate that you didn't worry
00:45:26.640 was trying to fix the debate by the questions they asked? Right? You always think that. But you
00:45:33.480 wouldn't think that with him. Because we have enough track record. You know, he's actually just
00:45:37.300 trying to figure out what the hell is going on. Somebody says that the other side will not show up.
00:45:47.360 Absolutely not. No. Let me ask you this. Dr. Malone is called by some, I think he calls himself
00:45:56.520 that an inventor of the mRNA technology. An inventor of the mRNA technology. Now, he acknowledges,
00:46:05.980 and his critics have pointed out, that there are a lot of people who had, let's say, discoveries or
00:46:11.700 contributions to the platform. And he apparently is one of the important ones. Have you heard from the
00:46:18.820 other ones? I'll just let that sit there for a while. So you've heard a lot from Dr. Malone. He's
00:46:28.200 trending. He's all over the news. He's on Joe Rogan's show. We've heard a lot from one of the mRNA inventors.
00:46:38.380 What do the other ones say? Do they agree with him? Are you worried that you haven't heard it?
00:46:43.960 Is there even one other person who agrees with him who is also as qualified and as that close
00:46:50.880 to the mRNA technology? Did you once ask yourself, where have I heard the other experts who work
00:46:59.600 directly with him and have either the same opinion or a different one? Yeah, it's a good question.
00:47:06.560 And you should be embarrassed if this is the first time you thought of it, that you heard it
00:47:12.800 here. That should embarrass you. You should say to yourself, there's one group of people that we
00:47:19.900 should be hearing from. All of the other people with this same expertise. None. I've heard of none.
00:47:27.780 Have you? I've heard of exactly zero people. Now, some of you suspect, and I'm not going to discount
00:47:34.680 this. Some of you suspect that the other side won't go on because they'd be embarrassed of their
00:47:40.080 point of view or they can't defend it. Maybe. But wouldn't you like to hear Joe Rogan say,
00:47:45.740 I put out, you know, 15 invitations to this group of people and they refuse to come on and talk about
00:47:51.740 it? First of all, whoever refused an appearance on Joe Rogan's show? Do you think anybody ever has?
00:48:00.580 I mean, I'm sure somebody has for some reason. But, well, I've been on his show, so I didn't refuse it.
00:48:11.580 Dr. Malone has FU money. He does have some money.
00:48:18.100 So, here's my bottom line. If you think that Dr. Malone is a big old expert and everything he says
00:48:25.100 should be taken seriously, I agree with you, as long as he's in his area of expertise.
00:48:31.220 When he goes into my area of expertise, it looks silly. Because I would say that his views on mass
00:48:37.320 formation psychosis are actually simplistic. Simplistic to the point of being misleading.
00:48:43.940 Here's my take on it. We did not enter a mass formation psychosis because of the pandemic.
00:48:49.700 We're always in one. And somebody said, Scott, if we're always in one of these mass formation
00:48:57.500 psychosis, where was the last one? Ha, ha, ha. If we're always in one, how come we're only having
00:49:04.160 this experience of everybody like hallucinating and being weird when we're in a pandemic? Give me
00:49:09.280 one other explanation or one other recent scenario, Scott, since you say this is all over all the time.
00:49:15.020 Give me any other example of a mass formation psychosis. Go. Go, Scott. See if you can do that.
00:49:22.940 To which I say, have you heard of Trump derangement syndrome? We all live through it, right?
00:49:31.820 That's a mass formation psychosis. Did we need a virus to get there? Nope. We were already there.
00:49:41.560 You want another one? Climate change. I'm seeing somebody suggest that 9/11 was a mass formation
00:49:51.060 psychosis. I accept that. I will accept that. It's a little more complicated, but I think there's
00:49:58.260 enough of that. Yes. So, have I in, let's say, 60 seconds, with my understanding of the topic,
00:50:09.980 psychology, hypnosis, et cetera, have I debunked the doctor's point of view by showing you that it adds
00:50:19.460 nothing to the conversation? We are always in this state. Now, we weren't always in as much of
00:50:27.300 a dangerous state of this, because communication is now so much better. Everything's faster and bigger
00:50:33.720 and ramped up. And, of course, the business model of the news and social media gets us all clicking on
00:50:41.220 whatever makes our blood boil. So, this was always a possibility, but in our modern times, because of
00:50:49.600 the social media and the news model, et cetera, this is now permanent. I would argue that when Germany
00:50:56.980 was the example, it was an exception. How many of you will accept my interpretation that Germany was
00:51:05.240 an exception because during those times, they didn't have good communication, et cetera, it would
00:51:11.380 be a little bit harder for one of these to form. But today, it forms instantly on every topic
00:51:18.160 because we go binary on everything. So, as soon as you go binary, you've lost all reason.
00:51:23.960 So, this is a permanent situation. And Dr. Malone is famous for this and clearly over-interpreting
00:51:34.780 it. I won't say wrong because I agree that this is a thing. It just doesn't, it isn't relevant to
00:51:40.340 the current topic because it's just always here. And data analysis, I would look to people who are
00:51:46.180 better at data analysis than I am, but also better than he is. And they say, you got some real
00:51:53.080 trouble with some of the things that you've forwarded. Now, what happens when you tell people
00:52:00.480 who think that Joe Rogan is the answer to everything, that he's destroying civilization
00:52:06.300 with these one expert interviews? Do you think that people will say, well, that's a good point.
00:52:12.880 You got me there. That is a good point. Probably not.
00:52:17.500 So, let me read some of the comments and I'll use these. I'm watching my follower count
00:52:25.460 on Twitter. It's dropping by hundreds every minute. I think we're down like 1,500 since
00:52:34.260 I woke up this morning. So, this is what happens when you talk like I do. If you don't stay on one of
00:52:42.960 the binaries, the people who live on a binary are like, I'm out of here. You are not on my binary
00:52:48.060 with me. All right. So, let me tell you the tweet and then I'll tell you some cognitive dissonance
00:52:56.020 here. All right. I said this. I said, stop watching long interviews that involve one non-expert
00:53:07.160 talking to one expert. That's a guarantee you will be misinformed. And then just to ramp up the
00:53:14.460 cognitive dissonance, I said, ask yourself if Twitter or Google would ban content in which
00:53:20.360 opposing sides are argued by the experts. Do you think they would? If Joe Rogan had, or let's say,
00:53:29.220 if Dr. Malone's tweet had been a tweet about two experts who disagreed, and one of them was Dr.
00:53:36.900 Malone, and one of them was the other expert, do you think Twitter would have banned that?
00:53:42.380 Do you think Google would have banned that? You're saying yes, but you're lying because you know they
00:53:48.440 wouldn't. Come on. You know they wouldn't. No. You need to give me an example of that. If you
00:53:56.480 can show me an example of two experts who both get their point of view out, and you show me that
00:54:02.600 that got banned, then I'll believe you. I'm going to stake a claim here. My claim is if both sides are
00:54:11.940 shown fairly, that it would not be banned because that would just be useful. Right? All right. So,
00:54:21.660 anytime one person gets banned, that's not my argument. My argument is anytime that both
00:54:27.860 arguments are shown, you won't get banned. So if you're worried about freedom of speech,
00:54:32.760 are you as worried knowing that if both sides are shown, they would keep it? I know some of you
00:54:38.720 doubt that, but don't doubt that. Right? Because I can't believe that they could defend that in any way.
00:54:45.660 You could certainly defend that somebody had misinformation. Am I right? Twitter could be
00:54:54.080 right or wrong about it, but they can defend the notion that misinformation, if it looks damaging,
00:55:00.440 should be removed. Right? You could argue against it for free speech, but they could defend it. They
00:55:05.520 would have an argument whether you liked it or not. So, all right. Having said this provocative thing
00:55:11.920 that you should stop watching long interviews, anybody who thought they learned something
00:55:17.460 valuable from the Joe Rogan interview would, if they are a binary, say, no, no, no, Scott.
00:55:23.180 So let me show you some of the comments I got to that.
00:55:29.760 Here's one. Benjamin says, this assumes that Twitter and Google are playing in good faith
00:55:36.100 and not simply banning content they don't like. Nope. Nope. I'm not making any assumptions.
00:55:44.760 So this is a hallucination of an assumption. If somebody hallucinates your point of view,
00:55:50.320 it's probably cognitive dissonance.
00:55:54.460 Scott, saying we're always in, does not discredit. Yes. Okay. Here's another example of cognitive dissonance.
00:56:01.180 So Marv says, in all caps, Scott, saying we're always in a, one of these, mass formation psychosis,
00:56:10.900 does absolutely nothing to discredit what Malone is saying.
00:56:16.900 It discredited this. What do you mean it doesn't discredit it? I just discredited it.
00:56:25.580 Were you watching? What do you mean it doesn't discredit it? I just told you it's nonsense.
00:56:33.580 And I told you why. And you agreed with me. Stop yelling caps at me and agreeing.
00:56:40.640 So the other thing that happens is people would get really mad at me while agreeing.
00:56:45.580 So here's somebody who says, bowtied reptilian says, so stop watching your live periscopes.
00:56:52.580 What did I tell you about these? So that's a tell for cognitive dissonance.
00:56:58.480 Did I say anything that would suggest somebody should watch my live podcast?
00:57:04.960 Am I an example of a host talking to one expert for three hours?
00:57:09.480 No. This is cognitive dissonance.
00:57:12.300 All right. Here's another one.
00:57:16.340 Somebody says, isn't this what happens every time Fauci says anything publicly?
00:57:21.540 In other words, the argument is, why would I complain about Joe Rogan talking to one expert
00:57:27.820 when isn't that exactly the same thing that Fauci says when nobody's challenging him?
00:57:33.780 He's just talking in public.
00:57:35.640 What do you think of that criticism?
00:57:38.980 Is that criticism valid?
00:57:40.700 Good point, I see somebody saying.
00:57:43.100 Who else thinks this is a valid criticism?
00:57:47.860 Okay, I'm tricking you here.
00:57:49.060 You're falling into the trap.
00:57:51.540 I'm seeing people say yes.
00:57:53.520 It's a valid criticism.
00:57:55.200 No, it's not.
00:57:56.860 This is agreeing with me.
00:57:59.200 I said, don't listen to one expert without the counterpoint.
00:58:05.180 The criticism is, well, isn't that what happens every time Fauci says anything publicly?
00:58:11.180 Yes, this agrees with me.
00:58:13.700 But people are agreeing with me as if they're disagreeing with me.
00:58:18.940 And there's a whole bunch of examples of this.
00:58:20.400 For $99, Sean asked this question.
00:58:28.920 I've asked the same question.
00:58:30.080 Maybe the others stand to gain financially, so crickets.
00:58:34.440 Sanjay Gupta interviews are a disaster.
00:58:38.600 Yes, I think we're all aware that people may have financial interests in the case of the pharma stuff.
00:58:46.780 All right, let's see some other cognitive dissonance.
00:58:53.840 And there's just weird things I don't even understand.
00:59:00.240 Okay.
00:59:01.720 Let's see.
00:59:03.620 Somebody just says it's a lousy strategy and my reading comprehension is just fine.
00:59:07.880 So there's a bunch of people who just insult me personally with no comment on whether the ideas could have been right.
00:59:16.640 Scott comes down against Bill Gates' interviews, finally.
00:59:24.160 Again, this is somebody who's arguing as if I'm not agreeing with him.
00:59:28.520 Do I think that you should agree with everything that Bill Gates says without having an expert next to him?
00:59:36.000 I've never said that.
00:59:37.880 You should also doubt what Bill Gates says.
00:59:40.580 Yes.
00:59:41.260 That is exactly agreeing with me.
00:59:43.800 As if you don't agree.
00:59:49.340 Let's see.
00:59:55.500 The problem is that this would only be useful to an expert audience.
00:59:59.340 You think two experts talking would only be useful to an expert audience if Joe Rogan were the host?
01:00:09.040 You don't think Joe Rogan would force them to speak in common language so everybody understood it?
01:00:14.960 Of course he would.
01:00:15.720 That's what he does.
01:00:16.880 That's why he's Joe Rogan.
01:00:19.660 That's why he's Joe Rogan and you're not.
01:00:22.420 Because that's what he does.
01:00:23.660 He makes people talk in a way you can understand them.
01:00:29.340 Joseph is just here for the ratio, so he's pretty sure that what I've said is dumb.
01:00:36.180 Here, Vaping Rock says, except no expert, quote, expert, is willing to debate Dr. Malone.
01:00:42.460 Really?
01:00:42.940 You don't think I could find a data analysis expert who would say that that tweet that probably got you kicked off of Twitter is wrong and here are three reasons why?
01:00:53.880 You don't think we could find a data analysis expert?
01:00:57.760 I think you could.
01:00:59.540 I think you could.
01:01:00.560 Somebody says it's a risk of being misinformed, but far from a guarantee.
01:01:08.900 It's closer to a guarantee.
01:01:10.860 Now, I'll acknowledge that this is an opinion, so I can't debunk an opinion.
01:01:15.660 But I think the risk in these three-hour interviews, since we've watched quite a bit of them,
01:01:21.500 I think the risk is it's closer to a guarantee of being misinformed.
01:01:24.880 And part of the reason is that what makes a guest interesting is they're a rogue.
01:01:30.840 They're saying something that the others don't say.
01:01:33.400 So I think if you're selecting from the field of people who say non-mainstream things
01:01:38.360 and you let them talk alone for three hours,
01:01:41.980 I think that is always going to be closer to misinformation than useful.
01:01:46.620 Not 100%, but maybe 80% of the time it would be a bad idea.
01:01:53.640 And 20% of the time it would be just tremendous.
01:01:57.420 Something like that.
01:01:59.580 Here's the other one.
01:02:00.680 Just take the L, Scott.
01:02:02.460 Now, what would I take the L for?
01:02:05.100 Is it because somebody disagrees that hearing both sides of an argument is good?
01:02:10.160 No, nobody disagrees with that.
01:02:11.720 So here's somebody who's sure I'm taking the L, yet won't disagree with anything specific.
01:02:22.020 Let's see.
01:02:24.660 Some more.
01:02:29.800 All these comments are just weird comments, all over the map.
01:02:34.580 Where can one find such content?
01:02:36.720 Well, that's the problem.
01:02:37.360 It doesn't exist.
01:02:37.960 Oh, here's somebody saying Rogan envy is a thing.
01:02:41.720 So my reason for saying you should have both sides instead of one expert
01:02:46.780 is that I have Joe Rogan envy.
01:02:50.200 Does that sound like an insightful comment?
01:02:54.300 Let me ask you this.
01:02:56.140 Who doesn't have Joe Rogan envy?
01:03:00.560 If you don't have a little bit of Joe Rogan envy,
01:03:03.780 maybe you're not familiar with his content.
01:03:06.200 Addison says, are you an expert in persuasion?
01:03:12.440 And have you ever had another persuasion expert on your show?
01:03:20.300 I don't think any persuasion experts would disagree with me.
01:03:24.380 So that would be a weird case where I understand what you're asking,
01:03:29.180 but you'd have to have at least the suggestion of disagreement.
01:03:33.620 There would have to be some reason I think somebody would disagree.
01:03:36.000 I'm not aware of anything I've said that a persuasion expert would disagree with.
01:03:41.140 But let's say there's a situation where on Twitter there's a persuasion expert
01:03:46.140 who disagrees with me on something specific.
01:03:49.720 That would be a good idea.
01:03:51.340 In fact, I'd like that.
01:03:52.460 Maybe I could learn something.
01:03:55.580 But in most cases, if a persuasion expert disagreed with me,
01:03:59.340 they could cover the entire thing with a tweet.
01:04:02.940 So if a persuasion expert disagrees, here's the whole argument.
01:04:06.540 You said X.
01:04:08.400 Here's a study that says maybe it would go the other way.
01:04:11.300 You don't really need three hours for that.
01:04:15.420 But that was a good question.
01:04:16.660 All right, how about just lots of side comments
01:04:22.700 that act like they're disagreeing when they're not?
01:04:27.080 Patrick says, what if I listen critically?
01:04:30.020 Well, that's better than not, but it's not going to get you there.
01:04:33.200 Listening critically won't tell you what you weren't told.
01:04:36.840 That's what you need the other expert for.
01:04:38.880 You don't know what's left out when an expert's talking,
01:04:42.460 unless it's obvious.
01:04:43.640 So that's why you need somebody who knows what's left out.
01:04:50.880 And somebody tells me to just go F myself, okay?
01:04:58.120 "Every interview of every expert in the history of everything?"
01:05:01.780 I'm being asked that as if it's criticizing my point.
01:05:06.220 No, that's agreeing with my point:
01:05:08.460 the way we've always done it is always wrong,
01:05:10.700 and it will never be right no matter how long you do it.
01:05:13.640 Here's another "so" tell.
01:05:19.140 This one says, "So, basically, don't listen to an interview with an expert because you'll be misinformed."
01:05:26.140 Now, did I say that? Did I say you shouldn't listen?
01:05:30.120 Well, I did, but in the sense that you shouldn't take it too seriously.
01:05:35.240 "Only two or more experts to be properly informed?" Yeah, I would say that is roughly my suggestion.
01:05:41.340 And then he ends with, "Do you see how ridiculous this sounds?"
01:05:47.680 No. No, I don't see how ridiculous this sounds.
01:05:54.280 Let me read it again. He goes, "So, basically, don't listen to an interview with an expert because you'll be misinformed. Only two or more experts to be properly informed."
01:06:01.660 Yes, that is exactly my opinion.
01:06:04.920 And then he says, "Do you see how ridiculous this sounds?" To which I say, no.
01:06:12.760 Anything else?
01:06:14.660 Oh, and then he goes on to say, "I guess all of my college lectures were misinformation courses then."
01:06:21.000 Oh, okay, I see where he's going. So here's the clarification.
01:06:27.120 If it's a mature topic, then one expert is fine, because in a mature topic you don't really have disagreement.
01:06:35.380 So, for example, if your math teacher teaches you how to add, do you need another expert to tell you that's wrong? No, because it's a mature topic and there's no disagreement.
01:06:46.680 But on a new topic, Professor Dunleavy, where it's obvious there could be or is disagreement, that's when you need the other expert.
01:06:57.240 Okay? You don't need a second expert when there's no disagreement to be had.
01:07:03.520 What about the flat earth, you say? Special cases are just special cases.
01:07:11.820 All right.
01:07:19.000 Well, if you want to read lots more examples of cognitive dissonance, you can do that.
01:07:27.220 So, have I made my point that Joe Rogan can be both a national treasure
01:07:34.720 and, because of his size and the one-expert model, super destructive at the moment,
01:07:43.600 just with that specific thing?
01:07:45.620 Thank you.
01:07:50.220 Some people still say no.
01:07:52.440 Now, I won't disagree that having one expert on could be better than none if it's a new point of view,
01:08:01.060 because at least you can see the counterpoints in other places, et cetera.
01:08:04.820 But it's a terribly risky thing to do.
01:08:08.380 Yeah, it's super destructive, because right now it is changing people's opinions about vaccinating, or how they should approach the pandemic.
01:08:17.320 And if people are making real life-and-death decisions, and they are,
01:08:22.320 you want the other side, because it's controversial.
01:08:25.640 If it were not controversial... let me say this.
01:08:29.200 If Joe Rogan had an expert on who says, "Hey, we should be paying more attention to fitness and health,"
01:08:35.680 do you think that I would say, "Oh, Joe, I'd like to hear the other side of that?
01:08:40.160 I'd like to hear the person who says you shouldn't pay attention to fitness and health"?
01:08:43.800 No. No. Because there's no debate.
01:08:47.760 It's only when there's a debate.
01:08:49.980 Now, years ago, when I used to do something called the Dilbert Newsletter, which I don't do anymore,
01:08:56.600 I came up with an acronym called BOKTOW. That's how you'd pronounce the letters.
01:09:01.720 And it stands for "but of course there are obvious exceptions."
01:09:07.580 And I've often said that the reason most people disagree with me is that they forget the BOKTOW is always implied.
01:09:15.760 BOKTOW, "but of course there are obvious exceptions," should apply to everything I say, all the time.
01:09:23.460 So when the professor said, "Oh, so you're saying that my lectures are not good,"
01:09:29.260 the actual reply to that is: but of course there are obvious exceptions, and a lecture would be one, because there's no counterpoint to it.
01:09:44.140 Scott, is it okay to listen to me for entertainment and not any knowledge? Yes, it is.
01:09:55.520 For those of you who have been saying that I should just take the L,
01:09:59.380 can you explain that in a way that doesn't sound exactly like cognitive dissonance?
01:10:04.240 Can you explain to me what exactly the L is that I would be taking?
01:10:09.780 What exactly am I losing, specifically? Like, what have I said?
01:10:23.020 So Donnie is questioning the one-expert thing.
01:10:28.280 Am I passing myself off as an expert? Except on the one topic that we talked about.
01:10:34.300 That's cognitive dissonance, my friend.
01:10:38.260 Kenny says, "You just got the vaccine and you regret it, just admit it."
01:10:43.660 I don't regret it, and that's the mind-reading tell for cognitive dissonance.
01:10:49.060 Anybody else?
01:10:56.900 You're not an expert on it? Yes, I am.
01:11:01.960 I'm an expert on persuasion, and this would be a subset, a pretty easy-to-understand subset.
01:11:11.580 In fact, mass formation psychosis would be like 101-level information for somebody who studied persuasion.
01:11:20.680 This would be, let's say if you were a physics person, like the most basic equation or something.
01:11:30.160 It's something that anybody in the field would know is applicable or not.
01:11:37.120 "You should go debate Sam Harris on trust the experts."
01:11:40.320 I'll bet we wouldn't disagree. I'll bet we wouldn't disagree.
01:11:44.620 I don't think Sam Harris would disagree with one thing I've said today.
01:11:50.560 I mean, I don't know.
01:11:52.460 But let me ask you, do you think Sam Harris would disagree with anything I said on today's show?
01:11:56.980 I don't think so.
01:12:02.980 "Please interview with Joe Rogan again." I think you'd need a reason.
01:12:10.380 You know, when I interviewed with Joe Rogan, it was because I was in the news for having a contrarian view on Trump, I think.
01:12:18.520 And that's why Dr. Malone is on there, and Dr. McCullough.
01:12:21.960 So I don't really have a contrarian view that has much depth.
01:12:25.520 I mean, I don't have three hours of contrarian view.
01:12:28.500 My entire thing could be, "Let's hear the other side," and we'd be done.
01:12:40.620 "You can't simply misrepresent Professor Desmet's work and then discuss the disfigured result."
01:12:48.060 Well, here's a challenge for you.
01:12:50.860 My claim is that mass formation psychosis is such a simple topic for someone who's familiar with persuasion that it's easy to understand,
01:13:03.200 and that every component of it I've heard makes me say, oh yeah, that's obvious.
01:13:08.740 When people are confused, for example, they look for new information.
01:13:13.300 When people don't feel connected to other people, they're a little bit more confused.
01:13:17.960 That's all just obvious stuff.
01:13:19.920 There's nothing about mass formation psychosis that is interesting.
01:13:28.080 Somebody says, "Study it, don't hide." You know I have, right?
01:13:31.960 I have read about it, and it doesn't take more than a minute for somebody who's familiar with the field to read it and say, oh, that's stuff I already knew.
01:13:42.140 Do you get that? It's the most basic thing in the field:
01:13:47.760 that people who are confused will accept certainty when it's offered in some confident form,
01:13:54.280 and that there are things which cause people to be in an uncertain state, so they're far more likely to accept a strong voice.
01:14:02.480 Is there any part of that I got wrong?
01:14:07.540 It's kind of simple.
01:14:11.980 It's where religion sprouts from, correct.
01:14:15.360 All right. That's all I got for now, and I will talk to you tomorrow.
01:14:19.560 I'm expecting lots more tells of cognitive dissonance in my Twitter feed.
01:14:24.800 Let's see how far my follower count has dropped since we started.
01:14:31.240 It's kind of funny. The people who can't hang with nuance, I'd like to get rid of them.
01:14:38.900 Yeah, we lost another 100, I think. I can't tell. I don't remember what it was. But we're dropping fast.
01:14:45.780 All right. I'll talk to you tomorrow. Thank you.