Real Coffee with Scott Adams - April 04, 2022


Episode 1703 Scott Adams: Elon Musk Buys 10% of Twitter, Everything Is Going To Get Interesting Now


Episode Stats

Length

1 hour and 5 minutes

Words per Minute

143.3

Word Count

9,315

Sentence Count

640

Hate Speech Sentences

8


Summary

Biden and the laptop, the cat on the roof, and who is actually implicated in the emails: Hunter Biden, Joe Biden's brother, and "the big guy" himself. It's another day at the office, and it's a good one.


Transcript

00:00:00.000 Good morning, everybody, and can we take a moment to just delight in the beauty of my shirt?
00:00:09.680 Now, it's just a t-shirt, I know, but sometimes when you get exactly the right color that fits
00:00:15.760 your eyes, well, you know, that just tells you it's all going to work today. And believe it or
00:00:20.760 not, not only is this a peak day for me with my good shirt and everything, but you too. You too
00:00:28.900 will be joining in in a little ceremony called the Simultaneous Sip, and it will signify and kick
00:00:34.980 off one of the peak experiences of your whole damn life. And it's called Coffee with Scott Adams,
00:00:41.600 and you begin with a cup or mug or a glass, a tank or chalice or stein, a canteen jug or a flask.
00:00:47.140 What kind of vessel? Any kind. Fill it with your favorite liquid. I like coffee. And join me now
00:00:53.320 for the unparalleled pleasure. It's the dopamine hit of the day. Turn on that dopamine faucet.
00:01:00.400 It's called the Simultaneous Sip, and it happens now, all over the world. Go.
00:01:04.220 Now, if you're watching this live in a different time zone, just wait an hour. Oh, no, it doesn't work
00:01:17.620 like that. All right. Well, there are some things happening today you might want me to mention.
00:01:27.000 Somebody thinks my cup is empty. Does anybody want to audit my coffee cup? All right. Spot audit.
00:01:38.660 You called for an audit? Here it is. Let's see. We will adjust our cameras for auditing.
00:01:50.100 Wah. Wah. Wah. It's coffee. It's the real stuff. Oh, it's a little watered down.
00:02:03.720 Ah. What you're seeing now is one of my broadcasting tricks. I have to put a towel on my desk. Otherwise,
00:02:13.920 the lavalier things will be clacking around. Yes, that's a little bit behind the scenes.
00:02:22.080 If you're listening to this on a podcast, you're saying, can you please change the subject from
00:02:27.720 your shirt and your desk towel? Because those things I cannot see when I'm listening.
00:02:36.080 Now, I've told you before, the way you should listen to this live stream is not like others.
00:02:40.180 So you're not a passive participant. You're in a conversation with somebody who won't
00:02:47.540 stop talking. But that's OK, because you have things to think about and do anyway. You're working
00:02:52.240 out. You don't need to be part of the conversation. You can just listen to me babble. Let's talk about
00:02:58.200 Biden and the laptop. Have I ever told you the story about the cat is on the roof? Yes, I have.
00:03:04.820 I'm not going to repeat it. But the idea is that you say the cat is on the roof to soften the blow
00:03:11.180 because, you know, the cat's already dead. So you say, well, it's on the roof. And then the next day you
00:03:16.660 say, well, it fell off the roof. You know, we're seeing how it is. And then you sort of slowly break
00:03:22.840 somebody into bad news. So that's called the cat is on the roof. Let's talk about Biden and the laptop
00:03:28.700 story that started out with laptop. I don't even know what you're talking about.
00:03:35.060 What laptop? I haven't even heard about such a thing. And that turned into, oh, oh, that laptop.
00:03:44.580 Well, you're talking about the laptop that's the Russian disinformation. Oh, yeah. Yeah,
00:03:50.500 there is a laptop. It turns out it's a laptop. It's a disinformation laptop. That's what it is.
00:03:58.160 OK, it's a real laptop. It's not so, maybe it's not so much Russian.
00:04:03.820 Completely real. The laptop is. I don't know about the contents, but that laptop is real. That
00:04:11.720 belonged to Hunter. OK, the emails are real. Yeah, obviously they're real because they can be,
00:04:17.660 lots of them have been validated. So yes, they're real emails. But the thing you have to understand
00:04:23.060 is that there's no kind of, you know, crime, you know, suggested there. Well, OK, there's plenty
00:04:29.460 of crime suggested there in the emails themselves, if you were to read them. Plenty. But the important
00:04:37.320 thing is, as of today, that they only involve potentially Hunter Biden and maybe Joe Biden's
00:04:46.460 brother. But there's, you know, there's nothing on there that would implicate the big guy. Oh,
00:04:53.080 wait, the big guy. The big guy is actually Joe Biden, who is actually implicated on the emails.
00:04:58.460 All right. So the laptop is real and the emails are real and they do implicate President Biden. But
00:05:12.520 I'm sure he's implicated in nothing that's going to matter. And by the way, have I mentioned he
00:05:20.020 hasn't been convicted of anything. That's right. Sure. It's a real laptop and it's real emails
00:05:28.380 and it really does implicate Hunter and it really does implicate Hunter's uncle. But it does mention
00:05:34.120 Joe Biden in the very same plots for which those others are implicated. But it's nothing to worry
00:05:41.300 about because he hasn't been indicted. No court has ruled that Biden has done anything wrong. Am I
00:05:48.300 right? Am I right? I mean, let's be honest. No court has ruled that Biden has done anything wrong.
00:05:55.720 Yet. That's how CNN handles any good news about Trump. Whenever they talk about Trump's, you know,
00:06:05.840 so-called legal problems. And they say, well, Trump hasn't been found guilty of any of those legal
00:06:12.620 problems yet. They always throw in the yet. So Joe Biden is completely clear of all legal problems related
00:06:26.760 to the laptop. And he has not been indicted yet. All right. And there's a great article in The Hill by
00:06:39.880 Jonathan Turley, who is pointing out how the narrative has changed from, you know, that's not a real laptop
00:06:46.900 to, yeah, it's real. But it's not really so much a problem. All right.
00:06:53.460 And even The Washington Post now, you know, everybody's sort of trying to get ahead of the
00:07:00.900 story a little bit. So it does look as if they're going to clip the cord and let Hunter and maybe
00:07:10.440 Joe Biden's brother, let them dangle so they can save Biden long enough to get something good going.
00:07:18.140 Do you have any idea what the hell the Democrats are planning?
00:07:24.160 I mean, seriously, what do those meetings sound like?
00:07:30.060 I mean, that's the honest question. You know, that whoever is the you know, we imagine to be the big powers
00:07:35.680 in the Democratic Party. What exactly do those meetings look like now? Do they say, I think we can ride
00:07:42.600 this Joe Biden thing out? All right. It looks bad now. But trust me, it's going to be a Putin
00:07:48.040 situation. Yeah, it looks bad at the moment, but he's going to ride it out. It'll be fine.
00:07:52.560 Does anybody say that? Is there anybody on the side of let's just ride this out? I think we can get
00:07:58.760 through four years and and maybe even get another Democrat elected? Probably not. Right. So if
00:08:04.900 they're talking about not that, what are they talking about? Is there anybody who said, you know,
00:08:11.780 I think what would really work is why don't we replace Biden with the least capable and least
00:08:19.080 popular vice president of all time? Well, I don't know if that's true, but let's just say she's not
00:08:25.080 among the top. I don't know. Maybe Dan Quayle was worse. Somebody would have to give me a fact check on
00:08:31.200 that. Speaking of Kamala Harris, Rasmussen has a poll about Kamala Harris's popularity and 56% say
00:08:42.200 they have an unfavorable view of some kind. Only 40% have a favorable view. And 54% of the voting
00:08:51.940 public in the United States believes that the vice president of the United States is unqualified to
00:08:57.040 be president. Wait, isn't that the one thing we should all feel fairly comfortable with? Now, I completely
00:09:06.680 get the idea that you wouldn't like the politics of the vice president. So you might say, oh, that's that's
00:09:14.080 not the politics I would want in office. But did we really, really, really, did we just elect a president
00:09:23.760 who is a thousand years old, packaged with a vice president that 54% don't even think she can
00:09:30.940 handle the basics of the job? Like she'd get lost commuting or something? They'd have to remind her
00:09:39.220 what her job is every day or something? I don't know. You're right, commenter. I do look good in
00:09:45.640 this color blue shirt. See? It wasn't just me. It's a fact. So seriously, what the hell are the
00:09:54.600 Democrats saying? I mean, honestly, this is just a completely legitimate question. What do they say
00:10:02.000 privately to each other? Do they just say, well, we're just screwed? We're going to have to shake
00:10:07.740 the box and live with some Republicans for four to six years? Or are they saying, we can't change
00:10:16.940 our minds in public, but we need the Republicans to save the country? Like, are there any Democrats
00:10:22.480 who don't want to publicly change their opinion of anything, but privately they're thinking,
00:10:29.840 I really hope we lose this time because the winning isn't working out.
00:10:34.200 Do you remember when Trump said that we'd get tired of winning? I'll bet you thought he was
00:10:44.600 talking about Republicans, didn't you? Do you know who's tired of winning? I'll bet the Democrats
00:10:52.440 are pretty tired of winning right now because they won that whole presidential thing. They got the
00:10:58.860 Congress. Winning. Winning. Winning. It's looking good, everybody. Do you think they're tired of all
00:11:06.960 the winning? I think they're tired of winning and they'd like to lose one for a change. Am I right?
00:11:14.400 Honest to God, I think Democrats are hoping to lose. Some of them. I mean, everybody's individual,
00:11:21.160 but don't you think there are some Democrats saying, I hope we lose this next one? Some? All right.
00:11:32.860 And then the news is also about the news, as it often is. The news is about the news.
00:11:40.620 Oh, and let me ask you this. I did a little poll on Twitter and I said, which of these two incidents
00:11:46.760 seem more indicative of a crime committed by a president? So which situation involves the most
00:11:54.560 evidence? So it's the most evidence of a U.S. president involved in a crime. Is there more
00:12:00.880 evidence about Joe Biden and the laptop situation? Or is there more evidence that President Trump did
00:12:08.500 something illegal relative to January 6th? Which one has currently, so we're not talking about
00:12:16.520 where it might go, because anything might happen. But at the moment, is there more solid
00:12:24.160 public information of crime on the laptop or the January 6th event?
00:12:33.540 Can anybody correct me if I'm wrong? So far, there's zero evidence of a Trump crime.
00:12:38.040 Am I wrong? Because, you know, you always worry that you're in your own news bubble. But if there's
00:12:45.680 anybody who's seen outside the bubble, I'm not sure if I do, I read CNN every day. And I
00:12:54.260 haven't seen anybody. I do sample MSNBC and CNN. I sample Axios all the time. I don't remember
00:13:03.400 seeing any of them saying there's a specific crime, even alleged. Am I right? Is there even
00:13:10.560 an allegation of a crime? Well, there's an allegation that something happened, but not based on any
00:13:17.080 specific fact pattern, right? I don't believe there's an allegation of a fact that would support
00:13:24.300 the idea that Trump committed a crime. Am I wrong? And I'm making a weird distinction between an
00:13:33.700 allegation of a crime, which does exist because every side, every side, you know, blames the other
00:13:39.920 of a crime, which means nothing in politics, right? So it doesn't mean anything that you say the other
00:13:44.960 side is a criminal. But if you have a specific fact that if it were true, would pretty much, you know,
00:13:53.980 say that somebody did something bad. I don't know. All right. Now, so a lot of you were pretty confident
00:14:01.920 in my poll. You know, this reflects my audience. It doesn't really reflect reality. But 94% said that
00:14:09.640 wait, I think I may have this backwards. I think most people said the laptop was the one that looked
00:14:18.920 like a serious crime. And let me ask you this. What crime does the laptop suggest?
00:14:26.840 Can you name one? What would be the crime, the actual crime, that Joe Biden would be guilty of
00:14:36.480 if all of it were true? Let's say the worst allegations based, you know, based on your
00:14:42.280 really speculation at this point. I don't think we have enough from the emails to
00:14:47.780 to know 100% of anything. But yeah, it looks like a 95% likelihood.
00:14:57.560 Yeah, I mean, I'm seeing lots of suggested possible crimes, but I'm not sure that there really
00:15:02.820 is one. Because I think it's completely legal, isn't it? Isn't influence peddling completely
00:15:10.240 legal? Now, there's the FARA thing, right? But nobody ever gets, nobody ever gets
00:15:17.720 convicted of that. I don't know. It seems to me that you've got this like slight lobbying
00:15:26.680 problem, you know, the foreign lobbying thing. But almost nobody goes to jail for that, because
00:15:32.000 it's so common. Is that the worst? Is that it? That's the worst? Yeah. I don't know. So
00:15:40.420 you have to be careful of the fact that you think your team didn't commit any crimes, but
00:15:45.100 the other team did. The most likely outcome is that nobody did. Because everybody accuses
00:15:51.520 everybody of a crime in the political context. Have you heard the phone calls yet? Phone calls
00:15:59.840 of who to what? So probably no, because I don't know what that question is. All right.
00:16:11.280 What do you think about slippery slopes? I'm going to see if I can add something to this. I
00:16:16.560 always talk about it, and you're sick of it. But do you think that you can identify a slippery
00:16:22.220 slope and then use that idea to predict the future? It's like, oh, this just started. It's
00:16:30.800 a slippery slope. So it's just going to slide in that direction. How many think that slippery slopes
00:16:36.960 predict? All right. Now, as far as I know, there's no science to back your opinion or anybody else's
00:16:47.220 about whether they predict. But let's walk through this a little bit. Now, would you say that a
00:16:52.760 straight line prediction is a slippery slope? In other words, let's say climate change says that
00:17:00.420 we're going to keep going in the direction of more and more CO2 and the climate will warm. How often
00:17:07.920 is a prediction that says things will keep going the way they're going? How often is that right?
00:17:14.320 Well, it depends how what your time frame is, right? If your time frame is really short,
00:17:21.160 then it's usually right. Things usually go tomorrow, literally tomorrow, like 24 hours from
00:17:27.620 today. Things usually are the same as they were yesterday, ish. But if you go out 100 years,
00:17:34.520 it's almost never the same. So it has to do with your timeline. So when you say there's a slippery
00:17:40.400 slope, what are you leaving out? For how long? Right? If it's a slippery slope between today
00:17:48.980 and tomorrow, I'm going to say, yeah, you know, probably if somebody did something today and it
00:17:55.420 worked out, they might do a little more tomorrow. That is correct. But if you said, is this a slippery
00:18:01.360 slope for 10 years? I'd say, well, first of all, it depends on the topic, I suppose. You know,
00:18:07.920 is this something that changes quickly or doesn't? But probably 10 years, you can't predict anything.
00:18:15.380 Do you know who says that? Everybody. Nobody disagrees with that. There's nobody in the world who
00:18:23.320 predicts anything about anything, who believes that anybody can predict 10 years in the future.
00:18:28.040 That's not a thing. So if there's no predicting of 10 years, the slippery slope probably has at least
00:18:38.280 a, you know, a limit in terms of time, would you not say? So my view is even more harsh on the slippery
00:18:47.800 slope. I believe that people start with a conclusion and they reason backwards. And the reason I think
00:18:54.500 that is because that's the only way we think. It's not special to this topic. And unfortunately, it's
00:19:02.920 something that hypnotists learn on day one, that people think backwards. We decide and then we reason
00:19:09.680 backwards to why we decided. We don't think of reasons and then decide. That's purely an illusion.
00:19:16.040 There are probably some exceptions, like in the scientific process, for example. But the way you
00:19:23.340 normally just navigate your life is you make decisions and then you come up with some bullshit
00:19:28.420 idea of why you did it. Yeah, it's totally against my philosophy for my entire life. But the reason I
00:19:35.480 did it this time is because I had a bad cold and I wasn't thinking straight. Yeah, we just have dumb
00:19:41.300 reasons for everything we do. So here's what I think the slippery slope is. It's a backwards
00:19:47.160 rationalization. We just say to ourselves, uh-oh, I'm afraid that X will happen. And now I need a
00:19:54.640 reason to support why I'm afraid of something irrationally. Uh-oh, there's a slippery slope.
00:20:00.740 There we go. I've reasoned backwards that the thing I'm irrationally afraid of has some kind of logical
00:20:06.800 path to it. And I'll call it a slippery slope.
00:20:13.240 Isn't slippery slope the thing, thinking past the sale? Let me think about that.
00:20:18.660 Yeah, I guess it is. Good point. If you say something's a slippery slope, you're making people
00:20:25.880 argue about how slippery it is and not whether it's going to happen at all. Yeah, in a sense,
00:20:32.160 I guess it is. Not as direct as most things. Um, I'm being asked, will Russians continue to love
00:20:44.200 Putin? Let's change the topic. And I say, um, the Putin situation is a perfect example of the
00:20:54.920 difference between wanting something and deciding. And I talk about that distinction in terms of,
00:21:02.480 you know, personal growth and success. Wanting success doesn't give you much of anything.
00:21:10.040 But deciding to get it probably gives you everything. And until you learn that distinction,
00:21:17.560 it's hard to get off the couch, right? Sitting around wanting things buys you just about nothing.
00:21:23.540 But deciding you're going to do it means that you're going to eat whatever garbage you have to. You're
00:21:30.700 going to take as little sleep, endure as much pain. You've just decided, right? For example,
00:21:38.980 I didn't want to go to college. I didn't want to go. I freaking decided. And, you know, sure,
00:21:46.660 there were obstacles and reasons to drop out at different points. And, but I decided. So once you
00:21:54.900 decide, it just simplifies everything. This is going to hurt. This is really going to hurt if you keep
00:22:00.160 doing this. Yup. Yes, it is. Yeah. If you do this, there's a risk you'll die. Yes, there is.
00:22:07.940 All right. A decision just simplifies everything. And the question about Russia comes down to has Putin
00:22:15.060 decided or does he just want whatever, you know, whatever it is he wants that we're not exactly sure
00:22:22.800 about? Does he want it or has he decided? Because if he's decided, well, then he'll get it because he'll
00:22:31.140 get it at any cost. And he certainly seems to be in the realm of somebody who could absorb large costs
00:22:39.840 and make it work for him at least. And maybe even make it work for Russia. I hate to say it, but he
00:22:47.060 may be sacrificing a lot in the short run for what could turn out to be like the, you know, like some
00:22:54.520 kind of Thomas Jefferson land deal, you know, Louisiana purchase, you know, and when we look at
00:23:00.960 it a hundred years from now, you might be looking at this big prosperous Russia that Putin built and
00:23:08.960 you forget about how brutal he was and you say, well, he sure was successful. Sort of like you do with
00:23:14.300 Genghis Khan. Like when we talk about, you know, is it "Gen-ghis" or "Jen-ghis"? Genghis Khan. We sort of
00:23:22.620 talk about how awesome he was, right? How much territory he conquered and how many children he
00:23:28.040 sired through his many concubines and whatever. We don't really talk about how amazingly awful he
00:23:34.500 was, like beyond probably anything we can imagine. And Putin could have the same thing. So if Putin
00:23:41.220 has decided, I'm pretty sure he'll survive and he'll end up with more territory. And somebody
00:23:52.440 says, is it "Gen-ghis" or "Jen-ghis"? I don't know, maybe. Yeah. So if I had to guess, and let's do the,
00:24:04.820 the armchair thing where we, we try to imagine that we know what's in Putin's head, you know,
00:24:10.180 good luck with that. If I had to guess, I think he's decided. What do you think? Has, has Putin decided
00:24:20.200 to take Ukraine or does he just want it, which would suggest he'll take something or, or less
00:24:27.260 than what he wants? Looking at your, I think it's going to be mostly yeses. All right. Looping back
00:24:37.020 to the slippery slope. I mean, the slippery slope isn't everything, right? If Putin takes Ukraine,
00:24:42.660 it's a slippery slope until he takes the Balkans. Am I right? I mean, that's the argument is that if
00:24:51.500 he can get away with it, why wouldn't he keep going? So here's my counter to the slippery slope,
00:24:58.180 besides the fact that it's backwards thinking. There's a better frame. If, even if you imagine
00:25:04.980 that the slippery slope is telling you something and it is predictive, and let's say for the moment
00:25:10.760 that I'll acknowledge that. I think in some cases it's kind of predictive, just not all
00:25:17.220 of them. And so, so let me just say, yeah, there's sort of a, there could be a domino effect
00:25:24.100 or a follow the leader or, you know, it gets into people's minds, et cetera. But here's the
00:25:30.560 better frame. Follow the money. That's it. If you have two frames that both predict, but one
00:25:39.240 does a 60% of the time good prediction, which would be very useful. If you had, if you had
00:25:45.400 a way to predict the future that was right 60% of the time, I think you'd become a billionaire,
00:25:51.460 right? I mean, if you could do that, I don't know that anybody could do that. Has anybody
00:25:56.320 ever predicted the future 60% of the time accurately? I don't know. That'd be a lot. But if you have
00:26:04.020 a different frame that could do way better than that, which is follow the money, I assert
00:26:10.440 that follow the money works 80% of the time. And that the 20% you don't know about usually
00:26:17.540 is because there's a time lag. So, so follow the, follow the money is still going to work.
00:26:23.560 It just hasn't worked yet, right? Like there might be an intermediate step that looks like
00:26:28.540 it's not working, but it's going to work. If you just wait, follow the money works basically
00:26:34.640 80% of the time, maybe all the time. So that's my argument against the slippery slope. It's
00:26:39.980 just not as good as other arguments, even if it kind of works or other frames. All right.
00:26:50.580 What do you think about the question of whether Russia is winning or losing or Ukraine is winning
00:26:56.820 or losing? Based on the propaganda that you've seen, is Russia winning or, tell me who's winning
00:27:05.420 Russia or Ukraine? Who's winning? Now this is just based on what you see in the news because
00:27:13.160 you're not there. All right. So there, there are two frames while you, while you're watching
00:27:19.900 your answers there. One frame is that the plucky Ukrainians have driven the Russians away from
00:27:26.600 Kiev, which certainly had to be their main objective. And, uh, since the main objective
00:27:32.920 of taking over the capital and getting Zelensky seems to be, uh, pushed away, that would suggest
00:27:41.320 that Russia is, is losing and maybe they're trying to just hold on to the territories that
00:27:46.420 they, they already have. So that's one, one narrative. The other narrative is that, uh, military
00:27:54.620 always tries to quote, shape the battlefield. This is sort of the Scott Ritter, um, narrative.
00:28:01.220 And, uh, and an army that shapes the battlefield would do things such as try to distract the
00:28:10.220 other army, try to make them think that your, your motive is over here, but it's really over
00:28:15.320 there. Trying to bog down their military, trying to basically change the landscape by inserting
00:28:23.080 variables that just make things difficult for the other team. Um, and yeah, I'm seeing in
00:28:29.460 the comments, the shaping is military doctrine. It's pretty, apparently it's just ordinary military
00:28:34.400 way. So the more ordinary way to look at what Russia is doing is that by threatening Kiev,
00:28:43.680 which would have been too expensive to take, which we all thought was the case, right? Or
00:28:51.200 at least a lot of us did, that taking Kiev would be such a bloodbath that there's no way
00:28:56.860 that Putin could do it and, and make everything work. I mean, it would just be too much. So
00:29:02.740 instead it looks like what they may have done and we don't know, but it looks like what they
00:29:08.840 may have done. Okay. Kiev, Kyiv, whatever you want to call it. Uh, what they may have done
00:29:14.900 is surround Kiev, um, in order to keep the Ukrainians busy there so that they could consolidate things
00:29:25.120 in the, in the, uh, the South and the East. And then once they were done there, they would
00:29:30.940 consolidate their armies and then, you know, really completely own the parts that they already
00:29:36.820 control and then move toward taking whatever else they need. So they could basically force
00:29:43.260 Kiev to submit without directly attacking it, which would be the best way to, to do it. Now,
00:29:51.420 suppose you said to yourself, all these military experts in Russia, they, they've heard of this,
00:29:57.860 right? It's not like you and I are making up, making up military doctrine. Seems like it's
00:30:04.200 fairly standard stuff. Um, so, so this would be supportive of the theory that if Putin has decided
00:30:15.880 to take Ukraine, he always had a long-term view
00:30:26.200 of it. And that the idea that he could take, uh, Kiev or Kyiv, the capital, the idea that he could
00:30:34.300 take that in 48 hours might've been Russian disinformation. Maybe, maybe, maybe Russia is
00:30:40.080 the one who started the idea that they could take over in two days so that everybody was looking for
00:30:44.740 the blitzkrieg that wasn't going to work, but they never planned to do that in any way. They
00:30:48.760 always planned to do it the, the slow, slow way. Now, I saw an estimate on Twitter that I don't
00:30:56.040 believe that, that only a thousand, uh, um, non-military people have died in Ukraine. Is that, you know,
00:31:07.840 I'm talking about just the citizens, uh, the population. That's not right, is it? Only a thousand
00:31:14.520 casualties, uh, among non-military people in Ukraine. We're not talking about the Russian
00:31:20.040 side. Yeah, that doesn't sound right. But what if it is? What if it is right?
00:31:29.340 Are you sure? I don't know if there's any way to, to know at this point, right? Yeah, a thousand,
00:31:38.520 yeah. I mean, let's, let's put in the obligatory, a thousand is too many. You know, it's a tragedy.
00:31:45.320 All right. Now let's talk about the, uh, how many of you saw the clips of what looked like, uh,
00:31:51.480 massive war crimes in, uh, what was it? Buka or Bucha or something. That one town that the Russians
00:31:59.800 have left. And you saw video that we have not verified. We meaning anybody. Uh, unverified video
00:32:07.640 of what looked like handcuffed prisoners who had been executed and dumped all over the road.
00:32:14.200 All right. How many of you believe that that was a real video and not propaganda?
00:32:23.720 And how many of you had an initial reaction that morphed a little bit as you heard other people
00:32:32.120 weigh in?
00:32:32.760 Now what I've been doing recently, and I'm not sure that I'm doing a good service here is that when I
00:32:40.520 see one of these videos that looks to be like it's, you know, at least possible or even probable
00:32:45.960 propaganda, if it's being promoted by any kind of legitimate source, I'll usually start out by just
00:32:53.160 tweeting it or retweeting it because it's news, which is not, you know, endorsing it necessarily. But
00:33:00.600 after sort of digesting things for a few hours, sometimes maybe the next day, uh, I'll, I'll tell
00:33:07.960 myself, you know, I'm not sure that we have sources for any of that. And that, that would be easy to fake
00:33:14.280 and everything else is fake. So why would you believe this? And so slowly I kind of talk myself out of
00:33:19.720 everything. That's my first reaction. So my first reaction was probably like everybody else's, which was,
00:33:26.120 oh my God, it's, that looks like the worst war crime we've seen since World War II. I doubt it's
00:33:33.160 the worst one, but you know, in terms of our consciousness, it feels like that. And then what
00:33:40.200 happened was a number of voices on Twitter in particular started doubting the accuracy of that
00:33:48.520 video. And a number of people pointed out, uh, incongruencies. I don't know if those were
00:33:57.880 true. I'm not even sure if the, I don't know if the critics were accurate in their, what they pointed
00:34:02.360 out either. But, um, you saw the narrative sort of give up, didn't you? I feel as if the,
00:34:14.440 the media, um, did say they couldn't verify it, but they still ran it and made you think it was real,
00:34:23.000 uh, as war propaganda probably worked really well. So whether it was real or not, I mean,
00:34:30.040 it, it was packaged as propaganda by the Ukrainians. So here's what is interesting.
00:34:38.920 So the, the, the narrative comes out in the fog of war that there's this gigantic war crime.
00:34:45.240 It's very suspicious because of the source and, and, you know, the situation.
00:34:50.040 I believe that the, the, the wave of belief was turned back by what I call the internet dads.
00:35:00.120 Now, uh, to be a little less sexist, I'm not saying it's all male. Okay. And I'm not saying
00:35:07.840 it's people who have, you know, children. I'm using just a general phrase here. So the internet dads
00:35:14.280 to include internet moms and, and, you know, everybody else. But, but there are, yeah,
00:35:19.560 you could call it the adults in the room, but people have been calling it the dads. So I'll just use
00:35:24.440 that. And yeah, Mike Cernovich, uh, I would say chief among that group. And that group has enough
00:35:34.360 influence now that I feel that the major media had to pay attention that the internet dads, people with
00:35:42.760 credibility and big followings were saying it looked like bullshit to them. Now,
00:35:50.280 have any of you watched it closely enough to feel like that's what happened, or was it going
00:35:55.480 to happen anyway? Because, because I feel like, um, I feel as though the balance of power is really
00:36:04.680 starting to change. And at the same time that this other force is sort of developing that, you know,
00:36:13.240 the, I'll call them independent voices, you know, they're podcasters, they're, they're tweeters or
00:36:19.480 people with big audiences. I feel like the counterbalance was really productive this time.
00:36:24.760 Don't you? Did you see the same thing I saw? Now, in the context of the, um, the internet dads
00:36:35.400 balancing out the major media, what's the biggest obstacle that the internet dads have working
00:36:41.720 against them? Uh, in the comments, see if you can get ahead of me here. What's the biggest limiter
00:36:48.760 or limitation that the internet dads have? Possibly the algorithm, right? I don't want to say
00:36:56.600 censorship, censorship, because, uh, if you say censorship, that's, it reads intention into
00:37:05.480 something that's not demonstrated to have intention yet. It has effect. I mean, we can see what we think
00:37:12.440 is the effect, the alleged effect, I guess. Um, so in the context where we worry that the only thing
00:37:19.400 limiting the sensible counter-voices to the nonsensical news is the algorithm,
00:37:27.640 Elon Musk just bought the biggest share of Twitter that anybody owns, including Jack Dorsey.
00:37:33.960 It's like several times more shares than Jack Dorsey currently owns.
00:37:42.360 Elon Musk, who was asking about censorship and about, you know, the, the big platforms and asked
00:37:47.800 if he should start one, uh, put his money where his mouth is, bought 9.2% of Twitter this morning
00:37:54.440 ish. And now he's the largest shareholder.
00:37:58.680 I should, uh, give you a disclosure here. Uh, I do own Twitter stock and I do own Tesla stock.
00:38:09.720 And I tried to sign up for Truth Social, but I couldn't get in. And there's a, uh, like a related
00:38:15.640 story to all of these stories that Truth Social, I think they lost one of their people who was, uh,
00:38:22.680 managing that thing. And it doesn't look as though, um, the launch was successful.
00:38:28.680 So far. All right. It looks like it's late and it's not on Android and it's not on a web browser
00:38:34.840 yet. So it doesn't look like it's successful. And it's not going to help one bit that, uh,
00:38:40.760 Elon Musk just bought into Twitter because Truth Social should be a solution to what people believe
00:38:49.640 is censorship. But if Elon Musk becomes the most influential voice, and I don't know that that can
00:38:56.680 happen, but we'll see. But if Elon Musk becomes the most influential insider at Twitter, what's the
00:39:03.640 first thing he's likely to do? Like what, what's the first change he's likely to promote if you had to
00:39:11.800 guess? And here's the weird part. Here's my guess. I think the first change that Elon Musk is going to,
00:39:19.800 uh, make, just a guess, is Jack Dorsey's plan. How weird is that? Because Jack Dorsey's been saying for
00:39:30.520 how many years? That he thinks the platform should have, uh, your choice of algorithms. So if you just
00:39:37.240 want to see everything, that would just be your choice. And if you wanted to see it filtered by somebody else's
00:39:42.840 filter, you'd see how they filter and you just choose it. Now, I always wondered, why is Jack Dorsey
00:39:51.480 saying that, uh, the platform should be like that, but I saw nothing coming in on Twitter
00:39:57.160 that was going to do that thing? Did I miss something? Like why, why was the most important
00:40:03.960 voice at Twitter saying we should do this, but it wasn't happening, and for years? Or did it happen
00:40:09.560 in some small way that I didn't notice? All right. So then Jack Dorsey leaves.
00:40:18.120 Who knows, who knows what the story was behind that, right? Shareholders, blah, blah, blah, blah,
00:40:23.480 personal reasons. Who knows? But then Elon Musk comes in. I can't think of anybody more likely to
00:40:33.720 implement Jack Dorsey's plan. Maybe Jack Dorsey didn't own enough stock to get it done.
00:40:42.200 Maybe. I mean, if you stay anywhere long enough, you're going to develop, you know,
00:40:46.680 competing interests and other shareholders and stuff like that. So Elon Musk comes in with
00:40:52.680 probably nothing except a social agenda for
00:40:57.160 Twitter. I doubt he bought this as a pure investment, right? Can we all agree with that?
00:41:03.880 I think we would agree that, uh, Elon Musk did not buy Twitter as a pure investment. Do you know why?
00:41:12.360 I don't think he invests in stuff like that. It's not really, I don't think it's his wheelhouse, really.
00:41:19.240 I think he did it for purely social reasons and he broadcast it. He broadcast it as cleanly as you
00:41:24.920 could broadcast any message. He asked in public, should I start a social, you know, media network?
00:41:31.640 Um, do you think that censorship is being handled right or however he asked it? So he's being as
00:41:38.920 transparent as anything could ever be. And then he's probably going to take that same level of
00:41:44.280 transparency that he's taking into this transaction. I think he'll take it into the product itself.
00:41:51.960 Because remember what Elon Musk is more than anything? He's a product guy. He's the ultimate
00:41:59.560 product guy. In fact, maybe Steve Jobs, you could say, but it's hard to think of somebody who's more
00:42:07.720 product oriented versus marketing, let's say. In fact, Elon Musk does the product so well that he
00:42:15.480 doesn't need marketing. He just does it himself, tweeting from the toilet, as I like to say. And
00:42:24.040 so the only explanation is that he's doing it to make Twitter more transparent and it would be kind
00:42:30.760 of embarrassing if it works, meaning it would be embarrassing to the other, any other platform that
00:42:36.760 doesn't. But I would not want to be owning Getter or any competitor to Twitter. Because the only
00:42:46.920 problem Twitter has, in my opinion, is the one thing that Jack Dorsey said it should do and hasn't
00:42:53.640 done yet. But Elon Musk knows how to make a product work. And there's one thing that's wrong with Twitter.
00:43:02.440 It's just broken. Am I right? It's a product that just has a glaring flaw. And the guy who fixes glaring
00:43:12.440 flaws, it probably makes him crazy. Can you imagine being somebody who has the engineering, physics,
00:43:23.800 product design, knowledge of an Elon Musk, and then looking at this product, Twitter,
00:43:29.640 and seeing it's just massively broken in this most critical way, the transparency of the algorithm
00:43:36.520 and the censorship element? And knowing you could just fix it. Just buy the damn thing. Just fix it.
00:43:44.040 Now, my guess is that he's tiptoeing in and seeing what he can get done with 9.2%.
00:43:49.880 Probably a lot. But that's not full control, right? Now, I would also guess that there will be some large
00:43:59.000 number of other Twitter shareholders likely to throw in, in a proxy sense, behind Elon.
00:44:09.480 Well, I'm one. Okay, I'll use myself as an example. So I own stock in Twitter. And if Elon Musk said,
00:44:17.000 hey, we're going to have this, I don't know, vote, I'm going to push for a vote,
00:44:24.040 can you make sure that you sign your proxy vote? If Elon Musk asked me to sign it because he said
00:44:31.160 this is how you fix censorship, I'd sign the fucking thing. All right? And I never sign those. I never do
00:44:39.240 that. Because I always think, ah, somebody else will sign it. You know, they'll get enough people
00:44:44.440 signing their proxy, then, you know, I don't have to do it. I never do it. But if, you know, given
00:44:50.840 Elon Musk's credibility, especially in this, you know, specific domain, censorship, I mean,
00:44:56.840 he's very consistent. And I think most people, almost everybody, would think he's on the right track if he
00:45:03.400 opens up the transparency. So I think Elon Musk is correct in his first assumption. And again,
00:45:13.800 this is all just speculation. I can't see inside his mind. As I often say, if I could see inside
00:45:19.320 Elon Musk's mind, I would have started Tesla. Why would I wait for him to do it? I would have built
00:45:27.560 myself a rocket ship to Mars if I could do what he could do. But I can't see inside his mind, and
00:45:32.760 neither can you. But it seems to me from an outsider's perspective, if you wanted to limit
00:45:37.880 your risk while getting the most leverage, you would buy 9.2% of Twitter. That's enough to get
00:45:44.200 everybody's attention and say, okay, I'm the lion king of shareholders. And then just organize the
00:45:50.120 other shareholders. It's easier to organize the others than to buy their shares. Because I think he
00:45:56.280 could. Uniquely, he could. Not many people could do that. Or who else could, really? It'd be hard to
00:46:03.240 think of somebody else who could. All right. So Twitter stock was up like 21% last time I looked.
00:46:13.160 And as I said, I own both Tesla stock and Twitter stock. So discount everything I say by my bias.
00:46:21.640 How many of you think that Truth Social is going to work? Because it will attract basically nobody
00:46:32.040 on the left who disagrees, right? And what would be the point of it? The reason I'm on Twitter is
00:46:43.880 because that's where the mainstream media is. That's where all the energy is. If the mainstream media
00:46:50.280 decides to choke Truth Social, and they will. I'm not even sure that the right wing, or I like to say
00:46:59.320 right-leaning, I don't even think the right-leaning press is likely to go big on Truth. Do you know why?
00:47:05.560 It's competition. Truth Social is just a news network, just packaged as a social media network.
00:47:13.880 So if you're in the news business, I'm not so sure you want them to succeed, do you?
00:47:21.240 So I don't know how Truth will solve the problem of not having enough energy because there's not enough
00:47:30.440 conflict. You know, if you take out the thing that makes it work, the conflict, why would it work?
00:47:42.600 All right.
00:47:47.880 Okay. We have 30 times the followers on Truth Social. So you think shadow banning is real?
00:47:55.480 Well, it could have to do with the fact that there's nobody except people who agree with you on there.
00:48:02.520 Yeah, right. The problem is everyone agrees. Okay.
00:48:09.000 All right. Ladies and gentlemen, that is the amazing, amazing meat of the show. We are now in bonus
00:48:18.360 territory, right? It's bonus territory. This is part where you tell me what I forgot to mention.
00:48:26.280 Um, yeah, it's the best show ever. You're right. Stop it. I'm embarrassed now.
00:48:31.720 Um,
00:48:32.920 you know, Locals, somebody's making a comparison between Truth and Locals. So Locals, as a
00:48:45.240 subscription network, uh, that's different because I don't think people are there for the fight
00:48:51.000 on Locals. I think people are there because, usually, the central person has a point of view
00:48:59.880 that they, you know, enjoy. It's more of an entertainment, I think. So when I put
00:49:06.200 material on Locals, I think of it as just entertainment. Uh, and also it's the edgy stuff.
00:49:15.560 So I can, I can put the stuff there that, uh, now, um, does anybody know, uh, Snoop Dogg?
00:49:22.840 Uh, and I'm going to do a Kevin Bacon thing here. Anybody, does anybody know Snoop Dogg? Uh,
00:49:30.600 didn't Cameron Diaz, uh, go to high school with him? And here's why. Uh, I want to invite Snoop Dogg to my, uh,
00:49:38.840 man cave. And I want to interview him and see, uh, which of us will, uh, last the longest.
00:49:47.720 If you know what I mean. Perfect example of something that I can say more clearly on the locals platform.
00:49:56.440 Wink, wink. But, uh, here on, here on YouTube, it's a more, uh, family friendly, uh, environment.
00:50:03.080 And let me just say that Snoop Dogg and I could, uh, have a conversation and maybe compete.
00:50:09.640 Possibly fix all the problems in the world. And, uh, and you might say to yourself,
00:50:16.360 what are the odds that Snoop Dogg would come to my garage? I would say the odds are really,
00:50:22.920 really small, but it would be amazing. Wouldn't it? Come on. It would be amazing.
00:50:29.960 So one of the things I'm thinking of doing is, uh, doing man cave, uh, interviews,
00:50:36.840 but I only want to do it if somebody can come to the man cave. I don't want to do it remotely.
00:50:43.560 Um, so there you go.
00:50:52.600 Yeah, you'd watch. You know, you'd watch. All right. Let me ask you this question. Uh, aside from
00:50:59.480 Snoop Dogg, um, aside from him, who would be the most interesting person I could interview? And think
00:51:09.160 about not just that it's an interesting person, but somebody that you think I uniquely would be the
00:51:15.080 right person to interview either because of it's entertaining or I'd ask the right questions or
00:51:20.440 something. So who would be, you think Trump? All right. Let me tell you something that I could guarantee
00:51:30.440 you. Here's something I guarantee you that me interviewing Trump would be the best Trump
00:51:38.120 interview you ever saw. I guarantee it. I guarantee it. I don't know how it could miss. I really don't
00:51:47.480 because I think everybody does it wrong. Do you know what they do wrong? Let's see if you know.
00:51:52.520 How does everybody do it wrong? And by the way, I'm going to make one exception. There's that I've seen
00:52:00.360 one person do it exactly right. And I think I could, I could do something in that vein as well. Yeah. Greg
00:52:06.840 Gutfeld. I think Gutfeld did the best job because he humanized him. Like you could tell that
00:52:13.400 they just liked each other. And so like he got, he got like a little human look out of him. So I think
00:52:21.000 that's the thing I could do better than, better than somebody who does interviews for a living.
00:52:26.840 Because I think when you do interviews for a living, especially depending on what platform you're on,
00:52:33.400 you're sort of driven toward a certain level of professionalism that I think distracts from a good,
00:52:40.680 good entertainment. And that's what Greg Gutfeld gets right. He never lets the
00:52:47.400 professionalism ruin the job. You know what I mean? As soon as the professionalism is more important
00:52:55.320 than the product, then the product suffers. But he does product first, professionalism second. So you
00:53:01.800 get, and that's why he has two shows. Interview Ben Shapiro. Yeah, I don't know that me interviewing Ben
00:53:11.240 Shapiro would produce like any headlines. I think I would just agree with what he said. And then there
00:53:19.480 would be no spark there. Yeah, Jordan Peterson. That would be interesting.
00:53:28.920 And I'm especially interested in his recent religious interpretations, which I want to look
00:53:39.240 into a little bit. I want to hear it in his own words, because I'm considering that one of the most,
00:53:46.840 let's say, unpredictable
00:53:48.280 evolutions that I've seen. I would not have predicted Jordan Peterson's, is it a change? I don't even know
00:54:00.040 if it's a change, really. Maybe it's just something we're hearing more about. But I think it's an
00:54:04.680 evolution, right? To a more religious frame. And I'm going to say frame, because I think he puts it in,
00:54:13.320 I think he puts it in terms of, you know, living as if it's real. Is that true? I hate to try to
00:54:21.080 paraphrase other people. He says he doesn't believe it literally, he just likes the utility of it.
00:54:30.680 Is that it? Because that wouldn't be far from my own view. My own view is, if you don't know my view
00:54:38.040 on religion, is that, personally, I'm not a believer, but I'm very pro-religion. Because it's just so
00:54:44.520 obvious it works. I mean, why would I be against something that works? Now, I don't know that there's
00:54:52.040 any science behind that. But the only studies I've seen are that people who have a faith-based
00:54:59.160 view of the world seem to be happier, seem to be more prosperous, seem to do fewer drugs. You know,
00:55:07.320 there's something about thinking that somebody's watching you all the time, and, you know, loves
00:55:11.560 you and is judging you, I guess, that probably has some, you know, some guardrail effect on your
00:55:20.040 behavior. And if, as long as people have a choice, you know, they can, they can enter a religion,
00:55:27.000 they can leave it, which is the beauty of America. As long as they have a choice, if they're picking a
00:55:33.320 frame that works, that works for me. Now, in the old days, when I was younger and dumber,
00:55:40.680 I used to say, whatever it is I'm thinking today is right, and all those other people,
00:55:47.560 how dumb could they be? And the older you get, if you do it right, if you do it right,
00:55:56.920 your humility should increase, not decrease. So I started out with no humility, meaning I was
00:56:05.160 pretty, pretty sure I had the whole God situation figured out, and other people didn't, and there
00:56:11.960 was something wrong with them. And, but the longer you live, you realize that more and more things you
00:56:17.720 thought were clearly true, they just, like, evaporate. If I could tell you how many things
00:56:26.600 I believed were absolutely true five years ago. And I'm talking about a lifetime of having
00:56:34.520 truths evaporate, like every year something that was true disappears again. I mean, like really big stuff.
00:56:41.720 I'm talking about really basic things about even who you are, like a belief about, like,
00:56:50.360 what happened to you in the last five years. Just really basic stuff. And so the longer I live,
00:56:58.520 the more humility I get. Because everything I think I believe, now I have a little message that says,
00:57:05.000 yeah, yeah, you're pretty sure this time, aren't you? Do you know how many times you've been sure before?
00:57:10.280 Yes, you do. Yes, you do. You do know. You do know. Don't change the subject, Scott.
00:57:16.520 You know how many times you were sure before, and then found out you were totally wrong.
00:57:22.360 Do you remember that? I'm not trying to think about it. Think about it. Think about it, Scott.
00:57:27.960 Don't, no, don't not think about it. That's how you get cognitive dissonance. Think about it.
00:57:33.480 So that's something you can do as you get older, like you're in, if you do it right, right?
00:57:38.600 If you do it right, you will remember how many times you were wrong. And that was one of the
00:57:44.440 motivations behind that book behind me, where I'm pointing, How to Fail at Almost
00:57:51.240 Everything and Still Win Big. By cataloging all the things that I'd failed at, it was both a reminder
00:57:58.920 to myself how many things I was sure about that didn't work out, but also a way to tell other
00:58:05.320 people that even people, you know, for whom life worked out fairly well, were wrong all the
00:58:13.400 time. I mean, the number of times we're certain and wrong is just shocking, no matter how successful
00:58:21.640 you end up. So, and that helps people understand that their own certainty is worth questioning.
00:58:28.280 How sad to take the substance of God out and just embrace the utilitarian aspect.
00:58:44.840 Well, I don't know, is it sad or is it just a frame that works for some people and not for others?
00:58:49.720 Now, for me, the simulation hypothesis is the most predictive, right? So here's another prediction.
00:59:00.120 There will never be anything true about our reality, whatever you call this thing, that is not
00:59:06.040 also true for a simulated world. Meaning that a simulated world would have resource constraints,
00:59:14.360 it wouldn't have infinite memory, you know, there would be a bunch of things that would be
00:59:19.080 true for software. So my prediction is there will never be anything true in our so-called real world
00:59:26.040 that isn't just as true for software, because we can never get out of that model. Very specific prediction.
00:59:34.920 So we'll see.
00:59:35.640 Somebody says, somebody says, God is utilitarian.
00:59:43.400 Interview Sam Harris again.
00:59:47.080 Yeah, you know, I'd love to. That'd be good.
00:59:52.760 Kanye would be good.
00:59:56.360 Maybe Kanye would actually do it. I mean, it'd be more of a resource question. He can't be
01:00:02.120 everywhere and do everything. All right, that's all I got for today. Oh, Bill Maher. Oh, wow.
01:00:10.920 That would be interesting. Imagine me having Bill Maher on and having a video prepared to debunk the
01:00:20.440 fine people hoax and the drinking-bleach hoax and just say, just two things I want to talk to you about.
01:00:28.280 And that's it. I just want to show you the full video and then show you what you saw.
01:00:37.080 And then I'll tell you, I literally personally interviewed people who attended the Charlottesville
01:00:42.120 thing and found several of them who claimed that they disavow all racism. And they were just there
01:00:48.360 about the statues and the historical element of it.
01:00:51.240 And just see what he says. Because here's the thing. The thing that I keep saying to Bill Maher
01:00:59.480 about why he's so dangerous is that he's a shapeshifter. Well, that sounds negative. I mean
01:01:08.120 this to be positive. He is capable of entering your bubble and looking around.
01:01:13.480 Who can do that? I mean, really, who can do that? Not many people. It's a really small
01:01:22.040 list of people who can just walk outside their bubble if they choose to, right? It's the choosing
01:01:27.640 to that is where that little challenge is. But if they want to and they have a reason to,
01:01:33.000 and somebody alerts them to a good reason to do it, they have the ability to open their little
01:01:38.200 door and walk out of their little bubble and knock on the other bubble and go inside and see what's
01:01:43.880 happening. There's only a few people who can do that. And he's one. So a conversation with him,
01:01:51.640 if I could, you know, invite him into the other bubble, just to look around, you know, just to
01:01:57.880 check out the furniture, it would be pretty interesting, wouldn't it? That would be really
01:02:02.040 interesting. All right. And if there were anybody I thought I had a shot of getting to
01:02:10.040 the man cave, I would think the odds would be pretty good in his case.
01:02:17.960 Russell Brand will become more productive with your persuasion frame. Yeah. You know,
01:02:23.240 Russell Brand is fun, isn't he? Is it my imagination or is he also able to, I think that's the context
01:02:31.240 you're mentioning. He does seem to be able to check everybody's bubble, doesn't he? Am I right
01:02:36.280 about that? Those of you who have been watching before? Yeah. Joe Rogan, same thing. Joe Rogan
01:02:43.400 is bubble free. People will try to characterize him in one bubble or the other, but I don't think
01:02:51.720 he sticks. I think he's bubble free. Yeah. I think Russell Brand found his exact
01:03:04.200 place. Don't you? You know, I like, like him as a standup. I've liked him as an actor.
01:03:12.760 But when you look at his talent stack, you know, if you look at his verbal skills,
01:03:17.240 see, what's the point of, um, doing scripted material when you have his
01:03:25.480 capacity? Like that's a complete waste. And he should be in an unscripted, you know,
01:03:31.720 semi-scripted, um, bounded, but unscripted scenario. And that's what he's doing. So,
01:03:38.440 and then he also doesn't have like constraints, right? There's nobody writing lines for him and
01:03:43.320 telling him, do this, more of that. So, he's really the, um, yeah, he's, somebody says, uh,
01:03:52.360 Carl says, Brand's awareness, uh, on DMT and mushrooms is off the chart. Uh, that is correct. He is off the
01:04:00.740 chart. So, um, now he'd be an example of somebody that I don't know if he and I would have any reason
01:04:11.000 to talk because we'd end up just agreeing, I think. Chris Rock. Yeah. He'd be the hardest interview
01:04:20.280 in the world. Matt Damon. Why Matt Damon? He'd be, I like Matt Damon, but why? Dave Chappelle. Oh,
01:04:28.440 that'd be interesting. I don't think I could get him to come. Well, I probably couldn't get anybody to
01:04:34.600 come, but, uh, all right. That's all for now. And I will talk to you on, uh, YouTube tomorrow. Oh,
01:04:42.720 and before I go, I don't know if anybody's noticed this, but, uh, Madonna has slowly evolved into Jar Jar Binks
01:04:51.440 and I didn't want to be the first one to mention it, but it's important. See you tomorrow.