Real Coffee with Scott Adams - June 16, 2022


Episode 1776 Scott Adams: The Highlight Of Civilization Is About To Begin


Episode Stats

Length: 1 hour and 13 minutes
Words per Minute: 144.4
Word Count: 10,678
Sentence Count: 787
Misogynist Sentences: 2
Hate Speech Sentences: 11


Summary

The World Health Organization wants to rename Monkeypox, but it sounds like it's going to be called Crackerpox now. Who's the racist in this story? Or is it just that there's something about monkeys that makes them funny?


Transcript

00:00:00.000 La-pa-pa-da-da-da-da-da-da-da-da.
00:00:05.640 Good morning, everybody.
00:00:08.480 Are you ready for the best show in the history of human civilization
00:00:14.020 and possibly one or two alien civilizations who tried and failed before we evolved?
00:00:22.180 And who knows how many after that?
00:00:23.940 All I know is this is the highlight of the entire, I don't know, Big Bang until now.
00:00:31.260 And if you'd like to take it up another notch, I know, I know, it seems impossible.
00:00:35.840 But it can be done.
00:00:37.440 And we're going to do it today.
00:00:39.460 And you will start your day with a success.
00:00:43.660 And that will probably, you know, snowball into all kinds of good things happening to you today.
00:00:48.280 And by the way, again, I don't think I say this often enough.
00:00:53.940 I don't know, are you working out or something?
00:00:56.820 Because you look sexier than normal.
00:01:00.320 And if you'd like to take that up a notch, too, all you need is a cup or a mug, a glass, a tank or chalice,
00:01:06.780 a canteen, jug or flask, a vessel of any kind.
00:01:11.380 Fill it with your favorite liquid.
00:01:14.500 I like coffee.
00:01:16.580 And join me now for the unparalleled pleasure.
00:01:20.400 It's the dopamine hit of the day.
00:01:24.080 It's the thing that makes you tingle.
00:01:27.860 It's called the simultaneous sip.
00:01:30.200 It happens now.
00:01:31.140 Go.
00:01:31.280 Those on the Locals platform know that I was just modeling a technique that I've been describing to them.
00:01:45.200 They learn so much there.
00:01:46.700 So much more than you learn on YouTube.
00:01:48.820 But yet I love you all.
00:01:49.860 So in the news, the World Health Organization, who I just told you about, says they want to rename Monkeypox.
00:02:09.800 Monkeypox.
00:02:10.960 Why?
00:02:11.960 Why would they want to rename Monkeypox?
00:02:17.120 The first thing we have to acknowledge, and I think you'd all agree, there has never been a better name for a serious disease than Monkeypox.
00:02:26.600 I don't care who you are, you hear Monkeypox, and you think, ha ha, Monkeypox.
00:02:36.360 Am I right?
00:02:37.580 But you hear, you're going to get a bad case of COVID-19 that might have come out of some kind of weapons laboratory.
00:02:44.660 We don't know, but we worry that it might have.
00:02:47.600 Now that's scary stuff.
00:02:48.840 Although, as bad as saying you have COVID-19 sounds, ironically, saying you have long COVID sounds a little bit self-complimentary, if you know what I mean.
00:03:02.860 I don't like to brag, but I've got a pretty big case of long COVID.
00:03:09.840 Long COVID.
00:03:12.320 My COVID is so long.
00:03:14.320 So, Monkeypox, tragically, it looks like they're going to change the name, so we won't have the fun of calling it Monkeypox anymore, with all the jokes that you can make about that.
00:03:26.700 And, of course, the reason, it's obvious, is that it's racist.
00:03:32.260 Now, here's a question I ask provocatively.
00:03:34.500 If you and I heard Monkeypox and thought of monkeys and thought it was funny and it was a good name, and that's all we thought.
00:03:44.320 Because we're good people.
00:03:46.500 But if some other people, maybe the people in the World Health Organization and a bunch of people who wrote to ask for this,
00:03:53.020 there was some kind of a sign-up where people signed the letter and said,
00:03:58.060 you've got to change that name from Monkeypox because it sounded racist to them.
00:04:02.560 To which I say to myself, huh, who is the racist in this story?
00:04:08.680 If I put the variables together, the people who just think monkeys are cute?
00:04:18.780 Or is it the people who made some other association with cute little monkeys?
00:04:25.320 To which I say, I think we found the racists, or they found us, or something.
00:04:32.960 But if you'd like to contribute to the renaming of Monkeypox, I think it's fertile ground for humor.
00:04:45.980 So can somebody maybe start a tweet thread?
00:04:49.860 I should have done it myself.
00:04:54.080 Crackerpox.
00:04:54.960 On the Locals platform, somebody suggested renaming it to Crackerpox.
00:05:13.340 I think we can close the competition.
00:05:15.940 I think Crackerpox just won.
00:05:18.460 Oh, my God.
00:05:20.200 That is funny.
00:05:21.060 You know, and it's funny because of the actual sound of the letters.
00:05:27.260 One of the reasons that monkey is funny is because of the K sound.
00:05:31.960 Right?
00:05:32.220 When you put a K sound in anything, it just sounds funnier.
00:05:35.600 But Cracker has that same thing going for it.
00:05:40.020 Crackerpox.
00:05:41.520 And even pox.
00:05:43.020 Even pox is a funny word.
00:05:44.980 A pox on your house.
00:05:46.640 All right.
00:05:46.980 Does it seem to you that maybe losing is winning these days, or something?
00:06:02.220 There's something going on with the Democrats.
00:06:05.420 It's almost as if they're trying to play for a better draft pick next season.
00:06:12.400 And somebody needs to explain to them it doesn't work that way.
00:06:15.080 Because you know how, if a sports team looks like they can't make the playoffs, well, maybe they don't try so hard after that point.
00:06:23.700 Because it would be better to end the season in a low rank, because then you go to the top of the heap for picking the new talent out of the draft next time.
00:06:33.360 So you could actually leapfrog, you know, and turn into a dynasty pretty quickly if you get the right players.
00:06:40.420 So, but it almost feels as if the Democrats are like playing for a draft pick.
00:06:44.660 And somebody needs to tell them, you know, it doesn't really work that way.
00:06:49.200 Listen to what the Democrats are up against in terms of a headwind for the coming elections.
00:06:59.380 All right.
00:06:59.580 Rasmussen says, Rasmussen poll, 89% of voters are at least somewhat concerned about the economy.
00:07:08.500 89%.
00:07:09.140 How does any incumbent win re-election when 89% are worried about the economy?
00:07:17.800 It doesn't even matter if it's the fault of the president.
00:07:20.700 Because remember, the rules are, the president gets credit even if they didn't do anything.
00:07:26.360 And the president gets, you know, criticized even if there wasn't anything they could do.
00:07:32.420 Or even if they didn't do anything wrong.
00:07:33.820 So, if you've got 89% of the voters at least somewhat concerned about the economy, including 69% who are very concerned, that's not looking good for the incumbent.
00:07:47.100 And then, secondly, crime is the only issue that rivals economic concerns with voters.
00:07:54.280 This is also from Rasmussen.
00:07:56.580 88% are concerned about violent crime, including 64% who are very concerned.
00:08:03.820 Now, the economy and crime.
00:08:08.980 What are the two things that are most associated with the Republicans?
00:08:14.400 That even Democrats would say, well, you know, we hate these Republicans, but, you know, in a quiet moment, if nobody is overhearing me and I'm talking to my best friend who won't tell anybody,
00:08:26.820 I might admit that they might do better on economics and crime.
00:08:31.840 At least some people would have that view.
00:08:35.360 So, you know, it's hard to imagine how Democrats could win.
00:08:39.440 And so this predicts that the only strategy that they could have, the Democrats, as an election strategy, would be coming up with bigger and bigger hoaxes.
00:08:49.100 And, unfortunately, that's like real, isn't it?
00:08:57.380 Because it doesn't seem that the actual facts, even the facts that CNN is willing to report.
00:09:03.200 Because, remember, even CNN has decided to become more even-handed.
00:09:08.080 And apparently, to their credit, and I will say this publicly because I criticize them mercilessly, but to their credit, they had a fairly prominent primetime question about Biden's competency.
00:09:23.980 You know, Don Lemon grilled them hard on that.
00:09:26.360 And I think the public would appreciate that, that that was a pretty meaningful way to approach an important question.
00:09:35.940 So give them credit for that.
00:09:37.960 But how in the world do the Democrats win unless they come up with the world's biggest hoax?
00:09:42.860 And I think that the January 6th thing is at least how they're going to frame things.
00:09:54.160 And I think the idea is that if they can, so this is sort of Democrat math, but maybe it's just political math.
00:10:01.700 I won't blame Democrats.
00:10:02.760 If you can say that there's one person in a group of ten who did something despicable, you can say that that group is bad.
00:10:13.240 Am I right?
00:10:14.780 If one person in a larger group of ten does something terrible, it's our normal thing to say, well, that's a bad group.
00:10:23.680 Because at the very least, they allowed somebody in that group who would do that terrible thing.
00:10:28.700 So even if you say, well, there was only one person who did the terrible thing, it's a reasonable argument to say, but, you know, the other nine maybe should have seen it coming or should have stayed away from them or whatever.
00:10:42.180 But now let's say you successfully make the argument and now you've demonized the ten.
00:10:47.640 But what if the ten are a small group of another larger group?
00:10:52.820 Does it work again?
00:10:56.280 And the answer is yes, it does.
00:10:58.180 Once you've said that the group of ten are all bad, by extension, then if they're part of yet another larger group, you can say, well, that larger group is letting these guys in.
00:11:09.340 And so logically, by association, at least they should have policed themselves better.
00:11:16.640 And so I look at the January 6th thing.
00:11:21.580 So the January 6th thing is a small number of people, probably in the Proud Boys and others, probably did some really bad things.
00:11:34.720 Not probably.
00:11:35.500 There was actual violence and injury.
00:11:37.300 So people did some bad, violent things.
00:11:39.760 But probably not most of them.
00:11:44.120 But you still say, but wait, okay, if the Proud Boys had, you know, this many people in them who were willing to fight and cause trouble, that does say something about the larger group.
00:11:55.440 Even if the larger group, 90% of them had nothing to do with anything illegal, right?
00:12:03.320 So you start with the small troublemakers, and you extend that to the Proud Boys.
00:12:09.500 But then you say, the Proud Boys were a few hundred, or at least people like them, you know, if you include like-minded people, and of a number of several thousand.
00:12:22.880 So now you can say, wait a minute.
00:12:26.140 If there are a few people in the Proud Boys who were bad, you know, unambiguously bad, now you've demonized the Proud Boys.
00:12:34.460 So you've got that whole group.
00:12:36.380 But now you say, but the Proud Boys were a significant chunk of the entire protest.
00:12:43.700 Whatever you want to call significant, kind of subjective.
00:12:47.180 And now you can demonize the entire protest.
00:12:49.320 So now you've done two leaps, right?
00:12:54.320 One is the few protesters to the Proud Boys.
00:12:56.920 And I'll just use the Proud Boys as a proxy for the people who came there with the worst intentions.
00:13:03.380 You know, they were thinking fighting when they went there.
00:13:06.280 And then you say, you extend that to the people who were, let's just say, enthusiastic Trump supporters.
00:13:13.520 But they were nothing like the Proud Boys, right?
00:13:17.500 They didn't, many of them might not even know they existed.
00:13:21.980 So now you've demonized all the people at the protest.
00:13:26.100 And you can extend that now to Trump.
00:13:28.900 Because you can say, well, you know, maybe it's hard to prove in any legal sense that Trump directly told anybody to do anything illegal.
00:13:38.120 Or even unwise.
00:13:41.780 But we can certainly show that he didn't do enough to stop it as soon as he could.
00:13:47.880 So now you've extended the larger crowd to Trump.
00:13:54.020 And now that you've gotten to Trump, you can extend it to MAGA, the ultra-MAGAs.
00:13:59.660 So now you've got it all the way to Trump and MAGA.
00:14:03.780 And then MAGA, of course, makes up much of the Republicans and the right and conservatives.
00:14:10.100 So now you've done, how many leaps was that?
00:14:13.240 Four?
00:14:14.480 Four leaps.
00:14:16.080 So you started with a small group of, you know, violent people.
00:14:19.960 Then that demonized the Proud Boys and like-minded people.
00:14:23.300 And then that got all the protesters, no matter what reasons they were there or no matter what they did or no matter what their intentions were.
00:14:32.320 And then that got Trump's orbit.
00:14:35.100 And then that got all of MAGA.
00:14:37.160 And then that gets all of Republicans.
00:14:39.200 And then that's all of conservatives.
00:14:43.180 Right in front of our eyes.
00:14:45.520 You know, it's not like it's a magic trick.
00:14:48.980 They're doing it right in front of us.
00:14:50.480 And, yeah, we'll talk about the FBI in a minute.
00:14:55.960 So now do BLM and Antifa.
00:15:02.080 Same political math.
00:15:04.140 Yeah, so earlier when I said that it's not just Democrat math, it's political math, that would be a good example.
00:15:15.000 So you can do the same thing with, there were some number of people who were violent in the Black Lives Matter protests.
00:15:21.420 They're all Democrats.
00:15:23.200 Blah, blah, blah, blah, blah.
00:15:24.220 So it scales up the same way.
00:15:26.000 But the January 6th thing, I think, has more jumps.
00:15:30.580 We're used to two jumps, right?
00:15:33.060 But the January 6th thing is like a four-jump play.
00:15:38.320 All right.
00:15:38.760 Speaking of hoaxes, so now the Biden administration is going to look into some kind of administrative punishment for the border patrol agents who allegedly were whipping Haitians crossing the border, the Mexican border.
00:15:56.040 And, of course, it turns out that the photographic evidence completely clears them.
00:16:04.260 It was just a misleading video, and they were just using the reins on their own horse.
00:16:08.880 But instead of just saying, well, we're sorry, I guess we were fooled by those misleading videos, go back to work and, you know, with our blessings.
00:16:21.360 They're actually going to pretend like they were right all along because demonizing the border patrol is good for their politics.
00:16:30.040 It's like one of the worst things I've ever seen.
00:16:32.320 And at what point do you stop ignoring it, you know, as a public?
00:16:40.360 Because every time I see a story like this, it makes my blood boil.
00:16:44.540 I'm like, oh, are they really going to punish these guys, the border patrol people, who apparently did nothing wrong?
00:16:54.260 Are they really going to punish them just for consistency and political gain and do it right in front of us?
00:17:00.540 Right in front of us.
00:17:01.480 Is that really happening?
00:17:04.220 And so here's the thought process I go through.
00:17:07.520 I think, well, I've got to do something about this.
00:17:10.800 But then the very next story I read, I think, well, I've got to do something about that, too.
00:17:16.620 There are too many things to do stuff about.
00:17:19.780 They've basically overwhelmed me with outrage.
00:17:23.800 So I have so many outrages.
00:17:25.700 I'm like, you know, I should, like, weigh in and make sure that these border patrol people, because they're employees of the United States, right?
00:17:34.880 So in a way, public employees are sort of my responsibility, aren't they?
00:17:42.460 You know, in an extended way.
00:17:45.040 I pay taxes.
00:17:46.640 The taxes go to their, you know, they're working for the public.
00:17:49.840 I'm the public.
00:17:51.080 I mean, I feel like it's a two-way obligation, right?
00:17:53.740 They're keeping us safe.
00:17:54.880 Don't I have the same obligation to them to keep them safe?
00:17:59.680 So I read this story.
00:18:00.840 I'm like, oh, it's their job to keep me safe.
00:18:03.880 They're not safe themselves.
00:18:05.660 I should run down there and, like, keep them safe.
00:18:09.440 It's just, you know, a reciprocal instinct kind of thing.
00:18:13.700 But then who has the time?
00:18:16.240 And I think, well, other people got problems, too.
00:18:20.360 And maybe I could solve a problem that's closer to home.
00:18:24.100 And, you know, you talk yourself out of it.
00:18:27.020 But, and it could be, it could be that if you looked into it, you'd find out there's more to it than you know.
00:18:34.360 And then you spend all your time trying to fix this outrage,
00:18:38.120 only to find out that a few of the border patrol that they're talking about actually did something wrong.
00:18:44.880 And then you've wasted all your outrage on something you just didn't have the full context for.
00:18:50.360 So it's easy to talk yourself out of working, you know, working against any of these outrages.
00:18:55.260 Because maybe they're not even true.
00:18:59.700 So what is the worst thing about having a job with co-workers?
00:19:04.780 It's the co-workers.
00:19:06.140 It's the meetings.
00:19:07.440 Because then there are more co-workers there at the same time.
00:19:11.340 You know, there's, the only thing worse, the worst thing at work is interacting with other people, right?
00:19:16.120 All of the unpleasantness at work is interacting with your boss or interacting with your co-workers.
00:19:24.900 Generally, the customers are fine.
00:19:27.780 You know, you can usually have a good day with customers.
00:19:30.620 You help them, they're happy about it, you know, it's all good.
00:19:34.060 But all the, all the unpleasant stuff has to do with meetings and just running into people.
00:19:39.300 So now we've found a way to make meetings even worse.
00:19:44.820 Apparently, the hybrid, the so-called hybrid meetings where some people are on video and some people are in the room,
00:19:51.120 it turns into a whole second-class citizen thing if you're on video, you know, and it's, it's a terrible look.
00:19:57.660 And you have to look at these little postage stamp people in the crowd and, but there are a few technologies, apparently, that are going to fix that.
00:20:06.120 One of them is, you know, automatic camera focusing on whoever's talking so that whoever's talking could be bigger on the screen.
00:20:15.120 That would be good.
00:20:16.300 That already exists.
00:20:17.940 It's a little expensive.
00:20:18.940 And then apparently the meta model, where you put on the goggles and you go talk, according to one article I read, I wish I could remember the author,
00:20:30.240 said that the, you know, the goggles were annoying so it's not quite ready for prime time,
00:20:35.900 but that the sensation of it was being in person, even though it was avatars at a table.
00:20:42.840 The sensation was as personal as being in person.
00:20:46.920 Now, I think that's real.
00:20:48.940 I think that meetings among avatars are going to be important.
00:20:54.700 And what's really interesting is if you're meeting as an avatar that doesn't look exactly like you,
00:21:00.820 you get to change how influential you are.
00:21:04.620 Here's something that nobody has, I don't, I haven't heard anybody talk about this.
00:21:09.040 Let me give you two scenarios in the real, real world.
00:21:12.760 In the real world, you go into a meeting and somebody, let's say, is applying for a job.
00:21:17.740 Well, we'll use that example, applying for a job.
00:21:22.460 You walk in and how long does it take you to know that you're going to hire that person?
00:21:27.240 From the moment you meet them, how long before you know if you're going to hire them?
00:21:33.060 Well, I hate to say it's like 10 seconds.
00:21:36.020 It is.
00:21:36.540 Because you just look at them.
00:21:39.260 You think that you interview them, but you don't.
00:21:44.720 That's just an illusion.
00:21:46.480 You walk in, you look at them, and you decide if you're going to hire them.
00:21:51.000 I'm sorry.
00:21:52.080 Did you think it was because of all their qualifications and the things they said?
00:21:57.100 Nope.
00:21:58.360 No, it's just you looked at them.
00:22:00.980 And now I'm exaggerating a little bit for hyperbole, right?
00:22:05.320 So here's an assumption I'm making.
00:22:07.740 That by the time they got in the room, you've already checked that they met the basic qualifications.
00:22:13.740 Am I right?
00:22:14.280 But generally, they don't even get in the room unless you've seen something on paper that says, oh, they have at least the background.
00:22:22.640 So now, has anybody interviewed enough people or hired enough people to realize that we're not good at it?
00:22:31.020 Has anybody had that sensation yet?
00:22:33.160 You hired somebody and you thought, well, this is obviously a good hire and turned out to be a disaster.
00:22:39.000 And did you learn from experience that nobody's good at it?
00:22:43.700 Nobody's good at it.
00:22:45.920 This is why the Naval Ravikant theory is so important.
00:22:54.000 I think he's the first person I heard it from.
00:22:57.440 That hiring people is not the skill.
00:22:59.980 Firing people is the skill.
00:23:02.140 Because they all look alike when you hire them.
00:23:04.800 They all look like, well, you know, I talked to ten people and any one of those ten might have been the superstar.
00:23:10.800 I don't know.
00:23:11.540 Can't tell.
00:23:12.700 But you can tell after they start.
00:23:14.920 You can tell if they're killing it after six months.
00:23:17.860 So if you don't execute, you know, the kill switch, then you don't have the skill to be a good manager.
00:23:23.740 So it's the firing that is the skill.
00:23:26.200 And, all right, so everybody's buying that?
00:23:33.840 The firing is the skill.
00:23:35.020 You haven't worked in engineering land before.
00:23:40.280 So what I'm saying is that if you see somebody looking differently, because the only contact you have with them is in a virtual world, in which they're an avatar and not a person, your decision-making system will be interrupted.
00:23:55.140 And you'll have to use a different one.
00:23:57.400 So the avatar that you choose to represent you will have a gigantic effect on your success in the real world.
00:24:08.600 You just don't realize it.
00:24:10.980 So if your avatar is a giant, let's say, weasel, and you think it just looks cute, but people who see it think weasel or skunk or something, that is going to affect you.
00:24:24.360 Your actual life will be much affected if you choose to look like a skunk because it's cool and it's edgy, or you just like the colors white and black, whatever.
00:24:38.360 So here's just putting it all together.
00:24:41.260 You don't realize that people pretty much make their decisions on looks because you're sure that you don't do it.
00:24:47.900 I'm sure that you do.
00:24:49.140 What you do is you make your decision on looks and then you talk yourself into it by saying, you know, perfect qualifications.
00:24:56.500 I liked how he or she answered that question, but it's not really that.
00:25:00.340 It's just looks.
00:25:02.500 It's basically how much you want to mate with them.
00:25:05.820 That's pretty much it.
00:25:09.280 All right.
00:25:10.400 And sometimes it works the other way.
00:25:12.480 Sometimes there's a jealousy factor where good-looking people could be discriminated against.
00:25:17.940 I've actually seen that happen in hiring because the people who work there were not good-looking and they didn't want anybody there who was.
00:25:24.940 That's actually a thing.
00:25:27.680 But the point is, in both cases, looks are the real reason you're doing something.
00:25:33.340 School choice won big in Iowa in the election.
00:25:33.340 So I guess eight out of nine candidates who were super pro-school-choice beat out their challengers, including incumbents.
00:25:44.820 So we now know that there's an issue, a single issue, which moves the dial, and it's school choice, at least at the state level.
00:25:55.140 So I feel as if it's time for a presidential candidate to make a stand on that.
00:26:02.920 You know, they always have opinions on it, but it's never been a central point of anything.
00:26:06.960 And somebody could make that work.
00:26:11.200 So watch for school choice to be the big thing.
00:26:14.780 There was an article that says that weed lowers your IQ over time.
00:26:20.400 So if you use it every day, your IQ will decline by 5.5 points on average from childhood.
00:26:26.160 Do you believe that?
00:26:27.060 If you use weed every day, your IQ will decrease 5.5 points.
00:26:34.640 Not 5.5 percent, but 5.5 points.
00:26:39.680 So here's why this is a problem for some people more than others.
00:26:44.840 Suppose your IQ is 100, supposedly average.
00:26:50.600 Society is built for people with roughly an IQ of 100, because if they didn't build it that way, people wouldn't be able to open doors and use cars and just operate in society.
00:27:07.680 They wouldn't be able to, you know, buy a bus ticket.
00:27:10.340 So you have to make everything in society work for people who are around an IQ of 100.
00:27:15.740 Well, what happens if you take 5.5 points off of somebody who's just barely smart enough to operate the machinery of civilization?
00:27:26.780 It's dangerous.
00:27:28.540 I would say if you take somebody who's marginally smart enough to navigate life, and you take 5 points off their IQ, you've turned them into kind of a high-functioning moron, and maybe that's not such a good thing.
00:27:42.760 So if you have an IQ of 100, I do not recommend smoking pot every day.
00:27:52.920 I mean, I don't dis-recommend it.
00:27:54.700 I'm not a doctor.
00:27:56.520 So don't take my advice either way.
00:27:58.900 That would be my overall advice.
00:28:01.500 But suppose you had an IQ of 145, and you smoked weed and took 5 points off it.
00:28:09.480 You'd still be the smartest person in the room.
00:28:13.960 Would it make any difference?
00:28:16.980 So let's say the side effect is the same.
00:28:20.100 It's 5 points off your IQ no matter who you are.
00:28:24.920 I'm going to make an argument that for some people, it might not make a difference.
00:28:30.640 You know, if they're, I suppose if they're physicists, maybe lay off the weed.
00:28:36.500 That would be a good idea.
00:28:37.500 But, suppose they were just, you know, overqualified for their job.
00:28:45.580 Well, in that case, maybe they got a little to spare.
00:28:49.540 Now, here's the first question I ask about this study.
00:28:53.860 Did they study sativa users or indica?
00:28:58.460 Because it's really different.
00:29:00.860 Sativa makes you more creative and smarter.
00:29:04.360 I'm just going to, I'm going to just assert that.
00:29:06.340 Now, that's not based on science.
00:29:09.820 It's based on a lot of experience.
00:29:13.900 For the first 15 minutes that you experience sativa, you're smarter.
00:29:13.900 I'm not going to tell you which of my ideas that people like and have spread around were concocted in the first 15 minutes after smoking pot.
00:29:34.320 But it's a lot.
00:29:36.440 It's a lot.
00:29:37.260 And if you were to smoke sativa, you get that effect.
00:29:46.080 More creative, you know, you're actually more energetic for a little while.
00:29:50.300 But if you smoke indica, and both of these usually come as dominant hybrids, so they're not pure in either case, usually.
00:29:58.220 They're usually not pure.
00:29:59.140 So they're mixes, but they're dominant one way or the other.
00:30:02.200 If you do an indica dominant, it just makes you sleepy and stupid.
00:30:07.680 That's why people do it, to relax and go to sleep.
00:30:11.060 So do you think that the long-term IQ effect of taking a drug that makes you sleepy and stupid would be the same as the long-term effect of somebody who took only the sativa version, which makes you literally smarter?
00:30:32.640 But for a little while.
00:30:33.580 And then you, you know, you plunge after that.
00:30:39.360 What if the IQ effect is due to a lack of motivation?
00:30:42.620 Exactly.
00:30:43.640 Because sativa can make you feel more motivated to do things you weren't even motivated to do before.
00:30:51.680 Whereas indica makes you less motivated, because that's why you take it, to slow yourself down.
00:30:57.460 So the first question I'd ask is whether they broke that out. I don't think they did. I think I tweeted it.
00:31:08.300 So if you want to see the sources for any of these things, look in my Twitter feed for the same day as I talk about it.
00:31:13.180 It's usually there.
00:31:14.840 All right.
00:31:15.160 So I saw a video that apparently has been out in a while.
00:31:19.240 I don't know how many weeks, but it's a video in which Ted Cruz was talking to the FBI representative and asked whether the FBI had been involved in instigating any part of the January 6th event.
00:31:35.400 And here's what I found interesting.
00:31:38.220 Now, somebody said this is a real old video, like from January.
00:31:42.040 Does anybody have a confirmation of when this was?
00:31:45.200 Because I think some of you saw it in my Twitter feed.
00:31:48.020 But it's not brand new.
00:31:51.280 But it's relevant more now than when it was new.
00:31:55.940 Because now we're in the middle of these January 6 hearings, so we have a lot more context.
00:32:00.980 So now when you look at it, it really reads different.
00:32:05.320 Because we have more context.
00:32:07.460 And here's what happened.
00:32:08.500 So Ted Cruz, in his prosecutorial brilliance, because he's really good at this stuff, was grilling the FBI representative, asking the question in a number of ways, yes or no: was the FBI involved in any part of instigating January 6th?
00:32:30.500 And specifically, Ray Epps, and the FBI representative refused to answer any of the questions, and at one point mentioned sources and methods as something that they don't want to give away accidentally, so they can't answer questions because it's questions about the secret stuff.
00:32:51.900 Now, I interpret that.
00:32:53.900 Now, I interpret that as a confirmation that the FBI instigated January 6th, or was involved in some way that was important.
00:33:06.000 How do you interpret it?
00:33:08.140 Because the ways and means explanation isn't real.
00:33:13.360 It isn't real.
00:33:14.480 Because it would be very easy to carve out an answer that didn't give anything away.
00:33:20.960 And it goes like this.
00:33:23.320 Mr. Cruz, we don't give away, you know, all the details of our undercover work or operational stuff, but I can tell you, with complete certainty, that the FBI did not instigate anything or talk anybody into anything or try to get anybody to break the law.
00:33:42.000 We don't do that, and that definitely did not happen in this case.
00:33:46.540 Beyond that, I think you'll understand, I don't want to answer any detailed questions, because that would get to sources and methods.
00:33:53.260 But for the benefit of the public, let me just say, certainly, the FBI is not out there breaking laws.
00:33:59.280 They did not break any laws or try to incentivize anybody to break any laws for any purpose at all.
00:34:05.340 We don't do that.
00:34:06.640 That's not the business we're in.
00:34:08.000 Now, did I just give away any sources and methods?
00:34:15.220 I don't think I did.
00:34:17.160 Did I?
00:34:18.340 I don't believe there were any sources and methods that I just gave away.
00:34:22.840 But I told you a very direct answer, didn't I?
00:34:26.500 No.
00:34:28.080 The thing you're worrying about, no.
00:34:30.480 We didn't do that.
00:34:31.340 But beyond that, don't ask any detailed questions, because that would be giving away sources and methods.
00:34:37.680 Now, here's why I consider this a confirmation from the FBI that they were involved.
00:34:44.320 Because it's too easy to say you weren't.
00:34:47.340 It's too easy.
00:34:49.140 I just did it.
00:34:50.940 I'm not like the spokesperson for the government.
00:34:54.820 I would think that they would be trained well enough to answer a question like that.
00:34:59.300 It's not even hard.
00:35:01.000 I mean, what was the degree of difficulty in saying, we can't give away secret stuff, but
00:35:05.880 I'll tell you we didn't have anything to do with what you're alleging.
00:35:12.320 I mean, if you're not willing to say that directly, it's because you're confirming it's real.
00:35:19.460 I don't have another way to interpret it, do you?
00:35:24.320 Because if your interpretation is that they genuinely were trying to protect sources and methods,
00:35:31.220 I just gave you the answer that proves that wasn't real.
00:35:35.940 It was too easy.
00:35:39.160 So, this is another one of those outrage things, where I have outrage exhaustion.
00:35:44.020 Do I go work on this Border Patrol, you know, unfairly accused whipper gate problem?
00:35:51.120 Or do I do something about the fact that the FBI just effectively, for all practical purposes,
00:35:58.400 confirmed that they were behind instigating a series of violent actions that are now being blamed on Republicans?
00:36:07.860 People went to jail, and the entire politics of the country will change, probably because of this.
00:36:14.940 And the FBI is clearly hiding something that they shouldn't be hiding from the public.
00:36:20.620 Now, are you not outraged?
00:36:23.800 Because to me, they just admitted one of the biggest crimes of the century.
00:36:30.340 I mean, this is a really big crime, if I'm interpreting it correctly.
00:36:34.260 And if I'm not interpreting it correctly, the FBI could always clarify and say,
00:36:40.460 oh, let's follow up with a clarification.
00:36:43.660 We can't give you sources and method stuff, but we can say for sure that we were not instigators of that event.
00:36:52.800 Too easy.
00:36:54.120 It's just too easy.
00:36:55.840 And they can't do that.
00:36:57.240 They can't do that.
00:36:57.940 So, I'm just out of outrage.
00:37:04.520 I just don't have enough.
00:37:08.640 The individual whose name I can never successfully pronounce,
00:37:14.860 Cenk, Cenk Iger, is that close?
00:37:19.840 C-E-N-K, Cenk, I think that's Cenk, right?
00:37:22.720 And whenever I mispronounce anybody's name, I always like to tell you,
00:37:29.040 like, I feel bad about it, and I'm not doing it disrespectfully.
00:37:33.000 I just sometimes don't know the name.
00:37:37.520 Uygur?
00:37:38.840 So, it's Cenk Uygur.
00:37:42.260 Okay, let's go with that.
00:37:43.300 He says, what do we do when, this is in a tweet,
00:37:48.620 what do we do when 40% of our fellow Americans just don't believe in facts anymore?
00:37:54.840 What do you think of that?
00:37:55.860 What do we do when 40% of Americans just don't believe in facts anymore?
00:38:00.260 To which I say, that's not what's happening.
00:38:04.780 That's not what's happening at all.
00:38:06.840 Everybody believes in facts.
00:38:09.500 What we don't believe in is who's telling us the facts.
00:38:13.640 We don't believe you.
00:38:16.620 It's not the facts that are lying to us.
00:38:19.040 It's you.
00:38:20.240 It's you.
00:38:20.860 You're the liar.
00:38:22.620 No, you, you.
00:38:24.900 Facts.
00:38:25.880 Facts are excellent.
00:38:28.180 I haven't met a fact I didn't like.
00:38:31.080 In fact, I've never even had a fight with a fact.
00:38:33.880 No fact has ever tried to malign me.
00:38:36.640 No fact has ever insulted me.
00:38:39.240 In fact, I have a good relationship with facts.
00:38:42.160 That's when I know what they are.
00:38:45.100 What I don't have a good relationship with is big old liars who are telling me that they know the facts and I don't.
00:38:52.620 I got a big old problem with them.
00:38:54.420 And I saw on the same theme, Jim Cramer, investment guru Jim Cramer, he tweeted,
00:39:06.480 Why can't we accept that most Fed chiefs have not faced a heinous war against a democracy, interesting way to put it, a heinous war against a democracy in a large country that doesn't believe in the science?
00:39:21.120 Really, do we have a country that doesn't believe in the science?
00:39:27.140 Is that the problem?
00:39:29.440 Do you wake up ever and say, you know, I don't believe in science?
00:39:34.620 Nope.
00:39:35.100 So I sent a correction on his tweet.
00:39:39.280 I retweeted it with a correction.
00:39:40.840 I said, the public doesn't believe scientists for good reasons.
00:39:46.280 But we still believe science is our best system for figuring out the truth.
00:39:51.080 So let me be clear.
00:39:53.860 Science, we all like.
00:39:57.520 Am I right?
00:39:58.320 We all like science as the best system to get as close as you can to the truth over time.
00:40:05.400 Every one of us, 100%.
00:40:07.240 We don't necessarily like the scientists.
00:40:12.820 How about the facts?
00:40:14.380 Is there anybody here who hates the facts?
00:40:16.900 Any fact haters here?
00:40:18.780 Or disbelievers?
00:40:20.720 Do you believe facts don't exist?
00:40:23.360 No, no.
00:40:24.540 But I'll bet there are people who think that they're liars.
00:40:26.820 Do liars exist?
00:40:29.480 Yes, they do.
00:40:30.920 Yes, they do.
00:40:32.380 So let's get it straight.
00:40:35.700 We love facts.
00:40:37.120 We love science.
00:40:37.960 We don't trust scientists.
00:40:40.660 And we don't trust people who tell us we got our facts wrong.
00:40:44.720 Even if we did.
00:40:45.700 I'm just saying that we don't trust them.
00:40:48.460 Maybe you shouldn't trust yourself.
00:40:50.740 Because I'm pretty sure you get the facts wrong sometimes.
00:40:54.900 I'm pretty sure you've seen me get facts wrong fairly frequently.
00:41:00.900 Am I right?
00:41:02.120 I mean, I do this in front of people every day.
00:41:06.440 I mean, I'm doing this in front of witnesses.
00:41:08.080 So you know I'm trying to get it right.
00:41:12.120 It's like I'm immersed in trying to get the story right.
00:41:15.960 But you've seen me get basic facts wrong a bunch of times, right?
00:41:21.080 If you've been watching for a while.
00:41:22.860 Because I get called out in the comments all the time.
00:41:25.360 You've seen it on Twitter.
00:41:27.120 People will call me out on the facts.
00:41:28.460 I'll talk about one in a moment.
00:41:29.480 So, I don't know.
00:41:33.860 We like facts.
00:41:34.940 We don't like liars.
00:41:37.920 All right.
00:41:38.900 Have you noticed that everything is broken?
00:41:41.980 I feel like I woke up in a third world country.
00:41:45.420 I'm going to give you the smallest example just from my life.
00:41:49.000 If I could just be self-indulgent for a moment.
00:41:51.980 But I keep hearing stories from other people that they can't get the most basic damn things done that seemed to work fine just a while ago.
00:42:01.360 And part of it is because there aren't enough employees.
00:42:05.340 Some of it is shortages of goods.
00:42:07.720 Some of it is incompetence.
00:42:08.940 Some of it is red tape.
00:42:10.540 But some of it is complexity, too.
00:42:13.380 And some combination of all those things.
00:42:15.420 So there are all these weird forces that have sort of, you know, come together.
00:42:19.660 But here was a little experience I had.
00:42:22.260 All right.
00:42:22.460 So, was it yesterday?
00:42:25.340 I was trying to get some health care for somebody in my circle, not me.
00:42:31.500 So there's a minor injury.
00:42:33.520 Everything's fine.
00:42:34.280 Don't worry about the injury.
00:42:35.440 All right.
00:42:35.640 So the injury is not the story.
00:42:37.460 Minor injury.
00:42:38.200 It doesn't matter the details.
00:42:40.240 And I was trying to figure out, should I go to my Kaiser Permanente, Northern California,
00:42:47.880 minor care, so that's one thing they have, urgent care, that's a different thing they have,
00:42:54.760 or emergency care.
00:42:57.240 All of them are in different towns.
00:42:59.440 So it's not like you could go to one place and then they tell you which one.
00:43:03.520 They're different towns.
00:43:04.560 So you'd have to drive pretty far and find out you went to the wrong one.
00:43:09.960 Now, here's the thing.
00:43:12.100 So I try to find out, like, there should be a website where I can quickly see, if you have
00:43:17.720 this kind of problem, go here.
00:43:19.900 I couldn't find it.
00:43:21.840 But, of course, it's also when you don't have all of your senses about you because you're
00:43:26.360 working fast.
00:43:27.720 You know, you're trying to get something done quickly for somebody who needs it right away.
00:43:31.000 And so you're driving and you're, like, you're Googling on your phone and trying to read their
00:43:37.520 website to figure out where to go.
00:43:39.820 Is it minor, urgent, or emergency?
00:43:45.040 And call the hospital, somebody says.
00:43:48.280 Do you know how much that doesn't work?
00:43:51.040 So, yes, I tried calling.
00:43:53.520 And I said, and when you call the health care, they start asking you questions.
00:43:58.160 Sorry, what's the number on your card, your membership number, phone numbers, and stuff.
00:44:04.520 And so as soon as they answered and started asking questions, I said, let me shortcut this.
00:44:10.680 I only need to know where to go.
00:44:13.180 Here's the situation.
00:44:14.660 Tell me which facility to go to.
00:44:16.400 That's all I need.
00:44:17.860 What do you think happened?
00:44:18.900 Do you think that the employee said, oh, okay, hearing what your problem is, here's the
00:44:24.740 address you want to go to.
00:44:26.840 Nope.
00:44:27.620 Couldn't do that.
00:44:29.460 Do you know why?
00:44:30.800 Because I didn't have the health care number for the person.
00:44:35.740 So couldn't tell me which facility to go to.
00:44:39.060 Because one requires an appointment and one doesn't.
00:44:43.480 And so can't make an appointment without a membership number.
00:44:46.920 And couldn't even tell me where to go to, like, work it out in person.
00:44:50.700 I had to tell them where the facility was that was closest to me because they couldn't
00:44:56.600 figure it out.
00:44:57.940 Because the first thing they told me was the wrong city.
00:45:00.780 And I said, I kind of vaguely remember that the city right next to me has one of these.
00:45:06.120 Oh, let me check that.
00:45:07.700 Oh, well, you're right.
00:45:08.940 They do have one right next to you.
00:45:11.540 This isn't an emergency.
00:45:12.900 In an emergency, I, the customer, had to figure out, based on my vast medical experience, what
00:45:21.160 does minor care, urgent care, or emergency room mean?
00:45:24.920 And which is the right one?
00:45:26.080 What requires what waiting time?
00:45:31.060 And let me tell you that if you've got, you know, a minor injury that you're not a medical
00:45:37.740 expert, you think, well, it looks minor to me.
00:45:40.940 But what if it only looks minor to me and I go to the wrong place?
00:45:45.080 I couldn't get any help.
00:45:46.580 I ended up basically guessing.
00:45:49.020 And I think by the time we got actual help, I think it was eight hours later.
00:45:56.080 So this is somebody bleeding and got help eight hours later.
00:46:02.400 And I pay for that health care.
00:46:04.500 Now, I don't think that things were this bad a year ago or two years ago.
00:46:09.120 I feel like everything broke.
00:46:11.480 Show me in the comments, are you having the same experience?
00:46:16.780 Because that was just one anecdote.
00:46:18.660 You know, you don't need to care about my little problem.
00:46:21.660 But I feel like nothing's working.
00:46:23.640 Like, all the things are too hard.
00:46:28.000 All the things.
00:46:30.040 Yeah.
00:46:30.600 I don't know.
00:46:31.460 I don't know if it's a bunch of different reasons or there's, like, some larger effect
00:46:36.000 going on here.
00:46:36.780 But I just woke up in not even the same country anymore.
00:46:41.060 I mean, the headlines that I'm looking at are that my electricity is no longer reliable
00:46:47.000 in California.
00:46:48.880 And that there's a very high likelihood that I'll have neither water nor electricity sometime
00:46:54.620 this summer.
00:46:56.280 Think about that.
00:46:58.000 I could have no water and no electricity this summer.
00:47:01.880 And I don't think anybody's doing anything about it.
00:47:05.940 Seriously.
00:47:06.500 I don't think anybody's doing anything about it that I know of.
00:47:11.900 So, that's weird.
00:47:14.100 Let's talk about Republicans getting hunted.
00:47:17.960 How many situations have we seen where that's happening?
00:47:21.420 So, we've seen the Russia collusion hoax could have put people in jail.
00:47:25.640 Actually did, because of, you know, some unrelated reasons.
00:47:28.440 But, so that's a Democrat hoax that would have put Republicans in jail for a hoax.
00:47:37.820 This January 6th thing, now that the FBI has indirectly confirmed it, now that's my subjective
00:47:44.080 opinion, that in listening to their answer, the right way to process it is as a confirmation
00:47:49.980 that they were behind it in a substantial way.
00:47:52.720 We don't know how substantial.
00:47:54.740 But, it's there.
00:47:57.200 And that looks like, well, that's already put people in jail.
00:47:59.560 Now, they're in jail for doing things they shouldn't have done, in most cases.
00:48:03.920 But, it still was, it looks like, the larger context is, it's being treated in a hoax way,
00:48:12.020 basically.
00:48:12.840 Because, they're treating the small number of bad people as representative of the whole
00:48:19.240 event.
00:48:19.540 Then, there's this border patrol whipping thing, and then these border patrol people
00:48:24.420 will probably be punished in some way.
00:48:27.460 And, then there's also a story about, there's some terrorist group that's attacking the pro-life
00:48:35.200 centers.
00:48:36.800 So, that's, again, Republicans being hunted.
00:48:39.360 Now, as has been pointed out, in the past, this went the other way.
00:48:43.420 That it was the abortion provider centers that were being targeted.
00:48:47.000 But, I think that hasn't happened lately.
00:48:50.400 So, in terms of a trend, it looks like Republicans are being hunted.
00:48:56.880 I have a fascinating update on gun safety and murder.
00:49:03.340 Want me to blow your mind?
00:49:05.440 All right.
00:49:05.700 So, Panda Tribune, a Twitter account you should be following, made a claim yesterday that I
00:49:13.740 questioned on Twitter.
00:49:15.560 And, the claim was that from 1993 to 2013, the CDC found that gun murders per capita declined
00:49:24.260 by 40%, yet gun ownership per capita increased by 56%.
00:49:30.580 And, I said, that doesn't sound right.
00:49:35.400 So, Panda Tribune provided a long thread with full sources.
00:49:41.940 And, sure enough, Pew Research, I think, was the one, that during that time, gun murders
00:49:47.980 dropped a lot, 40%.
00:49:50.160 But, gun ownership zoomed up 56%.
00:49:53.580 So, would that suggest to you that either guns are unrelated to murder rates, or does it
00:50:03.220 suggest to you that owning more guns reduced the number of people getting murdered?
00:50:09.660 What would you say?
00:50:10.660 The correct answer is, you can't tell anything from this.
00:50:18.680 That's the correct answer.
00:50:20.220 The correct answer is that there was something that had nothing to do with guns that was changing
00:50:25.300 crime of all kinds.
00:50:28.780 So, I'm pretty sure from 1993 to 2013, that all kinds of crimes, violent crimes, were all way
00:50:35.760 less.
00:50:36.160 And, I think, there's some argument about what that was.
00:50:39.680 I mean, Freakonomics thought it was something about abortion.
00:50:44.180 I don't know if that's checked out or not.
00:50:46.680 But, there were some number of factors that substantially made everything safer at the same time the number
00:50:55.800 of guns was going up.
00:50:57.220 So, what could you say about gun ownership either making more or less gun murder based on
00:51:04.660 this data?
00:51:06.160 Well, I would say you can't tell, because there's something bigger than gun ownership, much
00:51:11.800 bigger, that's changing how people are acting, and that whatever the guns are doing is being
00:51:18.760 masked by some larger social change.
00:51:22.700 So, you could not tell.
00:51:25.040 You cannot answer this question from this data.
00:51:27.320 If I added guns or subtracted guns from this society, would there be more or fewer murders?
00:51:36.060 There's no...
00:51:36.960 This data doesn't tell you anything on that.
00:51:40.420 So, you can't answer the question, do more guns cause more murder?
00:51:44.860 Even if more guns happened during a time when there was less murder.
00:51:49.980 Because, remember, the larger homicide thing, whatever's making less violent crime in general,
00:51:55.900 is overwhelming whatever the guns are doing or not doing.
00:51:59.420 So, you just can't tell.
00:52:00.440 It's hidden.
00:52:01.100 You can at least say guns aren't a huge factor.
00:52:07.960 Can you?
00:52:09.420 Can you?
00:52:11.560 It depends how you define huge.
00:52:14.440 You could certainly say it's more of a gray area, where some people would say it's huge and
00:52:19.820 some wouldn't.
00:52:21.320 Maybe.
00:52:22.640 I think it's a little more of a gray area.
00:52:25.540 All right.
00:52:27.740 So, we don't know what the trend would have been if the larger homicide trends had not been
00:52:31.720 what they were.
00:52:33.500 But, here's another thing.
00:52:34.800 Did you know that gun deaths correlate with income?
00:52:39.920 You probably know that, right?
00:52:41.120 The higher your income, the less likely you're going to get murdered with a gun.
00:52:47.300 Right?
00:52:47.740 And so, when I look at the United States having, you know, sort of extreme extremes of poverty
00:52:55.080 and rich people, so we've got income inequality extremes, and then we blend it all together
00:53:02.980 to see how the United States does compared to other countries.
00:53:06.440 But the United States is sort of like two countries.
00:53:10.100 You know, one is poor America, where things are terrible, and one is rich America, where
00:53:15.800 things are better.
00:53:16.440 And they don't have the same gun violence rates, not at all.
00:53:21.640 So, again, there's something much larger than having a gun or not having a gun that's causing
00:53:27.180 people to die or not die.
00:53:30.140 Also interesting that gun deaths were correlated with more suicides, which is largely a white
00:53:37.920 person problem, versus murder, which, on a per capita basis, is more of a black person problem.
00:53:48.360 But then I always think it's weird to lump together.
00:53:50.820 How many of you have the same feeling?
00:53:52.280 I think this is common.
00:53:53.180 That when we statistically lump together suicide and murder, that's just an apple and an orange, isn't it?
00:54:03.320 Because here's the problem philosophically.
00:54:07.380 If you get murdered, you've got something you didn't want.
00:54:12.580 If you commit suicide successfully, you've got something you want.
00:54:17.080 And we add together the thing you want.
00:54:20.620 Now, I get it.
00:54:21.680 I'm not promoting any suicide.
00:54:24.140 So, see a professional if you have those thoughts.
00:54:26.560 Trying to be responsible here.
00:54:28.700 But it is nonetheless, by definition, something you want versus something you don't want.
00:54:34.600 Now, maybe you shouldn't want it.
00:54:36.560 That's a good conversation to have.
00:54:38.080 Maybe you could be talked out of it.
00:54:39.360 Maybe it's never a good idea until you've, you know, seen every health professional you could possibly consult.
00:54:45.640 But the fact is, if we just lump them together, we're just blinding ourselves to whatever the hell is going on.
00:54:51.380 We're using data to shield ourselves from knowledge, if you add those two things together.
00:54:58.440 So, we should forever separate them.
00:55:01.400 And we should probably separate rich and poor.
00:55:04.640 Because, suppose we find out that it's terrible for poor people to own guns, but it's actually a pretty good idea for rich people to own guns.
00:55:15.520 What if we find out that's true?
00:55:17.440 What if the data says that?
00:55:19.520 Well, then we have something to work with.
00:55:21.580 You could say if you're in a poor zip code, maybe, you know, maybe you don't get a gun.
00:55:26.860 But if you do well, or maybe it's by income.
00:55:30.060 Maybe it's literally by income.
00:55:31.360 If you're above a certain income, you could have a gun.
00:55:33.860 Now, that would be the least constitutional thing anybody could ever do.
00:55:38.380 But the data would lead you there.
00:55:42.200 But, however, the Democrats could not be led there by the data.
00:55:45.620 Because they can't say rich people can have guns and poor people can't.
00:55:49.700 But the data might say so.
00:55:52.200 Do you think I'd be wrong?
00:55:54.080 Do you think if you studied rich people gun ownership versus poor people gun ownership,
00:55:59.440 do you think they would look similar?
00:56:01.540 I don't think so.
00:56:02.320 I don't think so.
00:56:04.180 So if you're following the data, just make gun ownership based on your income.
00:56:13.600 Again, it's the most unconstitutional idea.
00:56:17.060 So, you know, you can't really do that.
00:56:19.440 But, you know, the data takes you in strange places.
00:56:21.700 So there's a new Hunter Biden audio where he's bragging to somebody that he can get his dad to agree to anything as long as it's sort of compatible with what Joe Biden might want to do.
00:56:34.540 So it's more about changing his priorities than changing his mind to do something you wouldn't want to do.
00:56:40.180 Important distinction.
00:56:41.180 But I don't know that this is new news, is it?
00:56:47.120 Because Joe Biden has said publicly a number of times that Hunter is the smartest person he knows in person.
00:56:53.720 So it does kind of follow that he would listen to his advice.
00:56:57.280 So it's sort of a story, non-story, because it does, of course, suggest that he's selling his influence and everything else.
00:57:08.880 But the way he's selling it and the way he describes it in the audio does say that he can influence the priority.
00:57:20.400 Not necessarily something that Biden wouldn't want to do, but the priority is of how he spends his time.
00:57:27.120 And that's pretty frightening.
00:57:28.800 But we also kind of knew that, didn't we?
00:57:31.860 Did you not think that Hunter could influence his father's priorities?
00:57:35.880 I feel like I knew that.
00:57:38.880 But hearing it directly is something.
00:57:44.340 So there's a congressman, Jamaal Bowman from New York.
00:57:52.600 He's a Democrat, and he tweeted that Elon Musk is a supporter of white supremacy
00:57:57.020 due to his announcement that he last night voted for Mayra Flores.
00:58:03.320 Now, if you know this story, it's immediately funny.
00:58:08.880 Because they're calling Elon Musk a white supremacist for voting for a Mexican-born congressional candidate.
00:58:21.480 And Glenn Greenwald called him out on that, as he should.
00:58:25.960 And, yeah, I've told you that the world is really small.
00:58:30.920 Like, every time I'm listening to a story or watching a story, it's shocking how often I have some connection to it that I didn't know.
00:58:41.760 And so I went to look at the Twitter account for Myra Flores, and she was already following me.
00:58:54.620 So I followed her back and congratulated her for a good victory.
00:58:58.760 And she thanked me.
00:59:01.020 So the world is so tiny.
00:59:05.800 You can't appreciate how completely mind-bending it is to read stories and then say,
00:59:13.400 Oh, she follows me on Twitter.
00:59:15.460 I congratulate her, and she actually had the time to thank me.
00:59:18.120 It's like, it made this whole world just shrink down to, you know, two people sending a message to each other.
00:59:25.760 It's just the damnedest thing.
00:59:29.300 All right, if I were a Republican running and I saw stuff like this, like that tweet, everything turning racial,
00:59:36.140 here is how I would high-ground the hell out of it and destroy whoever came after me with this attack.
00:59:41.560 I would say, you know, I'm running against somebody who has a one-variable filter.
00:59:48.400 For them, everything runs through the race filter.
00:59:51.780 And I, too, think that the race filter is an important one.
00:59:54.440 You don't want to lose sight of that.
00:59:56.060 You know, it's been a big factor in the United States forever.
00:59:58.880 But I would suggest that you want somebody who works for you,
01:00:02.440 who can handle more than one filter on the world.
01:00:05.700 That's what I'm going to bring.
01:00:07.300 I'm not going to lose the filter that race is a big factor,
01:00:10.080 and we need to keep an eye on that.
01:00:11.860 We need to make things as fair as we can for everybody.
01:00:15.460 But if you're voting for somebody whose only filter is that,
01:00:18.940 you know, if everything looks like a nail to them,
01:00:22.320 they're only going to bring you a hammer.
01:00:24.300 And I'm going to bring you the whole toolbox.
01:00:26.900 So if you want to fix more than one thing, I'm your person.
01:00:31.740 If you feel there's only one thing that needs to be fixed,
01:00:34.920 and a lot of people do, to be honest,
01:00:36.720 then a one-hammer candidate might be your candidate.
01:00:42.520 But I'm going to offer you more than that for the same price.
01:00:47.180 Who doesn't like a bargain?
01:00:49.980 I'll handle the fairness issue as best I can
01:00:53.260 and as transparently as I can,
01:00:56.020 and I'll listen to everybody's complaints,
01:00:58.420 and I'll be serious about it.
01:01:00.180 But I'll also try to solve other problems,
01:01:02.160 and I won't see everything through the lens of race
01:01:05.080 because I don't think it's good for you.
01:01:08.100 So that's my proposition.
01:01:10.620 For the same price,
01:01:12.120 because I would get the same salary as a congressperson
01:01:15.100 as anybody else,
01:01:16.560 for the same price, I'll bring you more tools,
01:01:19.200 and I'll work on more problems,
01:01:20.800 and I won't see it through that one frame,
01:01:23.260 and you'll get a complete candidate.
01:01:27.680 Am I done?
01:01:28.460 Imagine listening to me as a hypothetical candidate
01:01:34.060 just giving you that proposition.
01:01:37.160 It would be over.
01:01:39.440 It would be over.
01:01:41.540 Like, that's it.
01:01:42.800 You wouldn't even need to run campaign ads.
01:01:45.200 Just explain what your proposition is.
01:01:48.120 I'm giving you more tools for the same price.
01:01:50.840 Why would you take a single-minded approach
01:01:53.220 when you can have more?
01:01:55.180 Accept more.
01:01:56.420 Don't settle for less.
01:01:57.440 Yes, you should have everything you want
01:01:59.440 and a few things you didn't know you wanted.
01:02:01.480 That's what I'm going to give you.
01:02:03.160 I'm going to try to give you everything you want
01:02:04.860 and a few things you didn't even know you could ask for
01:02:07.540 for the same price.
01:02:10.340 Why would you vote for the other one?
01:02:13.700 You know, I feel like running for office is easy,
01:02:16.820 and we just have people who are not good at doing it.
01:02:19.800 That's why it looks like it's not easy.
01:02:21.460 All right.
01:02:27.720 So President Biden tweeted,
01:02:30.380 This morning I spoke with President Zelensky
01:02:32.600 to discuss Russia's brutal and ongoing war with Ukraine.
01:02:37.080 I reaffirmed our commitment to stand by Ukraine
01:02:39.320 and shared that the United States is providing
01:02:41.140 over $1.2 billion in additional security
01:02:43.420 and humanitarian assistance.
01:02:44.860 And I ask you the following question.
01:02:48.260 Are we going to see that same tweet
01:02:50.280 every two weeks for 20 years?
01:02:52.980 It'll be a different president.
01:02:55.340 But is it just going to be groundhog day over and over?
01:02:59.040 We gave Ukraine another $1.4 billion
01:03:02.540 because it's a brutal war, Putin sucks,
01:03:06.420 you know, they need it for all these reasons.
01:03:11.340 I don't feel like this could ever end.
01:03:14.040 Like, when is Ukraine going to, like, be self-sufficient?
01:03:18.580 No time soon.
01:03:20.980 Now, I'm not saying I have a better idea,
01:03:23.440 but I think you can see the future here, can't we?
01:03:26.100 We're just going to be bleeding money into Ukraine forever.
01:03:31.060 Again, I don't know what the alternative is.
01:03:32.960 I don't have a better idea.
01:03:34.000 But, ugh.
01:03:37.700 And the Ukraine defense minister says
01:03:40.060 that the new U.S. weapons will help them get back to Crimea.
01:03:43.720 So, in other words, there's no end in sight.
01:03:46.780 Because as long as our, you know,
01:03:48.440 military-industrial complex
01:03:49.960 wants us to be at permanent war with somebody,
01:03:53.820 we will be.
01:03:55.420 And I guess this is a pretty profitable war
01:03:57.480 because you've got to sell them the new stuff,
01:03:59.080 you know, the modern equipment to have a chance.
01:04:02.020 So, I don't know.
01:04:05.020 I'd say follow the money.
01:04:06.320 It looks like we're in another permanent war
01:04:08.220 because it's good for the military-industrial complex.
01:04:12.020 And when Ukraine says they're going to get so many great weapons
01:04:15.500 that they might take Crimea back,
01:04:18.200 which to the rest of us seems like,
01:04:20.520 uh-oh, no.
01:04:22.420 Don't try to get Crimea back.
01:04:24.800 Let it go.
01:04:25.480 Because, you know, it's not my Crimea.
01:04:28.180 You see, it's easy for me to say,
01:04:31.560 Crimea River.
01:04:34.020 I can't be the first person who thought of that.
01:04:37.360 Right?
01:04:39.040 Crimea River.
01:04:41.380 Okay.
01:04:42.300 So, let me back up and be less dismissive of Crimea.
01:04:47.700 So, I have great empathy for the people who live there
01:04:50.200 because they're just being used as a political football.
01:04:52.620 I mean, it must be just a hell on earth to be in Crimea
01:04:56.740 and just be everybody's, you know, pull toy.
01:05:01.060 So, for all empathy to the people of Crimea,
01:05:06.480 do you really want Ukraine to try to take it back?
01:05:11.140 Is that good for you?
01:05:13.320 Is it good for anybody?
01:05:14.740 Is it good for the Crimeans?
01:05:16.600 I don't know.
01:05:17.840 Who knows?
01:05:18.580 So, anyway, looks like that'll be permanent.
01:05:27.000 I call it the Donbass region.
01:05:30.460 Not the Donbass, but the dumbass,
01:05:33.000 because I don't think they're ever going to stop fighting there.
01:05:37.520 Crimea's a beautiful little piece of land.
01:05:39.300 I'll bet it is.
01:05:40.020 It looks like its location is kind of awesome.
01:05:44.680 Somebody says Hunter Biden's making a profit off the war.
01:05:47.500 How many of you think the Ukraine war
01:05:49.740 has something to do with the Bidens hiding their crimes?
01:05:56.240 Is it a coincidence that Zelensky got the President of the United States,
01:06:03.380 who was at least alleged
01:06:05.820 to be somehow in the blackmail pockets of Ukraine,
01:06:10.000 and that he's acting exactly like somebody who is being blackmailed?
01:06:18.440 Now, I'm not alleging that that's the case,
01:06:20.840 because usually these conspiracy theories don't turn out to be true.
01:06:25.660 But you could say, I think it would be fair to say,
01:06:28.940 that the way President Biden is acting,
01:06:32.260 if you were just looking at all of his list of actions for Ukraine,
01:06:35.920 he acts as though he is being blackmailed.
01:06:39.420 Meaning that if he doesn't give Zelensky everything Zelensky wants,
01:06:44.160 Zelensky might know too much.
01:06:48.640 Now, it's also possible that Joe Biden is very pro-NATO, anti-Russian,
01:06:56.160 and every single thing he's doing is exactly what he would have done
01:06:59.700 under any situation, because it's just what he believes,
01:07:03.220 and it's compatible with, you know, everything.
01:07:07.600 But those two hypotheses both are supported by observation.
01:07:13.360 He looks and acts exactly like somebody who's just a Democrat and a President,
01:07:17.560 and he looks and acts exactly like somebody who's being blackmailed,
01:07:21.720 when we have a strong suspicion that there's a good reason he might be.
01:07:28.720 I mean, I don't know how to put the odds on it.
01:07:31.580 Let's look at it this way.
01:07:32.880 If you were to put the odds on it, with only what we know, which is almost nothing,
01:07:37.760 what are the odds that President Biden feels a little blackmail pressure from Ukraine?
01:07:46.400 What are the odds?
01:07:49.320 25%?
01:07:51.620 I don't know.
01:07:52.320 There's no way to put a percentage on that.
01:07:54.680 So I think if you're certain one way or the other you're wrong,
01:07:58.060 I mean, you're thinking wrong,
01:07:59.740 because certainty is the wrong thing to have in this case,
01:08:01.720 but I would give it 25%.
01:08:04.380 And so we're involved in a war that could turn into a nuclear war,
01:08:11.020 and I consider myself a fairly unbiased observer on this question.
01:08:17.240 I think.
01:08:18.500 I mean, I don't know, but I think I am.
01:08:21.140 And I would put it at 25% chance
01:08:23.580 that our President is not even acting in the best interest of the United States.
01:08:29.260 That's pretty terrible.
01:08:31.720 Now, remember that the Democrats do a lot of projection,
01:08:34.500 so that's exactly what they wanted Republicans to think about Trump.
01:08:40.260 That you don't know if he's in Russia's pocket,
01:08:44.140 but look at all this.
01:08:45.880 You know, there was that time he said something good about Putin,
01:08:48.300 and there was that thing with a German bank
01:08:51.960 that might have had something to do with a Russian thing,
01:08:54.760 but no evidence.
01:08:56.420 And there was the allegation in the Steele dossier,
01:08:59.280 which wasn't real.
01:09:00.900 And then it gives you this sense
01:09:03.580 that even though, like, your logic says,
01:09:06.520 you know, I don't think there's any Trump problem with Russia,
01:09:09.680 but they can create enough doubt
01:09:12.560 that you can walk around saying,
01:09:15.060 but maybe, but maybe.
01:09:18.200 They did produce a lot of smoke.
01:09:20.840 As we know, it was hoax-based smoke.
01:09:24.360 You know, now we know that.
01:09:25.880 But at the time,
01:09:27.520 if you said,
01:09:28.560 you know,
01:09:29.640 there is a solid
01:09:31.380 10 to 20% chance
01:09:34.180 that there's something wrong there,
01:09:36.160 you wouldn't have been crazy.
01:09:38.520 You just probably would have been wrong.
01:09:40.980 You just wouldn't have been crazy.
01:09:42.800 Yeah, it's hoax-y smoke.
01:09:48.520 You just did the same thing.
01:09:51.760 I think you're right,
01:09:52.840 but what did I just do?
01:09:56.680 Oh, I just did the same thing
01:09:57.980 in terms of saying
01:09:58.720 there's a 25% chance
01:10:00.600 that Biden,
01:10:01.940 blah, blah, blah.
01:10:02.680 Yes, I was trying to do the same thing.
01:10:05.880 So, you were correct.
01:10:09.660 Just keep avoiding
01:10:10.540 the huge elephant in the room.
01:10:11.840 What's the huge elephant in the room?
01:10:16.960 The huge elephant in the room is...
01:10:18.900 an elephant.
01:10:31.700 All right.
01:10:36.620 Natural gas shortage in Europe this winter.
01:10:39.120 Yeah.
01:10:40.620 So, on social media,
01:10:42.060 there are all these stories
01:10:43.040 about refineries blowing up mysteriously,
01:10:46.020 and there was another recently.
01:10:47.880 But I don't see any of that
01:10:49.060 in the real news.
01:10:50.820 None of that's real, is it?
01:10:54.480 Do you think...
01:10:55.880 So, the food processing plants
01:10:59.660 and also, you know,
01:11:02.140 energy facilities
01:11:03.100 are having some kinds of,
01:11:04.620 you know, weird number of accidents
01:11:07.580 that don't look like accidents.
01:11:08.940 But, does that look real?
01:11:15.860 There's nobody in any
01:11:17.220 professional news organization
01:11:19.800 who's saying it's real, are they?
01:11:21.760 I haven't seen it.
01:11:25.020 Yeah.
01:11:26.320 I think these are just industries
01:11:28.180 that are prone to accidents,
01:11:30.720 aren't they?
01:11:31.700 Because I feel like all my life,
01:11:33.620 I've heard of refinery accidents.
01:11:36.300 So, I live fairly near a refinery.
01:11:40.800 Fairly near, meaning a short drive.
01:11:43.260 So, the refinery's in the Bay Area,
01:11:45.420 and they have periodic alerts
01:11:49.260 and accidents.
01:11:50.760 So, they have a sophisticated system
01:11:52.960 for warning the nearby town
01:11:55.380 of any, you know, leaks
01:11:57.700 or fires or anything.
01:11:59.360 And it's been used.
01:12:01.080 I think twice,
01:12:02.460 maybe twice in my time here,
01:12:07.340 there have been accidents
01:12:08.800 which I think fire was involved
01:12:10.800 at the refinery.
01:12:13.720 And we know that those
01:12:15.420 were not any kind of terrorist acts.
01:12:17.740 So, if I know one refinery
01:12:19.460 that's had more than one accident
01:12:21.600 since I've lived next to it,
01:12:25.060 it seems like maybe that's just
01:12:26.400 an industry that has lots of accidents
01:12:28.040 because of the nature of the business.
01:12:29.840 And maybe food processing plants
01:12:33.160 have lots of accidents?
01:12:35.620 Is there any reason they would?
01:12:36.880 I don't know.
01:12:37.720 And maybe they don't.
01:12:39.000 Maybe there's just lots of them.
01:12:41.680 And if you hear that one has an accident,
01:12:43.320 that's what sticks in your mind.
01:12:45.040 So, I guess I'm a skeptic
01:12:47.640 on the refineries
01:12:49.500 and the food processing plants.
01:12:51.880 But I'm definitely interested.
01:12:54.700 Right?
01:12:54.820 Like, my interest has been piqued,
01:12:58.220 but I think we're well short
01:13:00.580 of anything that looks like
01:13:01.620 proof that something's happening.
01:13:05.820 All right.
01:13:06.700 That is all I have for now.
01:13:08.760 And I believe I've delivered
01:13:10.800 one of the best entertainment values
01:13:13.220 of your entire life.
01:13:15.040 For the price of basically nothing.
01:13:17.340 Well, depending on what platform.
01:13:21.920 You have been entertained
01:13:24.240 for over an hour, I believe.
01:13:26.960 What time is it?
01:13:27.660 Over an hour.
01:13:29.140 And I think it's the best thing
01:13:30.460 that's ever happened to you.
01:13:32.380 Or at least today.
01:13:34.060 Let's agree on that.
01:13:35.400 And we'll be back tomorrow.
01:13:36.760 And it will be amazing.
01:13:37.880 Somebody says,
01:13:43.820 this is my closing troll.
01:13:48.520 Scott's career and credibility
01:13:50.040 has really gone downhill.
01:13:53.560 And let's end on that.
01:13:55.380 Bye for now.