Real Coffee with Scott Adams - March 30, 2023


Episode 2063 Scott Adams: Newsom's Reparations Trap, IQ With Healthcare, AI Control, Restrict Act


Episode Stats

Length

1 hour and 14 minutes

Words per Minute

143.4

Word Count

10,663

Sentence Count

930

Misogynist Sentences

18

Hate Speech Sentences

26
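
(A quick sketch of how the words-per-minute figure follows from the other stats, assuming the "1 hour and 14 minutes" is rounded; the variable names are just illustrative.)

```python
# Sanity check: words-per-minute is word count divided by runtime.
word_count = 10_663
words_per_minute = 143.4  # as reported above (rounded)

runtime_minutes = word_count / words_per_minute
print(f"{runtime_minutes:.1f} minutes")  # ~74.4, i.e. about 1 hour 14 minutes
```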


Summary

This is the most mind-boggling thing you'll ever see in your life: an audio illusion in which there are eight entirely different sets of words that you can hear clearly by reading them while the same sound is playing. The episode opens, as usual, with the "Simultaneous Sip."


Transcript

00:00:00.000 Good morning, everybody, and welcome to Coffee with Scott Adams. It's the highlight of civilization.
00:00:07.000 You might notice my voice is a little raspy, got a little laryngitis.
00:00:12.000 So, turn up your sound, put on your headphones. This is as loud as I can get.
00:00:18.000 Now, if you'd like to take your experience up to levels that, I don't know, are just impossible to imagine at this point,
00:00:25.000 all you have to do is grab yourself a cup or a mug or a glass, a tankard, a chalice, a stein, a canteen, jug or flask, a vessel of any kind.
00:00:34.000 Fill it with your favorite liquid. I like coffee.
00:00:37.000 And join me now for the unparalleled pleasure, the dopamine hit of the day, the thing that makes everything better,
00:00:42.000 and today it comes with a little oxytocin, too. It's called the Simultaneous Sip. It happens now. Go.
00:00:49.000 Well, Jacinta, I hope you enjoyed your very first Simultaneous Sip.
00:00:59.000 Everybody, congratulations to Jacinta. Good.
00:01:06.000 I hope I'm pronouncing that right. I turned the J into a Y. Is that correct?
00:01:13.000 J-A-C-I-N-T-A. Jacinta? Sounds about right.
00:01:19.000 All right. Well, how many of you have seen the famous, probably every one of you,
00:01:26.000 the famous audio illusion in which you can hear, what is it, green needle or something else?
00:01:35.000 So you can hear two different things, even though it's always the same.
00:01:38.000 Well, have you seen the one where there's a whole list of things you can see?
00:01:43.000 This is the one that will completely change your opinion about the world.
00:01:49.000 Now, I don't know if you can hear it. Hold on.
00:01:57.000 All right. So before I put on the audio, just see that there's a whole list of different things,
00:02:04.000 which the same audio will sound like, but only if you're looking at it at the time.
00:02:10.000 So in other words, if you're looking at the line that says baptism piracy, you hear exactly that on the audio.
00:02:18.000 As soon as you move to any other thing on the list, and there are two, four, six, eight,
00:02:24.000 there are eight things here, eight completely different sets of words, completely different.
00:02:31.000 You can hear each of them clearly by reading it at the same time as the music.
00:02:36.000 Now, I don't think it'll come through very well on my audio,
00:02:39.000 but you'll at least get the sense that this is the most mind-bending thing you'll ever see in your life, right?
00:02:47.000 So here's the sound.
00:02:49.000 I just realized this doesn't work at all because you can't read the sentences.
00:03:06.000 But you can hear this as Bart Simpson bouncing, rotating pirate ship, that isn't my receipt,
00:03:14.000 lobsters in motion, that is embarrassing, lactates in pharmacy, baptism piracy, or that isn't mercy.
00:03:29.000 No sound? Yeah, there's plenty of sound.
00:03:31.000 So the people saying there's no sound are just trolls.
00:03:37.000 Now, here's why this is so mind-blowing.
00:03:41.000 When the audio illusion was two different things, you could tell yourself,
00:03:47.000 oh, well, that's a weird coincidence.
00:03:49.000 There are two different things that, in the right conditions, you would confuse with each other.
00:03:55.000 In a way, it's not really that interesting, is it?
00:03:58.000 Right? It's not really interesting.
00:04:00.000 So, but what happens when there are eight entirely different sets of words that you can hear clearly
00:04:08.000 by looking at them when the same sound is playing?
00:04:12.000 How do you explain that?
00:04:14.000 The only way to explain that is that reality is super subjective.
00:04:21.000 Now, you knew reality was subjective because we all have different opinions looking at the same stuff.
00:04:26.000 But I'll bet you didn't know it was that subjective.
00:04:29.000 I mean, that is really subjective.
00:04:32.000 And that might be the way you walk around through life all the time.
00:04:38.000 You know, I think it was in the 60s, the first time I heard, you hear what you want to hear.
00:04:44.000 You see what you want to see.
00:04:46.000 And the first time I heard that, it was sort of mind-blowing.
00:04:48.000 Really?
00:04:49.000 Do you hear what you want to hear?
00:04:52.000 Hmm, I'm not so sure.
00:04:54.000 Because, you know, I was a kid.
00:04:55.000 I was like, I don't know.
00:04:56.000 That sounds a little simplistic, doesn't it?
00:04:59.000 But you do.
00:05:00.000 You actually hear what you want to hear.
00:05:02.000 You can see this with news stories.
00:05:05.000 How many times have you seen on social media somebody would tweet a story and say,
00:05:10.000 look, this proves X.
00:05:12.000 And I look at it and go, no, it doesn't.
00:05:15.000 It disproves X.
00:05:18.000 Literally the opposite.
00:05:20.000 So that's the reality that you walk around in.
00:05:23.000 There is no fixed reality.
00:05:25.000 Or if there is, we can't tell what it is.
00:05:30.000 I would like to give one clarification, an important one.
00:05:34.000 When Elon Musk and other prominent people said we should pause the AI experiments, I said,
00:05:42.000 that's a good idea.
00:05:44.000 Other people said, wait, wait.
00:05:46.000 China and other countries will blaze ahead of us if we slow down.
00:05:51.000 And that would be dangerous.
00:05:53.000 To which I say, I'm not in favor of stopping development.
00:05:59.000 So I'm only in favor of not making it available to the general public too soon.
00:06:05.000 So that's the only pause I would favor.
00:06:08.000 I don't favor stopping research.
00:06:12.000 I don't favor stopping training it.
00:06:15.000 I do believe that we have a competitive national security and financial interest in being first and being best.
00:06:25.000 And we're lucky that we have a company that seems to be first and best, at least at the moment.
00:06:31.000 So I wouldn't want to lose that asset.
00:06:34.000 It's one of the things that makes America as strong as it is.
00:06:38.000 That we have entrepreneurs who can create technologies that change the world.
00:06:42.000 It looks like that's what's happening.
00:06:44.000 But I do think it would be wise to keep it out of the hands of citizens for a little while.
00:06:51.000 Or even other commercial interests.
00:06:53.000 And let the AI people do what they're doing and then we can decide what to do with it.
00:06:59.000 But here's a take on this that is sort of a clarification of something I've said before.
00:07:07.000 The real risk of AI is that if it starts telling us stuff that isn't true, it will be banned or ignored.
00:07:17.000 Would you agree?
00:07:18.000 If you can't trust AI to tell you what's true, which is the current situation, then it's not reliable.
00:07:24.000 I found that my need to use it was much lower once I realized that it was full of shit.
00:07:32.000 Did anybody have that experience?
00:07:35.000 When it first came out, I thought, oh my God, I'm so interested in what it says.
00:07:40.000 But as soon as I knew that it lies or just randomly makes up facts, I thought, oh, it's just a random word generator.
00:07:50.000 I have no interest in a random word generator.
00:07:54.000 A random word generator that looks intelligent to people who don't look into it too carefully is not interesting in the least.
00:08:03.000 But will it always be like that?
00:08:08.000 So there are two conditions.
00:08:09.000 One, if AI is undependable, it won't be that useful and people won't be caring about it.
00:08:16.000 But what if it becomes accurate?
00:08:19.000 And how would we know?
00:08:22.000 It's almost as dangerous if it's accurate.
00:08:26.000 Because if it's accurate, then there will be no common narrative for the country.
00:08:32.000 Here's the problem.
00:08:35.000 In my youth, in, let's say, the '40s and '50s or '60s, we know now that the CIA was actively manipulating movies and TV and trying to create a culture that would be successful and defensible.
00:08:54.000 And they succeeded.
00:08:57.000 The so-called, you know, American dream of, you know, owning a house and being a consumer and having a nice car and being a Christian back in those days, that was a key thing.
00:09:10.000 Those things were never natural.
00:09:13.000 Those things were programs.
00:09:15.000 So the United States wisely and effectively programmed its citizens.
00:09:22.000 How many of you stood and did the Pledge of Allegiance?
00:09:25.000 You know that's just brainwashing, right?
00:09:28.000 It's good brainwashing.
00:09:30.000 It's the kind I approve of.
00:09:32.000 Because you can't really let society take its own wild direction any way it goes.
00:09:38.000 It's just chaos.
00:09:39.000 So there probably is some need to brainwash at least children.
00:09:44.000 At the very least, you have to brainwash the children.
00:09:47.000 Because they can't make decisions.
00:09:49.000 Now, you could argue that adults can't either.
00:09:52.000 But certainly with children, we'd all agree.
00:09:55.000 So you have to brainwash children.
00:09:58.000 Later, you can teach them critical thinking and maybe they'll change their opinions on things.
00:10:02.000 But they would do it somewhat on their own.
00:10:05.000 However, it appears that the CIA got out of that business for a while until Obama put them back in that business through legislation.
00:10:18.000 So once you know that the intelligence services of the United States can legally, legally brainwash the citizens,
00:10:29.000 and there's actually a utility to it.
00:10:31.000 It's not all bad.
00:10:33.000 Then you have to say to yourself, what are they brainwashing us with?
00:10:37.000 Well, at the moment, they're brainwashing us with wokeness.
00:10:42.000 Because they could make it go away.
00:10:44.000 They could make it go away.
00:10:47.000 If the CIA or whoever, you know...
00:10:51.000 Here's a phrase that I saw in an article by Jacob Siegel writing for Tablet.
00:11:00.000 And it's called A Guide to Understanding the Hoax of the Century: Thirteen Ways of Looking at Disinformation.
00:11:07.000 The problem is, in our current situation, the intelligence entities of the United States have formed working relationships with social media and the news.
00:11:20.000 Now, the way they approach it is that they're going to help the news and help social media remove disinformation.
00:11:30.000 So who says no to that?
00:11:32.000 You go to the news and you say, hey, we know what's true and what isn't.
00:11:36.000 Wouldn't you like to report true news?
00:11:38.000 What's the news going to say?
00:11:41.000 Of course.
00:11:42.000 Of course.
00:11:43.000 You know, tell us what's true.
00:11:45.000 You know, they might check it, but at least they'd want to know what's true.
00:11:49.000 So, if you have an entity that is selling itself as the correctors of disinformation, they really are in control of your information.
00:12:00.000 You've given them control of your minds because you've decided that some entity can tell you what's true and not true.
00:12:07.000 And believe me, they don't know.
00:12:10.000 They don't know.
00:12:11.000 They're just going to tell you a version that they think is to their best interest.
00:12:15.000 So, we have a situation where our media is completely, and here's the phrase I wanted to read from Jacob Siegel because he said it so well.
00:12:25.000 He said the American press, once the guardian of democracy, was hollowed out to the point that it could be worn like a hand puppet by the U.S. security agencies and party operatives.
00:12:39.000 The media is so hollowed out it could be worn as a hand puppet by the CIA, U.S. security agencies, and party operatives.
00:12:49.000 Perfect.
00:12:50.000 Perfect.
00:12:51.000 First of all, it's great writing because it's visual.
00:12:54.000 Secondly, it just captures the entire situation.
00:12:59.000 Yeah.
00:13:00.000 So, we're so addicted to watching the news that we think it's real.
00:13:05.000 Even when we know it's not real, we act like it is.
00:13:08.000 I do that every day.
00:13:10.000 I know the news isn't real, and I still act like it is, and I can't break myself of the habit.
00:13:17.000 I tell myself, well, you know, I'm working on this assumption, or, you know, it's real until it changes.
00:13:26.000 You know, I tell myself ridiculous things, but what I should tell myself is none of it's real, because it's not.
00:13:33.000 The facts might be real, but the narrative in which it is presented is always artificial.
00:13:38.000 So, I do believe in pausing AI, but how in the world do we get AI to be free from the same things that corrupted our media, which is intelligence agencies?
00:13:54.000 How do you keep intelligence agencies from owning AI?
00:13:59.000 Can you?
00:14:00.000 I don't think it's possible, because they have more power, and they're better at it.
00:14:05.000 See, here I think, this is my speculation, the way intelligence agencies can co-opt social media managers and news people,
00:14:17.000 is that when the CIA or somebody, you know, like FBI, somebody official, comes in and meets with you,
00:14:24.000 and they say, hey, let us be partners and let us help you, the first thing you might think is, oh, I've got a partner, free partner.
00:14:32.000 Maybe it's free, maybe it's not.
00:14:34.000 But I've got this partner, like, they'll help me.
00:14:37.000 But the trouble is, intelligence agencies are never your partner.
00:14:41.000 They're always your boss.
00:14:43.000 They're always your boss.
00:14:45.000 They're never your partner.
00:14:47.000 They might come in as your partner, but try to do something they don't want.
00:14:51.000 See how that goes.
00:14:53.000 And some of it is ego.
00:14:57.000 Imagine you're the CEO of whatever news entity, and the CIA asks for a meeting.
00:15:05.000 What does that do to your brain?
00:15:07.000 You're in charge, and the CIA asks to meet with you because it's important to national security.
00:15:13.000 And you're important.
00:15:15.000 Yes, you.
00:15:16.000 You're important to national security.
00:15:18.000 And, you know, we really think you're great, and we'd like to work with you to help you because you're so important.
00:15:24.000 We'd like to help you be even more effective.
00:15:27.000 And, by the way, we can tell you some stuff that you wouldn't have known otherwise.
00:15:32.000 Do you like some insider stuff?
00:15:34.000 Would you like to tell your family that you'd meet with the CIA?
00:15:38.000 Yeah, you would.
00:15:40.000 Sounds kind of sexy, doesn't it?
00:15:42.000 Makes you look kind of powerful.
00:15:44.000 You'll like it.
00:15:45.000 Let's be partners.
00:15:47.000 So I don't think they need to force anybody to do anything.
00:15:50.000 I think they just have to associate with them, and then people start doing what they want.
00:15:54.000 It's somewhat automatic.
00:15:56.000 Somewhat automatic.
00:16:02.000 All right.
00:16:03.000 Given that the news is not reliable, I would like to propose a way of dealing with the news.
00:16:13.000 This is how to deal with the news when you don't believe it's necessarily reliable.
00:16:19.000 I would call it the working assumption model.
00:16:22.000 The working assumption.
00:16:24.000 Now, a working assumption doesn't mean you believe it.
00:16:27.000 It means that the evidence suggests something's true, and unless something disproves it, that's your working assumption.
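
(A loose sketch of the heuristic as described; the class and method names here are hypothetical illustrations, not anything from the episode.)

```python
# Toy model of the "working assumption" heuristic: hold a claim as long as
# it is supported by evidence and nothing has disproven it.

class WorkingAssumption:
    def __init__(self, claim: str):
        self.claim = claim
        self.supporting = []   # observations consistent with the claim
        self.disproving = []   # observations that falsify it

    def observe(self, evidence: str, disproves: bool = False) -> None:
        (self.disproving if disproves else self.supporting).append(evidence)

    def holds(self) -> bool:
        # Not "true" or "proven" -- merely supported and undisproven.
        return bool(self.supporting) and not self.disproving

wa = WorkingAssumption("the government is compromised by the cartels")
wa.observe("no visible effort to confront the cartels")
print(wa.holds())  # True: act on it, but drop it the moment it's disproven
```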
00:16:35.000 Let me give you some examples.
00:16:37.000 There's no evidence that the Mexican cartels have corrupted the government of the United States in a big way, right?
00:16:45.000 Would you agree?
00:16:46.000 There's no evidence that I'm aware of that the government of the United States is already corrupted by the cartels.
00:16:54.000 I've never seen any evidence of that.
00:16:57.000 However, given what we see, which is a complete lack of interest in dealing with the cartels effectively,
00:17:05.000 my working assumption is that the government is already compromised.
00:17:11.000 I don't know that it's true, but everything I see supports that working assumption.
00:17:17.000 So if I said it's true, well, then I would be lying, because I don't know that it's true.
00:17:24.000 But if I say the most practical working assumption is that the government has already been corrupted, what would you do?
00:17:32.000 How would you act differently?
00:17:35.000 I'll tell you how I would act differently.
00:17:37.000 I would arm, because we might reach a point where the citizens have to get rid of the cartel.
00:17:43.000 And the only way that's going to happen is with superior firepower.
00:17:47.000 If the government is out of the game, that doesn't mean the game's over.
00:17:53.000 The game isn't over because the government decided to set it out.
00:17:57.000 No, the game is on.
00:17:59.000 And you better arm yourself, because the cartels have already wiped out the black gangs.
00:18:05.000 Did you know that?
00:18:07.000 This is Peter Zeihan's provocative idea.
00:18:11.000 He says that the reason that the murder rate dropped is because the cartels murdered the murderers, the gangs.
00:18:20.000 They just murdered the murderers and took over their business.
00:18:23.000 Have you wondered why you don't hear much about the Crips and the Bloods lately?
00:18:28.000 They're dead.
00:18:30.000 Not all of them, obviously.
00:18:32.000 But apparently, their power has been diminished, and they've been completely outgunned by the cartels.
00:18:37.000 Now, the cartels are more dangerous in one way, even though they murder less, believe it or not.
00:18:44.000 When the cartel completely owns a place, the murder rate goes down.
00:18:49.000 It's only when they don't have control that the murder rate is high.
00:18:52.000 When the murder rate goes down, that's trouble.
00:18:57.000 Worry the most when the murder rate goes down, because that means the cartel already has control.
00:19:04.000 So, working assumption is that our government is already compromised by the cartels.
00:19:10.000 I'd love it to not be true, but I have to act like it is true, because the government is acting as if it's true as well.
00:19:18.000 Here's another one.
00:19:21.000 There is no evidence that I'm aware of that our elections were rigged in the sense of vote counting.
00:19:28.000 No evidence at all.
00:19:30.000 However, the fact that our elections are not fully auditable, and there seems to be no interest by our government to make them fully auditable,
00:19:39.000 my working assumption is that they're rigged or will be.
00:19:45.000 They're either rigged or they will be.
00:19:48.000 Now, I don't have evidence of that.
00:19:50.000 It's a working assumption under the context of a press that isn't dependable and a government that is obviously not working on a gigantic problem, completely ignoring it.
00:20:03.000 The working assumption is that the elections are rigged or that they will be soon.
00:20:10.000 Is that fair?
00:20:12.000 Again, let me be as clear as possible.
00:20:15.000 I don't have proof of it.
00:20:17.000 If I said I had any kind of proof or even strong evidence, that would be a lie.
00:20:23.000 What I have is a working assumption based on living a life of, you know, who you trust and who you don't, sorting things into those buckets and saying, well, that's what it looks like.
00:20:34.000 That's my working assumption.
00:20:35.000 All right.
00:20:37.000 All right.
00:20:38.000 Here's another one.
00:20:42.000 The TikTok ban.
00:20:44.000 So the country wanted to ban TikTok.
00:20:47.000 It appeared that there was some bipartisan support, but instead of a simple bill that bans TikTok, we got something called the Restrict Act from Senator Warner, I believe, Democrat.
00:21:01.000 But the Restrict Act is not targeted at TikTok.
00:21:05.000 In fact, I believe TikTok is not mentioned.
00:21:08.000 Rather, it gives broad powers to the government to determine if there is foreign influence, you know, in any of our communication, social media structure.
00:21:20.000 And it can go after any of that foreign influence, you know, with the powers of the government.
00:21:27.000 Here's the problem.
00:21:29.000 Why are they doing that?
00:21:31.000 Who asked for that?
00:21:32.000 Do you remember a big American uprising saying, give the government more power over all our social media?
00:21:39.000 When did that happen?
00:21:41.000 Now, there was a problem.
00:21:45.000 There was a problem with TikTok, a specific TikTok problem.
00:21:51.000 If the government is not working on the problem as clearly defined and instead is looking to increase its powers more generally, what would you assume?
00:22:03.000 What's your working assumption?
00:22:06.000 My working assumption is it's a corrupt process and it's just a power grab and it's just political and has nothing to do, nothing to do with solving the country's risk.
00:22:17.000 Nothing to do with it.
00:22:18.000 Now, is that true?
00:22:20.000 I don't know.
00:22:21.000 I have no idea.
00:22:23.000 But it's my working assumption.
00:22:26.000 It's a reasonable working assumption.
00:22:29.000 Here's what I say.
00:22:30.000 I say that if you say something's true, your enemies can say it's false.
00:22:35.000 So don't do that.
00:22:37.000 Don't say it's true when the other side just says, well, let's prove it.
00:22:41.000 You can't prove it.
00:22:43.000 Nope.
00:22:44.000 Just say, if anybody wants to change my mind, it's pretty easy.
00:22:48.000 Do you know how you can change my mind on TikTok?
00:22:51.000 Produce a bill just about TikTok.
00:22:56.000 And then if you want, you can also have a bill with this other restrict stuff in it.
00:23:01.000 And then let Congress vote on them separately.
00:23:04.000 One to ban TikTok, which is what we all want.
00:23:07.000 And one, let's talk about it.
00:23:09.000 Maybe they have an argument that I'm not aware of.
00:23:12.000 On the surface, it looks like a big mistake.
00:23:15.000 Just a power grab that the citizens aren't going to like in the long run.
00:23:19.000 But I'd listen to it.
00:23:21.000 I just don't want it combined with the thing that I want solved, which is TikTok.
00:23:25.000 See how easy it would be to change my working assumption.
00:23:29.000 With that little change, a little change, just a tiny little change,
00:23:33.000 just make a separate TikTok bill,
00:23:35.000 I would say, okay, it looks like the government's trying to protect me,
00:23:39.000 and they've got some ideas I didn't think of.
00:23:41.000 So we'd better talk about those as well.
00:23:44.000 How could they prove me wrong?
00:23:46.000 How could they change my working assumption about fentanyl
00:23:49.000 and the cartels already owning the government?
00:23:51.000 Simple.
00:23:52.000 Just attack the cartels.
00:23:54.000 Or announce that you're going to, or get tough, or close the border.
00:23:59.000 Lots of stuff.
00:24:00.000 There's probably five different things they could do to change my working assumption,
00:24:04.000 and none of them are hard.
00:24:05.000 Or, well, none of them are undoable.
00:24:07.000 How about my, what else?
00:24:14.000 Well, you get the idea.
00:24:15.000 It's easy to change my working assumption.
00:24:17.000 Don't claim things are true.
00:24:19.000 Claim that your working assumption, based on the facts, can be changed.
00:24:26.000 So, basically, you would put the burden of proof back on your accusers.
00:24:31.000 Here's the normal way it goes.
00:24:33.000 I believe X is true.
00:24:36.000 And your Democratic critic usually says,
00:24:38.000 you don't have proof of that.
00:24:40.000 There's no court that said that's true.
00:24:43.000 Show me a link.
00:24:44.000 Show me some evidence.
00:24:45.000 Show me some data.
00:24:46.000 There's no evidence.
00:24:47.000 You're just off in crazy town.
00:24:49.000 But instead you say, well, the situation suggests that this is true.
00:24:55.000 If you'd like to disprove it, it's easy to do.
00:24:58.000 I welcome you to disprove it.
00:25:00.000 I'd love to know it's not true.
00:25:02.000 So, you want to put the burden of proof on the other side instead of having them put
00:25:09.000 it on you.
00:25:10.000 So, stop the certainty and say, well, this is what it looks like.
00:25:14.000 So, that's my working assumption.
00:25:16.000 If they'd like it not to look like that, they know how.
00:25:20.000 It's easy to make it look not like that.
00:25:23.000 It's real easy, but they're not doing it.
00:25:28.000 As long as people aren't doing the easy thing to solve the obvious problem, you can have a working assumption that they're corrupt.
00:25:37.000 You can.
00:25:38.000 That's completely reasonable.
00:25:41.000 All right.
00:25:43.000 Let's talk about the Nashville shooting that I hate to talk about because it just makes more of them.
00:25:50.000 But I guess this one has some details that made it interesting.
00:25:53.000 As Tucker Carlson and other people have noted, apparently the shooter left a manifesto, which one presumes is a complete explanation of motive.
00:26:08.000 That's what a manifesto is, right?
00:26:10.000 And yet that has not been released to the public, and the officials who have seen it say they don't know the motive.
00:26:21.000 What's your working assumption?
00:26:24.000 The working assumption, and this is the way Tucker handled it, and I think that was correct.
00:26:30.000 The working assumption has to be that they're hiding it from you for a reason.
00:26:36.000 What would be a good working assumption of why they would hide it from you?
00:26:40.000 Well, I'll tell you the most obvious interpretation for which I have no proof.
00:26:46.000 For which I have no proof.
00:26:49.000 The obvious working assumption is that it was a trans person who was declaring war on possibly Christians.
00:26:59.000 Now, do we have any direct evidence of that?
00:27:02.000 Nope. Nope.
00:27:03.000 There's no direct evidence of that.
00:27:05.000 In fact, my first interpretation was it was just a school.
00:27:09.000 And I think it's very likely that will still be the final interpretation, that it was just a school.
00:27:15.000 It wasn't about religion at all.
00:27:18.000 But at the moment, given the lack of information about the manifesto, I think a good working assumption is that it was about Christianity.
00:27:31.000 And the news or the government doesn't want you to know that.
00:27:36.000 Because it might just make things worse.
00:27:39.000 Now, I'm not totally criticizing that.
00:27:47.000 You know, as much as I think that freedom of information is good for us in general, and it is, this might be one of those special cases where we're better off not seeing it.
00:27:57.000 Because I doubt that it represents anything like a widespread opinion.
00:28:03.000 If it did, then maybe I'd want to see it.
00:28:06.000 If it's just a crazy person babbling, I don't want the right-leaning news to say, well, this represents trans people.
00:28:14.000 Yeah, look at this one case of this manifesto.
00:28:17.000 That's how we'll understand trans people from now on.
00:28:21.000 So, I don't feel like we're necessarily better off with seeing it.
00:28:27.000 But I don't like the way the government's treating us.
00:28:33.000 You know what I would have respected?
00:28:36.000 I would have respected saying, you know what?
00:28:38.000 We don't want you to see it.
00:28:41.000 I actually would have been perfectly good with that.
00:28:44.000 Because that would be honest.
00:28:46.000 That would be honest.
00:28:48.000 You know, we've seen it.
00:28:51.000 We're going to deal with it.
00:28:52.000 It does tell us something we didn't know.
00:28:55.000 We're not going to show it to you.
00:28:58.000 Believe it or not, I would be okay with that.
00:29:01.000 Believe it or not.
00:29:03.000 Because I'm completely okay with, there is some information.
00:29:06.000 I know you want to know.
00:29:08.000 We're going to keep it from you.
00:29:10.000 We think it's in your best interest.
00:29:13.000 Now, depending on who says it and what the context is, I might not believe it.
00:29:17.000 But in this case, I would.
00:29:19.000 In this one simple, unique case, if the FBI said, you know, honestly, we just don't want you to see it.
00:29:28.000 I'd be okay with that.
00:29:30.000 Now, I can totally understand if people were not.
00:29:34.000 I understand that as well.
00:29:35.000 But I feel like its only use is political.
00:29:38.000 Talk me out of it.
00:29:40.000 Would seeing the manifesto have any use besides political points?
00:29:48.000 I don't think so.
00:29:50.000 I don't think so.
00:29:52.000 And we already know what the point would be, right?
00:29:55.000 If the government kept it from you, you would already know what it says.
00:29:59.000 Or at least your working assumption would be.
00:30:02.000 If they kept it from me, my working assumption would be, it's some trans person who declared war on Christians.
00:30:11.000 I wouldn't have any, I would have no data to prove that.
00:30:15.000 But that would be my working assumption.
00:30:17.000 And you know what?
00:30:18.000 I'd be okay with that.
00:30:19.000 I'd be okay with that.
00:30:21.000 Because I wouldn't really be in the dark.
00:30:24.000 I would just be, okay, they kept a political football off the field.
00:30:28.000 Eh, that's okay.
00:30:31.000 Yeah.
00:30:32.000 All right, so, and then there's the question of why there seems to be, I don't know,
00:30:39.000 why are we treating a potential hate crime against Christians as different than we would treat other hate crimes?
00:30:46.000 But I guess it's a little unconfirmed at this point, so that's part of it.
00:30:50.000 And Tucker also asked the question, why don't we ask if antidepressants were involved?
00:30:56.000 Now, this question has a lot more salience to me than it might to you, because I've actually experienced, as most of you know,
00:31:06.000 when I took some blood pressure meds that were not to my liking, I was suicidal.
00:31:13.000 And it was just the meds.
00:31:15.000 Literally the day I stopped, boom, everything was fine.
00:31:19.000 And everything's been fine every day since then.
00:31:22.000 And you know what's interesting is that, you know, by some other measure, you might say,
00:31:28.000 but Scott, how could you be happier now when you've had a whole bunch of bad things happen to you in the last year?
00:31:34.000 And the answer is, I guess I just had the right mindset.
00:31:39.000 You know, these allegedly bad things don't seem so bad to me.
00:31:45.000 You know, the upside seems so substantial that I'm like, no, I came out okay.
00:31:50.000 But over the summer, when I was on the wrong meds, there was nothing that could go right.
00:31:57.000 I just wanted to get out of this world.
00:32:00.000 I just wanted to get out of this world.
00:32:02.000 And when I stopped the meds, it went away.
00:32:06.000 Now, it's easy for me to understand how people, and it would be different for individuals,
00:32:12.000 it's easy for me to understand how a drug could make you a mass killer.
00:32:17.000 Because if it made you suicidal, at the same time you had, you know, negative thoughts about something else,
00:32:23.000 the suicidal part could make you a murderer.
00:32:26.000 Because if you don't care about your own life, you're not going to care about anybody else's life after you're gone.
00:32:32.000 So, anyway.
00:32:37.000 So we should definitely look at antidepressants.
00:32:41.000 And I thought that that was super brave of Tucker to say it on a TV program,
00:32:46.000 which I believe is largely, or used to be anyway, sponsored by Big Pharma.
00:32:52.000 Imagine how much balls it takes to say that your advertisers might be causing mass murder.
00:32:59.000 That's like a lot of balls.
00:33:02.000 Now, I realize Tucker's in a strong situation, career-wise.
00:33:07.000 But you've got to have balls the size of a planet to say that
00:33:12.000 on the very platform that's supported by Big Pharma money.
00:33:16.000 Somebody says there are no Pharma ads on Tucker's show.
00:33:20.000 And I was wondering about that, because I've never seen any.
00:33:23.000 There's probably a reason for that, right?
00:33:25.000 Yeah.
00:33:27.000 So, I think that was a valuable service by Tucker Carlson,
00:33:32.000 just sort of putting these ideas out there.
00:33:35.000 Of course, Madonna is going to have a concert in Nashville to raise money for trans rights.
00:33:41.000 I'm not sure she's reading the room right.
00:33:46.000 But can we just agree that Eddie's story about Madonna is a story about mental illness?
00:33:52.000 Would you agree with that?
00:33:54.000 I don't know what's happening with Madonna, but my working assumption,
00:33:59.000 I'm not a doctor, I'm not a doctor, I can't diagnose her from a distance,
00:34:04.000 but the way she presents is with mental illness.
00:34:08.000 Pretty severe. Pretty severe.
00:34:11.000 You know, the kind where if she were not famous and powerful,
00:34:14.000 somebody would have forced her into treatment of some kind.
00:34:20.000 But I also think that the trans story is at least partly about mental illness.
00:34:27.000 Now, because I have more respect than some of you do,
00:34:30.000 let me say as clearly as possible,
00:34:33.000 I think it is super, super likely that for some portion of the trans community,
00:34:40.000 transitioning was exactly what they needed, and it made their life better.
00:34:45.000 I just don't know what the percentage is.
00:34:47.000 I don't know if that's 10% or 90%, because I don't have visibility.
00:34:51.000 But whether it's 10% or 90%, it's the same point.
00:34:54.000 We live in a free world, or we try to be.
00:34:57.000 We're trying to be free.
00:34:59.000 If they're adults, they've looked into it, they think they need it, they talk to professionals,
00:35:06.000 that's their business.
00:35:08.000 It's not my business.
00:35:10.000 It's just their business.
00:35:11.000 And I've never had a problem treating any trans person as a human individual with dignity ever before.
00:35:19.000 I don't know why I'd have that problem in the future.
00:35:22.000 But it does seem that it has created a magnet for crazy people.
00:35:29.000 Which is no fault of the law-abiding good citizens who just have a different situation that they found a way to deal with.
00:35:39.000 It's not about them.
00:35:41.000 It's about the portion that has been attracted to this for all the wrong reasons.
00:35:46.000 And if you can't say that the trans issue is both things, you know,
00:35:52.000 legitimate people looking at legitimate solutions that their freedom allows them to pursue,
00:35:58.000 versus people who are just batshit crazy.
00:36:01.000 And as soon as you lump them together and say it's all trans, well, there's nothing you can do.
00:36:06.000 There's nothing you can do with that as soon as you lump them into one category.
00:36:12.000 And I think that's what the right does to their detriment.
00:36:16.000 I think it weakens the argument to just treat it like it's all one thing.
00:36:20.000 So, I think the Madonna thing really underscores that.
00:36:29.000 Rasmussen asked about preventing mass shootings.
00:36:33.000 In a Rasmussen poll, 50% of voters say it's not possible to completely prevent mass shootings.
00:36:39.000 Well, shouldn't that be closer to 100%?
00:36:44.000 That only 50% of people polled agree it's not possible?
00:36:50.000 Well, it's not possible to stop them all.
00:36:53.000 I think everybody would agree with that.
00:36:55.000 How do you only get 50% to agree with that?
00:36:58.000 All right.
00:36:59.000 But 38% believe it is possible to completely prevent such shootings.
00:37:04.000 Okay, that's crazy.
00:37:06.000 And another 12% are not sure.
00:37:09.000 It's a weird little poll.
00:37:11.000 So, I suggested...
00:37:14.000 Well, actually, I stole an idea.
00:37:16.000 Somebody sent me a letter.
00:37:18.000 And I liked the idea, but I extended it.
00:37:21.000 So, the idea that was not mine was to recruit ex-police officers to be teachers.
00:37:32.000 Wouldn't you like to know that there were some legally armed ex-police officers
00:37:38.000 who just had become teachers?
00:37:41.000 So that if your school wanted to be a little safer, just make sure that of your, I don't
00:37:46.000 know, 100 teachers...
00:37:47.000 How many teachers in a school?
00:37:49.000 What would be an average number of teachers?
00:37:52.000 100?
00:37:53.000 50 to 100?
00:37:55.000 Something like that.
00:37:57.000 Let's say 50 to 100.
00:37:59.000 You wouldn't need many of those to be cops.
00:38:04.000 Two?
00:38:05.000 Three?
00:38:06.000 Two or 3% being retired police officers?
00:38:10.000 Because, you know, if you're getting out of the police game, and a lot of people are
00:38:14.000 because of the changes to, you know, bail and everything else, if a lot of people are
00:38:19.000 leaving the police force, you have exactly the pool of applicants you're looking for.
00:38:24.000 Some of them might...
00:38:25.000 Yeah.
00:38:26.000 Now, you could extend it to ex-military as well.
00:38:28.000 That's correct.
00:38:29.000 But here I...
00:38:31.000 But I wanted to extend the idea even further.
00:38:34.000 So, I tweeted that what if the federal government...
00:38:38.000 Or maybe you could leave it to the states.
00:38:39.000 That's a separate argument.
00:38:41.000 But what if funding was made available, either in a state or federal level, that anybody
00:38:47.000 who wanted to be a legal carrier of a weapon could take the classes or get certified in whatever certification.
00:38:56.000 And maybe have to get recertified every year.
00:38:59.000 And then keep the weapon in some kind of a protected safe in a few places.
00:39:07.000 So you want to make sure that people can get to them quickly.
00:39:11.000 But you don't want the kids to get them, obviously.
00:39:13.000 So it's some kind of a safe.
00:39:17.000 A holster.
00:39:18.000 Well, somebody can grab your holster.
00:39:21.000 You know, if three kids wanted to, they could hold you down and take your weapon.
00:39:25.000 So I don't want it in a holster.
00:39:28.000 Unless they're actually a security guard, then yes.
00:39:31.000 But not in the classroom.
00:39:33.000 We've seen too many videos of students taking on teachers.
00:39:38.000 Students will take on teachers.
00:39:40.000 They will steal their gun.
00:39:42.000 I think that's like a real thing.
00:39:46.000 There's biometrics for holsters.
00:39:49.000 Interesting.
00:39:51.000 I'm not sure I would trust a biometric for a holster.
00:39:54.000 You still might want to steal it and get rid of the biometric somehow.
00:39:57.000 So that's just one idea.
00:40:02.000 Speaking of which, I saw a tweet from Emily Brooks saying that there was a, quote, shouting match between Representative Jamal Bowman, Democrat, and Representative Thomas Massie.
00:40:15.000 And then I watched the video.
00:40:17.000 When I watched the video, do you think I saw a shouting match?
00:40:21.000 This is the way all headlines work.
00:40:23.000 There's a shouting match.
00:40:25.000 And then I watched the headlines.
00:40:27.000 Nope.
00:40:28.000 Nope.
00:40:29.000 There was one person shouting.
00:40:31.000 Jamal Bowman.
00:40:33.000 And then there was a plucky representative, Thomas Massie, who happened to be walking by the public hallway or rotunda or wherever it was.
00:40:42.000 He just happened to be walking by.
00:40:44.000 And so Jamal is just like screaming at the crowd in like every direction.
00:40:48.000 He's just, you know, screaming his gun control stuff.
00:40:51.000 Thomas Massie just walks right up to him and just starts talking to him calmly about a suggestion, I think, for arming teachers.
00:41:03.000 And Jamal just keeps yelling and yelling.
00:41:05.000 And Massie just keeps, you know, keeps to his same tone.
00:41:09.000 And at one point, they're just talking over each other.
00:41:12.000 And then Thomas Massie sees that it's being filmed.
00:41:15.000 So he, you know, sort of excuses himself from the shouting guy and goes to talk to the, it was probably a phone or a news reporter or something.
00:41:25.000 And he starts to give his idea directly to the video.
00:41:28.000 And then the shouting guy comes over and tries to shout him away from the video.
00:41:32.000 So he can't even, he can't even say his idea.
00:41:35.000 All right.
00:41:37.000 Every day I'm finding a new reason to love Thomas Massie.
00:41:41.000 I just love the fact that he just walked right up to him and started talking to him with a productive suggestion.
00:41:47.000 And the guy wouldn't even listen to it.
00:41:49.000 Wouldn't even listen.
00:41:50.000 It was awesome.
00:41:52.000 All right.
00:41:55.000 Here's one of the best and scariest ideas I've ever heard.
00:42:02.000 How many of you are in favor of universal basic income, where the government gives you money and you don't have to work?
00:42:10.000 You just have enough to live, right?
00:42:13.000 Pretty much in this audience, it's going to be all no, right?
00:42:16.000 Oh, a few yeses, a few yeses.
00:42:18.000 All right.
00:42:19.000 Here's a hidden danger of UBI that I'd never thought of.
00:42:22.000 And you're going to say to yourself, oh, shit.
00:42:26.000 How did I not think about that?
00:42:28.000 Are you ready for this?
00:42:29.000 I'm going to bend your mind courtesy of a tweeter called the commander in chief one.
00:42:35.000 It's all one word, the commander in chief and then digit one, if you're looking for this person on Twitter.
00:42:44.000 And here's the tweet that just blew my top of my head off.
00:42:49.000 Are you ready for this?
00:42:50.000 Seriously, this is going to blow your head off.
00:42:53.000 Hold on to your scalp.
00:42:55.000 Here it comes.
00:42:56.000 They can talk about a universal basic income as much as they like.
00:43:01.000 Any smart person will know where this is going.
00:43:05.000 But I didn't know where this was going.
00:43:07.000 So I don't qualify as smart.
00:43:09.000 Let's see if you did.
00:43:11.000 Here's where he says it's going.
00:43:13.000 They will not be interested in keeping people around who cannot be useful.
00:43:18.000 That's what I never realized before.
00:43:23.000 Yeah.
00:43:24.000 Suppose we create a society where, let's say, 30% of the people are just using the money of the other people.
00:43:36.000 And they're not working.
00:43:38.000 Now, in our current woke democratic world, that's pretty safe.
00:43:45.000 If you were to introduce UBI, if you could afford it somehow, and you introduced it into America,
00:43:51.000 and let's say 10% of Americans took you up on it, that'd be kind of safe.
00:43:58.000 It would be not that big of a burden on the country, 10%.
00:44:03.000 They're probably getting some kind of services anyway.
00:44:05.000 And you'd say to yourself, well, I'll never even meet one.
00:44:09.000 I'll probably never even run into anybody who's on UBI.
00:44:12.000 It's no big deal.
00:44:13.000 But imagine it got to like 30%.
00:44:17.000 Imagine if one third of adults were not working, and they were just taking your money,
00:44:24.000 your money, the money you work for, so you get to work and have a good life,
00:44:30.000 and they get to do no work at all and spend your money.
00:44:34.000 Now, presumably, they have fewer things.
00:44:37.000 They'd have to live a more basic life.
00:44:39.000 But probably they could be just happy doing their fentanyl and whatever.
00:44:43.000 Now, in the short run, that could work.
00:44:46.000 In the short run.
00:44:48.000 In the long run, you can never be sure that a despot won't take over.
00:44:54.000 Am I right?
00:44:56.000 History has shown us that every civilization ends.
00:44:59.000 Every one.
00:45:00.000 They all end.
00:45:01.000 Or they become a different thing.
00:45:03.000 What if the United States becomes a dictatorship?
00:45:07.000 And what if we don't have enough money to feed everybody?
00:45:12.000 They're going to kill all the UBI people first.
00:45:15.000 They're going to round up the UBI people, starve them to death, and say,
00:45:19.000 all right, we have a chance now.
00:45:21.000 We got rid of the anchor that was holding us back.
00:45:25.000 I'm glad they self-identified as useless, because when we got rid of them,
00:45:29.000 nobody even argued.
00:45:30.000 Oh, you're going to get rid of the useless people?
00:45:33.000 And it wouldn't even be racial, because it would be, you know, distributed across races
00:45:38.000 and everything.
00:45:39.000 So you say, hey, it's not even racial.
00:45:41.000 It's just all the useless people.
00:45:43.000 All the people who add nothing to society will just round them up and put them in camps
00:45:47.000 or kill them all.
00:45:49.000 Had you ever thought about that?
00:45:52.000 Had you ever considered that you're guaranteeing the death of the people who stop working?
00:45:58.000 And I'm going to generalize this to a bigger point.
00:46:02.000 There's no such thing as security in this world.
00:46:05.000 You're never safe.
00:46:07.000 You could have the illusion of being safe, and you could be safer than other situations.
00:46:12.000 So you could be safer, but you could never be safe.
00:46:15.000 We don't live in a safe world.
00:46:17.000 But one thing you can do to make yourself as safe as you can in a dangerous world
00:46:24.000 is to be useful to other people.
00:46:27.000 Nothing is going to help you more than that.
00:46:31.000 All of your happiness, all of your financial success, all of your physical safety is going
00:46:40.000 to depend on how useful you are to other people.
00:46:43.000 If you're taking UBI and you're nothing but a drain on other people, don't expect to be safe.
00:46:50.000 Do not expect to be safe.
00:46:52.000 Because other people don't care about you at that point.
00:46:54.000 They really don't.
00:46:55.000 If you're just going to take my money and give me nothing in return, good luck.
00:47:00.000 You're on your own.
00:47:02.000 So I would argue that the more you create your talent stack, the safer you are.
00:47:10.000 Why was it that I wasn't panicked when I got cancelled worldwide in my primary profession of 34 years?
00:47:17.000 Why did it not send me into a death spiral?
00:47:21.000 Because my talent stack is so well developed, I knew I could just do other things.
00:47:27.000 And so within a day, I was already spinning up my other things.
00:47:32.000 And they're working fine.
00:47:34.000 And it's only because I spent a lifetime developing a set of skills that I know work together and have commercial value.
00:47:43.000 If you don't have that going on, you're not safe.
00:47:47.000 You can only make yourself safe by developing a talent stack that helps other people in some way.
00:47:54.000 They find value in it.
00:47:57.000 That's your advice for today.
00:48:00.000 Well, I don't know if I said this yesterday, but Governor Newsom got himself in a trap by commissioning this group to recommend reparations.
00:48:09.000 And they came back with an absurd number that some are saying would cost $800 billion, two and a half times the annual budget of California, according to NBC.
00:48:22.000 So now Newsom has two choices.
00:48:25.000 If he backs his reparations, he has no future in politics.
00:48:31.000 Would you agree?
00:48:33.000 He has no future in politics if he backs the reparations.
00:48:37.000 But if he doesn't back them, he has no future in politics.
00:48:42.000 Not national politics.
00:48:44.000 He might be able to get reelected.
00:48:47.000 Probably not.
00:48:48.000 But he would certainly be out of the race for president.
00:48:51.000 There's no way he could be president.
00:48:53.000 That would be completely dead.
00:48:55.000 So he can't accept it because then he'd be dead.
00:48:59.000 And he can't reject it because then he'd be dead politically.
00:49:04.000 So what are you going to do?
00:49:06.000 You can't accept it and you can't reject it.
00:49:09.000 Resign?
00:49:10.000 No.
00:49:11.000 He's going to form a new committee to study the recommendations of the first committee.
00:49:16.000 That's what I would do.
00:49:18.000 That's what I would do.
00:49:20.000 I'd form a new committee to look into the committee's recommendation.
00:49:24.000 And then I'd wait a year and see if everybody forgot about it.
00:49:29.000 And maybe release it on a late Friday afternoon.
00:49:33.000 It looks like it wasn't practical.
00:49:36.000 Well, we tried.
00:49:37.000 But the committee that examines the committee found that the first committee didn't do its job as well as it could.
00:49:44.000 So I guess it's dead in the water.
00:49:46.000 It's a good thing I asked the committee to look at the other committee.
00:49:49.000 Now, every path he takes on this is a losing path, but that's the least losing path.
00:49:58.000 So I predict it.
00:49:59.000 All right.
00:50:01.000 Meanwhile, over in Minnesota, Governor Tim Walz did a big event in which he introduced and announced a new chief equity officer, Dr. Stephanie Burrage.
00:50:14.000 Chief equity officer.
00:50:16.000 If you're looking for a state to move to, I would recommend one that does not have a chief equity officer.
00:50:26.000 That should be a giant red flag to stay away from that state as far away as you can.
00:50:32.000 Why?
00:50:33.000 Because they're racists.
00:50:34.000 Do you think a white male could have been hired as the first chief equity officer?
00:50:41.000 Seriously.
00:50:42.000 Do you think a white male could have been considered for the job?
00:50:46.000 Of course fucking not.
00:50:48.000 Of fucking course not.
00:50:51.000 Of course not.
00:50:52.000 It's purely racist.
00:50:55.000 What's the definition of racist?
00:50:57.000 You're hiring people based on their race.
00:50:59.000 There's no fucking way they would have hired a white man for the job.
00:51:04.000 Why would you move to that state?
00:51:06.000 That state has a big sign on it that says, no white men allowed.
00:51:10.000 Women?
00:51:11.000 I don't know.
00:51:12.000 Maybe women.
00:51:13.000 But white men?
00:51:14.000 No.
00:51:15.000 Nope.
00:51:16.000 Can't get a job in Minnesota.
00:51:17.000 It's pretty clear.
00:51:19.000 All right.
00:51:20.000 Speaking of Governor Newsom, he tweeted two charts.
00:51:29.000 One is a chart of the states that have lots of guns.
00:50:34.000 The other is the states that have the most strict gun control.
00:51:39.000 And he found that when you have the strictest gun control, it correlates really high with fewer murders.
00:51:49.000 Now, what do you think of that?
00:51:52.000 So the data probably came from good sources.
00:51:56.000 Well, let's say the data is accurate.
00:52:00.000 We don't know it's accurate, of course.
00:52:02.000 That would be ridiculous.
00:52:03.000 But let's say it is.
00:52:05.000 How many problems do you think there are with that comparison?
00:52:09.000 A lot of them.
00:52:11.000 A lot of them.
00:52:13.000 All right.
00:52:14.000 Let me give you just the sampler set of problems, which for some reason I didn't include in my notes.
00:52:22.000 Here's the sampler set.
00:52:27.000 If you're looking at states and comparing a state to a state, do you think that's a good comparison?
00:52:33.000 Because California shows up as a low murder state with high gun restrictions.
00:52:41.000 But you can't go near Oakland.
00:52:43.000 You can't go anywhere near Oakland.
00:52:45.000 I drive around Oakland.
00:52:47.000 I don't even want to drive through it.
00:52:49.000 It's so fucking dangerous.
00:52:51.000 Right?
00:52:52.000 So, yeah.
00:52:53.000 Compton, et cetera.
00:52:55.000 So, the use of states is meant to conceal what's true, not to tell you what's true.
00:53:02.000 Would you agree?
00:53:04.000 That almost all the murders are in cities.
00:53:08.000 Well, you know, the vast majority are in cities.
00:53:11.000 And if you don't break out of city to city, so what I'd like to see is strictest gun laws by city.
00:53:19.000 I think you'd find that Chicago has strict gun laws and a high murder rate.
00:53:24.000 Washington, D.C., strict gun laws, high murder rate.
00:53:28.000 Oakland, strict gun laws, high murder rate.
00:53:32.000 Baltimore, strict gun laws, high murder rate.
00:53:35.000 Right?
00:53:36.000 So you can see how easily Newsom can lie with accurate numbers.
00:53:41.000 Now, you're saying the numbers might not be accurate, and I agree.
00:53:45.000 But even if the numbers are accurate, he's found the only way to be misleading.
00:53:50.000 By grouping it by state instead of by city.
00:53:55.000 It makes the whole thing ridiculous.
00:53:58.000 Right?
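
(The statistical objection here is aggregation bias, sometimes called Simpson's paradox: a pattern at the state level can vanish or reverse once the data is split by city. A toy illustration with invented numbers, purely to show the mechanism; nothing here is real crime data.)

```python
# Invented numbers: state A has strict gun laws, state B does not.
places = [
    # (state, area,  population, murders)
    ("A", "city",    1_000_000, 300),
    ("A", "rural",   9_000_000,  90),
    ("B", "city",    2_000_000, 400),
    ("B", "rural",   3_000_000, 150),
]

def murder_rate(rows):
    pop = sum(r[2] for r in rows)
    murders = sum(r[3] for r in rows)
    return 100_000 * murders / pop  # murders per 100,000 people

for state in ("A", "B"):
    rows = [r for r in places if r[0] == state]
    print(f"state {state}: {murder_rate(rows):.1f} per 100k")
# state A: 3.9, state B: 11.0 -- the strict-law state looks much safer...

for state in ("A", "B"):
    city = [r for r in places if r[0] == state and r[1] == "city"]
    print(f"state {state} city: {murder_rate(city):.1f} per 100k")
# ...but A's city (30.0) is worse than B's city (20.0); the state-level
# average hides it because most of A's population is low-murder rural.
```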
00:53:59.000 Then you'd also want to see a before and after.
00:54:02.000 Show me the murder rate in, let's say, Chicago, before and after the gun control strict regulations went into place.
00:54:10.000 Did it immediately fall?
00:54:12.000 That'd be a strong argument.
00:54:14.000 Or did it go up?
00:54:16.000 Which I think it did.
00:54:18.000 Yeah.
00:54:19.000 So those are just a few problems.
00:54:21.000 If you looked at the comments on that tweet, you'd see other people suggesting other problems.
00:54:28.000 We've reached the point where data is absolutely useless.
00:54:32.000 Data.
00:54:33.000 All right.
00:54:34.000 And so I tweeted this.
00:54:37.000 Here's where we're at.
00:54:39.000 This is my summary of everything you need to know about data.
00:54:44.000 Dumb people argue without data.
00:54:47.000 Would you agree?
00:54:49.000 If somebody's making their argument and they're ignoring the data or they don't know what it is or they're not using it, that's pretty dumb.
00:54:56.000 Pretty dumb.
00:54:57.000 Smart people argue with data.
00:54:59.000 Would you agree?
00:55:01.000 If you see a smart person, they're probably showing you their data.
00:55:06.000 But the smarter people, the people who are smarter than the smart people, know that you can torture the data until it tells you anything you want to hear.
00:55:16.000 Right?
00:55:17.000 So summarizing so far.
00:55:19.000 If you're dumb, you don't use data.
00:55:21.000 If you're smart, you do use data.
00:55:24.000 If you're smarter than that, you know the data is useless because it's been put together by corrupt people.
00:55:31.000 But there's one level above that.
00:55:33.000 Smarter than the smarter people, these would be the smartest people, know that the data is not real and neither are you.
00:55:43.000 I'll just leave that there.
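
("Torturing the data" has a concrete mechanical form: run enough comparisons on pure noise and some will look significant by chance alone. A minimal simulation of that multiple-comparisons effect; the 0.51 cutoff is a rough p < 0.05 threshold for two samples of 30 from a standard normal.)

```python
import random

random.seed(42)
trials, hits = 1_000, 0

for _ in range(trials):
    # Two groups drawn from the SAME distribution -- no real effect exists.
    a = [random.gauss(0, 1) for _ in range(30)]
    b = [random.gauss(0, 1) for _ in range(30)]
    diff = abs(sum(a) / len(a) - sum(b) / len(b))
    if diff > 0.51:  # roughly the p < 0.05 cutoff for n=30 vs n=30
        hits += 1

print(f"{hits} of {trials} null comparisons looked 'significant'")
# Expect around 5%: test enough subgroups and the data will "confirm"
# whatever you went looking for.
```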
00:55:44.000 All right.
00:55:45.000 Senator Mark Warner was defending his Restrict Act.
00:55:52.000 That's the one that's supposed to be aimed at TikTok, but is in fact a general statement about giving the government more power over social media, basically.
00:56:01.000 Even domestic.
00:56:03.000 And he said this, defending it, he said, the Restrict Act isn't an infringement on free speech.
00:56:10.000 It's a systemic, rules-based approach to identifying and addressing foreign tech that could threaten national security.
00:56:20.000 Huh.
00:56:21.000 Now, when he explains it that way, that doesn't sound so bad, does it?
00:56:26.000 Wouldn't you like to know that our government has a rules-based approach, you know, something you can really...
00:56:33.000 There's no gray area.
00:56:34.000 It's a rules-based approach, he says.
00:56:36.000 And don't you want them to identify and address foreign technology that could threaten national security?
00:56:42.000 That's all good, right?
00:56:44.000 Isn't that terrific?
00:56:47.000 Here's what I tweeted back at him.
00:56:50.000 We don't trust the government with new rules, in other words, powers, over communication.
00:56:55.000 It's that simple.
00:56:57.000 That's my reply is, we don't trust you.
00:57:01.000 That would be that what he's proposing is something that would be good for somebody that you trusted.
00:57:08.000 Am I right?
00:57:10.000 If we trusted the government, that probably makes sense.
00:57:14.000 It sounds like they need a little more structure to look for bad intention from foreign powers.
00:57:21.000 Sounds good.
00:57:23.000 But what happens if you don't trust your government and you think everything they do is a trick?
00:57:28.000 Which is my current operating belief.
00:57:31.000 In that case, you don't want to give them any power for any reason.
00:57:35.000 Because they're not going to use it for good reasons.
00:57:37.000 So his problem is not that it's a good or a bad bill.
00:57:42.000 The problem is him.
00:57:44.000 Let me say that again.
00:57:46.000 The problem is not Senator Warner's bill.
00:57:50.000 Because he defended that quite adequately.
00:57:53.000 The problem is Senator Warner.
00:57:56.000 Warner.
00:57:57.000 The problem is the senator.
00:58:00.000 And the other senators.
00:58:02.000 Because we don't trust them.
00:58:03.000 Am I right?
00:58:05.000 He's defending the wrong thing.
00:58:07.000 He should defend why we should trust him.
00:58:11.000 Not why we should trust the bill.
00:58:13.000 Why we should trust him.
00:58:15.000 And he has no argument for that.
00:58:17.000 Sorry.
00:58:18.000 Because, you know, nobody's going to trust the other party.
00:58:21.000 And nobody even trusts your own party.
00:58:23.000 And it all looks suspicious to me.
00:58:25.000 So no.
00:58:26.000 You're not allowed to change the topic.
00:58:28.000 The topic is you, Senator Warner.
00:58:30.000 We don't trust you.
00:58:32.000 And if you say, oh, I have this other senator who agrees with me.
00:58:37.000 Oh, that's just another person I don't trust.
00:58:39.000 Oh, you've also got Romney on your side.
00:58:42.000 I don't trust him.
00:58:45.000 I don't trust him.
00:58:47.000 Because he's in Congress, basically.
00:58:49.000 All right.
00:58:54.000 If he put together a one-pager banning TikTok, I might trust that.
00:59:05.000 Have you heard of the Tartarian theory of why civilizations might disappear?
00:59:15.000 How many of you heard this?
00:59:17.000 This is the greatest.
00:59:19.000 Some call it a conspiracy theory.
00:59:22.000 Some call it true.
00:59:24.000 All right.
00:59:25.000 Let me tell you what it is.
00:59:26.000 It's an idea that there used to be an advanced civilization on Earth that built great structures.
00:59:33.000 From the pyramids to even skyscrapers in New York.
00:59:37.000 So that, you know, a lot of stuff is included.
00:59:40.000 And that there was a great mud flood, literally a bunch of mud, that destroyed that advanced civilization.
00:59:49.000 But the buildings and structures that remained were the tall ones.
00:59:56.000 So that if you looked at the tall, older buildings that are more ornate, the less modern ones, you would find that they have many stories below the ground that used to be above the ground.
01:00:11.000 And that you don't know it, but, you know, it's been hidden from us that there was a great society that got wiped out by mud.
01:00:21.000 No, it's not real.
01:00:24.000 As far as I know, there's no reality to it.
01:00:27.000 But it's the most interesting, the most interesting new theory.
01:00:32.000 And apparently there are some examples of some buildings that do have substantial below ground structure.
01:00:39.000 So they use those as examples.
01:00:42.000 All right.
01:00:43.000 All right.
01:00:44.000 So no, I don't think that's true.
01:00:46.000 Here's a story about female students at the University of Wyoming.
01:00:52.000 They were concerned and now they're suing their sorority because a trans-identified, they call it a trans-identified male.
01:01:04.000 I don't know if that's the wrong way to say it or not.
01:01:07.000 But yes, a trans...
01:01:12.000 No, I think they're using the wrong language here.
01:01:15.000 I think the polite language is there's a trans person who identifies as female.
01:01:22.000 Right.
01:01:23.000 So there's a trans person who identified as female, not the way they were born.
01:01:29.000 At least physically, not the way they were born.
01:01:32.000 And it alleges that this trans woman, Artemis Langford, has been, quote,
01:01:38.000 watching the women undress in the sorority house, sometimes while erect.
01:01:43.000 Now, my first question is, why are all these women who are undressing erect?
01:01:48.000 I mean, is it a clitoral thing?
01:01:52.000 I don't understand.
01:01:53.000 It's possible that the sentence is just confusing.
01:01:56.000 And what they mean is that the trans person, Artemis Langford, is erect, not the women who are undressing.
01:02:02.000 Okay.
01:02:03.000 I think that's probably what it means.
01:02:05.000 All right.
01:02:06.000 Now, here's my question.
01:02:08.000 What's wrong with that?
01:02:11.000 What's wrong with this trans woman who still has male equipment having an erection while
01:02:17.000 in the company of all these women who are just like Artemis?
01:02:21.000 They're just other women.
01:02:23.000 And would they have a problem if Artemis had been born a lesbian?
01:02:30.000 Wouldn't a lesbian, I mean, I don't know too much about lesbian culture, but am I wrong?
01:02:35.000 Would a lesbian be turned on by seeing naked, attractive women?
01:02:40.000 Is that a thing?
01:02:42.000 I don't even know if that's a thing.
01:02:45.000 Is it a thing?
01:02:47.000 Right.
01:02:48.000 So if a lesbian would be allowed in the sorority, and I assume they would be, right?
01:02:52.000 The sorority isn't going to discriminate based on LGBTQ stuff.
01:02:58.000 So if you could be a lesbian in the sorority, it seems to me that a trans woman who also had a physical
01:03:08.000 reaction to women would be the same thing.
01:03:12.000 So I feel like they're discriminating against this poor trans person for having a penis.
01:03:18.000 That doesn't seem right.
01:03:20.000 Because they, you know, they wouldn't discriminate against lesbians.
01:03:27.000 It's not the same?
01:03:30.000 You can never tell when I'm serious, can you?
01:03:32.000 Am I serious?
01:03:36.000 All right.
01:03:37.000 So what I'm doing is a little bit of, you know, basically accepting and amplifying.
01:03:44.000 All right.
01:03:45.000 So I'm doing some amplification to make a point.
01:03:50.000 Embrace and amplify.
01:03:52.000 Correct.
01:03:53.000 So, all right.
01:03:54.000 Here is the most provocative thing you'll ever hear in your entire life.
01:03:58.000 Something that I could never even mention until I'd been canceled.
01:04:04.000 Somebody says I look high.
01:04:06.000 I wish.
01:04:07.000 No, I'm not high right now, believe it or not.
01:04:08.000 I'd tell you if I were.
01:04:10.000 You know, I live a transparent life when it comes to marijuana.
01:04:16.000 I'd tell you.
01:04:17.000 If I were, I'd just tell you.
01:04:18.000 But I'm not.
01:04:20.000 All right.
01:04:21.000 All right.
01:04:22.000 Here's the topic.
01:04:23.000 And first, you need to know where this comes from.
01:04:26.000 All right.
01:04:27.000 Here's my source.
01:04:28.000 So my source is Tyrone Williams, who's on Twitter and goes by Immune Hack.
01:04:34.000 You know, at Immune Hack.
01:04:37.000 Now, I don't know what's true, but I'll tell you what looks true.
01:04:43.000 That's all I can do.
01:04:44.000 What looks true is that Tyrone is a real person, a black American.
01:04:49.000 And the black American part is important to the story.
01:04:53.000 All right.
01:04:54.000 Because if you didn't know that I was quoting a black American man, you would think I'm the
01:05:02.000 worst person in the world for bringing up this topic.
01:05:04.000 But it's not my topic.
01:05:06.000 I think it's useful.
01:05:08.000 I think it's valuable.
01:05:10.000 I don't know if it's true.
01:05:13.000 But it has enough meat on the bones for me to say, I think Tyrone's onto something.
01:05:19.000 And I'm going to share it with you.
01:05:21.000 And the biggest reason that you probably won't have seen this is it's just so dangerous.
01:05:27.000 It's like dangerous thoughts.
01:05:29.000 You ready?
01:05:30.000 Here it comes.
01:05:31.000 Remember, this is from Tyrone, not from me.
01:05:34.000 Okay.
01:05:35.000 And all of this is meant to be constructive.
01:05:37.000 Right?
01:05:38.000 If it seems to you that this is anything but trying to be helpful for America, then you're
01:05:45.000 misinterpreting.
01:05:46.000 I'm trying to be helpful.
01:05:48.000 It goes like this.
01:05:49.000 According to Tyrone, there are at least four or five health-related factors that can affect
01:05:56.000 the cognitive ability of a baby and therefore the cognitive ability of that person when it
01:06:03.000 grows up.
01:06:05.000 Some of those things include weight.
01:06:08.000 If a mother is obese when delivering a child, the child has lower cognitive function.
01:06:13.000 Did you know that?
01:06:15.000 Or at least it's correlated.
01:06:17.000 Let's say it's correlated.
01:06:19.000 Causation.
01:06:20.000 I think you have to be careful about causation.
01:06:23.000 But it's correlated.
01:06:24.000 If you're obese, your kid will have lower cognitive function.
01:06:29.000 Now, some people have rightly said, are you sure you have the causation right?
01:06:33.000 Maybe it's backwards.
01:06:34.000 It might be.
01:06:35.000 It might be.
01:06:36.000 And that would be exactly the kind of thing you want to look into.
01:06:39.000 But it's not the only thing.
01:06:42.000 Other things associated with cognitive function are vitamin D, the inflammation marker
01:06:48.000 IL-6, C-reactive protein or CRP, omega-3,
01:06:54.000 well, we know what that is, but specifically the ratio of omega-3 to omega-6, and then blood pressure.
01:07:01.000 So all of those things are based on science, not guessing, but based on studies.
01:07:08.000 Now, do we trust all studies?
01:07:11.000 And Nicholas Fleming is yelling, name the Jews, Scott.
01:07:16.000 Name them.
01:07:18.000 Nicholas believes, as some of the people on this channel do, that I'm too cowardly to
01:07:23.000 mention the cause of all problems, which he believes is the Jews.
01:07:27.000 We're going to be hiding you. This channel is for more serious people.
01:07:32.000 Sorry, you're gone.
01:07:37.000 But here's the point that Tyrone makes.
01:07:40.000 Every one of these is treatable.
01:07:42.000 Don't have enough vitamin D?
01:07:44.000 Add some vitamin D.
01:07:46.000 Don't have enough ratio of this to that?
01:07:48.000 Improve your ratio.
01:07:50.000 Blood pressure?
01:07:51.000 Yes.
01:07:52.000 Weight?
01:07:53.000 Yes.
01:07:54.000 Inflammation?
01:07:55.000 Yes.
01:07:56.000 There are a bunch of factors that can all be treated.
01:07:59.000 And here's the kicker.
01:08:01.000 All of the things I mentioned are at far higher rates in black Americans.
01:08:07.000 Far higher rates.
01:08:10.000 Obesity, for example, listen to this one.
01:08:12.000 This just blew my mind.
01:08:14.000 Do you know that 80% of black women are obese?
01:08:19.000 But two-thirds of white women are obese, too.
01:08:22.000 Two-thirds of white women, and 80% of black women are obese.
01:08:27.000 Huh.
01:08:28.000 I wonder why the...
01:08:33.000 I wonder why the birth rate is going down.
01:08:38.000 Like, sometimes the answers are so fucking obvious.
01:08:45.000 All right.
01:08:46.000 Let's not get into that, though.
01:08:48.000 All right, so Tyrone's point, which I think is brilliant and brave, is that there are identifiable, completely treatable medical conditions that might completely change the IQ differential between black Americans and everybody else.
01:09:13.000 Now, my biggest issue is that correlation and causation are not demonstrated.
01:09:20.000 Right?
01:09:21.000 But, you could test it.
01:09:27.000 Justin Dunn needs to go away, because he also wonders why I'm not blaming the Jews for everything.
01:09:34.000 Maybe because I'm not a crazy conspiracy theorist.
01:09:39.000 Maybe.
01:09:40.000 Maybe.
01:09:41.000 Maybe.
01:09:42.000 All right.
01:09:44.000 So, what do you think of Tyrone's approach?
01:09:48.000 You can see how toxic it is, right?
01:09:50.000 Because as soon as you talk about IQ difference, even though IQ difference is the primary indicator of success.
01:09:57.000 If it's true that black Americans, for a variety of reasons, are giving birth in a suboptimal way, healthcare-wise, why don't we take that seriously?
01:10:13.000 At least test it.
01:10:14.000 At least test it.
01:10:15.000 You know, you could take a county or a state and say, you know, I don't know if this is a racist theory or if it's just good thinking.
01:10:22.000 So, let's just take a population of people, correct everything, see how their kids do by third grade.
01:10:30.000 Right?
01:10:31.000 By third grade, you'd know if it worked.
01:10:34.000 Wouldn't you?
01:10:36.000 So, here's what I say.
01:10:37.000 What I say is, if you can test it small, and the potential benefit would be extreme, and the cost to do it looks doable, then you do it.
01:10:52.000 Because most of these things are education plus, you know, some supplements, basically.
01:10:58.000 So, I think that's worth doing.
01:11:03.000 I think that's worth trying.
01:11:05.000 Third grade is the inflection point.
01:11:07.000 I think you'd know by third grade, if it worked.
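For what a small test like that might look like on paper, here's a minimal sketch, again with every number invented: a hypothetical pilot where one group gets the health interventions and one doesn't, and third-grade test scores are compared with a rough confidence interval. The group sizes, score distributions, and the assumed three-point effect are pure illustration, not predictions or real data.

import random
import statistics

random.seed(42)
N = 500  # children per arm in the hypothetical pilot

# Simulated third-grade test scores; the +3 point treatment effect is an
# assumption made up for this sketch, not a claim about the real world.
control = [random.gauss(100, 15) for _ in range(N)]
treated = [random.gauss(103, 15) for _ in range(N)]

diff = statistics.mean(treated) - statistics.mean(control)
# Standard error of the difference between two independent means.
se = (statistics.variance(treated) / N + statistics.variance(control) / N) ** 0.5
low, high = diff - 1.96 * se, diff + 1.96 * se
print(f"observed difference: {diff:.2f} points")
print(f"approximate 95% CI: ({low:.2f}, {high:.2f})")

If the interval sits clearly above zero by third grade, the pilot worked; if it straddles zero, you've learned that cheaply, which is the whole point of testing small.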
01:11:14.000 What about the slippery slope?
01:11:17.000 The slippery slope, just so I clarify.
01:11:21.000 Claire, I'm going to make you go away for too many repeated comments in caps.
01:11:29.000 Here's what I say about the slippery slope.
01:11:32.000 When people say slippery slope, it's lazy thinking.
01:11:36.000 Meaning that there's nothing about a slippery slope that tells you anything about a topic.
01:11:42.000 There are some things that will go until something changes them.
01:11:46.000 So, I prefer to say, everything will go forever until something stops it.
01:11:53.000 That's a more productive way to look at it.
01:11:55.000 And, usually something pops up to stop it.
01:11:58.000 But usually after it's gone too far.
01:12:01.000 Right?
01:12:02.000 You have to go too far before you get the reaction.
01:12:05.000 So, in my view, the acronyms and the, you know, let's say the wokeness, I think it's guaranteed to end.
01:12:15.000 It's guaranteed to end.
01:12:17.000 Because it went too far.
01:12:19.000 Now, if I believed in the slippery slope, I would say it will just keep going forever.
01:12:24.000 But you can already see the resistance starting to pop up.
01:12:28.000 People willing to say things they couldn't even say out loud a year ago.
01:12:32.000 They're willing to say it out loud.
01:12:33.000 I mean, even the fact that, you know, people are saying seriously that the trans situation is at least partly mental health.
01:12:42.000 Not all of it.
01:12:44.000 But some part of it is mental health.
01:12:46.000 That's a brave thing to say in public under the current situation.
01:12:51.000 And I would argue that the trans community needs to do a better job of separating themselves from the crazy parts.
01:13:00.000 Right?
01:13:01.000 Just as, you know, if you're a Trump supporter, you want to separate yourself from the white supremacists who might also like him.
01:13:09.000 Right?
01:13:10.000 It makes sense for your brand to separate yourself from the crazies.
01:13:13.000 So, I think the trans community would help themselves and help the rest of us if they helped us identify the crazy people.
01:13:24.000 It's not as easy for us, I think.
01:13:27.000 Get the hell away.
01:13:29.000 That's funny.
01:13:31.000 Alright, ladies and gentlemen.
01:13:33.000 This concludes the most shocking and entertaining live stream in all of reality.
01:13:38.000 I don't think you'll see a better thing today.
01:13:43.000 That's my belief.
01:13:44.000 And I'm going to say goodbye to the YouTube people for now.
01:13:48.000 Thanks for joining.
01:13:49.000 And I hope you learned something.
01:13:51.000 Bye for now.
01:14:03.000 Bye.