Real Coffee with Scott Adams - July 17, 2022


Episode 1807 Scott Adams: Who Is Trying To Brainwash You This Week And Largely Succeeding


Episode Stats

Length

1 hour and 30 minutes

Words per Minute

149.2

Word Count

13,522

Sentence Count

993


Transcript

00:00:00.000 And welcome to the highlight of your life and civilization itself.
00:00:04.420 Sort of a bright, glowing spot in this big universe of darkness.
00:00:11.740 But, where you are, things are looking up.
00:00:15.420 Things are looking up.
00:00:16.560 And all you need to take it to another level
00:00:18.620 is a cup or mug, a glass, a tank or chalice or stein, a canteen jug or flask,
00:00:21.860 a vessel of any kind, fill it with your favorite liquid.
00:00:27.080 You're right, Dennis.
00:00:28.080 That Scott Adams is one bad dude.
00:00:31.880 True. Fact check. True.
00:00:34.960 And now, will you join me for the unparalleled pleasure,
00:00:39.120 the dopamine hit of the day,
00:00:40.500 the thing that makes everything better in the entire world.
00:00:42.920 It's called the simultaneous sip, and nothing's ever been better. Go.
00:00:50.180 Ah.
00:00:50.740 I would like to start with a checklist for the golden age.
00:01:01.360 Are you ready?
00:01:02.560 I think I might put this on paper, thinking about it.
00:01:06.620 I was thinking about all the things which have somewhat subtly turned positive
00:01:12.060 while we're in this environment of everything going to shit.
00:01:16.180 And you can lose sight of anything that's maybe getting better
00:01:21.040 because our focus is always on, you know, the latest disaster.
00:01:25.760 Number one, there have been no major forest fires in California this year.
00:01:30.640 Now, knock on wood, right?
00:01:32.120 Pup, pup, pup, pup.
00:01:32.660 But I believe that's because of the helicopter rapid response team.
00:01:38.020 I think we're actually getting on the fires faster.
00:01:41.380 In other words, we adjusted.
00:01:43.320 We found a way to make a difference.
00:01:45.280 We're not talking about it, right?
00:01:47.580 Why is nobody talking about California on fire?
00:01:51.960 It could be luck.
00:01:53.760 It could be luck.
00:01:54.620 It could be weather.
00:01:55.400 It could be chance, right?
00:01:56.800 But I feel like maybe we got on top of it.
00:02:00.300 Or at least there's a substantial improvement.
00:02:04.080 How about climate change and nuclear energy,
00:02:07.900 which is really the only solution to it?
00:02:10.080 All positive.
00:02:11.440 All positive.
00:02:12.360 Everything in the nuclear energy field went from super negative,
00:02:16.500 I don't know, five years ago, to super positive.
00:02:20.560 So you've got your climate change, your nuclear energy.
00:02:23.260 Now, in the short run, we've got a lot of challenges with power, etc.
00:02:30.600 Now, let me ask you about, let's talk about inflation.
00:02:35.380 Energy prices seem to have maybe stabilized and dropped a little.
00:02:39.700 Too soon to be happy.
00:02:41.560 But it looks like inflation, at least on the energy part, the biggest part,
00:02:46.740 looks like it might have slowed down.
00:02:48.120 And then the energy cost, if it slows down, will ripple through the rest of the product stuff.
00:02:55.160 We might have seen something closer to the top of inflation.
00:02:58.940 I guess some of the core stuff is still going to be higher for a few months.
00:03:02.680 But that should get pulled back.
00:03:04.500 It should get pulled back.
00:03:06.520 Because there...
00:03:08.460 Now, let's take China.
00:03:11.120 I just saw another article.
00:03:12.340 It's hard to speculate what's happening in China, right?
00:03:15.000 We don't know.
00:03:15.540 But it looks like China is in an actual recession already, if you read the tea leaves, right?
00:03:21.820 They don't report that, but it looks like they might be.
00:03:25.260 And it looks like China, as the growing power that we have to worry about,
00:03:31.600 may have taken quite a hit.
00:03:33.460 Because I think their manufacturing base will be, I would say, drained forever.
00:03:41.620 I don't think people are going back.
00:03:43.120 So, in terms of how we deal with China, even though it's hard for our supply chains and stuff,
00:03:49.780 I believe we're probably moving our pharma back.
00:03:53.700 Because if anything scared us straight, I would imagine big pharma has plenty of money.
00:03:59.400 Am I right?
00:04:00.700 Does big pharma have money?
00:04:02.140 Of course.
00:04:02.800 They've got all they need.
00:04:03.800 So, they can move their plants back to wherever they need to get it out of China.
00:04:08.240 So, I believe that our position in the world relative to China is better.
00:04:17.360 What do you think?
00:04:19.240 I mean, you could easily be wrong.
00:04:21.100 We could all be wrong about this stuff.
00:04:22.660 But even though we both took a hit,
00:04:24.500 I believe the United States will take less of a hit going forward.
00:04:31.520 About Russia.
00:04:33.040 Our two biggest competitors in the world, right?
00:04:35.500 Russia and China.
00:04:37.040 I feel like Russia is taking a step backwards relative to the United States.
00:04:42.060 At the same time, the United States is not doing, you know, terrific.
00:04:45.480 But, relatively, I think we opened a gap.
00:04:50.820 So, I would say that the United States has improved its position, maybe, maybe,
00:04:56.040 with Russia and China, its two biggest competitors.
00:04:59.900 I think we'll get past inflation.
00:05:02.580 What about that risk of nuclear war with Russia?
00:05:06.660 I haven't worried about it lately, have you?
00:05:09.340 I feel like we're past it.
00:05:11.440 And I feel like there's some kind of a stalemate situation happening there
00:05:14.580 that's expensive.
00:05:17.100 But I feel like the end point of Ukraine is looking a little more clear.
00:05:22.840 Looks like a stalemate where Russia will control what they control
00:05:26.580 and Ukraine will complain about it forever.
00:05:31.040 How about AI?
00:05:35.420 Now, AI is either going to be the greatest thing or the worst thing,
00:05:38.760 but it's going to change everything.
00:05:40.100 Whatever you think about the change that AI will bring, you have no idea.
00:05:47.640 There's nothing that will be the same.
00:05:50.180 After virtual reality and AR hit, AI,
00:05:54.500 if you combine AI and virtual reality,
00:05:57.940 what life looks like in 10 years is completely unpredictable.
00:06:03.440 There's just no way to know how it's going to go.
00:06:05.180 I mean, it's going to be really interesting.
00:06:06.600 But it might be better.
00:06:09.580 There's a whole segment of the world that literally can't find happiness
00:06:13.380 because the resources don't exist.
00:06:16.540 There are people who can't find a mate,
00:06:20.000 which is sort of important for happiness.
00:06:24.780 And I think people will just take a drug,
00:06:29.440 put it on the VR helmet, and go into their other life,
00:06:32.340 and that other life will be fully satisfying.
00:06:36.520 Even if in the other life they're in love with an AI,
00:06:39.520 an NPC or something, driven by an AI.
00:06:42.260 It'll be a relationship.
00:06:44.900 And it'll be somebody that's not treating them poorly.
00:06:48.180 So I think that for, I don't know, 60% to 80% of the public,
00:06:53.900 they would be better off with an AI partner.
00:06:58.660 Eventually.
00:06:59.100 Everybody who disagrees with that,
00:07:02.940 100% of the people who disagree with that statement,
00:07:05.900 are thinking of current technology,
00:07:08.420 which couldn't come close to that.
00:07:10.980 But future technology,
00:07:12.760 and it's not that far in the future,
00:07:15.400 in the future there will be a better version
00:07:18.140 of a human for your enjoyment
00:07:21.420 than any human could ever compete with.
00:07:25.400 It won't even be close.
00:07:26.840 In the long run, it won't even be close.
00:07:29.460 It'll take a while for the crossover.
00:07:31.180 We're not there.
00:07:32.500 People, still better than robots.
00:07:34.520 Still better.
00:07:35.780 But that gap is just closing like crazy.
00:07:38.420 There's no question where it's going.
00:07:40.600 I mean, it can only go one direction.
00:07:44.080 Can they do plumbing and electrical?
00:07:46.680 Probably.
00:07:48.120 Probably.
00:07:48.560 All right.
00:07:50.160 I made a claim on Twitter.
00:07:55.840 I got a slight adjustment to it by Paul Graham.
00:07:59.140 If you don't know Paul Graham,
00:08:01.300 you can Google him,
00:08:02.540 but he's one of the high-value accounts.
00:08:05.620 You know, tech founder,
00:08:07.600 generally considered one of the smart people
00:08:09.540 in California or Silicon Valley,
00:08:12.940 or I'm not sure where he lives now.
00:08:14.180 But anyway, the point of that is he's real smart.
00:08:17.860 So when he says stuff, you should pay attention.
00:08:20.780 And I had said, provocatively,
00:08:23.740 that the most valuable words for success
00:08:27.600 are be useful.
00:08:29.640 That if you didn't know anything else
00:08:31.460 about anything else,
00:08:33.020 and you're just like,
00:08:33.760 well, how do I be successful?
00:08:35.660 Just focus on being useful.
00:08:37.240 And the more useful you are,
00:08:40.360 the better off you'll be, right?
00:08:42.200 More is better, but be useful.
00:08:44.080 If you focus on that, things will go well.
00:08:46.460 Now, a lot of people agreed.
00:08:48.180 Paul Graham added this twist.
00:08:50.940 He said that if it's a business,
00:08:54.340 that's effectively making what people want.
00:08:57.080 So basically, if you're in a business,
00:09:01.020 being useful translates to making products
00:09:04.060 that people want.
00:09:04.920 And here's the part that gets interesting.
00:09:08.540 I'm not sure that's exactly right.
00:09:12.220 It's like close.
00:09:13.580 Feels like an 80% sort of situation.
00:09:21.300 And here's why I think there's a difference,
00:09:24.740 but help me think this through.
00:09:26.600 This is like preliminary thinking.
00:09:28.780 In my opinion,
00:09:30.420 if you focus on what the customer wants,
00:09:32.880 you can get a good result.
00:09:35.940 If you focus on making yourself useful,
00:09:40.800 I think you could also get a good result.
00:09:43.440 If you focus on making yourself useful
00:09:45.800 via the process of making a product
00:09:48.760 that somebody wants,
00:09:50.740 well, then I think Paul Graham's point
00:09:52.400 makes a lot of sense.
00:09:53.800 Like the being useful
00:09:54.720 and giving them something they want
00:09:56.280 ends up being pretty much the same thing.
00:09:58.540 But here's where I disagree a little bit.
00:10:01.920 And I'm not sure if my thinking is clear on this yet.
00:10:05.160 I feel like what you focus on makes a difference.
00:10:08.840 Even if you say,
00:10:09.840 well, you know,
00:10:11.020 the outcome is going to look about the same.
00:10:12.720 I don't know.
00:10:14.020 I think what you focus on
00:10:15.420 makes a difference for the gray areas
00:10:17.960 of which you would expect many.
00:10:20.480 I think if you focus on being useful,
00:10:22.460 you're always going to win.
00:10:23.560 But if you focus on giving people what they want,
00:10:27.480 sometimes they don't know what they want.
00:10:31.100 Know what I mean?
00:10:33.180 Being useful is a little bit more objective.
00:10:36.160 Okay, this is useful or not.
00:10:38.540 This is more useful than that.
00:10:40.240 You can usually kind of suss that out.
00:10:42.700 What is more useful than something else?
00:10:45.300 But I don't know if you can deal with what people want.
00:10:49.280 People's wants are not necessarily
00:10:51.000 tethered to what they need
00:10:52.740 or what's good for them
00:10:53.840 or they even know
00:10:55.160 how it's going to turn out.
00:10:57.240 They've got short-term wants
00:10:58.660 that are in conflict
00:11:00.740 with their long-term needs, right?
00:11:03.000 So I feel as if focusing on anybody's wants
00:11:06.700 is going to give you some problems sometimes.
00:11:12.520 The difference is mind-reading, maybe.
00:11:15.480 All right.
00:11:17.220 Apparently there's a class-action suit
00:11:19.080 against the candy Skittles
00:11:21.140 because there's some ingredient in there,
00:11:24.420 titanium dioxide,
00:11:25.940 that I think in Europe it's already banned
00:11:28.000 and there's some indication
00:11:30.000 that it might not be good for you.
00:11:31.920 But when I saw this,
00:11:33.120 I thought, you know,
00:11:33.820 I don't have too much of a thought about Skittles
00:11:36.300 because I have a rule.
00:11:39.500 I don't eat things
00:11:41.300 which don't seem to have any food in them.
00:11:48.560 So that's just my rule.
00:11:50.780 If you're putting something in your mouth
00:11:52.580 and digesting it
00:11:54.880 and it doesn't have any food-related quality
00:12:00.620 that you can determine,
00:12:02.580 well, I feel like you're really asking for trouble there.
00:12:06.140 Probably just food and medicine
00:12:09.180 is the only thing you should put in your mouth.
00:12:11.240 Okay, you've got a dirty mind.
00:12:13.140 Go ahead, make your dirty jokes.
00:12:15.860 Okay, we're done.
00:12:17.220 But otherwise, the only thing you should put in your mouth
00:12:19.040 is food and medicine and dirty stuff.
00:12:23.320 And naughty stuff.
00:12:24.340 But this brings me to my macro point.
00:12:31.160 Our food supply,
00:12:34.400 maybe it's more the United States
00:12:36.740 or the industrialized countries,
00:12:38.660 but let me tell you something
00:12:39.760 that's really, really obvious in America.
00:12:43.780 I want to see if there's anybody
00:12:45.340 who will disagree with the following statement.
00:12:48.060 Our food supply is poison.
00:12:52.860 You just look around.
00:12:55.200 People, just look around.
00:12:58.160 Just go to the mall and just look around.
00:13:02.100 And you tell me,
00:13:03.100 our food supply is not fucking poisoned.
00:13:06.080 It's obvious.
00:13:07.880 It's obvious.
00:13:09.440 And somehow, the food industry
00:13:11.500 got away with blaming us.
00:13:14.260 Think about that.
00:13:15.540 The food business is getting away
00:13:18.200 with blaming us
00:13:19.440 for being unhealthy
00:13:21.120 because we ate the wrong food.
00:13:24.340 I don't think so.
00:13:25.960 If you're selling, you know,
00:13:27.820 crack to children,
00:13:29.780 I don't think it's the children's fault
00:13:31.480 if they get addicted, is it?
00:13:32.900 If you sell crack to children,
00:13:36.060 it's your fucking fault.
00:13:37.720 It's not the children's fault.
00:13:39.500 And in this analogy,
00:13:40.740 we adults are the children.
00:13:41.880 We don't know any better.
00:13:43.260 You give me a fucking french fry,
00:13:44.680 I'm going to put it in my mouth.
00:13:46.540 Right?
00:13:47.080 Do I know that eating french fries
00:13:48.900 is not my optimal health-related thing?
00:13:51.260 Oh, I do.
00:13:52.840 I do.
00:13:53.740 But if you put a basket of fries
00:13:55.420 in front of me,
00:13:57.060 do you know what I'm going to say
00:13:58.000 in about a second and a half?
00:13:59.940 Well, one basket of fries
00:14:01.320 isn't going to hurt me.
00:14:02.860 Right?
00:14:03.140 I'm going to rationalize it
00:14:04.320 in a heartbeat
00:14:05.560 because they're really delicious.
00:14:09.540 We don't have a chance.
00:14:11.700 This food is engineered
00:14:12.800 to overwhelm our whatever
00:14:15.220 you think is your willpower,
00:14:16.860 which is an illusion.
00:14:17.920 So you're operating against something
00:14:21.040 that's scientifically designed
00:14:22.540 to beat you.
00:14:23.800 If you're fighting every day
00:14:25.300 against, let's say, an enemy
00:14:29.160 who is scientifically designed
00:14:31.720 to beat you every time,
00:14:34.360 well, don't be surprised if you lose
00:14:36.360 because it was scientifically designed
00:14:39.340 to beat you every time.
00:14:41.140 That's the sugar, fat, and salt combination.
00:14:45.220 If you manipulate the sugar, fat, and salt,
00:14:47.280 which is the name of a book
00:14:48.860 you should read,
00:14:50.080 it's addictive.
00:14:53.400 Right?
00:14:53.660 There's a reason you can only,
00:14:55.100 you can't eat more than one potato chip.
00:14:57.640 Right?
00:14:58.040 That's real.
00:14:59.460 I really can't eat one.
00:15:01.460 I've tried.
00:15:02.740 Have you ever tried to eat
00:15:03.480 one potato chip?
00:15:05.480 If you're all alone
00:15:06.560 and you keep the potato chips
00:15:08.980 in your room
00:15:09.500 and you're still hungry?
00:15:11.440 You just can't do it.
00:15:12.800 I would say that
00:15:14.640 if you were to rank me
00:15:17.060 against other human beings
00:15:19.340 in terms of willpower,
00:15:21.460 I think I'd be pretty high up there.
00:15:23.780 I would say my track record of life
00:15:26.260 suggests that I can put off pleasure
00:15:28.820 for a long time
00:15:29.980 to get some larger gain.
00:15:31.780 I have pretty high emotional intelligence
00:15:34.860 as objectively measured.
00:15:37.260 But I can't resist a second potato chip.
00:15:40.140 That's crazy.
00:15:42.340 I would just talk myself into it in a heartbeat.
00:15:45.120 So,
00:15:46.280 I think we have to come to
00:15:48.500 some kind of a
00:15:50.140 I don't know,
00:15:52.260 realization or awareness.
00:15:54.660 Our food supply is killing us.
00:15:57.260 It's not even close to adequate.
00:15:59.200 It's not even close.
00:16:00.940 It's not even in the general zip code
00:16:03.120 of fucking food.
00:16:06.100 Have you ever met somebody
00:16:07.080 who ate clean?
00:16:08.980 Like really clean?
00:16:10.140 Now you think you have
00:16:13.100 because you have a friend
00:16:14.100 who's a vegan or something like that.
00:16:15.480 No, not even close.
00:16:17.220 The difference between eating clean,
00:16:19.620 like really eating clean,
00:16:20.980 no additives,
00:16:22.040 no preservatives,
00:16:22.840 all that stuff.
00:16:24.700 If you meet one of those people,
00:16:27.300 they're not like you, are they?
00:16:29.260 They're not like you.
00:16:31.020 Their bodies and their health
00:16:32.680 is really different.
00:16:34.400 And it's very noticeable.
00:16:35.360 I once
00:16:37.020 briefly dated a woman
00:16:40.100 who was just the cleanest eater
00:16:42.180 and didn't use any
00:16:43.500 unnatural shampoos
00:16:46.420 or didn't even put
00:16:47.600 things on externally.
00:16:50.740 And it was so obvious
00:16:53.180 that that human being
00:16:55.020 was operating at a different level
00:16:56.600 of health.
00:16:57.840 It was just really obvious.
00:16:58.800 And I don't think
00:16:59.520 she started
00:17:01.060 as the healthiest person
00:17:02.180 in the world.
00:17:02.640 I think it's what she eats.
00:17:04.540 I think it's just that.
00:17:05.760 Now, I couldn't do what she did.
00:17:07.800 It was just too extreme.
00:17:09.280 It was just really extreme.
00:17:11.000 But no question it worked.
00:17:14.040 No question it worked.
00:17:17.720 All political movements
00:17:19.040 of any scale are fake
00:17:20.360 in terms of protests,
00:17:23.020 you know, like big movements.
00:17:24.000 whether it's your Antifas
00:17:26.400 or your January 6th events
00:17:31.380 or your fine people marches.
00:17:36.580 I think they're all fake.
00:17:38.960 And here's what I mean.
00:17:40.540 I don't mean that the individuals
00:17:42.420 who are marching
00:17:43.400 don't have sincere feelings.
00:17:46.060 They do.
00:17:47.520 But they wouldn't be there
00:17:49.100 except that
00:17:50.660 some artificial entity
00:17:52.800 funded it
00:17:53.860 and organized it, right?
00:17:55.320 Now, here's the thinking.
00:17:57.020 You can imagine a time
00:17:58.560 in the 70s,
00:17:59.660 I don't know if it's true,
00:18:00.580 but you can imagine a time
00:18:01.960 when maybe spontaneous
00:18:04.160 large-scale protests
00:18:06.120 did happen, right?
00:18:08.260 Maybe.
00:18:09.360 But probably not even then.
00:18:11.300 And here's the reason.
00:18:14.100 It's that the organizers
00:18:15.380 have too much to gain.
00:18:17.720 They can monetize it.
00:18:19.480 If it's possible
00:18:20.520 to monetize leadership
00:18:22.140 of an organization,
00:18:24.160 well, you're done here, right?
00:18:26.360 Follow the money.
00:18:27.960 What's going to happen
00:18:28.900 if it's possible,
00:18:30.120 just if it's possible,
00:18:31.600 to monetize leadership
00:18:33.360 of a political movement?
00:18:36.760 Somebody's going to buy it.
00:18:38.940 There's somebody
00:18:39.420 who's going to buy your loyalty.
00:18:41.040 Because they can.
00:18:41.780 It's for sale.
00:18:42.960 You can monetize it.
00:18:43.920 Now,
00:18:45.200 without going into
00:18:47.760 a long argument,
00:18:48.700 here's the short version.
00:18:50.360 If you follow the money,
00:18:52.840 it has to be,
00:18:54.260 in the long run,
00:18:55.680 that all major
00:18:56.700 organized political movements
00:19:00.000 are all artificial.
00:19:01.800 In other words,
00:19:02.280 somebody just funded them
00:19:03.460 to look like
00:19:04.560 a grassroots movement.
00:19:05.920 But it's the organizers
00:19:07.200 that are getting
00:19:07.760 everybody spun up.
00:19:08.720 In the long run,
00:19:10.940 they all become owned
00:19:12.880 by a bad entity.
00:19:14.540 Because they're all
00:19:15.220 for sale.
00:19:16.900 Ultimately.
00:19:17.940 In the short run,
00:19:18.880 maybe not.
00:19:19.820 In the short run,
00:19:20.640 you can imagine
00:19:21.160 people spontaneously
00:19:22.200 running into the street
00:19:23.320 to protest something.
00:19:24.740 And I think maybe
00:19:25.480 the abortion situation
00:19:27.660 on day one
00:19:28.600 was probably
00:19:29.880 just organic.
00:19:30.820 On day one,
00:19:32.560 people probably just said,
00:19:33.740 oh, shit.
00:19:34.320 And they just got up
00:19:35.060 and made a sign
00:19:35.700 and walked outside.
00:19:37.960 But,
00:19:38.660 wait a little while,
00:19:40.980 whoever has the most
00:19:41.960 money and influence
00:19:42.860 will eventually
00:19:43.620 get to whoever
00:19:45.260 the organic leaders are
00:19:46.700 and say,
00:19:47.040 you know,
00:19:47.820 you're doing well
00:19:48.500 as an organic leader,
00:19:49.600 but you're not making
00:19:50.240 any money,
00:19:50.780 you're not feeding
00:19:52.360 your family.
00:19:53.440 How about I give you
00:19:54.340 a bunch of money,
00:19:55.260 make you much more effective?
00:19:57.000 They say,
00:19:57.440 yeah,
00:19:58.340 you're on my side.
00:20:00.140 I mean,
00:20:00.360 there's no downside,
00:20:01.940 right?
00:20:02.620 I just get more money
00:20:03.680 and I do the same thing.
00:20:05.360 But people who give you
00:20:06.360 money own you.
00:20:09.000 No matter what
00:20:09.900 your original motivation was,
00:20:12.280 as soon as you start
00:20:13.140 taking money,
00:20:13.880 you've got a boss.
00:20:15.880 Right?
00:20:16.280 So basically,
00:20:17.200 every leader
00:20:17.840 who might be
00:20:19.040 organic in the beginning
00:20:20.480 ends up getting owned
00:20:22.100 by somebody
00:20:22.640 who has enough money
00:20:23.360 to do that.
00:20:24.920 So,
00:20:25.640 I can't see
00:20:26.700 any situation
00:20:27.500 in which a large-scale
00:20:29.400 protest on
00:20:30.280 either side
00:20:31.020 for any reason
00:20:32.560 would ever be organic
00:20:34.340 in the long run.
00:20:37.720 I just can't see it.
00:20:39.780 So,
00:20:40.040 from this point on,
00:20:41.200 I assume all large
00:20:42.040 protests are fake.
00:20:44.020 A little bit.
00:20:45.040 Doesn't mean the people.
00:20:46.460 The people might be
00:20:47.260 very sincere.
00:20:48.280 But in terms of
00:20:49.040 what got them all there,
00:20:51.040 probably always fake
00:20:52.340 from now on.
00:20:53.600 I don't think
00:20:54.240 we live in a world
00:20:54.860 in which it can't
00:20:56.340 rapidly go fake
00:20:57.560 because follow
00:20:58.560 the money.
00:21:00.000 The money
00:21:00.400 will always be there.
00:21:01.540 There's always
00:21:01.900 going to be somebody
00:21:02.520 on that side.
00:21:03.620 Even if there's
00:21:04.340 nobody in the United
00:21:05.140 States
00:21:05.640 who wants this
00:21:06.880 protest.
00:21:08.040 There's always China.
00:21:09.720 There's always
00:21:10.240 another country
00:21:11.580 that wants some
00:21:12.120 trouble here.
00:21:13.340 So you can always
00:21:14.120 get it funded.
00:21:14.580 NPR has launched
00:21:20.760 what they call
00:21:21.760 a disinformation
00:21:22.600 reporting team.
00:21:24.580 So a group of people
00:21:25.580 who will tell you
00:21:26.400 what's true
00:21:26.940 and what isn't.
00:21:28.500 Here is what
00:21:29.660 I invented today.
00:21:31.160 I call it the
00:21:31.760 Adams Law
00:21:32.600 of Fact Checkers.
00:21:35.060 The Adams Law
00:21:36.340 of Fact Checkers.
00:21:38.260 It goes like this.
00:21:40.360 All fact-checking
00:21:41.300 organizations are
00:21:42.240 liars by omission.
00:21:43.800 and it isn't
00:21:45.440 fixable.
00:21:47.360 All fact-check
00:21:48.440 organizations are
00:21:49.600 liars by omission.
00:21:51.660 That's their job.
00:21:53.120 Their job is to
00:21:54.120 lie by omission.
00:21:55.580 If you don't know,
00:21:56.380 that's the real
00:21:57.060 mission.
00:21:58.340 So the job of
00:21:59.420 a fact-checker
00:22:00.000 is to say,
00:22:01.100 well, obviously
00:22:01.800 there are too many
00:22:02.380 things in the world.
00:22:04.220 You can't look at
00:22:05.060 every fact and
00:22:06.940 try to check it,
00:22:07.680 right?
00:22:07.840 So you have to
00:22:08.180 pick ones.
00:22:09.260 What happens when
00:22:10.080 the fact-checkers
00:22:10.900 pick the facts
00:22:11.920 to fact-check?
00:22:13.920 Well, they're not
00:22:14.760 going to pick the
00:22:15.300 one that's bad
00:22:15.940 for their team,
00:22:17.020 so they just
00:22:18.120 lie by omission.
00:22:19.600 They simply
00:22:20.340 leave out a
00:22:21.240 fact-check,
00:22:22.140 because it would
00:22:22.540 be inconvenient
00:22:23.200 to have one,
00:22:23.900 and then when
00:22:24.240 you look for
00:22:24.640 the fact-check,
00:22:25.300 it goes,
00:22:25.600 well,
00:22:26.340 nothing there.
00:22:28.260 So, and if
00:22:29.720 you've read any
00:22:30.320 of the fact-checks
00:22:31.120 so far,
00:22:32.620 the ones that
00:22:33.620 are obviously
00:22:34.080 fake fact-checks,
00:22:35.340 and you know
00:22:36.000 which ones I
00:22:36.600 mean, right?
00:22:37.480 You've seen it
00:22:38.160 yourself.
00:22:38.900 There are ones
00:22:39.320 that are just
00:22:39.620 obviously fake
00:22:40.560 fact-checks.
00:22:42.640 You can tell
00:22:43.400 what they left
00:22:43.920 out.
00:22:45.080 You just look
00:22:45.920 at it and go,
00:22:46.580 well, you know,
00:22:47.960 you've left out
00:22:48.660 like a main
00:22:49.280 variable.
00:22:49.860 You haven't
00:22:50.080 even ruled on
00:22:50.880 it, like you've
00:22:51.680 just ignored a
00:22:52.420 main variable.
00:22:53.760 That's a lie.
00:22:55.380 If your fact-checking
00:22:56.760 ignores what is
00:22:58.160 obviously a major
00:22:59.480 variable in the
00:23:00.460 decision, that's
00:23:02.300 a lie by omission.
00:23:04.320 So in the long
00:23:05.280 run, I don't
00:23:06.080 think they can
00:23:06.620 go in any other
00:23:07.360 direction.
00:23:07.760 So all
00:23:09.320 political
00:23:09.720 organizations
00:23:10.420 become, you
00:23:11.440 know, corrupted
00:23:12.020 eventually by
00:23:14.240 money, and all
00:23:15.840 fact-checkers
00:23:16.680 eventually become
00:23:18.240 liars by
00:23:20.920 omission.
00:23:21.960 Now, I don't
00:23:22.840 think that they
00:23:23.460 usually lie
00:23:24.540 directly.
00:23:27.000 There might be a
00:23:27.900 case of that,
00:23:28.780 but I think that
00:23:29.920 would be rare.
00:23:31.040 You don't see the
00:23:31.720 fact-checkers say,
00:23:32.600 oh, this happened
00:23:33.320 on this date when
00:23:34.420 it didn't.
00:23:35.120 The fact-checking
00:23:37.300 tends to be
00:23:37.860 right, but they
00:23:39.320 lie by leaving
00:23:40.420 out other facts.
00:23:42.300 If there's a
00:23:42.920 fact on the
00:23:43.400 other side, just
00:23:44.000 don't mention it.
00:23:45.200 There you go.
00:23:46.820 So do you
00:23:47.720 think that the
00:23:48.220 fact-checkers could
00:23:49.500 ever act
00:23:50.100 independently?
00:23:53.300 It's not even
00:23:54.100 designed to do
00:23:54.820 that.
00:23:55.800 Do you think
00:23:56.200 that they look
00:23:56.680 for the most
00:23:57.100 independent fact-checkers
00:23:58.560 to form the...
00:23:59.720 Do you think
00:24:00.220 NPR said, you
00:24:01.120 know, we are a
00:24:02.720 left-leaning
00:24:03.160 organization, but
00:24:05.260 for fact-checking,
00:24:06.520 that's exactly the
00:24:07.740 kind of bias you
00:24:08.480 want to get rid
00:24:09.180 of.
00:24:10.120 So rather than
00:24:11.020 just maybe
00:24:11.540 promoting our
00:24:12.480 obviously left-leaning
00:24:13.660 people into a
00:24:14.520 job of trust and
00:24:15.900 fact-checking, we're
00:24:17.280 going to scour the
00:24:18.020 world and find
00:24:19.440 people who are
00:24:20.020 famous for being
00:24:21.160 able to disagree
00:24:22.680 with their own
00:24:23.340 team.
00:24:24.780 That's the people
00:24:25.580 you want with
00:24:26.040 your fact-checkers,
00:24:27.020 because that's
00:24:27.500 people who have a track record that says, I don't care if my team says this is true. It's just not true.
00:24:33.160 And then they hired those people who could go against the grain, because that's who you could trust.
00:24:38.880 No. Nothing like that's happening.
00:24:42.860 They're hiring leftists, and what do you think the leftists would do if they got a situation in which,
00:24:50.680 let's say, hypothetically, let's say there was a left-leaning person who legitimately wanted to do fact-checking.
00:24:58.620 And let's say this left-leaning person said, you know, I'm going to seriously try to be objective here.
00:25:03.680 I know I'm left-leaning, but I'm being hired to be objective. I'm going to be objective.
00:25:08.140 And let's say some facts came to that person that showed that the fine people hoax was a hoax, and maybe it's the first time they heard it.
00:25:16.960 Do you think a left-leaning person working for NPR, even if they believed it true and discovered that the fine people hoax was in fact a hoax,
00:25:27.080 the hoax part is just that Trump called neo-Nazis fine people, which we know didn't happen because of the video.
00:25:36.100 But imagine if you had an impulse to debunk that. Could you do it?
00:25:43.380 No. No. You couldn't do it. And not keep your job.
00:25:49.340 Do you think even if somebody thought they had the facts,
00:25:53.280 do you think they could debunk a major left talking point and then keep their job?
00:25:59.400 Not a chance. Not a chance. They wouldn't even be able to go to a party.
00:26:04.780 Imagine going to a party after you debunked the left's whatever biggest talking point. Couldn't do it.
00:26:11.440 It would be like Alan Dershowitz being expelled from polite company.
00:26:19.220 So by their design, fact-checker organizations can't work if they come out of organizations that have a bias.
00:26:26.680 Or even if you just live in the world, you can't be unbiased about fact-checking.
00:26:33.080 All right, so another robot that is getting close to looking like people.
00:26:38.580 Nobody has yet solved for making a robot actually look like a person, but they're getting closer and closer.
00:26:46.180 What happens when you can't tell the difference?
00:26:48.820 Because that feels like that's going to happen, at least on video, right?
00:26:53.000 If you did a Zoom call to a robot, someday you won't know the difference.
00:26:58.440 And how would you know the difference? And I have this suggestion,
00:27:04.100 that the way you can spot a robot in the future is the same way you can spot an NPC today,
00:27:09.680 if in fact NPCs exist, which it appears they do.
00:27:13.960 It appears they do, doesn't mean they do.
00:27:16.560 And that is that NPCs and robots won't have stories. They won't have stories.
00:27:23.760 So yesterday I was doing a little video from my man cave live stream,
00:27:29.160 and I was challenging people to prompt me to tell a story.
00:27:33.600 And so there were things like what was your most embarrassing moment,
00:27:36.560 or what was the moment that you realized this or that.
00:27:40.340 And for each of those, I had a story.
00:27:43.640 I go, oh, well, let me tell you the story, how I felt, and blah, blah, the surprise ending.
00:27:49.200 And I have lots of stories.
00:27:51.360 But there are definitely people who can't tell you a story about anything that happened to them.
00:27:55.280 And I think robots, in the short run, a robot won't be able to tell you a story.
00:28:00.800 If you talk to a robot about what's happening at the moment, well, maybe you can't tell the difference.
00:28:05.800 But if you say to the robot, hey, can you tell me a story about the best thing you ever got for your birthday? You're done.
00:28:15.520 Now, could you program a robot to fool you about its own biography?
00:28:20.820 Like maybe it steals from somebody else's backstory? Probably.
00:28:24.120 So you could definitely get a robot to fool you.
00:28:26.880 But that leads me to my next observation.
00:28:31.860 We really need a law that if something is AI or a robot, it's got to be labeled.
00:28:40.480 Right? We do truth in labeling. You've got to label those robots.
00:28:45.540 If you come at me with a robot pretending to be human and I can't tell the difference, you and I have a problem.
00:28:51.940 All right? Don't send your robot at me and try to make me think it's human.
00:28:57.580 If you do that, you and I have a problem.
00:28:59.620 You have done something to me that's immoral and unethical.
00:29:04.820 You have made me think that a robot was a human being, and then maybe I interacted with it or made decisions based on it.
00:29:11.480 No bueno. That needs to be flat-out illegal.
00:29:16.020 And the voice as well.
00:29:19.520 We need to have a law that says AI voices cannot be modulated outside a certain parameter.
00:29:26.560 In other words, we can't let AI be persuasive. Not by voice.
00:29:31.720 You can't let AI persuade you with a human-like voice.
00:29:36.540 That's got to be illegal, and we've got to do it now.
00:29:39.860 Because the first AI with a human voice that's persuasive, and maybe even can modulate the voice and change it, that is too dangerous.
00:29:48.580 And we know it's dangerous. You don't have to wonder about it.
00:29:51.900 Is there anybody here who would doubt that an AI that sounded like a human, enough that you couldn't tell,
00:29:57.720 and also could change its voice quality to be more persuasive or less persuasive, would be dangerous?
00:30:02.980 You don't want that shit loose in the world.
00:30:06.420 You do not want that loose in the world, believe me.
00:30:09.200 It's bad enough that I'm loose in the world.
00:30:15.240 This is in my favorite category of things to say.
00:30:20.260 One of my categories is things which cannot be communicated.
00:30:24.680 They're easy concepts, but for whatever reason you can't communicate them, like the boy who cried wolf.
00:30:30.080 The boy who cried wolf cannot communicate that there's an actual wolf.
00:30:34.760 You can say the words, but it can't be communicated, because you already believe that he lies about wolves,
00:30:40.140 so there's no way to get from here to there.
00:30:43.720 But here's what I like to say, sort of in that genre.
00:30:47.780 If anybody knew what I could actually do, I would be murdered.
00:30:53.400 It would be too dangerous to have me loose in society.
00:30:57.520 But the only reason that I'm completely safe is that nobody has any idea.
00:31:03.100 Nobody has any idea what I could accomplish through persuasion.
00:31:07.940 If you did, I'd be in jail.
00:31:10.920 Because you just wouldn't want, it would be like having a nuclear bomb wandering around without government control.
00:31:17.680 That's sort of what I am.
00:31:19.380 But I will always be free because nobody will ever believe what I could actually do.
00:31:25.980 Now, what I could do might be only a fraction of what an AI could do eventually, persuasion-wise,
00:31:31.220 because it would have all the tools instantly.
00:31:34.660 It could quickly check those tools against the individual.
00:31:38.600 Basically, everything I do the slow human way is programmable. All of it, and better.
00:31:45.580 In fact, the AI will be better at detecting your micro-changes than even I am. And I'm trained to do it.
00:31:53.040 One of the big skills of a hypnotist is not what you say, but observing how what you said influenced the person.
00:32:04.360 And there's a look, there's a look that people present, and hypnotists can tell you this, when they're in a certain mode,
00:32:13.380 right, when their mind has entered, I don't know, like a maintenance mode or something. You can see it.
00:32:19.140 Now, it's easy to recognize if you have experience; it would be impossible to recognize if you didn't.
00:32:24.320 But an AI could do it right away.
00:32:26.420 You would just have to give it, okay, here's ten pictures of people in that mode, you know, that mental mode, here's ten people not, done.
00:32:35.460 The AI would be able to spot it every day after that.
00:32:38.420 You know, I'm simplifying, but that's the basic idea.
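The ten-examples idea described here is essentially few-shot classification. A minimal sketch, not anything from the episode: it assumes the face images have already been reduced to numeric embedding vectors, and uses random vectors as stand-ins for real embeddings, with a nearest-centroid rule standing in for "the AI learns the look":

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for face embeddings: ten examples "in the mode" clustered
# around one point, ten "not in the mode" around another. In a real
# system these vectors would come from a face-embedding model.
in_mode = rng.normal(loc=1.0, scale=0.3, size=(10, 64))
not_mode = rng.normal(loc=-1.0, scale=0.3, size=(10, 64))

# "Training" on ten examples per class is just averaging each group
# into a centroid.
centroids = {
    "in_mode": in_mode.mean(axis=0),
    "not_mode": not_mode.mean(axis=0),
}

def classify(embedding):
    # Label a new face by whichever class centroid its embedding
    # is closest to in Euclidean distance.
    return min(centroids,
               key=lambda label: np.linalg.norm(embedding - centroids[label]))

# A new face that resembles the "in the mode" group:
new_face = rng.normal(loc=1.0, scale=0.3, size=64)
print(classify(new_face))  # → in_mode
```

Twenty labeled pictures is enough only because the sketch assumes the embedding space already separates the two looks; the heavy lifting in practice sits in the embedding model, not the centroid rule.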
00:32:42.380 All right, so yes, let's get some laws about robots and AI pretending to be people.
00:32:48.520 That needs to be stopped right away, because the first one is going to be like a nuclear weapon.
00:32:55.060 You've got to stop it right away.
00:32:57.400 Do you know how many social media followers the first robot that's indistinguishable from a human would have?
00:33:07.020 As many as it wanted. As many as it wanted.
00:33:11.860 All right, let's do the January 6th hoax update.
00:33:16.340 I can't call the January 6th thing anything but a hoax anymore,
00:33:21.240 because I feel at this point it's just so fucking obvious.
00:33:27.480 Somebody says, for $9.99 for a super chat, it's funny because the chat influences Scott more than he influences the chat.
00:33:37.280 It's definitely a two-way street. You definitely influence me.
00:33:43.700 I didn't know. You often fill in context, but you do actually fact-check.
00:33:49.320 Yeah, I'm definitely influenced factually, but the nature of the technology is that, maybe, I guess, I'm subject to group peer influence or something like that,
00:34:05.840 but generally speaking this is not an influential platform in both directions.
00:34:12.680 It's influential in my direction because you can see me and hear me, but I've just got text to look at.
00:34:19.160 So that handicaps you quite a bit compared to the persuasion that I can apply.
00:34:25.420 But anyway, so here's the thing that was bothering me yesterday.
00:34:35.160 So we heard a report that one of the January 6th protesters is going to get maybe 15 years for a list of charges that didn't look to me like a 15-year sentence,
00:34:49.640 but I guess there are all these, what do they call them, things you put on top of the charges?
00:34:57.160 They're like stimulants to the charge, they're extras, what's the word? Enhancements.
00:35:03.800 Yeah, so the judge can add these quote enhancements to the sentence and say, oh, because of this or that, it's extra bad.
00:35:14.920 Now, here's the problem.
00:35:17.880 In order for the Democrats to sell the January 6th hoax, there's one thing that has to happen.
00:35:26.480 Really long jail terms for at least some of the protesters, am I right?
00:35:33.440 They really can't sell the hoax unless these people go to jail for a long time.
00:35:40.640 These people should be freed immediately.
00:35:44.440 They should be freed immediately, irrespective of the crimes they committed.
00:35:51.500 Let me say it again, and I would say this if they were Democrats, right?
00:35:54.960 This is not team play. Well, it is team play. It's team America. This is America.
00:36:02.100 In America, we do not, we do not, we do not, in America, if we want to stay America, like if you really want to be America,
00:36:16.220 we do not set our length of sentence based on what the Democrats need politically.
00:36:25.520 We do not.
00:36:27.420 This is happening right in front of you.
00:36:29.740 Right in front of you, it's very obvious that these are going to be long sentences, because if they were dismissed,
00:36:37.100 what would the January 6th hearings look like if, while they were happening, the people accused were getting dismissed without charges?
00:36:45.820 What would that look like?
00:36:47.300 It would look like the January 6th thing was bullshit, because it is, mostly.
00:36:53.320 I mean, you know, let me say the obvious thing in case there's any NPCs watching.
00:36:58.640 This is just for the NPCs, right? The smart people, the real humans, you can ignore this.
00:37:04.200 But the NPCs are going to say, well, what you're ignoring is the actual violence of the people.
00:37:10.700 No, I'm not. No, I'm not.
00:37:13.900 There's nobody in the world who thinks actually violent people should not be dealt with by the legal process. Nobody.
00:37:22.120 So the NPCs, if you're watching, oh, you're forgetting the violence. No, I'm not. No, I'm not.
00:37:30.160 But we're not America if we can watch, without acting, what's being done to American citizens who did make some mistakes, right?
00:37:42.280 I'm not going to whitewash any of their actions. That's up to them.
00:37:47.100 Their actions are just for them to deal with.
00:37:49.740 Here's what I have to deal with. America, right?
00:37:54.660 Yeah, my problem is America.
00:37:55.800 And America is a little bit broken right now, and it's broken right in front of us, and what do we do?
00:38:03.500 What do Americans do when something's broken and it's right fucking in front of you?
00:38:11.400 You fix it.
00:38:13.440 There's only one way to fix this. You've got to vote the Democrats out.
00:38:19.500 Because at least you can pardon people or whatever the hell you're going to do.
00:38:22.480 If there were no other, well, actually, let me make a stand. I'm going to make a stand right now.
00:38:33.460 No matter who the Republican is, well, it could even be the Democrat,
00:38:37.220 but whichever candidate says, I'm going to pardon all the January 6th protesters, has my support.
00:38:45.040 And I'll even change my opinion.
00:38:46.860 If Trump runs and says I'll just do that one thing, he has my full support.
00:38:53.160 I don't think he should be running because he's too old. I don't think anybody should run at that age.
00:38:58.540 But if he runs and if he says, you know, I've got all these other ideas, but I'll tell you one thing I'm going to do
00:39:05.960 is I'm going to fix this problem of people being sentenced based on Democrat political needs.
00:39:12.800 I'm just going to fix that.
00:39:14.060 And this is so important that I could be a single-issue voter over this.
00:39:28.120 I could also be a single-issue voter over fentanyl.
00:39:33.000 If a candidate said, I've got some other policies, you might like them, you might not,
00:39:39.960 but I'm going to drop a drone on the cartels.
00:39:42.480 Every time fentanyl comes into this country, for every death, I'll send a rocket.
00:39:50.180 For every fentanyl death, we'll send one drone to blow up some shit in the cartel area. Every time.
00:39:56.800 If you kill 100,000 Americans, we'll send 100,000 missiles into Mexico.
00:40:03.920 One missile for one death. Or something like that. I'm just making that up.
00:40:08.120 But if we had a candidate who said, you know, I'm just going to solve this one problem,
00:40:12.940 and I don't care if I have to invade Mexico to do it.
00:40:15.560 And by the way, I would support a candidate who said he would invade Mexico.
00:40:20.220 Let me say that directly. I'm in favor of invading Mexico militarily.
00:40:28.200 Just to say it directly, I'm in favor of militarily invading Mexico.
00:40:33.760 Because it's not really a government. It's a narco government.
00:40:37.940 They're taking 100,000 of our people.
00:40:41.700 If Trump ran for president on invading Mexico, I'd vote for him.
00:40:47.260 Now, if Mexico wanted to do something about it to avoid being invaded, they should do that right away.
00:40:54.200 I'd even help them do it. Help them fund it. Give them some resources.
00:40:59.080 Let them do what they can to become an independent, strong ally.
00:41:03.840 But at the moment, I would support a candidate who said, invade Mexico militarily.
00:41:08.860 Absolutely. In a heartbeat. I wouldn't even hesitate.
00:41:14.900 So, those are my big issues.
00:41:17.640 Freeing the January 6th people who are overcharged.
00:41:20.820 Again, we're not forgiving any real crimes.
00:41:24.140 They're just obviously overcharged for Democrat political purposes. It's just obvious.
00:41:29.380 You can't do this right in front of us. Right?
00:41:32.480 It's like, it's one thing to fuck Americans like privately and I don't hear about it.
00:41:39.120 But if you do it right in front of me, I'm not going to let you fuck Americans right in front of me.
00:41:46.300 That's going to be noticed.
00:41:48.360 Here's the other story that, I don't know, it's just amazing what the public can be sold.
00:41:58.260 And I saw Kyle Becker was responding the same way I would have.
00:42:02.980 So there's a story now, and I need a fact check on this. Can somebody verify this is really true?
00:42:08.800 So I heard it from one source on Twitter. So, you know, use your judgment.
00:42:13.360 But apparently there's a story that Clinton campaign lawyer Michael Sussman,
00:42:19.920 who allegedly ran three fraudulent operations against Trump in 2016 and '17,
00:42:26.780 traveled to London and met with, you know, spies and stuff at the British spy center, GCHQ,
00:42:36.180 the British eavesdropping agency, which provided materials to CIA chief John Brennan.
00:42:41.580 So we know that John Brennan said he did get some stuff from Britain.
00:42:49.180 We know that Christopher Steele was an ex-British intelligence officer,
00:42:54.580 and we were sold on the concept that there's such a thing as an ex-British intelligence officer, or professional.
00:43:03.980 Do you think there's such a thing as an ex-British intelligence person? No.
00:43:11.580 Now, I believe he's not on the payroll. I believe he's not on the payroll.
00:43:18.960 Well, that's not the same as being an ex.
00:43:21.280 Do you think that he could do something that his masters didn't like? No.
00:43:26.500 I mean, not if he hoped to have a relationship with them in the future, which I'm sure was valuable to him.
00:43:32.540 So we never say directly that it's obvious that Great Britain was involved in the Russia collusion hoax,
00:43:40.400 if not completely behind it, with Hillary Clinton.
00:43:44.280 But why do we act like it's not there? It's right there.
00:43:50.260 Christopher Steele was an ex-British spy.
00:43:56.840 Why don't we call it British interference?
00:44:00.980 It's just crazy that we all agree that it's Russian interference when the actual reporting is British interference.
00:44:07.900 We're looking right at it.
00:44:10.200 The news has reached a point where they can put it right in front of you and tell you it's not there.
00:44:15.300 Here's all the Brits being behind this Russia collusion thing, but it was Russia.
00:44:22.140 Here's all the evidence it was really Great Britain. Let's talk about Russia.
00:44:26.740 We can actually do that, like right in front of you. All of the information is right there.
00:44:32.920 It's obvious Great Britain was involved at some level. Obvious.
00:44:40.180 But we just act like it's not true.
00:44:43.020 All right.
00:44:45.380 San Francisco is trying to, I guess the mayor is behind this, they've got a proposal
00:44:51.800 in which the police would be able to continuously monitor neighborhoods and cities through your personal security system.
00:45:04.560 What? That's right.
00:45:06.980 If you have a Ring camera that's connected to a network, the San Francisco police, this is a proposal,
00:45:12.740 would like to, at any time they want, look through your camera, total access to it at any time,
00:45:21.520 and see what you would see if you were standing on your front porch.
00:45:24.660 And the idea is that once they had access to all this surveillance, yeah,
00:45:31.960 what about your indoor cameras, you ask, quite reasonably, quite reasonably.
00:45:37.940 And do you think the police would misuse this ability to see what's happening in your house?
00:45:43.180 Of course. Of course. Somehow. Of course. Of course. Yeah.
00:45:50.920 And don't you think they could tell when you're home? Yeah.
00:45:55.300 The police will know when you're home. Do you want the police to know when you're home?
00:45:59.380 Now, I'm a big supporter of the police. Love the police.
00:46:05.280 It is, however, a fact that there's not much difference between a police officer and a criminal.
00:46:13.560 Now, there's an
00:46:14.400 important difference,
00:46:15.600 which is one is on
00:46:16.540 the side of the law
00:46:17.340 and one is not, but
00:46:19.080 in terms of their
00:46:19.740 personalities, not
00:46:21.280 that different.
00:46:22.660 You know, the
00:46:23.280 ability to do
00:46:24.760 dangerous things for
00:46:26.800 benefit, for really
00:46:28.280 not that much
00:46:28.860 benefit, you
00:46:30.180 know, if you
00:46:30.740 assume they're
00:46:31.360 underpaid, which I
00:46:32.400 assume.
00:46:33.540 So, I don't
00:46:35.260 know.
00:46:36.120 If you said, what is the group of people you would least want to know whether you're home, number one would be criminals, right?
00:46:44.920 The number one group you don't want to know whether you're home or not is criminals.
00:46:51.100 But what is the
00:46:51.820 number two
00:46:52.660 organization, the
00:46:54.700 second group that
00:46:56.320 you would most not
00:46:57.440 want to know
00:46:58.060 about what's
00:46:58.560 happening around
00:46:59.140 your home?
00:47:00.580 Isn't that the
00:47:01.380 police?
00:47:03.060 And it's the
00:47:03.860 police because I
00:47:04.540 wouldn't trust
00:47:05.080 them, not because
00:47:07.040 they'd see my
00:47:07.820 private stuff, which
00:47:08.700 is a good enough
00:47:09.420 reason.
00:47:11.020 Somebody says, my
00:47:11.900 wife.
00:47:12.400 Yeah, your ex-wife.
00:47:14.800 Do you think
00:47:15.800 anybody will be a
00:47:17.300 police officer with
00:47:18.340 access to the
00:47:19.140 system who also
00:47:20.960 has an ex who
00:47:23.060 has that system?
00:47:24.780 Imagine the
00:47:25.500 police officers being
00:47:27.000 able to see their
00:47:27.920 ex-wife or
00:47:29.540 husband and
00:47:30.980 their new lover
00:47:32.480 visiting the
00:47:34.780 house.
00:47:35.960 Do you want
00:47:36.720 any of that?
00:47:39.460 The amount of
00:47:41.200 intrusion into
00:47:42.720 your personal life
00:47:43.560 is just through
00:47:44.520 the roof.
00:47:46.180 Now, here's
00:47:46.660 something that I
00:47:47.280 would be okay
00:47:49.580 with.
00:47:50.680 If the police
00:47:51.860 said, optionally,
00:47:53.500 if you want to put a
00:47:54.440 separate camera
00:47:55.160 system in front of
00:47:55.960 your house and
00:47:56.880 give us access to
00:47:57.860 it, but it's your
00:47:58.500 choice, it's got to
00:47:59.920 be a specific kind
00:48:00.960 where we can't get
00:48:01.740 to any of your
00:48:02.400 other cameras or
00:48:03.300 anything, so it's
00:48:04.180 got to be a walled
00:48:05.140 off system.
00:48:05.980 If you'd like to
00:48:06.840 have that kind of a
00:48:07.520 system and you
00:48:09.940 agree, then we'll
00:48:11.960 be able to look at
00:48:12.820 it.
00:48:13.240 But you have to
00:48:13.860 agree, and you
00:48:15.020 have to know
00:48:15.360 exactly what they
00:48:16.080 have access to and
00:48:16.900 what they don't.
00:48:17.480 I could imagine having a Ring doorbell, let's just say, and let's say Ring could somehow wall off its software so that the police could only look through the front-door camera, even if you've got another camera somewhere.
00:48:34.980 Could you imagine
00:48:35.560 labeling it law
00:48:37.880 enforcement monitored?
00:48:40.540 That would be
00:48:41.260 better, wouldn't
00:48:41.740 it?
00:48:42.460 Like if you were
00:48:43.080 worried about your
00:48:43.700 own security and
00:48:45.160 you had a Ring
00:48:45.640 camera, that's
00:48:46.300 good.
00:48:46.580 But if you
00:48:47.760 had a Ring
00:48:48.100 camera that you
00:48:48.720 knew the police
00:48:49.220 could look at in
00:48:49.920 real time, that's
00:48:51.760 a little scarier.
00:48:53.820 Right?
00:48:54.260 As long as
00:48:54.720 everything you did
00:48:55.260 was voluntary,
00:48:56.840 that's a different
00:48:58.680 situation.
00:48:59.660 But if they're
00:49:00.800 going to force
00:49:01.480 this, no, I
00:49:04.160 don't think it
00:49:04.780 could get approved.
00:49:05.880 Do you?
00:49:06.660 Do you think
00:49:07.040 there's any way
00:49:07.500 that could get
00:49:07.960 approved?
00:49:09.200 If the resident doesn't agree? You think there is?
00:49:13.340 I mean, maybe.
00:49:15.560 Maybe.
00:49:16.580 It seems like a
00:49:17.320 long shot.
00:49:20.220 But maybe.
00:49:23.600 All right.
00:49:24.580 I could tell you
00:49:25.560 when the Ukraine-Russia
00:49:27.080 war will end if you
00:49:28.500 could give me the
00:49:29.240 following information.
00:49:30.900 And I know you
00:49:31.500 can't, but maybe
00:49:33.220 someday somebody can.
00:49:34.660 It goes like this.
00:49:35.900 How quickly can
00:49:36.940 Russia find and
00:49:39.040 spin up new ammo
00:49:40.700 warehouses?
00:49:42.400 Right?
00:49:42.540 Because their ammo
00:49:43.160 warehouses are being
00:49:44.040 targeted for destruction
00:49:45.300 by Ukraine, who
00:49:47.100 now has new,
00:49:48.540 accurate, long-range
00:49:49.880 weapons to get to
00:49:51.260 all their ammo
00:49:51.780 depots.
00:49:52.760 And if their ammo
00:49:53.660 depots were beyond
00:49:54.660 that range, they
00:49:55.940 also wouldn't be
00:49:56.680 useful.
00:49:58.080 Right?
00:49:58.700 So now Ukraine looks
00:49:59.980 like they can get to
00:50:00.920 any useful ammo
00:50:03.380 dumps that Russia
00:50:05.020 would have on Russia's
00:50:06.460 side of the territory.
00:50:07.520 Now the question is
00:50:08.980 this.
00:50:09.680 We have X many
00:50:10.960 missiles and rocket
00:50:12.800 launchers that can do
00:50:13.980 that level of
00:50:14.760 accuracy.
00:50:16.140 Is it the, what's the name of them? The H-I-M-A-S or something? There's a specific type of weapon.
00:50:23.260 So they're being shipped to Ukraine, and I think they have nine. HIMARS. H-I-M-A-R-S.
00:50:29.600 So I think Ukraine has nine of them.
00:50:33.260 People are saying,
00:50:34.200 can we get a hundred?
00:50:34.880 Now, he carried a gun in the Capitol, and he threatened to kill his family if they turned him in, right?
00:50:48.560 So the warning there was that the person who was overcharged at the Capitol, there was some extenuating situation there.
00:50:58.100 Maybe, right?
00:50:59.740 But on the overcharging in general, there will be specific cases where there's a real argument.
00:51:07.200 All right, but getting back to my point, how many rockets can one HIMARS system send per day?
00:51:17.980 Could they send a
00:51:19.540 hundred rockets per
00:51:20.560 day?
00:51:22.000 What do you think?
00:51:23.960 230?
00:51:25.520 They could do at
00:51:26.840 least dozens, right?
00:51:28.560 Now, how fast can the
00:51:31.100 Russians build ammo
00:51:32.620 warehouses as they're
00:51:34.860 being destroyed?
00:51:35.500 Now, I'm making an
00:51:37.360 assumption here that
00:51:37.960 we're always going to
00:51:38.720 know where they are
00:51:39.640 because we have eyes in
00:51:41.460 the sky and we've got
00:51:42.600 probably just tons of,
00:51:44.520 you know, local intel.
00:51:45.900 Because remember, Russia
00:51:46.940 is occupying an area in
00:51:49.600 which not everybody being
00:51:50.780 occupied is happy about
00:51:51.880 it.
00:51:52.800 So in theory, the locals
00:51:54.920 should be plenty happy to
00:51:56.780 say, you know, I saw some
00:51:57.880 ammo trucks heading into
00:51:58.880 that building.
00:51:59.360 So I feel like we would
00:52:01.320 always know where they
00:52:02.140 are.
00:52:04.000 And I would also say that
00:52:05.720 Russia's military capability
00:52:07.140 would basically disappear if
00:52:09.000 their ammo did, right?
00:52:10.880 Now, everybody adjusts.
00:52:12.200 Maybe they'd threaten with
00:52:13.760 nukes or something.
00:52:14.880 So we don't know what would
00:52:15.840 happen.
00:52:17.100 But in theory, the number of
00:52:19.920 HIMARS missiles blowing up a
00:52:22.740 number of ammo depots, if
00:52:24.840 you have enough missiles, and
00:52:26.480 you know, the number of
00:52:27.320 missiles is increasing as
00:52:28.600 they're getting more of
00:52:29.320 them.
00:52:30.280 And the rate at which
00:52:32.080 Russia can replace them
00:52:33.940 at some point will be
00:52:36.020 lower than the rate at
00:52:37.840 which they can be
00:52:38.480 destroyed.
00:52:39.960 As soon as that becomes a
00:52:41.400 permanent situation, which
00:52:42.720 seems very likely, the war
00:52:45.000 is over.
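The depot argument above is just a depletion-rate comparison, and it can be sketched as toy arithmetic. This is purely illustrative: the function name and every number below are invented for the example, not real estimates of anything.

```python
import math

def days_until_depleted(initial_depots, build_rate_per_day, destroy_rate_per_day):
    """Days until no functional depots remain, or None if replacement keeps up."""
    net_loss_per_day = destroy_rate_per_day - build_rate_per_day
    if net_loss_per_day <= 0:
        return None  # depots rebuilt at least as fast as destroyed: no depletion
    return math.ceil(initial_depots / net_loss_per_day)

# Invented numbers: 100 depots, 1 rebuilt per day, 3 destroyed per day.
print(days_until_depleted(100, 1.0, 3.0))  # 50
```

The whole argument hinges on the sign of that net-loss term: if destruction permanently outpaces replacement, the stockpile goes to zero on a computable schedule; if not, it never does.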
00:52:46.560 I feel like we now know
00:52:48.740 what it looks like.
00:52:50.560 Now, of course, anything
00:52:53.080 could be wrong, right?
00:52:54.560 Predicting war is sort of
00:52:57.500 silly, because it's always
00:52:59.000 based on surprises and
00:53:00.340 adjustments and, you know,
00:53:01.800 things you don't see
00:53:02.580 coming and the fog of war
00:53:03.900 and who knows if anything
00:53:05.220 we've heard about the war is
00:53:06.160 even true at this point.
00:53:08.420 So, but it does seem to me
00:53:11.420 that the number of HIMARS
00:53:13.780 compared to how rapidly
00:53:15.220 Russia can build new depots
00:53:17.180 that are functional, I think
00:53:19.700 it's down to that, right?
00:53:20.620 Am I right?
00:53:22.680 It's down to that.
00:53:24.200 And what appears to be what
00:53:26.300 is going to happen is that
00:53:28.080 Russia will start to think
00:53:30.520 that they're lucky if they
00:53:31.640 can keep what they've got.
00:53:34.180 And they'll make a deal that
00:53:35.740 doesn't make them happy at the
00:53:37.060 same time the Ukrainians will
00:53:38.180 make a deal that makes them
00:53:39.140 very unhappy too, which is a
00:53:40.920 good deal.
00:53:41.700 Makes everybody unhappy.
00:53:46.900 Somebody asked if I was
00:53:48.020 registered to vote.
00:53:48.940 The answer is, I don't remember
00:53:50.760 if I'm registered, do you
00:53:52.900 register once or do you
00:53:53.980 have to register every time
00:53:55.100 you move?
00:53:56.420 How does that work?
00:53:57.880 But I don't vote.
00:54:00.940 I'm not registered as a
00:54:02.580 Republican or, well, I take
00:54:04.120 that back.
00:54:05.320 I might be registered as
00:54:06.360 something.
00:54:07.500 I don't remember.
00:54:08.620 If I registered as either a
00:54:10.360 Democrat or a Republican,
00:54:12.700 it wouldn't have meant
00:54:13.600 anything to me.
00:54:14.680 It would not have been an
00:54:15.760 indication of who I was
00:54:16.800 going to vote for.
00:54:17.900 I probably said independent, but even that wouldn't mean anything.
00:54:20.760 Because I have registered
00:54:21.820 in the past.
00:54:23.340 I don't know if I'm
00:54:24.240 registered at the moment.
00:54:26.260 And I don't have a plan to
00:54:27.840 vote.
00:54:28.680 The reason I don't vote is
00:54:30.180 different from your reasons.
00:54:31.780 It's because I do this
00:54:32.780 publicly.
00:54:34.300 I don't think you should vote
00:54:35.440 if you're going to try to be
00:54:37.420 objective about anything.
00:54:39.200 Now, it's hard enough to be
00:54:40.440 objective about anything.
00:54:41.640 And I don't think I hit that
00:54:43.020 standard.
00:54:43.560 I wish I did.
00:54:44.220 But for me, there would be a
00:54:48.460 bias that would occur if I
00:54:49.940 voted that would increase
00:54:52.780 whatever bias I'm already
00:54:54.000 operating under and struggling
00:54:55.680 against.
00:54:57.960 The rest of you, you should
00:54:59.200 vote.
00:55:00.200 I recommend it.
00:55:01.740 If you're talking in public
00:55:03.080 about politics, at least give a
00:55:06.120 thought to not voting.
00:55:07.800 I wouldn't want you to give up
00:55:09.760 your right to vote.
00:55:10.820 If it makes you feel good, go
00:55:12.160 ahead and do it.
00:55:12.840 But just be aware that casting
00:55:15.120 an actual vote, like acting
00:55:16.700 with your body, makes it
00:55:19.040 really, really hard later to
00:55:20.400 say that you made a mistake.
00:55:22.480 Right?
00:55:23.260 If I tell you I think this
00:55:24.600 candidate's better than
00:55:25.500 another one, then it turns
00:55:26.460 out I'm wrong.
00:55:27.860 Well, it's going to be hard
00:55:28.760 for me to say I was wrong,
00:55:30.260 but at least I didn't vote
00:55:31.860 for him.
00:55:33.020 Do you see it now?
00:55:35.160 Well, okay, that candidate I
00:55:36.960 was pretty sure would be the
00:55:38.060 good one, turned out poorly,
00:55:39.840 but at least I didn't vote for
00:55:42.060 him.
00:55:43.300 Do you get it?
00:55:44.560 That would make me more
00:55:45.620 biased.
00:55:46.760 There's no way I would act the
00:55:48.040 same if I had voted for him.
00:55:50.720 Right?
00:55:51.060 If I voted for him, I'd feel
00:55:52.580 like I had to defend it to the
00:55:54.180 death, even though logically I
00:55:55.920 shouldn't.
00:55:59.400 Revolver has an article
00:56:01.040 basically mocking the New York
00:56:03.320 Times coverage of the Ray Epps
00:56:05.220 situation.
00:56:06.540 So you all know Ray Epps was
00:56:08.020 caught on camera inciting
00:56:09.540 protesters to go into the
00:56:11.220 Capitol on January 6th.
00:56:13.100 And he seemed to be, if not
00:56:15.120 the most vocal organizer,
00:56:17.460 certainly one of them.
00:56:18.880 I think the most, right?
00:56:20.540 In terms of going inside the
00:56:22.140 Capitol, I think he was the
00:56:23.060 most vocal.
00:56:25.140 I think he had a megaphone and
00:56:27.020 everything else, did he not?
00:56:29.120 And fact check me on that.
00:56:30.640 Did he have a megaphone?
00:56:31.460 I think I may have just imagined
00:56:34.740 that, right?
00:56:37.260 Somebody says yes?
00:56:38.720 Or are you, I'm seeing yeses and
00:56:40.540 nos.
00:56:42.600 Yeah.
00:56:43.160 Isn't that interesting?
00:56:44.480 So some of us have false
00:56:46.760 memories, and it might be me.
00:56:50.280 Yeah, I think I've conflated him
00:56:51.900 with other people.
00:56:52.960 There was a megaphone person there,
00:56:54.400 but I don't know if it was him.
00:56:55.480 Anyway, we won't settle this right
00:56:56.800 away, but it doesn't matter.
00:56:57.620 And here are some of the things the Revolver story says about what the New York Times wrote.
00:57:05.640 To many of us, the impression is that the New York Times was trying to cover for some intelligence agency
00:57:13.100 and run a cover story that would make their operative, allegedly, look like he was just an innocent citizen.
00:57:23.640 So it looks, to many of us,
00:57:26.480 can't prove it.
00:57:27.620 But it looks like the New York
00:57:28.560 Times is just working for an
00:57:29.800 intelligence agency to write a
00:57:31.760 fake story about Ray Epps to
00:57:33.760 cover their tracks.
00:57:35.040 That's what it looks like.
00:57:36.420 Now, do I believe that's what
00:57:37.900 happened?
00:57:38.520 I don't know.
00:57:40.380 But the New York Times has been
00:57:42.240 credibly accused and proven to be,
00:57:45.980 you know, untrustworthy on a
00:57:48.040 number of topics of great interest
00:57:50.140 to the public, from weapons of
00:57:52.160 mass destruction and, you know,
00:57:53.880 you name it, Russia collusion.
00:57:55.160 I mean, they have a long record of
00:57:58.580 clearly backing fake news.
00:58:01.380 Clearly.
00:58:03.000 So, when they say Ray Epps is just a
00:58:05.720 citizen and there's nothing to worry
00:58:07.720 about here, do we believe them?
00:58:10.900 Well, I will tell you that just this
00:58:13.460 week, my famous Democrat friend who I
00:58:16.640 mentioned anonymously, a smart person
00:58:19.600 that I often have conversations with
00:58:21.060 about politics, or I have in the past.
00:58:23.640 And he told me in the context of some
00:58:26.260 other conversation, it didn't have
00:58:27.940 anything to do with January 6th, I
00:58:29.820 don't think, but he told me that it was
00:58:31.920 in the New York Times and therefore,
00:58:33.800 you know, it was reliable.
00:58:36.120 And I thought, what?
00:58:38.480 In 2022, how could you possibly say that?
00:58:44.660 In 2022, there's like an adult human who
00:58:49.040 pays attention, who believes that because
00:58:51.380 it was in the New York Times, that makes
00:58:53.800 it very likely to be true.
00:58:56.020 Not 100%, nothing's 100%, but very likely
00:58:59.120 to be true because it was in the New York
00:59:00.440 Times.
00:59:01.460 To which I say, well, if you've been alive
00:59:03.800 for the last five years, you would know
00:59:06.500 that if it's a political story that favors
00:59:09.920 one side or the other, it's not going
00:59:12.960 to be true.
00:59:14.460 It's, you know, maybe facts and dates
00:59:16.080 are true, but the political spin, I don't
00:59:19.780 think anybody expects that to be true.
00:59:22.300 That's a spin like everything else.
00:59:26.540 So here's what, here's some quotes from
00:59:28.620 the Revolver article.
00:59:29.860 You can see it in my Twitter feed.
00:59:31.920 I tweeted it.
00:59:32.480 The only problem is that Ray Epps didn't
00:59:35.400 go to Trump's speech.
00:59:37.260 So apparently there's evidence that he
00:59:39.740 planned the protest, like he had advanced
00:59:44.340 planning, and we know that, that he would
00:59:46.900 be part of the protest.
00:59:48.240 He went all the way to Washington and did not
00:59:51.240 attend the speech, Trump's speech.
00:59:54.700 He went directly to the Capitol, where he
00:59:57.180 immediately started organizing them to go in,
01:00:00.260 which apparently he had talked about in
01:00:02.100 advance with somebody.
01:00:05.000 All right.
01:00:05.800 So the guy who went there innocently didn't
01:00:08.380 even go there for the reason that people
01:00:11.560 went there.
01:00:12.520 He had a different agenda.
01:00:14.560 All right.
01:00:17.840 And it says, quote, the Revolver piece
01:00:20.380 says, the Times piece attempts to wave off
01:00:22.340 Epps' January 6th participation as negligible,
01:00:26.260 similar to those who committed minor
01:00:29.660 offenses and weren't charged.
01:00:31.900 Yet Epps is the key person caught on video
01:00:34.620 with an advanced plan to go into the Capitol.
01:00:38.060 He's the primary person on video organizing
01:00:41.740 people to go into the Capitol, and we know he
01:00:44.520 had an advanced plan to do it.
01:00:46.900 And he's being treated like people who trespassed.
01:00:51.140 There's no correlation.
01:00:52.960 That's not just a person who trespassed.
01:00:54.540 That's somebody with a plan.
01:01:00.700 And even the New York Times, who dismisses the
01:01:09.220 conspiracy theories about Ray Epps, this is
01:01:12.360 according to Revolver as well, refers to Epps in
01:01:15.940 its own definitive video documentary.
01:01:18.740 So this is New York Times against New York Times.
01:01:21.120 They refer to him as a rioter for whom storming the
01:01:24.580 Capitol was part of the plot all along.
01:01:28.840 So the original New York Times reporting was that he's
01:01:31.500 obviously a key member of the plot, and he was, you know,
01:01:36.040 planned to go into the Capitol all along.
01:01:38.080 So New York Times has said this directly in the reporting,
01:01:41.080 and now they're reporting that he was basically a minor
01:01:43.320 person like other people who were released.
01:01:45.800 And then, according to the Revolver, this is more
01:01:52.340 opinionated, I think, that the New York Times ominously
01:01:57.360 suggests Epps will sue news outlets for defamation if they keep
01:02:02.040 saying things about him.
01:02:04.100 So doesn't this look exactly like the New York Times is doing a
01:02:07.960 cover piece for an intelligence agency?
01:02:09.980 Not that they are.
01:02:12.220 That's something I can't confirm.
01:02:15.080 But it looks exactly like it.
01:02:18.720 I don't know.
01:02:19.940 Unless there's something I'm missing.
01:02:27.920 Here's the most interesting part.
01:02:30.340 Revolver asked this.
01:02:31.520 I wonder if the author of the New York Times piece, Alan
01:02:35.080 Feuer, could clarify for the record, did he ask Epps if he had
01:02:40.720 any association with any intelligence agencies or cutouts of
01:02:45.060 such agencies?
01:02:46.300 If so, what did he say?
01:02:48.860 If not, why not?
01:02:50.240 And I read this, and I thought, wait a minute.
01:02:53.860 The New York Times did an article on the topic of whether Ray
01:02:57.980 Epps was working for an intelligence agency.
01:03:02.100 That was the point of the piece, and they didn't ask him if he
01:03:05.200 did, or if they did ask him, they didn't report his answer, that
01:03:10.640 being the only point, and they interviewed him, and they didn't
01:03:15.400 ask him if he's part of an intelligence agency, the only question
01:03:19.660 that mattered.
01:03:21.060 Now, here's the funny part.
01:03:23.500 If Revolver had not helpfully pointed out that he hadn't asked the
01:03:26.860 question, I don't know if I would have noticed.
01:03:31.020 Would you have noticed?
01:03:32.420 If you read a whole article about how he definitely totally wasn't an
01:03:37.180 intelligence operator, just a normal person, would you have noticed that
01:03:41.820 they never asked him the question and then reported the answer?
01:03:45.360 I don't know if I would.
01:03:46.900 I think that that is so clever that I would not have noticed.
01:03:51.500 Now, Revolver has more of an interest in this because they reported on
01:03:56.780 Ray Epps, so they're also defending their own reporting.
01:04:01.020 So they looked a little closer, and they noticed, and I didn't even
01:04:04.020 notice that.
01:04:05.600 Did any of you know that, that the New York Times did that article and
01:04:09.120 didn't report on whether they asked him if he was a member of an
01:04:12.420 intelligence organization?
01:04:14.540 We didn't notice that, did we?
01:04:16.320 Think about that.
01:04:17.120 Oh, you did notice.
01:04:20.720 Somebody says they noticed.
01:04:22.800 Good for you.
01:04:24.660 All right.
01:04:27.020 Apparently, a federal judge blocked the Biden administration from some kind of
01:04:33.040 ruling that would allow transgender students to use the same restrooms as
01:04:37.340 their identification, but also to allow transgender to join sports teams
01:04:42.800 corresponding with their chosen genders.
01:04:45.260 And the federal judge has blocked it.
01:04:49.880 Now, I would like to reframe this again.
01:04:58.180 I think we ought to look at the sports as what's broken and not the athletes.
01:05:03.320 And every time that we look at the athlete as, is the athlete good?
01:05:07.140 You know, is that fair for them to do what they're doing?
01:05:09.160 Or is there something wrong with the athlete?
01:05:11.200 Every time we're focusing on the athlete, we're just looking in the wrong
01:05:15.380 direction.
01:05:16.100 There's something wrong with how we organize sports that this is even a
01:05:19.160 question.
01:05:20.460 Why in the world are people not playing against people of equal ability?
01:05:26.400 When was that okay?
01:05:28.460 Why is it okay that your awkward kid who's not good at sports has to go to a
01:05:35.140 school in which only the gifted, physically capable people get to be heroes and they get
01:05:41.260 to be lauded for their accomplishments and they get all the boys and the girls and all
01:05:45.620 that?
01:05:46.740 Why is that fair?
01:05:48.520 Why is it fair that people of lower ability don't get to play on an organized sport in
01:05:53.420 the high school?
01:05:53.760 At least the ones with prestige.
01:05:55.360 They can play unorganized sports or lesser organized sports.
01:05:58.860 So I feel like we should just scrap the whole thing and say, how about people who have roughly
01:06:05.660 equal abilities play each other, right?
01:06:09.220 If you go to high school and the high school team you're playing against has Shaquille O'Neal,
01:06:13.680 you know, young Shaquille, on the other team, is that fair?
01:06:17.840 Should you even have a game?
01:06:19.800 What would be the point of playing?
01:06:21.740 Let's say your tallest guy on your team was 6'1 and you're playing against Shaquille O'Neal's
01:06:27.720 team in high school.
01:06:29.300 Are you going to win?
01:06:30.900 No.
01:06:31.980 No.
01:06:32.420 Because Shaquille, for whatever reason, is not like you.
01:06:35.560 He's not like anybody else.
01:06:36.960 So what would be even the point of playing those games?
01:06:38.980 It's like, what's the point of swimming against, you know, Lia Thomas if you're a woman?
01:06:43.680 What's the point?
01:06:45.140 Because you know how it's going to turn out.
01:06:46.740 So we don't have a problem of just, you know, trans on sports.
01:06:51.340 We have a problem of wildly out-of-the-norm athletes playing on sports.
01:06:58.860 The problem is athletes that are way out of the normal playing on the wrong team.
01:07:04.560 And the wrong team in this case just means ability.
01:07:06.740 It doesn't mean gender.
01:07:08.120 Why in the world do we care the gender of who's on the team?
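The "play against people of roughly equal ability" proposal is basically a matchmaking or bucketing problem. A minimal sketch of what that could look like, with entirely invented player names and ratings:

```python
# Illustrative sketch only: divisions formed by skill rating,
# with no reference to gender or any other attribute.

def assign_divisions(players, division_size=4):
    """Sort players by rating and split into equal-ability divisions."""
    ranked = sorted(players, key=lambda p: p["rating"], reverse=True)
    return [ranked[i:i + division_size] for i in range(0, len(ranked), division_size)]

players = [
    {"name": "A", "rating": 95}, {"name": "B", "rating": 60},
    {"name": "C", "rating": 88}, {"name": "D", "rating": 55},
    {"name": "E", "rating": 91}, {"name": "F", "rating": 58},
    {"name": "G", "rating": 84}, {"name": "H", "rating": 50},
]
divisions = assign_divisions(players)
# Top division is simply the four highest-rated players.
print([p["name"] for p in divisions[0]])  # ['A', 'E', 'C', 'G']
```

An outlier like a young Shaquille O'Neal lands in the top division by rating; everyone else competes against players near their own level.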
01:07:10.600 Now, the whole idea that women need to get awards is just a bullshit woman thing.
01:07:17.380 Sorry.
01:07:17.860 Because if it worked the other way, probably you wouldn't see any change.
01:07:22.680 If women had always been the athletes, you know this is true.
01:07:26.560 If women had traditionally been the only ones who played sports, and socially we had adjusted to that over time and decided it was right.
01:07:34.640 But then we got more aware and said, hey, men aren't playing sports.
01:07:40.020 How about get some men into these sports?
01:07:42.180 Do you think anybody would give a fuck?
01:07:44.380 No.
01:07:45.760 Nobody would give a shit if sports had always been about women and men wanted in.
01:07:51.120 Nobody would change the law for that.
01:07:53.400 Of course not.
01:07:53.920 It's only because men were, you know, and boys were playing sports.
01:07:58.660 Unfairly.
01:07:59.360 I'm not saying that was a good deal.
01:08:00.860 It was very unfair for the women.
01:08:02.400 But the reason it changed is because it was women asking for it.
01:08:07.520 I don't think it works the other way.
01:08:10.580 And I don't think that it was ever a good change.
01:08:13.940 Because what women ask for is, hey, can't women get their own awards?
01:08:18.920 To which I say, well, why can't shitty male athletes get their own awards?
01:08:23.920 Why isn't there a varsity team for short male basketball players?
01:08:31.880 Why is that wrong?
01:08:33.400 Why is there not a varsity male basketball team for people under 5'8"?
01:08:40.880 Why can't they win an award?
01:08:44.100 Well, you say, well, Scott, they're just bad athletes.
01:08:49.640 You don't give awards to bad athletes for whatever the reason.
01:08:52.280 Sure, they're 5'8", but, you know, you want to reward the good athletes.
01:08:57.740 To which I say, well, I wasn't born where I can compete with Shaquille O'Neal.
01:09:03.000 Look at me.
01:09:04.720 What?
01:09:05.580 Am I going to stop his dunk?
01:09:07.000 So why in the world should I, let's say, hypothetically, I had no sporting ability,
01:09:15.960 why should I be, like, you know, glorifying the people who were good at a random event?
01:09:21.880 Sports are completely random rules.
01:09:24.960 And I, as a person who just, you know, wouldn't be good at some of those sports,
01:09:29.460 I have to go glorify, and my tax, you know, some of my tax base goes to support their awesomeness.
01:09:36.960 You know, why am I paying for some athlete that I'm not?
01:09:40.620 So first you've got to figure out how we got here.
01:09:43.860 How we got here is that women complained.
01:09:48.400 I would have, too.
01:09:49.840 Perfectly reasonable.
01:09:51.460 And they worked to get, you know, sort of equal treatment,
01:09:55.240 but it wasn't what they should have asked for.
01:09:58.220 They got what they asked for, but it's not what they should have asked for.
01:10:01.720 It was good for girls and good for women,
01:10:04.380 so I can't fault them for asking for what's good for them.
01:10:07.280 That's the way it works.
01:10:08.340 You do get to ask for what's good for you.
01:10:10.620 I'm not going to criticize people asking for what's good for them.
01:10:14.420 But if we had re-architected the whole thing from scratch,
01:10:17.800 we would have said, you know, that's a good point.
01:10:21.240 Maybe we should just have teams that are based on ability,
01:10:24.340 and the best female players who are on teams mostly with men
01:10:30.020 would still be superstars, wouldn't they?
01:10:34.260 Could you imagine if a female made it onto a high-end team
01:10:40.660 in which the only other people who could play for that team were male?
01:10:45.080 Wouldn't that one woman who could actually legitimately make the team
01:10:48.360 be like, really?
01:10:50.240 It would be Danica Patrick, right?
01:10:53.020 Danica Patrick probably made more money than 99% of all male race car drivers.
01:10:59.260 Why?
01:11:00.060 Because she was interesting.
01:11:02.700 She was interesting.
01:11:03.740 She was pretty.
01:11:04.480 That helped.
01:11:05.260 But she was interesting because she was the highest-level female
01:11:09.200 in a largely male-dominated sport.
01:11:11.580 What's wrong with that model?
01:11:14.560 Do you think Danica Patrick thinks,
01:11:17.280 oh, I've been so discriminated against in this male sport?
01:11:21.080 Or does she think that she is world-famous
01:11:23.800 and has great opportunities because of it?
01:11:28.900 I don't know.
01:11:29.260 So, really, the trans topic has everything to do with the fact
01:11:38.040 that women are treated as a little bit more special than men,
01:11:42.120 and we've agreed that we're okay with that.
01:11:45.700 And then the trans thing just messes up the whole model.
01:11:49.060 Just messes up the whole model.
01:11:51.400 But I'm going to leave you with this one thought
01:11:55.080 that I'm not going to change.
01:11:57.040 You'll never talk me out of this.
01:11:59.260 The trans are not broken.
01:12:02.020 Trans people are not broken.
01:12:03.940 Sports are broken.
01:12:05.500 Just fix the sports.
01:12:07.120 I know it's not going to happen.
01:12:08.620 I'm just saying look in the right place for the problem.
01:12:10.900 It's not the trans people.
01:12:13.920 All right.
01:12:17.820 So, you know, who was it?
01:12:19.860 The Trump's Secret Service people testified for the January 6th people.
01:12:25.320 And here's an interesting fact that you would never notice
01:12:30.300 if I were not here to bring it up.
01:12:33.700 You know, I always notice, I always mention when CNN runs an opinion piece,
01:12:39.600 often by Chris Cillizza.
01:12:41.240 Cillizza?
01:12:41.740 I hope I'm saying that right.
01:12:43.560 And he's one of their superstar opinion people.
01:12:46.180 And he'll do a story that's sort of anti-Republican by its nature.
01:12:51.060 And does a good job.
01:12:52.200 And there's a couple others.
01:12:57.420 Collinson, I think, is the other one, who are their standard go-to.
01:13:01.560 I think they have like three or four big name opinion pieces
01:13:05.840 who always jump in when you need to rescue a Democrat problem.
01:13:10.660 Right.
01:13:10.820 If there's something in the news that's skewed a little bit too Republican,
01:13:15.240 you know, one of their big names jumps in and squashes it down.
01:13:18.580 Well, I noticed with interest,
01:13:20.300 and maybe it could be just summer vacations or something,
01:13:24.040 but they've got a, what I think is an unknown, at least to me,
01:13:27.680 somebody named Jeremy Herb,
01:13:29.860 who writes the opinion piece to say that
01:13:32.020 the House January 6th Committee corroborated key details
01:13:37.060 involving former President Trump's heated exchange with Secret Service
01:13:41.900 when Trump was told he could not go to the Capitol.
01:13:46.380 Okay.
01:13:47.640 So they corroborated key details about that exchange.
01:13:54.460 Let me think.
01:13:55.120 What did that story include?
01:13:57.240 Let's say there was a story about him throwing food against a wall,
01:14:02.020 which he denies.
01:14:03.780 There was a story about Trump allegedly grabbing the steering wheel
01:14:07.180 of the car he was in to try to, you know,
01:14:10.460 encourage the driver to go toward the protests.
01:14:16.040 And yet the reporting by Jeremy Herb
01:14:20.680 is that the key details were corroborated.
01:14:26.540 What do you think were the key details?
01:14:29.720 Which part was corroborated?
01:14:32.020 You see what's missing?
01:14:35.340 Here's a fact that was corroborated.
01:14:38.180 Here's another fact that was corroborated.
01:14:40.400 Here's the fact.
01:14:41.280 Here's the corroboration.
01:14:42.940 It doesn't have that.
01:14:44.840 It just says that key things were corroborated.
01:14:47.240 Were the key things, the actual things that you think are the key?
01:14:52.540 Because I think if Trump grabbed the steering wheel of a moving car,
01:14:57.120 that would be, I'd like to know that.
01:14:59.560 Did he say that happened?
01:15:01.220 I don't think so.
01:15:02.940 I don't think that was one of the key details.
01:15:04.660 I've got a feeling that no key details that you care about were confirmed at all.
01:15:12.400 And I got a feeling, and this is just a feeling, so I can't prove what I'm going to say now.
01:15:19.400 It feels as if, and it's probably just summer vacations, but it feels as if the big-name opinion-piece
01:15:25.920 people couldn't do it.
01:15:28.180 They're just like, seriously?
01:15:30.460 You want me to act like he corroborated the thing he didn't mention?
01:15:34.520 I feel like the ones that do the big pieces have a little more pride.
01:15:39.000 It's like, I can spin this, but I'm not going to just ignore the key detail
01:15:44.040 and say that other key details were corroborated and just lie about it.
01:15:47.820 I think they had to get somebody who is new who would go this far.
01:15:52.220 Again, this is pure speculation, right?
01:15:56.000 I can't read any minds.
01:15:57.340 I don't know what CNN is really doing.
01:15:58.780 It could just be that the bigger guys, and bigger women, are on vacation,
01:16:04.920 so they had to find somebody to fill in.
01:16:06.800 But he hilariously fills in by doing just the most ham-handed job of spinning this in an illegitimate way.
01:16:16.160 He'll probably get a permanent job there.
01:16:18.740 It was so good.
01:16:19.840 All right, well, it turns out that that was the end of my notes,
01:16:23.960 because I thought I had a third page,
01:16:25.420 but it turns out that my printer decided all it needed was that little line there.
01:16:32.380 So it printed that little line, and that's all it needed to do.
01:16:38.600 Now, have I changed your minds on anything?
01:16:48.260 Nope.
01:16:48.620 Nope.
01:16:49.840 I don't change too many minds on here, except that Skittles might not be a food group.
01:16:57.480 If you think Skittles is a food group, I hope I changed your mind.
01:17:03.280 You agree on the sports?
01:17:04.800 Good.
01:17:05.500 You know, part of it is because I played co-ed intramural sports quite a bit,
01:17:10.760 and it was just better.
01:17:13.820 It was just better to have a mixed group.
01:17:15.960 If I could have played at the highest level of soccer in college, for example,
01:17:22.940 I probably wouldn't want, you know, anybody on my team who couldn't keep up.
01:17:27.480 But that was not the case.
01:17:29.800 When I played soccer with, you know, women on the team, they kept up with me.
01:17:34.960 You know, maybe they couldn't keep up with the best male players on the team, but they certainly kept up with me.
01:17:40.760 All right.
01:17:47.220 Oh, you know about Hartwick.
01:17:48.900 Yeah.
01:17:49.680 The tiny college I went to, Hartwick College, recruited European soccer players before that was a big thing.
01:17:58.280 So one year that I was there, we had the number one soccer team in the country.
01:18:04.420 Tiny little school.
01:18:06.960 That was quite exciting for us.
01:18:10.940 All right.
01:18:13.300 Should your theory on sports carry over to intelligence in the classroom?
01:18:17.500 Well, it does.
01:18:18.700 I mean, that's being reversed a little bit.
01:18:21.000 But ideally, you want people of equal capability to be in the same class.
01:18:26.500 Don't you?
01:18:28.280 Isn't that obvious?
01:18:29.320 You can't always get that.
01:18:31.080 There's practical reasons why you can't do it.
01:18:33.840 But you'd want it if you could get it.
01:18:38.500 What if music were treated like sports?
01:18:40.760 Well, I don't know where that's going.
01:18:46.060 Don't take criminal questions?
01:18:49.920 Oh, how about the interview office in town?
01:18:52.860 You know what?
01:18:55.000 I had a thought this morning.
01:18:56.520 Let me run this by you.
01:18:57.400 So I told you I was thinking of starting like an actual podcast studio.
01:19:02.820 Maybe something in town where people don't have to come to my house.
01:19:06.820 You know, something small, very small, but professional.
01:19:10.440 And maybe something I could have an audience.
01:19:13.340 So I could bring in an audience sometime.
01:19:14.800 But I realized that maybe what I have to do is host debates.
01:19:20.560 I feel like everything has sort of lined up that I'm the only one who can do it.
01:19:28.780 Now, I'm not the only one who has the skill.
01:19:32.900 That would be lots of people.
01:19:34.200 But I might be the only one who has the minimum amount of skill and the willingness to do it.
01:19:39.180 If there's nobody willing to do it, it doesn't matter how much skill they have.
01:19:44.020 So I don't think I have the highest level of skill.
01:19:46.020 I think I have, like, just enough.
01:19:48.780 But now to do this, I would have to do it on a video, though, because I don't think I could get people to come where I live.
01:19:55.040 It's just not like in L.A. where people are going to go there anyway.
01:19:58.880 If you had an office in New York, you could get everybody because they're in town anyway.
01:20:03.360 But people are not in my town anyway.
01:20:06.160 So I think I'd have to do a video.
01:20:10.000 And I might set it up so there's, like, three videos, you know, two of the competing people and then me.
01:20:17.680 But I would be live if anybody wanted to be there in the audience.
01:20:21.020 And my model would be heavy interruption.
01:20:25.040 So it wouldn't be a regular debate.
01:20:28.080 It would be a debate in which I interrupted both sides.
01:20:30.640 Oh, you're ignoring the question.
01:20:33.060 That sort of thing.
01:20:35.560 Medical software.
01:20:36.640 More evidence in the new golden age.
01:20:38.200 Oh.
01:20:39.300 Oh, interesting.
01:20:40.820 Julia Scherer says that Corpusim, C-O-R-P-U-S-I-M, is a thrilling, useful, new open source medical software.
01:20:51.240 Open source medical software.
01:20:53.600 How much do I want that?
01:21:04.960 That's either the worst idea or the best idea.
01:21:08.100 Imagine, if you will, that everybody who wanted to purchase...
01:21:11.080 I have no idea what this product is.
01:21:12.940 So what I'm going to talk about, don't assume that has anything to do with this product.
01:21:16.940 But imagine if just people could contribute their experience of what they did.
01:21:23.660 Right?
01:21:23.880 I had this problem.
01:21:24.860 I did this for it.
01:21:27.840 Eventually, wouldn't it turn into a giant AI?
01:21:31.020 And you wouldn't need doctors anymore except for the physical manipulation of things.
01:21:35.820 I don't know.
01:21:37.520 I think it would go in that direction.
01:21:43.840 Who would be the first and second debaters?
01:21:47.120 I'm not sure that I would bring in necessarily the politicians.
01:21:55.080 So the trouble with politicians debating is that they're professional liars and we don't expect much more from them.
01:22:02.200 Am I right?
01:22:03.940 Like, having a debate between two politicians feels like a complete waste of time.
01:22:07.820 Having a debate between experts, I can imagine a model where I could make that work.
01:22:15.880 But any debate between pundits or politicians would be just a complete waste of time.
01:22:23.120 Do both?
01:22:25.400 I just think it's a waste of time to do the politicians.
01:22:28.420 It really is.
01:22:29.840 I mean, the only reason to do the politicians is that you get more audience because they're noticeable.
01:22:34.540 And you can embarrass somebody and maybe there's a gaffe.
01:22:38.800 But it would be the least useful thing to do.
01:22:40.820 You know, if I did an actual debate on policy, nobody would watch, probably.
01:22:48.580 Is Rand Paul a professional liar?
01:22:51.220 Quote, yes.
01:22:53.460 Yes.
01:22:54.280 Now, I have a high opinion of Rand Paul.
01:22:57.020 And I think he's probably one of the most honest people.
01:22:59.940 But you don't think that he's ever left anything out?
01:23:04.580 Do you think he provides all the context for the other team or maybe just shows his side?
01:23:11.900 I don't feel like you can be in politics and give complete, let's say, complete respect to the other team's argument before you state what your opinion is.
01:23:22.200 And without that, I'm not sure that's really honest.
01:23:26.660 That's doing a good job as a politician.
01:23:29.460 I love Rand Paul as a politician, by the way.
01:23:31.960 I think he's a national asset.
01:23:34.260 There are a few politicians who, let's say, have surpassed Congress in terms of who they are.
01:23:41.920 Like, there's some people who are just members of Congress.
01:23:45.300 You've never even heard their names if they're not from your state, and even that.
01:23:49.580 But, yeah, Tom Cotton's one.
01:23:51.540 He rises above.
01:23:53.440 And Rand Paul's one.
01:23:55.300 So I believe there are some people who have risen above their limited role that they get elected for, and he's one, in a good way.
01:24:04.100 Well, Speaker of the House is a special case.
01:24:08.660 Ted Cruz, yeah.
01:24:09.580 Now, often it's people who have ambition to be presidents.
01:24:15.080 Yeah.
01:24:15.420 Ted Cruz, I think, for sure.
01:24:17.480 Yeah.
01:24:17.860 And MTG.
01:24:19.340 Yeah, some other people.
01:24:20.320 Thomas Massie, good example.
01:24:22.420 Thomas Massie and I do not agree on everything.
01:24:26.320 But whenever we disagree, his reasons are good.
01:24:30.500 Which bothers me, frankly.
01:24:34.240 I hate to feel disagreement with people whose reasons on the other side are actually solid.
01:24:38.540 Like, oh, that's actually a pretty good reason.
01:24:44.900 And Thomas Massie does that to me all the time.
01:24:44.900 I think probably three or four times I've had an immediate disagreement with something I thought he said.
01:24:51.660 And then you hear the larger argument, and you go, oh, okay.
01:24:55.220 Maybe I don't agree with it, because I'm not quite as, you know, my politics don't line up exactly with some of his stuff.
01:25:04.680 But when he gives his reason, I usually say, okay, that's the reason.
01:25:10.480 Yeah, he's more libertarian than I am.
01:25:13.200 But he's consistent.
01:25:14.460 You know, being consistent is worth a lot.
01:25:19.080 And there are libertarian arguments that you say to yourself, well, I don't know that that would work, but I can't rule it out.
01:25:26.820 You know, it's got some logic behind it.
01:25:30.820 You'd pay good money to hear that?
01:25:32.320 Now, that's somebody I would debate.
01:25:36.120 I would definitely debate Thomas Massie.
01:25:39.240 Because like you said, he doesn't operate like a traditional politician.
01:25:43.940 Oh, here's the biggest compliment I'll ever give a politician.
01:25:47.580 You ready for this?
01:25:49.060 This is literally the biggest compliment I will ever give a politician.
01:25:53.520 I'm positive that if I had a debate with Thomas Massie on some topic on which we disagreed, and I made a better argument, he would change his mind right in front of you.
01:26:08.200 Now, I'm not saying I have the ability to do that, or that there's any topic which stands out as the one that I would do that.
01:26:14.280 However, it is my opinion that if I gave him a better argument in public, that he would have the capability.
01:26:23.340 It's the capability that's important, right?
01:26:25.500 Because others would just cowardly retreat to their team.
01:26:29.320 I believe he would actually tell you in public that he changed his mind under the unusual condition that I made a better argument and, you know, it had some weight.
01:26:37.740 I don't know that anybody else could do that.
01:26:40.780 Do you?
01:26:46.140 Would you trust anybody else to change their mind in public when faced with a better argument?
01:26:46.140 I would, but I'm not a politician, right?
01:26:49.060 So, but a politician.
01:26:51.440 I don't know.
01:26:53.420 Name one other person you think would be capable.
01:26:57.000 Forget about willing, you know, just capable.
01:27:01.720 I mean, Joe Rogan would, but again, not a politician.
01:27:09.000 What?
01:27:10.320 Co-punt?
01:27:11.960 Is that a word?
01:27:16.700 If Trump and DeSantis debated, it would be horrible.
01:27:19.800 Yeah.
01:27:20.740 That would be a weird one, wouldn't it?
01:27:22.740 Imagine Trump and DeSantis debating.
01:27:25.860 I can't see that ever happening.
01:27:32.140 Yeah, I see some support for Dan Crenshaw over on YouTube.
01:27:37.920 Conservatives have a real mixed feeling about him, don't they?
01:27:40.280 I forget what the issue is.
01:27:46.840 Yeah.
01:27:49.080 Crenshaw is interesting because he's smarter than he lets on.
01:27:54.540 I don't know if you know that, but Crenshaw is super smart.
01:28:00.160 And I think he's smart enough not to let you know just how smart he is, because I'm not sure that would play right.
01:28:06.660 So he's very smart.
01:28:09.380 Whether you disagree with him or agree with him, you know, I'm not going to have that conversation right now.
01:28:13.940 Oh, somebody's calling him a rhino.
01:28:15.580 Is that what's happening?
01:28:16.200 Well, I don't know about any of that.
01:28:19.120 I did an interview with him, and I'll tell you my first impression was all positive.
01:28:25.140 So I had only positive feelings.
01:28:27.000 So it's mostly the anti-war versus not anti-war enough thing that people have a complaint about.
01:28:38.800 Yeah, I don't know.
01:28:39.640 I'm not up to date, so I don't have anything to disagree with him about.
01:28:42.560 But maybe I would.
01:28:43.720 Who knows?
01:28:44.080 But you like Bill Gates, too.
01:28:50.620 Well, I think the shine on Bill Gates is coming off a little bit.
01:28:57.460 Here's what I like about Bill Gates.
01:29:00.060 Unlike you, I believe he's operating in the best interest of humanity.
01:29:06.480 I don't know that you could ever talk me out of that.
01:29:08.520 Because if you think he's operating to get power, there's no indication that he's after power.
01:29:15.080 He could have had all the power he wanted.
01:29:17.080 There's no indication he wants it.
01:29:19.340 And there's no indication that he's in it for money.
01:29:23.300 I don't see any indication of that.
01:29:26.260 Now, I could be wrong, right?
01:29:27.920 Because we're all just reading minds.
01:29:29.720 I can't read his mind.
01:29:30.860 But I would say there's no indication of anything except good intentions.
01:29:34.460 Now, does he get everything right?
01:29:37.040 No.
01:29:37.480 Does anybody?
01:29:39.500 No.
01:29:40.340 But if you were to bet consistently on what he predicted versus what other people predicted,
01:29:46.660 I think you'd come out ahead going with what he predicted.
01:29:51.500 All right.
01:29:52.100 Well, we're running a little long, I see.
01:29:53.720 And today, I think we've nailed it.
01:29:56.760 It's Sunday.
01:29:58.140 So it's the best show ever on a Sunday.
01:30:02.600 And yes, I am better at predicting.
01:30:05.080 You're right.
01:30:07.480 And, oh, I don't think carpool duty will recommence.
01:30:14.060 So when school starts, I will be in a different living situation.
01:30:18.440 So I won't be carpooling.
01:30:21.820 And that's all for now.
01:30:23.780 Ladies and gentlemen.
01:30:25.460 And thank you.
01:30:31.080 I'm just reading your comments and getting transfixed by them.
01:30:34.440 But goodbye, YouTube, for tonight.
01:30:37.160 We'll see you tomorrow.