Real Coffee with Scott Adams - January 14, 2023


Episode 1988 Scott Adams: Everything The News Gets Wrong Because They Are Not Engineers


Episode Stats

Length

1 hour and 8 minutes

Words per Minute

141.4

Word Count

9,623

Sentence Count

723

Misogynist Sentences

11

Hate Speech Sentences

20


Summary

A new Martin Luther King Jr. statue is drawing criticism for how it looks from certain angles, just as a biographer's allegations about MLK surface in the news. What will that mean for the statue's future?


Transcript

00:00:00.000 Good morning ladies and gentlemen and welcome to the highlight of civilization.
00:00:09.120 It's called Coffee with Scott Adams and I'm pretty sure you've never enjoyed anything
00:00:13.840 as much as you can enjoy the next approximately one hour.
00:00:18.840 No, I'm not bound by any hard guidelines of timing.
00:00:23.120 No, not at all.
00:00:24.360 I can make it as long or as short as I want.
00:00:27.740 Don't make any jokes.
00:00:29.180 Now, how would you like to take your experience up to levels that nobody's ever seen?
00:00:37.120 Ever?
00:00:38.140 And all you need is a cup or a mug, a glass, a tank, chalice or stein, a canteen, jug or
00:00:42.060 flask, a vessel of any kind.
00:00:44.080 Fill it with your favorite liquid.
00:00:45.240 I like coffee and join me now for the unparalleled pleasure of the dopamine of the day, the thing
00:00:53.160 that makes everything better.
00:00:54.100 It's called the simultaneous sip and it happens now.
00:00:55.820 Go.
00:00:59.180 Hold it, hold it, hold it, hold it, everybody.
00:01:02.340 Hold it.
00:01:06.160 We have a request to wait.
00:01:08.060 Somebody wasn't quite ready.
00:01:10.280 Okay, this is the real one.
00:01:12.020 First one didn't count.
00:01:14.040 Go.
00:01:14.660 Go.
00:01:15.100 Okay, I think we got everybody on board there.
00:01:21.480 We will leave no viewer behind.
00:01:25.640 We're like the Marines that way.
00:01:28.580 So President Biden was bragging: gas is down more than $1.70 from its peak.
00:01:35.460 That adds up to around $180 per month for a typical family.
00:01:42.760 That's money in your pocket, not spent at the pump, he says.
00:01:46.820 Money in your pocket.
00:01:48.560 Yeah.
00:01:48.760 Does this kind of analysis remind you of anything?
00:01:55.600 Is there any fictional character who might be well known for saying this sort of thing?
00:02:04.000 And might that fictional character be a comic?
00:02:09.820 And might it involve a boss with pointy hair?
00:02:13.860 And might it involve a boss who says,
00:02:17.380 Hey, we could have wasted a million dollars this year,
00:02:21.160 but we only wasted half a million,
00:02:23.520 and that's money in your pocket.
00:02:25.740 That's money in your pocket.
00:02:27.540 I could have done worse.
00:02:29.900 So, you're welcome.
00:02:31.480 That literally is a Dilbert comic.
00:02:37.060 Not just one.
00:02:38.460 I'm pretty sure I've hit that theme a few times.
00:02:41.480 And you know what?
00:02:42.420 I'm going to hit it again.
00:02:44.700 I'll probably hit that theme again.
00:02:47.260 Yes, bragging about how things could have been so much worse,
00:02:51.520 and therefore, because they're not worse,
00:02:54.600 money in your pocket.
00:02:56.160 That's money in your pocket, baby.
00:02:57.580 Do you see there's a new Martin Luther King statue?
00:03:03.860 Some people say that viewed from a certain angle,
00:03:07.200 it looks a little different than maybe what the creator hoped.
00:03:13.480 I think they're suggesting it looks like a giant,
00:03:16.960 oh, I can't say that on live stream,
00:03:19.960 a giant thing.
00:03:22.040 But that's only from one point of view.
00:03:25.800 Now, I think I saw a Mike Cernovich tweet
00:03:29.580 asking what we do about this statue.
00:03:33.800 Because it turns out, at the same time,
00:03:36.580 the statue's going up.
00:03:37.760 I think it was Politico had an article about
00:03:39.660 some researcher, biographer,
00:03:42.280 who's come up with a bunch of dirt on MLK.
00:03:45.840 Apparently, MLK would not have survived the Me Too era,
00:03:49.280 if the allegations are correct.
00:03:52.600 And so now you have an interesting situation.
00:03:55.600 What happens if the most storied and famous leader
00:04:01.920 of the black community turns out to be a Me Too person,
00:04:06.500 and then maybe a criminal against the female part of the public,
00:04:12.820 and other people as well?
00:04:14.240 That would be an interesting battle of power, wouldn't it?
00:04:19.860 Who do you think would win?
00:04:21.740 Who has more power in today's world?
00:04:25.000 Black America or female America?
00:04:30.960 I would think female, because there are more of them, right?
00:04:33.640 Numbers.
00:04:35.300 More than half of the world, I think.
00:04:38.100 Or more than half of the United States, anyway.
00:04:39.600 So, we'll see.
00:04:42.720 But I think it's optimistic to assume
00:04:48.020 that that statue will not be pulled down.
00:04:51.320 Now, one reason it might not be pulled down
00:04:54.000 is if it's mostly women who are complaining.
00:04:58.940 They don't have the upper body strength,
00:05:01.580 so statue's safe.
00:05:03.200 No, I'm just joking.
00:05:07.280 Because everybody cares about Me Too, not just women.
00:05:10.480 So, I'm just being provocative.
00:05:14.340 I don't think that statue's going to come down.
00:05:16.280 Nor do I think it should.
00:05:18.060 But it does raise some interesting questions
00:05:21.040 in our complicated times.
00:05:24.160 All right.
00:05:24.880 I may have told you before,
00:05:26.380 but there's a phrase on Twitter
00:05:28.920 that will get you instantly blocked.
00:05:31.040 And I don't care if you say it to me
00:05:33.640 or someone else.
00:05:36.180 If you use these words about anybody,
00:05:39.760 doesn't matter if it's about me,
00:05:41.900 if you use this to explain anybody
00:05:44.540 or to talk to anybody,
00:05:45.940 you are instantly blocked.
00:05:47.720 And here's the phrase.
00:05:50.240 You're better than this.
00:05:52.420 You're better than this.
00:05:55.000 Nope.
00:05:56.340 No.
00:05:57.740 I am exactly as good as whatever I'm doing.
00:06:01.040 If there's one thing I'm never going to back off from,
00:06:06.340 you are what you do.
00:06:08.500 You are what you do.
00:06:11.200 If you do something and somebody doesn't like it,
00:06:14.020 you're not better than that.
00:06:15.540 You're exactly that.
00:06:17.320 And you might be happy about it.
00:06:18.640 You might not be apologizing at all.
00:06:20.300 You are that.
00:06:21.320 Yes, I did that.
00:06:22.980 That is who I am.
00:06:23.900 And here's why I don't like the phrase
00:06:27.440 you're better than that.
00:06:28.920 Because it's emotionally manipulative
00:06:31.300 and it acts as though
00:06:33.500 I should organize my life
00:06:36.200 around a stranger's approval.
00:06:39.160 Why would I organize my life
00:06:40.860 around the approval of a stranger?
00:06:43.200 And why does that person think
00:06:44.620 they can impose that on me?
00:06:46.460 And why would they be dumb enough
00:06:48.000 to say it in public?
00:06:49.620 That their opinion should monitor
00:06:51.760 and should change my behavior
00:06:54.340 and make me a different person?
00:06:56.840 I don't think there's anything
00:06:57.940 that bothers me more than that phrase
00:06:59.700 because it's just manipulative.
00:07:02.420 How many of you would agree
00:07:03.900 it's just pure manipulation?
00:07:05.760 And here's the other thing.
00:07:08.740 Anybody who would use that phrase,
00:07:10.800 especially in public,
00:07:12.920 would you want to be friends with that person?
00:07:15.300 Because that tells you
00:07:17.300 where their mentality is, right?
00:07:22.500 All right.
00:07:25.980 Okay.
00:07:26.880 Recommendation there.
00:07:30.540 You know,
00:07:31.660 as much as I appreciate
00:07:34.220 the super comments on YouTube,
00:07:36.520 I think YouTube should get rid of them.
00:07:38.540 What do you think?
00:07:39.800 Because when you stop to read
00:07:41.180 a super comment,
00:07:43.360 often it's not on the topic
00:07:45.040 you're talking about.
00:07:46.360 And I feel like,
00:07:48.240 you know,
00:07:48.520 the law of reciprocity,
00:07:51.080 when somebody does anything for you,
00:07:53.240 you just automatically feel like
00:07:54.600 you owe them something.
00:07:55.480 We're all designed that way.
00:07:56.940 So the problem with this model
00:07:58.240 is that,
00:07:59.920 especially if it's people I like,
00:08:01.460 like sometimes they'll come from people
00:08:02.820 I know personally
00:08:04.120 and I like them.
00:08:05.460 But if it's not on the same topic,
00:08:08.660 it's not good for everybody else.
00:08:10.480 So it's like paying $20
00:08:11.700 to go to the top of the line.
00:08:15.180 Now,
00:08:15.420 if you go to the top of the line
00:08:16.580 on the topic we're talking about,
00:08:18.760 I think that's cool.
00:08:21.300 You know,
00:08:21.840 because if you make a good point,
00:08:23.520 it doesn't matter
00:08:24.060 if you paid or didn't pay.
00:08:25.560 But if it's a different topic,
00:08:28.120 that's probably not the best way
00:08:29.360 to use the super chat.
00:08:30.600 So just keep that in mind.
00:08:32.680 You know,
00:08:32.940 the feature is there
00:08:34.400 so you can use it
00:08:35.220 like there's no crime.
00:08:36.680 But just as a preference,
00:08:38.520 I would rather you not give me money.
00:08:40.780 Is that fair?
00:08:42.540 I would rather you not give me money
00:08:44.600 in that particular way.
00:08:47.860 But thank you.
00:08:49.740 Let me show some gratitude.
00:08:52.040 Thank you.
00:08:52.740 Because I think the people who do it
00:08:53.880 usually have the right,
00:08:55.080 you know,
00:08:55.540 the right thought in mind.
00:08:57.440 And I appreciate that.
00:09:00.220 All right.
00:09:00.800 The Trump Organization,
00:09:01.960 I guess,
00:09:02.320 is going to pay
00:09:02.980 a $1.6 million fine,
00:09:05.220 a criminal fine.
00:09:07.420 The news likes to make sure
00:09:09.040 you know it's a criminal fine
00:09:10.220 for what their CTO apparently did.
00:09:14.860 Now,
00:09:15.400 I'm having a little trouble
00:09:16.380 understanding this story.
00:09:18.600 And maybe somebody who's
00:09:19.820 maybe more clued into
00:09:22.440 either the specifics of this case
00:09:25.340 or the legal structure of the world
00:09:29.540 maybe can sort this out for me.
00:09:32.160 So here's what I understand.
00:09:34.320 The,
00:09:34.940 oh, is this CFO?
00:09:36.240 I'm sorry.
00:09:36.880 CFO.
00:09:37.540 The financial guy,
00:09:38.460 not the technical guy.
00:09:39.840 Yeah.
00:09:40.100 So the CFO of Trump Organization,
00:09:44.140 I accidentally said CTO,
00:09:46.480 but the CFO,
00:09:47.440 the financial guy,
00:09:49.080 apparently was in charge of,
00:09:51.140 you know,
00:09:51.420 the accounting
00:09:51.960 and how things were recorded.
00:09:54.980 And he paid himself
00:09:58.400 without going through payroll
00:10:01.400 by having the Trump Organization
00:10:03.980 pay some of his expenses directly.
00:10:06.760 Now,
00:10:07.140 that's illegal
00:10:07.800 because it didn't get taxed
00:10:09.800 in the way that the system requires it.
00:10:12.200 Now,
00:10:12.620 here's the problem.
00:10:14.520 Because it happened
00:10:15.760 and because the CFO is an officer
00:10:19.240 of the company,
00:10:21.020 the company is also on the hook
00:10:23.560 because although it was one person
00:10:26.100 who seems to be involved,
00:10:27.320 as far as we know,
00:10:28.000 just one person,
00:10:29.120 and the one person
00:10:30.080 was the person who does it
00:10:31.020 for the company.
00:10:32.520 So,
00:10:33.160 it's a weird situation
00:10:34.260 where the Trump Organization
00:10:36.460 was a victim of a crime
00:10:38.460 and they have to pay a penalty
00:10:41.440 for being a victim of a crime.
00:10:45.520 Now,
00:10:46.060 it's a little unclear
00:10:46.980 if the Trump Organization
00:10:48.700 made or lost any money
00:10:51.320 because of the way this was handled.
00:10:54.280 They should have come out
00:10:55.280 about the same.
00:10:57.240 You know,
00:10:57.420 certainly it would not have been,
00:10:58.920 it would not have been
00:10:59.860 a big enough crime
00:11:00.760 that the Trump Organization
00:11:02.900 would have had any intention
00:11:05.280 to do it.
00:11:06.500 Here's how you know
00:11:07.240 there's no intention
00:11:08.100 because it wouldn't be worth it.
00:11:11.320 The Trump Organization
00:11:12.580 would save maybe nothing,
00:11:15.660 you know,
00:11:15.880 or maybe a little bit
00:11:17.200 on some kind of payroll-related tax
00:11:20.040 that you wouldn't,
00:11:21.020 if it's just a straight expense.
00:11:22.760 In both cases,
00:11:23.880 it probably got,
00:11:24.560 it was written off
00:11:25.400 as an expense,
00:11:26.900 right?
00:11:27.760 So,
00:11:28.220 if the company pays it in payroll,
00:11:29.560 that's a write-off,
00:11:31.120 but if they pay it directly, that's another write-off.
00:11:34.880 But there's a little extra that you pay, a little extra in payroll taxes to support
00:11:40.300 an employee, that you don't pay on a regular expense.
00:11:42.300 So there might have been
00:11:43.200 like a little benefit
00:11:44.400 to the Trump Organization,
00:11:46.340 but given the size
00:11:47.820 of the organization
00:11:48.560 and the tiny amount
00:11:50.880 of benefit,
00:11:51.860 really just the
00:11:52.660 employee-related taxes
00:11:54.360 for one person,
00:11:55.880 there's no way
00:11:57.660 that anybody
00:11:59.240 besides the beneficiary
00:12:01.300 would have okayed that.
00:12:03.760 In other words,
00:12:04.620 had Trump
00:12:06.080 or,
00:12:07.160 you know,
00:12:08.100 Don Jr.
00:12:09.160 or anybody
00:12:09.940 who was in charge,
00:12:11.060 if any of them
00:12:12.000 had been presented
00:12:12.900 with this option,
00:12:14.320 hey,
00:12:15.160 how about you pay
00:12:15.920 my expenses
00:12:16.600 instead of paying me?
00:12:18.620 I'll save a lot
00:12:19.580 on taxes.
00:12:20.780 You'll save a little bit,
00:12:22.940 but if you get caught,
00:12:25.120 it's like really bad.
00:12:27.060 Nobody would approve that
00:12:28.120 because the cost-benefit
00:12:30.020 doesn't make any sense.
00:12:31.580 You would never
00:12:31.980 put yourself
00:12:32.600 in that position.
00:12:34.000 So,
00:12:34.920 am I interpreting
00:12:36.120 this incorrectly,
00:12:37.800 that the Trump Organization
00:12:38.980 is a victim
00:12:39.900 of a crime
00:12:40.660 that one individual
00:12:43.180 in the company
00:12:43.880 committed?
00:12:45.540 There were two victims.
00:12:46.940 One is the taxpayers,
00:12:48.380 us,
00:12:49.440 and the other victim
00:12:50.300 is the Trump Organization
00:12:51.460 because there's no way
00:12:52.780 in the world
00:12:53.200 they would have
00:12:53.780 greenlit those activities.
00:12:56.360 The risk-reward
00:12:57.640 would make no sense,
00:12:59.040 no sense at all.
00:13:00.500 Right?
00:13:01.540 So,
00:13:02.760 the fact that this
00:13:04.780 is being reported
00:13:05.520 like it's some kind
00:13:06.340 of a Trump-related crime
00:13:07.920 when they're literally
00:13:09.260 the victim of the crime,
00:13:10.660 literally the victim
00:13:12.000 of the crime,
00:13:12.580 I think.
00:13:13.880 There's no upside
00:13:14.800 for them.
00:13:16.900 All right,
00:13:17.180 so that's your first
00:13:18.240 backwards news of the day.
00:13:21.580 So,
00:13:22.160 I tweeted yesterday
00:13:23.100 a,
00:13:23.520 you know,
00:13:24.500 strange little thought
00:13:26.060 and I said that
00:13:26.720 I think it would be
00:13:27.200 a good standard
00:13:27.900 for social behavior
00:13:29.060 to ignore anything
00:13:30.600 a person said
00:13:31.380 before the age of 25
00:13:32.720 because young people
00:13:34.840 are works in progress
00:13:35.840 and we usually improve.
00:13:37.400 Now,
00:13:37.880 this is based on
00:13:38.580 a specific story
00:13:39.680 in the news
00:13:40.240 that I'm not
00:13:40.720 going to repeat
00:13:41.320 to be consistent.
00:13:43.580 I'm not going
00:13:44.200 to repeat the story
00:13:44.960 because I don't think
00:13:45.740 we should be talking
00:13:46.520 about what somebody
00:13:47.340 said when they were
00:13:48.080 a young person.
00:13:50.080 Right?
00:13:50.400 So,
00:13:50.780 I don't even want
00:13:51.280 to give you enough
00:13:51.800 details that you
00:13:52.480 could Google it.
00:13:53.800 That's how much
00:13:54.400 I don't want
00:13:54.920 to talk about it.
00:13:56.020 But the concept,
00:13:56.960 I think,
00:13:57.180 is important.
00:13:58.500 So,
00:13:58.840 you know,
00:13:59.860 I looked at
00:14:01.080 the tweet today
00:14:02.200 and it's like
00:14:02.780 1.9 million views.
00:14:05.400 Now,
00:14:08.080 a normal tweet
00:14:08.760 of mine
00:14:09.080 might get like
00:14:09.740 10,000
00:14:11.160 or 50,000 views.
00:14:13.180 It's got
00:14:13.780 1.9 million views.
00:14:16.820 So,
00:14:17.420 I was like,
00:14:17.840 what's going on here?
00:14:18.720 And then I go,
00:14:19.120 oh,
00:14:19.580 Elon Musk
00:14:20.220 commented.
00:14:22.420 So,
00:14:22.680 Elon Musk
00:14:23.140 commented,
00:14:23.960 if not,
00:14:24.780 perhaps,
00:14:25.260 you know,
00:14:25.540 age 30.
00:14:27.880 Which makes me
00:14:28.720 wonder if there
00:14:29.320 was something
00:14:29.700 specific he said
00:14:30.780 or did
00:14:31.180 between his
00:14:32.440 ages of 25
00:14:33.340 and 30.
00:14:35.780 So,
00:14:36.400 I thought,
00:14:36.680 wow,
00:14:37.640 you know,
00:14:38.200 Elon Musk
00:14:39.360 commented.
00:14:39.960 He didn't even
00:14:40.400 retweet it.
00:14:40.980 He just commented
00:14:41.680 and I got
00:14:42.820 1.9 million views.
00:14:45.460 Okay.
00:14:46.360 So,
00:14:47.040 topic number two.
00:14:49.160 I think I mentioned
00:14:49.880 this yesterday.
00:14:50.500 Steve Milloy,
00:14:52.040 his Twitter name
00:14:54.440 is
00:14:54.640 at Junk Science
00:14:55.460 and he tweeted
00:14:57.660 a,
00:14:58.900 some
00:15:00.340 statistics
00:15:01.060 from the
00:15:01.760 NOAA,
00:15:03.240 the National
00:15:03.780 Organization
00:15:04.580 of
00:15:05.020 Alcoholics
00:15:06.620 Anonymous.
00:15:08.000 Now,
00:15:08.300 what is the
00:15:08.700 NOAA?
00:15:10.700 The National
00:15:11.400 Organization,
00:15:12.380 National
00:15:12.860 Oceanic
00:15:13.740 A
00:15:16.440 is
00:15:17.140 Atmospheric
00:15:17.980 Association.
00:15:22.300 Oceanic
00:15:22.820 and Atmospheric
00:15:23.480 Administration.
00:15:25.860 Exactly.
00:15:26.700 That's exactly
00:15:27.300 what I said.
00:15:28.640 Right.
00:15:28.800 Or something
00:15:30.460 like that.
00:15:31.080 Anyway,
00:15:31.480 so that
00:15:31.860 organization
00:15:32.620 put out
00:15:33.200 their
00:15:33.560 data
00:15:35.100 and it
00:15:35.580 was
00:15:36.420 interpreted
00:15:37.120 two
00:15:37.460 different
00:15:37.720 ways.
00:15:38.900 Steve
00:15:39.220 Milloy
00:15:39.560 says,
00:15:41.020 according to
00:15:41.520 the official
00:15:42.000 government
00:15:42.440 numbers,
00:15:43.800 in the
00:15:44.420 last eight
00:15:44.960 years,
00:15:45.400 the trend
00:15:45.900 has been
00:15:46.400 a decrease
00:15:47.200 in temperature.
00:15:49.480 Which,
00:15:50.120 by the way,
00:15:50.660 I don't see
00:15:51.240 anybody
00:15:51.500 questioning
00:15:52.000 the data.
00:15:53.600 In other
00:15:54.020 words,
00:15:54.380 when Steve
00:15:55.560 Malloy
00:15:55.900 Milloy
00:15:56.360 it and
00:15:56.520 says,
00:15:57.280 you just
00:15:57.940 told us
00:15:58.360 that the
00:15:58.760 temperature
00:15:59.100 went down
00:15:59.580 for the
00:15:59.860 last eight
00:16:00.240 years.
00:16:01.560 And then
00:16:01.860 he shows
00:16:02.220 the graph.
00:16:03.960 I don't
00:16:04.680 believe
00:16:04.960 anybody
00:16:05.280 was
00:16:05.640 questioning
00:16:06.380 whether
00:16:06.760 he
00:16:07.080 interpreted
00:16:07.480 it
00:16:07.760 correctly.
00:16:08.460 I think
00:16:09.020 it actually
00:16:09.480 went down.
00:16:10.960 But,
00:16:11.680 does that
00:16:12.300 mean anything?
00:16:13.600 Does that
00:16:14.120 tell you
00:16:14.440 that climate
00:16:14.980 change is
00:16:15.640 over?
00:16:17.140 No.
00:16:18.160 No.
00:16:18.680 Because if
00:16:19.100 you look
00:16:19.380 at the
00:16:19.620 longer-term
00:16:20.220 graph,
00:16:21.380 there are
00:16:21.740 lots of
00:16:22.340 multi-year
00:16:22.980 periods where
00:16:23.620 it does
00:16:23.900 go down.
00:16:25.000 Followed
00:16:25.480 by going
00:16:25.980 to a
00:16:26.260 new
00:16:26.400 high.
00:16:27.660 Followed
00:16:28.180 often by
00:16:28.800 down a
00:16:29.320 few
00:16:29.480 years.
00:16:30.700 Followed
00:16:31.020 by a
00:16:31.280 new
00:16:31.380 high.
00:16:32.220 Now,
00:16:32.580 I'm not
00:16:32.880 saying that
00:16:33.280 those numbers
00:16:33.740 are necessarily
00:16:34.340 correct,
00:16:35.360 because I
00:16:35.880 don't believe
00:16:36.260 anything anymore.
00:16:37.620 But,
00:16:38.820 those are
00:16:39.320 two completely
00:16:40.380 different stories
00:16:41.300 with the same
00:16:41.920 data.
00:16:43.340 If you
00:16:43.980 pick the
00:16:44.380 most recent
00:16:45.120 eight,
00:16:46.320 do you
00:16:46.680 have an
00:16:47.000 argument that
00:16:47.600 the most
00:16:47.920 recent eight
00:16:48.580 are more
00:16:48.980 important than
00:16:49.980 the last
00:16:50.400 ones?
00:16:51.180 A little
00:16:51.500 bit.
00:16:52.780 Don't we
00:16:53.260 all just
00:16:53.660 assume that
00:16:54.160 whatever's
00:16:54.580 happening
00:16:54.840 recently is
00:16:56.240 going to be
00:16:56.520 more important
00:16:57.080 than what
00:16:57.480 happened in
00:16:57.900 the past?
00:16:59.260 Like,
00:16:59.820 yeah,
00:17:00.060 your brain
00:17:00.440 just automatically
00:17:01.080 says,
00:17:01.480 well,
00:17:01.640 yeah,
00:17:02.460 it should be
00:17:02.940 more important
00:17:03.580 than it went
00:17:04.000 down the
00:17:04.480 last eight
00:17:04.900 years while
00:17:06.120 CO2 was
00:17:07.260 going up.
00:17:08.520 That should
00:17:09.080 mean something.
00:17:10.380 And then you
00:17:10.740 look at the
00:17:11.380 way that people
00:17:12.760 don't like that
00:17:13.480 story or that
00:17:14.560 narrative,
00:17:15.460 how they twist
00:17:16.280 and turn.
00:17:17.660 People actually
00:17:18.440 said,
00:17:19.580 this proves
00:17:20.340 that our
00:17:21.800 climate
00:17:22.660 strategies are
00:17:24.120 working.
00:17:26.580 What?
00:17:27.820 There are
00:17:28.260 actually people
00:17:28.860 who are
00:17:29.140 following climate
00:17:29.920 change and
00:17:31.200 commenting in
00:17:32.600 public, and
00:17:33.580 they believe
00:17:33.980 that we would
00:17:34.500 see the
00:17:34.960 change, you
00:17:36.220 know, because
00:17:36.500 we've done
00:17:36.920 enough that's
00:17:37.560 already lowering.
00:17:39.220 Okay, maybe.
00:17:40.740 I guess anything's
00:17:42.140 possible.
00:17:43.400 But I don't
00:17:43.960 think so.
00:17:45.000 I mean, it
00:17:45.280 seems pretty
00:17:45.780 unlikely to me.
00:17:46.600 But anyway,
00:17:49.060 Elon Musk
00:17:53.180 responded to
00:17:53.960 that one as
00:17:55.160 well.
00:17:56.700 And he
00:17:58.100 said, so he
00:17:59.900 responded to
00:18:01.040 my comments
00:18:01.820 about it that
00:18:02.600 two different
00:18:05.400 things were
00:18:05.860 being presented.
00:18:07.840 We also have that the eight years were the warmest on record, but also trending down.
00:18:15.180 So which
00:18:16.160 one do you
00:18:16.560 care about?
00:18:17.600 That it's the
00:18:18.160 warmest on
00:18:18.760 record, or
00:18:20.040 that it's
00:18:20.400 trending down?
00:18:21.300 You tell me.
00:18:21.960 Which one
00:18:22.340 should I care
00:18:22.780 about?
00:18:23.960 Which one
00:18:24.660 matters?
00:18:25.140 It's the
00:18:25.500 warmest in
00:18:26.140 record, or
00:18:27.700 it's trending
00:18:28.180 down?
00:18:28.540 I see a lot of "neither"s, and some "both"s.
00:18:39.600 All right, here's
00:18:40.060 the correct
00:18:40.740 answer.
00:18:42.160 There is one
00:18:43.120 correct answer,
00:18:45.140 and there's, and
00:18:46.420 I won't take any
00:18:47.140 debate on this.
00:18:48.960 All right?
00:18:50.260 If you don't
00:18:51.140 present both of
00:18:51.960 them, you're not a
00:18:52.880 credible person.
00:18:53.600 That's the
00:18:55.860 answer.
00:18:56.520 If you don't
00:18:57.280 present both at
00:18:59.120 the same time,
00:19:00.580 don't listen to
00:19:02.380 anything else they
00:19:03.100 say.
00:19:03.920 Now, I think I'm
00:19:04.580 going to give
00:19:04.940 Steve Milloy a
00:19:06.060 pass, because he's
00:19:08.280 talking about it in
00:19:08.980 the context of the
00:19:09.980 other data, right?
00:19:10.900 So that's fair,
00:19:12.160 because the other
00:19:12.720 data's already
00:19:13.280 presented, so he
00:19:14.560 presented the part
00:19:15.400 they left out.
00:19:16.300 So that's good,
00:19:17.440 because that
00:19:17.940 incorporates the
00:19:19.200 whole.
00:19:19.400 But if anybody
00:19:20.560 tells you one of
00:19:21.440 those numbers, and
00:19:22.920 they leave out the
00:19:23.620 other one, don't
00:19:25.160 listen to anything
00:19:25.720 they ever say
00:19:26.260 again.
00:19:27.120 That's somebody who
00:19:27.780 doesn't know how to
00:19:28.440 do anything.
00:19:31.740 And by the way, I
00:19:32.640 don't know if Steve
00:19:33.640 Milloy is credible or
00:19:34.960 not credible.
00:19:35.720 I have no opinion.
00:19:36.940 I just know that he
00:19:37.940 had an interesting
00:19:38.560 point about this
00:19:39.440 particular data.
00:19:42.100 And I looked at
00:19:43.680 my numbers, and I
00:19:47.180 had millions,
00:19:47.900 millions of
00:19:49.160 views on my
00:19:50.500 tweet, and I
00:19:52.020 thought, what the
00:19:53.140 hell is going on
00:19:53.700 here?
00:19:54.880 Oh, Elon Musk
00:19:56.140 responded to that
00:19:57.160 one, too, last
00:19:58.260 night.
00:19:58.920 And he said, it's
00:19:59.860 easier to argue
00:20:00.560 that global warming
00:20:01.320 is a risk rather
00:20:03.120 than a certainty,
00:20:04.340 but it is foolish
00:20:05.760 to roll those dice,
00:20:07.300 given that we will
00:20:08.120 eventually run
00:20:08.760 out of fossil fuels
00:20:09.640 and have to
00:20:10.660 generate energy
00:20:11.540 sustainably anyway.
00:20:13.300 How many of you
00:20:14.280 would say that's
00:20:15.980 the best view?
00:20:17.580 Would you concur?
00:20:20.980 That we don't
00:20:22.080 know what the
00:20:22.840 future looks like,
00:20:24.120 but we do know
00:20:25.460 we can't use
00:20:26.980 fossil fuels
00:20:27.800 forever.
00:20:30.640 Is that a
00:20:31.920 reasonable view?
00:20:33.420 Is anything
00:20:34.100 left out?
00:20:37.320 Anything left
00:20:38.200 out?
00:20:41.140 Yes.
00:20:42.700 Because what he's
00:20:43.700 saying, I believe,
00:20:44.580 would be 100%
00:20:46.020 compatible with
00:20:46.980 everybody's view,
00:20:48.060 wouldn't it?
00:20:49.120 Is there anybody
00:20:49.860 who doesn't think,
00:20:51.560 let me say it
00:20:52.280 without the
00:20:52.860 negatives,
00:20:54.180 do you all
00:20:55.040 understand that
00:20:57.180 there's no doubt
00:20:58.020 about it,
00:20:59.040 humanity will use
00:21:00.240 different sources
00:21:00.980 of energy in the
00:21:01.900 future?
00:21:03.080 Does anybody
00:21:03.860 doubt that?
00:21:05.800 We're all 100%
00:21:07.100 certain that at
00:21:09.320 some point in the
00:21:10.320 future, we'll
00:21:11.940 probably have other
00:21:12.740 sources.
00:21:13.140 Might be fusion,
00:21:14.640 might be more
00:21:15.360 standard nuclear
00:21:18.140 stuff, might be
00:21:20.400 something else,
00:21:21.160 might be electric,
00:21:22.060 might be more
00:21:22.480 wind, who knows.
00:21:23.880 But we're all
00:21:25.480 sure of that,
00:21:26.080 right?
00:21:26.400 So I would say
00:21:26.980 that what Elon Musk
00:21:27.780 said is something we
00:21:28.580 would all agree with,
00:21:30.060 right?
00:21:31.160 Is there anybody
00:21:31.900 who would disagree
00:21:32.620 with the proposition
00:21:34.720 that we're going to
00:21:36.800 run out of the
00:21:37.460 stuff we use,
00:21:38.340 so it doesn't matter
00:21:39.540 if there's climate
00:21:40.140 change or not,
00:21:40.900 we have to get
00:21:42.020 ready to develop
00:21:43.540 new sources?
00:21:44.800 Everybody?
00:21:46.860 I'm seeing some
00:21:47.760 dissenters, I'll get
00:21:48.580 to you in a moment.
00:21:49.280 All right, the
00:21:50.080 dissenters, you
00:21:50.880 already know it's a
00:21:51.580 trick, right?
00:21:52.680 You know I'm
00:21:53.400 tricking you, you
00:21:54.840 know it.
00:21:56.300 Yeah.
00:21:56.880 Here's what's left
00:21:57.840 out.
00:21:58.520 Here's the dog not
00:21:59.360 barking.
00:22:00.700 It's not about where
00:22:01.980 things are going.
00:22:03.380 We all agree that
00:22:04.580 there's not infinite
00:22:06.180 oil.
00:22:07.520 There's nobody
00:22:08.080 arguing that there
00:22:08.880 is.
00:22:09.140 The only thing
00:22:10.760 we disagree on
00:22:11.500 is the rate of
00:22:12.180 change, and he
00:22:14.140 didn't mention
00:22:14.580 that.
00:22:15.620 The rate of
00:22:16.320 change is the
00:22:16.900 only topic we
00:22:18.100 disagree on.
00:22:19.420 How aggressively,
00:22:21.340 how much of our
00:22:22.740 taxes do you put
00:22:24.500 into it because
00:22:25.180 you're trying to
00:22:25.660 make it happen
00:22:26.160 fast versus
00:22:27.700 letting the free
00:22:28.760 market do whatever
00:22:29.980 it wants, right?
00:22:31.520 So the question
00:22:32.440 was never about
00:22:34.060 whether we need
00:22:35.280 new sustainable
00:22:36.080 energy.
00:22:36.800 Everybody knew
00:22:37.420 that.
00:22:39.140 Everybody knew
00:22:39.960 we needed new
00:22:40.600 ones eventually,
00:22:41.500 whether it's 100
00:22:42.100 years from now
00:22:42.680 or 200 years
00:22:43.340 from now.
00:22:44.140 You still have
00:22:44.780 to do it.
00:22:46.060 But if you
00:22:47.360 have to do it
00:22:48.040 in 20 years,
00:22:49.900 that is an
00:22:50.600 entirely different
00:22:51.340 proposition than
00:22:52.200 you have to do
00:22:52.780 it in 300 years.
00:22:54.500 You can't compare
00:22:55.500 those two.
00:22:56.280 And to imagine
00:22:56.880 that you can
00:23:00.320 leave that out
00:23:01.120 looks a little bit more, let's say, persuasion
00:23:08.300 related than
00:23:09.420 data related.
00:23:12.420 Because Elon
00:23:13.220 Musk has a
00:23:14.960 business that is
00:23:16.340 very much in the
00:23:17.100 business of
00:23:17.720 alternative energy.
00:23:19.500 So of course
00:23:19.940 he would like to
00:23:22.020 do that maybe
00:23:22.880 faster than the
00:23:24.200 rest of us,
00:23:25.080 maybe, or
00:23:25.920 faster than some
00:23:26.760 of us and
00:23:27.580 slower than some
00:23:28.420 people would like
00:23:29.260 it as well.
00:23:29.660 But he does
00:23:30.660 have an
00:23:31.460 enormous
00:23:32.860 financial interest
00:23:34.740 in being a
00:23:36.040 little bit quick
00:23:36.720 about it.
00:23:37.860 Because that's
00:23:38.240 where his money
00:23:38.800 is, is being
00:23:39.400 quick about it.
00:23:40.140 Buy a Tesla
00:23:40.600 now, save the
00:23:41.880 world.
00:23:43.300 But he's not
00:23:43.940 wrong.
00:23:45.260 He's not wrong.
00:23:45.940 You just have to
00:23:46.540 consider that the
00:23:47.440 rate is the real
00:23:48.040 question.
00:23:48.760 Do you upend
00:23:49.400 everything quickly
00:23:50.220 or let it play
00:23:51.480 out?
00:23:51.760 That's really the
00:23:52.380 decision.
00:23:54.720 Well, this was
00:23:55.580 going to happen.
00:23:57.820 You know how
00:23:58.660 the Republicans
00:23:59.980 or just people
00:24:01.840 who lean right
00:24:02.680 like to say
00:24:04.040 that the
00:24:04.400 Democrats are
00:24:05.080 all groomers?
00:24:06.580 I'm sure you
00:24:07.320 have not missed
00:24:07.820 that story.
00:24:08.780 I try not to
00:24:09.940 talk about it
00:24:10.600 because it's
00:24:11.020 just icky.
00:24:11.740 It's not really
00:24:12.300 a topic.
00:24:14.220 I don't know.
00:24:14.540 It's just not
00:24:15.000 worthy of talking
00:24:15.960 about most of
00:24:16.480 the time.
00:24:17.360 But the
00:24:19.000 Democrats put
00:24:19.840 together a
00:24:20.360 compilation video
00:24:21.920 of GOP
00:24:23.340 groomers.
00:24:24.140 And it's a
00:24:26.740 pretty long
00:24:27.280 compilation video.
00:24:29.160 Now, of course,
00:24:29.980 they include
00:24:30.840 historical examples
00:24:32.120 so they can go
00:24:32.840 back a little bit.
00:24:34.220 But if you go
00:24:35.200 back a little bit,
00:24:36.120 you find a whole
00:24:36.800 bunch of people
00:24:37.460 who are
00:24:38.040 Republicans who
00:24:39.840 absolutely
00:24:40.640 were in that
00:24:42.640 groomer category,
00:24:43.800 if you will.
00:24:45.420 So,
00:24:46.460 do you know
00:24:48.280 offhand
00:24:49.040 who does
00:24:51.020 more of it?
00:24:51.680 Now, most
00:24:54.560 of the GOP
00:24:55.280 examples were
00:24:56.160 individuals who
00:24:58.200 did things and
00:24:58.940 got caught.
00:25:00.540 And they
00:25:01.020 seemed like
00:25:01.380 individual acts.
00:25:02.560 Whereas the
00:25:03.320 so-called
00:25:03.980 grooming that
00:25:05.040 is, you know,
00:25:06.080 the alleged
00:25:06.680 grooming, I'll
00:25:07.300 call it, on
00:25:08.380 the left is
00:25:09.540 industrial level.
00:25:11.700 Right?
00:25:12.640 It's not that
00:25:13.480 people on the
00:25:14.200 right are saying,
00:25:15.220 hey, there's
00:25:15.720 that one groomer
00:25:16.520 over there.
00:25:17.740 Like, sometimes
00:25:18.400 they talk about
00:25:18.980 Epstein or
00:25:19.480 something.
00:25:20.400 But mostly,
00:25:21.680 it's about
00:25:22.260 if you've
00:25:23.120 got a teacher
00:25:23.860 or somebody
00:25:25.680 who has power,
00:25:27.020 yeah, it's like
00:25:27.480 systemic grooming.
00:25:28.740 That's the
00:25:29.140 complaint.
00:25:29.660 So they're not
00:25:30.160 really comparable.
00:25:31.780 But if you
00:25:33.460 imagine that all
00:25:34.280 the groomers are
00:25:35.040 on one political
00:25:35.820 side, well,
00:25:38.600 maybe it will
00:25:39.260 change your mind
00:25:40.160 on that.
00:25:41.200 I don't think
00:25:41.880 anybody thought
00:25:42.380 that, but it's
00:25:43.500 a pretty good
00:25:43.900 play.
00:25:44.580 So just in
00:25:45.260 terms of
00:25:45.640 persuasion, I
00:25:47.220 thought the
00:25:47.580 Dems did a
00:25:48.240 good job.
00:25:48.700 It's completely
00:25:50.560 let's say
00:25:52.940 manipulative
00:25:53.680 and propaganda
00:25:55.000 and is not
00:25:56.300 based on
00:25:56.860 proper context
00:25:58.020 and all that.
00:25:59.040 So it's not
00:25:59.780 good, but
00:26:02.140 it is a
00:26:03.200 well-executed
00:26:04.140 persuasion.
00:26:05.900 It's a pretty
00:26:06.320 good Me Too
00:26:06.880 play.
00:26:07.440 Not Me Too,
00:26:08.640 what do you
00:26:08.960 call it?
00:26:09.920 You Too?
00:26:10.700 More of a
00:26:11.280 you too than
00:26:11.820 a me too.
00:26:12.360 One of my
00:26:17.820 guilty pleasures
00:26:18.640 lately is
00:26:19.340 watching
00:26:19.800 Jonathan Turley
00:26:21.420 dunking on
00:26:22.980 Adam Schiff.
00:26:24.300 If you haven't
00:26:25.020 caught any of
00:26:26.020 Jonathan Turley's
00:26:26.880 articles or
00:26:27.740 tweet threads
00:26:28.520 about Adam
00:26:30.020 Schiff, the
00:26:30.840 most storied
00:26:32.540 liar of our
00:26:34.180 day, it's
00:26:35.780 really good
00:26:36.240 stuff.
00:26:37.300 Like, just to
00:26:37.900 watch how
00:26:38.580 capably he
00:26:42.060 lays out the
00:26:43.160 case.
00:26:44.160 Turley's great.
00:26:45.320 Like, I can't
00:26:46.500 get enough of
00:26:47.120 anything he
00:26:47.640 writes.
00:26:48.360 I see a
00:26:48.880 tweet from
00:26:49.280 him, and
00:26:50.040 I'm definitely
00:26:50.840 going to read
00:26:51.200 that.
00:26:52.540 He does good
00:26:53.080 stuff.
00:26:55.700 So he points
00:26:56.620 out that the
00:26:58.380 Schiff was
00:26:59.200 publicly,
00:27:01.820 his office was
00:27:03.460 demanding all
00:27:04.140 kinds of
00:27:04.580 censorship.
00:27:06.000 At the same
00:27:06.720 time he was
00:27:07.260 lying about
00:27:07.800 stuff in
00:27:08.240 public.
00:27:08.580 So he was
00:27:10.100 lying in
00:27:10.800 public while
00:27:12.620 trying really
00:27:13.400 hard to get
00:27:14.460 the, at
00:27:15.580 least Twitter,
00:27:17.260 to suppress
00:27:19.100 his critics.
00:27:20.220 While he was
00:27:21.160 lying in
00:27:21.900 public, he
00:27:23.600 was moving
00:27:24.460 really hard,
00:27:25.300 not just like
00:27:25.880 one phone call
00:27:26.700 or something,
00:27:27.200 but a whole
00:27:28.020 program of
00:27:29.360 trying to
00:27:29.780 suppress the
00:27:30.380 people who
00:27:30.760 were calling
00:27:31.280 him out for
00:27:32.480 lying in
00:27:32.960 public.
00:27:36.080 Jenny.
00:27:38.580 Stop it,
00:27:39.220 Jenny.
00:27:43.240 It's amazing.
00:27:44.960 It's just
00:27:45.400 amazing.
00:27:47.260 Yeah.
00:27:48.860 Like, to
00:27:49.460 watch Schiff
00:27:50.280 operate.
00:27:51.100 I think
00:27:51.700 Schiff is,
00:27:53.040 he has turned
00:27:53.780 into a national
00:27:54.540 asset accidentally.
00:27:56.740 Like, I think he
00:27:57.280 was just the
00:27:57.700 worst person.
00:27:58.540 Like, character
00:27:59.940 wise, I've never
00:28:00.680 seen anybody lower
00:28:01.360 character than
00:28:02.160 Adam Schiff.
00:28:03.440 But, because
00:28:04.840 he's now so
00:28:05.640 well-known as
00:28:06.420 the signal of
00:28:07.220 lying, that
00:28:08.740 whenever he gets
00:28:10.400 pushed forward
00:28:11.160 to lie for the
00:28:12.540 Democrats, you
00:28:13.660 can say to
00:28:14.080 yourself, oh,
00:28:15.480 they sent the
00:28:16.080 liar.
00:28:17.680 And whenever
00:28:18.160 there's a scandal
00:28:18.940 that's not really
00:28:19.800 anything, who
00:28:22.060 did they send?
00:28:23.160 Bernstein, to
00:28:24.620 say it's worse
00:28:25.120 than Watergate.
00:28:26.160 Right?
00:28:26.500 So if you see
00:28:27.300 Bernstein, or you
00:28:29.120 see Adam Schiff,
00:28:30.980 or you see Eric
00:28:32.000 Swalwell, the
00:28:33.780 one thing you
00:28:34.380 can know is
00:28:35.640 that honest
00:28:37.120 Democrats aren't
00:28:38.140 willing to say
00:28:38.760 those things.
00:28:40.560 Or Brennan,
00:28:41.540 right?
00:28:41.820 Brennan and
00:28:42.260 Clapper, too.
00:28:43.360 Right.
00:28:43.760 And I mean that
00:28:44.360 literally, because
00:28:45.980 there are a lot
00:28:47.100 of honest
00:28:47.660 Democrats.
00:28:49.520 Lots of
00:28:50.180 them.
00:28:51.100 The honest
00:28:51.700 ones won't go
00:28:52.360 anywhere near a
00:28:53.140 camera when
00:28:54.640 their team is
00:28:55.600 presenting a lie.
00:28:56.680 So they send
00:28:57.140 Schiff, and
00:28:57.820 you know, they're
00:28:59.080 designated liars.
00:29:00.260 So it's actually
00:29:00.880 useful now,
00:29:02.000 to know when
00:29:02.500 they're lying
00:29:03.000 because they
00:29:03.460 just tell you.
00:29:04.620 Oh, by the
00:29:05.160 way, this is
00:29:05.600 just lying.
00:29:06.880 Now, I'm
00:29:07.560 going to
00:29:07.740 disagree with
00:29:08.300 you on
00:29:08.560 Ted Lieu.
00:29:10.620 I'm more of
00:29:11.340 a Ted Lieu
00:29:12.000 fan than
00:29:13.080 you're going
00:29:13.740 to suspect.
00:29:15.040 Right?
00:29:15.280 So he and I
00:29:16.040 have had some
00:29:16.600 exchanges on
00:29:17.940 Twitter as
00:29:18.420 well.
00:29:19.420 But he's
00:29:21.280 just a
00:29:22.000 partisan.
00:29:23.440 He's just a
00:29:23.960 partisan.
00:29:25.280 He's not,
00:29:26.800 like, he's just
00:29:27.380 pushing his side,
00:29:28.380 and, you know,
00:29:28.820 he's doing it in
00:29:29.420 public.
00:29:30.680 Is he right?
00:29:31.800 No.
00:29:33.020 But is the
00:29:33.620 other team right
00:29:34.280 all the time?
00:29:35.020 No.
00:29:36.460 I just don't see
00:29:37.500 him anywhere near
00:29:38.760 the Schiff and
00:29:41.360 Swalwell level.
00:29:42.620 He just seems
00:29:43.180 like a
00:29:43.480 partisan.
00:29:46.460 Gaetz was on
00:29:47.300 Timcast last
00:29:48.080 night.
00:29:48.540 Somebody says
00:29:49.020 that was fire.
00:29:50.840 Yeah.
00:29:51.080 I like Ted
00:29:51.580 Lieu.
00:29:52.120 I don't know.
00:29:52.600 I feel like I
00:29:53.220 could hang with
00:29:53.760 him.
00:29:54.340 He just seems
00:29:54.760 like he'd be a
00:29:55.280 fun guy.
00:29:55.660 I tend to
00:29:58.000 like anybody
00:29:58.420 who's got a
00:29:58.800 sense of
00:29:59.180 humor.
00:30:01.460 Just sort of,
00:30:02.820 you know, a
00:30:03.280 general statement.
00:30:05.920 All right.
00:30:06.420 Kari Lake
00:30:07.240 is still, I
00:30:09.380 guess she had
00:30:09.780 some court
00:30:11.080 success in
00:30:12.080 pressing her
00:30:13.440 case, so
00:30:14.080 basically she
00:30:14.960 just gets the
00:30:16.240 chance to
00:30:16.720 press her
00:30:17.180 case.
00:30:18.360 And two
00:30:19.280 of the things
00:30:19.740 that she's
00:30:20.140 claiming about
00:30:20.800 the Arizona
00:30:21.560 election that
00:30:22.840 she thinks
00:30:23.300 were illegitimate
00:30:24.720 are that 300,000
00:30:26.300 or more
00:30:26.900 ballots did
00:30:28.180 not have a
00:30:28.800 proper chain
00:30:29.400 of custody.
00:30:31.280 And number
00:30:31.880 two, over
00:30:33.000 100,000 ballots
00:30:33.960 with faulty
00:30:34.520 signature
00:30:35.040 verification got
00:30:38.080 through.
00:30:38.920 So the
00:30:39.480 Arizona law
00:30:40.200 says that
00:30:40.780 they must be
00:30:41.760 verified, but
00:30:43.200 her claim is
00:30:43.980 that over
00:30:45.040 100,000 were
00:30:45.820 not.
00:30:46.740 Now, here's
00:30:47.680 my problem.
00:30:48.260 Here's my
00:30:54.580 problem.
00:30:57.640 How would a
00:30:58.680 court ever
00:30:59.240 act on that?
00:31:00.240 I don't even
00:31:00.700 see how a
00:31:01.240 court would do
00:31:01.720 anything.
00:31:02.420 Because isn't
00:31:03.380 the court going
00:31:03.920 to say, yeah,
00:31:05.520 yep, good
00:31:06.480 points.
00:31:07.600 Arizona didn't
00:31:08.540 do a good job
00:31:09.380 of following
00:31:09.880 their own
00:31:10.280 procedures.
00:31:13.040 Thanks for
00:31:13.660 telling me.
00:31:14.080 I think the
00:31:17.760 Constitution just
00:31:18.660 says it's up to
00:31:19.340 the states.
00:31:20.340 I don't believe
00:31:21.300 the Constitution
00:31:21.880 says the states
00:31:22.880 must operate
00:31:23.580 perfectly, according
00:31:25.300 to you, or it
00:31:26.900 doesn't count.
00:31:28.540 Now, it would be
00:31:29.240 different if there
00:31:30.080 were direct
00:31:30.960 evidence of a
00:31:31.680 crime.
00:31:32.760 But if the only
00:31:33.780 problem is that
00:31:35.000 there was the
00:31:35.960 potential for
00:31:36.740 rigging, not
00:31:38.240 direct evidence,
00:31:39.160 but just potential
00:31:40.020 for rigging, what
00:31:41.680 does the court do
00:31:42.300 about that?
00:31:42.840 Doesn't the
00:31:46.280 court just say,
00:31:47.540 well, you
00:31:48.580 know, get a
00:31:49.220 better state.
00:31:50.340 It's up to the
00:31:51.020 state.
00:31:51.540 If the state
00:31:53.620 did not give
00:31:54.740 you the
00:31:55.200 transparency you
00:31:56.260 wanted, maybe
00:31:58.320 you should talk
00:31:58.960 to the state.
00:32:01.600 Order a
00:32:02.260 full audit?
00:32:03.600 I don't know.
00:32:04.480 I can't see
00:32:05.580 them decertifying
00:32:06.600 a governor race
00:32:08.860 at this point.
00:32:10.580 So, yeah,
00:32:10.980 here's the
00:32:11.640 thing.
00:32:11.840 If you're
00:32:13.220 making a
00:32:14.200 strict technical
00:32:15.440 argument, then
00:32:17.220 maybe you can
00:32:17.780 make a case that
00:32:18.580 the court could
00:32:19.320 do something on
00:32:20.080 a strictly
00:32:20.620 technical argument.
00:32:22.460 But I don't
00:32:23.080 think the courts
00:32:23.620 act that way when
00:32:24.600 it comes to
00:32:25.060 elections.
00:32:26.000 I think the
00:32:26.700 courts will
00:32:27.240 always favor
00:32:28.400 stability of
00:32:30.300 the system.
00:32:32.360 And so the
00:32:32.920 court, knowing
00:32:33.760 that if they
00:32:34.400 act, let's say
00:32:35.280 overthrowing
00:32:35.860 election, it
00:32:36.880 would create
00:32:37.920 instability in the
00:32:39.220 system, I don't
00:32:40.300 think they'll
00:32:40.660 act.
00:32:41.920 I think the
00:32:42.660 court is going
00:32:43.280 to say, yeah,
00:32:44.220 that looks pretty
00:32:44.840 sketchy, but you
00:32:46.520 work it out.
00:32:47.440 It's better if you
00:32:48.160 work it out than we
00:32:48.980 work it out.
00:32:49.780 I think that's
00:32:50.280 where it's going.
00:32:51.360 Now, that doesn't
00:32:52.240 mean she shouldn't
00:32:52.900 press the case.
00:32:54.600 Because I love the
00:32:55.700 fact that she's
00:32:56.540 highlighting, you
00:32:58.120 know, two
00:32:58.640 vulnerabilities in
00:32:59.600 her system that
00:33:00.220 probably are common
00:33:01.100 to other states,
00:33:02.620 and maybe everybody
00:33:03.580 should be looking
00:33:04.100 into it.
00:33:04.520 So it's a good,
00:33:06.500 solid, you know,
00:33:08.500 patriotic thing, and
00:33:11.000 I think we should
00:33:11.660 all be, even if
00:33:13.340 you disagree with,
00:33:14.860 you know, her
00:33:15.320 case, you should
00:33:16.880 all be happy she's
00:33:17.700 fighting it.
00:33:18.940 This is exactly the
00:33:20.140 kind of fight that
00:33:21.560 you want to maybe
00:33:23.200 get some more
00:33:23.680 transparency in your
00:33:24.560 system, make people
00:33:25.860 a little bit more
00:33:26.460 alert where the
00:33:27.320 problems are.
00:33:28.400 It's all good.
00:33:29.460 No matter how this
00:33:30.260 ends, Kari Lake
00:33:31.280 is fighting a good
00:33:32.480 fight.
00:33:32.780 So I think we're
00:33:33.420 going to come out
00:33:33.880 ahead one way or
00:33:35.160 another.
00:33:35.520 I don't think
00:33:35.920 she's going to
00:33:36.380 win, but I think
00:33:37.540 we'll come out
00:33:38.580 ahead.
00:33:41.260 Did you see the
00:33:42.220 new, is it the
00:33:46.160 CDC who's saying
00:33:47.040 it, that the new
00:33:48.140 variant of the
00:33:48.900 virus is more
00:33:50.320 likely to hit the
00:33:51.880 vaccinated and
00:33:53.120 already infected?
00:33:55.400 Do you believe
00:33:56.160 that?
00:33:58.520 It's the and
00:33:59.840 already infected.
00:34:01.380 Do you believe
00:34:02.020 that there's a
00:34:02.600 variant that is
00:34:04.160 more likely to
00:34:05.360 infect you if
00:34:07.100 you'd already
00:34:07.840 been infected
00:34:08.560 with a variant
00:34:09.840 of that variant?
00:34:12.240 Is that, is that
00:34:13.960 passing your
00:34:14.620 sniff test?
00:34:16.600 Now, the problem
00:34:18.200 is that I don't
00:34:18.880 see a mechanism,
00:34:20.280 right?
00:34:20.500 There's no
00:34:20.940 proposed mechanism
00:34:21.880 for why that could
00:34:22.640 be the case.
00:34:23.600 But with the
00:34:24.460 vaccinations, there
00:34:27.440 are smart people
00:34:28.360 who propose a
00:34:29.940 potential mechanism,
00:34:31.480 mechanism by which
00:34:32.940 they could cause
00:34:33.580 harm to some
00:34:34.200 people.
00:34:35.040 That's a well
00:34:36.040 described mechanism.
00:34:37.480 But having a
00:34:38.880 prior infection,
00:34:40.720 everything we know
00:34:42.100 about that says
00:34:42.820 that should make you
00:34:43.520 stronger against
00:34:45.000 anything related to
00:34:46.180 it, or at least
00:34:47.520 the same, under no
00:34:49.840 case have we ever
00:34:50.800 seen, correct me if
00:34:52.060 I'm wrong, but does
00:34:53.660 it make sense you'd
00:34:54.360 ever be weaker?
00:34:55.920 Weaker in that
00:34:56.700 specific way?
00:34:58.740 I don't know.
00:35:00.180 Well, let me ask
00:35:01.860 you this.
00:35:04.180 I sometimes go
00:35:06.360 entire days without
00:35:07.940 human contact.
00:35:09.820 Like, actually none.
00:35:11.980 Yeah, like one day
00:35:12.700 this week I talked to
00:35:13.800 no people in person.
00:35:15.680 Except I guess I
00:35:16.720 ordered something at
00:35:17.540 Starbucks.
00:35:18.600 That was as close as
00:35:19.740 I got to human
00:35:20.340 contact.
00:35:20.700 Now, would a
00:35:23.860 person like me, who
00:35:25.820 can, I can socially
00:35:28.020 isolate better than
00:35:29.200 almost anybody, would
00:35:30.900 a person like me be
00:35:31.800 inclined to get, like,
00:35:33.220 booster after booster?
00:35:35.200 Less likely.
00:35:36.380 Now, there are other
00:35:37.200 reasons, right?
00:35:38.040 That's not the only
00:35:38.640 reason, but much less
00:35:39.780 likely.
00:35:40.560 Now, suppose I had a
00:35:41.640 job where I was going
00:35:43.320 to be around people,
00:35:44.480 like, hordes of people
00:35:45.980 all the time.
00:35:47.740 Would I be more likely
00:35:49.240 to have been boosted?
00:35:50.900 Now, again, at my
00:35:51.720 age, right?
00:35:52.320 We're not talking
00:35:52.860 about younger people.
00:35:53.740 That's a whole
00:35:54.060 different calculation.
00:35:55.400 But at my age, would
00:35:57.240 I be more likely to
00:35:58.420 be boosted if I knew
00:35:59.500 I was just going to
00:35:59.960 be surrounded by
00:36:00.980 possibly infected
00:36:02.500 people all the time?
00:36:03.720 I think so.
00:36:04.960 I think so.
00:36:05.880 I think that I would
00:36:07.040 be more likely to
00:36:07.900 boost because I would
00:36:09.380 talk myself into that
00:36:10.560 being a danger.
00:36:12.360 So, how do we
00:36:13.660 know that the
00:36:15.560 people who have the
00:36:16.540 most obvious chance
00:36:18.580 of getting infected
00:36:19.440 aren't the ones
00:36:20.760 who quite reasonably
00:36:21.660 are more likely to
00:36:22.720 be vaccinated and
00:36:23.600 boosted?
00:36:25.200 Wouldn't you expect
00:36:27.040 that the more
00:36:28.600 boosted you are,
00:36:29.400 the more likely
00:36:29.920 you would get
00:36:30.320 infected?
00:36:31.620 That's what I'd
00:36:32.380 expect.
00:36:35.260 But I'm not sure
00:36:36.520 that that works
00:36:37.400 for natural immunity.
00:36:39.240 Does it feel to you
00:36:40.540 like they might have
00:36:41.320 thrown in the natural
00:36:42.320 immunity part so you
00:36:44.260 wouldn't suspect it's
00:36:45.180 the vaccinations?
00:36:45.860 Doesn't it feel like
00:36:48.500 that's what they did?
00:36:49.920 It's like, really?
00:36:51.360 Because one of them
00:36:52.040 has a described
00:36:53.000 mechanism and the
00:36:55.580 other one's never
00:36:56.300 happened.
00:36:59.720 It feels like a
00:37:00.780 little sketchy that
00:37:01.820 they put them in the
00:37:02.580 same sentence.
00:37:03.680 You know, if you
00:37:05.360 got vaccinated or
00:37:07.300 natural immunity,
00:37:09.400 I'm not buying any of
00:37:11.600 that.
00:37:11.800 Now, to his credit,
00:37:14.740 do you know what
00:37:15.180 Fauci said about it?
00:37:17.140 Fauci said the data
00:37:18.120 might be bullshit.
00:37:20.860 Okay.
00:37:21.900 I'm going to give him
00:37:22.820 that.
00:37:24.220 I'm going to give him
00:37:25.440 that.
00:37:26.220 He actually said the
00:37:27.040 data might be wrong.
00:37:28.060 Because apparently it's
00:37:29.040 something that our
00:37:30.720 data is showing, but
00:37:31.700 other countries are not
00:37:32.700 buying into this
00:37:33.440 analysis.
00:37:35.320 Right?
00:37:35.880 So there's something
00:37:36.740 sketchy about the
00:37:37.600 whole thing.
00:37:38.680 Either the data is
00:37:39.540 wrong, or if the data
00:37:40.940 is right, they're
00:37:41.720 lying to us, maybe by
00:37:43.620 throwing in the
00:37:44.240 natural immunity
00:37:44.900 people.
00:37:45.760 I mean, to me, this
00:37:46.460 screams bad data, or
00:37:48.020 bad analysis.
00:37:49.520 I don't know.
00:37:50.720 I mean, we could be
00:37:51.720 wrong.
00:37:53.380 I'll put everything in
00:37:54.260 a statistical frame,
00:37:57.320 not a yes or no.
00:37:59.500 But certainly sketchy
00:38:00.760 looking.
00:38:05.560 All right.
00:38:06.500 Here's where I want
00:38:07.800 to get the Clot
00:38:09.160 Burtz all clicking
00:38:11.260 away.
00:38:12.000 Click, click, click,
00:38:12.560 click, click, click.
00:38:13.440 Clot Burtz.
00:38:15.460 I'm going to say
00:38:16.220 something that will
00:38:17.260 make you mistakenly
00:38:18.700 believe that I have a
00:38:20.740 different opinion than
00:38:21.540 I do.
00:38:22.480 Now, the rest of you
00:38:23.340 will completely be on
00:38:24.760 board, and it'll be
00:38:25.500 easy to understand.
00:38:26.640 But the Clot Burtz are
00:38:27.820 just binary.
00:38:28.640 It's like, loves this,
00:38:29.900 hates this.
00:38:30.820 Wait, wait, wait.
00:38:31.360 You said something
00:38:31.900 about this?
00:38:32.300 You must hate it.
00:38:33.440 All right.
00:38:34.360 So the rest of you
00:38:35.360 will be fine.
00:38:35.900 It goes like this.
00:38:39.620 I'm going to give you
00:38:40.220 a little wind-up here.
00:38:42.840 There's one thing that
00:38:43.980 artists add to the
00:38:46.120 system that sometimes
00:38:47.800 scientists and engineers
00:38:49.260 and lawyers will miss.
00:38:51.700 What is it?
00:38:53.100 What is it that artists
00:38:54.400 will sometimes see that
00:38:57.060 engineers and scientists
00:38:58.600 and other people don't
00:39:01.280 see?
00:39:02.820 Now, not feelings.
00:39:05.180 Maybe that's true,
00:39:06.220 but that's not where
00:39:06.760 I'm going.
00:39:07.460 I'm not going for the
00:39:08.260 feelings part.
00:39:11.360 I just want to, not
00:39:12.660 nuance.
00:39:14.100 No, there's something
00:39:15.000 really big that artists
00:39:17.500 see that other people
00:39:19.060 don't even look for.
00:39:20.580 In other words, we're
00:39:21.380 actually actively looking
00:39:22.660 for something that
00:39:24.220 nobody else looks for.
00:39:26.580 Ooh.
00:39:27.560 A little connection
00:39:28.360 problem there.
00:39:30.120 All right.
00:39:30.520 Here's the answer.
00:39:31.280 Empty space.
00:39:35.440 Empty space.
00:39:37.300 If I'm going to design,
00:39:39.860 let's say, a graphic art,
00:39:42.020 if I'm going to design
00:39:45.200 something, I will look
00:39:47.040 for where I put stuff
00:39:48.200 in the picture, but I'll
00:39:49.640 make sure that there's a
00:39:50.540 big empty space, a
00:39:52.240 negative space, because
00:39:53.600 that's what makes the
00:39:54.620 composition work.
00:39:56.300 It's the, not just
00:39:57.360 where you put stuff,
00:39:58.900 it's where you decide to
00:39:59.880 not put stuff.
00:40:01.280 So I actively look for
00:40:03.420 the missing parts.
00:40:05.120 I often talk about the
00:40:06.180 dog not barking.
00:40:08.300 I believe I've just
00:40:09.600 trained my brain to look
00:40:10.860 for the part that's
00:40:11.640 missing, because humor
00:40:13.060 does that too.
00:40:14.100 If I'm looking at a
00:40:15.060 situation and I want to
00:40:16.260 find the joke, I don't
00:40:18.320 say what's there, I say
00:40:20.560 what's missing, as well
00:40:22.240 as what's there.
00:40:22.800 So I'm always actively
00:40:24.600 looking for what's
00:40:25.480 missing.
00:40:26.620 Now, we have this big
00:40:28.420 conversation about
00:40:29.180 excess mortality.
00:40:30.920 Some say it might be
00:40:32.680 COVID.
00:40:33.780 Some say it might be the
00:40:36.280 vaccination.
00:40:38.080 What does the artist say?
00:40:39.140 What if there's a third
00:40:45.880 mass killer and it's
00:40:48.460 invisible to us because
00:40:50.700 we're focused on the
00:40:51.900 other two things, the
00:40:53.100 COVID and the vax?
00:40:54.900 Now, of course, we think
00:40:56.700 there's some disruption
00:40:57.920 from the shutdowns
00:40:59.660 themselves.
00:41:00.640 You know, there's some
00:41:01.140 delayed medical things and
00:41:02.440 everything.
00:41:03.420 But do all of those
00:41:05.880 explanations feel like
00:41:07.120 they're capturing the
00:41:08.580 amount of death that we
00:41:09.740 might be experiencing?
00:41:11.680 Does it feel like there
00:41:12.740 might be something else?
00:41:15.020 And if there were, if it
00:41:17.180 were fentanyl, we'd
00:41:17.940 probably have noticed it
00:41:18.760 by now.
00:41:19.500 So that is something
00:41:20.520 else.
00:41:21.080 But I'm going to give
00:41:23.380 you a hypothesis that I
00:41:25.640 don't present as, you
00:41:27.720 know, like a conclusion.
00:41:28.920 So I'm not going to say
00:41:29.620 this is for sure.
00:41:31.080 I'm going to give you
00:41:32.040 a hypothesis that
00:41:34.740 there's something really
00:41:35.740 big and really obvious
00:41:37.700 that's killing people,
00:41:39.620 maybe not instead of
00:41:41.540 vaccinations or COVID,
00:41:44.360 but on top of, on top
00:41:46.780 of.
00:41:47.400 So here's where the
00:41:48.700 Clot Burtz will get
00:41:49.540 confused.
00:41:51.080 It's a fact that
00:41:53.000 vaccinations kill some
00:41:54.360 people.
00:41:55.680 We just don't know what
00:41:56.820 the percentage is.
00:41:58.020 Like, even the people who
00:41:58.980 are the most pro-vaccination
00:42:01.180 understand that we're all
00:42:03.360 built differently and some
00:42:04.440 people are going to die.
00:42:05.720 It's like true of a lot of
00:42:06.640 medications.
00:42:07.960 So there's not a question
00:42:09.440 about whether the
00:42:10.340 vaccinations are adding to
00:42:12.460 excess mortality or, you
00:42:15.140 know, that as well as the
00:42:16.660 COVID itself.
00:42:18.100 But what if, what if there
00:42:20.600 were a third mass killer?
00:42:23.400 Would you know it?
00:42:25.500 Let me ask you the first
00:42:26.640 question.
00:42:26.940 If, if there were something
00:42:28.580 else going on, would you
00:42:30.260 buy the, the, the first
00:42:32.500 assertion that we wouldn't
00:42:36.420 notice?
00:42:37.280 We wouldn't notice, right?
00:42:40.540 Now, here's my question.
00:42:42.240 What else changed at about
00:42:44.420 the same time as the
00:42:45.600 vaccinations were rolled
00:42:46.680 out?
00:42:47.480 If it's the only thing that
00:42:48.700 really changed, then I think,
00:42:51.380 you know, then it probably
00:42:53.340 makes perfect sense to focus
00:42:54.880 on it.
00:42:55.300 If it's the only thing that
00:42:56.280 changed.
00:42:57.400 But is it?
00:42:58.640 Is it the only thing that
00:42:59.720 changed?
00:43:01.440 I don't know.
00:43:03.140 Did you know that stress can
00:43:06.440 cause cardiac arrest?
00:43:10.720 Did you all know that an
00:43:12.080 increase in stress kills you?
00:43:15.480 I mean, not every person, but
00:43:16.800 statistically.
00:43:17.980 All right.
00:43:20.020 Let me take you back to 1970.
00:43:23.240 You're going to love where this
00:43:24.220 is going.
00:43:25.880 1970.
00:43:28.720 I was alive then.
00:43:30.940 You know, a young person.
00:43:32.440 Let's say I had a stressful day.
00:43:35.560 Oh, a bunch of stress.
00:43:37.360 And then I wanted to recover
00:43:39.300 from the stress.
00:43:40.520 What did I do to recover from
00:43:42.180 the stress?
00:43:43.680 Well, I'd usually just sit around
00:43:45.300 bored or I'd like pick the
00:43:48.360 bark off a stick or I'd sit on
00:43:50.900 a wall and wait for a car to go
00:43:52.600 by.
00:43:54.440 Or as a helpful person said,
00:43:57.200 I'd beat off.
00:43:58.820 I would go for a walk.
00:44:00.240 I'd play some sports.
00:44:02.520 Right?
00:44:03.140 And if I did any of those
00:44:04.660 things, my cortisol levels and
00:44:07.500 my adrenaline would go down.
00:44:09.960 I'd get back into a healthy mode
00:44:11.420 for a while.
00:44:11.860 But then later, you know,
00:44:13.660 some stressful thing would
00:44:14.680 happen.
00:44:15.780 So I'd be peaking and
00:44:18.020 valleying all day long.
00:44:19.440 Right?
00:44:19.640 Peaking and valleying.
00:44:21.040 Now, one assumes,
00:44:22.460 one assumes that the low points
00:44:27.560 are where you regain your health.
00:44:31.020 Imagine if you never got to
00:44:32.600 relax.
00:44:34.280 What if you had high stress
00:44:35.920 followed by more stress,
00:44:39.240 followed by more stress and
00:44:41.760 it never stopped?
00:44:44.300 All right.
00:44:44.860 What do you do in 2023
00:44:46.500 after a stressful situation?
00:44:49.960 So you feel stressed.
00:44:52.500 What do you do?
00:44:53.860 I'll tell you what I do.
00:44:55.860 Check my phone.
00:44:57.780 Check my phone.
00:44:59.380 So I'll be up in like high
00:45:00.640 stress area and I'll check my
00:45:02.160 phone.
00:45:03.180 And then my stress goes up a
00:45:05.980 little bit.
00:45:06.360 And then I got to go do
00:45:09.460 something else.
00:45:10.080 I can't play on my phone all
00:45:11.060 day.
00:45:11.440 So I go off to a stressful
00:45:13.380 situation and I stay there.
00:45:15.980 But now I've got a break.
00:45:18.000 I've got a break.
00:45:19.320 So I pick out my phone and I
00:45:21.060 look at Twitter and I'm like,
00:45:22.040 ah!
00:45:23.160 And my stress stays the same.
00:45:26.680 If I simply describe that
00:45:28.460 situation to you and then I
00:45:30.960 add on top of it something we
00:45:32.480 know for sure.
00:45:33.800 One thing we know for sure is
00:45:35.600 that all change causes
00:45:37.820 stress.
00:45:39.480 If you get married, it's
00:45:40.900 stressful.
00:45:41.320 If you get divorced, if you
00:45:42.440 change jobs, if you get a
00:45:44.080 promotion, it's stressful.
00:45:45.220 If you get fired, it's
00:45:46.200 stressful.
00:45:47.020 If the weather changes, it's
00:45:48.580 stressful.
00:45:49.580 If your finances change, it's
00:45:51.040 stressful.
00:45:51.660 If your relationships, you
00:45:52.780 know, everything.
00:45:54.160 Every change.
00:45:55.420 What is a bigger change than
00:45:57.740 the pandemic in your life?
00:46:00.880 For me, nothing.
00:46:02.780 That was it.
00:46:03.700 The pandemic was the biggest
00:46:05.620 stressor of my life.
00:46:07.640 How many of you would agree?
00:46:09.540 The biggest stressor that's not
00:46:10.940 part of your actual personal
00:46:12.360 experience, but, you know, the
00:46:13.740 external stressor.
00:46:16.800 Now, if you went to war, that'd be
00:46:18.540 worse.
00:46:19.580 But in my lifetime, I haven't
00:46:21.880 gone to war.
00:46:22.760 So, I asked on Twitter, I did a
00:46:27.880 little, you know, non-scientific
00:46:29.500 poll, and I said, rank your stress,
00:46:32.420 you know, before the pandemic
00:46:33.740 compared to, you know, during and
00:46:35.980 after.
00:46:37.420 And 31% said it's about the same,
00:46:40.960 stress level's about the same.
00:46:43.560 28% say it's a bit higher now than
00:46:46.360 before the pandemic.
00:46:47.140 And 24% say it's much higher.
00:46:50.820 Much higher.
00:46:53.740 And 17% say it's actually less
00:46:55.880 stress.
00:46:56.680 Now, the way I asked the question
00:46:58.960 was a problem, and, you know, it's a
00:47:01.760 non-scientific poll, so don't put too
00:47:03.740 much credibility into it.
00:47:05.120 But the way I asked the question, and
00:47:06.860 I could tell from the comments, people
00:47:09.080 thought that when I said stress, that
00:47:11.860 what I really meant was, do you trust
00:47:13.760 the so-called vaccination or not?
00:47:17.020 And that wasn't the question.
00:47:18.440 So, we're so primed to see it as
00:47:22.100 anti-vax or pro-vax that I asked about
00:47:25.200 people's stress in general, and they
00:47:27.620 answered it like I was asking about
00:47:29.160 vaccinations, which, you know, was a
00:47:32.660 small part of the whole.
00:47:34.640 So, people can't even answer a simple
00:47:36.840 poll in 2023 without thinking it's
00:47:40.260 about vaccinations.
00:47:41.080 Like, we're just so primed for war,
00:47:44.700 yes or no, conflict.
00:47:47.300 So, now imagine that we were in a
00:47:53.620 period in America where we had the
00:47:55.440 lowest level of trust in our leaders,
00:47:59.560 and that at the same time, we had the
00:48:02.300 biggest challenge they've ever had.
00:48:04.640 You know, maybe World War II was
00:48:05.840 bigger, but in, you know, in my
00:48:07.840 lifetime.
00:48:08.620 The pandemic was the biggest
00:48:09.720 leadership challenge.
00:48:11.280 So, imagine not trusting your own
00:48:13.140 leaders before the pandemic even
00:48:15.220 started.
00:48:16.540 And then you've got a pandemic, and
00:48:19.080 then all the leaders and the experts
00:48:20.520 start looking like maybe they're not
00:48:22.120 on your side so much.
00:48:23.440 It looks like a money grab.
00:48:25.500 And the, you know, the data and the
00:48:27.040 fog of war, you don't trust your data.
00:48:29.340 How stressful would it be to have one
00:48:31.960 of the biggest challenges at the same
00:48:34.280 time that you think the people in
00:48:35.600 charge of it are lying to you and
00:48:38.160 incompetent and maybe corrupt?
00:48:41.400 I can't even imagine anything more
00:48:42.940 stressful than that, you know, short of
00:48:44.380 war.
00:48:45.180 You know, let me be clear.
00:48:48.340 The Ukrainians have it a lot worse.
00:48:50.740 All right?
00:48:51.540 Just put that out there.
00:48:52.600 We're not comparing it to war.
00:48:55.340 But in terms of just ordinary life
00:48:57.320 stress, I don't think it's ever been
00:48:59.520 higher.
00:48:59.800 If I had told you, I'm going to take
00:49:04.440 25% of the public, and I'm going to
00:49:08.460 make them more stressed than they've
00:49:10.380 ever been in their life, do you get
00:49:13.220 more heart attacks?
00:49:15.440 Yes or no?
00:49:17.120 Do you get more heart attacks if I take
00:49:19.800 25% of the public and stress them beyond
00:49:22.080 what they've ever been stressed?
00:49:25.120 I think so.
00:49:26.500 Now, I'm not sure if that would
00:49:27.980 necessarily have athletes falling
00:49:30.540 over.
00:49:31.320 Do you?
00:49:32.000 I don't know that that would make
00:49:33.420 athletes start falling over in bigger
00:49:35.180 numbers.
00:49:35.960 Like, I kind of expect that would
00:49:37.340 affect people my age, you know, not
00:49:39.920 elite athletes.
00:49:41.540 But let me give you another hypothesis.
00:49:44.380 Do you think there's anything that
00:49:45.940 elite athletes imbibe that's different
00:49:50.300 from what you imbibe?
00:49:52.860 Probably.
00:49:54.160 Probably.
00:49:55.400 Because, you know, they would
00:49:56.480 supplement, they would, you know, maybe
00:49:58.560 some would cheat.
00:50:00.720 You know, maybe some performance
00:50:02.700 enhancing things, right?
00:50:04.920 What if there was one popular form
00:50:08.900 of, let's say, athletic performance
00:50:11.760 enhancing thing that a lot of people
00:50:14.260 were using and there was a bad batch?
00:50:17.620 Would you ever stop to think that there
00:50:21.820 was any problem but the vaccinations
00:50:23.480 killing people?
00:50:24.900 You would not.
00:50:26.360 You would assume it was the
00:50:27.680 vaccinations.
00:50:28.940 And it might be.
00:50:30.060 So here's the part I have to say for
00:50:31.840 the clopberts.
00:50:33.540 Clopberts, I'm not saying that the
00:50:36.740 vaccinations are safe.
00:50:39.880 I'm saying I don't know how dangerous
00:50:42.080 they are.
00:50:42.820 But I'm also saying that if you assume
00:50:45.560 it's the only thing that would be
00:50:47.220 killing people, at the same time a lot
00:50:49.620 of things are changing.
00:50:50.480 There are a lot of other things
00:50:52.560 that are changing.
00:50:53.880 And I worry that it's like a
00:50:55.640 magician's trick.
00:50:57.340 We're watching the magician's two
00:50:59.180 hands.
00:51:00.300 You're like, oh, watch his left hand.
00:51:01.960 That's the vaccination one.
00:51:03.600 Watch his right hand.
00:51:04.480 That's the anti-vaccination one.
00:51:06.180 But this guy's only got two hands.
00:51:08.020 So if we watch those two hands,
00:51:09.580 we'll see the trick.
00:51:10.860 And while you're watching the
00:51:11.820 magician's two hands, the magician's
00:51:14.700 assistant is doing the trick.
00:51:16.900 Because you're looking in the wrong
00:51:20.040 place.
00:51:20.900 So here's the only thing I'm going to
00:51:22.600 add with certainty.
00:51:24.640 Everything else is just speculative.
00:51:26.740 But with certainty, we are blind to
00:51:30.000 any third killer.
00:51:32.800 I don't know that there is one.
00:51:39.880 Jenny says, Scott, apologize for
00:51:43.740 begging the government for mass
00:51:45.100 testing for two years.
00:51:46.700 No, we needed mass rapid testing.
00:51:49.900 That would have helped a lot.
00:51:52.140 Anybody who thinks that wouldn't
00:51:53.440 have helped?
00:51:55.100 Because the mass rapid testing would
00:51:56.900 help you make your own individual
00:51:58.260 decisions.
00:51:59.440 Are you opposed to people making
00:52:00.940 their individual health decisions
00:52:02.440 with data?
00:52:04.620 That's just a silly thing to be
00:52:05.900 against.
00:52:08.580 And by the way, the only reason we
00:52:10.140 didn't have them had to be
00:52:11.160 corruption.
00:52:11.520 All right.
00:52:14.260 So, have I made my point, have I
00:52:17.980 made my point that if there's a
00:52:20.360 third killer, and there totally could
00:52:22.580 be one, we would be blind to it?
00:52:25.860 We would be blind to it.
00:52:29.080 Okay, that's all.
00:52:30.360 Yeah, I'm not going to state that
00:52:31.420 those are killing people, just that
00:52:32.740 we would be blind to it.
00:52:33.940 But I will state that the extra
00:52:35.340 stress is guaranteed to cause more
00:52:37.580 heart attacks.
00:52:38.120 How about that?
00:52:40.940 We don't know if that's what we're
00:52:42.480 seeing, but would you say with a
00:52:44.920 large population, if you
00:52:46.940 substantially increase their
00:52:48.520 stress, more heart attacks?
00:52:53.940 The part that, and again, I'll say
00:52:56.100 it again, but not necessarily the
00:52:58.880 athletes.
00:53:01.120 The athletes dying looks like it
00:53:02.900 could be, there could be a fourth
00:53:04.400 killer, right?
00:53:05.440 We could have four things killing
00:53:07.560 people, and only be seeing two,
00:53:10.400 because we're just tuned to those
00:53:12.180 two.
00:53:14.080 All right, here's the question I
00:53:15.200 asked.
00:53:16.280 If an evil hypnotist took control of
00:53:19.860 TikTok's algorithm, could they use it
00:53:23.100 to murder people in a demographic?
00:53:26.620 Now, not murder a specific person,
00:53:29.300 although it could probably do that.
00:53:30.540 But I'll say, could it murder lots of
00:53:34.480 people, even if you don't know which ones?
00:53:36.380 Just yes, absolutely.
00:53:40.140 If you gave me control of TikTok's
00:53:42.860 algorithm and said, here's who we want
00:53:45.880 you to target.
00:53:47.320 We want you to increase the death rate
00:53:49.520 in this group of people.
00:53:51.780 You don't think I could do that?
00:53:53.360 If I had control of the algorithm, and
00:53:56.460 I could say these people are going to
00:53:57.800 see this content?
00:53:58.600 Oh, I could do that.
00:54:01.780 Yeah, I could do mass murder, and it
00:54:05.060 wouldn't even be hard.
00:54:06.840 Like, you know, I'm not like the
00:54:09.260 super hypnotist or anything.
00:54:10.800 It would just be basic, pretty basic,
00:54:13.880 for somebody who had just ordinary
00:54:16.680 persuasion skills, if they were evil,
00:54:19.420 and if they had full control of the
00:54:21.020 algorithm.
00:54:22.160 Now, how lucky are we that China
00:54:27.400 hasn't figured that out, huh?
00:54:29.220 Huh?
00:54:30.020 How lucky are we that, you know, a
00:54:32.820 podcaster here in the United States
00:54:34.660 can easily see it?
00:54:36.680 Easily see it.
00:54:37.760 Very obvious.
00:54:38.740 And all of you, as soon as I said it,
00:54:41.160 said, well, you know, it is obvious.
00:54:44.400 You can definitely kill people if you
00:54:45.980 control the algorithm.
00:54:47.220 But aren't we lucky that China,
00:54:50.140 our adversary, who is sending in
00:54:53.760 fentanyl to kill us by the tens of
00:54:56.220 thousands every year, isn't it lucky
00:54:57.980 that they haven't, like, keyed in on
00:54:59.960 this?
00:55:01.760 Yeah.
00:55:02.340 By the way, do you know that they
00:55:03.560 don't allow TikTok to be used in
00:55:07.340 China?
00:55:08.240 Yeah, the Chinese-owned TikTok.
00:55:11.260 It's not allowed in China.
00:55:13.600 They have a different version that
00:55:15.000 doesn't have any of the problems.
00:55:17.660 Yeah.
00:55:18.100 So how lucky, lucky, lucky, lucky,
00:55:21.140 lucky that the Chinese government
00:55:23.200 hasn't figured out what
00:55:24.800 every one of you knows and is
00:55:26.400 obvious, that they could use that
00:55:28.360 algorithm to kill Americans.
00:55:35.740 Now, do you have any doubt that our
00:55:38.580 Congress is corrupt?
00:55:41.300 None.
00:55:42.480 None.
00:55:42.780 Now, there may be members of Congress
00:55:45.800 who are not corrupt.
00:55:46.720 Like, I have a good feeling about, you
00:55:49.780 know, Thomas Massie and Rand Paul and
00:55:52.400 a number of others, but clearly, the
00:55:56.540 reason that TikTok is still legal in
00:55:58.340 the United States clearly is
00:56:00.440 corruption.
00:56:02.040 There's nothing else it could be.
00:56:04.960 Nobody's even offered another
00:56:06.820 hypothesis.
00:56:08.780 Right?
00:56:09.520 Has there been another hypothesis, where
00:56:12.280 somebody says, you know, Scott, you're
00:56:14.380 forgetting all these good reasons why
00:56:16.360 it might not be so easy?
00:56:18.540 There is no other argument.
00:56:21.380 Do we have any other national topic
00:56:23.900 where everybody's on the same side?
00:56:27.020 I've never seen one.
00:56:28.780 Everybody.
00:56:29.960 100% of the Congress is on the same
00:56:32.160 side.
00:56:32.480 And yet, TikTok's not banned.
00:56:37.020 Not banned.
00:56:38.600 Not even close.
00:56:40.080 There's not a vote.
00:56:41.460 I don't believe there's any
00:56:42.340 legislation.
00:56:44.080 There's no conversation.
00:56:46.360 Right?
00:56:48.360 That has to be massive corruption.
00:56:53.240 Give me any other hypothesis.
00:56:56.620 Any other hypothesis.
00:56:58.780 There is none.
00:57:00.460 It's obvious.
00:57:01.320 And it's like the rapid testing
00:57:03.200 scandal.
00:57:04.980 which the news treated like it
00:57:06.840 wasn't a scandal.
00:57:07.560 Well, it's the biggest corruption we've
00:57:09.740 ever seen, unless we find out
00:57:11.320 that the vaccinations
00:57:12.460 were as sketchy as they look.
00:57:16.100 Was it even covered in the news?
00:57:17.960 Does the news cover the fact that
00:57:19.500 there's no debate on TikTok and yet
00:57:21.920 no action?
00:57:23.420 No.
00:57:24.340 No.
00:57:24.700 It's not covered on the left.
00:57:26.580 And it's not covered on the right.
00:57:28.380 Now, in both cases, they complain
00:57:30.840 about fentanyl.
00:57:31.560 And they complain about fentanyl deaths.
00:57:35.300 But they don't complain about
00:57:37.220 TikTok killing people and then tell
00:57:40.580 us that nobody's doing anything about
00:57:41.980 it when that's obviously the case.
00:57:45.540 What else could it be?
00:57:46.900 I mean, really.
00:57:48.420 Now, here's a perfect situation where
00:57:51.380 if I were talking about an individual
00:57:53.920 human citizen, I would not say that
00:57:57.440 they're guilty without proof.
00:57:59.780 I'd say, no, no, a citizen is innocent.
00:58:03.500 You better bring some serious proof
00:58:05.860 if you're going to accuse a citizen
00:58:07.960 of a crime.
00:58:09.360 But the government?
00:58:11.400 The government is presumed guilty
00:58:13.100 unless they can show you some
00:58:14.480 transparency to show you that
00:58:16.440 they're treating you seriously.
00:58:17.680 And they have no transparency,
00:58:19.540 no argument.
00:58:21.300 They are guilty by definition.
00:58:25.080 In the same way a citizen is
00:58:27.160 innocent by definition,
00:58:28.620 you know, our system just defines it
00:58:31.120 that way, until proven guilty.
00:58:32.740 The government, by definition,
00:58:34.600 is guilty.
00:58:35.840 By definition.
00:58:37.100 If they won't tell you or even engage
00:58:39.040 in the conversation,
00:58:40.700 that's just guilty.
00:58:41.840 And anything that you assume otherwise
00:58:43.560 doesn't make sense.
00:58:48.160 Yeah.
00:58:48.900 Yeah, TikTok is an educational product
00:58:52.180 in China.
00:58:54.020 All right.
00:58:59.280 And that, ladies and gentlemen,
00:59:04.560 let me give you another insight into this.
00:59:08.780 I'm not going to tell you what the topic was,
00:59:11.860 but several weeks ago,
00:59:14.400 social media was messing with my head
00:59:19.620 because I made the mistake
00:59:21.140 of looking at some specific content,
00:59:23.720 you know, nothing disgusting,
00:59:25.200 don't be weird.
00:59:26.420 I looked at some content,
00:59:27.880 you know, legal, ordinary content,
00:59:29.760 and then the algorithm started serving me
00:59:31.940 more of it.
00:59:34.080 And it really messed with my life
00:59:36.680 because I couldn't break the habit
00:59:39.680 of looking at,
00:59:41.520 it was Instagram,
00:59:43.400 I couldn't break the habit,
00:59:44.560 but it was feeding me things
00:59:46.520 that made it impossible to sleep.
00:59:49.280 I couldn't sleep,
00:59:50.940 like, for weeks.
00:59:52.340 And it was only because
00:59:53.080 there was a recurring thought
00:59:54.900 that the algorithm
00:59:57.420 kept putting in my head.
00:59:58.600 And every time it would, like,
00:59:59.780 wear off,
01:00:00.900 every time I looked at it,
01:00:01.900 it would go back in my head.
01:00:03.280 Now, it doesn't matter
01:00:04.660 what the content is,
01:00:06.900 it only matters that
01:00:08.280 I couldn't turn it off
01:00:09.500 and I couldn't stop looking at it.
01:00:11.260 So that actually
01:00:16.180 caused me major health problems,
01:00:19.360 like actual health problems
01:00:21.020 from Instagram algorithm.
01:00:23.720 Not a joke,
01:00:25.520 very clearly a health problem
01:00:28.640 because I couldn't sleep
01:00:29.920 more than a few hours a night
01:00:31.200 for night after night.
01:00:32.900 That's a health problem.
01:00:34.000 That's a pretty major health problem, right?
01:00:36.180 And that came just from the algorithm.
01:00:38.960 And it wasn't because
01:00:40.000 I looked for something.
01:00:41.260 It was because something
01:00:42.300 was served to me.
01:00:43.680 And now it wasn't about
01:00:44.400 my personal life.
01:00:47.060 Then, here's another one.
01:00:50.440 I found some humor
01:00:52.160 and interest
01:00:52.940 in some anti-relationship content.
01:00:57.040 Meaning it was people saying
01:00:58.800 that men and women
01:01:00.080 basically are never going
01:01:01.940 to get along.
01:01:03.240 Because there's something
01:01:04.220 about modern life
01:01:05.180 which has made men and women
01:01:07.540 toxic to each other.
01:01:09.040 Like, we just don't have a way
01:01:10.240 to be partners anymore.
01:01:11.920 And it would basically
01:01:13.420 very strongly
01:01:15.260 encourage you
01:01:16.100 never to get married
01:01:17.100 or be in a serious relationship.
01:01:20.400 And once I looked at some of them,
01:01:23.120 the algorithm started
01:01:24.700 just burying me with
01:01:27.040 don't-be-in-a-relationship content.
01:01:31.000 Now, would that be a benefit
01:01:32.440 to China
01:01:33.340 to have an algorithm
01:01:35.300 convince people
01:01:36.160 not to get married
01:01:37.000 and have children?
01:01:37.620 Yeah.
01:01:40.580 Yup.
01:01:41.960 If I were in charge
01:01:43.680 of the algorithm,
01:01:45.000 I would be promoting
01:01:46.660 don't have children,
01:01:49.020 don't get married,
01:01:50.400 don't do any kind
01:01:51.560 of classical relationship.
01:01:53.400 It's all broken.
01:01:54.960 You have to go do something
01:01:56.680 totally different
01:01:57.500 and it'll just feed you
01:01:59.080 over and over again.
01:02:00.280 And I started getting inundated.
01:02:01.880 And at first,
01:02:03.920 I thought,
01:02:04.420 whoa, these are,
01:02:05.600 I love the fact
01:02:06.360 that these are
01:02:06.880 outside the mainstream.
01:02:08.660 Right?
01:02:09.020 I love the way
01:02:10.340 it's expressed.
01:02:11.760 The people who are
01:02:12.780 on these videos
01:02:13.620 were really good.
01:02:15.040 They were just charismatic.
01:02:16.920 They had good
01:02:17.940 communication skills.
01:02:19.060 And as content,
01:02:21.180 it was good stuff.
01:02:22.560 Just as content,
01:02:23.500 entertaining.
01:02:24.640 But what did it do
01:02:25.440 to my life?
01:02:27.400 If I had been
01:02:28.240 a younger person
01:02:29.280 and I was,
01:02:30.460 you know,
01:02:30.720 questioning whether
01:02:31.440 I get married
01:02:32.040 and have kids,
01:02:32.960 it would absolutely
01:02:34.020 have reprogrammed me.
01:02:35.180 It was strong enough
01:02:37.020 that it would have
01:02:38.420 changed my
01:02:39.580 procreation
01:02:41.920 preferences.
01:02:44.480 It would have changed
01:02:46.320 my reproductive
01:02:48.580 preferences.
01:02:50.880 And I could feel it
01:02:52.140 in real time.
01:02:53.420 Like,
01:02:53.820 I could feel it
01:02:54.780 changing my brain.
01:02:57.540 Right?
01:02:57.880 And you saw me
01:02:58.580 mention it a number
01:02:59.500 of times
01:03:00.040 because you could see
01:03:00.920 it was taking up
01:03:02.480 a little real estate.
01:03:05.300 And now,
01:03:06.340 just imagine somebody
01:03:07.300 doing that intentionally.
01:03:09.500 Maybe it is intentional.
01:03:11.140 No way to know.
01:03:12.320 But imagine if I had
01:03:13.980 control of the algorithm.
01:03:16.500 There's two examples
01:03:17.800 where I know
01:03:19.020 I could make you
01:03:19.660 less healthy.
01:03:21.180 I could make you
01:03:22.100 less healthy
01:03:22.600 because I experienced it.
01:03:24.100 I'd give you
01:03:24.620 just disturbing images
01:03:25.760 and you wouldn't
01:03:26.240 sleep well.
01:03:27.220 And then your health
01:03:27.860 would be bad.
01:03:30.360 Or I could tell you
01:03:31.480 that the things
01:03:32.120 which we know
01:03:32.760 are good for you
01:03:33.640 are really bad for you.
01:03:35.820 For example,
01:03:37.360 if you were China,
01:03:39.260 would you let
01:03:40.000 the algorithm
01:03:40.720 present lots of
01:03:43.200 "being overweight
01:03:45.440 is good for you"
01:03:46.680 content?
01:03:49.340 Would you?
01:03:50.940 Do you think
01:03:51.920 that the United States,
01:03:53.280 which has always
01:03:53.880 had a history
01:03:54.520 of telling people
01:03:55.660 to exercise,
01:03:57.020 right,
01:03:57.460 through the 60s
01:03:58.360 and 70s
01:03:59.100 and Arnold Schwarzenegger,
01:04:00.580 the government
01:04:01.740 has always said,
01:04:02.680 oh, exercise,
01:04:03.560 exercise is good for you.
01:04:05.760 But what would China
01:04:08.020 want the United States
01:04:09.100 to do?
01:04:09.540 China would want us
01:04:11.440 to value our fatness
01:04:13.180 and say,
01:04:14.500 don't change me.
01:04:15.680 This is just me.
01:04:17.020 Body positivity.
01:04:18.900 And by the way,
01:04:19.960 the women can all
01:04:20.980 weigh 400 pounds
01:04:22.180 and you men,
01:04:23.480 you still want to
01:04:24.380 mate with them
01:04:25.060 because, you know,
01:04:27.040 you don't want
01:04:27.420 to be a jerk.
01:04:29.660 Yeah.
01:04:30.900 China could absolutely
01:04:32.260 end us
01:04:32.960 just with TikTok.
01:04:35.840 And our Congress
01:04:37.220 is completely blind
01:04:38.700 to it.
01:04:40.100 That's the best
01:04:40.900 case scenario.
01:04:42.100 But I think
01:04:42.880 they're bought off.
01:04:44.340 I think they're
01:04:44.880 bought off.
01:04:45.940 So at this point,
01:04:46.880 China has the fentanyl
01:04:48.400 in our veins
01:04:49.060 and they got the finger
01:04:50.740 on the plunger
01:04:51.500 and the best
01:04:53.200 we can hope for
01:04:54.060 is they don't push
01:04:56.360 their finger
01:04:56.780 on the plunger.
01:04:58.240 But the needle's
01:04:59.140 in our arm,
01:05:00.180 right?
01:05:00.640 The TikTok's here.
01:05:02.080 It's already
01:05:02.760 in our kids' hands.
01:05:04.340 The needle's in.
01:05:05.920 All they have to do
01:05:06.760 is push the plunger
01:05:07.660 and they can take
01:05:08.880 out our whole society
01:05:09.880 if they haven't
01:05:13.000 already.
01:05:16.380 Because what would
01:05:17.260 it look like
01:05:17.820 if they did?
01:05:20.460 Body positivity.
01:05:23.980 Maybe you should
01:05:25.480 try being gay.
01:05:28.900 Maybe you shouldn't
01:05:29.740 get married.
01:05:32.720 Maybe you should
01:05:33.560 complain more
01:05:34.160 about wokeness
01:05:34.920 because that's
01:05:35.340 what's important.
01:05:38.500 Almost everything
01:05:39.620 that's on social media
01:05:40.960 is destructive.
01:05:45.480 Now, here's what
01:05:46.500 I don't know.
01:05:47.660 Is there also
01:05:48.440 a thriving TikTok
01:05:50.240 pro-health exercise
01:05:53.020 part of it
01:05:54.300 that I never see?
01:05:56.220 Maybe.
01:05:57.520 But, you know,
01:05:58.580 that would still
01:05:59.400 get lost in the noise.
01:06:00.600 All right.
01:06:09.860 Yeah, there's one.
01:06:12.640 If China wanted
01:06:13.540 to destroy
01:06:14.160 the United States,
01:06:15.220 would they promote
01:06:16.400 Andrew Tate's
01:06:18.240 videos?
01:06:21.020 Think about it.
01:06:22.700 If China wanted
01:06:23.700 to destroy
01:06:24.360 the United States,
01:06:25.280 would they promote
01:06:26.080 Andrew Tate's
01:06:27.040 videos?
01:06:27.340 Who was the
01:06:29.840 most viral
01:06:30.660 content
01:06:31.200 recently?
01:06:35.060 Andrew Tate.
01:06:36.480 The most viral.
01:06:37.220 And TikTok
01:06:38.180 was the
01:06:39.100 leading cause
01:06:40.260 of that, right?
01:06:41.520 It was mostly
01:06:42.100 a TikTok thing.
01:06:44.680 So China
01:06:45.320 may have decided to promote it,
01:06:47.800 unless the algorithm
01:06:49.260 did it by itself,
01:06:50.040 because Tate
01:06:50.840 did a lot of things
01:06:51.520 to make himself
01:06:52.140 viral.
01:06:52.820 So, I mean,
01:06:53.380 he was pushing
01:06:54.700 himself.
01:06:55.620 So we don't know
01:06:56.140 if maybe that was
01:06:57.120 enough.
01:07:00.640 It looks like
01:07:01.380 they're trying
01:07:01.720 to encourage
01:07:02.240 obesity.
01:07:03.220 Yeah.
01:07:05.600 So I would just
01:07:06.740 put that out there
01:07:07.800 that the needles
01:07:09.740 in the arm
01:07:10.300 and we're trusting
01:07:11.600 our adversary
01:07:12.520 not to push
01:07:13.360 the plunger.
01:07:14.600 So great job,
01:07:15.540 Congress.
01:07:16.400 That, ladies and
01:07:17.260 gentlemen,
01:07:17.520 is the best
01:07:18.360 live stream
01:07:19.020 you've ever seen.
01:07:20.100 I hope it
01:07:20.620 challenged your
01:07:21.540 current thinking.
01:07:22.780 I don't know
01:07:23.060 if anything I'm
01:07:23.640 saying is true,
01:07:24.580 but I know
01:07:26.560 we're not
01:07:26.980 thinking broadly
01:07:29.040 enough about
01:07:29.940 what the
01:07:31.000 possibilities are.
01:07:32.440 That being
01:07:32.960 my theme
01:07:33.560 of the day.
01:07:34.580 And YouTube,
01:07:35.240 I'm going to say
01:07:35.580 goodbye to you
01:07:36.060 now.
01:07:37.160 I'm going to
01:07:37.620 talk to the
01:07:38.320 locals
01:07:39.400 platform
01:07:40.320 privately
01:07:40.960 and I will
01:07:43.120 see you tomorrow.
01:07:44.280 Bye for now.
01:07:45.440 Best live stream
01:07:46.100 you've ever seen.