Real Coffee with Scott Adams - June 22, 2022


Episode 1782 Scott Adams: What Do Kim Kardashian, Alex Jones and Adam Schiff Have In Common?


Episode Stats

Length

1 hour and 1 minute

Words per Minute

144.5

Word Count

8,937

Sentence Count

659

Misogynist Sentences

6

Hate Speech Sentences

16


Summary

Coffee is the greatest thing ever made, and it's not even close. Also, we need a new pronoun that doesn't sound like a Nazi invasion and isn't already being used for something else.


Transcript

00:00:00.000 Hey everybody, and welcome to the highlight of not only your life and my life, but civilization itself.
00:00:09.620 Something like 13.9 billion years have passed since the Big Bang, and all of that leading to this moment.
00:00:20.240 Think about it. If things had been just a little bit different, you wouldn't even be here.
00:00:25.500 And then I would be talking to, I don't know, what? Dinosaur? Anything could have happened.
00:00:32.080 But if we want to take it up to a less absurd level today, and probably do, all you need is a cup or mug or a glass, a tank or a chalice or a stein, a canteen jug or a flask, a vessel of any kind.
00:00:44.380 Fill it with your favorite liquid. I like coffee.
00:00:48.580 And join me now for the unparalleled pleasure, the dopamine hit of the day, the thing that makes everything better.
00:00:58.900 It's called the simultaneous sip. Has anybody ever done it before? Anybody? Anybody? Yes, of course you have.
00:01:05.460 And it's the best thing that's ever happened to you. Go.
00:01:07.700 I tried to get all of my senses involved. You got the touch, the smell, the taste.
00:01:22.180 I sort of listened to myself slurping a little bit there. I looked at it. Yeah, everything.
00:01:29.380 That's what you call a full-body experience.
00:01:31.860 And I think we just had it together. Not that there's anything wrong with that.
00:01:41.160 Now, I have a question for you.
00:01:44.840 How do you plan a vacation in the current environment?
00:01:51.400 Is anybody having that problem?
00:01:53.880 You think to yourself, you know, a few weeks from now or whatever, a few months,
00:01:59.120 I think I'll book myself a plane trip and I'll take myself a trip.
00:02:07.720 You don't know if masks are coming back, which would ruin a trip.
00:02:13.600 And you don't even know if you'll have an airplane.
00:02:17.280 Because the flights are so bad right now, the number of cancellations,
00:02:22.280 that you can't be entirely sure that if you book a flight, it'll actually happen.
00:02:27.040 Now, that was always the case that you couldn't be entirely sure.
00:02:30.920 But I feel like it went from a 95% odds in your favor to closer to what?
00:02:42.820 65%?
00:02:43.400 I don't even know what the odds are of missing your flight these days.
00:02:47.340 Or at least being a day late or something like that.
00:02:50.640 I mean, I don't know how you plan to do anything these days.
00:02:53.260 Anyway, get your flight insurance or something.
00:02:58.260 Because if you plan a big vacation, you don't want to blow it.
00:03:02.140 You don't.
00:03:04.080 All right, I saw the best suggestion I've seen from a tweet by David Boxenhorn.
00:03:10.140 And he's helping us out with this whole pronoun stuff.
00:03:13.480 And I was saying that what we need is some pronouns that everybody can agree on
00:03:18.860 that aren't being used for something else.
00:03:20.740 You know, don't use they, because that's sort of already being used for something else.
00:03:28.200 So David suggests that since there are all these different pronouns,
00:03:32.000 you get yours, you know, he, she, it, they.
00:03:35.020 That if you just took the first letter from the pronouns that are out there,
00:03:39.640 you could create, you know, your own sort of universal one.
00:03:43.100 So let's see what we'd get.
00:03:44.060 If you just took the first letter, let's say she, he, it, and they.
00:03:52.940 Yeah, that would be an S, an H, an I.
00:03:55.620 Oh, wait.
00:03:57.240 All right, so it wasn't the best idea in the world.
00:03:59.640 But I like where it's heading, generally speaking.
00:04:03.440 Maybe you need to arrange the letters a little differently.
00:04:05.380 Now, somebody mentioned that this had already been done with the letter Z,
00:04:11.980 like calling people Z, you know, instead of he or she.
00:04:16.980 Yeah, Z did that.
00:04:19.220 And I said to myself, hmm, almost there.
00:04:23.000 I mean, I can see what you're doing there.
00:04:24.920 It's a brand new word, so that's good, right?
00:04:28.320 But here's what I suggested.
00:04:29.800 Could we have a word that sounds less like a Nazi invasion?
00:04:36.660 That's all.
00:04:38.300 You know, it just, I'm not asking a lot.
00:04:40.940 I'd like a pronoun that's not being used for a completely different purpose already.
00:04:46.960 It's clear.
00:04:48.220 I mean, I'm not asking a lot.
00:04:50.240 Doesn't make you sound like a Nazi stormtrooper.
00:04:54.380 Is that a high bar?
00:04:56.420 I mean, I didn't think it was.
00:04:59.800 But nobody's crossed it yet.
00:05:02.140 So let's see if somebody can do that.
00:05:05.620 Now, as you know, reality and parody merged sometime in the 2021 period, I think,
00:05:12.040 or at least that's when historians will date it.
00:05:14.600 And that's a period where you really can't tell the difference
00:05:18.040 between a joke and something serious.
00:05:21.200 Because the serious stuff became so ridiculous that the jokes stayed where they were,
00:05:28.840 and then they just merged.
00:05:31.080 You want another example of that?
00:05:33.520 Does that sound like hyperbole?
00:05:35.940 Does it sound ridiculous that parody and reality have merged to the point where you just can't tell the difference?
00:05:42.800 Well, here's a little story from today.
00:05:46.040 Apparently, the Atlantic magazine, this is fake news.
00:05:53.900 So somebody did a fake headline that made it seem as if the Atlantic magazine had done it.
00:06:00.280 And the fake headline was The Heroism of Biden's Bike Fall.
00:06:05.060 And then the subtitle was The President Gracefully Illustrated an Important Lesson for All Americans.
00:06:11.340 When we fall, we must get back up.
00:06:13.860 Now, how many of you thought that probably was a real headline in this magazine?
00:06:24.560 I'll bet you well over 90% of the people who saw it probably thought it was real.
00:06:32.160 Wouldn't you say?
00:06:33.400 Before you knew it was a fake.
00:06:36.300 Wouldn't you say that that sounds real?
00:06:39.520 It actually does.
00:06:40.660 Because as absurd as it sounds, it's not more absurd than the news that you'll see today somewhere.
00:06:48.800 There will be more absurd things than the real news today.
00:06:52.020 No difference.
00:06:54.120 Well, as part of our understanding of why things aren't the way they used to be,
00:07:02.900 Adam Dopamine on Twitter, AdamMD,
00:07:06.740 he points out that Eisenhower missed one key element when he warned us about the military-industrial complex.
00:07:14.340 So remember, Eisenhower gave a big speech in his day and said,
00:07:19.340 watch out for the people who make the weapons,
00:07:23.300 working with the people in government to make money by starting wars, essentially.
00:07:29.380 And sure enough, it looks like there's some of that going on.
00:07:35.060 But the part that he left out, as Adam Dopamine points out,
00:07:39.300 is that he should have included the news in that.
00:07:41.980 It should have been the military-industrial-news complex.
00:07:44.880 Because unless the news is complicit, you can't get away with it.
00:07:49.940 So you really need that whole complex.
00:07:54.360 And I think that's sort of a useful foundational understanding.
00:08:02.400 You know, follow the money,
00:08:04.340 but there's somebody running cover for the money and diverting you.
00:08:09.700 So there's somebody on the, let's say, the bank heist team
00:08:15.020 whose job it is when the bad guys run out and turn left
00:08:18.820 and the police arrive, it's his job to say,
00:08:22.080 he went that way and, you know, point the wrong direction.
00:08:25.440 And when I say he, I mean shit.
00:08:31.100 All right.
00:08:33.860 What do these people have in common?
00:08:36.880 Let's see if you can get this.
00:08:39.100 In the comments, here's your quiz.
00:08:40.900 Three people, what do they have in common?
00:08:42.840 Alex Jones, Adam Schiff, and Kim Kardashian.
00:08:48.940 Go.
00:08:50.140 What do they have in common?
00:08:52.400 Something major, you know.
00:08:54.620 And it's not the funny stuff.
00:08:57.020 Something real.
00:08:59.580 Dog punchers, that was a good guess.
00:09:01.960 But we have no information about Alex Jones ever punching a dog.
00:09:07.880 So, that was a good guess.
00:09:09.900 Somebody said they're all dog punchers.
00:09:12.440 And I thought, Kim Kardashian, probably.
00:09:15.640 Adam Schiff, definitely.
00:09:17.400 But no information about Alex Jones.
00:09:19.540 He might be a dog lover.
00:09:21.000 So, it was a good guess, but I don't think it fits.
00:09:25.460 All had sex tapes.
00:09:27.280 That's a good guess.
00:09:29.680 Yeah.
00:09:32.660 Probably.
00:09:33.960 Probably.
00:09:34.600 But that's not what I was going for.
00:09:36.900 All have dated Pete Davidson.
00:09:39.620 Checking the list.
00:09:40.920 Yeah.
00:09:41.780 That too.
00:09:42.980 Well, okay.
00:09:43.620 I didn't realize there were that many coincidences.
00:09:46.300 There are quite a few of them.
00:09:47.580 But here's what I was going for.
00:09:50.200 I was going for they're all in the same line of work.
00:09:55.320 They're all in the same line of work.
00:09:57.180 Let me read them again.
00:09:59.980 Alex Jones, Adam Schiff, and Kim Kardashian.
00:10:04.000 Same job.
00:10:05.880 You know, different bosses.
00:10:07.280 Same job.
00:10:09.120 And here's what their job is.
00:10:10.500 It's an attenuated reality.
00:10:16.980 Attenuated reality.
00:10:18.620 They start with something that's like reality-ish.
00:10:23.860 But they create something that's not reality from it.
00:10:28.580 For the purpose of getting attention.
00:10:31.440 And then they can monetize that attention.
00:10:33.560 Or turn it into power.
00:10:34.940 In the case of Adam Schiff.
00:10:36.660 Maybe also monetize.
00:10:37.880 So, none of them are different, in my view, from wrestling.
00:10:47.100 When you were a kid, did you ever watch wrestling?
00:10:49.280 And you thought to yourself, people are saying this is fake.
00:10:51.540 But I'm not so sure.
00:10:53.320 This wrestling looks real to me.
00:10:54.740 I think they're really fighting.
00:10:56.940 And then later you learn it's an attenuated reality.
00:11:01.320 It's not reality.
00:11:03.400 It's sort of a base reality.
00:11:05.200 There's fighters, and there's a ring, and there's an audience, and people probably get hurt, and there's contact.
00:11:12.620 But it's attenuated.
00:11:14.540 To get your attention, it's more interesting.
00:11:17.940 Alex Jones is an attenuated reality.
00:11:23.380 He takes things that at base are true, but adds to them.
00:11:27.700 And you could call it a theater.
00:11:30.240 Theater would be exactly the right word.
00:11:32.040 Yeah.
00:11:32.400 Thank you.
00:11:34.580 Sean.
00:11:36.060 And is it a coincidence that Adam Schiff actually had some script writing ambitions?
00:11:45.740 I think he's written a few scripts.
00:11:47.400 So he actually has a theatrical background.
00:11:50.680 And Kim Kardashian, there was recently a story that there was some family meeting or something that was in the reality show,
00:12:02.880 but the complaint was that they scripted it, and it was an artificial gathering in which they pretended something was real.
00:12:11.200 To which I said, you just figured that out?
00:12:16.960 Is there somebody who watches reality shows and thinks they're actually watching them just filming what's happening?
00:12:23.980 Is there anybody doing that?
00:12:26.660 Is there anybody who still thinks wrestling is like a real sport?
00:12:31.120 I wonder, do people think that?
00:12:36.000 When you watch Alex Jones, do you believe that the purpose is that you're seeing reality?
00:12:44.260 Is that what you think you're watching?
00:12:47.960 Because you should watch Alex Jones, the Kardashians, and even Adam Schiff doing any of his public stuff.
00:12:56.120 It's the same thing.
00:12:58.060 They start with reality.
00:12:59.180 They attenuate it in ways that get your attention for their own purposes.
00:13:03.780 But to imagine that any of that is real is a serious misunderstanding of, I won't call it parity in reality.
00:13:14.340 But if you think that attenuated reality and reality can be treated the same, I think you should rethink that.
00:13:23.460 These are obviously attenuated reality situations.
00:13:26.280 And there's a new documentary or movie, I guess you'd say, coming out about Alex Jones, I think at the end of July.
00:13:36.700 So we'll talk more about that.
00:13:38.500 But I was looking at, I just looked at a little trailer for that.
00:13:42.220 And you see just a snippet of Alex Jones defending what he does.
00:13:47.220 Now, his defense is this, that he believes what he says to be true, and that sometimes he's wrong.
00:13:56.300 And I look at that and I think, maybe, maybe, maybe, like you can't really rule that out, can you?
00:14:09.000 I mean, think about it.
00:14:10.240 Think about how easy a target Alex Jones is, and then I'm going to give you his defense.
00:14:15.940 It's really simple.
00:14:17.000 It's a very simple defense.
00:14:18.220 He believes what he says, but sometimes he's wrong.
00:14:25.060 And I hear that and I think, all right, nobody has ever presented evidence that he doesn't believe what he says.
00:14:32.660 Am I wrong?
00:14:34.840 And then here's the second part.
00:14:37.080 Here's the mic drop.
00:14:38.280 Alex Jones asks us, why would we treat him differently than the New York Times, which also presumably believes what it says, but sometimes they're wrong, like weapons of mass destruction in Iraq, like really wrong, like wrong, start a war wrong, like that's as wrong as you can be.
00:15:00.920 And Alex Jones says, why would you treat me differently?
00:15:03.120 We both believed what we said, and we both could be wrong.
00:15:10.340 To which I say, oh, shit, that's actually a complete defense.
00:15:17.100 There's nothing wrong with that defense at all.
00:15:20.340 The only way he could be thwarted in that defense is if there's some recording of him saying, you know, I make all this stuff up.
00:15:29.660 Now, apparently there is some evidence that he likes the conspiracy theory domain.
00:15:36.860 So that part I think he said directly.
00:15:39.260 I believe that's, you know, also part of the trailer.
00:15:42.660 But that doesn't mean he doesn't believe it.
00:15:47.520 And I'm not entirely sure that believing things is a real thing anymore.
00:15:52.340 I feel as if we all choose things to believe and that the reality is now so obviously subjective that we're a little bit aware of the fact that we're just choosing a reality.
00:16:08.540 Does that feel true?
00:16:10.240 In a small way.
00:16:12.260 But you can see that this is like a trend, you know, like it's a socially expanding trend.
00:16:20.860 That people understand that what they believe or what they act on as their reality, they know is a selected reality, a decision reality.
00:16:30.680 Not an actual reality.
00:16:32.540 It's just one they choose to live in.
00:16:34.120 And I think that conspiracy theorists often are making almost a lifestyle choice to say, I'm going to live in a world in which I treat this as true.
00:16:48.700 And it's just a choice.
00:16:50.740 And on some level, maybe they know it isn't true.
00:16:53.900 Because did you ever see somebody who had a religion that wasn't yours?
00:16:58.420 And you talk to them and you're like, I don't know what's going on with this other person.
00:17:02.100 So whatever your religion is, is probably the real one.
00:17:06.340 But suppose you were talking to somebody who got it all wrong.
00:17:08.640 They had the wrong religion.
00:17:10.060 And you look at them and you think, did they actually believe that?
00:17:14.000 Like actually really, really believe that?
00:17:16.440 Like if you put a gun to the head of a loved one and their family and said, I'm going to kill your loved one.
00:17:25.920 You know, I have some way to know the truth.
00:17:28.760 It's not possible.
00:17:29.540 But I have some way to know what you're really thinking.
00:17:32.680 Do you really believe your own religion, like all of it?
00:17:36.220 Like all the important parts.
00:17:38.200 Do you believe all of that?
00:17:40.220 And I think you'd find that people say, well, okay.
00:17:43.940 If the life of my loved one is on the line, it is sort of a chosen belief.
00:17:51.040 And while many of you are still saying, well, that's not true with me.
00:17:57.000 I'm actually living in reality and living in the real world.
00:18:01.480 And maybe you are.
00:18:02.820 Maybe you are.
00:18:03.600 How would I know?
00:18:04.760 The only thing I'm sure is I don't know.
00:18:08.140 But maybe you got the right one.
00:18:10.520 Speaking of reality, here's why you could say with certainty we have no news industry in this country.
00:18:19.740 The most basic fact that you would want to know as a news consumer is this.
00:18:27.060 So Biden recently said, I think yesterday, we need more refining capacity.
00:18:33.220 And he says, this idea that they don't have oil to drill and to bring up is simply not true.
00:18:43.040 So basically, Biden is saying that the oil industry is to blame for any shortages because there's plenty of oil with the leases and the drilling capacity they already have.
00:18:54.760 They just, they can't refine it.
00:18:56.780 And so that's a problem with the industry itself having not built enough refineries.
00:19:02.860 Now, wouldn't you expect if you had a news industry, they would fact check all that and then put it in context for you and then you would read it.
00:19:11.720 And it wouldn't matter if you were reading news from the left or the right because these are just really objective facts.
00:19:18.120 Is it true that the industry just didn't build enough refineries and there's nothing stopping them?
00:19:27.660 So if they did, everything would be fixed.
00:19:32.400 Does that even feel a little bit true?
00:19:36.180 I don't know what the true story is.
00:19:39.040 So here are things I've heard unreliably but also assumed to be true.
00:19:45.540 I believe the government is the problem with building refineries, isn't it?
00:19:51.180 Isn't the problem the government, whether it's state or local or federal, or regulations of some type?
00:19:57.420 But did I not hear a quote, and give me a fact check on this.
00:20:02.560 So I'm operating from faulty memory here.
00:20:05.240 I think this week the CEO of Chevron, this is the part you need to check,
00:20:10.360 said that he believes there will never be a refinery built in this country again.
00:20:17.040 Did he say that?
00:20:18.500 The CEO said he doesn't believe it will ever happen?
00:20:21.460 And I assumed that the context was that the regulatory burden is too heavy.
00:20:27.280 Am I right?
00:20:28.760 Yeah, it's just impossible for regulatory reasons.
00:20:32.120 So, why not put it in Central America?
00:20:41.340 Can't everybody win if we put the refineries in Central America?
00:20:45.180 There's got to be some, you know, are you telling me that Nicaragua,
00:20:49.060 you know, Nicaragua is going to say no to a refinery?
00:20:52.240 With all the, you know, jobs and whatever positivity it could create there?
00:20:56.900 And they probably have lower, you know, regulatory burdens.
00:21:01.980 So couldn't Kamala solve her problem, and the CEO of Chevron can solve his problem,
00:21:08.540 and basically work as a, you know, as a system?
00:21:13.320 Try to figure out how to make all that work as one thing.
00:21:17.300 So one of the big problems, and you see this all the time,
00:21:21.520 is we treat all of our issues like they're little silos.
00:21:26.900 But sometimes you could just connect two problems.
00:21:32.180 Yeah.
00:21:32.560 For example, you've got a labor shortage, and, you know, that's the economy,
00:21:38.940 and you've got an immigration problem.
00:21:43.060 Can't we figure out how those two silos could work together to fix something?
00:21:48.080 It feels like we're handicapping ourselves
00:21:52.500 by thinking all of our issues are, like, in their own little channel.
00:21:56.900 Anyway, so it's obvious we have no news industry,
00:22:01.560 because I can't tell if Biden is telling the truth that the industry is to blame
00:22:08.020 or the industry is telling the truth that the government is to blame.
00:22:12.140 And I would think that both the news on the left and the right,
00:22:15.640 if they knew the answer, would report it exactly the same.
00:22:18.920 Oh, yeah, they would love to build a refinery, but the government's too burdensome.
00:22:23.660 Or not.
00:22:25.220 Whatever it is.
00:22:27.420 So I was watching the new press secretary,
00:22:31.340 Biden's press secretary,
00:22:33.040 trying to answer a question about, you know, the economy.
00:22:36.000 And she attempted to answer it by saying there's a bunch of stuff
00:22:40.900 that isn't that bad right now, which is true.
00:22:45.580 So it's not all bad.
00:22:46.920 We're not in a recession now, blah, blah, blah.
00:22:49.080 But my overall impression of her was she didn't seem qualified for the job.
00:22:54.660 Does anybody else have that feeling?
00:22:59.440 She seems unqualified for the job.
00:23:01.500 Now, I couldn't remember her name.
00:23:04.960 Is it Jean-Pierre?
00:23:07.700 Is that her name?
00:23:09.120 What is her actual name?
00:23:10.280 So I'm at least being respectful when I'm talking about her.
00:23:13.860 Because I feel somewhat disrespectful.
00:23:16.220 I couldn't remember her name.
00:23:20.300 All right, well, look.
00:23:22.160 Karine Jean-Pierre.
00:23:23.840 Karen?
00:23:24.520 Corrine?
00:23:26.000 I don't know.
00:23:28.140 But I'm going to put a positive spin on it.
00:23:31.500 Do you remember when Conan O'Brien first became a late-night show host?
00:23:38.060 And he would do his monologue?
00:23:39.980 And I don't know if he had the same reaction I did,
00:23:41.900 but my first reaction to Conan O'Brien was,
00:23:45.700 oh, God, he makes me nervous.
00:23:48.200 Because he looks like he's not qualified.
00:23:51.040 Or he knows he's not qualified.
00:23:53.580 He just seemed so uncomfortable in his own skin on stage
00:23:59.660 that it made me uncomfortable.
00:24:02.160 But, but, then he got better.
00:24:06.840 Because it's one of those things you can't really practice.
00:24:10.540 You know, there's no way to practice being a late-night show monologue guy
00:24:15.620 because it's not real practice.
00:24:18.300 So, and then I would argue that he became, you know,
00:24:21.960 one of the best at it because he got to practice.
00:24:25.120 I feel as if she could, she could follow the same arc.
00:24:29.240 But at the moment,
00:24:32.000 the spokesperson is looking a little too uncomfortable with what she's saying.
00:24:37.460 And you do not want your spokesperson to look uncomfortable
00:24:43.400 with what he or she or shit are saying.
00:24:49.060 So, anyway, I was just wondering,
00:24:54.640 do you have the same impression that she seems to lack confidence in her own answers,
00:25:00.420 and that lack of confidence then
00:25:03.520 makes you think maybe the administration doesn't know what it's doing?
00:25:07.700 It's a bad, it's a bad combination to have somebody who looks
00:25:10.820 uncomfortable in their job talking for you.
00:25:14.160 Anyway, but, on a positive note,
00:25:21.040 if I may be positive,
00:25:23.720 she looks like she's smart.
00:25:26.000 She looks like she has, you know, basic, you know, great capabilities.
00:25:30.380 So, maybe she becomes really good at this in a month.
00:25:33.160 You never know. Could happen.
00:25:35.020 Give her a month, see what happens.
00:25:38.060 So, Biden administration wants to ban
00:25:42.440 or greatly reduce nicotine in cigarettes.
00:25:47.840 And, basically, it would throw the whole cigarette industry into a tizzy
00:25:53.320 and it would make it easier for people to quit.
00:25:55.940 So, apparently, at least there's some science,
00:25:59.160 of course the industry disputes it, surprise,
00:26:02.360 to suggest that it would get people to quit.
00:26:05.580 What do you think of that?
00:26:07.560 Now, I've been an anti-smoking
00:26:10.440 proponent,
00:26:13.400 and even a public advocate against smoking
00:26:15.760 in public places, for decades.
00:26:20.720 But, I'm much less interested
00:26:22.720 in what people do in their private lives.
00:26:25.820 So, do you think the government
00:26:28.060 should get into this business?
00:26:30.840 What do you think?
00:26:32.300 I mean, I do believe
00:26:33.420 that this is likely to be something
00:26:36.140 that would save tens of millions of lives
00:26:38.780 in the long run.
00:26:40.340 That would be my guess.
00:26:42.020 If they get away with this,
00:26:44.480 and I'll say it that way,
00:26:46.100 if they actually
00:26:46.840 basically wipe out the cigarette industry
00:26:50.360 and make it less addictive,
00:26:52.920 I feel like it would save
00:26:54.480 tens of millions of lives.
00:26:56.700 Which doesn't mean you should do it.
00:27:00.180 Somebody says it's like prohibition.
00:27:01.880 Yeah, so would there be
00:27:03.400 immediately counterfeit cigarettes?
00:27:06.240 Seems like it, right?
00:27:07.780 Yeah, it would just become
00:27:08.760 an illegal industry
00:27:09.760 and we would just get them
00:27:11.200 from the cartel
00:27:12.740 like we get everything else.
00:27:14.800 Yeah, I don't know if it would work.
00:27:16.500 So, I guess
00:27:16.900 I'm anti-smoking,
00:27:20.960 but I'm pro-freedom.
00:27:23.800 So, this one's a tough one.
00:27:27.080 Because cigarette smoking
00:27:28.400 mostly hurts yourself,
00:27:30.800 but on the other hand,
00:27:32.240 it might make my,
00:27:33.920 I don't know,
00:27:34.740 my health insurance cost more
00:27:36.220 because other people
00:27:37.480 are doing the wrong things,
00:27:38.680 I guess.
00:27:39.440 But you can't really go down
00:27:40.480 that road either
00:27:41.220 because that's, you know,
00:27:42.980 an infinite problem
00:27:44.040 because everybody's doing stuff
00:27:45.360 that's dangerous.
00:27:46.440 You can't take into account
00:27:48.180 all of it.
00:27:49.540 All right, I'm going to defend
00:27:50.480 Ivanka Trump's opinion
00:27:52.240 about January 6th here
00:27:53.620 because it turns out
00:27:55.280 it's pretty close to my own.
00:27:56.420 So, CNN is trying to point out
00:28:01.080 that in one video,
00:28:03.660 Ivanka said she accepted
00:28:04.920 Barr's opinion
00:28:05.740 that the election was fair.
00:28:09.300 And in other videos
00:28:12.080 where she was not,
00:28:14.880 you know, being asked by,
00:28:16.700 I think, the committee,
00:28:18.260 she had said separately
00:28:19.480 and earlier
00:28:20.180 that, you know,
00:28:21.600 there were things
00:28:22.480 that needed to be looked into
00:28:23.800 in the election.
00:28:26.060 Meaning that
00:28:26.960 maybe there's some
00:28:28.540 questions
00:28:29.960 that should be answered
00:28:31.560 about the election.
00:28:32.960 Now, that's pretty different
00:28:33.840 from saying
00:28:34.440 that election was rigged.
00:28:37.340 And do you think
00:28:38.360 that those two things
00:28:39.300 are necessarily incompatible?
00:28:43.440 Because here's my take.
00:28:45.460 I'll overlay my opinion,
00:28:47.120 and maybe that's unfair.
00:28:48.140 But I feel like hers
00:28:51.020 is so close to mine
00:28:52.040 that I can do this.
00:28:54.580 So I'm going to defend
00:28:55.500 my own opinion.
00:28:56.960 I separate
00:28:57.680 the question
00:28:59.240 of whether the election
00:29:00.380 was fair
00:29:01.100 from the question
00:29:03.100 of whether you should
00:29:03.960 support the system.
00:29:06.240 Because I often think,
00:29:07.880 maybe always,
00:29:08.800 but I'll say often,
00:29:10.380 that protecting the system
00:29:11.980 is going to be
00:29:12.500 a higher priority
00:29:13.440 than maybe fixing
00:29:16.580 any individual outcome.
00:29:19.280 I just think
00:29:20.120 the system's
00:29:20.760 more important.
00:29:21.920 So as soon as
00:29:22.780 the system
00:29:23.520 elected Biden,
00:29:26.360 I, like many people,
00:29:27.640 said,
00:29:28.020 huh,
00:29:29.240 I don't know,
00:29:30.460 maybe,
00:29:31.140 maybe not,
00:29:34.000 maybe and maybe not.
00:29:35.560 So I had my questions.
00:29:37.660 Many people had questions.
00:29:38.800 But some of you
00:29:40.940 will remember
00:29:41.380 that I also
00:29:41.960 immediately congratulated
00:29:43.600 him on Twitter
00:29:44.280 and have never
00:29:45.640 backed off
00:29:46.300 from my decision
00:29:47.400 that the system
00:29:49.200 elected Biden.
00:29:52.300 So when I say
00:29:53.400 congratulation Biden,
00:29:55.160 I'm saying,
00:29:56.060 okay,
00:29:56.540 I got some questions,
00:29:58.780 but I'm not going
00:29:59.960 to dismantle the system.
00:30:01.820 You're going to have
00:30:02.380 to give me
00:30:02.740 a lot better,
00:30:04.100 a lot better evidence
00:30:06.160 before I'm going to,
00:30:07.820 like,
00:30:08.160 break the system
00:30:09.040 because that's like
00:30:09.820 the backbone
00:30:10.980 of the country.
00:30:12.400 So I also have
00:30:13.360 an assumption,
00:30:14.140 this is part of the context
00:30:15.280 for my opinion,
00:30:16.420 that probably elections
00:30:18.420 have been rigged
00:30:19.480 in this country
00:30:20.320 for hundreds of years.
00:30:23.960 Yeah,
00:30:24.560 maybe in small ways,
00:30:25.920 maybe in big ways,
00:30:27.260 maybe it made a difference
00:30:28.440 such as the Kennedy election,
00:30:30.640 maybe it made a difference
00:30:32.000 in, you know,
00:30:32.600 Johnson's political life.
00:30:35.080 Maybe not.
00:30:36.020 I don't know.
00:30:37.820 I wouldn't know.
00:30:39.340 But
00:30:39.600 in all of those cases,
00:30:43.680 I would still support,
00:30:45.000 you know,
00:30:45.360 the republic
00:30:46.360 and the voting system.
00:30:49.100 So I believe that
00:30:50.460 Ivanka is saying something
00:30:52.100 at least consistent
00:30:53.320 with what I'm saying.
00:30:54.300 I won't burden her
00:30:55.780 by saying that her opinion
00:30:57.280 matches mine,
00:30:58.440 but at least it's consistent,
00:31:00.400 which is that
00:31:01.440 when Barr said
00:31:02.380 the election is fair,
00:31:05.540 he couldn't know
00:31:06.720 if there was any
00:31:08.760 unknown fraud,
00:31:11.160 right?
00:31:11.440 I mean,
00:31:12.480 by definition,
00:31:13.700 you don't know
00:31:14.720 the thing you don't know.
00:31:16.700 So if there had been problems
00:31:18.360 and nobody knew
00:31:19.220 what they were,
00:31:20.420 how would Barr know that?
00:31:22.240 How would anybody?
00:31:23.380 So Barr couldn't know
00:31:24.340 what he can't know.
00:31:25.020 So when Barr says
00:31:26.720 the election is fair,
00:31:28.600 here's how I interpret it.
00:31:30.980 The system
00:31:31.720 elected Biden,
00:31:33.820 which is what I say.
00:31:35.340 Yeah,
00:31:35.540 the system did
00:31:36.240 what the system did.
00:31:37.580 We observed it.
00:31:39.120 It might have had problems,
00:31:41.180 but I'm going to support
00:31:42.420 the system
00:31:42.960 until you give me
00:31:43.940 really,
00:31:44.740 really good evidence,
00:31:46.000 really good evidence,
00:31:47.160 and we're going to talk
00:31:47.740 about that in a minute,
00:31:48.920 that there's something wrong.
00:31:50.880 And in my opinion,
00:31:54.080 I haven't seen it yet.
00:31:55.520 I could.
00:31:56.980 I mean,
00:31:57.240 it's possible.
00:31:58.220 And I'm not sure
00:31:58.880 it's possible
00:31:59.440 to audit the full system.
00:32:01.080 I'm pretty sure it isn't.
00:32:02.920 So I would say
00:32:04.000 Ivanka is on good territory
00:32:05.620 if she says
00:32:06.400 she accepts the,
00:32:07.440 you know,
00:32:08.080 the Barr interpretation
00:32:09.020 that the election was fair.
00:32:11.200 I interpret that
00:32:12.220 just meaning
00:32:12.700 that it happened.
00:32:14.500 There was an election.
00:32:16.140 We observed an election.
00:32:17.420 It picked a winner.
00:32:19.600 So yes,
00:32:20.700 that happened.
00:32:22.660 But then
00:32:23.480 you should also
00:32:24.120 look into it
00:32:24.820 if you've got questions.
00:32:25.920 That just seems
00:32:26.420 like common sense.
00:32:27.880 You know,
00:32:28.040 the system is better.
00:32:31.140 The system is healthier
00:32:32.180 if you look
00:32:33.500 into any allegations
00:32:34.540 because then
00:32:36.040 you can keep
00:32:37.040 making sure
00:32:37.900 that you're finding
00:32:38.480 anything that's a problem.
00:32:40.620 I've got a theory
00:32:41.540 that maybe lawyers
00:32:42.440 should not be allowed
00:32:43.360 to run for office.
00:32:46.260 And it's a very,
00:32:47.340 it's based on
00:32:47.900 a very obvious
00:32:48.740 and simple concept.
00:32:50.700 People do
00:32:52.260 what they're trained
00:32:52.980 to do.
00:32:55.720 That's why
00:32:56.380 training works.
00:32:58.000 It wires your brain
00:32:59.560 to think
00:33:00.000 in a certain way.
00:33:01.300 The benefit
00:33:02.000 of learning economics
00:33:04.220 actually has
00:33:05.580 very little
00:33:06.140 to do with economics.
00:33:07.980 I don't know
00:33:08.400 if I've ever
00:33:08.840 told you that before.
00:33:10.200 But the benefit
00:33:10.840 of learning economics
00:33:11.820 is it wires
00:33:13.120 your brain
00:33:13.820 to simply view
00:33:15.560 things a certain way
00:33:16.640 that's productive.
00:33:18.360 That has nothing
00:33:19.180 to do with the economy
00:33:20.060 in many cases.
00:33:21.600 If you took economics
00:33:23.060 so you could predict
00:33:23.920 the future,
00:33:25.280 well,
00:33:25.580 good luck with that.
00:33:27.380 Nobody can do that.
00:33:29.320 All the economics
00:33:30.280 in the world
00:33:30.740 isn't going to tell you
00:33:31.460 what's going to happen
00:33:31.980 next year.
00:33:32.780 Too many surprises.
00:33:34.540 But,
00:33:35.160 it does wire
00:33:36.180 your brain
00:33:36.860 to know,
00:33:37.800 for example,
00:33:38.380 that if you're saying
00:33:39.100 something is good
00:33:39.960 or bad,
00:33:40.780 you better be comparing
00:33:42.060 it to something.
00:33:43.700 Which,
00:33:44.340 as amazing as this sounds,
00:33:46.060 is not automatic thinking.
00:33:47.740 It's not even
00:33:48.440 common sense.
00:33:49.860 People will often
00:33:50.680 just look at a thing
00:33:51.560 and say that's good
00:33:52.360 or bad
00:33:52.880 without regard
00:33:54.640 to what they're
00:33:55.420 comparing it to,
00:33:56.220 which is the only
00:33:56.840 relevant question.
00:33:58.380 So once you take
00:33:59.120 economics,
00:34:00.560 you automatically
00:34:01.360 see things as pairs.
00:34:03.780 It's either this
00:34:04.460 or that
00:34:04.920 or this or that
00:34:05.720 or the third thing,
00:34:06.600 et cetera.
00:34:07.320 So that wiring
00:34:08.400 just completely
00:34:09.200 changes how you act
00:34:10.500 all the time.
00:34:11.300 You can't turn it off.
00:34:13.460 You could not
00:34:14.500 force me
00:34:16.040 to not consider
00:34:18.400 alternatives
00:34:19.180 when I look
00:34:19.820 at a question.
00:34:20.800 It's just automatic.
00:34:22.920 Now,
00:34:23.720 suppose you've been
00:34:24.480 taught to be a lawyer.
00:34:26.840 Don't you think
00:34:27.340 lawyers get
00:34:27.980 a certain circuitry
00:34:29.060 burned in?
00:34:30.860 And,
00:34:31.380 you know,
00:34:32.000 I'm not exactly sure
00:34:33.040 what that circuitry is.
00:34:35.060 But if you look
00:34:36.020 at Adam Schiff
00:34:37.080 and,
00:34:39.360 you know,
00:34:39.580 I believe he's
00:34:40.280 an attorney
00:34:41.240 am I right?
00:34:43.180 Correct me
00:34:43.680 if I'm wrong.
00:34:44.120 He's an attorney,
00:34:44.780 right?
00:34:45.340 I feel like
00:34:45.840 they all are,
00:34:46.500 so can you
00:34:48.060 confirm that?
00:34:49.040 Somebody says
00:34:49.480 nope.
00:34:50.600 Somebody says
00:34:51.120 yes.
00:34:52.420 Anyway,
00:34:53.080 it feels as if
00:34:54.020 whether he's a real
00:34:55.160 one or wanted
00:34:55.760 to be one,
00:34:57.720 it feels as if
00:34:58.820 these public hearings
00:34:59.780 are attorneys
00:35:00.720 who couldn't be
00:35:01.520 attorneys,
00:35:02.580 like actually
00:35:03.520 successful big
00:35:04.600 trial lawyers.
00:35:05.240 It feels like
00:35:06.520 play acting
00:35:07.280 for attorneys.
00:35:10.040 This show
00:35:11.100 trial thing
00:35:11.860 that we're seeing,
00:35:12.760 and now it's like
00:35:13.460 we've seen several
00:35:14.300 of them from
00:35:14.720 the Democrats,
00:35:15.720 it looks like
00:35:16.960 attorneys
00:35:17.800 trying to be
00:35:19.880 attorneys.
00:35:22.440 Doesn't it?
00:35:23.300 It's like
00:35:25.820 they're play acting.
00:35:27.260 They couldn't do it
00:35:28.140 in the real world.
00:35:29.340 So it made me think
00:35:31.060 of the old joke,
00:35:32.780 those who can't do,
00:35:34.400 teach.
00:35:36.160 Right?
00:35:36.460 Those who can't do,
00:35:37.580 teach.
00:35:38.040 And then those who
00:35:38.860 can't teach,
00:35:39.720 teach gym.
00:35:40.320 Well,
00:35:43.260 in the lawyer world,
00:35:44.860 it's like those
00:35:45.560 who can't become,
00:35:47.080 you know,
00:35:47.420 successful lawyers,
00:35:49.680 teach.
00:35:51.220 And those who can't
00:35:52.500 teach you how to be
00:35:53.300 a lawyer,
00:35:54.780 run for politics.
00:35:57.280 And they're in
00:35:58.240 Congress.
00:35:59.120 So you end up with
00:35:59.840 like the lowest
00:36:00.820 level of a lawyer
00:36:01.960 by the time you have
00:36:03.320 a politician who's
00:36:04.280 also a lawyer.
00:36:06.240 Not every time.
00:36:07.660 I'm making sort of
00:36:08.300 a generalization.
00:36:09.500 But then you've got
00:36:10.120 these people who
00:36:10.780 are wired for this
00:36:11.820 model of how to
00:36:13.240 solve things.
00:36:15.260 Right?
00:36:15.420 I'm a lawyer,
00:36:16.660 so the model of
00:36:17.700 how to solve things
00:36:18.720 is you have some
00:36:19.680 kind of this public
00:36:20.520 event where evidence
00:36:22.660 is displayed and
00:36:24.300 there are
00:36:24.860 representatives and
00:36:26.280 there's somebody
00:36:27.180 who's like a
00:36:27.800 defendant, you know,
00:36:29.600 like you're
00:36:30.020 accusing somebody.
00:36:31.260 And then there
00:36:31.680 will be other people
00:36:32.360 who will be like a
00:36:33.360 jury, but not
00:36:34.200 really a jury
00:36:34.840 because it's not
00:36:35.460 a trial.
00:36:36.200 But we'll make it
00:36:36.840 like it's a
00:36:37.300 pretend trial.
00:36:38.520 And then I'll get
00:36:39.180 to stand in front of
00:36:39.920 the world like
00:36:41.040 I'm a famous
00:36:42.440 trial lawyer and
00:36:43.780 I'll make my
00:36:44.920 impassioned case and
00:36:46.660 then everybody will
00:36:47.320 look at me and
00:36:47.840 they'll say,
00:36:48.220 wow, look at that
00:36:50.040 impassioned case that
00:36:51.940 that excellent lawyer
00:36:53.000 made who was not
00:36:54.960 good enough to be an
00:36:55.740 actual lawyer so he
00:36:56.820 had to run for
00:36:57.400 office.
00:36:59.740 So, so much of
00:37:01.820 this looks like
00:37:02.720 play acting because
00:37:04.360 it is.
00:37:05.020 I think it
00:37:06.700 actually is.
00:37:08.740 If you put me in
00:37:10.660 government, what
00:37:11.760 would I be doing so
00:37:13.080 much that you would
00:37:14.280 hate it?
00:37:15.540 I would be saying,
00:37:16.820 have you considered
00:37:17.660 the alternatives?
00:37:19.220 Have we really, have
00:37:20.560 we really seen both
00:37:21.620 sides of the
00:37:22.160 arguments?
00:37:23.100 Like I would be
00:37:23.740 doing that until you
00:37:24.580 were so fucking sick
00:37:26.240 of hearing it that
00:37:28.000 you would puke.
00:37:28.720 because I'm wired
00:37:32.420 that way.
00:37:33.280 Like I'm just
00:37:33.920 wired that way
00:37:34.620 because I studied
00:37:36.120 economics and
00:37:37.160 you know, business
00:37:38.100 et cetera.
00:37:40.280 If you, if you
00:37:41.440 randomly select a
00:37:42.660 lawyer from
00:37:43.480 anywhere, throw
00:37:44.940 that lawyer into
00:37:45.820 government and say
00:37:47.140 we've got this
00:37:47.820 problem, what do
00:37:48.580 you want to do
00:37:49.080 about it?
00:37:50.100 What's the lawyer
00:37:50.880 going to do?
00:37:52.180 Well, the lawyer is
00:37:52.980 going to do whatever
00:37:53.560 the lawyer was wired
00:37:54.780 to do.
00:37:56.140 Huh, how do you
00:37:56.980 solve a problem?
00:37:58.720 Probably some kind
00:37:59.540 of an evidence
00:38:01.520 gathering, then a
00:38:02.920 public event, there's
00:38:04.840 got to be a
00:38:05.320 villain, there's got
00:38:06.720 to be a jury.
00:38:07.940 They're just wired
00:38:08.660 that way.
00:38:09.580 So we keep getting
00:38:10.580 all these, you
00:38:11.260 know, impeachments
00:38:12.660 and show trials and
00:38:13.780 stuff.
00:38:14.220 Why?
00:38:15.160 Because that's who
00:38:15.840 we fucking hired.
00:38:17.840 We hired people who
00:38:19.160 can only do this.
00:38:21.680 If we wanted to get
00:38:22.860 some plumbing fixed,
00:38:24.340 we should have
00:38:25.420 hired a plumber.
00:38:26.140 If we wanted to
00:38:28.640 solve a fucking
00:38:29.900 problem, you
00:38:31.800 don't hire lawyers.
00:38:34.180 Lawyers aren't
00:38:34.900 there to solve a
00:38:35.720 problem, they're
00:38:36.240 there to fight.
00:38:37.780 They're there to
00:38:38.500 put on a trial.
00:38:40.080 They're just wired a
00:38:40.960 certain way.
00:38:41.820 And they're wired
00:38:42.660 the wrong way for
00:38:44.720 what we need them
00:38:45.620 for.
00:38:47.920 Are you ready for
00:38:48.940 the point where I
00:38:50.600 make my case so
00:38:51.740 strongly that you'll
00:38:52.640 never be the same
00:38:53.400 again?
00:38:54.960 January 6th.
00:38:56.800 Imagine January 6th
00:38:58.300 if all the lawyers
00:38:59.380 were not part of
00:39:00.300 the story.
00:39:01.340 Number one, it is
00:39:03.020 alleged, and I may
00:39:04.160 have some of these
00:39:04.760 facts wrong, so help
00:39:05.900 me on the fact
00:39:06.640 checking.
00:39:07.740 It is alleged that
00:39:08.740 lawyer Mark Elias
00:39:10.180 was part of a
00:39:12.080 larger effort to
00:39:13.780 change some of the
00:39:14.800 election rules,
00:39:16.580 partly because the
00:39:17.680 pandemic gave some
00:39:19.000 impetus to that,
00:39:19.980 but more largely
00:39:22.340 because they knew
00:39:24.080 those rule changes
00:39:25.300 would help the
00:39:26.260 Democrats.
00:39:27.380 Now, if as many
00:39:29.880 allege, and I'm not
00:39:31.120 sure how to know
00:39:32.340 this is true, if as
00:39:33.860 many allege that
00:39:35.320 made the difference,
00:39:37.340 wasn't that the
00:39:38.220 thing that shocked
00:39:39.080 Republicans after the
00:39:40.440 election?
00:39:40.980 It's like, wait a
00:39:41.540 minute, this didn't
00:39:43.020 go the way we were
00:39:44.180 all pretty sure it
00:39:44.920 would go.
00:39:46.220 And it's so far
00:39:47.060 off from what we
00:39:48.820 expected, that we
00:39:50.800 think there's
00:39:51.540 something fishy
00:39:52.180 going on.
00:39:53.600 So if Mark
00:39:54.280 Elias had simply
00:39:55.180 never existed, and
00:39:57.180 the election had
00:39:57.900 been run the way
00:39:58.580 elections had always
00:39:59.660 run before, maybe
00:40:01.680 unfairly, maybe
00:40:03.620 unfairly, Republicans
00:40:05.480 would have had an
00:40:06.220 advantage and Trump
00:40:07.080 would have won, and
00:40:08.240 then there would be
00:40:08.720 no January 6th.
00:40:10.040 So am I right?
00:40:11.200 So this is the
00:40:11.880 first case.
00:40:13.400 If Mark Elias, and
00:40:15.200 people he worked
00:40:15.880 with who probably did
00:40:16.640 similar things, if
00:40:17.920 they did not exist,
00:40:19.180 doing what they
00:40:19.840 did, Republicans
00:40:21.480 would not have
00:40:22.240 said, wait a minute,
00:40:23.420 this result doesn't
00:40:24.200 look right, and
00:40:25.920 maybe it was all
00:40:26.540 legal.
00:40:27.360 By the way, I'm not
00:40:28.380 alleging anything
00:40:29.280 illegal.
00:40:30.580 They're probably just
00:40:31.880 good lawyers, and
00:40:32.720 they made sure they
00:40:33.560 changed laws that
00:40:34.520 helped their clients
00:40:35.620 or their interests.
00:40:38.180 So here's another
00:40:39.240 one.
00:40:39.900 Apparently, the
00:40:40.900 primary reason that
00:40:42.260 Trump thought there
00:40:44.220 was some path for
00:40:45.680 delaying the vote or
00:40:47.480 getting these other
00:40:49.240 fake people in
00:40:50.620 there, fake
00:40:52.860 electors, is that he
00:40:58.460 had a lawyer, John
00:40:59.460 Eastman.
00:41:00.700 So John Eastman was
00:41:02.060 the one who came up
00:41:02.840 with the theories
00:41:03.540 about how it would
00:41:04.540 be legal for Trump
00:41:07.580 and his supporters to
00:41:08.560 delay things or
00:41:09.580 challenge things or
00:41:10.920 get different
00:41:11.640 electors or whatever
00:41:12.500 it was.
00:41:12.820 Now, imagine if
00:41:16.280 Trump did not have a
00:41:17.840 high-end lawyer, a
00:41:19.200 high-end lawyer,
00:41:19.920 remember, John
00:41:20.600 Eastman's not your
00:41:22.300 neighborhood lawyer.
00:41:23.220 He's like a high-end
00:41:24.160 government lawyer kind
00:41:25.640 of guy.
00:41:26.080 If Trump did not have
00:41:27.500 that advice, would he
00:41:28.740 have even bothered?
00:41:30.360 If no lawyer had told
00:41:32.260 Trump, yeah, there's
00:41:33.500 an argument for this,
00:41:35.120 would he have done
00:41:36.180 anything he did?
00:41:38.080 No.
00:41:39.500 No.
00:41:40.360 No, there had to be a
00:41:41.400 lawyer telling him,
00:41:42.640 well, there's a good
00:41:43.220 chance this could work.
00:41:45.240 Now, in retrospect,
00:41:48.120 as we look at it
00:41:49.220 after the fact, it
00:41:50.780 doesn't look like that
00:41:51.700 was good legal advice.
00:41:53.820 But how would you know
00:41:54.700 if you were in the
00:41:55.360 moment?
00:41:56.900 If you're Trump and
00:41:58.660 your lawyer says this
00:41:59.720 looks like it could
00:42:00.480 work, what are you
00:42:02.840 going to do?
00:42:04.120 Overrule your lawyer?
00:42:05.540 I mean, you don't
00:42:06.200 know.
00:42:06.460 I mean, eventually
00:42:08.300 you might overrule.
00:42:10.980 So, if you took Mark
00:42:13.160 Elias and whoever he
00:42:14.400 worked with, probably
00:42:15.140 other lawyers out of
00:42:16.120 it, you wouldn't get
00:42:17.340 the opinion that
00:42:18.060 shocked the Republicans.
00:42:19.220 If you took Eastman
00:42:20.060 out of it, Trump would
00:42:21.480 have said, ah, damn
00:42:22.260 it, I just lost the
00:42:23.060 election.
00:42:23.700 I guess there's nothing
00:42:24.340 you could do about
00:42:24.960 that.
00:42:26.380 Now, by the way, oh,
00:42:28.840 and then if you take
00:42:30.000 the lawyers out of
00:42:31.080 Congress, they don't
00:42:32.500 even do the January
00:42:33.460 6th show trial thing.
00:42:35.220 So, have I made my
00:42:37.000 case that if you took
00:42:39.020 the lawyers out of
00:42:39.980 the story, none of
00:42:40.860 this would have
00:42:41.240 happened.
00:42:41.640 January 6th wouldn't
00:42:42.560 even be, there would
00:42:43.680 have been no riots.
00:42:46.540 There would have been
00:42:47.380 nothing except an
00:42:48.460 election.
00:42:49.440 It was only the
00:42:50.600 lawyers that fucked
00:42:51.960 up everything.
00:42:53.380 And if it were not
00:42:55.060 for their flying
00:42:55.800 monkeys in the
00:42:56.620 press to make us
00:42:58.380 think the problem
00:42:59.080 was somewhere else,
00:43:00.220 it would be really
00:43:01.260 obvious to you that
00:43:02.880 lawyers are the
00:43:03.780 problem.
00:43:04.140 And by the way,
00:43:08.180 they're not always
00:43:08.860 breaking the law.
00:43:11.400 How about if
00:43:12.460 Sidney Powell had
00:43:16.940 not told the Trump
00:43:18.640 supporters that they
00:43:20.160 totally had the goods,
00:43:21.960 do you think the
00:43:22.520 Trump supporters would
00:43:23.380 have been so worked
00:43:24.260 up if somebody who
00:43:26.600 wasn't a high-end
00:43:27.500 lawyer, somebody you
00:43:28.520 really expect, better
00:43:30.040 know they have the
00:43:30.780 facts before they talk
00:43:31.760 in public like that,
00:43:32.660 do you think they
00:43:33.720 would have been as
00:43:34.560 worked up?
00:43:35.560 Probably not.
00:43:38.300 This was lawyer
00:43:39.480 problems from top
00:43:40.580 to bottom.
00:43:41.940 And the fact that we
00:43:42.800 see it any other
00:43:43.680 way is a testimony to
00:43:45.800 how easily we can be
00:43:46.820 brainwashed.
00:43:48.300 Because, I mean, it's
00:43:48.880 only one of the
00:43:49.560 filters or frames you
00:43:50.960 could put on this,
00:43:51.920 but it's the most
00:43:52.540 productive one.
00:43:53.760 It's just a lawyer-on-lawyer
00:43:55.260 crime, and we're all,
00:43:57.200 basically, we're just
00:43:59.800 victims of the drive-by
00:44:01.100 shootings by lawyers,
00:44:02.660 and we're acting like
00:44:03.700 it's some other
00:44:04.220 problem entirely.
00:44:07.120 So, there's that.
00:44:16.660 What about this gun
00:44:18.620 control bill?
00:44:20.020 So, a lot of
00:44:21.620 Republicans are
00:44:22.420 hopping mad.
00:44:24.060 I guess a number of
00:44:24.880 Republicans signed on to
00:44:26.280 a Democrat gun
00:44:28.500 control bill that
00:44:29.340 includes some red
00:44:30.240 flag law funding.
00:44:33.320 So, it's funding for
00:44:34.460 states to look into
00:44:35.860 and or implement
00:44:36.940 red flag laws.
00:44:38.460 And that means that
00:44:39.380 if you think your
00:44:40.640 neighbor, friend, or
00:44:41.860 family member is a
00:44:42.680 little crazy, you can
00:44:45.000 have the government
00:44:45.600 take their gun away.
00:44:47.840 Because you reported
00:44:49.180 that they were somehow
00:44:51.100 mentally unstable or
00:44:52.460 dangerous or something.
00:44:53.860 Now, as many people
00:44:55.580 have suggested,
00:44:58.480 doesn't that mean that
00:44:59.920 everybody's just going to
00:45:01.080 turn in every Republican
00:45:02.320 that lives next door?
00:45:04.380 And every Democrat who
00:45:06.200 sees a Republican who has
00:45:07.400 more than one gun is
00:45:09.100 going to say, hmm, more
00:45:10.160 than one gun?
00:45:11.640 That's a little crazy to
00:45:12.940 me.
00:45:14.280 Why would you need more
00:45:15.360 than one gun?
00:45:17.180 And while you're, you
00:45:18.540 take pictures and put them
00:45:19.600 on social media of you at
00:45:21.040 the gun range, oh, that's
00:45:22.220 a little crazy.
00:45:23.460 I think maybe the red
00:45:25.380 flag is appropriate for
00:45:27.220 you.
00:45:28.380 So, there's definitely a
00:45:30.780 slippery slope.
00:45:33.320 It's not even slippery.
00:45:34.940 It's like right there.
00:45:35.800 You don't even have to
00:45:36.440 wait for the slide, do you?
00:45:38.320 Literally on day one,
00:45:40.500 your neighbor can turn you
00:45:41.620 in for being a little too
00:45:42.740 Republican with your guns.
00:45:44.060 Am I wrong?
00:45:45.660 Well, I would think that
00:45:48.160 being a little too
00:45:48.900 Republican is going to
00:45:49.960 start looking like a
00:45:50.880 mental disorder.
00:45:52.440 Did we not see the
00:45:54.520 mental disorder industry
00:45:58.240 diagnose Trump in public
00:46:00.940 when that's the last thing
00:46:03.080 anybody should do if
00:46:04.700 they're in that business?
00:46:06.100 Like, the last thing you
00:46:07.220 should do is diagnose
00:46:08.820 somebody's mental disorder
00:46:10.360 and they're not even your
00:46:11.720 patient.
00:46:12.260 It's just somebody you see
00:46:13.100 on TV.
00:46:13.540 And the psychiatric
00:46:15.900 industry, like, just
00:46:18.560 completely gave up their
00:46:19.760 standard because it was
00:46:20.740 Trump.
00:46:21.400 It's like, oh, yeah,
00:46:21.980 well, in this case, we'll
00:46:22.720 do it.
00:46:23.560 Yeah, normally.
00:46:24.780 Normally, we wouldn't do
00:46:25.760 it, but, you know,
00:46:26.520 Trump.
00:46:27.860 So, when you see how
00:46:29.800 flexible people can be
00:46:32.320 under the right situation,
00:46:36.220 yeah, there's a
00:46:37.640 legitimate reason to be
00:46:38.620 worried about these red
00:46:39.440 flag laws.
00:46:40.900 On the other hand,
00:46:43.540 what do I always say
00:46:44.880 when there's something
00:46:46.500 that you don't know if
00:46:48.440 it would work or not?
00:46:51.180 What do I always say?
00:46:55.260 Test it.
00:46:57.500 Test it.
00:46:58.900 Why are we even arguing?
00:47:00.740 Just put it in one damn
00:47:02.440 state, let it run for three
00:47:04.960 years, and see if any
00:47:06.680 Republicans get turned in.
00:47:07.980 I think you would know
00:47:10.640 pretty reliably if it turns
00:47:13.900 out that Democrats are just
00:47:15.480 going to be turning in
00:47:16.240 Republicans.
00:47:17.240 If you think that's the
00:47:18.540 problem, and I think that's
00:47:20.400 an entirely legitimate
00:47:21.660 worry, you just don't know
00:47:24.360 if it's true.
00:47:25.880 Like, you worry, but you
00:47:27.280 don't know.
00:47:28.880 It could be exactly the
00:47:31.060 opposite of that.
00:47:32.880 Here's the other way it
00:47:34.000 could go.
00:47:34.380 The Democrat calls and
00:47:36.340 says, I need to red flag
00:47:38.080 my neighbor because he's
00:47:40.100 like a little too gun
00:47:41.360 obsessed.
00:47:42.780 But maybe it's just a
00:47:44.040 Republican with a hobby,
00:47:45.560 right?
00:47:45.820 It doesn't mean anything
00:47:46.540 about violence.
00:47:48.720 What happens when the
00:47:49.580 police get that call?
00:47:52.000 Do they act on it?
00:47:52.960 Because now there's a red
00:47:53.840 flag law.
00:47:55.620 Or do they say, does your
00:47:58.660 neighbor have a red hat
00:48:00.920 that says MAGA on it?
00:48:02.080 And then the Democrat
00:48:03.600 says, yes.
00:48:06.920 And then do the police
00:48:07.980 say, oh, we'll put
00:48:09.080 somebody right on it?
00:48:10.780 Or do the police say,
00:48:12.320 oh, don't worry, that's
00:48:13.060 just how they act?
00:48:15.300 I feel like it might be
00:48:16.500 the latter.
00:48:17.740 Because the police are
00:48:18.520 just going to want to use
00:48:19.320 their resources for things
00:48:20.600 that are useful, right?
00:48:22.060 The police do not want to
00:48:23.340 waste their time.
00:48:25.140 So the counter to the
00:48:26.900 slippery slope is that
00:48:28.600 the police will hardly ever
00:48:30.000 want to do anything with
00:48:31.180 the red flag.
00:48:33.060 I think you'd be, I think
00:48:35.040 it would be shockingly
00:48:36.000 difficult to get the
00:48:37.200 police to act at all on a
00:48:39.900 red flag complaint.
00:48:41.280 It would have to be pretty
00:48:43.020 damning for the police to
00:48:44.920 act, only because the
00:48:46.760 police have limited
00:48:47.500 resources.
00:48:49.480 So do you remember I said
00:48:51.700 follow the money?
00:48:54.180 Do you know why I always
00:48:55.680 speed if I happen to be
00:48:57.580 driving at 4 a.m. on a
00:48:59.160 Sunday morning?
00:49:00.440 Well, let's say not 4 a.m.
00:49:02.300 Let's say 8 a.m. on a
00:49:03.540 Sunday morning.
00:49:04.180 Do you know why I always
00:49:05.040 speed?
00:49:06.100 Well, not only because
00:49:07.060 there's not much traffic,
00:49:08.660 but because I know the
00:49:09.820 police department will not
00:49:11.800 commit resources to a low-traffic time of day.
00:49:18.500 Follow the money.
00:49:20.100 If you know how money
00:49:21.380 influences how everybody
00:49:22.820 acts, you'll never get a
00:49:24.560 speeding ticket.
00:49:25.120 I realize that's a
00:49:28.820 dramatic claim, but let
00:49:30.460 me give you another
00:49:30.980 example.
00:49:31.820 Do you ever see those
00:49:32.500 signs that the police
00:49:34.320 will put up to tell you
00:49:35.300 what your speed is?
00:49:37.520 And the idea is to tell
00:49:38.960 you that you're speeding
00:49:39.700 so you'll slow down.
00:49:41.540 Do you know what I do
00:49:42.340 when I see that sign?
00:49:44.260 I speed.
00:49:46.020 Do you know why?
00:49:47.440 Because no police
00:49:48.440 department in the world is
00:49:49.700 going to put up that sign
00:49:50.840 to warn you to slow down
00:49:52.760 and then also assign an
00:49:54.760 officer to give you a
00:49:55.780 ticket if you didn't
00:49:56.560 follow the sign.
00:49:57.960 The sign is what you put
00:49:59.060 there when you can't
00:49:59.780 afford to put an officer
00:50:00.660 there all the time.
00:50:02.960 If you don't understand
00:50:04.020 economics, you might think,
00:50:06.300 oh, there's going to be a police officer right after the sign.
00:50:08.880 No.
00:50:09.840 No.
00:50:10.560 The sign is instead of the
00:50:11.820 police.
00:50:13.740 Somebody says I'm wrong, but I'll bet it's not wrong often.
00:50:18.420 So the point is, if you
00:50:20.780 understand economics, you
00:50:22.860 can anticipate things a
00:50:24.040 little bit better.
00:50:29.640 Tim Pool was swatted eight
00:50:31.020 times and the police keep
00:50:32.320 coming.
00:50:33.040 Yeah, that Tim Pool thing,
00:50:34.320 there's something we need to
00:50:35.260 know about that.
00:50:37.860 There's something about that
00:50:38.960 story that doesn't make
00:50:39.980 complete sense.
00:50:41.900 I'm having a little trouble
00:50:43.160 believing he gets swatted
00:50:44.580 time after time, you know,
00:50:46.080 and it's a fake call and that
00:50:48.220 the SWAT falls for it one
00:50:50.340 time after another.
00:50:51.900 Really?
00:50:52.660 There's nobody on a SWAT
00:50:53.860 team who's heard of Tim
00:50:54.900 Pool?
00:50:55.940 At this point, it would be
00:50:57.140 hard for me to imagine any
00:50:59.240 SWAT organization where there
00:51:01.680 isn't at least one person who
00:51:02.920 knows who the fuck Tim Pool
00:51:04.220 is and that he gets swatted all
00:51:06.660 the time on fake calls.
00:51:10.840 I don't know.
00:51:11.880 I'm just not believing.
00:51:12.960 There's something about the
00:51:13.700 story that doesn't add up.
00:51:14.880 So I'd say you have two counter forces that could be tested.
00:51:21.000 One is, of course, people
00:51:22.580 would abuse the red flag law.
00:51:24.840 Of course they would, because
00:51:26.020 we're people.
00:51:27.300 People will abuse anything.
00:51:29.100 There's nothing that can be
00:51:30.200 introduced into the world that
00:51:31.540 people won't abuse.
00:51:32.620 So yes, they'll abuse it.
00:51:35.100 But will the police resource constraint act as a really useful check on this, so that they won't act unless it's really obvious? And when it's really obvious, maybe they should act.
00:51:51.340 So I'm not sure.
00:51:52.740 The only thing I know for sure is
00:51:53.860 that you could test it in one
00:51:55.000 state, wait a couple years, see
00:51:56.900 if it worked.
00:51:58.220 That way, you don't give up your constitutional rights without getting something in return.
00:52:05.260 Alex Brogan on Twitter did a
00:52:08.660 thread with a bunch of good
00:52:10.400 ideas about persuasion.
00:52:12.660 But there was one that he had
00:52:14.300 there that I'm going to push
00:52:15.360 back a little bit on, because I
00:52:17.680 have a counter view.
00:52:18.720 So here's a persuasion tip of
00:52:20.000 the day.
00:52:21.420 So what Alex wrote was, he called
00:52:24.340 it the rule of three.
00:52:25.620 I think this came from McKinsey or
00:52:27.440 someplace.
00:52:28.320 That whenever you're trying to
00:52:29.540 persuade a person to do
00:52:31.520 something, always present three
00:52:33.260 reasons.
00:52:34.460 Never two, never four, but
00:52:36.300 exactly three.
00:52:37.160 And the idea is, it gets people's
00:52:39.980 attention, and most of us have
00:52:41.620 been hardwired to expect things in
00:52:43.700 groups of three.
00:52:44.860 So it's more persuasive.
00:52:47.620 And I can see the argument that
00:52:50.920 maybe there's some hardwiring that
00:52:52.680 three reasons is just the right
00:52:55.240 amount.
00:52:55.940 But I would push back on that,
00:52:57.580 because my experience is that if you
00:52:59.640 give people three reasons, the
00:53:02.040 argument goes like this.
00:53:03.260 And see if this sounds familiar.
00:53:04.460 I'll say, well, there are three
00:53:07.280 reasons.
00:53:08.560 Reason number one, it would kill
00:53:10.900 you.
00:53:11.860 Reason number two, it would be
00:53:13.880 expensive.
00:53:15.240 Reason number three, it's not even
00:53:17.820 doable.
00:53:18.300 It's impractical.
00:53:20.460 Now, if you give three reasons,
00:53:23.800 people will, you know, start arguing
00:53:26.220 with, you know, parts of those
00:53:27.900 categories.
00:53:29.100 And you'll never get them to
00:53:31.460 agree.
00:53:32.760 Because the moment you argue the first one, they say, no, it's not going to kill you. Then you make your argument, and maybe they agree, or maybe they don't.
00:53:43.040 They usually get to the point where
00:53:44.120 they say, but what about that other
00:53:46.080 one?
00:53:46.720 And you're like, wait a minute.
00:53:48.120 Wait a minute.
00:53:49.000 Are you conceding this point?
00:53:51.280 And they'll never concede the point.
00:53:52.740 They'll just move to the other
00:53:53.660 point.
00:53:54.720 Then you say, okay, okay.
00:53:55.780 We'll get back to this one.
00:53:56.980 Because I think I won this, but you
00:53:58.360 didn't agree.
00:53:58.820 Okay, so let me win these other
00:54:00.740 two.
00:54:01.420 And then you make your great point on
00:54:03.680 the second one.
00:54:04.360 What do they do?
00:54:05.700 They say, oh, now you've made two
00:54:07.300 good points.
00:54:08.060 Let's see about this third one.
00:54:09.700 Never.
00:54:11.020 Never.
00:54:12.200 They'll just move to the third one
00:54:13.500 and start arguing why you're wrong
00:54:15.480 about that one.
00:54:16.600 And you'll be like, well, I'm not so
00:54:18.560 sure I've settled the first two.
00:54:21.880 Did I?
00:54:22.500 And then you'll argue the third
00:54:24.920 one and you make your point.
00:54:26.160 And now in your mind, you've said,
00:54:28.020 all right, three arguments.
00:54:29.440 I've now supported all three
00:54:30.840 arguments.
00:54:31.400 We're done here.
00:54:32.240 What does the person who has now
00:54:33.660 been completely vanquished on all
00:54:36.360 three points do next?
00:54:38.180 You tell me.
00:54:39.200 In your own experience, you've gone
00:54:41.280 down the list of three.
00:54:42.720 You've completely vanquished their
00:54:44.820 points in ways that they don't even
00:54:47.300 have a response to.
00:54:48.460 They literally don't have a response
00:54:49.800 to it.
00:54:50.120 They just move to the next one.
00:54:51.000 What happens next?
00:54:52.560 They return to the first one
00:54:55.120 and start over.
00:54:56.600 That's right.
00:54:58.180 Name here.
00:54:59.400 That is correct.
00:55:00.880 They start over.
00:55:02.640 And they will pretend that you had
00:55:04.800 never already countered their first
00:55:07.440 argument.
00:55:07.960 They'll just start back on the first
00:55:09.220 one.
00:55:10.100 And you'll think, do you have amnesia?
00:55:14.640 We just did this.
00:55:16.540 And they'll act like you hadn't.
00:55:18.760 And so you'll think, okay, well,
00:55:20.760 I'll do it again.
00:55:23.380 And then you'll do the same three
00:55:24.860 again.
00:55:25.220 And you'll say, there, at least you
00:55:27.320 remember all three debunks.
00:55:29.080 So now we're done.
00:55:29.960 And what will they do?
00:55:32.480 They'll start with the first one
00:55:34.020 again, like it never happened.
00:55:36.500 Now, you've seen this, right?
00:55:37.980 I'm not the only person who's gone
00:55:39.720 through this.
00:55:40.280 You know, I call it the well.
00:55:43.200 You keep going down the well and
00:55:44.740 there's no bottom.
00:55:48.380 Anyway.
00:55:49.540 So I would argue that you should
00:55:51.440 take their strongest argument and
00:55:53.180 debunk it and then say, boom.
00:55:55.600 If your strongest argument fell
00:55:58.780 apart, we don't really need to talk
00:56:01.500 about the other ones.
00:56:02.740 Because I'm not going to let you
00:56:04.140 escape to them, because I know
00:56:05.520 you'll escape.
00:56:06.480 So the first thing you want to do
00:56:08.140 is destroy their escape
00:56:11.220 paths.
00:56:12.920 Right?
00:56:13.480 Now, you have to be a deadly debater to do this. But if you're sure you're going to win on the strongest point, make sure you've eliminated all their escape paths before you annihilate it, by getting them to agree up front: how about I just take your strongest point? And if I can debunk your strongest point, would you agree to maybe rethink the rest of it?
00:56:38.680 We don't have to do it today,
00:56:40.340 but just your strongest point.
00:56:43.460 Because if you do that, that gets
00:56:45.600 rid of their escape paths.
00:56:47.020 You say, yeah, yeah, yeah, but just for today, we'll only talk about the strongest point. And now that you see you were wrong about that, maybe you should do a little research.
00:56:56.200 That's about as far as you can go.
00:56:58.020 People can talk themselves into
00:56:59.540 things better than you can talk
00:57:01.160 them into it.
00:57:02.080 So if you sort of point them in the
00:57:03.300 right direction, sometimes they'll
00:57:04.900 walk there on their own.
00:57:07.500 All right.
00:57:08.860 I believe, I feel like there was at
00:57:10.380 least one thing that I didn't
00:57:11.720 mention today.
00:57:13.400 Looking at my notes, looking at my
00:57:14.820 notes.
00:57:17.140 Nope.
00:57:19.580 That was it.
00:57:20.320 I guess this is
00:57:22.680 complete.
00:57:25.040 I believe that once again, your
00:57:27.140 lives have been improved by the time
00:57:29.420 that we spend here together.
00:57:35.080 And how many of you just got
00:57:38.140 smarter?
00:57:41.160 Three.
00:57:41.960 Three of you.
00:57:42.760 But I'll get the rest of you next time.
00:57:45.260 I can't win them all.
00:57:46.640 Um, DeSantis is pushing back on baby
00:57:54.400 vaccines.
00:57:55.920 Yeah, I've heard some weird things on
00:57:57.660 the baby vaccine story.
00:57:59.760 Like, the weirdest one was, and I can't believe this is right, there's something about the vaccine companies not having liability unless it includes kids, unless kids are part of the program or something.
00:58:13.200 Did somebody hear that?
00:58:14.040 Like, that doesn't sound right.
00:58:19.480 I mean, that doesn't sound true.
00:58:22.780 I mean, just on the surface, that
00:58:24.000 doesn't sound true.
00:58:27.660 Scott, stop with the self-importance.
00:58:30.600 I've been advised to stop with my
00:58:32.840 self-importance.
00:58:34.420 Michael, I'm going to hide you on this
00:58:37.140 channel because you're less important
00:58:38.560 than me.
00:58:39.580 Goodbye.
00:58:39.860 I would advise all of you to ramp up
00:58:45.320 your self-importance.
00:58:48.840 Anybody who advises you to think of
00:58:50.940 yourself as less important, you really
00:58:53.620 need to remove them from your life
00:58:55.160 immediately.
00:58:57.400 Immediately remove them from your life.
00:59:00.700 So, if you'd like to talk to me about
00:59:03.620 how I'm feeling too good about my
00:59:05.520 abilities, that's great.
00:59:08.200 Just don't do it around me.
00:59:10.140 I just don't have any interest in you.
00:59:13.780 And I would advise you to maybe
00:59:16.320 think better of yourself.
00:59:20.500 I don't know which would make you
00:59:21.940 happier.
00:59:22.560 I'm no doctor.
00:59:24.180 But I'll bet you that if you learn to
00:59:26.980 think well of yourself, you'll be
00:59:29.120 happier.
00:59:30.160 I think.
00:59:31.660 I think.
00:59:32.380 I think.
00:59:34.980 Did I miss a Jordan Peterson tweet
00:59:37.500 discussion?
00:59:38.520 Oh, is there a spicy Jordan Peterson
00:59:40.860 tweet that I missed?
00:59:43.820 Did somebody fill me in?
00:59:45.800 What did Jordan Peterson say that... Oh, did he say that... All right.
00:59:58.800 We'll see if there's a Jordan Peterson
01:00:00.900 thing I need to respond to.
01:00:02.500 Asked Schellenberger about
01:00:04.620 Biden's oil lies.
01:00:06.280 Yeah.
01:00:06.760 Michael Schellenberger does the
01:00:08.460 best job of putting things in
01:00:11.480 context, especially for energy
01:00:13.300 questions and homeless stuff.
01:00:16.240 But yeah, he's been calling out
00:59:17.720 Biden for his claims about
01:00:20.520 refineries.
01:00:21.660 So was there a Jordan Peterson
01:00:27.780 tweet?
01:00:31.240 The one with Gad Saad.
01:00:32.660 Should I look it up?
01:00:33.400 It's like you're not going to tell
01:00:34.320 me.
01:00:35.560 You know, maybe I'll do that for
01:00:36.660 tomorrow.
01:00:37.160 I won't keep you.
01:00:38.320 All right.
01:00:38.620 That's all for today.
01:00:40.120 I will talk to you tomorrow.
01:00:42.620 And I'm hoping that all of you have an amazing...
01:00:48.100 Peterson said you weren't important.
01:00:50.300 What's that say?
01:00:55.840 Peterson said you weren't important
01:00:57.340 or that people weren't important.
01:00:59.600 Is that what he said?
01:01:02.880 All right.
01:01:03.420 Well, we'll figure out what he said.
01:01:05.480 And on that note, I'm apparently
01:01:07.320 babbling, even though this is
01:01:10.280 still the best experience that most
01:01:12.980 of you have ever had in your life.
01:01:15.460 And some of you are going to have
01:01:17.940 an even better experience later
01:01:19.320 today.
01:01:20.840 I don't know who you are,
01:01:22.720 but probably.
01:01:24.120 Probably.
01:01:25.380 Because that's how it works.
01:01:29.620 Yes.
01:01:30.020 An amazing day coming.
01:01:30.980 And I'll talk to you tomorrow,
01:01:32.000 YouTube.
01:01:32.640 Bye for now.
01:01:33.240 Bye.