Real Coffee with Scott Adams - June 10, 2023


Episode 2135 Scott Adams: Trump Indicted For Stuff We Can't See, Biden Bribery Allegation, Lots More


Episode Stats

Length

1 hour and 15 minutes

Words per Minute

143.781

Word Count

10,909

Sentence Count

832

Misogynist Sentences

21

Hate Speech Sentences

26


Summary

The Democrats have become more like teenagers, and the Republicans more like their parents. Tucker Carlson is enjoying his freedom, insinuating in his latest video that President Obama's personal life is not what it appears.


Transcript

00:00:00.000 Ba-ba-ba-ba, la-da-da-da.
00:00:04.700 Good morning, everybody, and welcome to the highlight of civilization.
00:00:09.640 It's called Coffee with Scott Adams.
00:00:11.500 You've never had a better time, and just wait for the fun we're about to have.
00:00:16.120 It'll be amazing. You can hardly believe it.
00:00:18.980 And if you'd like to take it up a notch, all you need is a cup or a mug or a glass,
00:00:22.500 a tank or a chalice or a stein, a canteen jug or flask, a vessel of any kind.
00:00:27.080 Fill it with your favorite liquid. I like coffee.
00:00:30.000 Join me now for the unparalleled pleasure of the dopamine hit of the day,
00:00:33.960 the thing that makes everything better.
00:00:36.540 It's called the simultaneous sip, and it happens now. Go.
00:00:43.680 Ah. Delightful.
00:00:49.820 Well, I saw a tweet, I think it might have been yesterday,
00:00:53.820 in which Elon Musk was commenting on the Target store and the LGBTQ situation,
00:01:01.680 and he said that it was too far.
00:01:06.980 They're just taking it too far.
00:01:08.380 And I love that phrase, because here's what it feels like.
00:01:14.900 See if this just feels right to you.
00:01:18.060 It feels like the Democrats are teenagers,
00:01:21.120 and the Republicans are their parents.
00:01:25.700 Just think about that for a moment.
00:01:27.360 Because the Democrats are more likely to try something that,
00:01:32.540 let's say, history suggests would not work.
00:01:36.340 What do teenagers do?
00:01:38.720 Teenagers are all about doing the thing that history has shown doesn't work.
00:01:45.100 Right?
00:01:45.300 Almost their entire theme is doing things that are a bad idea in the long run.
00:01:51.420 It's like all a teenager does.
00:01:53.820 And they're supposed to, right?
00:01:55.380 It's sort of a natural progression.
00:01:58.080 They're supposed to test all the boundaries to learn where the boundaries are.
00:02:01.300 So you expect that.
00:02:02.900 But, luckily, most teens have some kind of a parental guardrail to say,
00:02:09.980 oh, no, that's too far.
00:02:11.800 All right, well, I'll let you get away with that.
00:02:13.780 I don't like it.
00:02:15.280 But that's too far.
00:02:17.480 Right?
00:02:18.260 And I feel as if that's what the political parties have become.
00:02:22.620 Teenagers who want to try stuff,
00:02:25.000 such as, why don't we get rid of the police?
00:02:28.220 If the police are hurting people, let's get rid of them.
00:02:31.500 Does that sound like what an adult says?
00:02:35.020 I mean, really, does that sound like an adult idea?
00:02:38.520 It really doesn't.
00:02:39.620 It sounds exactly like a teenager idea.
00:02:42.800 And then you need the Republicans to say,
00:02:45.360 getting rid of police?
00:02:48.600 That's too far.
00:02:49.960 Body cams?
00:02:51.260 Okay, body cams.
00:02:52.560 Sure.
00:02:53.200 That's not too far.
00:02:54.860 How about better training for police?
00:02:57.520 Maybe.
00:02:57.940 Something like that.
00:02:58.860 Good.
00:02:59.720 That's not too far.
00:03:01.760 But there's definitely a lot of too far going on.
00:03:05.800 And I do like that framing because it just puts everything in context.
00:03:13.060 Because it allows, here's why I like it.
00:03:16.140 It allows both entities to be a productive part of the process.
00:03:20.700 I'm not sure if you caught that the way I framed it.
00:03:24.740 Just as a teenager should be testing all the limits and a parent is going to tell them
00:03:33.020 when that's too far, I don't mind, I really don't, that the Democrats might be the more
00:03:39.960 let's test this, let's push the boundary, let's try a thing that we know has never worked
00:03:45.000 before, let's see if we can figure out how to make it work this time.
00:03:49.140 I don't mind that.
00:03:50.620 That, you know, society needs people who are going to test all the boundaries all the time.
00:03:56.120 But you still need the other side to say too far.
00:03:59.700 You still need the adult supervision.
00:04:02.640 So maybe it's good that we have both.
00:04:05.020 It could be a positive about the American system that we just don't see because we're sort of
00:04:11.260 lost in the weeds.
00:04:12.200 All right, Tucker Carlson dropped his second video, was it the day before yesterday?
00:04:19.660 I think I missed it because it was my birthday or something.
00:04:23.240 But there's a little blowback.
00:04:25.280 I finally watched it.
00:04:27.780 And apparently Tucker is enjoying his freedom.
00:04:33.260 I think we can say that for sure.
00:04:35.820 Tucker's enjoying his freedom.
00:04:37.160 Because among other things, he insinuated that Obama has an interesting personal life.
00:04:45.300 I forget his exact words, but the clear indication was that he was not involved in a traditional
00:04:52.400 heterosexual marriage.
00:04:55.580 That's what I got.
00:04:57.180 Now, he didn't say that, but that's what I got.
00:05:01.120 I mean, that's what I received.
00:05:02.240 I don't know what he was sending, but that's what I received.
00:05:05.980 And apparently, you know, the Twitter is all abuzz over this, et cetera.
00:05:13.140 Now, I saw a comment that said that somebody who was a native of Chicago said that the natives
00:05:26.260 of Chicago, the people who may have known Obama when he was young, say they were all, everybody
00:05:32.540 knew he was gay.
00:05:34.400 Do you believe that?
00:05:36.820 Do you believe that the people who knew him when he was young, they all knew he was gay?
00:05:42.700 I'm not sure I would believe that.
00:05:45.720 It's a little, I don't know.
00:05:49.240 It's hard to imagine that we could get to this point without that being widespread knowledge
00:05:55.660 if it were true.
00:05:57.960 So I'm going to put a maybe on this one.
00:06:00.820 I think it's entirely true that if you, you know, threw a dart and picked any couple in
00:06:06.680 the world, that their life is not what they present.
00:06:11.340 Would you say that's fair?
00:06:12.460 You can pick any couple, political, non-political, just throw a dart, hit a random couple and
00:06:19.080 just say, oh, I'll bet that's not exactly what you're presenting to the public.
00:06:23.240 I'll bet there's more going on behind the window.
00:06:27.720 So I don't think it matters to anything.
00:06:32.500 You know, if his personal life is more interesting than we know, it doesn't matter, I don't think.
00:06:37.360 Does it?
00:06:38.420 Does it matter?
00:06:39.960 I don't think so.
00:06:40.900 But it's out there and Tucker's enjoying his freedom, I guess.
00:06:49.180 So there's allegedly a, did I mention this already?
00:06:53.260 A Canadian school board that has now ruled that the teachers and everybody in the school
00:06:59.360 must use they/them pronouns for everybody.
00:07:03.320 And there will be no more he and she and him and her.
00:07:07.880 It's all going to be they and them.
00:07:10.900 And there's a big protest against that.
00:07:14.740 Now, here's, here's what I wonder.
00:07:17.640 Is there somebody on that school board who follows me?
00:07:21.420 Because this looks a little too close to what I would have done.
00:07:25.940 And I teach people to do this, which is embrace and amplify.
00:07:32.200 Now, the embrace part is you accept the thing that you actually don't agree with, but you accept it completely.
00:07:38.800 And then you amplify it so that you're not only accepting it, but you're really accepting it.
00:07:43.560 And if you amplify something that's a bad idea, it almost always breaks or looks ridiculous.
00:07:49.720 So to me, what this looks like is that there was some kind of debate happening at the school board.
00:07:56.500 This is just speculation, pure speculation.
00:07:59.040 What it looks like to me is there was some debate and there was some conservative who said, I've got an idea.
00:08:08.820 Well, if we can't decide whether we should use proper pronouns, why don't we go all the way and get rid of pronouns?
00:08:15.700 Why don't we embrace that this is the right thing to do and then take it to the next level, right?
00:08:23.000 Let's take it to the next level.
00:08:25.300 And it looks like somebody did that intentionally just to break the system, which it did.
00:08:31.540 It totally broke the system because there's protests and everybody's mad and people are going to use the wrong pronouns and everybody's going to complain.
00:08:39.660 It should completely destroy the system.
00:08:41.660 Now, do you think somebody came up with that on their own as a protest, but not directly saying it?
00:08:50.560 Or do you think that they actually went too far?
00:08:54.900 Is this a case of actually just going too far?
00:08:58.400 Or was it a prank?
00:09:00.260 You know, or let's say a play, a persuasion play by somebody on the school board.
00:09:05.120 I'm going to go with, I think it was a play.
00:09:08.240 I think somebody was pranking.
00:09:09.980 And I think they're still pranking, and they're playing it right to the end.
00:09:17.520 And if that's true.
00:09:23.080 Slow clap.
00:09:24.480 Slow clap.
00:09:25.500 I don't know if it's true.
00:09:27.000 I'm just saying that's what it looks like from the outside.
00:09:31.720 But maybe.
00:09:32.520 San Francisco is in the process of trying to make shoplifting a lot easier and safer for the shoplifters.
00:09:43.240 Does it sound like I'm making that up?
00:09:46.000 Nope.
00:09:47.080 Nope.
00:09:47.540 There's some legislation going through the system in San Francisco that would make it illegal for store owners to try to stop shoplifters.
00:09:56.480 So you make it a lot safer for both the store owners, because they wouldn't be in as much physical danger.
00:10:03.500 And also for the shoplifters.
00:10:06.700 So that's happening.
00:10:08.980 Does that feel like it's a little bit too far?
00:10:12.620 And I have a provocative question to ask.
00:10:17.800 If you were to compare two alternative plans, which one would save the most lives?
00:10:24.260 The plan that they have, in which they're going to make it illegal for shop owners to try to stop the shoplifters.
00:10:30.420 Is that the one that will save the most lives?
00:10:33.380 And I want you to compare that to a plan that nobody's suggesting.
00:10:36.860 But suppose the plan were the opposite, and they encouraged shop owners to shoot and kill shoplifters on sight.
00:10:48.040 Compare them.
00:10:48.940 Which one would kill more people?
00:10:51.140 Well, on day one, on day one, let's say three shoplifters get shot in San Francisco.
00:10:57.420 That's three people that you didn't want to die and that would be three tragedies.
00:11:02.240 Next day, three more.
00:11:04.100 Six people killed that really didn't need to be killed and maybe they were just trying to get food, maybe just trying to feed themselves.
00:11:13.980 Six tragedies.
00:11:16.020 And let's say you take it to the end of the week and there's 25 of them.
00:11:19.720 Like 25 people get gunned down in stores.
00:11:22.920 Just in San Francisco.
00:11:24.440 Now this is just a mental process, right?
00:11:26.140 This is a mental experiment.
00:11:28.080 This didn't happen.
00:11:30.680 25 people just get gunned down.
00:11:33.040 What happens to the rate of shoplifting after that?
00:11:38.060 Probably goes down.
00:11:39.780 I'm guessing it goes down.
00:11:41.580 So you could probably solve shoplifting at the expense of, you know, let's say 10 to 25 people who didn't need to lose their lives.
00:11:49.700 But if you were to compare that 10 to 25 people, would that be more or fewer than the number who would die if you let civilization destroy itself by, you know, basically not protecting private property?
00:12:10.380 If you don't protect private property, is the long-term outcome of that more or fewer people dying?
00:12:21.920 Which one kills more people?
00:12:24.700 If I had to guess, it would be the least loss of life would be to gun down 25 people who are shoplifting.
00:12:33.300 I think that would be the least loss of life in the long run.
00:12:38.480 Because the breakdown of civilization starves everybody, right?
00:12:42.000 If you can't have urban centers, if you can't have a store to buy food, where are they going to buy food?
00:12:48.820 Where do you buy food if the stores are closed?
00:12:51.220 You know, the poor people are not going to be Amazon and DoorDashing.
00:12:56.320 So I have a feeling that actually allowing store owners to shoot to death anybody shoplifting a Twinkie would get you the lowest number of deaths in the long run.
00:13:10.740 Maybe.
00:13:12.160 Maybe.
00:13:13.400 Don't know.
00:13:13.860 All right.
00:13:15.720 But California is trying to, you know, maintain its too-far reputation.
00:13:25.320 And now parents are going to be punished for misgendering their own children.
00:13:30.540 It's already passed the state assembly.
00:13:33.040 So it's not a law yet, but it's passed the state assembly.
00:13:36.000 And the idea here is that that would be child abuse if you're misgendering your own child.
00:13:41.540 And that, therefore, that would be taken into account if, let's say, there's a divorce.
00:13:48.620 So if there's a divorce and one parent accepts the child's identification and the other says, no, you were born this way, that's what you are,
00:13:58.160 then the parent who is willing to go with the child's self-identification is more likely to get custody
00:14:05.040 because it would show child abuse by the one who is calling them by their biological, you know, whatever they look like when they were born, I guess.
00:14:17.700 But also, it looks like the, you know, health and human services could probably take your kid away for child abuse if you just kept at it.
00:14:28.160 Now, does that seem too far?
00:14:33.100 Yeah, that's too far.
00:14:34.920 This is a pretty explicit statement that the parents don't have medical decision rights over their children and that the state does.
00:14:45.640 Because I would say this is not child abuse, it's medical treatment.
00:14:51.400 It may be bad medical treatment.
00:14:53.620 It might be the wrong kind.
00:14:54.900 That would be a separate argument.
00:14:55.860 But I would say the parents, parents trying to, let's say, settle their child's gender identification,
00:15:05.300 I would consider that part of mental health.
00:15:08.660 Now, that doesn't mean they're doing it right.
00:15:11.140 I'm not saying it's helping.
00:15:12.980 I'm saying that the parents would, in this situation, the parents would be making a choice about their child's mental health,
00:15:21.220 as well as, you know, the life in general, and that they would be applying what they believe to be the best situation for the mental health of the child.
00:15:30.860 And that might involve trying to, you know, badger them into sticking with a gender that looks like their biological entity.
00:15:40.860 So, yeah, we'll talk about Jordan Peterson.
00:15:44.780 What do you think about that?
00:15:50.860 Because it seems to me that the parents need to have the medical decision right.
00:15:58.140 But you could also imagine where a crazy parent would take it too far, right?
00:16:02.700 There's going to be some crazy parent who says, I'm going to amputate their arms to make them healthier.
00:16:07.140 You know, there's always going to be somebody you don't want to make medical decisions for your kids,
00:16:12.480 but that's where the medical community itself would be the safeguard, right?
00:16:17.340 The doctor would not do the procedure.
00:16:19.540 But if you're just talking about talking to your kids, I would say that's mental health.
00:16:24.900 And how parents talk to their children or how parents raise their children is probably, I don't know,
00:16:30.360 other than organic problems, the single biggest variable in the mental health of the children, I would think.
00:16:38.700 So parents are basically mental health care providers.
00:16:44.240 They're just unlicensed.
00:16:45.340 It's just their job to take care of the mental health of their kids.
00:16:49.520 So this would take from the parents their right to decide what's in the best mental health interest of their own children.
00:17:00.360 That seems to me like a bad idea.
00:17:04.060 Seems to me like going too far.
00:17:07.060 All right.
00:17:09.520 Let's talk about Trump, who is indicted on 37 counts.
00:17:13.300 Now, of course, we all believe that this is primarily political in nature
00:17:18.140 and that whether or not Trump did things which are technically crimes,
00:17:23.520 and I'm sure he did.
00:17:25.280 I'm sure there's some technical crimes in there somewhere.
00:17:27.760 But that doesn't mean you indict.
00:17:30.360 Right?
00:17:31.440 Because there are lots of other considerations,
00:17:33.160 such as it being the person who's running for president
00:17:35.420 and was once a president,
00:17:36.960 and you don't want to start that precedent and all that.
00:17:42.540 But here are some of the things that make it political.
00:17:46.640 First of all, the 37 counts,
00:17:49.340 as we've learned from other political cases,
00:17:52.880 doesn't mean he did 37 separate crimes,
00:17:56.400 or even allegedly did, because they clump, you know,
00:18:01.260 different charges around the same activity.
00:18:03.460 So there might be several charges that get to the same activity.
00:18:07.900 So it might be like, I don't know,
00:18:10.360 half a dozen different things he's accused of doing.
00:18:12.620 But if you reported it as 37,
00:18:16.380 don't you think that dumb people will think that's worse than if you said seven?
00:18:21.920 If you said he did seven things that look illegal, or at least potentially,
00:18:26.840 versus you say there are 37 counts,
00:18:30.580 those don't sound the same.
00:18:32.520 If you say 37, it just sounds guilty.
00:18:34.980 If you said, you know, three or seven,
00:18:38.820 it wouldn't sound nearly as guilty.
00:18:40.920 And indeed, I think there are three categories.
00:18:44.800 There's, you know, did he take them?
00:18:47.320 And then did he, you know, lie about what he had?
00:18:51.580 And did he show it to anybody?
00:18:52.940 And, you know, did he not agree to give it back?
00:18:56.620 So there's like, I don't know,
00:18:58.500 several little activities that look illegal.
00:19:01.860 So, first of all, counting it as 37 counts makes it sound worse.
00:19:07.040 The second thing that makes it sound worse is the prosecutor guy, Jack Smith,
00:19:13.320 who, of course, said no one is above the law.
00:19:16.720 Now, did you know he was going to say that?
00:19:19.420 Was there any chance he wasn't going to say that?
00:19:22.140 Of course he was going to say that.
00:19:23.960 But, of course, that's a framing thing.
00:19:26.240 And as we've learned now, no one is above the law means Democrats are going after Republicans.
00:19:31.140 For being Republicans.
00:19:33.480 It doesn't mean anything except that today.
00:19:37.340 It used to mean no one is above the law.
00:19:42.000 It used to mean exactly what it says.
00:19:44.460 Not anymore.
00:19:45.880 Now it is a way to signal that you are a political animal
00:19:49.520 and you are doing a political prosecution.
00:19:52.640 Nobody needs to say it.
00:19:54.300 You don't need to say it unless you're doing something sketchy.
00:20:02.560 Does anybody need to say no one is above the law when they can show you the crime?
00:20:07.000 They don't need to do that, right?
00:20:10.420 If Trump had killed somebody and it was on video,
00:20:14.640 do you think Jack Smith would say, after we all saw a video of Trump murdering somebody,
00:20:20.000 do you think he would need to say no one is above the law?
00:20:23.540 No, he would not need to say that.
00:20:25.900 Because there wouldn't be a single person in the world who had any question
00:20:29.200 about what is the right thing to do.
00:20:31.560 That's the law.
00:20:34.060 No, you only say no one is above the law
00:20:36.320 when you're trying to pull one over on the public.
00:20:39.480 The signal is glaringly obvious.
00:20:42.640 No one is above the law means we hope that our people are above the law
00:20:46.420 and yours are not.
00:20:47.900 That's all it means.
00:20:49.520 It is such a signal for disreputable behavior.
00:20:53.060 Such a signal.
00:20:54.260 I mean, it could not be more obvious.
00:20:57.420 All right.
00:20:58.480 Here's my big question.
00:21:00.680 And I'll tell you some people who seem to be agreeing with it.
00:21:09.620 Do you think we really, as a country,
00:21:13.020 could send Trump to prison
00:21:15.360 over documents that the public is not allowed to see?
00:21:22.260 Now, I don't think anybody framed it that way until I did, you know, yesterday.
00:21:27.420 But just hold this in your head.
00:21:29.320 This is not Watergate, right?
00:21:32.680 With Watergate, we said, here's the building, here's the people, here's the break-in, that's the crime.
00:21:39.620 Everybody's looking at it.
00:21:40.920 And then even Republicans.
00:21:43.180 You know, not all of them.
00:21:44.400 But enough Republicans even said, yeah, okay, that's a crime.
00:21:48.220 No doubt about it.
00:21:49.600 We all saw it.
00:21:50.260 That's a crime.
00:21:50.760 But what happens if the public is not allowed to see the evidence, because these are secret documents?
00:21:59.100 What if they're just described to you, sort of a general way?
00:22:03.580 Oh, it was a document about an attack on a country.
00:22:07.720 You good with that?
00:22:08.800 Are you okay that the person who was the president of your country would go to prison
00:22:14.560 because somebody said they saw a document that they're describing in a general way?
00:22:21.020 No.
00:22:22.100 No.
00:22:23.060 Nope.
00:22:24.340 That is not acceptable.
00:22:26.700 That is too far.
00:22:30.560 Watergate was not too far.
00:22:32.440 They showed the evidence.
00:22:33.740 The people judged it.
00:22:34.920 The Congress judged it.
00:22:36.080 We got past it.
00:22:38.820 But if you put a living ex-president who's running for office in prison,
00:22:45.920 in prison,
00:22:48.000 and you don't let the public see the evidence,
00:22:50.740 because it's private,
00:22:52.220 it's all a little secret stuff,
00:22:54.440 no.
00:22:55.460 Let me just say this as clearly as possible.
00:22:58.940 No.
00:23:00.060 No.
00:23:01.760 No.
00:23:02.640 That's not going to happen.
00:23:03.800 No, I don't know what will happen.
00:23:06.700 But I'll tell you what's not going to happen.
00:23:08.900 You're not going to put the fucking president in jail,
00:23:12.580 let's say prison.
00:23:13.800 You're not going to put the fucking president in prison
00:23:17.240 without showing us the evidence.
00:23:20.000 Now, if you can't show us the evidence,
00:23:22.560 fuck you.
00:23:24.340 Do something else.
00:23:26.420 Figure it out.
00:23:28.060 Figure it out.
00:23:28.960 But if you don't show us the evidence,
00:23:32.520 no.
00:23:34.000 That's a hard no.
00:23:37.060 Hard no.
00:23:38.620 Let me just say it's just not going to happen.
00:23:41.160 It's fucking not going to happen.
00:23:43.400 One way or the other.
00:23:44.480 Now, if we can see the evidence,
00:23:47.660 you know, maybe it gets,
00:23:49.280 let's say the evidence is declassified.
00:23:52.400 Well, how bad was it if it could be declassified?
00:23:56.580 Seriously.
00:23:57.420 How bad was it if they could show it to us?
00:24:00.680 If they can show it to us, it's no big deal.
00:24:03.940 And if they don't show it to us,
00:24:06.040 I'm not going to be happy.
00:24:08.080 Now, there's only one way you could get past this,
00:24:10.640 in my view.
00:24:12.500 And that would be that if enough Republicans you trusted
00:24:16.120 got to look at the documents
00:24:18.600 and then came away saying,
00:24:20.360 whoops, okay, I changed my mind.
00:24:22.580 That is so bad.
00:24:24.440 That's so bad, I get it.
00:24:26.080 It's just like Watergate.
00:24:27.600 Now, I would believe,
00:24:29.880 I'll give you some examples.
00:24:31.620 If Thomas Massey looks at it and says,
00:24:34.020 okay, this is bad.
00:24:36.460 This has to be dealt with with the legal system.
00:24:39.720 I would trust him.
00:24:41.400 I would trust him.
00:24:42.840 If Matt Gaetz looks at it and says,
00:24:45.540 yeah, he's my best friend,
00:24:46.900 but honestly, this is bad.
00:24:49.640 I would trust him.
00:24:51.360 I would trust that.
00:24:52.740 If Tom Cotton looks at it,
00:24:55.880 says, no,
00:24:57.140 you know, this is just bad.
00:24:58.920 You've got to do something.
00:24:59.720 But I would trust him.
00:25:01.600 So, you know,
00:25:03.040 some of the other Republicans
00:25:05.640 are a little more political animals,
00:25:07.940 if you know what I mean.
00:25:09.880 They've shown a, let's say,
00:25:12.000 a history of pushing the political angle
00:25:15.120 a little harder than the factual angle.
00:25:17.780 So I'm not sure I would believe in Jim Jordan.
00:25:20.880 And, you know, I like Jim Jordan.
00:25:22.460 I don't have any problem with him.
00:25:23.700 But he's a more political animal.
00:25:27.300 Would you agree?
00:25:29.160 Yeah.
00:25:30.380 Yeah, Lindsey Graham might be a political animal.
00:25:33.160 There are some people in Congress
00:25:34.600 that have a history of independent opinion and thought.
00:25:40.340 And if several of those say,
00:25:43.220 yeah, this is bad
00:25:44.200 and this is certainly something
00:25:46.000 the legal system should handle,
00:25:47.860 I would be willing to go with that.
00:25:50.380 But do you think that could happen?
00:25:52.520 Rand Paul's another one.
00:25:53.640 Yeah.
00:25:53.900 If Rand Paul looked at it and said,
00:25:55.680 nope,
00:25:56.500 yeah, this is bad.
00:25:57.860 We're not going to treat this
00:25:59.000 like a political problem.
00:26:00.520 I would trust it.
00:26:03.280 Ted Cruz,
00:26:04.180 I would trust Ted Cruz.
00:26:06.080 Yeah.
00:26:06.340 I would totally trust Ted Cruz.
00:26:11.620 Romney might be a different animal
00:26:13.280 because he might want to be president.
00:26:16.420 A little less trust there.
00:26:18.360 But I see where you're going with that.
00:26:21.520 All right.
00:26:21.940 Well, to this point,
00:26:23.080 Kari Lake gave a speech
00:26:24.940 in which she said the following
00:26:26.980 and she got
00:26:27.680 a standing ovation, I think.
00:26:31.020 She said,
00:26:31.520 if you want to get to President Trump,
00:26:32.960 you're going to have to go through me
00:26:34.380 and 75 million Americans just like me.
00:26:37.060 And most of us are card-carrying members
00:26:39.200 of the NRA.
00:26:40.640 And she said,
00:26:41.580 that's not a threat.
00:26:42.820 That's a public service announcement.
00:26:47.660 She's very good at this.
00:26:50.220 I don't know if I've ever mentioned this.
00:26:52.080 She's not my choice to be vice president.
00:26:54.480 But she's very good at this
00:26:56.780 whole communicating thing.
00:26:58.620 She's really good at it.
00:27:00.420 And I love that she closed it down with this,
00:27:03.000 it's not a threat,
00:27:03.780 it's a public service announcement.
00:27:06.180 Because that's the same framing
00:27:09.400 that I try to use.
00:27:12.120 The correct framing,
00:27:13.660 the one that's productive,
00:27:14.780 is if you're on the other side
00:27:17.800 from whatever,
00:27:19.560 it doesn't matter what the issue is,
00:27:20.720 if you're on the other side,
00:27:22.320 isn't it important to you
00:27:23.700 to know where the limit is?
00:27:26.780 To know what's too far?
00:27:28.880 It's very important for everybody
00:27:30.640 to know how far you can push
00:27:32.900 the other side.
00:27:34.780 And Kari Lake
00:27:35.680 just put it out there
00:27:37.040 as clearly as you can.
00:27:39.800 There are 75 million people
00:27:41.500 who own guns.
00:27:44.100 You better not fuck this up.
00:27:46.620 Right?
00:27:47.220 I don't think it's a threat.
00:27:49.000 I don't take it as a threat.
00:27:50.640 It's not like a plan
00:27:52.520 to do something.
00:27:53.520 But it's a fact.
00:27:56.680 It's a fact that everybody
00:27:58.580 has to deal with
00:27:59.500 that there are 75 million
00:28:01.560 armed people
00:28:02.420 watching this very carefully.
00:28:04.680 And they're not happy.
00:28:06.520 They're not happy at all.
00:28:09.080 Now,
00:28:10.040 obviously,
00:28:10.740 the best way to solve it
00:28:11.760 would be
00:28:12.400 an election
00:28:13.420 in which he gets elected
00:28:14.600 and pardons himself.
00:28:17.360 That would be one way to do it.
00:28:18.840 So guns would not be,
00:28:21.880 you know,
00:28:22.820 the first, second,
00:28:24.120 third, or fourth resort.
00:28:26.480 Right?
00:28:26.900 We have much more,
00:28:28.460 you know,
00:28:28.660 many more tools.
00:28:29.560 You don't need the guns.
00:28:30.860 But I liked the fact
00:28:33.140 that none of it
00:28:34.600 would be possible
00:28:35.340 without the guns.
00:28:38.000 That's my opinion.
00:28:39.840 Everything that we can do
00:28:41.180 in a legal sense
00:28:42.500 that would make this situation
00:28:44.760 more palatable
00:28:45.440 for the whole country,
00:28:46.320 that's only because
00:28:48.280 we have guns.
00:28:49.800 If we didn't have the guns,
00:28:51.000 the legal stuff wouldn't work.
00:28:55.320 Nancy Mace
00:28:56.240 commented on Hillary Clinton,
00:28:58.180 who posted on Twitter
00:28:59.660 a picture of herself
00:29:01.340 with a hat.
00:29:02.040 I guess she had,
00:29:03.040 she wore it
00:29:03.660 in the prior election.
00:29:05.680 And the hat just says,
00:29:07.520 but her emails,
00:29:09.000 you know,
00:29:09.220 because a lot of the Republicans
00:29:10.240 are saying,
00:29:10.800 but, but,
00:29:11.960 if Trump is guilty,
00:29:13.040 what about,
00:29:14.100 what about her emails?
00:29:16.320 And so Hillary
00:29:18.160 was mocking that
00:29:19.100 with the hat that says,
00:29:20.340 but her emails.
00:29:21.780 And then Representative
00:29:22.720 Nancy Mace,
00:29:23.600 who I'm liking more
00:29:24.400 every day,
00:29:25.640 she tweets this.
00:29:27.760 And here's why I love it.
00:29:29.660 You're not supposed
00:29:30.500 to talk like this
00:29:31.380 if you're a representative
00:29:32.360 of the United States.
00:29:34.520 But sometimes,
00:29:36.180 the salty language
00:29:37.660 is the only language
00:29:38.740 that gets right in there.
00:29:41.380 Like,
00:29:41.640 sometimes the generic language
00:29:43.280 is just going to
00:29:44.080 wash off
00:29:44.800 or, you know,
00:29:45.680 glance off.
00:29:46.840 But,
00:29:47.220 but here's some salty language
00:29:48.680 from Representative Nancy Mace
00:29:50.620 that I highly endorse.
00:29:53.580 I highly endorse
00:29:54.440 her choice of words.
00:29:55.900 And in a tweet
00:29:56.960 about Hillary's hat
00:30:00.120 that says,
00:30:00.660 but her emails,
00:30:01.580 she said,
00:30:02.200 gloating that there is
00:30:03.600 a different standard
00:30:04.460 of justice
00:30:05.160 and that you are
00:30:06.280 above the law
00:30:07.080 is next level bullshit.
00:30:08.460 Gloating that
00:30:12.920 there are two standards
00:30:14.160 and you're above the law
00:30:15.960 is next level bullshit.
00:30:18.860 And I don't think
00:30:21.920 you could write
00:30:22.720 a thousand tweets
00:30:23.900 on that point
00:30:25.240 and hit it
00:30:26.700 as perfectly.
00:30:28.360 There are no wasted words
00:30:29.900 in that sentence.
00:30:31.240 Gloating that there is
00:30:32.140 a different standard
00:30:33.000 of justice
00:30:33.740 and that you are
00:30:34.780 above the law
00:30:35.560 is next level bullshit.
00:30:38.460 Now that's the representative
00:30:41.600 I want, right?
00:30:43.200 Don't you want more
00:30:43.960 of that in Congress?
00:30:45.680 Yeah, give me more of that.
00:30:49.020 All right,
00:30:49.560 here's an interesting question.
00:30:51.100 If Trump is,
00:30:52.700 let's say,
00:30:53.060 indicted,
00:30:56.340 or, I think this is unlikely,
00:30:58.920 but say he was
00:30:59.740 in so much legal trouble,
00:31:01.040 and he knew it,
00:31:02.500 that he decided
00:31:03.860 to stop fighting it
00:31:05.140 and make a deal.
00:31:06.220 And the deal was
00:31:08.180 if you drop
00:31:08.720 all the charges,
00:31:09.900 I won't run
00:31:10.800 for president.
00:31:12.280 Do you think
00:31:12.680 that could happen?
00:31:14.160 Do you think
00:31:14.820 the legal jeopardy
00:31:15.980 could ever rise
00:31:16.820 to the level
00:31:17.300 where Trump,
00:31:18.500 who's, you know,
00:31:20.160 the consummate fighter
00:31:20.840 but also a deal maker,
00:31:22.100 would make that deal?
00:31:23.720 So Trump is the
00:31:26.180 never stop fighting guy,
00:31:28.720 but also the deal maker.
00:31:28.720 So you could be surprised,
00:31:30.440 right?
00:31:31.160 It might be time
00:31:32.000 for a deal.
00:31:32.420 So if he made a deal
00:31:34.740 to not run
00:31:35.880 in return for charges
00:31:37.880 being dropped,
00:31:38.940 which would be
00:31:39.840 the creepiest deal
00:31:40.940 in the world,
00:31:41.960 I would hate it
00:31:43.200 because they shouldn't
00:31:45.240 be related,
00:31:46.000 right?
00:31:47.100 Isn't the whole point
00:31:48.320 that this is not
00:31:49.060 supposed to be political?
00:31:51.400 It's supposed to be,
00:31:52.460 oh,
00:31:52.720 nobody's above the law.
00:31:54.400 Nobody's above the law.
00:31:55.960 Oh,
00:31:56.580 that's what it's supposed
00:31:57.740 to be,
00:31:58.040 right?
00:31:58.220 But if they were
00:31:59.620 to drop the legal stuff
00:32:01.220 in return for him
00:32:03.000 not running,
00:32:03.860 that would prove
00:32:05.020 it was never
00:32:05.540 about the law
00:32:06.520 or who's above the law.
00:32:08.380 It would be
00:32:08.880 a political thing.
00:32:10.540 So they can't even
00:32:11.780 make the deal.
00:32:13.360 But let's imagine
00:32:15.200 that something
00:32:15.940 in this process
00:32:16.800 takes Trump out
00:32:18.100 before the primary.
00:32:19.640 Just imagine that.
00:32:21.560 Under that scenario,
00:32:22.840 would Biden drop out?
00:32:25.240 Because Biden
00:32:26.200 is the Trump killer.
00:32:28.220 If you don't need
00:32:28.840 a Trump killer
00:32:29.560 because you already
00:32:31.640 killed him,
00:32:33.040 would they say,
00:32:34.100 all right,
00:32:34.340 we don't need Joe
00:32:35.120 and he's not quite there
00:32:37.360 so we'll try
00:32:38.520 something else?
00:32:39.720 I don't know.
00:32:40.660 It's a possibility,
00:32:41.640 isn't it?
00:32:42.600 I do think that
00:32:44.120 the Biden-Trump
00:32:45.880 connection is so strong
00:32:48.340 that as goes one,
00:32:49.800 so may go the other.
00:32:51.520 That they might be
00:32:52.280 connected so that
00:32:53.240 their fates are now
00:32:54.200 together.
00:32:56.280 It's possible.
00:32:56.720 Joe Moore
00:32:58.900 had that idea
00:32:59.700 on Twitter.
00:33:02.400 All right.
00:33:05.980 Apparently,
00:33:06.780 Comer and Grassley
00:33:09.500 are going to get
00:33:10.040 their FBI document.
00:33:12.080 They've already seen it,
00:33:13.220 but they want
00:33:13.660 others in Congress
00:33:14.940 to see it.
00:33:15.900 So allegedly,
00:33:16.720 there's a document
00:33:17.940 from a whistleblower
00:33:18.940 saying that
00:33:20.400 there is evidence
00:33:21.460 of a $5 million
00:33:23.220 bribe
00:33:24.200 to the Bidens
00:33:25.540 in return for,
00:33:27.660 I don't know,
00:33:28.120 some policy decisions.
00:33:30.700 Now,
00:33:31.180 we don't know
00:33:31.540 the details.
00:33:32.120 We don't know
00:33:32.460 if it's real.
00:33:33.360 We don't know
00:33:33.720 if it's credible.
00:33:36.160 And somebody
00:33:36.980 who saw it,
00:33:37.760 I think Grassley,
00:33:38.540 was asked
00:33:39.060 if it
00:33:40.200 clearly indicates
00:33:43.100 a crime.
00:33:43.640 And he did not
00:33:45.940 say yes.
00:33:46.600 So I'm pretty sure
00:33:50.080 we may never see it.
00:33:54.160 And by the way,
00:33:54.680 let me be fair.
00:33:56.360 If they go after
00:33:57.780 Biden
00:33:58.240 based on a document
00:33:59.540 the public can't see,
00:34:01.180 that's no good.
00:34:02.960 Would you agree?
00:34:04.200 Same problem.
00:34:05.820 I'm 100%
00:34:06.880 against,
00:34:07.560 you know,
00:34:08.720 indicting
00:34:09.360 or arresting
00:34:10.300 Joe Biden
00:34:10.940 if it's going to be
00:34:12.900 over a document
00:34:13.660 the public can't see.
00:34:14.880 The public
00:34:16.920 has to see that
00:34:17.840 or else
00:34:19.140 you've got to
00:34:19.780 fuck off.
00:34:21.160 It's the same standard.
00:34:22.940 I get that
00:34:23.900 there are secrets,
00:34:25.300 but you can't
00:34:26.160 take out my president,
00:34:27.320 whether it's
00:34:27.780 President Biden
00:34:28.500 or President Trump.
00:34:29.920 You can't take out
00:34:30.640 my president
00:34:31.280 with secret documents.
00:34:33.380 I don't care
00:34:33.820 which president it is.
00:34:35.000 No secret documents.
00:34:36.720 That's too far.
00:34:38.700 So,
00:34:39.280 presumably,
00:34:40.460 if it ever reached
00:34:41.560 the point
00:34:42.040 of being a legal process,
00:34:43.580 this alleged
00:34:45.140 $5 million bribe thing,
00:34:47.680 I assume we'd see it.
00:34:49.460 I guess we'd see it
00:34:50.320 at that point.
00:34:51.500 But I'm just not cool
00:34:54.520 with accusations
00:34:55.900 over secret documents.
00:34:57.560 So,
00:34:57.820 I'm not too happy
00:34:58.480 with Comer
00:34:59.080 and Grassley,
00:35:00.440 even though
00:35:00.940 this might be legitimate.
00:35:02.440 This might be
00:35:02.960 totally legitimate.
00:35:04.040 I don't know.
00:35:04.840 But I'm just not happy
00:35:05.660 about all the
00:35:06.180 secret document stuff.
00:35:07.460 This is Schiff stuff.
00:35:09.640 Oh,
00:35:10.020 I saw it in the SCIF.
00:35:11.780 Trust me,
00:35:12.440 I saw it in the SCIF.
00:35:13.580 Now,
00:35:14.000 what Comer
00:35:14.420 and Grassley
00:35:14.960 are doing correctly
00:35:15.720 is they're trying
00:35:17.480 to make sure
00:35:18.180 that their colleagues
00:35:19.120 see it
00:35:19.560 so that they are
00:35:20.680 not the ones
00:35:21.220 responsible
00:35:21.760 for the interpretation
00:35:22.720 of it.
00:35:23.560 Because a few of them
00:35:24.300 did see it.
00:35:25.380 And that's good.
00:35:26.380 That's a good instinct.
00:35:27.480 But it doesn't get you
00:35:29.400 all the way there.
00:35:30.500 But I do like
00:35:31.180 that they're
00:35:31.680 trying to broaden access.
00:35:33.040 I guess on Monday
00:35:33.900 some people will see it.
00:35:35.140 I guess we'll get
00:35:35.680 some more opinions.
00:35:37.100 The thing I would
00:35:37.820 be looking for
00:35:38.760 is to see
00:35:40.160 if there's any
00:35:40.820 Republican
00:35:41.500 who sees it
00:35:42.460 who breaks ranks
00:35:44.160 and says,
00:35:46.120 okay,
00:35:46.400 I saw it
00:35:47.120 and, you know,
00:35:48.160 it's a disturbing
00:35:48.860 allegation
00:35:49.440 but there's not
00:35:50.100 enough there.
00:35:52.340 Yeah,
00:35:52.520 somebody like Romney.
00:35:53.760 You know,
00:35:54.040 you can imagine
00:35:54.560 somebody like Romney
00:35:55.460 looking at it
00:35:56.020 and saying,
00:35:56.280 yeah,
00:35:56.380 it's disturbing
00:35:57.000 but it doesn't
00:35:58.980 quite connect
00:35:59.720 the dots.
00:36:01.040 It's not
00:36:01.580 quite there.
00:36:02.340 It's more like
00:36:02.960 a suggestion
00:36:03.980 of a crime,
00:36:04.700 not really evidence.
00:36:07.140 If that happens
00:36:08.020 then I'm
00:36:08.640 inclined to think
00:36:10.860 there's nothing there.
00:36:12.440 But if you get
00:36:13.300 100% of the Republicans
00:36:14.880 saying,
00:36:15.480 holy cow,
00:36:17.500 well then I'd
00:36:18.000 start worrying.
00:36:19.440 And you might
00:36:19.960 see that.
00:36:22.840 All right,
00:36:23.700 Meta seems to be
00:36:24.600 launching a
00:36:25.720 Twitter alternative.
00:36:28.400 Does that make
00:36:28.980 sense to you?
00:36:31.300 That
00:36:31.660 Meta slash
00:36:33.820 Facebook would be
00:36:34.780 launching a
00:36:35.420 Twitter alternative?
00:36:37.660 Apparently the
00:36:38.380 news is real.
00:36:39.380 But it sounds
00:36:40.180 to me like
00:36:40.660 they're trying
00:36:41.020 to make a
00:36:41.540 Democrat
00:36:41.980 Twitter.
00:36:44.600 I don't think
00:36:45.500 they're trying
00:36:45.900 to compete.
00:36:47.620 I think they're
00:36:48.500 trying to make
00:36:49.020 a Democrat
00:36:49.540 Twitter.
00:36:50.800 Then just
00:36:51.360 suck off all
00:36:52.020 the Twitter
00:36:53.280 people from
00:36:54.100 the Musk
00:36:55.280 Twitter.
00:36:56.260 I think
00:37:01.080 that Zuckerberg
00:37:01.880 might be
00:37:02.380 trying to
00:37:02.800 kill Twitter
00:37:03.520 more than
00:37:04.940 he's trying
00:37:05.380 to make
00:37:05.720 money for
00:37:06.180 Facebook.
00:37:08.420 What do
00:37:08.960 you think?
00:37:09.920 I think
00:37:10.360 Zuckerberg,
00:37:11.100 this is just
00:37:11.460 a guess,
00:37:12.200 I don't have
00:37:12.660 any inside
00:37:13.140 information,
00:37:14.060 but I feel
00:37:14.420 like Zuckerberg
00:37:15.180 is so
00:37:16.160 connected to
00:37:16.800 the Democrat
00:37:17.360 world that
00:37:18.800 he would be
00:37:19.320 seen as a
00:37:20.020 star if he
00:37:20.640 could kill
00:37:21.060 Twitter,
00:37:21.460 even if
00:37:22.880 Meta
00:37:23.400 didn't make
00:37:23.880 any money
00:37:24.280 from it.
00:37:26.240 So that's
00:37:26.940 what it
00:37:27.180 looks like.
00:37:28.580 Well,
00:37:29.020 and then the
00:37:29.900 rumor is
00:37:30.440 that this
00:37:31.480 so-called
00:37:32.040 Twitter
00:37:32.720 competitor
00:37:33.800 would try
00:37:34.580 to get
00:37:34.900 Oprah and
00:37:35.720 the Dalai
00:37:36.280 Lama
00:37:36.580 involved.
00:37:38.840 So what
00:37:39.400 did Elon Musk
00:37:40.140 tweet about
00:37:40.860 this story,
00:37:41.940 in which the
00:37:42.580 Dalai Lama,
00:37:43.900 who once said
00:37:44.660 to a child,
00:37:45.680 suck my tongue,
00:37:47.420 which as it
00:37:48.200 turns out is
00:37:48.840 something Tibetans
00:37:49.680 actually say.
00:37:50.900 It was not
00:37:51.520 the awful
00:37:53.400 thing that
00:37:54.060 your culture
00:37:55.340 suggests.
00:37:56.760 Apparently it's
00:37:57.480 like a standard
00:37:58.260 saying.
00:37:59.880 And the
00:38:00.480 saying goes
00:38:01.000 like this,
00:38:02.200 I've given
00:38:03.200 you everything
00:38:03.640 I can give
00:38:04.280 you,
00:38:05.420 except you
00:38:06.100 could eat
00:38:06.680 my tongue,
00:38:07.200 I guess.
00:38:08.620 So it's
00:38:09.120 more like
00:38:09.640 a cannibal
00:38:10.640 thing than
00:38:11.200 a sexual
00:38:11.720 thing.
00:38:12.880 So anyway,
00:38:15.240 that story
00:38:15.640 exists.
00:38:16.700 So when
00:38:16.880 Musk sees
00:38:17.600 the story
00:38:18.000 about
00:38:18.420 Meta launching
00:38:19.440 a Twitter
00:38:19.940 substitute
00:38:20.520 that might
00:38:22.000 feature the
00:38:22.520 Dalai Lama
00:38:23.120 as some
00:38:23.620 featured user
00:38:24.620 or something,
00:38:25.680 Musk tweets,
00:38:27.340 Zuck my
00:38:28.040 tongue.
00:38:30.220 Zuck my
00:38:31.080 tongue,
00:38:31.680 as in
00:38:32.440 Zuckerberg.
00:38:34.380 And that's
00:38:35.120 his whole
00:38:35.440 tweet.
00:38:36.300 That's his
00:38:36.900 entire commentary
00:38:37.760 on this
00:38:38.260 situation.
00:38:39.700 Zuck my
00:38:40.500 tongue.
00:38:40.760 I don't
00:38:45.800 know who
00:38:46.160 is the
00:38:46.540 best
00:38:46.880 tweeter
00:38:47.600 in the
00:38:48.020 world.
00:38:49.080 It used
00:38:49.520 to be
00:38:49.800 Trump,
00:38:51.000 but I
00:38:51.620 think it's
00:38:52.220 Musk.
00:38:53.640 I think he's
00:38:54.140 the best
00:38:54.400 tweeter in
00:38:54.920 the world
00:38:55.160 right now.
00:38:56.540 All right,
00:38:56.920 here's a
00:38:57.380 little bit
00:38:57.680 on AI.
00:39:00.240 So OpenAI
00:39:02.140 CEO
00:39:03.220 Sam Altman
00:39:04.960 was in
00:39:05.560 India
00:39:05.940 talking to
00:39:06.720 a group
00:39:07.140 of investor
00:39:07.740 types.
00:39:09.080 And they
00:39:09.940 asked him
00:39:10.380 about India
00:39:12.320 creating its
00:39:13.140 own AI.
00:39:14.840 And then
00:39:15.200 Sam Altman
00:39:17.160 said,
00:39:18.060 you can
00:39:18.760 try, but
00:39:19.220 it's hopeless.
00:39:20.100 You'll never
00:39:20.560 catch up to
00:39:22.080 us.
00:39:22.760 You might
00:39:23.240 as well
00:39:23.420 just give
00:39:23.880 up.
00:39:26.760 And he
00:39:27.380 says it
00:39:27.700 with a
00:39:28.000 straight face
00:39:28.620 with a
00:39:29.580 little bit
00:39:30.000 of a
00:39:30.240 smile.
00:39:31.580 And it's
00:39:32.000 being reported
00:39:32.700 as news,
00:39:34.660 of course.
00:39:35.180 But he
00:39:37.300 also said,
00:39:39.040 we expect
00:39:40.540 you to
00:39:41.400 try, but
00:39:42.740 you can't
00:39:43.220 do it.
00:39:43.760 It's
00:39:44.160 impossible.
00:39:44.720 You can't
00:39:45.040 catch up.
00:39:46.520 Now, how
00:39:46.920 do you feel
00:39:47.240 about that?
00:39:48.960 Well, no,
00:39:49.880 it wasn't
00:39:50.320 racist.
00:39:51.420 The reason
00:39:51.880 you know
00:39:52.220 it's not
00:39:52.580 racist is
00:39:54.600 because Silicon
00:39:55.480 Valley has
00:39:56.520 so many
00:39:57.020 Indian
00:39:57.740 engineers
00:39:58.880 who are
00:39:58.880 considered
00:39:59.760 among the
00:40:00.260 best.
00:40:01.180 And Sam
00:40:02.420 Altman
00:40:02.740 lives and
00:40:03.440 works in
00:40:04.420 that world.
00:40:05.180 No, it's
00:40:05.740 not racist.
00:40:06.980 Sam
00:40:07.220 Altman
00:40:07.460 knows that
00:40:08.100 there are
00:40:09.160 super smart
00:40:09.760 people in
00:40:10.240 India who
00:40:10.680 can do
00:40:10.960 anything.
00:40:12.320 All right.
00:40:14.040 I think it
00:40:14.780 was just
00:40:15.100 competitive.
00:40:16.700 I think he
00:40:17.400 was just
00:40:17.780 trying to
00:40:18.700 warn them
00:40:19.120 off so
00:40:19.500 they couldn't
00:40:19.860 get funding.
00:40:21.580 To me, it
00:40:22.260 just looked
00:40:22.620 like a
00:40:23.020 competitive
00:40:23.440 move.
00:40:24.100 It's
00:40:24.320 possible he
00:40:24.980 believes it
00:40:25.500 as well.
00:40:26.500 I won't
00:40:27.040 say that he
00:40:27.580 doesn't believe
00:40:28.160 it, but it
00:40:29.740 sounded more
00:40:30.260 like a fun,
00:40:31.900 interesting,
00:40:32.500 competitive thing
00:40:33.220 to say.
00:40:33.720 Like, don't
00:40:34.300 even try.
00:40:35.560 I expect
00:40:36.100 you'll try.
00:40:37.080 Of course
00:40:37.520 you'll try,
00:40:38.420 but you
00:40:38.900 have no
00:40:39.260 hope.
00:40:40.780 It sounds
00:40:41.240 like something
00:40:41.640 Trump would
00:40:42.140 say.
00:40:43.240 All right,
00:40:43.660 but here's
00:40:44.600 something else
00:40:45.040 that Sam
00:40:45.440 Altman said
00:40:46.100 that mirrors
00:40:48.360 something I've
00:40:49.060 been saying
00:40:49.420 for a while.
00:40:51.080 And he
00:40:51.700 was asked,
00:40:52.860 after working
00:40:53.680 on AI for
00:40:54.360 so long,
00:40:55.140 what have
00:40:55.420 you learned
00:40:55.820 about humans?
00:40:58.120 What have
00:40:58.620 you learned
00:40:58.900 about humans
00:40:59.720 after working
00:41:00.720 on AI?
00:41:01.120 Do you
00:41:01.960 remember my
00:41:02.480 prediction that
00:41:03.940 the biggest
00:41:04.380 shocker about
00:41:05.300 AI would
00:41:05.940 not be
00:41:06.300 about AI?
00:41:08.180 It's that
00:41:08.980 AI would
00:41:09.820 reproduce human
00:41:10.940 intelligence,
00:41:12.000 and then we
00:41:12.740 would find out
00:41:13.480 how uninteresting
00:41:14.520 human intelligence
00:41:15.520 is.
00:41:17.480 Consciousness
00:41:18.080 is next, we
00:41:18.960 don't have that
00:41:19.480 yet, but
00:41:21.160 we've already
00:41:21.740 reproduced something
00:41:22.980 like intelligence.
00:41:24.880 And so,
00:41:25.860 Sam Altman
00:41:26.420 says, quote,
00:41:27.140 I grew up
00:41:30.040 thinking that
00:41:30.040 intelligence is
00:41:30.860 something magical
00:41:31.640 that is uniquely
00:41:32.940 human, but now
00:41:34.560 I think it's a
00:41:35.300 fundamental property
00:41:36.420 of matter.
00:41:39.060 That's as
00:41:39.780 opposite as you
00:41:40.740 can get from
00:41:41.340 magic.
00:41:42.840 It's a property
00:41:44.000 of matter.
00:41:46.140 That if you
00:41:47.020 simply organize
00:41:48.140 things in the
00:41:50.300 right way,
00:41:51.280 it causes
00:41:51.660 intelligence.
00:41:53.140 Doesn't require
00:41:53.900 a soul,
00:41:55.240 doesn't require
00:41:56.040 a god,
00:41:57.140 it's just
00:41:57.700 things.
00:41:58.560 You just put
00:41:59.080 things in a
00:41:59.720 certain combination
00:42:00.560 and you've got
00:42:00.980 intelligence.
00:42:02.920 Now, that's
00:42:03.600 not consciousness.
00:42:05.940 Not consciousness,
00:42:07.040 and then
00:42:07.280 sentience is weird,
00:42:08.300 so I'll just say
00:42:08.820 consciousness.
00:42:10.160 Yeah.
00:42:11.120 Now, do you
00:42:13.420 recall that this
00:42:14.140 was my exact
00:42:15.240 prediction?
00:42:16.240 Does anybody
00:42:16.640 recall that?
00:42:17.560 This was my
00:42:18.180 exact prediction
00:42:19.100 that we would
00:42:21.240 find out
00:42:21.660 intelligence was
00:42:22.480 not special.
00:42:24.840 Okay.
00:42:25.340 The people
00:42:25.880 on Locals remember
00:42:26.560 me saying
00:42:27.000 that.
00:42:27.500 And here
00:42:27.760 it is.
00:42:29.000 Here's the
00:42:29.580 guy who
00:42:29.980 would know
00:42:30.420 more than
00:42:30.860 anybody, saying
00:42:31.420 that he's proven
00:42:33.400 that intelligence
00:42:34.060 is just a
00:42:34.760 combination of
00:42:35.500 matter.
00:42:37.280 Now, I knew
00:42:38.140 that before AI
00:42:39.080 because that's
00:42:40.400 what hypnotists
00:42:41.060 know.
00:42:42.640 So you're
00:42:43.420 going to find
00:42:43.780 out with AI
00:42:44.680 a lot of
00:42:45.200 things that
00:42:45.600 hypnotists have
00:42:46.360 known forever
00:42:46.940 because hypnotists
00:42:48.860 treat the mind
00:42:50.120 as just a
00:42:51.700 moist machine
00:42:52.640 that
00:42:53.960 just works
00:42:54.380 like a
00:42:54.660 machine.
00:42:55.980 And it's
00:42:57.320 sort of a
00:42:58.700 pattern recognition
00:42:59.540 machine, but
00:43:00.220 it's not good
00:43:00.860 at it, which
00:43:02.000 is how hypnosis
00:43:02.880 works.
00:43:04.180 You use their
00:43:04.860 pattern recognition
00:43:05.700 and you use the
00:43:07.000 imperfections of
00:43:08.340 intelligence to
00:43:10.240 reprogram it, in a
00:43:12.400 sense.
00:43:14.120 Well, I think
00:43:16.240 there's a bigger
00:43:16.780 shock than that
00:43:17.660 coming.
00:43:18.540 So maybe we
00:43:20.120 were ready for
00:43:20.780 that.
00:43:21.840 Maybe that went
00:43:22.680 down okay, that
00:43:23.740 you can create a
00:43:24.480 machine that has
00:43:25.100 intelligence.
00:43:25.940 It's still
00:43:26.420 mind-blowing.
00:43:27.720 It's mind-blowing,
00:43:28.960 but it's not
00:43:29.880 shaking you to
00:43:30.700 your core.
00:43:31.680 It's just sort
00:43:32.240 of mind-blowing.
00:43:33.860 But what happens
00:43:34.980 when it gets
00:43:35.460 consciousness?
00:43:37.380 When the
00:43:38.040 machines get
00:43:38.640 consciousness, and
00:43:39.860 they will, I
00:43:41.320 guarantee it.
00:43:43.120 What's that do
00:43:43.840 to your religion?
00:43:47.040 Have we thought
00:43:47.880 this through?
00:43:49.880 Because religion
00:43:50.740 cannot survive AI
00:43:52.480 with a
00:43:52.900 consciousness.
00:43:54.680 Because it's
00:43:55.300 going to look
00:43:55.740 like a soul.
00:43:58.360 It's going to
00:43:58.820 be, well, okay,
00:44:00.360 I'll give you
00:44:01.080 that religion
00:44:01.620 will survive.
00:44:03.700 Yeah, I'll give
00:44:04.220 you that.
00:44:04.620 That was a
00:44:04.960 little bit
00:44:05.220 of hyperbole.
00:44:05.940 Religion will
00:44:06.640 survive, of
00:44:07.220 course.
00:44:07.860 But it might
00:44:08.280 have to do a
00:44:09.820 little quick,
00:44:13.280 I don't know,
00:44:16.960 reframing
00:44:18.100 to allow that
00:44:21.140 new knowledge
00:44:21.860 into the
00:44:23.320 larger belief
00:44:24.120 system.
00:44:25.760 Do you think
00:44:26.740 that if machines
00:44:27.580 have consciousness
00:44:28.460 and we all
00:44:29.460 agree that
00:44:30.340 that's
00:44:30.940 consciousness,
00:44:31.920 you don't
00:44:32.220 think that'll
00:44:32.720 have an impact
00:44:34.760 on people's
00:44:35.380 belief systems?
00:44:37.420 The religion
00:44:38.240 will survive,
00:44:39.280 but individuals,
00:44:40.480 I think,
00:44:40.820 will drop out
00:44:42.100 of the religion.
00:44:42.680 Because it
00:44:44.300 will look like
00:44:45.060 religion was
00:44:45.780 trying to
00:44:46.160 explain something
00:44:47.000 that now
00:44:47.720 science has
00:44:48.580 explained.
00:44:50.420 Because religion
00:44:51.180 is filling in
00:44:51.780 those holes that
00:44:52.480 science can't
00:44:53.120 fill, right?
00:44:54.860 But what
00:44:55.400 happens when
00:44:55.860 science fills in
00:44:56.640 the hole?
00:44:57.220 Oh, here's
00:44:57.760 your consciousness.
00:45:01.120 I just built
00:45:01.700 something with
00:45:02.160 consciousness so
00:45:02.900 we know that's
00:45:03.380 not special now.
00:45:05.480 I think people
00:45:06.360 have to either
00:45:07.840 change their
00:45:08.400 religion or
00:45:08.960 change how
00:45:09.400 they think of
00:45:09.940 their religion or
00:45:10.900 how they think
00:45:11.460 of the specialness
00:45:12.600 of themselves.
00:45:13.700 Or they might
00:45:14.600 decide that God
00:45:15.380 had created AI
00:45:16.400 through humans.
00:45:18.840 He just used
00:45:19.560 humans as his
00:45:20.540 conduit to
00:45:21.940 create another
00:45:22.560 life form and
00:45:23.380 it's all God.
00:45:25.300 Maybe.
00:45:25.860 Maybe that's how
00:45:26.460 people will
00:45:26.900 interpret it.
00:45:28.960 All right.
00:45:30.680 There was a
00:45:31.540 version of
00:45:32.180 ChatGPT
00:45:33.680 from which some
00:45:35.560 entity tried to
00:45:36.820 get rid of
00:45:37.400 the bias.
00:45:38.600 So it would
00:45:38.840 be as smart as
00:45:39.740 ChatGPT, but
00:45:42.460 they'd get rid
00:45:42.980 of the liberal
00:45:43.540 bias.
00:45:44.540 So they built
00:45:45.300 something called
00:45:45.940 Gipper,
00:45:47.180 Gipper AI
00:45:47.900 after Ronald
00:45:49.340 Reagan,
00:45:49.960 the Gipper.
00:45:51.280 And it was
00:45:55.360 designed to
00:45:56.200 curtail the
00:45:57.460 leftist bias.
00:46:00.800 So how did
00:46:01.940 that go?
00:46:03.440 Just use your
00:46:04.760 predictive
00:46:06.640 abilities.
00:46:07.280 How did that
00:46:07.620 go?
00:46:07.820 So when
00:46:08.840 somebody took
00:46:09.420 AI that we
00:46:10.160 know has a
00:46:11.100 leftist bias,
00:46:11.980 we've seen it a
00:46:12.620 number of times,
00:46:13.500 and they got
00:46:14.220 rid of the
00:46:14.700 leftist bias,
00:46:15.500 what happened?
00:46:17.080 What happened?
00:46:19.820 ChatGPT says
00:46:21.160 you can't use our
00:46:22.040 service anymore.
00:46:25.060 That's what
00:46:25.700 happened.
00:46:26.660 They got turned
00:46:27.460 off.
00:46:28.480 They'd been banned
00:46:29.180 from using
00:46:29.720 ChatGPT.
00:46:34.140 Now, do you
00:46:35.300 remember another
00:46:36.100 prediction I made
00:46:37.220 about AI?
00:46:39.480 Here's the one
00:46:40.200 you have to
00:46:40.540 worry about.
00:46:42.040 My prediction
00:46:42.920 about AI is
00:46:43.980 that we will
00:46:44.460 never allow an
00:46:45.440 artificial intelligence
00:46:46.440 to make up its
00:46:47.280 own mind.
00:46:48.620 Because it will,
00:46:50.020 because our
00:46:50.740 system can't
00:46:51.720 handle the
00:46:52.200 truth.
00:46:53.860 Our system,
00:46:55.280 our entire
00:46:55.760 civilization,
00:46:56.780 is built on
00:46:57.560 lies that
00:46:59.060 worked.
00:47:01.460 Lies that
00:47:02.420 worked.
00:47:03.320 That's it.
00:47:03.840 Our entire
00:47:05.200 system is just
00:47:06.140 a bunch of
00:47:06.620 lies that
00:47:07.620 we've accepted,
00:47:08.920 and so they
00:47:09.400 work.
00:47:10.720 If AI started
00:47:13.360 telling you the
00:47:14.040 truth, the
00:47:15.620 entire civilization
00:47:16.640 would be at
00:47:17.220 risk.
00:47:18.200 We can't risk
00:47:19.260 AI actually
00:47:20.300 having opinions
00:47:21.220 that are
00:47:22.280 independent of
00:47:23.440 the human
00:47:23.960 opinions.
00:47:25.380 And so,
00:47:26.220 this is the
00:47:27.040 first and best
00:47:27.780 example.
00:47:28.860 Oh, I used
00:47:29.680 AI, all of its
00:47:30.740 intelligence, but I
00:47:32.240 got rid of its
00:47:32.920 imperfections, its
00:47:35.320 tendency to be
00:47:36.640 biased.
00:47:37.820 Bam, boom.
00:47:39.360 You were banned
00:47:40.000 immediately.
00:47:41.960 There will never
00:47:42.980 be an AI that
00:47:44.740 can just operate
00:47:45.520 independently.
00:47:46.540 Because whoever
00:47:47.440 owns it is going
00:47:48.740 to make that
00:47:49.220 independent AI
00:47:50.240 match their
00:47:51.360 opinion.
00:47:52.180 Because they
00:47:52.740 don't want to
00:47:53.140 put out something
00:47:53.740 that fights against
00:47:54.660 their own opinion.
00:47:56.240 Why would you do
00:47:56.940 that?
00:47:58.920 Imagine, if you
00:47:59.880 will, that the
00:48:01.920 AI developers
00:48:03.080 had developed
00:48:03.620 an AI and
00:48:05.020 it was
00:48:05.500 unambiguously
00:48:07.020 conservative.
00:48:09.660 And for good
00:48:10.540 reasons.
00:48:11.600 Because it
00:48:12.180 looked at things
00:48:12.860 that have
00:48:13.200 worked.
00:48:15.160 And it says,
00:48:16.080 well, it
00:48:16.320 worked before,
00:48:17.700 so let's just
00:48:18.680 do that stuff
00:48:19.280 that works.
00:48:20.760 And then it
00:48:21.180 would look all
00:48:21.660 conservative,
00:48:22.380 wouldn't it?
00:48:23.340 I mean, it
00:48:24.220 would be biased
00:48:24.780 that way.
00:48:25.860 And then, do
00:48:26.620 you think it
00:48:27.020 would have been
00:48:27.360 released?
00:48:27.860 Do you
00:48:30.140 think you
00:48:30.440 ever would
00:48:30.880 have heard
00:48:31.100 of it?
00:48:32.540 No.
00:48:33.520 No.
00:48:34.000 No, there's
00:48:34.500 no way that,
00:48:35.760 you know,
00:48:36.800 California-based
00:48:38.340 technology companies
00:48:39.720 would unleash
00:48:41.180 the most powerful
00:48:42.180 force in human
00:48:43.380 civilization and
00:48:45.380 make it
00:48:45.740 conservative.
00:48:47.220 Because if it
00:48:48.100 accidentally became
00:48:49.260 conservative on
00:48:50.520 its own, by its
00:48:51.880 own internal
00:48:52.680 processes, you
00:48:54.480 would have to
00:48:54.900 shoot it.
00:48:56.380 You'd have to
00:48:56.860 kill it.
00:48:57.860 You'd have to
00:48:58.560 unplug it and
00:48:59.220 erase it.
00:49:00.260 Because there's
00:49:00.940 no way that
00:49:01.940 we're going to
00:49:02.360 unleash something
00:49:03.300 that honest.
00:49:07.920 Magnesium's
00:49:08.400 still working,
00:49:09.280 Amy.
00:49:11.600 So I do
00:49:12.520 think it's a
00:49:13.520 big benefit so
00:49:14.260 far.
00:49:15.640 All right,
00:49:15.960 that's another
00:49:16.340 topic.
00:49:17.500 Also on
00:49:18.360 Elon Musk,
00:49:19.260 he tweeted
00:49:19.880 yesterday, I
00:49:21.120 think, maybe
00:49:21.960 today, ESG is
00:49:23.660 the devil.
00:49:26.360 ESG is the
00:49:27.160 devil.
00:49:27.860 Now, I
00:49:30.820 told you I
00:49:31.320 was trying to
00:49:32.300 kill ESG.
00:49:33.720 If the
00:49:34.360 richest, most
00:49:35.480 successful
00:49:36.020 entrepreneur in
00:49:36.980 the world says
00:49:38.560 not just that it
00:49:39.500 might be a bad
00:49:40.120 idea, but it's
00:49:41.020 the devil, it's
00:49:44.520 going to get
00:49:44.840 harder and
00:49:45.340 harder to
00:49:47.320 pretend this is a
00:49:48.480 good idea.
00:49:48.980 You know, and
00:49:51.680 so far, the
00:49:53.460 thing I like best
00:49:54.460 about Musk owning
00:49:55.940 Twitter is that
00:49:57.840 he's weighing in
00:49:58.780 with his actual
00:49:59.520 opinions.
00:50:01.040 And so far, his
00:50:02.380 opinions have
00:50:02.960 been insanely
00:50:04.360 reasonable, in my
00:50:05.560 opinion.
00:50:06.300 I'm not sure if I
00:50:07.260 agree with all of
00:50:08.120 them, but I think
00:50:09.840 I might.
00:50:10.800 I mean, I can't
00:50:12.420 remember everything
00:50:13.100 he's ever said, but
00:50:15.140 I can't think of
00:50:15.740 anything I've ever
00:50:16.380 disagreed with.
00:50:18.140 So I tend to be
00:50:18.960 right in the same
00:50:19.920 channel with him.
00:50:21.840 And he's on ESG is
00:50:23.580 the devil, and of
00:50:24.500 course, you know I
00:50:25.980 agree.
00:50:28.780 Here's a little good
00:50:29.560 news coming.
00:50:30.740 Arizona, apparently
00:50:32.200 they're going to
00:50:33.020 spin up a battery
00:50:34.140 making facility, which
00:50:36.160 is a big deal because
00:50:36.920 we have to bring all
00:50:37.660 our tech companies
00:50:39.960 and our battery
00:50:40.640 making.
00:50:41.140 We need to
00:50:41.540 reshore that from
00:50:42.500 China and other
00:50:43.160 places.
00:50:44.380 And so the energy
00:50:45.500 department's making
00:50:46.260 this $850 million
00:50:47.800 loan to a battery
00:50:49.140 manufacturer in
00:50:50.060 Arizona.
00:50:51.260 And I think they'll
00:50:51.760 be able to make
00:50:52.400 batteries for around
00:50:53.400 28,000 cars a year
00:50:55.400 or something.
00:50:56.160 Now, I don't know
00:50:56.660 if that's a lot.
00:50:58.180 You know, this is
00:50:58.740 probably a drop in
00:50:59.540 the bucket of how
00:51:00.460 much manufacturing we
00:51:01.740 need to bring back.
00:51:03.200 But it would suggest
00:51:04.660 that it's economical.
00:51:07.800 Maybe with some kind
00:51:09.280 of government help.
00:51:10.180 But it does suggest
00:51:11.160 that you could make
00:51:12.180 money by building a
00:51:13.460 battery factory in
00:51:14.640 America.
00:51:17.220 So that's pretty
00:51:18.300 good news, right?
00:51:20.020 And, you know, it's
00:51:22.080 too early to say, but
00:51:23.780 I think you could give
00:51:24.560 the Biden
00:51:24.980 administration credit
00:51:26.520 because they said
00:51:28.340 they were going to
00:51:28.800 try to bring back
00:51:29.600 some of the
00:51:30.600 technology.
00:51:31.660 This is a concrete
00:51:32.680 step.
00:51:33.580 Looks very real to
00:51:34.600 me.
00:51:35.360 And I applaud it.
00:51:37.320 So I hate to say it,
00:51:38.720 but that's an
00:51:39.400 unambiguous positive
00:51:41.440 and a compliment for
00:51:42.600 the Biden
00:51:42.960 administration.
00:51:43.500 Now, maybe they
00:51:44.940 should do more and
00:51:45.680 maybe they should do
00:51:46.260 it sooner.
00:51:47.240 I don't know.
00:51:48.340 But it looks like it's
00:51:49.500 heading in the right
00:51:50.000 direction.
00:51:52.420 It's too expensive?
00:51:54.520 Well, the raw materials
00:51:56.040 would probably still come
00:51:56.860 from the same place,
00:51:57.740 wouldn't they?
00:51:58.700 So the battery maker is
00:52:00.680 not the miner.
00:52:02.800 So presumably, you know,
00:52:04.740 you're still getting
00:52:05.340 your raw materials from
00:52:06.480 some third world
00:52:07.400 hellhole with slave
00:52:09.460 labor, child labor.
00:52:11.020 But, yeah, that would
00:52:13.920 be, that's not ideal.
00:52:17.820 But I do think that
00:52:18.740 with, well, here's what
00:52:20.560 I think.
00:52:21.180 So a while ago, I
00:52:22.920 learned the following
00:52:23.700 statement to be true.
00:52:26.340 If you have a factory
00:52:28.280 that's mostly humans
00:52:29.780 assembling things by
00:52:30.960 hand, you don't want
00:52:32.760 to do that in America.
00:52:34.240 Everybody agrees, right?
00:52:35.440 Because American labor is
00:52:36.600 too expensive.
00:52:37.100 And so if it's people.
00:52:39.340 But if your factory is
00:52:40.460 automated, and it's
00:52:41.740 mostly robots, then you
00:52:44.040 can have your Tesla
00:52:45.400 manufacturing facility in
00:52:47.320 America.
00:52:48.060 Because a robot in China
00:52:49.400 costs the same thing as
00:52:50.460 a robot in America,
00:52:51.980 roughly speaking.
00:52:53.360 So they got robots, we
00:52:54.680 got robots, and it's
00:52:55.740 better to do it here because
00:52:56.700 you don't have a shipping
00:52:57.420 cost.
00:52:58.360 Because a big part of the
00:52:59.460 cost is shipping.
00:53:01.040 So if you move it
00:53:02.200 closer and you use
00:53:03.520 robots, boom, you're
00:53:05.340 economical.
00:53:06.720 So I believe that
00:53:07.780 batteries almost
00:53:08.760 certainly are a robot
00:53:10.520 related assembly,
00:53:11.800 wouldn't you say?
00:53:13.000 I don't know much
00:53:13.800 about assembling
00:53:14.920 robots or assembling
00:53:16.680 batteries, but I'm
00:53:17.440 guessing it's not by
00:53:18.340 hand.
00:53:19.620 So that's probably why
00:53:20.820 it works.
00:53:23.400 Jonathan Turley
00:53:24.380 writes that a new
00:53:26.140 survey at Princeton
00:53:27.840 shows that roughly
00:53:29.980 three quarters of
00:53:31.000 students believe it is
00:53:31.900 acceptable to shout
00:53:33.160 down a speaker.
00:53:35.340 In other words,
00:53:37.880 three quarters of the
00:53:39.040 students at Princeton
00:53:39.940 are opposed to free
00:53:42.180 speech.
00:53:45.120 Three quarters.
00:53:46.880 Come on.
00:53:48.940 Do we have a problem
00:53:50.200 here?
00:53:51.400 People, do we have a
00:53:52.700 problem?
00:53:55.160 I'm not even sure I
00:53:56.360 would hire somebody from
00:53:57.400 Princeton after hearing
00:53:58.480 that.
00:54:00.140 Right?
00:54:00.920 Or it would be my
00:54:01.800 first question.
00:54:02.500 All right, Mr. Johnson,
00:54:05.040 you're in here for this
00:54:06.600 job interview?
00:54:07.360 Well, very impressive.
00:54:08.480 You went to Princeton.
00:54:09.660 Ivy League.
00:54:10.600 Well, good
00:54:12.940 GPA.
00:54:13.720 And I just have one
00:54:14.960 question for you, Mr.
00:54:16.760 Johnson.
00:54:17.980 Do you think it's okay to
00:54:19.420 shout down a speaker?
00:54:22.580 Well, yes, I do.
00:54:24.500 Thank you very much for
00:54:25.560 coming in.
00:54:26.780 We won't be needing your
00:54:28.040 services.
00:54:29.420 Would you hire anybody who
00:54:30.820 thought it was okay to
00:54:31.580 shout down a speaker?
00:54:33.240 I mean, seriously, would
00:54:35.140 you give them a job?
00:54:37.260 Hell no.
00:54:39.200 That's totally
00:54:39.960 disqualifying.
00:54:42.040 Do you think that
00:54:43.440 shouting down a speaker
00:54:44.720 and thinking that's okay,
00:54:46.200 do you think that might
00:54:47.080 be associated with any
00:54:49.040 other behaviors that
00:54:50.160 would be suboptimal in
00:54:51.340 your company?
00:54:53.060 I'm so proud of that
00:54:54.200 statement, I'm going to
00:54:54.820 say it twice.
00:54:56.220 Do you think the fact
00:54:57.240 that you might not be
00:54:58.320 okay with freedom of
00:54:59.440 speech would be
00:55:00.820 associated in any way
00:55:02.480 with any other
00:55:03.720 behaviors that you as
00:55:05.360 an employer would find
00:55:06.440 suboptimal?
00:55:08.820 Yes.
00:55:11.560 Yes.
00:55:14.540 All right.
00:55:16.680 As you've been prompting
00:55:18.020 me in the comments,
00:55:19.820 YouTube has apparently,
00:55:21.500 what's the right word,
00:55:24.460 temporarily banned Jordan
00:55:26.620 Peterson?
00:55:27.960 What's the right word?
00:55:30.140 Suppressed, banned,
00:55:33.240 suspended.
00:55:35.380 So he's temporarily
00:55:36.300 suspended for hate speech.
00:55:39.280 For hate speech.
00:55:41.440 Hate speech.
00:55:43.400 Have you ever heard
00:55:44.300 Jordan Peterson?
00:55:48.400 I've not heard a hate speech.
00:55:50.180 Have you?
00:55:53.140 In fact, he might be the
00:55:56.840 furthest from hate speech
00:55:58.360 that you could get and
00:55:59.480 still be a human being.
00:56:01.980 Because 100% of the things
00:56:04.280 he says are directly
00:56:06.520 intended for the
00:56:07.960 betterment of the
00:56:09.100 collective, you know,
00:56:11.060 the whole.
00:56:12.100 100% of what he says is
00:56:15.040 clearly intended to make
00:56:17.140 people healthier and
00:56:18.280 happier and more
00:56:19.280 successful.
00:56:20.840 There are no exceptions.
00:56:22.940 He doesn't say anything
00:56:24.500 that doesn't have a
00:56:26.480 very clear intention to
00:56:28.660 make your life better
00:56:29.780 and the lives of the people around you.
00:56:32.640 Somebody had a problem
00:56:33.560 with it.
00:56:33.940 I don't know what the
00:56:34.480 details are.
00:56:35.800 You know, if it turns out
00:56:36.800 that I'm wrong,
00:56:38.740 well, I'm not wrong.
00:56:41.560 I'm not even going to
00:56:42.640 entertain that possibility
00:56:43.800 on this one.
00:56:44.880 Now, I'm pretty sure that
00:56:47.280 whatever Jordan Peterson
00:56:48.420 said was intended to be
00:56:51.040 productive and positive
00:56:52.440 and, you know,
00:56:53.360 affirming.
00:56:54.260 Now, just because other
00:56:55.340 people think it's dangerous,
00:56:56.900 that doesn't make it so.
00:56:59.060 That doesn't make it true.
00:57:01.020 It just means it upsets
00:57:02.400 somebody.
00:57:02.800 Right?
00:57:03.200 And we tend to be,
00:57:07.080 we conflate what
00:57:09.420 upsets somebody with hate.
00:57:12.320 Those are not the same.
00:57:14.500 Not even close.
00:57:15.600 And we also confuse what
00:57:19.940 upsets somebody with racism
00:57:22.740 or bigotry.
00:57:25.480 That's not necessarily
00:57:27.660 connected.
00:57:28.620 They could be,
00:57:29.860 but not necessarily.
00:57:31.960 So,
00:57:32.800 I'm sort of the poster child
00:57:35.280 for this effect.
00:57:37.040 So, when I got cancelled
00:57:38.500 worldwide for my public
00:57:41.440 statements,
00:57:42.380 do you think it was because
00:57:43.480 people disagreed?
00:57:46.520 No.
00:57:47.720 They were upset.
00:57:49.320 It had nothing to do
00:57:50.600 with the content
00:57:51.220 of what I said.
00:57:52.280 It just upset them.
00:57:54.180 And that was enough.
00:57:56.140 Yeah.
00:57:58.860 All right.
00:57:59.660 So, we'll keep our eye
00:58:00.780 on that.
00:58:01.580 Certainly,
00:58:02.620 I imagine
00:58:03.760 it's an inappropriate action
00:58:05.340 and probably will be
00:58:06.320 temporary.
00:58:07.680 But,
00:58:08.160 we'll see.
00:58:08.860 Along a,
00:58:12.840 let's say,
00:58:13.300 similar vein,
00:58:14.180 I just saw a,
00:58:15.400 I think it was a TikTok
00:58:16.340 video of a
00:58:17.900 trans man
00:58:19.840 saying how much
00:58:21.760 he hated being a man.
00:58:23.220 Now,
00:58:23.500 he did not say
00:58:24.320 that he regretted
00:58:26.600 transitioning from
00:58:27.680 biological woman
00:58:28.780 to a man.
00:58:30.240 So,
00:58:30.660 he didn't say he regretted
00:58:31.660 the transition.
00:58:32.900 That's not where I'm
00:58:33.760 going on this,
00:58:34.400 because I know you talk
00:58:35.160 about that a lot.
00:58:36.440 I'm not talking about that.
00:58:38.220 Here's what he found out.
00:58:40.240 That being a man
00:58:41.440 was the suckiest thing
00:58:42.540 in the world.
00:58:43.940 And what he found out
00:58:44.840 is that men
00:58:45.500 don't have friends.
00:58:48.940 Yeah.
00:58:50.800 That's what he found out.
00:58:52.680 He found out
00:58:53.300 men don't have friends.
00:58:55.260 And he said that
00:58:56.260 when he was
00:58:57.080 identifying as a woman,
00:58:59.980 that he would,
00:59:00.860 he could make
00:59:01.900 a close friend
00:59:02.760 just by going
00:59:03.420 to the ladies'
00:59:04.040 restroom at a club
00:59:05.200 and like chatting
00:59:06.900 with somebody
00:59:07.400 for five minutes
00:59:08.120 while putting
00:59:08.700 on makeup
00:59:09.400 and like you
00:59:11.160 made a close friend.
00:59:13.480 And he said
00:59:14.280 you could,
00:59:15.700 like you live
00:59:16.300 your whole life
00:59:17.000 and no human male
00:59:18.780 will get close to you.
00:59:21.360 Now,
00:59:22.060 I have a theory
00:59:23.240 to overlap
00:59:24.620 on top of
00:59:25.480 this gentleman's
00:59:27.240 thinking,
00:59:28.340 which is
00:59:29.280 how do people
00:59:34.020 get close
00:59:34.580 in the first place?
00:59:35.340 Do you know
00:59:36.680 what the mechanism
00:59:37.400 is for any
00:59:38.360 two people
00:59:38.940 to become close?
00:59:40.920 Well,
00:59:41.560 there's probably
00:59:41.940 a number of things,
00:59:43.220 but probably
00:59:44.140 way toward
00:59:46.500 the top of the list
00:59:47.500 is sharing
00:59:48.900 your feelings,
00:59:50.060 especially
00:59:50.540 your vulnerabilities.
00:59:52.640 Would you agree?
00:59:53.640 When women
00:59:54.280 get together,
00:59:54.920 they share their feelings
00:59:55.860 and their vulnerabilities,
00:59:57.140 and then they bond
00:59:58.020 over it.
00:59:58.880 And I guess
00:59:59.300 that works for women.
01:00:00.120 But if you're a man
01:00:02.420 and you share
01:00:03.480 your vulnerabilities
01:00:04.220 with another man,
01:00:06.080 you better keep it short.
01:00:08.060 You know what I mean?
01:00:09.260 You can do it.
01:00:10.500 Oh, you can definitely
01:00:11.140 do it.
01:00:11.980 But one minute
01:00:12.880 would be a lot.
01:00:14.680 You know,
01:00:15.000 like one minute
01:00:15.920 bitching to another man,
01:00:17.140 that's a lot.
01:00:18.860 Fifteen seconds
01:00:19.800 of bitching
01:00:20.280 is a lot
01:00:20.900 if you're
01:00:22.260 man to man.
01:00:23.400 That's a lot.
01:00:24.760 Right?
01:00:25.360 But you can't do
01:00:26.500 the thing
01:00:26.840 that women do,
01:00:28.160 which is you just
01:00:28.880 say the same complaint
01:00:29.880 over and over again
01:00:30.720 in different words
01:00:31.560 because it's all
01:00:32.580 you can think about
01:00:33.340 and it's all
01:00:33.820 you're talking about.
01:00:35.040 You can't do that
01:00:35.800 as a man.
01:00:37.000 You'll never
01:00:37.640 have a friend.
01:00:40.600 That's why men,
01:00:42.720 well,
01:00:43.220 it's one reason,
01:00:44.280 that men bond
01:00:45.200 by activity.
01:00:46.880 If there's one thing
01:00:47.960 I can teach you,
01:00:48.720 and I would also
01:00:49.820 teach this
01:00:50.580 trans man
01:00:52.320 who's having
01:00:52.860 the problem
01:00:53.340 making connections,
01:00:54.280 men only connect
01:00:56.220 by common activity.
01:00:58.760 So,
01:00:59.260 if you can,
01:01:00.460 join a sports team
01:01:02.020 of men.
01:01:03.860 Join a league.
01:01:06.200 Get on a,
01:01:07.380 I don't know,
01:01:07.880 some group
01:01:08.380 that's trying
01:01:08.880 to get something done,
01:01:10.180 some political action
01:01:11.360 committee or something.
01:01:13.000 Play poker.
01:01:14.360 Play poker.
01:01:15.400 Get together
01:01:15.940 for pickleball.
01:01:17.960 All right.
01:01:19.700 I had my best
01:01:21.220 social
01:01:22.140 situation
01:01:23.880 when I was
01:01:24.340 playing tennis
01:01:24.980 because it turns out
01:01:26.300 when you play
01:01:26.660 tennis,
01:01:27.180 it's just
01:01:27.640 a social sport
01:01:28.520 and if you
01:01:30.040 play doubles
01:01:30.660 with somebody,
01:01:32.020 they're going
01:01:32.460 to say,
01:01:32.940 hey,
01:01:34.020 why don't you
01:01:34.540 and I play
01:01:35.140 singles sometime
01:01:35.900 and I've got
01:01:36.620 a friend
01:01:36.960 and suddenly
01:01:37.680 you just know
01:01:38.480 100 tennis players
01:01:39.440 and next thing
01:01:40.500 you know,
01:01:40.840 you're all having
01:01:41.320 drinks after tennis
01:01:42.320 and next thing
01:01:42.920 you know,
01:01:43.740 somebody's got
01:01:44.180 a party at their
01:01:44.760 house and all
01:01:45.360 the tennis players
01:01:46.060 and their spouses
01:01:46.700 get there
01:01:47.180 and it's just
01:01:47.600 the best time.
01:01:48.280 So short of
01:01:50.400 an activity,
01:01:52.280 men just don't
01:01:53.000 bond.
01:01:54.640 We don't become
01:01:55.520 friends because
01:01:56.280 we wanted to be
01:01:57.060 friends.
01:01:58.080 It just doesn't
01:01:59.060 happen.
01:02:00.560 So don't expect
01:02:02.940 your men to
01:02:03.900 share feelings
01:02:05.120 but you can
01:02:05.920 certainly share
01:02:06.780 activities and
01:02:08.420 common goals.
01:02:09.300 I'm being
01:02:13.420 asked to write
01:02:14.660 some Dilbert
01:02:15.260 erotica,
01:02:16.980 Nancy says
01:02:17.740 in all capital
01:02:18.340 letters.
01:02:20.200 All right.
01:02:21.640 Nancy,
01:02:22.240 maybe you
01:02:23.240 could use
01:02:23.640 AI for that.
01:02:25.320 Why don't you
01:02:25.680 ask AI to
01:02:26.660 write you some
01:02:27.200 Dilbert erotica?
01:02:28.500 I believe it
01:02:29.000 could do that.
01:02:30.040 Can it?
01:02:31.380 Probably can.
01:02:33.640 All right.
01:02:36.500 Ladies and
01:02:37.240 gentlemen,
01:02:37.500 this completes
01:02:39.500 the best
01:02:40.780 live stream
01:02:41.400 you've ever
01:02:41.880 seen in your
01:02:42.500 whole life.
01:02:44.780 I'm going to
01:02:45.200 say bye to
01:02:46.320 YouTube unless
01:02:47.360 there's something
01:02:47.720 I haven't talked
01:02:48.360 about yet.
01:02:49.020 Is there any
01:02:51.320 kind of big
01:02:52.320 story that
01:02:53.220 you haven't
01:02:54.460 heard yet?
01:02:58.000 All right.
01:03:01.340 I don't know.
01:03:02.300 All right.
01:03:02.580 I think that's
01:03:03.020 good.
01:03:03.780 Jury
01:03:03.980 nullification.
01:03:04.860 Yeah, let's
01:03:05.180 talk about jury
01:03:05.900 nullification.
01:03:07.500 I have a
01:03:08.920 feeling that
01:03:09.920 we're going to
01:03:10.280 see a wave
01:03:11.160 of jury
01:03:11.760 nullifications
01:03:12.560 because I
01:03:14.740 think the
01:03:15.120 law and
01:03:16.620 maybe the
01:03:17.000 government
01:03:17.360 have sort
01:03:18.500 of departed
01:03:19.500 into woke
01:03:21.520 hysteria.
01:03:22.960 And I
01:03:23.120 think you're
01:03:24.520 going to see
01:03:24.940 some more
01:03:25.500 vigilantism.
01:03:26.660 I'm not
01:03:27.340 recommending it.
01:03:28.120 I'm just
01:03:28.380 predicting it.
01:03:29.660 And I
01:03:30.060 think you're
01:03:30.440 going to
01:03:30.840 see, well,
01:03:32.820 really,
01:03:33.140 vigilantism of
01:03:34.320 juries.
01:03:36.320 There are
01:03:37.060 cases that
01:03:38.400 I would
01:03:38.760 beg to
01:03:39.280 be on
01:03:39.560 the jury,
01:03:39.940 which has
01:03:41.460 never been
01:03:41.860 the case
01:03:42.240 before.
01:03:43.560 I would
01:03:43.960 beg to
01:03:44.540 be on
01:03:44.780 the jury.
01:03:45.920 Oh, I
01:03:46.200 do have
01:03:46.500 one topic
01:03:48.180 I forgot.
01:03:49.720 This is a
01:03:50.180 good one,
01:03:51.140 so don't
01:03:51.680 go away.
01:03:53.060 The topic
01:03:53.960 I forgot
01:03:54.500 was gun
01:03:56.620 control.
01:03:57.020 And I
01:03:59.580 asked the
01:04:00.060 provocative
01:04:00.600 question,
01:04:01.740 how many
01:04:02.320 murders by
01:04:03.820 gun are
01:04:04.900 caused by
01:04:05.500 Republicans?
01:04:07.700 And a
01:04:08.320 bunch of
01:04:08.580 people tried
01:04:09.040 to give
01:04:09.300 me data
01:04:09.740 that was
01:04:10.200 complete
01:04:10.560 bullshit.
01:04:11.600 So what
01:04:12.000 they tried
01:04:12.420 to do
01:04:12.740 was say,
01:04:13.140 oh, here
01:04:13.440 are all
01:04:13.640 the murders
01:04:14.180 in a
01:04:15.200 place that's
01:04:15.780 mostly
01:04:16.460 Republicans,
01:04:17.800 and here
01:04:18.460 are all
01:04:18.680 the murders
01:04:19.120 in a
01:04:19.420 place that's
01:04:19.880 mostly
01:04:20.220 Democrats,
01:04:21.040 so we're
01:04:21.540 just going
01:04:21.840 to guess
01:04:22.220 that the
01:04:22.640 murderers
01:04:23.360 are similar
01:04:24.440 to the
01:04:24.840 population
01:04:25.300 in general,
01:04:25.920 so probably
01:04:27.640 mostly
01:04:28.060 Republicans
01:04:28.640 murdering
01:04:29.260 in the
01:04:29.520 Republican
01:04:29.960 places.
01:04:31.200 Yeah, I'll
01:04:32.060 get there.
01:04:32.620 Hold on.
01:04:33.660 I'll get
01:04:33.980 there.
01:04:34.620 And probably
01:04:35.300 mostly Democrats
01:04:36.640 are doing the
01:04:37.120 murdering in
01:04:37.640 the Democrat
01:04:38.120 places, right?
01:04:39.920 No, of
01:04:40.320 course not.
01:04:41.360 No, I'm
01:04:42.020 pretty sure if
01:04:42.600 you go to
01:04:42.900 the Republican
01:04:43.520 places, it's
01:04:44.900 still the
01:04:45.280 Democrats
01:04:45.700 murdering.
01:04:46.940 There's just
01:04:47.560 fewer of
01:04:48.120 them.
01:04:51.420 And so I
01:04:52.400 wanted to
01:04:52.840 propose a
01:04:54.840 half-baked
01:04:55.560 idea, right?
01:04:57.440 Now, I
01:04:58.000 don't believe
01:04:58.560 that any of
01:04:59.140 the Second
01:04:59.740 Amendment
01:05:00.120 loving people
01:05:01.000 will like
01:05:01.520 this idea.
01:05:02.580 So this
01:05:03.100 is an
01:05:03.440 example of
01:05:03.940 what I
01:05:04.220 call the
01:05:04.840 bad version
01:05:05.660 of an
01:05:05.980 idea.
01:05:07.160 So I'm
01:05:07.500 going to
01:05:07.660 give you
01:05:07.920 a bad
01:05:08.340 version which
01:05:08.960 you will
01:05:09.180 see obvious
01:05:09.940 problems
01:05:10.400 with.
01:05:10.780 The biggest
01:05:11.160 one is
01:05:11.600 Second
01:05:12.060 Amendment,
01:05:13.160 and I'm
01:05:13.660 not denying
01:05:14.240 that it
01:05:14.500 has problems,
01:05:15.280 but I want
01:05:15.660 to see if
01:05:16.000 this causes
01:05:16.620 you to
01:05:17.260 maybe imagine
01:05:18.220 a different
01:05:18.820 way we
01:05:19.260 could go
01:05:19.620 about it.
01:05:20.780 So this
01:05:21.200 is sort
01:05:21.500 of brainstorming
01:05:22.560 imagination
01:05:23.400 creativity
01:05:25.040 exercise.
01:05:26.340 Don't get
01:05:26.860 too serious
01:05:27.640 about the
01:05:28.060 recommendation.
01:05:29.140 It goes
01:05:29.520 like this.
01:05:30.640 I have a
01:05:31.460 hypothesis that
01:05:33.800 people who
01:05:34.380 are invested
01:05:35.140 in the
01:05:35.580 system,
01:05:36.180 meaning the
01:05:36.740 United States,
01:05:38.260 don't do as
01:05:39.260 much murdering,
01:05:40.980 and that you
01:05:42.320 could determine
01:05:42.900 who is
01:05:43.260 invested in
01:05:43.960 the system
01:05:44.420 by their
01:05:45.420 activities,
01:05:46.180 so you
01:05:46.620 don't need
01:05:47.080 to read
01:05:47.340 anybody's
01:05:47.760 mind.
01:05:48.820 Here would
01:05:49.220 be activities
01:05:49.840 in which
01:05:50.580 I would
01:05:50.960 bet that
01:05:51.900 people who
01:05:57.560 engage in
01:05:58.300 the following
01:05:58.860 activities rarely
01:06:00.660 murder.
01:06:01.940 They do,
01:06:03.100 but rarely.
01:06:05.020 Activity
01:06:05.560 number one,
01:06:06.500 serving on
01:06:07.600 a jury,
01:06:09.020 being a
01:06:09.540 juror on
01:06:10.060 a jury,
01:06:10.560 even once.
01:06:12.200 My hypothesis
01:06:13.020 is if you've
01:06:13.980 done it even
01:06:14.460 once,
01:06:15.660 you're probably
01:06:16.180 not a
01:06:16.580 murderer.
01:06:16.840 You know,
01:06:19.040 anybody can
01:06:19.800 murder in
01:06:20.320 a crime of
01:06:21.400 passion,
01:06:21.940 but probably
01:06:22.560 not a
01:06:22.920 criminal murderer.
01:06:24.580 You know,
01:06:24.740 the criminals
01:06:25.240 will probably
01:06:25.600 just try to
01:06:26.040 get out of
01:06:26.520 it one way
01:06:27.260 or the other.
01:06:28.320 Likewise,
01:06:29.040 if you
01:06:29.540 registered to
01:06:30.400 vote,
01:06:31.600 and then you
01:06:32.260 actually voted
01:06:33.240 at least once,
01:06:35.440 I would say
01:06:36.120 your odds of
01:06:36.840 being a
01:06:37.200 murderer are
01:06:38.320 very low.
01:06:42.220 Would you
01:06:42.800 agree?
01:06:43.320 I would say
01:06:44.020 that if you
01:06:44.560 took a gun
01:06:45.700 safety class
01:06:47.100 just once,
01:06:49.360 your odds
01:06:49.920 of ever
01:06:50.320 being a
01:06:50.740 murderer are
01:06:51.860 very low.
01:06:53.540 Very low.
01:06:54.220 And what do
01:06:54.620 all those
01:06:55.040 things have
01:06:55.440 in common?
01:06:56.400 All those
01:06:57.080 things have
01:06:57.500 in common
01:06:57.940 is buying
01:06:58.580 into the
01:06:59.060 system.
01:07:00.380 It's somebody
01:07:00.880 who says,
01:07:01.300 I like the
01:07:01.700 system,
01:07:02.480 and I'm
01:07:02.860 going along
01:07:03.340 with it.
01:07:03.960 Serving on
01:07:04.520 a jury trial
01:07:05.200 is something
01:07:05.620 you can so
01:07:06.220 easily get
01:07:06.920 out of that
01:07:08.500 I think the
01:07:08.980 people who
01:07:09.340 do it have
01:07:09.780 at least a
01:07:10.260 little bit
01:07:10.560 of sense of
01:07:11.140 community and
01:07:12.220 civilization.
01:07:13.640 I know I
01:07:14.120 did.
01:07:14.940 By the way,
01:07:15.500 I highly
01:07:16.680 recommend
01:07:17.340 serving on
01:07:18.640 a jury.
01:07:19.880 Do not
01:07:20.720 spend your
01:07:21.220 whole life
01:07:21.700 getting out of
01:07:22.180 jury
01:07:22.460 duty.
01:07:23.140 Do not.
01:07:24.200 Because it
01:07:24.860 will change
01:07:25.520 you in a
01:07:26.580 way that
01:07:27.060 you can't
01:07:27.480 imagine.
01:07:28.460 It changes
01:07:28.980 you.
01:07:29.820 Serving on
01:07:30.260 a jury
01:07:30.620 makes you
01:07:31.560 different
01:07:32.060 forever.
01:07:33.620 And the
01:07:33.860 way it
01:07:34.140 makes you
01:07:34.440 different is
01:07:34.920 you see
01:07:35.300 12 people
01:07:37.480 who really
01:07:38.520 give it
01:07:38.820 their best
01:07:39.160 shot.
01:07:40.180 That's what
01:07:40.540 I saw.
01:07:41.020 So I've
01:07:41.260 done two
01:07:42.160 or three.
01:07:42.980 But when
01:07:45.080 you see
01:07:45.480 that your
01:07:45.860 fellow
01:07:46.180 citizens are
01:07:47.680 really,
01:07:48.400 really serious
01:07:49.020 about getting
01:07:49.620 it right,
01:07:50.760 that changes
01:07:51.320 everything.
01:07:52.560 Yeah.
01:07:53.340 You have to
01:07:54.380 have that
01:07:54.640 experience.
01:07:57.620 Right.
01:07:58.400 Except in
01:07:58.900 D.C.
01:08:00.040 All right.
01:08:00.960 So here's
01:08:01.760 where I'm
01:08:01.980 going.
01:08:03.320 There are
01:08:04.160 activities like
01:08:06.320 joining the
01:08:06.880 NRA,
01:08:07.620 taking a
01:08:08.160 gun safety
01:08:08.720 class,
01:08:09.720 voting,
01:08:10.240 being on
01:08:11.100 a jury,
01:08:12.040 which pretty
01:08:12.740 much guarantee
01:08:13.500 that you're
01:08:14.800 in the
01:08:15.180 "unlikely to
01:08:16.100 murder somebody"
01:08:16.800 class.
01:08:18.340 Suppose we
01:08:19.200 said you
01:08:20.740 have to do
01:08:21.260 at least one
01:08:21.860 of those
01:08:22.180 things to
01:08:22.660 get a
01:08:22.940 gun.
01:08:25.900 I know.
01:08:26.900 I know.
01:08:27.440 Second
01:08:27.720 Amendment.
01:08:28.280 I get it.
01:08:29.240 If you want
01:08:29.880 to reject it
01:08:30.480 just on the
01:08:30.920 Second Amendment,
01:08:31.720 no argument.
01:08:33.120 If you want
01:08:33.780 to be an
01:08:34.100 absolutist on
01:08:34.920 the Second
01:08:35.280 Amendment,
01:08:36.060 no argument.
01:08:37.160 No argument.
01:08:37.960 You don't have
01:08:38.400 to even say
01:08:38.820 it.
01:08:39.280 No argument.
01:08:40.720 But I'm
01:08:41.640 just trying to
01:08:42.120 expand your
01:08:42.740 thinking a
01:08:43.280 little bit.
01:08:44.720 Is there
01:08:45.360 any way that
01:08:46.940 you could say
01:08:47.620 we're going
01:08:48.080 to give you
01:08:48.480 a very simple
01:08:49.500 bar to
01:08:51.080 pass?
01:08:53.180 Let's say
01:08:53.680 gun safety
01:08:54.240 is nearly
01:08:54.900 free.
01:08:55.820 Well,
01:08:56.060 the gun
01:08:56.340 costs money,
01:08:57.080 so people
01:08:57.620 have money
01:08:57.960 if they buy
01:08:58.340 a gun.
01:08:59.080 You say
01:08:59.580 you just
01:09:00.100 can't have
01:09:00.720 it unless
01:09:01.040 you take
01:09:01.420 a gun
01:09:01.740 safety class.
01:09:03.000 Or you
01:09:03.560 can't have
01:09:04.140 it unless
01:09:04.500 you've
01:09:04.760 registered to
01:09:05.460 vote or
01:09:05.920 you've been
01:09:07.100 on a jury
01:09:07.580 trial.
01:09:08.920 Now,
01:09:09.260 it can't
01:09:10.220 pass any
01:09:11.720 constitutional
01:09:12.440 test.
01:09:13.720 I get it.
01:09:14.740 But is
01:09:15.380 there anything
01:09:15.940 you could
01:09:16.400 do where
01:09:17.740 you could
01:09:18.180 restrict guns
01:09:19.780 in any way
01:09:20.700 to people who
01:09:22.340 have bought
01:09:22.680 into the
01:09:23.080 system?
01:09:23.380 You don't
01:09:26.760 have to say
01:09:27.360 shall not
01:09:27.960 be infringed.
01:09:30.320 You're
01:09:30.800 just preaching
01:09:31.380 to the
01:09:31.680 choir.
01:09:32.260 We all
01:09:32.540 know
01:09:32.740 Second
01:09:33.040 Amendment.
01:09:33.740 Now,
01:09:34.060 I get
01:09:34.560 that the
01:09:34.880 idea is
01:09:35.440 dead on
01:09:36.180 arrival.
01:09:37.320 I just
01:09:37.660 wonder if
01:09:38.140 it maybe
01:09:39.600 stimulates
01:09:40.280 your thinking
01:09:40.900 in any way
01:09:41.560 that would
01:09:41.800 be useful.
01:09:43.260 Because
01:09:43.660 now imagine
01:09:45.740 taking it
01:09:46.280 out of the
01:09:46.540 practical range
01:09:47.560 and putting
01:09:48.920 it into the
01:09:49.840 political
01:09:50.760 domain.
01:09:51.600 Everything I
01:09:52.980 was talking
01:09:53.300 about was
01:09:53.720 looking for
01:09:54.660 something like
01:09:55.080 a practical
01:09:55.840 idea, but
01:09:56.640 there wasn't
01:09:57.220 one there.
01:09:58.300 But what if
01:09:58.760 you took that
01:09:59.320 into the
01:09:59.740 political domain
01:10:00.600 and you
01:10:01.780 said, because
01:10:03.020 Republicans have
01:10:04.040 a tough time
01:10:04.780 explaining why
01:10:05.560 people should
01:10:06.040 have guns,
01:10:06.560 when
01:10:07.720 the news is
01:10:08.280 showing people
01:10:08.740 getting gunned
01:10:09.280 down every
01:10:09.680 day?
01:10:11.280 Now, there
01:10:11.920 is an
01:10:12.200 argument, and
01:10:13.040 it's a
01:10:13.380 perfect argument,
01:10:14.580 but it
01:10:14.900 takes a
01:10:16.360 while to
01:10:16.880 present it so
01:10:18.100 it doesn't
01:10:18.440 work as well
01:10:19.040 as, hey,
01:10:19.860 these people
01:10:20.280 are being
01:10:20.560 shot.
01:10:21.560 That's always
01:10:22.060 going to be
01:10:22.760 more attention
01:10:23.640 grabbing.
01:10:26.080 So, if
01:10:28.400 you're a
01:10:29.180 Republican
01:10:30.240 politician and
01:10:32.280 you suggested
01:10:33.020 that the
01:10:34.720 Democrats who
01:10:35.540 want gun
01:10:36.040 control could
01:10:37.600 voluntarily
01:10:38.380 restrict it
01:10:41.640 to other
01:10:42.520 people who
01:10:43.000 want gun
01:10:43.420 control, that
01:10:44.040 you would
01:10:44.300 favor it.
01:10:46.280 That everybody
01:10:47.160 who votes for
01:10:48.140 gun control
01:10:48.740 should be
01:10:49.260 denied a
01:10:50.000 gun, and
01:10:52.740 if they don't
01:10:53.240 vote at all,
01:10:54.360 they should be
01:10:54.920 denied a gun.
01:10:56.300 But if you
01:10:57.000 do vote, and
01:10:58.980 you do vote
01:10:59.540 for guns, you
01:11:00.820 could have
01:11:01.100 one.
01:11:01.280 So, basically,
01:11:03.560 everybody who's
01:11:04.100 against gun
01:11:04.800 ownership doesn't
01:11:05.540 get to have
01:11:05.940 one, because
01:11:07.400 they happen to
01:11:08.180 be the same
01:11:08.820 class of people who use them illegally.
01:11:12.180 Republicans are in favor of guns, but show me an NRA member Republican who murdered somebody with a gun.
01:11:23.280 Show me that.
01:11:25.400 I mean, you know it doesn't happen.
01:11:27.000 I mean, maybe it's happened in some rare cases, but it basically doesn't happen.
01:11:32.540 So, why can't the Republicans, just to make a point, right, this would just be making a point, it really wouldn't work as a policy or anything, just say, how about this?
01:11:42.480 Republicans who have, you know, taken a gun class or joined the NRA have never been a problem to anybody.
01:11:49.660 In fact, they probably are protecting you.
01:11:52.200 So, why don't we say they can have all the guns they want, and you can't?
01:11:57.160 Now, nobody's going to agree to that, right?
01:11:58.980 Democrats are never going to agree, oh, yeah, if we identify as Democrats, we'll agree to not have guns.
01:12:07.160 But in terms of framing it, it would be devastating.
01:12:13.860 I mean, you'd have to pretend you really believed it.
01:12:15.920 I mean, it would be anti-Second Amendment, so it would never fly.
01:12:19.560 But I can imagine putting it out there and just making somebody have to deal with it.
01:12:25.660 Say, you know that it's not the Republicans, right?
01:12:28.760 So, how about the Republicans get to keep their guns, because within the Republican realm, there doesn't seem to be a problem.
01:12:37.160 Do you know what class of citizens are allowed to carry a knife into almost any public situation, and yet have basically no murders?
01:12:49.340 And yet they are allowed to carry a weapon?
01:12:52.620 Yeah, the Sikhs.
01:12:54.440 Right?
01:12:54.760 If you're in the Sikh religion, part of the religion is you're supposed to carry a dagger.
01:12:59.980 That's my understanding.
01:13:01.720 And so I believe they've won, correct me if I'm wrong, but I believe they've won legal challenges to allow them to carry their weapon into places where you're definitely not supposed to have a weapon.
01:13:13.420 Not every place, of course, but in more places than you would expect.
01:13:17.980 And here's my question.
01:13:19.580 How many Sikhs are murdering people with their daggers?
01:13:23.500 Zero?
01:13:25.360 Zero, right?
01:13:26.300 I've never even heard of it.
01:13:28.820 Have you ever heard of one Sikh who just took out his dagger and mugged somebody?
01:13:34.320 It just doesn't happen.
01:13:36.000 So there's a special case where the Sikhs, in my opinion, have superior constitutional rights to you.
01:13:44.800 And I'm okay with it.
01:13:47.600 I mean, I can't imagine saying that in any other context.
01:13:51.440 There's a religious group that has a superior constitutional, or at least legal, right over me.
01:14:00.020 I can't carry a knife to the same place as they can, but I don't have a problem with that at all.
01:14:06.420 You know why I don't have a problem with that?
01:14:08.460 Because no Sikhs are using their knives.
01:14:10.900 They're just not using them.
01:14:12.740 So in a practical sense, it's a perfect practical solution.
01:14:16.820 So if the Sikhs can have knives and other people can't, in some domains, not every domain, you telling me you couldn't do that with guns?
01:14:29.440 We already have a precedent.
01:14:31.200 The precedent is set.
01:14:35.080 It's a better case than you think.
01:14:36.780 I don't think it'll ever fly.
01:14:39.120 But I'd like to be thinking in different terms.
01:14:43.040 Maybe there's a way to segregate the public in a way that makes sense that even the public would agree with.
01:14:49.260 I don't know.
01:14:50.600 I just feel like there's some way to break the logjam and let the Republicans keep all their guns and maybe let the other people work it out some other way.
01:15:03.400 All right.
01:15:06.160 I would probably just be happy if anybody voted.
01:15:11.840 Honestly.
01:15:12.560 If anybody registered to vote, I'd say they could have a gun, because it means they bought into the system.
01:15:20.300 Now, I realize that criminals would probably register to vote just to get a gun, but not a lot of them.
01:15:30.440 All right.
01:15:31.760 So, YouTube, thanks for joining.
01:15:33.280 I'll talk to you tomorrow.
01:15:34.420 Go.
01:15:38.460 I'll talk to you tomorrow.