Real Coffee with Scott Adams - November 13, 2021


Episode 1560 Scott Adams: I Tell You All the Ways the Left is Being Manipulated by Their Own News Sources


Episode Stats

Length

1 hour and 7 minutes

Words per Minute

147.1

Word Count

9,939

Sentence Count

789

Misogynist Sentences

10

Hate Speech Sentences

15


Summary

In this episode, I talk about how I fixed my iPad's design problem and how the micro USB port on my headphones drives me crazy. I also talk about why I hate the people who make these headphones, and why they suck.


Transcript

00:00:00.000 Good morning, everybody, ladies and gentlemen, and everything else.
00:00:09.880 What a day we have today. Wow. Wow. It's going to be so good.
00:00:17.360 Let me start by showing how I fixed the iPad's design problem.
00:00:23.460 You've probably heard that Apple computer does a pretty good job on design.
00:00:26.940 Well, not good enough. I had to take the iPad and fix it myself.
00:00:33.500 And I'll show you how I did it with one of my iPads here.
00:00:39.140 See, here's my iPad. And then if you see, let's see, see right here?
00:00:46.460 See this thing? This is a little sticky thing that I added myself.
00:00:50.260 Do you know why? Because every time I want to turn on or off my iPad,
00:00:56.140 or I want to change the volume,
00:00:59.540 I have to search all four corners because they look identical,
00:01:03.500 and the iPad can work in any orientation.
00:01:06.860 So from the front, you can't tell where the power button is,
00:01:11.360 the on and off switch, and you can't tell where the volume is.
00:01:14.620 So on each of my iPads, I had to modify them
00:01:17.420 so I know which corner the stuff is,
00:01:19.620 because otherwise I have a one in four chance
00:01:22.940 of getting it right on the first time.
00:01:25.080 So my experience of the iPad is always the same.
00:01:28.500 Wow, this iPad is a miracle of design and function.
00:01:34.300 Okay, let's turn it on.
00:01:36.200 Damn it. Damn it. Damn it.
00:01:39.040 Damn it.
00:01:39.780 I hate my iPad, but at least it's on.
00:01:43.560 That's right.
00:01:44.500 Every single time I turned my iPad on or off,
00:01:48.800 I hated it.
00:01:51.360 And I hated the designers who did this to me.
00:01:55.080 I don't get it on the first try ever, it feels like.
00:01:58.380 It should be one in four you get it on the first try.
00:02:00.780 But it feels like zero.
00:02:03.640 Now, don't get me started about who created the micro USB standard.
00:02:09.360 Have you ever used the micro USB?
00:02:11.360 Here, let me give you an example.
00:02:14.240 I'll do the simultaneous sip in a moment here.
00:02:16.820 But I'll give you an example.
00:02:19.040 God damn it.
00:02:20.000 I really, is this the time I'm not going to be able to find it?
00:02:23.920 I'm going to go into a swearing rant in about a second.
00:02:28.040 Come on.
00:02:28.560 There we go.
00:02:28.920 All right, so here's a micro USB.
00:02:33.100 Now, they made it so that there's a right way to go in
00:02:36.640 and a wrong way.
00:02:37.520 So, you know, this is right sometimes
00:02:39.440 and that's right sometimes.
00:02:40.520 And let me show you how the micro USB standard works.
00:02:45.200 So, here are my headphones.
00:02:47.520 And these headphones, of course, you can charge them.
00:02:51.440 Now, the headphones look very similar on the left and the right.
00:02:54.500 So, because the people who make these headphones suck
00:03:00.240 and I hate them like I hate anything.
00:03:04.560 So, you first have to figure out which of the two places has the place you plug it in.
00:03:11.500 So, I first go, well, let's, wait, it's sort of dark in here.
00:03:16.640 I can't even, I don't know.
00:03:18.360 Could you see the black hole on the black?
00:03:20.420 It's usually dark when I use these, you know.
00:03:25.560 So, you're like feeling around.
00:03:28.020 Where is it?
00:03:29.380 Oh, is this one?
00:03:30.960 This one?
00:03:31.480 So, there's a 50% chance you get it.
00:03:33.920 So, then once you find it, then you have to get the USB into it.
00:03:38.120 Let me show you how to do that.
00:03:39.640 So, here's the USB hole.
00:03:42.580 And, uh, you see that doesn't go in there.
00:03:46.780 Do you know why?
00:03:47.980 Because that's not the fucking USB hole.
00:03:50.740 That's just a little indentation exactly where the USB hole would be on the other one.
00:03:55.540 That's right.
00:03:56.580 They have a fake USB hole on one of the two places it could be.
00:04:01.500 They hate you.
00:04:02.980 Whoever made these, they hate you.
00:04:05.060 All right.
00:04:06.560 So, this is the real USB hole right here.
00:04:08.940 Now, you notice how it's made so that if you don't get it exactly in the hole on the first try,
00:04:16.860 it'll just...
00:04:18.940 It doesn't guide you into the hole.
00:04:22.360 And make sure that if you don't hit it on the first try, it doesn't go in.
00:04:30.200 All right.
00:04:30.740 But let's say you were more careful and you could find it.
00:04:35.600 Then, here's how you get it in.
00:04:37.240 And yet, remember, you have to get it in right side up.
00:04:40.100 So, here it is sliding right into the hole.
00:04:46.160 Okay.
00:04:46.800 So, it doesn't go in this way.
00:04:48.900 So, if it doesn't go in this way, obviously, it's the other way.
00:04:52.640 So, you turn it upside down and then you put it into the hole.
00:04:55.900 Okay.
00:04:58.140 It doesn't go into the hole the other way because it turns out that the first way was the correct way.
00:05:03.220 But because the stupid device is so poorly designed, you can't even tell if it's the right way.
00:05:09.320 So, you turn it back to the way that it wasn't working and you say, why isn't it going into the fucking hole?
00:05:15.240 And then you get it in there.
00:05:19.380 And then, by the time you use your device, you hate it.
00:05:23.760 You hate this thing.
00:05:25.120 So, how about the simultaneous sip?
00:05:34.240 I don't know.
00:05:34.900 Maybe it's just me complaining about this stuff.
00:05:37.240 But all you need is a cup or a mug or a glass, a tankard, a chalice, a stein, a canteen, a jug, a flask, a vessel of any kind.
00:05:42.360 Filled with your favorite liquid.
00:05:44.720 I like coffee.
00:05:45.400 Join me now for the unparalleled pleasure of the dopamine hit of the day.
00:05:50.840 The thing that makes everything better except your micro USB connection.
00:05:55.300 It's called the simultaneous sip.
00:05:57.560 And it happens now.
00:05:58.440 Go.
00:06:02.660 Oh, yeah.
00:06:05.540 That's good.
00:06:08.220 I saw a tweet that said the one thing that...
00:06:12.400 This is from Jason Mouk on Twitter.
00:06:16.360 He talks about being literate in the future.
00:06:18.460 It isn't going to be just reading and writing.
00:06:21.080 But because AI and robots and stuff will be taking over everything,
00:06:25.360 we also need to teach kids things that kids can do that, you know, maybe robots and AI can't do.
00:06:32.920 And the example given would be, you know, an eight-year-old can Google something and a robot can plow.
00:06:39.560 But what AI can't do, they say, is arrange living symphonies.
00:06:45.260 That's the claim, AI can't do it.
00:06:46.880 It can't arrange a living symphony.
00:06:49.560 But you know what it can do?
00:06:51.760 It hasn't done it yet.
00:06:53.740 But do you know what AI can do?
00:06:56.080 It can make music that you would like a lot better than a living symphony made by humans.
00:07:01.220 Like a lot better.
00:07:02.400 So, at the moment, AI can't write music that you would like to listen to.
00:07:09.360 In the long run, it's all going to be AI.
00:07:13.120 It's all going to be AI.
00:07:14.660 Because once AI is better, and it will be better, because it can A-B test parts of a song, sounds.
00:07:21.980 It can A-B test a chord.
00:07:24.440 It can A-B test like one chord.
00:07:26.100 Put it out to a million people and say, how do you like this chord?
00:07:31.140 And if enough people say yes, well, we'll put some of this in the song.
00:07:34.580 How do you like this lyric?
00:07:36.860 All right?
00:07:37.220 A lot of people like this one, don't like this one.
00:07:39.000 We'll put the good one in there.
00:07:40.680 AI could build a song that would be the number one song in the world, period.
00:07:44.800 It would make you tingle.
00:07:46.660 It would just reprogram your brain.
00:07:49.220 That's how good AI will be, to make music.
00:07:53.200 So, I wouldn't assume that music is the safe place.
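A minimal sketch, in Python, of the A/B-testing loop described above: play each candidate chord to a pool of listeners, count the yes votes, and keep the winner for the song. The names here (rate_chord, ab_test_chord) and the random listener feedback are invented purely for illustration, not anything an actual music AI uses.

```python
import random

CANDIDATE_CHORDS = ["Cmaj7", "Am", "Fsus2", "G7"]

def rate_chord(chord: str) -> bool:
    # Stand-in for asking one listener "how do you like this chord?" (random here).
    return random.random() < 0.5

def ab_test_chord(chords, listeners_per_chord=1_000_000):
    # Play each candidate to many listeners and return the one with the most yes votes.
    scores = {chord: sum(rate_chord(chord) for _ in range(listeners_per_chord))
              for chord in chords}
    return max(scores, key=scores.get)

# Whichever chords, lyrics, and sounds win their tests get assembled into the song.
print("Keep this chord:", ab_test_chord(CANDIDATE_CHORDS, listeners_per_chord=1000))
```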
00:07:57.300 How about fine art?
00:07:59.580 Do you think an AI will be able to make a painting that would be as awesome as our best human painters?
00:08:06.380 Well, it can't do anything like that now.
00:08:09.840 But it will.
00:08:11.360 It will.
00:08:12.840 Yeah.
00:08:13.740 AI will look at all the great paintings.
00:08:16.660 It'll test all the components of it.
00:08:19.220 It'll break it down and figure out what parts of it make it interesting to people.
00:08:23.640 And then it will make paintings that are way better than what a person can make.
00:08:27.580 If you were to buy a human painting, it would only be because of the novelty.
00:08:31.760 That, oh, look, a human did pretty good on this one.
00:08:34.460 Pretty good.
00:08:35.180 Not as good as the AI, obviously.
00:08:37.700 But it's pretty good for a human.
00:08:38.900 You know, the same way we're impressed by cave wall carvings.
00:08:45.600 Anyway.
00:08:46.120 The continued emasculation of Vice President Harris's husband is ongoing.
00:08:54.360 I saw this tweet by Katie Rogers.
00:08:59.680 She tweets, at a French cookware store.
00:09:02.780 I guess the VP is in France.
00:09:08.780 So at a French cookware store, the VP says the second gentleman, her husband, learned how to cook during COVID.
00:09:14.500 Quote, I guess the husband said, she taught me during COVID, out of necessity, after I almost burned down the apartment, he adds.
00:09:26.620 How does that make you feel?
00:09:28.080 Now, here's what I like and don't like about it.
00:09:33.020 On one hand, I do like the fact that she's flipping the gender roles.
00:09:42.140 I kind of like that.
00:09:43.420 I mean, anything that just shakes you up and makes you uncomfortable, I usually like.
00:09:48.540 If it makes you uncomfortable, I probably like it.
00:09:52.380 So the fact that she's very conspicuously becoming the, I'll use this in the traditional historic sense.
00:10:00.960 In the historic sense, she's become the man of the family.
00:10:04.560 She literally wears the pants.
00:10:07.760 Literally wears the pants.
00:10:09.720 And the suit, and the husband stays home and takes care of the house, and apparently he's cooking now.
00:10:16.780 I had to add to that tweet by saying, sadly, the French cookware store did not carry the French maid uniform the VP ordered the second gentleman to wear around the house.
00:10:27.520 I don't know how much worse the emasculation can get, but I think it's coming to this.
00:10:33.260 I think he'll actually be wearing some kind of a French maid outfit around the house as he's dusting.
00:10:38.600 Looking forward to that.
00:10:43.760 But let me also say, I don't really care.
00:10:47.900 It doesn't bother me what nature of the relationship is in their family.
00:10:53.280 That's purely their business.
00:10:55.380 It's just interesting that they're flipping the gender roles.
00:10:59.980 All right, here's a horrible story.
00:11:03.560 So there was an astronaut that went up on Blue Origin's flight with Bezos.
00:11:08.600 So this was just a month ago.
00:11:10.620 A guy, you know, a private citizen gets to go on a rocket ship into space.
00:11:15.500 And that turned out well.
00:11:17.300 And then a month later, he dies in a private plane.
00:11:21.020 He died in a Cessna 172 with some other people.
00:11:25.600 They don't know the cause of the problem yet.
00:11:28.900 But I'll tell you a little bit that I know about aviation.
00:11:32.580 Because, as you know, my wife is a pilot.
00:11:36.240 And the aircraft that he died in is the exact one she spends most of her time in.
00:11:43.480 It's a Cessna 172.
00:11:45.620 Which is, I think the Cessna 172 is like the most common small aircraft.
00:11:51.200 There are a lot of them.
00:11:51.880 But here's what you need to know about the Cessna 172.
00:11:56.400 Again, I'm just sort of talking based on stuff I've heard being around pilot talk.
00:12:02.760 If you're flying that kind of an aircraft, you usually want to hang around somewhere where there's a road.
00:12:10.360 Have you ever heard that?
00:12:11.880 You want to make sure that you can see a sizable street from wherever you're flying.
00:12:17.660 And the reason is, if you have a mechanical difficulty, you still have a pretty good glide path.
00:12:23.500 And so you could at least glide onto a street, which wouldn't be clean, but it's better than a forest.
00:12:30.220 So the fact that he went down in a wooded area probably limits the possible problems to the following.
00:12:40.040 Yeah, I know where you're going with this.
00:12:43.860 Some of you are pilots, so you're ahead of me already.
00:12:47.660 So probably, so one of two problems.
00:12:50.600 One is he, or the pilot, diverted from where they could see a street.
00:12:56.060 So once they had mechanical problems, maybe they didn't have any option of any place to land.
00:13:00.760 So that would have been maybe a pilot error if that happened.
00:13:06.080 Because it is, I think it's pretty basic to keep a road in view.
00:13:12.300 All right, if somebody's a pilot, could you fact check me on that?
00:13:15.060 Is it pretty basic to make sure that you can see a street?
00:13:18.800 Give me a fact check on that.
00:13:20.100 I know there's some pilots watching this.
00:13:22.320 So I think it is.
00:13:23.520 So I would rule out mechanical difficulty.
00:13:27.180 I'll also tell you that in these small aircraft, the number of times they have mechanical difficulty in the air is shockingly high.
00:13:36.800 Shockingly high.
00:13:37.520 The number of times they get 5,000 feet in the air and the engine starts sputtering, it's way more than you think.
00:13:46.600 But experienced pilots almost always can overcome the minor problems.
00:13:52.940 The number one reason that small aircraft crash is, anybody?
00:13:57.840 What's the number one reason that small aircraft crash?
00:14:01.840 Well, pilot error, yes, but I want to be more specific.
00:14:06.180 Yeah, it's pilot error, but specifically pilot error.
00:14:10.820 The answer is people who accidentally go from VFR to IFR.
00:14:15.880 So VFR means visual flying, meaning you can see well.
00:14:19.820 So you can fly fine because you've got lots of visibility.
00:14:23.060 IFR means that suddenly there might have been a cloud cover that they misjudged.
00:14:28.220 And if you get in the clouds, the 172 might have, I think it has, the ability to fly IFR, meaning you could fly instruments, but only if you were good at it.
00:14:40.520 Only if you were good at it, right?
00:14:42.860 It's not good enough that the plane can fly IFR.
00:14:46.540 The pilot has to be able to fly it.
00:14:47.900 So there's, if I had to guess, and this is just pure speculation, if I had to guess, I would say the most likely situation is there was some cloud cover, but we haven't heard about that.
00:15:01.780 So wait to hear.
00:15:03.080 Yeah, John F. Kennedy Jr., perfect example, yes.
00:15:05.860 And pilots are so afraid of that situation, going from VFR to instrument, you know, low visibility, that I think even some of the people who teach how to fly on instruments have never flown in clouds.
00:15:24.440 Let me say that again.
00:15:25.600 The people who teach you how to fly in the clouds, some of them I don't think have ever flown in the clouds, because it's too dangerous, even if you know how.
00:15:36.960 That's how dangerous it is.
00:15:38.800 And you say, well, how do they learn to teach it if they've never flown in it?
00:15:42.460 They have something called foggles, a special kind of goggles that cover the top of your sight so you can only see the instruments.
00:15:51.100 So they practice with their vision obscured, you know, with a pilot who can see in the back, usually, or the front.
00:15:59.580 Anyway, that's your little aviation knowledge for the day.
00:16:03.980 I saw an excellent persuasive tweet that might not work the way the tweeter hoped.
00:16:09.260 You might be familiar with a doctor, Robert Malone.
00:16:13.340 I would put him in the category of contrarian doctors about the pandemic, specifically.
00:16:18.000 So he's one of these rogue contrarian doctors, one of the inventors of the mRNA technology, I guess.
00:16:25.680 So he's really highly qualified.
00:16:28.220 So that's the first thing you need to know.
00:16:30.060 If there were somebody you wanted to listen to, he'd be exactly the kind of credentials you'd want to look at.
00:16:36.620 So he has lots of differences about the pandemic from the mainstream stuff.
00:16:44.900 And he tweeted a picture of himself next to a picture of Big Bird, because Big Bird is being employed to persuade children to get vaccinated.
00:16:56.460 And then Robert Malone, MD, tweets of the pictures of his picture next to Big Bird.
00:17:03.340 He goes,
00:17:03.640 Alright, so who do you believe?
00:17:23.960 Do you believe the highly qualified doctor whose experience is exactly the kind of experience you'd want?
00:17:32.060 Or do you believe Big Bird?
00:17:34.540 Which one is more credible, Big Bird or a doctor with the exact credentials you'd want to see?
00:17:42.700 Here's the answer.
00:17:44.000 It's a tie.
00:17:45.860 It's a tie.
00:17:48.540 Do you know why?
00:17:49.620 Because what Dr. Malone offers is to show you well-sourced information, this is his own words,
00:17:57.640 and to help you be able to interpret it for yourself.
00:18:01.680 That's not a thing.
00:18:03.920 That's not a thing.
00:18:05.520 You can't show me your well-sourced information and expect me to interpret it correctly.
00:18:11.580 How the hell am I going to do that?
00:18:14.240 I can't do that.
00:18:16.160 So who am I going to believe?
00:18:17.180 If Big Bird, who is essentially a front character for the entire medical pharma community,
00:18:25.020 you know, let's say the majority, not entire,
00:18:27.200 versus the rogue doctor who is in the category of people who are usually wrong.
00:18:33.800 The rogue doctor is usually wrong.
00:18:37.460 Now, everything that changes starts with, you know, one person who's the rogue doctor,
00:18:44.620 and then eventually other people get persuaded.
00:18:46.980 But how often does that happen, versus how often the rogue doctor is just wrong?
00:18:53.280 I would say experientially, you know, without any science to back it,
00:18:57.840 I would say experientially the rogue doctor on any topic is wrong 95% of the time.
00:19:04.860 Wouldn't you say?
00:19:08.300 Does your experience give you a different result?
00:19:11.600 And it's only when they get it right that it's such a big story.
00:19:15.660 It's like, oh my God, the person who was the only lone voice got it right.
00:19:21.140 That's why you think it's a bigger deal than it is.
00:19:23.640 Because it's always a story if the lone outlier gets it right.
00:19:28.160 Do you remember in 2016, some of you remember,
00:19:31.680 that I predicted before most people that Trump would win in 2016?
00:19:38.640 Now, why is it that everybody heard of that?
00:19:41.720 Why is it that I was somewhat famous in America for that prediction?
00:19:47.140 I mean, I was invited on shows or articles written about it.
00:19:51.520 Why was I famous for that?
00:19:53.240 Because it was a rogue outlier opinion that was right.
00:19:59.620 It was rare.
00:20:02.360 The reason I got press is because it was so rare to be a contrarian and be right.
00:20:08.640 So anytime you see a contrarian, you can, you know,
00:20:12.060 I say you should give them full respect of listening to them,
00:20:15.620 especially if they have credentials that Dr. Malone has.
00:20:19.080 Those are serious credentials, right?
00:20:21.380 If you're going to ignore somebody with that kind of credentials,
00:20:25.080 well, do it at your risk.
00:20:26.660 I'm just saying that don't assume they're right.
00:20:29.900 The odds are way against it.
00:20:32.520 But they might be.
00:20:33.920 This could be the 5%.
00:20:35.700 You never know.
00:20:38.340 All right.
00:20:40.580 Newt Gingrich tweeted also about Kamala Harris.
00:20:44.660 He said,
00:20:45.560 Why is Kamala Harris in Paris, France, worrying about the Polish-Belarus border,
00:20:51.540 which is actually what's happening,
00:20:53.100 instead of being in Paris, Texas, worrying about the U.S.-Mexican border?
00:20:57.840 Did she misunderstand Biden when he made her in charge of the border?
00:21:02.360 And I said to myself,
00:21:06.080 You know, this is a suboptimal situation.
00:21:08.640 We've got these big problems in America,
00:21:10.380 and our vice president is in Europe.
00:21:13.280 That is suboptimal.
00:21:15.360 You know why it's suboptimal?
00:21:17.240 Because Europe isn't quite far enough.
00:21:20.380 If we could get the vice president entirely on the other side of the world,
00:21:25.000 I'd feel a little more comfortable.
00:21:26.320 Maybe if she would go up in one of those Blue Origin or Elon Musk flights,
00:21:32.340 get her off the earth entirely.
00:21:35.600 I'd feel a little more comfortable about that.
00:21:38.420 All right.
00:21:38.640 Here's some Gavin Newsom fake news.
00:21:41.300 Gavin Newsom, governor of California.
00:21:43.120 There's a manipulated video that's really diabolical
00:21:46.320 that makes him look like he's smiling creepily when he's talking.
00:21:51.640 So somehow, I guess the technology exists now
00:21:54.600 that you can manipulate somebody's mouth on a video.
00:22:00.440 I mean, we had that capability for a long time,
00:22:02.820 but it looks like maybe it's easy to do now.
00:22:05.700 And so the governor is talking,
00:22:08.320 but he's got sort of a crazy joker mouth,
00:22:11.400 and it's really creepy.
00:22:14.380 Now, I don't know how anybody could think that was real.
00:22:18.500 Right?
00:22:19.200 It didn't look real to me,
00:22:21.240 but apparently some people did.
00:22:24.600 So that's the first Newsom fake news.
00:22:30.740 All right.
00:22:32.520 Do you know the author and investment advisor,
00:22:37.820 Robert Kiyosaki?
00:22:41.820 I think he wrote Rich Dad, Poor Dad.
00:22:45.120 Fairly famous for financial advice.
00:22:47.260 And I saw a tweet by Dr. Parikh Patel,
00:22:52.220 who's got lots of financial qualifications, it looks like.
00:22:55.540 And he showed a graph showing the stock market,
00:22:59.540 and, you know, it's climbed for years and years,
00:23:02.460 with Robert Kiyosaki's predictions
00:23:07.820 that it was going to crash.
00:23:09.220 Oh, it's going to crash.
00:23:10.880 It's going to crash.
00:23:12.540 Totally going to crash.
00:23:14.240 Oh, it's going to crash.
00:23:15.600 You're going to be so sorry when it crashes.
00:23:18.000 And apparently he hasn't been right yet,
00:23:19.820 at least in the long term.
00:23:22.180 He hasn't been right.
00:23:23.620 But here's what I say about this.
00:23:25.400 Number one, investment advice
00:23:29.400 is basically horoscopes with math.
00:23:34.980 Would you put your money into an investment
00:23:38.560 based on a horoscope?
00:23:41.240 No.
00:23:42.640 Well, don't put your money on anything
00:23:44.420 based on investment advice either,
00:23:46.060 because they're just horoscopes with math.
00:23:48.140 Nobody knows what's happening in the future.
00:23:51.120 Nobody understands the future.
00:23:53.260 Nobody.
00:23:53.600 There's no investment advisor who knows the future.
00:23:57.880 The only thing you can do is diversify.
00:24:00.700 And if you have enough money,
00:24:01.920 you might, you know,
00:24:02.640 take some risky bets on a few things,
00:24:05.320 you know, like high-tech things.
00:24:07.560 But I just gave you all the investment advice
00:24:10.340 you'll ever need.
00:24:11.720 You know, it could be like one page.
00:24:13.780 That's it.
00:24:15.180 Diversify, you know,
00:24:16.100 get the Fortune 500 index.
00:24:19.780 And that's 90% of what you need right there.
00:24:25.000 So, but let me say what Robert Kiyosaki
00:24:30.720 is doing wisely.
00:24:33.500 If a big crash doesn't come,
00:24:35.660 is anybody going to say,
00:24:37.460 oh, you got that wrong?
00:24:39.000 Or is he just going to say,
00:24:40.980 well, it hasn't happened yet.
00:24:42.260 What's going to happen?
00:24:42.940 So being wrong about a crash
00:24:45.100 generally doesn't hurt you.
00:24:47.620 Right?
00:24:48.760 You predict a crash,
00:24:49.820 it doesn't happen.
00:24:51.080 You can always say it's going to happen.
00:24:52.780 Well, it just hasn't happened yet.
00:24:54.000 Or you can say,
00:24:54.900 oh, we got lucky.
00:24:55.620 This is not good.
00:24:56.980 But if he gets it right,
00:25:00.160 he will be called out
00:25:01.540 the same way I got it right with Trump.
00:25:04.440 If I'd got my Trump prediction wrong in 2016,
00:25:08.500 I would just go on with life.
00:25:10.740 I'd say, well, got that one wrong.
00:25:12.660 Just go on with life.
00:25:14.320 But if you get it right,
00:25:15.380 it changes your life.
00:25:17.360 So Kiyosaki is playing the odds really well
00:25:21.640 because the odds of a big stock market crash
00:25:25.140 eventually are pretty good.
00:25:27.960 So he'll probably get one right
00:25:29.640 and get a lot of attention for it.
00:25:32.820 Might be a good thing.
00:25:34.140 At least in terms of publicity,
00:25:35.700 a good thing.
00:25:39.680 So here's a story on CNN
00:25:42.040 that's just sort of state of the world.
00:25:45.540 So apparently there was a white student
00:25:48.400 in a school,
00:25:49.260 it doesn't matter where,
00:25:50.060 somewhere in America,
00:25:51.160 and he made what is being described
00:25:53.020 as a racist video.
00:25:55.140 Now, I don't know the details
00:25:56.720 of the racist video,
00:25:58.140 so I don't know if you and I saw it,
00:25:59.700 we'd think it was racist,
00:26:01.140 but CNN says it's racist.
00:26:02.780 So let's take that assumption
00:26:04.360 and go with it.
00:26:06.300 Anyway, a black student
00:26:07.540 approached that white student
00:26:08.760 who had made that allegedly racist video,
00:26:10.960 and a physical altercation occurred,
00:26:13.440 and the white student was injured
00:26:14.780 and needed to be treated for head injuries.
00:26:17.300 Now, I don't know how bad the head injuries were,
00:26:21.540 but if it's like a concussion,
00:26:23.360 it's a brain injury.
00:26:25.940 So is it a head injury,
00:26:27.160 like, you know, he's got a bleeding head,
00:26:28.760 or is it a brain injury,
00:26:30.080 which would be a lot worse?
00:26:30.940 I don't know,
00:26:31.960 but it's worrisome.
00:26:34.040 But anyway, the students,
00:26:35.100 they called for a walkout,
00:26:36.740 and here's what they demanded.
00:26:37.800 They wanted disciplinary action
00:26:39.920 against the white student,
00:26:41.800 and no disciplinary action
00:26:44.520 for the black student
00:26:45.560 who caused the head injury
00:26:48.240 with this altercation.
00:26:51.580 Does that sound about right to you?
00:26:54.400 Now, what we don't know
00:26:55.720 is which one of the students
00:26:56.980 started the fight.
00:26:58.800 So if the black student
00:27:00.320 simply addressed the white student
00:27:03.960 with verbal complaints,
00:27:06.040 and the white student started the fight,
00:27:08.900 well, then I'd agree, right?
00:27:10.640 You know, whoever started the fight
00:27:12.640 is to blame.
00:27:13.840 But if it worked the other way,
00:27:15.160 or they sort of both started the fight,
00:27:17.100 which, you know, with kids,
00:27:19.400 with kids, it's usually
00:27:20.440 they both started the fight,
00:27:21.740 wouldn't you say?
00:27:22.820 You know, there's words,
00:27:23.860 there's a touch, there's a push.
00:27:26.600 It usually escalates at the same time.
00:27:28.760 It's not like somebody walks up
00:27:29.960 and throws a punch.
00:27:31.000 Could happen.
00:27:31.920 I mean, maybe.
00:27:33.060 But probably they're both about equally.
00:27:36.040 Culpable.
00:27:37.020 And it feels to me,
00:27:38.580 without knowing the full details of this,
00:27:40.320 so I could be wrong
00:27:41.040 if I heard the context,
00:27:42.700 but it feels like
00:27:43.620 it's another example
00:27:44.480 of violence against white males
00:27:46.060 being acceptable.
00:27:47.720 Would you agree?
00:27:49.380 Now, I'm not defending
00:27:51.200 his racist alleged video.
00:27:54.060 I'm not saying that, obviously.
00:27:57.000 I'm saying that the way this is written,
00:27:58.980 and the student's reaction to it
00:28:01.400 feels like, you know,
00:28:03.820 one more little brick on that wall
00:28:06.060 of it's okay to hunt white males
00:28:10.440 and hurt them.
00:28:14.240 So here's some more Newsom fake news,
00:28:16.940 except this time it's not Gavin Newsom,
00:28:18.700 it's Hawk Newsom,
00:28:20.300 with an E on the end of Newsom.
00:28:21.900 What are the odds
00:28:23.380 that two people
00:28:24.500 whose last name
00:28:25.520 is actually
00:28:26.400 a combination of
00:28:27.960 some news
00:28:29.220 are always creating
00:28:31.640 some news?
00:28:33.160 Their last name
00:28:34.080 is actually Newsom.
00:28:36.080 Some news.
00:28:37.320 Hey, you want some news?
00:28:38.180 I got some about Hawk.
00:28:39.460 You want some more news?
00:28:41.120 Got some about Gavin.
00:28:43.100 How weird is that?
00:28:44.840 But anyway,
00:28:45.280 there was some fake news
00:28:46.040 about Hawk
00:28:46.620 that suggested
00:28:47.440 he was encouraging violence
00:28:50.480 should the new mayor,
00:28:53.780 Eric Adams,
00:28:56.040 reintroduce the task force.
00:28:58.500 They had some kind
00:28:59.020 of a police task force
00:29:00.200 against gun violence.
00:29:01.980 And I guess it was disbanded,
00:29:03.440 and if he reintroduces it,
00:29:04.960 Hawk was saying
00:29:05.700 that there would be violence.
00:29:08.240 Now, the way it was reported
00:29:09.480 was that Hawk
00:29:12.080 was maybe encouraging violence
00:29:14.140 or in favor of it.
00:29:15.360 That actually didn't happen.
00:29:18.820 He predicted it.
00:29:21.160 Predicting violence
00:29:22.100 is perfectly acceptable.
00:29:24.700 Would you disagree?
00:29:26.500 I've predicted violence
00:29:28.000 lots of times.
00:29:29.680 Is there any problem
00:29:30.300 with predicting violence?
00:29:32.760 No.
00:29:33.860 So, while I don't
00:29:35.160 back Hawk Newsom
00:29:37.020 on a lot of things he says,
00:29:38.540 this was just fake news.
00:29:40.720 All right?
00:29:41.420 Now, I've told you before
00:29:43.840 that don't underestimate
00:29:46.460 Hawk Newsom's game.
00:29:50.720 He's really clever.
00:29:53.080 Right?
00:29:53.480 And what he did here
00:29:54.580 was walk right up
00:29:55.480 to the line
00:29:56.180 of saying something
00:29:57.040 that would have been inappropriate,
00:29:59.040 you know,
00:29:59.240 suggesting violence.
00:30:00.460 He walked up to the line,
00:30:01.780 but he didn't cross it.
00:30:03.440 But he probably is...
00:30:04.640 He is.
00:30:05.320 I'm not...
00:30:05.840 I'll take away the probably.
00:30:06.840 He is smart enough
00:30:08.340 to know
00:30:08.820 that phrasing it
00:30:10.420 exactly the way he did
00:30:11.440 would get the most attention
00:30:12.580 while still putting him
00:30:14.320 in a safe place,
00:30:15.360 which is...
00:30:15.920 He's not recommending violence.
00:30:17.480 He's just predicting it.
00:30:19.500 And it's pretty clever
00:30:21.060 because it worked.
00:30:22.960 So, every time
00:30:24.520 you think to yourself
00:30:25.500 that something
00:30:28.060 that Hawk Newsom's doing
00:30:29.120 doesn't make sense,
00:30:30.540 give it another thought
00:30:31.700 because he's operating
00:30:33.220 at a pretty sophisticated level
00:30:35.720 with persuasion.
00:30:36.500 All right.
00:30:40.240 Dagen McDowell tweets,
00:30:42.120 and I agree,
00:30:44.420 that Biden and company
00:30:45.600 have intentionally driven up
00:30:46.960 the price of oil and fuel
00:30:48.040 by taking a hatchet job
00:30:50.760 to our energy economy.
00:30:54.140 They've intentionally,
00:30:55.300 and intentionally will be the word
00:30:56.660 that I'll question here in a moment,
00:30:58.520 they've intentionally created
00:30:59.880 hardships for Americans.
00:31:01.720 Only when energy prices are high
00:31:03.620 do these alternatives make sense.
00:31:05.200 So, in other words,
00:31:06.940 Dagen is saying
00:31:07.560 that the Biden administration
00:31:08.860 is making fossil fuels
00:31:10.360 too high priced
00:31:12.120 to make it easier
00:31:13.680 for the, you know,
00:31:14.780 the green stuff
00:31:15.620 to be economical.
00:31:18.200 Now, generally speaking,
00:31:20.240 and I say this often,
00:31:22.180 I reject
00:31:22.940 the mind-reading opinions,
00:31:24.980 which is you have to be
00:31:25.840 in the mind of the other person
00:31:27.100 to have that opinion.
00:31:29.360 But I don't know
00:31:30.360 that that applies here
00:31:31.600 because I think
00:31:33.520 they're saying it directly,
00:31:34.580 aren't they?
00:31:35.760 You know,
00:31:35.940 isn't the Biden administration
00:31:37.040 saying directly
00:31:38.160 that they want green energy
00:31:39.340 and that they want to,
00:31:40.340 they want to price
00:31:41.640 carbon fuels out?
00:31:44.680 I feel like
00:31:45.560 that's a direct message,
00:31:46.780 but they don't say it,
00:31:48.900 they don't package
00:31:49.940 their message
00:31:50.580 as succinctly
00:31:52.160 as Dagen McDowell did
00:31:54.060 by saying
00:31:55.780 they're basically
00:31:56.420 torturing the middle class
00:31:58.420 to get their,
00:32:00.960 let's say,
00:32:03.000 left-leaning agenda through,
00:32:04.580 which is more green energy.
00:32:06.460 That's what it looks like.
00:32:07.820 So I don't think
00:32:08.420 that was mind-reading at all.
00:32:09.420 I think that was right on.
00:32:11.820 Anyway,
00:32:12.760 Steve Bannon got indicted
00:32:14.020 for refusing
00:32:14.980 to testify to Congress.
00:32:19.620 What part of the,
00:32:20.640 I assume the Constitution
00:32:22.160 gives Congress this power.
00:32:24.940 Congress has the power
00:32:26.380 to demand
00:32:27.140 that you go in
00:32:27.900 and talk to them
00:32:29.640 in public?
00:32:32.480 Does anybody know
00:32:35.100 where that law
00:32:36.240 would exist?
00:32:37.480 Or is it a law,
00:32:38.260 is it a right
00:32:38.760 they gave themselves
00:32:39.540 by just voting for it?
00:32:42.340 If there's a
00:32:43.440 constitutional scholar here,
00:32:46.180 could somebody tell me?
00:32:50.820 Yeah,
00:32:51.480 I didn't think,
00:32:52.080 I didn't think Congress
00:32:53.180 had any police power,
00:32:54.260 right?
00:33:00.260 So I'm a little confused
00:33:01.920 about this story.
00:33:03.540 They have subpoena power,
00:33:05.260 but does subpoena power
00:33:06.480 give them power
00:33:07.380 to punish you
00:33:08.160 if you don't
00:33:09.020 obey the subpoena?
00:33:10.260 I don't know.
00:33:13.780 I'm very much
00:33:14.880 in favor of citizens
00:33:16.080 resisting this
00:33:17.840 because it looks like
00:33:19.360 a show trial
00:33:20.060 to me.
00:33:21.860 You know,
00:33:22.080 when Congress
00:33:22.880 does these things,
00:33:23.780 it doesn't look
00:33:24.660 like the point of it
00:33:25.780 is justice,
00:33:26.580 does it?
00:33:28.220 You know,
00:33:28.760 when people get subpoenaed
00:33:29.800 for the court system,
00:33:31.420 I say to myself,
00:33:32.800 well,
00:33:33.100 flaws and all,
00:33:33.940 at least it's part
00:33:34.580 of the justice process,
00:33:35.840 so that's okay.
00:33:37.780 But this is outside
00:33:38.900 of the justice process.
00:33:40.840 How do you do
00:33:41.340 a show trial
00:33:42.140 when you're not part
00:33:43.120 of the justice system?
00:33:46.000 Yeah,
00:33:46.500 I guess Eric Holder
00:33:47.540 also refused.
00:33:51.300 So this is kind
00:33:52.540 of an interesting case.
00:33:53.640 We'll see
00:33:53.920 if any penalty
00:33:55.660 happens to Bannon.
00:33:56.740 I guess you could get
00:33:57.420 30 days in jail
00:33:58.480 for doing this.
00:33:59.780 But I definitely
00:34:01.120 support him in this.
00:34:03.100 I would support him.
00:34:04.160 And it wouldn't matter
00:34:05.000 what his reason is.
00:34:06.740 So I don't care
00:34:07.500 what Bannon's reason is
00:34:09.000 for not wanting
00:34:09.820 to do it.
00:34:10.680 I feel like every citizen
00:34:12.060 should be able
00:34:12.620 to refuse this.
00:34:14.360 Who's with me?
00:34:16.360 I don't think
00:34:17.380 we should be able
00:34:17.900 to refuse talking
00:34:18.800 to the real police.
00:34:22.060 But I think
00:34:22.880 we should be able
00:34:23.660 to refuse this.
00:34:28.320 Yeah?
00:34:28.760 Okay.
00:34:29.040 I think most of you
00:34:29.820 agree on that.
00:34:33.620 There is a great
00:34:35.300 article by Andrew Sullivan
00:34:38.000 on Substack.
00:34:40.580 I tweeted it
00:34:41.360 so you can find it,
00:34:42.120 you can just search
00:34:43.180 Andrew Sullivan
00:34:43.840 and look for his articles,
00:34:45.000 or check my Twitter feed.
00:34:48.200 And here's how he tweeted
00:34:49.800 about his own
00:34:50.560 this is his sort of
00:34:52.640 summary of his own article.
00:34:54.520 He said,
00:34:55.060 2016 election.
00:34:56.660 So see what all these
00:34:57.600 have in common.
00:34:58.240 What do all these
00:34:59.360 have in common
00:34:59.900 before he tells you?
00:35:01.280 2016 election,
00:35:03.000 Rittenhouse,
00:35:04.200 Covington,
00:35:05.420 Russian collusion,
00:35:06.960 vaccines,
00:35:08.360 bounties on U.S.
00:35:09.380 soldiers,
00:35:10.540 lab leak theory,
00:35:12.500 Jussie Smollett,
00:35:14.460 the Pulse shooting,
00:35:16.140 the Atlanta shooting,
00:35:17.960 Hunter Biden laptop,
00:35:19.620 inflation,
00:35:20.780 and the Steele dossier.
00:35:21.720 What do they all have
00:35:23.540 in common?
00:35:26.200 Fake news?
00:35:27.240 Yes.
00:35:28.480 That's one thing
00:35:29.160 they have in common.
00:35:29.860 What's the second thing
00:35:30.860 they have in common?
00:35:32.420 One is that they're
00:35:33.240 all fake news.
00:35:34.560 What's the second thing
00:35:35.620 they have in common?
00:35:39.380 Somebody says it's
00:35:40.400 about white people.
00:35:41.840 Not quite.
00:35:42.960 It's not right.
00:35:44.000 It's not exactly
00:35:44.640 where I'm going.
00:35:45.160 It's conservative bashing.
00:35:49.060 That's correct.
00:35:49.860 Every one of these stories
00:35:51.040 favors the left's narratives.
00:35:53.760 Every one of them.
00:35:54.640 And he didn't even
00:35:55.320 have a complete list.
00:35:57.160 He didn't have
00:35:57.880 the fine people hoax
00:35:58.840 and he didn't have
00:35:59.400 the drinking bleach hoax.
00:36:01.160 I'm not sure
00:36:01.640 if he knows
00:36:02.060 they're hoaxes,
00:36:02.800 honestly,
00:36:03.360 because a lot of people
00:36:04.200 who are well informed
00:36:05.160 don't know
00:36:05.760 that these are hoaxes.
00:36:07.200 But I would have
00:36:07.940 put them on the list.
00:36:09.280 But every one of these
00:36:10.580 has the same pattern to it,
00:36:13.140 that it's a liberal narrative.
00:36:16.000 And it's so shockingly obvious
00:36:18.100 when you see him
00:36:19.080 lay out the entire context.
00:36:21.660 We do not live
00:36:22.920 in a country
00:36:23.500 with anything
00:36:24.500 like a free press.
00:36:27.740 I mean,
00:36:28.680 technically it's free,
00:36:30.260 but the way it operates
00:36:31.420 is just as a captive
00:36:32.740 of the Democrats.
00:36:35.220 All right.
00:36:36.820 There was a French study
00:36:38.320 that I saw
00:36:39.060 and then didn't get
00:36:39.880 much attention,
00:36:41.440 maybe because it disagrees
00:36:43.020 with the narrative.
00:36:43.840 You be the judge.
00:36:45.720 So I'm not going to say
00:36:46.480 this study is necessarily
00:36:47.900 highly credible,
00:36:50.340 but it's worth noting.
00:36:52.040 So the French study
00:36:52.960 says long COVID,
00:36:54.720 the idea that
00:36:55.500 even after you recover
00:36:56.500 from COVID,
00:36:57.660 there would be symptoms
00:36:58.420 that are ongoing,
00:36:59.840 are all in your head.
00:37:04.000 Now,
00:37:04.760 that was always
00:37:06.020 a possibility, right?
00:37:08.000 We've seen the reports
00:37:09.120 that up to half
00:37:10.500 of the people
00:37:11.060 might have long COVID,
00:37:12.760 you know,
00:37:13.000 at least 25%,
00:37:14.180 but up to half.
00:37:15.500 Now,
00:37:15.780 from day one,
00:37:17.600 have I not told you
00:37:18.560 from day one,
00:37:19.920 it could be
00:37:20.960 all in people's heads.
00:37:22.840 This is exactly
00:37:23.820 what that would look like.
00:37:25.440 All right.
00:37:26.220 If it were real
00:37:27.280 or it were not real,
00:37:28.560 it would look
00:37:29.040 exactly like this.
00:37:30.600 Whether it was real
00:37:31.580 or not real,
00:37:32.260 it would look
00:37:32.800 exactly like this.
00:37:34.340 So you can't really tell
00:37:35.560 based on all the reports
00:37:37.640 and facts
00:37:38.400 and you can't tell
00:37:39.620 by asking people.
00:37:41.580 So if you ask people,
00:37:42.900 hey,
00:37:43.220 did you have any symptoms?
00:37:44.520 They'll say yes.
00:37:46.180 But you don't know
00:37:47.240 if they're accurate.
00:37:48.900 That's the problem.
00:37:50.280 So this new study
00:37:51.220 did something clever
00:37:52.240 in which,
00:37:54.560 I'm not sure
00:37:55.200 I totally understand it,
00:37:56.300 but they found a way
00:37:57.900 to do a control,
00:37:59.900 allegedly.
00:38:01.720 So let me tell you
00:38:02.700 what they did.
00:38:03.080 They had a huge pile
00:38:06.240 of people
00:38:06.600 that they were looking at
00:38:07.500 but only 1,000 of them
00:38:09.560 they ended up
00:38:10.240 looking at closely
00:38:11.300 because there were 1,000
00:38:12.860 who actually tested positive
00:38:14.740 and out of the 1,000
00:38:16.240 who tested positive
00:38:17.160 for COVID,
00:38:18.320 about 450 believed
00:38:20.040 they had the virus.
00:38:22.180 So more than half of the people
00:38:24.060 who had it
00:38:26.560 didn't even know
00:38:27.180 they had the virus.
00:38:28.620 Now you ask the people
00:38:29.760 who definitely had the virus
00:38:31.040 and knew it
00:38:31.640 if they had symptoms
00:38:33.460 and then you ask the people
00:38:35.260 who didn't know it
00:38:35.940 if they had symptoms
00:38:36.640 and apparently
00:38:37.220 there's a difference.
00:38:38.860 But anyway,
00:38:39.640 I'm doing a terrible job
00:38:41.940 of explaining this
00:38:42.780 so if it sounds like
00:38:43.760 a bad study,
00:38:45.980 that has more to do
00:38:46.940 with the way
00:38:47.320 I'm explaining it,
00:38:48.180 I think.
00:38:49.280 But anyway,
00:38:49.960 they concluded
00:38:50.500 that the persistent
00:38:51.340 physical symptoms
00:38:52.440 quote,
00:38:53.440 may be associated
00:38:54.420 more with the belief
00:38:55.480 in having been infected.
00:38:58.220 So in other words,
00:38:59.260 if you thought
00:39:00.080 you had been infected
00:39:01.000 but you hadn't,
00:39:02.940 you have the same rate
00:39:03.960 of long COVID
00:39:04.780 as the people
00:39:05.580 who have been infected.
00:39:08.680 I think that's
00:39:09.580 what it's saying,
00:39:10.660 right?
00:39:11.560 Let me say that again.
00:39:12.680 If you thought
00:39:14.080 you had COVID
00:39:14.720 but you didn't,
00:39:16.640 you have the same experience
00:39:18.020 of long COVID,
00:39:19.000 you report the same symptoms
00:39:20.340 as people who actually
00:39:21.740 had the thing,
00:39:23.000 which would suggest
00:39:24.300 there's no such thing
00:39:25.640 as long COVID.
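A minimal sketch, in Python, of the kind of comparison the French study is described as making: look at self-reported persistent symptoms two ways, split by whether people believed they were infected and split by whether a test actually confirmed infection. The data and field names below are invented for illustration; the study's claim, roughly, is that the first gap is large and the second is small.

```python
def symptom_rate(group):
    # Fraction of people in the group reporting persistent symptoms.
    return sum(p["symptoms"] for p in group) / len(group)

# Toy records: test-confirmed infection, belief in having been infected,
# and self-reported persistent ("long COVID") symptoms.
people = [
    {"tested_positive": True,  "believes_infected": True,  "symptoms": True},
    {"tested_positive": True,  "believes_infected": False, "symptoms": False},
    {"tested_positive": False, "believes_infected": True,  "symptoms": True},
    {"tested_positive": False, "believes_infected": False, "symptoms": False},
]

by_belief   = (symptom_rate([p for p in people if p["believes_infected"]]),
               symptom_rate([p for p in people if not p["believes_infected"]]))
by_serology = (symptom_rate([p for p in people if p["tested_positive"]]),
               symptom_rate([p for p in people if not p["tested_positive"]]))

print("Symptom rate by belief (yes vs no):", by_belief)
print("Symptom rate by test result (pos vs neg):", by_serology)
```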
00:39:26.320 I'm not convinced
00:39:29.980 and I'm not convinced
00:39:31.720 for the wrong reason.
00:39:33.640 Here's the worst reason
00:39:34.780 to not be convinced.
00:39:36.980 Anecdotal evidence.
00:39:38.840 I know two people
00:39:40.020 whose experience
00:39:42.300 with long COVID
00:39:43.140 as they describe it
00:39:44.720 doesn't sound like
00:39:46.620 something that could be
00:39:47.320 in your head.
00:39:48.180 I mean,
00:39:48.400 it's just too dramatic.
00:39:49.980 It's like way bigger
00:39:50.900 than something
00:39:51.400 that would be in your head.
00:39:52.800 But,
00:39:53.300 and also I believe
00:39:54.340 that they found
00:39:54.900 differences in the lungs
00:39:56.120 long after you recovered.
00:39:58.960 So,
00:39:59.580 I'm still on the side
00:40:01.040 that says long COVID
00:40:02.200 is probably something.
00:40:05.060 But it might be
00:40:05.840 a lot smaller
00:40:06.500 than we think.
00:40:07.960 But probably something.
00:40:11.660 Yeah.
00:40:12.320 So,
00:40:12.600 Dr. Drew
00:40:13.400 is one who had
00:40:14.400 symptoms
00:40:15.280 that it would be hard
00:40:17.060 to imagine
00:40:17.620 that those were imaginary.
00:40:19.520 Because they were
00:40:20.160 pretty dramatic.
00:40:23.040 You can think,
00:40:24.120 if the only problem
00:40:25.140 was,
00:40:25.560 oh,
00:40:25.760 I had more headaches
00:40:26.660 or something,
00:40:27.300 you'd say,
00:40:27.720 eh,
00:40:27.980 did you?
00:40:28.920 I mean,
00:40:29.240 are you keeping
00:40:29.800 track of your headaches?
00:40:31.380 You really know
00:40:32.180 you had more headaches?
00:40:33.480 You know,
00:40:33.620 that would be suspicious.
00:40:34.980 But the types of symptoms
00:40:36.160 are pretty dramatic.
00:40:37.540 I don't think that,
00:40:39.160 I would be surprised
00:40:40.060 if they were imaginary.
00:40:41.680 Anyway,
00:40:42.000 what is the main theme
00:40:45.540 of Democrats
00:40:46.920 and the left?
00:40:48.500 I think it's child abuse.
00:40:50.940 And look at this list.
00:40:52.920 Climate alarmism.
00:40:54.040 Would you agree
00:40:56.460 that the children
00:40:57.220 are being taught
00:40:58.180 climate alarmism?
00:41:00.660 And would you also agree
00:41:01.660 that the kids
00:41:02.420 can't do anything
00:41:03.220 about climate change?
00:41:05.680 I mean,
00:41:06.200 you know,
00:41:06.500 Greta can complain,
00:41:07.600 but basically,
00:41:08.680 children in general,
00:41:10.800 they can't do anything
00:41:11.780 except
00:41:12.840 influence their parents.
00:41:18.180 That's the play, right?
00:41:20.000 Isn't the play
00:41:20.900 to influence their parents?
00:41:22.080 So we now have
00:41:25.600 a pretty strong
00:41:27.180 medical indication
00:41:28.260 that anxiety
00:41:29.940 is being,
00:41:31.600 you know,
00:41:33.180 definitely kicked up
00:41:34.420 because of climate alarm.
00:41:36.720 All right,
00:41:36.880 would you agree
00:41:37.460 with the following assertion?
00:41:39.500 We do know
00:41:40.500 that kids are getting
00:41:41.580 really scared
00:41:42.420 and anxious
00:41:42.880 about climate change.
00:41:44.640 Okay,
00:41:44.860 I think we all agree
00:41:45.940 that that's true.
00:41:47.040 We also know
00:41:47.940 that anxiety
00:41:48.620 is very strongly correlated
00:41:51.060 with bad health outcomes.
00:41:53.820 So does climate change
00:41:55.320 alarmism
00:41:56.000 cause children
00:41:57.360 to be less healthy?
00:41:59.620 Yes.
00:42:00.600 Yeah,
00:42:00.860 I mean,
00:42:01.080 it's pretty clear.
00:42:02.620 Unless,
00:42:03.600 I guess the only caveat
00:42:05.180 would be
00:42:05.720 if adults
00:42:07.020 are more sensitive
00:42:07.940 to anxiety
00:42:08.640 or something.
00:42:09.340 Maybe kids
00:42:09.800 are more resilient.
00:42:11.020 I doubt it.
00:42:12.160 But, I mean,
00:42:12.780 you could imagine it.
00:42:14.420 So,
00:42:15.300 climate change
00:42:15.980 is basically
00:42:16.620 child abuse
00:42:17.680 in the,
00:42:19.480 at least how it's
00:42:20.420 being presented
00:42:21.280 to children,
00:42:22.540 for some larger goal,
00:42:25.220 you know,
00:42:25.460 some goal of the left
00:42:26.540 to have more green energy,
00:42:28.480 I guess.
00:42:29.560 Critical race theory,
00:42:31.100 or let's not call it
00:42:32.460 by that name,
00:42:33.120 but the things
00:42:34.420 that are embedded
00:42:34.960 in curriculums,
00:42:36.420 does that not just
00:42:37.460 turn kids against each other?
00:42:39.200 It basically labels
00:42:40.640 some kids victims
00:42:41.640 and some kids oppressors.
00:42:43.620 How is that good
00:42:44.640 for kids?
00:42:45.880 That feels like
00:42:46.560 child abuse,
00:42:47.240 doesn't it?
00:42:48.540 To sort them
00:42:50.380 into abusers
00:42:51.260 and abused.
00:42:52.840 I think it's child abuse.
00:42:54.560 How about
00:42:54.820 lack of school choice?
00:42:56.820 If you wanted
00:42:57.720 your kid to get
00:42:58.440 out of the cesspool
00:42:59.440 of public school,
00:43:00.520 you couldn't do it.
00:43:02.000 You know,
00:43:02.260 unless you had
00:43:02.760 a lot of money.
00:43:04.240 So, that feels like
00:43:05.300 some kind of
00:43:06.700 child abuse.
00:43:08.800 How about
00:43:09.160 forced vaccinations
00:43:10.440 for kids
00:43:11.180 that have almost
00:43:11.780 no chance
00:43:12.400 of having a problem
00:43:13.180 with COVID?
00:43:14.800 But we're going
00:43:15.440 to vaccinate
00:43:16.300 those kids
00:43:16.900 to keep adults safe.
00:43:19.320 It's child abuse.
00:43:20.740 It's child abuse
00:43:21.740 for the benefit
00:43:23.620 of adults.
00:43:26.080 How about
00:43:26.940 social media?
00:43:29.680 We allow
00:43:30.580 children to use
00:43:31.520 social media,
00:43:32.240 which I'm opposed to.
00:43:33.440 I don't think
00:43:33.900 children should be
00:43:34.520 allowed to use
00:43:35.040 social media
00:43:35.600 because it's damaging.
00:43:37.700 It's obviously damaging.
00:43:40.640 It couldn't be
00:43:41.480 more obvious.
00:43:42.840 I'm not even sure
00:43:43.740 adults should use
00:43:44.880 social media,
00:43:46.180 but at least we get
00:43:46.900 to make adult choices.
00:43:49.060 Kids don't make
00:43:49.720 adult choices.
00:43:51.500 You're giving,
00:43:52.280 basically,
00:43:52.820 you're feeding
00:43:53.140 cocaine to children.
00:43:55.300 That's what it is.
00:43:56.160 It's like giving
00:43:56.600 them cocaine.
00:43:58.200 So,
00:43:58.800 that's child abuse.
00:44:00.940 How about
00:44:01.680 the Rittenhouse
00:44:03.080 prosecution?
00:44:03.700 It was a 17-year-old
00:44:06.900 who was being
00:44:08.420 made a scapegoat
00:44:09.820 to feed the
00:44:12.060 narratives that
00:44:12.840 white people are bad
00:44:13.760 and guns are bad
00:44:14.600 and whatever else.
00:44:16.200 And racism,
00:44:16.980 I guess,
00:44:17.240 even though there
00:44:17.680 was no race involved.
00:44:19.560 So,
00:44:20.120 you know,
00:44:20.480 you could argue
00:44:21.020 whether 17 at the
00:44:22.360 time of the crime,
00:44:23.480 he's 18 now,
00:44:24.620 or the alleged crime.
00:44:25.940 No crime.
00:44:26.420 No crime is,
00:44:27.520 no crime is in
00:44:29.020 evidence,
00:44:30.400 but it's an alleged
00:44:31.060 crime.
00:44:31.360 That looks like
00:44:33.700 child abuse too.
00:44:36.100 So,
00:44:36.700 at what point
00:44:37.380 do we say
00:44:37.880 this pattern
00:44:38.480 is a little
00:44:38.940 too obvious?
00:44:41.000 Right?
00:44:41.620 Now,
00:44:41.900 I don't think
00:44:42.840 that there are
00:44:43.260 any people
00:44:44.920 on the left
00:44:45.420 saying,
00:44:45.800 you know what
00:44:46.200 we could do?
00:44:46.900 We could abuse
00:44:47.640 some children
00:44:48.120 and then we
00:44:49.140 get away with this.
00:44:50.300 But just like
00:44:51.300 follow the money
00:44:52.160 usually predicts,
00:44:54.320 follow the child abuse
00:44:55.640 seems to predict
00:44:56.640 wherever Democrats
00:44:57.860 are going to go.
00:44:59.320 If you said to yourself,
00:45:00.260 I don't know
00:45:00.640 how the Democrats
00:45:01.300 are going to respond
00:45:02.100 to some future
00:45:03.300 situation.
00:45:04.780 Just ask yourself,
00:45:05.980 what would be
00:45:06.360 the worst thing
00:45:07.020 for children?
00:45:08.100 And that will be it.
00:45:09.640 Is it bad
00:45:10.420 for children?
00:45:11.260 Yeah,
00:45:11.480 they'll do that.
00:45:13.500 I mean,
00:45:13.760 it's really consistent.
00:45:16.080 All right.
00:45:21.660 I love this tweet
00:45:23.240 from Michael Edwards
00:45:25.020 on Twitter.
00:45:25.780 So I got into
00:45:28.220 a conversation
00:45:29.360 with somebody
00:45:29.840 who said that
00:45:30.500 I'm a dumbass
00:45:31.560 for believing
00:45:32.780 that CRT
00:45:33.600 is taught
00:45:34.240 in schools
00:45:35.200 because CRT
00:45:36.600 is only a
00:45:37.480 college-level class
00:45:38.880 and man am I
00:45:40.360 an idiot
00:45:41.040 for thinking
00:45:42.160 that any of that
00:45:42.840 is in the public
00:45:43.600 school system.
00:45:46.120 Now,
00:45:46.960 this person
00:45:48.920 is a victim.
00:45:50.240 The person
00:45:50.920 who thinks
00:45:51.320 that CRT
00:45:51.980 is not being
00:45:52.760 taught in schools.
00:45:54.060 Every day
00:45:55.720 I wake up
00:45:56.380 and my social media
00:45:57.360 has at least
00:45:57.940 one example
00:45:58.700 of written
00:45:59.760 documentation
00:46:00.520 where clearly
00:46:01.420 the tenets,
00:46:03.580 you know,
00:46:03.800 the philosophy
00:46:04.700 of CRT
00:46:05.500 is embedded
00:46:06.020 in the curriculum.
00:46:07.100 It's in writing
00:46:07.880 in lots of
00:46:09.100 different places
00:46:09.740 and it's clear.
00:46:11.460 Right?
00:46:11.600 They don't use
00:46:12.100 the words
00:46:12.380 critical race theory.
00:46:13.440 They just
00:46:13.800 embed the concepts.
00:46:16.320 So somehow
00:46:17.280 the left
00:46:17.900 has convinced
00:46:18.560 a lot of morons
00:46:19.920 that just because
00:46:21.240 the name
00:46:21.740 critical race theory
00:46:22.700 isn't specifically
00:46:23.780 used when they
00:46:24.660 talk about
00:46:25.240 schools and
00:46:26.220 high schools
00:46:26.720 that somehow
00:46:28.200 that's meaningful.
00:46:30.360 It's not meaningful.
00:46:31.720 You're just using
00:46:32.280 a different word.
00:46:33.280 But listen to what
00:46:33.900 Michael Edwards
00:46:34.540 tweets and I love
00:46:35.660 this.
00:46:36.560 He goes,
00:46:38.880 if there,
00:46:42.720 so after this guy
00:46:44.740 said I was a
00:46:45.280 dumbass saying
00:46:45.860 there's none
00:46:46.340 of this is being
00:46:47.180 taught in schools,
00:46:48.420 Michael Edwards
00:46:48.940 tweets,
00:46:49.280 if there is
00:46:50.040 none in schools
00:46:50.920 then there's
00:46:52.040 no problem
00:46:52.540 banning teaching
00:46:53.300 it, right?
00:46:55.360 Why would you
00:46:56.180 be opposed
00:46:56.860 to banning
00:46:57.860 something that
00:46:59.380 isn't happening
00:46:59.960 anyway?
00:47:00.760 Can't we all
00:47:01.220 agree on that?
00:47:02.560 If critical race
00:47:03.760 theory is not
00:47:05.220 being taught
00:47:05.700 in schools
00:47:06.220 and that's
00:47:07.220 what the left
00:47:07.780 says,
00:47:09.360 then let's
00:47:09.980 just ban it
00:47:10.720 and everybody
00:47:11.740 wins.
00:47:12.900 The left
00:47:13.440 is irrelevant
00:47:14.820 because nothing
00:47:15.540 would happen.
00:47:16.600 You can't ban
00:47:17.280 something that's
00:47:17.820 not happening.
00:47:18.320 So big deal.
00:47:20.400 And the right
00:47:21.100 would be happy
00:47:21.620 to ban it.
00:47:22.500 So we've got
00:47:23.660 a deal, right?
00:47:25.480 We'll all just
00:47:26.180 ban it.
00:47:27.200 And the left
00:47:27.940 will say,
00:47:28.340 well,
00:47:28.540 nothing will
00:47:29.240 happen.
00:47:33.360 All right.
00:47:36.820 American courts are saving the day.
00:47:40.080 I am of the opinion that the only thing that holds America together is the court system, and that the court system is the jewel in the crown of the republic.
00:47:51.780 Everything else in the republic is a good idea, but the court system is the one that's got to be working.
00:48:00.600 Now, it's full of warts.
00:48:02.180 Of course, you know, any big system has problems, but it's the only thing that'll save us.
00:48:08.900 And I'll give you a couple of examples.
00:48:11.420 Apparently, the Biden idea to have OSHA enforce vaccine mandates got stopped by, I guess, the Fifth Circuit, who said it was just an insanely overbroad mandate that didn't distinguish by degree of risk.
00:48:28.520 And it doesn't make sense for the truck driver to have the same degree of risk, if he's all by himself, compared to other types of jobs.
00:48:35.720 So they just said, oh, this is too far.
00:48:40.240 They stopped it.
00:48:41.100 Now, I don't know if that'll last, but it's stopped at the moment.
00:48:45.120 Likewise, I think the court system is going to save us from this Rittenhouse situation.
00:48:50.240 I think the court is going to find him not guilty of murder, and even the prosecution has sort of given up on their murder case, and they're introducing lesser charges.
00:49:00.520 Does that seem fair to you?
00:49:03.260 Can the court, halfway through the trial, or even toward the end of the trial, can the court just say, oh, it's not really this charge, it's a different charge?
00:49:13.000 Really?
00:49:14.660 Can they do that?
00:49:15.620 Apparently they can, because it's being done.
00:49:18.200 It doesn't seem like that would be fair to the charged, but maybe it is, if it speeds things up, I don't know.
00:49:26.980 But once the prosecution has admitted that they don't have a case, and there were no surprises in the trial, just think about this.
00:49:39.980 I don't believe the defense introduced any kind of surprises.
00:49:45.200 So everything that was known by the prosecution before the trial is known now, which is how much evidence they have, which is not evidence of murder anyway.
00:49:56.060 And I still feel like the prosecutor needs to go to jail for that.
00:50:00.880 What the prosecutor did to Kyle Rittenhouse, it's not a crime technically, but boy, does it need to be.
00:50:11.620 I mean, at the very least, you need to be disbarred.
00:50:14.120 I meant the least.
00:50:15.660 But I don't know that any of that will happen.
00:50:18.100 Still, I'm glad that the court system, I think, will keep Kyle Rittenhouse from the worst of the potential outcomes.
00:50:24.220 I heard this factoid that is not confirmed, because it's just in a tweet from somebody I don't know.
00:50:33.780 But can anybody confirm that this is true?
00:50:36.480 That we learned just recently that Rittenhouse's dad, grandma, aunt, uncle, and cousins all live in Kenosha?
00:50:44.560 Because a big part of the story is he came to a town that he wasn't part of, and he should have just stayed home.
00:50:51.760 It wasn't his business.
00:50:54.220 So, I'm seeing somebody saying that his dad does.
00:50:58.440 Let me explain how a two-family, you know, a two-parent situation works when they're separated or divorced.
00:51:08.500 Rittenhouse has two hometowns.
00:51:11.240 Am I wrong?
00:51:12.660 If his father's in one town and his mother's in a neighboring town, presumably he spends a lot of time in both; he might have a bed in one and not in the other.
00:51:22.400 But he has two hometowns.
00:51:24.220 You know, the town that you spend your time in, with your father, who lives there, is kind of your hometown, too.
00:51:34.080 So, and he was only 20 minutes away.
00:51:36.800 So, all of this business about crossing state lines is just bullshit.
00:51:42.800 It's just bullshit.
00:51:44.800 So, it was bullshit before I knew that he had a reason to be there.
00:51:51.280 That was his hometown he was protecting.
00:51:53.980 He was protecting his father's town, you could say that.
00:51:57.440 Oh, and somebody says he works in Kenosha.
00:51:59.620 Yeah, Kenosha is pretty much his town.
00:52:02.880 All right.
00:52:04.120 The funniest thing that came out of this is there's some photos of some part of the trial where it looks like the judge was sitting at maybe the defendant's table to look at some, I don't know, documents or a video or something.
00:52:20.040 And Kyle Rittenhouse, not handcuffed, is leaning over the judge's shoulder, not handcuffed, leaning over the judge's shoulder, looking at stuff with him, and nobody in the courtroom is the least bit worried that Kyle would be dangerous.
00:52:38.240 He's being accused of double murder, and nobody in the courtroom, not even the bailiff, is anywhere near him.
00:52:47.660 The bailiff's on the other side of the room, and an unhandcuffed guy charged with double murder is just looking over the shoulder of the judge, and nobody's concerned.
00:52:59.140 Nobody's concerned.
00:53:00.840 Because there's nobody there who thinks he's a murderer.
00:53:04.780 Not the defense, not the prosecution, not anybody.
00:53:12.760 I'm hearing in the comments that Robert Barnes might think there's more risk for Rittenhouse because the defense didn't do a good job or something.
00:53:19.840 Is that the story?
00:53:21.020 I don't know the details of that, but you should follow Barnes.
00:53:27.820 I saw a tweet from Christian Venderbruck talking about the issue of Kyle.
00:53:32.900 He was 17 at the time.
00:53:34.840 And Christian asked this on Twitter.
00:53:37.020 What the hell kind of, quote, well-regulated militia includes child soldiers?
00:53:44.480 Do you want to answer that question?
00:53:45.860 What kind of well-regulated militia, well-regulated militia, in quotes, includes child soldiers?
00:53:54.160 Do you know what the answer is?
00:53:56.140 All of them.
00:53:57.620 All of them.
00:53:59.400 Yes.
00:54:00.540 Every one of them.
00:54:02.400 Do you think the 15-year-old colonial boys weren't going to pick up a gun?
00:54:08.040 Of course they were.
00:54:09.800 Of course they were.
00:54:11.180 17-year-olds join the military routinely.
00:54:15.760 Yeah.
00:54:16.740 Child soldiers is the norm.
00:54:18.780 Of course we're going to have child soldiers.
00:54:20.760 I'd hate to go to war without any.
00:54:23.800 Imagine trying to go to war without any, you know, 17-, 18-year-olds as part of the process.
00:54:29.760 It would be a lot harder.
00:54:34.000 Apparently Wisconsin has activated 500 National Guard troops.
00:54:38.800 Do you think we would need any National Guard troops if we were not all completely aware that the national media is trying to turn this into a riot?
00:54:51.140 If there's a riot, it's not about Kyle, is it?
00:54:55.860 The riot would be whatever the mainstream media ginned up.
00:55:00.380 And it looks like they're trying to create a little riot here.
00:55:03.840 And I wonder if activating the troops is part of the risk management, I hope.
00:55:10.740 I hope it's just good risk management.
00:55:13.060 Or is it part of the persuasion?
00:55:17.220 Are we activating these troops as part of the signal that there's going to be a riot, to make it more likely?
00:55:24.660 Because I would like to not know that they were activated, wouldn't you?
00:55:29.120 I'd like them to be activated, but I don't want it to be in the news necessarily.
00:55:34.660 Does it make it less or more likely that there'll be rioting?
00:55:38.980 I don't know.
00:55:39.400 It feels more like game on.
00:55:41.800 There's going to be a lot of action and energy.
00:55:44.440 So wherever there's energy, people are attracted.
00:55:47.860 And I think if you put all these National Guard troops in there, it's going to be a lot of energy.
00:55:52.760 So I'm not sure if that's going to work exactly as they hope.
00:55:56.540 But it's probably good risk management to do this.
00:55:59.880 But talking about it, maybe not so much.
00:56:01.940 All right, that is pretty much... influence the jurors?
00:56:08.140 Yeah, could influence the jurors.
00:56:09.740 You're right.
00:56:11.920 Ten Proud Boys could handle it.
00:56:14.760 Don't get me started on the Proud Boys.
00:56:18.460 You know, the problem with groups like the Proud Boys, and really any group, the left, the right, the problem with any of these groups is there's always going to be somebody bad in a group.
00:56:28.520 And then they will come to define the entire group.
00:56:38.040 Russell Brand coming out as a commentator like you, leaning liberal but expresses very conservative opinions.
00:56:44.900 He has four million subscribers.
00:56:47.160 Yeah, Russell Brand, I hear what you're saying, that he leans liberal, but he's open-minded and he's smart, and he's on the short list of people that I think is immune to some of the biases.
00:57:06.260 And he's got four million subscribers.
00:57:08.600 I think he earned four million subscribers.
00:57:11.160 I think his number of subscribers is completely commensurate with the quality of his product.
00:57:17.360 It's a very high quality product.
00:57:18.760 If you haven't watched it, it's really good.
00:57:22.080 Oh, also the divorce courts, yeah.
00:57:27.440 He's a libertarian, somebody says.
00:57:29.120 That makes sense.
00:57:29.880 A left-leaning libertarian.
00:57:32.620 And Bill Maher keeps pounding on the woke mob as well.
00:57:44.600 Conservatives with a liberal audience; where's the other way around?
00:57:48.120 Oh, good point.
00:57:49.680 Yes.
00:57:50.560 There are liberal-leaning people who have conservative audiences, and I would be one of them.
00:57:55.440 By the way, let me explain this in case, you know, anybody's new here.
00:58:01.160 Here's the way I try to brand myself.
00:58:04.340 I say I'm left of Bernie, because I think the left has better goals.
00:58:10.380 All right.
00:58:11.120 Their objectives are better.
00:58:12.500 It's like feed everybody, you know, give everybody a good chance.
00:58:16.440 Get rid of the bias as much as you can.
00:58:18.960 You know, get rid of any structural racism that's holding anybody back.
00:58:22.520 I love all that stuff.
00:58:24.480 I even respect when people prefer to be called by certain terms.
00:58:32.100 Totally.
00:58:32.820 Because I would like the same right.
00:58:34.380 I would like to be called, well, in my case, I'd like to be called, you know, he or, you know, Mr. Adams or whatever.
00:58:45.560 So that's just, you know, being polite.
00:58:47.800 Everything can go too far, of course, right?
00:58:49.920 Everything can go too far.
00:58:50.840 So I like the objectives of the left, but I don't like their tactics, and I don't think they understand human motivation, so they don't build good systems.
00:59:00.480 So I like the systems and practicality of the right, because the right never ignores stuff that matters.
00:59:08.940 The left consistently ignores gigantic variables.
00:59:16.000 They just ignore variables, such as how people act, right?
00:59:21.060 Or how much money there is.
00:59:22.820 Just gigantic variables, just completely ignored on the left.
00:59:27.020 So I can't embrace the left's strategy, because it's just nonsense.
00:59:33.980 The right, however, has a, let's say, maybe a more clear-eyed, and some would say, you know, cold-blooded approach to things, because those make systems work better.
00:59:47.860 You know, a good system is going to be a little harsh, but it's only going to be harsh against the people it needs to be.
00:59:53.120 It's not going to be harsh against the people who are heading in the right direction.
00:59:57.360 So I appreciate the right's systems, and I appreciate the left's, you know, idealism, but I don't think either of them has it right.
01:00:07.940 That's my take.
01:00:09.180 I don't think either side has it right.
01:00:10.680 There is clearly gigantic space for a middle party.
01:00:15.760 It would take somebody with extraordinary persuasion to pull it off, but there is an opening, and I don't think it's ever existed before.
01:00:24.680 There is an opening for a middle party that would represent 70% of the country.
01:00:31.620 70% of the country.
01:00:33.000 But the only way to pull it off is that the middle party would actually have to be middle.
01:00:37.820 Like, it couldn't be pretend middle, like everybody tries to do.
01:00:40.680 You know, they try to pretend they're appealing to the middle, but they're not.
01:00:45.380 At the very least, the candidate for the middle party would give you both arguments.
01:00:53.260 Right?
01:00:54.180 That's the way I'd run it.
01:00:55.600 Now, somebody said Trump is middle.
01:00:58.220 That's not too far off.
01:01:00.340 It's not too far off to say Trump is middle, but because of his provocative ways, he'll never be seen that way.
01:01:06.320 So he doesn't work that way.
01:01:08.420 He doesn't work as a middle candidate.
01:01:10.680 But the other thing that Trump doesn't do, he doesn't explain the other side.
01:01:18.180 Now, I'm not saying that he should, because he's a great persuader.
01:01:22.780 And just sticking to your points and letting the other side defend themselves is probably a good strategy.
01:01:28.760 But if you wanted to be credible, and I think there's this gigantic thirst for at least credibility, which is different from being right.
01:01:36.140 You know, people will forgive you for being wrong, but they won't forgive you for being wrong when you should have done it differently, when you should have known.
01:01:45.140 They won't forgive that.
01:01:47.280 Right?
01:01:47.720 So if I were the middle party, I would always tell you the other person's argument, and I would do the best job I could of explaining it.
01:01:54.460 Say, look, I'm going to propose X, but I've got to tell you Y has some good points.
01:01:59.520 If you can't do that, you don't deserve the vote.
01:02:05.480 I'll say that again really clearly.
01:02:07.880 If you can't argue the other team's side convincingly, you shouldn't be a politician.
01:02:15.940 You don't have the capability.
01:02:17.920 And I don't think you would be credible if you can't describe the other side credibly.
01:02:23.220 Now, on this live stream, I try to do that.
01:02:27.340 Do I succeed?
01:02:28.700 I don't know.
01:02:29.000 That would be a judgment call.
01:02:34.460 But I do try to show the other party's argument.
01:02:37.240 And the reason I do that is to build credibility.
01:02:40.680 Does it work?
01:02:42.380 Because I know it pisses you off a lot when you hear me even explain the other side's point of view.
01:02:49.600 But does it work overall, that when I agree with you, it doesn't feel like I'm just taking a side?
01:02:57.540 Yeah.
01:02:57.940 So I think it works.
01:02:59.200 It's just not a perfectly clean process.
01:03:02.140 Some people are just going to hate me for it.
01:03:10.040 You can't have a middle party because the media is in control?
01:03:14.040 Well, if somebody had a big enough social media platform, they could end run, as Trump did in many cases, they could end run the major media.
01:03:28.380 What about Joe Rogan?
01:03:30.220 Oh, you mean?
01:03:32.040 Well, so Joe Rogan made some news.
01:03:36.000 Probably not the way he wanted to.
01:03:38.600 How many of you know the latest Joe Rogan news?
01:03:41.720 All right.
01:03:45.780 I wasn't going to talk about it, but I don't know.
01:03:48.600 I don't think you can avoid talking about it.
01:03:51.120 It's just...
01:03:52.260 So keep in mind that Joe Rogan is, you know, a comedian.
01:03:57.860 So when he says things, you shouldn't take them 100% seriously.
01:04:02.340 But apparently he said in some context that he's very flexible.
01:04:06.580 And he's so flexible that he can... let's say he can orally please himself.
01:04:12.800 Now, first of all, I don't know that that's actually true, meaning I don't know if he's literally that flexible.
01:04:20.260 But, you know, theoretically, yes.
01:04:23.240 But the fact that he said it is why he's a national treasure.
01:04:30.080 You know, I keep referring to Joe Rogan as a national treasure, because I think he is.
01:04:36.880 But that's why.
01:04:37.660 Like, I would never tell that joke, because you can't get it out of your head.
01:04:44.820 You know, like, now, I think for some people it might change the way they see him.
01:04:50.480 So I wouldn't have told that joke, but I love the fact that he did.
01:04:53.680 I love the fact that he threw caution to the wind and told that joke.
01:04:57.760 He had to know, you know, on some level he had to know what would come of that.
01:05:02.460 And I'm just hoping, because he's influential, I'm just hoping he doesn't cause other men to join yoga classes.
01:05:16.200 So let me make a prediction.
01:05:18.740 You ready for this?
01:05:20.480 I'm going to make a prediction.
01:05:23.500 It's based on the fact that Joe Rogan is influential in the sense that he has a large platform.
01:05:29.060 So his platform, plus the fact that he makes news, means tens of millions of Americans have heard that he's so flexible that he can please himself.
01:05:39.860 Here's my prediction.
01:05:41.600 You ready?
01:05:44.500 Male attendance in yoga classes will go up in the next year.
01:05:52.060 Anybody?
01:05:52.720 Anybody want to take the other side of that bet?
01:05:55.440 I don't know if we can measure such things, because I don't know if there's data on how many men take yoga classes, but my prediction is that there will be a good 10% uptick in men joining yoga classes.
01:06:16.800 I'll just leave that there.
01:06:19.100 All right.
01:06:19.440 Apparently, he's already got a nickname: Joe Blow Rogan.
01:06:33.820 All right.
01:06:41.580 I'm just looking at your jokes now.
01:06:45.280 I'm just going to look at the jokes.
01:06:55.980 All right.
01:06:56.680 I'm not going to read any of your jokes.
01:06:58.500 I think that's enough for today.
01:06:59.740 We'll leave it on that.
01:07:01.080 And thanks for joining me.
01:07:02.980 I hope you enjoyed it.
01:07:03.920 So, guys, I'm going to read some of my jokes next time.
01:07:08.300 Bye.