Real Coffee with Scott Adams - December 03, 2023


Episode 2311 CWSA 12⧸03⧸23 Gaslighting, Persuasion, Bribery, Blackmail & Other Government Functions


Episode Stats

Length

1 hour and 6 minutes

Words per Minute

135.12

Word Count

9,037

Sentence Count

799

Misogynist Sentences

6

Hate Speech Sentences

13


Summary

What's better than coffee? Sippin' on it. This is the thing that makes everything better, and it happens now. Join me in the unparalleled pleasure of the dopamine rush that is this morning's sip.


Transcript

00:00:00.000 Do-do-do-do-do-do-do-do-do-do-a-ra-pa-pa-pa.
00:00:05.000 Good morning, everybody.
00:00:08.000 And you're probably thinking to yourself,
00:00:10.000 you don't know if you should colonize or gaslight.
00:00:13.000 Should I colonize? Should I gaslight?
00:00:16.000 Well, those questions will be answered and many more.
00:00:19.000 But if you'd like to take your experience up to levels
00:00:21.000 where gaslighters can't even get you,
00:00:24.000 all you need is a cup or a mug or a glass,
00:00:26.000 a tank or a chalice or a stein, a canteen jug or a flask,
00:00:29.000 a vessel of any kind,
00:00:31.000 fill it with your favorite liquid.
00:00:33.000 I like coffee.
00:00:35.000 So join me now in the unparalleled pleasure
00:00:37.000 of the dopamine hit of the day,
00:00:39.000 the thing that makes everything better.
00:00:41.000 It's called the simultaneous sip,
00:00:43.000 and it happens now.
00:00:45.000 Savor it.
00:00:47.000 Yeah.
00:00:57.000 Well, there's some shocking news.
00:00:59.000 You heard about the big Tesla Cybertruck rollout,
00:01:03.000 big event.
00:01:04.000 But there's some news that there was some crazy guy
00:01:07.000 who was planning a mass casualty attack on the event.
00:01:11.000 It didn't happen.
00:01:12.000 He got picked up by the fuzz, so to speak.
00:01:16.000 They caught him, so he did not do anything like that.
00:01:20.000 But I would like you to imagine for a moment
00:01:24.000 what could have been the coolest thing that ever happened.
00:01:28.000 It didn't happen.
00:01:29.000 This did not happen.
00:01:30.000 But it could have.
00:01:33.000 Imagine this mass shooter showing up at the Tesla truck rollout.
00:01:38.000 You have a whole bunch of trucks because it's a rollout,
00:01:42.000 so you're not going to have one, right?
00:01:44.000 You have lots of Tesla trucks and lots of people.
00:01:48.000 And suddenly somebody yells,
00:01:50.000 gun!
00:01:51.000 And everybody immediately jumps inside a Tesla truck.
00:01:56.000 And the gunman starts opening fire.
00:02:00.000 And then finally, you know, law enforcement takes him down.
00:02:04.000 Everybody lives.
00:02:07.000 There's just like bullet holes in all the cars,
00:02:11.000 all the trucks, but everybody got in.
00:02:13.000 They lived.
00:02:16.000 Best marketing campaign of all time.
00:02:20.000 But instead, they just caught him before he shot any bullets
00:02:23.000 or did anything.
00:02:24.000 So that was good, too.
00:02:26.000 That was good, too.
00:02:28.000 Better?
00:02:30.000 600 people jumping into cyber trucks
00:02:34.000 and being totally safe from gunfire.
00:02:36.000 Would have been better.
00:02:38.000 Would have been better.
00:02:41.000 By the way, I heard a little clip of Elon Musk
00:02:45.000 when he was introducing the cyber truck.
00:02:47.000 And he had this great line.
00:02:50.000 He said,
00:02:53.000 if you're in the cyber truck
00:02:54.000 and you ever get in an argument with another vehicle,
00:02:57.000 you will win.
00:03:00.000 If you get in an argument with another vehicle,
00:03:03.000 you will win.
00:03:05.000 Perfect.
00:03:07.000 That's just like a perfect line.
00:03:10.000 All right.
00:03:11.000 Let's talk about George Santos.
00:03:13.000 As you know, he's going to be expelled,
00:03:16.000 voted to be expelled by his former colleagues from Congress.
00:03:19.000 But now he says he's going to drop the dime
00:03:23.000 on four of his colleagues for ethics complaints
00:03:26.000 and he's got charges and he's got accusations
00:03:30.000 and he's got allegations.
00:03:32.000 I don't know if any of them are good.
00:03:35.000 But he's going scorched earth and that's fun.
00:03:39.000 But the thing that caught my attention was
00:03:41.000 he was referred to in the New York Post
00:03:43.000 in the opening line to this story
00:03:46.000 as a disgraced serial liar.
00:03:49.000 Disgraced serial liar?
00:03:51.000 I feel like I want to form a club.
00:03:55.000 Well, everybody was disgraced.
00:03:58.000 Because, you know, I'm a disgraced cartoonist.
00:04:01.000 And, you know, I didn't really care too much
00:04:03.000 about the George Santos story one way or another.
00:04:06.000 Frankly, I just wasn't interested in it.
00:04:08.000 But now that I know he's a fellow disgraced person,
00:04:14.000 I feel like we should have some kind of a club
00:04:16.000 or, you know, informal organization, a party,
00:04:20.000 a Christmas party perhaps.
00:04:22.000 Wouldn't you love to see me host a Christmas party
00:04:25.000 of everyone who was disgraced during 2023?
00:04:30.000 Like the class of 2023,
00:04:32.000 all the people who got canceled and disgraced.
00:04:36.000 It would be a big party.
00:04:39.000 Well, Vivek Ramaswamy said this in a post.
00:04:43.000 He said,
00:04:44.000 I went to racially diverse public schools
00:04:47.000 until the eighth grade.
00:04:49.000 I've never met a single black kid
00:04:51.000 who couldn't achieve everything I have
00:04:53.000 if he had the same true, quote,
00:04:55.000 privilege that I enjoyed.
00:04:58.000 Not being born into money,
00:05:00.000 but having a stable family with two parents
00:05:02.000 who emphasized education.
00:05:04.000 That's the answer to black empowerment in America,
00:05:07.000 not affirmative action.
00:05:09.000 Okay.
00:05:10.000 But as one observer who calls himself,
00:05:19.000 free speech is expensive on the X platform,
00:05:22.000 had this to say.
00:05:24.000 That is the ideal situation.
00:05:26.000 I think we'd all agree.
00:05:28.000 Don't you think that would be ideal
00:05:29.000 to have two parents who are really strong about education?
00:05:34.000 And whether they had money or not,
00:05:38.000 they were just really strong about education
00:05:40.000 and probably character and that sort of thing.
00:05:43.000 Yeah, that would be ideal.
00:05:44.000 How many people can get that situation?
00:05:47.000 In a perfect world.
00:05:49.000 Well, as free speech is expensive says,
00:05:54.000 this is the ideal situation.
00:05:56.000 If you assume 50% odds of having a good versus a bad parent,
00:06:01.000 then for any one parent, you have about 50% odds,
00:06:04.000 but you have two parents.
00:06:06.000 So to get both of them to be, you know,
00:06:09.000 capable pro-education parents,
00:06:12.000 the odds are 25%.
00:06:14.000 That's just, you know, 50% times 50%.
00:06:17.000 What system works for the other 75%?
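The quoted back-of-envelope estimate can be sketched in a few lines. This is purely illustrative: the 50% figure is the commenter's assumption, and the two parents are treated as independent coin flips, which is what makes the multiplication valid.

```python
# Back-of-envelope version of the quoted estimate: if each parent
# independently has a 50% chance of being a capable, pro-education
# parent, what fraction of kids get two such parents?
p_good_parent = 0.5  # assumed probability for any one parent

# Independence assumed, so the joint probability is the product.
p_both_good = p_good_parent * p_good_parent
p_otherwise = 1 - p_both_good

print(f"Both parents good: {p_both_good:.0%}")  # 25%
print(f"Everyone else:     {p_otherwise:.0%}")  # 75%
```

In reality the two outcomes are probably correlated (people tend to pair with similar partners), which would push the 25% figure up, but the point about a large remainder stands.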
00:06:21.000 Now, that has always been the question.
00:06:26.000 I think Vivek is 100% right.
00:06:30.000 Everything he said, I agree with.
00:06:34.000 But he didn't have any control over that.
00:06:40.000 So Vivek had no control over his parents.
00:06:44.000 He just got lucky.
00:06:46.000 Did his parents have control over it?
00:06:49.000 Well, I don't know.
00:06:52.000 A little bit.
00:06:54.000 I think they had less control than maybe you think.
00:06:57.000 Because what were the odds that, you know,
00:06:59.000 they both had the same, let's say, philosophy.
00:07:04.000 They had the same philosophy of, you know, pro-education.
00:07:08.000 And they met each other.
00:07:11.000 And they got along.
00:07:13.000 And they had a child.
00:07:14.000 That's a lot of things to go right.
00:07:16.000 And I'm not sure they controlled all those things.
00:07:19.000 They might have gotten lucky.
00:07:20.000 You know, they may have fallen in love.
00:07:22.000 And wow, good luck.
00:07:24.000 How lucky am I that my mate has the same philosophy on parenting.
00:07:28.000 So there's a whole lot of luck involved, right?
00:07:31.000 You'd prefer a system where the luck is removed
00:07:35.000 and the system itself can give you wins.
00:07:38.000 That's the trouble with, the trouble with the parenting situation
00:07:43.000 is that you can't choose them and it's not a manageable process.
00:07:48.000 You know, you can hope for it and you can advise people what to do.
00:07:53.000 But you really just, it's wishful thinking.
00:07:56.000 And unfortunately, if you ever introduced an alternative,
00:08:05.000 it would be politically impossible.
00:08:08.000 Because if what you're trying to sell as Republicans is the family unit,
00:08:14.000 because the family unit works so well if it's done right, you know,
00:08:17.000 if you've got a little religious stuff, you've got attention to education,
00:08:22.000 as Vivek says, that's your ideal situation.
00:08:26.000 But it's just not achievable by probably at least half of the public.
00:08:31.000 So what about the other half?
00:08:33.000 Do we just ignore them because the thing we want them to do is unavailable to them?
00:08:38.000 And then you just let them die?
00:08:41.000 Like, I feel like we could do better.
00:08:43.000 And here's what I suggest.
00:08:45.000 I think we absolutely need to start forming
00:08:49.000 non-family support structures.
00:08:54.000 Non-government.
00:08:56.000 Non-government.
00:08:57.000 That's the important part.
00:08:58.000 Nothing about the government.
00:09:00.000 But, you know, individually, privately,
00:09:03.000 finding ways to support each other.
00:09:06.000 For example, let's say you were two parents who never went to college
00:09:11.000 and you had a vague idea that college is good,
00:09:14.000 but you're not really the ones to sell it.
00:09:17.000 But suppose you had in your social network
00:09:20.000 people who were there all the time who were very pro-college
00:09:23.000 and could sell it to your own children better than you could.
00:09:27.000 I mean, it could be an uncle.
00:09:29.000 It could be as a family member.
00:09:31.000 It could be a cousin.
00:09:32.000 But just somebody in your extended family
00:09:34.000 who could sell to your children what you're not good at selling.
00:09:38.000 Because not everybody's good at everything.
00:09:40.000 So you need an extended structure just to cover all your black holes.
00:09:49.000 All the things that you can't do yourself.
00:09:52.000 So I think that that's what's coming.
00:09:55.000 Just take the simplest example.
00:09:59.000 Childcare.
00:10:01.000 Childcare is this big nightmare because it's just so hard if you're low income.
00:10:06.000 How do you afford childcare and also go to a job?
00:10:10.000 You can't do both.
00:10:11.000 Too expensive.
00:10:12.000 But suppose you had a network of people and some of them were like,
00:10:16.000 well, you know what?
00:10:17.000 You always mow my lawn and I like kids and I'm retired.
00:10:21.000 How about I let the kids go over to my house while you're working?
00:10:25.000 So you can imagine an informal, I don't want to say tribe,
00:10:30.000 because then I think you get other implications.
00:10:34.000 But I think it's some kind of voluntary, virtual, tribal situation
00:10:39.000 would be what you'd get.
00:10:42.000 So I think that's where it's going to end up.
00:10:45.000 Yeah, I don't want to say clan.
00:10:47.000 That's got a bad connotation.
00:10:49.000 But I know what you're saying.
00:10:51.000 All right.
00:10:52.000 So I think there's something the Republicans are missing.
00:10:55.000 And it's really big.
00:10:57.000 Because I think the Democrats are seeing that the traditional family unit
00:11:02.000 just doesn't work for enough people.
00:11:04.000 It's ideal.
00:11:05.000 It just doesn't work for enough people.
00:11:07.000 So they look to the government.
00:11:09.000 So now Republicans are offering an impractical solution
00:11:15.000 and your option, we don't want the government to help you too much either.
00:11:19.000 That's not really much of an offer.
00:11:22.000 A good offer would be, we're really, really going to sell this family thing hard.
00:11:26.000 But if that isn't working for you, we suggest that you self-organize
00:11:31.000 in ways you can support each other.
00:11:34.000 Some version of that.
00:11:35.000 I don't know.
00:11:36.000 I don't know what it looks like, but it's not addressed.
00:11:39.000 Anyway, Wall Street Journal reports, and I quote,
00:11:44.000 the U.S. companies have lost momentum in promoting black professionals into management.
00:11:50.000 Huh.
00:11:51.000 I wonder if there's an alternative way that headline could have expressed that
00:11:57.000 that would be equally true.
00:11:59.000 I'll take a stab at it.
00:12:02.000 Instead of saying, U.S. companies have lost momentum in promoting black professionals
00:12:07.000 into management, would that be identical to anti-white racism has peaked?
00:12:16.000 Are those the same?
00:12:19.000 Because what is it that was promoting black applicants into upper management?
00:12:27.000 If you were working hard on it, who was not being promoted?
00:12:32.000 Because it's not like they created more jobs just so they could promote black applicants.
00:12:39.000 Same number of jobs.
00:12:41.000 Who was not being promoted while there were a higher percentage of black people being promoted?
00:12:50.000 I don't know.
00:12:51.000 It looks like a good sign to me.
00:12:53.000 I would say the healthiest thing for black professionals and black Americans
00:12:58.000 is that we become honest and practical about what works and what doesn't.
00:13:06.000 Everybody is better off in a system that's honest.
00:13:10.000 If you do these things, go to college, stay out of jail, don't do drugs,
00:13:14.000 you'll probably do fine.
00:13:16.000 That's honest.
00:13:18.000 If they tell you that the reason you're being held back is systemic racism,
00:13:23.000 well, there is systemic racism.
00:13:26.000 I very much agree that it exists, but it's not what's holding you back.
00:13:30.000 Right?
00:13:31.000 Systemic racism has held back literally zero people
00:13:35.000 because there's such an easy workaround.
00:13:38.000 Do you know what the workaround is to systemic racism?
00:13:42.000 It's not really hard.
00:13:44.000 Look around you.
00:13:46.000 The people around you are largely terribly incompetent, whoever they are.
00:13:50.000 It doesn't matter.
00:13:51.000 You know, forget their race, gender, ethnicity.
00:13:56.000 The people around you are largely incompetent.
00:13:59.000 How hard is it to stand out when your competition is largely incompetent?
00:14:05.000 It's the easiest thing in the world.
00:14:07.000 Do you know how much I would care about systemic racism if somebody came to me
00:14:12.000 and was clearly a good applicant for a job?
00:14:14.000 And that's all I saw.
00:14:16.000 Am I going to say, you know, you're clearly the best applicant.
00:14:21.000 You show so much serious attention to learning and character and staying out of jail
00:14:27.000 and you don't do drugs and you've got a nice religious base.
00:14:31.000 You know, it's not necessary, but it's nice that you've got that structure in your life.
00:14:35.000 Am I going to say to myself, you know, but you know, a little systemic racism.
00:14:40.000 I can't give you the job.
00:14:42.000 That just never happens anywhere ever.
00:14:45.000 There are so few qualified people who do just the basics right.
00:14:51.000 Just the basics.
00:14:53.000 Show up on time.
00:14:55.000 You know, don't be drunk when you go to work.
00:14:58.000 Basics.
00:14:59.000 You're not really competing against, you know, the best of the best.
00:15:04.000 You want to hear the advice I gave one of my stepkids recently?
00:15:11.000 Look around you.
00:15:14.000 Do you see who you're competing with?
00:15:17.000 One of my stepkids is very dependable.
00:15:21.000 You know, very high of character.
00:15:23.000 Just naturally.
00:15:24.000 Just born that way.
00:15:25.000 Had nothing to do with, you know, I'd love to say I had an influence.
00:15:29.000 Just born, you know, with strong character.
00:15:32.000 Does, you know, tells the truth.
00:15:36.000 Just all the basics.
00:15:38.000 Immediately went off and got herself a job without any prompting.
00:15:42.000 Soon as she was old enough to do it.
00:15:44.000 To help, you know, help pay her expenses.
00:15:46.000 Because, you know, she doesn't get everything for free.
00:15:49.000 And just total high character individual, high initiative.
00:15:56.000 The whole package.
00:15:58.000 And, of course, being, you know, a certain age, you're worried about how you'll do in the world.
00:16:04.000 And about once every two weeks, I have to have this conversation.
00:16:08.000 Look around you.
00:16:10.000 Just look around.
00:16:12.000 There are very few people who have anything you have, you know, in terms of character.
00:16:19.000 You know, good education.
00:16:21.000 The whole package.
00:16:23.000 So that's the story I'd be telling a black teenager if there was one in my life that needed some advice.
00:16:33.000 Just look around you.
00:16:34.000 You're not competing with much.
00:16:36.000 You know, it has nothing to do with black or white or male or female.
00:16:41.000 There's just not much competition in 2023.
00:16:46.000 So if you want to be the best one in whatever category, your odds are basically 100%.
00:16:53.000 All you have to do is want it.
00:16:56.000 Well, you have to decide.
00:16:57.000 Not wanting it is not enough.
00:16:59.000 You have to decide that you'll do the things that put you in that top 20%, but it's not hard.
00:17:06.000 All right.
00:17:07.000 I saw there's some data out.
00:17:12.000 There's a very clever and persuasive moving video of the percentage of military suicides.
00:17:21.000 And it starts out as a pie graph, and then it's animated over time.
00:17:25.000 So you see, as the years go by, the part of the pie that's the suicides, you know, seems to keep getting bigger.
00:17:32.000 And you're like, what?
00:17:34.000 Why are suicides in the military so high?
00:17:38.000 And worse, why are they getting bigger?
00:17:40.000 And unfortunately, I couldn't let that stay.
00:17:47.000 I couldn't let that go.
00:17:50.000 Because nowhere on the pie chart did I have the raw numbers.
00:17:56.000 It was just percentages.
00:17:57.000 Now, do you remember the rule?
00:17:59.000 Here's the data rule.
00:18:01.000 If you're looking at data in the news, if they show you the percentage of something without the raw numbers, it's propaganda.
00:18:09.000 And the reverse.
00:18:11.000 If they show you raw numbers, but they very clearly leave out percentages of, you know, how important this is as a percentage, that's propaganda.
00:18:21.000 So if you see either one of them without the other, percentages without raw, or raw data without the percentages, that's propaganda.
00:18:30.000 That's brainwashing.
00:18:32.000 And that's what this was.
00:18:34.000 Now, was it also telling you that there's a problem?
00:18:39.000 Absolutely.
00:18:40.000 Right?
00:18:41.000 I'm not arguing with the general statement that suicides are alarmingly high in the military.
00:18:48.000 That's a given.
00:18:49.000 But whether or not they've moved into alarming territory is hard to say.
00:18:54.000 Now, I don't want to mock a specific person, so I'll only tell you their job title.
00:19:02.000 So in response to me saying this, that leaving out either the percentages or the raw data is propaganda, somebody who is a medical doctor, according to their profile, suggested that maybe the reason the suicides go up after a period of battle is that the PTSD is high.
00:19:27.000 So that, you know, you've got a battle, you expect maybe there's high suicides when there's a war going on.
00:19:33.000 But when a war winds down, suicides are high.
00:19:36.000 And the doctor speculated that maybe PTSD is part of it.
00:19:41.000 Maybe.
00:19:44.000 Yeah, maybe.
00:19:47.000 Do you know what the other reason that there would be a higher percentage of suicides when the war is over?
00:19:53.000 What would be another reason that there would be a higher percentage of deaths and suicide, not raw numbers, but percentage, when the fighting is over?
00:20:08.000 Okay, am I really going to have to explain this to you?
00:20:12.000 Oh, my God.
00:20:14.000 I'm sorry.
00:20:16.000 Oh, God.
00:20:17.000 I really have to explain this, don't I?
00:20:22.000 No, it's basic math.
00:20:25.000 When the war is over, something like zero people die from getting shot.
00:20:30.000 So that part of the pie shrinks to zero.
00:20:33.000 But the suicides are relatively constant, no matter what's going on, because it has more to do with individuals than what they're doing.
00:20:42.000 So your percentage of suicides looks like it's high, only because the number of people getting shot went to zero.
00:20:50.000 Now, are there other things going on, such as PTSD?
00:20:56.000 Probably.
00:20:57.000 Probably.
00:20:58.000 You know, guilt from the war?
00:21:01.000 Maybe.
00:21:02.000 Yeah.
00:21:03.000 There's all kinds of things going on.
00:21:04.000 But if you don't understand the most basic math, that if you take out a whole bunch of data from the pie, the pie will change, even if the data on the thing that looks like it's now a big part of the pie was exactly the same.
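The shrinking-denominator effect described here is easy to demonstrate with made-up numbers. All figures below are invented for illustration; the only point is that a constant suicide count becomes a much larger share of the pie once combat deaths drop out of the total.

```python
# Toy illustration of the pie-chart effect: suicides stay constant,
# combat deaths drop to zero, and the suicide *share* of all deaths
# balloons even though nothing about suicide changed.
# All numbers are made up for illustration.
suicides = 100  # constant in both periods

for label, combat_deaths in [("wartime", 400), ("peacetime", 0)]:
    total = suicides + combat_deaths
    share = suicides / total  # fraction of the "pie" that is suicide
    print(f"{label}: suicides are {share:.0%} of deaths "
          f"({suicides} of {total})")
```

The suicide slice jumps from 20% to 100% of the pie while the raw number never moves, which is exactly why a percentage-only chart can mislead.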
00:21:19.000 I shouldn't...
00:21:21.000 Now, here's the lesson from this.
00:21:24.000 Look at the comments that you saw from your fellow citizens, and how many of them immediately knew, oh, it's just because fewer people are shot.
00:21:34.000 How many of you knew that immediately?
00:21:36.000 And how many of you missed it?
00:21:38.000 Do you see how easy it is to fool the public?
00:21:42.000 Yeah.
00:21:43.000 A lot of people are fooled, and we're talking about the most well-informed, in my opinion,
00:21:49.000 my livestream crowd is the most well-informed about spotting bullshit in the news, because it's all we do.
00:21:58.000 We talk about how to spot it every day.
00:22:01.000 If you missed this one, just keep in your mind it's not because you're dumb.
00:22:08.000 Right?
00:22:09.000 This is intentionally misleading because it works.
00:22:14.000 I mean, they know what works.
00:22:18.000 So it's easy to fool people, even people who are well-informed and fairly savvy about how they get fooled.
00:22:25.000 So this one should be shocking to you if it fooled you, because it's such a basic trick.
00:22:31.000 Always look for percentages without raw data and vice versa.
00:22:35.000 All right.
00:22:38.000 But that said, are there not a hundred reasons why military suicides should be higher than the average?
00:22:47.000 You'd expect it, right?
00:22:49.000 Here are some obvious reasons.
00:22:52.000 There's more suicide in general, everywhere.
00:22:56.000 So it's just a subset of everywhere.
00:23:00.000 There is more fentanyl, maybe.
00:23:03.000 Fentanyl, maybe it looks like suicide sometimes.
00:23:06.000 I don't know.
00:23:07.000 It could be in the numbers.
00:23:10.000 What about the selection process?
00:23:12.000 It's a volunteer army or volunteer military.
00:23:17.000 If you have a volunteer military, what would you expect about the mix of people who are joining?
00:23:25.000 Number one, males have more suicide than women.
00:23:29.000 That's correct, right?
00:23:31.000 Men have, like, way more suicide than women.
00:23:34.000 So to the extent that the military is mostly men, the percentage of suicide in the military should be five times the average.
00:23:43.000 Or whatever it is.
00:23:44.000 But it's a lot.
00:23:45.000 Like, men are way more suicidal than women.
00:23:48.000 What else?
00:23:50.000 As you said, PTSD.
00:23:53.000 PTSD.
00:23:54.000 I can tell you for sure that although I consider myself mentally tough, I don't consider myself mentally tough for something that would happen in an actual military conflict.
00:24:07.000 Like, I think I know myself well enough to know I would be permanently damaged by that, even if I survived physically.
00:24:16.000 Very much like the, you know, the October 7th attacks.
00:24:20.000 The people who were, you know, near the violence but maybe didn't personally get hurt, they're still PTSD forever.
00:24:29.000 Right?
00:24:30.000 So I know it would affect me.
00:24:33.000 So that's no surprise.
00:24:35.000 And then how about the more obvious thing?
00:24:39.000 Military members have ready access to weapons.
00:24:43.000 Right?
00:24:44.000 Even after being in the military, there's a pretty high gun ownership rate because they're comfortable around guns.
00:24:52.000 So if you have guns readily available and you're having that bad day, that's a pretty bad combination.
00:25:00.000 So you would expect them to be higher.
00:25:03.000 And let me be very careful about what I say next.
00:25:07.000 So I want to make sure this part doesn't get lost.
00:25:09.000 I do think it's a crisis.
00:25:12.000 It's super alarming.
00:25:14.000 I'm just saying that when you're looking at it, make sure that you have, you know, all the moving parts in your head.
00:25:22.000 All right.
00:25:23.000 All right.
00:25:25.000 Bidenomics.
00:25:26.000 So we're watching as the latest attack from the Democrats, I guess, is that misinformation about the economy is the newest threat to democracy.
00:25:41.000 And so the Democrats, the Biden supporters specifically, are trying to tell us that the economy is good.
00:25:50.000 We just don't realize it because we're poorly informed.
00:25:54.000 And once we're well informed, we would understand that we're not paying more.
00:25:58.000 Oh, I guess we are paying more at the store.
00:26:01.000 So I guess it looks like Bidenomics is trying to make a distinction between do you have a job?
00:26:08.000 Yes.
00:26:09.000 That's good.
00:26:11.000 Got a job.
00:26:13.000 Now, to be fair, I've told you for a long time that as long as employment is good, you're going to be okay.
00:26:22.000 And I stand by that.
00:26:24.000 But everybody can see that the prices are way higher.
00:26:29.000 A hundred percent of human beings eat food.
00:26:33.000 We all eat food.
00:26:35.000 And food is wildly higher, and it's obvious.
00:26:38.000 You could not miss it.
00:26:40.000 We all use gas.
00:26:47.000 Yeah, maybe it came down a little bit, not as much as it could have, you know.
00:26:47.000 So I'm pretty sure everybody feels it.
00:26:50.000 Pretty sure everybody does.
00:26:52.000 But the Bidenomics is, no, no, you don't see it.
00:26:56.000 It's fine.
00:26:57.000 So now we have at least three different things that Democrats are gaslighting us with.
00:27:05.000 Now, remember, I used to complain all the time when people said that they would complain about gaslighting from Republicans or Trump.
00:27:14.000 And I would say, that's not gaslighting.
00:27:17.000 You know, that's just persuasion.
00:27:19.000 Persuasion is not gaslighting.
00:27:22.000 And propaganda is not gaslighting.
00:27:24.000 Not necessarily one-to-one.
00:27:27.000 You know, they could be connected.
00:27:29.000 But gaslighting is very specific.
00:27:31.000 Gaslighting is telling you that you can't believe your own eyes and your own experience.
00:27:36.000 That's completely different.
00:27:38.000 It's a subset, but it's different than persuasion, which is, you know, people are persuading you and, you know, you understand the concept.
00:27:47.000 But telling you that you don't see something you see is gaslighting.
00:27:51.000 It's not just persuasion.
00:27:53.000 It's persuasion with almost the intention of making you insane and not believe what you see about anything.
00:28:00.000 So now we're seeing that Biden and his gang are making us believe the economy is better than it is.
00:28:11.000 Gaslighting.
00:28:12.000 That the border is secure.
00:28:15.000 We're literally watching people streaming across the border into the cities.
00:28:20.000 And they're like, border's secure.
00:28:22.000 Yep.
00:28:23.000 All good.
00:28:24.000 No problems at all.
00:28:25.000 Now, some of that is based on a trick of language.
00:28:32.000 Here's how Mayorkas can say the border is secure.
00:28:36.000 There is an argument for it.
00:28:38.000 It's just a bullshit argument.
00:28:40.000 Here's the argument.
00:28:42.000 The people who come in and then we process them.
00:28:46.000 And then we give, you know, we say, come back for your court hearing.
00:28:49.000 And then we release them into the country.
00:28:52.000 That is a secure system.
00:28:55.000 Do you know why?
00:28:57.000 Because it's operating exactly as designed.
00:29:01.000 He's operating within the existing rules of the country.
00:29:05.000 That if you say you're here for asylum, we don't know how to check it right away.
00:29:11.000 So we give you this court date that's way in the future.
00:29:14.000 And then you get in the country and you probably stay.
00:29:17.000 Now, to most ordinary people, we would call that an insecure border.
00:29:22.000 But if you're a bureaucrat, you say, well, wait a minute.
00:29:27.000 Congress gave me the rules of how to secure a border.
00:29:31.000 And then I followed the rules for securing the border.
00:29:35.000 I didn't make the rules.
00:29:37.000 Congress told me what to do to secure the border.
00:29:41.000 And then I secured the border exactly as I was asked to do.
00:29:46.000 So it's a secure border.
00:29:48.000 But if you say, are millions of people getting across the border that you wish would not?
00:29:53.000 Well, that's a different question.
00:29:55.000 Oh, yes.
00:29:56.000 Millions of people are coming across the border who are not citizens.
00:30:00.000 And they do stay and we can't get them out.
00:30:04.000 But it's a secure border because the process by which they come in is the actual process the country has approved.
00:30:11.000 I feel like that's what they're trying to do.
00:30:14.000 Now, suppose you say, but Scott, Scott, Scott.
00:30:18.000 You're only talking about the ones who apply for asylum.
00:30:21.000 We still have video of people streaming across the border in the insecure places.
00:30:27.000 How can you say that's secure?
00:30:30.000 Here's how.
00:30:31.000 If you're a bureaucrat.
00:30:33.000 Oh, the illegal part is about the same as it has always been even under Trump.
00:30:40.000 And under Trump, you said you had a secure border, didn't you?
00:30:45.000 Didn't you?
00:30:46.000 I mean, Trump keeps saying that when he was in charge, he secured the border.
00:30:50.000 So if we have the same situation, you know, there's a bunch of, there's a legal part.
00:30:57.000 That's not insecure because that's legally following the law.
00:31:00.000 And then there's this other part that's completely illegal.
00:31:03.000 We agree.
00:31:04.000 It's completely illegal.
00:31:05.000 But no more than when Trump was in charge and you called it a secure border.
00:31:10.000 So I guess that's secure.
00:31:14.000 Now, I think that's the argument.
00:31:17.000 But they don't, they don't even give you the respect of explaining why they say it's secure.
00:31:24.000 They just say it's secure when you're watching people stream across the border.
00:31:27.000 And it obviously isn't.
00:31:30.000 And then the third area, besides the economy and the border, is Biden's age.
00:31:37.000 His age.
00:31:38.000 They're actually telling us he's fine.
00:31:41.000 Who was it?
00:31:42.000 He said he would take Biden at age 100 over Trump or something.
00:31:47.000 Right?
00:31:49.000 Now that's clearly, that's the most clear gaslighting you'll ever see.
00:31:54.000 Him?
00:31:55.000 What do you mean?
00:31:56.000 He's fine.
00:31:57.000 I don't really see the problem.
00:31:59.000 His brain is working like a, a well-oiled clock.
00:32:05.000 Okay.
00:32:06.000 Now, I asked the following question.
00:32:10.000 Do Republicans do something like that?
00:32:13.000 Now, I know Republicans will lie.
00:32:17.000 I know Republicans can be incorrect.
00:32:20.000 I know that they can be, you know, sometimes crazy.
00:32:24.000 I know that sometimes they can be George Santos.
00:32:27.000 Right?
00:32:28.000 I'm not, I'm not defending Republican politicians.
00:32:32.000 But do they do this?
00:32:34.000 Do they ever say the thing you can obviously see doesn't exist?
00:32:39.000 I don't know.
00:32:42.000 You, I suppose if you're a Democrat, you'd say, well, what about January 6th?
00:32:47.000 They say that wasn't an insurrection.
00:32:50.000 Because it wasn't.
00:32:53.000 Because it wasn't.
00:32:55.000 Yeah.
00:32:56.000 All right.
00:32:58.000 All right.
00:32:59.000 I'm just going to say something out loud because I think I can now say it.
00:33:04.000 There's something I've been holding back.
00:33:06.000 Because you don't want to say things that are, that would incite violence.
00:33:11.000 Because I never want to do that.
00:33:13.000 All right.
00:33:14.000 I do not want to incite violence.
00:33:16.000 So I'm going to simply observe a situation.
00:33:19.000 There's nothing in what I'm going to say that's a recommendation or encouragement.
00:33:24.000 Right?
00:33:25.000 You can, you can, you can note that something is dangerous without encouraging it.
00:33:30.000 So I'm not going to encourage it.
00:33:32.000 Just going to note it.
00:33:33.000 And it's just three words.
00:33:39.000 Trump is unjailable.
00:33:49.000 Just let that sink in.
00:33:52.000 Regardless of what the legal system decides on, Trump is unjailable.
00:34:00.000 Now, can we all agree on something?
00:34:07.000 I don't need to fucking explain that.
00:34:12.000 Agreed?
00:34:15.000 Nothing, nothing else needs to be said, right?
00:34:18.000 Nothing else needs to be said.
00:34:21.000 That glitched.
00:34:23.000 YouTube just glitched.
00:34:28.000 Wow.
00:34:30.000 Now, let me say it again.
00:34:32.000 Because I don't want to be, I don't want to be kicked off of any social media.
00:34:35.000 I don't encourage violence under any conditions.
00:36:42.000 You know, short of immediate self-defense.
00:34:42.000 So I don't recommend anything.
00:34:44.000 I'm just describing a situation.
00:34:47.000 He is unjailable.
00:34:49.000 That's done.
00:34:52.000 Yeah.
00:34:53.000 Unless they find something on him that doesn't look like absolute political bullshit.
00:34:59.000 Now, could they, could they cripple his business?
00:35:04.000 Probably.
00:35:05.000 I don't know that the public would have the same reaction to a, a business issue.
00:35:14.000 But if you put his physical body in jail, not going to happen.
00:35:21.000 Just not going to happen.
00:35:24.000 That's all I'm going to say.
00:35:26.000 Just not going to happen.
00:35:28.000 All right.
00:35:30.000 That had to be said.
00:35:33.000 Bill Maher, talking about Elon, said this.
00:35:40.000 He said, what was it, he said that Elon Musk was testing his patience.
00:35:58.000 Uh, because he may not be an anti-Semite.
00:36:02.000 But when someone tweets what they tweeted and he tweets, quote, you've spoken the actual truth.
00:36:08.000 It looks really anti-Semitic.
00:36:10.000 I mean, come on.
00:36:12.000 So that's Bill Maher saying that, um, he might not be anti-Semitic, but what he said looks anti-Semitic.
00:36:23.000 So come on.
00:36:27.000 What is going on here?
00:36:29.000 Now, my take on this is that, uh, you've got two people in this situation.
00:36:34.000 Uh, Elon Musk and, uh, Bill Maher.
00:36:37.000 And I'd like to point out that one of those two people, uh, is a fucking idiot and the other one is building rockets to Mars.
00:36:43.000 You know, just, just in case you want to get the lay of the land.
00:36:47.000 Kind of a useful distinction.
00:36:51.000 Um, but literally no one.
00:36:56.000 And by the way, Dave Rubin, you know, tried to correct, uh, Maher on this on the show.
00:37:01.000 But literally no one, including Bill Maher, believes that Musk was talking about all Jewish people when he was complaining.
00:37:10.000 And he later clarified it was about the ADL.
00:37:12.000 And the ADL has been the subject of much of his criticisms.
00:37:16.000 It's obvious that he was talking about the ADL and maybe, maybe some other folks who were like-minded.
00:37:22.000 In no way was he making a sweeping statement about Jewish Americans or Jewish people in general.
00:37:28.000 A hundred percent of everybody knows that.
00:37:31.000 We all know it.
00:37:33.000 Why, why pretend it, why pretend something else is happening?
00:37:37.000 You know, what, what is the point of pretending when you know it's not true?
00:37:44.000 It's just the weirdest thing.
00:37:46.000 Okay.
00:37:47.000 And we don't really expect, uh, Bill Maher, of all, of all people, to complain about something that was, wait for it, politically incorrect.
00:37:57.000 What, what Elon Musk said was not technically untrue.
00:38:03.000 There are groups within the Jewish community who had a certain point of view, just as there are groups within the Jewish community who had the opposite point of view.
00:38:12.000 Everybody understands what he meant.
00:38:16.000 It was just politically incorrect because he said it in a, in a way that could be interpreted in the worst possible way.
00:38:24.000 So of all people, Bill Maher essentially complaining about political incorrectness.
00:38:31.000 Like, what's up with that?
00:38:34.000 What is up with that?
00:38:36.000 It's, it's amazing.
00:38:41.000 Well, uh, a Gallup poll finds that, uh, although Americans, Americans in general approve of Israel's action in Gaza, that there is much disagreement among the younger Americans.
00:38:56.000 So there's a big difference between older and younger Americans on Israel.
00:39:01.000 And, you know, that's TikTok, right?
00:39:06.000 What else would it be?
00:39:08.000 I mean, it could be also, uh, college education, but I don't think this effect is limited to college.
00:39:15.000 Right?
00:39:16.000 You're saying young people, so it's probably, you know, pre-college as well.
00:39:20.000 So, do you think TikTok is the main reason for this?
00:39:26.000 I do.
00:39:27.000 Where, where do young people get their news?
00:39:30.000 It's not from ABC News.
00:39:32.000 It must be TikTok.
00:39:33.000 So, I'm pretty sure this is a TikTok effect, but you know what's the great thing?
00:39:39.000 I, I finally figured out why they named it TikTok.
00:39:46.000 Hey America, TikTok, TikTok, TikTok.
00:39:54.000 Do you get it yet?
00:39:57.000 Yeah.
00:39:58.000 TikTok is a fucking time bomb.
00:40:01.000 It says they're destroying your country, and you don't know it until it's too late, and
00:40:06.000 it all blows up in your face.
00:40:08.000 TikTok is named exactly right.
00:40:12.000 Looks like the comments stopped on locals.
00:40:17.000 Might be a bug.
00:40:20.000 Can't tell.
00:40:22.000 Anyway.
00:40:23.000 Yeah.
00:40:24.000 So, Bill Ackman was noting that, on the X platform,
00:40:29.000 that one of the most respected technology investors in the world, he calls him,
00:40:34.000 a guy named David Frankel, is talking about the TikTok risk.
00:40:40.000 And, you know, he's basically saying, if David Frankel thinks there's a TikTok risk,
00:40:46.000 then you should take that seriously.
00:40:48.000 And I say, in addition, if Bill Ackman says there's a risk, you should take that seriously.
00:40:54.000 Because now you have two non-political people, they have no contact with politics directly,
00:41:00.000 Bill Ackman and David Frankel, who would be two of the most respected business minds,
00:41:07.000 who are saying, you know, pretty directly, I think, TikTok's, you know, an
00:41:13.000 outsized risk.
00:41:14.000 So I use a little persuasion of my own.
00:41:16.000 Here's a little persuasion lesson for you.
00:41:19.000 If you say, TikTok is super dangerous, do people get excited enough to act on it?
00:41:26.000 No.
00:41:27.000 No.
00:41:28.000 Because something's dangerous never gets us to move.
00:41:32.000 You need more than it's dangerous, because we're in a world where everything's dangerous,
00:41:36.000 so it's hard to differentiate.
00:41:38.000 You know, what do you work on?
00:41:39.000 You've got limited time.
00:41:40.000 So just being dangerous according to smart people isn't going to make anybody do anything.
00:41:47.000 It's not enough.
00:41:49.000 But I ramped it up a little bit.
00:41:52.000 And this won't be enough either, but it's in the right direction.
00:41:55.000 And I said that Congress must be owned by China to some degree, because the case for
00:42:01.000 banning TikTok is both obvious and critical to the survival of the United States.
00:42:06.000 So the first thing I'm doing is ramping up the perceived risk.
00:42:13.000 Because TikTok is an existential risk.
00:42:16.000 It's not a risk of, oh, some people will get the wrong idea.
00:42:19.000 No, we're not talking about that.
00:42:21.000 We're talking about the end of the republic.
00:42:23.000 It is absolutely that much of a risk.
00:42:27.000 And so I said, TikTok is a bigger risk to America than Russia, climate change, Iran, and the next pandemic combined.
00:42:41.000 Now, if there's an actual nuclear war, that could be worse.
00:42:47.000 But the actual risk or the chances of a nuclear war are really low.
00:42:52.000 The odds that TikTok will destroy America are 100%.
00:42:58.000 Let me say that again.
00:43:00.000 The odds of us starting a nuclear war are really, really low.
00:43:05.000 But I would agree if it ever happened, that would be the worst outcome.
00:43:09.000 It's just that the odds of it are almost vanishingly small.
00:43:13.000 There's no way Russia wants to nuke us when the end of the Ukraine conflict is kind of obvious.
00:43:18.000 At this point, everybody knows how that ends.
00:43:21.000 So if you know how the war ends, the risk of nuclear conflict is completely off the table.
00:43:27.000 And it's obvious that the Middle East is not heading toward a nuclear confrontation.
00:43:32.000 It's obvious that the next pandemic will probably be bullshit.
00:43:36.000 Right?
00:43:38.000 And what else did I say?
00:43:40.000 Oh, climate change.
00:43:41.000 And climate change is likely something we'll mitigate and figure out how to handle just fine.
00:43:48.000 But TikTok is in motion.
00:43:52.000 It's not like a potential thing.
00:43:54.000 It's in motion.
00:43:55.000 It's got a pit bull grip on America.
00:43:59.000 And it's fucking killing us.
00:44:02.000 It's slow.
00:44:03.000 It's a time bomb.
00:44:04.000 Tick tock.
00:44:05.000 Tick tock.
00:44:06.000 Tick tock.
00:44:07.000 But it's ticking.
00:44:08.000 You can't say it's not ticking.
00:44:10.000 It's literally tick tock.
00:44:12.000 So if you don't understand that it's worse than climate change, Russia, Iran, and the next pandemic, then you don't understand what's going on.
00:44:25.000 And the only reason Congress hasn't banned it that I can think of is that too many members are bought off by China money.
00:44:34.000 There's no other explanation.
00:44:37.000 And not only is there no other explanation, but nobody's offered one.
00:44:43.000 Think about that.
00:44:44.000 It's not that there's no other explanation.
00:44:47.000 Nobody's even tried.
00:44:49.000 Because they say things like, oh, data security is not such a big problem.
00:44:56.000 That's not an explanation.
00:44:58.000 Because the influence is the big problem, not the data security.
00:45:01.000 So if you're arguing about the data security, you're avoiding the problem.
00:45:05.000 You're not arguing it.
00:45:07.000 That's what avoiding the problem looks like.
00:45:10.000 So there's not even anybody who actually addresses the problem and says we should keep it anyway.
00:45:17.000 The closest you can get to that is Vivek, who says, all right, I'm going to use it because it's the only way to reach young people.
00:45:26.000 But he also thinks, you know, I believe he thinks we might be better off without it.
00:45:31.000 I assume that's true, right?
00:45:33.000 Like he's going to use it because he can't kill it?
00:45:36.000 Which actually does make sense.
00:45:39.000 But he should kill it as soon as he's president if he gets a chance.
00:45:43.000 I think he would, by the way.
00:45:46.000 Just a guess.
00:45:48.000 There's some information now that says that Google was serving up brand search ads through porn sites.
00:45:55.000 And other, you know, racist sites and everything.
00:45:59.000 So apparently Google search is doing a worse job than X at keeping advertisers' brands away from bad content.
00:46:13.000 What's going on?
00:46:15.000 It's exactly what you think it is.
00:46:18.000 It's exactly what you think it is.
00:46:20.000 There was never a problem with content and ads being matched on X.
00:46:25.000 It was always an attack on free speech.
00:46:29.000 It was always an attack on free speech.
00:46:32.000 It was never about...
00:46:34.000 It's not even about Elon Musk, except that he's, you know, a powerful force for free speech.
00:46:40.000 But it's about free speech.
00:46:42.000 They just can't stand it.
00:46:44.000 All right.
00:46:45.000 A user on X called Maze, M-A-Z-E, a good account to follow, by the way,
00:46:57.000 tells us that the ADL once had no problem with Twitter, back when it was Twitter.
00:47:03.000 So in 2018, just five years ago, a little bit more, the ADL analyzed a year of Twitter, when it was Twitter, for anti-Semitic tweets.
00:47:17.000 And they found a lot.
00:47:18.000 They found a lot.
00:47:19.000 They found 4.2 million anti-Semitic tweets by 3 million unique handles in the English language.
00:47:26.000 But they said, this report is not about bashing Twitter.
00:47:31.000 They suggested no boycotts.
00:47:36.000 They just wanted you to know it's out there.
00:47:39.000 Yeah.
00:47:40.000 Just want you to know about it.
00:47:43.000 Does that seem a little different?
00:47:45.000 Now in 2018, what did we know about Twitter?
00:47:48.000 We knew that it was a cesspool of FBI and Democrat, you know, finger on the scale.
00:47:56.000 And that it was the opposite of free speech.
00:47:59.000 It was the opposite of free speech.
00:48:01.000 It was literally controlled speech.
00:48:03.000 And one group was being highly censored.
00:48:06.000 But under those conditions, you know, a little bit of anti-Semitism.
00:48:11.000 Ah, you know, ah, there's a little bit everywhere, really.
00:48:16.000 I mean, you know, who doesn't have a little bit of anti-Semitism?
00:48:20.000 Yeah.
00:48:21.000 That's the ADL.
00:48:23.000 So the ADL trying to pretend to be a credible organization as opposed to, obviously, just an attack dog for the Democrats, who also do some good stuff.
00:48:35.000 By the way, you know, the ADL's main mission?
00:48:39.000 Excellent.
00:48:40.000 Yeah.
00:48:41.000 Yeah.
00:48:42.000 Protect Jewish people from unfair discrimination?
00:48:45.000 Good job.
00:48:46.000 But it's obviously more than that as well.
00:48:49.000 All right.
00:48:51.000 Speaking of Israel, Israel's UN ambassador is slamming Soros for donating to pro-Hamas groups.
00:49:02.000 Huh.
00:49:03.000 The UN ambassador from Israel is slamming Soros for being anti-Israel.
00:49:15.000 You know, when people say that George Soros is doing bad things, don't we usually call that anti-Semitic?
00:49:25.000 Yeah.
00:49:26.000 If you criticize George Soros, aren't you sort of automatically anti-Semitic, aren't you?
00:49:33.000 I've been told that for a long time.
00:49:35.000 So I'm wondering if MSNBC will cover this story, and will they accuse Israel of being anti-Semitic?
00:49:44.000 Because they just accused...
00:49:47.000 Now, I'm sure it wasn't long ago that if you criticize the ADL or George Soros, you are automatically anti-Semitic.
00:49:57.000 Do you know who else criticizes the ADL and George Soros?
00:50:03.000 Israel.
00:50:04.000 Israel.
00:50:05.000 Right.
00:50:06.000 Now, Israel doesn't criticize everything the ADL does.
00:50:11.000 But there are members of Israel who have been quite critical of the organization.
00:50:17.000 So are they anti-Semitic against themselves?
00:50:22.000 Probably not.
00:50:23.000 Probably not.
00:50:24.000 Now, I ask you this question.
00:50:26.000 When the UN ambassador from Israel criticizes George Soros for funding pro-Hamas entities, what should that trigger in the news?
00:50:40.000 That should trigger a story.
00:50:42.000 Right?
00:50:43.000 Because I saw the story.
00:50:45.000 Think MSNBC will get Alex Soros on to explain why his money is going to pro-Hamas organizations?
00:50:58.000 Let me take a fucking guess.
00:51:00.000 How about no?
00:51:02.000 How about he will not be invited on to explain why his money is going toward, you know, prosecutors who are letting people out of jail for the minor crimes and destroying the cities and pro-immigration beyond what we want and pro-Hamas organizations?
00:51:24.000 There will, here's my prediction, there will not be one news entity that successfully gets him on air to ask him tough questions.
00:51:35.000 And he's the most important person in the country.
00:51:38.000 Because his funding has the most impact.
00:51:41.000 Because they're really good at funding things that make a difference.
00:51:44.000 They're very capable, whatever it is they're doing.
00:51:49.000 Yeah.
00:51:50.000 So, that's all you need to do.
00:51:54.000 That's the dog not barking.
00:51:57.000 If you never see the press interview Alex Soros, you don't have a free press.
00:52:05.000 And if they interview him and they don't ask him any important questions, you know, it's going to be more like,
00:52:10.000 so, what do you think about the Ukraine war?
00:52:13.000 And he'll be like, I think they should settle it.
00:52:15.000 You know, it's going to be something like that.
00:52:17.000 You know, just complete bullshit.
00:52:19.000 But now, I understand why Fox News or Breitbart can't get Alex Soros to sit down for an interview.
00:52:27.000 Presumably, you would just say no, because it would feel like a hit piece.
00:52:31.000 But why can't the left get him to sit down?
00:52:34.000 Why can't CNN get him to sit down for an interview and ask real questions?
00:52:39.000 So, here's a good Jake Tapper question.
00:52:44.000 Jake Tapper?
00:52:46.000 I'm pretty sure Jake Tapper is not anti-Israel or anti-Semitic.
00:52:52.000 Say what you will about your Jake Tappers.
00:52:55.000 He's not anti-Semitic.
00:52:57.000 Yeah, I'm pretty sure of that.
00:53:00.000 Don't you think he would have a personal interest, which would match the national interest, which is ideal, to get Alex Soros to explain why he's acting in a way that even Israel doesn't like?
00:53:15.000 You don't think Jake Tapper should do that?
00:53:17.000 Do you think he would be allowed?
00:53:19.000 Even if he had the instinct to do that, do you think he'd be allowed to put that on TV and actually ask the really hard questions?
00:53:30.000 I think not allowed.
00:53:32.000 I don't know.
00:53:33.000 I mean, this would be a baseless accusation I'm making, because it's just pure speculation.
00:53:38.000 But the fact that we've never seen it, and it's been the most obvious thing that should be on the news for now years.
00:53:45.000 For years, it's the most obvious thing you should do.
00:53:49.000 Get the Soroses on the air.
00:53:51.000 Ask them what the hell is going on.
00:53:54.000 So, we'll see.
00:53:57.000 And then we have a report that Israel has destroyed 500 of an estimated 800 tunnels in Gaza.
00:54:05.000 Let me ask you this.
00:54:08.000 How do we know how many people were in those 800 tunnels, or the 500?
00:54:13.000 So, they blew up 500 tunnels.
00:54:17.000 How many of them were empty?
00:54:20.000 And do they make sure there's no, let's say, hostages in them first?
00:54:26.000 Is it possible they've already killed most or all of the remaining hostages?
00:54:32.000 Because that's the obvious risk of war.
00:54:37.000 Yeah, there's no way to know.
00:54:39.000 Because Hamas would still say they were alive, you know, to use them as a bargaining chip even if they weren't.
00:54:45.000 But how do you do a death estimate in Hamas if you're blowing up tunnels?
00:54:53.000 If you don't know who's in the tunnel, or do they clear them all before they blow them up and all they're doing is making sure the tunnel can't be used?
00:55:01.000 Or are they blowing them up with people inside them?
00:55:04.000 Because that's the safest way to kill them.
00:55:08.000 I don't know.
00:55:10.000 But if they blew up 500 tunnels, I would say that there's no way we'll ever have anything like an accurate death count.
00:55:20.000 How could you?
00:55:22.000 I mean, I'm not even sure that Hamas knows who was underground and who escaped and, you know, who's hiding in southern Gaza and who dressed as a woman and got out.
00:55:32.000 How would they know?
00:55:34.000 I can imagine that Hamas does not have good communication among its various parts at this point,
00:55:39.000 because the communication would be too easily discovered.
00:55:43.000 They're chopping up and burying cars.
00:55:48.000 Yeah, I saw that.
00:55:50.000 All right.
00:55:52.000 Ladies and gentlemen, are there any stories I have not covered in my amazing one hour of the tour of the news and everything you needed to see?
00:56:03.000 Persuasion and propaganda.
00:56:06.000 Yeah.
00:56:11.000 You know, I do have a lot of curiosity about how Israel always agrees to give up, like, ten of their people for one of their people.
00:56:24.000 You know, because there are lots of hostage negotiations that happen.
00:56:28.000 Why does Israel always do that?
00:56:31.000 Like, I have a hypothesis.
00:56:35.000 One is that it works.
00:56:38.000 One is that maybe they arrest far more people than they really needed to just so they have people to bargain with.
00:56:45.000 That would be terrible.
00:56:46.000 Or virtue signaling.
00:56:50.000 It might be that.
00:56:51.000 It might be that they legitimately think, you know, their people are worth...
00:56:56.000 Certainly, certainly their people are worth more to them than their enemies are worth.
00:57:03.000 Yeah.
00:57:04.000 The Hamas leader is called Sinwar.
00:57:13.000 The leader of Hamas, his name is literally Sinwar.
00:57:18.000 Really?
00:57:19.000 Really?
00:57:28.000 Tucker on Gutfeld.
00:57:30.000 Yeah, I did notice that Greg is pushing the envelope a little bit lately.
00:57:37.000 I hope that works out for him.
00:57:40.000 Conor McGregor, we've talked about him.
00:57:43.000 Have we done it all?
00:57:50.000 Give us more of the lay of the land, like with Bill Maher and Musk.
00:57:54.000 Well, that's a big question.
00:57:56.000 I don't know if I can do that on demand.
00:58:03.000 How long before Greg gets the boot at Fox?
00:58:06.000 I don't think he will.
00:58:08.000 Because here's the thing.
00:58:11.000 Tucker knew that he was saying things that could get him fired, and so he wasn't surprised by what happened.
00:58:19.000 When I observe Greg, I think he chooses his spots better.
00:58:25.000 He says provocative things, but they don't seem to cross the line.
00:58:31.000 And also you can tell when he's just pushing buttons, because that's part of his job.
00:58:38.000 Tweaking people, pushing buttons.
00:58:40.000 And that looks different than what Tucker was doing.
00:58:50.000 Ask me if I'd buy a Cybertruck.
00:58:52.000 Well, here's the thing.
00:58:55.000 One of the things that I keep hearing is that the Cybertruck looks way more amazing in person than it does in pictures.
00:59:04.000 You've heard that too, right?
00:59:06.000 That you can't appreciate it in a picture.
00:59:09.000 Now, if that's the case, I haven't been close to one in person.
00:59:14.000 So I don't yet have an opinion, because I'd want to be close to one and I'd want to drive one.
00:59:21.000 My external opinion goes like this.
00:59:24.000 To me, it feels, this is a terrible analogy, but it feels a little like the Hummer used to be.
00:59:31.000 You know, before gas prices were too high, if somebody would get a Hummer.
00:59:36.000 It was because it was just so cool.
00:59:39.000 And I think the Cybertruck has that.
00:59:42.000 It's like, if you're a vehicle person, you know, if you're really a car person, this would be the one to have.
00:59:48.000 I mean, if you were going to add one vehicle to your, you know, three sport car collection, you would have to add this.
00:59:56.000 Because it beat a Porsche.
00:59:58.000 It beat a 911, or 911, what do you call the Porsche?
01:00:02.000 It beat it in a test.
01:00:05.000 So if you're a car person, you just have to own this vehicle, I think.
01:00:09.000 But I'm not a car person.
01:00:11.000 You know, I want one vehicle that does what I need to do.
01:00:14.000 Now, when I look at it, it looks like there's poor visibility.
01:00:19.000 And that's number one on my list.
01:00:23.000 My number one things are easy to park and good visibility with at least cameras and ideally through windows.
01:00:31.000 Because those tend to be the things that bother you most in the day to day.
01:00:35.000 Can you park it?
01:00:37.000 You know, is it too wide?
01:00:39.000 I don't know how wide it is.
01:00:41.000 Yeah.
01:00:42.000 So, I generally would not buy the first production run of a car that radical.
01:00:54.000 But if I were a collector, I would.
01:00:56.000 You know, I mean, if I had the money and I really cared about cars, I probably would.
01:01:01.000 I would strongly be biased toward getting one.
01:01:06.000 But, for me, it would be a little too showy.
01:01:13.000 So let me tell you what public figures have to worry about.
01:01:18.000 When I first made money doing Dilbert, I'd been driving, you know, used cars for a while.
01:01:28.000 But when I got my first good car, like an expensive car, it was a BMW M3.
01:01:35.000 And it was brand new.
01:01:37.000 And in the first day, somebody keyed the entire side of the car, fuck you, from the back bumper
01:01:49.000 across both doors and the front fender.
01:01:52.000 Now, at the time that happened, I was working like three jobs.
01:01:57.000 You know, I was writing a book, doing Dilbert, doing my day job.
01:02:00.000 I couldn't get it fixed.
01:02:02.000 So imagine this experience.
01:02:05.000 I'd waited all my life for the first time to be able to buy a car that I really wanted.
01:02:11.000 I'd never done that before.
01:02:13.000 You know, I did what everybody else does.
01:02:15.000 I bought the car I could afford that was good enough.
01:02:17.000 And that was usually a used car.
01:02:20.000 So the first time I could buy a car that like, you know, you could feel it.
01:02:24.000 It's like, oh, you sit in the car.
01:02:26.000 You're like, oh, man, like this is good.
01:02:30.000 The experience was completely ruined because for the first two months, that's how long it
01:02:37.000 took me to find time to take it to the shop, for two months, everywhere I went, feeling
01:02:44.000 good about my car, people would pull up next to me in the other lane and they'd be like, and
01:02:52.000 they'd be pointing to the, like I didn't know.
01:02:55.000 And I'd be like, yeah, I know.
01:03:00.000 I know.
01:03:01.000 I'm sad about it, but I can't do anything about it right now.
01:03:04.000 I know.
01:03:05.000 As I'm driving my expensive sports car.
01:03:12.000 But time goes by and a number of years later, I got a new car.
01:03:21.000 It was also an M3 and so I got a redo.
01:03:25.000 I finally got a redo.
01:03:26.000 So I could experience, you know, the first week of owning a car that really means something
01:03:32.000 to you, that really speaks to your whole body.
01:03:36.000 So I finally get my new M3.
01:03:38.000 I'm still like, I got PTSD from getting keyed on my very first day.
01:03:43.000 And I take it to the gas station, park it at the gas station, go to fill it.
01:03:50.000 And I got rear-ended at the gas station, parked in front of a pump.
01:04:02.000 Has that ever happened to you?
01:04:06.000 Have you ever been rear-ended at a gas station?
01:04:11.000 It's only happened to me once.
01:04:14.000 It was the first day I had that car.
01:04:17.000 First fucking day.
01:04:19.000 So that's two BMW M3s in a row.
01:04:23.000 They were effectively, physically destroyed the first day.
01:04:29.000 So I never really got that sports car.
01:04:34.000 Anyway, the point of it was, you asked, would I get a Cybertruck?
01:04:38.000 If you're a public figure, you don't want to get a Cybertruck unless you're going to drive
01:04:45.000 it around your neighborhood and hide it in your garage.
01:04:48.000 You don't want to take the Cybertruck to buy groceries.
01:04:52.000 Right?
01:04:53.000 Basically, unless you just want people to hate you and to think about how they can
01:04:58.000 destroy your stuff, it just paints a target on your back.
01:05:04.000 So my criteria for buying a car is it has to be safe, you know, safety first.
01:05:13.000 It has to have function.
01:05:15.000 But on top of safety and function, I don't go for thrill.
01:05:22.000 Safety, function, and nondescript.
01:05:26.000 Those are the three things.
01:05:27.000 Safety, function, nondescript.
01:05:30.000 I want to park in the parking lot and nobody notices.
01:05:33.000 So my X5 does that because where I live, there's a lot of cars in that category.
01:05:42.000 Your car has to align with your neighborhood, correct?
01:05:45.000 Your car should align with your neighborhood.
01:05:47.000 But I can't imagine taking a Cybertruck and parking it at an airport.
01:05:51.000 I don't know.
01:05:55.000 And I get that it's hard to key it, but there must be something you can do to it that's bad.
01:06:01.000 All right.
01:06:03.000 By the time you get your Cybertruck, there will be many on the road.
01:06:07.000 I wouldn't rule it out.
01:06:09.000 So I'd say it's on my short list, but I'd have to know a lot more about it.
01:06:13.000 And I'll have to see how people like it the first year.
01:06:16.000 I'm just assuming there will be a number of minor recalls and stuff because it's a new vehicle.
01:06:22.000 So we'll wait for that.
01:06:24.000 All right.
01:06:29.000 Cameras.
01:06:31.000 I assume it has 360 cameras, right?
01:06:34.000 Cameras all around?
01:06:36.000 It needs that, doesn't it?
01:06:38.000 Yeah.
01:06:39.000 It would have cameras all around.
01:06:40.000 All right.
01:06:41.000 YouTube, thanks for joining.
01:06:43.000 We'll see if this episode gets suppressed.
01:06:47.000 It's looking like it.
01:06:49.000 And I'll talk to you tomorrow.
01:06:51.000 Thanks for joining.
01:06:52.000 Thanks for joining.