Real Coffee with Scott Adams - January 17, 2025


Episode 2723 CWSA 01/17/25


Episode Stats

Length

1 hour and 1 minute

Words per Minute

151.3155

Word Count

9,248

Sentence Count

672

Misogynist Sentences

6

Hate Speech Sentences

12
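The words-per-minute figure above follows from the word count and the runtime; a quick sanity check (the listed "1 hour and 1 minute" must be rounded from roughly 61.1 minutes for the stated figure to hold — that unrounded length is an assumption):

```python
# Sanity-check the episode stats: words per minute = word count / minutes.
word_count = 9248

# Invert the stated wpm to recover the implied unrounded runtime.
length_minutes = word_count / 151.3155

# Recompute wpm using the rounded "1 hour and 1 minute" runtime.
wpm_at_61_minutes = word_count / 61

print(round(length_minutes, 2))     # ≈ 61.12 minutes
print(round(wpm_at_61_minutes, 1))  # ≈ 151.6 wpm
```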


Summary

An Air Force veteran claims to have seen an egg-shaped UFO. The Washington Post has a new mission statement. And the New York Times has a mission statement that sounds like it could be based on fiction. Guest: John Rocha, comedian, writer, and podcaster.


Transcript

00:00:00.000 A tank or chalice or stein, a canteen, jug or flask, a vessel of any kind.
00:00:03.900 Fill it with your favorite liquid.
00:00:05.740 I like coffee.
00:00:07.280 Join me now for the unparalleled pleasure, the dopamine.
00:00:10.500 hit of the day, the thing that makes everything better. It's called the simultaneous sip.
00:00:14.940 And it happens now.
00:00:15.780 Go.
00:00:23.540 Same shirt.
00:00:25.300 Are you just now noticing that I wear the same shirt every day?
00:00:28.260 Yeah, it's true.
00:00:32.100 Sometimes not on laundry day, but most of the time.
00:00:36.080 Well, did you know that unsweetened coffee is associated with reduced risk of Alzheimer's
00:00:40.060 and Parkinson's disease, according to SIPOS?
00:00:42.380 Yes.
00:00:43.360 There's nothing that coffee can't do.
00:00:45.740 And this is the reason that I stopped doing cold plunges.
00:00:50.220 You know, everybody's like, oh, take a cold plunge.
00:00:52.820 Take a cold plunge.
00:00:54.040 It hurts a lot.
00:00:55.360 It's so painful.
00:00:56.420 Do it.
00:00:57.060 Do it.
00:00:57.380 It's good for you.
00:00:58.260 And I say, how painful is that?
00:01:01.220 That looks like really painful.
00:01:03.080 Oh, it's really painful.
00:01:04.080 It's good for you.
00:01:05.540 So instead of cold plunges, I have a big vat full of warm coffee and I just roll around
00:01:12.000 in it every day.
00:01:13.000 And that's why I don't have Alzheimer's or Parkinson's.
00:01:17.240 Yeah.
00:01:17.600 It's not an accident, people.
00:01:19.300 Well, I hope you see the video that's going around of an Air Force veteran who was assigned
00:01:27.320 to pick up alien wreckage.
00:01:30.500 And he would take his helicopter out.
00:01:32.820 So the military assigned him to do this.
00:01:35.380 And he said most of the things he picked up were just, you know, secret government programs and stuff.
00:01:40.780 But once, once he saw non-human egg-shaped aircraft while working on his secret UFO retrieval program, according to the New York Post.
00:01:51.800 Now, here's what's funny.
00:01:55.080 You'd have to see his eyes when he's describing it.
00:01:58.220 Now, I always tell you that the lying eyes are the ones that get wide.
00:02:04.840 If you watch the politicians, you can turn off the sound and you can still tell when they get to the lie because their forehead gets wrinkled because their eyes are so wide.
00:02:15.560 Now, now that you know that that's what a lie looks like, you know, it's like, yeah, I'm going to take office on the 19th and I'll probably eliminate the deficit.
00:02:28.220 on day one. You know, that's when they get to the lie.
00:02:32.080 But this guy's eyes are these giant, they look like saucers themselves.
00:02:37.340 I saw a UFO.
00:02:42.140 It's not even slightly, slightly credible when you see his face.
00:02:46.440 If all you did is read about it or hear about it, you'd say, hmm, military veteran, probably, probably credible.
00:02:53.500 Probably found an egg-shaped UFO.
00:02:56.280 Sounds good.
00:02:57.100 And then we'll look at his face and you go, oh.
00:03:02.240 All right, you got to see it.
00:03:04.120 Oh, my favorite story of the day.
00:03:06.520 Every time the Washington Post does worse, I get a little charge of pleasure.
00:03:12.260 I know I shouldn't.
00:03:13.560 You know, I shouldn't be reveling in the complete incompetence and destruction of a newspaper that canceled me worldwide.
00:03:22.440 They're the ones who kicked it off, by the way.
00:03:26.200 But here's the...
00:03:27.580 Oh, my God.
00:03:31.860 So the Washington Post came up with a new mission statement.
00:03:35.160 Thank you, God.
00:03:39.620 All I pray for is that I can use my Dilbert filter to mock the Washington Post.
00:03:48.320 And today you gave me, they have a new mission statement.
00:03:53.580 Oh, thank you.
00:03:54.560 Thank you.
00:03:55.780 I'm so grateful.
00:03:58.460 Here's their new mission statement.
00:04:01.040 I'm not making this up.
00:04:03.340 This is actually, I swear to God, this is what their new mission statement is.
00:04:07.000 You ready for this?
00:04:10.120 It's...
00:04:11.000 They're going to highlight their focus on...
00:04:16.440 No, their new mission statement is, quote, riveting storytelling for all America.
00:04:23.240 Storytelling.
00:04:25.700 Storytelling.
00:04:28.420 Newspaper, news, facts.
00:04:32.760 Storytelling.
00:04:33.320 I think they just created a mission statement that says their news isn't real.
00:04:41.040 Because do you use that language when you're talking about the facts?
00:04:46.600 How about, we're going to tell you what you need to know?
00:04:50.320 How about all the important facts?
00:04:53.480 No, it's storytelling.
00:04:57.020 Now, are they completely unaware of what that sounds like?
00:05:01.260 When you hear storytelling, you don't think fact.
00:05:04.600 You think fiction.
00:05:05.780 You think narrative.
00:05:07.780 And indeed, that's exactly what they are.
00:05:09.960 They accidentally picked a mission statement that describes exactly what they are.
00:05:14.340 They're storytellers.
00:05:16.140 They're storytellers.
00:05:18.960 Their own mission statement of a news organization left out the news.
00:05:23.880 They left out the news.
00:05:27.060 Stories.
00:05:27.780 Stories are not the news.
00:05:29.180 I mean, they could be based on the...
00:05:30.820 Based loosely on the news.
00:05:32.900 But then it gets better.
00:05:35.300 They said they're keeping their little tagline that they've had for a long time.
00:05:40.120 That democracy dies in darkness.
00:05:43.980 What do you think causes the darkness?
00:05:45.780 Darkness, let me see, causes the darkness that would destroy democracy.
00:05:52.800 What would cause such darkness?
00:05:55.660 Would it be storytelling?
00:05:58.640 Could storytelling, instead of telling real news, do you think that might cause the darkness
00:06:04.280 that destroys democracy?
00:06:06.620 Now, you have to admire their transparency.
00:06:09.960 They basically told you that they're going to destroy democracy by telling you stories instead
00:06:17.000 of news.
00:06:19.520 Am I over-interpreting this?
00:06:22.500 Or is that exactly what they're telling us?
00:06:25.580 I don't think you could be more clear about what you're doing.
00:06:30.300 I don't think it's intentional.
00:06:33.060 But it's clear.
00:06:35.720 Washington Post failing again.
00:06:37.760 Meanwhile, I'd like to give you a SpaceX update.
00:06:42.400 I was watching the news yesterday, and I was looking at clips about the SpaceX launch,
00:06:48.460 and I was reading all the news.
00:06:51.780 And sometimes the clip would say that it was a success, and sometimes another clip would
00:06:57.080 say that it failed.
00:06:58.700 And so all yesterday, I had this Schrödinger's cat experience about the launch.
00:07:04.420 It was like, wait, did it fail but also succeed?
00:07:09.980 Wait, were there two rockets and one failed and one succeeded?
00:07:13.680 Because nobody told the whole story.
00:07:16.180 Every clip I had was like one pixel out of the story.
00:07:21.060 So I'm like, do I have to assemble this whole story in my head?
00:07:23.880 And I'm still not entirely sure what happened.
00:07:28.300 But my best understanding, reading all these clips, but of course I didn't want to go to
00:07:34.400 the news.
00:07:35.620 Duh.
00:07:36.440 You know, if I went to the news, I don't know what I'm going to see.
00:07:39.060 It might be storytelling.
00:07:40.420 You know what I mean?
00:07:40.940 But I think what happened was it was one launch and one successful giant chopstick caught it.
00:07:52.960 But it was only the booster that did well and the payload part, the upper stage, that
00:07:58.900 had to be destroyed.
00:08:00.660 So there was some kind of catastrophic failure.
00:08:03.420 It either blew up on its own or they destroyed it.
00:08:06.120 Anyway, so Elon considers it a success, as would I, as would I, that this is the sort
00:08:16.180 of, I'm going to put it in quotes, failure that is exactly the kind you want, where you're
00:08:23.660 unambiguously learning something that puts you ahead.
00:08:26.880 You have the ability to keep going.
00:08:29.600 You take this, you've learned from it, you take it to the next level.
00:08:33.440 So it's hard for me to even call a test a failure if they found out what they needed to find
00:08:40.200 out.
00:08:41.760 You know, it would be cool if they found out, oh, everything worked.
00:08:44.740 But either way, if you find out, you're moving forward.
00:08:47.660 So I like the whole attitude about it, that it's a successful, successful test because they
00:08:54.100 learned critical things from the test that will move them forward.
00:08:58.540 Now, here's something I didn't know.
00:08:59.760 Nick Cruz-Petain on X said this, and then Elon Musk said yes to it.
00:09:07.180 So this sounds like it might be something SpaceX plans, which would be to be a general Earth
00:09:13.800 transport to other places on Earth.
00:09:16.840 Now, you probably know that if you're in a vehicle that goes into space and then comes
00:09:22.220 down in another place on Earth, it can go there faster than some airplane that's trying
00:09:27.240 to go through, you know, go through the air because of the resistance and whatnot.
00:09:32.420 So that could take a trip from L.A. to New York.
00:09:34.960 Oh, Supreme Court upholds the TikTok ban.
00:09:43.040 Oh, OK.
00:09:43.880 New news.
00:09:44.600 Supreme Court upholds the TikTok ban.
00:09:47.080 Well, I guess it's gone.
00:09:49.020 We shall see.
00:09:50.100 I think there's more to happen on that story.
00:09:52.040 But anyway, if SpaceX does create a product where you can just take the rocket ship to
00:09:58.880 another place on Earth, L.A. to New York, which would normally be five and a half hours, would
00:10:03.800 be 25 minutes.
00:10:06.260 Can you imagine getting from L.A. to New York in 25 minutes?
00:10:10.340 How about London to New York?
00:10:14.220 29 minutes.
00:10:18.100 Wow.
00:10:19.020 Why is that 29 when the six-hour flight is 27?
00:10:22.800 I don't know, the numbers don't look quite right, but that would be pretty amazing.
00:10:28.340 But what would it cost to be on a rocket that uses that kind of fuel, even if it's re-landable?
00:10:37.860 I don't know.
00:10:38.420 But Elon said, yes, maybe this is the future.
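The quoted times imply very different average speeds for the two routes. A rough great-circle check (city coordinates are approximate, and a real suborbital trajectory is longer than the surface path, so this is only an illustration of the quoted figures, not a trajectory model):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points on Earth, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Approximate city-center coordinates (assumed for illustration).
la, ny, london = (34.05, -118.24), (40.71, -74.01), (51.51, -0.13)

d_la_ny = haversine_km(*la, *ny)       # ~3,940 km
d_lon_ny = haversine_km(*london, *ny)  # ~5,570 km

# Implied average ground speeds for the quoted 25- and 29-minute trips.
print(round(d_la_ny), round(d_la_ny / (25 / 60)))    # km, km/h
print(round(d_lon_ny), round(d_lon_ny / (29 / 60)))  # km, km/h
```

The longer hop averages a higher speed, which is plausible for suborbital flight: a longer route spends more of its time coasting at peak velocity, so travel times need not scale linearly with distance.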
00:10:42.240 Meanwhile, according to Interesting Engineering, China is testing a microwave weapon with what
00:10:47.520 they call nuclear bomb-like power to kill satellites.
00:10:50.780 So Christopher McFadden at Interesting Engineering is writing it.
00:10:55.820 It's still experimental.
00:10:56.980 They haven't rolled it out.
00:10:57.880 But it has the potential to take out swarms of drones or satellites.
00:11:04.420 So you can kind of see the future, can't you?
00:11:08.080 We've got all this high-tech stuff, the satellite drones and the smart missiles and, you know,
00:11:14.100 the robot dogs.
00:11:15.780 And then probably the superpowers will have these ginormous electronic devices to cancel
00:11:24.860 out the electronics on these other weapons.
00:11:28.760 So the race to have the better robot military equipment is going to meet with the race to
00:11:39.620 find out a thing that can cancel all electronics in the area.
00:11:43.460 We'll see where that goes.
00:11:45.960 There's a massive fire.
00:11:47.660 It's probably still burning.
00:11:48.860 It was yesterday.
00:11:49.980 At the lithium battery storage plant in California, of course.
00:11:54.940 Of course.
00:11:55.880 Now, when you hear that a lithium battery storage plant has a fire, you probably have to say
00:12:01.860 to yourself, that's sort of lithium batteries, that's not the biggest surprise in the world.
00:12:09.680 But when that fire has an impact on our whole power grid, presumably, and it comes right on
00:12:22.380 the heels of several other fires in L.A., not all of them have known causes, here's what
00:12:30.700 worries me.
00:12:31.940 If we were already under attack, this is exactly what it would look like.
00:12:38.440 Wouldn't it?
00:12:39.140 If the homeland was under a serious attack, you would see fires started in different places,
00:12:47.220 in places that are highly valuable and hard to start.
00:12:50.740 You wouldn't see a fire in maybe the center of Detroit, because that could be kind of limited
00:12:56.540 to one building.
00:12:57.240 But you would see it where you're seeing them.
00:13:00.480 And then you would see attacks on the grid.
00:13:05.320 And you would see them happening in bunches.
00:13:08.300 So it would look like, well, that's a big coincidence.
00:13:11.280 What, another fire?
00:13:12.660 Well, that's another big coincidence.
00:13:14.360 What, at the same time as the lithium battery storage plant?
00:13:17.540 Well, that's another coincidence.
00:13:19.560 Now, I'm not saying we're under attack.
00:13:21.420 I'm saying that if the real news looks exactly like an attack, you better pay attention, because
00:13:29.680 one of these is going to be an attack, I feel like, someday, hope not soon.
00:13:35.340 But apparently, did you know the United States buys transformers for our power grid?
00:13:43.580 And we've got about, I don't know, close to 500 of them.
00:13:46.980 And one of them was once taken apart to see if there's anything dangerous in these Chinese
00:13:52.240 transformers.
00:13:54.040 And they found that there was a way to turn it off remotely.
00:13:58.780 Right.
00:13:59.580 Meaning that if you were a Chinese agent, you could just walk up to a transformer in our
00:14:05.120 grid, probably take out your phone and go, boop.
00:14:09.860 And if you had the right app, it would turn off the network.
00:14:14.300 They can just turn off our power now.
00:14:16.980 Pretty much, I don't know how many, 500 transformers.
00:14:21.240 You'd only need one transformer to get turned off in your network and the whole network after
00:14:26.280 that transformer would be down.
00:14:28.240 That's a lot of power they have control over, theoretically, just by walking up to it.
00:14:32.460 And maybe they don't even have to stand next to it.
00:14:35.080 Maybe they can do it remotely.
00:14:36.420 I don't know how remotely they have to be.
00:14:38.460 But apparently, there's a back door.
00:14:40.680 And we bought more of them instead of phasing them out, probably because we can't make stuff
00:14:44.900 in this country affordably.
00:14:48.040 Anyway, Megyn Kelly reports that she had a woke friend who lost a home in the California
00:14:55.860 fires and says, quote, this is what the friend said, the woke friend.
00:14:59.800 Everyone I know is ready to vote Republican.
00:15:02.540 Nick Garamond at RedState reporting that.
00:15:04.500 And here's what I think.
00:15:08.500 You know, I live in California.
00:15:10.640 And so I know lots of Californians.
00:15:12.860 I don't know any woke ones.
00:15:15.420 I don't know any woke ones.
00:15:17.580 You know, even my famous smartest Democrat friend that I often talk about is not really
00:15:24.280 woke, just prefers the policies of the Democrats and thinks the Republicans are evil and stuff.
00:15:31.640 You know, normal stuff.
00:15:32.680 But not, you know, he's not like hosting drag queen hours or anything.
00:15:36.540 I don't know any.
00:15:37.440 I don't know any woke people.
00:15:39.340 I live in California.
00:15:41.120 I don't know any.
00:15:42.180 I literally don't know one.
00:15:44.260 I mean, I know they exist.
00:15:45.460 I think I've seen them, you know, in public, but I don't know one.
00:15:51.140 So I cannot confirm or deny that the people who are in that category that I don't even
00:15:56.580 know one have decided they've had enough and they're going to vote Republican.
00:16:02.540 I honestly don't believe that.
00:16:05.720 I believe that only the people who lost their houses or got displaced are even thinking
00:16:11.600 in that way.
00:16:12.200 Because whatever it is that makes the Democrats the way they are, it would have to be a personal
00:16:18.980 tragedy that would change their minds.
00:16:21.620 So those who are unfortunately having the personal tragedy, I definitely would agree that
00:16:27.520 they're thinking about voting Republican.
00:16:30.060 You don't even have to ask them.
00:16:31.460 That's just sort of obvious.
00:16:32.780 But I'll bet you if you were one mile outside of the danger zone and you didn't have a good
00:16:37.840 friend who was affected, you'd be like, well, Biden's still better than Trump and et cetera.
00:16:44.120 And my local mayor should be blue and all that stuff.
00:16:48.300 So we'll see how real that is.
00:16:50.540 Meanwhile, you know, I've reported this before, but I now have a hypothesis for it.
00:16:55.500 So Gen Z, who are the people between 13 years old and 28 years old at the moment, they're
00:17:03.240 barely using alcohol compared to all the prior generations.
00:17:07.760 If you saw the bar graph, it'd be like, you know, there'd be a bar that's a foot tall and
00:17:11.700 then it immediately goes down to like two inches tall for the current generation.
00:17:16.320 What is up with that?
00:17:19.580 You would think that the one universal throughout time has been if you have alcohol, you're going
00:17:25.220 to drink it.
00:17:26.320 And kids, unfortunately, have access.
00:17:28.720 So they drink it.
00:17:30.480 Why would it just suddenly fall off a cliff?
00:17:33.660 I have a hypothesis.
00:17:35.620 My hypothesis goes like this, that alcohol, people don't do it because it feels good.
00:17:43.280 They do it because it helps them socialize when they're shy.
00:17:46.320 And that among the young, it's not so much, yay, I like this feeling of being drunk.
00:17:53.840 It's, wow, I can finally figure out a way to talk to strangers and I'm not as shy and
00:17:58.800 strangers are drunk, so everything's easier.
00:18:01.960 So if the current generation uses the internet to make friends, and what I've observed is
00:18:09.220 that young people will chat with somebody online that they've never met, but they will
00:18:13.440 have shared pictures, they've checked each other's social networks, and they've chatted for a
00:18:19.720 long time.
00:18:20.940 Then, if everything's going well, they like the chat, they like the look, they like the
00:18:26.000 interaction, they decide to get together.
00:18:30.000 So you see what they've done?
00:18:32.440 They've taken the utility end of alcohol.
00:18:36.280 The utility of alcohol is, how do I meet a new cool person that I want to spend time with?
00:18:41.640 There wasn't really a great way to do it when I was a kid.
00:18:45.940 So you would have a drink and go where there's other people drinking, and that was easier
00:18:50.100 to be less shy.
00:18:52.080 But if I can meet, you know, if I imagine myself back at this young age where the main
00:18:57.760 thing is friends and girlfriends and boyfriends, that I would think, well, if I can get all
00:19:02.540 the social interaction I want through the internet, I don't know, do I want to feel drunk?
00:19:10.560 Do I want to take the chance?
00:19:12.620 It isn't really cool.
00:19:14.900 So, I don't know.
00:19:16.260 So I just put that out as a hypothesis that the internet way of meeting people just replaced
00:19:21.740 the utility of alcohol.
00:19:23.700 I mean, if that happened, maybe it's more good than bad.
00:19:29.120 Just a guess.
00:19:30.100 Ontario, the wait is over.
00:19:33.300 The gold standard of online casinos has arrived.
00:19:36.140 Golden Nugget Online Casino is live, bringing Vegas-style excitement and a world-class gaming
00:19:41.260 experience right to your fingertips.
00:19:43.760 Whether you're a seasoned player or just starting, signing up is fast and simple.
00:19:48.200 And in just a few clicks, you can have access to our exclusive library of the best slots and
00:19:52.880 top-tier table games.
00:19:54.240 Make the most of your downtime with unbeatable promotions and jackpots that can turn any mundane
00:19:59.460 moment into a golden opportunity at Golden Nugget Online Casino.
00:20:03.920 Take a spin on the slots, challenge yourself at the tables, or join a live dealer game to
00:20:08.260 feel the thrill of real-time action, all from the comfort of your own devices.
00:20:12.620 Why settle for less when you can go for the gold at Golden Nugget Online Casino?
00:20:17.480 Gambling problem?
00:20:18.820 Call Connex Ontario, 1-866-531-2600.
00:20:23.120 19 and over.
00:20:24.040 Physically present in Ontario.
00:20:25.420 Eligibility restrictions apply.
00:20:27.020 See goldennuggetcasino.com for details.
00:20:29.500 Please play responsibly.
00:20:30.880 All right.
00:20:31.520 Well, the Wall Street Journal is talking, again, you've heard this, that China's population
00:20:35.440 continues to decline.
00:20:36.980 So it declined again last year.
00:20:38.700 Um, the births edged up, but, uh, for the first time, there are more deaths than births.
00:20:46.300 Wow.
00:20:47.560 First time in eight years.
00:20:49.040 China had more deaths than births last year.
00:20:52.020 Wow.
00:20:52.760 That's definitely a turning point.
00:20:54.960 Um, so that would be an indication of doom.
00:20:59.600 Kyle Bass, investor Kyle Bass, who often, uh, talks about China, uh, much the way I do,
00:21:06.260 but he's far more informed, um, he said today on X that China is experiencing a complete financial
00:21:12.500 crash.
00:21:13.240 The 10-year government bonds are yielding only 1.65%, and the overnight rate
00:21:19.600 just spiked to 16%, which would be very bad.
00:21:24.360 Um, and that maybe all of this economic unraveling in China might
00:21:32.380 cause them to move on Taiwan sooner because that would, that would unify the country maybe
00:21:39.140 and be a distraction from their other problems.
00:21:42.240 I'm going to go opposite on that.
00:21:44.660 I, I agree that China has got some big economic challenges, no question.
00:21:50.700 But, um, my take on China just as an observer, no expert, right?
00:21:57.160 So I'm, so I will defer to the experts if they say, Scott, you've got this totally wrong.
00:22:02.460 You know, there's a history of them doing this exactly.
00:22:05.460 Um, my take on China is that they keep acting rationally, just rational, rational, rational.
00:22:14.020 Now that's, that's the current leadership who have lots of engineers in office.
00:22:17.880 Did you know that?
00:22:18.880 Did you know a lot of the top, uh, the top officials in the, you know, the Chinese Communist
00:22:25.060 Party, the top officials, many of them are engineers.
00:22:28.820 So you have a really practical level headed, you know, what do we need to do to get from
00:22:34.280 here to there?
00:22:35.380 The, the Chinese government always seemed to me rational, right?
00:22:40.480 There's some governments that maybe have a religious driving force and to others, they
00:22:46.160 would not look rational, but China always looks rational.
00:22:48.240 So my, my belief is that you could do a pretty good job of predicting what China will do, because
00:22:56.260 it's what you would do if you were in the same situation.
00:22:59.740 And if I were struggling with all these domestic problems and I were China, the last thing I'd
00:23:06.740 want to do is make a move on Taiwan because while there's some chance it can make things
00:23:12.320 better, there's an equal risk.
00:23:15.740 It could make things so bad.
00:23:18.540 It would be the end of the regime because China doesn't really know how the rest of the world
00:23:24.200 would respond.
00:23:25.600 They think they do, but they don't.
00:23:28.080 And they certainly don't know what Trump would do.
00:23:30.560 So my take on China is they're too risk averse to do something that aggressive when they're
00:23:36.380 already teetering on the edge.
00:23:37.820 Now, I am perfectly willing to be embarrassed and shamed for my lack of understanding of
00:23:46.460 China.
00:23:47.200 If somebody says, Scott, you don't realize in the past, let's say just during the, the
00:23:52.680 administration of President Xi, if somebody said, but you saw what they did here and you
00:23:57.220 saw, and I might go, oh, I forgot about that, or I didn't know about that.
00:24:01.820 But the brutal, I'm going to call it brutal, the brutal, rational way that China operates
00:24:10.600 is brutal rationalization, brutal, rational thinking, meaning they're not about wokeness.
00:24:17.920 It's brutal.
00:24:19.960 It's like, does this work for China?
00:24:22.280 Yes, no.
00:24:23.720 That's it.
00:24:25.120 That's just brutal engineering thinking.
00:24:27.080 Does it work?
00:24:27.820 Yes, no.
00:24:28.500 So I don't think they're going to make a move on Taiwan because of their, because of their
00:24:36.020 economic problems.
00:24:37.280 That, that doesn't square with me.
00:24:39.780 However, to be fair, Kyle Bass is a very good observer of things.
00:24:44.840 And if he has a different opinion, I'd take that seriously.
00:24:48.800 I'm just giving you my perspective.
00:24:50.640 We'll, we'll see who's right over time.
00:24:52.740 Um, but here's the thing, is America doing better than China?
00:24:59.500 It doesn't look like it to me because, you know, we've got the immigration drag, we've
00:25:05.640 got the corruption in every part of the, everything.
00:25:08.740 And then if Doge doesn't work, we've got to get rid of $1.83 trillion deficit per year.
00:25:14.980 We're adding a trillion dollars every hundred days.
00:25:19.000 And really, I don't see an end in sight.
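The two figures quoted here measure different things: $1.83 trillion is an annual deficit, while "a trillion every hundred days" is the widely reported pace of growth in the gross national debt, a broader measure that also rolls in interest and off-budget borrowing. A quick arithmetic sketch of why they don't match (both numbers as quoted in the text):

```python
# A $1.83T annual deficit alone does not add a trillion dollars every
# 100 days; that pace would imply much larger annual growth.
annual_deficit = 1.83e12  # dollars per year, as quoted

per_100_days_from_deficit = annual_deficit * 100 / 365
implied_annual_from_trillion_pace = 1e12 * 365 / 100

print(round(per_100_days_from_deficit / 1e12, 2))         # ≈ 0.5 trillion per 100 days
print(round(implied_annual_from_trillion_pace / 1e12, 2))  # ≈ 3.65 trillion per year
```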
00:25:23.120 So we have a situation in which, I'm not positive, but it looks like the three biggest
00:25:29.820 powers in the world are circling the drain at the same time.
00:25:34.580 Russia looks like it might be on the edge of economic collapse because of the war.
00:25:39.020 And, you know, but maybe not. China looks like it might be on the edge of economic collapse
00:25:45.300 because of various real estate crashing and demographics and whatever, but maybe not.
00:25:52.960 And the United States looks like it's circling the drain with no hope of recovery because
00:25:57.460 of the debt, but maybe not.
00:26:01.540 So I'm going to call it.
00:26:04.300 We're already in the middle of a world war.
00:26:06.320 It's just not with bullets.
00:26:07.680 Well, you could argue that Russia and Ukraine is a proxy war, but we are now in a total full
00:26:15.560 out world war, at least the three major powers in which we're not trying to beat the other
00:26:21.200 as much as we're trying to be the one that survives.
00:26:24.480 So beating China might be as simple as, and I say as simple as meaning simple
00:26:30.300 to describe, really hard to do, surviving.
00:26:35.360 The war between the three superpowers may be down to, are you still here in 10 years?
00:26:42.740 Yes or no?
00:26:43.360 Because I think that if all three of them, you know, find a way through these extraordinary
00:26:50.240 economic times, that would be great, I guess.
00:26:56.240 I mean, I don't, I don't really want to see China crumble because that feels like that would
00:26:59.980 create more risk than it, than it solves.
00:27:02.020 But it's kind of weird that we're already in a world war.
00:27:07.780 What America has to figure out, Mark Andreessen did a good job of explaining this.
00:27:12.020 China has not just good manufacturing, but they have a network of component manufacturing
00:27:17.960 for all the things that are most important.
00:27:19.680 So if we tried to build a, let's say, an iPhone assembly manufacturing thing in the United
00:27:25.980 States, we wouldn't just build one factory that makes iPhones.
00:27:30.020 You would need the entire network of suppliers to that one factory, and we don't have any
00:27:35.400 of that.
00:27:36.440 China does.
00:27:37.520 If we wanted to be good at making robots, we wouldn't build a robot factory.
00:27:42.560 You'd have to build an entire, entire network of suppliers that make the parts that end up
00:27:49.300 in the one factory.
00:27:50.200 We don't have any of that.
00:27:52.120 Now, Elon Musk would be the one exception, because he's the one person that we know of
00:27:57.160 who's made manufacturing, at least some of the manufacturing in the United States work.
00:28:02.280 But even, even Elon Musk depends entirely on China for parts, right?
00:28:08.520 At least the battery parts.
00:28:09.800 I don't know what else.
00:28:10.420 But, so China has all of these things.
00:28:13.700 They can make the phones.
00:28:14.960 This is Mark Andreessen's point.
00:28:17.200 They can make the drones, the cars.
00:28:20.080 According to Andreessen, China is rolling out cars that are going to be high quality, meaning
00:28:26.300 competitive with existing cars, that might cost a third or a quarter of the price of existing
00:28:32.940 cars.
00:28:36.320 Do you know how big of a problem that is for us?
00:28:38.820 If they can make a car for 25% of what our cars now cost, and they're just great, that's
00:28:48.220 a problem.
00:28:49.200 If they can do, if they can make phones and drones and robots and cars because they have
00:28:55.800 a complex ecosystem for manufacturing, we don't have any of that.
00:29:00.660 Do you know that Andreessen said he was trying to, you know, I guess he invested in a company
00:29:06.380 that makes drones or tries to in the United States, and they can't get parts because they
00:29:12.320 don't have that ecosystem.
00:29:13.840 And if they try to create manufacturing, I don't know the details, but he said that the
00:29:18.800 Biden administration basically just had a foot on them the whole time.
00:29:22.580 So presumably, the Trump administration can open up, you know, a little more freedom for
00:29:30.520 the drone manufacturers.
00:29:32.260 I don't know if it's enough.
00:29:33.400 But so what we have as an advantage in the US is software and entrepreneurs, and we're
00:29:38.420 probably better at AI, and we might be better eventually at humanoid robots and stuff.
00:29:44.020 But that's not much to hold on to.
00:29:47.880 It seems to me that China can get better at software and AI in the long term.
00:29:55.200 They can get better at those things than we are.
00:29:57.060 Because software, I just feel like you can steal it, learn it, catch up.
00:30:03.080 But by having an entire complex manufacturing ecosystem that's lower cost than we could
00:30:08.440 ever match, what do we do about that?
00:30:11.400 So the odds of them closing their gap and learning software and being better at AI, that's not
00:30:17.480 as big a stretch as what we'd have to do to catch up with them.
00:30:20.980 So it does seem that the US and China are going to depend on each other to a degree that war
00:30:29.120 would just be stupid.
00:30:31.440 War would be stupid for both sides.
00:30:34.440 So I hope we avoid it.
00:30:51.360 Meanwhile, at Iowa State University, their glacier experts have found
00:30:58.060 a critical flaw in the predictions of sea levels rising because of climate change.
00:31:04.060 What they found was that the way cold ice acts is different than warm ice.
00:31:11.300 So I learned today that there can be warm ice and cold ice.
00:31:16.460 Warm ice would be ice that's already slightly melting.
00:31:19.580 Like it's just barely ice.
00:31:21.900 And then cold ice would be something that's so cold that it's a solid.
00:31:26.040 And what they found was that if you were to predict ice like it's all the same, you would
00:31:32.040 get the wrong answer.
00:31:32.880 But it took them 10 years of experimenting.
00:31:37.060 They studied for 10 years because of various hiccups.
00:31:40.740 And they got to the point where, whoa, this should change your sea level predictions.
00:31:45.880 Now, have you ever noticed that no matter how many times I tell you stories about some major
00:31:55.300 assumption or variable being updated or changed or added to the climate models, that they still
00:32:01.520 predict the same thing?
00:32:02.540 You can change all the inputs almost every day.
00:32:08.360 All right.
00:32:08.540 If you watch my show, you know that almost every day, seven days a week, I tell you about a
00:32:14.000 new thing in climate change that would change the variables you put in the model.
00:32:18.500 And yet the models are all about the same.
00:32:21.700 That is the biggest tell that it's not real.
00:32:24.960 So if no matter what the situation is, the models still say the same thing, or at least
00:32:30.660 they're still within that narrow band, what are those models really doing?
00:32:36.080 All right.
00:32:36.460 That's a pretty big signal.
00:32:38.920 But here's what I've learned by experience with persuading people to have a more reasonable
00:32:45.080 understanding of climate change.
00:32:47.380 Now, a reasonable understanding would be things might be getting warmer and that should be paid
00:32:54.220 attention to.
00:32:55.360 The humans might have something to do with it.
00:32:58.100 I don't know what percentage, but we should pay attention to it just to make sure we're
00:33:01.980 not doing something dumb.
00:33:04.680 But the climate models, the projection models, those are clearly bullshit.
00:33:10.080 And if you try to explain it to people, you run into the same roadblocks.
00:33:14.540 So here's the problem.
00:33:19.560 If I were to try to explain to somebody who was not exposed whatsoever to these arguments,
00:33:24.220 it would be a long conversation and it wouldn't be one that you could possibly do over, you
00:33:29.520 know, a thread on X, which is where everybody debates.
00:33:33.160 So you can't really break through if you're on a, you know, a quick messaging thing where
00:33:38.020 people are just arguing and blah, blah, blah.
00:33:40.100 You'd have to really sit down with somebody for like two hours to get them to the point
00:33:45.300 where they understand why.
00:33:46.880 Not just that the climate models are inaccurate.
00:33:51.100 If you think that's what I'm saying, I'm saying more than that.
00:33:54.520 I'm saying they couldn't be accurate except by pure luck.
00:33:58.720 So here's what you'd want to explain if you had time.
00:34:02.300 You'd explain why the complicated models are driven by assumptions, not data.
00:34:08.900 Now, this is something a normie would not understand.
00:34:11.420 What do you mean it's driven by assumptions?
00:34:13.140 It's the data.
00:34:13.920 No, it's not.
00:34:15.720 It's what assumption you made about what data to put in it.
00:34:18.940 I know that because I used to make complicated prediction models for financial decisions at
00:34:24.300 a bank and then at a phone company.
00:34:27.340 So I've made complicated financial predictions for years and years of my professional life.
00:34:33.120 And I can tell you with certainty what everyone who has ever made a complicated financial prediction
00:34:38.740 has always known.
00:34:40.160 It's your assumptions.
00:34:41.080 You can make it turn out to whatever your boss wants it to be.
00:34:45.280 That's the story.
00:34:46.780 And I know that for sure because I did that work.
00:34:50.380 So how hard would it be to explain to a normie something as basic as: it's just
00:34:57.500 the assumptions you put into it.
00:34:59.100 This is not based on following the data, right?
00:35:02.180 It's the assumptions about the data we're following, not the data.
00:35:05.120 So that's a hard thing for a normie to understand if you haven't been, like, immersed in the
00:35:11.740 field, or at least in any field that had a lot of variables.
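The point that assumptions, not data, drive a long-range projection can be sketched with a toy model. To be clear, everything here is invented for illustration, and this is not a real climate or financial model: the same code, fed three defensible-sounding assumed rates, produces very different 50-year answers.

```python
# Toy illustration: the same projection model with slightly different
# assumptions produces very different long-horizon answers.
# All numbers here are made up for illustration.

def project(start, annual_rate, years):
    """Compound a starting value forward under an assumed annual rate."""
    value = start
    for _ in range(years):
        value *= 1 + annual_rate
    return value

start = 100.0
years = 50

# Three defensible-sounding assumptions about the same starting data...
for rate in (0.01, 0.02, 0.03):
    result = project(start, rate, years)
    print(f"assumed rate {rate:.0%}: projection after {years} years = {result:.0f}")

# ...yield projections of roughly 164, 269, and 438. The assumption,
# not the data, dominates the result.
```

Same input, same model, same data; only the assumed rate changed, and the 50-year answer varied by a factor of almost three.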
00:35:15.300 Then in order to understand why all these climate models seem to fit this narrow band, you would
00:35:24.100 have to understand how scams work.
00:35:26.580 And one specific one, which I've explained too many times: if somebody
00:35:32.320 made a model that was outside the zone of what the experts are expecting, you would just
00:35:38.200 never see it.
00:35:38.940 Now, if you saw everybody's model all the time, that might be people trying to be honest
00:35:45.540 about the models.
00:35:47.800 But if you know for sure that nobody could survive in their career publishing a model that was
00:35:53.220 way above or way below the little zone, the sort of approved zone, you couldn't publish
00:35:59.260 that.
00:36:00.220 So if you have a situation where the only models that can be published without losing your whole
00:36:05.040 career or your funding have to be in a narrow zone, that's not science.
00:36:12.720 That's people doing what they have to do to make money.
00:36:16.860 If you simply follow human incentive, it couldn't possibly be true that all these models fit this
00:36:25.360 narrow band.
00:36:26.700 That's just not real.
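The publication-filter argument can be sketched as a toy simulation. The numbers and the "approved zone" are invented for illustration: if only results inside an approved zone get published, the published models cluster tightly no matter how widely the underlying runs actually disagree.

```python
# Toy illustration of a publication filter: if only model results inside
# an "approved zone" get published, the published models will cluster
# tightly regardless of how much the underlying runs disagree.
# All numbers are invented for illustration.
import random

random.seed(0)

# 1,000 hypothetical model runs with genuinely wide disagreement.
all_runs = [random.gauss(mu=2.0, sigma=1.5) for _ in range(1000)]

# Only results inside the approved zone survive review and funding.
approved_low, approved_high = 1.5, 2.5
published = [x for x in all_runs if approved_low <= x <= approved_high]

spread = lambda xs: max(xs) - min(xs)
print(f"spread of all runs:       {spread(all_runs):.2f}")
print(f"spread of published runs: {spread(published):.2f}")
# The published spread is capped at 1.0 by construction, so tight
# agreement among published models tells you about the filter,
# not about the underlying science.
```

The observed "consensus band" here is an artifact of the filter, which is exactly the incentive story above.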
00:36:28.020 So it would take a while.
00:36:33.220 And then how about this?
00:36:34.240 How long would it take you to explain to a normie, let's say a Democrat, that the number
00:36:39.760 of scientists who agree, no matter how violently they agree, no matter how adamant they are,
00:36:45.780 doesn't mean anything?
00:36:48.300 Just think about how long it would take you to explain that.
00:36:51.060 OK, let me start.
00:36:52.820 You've heard of cause and effect.
00:36:54.580 Uh, OK, I think I've heard of that.
00:36:59.080 Did you know that people do things for money?
00:37:02.460 Did you know that?
00:37:04.140 Uh, they do?
00:37:06.340 Yeah, they do.
00:37:07.880 Well, I thought they did things because they're good people.
00:37:10.340 No, no, no, they do things for money.
00:37:13.740 But not all the time.
00:37:15.200 Yeah, all the time.
00:37:17.060 Not every time.
00:37:18.220 Every time.
00:37:19.360 Every single time.
00:37:20.440 It's the most predictable thing in your entire life is that people do things for money.
00:37:29.460 So if you could get paid for saying climate change is a crisis and you could lose your
00:37:35.620 job and your reputation and your family would starve if you go against the grain, what does
00:37:42.320 it mean that 99% or whatever are on the same side?
00:37:46.340 And I know the real argument is it's not 98% on one side.
00:37:49.920 That's all fake.
00:37:50.740 I get it.
00:37:51.780 But just think how far you would have to go to explain things that we know that if you've
00:37:58.100 never been exposed to them, it just takes forever to explain it.
00:38:00.720 Um, so I would maintain that you don't need to be a scientist or anything like it, um, to
00:38:08.560 understand that the models are broken and couldn't possibly be right.
00:38:12.080 The only thing you need to know is how cause and effect works.
00:38:16.440 If you understand cause and effect, you know, the models are bullshit.
00:38:20.560 If you're confused about the most simple thing in the world, like, uh, if I change the incentives,
00:38:28.440 uh, will, will that change behavior?
00:38:32.440 Yes.
00:38:33.720 Yes.
00:38:34.880 Changing the incentives changes the behavior.
00:38:37.700 Yes.
00:38:38.820 Now, for me to be correct in this domain of climate change,
00:38:47.700 it really comes down to people who don't understand cause and effect.
00:38:51.060 If you don't understand that people are going to lie for money, you don't know anything,
00:38:55.580 but let's see if that exists in any other domain.
00:39:00.580 Well, I saw Austin Allred today saying in a post that they,
00:39:06.540 meaning the people in charge, looked at the data and saw that college graduates
00:39:11.060 on average did better in life.
00:39:13.120 Now that's true, right?
00:39:14.220 People who went to college do better in life.
00:39:18.160 Duh.
00:39:18.960 So how would you make equity?
00:39:21.600 If equity is what the Democrats want, what would be the way to get there?
00:39:26.000 If you know the people who didn't go to college don't do as well as the people who went to
00:39:31.100 college, it's simple: you make it so that everybody can get into college, regardless of their
00:39:36.200 qualifications and regardless of whether they have money to pay.
00:39:40.360 That's it.
00:39:41.000 That's equity.
00:39:41.520 And then you find out that the real thing about college was the selection bias, not what the
00:39:48.040 college taught you. In many ways, colleges only
00:39:53.840 existed as a way to certify that these were the smart people.
00:39:58.240 They didn't make them smart.
00:40:00.280 I mean, they may have trained them in some skills, but they didn't make them smart.
00:40:03.660 The being smart had to be there in the first place.
00:40:05.880 And then the smart people always find a way.
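The selection-bias point can be illustrated with a toy simulation. The numbers are hypothetical, and "ability" and "outcome" are stand-ins for illustration: even when college adds literally zero value in the simulation, graduates still outperform on average, because admissions selected the already-capable.

```python
# Toy simulation of selection bias: even if college adds zero value,
# graduates will outperform on average because admissions selected
# the already-capable. All numbers invented for illustration.
import random

random.seed(1)

# Underlying ability for 10,000 hypothetical people.
people = [random.gauss(100, 15) for _ in range(10_000)]

# The top half by ability goes to college.
cutoff = sorted(people)[len(people) // 2]
grads = [a for a in people if a >= cutoff]
nongrads = [a for a in people if a < cutoff]

# "Life outcome" depends only on ability; college adds nothing here.
outcome = lambda ability: ability * 1000
avg = lambda xs: sum(outcome(a) for a in xs) / len(xs)

print(f"average outcome, graduates:     {avg(grads):,.0f}")
print(f"average outcome, non-graduates: {avg(nongrads):,.0f}")
# Graduates score much higher even though college had zero causal effect:
# the gap comes entirely from who got selected, not from what was taught.
```

Mistaking that correlation for a causal effect of college is the analysis failure being described here.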
00:40:09.260 So this seems to be another example.
00:40:14.680 And then also the Democrats got rid of honors classes because they found out that people
00:40:20.240 who went to honors classes did better in life and got into college.
00:40:24.100 And that was an unfair advantage.
00:40:25.880 So how do you fix that?
00:40:27.040 You get rid of the honors classes, and then you no longer have that problem.
00:40:30.900 So if you look at education, would it be fair to say that's the reason it's so messed up,
00:40:37.060 and people have a gigantic, crushing college loan debt, and all they got from it
00:40:44.500 was a degree in something useless?
00:40:47.760 All of these things are the same problem, not understanding how anything works, not understanding
00:40:54.160 cause and effect, not understanding what's the real cause and what's just a correlation.
00:40:58.960 It's all the same.
00:41:00.400 The reason that Democrats don't understand climate change is they don't understand cause
00:41:05.880 and effect, how money changes incentives, and that people are basically liars.
00:41:10.880 And they think that college is just a matter of getting rid of the racism so that everybody
00:41:15.080 can get in.
00:41:17.320 That was so not the problem, not even close, a complete analysis failure.
00:41:24.700 So I would say that you could find this consistently that the people on the left, the woke left,
00:41:31.620 don't understand cause and effect.
00:41:34.080 It really is that simple.
00:41:35.660 They don't know how money and incentives affect anything.
00:42:09.240 Well, here's some good news.
00:42:11.180 President Trump has picked Bill Pulte for director of the Federal Housing Finance Agency.
00:42:17.420 So that would be in charge of Freddie and Fannie.
00:42:22.520 So those would be two big programs that the government runs for making loans.
00:42:27.000 Now, here's what's good, great really, about having Bill Pulte.
00:42:32.620 The rumored, and I believe it's true, but the rumored goal of the Trump administration
00:42:38.380 is to do away with government control of these two entities that used to not be government controlled.
00:42:45.420 Now, if you want to get rid of government control over these really big dollar, gigantic dollar
00:42:53.740 things, who are you going to hire?
00:42:56.480 Somebody who needs that job?
00:42:59.920 Or somebody who definitely doesn't need the job, but is such a patriot, is willing to take
00:43:05.380 the pain to get something done?
00:43:07.600 That's Pulte.
00:43:08.160 Pulte is one of the most honest, effective people you'll ever know in your life.
00:43:13.440 I know him personally, so I can speak from authority on this.
00:43:17.940 This is another genius pick.
00:43:20.820 Trump is just nailing these picks.
00:43:24.300 I have to say I was cautious on Hegseth, because I thought, hmm, you know,
00:43:30.760 is it too much emphasis on his TV qualities?
00:43:34.100 But I think he brought the goods.
00:43:36.020 I think he showed us what I wanted to see.
00:43:40.380 Pulte is the best pick for somebody to transfer control from the government to the private sector
00:43:46.760 and then move on to another challenge.
00:43:49.480 He doesn't need the job.
00:43:51.340 Trust me, he doesn't need the job.
00:43:52.600 So, this is just brilliant.
00:43:56.400 And all credit to Bill Pulte for being willing to take on what looks like a tough, tough job
00:44:03.480 that we'd all be better off if he succeeds.
00:44:07.140 So, good luck.
00:44:08.320 Good luck there.
00:44:10.720 All right.
00:44:11.380 Trump picked a head of Secret Service, which is the head of his current Secret Service detail,
00:44:17.920 Sean Curran.
00:44:18.820 Now, my first question was, how common is it that you would be promoted from the head
00:44:25.820 of a detail, even a presidential detail, directly to the top of the entire Secret Service?
00:44:32.340 Has that happened before?
00:44:34.580 Or my Dilbert filter on this, you know, the way most things work, is that you would go
00:44:40.120 from the head of a detail to maybe, you know, third in command of the entire network, and
00:44:46.460 then maybe you could work yourself up to the top.
00:44:49.360 Has this ever happened, that you went from the head of a detail to the head of the whole
00:44:53.460 thing?
00:44:54.580 Maybe it has.
00:44:55.420 It's entirely possible that that's more normal than I think.
00:44:59.720 But here's what it hints at.
00:45:02.160 What it hints at is that Trump doesn't trust the Secret Service.
00:45:07.780 And the one he does trust is the person who jumped on his body when the bullets were flying.
00:45:13.160 The only person he could trust to literally have his best interests in mind is the person
00:45:21.080 who jumped on him when bullets were flying.
00:45:24.460 And I got to say that, you know, sometimes Trump gets a hit for putting loyalty over, you
00:45:31.200 know, maybe experience sometimes.
00:45:32.700 But this is a place where loyalty over experience, and the guy has plenty of experience, right?
00:45:39.180 He has all the experience in the world of what a presidential detail should be doing.
00:45:43.240 And apparently he's been advocating hard for more resources for the presidential detail
00:45:49.480 and was getting blocked.
00:45:52.440 So he seems to have all the right stuff.
00:45:55.600 So I'm plus on this.
00:45:58.740 But I think the genius of it is for Trump to feel comfortable that the person in charge
00:46:03.960 really, really wants to keep him alive.
00:46:06.400 And I'm not sure he would have thought that before.
00:46:10.000 So this is a good change.
00:46:13.140 Kevin O'Leary was on CNN.
00:46:15.160 And as one of his rhetorical flourishes with the Democrats he was arguing with, he said,
00:46:21.780 everyone is wondering how Joe Biden got rich.
00:46:25.220 How did he get rich?
00:46:26.700 And he asked that several times while they were yammering at him.
00:46:29.980 And he just said, how did he get rich?
00:46:32.900 Blah, blah, blah, blah, blah, blah, blah.
00:46:34.000 Yeah, but how did Joe Biden get rich?
00:46:37.420 Now, I asked the search engine how rich he is.
00:46:42.600 And the answer is something like $10 million net worth.
00:46:46.760 Now, could he have that net worth from his book sales and his speaking that he did in
00:46:55.280 between when he held office?
00:46:57.360 Because he was paid a lot for a book, probably.
00:46:59.400 Probably paid a lot for his speaking.
00:47:01.160 And sometimes we think that's the way that politicians get bribed.
00:47:07.500 You know, if you do this now, I'll make sure that you get this big book deal later.
00:47:11.920 That sort of thing.
00:47:14.300 But I hate to tell you that $10 million is not as rich as it sounds.
00:47:20.300 It does sound sketchy to me.
00:47:22.020 You know, I don't think that anybody else who had his pay and a book deal and a little
00:47:26.560 speaking would be at that value.
00:47:29.460 But most of the value is in his two homes.
00:47:32.560 So probably half or more is that he has two homes.
00:47:35.840 Now, the question is, how do you get these two homes?
00:47:40.200 Because they're nice ones.
00:47:42.280 I don't know.
00:47:42.900 So $10 million doesn't seem like LBJ level of corruption.
00:47:49.700 You know, that was sort of off the chart.
00:47:51.680 But I don't think he could get there without something sketchy going on.
00:47:56.020 So that's a good question.
00:48:00.460 So, you know, that Biden and his final speech there, his goodbye speech, he warned against
00:48:05.580 the formation of tech oligarchies, which I think is just aimed at Elon Musk and maybe
00:48:13.180 the fact that Zuckerberg's a, you know, less pro-Democrat lately.
00:48:19.000 But what I loved about this, it was the final act of projection.
00:48:23.740 You know, we keep saying that the real problem of the Democrats is a mental health problem,
00:48:27.920 some kind of weird thing where they do projection.
00:48:31.320 Everything that they're doing, they just blame on the other side.
00:48:36.140 So what were they doing for years?
00:48:38.000 They were using their oligarch Soros and Reid Hoffman and other rich people who might be
00:48:43.660 in jail to fund them.
00:48:46.840 And that was OK.
00:48:49.280 But as soon as some rich people said, wait a minute, the fine people hoax isn't real?
00:48:56.900 Then some of his best backers, and I mean Elon Musk was backing him before,
00:49:01.320 have decided to go the other direction.
00:49:08.780 So this is his final act of projection.
00:49:11.660 The guy who benefited entirely from the billionaires saying, you better watch out for billionaires.
00:49:18.580 It's kind of perfect, isn't it?
00:49:20.900 That his final act was incompetent, laughable, almost ridiculous persuasion.
00:49:28.740 All right.
00:49:35.200 So Biden has said that his problem was he spent too much time on policy and then not enough on politics.
00:49:44.600 OK.
00:49:45.580 The gall it takes to say that your problem with your reputation was that you spent too much time on policy.
00:49:55.360 That sounds like when you go in for the job interview and you get that standard question.
00:50:00.900 It's like, all right, you know, are there any flaws about you that I should know about?
00:50:06.060 Well, sometimes I like to work too hard.
00:50:11.000 Yeah.
00:50:11.920 Sometimes I focus on the policy too hard.
00:50:15.120 It's the same damn thing.
00:50:16.660 Ridiculous.
00:50:17.140 And then, of course, MSNBC has been defending Biden's mental acuity right up till today.
00:50:25.880 So even Morning Joe is still saying, you know, he gets a word wrong here and there, but I spend time with him and he's fine.
00:50:34.200 He's fine.
00:50:35.520 So their entire brand, MSNBC, is that Biden's fine.
00:50:39.880 He's always been fine.
00:50:40.600 He's still fine.
00:50:41.340 I don't know what you're talking about.
00:50:42.240 So Lawrence O'Donnell from MSNBC gets to sit down with Biden and from the network that defended Biden's mental acuity for years.
00:50:52.580 And Biden says this, quote, you've heard Barack get mad at me when I was a kid.
00:51:02.300 What?
00:51:04.340 Barack got mad at him when he was a kid?
00:51:07.080 Now, I suppose he meant when I was vice president.
00:51:13.120 But when I was a kid isn't really that close to when I was vice president.
00:51:20.520 That's a brain fart.
00:51:21.960 And I get, by the way, forgetting a name, something I do all the time.
00:51:26.000 It's not that unusual.
00:51:26.780 But this is kind of embarrassing if you're one of the hosts of the Everything's Fine with Biden's Brain network and he gives you: you've heard Barack get mad at me when I was a kid.
00:51:40.340 Meanwhile, then, Harris was, she was signing her little desk in the vice president's office and talking in public.
00:51:48.440 I guess it was her goodbye, sort of.
00:51:49.980 And the big question was, is she drunk?
00:51:54.900 So.
00:51:57.220 I would like to do my impression of Kamala Harris in the news, not knowing if she's stupid or drunk, because it was so much fun trying to figure out, OK, is that stupid or is that drunk?
00:52:08.720 To me, she looked drunk in the little episode yesterday.
00:52:12.940 But if it wasn't drunk, oh, my God, it's worse.
00:52:17.800 You hope it's drunk, because if that's the way her real brain is working, oh, we've got a problem.
00:52:23.580 So I'd like to give you my impression and see if you think this is drunk or stupid.
00:52:29.200 First of all, hands were involved.
00:52:31.640 So I'll do the hand stuff.
00:52:35.960 My work is not done.
00:52:39.140 We'll be working long hours.
00:52:40.660 It's not my nature to go quietly into the night.
00:52:48.900 Now, did that sound drunk or stupid?
00:52:53.840 Because I think I nailed it.
00:52:55.660 That was right on.
00:53:18.840 Biden's commuting the sentences of 2,500 people convicted of nonviolent drug offenses.
00:53:29.720 I'm roughly okay with that.
00:53:31.720 But I'd love to know if any of them were fentanyl dealers, because if they were, I'm not in favor of that.
00:53:39.900 The FBI shut down its DEI office just before Trump comes into office and tells them to shut it down.
00:53:48.180 But of course, they didn't really shut it down.
00:53:50.220 They just hid all the parts in other departments so that they don't have to call it DEI, but they can still do DEI.
00:53:56.300 Trump has demanded that the FBI retain all its records about their so-called DEI office.
00:54:01.980 And if they get rid of their records, that seems like that would be a bad thing.
00:54:11.260 Meanwhile, scientists, according to NoRidge, scientists have developed nanoparticles that target and remove plaque from inside the arteries.
00:54:19.820 But here's the exciting part.
00:54:21.020 So they figured out a way to get the nanoparticle to get inside a cell and reprogram it.
00:54:29.040 So the cells were the macrophages or whatever, the things that eat stuff.
00:54:36.380 And they can just goose them so that they do a better job of eating the things that you want them to eat, like the harmful plaque.
00:54:43.000 But what strikes me about this is this sounds exactly like a software upgrade to your body.
00:54:49.000 If you can reprogram through a nanoparticle that can get into a cell, and you can reprogram the function of the cell, I mean, it sounds dangerous.
00:55:01.540 But if they figured out the safety part, they can do a software upgrade to your cells.
00:55:11.300 This looks like a real thing.
00:55:12.920 Now, it's not rolled out yet.
00:55:14.380 But in the lab, they're upgrading your cells.
00:55:19.000 That's pretty amazing.
00:55:23.120 SciPost talks about a new study.
00:55:26.500 Eric Nolan says that there's an oral brain axis.
00:55:31.440 Okay.
00:55:31.960 This isn't as exciting as it sounds at first.
00:55:35.620 New research uncovers surprising links between the bacteria in your mouth and mental health symptoms.
00:55:41.860 So they discovered that people who have mental health symptoms have different mouth chemistry than people who don't have mental health problems.
00:55:52.360 I'm just going to say that one of those groups eats glue and the other doesn't.
00:55:57.500 So I'm not sure the cause and the effect is exactly what they think, which makes me suspect the scientists are Democrats because there's one group that can't get cause and effect right.
00:56:08.460 So, all right, that's all I got for you.
00:56:13.020 Stock market's up.
00:56:16.200 We're going into the weekend.
00:56:19.100 So it's Sunday that's the inauguration, right?
00:56:22.020 I think we have to talk about the security of the inauguration day.
00:56:29.040 How in the world could you protect that?
00:56:33.680 It seems to me if a foreign power wanted to do something, they'd have the ability.
00:56:39.980 I don't know if a hobbyist could do it, like just some lone drone gunman or something.
00:56:47.960 But they could get close.
00:56:49.500 I mean, I don't know how good the anti-drone detection is and repulsion.
00:56:56.700 Yeah, I'm seeing the comments that Joe Biden's $10 million is not what it used to be.
00:57:02.640 Yeah.
00:57:05.480 Not what it used to be.
00:57:06.580 Yeah, you could get there being very frugal.
00:57:12.340 Yeah, I don't think that's how it happened.
00:57:14.020 Now, there were two incomes for a long time.
00:57:17.140 Two incomes living in a relatively less expensive state.
00:57:23.320 Maybe you could get there.
00:57:25.700 But I doubt that's how it happened.
00:57:27.560 Yeah, there's a photo of Daniel Penny taking a subway again.
00:57:33.720 I tell you, however brave you think Daniel Penny is or was, he's probably braver than you think.
00:57:41.640 Because he's just back on the subway.
00:57:43.480 And I love the fact that I don't think anybody's going to want to mess with him.
00:57:46.960 He would be the one person you wouldn't want to start a fight with.
00:57:49.480 I mean, you should never start a fight with somebody who is a young ex-Marine.
00:57:54.100 Because, again, the ex part you can forget about.
00:57:59.240 He looks like he's about 6'4".
00:58:02.560 How tall is he?
00:58:04.320 He's a big guy, right?
00:58:05.400 He looks like he's 6'4", young, and has been trained to tear people apart.
00:58:10.180 And he's already killed one person, although I think that was more about the person than the act.
00:58:14.840 And he'd be scary.
00:58:17.260 Like, if you were a troublemaker and you looked up and you saw him, you'd be like, oh, God.
00:58:22.460 Last person you want to see on the subway.
00:58:25.920 Unless you're, unless you feel unsafe, and then he's the first person you want to see.
00:58:34.440 You know, I don't even know.
00:58:37.040 Yeah, I don't even know.
00:58:37.880 I would love to know how the black community really thinks about that.
00:58:42.040 Have they been listening to Washington Post's storytelling?
00:58:47.260 Or do they just say: if the other black passengers in the Daniel Penny subway car
00:58:54.060 expressed relief that he stopped this guy, which is what happened,
00:59:01.180 I don't know how you say that's racial.
00:59:04.040 If the other black passengers are like, yeah, I'm glad he did that.
00:59:12.100 You think everyone is drunk.
00:59:13.980 It sounds stupid at this point.
00:59:15.480 Really?
00:59:15.860 Everyone?
00:59:16.260 You mean that one person who's obviously drunk?
00:59:20.340 Are you a Democrat?
00:59:22.500 You sound like a Democrat.
00:59:24.580 You don't understand cause and effect.
00:59:27.840 All right.
00:59:29.960 Everyone with a job.
00:59:32.180 Well, that's going too far.
00:59:34.200 Yeah, I think people, at least some people respect somebody who steps in.
00:59:45.480 I would imagine there are plenty of people in the black America community who are just saying, you know, that's not my cause.
00:59:52.820 He tried to help.
00:59:53.800 That's the end of that story.
00:59:54.720 That's the end of that story.
00:59:55.720 That's what I think.
01:00:02.340 All right.
01:00:03.340 Ladies and gentlemen, that's all I got for now.
01:00:05.480 Now, if you're watching the Dilbert comic that you can only get by subscription on X or on the locals platform, scottadams.locals.com.
01:00:15.820 You would see that the fires are approaching Dilbert's house.
01:00:21.580 So Dilbert and Dogbert are looking out the window.
01:00:23.660 The fire is approaching.
01:00:24.680 It's not controlled.
01:00:25.920 And they have to decide what they're going to do.
01:00:29.740 I'll give you a spoiler.
01:00:30.880 The spoiler is Dilbert is going to stay and try to protect his house as the fire approaches.
01:00:40.860 And Sunday will be
01:00:43.500 the big conclusion of that.
01:00:45.920 So you'll find out if Dilbert dies or not.
01:00:50.820 Which could happen, you know.
01:00:53.600 If I decided to retire, I might kill him.
01:00:56.600 It's possible.
01:00:58.880 So you never know.
01:00:59.840 You better watch it on Sunday.
01:01:00.700 All right.
01:01:01.180 That's all I got for you.
01:01:02.220 I'm going to go talk to the locals people privately.
01:01:04.700 The rest of you, thanks for joining.
01:01:06.220 I'll see you next week.