Real Coffee with Scott Adams - April 21, 2023


Episode 2085 Scott Adams: Biden-Hiding Strategy 2024, Buzzfeed's Death Rattle, Hunter Laptop Update


Episode Stats

Length

56 minutes

Words per Minute

142.6

Word Count

8,067

Sentence Count

704

Misogynist Sentences

27

Hate Speech Sentences

16


Summary

The dopamine hit of the day, the thing that makes everything better: it's called the "simultaneous sip," it happens now, and it's a thing that's been around for a while, but it's getting better and better every moment.


Transcript

00:00:00.000 on YouTube.
00:00:03.020 All right, I'm going to close that window.
00:00:05.540 Boy, what a day.
00:00:10.820 Good morning, everybody, and welcome
00:00:18.380 to the highlight of civilization.
00:00:22.000 Somehow, amazingly, I think I got most of this technology to work.
00:00:26.320 It was a little bit of a struggle, I've got to say.
00:00:28.560 I've got to say.
00:00:30.160 But now that we're all here, everything's perfect and good
00:00:33.680 and getting better every moment.
00:00:36.420 And if you'd like to take this up to levels heretofore unknown,
00:00:40.100 all you need is a cup or a mug or a glass,
00:00:42.200 a tankard, a chalice, a stein, a canteen, a jug or a flask,
00:00:45.200 a vessel of any kind.
00:00:47.500 Fill it with your favorite liquid.
00:00:50.700 I like coffee.
00:00:52.620 Stop doing funny memes when I'm in the middle of this.
00:00:55.000 And join me now for the unparalleled pleasure,
00:01:03.580 the dopamine hit of the day,
00:01:04.800 the thing that makes everything better.
00:01:06.440 It's called simultaneous sip.
00:01:09.160 It happens now.
00:01:10.060 Go.
00:01:10.240 Yeah, why can't I say tankard?
00:01:16.600 For some reason I say that word tankard
00:01:18.440 and my brain goes numb.
00:01:22.080 Well, let's talk about all the news.
00:01:27.800 Are you ready for all the news?
00:01:30.100 It looks like Larry Elder has officially announced
00:01:32.400 he's running for president in 2024.
00:01:35.880 You know, here's the good news.
00:01:37.920 See if I can focus on some good news for a change.
00:01:40.960 We do have good people running for president.
00:01:43.780 Now, they won't all get elected.
00:01:46.920 But I love having, you know,
00:01:49.320 Vivek Ramaswamy in the race.
00:01:51.560 I love having Larry Elder in the race.
00:01:55.080 You know, Trump's Trump.
00:01:56.880 So he's a given and he's always fun.
00:02:00.040 I like that DeSantis is going to run.
00:02:02.020 I like that RFK Jr. is going to run.
00:02:04.920 Those are all pretty good signs.
00:02:08.140 You know, so at least there's one thing we can say,
00:02:13.960 which is there are good, qualified,
00:02:17.720 high-character people willing to run for president.
00:02:21.520 That's something.
00:02:23.360 That is something.
00:02:24.840 We do have highly qualified, younger people
00:02:27.240 willing to run for president.
00:02:29.940 So we don't know who we're going to get,
00:02:32.960 but at least we have qualified, good, smart people.
00:02:36.860 I like that.
00:02:40.780 Biden apparently is rumored to be planning
00:02:43.640 to do his announcement for president,
00:02:45.940 running for president,
00:02:47.100 by video in two weeks.
00:02:50.660 Now, can somebody give me a fact check
00:02:52.740 and a history lesson?
00:02:53.760 Has any incumbent ever announced by video before?
00:03:01.820 Isn't that sort of an interesting choice?
00:03:07.480 Hillary did?
00:03:08.820 Somebody says Hillary did.
00:03:11.760 I don't know if that's true.
00:03:13.620 But it seems to me that there's more evidence
00:03:15.660 that we don't have a real president.
00:03:16.940 We have just somebody who's, you know,
00:03:19.520 a bag of twigs and leaves.
00:03:22.660 And he's being, you know,
00:03:25.320 sort of stood up there
00:03:27.080 as an actual living, breathing person.
00:03:29.560 That's what it feels like.
00:03:31.600 But you know what's different about this time?
00:03:35.540 Let me tell you.
00:03:36.580 If you see a video of somebody
00:03:40.740 announcing for president,
00:03:42.620 how do you know AI didn't modify that video?
00:03:46.680 How do you know?
00:03:47.920 Now, we got used to the fact
00:03:49.680 that people wear makeup when they're on TV
00:03:52.280 because they don't really look like that, right?
00:03:55.260 Unfortunately, they look more like
00:03:57.040 what you're seeing with me.
00:04:00.080 So we're sort of used to the fact
00:04:02.140 that TV doesn't show you the actual person.
00:04:04.580 It's an enhanced, better version.
00:04:07.120 But couldn't AI speed things up
00:04:11.840 and edit things
00:04:12.840 and make him look a little younger
00:04:14.240 and better than he is?
00:04:16.020 And would you know the difference?
00:04:18.300 If the AI made him look,
00:04:21.540 oh, I don't know,
00:04:22.960 10 or 15% younger,
00:04:25.500 would you know it?
00:04:28.200 So I'm kind of interested to see if AI,
00:04:31.580 it's probably too early for AI
00:04:32.740 to do something that clever.
00:04:34.580 I mean, it could do it, I suppose.
00:04:37.240 But I don't think anybody's yet using it for that.
00:04:40.840 But everything you say is "later"?
00:04:43.160 It's going to be here in five minutes.
00:04:47.360 So Locals is working, by the way.
00:04:50.120 So Locals is up and working.
00:04:51.840 We've got them both working well here.
00:04:53.820 I don't know.
00:04:58.080 Is it my imagination that having Biden announce by video
00:05:03.780 in our current environment,
00:05:05.960 given that he doesn't like to show himself in public too much,
00:05:10.120 isn't that the weakest announcement you've ever heard of?
00:05:13.200 I'm trying to think,
00:05:15.580 who's ever made a weaker announcement
00:05:19.000 than a video announcement
00:05:20.840 in the context of wondering if he's capable of making an announcement?
00:05:25.040 You like that one?
00:05:31.420 Biden's bag of leaves and twigs?
00:05:34.240 It's pretty visual, isn't it?
00:05:36.740 Can't you even see the bag?
00:05:39.320 It's like you can see the bag,
00:05:40.540 and it's just full of leaves and twigs.
00:05:42.960 That's just Joe Biden.
00:05:44.580 Yeah, it's pretty good.
00:05:45.220 All right.
00:05:49.560 So Google has announced it's going to use AI
00:05:52.160 to generate customized ad campaigns.
00:05:56.440 I don't exactly know what that means yet.
00:05:59.660 But does that sound safe to you?
00:06:04.440 I'll tell you the conversation that we're going to have.
00:06:07.680 I was going to, well, here I made the same mistake.
00:06:10.340 I started to say this is the conversation we're going to have soon.
00:06:13.540 But with AI, soon doesn't mean anything anymore.
00:06:18.740 By the time you're done saying the word soon,
00:06:22.060 you get to the N in "soon," and it's already done.
00:06:26.300 So before I finish the sentence,
00:06:28.160 it's probably already doing it.
00:06:29.900 But here's my prediction.
00:06:34.420 Almost everything about AI that we care about
00:06:36.860 is going to have to do with free will.
00:06:38.340 So here is Google using AI
00:06:43.640 to improve the value of their customized ad campaigns.
00:06:48.940 Now, doesn't that mean it's going to be more manipulative?
00:06:56.280 More persuasive?
00:06:58.800 So if AI is being used to persuade,
00:07:02.620 and then let's say it works,
00:07:04.560 and the customization of their ad campaign,
00:07:06.920 we don't know what that means yet, right?
00:07:09.080 Customizing the ad campaign could mean almost anything.
00:07:12.060 But what if it works?
00:07:14.980 And they really find out what moves you?
00:07:21.060 I've told you already that Instagram,
00:07:25.660 when Instagram puts an ad in my feed,
00:07:29.140 do you know how often I buy that?
00:07:30.460 I don't know what Instagram magic they're using,
00:07:37.620 but when Instagram puts an advertisement in my feed,
00:07:41.140 I almost salivate.
00:07:44.000 I'll just be reading through.
00:07:45.280 It's like,
00:07:46.080 a new putter?
00:07:50.020 A new putter?
00:07:52.520 It can correct your putt?
00:07:55.160 If you miss the middle of the...
00:07:58.040 What?
00:08:02.120 Must have a putter.
00:08:04.540 Do you know how many times I've been absolutely...
00:08:07.880 Like, my brain has just been taken over?
00:08:10.340 Now, the first time I see it,
00:08:12.360 like, maybe I'm busy,
00:08:14.320 but by the fifth time I see that ad for that special putter,
00:08:17.920 that's the best putter you've ever used,
00:08:20.020 and it's never been better,
00:08:21.080 I'm buying the putter.
00:08:23.900 I can't even stop my hands from moving.
00:08:27.720 No, I don't golf.
00:08:29.340 This is a true story.
00:08:31.420 I don't golf.
00:08:32.900 I just bought a putter.
00:08:34.760 Because I couldn't resist it.
00:08:37.660 I was like,
00:08:38.360 how did they make the putter?
00:08:40.060 How did they make a putter that's better than all putters?
00:08:42.760 I must hold this in my hand.
00:08:44.580 I've got to validate this claim.
00:08:46.520 If I could get a putter
00:08:47.660 that would make me a good putter
00:08:49.320 without any actual practice in golf,
00:08:51.800 maybe I would take up golf.
00:08:53.920 Now, I do...
00:08:54.760 I have golfed.
00:08:56.540 It's just I haven't golfed in years.
00:08:58.600 So it's not like I'm a golfer.
00:09:00.500 But I bought a putter.
00:09:02.460 And you know what the best part of the story is?
00:09:05.360 Here's the best part of the story.
00:09:07.780 It actually is the best putter I've ever touched.
00:09:11.280 It's called the Pyramid.
00:09:13.260 I think that's the right thing.
00:09:14.240 The Pyramid Putter.
00:09:15.760 Maybe you've seen it in the ads.
00:09:17.120 And I was not prepared for it to be better than other putters.
00:09:22.280 Because I have a putting green in my house.
00:09:24.960 So I have a whole row of different putters.
00:09:28.180 So I'm actually somebody who's tested
00:09:30.560 a whole bunch of different putters just this year.
00:09:34.620 And damn it,
00:09:35.940 it's actually better than all putters.
00:09:38.400 I didn't think it was possible.
00:09:40.260 It actually delivered.
00:09:41.700 And I've had a few other experiences like that with...
00:09:45.020 Oh, I also bought...
00:09:46.480 Have you seen that little device?
00:09:47.980 I forget what it's called.
00:09:49.680 It's like a little handheld device
00:09:52.480 that adds air to your tires.
00:09:54.940 For different vehicles.
00:09:56.280 Have you seen the ads for that?
00:09:58.720 And I saw that thing after like 50 times.
00:10:01.040 I saw it like...
00:10:02.740 I do own a bicycle.
00:10:05.460 An e-bike.
00:10:06.620 I do have a car.
00:10:07.860 They have tires.
00:10:08.560 I must have this tire inflation device.
00:10:12.560 And so I bought it.
00:10:14.840 Like, yeah, like this is going to be the one
00:10:17.100 best tire inflation device in the world.
00:10:20.680 No, it just hypnotized me.
00:10:22.880 But here's the weird part.
00:10:25.060 It's actually the best tire inflation thing I've ever...
00:10:28.100 Because I like to buy almost all tire inflation things.
00:10:31.440 Because I'm always looking for the good one.
00:10:33.400 Yeah, it's cordless.
00:10:34.320 And it actually is the best tire inflator I've ever seen.
00:10:39.460 You just snap it on and walk away.
00:10:41.600 And it fills up your tire.
00:10:45.440 Anyway.
00:10:46.640 What happens if we lose our sense that we have free will?
00:10:50.360 Because I absolutely...
00:10:52.320 I feel my free will just...
00:10:55.700 You know, the illusion of free will
00:10:56.980 just disappearing when I see Instagram ads.
00:11:00.500 I just salivate over them.
00:11:01.660 I don't know what they're doing.
00:11:02.520 That they know me better than I know myself.
00:11:07.540 All right.
00:11:08.180 So there's going to be more on that.
00:11:10.640 I saw a tweet by Ollie London.
00:11:14.880 Who...
00:11:15.520 I think this is true.
00:11:17.340 All right.
00:11:17.680 Now this is a story you have to put a little bit of like
00:11:20.080 grain of...
00:11:21.340 Grain of sand or salt or whatever that is.
00:11:24.020 So don't 100% trust that this story is true.
00:11:27.460 Because it's too on the nose.
00:11:30.200 Wait till you hear how on the nose this is.
00:11:33.260 But...
00:11:33.780 Because it's fun.
00:11:35.640 I'm going to treat it like it's true.
00:11:38.740 There's a museum in Aarhus...
00:11:41.360 "Our house"...
00:11:42.620 That's how you pronounce it.
00:11:43.660 It's in Denmark.
00:11:45.040 Not our house.
00:11:46.500 But Aarhus.
00:11:48.080 It has been renamed...
00:11:49.620 So it used to be the Women's Museum.
00:11:51.280 Now this can't be true.
00:11:56.200 Has this been debunked already?
00:11:58.520 This can't possibly be true.
00:12:00.700 All right.
00:12:01.040 I'm going to read it.
00:12:01.820 But just know that I don't believe it's true.
00:12:03.980 That they renamed the Women's Museum to the Gender Museum in Denmark.
00:12:10.000 And they've created a statue of a man breastfeeding a baby.
00:12:14.040 As a symbol of the, quote, hybrid of masculine and feminine.
00:12:19.880 So they show a picture of this, you know, muscular male statue that's naked.
00:12:24.800 And it's got full genitalia.
00:12:27.760 And it's got breasts as well.
00:12:29.580 And it's...
00:12:30.800 It's breastfeeding the baby.
00:12:35.040 And, you know, I thought to myself, you know, we're reaching a point where you can choose
00:12:43.480 your own body type from sort of a buffet of choices after you're born.
00:12:49.020 Now, in some cases, before you're born.
00:12:51.260 Because they'll be able to, you know, maybe correct some genetic problems while you're in
00:12:55.280 the womb.
00:12:55.860 That's coming.
00:12:57.000 But after you're born, you can still use surgery and chemicals to modify your body.
00:13:01.940 And I wondered what other variants we would get.
00:13:07.740 Because I was thinking about this.
00:13:09.480 Like, suppose you could just pick your body parts.
00:13:11.960 Because you can.
00:13:13.380 Breasts?
00:13:14.040 Yes or no.
00:13:16.040 You know, penis?
00:13:17.180 Yes or no.
00:13:18.420 And I was thinking, I don't know, it's just because I'm an artist, I'd like to do something
00:13:22.820 different.
00:13:23.220 But I was thinking about getting one breast.
00:13:27.440 Just keep the other one normal.
00:13:29.640 Maybe just one breast.
00:13:30.740 But also, I'd like to get rid of my penis by keeping my balls.
00:13:37.920 Is that an option?
00:13:39.640 I just want to, I don't want to be exactly like everybody else.
00:13:43.080 You know, like if you get a tattoo, you don't want exactly the same, you know, tramp stamp
00:13:47.820 as everybody else.
00:13:49.240 You know, if you're going to modify your body, you don't want it to look just like everybody
00:13:52.620 else.
00:13:53.060 So I'm thinking balls and just one breast.
00:13:58.380 That would work for me.
00:14:02.340 All right.
00:14:03.940 In related news.
00:14:07.840 Okay.
00:14:08.380 Is this one real?
00:14:09.440 I don't think any of the news is real today.
00:14:12.220 You tell me if this is real.
00:14:14.520 I don't believe it's real.
00:14:16.520 All right.
00:14:16.920 You ready?
00:14:17.240 Pete Buttigieg has proposed a $20 million budget to create female crash dummies because
00:14:31.900 they don't have female crash dummies.
00:14:34.420 Is that real?
00:14:35.640 But, but here's the thing. Imagine they just got in an order of new female crash dummies and
00:14:49.700 they're all Lia Thomas.
00:15:01.260 We should have specified.
00:15:02.560 We should have been more specific. We should have put more detail in
00:15:09.340 the RFP, because we got $20 million worth of female crash dummies, but they're
00:15:16.440 all based on Lia Thomas.
00:15:22.200 That would be too much fun.
00:15:24.180 All right.
00:15:24.460 Here's what I think happened.
00:15:26.940 All right.
00:15:27.300 I think, I think it's partially true.
00:15:29.920 I'm not positive, but I think what they're trying to do is make sure that they had crash
00:15:35.560 dummies that were smaller size, right?
00:15:39.080 Because if you get a test, a big crash dummy, maybe that one would survive a crash.
00:15:43.880 But if you had a hundred-pound dummy, you know, one that's more like a woman,
00:15:49.820 you know, would that survive?
00:15:51.560 So it does, it makes perfect sense.
00:15:53.420 It makes perfect sense that they should have crash dummies of all sizes, but why
00:15:59.840 would you call them female?
00:16:02.280 In today's world, why can't you just say we need some small ones and some big ones?
00:16:07.860 Do you really need a female crash dummy?
00:16:10.060 Are they going to put a womb in it?
00:16:12.480 Are they going to put in, like, a pregnant crash test dummy?
00:16:16.120 Like, maybe there's a reason, but it just sounds funny in today's world.
00:16:21.580 They, oh, that is perfect.
00:16:28.140 And then I also wonder what about gay crash test dummies?
00:16:33.020 Um, we haven't tested that.
00:16:36.560 And then I guess the obvious question is, if this were the Trump administration, just
00:16:46.980 do this, uh, thought experiment in your mind.
00:16:51.380 Imagine, um, Pete Buttigieg being the head of transportation in the Trump administration.
00:16:58.220 I just want to paint this scene in your head.
00:17:00.920 You could use AI to create the movie of the scene if you need it.
00:17:04.160 But Pete Buttigieg walks into the White House and proposes his plan for $20 million for
00:17:10.740 female crash dummies to Trump.
00:17:14.700 And Trump's sitting behind the desk in the Oval Office listening to the proposal for female crash dummies.
00:17:19.660 Just hold that in your head.
00:17:22.000 And then what does Trump say?
00:17:23.600 I'm just going to give you an idea of what he might say.
00:17:27.520 Uh, Pete, why don't you just slap a dress on those dummies and call it a day?
00:17:34.160 I feel like, I feel like, I feel like Trump would just say, put a dress on them, save $20 million.
00:17:43.600 Now, I realized that there was a functional reason to do it, that they need smaller ones.
00:17:48.900 And that probably makes perfect sense.
00:17:50.780 But does it really cost you $20 million?
00:17:53.080 How many crash dummies are you buying?
00:17:55.740 Do you have to buy them by the million?
00:17:57.500 Like, who is it who came up with the idea that making it a little bit smaller would cost $20 million?
00:18:04.560 I don't know.
00:18:05.200 Everything about this story is funny.
00:18:07.980 All right.
00:18:08.460 I've made a decision I'd like to share with you.
00:18:11.700 Here's a decision I've made.
00:18:13.060 If I get into any conversation, social media or otherwise, in which the person I'm talking to uses equity as part of their argument or opinion,
00:18:26.740 I'm going to call it out as a racist dog whistle.
00:18:30.700 Because it is.
00:18:32.800 Equity is a racist dog whistle.
00:18:34.520 And I'm not going to have a conversation with somebody who uses that word.
00:18:41.880 I'm going to label it and I'm going to walk away.
00:18:44.940 Because there is no use having a conversation about equity when it's just a racist concept.
00:18:51.900 And I thought we were past racism.
00:18:55.500 I want to live in a country with less racism.
00:18:57.940 So I'm just going to label it and walk away.
00:18:59.640 So, equity, racist dog whistle.
00:19:05.980 There's something I can say that you can't.
00:19:09.120 How do you like your lack of free speech?
00:19:12.200 Do you wonder what it's like to be able to say whatever you're thinking?
00:19:15.300 It's awesome.
00:19:16.440 It's really awesome.
00:19:18.340 I can go right in public and say what I actually think.
00:19:22.280 I didn't have to shade that at all.
00:19:25.100 That was just my actual opinion.
00:19:27.580 Try that at work.
00:19:28.460 See how long you keep your job.
00:19:32.940 All right.
00:19:34.660 BuzzFeed is laying off 15% of its staff as part of its, let's see, the story says,
00:19:40.340 rotting from the inside.
00:19:42.360 Rotting from the inside.
00:19:43.260 Okay, the story doesn't say that.
00:19:45.160 But it feels like that's what's happening.
00:19:47.460 And it looks like they're making plans to end their BuzzFeed news.
00:19:51.500 Huh.
00:19:53.140 Huh.
00:19:54.780 Well, how about that?
00:19:57.120 How about that?
00:19:58.460 Are you aware of some of the fake news that BuzzFeed has published about me?
00:20:06.540 Just me.
00:20:07.500 Just one person.
00:20:09.000 I'm just one person in the world.
00:20:10.580 Do you know how much fake news they've printed about me?
00:20:14.900 Oh, yeah.
00:20:15.600 If you don't mind, I'd like to take a moment to dance on their graves.
00:20:25.680 Yeah, BuzzFeed is an evil enterprise.
00:20:30.640 They are evil people, terrible people.
00:20:32.700 I mean, just the worst of people.
00:20:34.940 You know, human beings don't get worse than people working at BuzzFeed.
00:20:39.000 So I'm very glad that one more source of fake news and disreputable behavior is possibly going out of business.
00:20:51.260 So it looks like they're going to focus on their Huffington Post business.
00:20:58.800 So that's not really an improvement, is it?
00:21:01.900 Okay.
00:21:02.780 How many of you are aware that the story about Justice Clarence Thomas and the billionaire who bought some property that was adjacent to his and stuff,
00:21:15.820 how many of you think that that's a real story, like that's real news?
00:21:20.400 Do you think that's real news?
00:21:22.400 Nope.
00:21:23.540 Turns out it was fake news.
00:21:24.540 Now, there's a tiny kernel of truth, but not enough to make it real.
00:21:31.160 Okay.
00:21:31.500 So there's a tiny little real thing.
00:21:33.460 Here's the tiny little real thing.
00:21:35.460 There was a financial transaction.
00:21:38.440 Some property was bought.
00:21:41.980 Justice Thomas lost money on it.
00:21:45.380 In other words, he sold it for lower than, I guess, its book value or something.
00:21:50.120 So there was no taxable event.
00:21:53.040 Nothing was taxable.
00:21:54.540 He owed no extra taxes.
00:21:56.660 He didn't make any money.
00:21:58.480 I think it was $1,000 involved.
00:22:01.280 $1,000.
00:22:03.060 Did you hear that?
00:22:04.620 That's it.
00:22:05.340 $1,000.
00:22:07.000 That was the extent of it.
00:22:09.060 And the only thing he needs to do to correct it, the only thing he needs to do is file an amended thing and just add it to the list.
00:22:17.180 That's the remedy.
00:22:19.380 Now, how do we know that this is fake news?
00:22:22.840 Well, thanks to the great work of James Taranto at the Wall Street Journal.
00:22:30.780 So the Wall Street Journal, to its credit, is running significant pieces in which ProPublica, the publication that originally made these charges, is just ripped apart.
00:22:43.440 Just ripped apart.
00:22:44.940 And you know what?
00:22:46.060 ProPublica doesn't even respond to the Wall Street Journal's debunking.
00:22:52.780 It doesn't disagree with the debunking, nor does it, you know, reinforce that what it said was true.
00:23:00.520 It just sort of gave up and slunk away.
00:23:03.100 So Taranto just put the hammer on them.
00:23:06.340 He just dropped the hammer.
00:23:08.080 And they just slunk away.
00:23:11.320 Slinked?
00:23:12.460 Is that even a word?
00:23:13.380 Can you use slunk?
00:23:14.180 They slunk away?
00:23:15.600 I don't know.
00:23:16.280 Sounds like a word.
00:23:16.980 So great work, James Taranto.
00:23:23.520 I'm clapping for you in my mind.
00:23:26.560 So I appreciate that there was a public figure who was the subject of really, really despicable fake news.
00:23:34.800 And that another part of the news world fixed it.
00:23:39.200 So that's the way it's supposed to work.
00:23:41.000 Yeah, haunted.
00:23:42.080 Yeah, Thomas was haunted.
00:23:44.440 You're right.
00:23:45.840 All right, big announcement.
00:23:47.260 You've been wondering, what am I going to do with my book, Reframe Your Brain, that was canceled before it was published?
00:23:54.440 Most of you know I got canceled.
00:23:56.820 So my book was in limbo.
00:23:58.680 I got the rights back.
00:24:00.340 And I'm letting you know today that I'm working with Joshua Lisec.
00:24:05.100 And he is helping me edit the book.
00:24:07.720 And he's amazing.
00:24:09.320 Best editor I've ever worked with.
00:24:12.500 He's actually editing things that another editor had already seen.
00:24:15.980 And it's a lot better when he's done with it.
00:24:19.000 So we're working on that together.
00:24:21.240 He's going to help me.
00:24:22.680 He used the word, quote, ghost publishing.
00:24:26.320 So the writing is mine.
00:24:28.100 Most of his business is ghost writing.
00:24:30.520 But the writing is mine.
00:24:33.560 He is editing it for me, with me.
00:24:36.640 And we will effectively self-publish.
00:24:41.060 But he's going to help me with that process.
00:24:43.880 So it should be available in at least a few of the places you buy books.
00:24:47.660 We'll try to get it on Amazon and wherever else it makes sense.
00:24:51.360 And bullet points, yeah.
00:24:55.400 And here's the question I have to ask.
00:25:01.220 I don't know that traditional publishing makes sense for anybody anymore.
00:25:04.900 Why would you use a traditional publisher?
00:25:08.780 It seems to me that their purpose and function has largely gone away.
00:25:15.520 So we'll see.
00:25:17.200 Now, I do think that if you go through a traditional publisher, you will sell more books.
00:25:22.040 You will.
00:25:22.480 But you're going to have a lot of overhead, and there's extra expense to that, and a burden,
00:25:29.040 and they can ask you to do things that maybe you don't want to do.
00:25:31.760 Yeah, they could pay in advance up front.
00:25:35.040 There's some things they could do.
00:25:36.960 But if you're an established author, and you already have an audience,
00:25:42.340 I'm not sure.
00:25:43.260 I think we've reached the point where it doesn't make any sense.
00:25:45.420 It was diminishing in importance over time, but it still was the smart play to use a traditional publisher.
00:25:54.360 I feel like this is the crossover year.
00:25:57.160 This feels like the year where, you know, if you have a million people on social media following you,
00:26:03.380 you probably don't need much advertising.
00:26:06.560 Because you can do more on podcasts and tweeting than you can with a book tour.
00:26:11.500 One of the biggest things that a publisher would do is help you, you know,
00:26:16.780 organize a book tour with book signings and TV appearances on Good Morning America.
00:26:22.260 Do you know how much a TV appearance on Good Morning America is worth for a publisher?
00:26:28.580 What do you think?
00:26:29.880 A good appearance, like a solid hit on Good Morning America.
00:26:33.900 How many books does that sell?
00:26:35.980 None.
00:26:37.280 Zero.
00:26:38.300 It doesn't have any marketing value at all.
00:26:40.220 I don't even know why people do it.
00:26:42.040 I used to do it, but then I would watch my Amazon number after I went on a big morning show,
00:26:48.460 and I'd be like, oh boy, watch this.
00:26:51.800 I just got off that big morning show, you know, one of the biggest shows in the country,
00:26:55.980 one of the big three networks.
00:26:57.940 Look at my numbers, no difference.
00:27:01.920 It didn't even fluctuate.
00:27:03.300 But I'd do a podcast, you know, say a Tim Ferriss podcast or a James Altucher podcast,
00:27:12.180 and you could watch your number just go, as soon as the podcast comes out.
00:27:16.300 Yeah.
00:27:16.620 So all of the old ways which the publishers used, I just don't know that they make any sense anymore.
00:27:23.420 Yeah.
00:27:23.860 Yeah, Joe Rogan.
00:27:24.840 If you go on Joe Rogan, you're going to sell some books.
00:27:26.880 That's real.
00:27:29.920 But Good Morning America, nah, no books.
00:27:34.280 All right, so watch that.
00:27:38.800 So there's a story now about this.
00:27:42.960 Well, let me echo a couple of things that Tucker Carlson said.
00:27:47.640 Being a normal consumer of the news, there are things that I don't know
00:27:54.240 that I have to wait for other people to tell me.
00:27:57.460 And one of them is, how long does it normally take to investigate anything?
00:28:02.660 I mean, I always thought, well, it takes forever to investigate anything.
00:28:06.760 But do you think it takes five years to investigate Hunter's laptop?
00:28:12.000 Do you think that makes sense?
00:28:14.060 No.
00:28:14.960 Of course not.
00:28:15.660 But it's obvious to me, fairly obvious, that it's being stalled or, you know, protected
00:28:24.360 or the government's, you know, making sure it goes slow or something.
00:28:30.340 Yeah.
00:28:30.540 So that doesn't look real to me.
00:28:32.460 That looks like something corrupt.
00:28:36.800 What about the fact that it's been, how long has it been since Epstein died in prison?
00:28:43.200 And we still can't figure out who killed Epstein?
00:28:46.960 Seriously.
00:28:48.600 Seriously.
00:28:49.900 We don't know who killed Epstein.
00:28:52.460 Of course, somebody knows.
00:28:53.980 Now, don't you think that that, too, is just a story that is just being suppressed by whoever has the power to do it?
00:29:02.140 I mean, our level of trust about anything has now reached just zero.
00:29:09.440 Yeah.
00:29:09.580 I love the theory that he's still alive, which I don't discount, by the way.
00:29:16.860 You know, how hard would it be to stage a photograph, a still photograph of him looking dead?
00:29:23.420 You know, just put some makeup on him and...
00:29:26.580 Yeah.
00:29:26.980 But I think there was an autopsy, so that's got to be harder to fake.
00:29:29.660 I think the autopsy's a little harder to fake.
00:29:35.660 All right.
00:29:36.500 Well, and now we know that there's this news story about how the 50 intel executives ended up signing off that the Russians were behind that laptop.
00:29:50.940 So I guess there was an ex-CIA.
00:29:53.960 This is, I saw it in a tweet from Mario Nawfal, whose Spaces events on Twitter are amazing.
00:30:02.260 You should try those.
00:30:03.800 Anyway, so they asked this guy, the guy who was at one point a potential CIA director under Biden, right?
00:30:12.080 So this is somebody who's pretty important in the intel world.
00:30:17.680 And he was asked to write a false Hunter Biden laptop letter to help Biden win the election.
00:30:27.540 So now we have somebody who's saying definitively that he was asked by Blinken in the Biden campaign to do a fake letter getting the intel people to say that it was Russian disinformation.
00:30:42.180 Now, we knew it wasn't Russian, we knew that the intel people were liars, and we knew that they did this, but isn't it interesting that it was somebody who was a potential CIA director who was asking them to sign the letter, which means it was under coercion, right?
00:31:06.080 It was coercion.
00:31:09.000 Now, he didn't say anything coercive, I'm sure.
00:31:11.960 But don't you think that many of the 50 people who were involved in intel knew that the guy who asked them to sign the letter might someday be the head of the CIA?
00:31:23.000 So was he ex-CIA, but also future, maybe future, CIA director? I don't know, it's a little confusing in the story.
00:31:34.800 But just, but just think about that.
00:31:41.240 Think about the fact that the Biden campaign asked somebody who other people would think might someday be their boss in the Intel world.
00:31:49.440 And he got 50 people to lie, probably because they wanted their careers to be protected.
00:31:58.180 That's horrible.
00:32:00.460 And it feels like it should be illegal.
00:32:03.640 I don't think it is.
00:32:05.380 But if somebody who might be your future boss tells you to risk your reputation, what are you going to say?
00:32:13.920 It's a tough choice.
00:32:15.580 So that's a little bit coercive.
00:32:17.060 I think that should be part of the story.
00:32:19.440 And can we conclude that the election was rigged?
00:32:25.900 Not in an election sense, or not in a vote counting sense.
00:32:29.700 But can we say that if this story had not gone this way, that Biden might not have won?
00:32:39.220 Yeah.
00:32:40.480 But let me counter my own statement.
00:32:43.720 I'm going to give you the counterpoint to my own statement.
00:32:46.280 Don't you think it's true that whatever the news coverage is a few months before any election determines who wins?
00:32:55.980 You remember the grab them by the you-know-what?
00:32:59.300 So that coincidentally drops right before the election?
00:33:02.880 That was also election interference.
00:33:06.060 The fact that that little bombshell was saved until the last minute, that's the news, or some part of the news, deciding who's going to be president.
00:33:18.200 They don't save that little surprise until right before election, unless somebody's trying to change the election dynamic.
00:33:24.680 So I would say that was election interference, but legal, probably legal.
00:33:30.760 I'm not aware of any crime that stops you from doing an October surprise, the sort of basic stuff.
00:33:38.040 But it does change, potentially, change the outcome of the election.
00:33:42.160 Somehow Trump survived that.
00:33:43.620 And I think I'm going to add some skepticism to this laptop story.
00:33:52.360 We saw polling that said that people might have voted differently, right?
00:33:57.180 Like a whole bunch of people said, oh, if I'd known about that laptop being real, I might have voted differently.
00:34:03.740 I like to call BS on that.
00:34:06.260 I don't think people would have voted differently.
00:34:07.980 If they had seen the real story about the laptop, do you know what the Biden supporters would have said?
00:34:15.640 That sounds like Russian interference.
00:34:18.640 They didn't need any, nobody needed any 50 intel people to sign anything.
00:34:23.780 But if you're that close to the election, people just don't change their mind for anything.
00:34:30.320 They're so committed at that point that they would say, ah, that laptop's probably BS.
00:34:35.700 It's probably BS from the Republicans.
00:34:38.780 Yeah.
00:34:39.340 So they would just discount it as a Republican ploy instead of, you know, falling for a Democrat ploy.
00:34:46.840 I don't think it would have changed anything.
00:34:49.360 But it might have, and that's bad enough.
00:34:53.600 Would you accept it might have?
00:34:55.960 Yeah.
00:34:56.380 You can disagree because there's plenty of room for a disagreement because there's no way to check it.
00:35:00.640 I'm just saying the way I know people to think, in all of my experience, is that they're locked in by then.
00:35:09.340 That the fact that Trump's saying he would grab them by the you-know-what didn't change anything.
00:35:15.020 I don't think it'd change any votes.
00:35:18.260 If that didn't change any votes, the laptop thing probably wouldn't either.
00:35:23.540 Right.
00:35:23.640 All right.
00:35:27.100 All right.
00:35:29.700 Over in Tel Aviv, at a place called the Technion, I think, engineers have built a hybrid micro-robot the size of a single cell,
00:35:39.340 which can be put into the body and navigate within the body, and then this little robot can be controlled by AI.
00:35:50.940 So are you afraid yet?
00:35:54.280 They're going to put tiny robots in you controlled by AI.
00:35:58.280 Now, I don't know what that means, but everything about it sounds scary.
00:36:03.020 Tiny robots scare me, first of all.
00:36:05.340 I hate tiny robots.
00:36:06.540 Secondly, this is the kind of story that's always overblown.
00:36:11.920 Maybe not this time.
00:36:13.120 Maybe not.
00:36:14.220 But what do you think the robot can actually do?
00:36:17.700 Like, when you hear that story, do you see, like, a little robot walking through your body?
00:36:21.580 He's got arms and legs and a little head.
00:36:23.920 No.
00:36:24.920 It's the size of one cell.
00:36:27.640 It's probably just something that could go like this.
00:36:31.740 It can probably flop on command.
00:36:33.780 Or, like, you know, maybe crawl in some direction or something.
00:36:38.400 I don't think this robot's going to do a lot of work.
00:36:41.740 You know, maybe it can carry a little payload.
00:36:44.080 You could put a little virus in there to do something and tell it to move toward an organ or something.
00:36:49.360 So maybe it could do some little stuff.
00:36:51.620 But, you know, it's not like you're going to be full of robots.
00:36:56.160 More like a submarine, yes.
00:36:58.280 Fantastic voyage.
00:36:59.080 Well, we'll see.
00:37:03.300 I believe I warned you that AI would have a fight to the death with lawyers.
00:37:10.000 So remember when you thought AI was going to take the job of lawyers?
00:37:14.900 Now the lawyers are fighting back and they're trying to ban AI.
00:37:18.220 Now lawyers are going to find a million ways to ban it and a million reasons.
00:37:24.320 But the EU is considering categorizing AI systems into four main groups: unacceptable risk, high risk, limited risk, and minimal risk.
00:37:35.180 And if you fell into the higher risk groups, that you could perhaps run afoul of the law if they make these laws.
00:37:45.480 What do you think of that?
00:37:47.000 Does that sound like a practical plan, to divide AIs by unacceptable risk, high risk, limited risk, and minimal risk?
00:37:56.980 Good luck.
00:37:58.420 Exactly.
00:37:59.700 Who gets to decide what the risk is?
00:38:01.840 And who can even predict risk with a superior intelligence?
00:38:08.960 Nobody knows what the risk is of anything.
00:38:11.580 It could be the thing that's reading bedtime stories to your child.
00:38:16.440 Could be making some decisions that you totally don't want them to make.
00:38:20.960 What's an unacceptable risk?
00:38:22.680 I don't know if you can have subjective criteria.
00:38:30.440 You can't have subjective criteria and expect it's going to work.
00:38:36.460 All right.
00:38:36.860 Now I do know that the law sometimes has subjective criteria, such as, you know, a reasonable person's standard.
00:38:44.700 You know, would a reasonable person assume this or believe this or act this way?
00:38:48.740 So, of course, there's always going to be some human judgment in it, but this seems like really, really hard to sort them into four categories of risk.
00:39:00.440 To me, they're either connected to something or they're not.
00:39:06.060 If you connect an AI to anything, like the Internet, how do you know what's going to happen?
00:39:12.360 You know, wouldn't it reprogram itself as soon as it gets access to the Internet? If it's AI and it has any autonomous abilities, which would make it AI,
00:39:24.060 I feel like just connecting to the Internet gives you an unpredictable outcome.
00:39:29.460 And that unpredictable outcome could be awful or not awful.
00:39:33.080 You just don't know.
00:39:34.920 So I feel like we're creeping toward making everything illegal just because you can't tell what's going to kill you.
00:39:40.940 So the lawyer versus AI is going to be the most interesting battle, I think.
00:39:48.180 You know, until the robots get lasers and attack us, then that will be more interesting.
00:39:54.080 All right.
00:39:56.500 In a similar vein, some comedians made a deepfake that didn't really look like Tom Brady.
00:40:04.160 When I say it didn't look like him, it was obviously a cartoonish version of Tom Brady.
00:40:08.040 And then they had this cartoonish version of Tom Brady do a stand-up comedy act in which he told off-color jokes, you know, racist jokes and stuff.
00:40:20.160 Now, Tom Brady's lawyer told them to take it down.
00:40:26.480 What do you think happened?
00:40:28.180 This is protected speech.
00:40:30.020 It's parody.
00:40:31.280 It didn't look enough like him that anybody would be fooled.
00:40:34.040 It was obviously a comical version.
00:40:36.140 What do you think happened?
00:40:38.180 Completely legal.
00:40:39.960 Everything the comedian says was 100% legal.
00:40:43.060 And then Tom Brady's lawyer says, we're going to sue you.
00:40:45.280 Take that down.
00:40:46.260 They took it down.
00:40:48.040 Absolutely.
00:40:48.440 So when you say to yourself, well, those lawyers don't have a leg to stand on, it doesn't matter.
00:40:56.180 The lawyers just have to threaten you with a risk or with a big enough threat that you don't want to deal with it.
00:41:02.740 And then you're going to say, well, I don't want to spend a year in court winning.
00:41:07.140 Nobody wants to spend a year in court winning.
00:41:10.460 Even winning.
00:41:11.420 You don't want to spend a year in court.
00:41:13.080 So they just took it down.
00:41:14.020 How much of that do you think you're going to see?
00:41:17.240 A lot.
00:41:18.820 You're going to see lawyers threaten because somebody's willing to spend more money to make you broke defending yourself even if you're going to win.
00:41:29.780 So that's going to be a thing.
00:41:33.100 Lawyers just threatening you without the benefit of the law being on their side.
00:41:37.960 Where's that go?
00:41:38.820 All right.
00:41:40.520 The Chinese have some kind of weird satellite weapon that can shut down the communication of other satellites.
00:41:47.800 It does look like they're preparing for war.
00:41:51.500 On the other hand, it could be just normal national defense.
00:41:57.280 But some scary stuff.
00:42:01.020 Yeah.
00:42:02.360 Well, keep an eye on that.
00:42:04.180 I assume that the United States also has satellite killers.
00:42:07.740 Don't you?
00:42:09.600 I imagine it wouldn't take long before a lot of satellites got swept out of space if a war started.
00:42:20.360 So are you aware that the Supreme Court is looking at this abortion pill situation?
00:42:26.060 So the lower court judge put a stay on it to block the use of it.
00:42:31.680 But that was temporarily unblocked.
00:42:35.020 The Supreme Court looked at it.
00:42:36.140 And I guess they were getting ready to decide.
00:42:39.260 Now, you tell me if I'm wrong about the following prediction.
00:42:44.220 If the Supreme Court decides to make illegal this abortion pill, you can cancel the election.
00:42:52.480 Just cancel it.
00:42:53.780 Just give all the Democrats whatever jobs they want.
00:42:57.420 That would be the end of it.
00:42:58.460 I don't see that the Republicans could win an election if the Supreme Court bans the abortion pill.
00:43:05.500 That would be over.
00:43:07.120 I don't even think I'd have fun talking about it after that.
00:43:10.500 I don't think Trump could win in that environment.
00:43:12.300 And I'm not entirely sure.
00:43:19.820 Well, I'm not even going to give you an opinion on that.
00:43:22.680 I think it's just going to be what we're going to see.
00:43:25.140 So basically, I think the whole election is going to be decided by the Supreme Court.
00:43:29.900 Don't you?
00:43:30.340 Does anybody disagree that the election will be completely determined by this ruling in the Supreme Court?
00:43:38.840 You disagree?
00:43:41.200 Yeah.
00:43:42.240 I mean, it's hard to know.
00:43:43.640 So again, there's plenty of room for disagreement.
00:43:46.540 I would respect any disagreement with my opinion.
00:43:49.700 Because I think there's plenty of room for disagreeing.
00:43:52.880 But to me, it looks like that would be the end.
00:43:55.480 I don't see a Republican could get elected into national office on that.
00:44:02.460 So we'll see what happens there.
00:44:04.320 I don't know which way that's going to go either.
00:44:06.500 Is anybody making a prediction which way the Supreme Court goes?
00:44:10.860 Because I don't know what the legal argument is exactly.
00:44:14.780 But if the legal argument is ambiguous, what would you expect?
00:44:20.720 You'd expect the Supreme Court to be conservative-leaning, right?
00:44:25.480 So it would be interesting to be a conservative in the Supreme Court.
00:44:31.500 And if you make this thing illegal,
00:44:34.140 knowing that you made it impossible for conservatives to succeed,
00:44:39.280 that would be a tough choice.
00:44:41.240 I don't know what people would do.
00:44:43.540 All right.
00:44:43.820 Twitter apparently has dropped their labeling of state-affiliated media.
00:44:49.020 And it was kind of quietly.
00:44:50.420 It just went away.
00:44:51.080 So they're not even labeling the media that's obviously other countries' state-affiliated media.
00:44:58.840 And I think this would be another example of the Musk entrepreneurial approach,
00:45:04.100 which is you just throw something up, see how people react,
00:45:09.260 get a bunch of marketing attention,
00:45:10.900 and then if you need to change it, you just change it.
00:45:14.420 So I think that's all he did.
00:45:16.180 He just threw out some provocative ideas,
00:45:18.580 got all kinds of attention for Twitter.
00:45:20.660 People were jabbering about it.
00:45:22.360 And then he saw how it worked.
00:45:24.880 And then he thought about it some more.
00:45:26.840 This is what it looks like anyway.
00:45:28.700 Thought about it some more and said, yeah, that didn't work.
00:45:31.120 So then he just stopped doing it.
00:45:32.340 If you just watch what Elon Musk does, forget what he says, just watch what he does.
00:45:42.660 It's an entire lesson in business.
00:45:45.860 Even the things he does wrong are a lesson in business because you watch how he corrects it.
00:45:51.740 It's the correction that's the lesson.
00:45:54.100 Like guessing wrong about what will work and what won't, that's sort of a nothing.
00:45:58.800 Everybody's guessing.
00:45:59.520 But what do you do when you find out you're wrong?
00:46:04.120 He just blows it up and moves on.
00:46:07.420 So I love that.
00:46:08.620 I feel we're all getting a useful education just watching.
00:46:13.900 All right, speaking of Musk and all things like that.
00:46:18.040 There's an announcement from one of the biggest battery makers.
00:46:22.300 So it's important to know this isn't a startup.
00:46:25.960 This is the biggest or one of the biggest battery makers.
00:46:28.860 And they've announced a new battery process to make a condensed battery with an energy density of 500 watt-hours per kilogram.
00:46:40.220 So that's how much energy it stores for the weight of the battery.
00:46:44.020 And apparently it doubles, doubles current best batteries.
00:46:49.960 Doubles them.
00:46:50.680 Now, for the same weight, you'll get double.
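To see why that doubling matters, here's a rough back-of-envelope sketch in Python. The 500 Wh/kg figure is from the announcement; the 250 Wh/kg baseline and the 400 kWh trip energy budget are illustrative assumptions, not numbers from the episode.

def battery_mass_kg(energy_needed_kwh, density_wh_per_kg):
    # Mass of a battery pack storing the given energy at the given energy density.
    return energy_needed_kwh * 1000 / density_wh_per_kg

TRIP_ENERGY_KWH = 400  # assumed energy budget for a short regional hop (hypothetical)

for density in (250, 500):  # assumed current-best cells vs. the announced cells
    mass = battery_mass_kg(TRIP_ENERGY_KWH, density)
    print(f"{density} Wh/kg -> {mass:,.0f} kg of battery")

That prints 1,600 kg of battery at 250 Wh/kg and 800 kg at 500 Wh/kg: same trip, half the pack weight. On an airframe, where every kilogram of battery eats into payload and range, that halving is the kind of step change that can move a design from infeasible to feasible.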
00:46:56.200 Now, you say to yourself, well, that's just a story of things we already knew of getting better.
00:47:02.040 But it's not.
00:47:04.300 Because this doubling crosses an important economics barrier.
00:47:08.940 And the economics barrier was, it wasn't practical to make electric airplanes.
00:47:16.080 And now it is.
00:47:18.180 And now it is.
00:47:20.300 Do you realize how big that is?
00:47:22.500 It wouldn't make sense to make an airplane that's not electric now.
00:47:28.040 Probably.
00:47:28.480 Now, it's going to take a long time to transition, of course.
00:47:32.180 But if you're looking at a new small plane, we'll start with small planes.
00:47:37.880 If somebody makes an electric one, why would you buy the other one?
00:47:47.260 Yeah.
00:47:48.040 It's gigantic.
00:47:49.080 This one little technological change is going to have a huge impact on transportation and maybe storage, too.
00:47:58.520 I mean, just think about if you had one of these batteries as a backup on your house.
00:48:05.260 You could get away with, well, let me make the most extreme case.
00:48:10.520 If Californians had enough of this capacity battery, it wouldn't matter if our grid was undependable.
00:48:19.080 Because if the main source of the grid failed, it could suck some of the battery power out, maybe.
00:48:27.700 Or people could just get off the grid long enough for it to recover.
00:48:31.620 You might actually find a way for the grid to become robust without fixing the grid.
00:48:38.460 I mean, the implications of just doubling this battery are almost unimaginable.
00:48:44.060 It's not an incremental change.
00:48:45.640 It could change everything.
00:48:46.660 Or not.
00:48:48.560 We'll see.
00:48:49.880 Robots.
00:48:50.420 Exactly.
00:48:51.660 How good would a robot be?
00:48:53.560 Let's say you had an autonomous robot that has AI in it, but you have to plug it in every hour.
00:49:00.200 Or it's running around on an extension cord or something.
00:49:02.720 Now you imagine that you could free it from its charging requirements for longer times.
00:49:09.140 Now you've got a serious robot, don't you?
00:49:11.740 Now you've got a robot that lifts some stuff and jumps some stuff.
00:49:15.580 And you've got a military robot, probably better, you know?
00:49:18.940 So the scope of change from this, what about e-bikes?
00:49:25.640 You know, an e-bike is already amazing.
00:49:29.540 You know, I talk about it too much.
00:49:31.000 But imagine your e-bike being twice as powerful or going twice as far.
00:49:35.720 You wouldn't need much else.
00:49:37.100 All right, ladies and gentlemen.
00:49:42.820 I believe that's what I wanted to talk about today.
00:49:45.920 If you have not watched, if you have not seen any of the Dilbert Reborn comics, the spicier version,
00:49:52.660 I was reading some previews to my Man Cave participants.
00:49:59.240 I do a private live stream many evenings from my Man Cave, just for the people on the Locals platform, subscribers.
00:50:08.300 And I was reading some of the future comics, the ones that are, let's say, the new spicy version.
00:50:14.680 I had to work through the comics that were in the pipeline already, because they were the ones that were slated for newspapers.
00:50:20.160 But I think they agreed that they're pretty provocative, but also funnier, I hope.
00:50:30.060 I think they are.
00:50:31.600 For men only.
00:50:32.600 No, not for men only.
00:50:35.800 All right.
00:50:38.060 Even your dog loves them.
00:50:39.340 Good to know.
00:50:41.020 Good to know.
00:50:41.940 All right, did I miss anything?
00:50:43.120 Is there any story that you're dying to talk about?
00:50:45.440 Well, you'd rather watch paint dry.
00:50:51.400 Well, well, RFK Jr. has a story.
00:50:55.320 Well, I think that story is wait and see.
00:51:01.500 But you're reminding me, I want to give him a voice tip.
00:51:04.500 So, you know, his voice is better.
00:51:07.380 He had a procedure to improve his voice quality.
00:51:11.940 And it definitely is better.
00:51:13.480 There's a big difference.
00:51:14.800 So I'm glad he did that.
00:51:16.560 But I feel like there's something he could do with his voice production technique that would take better advantage of that.
00:51:25.560 Oh, yeah, I was on Dr. Drew last night with Dr. Victory, and you might want to watch that.
00:51:36.160 So check that out.
00:51:37.680 We had a lot of fun.
00:51:41.280 All right.
00:51:47.900 Oh, okay.
00:51:48.760 I just saw a recommendation for Kat Timpf's book.
00:51:51.460 Let me give Kat Timpf a little shout-out, okay?
00:51:55.840 So Kat Timpf has a new book.
00:51:57.600 I've been seeing her promoting it on some TV shows.
00:52:01.980 Who was it who said it? There was somebody talking to her who said, you know, you could tell she wrote every word in the book.
00:52:12.180 So for public figures, that's, you know, not always the case.
00:52:15.440 But I thought, what do you mean?
00:52:16.900 Of course she did.
00:52:17.900 She's a professional writer.
00:52:19.140 Of course she wrote every word of it.
00:52:22.380 Of course she did.
00:52:23.660 She's an excellent writer, a professional writer.
00:52:27.280 Of course she did.
00:52:29.500 But apparently it's quite good.
00:52:32.220 So I'm hearing good things about it.
00:52:34.160 So take a look at that.
00:52:39.020 All right.
00:52:40.160 Did I find out about the pandemic hoax?
00:52:44.600 Candace Owens says women shouldn't vote.
00:52:46.860 A lot of women say that.
00:52:49.580 I don't think they mean it literally.
00:52:51.860 Maybe they do.
00:52:52.700 I don't know.
00:52:53.480 But it's a provocative thing to say.
00:52:58.420 BuzzFeed, yeah, BuzzFeed's dying.
00:53:00.100 I was dancing on their grave earlier.
00:53:01.720 You missed that.
00:53:02.260 I think AI will have human advocates who are trying to get it rights.
00:53:19.880 Oh, Matt Taibbi being threatened with jail time.
00:53:22.080 I saw a tweet on that, but I didn't know the context.
00:53:25.340 Matt Taibbi was threatened.
00:53:28.300 I saw that he was threatened with some kind of jail time, or there was a risk of it.
00:53:32.840 But what was that about?
00:53:34.200 Somebody acting badly that wasn't him, right?
00:53:36.820 Oh, Mattie's claim?
00:53:41.780 All right.
00:53:42.380 So there's some claims about him that would be a problem.
00:53:45.760 I'll look into that.
00:53:46.960 I'll get back to you on that.
00:53:47.900 I do think that we need to protect Matt Taibbi.
00:53:52.800 What do you think?
00:53:54.520 I think if anybody like Miranda Devine or Matt Taibbi or Michael Schellenberger, if any of those,
00:54:04.400 if any of those independent voices get attacked, I think we need to protect them.
00:54:10.400 I think, and by the way, I'm now cursed, so I have the curse on me.
00:54:18.300 The same thing happened when I became a cartoonist.
00:54:21.420 I got some advice that really made a difference, and then I felt cursed because if anybody asked
00:54:27.360 me for advice once I was successful, I felt the need to give it to them, to pass on the
00:54:35.160 goodness, but because I was protected when I got canceled, I feel the need to promote anybody
00:54:44.300 in that space.
00:54:45.900 Anybody who didn't try to cancel me is my friend, right?
00:54:52.380 So that's what I say about that.
00:54:59.020 DMT, okay.
00:55:00.060 Talk about Musk removing rules on deadnaming from Twitter.
00:55:06.020 Oh, I didn't know about that.
00:55:07.680 So is it true that you used to get banned from Twitter for deadnaming, calling a trans person
00:55:13.180 by their old name, and now that's gone away, right?
00:55:17.480 Oh, I agree with that.
00:55:19.420 Yeah.
00:55:19.700 As much as I don't want to see any discrimination, and I don't want to see any harshness toward
00:55:27.680 the adult trans community, I don't think you should go to jail for using somebody's, you
00:55:35.620 know, first name, original name, I guess, dead name.
00:55:42.240 Can we cancel the publishers that canceled Scott?
00:55:44.800 No, I don't think that's the way to go.
00:55:48.100 I don't think that the editors of either the newspapers or my syndication company or the
00:55:56.020 publishers had much choice.
00:55:58.900 You know, the public, the public's kind of firmly in command of things, and the public
00:56:03.380 demanded that they act in a certain way.
00:56:06.620 So they did.
00:56:09.640 All right.
00:56:10.500 Yeah, I'm not going to criticize any of the individuals who made business decisions for
00:56:16.100 the benefit of their stockholders.
00:56:19.500 That's what they're supposed to do.
00:56:23.420 All right.
00:56:24.540 All right.
00:56:25.260 Looks good.
00:56:25.840 We're going to end here.
00:56:27.780 YouTube, thanks for joining.
00:56:29.100 All of our technology worked today, amazing.
00:56:32.180 So happy.
00:56:33.840 Talk to you tomorrow.