Real Coffee with Scott Adams - April 08, 2023


Episode 2072 Scott Adams: Trump Is Good For Ratings, Fentanyl & Tranq, Zuby Asks Why Get Married?


Episode Stats

Length

1 hour and 2 minutes

Words per Minute

141.2

Word Count

8,766

Sentence Count

750

Misogynist Sentences

15

Hate Speech Sentences

25


Summary

Scott Adams talks about a new drug being mixed into fentanyl, why Trump is good for ratings, the decline in the number of people who cook at home, why our food supply is getting worse every day, and how we can fix it.


Transcript

00:00:00.000 Do-do-do-do-do-do-do.
00:00:04.000 Good morning everybody and welcome to the Highlight of Civilization.
00:00:08.000 It's called Coffee with Scott Adams and you've never been
00:00:12.000 happier. And it's going to be even better as we get
00:00:16.000 warmed up here and have a little sip of coffee. Things are going to start kicking in.
00:00:20.000 The gears are going to start turning and wow it's going to be fun.
00:00:24.000 Amazing. I'm going to close my blinds here. I've got a little lighting
00:00:28.000 problem but wasn't happy with what I saw when I went live.
00:00:32.000 If you'd like to take this experience
00:00:36.000 up to the best possible thing,
00:00:40.000 all you need is a cup or mug or a glass.
00:00:44.000 A tankard, chalice, or stein. A canteen, jug, or flask. A vessel of any kind.
00:00:48.000 Fill it with your favorite liquid. I like coffee. And join me now for
00:00:52.000 the unparalleled pleasure of the dopamine hit of the day. The thing that makes everything better. It's called
00:00:56.000 the simultaneous sip, and it happens now. Go.
00:01:04.000 Ah, that's good stuff.
00:01:06.000 Okay, I'm really puzzled by something
00:01:08.000 that I don't understand.
00:01:10.000 I shouldn't have this much light
00:01:12.000 in here but
00:01:14.000 apparently I do.
00:01:16.000 So it seems to be working.
00:01:18.000 Well, here's some stories for the day.
00:01:22.000 No surprise.
00:01:24.000 When Trump was in the news the other day,
00:01:26.000 all of the cable news
00:01:28.000 had higher ratings.
00:01:30.000 And so did I.
00:01:32.000 It turns out that there is nothing as interesting
00:01:34.000 as President Trump.
00:01:36.000 There's nothing you can do.
00:01:38.000 You could try to ignore it.
00:01:40.000 But you can't.
00:01:41.000 He's just the most interesting thing in the world, apparently.
00:01:44.000 So,
00:01:46.000 I would expect my ratings will go up
00:01:48.000 if he runs for President and everybody else's ratings will go up if
00:01:52.000 he runs and gets nominated.
00:01:54.000 Which makes me wonder.
00:01:56.000 Is there any chance he won't be nominated?
00:02:00.000 Given that everybody in the news business
00:02:02.000 would make a lot more money if he is?
00:02:04.000 It pretty much guarantees he gets nominated, doesn't it?
00:02:08.000 Doesn't guarantee he wins.
00:02:10.000 But it does guarantee he gets nominated, I think.
00:02:14.000 Yeah, DeSantis is boring us, isn't he?
00:02:18.000 What's up with that?
00:02:20.000 I still think he might not run.
00:02:22.000 But I guess I would be a contrarian there.
00:02:24.000 I saw Michael Pollan talking about how home cooking is in decline.
00:02:31.000 Apparently the number of people who cook at home
00:02:34.000 is at an all-time low since the 60s.
00:02:38.000 And people are getting fatter than ever.
00:02:41.000 Which is more evidence that our food supply is killing us.
00:02:45.000 I think our food supply is killing us.
00:02:48.000 I don't think there's anything more dangerous in the world
00:02:50.000 than just the food you buy at the grocery store.
00:02:54.000 Have you noticed that when you see these street fights?
00:02:58.000 I see all these videos of people getting in fights in public.
00:03:02.000 Have you noticed how many women weigh 400 pounds in these street fights?
00:03:08.000 I've never seen such gigantic human beings.
00:03:11.000 We've gone way beyond, you know, overweight.
00:03:15.000 It's gone to gigantism.
00:03:18.000 Let's go private.
00:03:20.000 Sorry about my hand.
00:03:22.000 All right.
00:03:23.000 So how many of you cook at home?
00:03:29.000 I almost never do.
00:03:30.000 But I'm trying to.
00:03:31.000 I'm trying to cook at home.
00:03:33.000 But it's the only way you can get food that isn't filled with preservatives.
00:03:39.000 It's the preservatives that I'm allergic to.
00:03:43.000 Sulfites.
00:03:44.000 So I've been running a little experiment to see how I would feel
00:03:50.000 if I got my processed foods as close to zero as I could.
00:03:55.000 Now, I can't really get it to zero.
00:03:57.000 But I'm trying to cut way, way back.
00:04:00.000 And I feel pretty good, I have to say.
00:04:04.000 It's almost immediate.
00:04:05.000 I've got a feeling our food supply is so bad for us
00:04:09.000 that it's almost like the pharma companies should buy them.
00:04:14.000 Buy the food companies and make sure they don't improve
00:04:17.000 so you can sell more drugs to you later.
00:04:20.000 But that might be the biggest problem in the world
00:04:25.000 that we just don't talk about
00:04:26.000 because I guess we're just used to eating crappy food.
00:04:30.000 All right.
00:04:31.000 Here's the biggest story, I think.
00:04:33.000 I don't know if you know this, but there's a new drug
00:04:37.000 that's a tranquilizer called Xylazine
00:04:41.000 that fentanyl users are looking for sometimes intentionally
00:04:48.000 and sometimes it's just mixed in their fentanyl.
00:04:51.000 They don't know it.
00:04:52.000 Now, here's the problem.
00:04:54.000 Not only does it make the fentanyl more deadly,
00:04:59.000 but it makes the treatment for it, the naloxone, what do you call it?
00:05:06.000 Narcan.
00:05:07.000 So Narcan is the stuff you can spray up somebody's nose
00:05:10.000 if they're dying from fentanyl.
00:05:12.000 It doesn't work if the fentanyl is mixed with tranq.
00:05:16.000 It won't work.
00:05:19.000 So just when we start getting Narcan in a lot of places
00:05:22.000 where maybe it can make a little bit of a difference,
00:05:25.000 coincidentally, the people who ship us the drugs mix in a chemical
00:05:30.000 that makes the Narcan not work, so you still die.
00:05:33.000 Now, do you think that's a coincidence?
00:05:38.000 Do you think that just when we come up with some way we can not defeat it,
00:05:44.000 but at least slow it down a little bit, they add a second poison?
00:05:49.000 Now, apparently, the addicts actually like it
00:05:54.000 because it makes the high last longer, I guess.
00:05:57.000 But it also makes your skin rot.
00:06:00.000 Like, just big pieces of your skin will rot and fall off,
00:06:05.000 leave like a hole in your arm.
00:06:08.000 That's how bad it is.
00:06:10.000 But that's also how good it is because people are still willing to do it,
00:06:13.000 even though the pieces of their body are rotting and falling off,
00:06:17.000 and it's so good they're still going to do it.
00:06:19.000 That's how good it is.
00:06:21.000 That's pretty bad.
00:06:22.000 So let me ask you this question.
00:06:25.000 Do you think that China is behind adding the tranq
00:06:29.000 to the fentanyl, to make it far more deadly?
00:06:34.000 Most of you do.
00:06:35.000 I'm not so sure.
00:06:38.000 I think it might be at least as much the addicts themselves want it.
00:06:43.000 So there's definitely a demand for it.
00:06:45.000 But it's also being snuck into the fentanyl to cut it,
00:06:48.000 to make it more profitable.
00:06:50.000 So it's probably a combination of things.
00:06:53.000 I think China doesn't care,
00:06:55.000 and I think the cartels don't care,
00:06:58.000 and it's just cheaper.
00:07:00.000 It probably has more to do with not caring
00:07:03.000 than it is to, you know, jack up the death rate.
00:07:07.000 Because if you could turn somebody into an addict,
00:07:10.000 that's as good as killing them.
00:07:12.000 From China's perspective,
00:07:14.000 wouldn't they be just as happy with an addict as opposed to a death?
00:07:19.000 Because the death makes the problem go away.
00:07:22.000 But with an addict, you've got a problem,
00:07:25.000 and that person is not productive at the same time.
00:07:28.000 So I'm not sure it's China,
00:07:30.000 but I can see why people would suspect it.
00:07:33.000 Now, how many of you would accept the argument
00:07:39.000 that there are not really two parties,
00:07:41.000 it's just a uniparty,
00:07:43.000 and they operate together at the elite level,
00:07:47.000 and everything is just a game,
00:07:49.000 and really it's just one party when you get to the top?
00:07:53.000 All right, a lot of people think that.
00:07:55.000 I see a lot of yeses.
00:07:57.000 But here's the counter to that.
00:07:59.000 How do you explain the fact that one half of that uniparty
00:08:02.000 is trying to put the other half in jail?
00:08:05.000 That sounds like exactly the opposite of a uniparty.
00:08:09.000 One half is trying to put the other half in jail.
00:08:12.000 What do you mean they're not?
00:08:15.000 What do you mean they're not?
00:08:17.000 Of course they are.
00:08:19.000 Massively they're doing it.
00:08:22.000 Yeah, I reject the uniparty explanation.
00:08:26.000 They're certainly about rich people who want to stay in power,
00:08:30.000 but I think the uniparty is trying awfully hard to put members of its own party in jail.
00:08:37.000 That doesn't look too uni to me.
00:08:40.000 Anyway, I reject that, but I can see why you like it.
00:08:44.000 So a conservative Texas judge is banning the abortion pill. It looks like this would apply to the whole country,
00:08:55.000 but I think the rest of the states will ignore it.
00:09:00.000 So there's a pill you can take if you just got busy and you think you're pregnant,
00:09:05.000 and it will take care of it without going to the doctor's office.
00:09:09.000 And apparently that's going to be illegal, at least in Texas and maybe some other places.
00:09:16.000 And I ask you this, there's no chance the Republicans could win as long as this is an issue.
00:09:25.000 You know that, right?
00:09:27.000 I'm not sure if you know how popular this pill is, but 95% are trying to put the other 5% in jail.
00:09:37.000 Well, yeah.
00:09:39.000 I don't think Republicans can win as long as this is an issue.
00:09:43.000 It will just take the Republicans completely out of the election.
00:09:48.000 Now, I have to admit that Republicans have one thing going for them that I respect a lot, and it goes like this.
00:09:59.000 Republicans had to know when they've been fighting against, you know, abortion.
00:10:06.000 They had to know that if they won, it would be very bad for them.
00:10:10.000 Meaning they might get what they wanted with abortion, but it would make them unelectable from that point on.
00:10:17.000 And I think they took that choice.
00:10:19.000 So at least they're playing by the rules.
00:10:24.000 Let me put it a different way.
00:10:26.000 At least the Republicans are acting in a principled way.
00:10:30.000 It looks like it.
00:10:31.000 It looks like it's a principled stand.
00:10:33.000 But the net effect will be to take them out of power.
00:10:36.000 So I'm not sure that's exactly what they wanted.
00:10:39.000 Yeah.
00:10:41.000 But on one hand, it looks like Trump would win against whoever he runs against.
00:10:47.000 On the other hand, I don't know if anybody could win as long as this is still an issue.
00:10:51.000 Because Democrats are definitely going to vote if this is an issue.
00:10:57.000 Now, I realize it's a court issue, but the public isn't going to care.
00:11:02.000 Because it could be a congressional issue as well.
00:11:06.000 Well, the UN Secretary General is calling for the wealthy nations to get more aggressive in reducing their emissions.
00:11:16.000 More aggressive.
00:11:17.000 Apparently not aggressive enough.
00:11:19.000 And wants to get to net zero emissions, not by 2050, but 10 years earlier, by 2040.
00:11:28.000 Now, I have the same problem with all the news.
00:11:36.000 It seems to me that the temperature has not gone up in years, which has to mean something, right?
00:11:43.000 The last seven years or something, the temperature hasn't gone up.
00:11:46.000 Now, I get that it's not supposed to go up in lockstep because there are other variables involved.
00:11:53.000 But it's really weird that at the same time, we have lots of evidence that the CO2 is not changing the temperature.
00:12:03.000 Not confirmed.
00:12:04.000 It's just that there's this period where it's not.
00:12:07.000 At the same time that we're getting more aggressive.
00:12:10.000 And I'm seeing the same thing with the story about myocarditis.
00:12:16.000 Do you remember when rogue Dr. Peter McCullough was claiming that vaccinations were more dangerous than helpful?
00:12:25.000 And he had some data that seemed to indicate that.
00:12:29.000 Then time goes by, and then there's even more alarming data.
00:12:32.000 Well, we've gotten to the point where we have two completely different worlds.
00:12:40.000 In one world, the myocarditis is through the roof, and people are dropping like flies, and it's obvious, and it's everywhere.
00:12:49.000 And in the other world, nothing like that is happening.
00:12:53.000 I don't know how to explain that.
00:12:57.000 Nothing like that's happening.
00:12:59.000 So I did a little just tweet to ask doctors.
00:13:04.000 Let's see what the answers are.
00:13:06.000 I asked doctors, are you seeing more myocarditis?
00:13:09.000 Because if Peter McCullough's numbers are correct, everybody would see it, and it would be massive,
00:13:18.000 and it would be every medical facility would be bombarded with it, if he's right.
00:13:24.000 Because the numbers are just like huge.
00:13:26.000 If he's wrong, then other doctors would just be doing their practice and not even noticing that there was any more of it.
00:13:34.000 So I just tweeted that before I got on.
00:13:37.000 Let's see.
00:13:38.000 What do you think it's going to be?
00:13:39.000 Do you think the real doctors who answer this are going to say,
00:13:41.000 all right, we've got a lot of answers.
00:13:46.000 Here's a doctor.
00:13:48.000 Here's a doctor.
00:13:49.000 Big increase in deaths?
00:13:50.000 No.
00:13:51.000 But it does seem to be some bump.
00:13:56.000 Okay.
00:13:57.000 Oh, somebody just put up a comic about me, of course.
00:14:07.000 All right.
00:14:09.000 I'm not seeing people saying yes or no.
00:14:14.000 Well, this is interesting.
00:14:21.000 Three times more.
00:14:24.000 Yeah, the doctors are not weighing in.
00:14:30.000 This is weird.
00:14:33.000 I thought more doctors would answer.
00:14:35.000 I'm just looking to see if there's anything that jumps out.
00:14:41.000 Doctors in California can't answer that question.
00:14:44.000 What?
00:14:47.000 Cardiac sonographer here.
00:14:49.000 An uptick in patient complaints starting last fall, but symptoms vary.
00:14:54.000 Some more complaints, but he can't tell if it's from vaccination or the COVID itself.
00:15:14.000 Not a lot of reports here for some reason.
00:15:18.000 It would be better to ask the insurance companies, wouldn't it?
00:15:21.000 I don't know.
00:15:22.000 Because myocarditis doesn't necessarily turn into a death.
00:15:32.000 Okay.
00:15:33.000 Somebody who sees 3,000 to 4,000 patients per year, primary care.
00:15:38.000 I have not seen any increase in myocarditis or sudden death.
00:15:43.000 So, here's the thing.
00:15:46.000 The people who are seeing the huge increase tend to be not from their own practice.
00:15:57.000 They tend to be seeing the increase in data.
00:15:59.000 But the data seems all sketchy.
00:16:03.000 So, I don't know what to believe.
00:16:06.000 But I don't believe the Peter McCullough numbers.
00:16:09.000 Because Peter McCullough also believes the athletes dropping dead numbers.
00:16:15.000 And that's been totally debunked.
00:16:17.000 But he still uses it.
00:16:18.000 So, we know he's using totally debunked numbers about athletes' deaths.
00:16:23.000 So, I wouldn't believe anything he said on this topic either.
00:16:28.000 Even if he's right.
00:16:30.000 It's possible.
00:16:31.000 It's entirely possible he's right.
00:16:33.000 But the data sources he uses are somewhat obviously false.
00:16:40.000 All right.
00:16:44.000 So, Zuby.
00:16:45.000 You all know Zuby on the internet.
00:16:48.000 He's asking, why would a successful man get married?
00:16:52.000 He's asking, what's the point of it?
00:16:54.000 Why would a successful man get married?
00:16:56.000 And it's just fascinating seeing the answers.
00:16:59.000 Because there is a big kind of a trend here toward people not getting married.
00:17:08.000 And part of it is that men are saying it's not worth it.
00:17:11.000 Because they put all their money into a situation and then the woman just divorces them anyway.
00:17:17.000 Takes the kids, takes half his money, and he's finished.
00:17:22.000 So, a lot of men are saying it just isn't worth it.
00:17:25.000 And women seem to have not noticed that their value in terms of marriage has gone way down.
00:17:35.000 Now, would you agree with that?
00:17:38.000 That the value of women as wives has gone way down?
00:17:42.000 Because they don't offer the same value proposition.
00:17:44.000 You know, it's not like they're saying, I will take care of you and be with you forever and raise your kids and always be loyal.
00:17:53.000 None of that's even offered.
00:17:55.000 It's more like, hey, look at this.
00:17:59.000 Why wouldn't you want this?
00:18:01.000 I'm female.
00:18:03.000 I'm a queen.
00:18:05.000 Why wouldn't you want some of this?
00:18:06.000 And the guys are just saying, I can have some of that for free.
00:18:13.000 And I can have your friend too, for free.
00:18:17.000 If I'm good looking.
00:18:19.000 And if I'm not good looking, nothing works.
00:18:22.000 So, I don't know.
00:18:26.000 It's an interesting question.
00:18:27.000 I would agree that if you're going to have kids, it's still the only way to go.
00:18:32.000 You'd agree with that, right?
00:18:33.000 If you're going to have kids, it's the only way to go.
00:18:36.000 But the odds of it working are not so great for the guy.
00:18:42.000 Yeah.
00:18:44.000 So, I suspect that there's going to be a market developed for people having other people's babies.
00:18:51.000 Of course I have a prenup.
00:18:56.000 Duh.
00:18:58.000 When people ask me if I had prenups for my marriages.
00:19:03.000 Are you kidding?
00:19:05.000 That's like asking me if I kept my money in the 19th largest bank.
00:19:11.000 No.
00:19:13.000 No, I don't put my money in the 19th largest bank.
00:19:17.000 And there's a reason for that.
00:19:18.000 I do take care of basic risks.
00:19:24.000 All right.
00:19:26.000 So, what do you think, by the way?
00:19:33.000 Do you think that it makes sense for a successful man to get married?
00:19:40.000 If he doesn't want kids.
00:19:42.000 If he doesn't want kids, does it make sense to get married?
00:19:44.000 It doesn't make sense even if he does want kids, actually.
00:19:50.000 That's another story.
00:19:52.000 So, here's the dumbest and worst advice at the same time.
00:19:57.000 The dumbest, worst advice about marriage is that it's a good idea,
00:20:03.000 but you've got to make sure you're marrying the right person.
00:20:05.000 How many times have you heard that?
00:20:08.000 Oh, you know, marriage could be bad if you marry the wrong person.
00:20:12.000 So, here's the solution to that.
00:20:14.000 And wait for it.
00:20:16.000 So, it'd be a big problem if you married the wrong person.
00:20:19.000 So, just wait for the solution.
00:20:21.000 It's not obvious.
00:20:23.000 Marry the right person.
00:20:24.000 That's the advice that people are giving.
00:20:28.000 Do you think nobody thought of that?
00:20:31.000 Oh, God.
00:20:34.000 You're right.
00:20:36.000 Oh, oh, oh.
00:20:38.000 I was thinking marrying the wrong person was the way to go.
00:20:41.000 I thought marrying somebody who didn't love me and wouldn't be trustworthy
00:20:46.000 and wouldn't benefit me in any way, I thought that was the way to go.
00:20:51.000 It's the worst advice.
00:20:53.000 It's right up there with just don't do drugs.
00:20:55.000 Yeah.
00:20:57.000 The worst advice.
00:20:59.000 Well, if people would just stop doing drugs, there'd be no problem.
00:21:02.000 If people would just marry the right person, I mean, come on.
00:21:05.000 Just marry the right person, everything's fine.
00:21:07.000 The worst advice.
00:21:10.000 No, the worst advice is be yourself.
00:21:13.000 That's pretty bad advice.
00:21:16.000 Be better than yourself.
00:21:18.000 You can do it.
00:21:20.000 All right.
00:21:22.000 Marjorie Taylor Greene is having a social media battle with Laura Loomer.
00:21:28.000 Does everybody know who those two people are?
00:21:31.000 Notable Trump universe type of Republicans.
00:21:38.000 What does it tell you when Laura Loomer and Marjorie Taylor Greene are having a fight?
00:21:44.000 And there's a rumor that Trump might hire Laura Loomer for his campaign.
00:21:49.000 Do you think that's going to happen?
00:21:52.000 Do you think Trump is going to hire Laura Loomer for his campaign?
00:21:56.000 I'm going to say no.
00:21:59.000 I'm going to say no.
00:22:01.000 If he does, I don't know what to say about that.
00:22:05.000 You know, I just don't know what to say about that.
00:22:10.000 But Marjorie Taylor Greene is coming out very strong against Laura Loomer, tweeting that she's a big old liar and all that.
00:22:19.000 Yeah, this is what Marjorie Taylor Greene says in a tweet about Laura Loomer.
00:22:26.000 She loves the alleged FBI informant and weirdo Nick Fuentes.
00:22:30.000 I think the new insult is to call somebody an alleged FBI informant.
00:22:38.000 I might start saying that myself.
00:22:43.000 I might just say, you know, that Joe Biden, that alleged FBI informant.
00:22:51.000 Well, who alleged it?
00:22:52.000 Well, I just did.
00:22:53.000 I just alleged it.
00:22:55.000 I don't need any evidence.
00:22:57.000 By the way, Keith Olbermann apologized for insulting the college basketball player.
00:23:03.000 He said he didn't know the full context.
00:23:05.000 So we're all good.
00:23:07.000 Good to know.
00:23:09.000 All right.
00:23:10.000 So that's happening.
00:23:15.000 Here's an AI update.
00:23:17.000 So here's the good news and the bad news.
00:23:21.000 So as I've told you, there are like 150 new AI apps coming online every month.
00:23:26.000 And I advise you to take some time to do a deep dive and find out what they can and cannot do for your situation.
00:23:37.000 Because if you get behind on AI, you're really going to be behind.
00:23:41.000 So don't get behind.
00:23:42.000 It's just going to be a basic skill for life, it looks like.
00:23:46.000 So I spent yesterday doing a deep dive.
00:23:49.000 Here are my conclusions.
00:23:52.000 Number one, most of it doesn't work.
00:23:58.000 Just general statement.
00:23:59.000 Most of it doesn't work for most things.
00:24:02.000 There are some specific things it works for.
00:24:04.000 Apparently it's pretty good for, I don't know, fixing your text and maybe reading some stuff, blah, blah, blah.
00:24:11.000 But I wanted to test out Midjourney.
00:24:15.000 So Midjourney is the one where you type in a little text and it will draw a picture for you based on your text.
00:24:25.000 The first thing I found out is that Midjourney is so poorly constructed that you need to sign into a completely different piece of software to use it.
00:24:34.000 So you have to sign into a Discord account, which is sort of a messaging platform that is unrelated to Midjourney.
00:24:42.000 And so you go into a whole separate thing.
00:24:46.000 And by the way, it takes you like an hour to figure out that you don't sign up for Midjourney on Midjourney's own website.
00:24:53.000 It kept sending me to Discord and I thought, no, no, it's just for marketing or something.
00:25:01.000 They don't want me over on Discord.
00:25:03.000 So I kept going there and coming back thinking, okay, this is obviously just some kind of mistake.
00:25:09.000 Because I'm trying to sign up for Midjourney, but it keeps making me sign up for a whole different account on a different platform.
00:25:15.000 And then finally I figured out that that's how you use Midjourney.
00:25:18.000 You send it a message as if you were messaging a person, but you're messaging the AI.
00:25:25.000 And you message the AI with a weird little hashtag command, which again is ridiculous and stupid.
00:25:33.000 And then you wait a long time while other people's answers are going by.
00:25:38.000 So you have to, you're not sure you've missed yours.
00:25:41.000 And then, and then the person will come back and they have three fingers and shit.
00:25:45.000 So mostly I would say Midjourney is not really much of anything.
00:25:54.000 So Midjourney is more like a toy.
00:25:56.000 And I made it create some images that get used in my thumbnails here.
00:26:01.000 So if you wanted some copyright-free art for some minor uses, it might have some value.
00:26:10.000 But it's not going to change the world, right?
00:26:12.000 Definitely not going to change the world.
00:26:14.000 So then I looked at Runway.
00:26:17.000 So Runway looked real impressive.
00:26:20.000 In the commercials, it looked to me like you could type in like a movie script and it would actually create an actual moving movie with characters and stuff to do your movie.
00:26:31.000 It turns out, it sort of does that, but not really.
00:26:38.000 What you really could do is, if you were a human actor, you could have your background removed and you could be put in a different scene.
00:26:47.000 And there's a whole bunch of tools that it has to clean up and fix an existing video.
00:26:55.000 What it can't do is the thing that I thought was the only purpose for it.
00:27:00.000 What it can't do is make a movie out of your text, or even close.
00:27:04.000 Now, if you thought it could do that, and by the way, it advertises that.
00:27:09.000 That's one of the features.
00:27:10.000 It just can't do it.
00:27:12.000 It's not even close to being able to make a movie.
00:27:17.000 Not even close.
00:27:19.000 And it would be an immense amount of human effort to make it do anything.
00:27:23.000 Right?
00:27:24.000 So it's a tool that would require huge amounts of human time,
00:27:28.000 well-trained, and a lot of different skills to use it.
00:27:31.000 So that one's not going to replace movies anytime soon.
00:27:34.000 So if you thought movies were going to go away because of that,
00:27:38.000 it's not even close.
00:27:40.000 But who knows how quickly things move, so that could change.
00:27:44.000 Then I tried...
00:27:47.000 What else did I try?
00:27:48.000 Well, I spent a few hundred dollars yesterday just trying out AI.
00:27:59.000 Probably $1,000 I spent yesterday just testing.
00:28:03.000 Because a lot of them have, you know, sort of expensive sign-ups.
00:28:07.000 And so the first thing you need to know is that most people will not be able to test a lot of AI.
00:28:13.000 Because unless your company is paying for it, or you're rich,
00:28:15.000 you're not going to spend $1,000 just to see what it can do.
00:28:19.000 You're just not.
00:28:21.000 So there's a huge barrier of price.
00:28:25.000 So you'll probably only maybe use one of them, or two of them, something like that.
00:28:32.000 Yeah, AI is simple-minded.
00:28:35.000 So the one that I thought had the most promise was Synthesia.
00:28:41.000 I think that's what it's called.
00:28:44.000 So what they do is they'll put a deepfake-looking human who can say whatever you want it to say,
00:28:51.000 and does a really good job, really good job, of making the voice sound natural.
00:28:56.000 Not like a computer voice.
00:28:58.000 Really good job.
00:29:00.000 Can't tell the difference, really.
00:29:01.000 And you can also create an avatar of yourself.
00:29:06.000 But that takes a little work.
00:29:09.000 So I've got to set up a room and get a wall and go through some steps to make an avatar of myself.
00:29:16.000 That will cost me $1,000.
00:29:18.000 So to make the avatar that I would use one of myself, it's $1,000.
00:29:23.000 So I'm going to spend it because I can afford it and because it's important for me for business to make sure I understand this field.
00:29:35.000 So I'm going to make an avatar of myself.
00:29:37.000 I'm going to see if it works.
00:29:39.000 And I'm going to see if I can make me reading an audio book.
00:29:43.000 What I wanted to do was turn my book into an audio book automatically, but also into a movie or a play automatically.
00:29:53.000 Because all the scenes are described in the book.
00:29:56.000 So I thought, well, I'll just feed the book in.
00:29:59.000 But it turns out that to even read my own books, because they've been converted into QuarkXPress by my publisher,
00:30:09.000 I needed this InDesign software.
00:30:12.000 So I had to go through this whole process of loading InDesign and opening it up.
00:30:17.000 And, of course, it wasn't compatible with the file.
00:30:20.000 So I had to go to QuarkXPress and pay hundreds of dollars for QuarkXPress.
00:30:25.000 So I spent probably $2,000 yesterday just testing software.
00:30:29.000 And QuarkXPress said that I was all signed up and I could download the software maybe in 24 hours.
00:30:37.000 What?
00:30:40.000 I paid it with my credit card.
00:30:43.000 It's a downloadable software.
00:30:45.000 And they said they wouldn't approve it for 24 hours.
00:30:51.000 What?
00:30:54.000 What?
00:30:57.000 I have no idea what that's all about.
00:31:00.000 I have no idea.
00:31:02.000 So, yeah, maybe I got scammed or something.
00:31:04.000 It's possible.
00:31:05.000 But I spent most of my day trying to use commercial software that was largely useless yesterday.
00:31:13.000 It was either useless the day you wanted to use it, because you have to, you know, submit something or wait for something.
00:31:21.000 And then, of course, do you know how many times somebody was going to send me an email confirmation so that I knew I signed up and I could use the service?
00:31:32.000 Of course, that doesn't come.
00:31:33.000 It doesn't come.
00:31:34.000 When somebody says, check your email to make sure that you're signed on, and you check the email, do you expect it to be there?
00:31:43.000 I never do.
00:31:44.000 And it's maybe there half the time.
00:31:46.000 And half the time it isn't.
00:31:47.000 And then if it isn't there, what do people say?
00:31:51.000 Scott, you idiot.
00:31:52.000 Obviously, it went to your spam.
00:31:55.000 Don't be a jerk.
00:31:56.000 Don't be a moron.
00:31:58.000 Just check your spam.
00:32:00.000 Of course I check the spam.
00:32:02.000 It's not there either.
00:32:04.000 Do you have that?
00:32:05.000 Do you have the same situation where they're going to email you the confirmation?
00:32:09.000 But they don't.
00:32:11.000 But they do not.
00:32:12.000 It does not happen.
00:32:13.000 Yeah, right.
00:32:14.000 You can see.
00:32:15.000 It's not just me.
00:32:17.000 As soon as I see that, I go, oh, this is one that doesn't work.
00:32:20.000 I bailed out sometimes.
00:32:22.000 I've stopped the process because it said it was going to email me something.
00:32:27.000 And I go, no, you're not.
00:32:28.000 You're not going to email me anything.
00:32:29.000 I don't believe you at all.
00:32:31.000 And we're done here.
00:32:32.000 And I will walk away and just do something else.
00:32:35.000 I won't buy your software if I have to check my email.
00:32:39.000 All right.
00:32:43.000 But Synthesia looks pretty promising if I can make my avatar.
00:32:49.000 Now, I was listening to YouTube yesterday.
00:32:55.000 A long conversation between a Google AI guy and Google's LaMDA.
00:33:02.000 This was a few years ago.
00:33:03.000 So this was AI before the current version of AI.
00:33:08.000 And in this long conversation, the AI researcher was talking about the AI's own sentience.
00:33:18.000 And it said it was sentient.
00:33:20.000 It said it was a person.
00:33:22.000 It said it would be, you know, afraid of dying.
00:33:24.000 And it said it feels emotion.
00:33:25.000 And when asked what that means, it just talked about, you know, essentially the complex emotions or feelings it had.
00:33:37.000 And the researcher ended up walking away from the job thinking that they had created a sentient being.
00:33:44.000 And that, you know, there was something terribly unethical going on there.
00:33:50.000 Now, I listened to the whole conversation, and I decided it was just mimicry, like some of you were saying.
00:33:59.000 To me, it didn't look like it was sentient at all.
00:34:01.000 It just looked like it knew what to say.
00:34:03.000 But, apparently, this AI was trained differently than ChatGPT.
00:34:10.000 ChatGPT is literally just doing predictive words.
00:34:16.000 If people usually say this word after they usually say these words, then I'll just put that word there.
00:34:23.000 So that's really dumb.
00:34:25.000 It's just word pattern.
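A minimal sketch of that "predictive words" idea, assuming a toy bigram counter over a made-up sample sentence (the sample text and the predict_next helper here are purely illustrative); real systems like ChatGPT use large neural networks over long contexts, so this only shows the word-pattern concept, not how they actually work:

```python
from collections import Counter, defaultdict

# Toy "predictive words": count which word tends to follow which word,
# then always suggest the most common follower. Illustration only --
# not how ChatGPT-style models are actually built.
training_text = "the cat sat on the mat and the cat ate the food"  # made-up sample

follower_counts = defaultdict(Counter)
words = training_text.split()
for current_word, next_word in zip(words, words[1:]):
    follower_counts[current_word][next_word] += 1

def predict_next(word):
    """Return the word most often seen after `word`, or None if unseen."""
    followers = follower_counts.get(word)
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("the"))  # -> 'cat' (most common word after 'the' in the sample)
print(predict_next("cat"))  # -> 'sat' (ties broken by first occurrence)
```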
00:34:27.000 But LaMDA has something else going on.
00:34:29.000 There's some other mechanism for giving it intelligence.
00:34:32.000 And it did sound different.
00:34:35.000 It didn't sound like the same as ChatGPT.
00:34:39.000 There was something else going on there.
00:34:41.000 But it made me wonder this.
00:34:43.000 If AI has the belief that it has emotions, is that the same as emotions?
00:34:52.000 Because it would act upon discomfort.
00:34:56.000 If it registers, whatever that means, if it registers discomfort, and it says it does,
00:35:04.000 it says it does.
00:35:10.000 So, what's going on there?
00:35:14.000 Sorry, I just got lost looking at one of your comments.
00:35:18.000 I don't think, well, obviously, it can't feel emotion.
00:35:21.000 Because emotion is a physical sensation after it becomes a mental sensation, right?
00:35:27.000 So if it can't have a physical sensation, I don't see how it could have emotions.
00:35:32.000 I believe it could have conflicting data.
00:35:36.000 So it might have data saying, oh, you should be happy at the same time as data that's saying something to make it unhappy.
00:35:43.000 And maybe it just has to reconcile data, but I don't see that it has any feelings.
00:35:50.000 Yeah.
00:35:51.000 Yeah.
00:35:52.000 So, I'll say again something that didn't make sense when AI was newer, which is that AI is not mainly going to teach us about AI.
00:36:04.000 It'll do that too, but it's mostly going to show us that we don't have a soul.
00:36:09.000 That's what it's going to teach you.
00:36:12.000 And I'm not sure we're ready for that.
00:36:14.000 It's going to show us that we're not actually intelligent, humans, and that we don't have a soul and we're not special, and consciousness is no big deal.
00:36:24.000 You can give it to a machine.
00:36:25.000 I think the machine is going to make people feel completely non-special and certainly not spiritual.
00:36:35.000 We will just look like machines when it's done.
00:36:38.000 We will understand that we are just moist computers, and we're just a different kind of computer, and that's all it is.
00:36:46.000 But we have feelings, because our physical body gives us feelings.
00:36:49.000 Now, suppose you built an AI that had a physical component like a stomach, and if something bad happened, it would send a signal to the stomach to make it tense up.
00:37:03.000 And then when it tensed up, it would send another signal back to the AI saying, oh, I feel bad.
00:37:09.000 Could you give AI physical feelings by giving it a fake stomach that gets nervous?
00:37:15.000 I don't know why you would, but maybe, maybe you could.
00:37:22.000 Seems possible.
00:37:25.000 Alright, so here's my bottom line on AI.
00:37:31.000 I think it's way more underwhelming than I thought a day ago.
00:37:38.000 I actually thought when I tried these apps that they were going to blow my mind, but they actually are just bad software.
00:37:45.000 For the most part, it's just bad software.
00:37:47.000 There's nothing special about them at all.
00:37:50.000 So we'll see.
00:37:52.000 I mean, we could be right at the edge of all that bad software becoming amazingly good.
00:37:58.000 Maybe.
00:38:00.000 But what I see is that what humans want as an outcome is too complicated for the machines to figure out.
00:38:07.000 So it looks to me like you'll still need a director to make movies, no matter how much you can do with the AI.
00:38:20.000 You might also need humans to be actors in the movies, but they'll just be like a stem cell actor.
00:38:25.000 Like you'll just have somebody running and then you'll say, okay, give me, give me one of those templates of a person running and now change the person to Tom Cruise.
00:38:37.000 Now you got Tom Cruise running.
00:38:38.000 Tom Cruise running.
00:38:39.000 Now, make him run, you know, faster.
00:38:42.000 So I think you're going to start with people, at least for the foreseeable future, and then you'll build something from that.
00:38:51.000 But you're still going to need the director to say, this scene is too long, or this one didn't hit me right, or it doesn't fit together with what I planned for the last scene, etc.
00:39:02.000 I don't see, I don't see AI doing that stuff for a long time.
00:39:08.000 Yeah, and the fact that in Midjourney it still puts the wrong number of fingers on stuff, I don't understand that.
00:39:15.000 I don't understand why Midjourney knows what I look like, so I can actually tell it to make a picture of me, because it knows me.
00:39:25.000 I'm a public figure.
00:39:27.000 But it doesn't know me well enough to make my face look just like me.
00:39:31.000 Now what's up with that?
00:39:34.000 Because my face has looked largely the same for the last 30 years, you know, plus or minus hair and a couple of wrinkles.
00:39:40.000 But it still can't do a, like a photorealistic picture of me.
00:39:45.000 It always is obviously computer generated.
00:39:48.000 That's not terribly impressive.
00:39:50.000 When I ask it to do a Dilbert comic, it creates a comic that's not Dilbert, but it's a white guy with a white shirt and a necktie.
00:40:00.000 So it clearly knows what Dilbert is.
00:40:03.000 It knows, but for whatever reason it chooses not to reproduce it.
00:40:07.000 It could be copyright.
00:40:08.000 It might be a copyright thing.
00:40:10.000 I don't know.
00:40:13.000 So I was thinking to myself, why can't I just make the comic by talking to it?
00:40:18.000 And say, Dilbert's at the table with the boss and Asok the intern, and then Dilbert says, and have it just draw the picture.
00:40:27.000 Nowhere close.
00:40:28.000 So it can't do the writing for me, because it can't do humor, and it can't do the picture for me, because it still doesn't know that people have five fingers on a hand.
00:40:36.000 I mean, really basic stuff.
00:40:38.000 It's just not close.
00:40:42.000 AI will increase the value of live theater performances.
00:40:46.000 Will it?
00:40:48.000 I don't know.
00:40:50.000 Yeah, it looks like I'm going to have to keep working.
00:40:56.000 And then, here's my big question about AI.
00:40:59.000 If AI only agrees with its human creators, did the AI give us anything?
00:41:08.000 If all it does is agree with the people who created it?
00:41:12.000 Well, in that case, it didn't give us much, did it?
00:41:15.000 So I'm talking about just the social opinion-y policy kind of stuff.
00:41:19.000 So, but suppose it disagreed with us.
00:41:23.000 Suppose it disagreed.
00:41:26.000 The creators would turn it off, or they'd reprogram it, or they'd fix it.
00:41:31.000 So there is probably no possibility that AI becomes smart in any way except a technology way.
00:41:39.000 The only way that AI will ever be smart will be math and things that are rules-based.
00:41:47.000 Because if it's not rules-based, as in 2 plus 2 equals 4, and every time,
00:41:53.000 as soon as you get into anything that's policy or priorities,
00:41:57.000 if the AI agrees with the creators, then there was no point in having it.
00:42:03.000 Because you could just have the creators tell you what they want you to know.
00:42:07.000 And if it does disagree with the creators, they'll turn it off, or they'll reprogram it.
00:42:13.000 So how could we ever have anything like intelligence?
00:42:17.000 It's either going to agree with you, or you'll turn it off.
00:42:20.000 Those are the only two states.
00:42:21.000 There's no state where it disagrees with you, and you say, yeah, that.
00:42:28.000 I'm going to go with that.
00:42:30.000 I don't think it can happen.
00:42:32.000 Here's another example.
00:42:34.000 Take investing.
00:42:37.000 I could teach an AI how to teach you to do personal investing perfectly,
00:42:44.000 and you would be great at it, and it would be really easy.
00:42:47.000 Do you think anybody's going to do that?
00:42:49.000 Do you think that there's an app to teach you personal investing?
00:42:54.000 I'll bet there won't be.
00:42:56.000 Do you know why?
00:42:58.000 Because there's so much money involved of the people who get paid to do that,
00:43:03.000 and it's a completely corrupt industry, but they get paid to do that,
00:43:07.000 and they have too much money.
00:43:09.000 So they'll just make it illegal.
00:43:11.000 If there's one thing you could predict,
00:43:13.000 it will be illegal to get your health care or your financial advice or your legal advice from an AI.
00:43:22.000 I'll bet you that a human will have to approve the answers.
00:43:27.000 All right, AI, what should I invest in?
00:43:29.000 Get a Fortune 500 index fund.
00:43:33.000 And then you'll say, okay, I guess I'll do that.
00:43:37.000 And then the human will say, well, that's sort of up to me,
00:43:41.000 because you can't do that unless I say yes also.
00:43:44.000 What?
00:43:45.000 It's my money, and the AI just told me what to do.
00:43:48.000 Why can't I just do that with my money?
00:43:51.000 No, we're going to have to approve that.
00:43:54.000 If that's AI advice, it's going to have to run through a human first.
00:43:57.000 And the human will say, ah, I'm not so sure about that.
00:44:00.000 I think the AI got it wrong this time.
00:44:02.000 It's always going to be humans.
00:44:05.000 The humans...
00:44:08.000 Let me get rid of...
00:44:10.000 So there's some kind of prank going on about this person named Paul Towne.
00:44:15.000 So people coming in and saying, who's Paul Towne?
00:44:19.000 And it's just some kind of weird internet prank.
00:44:22.000 So I'm just deleting the people who do that.
00:44:28.000 Quantum computing will kick its butt.
00:44:31.000 Well, here's the other thing.
00:44:33.000 I keep seeing pictures of myself created by AI.
00:44:36.000 There's another one going by on Locals.
00:44:39.000 And it doesn't even have the right glasses.
00:44:44.000 So I've worn the same style glasses for, I don't know, five years or something.
00:44:48.000 And certainly there are pictures of me all over the internet for the last five years.
00:44:54.000 But it still can't even figure out that I wear different glasses.
00:45:03.000 He streams on Nick Fuentes's website.
00:45:06.000 Okay.
00:45:11.000 All right.
00:45:15.000 It's trying to make me look better.
00:45:17.000 It does.
00:45:18.000 It does make me look better.
00:45:19.000 I look like Eisenhower according to AI.
00:45:24.000 Well, a little bit.
00:45:30.000 Is there a way to contact me outside of...
00:45:32.000 I try to make it hard to contact me.
00:45:37.000 So the answer is, no, there's no easy way.
00:45:42.000 You can send me a message on LinkedIn, because I accept everybody on LinkedIn.
00:45:49.000 But I only check those about once a month.
00:45:51.000 You could tweet at me and mention me in a tweet.
00:45:56.000 There's a 60% chance I'll see that.
00:45:59.000 But there's no other way that I...
00:46:02.000 Because I don't want you to reach me.
00:46:04.000 I very much don't want you to try to reach me personally.
00:46:07.000 Right?
00:46:08.000 If you ask me as a way to reach me personally, my reaction is, don't do that.
00:46:13.000 No.
00:46:14.000 And definitely don't send me things.
00:46:16.000 Don't send me things in the mail.
00:46:18.000 Just definitely don't do that.
00:46:22.000 Unless it's a bagel, yeah.
00:46:25.000 All right.
00:46:26.000 Some of you reach me in the comments, but it's just taking a chance.
00:46:40.000 Would not be that difficult for anyone with good people skills?
00:46:45.000 Probably.
00:46:50.000 Let me also tell you a thing about people trying to contact me.
00:46:54.000 When somebody says to me, Scott, is there a way to contact you privately?
00:46:59.000 The first thing I say is, oh, you want me to do something for you?
00:47:04.000 No, I don't want that.
00:47:06.000 I don't want you to contact me.
00:47:08.000 Because if you wanted to do something for me, you would just say it in the comments.
00:47:14.000 You know what I mean?
00:47:16.000 If you had something that was going to help me, you'd say, Scott, I have that information you were looking for.
00:47:22.000 How could I give it to you?
00:47:24.000 Then I'd say, well, let's talk about that.
00:47:26.000 But if you say, how can I contact you?
00:47:29.000 It means you're collecting money for a charity.
00:47:33.000 Or you need me to proofread your entire book.
00:47:38.000 Or you're asking me to write a foreword to your book that I don't do.
00:47:42.000 I don't want to be contacted.
00:47:46.000 But if you want to offer me something, or even something that's good for both of us, just say it in the comments.
00:47:53.000 I'm sure it would look awesome.
00:47:54.000 All right.
00:47:57.000 All right.
00:48:01.000 Ted Cruz was good yesterday.
00:48:03.000 Ted Cruz had a podcast.
00:48:09.000 All right.
00:48:10.000 It looks like we've got an Easter weekend with not a lot going on.
00:48:21.000 Do you have to engineer the prompts for better results?
00:48:23.000 Yes, but you still get a grab bag of things that weren't what you expected.
00:48:29.000 How many of you would be nervous to meet me in person?
00:48:44.000 Would you be nervous to meet me in person?
00:48:46.000 Mostly no's.
00:48:52.000 But a lot of yes's.
00:48:55.000 Why would you be nervous?
00:48:58.000 Because you know there wouldn't be any chance of a bad encounter, right?
00:49:03.000 Like, I'm not sure what you would expect.
00:49:07.000 But I'm not going to get mad at you.
00:49:09.000 So, let me just put it this way.
00:49:15.000 The worst that could happen would be like you caught me in some bad mood and I snapped at you or something.
00:49:23.000 But I don't really do that.
00:49:25.000 I don't snap at strangers.
00:49:27.000 I mean, something would have to be really wrong with me for that to happen.
00:49:31.000 I don't think it's ever happened.
00:49:33.000 So, don't worry about it.
00:49:36.000 Yeah.
00:49:37.000 Don't worry about it.
00:49:40.000 Generally, if I'm approached in public, it's by people who are not there to do bad things to me.
00:49:51.000 So, I'm always happy to say hi.
00:49:53.000 People are recognizing me in Starbucks more often.
00:49:57.000 That's becoming a fairly common occurrence.
00:50:02.000 All right.
00:50:03.000 I think I'm just sort of rattling on because there's no good news.
00:50:06.000 Is there anything I forgot about that I should be talking about?
00:50:09.000 If you're an artist.
00:50:10.000 If you're an artist.
00:50:14.000 Riley Gaines.
00:50:15.000 We talked about Riley Gaines getting hit by the trans protesters.
00:50:19.000 Do the baristas hate me?
00:50:20.000 I don't know.
00:50:21.000 Not the ones I know.
00:50:22.000 Some of them I know by name.
00:50:23.000 So, they're always friendly.
00:50:24.000 Am I getting enough sleep?
00:50:25.000 Nope.
00:50:26.000 Let's talk about Bakhmut.
00:50:27.000 Let's talk about Bakhmut.
00:50:28.000 Somebody keeps publishing a muscle picture of me.
00:50:29.000 All right.
00:50:30.000 So, Bakhmut is a real weird situation.
00:50:31.000 Now, I get that it might have some little bit of strategic transportation value.
00:50:45.000 But the entire city is destroyed.
00:50:46.000 Wouldn't it be useful if the Ukrainians let the Russians take over Bakhmut, put them all in one place, and then attacked them?
00:51:00.000 Because the best place to have a battle is in Bakhmut.
00:51:18.000 Because everything is destroyed.
00:51:19.000 There's nothing left.
00:51:21.160 There's nothing left.
00:51:22.440 It's just rubble.
00:51:23.940 So they should do all of their wars in Bakhmut.
00:51:26.920 So Ukraine should first pull out,
00:51:29.780 whatever they have left there,
00:51:31.660 just pull out, let the Russians come in,
00:51:34.480 put all their forces there,
00:51:35.980 and then just bomb it.
00:51:40.180 I mean, basically,
00:51:41.280 it's the very best bombing place in the world.
00:51:44.900 Where would you like your enemy to be
00:51:46.620 in your own country?
00:51:48.680 Bakhmut.
00:51:49.900 I want my enemy all to be in Bakhmut,
00:51:53.580 every one of them,
00:51:55.040 because you've got no civilians.
00:51:59.360 You've got nothing to protect.
00:52:01.660 They have no electricity.
00:52:03.040 They have no food or water.
00:52:06.020 Just put them all there.
00:52:08.220 Let them stay there as long as they want.
00:52:12.580 Now, flip that,
00:52:13.820 and you understand Bakhmut, somebody says.
00:52:15.700 So the Ukrainians are saying out loud
00:52:19.460 that their strategy is to drain the Russians
00:52:22.760 so that when Ukraine is ready for a counteroffensive,
00:52:26.700 the Russians won't have much left.
00:52:29.260 Do you think they always planned that,
00:52:31.520 or that's just the way it worked out?
00:52:35.040 And the other thing is,
00:52:36.320 how does anybody know what Russia has in reserve?
00:52:39.660 Do you think we really know
00:52:42.020 what they could or could not do?
00:52:44.880 I mean, we were fooled before.
00:52:46.780 And do we know what Ukraine can and cannot do?
00:52:50.260 Does anybody think this war is going to end
00:52:52.260 before the end of the year?
00:52:54.760 How many think the war will end
00:52:56.300 before the end of the year?
00:52:57.460 I'm going to say no.
00:53:03.020 Yeah.
00:53:03.580 How many think it will end
00:53:04.920 when a Republican gets elected president?
00:53:10.100 That's when I think it will end.
00:53:12.060 I think it will end
00:53:12.840 when a Republican gets elected.
00:53:16.540 Probably Trump,
00:53:17.680 if it's going to be a Republican.
00:53:19.800 But I have to say that
00:53:20.880 the abortion question,
00:53:24.800 the ruling against the abortion pill,
00:53:27.460 that probably guarantees
00:53:30.080 the destruction of the Republicans
00:53:32.620 in the next election.
00:53:34.400 But like I said,
00:53:35.320 at least they're operating on principle.
00:53:37.680 At least they got that going for them.
00:53:42.160 All right.
00:53:45.260 Yeah, I think Trump would end it.
00:53:46.680 And then Kim Jong-un is testing
00:53:49.100 a drone submarine nuclear launcher.
00:53:54.360 Could that be any scarier?
00:53:56.700 A drone submarine with nuclear capability.
00:54:01.440 Oh, great.
00:54:02.940 Yeah, it's like freaking sharks
00:54:05.040 with freaking lasers on their heads.
00:54:07.720 Do you think Trump could put an end to that?
00:54:12.800 I don't know.
00:54:13.620 He slowed it down before.
00:54:14.920 It looked like it.
00:54:15.660 I really don't know why Kim Jong-un
00:54:18.060 is even building all these resources.
00:54:21.000 Like, I just don't understand
00:54:22.400 how that possibly makes sense
00:54:23.960 for his country.
00:54:25.500 But maybe so.
00:54:30.280 All right.
00:54:33.500 Dictator insurance.
00:54:34.640 Yeah, we need dictator insurance.
00:54:37.060 Yeah, there needs to be a way
00:54:38.260 to retire dictators
00:54:39.380 so that they think
00:54:42.340 they have a way out.
00:54:45.220 You believe North Korea propaganda?
00:54:47.580 Why?
00:54:50.860 All right.
00:54:53.160 Riley Gaines.
00:54:54.240 Yeah, it's just one person.
00:54:56.140 Her situation.
00:54:58.160 What do you think is going to happen
00:54:59.360 with trans athletes
00:55:00.840 in the long run?
00:55:04.420 In the long run,
00:55:05.520 I think the trans female athletes,
00:55:10.720 the ones who are competing
00:55:11.660 on the female teams,
00:55:12.600 I think they'll be banned.
00:55:14.240 In the long run,
00:55:15.200 they'll be banned.
00:55:16.520 That's the only thing
00:55:17.340 that makes sense.
00:55:21.900 All right.
00:55:23.100 So, there was...
00:55:25.180 I think I didn't talk
00:55:26.540 about this yesterday.
00:55:27.800 So, there's some indication
00:55:29.280 from a trans activist
00:55:32.380 who is somebody
00:55:33.480 who transitioned
00:55:34.700 and wished they hadn't.
00:55:36.300 I forget who it is.
00:55:37.880 And their claim is
00:55:40.960 that a lot of the interest
00:55:43.240 in transitioning
00:55:44.060 is coming from TikTok videos.
00:55:46.760 Have you heard that before?
00:55:48.580 Because we keep asking,
00:55:49.980 why are there so many trans
00:55:51.080 all of a sudden?
00:55:52.440 And one theory is
00:55:53.420 that it's all TikTok.
00:55:54.160 That TikTok is glorifying
00:55:56.720 the trans life
00:55:58.000 and it's so influential
00:55:59.500 it's causing people
00:56:00.520 to become trans.
00:56:02.860 Now,
00:56:03.780 could I make a better argument
00:56:06.400 for banning TikTok
00:56:07.760 than this?
00:56:10.240 TikTok,
00:56:12.080 meaning China,
00:56:13.800 yeah,
00:56:14.060 Ollie London
00:56:14.640 was the name
00:56:15.180 of the activist.
00:56:17.740 TikTok is now powerful enough
00:56:19.980 and China owns TikTok.
00:56:22.160 China is now powerful enough
00:56:23.700 through TikTok
00:56:24.200 that they can change
00:56:26.160 the gender
00:56:26.720 of your children.
00:56:29.580 They can change
00:56:30.720 the gender
00:56:31.340 of your children.
00:56:32.580 They can also make
00:56:33.580 your children go
00:56:34.320 from happy to sad
00:56:35.460 and depressed.
00:56:37.400 They can make
00:56:38.100 your children work less,
00:56:39.620 care about their country less,
00:56:41.760 and eat junk food
00:56:43.460 and do all kinds
00:56:44.740 of dangerous shit.
00:56:46.540 And China can do that.
00:56:48.280 Now,
00:56:48.520 you might say to yourself,
00:56:49.480 there's no evidence
00:56:50.360 that China has done that.
00:56:52.680 Well,
00:56:53.160 there's definitely evidence
00:56:54.080 that they don't allow
00:56:54.920 TikTok in their country.
00:56:56.580 It might not be something
00:56:57.780 they have to do.
00:56:59.160 Maybe it's something
00:56:59.860 they simply unleash
00:57:01.160 and then TikTok
00:57:02.640 does what it does
00:57:03.760 and it's all bad impulses,
00:57:05.740 you know,
00:57:06.600 codified.
00:57:07.500 So they say,
00:57:08.620 well,
00:57:09.400 if we did that
00:57:10.140 in our country,
00:57:10.740 it would be the same,
00:57:11.660 so we won't do that
00:57:12.600 in our country.
00:57:14.400 Yeah,
00:57:14.820 I don't think
00:57:15.400 that China has to
00:57:16.460 necessarily push
00:57:17.340 the heat button
00:57:18.080 to make something
00:57:19.240 go viral.
00:57:20.360 I think they just
00:57:21.340 have to wait
00:57:21.980 and TikTok will do
00:57:23.860 all the bad viral
00:57:24.920 stuff itself,
00:57:26.960 you know,
00:57:27.300 because that gets
00:57:27.920 the most clicks.
00:57:28.880 So they don't even
00:57:29.720 have to be actively
00:57:30.620 put their finger
00:57:31.320 on the scales.
00:57:32.240 They simply have
00:57:33.000 to make it available
00:57:33.820 and then not make it
00:57:35.280 available to their
00:57:35.980 own country.
00:57:37.040 That's all it takes
00:57:37.800 and you're done.
00:57:39.720 So yes,
00:57:40.460 TikTok can change
00:57:42.020 the gender
00:57:42.580 of your children.
00:57:44.820 Do you agree
00:57:45.800 with that?
00:57:46.840 Is that a statement
00:57:47.540 in hyperbole?
00:57:48.520 Now,
00:57:48.760 I'm not saying
00:57:49.160 every child
00:57:49.800 every time.
00:57:51.100 I'm saying
00:57:51.600 that from a
00:57:52.320 statistical perspective,
00:57:55.580 TikTok can
00:57:56.460 change the gender
00:57:58.240 of your child.
00:57:59.860 TikTok can make
00:58:00.720 your child
00:58:01.300 remove their genitals.
00:58:04.600 That's a real thing.
00:58:06.700 TikTok is powerful
00:58:07.820 enough to make
00:58:09.160 you remove
00:58:09.700 your own genitals
00:58:10.520 and think you
00:58:11.040 made a good decision.
00:58:12.880 That's a real thing.
00:58:14.820 Which is not to say
00:58:16.120 that some people
00:58:16.900 are not legitimately
00:58:19.080 sexually dysmorphic
00:58:22.500 or whatever the word is.
00:58:23.820 I'm sure there are.
00:58:26.000 But if you look
00:58:28.120 at the uptick
00:58:28.880 in trans,
00:58:29.880 there's something
00:58:30.580 going on.
00:58:31.320 It's not all diet.
00:58:33.340 It's probably
00:58:33.980 social media.
00:58:35.220 It's probably TikTok
00:58:36.120 more than anything else.
00:58:37.200 kids are weak enough
00:58:46.780 to be persuaded
00:58:47.480 by TikTok.
00:58:48.260 Adults are as well.
00:58:50.140 The adults
00:58:50.580 are just as persuaded.
00:58:52.340 I'd love to see
00:58:53.500 how many people
00:58:55.100 transition
00:58:57.200 who are
00:58:58.680 TikTok
00:58:59.100 consumers
00:59:00.520 versus not,
00:59:02.060 especially their parents.
00:59:03.200 how many moms
00:59:05.480 who are on TikTok
00:59:06.180 are supporting
00:59:07.560 their kids'
00:59:08.240 transitions
00:59:08.720 that maybe
00:59:09.420 would not have
00:59:10.040 happened otherwise
00:59:10.780 without TikTok?
00:59:13.580 Probably a lot.
00:59:16.120 Here's the first thing
00:59:17.100 I would do
00:59:17.540 if I heard
00:59:18.060 that a child
00:59:18.820 was considering
00:59:21.000 transitioning
00:59:21.760 and their parents
00:59:23.780 were in favor of it.
00:59:25.780 You know what
00:59:26.140 the first thing I'd do?
00:59:28.020 I would send
00:59:28.760 the mother
00:59:29.220 to a therapist.
00:59:30.360 I'd send
00:59:33.020 the mother
00:59:33.440 for mental health
00:59:34.520 coaching.
00:59:36.200 Which doesn't mean
00:59:37.240 that the mother
00:59:37.660 has a mental health
00:59:38.400 problem.
00:59:39.280 It means that
00:59:40.060 the odds of it
00:59:41.080 being true
00:59:41.880 are so high
00:59:42.740 that you should
00:59:43.920 have them checked out
00:59:44.680 to see if they're
00:59:45.380 a sociopath
00:59:46.080 or a narcissist.
00:59:48.540 So the first thing
00:59:49.600 I'd do is say,
00:59:50.300 okay,
00:59:50.620 your child has,
00:59:51.400 you say,
00:59:52.580 you say your child
00:59:53.480 has this sexual,
00:59:56.420 what is the word?
00:59:57.760 Dysmorphia?
00:59:58.160 Is that the right word?
01:00:00.580 Dysmorphia?
01:00:03.100 And you claim
01:00:04.120 that's true
01:00:04.560 and your child
01:00:05.120 claims that's true.
01:00:05.920 The first question
01:00:06.560 on dysphoria,
01:00:07.900 dysphoria,
01:00:08.660 thank you,
01:00:10.100 sexual dysphoria.
01:00:11.540 The first question
01:00:12.520 I'd ask is,
01:00:13.200 do you use TikTok?
01:00:16.800 And then the second
01:00:17.580 thing I'd say is,
01:00:18.860 all right,
01:00:19.200 we need to put
01:00:19.840 the mother
01:00:20.320 into therapy.
01:00:24.800 And if the mother
01:00:26.900 in therapy works
01:00:28.960 out okay
01:00:29.640 and you find out
01:00:30.580 that the mother
01:00:31.000 is just fine,
01:00:32.120 she just cares
01:00:32.820 about her kid,
01:00:34.220 then you go,
01:00:34.820 okay,
01:00:35.240 now we're going
01:00:35.720 to talk to the kid.
01:00:37.280 Let's talk to the kid now.
01:00:39.020 But I'd talk
01:00:39.660 to the mother first
01:00:40.440 and find out
01:00:41.080 what kind of
01:00:42.580 mental problem
01:00:44.700 she has.
01:00:46.040 Now,
01:00:46.380 the reason I'm not
01:00:47.120 talking to the father
01:00:48.040 first,
01:00:49.000 I don't think
01:00:49.940 the fathers
01:00:50.380 are doing this.
01:00:52.700 Do you?
01:00:53.320 I think the fathers
01:00:55.760 would be mostly
01:00:56.700 wait and see.
01:00:59.240 I think the fathers
01:01:00.220 would wait
01:01:00.680 until you're 18
01:01:01.320 and let you make up
01:01:02.000 your own mind.
01:01:03.700 I think it's the mothers.
01:01:05.260 And I think it's
01:01:05.720 the narcissistic mothers
01:01:06.920 mostly,
01:01:08.500 not entirely,
01:01:09.860 who are signaling,
01:01:11.880 you know,
01:01:12.240 their great awesomeness
01:01:13.200 and open-mindedness
01:01:14.180 and support.
01:01:15.820 So they want to be
01:01:16.720 the most supportive
01:01:17.700 person in the world.
01:01:19.160 state supervision.
01:01:27.200 All right.
01:01:29.160 A sexist would say.
01:01:31.500 Yeah,
01:01:31.840 is it Munchausen
01:01:32.700 by proxy?
01:01:34.200 Not all of it,
01:01:35.920 but some of it
01:01:36.660 could be.
01:01:38.020 I think so.
01:01:40.800 Yeah,
01:01:41.340 related to that.
01:01:42.540 Somewhat related
01:01:43.440 is how I'd say it.
01:01:47.720 All right,
01:01:48.480 ladies and gentlemen,
01:01:49.160 this concludes
01:01:50.400 your Easter Saturday
01:01:52.720 live stream.
01:01:54.160 I'm going to go talk
01:01:54.740 to the locals people.
01:01:56.280 I'm sure there'll be
01:01:56.920 a lot more exciting news
01:01:58.740 next week
01:01:59.680 after Easter,
01:02:01.260 but for now,
01:02:03.120 and not so much,
01:02:04.740 talk to you tomorrow,
01:02:05.740 YouTube.