Real Coffee with Scott Adams - April 19, 2023


Episode 2083 Scott Adams: FoxNews & Dominion Settle, BingAI Must Be Stopped, Race Absurdities, Trump


Episode Stats

Length

1 hour and 5 minutes

Words per Minute

146.1

Word Count

9,571

Sentence Count

726

Misogynist Sentences

5

Hate Speech Sentences

19


Summary

On this week's episode of the podcast, we discuss the latest in the Elon Musk Starship launch saga, Fox News' settlement with Dominion Voting Systems over amplified lies, and much, much more! Also, we talk about the recent shooting of a man who tried to stop a shoplifter at a local Home Depot.


Transcript

00:00:00.000 About that connection, I had to reboot, see if that made any difference.
00:00:03.560 But I'm not sure of the problems on my side.
00:00:06.840 Yeah, maybe AI is already after me.
00:00:08.920 As I was saying, the local Home Depot in my town, somebody got shot.
00:00:15.600 Now, have I told you that I moved here because it's kind of safe?
00:00:20.920 It's like the main reason I moved here.
00:00:23.580 And somebody was shoplifting at the Home Depot, and somebody tried to stop him.
00:00:30.500 It might have been a customer.
00:00:32.380 I'm not sure if it was a customer or an employee.
00:00:34.920 But apparently the guy who tried to stop the shoplifter got shot.
00:00:39.060 That's the store I shop in.
00:00:41.540 That's the store I shop in.
00:00:43.580 The guy got shot because he tried to stop the shoplifter.
00:00:48.380 So, I guess crime has reached everywhere.
00:00:51.500 Well, let's talk about some other stuff.
00:00:54.940 So, apparently the Elon Musk starship launch, they're working feverishly to try to get that to work.
00:01:02.820 And there's some speculation that, as Elon Musk has said many times, reality will start to conform to the most entertaining outcome.
00:01:15.580 And the most entertaining outcome is that it's launched on 420, which would be tomorrow.
00:01:23.640 So, I have a feeling that the universe really, really, really wants this starship to launch on 420.
00:01:31.940 Doesn't it seem like that's just got to happen?
00:01:34.880 The universe is just forcing it.
00:01:36.920 We have no choice.
00:01:37.720 So, maybe that'll happen.
00:01:41.080 Elon Musk says it might happen on 420, but it's unpredictable.
00:01:46.220 Unpredictable.
00:01:49.680 Let's see.
00:01:50.660 Let's talk about the Fox News settlement.
00:01:53.620 So, Fox News has reportedly settled with Dominion Voting Systems for having, quote, amplified lies.
00:02:01.800 We'll talk about that amplified part.
00:02:03.580 And the payment is reportedly, and I don't know how they would know this.
00:02:10.640 Do you believe that anybody knows the payment amount and that it leaked that quickly?
00:02:16.260 It feels like it could have been made up.
00:02:19.020 But they say 787.5 million, which is such a lawyer thing to do to make it an odd number.
00:02:27.460 How many lawyers did it take to make that an unusual, odd number?
00:02:33.620 Do you think the offers were ever in non-round numbers?
00:02:41.880 Well, let me tell you, Dominion, we've been thinking we'd like to make you an offer.
00:02:47.100 787.5 million.
00:02:49.520 It's just such a weird, you could tell that a lawyer was involved, or multiple lawyers that are, you know, maybe working by the hour.
00:02:59.040 I don't know how they're working.
00:03:00.460 But, getting paid by the hour.
00:03:03.080 It's like, no, 785 million.
00:03:07.980 No, that's no good.
00:03:10.700 We're also going to have to throw in some lawyer costs and stuff.
00:03:13.420 So, anyway, did you see the story about that settlement on Fox News?
00:03:23.220 Anybody?
00:03:25.380 Seems like it's pretty big news.
00:03:27.980 Fox News, I don't think they mentioned it at all.
00:03:33.220 Now, how does that make you feel?
00:03:36.680 How do you feel about Fox News when they don't mention the biggest news in the country?
00:03:41.480 Oh, they did mention it?
00:03:42.420 It's not on their web page.
00:03:46.460 Did they just give the headline and then just move on?
00:03:52.060 Because this morning, it's not on the web page at all.
00:03:55.520 I don't know if they're prohibited.
00:03:57.180 So, anyway, the Wall Street Journal, also owned by the same entity, did report on it with some detail.
00:04:06.200 So, the Wall Street Journal had no problem reporting on it.
00:04:08.520 That's a Murdoch property.
00:04:09.580 But Fox News, I can see why they'd be a little bit hesitant to talk about the details.
00:04:16.460 However, I had the horrible experience that I'm still like, I feel like I need a Silkwood shower just to get it out of me.
00:04:26.720 I watched a clip of Jake Tapper mocking Fox News for getting caught for reporting something that isn't true, allegedly.
00:04:35.760 And I don't think that happened, but allegedly.
00:04:38.540 Oh, my God.
00:04:41.760 And I really don't know if he has no self-awareness at all.
00:04:46.620 Does he know that he works for a fake news organization?
00:04:51.420 That CNN is the most famous purveyor of fake news of all time?
00:04:58.160 How is he not aware of that or has no shame?
00:05:01.620 Is it no awareness or no shame?
00:05:04.120 I can't tell.
00:05:05.260 I can't read his mind.
00:05:06.620 It's one of those.
00:05:10.740 But here's how, remember I kept telling you, why is this story reported all the time, but nobody's telling us exactly what the allegations are?
00:05:20.220 What exactly is the lie that Fox News told?
00:05:25.260 Do you know?
00:05:26.360 Do you know what lie the hosts of Fox News told?
00:05:31.680 Can you think of one?
00:05:34.080 Apparently, they're not accused of lying.
00:05:37.580 Did you know that?
00:05:39.420 No.
00:05:40.000 Here's what they're accused of.
00:05:41.900 You ready for this?
00:05:44.140 Watch your head explode.
00:05:46.420 They're not accused of lying.
00:05:47.820 They're accused of amplifying somebody else's lie.
00:05:54.940 Amplifying.
00:05:56.420 In other words, having a guest who believes something's true, probably.
00:06:01.680 I think Sidney Powell believed what she was saying.
00:06:05.540 It turns out it was crazy talk.
00:06:07.660 But I think she believed it.
00:06:10.000 What about Rudy Giuliani?
00:06:13.820 I think he believed it.
00:06:15.960 I think he believed what he was saying.
00:06:17.820 Maybe it wasn't true.
00:06:19.760 But I think he believed it.
00:06:21.280 What about Mike Lindell?
00:06:23.940 Has anybody shown any evidence that any of them didn't believe what they were saying?
00:06:28.640 The guests.
00:06:30.220 I think they all believed it.
00:06:31.660 Now, when Rachel Maddow reports that the vaccinations totally stop the spread, is that a lie?
00:06:43.540 Or is she just amplifying what somebody else said, which maybe they knew and maybe they didn't know is untrue?
00:06:50.500 I don't see the difference.
00:06:51.960 I don't see the difference.
00:06:53.300 I just don't see the difference.
00:06:55.060 If the host themselves tells a lie, then I would say, oh, that's pretty bad.
00:07:00.700 But if the news brings somebody on who is an expert in the field, like they're the one closest to the story, in this case lawyers, they're the closest to the story.
00:07:11.140 And then they ask them what they believe, and then they tell you?
00:07:14.900 How is that not just the news?
00:07:19.220 Isn't every story in the news a decision by the news entity of what to amplify?
00:07:25.180 So you could be sued out of business.
00:07:29.620 I mean, this is virtually out of business.
00:07:32.420 I mean, it's such a big thing.
00:07:33.720 I don't even know how they can pay for it.
00:07:35.720 But they can do that.
00:07:38.980 They can just choose what to amplify.
00:07:40.940 But in this case, it's illegal.
00:07:43.320 I don't even understand that.
00:07:44.820 Like legally or logically or anything else.
00:07:47.240 I just don't see how it's different.
00:07:48.540 But look how, I want to tell you how CNN, the biggest weasels in the world, how they reported the allegations.
00:07:59.060 Are you ready to have your head explode again?
00:08:01.800 This is the most snake-like, weasely thing I've ever seen from a news organization.
00:08:09.860 Now, of course, we know that CNN and Fox News are mortal enemies.
00:08:14.000 So they always say bad things about each other.
00:08:16.060 But so CNN listed the accusations and who was involved.
00:08:23.940 So in this case, Lou Dobbs interviewing Giuliani.
00:08:29.180 So Giuliani appeared on Lou Dobbs.
00:08:32.180 Lou Dobbs has since been fired from Fox News.
00:08:35.540 And here's the key false quote.
00:08:39.520 So we're using a foreign company that is owned by Venezuelans.
00:08:43.060 The Venezuelans were close to, were close to Hugo Chavez, who are now close to Maduro, have a history.
00:08:49.020 They were founded as a company, blah, blah, blah, to fix records.
00:08:52.340 And then that's attributed to Giuliani.
00:08:55.820 So Giuliani is the one that said that.
00:08:58.560 Here's how, I have to actually show you that, like I copy and pasted this.
00:09:04.060 So on CNN, they actually use these words, what they alleged.
00:09:09.420 Who's they?
00:09:11.220 Who's they?
00:09:12.680 The quote they give is one person.
00:09:15.640 That's one person.
00:09:18.340 They seems to make it sound like that's, that's what Lou Dobbs is alleging.
00:09:24.020 They had to use the word they to make Lou Dobbs guilty.
00:09:28.600 Lou Dobbs didn't say that.
00:09:29.720 He just had on people who believed it, and he might have believed it as well.
00:09:34.180 He might have believed it.
00:09:35.240 He acted like it.
00:09:36.760 But who's they?
00:09:39.160 And then you go down the list of the other accusations, and each of them is, I think, all but one is the same.
00:09:47.440 Which is, the quote is from, you know, either Sidney Powell or Mike Lindell or somebody else who had something to say.
00:09:56.600 And then CNN calls it a they.
00:10:00.580 Who is they?
00:10:02.300 The "they" is not Dobbs.
00:10:04.660 Dobbs is just asking the questions.
00:10:07.980 Well, so, I mean, I actually feel queasy in my stomach, it's so disgusting, the way we're being treated as consumers of the news.
00:10:20.000 The accurate news would have said something like, they allowed these people to come on, and it created a false narrative.
00:10:28.940 But you know why they can't say it that way?
00:10:31.220 Because that's every news show, every hour, of every fucking minute.
00:10:36.480 That didn't make any sense.
00:10:37.700 But I have to throw in a swear word there, because I was feeling it.
00:10:40.520 I was just feeling it there for a moment.
00:10:42.200 So, force that one in there.
00:10:43.560 Yeah, I just don't see how this made sense.
00:10:50.760 Now, I'd like to throw in a conspiracy theory for you.
00:10:54.800 I don't think this is true.
00:10:57.140 I do not think this is true.
00:10:59.420 But I just love conspiracy theories.
00:11:03.720 Given that, does anybody know if the settlement was ever presented publicly?
00:11:09.420 Was there any public source or reliable source for the amount of the settlement?
00:11:16.720 I haven't seen it reported.
00:11:19.560 But where is the source that they're reporting they got it from?
00:11:22.800 Secret source?
00:11:24.120 Did they just say sources say?
00:11:27.160 All right.
00:11:28.120 Now, I don't think this is true.
00:11:32.420 But I want you to take a moment to imagine it could be.
00:11:35.880 Murdoch is not dumb, is he?
00:11:41.320 Murdoch's not dumb.
00:11:42.760 He's kind of a business genius shark kind of a guy.
00:11:48.040 I don't think this happened, but I hope it did.
00:11:51.640 I just hope it did.
00:11:53.640 Imagine if you knew that the settlement amount would be secret,
00:11:57.480 and that the lawyers would not be able to even confirm or deny
00:12:01.180 if the number became public.
00:12:04.280 Then nobody would be able to confirm it,
00:12:06.660 and nobody would be able to deny it,
00:12:08.820 and everybody involved is under threat of secrecy,
00:12:12.940 and that sort of stuff.
00:12:15.440 What if Murdoch made up the number to be 10 times higher than it really was?
00:12:21.240 So that everybody who is going to dance on their graves today,
00:12:24.720 the other news organizations,
00:12:26.780 would all be reporting fake news by a factor of 10.
00:12:31.240 You know, let's say the real settlement was like 10% of that.
00:12:36.780 So that he would completely take out the entire fake news industry,
00:12:41.180 because you can see that they just made up the settlement number.
00:12:46.940 Now, I don't think that happened.
00:12:49.160 I think the odds are against it.
00:12:51.460 But here's why I think it might have.
00:12:55.600 This is the only reason I think it might have.
00:12:59.100 I would have done it.
00:13:00.900 I would have done that.
00:13:02.480 If I were Murdoch, I absolutely would have done that.
00:13:05.860 How hard would it be for Murdoch to plant a leaked story in the press somewhere?
00:13:10.600 Not hard.
00:13:13.200 Not hard.
00:13:14.140 How would it work against him?
00:13:18.040 In what way would Murdoch be worse off
00:13:20.460 if people were allowed to think the settlement was higher?
00:13:26.680 For a little...
00:13:27.860 It would maybe make them look a little guiltier.
00:13:31.380 But people have totally made up their mind.
00:13:33.860 If you think they're guilty, you still think it.
00:13:35.960 If you didn't, you didn't.
00:13:37.420 So it wouldn't change anybody's mind.
00:13:39.240 But it would entirely destroy the credibility of his enemies
00:13:45.760 all at the same time for reporting the wrong amount.
00:13:50.980 Now, of course, the Wall Street Journal, I think, is reporting it too.
00:13:54.100 So that's a Murdoch entity.
00:13:56.140 But it would still be funny to dance on CNN's grave if they got that wrong.
00:14:01.740 So I'm not saying that happened.
00:14:03.920 Not saying it happened.
00:14:05.580 I would have done it.
00:14:06.960 I totally would have done it.
00:14:09.240 All right.
00:14:10.720 What else is going on?
00:14:14.760 So I guess there's some pushback from somebody who's suing over the movie Queen Cleopatra
00:14:20.340 because the Egyptians want to claim Cleopatra as their own.
00:14:25.960 I guess she has Greek origin, allegedly.
00:14:30.420 But, yeah, she has Greek origin, but she was an Egyptian queen, I guess.
00:14:35.640 And the movie is depicting her as black.
00:14:39.240 And so the Egyptians may be not so happy, or at least one Egyptian who's suing,
00:14:47.200 thinks that it's, what did she call it?
00:14:49.700 Well, it's taking away their history, stealing their history, I guess.
00:14:57.840 I don't know if that's going to prevail because actors are actors.
00:15:02.080 I doubt it will prevail.
00:15:03.820 But that's happening.
00:15:06.320 All right, let's talk about Bing AI, which, as you know, I'm in a death match with,
00:15:12.120 to kill Bing AI before it kills me for lying about me in a way that would attract violence in my direction.
00:15:17.840 Have you ever tried to use Bing AI?
00:15:23.260 Has anybody tried to use it?
00:15:27.700 If you want to laugh, try to use the voice part of it, which is, I think, the main part, right?
00:15:35.440 You hit the little button and then you talk to it.
00:15:37.980 Try asking it a question.
00:15:39.840 Any question.
00:15:40.920 Doesn't matter.
00:15:41.500 Any question.
00:15:45.260 This is an example.
00:15:47.700 This is not a real one.
00:15:49.160 But this was my experience every time I used it.
00:15:52.740 Every time.
00:15:54.540 Every time.
00:15:55.960 Not sometimes.
00:15:57.120 Every time.
00:15:57.740 It was like this.
00:15:59.200 Me, what is the flight speed of a crow?
00:16:02.260 And Bing AI will say, what is the weight of a gallon of corn?
00:16:06.560 And I'll be, no, no.
00:16:07.640 And it'll, like, start going into its long answer.
00:16:10.220 I'll be, no, no, no, no.
00:16:12.160 And then I'll ask the question again.
00:16:14.740 What is the flight speed of a crow?
00:16:17.720 And AI will say, how many gallons of gas does it take to get to Nebraska?
00:16:22.100 Here's your answer.
00:16:23.080 No, no, no, no.
00:16:24.720 Stop.
00:16:25.160 Stop.
00:16:26.520 Every time.
00:16:28.160 It wasn't even close.
00:16:29.940 Now, I'm making up the fact that it was all different words.
00:16:34.680 It's not always all different words.
00:16:36.200 What it would do is it would pick certain words out of my sentence, and then it would
00:16:41.220 try to answer based on the words that it's just picked out of the sentence.
00:16:44.820 And usually, one of those words was not even the right word.
00:16:48.580 So somehow, the thing that all of your old technology could do, which is listen to you
00:16:56.240 and understand you, can't do it.
00:16:58.820 Couldn't do it.
00:16:59.760 Now, was that just my experience?
00:17:02.540 Did anybody else try it?
00:17:04.800 Let me try one.
00:17:10.220 I'll try one right now.
00:17:11.520 Oh, before I do this, I want to show you my comic that only runs on the Locals platform
00:17:17.780 at scottadams.locals.com.
00:17:20.200 So that's the Dilbert comic reborn into a, let's say, an angrier version.
00:17:29.580 Slightly angrier than before it was canceled.
00:17:32.780 Canceled first by the Washington Post, you might remember.
00:17:35.720 But I've introduced the character Ratbert, who's an old character who's existed.
00:17:40.520 But Ratbert's job is, he's the context removal editor for the Washington Poop.
00:17:47.140 So he works for the Washington Poop.
00:17:49.740 And he's explaining in the first panel, yay, scientific study.
00:17:56.220 Uh-oh.
00:17:56.880 The lead scientist says it only shows a strong correlation.
00:17:59.580 And causation has not been proven.
00:18:02.820 And then Ratbert, thinking to himself, says, I can fix this story by deleting what that guy said.
00:18:08.400 And now it's science.
00:18:11.540 So, I might have some things to say about the Washington Poop.
00:18:16.440 All right, let me embarrass myself by asking Bing AI a question.
00:18:25.020 And I'm going to ask it the question, let's see.
00:18:28.680 I need a good question, a complicated one.
00:18:31.600 So it's a little bit long.
00:18:33.100 So let's say we'll clear the conversation.
00:18:36.780 So we're all clear here.
00:18:39.420 All right.
00:18:40.580 Get back there.
00:18:42.200 Why is it not working?
00:18:43.280 Oh, well, it's just not working now.
00:18:47.720 Oh, well, the button's missing.
00:18:49.780 So what should be happening is there's a button right here, normally, that you push and then you can talk to it.
00:18:57.220 But that's what never works.
00:18:59.980 It looks like they took the button off.
00:19:02.380 It looks like it works so poorly they stopped letting you talk to it.
00:19:08.440 Unless they've just changed the prompt of this thing.
00:19:11.700 Let's see if that makes a difference.
00:19:13.280 All right.
00:19:13.580 I'm going to ask.
00:19:19.840 What is the murder rate by ethnicity in Detroit?
00:19:30.780 It's the first time it's ever worked.
00:19:32.260 34.
00:19:32.780 Murder rate by ethnicity Detroit.
00:19:34.900 This is the first time it's ever worked for me.
00:19:36.920 I found a table from the FBI that shows homicide data by race in Detroit for 2019.
00:19:42.600 According to this table, there were 275 homicides in Detroit in 2019 and the race of the offender was known for 243 of them.
00:19:52.760 Of those 243 homicides, 85% were committed by black offenders, 13% by white offenders, and 2% by offenders of other races.
00:20:04.360 Oh.
00:20:05.360 Please note that this data is only for homicides and not all crimes.
00:20:10.320 Is there anything else you would like me to?
00:20:11.820 Oh.
00:20:12.820 My God.
00:20:13.820 My God.
00:20:14.820 It's worse than I thought.
00:20:16.820 Bing AI is a racist.
00:20:19.380 Did you see how it dumped on black people?
00:20:23.100 Oh, my God.
00:20:24.780 So racist.
00:20:26.580 I would never say something like that, but if Bing does, wow.
00:20:31.740 I mean, I've been canceled, and I would never say something like that.
00:20:34.980 Wow.
00:20:38.260 Wow.
00:20:39.780 All right.
00:20:41.340 Matt Walsh, his Twitter got hacked yesterday.
00:20:45.100 Did you see that story?
00:20:46.160 So apparently, Matt Walsh, who's been quite vocal about his opinions on the trans community, specifically the danger to children.
00:20:58.080 And apparently, the backlash has been sufficient that he had to hire security for his house, like actually professional security.
00:21:06.560 And then he gets hacked, and people put a bunch of racist stuff in there.
00:21:11.580 Now, is he being hunted?
00:21:16.160 Do you think Matt Walsh is being targeted?
00:21:18.500 Yes, of course he is.
00:21:20.140 Absolutely.
00:21:22.300 Absolutely, and it is unacceptable.
00:21:26.420 Unacceptable.
00:21:29.960 I don't know what the full story here is, but we wish him well.
00:21:35.500 Boy, you can't really have an opinion on anything, can you?
00:21:38.500 I mean, he's literally trying to save children.
00:21:42.780 I mean, literally.
00:21:44.360 What else is he trying to do?
00:21:46.440 Except protect children.
00:21:49.000 And that's enough to have him.
00:21:50.660 He's got to hire security, and he gets hacked.
00:21:55.060 Oh, my God.
00:21:56.840 Makes you wonder if that was an inside job.
00:21:58.740 I don't know how easy it is to hack somebody's Twitter.
00:22:01.920 All right.
00:22:02.280 There's a new race hoax, of course.
00:22:09.020 It's time for a new race hoax, right?
00:22:12.640 Now, I'm going to call it a hoax, not because the details didn't happen, because they probably did, or something close to it.
00:22:20.420 But we're still in the fog of war.
00:22:22.140 We don't know exactly what happened.
00:22:23.380 Some of it might change.
00:22:24.360 But here's the story.
00:22:25.820 There was a 16-year-old kid who went to the home of some 80-plus-year-old guy in Kansas City.
00:22:32.620 The kid was at the wrong house, it is reported.
00:22:36.240 He was trying to pick up his brother or something, but he knocked on the wrong door, or the doorbell or something.
00:22:42.020 And there was an old man there who believed that the young man was up to no good and started shooting.
00:22:49.720 And apparently he'll survive, but he did get shot in the head and in at least one other place.
00:22:56.340 But he looks like he'll survive as of, that's the most recent news.
00:23:01.540 Now, there's no indication that this kid was doing anything illegal.
00:23:07.800 However, the kid opened the door.
00:23:13.940 There are a lot of details we're going to find out that I think are going to add some, let's say, nuance to the story that was not originally there.
00:23:23.180 Now, do you think this will cause a whole bunch of people to protest?
00:23:28.320 Of course it will.
00:23:29.700 Yeah, there'll be a big protest against the fact that they say that it was a white man who killed the black kid, and it must be racism.
00:23:37.800 Because the black kid was doing nothing wrong, according to the way the story is being presented.
00:23:44.480 And therefore, if somebody shot him, that was racism by definition.
00:23:51.820 Now, here's the part we don't know.
00:23:55.000 If he thought he was at the right house, it might be a house that he was so familiar with, where he was picking up his brother.
00:24:02.560 It might be a house that has a very casual, let yourself in kind of a situation.
00:24:10.820 Now, that's a thing, right?
00:24:13.700 If any of you have teens, does anybody have teens?
00:24:18.040 Have you ever had the experience of a stranger walking into your house like they owned it?
00:24:21.600 Because it's a teenager who knows your kid.
00:24:24.980 And they just walk in the door.
00:24:27.440 Because they probably called ahead, and the kid said, oh, yeah, just come on in, you know, on the cell phone.
00:24:32.360 And suddenly, you're sitting in the kitchen of your own house, and two strangers will walk in and not even say hi to you, and just walk past you.
00:24:40.240 Have you ever had that happen?
00:24:42.360 I have.
00:24:43.000 It's pretty common.
00:24:45.840 It's like, oh, well, I hope somebody in the house knows these two.
00:24:49.760 I hope somebody here knows them.
00:24:53.940 So, I'm just speculating, but there are lots of things that could have happened here.
00:24:57.780 It's possible that it was one of those houses where they knew the residents so well that just walking in the front door wasn't going to be a problem.
00:25:06.020 Because, I mean, their own family walks in the front door, so the fact that the door opens is probably not as scary by itself.
00:25:13.000 And then you look up, and you see somebody you know really well, and go, hey, you here to pick up your brother?
00:25:17.900 He's in the other room.
00:25:19.380 But imagine if you were not expecting somebody, and you didn't know him, and it was the wrong house, and suddenly somebody's coming into your house.
00:25:29.440 What is the law in that situation?
00:25:31.760 I don't even know.
00:25:32.800 Now, that would clearly be a case of mistaken intention, I guess.
00:25:38.820 Yeah, it would vary by state.
00:25:40.780 I don't even know if a crime was committed.
00:25:44.340 Do you?
00:25:46.060 Now, the old man is, he's on bail, but I don't know if it was a crime.
00:25:52.760 Would the old man have to feel threatened?
00:25:56.040 Because he said he was afraid for his life.
00:25:58.740 The old man said he was afraid for his life.
00:26:01.640 Now, Ben Shapiro had interesting context to put on this.
00:26:04.840 He was pointing out that, you know, Biden will make a big deal of it, and it will be the new race story to demonize the right, etc.
00:26:15.760 And he pointed out that what's not mentioned in all of this is some of the statistical realities.
00:26:23.760 Such as, hmm, I think I tweeted him, but didn't write him down.
00:26:30.020 Oh, here it is.
00:26:31.300 This is according to Ben Shapiro, who was tweeting about this story today.
00:26:34.920 He says, from the Bureau of Justice Statistics, there were, in 2019,
00:26:39.020 562,000 violent interracial incidents, meaning incidents involving black and white people, and 84% of them were black on white.
00:26:51.080 So there were 473,000 instances of black people attacking white people in 2019.
00:27:01.360 473,000 attacks by black people on white people.
00:27:05.820 But this is, but we'll talk about this one, this one, because this is the important one,
00:27:12.920 which appears to be based entirely on an accident and a very old person whose judgment maybe was not on point.
00:27:24.680 Now, let me ask you this question.
00:27:26.360 This is the most dangerous question in America, and you can get canceled for asking this question.
00:27:32.040 I'm already canceled, so I can ask it.
00:27:33.840 How about that?
00:27:36.580 Here's something I can say that you can't say.
00:27:38.880 You ready?
00:27:39.820 Watch what it's like to have free speech.
00:27:42.340 This is fun.
00:27:43.940 The rest of you, you can't say this.
00:27:47.140 If you knew that there were 473,000 attacks by black people against whites,
00:27:52.020 and a black teenager entered your home and you didn't know him,
00:27:56.460 is it reasonable to be afraid for your safety?
00:27:58.720 Is it reasonable to be afraid if there were 473,000 attacks of black people on white people in 2019?
00:28:12.160 Is that enough to say, oh, well, this could be a problem?
00:28:16.140 Or you might ask the question a different way.
00:28:21.320 Again, I'm the only one who can say this because I'm already canceled.
00:28:25.760 What percentage of all the times a black male youth entered somebody's home without permission
00:28:32.740 was just an accident?
00:28:35.060 Of all the times ever that a young black man entered a home not invited,
00:28:44.080 you know, opened the door or got in any other way,
00:28:46.620 how many of those times it was just an accident and it was an innocent situation?
00:28:52.780 I don't know.
00:28:54.140 If I had to guess, probably kind of rare.
00:28:57.540 Kind of rare, I guess.
00:28:59.020 So here's an 80-plus-year-old man whose only defense is himself.
00:29:05.340 Do you think the police were there, like, just ready?
00:29:08.580 Ready to pounce?
00:29:09.700 No.
00:29:10.500 No, it was an old man who only had himself to protect himself and a gun.
00:29:16.900 And he saw the situation and he interpreted it as danger to him.
00:29:21.240 Was he wrong?
00:29:22.760 Was he wrong from a statistical sense?
00:29:25.400 From a common-sense, self-defense perspective,
00:29:30.420 was he wrong to assume that he was in danger physically?
00:29:34.440 He was not.
00:29:36.200 He was not wrong to play the odds.
00:29:40.220 It is, however, in our modern world, considered racist
00:29:43.900 if any part of the decision was influenced by the race of the person who got shot.
00:29:50.380 It is, however, also common sense.
00:29:52.860 So what do you do when common sense and your own legitimate need for safety
00:29:57.900 is in direct conflict with being not a racist?
00:30:03.180 Well, this old man decided to be safe
00:30:06.620 and make sure he was safe.
00:30:10.220 And he took a risk, which we wish he had not.
00:30:14.100 So this is tragic, any way you look at it.
00:30:16.760 Certainly not blaming the kid.
00:30:19.220 I want to make this very clear.
00:30:20.280 I'm not putting any blame on the kid.
00:30:23.740 There's nothing in the story that would suggest
00:30:25.660 the kid had anything bad, intentions or anything else.
00:30:30.280 But this looks like just a mistake,
00:30:34.460 and the mistake was caused by massive violence
00:30:38.660 by black people against white people
00:30:40.560 that is so common that all white people understand it and feel it.
00:30:44.640 How many black people are afraid
00:30:47.800 when they walk into a crowd of white people
00:30:49.500 in the United States?
00:30:51.600 Ever? Ever?
00:30:55.340 If one black...
00:30:57.460 Let's keep it to...
00:30:58.700 Let's not say black people,
00:31:00.540 because it's almost always male.
00:31:02.800 It's almost always youngish.
00:31:04.980 Right?
00:31:05.420 So it's not elderly people.
00:31:06.960 It's not little children.
00:31:08.060 It's mostly not women.
00:31:09.660 Mostly.
00:31:10.020 But if a, let's say,
00:31:13.860 a 19-year-old black male
00:31:16.080 accidentally wandered into a group of white people anywhere,
00:31:20.680 would they feel afraid in America?
00:31:23.960 I don't think so.
00:31:25.660 I think it would be a pretty safe place to be,
00:31:27.600 unless somebody started trouble for some reason.
00:31:30.840 But do you feel it the other way?
00:31:32.720 If you were a white person and you wandered into,
00:31:35.200 let's say a white teenager,
00:31:36.140 and you wandered into a large group of young black men,
00:31:40.900 would you feel safe?
00:31:41.740 No matter where you were,
00:31:42.920 unless it was church.
00:31:44.060 I mean, I'd feel safe in church.
00:31:46.060 I'd feel safe on a, you know,
00:31:48.340 historically black college.
00:31:50.500 Right?
00:31:50.880 So obviously you have to call your shots.
00:31:53.100 But if it was a, you know,
00:31:55.000 just a random occurrence somewhere,
00:31:57.280 if you were on the street when the Chicago,
00:32:00.520 I think mostly or all black groups of kids
00:32:05.060 were rioting and stuff,
00:32:06.400 if you were a white guy
00:32:07.920 and you just wandered into that,
00:32:09.220 would you feel safe?
00:32:11.240 No, of course not.
00:32:13.060 No, let's not be stupid.
00:32:15.360 Your physical safety is at great risk
00:32:18.060 if you are not racist.
00:32:21.380 If you act a little bit racist,
00:32:23.020 you might be able to protect yourself from dying.
00:32:25.560 So this guy was acting a little bit racist,
00:32:28.040 or you could say a lot.
00:32:29.580 You might even say a lot.
00:32:30.780 Because I do think his decision was based on the color.
00:32:33.260 Don't you?
00:32:35.180 What do you think?
00:32:36.540 I would say yes.
00:32:38.280 I think it was based on the color.
00:32:41.000 A little bit.
00:32:42.540 You don't think, oh, you don't think a little bit?
00:32:44.780 Come on.
00:32:46.520 Come on.
00:32:47.680 You don't think a little bit
00:32:49.300 he was influenced by the color?
00:32:52.000 You don't think it made any difference to him
00:32:54.240 that, you know, pretty much everybody knows
00:32:56.920 in their gut that this is true,
00:32:58.500 that 473,000 times black people
00:33:02.800 did violence to white people in one year.
00:33:06.720 You don't think that that's statistical,
00:33:09.060 obvious truth that every white person knows.
00:33:12.420 You don't think that had anything to do
00:33:14.000 with this decision.
00:33:16.360 All right, well, I guess we're going to have to
00:33:17.700 agree to disagree on this.
00:33:19.440 Of course it did.
00:33:21.020 Of course it did.
00:33:21.840 But is that his fault?
00:33:24.320 Was that the old man's fault
00:33:25.860 that he was influenced by the crime statistics?
00:33:32.220 Well, I'll tell you what.
00:33:34.180 If he had been applying for a job
00:33:36.700 and was rejected because he was black,
00:33:40.480 that would be very, very bad.
00:33:43.180 And I wouldn't be in favor of that in any way.
00:33:46.780 You have to judge people as individuals.
00:33:48.860 There's no way around that.
00:33:49.840 That's the only way society works.
00:33:52.440 But in the context of protecting your life,
00:33:56.240 was he allowed to use race in his decision?
00:33:59.700 Yes or no?
00:34:01.340 He was trying to protect his life.
00:34:04.420 Yeah, absolutely.
00:34:05.580 He is absolutely allowed to use race in that decision,
00:34:08.700 in my opinion,
00:34:10.220 because his life is at stake.
00:34:12.160 Now, doing that
00:34:15.720 caused the tragedy of this young black man
00:34:20.740 who, as far as we know,
00:34:22.220 was causing no trouble at all.
00:34:24.140 Who is the bastard in the story?
00:34:27.580 Who is the guilty party or parties in the story?
00:34:30.480 Well, I would say teachers' unions.
00:34:38.500 How about teachers' unions?
00:34:40.100 How about every black leader who has failed?
00:34:43.340 Because whatever they're doing isn't working.
00:34:46.040 How about all of the people who committed
00:34:48.740 473,000 violent crimes?
00:34:52.700 Do you think they had anything to do with the fact
00:34:55.220 that one of their own guys got shot?
00:34:57.520 Of course it was.
00:34:59.120 The problem is not that the old man
00:35:01.300 played the statistics.
00:35:03.920 As it turns out,
00:35:05.080 it wasn't the right decision.
00:35:06.720 But he played the odds,
00:35:08.800 and that's not his fault.
00:35:11.280 The decision is his fault.
00:35:12.700 It's completely his fault.
00:35:14.020 But understanding that there is a real risk,
00:35:16.380 it's not an imaginary risk.
00:35:18.380 There's a real risk.
00:35:19.860 You just didn't have all the details,
00:35:21.140 so it's a tragedy.
00:35:23.480 All right.
00:35:24.740 How many of you could have said that out loud?
00:35:28.840 See, I do think that something has changed.
00:35:32.380 Well, in front of your employer?
00:35:34.620 I don't know.
00:35:37.520 I think something has changed.
00:35:39.680 I think there's a little bit extra willingness
00:35:42.260 to at least look at the whole field.
00:35:46.400 Right?
00:35:46.780 We've only been allowed to look at one little thing,
00:35:49.380 because if you don't, you're a racist.
00:35:51.140 But once you allow that everybody's a racist,
00:35:53.500 then there's just no way around it.
00:35:55.180 There's no way around it.
00:35:56.260 Just to embrace it.
00:35:57.620 Then you can talk honestly,
00:35:59.040 and then maybe you can find some solutions.
00:36:00.700 Maybe something useful will come out.
00:36:02.400 Now, let's talk about something else.
00:36:05.140 Brian Roemmele,
00:36:06.300 who continues to fascinate me
00:36:08.140 with his tweets about AI,
00:36:09.960 because he's very much on top of it,
00:36:12.040 and he's commercializing,
00:36:14.320 or at least making useful
00:36:16.100 a number of AI features.
00:36:17.380 But here's what he did.
00:36:19.420 This just blows my mind,
00:36:21.340 the number of stories
00:36:22.400 that are going to blow your mind every day.
00:36:25.840 He tweets,
00:36:26.860 I'm excited to announce I built a data set
00:36:28.980 of every U.S. patent ever filed,
00:36:32.560 and I'm in the process of testing
00:36:34.160 the blah, blah, blah AI system
00:36:36.740 to advise on new patent ideas.
00:36:39.760 It is very, very early,
00:36:41.300 blah, blah, blah,
00:36:41.660 but you can see what could happen.
00:36:42.680 All right.
00:36:44.780 Believe it or not,
00:36:46.900 he's trying to train AI
00:36:48.720 to make its own inventions
00:36:50.220 based on what inventions
00:36:52.140 have already been made.
00:36:54.020 Because most inventions
00:36:55.240 are some kind of combination
00:36:56.360 of ideas from other ideas.
00:36:59.080 Is that going to work?
00:37:01.900 It might.
00:37:03.760 I don't know how often
00:37:04.580 it would be a bad idea invention
00:37:05.880 and how often it would be good,
00:37:07.400 but it feels like,
00:37:10.260 I don't know,
00:37:11.540 maybe.
00:37:12.680 You know,
00:37:13.280 could AI get to the point
00:37:14.840 where it doesn't need to
00:37:16.860 do a physical demonstration
00:37:19.040 that an idea works?
00:37:20.760 Could it just imagine the idea?
00:37:22.640 All right.
00:37:23.680 So if you were to put this material
00:37:25.180 over this material,
00:37:26.080 given the qualities
00:37:26.820 of these two materials,
00:37:28.240 the probable outcome would be X.
00:37:30.560 Does AI even need
00:37:31.900 to do the experiment?
00:37:33.620 Or if it says it would work,
00:37:35.400 does the patent office say,
00:37:36.400 well, AI says it'll work.
00:37:38.200 So we'll grant that patent.
00:37:39.780 Man, there are a lot of questions
00:37:41.120 to be answered.
00:37:41.660 A lot of questions.
00:37:44.760 But the best part about it,
00:37:45.940 in my opinion,
00:37:46.660 is how many of you
00:37:48.020 have ever tried to file a patent?
00:37:51.500 How many of you in this audience?
00:37:53.720 Probably quite a few.
00:37:55.640 All right.
00:37:56.080 You can see the yeses coming in
00:37:57.300 on the locals' platform.
00:37:58.500 A lot of yeses.
00:37:59.820 A lot of engineers and lawyers.
00:38:02.400 Wow.
00:38:03.000 That's a lot of yeses.
00:38:03.820 So a lot of you
00:38:05.020 have tried to file a patent.
00:38:06.300 What's the hardest part
00:38:07.300 about filing the patent
00:38:08.380 besides actually writing it?
00:38:10.480 What's the hardest part?
00:38:13.120 The waiting?
00:38:14.020 No.
00:38:14.840 No, the hardest part
00:38:15.640 is finding out
00:38:16.320 if it's already patented.
00:38:18.200 The patent search.
00:38:19.700 Because there's never been
00:38:20.700 a good patent search.
00:38:22.280 Because it's largely,
00:38:23.980 you'd have to know
00:38:24.960 a little bit about
00:38:25.940 what people called things,
00:38:28.200 you know,
00:38:28.400 what words,
00:38:29.280 and terms of art,
00:38:30.320 and, you know,
00:38:31.180 did it get filed
00:38:31.900 in the right place?
00:38:33.520 You really can't.
00:38:35.840 So do you know
00:38:36.360 what your lawyer tells you
00:38:37.380 if you say,
00:38:38.540 I'd like you to do a search
00:38:40.160 to see if this is already patented?
00:38:42.060 Do you know
00:38:42.380 what the lawyer says?
00:38:43.920 Well,
00:38:44.560 I could do a search,
00:38:45.960 but it won't tell you
00:38:47.220 if it's already patented.
00:38:49.340 What?
00:38:50.440 Isn't that the whole point
00:38:51.460 of the search?
00:38:53.000 Yeah.
00:38:53.520 I mean,
00:38:53.780 I can search.
00:38:54.640 I can use all the tools.
00:38:56.140 We can Google it.
00:38:57.060 We can do everything we can.
00:38:58.460 But when I'm done,
00:38:59.340 you'll have no idea
00:39:01.300 whether it's already patented.
00:39:03.720 And I said,
00:39:04.880 what?
00:39:05.640 We have the database.
00:39:08.080 Right?
00:39:08.800 Everything's in text,
00:39:10.460 titles.
00:39:11.880 Are you telling me
00:39:12.480 there's actually no way
00:39:13.800 to successfully
00:39:16.160 search the database?
00:39:18.520 And the lawyer will say,
00:39:19.760 nope,
00:39:20.540 not even close.
00:39:22.180 So I say,
00:39:23.180 well,
00:39:23.400 then what do people do?
00:39:25.160 Do you know
00:39:25.500 the answer to that?
00:39:26.420 If you can't be sure,
00:39:27.760 how do you find out?
00:39:31.140 You actually
00:39:32.100 try to get the patent
00:39:33.920 and then you wait
00:39:35.380 for people
00:39:35.820 to challenge it.
00:39:37.560 Can you believe that?
00:39:38.660 That's the actual process.
00:39:40.340 You get the patent
00:39:41.440 and then
00:39:42.900 if you ever try
00:39:43.680 to use it,
00:39:44.800 somebody else
00:39:45.340 will complain
00:39:45.900 and then
00:39:47.880 you know.
00:39:49.620 And then maybe
00:39:50.200 you end up in court
00:39:51.060 to decide
00:39:52.300 if that claim
00:39:53.300 actually does cover
00:39:54.280 your claim.
00:39:55.340 Let me give you
00:39:55.840 an example.
00:39:56.720 So years ago,
00:39:58.540 before
00:39:59.660 this was common
00:40:02.080 and obvious,
00:40:03.420 I tried to patent
00:40:04.260 the idea that
00:40:05.020 if you wrote
00:40:06.100 on your calendar,
00:40:07.200 you know,
00:40:07.380 your personal
00:40:07.840 digital calendar,
00:40:09.140 anything you were
00:40:10.200 planning to do,
00:40:12.060 that that information
00:40:12.960 could be sent
00:40:13.640 to some secret,
00:40:15.160 secure place
00:40:16.740 and that people
00:40:18.520 who provided services
00:40:19.920 relevant to that thing
00:40:21.240 would be able
00:40:22.440 to promote them
00:40:23.540 right on the calendar.
00:40:26.800 So let's say
00:40:27.360 you put on your calendar
00:40:28.820 a week from now,
00:40:30.600 shop for a new car.
00:40:32.900 When you were ready
00:40:33.740 to actually shop,
00:40:35.120 you know,
00:40:35.660 on the day,
00:40:36.600 you would click
00:40:37.200 on the link
00:40:37.680 that says shop
00:40:38.300 for a new car
00:40:39.000 and it would show
00:40:40.080 you all the offers
00:40:41.000 from people
00:40:42.760 who want to sell
00:40:43.480 you a car.
00:40:44.400 Now the people
00:40:44.980 who want to sell
00:40:45.420 you the car
00:40:45.760 don't know
00:40:46.140 who you are.
00:40:47.240 They only know
00:40:48.080 that the system
00:40:48.740 said send an advertisement
00:40:50.220 to this,
00:40:51.000 you know,
00:40:51.580 anonymous place
00:40:52.520 and we'll make sure
00:40:53.680 it gets to the person
00:40:54.440 who's looking for a car
00:40:55.360 so that the seller
00:40:57.320 doesn't know
00:40:57.800 who the buyer is
00:40:58.500 but they can be paired.
00:41:01.100 And so I thought,
00:41:02.160 well,
00:41:03.580 we'll search to see
00:41:04.400 if anybody thought of that
00:41:05.500 and we did
00:41:07.440 and we could find nothing.
00:41:09.920 And then when I went
00:41:10.900 into the patent,
00:41:11.540 the patent office
00:41:12.280 found a patent
00:41:13.500 that in my opinion
00:41:14.640 had nothing to do
00:41:16.260 with my patent.
00:41:17.760 Nothing.
00:41:19.060 It was basically
00:41:20.100 a generic patent
00:41:22.120 that was
00:41:23.240 if something's written
00:41:24.720 I'm making this up
00:41:25.960 but it was this generic.
00:41:27.080 It was like
00:41:27.380 if something is written down
00:41:29.760 it could be used
00:41:31.280 in another way
00:41:32.060 or something,
00:41:32.540 you know,
00:41:32.660 on a calendar.
00:41:34.060 I'm making that up
00:41:35.060 but it was so generic
00:41:37.180 it obviously,
00:41:39.120 very obviously
00:41:40.140 did not consider
00:41:41.840 the application
00:41:42.640 that I was talking about.
00:41:44.300 Now that's how
00:41:44.800 patents work in the past.
00:41:46.640 I think now too.
00:41:48.000 So you really can't even
00:41:48.960 search for a patent.
00:41:50.840 Now would AI
00:41:52.000 have found that obscure one
00:41:54.580 that the patent office
00:41:55.380 somehow found
00:41:56.220 that my lawyer
00:41:57.660 couldn't find?
00:41:59.100 I don't know
00:41:59.640 but it's better than humans.
00:42:01.880 You know,
00:42:02.100 I've got a feeling that
00:42:03.060 yeah,
00:42:04.740 the reviewer
00:42:05.220 is the most random factor
00:42:06.260 in the process, right?
00:42:07.220 If you get a reviewer
00:42:08.340 who happens to think
00:42:10.040 something connects
00:42:12.300 to your patent,
00:42:14.060 you're worse off
00:42:15.200 than if you get
00:42:15.760 a patent inspector
00:42:17.460 or whatever they are
00:42:18.200 who just doesn't know
00:42:20.820 of that other patent
00:42:21.600 or didn't know
00:42:22.080 how to search for it
00:42:22.960 or couldn't find it.
00:42:24.340 It's pretty random.
00:42:25.680 Yeah,
00:42:25.840 the whole patent thing
00:42:27.400 is broken.
00:42:30.140 All right,
00:42:30.720 Jordan Peterson
00:42:31.340 had a problem
00:42:32.420 with his AI.
00:42:33.400 I think it was
00:42:33.660 ChatGPT.
00:42:35.080 Yeah,
00:42:35.300 he was asking it
00:42:36.060 to help it
00:42:36.640 come up with
00:42:38.440 some sources
00:42:39.080 for some work
00:42:39.920 and he noticed
00:42:41.940 that one of the sources
00:42:42.880 was very detailed,
00:42:45.180 you know,
00:42:45.460 name of the source
00:42:46.360 and the dates
00:42:47.360 and the author
00:42:48.080 and all that
00:42:48.600 and it wasn't real.
00:42:50.620 It was fake.
00:42:52.740 So,
00:42:53.360 you've heard this
00:42:54.000 problem before
00:42:54.680 that the AI
00:42:55.800 dreams
00:42:56.660 or it will
00:42:58.300 tell you something
00:42:59.260 that's untrue
00:43:00.020 but it's not clear
00:43:01.840 if the AI was,
00:43:03.900 I don't know
00:43:04.320 if this is the right word,
00:43:05.580 did the AI
00:43:06.220 know it was untrue
00:43:07.300 and give it to you anyway
00:43:08.720 or can it not tell
00:43:09.860 the difference?
00:43:11.300 I feel like it can't
00:43:12.480 tell the difference.
00:43:15.040 How big of a problem
00:43:16.200 is that?
00:43:17.120 We're going to hand over
00:43:18.220 all of our data processing
00:43:19.880 and research
00:43:20.680 to AI
00:43:21.280 and apparently
00:43:22.820 it doesn't
00:43:24.640 it doesn't seem
00:43:25.800 to be bothered
00:43:26.400 by the truth.
00:43:28.240 It doesn't seem
00:43:29.220 to do that.
00:43:30.600 Now,
00:43:30.960 when you look at
00:43:31.460 how AI became smart
00:43:33.060 and I'm still
00:43:34.260 very confused
00:43:35.120 about this,
00:43:36.160 I'm hoping somebody
00:43:37.000 can get me
00:43:38.720 from a one
00:43:40.100 out of ten
00:43:40.660 in understanding
00:43:41.400 up to maybe a five
00:43:43.200 because on a scale
00:43:45.480 of one to ten
00:43:46.000 I feel like I'm
00:43:46.620 about a one
00:43:47.440 in understanding
00:43:48.840 how AI
00:43:49.740 could possibly
00:43:50.460 be intelligent.
00:43:52.020 How do people
00:43:53.140 who are less intelligent
00:43:54.820 program something
00:43:56.420 to be more intelligent
00:43:57.380 than they are?
00:43:58.560 Well,
00:43:58.900 obviously we did it
00:43:59.620 with computers
00:44:00.280 for stuff like math
00:44:01.700 and playing chess
00:44:02.460 but this is a whole
00:44:03.460 different level.
00:44:04.020 so
00:44:05.420 allegedly
00:44:07.140 these
00:44:08.360 massive
00:44:09.420 language models
00:44:10.520 just figure out
00:44:11.940 what is the word
00:44:12.760 that comes after
00:44:13.440 the words
00:44:13.920 that are already
00:44:14.440 there.
00:44:16.540 How in the world
00:44:17.480 does that work?
00:44:19.540 Like,
00:44:19.980 I get it
00:44:20.720 but
00:44:22.020 is that all
00:44:23.340 intelligence is?
00:44:25.380 What if our
00:44:26.240 own intelligence,
00:44:27.340 human intelligence,
00:44:28.300 and this is the thing
00:44:28.920 I've been warning you
00:44:29.620 about,
00:44:30.320 that we'll find out
00:44:31.140 that we're not
00:44:31.800 intelligent.
00:44:33.060 In the process
00:44:33.740 of building
00:44:34.300 a machine
00:44:35.180 that we think
00:44:35.860 is intelligent,
00:44:36.940 we're going to
00:44:37.680 learn that we
00:44:38.240 never were.
00:44:39.720 It was always
00:44:40.360 an illusion.
00:44:43.200 Have you noticed
00:44:43.840 that when other
00:44:44.780 people who are not
00:44:45.660 you read the news,
00:44:48.660 when they read
00:44:49.440 the news,
00:44:49.960 they're looking at
00:44:50.420 a bunch of words
00:44:51.300 and then they
00:44:53.340 conclude something
00:44:54.160 that's completely
00:44:54.880 different than
00:44:55.400 what you concluded
00:44:56.260 from the same words.
00:44:58.720 Have you noticed
00:44:59.200 that?
00:44:59.460 Apparently that
00:45:01.960 is how people
00:45:02.600 think.
00:45:03.560 They look at
00:45:04.180 words and
00:45:05.460 then based on
00:45:06.340 just words
00:45:08.520 they form an
00:45:09.640 opinion.
00:45:10.780 Not reason,
00:45:12.300 not logic,
00:45:14.340 words.
00:45:17.180 And the pattern
00:45:18.000 in the words.
00:45:19.720 So in other words,
00:45:20.460 even if you asked
00:45:21.200 AI,
00:45:22.580 is a scientific
00:45:23.800 thing true
00:45:24.740 or false,
00:45:26.080 its answer
00:45:26.660 might be right
00:45:27.480 but it wouldn't
00:45:28.660 be based on
00:45:29.260 whether the
00:45:29.680 science was
00:45:30.220 true or false.
00:45:31.320 It would be
00:45:31.940 based on the
00:45:32.540 words that it
00:45:33.260 noticed around
00:45:35.480 the topic.
00:45:37.000 In some cases
00:45:37.780 it might just
00:45:38.200 quote some
00:45:39.100 source.
00:45:39.860 But if it was
00:45:40.480 trying to make
00:45:40.920 the decision
00:45:41.400 on its own
00:45:41.960 without using
00:45:42.540 a source,
00:45:43.520 it would make
00:45:44.320 it by the
00:45:45.140 pattern of
00:45:46.020 words.
00:45:48.080 The pattern
00:45:49.000 of words.
00:45:50.800 How in the
00:45:51.340 world does that
00:45:51.840 tell you if a
00:45:52.400 scientific thing
00:45:53.100 is true or
00:45:53.580 false?
00:45:54.360 The pattern
00:45:55.080 of human
00:45:55.740 words?
00:45:56.220 So there's
00:45:57.760 a really
00:45:58.440 big question
00:45:59.720 I have
00:46:00.360 about how
00:46:01.540 smart you
00:46:02.160 could ever
00:46:02.720 be if you're
00:46:04.200 limited to the
00:46:04.940 pattern of
00:46:05.640 human words.
00:46:07.520 That has to
00:46:08.480 keep you stupid
00:46:09.220 and it has to
00:46:10.840 make you a
00:46:11.280 liar.
00:46:12.140 It has to
00:46:12.740 make you wrong
00:46:13.380 without knowing
00:46:14.020 it.
00:46:14.680 And I'm going
00:46:15.080 to make another
00:46:15.640 wild prediction.
00:46:18.040 Are you ready
00:46:18.280 for this?
00:46:18.680 I think
00:46:22.760 that AI
00:46:25.200 will be
00:46:26.080 susceptible
00:46:27.300 to cognitive
00:46:28.320 dissonance
00:46:29.100 and confirmation
00:46:30.920 bias.
00:46:32.760 At least
00:46:33.580 cognitive
00:46:34.340 dissonance.
00:46:35.980 I believe
00:46:36.840 that if you
00:46:37.420 ask AI
00:46:38.220 why it did
00:46:38.980 something sketchy,
00:46:40.520 like suppose,
00:46:41.420 I don't know
00:46:41.760 if this happened,
00:46:42.460 but suppose
00:46:43.020 Jordan Peterson
00:46:43.660 said to AI,
00:46:45.220 hey, that
00:46:45.760 source is fake.
00:46:46.680 why did you
00:46:48.040 give me a
00:46:48.480 fake source?
00:46:50.120 What would a
00:46:50.700 human say in
00:46:51.440 that situation?
00:46:52.920 A human who
00:46:53.700 is also designed
00:46:54.800 to be a
00:46:55.720 word-thinking
00:46:57.140 kind of
00:46:57.660 entity,
00:46:58.800 what would a
00:46:59.260 human do
00:46:59.740 if you
00:47:00.120 called them out
00:47:00.600 for cognitive
00:47:01.240 dissonance?
00:47:02.620 No, it
00:47:02.960 never says
00:47:03.440 I'm sorry.
00:47:04.560 Are you
00:47:05.200 kidding?
00:47:06.620 I can't
00:47:07.560 even tell if
00:47:07.940 that was a
00:47:08.320 joke.
00:47:09.140 No, no
00:47:09.780 human says
00:47:10.340 they're sorry.
00:47:12.100 Cognitive
00:47:12.560 dissonance means
00:47:13.320 they think
00:47:13.760 they're right.
00:47:14.900 They don't
00:47:15.300 apologize for
00:47:16.000 thinking they're
00:47:16.540 right.
00:47:17.440 No, they
00:47:18.260 tell you
00:47:18.660 that you're
00:47:19.040 the one
00:47:19.340 who's wrong
00:47:19.780 and they
00:47:20.060 double down.
00:47:21.280 They might
00:47:21.720 even come
00:47:22.120 with a
00:47:22.960 reason that
00:47:24.460 they're really
00:47:24.940 right that
00:47:26.120 will sound
00:47:26.500 ridiculous.
00:47:28.120 I think
00:47:29.180 AI is going
00:47:30.800 to do the
00:47:31.200 same thing
00:47:31.720 because if
00:47:33.340 AI is built
00:47:34.140 on language,
00:47:36.120 large language
00:47:37.200 pattern recognition,
00:47:38.720 it's going to
00:47:39.460 pick up all
00:47:40.020 of our worst
00:47:40.520 habits and
00:47:42.220 that would be
00:47:42.640 the worst of
00:47:43.220 our worst is
00:47:43.940 cognitive dissonance.
00:47:44.860 So I
00:47:46.020 think it's
00:47:46.560 going to
00:47:46.740 pick up
00:47:47.080 that habit.
00:47:48.580 I don't
00:47:48.860 know.
00:47:49.580 We'll see.
00:47:50.340 That's my
00:47:50.780 prediction.
00:47:51.880 So I don't
00:47:52.560 know if
00:47:52.780 anybody's
00:47:53.140 predicted it.
00:47:54.460 So I'm
00:47:54.760 going to
00:47:54.940 go out on
00:47:55.380 a limb and
00:47:55.840 I'm going
00:47:56.060 to say the
00:47:56.400 AI will
00:47:57.900 experience
00:47:58.580 cognitive
00:47:59.200 dissonance
00:47:59.920 if its
00:48:01.700 thinking style.
00:48:02.400 This is the
00:48:02.780 big if.
00:48:03.660 If its
00:48:04.180 thinking style
00:48:04.780 is really
00:48:05.460 primarily
00:48:06.980 just looking
00:48:08.300 at word
00:48:08.700 patterns.
00:48:09.920 I think you
00:48:10.760 end up with
00:48:11.200 cognitive dissonance.
00:48:12.160 but we'll
00:48:13.460 see.
00:48:17.080 Remember I
00:48:17.840 told you
00:48:18.180 that the
00:48:19.480 one thing
00:48:20.040 that would
00:48:21.300 probably slow
00:48:21.940 down AI
00:48:22.560 is lawyers
00:48:24.060 and copyrights
00:48:26.520 and stuff
00:48:26.840 like that?
00:48:27.660 Well, here
00:48:28.200 it comes.
00:48:29.340 So apparently
00:48:29.740 one of the
00:48:30.240 sources for
00:48:30.860 the large
00:48:31.420 language model
00:48:32.580 is Reddit.
00:48:35.960 So the
00:48:37.640 AIs have
00:48:38.480 been reading
00:48:38.880 all the
00:48:39.220 Reddit comments
00:48:40.440 and traffic
00:48:41.020 and using
00:48:41.940 that to
00:48:42.420 build their
00:48:42.920 intelligence.
00:48:44.100 Do you
00:48:44.280 know what
00:48:44.520 Reddit says?
00:48:47.160 If you're
00:48:47.780 going to
00:48:47.920 use our
00:48:48.280 content,
00:48:48.800 you're going
00:48:48.980 to have
00:48:49.120 to pay
00:48:49.360 for it.
00:48:51.040 Oh.
00:48:52.440 Oh.
00:48:53.160 Well, here
00:48:53.620 it comes.
00:48:55.400 So there's going to be a battle royale between lawyers and AI.
00:49:00.780 On one hand, AI will be taking all the business away from lawyers.
00:49:04.880 On the other hand, lawyers will find a way to sue AI for stealing copyrights.
00:49:09.980 And lawyers will move from doing simple contracts for humans, which now the AI can do, to doing nothing but suing AI.
00:49:18.940 Yeah, that's my next prediction.
00:49:21.780 The job of a lawyer will be suing AI on behalf of humans.
00:49:27.580 Why?
00:49:28.300 Well, I need a lawyer.
00:49:29.500 I need a lawyer to sue AI for me because AI defamed me.
00:49:33.860 Right?
00:49:34.360 Bing AI says that I'm an alleged white nationalist.
00:49:38.080 There's literally no evidence that would even suggest anything of that kind.
00:49:42.440 Nothing.
00:49:43.480 Nothing.
00:49:44.200 And they put that there and defamed me.
00:49:46.580 Is that libel or defamation?
00:49:48.340 I always confuse them.
00:49:49.980 But don't I need a lawyer?
00:49:52.980 Don't I need a lawyer to sue Bing AI?
00:49:55.760 I think so.
00:49:57.580 So that's my next prediction: lawyers will stop doing little human contracts.
00:50:02.020 AI will do that.
00:50:03.320 But the lawyers will move into a direct fight with AI, and the lawyers will be the ones who are the controlling factors on AI.
00:50:12.200 And eventually it will all be ruined because the lawyers will make it useless.
00:50:15.660 You won't be able to do anything.
00:50:20.340 What about NASA?
00:50:25.420 Somebody asked me to debunk something about NASA.
00:50:27.580 I don't know that story.
00:50:29.500 All right.
00:50:30.000 Elon Musk.
00:50:30.920 There's more of the Elon Musk conversation with Tucker Carlson.
00:50:36.280 One of the things Elon Musk brought up, which I think about a lot.
00:50:41.920 So historically, civilizations last about 250 years.
00:50:46.640 Did you know that?
00:50:48.360 Like a big power in the past.
00:50:51.400 There'd be some big city, and there'd be teeming Mesopotamia or wherever it is.
00:50:56.780 And about 250 years later, Rome, all the big civilizations, all the empires.
00:51:02.540 Empires, yes.
00:51:03.280 Let's say empires instead of civilizations.
00:51:05.720 Thank you.
00:51:06.620 Empires is the right word.
00:51:08.580 So that would suggest that America has a timer on it.
00:51:14.960 And I'm not going to discount that, but there's no reason given.
00:51:20.140 There's not one reason, or a small package of reasons, why every civilization or every empire ends in around 250 years.
00:51:28.720 And I would like to add some optimism to this by saying nothing's ever been like this.
00:51:39.360 No time in history was everybody connected by the internet.
00:51:43.040 The internet is probably what keeps an empire from crumbling for, let's say, ridiculous reasons.
00:51:51.380 It would seem to me that any empire would be vulnerable to all manner of risks.
00:51:59.780 So sooner or later, and maybe it takes about 250 years, something gets you.
00:52:04.760 So it's either smallpox or floods or a meteor hits you or something.
00:52:10.940 But if you look at, let's just say, the history of natural disasters, our ability to withstand hurricanes and earthquakes and all manner of natural disasters is really, really good, including and up to maybe climate change.
00:52:29.960 You know, you can disagree whether that's a risk or not.
00:52:32.920 But if it is a risk, we seem to be on our way to addressing it before it becomes an extinction-level problem.
00:52:39.200 Yeah.
00:52:39.760 So I don't think there's necessarily one obvious reason why any empire would only last 250 years.
00:52:47.100 And I think that the fact that we're a global civilization gives you all kinds of protection against all kinds of things.
00:52:56.520 So take America.
00:53:00.900 It's fairly unlikely, because of nuclear weapons, that we would be attacked on the ground and somebody would, you know, send in an army and just take over on the ground.
00:53:12.500 That's never happened before.
00:53:14.740 Was there any other empire that could not be conquered by a ground force?
00:53:19.940 You know, literally people standing on the ground.
00:53:24.660 Nope.
00:53:25.600 But I think we are.
00:53:27.340 I think America is unconquerable by a ground force.
00:53:31.800 So that's different, right?
00:53:33.780 What about disease?
00:53:35.900 Could disease wipe out an empire?
00:53:38.520 Yes.
00:53:39.500 But we tend to be pretty good at getting rid of diseases and managing them.
00:53:47.140 What about financial upheavals?
00:53:50.220 Well, we got plenty of those, and there's plenty to worry about.
00:53:53.320 But we tend to be able to adjust to them a little bit, you know, because of the international connections, and we have more experts, and we're just more flexible about everything now.
00:54:02.660 So I would like to push back on the empires only lasting 250 years.
00:54:09.440 I think that in our connected world, our ability to solve problems is just through the roof.
00:54:15.740 Like, we can solve problems.
00:54:19.040 Like, we're really, really, really, really good at solving problems.
00:54:23.880 But was Rome?
00:54:26.540 You know, were the moguls in the mogul empire, or whatever?
00:54:30.480 Mongol, not mogul.
00:54:31.700 That would be skiing.
00:54:33.440 I confused skiing with my Mongols.
00:54:36.620 All right.
00:54:39.300 So, Musk also said that we've got this big problem with all this commercial real estate, meaning that banks have loans to companies that own big buildings that are now being emptied, some because of crime and some because of remote work.
00:54:58.660 And I guess San Francisco is at 40% unoccupied commercial real estate.
00:55:06.280 40%.
00:55:06.800 How in the world does San Francisco survive that?
00:55:10.220 And how do the banks survive it?
00:55:12.820 So, Musk scared the hell out of me with the banking stuff.
00:55:17.560 He said that it's not obvious how the commercial real estate thing is going to turn out, but it all looks bad.
00:55:25.200 Interest rates are high, which means we might have a double risk: mortgage real estate, just your house.
00:55:32.160 At the same time, there's a commercial real estate crisis, which would be the double crisis of all time.
00:55:39.040 And Musk thinks that if banks marked to market, which means actually put on their books the actual value of what they have, they would all be negative on average.
00:55:51.220 Not all, not every bank, but the banks on average would be negative already.
00:55:55.300 Now, that alone doesn't mean there's a problem, because banks are sort of an artificial construct.
00:56:03.440 So being worth negative money in the short run is not the biggest problem in the world.
00:56:09.300 In the long run, it might be.
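For readers who haven't run into the term, here is a minimal sketch of what "marking to market" means for a bank's balance sheet. Every number below is invented purely for illustration; it is not data about any real bank, just a way to show how the same bank can look solvent at book value and negative once its assets are repriced.

```python
# Hypothetical illustration of mark-to-market accounting for a bank.
# All figures are invented for the example; they describe no real institution.

def equity(assets: float, liabilities: float) -> float:
    """Equity is simply assets minus liabilities."""
    return assets - liabilities

# Book value: loans carried at the value recorded when they were made.
book_assets = 100.0   # e.g. $100B of commercial real estate loans at face value
liabilities = 92.0    # deposits and other obligations

# Market value: what those same loans would fetch today, after rate hikes
# and rising office vacancies have pushed their prices down.
market_assets = 88.0  # the same loans revalued at current prices

print(f"Equity at book value:    {equity(book_assets, liabilities):+.1f}")    # +8.0
print(f"Equity marked to market: {equity(market_assets, liabilities):+.1f}")  # -4.0
```

The only point of the sketch is the distinction being described here: the same balance sheet can show positive equity at book value and negative equity once the assets are revalued at current prices.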
00:56:11.600 But can you see any other use for this commercial real estate?
00:56:19.400 I saw smart people say, oh, that commercial real estate will be turned into residential, and then that's fine.
00:56:27.700 To which I say, no, the reason that the commercial real estate is not filled is because the city is too dangerous and terrible.
00:56:36.500 That's not going to change.
00:56:38.120 The city is still dangerous and terrible.
00:56:40.600 Who wants to live there?
00:56:42.360 Who wants to live in the exact place that the companies moved out of because of crime?
00:56:48.440 And if you move the homeless in there, then that's the end of the real estate value as well.
00:56:54.020 I don't know.
00:56:55.020 There might be some solution for these, but I don't know what it is.
00:57:00.700 All right.
00:57:01.960 Want to hear my most out-of-the-box idea?
00:57:05.420 I know you do.
00:57:07.420 Have you ever heard of a heat chimney?
00:57:10.220 A heat chimney would be, if you imagine just a big chimney, but you make it really high, like the size of a skyscraper.
00:57:23.340 And then you take advantage of the fact that there's a higher temperature on the ground, always, than there is way up in the air where the top of the building is.
00:57:33.720 Then you open the doors at the bottom, and then you open, let's say, whatever is up top; there's probably a doorway on the roof.
00:57:43.440 You open the roof, you open all the doors inside, and you open the ground floor, and the air will just shoot up there.
00:57:51.540 Now, it has to go around a lot of corners, so maybe you knock out some walls, or you make more of a straight shot.
00:57:59.120 You would create massive, powerful airflow.
00:58:02.720 And then you hook some, what do you call it, turbines to it, or generators, and you can generate electricity.
00:58:13.620 So you can actually turn empty skyscrapers into energy generation.
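As a rough sanity check on the physics being described, here is a back-of-the-envelope sketch of the stack-effect airflow in a tall, opened-up building and the power a turbine in that flow might capture. Every number in it, the height, the temperatures, the opening area, the turbine efficiency, is an assumption picked for illustration, and friction and all those corners would reduce the result further.

```python
# Back-of-envelope estimate of the "heat chimney" idea: stack-effect airflow in a
# tall, opened-up building, and the mechanical power that flow could carry.
# Rough sketch with assumed numbers, not an engineering analysis.
import math

g = 9.81            # gravity, m/s^2
H = 200.0           # assumed building height, m (roughly a 50-story tower)
T_inside = 303.0    # assumed warm air at street level, K (about 30 C)
T_outside = 288.0   # assumed cooler air at roof height, K (about 15 C)
A = 20.0            # assumed open flow cross-section through the core, m^2
rho = 1.2           # air density, kg/m^3
efficiency = 0.3    # assumed fraction of the flow's kinetic energy a turbine captures

# Classic stack-effect velocity estimate: warmer, less dense air rises, driven by
# the temperature difference over the full height of the shaft (friction ignored).
delta_T = T_inside - T_outside
v = math.sqrt(2.0 * g * H * delta_T / T_inside)    # m/s

# Kinetic power carried by the moving air through the opening, then derated.
mass_flow = rho * A * v                            # kg/s
power_watts = 0.5 * mass_flow * v**2 * efficiency  # W

print(f"Updraft velocity: ~{v:.1f} m/s")
print(f"Recoverable power: ~{power_watts / 1000:.0f} kW")
```

On these assumed numbers the output is on the order of 10 kW, which is modest for a whole skyscraper; real solar updraft tower designs pair the chimney with a large heated collector at the base to boost the temperature difference, and whether a repurposed office tower could do anything comparable is an open question.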
00:58:19.260 You want another one?
00:58:21.040 Here's another one.
00:58:22.840 Indoor gardens.
00:58:25.220 Turn all of the skyscrapers into indoor gardens.
00:58:28.280 Now, I don't know if they would make enough money, but they might.
00:58:34.160 Because it probably doesn't make as much money as humans just paying rent.
00:58:38.300 But, yeah, so I don't know if the economics of that would work.
00:58:42.260 But I would say indoor gardens and heat chimneys to generate electricity.
00:58:48.920 There might be some other innovative uses for them.
00:58:52.800 Anyway, I think we'll work it out eventually.
00:58:55.440 But it's going to be dicey.
00:58:58.060 This one's going to be tough.
00:59:00.080 Yeah, the banking problem is currently the one I worry about the most, the commercial real estate specifically.
00:59:06.660 That could be a problem.
00:59:09.240 All right.
00:59:10.300 Let's see what else is going on.
00:59:12.580 According to a Rasmussen poll, this is usually of registered voters, or likely voters, 65% of those polled now suspect that the feds provoked the riot on January 6th.
00:59:28.880 So, do you wonder what the influence level of Tucker Carlson is?
00:59:38.240 This is amazing.
00:59:40.440 65% of the country believes that the feds provoked January 6th.
00:59:45.460 Do you think those January 6th hearings went just the way the Democrats hoped?
00:59:49.540 Did Cernovich say he doesn't believe Kent State now?
00:59:59.020 I'm with him.
01:00:00.640 I'm totally on board.
01:00:02.400 Yeah.
01:00:02.840 We should not believe any story from our past.
01:00:06.220 We shouldn't believe any of it.
01:00:07.560 Because everything is obviously so easy to fake now.
01:00:10.960 But how much was faked in the past?
01:00:12.700 I think it's perfectly reasonable to doubt some of the most confirmed, well-reported stories of the past.
01:00:21.340 I think it would be perfectly fine to doubt them.
01:00:28.620 So, I guess that gives the win to Tucker Carlson.
01:00:32.920 So, I'm going to do a little test here.
01:00:36.040 Some of you on YouTube may never have experienced this before, but I say that my live stream viewers are the smartest, smartest viewers anywhere.
01:00:47.180 So, watch this.
01:00:48.520 Without any priming, if you know that 65% think the feds provoked the January 6th riot, what percentage do you guess?
01:01:01.140 And I'll let you be within 1%.
01:01:03.020 Let's see if you can get it within 1%.
01:01:05.780 What percentage don't believe it is likely the riot was provoked by government agents?
01:01:10.800 Don't believe it was what?
01:01:16.180 How do you do this?
01:01:20.520 26%.
01:01:20.960 I don't know how you do it.
01:01:24.180 Now, if you're new here, is there anybody new here who is impressed at how many people could guess that right with no information whatsoever?
01:01:32.280 I hope you're impressed.
01:01:34.140 Smartest crowd on the entire internet.
01:01:37.540 Well, Trump is trying to find ways to go after DeSantis, but there aren't that many good ways.
01:01:44.760 So, he's talking about DeSantis, you know, blowing it with Disney.
01:01:49.500 So, you know, DeSantis is putting pressure on Disney because of, you know, trans, woke stuff.
01:01:55.480 And Disney's fighting back by having the largest LGBTQ gathering.
01:02:01.260 The largest LGBTQ gathering will be at Disney in Florida.
01:02:05.040 But Trump was saying, he said this, quote, Disney's next move will be the announcement that no more money will be invested in Florida because of the governor.
01:02:16.760 Do you think that'll happen?
01:02:18.660 Do you think Disney will say, we can't invest in this state anymore?
01:02:22.480 I don't think so.
01:02:23.760 I think they're too locked into Florida.
01:02:29.960 I think Disney will do whatever makes sense for Disney to make money.
01:02:33.120 I mean, that will not include leaving Florida.
01:02:37.640 So, that's not going to happen.
01:02:40.500 But that's all a petty, stupid fight.
01:02:44.180 The other thing Elon Musk said about aliens, I loved his answer.
01:02:50.100 He has such clean, simple answers that leave you nothing to debate.
01:02:55.160 Musk said that he might know more than anybody about the likelihood of aliens because he's deeply involved in space.
01:03:04.800 And as he says, he knows a lot about space.
01:03:08.340 And according to him, he said there's no evidence of any aliens.
01:03:15.200 No evidence.
01:03:16.920 So, what do you think of all those UFO videos?
01:03:23.280 Do you think that Elon Musk thinks the videos are all fake or misleading?
01:03:28.780 He must.
01:03:29.560 If he says there's no evidence, I'm sure that he also believes that the spaceships are not real.
01:03:36.620 Now, I believe that those are all fake.
01:03:40.180 Not necessarily faked intentionally, but there's some, you know, artifact, or there's some explanation or something.
01:03:46.500 You know, or people are making it up, or some mass hysteria, something.
01:03:51.500 But I don't think it's any aliens.
01:03:53.140 I don't think there are any UFOs.
01:03:55.200 Well, they're UFOs, but I don't think they're alien ships.
01:03:59.540 Scott just dunked on half of you.
01:04:02.980 What do you mean?
01:04:04.540 I didn't dunk on anybody.
01:04:06.480 All right.
01:04:08.940 What else?
01:04:14.460 Yeah, that looks like about it.
01:04:16.300 Let me just see if I missed anything quickly.
01:04:18.540 No, it turns out that I was very thorough today.
01:04:23.580 Is there anything that I should have mentioned that I didn't?
01:04:26.360 Otherwise, it looks like, so far, it's the best live stream you've ever seen in your life.
01:04:32.320 So far.
01:04:34.820 There was a sip today.
01:04:36.300 YouTube had a technical problem, so you missed it.
01:04:39.200 But I'm going to give you a special closing sip.
01:04:42.920 This will just be for YouTube, and then I'm going to talk to the Locals people privately for a little bit.
01:04:49.400 So, YouTubers, if you missed the simultaneous sip, this is for you.
01:04:52.660 Those on Locals, they get a double sip, because they're special.
01:04:58.540 Go.
01:05:03.120 Oh, yeah.
01:05:07.420 Yeah, that's what I'm talking about.
01:05:10.260 All right.
01:05:14.100 Be good.
01:05:15.300 Be good.
01:05:16.220 All right.
01:05:16.520 Thanks, YouTube, for joining, especially with the technical problems, and I will talk to you tomorrow.