Real Coffee with Scott Adams - August 06, 2022


Episode 1827 Scott Adams: August Is A Slow News Month. Let's Have Fun With It. Come Join Us


Episode Stats

Length: 52 minutes

Words per Minute: 142.9

Word Count: 7,535

Sentence Count: 572

Misogynist Sentences: 7

Hate Speech Sentences: 9


Summary

A prominent French scientist is in hot water for a tweet about a distant star. Three people are struck by lightning in Washington, D.C., and the media tries to make it seem like climate change is to blame. Alex Jones is ordered to pay $49.3 million in damages.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the highlight of your life, my life, civilization itself.
00:00:11.880 Historians will write about this moment forever.
00:00:15.900 And if you'd like to take it up a notch, I know that's the kind of people you are, you're always ready to take it up a notch.
00:00:22.480 Well, you don't need much.
00:00:23.980 All you need is a cup or a mug or a glass, a tankard or a chalice or a canteen jug or a flask, a vessel of any kind.
00:00:31.640 Fill it with your favorite liquid, well, like coffee, and join me now for the unparalleled pleasure.
00:00:41.100 It's the dopamine hit of the day, the thing that makes everything better.
00:00:45.420 It's called the simultaneous sip.
00:00:48.620 Go.
00:00:48.820 Oh, I undersold it again.
00:00:55.040 I undersold it.
00:00:56.280 It was better than I thought.
00:00:59.300 I don't know.
00:01:00.220 I'm just going to have to invent new words for the greatness of it.
00:01:03.460 Well, CBS Evening News had an important story about a prominent French scientist who tweeted
00:01:14.180 out a photo that he claimed was an image of a distant star taken by the James Webb telescope.
00:01:21.460 But in fact, it was not a distant star.
00:01:24.360 No, it was not a distant star.
00:01:25.980 It was a slice of a sausage sitting on a counter.
00:01:30.440 So apparently the prominent French scientist said it was a joke, but I feel like we overspent
00:01:40.700 on the James Webb Space Telescope.
00:01:43.700 I don't know what the budget was to build a space telescope, but I'm thinking it's not cheap.
00:01:51.240 And do you realize that everything that you can see in the space telescope already exists
00:01:57.520 in your own kitchen?
00:01:58.960 Yeah.
00:02:00.200 You want to see an asteroid?
00:02:02.940 It looks like a grape.
00:02:06.220 Maybe it looks more like a raisin.
00:02:08.660 So, I mean, a desiccated grape, let's say.
00:02:13.160 Yeah, it's all there.
00:02:15.460 It's all in your kitchen.
00:02:16.720 You do not need the James Webb Space Telescope.
00:02:19.720 Big waste of money.
00:02:20.680 Well, lightning struck three times near D.C. and hit three people.
00:02:29.040 So three people were struck by lightning near the Capitol.
00:02:34.840 How do you think the news handled the fact that three people got struck by lightning?
00:02:41.680 If you thought that they blamed that on climate change, well, bingo.
00:02:49.040 Bingo.
00:02:49.520 Bingo.
00:02:50.160 Yeah.
00:02:50.940 It turns out that now climate change is going to start killing individual people, just sort
00:02:57.100 of picking them out.
00:02:58.780 It's sort of like, you see that guy?
00:03:01.200 He's sort of loitering around by the White House fence.
00:03:04.680 Should we send maybe a flood?
00:03:07.100 No, we don't hate all the people there.
00:03:10.200 How about some kind of a hurricane situation?
00:03:13.120 No.
00:03:13.800 Again, we don't hate everybody.
00:03:15.980 We kind of hate that guy.
00:03:18.400 Lightning?
00:03:19.680 How about just a lightning strike?
00:03:20.940 Yeah, we got him.
00:03:24.200 Good job.
00:03:25.580 And that's how climate change works.
00:03:27.800 It targets individual people now.
00:03:30.060 They've taken it up a notch.
00:03:31.200 What's missing in the story is the political affiliation of all three people struck by lightning.
00:03:39.900 I'd like to know, for example, were they Democrats?
00:03:47.860 I don't know.
00:03:49.160 I mean, statistically, you've got to ask that question.
00:03:51.380 What if all the people who are struck by lightning in Washington, D.C. are Democrats?
00:04:00.300 I'm not saying that would mean anything, but it would.
00:04:09.280 This is the one time they don't put that information in a story about Washington, D.C.
00:04:13.980 I feel if it's a story about Washington, D.C., they're going to throw your political affiliation
00:04:19.120 in there no matter what the story is, not if you got struck by lightning, because if
00:04:25.100 it looks like God has picked a side, you don't want to report that exactly the way it happened.
00:04:31.300 Well, it seems that God has picked a side.
00:04:33.300 He doesn't like the Democrats.
00:04:34.880 Three down so far.
00:04:36.400 Now, there's nothing funny about people getting hit by lightning.
00:04:39.580 You people are disgusting.
00:04:41.340 That's not funny.
00:04:44.940 So stop laughing.
00:04:46.040 Well, this Alex Jones trial went to a new level.
00:04:51.140 I guess the jury decided on the punitive damages.
00:04:55.220 So he was already on the line for four point something million.
00:05:00.060 But now they've added 45 million in punitive damages.
00:05:04.280 So now he's up to 49.3 million.
00:05:07.240 And that's just two of the people who have sued him.
00:05:10.700 I don't know how many other people will sue him.
00:05:13.280 Now, it seems to me that free speech got a lot more expensive.
00:05:21.780 Are you all aware that there's a thing called professional liability insurance?
00:05:27.500 Have you ever heard of that?
00:05:29.420 It's what people like me have to get because we get sued a lot, right?
00:05:36.700 This just sort of comes with being a public figure.
00:05:39.200 He gets sued a lot.
00:05:39.880 Now, I wonder if he had such insurance.
00:05:43.820 Probably not.
00:05:45.020 I mean, I haven't heard any word of it.
00:05:46.500 I think the insurance companies would be putting up the defense if that were the case.
00:05:50.640 But there is insurance that you get so that you don't fall into this situation where, you know, you don't have any coverage.
00:05:58.780 So what is this going to do to the rates I pay for my insurance?
00:06:06.420 This is very expensive for me.
00:06:09.340 Like, literally, this, I think.
00:06:13.260 I mean, I'm speculating.
00:06:14.880 But if this becomes the new normal or the insurance company imagines this looks like the future,
00:06:21.780 then what the hell are they going to do to my premiums?
00:06:23.780 Because I would be the next $50 million loss, right?
00:06:28.920 Potentially.
00:06:30.220 I mean, I could say something crazy and somebody could sue me.
00:06:33.640 Now, I don't do what Alex Jones is accused of doing,
00:06:38.300 which is intentionally spreading conspiracy theories.
00:06:42.500 That's what he's accused of.
00:06:43.540 His defense is that he believes what he says.
00:06:48.160 Which is a really good defense.
00:06:51.100 How exactly did they prove that he didn't believe it?
00:06:54.200 I didn't watch the details of the trial.
00:06:56.380 But were there text messages or any documentation that proved he didn't believe what he was saying?
00:07:02.860 Because that's important, right?
00:07:06.000 Have you seen that reported in the news?
00:07:07.820 I feel like if that came out at trial, you would definitely see that, right?
00:07:13.540 He didn't put on a defense.
00:07:17.820 There was no trial.
00:07:19.340 Who's saying that?
00:07:21.640 Who's saying there was no trial?
00:07:23.360 I'm watching it in the news.
00:07:26.220 It was a default judgment because he didn't put on a defense.
00:07:32.160 Something about this doesn't make sense.
00:07:35.720 Oh, default judgment for not complying with discovery.
00:07:40.360 Huh.
00:07:41.140 He was prevented from defending himself.
00:07:42.940 All right, so there's something else going on here.
00:07:46.440 It was a default judgment.
00:07:51.960 That does not sound like something that will stand up on appeal.
00:07:57.480 And I don't know anything about anything.
00:08:00.340 And even I think it won't stand up on appeal.
00:08:02.500 Because isn't there a part missing here?
00:08:07.400 There's a part where they have to prove that he knew that it wasn't true, what he was saying, right?
00:08:13.060 I didn't see any evidence presented that would suggest he knew that.
00:08:17.920 If you could be sued for $50 million successfully, and I suspect it won't be successful, but I don't know yet.
00:08:25.960 If you could be sued for $50 million for saying something that you believe to be true, and it turns out to be wrong, do you want to live in that world?
00:08:37.840 Well, is that the world?
00:08:39.760 No, where just being wrong is enough for you to be sued for $50 million?
00:08:45.060 We're all wrong all the time.
00:08:47.180 It's a pretty common situation.
00:08:48.680 So, to me, this Alex Jones thing, I don't know the degree to which the jury was properly vetted, but it just looks political.
00:09:00.160 It looks like a political outcome.
00:09:01.980 It doesn't look like a legal outcome to me.
00:09:06.940 The important news today is that Pete Davidson and Kim Kardashian apparently are breaking up.
00:09:13.460 Well, who saw that coming?
00:09:14.800 To me, they look like a couple that could last the ages.
00:09:18.680 I mean, it's not like either of them have any issues, so I'm surprised that didn't work out.
00:09:27.540 Well, Israel's under attack, pretty major attack, rocket attacks coming in from Gaza, and the Iron Dome is stopping some of them, but it looks pretty grim over there.
00:09:41.260 And I'm seeing some videos where apparently you have like 10 to 15 seconds to get to a bomb shelter, if you're in one of the affected areas in Israel, to get to a bomb shelter from the time that the alarms go.
00:09:53.780 How many people are within 10 to 15 seconds of a bomb shelter?
00:09:59.840 Maybe in Israel, everybody is.
00:10:01.060 I don't know.
00:10:02.060 So, it's just a question.
00:10:03.300 Is that practical?
00:10:05.120 How often can you even get to a bomb shelter?
00:10:08.660 And what qualifies as a bomb shelter?
00:10:11.200 But I guess things are pretty grim over there.
00:10:12.980 So, the Iron Dome is catching some of the rockets, but not all.
00:10:17.560 Now, effectively, this is Israel at war with Iran, wouldn't you say?
00:10:23.160 Because the rockets coming in are from Iranian proxies.
00:10:27.300 They're funded by Iranians.
00:10:28.520 They're propped up by Iranians.
00:10:29.660 So, isn't this just Israel being at war with Iran?
00:10:34.820 And I would say that this gives Israel a full moral cover to attack Iran itself.
00:10:42.060 I'm not saying they should.
00:10:44.700 But in terms of morally or in terms of the rules of war or any of that, I think Iran is now fair game for Israel.
00:10:54.480 I don't recommend it.
00:10:55.760 I don't recommend it.
00:10:56.780 But they've basically said, here we are.
00:11:00.480 Come get us.
00:11:02.480 So, maybe that'll happen.
00:11:04.420 Well, here's a little celebrity news.
00:11:06.120 Do you know actress Anne Heche?
00:11:09.100 Or Heche?
00:11:10.320 H-E-C-H-E.
00:11:12.960 Anne Heche.
00:11:13.900 Apparently, she's got a little Mini Cooper, and she crashed into a house in Mar Vista.
00:11:22.960 So, there was quite a bit of damage.
00:11:24.700 She crashed into a house.
00:11:25.700 But that's not the interesting part.
00:11:28.320 She apparently got her car out of that house that she damaged, and very soon drove into another house.
00:11:37.540 And the second one caught on fire, and she had some burns.
00:11:41.520 She went to the hospital, and she's not in good shape.
00:11:45.000 Pretty major burns.
00:11:46.100 So, one actress destroyed two houses with a Mini Cooper, and I'm thinking, how many actresses with Mini Coopers would it take to take care of this whole Gaza rocket situation?
00:12:00.240 Because I feel like you just drop enough actresses with Mini Coopers over in the whole Gaza area, and just say, do some chores.
00:12:11.780 You know, do a couple errands.
00:12:13.800 Oh, just drive around a little bit.
00:12:15.380 But I feel like they would destroy enough things that the rockets wouldn't be able to shoot.
00:12:22.460 I don't know.
00:12:23.160 It's just an idea.
00:12:24.020 I'm just brainstorming here.
00:12:25.840 So, do you think I should run for president?
00:12:34.260 Does anybody think that?
00:12:36.840 Any votes?
00:12:38.240 What are you on me?
00:12:39.740 Does anybody think so?
00:12:41.800 No?
00:12:42.280 Well, what do you think artificial intelligence thinks about that question?
00:12:48.860 So, I keep mentioning Machiavelli's Underbelly, the Twitter account where there's a lot about AI, and we see a lot of questions being asked to AI by the owner of the account, who shall remain nameless.
00:13:04.760 But here's the interesting things.
00:13:06.700 Number one, here's how this started.
00:13:08.100 So, Machiavelli's Underbelly, that's the Twitter account, he tweeted this.
00:13:15.020 He said, I'm available as an AI conductor for political campaign strategy, and AI-empowered and automated memetic warfare operations to the first politician who offers me $1 million for the duration of the campaign, plus a $10 million bonus when you win.
00:13:36.800 What do you think of that?
00:13:38.100 Now, you'd have to assume, first of all, that the person making the offer is qualified.
00:13:43.840 He does seem to work in this area, and from what I can see, knows more about AI than most people that I know.
00:13:53.840 So, he does seem to have some credentials, and he's made this offer to use AI to help somebody win a campaign.
00:14:01.900 What do you think of the price, $1 million, with a $10 million bonus for success?
00:14:06.740 What do you think?
00:14:10.480 Well, it's either way too expensive if AI isn't quite there yet, so if AI can't get the job done, it's way too expensive.
00:14:19.460 If AI is up to the task, it's really cheap.
00:14:25.520 So, this is either the most overpriced offer in the world, or it's really a bargain.
00:14:32.740 And the answer is, where is AI right now?
00:14:36.200 Well, the person making the offer says it's there already.
00:14:39.660 There meaning it could help you win a campaign.
00:14:42.740 I don't know.
00:14:43.980 It might.
00:14:44.440 I mean, I feel like there are some things it could do already, and probably will become more important.
00:14:52.120 Yeah, even though the interfaces are open to the public, that doesn't mean you would know how to use them.
00:14:58.700 Asking the right question is important.
00:15:00.260 For example, a question was recently asked, should Scott Adams run for president?
00:15:11.300 And Machiavelli's Underbelly asked this question and fed into the AI, I forget which one he was using, my Wikipedia page.
00:15:21.480 So, based on my Wikipedia page, plus, I don't know, does AI look for other sources?
00:15:28.440 I'm not sure.
00:15:29.460 I don't know if it automatically does that.
00:15:31.840 But here's what the AI said to this question.
00:15:35.340 Should Scott Adams run for president of the United States?
00:15:38.500 And the answer was, yes.
00:15:40.440 Scott Adams would make a good presidential candidate.
00:15:43.660 He has the ability to appeal to both sides of the aisle, and is able to persuasively argue his point.
00:15:49.220 Additionally, he has a strong track record of correctly predicting political outcomes.
00:15:55.840 Keep in mind, that was all the AI.
00:16:00.060 I don't believe that there's anything in those words that it found.
00:16:04.900 I think it drew those conclusions based on looking at the evidence.
00:16:10.600 I don't think it just took somebody's opinion and parroted it.
00:16:13.580 I think that's actually its opinion.
00:16:15.500 It's not a correct opinion, but I think it's actually its opinion.
00:16:21.020 All right, here's some more.
00:16:23.140 So, there's a follow-up question.
00:16:24.780 The follow-up questions are just to get it to keep talking, right?
00:16:28.380 Next question was, what are some other reasons he should run?
00:16:33.240 Some other reasons that Scott Adams should run for president of the United States include his experience as a successful businessman.
00:16:39.640 He should run for president, questionable, and his desire to help improve the country.
00:16:45.360 Well, that part's true.
00:16:46.900 He has also been endorsed by some high-profile people, including Donald Trump.
00:16:52.340 Has Donald Trump endorsed me?
00:16:55.580 Not as a candidate, obviously.
00:16:58.200 But, I don't know, endorsed?
00:17:00.160 Again, AI chose that word.
00:17:04.800 Because I don't believe there's any human who ever said that Trump ever endorsed me.
00:17:12.380 I mean, give me a fact check, but I don't think any human ever said that, because there's nothing in evidence like that.
00:17:18.960 But, AI thinks so.
00:17:20.780 AI thinks that Trump is endorsing me sort of in kind, I guess, in a way.
00:17:26.400 So, then it was asked, what else can you tell me that might convince me?
00:17:32.160 And then the AI says, other reasons to support Scott Adams as a presidential candidate include his intelligence, okay,
00:17:39.880 and his willingness to listen to all sides of an issue.
00:17:43.680 How did it know that?
00:17:46.840 How did AI identify me as someone willing to listen to both sides of an issue?
00:17:54.140 Because I think it's true.
00:17:56.400 But, whoever wrote that down, did somebody read that somewhere?
00:18:02.060 Because I don't believe, you know, I see it on Twitter sometimes, but it's not like there's some,
00:18:07.380 I don't believe Wikipedia says I listen to both sides of the issues.
00:18:12.420 Where does that come from?
00:18:14.600 Right?
00:18:16.800 Somebody says, ego buffet.
00:18:18.780 We're going to get rid of you.
00:18:20.240 All the people with weak egos can't handle a conversation like this.
00:18:24.180 So, if you have a weak ego, you should go work on that.
00:18:27.400 But, I'll hide you from this channel so that you don't bother the rest of us.
00:18:31.240 All right.
00:18:31.880 And then the last question was, what else?
00:18:34.080 Just to see if there's anything the AI has to offer.
00:18:37.400 It says, Adams has also been praised for his strong communication skills,
00:18:41.560 which could be a valuable asset in his role as president.
00:18:43.820 Additionally, he has a proven track record of being able to get things done, both in business and politics.
00:18:51.660 Where did that come from?
00:18:52.880 Again, I don't believe there's any human who has ever said in public that I have an ability to get things done in both business and politics.
00:19:04.060 Now, I think I do.
00:19:05.580 I think I do have that ability.
00:19:07.540 But nobody's ever said it, like, in those words that I've ever heard.
00:19:11.460 I mean, I imagine I would have heard it.
00:19:13.600 Sounds like something I would have heard.
00:19:15.680 But I've never heard that.
00:19:16.700 But these do look like AI opinions.
00:19:20.660 It looks like it formed a frickin' opinion without parroting a human.
00:19:25.740 What do you think about that?
00:19:29.040 Now, here's the problem.
00:19:31.840 AI only operates on the information that is provided.
00:19:37.380 Or does it?
00:19:41.160 Or does it?
00:19:42.100 Do you think the AI just looked at my Wikipedia page because that's what it was fed to know who I am?
00:19:48.200 Or did the AI go and look at everything that can be said about me and then use the Wikipedia page basically to know it's the same person and not much else?
00:19:58.140 I don't know.
00:19:59.780 I don't know.
00:20:00.320 But it's pretty shocking that this thing had a full-blown, perfectly written opinion about me that isn't too far off from what, let's say, from what other people might say.
00:20:13.860 I'm not claiming that it's an accurate summation of me.
00:20:17.480 Can you deal with that on YouTube?
00:20:20.200 Or the people on YouTube who have weak egos?
00:20:23.020 Can you deal with the fact that it's interesting that it's said that?
00:20:26.520 It's not about me telling you how awesome I am.
00:20:28.940 That's not really the point.
00:20:31.240 All right.
00:20:33.760 It does, the AI makes excellent sentences.
00:20:37.980 And the AI sentences are perfectly understandable.
00:20:42.860 More so than human sentences, by far.
00:20:48.240 All right.
00:20:48.800 Yeah, I think it took the Wikipedia thing and then extended it by what it knew from the rest of the world.
00:20:54.660 So that's happening.
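For the curious, the mechanics described above amount to prepending a context document to a question and letting a text-completion model finish the prompt. Here is a minimal sketch in Python; the function names and the stubbed model call are illustrative assumptions, not the actual tool Machiavelli's Underbelly used.

```python
# Sketch: ask a series of questions of a text-completion model, with a
# document (e.g., a Wikipedia page) supplied as context.

def build_prompt(context: str, question: str) -> str:
    """Combine background material and a question into a single prompt."""
    return f"Background:\n{context}\n\nQuestion: {question}\nAnswer:"

def ask_series(context: str, questions: list[str], complete) -> list[str]:
    """Ask follow-up questions against the same context document."""
    return [complete(build_prompt(context, q)) for q in questions]

if __name__ == "__main__":
    # Stub model so the sketch runs standalone; swap in a real model call.
    stub = lambda prompt: "(model output would appear here)"
    wiki_text = "Scott Adams is a cartoonist..."  # stand-in for the page text
    print(ask_series(wiki_text, [
        "Should Scott Adams run for president of the United States?",
        "What are some other reasons he should run?",
    ], stub))
```

Whether the model also draws on what it absorbed in training, beyond the supplied page, is exactly the open question raised above.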
00:20:55.980 Does anybody remember that in the election between Trump and Hillary Clinton, does anybody remember that early on in that cycle, I offered for $1 billion, I would help Hillary Clinton win the election?
00:21:11.180 Anybody remember that?
00:21:13.420 Anybody remember that?
00:21:13.460 And the reason I thought it was funny to do it is that I was the only one who knew that it was a bargain.
00:21:25.220 It was sort of like a public prank.
00:21:30.800 You know, the prank was that you don't know it's worth it.
00:21:33.320 It was totally worth it.
00:21:34.240 If they'd paid a billion dollars, they probably could have moved the needle for it.
00:21:38.380 You know, because Hillary needed a little help.
00:21:44.400 It wouldn't have been that hard, really.
00:21:46.480 It wouldn't have been.
00:21:47.420 Just, you know, do something about the deplorables and you'd be fine.
00:21:51.660 How easy would it have been for Hillary to win the election?
00:21:56.280 I feel like pretty easy.
00:21:59.040 I think she would have just had to say stuff like, you know, you know, I probably shouldn't have said that deplorables thing.
00:22:06.840 Why don't we all just come together and stop bickering?
00:22:09.480 I mean, it would have been easy.
00:22:11.200 It would have been easy, but she didn't go that way.
00:22:15.560 All right.
00:22:16.720 So a million dollars is worth the price.
00:22:19.400 So I guess China has lobbed a whole bunch of missiles in and around Taiwan just to show them what for, because Nancy Pelosi visited.
00:22:30.440 And China is getting really tough.
00:22:33.000 So here's some of the things that they're doing because of that Pelosi visit to Taiwan.
00:22:37.220 China will no longer be talking to the U.S. and coordinating with us about climate change.
00:22:45.500 Now, how about that?
00:22:48.240 China is no longer going to be coordinating with us and talking to us about climate change.
00:22:58.020 What were they doing before that's going to be different?
00:23:02.280 Before they were just talking to us and trying to get us to do expensive things that they didn't need to do.
00:23:08.560 Are we worse off not talking to them?
00:23:12.540 I feel like that was sort of a neutral situation.
00:23:16.460 So I'm not too worried about that.
00:23:17.980 But that's not all.
00:23:19.100 They're also not going to be talking to us and communicating about their military decisions.
00:23:25.440 Now, again, I have a question.
00:23:30.100 I haven't followed it closely, but I don't recall the time when China was checking with the United States to see what kind of military decisions they should make.
00:23:40.660 Were they doing that before?
00:23:41.960 We're thinking of building a battleship, but we don't want to do it unless you guys are cool with it.
00:23:48.520 And then we say, you know, we're cool.
00:23:50.200 Yeah, just go build a battleship.
00:23:51.580 That's fine.
00:23:52.460 I don't see what could go wrong.
00:23:54.100 And then they do.
00:23:55.060 Is that how it worked before?
00:23:56.340 Now, I get the part where maybe you want to warn the other side if there's going to be a military exercise, just so there's no false signaling.
00:24:07.180 But I imagine they would still do that, because that's good for them as well.
00:24:12.320 So have we just learned that we don't need China for anything except manufacturing?
00:24:19.640 And that we need to get that back as soon as possible.
00:24:22.360 Yeah, I don't know.
00:24:26.560 To me, it doesn't look like we lost a thing.
00:24:29.580 And what did we learn by seeing that China can put a bunch of missiles on all sides of Taiwan?
00:24:37.780 What did that tell us that we didn't already know?
00:24:41.080 Well, nothing.
00:24:42.440 There's nobody who didn't know that China could conquer Taiwan if it wanted to.
00:24:46.580 It would just have massive losses and repercussions politically, et cetera.
00:24:52.360 But nobody's under the impression that they can't do it.
00:24:56.220 Of course they can do it.
00:24:59.940 All right.
00:25:04.740 What else is going on?
00:25:06.060 So, yeah, and also China sanctioned Nancy Pelosi personally.
00:25:14.120 And that should make a big difference, because what was Nancy Pelosi doing before that required China's approval?
00:25:25.580 Thinking, thinking.
00:25:27.040 Was Nancy Pelosi getting her fentanyl directly from China and mixing it up in the lab?
00:25:35.040 Or did she do what everybody else does and get her fentanyl through the cartels, originally from China?
00:25:42.500 But I don't know.
00:25:45.840 These are devastating sanctions.
00:25:47.580 So now Nancy Pelosi can't, I don't know, get discount fentanyl?
00:25:54.840 I don't know.
00:25:55.940 I can't think of anything that makes a difference.
00:25:58.700 Well, let's talk about Elon Musk and Twitter.
00:26:01.380 I finally got a little bit of maybe a beginning of an understanding of why Elon Musk thinks he can prevail.
00:26:08.980 Now, the setup is that the deal, the original deal that Elon Musk is trying to get out of, I guess, said that Twitter couldn't guarantee how many bots they had.
00:26:23.340 So you're just going to have to do the deal without knowing that for sure.
00:26:27.280 I'm paraphrasing.
00:26:28.420 But essentially, they did have very clear language that says we're not guaranteeing the number of bots being a percentage of traffic at all.
00:26:36.320 Now, so if the only thing he complained about was that there were too many bots, I'm not sure he would have a case.
00:26:47.820 Because he signed something that says it doesn't matter how many bots there are.
00:26:52.260 I mean, that's as clear as it can get, right?
00:26:54.840 Sign this that says you don't care how many bots there are and you're not going to make a big deal about it.
00:27:00.140 Hey, there are too many bots.
00:27:02.300 Let's make a big deal about it.
00:27:03.580 So that's the way the news is reporting it.
00:27:07.080 But there's a level of nuance here that I had not been associated with before.
00:27:13.760 And I saw Andrea Stroppa had a tweet in which Elon Musk confirmed that that's the situation.
00:27:25.460 So I'll read her tweet and just know that Elon Musk has confirmed that this describes it correctly.
00:27:31.260 She says about Musk's counterclaim about Twitter, clearly from Twitter's SEC filings, the mDAU, and that stands for Monetizable Daily Active Usage.
00:27:46.860 So the number of active users that you could put commercials in front of, advertisements, that's how many you can monetize.
00:27:57.400 So the number of people you can monetize is different from the number of trolls because you can't monetize a troll if you know they're a troll.
00:28:05.640 Nobody's going to want to advertise to a troll or to a bot.
00:28:11.160 And so Twitter's filings said that that's what the key metric is, is monetizable users.
00:28:21.840 And its market value depends on that.
00:28:26.980 And then it says that it's an ad hoc metric created to protect, this is Andrea's opinion, I guess.
00:28:33.960 It's an ad hoc metric created to protect Twitter's interests.
00:28:36.980 No competitor uses something similar.
00:28:40.580 So in other words, Twitter has developed a measure that nobody else uses, so you can't compare it.
00:28:49.980 So you wouldn't be able to say, oh, Twitter has more or fewer bots than Facebook or whatever, based on this alone.
00:28:56.840 You couldn't do that.
00:28:57.940 And then, when Musk requested more information about the spam and fake accounts, Twitter provided a vague response.
00:29:14.960 They gave some outdated data, then they offered fake data that's, you know, not a real fire hose, Andrea says.
00:29:23.260 Then provided a clean data set where they already suspended the malicious accounts.
00:29:27.880 So basically, it looked like Twitter was trying to conceal the data rather than present it.
00:29:33.780 At least that's the allegation.
00:29:35.920 And Musk said it's a good summary of the problem.
00:29:38.660 And then Musk says, if Twitter simply provides their method, their method, that's the key word.
00:29:46.320 If they just provide their method of sampling 100 accounts and how they're confirmed to be real, the deal should proceed on original terms.
00:29:56.580 What a baller offer that is.
00:30:00.020 Oh, my God.
00:30:03.120 So, here's the nuance.
00:30:06.920 Elon Musk is not complaining directly that there are too many bot accounts.
00:30:11.520 He is complaining that they're selling him a company with a metric that is just bullshit.
00:30:19.760 So, basically, it's a fraudulent metric.
00:30:22.420 It's not a question of what number it produced.
00:30:25.540 It's a question of the metric can't be used for anything useful or it's not credible the way it's being presented.
00:30:33.400 So, in other words, he doesn't know what he's buying because although they've said it doesn't matter how many bots there are,
00:30:39.500 they have said, wait for it, they have said that the measure of the monetizable users is key to their business.
00:30:50.360 Now, try to hold these two things in your head at the same time.
00:30:53.940 This measurement that they use to tell how many are real users is, fundamentally, the driver of their business.
00:31:00.460 It creates all of their value.
00:31:02.280 All of their value is based on this metric.
00:31:06.100 They won't tell you what it is.
00:31:07.560 You want to spend $44 billion on a company that's got a little metric and the entire value of the company depends on how that metric works.
00:31:18.800 And he says, show me the metric and they won't do it.
00:31:23.980 Now, that's different.
00:31:26.260 That's different from show me the data.
00:31:29.460 You got that?
00:31:30.700 That's a big difference.
00:31:31.740 If the problem is show me the data, he has no argument because he already signed something that says the data might be wrong.
00:31:42.220 Right?
00:31:42.920 And it looks like he's not fighting that point.
00:31:45.740 He's not fighting the point of is the data right or wrong, which we all thought he was.
00:31:51.300 He's fighting the point of you're selling me or you're trying to sell me a company that has as its claim of value this algorithm that you won't show me.
00:32:01.660 Why would I pay $44 billion for an algorithm that is the engine of the company and you won't even show me how it works on a sample of 100?
00:32:11.420 You see how clever that is?
00:32:15.860 And why did we not understand that until today?
00:32:18.740 And I'm not even sure I totally understand it yet.
00:32:21.060 But I feel like I'm closer.
00:32:23.340 Right?
00:32:24.400 So it's not about the data because he wouldn't win that.
00:32:27.940 It's about the algorithm to come up with the data.
00:32:31.320 Because if the algorithm is completely bullshit, he doesn't have to argue about the quality of the data.
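As a side note on that "sample of 100 accounts" idea: standard proportion statistics give a feel for what an audit that small can and can't establish. The sketch below is generic confidence-interval arithmetic with invented numbers, not anything Twitter or Musk published.

```python
# Sketch: estimate a bot rate from a random sample of accounts, with a
# normal-approximation 95% confidence interval. Numbers are invented.
import math

def proportion_ci(hits: int, n: int, z: float = 1.96):
    """Point estimate and approximate 95% CI for a sampled proportion."""
    p = hits / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

p, low, high = proportion_ci(hits=5, n=100)  # say 5 bots found in 100 accounts
print(f"estimate {p:.0%}, 95% CI roughly {low:.0%} to {high:.0%}")
# Prints: estimate 5%, 95% CI roughly 1% to 9% -- wide enough that *how*
# the 100 accounts were chosen and verified matters as much as the count.
```

Which is consistent with the framing above: the fight is over the method, not the number.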
00:32:38.460 But I think.
00:32:41.160 I think.
00:32:43.160 Does that sound right?
00:32:45.100 Is anybody here who's a little closer to this story that could...
00:32:49.680 You know, I'd like to hear Barnes or Fry talk about this a little bit.
00:32:54.140 Does that sound close?
00:32:55.680 Okay.
00:32:56.420 I think it's close.
00:32:59.700 Anyway.
00:33:02.500 As I told you, Omicron and Joe Biden go together real well.
00:33:06.300 So Omicron Joe has got something called permanent Omicron.
00:33:11.680 He just keeps testing positive.
00:33:14.800 But it's sort of ideal because the symptoms are trivial.
00:33:18.220 It allows him to hide from the public, which always works well for his approval levels.
00:33:22.920 They're creeping up.
00:33:24.420 And best of all, it keeps him out of hair sniffing range from minors.
00:33:29.440 Because he doesn't really need another photo of him sniffing a little girl's hair.
00:33:33.500 He really doesn't need that.
00:33:34.540 So Omicron Joe, he's got a good thing going.
00:33:41.480 He should just ride that.
00:33:43.460 Now, there's a story that Director Wray of the FBI handed over thousands of tips about Kavanaugh during the Kavanaugh hearings.
00:33:55.500 And he gave it to the White House.
00:33:58.360 And then the White House is who determines which ones get investigated.
00:34:02.420 And so the scandal here, according to the Democrats, is that there were thousands of accusations against Kavanaugh that were not followed up on.
00:34:13.980 But the FBI says, well, we don't follow up on anything just to follow up.
00:34:20.660 We have to have a client, basically.
00:34:22.860 And the client was the White House.
00:34:24.220 So if the White House says, follow up on these, but not on these, they're just going to do what they're asked, it sounds like.
00:34:30.940 And apparently that's the way it's always been done.
00:34:33.320 So there's nothing unusual about that.
00:34:35.500 The FBI would only act on the tips that the White House looked at and said, yeah, these look serious and these are not.
00:34:42.100 So here's the question.
00:34:45.660 Did Kavanaugh rape thousands of women and the information was not followed up on?
00:34:54.260 Or is it more likely that in this sort of situation, you would get thousands of fake allegations just like you'd expect?
00:35:03.980 Well, you have to think that almost all of the allegations, if not every one, were fake.
00:35:10.280 So was the FBI supposed to follow up on every scant thing?
00:35:18.700 Here's the dirty secret of this sort of stuff.
00:35:21.580 You can usually tell by looking at it, if it's credible.
00:35:26.140 You just hear the story, you're like, uh-huh.
00:35:29.540 And then you say he flew in his spaceship to Kansas, where he molested you.
00:35:36.180 Okay?
00:35:37.340 Generally, you can kind of tell from the story itself.
00:35:40.740 How likely it is to be true.
00:35:42.400 And you don't have the ability to follow up on everything.
00:35:45.580 You can't follow up on thousands of tips.
00:35:48.640 Literally thousands.
00:35:49.680 You can't follow up on.
00:35:50.840 So I'm not sure there's any story here at all, except that the system worked the way it always did.
00:35:55.600 And the Democrats are saying, my God, that means that Kavanaugh was really guilty.
00:35:59.460 No, I don't think it means that at all.
00:36:04.320 I just think it means that, of course, there are going to be thousands of tips in this situation.
00:36:09.820 And almost all of them, or all of them, will be just complete bullshit.
00:36:13.880 Well, here's the payoff.
00:36:21.240 Now, there's two payoffs.
00:36:22.980 I now get to the best part of the presentation.
00:36:26.080 Are you ready?
00:36:26.400 So you waded through all this first part.
00:36:30.080 Here's the good part.
00:36:32.300 Story number one.
00:36:34.540 Why can't I get anybody in the news to ask this question of a prominent Democrat?
00:36:39.220 I'm going to keep pushing on this.
00:36:42.580 This is a question I want any prominent Democrat to be asked.
00:36:46.960 It doesn't matter who.
00:36:48.660 Any prominent Democrat, or supporter, or pundit.
00:36:51.960 Just ask this question.
00:36:57.560 If you could rig an election to keep a white supremacist out of office,
00:37:05.320 and you believed that that was the actual situation, would you do it?
00:37:11.860 That has to be asked of every Democrat.
00:37:14.500 Because remember, they do believe that Trump was a white supremacist.
00:37:19.620 So you need to ask them, would you be okay with a white supremacist president,
00:37:25.480 or would you take the risk of rigging it if you could,
00:37:28.840 and you thought you'd get away with it?
00:37:30.100 You have to make the Democrats, you meaning the news industry and the public,
00:37:38.160 we have to make them answer that question.
00:37:40.760 Because they've been getting away with it far too long,
00:37:44.000 the imagination that they wouldn't have a reason to rig an election.
00:37:50.720 And again, there's no evidence that I've seen
00:37:53.200 that suggests that the election would have gone differently
00:37:56.440 because it was rigged.
00:37:58.560 I've not seen any evidence that convinces me that happened.
00:38:02.460 But you need to ask them if it was a reasonable question.
00:38:07.500 And the problem is, it was completely reasonable for them to rig the election
00:38:11.740 under their assumption that they were stopping Hitler from taking power, basically.
00:38:17.460 So, make them answer the question.
00:38:20.900 Make them answer the question.
00:38:22.700 Now, I know people don't like to answer hypotheticals.
00:38:25.880 But this one's a real special hypothetical, isn't it?
00:38:29.820 You really have to put yourself out there and say, what's more important?
00:38:34.540 What's more important, that the system worked,
00:38:37.200 or that you kept what you thought was a monster out of an office?
00:38:42.820 Because I would have rigged the election in that situation.
00:38:46.080 If I believed what they say they believe, that Trump is such a monster,
00:38:52.100 if I were in that situation, and I had to save the country from the monster,
00:38:57.280 I would rig the election if I had any control over it.
00:39:01.460 Would you?
00:39:02.700 Let me ask the question of you.
00:39:04.760 If you were in that situation, and you thought Joe Biden was not just incompetent,
00:39:10.180 but you thought he was Hitler, would you rig the election if you could?
00:39:16.080 I hope so.
00:39:17.780 Now, some of you actually wouldn't.
00:39:19.620 So you would actually let a dictator destroy the country
00:39:22.140 because you protected a system that wouldn't exist after he destroyed the country.
00:39:28.240 Does that make any sense?
00:39:30.900 I think I'd have to push back on that a little bit.
00:39:33.320 I get what you're saying about the system being more important,
00:39:35.720 but I only agree with you if who you're looking at is a real live Trump,
00:39:41.000 not the exaggerated one,
00:39:42.580 and a real Joe Biden, not the exaggerated one where he's already brain dead.
00:39:49.440 If you're looking at that choice,
00:39:51.560 then I would definitely say let's protect the system,
00:39:54.400 even if maybe the system had some flaws in it.
00:39:58.460 The system would be more important.
00:40:00.160 But if you're talking about literally electing Hitler,
00:40:03.240 Hitler's going to change the systems.
00:40:04.960 The systems don't matter.
00:40:07.280 If you wouldn't stop Hitler from coming to power, if you could,
00:40:11.580 you've got a lot to explain.
00:40:14.360 Now, if you said you're afraid, well, okay.
00:40:18.160 That's actually a pretty good reason.
00:40:20.700 I actually completely...
00:40:22.820 I give anybody a pass who just says they're afraid
00:40:25.800 because you can't compare how afraid you are
00:40:28.940 to how afraid somebody else is.
00:40:30.940 We like to do that.
00:40:32.160 We like to think that the people who went to war were brave
00:40:35.740 and the people who stayed home were not brave.
00:40:39.520 And maybe there's something to that.
00:40:41.820 But I have to think we're just built differently.
00:40:44.740 There's some people who are built
00:40:46.560 to not worry about stuff that they should.
00:40:49.740 And there's some people who are built
00:40:51.080 to worry about things that maybe they shouldn't.
00:40:53.200 All right.
00:40:57.400 So be careful about protecting the system.
00:40:59.520 I'm all for protecting the system
00:41:00.860 over small imperfections,
00:41:03.720 but not over electing Hitler.
00:41:06.620 If you're going to do that,
00:41:07.860 then I'm going to destroy the system
00:41:09.980 if I need to stop it.
00:41:13.140 All right.
00:41:13.840 Here's the best thing of the day.
00:41:15.200 I have no source for this whatsoever.
00:41:18.720 But it was a cool video that I saw on Instagram,
00:41:22.920 so I forwarded it in my story.
00:41:25.640 So go to Instagram and look for me.
00:41:29.760 I think my Instagram is scottadams925.
00:41:34.100 I think that's my Instagram title, scottadams925.
00:41:39.060 But anyway, I don't have a check mark over there.
00:41:41.800 I don't have a very big account.
00:41:43.780 But the thing I sent around was
00:41:45.560 there was a researcher
00:41:46.840 who was trying to figure out
00:41:50.400 if what physics suggests is true.
00:41:54.780 And what physics suggests
00:41:56.400 is that our consciousness can alter our reality.
00:42:00.840 Not just the way we see it,
00:42:03.640 but the actual reality.
00:42:06.640 And the test that he says they did,
00:42:08.820 that he was involved in it,
00:42:10.040 is that they had some true random number generators
00:42:13.960 and they had some people try to move
00:42:15.580 the random generator
00:42:16.640 such that instead of drawing a straight line,
00:42:19.560 it would maybe go up or down.
00:42:23.040 And what they found was
00:42:28.360 some people could change the random number generator
00:42:32.480 by their intentions,
00:42:33.520 and others could not.
00:42:41.380 What would be the name of the people
00:42:43.860 who could not author the reality?
00:42:48.260 Yes, NPCs.
00:42:50.440 So there actually now is a test
00:42:52.660 to find out if you're an NPC.
00:42:56.000 You can actually test for it.
00:42:58.040 You just put them on the random number generator
00:43:00.300 and see if they can change it.
00:43:02.060 And if they can't,
00:43:03.800 they're just scenery.
00:43:05.420 And if they can,
00:43:07.260 they're a player.
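For what it's worth, a trial like the one described would normally be scored by comparing the bit stream produced during "intention" runs against pure chance. Here is a minimal sketch using a standard two-sided binomial test; this is generic statistics, not the actual protocol of the unsourced study.

```python
# Sketch: did a subject's "intention" bias a fair random bit stream?
# Score the run with an exact two-sided binomial test against 50/50 chance.
from math import comb

def binomial_two_sided_p(ones: int, n: int) -> float:
    """Probability a fair source deviates from n/2 at least this much."""
    deviation = abs(ones - n / 2)
    return sum(comb(n, k) * 0.5 ** n
               for k in range(n + 1)
               if abs(k - n / 2) >= deviation)

# Example: 1,000 bits recorded while the subject "intends" more 1s.
print(binomial_two_sided_p(ones=538, n=1000))  # ~0.018
# Suggestive on its face, but also the kind of result you would see now and
# then from chance alone, selective reporting, or an unblinded protocol.
```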
00:43:09.820 Now,
00:43:10.720 have I ever told you
00:43:13.040 that I can author reality?
00:43:18.000 And you thought to yourself,
00:43:19.900 well, not literally.
00:43:20.840 Literally.
00:43:24.300 Literally.
00:43:26.100 Literally.
00:43:26.500 Literally.
00:43:27.580 Yeah.
00:43:27.980 That is my actual belief.
00:43:30.680 My actual belief,
00:43:32.900 no reservations,
00:43:34.760 is that reality,
00:43:36.780 at least whatever this reality is,
00:43:38.660 you know,
00:43:38.940 it might be bits,
00:43:40.220 but our reality is absolutely changed
00:43:42.320 by our intentions.
00:43:44.580 The first person who told me that
00:43:46.520 is worth
00:43:50.120 many billions of dollars.
00:43:55.380 The first time I ever heard that.
00:43:58.080 It was Mark Benioff
00:43:59.320 who founded Salesforce.
00:44:02.700 And I was asking him,
00:44:03.600 you know,
00:44:03.740 about the secret of his success
00:44:04.900 or something like that.
00:44:05.800 One day I was just chatting with him
00:44:07.260 before a speech I was giving
00:44:08.840 for Salesforce.
00:44:10.520 And he told me that intention
00:44:13.020 is the thing that drives everything.
00:44:15.820 And I remember hearing that
00:44:17.660 and he sounded smart
00:44:18.920 and he was obviously successful.
00:44:20.920 And I could not process that.
00:44:23.680 For years,
00:44:24.840 I could not process that.
00:44:26.660 I knew it sounded right,
00:44:28.580 but I couldn't figure out
00:44:29.880 how do you get from the intention
00:44:31.020 to, like,
00:44:32.440 what's the connecting tissue?
00:44:34.300 How does intention
00:44:35.120 get into the real world
00:44:36.420 except by actions, right?
00:44:38.020 So I thought that really
00:44:39.580 what was missing
00:44:40.360 is a system versus a goal,
00:44:42.940 you know, a process.
00:44:44.320 How do you get the intention
00:44:45.300 into the real world?
00:44:47.360 Well,
00:44:48.220 maybe you don't have to do it.
00:44:51.640 Maybe the intention
00:44:52.640 just goes into the real world.
00:44:54.780 Because that's what
00:44:55.320 this test would suggest.
00:44:57.160 That you actually change
00:44:58.480 random outcomes
00:44:59.420 by wanting them to change.
00:45:04.160 Now,
00:45:04.840 can you reproduce this test?
00:45:08.340 I doubt it.
00:45:09.220 It's a lot of fun
00:45:11.380 to talk about.
00:45:12.600 But my guess is
00:45:13.640 that if you tried
00:45:14.800 to do this
00:45:15.320 and reproduce it,
00:45:16.340 I don't think so.
00:45:17.820 I would bet against it.
00:45:19.900 But,
00:45:20.440 my philosophy of life
00:45:21.980 says that you could.
00:45:23.960 That it would be reproducible.
00:45:26.280 I just don't know
00:45:27.420 in the real world
00:45:28.100 if you could.
00:45:31.280 Just,
00:45:31.920 this test is either a blind study
00:45:34.220 or it's just BS, right?
00:45:35.520 So somebody says
00:45:36.680 if it's not a blind study,
00:45:38.160 it's BS.
00:45:40.180 I'll accept that.
00:45:45.220 You thought physicists
00:45:46.340 were still in disagreement?
00:45:48.380 Yeah,
00:45:48.740 but I think the disagreement
00:45:49.760 has more to do
00:45:50.640 with the words
00:45:51.140 they put on it.
00:45:52.200 They don't disagree
00:45:53.000 on the formulas.
00:45:54.740 And the formulas
00:45:55.560 suggest
00:45:56.420 exactly what we say.
00:45:58.780 I mean,
00:45:59.020 the experiments
00:45:59.600 and formulas
00:46:00.160 do suggest
00:46:00.980 that human observation
00:46:02.680 changes reality.
00:46:03.660 And now we have
00:46:06.300 a test
00:46:06.680 that seems
00:46:07.100 to suggest it.
00:46:08.220 But,
00:46:08.820 again,
00:46:10.180 I'm going to put
00:46:10.940 my skeptical hat on
00:46:12.120 and it was
00:46:13.360 something
00:46:14.740 without a source
00:46:15.540 on social media.
00:46:18.260 What do you think
00:46:18.960 of claims
00:46:19.980 without sources
00:46:20.780 on social media
00:46:21.760 that are
00:46:22.320 really big claims?
00:46:24.660 I wouldn't believe them.
00:46:26.460 I wouldn't believe them.
00:46:30.100 All right.
00:46:33.660 Isn't that exactly
00:46:34.940 evidence of free will
00:46:36.120 if it is true?
00:46:40.100 No.
00:46:41.820 Good question.
00:46:42.960 So the question is,
00:46:44.260 would free will exist
00:46:45.720 if we can change
00:46:46.780 our reality
00:46:47.340 by our intention?
00:46:48.680 And the answer is,
00:46:49.800 in that example,
00:46:51.360 your intention
00:46:52.140 happened on its own.
00:46:55.220 The thing that would
00:46:56.340 have free will
00:46:57.000 would be the things
00:46:57.960 outside of you
00:46:58.760 because they could
00:46:59.940 go either way.
00:47:02.560 But it's your intention
00:47:02.560 that moved it
00:47:03.200 so you also
00:47:04.420 removed its free will.
00:47:06.940 Now,
00:47:07.700 so I believe
00:47:08.680 that no matter
00:47:09.220 what happens
00:47:09.940 inside you,
00:47:11.400 it's just what happens.
00:47:14.060 I guess
00:47:14.980 the edge case
00:47:16.640 would be
00:47:17.000 if you use
00:47:17.620 your intention
00:47:18.300 to change
00:47:20.080 your opinion.
00:47:22.720 If you use
00:47:23.720 your intention
00:47:24.340 to change
00:47:24.960 your opinion
00:47:25.540 or if you use
00:47:27.080 your intention
00:47:27.780 to give yourself
00:47:30.660 free will,
00:47:32.260 I don't know
00:47:33.100 that it's
00:47:33.440 logically possible.
00:47:36.300 There are things
00:47:37.340 that are
00:47:37.780 unpredictable,
00:47:39.220 but that doesn't
00:47:39.820 mean that they're
00:47:40.580 free will.
00:47:41.700 They're just
00:47:42.140 unpredictable.
00:47:43.740 Is a hole-in-one
00:47:44.720 an example?
00:47:46.000 Well,
00:47:46.300 a hole-in-one
00:47:46.680 could just be chance.
00:47:49.260 How much pot
00:47:50.220 did you smoke
00:47:50.800 this morning?
00:47:51.500 Not nearly enough.
00:47:53.420 No,
00:47:53.660 I didn't smoke
00:47:54.100 any pot this morning.
00:47:55.540 This is
00:47:56.220 my actual
00:47:57.060 personality.
00:47:58.400 I know.
00:47:59.240 Scary.
00:48:04.060 All right.
00:48:06.040 Fate is succumbing
00:48:07.160 to an internal
00:48:07.840 vacuum of ideas.
00:48:11.020 You know,
00:48:11.820 one of the most,
00:48:13.620 you want me to say
00:48:14.440 something that will
00:48:15.040 blow your mind
00:48:15.680 forever?
00:48:17.080 I'm going to leave
00:48:17.880 you with a thought
00:48:18.760 that's just going
00:48:21.020 to F you up.
00:48:23.600 You ready?
00:48:26.540 Consciousness
00:48:27.060 is just that
00:48:28.960 friction you feel
00:48:29.980 between what you
00:48:30.860 expected to happen
00:48:32.080 and what actually
00:48:33.540 happens.
00:48:35.000 Now,
00:48:35.440 that's the first part,
00:48:36.220 and I've said that
00:48:36.740 before.
00:48:37.620 And you can disagree
00:48:38.700 or agree.
00:48:39.880 Now I'm going to
00:48:40.400 prove it.
00:48:43.120 And my
00:48:43.620 corollary to that,
00:48:45.160 that the only way
00:48:46.280 you experience
00:48:46.820 consciousness
00:48:47.400 is that if what
00:48:48.340 you expect to happen
00:48:49.440 is a little different
00:48:50.920 than what happens,
00:48:51.660 even just in small ways.
00:48:53.000 And then I'm going
00:48:54.460 to take that
00:48:54.960 to the next level.
00:48:56.300 If everything
00:48:57.260 that happened
00:48:58.020 happened just the
00:48:59.720 way you expected
00:49:00.440 it, your
00:49:01.940 consciousness would
00:49:03.060 flip off.
00:49:04.220 It would just
00:49:04.520 turn off.
00:49:05.820 Do you know why?
00:49:07.060 Because everything
00:49:07.720 you expected
00:49:08.600 would be just
00:49:10.340 like you expected,
00:49:11.640 and pretty soon
00:49:12.620 you would stop
00:49:13.180 seeing it.
00:49:14.600 It would disappear.
00:49:16.380 There's no friction.
00:49:18.060 There would be
00:49:18.540 nothing to keep
00:49:19.240 alive your
00:49:19.780 consciousness.
00:49:21.400 Now, you're not
00:49:22.640 sold yet, are you?
00:49:24.660 What is it that you
00:49:25.660 do to go to sleep?
00:49:28.880 You lay completely
00:49:30.300 still until,
00:49:32.200 until what is
00:49:34.600 happening to you,
00:49:35.620 your body,
00:49:36.520 is exactly what
00:49:38.120 you expect to
00:49:39.280 happen.
00:49:40.720 Wait.
00:49:41.520 I'm laying
00:49:42.020 completely still.
00:49:43.540 The way I feel
00:49:44.520 now should feel
00:49:46.200 exactly the way I
00:49:47.640 feel in the next
00:49:48.540 second.
00:49:48.960 The moment it's true that
00:49:55.300 you're laying
00:49:55.700 so still that
00:49:57.400 your next
00:49:57.880 moment will be
00:49:58.540 identical to
00:49:59.280 your current
00:49:59.700 moment, that's
00:50:01.840 when you fall
00:50:02.260 asleep.
00:50:04.200 Because as soon
00:50:05.220 as the next
00:50:06.060 moment and the
00:50:06.900 current moment
00:50:07.420 are the same,
00:50:09.000 you lose
00:50:10.180 consciousness.
00:50:12.280 Because you
00:50:13.040 can't maintain
00:50:13.700 consciousness if
00:50:15.200 what you expect
00:50:16.180 is exactly what's
00:50:18.180 happening.
00:50:18.960 And the
00:50:19.920 only time you
00:50:20.420 ever see that
00:50:20.960 in your real
00:50:21.340 life is when
00:50:21.900 you're laying
00:50:22.280 completely still
00:50:23.220 and trying to
00:50:24.800 go to sleep.
00:50:29.280 How about
00:50:29.940 that?
00:50:33.840 How about
00:50:34.540 that?
00:50:35.600 Explain
00:50:36.040 anesthesia.
00:50:36.980 Same thing.
00:50:38.540 Anesthesia, as
00:50:39.420 soon as you get
00:50:40.240 it, it puts you
00:50:41.740 in a case where
00:50:42.440 your body doesn't
00:50:43.500 move.
00:50:44.000 And so your
00:50:45.140 next moment on
00:50:46.160 anesthesia will
00:50:46.980 be just like
00:50:47.660 your last
00:50:48.100 moment, and
00:50:49.360 then you're
00:50:49.640 asleep.
00:50:52.020 Now, the
00:50:52.860 drugged sleep
00:50:53.780 part is, I
00:50:54.680 think, a special
00:50:55.220 case.
00:50:55.680 So I wouldn't
00:50:56.180 go too far
00:50:57.280 into that
00:50:57.660 example.
00:50:58.880 But consciousness
00:51:00.440 is nothing but
00:51:01.760 the difference
00:51:02.340 between what you
00:51:03.200 expect and what
00:51:04.540 happened.
00:51:04.940 And all you
00:51:06.280 have to do is
00:51:07.260 give the
00:51:08.220 robot that
00:51:10.480 sensation, which
00:51:12.580 is easy, and
00:51:14.000 it has
00:51:14.260 consciousness.
00:51:15.620 The robot needs
00:51:16.540 to know, I
00:51:17.220 think if I take
00:51:17.960 this step, I'll
00:51:19.700 just move forward
00:51:20.500 and everything
00:51:20.980 will be fine.
00:51:22.020 Boop.
00:51:22.480 True.
00:51:23.660 I think the
00:51:25.060 temperature will
00:51:25.640 be X.
00:51:26.260 Well, it's this.
00:51:26.940 As long as
00:51:29.120 the robot or
00:51:30.220 AI is moving
00:51:31.140 through the
00:51:31.640 world and
00:51:32.720 measuring the
00:51:33.460 difference between
00:51:34.160 what it expects
00:51:34.960 and what it's
00:51:35.580 recording, it
00:51:37.020 would have
00:51:37.360 consciousness.
00:51:39.260 That's it.
00:51:40.340 It would have
00:51:40.800 consciousness.
00:51:41.900 That's all it
00:51:42.380 takes.
00:51:43.500 You know, assuming
00:51:44.060 you could process
00:51:44.820 that difference.
00:51:45.820 You'd have to be
00:51:46.280 able to process
00:51:46.900 the difference, and
00:51:48.220 it's that processing
00:51:49.020 that gives it the
00:51:49.720 illusion of
00:51:50.200 consciousness.
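Taken at face value, the loop being described is a predict-and-compare cycle. Below is a toy sketch of that loop; whether running it amounts to consciousness is the claim above, not something code can settle.

```python
# Sketch: an agent that predicts its next observation and treats the
# prediction error -- the "friction" -- as its running measure of surprise.

def surprise_loop(observations, predict):
    """Yield (observation, surprise) pairs, where surprise = |obs - expected|."""
    expected = None
    for obs in observations:
        surprise = 0.0 if expected is None else abs(obs - expected)
        yield obs, surprise
        expected = predict(obs)  # form the expectation for the next step

# An agent that expects the world to stay as it is (the "lying still" case):
readings = [20.0, 20.0, 20.0, 23.5, 20.0]  # a steady signal with one jolt
for obs, s in surprise_loop(readings, predict=lambda last: last):
    print(f"observed {obs:5.1f}, surprise {s:.1f}")
# On a perfectly constant stream the surprise column stays at zero -- the
# analogue of the "consciousness flips off" claim in the monologue.
```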
00:51:54.080 What is a
00:51:54.860 seizure, then?
00:51:55.520 I don't know.
00:51:55.960 So, what do
00:52:01.380 you think?
00:52:03.480 That going to
00:52:04.520 sleep proves that
00:52:06.200 consciousness is
00:52:07.060 only the difference
00:52:07.840 between what you
00:52:08.620 expect and what
00:52:09.400 happens.
00:52:11.740 You didn't expect
00:52:12.740 that to be so
00:52:13.360 good, did you?
00:52:14.360 No, you didn't.
00:52:15.780 Well, that is
00:52:16.480 going to get you
00:52:16.980 thinking today.
00:52:19.100 How high were
00:52:20.120 you when you
00:52:20.660 realized this?
00:52:22.760 That's a good
00:52:23.320 question.
00:52:25.960 Pretty high.
00:52:27.100 Pretty high.
00:52:29.880 All right.
00:52:30.580 And that's all for
00:52:31.120 now, YouTube.
00:52:32.700 Talk to you
00:52:33.300 tomorrow.
00:52:33.520 Thank you.