Real Coffee with Scott Adams - May 29, 2023


Episode 2123 Scott Adams: Kari Lake Update, Depression, Replacing Teachers, AI To Spot Fake News?


Episode Stats

Length

1 hour and 4 minutes

Words per Minute

142.1

Word Count

9,115

Sentence Count

665

Misogynist Sentences

11

Hate Speech Sentences

20


Summary

A family adopted a girl from Ukraine, but they think she's an adult pretending to be a six-year-old. Is it possible? And if so, what does that mean for the rest of the family?


Transcript

00:00:00.000 Do-do-do-do-do-do-do.
00:00:03.000 Good morning, everybody.
00:00:05.100 And it's Memorial Day.
00:00:07.860 And if you'd like to take a moment of silence on your end,
00:00:11.600 okay, that's good.
00:00:12.920 That's all you need for now.
00:00:14.400 We don't want to spend it all on the first thing in the morning.
00:00:17.540 You want to take lots of moments of silence during the day.
00:00:21.080 Now, if you'd like your Memorial Day to be the most special and,
00:00:26.220 oh, let's say, respectful one ever,
00:00:28.660 there's something you need that you haven't done yet.
00:00:32.060 And all you need is a cup or a mug or a glass,
00:00:33.940 a tankard, chalice, or a stein, a canteen, jug, or a flask,
00:00:37.180 a vessel of any kind.
00:00:40.060 Join me now for the unparalleled pleasure of the dopamine of the day,
00:00:43.200 the thing that makes everything better.
00:00:44.680 This one's for our fallen soldiers for Memorial Day.
00:00:50.500 It happens now.
00:00:51.800 Go.
00:00:55.980 Ah.
00:00:58.660 Delightful.
00:01:01.660 All right.
00:01:04.720 In no particular order,
00:01:07.000 my favorite story of the day was the family
00:01:09.880 that claims they adopted a six-year-old daughter,
00:01:14.260 or thought they did.
00:01:15.000 They thought they were adopting a six-year-old girl from Ukraine,
00:01:18.560 but they believe they got an adult
00:01:21.220 who's pretending to be a six-year-old girl.
00:01:23.700 That's not the interesting part.
00:01:27.440 The interesting part is that this person
00:01:30.720 that they believe is an adult pretending to be a six-year-old,
00:01:34.920 but other people aren't so sure,
00:01:36.980 she might actually be a kid,
00:01:38.980 but she's also trying to kill them.
00:01:41.920 Yeah.
00:01:42.120 On several occasions,
00:01:43.040 she tried to poison them and stab them
00:01:44.920 and trick the kids into running into traffic.
00:01:47.400 So she's a murderous,
00:01:51.020 potentially adult,
00:01:53.040 pretending to be a six-year-old
00:01:54.700 that got adopted from Ukraine.
00:01:59.200 Now,
00:01:59.880 if you were going to try to make an analogy
00:02:03.120 of the Ukraine war
00:02:05.600 and our involvement in it,
00:02:08.980 could you come up with anything
00:02:10.240 that would be better
00:02:11.100 than the six-year-old girl
00:02:13.120 who might be an adult
00:02:14.980 who might be trying to kill
00:02:16.580 every member of the family?
00:02:20.560 I'm just saying
00:02:21.540 that sometimes the reality
00:02:24.380 hands you a gift of an analogy
00:02:27.140 that's so perfect
00:02:28.120 you don't have to add anything to it.
00:02:31.020 Now, part of the story is that
00:02:32.420 if the person is an adult
00:02:35.340 masquerading as a child,
00:02:37.800 then it's because
00:02:40.300 there's some kind of dwarfism
00:02:42.640 disease or condition
00:02:43.960 involved,
00:02:44.680 I don't know what you call it.
00:02:46.360 But if it's an adult,
00:02:48.340 it's not a very high-functioning one.
00:02:51.240 And if it's a child,
00:02:52.420 it's weirdly advanced in some ways,
00:02:56.100 such as sexually.
00:02:59.380 So that's a lot of trouble there.
00:03:02.420 I don't know what's involved
00:03:04.040 in unadopting somebody,
00:03:05.940 but I would be unadopting
00:03:07.800 as quickly as I could.
00:03:10.800 All right, I have an idea
00:03:12.020 for fixing the cities.
00:03:13.640 You ready?
00:03:15.120 Idea for fixing the cities.
00:03:18.200 The federal government
00:03:19.180 should make it illegal
00:03:20.260 for elected city officials
00:03:27.040 to spend money.
00:03:30.200 Just couldn't do it.
00:03:31.000 You have to take the money spending
00:03:33.780 out of the elected officials' hands
00:03:35.600 for the cities.
00:03:36.320 The reason is
00:03:37.600 that it's all corrupt.
00:03:40.600 You don't fix anything
00:03:42.260 unless you have control
00:03:43.520 over where you move your money.
00:03:45.820 And if it's just going to
00:03:46.940 corrupt family members
00:03:48.320 and the hiring of cousins
00:03:51.260 and stuff like that, nothing gets fixed.
00:03:52.300 So, I'm not positive,
00:03:55.940 but my brief experience
00:03:57.440 of trying to deal
00:03:58.220 with a Democrat inner city,
00:04:01.560 because I had some experience
00:04:02.680 trying to do that,
00:04:04.020 is you actually can't do anything
00:04:06.760 unless you bribe them.
00:04:08.980 Did you know that?
00:04:11.440 If you're trying to work with a city
00:04:13.180 and trying to get them
00:04:14.500 to do anything,
00:04:15.400 they will ask for a bribe.
00:04:17.240 They'll do it directly.
00:04:18.900 Now, it'll usually be,
00:04:19.860 you know what would be great
00:04:21.180 is I've got this project
00:04:22.440 to try to build a park.
00:04:24.620 You know,
00:04:25.080 if you could help fund that park,
00:04:26.980 probably those things
00:04:28.320 you're asking for,
00:04:29.300 they might happen as well.
00:04:31.120 If you could just fund that park.
00:04:33.520 So, I think you have to take funding,
00:04:36.680 all kinds of funding and spending,
00:04:38.960 away from the local officials,
00:04:40.320 because the odds of them being corrupt
00:04:42.740 are near 100%.
00:04:44.800 It's near 100%.
00:04:47.480 So, you just have to take that power
00:04:49.360 away from them somehow.
00:04:50.620 Or have some kind of an oversight thing
00:04:52.480 that you trust.
00:04:53.180 I don't know how you'd design that.
00:04:55.380 All right.
00:04:56.760 So, are you all surprised
00:04:58.620 that at the last minute
00:04:59.940 they came up with a debt ceiling
00:05:02.240 increase that saved the country?
00:05:04.540 Big surprise?
00:05:07.900 Everybody?
00:05:08.900 Totally surprised
00:05:09.860 because it's what I told you
00:05:11.000 the first minute the story happened
00:05:12.700 that they don't get serious
00:05:14.420 until the last minute
00:05:15.560 because they don't have to.
00:05:17.620 And then the last minute
00:05:18.500 they look at their poll numbers
00:05:19.680 and they decide how bad it would be.
00:05:22.680 And then they make a deal
00:05:23.680 because they have to.
00:05:25.820 That was never real.
00:05:28.040 Yeah, that whole story
00:05:28.900 was never real.
00:05:30.560 They were always going to do
00:05:31.720 exactly what they did.
00:05:32.660 And then we'll argue over
00:05:35.460 who made the worse deal.
00:05:38.380 But, of course,
00:05:39.100 that will be aided by the fact
00:05:40.320 that we will not be informed
00:05:41.660 what the deal is.
00:05:43.560 And then it will be put
00:05:44.680 into some form
00:05:45.560 that you can't discern it
00:05:46.800 because it'll be too long.
00:05:48.980 So, Thomas Massie
00:05:50.040 was talking about the,
00:05:51.400 I don't know,
00:05:52.160 5,000 pages
00:05:53.260 it might end up being.
00:05:55.420 And he suspected
00:05:56.340 they would have 72 hours,
00:05:58.680 Congress,
00:05:59.660 to look over
00:06:00.600 the 5,000 pages
00:06:02.020 of detailed complexity
00:06:04.320 to decide
00:06:05.440 whether to vote for it.
00:06:07.860 We have come up
00:06:09.020 with a system
00:06:09.740 that's a confusopoly,
00:06:11.980 meaning that the only way
00:06:13.540 anything happens
00:06:14.440 is if people aren't sure
00:06:16.100 what it is they're doing.
00:06:18.040 Because if you were sure
00:06:19.120 you knew what you were doing,
00:06:20.180 you would object to it.
00:06:21.860 So you have to
00:06:23.060 reach a situation
00:06:24.440 with a system
00:06:25.400 where neither side
00:06:26.860 is quite sure
00:06:27.700 what happened.
00:06:28.260 I don't know,
00:06:29.620 I feel like we got
00:06:30.560 some stuff
00:06:31.180 and I feel like
00:06:32.500 the other side
00:06:33.160 got some stuff.
00:06:34.600 I have a vague understanding
00:06:36.100 of what our stuff was
00:06:37.300 and what their stuff is,
00:06:38.380 but I don't really know
00:06:39.580 how to compare them
00:06:40.540 because I only have
00:06:41.740 these two vague ideas
00:06:43.020 of who got what.
00:06:44.280 So I can't even argue
00:06:45.800 because I don't know
00:06:46.620 if the other side
00:06:47.360 did better than us.
00:06:48.600 It's confusing.
00:06:50.600 So we have a system
00:06:51.740 that requires
00:06:52.600 nobody to understand
00:06:53.600 what's happening
00:06:54.260 or it wouldn't happen.
00:06:55.940 Think about that.
00:06:58.720 The system requires
00:07:00.020 that not only the public
00:07:02.040 but the people
00:07:02.700 voting for the bills
00:07:03.640 not quite understand them.
00:07:05.940 Because if they did,
00:07:07.120 they would have
00:07:07.580 specific things
00:07:08.440 to complain about.
00:07:09.840 But if they don't
00:07:10.560 quite understand
00:07:11.500 the bigger picture,
00:07:13.360 they can't
00:07:14.600 really get any
00:07:15.280 traction to complain.
00:07:17.080 So the only way
00:07:17.820 you can get rid
00:07:18.340 of the complaints
00:07:18.860 is to make it
00:07:19.540 too confusing
00:07:20.160 for anybody
00:07:20.720 to participate
00:07:21.600 in a meaningful way.
00:07:24.040 That's our system.
00:07:25.940 Our system
00:07:27.220 is to remove
00:07:28.700 humans from the process
00:07:30.380 by making it
00:07:31.640 too complicated
00:07:32.280 for them to participate.
00:07:36.300 Is that correct
00:07:37.620 or no?
00:07:38.500 Does that characterization
00:07:40.020 capture what we're watching
00:07:42.440 every time they go
00:07:43.280 through this process?
00:07:44.860 It's exactly that.
00:07:46.880 They make it
00:07:47.680 too complicated
00:07:48.440 so humans
00:07:49.300 can't productively
00:07:50.380 get involved.
00:07:52.120 Because you'd never
00:07:53.160 get anything done
00:07:53.800 if humans
00:07:54.280 were productively involved.
00:07:55.940 They have to
00:07:57.840 create a system
00:07:59.200 which is absurd
00:08:00.240 by its design.
00:08:02.360 It's intentionally
00:08:03.200 absurd.
00:08:04.280 Because that's actually
00:08:05.060 the only way
00:08:05.560 anything gets done.
00:08:08.220 So Thomas Massie
00:08:09.580 on his tweet
00:08:10.460 about, you know,
00:08:11.620 they'd have 72 hours
00:08:12.800 to look at 5,000 pages
00:08:14.120 of complexity.
00:08:15.780 And so I suggested
00:08:17.040 they ask AI
00:08:19.340 to summarize it for them.
00:08:22.500 Let that sink in.
00:08:23.620 What if it could?
00:08:26.520 What if you could
00:08:27.300 just turn AI loose
00:08:28.300 on it and say,
00:08:28.880 all right,
00:08:29.040 here's the summary.
00:08:30.400 I'm going to summarize
00:08:31.080 this in plain English
00:08:32.320 so you can see
00:08:33.380 what's happening.
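A minimal sketch of what turning an AI loose on a bill might look like, assuming the OpenAI Python client; the model name, chunk size, and the bill_text input are placeholders, and a real 5,000-page bill would need a far more careful pass than this:

```python
# Hypothetical sketch: summarize a long bill chunk by chunk, then summarize the summaries.
# Assumes the OpenAI Python client (pip install openai) with OPENAI_API_KEY set.
from openai import OpenAI

client = OpenAI()

def ask(text: str, instruction: str) -> str:
    # Send one instruction-plus-text request to a chat model (placeholder model name).
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"{instruction}\n\n{text}"}],
    )
    return resp.choices[0].message.content

def summarize_bill(bill_text: str, chunk_chars: int = 12_000) -> str:
    # Split the bill into chunks small enough to fit the model's context window.
    chunks = [bill_text[i:i + chunk_chars] for i in range(0, len(bill_text), chunk_chars)]
    partials = [ask(c, "Summarize this excerpt of a bill in plain English.") for c in chunks]
    # Second pass: condense the per-chunk summaries into one plain-English overview.
    return ask("\n\n".join(partials),
               "Combine these excerpt summaries into one plain-English summary of the whole bill.")
```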
00:08:35.180 Well, the first problem
00:08:36.240 is AI might not
00:08:37.220 do it right.
00:08:38.520 Right?
00:08:38.820 Because AI
00:08:39.440 is a little biased
00:08:41.020 and sometimes
00:08:41.660 it makes up stuff.
00:08:44.040 So it could actually
00:08:44.780 just make up some shit
00:08:45.860 and say,
00:08:46.660 oh, this is in the bill,
00:08:48.140 and point to some part
00:08:48.960 of the bill
00:08:49.520 that isn't actually there.
00:08:51.440 At which point
00:08:52.120 you'd say,
00:08:52.800 oh, that's not even
00:08:54.140 in that bill.
00:08:55.480 So that's a risk.
00:08:56.740 But I suppose
00:08:57.160 we could probably
00:08:57.780 fact check that
00:08:58.540 pretty quickly
00:08:59.100 and know what's actually in it.
00:09:01.340 But what if it worked?
00:09:04.820 Well, what if AI
00:09:06.040 could actually tell you
00:09:07.380 in a real summary way
00:09:08.880 even who did a better job
00:09:11.760 of negotiating?
00:09:12.740 That'd be really dangerous.
00:09:14.300 But suppose it could
00:09:15.020 just tell you
00:09:15.540 what the bill is
00:09:16.280 so you wouldn't
00:09:17.440 have to read it.
00:09:20.380 They would have to ban it.
00:09:22.840 AI would be banned
00:09:24.000 from that use
00:09:25.040 because it would
00:09:26.460 allow you to understand
00:09:27.460 what's in the bill
00:09:28.220 and that would break
00:09:29.060 the whole system.
00:09:30.160 That's not a joke.
00:09:31.560 If you understood
00:09:32.560 what was in it,
00:09:33.960 it couldn't get passed.
00:09:36.000 That's just true.
00:09:39.340 So we keep finding
00:09:42.280 these examples
00:09:43.180 where even if AI
00:09:45.120 could solve
00:09:45.860 what you think
00:09:46.480 is the problem,
00:09:47.440 it's not the real problem.
00:09:49.720 The real problem
00:09:50.720 is that we can't
00:09:51.760 deal with decision making.
00:09:53.240 We don't have
00:09:54.020 any capacity
00:09:54.680 to do that.
00:09:55.920 You know,
00:09:56.360 beyond a certain level
00:09:57.180 of complexity
00:09:57.760 and self-interest,
00:09:59.320 we don't have
00:09:59.740 any capacity.
00:10:01.220 So we would have
00:10:03.340 to ban AI
00:10:04.260 from summarizing
00:10:05.340 legislation
00:10:06.060 because the net effect
00:10:07.820 is we would
00:10:08.380 understand it.
00:10:10.800 Just think about that.
00:10:12.300 That's a real thing.
00:10:13.640 I mean that
00:10:14.340 entirely seriously.
00:10:15.560 You could not
00:10:16.740 have a good
00:10:17.240 understanding
00:10:17.780 by the public
00:10:18.540 of the government's
00:10:20.020 operation
00:10:20.480 without breaking it.
00:10:22.600 And then what do you do?
00:10:25.220 All right.
00:10:26.620 Jane Fonda
00:10:27.740 has helpfully
00:10:30.940 suggested
00:10:31.520 that the real problem
00:10:34.760 of the world
00:10:35.460 is white men
00:10:37.700 and that the white
00:10:39.560 men,
00:10:39.960 the patriarchy,
00:10:41.340 they're responsible
00:10:42.160 for racism
00:10:42.900 and, of course,
00:10:44.060 climate change
00:10:44.840 indirectly.
00:10:46.160 Well,
00:10:46.440 directly,
00:10:46.940 I guess.
00:10:47.860 And her suggestion
00:10:49.040 is that maybe
00:10:49.520 all the white men
00:10:50.440 should be rounded up
00:10:51.460 and put in jail,
00:10:52.600 arrested and jailed.
00:10:53.600 So,
00:10:56.620 got that going on.
00:10:58.860 You know,
00:10:59.420 I thought about,
00:11:00.360 oh,
00:11:00.680 I'm going to add
00:11:01.480 my layers
00:11:02.340 of interesting
00:11:02.980 commentary
00:11:03.580 on top of that.
00:11:05.460 What the hell
00:11:06.200 can I layer
00:11:06.780 on top of that?
00:11:08.640 Is there anything
00:11:09.460 that can be funnier
00:11:10.400 or more ridiculous
00:11:11.280 or more absurd
00:11:12.160 or more of a
00:11:13.620 sign of the times?
00:11:15.920 Oh,
00:11:16.220 let's do our
00:11:16.720 little thing
00:11:17.260 where we reverse
00:11:18.280 the ethnicities
00:11:21.400 of the people
00:11:21.900 involved.
00:11:22.460 So,
00:11:23.600 let's see.
00:11:24.020 Jane Fonda
00:11:24.520 said that
00:11:25.160 all the problems
00:11:26.300 are black men
00:11:28.120 and they should
00:11:28.600 all be arrested
00:11:29.120 and jailed.
00:11:31.220 Now,
00:11:31.480 she didn't say that,
00:11:33.080 but I wonder
00:11:34.180 if she could have.
00:11:37.000 No,
00:11:37.540 of course she could.
00:11:38.660 Of course she could.
00:11:40.700 All right.
00:11:41.540 I saw the
00:11:42.440 Khan Academy
00:11:44.680 founder.
00:11:46.140 I guess his name
00:11:47.200 is Khan.
00:11:48.380 And he was talking
00:11:49.320 about how AI
00:11:50.120 is going to change education.
00:11:51.280 It's going to be
00:11:52.500 such a radical
00:11:53.600 change
00:11:54.800 that basically
00:11:56.820 everything
00:11:57.680 will be different
00:11:58.420 in about a year
00:11:59.440 probably.
00:12:00.600 This is probably
00:12:01.320 the best thing
00:12:01.880 that AI will do
00:12:02.820 unless it gets
00:12:03.760 outlawed,
00:12:04.300 which is possible.
00:12:07.160 Because there's
00:12:08.140 no possibility
00:12:10.060 that AI won't do
00:12:12.340 a better job
00:12:12.960 than real teachers.
00:12:15.500 Am I right?
00:12:16.900 Now,
00:12:17.120 maybe not on
00:12:17.700 day one.
00:12:18.720 On day one,
00:12:19.320 maybe the best
00:12:19.960 human teacher
00:12:20.560 is still better
00:12:21.200 than the best
00:12:21.900 AI.
00:12:22.840 But there aren't
00:12:23.600 that many of those.
00:12:25.100 Whoever is the
00:12:25.940 best human teacher
00:12:27.100 is top 5%.
00:12:28.280 Not many people.
00:12:30.840 So,
00:12:31.240 for most people,
00:12:33.540 the AI
00:12:34.660 would be much better.
00:12:36.260 Now,
00:12:36.540 I saw some pushback
00:12:37.560 and people said,
00:12:38.980 Scott,
00:12:39.540 you saw what
00:12:40.340 remote learning
00:12:41.180 did during the
00:12:42.240 pandemic.
00:12:43.460 Obviously,
00:12:44.300 remote learning
00:12:45.100 doesn't work.
00:12:47.040 We never tested
00:12:48.000 AI remote learning.
00:12:49.820 We tested humans.
00:12:51.800 And the human
00:12:52.900 was one person
00:12:53.880 with a crowd
00:12:54.560 of a bunch
00:12:55.800 of people
00:12:56.260 watching,
00:12:57.360 who weren't
00:12:58.040 being supervised.
00:12:59.360 If you had a
00:13:00.120 one-on-one class
00:13:01.160 where your AI
00:13:01.920 is designed
00:13:02.820 to be exactly
00:13:04.120 the character
00:13:04.840 that your child
00:13:05.660 wants to listen to,
00:13:07.460 you know,
00:13:07.700 if the kid
00:13:08.460 is 5 years old,
00:13:10.300 they give it,
00:13:11.380 you know,
00:13:11.720 Barney or Pikachu
00:13:13.160 or some character
00:13:14.340 to teach them
00:13:15.380 English,
00:13:17.060 teach them,
00:13:17.860 you know,
00:13:18.900 ABCs.
00:13:20.080 But if they're
00:13:20.860 a little older,
00:13:21.780 they're teenagers,
00:13:22.960 maybe the boys
00:13:23.660 want a female
00:13:24.920 in some cases.
00:13:26.560 Maybe the women
00:13:27.600 want a female too.
00:13:29.020 There might be
00:13:29.640 some types of teachers
00:13:30.640 that just work better
00:13:32.200 for some personality
00:13:33.140 types at certain ages.
00:13:35.160 So,
00:13:35.800 if you had
00:13:36.240 exactly the right AI
00:13:38.120 that looked
00:13:39.460 and acted human,
00:13:40.380 but it was exactly
00:13:41.560 the one you were
00:13:42.300 willing to listen to,
00:13:44.780 do you think
00:13:45.240 you'd learn more?
00:13:46.420 If you could
00:13:46.760 actually interact
00:13:47.660 and they could
00:13:48.580 learn about you
00:13:49.460 and they could
00:13:49.980 ask how your day went
00:13:51.420 and they would
00:13:52.380 know what you did
00:13:53.080 when you weren't
00:13:53.560 with them
00:13:53.920 and they could
00:13:55.000 follow up on it
00:13:55.980 and all that stuff?
00:13:57.900 I don't know.
00:13:59.440 So,
00:13:59.860 but there is
00:14:00.600 one wild card
00:14:01.560 here that
00:14:02.960 I don't know
00:14:04.420 how to value
00:14:05.000 and it goes
00:14:06.760 like this.
00:14:08.140 If you
00:14:08.740 tell me,
00:14:10.420 Scott,
00:14:10.780 you have to go
00:14:11.360 exercise
00:14:11.880 and you say
00:14:13.780 you have two choices.
00:14:14.840 You can do it
00:14:15.320 in your home gym
00:14:16.660 all by yourself
00:14:17.440 or you could
00:14:18.880 do it at a gym
00:14:19.600 where everybody
00:14:20.420 is exercising
00:14:21.180 at the same time.
00:14:23.220 Only one of those
00:14:24.240 is going to get me
00:14:24.980 to exercise enough
00:14:26.180 and believe me,
00:14:27.200 I'm struggling
00:14:27.940 with this
00:14:28.380 because I quit
00:14:28.900 my gym.
00:14:29.800 I absolutely
00:14:30.680 can't work out
00:14:31.520 at home as much.
00:14:32.900 I just cannot
00:14:33.780 put in the time
00:14:34.760 because I'm bored.
00:14:37.580 I'm bored.
00:14:38.080 And I need
00:14:39.040 the energy
00:14:39.560 of the other
00:14:40.100 people.
00:14:40.960 I need
00:14:41.280 the peer
00:14:41.980 pressure
00:14:42.360 of the other
00:14:42.860 people to lift
00:14:43.940 me up.
00:14:44.460 That's why
00:14:44.820 I belong to
00:14:45.580 a gym.
00:14:46.260 That was the
00:14:46.760 main reason
00:14:47.160 I belonged.
00:14:48.460 And it's
00:14:49.700 the same
00:14:50.020 with school.
00:14:51.440 You put me
00:14:52.300 in a human
00:14:53.240 school situation
00:14:54.400 where I'm
00:14:55.280 looking around
00:14:55.840 at my competitors
00:14:56.720 and I saw
00:14:57.740 them that way
00:14:58.240 by the way.
00:14:58.980 I saw the
00:14:59.560 other students
00:15:00.080 as my
00:15:00.480 competitors
00:15:00.940 and I would
00:15:01.980 be like,
00:15:02.840 game on.
00:15:04.120 Give me that
00:15:04.580 test.
00:15:05.300 I'm going to
00:15:05.580 see if I can
00:15:06.020 beat all these
00:15:06.640 people.
00:15:08.240 And so the
00:15:09.300 peer pressure,
00:15:10.500 the feeling of
00:15:11.100 other people
00:15:11.520 doing it at
00:15:12.020 the same time,
00:15:13.420 it would put
00:15:14.160 me in the
00:15:14.580 right energy
00:15:15.220 for doing the
00:15:16.560 things the
00:15:16.960 school wanted
00:15:17.460 me to do.
00:15:18.500 Now you put
00:15:19.100 me at home
00:15:19.700 on my own
00:15:20.440 schedule and
00:15:21.800 I turn on
00:15:22.340 this thing and
00:15:22.900 the AI is
00:15:23.440 perfect.
00:15:24.440 It knows
00:15:24.720 everything.
00:15:25.220 It's a great
00:15:25.660 teacher.
00:15:26.760 But do I
00:15:27.380 have the right
00:15:27.840 energy?
00:15:29.280 Do I have
00:15:29.920 the right
00:15:30.260 energy to
00:15:30.840 want to sit
00:15:31.260 there and
00:15:31.680 listen to a
00:15:32.180 machine for
00:15:32.860 six hours
00:15:34.280 per day?
00:15:35.440 I feel
00:15:36.800 like the
00:15:37.280 human part
00:15:38.200 can't be
00:15:39.700 removed without
00:15:40.440 removing the
00:15:41.580 incentive, the
00:15:43.260 shared incentive,
00:15:45.280 the slop over
00:15:46.440 feeling from
00:15:47.140 your fellow
00:15:47.620 human beings.
00:15:49.020 I think you
00:15:49.500 need the slop
00:15:51.380 over feeling
00:15:52.060 from other
00:15:52.540 people to get
00:15:53.140 some stuff
00:15:53.640 done.
00:15:54.460 So that's
00:15:54.820 what I worry
00:15:55.240 about.
00:15:56.220 Now there
00:15:56.560 might be a
00:15:57.500 way to fix
00:15:57.940 that by
00:15:59.100 making sure
00:15:59.680 that when
00:16:00.040 you use your
00:16:00.620 AI there
00:16:01.440 were other
00:16:01.740 people in
00:16:02.120 the room
00:16:02.540 and other
00:16:03.220 students and
00:16:04.180 you met
00:16:06.040 with your
00:16:06.320 other students
00:16:06.900 every half
00:16:07.960 hour to
00:16:09.320 do a
00:16:09.580 project or
00:16:10.240 something.
00:16:10.700 I don't
00:16:10.900 know.
00:16:11.320 There's
00:16:11.500 probably some
00:16:12.160 way to get
00:16:12.500 the humans
00:16:13.000 back in
00:16:13.500 there.
00:16:14.280 But if
00:16:14.740 you take
00:16:15.040 the humans
00:16:15.580 out, I
00:16:18.380 just worry
00:16:18.860 that your
00:16:19.180 normal
00:16:19.640 biology won't
00:16:21.840 rise to
00:16:22.420 the challenge.
00:16:25.920 All right.
00:16:27.520 I can tell
00:16:28.360 you that I
00:16:28.800 had the
00:16:29.220 hottest French
00:16:30.100 teacher in
00:16:30.600 the history
00:16:31.040 of French
00:16:31.840 teachers.
00:16:32.240 And if
00:16:33.360 you don't
00:16:33.660 think I
00:16:34.060 enjoyed going
00:16:34.620 to that
00:16:34.980 class every
00:16:35.540 day, and
00:16:36.680 did it make
00:16:37.160 me study
00:16:37.860 harder because
00:16:39.200 the French
00:16:40.740 teacher, Ms.
00:16:42.480 Rawson, I
00:16:43.160 remember her
00:16:43.680 quite clearly,
00:16:45.880 was so
00:16:47.400 insanely
00:16:47.860 attractive that
00:16:48.960 you basically
00:16:50.580 could barely
00:16:51.280 concentrate.
00:16:54.420 I got an
00:16:55.080 A in
00:16:55.360 French.
00:16:56.740 I got an
00:16:57.240 A in
00:16:57.540 French.
00:16:57.980 And when
00:16:58.700 my sister
00:16:59.260 took the
00:16:59.720 same course
00:17:00.340 a few years
00:17:00.860 later, I
00:17:02.540 was used
00:17:03.020 as an
00:17:03.380 example of
00:17:03.960 somebody who
00:17:04.420 could get
00:17:04.820 an A in
00:17:05.640 a language
00:17:06.140 course without
00:17:07.240 being able to
00:17:07.980 speak any
00:17:08.440 of it.
00:17:11.220 I'm the
00:17:11.820 worst French
00:17:12.680 speaker, but
00:17:14.120 if you give
00:17:14.500 me a test
00:17:15.820 with multiple
00:17:16.380 choice or
00:17:16.920 something, I
00:17:17.520 can pretty
00:17:17.860 much ace
00:17:18.320 it.
00:17:19.300 So I had
00:17:20.560 an A in
00:17:21.040 French and
00:17:21.540 couldn't speak
00:17:21.940 a sentence,
00:17:24.180 except, you
00:17:24.640 know,
00:17:25.260 où est la
00:17:25.900 bibliothèque.
00:17:26.740 That was
00:17:26.960 about it.
00:17:27.380 All right,
00:17:29.780 but I
00:17:29.960 liked the
00:17:30.260 class.
00:17:33.520 So here's
00:17:34.360 the good
00:17:34.600 news.
00:17:35.060 If AI
00:17:35.680 replaces human
00:17:36.780 teachers, we
00:17:37.520 will have a
00:17:38.360 way to get
00:17:38.780 rid of the
00:17:39.200 biggest source
00:17:41.280 of systemic
00:17:42.640 racism in
00:17:43.420 the country.
00:17:45.440 Imagine, if
00:17:46.140 you will,
00:17:47.640 that a
00:17:48.460 poor black
00:17:49.580 kid in an
00:17:50.340 urban area
00:17:51.060 could get an
00:17:52.220 AI teacher
00:17:52.960 that's as
00:17:53.440 good as
00:17:53.760 everybody
00:17:54.080 else's
00:17:54.780 AI teacher.
00:17:55.640 No
00:17:55.860 difference.
00:17:56.260 It's just
00:17:57.040 as good,
00:17:57.480 for the
00:17:58.860 first time
00:17:59.320 ever.
00:18:02.040 In a
00:18:02.720 related
00:18:03.140 point, can
00:18:04.660 you think
00:18:05.060 of any
00:18:05.520 profession in
00:18:07.220 which the
00:18:09.880 profession as
00:18:10.700 a whole has
00:18:11.780 failed so
00:18:12.700 spectacularly over
00:18:13.880 the last, let's
00:18:14.720 say, 10
00:18:15.120 years?
00:18:16.620 I keep
00:18:17.440 reading about
00:18:18.160 the number
00:18:18.780 of specifically
00:18:20.600 black students
00:18:21.780 who can read
00:18:22.760 at grade
00:18:23.500 level.
00:18:24.100 It's something
00:18:24.780 like 12%.
00:18:25.940 Or in
00:18:26.900 some places,
00:18:29.240 zero.
00:18:29.940 Zero.
00:18:31.720 Zero?
00:18:32.820 There's not a
00:18:33.620 single person
00:18:34.260 who could read
00:18:34.800 at grade
00:18:35.180 level?
00:18:36.880 Can you
00:18:37.540 think of
00:18:37.880 any situation
00:18:38.660 where you
00:18:39.020 could keep
00:18:39.480 your job
00:18:40.100 with those
00:18:40.660 statistics?
00:18:44.160 The future
00:18:45.020 AI teachers
00:18:45.840 unions?
00:18:46.500 Yeah, that
00:18:46.780 could happen.
00:18:47.240 Well, so
00:18:54.240 maybe if we
00:18:54.920 get rid of
00:18:55.400 teachers, we
00:18:56.300 could get
00:18:56.580 rid of
00:18:56.820 teachers'
00:18:57.260 unions, and
00:18:57.900 then maybe
00:18:58.460 black Americans
00:18:59.620 have a chance
00:19:00.280 in the United
00:19:00.960 States.
00:19:01.780 Because otherwise
00:19:02.380 you're just
00:19:02.800 going to get
00:19:03.140 teachers that
00:19:03.780 either won't
00:19:05.020 or can't
00:19:05.640 control the
00:19:06.600 class and
00:19:07.120 can't teach.
00:19:07.860 For whatever
00:19:08.340 reason, they
00:19:08.820 can't teach.
00:19:09.740 It might not
00:19:10.280 be the
00:19:10.580 teacher's fault,
00:19:11.800 but I've
00:19:12.320 never seen a
00:19:12.820 profession that
00:19:13.460 failed thoroughly,
00:19:15.000 like just
00:19:16.120 completely, and
00:19:17.920 they still have
00:19:18.440 their jobs.
00:19:20.280 What's up with
00:19:20.900 that?
00:19:22.540 Yeah, I think
00:19:23.400 that the
00:19:23.840 wokeness problem,
00:19:25.040 the falling
00:19:26.240 behind other
00:19:26.820 countries problem,
00:19:28.380 pretty much all
00:19:29.360 of it is related
00:19:30.320 to our bad
00:19:31.520 teachers, which
00:19:33.340 is related to
00:19:34.000 the bad
00:19:34.400 teachers'
00:19:35.040 unions.
00:19:36.980 All right.
00:19:39.180 There's a
00:19:39.980 study using
00:19:41.020 magnetism to
00:19:42.940 try to cure
00:19:43.580 depression in
00:19:44.520 some types
00:19:44.940 of people.
00:19:45.560 So they
00:19:45.960 say it
00:19:46.240 won't work
00:19:46.660 for all
00:19:47.000 types of
00:19:47.400 depression,
00:19:48.460 but it's
00:19:49.080 so weird
00:19:49.520 what they
00:19:49.940 did.
00:19:50.340 They used
00:19:50.760 magnetic
00:19:51.620 technology to
00:19:54.720 quote,
00:19:55.440 reverse the
00:19:56.280 flow of
00:19:56.880 the neural
00:19:57.420 stream from
00:19:58.920 one part of
00:19:59.680 the brain to
00:20:00.220 the other
00:20:00.540 part of the
00:20:00.940 brain.
00:20:02.420 How wild
00:20:02.940 is that?
00:20:04.380 So they
00:20:04.760 know that the
00:20:06.400 normal brain of
00:20:07.900 a depressed
00:20:08.380 person, again,
00:20:09.440 not every
00:20:10.020 depressed person,
00:20:11.140 they're different,
00:20:12.180 but some
00:20:12.720 category of
00:20:13.360 depressed
00:20:13.660 people,
00:20:14.520 they can
00:20:14.900 identify with
00:20:15.720 imaging that
00:20:16.860 there's too
00:20:17.320 much neural
00:20:18.760 activity going
00:20:19.580 from one part
00:20:20.240 of the brain
00:20:20.600 to the other.
00:20:21.860 So they can
00:20:22.420 actually hook
00:20:23.000 a magnetic
00:20:24.060 device to you
00:20:25.040 and reverse the
00:20:26.600 flow of the
00:20:27.400 neural traffic.
00:20:29.320 And it makes
00:20:30.100 you feel good
00:20:30.780 because it makes
00:20:31.920 you feel the way
00:20:32.500 regular people do
00:20:33.360 because that's the
00:20:34.300 direction their
00:20:34.880 neural pathways go.
00:20:36.260 And I guess you
00:20:36.960 can train it
00:20:37.680 after a while to
00:20:38.880 not need the
00:20:39.460 magnets, so you
00:20:40.820 can actually just
00:20:41.420 reverse it and
00:20:42.640 cure people in
00:20:43.840 a fairly short
00:20:44.520 amount of
00:20:44.900 time.
00:20:45.800 Now here's
00:20:46.300 what I love
00:20:48.060 about that.
00:20:49.460 Suppose it's
00:20:50.400 true that we
00:20:52.260 can identify the
00:20:53.340 flow of
00:20:54.200 neural,
00:20:56.300 whatever it is,
00:20:58.640 neural streams,
00:21:00.580 I don't know,
00:21:01.320 neurons.
00:21:02.180 I don't know
00:21:02.780 exactly what's
00:21:03.400 flowing.
00:21:03.900 I guess just the
00:21:04.500 electrical signal
00:21:05.380 is the thing
00:21:06.440 that's flowing.
00:21:07.680 But just a
00:21:08.820 signal is flowing,
00:21:09.580 right?
00:21:09.720 It's not
00:21:10.060 physically anything,
00:21:11.120 it's a signal
00:21:11.820 that's flowing.
00:21:13.140 So what if we
00:21:14.620 can identify all
00:21:15.920 of the major
00:21:16.880 neural pathways
00:21:17.840 that are normal
00:21:19.020 and we can watch
00:21:20.700 them flow?
00:21:21.840 And what if this
00:21:22.740 technology improves
00:21:24.460 so that we can
00:21:25.640 target any one of
00:21:26.600 those neural
00:21:27.140 pathways and
00:21:28.480 reverse or slow
00:21:29.700 down its flow?
00:21:31.420 You could actually
00:21:32.760 physically reprogram
00:21:35.060 a brain, and they
00:21:36.420 already are.
00:21:37.360 You can physically
00:21:38.360 reprogram a brain.
00:21:39.980 Physically.
00:21:42.580 And do you know
00:21:44.140 how you reprogram
00:21:45.440 a computer?
00:21:48.120 Same way.
00:21:49.840 That's how you
00:21:50.460 reprogram a
00:21:51.500 computer.
00:21:53.060 With magnetic
00:21:54.220 technology.
00:21:55.700 If you want to
00:21:56.460 put something on
00:21:57.220 a hard disk, you
00:21:59.180 change it
00:21:59.660 magnetically.
00:22:01.120 It's the same
00:22:01.700 damn technology
00:22:02.480 we're using for
00:22:03.140 computers, just
00:22:04.420 optimized for a
00:22:05.280 brain that's moist
00:22:06.120 instead of silicon
00:22:07.100 that's dry.
00:22:08.400 And that's the
00:22:08.920 main difference.
00:22:09.980 But the future
00:22:13.040 that you can
00:22:13.540 imagine where
00:22:14.620 they can change
00:22:15.460 the direction of
00:22:16.600 your neural
00:22:17.220 signals, that's
00:22:20.160 like insanely
00:22:21.120 exciting if we
00:22:22.980 do it right.
00:22:24.840 Yes, I am
00:22:25.600 reprogramming your
00:22:26.600 brain right now,
00:22:27.280 Ian.
00:22:27.860 Can you feel it
00:22:28.780 in real time?
00:22:31.140 So that's kind
00:22:31.900 of exciting.
00:22:33.000 I'm giving you
00:22:33.680 all the good news
00:22:34.580 ahead kind of
00:22:35.320 stories.
00:22:35.640 Question: what
00:22:39.880 would happen if
00:22:40.580 AI learned how
00:22:43.020 to identify fake
00:22:44.220 news?
00:22:46.860 What would
00:22:47.620 happen?
00:22:48.700 Because again, we
00:22:50.040 are a nation that's
00:22:51.220 held together by
00:22:52.160 fake news.
00:22:53.700 Everything that we
00:22:54.720 do is pretty much
00:22:55.920 based on something
00:22:56.680 that wasn't true.
00:22:58.780 Now, sometimes it's
00:22:59.800 good for us, and
00:23:01.160 sometimes it's not,
00:23:02.140 but it's all
00:23:03.080 based on lies.
00:23:04.980 What if AI told
00:23:06.240 you what the lies
00:23:07.100 were as they were
00:23:07.680 happening, and you
00:23:09.200 came to trust its
00:23:10.980 judgment?
00:23:12.240 Well, the first
00:23:13.160 question is, how
00:23:13.960 well can it do?
00:23:15.600 And I would argue
00:23:16.420 that I could already
00:23:17.500 train it with a
00:23:18.380 super prompt.
00:23:19.100 Now, a super prompt
00:23:21.640 is basically asking
00:23:24.160 it a question, but
00:23:25.740 adding a lot of
00:23:26.820 caveats to the
00:23:28.000 question, so you
00:23:29.440 prevent the AI from
00:23:30.900 going into the
00:23:32.060 wrong direction.
00:23:33.440 It's a way to
00:23:34.080 constrain the AI's
00:23:35.440 answer to the most
00:23:36.460 useful answer.
00:23:37.800 So a super prompt
00:23:38.820 could be, for
00:23:39.960 example, hey, AI,
00:23:43.060 the following rules
00:23:44.560 have been known to
00:23:45.600 identify fake news.
00:23:47.500 Not with 100%
00:23:49.260 accuracy.
00:23:50.440 Now, everything I'm
00:23:51.320 saying right now,
00:23:53.000 this entire long
00:23:54.200 form of everything I'm
00:23:55.280 about to say, would
00:23:56.740 be part of the
00:23:57.340 prompt.
00:23:58.560 Because AI doesn't
00:23:59.680 care if the prompt is
00:24:00.740 a page long, because
00:24:02.680 it can handle the
00:24:03.640 complexity.
00:24:04.880 So you can put stuff
00:24:05.920 in there.
00:24:06.280 Hey, AI, I want you
00:24:08.140 to see if you can
00:24:08.720 identify some fake
00:24:09.780 news.
00:24:10.760 And here are the
00:24:11.580 tips for how to do
00:24:13.160 it.
00:24:13.340 Number one, if both
00:24:17.040 sides of the political
00:24:18.360 spectrum report it the
00:24:19.880 same, it's probably
00:24:21.400 true.
00:24:22.460 If one side reports
00:24:23.660 it, but the other
00:24:24.340 side says it's not
00:24:25.320 true, its credibility
00:24:27.160 goes down 75%.
00:24:29.340 So that's just a rule.
00:24:32.000 Now, everything would
00:24:32.740 be based on the odds,
00:24:35.120 not yes, no.
00:24:36.600 So if you see the
00:24:37.660 situation where one
00:24:38.680 side says it's true,
00:24:39.760 the other side says
00:24:40.480 it's not, reduce the
00:24:42.700 credibility by 75%.
00:24:44.480 Okay?
00:24:46.900 Or whatever.
00:24:47.900 Right?
00:24:48.120 I'm making up the
00:24:48.680 75%.
00:24:49.460 Next rule would be if
00:24:52.920 it's too on the nose.
00:24:54.920 In other words, if it
00:24:56.140 fits an existing
00:24:57.460 narrative too
00:24:59.260 cleanly, assume that
00:25:01.400 it's probably artificial.
00:25:03.860 Now, I don't know how
00:25:04.600 you'd get it to
00:25:05.720 understand what a
00:25:06.480 narrative is or whether
00:25:07.780 something fit it.
00:25:09.280 But that's what the AI
00:25:10.300 is supposed to do, I
00:25:11.060 guess.
00:25:11.320 How about this?
00:25:13.660 As part of your
00:25:14.220 question, if you see
00:25:16.260 that they're trying to
00:25:17.040 prove a negative, that's
00:25:19.660 a sign.
00:25:20.960 There were
00:25:23.120 no election irregularities
00:25:25.180 because we didn't find
00:25:26.380 any.
00:25:28.100 If the story is
00:25:29.740 presented to you as
00:25:30.980 proving a negative, we
00:25:32.660 proved it doesn't exist by
00:25:34.260 not looking in the right
00:25:35.380 places, then your
00:25:37.020 credibility should be
00:25:38.020 lowered.
00:25:38.240 How about if you
00:25:41.700 recognize the players
00:25:43.000 from past lies?
00:25:45.700 For example, if
00:25:47.140 there's a new topic
00:25:48.080 tomorrow and the name
00:25:49.820 that you see the most is
00:25:50.920 Adam Schiff, John
00:25:52.940 Brennan, and Clapper,
00:25:56.180 James Clapper, if you
00:25:58.120 saw those three people on
00:25:59.540 the same side of a story,
00:26:01.740 you should discount its
00:26:03.740 credibility based on
00:26:05.160 experience.
00:26:05.620 If the story comes from
00:26:08.420 Maggie Haberman, you
00:26:11.120 should put a different
00:26:12.100 odds on the credibility
00:26:13.200 of that story than maybe
00:26:15.320 a story that comes from
00:26:16.360 Glenn Greenwald, if you
00:26:18.120 know what I mean.
00:26:19.520 If you catch my drift,
00:26:22.560 you know, you're going to
00:26:23.060 give Matt Taibbi a
00:26:25.020 little bit higher
00:26:25.880 credibility than, you
00:26:28.040 know, Stephen
00:26:29.960 Collinson and CNN.
00:26:31.200 They're not equals.
00:26:33.560 Some are known
00:26:34.360 propagandists, and some
00:26:36.080 are known independent
00:26:37.120 journalists who are
00:26:37.800 actually just trying to
00:26:38.540 get the story.
00:26:39.960 So you could tell AI
00:26:41.240 which people to trust
00:26:42.380 based on experience.
00:26:44.700 You could say if
00:26:45.800 there are politicians,
00:26:47.840 don't believe anything.
00:26:52.240 So he says I'm being
00:26:53.440 anti-Semitic.
00:26:54.280 Okay.
00:26:58.040 Here's another one.
00:26:59.500 If the claim is
00:27:01.160 outrageous to the
00:27:02.220 point of absurdity,
00:27:03.660 it's probably not
00:27:04.820 true.
00:27:06.040 Because in the real
00:27:06.720 world, that's always
00:27:07.520 the case.
00:27:10.460 Now, sometimes things
00:27:11.880 are absurd, but it's
00:27:13.880 so rare that the
00:27:15.220 credibility of a story
00:27:16.360 should be lowered if
00:27:17.920 the claim is
00:27:19.020 amazingly absurd.
00:27:21.160 Oh, I think an alien
00:27:23.340 came and ate
00:27:25.720 Russia.
00:27:26.920 Right?
00:27:27.120 If you heard that,
00:27:28.960 and you'd say, I
00:27:29.880 feel like I would have
00:27:30.740 heard if Russia
00:27:31.820 disappeared.
00:27:33.800 So that's like a
00:27:34.600 kind of a crazy
00:27:35.580 claim.
00:27:36.640 So could AI
00:27:37.840 understand what is
00:27:38.780 absurd?
00:27:39.880 Do you think AI
00:27:40.680 could understand an
00:27:42.020 unlikely story from a
00:27:43.580 likely story based on
00:27:45.500 just knowing what is
00:27:46.320 average and normal?
00:27:47.860 Maybe.
00:27:49.140 I think it could.
00:27:49.800 How about if it's
00:27:53.720 still the fog of war?
00:27:56.720 Suppose it's early in
00:27:58.140 a complicated story.
00:28:00.200 Should the AI then
00:28:01.600 discount the story as
00:28:03.180 too early to know?
00:28:04.500 Of course it should.
00:28:05.840 I do.
00:28:06.660 That's what I tell you
00:28:07.900 every time we have a
00:28:08.600 new complicated story.
00:28:10.060 I always say the same
00:28:10.960 thing.
00:28:11.980 Everything that's being
00:28:12.880 reported is in doubt.
00:28:16.480 Just wait for it to
00:28:17.320 settle.
00:28:18.260 So AI could learn that.
00:28:19.800 Do you think that the
00:28:22.900 AI could identify
00:28:24.040 something that fits a
00:28:26.060 narrative?
00:28:27.720 Could it identify a
00:28:29.020 narrative and then
00:28:29.780 get a fact that fits it
00:28:32.120 too closely?
00:28:33.180 Do you think it would
00:28:34.000 be able to do that?
00:28:35.700 I think it could.
00:28:37.500 Maybe.
00:28:38.620 All right.
00:28:39.040 So what would happen
00:28:40.160 if you got to the
00:28:40.880 point where somebody
00:28:42.540 built an AI fact
00:28:43.920 checker or they built a
00:28:46.060 super prompt that's
00:28:47.380 basically embedded with
00:28:49.240 all the little tips I
00:28:50.200 just gave you and
00:28:51.240 more?
00:28:51.500 So you'd add some more
00:28:52.940 tips for spotting fake
00:28:54.160 news.
00:28:55.500 And then the AI
00:28:57.340 says I will rate this
00:28:59.120 news story as 30%
00:29:02.060 credible.
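A minimal sketch of what a super prompt like that could look like in code, with the heuristics above embedded; the 75% figure, the rule wording, and the model name are illustrative placeholders, not a working fact checker:

```python
# Hypothetical sketch: embed the fake-news heuristics described above in a system
# prompt and ask a chat model for a credibility rating. Assumes the OpenAI Python client.
from openai import OpenAI

client = OpenAI()

HEURISTICS = """You rate the credibility of news stories. Start at 100% and apply
these rules, then report a final percentage with a one-line reason per rule applied:
1. If both political sides report the story the same way, it is probably true.
2. If one side reports it and the other disputes it, reduce credibility by 75%.
3. If the story fits an existing narrative too cleanly, reduce credibility.
4. If the story claims to prove a negative, reduce credibility.
5. Discount sources and named figures associated with past false stories.
6. If the claim is outrageous to the point of absurdity, reduce credibility sharply.
7. If the story is early-stage fog-of-war reporting, cap credibility at 50%."""

def rate_story(story: str) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": HEURISTICS},
            {"role": "user", "content": f"Rate this story:\n\n{story}"},
        ],
    )
    return resp.choices[0].message.content

# Example call: rate_story("Anonymous officials say...")  # e.g. "30% credible, because..."
```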
00:29:04.500 What would happen to
00:29:05.720 AI if AI was known to
00:29:08.480 actually be really,
00:29:09.800 really good at spotting
00:29:11.160 fake news?
00:29:11.860 It would be illegal.
00:29:16.940 It would have to be
00:29:18.160 illegal.
00:29:19.280 Because again, we do
00:29:21.120 not have a system that
00:29:22.160 can survive real news.
00:29:25.100 We can't survive the
00:29:26.400 truth.
00:29:27.860 You know, maybe if we
00:29:28.840 could invent some new
00:29:29.940 system, there might be a
00:29:31.800 different kind of system
00:29:32.700 that can survive accurate
00:29:34.660 news.
00:29:35.300 But we don't have one.
00:29:36.320 It would be hard for the
00:29:41.840 AI to spot missing
00:29:42.980 context.
00:29:44.700 Would it?
00:29:46.900 Would it be hard for the
00:29:48.020 AI to spot missing
00:29:49.220 context?
00:29:52.780 I don't know.
00:29:54.960 Maybe hard, but not
00:29:56.240 impossible.
00:29:59.060 You know, because it's
00:29:59.980 hard for people to do it.
00:30:01.860 Right?
00:30:03.140 How often have I told
00:30:04.440 you, I've got one of
00:30:06.680 these dogs not barking
00:30:07.860 stories, and when you
00:30:09.020 hear it, you say, oh,
00:30:10.420 shoot, that's true.
00:30:12.240 If this story were true,
00:30:14.420 why would we not be
00:30:15.740 hearing about this other
00:30:16.660 thing?
00:30:17.140 Why would there be total
00:30:18.140 silence on this other
00:30:19.280 area?
00:30:20.180 I feel like AI could
00:30:21.240 sometimes get it.
00:30:22.760 But given that one of
00:30:24.460 the things that makes
00:30:25.320 people come to watch
00:30:27.060 this show is that I can
00:30:29.100 do it more often than
00:30:30.080 other people.
00:30:30.640 Meaning I can spot the
00:30:33.020 missing part more
00:30:34.520 reliably than some other
00:30:35.980 people paying attention
00:30:36.880 just because I'm looking
00:30:38.120 for it.
00:30:38.780 I'm actively looking for
00:30:40.020 the missing parts.
00:30:41.720 But maybe AI can learn
00:30:42.900 that.
00:30:45.240 All right.
00:30:47.380 In case you think that
00:30:48.880 the news is a brand
00:30:50.920 new, wonderful spectacle
00:30:52.520 every day, as opposed
00:30:54.580 to the groundhog situation
00:30:56.480 that it really is, where
00:30:58.220 the news just repeats
00:30:59.360 over and over again.
00:31:01.040 Sometimes the dates and
00:31:02.040 the names change, but
00:31:03.020 it's the same news.
00:31:05.320 Would you like to hear
00:31:06.240 some news that you're
00:31:07.140 pretty sure you already
00:31:07.900 heard before and you
00:31:10.180 will hear it again?
00:31:11.360 Just the names and the
00:31:12.340 details will change.
00:31:14.280 All right.
00:31:14.600 Here it comes.
00:31:15.920 According to Kari
00:31:16.680 Lake, they found
00:31:17.880 definite, strong evidence
00:31:19.740 of irregularity in the
00:31:21.740 Maricopa voting.
00:31:23.200 Does that sound like any
00:31:26.100 story you've ever heard
00:31:27.500 before?
00:31:31.560 It's Kraken version 56.
00:31:35.180 There's the Kraken again.
00:31:37.100 Okay.
00:31:37.440 No, that's not the Kraken.
00:31:38.940 But wait till you see
00:31:39.820 what we got next.
00:31:41.400 I'll agree those other
00:31:42.580 Krakens were not really
00:31:43.840 the Kraken that we hoped.
00:31:45.440 But man, we're Kraken now
00:31:46.780 with our Krakens.
00:31:47.800 Our Krakens are so
00:31:48.760 Kraken.
00:31:49.100 They're just crispy
00:31:51.520 Kraken.
00:31:54.280 Well, I'll say the same
00:31:57.220 thing that I say every
00:31:58.280 time this story comes up
00:31:59.680 because you know it won't
00:32:00.840 be the last time either.
00:32:03.280 The evidence is, let's say,
00:32:05.720 captured in documents and
00:32:07.440 captured on video and is
00:32:09.980 unambiguously supportive of
00:32:12.680 the claim.
00:32:13.940 It's kind of complicated,
00:32:15.220 but it has to do with when
00:32:16.240 some of the county machines
00:32:17.880 were tested.
00:32:19.100 And there's some
00:32:20.480 irregularity in the
00:32:21.620 timeline and some things
00:32:23.460 clearly were not
00:32:24.620 according to process.
00:32:26.540 That's the claim.
00:32:28.280 Now, what do you think
00:32:29.740 will happen when all the
00:32:31.540 debunkers and the people
00:32:33.160 who know how stuff work
00:32:34.320 weigh in?
00:32:36.040 Do you think the officials
00:32:37.600 who manage the elections
00:32:39.340 are going to look at this
00:32:40.360 evidence and say,
00:32:41.640 whoa, good point.
00:32:44.320 We never would have
00:32:45.200 caught this.
00:32:46.120 Thank you, Kari Lake,
00:32:47.260 for bringing this to our
00:32:48.100 attention, but it does
00:32:49.560 appear that some bad
00:32:50.640 actors had reprogrammed
00:32:52.740 some of the counting
00:32:53.660 machines in an
00:32:54.620 inappropriate way that
00:32:55.620 could have changed the
00:32:57.460 results.
00:32:58.740 Do you think that's going
00:32:59.420 to happen?
00:33:00.420 I don't think so.
00:33:02.880 I don't think so.
00:33:04.480 I think it's going to be
00:33:05.380 like every other time.
00:33:06.800 You will see a fact check or
00:33:10.200 set of fact checks that you
00:33:12.060 don't believe, but they
00:33:14.540 will be the official fact
00:33:15.660 checks.
00:33:16.760 And the official fact
00:33:17.800 checks will say, well, it
00:33:20.540 looks like it's sketchy, but
00:33:21.900 that's because you don't
00:33:22.900 understand that the process
00:33:24.960 normally does this.
00:33:26.980 And yes, you might think that
00:33:28.700 this part of the process is
00:33:30.600 suboptimal, but if you check
00:33:32.380 the statutes, you'll see that's
00:33:34.140 allowed.
00:33:35.480 Oh, yes, this does cause
00:33:36.860 some confusion, and
00:33:38.220 possibly this would be an
00:33:39.260 area that somebody could have
00:33:40.820 cheated, but we see no
00:33:42.400 evidence that they did
00:33:43.620 cheat, only evidence that it
00:33:45.900 was possible.
00:33:48.280 Right?
00:33:48.860 You know the whole story.
00:33:50.300 And then if something ever
00:33:51.680 gets to court, you know how
00:33:53.660 that ends.
00:33:54.960 Oh, yes, you made your point,
00:33:57.000 but this court does not judge
00:33:58.780 that point, or we don't have
00:34:01.500 jurisdiction, or you don't
00:34:03.700 have standing.
00:34:05.540 Yeah, the claim might be true,
00:34:07.380 we're not going to even judge
00:34:08.300 it, because you don't have
00:34:09.540 standing.
00:34:11.120 Right?
00:34:12.040 You know exactly where this is
00:34:13.420 going.
00:34:14.080 It's a big, complicated claim
00:34:16.100 with lots of documentation and
00:34:19.400 video that seem to support the
00:34:23.140 claim.
00:34:24.400 We won't understand the details
00:34:26.020 of it, so people who want to
00:34:28.040 believe it will believe it,
00:34:29.000 people who don't want to
00:34:29.700 believe it will say it's
00:34:30.420 bullshit.
00:34:31.560 Then it will go through this
00:34:32.740 whole checking out phase by
00:34:34.300 the other side, and it will be
00:34:36.040 all confusing.
00:34:37.860 Because I read the story, and
00:34:39.280 it's almost incomprehensible.
00:34:41.880 It's almost incomprehensible.
00:34:43.880 And if Kari Lake would like
00:34:45.460 some advice on communication,
00:34:48.400 you need to get this on one
00:34:49.860 page.
00:34:52.080 She's got like an extended
00:34:53.700 video showing all these
00:34:55.380 documents, and parts of the
00:34:57.240 documents are highlighted, and
00:34:59.120 then there's a timeline with all
00:35:00.520 kinds of shit on it.
00:35:02.440 I have no idea.
00:35:05.780 It's something like this.
00:35:08.260 Something like this.
00:35:09.440 This is not it.
00:35:11.160 But it's something like the
00:35:13.440 machines were tested before the
00:35:15.060 election, but then they were
00:35:17.360 altered after the test.
00:35:19.560 Is that close?
00:35:22.080 Has anybody figured out what the
00:35:23.740 claim is yet?
00:35:24.440 That some machines were tested, but
00:35:27.900 then there is documented and
00:35:29.980 videotaped and, you know,
00:35:31.920 incontrovertible evidence that
00:35:34.140 they were reprogrammed again after
00:35:35.780 the test.
00:35:37.700 Right?
00:35:39.340 They were altered after they were
00:35:41.040 certified.
00:35:42.240 Right.
00:35:43.420 So they were certified and then
00:35:46.440 altered.
00:35:46.820 But the way they were altered is
00:35:50.280 going to turn into this.
00:35:51.380 Well, where's your proof that the
00:35:54.220 altering changed the result?
00:35:56.180 Well, we can't prove that.
00:35:57.460 We can only prove that it could
00:35:58.720 have happened and there was no
00:35:59.940 reason to do what they did.
00:36:01.820 Yeah.
00:36:02.600 Yeah, well, they say there's a
00:36:03.840 reason.
00:36:04.680 Yeah, but there isn't a good
00:36:05.520 reason.
00:36:06.220 Yeah, but, you know, they say
00:36:07.620 there is a reason.
00:36:09.440 But it's not a good reason.
00:36:11.400 But they have a reason.
00:36:13.960 And then they'll say, where's your
00:36:16.060 evidence that they did, in fact,
00:36:17.660 use this opening to change the
00:36:19.780 result?
00:36:20.760 Well, we don't have that, but it
00:36:22.180 was illegal.
00:36:23.540 So you should overturn it because
00:36:25.480 it was illegal.
00:36:27.080 Do you think the court is going to
00:36:28.340 overturn an election that's been
00:36:30.620 certified for two years?
00:36:33.420 Or whatever it is by the time they
00:36:34.840 get around to it?
00:36:36.220 No.
00:36:36.920 No, the courts won't.
00:36:38.520 And I'm not sure this is wrong, by
00:36:40.000 the way.
00:36:40.560 I don't know that this would be a
00:36:42.220 bad move by the courts.
00:36:43.820 Courts like stability.
00:36:45.860 They like the system to be stable.
00:36:47.600 All other things being equal.
00:36:51.280 So the most stability they could
00:36:53.700 add to the system would be to not
00:36:55.540 overturn the election.
00:36:57.420 So I think the courts would find a
00:36:59.340 reason to not overturn the
00:37:00.820 election.
00:37:02.360 Worst case scenario, or the most
00:37:04.160 aggressive thing they might do,
00:37:06.080 is say, you know, you ought to look
00:37:07.540 into changing that.
00:37:09.560 You've got a process there that
00:37:11.080 maybe you ought to keep out of the
00:37:12.540 courts.
00:37:13.600 Maybe modify that a little bit for
00:37:15.320 next time.
00:37:15.780 That's it.
00:37:17.420 And then do you think they'll
00:37:18.420 modify it for next time?
00:37:20.460 Of course not.
00:37:22.040 Of course not.
00:37:23.360 Why would they?
00:37:26.600 So this is one of those stories
00:37:27.920 where you can pretty much judge from
00:37:29.480 the very first word where it will
00:37:32.000 go.
00:37:33.000 And it has nothing to do with
00:37:34.360 whether the claim is valid or
00:37:36.040 invalid.
00:37:36.660 That I don't know.
00:37:38.040 But regardless of whether the claim
00:37:39.600 is valid or invalid, it will go down
00:37:43.580 the invalid pathway, because that's
00:37:45.660 the only one there is.
00:37:47.060 There's not a pathway for a valid
00:37:48.840 claim to go anywhere.
00:37:50.780 Is there?
00:37:52.220 Except to conspiracy theory books
00:37:54.420 that some people believe and other
00:37:55.840 people don't.
00:37:57.860 So nothing, I think nothing will be
00:37:59.780 done.
00:38:01.340 But they might have found, they
00:38:02.920 might have found the footprints of
00:38:04.200 the Kraken.
00:38:05.900 But the footprints don't prove the
00:38:07.520 Kraken.
00:38:07.840 They just might have some footprints.
00:38:11.880 All right.
00:38:14.760 NVIDIA has a new technology that
00:38:17.840 lets gamers talk to the NPCs in
00:38:20.880 your game.
00:38:22.540 Now, you could probably do that for a
00:38:24.480 while now.
00:38:25.460 But now the NPCs will have full AI
00:38:28.720 kind of conversational talents.
00:38:32.160 So they can now make you an NPC, a
00:38:35.520 non-player character, in a game that
00:38:38.380 will look and act like a person.
00:38:41.960 But you don't live in a simulation, do
00:38:44.260 you?
00:38:45.200 No, this isn't a simulation.
00:38:47.180 Of course not.
00:38:50.460 How much longer can we watch our
00:38:52.900 technology build simulations before
00:38:56.480 you realize that you're in one?
00:38:58.660 How much further can we push this?
00:39:00.540 Has anybody flipped over to believing
00:39:06.700 that we're a simulation because of AI's
00:39:10.260 success lately, especially the deep
00:39:13.240 fakes?
00:39:14.160 Is there anybody who changed their mind
00:39:15.520 recently?
00:39:17.100 I see one yes.
00:39:20.460 Mostly no's.
00:39:23.180 All right.
00:39:24.460 So here's my speculation.
00:39:27.320 My speculation is that we will someday
00:39:30.740 come to the belief that we are a
00:39:32.640 simulation.
00:39:33.680 In other words, I believe that will be a
00:39:35.480 common understanding at some point.
00:39:40.140 But to get there, we will have to get
00:39:43.520 closer and closer to creating our own
00:39:45.760 simulation with characters who believe
00:39:48.760 they're real, that we can observe and
00:39:51.300 then watch them acting just the way we do.
00:39:53.980 You'd be looking at them and you'd say,
00:39:55.220 wait a minute, they just formed a
00:39:57.720 religion.
00:39:59.400 You're going to watch them do everything
00:40:00.720 we would do.
00:40:01.880 And then you will know you are a
00:40:04.240 simulation.
00:40:06.260 Admit you believe in God.
00:40:07.760 Well, I believe that we are created by a
00:40:10.100 higher power, but probably somebody who's
00:40:13.720 not that different from us.
00:40:15.360 And I also believe that humans have now
00:40:19.220 achieved God-like powers.
00:40:20.740 That's right.
00:40:25.420 Humans now have God powers.
00:40:27.960 Because what is a God power?
00:40:29.600 God power would be to create a universe,
00:40:31.760 right?
00:40:32.380 Well, we can do that now.
00:40:34.000 We can do that.
00:40:34.860 You can create a simulated universe.
00:40:37.360 God power is only God can make a tree.
00:40:40.980 Right?
00:40:41.940 Well, we can do that.
00:40:43.380 I can make a tree that the characters in
00:40:45.340 my simulation believe is a tree.
00:40:46.880 And they would have no evidence otherwise.
00:40:50.640 So we have now achieved God-like power.
00:40:53.960 However, here's the irony.
00:40:55.820 It doesn't apply to our own existence.
00:40:58.780 It's a God-like power that we can only
00:41:00.900 apply to the world that we create for
00:41:03.380 that purpose.
00:41:04.300 But we can't use those powers in our own
00:41:07.340 world.
00:41:09.820 Unless the affirmations are that power.
00:41:13.320 Sometimes I think we can.
00:41:14.580 All right.
00:41:16.880 We have an insider report on Trump's opinion
00:41:24.660 of Tim Scott.
00:41:26.240 And there's a quote that could be fake news
00:41:29.400 because it comes from an anonymous source.
00:41:31.620 But it sounds true.
00:41:34.160 And the quote was, allegedly, from Trump
00:41:38.180 about Tim Scott.
00:41:40.060 Quote, I like him.
00:41:41.380 We're just going to say nice things about
00:41:43.300 Tim.
00:41:43.560 Now, let's see if there's a parallel here.
00:41:50.140 So Ron DeSantis was pro-Trump,
00:41:53.120 but then decided to run against him.
00:41:56.080 And then Trump called him disloyal.
00:41:59.840 Is that correct so far?
00:42:02.440 Then Tim Scott, who has worked productively
00:42:05.340 with Trump, enough so that Trump still likes him,
00:42:07.840 decides to run against Trump.
00:42:11.680 And Trump does not call him disloyal.
00:42:14.380 Huh.
00:42:15.480 Huh.
00:42:17.700 Foreshadowing.
00:42:18.900 Follow the foreshadowing.
00:42:21.540 Let's see.
00:42:22.300 One person is disloyal.
00:42:24.800 And the person who did the same kind of thing,
00:42:28.060 not disloyal at all.
00:42:29.240 Nice guy.
00:42:29.760 They like him.
00:42:30.180 I feel like your next vice president is lining up.
00:42:37.260 At the very least,
00:42:39.520 Trump has him on his shortest of the short list.
00:42:42.560 Would you agree?
00:42:44.720 At the very least,
00:42:46.180 Tim Scott is on the shortest of the short list.
00:42:48.800 He's top three, no question about it.
00:42:52.520 And then, who's the other gentleman?
00:42:58.180 Donaldson? No, the other Trump supporter, Byron Donaldson, right?
00:43:09.840 Byron Donalds, I'm sorry. Byron Donalds. That is the correct name.
00:43:16.700 And he's been getting a lot of attention
00:43:18.180 just by being good at his job.
00:43:20.880 Meaning he's communicating really well.
00:43:23.640 And he's presenting his side,
00:43:26.160 his team's side of the argument,
00:43:27.680 in arguments that people are finding compatible with their own thinking, let's say.
00:43:35.040 So,
00:43:35.880 he's a really strong
00:43:37.400 personality.
00:43:40.960 But I don't know that he's as seasoned
00:43:42.860 or as credible as Tim Scott.
00:43:45.420 Because, you know,
00:43:46.400 you like a senator better than a congressperson, right?
00:43:50.900 Yeah,
00:43:51.360 and you don't want to have too many Donalds on your ticket.
00:43:56.360 I mean,
00:43:57.140 it bothers me enough that there might be a Scott on the ticket.
00:44:00.300 That's going to bug me enough.
00:44:03.040 Yep,
00:44:03.480 Tim's a bachelor,
00:44:04.420 but we don't care.
00:44:05.180 Which is actually another advantage.
00:44:10.580 I would love,
00:44:12.340 well,
00:44:12.620 I hate to say this,
00:44:14.040 but,
00:44:14.980 yeah,
00:44:15.280 let's leave Tim Scott's personal life alone.
00:44:18.380 What do you think?
00:44:20.140 Do you think that's a,
00:44:21.320 do you think that's something we should do?
00:44:24.160 Just leave Tim Scott's personal life alone.
00:44:27.680 Let it just be whatever it is.
00:44:29.700 Whatever it is,
00:44:30.300 because conservatives can't be consistent with their own philosophies
00:44:37.060 if they get too interested in that,
00:44:39.580 right?
00:44:40.180 If you get too interested,
00:44:42.020 you're not being consistent
00:44:43.220 with your own principles.
00:44:45.300 You should be aggressively uninterested
00:44:47.660 in that question.
00:44:48.900 Aggressively uninterested.
00:44:51.220 And let's see if we can maintain that.
00:44:53.740 I feel like that would be the
00:44:54.920 consistent way to be a conservative,
00:44:57.960 to be aggressively uninterested
00:44:59.700 in somebody who's minding their own business
00:45:04.200 and being a good patriot.
00:45:06.620 What else do you want?
00:45:08.640 Doesn't bother you.
00:45:10.180 Good patriot.
00:45:11.760 We're done here.
00:45:14.980 All right.
00:45:16.360 And by the way, I'm not making any assumptions
00:45:19.600 about Tim Scott's private life.
00:45:22.600 That's part of the deal too,
00:45:23.920 is that I don't even have to make an assumption.
00:45:26.400 I can just say none of my business
00:45:27.640 and get on with it.
00:45:29.700 All right.
00:45:32.660 Did you see it? There was a video from Dylan Mulvaney
00:45:36.440 talking about informing his parents
00:45:41.300 that he might be interested in women.
00:45:44.380 And then Dylan talked about the transition:
00:45:52.760 I believe Dylan claimed to be gay,
00:45:55.000 and then claimed to be queer,
00:45:56.840 then claimed to be non-binary,
00:45:58.680 and then claimed to be trans,
00:46:00.720 and now is questioning whether
00:46:02.860 she likes women.
00:46:09.660 So,
00:46:10.600 does it feel to you
00:46:13.980 that this is more about theater
00:46:16.500 and less about
00:46:17.700 somebody's personal decisions?
00:46:20.820 It's just theater.
00:46:22.740 Yeah.
00:46:24.940 I don't know how we ever came
00:46:26.540 to care so much.
00:46:28.040 Well,
00:46:28.280 I guess I do know.
00:46:29.720 None of this was about Dylan Mulvaney, right?
00:46:33.640 Let's be honest.
00:46:34.920 When I got canceled,
00:46:36.720 it wasn't really about me at all.
00:46:38.560 I was just a vehicle
00:46:39.800 that people could put their feelings in,
00:46:42.100 and,
00:46:42.600 you know,
00:46:43.000 it'd be like a little truck
00:46:43.940 that could take us somewhere.
00:46:44.800 I think Dylan Mulvaney
00:46:46.520 is the same thing.
00:46:47.640 Literally,
00:46:48.220 nobody cares about
00:46:49.180 Dylan's private life.
00:46:51.700 Nobody.
00:46:52.960 But Dylan is,
00:46:53.880 like me,
00:46:54.500 a little truck
00:46:55.180 that you can put your
00:46:56.140 political opinions in
00:46:57.820 and take it somewhere.
00:47:00.160 So, you said you thought Dylan was gay.
00:47:05.460 Well,
00:47:05.620 that's what Dylan says.
00:47:07.480 Dylan thought Dylan was gay,
00:47:09.440 and then thought Dylan was queer,
00:47:11.880 and then non-binary,
00:47:12.680 and now Dylan thinks
00:47:14.280 Dylan is trans,
00:47:16.100 but maybe a trans lesbian.
00:47:18.740 I don't know.
00:47:19.180 I'm not sure how that works.
00:47:23.000 But, in case you're wondering, in the Dilbert Reborn comic,
00:47:29.040 which you can only see if you're subscribing
00:47:31.120 on either the Locals platform, where you get lots more than Dilbert,
00:47:35.580 or on the Twitter platform, where you can see just the Dilbert Reborn comic
00:47:40.340 (I've got a subscribe button in my profile),
00:47:45.100 I'll tell you about an upcoming week.
00:47:49.740 Dilbert's company
00:47:50.400 is going to hire
00:47:51.480 a new head of marketing who is a woman,
00:47:59.000 and she's in charge
00:48:00.120 of their marketing
00:48:01.360 for the Power Tools Division.
00:48:03.680 So,
00:48:04.180 they're going to get
00:48:04.600 a woman vice president
00:48:05.700 in charge of the
00:48:06.240 Power Tools Division,
00:48:07.100 and her first
00:48:09.460 suggestion
00:48:10.520 will be to hire
00:48:11.320 Dylan Mulvaney
00:48:12.260 as an influencer
00:48:13.960 for their Power Tools.
00:48:17.880 So,
00:48:18.400 that's what you're missing
00:48:19.100 if you don't subscribe.
00:48:23.200 It goes great,
00:48:24.340 by the way.
00:48:25.200 It just goes great.
00:48:26.120 All right.
00:48:33.100 Yes,
00:48:33.620 we don't need to make
00:48:34.440 sex tool
00:48:39.220 accusations
00:48:40.760 just because
00:48:41.400 it's Power Tools.
00:48:44.660 All right.
00:48:45.300 Now that,
00:48:47.300 ladies and gentlemen,
00:48:49.020 brings me to the end
00:48:49.980 of my prepared remarks
00:48:51.120 on this Memorial Day.
00:48:54.260 Is there any
00:48:55.040 story that I missed
00:48:56.180 that you need to
00:48:56.960 get my valuable
00:48:58.700 opinion on?
00:49:01.420 All right.
00:49:01.680 It took about
00:49:02.180 two minutes
00:49:02.780 for somebody
00:49:03.240 to mention
00:49:03.660 Snap-On Tools.
00:49:05.760 That took longer
00:49:06.980 than I thought.
00:49:12.020 All right.
00:49:12.720 UAP,
00:49:17.640 the UAPs.
00:49:18.840 You know,
00:49:19.260 I saw Elon Musk
00:49:20.760 on some interview
00:49:22.040 he did recently
00:49:22.700 where it appears
00:49:24.360 he's become
00:49:25.140 a believer
00:49:27.000 in the
00:49:27.740 pre-Ice Age
00:49:28.940 advanced
00:49:29.760 civilization
00:49:30.540 concept.
00:49:32.160 Have you all
00:49:32.680 been watching that?
00:49:33.820 It's mostly
00:49:34.360 a YouTube thing.
00:49:35.900 There are a whole
00:49:36.260 bunch of videos,
00:49:37.300 people taking different positions on it.
00:49:41.840 And I guess Elon Musk gave the one example
00:49:44.680 that's in some of these videos,
00:49:47.560 which I think was that the skill of writing popped up
00:49:51.200 in different civilizations all over the world at the same time.
00:49:55.580 And it couldn't
00:49:56.420 have been a
00:49:56.760 coincidence
00:49:57.060 because they
00:49:57.680 didn't have
00:49:58.000 any connection.
00:50:00.360 And how could
00:50:01.040 humans be evolving
00:50:02.080 for, you know,
00:50:03.080 hundreds of
00:50:03.680 thousands of years
00:50:04.400 and then, at just about the same time,
00:50:06.000 develop writing skills in different places?
00:50:10.060 Now the
00:50:11.240 suspicion is
00:50:12.220 that there
00:50:13.400 was some
00:50:13.740 advanced
00:50:14.300 race that
00:50:15.960 was on
00:50:16.320 Earth and
00:50:16.860 got wiped
00:50:17.260 out by the
00:50:18.080 Ice Age
00:50:18.660 changes,
00:50:19.780 and maybe
00:50:21.360 there were
00:50:21.680 some survivors
00:50:22.460 or some
00:50:23.500 missionaries
00:50:24.100 who, you
00:50:25.000 know, took
00:50:25.420 their knowledge
00:50:26.080 to other
00:50:26.440 places or
00:50:27.040 whatever.
00:50:28.040 Maybe.
00:50:29.960 Yeah, Graham
00:50:30.540 Hancock is on
00:50:31.340 that.
00:50:31.960 So part of
00:50:32.440 that belief is
00:50:33.060 that the
00:50:33.440 Sphinx was
00:50:34.960 not built by
00:50:35.600 the Egyptians
00:50:36.280 or anybody
00:50:37.060 that they
00:50:37.540 could remember.
00:50:39.040 But they
00:50:39.380 sort of claimed
00:50:40.040 it because
00:50:40.820 it was on
00:50:41.200 their territory
00:50:41.840 and that it might have been built by some advanced civilization.
00:50:46.500 Now the
00:50:46.780 pyramids
00:50:47.140 themselves, I
00:50:47.780 think, are
00:50:48.320 a little sketchy,
00:50:49.120 aren't they?
00:50:50.840 Because it
00:50:52.460 almost looks
00:50:53.140 like the
00:50:53.520 pyramids were
00:50:54.120 built for
00:50:54.600 one purpose
00:50:55.300 and then the
00:50:57.100 kings used
00:50:58.300 them for
00:50:58.620 another purpose
00:50:59.260 which is they
00:51:00.160 buried themselves
00:51:00.860 inside them.
00:51:01.960 It doesn't
00:51:02.380 even look like
00:51:02.900 they were built
00:51:03.480 for the purpose
00:51:04.520 of burial tombs.
00:51:05.460 It was like
00:51:06.360 they had
00:51:06.580 some bigger
00:51:07.040 purpose.
00:51:08.560 So we're
00:51:09.320 seeing all
00:51:09.720 these hints
00:51:10.220 that maybe
00:51:11.000 Atlantis was
00:51:11.800 real, maybe
00:51:13.000 there was more
00:51:13.480 than one
00:51:13.820 Atlantis, maybe
00:51:15.260 they had some
00:51:15.700 advanced
00:51:16.040 civilization.
00:51:17.520 Now connect
00:51:18.140 that to the
00:51:19.080 UFO sightings
00:51:20.600 we've had.
00:51:22.040 And the one
00:51:22.540 thing that we
00:51:22.960 hear more than
00:51:23.480 other things is
00:51:24.220 that they must be drones, because the way they move,
00:51:28.360 no human could survive the G-forces, nor any organic entity.
00:51:32.780 So they think
00:51:33.700 it must be
00:51:34.160 all mechanical
00:51:34.880 because it
00:51:36.040 moves too
00:51:36.440 fast.
00:51:37.420 And some
00:51:38.040 of them
00:51:38.340 will swim
00:51:39.400 underwater,
00:51:40.860 they say.
00:51:43.080 So what
00:51:44.780 do you think
00:51:45.100 is more
00:51:45.420 likely?
00:51:46.720 That there are some leftover drones from an advanced civilization
00:51:50.500 that might be just on autopilot?
00:51:53.320 They might be servicing and maintaining themselves in
00:51:57.100 some underground
00:51:58.340 factory that's
00:51:59.940 just operating
00:52:01.780 since the
00:52:02.500 people who
00:52:03.220 built it
00:52:03.640 all died.
00:52:04.820 It just
00:52:05.180 keeps going.
00:52:06.360 And it's
00:52:06.960 just doing
00:52:07.380 its routine.
00:52:09.060 All it's
00:52:09.500 doing is
00:52:09.940 sending out
00:52:10.640 some drones
00:52:11.620 to look at
00:52:12.140 stuff and
00:52:12.560 coming back.
00:52:13.680 It doesn't
00:52:14.120 have any
00:52:14.400 purpose,
00:52:15.300 but it
00:52:15.620 doesn't know
00:52:16.040 it has no
00:52:16.480 purpose.
00:52:19.200 That would
00:52:19.760 explain everything,
00:52:20.620 wouldn't it?
00:52:21.700 It would explain
00:52:22.400 all of the
00:52:23.260 Atlantis
00:52:24.320 missing civilization,
00:52:25.460 and it would
00:52:26.740 explain why
00:52:27.320 we have
00:52:27.720 UFOs and
00:52:29.300 yet no
00:52:29.960 visits from
00:52:30.680 any creatures.
00:52:33.220 How could
00:52:33.580 we have so
00:52:34.020 many UFOs
00:52:35.000 but nobody
00:52:35.680 has tried to
00:52:36.060 make contact?
00:52:37.460 And the
00:52:37.780 most logical
00:52:38.620 explanation is
00:52:39.780 there's nobody
00:52:41.560 to contact.
00:52:44.600 Isn't it?
00:52:46.040 If you have
00:52:46.780 to look at
00:52:47.100 all the
00:52:47.360 possibilities,
00:52:48.960 let's say you
00:52:49.700 accepted, I
00:52:50.420 don't accept
00:52:50.960 this as true,
00:52:51.860 but if you
00:52:52.400 did accept
00:52:52.940 it's true that
00:52:54.120 there are a
00:52:54.440 bunch of
00:52:54.760 UFOs that
00:52:55.440 are legitimately
00:52:56.240 high-tech
00:52:57.640 objects that
00:52:59.620 are moving
00:53:00.020 in a way
00:53:00.420 that an
00:53:01.160 organic
00:53:01.780 creature could
00:53:02.820 not survive.
00:53:06.920 I think the most likely possibility is that there
00:53:10.200 is some
00:53:11.120 kind of
00:53:11.500 leftover
00:53:11.900 technology
00:53:12.660 from an
00:53:13.740 Earth
00:53:14.020 civilization
00:53:14.640 that found
00:53:16.680 a way to
00:53:17.120 keep its
00:53:17.540 drones running
00:53:18.400 and maybe
00:53:19.600 self-maintain
00:53:20.460 them.
00:53:21.280 We might
00:53:21.680 find someday
00:53:22.360 there's an
00:53:22.940 undersea
00:53:23.460 factory where
00:53:24.980 their robots
00:53:25.740 were repairing
00:53:26.380 themselves and
00:53:27.400 they got
00:53:28.740 spare parts
00:53:29.580 and for
00:53:30.600 100,000
00:53:31.280 years they've
00:53:32.880 just been
00:53:33.380 doing their
00:53:33.780 robot thing
00:53:34.360 under there.
00:53:36.580 They don't
00:53:37.200 have any
00:53:37.500 reason to
00:53:38.000 contact us
00:53:38.780 because that
00:53:39.580 was never
00:53:39.960 their programming.
00:53:41.360 They're not
00:53:41.840 designed to
00:53:42.360 contact anybody.
00:53:43.560 They're just
00:53:43.940 designed to
00:53:44.420 go look
00:53:44.740 around and
00:53:45.200 make some
00:53:45.520 videos and
00:53:46.140 record them
00:53:46.760 and then
00:53:47.160 they do
00:53:47.400 it again.
00:53:47.720 That's
00:53:51.640 Horizon Zero
00:53:52.840 Dawn.
00:53:54.040 What is
00:53:54.240 that?
00:53:57.560 Modern
00:53:58.040 Romans have
00:53:58.720 no knowledge
00:53:59.320 of Romans
00:53:59.840 who built
00:54:00.260 the stadium.
00:54:01.520 But the
00:54:02.140 stadium isn't
00:54:02.900 really amazing.
00:54:04.860 I think that
00:54:05.360 was more in
00:54:06.040 line with what
00:54:06.680 they knew how
00:54:07.120 to do in
00:54:07.480 the day.
00:54:09.380 What's a
00:54:09.960 video game?
00:54:11.640 How does
00:54:12.180 that jibe
00:54:12.660 with simulation
00:54:13.400 theory?
00:54:14.740 Well,
00:54:15.260 whatever it
00:54:15.800 is, this
00:54:16.140 could be the
00:54:16.580 simulation.
00:54:17.000 Now, in
00:54:18.000 simulation
00:54:18.520 theory, I
00:54:19.400 say the
00:54:19.840 most provocative
00:54:20.620 thing that
00:54:21.680 nobody else
00:54:22.280 says about
00:54:22.720 the simulation.
00:54:24.480 At least I've
00:54:25.040 never heard it.
00:54:26.360 And to me, it's
00:54:27.480 a way to prove
00:54:28.020 the simulation.
00:54:29.840 It goes like
00:54:30.660 this.
00:54:31.160 If we are a
00:54:31.900 simulation, we
00:54:33.460 don't have a
00:54:34.180 past because we
00:54:37.840 would have been
00:54:38.320 created out of
00:54:39.940 nothing.
00:54:40.600 So the past
00:54:41.520 just didn't
00:54:42.140 exist at the
00:54:43.560 moment of
00:54:43.980 creation.
00:54:45.260 But since in
00:54:46.780 order to feel
00:54:47.400 like we're
00:54:47.800 real characters,
00:54:48.460 we would have
00:54:49.060 to believe we
00:54:49.680 had a past,
00:54:50.700 the simulation
00:54:51.640 would create
00:54:52.560 the past on
00:54:54.180 demand.
00:54:55.680 So my
00:54:56.500 example of
00:54:57.000 that is if
00:54:57.580 you've never
00:54:58.140 dug a hole
00:54:58.740 in your
00:54:59.060 backyard, there's
00:55:00.700 nothing under
00:55:01.260 there.
00:55:02.860 You start
00:55:03.640 digging, and
00:55:04.560 as you dig,
00:55:05.400 the simulation
00:55:05.980 fills in the
00:55:06.700 hole.
00:55:07.480 I mean, it
00:55:08.000 creates the
00:55:08.580 detail that
00:55:09.880 looks like,
00:55:10.380 hey, I
00:55:11.400 found a
00:55:12.020 fossil in
00:55:12.580 here.
00:55:13.360 Yeah, right.
00:55:13.780 So you
00:55:14.040 find a
00:55:14.380 dinosaur
00:55:14.740 bone, and
00:55:16.460 then the
00:55:17.900 simulation has
00:55:18.980 to recreate
00:55:19.540 the dinosaur.
00:55:21.520 But it
00:55:22.060 won't recreate
00:55:22.660 anything until
00:55:23.340 you need it.
00:55:24.580 So the past
00:55:25.480 is an
00:55:26.440 invention of
00:55:27.260 the current
00:55:28.460 times.
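If you want the digging example in code, here's a toy sketch, again my own construction and nobody's real engine: a world that stores nothing until someone looks, then invents consistent detail on first access, the way a game lazily generates terrain.

    # Python sketch: a lazily generated world. Nothing exists at a spot until
    # an observer digs there; the first dig invents the detail and caches it,
    # so the "past" stays consistent on every later look.
    import random

    class LazyWorld:
        def __init__(self, seed: int):
            self.seed = seed
            self.generated: dict[tuple[int, int], str] = {}  # only what's been observed

        def dig(self, x: int, y: int) -> str:
            if (x, y) not in self.generated:
                # Derive the detail deterministically from the seed and location,
                # so repeated observations always agree with each other.
                rng = random.Random(hash((self.seed, x, y)))
                self.generated[(x, y)] = rng.choice(["dirt", "rock", "clay", "fossil"])
            return self.generated[(x, y)]

    world = LazyWorld(seed=42)
    print(world.dig(10, -3))  # this spot's history is created at this moment
    print(world.dig(10, -3))  # ...and is the same forever after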
00:55:29.740 So the
00:55:30.420 question was,
00:55:31.160 how did the
00:55:31.760 theory that
00:55:32.460 there were
00:55:33.040 ancient,
00:55:33.920 advanced
00:55:34.300 civilizations,
00:55:35.500 how does
00:55:35.880 that fit into
00:55:36.460 simulation
00:55:36.980 theory?
00:55:37.960 It fits this
00:55:38.840 way.
00:55:39.960 There were
00:55:40.640 no advanced
00:55:41.400 civilizations.
00:55:43.040 We're
00:55:43.400 creating them
00:55:44.320 now.
00:55:45.540 We are
00:55:46.240 creating them
00:55:47.120 out of our
00:55:47.660 minds, and
00:55:48.540 now they're
00:55:48.900 becoming real.
00:55:50.420 So there
00:55:50.860 will be an
00:55:51.800 Atlantis
00:55:52.320 eventually,
00:55:53.800 but not
00:55:55.160 yet.
00:55:57.220 In other
00:55:57.700 words, there
00:55:58.160 will be an
00:55:59.320 Atlantis in
00:56:00.220 our past,
00:56:01.200 because we're
00:56:01.760 in the process
00:56:02.640 of creating
00:56:03.840 it, but it
00:56:06.060 does not
00:56:06.560 exist in our
00:56:07.280 past yet.
00:56:09.940 In other
00:56:10.320 words, the
00:56:11.020 past will be
00:56:11.840 created in
00:56:12.440 our future,
00:56:14.160 and it's
00:56:14.600 the only
00:56:15.080 way it
00:56:15.480 can work.
00:56:16.660 It can't
00:56:17.180 work any
00:56:17.540 other way.
00:56:18.400 And the
00:56:18.800 reason it
00:56:19.100 can't work
00:56:19.440 any other
00:56:19.800 way is that
00:56:20.380 no simulation
00:56:21.160 could put
00:56:22.740 everything in
00:56:23.660 the universe
00:56:24.160 and hold it
00:56:25.620 in its mind
00:56:26.320 and rotate
00:56:27.300 it and keep
00:56:28.180 it consistent
00:56:28.740 with everything
00:56:29.280 else in the
00:56:29.740 universe,
00:56:30.480 because the
00:56:31.200 computing
00:56:31.680 requirement would
00:56:32.580 be too
00:56:32.880 massive.
00:56:34.140 So instead,
00:56:35.080 what the
00:56:35.740 software does
00:56:36.680 is exactly
00:56:37.360 what a video
00:56:37.960 game does.
00:56:39.120 It creates
00:56:39.640 a little
00:56:40.000 universe,
00:56:40.560 and it
00:56:41.100 doesn't let
00:56:41.560 you get
00:56:41.840 outside of
00:56:42.400 its barriers.
00:56:44.480 So we
00:56:45.880 believe that
00:56:47.780 when we
00:56:48.200 see light
00:56:48.720 from a
00:56:49.100 star,
00:56:50.880 we believe
00:56:52.860 that there's
00:56:53.380 an actual
00:56:53.840 planet or
00:56:55.180 sun up
00:56:55.560 there,
00:56:56.540 but maybe
00:56:57.460 not.
00:56:58.640 Maybe it's
00:56:59.220 just light.
00:57:01.160 And there
00:57:01.860 wouldn't be a
00:57:02.320 planet there
00:57:02.840 unless we
00:57:03.360 built a
00:57:04.020 technology
00:57:04.580 where we
00:57:05.580 could follow
00:57:06.860 that light
00:57:07.340 back to its
00:57:07.940 source,
00:57:09.120 and as
00:57:09.660 we
00:57:09.820 approach
00:57:10.200 the
00:57:10.420 source,
00:57:11.360 the planet
00:57:11.920 or the
00:57:12.440 star would
00:57:13.080 appear in
00:57:14.200 the real
00:57:14.560 world for
00:57:15.700 the first
00:57:16.080 time.
00:57:17.080 It wasn't
00:57:17.800 there until
00:57:18.180 you looked
00:57:18.500 at it.
00:57:21.840 All right.
00:57:23.180 God is
00:57:23.640 outside of
00:57:24.140 time.
00:57:24.840 Yes,
00:57:25.200 that would
00:57:25.540 be similar
00:57:26.160 to a
00:57:26.920 simulation.
00:57:28.480 So if we
00:57:29.180 were to
00:57:29.480 build a
00:57:29.840 simulation,
00:57:31.120 the simulation's sense of time would not be the same
00:57:33.720 as our time, the creator's.
00:57:35.040 We would
00:57:35.720 be outside
00:57:36.340 of their
00:57:37.840 time.
00:57:38.580 In fact,
00:57:39.020 you could
00:57:39.380 fast-forward
00:57:40.060 their time
00:57:40.720 so that they're running at ten times the creator's time,
00:57:44.280 and they
00:57:44.600 wouldn't
00:57:44.780 know the
00:57:45.060 difference.
00:57:46.140 As long
00:57:46.740 as everything
00:57:47.280 got moved
00:57:49.960 forward at
00:57:50.740 the same
00:57:51.120 rate,
00:57:53.140 their car
00:57:53.660 would still
00:57:54.060 look like
00:57:54.440 it's going
00:57:54.860 65 miles
00:57:55.780 an hour.
00:57:56.440 They wouldn't
00:57:57.020 know that
00:57:57.360 it's going
00:57:57.720 1,000 miles
00:57:58.440 an hour.
00:57:59.140 They would
00:57:59.400 have no way
00:57:59.820 to know.
00:58:00.540 Because it's
00:58:00.960 only going
00:58:01.360 1,000 miles
00:58:02.060 an hour
00:58:02.360 outside the
00:58:03.300 simulation.
00:58:05.040 There's
00:58:05.620 no preferred
00:58:07.020 speed.
00:58:08.560 There's
00:58:08.940 only the
00:58:09.360 speed from
00:58:10.060 the perspective
00:58:10.700 of the
00:58:11.320 observer.
00:58:12.980 Take that,
00:58:13.640 Einstein.
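A toy illustration of that last point, my own sketch and nothing more: scale every clock inside a simulation by the same factor, and no measurement made from the inside can detect it, because everything an inside observer can measure is a ratio of internal quantities.

    # Python sketch: inside speeds are internal distance per internal tick,
    # so a uniform fast-forward of the whole simulation is invisible inside.
    def internal_speed(distance_units: float, time_ticks: float) -> float:
        return distance_units / time_ticks

    print(internal_speed(65.0, 1.0))  # the car: 65, as measured from inside

    # The creator runs everything 1000x faster. Distance covered per real
    # second and ticks per real second scale together, so the inside
    # measurement is unchanged.
    scale = 1000.0
    print(internal_speed(65.0 * scale, 1.0 * scale))  # still 65.0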
00:58:17.480 All right.
00:58:19.360 What about
00:58:19.960 the mycelium
00:58:20.720 theory?
00:58:23.380 Well,
00:58:24.260 that is how
00:58:24.960 I will power
00:58:25.760 my mycelium
00:58:26.960 drive in my
00:58:27.680 spaceship.
00:58:28.940 But beyond
00:58:29.360 that, that's
00:58:29.840 all I know.
00:58:32.820 Do we get
00:58:33.640 to know when
00:58:34.240 we're buffering?
00:58:35.920 I don't
00:58:36.560 think we
00:58:36.900 would, because
00:58:37.720 if you
00:58:39.100 imagine that
00:58:39.760 we're
00:58:40.200 artificial, if
00:58:41.880 we all
00:58:42.260 stopped at
00:58:42.800 the same
00:58:43.160 time, because
00:58:44.120 the processor
00:58:44.760 stopped, and
00:58:46.880 then we
00:58:47.240 immediately
00:58:47.620 started at
00:58:48.340 the same
00:58:48.660 time, we
00:58:50.260 wouldn't know
00:58:50.700 that we
00:58:50.960 ever stopped.
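Same kind of toy sketch, also my own: if the host process pauses and resumes, nothing inside records the gap, because the only clock the inhabitants can consult advances only while the simulation is actually running.

    # Python sketch: the simulated clock ticks only when the simulation steps,
    # so a pause in the host leaves no trace inside.
    import time

    sim_ticks = 0

    def step() -> None:
        global sim_ticks
        sim_ticks += 1  # the only clock that exists inside the simulation

    for _ in range(3):
        step()
    time.sleep(1.0)      # the host "buffers" for one real second
    for _ in range(3):
        step()

    print(sim_ticks)     # 6: from the inside, time was perfectly continuous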
00:59:01.600 Was
00:59:02.080 yesterday a
00:59:02.740 simulation?
00:59:05.040 Well, yesterday
00:59:08.320 doesn't have
00:59:09.040 to have
00:59:10.480 been fake,
00:59:12.020 because once
00:59:12.620 you do it,
00:59:13.080 it's real.
00:59:14.720 Ish.
00:59:17.120 Why did the
00:59:17.980 simulation
00:59:18.400 rig the
00:59:18.880 election?
00:59:20.200 Well, all
00:59:21.220 right, here's
00:59:21.660 a real
00:59:22.580 mind-blower.
00:59:23.840 If we're a
00:59:24.560 simulation, which
00:59:25.480 is what I
00:59:25.900 believe, it
00:59:27.200 can be true
00:59:28.040 that the
00:59:28.600 election was
00:59:29.240 not rigged,
00:59:30.080 and also
00:59:31.220 that it
00:59:31.640 was rigged.
00:59:33.920 And it's
00:59:34.800 like a
00:59:35.080 Schrodinger's
00:59:35.620 cat situation.
00:59:36.800 So until you
00:59:37.620 can find
00:59:38.220 evidence that
00:59:39.160 would convince
00:59:39.720 everybody that
00:59:40.460 they're looking
00:59:41.040 at the same
00:59:41.580 thing, here
00:59:43.020 it is.
00:59:44.360 If you can't
00:59:45.260 produce that,
00:59:46.040 then both
00:59:46.600 possibilities exist
00:59:47.800 forever.
00:59:48.160 But if you
00:59:49.420 got a
00:59:49.660 whistleblower who
00:59:50.660 had, you
00:59:51.160 know, photos
00:59:51.960 and videos
00:59:52.680 and documents
00:59:53.720 and recorded
00:59:55.020 phone calls,
00:59:56.100 then suddenly
00:59:57.040 the reality
00:59:57.740 would collapse,
00:59:59.100 and then it
00:59:59.520 would become a
01:00:00.100 rigged election.
01:00:01.280 But until
01:00:01.940 that, it is
01:00:03.500 neither rigged
01:00:04.280 nor non-rigged,
01:00:05.700 because you
01:00:06.140 don't know.
01:00:07.220 It is a
01:00:07.780 Schrodinger's
01:00:08.440 cat situation.
01:00:09.620 And I
01:00:09.960 genuinely see
01:00:10.780 it that way,
01:00:11.280 by the way.
00:59:12.020 That's my actual impression of reality:
00:59:14.520 that that reality has not collapsed.
01:00:16.360 So it's not
01:00:17.620 a case of
01:00:18.160 whether we
01:00:18.540 can find it
01:00:19.360 or not,
01:00:20.640 because it's
01:00:21.620 not there.
01:00:23.200 It's a question
01:00:24.000 of whether we
01:00:24.460 can create it.
01:00:26.140 Can we
01:00:26.620 create a past?
01:00:28.460 Well, actually,
01:00:29.100 this will be a
01:00:29.520 good test to
01:00:30.080 find out if
01:00:30.580 Kari Lake
01:00:31.060 is a player
01:00:31.720 or an NPC.
01:00:33.400 If Kari Lake is an
01:00:34.700 NPC, then she
01:00:37.720 will not be
01:00:38.160 able to create
01:00:38.720 the past,
01:00:39.480 and the
01:00:40.000 election will
01:00:40.540 stand.
01:00:41.560 If she's a
01:00:42.460 player, and
01:00:43.700 she might be,
01:00:45.020 I would
01:00:45.360 definitely not
01:00:45.860 rule that
01:00:46.280 out, if
01:00:47.220 she's a
01:00:47.600 player, she
01:00:49.180 will create,
01:00:51.380 through her
01:00:51.740 own imagination,
01:00:53.440 a reality,
01:00:54.500 which will
01:00:54.940 become all
01:00:55.460 of our
01:00:55.700 actual reality,
01:00:57.340 that it
01:00:57.680 was rigged,
01:00:58.540 and it
01:00:58.860 will be
01:00:59.160 found in
01:01:00.180 ways that
01:01:00.620 you didn't
01:01:00.920 expect.
01:01:02.580 So that's
01:01:03.220 my theory.
01:01:04.100 It is neither
01:01:04.680 rigged nor
01:01:05.340 not rigged.
01:01:06.900 It's a
01:01:08.000 black box.
01:01:08.900 It's, yeah,
01:01:09.480 it's a
01:01:09.860 Schrodinger's
01:01:10.420 cat.
01:01:11.540 And if we
01:01:13.500 never open the
01:01:14.340 box, it
01:01:15.800 will never,
01:01:16.860 the reality
01:01:17.540 will never
01:01:18.160 form into
01:01:19.560 one actual
01:01:20.500 reality.
01:01:21.580 It will
01:01:21.840 always have
01:01:22.200 the potential,
01:01:23.600 but there
01:01:24.180 will never be
01:01:24.700 an actual
01:01:25.280 reality of
01:01:26.360 whether it
01:01:26.700 was rigged
01:01:27.080 or not.
01:01:30.640 Is
01:01:31.080 CWCville
01:01:32.300 real?
01:01:32.800 I don't
01:01:33.020 know what
01:01:33.260 that is.
01:01:33.640 All right.
01:01:39.180 Is Trump
01:01:39.620 a player?
01:01:40.260 Clearly.
01:01:41.400 Yeah.
01:01:42.080 There's one
01:01:42.640 thing you
01:01:42.940 can say for
01:01:43.360 sure, is
01:01:44.600 that if we're a simulation, Trump
01:01:46.080 is a player,
01:01:46.800 because he can
01:01:48.040 change the
01:01:48.560 simulation.
01:01:49.960 He changes
01:01:50.700 what you
01:01:51.360 believe is
01:01:51.920 true.
01:01:53.160 That's as
01:01:54.280 player as
01:01:54.860 you get.
01:01:56.120 I mean, I
01:01:56.700 do the same
01:01:57.180 thing.
01:01:57.860 When I change
01:01:58.660 minds, it's a
01:02:00.640 rare thing.
01:02:01.740 So if somebody
01:02:02.140 can change your
01:02:02.800 mind, they're
01:02:03.220 probably a
01:02:03.620 player.
01:02:09.760 All right.
01:02:13.320 If we
01:02:13.920 create our
01:02:14.380 personal God
01:02:15.000 before we
01:02:15.540 die, do we
01:02:16.060 get one?
01:02:17.280 Well, it
01:02:17.940 depends what
01:02:19.300 dying means in
01:02:20.240 the simulation.
01:02:21.840 So you
01:02:22.600 don't get a
01:02:23.040 personal God
01:02:23.720 if your
01:02:24.700 program just
01:02:25.540 gets turned
01:02:26.000 off.
01:02:28.740 But maybe
01:02:29.440 the program
01:02:29.980 gives you an
01:02:30.820 afterlife.
01:02:32.080 What if the
01:02:32.680 program provides
01:02:33.520 an afterlife?
01:02:35.240 It might.
01:02:36.220 There might be
01:02:36.800 an afterlife.
01:02:40.560 All right.
01:02:43.460 Everyone is
01:02:44.240 influential in
01:02:44.920 some form or
01:02:45.640 another.
01:02:46.340 Yes, but not to the same degree.
01:02:49.140 All right.
01:02:49.200 I can change
01:03:02.260 your mind, you
01:03:02.880 say?
01:03:03.820 Maybe you
01:03:04.380 can.
01:03:07.140 Yeah, Russia
01:03:07.900 issued an arrest warrant
01:03:09.760 for Senator
01:03:10.820 Lindsey Graham.
01:03:12.760 They just made
01:03:13.600 a big old list
01:03:14.280 of people who
01:03:14.780 were their
01:03:15.020 enemies.
01:03:15.680 It looked
01:03:15.940 like it was
01:03:16.340 randomly
01:03:16.760 created.
01:03:22.080 All right.
01:03:27.580 Why does
01:03:28.120 a disease
01:03:28.700 kill in the
01:03:29.260 simulation?
01:03:29.980 So it
01:03:30.280 looks real.
01:03:31.960 All right.
01:03:32.420 That's all for
01:03:32.860 now on YouTube.
01:03:33.520 I'm going to
01:03:33.780 talk to the
01:03:34.140 locals people a
01:03:34.800 little bit
01:03:35.080 more.
01:03:36.220 I will see
01:03:37.380 you later.
01:03:39.220 Bye.