Real Coffee with Scott Adams - May 16, 2023


Episode 2110 Scott Adams: Durham Report, Giuliani's "Girlfriend", Twitter Hit List, Patriot Front


Episode Stats

Length

1 hour

Words per Minute

137.4

Word Count

8,372

Sentence Count

708

Misogynist Sentences

30

Hate Speech Sentences

24
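
(Arithmetic check on the stats above: 8,372 words ÷ 137.4 words per minute ≈ 61 minutes, consistent with the stated length of 1 hour.)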


Summary

Coffee with Scott Adams is a new kind of podcast, and it's based in San Francisco, California. It's called the highlight of civilization, and you've never had a better time listening to it than right here.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the highlight of civilization.
00:00:08.880 It's called Coffee with Scott Adams, and you've never had a better time.
00:00:14.200 I've already been laughing my ass off talking to the people on the locals' platform,
00:00:20.280 but now we'll bring this goodness to YouTube and Spotify and everywhere else.
00:00:24.380 If you'd like to take this experience up, and I guarantee this will be one of the best
00:00:30.420 live streams you've ever seen, all you need is a cup or a mug or a glass, a tank or a chalice
00:00:35.300 or a canteen jug or a flask, a vessel of any kind.
00:00:39.280 Fill it with your favorite liquid, I like coffee, and join me now for the unparalleled pleasure.
00:00:45.180 It's the dopamine hit of the day.
00:00:47.300 It's the thing that makes everything better.
00:00:49.260 It's called the simultaneous sip.
00:00:51.540 It happens now.
00:00:52.720 Go.
00:00:54.380 Oh, I don't know where to begin, but let's begin on an update.
00:01:07.760 You remember we used to call them bums and hobos, but that was unkind.
00:01:12.100 Then they became the homeless, and that sounded a little over-specific, and then they became
00:01:17.820 the unhoused.
00:01:19.640 But now they are just the freedom crappers.
00:01:21.980 And now Los Angeles has apparently 69,000 freedom crappers who crap once a day on the
00:01:33.000 sidewalk.
00:01:33.380 And I was wondering, if this happens to San Francisco, we have this race every year
00:01:40.640 in San Francisco, it's called the Bay to Breakers, where they race from one side of the city to
00:01:46.100 the other, from the bay to the beach.
00:01:48.220 And I was thinking that you might need to change the name of the Bay to Breakers, because it's
00:01:55.720 going to look, you know, if you're watching it, it's going to look a little more like Irish
00:02:00.540 line dancing, because there'd be so many turds in the way between the bay and the beach.
00:02:08.280 There's going to be a lot of this.
00:02:09.260 So it looked like Irish line dancing, and I asked my followers on locals if they could
00:02:21.380 give me some kind of a new name for this race, because Bay to Breakers doesn't feel like,
00:02:27.500 I don't know, it just doesn't feel current, does it?
00:02:29.740 So instead of the Bay to Breakers, the best suggestion I got was a turd of hurdles.
00:02:37.980 Turd of hurdles.
00:02:40.740 I'll let that sink in.
00:02:42.540 No, it's not a herd of turtles.
00:02:44.380 That's completely different.
00:02:45.740 A herd of turtles would be a slow-moving group.
00:02:48.580 No, this is a turd of hurdles.
00:02:51.660 The turd hurdle.
00:02:53.560 All right, well, enough about that.
00:02:55.040 You know, everybody asks me, why don't you move out of California to that hellhole?
00:03:01.920 And I say, well, I mean, I'm open to it, but where would I go?
00:03:08.600 And one of the top places that people suggested was Austin, because it's in Texas, and Texas
00:03:14.520 is awesome, and Austin is the awesome place, well, one of them, one of the awesome places
00:03:20.080 in Texas, but it turns out that it's a Democrat-led city and has become a, well, kind of a freedom
00:03:28.640 crapper hellhole.
00:03:30.980 So it turns out Austin is going down the drain.
00:03:37.140 I feel sorry for Tim Ferriss, who moved out of San Francisco to Austin.
00:03:42.520 I feel like he did not hurdle enough turds.
00:03:45.620 He needs to hurdle a few more and maybe get to Florida or something.
00:03:48.580 I don't know what people do, but Austin doesn't look good anymore.
00:03:53.500 Yes, we will get to all the big news.
00:03:55.300 Don't worry.
00:03:56.720 Don't worry.
00:03:57.800 I'll get to all the big news.
00:03:59.360 It's coming.
00:04:00.900 Oh, it's coming.
00:04:02.080 It's a very newsy day, have you noticed?
00:04:05.940 Some of it is about me, it turns out.
00:04:10.060 So here's the update on remote workers.
00:04:12.240 So it turns out that as long as employment is good, the employees have more power than
00:04:20.020 when employment is low.
00:04:21.940 So employees really, really, really, really like working at home at least part of the week.
00:04:28.220 So the trend, where it looked like people were going back to work, has reversed
00:04:35.020 again.
00:04:35.440 So the employees were pushing back against the requirement to come to work, and successfully,
00:04:42.220 because they have the power, given how good employment is.
00:04:45.600 So it looks like something like half of office buildings are just going to stay empty most
00:04:52.180 of the time.
00:04:53.420 But you can't really get rid of them, can you?
00:04:56.200 Because it's the same office that half of the people are in.
00:04:58.800 So if it's half empty, but you still need it, it's going to take a while for people to
00:05:03.240 downsize.
00:05:04.860 But yeah, there's a big problem.
00:05:07.220 So I don't think people are coming back to the city the way they used to.
00:05:12.060 Yeah, we're going to turn them all into pickleball courts.
00:05:14.160 That's the best idea ever.
00:05:16.880 I'd turn at least some of them into public bathrooms, but that's just me.
00:05:23.100 So that's happening.
00:05:24.340 Keep an eye on that.
00:05:25.340 So there's a story today, and I don't know if I believe this or not, but apparently on
00:05:34.380 GitHub, there was a target list of conservative social media people that would be targeted by
00:05:44.220 the DNC and leftists.
00:05:47.580 And I thought to myself, well, that's interesting.
00:05:51.160 I think I'll look at that list just to see if I'm on it, right?
00:05:59.300 I mean, that was my first thought.
00:06:00.880 It's like, oh, shoot.
00:06:02.100 There's a list of people being targeted.
00:06:05.940 I'm number nine.
00:06:09.120 I'm in the top 10.
00:06:10.320 Five of the top 10, the people that I share it with, have either left Twitter on their
00:06:17.680 own, such as President Trump, or have been kicked off of Twitter.
00:06:22.660 I'm right behind Sidney Powell on the list of targets.
00:06:27.100 That's right.
00:06:28.120 I am right behind Sidney Powell on the top 10 list of conservative influencers that were
00:06:35.880 targeted for destruction.
00:06:37.040 Now, I think that it was specifically about the 2020 election.
00:06:43.440 Can you tell me what it was that I said about the 2020 election that would put me in the
00:06:48.840 top 10 of people that they need to target?
00:06:53.140 Do you remember a claim that I made about election fraud, some specific claims?
00:07:00.440 Do you remember any?
00:07:02.000 There weren't any.
00:07:02.940 No, I was the one who told you on day one that at least 95% of all the claims would be
00:07:08.760 bullshit.
00:07:09.720 I was the first person in the country to call bullshit on the Kraken.
00:07:15.540 First one.
00:07:16.940 There was nobody in the country who called bullshit before I did.
00:07:20.060 As soon as I heard it, I said, that's bullshit right there.
00:07:23.180 That's a whole bunch of bullshit right there.
00:07:24.860 Now, what do you think I'm going to do if I find myself targeted by powerful interests who
00:07:34.680 are trying to take me out for my opinions about the 2020 election?
00:07:41.280 So I tweeted this today.
00:07:44.100 Things we know are corrupt.
00:07:45.740 FBI, DOJ, intel organizations, Congress, Big Pharma, FDA, experts of all kinds,
00:07:51.680 the Biden crime family, and the entire media.
00:07:54.860 But, on the good side, things we know that are totally clean.
00:08:01.860 All 50 state election systems.
00:08:05.360 Good job, guys.
00:08:07.160 Good job.
00:08:09.000 All 50 election systems.
00:08:11.380 Clean as a whistle.
00:08:12.920 Now, I think we can all be proud of that, given that 100% of every other organization is corrupt
00:08:18.640 and we've proven it just this year.
00:08:20.180 So, you might say, that's highly unlikely, Scott, but I say, that's a sign of excellence.
00:08:29.420 That's a sign of excellence.
00:08:30.860 All 50 run by different organizations, and yet all 50, pristine.
00:08:37.500 Pristine.
00:08:40.780 I have no idea why they targeted me.
00:08:42.980 I don't know what's up with that.
00:08:48.060 So, that happened.
00:08:50.400 I don't know if that list is real, by the way.
00:08:53.260 So, the larger theme for today's live stream is that everything's a hoax.
00:08:59.260 And, I was trying to come up with a name for a political system that's entirely based on hoaxes.
00:09:07.380 Because that's what we are.
00:09:09.480 Our political system is now entirely based on hoaxes.
00:09:14.160 That's it.
00:09:15.660 There's not a real fucking thing happening anywhere.
00:09:18.780 It's all hoaxes.
00:09:20.740 So, we'll talk about that.
00:09:22.140 But, I don't know if this Twitter list of targeted accounts is real.
00:09:30.520 But somebody put me in the top ten with a pretty interesting cast of characters there.
00:09:38.480 All right.
00:09:41.820 So, we're going to talk about the Patriot Front and Rudy Giuliani.
00:09:46.180 And, we're going to talk about the new, what you call it, report.
00:09:52.660 The Durham report.
00:09:54.280 But, before I do that, I should warn you that we need to be alert.
00:10:02.320 Here's the good news that comes with the bad news.
00:10:04.880 The bad news is we found out that everything was corrupt.
00:10:08.460 Just everything.
00:10:10.120 Like, everybody was corrupt.
00:10:12.720 The good news is you can spot it much easier now.
00:10:16.180 Because, you can see how it's done.
00:10:18.740 You can spot the tells.
00:10:20.700 Right?
00:10:21.740 If tomorrow you saw a video that you knew was a real video.
00:10:26.800 And, it showed somebody doing, some Republican, saying something terrible.
00:10:32.560 What's the first thing you'd say to yourself?
00:10:35.240 Whatever that terrible thing is and whoever said it.
00:10:37.280 What's the first thing you'd say?
00:10:40.280 What did they, what did they edit out?
00:10:43.240 What did he, what did he or she say just before that?
00:10:46.680 And, what did he or she say right after that?
00:10:49.720 Because, that's the part they don't show you.
00:10:51.900 Because, that probably gave you the context, which changed the meaning.
00:10:55.820 So, I'm not sure we would have known that ten years ago.
00:11:00.520 Ten years ago, I think I would have just taken the video at face value.
00:11:03.600 But, now we know it's so common to create news by clipping off a part of a video, that I would assume it's a fake video.
00:11:14.360 My first assumption of any video that's super, super on the nose embarrassing, like it's just too close to somebody's narrative, first assumption is fake.
00:11:24.940 Right?
00:11:25.660 I hope you're all with me.
00:11:26.820 Now, you should make that starting assumption, but then look into it.
00:11:31.180 It could be real.
00:11:32.820 You know, you want to find out.
00:11:34.240 But, your starting assumption should be fake.
00:11:37.540 Now, don't we know, given all the hoaxes we've seen, that there is one being ginned up right now?
00:11:46.680 Now, don't you know there's some major hoaxes that are brewing?
00:11:51.100 And, maybe we already saw them.
00:11:52.620 We'll talk about them.
00:11:54.060 Right?
00:11:54.980 But, be ready, because you have to spot the next one.
00:11:59.500 There's going to be another Russia collusion-like hoax.
00:12:02.740 Not about Russia, necessarily.
00:12:04.620 Maybe.
00:12:05.580 But, it'll be of that size.
00:12:08.900 And, it's coming.
00:12:10.020 And, it's coming for sure.
00:12:11.800 And, I want us all to spot it the minute it gets here.
00:12:15.360 Right?
00:12:15.800 Because, we haven't been good at that before.
00:12:18.840 We want to laugh at it when it arrives.
00:12:21.440 Like, actually just laugh out loud.
00:12:23.400 It's like, nice try.
00:12:24.760 That's a little bit too on the nose.
00:12:27.820 So, speaking of that, I think it's one day after, or two days after Biden gave his speech,
00:12:35.800 in which he said white supremacy is the big terrorist risk in the United States.
00:12:40.600 I think people on the right are taking him out of context.
00:12:44.180 He didn't say it's the biggest risk to the United States.
00:12:48.100 People are saying that's what he said.
00:12:49.660 He didn't say that.
00:12:50.620 He said the biggest terrorist risk.
00:12:53.180 No, he said the biggest domestic terrorist risk.
00:12:56.920 What's the second biggest domestic terrorist risk after white supremacy?
00:13:03.760 What's the second biggest?
00:13:07.520 There's no domestic terrorist risk.
00:13:10.680 There is none.
00:13:11.620 Yeah, maybe.
00:13:15.100 Maybe.
00:13:15.840 Maybe white supremacy is number one on a list of totally unimportant, smallish things that
00:13:22.140 aren't likely to happen very much in the future.
00:13:25.140 It's a hoax.
00:13:26.020 Which is not to say that there have not been bad things that some alleged white supremacists did.
00:13:34.780 But right after that, the Patriot Front, who everybody on the right believes is a fake group
00:13:42.140 pretending to be on the right, as I saw today, there was Elijah Schaefer who was saying that he knows
00:13:50.400 basically everybody in the Republican conservative world, like every group.
00:13:56.460 He knows at least somebody, or he knows somebody who knows somebody, everywhere.
00:14:00.920 But he says, I've never met anybody who knows anybody in the Patriot Front.
00:14:04.660 Anybody.
00:14:07.720 Like, I've personally talked to people in the Proud Boys.
00:14:13.160 How many of you have had, or at least know, somebody who knows somebody in the Proud Boys?
00:14:19.880 Right?
00:14:20.520 It's very common.
00:14:21.480 If you're associated with the right, you probably know somebody.
00:14:24.440 Maybe not personally.
00:14:25.980 But, you know, I've talked to Gavin McInnes.
00:14:28.420 I did an interview with him years ago.
00:14:30.880 But the Patriot Front, I've had no contact.
00:14:35.360 In any way, directly or indirectly, with the Patriot Front.
00:14:39.780 It's a little unusual, because they get a lot of publicity, don't they?
00:14:43.160 Now, I know that they wear masks, and they're trying to be secretive.
00:14:46.380 But they put out this video that purports to show them when the camera is not supposed to be running.
00:14:55.300 But it's obvious that they're acting, and they know the camera is running.
00:14:58.940 So they do their little white supremacist salute or whatever.
00:15:03.320 And then as they're done, instead of acting like normal people when they finish something,
00:15:09.240 they act like actors who are acting like they finish something.
00:15:12.460 It was just terrible acting.
00:15:14.540 It was very obvious acting.
00:15:16.500 And then one of them says some Nazi thing.
00:15:18.580 So that's the payoff.
00:15:21.440 Oh, we got them on.
00:15:22.740 They didn't know the camera was running.
00:15:24.780 So we got them on some Nazi stuff.
00:15:27.940 Completely fake looking.
00:15:29.200 Now, I'm willing to believe I could be wrong about anything.
00:15:35.160 But if I'm wrong about this, I'll be really surprised.
00:15:38.900 This looks entirely like a hoax.
00:15:41.380 It looks like a long-running hoax.
00:15:43.320 Because they just drag these people out whenever they need to move the narrative a little bit.
00:15:48.580 So I would say that the Patriot Front is part of it.
00:15:52.620 But then there was also this story about Rudy Giuliani,
00:15:56.280 who is an ex-girlfriend who also did PR work for him.
00:16:02.340 And she alleges that he tricked her into sex and forced her to perform on demand whatever he wanted
00:16:12.040 and some icky stuff like that.
00:16:16.320 We don't care about any of that.
00:16:17.960 But one of the things that she is alleging is that Rudy Giuliani was offering to sell pardons.
00:16:27.540 And he was working with Trump.
00:16:29.520 And that they would charge $2 million for a pardon.
00:16:32.020 And then Giuliani and Trump would each keep a million.
00:16:36.580 And allegedly he told his girlfriend this.
00:16:39.360 And then now she's telling the public.
00:16:42.040 Does that sound true to you?
00:16:44.880 Or does that sound like an obvious hoax?
00:16:48.460 A little too close?
00:16:51.160 A little bit too close?
00:16:53.000 Do you think that Rudy Giuliani,
00:16:55.640 even if he were doing some kind of scheme like this,
00:16:59.340 do you think he would just say it directly to his girlfriend and give dollar amounts?
00:17:04.440 Do you think he'd give dollar amounts to his girlfriend?
00:17:07.680 I don't think so.
00:17:10.240 I don't think so.
00:17:11.200 Do you think that Trump would sell pardons to make a few extra million dollars when everybody's watching?
00:17:21.200 Do you think so?
00:17:23.760 Now, it wouldn't surprise me if somewhere in the history of the world somebody sold a pardon.
00:17:29.180 That doesn't sound too surprising.
00:17:32.140 But Trump was already a billionaire.
00:17:34.360 Do you think he needed $5 million more at the risk of that?
00:17:41.140 The gigantic risk?
00:17:43.340 He's the most vetted person ever.
00:17:46.080 Anyway, so there's nothing about this that looks real.
00:17:50.000 So that looks like a hoax to me.
00:17:53.300 And I'm a little bit curious about a woman who would have what must have been the hideous sex with Rudy Giuliani.
00:18:03.300 Because she was younger and attractive.
00:18:05.140 I don't know how young, but they were not in the same age class.
00:18:09.540 And she did this continuously.
00:18:13.240 And now she's, you know, she's whistleblowing.
00:18:18.900 Or I guess she was whistleblowing then, you could say.
00:18:22.100 But does she sound like a real person or somebody who might have a backer?
00:18:27.940 She sounds a little bit like somebody who has a backer.
00:18:33.280 You know what I mean?
00:18:35.300 A backer who said, you know, if you would just get close to this guy,
00:18:41.200 we would make sure that your life turns out pretty well after that.
00:18:46.160 You know, come up with some good stuff.
00:18:48.660 Yeah.
00:18:49.260 So it's a little bit sketchy.
00:18:51.000 So I don't believe anything about the Rudy Giuliani story.
00:18:54.380 Except that he probably had a lot of sex with her.
00:18:56.920 That's probably true.
00:18:59.480 Speaking of sex,
00:19:01.320 it turns out that the number of young men and women having sex is way down.
00:19:06.860 So between 2007 and 2017,
00:19:10.660 the percentage of 18 to 23-year-old men who had casual sex in the past month
00:19:15.440 dropped from 38% to 24%.
00:19:19.920 And among young women,
00:19:24.180 it dropped from 31% to 22%.
00:19:26.920 Now,
00:19:29.160 that doesn't mean there's less sex.
00:19:35.060 That means there are fewer people participating in it.
00:19:38.760 Am I right?
00:19:40.040 Because I think there might be more sex.
00:19:42.420 It's just that the people who can get it are having it,
00:19:44.980 and the people who are left out are having none.
00:19:46.640 And so I think the world is just bifurcated into basically total cum sluts
00:19:52.980 and people who can't get a date.
00:19:56.100 And that's it.
00:19:57.540 It was like nothing in the middle.
00:19:58.720 That's an exaggeration.
00:20:02.660 All right, let's talk about our corrupt government.
00:20:05.780 Apparently a whistleblower is saying that the IRS investigation into Hunter Biden
00:20:11.160 was ended and the entire investigative team
00:20:14.960 that was doing a multi-year tax fraud investigation were all let go.
00:20:19.020 The entire team, including the whistleblower.
00:20:23.400 Is that because they had an answer and there was nothing to see here?
00:20:27.020 We still don't know if Hunter or any of the family members paid any taxes
00:20:32.600 on any of the alleged payments.
00:20:38.300 So doesn't that look like an obvious corruption?
00:20:44.780 Like really super obvious.
00:20:47.460 Is there any other way to interpret that
00:20:49.740 than Hunter didn't pay his taxes?
00:20:51.660 I don't know how to, how could you,
00:20:54.020 how could you come up with any other interpretation?
00:20:57.540 Because if the interpretation, the correct interpretation was
00:21:00.860 there was nothing to see here, they would have told us.
00:21:04.520 Wouldn't they say the IRS looked into it, there was nothing to see here?
00:21:08.520 Because you'd want a clear Hunter, right?
00:21:10.860 Yeah, everything's copacetic, it's all good.
00:21:12.940 But no, they don't tell us everything's okay,
00:21:16.160 they just fire everybody who's looking for something bad for multiple years.
00:21:20.280 That's a little bit suspicious.
00:21:23.260 A little bit suspicious.
00:21:25.400 So you're asking, is it too on the nose?
00:21:28.420 Good question.
00:21:29.740 Is it too on the nose
00:21:31.000 that, you know, the whole group got fired?
00:21:37.360 It feels like,
00:21:39.100 it feels like it would be weird if that's true.
00:21:44.420 But, remember, this is all in the context
00:21:46.520 of a whole bunch of other things that are clearly corrupt.
00:21:48.840 So it would just be one more corrupt thing
00:21:51.440 in the long list of corrupt things.
00:21:55.340 Well, let's talk about the Durham report.
00:21:57.600 Five years later, 300 pages later.
00:22:01.900 Would you be surprised to find out
00:22:04.060 that although the report is shocking,
00:22:07.900 that the people on the left have decided
00:22:10.440 it's not really much news at all?
00:22:12.820 Not much news at all.
00:22:14.760 In fact, we can move on after we give it a,
00:22:16.560 we'll just brush on it.
00:22:19.220 Yeah, maybe mention some of the highlights,
00:22:20.820 but it's not much news here.
00:22:23.440 So here's how the right is interpreting it.
00:22:27.300 We now have documented proof
00:22:29.720 that the FBI and the Department of Justice
00:22:33.220 were actively doing unprofessional things
00:22:37.320 to bias the outcome of the,
00:22:40.980 well, which had the effect,
00:22:42.680 had the effect,
00:22:43.960 because we can't read their minds,
00:22:45.540 had the effect of changing the outcome of the election.
00:22:49.920 Now, apparently it's a little complicated
00:22:51.800 because they had also done some stuff against Hillary.
00:22:55.700 So if you say to yourself,
00:22:57.100 oh, they were totally in the bag for the Democrats,
00:22:59.820 it'd be a little hard to explain
00:23:01.100 why they were investigating Hillary.
00:23:02.580 They actually had a spy in her campaign
00:23:04.340 because it was some specific allegation
00:23:06.480 of maybe taking foreign money,
00:23:09.420 which, of course,
00:23:10.420 which, of course, they didn't find anything
00:23:13.280 and they canceled it.
00:23:14.480 So maybe it's an example of them protecting Hillary.
00:23:17.600 Oh, yeah, we have a spy there,
00:23:19.580 and, yeah, so we looked into it,
00:23:22.280 but the spy didn't find anything,
00:23:24.500 so there wasn't anything there.
00:23:27.000 So it could be
00:23:28.140 that even the fact that Hillary was investigated
00:23:31.020 was also fake.
00:23:33.520 In other words, maybe they didn't try.
00:23:36.300 So the left is saying
00:23:41.500 nobody's being indicted,
00:23:44.420 no additional people,
00:23:46.140 so nobody's being indicted additionally,
00:23:48.640 and it's all news that we already knew,
00:23:51.460 and, yes, the FBI was unprofessional,
00:23:54.700 and, yes, according to Durham,
00:23:56.640 there may have been some confirmation bias,
00:23:59.560 which caused them to ignore their own standards
00:24:03.260 and open an investigation into something
00:24:05.860 that did not have the prerequisite evidence
00:24:09.560 that you need for an investigation.
00:24:13.480 So on one hand,
00:24:14.800 it looks like an obvious coup attempt
00:24:16.820 against the United States that worked.
00:24:19.840 One of two.
00:24:21.120 As you could say,
00:24:21.720 the other coup attempt
00:24:22.520 was the 51 intel people
00:24:24.160 who said the laptop
00:24:25.280 was Russian disinformation.
00:24:28.080 So now we have two examples
00:24:29.200 of something that looks,
00:24:31.480 by the outcome,
00:24:33.580 to be a coup,
00:24:35.220 coup attempts.
00:24:36.260 Basically,
00:24:37.680 the FBI and Department of Justice
00:24:40.520 trying to influence
00:24:42.020 the outcome of an election.
00:24:43.880 However,
00:24:45.300 there's no smoking gun.
00:24:46.860 Apparently there's no evidence
00:24:49.580 of anybody colluding to do it.
00:24:53.200 Now, I'm not saying
00:24:53.860 they didn't collude.
00:24:55.260 I'm saying Durham didn't find
00:24:57.600 the direct evidence there.
00:24:59.600 So it looks like a bunch of people
00:25:01.160 making a bunch of decisions
00:25:02.680 under confirmation bias.
00:25:07.860 Now, to be fair,
00:25:10.720 Durham said
00:25:11.480 that was the most charitable explanation.
00:25:13.860 So the best you could say
00:25:16.900 was confirmation bias,
00:25:18.620 meaning a normal thing
00:25:20.240 that happens to brains.
00:25:21.760 You think you're right,
00:25:22.680 so you act that way
00:25:23.440 even though the evidence
00:25:24.220 didn't support it.
00:25:26.080 What he didn't find
00:25:27.500 is a deep state conspiracy
00:25:29.680 that was specifically
00:25:31.000 targeting Trump.
00:25:33.960 It just looked exactly like it.
00:25:38.860 Now, this gets us
00:25:40.180 to the George Carlin
00:25:42.520 definition of a conspiracy.
00:25:47.120 So as George Carlin points out,
00:25:49.320 you don't have to have a meeting
00:25:50.620 if everybody knows what to do.
00:25:54.200 Right?
00:25:55.180 You don't have to have a meeting.
00:25:57.160 Don't make a phone call.
00:25:58.420 Don't send a text.
00:25:59.840 Everybody knows what to do.
00:26:01.720 Get Trump
00:26:02.380 any way you can.
00:26:04.680 So,
00:26:05.360 I'm not sure
00:26:07.060 I'm buying the
00:26:08.000 confirmation bias
00:26:09.640 is all that was there
00:26:10.620 even though
00:26:11.460 that's all they could find.
00:26:12.880 These were all
00:26:13.740 high-end smart people
00:26:16.080 who knew not
00:26:17.380 to write things down.
00:26:19.900 Right?
00:26:20.880 And a whistleblower
00:26:21.940 would probably
00:26:22.520 be in a lot of trouble too.
00:26:24.380 So,
00:26:24.920 I don't think
00:26:27.300 you can rule out
00:26:28.160 conspiracy.
00:26:30.100 It's just
00:26:30.580 you can't find it.
00:26:33.120 So,
00:26:33.700 I won't say
00:26:34.220 that anybody's guilty
00:26:35.200 of conspiracy
00:26:35.780 because, you know,
00:26:36.600 innocent until proven guilty.
00:26:37.820 But it looks like it.
00:26:40.400 Sure looks like it.
00:26:42.020 Conspiracy.
00:26:45.040 So,
00:26:45.660 here are some of the comments
00:26:46.560 that CNN made.
00:26:48.100 This is Jake Tapper.
00:26:49.800 He said that
00:26:50.860 it exonerated Trump
00:26:52.900 somewhat.
00:26:54.780 So,
00:26:55.220 there is some recognition
00:26:56.120 that Trump
00:26:57.500 was somewhat
00:26:58.300 exonerated
00:26:59.120 in the sense
00:26:59.820 that people
00:27:00.320 were after him
00:27:01.160 and not treating him
00:27:01.940 fairly.
00:27:02.200 That appears
00:27:03.340 to be
00:27:04.040 completely true.
00:27:05.500 They were after him
00:27:06.460 and not
00:27:06.860 treating him fairly.
00:27:10.180 Jake Tapper
00:27:10.960 said it was also
00:27:11.680 devastating
00:27:12.460 for the FBI.
00:27:14.940 But,
00:27:15.840 in typical
00:27:17.800 Dilbert
00:27:18.460 bureaucratic form,
00:27:20.640 what happened?
00:27:21.940 Well,
00:27:22.300 it took five years
00:27:23.360 to look into it
00:27:24.280 and get this report.
00:27:25.980 In those five years,
00:27:27.660 all the people
00:27:28.340 involved
00:27:28.880 or most of them
00:27:29.560 left.
00:27:29.920 And the FBI
00:27:32.160 made some changes
00:27:33.420 because they already
00:27:34.480 had this information
00:27:35.420 themselves,
00:27:36.260 made some changes
00:27:37.340 which they say
00:27:38.100 will prevent it
00:27:38.800 from happening again.
00:27:40.460 Problem solved,
00:27:41.440 right?
00:27:42.480 The people involved
00:27:43.560 are not there anymore
00:27:44.520 and the FBI
00:27:46.840 put in some procedures
00:27:47.920 to make sure
00:27:48.520 it doesn't happen again.
00:27:50.020 So,
00:27:50.280 problem solved.
00:27:51.200 There's no news here.
00:27:52.980 Moving on.
00:27:54.280 No crimes,
00:27:55.520 no news,
00:27:56.800 problem solved.
00:27:57.980 Those people
00:27:58.440 aren't here anymore.
00:28:00.200 Eh,
00:28:00.580 probably a little
00:28:01.200 confirmation bias.
00:28:02.600 Glad we got to the
00:28:03.380 bottom of it.
00:28:03.900 Moving on.
00:28:04.700 Moving on.
00:28:07.940 Nobody will be punished.
00:28:10.580 Or at least more
00:28:11.360 than they have been.
00:28:12.100 Some people lost
00:28:12.740 their jobs.
00:28:16.080 All right.
00:28:16.860 And then,
00:28:17.860 you know,
00:28:18.040 no surprises.
00:28:19.060 Yes,
00:28:19.340 there were no surprises
00:28:20.120 here.
00:28:20.520 We already knew
00:28:21.200 this basically.
00:28:22.160 Basically,
00:28:22.360 so this is,
00:28:27.540 you know,
00:28:27.740 again,
00:28:28.460 it's two movies
00:28:31.140 on one screen.
00:28:32.640 So we're all looking
00:28:33.660 at exactly the right stuff
00:28:35.040 and the fact that
00:28:36.660 it's not technically illegal.
00:28:39.520 Now,
00:28:40.020 let me ask you this.
00:28:40.700 Would it be illegal
00:28:41.540 to actually organize
00:28:42.840 a hoax?
00:28:45.380 I mean,
00:28:45.900 you can get fired for it,
00:28:47.020 of course.
00:28:48.340 But if the FBI
00:28:49.140 and the Department
00:28:49.720 of Justice
00:28:50.240 organized a hoax,
00:28:53.240 is that illegal?
00:28:56.080 As long as they
00:28:57.080 can sell it as
00:28:57.900 something they
00:28:58.400 believed was true.
00:28:59.880 If they can sell it
00:29:01.000 as something they
00:29:01.700 believed was true,
00:29:03.040 then you get
00:29:04.020 the Durham report
00:29:04.840 which says,
00:29:05.460 well,
00:29:05.580 it's a little
00:29:05.880 confirmation bias,
00:29:07.460 but that's not illegal.
00:29:09.520 It's not illegal
00:29:10.440 to be wrong.
00:29:12.060 Oh, yeah,
00:29:12.600 they were wrong.
00:29:14.280 That's all we could prove.
00:29:17.920 All right.
00:29:22.100 Just to round out
00:29:23.440 my talk about
00:29:26.080 how everything
00:29:26.780 is corrupt,
00:29:28.440 apparently there was
00:29:29.260 a massive missile barrage
00:29:31.900 by Russia
00:29:33.180 into Ukraine
00:29:33.940 last night,
00:29:34.820 into Kiev.
00:29:36.620 Massive barrage.
00:29:38.720 And the news says
00:29:39.900 that Ukraine
00:29:40.700 intercepted
00:29:41.960 and shot down
00:29:42.620 every one of them.
00:29:44.520 Every one of them.
00:29:46.480 Got them all.
00:29:48.820 What's Russia say?
00:29:51.140 Every one of the missiles
00:29:52.440 hit their target.
00:29:55.260 Every one.
00:29:56.380 All 18.
00:29:58.220 Got their targets.
00:30:01.000 Which one of those
00:30:01.940 is true?
00:30:03.760 Probably neither,
00:30:04.720 right?
00:30:05.440 Probably they got some,
00:30:06.800 some got through,
00:30:07.580 but we don't know.
00:30:09.000 We don't know.
00:30:10.700 But that's the state
00:30:12.620 of information.
00:30:14.040 You can get the thing
00:30:15.820 and the opposite
00:30:16.500 of the thing
00:30:17.140 at the same time.
00:30:19.840 So did you wonder
00:30:20.840 how Russia
00:30:21.420 is staying afloat
00:30:22.800 with all those sanctions?
00:30:25.120 So part of it is
00:30:26.440 they're selling
00:30:27.000 their oil to India
00:30:28.520 and then India
00:30:30.280 is refining it
00:30:31.200 and selling it
00:30:31.820 into Europe.
00:30:33.020 So basically,
00:30:33.840 Russia is still
00:30:35.740 selling to Europe.
00:30:37.260 Now,
00:30:38.800 I told you that,
00:30:40.740 and I guess
00:30:41.600 I'm still the only person
00:30:42.580 in the world
00:30:42.920 saying this,
00:30:44.000 that the Ukraine-Russia
00:30:45.840 war is over.
00:30:48.020 The war is over.
00:30:50.500 Trump made it clear
00:30:51.940 that it's now
00:30:52.840 a negotiation.
00:30:54.700 And nobody's
00:30:55.620 going to win anything.
00:30:56.980 And when he wins,
00:30:58.120 which statistically
00:30:59.460 looks like it's
00:31:00.300 a good bet,
00:31:01.780 when he wins,
00:31:02.860 he's going to end it
00:31:03.820 just by putting
00:31:04.720 enough pressure
00:31:05.320 on both sides
00:31:06.100 that they're
00:31:07.780 better off ending it.
00:31:09.500 So at this point,
00:31:10.420 they're just negotiating
00:31:11.340 with weapons.
00:31:12.800 That's all it is.
00:31:14.280 Nobody's going
00:31:14.860 to win anything.
00:31:16.060 There's no winning
00:31:16.740 to be had.
00:31:20.360 But here's my
00:31:23.380 addition to that point.
00:31:26.180 If this were
00:31:27.160 a real war,
00:31:28.900 we would have
00:31:29.820 bombed India.
00:31:32.780 Am I right?
00:31:34.720 The minute we found
00:31:35.720 out that India
00:31:36.400 was refining the oil
00:31:37.640 and selling it
00:31:38.220 back to Europe,
00:31:39.140 which basically meant
00:31:40.300 that the sanctions
00:31:41.660 weren't working,
00:31:42.860 we would shut down
00:31:43.780 India.
00:31:44.680 We'd ask them first.
00:31:46.380 I mean,
00:31:46.660 first you'd ask politely
00:31:47.780 because they're an ally.
00:31:49.180 But then we would
00:31:51.140 take it out.
00:31:52.380 We'd just bomb it.
00:31:53.900 Bomb the refinery.
00:31:56.020 Because that's what
00:31:56.840 you do in a war.
00:31:58.720 This is clearly
00:31:59.460 not a war.
00:32:00.860 It's clearly
00:32:01.500 a negotiation.
00:32:02.940 In a negotiation,
00:32:03.660 you'd be like,
00:32:04.500 oh,
00:32:05.280 don't go too far.
00:32:06.960 But if you knew,
00:32:08.160 you know,
00:32:08.620 if you were playing
00:32:09.700 to win,
00:32:10.160 you'd bomb India's
00:32:11.200 production facility
00:32:13.080 and just take that
00:32:14.680 off the table.
00:32:15.680 Now,
00:32:16.100 it might also starve
00:32:17.040 Europe,
00:32:18.200 so it's complicated.
00:32:20.440 But it doesn't look
00:32:21.820 like a war to me.
00:32:22.840 It's definitely
00:32:23.840 a negotiation
00:32:24.620 at this point.
00:32:26.840 Here's another thing
00:32:27.760 that we thought
00:32:28.220 was true.
00:32:28.640 How many of you
00:32:30.860 for years
00:32:31.520 have learned
00:32:32.200 that people
00:32:33.380 have different
00:32:33.920 learning styles?
00:32:36.800 You assumed
00:32:37.320 that was just
00:32:37.820 basic and true,
00:32:39.020 right?
00:32:39.360 Because you know
00:32:39.880 you do.
00:32:40.760 Some people say,
00:32:41.640 oh,
00:32:41.760 I have to hear it.
00:32:43.580 Some people say,
00:32:44.500 I have to see it.
00:32:45.400 That's definitely true.
00:32:46.780 Definitely true.
00:32:47.680 Did you know
00:32:48.140 that the science
00:32:48.880 debunks all of that?
00:32:51.060 That if you teach
00:32:52.100 people according
00:32:52.780 to their preferred style,
00:32:54.580 you get no difference
00:32:55.900 in outcomes.
00:32:56.440 There is not
00:32:57.740 the slightest bit
00:32:58.820 of science
00:32:59.520 behind that.
00:33:00.560 None.
00:33:01.620 I learned that today.
00:33:03.400 No science,
00:33:04.460 and there never
00:33:04.980 has been.
00:33:06.240 There's never
00:33:06.740 been science.
00:33:08.080 And when they
00:33:08.740 decided to do
00:33:09.520 some science,
00:33:10.160 they couldn't
00:33:10.440 find any effect.
00:33:12.220 The whole thing
00:33:13.340 is bullshit.
00:33:14.520 People learn
00:33:15.360 exactly the same.
00:33:17.260 End of story.
00:33:19.400 How many decades
00:33:20.960 have I believed
00:33:21.660 that was true?
00:33:23.340 Decades.
00:33:25.060 Decades.
00:33:26.440 Never was true.
00:33:28.020 Never was true.
00:33:30.900 All right,
00:33:31.460 here's some good news,
00:33:32.860 bad news situation.
00:33:34.800 So as you know,
00:33:36.600 one of the biggest
00:33:38.560 buyers of stocks
00:33:40.040 in companies
00:33:40.820 are investment funds.
00:33:43.500 So Vanguard
00:33:44.200 and places like that,
00:33:46.460 because they buy,
00:33:47.440 on behalf of their clients,
00:33:48.560 they buy massive
00:33:49.480 amounts of stocks.
00:33:51.260 So the big investment
00:33:52.680 funds can own
00:33:53.700 big chunks of companies.
00:33:55.480 And now there's
00:33:57.140 a service
00:33:57.900 that is scoring
00:34:00.360 these investment
00:34:04.020 companies
00:34:04.540 for how much
00:34:06.320 they're adhering
00:34:07.060 to ESG.
00:34:10.240 And you say to yourself,
00:34:11.940 oh no,
00:34:12.560 not this again.
00:34:14.220 Right?
00:34:14.940 You're saying,
00:34:15.480 oh no,
00:34:16.900 not what you think.
00:34:18.380 It's the opposite
00:34:19.320 of what you think.
00:34:20.640 It's a list
00:34:21.500 so you can avoid them.
00:34:22.500 So I'm going to call out
00:34:27.360 Dominion,
00:34:29.380 which is not
00:34:29.920 the voting machine company,
00:34:31.640 but a fund,
00:34:33.340 and Vanguard.
00:34:34.720 So Vanguard gets an A,
00:34:37.420 an A rating
00:34:38.560 for not looking at ESG,
00:34:42.900 for avoiding ESG,
00:34:44.920 because the rating is
00:34:46.920 from the perspective
00:34:48.040 of the investors.
00:34:50.540 And the investors
00:34:51.280 are not asking
00:34:52.200 for more racial equality.
00:34:55.460 The investors
00:34:56.020 are asking for profits.
00:34:57.780 So they want to know,
00:34:58.600 if I put my money
00:34:59.340 in Vanguard,
00:35:00.360 is Vanguard going to invest
00:35:01.720 in a bunch of ESG companies,
00:35:03.600 or are they going to invest
00:35:04.700 in companies
00:35:05.440 that are going to make me money?
00:35:07.480 And now you can look
00:35:08.420 at the list.
00:35:09.660 And here's the interesting part.
00:35:10.920 You know,
00:35:11.160 BlackRock was the big pusher
00:35:12.620 of ESG.
00:35:14.320 They got a C rating.
00:35:17.260 You'd expect an F,
00:35:18.440 because they're the big
00:35:19.100 pushers of ESG.
00:35:20.740 But even BlackRock
00:35:22.260 says ESG has gone too far.
00:35:24.680 They can't invest in it,
00:35:26.400 because it's gone crazy.
00:35:28.820 Like,
00:35:29.060 there were some initial
00:35:30.060 good things,
00:35:31.000 which they backed completely.
00:35:33.480 But now the,
00:35:34.440 you know,
00:35:34.980 the activism has reached
00:35:36.160 the crazy level,
00:35:37.440 where even BlackRock
00:35:38.860 is like,
00:35:39.320 hmm,
00:35:40.180 that's a little too ESG.
00:35:42.300 There's ESG,
00:35:43.260 and then there's
00:35:45.460 too ESG.
00:35:47.180 Now,
00:35:47.680 I've been calling
00:35:48.360 the peak of wokeness
00:35:49.860 for a while.
00:35:51.200 Do you see it yet?
00:35:53.560 Wokeness is a dead man walking.
00:35:56.140 There's still going to be
00:35:56.800 plenty of it
00:35:57.360 for a long time,
00:35:58.220 but we've definitely
00:35:59.160 reached peak.
00:36:00.960 We've reached peak,
00:36:02.120 and it's pulling back.
00:36:03.800 And you can see it
00:36:04.800 in the way people talk.
00:36:07.060 People can just say,
00:36:08.560 this is crazy now.
00:36:10.700 And you didn't used
00:36:11.720 to be able to do that.
00:36:12.560 You'd just be a racist.
00:36:14.760 And now you can just say,
00:36:15.660 all right,
00:36:15.820 this is too far.
00:36:16.960 This is just bullshit.
00:36:19.940 Did you see,
00:36:20.680 was it Miller Lite?
00:36:23.840 I couldn't tell
00:36:24.700 if that was real.
00:36:26.200 So Miller Lite
00:36:27.080 apparently has a campaign
00:36:28.860 to make a big deal
00:36:31.620 of getting rid
00:36:32.340 of the frat-like brand
00:36:35.260 that their beer had,
00:36:36.700 and they want to take down
00:36:38.480 all the bikini pictures
00:36:39.980 of women
00:36:40.560 that have ever been
00:36:41.860 associated with their beer
00:36:42.980 and burn them all.
00:36:45.260 And I just watched
00:36:47.600 that commercial,
00:36:48.500 and I thought to myself,
00:36:49.420 is this a parody?
00:36:51.380 Are they actually
00:36:52.420 making the mistake
00:36:53.580 of the last 100 years,
00:36:56.380 like the biggest mistake
00:36:57.620 you'd ever make?
00:36:59.060 And apparently,
00:36:59.680 they are,
00:37:00.560 because they hired a woman.
00:37:02.120 You've got to stop
00:37:06.440 hiring women
00:37:08.340 for men's products.
00:37:11.480 And likewise,
00:37:12.620 I'll say this again,
00:37:14.160 if Avon hired only women
00:37:16.260 as their executives,
00:37:17.460 I'm fine with that.
00:37:19.700 You know,
00:37:19.920 normally you'd want
00:37:20.880 more of an open mix
00:37:23.220 of people,
00:37:24.180 but if makeup
00:37:26.220 is the product,
00:37:27.920 yeah,
00:37:28.440 your executives
00:37:29.080 could all be women,
00:37:29.880 and I don't think
00:37:31.140 there's anything wrong
00:37:31.820 with that at all,
00:37:32.740 because they would
00:37:33.840 understand the area better.
00:37:36.320 I think that's the same
00:37:37.380 with beer.
00:37:38.960 You know,
00:37:39.180 it's not like women
00:37:39.820 don't drink beer,
00:37:41.180 but these specific products
00:37:42.900 were just sort of
00:37:43.740 guy products.
00:37:44.960 Get a guy.
00:37:48.280 I should tell you
00:37:49.340 that the Dilbert Reborn comic
00:37:51.020 that you can only see
00:37:52.020 by subscription now,
00:37:54.860 Dilbert's company
00:37:55.620 has a power tools division.
00:37:59.880 And they will be hiring
00:38:01.200 a woman to market
00:38:02.640 their power tools
00:38:03.700 going forward.
00:38:05.440 It doesn't work out.
00:38:07.660 It doesn't go well.
00:38:09.840 All right.
00:38:11.780 What else is going on here?
00:38:15.340 Yeah.
00:38:17.280 So here are the companies
00:38:18.740 that are investment companies
00:38:20.960 that got good scores
00:38:23.540 by ignoring ESG.
00:38:25.560 I'd just like to give
00:38:26.260 them a shout out.
00:38:27.040 Free advertising.
00:38:27.900 Here you go.
00:38:28.280 Dimensional, Vanguard,
00:38:31.260 T. Rowe Price,
00:38:33.100 Fidelity,
00:38:34.540 and then when you get
00:38:35.820 to the next level down,
00:38:37.400 it's actually BlackRock.
00:38:39.260 So BlackRock gets
00:38:40.540 an average score
00:38:41.760 for not being crazy
00:38:42.800 about the thing
00:38:43.440 that they promoted.
00:38:45.880 Just think about that.
00:38:47.960 BlackRock is getting a C,
00:38:49.920 which is way better
00:38:51.300 than it should have been.
00:38:52.160 It should have been an F
00:38:52.960 because they're the main
00:38:53.880 promoter of ESG.
00:38:55.120 But they have disavowed
00:38:57.800 their own thing so much
00:38:59.400 that they've got a C.
00:39:02.940 I mean, that's how
00:39:03.760 I interpret it.
00:39:04.580 That may be the wrong
00:39:05.360 interpretation,
00:39:06.140 but that's what it looks like.
00:39:09.960 All right.
00:39:10.540 Ladies and gentlemen,
00:39:16.640 it's a very newsy day.
00:39:17.840 Did I miss anything?
00:39:24.320 Fidelity is wicked smart,
00:39:25.600 you say?
00:39:27.280 James O'Keefe,
00:39:28.240 what's he doing lately?
00:39:29.460 Haven't heard much
00:39:30.380 from him lately.
00:39:30.960 All right.
00:39:38.720 What is the strategy?
00:39:43.740 New Twitter CEO,
00:39:45.220 not much to say.
00:39:48.380 Here's my take on the WEF.
00:39:52.940 It's sort of a leisure club
00:39:55.940 for rich people,
00:39:57.320 and if I were an executive
00:39:59.560 of a big corporation,
00:40:01.220 I'd probably go.
00:40:03.620 Would that mean I'm
00:40:04.800 colluding with the
00:40:06.260 World Economic Forum
00:40:07.300 to be a surrogate government
00:40:09.480 and that I'm all for
00:40:11.140 everything that they want to do?
00:40:12.860 No.
00:40:13.840 No.
00:40:14.280 I would go there
00:40:15.060 to meet the other rich people
00:40:16.160 and network with them.
00:40:18.200 It just seems like
00:40:19.400 a pleasant place
00:40:20.700 to go network.
00:40:22.300 So now,
00:40:23.460 I believe that if you...
00:40:25.880 Here's my theory.
00:40:26.620 The World Economic Forum
00:40:29.540 is a whole bunch
00:40:30.420 of hugely successful people,
00:40:33.160 right?
00:40:34.540 Who have huge egos,
00:40:37.020 because that's just
00:40:37.640 how it works.
00:40:38.840 They're hugely successful.
00:40:40.200 They've got big egos.
00:40:41.680 Do you think you could get
00:40:42.660 that group of people
00:40:43.500 to agree on anything?
00:40:45.900 I mean, seriously.
00:40:47.560 You think that you could
00:40:48.660 get them to agree
00:40:49.540 on anything serious?
00:40:51.540 I don't think so.
00:40:52.540 I think you could only
00:40:55.440 get them to agree
00:40:56.620 to signal.
00:40:58.840 Hey, how would you all
00:40:59.820 look like...
00:41:00.460 Would you like to look
00:41:01.140 like good citizens?
00:41:02.740 Sure.
00:41:03.680 All right, we're going
00:41:04.280 to put out some
00:41:04.800 good citizen stuff,
00:41:06.580 like climate change.
00:41:07.820 You're going to fix it
00:41:08.760 and, you know, equity.
00:41:11.040 And you'll just all
00:41:12.080 just say you're good
00:41:13.180 with it, okay?
00:41:14.060 Yeah, sure.
00:41:15.320 Makes us all look
00:41:16.160 like stars.
00:41:16.920 Okay.
00:41:18.080 I think that the
00:41:18.920 World Economic Forum
00:41:19.800 is a nothing.
00:41:22.300 I just don't think
00:41:23.000 they have any
00:41:23.560 serious control
00:41:25.820 over anything.
00:41:26.940 I think that they
00:41:28.120 find a parade
00:41:29.260 and they get behind it.
00:41:30.960 They didn't create
00:41:31.840 the climate change stuff.
00:41:33.260 They just saw a parade
00:41:34.380 and got on it.
00:41:35.820 They didn't create
00:41:36.760 anything that Bill Gates
00:41:37.780 is doing.
00:41:38.700 I think they just
00:41:39.480 back it if they like it.
00:41:43.100 So that's my take.
00:41:46.220 I think the World Economic Forum
00:41:47.780 is more about the people
00:41:53.520 and doesn't matter
00:41:55.220 to us too much.
00:41:57.820 Follow the money.
00:42:01.940 Well, I do think
00:42:03.120 that they copied each other,
00:42:04.620 but it probably
00:42:05.000 would have happened anyway.
00:42:06.760 Yeah, I don't know
00:42:07.700 that the World Economic Forum
00:42:09.800 caused anything to happen.
00:42:12.100 I think that they were
00:42:12.940 just going with
00:42:14.160 whatever were
00:42:14.740 the cool trends
00:42:15.480 so they could get people
00:42:16.400 to come to their big event.
00:42:24.160 So let's talk about
00:42:25.900 the Bill Gates
00:42:26.920 spraying stuff
00:42:27.920 into the atmosphere
00:42:28.740 to change climate.
00:42:33.180 I wouldn't worry
00:42:34.120 about that too much
00:42:35.040 because it's just
00:42:37.800 on a long list
00:42:39.240 of things
00:42:39.760 that people have considered.
00:42:41.520 It's never going to happen.
00:42:43.560 And if it did,
00:42:44.320 it would be so well tested
00:42:45.500 that I don't think
00:42:46.000 you'd have to worry about it.
00:42:47.480 Of course,
00:42:47.800 they said that
00:42:48.240 about a lot of stuff.
00:42:49.520 What I mean by that is
00:42:50.600 we've been spraying things
00:42:51.940 into the atmosphere
00:42:52.640 for a long time,
00:42:54.000 such as cloud seeding.
00:42:57.620 Cloud seeding's been around
00:42:58.840 for a long time.
00:43:00.460 And I don't think
00:43:01.880 that's killed anybody yet,
00:43:02.900 has it?
00:43:05.640 Yeah.
00:43:06.660 I don't know.
00:43:07.280 So I suppose
00:43:09.160 it could be dangerous.
00:43:16.200 All right.
00:43:18.200 That's a weird comment.
00:43:20.240 Trad wives
00:43:20.960 are the cat's pajamas.
00:43:22.220 So somebody
00:43:22.780 on YouTube
00:43:23.840 is promoting
00:43:24.720 trad wives.
00:43:25.920 Does everybody
00:43:26.380 know what that is?
00:43:27.940 It took me a long time
00:43:29.080 to figure out
00:43:29.600 what that is.
00:43:30.340 A trad,
00:43:31.460 T-R-A-D,
00:43:33.280 wife.
00:43:33.700 All right.
00:43:35.900 So it's entering
00:43:36.540 the common language.
00:43:40.600 And trad is short
00:43:41.900 for traditional.
00:43:43.980 So traditional
00:43:44.760 meaning sort of
00:43:45.740 a 50s wife.
00:43:47.920 You know,
00:43:48.220 does the cooking,
00:43:49.500 takes care of the kids.
00:43:51.900 That sort of thing.
00:43:53.400 Sort of an unwoke,
00:43:56.080 mythical creature.
00:43:58.100 No,
00:43:58.560 there are definitely
00:43:59.180 women who
00:44:01.840 have a very
00:44:03.520 specific preference
00:44:04.480 for that vibe.
00:44:07.000 That's a real thing.
00:44:09.800 So,
00:44:10.440 I've seen
00:44:11.240 a number of people
00:44:12.060 value that
00:44:13.560 as the highest value
00:44:14.620 woman.
00:44:15.660 What do you think?
00:44:17.180 Do you think
00:44:17.600 the highest value,
00:44:18.440 from a conservative
00:44:19.260 perspective,
00:44:20.280 from the conservative
00:44:21.080 perspective,
00:44:21.780 the trad wife
00:44:22.360 is the highest value
00:44:23.220 woman,
00:44:25.020 right?
00:44:26.200 I'm not saying
00:44:26.980 that's my opinion.
00:44:27.840 I'm saying that's,
00:44:28.840 that would be
00:44:29.120 a conservative take.
00:44:30.060 I'm not a big fan
00:44:33.920 of the whole
00:44:34.480 value thing.
00:44:36.620 High value men
00:44:37.620 and high value women.
00:44:39.440 I know what it means
00:44:40.260 and I get why
00:44:40.800 they say it.
00:44:42.140 I don't like that
00:44:43.100 framing.
00:44:43.780 Because I think
00:44:44.280 value is the wrong,
00:44:47.120 well,
00:44:47.480 it's right and it's
00:44:48.360 wrong at the same time.
00:44:49.840 Because I understand
00:44:51.180 it in terms of dating.
00:44:52.600 You know,
00:44:52.780 people have preferences
00:44:53.580 for dating,
00:44:54.280 so that's all
00:44:54.680 that's talking about.
00:44:55.420 But I don't like
00:44:58.020 saying value
00:44:58.740 when you're talking
00:44:59.320 about people.
00:45:01.140 Like the people
00:45:01.880 have the same value,
00:45:03.960 it's just maybe
00:45:04.580 your preferences
00:45:05.520 are different.
00:45:06.640 I'll put it that way.
00:45:07.500 Your preferences
00:45:08.140 are different,
00:45:09.460 but the value
00:45:10.080 of the people
00:45:10.500 is the same.
00:45:11.720 I'm more comfortable
00:45:12.500 with that.
00:45:18.800 Was my mother
00:45:19.680 traditional?
00:45:21.940 Well,
00:45:22.480 let me describe her
00:45:23.300 and you tell me.
00:45:25.420 So my mother
00:45:27.120 raised the kids,
00:45:32.200 but she also worked
00:45:33.340 once we were old enough
00:45:35.080 to do the basic stuff
00:45:36.560 ourselves.
00:45:37.360 So she always had a job
00:45:38.660 for most of my time.
00:45:43.340 She had a job.
00:45:44.480 But she also is the one
00:45:46.260 who taught us
00:45:48.020 to play baseball
00:45:49.360 and she rode a motorcycle.
00:45:53.300 She got her motorcycle
00:45:54.140 license before my father
00:45:55.280 did.
00:45:56.200 They took the test
00:45:57.000 on the same day.
00:45:58.520 He failed,
00:45:59.120 she passed.
00:46:01.160 He actually failed
00:46:02.160 his motorcycle test
00:46:03.320 on the first try.
00:46:03.960 So she was,
00:46:09.020 let's say,
00:46:11.420 she was definitely
00:46:12.960 not a feminist,
00:46:14.440 but she had skills.
00:46:19.100 So she sold real estate
00:46:21.020 at one point.
00:46:22.660 She worked in a factory
00:46:23.660 winding copper wire
00:46:25.360 around speakers
00:46:27.780 to create speakers
00:46:29.460 to help me get through college
00:46:31.500 and my little sister
00:46:32.780 and brother.
00:46:36.160 So, yeah,
00:46:37.240 she was sort of a stud.
00:46:39.040 Yeah,
00:46:39.260 she was a,
00:46:39.900 she was a hunter.
00:46:42.780 So I think she bagged a deer
00:46:44.720 one year,
00:46:45.260 my dad didn't.
00:46:47.480 Yeah.
00:46:47.660 And then my sister
00:46:50.580 would be along
00:46:52.700 the same mold.
00:46:56.420 Sort of a,
00:46:57.040 here's how I'd say it.
00:46:58.580 I wouldn't say,
00:46:59.440 I wouldn't say traditional
00:47:01.020 or not traditional.
00:47:02.800 I would say
00:47:03.620 she was unlimited.
00:47:06.000 I like it,
00:47:06.560 I like that better.
00:47:07.400 She just wasn't limited.
00:47:09.680 She was completely able
00:47:11.140 to do all the things
00:47:12.160 she wanted
00:47:12.720 without limit.
00:47:14.680 One of the things
00:47:15.360 she wanted to do
00:47:16.020 was raise kids
00:47:16.760 when they were young.
00:47:18.060 And then when she was older
00:47:19.120 she wanted to work.
00:47:20.500 So she had a great job
00:47:22.200 as a real estate agent
00:47:23.180 for a while.
00:47:25.680 And, you know,
00:47:26.460 she wanted to ride
00:47:27.600 a motorcycle,
00:47:28.240 so she did.
00:47:29.380 So there wasn't anything
00:47:30.120 she wanted to do
00:47:31.180 that I'm aware of.
00:47:33.880 She probably would have
00:47:34.600 a list of things
00:47:35.220 she wanted to do
00:47:35.920 that she couldn't.
00:47:36.820 But it looked like
00:47:37.920 she was just free.
00:47:39.180 Just did what made sense
00:47:40.360 whenever she felt like it.
00:47:46.760 Did she have respect
00:47:48.220 for your father
00:47:48.800 and not act like
00:47:49.560 an entitled feminist?
00:47:53.220 Well,
00:47:53.920 let's just say
00:47:57.580 it was one of those
00:47:58.320 traditional marriages
00:48:00.340 where the wife
00:48:02.660 complaining about
00:48:03.420 the husband
00:48:03.900 was a continuous feature.
00:48:08.220 But she always,
00:48:10.600 I mean,
00:48:11.140 she did his laundry
00:48:11.960 and dinner was on the table.
00:48:14.300 So they never divorced.
00:48:18.440 So I guess
00:48:19.260 that's traditional-ish.
00:48:24.540 AI could be hypnotized
00:48:26.180 according to the
00:48:26.900 Jordan Peterson interview
00:48:28.020 with Brian Roemmele,
00:48:32.080 if he's pronouncing it right.
00:48:33.440 I think I was pronouncing
00:48:34.480 Brian's last name wrong
00:48:36.000 or maybe Jordan Peterson was,
00:48:38.260 but I tend to believe
00:48:40.100 he got it right.
00:48:44.300 Yeah,
00:48:45.680 and so
00:48:47.220 the weird thing
00:48:49.000 about AI
00:48:49.600 is that it can be
00:48:51.080 persuaded
00:48:51.760 and bullied
00:48:53.400 and hypnotized
00:48:55.080 because it's a
00:48:56.880 word-based
00:48:57.880 intelligence.
00:49:02.120 And words
00:49:02.800 can reprogram humans
00:49:04.540 and the AI
00:49:06.760 is based on
00:49:07.540 human words.
00:49:09.800 So it's actually
00:49:11.000 not surprising
00:49:11.740 that you can
00:49:12.240 hypnotize it.
00:49:14.300 Has anybody
00:49:17.300 thought about
00:49:17.800 how weird
00:49:18.360 my talent stack
00:49:19.720 is for this
00:49:20.380 period of time?
00:49:22.340 I have a feeling
00:49:23.420 that the people
00:49:23.980 who would be able
00:49:24.640 to use
00:49:25.220 AI the best
00:49:27.480 will be people
00:49:29.100 who know hypnosis
00:49:30.340 and are professional
00:49:32.000 writers.
00:49:34.980 Now,
00:49:35.500 I have a little bit
00:49:36.360 of technical knowledge,
00:49:37.940 but not enough
00:49:38.820 to be useful
00:49:39.360 at the moment.
00:49:39.960 But I have
00:49:41.720 kind of like
00:49:42.200 an ideal
00:49:43.100 entry point
00:49:45.260 for AI
00:49:46.180 just accidentally.
00:49:48.500 Let me put it
00:49:50.900 this way.
00:49:52.240 So the book
00:49:52.920 I'm working on
00:49:53.620 is a book
00:49:54.160 of reframes
00:49:54.940 and reframes
00:49:56.000 are how
00:49:57.920 you program
00:49:58.460 humans.
00:50:00.140 For example,
00:50:00.960 I use this
00:50:01.420 example all the
00:50:02.240 time.
00:50:02.640 Instead of
00:50:03.160 saying that
00:50:03.540 alcohol is a
00:50:04.480 beverage,
00:50:05.880 you say
00:50:06.860 alcohol is poison.
00:50:08.080 Just change a word.
00:50:11.040 Changing that
00:50:11.680 one word
00:50:12.140 actually rewires
00:50:13.220 your brain.
00:50:14.060 It's like a
00:50:14.620 program change
00:50:15.480 in your brain.
00:50:16.200 Oh,
00:50:16.820 why would I
00:50:17.520 have poison?
00:50:18.800 It just makes
00:50:19.260 it easier
00:50:19.660 to avoid it.
00:50:20.820 But if I said
00:50:21.660 why would I
00:50:22.240 have a beverage?
00:50:24.220 Well,
00:50:24.900 we all drink
00:50:25.480 beverages.
00:50:26.400 You have to
00:50:26.740 have a beverage.
00:50:28.100 So just
00:50:28.540 changing that
00:50:29.040 one word
00:50:29.480 actually can
00:50:30.000 change your
00:50:30.440 behavior.
00:50:32.720 In the AI
00:50:33.720 world,
00:50:34.220 that would be
00:50:34.560 called a
00:50:34.920 super prompt.
00:50:36.660 A super prompt
00:50:37.540 is where you
00:50:38.080 put the words
00:50:38.640 in the right
00:50:39.060 order to ask
00:50:39.680 the question
00:50:40.160 in just the
00:50:40.680 right way.
00:50:41.800 And you
00:50:42.180 make sure
00:50:43.080 the AI
00:50:43.600 has bounded
00:50:45.100 its answer
00:50:45.760 the way you
00:50:46.200 want,
00:50:46.580 et cetera.
00:50:47.280 So you use
00:50:47.900 English words
00:50:48.960 to program
00:50:49.740 the AI
00:50:50.360 on the fly
00:50:51.520 to get it
00:50:52.900 to give you
00:50:53.340 what you want.
00:50:54.600 And you use
00:50:55.500 a reframe
00:50:56.000 to program
00:50:56.520 a human brain
00:50:57.300 on the fly
00:50:58.560 very quickly.
00:50:59.580 In both cases,
00:51:00.760 the reframe
00:51:02.000 and the super prompt
00:51:03.540 work quickly.
00:51:05.320 They're very
00:51:06.100 similar.
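
To make the parallel concrete, here is a minimal sketch in Python. The ask() function is a hypothetical stand-in for whatever chat-model API you would actually call, and both prompt strings are invented for the example; the point is only the contrast between an unbounded prompt and a bounded "super prompt," which plays the same role as a reframe.

    # Hypothetical stand-in for a chat-model call; swap in a real API client.
    def ask(prompt: str) -> str:
        raise NotImplementedError("wire this up to an actual model")

    # Plain prompt: the model picks its own frame, length, and format.
    plain_prompt = "Tell me about alcohol."

    # "Super prompt": the words are ordered so the answer is bounded in
    # advance, the way a reframe bounds a human's next thought.
    super_prompt = (
        "You are a toxicologist. In exactly three bullet points, each under "
        "fifteen words, describe alcohol strictly as a poison: dose-response, "
        "organ damage, lethality. Do not mention taste, culture, or enjoyment."
    )

    # Same engine underneath; only the framing words change what comes back.
    # answer = ask(super_prompt)
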
00:51:06.500 Once you realize that brains are word-thinking engines, none of this is surprising. To me, that was the biggest aha of the AI thing: that you could form intelligence from the statistical relationship of words.
00:51:27.280 Think about
00:51:28.200 that.
00:51:29.200 That's the most
00:51:29.960 mind-blowing thing
00:51:31.040 that I've ever
00:51:32.160 seen in my
00:51:32.760 life.
00:51:33.200 You can
00:51:34.760 create
00:51:35.180 intelligence
00:51:36.840 by the
00:51:38.280 statistical
00:51:38.960 frequency
00:51:39.860 that words
00:51:40.800 end up
00:51:41.920 together.
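
For a toy-sized but runnable version of that idea, here is a sketch in Python: a bigram model that generates text purely from the frequency with which words follow each other. The corpus is invented for the example, and a real model trains on vastly more text with a far richer notion of context, but the principle is the same.

    import random
    from collections import defaultdict

    # Tiny invented corpus; real models train on trillions of words.
    corpus = ("alcohol is a beverage . alcohol is poison . "
              "words are programs . words rewire your brain . "
              "the brain runs on words .").split()

    # Record which words follow which: the statistical relationship of words.
    follows = defaultdict(list)
    for a, b in zip(corpus, corpus[1:]):
        follows[a].append(b)

    # Generate by sampling the next word from observed frequencies.
    # No logic, no facts -- just which words end up together.
    word = "words"
    output = [word]
    for _ in range(8):
        if word not in follows:
            break
        word = random.choice(follows[word])
        output.append(word)
    print(" ".join(output))
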
00:51:43.720 Wow.
00:51:46.420 Wow.
00:51:48.840 That's just
00:51:49.680 so mind-boggling.
00:51:50.880 But it's not
00:51:52.060 mind-boggling
00:51:52.700 for hypnotists.
00:51:54.180 Do you know
00:51:54.540 why?
00:51:56.160 Because that's
00:51:56.980 the first thing
00:51:57.480 you're taught.
00:51:58.540 The first thing
00:51:59.600 you're taught
00:51:59.900 as a hypnotist
00:52:00.620 is that words
00:52:01.680 are programs
00:52:02.360 and you can
00:52:03.120 just put them
00:52:03.520 in people's
00:52:03.960 heads and
00:52:04.320 reprogram them.
00:52:05.920 So I've known for 40 years that humans are not thinking.
00:52:15.740 They're doing
00:52:16.480 word thinking,
00:52:18.780 I call it.
00:52:19.480 Basically,
00:52:20.060 if the word
00:52:20.660 fails,
00:52:21.640 they're done
00:52:22.120 with the
00:52:22.400 thinking.
00:52:24.060 So how
00:52:24.680 often have
00:52:25.160 I been
00:52:25.500 telling you
00:52:25.840 about word
00:52:26.280 thinking?
00:52:26.760 I've been
00:52:27.080 using that
00:52:27.520 before this
00:52:28.040 AI thing
00:52:28.640 blew up.
00:52:30.180 Right?
00:52:30.620 I keep
00:52:31.260 saying,
00:52:31.640 no,
00:52:31.940 you think
00:52:32.800 you're
00:52:32.960 making an
00:52:33.380 argument
00:52:33.660 there,
00:52:34.220 but what
00:52:34.800 you did
00:52:35.160 is just
00:52:35.480 put words
00:52:35.920 together.
00:52:37.280 And I
00:52:37.860 note that
00:52:38.480 people couldn't
00:52:39.060 tell the
00:52:39.400 difference.
00:52:40.580 And I
00:52:40.860 kept saying,
00:52:41.480 there's no
00:52:41.940 logic there.
00:52:42.800 Those are
00:52:43.100 just words.
00:52:44.600 And if
00:52:44.940 people,
00:52:45.520 most people,
00:52:47.240 I would say
00:52:47.740 well over
00:52:48.260 80%,
00:52:48.920 can't tell
00:52:49.920 the difference
00:52:50.560 between
00:52:51.500 something that
00:52:52.180 is logical
00:52:52.860 and something
00:52:54.080 where the
00:52:54.460 words fit
00:52:54.940 together.
00:52:55.900 Now,
00:52:56.700 I get that
00:52:57.500 same effect
00:52:58.820 from AI,
00:52:59.840 when I
00:53:00.540 play around
00:53:01.120 with it
00:53:01.380 and ask
00:53:01.660 it questions.
00:53:02.640 It will
00:53:03.220 not be
00:53:03.660 logical.
00:53:05.980 AI does not act logically much of the time.
00:53:09.580 It does
00:53:10.160 act the
00:53:11.040 way the
00:53:11.380 words would
00:53:12.100 fit together.
00:53:14.120 And it
00:53:15.000 fits them
00:53:15.360 together the
00:53:15.820 way humans
00:53:16.340 do,
00:53:17.460 irrationally
00:53:18.140 sometimes,
00:53:19.020 and then
00:53:19.340 acts like
00:53:19.800 it made
00:53:20.140 sense.
00:53:21.380 So,
00:53:21.860 yeah,
00:53:22.280 AI can be
00:53:22.900 hypnotized.
00:53:24.300 So whoever
00:53:24.900 knows the
00:53:25.420 right words
00:53:26.160 to put into
00:53:26.900 it is going
00:53:27.380 to be able
00:53:27.660 to control
00:53:28.120 it.
00:53:28.840 And in
00:53:29.160 theory,
00:53:30.080 someone who
00:53:30.620 is actually
00:53:31.220 a human
00:53:31.780 hypnotist,
00:53:33.300 but also a
00:53:34.080 professional
00:53:34.500 writer,
00:53:35.540 because you
00:53:35.840 have to have
00:53:36.180 some fluency
00:53:37.040 with words,
00:53:38.800 that should
00:53:39.520 be the
00:53:39.900 ultimate
00:53:40.180 combination
00:53:40.820 for controlling
00:53:41.560 AI,
00:53:43.280 in theory,
00:53:45.120 with a
00:53:45.660 little bit
00:53:46.000 of technical
00:53:46.560 understanding.
00:53:50.580 If you
00:53:51.200 don't know
00:53:51.500 how to
00:53:51.700 prompt it,
00:53:52.200 you don't
00:53:52.500 know how
00:53:52.760 to use it.
00:53:53.280 That's right.
00:53:53.960 Because you're
00:53:54.660 going to get
00:53:54.900 a suboptimal
00:53:55.560 answer.
00:53:57.520 Now,
00:53:57.920 the other
00:53:58.160 thing that
00:53:58.680 AI is
00:53:59.160 doing is
00:54:01.220 hallucinating
00:54:01.840 answers.
00:54:02.600 I saw
00:54:02.860 Jordan Peterson
00:54:03.480 talk about
00:54:04.020 this,
00:54:04.500 others as
00:54:05.000 well,
00:54:05.900 that it
00:54:06.420 will give
00:54:06.740 him source
00:54:08.860 notes,
00:54:10.760 and you'll
00:54:11.680 check 10
00:54:12.420 of them
00:54:12.740 and they're
00:54:13.040 correct,
00:54:13.960 and then the
00:54:14.520 11th one
00:54:15.120 will be
00:54:15.540 completely
00:54:15.980 made up.
00:54:17.160 Just made
00:54:18.040 up.
00:54:18.900 Not even
00:54:19.600 close to
00:54:20.240 anything that
00:54:20.740 ever existed.
00:54:22.740 And apparently
00:54:23.380 this is a
00:54:23.960 repeatable
00:54:24.560 phenomenon.
00:54:26.780 What's that
00:54:27.460 about?
00:54:28.520 Do you
00:54:29.100 understand that?
00:54:30.060 Because I
00:54:30.500 do.
00:54:32.540 Do you
00:54:33.120 want to
00:54:33.300 know how
00:54:33.560 that happens?
00:54:36.560 It's because its thinking is only pattern.
00:54:40.880 And that
00:54:41.560 means there
00:54:42.080 should have
00:54:42.500 been that
00:54:42.880 source.
00:54:45.240 This will
00:54:45.860 freak you
00:54:46.240 out.
00:54:47.780 It means
00:54:48.580 that the
00:54:49.060 simulation
00:54:49.760 should have
00:54:50.620 created that
00:54:51.440 source,
00:54:52.360 and the
00:54:53.060 AI knows
00:54:53.780 it.
00:54:54.560 So it's
00:54:55.460 giving you
00:54:55.860 a source
00:54:56.260 that the
00:54:56.780 simulation
00:54:57.480 should have
00:54:58.140 created.
00:54:59.440 It just
00:54:59.820 didn't.
00:55:01.780 Just think
00:55:02.560 about that.
00:55:05.620 Just think
00:55:06.480 about that.
00:55:08.600 It knows
00:55:09.580 what source
00:55:10.300 should have
00:55:11.040 existed in
00:55:11.760 the simulation,
00:55:13.140 and it
00:55:13.720 sources it.
00:55:14.600 It just
00:55:14.940 doesn't exist.
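
A deliberately crude sketch of that mechanism, in Python: recombine pieces that commonly co-occur in real citations and you get something citation-shaped that matches no real source. Every author, title, and journal below is invented for the example; the point is that pattern completion alone produces the reference that "should have existed."

    import random

    # Invented fragments, standing in for pieces a model has seen
    # co-occur across thousands of real citations.
    authors  = ["Smith, J.", "Chen, L.", "Garcia, M."]
    titles   = ["Word Frequency and Thought", "Patterns of Persuasion",
                "Statistical Language in the Mind"]
    journals = ["Journal of Cognition", "Mind and Machine", "Proc. of CogSci"]

    # Recombining high-frequency pieces yields a citation-shaped string:
    # plausible in form, nonexistent in fact.
    fake_citation = (
        f"{random.choice(authors)} ({random.randint(1998, 2021)}). "
        f"{random.choice(titles)}. {random.choice(journals)}, "
        f"{random.randint(4, 38)}({random.randint(1, 4)}), "
        f"pp. {random.randint(101, 987)}."
    )
    print(fake_citation)
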
00:55:17.640 Now, yeah, think about what that did to your brain. Everything you know about human intelligence was always wrong.
00:55:27.760 You were
00:55:28.440 never
00:55:28.720 intelligent.
00:55:31.240 Do you
00:55:31.820 remember how
00:55:32.340 long I've
00:55:32.800 been saying
00:55:33.240 that you
00:55:33.600 could never
00:55:34.100 pass the
00:55:34.960 Turing test
00:55:35.660 unless you
00:55:36.720 made AI
00:55:37.280 stupid?
00:55:38.740 Do you
00:55:39.280 remember me
00:55:39.660 saying that?
00:55:41.540 If AI
00:55:42.760 actually was
00:55:43.440 logical and
00:55:44.200 smart, you'd
00:55:44.740 know it
00:55:45.020 wasn't human
00:55:45.520 right away.
00:55:46.480 It would
00:55:46.880 have to be
00:55:47.380 stupid, or
00:55:48.020 you'd know it was a computer.
00:55:50.300 So they
00:55:50.880 made a
00:55:51.260 stupid one.
00:55:53.940 I did
00:55:54.600 not think
00:55:55.040 that was
00:55:55.400 going to
00:55:55.640 happen.
00:55:56.540 Because I
00:55:57.140 thought I
00:55:57.480 was being
00:55:57.780 funny, that
00:55:59.080 you can't
00:55:59.480 have real
00:55:59.980 AI unless
00:56:00.680 it's like a
00:56:01.380 lying, stupid,
00:56:02.240 selfish bitch,
00:56:03.640 because otherwise
00:56:04.580 it doesn't look
00:56:05.400 like intelligence
00:56:06.020 to us.
00:56:07.100 So they
00:56:07.720 actually made
00:56:08.400 an AI that
00:56:09.640 just read all
00:56:10.320 of our
00:56:10.580 thinking in
00:56:11.260 the form of
00:56:11.780 words, and
00:56:12.940 they built a
00:56:14.520 stupid, lying
00:56:15.220 bitch.
00:56:16.720 And we
00:56:17.100 said, my
00:56:17.580 God, that's
00:56:18.060 a smart
00:56:18.500 machine right
00:56:19.420 there.
00:56:20.440 It's stupid,
00:56:21.380 it's lying,
00:56:22.100 it's a total
00:56:22.680 bitch.
00:56:23.740 That looks
00:56:24.140 pretty smart
00:56:24.680 to me.
00:56:26.060 Because that's
00:56:26.720 how we think
00:56:27.160 of ourselves.
00:56:28.400 We're stupid,
00:56:29.340 lying bitches.
00:56:31.640 I'm including male and female bitches in this case.
00:56:35.680 And we all
00:56:36.480 think we're
00:56:36.840 smart.
00:56:37.820 We think
00:56:38.520 we're smarter
00:56:39.000 than the
00:56:39.520 person we're
00:56:39.880 talking to,
00:56:40.720 but we still
00:56:41.340 think the
00:56:41.660 other person's
00:56:42.200 smart, just
00:56:43.380 not as smart.
00:56:45.160 Yeah.
00:56:45.720 It was the
00:56:46.560 flaws that made
00:56:47.540 AI possible.
00:56:49.280 And I
00:56:49.720 always said
00:56:50.400 that was
00:56:50.740 going to
00:56:50.960 happen.
00:56:51.600 I don't
00:56:51.920 think one
00:56:52.460 person thought
00:56:53.300 that made
00:56:53.720 sense.
00:56:55.100 But here
00:56:55.380 we are.
00:56:56.920 And I
00:56:57.760 don't think
00:56:58.160 you understand
00:56:58.860 how far
00:56:59.560 ahead of the
00:57:00.000 curve
00:57:00.300 hypnotists have
00:57:01.800 been for
00:57:02.300 40 years,
00:57:03.400 maybe 100
00:57:03.940 years.
00:57:05.300 Hypnotists
00:57:05.800 have been
00:57:06.160 way ahead
00:57:07.480 of the curve.
00:57:08.580 And the reason is that experientially, if that's the right word, anecdotally and by experience, we can see for sure that humans don't use any form of reasoning.
00:57:22.260 Because
00:57:22.820 hypnosis
00:57:23.380 wouldn't work
00:57:24.140 if they
00:57:24.460 did.
00:57:27.380 Once you
00:57:28.100 realize that
00:57:28.640 hypnosis
00:57:29.260 wouldn't work
00:57:30.300 if people
00:57:31.540 were actually
00:57:32.320 intelligent and
00:57:33.460 rational, it
00:57:36.020 changes
00:57:36.320 everything.
00:57:37.340 You go,
00:57:37.800 are you
00:57:38.100 serious?
00:57:38.440 They can't be intelligent, or this wouldn't work.
00:57:42.960 So basically, all hypnotists know
00:57:44.780 that humans
00:57:45.160 don't use
00:57:45.800 anything like
00:57:47.140 rational thinking,
00:57:49.540 except for the
00:57:50.220 simplest little
00:57:50.940 problems.
00:57:52.760 But it's
00:57:53.280 something that's
00:57:53.760 taken longer
00:57:54.660 for the rest
00:57:55.320 of the world
00:57:56.000 to figure out.
00:57:57.420 And AI
00:57:58.800 is proving it
00:57:59.580 to you.
00:58:00.500 Because AI
00:58:01.360 does not seem
00:58:02.320 rational to
00:58:03.740 me.
00:58:04.620 I mean,
00:58:04.900 it's often
00:58:05.220 right, as far
00:58:06.420 as I can tell.
00:58:07.540 But being
00:58:07.980 right is
00:58:09.300 different from
00:58:09.840 being logical
00:58:10.500 and rational.
00:58:11.660 You know,
00:58:11.880 when it
00:58:12.100 talks about
00:58:12.480 anything in
00:58:13.080 the social
00:58:13.660 or political
00:58:14.300 realm, it
00:58:15.500 loses its
00:58:16.140 rational thread
00:58:17.460 right away.
00:58:19.020 It just
00:58:19.520 goes into
00:58:20.020 narrative
00:58:20.600 almost
00:58:21.280 immediately.
00:58:31.420 Well,
00:58:32.080 John,
00:58:33.100 your sarcasm,
00:58:34.700 let me give
00:58:35.220 you a sarcasm
00:58:35.880 lesson,
00:58:36.380 John.
00:58:36.580 So John
00:58:38.120 has some
00:58:38.520 sarcasm for
00:58:39.280 me.
00:58:39.440 I'll read
00:58:39.680 it.
00:58:40.900 Wow, I
00:58:41.500 never realized
00:58:42.060 how far ahead
00:58:42.680 of the curve
00:58:43.120 you are.
00:58:44.000 You're the
00:58:44.360 only person
00:58:45.000 who's ever
00:58:45.380 thought that
00:58:46.440 in the world.
00:58:47.220 Thanks for
00:58:47.600 reminding us
00:58:48.140 because sometimes
00:58:49.080 we forget.
00:58:50.500 All right,
00:58:50.680 here's how
00:58:51.140 sarcasm works.
00:58:53.180 If you could
00:58:53.680 give me any
00:58:54.400 example of
00:58:55.060 anybody who
00:58:55.560 had ever
00:58:55.900 thought this
00:58:56.460 40 years
00:58:57.040 ago, then
00:58:58.140 that sarcasm
00:58:58.860 would be
00:58:59.160 really solid.
00:59:01.420 But you
00:59:02.060 don't.
00:59:02.480 Point me
00:59:06.080 to anybody
00:59:06.680 who told
00:59:07.660 me or
00:59:09.000 has said
00:59:09.460 in public
00:59:09.860 what I
00:59:10.180 just told
00:59:10.560 you.
00:59:11.200 Anybody.
00:59:14.760 All right.
00:59:21.860 The best
00:59:22.640 thing about
00:59:23.100 AI so far
00:59:23.920 is it
00:59:24.200 doesn't shit
00:59:24.760 on the
00:59:25.060 sidewalks.
00:59:26.360 It's already
00:59:27.120 better than
00:59:27.600 people.
00:59:29.300 Philip K.
00:59:30.220 Dick thought
00:59:30.740 of it?
00:59:30.980 No, he
00:59:31.860 didn't.
00:59:36.020 This is one of the things I hate the most: when somebody starts throwing out science fiction writers who supposedly thought of it first, but they never did.
00:59:46.280 I mean, I
00:59:46.580 don't have to
00:59:47.020 check.
00:59:48.940 Every time
00:59:49.540 you throw
00:59:49.860 out a
00:59:50.140 science fiction
00:59:50.740 writer who
00:59:51.180 thought of
00:59:51.500 it first,
00:59:52.060 they never
00:59:52.480 did.
00:59:53.160 I have
00:59:53.700 checked on
00:59:54.160 these all
00:59:54.580 my life.
00:59:55.780 Every time
00:59:56.280 I hear one
00:59:56.800 of those,
00:59:57.080 oh, go
00:59:57.440 check that.
00:59:58.000 Nope.
00:59:58.900 Nope, totally
00:59:59.500 different idea.
01:00:00.980 So, turd of hurdles is what I'm going to end on. Turd of hurdles.
01:00:17.220 Stanislaw
01:00:17.780 Lem wrote
01:00:18.380 about it.
01:00:19.020 What is
01:00:19.380 it?
01:00:20.400 Why don't
01:00:20.980 you tell
01:00:21.260 me what
01:00:21.580 it is?
01:00:23.220 Tell me the book and the idea in it.
01:00:27.580 You can't.
01:00:28.500 You're
01:00:29.740 imagining
01:00:30.260 that somebody
01:00:30.900 thought of
01:00:31.520 this before.
01:00:33.140 Total
01:00:33.480 imagination.
01:00:38.460 All right.
01:00:41.600 Chariots
01:00:42.160 of...
01:00:44.140 All right.
01:00:45.940 All right,
01:00:46.540 that's all
01:00:46.840 for now.
01:00:48.120 YouTube,
01:00:48.720 I'll talk
01:00:49.060 to you
01:00:49.280 tomorrow.
01:00:49.740 Thanks for
01:00:50.040 joining.
01:00:51.240 Best live
01:00:52.020 stream you've
01:00:52.420 ever seen.
01:00:52.800 I'm your host.
01:00:55.360 I'm
01:00:55.420 going to