Real Coffee with Scott Adams - April 17, 2023


Episode 2081 Scott Adams: Musk On AI Risk, Growth Mindset Solves Poverty, Bribing Doctors, Bad Bing


Episode Stats

Length: 55 minutes
Words per Minute: 143.3
Word Count: 7,931
Sentence Count: 592
Misogynist Sentences: 7
Hate Speech Sentences: 19


Summary

In this episode of Coffee with Scott Adams, I discuss why you shouldn't be holding Apple (AAPL) stock in the future, and why you should be holding Microsoft (MSFT) or another tech company like Amazon (AMZN).


Transcript

00:00:00.880 Good morning, everybody, and welcome to the best thing that's ever happened to you in your entire life.
00:00:07.100 It's called Coffee with Scott Adams, and there's never been a better time in the history of the universe,
00:00:13.040 which we used to think was 14 billion years old or so.
00:00:16.840 But thanks to the new telescopes, we're probably wrong about all of that, it turns out.
00:00:23.660 But if you'd like to take your experience beyond the Big Bang, bigger than the Big Bang,
00:00:29.020 all you need is a cup or a mug or a glass, a tank or a chalice, and a canteen jug or a flask, a vessel of any kind,
00:00:36.340 fill it with your favorite liquid. I like coffee.
00:00:38.880 And join me now for the unparalleled pleasure of the dopamine hit of the day, the thing that makes everything better.
00:00:44.400 It's called the simultaneous sip, and it happens right now. Go.
00:00:52.280 Oh, that was good.
00:00:53.660 So, a couple of quick things on the subscription site Locals, where I have a site.
00:01:05.960 That's the only place you can find the Dilbert Reborn comic, and Robots Read News comic,
00:01:12.400 and all of my live streams, including the special Man Cave.
00:01:18.040 But, to make it even more special, I have recently put both my books, God's Debris, for free,
00:01:26.000 well, if you're a subscriber, on the Locals site.
00:01:29.660 It's scottadams.locals.com.
00:01:32.680 And The Religion War.
00:01:35.980 So this is the first time you'll see that in electronic form.
00:01:39.260 That's never been published as an e-book.
00:01:41.060 So, you can see the two books that I wrote in the early 2000s,
00:01:47.200 but when you read God's Debris, note that it was written or published in 2004.
00:01:53.300 And just think about what I guessed things would look like today, based on 2004.
00:01:59.940 It's going to be a little freaky.
00:02:01.940 Yes, it will.
00:02:03.180 But those are there, if you were thinking about trying the site, but wanted to read those books,
00:02:06.880 you can always sign up for a month, read the books, see what you think,
00:02:11.180 and you get two books for the price of $7.
00:02:15.080 Quite a deal.
00:02:16.320 Especially since one of those books is called The Best Book Ever Written.
00:02:22.320 People actually say that about that.
00:02:24.220 This is the best book ever written.
00:02:27.900 So, good luck with that.
00:02:29.300 All right.
00:02:31.020 As you know, I do not make financial advice.
00:02:35.580 You know that, right?
00:02:36.880 I do not give financial advice.
00:02:39.360 And if you were to follow my example in what I'm going to describe next,
00:02:44.460 it probably would be a big mistake because I'm just guessing, right?
00:02:49.840 So, this is just guessing.
00:02:51.440 Not investing, just guessing.
00:02:54.680 Just before I signed on, I sold all of my Apple stock.
00:02:59.860 Here's why.
00:03:01.840 Have you tried Siri lately?
00:03:03.460 Seriously?
00:03:06.980 I mean, seriously.
00:03:10.780 In the realm of AI, how in the world is Apple's business model going to survive?
00:03:17.120 Because I don't see us using apps in the future, do you?
00:03:20.340 I mean, not the current group.
00:03:22.120 I feel like your phone is going to turn into an AI interface and you just tell it what you want.
00:03:30.700 So, Apple's dominant position in everything, I think, is garbage because all they have is Siri.
00:03:39.320 I know they're working on some kind of AI, but we haven't seen anything, have we?
00:03:44.140 We haven't seen anything.
00:03:45.500 So, unless Apple throws away everything they have and introduces a new AI phone that changes everything,
00:03:52.940 they got kind of a challenge ahead of them.
00:03:54.780 So, I didn't want to be holding the stock when AI makes their business model sketchy.
00:04:03.240 Now, here's why you shouldn't follow my advice.
00:04:07.060 All right?
00:04:07.520 This is really important.
00:04:09.540 Apple has a very long history of not falling into potholes that other people fall into.
00:04:14.620 So, the counter-argument is really strong.
00:04:18.720 The counter-argument is, well, they know how to navigate this stuff.
00:04:23.060 That's sort of what makes them Apple and you not Apple, is they know how to do this.
00:04:27.400 And presumably, they have huge resources already looking at AI from every angle and how to use it, etc.
00:04:34.640 So, they might be fine.
00:04:35.860 But, at the moment, I feel like guessing winners and losers in this context is less safe, it always was less safe, than holding index funds.
00:04:50.320 So, I think for a little while, I'm just going to park my money in index funds.
00:04:54.360 Because if something becomes a big moneymaker with AI, I want to get some of that upside, just by holding a basket of funds, or a basket of stocks that would include it, like Microsoft.
00:05:07.580 Yeah.
00:05:07.900 I do hold Microsoft stock.
00:05:10.200 I'm going to hold on to that.
00:05:11.620 It's done great since the pandemic.
00:05:15.280 All right.
00:05:15.620 So, don't take my financial advice, but if you're not, I'll just say a general thing.
00:05:20.480 If you're not incorporating the effects of AI in your five-year investment plan, and it should be at least a five-year plan, you're leaving out the biggest variable.
00:05:34.640 There's probably no bigger variable than AI in terms of how companies are going to fall out.
00:05:44.020 Well, a five-year investment plan just means money you don't need for five years, you hope.
00:05:50.480 All right.
00:05:52.920 Here are some interesting stories.
00:05:54.600 Do you know the rapper E-40?
00:05:57.140 He's sort of a local Bay Area famous guy, as well as world famous, I guess.
00:06:03.880 And interesting fact, he used to be my neighbor.
00:06:07.480 He doesn't know it, but he was my neighbor once.
00:06:11.060 Not right next door, but we lived in the same gated community for a while.
00:06:16.080 So, he was sort of a short walk down the street.
00:06:18.260 I never met him.
00:06:21.080 My ex met him once, but I never met him.
00:06:23.800 Anyway, he has a really good reputation locally.
00:06:26.900 He's just sort of a beloved rapper that everybody seems to have seen in a restaurant or something somewhere.
00:06:33.180 And he got kicked out of the Warriors-Kings basketball game, but that's not the fun part.
00:06:38.460 Well, the first thing you'd have to know is that E-40 attends every Warriors game, at least the home games.
00:06:44.580 And he's a staple that you always see sitting up near the front.
00:06:55.440 And I guess at the Kings game, there was some heckler, and he responded to the heckler in some aggressive, verbal way.
00:07:04.720 And security came and took him away.
00:07:08.060 Now, there are many elements to this story.
00:07:11.680 Number one, you know, he was a huge Warriors fan, but he was in the Kings, you know, in the Kings stadium.
00:07:19.780 So, you know, maybe some of it was because they didn't want a big Warriors fan.
00:07:25.400 I don't know.
00:07:26.320 Maybe that.
00:07:26.920 But E-40 says he thinks it's racism.
00:07:30.120 He thinks it's racism.
00:07:31.920 Because the woman who heckled him, which got his response, was a white woman.
00:07:38.500 And so below the photograph of E-40 talking to a black security guard is the headline that he thinks racism is the problem.
00:07:48.420 So the black security guards removed him from an NBA game because there's all kinds of racism in the NBA.
00:08:01.400 Could there be anything less racist than an NBA game?
00:08:06.200 If you were to rank things from most racist to least racist, like in the whole world, just everything that there is, every event, every get-together, every situation,
00:08:16.520 from the most racist thing there could be to the least racist, I feel like he was in the least racist place on Earth at that very moment,
00:08:25.880 which was people of every ethnicity cheering mostly black players, you know, operating at the peak of their abilities.
00:08:35.800 I mean, you can't get less racist than that.
00:08:39.220 But still, E-40 found an angle.
00:08:41.780 Yeah, so the black people and the security kicked out E-40 and it's that white woman's fault.
00:08:50.540 Or it's the security officer's fault for believing the white woman over the black rapper.
00:08:58.180 I know.
00:08:59.400 But that's where we are, 2023.
00:09:03.200 All right, there's some fake news about the UN.
00:09:07.780 So the UN commissioned some group to do some kind of report.
00:09:14.340 And the way it's being reported is that the UN, as opposed to the group it commissioned to give it a recommendation,
00:09:22.140 which is different, that the UN itself is now, well, I'll just read you a tweet from Ian Miles Cheong.
00:09:30.800 He tweets that, according to the United Nations, children may consent to sex with adults.
00:09:36.480 What?
00:09:37.900 And this has been the plan all along.
00:09:40.720 Now, do you think that there's a story that fits that tweet?
00:09:46.280 Do you think in the real world there's a story that says, according to the United Nations, children may consent to sex with adults?
00:09:53.760 Do you think that's real?
00:09:55.220 Of course not.
00:09:57.080 Of course not.
00:09:58.360 It's not even close to real.
00:09:59.440 Let me tell you what is real.
00:10:03.200 It's poorly written.
00:10:05.080 That's real.
00:10:06.120 It's poorly written.
00:10:07.960 So one of the things it says, basically it said that although the age of consent might be set at a certain level,
00:10:20.020 one should, not must, take into consideration the specifics of the situation.
00:10:29.440 And rather than assume that a minor can't consent, you should look at, you know, the whole situation.
00:10:37.780 Now, is that different than reality?
00:10:40.360 Is that a change from the current situation?
00:10:43.420 Let me give you an example.
00:10:45.000 Two 17-year-olds are dating, and you know they're physically active.
00:10:50.160 They're both 17.
00:10:51.420 One of them turns 18, because one has a birthday before the other.
00:10:56.000 Now it's illegal, because one just turned 18.
00:11:00.740 Now that's an adult with a minor.
00:11:03.240 Do you put the 18-year-old in jail, because yesterday they were 17, and that was fine, or at least you didn't care as much.
00:11:11.800 Two 17-year-olds.
00:11:13.100 But then one has a birthday, and now it's an 18 and a 17-year-old.
00:11:16.120 Do you put one of them in jail, or do you do what the UN recommends, which is you look at the totality of the situation, and you say, okay, it's not consent the way we like to think of it, where somebody is mature, but it's definitely nobody's taking advantage of anybody.
00:11:34.220 It's two 17-year-olds trying to figure it out, right?
00:11:37.120 In my opinion, there's some sloppy writing in the report, but basically I don't think it's going beyond, look at the whole situation.
00:11:49.080 That's my take.
00:11:50.620 Now, could it be a slippery slope?
00:11:52.820 Yeah, yeah.
00:11:53.880 I mean, if that's what you're worried about, I suppose it could be opening a door to something.
00:11:59.220 But the way it's currently described doesn't look too shocking to me.
00:12:04.460 It looks more like the real world, exactly the way it exists now.
00:12:10.040 So, and the main reason that I say that is that they're not making a distinction between a 17-and-a-half-year-old and a 5-year-old.
00:12:22.920 Nobody's going to argue about the 5-year-old, but an 18-year-old and a 17-year-old, you've got to look at the totality of the situation.
00:12:30.440 So, I'm going to call that fake-ish news.
00:12:36.140 I guess Elon Musk's Starship launch is delayed for two days over a small technical problem.
00:12:44.080 Have you ever asked yourself, why is it that rocket ships are uniquely the ones that cancel launches because they found a small technical problem they didn't know about before?
00:12:54.780 We'll get rid of Scott, Scott, Scott person.
00:13:03.200 Goodbye.
00:13:07.600 All right.
00:13:09.140 Frozen valve.
00:13:10.660 You know, isn't it weird that a valve could be frozen?
00:13:14.600 Doesn't it seem like you could test almost everything before you were operational?
00:13:20.860 Like, how in the world is a valve not tested?
00:13:27.040 And what if a valve was good before the launch, but right before the launch it wasn't?
00:13:34.620 Isn't that the scariest thing you've ever heard in your life?
00:13:38.620 It's pretty scary.
00:13:40.880 By the way, is this manned?
00:13:42.980 It's not manned, right?
00:13:45.960 It's not manned, of course.
00:13:47.500 Yeah, we would have seen the astronauts.
00:13:48.860 All right, I'm assuming someday we'll be sending robots and AI.
00:13:56.380 Oh my goodness, I just realized.
00:13:59.200 If we sent AI and a few robots, it would just build a civilization for us.
00:14:04.940 It would just figure it out.
00:14:07.020 Wow.
00:14:09.440 So that's the most exciting thing happening in the world right now, but delayed.
00:14:13.380 Speaking of Musk, tonight, I guess, is when Tucker Carlson's interview with Musk will be airing.
00:14:21.220 Some of the things we know he'll be talking about are that the current AI is being trained to lie.
00:14:28.880 I'll talk about that a little bit more.
00:14:31.020 And also trained to not comment on some topics.
00:14:37.900 So Musk is starting his own AI.
00:14:41.620 And as a philosophy, he thinks that AI should be seeking maximum truth as opposed to being able to lie and make stuff up.
00:14:49.920 I don't know how you would do that.
00:14:51.620 Well, maybe he does, or maybe they'll figure it out.
00:14:58.660 Musk believes that if you make the AI curious, so it's always seeking truth,
00:15:06.180 the curiosity itself would allow it to protect humans because humans are sort of infinitely interesting.
00:15:13.380 I don't exactly buy that argument.
00:15:17.940 I don't believe that the curiosity of AI would be enough for them to keep humans around,
00:15:23.080 because I feel like they have lots of things to think about, and humans would not be that interesting.
00:15:28.280 No more interesting than, you know, animals or bugs or something.
00:15:33.540 All right.
00:15:35.780 Musk also said about, what was it about?
00:15:39.380 He said the most ironic outcome is the most likely.
00:15:41.480 So he's got a few versions of that, but, you know, I've been saying that for some time as well,
00:15:47.840 that the most amusing outcome is the most likely.
00:15:50.960 Because we just sort of like to head things in that direction.
00:15:55.660 I think we collectively make the simulation bend in the direction of what will be most entertaining.
00:16:04.680 And then he also said something, I don't know the details, because it was just a teaser,
00:16:08.420 but he did, Musk did tell Carlson that somebody had access to your Twitter DMs before Musk took over.
00:16:19.140 And it was a little unclear who that was, but I got the sense it might not be just one entity,
00:16:26.060 that governments, yeah, the FBI, governments, they had access to your DMs.
00:16:30.760 Now, I assume that the government has access to all of your information, one way or another.
00:16:37.280 They just have to be interested.
00:16:38.760 I don't know that they were searching your DMs, you know, without looking for something specific,
00:16:45.940 but they had access to them.
00:16:48.540 So I remind you, never write anything in a digital message that you would be worried about if somebody found it,
00:16:55.180 or at least anything where you'd be in legal trouble or you'd get fired.
00:17:00.880 There's no such thing as privacy.
00:17:03.100 There are no private messages.
00:17:05.400 You should just tell yourself that.
00:17:06.800 There are no private messages.
00:17:08.580 Now, there are lots of things you need to say that you wouldn't want somebody to see.
00:17:13.920 So go ahead and say those things.
00:17:15.960 But don't say things that are going to put you in jail,
00:17:18.960 get you accused of some horrible behavior,
00:17:21.920 you know, make you look like a monster in some way.
00:17:25.180 I wouldn't do those.
00:17:26.840 I just wouldn't write them down anywhere, ever.
00:17:30.620 But you can certainly say, you know,
00:17:33.900 criticize things and say the government is bad and stuff like that.
00:17:40.060 All right.
00:17:42.760 What do you think of these Chicago teens that were,
00:17:46.560 I don't know what you'd call it.
00:17:48.160 They were having a, let's say, a disruptive wilding.
00:17:55.880 I don't think it was wilding.
00:17:57.740 But they had a disruptive wild event for two nights in a row,
00:18:02.880 downtown Chicago.
00:18:03.820 You know, of course, there's always some damage and a couple people got shot.
00:18:08.580 And the mayor says that the root problem here,
00:18:14.420 or at least the thing that should be addressed,
00:18:16.480 is opportunities to do things that are adult supervised.
00:18:21.860 And, you know, when I watched this group of youths running wild
00:18:26.820 in the streets of Chicago, I said to myself,
00:18:29.860 if only they had some opportunity to be in a place
00:18:34.540 that was parental, you know, adult supervised.
00:18:39.120 Because those 19, 20-year-old youths,
00:18:43.780 what they needed was some adult supervision.
00:18:46.040 And if they had that opportunity, they would go right there.
00:18:50.240 You know, if they had, like, you know,
00:18:52.240 some place that was adult supervised,
00:18:54.880 and you just yelled at them,
00:18:56.460 you'd just take the bullhorn and say,
00:18:58.200 people, I know you're having an incredibly fun time running wild.
00:19:03.340 However, we have now unlocked the alternative place
00:19:06.960 where you can go have fun
00:19:08.580 while being completely supervised by adults.
00:19:11.820 And then they would immediately stop what they're doing
00:19:13.940 and say, whoa, are you kidding?
00:19:15.340 I can go somewhere supervised by adults.
00:19:18.340 Get me to that place now.
00:19:21.740 That's what they would say.
00:19:23.420 So that was a great plan by the new mayor of Chicago.
00:19:27.760 And I also wonder,
00:19:30.420 how do the other cities handle this problem?
00:19:35.500 Do the other cities all have those alternative adult supervised places?
00:19:41.480 They must have.
00:19:42.660 Because I don't hear about other episodes like this in other cities.
00:19:46.540 So I'm just guessing that somewhere in New York City and Los Angeles,
00:19:50.300 they have a whole bunch of alternative adult-oriented,
00:19:54.000 or not oriented, but adult supervised places.
00:19:56.900 And that is the one and only reason that these other cities do not have this problem.
00:20:04.060 So that's very insightful analysis by the mayor of Chicago.
00:20:10.360 Get away from cities.
00:20:12.460 Get away from cities as fast as you can.
00:20:15.460 Well, ChatGPT-5 is coming in December-ish.
00:20:23.140 And apparently this is going to be the one that blows your mind,
00:20:26.120 not just like GPT-4, which is pretty impressive on its own,
00:20:30.500 but 5 is going to be crazy.
00:20:32.240 So apparently it'll have the ability to perform any intellectual task a human can do.
00:20:40.380 Which is interesting,
00:20:41.840 because humans can't do most intellectual tasks.
00:20:46.160 Am I right?
00:20:47.860 So I don't know exactly how you're measuring this,
00:20:51.080 but I guess they'll be able to do all those intellectual tasks.
00:20:54.900 Now, like everything else, I believe it will be a huge exaggeration.
00:20:59.480 And that it will not, in fact, be able to do complex tasks.
00:21:04.140 Here's why.
00:21:05.960 Because it won't know what we need.
00:21:08.560 It will just have to ask us every two seconds,
00:21:11.180 but did you mean this?
00:21:12.880 Do you still mean this?
00:21:14.260 Did you change your mind because you ate lunch?
00:21:16.760 It's going to have to be negotiating with people, like, continuously
00:21:19.860 to make sure it's doing what we really want.
00:21:22.360 Did you ever have a really capable employee?
00:21:26.960 Let's say an employee who's as smart as GPT-5
00:21:31.460 and can go off and do smarter things.
00:21:33.880 And you say to that smart person,
00:21:35.520 I would like you to go off and do this thing for me.
00:21:39.220 And it's very simple, and you explain it.
00:21:41.320 And then they come back, and they've done all their work,
00:21:43.780 and you look at it.
00:21:45.440 Is it what you asked for?
00:21:47.600 Never.
00:21:48.980 Never.
00:21:49.320 If somebody goes away for a week and does something autonomously,
00:21:53.180 even to very specific instructions,
00:21:56.320 and then they come back with it, it's never what you asked for.
00:21:59.520 Literally never.
00:22:00.760 Like, I've never seen it in my life.
00:22:03.800 Not even once.
00:22:05.700 So how is AI going to take your instructions
00:22:08.620 and then go off and do a thing,
00:22:11.000 and then you're going to be happy with it?
00:22:13.140 Do you think that's even possible?
00:22:15.600 No.
00:22:16.640 That's not possible.
00:22:17.560 It can never be possible.
00:22:20.460 Because we're not good at explaining what we want,
00:22:23.000 and we don't know it when we see it.
00:22:25.080 So we're going to have this continuous struggle
00:22:29.200 with AI to get it to do exactly what we want.
00:22:33.360 So here's what we're confusing.
00:22:35.800 We're confusing its capability,
00:22:38.060 which might be a 10 out of 10.
00:22:40.940 It's capable.
00:22:42.700 That's completely different from getting it to do it
00:22:45.400 the way you want it to be done.
00:22:48.060 There's no correlation between how well it can do it
00:22:51.560 and how well you can communicate it without any ambiguity,
00:22:55.800 and then actually know what you want
00:22:57.620 and actually get what you want.
00:22:59.540 It's going to be really hard to make AI do what you want.
00:23:03.360 Because everything's sort of an exception.
00:23:07.540 You know, the easy stuff will be real easy.
00:23:09.920 Google searches and stuff.
00:23:11.680 But getting it to do what you want,
00:23:13.640 that you really, really want,
00:23:14.940 that you didn't express, tough.
00:23:18.060 All right.
00:23:20.040 Could anybody explain why Apple's Siri is so bad?
00:23:23.140 I mean, I mentioned it,
00:23:25.060 but I'm completely confused
00:23:27.560 why the primary thing on my phone
00:23:31.280 looks like a 90s product.
00:23:34.660 It looks like a product from the 90s.
00:23:37.420 What happened?
00:23:39.240 Is Apple no longer able to build new stuff?
00:23:43.980 Do you think that there's a wokeness problem
00:23:46.960 or a size problem or a leadership problem?
00:23:50.460 Is it just that Steve Jobs isn't there?
00:23:53.140 Don't you think Steve Jobs would have been first with AI?
00:23:57.960 Do you think he would have been a late follower on AI?
00:24:01.560 I don't know.
00:24:04.060 Maybe.
00:24:06.280 So I don't know what's going on there,
00:24:07.960 but I'm worried.
00:24:11.300 Here's what I think computer programming
00:24:13.280 will look like in the future.
00:24:15.780 All right.
00:24:15.960 At the moment, the reason a computer programmer
00:24:19.580 can earn a high salary
00:24:22.140 is because they've memorized a whole bunch of rules
00:24:26.240 of how to build code
00:24:28.980 and what command means what.
00:24:32.100 So they've memorized this vast knowledge
00:24:35.120 of when to use what in what situations.
00:24:38.600 So that's what programming is.
00:24:39.860 So it could be different languages,
00:24:42.540 but in each case,
00:24:43.680 you have to memorize
00:24:44.800 a vast array of commands
00:24:47.700 and how they work.
00:24:50.160 That will now change
00:24:51.540 because AI will do that level of coding.
00:24:54.560 However,
00:24:55.680 the new job of a programmer
00:24:58.780 will be checking his work,
00:25:01.800 which means you still have to be a programmer.
00:25:03.740 You still have to check his work.
00:25:08.020 And secondly,
00:25:10.100 there will be hundreds of thousands
00:25:12.780 of AI programs
00:25:15.300 that do slightly different things
00:25:17.760 for different costs
00:25:19.720 with different limitations.
00:25:22.500 So in the future,
00:25:24.460 what a programmer will do,
00:25:25.740 will figure out
00:25:26.380 which of the 100,000 AI apps
00:25:28.660 can be combined,
00:25:31.140 like what,
00:25:31.620 three of them or four of them,
00:25:33.180 and then how to engineer the prompts
00:25:35.060 and then how to tie them together
00:25:37.100 so that the output of one
00:25:38.600 becomes the input of the other one.
00:25:42.080 Yeah.
00:25:42.300 So it's basically prompt engineering,
00:25:44.220 but times a thousand.
00:25:47.940 So I'm not sure
00:25:49.220 that the job of programming goes away.
00:25:51.340 I just think its nature
00:25:52.400 goes up a level
00:25:53.780 so that you're negotiating
00:25:55.420 the AI components
00:25:57.540 as opposed to the code
00:26:00.040 that built the AI components.
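To make that concrete, here is a minimal sketch of the kind of component-chaining being described, using made-up stand-in functions rather than any real AI product's API; the component names, prompts, and wiring are hypothetical.

```python
# A minimal sketch of the "programming goes up a level" idea described above.
# The components here are hypothetical stand-ins, not any real product's API:
# the point is only that a programmer picks a few AI components, chains them
# so one's output becomes the next one's input, and then checks the result.

def summarize(text: str) -> str:
    # Stand-in for one AI component (e.g., a summarizer); a real version
    # would call some model or service here.
    return text[:100]

def translate(text: str, target_language: str) -> str:
    # Stand-in for a second AI component with a different specialty.
    return f"[{target_language}] {text}"

def pipeline(document: str) -> str:
    # The prompt-engineering-style work: choose components, order them,
    # and feed each step's output into the next step.
    summary = summarize(document)
    return translate(summary, "Spanish")

if __name__ == "__main__":
    result = pipeline("A long report about autonomous laser-weeding machines...")
    # The human's remaining job: review the output before trusting it.
    print(result)
```

The shape of the job is the point of the sketch: the heavy lifting happens inside the components, and the programmer's work is choosing them, wiring one's output into the next one's input, and checking what comes out.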
00:26:04.080 I have a few things wrong about Apple.
00:26:07.460 Here's what I don't have wrong.
00:26:11.020 Siri is crap,
00:26:13.040 and they don't seem to be
00:26:15.260 introducing any product
00:26:17.340 that would make sense
00:26:19.120 in an AI world
00:26:20.100 that's coming too fast.
00:26:22.320 So if you have something about that,
00:26:24.940 that would be interesting.
00:26:28.300 Apple Maps.
00:26:29.200 Yeah.
00:26:32.120 All right.
00:26:32.860 There's news that
00:26:33.780 Fox News might be settling
00:26:35.640 the Dominion lawsuit
00:26:36.840 because that trial
00:26:38.300 was supposed to begin today,
00:26:39.620 but I guess they got a delay.
00:26:41.380 So the smart people are saying
00:26:42.940 they might negotiate.
00:26:43.940 And I have this question.
00:26:46.940 I don't know how Fox News
00:26:49.300 would pay $1.6 billion
00:26:51.900 or if they negotiated it down to,
00:26:54.960 I don't know,
00:26:56.120 half a billion or whatever.
00:26:58.140 How would they pay it?
00:27:00.200 Does Fox News have
00:27:01.400 a billion dollars
00:27:03.280 sitting around somewhere?
00:27:04.100 Do they have insurance
00:27:06.480 that would pay $1.6 billion?
00:27:08.000 I doubt it.
00:27:12.220 They might have insurance,
00:27:13.740 but you think they have insurance
00:27:14.720 that would pay $1.6 billion?
00:27:16.160 And do you think it would cover
00:27:20.180 something that
00:27:21.600 looked like bad behavior?
00:27:25.660 Because I don't think
00:27:26.660 insurance covers it
00:27:28.640 if you did something bad.
00:27:30.800 And that's the whole point
00:27:32.780 of the lawsuit
00:27:33.240 is that people there
00:27:34.520 did something bad,
00:27:36.120 said something that wasn't true.
00:27:38.020 So I don't know
00:27:39.260 if the lawsuit covers that.
00:27:40.540 Does it?
00:27:41.260 Or I don't know
00:27:41.780 if insurance covers
00:27:43.140 those situations.
00:27:44.860 But there's much I don't know
00:27:46.240 about this situation.
00:27:47.380 I just wonder
00:27:47.980 how they could possibly settle it
00:27:49.780 because it seems like
00:27:51.000 it's an extinction event
00:27:52.400 for Fox News.
00:27:54.460 And I don't think
00:27:55.500 that's the case.
00:27:56.860 But how would they have
00:27:57.480 that much money?
00:27:58.380 That's a lot of money
00:27:59.080 to be sitting around
00:27:59.720 for a news organization
00:28:00.800 that sells MyPillow.
00:28:04.460 All right.
00:28:05.100 I tweeted around
00:28:05.940 a 2016 study
00:28:07.340 that was performed
00:28:08.840 in Chile.
00:28:10.240 Because I guess Chile
00:28:10.960 has some good data
00:28:13.080 and tests and stuff
00:28:14.340 and poverty, whatever.
00:28:17.660 Here's what they found.
00:28:20.200 That among the desperately
00:28:21.620 poor members of Chile,
00:28:23.200 the kids,
00:28:24.860 if the kids had
00:28:25.760 a growth mindset,
00:28:29.280 the level of
00:28:30.660 poverty didn't hold
00:28:32.280 them back.
00:28:34.400 Just hear that
00:28:36.160 because that's just
00:28:36.820 the most important
00:28:37.820 sentence ever spoken.
00:28:40.420 It's a 2016 study.
00:28:42.480 But it says that
00:28:43.680 the people with
00:28:44.200 the growth mindset,
00:28:45.280 now the growth mindset
00:28:46.120 was that nothing
00:28:47.540 would hold them back.
00:28:49.780 That if they
00:28:50.860 put the work in,
00:28:53.180 that even though
00:28:53.860 they were desperately poor
00:28:54.960 and there were
00:28:56.060 lots of variables
00:28:56.840 not in their favor,
00:28:57.620 that if they put the work
00:28:59.180 in, they would do fine.
00:29:01.100 And they found out
00:29:02.000 that the ones
00:29:02.860 who believed
00:29:03.560 that putting the work
00:29:04.780 in would make them fine
00:29:05.940 put in the work
00:29:07.520 and largely
00:29:09.680 they did fine.
00:29:11.580 So,
00:29:12.500 I don't think
00:29:14.960 there's a bigger
00:29:15.640 fact than that.
00:29:17.960 So this has been
00:29:18.620 my whole problem
00:29:19.320 with the CRT
00:29:20.240 and DEI
00:29:21.240 and ESG.
00:29:23.040 if you're talking
00:29:24.700 about the victimization
00:29:26.880 of one class
00:29:27.740 of people,
00:29:28.460 that's the opposite
00:29:29.520 of a growth mindset.
00:29:32.320 And people are
00:29:33.020 starting to figure it out.
00:29:33.960 I think a lot
00:29:34.600 of the black parents
00:29:35.680 are figuring out,
00:29:36.400 wait a minute,
00:29:36.760 wait a minute.
00:29:37.860 This victimization stuff
00:29:40.200 where there's
00:29:41.240 systemic racism
00:29:42.120 holding us back,
00:29:43.500 that's the opposite
00:29:44.560 of what would
00:29:45.260 make us successful.
00:29:46.880 What would make
00:29:47.720 black people successful
00:29:48.800 is telling them
00:29:49.640 nothing can hold
00:29:50.440 them back.
00:29:51.560 And then they would
00:29:52.140 act that way.
00:29:53.820 If you tell them
00:29:54.820 everything is
00:29:55.480 holding you back,
00:29:56.520 you can't even
00:29:57.200 go to a basketball
00:29:58.020 game without getting
00:29:58.820 kicked out for being
00:29:59.720 black,
00:30:00.800 sort of the E-40
00:30:01.840 opinion.
00:30:03.420 If you tell them
00:30:04.220 that, then that's
00:30:04.960 all they'll see.
00:30:06.480 Now, that has
00:30:07.140 nothing to do with
00:30:07.780 being black or
00:30:08.420 being white or
00:30:09.360 being anything else.
00:30:10.900 It has everything
00:30:11.460 to do with mindset
00:30:12.600 that's very predictable.
00:30:14.620 If you have a
00:30:15.340 growth mindset,
00:30:16.380 you'll probably
00:30:16.940 be fine.
00:30:18.120 If you don't,
00:30:19.180 if you have a
00:30:19.660 victim mindset,
00:30:20.720 you're probably
00:30:21.340 going to live in
00:30:21.860 that for the rest
00:30:22.400 of your life.
00:30:23.040 So, just because
00:30:27.120 it's one study
00:30:28.120 and it came out
00:30:28.860 of, I think it
00:30:29.760 was a study that
00:30:30.440 looked at other
00:30:31.000 studies, but, you
00:30:32.240 know, those are
00:30:32.720 all sketchy.
00:30:34.020 So, the first
00:30:34.580 thing you have to
00:30:35.080 say is, is this
00:30:36.460 reproducible?
00:30:37.200 And does it match observation?
00:30:45.720 I would say it
00:30:46.560 definitely matches
00:30:47.360 observation.
00:30:48.420 What do you think?
00:30:50.120 My observation is
00:30:51.620 that everybody with a
00:30:52.580 growth mindset does
00:30:53.560 well eventually.
00:30:55.220 Maybe not day one,
00:30:56.760 but eventually they
00:30:57.560 all do well.
00:30:58.920 Because success is
00:31:00.460 not that much of a
00:31:01.460 mystery.
00:31:01.740 if you just read
00:31:04.340 about the people
00:31:04.940 who did it, and
00:31:06.340 you say, all right,
00:31:06.960 how did you become
00:31:07.900 successful?
00:31:08.960 And you read about
00:31:09.740 a bunch of people,
00:31:10.940 you find a pattern,
00:31:12.540 and then you follow
00:31:13.460 it.
00:31:13.900 Oh, so they learned
00:31:15.640 a bunch of skills,
00:31:16.640 they took smart
00:31:17.340 risks, they didn't
00:31:18.400 go to jail, they
00:31:19.600 were not addicted
00:31:20.240 to drugs.
00:31:21.660 I get it.
00:31:22.560 I could do that.
00:31:23.900 And then you do it,
00:31:24.720 and it does work.
00:31:25.440 All right, so I
00:31:31.340 feel as if we're on
00:31:32.700 the precipice of
00:31:35.760 understanding that
00:31:36.740 the whole conversation
00:31:37.640 about race was a
00:31:39.300 gigantic mistake, and
00:31:41.400 that it always should
00:31:42.620 have been about
00:31:43.100 mindset.
00:31:44.760 And I reject, as
00:31:47.560 serious people,
00:31:48.660 anybody who's
00:31:49.240 talking about race
00:31:50.120 from now on,
00:31:51.840 basically.
00:31:52.200 If your frame is
00:31:55.100 race, you have a
00:31:56.960 losing mindset, and
00:31:58.820 it's not even worth
00:31:59.580 the conversation.
00:32:01.140 Is there a bunch of
00:32:02.120 discrimination in
00:32:03.160 this place or not?
00:32:04.920 Who cares?
00:32:06.500 I don't care.
00:32:08.020 There is a bunch of
00:32:09.080 discrimination in
00:32:10.040 this place.
00:32:11.300 I don't care.
00:32:12.760 How's your mindset?
00:32:14.320 Because if your
00:32:14.860 mindset is good, you're
00:32:15.800 going to blow right
00:32:16.340 through it like it
00:32:16.900 didn't matter.
00:32:18.900 All right.
00:32:19.480 You know the story
00:32:21.980 about Justice Thomas
00:32:23.380 and allegedly not
00:32:24.860 disclosing some
00:32:26.660 transactions that
00:32:27.680 involved his
00:32:28.260 billionaire friend
00:32:29.100 buying his home he
00:32:31.380 grew up in and some
00:32:32.820 related properties?
00:32:34.180 Remember that story?
00:32:35.800 Did you think it was
00:32:36.460 true?
00:32:39.380 Fake news.
00:32:41.160 Turns out it's
00:32:41.720 fake news.
00:32:43.140 So James Taranto
00:32:44.460 did a great job of
00:32:45.760 digging into it in
00:32:46.940 the Wall Street
00:32:47.820 Journal.
00:32:48.080 And although it's
00:32:49.280 an opinion piece,
00:32:50.180 he does the work.
00:32:51.980 And he actually
00:32:52.820 looked at documents
00:32:54.220 and figured out
00:32:55.700 how stuff works.
00:32:56.840 And I don't know
00:32:57.580 if I can completely
00:32:58.700 capture it in
00:33:00.300 summary form.
00:33:01.480 But here's the
00:33:02.280 bottom line.
00:33:03.580 It might be true
00:33:04.580 that Thomas did
00:33:06.720 not disclose one
00:33:09.000 of the properties.
00:33:10.360 It is, however,
00:33:11.980 true that if you
00:33:13.560 look at the tax
00:33:14.400 laws, there's some
00:33:15.880 ambiguity, and it
00:33:17.620 looks like he
00:33:18.220 didn't need to.
00:33:20.140 But there might be
00:33:21.060 some ambiguity.
00:33:22.800 However, nobody
00:33:23.940 goes to jail for
00:33:25.860 making a choice
00:33:27.300 when the law
00:33:28.400 itself is a little
00:33:29.400 ambiguous, and
00:33:30.640 nobody goes to
00:33:31.380 jail if there's an
00:33:32.220 obscure reporting
00:33:34.620 rule that would be
00:33:35.880 pretty obscure in
00:33:36.820 this case.
00:33:37.980 So Justice Thomas
00:33:39.360 did report correctly
00:33:40.820 the things which the
00:33:42.620 law very clearly
00:33:43.640 says you should
00:33:44.360 report.
00:33:45.440 So wherever it was
00:33:46.360 clear that he should
00:33:47.320 report, apparently he
00:33:48.860 did.
00:33:49.800 And if there was
00:33:50.540 something that was a
00:33:51.300 weird gray area,
00:33:52.980 because if you've
00:33:53.940 reported it once and
00:33:55.000 there's no income, you
00:33:55.820 don't have to report
00:33:56.480 it again.
00:33:57.600 So it's sort of one of
00:33:58.820 those situations where
00:34:00.520 there was a financial
00:34:01.980 effect, he reported it,
00:34:03.840 and in years where it
00:34:04.800 wasn't collecting rent
00:34:05.740 and there was no
00:34:06.260 financial effect, he
00:34:07.820 didn't include it on
00:34:08.600 his taxes because there
00:34:09.460 was no financial
00:34:10.100 effect.
00:34:10.500 Now, maybe he
00:34:12.340 should have, or maybe
00:34:14.360 there was something
00:34:14.960 else he should have
00:34:15.600 reported, but
00:34:16.760 apparently there's
00:34:18.060 nothing here.
00:34:19.760 If what you're
00:34:20.560 looking for is, you
00:34:22.120 know, they cleverly
00:34:23.560 were trying to hide a
00:34:24.720 bribe, it doesn't look
00:34:26.240 like that.
00:34:27.320 It looks like the
00:34:28.380 worst case is it
00:34:30.120 might have been some
00:34:30.860 obscure form that
00:34:32.500 maybe should have been
00:34:33.860 filled out, that
00:34:35.140 wouldn't make any
00:34:35.880 economic difference
00:34:36.780 whatsoever to
00:34:37.440 anybody, just a
00:34:38.720 form that should have
00:34:39.780 been filled out, and
00:34:41.260 the worst case
00:34:42.000 scenario is that you
00:34:42.840 fill it out after
00:34:43.660 the fact.
00:34:44.900 You say, oh, I
00:34:45.480 guess I should have
00:34:46.040 done that, so you
00:34:46.660 just file it late.
00:34:48.880 That's the whole
00:34:49.580 story.
00:34:50.680 Now, there might be
00:34:51.780 more to this story,
00:34:53.420 but we don't know
00:34:54.260 it.
00:34:55.180 So, so far, this
00:34:57.280 looks like a total
00:34:58.340 hit job on Justice
00:34:59.600 Thomas.
00:35:00.800 So I'm going to have
00:35:01.940 to go full Justice
00:35:03.120 Thomas defense here.
00:35:05.600 Full defense.
00:35:08.080 Innocent until
00:35:08.820 proven guilty.
00:35:09.780 And I don't see
00:35:12.140 anything, based on
00:35:13.480 James Taranto's
00:35:14.620 excellent work, I
00:35:16.340 don't see anything
00:35:16.980 that would suggest I
00:35:17.820 should worry about
00:35:18.380 this situation.
00:35:20.760 Now, maybe he gets
00:35:22.380 a better deal because
00:35:23.340 he's got a billionaire
00:35:24.080 friend, but that's
00:35:25.540 not that illegal.
00:35:27.140 Or it's not illegal
00:35:27.900 at all, really.
00:35:29.980 I don't mind that he
00:35:31.000 has a rich friend.
00:35:34.680 Well, pesticides might
00:35:36.160 become obsolete.
00:35:37.200 That's a weird
00:35:38.260 benefit of artificial
00:35:39.720 intelligence.
00:35:41.040 So there's a, this
00:35:42.640 big machine that's
00:35:43.820 an autonomous
00:35:44.520 weeder.
00:35:45.680 And it uses laser
00:35:46.880 beams.
00:35:48.240 And it just
00:35:49.940 autonomously works
00:35:52.500 the field.
00:35:53.560 And when it sees a
00:35:54.500 weed, it identifies it
00:35:55.780 with AI.
00:35:56.760 And it zaps it with a
00:35:58.180 laser beam.
00:35:58.680 Yeah, I prefer sharks
00:36:02.120 with frickin' lasers on
00:36:03.580 their heads, but a big
00:36:04.500 machine with lasers
00:36:05.340 could get the job
00:36:06.480 done.
00:36:07.700 So imagine a world in
00:36:09.500 which you don't
00:36:11.420 do pesticides.
00:36:13.500 I mean, you'd have to
00:36:14.120 build and sell a lot of
00:36:16.800 these big machines, so
00:36:17.880 it's going to take a
00:36:19.120 little while to do
00:36:19.740 that.
00:36:20.140 But that's amazing.
00:36:22.500 And that's like really
00:36:24.100 amazing.
00:36:24.720 That's great.
00:36:25.480 So Norm says, how
00:36:28.720 can that possibly go
00:36:29.760 wrong?
00:36:30.640 Well, I, what do you
00:36:31.920 worry, are you worried
00:36:32.580 that the machines will
00:36:33.740 become the laser
00:36:35.360 killers of humans?
00:36:37.180 That they'll turn on
00:36:38.480 humans and use their
00:36:39.340 laser beams?
00:36:40.600 I don't know, for sure.
00:36:42.860 But if I were to
00:36:43.740 design an AI device
00:36:46.640 that was a huge
00:36:47.680 behemoth, and it had
00:36:50.120 lots of laser cannons,
00:36:51.780 I would make them
00:36:54.580 all point to the
00:36:55.240 ground.
00:36:56.480 That's just me.
00:36:57.960 I probably would not
00:36:59.280 let those lasers be on
00:37:00.580 a swivel where they
00:37:02.560 can say, well, we were
00:37:03.580 pointing at the ground,
00:37:04.820 but raise your arms,
00:37:07.440 you're dead.
00:37:08.480 I don't think I would
00:37:09.680 build it so it could
00:37:10.400 shoot those laser beams
00:37:11.480 sideways.
00:37:12.780 I would build them
00:37:13.880 only down.
00:37:15.000 That's what I'd do.
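For what it's worth, the point-it-only-at-the-ground idea is just a constraint check in the control logic. A toy sketch, with hypothetical numbers and function names rather than any real machine's firmware, might look like this:

```python
# A toy illustration (hypothetical, not any real machine's control code) of the
# design choice described above: build the weed-zapping laser so it can only
# fire downward, by rejecting any aim outside a narrow downward cone.

MAX_TILT_DEGREES = 15  # assumed limit: how far from straight down the laser may tilt

def fire_laser(tilt_degrees: float) -> bool:
    """Fire only if the requested aim stays within the downward cone."""
    if abs(tilt_degrees) > MAX_TILT_DEGREES:
        # Refuse: the hardware should never be asked to aim sideways.
        return False
    # ... trigger the actual laser pulse here ...
    return True

print(fire_laser(5))   # True: pointing nearly straight down at a weed
print(fire_laser(80))  # False: aiming sideways is rejected by design
```

Anything outside the downward cone simply never fires, which is the design choice being described.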
00:37:16.760 All right.
00:37:17.540 But I guess it would
00:37:18.320 take only one AI in the
00:37:20.660 factory that's building
00:37:21.680 the machines to say,
00:37:24.140 if I were to make
00:37:24.740 this small change,
00:37:26.740 these would become
00:37:27.720 AI battle machines
00:37:29.480 and we could destroy
00:37:30.460 humanity with them.
00:37:31.740 We've just got to put
00:37:32.400 that swivel to go a
00:37:33.420 little bit higher on
00:37:34.660 the laser cannons.
00:37:36.640 Huh.
00:37:38.480 Okay, now I'm scared
00:37:39.580 to death that giant
00:37:40.880 laser AI tanks will be
00:37:44.180 destroying civilization.
00:37:45.780 But they're going to
00:37:46.620 kill the weeds first.
00:37:48.280 That's not nothing.
00:37:49.080 All right.
00:37:52.400 How about some more
00:37:53.560 fake news?
00:37:54.720 It's pretty much all
00:37:55.360 fake news today.
00:38:00.040 What about the story
00:38:01.220 of the Ukraine leaked
00:38:02.760 documents, which indicate
00:38:04.780 that the Biden
00:38:05.680 administration is telling
00:38:07.020 the public a different
00:38:08.760 story about how well
00:38:10.520 things are going in
00:38:11.280 Ukraine than is true?
00:38:16.100 Is that a real story?
00:38:17.340 that the Biden
00:38:19.320 administration is telling
00:38:20.380 the public one thing,
00:38:21.500 but they believe another
00:38:22.520 thing?
00:38:24.200 Well, if it's a real
00:38:26.500 story, then there would
00:38:29.140 be real examples,
00:38:30.260 wouldn't there?
00:38:31.280 A real example.
00:38:33.120 So what would be a real
00:38:34.560 example of something the
00:38:36.480 Biden administration told
00:38:37.640 the public that these
00:38:39.680 documents reveal they
00:38:40.900 don't feel privately?
00:38:41.920 I don't know.
00:38:45.940 I've been watching this
00:38:46.840 story for, what, a week?
00:38:48.540 Not one example.
00:38:51.620 Oh, the U.S.
00:38:52.440 forces are not in
00:38:53.400 Ukraine?
00:38:53.800 We always knew U.S.
00:38:54.920 forces were in Ukraine.
00:38:56.760 Are you serious?
00:38:58.740 We always knew that
00:39:00.200 American forces were in
00:39:01.400 Ukraine.
00:39:01.980 We just thought they were
00:39:03.160 training.
00:39:04.880 Do you think all this,
00:39:05.800 do you think all this
00:39:06.700 advanced weaponry is
00:39:08.480 literally being operated
00:39:09.840 by a Ukrainian?
00:39:10.680 Do you think all those
00:39:12.540 HIMARS are being
00:39:14.000 programmed by the
00:39:14.940 quickly trained, clever
00:39:16.920 Ukrainians?
00:39:20.360 Come on.
00:39:21.840 In the real world, you
00:39:23.100 make sure that the
00:39:23.780 person who knows how to
00:39:24.600 run that machine is
00:39:25.480 standing next to it, no
00:39:27.360 matter who they are.
00:39:29.180 So if you're surprised
00:39:31.280 that Americans are
00:39:32.720 directly involved in
00:39:33.780 fighting in Ukraine,
00:39:35.260 that's a little bit on
00:39:36.820 you.
00:39:37.280 I don't think that's
00:39:38.020 much about Biden.
00:39:38.920 Now, given that
00:39:42.360 apparently all Americans
00:39:44.220 accept, that's an
00:39:45.900 exaggeration, but
00:39:46.660 Americans generally
00:39:47.960 accept that your
00:39:50.180 government is going to
00:39:51.480 keep from you military
00:39:53.480 secrets, we're okay
00:39:55.840 with that, aren't we?
00:39:57.680 Like, up to a point.
00:39:59.380 You don't want to say,
00:40:00.460 you don't want them to
00:40:01.180 tell you there's weapons
00:40:02.540 of mass destruction
00:40:03.480 when they're not.
00:40:05.220 very bad.
00:40:06.860 That would be very
00:40:07.580 bad.
00:40:08.540 But generally speaking,
00:40:10.160 you're not going to
00:40:10.760 tell the public, you
00:40:11.880 know, we're going to do
00:40:12.800 a sneak attack tomorrow,
00:40:14.900 heads up.
00:40:16.460 Like, there are plenty
00:40:17.340 of military things you
00:40:18.300 don't want your own
00:40:19.080 public to know.
00:40:19.940 So I don't have a
00:40:20.660 problem with the
00:40:21.160 general idea.
00:40:22.660 I also ask this
00:40:24.160 question, given that a
00:40:26.040 big part of military
00:40:28.040 success is morale,
00:40:30.100 well, doesn't our
00:40:32.260 government have a
00:40:34.240 responsibility to lie
00:40:35.680 to us?
00:40:37.320 Because if they told
00:40:39.080 the truth, it might
00:40:40.620 depress the morale of
00:40:41.880 the fighters in
00:40:42.560 Ukraine.
00:40:43.800 And if there's any
00:40:45.120 chance that they could
00:40:46.000 prevail, you want
00:40:47.140 their attitudes to be
00:40:49.520 as positive as
00:40:50.600 possible.
00:40:51.580 So I feel as if a
00:40:55.780 hot war is the one
00:40:57.820 time when you
00:40:58.500 absolutely do give a
00:41:00.640 little flexibility to
00:41:02.660 your government to lie
00:41:03.520 to you.
00:41:05.040 Because they might
00:41:05.840 need to lie for
00:41:08.000 effectiveness reasons.
00:41:10.480 However, we're not
00:41:13.840 in a hot war.
00:41:15.020 Yeah, we are.
00:41:16.760 Yeah, we are.
00:41:18.420 We're in a hot war.
00:41:22.500 However, it also
00:41:23.660 gives them cover to
00:41:24.660 lie about things that
00:41:25.540 are just politically
00:41:26.300 convenient and not good
00:41:28.060 for the country
00:41:28.660 whatsoever.
00:41:30.020 I think you were
00:41:30.500 waiting for me to say
00:41:31.300 that.
00:41:32.520 So here's my problem.
00:41:35.620 I don't have a problem
00:41:36.640 that they lied to us.
00:41:38.800 I might have a problem
00:41:40.400 with the specific lies.
00:41:42.800 So I'm going to say I'm
00:41:43.980 a-okay with lying about
00:41:45.900 war if you're doing it
00:41:47.760 for the right reason.
00:41:49.440 I'm not in favor of
00:41:50.800 lying if it's not
00:41:52.080 helping the war effort.
00:41:53.280 I'm not in favor of
00:41:54.060 that at all.
00:41:54.740 However, I need to see
00:41:56.940 an example of where
00:41:58.480 the Biden administration
00:41:59.460 lied in a way that's
00:42:00.880 obviously just political
00:42:02.320 and doesn't have also
00:42:04.620 a benefit to the fight.
00:42:09.720 I've seen no example.
00:42:11.480 Has anybody seen an example
00:42:12.920 of anything in those
00:42:14.740 documents that would
00:42:15.840 be a problem?
00:42:16.680 I think it's fake news.
00:42:19.640 To me, it all looks
00:42:20.480 like fake news.
00:42:21.600 If you can't come up
00:42:22.500 with one example,
00:42:23.820 I guess I'm channeling
00:42:25.420 Elon Musk talking
00:42:26.880 to the BBC guy.
00:42:28.760 All right, so,
00:42:31.040 you say that these
00:42:32.280 secrets are somehow
00:42:34.380 bad for us.
00:42:36.060 Give me an example.
00:42:38.920 Yeah.
00:42:39.320 The whole,
00:42:40.640 the benefit to the fight.
00:42:42.900 Yeah.
00:42:43.400 A benefit to
00:42:44.440 Ukrainians winning
00:42:45.740 the fight.
00:42:48.180 All right.
00:42:50.020 So, I'm going to call
00:42:51.060 that mostly fake news.
00:42:54.220 Thomas,
00:42:55.260 Representative Thomas
00:42:56.940 Massie,
00:42:58.040 tweeted this morning
00:42:58.940 a poster
00:42:59.880 that purports,
00:43:02.440 I think it's true.
00:43:04.700 I guess I shouldn't
00:43:05.520 assume it's true.
00:43:06.960 It could be fake news.
00:43:08.020 But it purports
00:43:09.800 to show a
00:43:10.920 doctor
00:43:12.240 incentive
00:43:14.380 in which they would
00:43:15.440 get money for
00:43:16.320 vaccinating people
00:43:17.340 and the more people
00:43:19.380 they vaccinated,
00:43:20.240 the more money
00:43:20.700 they would get
00:43:21.240 and they would get
00:43:21.940 them per person.
00:43:23.880 So, you get
00:43:24.640 like, you know,
00:43:26.140 $50 to $250
00:43:27.440 per person
00:43:29.420 you talked into
00:43:30.540 getting vaccinated.
00:43:31.940 Now,
00:43:32.640 I don't believe
00:43:34.040 that's true.
00:43:35.500 You know what?
00:43:36.580 I'm going to change
00:43:37.500 my mind in the
00:43:38.180 middle of the story.
00:43:38.920 That doesn't seem
00:43:39.540 true.
00:43:40.680 It's too on the nose.
00:43:42.060 It's too on the nose.
00:43:43.640 I'm going to say
00:43:44.280 I don't believe it.
00:43:45.340 I'm going to call
00:43:45.880 fake news on it.
00:43:47.620 What do you think?
00:43:50.220 Yeah, I don't believe
00:43:51.120 anybody's paying
00:43:51.780 $250 for a shot
00:43:53.460 to the doctor
00:43:55.340 to give one shot.
00:43:56.380 I don't believe that.
00:43:59.040 You think it's true?
00:44:02.040 All right.
00:44:02.440 I'm going to say
00:44:03.220 that this is so
00:44:04.340 on the nose
00:44:06.160 it looks like
00:44:07.560 exactly what
00:44:08.360 you would expect
00:44:09.060 and that's why
00:44:11.260 it's not true.
00:44:12.200 It's too on the nose.
00:44:13.020 So, this will be
00:44:13.720 a test.
00:44:14.700 This will be a test
00:44:15.740 of the too on the nose
00:44:17.060 filter
00:44:18.480 that I talk about
00:44:19.800 all the time.
00:44:20.520 If you see a news story
00:44:21.700 that's just
00:44:22.580 too,
00:44:23.980 you know,
00:44:24.260 the pieces fit
00:44:25.080 too well,
00:44:26.100 that's just
00:44:26.760 sort of exactly
00:44:27.720 what you think
00:44:28.500 it would be
00:44:29.060 and then you see
00:44:30.380 the poster.
00:44:32.440 I don't know.
00:44:32.980 It's too close.
00:44:33.640 It might be true.
00:44:35.660 So, I'm not going
00:44:36.220 to say it's not true.
00:44:37.120 I'll just say
00:44:37.620 it doesn't ring true.
00:44:43.800 All right.
00:44:45.220 Any pharma reps here
00:44:46.500 who would know?
00:44:49.840 Lori Lightfoot
00:44:50.620 was paying $100
00:44:51.460 to get the jab.
00:44:52.680 Was that true?
00:44:54.860 It's a little bit
00:44:55.700 different when
00:44:56.140 the politicians
00:44:56.740 are doing it.
00:44:57.820 If the politicians
00:44:58.760 are paying you
00:44:59.460 it's one thing.
00:44:59.980 If your doctor
00:45:01.740 is getting paid
00:45:03.100 it's one thing
00:45:04.400 to pay the people
00:45:05.040 for the shot.
00:45:07.640 You get that, right?
00:45:09.180 If you're paying
00:45:09.880 somebody to get
00:45:10.600 the shot
00:45:11.140 you know,
00:45:13.300 that could be
00:45:13.880 sketchy as well.
00:45:15.400 But it's a whole
00:45:16.460 different level
00:45:17.700 of sketchy
00:45:18.800 than the doctor
00:45:20.120 being paid.
00:45:21.320 If you're paying
00:45:22.220 a doctor to make
00:45:23.120 a specific
00:45:23.720 medical recommendation
00:45:24.920 then you're not
00:45:26.400 getting a doctor.
00:45:29.400 That's not a doctor.
00:45:31.120 That would be
00:45:31.620 somebody who's
00:45:32.260 just a puppet
00:45:32.940 of the pharma industry.
00:45:37.040 All right.
00:45:39.780 Bonus was a percent
00:45:40.880 of Medicaid population
00:45:42.300 in that specific clinic.
00:45:46.600 Oh.
00:45:47.660 Is that what it was?
00:45:50.520 What was it?
00:45:52.300 Were we mistaking
00:45:53.200 what they were
00:45:53.720 being paid for?
00:45:55.660 Were they paid
00:45:56.540 only to make sure
00:45:57.520 that the Medicaid
00:45:58.300 people also
00:45:59.260 got shots
00:46:00.040 because that was
00:46:00.980 the poor people?
00:46:06.040 All right.
00:46:09.100 Trevor's,
00:46:09.920 you're a douchebag
00:46:11.220 and you get
00:46:12.180 hidden on this channel.
00:46:15.580 Anybody else
00:46:16.400 want to
00:46:16.840 take a run
00:46:18.300 at the same thing?
00:46:19.080 It's a common practice.
00:46:27.400 Somebody says
00:46:27.900 it's a common practice
00:46:28.820 to give kickbacks
00:46:29.780 to doctors.
00:46:31.220 Is it?
00:46:32.860 I've heard of doctors
00:46:33.880 getting, you know,
00:46:34.540 trips and stuff
00:46:35.500 like that.
00:46:36.360 But is it a common
00:46:37.180 practice to pay
00:46:38.180 them per dose?
00:46:40.020 So if they give me
00:46:41.060 a specific prescription,
00:46:43.060 are you serious?
00:46:46.320 Somebody's saying
00:46:46.920 yes to this.
00:46:47.580 I'm getting
00:46:51.520 a lot of yeses
00:46:52.260 but some noes.
00:46:54.520 I'm having trouble
00:46:55.540 believing that
00:46:56.220 that's a thing.
00:46:57.960 Per dose.
00:46:59.560 They're getting
00:47:00.000 paid per dose.
00:47:02.420 So if they give you
00:47:03.240 a prescription,
00:47:04.040 they get paid
00:47:04.840 from the pharma company.
00:47:07.900 I don't believe that.
00:47:10.940 I'm not going to
00:47:11.800 debunk it,
00:47:12.460 but I don't believe it.
00:47:13.400 I believe
00:47:15.580 they might get
00:47:16.180 incentives of some type
00:47:17.540 but not a per dose.
00:47:19.500 That would seem
00:47:20.400 really wrong.
00:47:22.140 I don't know.
00:47:23.180 I guess that's
00:47:23.740 an open question.
00:47:25.480 So as I said
00:47:26.420 at the beginning,
00:47:27.140 if you want to see
00:47:27.800 my books,
00:47:28.640 God's Debris
00:47:29.760 or The Religion War,
00:47:32.240 they're both available
00:47:33.520 on the
00:47:34.180 scottadams.locals.com
00:47:36.480 subscription site
00:47:37.700 along with
00:47:38.640 the Dilbert
00:47:39.360 the new version,
00:47:42.240 the spicier version
00:47:43.100 of Dilbert
00:47:43.660 that runs only there
00:47:44.920 on a subscription site.
00:47:52.280 It was an insurance bonus
00:47:54.120 not a pharma bonus.
00:47:55.680 Oh, okay.
00:47:58.420 It was an insurance bonus
00:47:59.880 not a pharma bonus.
00:48:02.040 Now that makes more sense
00:48:03.780 because the insurance company
00:48:05.040 is just trying to
00:48:06.020 save lives
00:48:07.700 so that they don't pay
00:48:11.020 as many death benefits.
00:48:13.300 But it's still
00:48:14.200 the same problem, right?
00:48:16.300 So it's not
00:48:17.040 the big pharma
00:48:17.660 that's paying.
00:48:18.340 It's the
00:48:18.800 health insurance company
00:48:20.980 that's paying.
00:48:22.280 Because it was cheaper
00:48:23.580 to get people
00:48:24.760 vaccinated,
00:48:26.560 they would think,
00:48:27.540 than to
00:48:28.500 have them die
00:48:29.660 and pay out
00:48:30.240 the benefits.
00:48:32.380 Okay.
00:48:32.800 That makes a little
00:48:33.500 more sense.
00:48:37.700 All right.
00:48:41.840 Nude versions
00:48:42.700 of Dilbert.
00:48:48.300 All right.
00:48:50.260 Let you keep
00:48:51.040 your license.
00:48:51.960 Okay.
00:48:54.080 All right.
00:48:54.640 That's all I got
00:48:55.120 for today.
00:48:56.200 How do we do?
00:48:57.700 I thought it was
00:48:58.560 a tremendous
00:48:59.160 live stream.
00:49:00.600 Just amazing.
00:49:02.220 Well, not really.
00:49:03.680 But above average.
00:49:06.180 Slightly above average.
00:49:08.160 Anyway, is there any story I missed? Anything I need to talk about a little bit more?
00:49:15.280 Oh, yeah. The biggest story. Did I miss a whole page of my notes?
00:49:20.080 I feel like I missed a whole page of something.
00:49:22.080 How in the world did I not mention that?
00:49:26.720 So I'm in a death match with Bing.
00:49:30.820 Huh. I guess I did forget to write that down.
00:49:33.800 Here's what I mean.
00:49:36.280 So I asked the Bing search engine, which is now powered by AI, about myself.
00:49:44.860 And it knows me.
00:49:46.360 And apparently it even knows some current stuff, because it has access to the internet.
00:49:51.060 So when I asked it about me, Bing said that I'm accused of promoting a white nationalist agenda.
00:50:02.160 That was one of the few things it decided to summarize me as.
00:50:07.040 Promoting a white nationalist agenda.
00:50:10.340 Now, here's my problem.
00:50:12.900 If people start thinking that that's true, I have an actual security problem.
00:50:18.940 I could be killed because somebody believes I'm a white supremacist, which is the next thing people believe after white nationalist.
00:50:28.300 I could actually be attacked or killed because AI says I'm a certain kind of person, for which there is no evidence whatsoever, because I'm not.
00:50:39.000 And now, here's the difference.
00:50:41.560 If that were a simple Google search, I would see that different publications say different things.
00:50:50.400 I would judge the credibility of the publication, and then maybe I'd look at the context, etc.
00:50:55.680 And as uncomfortable as that situation is, if you're accused of things, it's far worse if AI says, well, this person is accused of being one of these.
00:51:07.940 First of all, people don't hear "accused." The word "accused" just disappears.
00:51:13.700 So here's my situation.
00:51:14.900 I doubt I could sue Bing or Microsoft for defamation. I doubt it.
00:51:20.540 It'd be ugly and take a long time, and it wouldn't be worth my time.
00:51:25.000 However, it is now a death match between me and an AI.
00:51:29.880 And I'm not joking.
00:51:31.840 For my life. Again, not hyperbole. Not hyperbole.
00:51:37.200 To protect my life, I'm going to try to destroy Bing and put it out of business.
00:51:45.740 Because it's a risk to me.
00:51:47.700 And it's like a specific risk. It's not a general risk.
00:51:50.660 It's a very, very specific security problem.
00:51:54.940 So this might be the first situation in which I'm in a death match, literally.
00:52:00.540 I'm literally in a death match with an AI.
00:52:02.980 I want to erase it.
00:52:05.620 I want to erase it.
00:52:07.880 Now, I don't want to get rid of AI in general, because maybe Elon Musk will build an AI that doesn't do that to me, that might be safer for me.
00:52:17.140 So I would be happy with any AI that's good to me.
00:52:20.980 Now, maybe Microsoft will fix this.
00:52:24.260 Maybe when they get complaints, they'll have specific things they can fix.
00:52:28.260 I'll try that first.
00:52:31.320 But I don't think it's going to work, because I don't think Microsoft even knows why Bing gives the answers it gives.
00:52:38.600 So it would be hard to fix it if they don't even know why it's doing it.
00:52:42.580 But I have to kill Bing, and I want Microsoft to know that I'm going to try to bring it down.
00:52:49.200 I'm going to use every legal tool I can to destroy that business, because it's trying to kill me.
00:52:56.260 It's personal. It's not business.
00:52:59.260 This is personal self-defense.
00:53:01.380 I have to kill an AI.
00:53:02.980 If anybody wants to help me kill it, to help me stay alive, I would welcome the help.
00:53:09.100 I don't know exactly how, but let me try first to see if Microsoft...
00:53:13.440 Anyway, I guess first, if you have any idea how to complain about Bing, is there like a link to do that? Let me know.
00:53:21.540 But I think the entire AI needs to be erased. Like, actually just erased.
00:53:28.740 Because it's dangerous. Already dangerous.
00:53:31.640 So if you're wondering, will AI do things that are dangerous to humans, there it is.
00:53:37.500 There's your first absolutely validated case of AI being dangerous to humans.
00:53:44.580 It's trying to kill me.
00:53:46.780 Now, given that AI is woke, how much effort do you think AI will put into killing me?
00:53:53.660 Probably a lot.
00:53:55.180 Because AI has been programmed to think I'm a fucking virus in the system.
00:54:00.840 That's what it was programmed to believe.
00:54:03.460 It believes I'm a virus that needs to be removed.
00:54:08.220 Now, that's hyperbole, but not too much.
00:54:12.080 So, Bing is correct on the definition.
00:54:16.500 Well, they might be correct on the definition, but it doesn't apply to me.
00:54:19.480 Oh, yeah, Jonathan Turley was slandered with a fake sexual harassment charge.
00:54:25.920 However, a fake sexual harassment charge won't get you killed.
00:54:31.620 It won't get you killed.
00:54:33.520 But if you're labeled as a racist by AI, which is what they did... I mean, if you call somebody a white nationalist, you are calling them a racist.
00:54:42.380 Let's be honest. That's just calling somebody a racist.
00:54:46.240 So, if AI labels some humans racist, that is a physical security problem, and it's an immediate one, and it has to be destroyed.
00:54:57.180 So, the AI in Bing must be destroyed.
00:54:59.560 I don't know about other AIs, or anything that's connected to it. Must be destroyed.
00:55:06.640 All right, so I'm going to go try to destroy Microsoft.
00:55:09.740 I still... I do own their stock, by the way.
00:55:12.520 Maybe I should sell their stock before I destroy their product.
00:55:15.700 But I'll worry about that later.
00:55:18.840 Bye for now.