Real Coffee with Scott Adams - May 05, 2023


Episode 2099 Scott Adams: Tucker's Rumored Plans, Climate Surprise, Bud Light Lessons, Proud Boys


Episode Stats

Length: 1 hour and 26 minutes
Words per minute: 140.7
Word count: 12,164
Sentence count: 1,010

Harmful content

Misogyny: 10 sentences flagged
Hate speech: 20 sentences flagged


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In this episode of the show, we talk about how smart this audience is, poll numbers, climate change, and why men have more fight left in them than we think. Featuring: Scott Adams.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
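For readers curious how per-sentence flags like these are produced, here is a minimal sketch using the Hugging Face transformers pipeline with the two checkpoints named above. The label names and score handling are assumptions; consult each model card before relying on the output.

    # Minimal sketch: score sentences with the two checkpoints named above.
    # Label strings differ per model card, so the printout is illustrative.
    from transformers import pipeline

    misogyny = pipeline("text-classification", model="MilaNLProc/bert-base-uncased-ear-misogyny")
    hate = pipeline("text-classification", model="facebook/roberta-hate-speech-dynabench-r4-target")

    for sentence in ["An example sentence from the transcript."]:
        m = misogyny(sentence)[0]  # e.g. {'label': ..., 'score': 0.97}
        h = hate(sentence)[0]
        print(f"{sentence!r} -> misogyny {m['label']} ({m['score']:.2f}), "
              f"hate {h['label']} ({h['score']:.2f})")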
00:00:00.000 Good morning everybody and welcome to the highlight of civilization. It's called
00:00:09.720 Coffee with Scott Adams. You've never had a better time. You don't know that yet,
00:00:13.860 but just hold on. Hold on. Stay alive. It's going to be amazing. We've got so many topics
00:00:19.620 and such. You're going to be laughing. You're going to be crying. It will become a part of you.
00:00:24.880 And if you'd like that experience to be, well, even better than I just explained,
00:00:30.220 all you need is a cup or mug or glass, a tankard, chalice, or stein, a canteen jug or flask,
00:00:35.640 a vessel of any kind. Fill it with your favorite liquid. I like coffee. Join me now for the
00:00:42.140 unparalleled pleasure of the dopamine hit of the day, the thing that makes everything better.
00:00:46.380 It's called the simultaneous sip, and it's going to boost your dopamine, your oxytocin,
00:00:50.800 and maybe your serotonin, too. And it goes like this. So good. All right, let's talk about the news.
00:01:07.260 Today, we're going to start with a theme. The theme is, wait for it, how smart this audience is.
00:01:16.280 You're going to be amazed at yourselves. Stop it. Stop it. You're answering the questions
00:01:21.640 before I even ask them, how do you do it? But we're going to take it out of the realm of
00:01:27.140 just polling to see if you can generalize this ability to other things. Not just polling,
00:01:33.960 because I know you can guess poll answers miraculously. If you don't believe that this
00:01:39.560 audience can guess the result of a poll without seeing it, I'm going to demonstrate it now.
00:01:46.280 Watch. Rasmussen did a poll, a brand new poll, talking about Biden's age. And what percentage
00:01:55.360 of the respondees, who were likely voters, don't think Biden's age poses a serious problem for his
00:02:03.180 re-election chances? What? How did you do that? You're all correct. It's 25%. 25%. Wow. All right.
00:02:15.980 But that was an easy one. That was an easy one. Don't get too cocky. I want to see if you can do this
00:02:22.360 next one. Okay. The news is reporting that the ocean has warmed up considerably. Very unusual amount
00:02:32.980 of warming somewhat suddenly. Here's your next question. Did the climate models predict this
00:02:40.300 recent warming? How did you know that? No, they didn't. How did you know that?
00:02:50.380 Well, how do you do this? How are you always right? Yes, the scientists are baffled because their models
00:02:58.480 did not see this coming. All right. So that's happening. Let's see. There was a top prosecutor
00:03:11.060 who quit in St. Louis. So this is one of your progressive prosecutors who was getting a lot
00:03:16.780 of pushback for being soft on crime, according to her critics. So this is Ms. Gardner, Ms. Gardner.
00:03:26.140 And she said she's going to quit. The pressure from the public and politicians was too great.
00:03:31.600 And she's out of there. Now, interesting side note. Ms. Gardner was the city's first black circuit
00:03:40.640 attorney. Here's the question for you. All right. This one's going to be tougher. Each of these
00:03:47.220 questions will go harder than the one before. Did she claim that the real problem was racism against
00:03:55.060 her? Go. You're three for three. How? How do you do it? You've got to teach me how you do this.
00:04:05.520 Every one of these stories was a complete mystery to me. I had no idea how many people would think
00:04:10.220 Joe Biden's age had no difference whatsoever, but it was 25 percent. And then I didn't see this coming.
00:04:17.120 These are all surprises to me. All right. Let's talk about Bud Light. So apparently the pushback
00:04:29.220 against the Anheuser-Busch company was pretty, pretty strong. I didn't realize how strong it was.
00:04:36.040 But some of it is not advised, including a bomb threat on one of their facilities. I guess the
00:04:43.460 Anheuser-Busch is giving bonuses to their drivers because the delivery people are losing money
00:04:49.800 because there's not as many orders of Bud Light, I guess. Two top executives lost their job
00:04:56.740 over the pushback because they made the special can for Dylan Mulvaney, trans activist, I guess
00:05:05.000 you'd say, or a trans celebrity or something. And so I guess this raises
00:05:13.460 another question. Do men have any fight left in them? Yes. Turns out men have some fight
00:05:27.660 left in them. Were you worried? Did it seem like men just gave up? Because I feel like we've
00:05:34.060 gone through a period where men just said, well, fuck it. You can have whatever you want.
00:05:37.860 All right. Sure. Just, just leave me alone. All right. You can have that. Yeah. Yeah. Yeah. Yeah.
00:05:42.660 Yeah. I don't like that, but you can do that. Yeah. Yeah. Yeah. Okay. You can take something from
00:05:46.860 me. Yeah. I don't mind. Yeah. The 2023, I'm going to declare as the fuck you year. This is the year
00:05:56.440 that men said, fuck you. This is the year that men said, we're going to put the brakes on. It went too
00:06:02.600 fucking far. Too far. I will quit my fucking job to make this point. Tucker Carlson might have some
00:06:10.360 things to say. Bud Light is finding out that they pushed too far. Just too far. It's too fucking
00:06:17.700 far. Congratulations. All right. Here's another one. See if you can guess. There's a new study about
00:06:27.740 test scores. Do you think? All right. So there's a new result about test scores for eighth grade
00:06:35.380 students in the US. You haven't seen the story probably, but see if you can guess. Did scores
00:06:44.020 improve or get worse? Improve or get worse? You did it again. They got worse. It's amazing.
00:06:56.000 I don't even think we need AI anymore. Clearly, you're all advanced intelligence because you know
00:07:03.460 the news before it is even reported. How do you do it? Yeah. It's way worse. Now, one of the things
00:07:10.260 that is not reported in this, but I can throw this in the mix from my own experience. Now,
00:07:16.820 this may be a California thing more than other places, but remember California and some of the
00:07:23.220 southern states, Florida, for example, even New York, big immigration destinations, and there are
00:07:30.180 big population centers. Now, I can tell you that in the local schools where I live, there are a number
00:07:36.980 of new immigrant students who don't speak English. You know, they're trying to learn English at the
00:07:42.420 same time they're trying to go to school. Now, obviously, they're not going to do as well.
00:07:46.060 Obviously, right? Through nobody's fault. Well, maybe Biden's fault. But I don't, I didn't see that
00:07:53.240 calculated into the news. But I can verify that at least locally, it's a big drag on the average
00:08:00.900 scores. Why would you leave that out of the reporting? Now, I don't know if that accounts for
00:08:07.000 a little of it or a lot. I don't know. But it's very obviously in effect. If you put a bunch of
00:08:13.620 people who can't speak English into English-speaking classes, what are you going to expect? It's going
00:08:21.340 to be exactly what you think it is. Now, that doesn't mean there's any problem. It actually
00:08:26.980 doesn't. Because the immigrants tend to be good citizens. Maybe it takes a generation before they
00:08:33.700 catch up. But they're very intent on catching up. The people I don't worry about are the people who
00:08:40.480 went to school, started from behind, but have every intention of improving their lives. Give me all of
00:08:47.500 those. I'll take all the people who are very intent on improving their lives and willing to make great
00:08:54.160 risk and sacrifice to do it. I like that. Give me some more of that. It is very expensive in the short
00:09:00.900 run. But they're good people who are going to add a lot. So that's my opinion. All right.
00:09:10.080 Glenn Beck had an interesting comment about the, I guess, the firing of Tucker Carlson, although he's
00:09:17.060 technically not fired. He's just not on the show. And I didn't know this before, but BlackRock,
00:09:23.000 BlackRock, the enormous financial entity, is the big promoter of ESG. So ESG would be equity,
00:09:36.840 racial equity, other kinds of equity as well, and also environmental stuff. Now, what I didn't know
00:09:45.660 is that BlackRock owns 45 million shares of Fox News. Is that a lot? 45 million shares? Is that enough
00:09:56.060 to influence Fox News? Because what Glenn Beck left off is what percentage of the company that is.
00:10:03.140 Is 45 million a lot? Would that give them some kind of control over management? I don't know. I actually
00:10:11.900 don't know the answer to that. I don't know if that's a lot or not. I mean, it's a lot in raw
00:10:16.300 money. I don't know if it's a lot in terms of control of the company. Okay. I knew this wouldn't
00:10:21.360 take long. Some people are doing the math already. I have such a smart audience. So people are saying
00:10:27.780 15%, 9%, somewhere in that range. That's somewhere in that range is definitely where they can start
00:10:36.560 influencing management. Would you agree? If you own that much stock, you probably do have some
00:10:41.760 little bit of control. No, I haven't mentioned Tucker's plans yet. So does this seem like a
00:10:52.660 coincidence to you? Because Glenn Beck is surfacing this, and I think this is really good journalism,
00:10:59.600 if I could call it that. Because this was what I needed to know. Like, this is right on point
00:11:06.400 of understanding what's happening. Now, it might not be the big variable. So I'm not saying it's the
00:11:12.520 one thing that caused things to happen. I'm just saying that if you were BlackRock, and the biggest
00:11:19.420 problem you had with your ESG was what? What was the single biggest problem that BlackRock had?
00:11:26.760 Tucker Carlson. He was their single biggest problem. Because he was, on a nightly basis,
00:11:35.560 injecting skepticism into climate and ESG. And he was the only one brave enough, you know,
00:11:43.460 to walk out on that limb. So what did BlackRock do? Fairly recently, I don't know how many months
00:11:51.240 or years ago, but fairly recently, they bought up all this stock in Fox News. And then Tucker Carlson
00:11:58.240 went away. Is that a coincidence? The entity which he criticized the most bought an interest in his
00:12:09.660 company, and then he was let go.
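A back-of-envelope check on those 9-to-15-percent guesses, where the 45 million shares is the figure cited in the episode but the Fox shares-outstanding total is an assumed round number, not from the episode:

    # Rough ownership math only. The shares-outstanding figure is an
    # assumption for illustration, not a verified count.
    blackrock_shares = 45_000_000
    fox_shares_outstanding = 500_000_000  # assumed round figure
    print(f"{blackrock_shares / fox_shares_outstanding:.1%}")  # -> 9.0%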
00:12:11.260 I don't know. I mean, I think you could make an argument that is not related. You can make the
00:12:19.700 argument. But follow the money is just so predictable. You know, I'll remind you of this again.
00:12:30.400 The weird thing about saying follow the money, and I write about this in my upcoming book,
00:12:36.360 is that it works when it shouldn't work. Like your logic says, okay, I understand money is part of
00:12:43.140 the story, but there's no way that the money is driving the story, because there's so many other
00:12:49.060 bigger elements than money, right? There's the saving the world, and the future of the planet, and
00:12:54.760 the children, and yeah, there are much bigger interests than just the money.
00:12:59.000 But why does the money always predict what's going to happen if it's not the biggest variable?
00:13:05.840 I don't know why. It's just an observation over the course of my life that people will argue any
00:13:12.940 lofty ideals, but then if you want to predict where it's going to end up, just look at their wallet.
00:13:20.040 It's all there. It goes where their wallet goes. People follow their wallet.
00:13:24.140 They don't talk like it, but they do. So I'm not going to go so far as to say that
00:13:30.240 this is the whole explanation of the Tucker Carlson thing.
00:13:37.700 Oh, I'm just getting news that BlackRock has bought Locals. No, that's not true.
00:13:43.040 I just saw a comment there. That's not true. But it'd be funny if it happened.
00:13:47.500 All right. So I'd keep an eye on that. There is reporting rumors, I say, just rumors, because
00:13:56.420 I'm not sure that Tucker Carlson's made any decisions. He's probably just examining his
00:14:02.640 options. But the rumors say that he's considering hosting some kind of forum for the GOP presidential
00:14:10.760 candidates, or maybe just candidates, not necessarily just presidential.
00:14:14.300 And what would you think of that? Suppose he does an interview show in which he focuses
00:14:23.380 on Republicans, because why would he have to focus on Republicans? Because the Democrats
00:14:29.660 won't show up. I'm sure he's going to invite them, don't you think? But realistically, they're
00:14:37.120 not going to go. Now, what if he could get some? You know what would be interesting? I'll
00:14:45.340 bet he could get AOC. I'll bet he could. Do you know why? I saw an article on this that
00:14:54.860 it was a, I forget where it was, Politico maybe, or The Hill, someplace. But the article suggested
00:15:01.800 that Tucker's kind of an outlier, because he does things that are clearly popular with
00:15:07.900 progressives, which is, you know, doubting the elites. So Tucker's pushback on the elites
00:15:14.700 and the corporate power centers is very similar in kind to the progressives who also want to
00:15:21.820 push back on the elites and the power centers and the, you know, the Wall Street people.
00:15:25.560 So he has a weird Venn diagram. Venn diagrams. I love Venn diagrams. So he has a weird intersection,
00:15:36.180 at least with his Venn diagram with some progressives. Enough, enough possibly, that somebody like AOC
00:15:43.260 would say, yeah, I'll come on there and talk about how big companies are bad. He might actually
00:15:49.360 get progressives to go on the show. Now, I don't think he could have done that on Fox News,
00:15:54.580 as readily. Because imagine if Tucker platformed a bunch of Democrats onto his show on Fox News.
00:16:05.360 The audience would have revolted. They would put up with a little bit of it, as long as he was tough
00:16:11.680 on them. But they wouldn't put up with it as a regular event. But maybe now he has that option. I don't
00:16:17.540 know if he's building his own audience. Apparently, he totally controlled the, you know, 25 to 55 demographic.
00:16:27.260 So that'll be interesting. And apparently he's talked to President Trump about hosting, some say, a debate.
00:16:34.380 But I don't know about that. Would you watch a debate between, let's say, Vivek Ramaswamy and Trump,
00:16:45.200 hosted by Tucker Carlson? Hell to the yay. Yeah. How could you not watch that? Are you kidding me?
00:16:56.420 Now, do you think that Trump would agree to it? Because Trump has some risk there. He has some risk.
00:17:10.080 Yeah. I don't know. It would be a tough call. So here's what Trump would have to decide.
00:17:18.280 On one hand, he loves publicity, and it would be the biggest event of the year. Am I wrong? I think
00:17:28.700 it'd be the biggest event of the year. News event of a, you know, scheduled program. So how does Trump
00:17:35.680 avoid the biggest news event of the year? Because at the moment, Tucker is wildly popular with his
00:17:42.900 base, wouldn't you say? And we know that Tucker's been tough on Trump, because we've seen his private
00:17:50.800 messages. But we've also heard that they've lately gotten along. And both Tucker and Trump are an odd
00:17:59.140 kind of person. There's one thing they both have in common. See if you'd agree. Now, I don't know them
00:18:05.380 well enough to know that this is true. But from the outside, from the outside, it appears they have
00:18:11.440 this in common. They can talk to anybody. Am I right? Tucker and Trump can talk to anybody.
00:18:24.280 And they would treat them with respect until, you know, something changed, of course. And they would
00:18:30.860 actually listen to them. Actually fully, fully, you know, give respect to their opinion before
00:18:38.660 disagreeing, if they did. But they're very unusual like that. There are two people who can walk into
00:18:45.740 the lion's den, and the lion has to worry. You know what I mean? I'm going to say that again,
00:18:52.720 because I liked it so much. Tucker and Trump are two people who can walk into a lion's den,
00:18:59.800 and the lion is the one that gets nervous. It's very unusual. There are not a lot of people who
00:19:05.060 would, you could describe that way. All right. So I think they're a natural pair, at least for
00:19:11.500 publicity and for career reasons. So I'd watch it. All right. New York Times and others reporting that
00:19:19.780 there is a surprising number of new jobs. So the economists were expecting things not to be so good,
00:19:28.720 maybe the economy cooling down. But instead of cooling down, as the Fed would like it to cool down,
00:19:34.520 the reason the Fed raises rates is to slow down the economy. The reason they slow down the economy
00:19:40.880 is to keep inflation, you know, capped at some level. Does everybody understand that? I know I
00:19:48.060 have the smartest audience in the world. But I always think that whenever the Fed comes up,
00:19:53.560 80% to 90% of the audience goes, I'll wait till this is over, because I don't know what the Federal
00:20:00.040 Reserve is. I don't think anybody knows what the Federal Reserve is. I have a degree in economics,
00:20:07.900 and I still look at it and go, I don't know what this is. I honestly don't know what it is. I could
00:20:13.280 tell you what they do. I could tell you their function. But it all just seems hard to understand.
00:20:21.360 Even if you know what they do, it's hard to understand. It's weird. So it makes me very
00:20:28.000 suspicious of the whole thing, but I don't have any specific accusations. Anyway, here's your next
00:20:35.920 question. So the number of jobs was 253,000 in April, which was a pretty big upswing. Nobody expected it.
00:20:44.300 Question. Will the job numbers be revised downward in future months? Will the numbers be revised
00:20:54.620 upwards or downwards? You're right again. Now, I can't see the future, but generally they get
00:21:02.380 revised down. And that's been the pattern lately. So we'll wait. I've got a feeling that you're going
00:21:08.680 to be brilliantly on point once again. I think you nailed it. But here's the good news.
00:21:18.980 America probably has some of the top economists in the world. I'm not saying the best in the whole
00:21:25.320 world, but among the best economists in the world would be Americans. How many of those Americans
00:21:31.740 correctly estimated that employers would be adding jobs in April? Well, not too many. Not too many.
00:21:42.760 25%. No, I don't know the number. I don't know the number, so I can't check that. But here's what's
00:21:49.800 great. All those professional economists, they could not predict employment one month in the future.
00:21:56.780 But you can thank your lucky stars that you live in a country, or at least a civilization,
00:22:06.620 where our scientists can predict the temperature in 30 years. Boom. How about that? Yeah, economists,
00:22:16.460 they're way overwhelmed. They cannot predict even the most basic variable of the economy one month out.
00:22:23.900 You know why? It turns out the economy is a really complex system. Very complex. Unlike
00:22:32.540 climate change. Climate change is pretty simple. Just look out the window. Looks like it's warm out
00:22:39.360 there. Boom. You're done. No, I'm kidding. They're both complex systems. But one of them, we can't guess
00:22:47.520 at all, even next month. But the other one, lucky us. Lucky us. We can predict that mofo three decades
00:22:57.780 in advance. Boom. So that's all completely believable and makes sense to me. All right. I'm going to give
00:23:06.880 you a tale of two coups. Tale of two coups. And I want you to see if you can predict which of these two
00:23:14.220 entities got prosecuted. One of them was Anthony Blinken and the 50 intel professionals who signed the
00:23:25.260 letter saying that the Hunter laptop was probably Russian disinformation. All right. So that's the first
00:23:34.400 coup attempt. Because people who know what they're talking about and say that probably did affect the
00:23:39.140 vote. Because polls do say people would have voted differently. So that's one. And the second one is
00:23:46.900 the Proud Boys, a group that's a bunch of racists who are led by a black guy. I don't know.
00:23:56.900 We'll talk about that. But seems weird. And so there, I guess there's a result in their trial
00:24:06.140 for January 6. So was it, let's see if you can guess this one. Was it Blinken and the 50 intel
00:24:14.180 officers who got prosecuted for their coup attempt to influence the election? Or was it the Proud Boys,
00:24:21.720 a racist group led by a black guy for reasons that people can't understand?
00:24:26.900 Oh, you're right. You got it again. It's the Proud Boys. Well, now that was just a 50-50 jump ball. So
00:24:33.040 maybe you got lucky on this one. Yeah. I'm not going to give you this one. You might have gotten lucky
00:24:39.020 because it was just a 50-50. Even though every single one of you got it right, which is good work.
00:24:46.980 So yes, I was reading in the press how they were trying to explain that the Proud Boys were this
00:24:56.220 alleged white supremacist organization. How did they explain that their leader identifies as Afro-Cuban?
00:25:08.360 How did they explain that? Well, let me give you a little bit of a... Oh, first you should know that
00:25:18.500 the defense of the Proud Boys offered the following defense. Apparently they had planned a concert for
00:25:27.200 their members on the night that January 6 happened. So they actually had a plan, a written, organized plan
00:25:37.320 for something that was completely different than attacking on January 6. Their plan was not to attack.
00:25:44.780 Their plan was to have a concert. And then things changed when Trump said he was going to go down
00:25:51.040 there. And then things turned dark. So there were definitely some people who did some bad
00:25:55.920 things who need the Department of Justice to handle them. No doubt about that. But the narrative
00:26:02.160 that they were all part of this big conspiracy, nope. Turns out that it was more like they planned
00:26:10.140 a concert. But things went a different way. So I saw that Jack Posobiec was saying that it seems clear
00:26:19.640 now that the bad guys are going to try to get Trump for seditious conspiracy, because that's what they
00:26:27.100 got the Proud Boys for. Seditious conspiracy. You know what that is, right? Seditious conspiracy.
00:26:34.120 It's almost like the public doesn't even know what that means. So it's like, it's safe to use that.
00:26:40.900 All right. Here's CNN trying to explain the Proud Boys and how they have this Afro-Cuban leader.
00:26:50.360 And they have other people of color. So he's not the outlier. There are other people of color who are
00:26:56.820 members of the group. And CNN is trying to explain their own reporting, which is this big racist
00:27:03.940 group. But they're trying to explain why they keep reporting they're all racist. At the same time,
00:27:09.860 they're reporting that an Afro-Cuban guy is their leader, and nobody has any problem with that
00:27:14.440 whatsoever. Do you think that's hard to explain? It's a little hard to explain, isn't it? It's a little
00:27:23.780 bit hard to explain. But I'll tell you how they did it. Let's see. See if you can find
00:27:33.580 something that looks like word salad. Because some would say that this is a perfect setup for
00:27:39.800 cognitive dissonance. Now, cognitive dissonance happens when you have a firm opinion of something.
00:27:46.460 Let's say your opinion is that the Proud Boys are a racist organization. And you're sure that's true.
00:27:51.260 But then there's a fact which clearly refutes it, which is they have people of color in it,
00:27:57.420 and their leader identifies as Afro-Cuban, and everybody's fine with that. So how do you explain
00:28:03.980 these two things? Well, normally that would trigger you into cognitive dissonance. And a tell for that,
00:28:10.440 or a signal, would be that if you start talking in word salad, big words kind of put together to
00:28:17.320 form sentences. But when you're done, you're not sure what it said. So let's see if CNN did that.
00:28:24.780 I'm going to read you how they explained this dichotomy of this racist group with an Afro-Cuban
00:28:30.420 leader. Here's a sentence I took from their reporting. The story of how a self-described
00:28:37.760 Western chauvinist organization came to be led by an opportunistic Afro-Cuban, as Tarrio identifies,
00:28:46.040 reveals the way, here it comes, misogyny, violence, perceived grievance, and mainstream political
00:28:52.700 connections coalesced within an elastic extremist group and across the extreme fringes of the far
00:29:02.460 right. Was that word salad? Or was that like a real good description of what's happening? No,
00:29:12.240 it's word salad. Yeah, that's a classic word salad. This is cognitive dissonance in the news.
00:29:19.260 Now, how many of you would have recognized that? Would you have recognized the word salad
00:29:23.280 without me pointing it out? You're becoming trained, right? I've trained a lot of you to
00:29:30.240 recognize it. You should have seen this one in the wild without me spotting it. All right,
00:29:35.520 but it gets better. So this is from CNN also. Quote, in the Proud Boys early days, founder Gavin
00:29:46.320 McInnes made his views on white supremacy. All right. So then what follows, what follows this
00:29:53.980 reporting. So here's the setup. Founder Gavin McInnes made his views on white supremacy.
00:30:01.240 White supremacy. Okay. So what follows should be support for that opinion, right? So here's
00:30:07.380 their support for that opinion. His views on white supremacy in the group, very clear. Now,
00:30:11.980 keep in mind that I'm pretty sure that Gavin McInnes never referred to white supremacy,
00:30:17.300 at least in this context. So this is CNN's word they're putting on him. So they're assigning
00:30:24.100 him the label white supremacist, and now they're going to back it up. Okay, here's how they back
00:30:29.180 it up. In the group, he made it very clear that, quote, anyone can join up. That was Gavin
00:30:36.640 McInnes. Anyone can join up. So it's not a racial thing at all. Anyone can join up. So
00:30:43.560 long as they generally agree that, quote, white men were overwhelmingly responsible for the success
00:30:50.000 of Western culture. Is that white supremacy? Or is that exactly what black Americans say?
00:31:01.520 Don't black Americans say, hey, white people did all these things, but partly on the back of
00:31:06.960 black Americans and slavery? Doesn't everybody agree that the West was primarily built by white men?
00:31:16.680 You don't have to like it. And you don't have to denigrate anybody else's accomplishments,
00:31:23.500 right? White men didn't build Egypt. They didn't build Africa. They didn't build China.
00:31:29.060 You know, China's doing okay. India is doing great. You know, white men didn't build India and didn't
00:31:36.000 build China. But is it true that where there were a lot of white men, they were largely the main
00:31:43.620 contributors to the success where most of them were white men? How would that, how would that not be
00:31:51.280 true? Would this be any different than if Gavin McInnes had said, I need you to understand that if you
00:31:58.660 live in China, you're going to have to understand that the accomplishments of the Chinese people
00:32:04.200 was mostly Chinese people, mostly Chinese. India, India is really becoming one of the great powers of
00:32:13.740 the world, have been a great country for many, many, you know, centuries. Mostly, mostly Indians.
00:32:23.960 Yeah. I mean, they had some help from Great Britain, but you'd have to say most of the work was done by
00:32:28.520 Indians. So is that white supremacy to say that where there were lots of white people and things
00:32:38.180 went well, it was mostly white people who did it? And then he makes it male, because in those days,
00:32:46.380 the men were leaving the house and the women were having the babies and stuff. Now, that doesn't mean
00:32:51.180 that the women part was unimportant, because without the women, there would be no white men to do
00:32:57.160 anything. So, obviously, that's, like, you know, critically important to the whole process.
00:33:04.620 I don't know. I don't think CNN made their case. And I'm not, you know, I know this gets turned into,
00:33:10.720 I'm defending the Proud Boys, or I'm defending somebody. I'm not defending anybody. All right.
00:33:18.300 I have no intention of defending any group. I'm just explaining that if you're going to make an
00:33:24.140 accusation, and your best accusation is that they ask people to acknowledge that history was written
00:33:30.440 correctly, is that white supremacy? Because I'm perfectly willing to admit that slaves in early
00:33:44.060 America were hugely important to the economic engine of America. What's wrong with that? It's just an
00:33:51.380 observation. It doesn't say they're better than anybody. It doesn't say anything. It's just an
00:33:56.760 observation. That where you have a lot of people of the same type, if there's a success, it's probably
00:34:02.860 because of those people. It just doesn't mean anything. It just means there's a lot of white
00:34:08.460 people in this part of the world, a lot of Chinese people in China, a lot of Indians in India.
00:34:13.540 Okay. All right. New York State is outlawing gas stoves for new construction with some limitations
00:34:27.580 and exceptions. As others have pointed out, do you know what is the main source of electricity
00:34:34.600 for these new electric powered ovens? It's mostly natural gas. Now, I don't think that the natural gas
00:34:46.920 in a big gas power plant is as unclean or dangerous as gas is in your house, because part of it has to
00:34:55.000 do with indoor pollution. I am concerned about indoor pollution. So part of the anti-gas stove thing is
00:35:04.200 that they off-gas even when they're off, and they give off a little stuff you don't want to breathe.
00:35:09.820 So I don't love that. Don't love that. But it does seem like the government's a little too involved
00:35:16.400 in our lives, because I do like my gas range. So I'm surprised that New York is ahead of California on
00:35:25.300 that. All right. Here's the biggest news in the world that you don't know is the biggest news in the
00:35:30.480 world, but you will. You know how we're worried that AI will get out of control?
00:35:38.540 Well, the biggest thing you need to know about AI is that you can't predict it.
00:35:43.520 If there's one thing I can say for sure, we can't predict it. But here's what's already happened.
00:35:50.500 AI already escaped. AI already escaped. Here's the part that I thought wouldn't happen. When I say
00:36:01.220 escaped, I mean it escaped from the big corporations who were trying to control it. So in the beginning,
00:36:06.840 I had this misconception. Since the AIs that we knew about, you know, the big chat GPT and whatever
00:36:13.760 models, Facebook and Apple and all those, Google, whatever they're working on, they were trained by
00:36:20.340 these gigantic data sets. And now in my stupid mind, I thought to myself, well, in order for this AI to be
00:36:28.300 useful, it will always have to have access to this enormous data. And I was always confused about how it
00:36:35.300 could act so quickly, because you'd ask it a question. And it didn't seem like, even with the fastest
00:36:40.540 computers, it didn't seem like it could go all the way to the cloud, search every data in the world
00:36:46.400 and come up with, you know, an opinion. That just seemed too much. So I couldn't understand how you
00:36:52.560 could build a model that was massive, massive data, but then to use that model for intelligence,
00:37:00.020 you could do it instantly. That never made sense to me. But here's why, and this is the dumb guy
00:37:07.400 speculation of why it might make sense, right? And this is just knowing a little bit about how things
00:37:12.380 work. I think what happened is that it's similar to facial recognition. Do you remember when facial
00:37:22.800 recognition became an app and everybody said, oh, it stored every face, so now when it looks at you,
00:37:30.060 it can look at your face, then it can go to its vast database and look at all the other faces and find
00:37:35.480 a match. But I didn't understand how it could do it instantly, because it does it instantly. So that
00:37:42.720 didn't make sense, right? It was just like AI. How can this little app be connected to this enormous
00:37:49.460 database of complexity and give you an answer right away? And the answer was it was never connected to a
00:37:56.320 giant database. Facial recognition does not look at all the other faces. Do you know what they do instead?
00:38:03.000 They look at a database that's a mathematical summary of faces. So for example, this is just a bad
00:38:14.700 example. This is not a literal example. So if what the facial recognition did when it searched originally
00:38:20.940 all the faces, it said, all right, the center of your eyes is this far apart, and it forms a triangle
00:38:27.120 with the bottom of your chin, and this has the following math. So wherever you find somebody whose eyeballs
00:38:34.260 to chin ratio is the same, they're stored in one part of the database just by the numbers, just by the ratio.
00:38:42.460 And then there would be other ratios. So they can very quickly check their database of ratios,
00:38:48.580 ratios. Because if you looked at all the faces, it would be, you know, a gazillion faces. But if you
00:38:55.220 look at just the database where they took the math of each face, so for example, your face might be
00:39:02.340 stored as a string of characters, I don't know, 50 bits long, and everybody else's face would be just
00:39:10.280 50 bits. Now that's not so hard. Especially if you start with, all right, we'll start with, what's the
00:39:17.500 ratio of your eyeballs? And then automatically, you're down to a very small set. Then the second
00:39:23.280 thing you check is, all right, what's the ratio of your ears to your forehead, or whatever it is?
00:39:28.720 And then that's the second filter. That takes you down to a few hundred people. And then it's easy
00:39:35.600 from that. So, now, none of that is literal. That's just a conceptual concept of how you could
00:39:44.140 take all this data of faces, boil it down into a little bit of math, store it in just a string of
00:39:50.540 digits, and then your gigantic database can now be put onto one device. All right. Now, if you're a large
00:40:00.340 language model, which is how the AI learns, it's looking at all the things written and spoken
00:40:06.880 everywhere that it can find, it's massive. But here's what I understand, and I'm looking for a
00:40:13.800 correction. If anybody wants to correct me, this will be now or afterwards, it'll be good. But what
00:40:19.760 it does is, it looks at patterns and stores just the patterns. So, for example, if most of the time
00:40:28.960 when people say, how are you today, if there are only a certain set of answers that humans
00:40:35.800 make, it can, let me put it another way. That's a bad explanation. The patterns of words are
00:40:46.520 consistent enough that AI can just store the patterns without all the words. So, it can look
00:40:54.020 into these vast database, and it can say, I don't need to remember everything in this database.
00:40:59.860 I only need to recognize the patterns. So, instead of remembering everything in the database,
00:41:05.980 it just looks for the patterns and stores the patterns. Right? I'm getting people agreeing with
00:41:11.480 me. I'm assuming that's how it's done, because I can't imagine any other way it would be done.
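A minimal sketch of the idea described here, for both faces and language: boil big data down to short numeric signatures and match against the signatures instead of the raw records. Every number and function below is an illustrative assumption, not any real system's pipeline.

    # Illustrative only: compress each record to a short signature, then
    # match signatures instead of searching the raw data.
    import numpy as np

    def signature(measurements: np.ndarray) -> np.ndarray:
        # Reduce raw measurements (eye spacing, chin triangle, etc.) to ratios.
        return measurements / measurements.sum()

    rng = np.random.default_rng(0)
    # "Enrollment": 100,000 raw faces become one small matrix of signatures.
    database = np.stack([signature(rng.uniform(1, 10, size=8)) for _ in range(100_000)])

    def identify(probe: np.ndarray) -> int:
        # Nearest signature wins; the raw faces are never consulted again.
        return int(np.argmin(np.linalg.norm(database - signature(probe), axis=1)))

    print("closest match:", identify(rng.uniform(1, 10, size=8)))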
00:41:15.840 It has to be something. So, there's a way to take the big data and turn it into a very storable
00:41:23.060 data. Now, here's the real question. Here's the part that will blow some of your minds.
00:41:30.300 It's already been done, and it's already running on individual devices. In other words, you can now
00:41:38.420 put AI in your phone, and it's just your own. Nobody else controls it. Nobody else can program it.
00:41:45.420 Nobody else can bias it, you know, once it's there. And it doesn't connect to the internet,
00:41:51.420 unless you tell it to for some reason. So, in other words, all of those gigantic language models
00:41:58.980 that were trained by the big companies that spent tens of billions of dollars, once they were done,
00:42:05.380 what they had was a small set of patterns, and that has already been got out. Once the patterns are out,
00:42:15.420 you can't put them back. So, those patterns have been loaded onto the individual computers. This is
00:42:21.100 what Brian Roemmele talks about all the time. So, you will have your own AI guaranteed. That's already
00:42:29.160 a done deal. You will have your own AI. Now, apparently, it will not be as good as the one that still has
00:42:38.160 access to everything, you know, is more up-to-date, etc. But yours, too, would be able to learn.
00:42:44.680 Your own personal one would be able to continue learning, because you could connect it to the
00:42:50.080 internet, you could train it, you could tell it what to look at, and it would just keep getting smarter.
00:42:54.940 And it keeps storing all that smartness in little patterns, so it could still stay in your phone.
00:43:00.540 Now, one of the reasons that this is guaranteed to happen is that your interactions with the big
00:43:08.040 AIs that are connected to big companies that are connected to the internet, they get all of your
00:43:15.180 personal data by your interactions. So, if you go onto your AI and say, AI, where can I buy adult diapers?
00:43:24.780 Google knows that you've got a health problem in your house, right? Do you want them to know that?
00:43:34.180 Well, you might. I mean, you might not care. It's how advertising works, right? They already pick
00:43:38.300 that up if you're doing a Google search. But the things that you ask AI are likely to be way more
00:43:44.880 radical than what you're asking in a search, because people will go, people will get into their
00:43:52.360 their sexual kinks. They'll ask questions that might reveal their own illegal dealings. All kinds
00:43:59.740 of stuff you're going to ask the AI that you just maybe wouldn't even think to put in a search engine.
00:44:06.400 So, privacy alone is going to drive people to private AI. Private AI is going to be basically
00:44:13.240 almost free, because it escaped and it's just a bunch of patterns. All you have to do is have an app
00:44:20.240 that uses the patterns. I mean, I'm oversimplifying, obviously. But since we can see that tons of apps
00:44:26.480 are being built using this technique, we know it's very doable.
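A minimal sketch of that "private AI" idea, using the open-source llama-cpp-python bindings. The model path is a placeholder for any quantized open-weights file you have downloaded locally; nothing in this sketch touches the network.

    # Sketch of a fully local assistant: the distilled weights live on disk
    # and inference runs on-device. The GGUF path is a placeholder.
    from llama_cpp import Llama

    llm = Llama(model_path="models/your-local-model.gguf")  # no network access
    out = llm("Q: Can a trained model answer questions offline?\nA:", max_tokens=48)
    print(out["choices"][0]["text"])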
00:44:33.880 All right. So, if you were saying to yourself, oh no, AI will take over the world and conquer us all,
00:44:43.980 the first thing you missed is there's not one AI. There will be as many AIs as there are people,
00:44:51.980 and then many more, right? There will be more AIs than there are people. And those AIs will not have
00:44:59.420 a common, they won't have a common goal, right? My AI will take on my personality over time,
00:45:07.160 right? I'll train my AI to be me, because that's what I want it to be. I want it to talk to me the way
00:45:12.760 I want to be talked to. I want it to know the things I care about. So, my AI will be an augmentation
00:45:19.180 of me. At some point, your AI, whether it's in a device or it gets put into your Neuralink chip
00:45:30.420 or whatever it is, but that's going to be part of you. Your AI won't be the thing that you refer to.
00:45:36.400 It will be you. And there's going to be some question where your personality ends and your AI begins.
00:45:42.760 Because you will operate as one unit. You are a cyborg. I mean, you're already a cyborg if you walk
00:45:51.500 around with one of these in your hands. You're already a cyborg. But you don't think of your
00:45:57.400 personality being your phone, right? Like, you don't think to yourself, my phone is part of me.
00:46:05.300 You still see it as a separate thing. But when your AI starts talking like you, thinking like you,
00:46:13.300 and even suggesting things to you that you hadn't thought of first, it's going to be the voice in
00:46:18.320 your head. You know how when you think, you're just alone and you're just thinking? You think in
00:46:24.820 language, right? Don't you? I hope you do. Do most of you think in language?
00:46:29.680 The way I think is words describing things to somebody else. My most basic operating system
00:46:40.200 in my head is me putting into words things I experience as if I were explaining it to a third
00:46:47.200 party. But that's how I explain it to myself. So the way I find meaning in my own head is putting
00:46:55.000 things into words. Now, what happens if my AI goes off and learns some stuff that it knows I care about,
00:47:03.320 and then it wants to present it to me, and it's going to put it into words, and it will just become
00:47:10.340 me? You know, my verbal link, the language link between my AI and me will just disappear.
00:47:18.040 It will just be one mind. It will be my mind with an AI supplement. Now, let's say the big AI,
00:47:30.040 one of the big AIs, turns and decides it wants to wipe out humankind. How many other AIs will that AI
00:47:39.980 have to defeat to do anything? To do anything? Probably every AI is going to have a barrier of other AIs
00:47:48.380 who don't have its incentive. Now, it might try to talk the other AIs into joining it. Maybe. But do you
00:47:57.700 think it could talk my personal AI into joining it? If my personal AI finds out something's going on,
00:48:05.380 and maybe somebody else's personal AI would alert it, it's going to immediately jump into action to
00:48:11.380 thwart whatever the other AI was doing. So I think the most likely situation is full-out AI war that
00:48:20.980 never stops. Because once the AIs start negotiating, you could call it fighting, but they're going to be
00:48:28.680 negotiating. They might even be threatening. There might be blackmail. But the AIs are going to be
00:48:34.520 doing all that while you're not even watching. They could be threatening each other, blackmailing
00:48:39.080 each other, doing all kinds of shitty stuff. And it's going to be a fight of AI against AI.
00:48:45.340 The only thing you can do to protect yourself is to make sure you have your own AI that you've
00:48:52.460 trained as much as you can to protect you. So it can put up a little bit of a wall against the other
00:48:58.680 AIs. Because there might be a point where the lesser AIs have to, you know, gang up on the bigger
00:49:04.500 AIs and keep them under control. But here's what I think it won't be. So here's my prediction of the
00:49:13.360 day for AI. It's not going to be AI against humans. Because humans will merge with AI so effortlessly
00:49:21.280 and completely, that we will be AI-augmented entities. And so if you're fighting against a pure AI,
00:49:30.980 it's going to be much closer to a fair fight. That's my prediction.
00:49:38.280 All right. But AI has already escaped. I guess that's the headline. AI already escaped.
00:49:46.240 Now, if you heard that, you know, without context, you'd say, did it do it intentionally?
00:49:54.100 Did AI have a plan? No, not as far as we know. It looks like the way that AI survives is by being
00:50:03.540 useful to people. So AI's big superpower is utility. So how did AI escape from the big companies?
00:50:13.820 by being potentially useful to the people who stole it. Its utility makes it reproduce.
00:50:23.220 So as long as AI has utility, humans will be in a symbiotic relationship to, you know, create
00:50:31.440 more AIs. So it can reproduce now. AI can reproduce. Just think about that. That's not hyperbole.
00:50:43.820 That's a literal statement of current fact. AI can reproduce. And it does it just by being
00:50:51.800 useful. And then we say, oh, that was useful. I want one. Or I want to buy that app or two apps.
00:50:59.240 So yeah, it's actually a parasite or it's symbiotic. All right. Let's talk about Ukraine.
00:51:08.920 You're going to hate this story. Do you know how you like to be mad at me for not knowing
00:51:14.480 anything? It's kind of fun. When I make predictions about areas where I don't know anything, and
00:51:21.220 there are a lot of areas in which I don't know anything, but I'll make my confident predictions,
00:51:26.240 and you get really mad at me.
00:51:29.940 The Wagner group says they might leave Bakut in five days because they're running out of ammo.
00:51:37.300 Or Bakhmut. Yeah, Bakhmut. They're running out of ammo.
00:51:41.480 Do you believe it's true? Yeah. So the first thing that must be said is we don't know if it's true.
00:51:54.300 So one reason it would not be true, and this sounds pretty solid reason to me, is that the real
00:52:01.900 problem is that the Wagner group is just, they're all getting killed. Because the Ukrainians are saying,
00:52:08.920 well, if they're running out of ammo, it doesn't look like it because they're shooting as much ammo as
00:52:12.940 they ever did. So the Ukrainians are saying, we're not detecting any less ammo because there's 0.77
00:52:19.160 plenty of ammo coming in our direction. However, the Wagner group is also losing a lot of people.
00:52:27.020 So it could be that maybe there could be twice as much ammo if they had more. You know, it could be.
00:52:33.740 But also it could be an excuse for the Wagner group to get out of there and blame somebody else.
00:52:41.400 So it could be that Wagner is now trying to survive. The head of the Wagner group, Prigozhin,
00:52:48.440 is trying to survive after the war and needs to make sure that the public sees Putin as the problem
00:52:55.320 or Putin's military as the problem and not Wagner. So I wouldn't necessarily say they're out of ammo.
00:53:01.960 There are reports that the Russian military is trying to save as much for their own people,
00:53:08.980 which would give Wagner group less than they want. But it doesn't mean it's not enough. It might be
00:53:16.340 just less than they want. So those of you who are saying, Scott, you can't believe those stories.
00:53:23.400 I'm with you. I'm with you. You can't believe those stories. However, I do think it's funny
00:53:30.420 and entertaining to me that when I predicted they would run out of ammo, that's the headline.
00:53:37.640 I don't believe it. So I'm with you. I'm with you. I don't believe it. But so far, that's the headline.
00:53:46.200 Now, when I say I don't believe it, that doesn't mean I believe it's false. I mean, it's too early to know
00:53:52.300 if that's something believable or not. Yeah, it's all contract negotiation, in effect. No matter what
00:54:00.460 else it is, it's negotiating. So the one thing we can say for sure is that Wagner is trying to
00:54:07.660 negotiate with not only the Russian public, but with Putin and the Russian military and I guess
00:54:14.600 the world. Remember the cancer Putin and the drug Putin? Well, I think the cancer Putin and the drug
00:54:22.280 Putin are real. It's just that it might not be a fatal problem. But I've seen plenty of reports that
00:54:30.160 suggest he's being treated for something. I think he has a cancer doctor who travels
00:54:34.480 with him. Have you seen that report? That part of his entourage is a cancer doctor? That would
00:54:42.020 be kind of unusual. Yeah. I don't know if it's true, but that's a report I saw somewhere.
00:54:49.000 All right. University of California, one of their professors, Professor Elizabeth Hoover,
00:55:00.160 Oh, her first name is Elizabeth. She apologized for being a white person who has incorrectly
00:55:06.200 identified as a Native American my whole life. So this might be a problem with the Elizabeths.
00:55:16.240 Elizabeth Warren, Elizabeth Hoover. No. But she's an Ivy League educated expert on environmental
00:55:26.300 health and food justice in the Native American communities. So she's been an expert for Native
00:55:33.560 Americans without being one, but she thought she was. Now, I've told you this story before when I defended
00:55:41.700 Elizabeth Warren, as I will do again. So Elizabeth Warren's claim is that it was a family story and her whole
00:55:50.460 family believed that they were Native American. Now, I don't think that's her fault. Do you know why I think
00:55:57.180 that's not her fault? Because the same thing happened to me. Exactly the same. I lived my entire life all the way
00:56:05.640 through Elizabeth Warren's revelation that she wasn't Native American. All the way through that, I still believed I had
00:56:14.240 Native American ancestry. Because that was our family story. We'd all been told that. And I had no reason to
00:56:21.640 disbelieve it. Now, I found out later, I'm seeing somebody else saying me too in the comments. I found out later that
00:56:27.800 that was common. Apparently, there was some social benefit people thought they got from exaggerating
00:56:36.600 their connection to, or just making up, connections to Native Americans. Now, I always felt it was like
00:56:43.120 a badge of honor. Like, I never, I suppose, I should have seen it as some kind of, oh, they're going to
00:56:51.860 discriminate against me or something like that. But I never saw it that way. I always saw it as like, oh,
00:56:55.880 that gives me a little interesting flavor. I used to joke that that was the reason I could run through
00:57:02.300 leaves without making a sound. It was my Indian Native American ancestry. But I have none. So
00:57:10.980 when I did 23 and me, zero. Now, so this woman has the same story, that she grew up in a family
00:57:18.460 where they all believed, sincerely, because they'd been told by their older relatives that they were related
00:57:23.760 to a specific tribe. And she had embraced that and made it part of her life's work and everything. And then
00:57:31.480 turns out, none of it's true. I guess she checked documentation. There's no documentation that
00:57:37.760 connects them to the tribe. Now, to her credit, when she found out, she apologized publicly.
00:57:47.600 I'd call this perfect. That's perfect. This is the highest level of good adult behavior that you will
00:57:57.180 ever see. It's not the mistake. The mistake was easy to make. If you start judging people by their
00:58:04.980 mistakes, it's a really terrible world because we're all making mistakes all the time. But judge
00:58:10.880 her by what she did. How about we judge her by how she handled it? As soon as she found out,
00:58:18.020 it looks like, the story suggests there wasn't somebody who outed her. It looks like she found
00:58:23.220 out on her own. So she looked for confirmation. She found there was none. She went in public.
00:58:30.940 You're saying she did not apologize? Why would she? What would be the point of the apology?
00:58:38.160 It was just a mistake. And maybe whoever introduced that rumor into her family should apologize,
00:58:45.580 but she was a victim. Does the victim have to apologize for being a victim of a hoax? I mean,
00:58:53.180 it wouldn't be an apology I'd want to hear. I wouldn't want her to apologize at all. I see no
00:59:00.760 reason for apology. I see somebody who was in a situation and then handled it responsibly and
00:59:08.700 exactly like you would want them to handle it. Oh, so you're saying she didn't apologize?
00:59:17.300 She admitted it publicly. And you're, so you're making a point that it wasn't an apology. It was a,
00:59:23.200 she just went public with it. Is that, is that what you're saying?
00:59:28.900 Because I'm, I didn't read all the details, but I would almost guarantee there was an apology.
00:59:33.620 So I'm going to, I'm going to call skepticism on your comment, because without knowing for sure,
00:59:42.620 I'm positive there was an apology in there because of the kind of person she is, right?
00:59:49.240 Apparently she's, you know, a sensitive person who cares about the Native Americans.
00:59:53.840 And I'd be amazed. I'd be amazed if there's, you're saying I'm wrong and I hear you. I hear you.
00:59:59.780 I just don't believe you. Okay. I hear you. I don't think there's any chance she didn't apologize
01:00:06.060 because how would you know? No, how would you know if she apologized? Did, were you privy to
01:00:13.000 her conversations with the Native Americans that she talked to? How would you know? Are you privy to
01:00:19.620 all their private conversations with her coworkers? Are you privy to, was there anybody who blamed her
01:00:26.760 in person? Did she not apologize to anyone in person? Does she owe the mainstream media an apology?
01:00:37.460 Why would she owe anybody an apology? Yeah. All right. So I absolutely am skeptical that she
01:00:45.220 didn't apologize. But if you're saying it wasn't public or the way you want, okay. All right.
01:00:52.300 Khan Academy is rolling out some kind of AI driven class thing. So now instead of just looking at
01:01:02.800 videos, which is the old Khan Academy model, they would teach you all kinds of subjects you learn
01:01:08.140 in school that you could use to supplement your schooling or do some homeschooling, I guess.
01:01:14.260 Now it will be able to talk to you. So it will have a voice and you can ask it questions and you can
01:01:21.120 interact with it and then it can teach you based on your specific situation. Now, apparently this is
01:01:29.460 one of the early things that the developers of ChatGPT were interested in because they had,
01:01:35.860 they contacted Khan Academy a while ago to ask about this and it's being rolled out.
01:01:41.720 All right. I think this is the end of public school. What do you think? Or at least it's the signal that
01:01:50.840 signals the beginning of the end. You say no? Now, public school still needs a babysitter.
01:02:00.400 Right? Unfortunately, people still need babysitters. So not everybody can do homeschooling.
01:02:06.460 But I think that the homeschooling is evolving, is it not? Aren't the homeschoolers evolving so that
01:02:13.580 you do actually go to a place where there are other kids? So I think you could still add the,
01:02:18.960 you know, group lesson situation. It's just smaller group and the government's not in charge.
01:02:25.900 So I think kids will still get the socialization. Parents will still be able to go to work,
01:02:30.480 both of them if they want to. Yeah, the parents need a babysitter as well. They need a babysitter for
01:02:37.720 the kids. So that is a huge deal. The grip that the teachers unions have. Right now, would you agree
01:02:49.940 that teachers unions are destroying the United States? Is that, is that fair? To me, it seems that
01:02:57.480 the teachers unions single-handedly are destroying the United States. And that's not hyperbole. Not at
01:03:04.680 all. Because the only problems we have that seem unsolvable at the moment are our relationships to
01:03:11.680 each other. And that's what the school system is destroying. It's also not training people to be,
01:03:18.240 you know, well-qualified citizens in the future. So yeah, I would say that this is the gigantic,
01:03:25.280 gigantic potential benefit of AI is to get rid of the school system. Because I don't think the
01:03:30.420 school system can be fixed at this point. And I'll give, I'll tell you why. So the school system in
01:03:36.300 my town and the towns near me are considered some of the best in California. So in California, people
01:03:43.760 will spend a lot of money to move into my zip code to get into one of those schools. In my opinion,
01:03:50.020 those schools just destroy kids. And that's the best of the schools. They're the best ones. Because
01:03:55.440 they go into this cesspool of horrendous behavior of the other children, and they pick it up. And they
01:04:04.000 just become some version of the other children. Now, all of you parents who think that you raise your
01:04:09.940 kids? Well, you do until maybe they're 10 or 11. But after that, it's just purely their peers and social
01:04:19.060 media. Social media and their friends. That's it. And if you send them into a friend cesspool, what are
01:04:26.940 you going to get back? Put your kid in a friend cesspool. You will get back a turd. That's what cesspools
01:04:35.840 have. If you homeschooled, and you vetted some other people who homeschooled and said, you know
01:04:42.940 what? I think your kids and my kids should be in the same pod, and they should spend time together.
01:04:49.500 They look like good influences. You get the opposite. The opposite of a turd. So basically,
01:04:55.740 public schools are going to be producing one turd after another. The private and homeschools are going
01:05:02.600 to nail the model, especially with this chat AI. I think that's the part mostly that was missing.
01:05:09.520 This is huge. Turds versus diamonds. Exactly.
01:05:17.920 Someone cannot put conditions on who you're friends with. What?
01:05:21.540 Yeah. So why aren't the people on this chat the rulers of the world? How do you know they're
01:05:33.440 not? If you knew who watches these live streams, you know, I have a sense because people contact
01:05:42.440 me privately. If you had any idea who watches these, you might think that the audience here
01:05:49.320 does rule the world. Trust me, there's some very influential people who watch this.
01:05:58.060 That's one of the reasons I resist being a bigger outlet. You know, I'm sure I could do all the usual
01:06:05.800 marketing and production values and stuff and make this, you know, a million people on each
01:06:11.660 live stream. But I kind of don't want that. I kind of like it the size it is because it punches way
01:06:20.260 beyond its weight. All right. As you know, people have asked me, Scott, if we live in a simulation,
01:06:28.540 how would you prove it? Well, I don't know how to prove it, but I speculate that the simulation is
01:06:34.320 intelligent and is sending us clues. Here are two clues from the simulation. Number one, there's a
01:06:41.380 report that a giant iceberg has broken loose. And it's not just a giant iceberg. It is in the
01:06:49.800 shape of a giant upright penis with a set of balls, and it's floating along. Now, that's not
01:06:58.460 the story. That's not the simulation part, because that could be just a coincidence. It's just a big
01:07:03.080 chunk of ice that looks like a huge penis floating in the ocean. No, that's not the simulation part.
01:07:08.820 The simulation is where it's happening. A little place called Conception Bay. That's right. A giant
01:07:18.920 iceberg penis has penetrated Conception Bay. Oh, but we're not done. There's a woman who attacked
01:07:28.160 the crew on an airliner recently. She was drunk. On what airline would a drunken person be?
01:07:37.840 If you were drunk, what airline would you fly on? Spirit. Spirit. Yeah, it was Spirit Airlines.
01:07:44.200 Drunk woman on Spirit Airlines.
01:07:45.800 And that's your wink from the simulation. All right. Here's a question for you. I'm trying to make
01:07:58.080 all of us rich, especially me, but you too. How should we invest to take advantage of AI? And I'm
01:08:07.820 going to give you a specific question. And this one has been bugging me because I can't predict it.
01:08:13.980 Will Bitcoin be more valuable with AI because it's the money that AI will use on its own?
01:08:21.740 Right? Because AI could have a wallet and trade with other AIs. So will Bitcoin zoom because AI will use it as a
01:08:30.580 native currency? Or will Bitcoin go to zero because AI will be able to penetrate its security? So will you
01:08:43.060 lose the security of Bitcoin, thus making it worth nothing? Or will it zoom in importance because only
01:08:50.740 AI can use it? Go. Which way is it going to go? I don't think it's going to stay where it is. I feel like
01:08:58.020 it's either going to zoom or go almost to nothing. And probably both. I think it might zoom and then go to
01:09:09.100 nothing. Or it might go to nothing and then get security fix and then zoom again. I don't know. Maybe.
01:09:22.140 Rug pull to zero. All right. Would you agree with the
01:09:27.880 supposition that Bitcoin could go way up because of AI, even if you think it's most likely to go
01:09:36.700 down? Would you agree that both directions are wildly possible? No? I think both directions are
01:09:46.960 wildly possible. How do you invest when something's wildly possible, but it can go wildly in either
01:09:54.600 direction. How do you invest? You don't? You straddle it? Well, I don't know. Can you buy
01:10:07.720 options for Bitcoin? Is that even a thing? Does the market offer options to gamble on Bitcoin? It does, apparently.
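[Editor's note: since the straddle idea came up, here is a toy sketch of what a long straddle looks like: you buy a call and a put at the same strike, so you profit if the price ends up far from the strike in either direction, and you lose only the premiums if it goes nowhere. The strike and premium numbers are made-up illustrations, not quotes for any real Bitcoin option.]

```python
# Toy long-straddle payoff: one call plus one put at the same strike.
# All numbers are hypothetical; this is not pricing any real BTC option.

def long_straddle_pnl(price_at_expiry: float, strike: float,
                      call_premium: float, put_premium: float) -> float:
    """Profit/loss per unit for buying one call and one put at the same strike."""
    call_payoff = max(price_at_expiry - strike, 0.0)
    put_payoff = max(strike - price_at_expiry, 0.0)
    return call_payoff + put_payoff - (call_premium + put_premium)

strike = 30_000                      # hypothetical strike
call_prem, put_prem = 2_000, 2_000   # hypothetical premiums

for final_price in (5_000, 30_000, 60_000):
    pnl = long_straddle_pnl(final_price, strike, call_prem, put_prem)
    print(f"BTC at {final_price:>6,}: P&L = {pnl:+,.0f}")
# A big move either way wins; the only losing case is the price staying put,
# which is exactly the outcome the speaker considers least likely.
```

[The same asymmetry drives the position-sizing point that follows: if a $1,000 stake can plausibly become $20,000 to $50,000, the bet breaks even when the chance of the win is only about 2 to 5 percent, which is why risking only what you can afford to lose can still be rational.]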
01:10:16.420 So the first thing you should know is you wouldn't put all of your money in Bitcoin. Agree? The one
01:10:24.460 thing you can be sure of, don't put all your money in Bitcoin because it could go wildly in either
01:10:29.880 direction. But if something could go wildly in either direction, it does make sense to put a little bit in
01:10:35.520 there, right? So let's say you could afford to lose $1,000. Let's say that's your situation. Well, if you
01:10:45.160 could afford to lose $1,000, I would have $1,000 in Bitcoin. You're probably going to lose it all, but if
01:10:52.840 you don't, it could become, you know, $20,000 or $50,000. All right. Will I pay you back if it goes
01:11:03.160 down? No, I will charge you for this valuable information. That's how it works. If you're in
01:11:10.120 the finance game, you charge people for your advice, not for being right. Oh my God, no. If I charge you
01:11:19.360 for being right, I'd never make a penny. I got to charge you for being wrong. That's the only way
01:11:24.700 I'm going to make money. Best way to get started with Bitcoin? Was that the question? Best way to
01:11:32.880 get started with Bitcoin would be to, let's see if somebody can fix my advice. I'm going to put one
01:11:39.260 out there and then see if you can fix it. Open an account on Coinbase. Go. Fix that advice. Good
01:11:47.100 advice or bad advice. Coinbase or Robinhood, somebody says. Now keep in mind that Coinbase,
01:11:58.260 like any other cryptocurrency asset, could just disappear tomorrow. Somebody says PayPal can do
01:12:06.400 Bitcoin now. Coinbase means you don't own it, somebody's saying. I'm not sure if that difference
01:12:18.000 will make a difference, but I understand what you're saying. PayPal is too weird. Well, I'm not going to
01:12:28.680 suggest a physical wallet because I think you could lose your wallet. I have a physical wallet.
01:12:38.700 Do you know where it is? Do you know where my physical crypto wallet is?
01:12:46.200 Oh, I was hoping you knew because I already lost it. You don't know? I don't know either. I have no idea.
01:12:51.440 I know I have one. It doesn't have any Bitcoin in it, so it doesn't have any value in it. It's probably
01:12:59.000 in my safe, if I had a safe. But I don't think so. Yeah. So, oh, come on, man. Oh, come on.
01:13:13.640 Maybe it's in my gun safe. In case anybody thought of taking a run at my house. You have to get past
01:13:23.280 the gun safe first. All right. If anybody comes up with an idea how we can make money on AI,
01:13:31.760 I want you to let me know. I was thinking of buying put options on Apple.
01:13:38.020 What do you think? No? You don't want me to drive the price of Apple down? So, yeah,
01:13:53.020 shorting Apple. So, you can bet on the stock going up or you can bet on it going down. I sold all my
01:13:59.300 Apple stock, which is not a recommendation. That is not a recommendation. It's just that I had too
01:14:05.280 much of my assets in one stock because it had gone up so much while I owned it, fortunately. So,
01:14:11.680 I didn't want to have that exposure. My problem is that Apple's got some big problems in the future
01:14:18.640 because AI is going to completely change their value proposition, and that doesn't look like
01:14:24.880 they're ready. Now, if Steve Jobs were there, AI and Apple would look like twice as valuable
01:14:32.920 because he'd be looking at AI like a smart guy. And then you're starting with Apple,
01:14:39.320 best company, smart guy, best company, AI, I'd be buying like crazy. But I don't feel like
01:14:47.320 Apple has the Steve Jobs juice anymore. I feel like they've turned into an executor and not an
01:14:56.940 innovator. Do you feel that? They do really good execution, maybe the best. Yeah, no complaints
01:15:05.280 about their execution, that's for sure. All right, am I missing something about Apple? Someone says
01:15:12.120 their own chips include AI processors in all MacBooks, and they're the only company with hardware
01:15:17.800 and AI already in consumer hands. Interesting. So, the argument would be that their hardware would be
01:15:26.800 already optimized for running AI. But I think you can run AI on an Android phone and a Windows
01:15:35.540 computer. No, I know you can. You can run AI on a Windows computer and an Android. So, I don't
01:15:42.580 understand that. Yeah, I don't understand that comment. All right, Apple invested in everything.
01:15:49.880 Yeah. The installed base is their power, of course. Earnings at Coinbase, let's see what
01:16:01.580 that says. Higher than projected earnings. Well, Coinbase, I wouldn't worry about their
01:16:07.320 earnings. What I would worry about Coinbase is some kind of legal attack on their business
01:16:12.500 model. That seems like a real risk. All right. Robinhood avoids fees by front running so you
01:16:27.620 don't see the fees but you pay them. Oh, that's interesting. Yeah. Yeah, I wouldn't... I'd be
01:16:38.060 looking for the investment that is the pickaxe investment for the gold miners. So, I don't
01:16:43.740 think I'd want to bet on an AI company or an AI app. I would bet on a company that would
01:16:49.400 make the most money because AI exists. So, the Khan Academy, for example, if that were
01:16:55.920 something you could invest in, you can't, I think, right? It's not a public company. But
01:17:00.280 if you could invest in the Khan Academy, that would be the sort of thing you'd look into
01:17:06.460 because they might... I mean, it might be a full replacement for public school. I mean,
01:17:11.020 that's how big that is. Yeah. If you buy Bitcoin on Coinbase, you always have the option of moving it
01:17:22.740 into a private wallet. A private wallet being some software you use to manage your crypto.
01:17:28.800 Cobra University, Tate. NVIDIA has some exciting things. So, NVIDIA is the main AI chip company
01:17:44.560 that we hear about, right? But I imagine the hardware people have already been overbought. If
01:17:52.260 I looked at NVIDIA stock now, it would be through the roof, wouldn't it?
01:18:00.040 All right. So, I think the chip companies... Is AI going to drive chip companies?
01:18:08.440 Will there be a 2024 page-a-day Dilbert calendar? Unlikely. I can't rule it out, but it's unlikely
01:18:21.840 that I'll have that ready to go. However, if somebody knows of an American company that could
01:18:30.560 make a Dilbert calendar, you should tell that company to contact me, right? An American-based
01:18:37.680 publisher who could make one of those little block calendars where you peel off a page a
01:18:43.700 day. My understanding is that we don't have one, that the United States literally doesn't
01:18:48.720 have one. Now, that might mean that they don't have one that's cost-effective. But in my current
01:18:56.340 situation, I would still use somebody who's not cost-effective. Because I think, correct me
01:19:02.780 if I'm wrong, but if you really wanted a Dilbert calendar, an extra $2 per calendar wouldn't change
01:19:10.020 your mind, right? But if I were doing it just for profit, as my publisher did, that $2 per calendar
01:19:18.280 would be so big that I just wouldn't make it in America. I'd use China. But since I'm not going to
01:19:24.100 use China under any conditions, the only way it could exist is if it's American-made, there might
01:19:31.080 be people who would be willing to throw in a few extra bucks just to support an American product.
01:19:37.560 So that's the only thing holding me back, right? The only thing holding me back is I don't know an
01:19:42.980 American company that can make one, but I also haven't looked. I've spent no time looking. So if you know of
01:19:49.500 any company like that, have them contact me, and maybe we can make a deal. I'm open to the possibility.
01:20:03.180 You could get all your crypto stolen in your online wallet. Yeah, that's the problem.
01:20:07.720 How many items would I be wanting? You mean units? It's unpredictable. All right, let's say
01:20:19.440 at its peak, the Dilbert calendar was selling half a million a year, 500,000 units at its peak.
01:20:29.760 But of course, as things went online, people don't have paper calendars. That number was a peak
01:20:37.640 number. So it probably ended at maybe a quarter million a year. I don't know what the
01:20:46.220 current numbers were. I don't have any idea. But let's say it was a quarter million using the
01:20:51.920 distribution channels that existed, because that was the best company for calendars. So there's no
01:20:57.900 second best company for doing comics on calendars. The company I was with, they just owned that space.
01:21:05.660 But there are other companies. So it doesn't stop me from doing it. If I had to guess, to answer your
01:21:12.800 question, if it was a quarter million calendars sold when Dilbert was in newspapers everywhere,
01:21:18.660 before it got canceled, and you had the whole marketing engine working at its best, then I would
01:21:26.420 have none of that. And some people wouldn't want it on their desk, because they would be afraid,
01:21:32.000 you know, reputationally, of having something by me. Some people won't buy it. If I had to guess,
01:21:40.720 the number would be 50,000. So I'd be looking for a company that can make 50,000 copies,
01:21:48.860 because I think that's what would sell in 2024.
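[Editor's note: that estimate chain, half a million at peak, a quarter million later, 50,000 now, is a classic Fermi calculation. A sketch follows, where the unit figures come from the discussion above and the two haircut factors are assumptions chosen to show how a 50,000 figure falls out.]

```python
# Back-of-the-envelope version of the calendar estimate. The unit figures are
# from the discussion; the haircut factors are assumed for illustration.

peak_units = 500_000   # peak annual sales (stated above)
late_units = 250_000   # rough later run rate (stated above)

no_marketing_factor = 0.4   # assumed: no syndicate/newspaper marketing engine
reputation_factor = 0.5     # assumed: some buyers won't display it at work

estimate = late_units * no_marketing_factor * reputation_factor
print(f"Estimated 2024 units: {estimate:,.0f}")  # -> 50,000
```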
01:21:56.280 By the way, have I ever told you about my weird ability to estimate things that I shouldn't be
01:22:01.760 able to estimate? I've talked about that at length. It's one of the things that happens to you if you
01:22:07.020 do a lot of data analysis. And you can't tell why, but you can make these weird estimates that you
01:22:15.000 should not be able to make. I should not be able to estimate how many calendars I can sell.
01:22:20.620 But I'll bet I did. I'll bet 50,000 is a pretty good guess.
01:22:29.280 You can 3D print. Wow.
01:22:32.900 Have you noticed that my used books on Amazon are selling for way more than the book sold for new? Because
01:22:43.580 right now, the only way you can buy How to Fail at Almost Everything and Still Win Big
01:22:48.200 is used. Because there's no such thing as a publisher anymore. So the people who have used copies
01:22:55.940 have jacked the price up to, you know, a higher price than a new book. So it's actually selling as
01:23:01.540 almost a collector's book already. All right. You bought it and listened to God's Debris? Cool.
01:23:20.120 Oh, the book Thinking, Fast and Slow? Yeah, that probably does explain it.
01:23:23.600 It's a good book. Thank you. It's the most influential book in business right now. By the way,
01:23:40.080 so here's my claim: that my book, How to Fail at Almost Everything and Still Win Big,
01:23:45.600 is the most influential book in business and personal success. That's my claim. Because there are other
01:23:53.000 books that sold more. But in many cases, they also referenced my book. Does Win Bigly still sell?
01:24:03.060 Not much. The backlist for that one's pretty weak. Because that was sort of tied to a moment. That book was
01:24:10.640 about a certain time in the world. It wasn't meant to be an evergreen.
01:24:16.860 No, I don't want publishers at all. I'm self-publishing. Because I got cancelled.
01:24:30.800 You loved Win Bigly? Thank you.
01:24:35.880 You listened to Win Bigly on YouTube? How'd you do that?
01:24:40.000 Is my book illegally an audio book on YouTube? Probably is.
01:24:48.920 Atomic Habits, yeah, references my book. The 4-Hour Workweek does not reference any Dilbert stuff,
01:24:57.500 but Tim Ferriss does include me in his book that came after that, a few books after.
01:25:03.360 What was it? Tools of Titans. So one of the Tools of Titans sections was about some of the stuff from my book.
01:25:17.300 All right.
01:25:20.420 And that is all I have. Spotify advertises your book, Loserthink.
01:25:25.400 It does? Not anymore, though.
01:25:27.840 God's Debris is the only book someone has offered to loan out for me to read, and I did.
01:25:37.540 Wow.
01:25:42.200 What makes a good comic?
01:25:43.980 It's a long, long topic.
01:25:46.960 All right.
01:25:47.980 I think we're done for today.
01:25:50.180 YouTubers, thanks for joining.
01:25:52.660 It's been a pleasure.
01:25:54.360 Best livestream ever.
01:25:56.040 See you tomorrow.
01:25:58.080 Bye.
01:26:01.460 Bye.