Real Coffee with Scott Adams - January 02, 2021


Episode 1239 Scott Adams: I Show You Some Shocking Media Manipulation, Election Dogs Not Barking, Whiteboard Too


Episode Stats

Length

1 hour and 16 minutes

Words per Minute

155.07

Word Count

11,872

Sentence Count

854
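For the curious, the stats above are related by simple arithmetic: words per minute is word count divided by running time. A minimal sketch (the duration used here is the rounded "1 hour and 16 minutes" from the page, so it only approximately recovers the stated words-per-minute figure, which implies an un-rounded length of about 76.6 minutes):

```python
# Sanity check on the episode stats: wpm = word count / minutes.
# The page's duration is rounded, so expect only rough agreement
# with the stated 155.07 words per minute.
word_count = 11_872
minutes = 60 + 16  # "1 hour and 16 minutes", rounded

print(round(word_count / minutes, 2))  # about 156.21 with the rounded duration
```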


Transcript

00:00:00.000 Hey, and if you'd like to make 2021 an unbroken string of incredible mornings, I can't speak for
00:00:09.280 your afternoon, but your mornings, I'm in control and they're going to go well. So it looks like
00:00:15.840 yet another day in 2021 where the morning is incredible. Good morning, Omar. And let me
00:00:24.220 make it even better with one little tweak, just one little thing that you need to make this special.
00:00:32.360 I think you know what it is. It's a cup or mug or glass, a tank or chalice or stein, a canteen jug or
00:00:37.100 flask or vessel of any kind. Fill it with your favorite liquid. I like coffee. And join me now
00:00:45.360 for the unparalleled pleasure, the dopamine of the day. Yeah. The thing that makes everything better.
00:00:51.220 It's called the simultaneous sip. It happens now. Go.
00:01:00.640 Excuse me while I savor it. Savor it. Savor it. All right. The most mysterious story in the news
00:01:11.440 is that President Trump, uh, skipped out on his own New Year's Eve celebration at Mar-a-Lago.
00:01:24.220 A lot of people paid a thousand dollars a ticket to attend, but the president, uh, was not
00:01:30.220 there. So the thinking, the speculation... uh, oh, we'll get to the whiteboard. I'm going to make you
00:01:37.900 wait for that. It's going to be worth it. But the speculation is that there might be something
00:01:42.240 brewing with Iran in terms of a, an attack. And it's because it's the, uh, it's the anniversary,
00:01:50.600 one year anniversary of Soleimani. And apparently Iranian officials are just saying directly that
00:01:56.980 there will be revenge. And it will be pretty big. And I ask you this, number one, would Iran be able
00:02:06.680 to, even with our, you know, ambiguous presidential transition situation going on, the worst, worst
00:02:13.700 possible situation, do you think Iran would find it in their interest to do some kind of an attack
00:02:21.640 where it was obvious, that it actually killed Americans. Do you think they could do that
00:02:28.080 right now? I would like to, uh, send a message to the Iranian, uh, the Iranian elite. And it goes like
00:02:38.760 this, um, dear Iranian elite, before you make the decision to attack the United States, something you
00:02:50.620 need to know. And I don't know if there's a cultural difference. So maybe it's not obvious to you
00:02:56.700 or something like that. But let me explain it in the clearest possible terms.
00:03:05.180 If you kick us when we're down, it's not going to be pretty.
00:03:10.380 Let me tell you, this would be maybe the worst time in the history of human civilization
00:03:23.260 to attack the United States. Because you've heard of this president that we still have for a few more
00:03:30.620 days, President Trump. Rumor has it, he's not in a good mood lately. Something about the election not
00:03:41.260 going his way. You don't want to do the attack while he's still got his finger on the, you know,
00:03:49.360 on the military. So here's my prediction. I don't think there's any chance Iranians are so dumb that
00:03:58.220 they would attack a lame-duck President Trump. There wouldn't be anything more dangerous than
00:04:04.540 that. Now, they could wait for Biden to take office. But then why would you attack? Right? Because Biden
00:04:13.360 would have been opposed to the attack. If Biden had been in charge, he would have said, no, don't kill
00:04:18.580 Soleimani. So why would Iran want to do an attack while Biden is still, if Biden is in office,
00:04:26.720 wouldn't that be the time that they would want to not attack? Because that's where they would think
00:04:31.940 maybe we could work on a deal and still get our nukes and still get everything we want, right?
00:04:37.820 So it seems to me that if Iran has some need to attack for, I don't know, national pride or whatever
00:04:44.840 the reason is, that don't they have to do it while Trump is still in office? Right? Because first of all,
00:04:51.600 it lines up with the anniversary and second of all, it was Trump who made the decision. Why would they
00:04:56.780 attack Biden who will be closer to something like on their side? It wouldn't make any sense
00:05:03.880 strategically, right? Attacking Biden would turn Biden into Trump. And what they want is less Trump.
00:05:10.200 So what the heck is Iran going to do? Now the talk, the chatter is it might be drone attacks in Iraq
00:05:19.720 against American forces or quadcopter attacks. So we're seeing now the beginning of the era of the
00:05:28.260 small drone. A quadcopter is like those four-propeller little drones that you can pick up,
00:05:33.880 sort of a consumer-sized drone, but bigger, a little bit bigger. And if those things are the
00:05:42.320 new attack vector, it's a whole new world. Yeah, it could be a drone war. But the problem with any
00:05:49.280 kind of drone attacks is that we would know where they came from, wouldn't we? I mean, even if you
00:05:53.960 didn't know where they came from, you'd still know where they came from. So I think the Iranians
00:06:00.040 don't have any options. Don't you? I feel like there's no options for them to do anything.
00:06:07.060 But they may have just wanted to ruin everybody's holiday, the president's holiday by threatening.
00:06:13.180 So my live stream yesterday, in which my title was that I said it looks like Trump won based on the
00:06:20.200 news out of Georgia, with the forensic expert who looked at the hackability of the system. And it's
00:06:26.560 pushing a quarter million views today. What exactly happened? Why would that video get a quarter
00:06:33.760 million views? Is it because there was nothing else going on? Most of the other content yesterday was
00:06:39.680 a repeat or something because it was holiday-ish? I don't know why that got a quarter million views
00:06:46.320 and it's still screaming. Maybe it's just because it said Trump won in the title.
00:06:49.580 Um, so here's the, the most, uh, let's say obvious and grotesque media manipulation that I've seen in a
00:07:00.740 while. Now, if you've been watching these Periscopes or my live streams for a while,
00:07:06.740 you know that I talk about the, uh, technique that the media uses to brainwash the public. And I use
00:07:14.080 brainwash, not in any exaggerated way. I mean, actual literal brainwashing. And I would call
00:07:21.640 propaganda brainwashing. You know, I would use those interchangeably, but propaganda usually means
00:07:27.240 in a political sense. So let's say propaganda. And here's the example that if you did not
00:07:33.340 know the techniques of persuasion, you wouldn't even notice this, but you do know the techniques
00:07:40.280 of persuasion. So let me read this, uh, this bit from the New York Times. And then you tell
00:07:47.440 me what manipulation they put into it. It says, uh, here's an exact sentence from the New York
00:07:54.160 Times: Trump continues his assault on election integrity, baselessly claiming the presidential
00:08:00.620 results and now the Senate runoffs in Georgia were both invalid. And it said that no courts have ruled
00:08:08.880 on, uh, have ruled that there's any fraud. Now, what is the, yeah, thinking beyond the sale. Thank
00:08:16.900 you. Right. Whenever you see a major media entity make you think past the sale, that's just propaganda.
00:08:26.200 Now they can disguise their news as opinion, but it's in the New York Times. And if you're reading the
00:08:33.720 New York Times and something they decided was good enough to publish, even as an opinion,
00:08:38.300 says that Trump is making an assault on election integrity, what do you have to assume to understand
00:08:47.460 this sentence? Well, you would have to assume that whatever he's doing is illegitimate and that the
00:08:53.420 election was perfectly good. That's not in evidence. I will agree that it has not been proven that the
00:09:02.040 election, uh, was tainted to the point where it changed the, the result. Of course that has not
00:09:09.120 been proven. Of course. But there is massive circumstantial evidence that has not been fully,
00:09:20.420 you know, fully debunked and, or even analyzed by any kind of court. Now, most of you know that,
00:09:26.040 but at least half of the public believes that courts have looked at the evidence 50 or 60
00:09:33.140 times and rejected them all. So I just tweeted this morning, an actual consumer of news in the United
00:09:40.560 States believes that this is a true statement, that 59 out of 60 courts or court cases were thrown out
00:09:50.120 because the evidence of fraud was non-existent. Nothing like that's ever happened in this country.
00:09:58.780 Didn't happen this year, last year, never happened. It's just a completely imaginary story,
00:10:04.800 which now has become truth. It is completely imaginary that courts have looked at the evidence.
00:10:10.420 Nobody has. Now courts have looked at the case and decided that they couldn't, couldn't rule on it for
00:10:20.100 a variety of tactical reasons, but no court has looked at the evidence, right? So let me give you
00:10:26.220 an idea how the news or even an opinion within a news vehicle would look if they were trying to be
00:10:34.160 honest. If they were even trying to be honest, what would the same kind of story look like? Here's how,
00:10:41.760 here's what it would look like. Instead, instead of saying Trump continues his assault on election
00:10:46.880 integrity, you might say something more objective, such as Trump continues to push every legal means
00:10:54.800 to challenge a result that he believes, based on evidence that looks solid to him, was fraudulent. But
00:11:03.300 courts have not looked at any evidence, and so there's no ruling one way or the other about
00:11:09.300 whether the election was fraudulent. Now, did I say anything that was inaccurate? Did you hear any
00:11:17.240 inaccuracies in what I said? Trump is using his legal challenges? True. Trump believes the election was
00:11:24.940 fraudulent? True. He bases his opinion on a massive amount of circumstantial evidence,
00:11:33.300 true. No court has ruled on the evidence. True. Isn't that good context? Don't you think
00:11:44.420 that context is sort of important to the story? And if you see that kind of context left out,
00:11:51.300 but instead they do a making you think past the sale, that's just brainwashing. Now, if you did not
00:11:58.580 recognize this thinking past the sale technique, you would read this as news. And so all of those
00:12:05.540 consumers who are not as well-armed as my viewers who know this technique, if you don't see it so
00:12:11.460 clearly and say, oh, that's one of those techniques, that's a brainwashing technique, making you think
00:12:17.460 past the sale. If you didn't recognize it, you wouldn't know. You would just think you saw the news.
00:12:23.120 Think about that. At least 95% of all consumers would not recognize this big glaring signal that
00:12:30.800 this is not real news. They wouldn't see it. Because you're sort of used to people talking this way.
00:12:36.540 You don't understand it as technique. Here's another thing that they could have added for context.
00:12:43.780 Tell me if any of this is untrue. I'll call it the Adams rule of fraud. That whenever there's a
00:12:51.340 situation where fraud can happen, and there's a very high upside, and lots of people are involved,
00:12:58.080 so you know that somebody's going to take that chance, even if some of them are honest. Under those
00:13:02.860 conditions, fraud happens every time. Did I just say anything that you find even provocative,
00:13:10.860 much less untrue? Is it even controversial? Can you find anything wrong with that statement?
00:13:18.020 That if it's possible, and there's a high payoff, and lots of people involved, so you know that some
00:13:24.080 of them will be criminals, that fraud happens basically every time? How is that not a fair
00:13:30.440 statement? Now, isn't that pretty valuable context? Because they're treating the election as if it's a
00:13:37.000 thing which, by its nature, is usually, or should be assumed to be, fair. Nothing like that is true.
00:13:46.180 It is observably, obviously true that this election, and all of our other
00:13:54.340 ones, were almost certainly fraudulent. Now, I'm not saying that no Republican ever won because of fraud in an election. I assume
00:14:00.180 it's happened. I just think that the smarter, fairer, more rational context would be, of course there was
00:14:09.600 major fraud in the election. Of course there was. There had to be. And if I had to guess that wherever
00:14:17.440 Republicans could get a little advantage, almost certainly there was Republican fraud. If you think
00:14:24.320 I'm the one who's going to say just the Democrats do fraud, nope. Nope. There's no part of me that thinks
00:14:31.220 Democrats are the only ones who do election fraud. But it is nonetheless true that in those swing states
00:14:38.620 and in the big cities in the swing states, they're under Democrat control and so the Democrats have
00:14:43.680 more options for fraud. And so it wouldn't matter if Republicans and Democrats were equally willing to
00:14:49.560 do fraud. It would only matter that the Democrats were willing because they have the levers that the
00:14:54.880 Republicans didn't have. Now, you could argue, if you wanted to be consistent, you could say, well,
00:15:00.680 Scott, you know, Trump won Florida and maybe there was some Republican fraud. I don't know.
00:15:07.100 I don't know. If you told me there was Republican fraud that gave Trump Florida, I wouldn't push back
00:15:14.060 on that. I'd say, I don't know. But it's certainly within the category of things that you would expect
00:15:19.400 to happen. The only reason I wouldn't expect fraud in Florida is if the Republicans were confident they
00:15:26.240 were going to win the regular way. And then maybe not. But wherever you can have it, it's going to be.
00:15:33.040 Wherever it can be. Now, I think that would be perfectly good context to say it hasn't been
00:15:40.860 proven that there's fraud, but we are talking about a situation where it's guaranteed. Let me give you
00:15:46.240 another example. I have no statistics of crime in New York City. Let's just say I, for whatever reason,
00:15:53.620 I don't have statistics for that, could I still know that there was crime in New York City?
00:16:00.060 Pretty reliably, right? Because it's a city and it's full of lots of people. And within that big,
00:16:07.080 complicated city, you know there are going to be people who want to do crime, have a high upside
00:16:11.880 and the capability of doing it. You don't need to see it to know it's happening. Same with the
00:16:19.120 election. You don't need to see the fraud. It can't not happen. It's the setup that makes it happen.
00:16:26.660 All right. So that's the New York Times. And that is pure brainwashing and manipulation.
00:16:32.700 That's the world you live in. They're not even trying to be news. And let me say that again.
00:16:36.820 This isn't even trying to be news. If you think this is some kind of an accident where this was
00:16:43.140 just not a good job, somebody wasn't a good writer, no, no, nothing like that's happening.
00:16:49.700 This is just brainwashing. There's another piece of brainwashing going on. And I'm at least partly
00:16:57.100 the subject of it. Meaning that people like me, simply talking about the topic, and let me say this
00:17:04.300 as clearly as possible. I am personally not aware of any proof of election fraud.
00:17:13.640 Period. Personally, I'm not aware of any proof. I'm aware of lots of allegations. Some look credible,
00:17:21.040 some look less credible. But I'm not personally aware of any fraud. How could I be? I mean,
00:17:26.120 unless I saw it myself, I probably wouldn't believe it if it were reported. So I'm speaking completely
00:17:32.320 objectively about what we all see. And in fact, I don't think I'm adding anything that we don't all see.
00:17:40.140 So even when I retweet something that's some claim, I don't retweet it like it's a fact. If you've seen
00:17:47.860 my tweets lately, I say, is this true? Or has it been debunked? So am I allowed, ask yourself, am I allowed
00:17:57.080 to speak this objectively on just the topic? I'm not telling you, you know, I'm not adding information
00:18:06.120 that's not obvious to everybody. Can I talk about it? Well, I've had people come at me on social media
00:18:12.220 lately, and say that I'm the election denier, and that now I'm the, I'm the brand of crazy people who
00:18:21.660 think the election was faked. Do you notice, do you notice what's happening? By making me a character of
00:18:32.900 ridicule, they want to, they want to hold me up as somebody they're ridiculing, so that you don't do it.
00:18:39.500 Let me ask you this. If you saw me getting savaged in social media, because somebody else claims that
00:18:48.600 I'm saying the election was stolen, are you as likely to go public? No. The whole point of it is
00:18:57.820 it's suppressing fire. So they'll go after somebody like me, and I'm not even, I'm not even one of the
00:19:03.680 crazy people with crazy claims. I'm pretty sure I haven't made any crazy claims, like zero. But if
00:19:09.840 they can paint me as one of the crazy ones, then it will discourage you from even talking about it
00:19:15.540 in public. Because what, what they need to do is set up a situation where if you even brought up the
00:19:20.720 topic, even at work, you'd be at work and you brought up the topic, you want the other people
00:19:27.140 at the table to just go, whoa, I guess you're one of those deniers. What's next? Holocaust denial?
00:19:38.380 See? So that's, so it's suppressive work. Now there's something they didn't count on.
00:19:45.680 Me. They didn't count on the fact that I have no shame and no sense of embarrassment,
00:19:52.500 embarrassment. And I'm largely immune to this sort of thing. So all it does is make me want to talk
00:19:57.960 about it more. So today I wasn't going to talk about it. But because, because people told me that
00:20:03.780 I'm a bad person for talking about it, oh, now I'm going to talk about it. That's just red meat. As soon
00:20:10.300 as you tell me I can't talk about it, I don't want to talk about anything else. So you might hear a lot
00:20:14.860 of it. All right. Do you know George Conway? I bet you do. George Conway, he's,
00:20:25.820 you know, Kellyanne Conway, of course, had been a top advisor to the president, helped him get
00:20:32.440 elected the first time. And her husband, George Conway, who I call the lesser Conway, he's like the
00:20:39.500 Conway that, you know, if you were going to do a startup, have you ever heard of an MVP version?
00:20:47.660 Sometimes when you do a startup, you do something called an MVP version or a minimum viable product.
00:20:54.620 So you slap together some software or a product just to see if somebody would use it, but it's
00:21:01.520 really poorly done. It's just slapped together. I feel like when God was making Conways, he's like,
00:21:07.980 let's try to make a Conway. And he made George. And he's like,
00:21:12.820 I feel like version 1.0 needs a little upgrade. And then he did Kellyanne and their child.
00:21:23.000 And they got it right eventually. But anyway, I asked this question, which George Conway responded
00:21:29.340 to, which is why I'm mentioning him. I said, question for experts, hypothetically. Now,
00:21:34.200 hypothetically is important to what I'm going to say next. Hypothetically. All right. I said,
00:21:40.020 hypothetically, what would happen if Biden is inaugurated? And a month later, it's proven,
00:21:45.920 and I put this in capitals, beyond any reasonable doubt. So just hypothetically, what would happen?
00:21:54.640 Biden takes office, gets inaugurated, his administration gets down to work. And then later,
00:22:00.020 some evidence emerges that is beyond dispute. Let's say, an eyewitness plus computer logs, whatever. Just
00:22:08.700 something that nobody would doubt. Just literally, no Democrat would doubt that the election was stolen.
00:22:15.720 Now, I don't think that can happen. That's why I'm saying hypothetically. It's very unlikely. But,
00:22:21.580 you know, in the wildest possibility, it could happen. Anything's possible. Here's what George
00:22:27.160 Conway said to that. He mocked me. Now, remember how the mocking is important, right? The mocking
00:22:34.620 is suppressive fire. Do you think George Conway had any real reason to tweet at me?
00:22:42.240 Why would he? What would be the motivation? Suppressing fire. It has nothing to do with whether
00:22:47.980 my point is good or bad. It has everything to do with whether I should be able to talk,
00:22:52.760 or should I be mocked for my opinion. So in order to mock me, he rewords my hypothetical question to
00:23:00.560 show how ridiculous it is. And remember, he's a professional lawyer. And professional lawyers
00:23:05.840 can, they can debate really well. So you're going to see a top lawyer at the top of his form debating
00:23:14.340 in public and making me look silly with this great, great mocking point that he makes. He
00:23:20.580 rewrites my tweet to say, hypothetically, I'll do it in a mocking voice so you can get the whole
00:23:25.820 feel of it. Hypothetically, what would happen if Biden is inaugurated and a month later it's proven
00:23:31.740 beyond any reasonable doubt that Martians have the real constitution and it says Trump is president
00:23:38.900 for life. Unlikely scenario, sure. But who would be president then? Says George Conway. Now, I've got a
00:23:48.880 book for George. It's behind my head. It's called Loserthink. In my book Loserthink, I have a large
00:23:56.500 section talking about how only idiots use analogies as arguments. Now, his argument is
00:24:07.780 that Martians having a constitution would be a similar likelihood to fraud being proven in the
00:24:17.640 election. Now, let me put this in context. The odds that there was massive election fraud is close to
00:24:27.720 100%. Again, it's because of the setup of the situation guarantees it, not because of any specific
00:24:34.280 evidence that I'm looking at. Just the setup guarantees it. So you have a situation that
00:24:39.260 guarantees something will happen, and he's comparing that in an analogy to something that could never
00:24:45.940 happen. So his argument is an analogy of something that could never happen to something that pretty
00:24:54.960 much is guaranteed did happen. We just don't know if we'll ever know, right? So it's the part where you
00:25:01.320 don't know if you'll ever know that's, you know, the uncertainty, not whether it happened.
00:25:07.480 And now, let me ask you this. Do you think that that was a good lawyer point? He's got the logic and
00:25:13.780 the data on his side, using a rational argument? No. No. On this point, let me give George
00:25:23.200 Conway a compliment if I might, right? Because it's important to the story. He is a professional
00:25:30.920 high-end lawyer. I don't think Kellyanne Conway married a dummy. Do you? This is a smart guy.
00:25:40.380 Smart guy who knows the law and therefore knows how to debate things. Do you think when he chose the
00:25:46.580 strategy of mocking me as a person, do you think he chose the strategy of mocking me while he had a
00:25:53.760 different strategy, a go-to strategy of showing me data and an argument that was more solid than mine?
00:26:01.840 Do you think if he had an argument that was better than mine, or let's say a point with data, etc.,
00:26:08.580 do you believe that he would have used the good argument if he had one? Of course. Nobody uses the
00:26:18.280 bad argument if they have a good one. If he had any kind of a pushback that would make sense and would
00:26:24.940 be persuasive, he would use it. But instead, he goes for the mocking analogy, which tells you their
00:26:30.080 strategy is to go after the person, which he did. Now, here's the other tell for a fake news industry
00:26:38.500 that's manipulating you. If you see the media going after the people instead of the argument,
00:26:45.380 that's all you need. That's all you need to know. If they're going after the people instead of the
00:26:51.840 argument, you are in a massive brainwashing scenario, which you are. Let's see. What else we got here?
00:27:00.760 So Representative Louie Gohmert, I don't know how to pronounce his name, of Texas, he had this
00:27:08.240 suit where he wanted Pence to be able to throw out the electoral whatever. But that got tossed in a
00:27:16.120 court, so we don't have to worry about that. So on social media, on Twitter, a user said that he
00:27:28.460 doesn't know anybody who smokes weed who adds to society. So somebody said that in public.
00:27:38.340 I don't know anybody who smokes weed who adds to society. So I tweeted back, for context, it would be
00:27:48.180 important to know how many people do you know? Because if you only know three people, okay. I can
00:27:54.960 certainly believe if you only know three people, you probably don't know anybody who smokes weed,
00:28:00.060 or you might not know anybody who smokes weed and is also successful. I have a feeling, just speculation,
00:28:08.180 that somebody who has an opinion like this, maybe their friends lie to them. Just put it out there.
00:28:16.080 Maybe, just maybe. This is a person who has, possibly, several friends who smoke weed and are very
00:28:25.240 productive, but maybe they don't share that fact with Mr. Judgy, Mr. Judgmental. Maybe they don't share
00:28:33.700 that. But this sparked me to make a larger statement about drugs. Do not smoke marijuana. Can I say that
00:28:45.360 as clearly as possible? Can we, can we start with that? Don't smoke marijuana. You know, don't be like
00:28:52.660 me. Don't do drugs. Now, having said that, children, did you all hear that, children? Don't do drugs,
00:28:59.900 children. Now, that's, that's the child message. Can you put the children away? If there are any
00:29:05.840 children listening, quite seriously, this part I'm not kidding. No joke here. Don't let the kids hear
00:29:12.960 this next part. This is for adults. And I think you, you adults would all understand that there are
00:29:18.640 things adults will and will not say in public because kids will hear it, right? There are things
00:29:25.620 you would say to other adults, but you might not say it in public because your kids will hear it.
00:29:31.480 And this is going to be that conversation. It goes like this. I have, because of my Dilbert
00:29:40.780 career, access to an unusual number of highly successful people and have for years. Because
00:29:49.140 people who are successful like to talk to other people who have done something. So it's very
00:29:53.380 natural you end up meeting a whole bunch of billionaires. So for whatever reason, I probably
00:29:59.780 have met and had, you know, meaningful conversations with more billionaires than most people and hugely
00:30:06.080 successful entrepreneurs, et cetera. And I'll tell you, there is one correlation that looks
00:30:11.320 pretty strong, which is drug use. But there's a really, really big difference between successful people
00:30:21.740 who use any kind of drug and unsuccessful people who use any kind of drug and maybe become addicted
00:30:29.240 and their life is ruined. And here's the difference. You don't know which one of those people you are,
00:30:37.240 unfortunately. You might say to yourself, you know, let's say you're 19, pick an age. You say to
00:30:43.720 yourself, I think I'm the kind of person who can do some drugs and I think it'll just help me.
00:30:48.640 Might relax me, expand my mind, whatever. I think I am that kind of person who can do a bunch of drugs and
00:30:55.700 I'll be fine. Turns out you didn't know you're an addict, meaning that biologically you're more
00:31:04.520 likely to be addicted to substances. And the same thing that might not have killed someone else
00:31:10.040 ends your life. And I don't mean you just die. I mean, you become addicted and then the addiction
00:31:17.060 becomes your life. So your old life is over. You have this new thing. You could call it a life,
00:31:23.040 but it's the addiction becomes your personality at that point. The old you is dead. Still alive,
00:31:31.220 but dead in a sense. So here's the thing you just have to know. It is not true that drugs are bad.
00:31:40.320 It is not true that drugs are good. It's very not true that drugs are good. Here's what's true.
00:31:47.760 If you could match the right drug with the right person and the right situation, you get magic.
00:32:02.000 Sorry, children. Now, the trick is you don't know if you're that person. And chances are you're not.
00:32:12.160 If you're going to play the odds, you're not that person, right? But sometimes there is that person.
00:32:20.360 I'm going to throw out some names. And these are not, I'm not going to, I don't want to incriminate
00:32:26.000 anybody. I'll just put out some names that maybe would be obvious to you. But let's take a Steve Jobs,
00:32:32.160 who famously, you know, did some LSD and probably marijuana and some other stuff.
00:32:38.180 Take your Bill Gates, young Bill Gates. I think he admits some marijuana usage. I believe that's
00:32:46.460 part of the public record. If it's not, I take it back. So I don't want to, I don't want to cast
00:32:51.800 aspersions on people. Now, why did I pick Steve Jobs? Because he's deceased, right? So I can talk about
00:32:59.200 him because he's deceased. I want to mention some other billionaires. I want to.
00:33:08.180 But I'm not, right? Because they're so alive. Wouldn't be fair. I can tell you that among the
00:33:14.900 most successful people, one of the defining characteristics doesn't look like a correlation.
00:33:21.740 It does look like causation. But without a, you know, randomized controlled trial, you don't really
00:33:27.760 know for sure. But it sure looks like causation. And certainly personal experience would suggest it is.
00:33:34.300 And that is that a very smart person who is lucky enough, and it has to be luck, to not have the
00:33:43.000 addiction genes, if you will, that person can, on some occasions, find the right combination of legal
00:33:49.920 or illegal drugs to boost their performance a lot. In fact, I would go so far as to say the success of
00:34:00.980 Silicon Valley is largely based on drugs. And it's largely based on drugs in the sense that those
00:34:08.500 few people who found the right drugs that just got lucky, they weren't addicts. And in some cases,
00:34:15.040 they probably were actually, and still did good things, despite their lives being suboptimal.
00:34:21.360 So here's the point. There's no such thing as drugs being good or bad. You just have to be lucky
00:34:29.560 enough that the mixture of the drug, the person, and the situation are right. And again,
00:34:37.700 this is the adult conversation, right? Nothing I say today should suggest to you to do drugs. I'm telling
00:34:44.280 you as clearly as possible that if you just wander into it and say, I think I'll try some drugs,
00:34:50.680 your odds are bad. They're bad because there are so many people who are addicts. It's just a bad bet.
00:34:57.880 It is nonetheless true that people do it, right? It's a free country. People analyze their risks.
00:35:04.600 They take chances. Maybe I think they shouldn't, but they do. We live in the real world. They do.
00:35:09.160 Some of them are going to work. Now, one of my lucky coincidences is that my chosen drug,
00:35:16.140 marijuana, finally is legal. So at least I don't have to break the law. But I can tell you in my
00:35:22.140 personal situation, again, this is a coincidental, perfect situation where the exact drug, the exact
00:35:30.900 person, and the situation were perfect. I'm a creative person by nature. Marijuana makes me
00:35:40.040 more creative, and it's not even close. If you're saying to yourself, I think you just imagine you're
00:35:45.620 being creative. Nope. It's pretty easy to demonstrate. I mean, I can't demonstrate it to you,
00:35:53.720 but I can tell you over the course of my career, you know, I've got a whiteboard next to where
00:35:59.600 my bong is. And I can tell you that the whiteboard gets filled with ideas that do become part of my
00:36:08.520 content. They do make money. The next day, they're just as good. You know, it's not like I wake up the
00:36:13.740 next day and go, what the hell was I thinking last night? It doesn't work that way. I look at the board
00:36:20.040 and I go, shoot, I never would have thought of that if I hadn't been high. And it's like my best ideas,
00:36:26.540 my best jokes, my best books, my best concepts, my best weird ideas, pretty much all of them come
00:36:35.020 out of marijuana. Now, that's not why I smoke. It's just luck. The reason I do it is for health
00:36:42.920 reasons. I don't do it recreationally. I do it completely medicinally. It solves like a whole
00:36:49.340 list of problems. Everything from it makes me sleep better, good for my allergies, you know,
00:36:56.900 fixes my attitude. So those are the reasons I'm doing it. I can exercise better. I mean,
00:37:03.140 just everything works better with marijuana for me. Don't do marijuana, right? I got lucky,
00:37:11.360 pure luck, pure luck, that that drug and my personality and my job, which is creativity,
00:37:20.880 just happen to match perfectly. The odds of you finding that combination, don't even look for it
00:37:27.800 because your chances of ruining your life looking for it are pretty high. I'm just saying it happens.
00:37:35.120 Some of us get lucky. Now, is it really luck? Is it really? Well, let me tell you some things.
00:37:43.860 I'll give you my experience, and that might give you a little insight. In college, I did try some form
00:37:50.760 of speed. Now, it was, I think it was Ritalin, you know, some illegal Ritalin or something I had,
00:37:56.700 and some other illegal stimulant. I wrote my entire senior thesis in, I think, four days over Thanksgiving
00:38:07.540 break. I wrote my entire senior thesis in economics in four days. You were allocated an entire term,
00:38:18.800 right? So you've got the whole term to write this thing. I did it in four days because I wanted to,
00:38:24.260 didn't want to go home right away. That's it. And I did that on one pill. Now, I got like a B+,
00:38:31.960 I think. It was fine. And do I recommend that you take a stimulant? No. No, I do not. Do you know
00:38:42.540 how easily that one experience could have ruined my life? Oh, I take it back. It was more than one
00:38:49.260 experience. Well, I did have several experiences with, I think it was Ritalin at the time. And that
00:38:55.840 was a stimulant. And when I did that, I could do all my homework. I could smoke weed as much as I
00:39:03.860 wanted, and it would sort of cancel each other out. And I could exercise all day long. And when I was
00:39:10.760 done, the outcome was a lot of good work, a lot of exercise. And that's it. And I walked away. It's
00:39:20.320 like, okay, I just won. And I didn't give up anything. So now you say to yourself, hey, I guess
00:39:28.360 I'll go do those drugs because it worked out for Scott. No, no, don't do that. Do you know how lucky
00:39:35.920 I was to have that experience and not become addicted? Probably the only thing that kept me
00:39:43.800 from being addicted was supply. Because it wasn't easy to get those stimulants in those days. At the
00:39:50.740 moment, it's easy. I believe it's easy at the moment. I don't try to get them, so I don't know.
00:39:56.420 But I feel as if the only thing that kept me alive was a lack of supply. Because the experience was so
00:40:03.520 good that I feel like, especially at that age where, you know, your brain isn't quite developed,
00:40:09.120 you don't have the experience or the ability to avoid things that are tempting, I feel as if it
00:40:14.900 might have killed me. You know, I could have been a meth addict, could have taken it up the chain to
00:40:19.400 something worse. So I would say that everything about my experience is really luck, plus the fact
00:40:29.100 I don't seem to have an addictive personality. I don't like alcohol. I just don't, I don't like
00:40:34.900 the experience of alcohol. And I did, I did mushrooms once, only one hallucinogen, changed my
00:40:43.760 whole life. I've told this story many times. Once you've done a hallucinogen, you can see that life
00:40:49.160 is subjective. And once you understand that your experience of life is subjective, even if there's
00:40:54.360 an underlying objective truth, that once you realize that it changes the whole way you live.
00:41:01.260 Because you say, well, if I don't like my subjective reality, can I change it? And the answer is,
00:41:08.060 yep, you can. You know, it takes technique. But you can completely change how you frame your reality
00:41:14.300 and hallucinogens prove that to you in a way that just thinking and researching and reading and
00:41:20.760 experimenting will never prove it to you. But you want proof? Spend a few hours on mushrooms. Again,
00:41:28.780 don't do drugs. I'm not saying you should do that. I'm very, I'm aggressively telling you not to do it.
00:41:37.380 But if you did, and you're an adult, you would have an experience if you didn't die, you know,
00:41:42.860 if you didn't get a bad batch of mushrooms or something. So there's some risk involved.
00:41:46.880 But if you did it, it would change you forever. And here's why it's important. It's not just
00:41:54.280 this weird new knowledge that reality is subjective. It's understanding that you can
00:42:00.120 change that subjective reality to optimize your life gigantically. This is not trivial stuff.
00:42:08.400 I'm talking about something that will make your life five to 10 times better. You know,
00:42:14.540 we're not talking about a 20% improvement because you did mushrooms once. We're talking about like a 500%
00:42:20.820 improvement over the course of your life, simply by understanding that you can shape your reality
00:42:27.880 to have a subjective experience that's different than the one that's the default. So that's what
00:42:35.660 that will do to you. Now there are other drugs I've never done, such as Molly or what's the other
00:42:46.140 name for Molly? Ecstasy. So I've never done that because I hear that they're too good. There are some
00:42:54.920 drugs that I won't take because they're too good. Heroin, cocaine, and yeah, and MDMA. So there's some
00:43:05.320 things that that's the reason I don't take them. They're too good. And I know that if I let myself
00:43:10.120 get into that stuff, I don't know if I'm strong enough not to do heroin twice. Do you? I'd like to
00:43:18.260 think I'm strong enough that if I, you know, took some heroin and I really liked it, I'd be like, you
00:43:23.440 know, I did that once. Guess I don't need that. I'd love to know that's me. But you don't know that
00:43:31.040 about yourself. If there's anything that experience teaches you better than anything, the best thing
00:43:37.680 that experience teaches you is you don't know yourself. Once you realize that you think you know
00:43:44.820 yourself, but even you don't know yourself, then you're a little safer because you'll treat yourself
00:43:51.160 like an unknown and then you'll do better risk management because you'll say, I don't know, feels like,
00:43:56.780 it feels like I wouldn't get hooked on heroin, but I don't know. So I'd better stay away.
00:44:03.400 So that's my point. I'm not going to say I don't know anybody,
00:44:09.360 but I personally know almost nobody who is hugely
00:44:15.900 successful who did not get there with drugs as an assist. Sorry. That's why the children
00:44:25.780 had to leave the room. In the real world, this is well known, by the way. Among the successful people
00:44:32.220 who also know other successful people, this is common knowledge, what I'm telling you. There are
00:44:39.100 some people through luck, but also some skill. The luck is that they're not addicts by nature.
00:44:45.680 The skill is that they then put a little thinking on what to do and what not to do.
00:44:49.480 And some of them craft their drug use specifically for productivity. So when somebody says, I don't
00:44:57.080 know anybody who smokes weed and has been successful, that may be, you know, you may have some friends
00:45:01.600 that fit that category. It makes sense. But also most people are not terribly successful. If I said to
00:45:07.740 you, I don't know anybody who's, okay, I'll say this. I don't know any redheads that are really
00:45:14.660 successful. And I'm thinking if I can think of an exception. And the answer is, I'm sure I probably
00:45:24.000 know some redheads that are super successful. I mean, I know lots of redheads who have good jobs,
00:45:29.700 but I don't know any redhead like Steve Jobs. I don't know a redheaded billionaire, right? But,
00:45:37.840 you know, solid jobs and good citizens and stuff. And so should I conclude that nobody with red hair
00:45:43.420 could ever be really, really successful? No. This is bad statistics. Because most people do not
00:45:50.680 become billionaires. Most people do not become Steve Jobs. So it wouldn't matter what category you
00:45:56.600 picked, people with big noses, people who, whatever, you would find that most of them are not successful.
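The base-rate point can be put in rough numbers. This is a toy sketch; both figures below are made-up assumptions for illustration, not real demographics:

```python
# Hypothetical numbers, purely to illustrate the base-rate point.
p_redhead = 0.02             # assume ~2% of people have red hair
p_billionaire = 1 / 300_000  # assume ~1 in 300,000 people is a billionaire

# If red hair had zero effect on success, how many redheaded billionaires
# would you expect among 1,000 acquaintances?
acquaintances = 1_000
expected = acquaintances * p_redhead * p_billionaire

print(expected)  # about 6.7e-05 -- effectively zero
```

Even with zero correlation, you would expect to personally know essentially zero redheaded billionaires, so observing none tells you nothing about red hair and success.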
00:46:03.860 So would you say, well, I think that red hair must be correlated with a lack of success because the
00:46:09.360 people I know personally with red hair are not billionaires. Okay, that's just bad thinking about
00:46:14.280 statistics, right? All right. Way more than you wanted. I saw something on CNN about how to make
00:46:25.020 your habits last. And so I'll recommend you to CNN, but I'm not going to use exactly their list. I'm going
00:46:31.640 to modify it a little bit. And it goes like this. So here are some ways to lock in habits. This is
00:46:39.620 inspired by CNN, so go to CNN.com if you want to see their take on it. One way is to pair the
00:46:46.420 new habit with an existing habit. So the way that one-a-day vitamins were marketed
00:46:55.500 is as one-a-day. And the one-a-day told you, oh, I'll do it when I brush my teeth because I'm doing
00:47:02.560 that other stuff that gets you ready for the morning. Now, the reason it's one-a-day is just
00:47:08.480 for marketing. There's no reason that vitamins should be one-a-day
00:47:12.800 versus, you know, smaller doses throughout the day or any other dose. It's purely marketing and not
00:47:18.240 science that it's once a day. That's well understood, by the way. I'm not making that up.
00:47:24.980 But by pairing it with something you're definitely going to do every day, which is brush your teeth,
00:47:29.600 you pretty much guarantee that you're going to take your one-a-day vitamin. But there are other
00:47:33.820 ways to do it. You can pair other habits. For example, the CNN example was, well, actually,
00:47:41.280 this is the second example. You can pair something pleasurable with the thing that you want to make a
00:47:47.940 habit. So their example was to make it a habit to listen to an audio book while you exercise. So
00:47:54.500 you're pairing a thing you enjoy anyway with the thing you want to make your habit. And so you pair
00:47:59.780 them. Good technique. So that's slightly different than just pairing habits where one habit exists.
00:48:07.280 This would be a second thing. Another one is to get a partner who's in on it with you to keep you on
00:48:13.560 track. Let's say you're both trying to lose weight. You're both trying to walk every day,
00:48:17.420 whatever it is. If you've got one other person who's going to keep you honest, that can help.
00:48:23.140 All right? Because maybe they can talk you into walking the day you don't. You can talk them into
00:48:27.000 it, etc. Here's another one. Create an incentive, reward, or penalty for each time you do or don't
00:48:32.980 do the habit. Now, I've talked about this with, I love to be tired after working out and have a nice
00:48:41.200 protein shake that's delicious while looking at what happened on Twitter on my phone. That little
00:48:47.680 moment where you've worked out and you're just enjoying your beverage and the protein feels like
00:48:53.360 it's helping you and you're just thinking of whatever you want to think is my reward. So I would
00:48:58.660 always pair, I would put that after exercise. But there's another way to do it, which is to penalize
00:49:03.320 yourself. To penalize yourself. So the CNN example was that your spouse would
00:49:12.500 donate a small amount of money to a charity that you don't like every time you didn't do your
00:49:19.020 habit. So it'd be like a little penalty for skipping the habit. Here's how
00:49:24.440 I did this with myself. I could never remember to put on my sunscreen, at least on my face,
00:49:30.380 the exposed parts before I went out for the day. If it's just a regular day, I would always remember
00:49:35.920 my sunscreen if I were going swimming or I'm going to be outside all day, but I couldn't get the habit
00:49:41.880 of putting on sunscreen just every day, you know, in California, especially. And so here's what I
00:49:49.200 lit on. I decided to punish myself every time I forgot. And the punishment would be, I'd be sitting
00:49:56.400 in my car and I'm ready to go wherever it is I want to go. And if I realized then that I didn't
00:50:01.740 have my sunscreen on, I would have to get out of the car, put everything down like I'd never been in
00:50:08.760 the car, take off my jacket, walk upstairs, put on the sunscreen, and then redo it. Now, do you know
00:50:16.800 how annoying that is? If you're like in the car and you're ready to start it and you want to go
00:50:21.900 somewhere, you hate that. You hate that you've got to redo everything and just start again. And so
00:50:30.420 that was my punishment. Do you know how many times I had to do that to myself? I don't know the exact
00:50:36.620 number, but if I had to guess, over the summer, let's say the last six months I've been trying
00:50:43.020 it, probably 40 times. Think about that. 40 times I'm ready to go. I put it all down and I walked
00:50:55.820 upstairs. As a result of punishing myself regularly for six months, I'm much closer to remembering it
00:51:05.900 every time now. I'm not there yet, but that's my technique. All right, so that's how to do that.
00:51:11.480 Here's a little question that is also for the adults. In this case, I don't mean just the adults
00:51:17.420 by age, but people who can understand nuanced points. What I'm about to talk about, I should
00:51:24.000 not talk about because I'm not a doctor. I'm no scientist. Damn it. I shouldn't be talking about
00:51:31.300 stuff like this. But there is an interesting category in the world, which is one of my favorite
00:51:36.900 categories, and it is this. Things which cannot be communicated for different reasons. For example,
00:51:44.360 here's a thing that could not be communicated. Let's say you were known as the biggest liar in
00:51:49.040 your town. You just had a reputation for making up stories that were crazy. And then one day,
00:51:54.840 you're actually abducted by actual aliens. Like in real life, it's the only time it's ever happened,
00:52:01.500 hypothetically. And you're abducted by actual aliens. Could you communicate to anybody else
00:52:09.240 after that, that it really happened? And the answer is no. Because your reputation as the most
00:52:14.840 famous liar would make it literally impossible for you to communicate a true thing, that you were
00:52:21.120 abducted by aliens. Couldn't be done. And there are a whole bunch of other weird situations in which
00:52:26.980 you can't communicate. Like I could be right in front of you, and it can't happen. It's just
00:52:32.960 impossible. I'll give you another example of one of those. There are all kinds of weird examples of
00:52:37.680 this. One is, there was something that happened to me a long time ago. And I once tried to tell
00:52:44.520 somebody about it, that I'd never told anybody about it. And it would be the first time anybody ever
00:52:51.360 heard this traumatic story. And I actually had this weird experience where my mouth wouldn't do it.
00:53:00.000 I had made the decision to say it. Like the mind was all on board. There was no ambiguity in my brain.
00:53:08.820 And my brain ordered my mouth to talk, and it wouldn't do it. I've never had that experience before.
00:53:15.540 And it just wouldn't do it. I mean, it literally wouldn't do it. And it was because it was so
00:53:23.980 traumatic that my brain just said, nope, this is not going to ever come out of your mouth.
00:53:31.120 And it never has. Never in its fullness. I've suggested it before. But never in its fullness.
00:53:37.920 And I don't know if I ever could. Actually, I think my mouth actually wouldn't work. Now,
00:53:43.000 you know, I don't want to get into some sad example, because I'm well over it. So you don't
00:53:48.060 have to worry about me. But there are these categories. Here's another one. What I'm going
00:53:54.080 to talk to you about now should only be talked about by experts and doctors. But they can't.
00:54:02.580 They can't. And here's the reason. Because it would be unethical for them to tell you what I'm going
00:54:09.120 to tell you. Now, I can do it because I'm not a doctor. And I will trust you to be adults and to
00:54:16.700 understand that if someone who is a professional cartoonist says something that you believe has a
00:54:23.440 medical implication, and then you follow that advice, well, that's your own damn fault. Because
00:54:30.480 let me tell you, you should not be getting medical advice from cartoonists. Now, if I told you
00:54:36.180 something that, you know, you thought was interesting, and you asked your doctor, that's
00:54:40.840 different. But make sure it's a doctor that gives you medical advice, okay? Don't take it from me.
00:54:47.280 So I'm going to wade in some dangerous territory. And this is just for fun. Okay? Nothing here is
00:54:54.980 supposed to suggest a policy change. Just for fun. And it goes like this.
00:55:00.460 I believe we know that coronavirus infections, let me get myself out of the way, will be worse
00:55:07.940 if the initial exposure is higher. In other words, if you were locked in a phone booth with
00:55:13.920 three infected people, and you stayed there for six hours, you would get so much virus in that little
00:55:21.340 phone booth that your sickness would probably be more extreme. Now, this is what the experts told us
00:55:28.280 early on in the pandemic. I don't think it's changed, right? Has anybody seen anything to
00:55:34.800 counter that? So I think it's an established fact that the greater the viral load, let's say in
00:55:40.540 closed spaces especially, the greater the fatalities. And then therefore it follows that if you were to
00:55:49.120 be exposed in like a passing manner, let's say in the, you were shopping and you walked past a few
00:55:54.860 people who had it and it was just sort of in the air, but it was just, you were in and out.
00:56:00.160 It follows that these people would be more likely to be infected with a lower viral load,
00:56:05.360 and therefore fewer fatalities. I'll ask now if there are any, you know, doctors watching this.
00:56:12.020 Give me a fact check on that. That's still true, right? That somebody says,
00:56:17.260 directly linked. Okay, so people seem to be confirming it in the comments. So here's the
00:56:24.460 part I want to add to this. Suppose you had a policy that said that you would do the same
00:56:32.580 amount of mitigation, whether it were in closed spaces or in open spaces. And then you would get
00:56:39.140 a result, which is a number of fatalities. So that the number of fatalities would be the result
00:56:43.780 of your national policy about what you did in both of these places. It's sort of a blended average,
00:56:50.180 right? But suppose, and again, this is the unethical part. This is the part that can't be tested
00:56:57.760 because it wouldn't be ethical to infect people intentionally. There's no way you can do that
00:57:02.700 ethically. So you can talk about it, but only if you're not a doctor, right? So that's why I can do it
00:57:09.900 because you won't take me too seriously. And that's the right mindset for this. Don't take me
00:57:14.220 too seriously. Suppose, and I'm just going to put this out here because I can't, my brain can't quite
00:57:22.860 get the answer itself. So I'm going to ask you to help me out. And it goes like this. Suppose you said
00:57:29.420 don't wear masks in places where your contact would be casual. Just for example, let's say that the
00:57:37.840 guideline came out that says, if you're in a closed space, wear your mask. If you're in an open space,
00:57:44.960 even let's say a Trump rally, you know, you're outdoors, you don't have to wear a mask. Now,
00:57:52.600 would there be a lot more infections if people didn't wear masks in, let's say, shopping centers
00:57:59.300 and stores and rallies? The answer is yes. There'd be tons more, right? We all agree with that. There'd be
00:58:05.180 tons more infections. But they would be this kind. And these people would go through the system and
00:58:12.540 have fewer fatalities. So here's the question. I don't think it makes sense to go for herd immunity,
00:58:21.080 right? I think the experts have, at least the experts in this country, have said that going for
00:58:27.300 herd immunity through infection is a really bad idea, especially because of the lingering health
00:58:33.540 health problems that you could have, even if you recover. So even if you don't die, you might have
00:58:39.840 some lingering problems, which seem pretty nasty. So there's no ethical way that anybody should suggest
00:58:47.200 taking your mask off in, say, a shopping center, right? So do we agree with that? Nobody ethical in the
00:58:56.700 medical community would ever suggest you take your mask off in a public place to give yourself an
00:59:02.920 infection, or so that, on average, more people get it, just because that kind has fewer fatalities. You can't
00:59:09.280 ethically recommend that. But here's the math question. Forget about the ethical part. I just want to
00:59:17.540 understand the math. If you did more of these infections, it would be at the expense of more of
00:59:24.280 these, wouldn't it? Now, this assumes also something about how quickly the vaccinations are rolled out.
00:59:30.820 If the vaccinations could be rolled out instantly, then wear your masks everywhere. Do you agree?
00:59:38.340 If you knew that the vaccinations would be here in one week, and everybody would be vaccinated in one
00:59:44.300 week, it can't happen. But if you knew that, you'd say, all right, I'm just going to stay home for a week,
00:59:49.680 because this is worth it, right? I'm not going to take any chance for one week. But because we don't
00:59:56.760 know how long it's going to take, could be six months, etc., we know that there's going to be
01:00:00.280 massive extra infections. Would we be better off, again, it's unethical to ask the question,
01:00:06.980 but math-wise, just in the math, would the fatalities go down if people stopped wearing masks
01:00:15.380 in outdoor public places? The infections would go way up, but would they be lower viral load, and
01:00:23.780 therefore the fatalities would go down? Now, if it's not obvious to you why this is not ethical,
01:00:30.640 it's because if you said, the day you said, don't wear your mask in public, you'd be killing people that
01:00:35.200 didn't need to be killed. You would kill extra people, but you also might get to the end of the
01:00:42.580 pandemic faster and save more of these kinds. So forget my credibility. You should assume none,
01:00:55.620 all right? So for this conversation, assume I have no credibility, I'm not good at math,
01:01:00.700 and I don't know anything about science. I'm just curious. I'm just curious if we're being ethical
01:01:07.800 at the expense of being successful. I don't know, but I think it's a good question. All right.
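The math question here can be sketched as a toy calculation. Every number below is a made-up assumption; the sketch only shows that the answer depends entirely on the real values, which is exactly why it is a question, not a recommendation:

```python
# Toy model of the trade-off: infections split into high-viral-load
# (indoor) and low-viral-load (casual/outdoor) exposures, each with its
# own fatality rate. All numbers are hypothetical, for illustration only.

def total_deaths(indoor, outdoor, ifr_indoor=0.01, ifr_outdoor=0.002):
    """Deaths = infections in each setting times that setting's fatality rate."""
    return indoor * ifr_indoor + outdoor * ifr_outdoor

# Policy A: masks everywhere -- fewer total infections.
deaths_a = total_deaths(indoor=100_000, outdoor=50_000)   # 1000 + 100

# Policy B: no masks outdoors -- many more infections, shifted toward the
# low-fatality kind, with (in this toy) somewhat fewer indoor infections
# because immunity builds faster.
deaths_b = total_deaths(indoor=80_000, outdoor=400_000)   # 800 + 800

print(deaths_a, deaths_b)  # 1100.0 1600.0 -- with THESE numbers, masks everywhere wins
```

Change the assumed numbers (say, if outdoor spread adds very few deaths but ends the pandemic months sooner, shrinking the indoor total) and policy B could come out ahead. The toy model answers nothing; it just shows the question is an empirical one.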
01:01:22.300 Let's see if there's anything I haven't mentioned that you just have to hear.
01:01:27.840 Oh, here's another point on the, if you're trying to build a habit and keep it, you'll see some people
01:01:34.280 give you this advice that you should have cheat days. If you, let's say you're on a diet and you'll
01:01:39.560 have, okay, every two weeks or whatever is your schedule, I'll have a cheat day. I think that's the
01:01:45.440 worst advice I've ever heard. It does make sense to allow yourself to have days when it doesn't work.
01:01:51.560 In other words, if you say to yourself, uh, you know, I'm going to try to exercise seven days a week,
01:01:56.900 but I'm also living in the real world and sometimes things come up. That's okay. That's just living in
01:02:02.840 the real world. But the moment you allow yourself that there's a cheat day, let's take food as the
01:02:09.940 best example. I think that having a cheat day for food is the number one worst way to diet.
01:02:16.320 Literally, there would be nothing less effective than having a cheat day.
01:02:23.080 And the reason is this, that if you're dieting successfully, you're not just eating less or
01:02:29.000 eating healthier or exercising more. Those are all good, but you're also getting rid of your cravings.
01:02:35.440 If, if you lose 20 pounds, but you don't lose your cravings, did you really lose 20 pounds?
01:02:41.500 Because they're coming back, right? So if you think about weight loss as weight, you've already lost,
01:02:49.280 right? You might as well not even bother because if you think what you're trying to do
01:02:53.040 is lose weight, you'll never succeed. If you're trying to lose your cravings,
01:02:58.960 well, then your chances are very good. So I teach you in my book,
01:03:03.200 How to Fail at Almost Everything and Still Win Big, which is behind this whiteboard.
01:03:08.880 I teach you to work on your cravings and then the weight loss works on itself. And an example of that
01:03:14.000 would be, uh, for, I used to be addicted to Snickers candy bars. Like I would almost drool when I woke up
01:03:22.060 in the morning. I couldn't wait to get my first one. And eventually it would progress to like,
01:03:25.980 I'd be eating four of the big size Snickers a day. And I'd be like, every bite of it would actually
01:03:31.540 make me feel alive. And I'd be like, Oh, I mean, I would actually shudder with craving for this
01:03:39.580 particular food. But the technique I would use is I'd say, how about I could eat anything I want
01:03:46.200 except that? So every time I wanted that, I would eat something else that I also liked.
01:03:53.600 And I wouldn't limit myself to the other stuff. And then if I didn't eat that Snickers for,
01:03:58.700 I found out, it was about eight weeks. You might differ a little bit, but after eight weeks of
01:04:04.720 not eating it, it didn't even taste good. So right now, if I put one in my mouth, I'd be like,
01:04:11.880 I don't know why I like these in the first place. Now my natural biological, you know,
01:04:19.300 preferences didn't change at all in eight weeks. What changed is my addiction. I had an addiction to
01:04:25.920 this certain fat, sugar, salt combination. And by the way, do you know, that's how they addict you.
01:04:31.080 Okay. There's a book called, I may have the words reversed, but it's something like Salt, Sugar, Fat,
01:04:37.820 or those words might be in a different order. And it teaches you how the scientists found out that if
01:04:43.180 you get the right combination of those three materials, it's addictive. Like it actually
01:04:47.780 activates the addiction part of your brain. So when I was eating the Snickers, I thought I was
01:04:53.160 overeating in the old days. But when I got smarter, I realized I wasn't overeating.
01:04:58.440 I was addicted to certain foods that had a bad outcome for me. So I would break each addiction
01:05:04.240 individually, get rid of the craving, and then it's easy. You know, once the cravings are gone,
01:05:11.300 you don't even think about it. It's just what you eat. So at this point, I can eat a yam with some soy
01:05:18.380 sauce on it after cooking it, you know, getting it nice and soft. And just eating a yam with some pepper
01:05:24.240 and soy sauce is like a delightful meal. But only because I trained myself to get rid of the addiction
01:05:30.380 food, which I don't even miss. Now, if you told me how much do you miss your favorite food? I'd say
01:05:37.080 zero. Zero. There's not even 1% of me that wants to eat that. It's not like giving up cigarettes, as I
01:05:45.280 understand it. I've never given up cigarettes. I don't smoke. But what I hear is that even if you give
01:05:52.240 up cigarettes and you're successful, you sort of always crave it. But you can actually get rid
01:05:57.240 of a craving for certain foods. It's just gone. It never comes back. All right. You're lying to
01:06:05.940 yourself that a Snickers bar is greater than soy on a yam? I don't know what that means.
01:06:13.660 So he says, the latest Dr. Pepper commercials cured my Dr. Pepper addiction. Yeah, I had to cure
01:06:18.960 my Diet Coke addiction, which was probably 12 Diet Cokes a day. So I guess I do have an addictive
01:06:25.700 personality in some of the food sense. But I've managed to work around it. All right.
01:06:34.180 These are your tips for the day. Did you learn anything today? Any of this helpful?
01:06:38.440 We'll see. All right. And here's my last question. Is it ethical to travel during the pandemic
01:06:49.500 to places that are open? Since you know that traveling during a pandemic increases the rate
01:06:58.100 of risk for not just yourself, but for other people, is it ethical to travel to places that
01:07:05.860 are open? Just looking at your comments. See, mostly yeses. Yeah. And here's my take on it.
01:07:15.660 I would say yes, it is ethical to do things which are open, which is also why I've defended
01:07:21.780 Governor Newsom, California's Governor Newsom, for eating at the expensive French Laundry restaurant.
01:07:28.880 Because if it was open, he can eat there. Right? That's it. You know, if for whatever reason
01:07:38.720 it's open, I feel as if you should use it. And here's the ethical trade-off. Those people
01:07:44.760 who are selling those services, they need to live too. There's a reason they're open. And
01:07:50.120 the reason is that their government has decided that keeping them open and having customers is
01:07:56.200 better than closing them, all things considered. So I think it is ethical to travel during a
01:08:01.520 pandemic if you're traveling to places that are open and they're open for a reason. You
01:08:06.340 know, so I'd say yes. All right. I'm tripping right now, somebody says. What if you've already
01:08:15.160 had COVID? Well, yeah, then you're fine. We think.
01:08:20.000 If we knew our immunity levels, blah, blah, blah, blah, blah. "It was not open." So somebody's saying
01:08:29.120 that the French Laundry was not open. Here's what I have not heard. I have not heard anybody say that
01:08:34.560 the French Laundry was in trouble for opening illegally. So I don't believe the "wasn't legal
01:08:40.180 to be open" part. There was something going on there, but I don't think the French Laundry was
01:08:45.300 opening illegally and inviting the governor. I don't know. I don't think that happened. So there might
01:08:50.480 be some details we don't know about. But somebody says, how is a full airplane allowed in
01:08:58.640 the same city where all the stores are closed? I guess that's the difference between travel being
01:09:07.960 necessary and, you know, your, your trinket store, maybe not being as necessary.
01:09:14.920 What's my take on Neuralink? That's... you're talking about Elon Musk's invention or his startup
01:09:21.080 that will somehow put a chip in your brain. Or, I think they drill into your skull and
01:09:27.320 implant something that can communicate with your brain. And then you can directly control your
01:09:32.100 environment through your brain. Well, we're not there yet. So I think it's the implementation that
01:09:37.600 matters. I assume we'll get there where you will be part machine and part person. We're already
01:09:45.060 chemical cyborgs because most of us are enhanced by some kind of chemical at this point. So, you know,
01:09:52.400 a chemical enhances me. Other people might have a prescription drug that is their chemical enhancement.
01:09:58.640 But at this point, pretty much all of us are at least chemical cyborgs. You know, we don't operate
01:10:05.840 the way we were born and built based on our DNA. We're DNA plus a chemical that was designed.
01:10:13.560 Now, because that chemical is not a machine like a, like a microchip, we don't think you're a cyborg.
01:10:19.300 But if you put a chemical machine into you, which is basically, you know, a drug,
01:10:24.540 you're a cyborg. You're just a chemical kind. So we'll be real cyborgs whenever this Neuralink gets
01:10:31.360 going. And I don't think it's a question of should we be merging with machines or should we not?
01:10:37.160 We're going to merge with machines. So don't even worry about should we or should we not?
01:10:44.640 Because we're going to merge with machines. And those who do not will be at such a disadvantage.
01:10:50.500 They're going to want to merge with a machine pretty soon too. So we may have a problem where
01:10:55.560 there's a time where the rich or whatever, the well-off can merge with machines and have effectively
01:11:01.200 superpowers. Imagine, imagine, just imagine this. Imagine if you're in a conversation and you could
01:11:08.700 just think a Google search. And the Google search would come to your brain through this chip. I don't
01:11:16.120 know if any of that's possible, by the way. I don't know if that's contemplated or possible. I'm just
01:11:20.260 saying, imagine a world. It seems like you could do it someday. Maybe not soon. But just imagine doing
01:11:25.160 a Google search without talking, without using your fingers, while you're in the middle of a
01:11:30.480 conversation. Imagine looking at somebody and being able to tell their identity by looking at them.
01:11:40.660 Is that possible? Well, we already have facial recognition apps. You've heard of Clearview,
01:11:46.500 right? Law enforcement uses Clearview. They can take a picture of anybody and their identity will pop up
01:11:52.540 if that person has ever been on social media. So what if you had the chip in your head? Your eyes,
01:11:59.380 your eyes see a person. The chip takes what you see with your eyes, turns it into an algorithm,
01:12:06.960 checks faces, and reads back and says to you, you're talking to Bill, Bill Jones. He's an engineer at
01:12:15.260 Intel. Do you think that's possible? Well, I don't know that Neuralink would necessarily
01:12:21.380 have the technology to do that anytime soon. But my guess, if I just had to guess,
01:12:28.000 yup. Yeah, I'll bet it's possible. You've seen the experiments, and this was years ago,
01:12:35.260 so it's been a while, where they could actually, a computer can create a picture of what you're
01:12:41.240 thinking. You know that's a thing, right? They've already done that. You can have the computer draw a
01:12:48.400 picture based on sensors on your head of what you were imagining. Now, it's a grainy kind of
01:12:54.540 approximate picture, but it's definitely what you were thinking. That already exists. Now, imagine that
01:13:00.460 got better. Suppose they found out how to get closer to wherever your optical nerve meets your
01:13:07.320 brain. Is there a clearer picture somewhere in your brain? Because if the picture is clear,
01:13:13.060 you can use facial recognition through the chip in your head. You just look at somebody
01:13:18.620 and say in your mind, who is this? And your chip will Google it, and it'll say, Bob Jones. You'll
01:13:27.460 say, hey, Bob. Good to meet you. How's the kids? I think that's where we're going. Maybe in your
01:13:35.760 lifetime, not mine. We'll see. All right. That's all for now, and I will talk to you tomorrow.
01:13:40.280 You know, it's so hard for me to turn off these live streams. I'm talking to you, YouTube. I've
01:13:48.880 already turned off Periscope. The reason it's so hard to end these is that I'm getting probably more
01:13:55.900 out of it than you are. There's something about this that I've never quite understood,
01:14:00.180 and I think it has to do with the live comments. Because if I were doing this just by myself,
01:14:06.620 just recording it so you could see it later, I wouldn't be that interested. It's only this live
01:14:12.980 interaction that makes this a two-way thing. And I should tell you that even if I'm not reading
01:14:19.900 every comment, because sometimes I'm in my own head when I'm presenting here, I see them.
01:14:26.540 And I see the life, and I see the... You can detect emotion and whether... You can tell if what you're
01:14:33.840 doing is going over well or if you need to move on. And it's really addicting. Really addicting.
01:14:42.320 Now, I don't know if you've noticed, but I use this medium differently than other people.
01:14:47.800 And I do a lot of A-B testing, so sometimes you get lucky and you try something and it works.
01:14:52.740 And the one thing that I tried that I think makes all the difference with this medium is that I talk to
01:14:59.120 you like I'm talking to one person. Now, I mentioned the audience, so you know I'm talking to everybody.
01:15:04.820 But in terms of style, it's like I'm talking to one person. Now, I do that intentionally.
01:15:11.960 And that's something you can do because of this two-way nature. The moment you put me on, say,
01:15:18.680 a satellite and it's a recorded thing and I'm just talking to the audience, I go into presenter mode.
01:15:23.920 Do you know what presenter mode looks like? It'd be like, and the next item here is item number one.
01:15:30.120 I just wanted to mention, you know, you go into presenter mode. But I found that with this medium,
01:15:35.940 it's uniquely resistant to that. Like, I feel like I'm talking to you like three people sitting around
01:15:43.600 the dinner table. It doesn't feel like any kind of presentation. And that's also dangerous because
01:15:49.360 I've said things in this context that I really probably shouldn't have said in public. But it's
01:15:55.600 that I feel like I'm talking one-on-one that, you know, I let my guard down. All right. Somebody said,
01:16:02.540 oh, let me see this comment. Bella Bella says, "Move away from Trump, Scott. He's finished.
01:16:09.380 Just be interesting and be interactive." When you tell me to not do something, you know I just want to do
01:16:16.400 more of it, right? So when you tell me to move away from Trump because it's bad for me,
01:16:23.300 you know I'm just going to do more of it. All right. That's all for now. I'll talk to you tomorrow.