Real Coffee with Scott Adams - January 28, 2020


Episode 801 Scott Adams: David Mittelman on DNA Opportunities, Sour Don Lemon, Impeachment, China


Episode Stats

Length: 56 minutes
Words per Minute: 154.17563
Word Count: 8,675
Sentence Count: 388
Misogynist Sentences: 6
Hate Speech Sentences: 6
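The words-per-minute figure above is just the word count divided by the episode length; a quick sanity check in Python (using the numbers as reported) shows that the listed 56 minutes is a rounded duration:

```python
# Sanity check on the stats above: WPM = word count / duration in minutes.
word_count = 8675
wpm = 154.17563  # words per minute, as reported

# The implied exact duration; the "56 minutes" above is this value rounded.
duration_min = word_count / wpm
print(round(duration_min, 2))  # about 56.27 minutes
```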


Summary

David Mittelman is the founder and CEO of Othram, a biometrics company that uses DNA sequencing and genomics to advance human identification, especially in the crime-solving domain. In this episode, David talks about new DNA-solved cases, including the Golden State Killer and a San Francisco serial killer.


Transcript

00:00:00.000 Hey everybody, sorry about that aborted start, I didn't have the options on that I needed.
00:00:17.400 So now I do.
00:00:19.380 And today is going to be a very special day.
00:00:21.840 We're going to be talking about impeachment and China and coronavirus and all that.
00:00:27.360 But before we do that, I'm going to bring on a special guest to talk about some updates in the world of DNA.
00:00:33.820 Some stuff that you will be interested in because it will matter to you either now or later.
00:00:40.060 But first, if you'd like to participate in the simultaneous sip, it doesn't take much.
00:00:44.740 It takes a cup or a mug or a glass, a tank or a chalice or a stein, a canteen jug or a flask, a vessel of any kind, fill it with your favorite liquid.
00:00:52.060 I like coffee.
00:00:52.760 And join me now for the unparalleled pleasure, the dopamine hit of the day, the thing that makes everything better.
00:01:01.680 It's called the simultaneous sip.
00:01:03.820 Go.
00:01:10.520 Now let me check to see if my guest has found us.
00:01:16.700 Yes.
00:01:16.860 I'm going to bring on my guest and introduce him.
00:01:21.720 David, can you hear me?
00:01:25.500 My guest today is David Mittelman.
00:01:27.420 You've seen him before.
00:01:28.240 He's the founder and CEO of Othram, a biometrics company that uses DNA sequencing and genomics to advance human identification.
00:01:38.140 And especially in the crime-solving domain, which is one of the interesting parts.
00:01:43.160 David, how are you this morning?
00:01:44.960 I'm doing good.
00:01:45.600 How are you?
00:01:45.980 I'm great.
00:01:47.300 Thanks for joining us to catch us up on some stuff.
00:01:51.220 One of the interesting things I wanted to ask you about is I believe Othram has started a database where you can voluntarily have your DNA stored there for crime-solving purposes.
00:02:05.120 Am I saying that right?
00:02:06.860 Yes.
00:02:07.320 Yes.
00:02:08.200 So what's the name of that and how do people participate if they want to?
00:02:11.720 So we started a database.
00:02:14.100 It's called dnasolves.com.
00:02:16.780 And so you've probably heard, folks have heard from watching your show as well as from reading about the news about the Golden State Killer and a number of other folks that have been caught, victims identified, all because there's genealogical data that's been out on public databases to help triangulate and identify who unknown folks are.
00:02:39.800 And so what we've done with DNASolves is we've built a database that has nothing to do with medical research or genealogy research.
00:02:50.920 This is a database that is broadly available for folks that have been tested and want to contribute their DNA towards identifying the victim or solving a crime.
00:02:59.760 How would they do that?
00:03:00.760 How would they do that?
00:03:01.820 So they've been tested for some other purpose and it's in some other database?
00:03:06.240 How did they get it into your database?
00:03:08.340 Yeah.
00:03:08.760 So folks have tested with recreational testing companies such as ancestry.com or 23andMe.
00:03:15.640 When you test to learn something exciting about yourself, you can then take that profile, that information, and download it from your site.
00:03:27.360 And then you can upload it.
00:03:29.440 Some people upload it to genealogical databases.
00:03:32.100 Some will upload it through tools that predict medical traits.
00:03:34.560 We've built a database that allows you to upload it so that in the event that you're genetically distantly related to someone that might be a perpetrator or a victim in a crime, we can use your data point as kind of like a stepping stone to identify who that person is.
00:03:50.800 So if I've got – let's say I'm afraid that there's somebody in my family, could be a distant cousin, but somebody in my family who might be a serial killer and I don't want to turn him in, I could just register my DNA and let the system do the rest, couldn't I?
00:04:09.040 Yes, and if you know that your sibling is a serial killer and you don't want them to get caught, then you would specifically not upload your data to the database.
00:04:18.380 Now, my data is in 23andMe, and I have not authorized anybody, at least not that I'm aware of, unless I did it accidentally.
00:04:26.980 I'm not aware of telling them to share it or download it.
00:04:30.560 How likely is it that my DNA would be available to law enforcement right now with no change?
00:04:37.300 Could they get at it?
00:04:39.840 There's been discussion about whether or not there's an ability to essentially, like they used to do in the 90s with the ISPs,
00:04:47.700 they've kind of used the court system to request data from these companies.
00:04:53.740 I think 23andMe has a very clear position.
00:04:56.380 They don't work with law enforcement.
00:04:58.040 I think they've done a stellar job of protecting data.
00:05:00.520 And in fact, if you've been a 23andMe customer for a long period of time, you'll notice they used to have a third-party API where you could authorize people to look at your data.
00:05:09.840 They killed it.
00:05:10.460 So I would say 23andMe is a very secure place to keep your information.
00:05:14.780 The only way your data leaves 23andMe is if you go in there and specifically click a button that says, I would like my raw data.
00:05:22.080 And again, some people will do that for a variety of reasons, research, genealogy, whatever.
00:05:26.100 But 23andMe will not move your data on your behalf, even if you ask them to.
00:05:30.120 You have to actively pull it yourself.
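For context on what that "raw data" download actually is: these services typically export a plain tab-separated text file of genotyped markers. A minimal reader might look like the sketch below; the column layout (rsid, chromosome, position, genotype, with "#" comment lines) follows the common 23andMe-style export and should be treated as an assumption, not a spec.

```python
def read_raw_genotypes(path: str) -> dict:
    """Parse a 23andMe-style raw data export.

    Assumed layout: '#'-prefixed comment lines, then tab-separated rows of
    rsid, chromosome, position, genotype.
    """
    genotypes = {}
    with open(path) as fh:
        for line in fh:
            if line.startswith("#") or not line.strip():
                continue  # skip header comments and blank lines
            rsid, chrom, pos, genotype = line.rstrip("\n").split("\t")
            genotypes[rsid] = (chrom, int(pos), genotype)
    return genotypes
```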
00:05:32.720 So what percentage of all Americans, let's say adults, what percentage of them have at this point at least one relative who's close enough that would help identify them,
00:05:45.100 who's in one of these databases that law enforcement can get at?
00:05:48.740 Are we closer to 10% or closer to 90%, meaning that you've got at least a cousin or something who's in the database?
00:05:59.280 So GEDmatch, which was a public database, before they moved to the opt-in model, I think there was about a million profiles in there.
00:06:07.640 And I think with that size of a database, there was pretty good odds that closer to 90% of people would have some connectivity to someone that's tested.
00:06:19.440 It may not be a first or second cousin, but at least a third cousin or closer.
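The "closer to 90%" intuition can be checked with back-of-envelope math. None of the numbers below come from the episode: they assume roughly 250 million US adults, a database of 1 million profiles drawn uniformly from that population (real databases are not uniform), and a commonly cited rough estimate of several hundred third cousins per person.

```python
def p_at_least_one_match(db_size: int, population: int, n_relatives: int) -> float:
    """Probability that at least one of n_relatives is in a database of
    db_size profiles drawn uniformly at random from the population."""
    p_in_db = db_size / population
    return 1.0 - (1.0 - p_in_db) ** n_relatives

# Assumed figures (not from the episode): ~1M profiles, ~250M adults,
# ~800 third cousins for a typical person.
prob = p_at_least_one_match(1_000_000, 250_000_000, 800)
print(round(prob, 2))  # about 0.96 -- consistent with "closer to 90%"
```

The uniform-sampling assumption overstates coverage for under-represented groups, which is the point made below about Americans being "enriched" in these databases.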
00:06:24.020 And how close does a relative have to be before it becomes, let's say, useful for law enforcement?
00:06:33.140 I know every situation is going to be different, but you said a third cousin. Can you go even further?
00:06:39.500 You can. You can. And it obviously depends on luck and the size of your family and even to some extent, like where in the world you're from.
00:06:51.660 Obviously, Americans are enriched for these databases.
00:06:55.120 But, yeah, the answer to your question is that, you know, if you have a first, second, or third cousin, and a third cousin is pretty far out,
00:07:03.400 it's very easy to then use that to figure out the identity of an unknown DNA sample.
00:07:09.960 If you go further out, you can solve it.
00:07:12.200 But as you move past fourth and fifth cousin, it becomes, you know, more difficult.
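The "how close is useful" question is usually framed in terms of shared autosomal DNA, measured in centiMorgans (cM). The sketch below is purely illustrative: the thresholds are rough approximations of published averages, not from the episode, and real genealogy tools report overlapping ranges with probabilities rather than hard cutoffs.

```python
# Rough average shared-cM thresholds, ordered from closest relationship out.
# These are illustrative approximations only, not authoritative values.
ROUGH_THRESHOLDS = [
    (2500, "parent/child or full sibling"),
    (1300, "grandparent, aunt/uncle, or half-sibling"),
    (575, "first cousin"),
    (150, "second cousin"),
    (50, "third cousin"),
    (15, "fourth cousin"),
]

def likely_relationship(shared_cm: float) -> str:
    """Return a coarse relationship guess for a shared-cM value."""
    for threshold, label in ROUGH_THRESHOLDS:
        if shared_cm >= threshold:
            return label
    return "fifth cousin or more distant -- hard to use for identification"
```

On this sketch, a match sharing around 75 cM lands in the third-cousin bucket, which per the conversation above is still workable for identification; below roughly 15 cM the genealogy becomes much harder.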
00:07:15.940 Now, I made a provocative claim without really knowing what I'm talking about the other day.
00:07:23.220 And I said that this technology is very close to making all serial crimes, at least the violent ones,
00:07:30.600 to make all serial crimes solvable by the end of the year.
00:07:35.920 In other words, if somebody is doing a serial sex crime or serial killer,
00:07:42.760 they're always going to leave DNA, wouldn't you say?
00:07:44.880 Would you say it's close to 100% of the time?
00:07:48.420 I mean, it's a matter of time, yeah.
00:07:50.440 And the way technology has moved, you know, Othram, unlike, you know,
00:07:55.660 so Othram's a laboratory, so we actually work on not just the integration of data,
00:08:00.320 but the collection of DNA.
00:08:01.820 And we've been able to collect, you know, decent amounts of DNA from touch DNA.
00:08:06.420 So it's not just that you're leaving DNA.
00:08:08.440 It's that the limit of what you can leave and still generate useful information for
00:08:12.860 has continuously dropped as well.
00:08:14.520 So I would say, I'd say you're right.
00:08:16.720 Like, if you're a serial, you know, rapist, you're probably going to leave DNA at some
00:08:22.140 point somewhere.
00:08:23.560 How many people do you think have already figured out that they're adopted or at
00:08:29.840 least not related to their father without doing much work?
00:08:34.140 In other words, are the databases such that if you've done one of these tests and you download
00:08:41.140 your own data and upload it somewhere else, is there any way that you can just find out
00:08:46.040 if you're really related to your parents?
00:08:47.680 There is.
00:08:49.960 And since that's a non-law enforcement application, you can do that at 23andMe or Ancestry.
00:08:56.140 They have giant databases.
00:08:58.380 You know, 23andMe's database is over 10 million people.
00:09:01.120 I think Ancestry just announced that they surpassed 16 million people.
00:09:04.360 So with that size of a database, it would be very straightforward to look for what they
00:09:10.520 call NPEs, which are non-paternity events.
00:09:13.200 So situations where your father is not your father.
00:09:15.480 And there's stories all the time.
00:09:16.780 People discover relatives they didn't know they had.
00:09:19.440 On the other side, and kind of the less cheery side, people discover that relationships
00:09:24.440 they thought were real are not real.
00:09:25.640 So, and it's a personal decision if you go on that journey.
00:09:29.300 But there's a story, it seems like, all the time.
00:09:31.720 And that's actually how things were done prior to this application to law enforcement.
00:09:37.520 I mean, it's basically taking the adoption agencies and these kind of family mysteries
00:09:42.860 and applying it towards folks who are not your family, right, that could have been involved
00:09:47.440 in a crime.
00:09:49.180 And what about, I know you're not specifically working on the medical stuff, but you probably
00:09:54.520 know about it.
00:09:55.080 I keep hearing about, there are some people who might have a natural immunity to, let's
00:10:00.480 say, cancer, or certain types of cancer.
00:10:03.820 Are you going to be able to identify people who have DNA that just won't get certain kinds
00:10:09.440 of problems?
00:10:11.120 And therefore, would we be able to commercialize that and say, if these people can't get this
00:10:18.240 disease, we can figure out what it is and take some proteins or whatever the hell you
00:10:22.680 do?
00:10:23.280 Is that a thing?
00:10:24.160 Is that a real thing?
00:10:26.240 It is a thing.
00:10:27.480 So I will tell you, you know, in my opinion, it's a really good idea to separate medicine
00:10:34.060 and medical traits from human identification.
00:10:37.640 And so 23andMe, for example, doesn't participate in human identification, but they do medical
00:10:42.600 research.
00:10:43.080 And what you're describing is kind of the direction 23andMe is headed, right?
00:10:47.980 Are there DNA components that would help predict if someone responds positively to a drug or
00:10:53.820 not?
00:10:54.040 Are there new drugs that can be developed based on DNA markers that would make them more effective
00:11:00.520 for some folks that otherwise aren't benefiting from a drug?
00:11:06.160 But at Offerum, we're the inverse company.
00:11:08.740 So what you're saying, I think, is very likely and true.
00:11:12.020 But at Offerum, it's not about a science issue.
00:11:14.920 It's more of a policy issue.
00:11:16.000 We simply do not use any information for medical work.
00:11:20.440 And if you look at the dnasolves.com website, you'll see that we're very clear that the only
00:11:26.020 application, and this is consistent with the DOJ policy, the only application is human
00:11:30.760 identification, because we don't want to create any anxiety or make people think that we're
00:11:37.120 mixing multiple topics together, or trick them into participating.
00:11:41.000 People should come to the site for no reason other than to solve a crime, not because they're
00:11:45.420 trying to, like, cure cancer or find a new drug.
00:11:49.360 All right.
00:11:49.700 I can tell my audience is itching for me to talk about the impeachment.
00:11:54.360 They're coming here primed.
00:11:55.820 So just tell us again how they can find this database to voluntarily upload their data if
00:12:03.040 they want to help with law enforcement.
00:12:05.320 Yeah.
00:12:05.540 So the website is dnasolves.com, and anyone can participate.
00:12:11.000 And, you know, obviously, you know, whoever it is, we appreciate it.
00:12:16.140 And you'll probably be seeing in the news, you know, over the next year or two, just dozens
00:12:20.400 and dozens, not just perpetrators found that commit crimes, but I think a lot of victims
00:12:24.180 that otherwise would have remained anonymous being identified and being reanchored into society.
00:12:29.300 So I think that's a great thing as well.
00:12:31.780 All right.
00:12:32.000 Great.
00:12:32.340 And, David, thanks for coming back.
00:12:34.180 We're going to have you on in the future as we have in the past.
00:12:36.920 I know there's going to be lots more DNA news, so it's great having somebody who's immersed
00:12:42.440 in it to sort us out.
00:12:44.420 So thanks so much, David.
00:12:45.860 Thanks for having me.
00:12:47.340 All right.
00:12:47.880 Take care.
00:12:50.320 All right.
00:12:50.800 Let's talk about the news, some other stuff that's happening.
00:12:54.480 Here's an interesting idea.
00:12:55.980 Bernie Sanders may have the opportunity to completely win the nomination by using a poison pill.
00:13:06.540 Now, a poison pill is a term from mergers and acquisitions of companies.
00:13:13.600 And what it means is if you don't want your company to get purchased, you can pass some
00:13:18.280 internal rules that are called a poison pill, meaning that if somebody tries to buy your company,
00:13:24.020 they will wish they hadn't because the purchase will trigger something.
00:13:28.680 For example, if you didn't want your company to be bought by a larger company, but you were
00:13:34.260 a public company, so there's nothing you could do to stop it because they could just buy your
00:13:38.080 stock and own you, you could say, I'll pass a rule that says all the employees get a 500%
00:13:45.240 bonus if we get purchased, which would make you unpurchaseable because the moment you are
00:13:51.840 purchased, all the money in the company would be given to the employees and the purchaser would be
00:13:57.140 out of luck.
00:13:58.200 So that's called a poison pill.
00:13:59.580 There's something you do that makes you unbuyable.
00:14:03.340 Bernie Sanders could do a version of this, just by analogy here, he could do a version of this if
00:14:11.240 his supporters decided to claim that if Bernie doesn't get the nomination, they'll vote for Trump.
00:14:18.760 In other words, if you can get enough Bernie Sanders people to say either signing up or
00:14:25.680 they sign a petition or just say it on social media, if they say, look, if Bernie gets screwed
00:14:32.280 out of the nomination again, two years in a row or two cycles in a row, that they're just
00:14:39.860 going to protest to vote for Trump.
00:14:41.880 Now, there are a lot of people who probably believe that, so it wouldn't be much of a stretch.
00:14:47.880 Now, think about this.
00:14:49.180 It would make it impossible for any of the other Democrats to get elected in the general
00:14:55.240 election.
00:14:56.500 So Biden's argument that he's the only one who could beat Trump just disappears.
00:15:01.620 Because the moment anybody but Bernie gets the nomination, the protest vote kicks in.
00:15:08.120 It's a poison pill, and it makes it impossible for the actual nominee to get elected.
00:15:12.780 Now, you could say to yourself, well, every politician has that option, right?
00:15:17.560 Anybody could do that.
00:15:18.980 So why are you saying this about Bernie?
00:15:21.440 No, not everybody could do that.
00:15:23.760 There is something special about Bernie's supporters.
00:15:27.620 Bernie's supporters are just sort of, I would say, a mirror version of Trump supporters in
00:15:36.000 this one special way.
00:15:37.780 I mean, they favor different policies, obviously.
00:15:40.580 But in one way, they're the same.
00:15:43.900 Trump supporters wanted somebody to burn down the system.
00:15:49.140 Bernie supporters want somebody to burn down the system.
00:15:52.160 So if they don't get somebody to burn down the system, they're not going to be terribly
00:15:56.460 concerned about who it is, because it's just going to be more of what we already had.
00:16:01.080 So I think Bernie supporters could actually form a poison pill and make it impossible for
00:16:05.880 anybody else to win the nomination and have a fair chance at winning in the general.
00:16:11.920 But I don't know that that will happen.
00:16:13.760 But the play is there.
00:16:15.560 Let's talk about the impeachment.
00:16:16.760 I've been watching, of course, the president's attorneys argue against impeachment, and I've
00:16:24.820 got a few observations in no particular order.
00:16:28.940 Pam Bondi made a really good argument that Burisma is sketchy and that the Biden's association
00:16:36.700 with it just needs to be looked into.
00:16:39.400 That is key, because if you accept that Burisma was worth looking into, then that's the end
00:16:48.160 of the story, because that's what the president asked for.
00:16:51.380 Obviously, it would have a national interest.
00:16:54.040 And there you are.
00:16:55.420 There's nothing else to be said as long as Burisma was worth looking into.
00:17:00.560 Pam Bondi made that argument.
00:17:02.600 Now, there's one part of her case which I don't know why all conservatives keep saying
00:17:08.240 this, because it's just inaccurate as far as I can tell, and that is that there's something
00:17:16.160 important about the fact that Biden asked for the prosecutor in Ukraine to be fired when
00:17:22.780 that prosecutor either had an open case or some kind of paperwork involved about looking
00:17:31.400 into Burisma.
00:17:33.380 Now, the Trump supporters and lawyers are saying, well, that just shows that Joe Biden pressured
00:17:42.900 Ukraine to do something.
00:17:44.760 But in fact, it's a ridiculous argument, because Biden was doing what was national policy.
00:17:51.000 It was national policy that that guy was rotten, that he wasn't really going after Burisma.
00:17:56.180 Lots of other entities wanted him gone.
00:18:00.260 So I think including that part of the argument in why the Burisma Biden thing is all sketchy,
00:18:08.940 I think that just really weakened the argument, because it's a part of the argument that is
00:18:14.960 just demonstrably false.
00:18:16.860 So if you make an argument with a fact that is just so easily debunked, that's just not
00:18:25.180 a strong play.
00:18:26.460 So I thought Pam Bondi made a good argument that Burisma is dirty and worth looking into,
00:18:31.680 but that one fact about the prosecutor, that's just fake news.
00:18:37.020 So you throw that in there, it weakens your case.
00:18:38.960 Let's talk about Alan Dershowitz.
00:18:45.200 Now, I just caught up with his presentation.
00:18:48.420 I watched it on delay.
00:18:51.300 And as I imagined, it was a total kill shot.
00:18:57.160 If you think that Alan Dershowitz made a good presentation, it doesn't matter what anybody
00:19:03.380 else says, it doesn't, because he made a very convincing argument, very convincing, that
00:19:12.240 the articles of impeachment are not impeachable offenses.
00:19:16.620 And the argument goes to what is high crimes and misdemeanors.
00:19:22.360 And Dershowitz walked through the entire history of it, what the various founders who were important
00:19:27.000 to it, what they said, what people misinterpret about it because they've conflated things.
00:19:32.160 And he clears that up.
00:19:34.000 So he gives you a real clean history of how we got here.
00:19:37.940 And I would say it was 100% persuasive.
00:19:41.820 You know, obviously, people will just stick to their sides because they want to.
00:19:45.840 But if you were even a little bit open-minded and you heard his presentation, you would say
00:19:50.540 to yourself, oh, yeah, they really did mean something like a crime or something that's
00:19:56.920 so like a crime, even if it's technically not a crime, it's still a crime.
00:20:03.280 And the example he used was, what if the president did something terrible, such as taking a bribe
00:20:08.380 or whatever, but it wasn't on American soil?
00:20:11.120 Well, and it was in some country where it wasn't illegal or it couldn't be prosecuted.
00:20:16.360 Is that impeachable?
00:20:17.800 Well, yes, because that crime is exactly a crime in our country.
00:20:21.360 And even though there's a technical reason why he couldn't be prosecuted, it's no doubt
00:20:26.760 about it.
00:20:27.180 It's a crime.
00:20:27.720 That is, that would be crime-like.
00:20:31.140 So that would be impeachable.
00:20:32.620 But as Dershowitz points out, there is no law against abuse of power, nor, and this is
00:20:39.280 the important part, nor could there be.
00:20:43.060 Not only is there not a law against abuse of power, but the framers were very clear, and
00:20:49.520 Dershowitz walks you through their thinking, about how you can't have a vague standard.
00:20:54.360 Because if you have a vague standard, like what is abuse of power?
00:20:58.660 You can't even really tell when it's happened.
00:21:01.000 As long as you can have that kind of vague standard, then it makes the presidency a puppet
00:21:07.760 of the House and the Senate, I guess.
00:21:12.660 So Congress would actually own the presidency if they could get rid of a president just by
00:21:18.100 claiming something they did was an abuse of power, because apparently every president
00:21:22.500 abuses their power, according to someone.
00:21:26.060 So you can never have a constitutional rule about removing a president that is so subject
00:21:32.540 to interpretation, you can't even tell if the standard has been met.
00:21:36.460 That's all you need to know.
00:21:37.980 In fact, there's nothing more predictive about all this legal stuff than if you see that there's
00:21:43.560 something where legal experts can't even agree if any kind of a crime has happened.
00:21:48.500 Anytime you're in that situation, you're going to go free.
00:21:53.600 It doesn't matter if you're president and somebody's trying to impeach you.
00:21:57.040 It doesn't matter if you get picked up on the street for some alleged crime.
00:22:00.860 If lawyers look at the activity and they're not arguing about the facts, they're looking
00:22:05.920 at the same facts, and some of them say, yeah, this is a crime, but others, just as experienced
00:22:11.660 and unbiased, say, I don't know, I don't see it, you're almost certainly going to go free.
00:22:17.480 So whenever you see that much ambiguity about whether it even is a crime, that's good for
00:22:23.160 the defendant.
00:22:24.440 So I thought Dershowitz put the hammer down.
00:22:27.120 Now, in our world in which people aren't really going to change their minds, pretty much everybody
00:22:34.180 knows how they want to vote, but you've got a handful of senators who are in these swing
00:22:39.680 districts, and they need some backing.
00:22:43.760 In other words, the Republicans probably would prefer to vote with the other Republicans because
00:22:50.160 it's less trouble, but they also want to win re-election.
00:22:53.260 So they need a reason to vote for the Republicans that's clean, one they can explain to their base
00:22:59.640 and say, look, here's my reason, let's call it a fake because.
00:23:05.780 Sometimes people have already decided, but they need to have a reason to give to other
00:23:10.540 people.
00:23:11.480 This sounds good.
00:23:13.140 And the reasons that don't sound good are all the things that the other lawyers were
00:23:17.980 talking about.
00:23:19.180 Was it quid pro quo?
00:23:20.760 Was it not?
00:23:21.960 Did anybody know about it?
00:23:23.340 Was the money withheld?
00:23:25.120 All of that stuff, all of that stuff will get you not re-elected.
00:23:30.380 If you're arguing in the weeds, there are too many weeds.
00:23:33.700 You can't argue them all.
00:23:35.500 The other side has weeds too.
00:23:37.340 You just can't win if you're at that lower level.
00:23:40.380 Dershowitz just provided the senators that are in that danger zone.
00:23:45.600 If they vote one way, they might lose their job, but if they vote the other way, they might
00:23:49.160 lose their job.
00:23:49.780 Those very few senators just got a simple, clean "fake because."
00:23:57.880 So should they decide to vote with the other Republicans, here's what they say.
00:24:04.240 Alan Dershowitz's presentation on whether or not these were constitutional charges was
00:24:10.440 so strong that we could ignore the specifics because it doesn't pass the first test of being
00:24:17.460 something worthy that rises to the level of an impeachable offense no matter what the
00:24:23.100 facts are.
00:24:24.040 So we don't need to have an opinion about whether it was a good idea for the president
00:24:29.020 to do this.
00:24:30.220 We can still say it wasn't.
00:24:32.640 We don't have to have an opinion about whether it was quid pro quo or not.
00:24:36.700 We don't have to have an opinion about any of that because you can stop with Dershowitz's
00:24:41.760 opinion that it's not impeachable.
00:24:44.640 Tell your base, you know, I've got two responsibilities.
00:24:48.020 One is, of course, to the people, but another is to the Constitution.
00:24:52.800 And I'm not going to degrade the Constitution by turning something that's kind of specific
00:24:58.080 into something that's kind of vague by precedent because then we'll never have another president
00:25:03.300 who completes a term if the Congress is the other party.
00:25:08.500 So I think Dershowitz did what he needed to do, which is he gave the people on the fence
00:25:13.160 a clean argument.
00:25:16.380 Just look at Dershowitz.
00:25:18.660 What he said is what I said.
00:25:20.520 And now you can vote for the president.
00:25:23.620 Rush Limbaugh, again, was talking about my tweet about Bolton.
00:25:28.080 The so-called Bolton bombshell.
00:25:32.520 Somebody unironically called it a bolt of lightning.
00:25:36.840 They should have called it a Bolton of lightning, but they missed that opportunity.
00:25:42.960 And what you see when, and the reason I'm mentioning that Rush Limbaugh was talking about
00:25:50.120 my tweet, it's the second time this week on the same topic of impeachment, a different
00:25:54.640 tweet this time.
00:25:55.280 And what I tweeted was that the Bolton manuscript proves that the president should not be impeached
00:26:03.920 because Bolton's story is that the president was worried about Ukraine and corruption.
00:26:14.020 So once you've established that the president genuinely cared about corruption and Ukraine
00:26:19.780 and other countries not paying their share and other things, that's all you need to demonstrate.
00:26:24.720 As long as there's a national interest, it doesn't matter if it's also good for the president,
00:26:29.560 as Dershowitz also explained.
00:26:31.080 But here's my point.
00:26:35.420 And every time I see an example of this, I'm going to call it out until you see the pattern.
00:26:39.980 We are no longer a constitutional republic the way we always had been through history.
00:26:45.860 Because of the Internet, there are voices, in this case it was mine, where I simply had an idea
00:26:53.720 that was worthy of being shared.
00:26:56.660 So because my idea was good, a lot of people saw it, and then apparently they were tweeting at
00:27:02.360 or sending him to Rush Limbaugh and saying, you should read this on the air, and then he did.
00:27:07.860 So Rush Limbaugh has a far bigger audience than I do.
00:27:11.240 Mine's pretty big.
00:27:12.580 His is far bigger.
00:27:13.940 And then basically everybody's seeing it at that point.
00:27:16.800 And I think that we've reached something like an idea meritocracy, meaning that if you have
00:27:24.200 a good idea, it's going to get to the right place.
00:27:28.160 Because we've developed somewhat accidentally, and I think Jack Dorsey gets the win on this
00:27:35.540 one, for building Twitter.
00:27:37.840 Twitter allows a good idea to find supporters and then grow from that small good idea into
00:27:45.900 something that actually forces the politicians to move in that direction, because the public's
00:27:51.140 already there.
00:27:53.340 So keep watching for that.
00:27:55.180 We have an idea meritocracy instead of a constitutional republic, and we just sort of drifted into it.
00:28:05.300 I was looking at the CNN pundits who were trying to find something wrong with the president's
00:28:11.900 legal case.
00:28:14.820 And so here was a funny one from Jen Psaki, writing for CNN.
00:28:20.220 And she said that President Trump's defense team failed at their most important job.
00:28:25.740 And I thought, uh-oh, his defense failed at their most important job?
00:28:31.260 You mean the president's going to be impeached?
00:28:33.640 You mean they failed to keep him from being impeached?
00:28:35.660 No, no, that's not what she meant.
00:28:38.360 She says they failed at their most important job, which was making a clear and compelling
00:28:43.020 argument that there was no need to hear from Bolton.
00:28:47.880 Well, did she hear anything I said about Dershowitz?
00:28:51.320 I would say that they absolutely hammered the thing they needed to do.
00:29:00.880 The thing they needed to do was to give the senators on the fence a clean, easy way to
00:29:07.220 vote for the president instead of against him, and they did that.
00:29:11.340 Is it going to matter that Bolton testifies?
00:29:14.920 Nope.
00:29:15.360 Not if you accept the Dershowitz argument, and it's so strong that you should.
00:29:22.240 If you accept the Dershowitz argument, and he said directly more than once, he said it
00:29:27.640 at the beginning and he said it at the end, no matter whether the new Bolton information
00:29:33.760 is correct or not correct, it has no bearing on the decision because none of it's impeachable.
00:29:42.220 The true version or the fake version, they're both not impeachable.
00:29:46.660 It doesn't matter.
00:29:47.960 So Jen Psaki saying that they missed their most important job, and that's just not true.
00:29:54.800 They made her entire question irrelevant.
00:29:57.620 That's as good as you can get.
00:30:01.060 All right.
00:30:03.580 Here's a question for you, just sort of a general way to predict what's going to happen.
00:30:07.900 And I've seen a few people ask this question on social media.
00:30:10.960 How many people would switch from Trump, let's say they voted or supported Trump in the past,
00:30:17.980 and vote for a Democrat, versus how many Democrats are likely to, for the first time, switch
00:30:25.540 and support Trump?
00:30:28.140 It reminds me a little bit of those old Apple computer and IBM commercials.
00:30:33.060 And somewhere along the line, when IBM was still making personal computers and they were
00:30:37.740 the main competition for Apple, somewhere along the line, someone at Apple cleverly realized
00:30:44.100 that when people move from the IBM PC over to Apple, they almost never move back.
00:30:51.860 But very few people will move from Apple to IBM.
00:30:56.560 So it's basically, it was always a one-way direction.
00:30:58.800 And sure enough, that predicted where we are today.
00:31:02.760 Where I live in Silicon Valley, you don't even see a Windows computer.
00:31:07.600 I mean, it's the rarest thing.
00:31:09.460 If you see somebody with a laptop, at least within the Silicon Valley, San Francisco Bay Area,
00:31:16.520 and they happen to be in the technical world, it's an Apple.
00:31:21.580 Well, pretty much every time.
00:31:24.740 So Apple won, because they did have that quality where when people changed their minds, they
00:31:29.760 only changed it in one direction.
00:31:31.800 I think that's starting to develop with Trump versus at least the generic Democrats.
00:31:39.740 Anecdotally, I'm hearing, and this is just anecdotal.
00:31:42.980 Anecdotally, I'm hearing people who are going to vote for Trump for the first time.
00:31:47.520 But I'm not hearing people saying, Trump disappointed me, I'm going to go vote Democrat.
00:31:55.500 Now, I'm not saying it won't happen.
00:31:57.440 I'm just saying that it's starting to shape up like it's a one-way path.
00:32:03.400 So watch for that.
00:32:06.560 All right.
00:32:07.620 Let's talk about closing the borders.
00:32:11.400 So apparently Tibet is going to close their border with China,
00:32:15.480 and Hong Kong is talking about closing traffic coming in and out of Hong Kong.
00:32:22.280 And I think even since yesterday, the number of people infected has maybe doubled,
00:32:28.560 and the estimates are climbing every day.
00:32:32.500 And so here's the question,
00:32:35.460 which I guess I'll just ask every day.
00:32:37.120 Why does our government not tell us why they've decided to not close all of our traffic coming in from China?
00:32:46.680 Now, I know they've expanded the checkpoints,
00:32:49.860 so there are more airports being checked from more destinations from China.
00:32:53.980 That's great.
00:32:55.200 I'm glad that they're checking people coming in.
00:32:57.540 But we do know the checks don't work, meaning that unless somebody's already feverish,
00:33:04.700 and I think that's the main symptom, maybe cough, I'm not sure.
00:33:08.080 But if they don't have symptoms yet, you can't tell.
00:33:11.260 And apparently there might be tens of thousands of people with no symptoms.
00:33:15.480 So we know our government has implemented a set of processes that can't work completely.
00:33:24.060 They can slow it down.
00:33:25.520 They can get the obvious cases.
00:33:27.120 But we know it can't stop everybody.
00:33:30.080 So how much do you let in?
00:33:32.000 Well, I saw some really bad arguments.
00:33:35.180 Well, first of all, let me say this.
00:33:37.000 Look for the dog that isn't barking.
00:33:38.840 Can you tell me who is the face in our government who's in charge of deciding if the airports are open or closed?
00:33:50.400 Right?
00:33:51.660 Do you remember seeing anybody who was identified as being in charge of that decision?
00:33:57.580 Do the airports stay open or closed?
00:33:59.500 Do we let, you know, what do we do with China traffic?
00:34:03.300 Who is that?
00:34:04.680 Because, you know, on some level, of course, it's always the president.
00:34:07.540 But isn't there a cabinet-level person, somebody?
00:34:13.540 Is there not a person we should see on the news every single day explaining to us why what we're doing is better than closing the airports?
00:34:25.360 Where's that person?
00:34:26.620 The fact that you haven't seen that person tells me the government is failing you.
00:34:32.780 All right?
00:34:34.100 If the government could give you a person, and say: this is the person responsible,
00:34:39.460 here's why we have not yet closed the airports.
00:34:42.140 We might.
00:34:43.120 But this is why we have not yet.
00:34:45.880 Even if you didn't agree with the reason, you would at least have that person with a face, whose job it is, maybe with the president's approval, of course.
00:34:55.380 But short of that, you haven't seen it, have you?
00:35:00.560 People are saying the CDC, but who?
00:35:03.020 Who's the face?
00:35:04.460 Who's the person?
00:35:05.700 Why are they not in every interview, every 10 minutes on TV all the time?
00:35:09.860 There's something missing, right?
00:35:12.200 That missing part should make you distrust your government's motives.
00:35:17.020 Because if everything was on the up and up, you would know who that person is, and they would be interviewed every day.
00:35:25.580 Every day.
00:35:27.760 So there's something about this process that isn't working.
00:35:31.340 And let me just put this thought out there.
00:35:33.280 If this is somehow a political decision, and it might be, right?
00:35:39.660 It might be a political decision.
00:35:41.720 I hope it's not a political decision.
00:35:43.800 I hope it's a health and welfare and security decision.
00:35:47.340 But what if it's a political decision?
00:35:50.080 I would like to put this thought out here.
00:35:53.040 No president ever lost his job or her job by being too cautious about a pandemic.
00:36:03.280 Let me say that again.
00:36:05.260 Nobody ever lost their job by being too cautious about a pandemic.
00:36:11.920 But if President Trump is not cautious enough, he can definitely lose his job for that.
00:36:20.900 Indeed, if we don't close our airports, and this thing gets into our country and starts killing people by the hundreds or thousands,
00:36:29.300 how could you possibly support Trump for re-election?
00:36:31.920 I couldn't.
00:36:34.340 Could you?
00:36:35.340 So he has a political risk of losing the election, but there's nothing he could do in terms of being overly cautious that could cost him even one vote.
00:36:45.740 There's not one person who would vote against him if he went too tight and was too cautious.
00:36:51.400 But I'm telling you, I'm getting close to voting against him just for not talking about it enough.
00:36:56.980 So I'm close to the edge where this issue, should it grow and more people in the United States get it and the deaths are coming in,
00:37:07.180 if we see that, how do you support the president?
00:37:11.540 I mean, really, how could you?
00:37:13.380 That would be such an egregious failure.
00:37:15.280 All right, egregious.
00:37:19.380 I'm seeing some people who are bad at economics argue that we shouldn't yet close travel from China because of the economic cost.
00:37:28.720 And I want to just dig into that a little bit.
00:37:33.580 Would there be a cost so formidable, so high that we can't stand it, if we closed travel for, let's say, 30 days?
00:37:45.900 Because 30 days is a long time in the life of one of these pandemics.
00:37:50.720 Maybe if you could stop travel for 30 days, just to pick a number, we can get a good foothold.
00:37:59.040 Now, what's the economics of that?
00:38:01.220 Now, remember, we're not talking about stopping trade.
00:38:05.060 Trade wouldn't stop.
00:38:06.480 We're only talking about human beings.
00:38:08.100 How much trade would be lost because, for one month, human beings could only talk to each other on the phone or video chats or e-mail or whatever?
00:38:21.220 How many deals would be lost simply because people had to wait a month to fly?
00:38:27.800 Close to none, right?
00:38:29.820 How much would the travel industry lose if the people who wanted to fly, you know, between China and the United States this month, suppose they had to wait till next month?
00:38:41.200 Well, first of all, most of those people who waited a month still have to go.
00:38:46.160 So most of the people who didn't go this month, let's say, hypothetically, the travel was shut down between China and the U.S.,
00:38:53.060 the people who didn't go still have to go if they were visiting family.
00:38:57.740 They still got to visit their family.
00:38:59.580 They just do it later.
00:39:01.020 So it might not even have that much of an impact on travel except that one month would be low,
00:39:06.980 but you'd probably have the best month you ever had the month after.
00:39:10.440 It wouldn't be enough to compensate, but probably 80% of it would just come back the next month.
00:39:18.320 And then other people were making this terrible economic comparison.
00:39:22.460 This is why I wrote Loser Think, my book that you should read.
00:39:26.080 People were saying that so few people have died from this coronavirus compared to, let's say, a regular flu,
00:39:35.920 which actually can kill thousands of people every year, or let's say car crashes.
00:39:41.500 So somebody said, well, if you're going to be that cautious, shouldn't you stop people from driving cars?
00:39:46.980 To which I say, analogies do not win arguments.
00:39:51.840 You may have heard me say that.
00:39:53.660 It is a ridiculous comparison of a mature risk that is basically woven into the fabric of our entire economy
00:40:04.760 versus a risk that's just starting, and we don't know how big it could get.
00:40:09.900 You can't compare those things, because on day one of the AIDS epidemic,
00:40:16.740 more people died from stepping on rakes.
00:40:21.460 On day one, the first day that anybody ever found an AIDS virus,
00:40:27.100 this is more hypothetical than literal,
00:40:29.680 the first day that somebody got AIDS and died, a human being,
00:40:32.620 there were more deaths from people stepping on rakes that year, right?
00:40:38.720 So would you say, well, AIDS is no problem,
00:40:41.560 because look, one person died versus all these people drowned in swimming pools
00:40:46.280 and car accidents and drank themselves to death,
00:40:48.620 so why worry about AIDS?
00:40:51.440 Because AIDS is going to grow quickly.
00:40:54.240 That's why.
00:40:55.000 You can't compare day one.
00:40:57.960 You'd have to look at what it could become
00:41:00.940 and make your decision based on risk management.
00:41:06.200 So a lot of people were really bad at comparing things
00:41:08.820 and weighing the economics,
00:41:11.540 and it's a good thing I'm here to fix that for you.
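To make the compounding point concrete, here's a toy calculation in Python. Every number in it is invented for illustration, not taken from any real epidemic: it just contrasts a mature, steady risk with a brand-new risk that grows.

```python
# Toy comparison (all numbers invented) of a mature, steady risk versus a
# brand-new risk that compounds. On "day one" the steady risk dwarfs the
# new one; after a few years of compounding, the ordering flips.

def constant_deaths(per_year: float, years: int) -> float:
    """A mature risk, like car crashes, holds roughly steady each year."""
    return per_year * years

def epidemic_deaths(day_one: float, growth: float, years: int) -> float:
    """A new risk whose yearly toll is multiplied by `growth` each year."""
    total, cases = 0.0, day_one
    for _ in range(years):
        total += cases
        cases *= growth
    return total

# Hypothetical: 40,000 steady deaths per year versus an outbreak that
# starts at just 100 deaths but triples annually.
print(constant_deaths(40_000, 1), epidemic_deaths(100, 3.0, 1))    # year one: steady risk looks far worse
print(constant_deaths(40_000, 10), epidemic_deaths(100, 3.0, 10))  # ten years out: the outbreak dominates
```

The specific numbers don't matter; the point is that a flat risk and a compounding risk can't be ranked by a single early snapshot.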
00:41:15.380 Apparently, Boris Johnson over in Great Britain has decided they're going to use China's Huawei company for their networks.
00:41:27.860 Now, that's a problem, because the Chinese government, we believe, or at least the United States believes, uses this company to spy on anything that crosses the network.
00:41:39.740 So this would, in theory, give the Chinese government the ability to snoop on all of Great Britain's traffic.
00:41:46.400 And so you might ask yourself, what the hell is Boris Johnson thinking?
00:41:51.660 To the point where people are suggesting online that we not do a trade deal with Great Britain, because we don't want to be dealing with them if all of our information will be stolen.
00:42:03.480 Anything that you do with Great Britain from now on presumably would be known by China, or could be.
00:42:11.540 So Piers Morgan tweeted on this, and he said, big call by Boris on Huawei, not least because it will infuriate President Trump.
00:42:22.900 He's not saying that's a good thing, just that that's what makes it a big call.
00:42:26.740 And, he said, this should end any fears our prime minister will be a lapdog to the White House.
00:42:32.540 Well, maybe, but doesn't this make him a lapdog to China?
00:42:37.040 Isn't it better, if you're Great Britain, to be a lapdog to the United States than a lapdog to China?
00:42:43.660 It looks like those are the only two choices, and he picked the wrong one.
00:42:49.860 Here's a question that keeps coming up, and again, here's another economic lesson for you in miniature.
00:42:56.840 I've been saying, wouldn't it be great if there were some way to bet on climate change?
00:43:01.640 Because then all the people who are so certain of their views, that it's a problem or not a problem, could bet.
00:43:09.360 They'd just get into the betting market, and everybody would think they had an advantage.
00:43:13.860 And I suggested that we already have that opportunity, because there are big insurance companies that insure lots of stuff, and therefore you could just bet on the insurance companies.
00:43:26.400 If you think climate change is not going to be a problem, well, they can probably charge greater premiums for it every year, because every year they'll adjust their rates.
00:43:36.400 So next year they might say, hey, we'll throw 2% on there for climate change risk.
00:43:42.380 So in theory, if you believe climate change is not a risk, these insurance companies would look like pretty good bets, because they would charge for this risk but never have to pay it out, according to you.
00:43:55.340 Now, people said, no, that doesn't work, because these are big multi-business, multinational insurance companies, and one type of risk isn't necessarily going to move the stock enough that you could isolate it.
00:44:11.780 So it's not a clean play.
00:44:14.320 To which I say: if that's not a clean play, then we wouldn't have any problem with climate change, because the risk isn't reflecting into the profits of the insurance companies in a big way.
00:44:31.320 Let me say this more cleanly.
00:44:33.460 If it were true that we would not notice any particular insurance company having a really bad year because of climate change, now or any time in the future, if there's no insurance company that's going to go out of business from this, it is also true that climate change is not that big a problem.
00:44:52.900 Right?
00:44:54.120 It can't be a problem if all the insurance companies are going to say, well, yeah, our losses on climate change were pretty big, but we made it up on car insurance and life insurance and stuff.
00:45:05.040 There's no world in which climate change goes as bad as the people warning about it say it will, and insurance companies can still make money.
00:45:14.320 It's not a thing.
00:45:15.460 At the very least, they would have to stop covering things they used to cover.
00:45:21.060 So in every scenario, even if they found a way to limit their losses, the number of things they could insure would drop like crazy.
00:45:30.700 So there is no world in which the insurance companies are not hugely affected by climate change, if climate change is the risk that the scientific community is telling us it is.
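As a sketch of that bet, here's a toy model in Python. The premium base, the 2% loading, and the loss numbers are all made up, and no real insurer prices climate risk this simply; it just shows why the skeptic's world and the alarmist's world pull the insurer's profit in opposite directions.

```python
# Toy model (all rates invented) of the "insurance stock as a climate bet":
# the insurer adds a yearly climate-risk loading to its premiums. If climate
# losses never materialize, the loading is pure extra profit, which is what
# the skeptic is effectively betting on by buying the stock.

def excess_profit(base_premiums: float, loading: float,
                  climate_losses: list[float]) -> float:
    """Profit above baseline: yearly loading collected minus yearly losses."""
    profit = 0.0
    for loss in climate_losses:
        profit += base_premiums * loading - loss
    return profit

# Hypothetical insurer: 1,000 in premiums, 2% climate loading.
# Skeptic's world: no climate losses ever show up.
skeptic = excess_profit(1_000, 0.02, [0, 0, 0, 0, 0])
# Alarmist's world: losses grow until they dwarf the loading.
alarmist = excess_profit(1_000, 0.02, [0, 10, 50, 200, 800])
print(skeptic, alarmist)  # positive in one world, deeply negative in the other
```

Either way the climate outcome shows up in the bottom line, which is the whole argument: if it never shows up there, it was never a big problem.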
00:45:42.560 I saw an idea from Balaji Srinivasan about some kind of blockchain arrangement where you could have a certain set of thermometers.
00:45:55.920 You'd have to agree on which thermometers were part of it, but the thermometers would just automatically register their temperatures, ideally without human intervention, if you could control such things, and report them.
00:46:09.580 And then people could gamble on it using their cryptocurrency, just for this purpose.
00:46:13.780 They could gamble on whether the temperatures would be higher or lower next year.
00:46:17.820 And then, maybe any given year there's some noise, but over time, if you placed your bet every year for 10 years, you could find out whether your temperature predictions are better or worse than the experts'.
00:46:32.060 So you actually could build a betting situation.
00:46:38.160 Did you all see, it's just huge news today, the Don Lemon interview?
00:46:43.980 He was talking to Rick Wilson, a big anti-Trumper, and another gentleman I didn't recognize, Ali somebody, I can't remember his last name.
00:46:52.100 But the three of them got to yucking it up about how Trump supporters are stupid southern hillbillies, basically.
00:47:00.500 They didn't use those words, but they were pretty clear what their meaning was.
00:47:04.040 And I think everyone who was a Trump supporter looked at that, and a lot of people who were not, and said, you know, if you want a landslide for Trump, do more of that.
00:47:18.600 Because the Democrats seem to consistently make the same play, which is, instead of going after the leader or the policies, they go after the people.
00:47:28.760 I mean, they're going after people, citizens, just for having different opinions.
00:47:35.860 And most people regard that as just way too far, and I think the adjustment to that is that Trump is likely to win in a landslide if you see more of that.
00:47:47.160 You know, one incident may be a lot of nothing.
00:47:50.220 But Netanyahu is flying out to meet Putin right now, I guess to talk about the peace plan.
00:47:59.420 Now, I keep telling you that we've never been closer to Middle East peace, even though most of you think it's impossible.
00:48:06.340 Right? Most of you are going to say: it's impossible, Scott, thousands of years of history, it's never going to be peaceful over there.
00:48:17.660 But to that I say this: we've never had this strong a team, meaning the leaders of the various countries are all strong leaders, and they're all deal makers, and most of them get along, with the exception of the Ayatollah, who just got isolated because he lost his general, his economy is falling apart, and the entire Middle East is sort of anti-Iran at this point, except for the places that Iran actually controls.
00:48:45.020 So we've never had a more conducive situation.
00:48:49.980 The people who would stop it are more flexible, probably because their economies are in shambles, and we've got the right people, which is probably at least half of the battle.
00:49:01.500 Now, I know what you're thinking: still, Scott, even with all of these so-called advantages you're talking about, I still don't think it can happen, because these people will just fight forever; it's just impossible.
00:49:16.020 Well, let me confess something.
00:49:21.100 I'm not just predicting; that's not what I'm doing.
00:49:25.380 If you've watched me long enough, and especially if you've read Win Bigly, you know that sometimes I'm just predicting, but sometimes I'm trying to change something.
00:49:38.080 And here's specifically what I'm trying to change.
00:49:41.280 In order for anything to get done by humans, in order for humans to say, okay, let's do this, and then go do it, there's one really important thing that has to happen first, and that is: you have to believe it's possible.
00:49:57.460 If you believe something's impossible, you will put no effort into it. Well, it's impossible.
00:50:03.960 And so in the Middle East, the first, most important thing which must be accomplished is for the public to say: you know, no matter how unlikely it is, for the first time it does look possible.
00:50:20.360 And so that's what I'm doing, and I'm doing it overtly; I'm not trying to hide it.
00:50:24.780 I'm trying to tell people, everybody I can talk to, that it's possible, and only because something's different.
00:50:31.820 What's different is that the group of leaders are unusually strong, very strong leaders and deal makers; that's very important.
00:50:40.180 And Iran is weakened to the point of being flexible, probably; losing their general probably makes a big difference.
00:50:47.600 So I think we should act as though it is possible, because if we act as though it's possible, we've created at least the first necessary condition for it to be possible.
00:51:01.920 So it's sort of a self-fulfilling prophecy.
00:51:05.740 I'm going to tell you it's possible, because I really believe that, based on observation and the variables and the players, etc.
00:51:14.280 So I would ask you to at least accept it, even if you think it's deeply unlikely; I'm just trying to move you to: but it's possible. But it's possible.
00:51:24.680 All right.
00:51:28.220 Here's a funny story. Sort of a funny story.
00:51:33.520 There was a football coach at Grand Valley State University who got fired because of how he responded to a student interview.
00:51:43.700 So he was interviewed by a student, and the student asked: name three people you'd like to have lunch with.
00:51:52.460 And the coach said Adolf Hitler was one of the three people he'd like to have lunch with.
00:51:58.920 Now, he was very clear to say he doesn't support any of the bad stuff that Hitler did, but he was noting, in his opinion, that Hitler was a strong leader who got people to do stuff, and he would like to have lunch with him to learn that technique.
00:52:13.040 Well, he got fired for that.
00:52:15.360 But there's a punch line here.
00:52:17.800 What was his exact job title? And I'm not making this up; this is the simulation winking at you.
00:52:25.480 The guy who said that he wanted to have lunch with Hitler, just to learn his technique, not because he was a friend of Hitler, and got fired for it: his job description was offensive coordinator.
00:52:41.000 That was his actual job, offensive coordinator.
00:52:43.780 And he got fired for being offensive.
00:52:49.820 That's all. He got fired for being offensive, and he was an offensive coordinator.
00:52:54.520 Well, if you want somebody to not be offensive, maybe you should hire an inoffensive coordinator.
00:52:59.140 That's what I'm saying.
00:53:01.000 All right.
00:53:03.380 So I guess there'll be some more of the president's defense.
00:53:08.860 I think so far the president's defense team: A plus. Really, really good.
00:53:18.120 One of the things that's interesting, too, and the president always gets some heat for this: the president is famous for saying that he likes people who look like they came from central casting, people who look good on television, people who play the part.
00:53:36.020 And so the president gets heat for that, but I gotta tell you, it does matter.
00:53:41.260 The president is so right about that, because if you look at the team that's against them, Nadler, Schiff, these are not television-ready personalities, if I can be kind.
00:53:57.400 These are people who have maybe a face for radio, could I say.
00:54:04.920 And I'm watching the president's counsels, and they're all kind of good-looking.
00:54:14.220 And even, yeah, have you seen Jay Sekulow's hair?
00:54:22.620 I was looking at his hair yesterday; he was on there, and they were doing a close-up, and I was, like, leaning into the television to look at his hair, because I just thought: that is the finest head of hair I've ever seen on a human being.
00:54:36.500 Like, you could do anything with that hair; you could pick any kind of hairstyle.
00:54:40.500 I mean, that's some seriously good-looking hair.
00:54:44.400 And then his other lawyers are tall and good-looking, and they've got powerful voices and stuff.
00:54:50.060 And I would even say, if you were going to pick the odd person out in that group, and by the way, even Pam Bondi looks great on camera, they're just people who look great on camera.
00:55:04.560 Alan Dershowitz, 80 years old. In the past he's had some weird curly hair situation, which somebody needed to tell him to shave off.
00:55:17.840 But at 80 years old, he's now cut his hair close to his head, and he looks great.
00:55:25.680 He looks great.
00:55:27.140 I mean, he just stood up there and did this long presentation that didn't look like it missed.
00:55:32.500 I don't even think he's a tenth of a step slow, if you saw it.
00:55:39.560 There was no point in Dershowitz's presentation when you said to
00:55:42.820 yourself
00:55:43.060 no point
00:55:43.700 when you
00:55:44.720 said to
00:55:45.000 yourself
00:55:45.260 wow
00:55:46.160 he's 80
00:55:46.720 didn't
00:55:48.400 happen
00:55:48.720 he did a
00:55:50.060 presentation
00:55:50.460 like he
00:55:51.180 was 62
00:55:52.320 and at the
00:55:53.080 height of
00:55:53.460 his
00:55:53.720 analytical
00:55:54.820 powers and
00:55:55.840 strength
00:55:56.280 so first
00:55:57.300 of all
00:55:57.460 his haircut
00:55:57.820 looks great
00:55:58.440 let me
00:55:59.140 say that
00:55:59.620 that looks
00:56:00.380 great
00:56:00.740 he should
00:56:01.360 have done
00:56:01.600 that years
00:56:01.960 ago
00:56:02.160 I mean
00:56:02.680 it just
00:56:03.000 totally
00:56:03.540 improved
00:56:03.980 his overall
00:56:04.940 credibility
00:56:05.980 and look
00:56:06.640 and he's
00:56:07.600 in good
00:56:07.880 shape
00:56:08.260 he's 80
00:56:09.300 very impressive
00:56:10.960 all right
00:56:12.220 that's all I
00:56:12.660 got to say
00:56:13.140 about that
00:56:13.840 and I will
00:56:14.320 talk to you
00:56:14.900 tomorrow