The Roseanne Barr Podcast - August 24, 2023


Scott Adams | The Roseanne Barr Podcast #011


Episode Stats

Length

1 hour and 30 minutes

Words per Minute

163.7

Word Count

14,858

Sentence Count

1,329

Misogynist Sentences

26

Hate Speech Sentences

41


Summary

On today's show, Roseanne is joined by Dilbert creator Scott Adams to talk cancellation, media narratives, the Trump indictments, reparations, abortion, hypnosis, persuasion, and his new book, Reframe Your Brain. A note from Jake: this episode was filmed via Zoom before the Maui fires, so the passing discussion of slow-moving disasters and being stuck in traffic has nothing to do with them. It was supposed to be last week's episode, but we preempted it with an episode about the fires, a story that was personally important to us since my mother lives in Hawaii. If you're a fan of my mother, this is the best place to get her unfiltered opinion; it's one of the last places she can talk where they haven't canceled her yet. Thanks to everyone supporting the show and our sponsors, including Andrew the Gold Guy at bh-pm.com; the best way to help us is to like, share, and subscribe. Coming up: Jack Posobiec next week, then J.P. Sears. -Jake & Roseanne


Transcript

00:00:00.000 I hate hockey. Seriously, I can't stand it.
00:00:04.920 My name is William Woodhams and I'm the CEO of the British-born sportsbook, Fitzdares.
00:00:09.880 We've been over in Ontario for well over a year now and have loved every second of it, except one thing.
00:00:17.300 Let me tell you, us Brits simply can't get our heads around hockey.
00:00:21.000 It is so confusing to us and it is impossible to outsmart Canadians on the ice.
00:00:26.660 That's why at Fitzdares, the world's oldest bookmaker,
00:00:29.160 you can play with us on anything, anything you want, cricket, tennis, just not hockey.
00:00:35.280 Plus, with our world-class casino and over 150 years of experience, you're in great hands.
00:00:42.000 So you've got to stop pucking around and go to fitzdares.ca.
00:00:46.660 That's F-I-T-Z-D-A-R-E-S dot C-A.
00:00:51.320 19 plus, Ontario only. Please play responsibly.
00:00:54.980 Hey everyone, Jake Pentland here.
00:00:56.760 You may know me as Roseanne Barr's son and the occasional person on this podcast outside of the guest that talks.
00:01:03.660 I wouldn't say co-host. I don't think I have that important of a role,
00:01:07.120 but I'm definitely all over the place on this podcast and, you know, also producing and editing and blah-bity-blah.
00:01:12.980 So, first things first, I wanted to say thank you all for supporting us and listening to us.
00:01:19.780 The numbers have been great.
00:01:21.140 The reviews, as much as you can on a podcast, we look at comments.
00:01:26.940 They've been great and encouraging and we're really encouraged that you guys are liking the show.
00:01:32.480 We want this to continue growing.
00:01:33.700 This is a long-term commitment my mother's made to do a show weekly.
00:01:38.740 It's not a vanity project.
00:01:40.600 It's not something she does because she's bored.
00:01:42.820 It's not like when a, you know, Billy Bob Thornton starts a band or something horrible.
00:01:46.720 Like, oh God, here's another celebrity doing something we simply don't give a shit about.
00:01:50.160 This is really the best place.
00:01:51.820 If you're a fan of my mother, this is the best place to get her unfiltered opinion.
00:01:56.240 It's one of the last places she can talk where they haven't canceled her from yet.
00:02:00.520 So, we really appreciate the support and, you know, the best way to support us is to like, share, and subscribe this show.
00:02:08.880 Help us get it out.
00:02:09.880 Help us get the numbers up.
00:02:11.540 We're doing well, but we can always do better.
00:02:14.000 Also, I want to say those of you that went to bh-pm.com to talk to Andrew about purchasing precious metals,
00:02:21.020 I'm really encouraged by the numbers.
00:02:22.740 I see a lot of you are protecting your wealth.
00:02:25.840 Because you say it came from Roseanne, we can track the leads, and a significant amount of you are taking it very seriously.
00:02:32.800 And I think that's very smart.
00:02:34.080 As we know, the dollar is unstable.
00:02:36.920 The economy is unstable.
00:02:38.080 And the best thing you can do right now, it could change, but right now is to put as much of your money in precious metals as you can.
00:02:44.520 And you can see that episode here on our YouTube channel with Andrew the Gold Guy.
00:02:50.420 We're very encouraged by that.
00:02:52.740 And anyway, now on to the episode.
00:02:55.280 I just want to be very clear that this episode we filmed with Scott Adams is one of my favorite.
00:03:00.220 But it was filmed before the Maui fires.
00:03:02.800 And the reason I'm telling you that is because there is a portion of this episode where my mother and him are discussing slow-moving disasters.
00:03:09.780 And they make mention of being stuck in traffic and not being able to get out.
00:03:15.800 And, you know, if this was recorded after the Maui fires, that would have been very insensitive to talk about it that way.
00:03:22.660 So I just want to be clear.
00:03:23.900 This was supposed to be last week's episode.
00:03:26.220 We preempted it with an episode to discuss the Maui fires because that was obviously a very important story.
00:03:31.520 And one that was personally important to us as well.
00:03:35.540 My mother is a citizen of Hawaii.
00:03:37.480 So I just want to be clear when you're hearing them talk about being stuck in traffic and disasters that it was no way about the Maui fires.
00:03:46.240 And lastly, and most importantly, I want to say that this episode, because Scott was not in person, was filmed via Zoom.
00:03:54.460 And unfortunately, I don't know the state of the audio until the end of the episode when it uploads.
00:04:00.320 We don't have great internet here where we live.
00:04:03.220 And unfortunately, my mother's audio, specifically in the first few minutes, is popping and probably very irritating to listen to.
00:04:11.240 More irritating than her natural voice.
00:04:14.340 I just want to say stick with it.
00:04:15.820 It's a great episode.
00:04:16.760 Her audio does get better.
00:04:18.340 And like I said, there's not anything I can do about it.
00:04:20.620 But don't turn it off if it annoys you.
00:04:23.240 Just stick it out.
00:04:24.260 The conversation is brilliant.
00:04:25.820 Scott Adams has a Malcolm Gladwell-esque approach to things.
00:04:30.120 And, you know, I find this episode, even though it's subdued and intellectual, it's one of my favorite we've recorded.
00:04:38.180 You know, we like to give you something different every week.
00:04:40.280 Sometimes we're getting high.
00:04:41.920 Sometimes we're getting deep in conspiracy theories.
00:04:43.720 And other times we're just shooting the shit with really, really smart people.
00:04:47.320 And that's what this episode is.
00:04:48.720 So the best thing you can do to help us, aside from visiting sponsors like bh-pm.com and letting them know Roseanne sent you, that helps, is to like, share, and subscribe.
00:04:58.300 We want as many people to see this episode as possible.
00:05:01.620 And I'm not sure that the episodes are showing up in the algorithm of YouTube in the most efficient way on their end.
00:05:09.800 That's all I'll say.
00:05:10.820 So sharing, liking, subscribing, that's the best way you can support us.
00:05:16.820 So, all right, enough of my begging.
00:05:20.000 I just want to say enjoy this episode, and we will see you next week, where we will be joined by Jack Posobiec.
00:05:26.720 The following week will be J.P. Sears.
00:05:29.060 So we've got some good episodes coming to you.
00:05:31.460 So, anyway, thanks again for the support.
00:05:33.760 Okay, hi, everybody.
00:05:35.060 Welcome to the Roseanne Barr Podcast.
00:05:37.240 So I'm very excited to have, as a guest today, the author of Dilbert and a new book, too, that he'll hold up and we'll talk about, Scott Adams.
00:05:54.280 Hi, Scott.
00:05:55.200 Hi, Roseanne.
00:05:56.160 Hi.
00:05:56.800 Thanks for having me.
00:05:57.380 I like the title of your book.
00:05:58.280 Oh, thanks for coming on.
00:05:59.780 I love that the title of your book is Reframe Your Brain, right?
00:06:04.760 Reframe Your Brain.
00:06:05.580 Yeah, it should be out just about the time people see this.
00:06:08.520 I'm just going through the final edits, and it'll be ready to go.
00:06:12.620 You know, I'm all about programming your brain.
00:06:15.360 I'm all about that.
00:06:17.060 And specifically, since we have this in common, as we're both labeled racist by the United States racist media,
00:06:26.020 I loved seeing you on Cuomo, and I mostly enjoyed so much Cuomo lecturing you on response.
00:06:35.580 Responsibility for what you say.
00:06:38.820 I loved it.
00:06:40.460 You know, he was really nice to me.
00:06:43.080 I thought he did a solid job of journalism on that.
00:06:45.600 I did, too.
00:06:46.120 I think you and I have one thing in common, which is interesting in our two controversies,
00:06:52.500 which I don't think that anybody actually believed them.
00:06:56.560 As in, I don't think anybody actually think you're a racist.
00:06:59.860 I don't think anybody actually thinks I am.
00:07:02.000 I haven't met anybody.
00:07:02.840 I mean, in person, not a single person believes anything like that.
00:07:07.340 But the average viewer of the news doesn't understand that public figures are generally used as sort of a conduit for other people's opinions.
00:07:17.000 So if they can find any way to define you as the hub of the place they can load their opinion on and put it through you,
00:07:25.860 it's really just a vehicle for other people to express their opinions.
00:07:28.820 So what you actually said or what you actually meant and what I actually said and what I actually meant never really came up.
00:07:36.980 It was like that wasn't an important part of the process.
00:07:40.100 You know, if anybody had asked me.
00:07:40.900 And remove context.
00:07:43.920 Yeah.
00:07:44.440 I mean, that's sort of what.
00:07:45.980 Now, the thing that the average viewer of news doesn't realize, and you, of course, would know it better than anybody,
00:07:52.540 the news about public figures is almost never real, almost never, probably nine out of 10 times they're leaving out the important part of the story.
00:08:03.360 You know, they might get a fact right, like if somebody died, usually right.
00:08:08.920 But if it's somebody said something or was alleged to say something, in my experience, those are almost never true.
00:08:15.160 I'm really puzzled about whether things have gotten worse or we got smarter about how bad they were.
00:08:22.960 There's definitely more canceling going on.
00:08:25.160 That's for sure.
00:08:27.020 I think both.
00:08:28.260 We got smarter, so they got more pervasive in their mind control over the airwaves.
00:08:35.260 Yeah, at this point, the ability to hide a major story is the scariest part of the media situation.
00:08:44.700 It's not what they say that's not true.
00:08:47.020 That's bad enough.
00:08:48.000 But they can make a major story just disappear.
00:08:51.080 Well, I know you're concerned about the indictment of Trump.
00:08:56.180 And I like that you said, don't say weaponization of the Department of Justice, say destruction.
00:09:05.260 Yeah, you know, I don't think people fully realize that everything about America that works is based on the foundation of the justice system and the fact that ours is better than most.
00:09:18.280 You know, the reason you come to America, among other reasons, is that the justice system, you know, gives you a chance of, you know, not being jailed for the wrong reasons and running your company and not running into too much trouble with other criminals.
00:09:32.080 But if we lose that, that's somewhat irreparable.
00:09:37.380 I mean, everything else would collapse.
00:09:39.360 And I think we're taking some pretty big shots at it with, you know, what's going on lately with Trump specifically.
00:09:47.900 I mean, to me, I don't know anybody who's following the story who thinks that's legitimate.
00:09:52.780 I know a lot of people who are not aware that, for example, questioning the electors has been a historical thing.
00:10:01.300 It's happened.
00:10:02.540 People don't seem to understand that Republicans don't hold insurrections without weapons.
00:10:07.200 Like that somehow we were sold the idea that Republicans would launch a coup and they wouldn't bring weapons and that the way they would conquer the United States is by sauntering around the Capitol for a few hours until the government surrendered.
00:10:23.760 I don't know.
00:10:24.080 Were they waiting for the surrender?
00:10:25.600 And it's so horrifying to think that it was that easy to overthrow the government of the United States with all those police and everything around letting them do it, knowing that they were there to overthrow the government.
00:10:44.300 How easy was that?
00:10:45.820 You know, you think the nuclear triad would have been more important, but no.
00:10:51.580 People with bison hats and whatnot.
00:10:56.020 So today I heard from Dershowitz.
00:10:59.260 He said that the Jack Smith indictment included language from Trump's speech, and they did not include in the indictment where he said the peaceful and patriotic part.
00:11:11.280 The most important part of it.
00:11:12.560 They purposely left that out.
00:11:13.940 What so disturbed me about what?
00:11:20.220 I wanted Scott to talk more because I watched that video, Scott, your podcast about this, about how Jack Smith is basically committing the crime that he's accusing Trump of.
00:11:28.920 I wanted you to explain that a little bit more.
00:11:30.640 It's brilliant.
00:11:31.140 So this is Dershowitz's point, Alan Dershowitz.
00:11:35.920 He was saying that if the crime that Trump is committing was not telling the truth and that therefore that had repercussions in the real world, that Jack Smith is also not telling the truth by leaving out the key part.
00:11:49.920 You know, it's a lie by omission in the indictment.
00:11:53.100 They're both, you know, they both have something important to do with our government and with the country.
00:11:58.740 And it's hard to see, you know, I don't know about the technical legal details that, you know, Dershowitz can talk about that.
00:12:06.540 But it looks the same.
00:12:08.060 It looks like somebody lying.
00:12:10.320 The difference is that Trump may have said something in a political context, which everybody should understand, includes some untruths.
00:12:18.740 But if you see it in an indictment, I'm sorry, those are not the same.
00:12:24.680 Those are not the same.
00:12:25.540 No, they're not the same.
00:12:26.240 Well, one of them is the worst thing in the world, and the other one is just somebody talking.
00:12:31.640 The fact that we've found that the one talking is the one going to jail, and the one doing the worst thing in the world is the one putting him in jail.
00:12:40.520 And we're just okay with that.
00:12:41.320 Well, already, they've already been caught with making up fake FISA's, fake dossiers.
00:12:47.440 They've already been caught, and everybody knows that's the truth.
00:12:50.900 And yet, they're never indicted, and people see that there is a dual-tiered, a two-tiered system of justice, one for the establishment, and one for everybody who's not in the establishment and trying to make change or expose it.
00:13:14.340 Yeah, you know, the most amazing thing that boggles my mind is that we've now learned enough about, let's say, the laptop and the 50 Intel people who lied about that.
00:13:25.980 We know about the Russia collusion hoax.
00:13:28.880 Siri, get out of here.
00:13:31.500 Oh, my God.
00:13:32.460 The sound screwed up, Jake.
00:13:34.360 What's happened?
00:13:36.120 The sound.
00:13:36.840 Siri came on.
00:13:38.700 Uh-oh.
00:13:39.040 Is it still doing it?
00:13:39.800 I don't hear it.
00:13:40.460 Well, let me get rid of it, that bitch.
00:13:44.560 Sorry, I wanted to center your camera.
00:13:46.880 That Siri bitch.
00:13:48.220 I hate that bitch.
00:13:50.360 Okay.
00:13:51.880 I'm sorry to have interrupted you.
00:13:54.660 Continue, please.
00:13:56.660 What was I saying?
00:13:57.780 Oh, and so we found out the laptop story, that was all faked.
00:14:01.320 And it was also, it wasn't just fake, but it was a fairly massive collusion.
00:14:06.860 You know, a lot of people had to be in on it.
00:14:08.440 Then, you know, of course, we know the Russia collusion hoax was fairly massive in terms of how many people were in on it.
00:14:14.100 And now we're learning about the, you know, the Biden family business and how that worked.
00:14:19.140 We now know, I think we know the entire structure of it.
00:14:23.000 Yes, we do.
00:14:23.720 There are not many questions left.
00:14:25.600 Now, I don't know what's legal and illegal.
00:14:27.420 I think Hunter might have been clever enough to, you know, be on the right side of the law.
00:14:31.680 I'm no expert.
00:14:32.500 I don't know.
00:14:33.420 But the fact that the way it's being presented to the public is just a complete, just a cover-up.
00:14:41.920 And how many collusions on the same side, followed by a media cover-up, do we have to see before we make some kind of change?
00:14:51.860 I'm hoping it's an election change.
00:14:54.100 What about this, that Trump had the right to have those papers under the presidential, whatever it's called, I forget.
00:15:02.080 But he had the right to have those classified papers as the president of the United States.
00:15:06.960 But Biden had them as a senator and as a vice president, which is highly illegal.
00:15:12.600 And he doesn't get indicted for that.
00:15:14.460 They're lying around his garage and he took them to Chinatown, where China owned the building he was renting, handed them out in the street.
00:15:21.220 You know, I don't know the technical details of what, you know, what's the difference between those two cases.
00:15:30.500 But here's the thing.
00:15:31.640 Allegedly.
00:15:32.720 Allegedly.
00:15:33.640 But the dog that's not barking really loud right now is we've not heard one peep about what the contents of Trump's boxes are.
00:15:41.720 Now, the only reason that could be is because there's nothing important there.
00:15:47.580 There could not be any other reason.
00:15:49.620 Because if, in fact, it were nuclear secrets or something like that, you don't think that they would tell us, at least in broad strokes, hey, there were nuclear secrets?
00:16:00.380 You know, they don't have to tell us the secret.
00:16:01.960 I would imagine he probably had some things that would sort of defend him in the future or, you know, maybe make his situation look better.
00:16:10.780 I would imagine most of it's just for a biography.
00:16:14.200 You know, I assume they could have been for that, too.
00:16:16.400 But I think he has the Epstein list and every tentacle that it reached out to from the 28 bank half of HSBC Bank, which they covered up for all these years.
00:16:28.760 And, you know, I think he has it all.
00:16:32.360 I mean, I love I really wanted to talk to you about being left of Bernie because I am.
00:16:38.540 But anyways.
00:16:43.380 I'm forgetting what I'm saying.
00:16:44.820 What am I saying, son?
00:16:46.840 Oh, reparations.
00:16:48.920 We agree on this.
00:16:50.140 Reparations.
00:16:50.740 We agree.
00:16:51.260 I'm sure.
00:16:51.760 And it should be infrastructure, starting with schools.
00:16:56.100 That is reparations to the black community.
00:16:59.700 And that's how we can do it.
00:17:01.860 We can put infrastructure where they live instead of bringing in fentanyl from the border to destroy that community.
00:17:09.540 And I think that it is a genocide on black America going on and nobody's talking about it because the only people that are allowed to talk are people that are in on the yank.
00:17:24.100 Yeah, I would agree with you that the biggest source of systemic racism is the school system because its inadequacies are, you know, multiplied in the black community.
00:17:38.220 So it's it's sort of a forever problem.
00:17:41.540 But it ain't no it isn't a mistake.
00:17:43.800 It's on purpose.
00:17:44.980 Is it on purpose by who?
00:17:49.520 By the institution of by institutionalized racism.
00:17:54.140 And, you know, you look at who supports it and it's not who you think, you know, it's not who you think at all.
00:18:01.760 It's people who are in on the yank.
00:18:04.020 They're getting paid off government money to keep the shit going.
00:18:07.760 They don't want to change anything.
00:18:10.540 They don't want to use public money for the good of the public.
00:18:14.560 Come on.
00:18:15.260 It's all corrupt.
00:18:16.660 Every bit of it, every stinking tier of it.
00:18:20.460 Well, I cannot believe that these people are out there applauding a compromise and corrupt justice system that sits there and invents and, you know, perverts law to get an innocent person in jail.
00:18:35.980 When that's what's happened to a whole bunch of black men in this country.
00:18:39.860 And they're applauding for it.
00:18:42.020 I want people to snap out of their stupor.
00:18:44.980 And I want them to do it right now.
00:18:47.000 And I'm pissed about it.
00:18:50.320 I've joked that Trump is one indictment away from winning the black vote.
00:18:55.900 And I'm only a little bit kidding about that because I actually think that the Justice Department is so, so corrupted.
00:19:05.980 That he could say, look, the Justice Department doesn't work for me.
00:19:11.080 Imagine how bad it is for you.
00:19:13.760 Elect me and I'll fix it for all of us.
00:19:16.200 Now, I'm not sure if he could do that because, you know, a lot of the judges.
00:19:19.440 Well, but you know that these judges, they're only doing it because they're getting paid off.
00:19:23.980 So if, you know, they go and pay the judge to convict, you know, they've done it a million times to convict the nearest black man.
00:19:33.280 They've done it a million times.
00:19:34.800 I'm here in Texas.
00:19:35.860 They've done it a million times here in Texas.
00:19:38.140 A lot.
00:19:38.440 And everybody's supposed to ignore that while they're doing it to Trump?
00:19:43.320 Yeah.
00:19:43.660 Thanks to the new black man.
00:19:45.020 I'm definitely willing to believe that there's, you know, massive injustice from the top to the bottom.
00:19:51.340 So, but we have something in common accidentally.
00:19:54.260 I mean, the entire public now is being abused by the Department of Justice.
00:19:57.920 So we weirdly have something in common.
00:20:01.760 And look how they did to us.
00:20:03.780 Look at all the things they've done to us to destroy our freedom of speech.
00:20:08.320 It's just terrifying.
00:20:10.280 But we have to find a way to wake people up.
00:20:13.680 And I know that that's why you said what you said.
00:20:17.200 Like I heard you on Chris Cuomo.
00:20:19.960 You get the big attention and then you come back and reframe and explain.
00:20:24.060 That's the only way you can get anything on the media now.
00:20:27.180 Yeah.
00:20:28.220 The reframe that I was trying to promote is that we're at a point in history where the affirmative action and, you know, real aggressive race-based policies probably did help a lot in the past.
00:20:43.600 Probably that's the reason that we have diversity in businesses.
00:20:47.720 Probably it was one of the best things that America's ever done to make sure that everything was inclusive.
00:20:53.080 But it is logical and obvious that at some point you have to stop doing that because it's hurting more than it's helping.
00:21:02.300 And it's going to take basically people like me who don't mind getting canceled to start calling out when the crossover happens.
00:21:12.240 And that's what I was trying to do.
00:21:13.320 So in my mind, the CEI, the ESG, the DEI, the CRT, they all have in common that they demonize white people for the benefit of a class that would benefit if they can change things.
00:21:33.680 So you don't want to live around that situation.
00:21:38.100 In other words, you want to reduce that as much as possible.
00:21:40.860 Jackpot City is the home of all things casino.
00:21:44.100 There's just one house rule to create the perfect online casino for you.
00:21:48.060 We've built a world-class lineup of classic casino games such as roulette and blackjack
00:21:51.940 and crafted a virtual range of the best slots, including Atlantean treasures.
00:21:55.780 Everything's online.
00:21:56.760 Everything's ready.
00:21:57.640 Everything's for you.
00:21:58.680 So whenever you're feeling playful, head to Jackpot City and you'll be endlessly entertained.
00:22:03.020 Jackpot City.
00:22:03.960 Casino games perfectly made for you.
00:22:05.920 Must be 19 plus.
00:22:06.920 Ontario residents only.
00:22:07.960 Please play responsibly.
00:22:09.000 Gambling problem?
00:22:09.600 Visit connexontario.ca.
00:22:10.860 Now, when I hyperbolically said, get the hell away, there's no practical way to do that.
00:22:17.740 Nobody would want to do that.
00:22:19.720 That should never have been taken seriously.
00:22:21.560 It was hyperbole.
00:22:22.540 But here's what can be done.
00:22:24.360 We can absolutely make sure that every black kid and every other kid learns how to succeed on a personal level.
00:22:32.920 In other words, if you teach them the tools of success that have worked for every person everywhere of every type at every income level,
00:22:40.160 they're going to do fine, they're going to do fine.
00:22:42.560 Will they do as well as some other group?
00:22:45.040 You know, I don't care.
00:22:46.060 I feel like I need to stop caring about this weird average of one group versus the average of another.
00:22:53.840 I cared a lot when the averages were, you know, completely out of whack, right?
00:22:59.440 You care a lot then.
00:23:00.480 But once it gets close, even if it's not even, you've got to drop the stuff that's creating more problems than it's solving.
00:23:08.200 And I think at this point we have to switch to individual success strategies.
00:23:12.620 I'm actually working with Joshua Lisec to build a student guide that's based on my books, these two.
00:23:22.320 This will come out pretty soon.
00:23:23.680 But it's basically guides for personal success.
00:23:28.200 And, you know, I came from a low-income situation.
00:23:31.920 I'm guessing you probably did too as well, low-income situation.
00:23:35.300 And if you simply do the things that people have always done to succeed, you've got a really, really good chance in America.
00:23:44.300 And if the averages of two groups that somebody decided have to be measured are different, I'm not sure that that's the problem anymore.
00:23:55.060 Because show me a black kid who went to school, stayed off drugs, studied, and developed a skill or set of skills that the marketplace valued and didn't do well.
00:24:10.220 Does that person even exist?
00:24:12.720 It's sort of like I always say, you know, show me the homeless engineer.
00:24:18.600 Unless I have a mental problem.
00:24:20.580 Well, you never know about that once you bring in drugs, you know, and alcohol.
00:24:24.200 There's a lot, probably a lot of them.
00:24:26.900 But, you know, everyone should have the right to have a public school that's a place that actually teaches them how to get along in this world that we actually live in so they can be employed and have a gainful future, which they refuse to do.
00:24:46.960 They don't think that that is important at all.
00:24:49.260 Yeah, the thing I always notice.
00:24:50.460 They're trying to manufacture child soldiers.
00:24:53.380 Have you ever noticed that when you meet a black conservative, they're doing well?
00:25:00.080 They're always doing well.
00:25:01.600 And it's because they have a set of rules that everybody has always used for success.
00:25:06.500 And they just use the rules and now they're doing well.
00:25:09.200 Well, so do rappers that live within million-dollar mansions.
00:25:15.100 They don't talk about it unless they, when they're launching their fourth clothing line, then they begin to talk about discipline and all that other stuff.
00:25:24.060 Yeah, but if you're talking about like Jay-Z or even Ye, I mean, he's controversial, of course.
00:25:30.320 But if you look at his work ethic, amazing.
00:25:34.020 Yeah.
00:25:34.360 Amazing work ethic.
00:25:35.960 And he built a-
00:25:36.580 He's probably conservative when they do their taxes.
00:25:38.360 Well, once you get money, you turn conservative real fast.
00:25:43.260 Didn't you find that out, Scott, if you came from working class background?
00:25:47.580 You know, I haven't examined my history to know if I've changed my opinion that much.
00:25:52.620 But let's see, I grew up in a Republican town.
00:25:55.080 I became kind of, I felt a little, you know, lefty when I was young.
00:26:01.960 And, but I wasn't paying attention to much, right?
00:26:05.280 The more you pay attention, the more, the more you dig down, the more, I don't, I don't identify as conservative.
00:26:14.080 Because like I say, I'm left to Bernie on some things.
00:26:16.540 I'll give you some examples.
00:26:17.780 Everybody always asks, what's that mean?
00:26:19.440 But I bet you ain't, I bet you're conservative on putting your money in the bank.
00:26:23.780 Well, you mean conservative about money in general?
00:26:29.040 Yes.
00:26:29.360 Oh yeah, of course.
00:26:30.320 Because money is not politics, right?
00:26:33.460 National defense is not politics.
00:26:36.280 So if you tell me, you know, a question about the military or the economy,
00:26:39.120 I don't even know that there's, you know, a Democrat or Republican way to look at that.
00:26:44.380 I just look at, okay, who's going to be gored and who's going to make money in this plan?
00:26:49.680 And that's sort of all there is.
00:26:50.900 But if you, but if you take, I'll give you some examples of why I say I'm left of Bernie.
00:26:56.060 Okay.
00:26:56.280 So Republicans might say they don't like abortion.
00:27:00.240 Democrats might say, yes, we'd like it under certain conditions.
00:27:03.760 And I go further than that to the left.
00:27:06.620 And I say, people like me should stay out of it.
00:27:09.520 People like me who don't have babies.
00:27:13.240 People like me who can't have a baby.
00:27:15.460 I would, now, of course, nobody should give up the right to vote or have an opinion.
00:27:20.420 But I think that the best abortion law would be the one that is backed by the most women in the country.
00:27:27.840 And maybe by state, that makes sense, you know, to make it local.
00:27:32.540 But let's imagine that the men wanted abortion to be illegal.
00:27:38.500 The women wanted it to be legal.
00:27:40.520 You wouldn't want the men to win in that situation.
00:27:44.180 That's not a stable country.
00:27:46.300 The most stable situation is where the people closest to the decision got to make the decision.
00:27:52.700 And that might be different by state.
00:27:55.380 Yeah, but that never happens.
00:27:58.220 Which part?
00:27:59.660 Never happens.
00:28:00.940 The people who have to live with the decision are never the ones making the decision.
00:28:05.200 Oh, well, that's true.
00:28:06.140 I'm just saying that as a citizen, I feel I should not influence that conversation.
00:28:13.800 But I try pretty hard to influence that.
00:28:15.280 Well, I think you should.
00:28:17.160 What does that make me?
00:28:18.720 I think every person should have a voice in this.
00:28:22.120 But it's a huge thing.
00:28:23.880 It's torn our country apart for decades.
00:28:25.700 Hold on.
00:28:26.080 There's a nuance here that I'm not explaining well.
00:28:29.300 When you have a situation where you know nobody will agree, and abortion is one of those,
00:28:34.680 you're not going to win anybody over.
00:28:37.300 And it's also life and death.
00:28:39.480 And it's also vital.
00:28:40.660 It's such a big topic.
00:28:41.640 It's vital to the cohesion of the country.
00:28:44.940 When the stakes are that high, and you can't decide and you know you'll never win an argument,
00:28:49.940 the default, and this is the important part, in that situation, the best solution is that
00:28:56.620 you have the most credible set of laws, not the best ones.
00:29:01.400 So the most credible ones are that the people who are, let's say, most skin in the game,
00:29:07.720 looked into it, and collectively they said, you know, we're the closest to this.
00:29:13.020 This is what we think.
00:29:14.140 And then people like me, who are sort of outside that circle, can say, you know what,
00:29:19.060 I'm not even sure I agree with what you decided, but I definitely agree that you're the right
00:29:23.740 people to decide it.
00:29:25.520 So if you can't get the right decision, which really is not, it's not a possibility, because
00:29:30.100 we're so polarized on that.
00:29:32.420 If you can't get the right decision, the next best thing you can do is get the right people
00:29:37.120 to be the dominant part of the decision.
00:29:39.460 I think if common sense was introduced, which it has never been, if it was for once introduced,
00:29:48.080 a compromise could appear that everyone could accept.
00:29:52.200 And that's what I like.
00:29:53.740 I like the power of words and conversation in order to access together a common sense solution
00:30:01.720 to every problem, because we do have the brains and the wherewithal and the intelligence,
00:30:09.420 even if, sometimes I say, even if we have to manufacture it artificially, but we do have
00:30:15.220 access to the intelligence to be able to solve each and every situation and problem facing us.
00:30:23.060 We do.
00:30:24.060 If we don't, I like that you said, when you don't monetize the problem, you can better solve
00:30:30.980 it.
00:30:32.120 Yeah.
00:30:32.760 But I think with the question of abortion, there's no conversation that would get you
00:30:38.600 really close to any kind of agreed, central...
00:30:41.720 I think there is.
00:30:43.080 And here's what I say.
00:30:44.440 So put this under your hat and smoke it or whatever the old lady thing is.
00:30:49.100 I'll smoke it.
00:30:50.620 Boomer talk.
00:30:51.600 Huh?
00:30:52.200 I'll smoke it.
00:30:53.040 What?
00:30:53.180 Just give it to me.
00:30:53.780 I'll smoke it.
00:30:54.340 Okay.
00:30:55.360 I think it should be between a woman and her doctor, and it's nobody else's damn business
00:31:02.980 to try to politicize it.
00:31:04.880 And I think that everyone should get together in a room and accessing the greatest data that
00:31:13.780 we have available to us, whether that being inside the womb feels pain or not, because
00:31:19.920 in our new world, there's going to be one commandment.
00:31:23.400 Well, there's going to be a really important commandment, which is no cruelty.
00:31:28.520 And so I think that using all of that in a higher mind where we're intelligent and not
00:31:35.900 base creatures that crawl on the earth, but we use our mind, we'll go, well, what week
00:31:43.140 is that?
00:31:44.520 And it will be between the intelligent woman and her ethical doctor, and it will be no
00:31:52.040 one else's business.
00:31:53.220 That's what I propose.
00:31:54.360 Now, I think people can agree.
00:31:56.420 I do.
00:31:56.720 But here's the question.
00:31:59.420 You're laying out a world in which rational people look at the evidence and kind of agree
00:32:06.440 because logic and data drives them to the same place.
00:32:10.480 You may know I'm a trained hypnotist, and the first thing you learn is that you don't live
00:32:15.480 in that world.
00:32:17.140 The first thing you learn as a hypnotist is that people make decisions first, they rationalize
00:32:23.400 them after the fact, but they don't know they did that, and then they get in an argument
00:32:27.220 with you about how they arrived at their decisions through their logic and data when nothing like
00:32:32.180 that happened.
00:32:34.200 That's so true.
00:32:35.140 You've heard the famous saying that you can't use reason to talk somebody out of an opinion
00:32:42.000 that they did not arrive at through reason.
00:32:45.620 And that's nine out of ten decisions.
00:32:48.040 So when you talk about something like abortion, that more than anything else is driven by
00:32:54.140 your ickiness feeling about a fetus, in my opinion.
00:32:58.400 If you're thinking of a fetus and you go, oh my God, that's like a person, and if you
00:33:02.840 ended it, you'd be murder, nobody can talk you out of that.
00:33:07.140 That's literally what you see and feel.
00:33:09.180 It's not a reason.
00:33:10.720 If somebody else says, well, it has no memory, it's not, I mean, it looks like something,
00:33:15.800 but...
00:33:16.700 But they're just assuming all that.
00:33:18.800 What's that?
00:33:19.320 They're just assuming that it doesn't feel pain, but they have done studies about a fetus
00:33:25.660 feeling pain.
00:33:26.800 I don't think...
00:33:27.640 And I think that should be factored in.
00:33:29.760 I just do because I like humanity, and I like keeping mom.
00:33:34.520 I like your idea of that as a standard.
00:33:38.340 I just don't think you would get people to agree with that standard.
00:33:42.260 Well, I know, but look at what they do agree with, Scott.
00:33:45.880 Why not?
00:33:46.820 Why couldn't we do the right thing for once at the right time with the right people for
00:33:50.920 the right reason?
00:33:51.460 We're not rational.
00:33:52.440 We don't have to keep being Murphy.
00:33:55.880 Well, we kind of do.
00:33:57.200 We kind of don't have the option of suddenly becoming rational.
00:34:03.100 It's just not...
00:34:04.000 Well, we do, though.
00:34:05.400 Why do you say that?
00:34:07.440 We...
00:34:07.840 You really don't believe we do?
00:34:10.340 No, not even a little bit.
00:34:11.580 No.
00:34:12.300 That's the...
00:34:12.980 The...
00:34:14.160 Damn.
00:34:16.920 The hypnosis point of view is that it's all rationalization after the fact.
00:34:20.820 The exceptions to that would be like balancing your checkbook or looking for a sale at the
00:34:26.320 store.
00:34:27.320 That could be rational, picking the best route to your destination.
00:34:30.980 But anything that has an emotional weight to it at all is 100%.
00:34:37.280 How does it feel?
00:34:38.840 And then can I build an argument around how I feel?
00:34:41.320 I just hope we evolve to the next level quickly.
00:34:45.040 I really think we'll have to.
00:34:47.220 Well, take the simplest situation.
00:34:52.660 I guess you could call it simple.
00:34:54.840 Climate change.
00:34:56.440 Climate change, in theory, is so studied that we should not have any disagreement about what
00:35:02.800 it is or where it's going.
00:35:04.540 But you can see that there's no amount of data, logic, argument on either side.
00:35:11.160 I'm not taking a side now.
00:35:12.440 At the moment, I'm just talking about it from the big picture.
00:35:15.240 There's no amount of anything that's going to change anybody's opinion on that, except
00:35:19.480 for this little sliver of people who actually weren't committed one way or the other.
00:35:24.620 You know, sometimes you can bend a few in the middle.
00:35:26.860 But do you think you're going to get Greta to say, you know what, after I thought about
00:35:31.940 it, fossil fuels would be terrific.
00:35:35.060 You know, it's just not going to happen.
00:35:37.400 I think it will happen because I think it's already happening.
00:35:41.880 Like, for instance, the major proponents of climate change say the oceans are rising,
00:35:47.960 but they all went and bought a beachfront mansion for millions of dollars.
00:35:52.060 So obviously, they know it's bullshit.
00:35:54.340 Oh, let me push back.
00:35:55.340 You don't need to go farther than that.
00:35:57.220 Hold on.
00:35:57.460 Let me push back on that.
00:35:59.300 I'm going to put the rich person filter.
00:36:02.240 If I have enough money, I'm going to buy a beach house.
00:36:05.180 And if it goes underwater, I don't care.
00:36:08.400 It's my third house.
00:36:09.240 What if you're in it?
00:36:10.340 No, they live in it, Scott.
00:36:12.020 It's not a vacation.
00:36:14.220 Yeah.
00:36:14.580 No, I mean, you're talking about the rich people.
00:36:16.840 The rich people with the beach homes can take the risk that the beach home goes underwater.
00:36:21.760 Yeah, but what if they're in it and it happens while they're asleep?
00:36:26.900 That ain't thinking clearly.
00:36:28.880 Well, if it's a hurricane, you usually get a little warning.
00:36:32.040 Yeah.
00:36:32.580 Yeah.
00:36:32.860 I personally would not live anywhere where I had to pack up because something on the news happened.
00:36:38.160 Right.
00:36:38.400 I don't want to be listening to the news and the news says, you know, people in your zip code, you really ought to get in your car and drive as fast as you can this way.
00:36:48.140 Like, I never want to hear that.
00:36:51.120 I just want to watch the news.
00:36:52.180 No, because, you know, everybody, it'll be the worst traffic jam.
00:36:55.760 You'll die out there.
00:36:57.020 You're not going to make it out of your city.
00:37:01.760 Can you imagine people?
00:37:03.260 They buy that bullshit.
00:37:04.780 Get a motorcycle.
00:37:05.480 But they won't go, let's have a rational conversation.
00:37:08.840 Well, I know people are stupid if that's what you mean.
00:37:12.240 Well, that's one way to put it.
00:37:14.840 No, I like to say irrational.
00:37:16.200 I like to say irrational because then that doesn't, I don't like to hold myself outside that category because being human, I must have as many irrational, you know, opinions that are actually nonsense, but they seem totally reasonable to me.
00:37:32.980 I just don't know what they are.
00:37:34.480 The nature of irrationality is that you always think it's the other person.
00:37:40.320 I don't assume that I'm somehow immune to that.
00:37:44.200 I just don't know where my blind spots are.
00:37:48.200 I know where they are.
00:37:51.200 Well, I'm glad I'm here.
00:37:53.180 I'm glad I'm here to find out.
00:37:54.440 Not yours.
00:37:55.380 Not yours, but in general.
00:37:58.120 Something that pisses you off about another person, it pisses you off because you're seeing yourself and that's God telling you, hey, that's what you hate about yourself, but you don't know it.
00:38:13.440 Take a look in the mirror.
00:38:15.480 You know, that's, yeah, that's a special case of the larger thing, I believe, which is we evaluate everything as a version of ourselves.
00:38:26.220 I mean, every person, every object, it's all just, it's me, but it's broken, so I don't like it.
00:38:33.600 It's like a bad version of me.
00:38:37.140 So that's sort of a deeper thought there.
00:38:39.460 Yeah.
00:38:40.900 Yeah, it's me and everybody else, and they're all the others.
00:38:45.920 And they're all wrong.
00:38:48.020 Yeah.
00:38:48.580 It's really childish, but after you see, do you have children, Scott?
00:38:54.720 Stepkids from prior marriage.
00:38:59.540 Well, you know how people just learn stuff the older they get and the more they grow up.
00:39:04.700 They do, for the most part, I think, get more rational.
00:39:09.040 Do you agree?
00:39:11.060 They have more rational capabilities, definitely, because, you know, after 25, your brain is sort of locked in and starts to work the way it's supposed to.
00:39:20.240 So there's definitely that, and there's definitely more knowledge and more context.
00:39:23.420 I can tell you that I feel like a god at my age.
00:39:28.840 I'll bet you have the same feeling, because something will come up in the news, and I already know the context, because I live there.
00:39:35.860 Yeah.
00:39:36.300 For example, there are people who are worried that the world is going to hell, and maybe they're right, but my context is I was born into that world.
00:39:45.840 I was born into we're going to be nuked by Soviet Union any moment.
00:39:49.760 The ozone layer is gone.
00:39:52.400 All the hippies will no longer work.
00:39:54.560 The economy will crumble because of the long hair.
00:39:57.340 The drugs will destroy us.
00:39:59.260 You know, we're getting closer to that now.
00:40:01.320 But this is, what, my 25th crisis that's going to end the world?
00:40:07.780 Eventually, you get on that the business model of the news is to sell you crises.
00:40:13.880 Once you get that, the next crisis starts looking like the old ones.
00:40:18.220 You're like, oh.
00:40:19.840 Now, I have something I call the Adams Law of slow-moving disasters, which says that throughout history, if you could see a disaster coming from a long way away, such as we're going to run out of food because there's too many people.
00:40:34.960 Nope.
00:40:35.740 Fine.
00:40:36.280 We're going to run out of fuel.
00:40:38.460 Nope.
00:40:38.820 We found a way to frack.
00:40:39.940 We have plenty of fuel.
00:40:41.120 So we're really good if we have noticed.
00:40:42.960 The things we're bad at is, oops, COVID, we're really bad at that.
00:40:48.580 But you give us a five-year warning, or with climate change, a 20-year warning, pretty darn good at that.
00:40:55.380 So climate change worries me almost not at all because I think our ability to fix it and adjust.
00:41:04.060 I think deaths from extreme weather are down 98% over the last few decades.
00:41:11.380 So, yeah, there might be bigger hurricanes.
00:41:13.700 Some places will get hotter.
00:41:15.040 Some places won't.
00:41:16.980 We'll adjust.
00:41:17.980 We always do.
00:41:20.920 Yeah, I think that too.
00:41:22.440 How old are you?
00:41:24.220 I'm 66.
00:41:26.800 I'm 70.
00:41:28.900 So I'm older than you.
00:41:30.340 Ha ha.
00:41:30.800 You know, wasn't there, I realized when I was a kid, I was pretty sure I was supposed to be retired now.
00:41:40.920 Yeah.
00:41:41.440 That didn't work out.
00:41:43.180 No, that didn't work out.
00:41:44.720 And I just read online the other day that Americans of our age group are afraid they can never retire.
00:41:53.880 I'll tell you, I don't understand how people our age will retire unless they had pretty big careers.
00:42:02.840 Like, I do the math, and I think, I don't know how this works.
00:42:08.680 But I also think that you get, I think what you're going to see is you're going to see a lot of people who have a house get a bunch of roommates.
00:42:16.520 And they may be less lonely than they were before.
00:42:21.620 So we have infinite capacity to figure out how to re-engineer and solve stuff.
00:42:27.100 So we've got a loneliness.
00:42:28.760 That's why I think we're going to awaken to our need for each other and bring more love and compassion into it and forget our cruelty and our need to be right.
00:42:40.380 And I think it's going to be a great correction for us.
00:42:43.360 I think COVID was, you know, the quarantine was the beginning of that.
00:42:46.700 We had to stay with our horrible families and work out a lot of our problems.
00:42:52.100 You know?
00:42:53.220 Yeah.
00:42:53.900 Quite a torture.
00:42:55.280 How did that work?
00:42:56.060 How did that work out for everybody?
00:42:57.100 In some ways.
00:42:59.020 Get ready for Las Vegas-style action at BetMGM, the king of online casinos.
00:43:04.660 Enjoy casino games at your fingertips with the same Vegas strip excitement MGM is famous for.
00:43:09.880 When you play classics like MGM Grand Millions or popular games like Blackjack, Baccarat, and Roulette.
00:43:16.720 With our ever-growing library of digital slot games, a large selection of online table games, and signature BetMGM service.
00:43:24.220 There's no better way to bring the excitement and ambience of Las Vegas home to you than with BetMGM Casino.
00:43:30.660 Download the BetMGM Casino app today.
00:43:33.700 BetMGM and GameSense remind you to play responsibly.
00:43:36.240 BetMGM.com for T's and C's.
00:43:37.980 19 plus to wager.
00:43:39.300 Ontario only.
00:43:40.180 Please play responsibly.
00:43:41.460 If you have questions or concerns about your gambling or someone close to you, please contact ConnexOntario at 1-866-531-2600 to speak to an advisor.
00:43:51.620 Free of charge.
00:43:52.640 BetMGM operates pursuant to an operating agreement with iGaming Ontario.
00:43:55.820 Jackpot City is the home of all things casino.
00:43:59.580 We've built a world-class lineup of classic casino games such as Roulette and Blackjack and crafted a virtual range of the best slots, including Atlantean treasures.
00:44:07.880 Everything's online.
00:44:08.960 Everything's ready.
00:44:09.900 Everything's for you.
00:44:11.020 So whenever you're feeling playful, head to Jackpot City and you'll be endlessly entertained.
00:44:15.640 Jackpot City.
00:44:16.620 Casino games perfectly made for you.
00:44:18.780 Proud partner of the Toronto Maple Leafs.
00:44:20.860 Must be 19 plus.
00:44:21.960 Ontario residents only.
00:44:22.900 Please play responsibly.
00:44:24.180 Gambling problem?
00:44:24.800 Visit connexontario.ca.
00:44:28.000 I'm not married anymore.
00:44:33.300 Very few people survived it on that.
00:44:35.200 Well, I think it was a...
00:44:36.760 Yeah, my marriage didn't survive COVID.
00:44:39.920 No, a lot of people didn't.
00:44:41.440 A lot of people, but, you know, I guess it's like if you're able to change or not.
00:44:46.320 Speaking of change, I got hypnotized and it was something else.
00:44:52.540 Hey, could you hypnotize me into quitting smoking?
00:44:57.320 You don't do that kind of shit.
00:44:59.220 Interesting story.
00:45:00.460 No, actually, that was the thing that got me into hypnosis in the first place.
00:45:06.140 My mother was hypnotized when she gave birth to my sister and she didn't have any painkillers.
00:45:13.080 She says, maybe she doesn't remember, but she said no painkillers and no pain.
00:45:17.900 And she was aware the entire time.
00:45:20.700 But the family doctor was the one who hypnotized her in the hospital before birth.
00:45:26.280 But also he tried to get her to quit smoking.
00:45:28.700 Now, that didn't work.
00:45:32.980 And I didn't know that until she died of lung cancer decades later.
00:45:37.460 I actually thought she quit smoking.
00:45:39.820 She told her she did, but she was a secret smoker.
00:45:42.280 Oh, she lied?
00:45:43.720 Yeah, yeah.
00:45:45.420 She'd sneak off and smoke?
00:45:47.720 She was a sneaky smoker.
00:45:50.800 Yeah.
00:45:51.640 So anyway, that's what got me interested in it.
00:45:55.440 But the answer to your specific question is that hypnosis doesn't work super well for losing weight or quitting a habit that you enjoy.
00:46:06.120 It would be really good for, say, curing stage fright because nobody wants the enjoyment of stage fright.
00:46:12.860 But you kind of like the cigarette, right?
00:46:15.300 So how about fear of flying?
00:46:17.820 Fear of flying.
00:46:19.320 Very good because nobody wants to keep that.
00:46:22.060 Or performing better in your sport.
00:46:24.420 You might visualize this.
00:46:26.300 Everybody wants to do better.
00:46:27.580 There's no counter force.
00:46:29.640 But as soon as you've got a counter force, like you really, really like the taste of that food, and you really, really like that cigarette, hypnosis works as well as, but not better than, almost any other technique.
00:46:43.060 I think there are a few that might be some meds now that make a difference.
00:46:47.580 I'm not sure.
00:46:48.020 But in the old days, about 30% of the people would get a benefit of quitting smoking.
00:46:54.020 But as the hypnotist who instructed me taught us, the people who quit smoking would have quit with every other technique as well.
00:47:03.460 And it's the difference between wanting to do something and deciding.
00:47:08.120 The people who want to do it are looking for something.
00:47:11.600 Yeah, they're looking for somebody to do it for them.
00:47:13.900 I want it.
00:47:14.980 Could you give me some willpower somehow with a pill or something?
00:47:19.480 But the people who just say...
00:47:20.440 They always tell...
00:47:21.840 Huh?
00:47:22.460 Yeah, the people who say, I'm done, they're just done.
00:47:26.500 And every technique works because they're done.
00:47:30.220 I always, when I meet Christian people, I say, can you pray for me to quit smoking?
00:47:36.000 And then when I don't, I say, you ain't praying hard enough.
00:47:42.540 But, yeah, I know it's my fault.
00:47:44.740 But, you know, hypnosis, that's kind of programming your brain, isn't it?
00:47:50.440 Yeah, that's essentially, that's the background behind this book.
00:47:55.720 So it's got over 160...
00:47:57.100 What are you telling people to do in that book?
00:47:59.300 Are you telling them to do repetition?
00:48:02.020 What kind of things can you tell us about programming our brain at the end here?
00:48:07.140 I'll give you the simplest one.
00:48:09.340 Some years ago, I used the reframe that alcohol is poison.
00:48:13.420 Now, reframes don't have to be actually technically correct or logical.
00:48:18.240 They just have to work.
00:48:19.240 Now, when I said that, I thought it was just an interesting reframe that was working for me because I don't drink anymore.
00:48:26.760 And a whole bunch of people told me they stopped drinking forever with that one sentence.
00:48:34.800 Because instead of looking at it as a beverage, if you think of it, if you think your alcohol is a beverage, you're going to drink it because, hey, it's dinner.
00:48:42.380 I have a beverage.
00:48:43.840 If you tell yourself it's poison, you don't drink poison with dinner.
00:48:48.340 And although you say to yourself, you say to yourself, but wait, Scott, my logical brain knows it's the same thing before or after.
00:48:56.040 The words I use don't make any difference.
00:48:58.520 But here's what the hypnotists know that the public doesn't.
00:49:01.660 The words are your program.
00:49:02.940 You know what?
00:49:05.460 That is so right.
00:49:06.480 And that's why I wanted to be on TV.
00:49:09.020 That is so right.
00:49:10.160 Words.
00:49:10.760 Everything is strung together with words.
00:49:12.900 I'll tell you this one thing I did because I like doing that.
00:49:16.240 Well, I was in my car and I had quit smoking.
00:49:19.120 So I was real self-righteous because I quit all the time.
00:49:22.340 I was real self-righteous about it.
00:49:23.800 And I saw this young girl in her car next to me smoking.
00:49:27.440 So I rolled down my window.
00:49:28.720 She looked over and she's like, because I was famous then.
00:49:32.560 And she's just staring at me with her cigarette.
00:49:34.580 And I go, you need to stop smoking now.
00:49:40.360 And she goes, oh, my God.
00:49:43.860 Roseanne Barr is telling me that I need to.
00:49:47.040 And I was thinking about stopping smoking.
00:49:49.280 OK.
00:49:50.100 She threw it out the window.
00:49:51.220 She goes, that's it.
00:49:52.100 I'm done.
00:49:52.620 And I like doing that, you know.
00:49:55.700 Then she died of lung cancer.
00:49:57.400 So here's the hypnotist explanation of what happened.
00:50:03.480 And I talk about this in the book as well.
00:50:05.780 Reframe Your Brain.
00:50:07.840 That people need a fake "because."
00:50:10.900 A reason that sounds like a reason but isn't really a reason.
00:50:14.660 When famous Roseanne said you need to quit smoking, that was a fake "because."
00:50:21.640 But did it work?
00:50:23.720 I'll bet it did.
00:50:24.860 If I had to bet on it, she's got a story that she can tell for the rest of her life.
00:50:30.280 And that's a fake reason why she actually quit.
00:50:33.240 You can give yourself fake reasons.
00:50:35.280 And I actually teach you how to do it because your brain is not a rational engine.
00:50:41.060 It's a word engine.
00:50:42.500 If you put the right words in, you're going to get the right actions out.
00:50:46.540 That is so right.
00:50:48.640 And by the way.
00:50:49.220 I know.
00:50:50.040 By the way, the AI that we have now that's based on nothing but language patterns is confirmation of what the hypnotists always knew.
00:50:59.980 That if you simply combine words, it looks like intelligence.
00:51:03.980 So the intelligence we're getting out of AI is sort of like a fake intelligence.
00:51:08.940 It's really just pattern recognition.
00:51:10.620 But that's what we do.
00:51:12.300 Humans are no different.
00:51:13.880 So I think I have an opinion.
00:51:15.980 But what I really have is an understanding that these sets of words fit together in a pattern.
00:51:21.820 That is not thinking or reasoning.
00:51:24.140 It's simply pattern recognition.
00:51:25.720 It's like, oh, under this situation, I produce these words because that's what most people do.
00:51:31.040 They're associated with the topic.
00:51:33.120 And so we think we're thinking, but almost never.
00:51:36.400 Almost never.
00:51:37.000 It's like wizardry, isn't it?
00:51:40.660 All it is is pattern recognition.
00:51:42.520 But if you don't know that's all it is, it's like a magic trick.
00:51:45.760 So yes, to your point.
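[A minimal illustration of the pattern-recognition point above, not from the episode: a toy next-word predictor in Python, trained on a made-up one-line corpus. It has no understanding at all, only counts of which word tends to follow which, yet its output can read as fluent speech.]

    import random
    from collections import defaultdict

    # made-up toy corpus; the point is the mechanism, not the data
    corpus = ("the planet is warming and the planet is changing "
              "and the climate is warming").split()

    # record which words have followed each word
    follows = defaultdict(list)
    for prev, nxt in zip(corpus, corpus[1:]):
        follows[prev].append(nxt)

    # generate "speech" by repeatedly picking a word that has followed the last one
    random.seed(0)
    word = "the"
    out = [word]
    for _ in range(8):
        word = random.choice(follows[word])
        out.append(word)
    print(" ".join(out))  # fluent-looking, but pure pattern recognition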
00:51:47.400 Do you use it when you tweet, Scott?
00:51:50.800 Sometimes.
00:51:52.840 Using your hypnosis knowledge?
00:51:52.840 Yes.
00:51:53.580 Yeah.
00:51:53.900 Yeah.
00:51:54.160 So it's more like an understanding.
00:51:58.280 So there's technique.
00:51:59.520 But if you've lived it and breathed it and worked with it for years, I write it without thinking about it.
00:52:07.840 So it's almost like typing if you're a touch typist.
00:52:11.140 I don't think to myself, I'll say this in a persuasive way; it's just the only way I know how to write at this point.
00:52:17.260 So it's just automatic.
00:52:19.260 What about persuasion?
00:52:21.360 I wanted to talk to you about persuasion.
00:52:23.440 Persuasion is the umbrella under which everything from marketing to sales, propaganda, hypnosis, they all fall under there and they all have common elements.
00:52:35.240 So that's my larger field is persuasion.
00:52:39.940 And hypnosis would just be one thing I learned.
00:52:42.600 But, you know, all forms of communication have an element of persuasion in them if you're doing it right.
00:52:49.360 So I've simply learned those little techniques and incorporated them in my normal presentation.
00:52:56.080 For example, I'll give you an example.
00:52:57.260 Now, Vivek Ramaswamy was on an interview recently talking about climate change.
00:53:03.180 He had a different view than the host.
00:53:06.180 But what he did before he got into his differences is he said that he agreed that the climate is warming.
00:53:13.020 Now, I'm not getting into a discussion about climate.
00:53:15.580 I'm just giving you an example of persuasion.
00:53:17.720 He said, I agree the planet is warming and the humans caused it.
00:53:21.080 So by agreeing with her first, he's got you on his side.
00:53:26.500 That's called pacing.
00:53:27.460 Right.
00:53:28.040 Then he can lean.
00:53:29.080 It's called what?
00:53:30.160 Pacing.
00:53:30.940 You pace the person you're trying to persuade by agreeing with them or matching them in some way.
00:53:36.860 For example, if I were pacing you, I would have, you know, maybe dressed the same way you're dressing or the same style.
00:53:45.160 I might have, you know, I might have leaned the way you're leaning.
00:53:48.000 I might have used the words you use.
00:53:50.220 So, for example, if you were a person who talked in military terms, it's usually guys.
00:53:55.180 But if they said stuff like, well, we're going to take that hill tomorrow and, you know, somebody jumped on the hand grenade.
00:54:01.200 If I were pacing you, then I would say, yeah, you've got to get the troops to be marching in one direction.
00:54:06.900 And, you know, you can't go to war without ammunition.
00:54:09.680 And then they would, without realizing it, they would say, you know, you and I are basically the same person.
00:54:15.740 Because the things coming out of your mouth, they're in my head.
00:54:18.460 And as I'm thinking them, you're saying them, or you're saying what I could have thought in that situation.
00:54:23.420 So the next thing I say, you're convinced that we're the same person.
00:54:29.900 Now, that's an exaggeration.
00:54:31.840 But I'm saying that if your friend says something's true, you're more likely to agree than an enemy.
00:54:37.320 Because the friend is a version of you.
00:54:41.260 That's why they're a friend.
00:54:42.880 An enemy is the opposite of you.
00:54:44.640 So you, like, reject that.
00:54:46.260 So you become the person you're trying to influence long enough for them to feel comfortable with your message.
00:54:54.320 So that's just one thing.
00:54:55.140 Well, is that just for sales, or for human compassion or communication?
00:55:03.520 That's everything from personal relationships to marketing to sales.
00:55:09.580 Everything.
00:55:10.520 So persuasion is in everything we do.
00:55:13.060 You can either be good at it or bad at it.
00:55:15.320 But you never are not doing it.
00:55:17.720 Right?
00:55:17.880 We're all selling ourselves all the time.
00:55:21.060 Yeah, I can get that with my grandkids, like, trying to persuade them to choose the right thing.
00:55:27.840 I can, you know, I get that with the, hey, buddy, I hate your parents, too, and I'm on your side.
00:55:36.940 But, yeah, it is kind of that.
00:55:38.480 Well, one of the reframes in the book is how to talk to a teenager.
00:55:42.880 And one of the tricks that I give, and by the way, there's no good way to talk to a teenager.
00:55:46.620 So if you think you're going to, like, you know, you've got the silver bullet or something, no, no.
00:55:51.800 You can do a little bit better.
00:55:53.260 That's all.
00:55:53.820 That's all you can do.
00:57:23.400 But one of the reframes is one I used with my stepson, who's deceased now.
00:57:29.220 I would tell him that it's two against one whenever I disagreed with him
00:57:34.580 because I'd say, look, here's the deal.
00:57:36.820 My job is to report to your future self.
00:57:40.420 I don't report to you.
00:57:41.760 But your future self, when you're an adult, you're going to hold me responsible for this situation.
00:57:48.220 That's right.
00:57:48.980 That future you and me are on the same side.
00:57:52.240 And that future you is going to be pissed if I let you do this.
00:57:56.240 Because that future you is going to look back and say, why did you let me do that?
00:57:58.740 Like I didn't learn any discipline.
00:58:00.560 So the young kid doesn't want to understand that and doesn't want to agree that you're on the same side as their future self.
00:58:10.660 But they also don't have much to say about it.
00:58:14.460 Right?
00:58:14.620 It's hard to push back on it.
00:58:16.560 So that's one reframe.
00:58:18.160 I call that accessing the future to work for you in the now.
00:58:24.040 Yeah, that would be a problem.
00:58:25.040 Everything I think about is how to manipulate children to get them out of this morass of mind control they're under.
00:58:33.360 I feel bad for them.
00:58:35.080 They can't even think.
00:58:36.780 I'm in agony over that.
00:58:39.160 Well, let me give you the best optimistic take on the children of tomorrow.
00:58:45.260 Okay.
00:58:45.780 I don't think it's ever been true that more than 10% of the kids made a difference anyway.
00:58:51.740 Meaning that the ones who built the startups and became Apple and like that.
00:58:57.620 It's not many people.
00:58:59.660 And those people probably are unfazed by the nonsense.
00:59:05.160 In other words, if there's a Steve Jobs born today, he's still Steve Jobs.
00:59:10.140 He still emerges.
00:59:11.920 There's nothing that can stop him.
00:59:13.280 Nothing would have stopped Bill Gates.
00:59:16.240 Right.
00:59:16.520 Maybe it wasn't Microsoft.
00:59:18.040 But he had the goods.
00:59:19.780 Right.
00:59:19.960 He had all the tools.
00:59:21.080 So he was going to do something no matter what.
00:59:24.120 And I think that that doesn't change.
00:59:26.380 I think from the 60s, again, going back to my youth, you know, we saw the kids seem to be tuning out and dropping out.
00:59:35.000 And it looked like the youth had lost all their interest in hard work and all those things that kept the country together.
00:59:41.680 And then the country just kept getting stronger.
00:59:43.740 And every generation, you know, every 10 years, we're like, oh, this generation, this, you know, generation X, Y, whatever, they're all bad.
00:59:52.940 And then it never really happens because every generation produces their 10 percent who do all the important stuff.
01:00:01.000 The rest are working.
01:00:02.720 They're contributing.
01:00:03.260 But their individual difference is not that important to the whole.
01:00:07.740 So as long as we're producing superstars, and I don't think you could turn it off.
01:00:13.300 I mean, it's just the luck of the genetics.
01:00:16.380 Kid comes out.
01:00:17.680 They're just a superstar.
01:00:18.720 I don't think there's anything in our society that can hold back a superstar.
01:00:23.920 I don't think that.
01:00:24.940 Well, I don't either.
01:00:26.380 But I just, you know, wish that the regular human that ain't a superstar could make better decisions for themselves.
01:00:37.120 So they aren't like, you know, so they have more sovereignty in their lives and communities.
01:00:45.340 I get tired of seeing people used as social experiments for freaks.
01:00:52.420 Yeah, you know, but do you think it's getting worse?
01:00:55.320 Or are we just more aware?
01:00:57.480 I don't.
01:00:58.120 You don't think we're just more aware of it?
01:01:00.480 I think we're more aware of it, but I do think it's getting worse.
01:01:05.000 But, yeah, it's more.
01:01:06.500 There's more of them now.
01:01:08.480 There's more people at their disposal to do their bidding now.
01:01:13.620 And they take more and more freedom from them and give them less and less education.
01:01:19.440 Yeah, I think it's way different.
01:01:21.140 Maybe it's not.
01:01:22.280 Maybe if you go back to feudalist times, it ain't different from that.
01:01:26.400 It's like a return to feudalism.
01:01:29.760 You know, the thing about the future is that it's fundamentally unpredictable.
01:01:35.460 And there are so many things that are boiling around right now that could change just completely what it looks like five years from now.
01:01:45.860 I mean, if you add AI to the fact that they may have this superconductivity working, I'm not totally convinced.
01:01:52.640 So, you know, by the time people see this, maybe it's debunked.
01:01:56.220 But if that works, superconductivity plus AI plus quantum computers plus fusion energy forever, these are all the things that would be enabled by these technologies.
01:02:07.980 Everything's different.
01:02:09.540 We could get to the point where energy is close to free in, say, 20 years.
01:02:14.140 And what happens when energy is close to free?
01:02:17.520 Because every economy that succeeds does it on the back of energy, right?
01:02:22.360 If you don't have energy, you're not going anywhere.
01:02:24.500 And if you do, you're almost certainly going to be fine.
01:02:27.760 Every country that's got a lot of energy seems to have a plus.
01:02:31.940 So, I don't know.
01:02:33.780 The future is fundamentally unpredictable, but there's a whole bunch of good stuff happening.
01:02:39.740 Yeah, there is.
01:02:40.480 That is at least as powerful as the bad stuff.
01:02:42.540 I think it's more powerful than the bad stuff.
01:02:46.320 And that's why I say, I think because of all the good that's incoming because of the, I mean, of course, we could use technology to ruin everything, which we are good at.
01:02:57.460 But I think we might get a chance to better ourselves and improve our situation and therefore think more clearly.
01:03:07.700 I think that's coming.
01:03:09.000 You know, one of the things that I've noticed because I have a background in economics, so my education is economics and then I got an MBA.
01:03:20.360 And what they teach you is how to compare things properly, so that you're comparing the actual options instead of some magical idea in your mind of how things should be.
01:03:29.920 Right.
01:03:30.920 And what I've observed is that when I meet people who have the same background as me, we usually agree right away.
01:03:38.600 Or if we don't, there's an assumption that we can see, oh, you believe that'll happen, but I have a different assumption.
01:03:45.340 So, you end up agreeing or getting really close to it if you've learned how to make decisions.
01:03:49.740 And that's a field that teaches you specifically, do this or do this, how do you analyze these?
01:03:56.840 So, when I talk to what I'll call normies, you know, regular people who might have even a college degree, could be in math, could be in a variety of things.
01:04:07.060 But if it's not in a decision-making field, you believe you can do it, but you can't.
01:04:15.160 And I had that experience when I became a cartoonist because I'm not very good at drawing.
01:04:20.300 People told me early in my career, you know, there's nothing you're doing that I couldn't do.
01:04:25.540 Like, I could write that.
01:04:25.980 No, they told me that.
01:04:27.120 That's right.
01:04:27.720 I could write that.
01:04:28.840 I could do that comic.
01:04:30.040 But, of course, they never did.
01:04:32.160 Right.
01:04:32.340 So, people sometimes look at something like decision-making, think it's simple, and assume it's something any ordinary person could do.
01:04:41.540 I could look at the costs.
01:04:43.000 I could look at the benefits.
01:04:44.580 Anybody can do it.
01:04:45.980 But, indeed, it's a learned skill.
01:04:49.580 It is a learned skill.
01:04:51.520 My native intelligence, which I like to think is pretty good, I don't think would help me without the actual training and the discipline to always make sure I'm looking at the base case,
01:05:01.580 always looking at the do-nothing, I know what a sunk cost is, you know, that sort of thing.
01:05:08.580 So, for example, when we look at the economic models or the prediction models of climate change, as a trained person in the field of decision-making, it looks like an absurdity to me.
01:05:22.200 Because, first of all, there are too many variables.
01:05:24.100 Second of all, there are hundreds of models.
01:05:29.060 And then, as new information comes in, they throw away some of the models that didn't work and tell you that these models were predictive.
01:05:37.200 But they weren't predictive.
01:05:39.120 There were just hundreds of them.
01:05:40.860 And some of them had to be close.
01:05:42.480 So, as they're throwing away the ones that fail, just by survivorship, there's something left.
01:05:49.000 There always was going to be something that was close.
01:05:51.760 It doesn't mean it predicted.
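[A minimal sketch of the survivorship effect Scott describes, with made-up numbers: generate a few hundred "models" that are pure random guesses, discard the misses after the fact, and the survivors look predictive even though nothing was predicted.]

    import random

    random.seed(1)

    true_value = 1.5  # the quantity the models were supposed to predict (hypothetical units)
    models = [random.uniform(0.0, 5.0) for _ in range(300)]  # 300 pure guesses

    # keep only the ones that happened to land near the truth
    survivors = [m for m in models if abs(m - true_value) < 0.1]
    print(f"{len(survivors)} of {len(models)} random guesses look 'predictive' in hindsight")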
01:05:53.660 Now, that's obvious to me because, you know, I have some background.
01:05:57.220 But that would totally be not obvious to an ordinary consumer of news.
01:06:02.220 They would say, are you telling me that all the scientists say this is valid?
01:06:05.340 And they'll look at you and say, yes, all the scientists.
01:06:09.700 Well, 98%, 2% are crazy, but 98% say this is a valid, smart thing.
01:06:16.420 It never was.
01:06:17.980 Not only that, the models are not science.
01:06:20.840 They're based on humans making assumptions that go into them.
01:06:24.140 And that drives the result.
01:06:25.460 I know this because that was my job.
01:06:27.960 I used to work at a big corporation to predict, you know, our economic future.
01:06:32.700 And I would have to make assumptions.
01:06:34.280 And it's the assumptions that drive the model, not the data.
01:06:38.240 It was just whatever I decided was sort of true-ish I'd put in there.
01:06:42.820 Well, that's how they do it: they get in a room and decide how much money's worth.
01:06:47.440 It's all fantasy.
01:06:49.260 Well, there are definitely people deciding what reality is, the reality that we see.
01:06:55.000 So, the point being that if you're looking at, let's say, a 50-year climate model,
01:07:00.560 which part of that predicted that we would have superconductivity this year?
01:07:05.800 None.
01:07:07.120 Which part of the model predicted that Sam Altman's other startup would make fusion work,
01:07:14.900 at least in the lab?
01:07:16.940 None.
01:07:18.020 Right?
01:07:18.140 So, the biggest variables are all positive, you know, in the sense that there are technologies
01:07:24.140 coming online.
01:07:25.600 The models don't account for that.
01:07:27.700 So, the biggest part of the future, these enormous social scientific breakthroughs,
01:07:34.840 that's the biggest part of the future.
01:07:37.100 They're not in the models.
01:07:38.140 Because no model can predict that.
01:07:41.720 That's right.
01:07:42.660 So, I look at it and say, okay, I'm pretty sure we can get on top of all these problems.
01:07:48.100 Maybe the sea level rises.
01:07:50.020 We can fix that.
01:07:51.720 You know, just move back.
01:07:53.140 Move back, build a dike, do something.
01:07:56.180 Build a boat.
01:07:58.720 Live on a boat.
01:08:00.360 We now have desalinization, especially as energy costs come down.
01:08:07.980 We can live on the ocean.
01:08:09.200 You can build a city on the ocean.
01:08:11.160 And that's what, maybe that's what 30 years from now looks like.
01:08:14.720 Because there is talk that it's now practical.
01:08:18.180 And then if you're on the ocean, maybe you move your city based on the season or the hurricane
01:08:22.480 pattern.
01:08:24.480 Right.
01:08:25.240 You can easily imagine a future where hurricanes are irrelevant.
01:08:29.460 You know what the...
01:08:30.220 Yes, absolutely.
01:08:31.680 I'll tell you what the...
01:08:32.920 It has to start with thinking, words, imagination before it can, you know, manifest in the world.
01:08:39.080 But I think that's what they're trying to lock down is our ability to, you know, well, I
01:08:44.100 mean, our ability.
01:08:45.940 Well, our desire to create and imagine and think.
01:08:50.860 They're trying to lock all that down.
01:08:52.560 And question.
01:08:53.800 They're trying to get rid of that out of us.
01:08:56.000 Yeah, the other thing that nobody can predict is what somebody thinks of that nobody thought
01:09:01.560 of before.
01:09:02.340 Here's one of my favorite examples.
01:09:04.640 So climate change and hurricanes, you know, we've got some predicted danger there.
01:09:09.720 But most of our hurricanes, at least on the Atlantic side, they form because the desert
01:09:15.460 in northern Africa is super hot at some season.
01:09:18.720 And that causes the sequence of events.
01:09:23.440 But suppose we got our desalinization better.
01:09:27.940 We use livestock, which is another way to build greenery on deserts.
01:09:32.800 You just let the cows wander around and poop on stuff.
01:09:36.620 And the next thing you know, your vegetation's moved 10 feet into the poop, and you just let them keep wandering around.
01:09:43.060 10 years later, you've got a forest just from cows wandering around pooping.
01:09:46.660 So we have the technology to turn a super hot place into a slightly cooler one if we wanted
01:09:53.880 to.
01:09:54.920 And that would actually change the nature of hurricanes.
01:09:58.120 Well, that's what I'm saying.
01:09:59.580 We have the ability to make this place way better.
01:10:03.180 And I do think that a lot of us, maybe that 10% you were talking about before, but the 10%
01:10:10.460 that matter, a lot of us know that, that we can improve things.
01:10:14.480 And, you know, I think we'll make it happen.
01:10:18.280 I really do.
01:10:19.220 I, I just feel it.
01:10:21.700 I see evidence of it.
01:10:24.120 I think our smartest people are smarter than they've ever been.
01:10:28.640 And there's no way to quantify the impact of brilliance, of literal genius on our future.
01:10:38.160 Because it's the geniuses that are changing stuff.
01:10:41.020 And you don't see that coming until it comes.
01:10:43.380 You know, I didn't see superconductivity at room temperature a month ago, but somebody
01:10:49.340 got it.
01:10:49.980 Maybe.
01:10:50.640 I hope so.
01:10:51.560 I interviewed a guy in 1985 that was making a car work on V8 cans full of water with some
01:11:06.920 infrastructure around it.
01:11:09.540 I'm going to say that was a fraud.
01:11:12.780 No, it worked.
01:11:14.140 I wrote about it for a magazine.
01:11:15.860 I mean, he had a generator and the whole thing, and I can't remember exactly how, but that
01:11:20.820 was how he made the fuel.
01:11:22.640 It was cans with full water in a tank.
01:11:27.200 And then he put it in the car.
01:11:28.500 They were charged up.
01:11:29.840 Ben Franklin used to do that.
01:11:31.080 He played with cans of water, and that's how he developed the battery or helped develop
01:11:34.720 it.
01:11:35.400 So you might be on to something.
01:11:36.540 Ben Franklin's battery.
01:11:37.900 Yeah.
01:11:38.360 I don't know.
01:11:38.900 I can't remember enough to talk about it, but I did see it with my own eyes.
01:11:43.140 Unless it was a magic trick.
01:11:44.600 I don't know.
01:11:45.860 But do you believe that we're going to have NESARA and GESARA, with your MBA in economics?
01:11:51.720 That's what everybody's saying is coming.
01:11:53.480 Wait, what are we going to have?
01:11:56.040 GESARA.
01:11:57.000 GESARA.
01:11:57.720 G-E-S-A-R-A.
01:12:00.780 Was that a virus or something?
01:12:02.960 What is that?
01:12:03.520 No.
01:12:04.400 It's a redistribution of gold.
01:12:08.160 Oh, I'm not even up to date on that story.
01:12:11.360 I don't know.
01:12:11.600 Oh, okay.
01:12:12.200 Well, that's what they say, you know, gold-backed currency that America will be having that soon.
01:12:19.240 Don't you think it's just going to all turn digital?
01:12:21.740 It has to, doesn't it?
01:12:23.380 Yeah.
01:12:23.600 No, because they're trying.
01:12:25.720 I think that's the reset that they want is that we move to digital.
01:12:30.460 But I think that there's another thing coming.
01:12:35.420 But let me ask you this.
01:12:36.780 Can you even imagine, let's say, put your imagination 20 years in the future.
01:12:41.380 Is there any way in 20 years you're paying for things with pieces of paper that you had in your wallet?
01:12:47.440 In 20 years?
01:12:48.020 No, there's no way.
01:12:49.500 No way.
01:12:50.020 No.
01:12:50.440 So it's all going to be digital.
01:12:52.380 I would say that fighting it.
01:12:53.860 No, I don't think so.
01:12:54.360 I think we'll be taking a chicken to the doctor to heal us.
01:13:00.260 Barter.
01:13:02.160 It's probably what's going to happen.
01:13:03.960 When we live in the feudal state of the oligarchy.
01:13:10.980 Then you'll be lonely because your chicken was your only friend.
01:13:13.820 So you've got to be careful with it.
01:13:16.900 No, but I think we're going to have a better system.
01:13:19.940 I think we're going to come up with something better that actually serves the many instead of the few at the expense of the many.
01:13:28.060 You know?
01:13:28.800 I do.
01:13:29.640 I think we are.
01:13:30.420 I think we're dangerously close to not needing people to work.
01:13:35.600 That's one of my big concerns.
01:13:37.700 Because once you get free energy, you can make your robots build more robots.
01:13:44.760 You can make them mine the ore.
01:13:47.640 And pretty soon you've got a completely, you know, self...
01:13:50.840 You could build an economy that didn't require people.
01:13:54.380 Except for maybe giving some orders to some robots now and then.
01:13:57.680 Because the robots can build robots.
01:14:00.420 The energy will come.
01:14:02.060 They can find, they can mine.
01:14:04.280 Basically, they can just do anything we want.
01:14:07.800 Well, what are people going to do in that world?
01:14:12.280 Well, some of them are going to merge with the robots, according to Elon Musk.
01:14:17.140 And I agree with that.
01:14:18.120 I think we'll have chips in our heads for a long time in human history.
01:14:23.480 Well, actually, it'll be a short time in the universe.
01:14:26.060 But in human history, I think there's going to be a period where we're legitimately cyborgs in every sense.
01:14:32.740 I wonder about that, too.
01:14:35.460 I don't wonder.
01:14:36.460 I think that's guaranteed.
01:14:38.900 Really?
01:14:40.220 Oh, yeah.
01:14:41.180 That's guaranteed.
01:14:41.780 Yeah.
01:14:43.340 Some percentage of the humans, not all of them.
01:14:45.940 I mean, there will be people who will stay natural.
01:14:48.320 But some percentage of people are going to put the chip in.
01:14:52.080 And they're going to be able to directly interfere.
01:14:54.640 They're already paying it.
01:14:55.700 I think that's true.
01:14:58.740 Yeah.
01:14:59.060 Yeah, that's right.
01:14:59.580 And then they'll have a cyborg girlfriend that looks like Pamela Anderson, the Stepford wife they've always dreamed of.
01:15:08.500 Yeah.
01:15:08.880 You know?
01:15:09.340 And the women will have Rock Hudson or whoever it is that's...
01:15:14.260 Probably not Rock Hudson, but that's another story.
01:15:16.840 Yeah.
01:15:17.280 Well, it's programmable, so it could be.
01:15:18.640 No, I can't think of who's hot, but you know what I mean.
01:15:21.200 Well, that's right.
01:15:21.940 He's gay.
01:15:23.260 But, you know, you could program Rock Hudson.
01:15:24.700 Well, you know, like the perfect man, and then they'll just program it to say, you're right, dear.
01:15:30.180 I love you.
01:15:31.500 What else can I do for you?
01:15:34.020 I feel like the sex bots will be programmable.
01:15:37.140 So if you get tired of yours, you can, like, change it to, all right, now you're gay.
01:15:42.440 I'm just going to try this out for a while.
01:15:45.720 Yeah, exactly.
01:15:46.960 If this doesn't work, we'll update your software again until we get something we like.
01:15:52.200 I think that that's the way it has to go.
01:15:54.700 The way it's headed.
01:15:56.700 You know, I think that at least 30% of the male public will prefer digital girlfriends.
01:16:06.240 They do, too.
01:16:07.380 Because they're...
01:16:08.160 That's a low number, actually.
01:16:10.080 I think that's close.
01:16:10.360 It's probably closer to 50.
01:16:11.860 Yeah.
01:16:13.020 Okay, I was going to say 50, so I'm not going to argue with that at all.
01:16:15.940 80.
01:16:15.960 Let's go 85.
01:16:17.940 But why?
01:16:19.360 But you already see...
01:16:20.280 Because women are terrible.
01:16:21.520 But you see it in dating apps, right?
01:16:23.180 The dating apps have made the top 1% of guys golden, so they're getting all the women.
01:16:29.160 And the rest of the guys have no women.
01:16:31.640 And then when those women have been run through properly, they try to get married, and the other
01:16:37.280 guys are like, maybe not.
01:16:39.720 Maybe not.
01:16:40.600 You have a sex robot.
01:16:41.540 You missed out.
01:16:42.880 I got an alternative right over here.
01:16:44.960 Yeah.
01:16:45.340 No, that is what's happening.
01:16:46.520 The funniest thing about the sex robots, which are certainly going to be better and better
01:16:51.580 every year, is that women were abusing nerds for 100 years, and they didn't see there would
01:16:59.420 be a blowback.
01:17:01.260 The nerds actually replaced women.
01:17:04.340 It's like...
01:17:04.980 What, Dan?
01:17:06.340 Nerds.
01:17:06.860 Yeah, the nerds.
01:17:08.520 The nerds.
01:17:09.140 The technology people.
01:17:11.200 They were like, all right, if you're going to abuse us for 100 years, you just wait.
01:17:18.160 I see.
01:17:19.240 I can see the genius behind that.
01:17:22.020 Well, it might be better for men.
01:17:24.180 It might be better for women, too, to just have an agreeable partner that you didn't have
01:17:28.680 to take out all of your personal PTSD with constantly and call that a relationship.
01:17:35.520 Well, my idea doesn't...
01:17:36.980 It looks like it's not going to catch on.
01:17:38.520 My idea was that your spouse, the person you married, would be the only person you can't
01:17:44.820 have sex with.
01:17:49.220 Everybody else is fair game.
01:17:52.380 Didn't go over.
01:17:53.600 Well, when you get married, you basically show your worst side to your spouse.
01:17:58.520 Because you can't really hide it at that point.
01:18:00.940 So, why is the only person you can have sex with the only person who showed you all their
01:18:05.060 flaws?
01:18:06.920 Right.
01:18:08.440 I had a joke about that.
01:18:10.720 I had like, well, you know, you don't want to have sex because you know them.
01:18:19.860 Yeah, because where's the mystery in seeing somebody on the toilet?
01:18:23.860 I mean, you know, the romance.
01:18:25.580 You just say, what?
01:18:27.160 It's just over.
01:18:28.520 I think people will probably stop having sex because, you know, it's deadly and, you know, it's not going to add up.
01:18:38.480 If the robots are stepping up, I think you're right.
01:18:42.160 They got to step up though.
01:18:43.940 Well, the women have been using the robots since they came out, what, back in the 70s, all the vibrators.
01:18:50.260 And so, you know, I don't know.
01:18:52.820 It kind of makes sense.
01:18:54.520 People are pretty much chronic masturbators anyway.
01:18:57.680 They don't really want to love nobody or talk to them.
01:19:01.360 So here's a topic I can only say here, which is, I'm pretty sure the womanizer will destroy
01:19:07.920 civilization.
01:19:09.420 Do you know the womanizer?
01:19:10.380 It's a special kind of sex toy for women that doesn't use vibration.
01:19:14.040 Oh, no, I thought you were talking about like a Lothario, but okay.
01:19:16.620 Oh, man.
01:19:17.080 I thought you were talking about a man.
01:19:18.020 No, he's talking about a vibrator.
01:19:19.420 I've never seen it.
01:19:20.380 I'm going to order it right now for Hannah.
01:19:22.000 So it's a product that instead of vibration, which, you know, was pretty good, it, I won't
01:19:29.020 give you more description, but it does some kind of a sucking clitoral thing.
01:19:33.860 Oh, yeah, yeah.
01:19:35.200 Yeah.
01:19:35.700 But, but I'm told, that face.
01:19:40.480 People make me sick.
01:19:43.400 God.
01:19:44.800 Well, I'm interested.
01:19:46.880 All right.
01:19:47.360 So I'm told that it's like a whole level above whatever existed before, such that it's so
01:19:55.120 good that, you know, it's becoming a replacement for actual men.
01:19:58.620 And I don't mean that as hyperbole.
01:20:01.580 You know, I mean, that's like 5% of people just, like, backing out and saying, you know.
01:20:07.800 Maybe people could just be friends, which would be good.
01:20:12.100 You know, you'd be friends with the opposite sex rather than, you know, rushing to them to
01:20:16.720 solve all your daddy issues and your mommy problems and using them forever.
01:20:23.180 Just be friends and take care of your own business.
01:20:26.160 For God's sake, the world would probably be a way better place.
01:20:29.120 This is the most optimistic I've been in a long time with you two.
01:20:32.100 I just want you to know, I feel the first time I feel good about the future.
01:20:34.920 I'm not even trying to be funny.
01:20:36.380 Oh, the future is going to be fine.
01:20:39.100 This has been a great, even though it's all about the future of dildos and sex robots and
01:20:43.280 stuff, it looks good.
01:20:45.300 I'm excited.
01:20:46.080 What the hell?
01:20:48.740 I like the big three.
01:20:50.740 I like to talk about, you know, AI robots and dildos.
01:20:54.920 If you've covered that field, everything else is sort of ancillary.
01:21:01.140 You know what I mean?
01:21:02.500 Yeah.
01:21:02.880 But we talked about racism, the media, persuasion, programming your brain.
01:21:10.760 You know, most people believe what most people believe and don't think.
01:21:15.660 We talked about the indictment.
01:21:17.720 Come on, man.
01:21:18.840 We've had a great conversation.
01:21:21.840 I so enjoyed it.
01:21:23.880 You are so smart, so interesting.
01:21:26.920 And I thank you so much for being my guest today.
01:21:29.500 You know, I am, I think I may have mentioned this, but you don't know how many podcasts
01:21:34.440 I turned down since I got canceled.
01:21:37.920 You know, I did a few up front just to get my message out.
01:21:41.120 And then I went silent.
01:21:42.620 But as soon as I heard that you were interested, I said to myself, I got to talk to the other
01:21:50.100 disgraced artist.
01:21:53.100 And by the way, maybe you could use this too.
01:21:55.360 Um, I like to call myself not canceled but disgraced, which I kind of like.
01:22:02.200 I kind of like being disgraced.
01:22:04.100 Uh, you know, there's some jobs where it can add just a little bit of flavor.
01:22:11.260 Right.
01:22:11.900 So people don't know that I've continued doing the comic, but it's by subscription on Twitter and on the Locals platform.
01:22:19.580 So you can see it there by subscription, but of course it became much wilder, you know, once all the, uh, the censorship was off.
01:22:27.800 So I'm just having the most fun.
01:22:30.180 I mean, the, the series I'm going to work on, I just, I'm just going to start because
01:22:34.000 I'm going to have a Dilbert looking at his 23 and me and his DNA.
01:22:37.840 He starts looking at, Oh, I did that.
01:22:40.200 He's going to start looking at his fourth cousins.
01:22:41.940 They have like 0.002%.
01:22:44.640 He's going to be thinking, I could use this as a dating app, but I got to stay under 0.004
01:22:52.880 because that's just crazy.
01:22:54.080 Yeah.
01:22:54.360 That's like dark Dilbert.
01:22:55.900 Yeah.
01:22:56.240 But 0.004.
01:22:57.520 You should hear about my family.
01:22:59.400 The Jews are all married to their cousins in my family, all the way through Europe.
01:23:04.620 We all married.
01:23:05.360 That's why we're so neurotic.
01:23:07.000 I figured it out.
01:23:07.880 Inbreeding.
01:23:08.220 Because we're way too finely tuned.
01:23:11.100 If you know what I mean?
01:23:12.120 I remember my mom set me up with a guy when I was sixteen.
01:23:16.040 Well, I guess I was 16.
01:23:17.780 And the first thing that came in my head when she said she had a nice Jewish boy for me to
01:23:22.140 date, I guess I was 15.
01:23:24.540 So, my first question, and I guess this isn't normal for non-Jewish people:
01:23:30.300 Are we related to him?
01:23:34.800 And, and it took her too long to answer.
01:23:37.500 She goes like this.
01:23:40.500 Wow.
01:23:40.980 So, so I knew, you know, I did not want to carry that on.
01:23:46.500 Yeah.
01:23:46.660 You don't want to get into the, you don't want to get into the conversation of a third cousin.
01:23:52.700 Third cousin.
01:23:57.100 Fourth cousin, definitely okay.
01:24:00.420 Third cousin.
01:24:01.780 That's my limit.
01:24:02.760 I'm drawing the limit there.
01:24:04.500 Yeah.
01:24:04.700 Do you believe in God before we go?
01:24:08.500 I believe in the simulator.
01:24:10.100 Or is that too personal?
01:24:11.000 I believe that we are a simulated world, much like Elon Musk.
01:24:16.980 And the, the quick argument for that is we already have the technology to build simulations that
01:24:23.240 would act like they think they're real.
01:24:25.240 We haven't done it, but we have all the tools to do that.
01:24:28.260 Just make an NPC with AI and it thinks it's alive.
01:24:32.160 So the argument is if it's possible, what are the odds that we're the first ones to do it versus some other entity already did it.
01:24:41.580 And we're one of their million simulations running on some computer.
01:24:45.220 And maybe we think we lived a hundred years of our life, but maybe that was only a microsecond on somebody's computer.
01:24:52.940 So, you know, they may be, you know, reaching their finger for the off button and we go away forever.
01:24:58.320 But in the time it takes their finger to get there, maybe we live hundreds of years, you know, in our world.
01:25:04.380 So the possibility is that, just from, you know, statistical likelihood, what are the odds that after 14 billion years of this universe there's exactly one human-like group of people who can build a simulation where the simulated people think they're real?
01:25:23.900 Very unlikely.
01:25:25.500 Far more likely it's happened a lot of times, like millions of times, billions, even trillions.
01:25:31.560 And under that scenario, you have maybe a trillion to one odds that you're a digital creation, which would give you a god.
01:25:41.560 Your god would be the programmer who built your little simulation.
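[A back-of-the-envelope version of the odds Scott is gesturing at, with assumed counts for illustration only: if base realities eventually run vast numbers of observer-filled simulations, almost every observer is simulated.]

    # assumed: one base civilization running a trillion simulations with observers,
    # matching the trillion-to-one figure above; the counts are illustrative, not estimates
    base_civilizations = 1
    simulations_per_civilization = 10**12

    real_observers = base_civilizations
    simulated_observers = base_civilizations * simulations_per_civilization

    # probability that a randomly chosen observer is one of the simulated ones
    p_simulated = simulated_observers / (real_observers + simulated_observers)
    print(f"P(simulated) = {p_simulated}")  # effectively 1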
01:25:44.760 Intelligent design.
01:25:46.080 It would be intelligent design, yeah.
01:25:47.980 The author.
01:25:49.900 Yeah, the author.
01:25:51.200 That's what I think.
01:25:52.580 We're living in the author's mind.
01:25:54.580 Part of the reason that I go for this, and I speculate that Musk might have a similar feeling,
01:25:59.620 is that his life and mine, and maybe you would say this about yours as well,
01:26:04.620 doesn't seem to conform to what you would expect of a normal life.
01:26:09.640 In other words, you know, without getting into details, my life has been so extraordinary,
01:26:14.660 and so many things have happened that I, you know, did my affirmations on and visualized.
01:26:21.720 I mean, I became a number one best-selling author.
01:26:25.620 I became one of the top cartoonists in the world.
01:26:28.620 When I was six years old, I decided that's exactly what I wanted to do.
01:26:33.000 Then, I had this weird fantasy that someday I'd be invited to the Oval Office,
01:26:37.440 and the president would ask me some questions just because he thought I'd have a good opinion.
01:26:41.980 That actually fucking happened.
01:26:44.500 Wow.
01:26:44.940 Yeah, in 2018, Trump actually invited me to the Oval Office.
01:26:50.200 I sat in the Oval Office, and we just chewed the fat.
01:26:54.780 How awesome.
01:26:56.460 And now, when I look back at that, I say to myself,
01:27:00.060 that is one of 25 things that are so unusual,
01:27:05.840 and yet they were things I really did imagine and target.
01:27:08.780 And without even knowing your whole story, I'll bet that you had a feeling that there was some kind of magic happening,
01:27:17.680 and you were imagining your future, and the next thing you have, you know,
01:27:20.900 top TV show in the world, and everything's weird.
01:27:24.860 Yeah.
01:27:25.540 I imagined it as a child, and I worked to get it.
01:27:29.460 But, you know, I carefully honed my craft in order to get it.
01:27:34.260 So you just walk in there, hey, I'm taking over.
01:27:36.900 We all did the work.
01:27:39.380 Yeah, we all did the work.
01:27:41.200 But aren't there lots of people who did the work, and they didn't get the results?
01:27:47.840 Yeah, there are a lot of people who did the work, but I don't know why they didn't make it,
01:27:56.140 and I did ultimately.
01:27:58.240 I don't know why, because I thought they were just as good.
01:28:02.440 Here's what I think.
01:28:03.360 If we're a simulation, I speculate that we can author it from within the simulation.
01:28:11.600 In other words, that the visualization of your future might be the mechanism by which you're actually steering yourself through infinite possibilities.
01:28:21.240 Now, I don't know.
01:28:21.780 I believe that.
01:28:22.580 I definitely believe that.
01:28:24.100 I got a head injury when I was 16, getting hit by this car.
01:28:30.120 And since then, after a nightmare, 10 years of healing my brain concussion, my brain injury, I started having lucid dreams.
01:28:40.660 Have you ever had those?
01:28:42.140 Oh, yeah.
01:28:42.840 Of course.
01:28:43.840 Oh, that's the real shit, ain't it?
01:28:46.160 It's the greatest.
01:28:49.280 Oh, I have to have you come back and discuss that.
01:28:52.640 Please come back again.
01:28:54.000 I've so enjoyed speaking with you.
01:28:56.640 Yeah, I'd love to.
01:28:57.860 I'd love to.
01:28:58.480 Yeah, it was amazing, Scott.
01:29:00.000 Thank you so much.
01:29:00.780 You're up here in the penthouse of thinking.
01:29:03.560 I love it.
01:29:04.840 God bless you.
01:29:06.200 Thanks so much for having me.
01:29:09.360 The time just zipped by.
01:29:11.120 I assume we're at the hour and a half.
01:29:14.240 Yeah, 1:28.
01:29:15.520 And I'll tell you, take this to the bank.
01:29:18.600 You are not in any way a racist.
01:29:21.220 You are a deep thinker.
01:29:22.740 And you can take that to the bank.
01:29:24.940 Yeah.
01:29:25.280 Said by another person called a racist by the United States racist media.
01:29:32.980 Thank you.
01:29:33.680 And I am a racist, and I'm telling you both.
01:29:36.980 All right.
01:29:38.300 Make sure you leave your computer open, Scott.
01:29:41.000 Will do.
01:29:41.400 If you can.
01:29:42.360 All right.
01:29:42.840 I'm going to end the recording.
01:29:43.540 At ease, man.
01:29:44.900 At ease.
01:29:45.860 Thank you so much.
01:29:46.960 Oh, you see my patience is growing thin
01:29:53.520 With this synthetic world