Real Coffee with Scott Adams - May 07, 2024


Episode 2467 CWSA 05/07/24


Episode Stats

Length: 1 hour and 23 minutes

Words per Minute: 162.28

Word Count: 13,524

Sentence Count: 916

Misogynist Sentences: 7

Hate Speech Sentences: 20


Summary

In this episode of the show, Scott Adams talks about why the New York Yankees should change their name to something less offensive, and about reports that Apple and Rivian, the electric car company, are in talks that could give Apple a fully-fledged car company.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the Highlight of Human Civilization.
00:00:12.620 It's called Coffee with Scott Adams, and I'm pretty sure there's never been a better time
00:00:16.180 in your whole life.
00:00:17.800 Today, not only do we have our regular show, but toward the end, I'm going to do an interview with
00:00:22.260 author Carmen Simon, and you're going to want to see that to learn more about how to be
00:00:27.340 a more effective communicator and persuader,
00:00:30.000 with her new book called Made You Look, available now for order.
00:00:36.100 All right.
00:00:36.820 If you'd like to take this experience up to a level that nobody can even understand,
00:00:40.620 all you need is a cup or mug or a glass, a tank or chalice or stein, a canteen jug or flask, a vessel of any kind.
00:00:45.400 Fill it with your favorite liquid.
00:00:46.620 I like coffee.
00:00:47.960 And join me now for the unparalleled pleasure of the dopamine hit of the day,
00:00:50.960 the thing that makes everything better.
00:00:52.020 It's called the simultaneous sip, and it's going to happen now.
00:00:56.200 Go.
00:01:00.180 Oh, that's so good.
00:01:03.600 It gets better every time, I think.
00:01:09.560 Well, I'm going to start a movement, and I hope you'll all join me.
00:01:14.300 I think this is very important.
00:01:15.660 I realized recently, I just learned, that the name New York Yankees is racist.
00:01:23.480 And I'm going to start a movement to make them change their names to something less offensive.
00:01:28.740 Now, if you don't know the background, Yankees is actually a Dutch slur.
00:01:37.420 So apparently the Dutch were the original settlers in New York City.
00:01:42.320 And the Dutch often have names that sound like Kees or Jan.
00:01:50.800 And so it was a Dutch slur to call them the Jan Kees or the New York Yankees or the damn Yankees,
00:01:59.840 if you want to be super racist.
00:02:01.280 And so as someone who has a non-zero amount of Dutch blood in me,
00:02:07.100 I insist that the New York Yankees change their name to something far less offensive.
00:02:16.380 I would say, and they could keep the general concept.
00:02:20.640 It's just you can't be Yankees.
00:02:22.860 I was thinking Yankees.
00:02:27.420 What do you think?
00:02:28.620 The New York Yankees?
00:02:31.040 No?
00:02:32.100 All right.
00:02:32.780 Well, that's just brainstorming.
00:02:34.520 That's just my first idea.
00:02:36.640 Don't get too committed to that.
00:02:38.700 You might know that Jack Dorsey, after he left what was Twitter,
00:02:45.320 was working on what could have been a competitor to Twitter called Bluesky,
00:02:50.800 some kind of open-source thing.
00:02:54.160 But he's left the board of that company and he's endorsing X Platform.
00:03:03.100 And he calls the X Platform a freedom technology.
00:03:06.120 I don't know if that's endorsing it, but kind of endorsing it.
00:03:11.020 Yeah.
00:03:14.240 In other news, Apple and the electric car and truck company Rivian, R-I-V-I-A-N.
00:03:21.180 Is that how you say it?
00:03:22.040 Rivian or Riven?
00:03:24.960 Is it Rivian, like driving?
00:03:27.480 I'm driving my Rivian, or is it Riven?
00:03:30.320 So that you're driving your Riven, which is, ugh.
00:03:34.140 I hate to hear that come out of my mouth.
00:03:35.740 I'm driving my Riven?
00:03:37.540 Ugh.
00:03:38.360 No, but if I'm driving my Rivian, well, that's a good time.
00:03:43.280 We don't know why Apple and Rivian are talking, but as some people have noted,
00:03:48.380 you could probably buy Rivian for $10 billion, and Apple's doing a buyback of their own stock
00:03:54.720 for $110 billion.
00:03:56.320 So they've got $10 billion lying around if they wanted to buy themselves a fully made car company.
00:04:02.380 You know, I don't know if you've spent any time up close with a Rivian, but, you know,
00:04:11.860 there's one in my driveway almost every day and it belongs to Husway.
00:04:17.160 And it's sort of a cool-looking vehicle.
00:04:22.240 It's got a good look to it.
00:04:23.500 I can kind of imagine that Apple, you know, being the design company that likes good design,
00:04:30.600 they might like it.
00:04:31.620 It does have a look.
00:04:33.100 I'll say that design-wise it looks pretty good.
00:04:35.380 I don't know anything about the dependability or how much the drivers like it or any of that.
00:04:40.320 Honestly, at this point, I don't know how anybody can compete with Tesla
00:04:46.280 because Tesla has such an advantage on the self-driving.
00:04:52.960 How many of you are having the problem I'm having right now,
00:04:55.460 which is I'm sort of in the market for a new vehicle,
00:05:00.780 but I'm thinking when will be my last combustion engine?
00:05:07.960 Because I think that once I go electric, which I assume I'll go someday,
00:05:12.560 I don't think I'll ever go back.
00:05:14.960 So I'm actually thinking in terms of should I get my last gas engine
00:05:19.800 and just sort of enjoy it for its, you know, nostalgic feel,
00:05:24.380 you know, get like a Bronco or something,
00:05:25.980 or should I just go to the Tesla Y and get the self-driving?
00:05:36.820 So I think it's kind of a tough choice,
00:05:40.920 because whichever way you go, you're going to regret it, right?
00:05:45.460 If I don't get the gas engine, I'll think,
00:05:47.860 oh, I wish I had, you know, another year with a gas engine.
00:05:51.100 And if I don't get the electric, I'm going to wish I had the self-driving car.
00:05:54.580 But I'll tell you what I wouldn't buy right now.
00:05:59.060 There's no scenario in which I would buy a Rivian.
00:06:02.980 Because if you're going to have an electric,
00:06:05.100 you're going to want the Tesla network for charging.
00:06:09.460 I don't know if the Rivian is compatible with that yet,
00:06:12.060 or if it will be or what.
00:06:13.960 And secondly, you've got to have the self-driving part.
00:06:18.180 The self-driving part is going to be 50% of the value of the damn car.
00:06:22.540 And if there's only one company that can do it capably,
00:06:25.860 I think they're going to own everything.
00:06:29.000 I mean, if you think about the fact,
00:06:31.200 I was just watching a video of a Tesla self-driving car,
00:06:35.160 and it not only drove the entire way to the destination,
00:06:38.880 but it picked out a parking space and parked in it.
00:06:42.580 It picked out its own parking space
00:06:44.560 and then just parked perfectly in it.
00:06:46.760 So, I mean, how are you going to compete with that?
00:06:51.000 Yeah.
00:06:51.500 I mean, Tesla is probably right at the edge
00:06:54.040 where they can make a case to lower your insurance costs by 50%.
00:06:58.840 Do you think that the $100 you spend per month
00:07:04.140 on the self-driving software
00:07:05.580 will pay for itself in lower insurance costs?
00:07:09.620 I think it will, eventually, not yet.
00:07:14.040 But yeah, I think it'll be like free money
00:07:16.700 because it'll just lower your insurance costs eventually.
00:07:20.380 Or maybe they'll just track the percentage of time
00:07:23.020 you're using self-driving compared to regular driving
00:07:25.560 and give you some kind of discount.
00:07:27.820 You know, the more you do self-driving,
00:07:29.380 the more of a discount you get at the end of the month,
00:07:31.540 something like that.
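A minimal sketch of the usage-based discount idea floated above, assuming a linear discount capped at 30% of the premium; the base premium, the cap, and the linear scaling are illustrative assumptions, not anything Tesla or any insurer has announced:

```python
# Hypothetical usage-based insurance discount: the discount grows linearly
# with the share of driving done in self-driving mode, up to a cap.
# All numbers here are illustrative assumptions.

def monthly_premium(base_premium: float, self_driving_share: float,
                    max_discount: float = 0.30) -> float:
    """Return the premium after a discount proportional to self-driving use."""
    if not 0.0 <= self_driving_share <= 1.0:
        raise ValueError("self_driving_share must be between 0 and 1")
    return base_premium * (1.0 - max_discount * self_driving_share)

# A driver on a $200/month policy who lets the car drive 80% of the time:
print(f"${monthly_premium(200.0, 0.80):.2f}")  # -> $152.00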
00:07:32.140 Well, Bill Ackman, investor Bill Ackman,
00:07:36.000 had an idea for retirement.
00:07:38.420 He said, give $7,000 per baby
00:07:40.640 and then you just wait.
00:07:45.980 Yeah, the kid would have nearly $48,000 saved for retirement
00:07:49.940 when they turn 25.
00:07:52.900 But what's the difference when you turn 25?
00:07:55.540 Here's what I worry about.
00:07:57.620 That inflation will eat up that $7,000
00:08:00.920 till there's basically nothing left.
00:08:04.520 I'm not sure that idea works anymore.
00:08:07.200 I don't know if the math works.
00:08:09.120 You know, once you add the inflation rate,
00:08:10.960 which is likely to go up,
00:08:12.740 I think it's going to chew it into nothing.
00:08:14.820 I don't even know if money will be worth something
00:08:16.540 when a kid born today retires.
00:08:19.360 I mean, what are the odds of that?
00:08:21.020 What are the odds that a child born today
00:08:23.120 will ever spend something like money?
00:08:27.780 Like as a retired person,
00:08:29.840 not as an adult,
00:08:31.060 but as a retired person in 65 years,
00:08:35.300 you think somebody's going to be using money?
00:08:37.660 I don't think so.
00:08:39.460 I think money will not even be a thing
00:08:41.460 by the time they retire.
00:08:43.060 I'm not sure.
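For what it's worth, the "nearly $48,000" figure is consistent with the $7,000 compounding at about 8% a year for 25 years. The sketch below checks that arithmetic and plays out the inflation worry; the 8% return and 3% inflation rates are assumptions for illustration, not numbers from the episode:

```python
# Check the compounding claim and play out the inflation worry.
# Assumed rates (not from the episode): 8% nominal return, 3% inflation.

def grow(principal: float, rate: float, years: int) -> float:
    """Compound a principal annually for the given number of years."""
    return principal * (1.0 + rate) ** years

principal, nominal, inflation = 7_000.0, 0.08, 0.03

at_25 = grow(principal, nominal, 25)
real_25 = at_25 / (1.0 + inflation) ** 25
print(f"At 25: ${at_25:,.0f} nominal, ${real_25:,.0f} in today's dollars")
# -> about $47,900 nominal ("nearly $48,000"), about $22,900 real

at_65 = grow(principal, nominal, 65)
real_65 = at_65 / (1.0 + inflation) ** 65
print(f"At 65: ${at_65:,.0f} nominal, ${real_65:,.0f} in today's dollars")
# -> about $1.04M nominal, about $152,000 real
```

So at these assumed rates inflation takes a big bite but doesn't reduce it to nothing; the "chewed into nothing" scenario requires inflation running near or above the investment return.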
00:08:45.360 Well, Boeing called off its planned launch
00:08:47.640 of the Starliner.
00:08:48.740 So they got this rocket ship.
00:08:50.080 It was going to go today,
00:08:51.000 but they called it off
00:08:51.860 two hours before liftoff.
00:08:54.540 I think the problem was
00:08:56.000 that they didn't have all the whistleblowers
00:08:58.100 on it yet.
00:09:01.720 I'll just let that sink in a little bit.
00:09:04.140 They had to delay it
00:09:05.060 because they couldn't get
00:09:05.820 all of the whistleblowers on it.
00:09:10.060 Okay.
00:09:10.460 Yeah, I'm just trying to start
00:09:12.500 my own conspiracy theories.
00:09:15.560 That's all.
00:09:16.700 Just trying to start them.
00:09:19.180 Well, speaking of hoaxes.
00:09:21.920 Kathy Hochul,
00:09:22.820 governor of New York,
00:09:25.240 is getting some criticism
00:09:26.500 because she said the following thing.
00:09:29.020 She said,
00:09:29.880 she was talking about
00:09:30.780 getting more educational opportunity
00:09:32.340 to poor neighborhoods
00:09:33.840 and especially she was calling out
00:09:36.360 the black neighborhoods in New York.
00:09:37.920 And she said that young black kids
00:09:40.000 in Bronx don't know
00:09:41.220 what the word computer even means.
00:09:43.840 They don't know what a computer is.
00:09:47.360 Wait, what?
00:09:50.260 The governor of New York
00:09:51.720 thinks that black kids
00:09:52.800 don't know what a computer is?
00:09:55.400 What?
00:09:56.420 Now, of course,
00:09:57.440 the people on the right
00:09:58.500 picked that up as,
00:09:59.900 you're a racist.
00:10:02.140 How racist could you be?
00:10:04.660 Now, is that racist?
00:10:05.700 No, it's not racist.
00:10:09.620 It's hyperbole.
00:10:11.240 If Trump had said
00:10:12.400 exactly these words,
00:10:14.140 what would I say about it?
00:10:16.080 If Trump had said
00:10:17.740 exactly what she said,
00:10:19.600 I would say
00:10:20.460 he doesn't mean it literally.
00:10:23.180 He doesn't literally mean
00:10:25.300 they don't know
00:10:25.960 what the word computer means.
00:10:28.400 She's simply saying
00:10:29.780 they don't have access
00:10:30.840 to good technology,
00:10:32.440 which is completely true.
00:10:34.520 Am I wrong to say
00:10:38.340 that if Trump had said this,
00:10:39.660 we would have supported him
00:10:40.680 and said he's just exaggerating?
00:10:42.300 You know what he means.
00:10:43.540 He means they need
00:10:44.420 more computers.
00:10:45.960 Yeah.
00:10:46.300 So I'm going to give her
00:10:47.160 a break on this.
00:10:47.900 I think that's fake news.
00:10:49.600 It was, you know,
00:10:50.220 not the best choice of words,
00:10:51.780 but it's pretty obvious
00:10:52.780 she's using hyperbole.
00:10:54.100 She doesn't actually believe
00:10:55.360 that black people
00:10:56.040 don't know what computers are,
00:10:57.420 even children.
00:10:58.660 She doesn't really believe that.
00:11:00.220 I mean, I'm not a mind reader,
00:11:01.320 but I'm pretty sure
00:11:02.860 she doesn't believe that.
00:11:04.260 It would be ridiculous.
00:11:08.620 So here's my take
00:11:10.160 on the overall meta-strategy
00:11:13.880 of the Democrats,
00:11:17.860 and see if you can spot this trend.
00:11:20.340 I'll talk about it as we go,
00:11:22.400 but it looks like
00:11:23.280 they try to hide
00:11:24.580 their bad behavior
00:11:25.760 in complexity.
00:11:27.460 Have you noticed that?
00:11:28.660 Which is something
00:11:29.940 I don't see
00:11:30.860 the Republicans do
00:11:32.500 as much.
00:11:35.560 So let me,
00:11:36.280 I'll give you some examples.
00:11:40.880 The election claims.
00:11:43.660 So I'm going to tell you later
00:11:45.140 about some election claims,
00:11:46.760 you know,
00:11:46.940 claims that there was
00:11:47.640 some irregularity.
00:11:48.820 When you read
00:11:49.560 the election claims,
00:11:50.920 have you noticed
00:11:51.600 that it's really confusing?
00:11:53.800 Because you got
00:11:54.520 several states,
00:11:55.960 you got Arizona
00:11:56.920 and something
00:11:58.200 about Pennsylvania
00:11:59.200 and there's something
00:12:00.120 about Wisconsin,
00:12:01.300 but there's something
00:12:01.820 about Georgia.
00:12:03.220 So first of all,
00:12:03.900 you're confused
00:12:04.460 about which state is which.
00:12:06.040 And then there's
00:12:07.400 a whole bunch of claims
00:12:08.680 that are like,
00:12:11.040 oh, the paper was wrong,
00:12:12.900 the custody,
00:12:14.320 the ballots,
00:12:15.040 there was a locked door.
00:12:16.240 And it's so detailed
00:12:18.040 and confusing.
00:12:21.820 There's somebody on Rumble
00:12:23.020 who's just yelling
00:12:23.880 fart a hundred times
00:12:25.320 in the comments.
00:12:27.200 I don't think I have a way
00:12:28.260 to block that.
00:12:29.340 So enjoy yourself.
00:12:31.440 Go nuts.
00:12:32.520 I assume you're drinking.
00:12:35.260 So that's the election claims.
00:12:36.620 Election claims
00:12:37.200 are too complicated
00:12:38.580 for the public
00:12:40.380 to understand.
00:12:41.920 So you can't get activated
00:12:43.500 to do something
00:12:45.040 about something
00:12:45.680 or vote differently
00:12:46.540 when you're overwhelmed
00:12:48.660 with complexity.
00:12:50.020 A confusopoly,
00:12:51.020 if you will.
00:12:52.020 Now what about
00:12:52.500 the lawfare on Trump?
00:12:53.900 If the only case
00:12:56.460 were the Stormy Daniels case,
00:12:58.900 we would probably
00:12:59.880 get engaged
00:13:00.720 and we'd understand it
00:13:01.900 and we would understand
00:13:02.940 that it's complete
00:13:03.840 bogus BS.
00:13:05.600 I'll talk about that more.
00:13:07.120 But because there are
00:13:08.640 91 counts
00:13:09.900 across four separate cases
00:13:11.180 and we're not all lawyers
00:13:12.320 and we're all confused
00:13:13.420 about which story
00:13:14.220 goes to which,
00:13:15.560 it's really hard
00:13:16.520 to know
00:13:17.300 did Trump do something
00:13:20.160 or not do something.
00:13:21.240 I'm going to cover up
00:13:24.260 the comments
00:13:24.920 from
00:13:26.320 everybody but Locals.
00:13:30.980 I've got a troll there
00:13:32.080 that I can't...
00:13:32.600 Oh, let me see
00:13:33.120 if I can get rid of.
00:13:34.300 Let's see if this is
00:13:35.440 fixable on my end.
00:13:37.480 Yeah, no, I don't have...
00:13:38.800 So...
00:13:40.240 Yeah, so there's just
00:13:42.600 one troll out here
00:13:43.500 who's trying to ruin
00:13:44.220 the experience for everybody.
00:13:45.860 Hey, troll.
00:13:47.820 We've all seen it now
00:13:49.100 50 times.
00:13:50.860 Could you do something else?
00:13:54.060 Something potentially
00:13:55.140 unhealthy for yourself?
00:13:58.120 Go eat some snacks?
00:14:01.000 Yeah, do something
00:14:01.940 that's bad for your health.
00:14:03.720 That's what we recommend.
00:14:04.960 All right, back to my point.
00:14:06.180 The Democrats' complex strategy.
00:14:08.220 How about the funding
00:14:10.280 of the protests?
00:14:11.880 Do you notice that
00:14:12.620 the protests are complicated?
00:14:14.720 And then when you try
00:14:15.360 to find out
00:14:15.860 who's funding them,
00:14:17.400 it's complicated.
00:14:19.100 Because there's groups,
00:14:21.100 but then there's a group
00:14:22.160 that funds the group,
00:14:25.060 and then there's
00:14:25.760 somebody behind that.
00:14:27.200 So they make it
00:14:28.000 as complicated as possible,
00:14:29.700 so you can't quite tell.
00:14:32.840 Yeah, I've got to cover that up.
00:14:34.280 It's going to make me too angry.
00:14:49.460 So I won't be able
00:14:50.400 to see the comments
00:14:51.040 from anybody except Locals,
00:14:52.620 because there's somebody
00:14:53.840 on Rumble
00:14:53.840 who's just pissing me off.
00:14:57.260 All right,
00:14:57.840 but you can see each other.
00:15:00.660 I'm just not going to watch them,
00:15:02.280 so I'm only watching
00:15:02.940 the Locals people,
00:15:04.700 their comments.
00:15:07.360 All right.
00:15:10.220 What about the censorship octopus
00:15:12.540 that Mike Benz
00:15:13.480 always talks about?
00:15:14.420 You know,
00:15:14.800 the big blob?
00:15:17.360 There's like literally
00:15:18.660 hundreds of these entities
00:15:19.920 censoring people
00:15:21.040 around the world
00:15:21.640 on behalf of the United States.
00:15:23.420 It's too hard to understand.
00:15:25.520 We just don't know
00:15:26.620 how that all works.
00:15:27.540 It's a big old octopus.
00:15:29.180 So you can't do much about it.
00:15:30.480 What about Ukraine?
00:15:32.340 The whole Biden,
00:15:33.380 Ukraine, CIA,
00:15:35.000 weapons labs,
00:15:36.820 Putin,
00:15:37.940 NATO,
00:15:38.800 it's too complicated.
00:15:40.980 It's too complicated.
00:15:42.200 You just can't figure it out.
00:15:44.540 So I think that
00:15:46.140 that's the general reason
00:15:47.940 that Democrats
00:15:48.580 are holding on to power
00:15:50.320 is that people can't understand
00:15:52.220 what's going on.
00:15:53.080 And let me give you
00:15:53.920 some detail
00:15:54.320 on each of those stories
00:15:55.120 as we go.
00:15:56.560 Meanwhile,
00:15:57.140 Matt Gaetz
00:15:57.720 was trying to get into prison
00:16:00.440 to talk to Peter Navarro,
00:16:02.340 who is,
00:16:02.840 as you know,
00:16:03.100 a political prisoner
00:16:04.240 of the Nazi-like regime
00:16:08.800 in power.
00:16:10.340 And I think that's fair.
00:16:13.440 I think it's fair
00:16:14.940 to call the Biden administration
00:16:16.240 a Nazi-like regime
00:16:17.560 if they have political prisoners.
00:16:20.800 If they're trying
00:16:21.880 to put people in jail
00:16:22.900 for their politics
00:16:25.060 and they're overtly
00:16:27.480 racially discriminating
00:16:28.820 overtly
00:16:31.040 against white people,
00:16:33.820 I would say
00:16:34.380 that that would be,
00:16:35.300 that's pretty Nazi-like
00:16:37.360 in its own way.
00:16:40.320 Anyway,
00:16:41.180 so Matt Gaetz,
00:16:42.660 a member of Congress,
00:16:44.280 was told by the director
00:16:45.620 of the Federal Bureau of Prisons
00:16:46.760 that he couldn't talk
00:16:47.980 to Peter Navarro.
00:16:48.900 and the reason was,
00:16:51.260 quote,
00:16:51.680 Peter Navarro
00:16:52.420 is too notorious
00:16:53.480 to be interviewed
00:16:54.560 by a member of Congress.
00:16:56.620 Too notorious
00:16:57.820 to be interviewed
00:16:59.500 by a member of Congress.
00:17:02.440 Does any of that
00:17:03.500 sound legitimate to you?
00:17:05.580 No.
00:17:06.340 He's a political prisoner
00:17:07.700 and they don't want him talking.
00:17:09.460 So you've got censorship
00:17:10.820 on top of fuckery,
00:17:12.540 on top of political prisoners,
00:17:14.740 in our absolute criminal
00:17:16.880 Nazi regime
00:17:18.000 that's in power.
00:17:21.580 Kristi Noem
00:17:22.380 is having
00:17:23.660 some more
00:17:24.280 fun.
00:17:26.580 She's trying to explain
00:17:27.480 why her book
00:17:28.140 said that she met
00:17:28.940 Kim Jong-un,
00:17:30.320 but when asked,
00:17:32.120 she will not confirm it.
00:17:33.560 Rather,
00:17:33.900 she's backing away from it.
00:17:34.960 Even Jesse Watters
00:17:37.800 asked her,
00:17:38.400 you didn't have a conversation
00:17:39.820 with him at the DMZ,
00:17:41.380 did you,
00:17:41.700 talking about Kim Jong-un?
00:17:42.820 And Noem said,
00:17:44.540 I don't have conversations
00:17:46.300 about my conversations
00:17:47.460 with world leaders.
00:17:50.180 So her claim is,
00:17:52.280 well,
00:17:52.980 I'm not going to say
00:17:54.520 I didn't talk to Kim Jong-un
00:17:56.260 because I talked to
00:17:57.480 a lot of leaders.
00:17:59.280 And you know,
00:18:00.500 maybe,
00:18:01.360 maybe I did talk to him.
00:18:03.180 Maybe I didn't.
00:18:04.120 I'm not saying,
00:18:04.980 but I took it out of the book.
00:18:06.460 I took it out of the book
00:18:07.560 because I don't talk about
00:18:08.700 the people I talk to
00:18:10.640 who are world leaders.
00:18:12.820 Let's just say
00:18:15.160 that Megyn Kelly
00:18:16.040 is not buying that story.
00:18:19.100 Maybe a lot of you
00:18:20.120 are not buying that story.
00:18:24.100 It's possible.
00:18:25.800 It's actually possible
00:18:27.040 that it's true.
00:18:28.100 I wouldn't say
00:18:29.000 it's super believable,
00:18:31.720 but is it possible
00:18:33.120 that she ever had a,
00:18:34.660 I don't know,
00:18:35.500 anything's possible?
00:18:37.040 It seems unlikely.
00:18:37.980 I think her political future
00:18:41.000 has been destroyed
00:18:43.060 by that book.
00:18:45.340 I hope she sells
00:18:46.300 a lot of books
00:18:46.840 because it definitely
00:18:47.680 ended her chances
00:18:48.640 of being VP.
00:18:50.960 Meanwhile,
00:18:51.940 Karine Jean-Pierre
00:18:53.320 got behind the podium,
00:18:55.920 spokesperson for Biden,
00:18:56.960 and said that violent crime
00:18:58.000 is at a nearly 50-year low.
00:19:02.040 Violent crime
00:19:02.860 is at a 50-year low?
00:19:04.080 Oh,
00:19:06.220 really?
00:19:10.080 Does that even sound
00:19:11.080 like it's possible?
00:19:13.060 I don't even know.
00:19:14.500 Like,
00:19:14.980 nothing is real anymore.
00:19:16.320 Literally,
00:19:16.840 everything is just made up.
00:19:18.900 I don't think the news
00:19:19.940 even tries anymore.
00:19:21.760 They're not even trying.
00:19:23.200 It's just,
00:19:24.020 what did this one claim?
00:19:25.860 What did the other one claim?
00:19:27.740 There's just no,
00:19:28.620 there's just no attention
00:19:31.200 to any kind of reality.
00:19:34.020 Ontario,
00:19:34.800 the wait is over.
00:19:36.300 The gold standard
00:19:37.040 of online casinos
00:19:38.080 has arrived.
00:19:39.140 Golden Nugget Online Casino
00:19:40.700 is live,
00:19:41.420 bringing Vegas-style excitement
00:19:42.860 and a world-class
00:19:43.900 gaming experience
00:19:44.880 right to your fingertips.
00:19:46.760 Whether you're a seasoned player
00:19:47.960 or just starting,
00:19:49.100 signing up is fast
00:19:50.040 and simple.
00:19:51.220 And in just a few clicks,
00:19:52.500 you can have access
00:19:53.160 to our exclusive library
00:19:54.500 of the best slots
00:19:55.580 and top-tier table games.
00:19:57.440 Make the most
00:19:58.080 of your downtime
00:19:58.720 with unbeatable promotions
00:20:00.260 and jackpots
00:20:01.060 that can turn
00:20:01.580 any mundane moment
00:20:02.860 into a golden opportunity
00:20:04.380 at Golden Nugget Online Casino.
00:20:06.940 Take a spin on the slots,
00:20:08.280 challenge yourself
00:20:08.860 at the tables,
00:20:09.660 or join a live dealer game
00:20:11.080 to feel the thrill
00:20:12.080 of real-time action,
00:20:13.400 all from the comfort
00:20:14.360 of your own devices.
00:20:15.620 Why settle for less
00:20:16.600 when you can go for the gold
00:20:18.100 at Golden Nugget Online Casino?
00:20:20.920 Gambling problem?
00:20:21.820 Call ConnexOntario
00:20:22.920 1-866-531-2600.
00:20:26.120 19 and over.
00:20:27.060 Physically present in Ontario.
00:20:28.420 Eligibility restrictions apply.
00:20:30.020 See GoldenNuggetCasino.com
00:20:31.720 for details.
00:20:32.500 Please play responsibly.
00:20:35.000 Meanwhile,
00:20:35.720 MSNBC has two new polls
00:20:37.560 that were just released
00:20:39.060 showing that President Biden
00:20:40.640 is ahead of Trump
00:20:42.540 by as much as five points.
00:20:44.380 So do you believe
00:20:47.100 that at the same time
00:20:48.340 there are polls
00:20:49.340 that say that Trump
00:20:51.080 is well ahead,
00:20:52.280 like well ahead,
00:20:53.420 like 12 points,
00:20:54.800 that they're coincidentally
00:20:56.980 right on time?
00:20:59.840 You got two new polls
00:21:01.960 that...
00:21:04.780 Two new polls.
00:21:07.260 All right.
00:21:10.080 Let me make sure
00:21:11.240 participants...
00:21:14.380 All right.
00:21:16.500 It looks like we've got
00:21:17.780 Carmen's in the waiting room.
00:21:21.600 So it looks like
00:21:22.500 that's working.
00:21:23.600 So we should be able
00:21:24.440 to have our interview
00:21:25.080 at 15 minutes before the hour
00:21:27.420 with author Carmen Simon
00:21:29.340 who's got a new book
00:21:30.400 that you will be
00:21:31.520 very interested in.
00:21:33.440 All right.
00:21:34.240 So you got these polls.
00:21:37.500 Does anybody think
00:21:38.540 that these polls are real?
00:21:40.340 Or was it just necessary
00:21:41.820 for Biden to have some polls
00:21:43.180 that look good?
00:21:43.900 So they just told
00:21:45.180 their toadies,
00:21:46.440 go give us some polls
00:21:47.320 to make us look good?
00:21:50.140 Because it doesn't look real to me.
00:21:53.200 But I also don't necessarily
00:21:54.600 think the other polls are real.
00:21:57.060 Is polling even real?
00:21:59.040 I mean,
00:21:59.280 I just don't even know
00:22:00.220 at this point.
00:22:01.240 How can they be polling
00:22:02.500 the same thing
00:22:03.200 and be off by 17 points?
00:22:06.460 Can it be true
00:22:09.800 that Trump is up 12
00:22:11.440 but also down 5?
00:22:16.880 17 point difference?
00:22:18.680 I mean,
00:22:18.920 I don't think you can believe
00:22:19.700 anything at this point.
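As a rough sanity check on that 17-point spread: with a typical national sample of about 1,000 respondents, ordinary sampling error alone keeps honest polls of the same race much closer together than that. A back-of-the-envelope calculation, with the sample size and proportions as illustrative assumptions:

```python
# Back-of-envelope: how much should two honest polls of the same race differ?
# n = 1,000 and p = 0.5 are illustrative assumptions, not any specific poll.
import math

n = 1_000
p = 0.50  # worst case for variance

se_share = math.sqrt(p * (1 - p) / n)  # standard error of one candidate's share
print(f"95% margin of error, one poll: +/-{1.96 * se_share * 100:.1f} points")  # ~3.1

# In a two-way race, the candidate-A-minus-candidate-B margin carries about
# twice that error, and the gap between two polls' margins adds a sqrt(2).
se_margin_gap = 2 * math.sqrt(2) * se_share
print(f"95% band for the gap between two polls' margins: "
      f"+/-{1.96 * se_margin_gap * 100:.1f} points")  # ~8.8
print(f"A 17-point disagreement is {0.17 / se_margin_gap:.1f} standard errors out")  # ~3.8
```

Under those assumptions, a 17-point disagreement is far outside normal sampling noise, which is why it reads as methodology (or house effects) rather than chance.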
00:22:20.940 It's almost as if
00:22:22.040 they're trying to make polls
00:22:23.380 look not real.
00:22:25.900 It's almost like
00:22:26.760 it's an attempt
00:22:27.360 to make you not believe
00:22:28.440 any polls.
00:22:29.620 Why would they do that?
00:22:30.740 Why would somebody
00:22:33.280 want you to not believe
00:22:34.460 the polls
00:22:35.100 before an election
00:22:36.920 like this?
00:22:38.600 It's probably exactly
00:22:39.700 what you think it is.
00:22:40.820 If they can make you
00:22:41.920 not trust
00:22:42.560 any of the polls
00:22:43.560 by giving you
00:22:45.000 wildly different
00:22:45.920 poll results,
00:22:46.880 then when Biden
00:22:48.260 wins unexpectedly,
00:22:49.720 they'll say,
00:22:50.340 well,
00:22:51.320 haven't we been telling you
00:22:52.500 that the polls
00:22:53.120 are not reliable?
00:22:53.920 they have to debunk
00:22:57.040 the polls
00:22:57.700 before they steal
00:22:58.880 the election.
00:23:00.200 Now,
00:23:00.760 I'm not saying
00:23:01.340 that I know
00:23:01.960 that that's going
00:23:02.560 to happen.
00:23:03.420 I'm saying
00:23:04.220 that we all expect
00:23:05.220 it to happen
00:23:05.740 and it would look
00:23:06.720 exactly like this.
00:23:09.400 So,
00:23:10.100 the fun part
00:23:10.920 about this election season
00:23:12.180 is that we've learned
00:23:13.740 all of their patterns.
00:23:15.840 We know how they work.
00:23:17.700 So,
00:23:17.900 this is exactly
00:23:18.620 on time
00:23:19.360 and exactly
00:23:20.700 on message.
00:23:22.660 Oh,
00:23:23.160 guess what?
00:23:24.660 Suddenly,
00:23:25.300 the polls
00:23:25.740 are super,
00:23:26.960 super undependable
00:23:28.000 for the first time
00:23:28.820 in history.
00:23:30.700 Yeah.
00:23:31.080 It looks like
00:23:31.820 the whole
00:23:32.200 too-big-to-rig thing.
00:23:35.420 You know,
00:23:35.860 Trump says
00:23:36.880 he needs to have
00:23:37.640 a victory
00:23:38.040 that's too big to rig.
00:23:39.580 It looks like
00:23:40.380 they're going to make sure
00:23:41.060 that there's nothing
00:23:41.780 too big to rig
00:23:42.760 by making sure
00:23:44.080 that you don't know
00:23:44.680 what it was supposed to be.
00:23:46.140 If you don't know
00:23:46.980 what it was supposed to be,
00:23:48.140 you won't know
00:23:48.920 whether the answer
00:23:50.740 that they give you
00:23:51.440 at the end
00:23:52.040 is the right one.
00:23:53.420 So,
00:23:54.340 I would say
00:23:56.260 this is highly
00:23:57.020 suspicious.
00:23:58.720 Highly.
00:24:00.080 Well,
00:24:00.760 what else is suspicious?
00:24:02.280 Rasmussen
00:24:02.760 Reports on X
00:24:04.800 is my main source
00:24:06.580 of,
00:24:07.040 let's say,
00:24:08.340 allegations
00:24:08.880 about the past election
00:24:10.340 2020.
00:24:11.600 Here are a few things
00:24:12.800 that Rasmussen Reports
00:24:14.200 likes to remind us of
00:24:15.320 on a regular basis.
00:24:16.180 Now,
00:24:18.380 these are claims.
00:24:19.660 I can't verify
00:24:21.000 the truth of these.
00:24:22.320 They're just claims.
00:24:23.860 In Arizona,
00:24:25.360 this is talking
00:24:25.940 about the 2020 election,
00:24:27.460 there was only
00:24:27.920 one official ballot paper.
00:24:29.800 So,
00:24:30.080 there's only one paper
00:24:30.840 that was approved
00:24:31.520 to be used
00:24:32.240 in Maricopa County.
00:24:34.140 Yet,
00:24:34.700 10 types
00:24:35.200 were discovered
00:24:35.900 by volunteers.
00:24:38.380 There were 10 types
00:24:39.580 of paper.
00:24:40.960 Only one is legal.
00:24:43.040 How in the world
00:24:43.880 do the people
00:24:45.120 who print the ballots
00:24:46.180 not be aware
00:24:47.840 of what the one
00:24:48.960 approved paper is?
00:24:51.580 How is that?
00:24:52.600 And how can you tell
00:24:53.520 it's the wrong paper
00:24:54.340 by looking at it?
00:24:55.840 How in the world
00:24:56.660 can the volunteers
00:24:57.500 just hold it in their hand
00:24:59.120 and know it's
00:25:00.080 the wrong paper?
00:25:01.820 And know it's
00:25:02.380 one of 10 different...
00:25:03.520 So,
00:25:03.760 nothing about this
00:25:04.380 makes sense to me.
00:25:05.580 I don't know how you can tell
00:25:06.540 it's the wrong paper.
00:25:07.920 I mean,
00:25:08.180 could you tell
00:25:09.180 10 different flavors
00:25:10.220 of paper?
00:25:11.760 I mean,
00:25:12.040 here's where the volunteer
00:25:13.040 is going:
00:25:24.080 Yeah.
00:25:24.780 This is a brand new
00:25:25.720 kind of paper right here.
00:25:27.640 This one.
00:25:28.520 You know,
00:25:28.840 the first eight
00:25:29.840 that I categorized
00:25:31.060 as totally different
00:25:31.860 kinds of paper.
00:25:34.600 But we're not done.
00:25:36.880 Yeah.
00:25:37.840 Yeah,
00:25:38.080 this one tastes
00:25:38.600 a little different.
00:25:39.380 I think this is
00:25:39.980 a different kind of paper.
00:25:40.720 So,
00:25:41.700 my first question
00:25:42.500 would be,
00:25:43.560 can volunteers
00:25:44.320 really tell
00:25:45.100 counterfeit ballots?
00:25:48.700 Have you met
00:25:49.940 the public?
00:25:51.340 Have you met
00:25:52.020 the public?
00:25:53.460 So,
00:25:53.820 if these...
00:25:54.660 So,
00:25:54.980 my other assumption
00:25:55.940 is,
00:25:56.700 the only way
00:25:57.460 you could tell
00:25:58.020 that the ballots
00:25:58.760 were non-conforming
00:25:59.920 is if they're
00:26:01.080 way off.
00:26:02.900 They must be
00:26:03.700 so different.
00:26:05.040 Somebody says
00:26:05.520 watermarks.
00:26:06.180 I don't think
00:26:07.560 it was watermarks
00:26:08.300 because they would
00:26:08.900 have said
00:26:09.200 there was a
00:26:09.800 watermark problem,
00:26:10.680 but they're
00:26:10.900 saying it's paper.
00:26:12.160 So,
00:26:12.360 the paper
00:26:12.640 must have
00:26:13.020 been so off
00:26:13.960 that you could
00:26:14.380 tell by
00:26:14.720 holding it
00:26:15.160 in your hand.
00:26:16.800 That's pretty
00:26:17.440 far off.
00:26:19.740 Well,
00:26:20.160 that would be
00:26:20.460 the allegation
00:26:21.040 to me.
00:26:21.880 And that
00:26:22.360 the 10 types
00:26:23.920 were discovered
00:26:24.500 and that amounted
00:26:25.420 to over 200,000
00:26:26.560 non-conforming votes
00:26:27.880 that shouldn't
00:26:28.340 have been counted,
00:26:29.560 which is way
00:26:30.160 more than
00:26:30.600 the margin
00:26:31.160 of victory
00:26:31.620 for Biden.
00:26:34.460 Meanwhile,
00:26:35.100 in Georgia,
00:26:35.880 here are just
00:26:36.380 a few things,
00:26:37.600 just a few things
00:26:39.300 that we know
00:26:39.720 about the Georgia
00:26:40.360 election,
00:26:40.920 2020.
00:26:41.900 We know
00:26:42.640 that a forensic
00:26:43.340 audit was blocked.
00:26:44.580 We know
00:26:44.860 that 100 drop boxes
00:26:46.140 lacked surveillance
00:26:46.980 videos,
00:26:47.720 20,000 ballot
00:26:48.640 images vanished,
00:26:50.220 13 election
00:26:51.540 routers vanished,
00:26:52.800 10 Dominion
00:26:53.460 tabulators vanished,
00:26:54.640 140,000 ballots
00:26:56.020 are still locked
00:26:57.180 up and you can't
00:26:57.860 look at them,
00:26:58.700 and 148,000
00:27:00.020 mail ballot
00:27:00.640 signatures
00:27:01.200 are unverified.
00:27:05.220 That seems
00:27:06.300 suboptimal.
00:27:09.380 It seems
00:27:09.800 suboptimal.
00:27:15.740 Fart boy
00:27:16.540 is still there,
00:27:17.500 so we'll just
00:27:17.980 leave you
00:27:18.400 blocked.
00:27:22.680 Meanwhile,
00:27:24.140 there's the
00:27:25.180 new Trump
00:27:25.700 gag order,
00:27:27.560 and Trump
00:27:28.620 said,
00:27:28.960 this judge
00:27:29.420 has given
00:27:29.800 me a gag
00:27:30.340 order and
00:27:30.780 said,
00:27:31.200 you'll go
00:27:31.720 to jail
00:27:32.140 if you
00:27:32.480 violate it.
00:27:34.140 And frankly,
00:27:34.840 you know what?
00:27:35.260 Our Constitution
00:27:35.820 is much more
00:27:36.420 important than
00:27:36.780 jail.
00:27:37.040 It's not even
00:27:37.420 close.
00:27:38.140 So Trump
00:27:38.620 is basically
00:27:39.180 saying,
00:27:39.800 yeah,
00:27:41.020 I'm going to
00:27:41.660 call your
00:27:41.980 bluff.
00:27:43.180 Take me
00:27:43.560 to jail.
00:27:45.460 Now,
00:27:46.440 I say this
00:27:47.940 a lot,
00:27:49.080 but Trump
00:27:50.180 reads the room
00:27:51.080 better than
00:27:51.580 any politician
00:27:52.420 has ever read
00:27:53.220 the room.
00:27:53.600 like,
00:27:54.820 he can feel
00:27:56.000 the zeitgeist
00:27:56.940 and then
00:27:57.620 just serve
00:27:58.140 it.
00:27:58.980 And he
00:27:59.540 is correctly
00:28:00.260 feeling that
00:28:01.100 what his
00:28:01.740 supporters want
00:28:02.940 is for him
00:28:04.020 to push this
00:28:04.780 as hard as
00:28:05.260 he can
00:28:05.480 push it.
00:28:07.160 It is what
00:28:07.880 we want.
00:28:08.740 Now,
00:28:09.160 that's different
00:28:09.720 from saying
00:28:10.160 it's a good
00:28:10.620 idea.
00:28:11.820 I think I
00:28:12.480 would stop
00:28:12.880 short of
00:28:13.280 saying it's
00:28:13.660 a good idea
00:28:14.200 to go to
00:28:14.600 jail,
00:28:14.800 but it's
00:28:17.260 definitely a
00:28:17.660 good idea
00:28:18.040 to push
00:28:18.580 it if
00:28:19.180 you're trying
00:28:19.460 to make
00:28:19.780 your base
00:28:20.260 happy about
00:28:20.840 it.
00:28:21.680 The thing
00:28:22.340 we like
00:28:22.800 about him
00:28:23.220 is he
00:28:23.500 doesn't
00:28:23.800 give up.
00:28:25.600 It's sort
00:28:26.240 of the
00:28:26.440 thing that
00:28:26.800 he's accused
00:28:27.600 of the
00:28:28.020 most.
00:28:28.720 You know,
00:28:28.880 hey,
00:28:29.200 let this
00:28:29.680 election go,
00:28:30.500 you know,
00:28:30.800 you lost.
00:28:32.700 But it's
00:28:33.380 hard not to
00:28:34.020 like somebody
00:28:34.680 who will
00:28:35.080 never give
00:28:35.580 up.
00:28:37.360 That's just
00:28:37.980 an inherently
00:28:38.580 attractive quality
00:28:39.840 in a human.
00:28:41.140 Really?
00:28:41.580 You just
00:28:41.840 never give
00:28:42.380 up?
00:28:43.360 And not
00:28:44.160 only does
00:28:44.520 he never
00:28:44.820 give up,
00:28:45.260 but he's
00:28:45.520 on the
00:28:45.760 verge of
00:28:46.160 winning
00:28:46.420 the election,
00:28:47.880 at least
00:28:48.320 if you
00:28:48.900 believe some
00:28:49.560 set of
00:28:49.860 polls.
00:28:51.060 So that's
00:28:51.740 always
00:28:51.960 impressive
00:28:52.600 when somebody
00:28:53.660 pushes that
00:28:54.360 hard,
00:28:55.580 fights every
00:28:56.360 fight,
00:28:57.380 pushes every
00:28:57.920 door open,
00:28:58.780 doesn't give
00:28:59.320 up.
00:28:59.980 It's just
00:29:00.400 a quality.
00:29:01.340 It's just
00:29:01.700 a good
00:29:02.640 quality.
00:29:03.460 We like
00:29:03.880 it.
00:29:04.540 We're drawn
00:29:05.000 to it as
00:29:05.460 a leadership
00:29:05.920 quality.
00:29:08.440 Meanwhile,
00:29:11.320 I like
00:29:14.260 that
00:29:14.520 prosecutor
00:29:15.420 Matthew
00:29:15.980 Colangelo
00:29:17.060 is just
00:29:18.380 routinely
00:29:18.860 called
00:29:19.240 corrupt.
00:29:20.080 Corrupt
00:29:20.420 prosecutor.
00:29:21.060 I guess
00:29:22.660 that would
00:29:22.900 be an
00:29:23.160 opinion
00:29:23.480 until it's
00:29:24.920 proven.
00:29:26.000 So they
00:29:26.700 bring in
00:29:27.100 this
00:29:27.400 Trump
00:29:28.400 employee
00:29:30.000 who was
00:29:31.280 the one
00:29:31.520 who did
00:29:31.760 the accounting
00:29:32.340 for the
00:29:32.720 business
00:29:33.040 records,
00:29:33.560 the one
00:29:33.800 who actually
00:29:34.260 decided what
00:29:35.000 category things
00:29:35.860 were and
00:29:36.440 did the
00:29:37.000 actual work,
00:29:38.220 and said it
00:29:39.460 was his job
00:29:40.120 to decide
00:29:40.660 how things
00:29:41.120 were categorized
00:29:41.940 and Trump
00:29:42.400 never touched
00:29:42.980 it and had
00:29:43.460 nothing to do
00:29:44.060 with it.
00:29:44.980 The entire
00:29:45.860 case depends
00:29:47.920 on Trump
00:29:48.480 being aware
00:29:49.340 and consciously
00:29:50.460 deciding to
00:29:51.200 make a change
00:29:51.900 that the
00:29:52.680 courts may
00:29:53.780 decide was
00:29:54.380 illegal.
00:29:56.560 Now the
00:29:57.240 one most
00:29:57.780 credible person,
00:29:59.280 the person
00:29:59.740 who knows
00:30:00.240 the most,
00:30:01.160 the person
00:30:01.540 who personally
00:30:02.520 made the
00:30:03.000 changes,
00:30:04.260 says Trump
00:30:04.780 had nothing
00:30:05.180 to do with
00:30:05.680 it.
00:30:06.120 No connection
00:30:06.680 at all.
00:30:07.800 That's the
00:30:08.220 entire case.
00:30:10.440 So the
00:30:11.140 case is
00:30:11.580 over,
00:30:12.540 but it
00:30:13.020 just keeps
00:30:13.480 going.
00:30:15.240 Like nobody
00:30:15.980 told the
00:30:16.560 prosecution
00:30:17.180 they should
00:30:17.600 just go
00:30:17.960 home.
00:30:19.160 If the
00:30:19.880 person who
00:30:20.380 decides what
00:30:21.240 category it
00:30:22.060 goes in,
00:30:22.820 the expense,
00:30:24.260 says I did
00:30:25.000 it myself,
00:30:25.860 nobody else
00:30:26.560 was involved,
00:30:27.500 and there's
00:30:28.080 no other
00:30:28.560 witness,
00:30:29.840 there's literally
00:30:30.500 just one
00:30:31.060 person who
00:30:31.880 did the
00:30:32.240 job and
00:30:32.820 says I
00:30:33.140 did it
00:30:33.440 alone.
00:30:36.060 That's the
00:30:36.720 whole case.
00:30:37.300 It's
00:30:38.060 completely
00:30:38.460 gone.
00:30:39.240 Do you
00:30:39.480 know why
00:30:39.760 you don't
00:30:40.100 understand
00:30:40.620 that case?
00:30:42.360 It's like
00:30:42.780 there's
00:30:43.600 something about
00:30:44.000 the statute
00:30:44.580 of limitations
00:30:45.380 and there's
00:30:46.440 something about
00:30:47.040 it's not
00:30:48.040 illegal unless
00:30:49.180 there's a
00:30:49.640 second thing
00:30:50.420 that was
00:30:51.480 the crime
00:30:52.100 and the
00:30:52.520 first thing
00:30:52.960 was trying
00:30:53.340 to cover
00:30:53.720 up the
00:30:54.040 crime,
00:30:54.920 but it's
00:30:55.460 so complicated
00:30:56.220 because the
00:30:56.760 crime that
00:30:57.420 you're covering
00:30:57.840 up doesn't
00:30:58.900 even have to
00:30:59.320 be something
00:30:59.700 you've been
00:31:00.100 convicted of.
00:31:01.040 It could be
00:31:01.660 an alleged
00:31:02.400 crime that
00:31:04.060 they believe
00:31:04.600 is a crime
00:31:05.180 and then the
00:31:05.720 other thing
00:31:06.080 was to cover
00:31:06.720 it up but
00:31:07.820 only if the
00:31:09.320 only reason you
00:31:10.220 were doing it
00:31:10.720 was to cover
00:31:11.240 it up for
00:31:11.620 political purposes
00:31:12.720 and if there
00:31:13.680 was any other
00:31:14.360 reason you were
00:31:14.980 doing it, it
00:31:15.460 was fine and
00:31:16.340 it's all
00:31:16.680 complicated.
00:31:18.100 I don't
00:31:18.740 understand it.
00:31:21.080 So that's how
00:31:21.880 they get away
00:31:22.260 with it.
00:31:23.180 If this
00:31:23.860 situation were
00:31:24.640 simplified, the
00:31:26.700 public wouldn't
00:31:27.280 put up with
00:31:27.720 this.
00:31:28.220 Even Democrats
00:31:29.040 wouldn't put up
00:31:29.720 with this if
00:31:30.860 they could see
00:31:31.340 it in its
00:31:31.780 simplest form.
00:31:33.080 The simplest
00:31:33.580 form is we're
00:31:34.660 making up
00:31:35.160 some charges.
00:31:36.820 We're making
00:31:37.580 up charges.
00:31:38.340 That's the
00:31:38.680 simplest form.
00:31:39.180 They're just
00:31:39.420 making up
00:31:39.840 some shit
00:31:40.280 and seeing
00:31:41.120 if it'll
00:31:41.440 stick.
00:31:41.920 And it
00:31:42.180 didn't.
00:31:43.060 It's not
00:31:43.380 even close.
00:31:44.820 They're not
00:31:45.380 even in the
00:31:45.940 zip code of
00:31:46.940 a conviction.
00:31:48.260 Do you think
00:31:48.620 Democrats know
00:31:49.360 that?
00:31:50.380 Only the ones
00:31:51.100 really watching.
00:31:52.720 The rest just
00:31:53.300 know that Trump
00:31:53.920 has 91
00:31:54.500 indictments.
00:31:55.720 That's all
00:31:56.060 they know.
00:31:58.640 Apparently
00:31:59.080 Stormy is going
00:31:59.880 to be testifying
00:32:01.020 today.
00:32:02.400 Will she
00:32:03.020 testify that
00:32:03.720 Trump gave
00:32:04.200 her a gag
00:32:04.680 order?
00:32:08.160 Yes?
00:32:08.980 No?
00:32:10.440 All right.
00:32:10.740 That's your
00:32:11.020 dad joke for
00:32:11.560 the day.
00:32:12.160 Dirty dad
00:32:12.780 jokes all
00:32:14.140 day long.
00:32:17.340 Let's see.
00:32:20.700 And then
00:32:21.340 Byron York
00:32:22.060 is talking
00:32:22.820 about this
00:32:23.300 rigged trial
00:32:24.220 as many of
00:32:25.020 us believe
00:32:25.380 it is.
00:32:28.520 Let's see.
00:32:29.100 So I guess
00:32:30.340 Trump's team
00:32:31.520 wanted to
00:32:31.920 bring in
00:32:32.940 an ex-head
00:32:34.060 of the
00:32:34.440 FEC, the
00:32:35.300 Federal Election
00:32:36.000 Commission,
00:32:36.800 to say that
00:32:38.020 what Trump
00:32:38.440 did was
00:32:38.980 totally
00:32:39.360 legal.
00:32:40.620 And
00:32:40.940 apparently
00:32:41.840 he's not
00:32:42.280 being allowed
00:32:42.920 to testify.
00:32:44.820 Why would
00:32:45.540 that be?
00:32:46.920 So the
00:32:48.660 witness doesn't
00:32:49.800 have direct
00:32:50.380 evidence about
00:32:51.040 anything in
00:32:51.540 the Trump
00:32:51.880 case, which
00:32:52.720 might be how
00:32:53.220 the judge
00:32:53.760 gets away
00:32:54.220 with this.
00:32:55.040 But he
00:32:55.380 does have
00:32:55.780 direct evidence
00:32:56.520 as being the
00:32:57.300 head person
00:32:58.360 who decides
00:32:59.000 what the
00:32:59.380 law is in
00:33:00.040 federal
00:33:00.520 elections, or
00:33:01.300 he was; he
00:33:02.560 would be the
00:33:02.960 number one
00:33:03.420 person to
00:33:03.940 say, under
00:33:04.700 these situations,
00:33:06.600 this expense
00:33:07.360 is totally
00:33:07.920 allowed because
00:33:09.680 whatever.
00:33:13.140 So the
00:33:13.480 guy who
00:33:13.920 would say
00:33:14.280 this,
00:33:14.880 the person
00:33:16.180 who would be
00:33:16.580 the expert
00:33:17.340 to say
00:33:18.780 whether something
00:33:19.400 was or was
00:33:20.200 not illegal,
00:33:20.920 is not allowed
00:33:22.240 to testify.
00:33:23.300 The one
00:33:24.020 person who
00:33:24.580 would know
00:33:25.000 more than
00:33:25.520 anybody else
00:33:26.300 except maybe
00:33:26.940 the current
00:33:27.380 head of that
00:33:27.900 job,
00:33:28.240 that it's
00:33:29.880 illegal or
00:33:30.380 not, and
00:33:31.180 they're not
00:33:31.440 going to let
00:33:31.720 that person
00:33:32.140 in.
00:33:33.140 Now, that's
00:33:33.740 a rigged
00:33:34.060 trial.
00:33:35.260 That is a
00:33:35.960 very rigged
00:33:36.660 trial.
00:33:37.520 Now, I
00:33:38.040 had trouble
00:33:38.500 explaining it
00:33:39.220 to you because
00:33:39.580 I kind of
00:33:40.000 quickly looked
00:33:40.680 at the story
00:33:41.180 before I came
00:33:41.780 on today.
00:33:42.760 But imagine
00:33:43.240 if you
00:33:43.600 will, did
00:33:46.500 I lose
00:33:46.780 Carmen?
00:33:49.220 Oh, I
00:33:49.780 did.
00:33:51.520 Hmm.
00:33:52.240 All right.
00:33:52.780 Well, I'm
00:33:53.240 not sure if
00:33:53.480 that'll work
00:33:53.860 or not.
00:33:54.200 Oh, no, I
00:33:56.020 do have
00:33:56.340 you.
00:33:57.480 I think I
00:33:58.160 still have
00:33:58.480 her.
00:33:59.480 All right.
00:34:01.720 So the
00:34:02.300 Harvard
00:34:02.580 protesters were
00:34:03.660 awakened today
00:34:04.960 by some
00:34:08.360 patriots, as
00:34:09.420 they're called,
00:34:09.840 blasting the
00:34:10.400 national anthem.
00:34:11.660 So all the
00:34:12.080 people in their
00:34:12.580 little tents
00:34:13.280 got awakened
00:34:14.700 early in the
00:34:15.320 morning.
00:34:15.660 I hope they're
00:34:16.120 morning people
00:34:16.700 because they
00:34:18.060 got awakened
00:34:18.700 really early in
00:34:19.440 the morning to
00:34:19.860 go to their
00:34:20.200 job of
00:34:20.620 protesting.
00:34:21.040 But here's
00:34:23.040 the thing.
00:34:23.600 You know, we
00:34:24.040 keep talking
00:34:24.480 about who's
00:34:25.120 buying all
00:34:25.600 these tents
00:34:26.200 and why are
00:34:27.480 all these
00:34:27.800 people getting
00:34:28.880 the same
00:34:29.460 matching tents,
00:34:30.720 like who's
00:34:31.100 behind it all?
00:34:32.680 And I have
00:34:33.580 an answer for
00:34:34.160 you.
00:34:35.280 You think it's
00:34:36.280 George Soros
00:34:37.100 who's funding
00:34:37.640 it?
00:34:38.500 I know some
00:34:39.060 of you said
00:34:39.480 that.
00:34:40.740 Maybe.
00:34:41.960 I think it
00:34:42.740 might be the
00:34:43.180 parents of
00:34:43.760 these protesters.
00:34:45.320 I think they
00:34:45.880 all live at
00:34:46.400 home.
00:34:47.820 And their
00:34:48.220 parents said,
00:34:49.060 hey, have
00:34:50.800 you seen
00:34:51.500 the news?
00:34:52.560 No.
00:34:52.880 What's in
00:34:53.180 the news?
00:34:53.680 I was
00:34:53.940 playing games.
00:34:54.600 I didn't
00:34:54.780 see.
00:34:55.440 Oh, terrible
00:34:56.200 things.
00:34:57.300 Terrible
00:34:57.580 things.
00:34:58.160 Really terrible.
00:34:59.300 Man, if I
00:34:59.840 were younger,
00:35:00.360 I'd be out
00:35:00.760 there protesting.
00:35:01.640 I'd be on
00:35:01.980 that campus.
00:35:03.200 What are you
00:35:03.560 talking about?
00:35:04.180 Oh, I'll
00:35:04.980 tell you, in
00:35:05.520 my day, if
00:35:06.720 this were me,
00:35:07.740 I'd get
00:35:08.160 myself a tent
00:35:09.060 and I would
00:35:09.700 go live right
00:35:12.020 in the middle
00:35:12.480 of the campus
00:35:13.100 until Israel
00:35:15.240 gives you
00:35:15.900 everything that
00:35:16.480 they want or
00:35:18.060 you want, or
00:35:18.520 does.
00:35:18.740 And the
00:35:20.280 kids would
00:35:20.560 say, wow,
00:35:21.440 yeah, that's
00:35:22.020 a good idea.
00:35:22.760 I hate hearing
00:35:23.800 all this news
00:35:24.460 about what
00:35:24.880 Israel is
00:35:25.300 doing.
00:35:26.000 I think I'll
00:35:26.580 get a tent.
00:35:27.800 I can't
00:35:28.440 afford a
00:35:28.800 tent.
00:35:29.660 And then the
00:35:30.340 parents say,
00:35:30.980 look, I
00:35:32.600 care about
00:35:33.080 this issue
00:35:33.640 too.
00:35:34.360 I don't have
00:35:35.040 time to
00:35:35.420 protest.
00:35:36.060 You have
00:35:36.380 plenty of
00:35:36.760 time because
00:35:37.140 you just
00:35:37.440 live at home
00:35:37.860 and don't
00:35:38.140 work.
00:35:39.260 But how
00:35:39.700 about,
00:35:41.060 just
00:35:42.260 spitballing
00:35:42.760 here, how
00:35:43.260 about if I
00:35:43.680 were to buy
00:35:44.240 you a
00:35:44.640 tent and
00:35:46.780 then you
00:35:48.080 could move
00:35:48.500 out and
00:35:49.940 you could
00:35:50.300 move into
00:35:50.780 the quad in
00:35:51.760 your tent and
00:35:53.140 move out of
00:35:53.540 the house.
00:35:54.620 So I think
00:35:55.340 it's mostly
00:35:55.820 parents who
00:35:56.360 are trying to
00:35:56.740 get their
00:35:57.040 kids out of
00:35:57.580 the house by
00:35:58.260 telling them
00:35:58.720 they should
00:35:59.020 go protest.
00:36:00.000 They buy
00:36:00.400 them the
00:36:00.740 tent.
00:36:01.240 No, I'm
00:36:01.560 making this
00:36:02.000 all up.
00:36:02.840 I think it's
00:36:03.280 actually coming
00:36:03.860 from Soros
00:36:05.000 and Rockefeller
00:36:05.820 and Pritzker
00:36:07.100 and a few
00:36:08.100 other rich
00:36:08.640 Democrats.
00:36:10.420 And by the
00:36:11.180 way, we do
00:36:12.160 have, I
00:36:12.760 think,
00:36:13.080 confirmation
00:36:14.240 that the
00:36:15.200 college protests
00:36:16.540 are the
00:36:17.040 summer hoax.
00:36:18.580 You know, we're
00:36:19.200 all waiting for
00:36:19.720 the big summer
00:36:20.320 hoax.
00:36:20.780 Who's going to
00:36:21.160 hit the
00:36:21.480 streets and
00:36:22.280 who's Soros
00:36:23.240 going to
00:36:23.560 fund to be
00:36:24.180 a pretend
00:36:24.720 protest?
00:36:25.700 This is it.
00:36:27.200 Now, how
00:36:27.500 much do you
00:36:27.940 love the
00:36:28.280 fact that we
00:36:28.820 all called
00:36:29.320 it ahead of
00:36:29.800 time?
00:36:31.400 Literally for
00:36:32.160 two years
00:36:32.880 ahead of time,
00:36:33.900 all of us
00:36:35.280 were saying,
00:36:36.420 all right,
00:36:36.640 what will the
00:36:37.160 summer before
00:36:37.860 the election
00:36:38.320 hoax be?
00:36:39.620 You know,
00:36:39.800 what will be
00:36:40.440 in the streets
00:36:41.020 that's completely
00:36:41.760 artificial?
00:36:42.880 Well, here it
00:36:43.260 is.
00:36:44.520 You can
00:36:45.540 identify it
00:36:46.260 by who
00:36:46.540 funds it.
00:36:47.440 You can
00:36:47.920 identify it
00:36:48.720 by the
00:36:49.200 fact that
00:36:49.580 the news
00:36:50.180 really ignored
00:36:52.040 who was
00:36:52.460 funding it
00:36:52.960 for a long
00:36:53.400 time.
00:36:55.080 But they're
00:36:55.780 on it now.
00:36:57.640 But if
00:36:58.300 you're a
00:36:59.200 casual news
00:37:00.040 watcher,
00:37:00.820 what do you
00:37:01.260 know about
00:37:01.700 the funding
00:37:02.340 of the
00:37:03.860 protests?
00:37:05.880 Nothing.
00:37:07.120 Nothing.
00:37:07.760 If you're not
00:37:08.360 really deep in
00:37:09.300 the weeds of
00:37:10.160 the news,
00:37:10.620 you don't
00:37:11.440 know that
00:37:11.940 the main
00:37:13.260 Democrat
00:37:13.800 funders are
00:37:14.720 funding this
00:37:15.320 thing.
00:37:16.620 For why?
00:37:17.580 Well, it
00:37:19.020 might be so
00:37:19.720 that they can
00:37:20.200 get enough
00:37:20.700 action going
00:37:21.520 that they
00:37:21.920 can cancel
00:37:23.220 the election
00:37:23.780 or do
00:37:24.220 something
00:37:24.520 drastic.
00:37:25.500 But it's
00:37:27.020 certainly the
00:37:27.480 summer hoax.
00:37:30.640 And it's
00:37:31.260 another complex
00:37:32.040 web of
00:37:32.460 things we
00:37:32.860 don't
00:37:33.040 understand.
00:37:33.560 apparently
00:37:35.680 Alvin
00:37:36.120 Bragg's
00:37:36.520 lead
00:37:36.700 prosecutor
00:37:37.220 that
00:37:37.500 Matthew
00:37:37.940 Colangelo
00:37:39.160 I mentioned
00:37:40.700 before,
00:37:41.880 he was
00:37:42.140 paid $12,000
00:37:43.320 by the
00:37:44.220 Democrat
00:37:45.120 National
00:37:45.580 Committee
00:37:46.080 to be
00:37:47.280 a political
00:37:48.000 consultant.
00:37:49.940 Does that
00:37:50.740 sound suspicious?
00:37:51.900 Because it
00:37:52.500 sounds a lot
00:37:53.140 like the
00:37:53.500 Democrats are
00:37:54.200 paying him
00:37:54.660 to go
00:37:54.920 after Trump.
00:37:56.500 That's what
00:37:57.020 it looks
00:37:57.280 like.
00:37:58.440 Doesn't
00:37:59.040 mean it's
00:37:59.360 true.
00:38:00.280 Could be
00:38:00.740 coincidentally
00:38:01.420 he's such
00:38:01.960 a good
00:38:02.240 political
00:38:02.720 consultant
00:38:03.280 that for
00:38:03.940 the first
00:38:04.260 time ever
00:38:04.660 they decided
00:38:05.200 to give
00:38:05.500 him a
00:38:05.780 bunch of
00:38:06.020 money.
00:38:07.320 Or it's
00:38:08.180 exactly what
00:38:08.860 it looks
00:38:09.200 like.
00:38:10.280 The
00:38:10.760 prosecutors
00:38:11.420 are funded
00:38:12.380 and elected
00:38:12.900 by Soros
00:38:13.740 and then
00:38:14.560 the Democrats
00:38:15.220 get to tell
00:38:16.040 them what
00:38:16.360 to do
00:38:16.720 and sometimes
00:38:17.380 pay them
00:38:17.820 to do
00:38:18.100 it.
00:38:19.220 So yes,
00:38:19.820 the justice
00:38:20.320 system is
00:38:21.060 completely
00:38:21.780 broken.
00:38:23.960 Why does
00:38:24.420 the average
00:38:24.900 person not
00:38:26.360 complain?
00:38:27.020 Because it's
00:38:27.520 complicated and
00:38:28.280 they don't
00:38:28.480 really understand
00:38:29.040 it and
00:38:29.580 they don't
00:38:29.820 know anything
00:38:30.120 about this
00:38:30.500 case and
00:38:31.000 they don't
00:38:31.220 know the
00:38:31.500 DNC may
00:38:32.120 have paid
00:38:32.860 somebody for
00:38:33.400 some political
00:38:34.000 consulting and
00:38:34.840 they don't
00:38:35.080 know how
00:38:35.380 it all
00:38:35.640 fits together.
00:38:37.080 They don't
00:38:37.320 see the
00:38:37.600 big picture.
00:38:38.280 It's all
00:38:38.520 complicated.
00:38:42.700 So Elon
00:38:46.580 Musk said
00:38:47.280 in a recent
00:38:48.240 get-together
00:38:49.000 public event,
00:38:51.140 he said
00:38:51.540 that America
00:38:52.840 could end
00:38:53.240 without free
00:38:55.340 speech.
00:38:57.540 But here's
00:38:58.180 what I
00:38:58.380 think.
00:38:59.460 I like
00:38:59.860 free speech,
00:39:00.840 but I
00:39:01.520 don't think
00:39:01.840 America is
00:39:02.400 going to
00:39:02.640 end if
00:39:02.960 you don't
00:39:03.220 have it.
00:39:03.960 Because I
00:39:04.400 don't think
00:39:04.720 we've ever
00:39:05.140 had it.
00:39:06.140 Not really.
00:39:07.400 And I
00:39:07.700 don't think
00:39:07.980 the country
00:39:08.420 has ever
00:39:08.720 been a
00:39:09.240 democratic
00:39:09.980 republic since
00:39:10.800 I've been
00:39:11.100 alive.
00:39:11.880 As far as
00:39:12.700 I can tell,
00:39:13.960 I'll probably
00:39:14.380 tell you this
00:39:14.840 every day,
00:39:16.260 our government
00:39:16.840 is just a
00:39:17.820 criminal
00:39:18.140 enterprise,
00:39:19.180 like all
00:39:19.940 other democratic
00:39:20.780 countries
00:39:21.440 eventually become.
00:39:22.980 The trouble with democracy is that as long as people have freedom, the rich people will use their freedom to take full control of the government, which is what happened.
00:39:33.240 It's exactly what happened.
00:39:35.160 So if you have a democracy slash republic, you can pretty much just wind the tape forward until the rich people have all the control.
00:39:45.900 And here we are, exactly where you'd expect based on the design of the system.
00:39:51.280 Now, if the design of the system was such that rich people couldn't donate and maybe just couldn't do anything, and you would watch them very carefully to make sure they weren't doing clever donations in indirect ways, maybe you can make a democracy work.
00:40:06.440 But in the current design, you guarantee that the rich people take control, probably right away, and probably always have been in control.
00:40:14.720 So the wars will be driven by the people who make money from wars, and the climate change will be driven by people who can sell you solar panels, and it's exactly what it looks like, a gigantic criminal enterprise.
00:40:27.560 But let me tell you where they went too far.
00:40:29.920 It's one thing to be called tax cattle.
00:40:33.500 Basically, we're livestock that pay taxes to other people so they can use our money.
00:40:39.820 But where I think they went too far is poisoning our food supply and making us pay for it.
00:40:45.380 Now, that alone would be enough reason to vote for RFK Jr., because he's very much on the poisoned food supply issue, and he's not letting go.
00:40:55.580 And I really, really appreciate his patriotism and service to the country.
00:41:00.340 Win or lose, it's awesome.
00:41:04.440 All right.
00:41:04.940 So I think they've gone too far with our criminal enterprise, and like the mafia, they need to take care of their public a little bit better.
00:41:13.360 Right?
00:41:13.580 It's one thing to be part of a criminal enterprise, but you've got to take care of the criminals, right?
00:41:18.560 We're all in this together.
00:41:20.380 It doesn't work if you kill us all.
00:41:23.760 So please don't kill us with your poisoned food and poisoned pharma.
00:41:29.640 I don't think America is going to end, because I think we've always been a criminal enterprise in my lifetime.
00:41:34.420 It worked out fine.
00:41:36.000 So the dark truth is that a criminal government is actually very effective.
00:41:41.200 Putin, for example, total criminal government, very, very stable.
00:41:47.220 Romney is admitting that the reason TikTok is going to get banned, or there's so much effort to do it, is strictly, well, he didn't say strictly, but mostly because TikTok was so pro-Palestinian and anti-Israel.
00:42:07.900 Now, we knew that, right?
00:42:11.640 Didn't we all know that it was because of Israel?
00:42:15.760 No.
00:42:17.500 All right.
00:42:18.380 I was just checking the comments to see if the troll's gone, but he's very dedicated, so we'll cover him back.
00:42:25.600 So I can still see all the locals' comments, but only from locals.
00:42:33.600 All right.
00:42:34.360 So I guess it was exactly what it looked like.
00:42:39.680 All right.
00:42:40.180 I'll do a couple of quick stories, and then we're going to do an interview if I can connect with Carmen Simon, which will test the Rumble Studio technology here to add a guest.
00:42:51.340 Anyway, Maxine Waters says that there's a bunch of right-wing organizations that are training up in the hills somewhere.
00:42:58.200 Now, I got really worried about those right-wing organizations that are training up in the hills in case Trump loses.
00:43:04.680 I guess they're going to do their armed revolution.
00:43:08.320 And I think that they're probably white supremacists.
00:43:12.800 And I've decided to form a posse to go hunt for the white supremacists, or as she calls them, right-wing organizations.
00:43:19.340 They're hiding in the mountains.
00:43:22.840 And you can identify them by their khaki pants.
00:43:26.540 So when you go hunting for the white supremacists that are up in the mountains forming their secret armies, once you see a whole bunch of guys in khakis and baseball hats, that's them.
00:43:36.600 That is them.
00:43:38.380 And I assume that most of them were once in the U.S. military, but probably left as soon as they found out that the military was looking for all the white supremacists, because that's why the military couldn't find any.
00:43:52.200 As soon as they started looking, all the white supremacists quit the military immediately to be not detected, and they went up into the mountains where only Maxine Waters knows they live.
00:44:03.140 But I believe everything she says, so I'm forming a posse to go get those damn guys, and we're going to have to stop this ourselves.
00:44:10.180 Who's with me?
00:44:12.460 Nobody?
00:44:13.840 All right.
00:44:14.420 Maybe I'll do something else.
00:44:17.280 RFK Jr. says that Bill Gates uses philanthropy as a front to amass vast personal profits.
00:44:24.380 Now, you might know that I've challenged many of you publicly many times to explain to me how he's making money from any of this.
00:44:34.300 Well, RFK Jr. just answered my question.
00:44:37.080 Here's the quote, talking about Gates.
00:44:40.220 He gets tax deductions for giving money to the WHO.
00:44:43.060 Okay.
00:44:43.780 That's not making money.
00:44:45.780 That's just giving your money away.
00:44:48.580 All right.
00:44:48.880 He gains control of the WHO, arguably, but I thought the WHO was controlled by China.
00:44:56.260 So, you know, is it Bill Gates or is it China?
00:44:59.620 The WHO finances the health ministries in virtually every country in Africa.
00:45:03.580 So Gates can say, as a condition of getting that money, this is what the WHO does.
00:45:10.880 And you have to show that your people are vaccinated.
00:45:15.520 And then the vaccines are something that Gates owns a part of, so he makes money.
00:45:19.580 So, let's see, pulling all this together, because it's complicated, isn't it?
00:45:24.100 All the bad stuff hides in the complication.
00:45:27.600 So, there's a World Health Organization that funds certain countries for their health care, and if they don't vaccinate, they can't get the money.
00:45:40.800 So, Bill Gates gets control of the World Health Organization.
00:45:44.020 This is RFK Jr.'s take on it.
00:45:46.340 And then he gets a tax break for that, but he still pays money.
00:45:50.700 I mean, he still ends up out of pocket, even though he gets the tax break.
00:45:54.380 But then he can influence them to buy his vaccines, and then he makes all his money back.
00:46:00.900 Well, that's one way to look at it.
00:46:04.540 But it's a little bit of a mind-reading problem, isn't it?
00:46:08.940 So, the mind-reading problem is that you have to read his intention.
00:46:12.860 If his intention is he's trying to save these countries and give them health care that they wouldn't have otherwise, then he's doing a real good job of it.
00:46:21.760 Now, that would depend on the vaccines being a good idea.
00:46:25.460 And I'm pretty sure he thinks it's a good idea, whether it is or not.
00:46:32.100 So, I don't think you can read his mind to know that that's what's going on there.
00:46:36.560 All right.
00:46:37.020 I'm going to bring on my guest.
00:46:40.080 Let's see if this technology works.
00:46:45.500 Let's see.
00:46:47.220 We'll be testing to see if I can do a promote to moderator.
00:46:51.080 No.
00:46:56.740 Participants.
00:46:58.080 Let's see.
00:47:00.800 You got your microphone off.
00:47:03.600 Huh.
00:47:05.500 All right.
00:47:06.100 Let's try something else.
00:47:08.840 Going into duo mode.
00:47:11.560 So, the question will be whether I can find the user interface to bring her up to...
00:47:17.920 I can promote to moderator.
00:47:19.600 That's not a choice.
00:47:21.460 Hey, there you are.
00:47:23.560 All right.
00:47:24.200 Turn on your microphone.
00:47:25.980 Your microphone's off.
00:47:27.540 Hello.
00:47:29.960 All right.
00:47:30.660 I can't hear you because your microphone's off.
00:47:32.960 Okay.
00:47:34.440 Try again.
00:47:35.620 How about now?
00:47:37.420 And how about this?
00:47:39.520 I've got you muted.
00:47:40.680 Let me unmute you.
00:47:44.520 No, that's not working.
00:47:46.400 I can't hear you.
00:47:47.760 I can't hear you.
00:47:49.480 Your microphone.
00:47:51.080 How about now?
00:47:55.980 So, apparently you don't know what I'm saying, and I'm saying I can't hear you.
00:48:01.800 I can't hear you.
00:48:03.680 I appear to have...
00:48:05.880 On my system it says your microphone's off.
00:48:09.400 It says your microphone's off.
00:48:12.720 You have no idea what I'm saying right now, do you?
00:48:14.880 I do.
00:48:15.460 Oh, you do?
00:48:15.980 Yes.
00:48:16.500 All right.
00:48:16.680 I can't hear you, Carmen.
00:48:18.780 Because my computer is telling me that your microphone is off.
00:48:22.560 It's on mute.
00:48:23.260 Okay.
00:48:23.880 Can you see a microphone icon toward the bottom of your screen or something?
00:48:27.360 I do, but it shows that it's not muted.
00:48:30.480 All right.
00:48:31.200 So, it's not working.
00:48:33.040 Hold on.
00:48:35.640 We're going to try another minute here to get this working.
00:48:38.640 I don't know if she's got a bad microphone.
00:48:42.420 This is the first time we've tried.
00:48:43.740 By the way, this is a beta test.
00:48:46.600 So, the reason that this is not all worked out in advance is that this is the working it out.
00:48:53.360 This is the trying to make it work right here.
00:48:55.840 So, you're all part of the experiment.
00:48:57.420 We're using the Rumble Studio.
00:48:59.600 How about now?
00:49:00.660 All right.
00:49:01.000 No, there's nothing.
00:49:02.400 Still nothing?
00:49:03.840 Nothing.
00:49:04.620 Hmm.
00:49:04.840 Because I tried.
00:49:06.600 So, I'm using the technology of mime.
00:49:11.180 Okay.
00:49:12.500 I have one, too.
00:49:13.660 So, she has a microphone, but there's a button.
00:49:17.780 You don't see a button to turn it on?
00:49:19.940 Mm-mm.
00:49:21.380 So, scroll your page down to see if there's another choice that comes up.
00:49:27.380 It might be at the bottom of the page.
00:49:32.000 Let me try this.
00:49:33.400 Hold on.
00:49:35.000 How about this one?
00:49:36.280 Oh, there we go.
00:49:37.620 Yeah?
00:49:39.320 All right.
00:49:39.580 So, I can hear you now, but are people going to get a double?
00:49:45.100 I wonder if I need headphones for this.
00:49:47.300 Is there an echo?
00:49:49.600 I'm not hearing an echo.
00:49:52.920 And how about the users on locals?
00:49:55.160 Do you hear an echo?
00:49:59.400 You hear both?
00:50:01.940 Yeah.
00:50:02.380 So, it sounds like we are being heard.
00:50:04.480 Well, let's try to do this, then.
00:50:07.220 All right.
00:50:08.400 No echo, and we have audio.
00:50:10.360 All right.
00:50:10.940 How about that?
00:50:12.060 All right.
00:50:12.280 So, that was my user interface problem.
00:50:14.220 I think I was blaming you for not finding the right button.
00:50:17.600 On my system, it showed mute, but it actually was my computer.
00:50:23.120 I had to turn up my sound.
00:50:26.200 So, I think everybody was hearing you except me, right?
00:50:29.660 Is that what it was?
00:50:30.600 This is about the neuroscience of attention, and now it looks like we have quite a bit of attention.
00:50:35.920 Part of your technique is putting an error in it.
00:50:40.980 All right.
00:50:41.340 Let's talk about this.
00:50:42.400 You have a new book.
00:50:44.220 I do.
00:50:45.180 It's called Made You Look.
00:50:47.460 Made You Look.
00:50:48.520 Just out now, and you can buy it.
00:50:50.860 And I'm going to give the people a quick idea, in my own words, that are all wrong, before you correct me, okay?
00:51:00.400 Okay.
00:51:00.740 So, one of the things you do, if you can see in this book, see this person hooked up to all these sensors and looking at a screen on the computer?
00:51:08.760 That's what Carmen does, puts these little sensors on their body and their head, EEG, ECG, GSR, facial coding, eye tracking, and you have them look at mostly slides, like business presentation slides, or is it other things as well?
00:51:25.320 It could be a presentation, it could be a website, it could be a video, anything that in somebody's view should capture attention, which is what the essence of the book is.
00:51:36.500 And the reason you even need attention in the first place is because quite often people remember better if they pay attention.
00:51:42.460 I'm saying quite often because it is possible to remember something without consciously paying attention to it, but we're all here in the business of communication, and usually you like to get credit for things that you made people look at.
00:51:56.740 So, attention influences memory, and in turn, memory influences decision making, and the world moves around, and all the great things that you just talked about happen because people remember and decide, and attention is the root foundation of that.
00:52:11.040 So, the thing that surprised me as I was looking through your book is how many categories of attention there are.
00:52:17.260 In other words, you can, you know, give us a few ideas, like attention isn't just simply looking at stuff, right?
00:52:24.600 Give us a little sense of how complicated that is, to determine attention.
00:52:28.780 It is true.
00:52:29.820 There are a variety of ways in which the brain pays attention.
00:52:32.940 We pay attention with our brain.
00:52:34.740 And, in fact, as I'm looking at the comments, I'd love to know what typically attracts your attention, speaking to our audience, and what would you like more attention to?
00:52:44.660 And as you're reflecting on those questions, just know that typically when you pay attention to something, you pay attention in space, and quite often that attention can be very focused.
00:52:55.400 Like, for instance, when you have dinner with somebody at a restaurant, and you try to be polite, then you're only focusing in on that one person.
00:53:02.240 I'm even seeing this habit now where people are putting their phones with the face down just to indicate that now everything is in focus, and that focus is very sharp.
00:53:12.720 So that's selective attention.
00:53:14.400 But at some point, from an evolutionary perspective, we know that it would not serve us well if we were so laser-focused.
00:53:20.980 We have to widen our focus, and we have to distribute our attention so that you can see there are some creepy crawlies hiding in the bushes, and your survival may be in danger of sorts.
00:53:33.880 So it is possible to divide attention, but in that process, then you're missing some of the sharpness from the selective attention, but you have additional advantages.
00:53:43.880 So those are attention in space.
00:53:45.900 It's also possible to pay attention across time, and that's sustained attention.
00:53:50.720 This is what people associate with attention span.
00:53:53.800 And there is a myth that I'm sure that you've heard debunked before.
00:53:58.160 There is no such thing as a short attention span.
00:54:01.020 As scientists, it bothers us greatly when people say, well, are you noticing that these days the attention span is getting shorter, and the answer is no, it is not getting shorter.
00:54:11.540 The brain hasn't changed much.
00:54:13.220 No, it hasn't changed much in the past 35,000 years.
00:54:16.520 We're capable of paying attention.
00:54:18.860 In fact, to test this, we can ask our audience members here what's the longest amount of time you have binge-watched a TV show, and I guarantee that if they're genuine with us, their answers are not going to be in the seconds.
00:54:35.300 Their answers are not going to be in the minutes.
00:54:37.920 Their answers are going to be in the hours.
00:54:40.980 So is the whole thing that as long as we're interested, we have infinite attention?
00:54:48.060 We can definitely stay focused, as long as the stimulus is interesting and/or important.
00:54:55.120 Because sometimes you have to pay attention because something is critical, not because it's the most interesting thing you have ever seen.
00:55:01.960 Let's give the audience some takeaways so they can get some of the functional, useful, like how's it going to make your life better.
00:55:11.240 So based on your data, and now you have a pretty big database of human beings looking at visual things and having certain physical, biological reactions.
00:55:23.600 Yes.
00:55:23.920 All right.
00:55:24.700 So in theory, you could give us some ideas of what things you found out work better than other things for attention.
00:55:34.000 Can you break it down into some tips?
00:55:37.760 Yeah, definitely.
00:55:38.500 Let's consider some tips.
00:55:40.680 And in the book, we go beyond the
00:55:43.000 multitude of attention types, and we
00:55:45.640 look at the intersection of two ways
00:55:47.640 in which we pay attention.
00:55:49.180 By asking the questions, where is
00:55:50.940 the brain looking?
00:55:51.760 And you can look externally or you
00:55:53.400 can look internally.
00:55:54.820 And who's doing the looking?
00:55:56.360 Because sometimes you decide to
00:55:57.640 look on your own, but sometimes
00:55:59.240 somebody makes you look as per the
00:56:01.360 title of the book.
00:56:03.060 So let's consider some techniques on
00:56:04.880 when you force somebody to look.
00:56:06.940 Let's just say that you want your
00:56:07.960 children to pay attention to you.
00:56:09.360 You want your customers to pay
00:56:11.400 attention to you.
00:56:12.060 You want your boss to pay attention
00:56:13.260 to you.
00:56:14.100 So you're forcing the looking, and
00:56:15.820 you want them to look externally.
00:56:18.180 And of course, you can manipulate
00:56:19.500 physical properties of a stimulus.
00:56:21.300 Like, for instance, if you make
00:56:22.180 something louder, of course, you'll
00:56:23.600 pay attention if something was
00:56:24.760 silent before that.
00:56:25.900 Or if you make something brighter or
00:56:27.520 bigger if something was not so bright
00:56:29.560 or not so big before.
00:56:30.840 So as long as there's a contrast,
00:56:32.780 here's a hard number that I like to
00:56:34.240 use.
00:56:34.980 The brain will pay attention to a
00:56:36.380 physical property of a stimulus if
00:56:37.960 there's enough contrast between two
00:56:39.960 items.
00:56:40.380 And that contrast has to be in the 30
00:56:42.080 to 40% range.
00:56:43.880 In other words, for instance, the reason a company may not be differentiated enough is that the product it offers, compared to the competition, is not that much different.
00:56:52.800 The brain cannot perceive the
00:56:54.520 contrast.
00:56:55.760 So as you're reflecting on your own
00:56:57.100 message, ask this question.
00:56:59.000 Sometimes it's not easy to answer.
00:57:01.480 Am I different by at least 30% from
00:57:04.300 somebody else?
00:57:05.560 So contrast could be a good technique
00:57:07.120 in terms of forcing somebody to look
00:57:09.140 externally.
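Purely as an illustration of that 30 to 40% rule of thumb, here is a minimal Python sketch; the function name, the property values, and the threshold default are assumptions for the example, not numbers taken from Carmen's studies.

```python
def perceivable_contrast(a: float, b: float, threshold: float = 0.30) -> bool:
    """Return True if two stimulus values differ by roughly the ~30% rule
    of thumb mentioned above. `a` and `b` can be any comparable physical
    property: brightness, font size, loudness. Illustrative only.
    """
    baseline = max(abs(a), abs(b))
    if baseline == 0:
        return False  # two blank or silent stimuli: nothing to contrast
    return abs(a - b) / baseline >= threshold

# A 24 pt headline next to 18 pt body text differs by 25%: below threshold.
print(perceivable_contrast(24, 18))  # False
# 24 pt against 14 pt differs by about 42%: enough contrast to register.
print(perceivable_contrast(24, 14))  # True
```

The same check applies to the differentiation question she raises: if your offer and the competition's differ by less than about 30% on the property the audience perceives, the brain may not register a contrast at all.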
00:57:09.820 You don't always have to make them
00:57:12.020 look externally.
00:57:12.840 You can also make them think.
00:57:15.000 I really like communication that guides you internally and makes you reflect for a moment.
00:57:21.020 So for instance, I was seeing this
00:57:22.760 ad for an Alzheimer's organization
00:57:25.980 and they were showing the person that
00:57:27.940 critically needs care.
00:57:29.920 But the focus was not on that person.
00:57:32.380 It was on the person that gives the
00:57:34.120 care because sometimes we forget that
00:57:35.940 those people are being taken care of by
00:57:37.600 somebody else and it just gives you that
00:57:39.940 small moment of thinking and the joy
00:57:43.160 of getting it.
00:57:44.160 So as you reflect on your own
00:57:45.480 communication, are you giving others
00:57:47.240 the joy of getting it?
00:57:49.220 The joy of getting it.
00:57:50.940 So you can get attention by tweaking
00:57:53.740 their emotions or by tweaking their
00:57:57.640 contrast, which is more of an
00:58:00.100 intellectual thing.
00:58:00.920 So there's ways to get at them
00:58:03.300 intellectually, by contrast, but also
00:58:06.340 biologically, by what excites them.
00:58:09.680 And yes, and also you can think of it
00:58:12.780 in terms of being a bit more provocative.
00:58:15.540 I mean, even hearing the conversation that
00:58:17.920 you had with the audience before you and
00:58:19.940 I started talking, there are a lot of
00:58:21.860 provocative things that are being said.
00:58:23.840 Like for instance, if you were to insist on
00:58:25.680 the phrase poison pharma, that's not a
00:58:28.940 combination of words that you typically
00:58:30.640 hear mixed together.
00:58:34.020 And unfortunately, what happens, especially
00:58:35.520 if you operate in the business space, people
00:58:37.520 are often way too cautious and they baby
00:58:39.920 other people's brains and they don't want
00:58:42.660 to challenge the status quo.
00:58:44.720 They don't want to impose any tension.
00:58:47.240 And just like with that contrast in some perceptive power, like for instance, something is bigger by 30% than something else, you can have a contrast in emotion.
00:58:56.020 In other words, not all emotions are going to get attention and be memorable.
00:58:59.860 You really have to have a much stronger stimulus.
00:59:02.340 And I'm noticing these days, as I've been doing a lot of these neuroscience studies (you showed the cap earlier), that it does take a stronger and more intense stimulus for the brain to focus and stay with you for a while.
00:59:14.760 Now, have you picked up any changes in
00:59:18.100 what it takes to get somebody's attention?
00:59:21.480 And now you said that we have good attention
00:59:23.640 spans if we're interested enough, but does
00:59:26.900 the average person looking at, let's say
00:59:28.460 a PowerPoint slide deck, do they have the
00:59:32.320 same brain that we did 10 years ago?
00:59:36.340 I'm noticing that the threshold for
00:59:38.480 stimulation has changed.
00:59:40.500 So even though the brain itself is still
00:59:42.740 the same organ, the way that we build our
00:59:45.240 habits is to now crave more and more
00:59:48.320 stimulation.
00:59:48.960 So we will stay with you for a while if
00:59:51.640 you give us something.
00:59:52.740 Like even our attempt to fix the microphone, for instance, which just creates this positive stimulation.
00:59:58.300 So people very quickly can reach for their
01:00:00.340 phone.
01:00:00.840 We're only a click away from being turned
01:00:02.820 off.
01:00:03.400 And if you're not the source of strong
01:00:05.560 stimulation, then the brain is very adept and
01:00:08.760 has a lot of choices these days.
01:00:11.300 So here's the funny thing.
01:00:12.960 For those of you who are watching the
01:00:14.720 audience, when you saw me struggling to get
01:00:18.040 the technology working and it looked like it
01:00:20.220 wasn't going to work and maybe I'd have to
01:00:21.880 bail out and everything.
01:00:23.280 What I was thinking during that period was
01:00:26.400 not, oh no, everything's going wrong.
01:00:29.000 What I was thinking is there's no way you're
01:00:30.900 going to turn this off.
01:00:32.900 I was thinking, you're going to have to wait
01:00:35.520 at least to see if I can make this work.
01:00:38.140 Because you know you wanted to see me fail.
01:00:40.540 Like, you know, in public.
01:00:41.680 Because I always say that danger is the thing
01:00:45.240 that keeps people interested.
01:00:47.300 So you have to have some, like there's
01:00:48.980 somebody's going to go off the rails.
01:00:51.020 You know, somebody's going to say something
01:00:52.100 that gets them canceled or something.
01:00:53.940 So when you and I were trying to make this
01:00:56.760 work, the whole time I was thinking, oh, this
01:00:59.000 is kind of perfect.
01:01:00.940 Like I wasn't having a care in the world
01:01:03.580 because it was either going to work or not
01:01:05.300 work.
01:01:05.660 But one way or the other was going to get a
01:01:07.280 lot of attention.
01:01:07.880 And so I was just sort of conscious of that,
01:01:10.580 that it was working, even though it was a
01:01:12.620 disaster.
01:01:13.780 Like the disaster worked just the way I
01:01:15.840 wanted it to work.
01:01:17.340 And turning what you just said into a
01:01:18.600 practical guideline for everyone, consider
01:01:20.900 this a method of priming.
01:01:22.600 So one of the reasons why people don't pay
01:01:24.180 attention is because they're not ready to
01:01:26.740 pay attention.
01:01:27.740 And we observe this in business all the
01:01:29.260 time.
01:01:29.600 Let's just say that you have a sales
01:01:30.920 presentation that has 20 slides and what's
01:01:33.520 something very critical for your product
01:01:34.960 appears on, let's say, slide 18.
01:01:37.220 By slide 18, you may have done something
01:01:40.420 that would have diverted your audience's
01:01:42.840 attention away.
01:01:44.260 So right before that slide, you need a
01:01:45.800 primer.
01:01:46.340 You need something, something that says to
01:01:47.880 the brain, now something important or
01:01:50.960 interesting or ideally both is about to
01:01:53.460 happen.
01:01:54.640 Let me ask you, do you want to prime right
01:01:57.020 before the thing you're priming for or can
01:01:59.580 you prime like, like 10 minutes before?
01:02:03.040 I love that.
01:02:03.840 I love that question.
01:02:04.760 And the answer depends on what we were just
01:02:07.260 talking about in terms of the intensity of
01:02:08.920 the stimulus, because if I use that phrase
01:02:11.640 that you're using in terms of poison
01:02:13.380 pharma, that is a loaded phrase that will
01:02:16.540 last a while in terms of intensity.
01:02:19.180 If it's something that's a weaker stimulus,
01:02:21.400 like for instance, you may just use a
01:02:23.060 gratuitous photo of somebody naked that may
01:02:26.780 just last there for a few fractions of a
01:02:28.820 second.
01:02:29.080 And then it's just, it's just gone.
01:02:30.820 So if you have something intense, you can rely
01:02:33.360 on that primer.
01:02:34.000 And what do I mean by priming is inviting the
01:02:36.880 brain to react to something based on what it
01:02:39.580 just experienced.
01:02:41.060 And you can have a perceptual primer.
01:02:43.060 You can have a semantic one, like we were
01:02:44.940 doing with the phrases that you're using, that
01:02:47.880 affective one or the emotional one.
01:02:50.360 Give us some examples, because the audience
01:02:53.700 needs examples.
01:02:54.380 Well, so for instance, let's just think of a semantic one; a semantic one is one that gives you the joy of getting it.
01:03:01.500 So I was just looking at an ad not too long ago.
01:03:06.020 And you just mentioned the phrase dad joke.
01:03:08.720 This, this ad was created by a Durex, you
01:03:11.480 know, the, the condom company, and it only
01:03:14.240 had words on it.
01:03:15.600 I love ads that only have words because they
01:03:17.440 really have to make you think.
01:03:19.340 And the words said, I don't need a condom.
01:03:22.440 And the conclusion was, our favorite dad joke.
01:03:26.220 And this comes from the Durex company.
01:03:30.060 I don't need a condom.
01:03:31.820 Favorite dad joke.
01:03:32.780 Okay.
01:03:33.080 It took me a while to get it.
01:03:34.100 All right.
01:03:34.620 But it does.
01:03:35.480 So it takes just a moment.
01:03:36.480 And then you think about it's our favorite
01:03:38.120 dad joke.
01:03:38.980 And it gives you that small aha moment.
01:03:41.260 And by the way, with the EEG signal, we can calculate the Eureka effect.
01:03:45.840 So we know where to look in the brain and
01:03:47.980 what brainwaves to analyze in order to see
01:03:50.560 if the brain has just experienced that aha
01:03:52.700 moment.
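She doesn't spell out the recipe on air, but as a rough, hypothetical sketch of what "knowing which brainwaves to analyze" can look like, here is one way to estimate band power from a single EEG channel with SciPy; the sampling rate, the 30 to 50 Hz band, and the random placeholder data are assumptions for illustration, not her lab's actual pipeline.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal: np.ndarray, fs: float, lo: float, hi: float) -> float:
    """Estimate power of `signal` in the [lo, hi] Hz band via Welch's method.
    Illustrative only: real EEG work adds artifact rejection, re-referencing,
    and baseline correction, all omitted here.
    """
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 1024))
    mask = (freqs >= lo) & (freqs <= hi)
    return float(np.trapz(psd[mask], freqs[mask]))

# Hypothetical single-channel recording sampled at 256 Hz.
fs = 256
rng = np.random.default_rng(0)
before = rng.standard_normal(2 * fs)  # two seconds before the stimulus
after = rng.standard_normal(2 * fs)   # two seconds after the stimulus

# Compare power in a 30-50 Hz band, one band the insight literature has
# sometimes linked to "aha" moments, before versus after the stimulus.
print(band_power(after, fs, 30, 50) / band_power(before, fs, 30, 50))
```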
01:03:53.440 Yes.
01:03:53.880 But, uh, bring it back to the priming on the Durex example.
01:03:58.240 I get that.
01:03:59.560 That's memorable, but is that priming?
01:04:01.820 What's the priming part?
01:04:02.960 That would be the priming.
01:04:03.940 If let's just say I was in a business
01:04:05.380 presentation and I wanted to get the brain
01:04:07.820 ready.
01:04:08.260 So, because that was just such an
01:04:10.020 incongruous moment.
01:04:11.100 So something that maybe people did not
01:04:12.980 expect.
01:04:13.480 Now I'm ready for what happens next.
01:04:15.680 I'm willing to sacrifice one stimulus in favor of what happens immediately after.
01:04:20.560 And that's an example of a semantic one
01:04:22.580 because I, you have to think about the
01:04:24.000 meaning and also an affective one because
01:04:26.120 it's a bit more intense than your typical
01:04:27.820 business stimulus.
01:04:29.220 So is priming in this sense, it's more
01:04:31.740 than just giving you, let's say a
01:04:33.300 foreshadowing of the specific content, but
01:04:36.000 rather it's just going to, is it just
01:04:37.680 rebooting your brain?
01:04:38.700 So you're open to anything?
01:04:40.240 Exactly.
01:04:40.760 So you're in a ready state.
01:04:42.000 The reason we pay such little attention
01:04:44.640 quite often in business context, and we
01:04:46.980 forget most of the things that we're
01:04:48.960 exposed to, we forget our lives almost as
01:04:51.480 quickly as we live them, is because we
01:04:54.020 are not in a state that is ready to
01:04:56.260 receive.
01:04:57.740 Would it be fair to say that people sort
01:05:00.320 of get an inertia of thinking?
01:05:02.620 In other words, they're driving their logical train right down the tracks, and you've got to derail the train before you can get them to think that anything besides the train going down the tracks is going to happen.
01:05:15.920 Exactly.
01:05:16.880 Something like that?
01:05:17.740 Yeah.
01:05:17.960 So in terms of practical guidelines, if you
01:05:19.680 think about this notion of habituation, which
01:05:21.760 is what you just said, if you're showing
01:05:24.020 people, let's just say that you're in a
01:05:25.200 business context, and slide one will have
01:05:27.040 some text and charts, and slide two will
01:05:28.700 have more text and charts, and slides three
01:05:30.300 will have more text and charts.
01:05:32.320 After a while, your brain starts learning
01:05:34.600 to predict what happens next.
01:05:36.320 What's the likelihood that the next slide is
01:05:37.980 also going to have some text and charts?
01:05:39.860 But if you jolt the brain out of its
01:05:42.460 habituation with one of these primers we're
01:05:44.760 talking about, so whether it's something
01:05:46.240 perceptual or whether it's, so now you don't
01:05:48.100 have a chart and text, you just have an
01:05:50.280 intense photo, for example, and then you go
01:05:53.100 back to your typical stimulus, the brain has
01:05:56.240 to be jolted somewhat in order to remind it
01:05:59.860 that you cannot predict the next moment.
01:06:03.380 Nice.
01:06:04.420 So unpredictability.
01:06:06.020 So I'm trying to relate this to my hypnosis
01:06:10.260 background.
01:06:11.960 So there's something about making people
01:06:13.880 uncertain and confused, which makes them
01:06:17.460 ready for any certainty that you as a leader
01:06:20.800 or a hypnotist give them.
01:06:22.740 So you want them to make them think they
01:06:24.820 don't know what's going to happen, or they're
01:06:27.140 not in control, and then you give them the
01:06:29.460 answer, because everybody wants certainty.
01:06:32.360 So if they think they have certainty, they
01:06:34.540 don't need any, but if you scramble them, they're
01:06:37.640 going to be looking for certainty, and then you
01:06:39.180 give it to them just in time.
01:06:40.600 Is it similar to that?
01:06:42.280 That's what happened at this take.
01:06:44.040 Definitely similar, and I like how you're
01:06:46.200 associating with this.
01:06:47.260 And of course, the moment that you give the
01:06:48.660 brain a modicum of uncertainty, you can
01:06:51.880 guarantee some attention, because we cannot
01:06:54.080 afford to not know what happens next.
01:06:56.760 If you think about the brain, the most important
01:06:59.040 thing is what happens next.
01:07:01.320 And the reason we enjoy that which is familiar and
01:07:04.900 that which is predictable is because we can
01:07:06.960 better say, this is what will happen next.
01:07:09.780 The moment that you introduce that element of
01:07:11.780 surprise, and surprise, by the way, is always
01:07:14.060 biologically bad for the brain, because what is
01:07:17.020 surprise but a failure to predict?
01:07:19.380 Then, of course, we're attentive, because the
01:07:21.280 difference between what the brain expected and
01:07:23.280 what happens in real life is how the brain
01:07:25.260 learns.
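One toy way to make "surprise is a failure to predict" concrete: information theory scores an event's surprise as the negative log of the probability the observer assigned to it, so the worse the prediction, the bigger the jolt. The slide probabilities below are invented for illustration.

```python
import math

def surprisal(predicted_probability: float) -> float:
    """Shannon surprisal in bits: -log2(p). A toy stand-in for the brain's
    prediction error: near-certain events carry almost no surprise,
    rare ones carry a lot.
    """
    return -math.log2(predicted_probability)

# The habituated audience expects "more text and charts" with p = 0.95;
# an unexpected intense photo might have been given p = 0.01.
print(surprisal(0.95))  # ~0.07 bits: no jolt, attention drifts
print(surprisal(0.01))  # ~6.6 bits: the primer that re-grabs attention
```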
01:07:26.520 I think this is a very important practical
01:07:28.440 guideline for all of us to reflect on, because as
01:07:30.860 you think about the content that you create, or
01:07:32.820 the way that you want to grab other people's
01:07:34.420 attention, often wonder, how do I create those
01:07:37.980 moments where people think something is going to
01:07:40.500 happen, and in fact, something else does surprise.
01:07:43.740 Yes, different from novelty, by the way, because
01:07:46.760 sometimes people treat those elements interchangeably.
01:07:50.100 But novelty is something that you haven't seen or
01:07:52.200 experienced before.
01:07:53.880 Surprise is something that you have experienced
01:07:56.520 before, but did not expect.
01:07:59.020 Like, for example, you have an eggshell, and suddenly
01:08:01.460 it opens, and a baby elephant comes out of it.
01:08:04.480 You have seen the eggshell, you have seen the baby
01:08:06.320 elephant, but they have not combined together.
01:08:09.640 Now, another thing I saw in the book that interests
01:08:12.040 me in particular, because it's based on something I
01:08:14.820 used to do when I gave presentations in my corporate
01:08:17.340 days, before I give a presentation on the boringest
01:08:20.700 subject in the world, which was like the budget, like
01:08:23.480 just impossible to keep people awake during that, I would
01:08:27.020 hand out Tic Tacs, you know, the little lozenge thing, the
01:08:31.280 Tic Tac, and everybody will take a Tic Tac, so that's the
01:08:34.920 first thing I learned.
01:08:35.700 You say, pass these around.
01:08:37.220 Everybody takes one.
01:08:38.420 Like, nobody says no to a Tic Tac.
01:08:40.200 That's the weird thing.
01:08:40.840 But it also gets them physically moving, and to keep
01:08:46.040 somebody awake, if they're chewing, they'll stay awake.
01:08:48.520 People don't fall asleep while their mouth is moving, and so
01:08:52.620 I found that you could actually keep people a little bit
01:08:55.180 more awake by making them move while you were talking to
01:08:58.380 them, and the movement is just eating the Tic Tac.
01:09:01.200 So you said, yours was a deeper, complex thing, but you said
01:09:06.400 something about getting people moving as part of the process.
01:09:10.040 I like that reminder, one of the hottest trends in neuroscience
01:09:13.680 right now is this notion of embodied cognition, because as
01:09:18.360 scientists, we're recognizing more and more that the way we
01:09:21.180 come to know things and perceive things and talk about politics
01:09:24.420 or talk about the boardroom and the bedroom is not by simply
01:09:28.560 looking around us and building some mental representations that
01:09:32.440 are abstract.
01:09:33.460 What we know and what we perceive and what we remember and what
01:09:36.320 we decide on comes at the intersection of the brain and body
01:09:39.740 interacting with the environment, therefore embodied cognition.
01:09:44.100 So whether you give people the Tic Tac to chew, or what we're recommending in business, which is inviting people to physically take notes.
01:09:51.160 I'm hoping that this conversation is useful to you and maybe you have
01:09:54.580 jotted a few things down.
01:09:56.380 The more you physically write, not electronically write, the stronger the
01:10:00.280 attention and the memory for those segments.
01:10:02.380 If you have customers, invite them to an experience center if your products
01:10:06.480 support you to do that.
01:10:08.080 But yes, getting the brain and body in motion will definitely contribute to your
01:10:12.720 attention and memorability factor.
01:10:15.320 Here's a practical tip based on that.
01:10:17.900 When I would study back in my school days, I would try to take the same
01:10:23.460 information in as many ways as possible.
01:10:25.760 So I'd read it, but I'd also rewrite it.
01:10:29.520 Sometimes I'd draw a picture because the process of turning the idea into a
01:10:35.000 picture really solidifies it because you'll remember the picture and then you'll
01:10:38.980 work backwards to what it was.
01:10:40.740 But I would also sometimes sing it or hum it.
01:10:44.120 I'd be like, you know, two plus two equals three.
01:10:47.300 And I'd be drawing it and stuff.
01:10:48.340 So I would try to get as many physical bodily connections to the stuff.
01:10:54.600 And it really made a difference, I think.
01:10:56.620 So it's a smart technique.
01:10:57.980 I wish more people used it for themselves.
01:11:00.120 If you're here in a session like this, because you want to coach others, perhaps even your
01:11:03.940 children, teach them how to keep the body in motion.
01:11:06.960 We are at a strange point in our society because we're used to having so many things
01:11:11.460 come to us.
01:11:12.220 Like the food comes to you.
01:11:13.620 You no longer have to do that.
01:11:15.120 Your friends come to you.
01:11:16.280 You can meet them online.
01:11:17.400 But the moment that you put the body in motion, the brain is put in motion as well.
01:11:22.140 And the opposite is true as well.
01:11:23.620 You keep your body still, then your cognition slows down also.
01:11:28.920 That makes sense.
01:11:30.200 So, you know, the amount of information in this book is kind of, it's amazing.
01:11:38.640 So I'm going to have to spend a lot more time looking at it.
01:11:42.040 But there were, in the end, there's like a whole checklist of things that you could have
01:11:47.200 done in your presentation to have made it better.
01:11:49.740 And I won't read them all.
01:11:50.880 But you've done something that, as far as I know, has never been done, which is you've
01:11:57.240 captured the biological markers for attention.
01:12:00.800 And it seems to me that there's some AI companies that's going to give you a billion dollars for
01:12:06.000 that because nobody else has it, right?
01:12:08.980 If you could teach your AI to make you a PowerPoint slide or a video, and then one of the databases
01:12:15.920 they had access to was, how does this affect people biologically?
01:12:20.240 Where do I put the image?
01:12:21.560 How often do I mix it up?
01:12:23.160 When do I surprise them?
01:12:24.400 When do I give them novelty?
01:12:26.300 That's all something an AI could do if it had the data, couldn't it?
01:12:31.080 If they had the data.
01:12:32.380 And you're absolutely right.
01:12:34.100 We're sitting in a good position to have some biological markers for attention, for working
01:12:39.160 memory, for motivation to keep watching.
01:12:42.000 Because it's one thing to watch something and pay attention for two minutes.
01:12:45.020 It's another.
01:12:45.540 We know it's possible.
01:12:46.680 But it's another to pay attention for 30 minutes, for hours on end.
01:12:50.100 So you have to be motivated enough and have the biological endurance to do that.
01:12:55.260 They're looking at how much the brain enjoys the experience and how alert and awake it is.
01:13:00.540 And we do that by going beyond self-reports.
01:13:03.420 Because you could ask your audience, what attracted your attention?
01:13:06.380 What kept you going?
01:13:07.660 What did you like?
01:13:08.560 What did you not like?
01:13:09.320 And people will tell you.
01:13:10.240 But quite often, those self-reports are unreliable.
01:13:13.240 So as AI models get trained, are they getting trained on survey data that quite often relies on memory that is fallible?
01:13:21.560 So it's good to use these kinds of signals and also debunk some myths.
01:13:25.540 Like, for instance, it surprised me that complexity is actually more of an attention grabber than
01:13:31.240 simplicity is.
01:13:33.380 Well, that's a double-edged sword.
01:13:36.700 I don't know if you heard my earlier presentation.
01:13:39.640 But you can also hide all your fuckery in complexity.
01:13:44.140 So it's a place to divert somebody's attention until they can't find the needle in the haystack.
01:13:53.520 But I can also see how complexity would attract you.
01:13:57.340 Because you want to unravel it.
01:13:59.480 It's like, oh, what does this mean?
01:14:00.660 I'm going to spend a little time unraveling this.
01:14:02.640 I can see how you work both ways.
01:14:04.880 So true.
01:14:05.740 And I really like your viewpoint.
01:14:07.420 And I think, especially from a scientific perspective, it's good to define the terms.
01:14:11.260 Because there's complexity, there's complication, and there's chaos.
01:14:16.000 And often, I think, in the latter two, you could probably hide a few more things.
01:14:20.640 But complexity, as long as you present to the brain some items that are not only large in volume,
01:14:25.660 but also diverse and interconnected, and you add some meaning to this complexity that keeps the brain going,
01:14:33.180 and you manage that complexity well, then the brain synchronizes a lot better with a complex stimulus
01:14:38.800 than with a simple and quite often simplistic one.
01:14:43.440 So if we turn this into a practical guideline, I would say complexify in a manageable way versus simplify your communication.
01:14:50.740 Got it.
01:14:51.020 So give us, let's say, what would you say if you were going to give somebody, say, the best operational tip?
01:15:03.120 What's the one thing out of everything you learn from the studies, the sensors you put on people and your data?
01:15:09.620 If you're going to make a PowerPoint or a video or something, what's the one thing you're going to make sure you absolutely do?
01:15:15.900 Well, here's something that you can reflect on as we reach the end of the conversation, and it will make you think.
01:15:24.800 It will not necessarily be easy to implement, but it will be very helpful.
01:15:29.320 I found through all this research that one concept that comes up again and again is this notion of fractals.
01:15:35.280 Are you familiar with fractals?
01:15:36.960 If you insist on some elements that maintain their properties at any level of magnification,
01:15:41.360 like if you look at a tree, the shape of the tree is represented in any kind of tree branch.
01:15:46.860 It has the same shape as the bigger tree.
01:15:48.940 And if you look at a smaller branch yet, it still maintains the same properties as let's call it the father tree.
01:15:54.000 Or if you go to the grocery store, look at a head of broccoli, you'll see that the broccoli itself has some properties.
01:15:59.580 And then as you zoom in, every single little head of broccoli has the same properties, or a cauliflower does.
01:16:06.100 So it's cauliflower all the way down.
01:16:09.140 And as you reflect on your communication and the way that you want to attract attention and remain memorable,
01:16:14.440 pick a theme, something that repeats at any level of magnification.
01:16:18.640 And the moment that you have that set of properties and you keep going again and again and again,
01:16:23.560 repetition, by the way, is a sort of priming.
01:16:26.180 Then as long as you make it that it's cauliflower all the way down,
01:16:29.400 it doesn't matter if you're speaking to your audiences for two minutes, for two weeks, for two years.
01:16:34.460 If you come back to the same theme and you make friends with fractals,
01:16:38.360 you will be able to keep people's attention for a long time.
01:16:42.100 I think politicians do this very well.
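For anyone who wants the tree metaphor made literal, here is a toy sketch; the theme string and branching numbers are made up, and nothing here comes from the book beyond the self-similarity idea itself.

```python
def theme_tree(theme: str, depth: int, branches: int = 2, level: int = 0) -> None:
    """Print a fractal outline: every sub-point restates the parent theme
    at a smaller scale, so any zoom level shows the same structure,
    "cauliflower all the way down."
    """
    print("  " * level + theme)
    if depth == 0:
        return
    for i in range(1, branches + 1):
        theme_tree(f"{theme} / supporting point {i}", depth - 1, branches, level + 1)

# A hypothetical talk whose repeating theme is the one from earlier in
# this episode: bad stuff hides in complexity.
theme_tree("bad stuff hides in complexity", depth=2)
```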
01:16:44.060 So fractals is a tough word to wrap my head around.
01:16:47.700 But let me ask you if I did it right.
01:16:50.780 So today when I did my live stream before we talked,
01:16:53.920 I said I had a theme that is that the Democrats were hiding things in complexity.
01:16:59.320 And then I gave five different stories where there were some complexity and maybe some bad stuff was hidden in there.
01:17:06.680 Is that a fractal?
01:17:08.240 That is definitely a fractal because if you were to then expand on those themes,
01:17:13.880 we could be here for another five minutes.
01:17:16.600 We could be here for another five hours.
01:17:18.860 And it would be the same set of properties, the same sort of equations, for instance.
01:17:25.060 So, yes, pick your theme and some supporting points.
01:17:27.860 And as long as you're very consistent with those and you believe in them
01:17:31.160 and they have great impact on society, too, that's always very, very helpful.
01:17:35.980 Then you can keep the brain going for a while because you're giving it the best of both worlds.
01:17:40.300 You're giving it some familiarity because it's the same repetition at any level of magnification.
01:17:45.060 But with each elaboration, you can add some of those elements of surprises.
01:17:49.140 You can complexify and you can keep the brain motivated to stay with you for a while.
01:17:54.340 I'll tell you, my biggest pet peeve is when somebody wants to explain something to me
01:17:59.600 and they're waiting to give me the answer at the end.
01:18:03.300 And my brain can't handle that.
01:18:05.000 So it would be like Josue, my builder handyman here.
01:18:11.020 He'll say, do you have a minute?
01:18:12.880 He'll say, you know, over by the eaves, there's this board that covers the thing
01:18:18.660 and then there's the corner and I'm like, where is this going?
01:18:21.620 Where is it going?
01:18:22.440 And the answer is, you know, there's a leak and I can fix the leak.
01:18:27.360 So what I want to hear is I can fix the leak.
01:18:30.400 And then after I hear that, every detail he gives after that is now salient.
01:18:35.340 I was like, oh, OK, so I'm understanding because the boards come together.
01:18:38.760 That's why there's a leak.
01:18:39.700 But if you start with, OK, the boards come together, I'm like, I have no structure.
01:18:45.400 I have nothing to pin it to.
01:18:47.120 So is that close to what you're talking about?
01:18:49.180 I really like what you're saying in terms of elaboration because that's what builds up the complexity.
01:18:54.920 And especially in business, in our personal relationships as well, you have to earn the right for that elaboration.
01:19:00.240 And in your relationship with the plumber, that may be a little bit different because if your plumber had been Elon Musk and he started with the details at first and something that maybe initially didn't make sense, because your relationship is probably different, he has earned the right to elaborate first and then give you the...
01:19:18.440 No, no, I'd be like, Elon, no, get to the point, get to the point.
01:19:24.480 But I think what's important for people to recognize is that sometimes we are, let's say, in a business context and they may have heard the guideline, well, tell a story, because a story is just so attention grabbing and memorable.
01:19:36.820 And that's not the case.
01:19:37.640 You have to earn the right for those details and you have to earn the right for the story.
01:19:41.920 Sometimes you do have to start with a conclusion depending on your rapport.
01:19:45.580 So if you start with a conclusion, you need to back it up, but should it also be, like, a surprising conclusion?
01:19:55.920 In other words, you should have some novelty as well, such as, I'm going to show you over the next 40 minutes that everything you knew was wrong.
01:20:07.280 Would that be a good one?
01:20:08.420 Yeah, so you can start with something that challenges the status quo, challenges a personal norm, challenges the way that you're thinking in some way.
01:20:20.240 Not that many business people know how to do this, by the way.
01:20:23.240 They don't want to start by discarding accuracy.
01:20:26.360 They want to be precise and correct.
01:20:28.340 You're talking about those budget meetings.
01:20:30.280 You don't want to disregard accuracy in a budget meeting.
01:20:34.060 But why not?
01:20:35.900 In the book, by the way, you'll learn about this notion of exaggeration and discarding accuracy whenever you can.
01:20:43.760 In a way, that's mindful and playful.
01:20:46.140 So as you think about your hooks, for instance, sure, some unusual stats could do the trick.
01:20:50.720 But what about an unusual photo, or something that you did not see, did not predict, like we were talking about earlier?
01:21:01.320 So I'll just tell the viewers that several years ago you helped me build a presentation, a slide deck that went with my public speeches.
01:21:11.960 I was giving corporate speeches.
01:21:13.960 And I had an experience with that that was so uniform.
01:21:18.000 I'd give my talk, and usually it was a group of people who were there because they wanted to hear me.
01:21:23.700 So, you know, they were a friendly crowd.
01:21:25.640 Everything would go great.
01:21:26.960 And then afterwards, maybe I'd sign some books or say hi to people.
01:21:30.920 And people would come up to me, and I swear to God, every group said the same thing.
01:21:36.240 Who made your slide deck?
01:21:38.120 Like, who did that?
01:21:39.880 Because it was so, so different than anything that I'd ever done or even seen, really.
01:21:46.700 So I can confirm that the science that you were putting into it even a few years ago, before you did this book, was already completely lighting brains on fire.
01:22:00.420 Like, people were just like, what did I just see?
01:22:03.660 And they would talk about it, and they would come, they would act.
01:22:06.780 That's the ultimate test, getting somebody to act.
01:22:11.500 And they all acted the same way.
01:22:13.300 Where did you do this?
01:22:14.480 How did you do that?
01:22:15.900 And, you know, I would tell them that you helped me, et cetera.
01:22:18.540 So I recommend this book, Made You Look.
01:22:23.960 Carmen Simon, PhD.
01:22:25.320 And thank you so much for joining us, Carmen, and making us smarter.
01:22:33.060 And anybody who reads this book is going to do a lot better in their presentations, in getting people's attention, and basically in life in general.
01:22:40.960 So even if you're not in this kind of business, you might be interested in the academic, intellectual question of why something works and something else doesn't.
01:22:49.440 It's really a fascinating ride inside the brain.
01:22:54.340 It feels like taking a trip inside a brain.
01:22:57.300 That's what it feels like.
01:22:58.940 Thank you so much for all the kind words, and thank you for all the attention that we got here in the chat box.
01:23:03.980 And keep in mind that if you do attract attention, you can stay on people's minds, and when you do, you will live longer.
01:23:09.960 All right.
01:23:10.580 Great.
01:23:10.900 Thanks for joining, and I'll talk to you later.
01:23:13.740 And everybody, thanks for joining, and I'll see you tomorrow, same time, same place.
01:23:19.440 Bye for now.