Dr Pippa Malmgren on Trump, Social Credit, AI & Virtual Reality
Episode Stats
Length: 1 hour and 4 minutes
Words per minute: 178.8
Harmful content:
Misogyny: 15 sentences flagged
Toxicity: 14 sentences flagged
Hate speech: 7 sentences flagged
Summary
In this episode of TRIGGERnometry, our first returning guest, Dr. Pippa Malmgren, talks to us about the Mueller investigation, the Trump administration, impeachment and much, much more. She's a former advisor to U.S. President George W. Bush and the co-founder of H-Robotics, a company that makes commercial drones.
Transcript
00:00:00.000
hello and welcome to triggernometry i'm francis foster i'm konstantin kisin and this is the
00:00:12.360
show for you if you're bored of people arguing on the internet over subjects they know nothing about
00:00:17.200
at triggernometry we don't pretend to be the experts we ask the experts our fantastic expert guest this
00:00:23.700
week is a returning guest to triggernometry the first returning guest to triggernometry
00:00:27.480
She's a former advisor to U.S. President and the founder of H-Robotics, Dr. Pippa Malmgren.
00:00:50.820
It's Swedish, it's got too many consonants, nobody can...
00:00:53.340
Yeah, I struggle, to be honest, but it's so great to have you back.
00:00:56.200
We're really, really pleased you've come back to talk to us.
00:01:00.300
And for anyone who didn't watch the first interview, just remind everybody who you are,
00:01:04.200
what's been your journey through life, kind of, you know, who you are, where you are.
00:01:07.140
Oh, well, bottom line is, look, I'm an economist.
00:01:10.040
So what I try to do is help shape economic policy by advising governments.
00:01:14.080
So I'm currently a non-executive director of the British government on the Brexit and trade policy issues.
00:01:25.580
coming out. And then I kind of got tired of just being an economist who talked about the world
00:01:32.920
economy and decided to start building it. So I have a robotics company that I co-founded.
00:01:38.320
As you do, exactly. And that's one of the reasons we want, there's loads of reasons we wanted to
00:01:42.800
speak to you because last time we talked about all kinds of things. But there was a particular
00:01:46.240
moment of, I think, about 40 minutes in when you just started talking about technology and
00:01:51.860
Francis's head just started to explode a little bit and so did mine and we were just like we have
00:01:57.300
to come back and talk about this for a good good portion of time but before we get to that there
00:02:04.440
is something that's been going on with Donald Trump recently that we just wanted to get your
00:02:08.220
take on yeah one or two things yeah and obviously by the time this goes out a couple of weeks from
00:02:14.340
now he might not be president anymore right that's a possibility is he going to be impeached do you
00:02:48.360
And that's why everybody says, let's not do it.
00:02:54.520
They love grabbing on to the ankle of a bad guy and then hauling him down to jail.
00:03:04.260
And we've seen the Mueller investigation dragging on, not that many charges brought, to be fair, considering how much time.
00:03:12.100
And I personally think the real game in town is not Mueller.
00:03:15.680
It's Barbara Underwood, who is the state attorney general in New York.
00:03:20.180
And they have been very aggressive, saying they are coming after the Trump family, the Trump companies.
00:03:26.620
That matters because you've got to remember, in the Mueller case, really, theoretically, the president can pardon everyone, including himself.
00:03:36.640
Whereas the state level is totally different from the federal level, and the president has zero capacity to pardon anyone.
00:03:44.080
And the penalty for a New York state charge is not federal penitentiary.
00:03:50.840
That's a white-collar prison with a golf course and tennis courts.
00:03:54.760
New York state penitentiary is almost not survivable for a white-collar criminal.
00:04:00.140
So it's a—I mean, it's viewed in Washington as literally a death sentence.
00:04:05.840
So this is the negotiating point, and that's why I tell everybody, watch Barbara Underwood
00:04:39.220
Would you be worried with everything that's going on?
00:04:43.240
Bear in mind, in terms of front, he almost seems untouchable to the outside eye
00:04:49.900
because everybody else gets arrested, everybody else goes down,
00:04:57.400
Just like Ronald Reagan, who I worked for as a kid right out of college.
00:05:03.800
He also had charm, which is absent in this case.
00:05:06.640
But at any rate, you know, who knows what's going on inside of Trump's head?
00:05:11.820
I mean, we know a little more because of Bob Woodward's book, and he taped everything.
00:05:17.720
The way to think about Trump is he's a property guy who wants to win the deal and get the prize.
00:05:24.500
So, you know, I've talked a lot about his interest in the North Korea deal.
00:05:29.340
It's all about—he may not get the Nobel, but if we get resolution between North and South Korea,
00:05:34.620
The leaders of those two countries are going to get the Nobel, and he'll be basking in the aura of that.
00:05:43.240
I think that he's got all kinds of things like this where he sees a prize.
00:05:48.260
And as long as he gets those, then he feels like, whatever, you can do what you like, but I'm going to win anyway.
00:05:55.080
Including, you know, I would keep a little eye on any announcements out of the U.S. space station and NASA
00:06:03.220
that we might even aim to do a moonshot during the Trump administration.
00:06:09.460
And I know, amazing as that sounds, but that would be so up his alley to create a—
00:06:20.180
Getting a man back up in space, or a woman maybe this time.
00:06:26.420
I'm going to do this, and it's going to make history,
00:06:35.340
Well, I think most people would prefer him to be in space.
00:06:37.760
But, you know, I'm not a big fan of Donald Trump,
00:06:40.700
but I would love for him to get the Nobel Peace Prize
00:06:43.540
just to see how badly everyone gets triggered by this.
00:06:50.260
Well, it would change the whole perception of the peace prize.
00:06:53.000
But listen, your book is about leadership, the Leadership Lab.
00:06:56.260
What do you think is kind of happening with Donald Trump vis-a-vis this leadership?
00:07:02.520
Why has America opted for him, and what's going on there?
00:07:06.040
I have to say, I just spent a month in the U.S., different parts of America.
00:07:13.060
What I felt like was, oh, my God, Cramer, the television show on CNBC, and Shark Tank.
00:07:26.300
That used to be just one television program that was kind of for people who were extreme.
00:07:36.480
I mean, literally, this kind of attitude is really heavy on everybody.
00:07:40.440
Everybody wears it on their sleeve, the attitude.
00:07:42.940
So I wonder, I'm like, he's kind of a creation of our own making in many respects
00:07:47.760
because the whole country behaves like this now.
00:07:50.500
What I also detect, and I've heard the private polls that the politicians have, which are run by companies that are what they call purple—they're kind of not red, they're not blue, they're genuinely neutral—they're all showing that there's a massive backlash.
00:08:08.820
but it's particularly female voters, and they don't care anymore about public policy stances.
00:08:15.580
They care about having someone their children can watch without being embarrassed.
00:08:21.060
It's literally like, don't even talk to me about public policy, politics, law, strategy.
00:08:26.960
I want a person that my child can turn the television on, and I don't have to explain what they're saying.
00:08:32.580
And I think that this is one of the reasons we're seeing a record number of female and minority political candidates at every level of government.
00:08:41.100
And they're starting to get voted in because people don't care that they don't have any experience.
00:08:45.840
Ironically, the same motivation they had for electing Trump, which is, I don't care if they have experience.
00:08:51.900
I want this guy to take the government down from the inside.
00:08:55.880
We have the exact same view, just now we want a different person to take it down from the inside.
00:09:00.220
So the hostility of the public towards the federal government remains very, very high.
00:09:07.200
And I would argue it has been that way, because that was partly how Obama got elected, right?
00:09:13.820
And a lot of people felt he didn't deliver as much radical change as they wanted,
00:09:17.320
so they went for something even more radical, in a sense.
00:09:22.380
And, by the way, I'll finish with one last thing, you know,
00:09:24.660
we have a long tradition of electing very outside candidates, right?
00:09:29.540
Like, nobody ever heard of Bill Clinton two years before he became president.
00:09:33.580
Nobody heard of Obama two years before he became president.
00:09:36.520
The president I worked for, George W., everybody's like,
00:09:38.740
don't be ridiculous, that's such an outside possibility, he'll never win.
00:09:43.620
So I would put a lot of money on the person you never heard of right now.
00:09:49.860
And would you say he's a good leader, Trump, in what he does?
00:09:52.600
Because, I mean, if you look at the figures economically, they're not doing too badly.
00:10:02.120
Yeah, well, all of it. It's mine. It's all mine.
00:10:15.840
I wrote about how the economy was absolutely improving
00:10:22.680
and jobs were moving from China back to the U.S.
00:10:29.060
So the real issue, I think, is, yes, the economy is getting better.
00:10:34.580
And that's a big problem in politics because the Democrats now not only don't have a candidate, they don't have a story.
00:10:41.400
The story used to be the economy is up the creek.
00:10:49.260
And I think it's actually going to keep getting stronger.
00:10:52.200
I think I'm the only bull left in the entire financial markets because everybody's like, the crash is coming.
00:11:03.900
People are really genuinely prepared for a meltdown on an epic scale.
00:11:08.780
Because we're booming, therefore the bust is coming.
00:11:10.880
It has to be, because we've had the longest bull run in history, so that can't last.
00:11:17.220
And we've just seen fairly recently Amazon become one of the first companies to hit a trillion-dollar valuation.
00:11:27.240
I think Amazon hitting a trillion is the beginning of a whole new era of valuations, that we are in a whole new category.
00:11:35.500
And that's partly because we still have $20 trillion from quantitative easing left in the system that now is coming off the sidelines because investors are going, wait a minute, inflation is definitely picking up.
00:11:47.760
It's no longer that, you know, every time you go out for dinner, you're like, what?
00:11:53.260
I mean, everyone palpably knows prices are higher.
00:11:56.680
But the data is now really reflecting that it's higher.
00:11:59.880
So investors know you lose money every day that you have inflation if you have cash.
00:12:04.140
So they're going, ah, I've got to get out of cash.
00:12:07.520
If you have inflation, you're not going to buy bonds.
00:12:10.060
So you're going to buy equities, what they call private equity, which means private investment
00:12:14.760
and private businesses, or hard assets like infrastructure.
00:12:23.420
So I actually think that we should be preparing for several more years of stronger performance rather than a sudden crash.
00:12:33.000
Well, no, not necessarily, because you can have the economy performing well and have so turned off the public, even in your own team, that everybody goes, basta, enough.
00:12:44.980
Has that happened in the history of American elections?
00:12:47.000
I haven't gone back and done the numbers, but I would say it would be hugely exceptional to have a strong economy and get chucked out.
00:12:55.800
But if you're hugely obnoxious, maybe this is possible.
00:13:00.660
Well, it's just like your former boss is a good example.
00:13:07.680
Tony Blair was deeply unpopular for going into Iraq, but they both got reelected because the economy was doing well, right?
00:13:13.440
And so the question is, is that decision about Iraq less offensive to people than what Trump has done?
00:13:26.040
And what does that say about all of us and our interest in people affected by our own decisions?
00:13:33.760
When you put it that way, that is pretty crazy.
00:13:35.620
I mean, you know, the war in Iraq killed at least a million people.
00:13:42.860
You know, a lot of Republicans like myself, we wanted someone who was going to, you know, push the establishment, shake it up a bit, make government smaller, reduce taxes, hopefully rein in the spending.
00:13:58.640
You know, we like this as libertarians, but what we weren't really banking on was someone who would do it in such a way as to create enemies at every turn and send messages to the public that unless you're in my particular category, you're a second-class citizen, which is totally inconsistent with these core values of promoting individual freedoms versus centralized government.
00:14:28.320
So now there's this awful situation that you've got a lot of Republicans who are like,
00:14:32.000
I like the philosophy, but I can't handle the delivery.
00:14:37.560
And I think this, again, is going to be really hard,
00:14:40.280
because now what I'm arguing is that we're going to see a Republican go up against the president,
00:14:50.240
I'm sure it has, but it's usually a dead end.
00:14:57.740
Right. Maybe not. And then I haven't got into the whole business of the real possibility that Trump doesn't run.
00:15:04.080
He voluntarily doesn't run. He announces I'm going to step down, but he'll never say step down.
00:15:09.500
What he'll say is I have bigger plans. And my bigger plan is I am going to launch what they're calling TNN.
00:15:15.680
It's the Trump version of CNN. It is the Trump platform that will have everything from reality TV to political talk shows.
00:15:28.780
And my guess is, if that's where he wants to go, I think he does want to go there.
00:15:32.700
I think he was planning for that when he ran for the presidency.
00:15:38.560
But I think he's probably already raised a good chunk of change to do that
00:15:41.740
and wouldn't have a hard time raising money to do it if he leaves.
00:15:45.520
And actually, the politicians on the left and the right might say,
00:15:48.700
you know what, that would be an excellent solution
00:15:51.320
because then nobody has to pull the trigger on an impeachment.
00:15:54.820
We bargain with him that if he goes, we don't prosecute him.
00:15:59.560
The New York State Attorney General vaguely says, yeah.
00:16:04.080
And then as soon as he's out, begins prosecution because they're not going to hold back.
00:16:12.860
It's not a public fight of the president in the Oval Office.
00:16:16.860
It's a private fight of a former president about his businesses.
00:16:20.580
and Donald if you're watching we're massive fans and we would love to be on TNN
00:16:25.600
yeah I'm sure he's watching me that's what's happening but to be fair the TNN thing
00:16:30.820
don't forget there's a huge component of the American public who do think in this way and
00:16:36.980
will like what he's doing oh absolutely you know and you can see that his his popularity approval
00:16:42.380
you know it remains like 48 48 48 no matter what happens and so we can't be too dismissive of that
00:16:49.480
side they're real and they're not going to go away let's move on to technology you've got a
00:17:03.460
whole chapter in your book which talks about that and like i said last time we were absolutely
00:17:08.260
fascinated by it and you know what i was thinking you and i met two years ago at kilkenomics in
00:17:13.300
ireland doing comedy well yeah i was doing you were doing the same thing i was doing comedy
00:17:18.300
and we were chatting in the bar after one of the shows that we were on and you were telling me
00:17:25.500
about this new thing in china where they're going to have this system where they're going to track
00:17:29.880
everybody's behavior and all this stuff and i'll be honest with you pippa i really like you but i
00:17:34.740
was sitting there listening to you going she's a little bit crazy she could be a little bit crazy
00:17:41.040
Maybe she smoked a lot of weed when she was a student.
00:17:51.360
And that's what you were talking to us about last time you were on the show.
00:18:00.340
Okay, so let's just quickly go through what is already real.
00:18:05.220
If you jaywalk (not in all of China, they've only rolled it out in one city),
00:18:10.600
If you jaywalk, it clocks that you've done it because the cameras are ubiquitous now.
00:18:16.360
So the facial recognition is incredibly strong.
00:18:19.220
They have the most valuable artificial intelligence startup in the world.
00:18:22.320
In fact, the most valuable startup in the world called SenseTime.
00:18:25.380
And it can recognize an individual out of a crowd of 10,000.
00:18:28.780
And it can recognize your emotional state at any given time.
00:18:34.940
Next thing you know, you pick your mobile phone up.
00:18:36.960
And the fine for having jaywalked is already in your text messages, if not already deducted from your bank account.
00:18:44.540
And your name and or your government number is already broadcast on the OLED screen that's above the intersection, the nearest intersection.
00:18:53.880
So you've now been broadcast to everyone nearby that you are bad and you just violated the law.
00:18:58.800
Now, this is important because what it does is affect, effectively, your personal Uber score.
00:19:03.400
The social credit system is based on the idea that you're given a score, which reflects your social compliance.
00:19:10.940
So if you Google stuff that they don't want you looking at, your score goes down.
00:19:15.720
If your brother or your sister does it, your score goes down.
00:19:21.600
Because Mao always said the best eyes and watchers are not the government; it's getting everybody to report on each other.
00:19:29.480
And this is a kind of a, I mean, the Stasi would love this system, but it's bigger than that.
00:19:36.280
Within the last few weeks, it's been announced that when you tap, swipe, and pinch on your mobile device,
00:19:44.540
it's a better indicator of who you are than your thumbprint.
00:19:48.200
So now, even if you're using someone else's phone, they know it's you.
00:19:54.600
It turns out your walking gait is also a better indicator of who you are than your thumbprint.
00:20:01.400
So what they're doing is taking all these things and triangulating.
00:20:05.940
Think about it as previously independent silos of data.
00:20:09.800
Now they triangulate using artificial intelligence to pull it all together.
00:20:16.540
And recently they arrested someone at a rock concert of like 60,000 people because, again,
00:20:21.660
The face went by one camera, bang it clocks, this person is wanted, and they went to the person's seat, you know, the debit card, yep, the ticket, yep, they're arrested.
00:20:32.800
I think they now have 11 million people who are, you can buy a train ticket or a plane ticket, but you can't board it.
00:20:44.000
No, no, so they're putting you into digital prisons.
00:20:46.820
So, I think one of the spouses of one of the dissenters, she's, like, locked into a very small, few-block area around her house.
00:20:58.640
And any time she tries to move beyond it, the officials are alerted and the police will be right there.
00:21:03.700
So, she literally is stuck in a digital prison.
00:21:09.800
And it rewards you if you do things that are supportive of what government's interested in.
00:21:28.140
And I don't know if you saw, but there's been an announcement
00:21:31.120
that there's been a deal between, it seems, Google and MasterCard.
00:21:37.840
So now if I go to, you know, I don't know, a department store
00:21:41.940
and I buy this color lipstick, Google's going to know, and now they know what they're going to
00:21:47.420
advertise to me. So I'm concerned about this, and I write a lot about this in my book, because
00:21:53.780
imagine what we've got. It's a world where you are emitting data points about yourself all the time
00:22:00.140
without even knowing it, because you've turned all the fitness apps off on your phone, but the fact
00:22:04.780
is the way you walk is revealing a lot about you, including, by the way, your cardiac condition,
00:22:10.200
a lot about your health is revealed by how you walk. So this multiplicity of data points that
00:22:15.660
you're throwing off, but you don't know who gets to see it. But whoever does get to see it,
00:22:21.120
they know more about you than you know about yourself. And by the way, it's a two-way thing.
00:22:26.400
If I'm a chief executive and I go on, let's say, this program, and I start talking about my company
00:22:32.180
and I'm lying, this same facial recognition technology can identify your microfacial
00:22:37.860
movements and they know that you're lying and they can set the algorithms to short the stock
00:22:42.620
of your company as you're talking on, say, CNBC's Squawk Box. So think of it as almost like a crystal
00:22:49.160
ball of data points. And on the one side, it will let us conjure forth answers that will do things
00:22:55.360
like solve cancer, literally. We will solve extraordinarily difficult problems by having all
00:23:00.360
this data. But it also changes the balance of power between companies and customers because
00:23:06.480
Because companies will have so much information that I've argued—you know, we used to have
00:23:16.160
If I am MasterCard now or Google, my knowledge of you is so great.
00:23:23.380
They only had 5,000 data points on 81 million people, and that was enough to begin to influence
00:23:30.280
I can sell you way more than a political position.
00:23:32.480
If I have more than that, I can sell you a refrigerator.
00:23:36.180
So I think human beings are very vulnerable to this kind of power being exercised.
00:23:41.200
And similarly, what kind of world do we have if people don't know how they appear?
00:23:45.560
And should we have a right to know how do we look with that data slice looking at us at any given time?
00:23:53.240
So in the book, I've tried to lay this out because we have a lot of leaders who are making decisions about the landscape of our future who don't understand what I'm saying.
00:24:03.500
And so they're literally missing a profound shift in humanity.
00:24:10.780
I mean, this is the future of one of us, right?
00:24:14.220
Nothing I'm telling you is for the future.
00:24:26.480
So, I mean, joking aside, but then that literally means
00:24:30.700
If you're being tracked every single second of every day, if they're reading what you're doing, if they can manipulate you, is that the end of free will eventually?
00:24:41.320
This is the—so what's really interesting to me is when I was in college, I studied political philosophy.
00:24:47.420
And everyone went, oh, my God, you will never get a job.
00:24:50.560
I mean, you are permanently unemployed if you study that.
00:24:53.100
Now we get to this, and I'm like, these are questions of political philosophy.
00:24:58.480
This is about the balance of power between states and citizens, companies and customers, between citizens.
00:25:07.180
It's about the invasion of a person's free will.
00:25:19.920
And the more we develop artificial intelligence, the greater it's going to be.
00:25:23.940
For example, I know a guy who's building an extraordinary company.
00:25:28.940
It is effectively going to place nanochips, which are so small that they can fit inside the body.
00:25:37.860
They are at the level or they're below the level of atoms, right?
00:25:42.660
You're basically into the level of you can see someone's atomic structure, their DNA, their cellular structure, and their organs.
00:25:51.420
because they float around in your body at the nano level.
00:25:55.560
So that means, the good news is, when you get cancer,
00:26:03.400
but then you'll be able to say, you're building up proteins,
00:26:05.820
we see it coming, we can hit this thing, and you'll never get cancer.
00:26:08.660
But the bad news is, you're emitting information
00:26:17.900
But again, the good news is that we'll create a kind of, he calls it a cure chain, the guy
00:26:23.000
who founded it, Steve Papermaster, like blockchain.
00:26:26.820
We'll be able to cure many obscure diseases that only a few people have that right now
00:26:38.060
I mean, the health care companies will know more about your health than you will ever
00:26:43.240
So yes, these are really profound philosophical questions.
00:26:47.700
Well, here's the thing is it's not just health as well
00:26:49.500
because we had an evolutionary psychologist on the show a few weeks back.
00:26:52.620
I don't know if you've caught that episode, Diana Fleischman.
00:26:55.180
And one of the things she was talking about is genetic studies.
00:27:02.600
that there's a very strong genetic component to behavior.
00:27:11.840
you're not just going to be able to tell what disease they might develop
00:27:50.560
They are already using the artificial intelligence
00:27:53.680
in conjunction with this extraordinary data sweep,
00:27:56.780
and they are determining which people look more inclined
00:28:04.240
and then to start corralling them into a corner
00:28:14.580
So again, this is a huge philosophical question, is this question of prejudging, precognition,
00:28:25.620
I ended up going back and reading a book, which I actually have to really strongly recommend,
00:28:29.800
actually an author called Norbert Wiener and a guy called Manuel de Landa.
00:28:35.120
But Norbert Wiener wrote a book called The Human Use of Human Beings,
00:28:39.960
and it's a profound book he wrote in the late 1940s and he's the guy who came up with this
00:28:46.480
word cybernetics which means the interface between humans and machines so 1949 he's writing about how
00:28:56.120
we're going to have this interface and even assuming that machinery will begin to enter our
00:29:00.620
bodies which already it is but that this will be this conversation between humans and machines is
00:29:08.120
going to be central to our society it's so beautifully written it's so clear it's so
00:29:14.320
absent of jargon or anything and it's a wonderful baseline for anybody who's interested in this
00:29:19.120
subject i would say go back and read norbert wiener it's very short he was considered
00:29:24.240
the father of modern artificial intelligence and that's where we got to go back to start at
00:29:29.700
the beginning and do you think it's the way we're moving i know it's a very broad question but do
00:29:34.140
you think it's positive on the whole or do you think this is actually quite worrying and we're
00:29:38.740
heading to a dystopia? I am very positive. My leaning is positive. I think that all this
00:29:46.060
innovation, artificial intelligence, automation and robotics are fundamentally going to solve
00:29:51.560
many more problems than they create. But the problems they create are very serious. So it's
00:29:57.540
a bit like cars, you know, I think cars are marvellous and can they kill thousands of people
00:30:03.320
every year. Yes, they can. So, you know, we don't say, okay, anybody can invent
00:30:10.020
any kind of car they want and just put it on the road. No, we have rules of the road, right? We
00:30:13.740
have manufacturing standards. We have health and safety and we've gotten better at it, right? When
00:30:18.180
I was a kid, that just dates me, but you know, we didn't even, we didn't have seat belts. We didn't
00:30:22.980
have airbags, you know, the seats weren't even fixed. They used to just flop back and forth. We
00:30:26.940
just thought that was normal, you know? So now we have airbags and seat belts. That's a very good
00:30:31.940
thing. We're going to have airbags and seatbelts for the robotic era too, for sure. So I definitely
00:30:38.120
lean on the side that this is hugely beneficial to society. And I think it's massively democratizing
00:30:44.760
because it's putting power in the hands of people who haven't had it before. And everybody always
00:30:50.140
argues with me and says, well, you got to have a PhD from Stanford or Berkeley in order to even
00:30:55.200
play in this game. And I say, no, that's not true any longer. I was recently at a really cool event
00:31:00.460
held by the Founders Forum called AccelerateHER.
00:31:05.660
And it was about promoting women in robotics and business.
00:31:16.020
but never was able to make any money out of it.
00:31:28.220
they're writing about through their app so they get a little piece they get to take I don't know
00:31:33.900
whatever 10 percent 5 percent suddenly you've got women all over the world they tend to be women
00:31:38.720
the women who are making half a million bucks a year from their Instagram account just sitting
00:31:44.580
playing on it because the technology has empowered this so it's not the story that the the insiders
00:31:51.520
have a lock on this anymore it's becoming ubiquitous and I think that's a super beneficial
00:31:56.420
thing to society so then it's a race between the citizen and the state who's
00:32:01.280
got that information and how is it permitted to be used well my concern
00:32:05.600
would be the recent stuff that we've seen, Francis, like Facebook and
00:32:09.540
Twitter you know they have not come out well of recent news stories about them
00:32:14.360
and you look at you know Donald Trump got laughed at when he said that Twitter
00:32:17.900
and Facebook is biased or whatever but actually then Twitter came out yes yes
00:32:22.820
we are yeah we are and we're going to just shut down the accounts of those people that we don't
00:32:27.580
like yes which they're doing well i yeah so this raises to me a really interesting question about
00:32:32.760
why is it we only have one facebook why don't we have more competition why do we have these
00:32:39.420
monopoly positions in these social media arenas it's very interesting i'm not sure i have an
00:32:46.440
answer as to why well the currency is the ubiquity isn't it like everyone is on it that's what makes
00:32:52.180
at Facebook. But why is it that so few of them, and they all serve kind of very different purposes,
00:32:57.300
so they tend to have monopolies over the space that they're in. Where are the competitors? This
00:33:01.860
is super interesting to me why it's harder to create competition in this tech space than it
00:33:07.900
is if it's a grocery store on the corner, but it seems to be. But I do think all of these things,
00:33:13.380
for me, it changes the landscape of leadership. And by that, I'm speaking very broadly. I don't
00:33:19.580
just mean political leaders and business leaders even religious and community leaders i mean all
00:33:24.940
of us have to kind of think about what is the same and what's changed and so what i've argued here
00:33:31.420
with my co-author chris lewis is some things have profoundly changed so one of them is we used to in
00:33:38.140
the 20th century the answer was always in drilling down into greater quantitative detail because
00:33:43.820
somewhere in there there'd be a black box and that's where your answer was actually that didn't
00:33:48.540
really work out so well and the financial crisis i would say arose in large part because of this
00:33:53.100
approach now in the 21st century okay you can drill down if you want but the answer resides
00:33:58.940
in the look across not the analytical thinking but the parenthetical thinking this connecting
00:34:05.180
the dots between the silos seeing the whole landscape or another way of putting it is we
00:34:11.580
used to be all about measuring the math what are the numbers give me the data points now it's
00:34:17.180
equally important or maybe more so to understand the mood, that the facts and the feelings are
00:34:24.420
very different things. And I know we'll get a lot of pushback because people say, well,
00:34:28.740
you can't analyze feelings. And I'm like, well, I'm sorry, you can when you get voted in or out
00:34:33.600
of office. And you feel it when your customers turn on you for no reason that you can quantify,
00:34:39.260
but everybody knows. So it's about using not only your head, but all the parts, your left brain and
00:34:46.100
your right brain and really joined up thinking and also including in your thought process what's
00:34:51.660
the heart aspect because that's the part when I talk about technology it's not your intellect that
00:34:56.900
twists so much as your heart because you're like but what happens to individuals
00:35:01.120
what happens to people right that's your heart speaking and so the book is very much about we
00:35:06.340
need a lot more heart in our leadership we need a lot more joined up thinking and also I know this
00:35:11.820
sounds really corny, but leadership in the 20th century was all about the leader. It was the cult
00:35:17.700
of the leader. It was the Jack Welch. You know, it was one infallible leader, kind of the Jesus
00:35:23.160
Christ model of management. Now it's about the ship. It's about the team. It's about the
00:35:29.340
organization. It's about bottom up, not top down. And a lot of our leaders today have no idea how
00:35:36.320
to manage in that environment. They don't think inclusively. So if there's anything, the core
00:35:41.460
message of the book is we need much more diversity of thinking. And diversity of people is critical
00:35:47.620
to that, and I'm hugely supportive that we have much, much more diversity of people. And I don't
00:35:53.360
just mean gender, although that's a big push in the book, but I mean ethnicity, income background,
00:35:59.920
neuroplasticity, how your brain is wired up. I mean, really, diversity of experience. But you
00:36:07.280
can still have a room full of people who all look very different who go, Trump will never win.
00:36:13.460
So one of the things leaders need to do is to stop focusing on prediction,
00:36:18.620
because if you say Trump will never happen, that's a prediction,
00:36:20.880
and instead get focused on preparedness for things that sound way out there
00:36:25.440
and use scenarios to test your robustness and your agility.
00:36:30.740
I was just going to say, if we just move on a little bit to virtual reality,
00:36:34.000
I was at a comedy club, playing this comedy club,
00:36:37.540
and one of the guys working there is sort of a graduate
00:36:40.980
and he was saying that actually in a few years' time,
00:36:50.040
and that this will all go into the virtual environment.
00:36:57.680
and having that emotional and physical connection?
00:37:06.620
we're making drones, commercial drones. One of the ways we deliver the information is through
00:37:13.140
virtual reality. So our clients can put on goggles and then walk through an asset that's on the other
00:37:19.500
side of the world. So if you had an avalanche at a mining site, suddenly you could be inside that
00:37:24.280
mining site and viscerally feel it as opposed to watching a video in two dimensions. So I spent a lot
00:37:30.420
of time in this space and it is so realistic the brain cannot distinguish between real and virtual
00:37:39.780
i recently did a new virtual reality game it was like in a big warehouse and i know
00:37:45.300
i'm standing on a flat floor in a big warehouse and there's nothing to bump into and yet at one
00:37:50.900
point i am like on the 157th floor of a building that has been blown out by explosives and i have
00:37:57.700
to cross a rickety bridge and I'm literally like and then I'm going don't be an idiot it's a
00:38:04.000
floor you're in a building but your brain is going you might fall you might fall it is so realistic
00:38:09.520
and now the real kicker we're starting to get haptics and this is a word everybody
00:38:15.540
should look up and know haptics are items of clothing that permit feeling from a remote
00:38:24.320
location. So haptic gloves, people are familiar with haptic gloves, so you can grab things in a
00:38:30.680
video game by going like this. But now we're getting full body haptics. So let's say I go on
00:38:37.600
a virtual reality boxing program. When the guy in Tokyo who's playing the game at midnight and I'm
00:38:45.280
sitting here in London, when we go against each other in that boxing ring and he hits me in the
00:38:49.980
solar plexus, I'm going to feel that because the haptic thing that I'm wearing is going to give
00:38:55.560
me that punch. Now, I may be able to choose that I only want to feel one-tenth of what he's
00:39:00.960
throwing, right? I could do that, but it'll all be scored. And because the internet is a scoring
00:39:07.380
mechanism, which is actually one of my political philosophy questions: do we want to live in a
00:39:12.440
world where we're all being scored all the time? Because where is there any place for, you know,
00:39:18.160
beauty I mean how do you smile you guys they do score beauty which I just find
00:39:22.420
incredible you know the women one out of ten like you know what I mean though
00:39:26.560
beauty is not a thing that can happen on a numerical scale and so this business
00:39:31.900
of moving into a world where everything is scored and then marry that up with
00:39:36.460
haptics and virtual reality you're gonna have people who are gonna go I gotta
00:39:40.900
have full force I don't know if you saw the Spielberg film Ready Player One it
00:39:46.960
is super worth watching. It is our first time that we really are shown what living in a virtual
00:39:52.940
reality world is going to look and feel like. And at one point, the main character is in this
00:39:57.380
situation, and his friends say, turn it down so you don't feel the full blow. And he goes,
00:40:03.280
no, I'm a real guy. And so he sets his outfit to feel whatever the other player is throwing at him.
00:40:11.140
The point is, people are going to get hurt. And put it another way, the Germans recently developed
00:40:15.920
a really interesting virtual reality chair that basically sits on bungee ropes attached to the
00:40:22.400
sides of the room so that when you sit in the chair with your goggles on not only will you
00:40:28.260
see the VR world but you're going to feel the actual g-forces now you will be moving and I took
00:40:35.240
one look at that and I went that chair someone's going to have a heart attack in that chair
00:40:39.180
because the brain can't tell the difference they're going to step off the edge of the building
00:40:43.000
in the VR, and then the chair is going to go voof, and that guy is going to have a heart attack.
00:40:48.240
It's like the Matrix. The Matrix, your mind makes it real.
00:40:52.140
In fact, we've even been talking with the Red Cross about torture and the need to have a new
00:40:59.760
Geneva Convention on torture. Because if I am given these VR goggles, and I'm a victim of
00:41:08.160
torture. Well, if what I see is that someone's just cut my hand off, my brain is going to process
00:41:15.760
as if that injury is real. And if you touch me when that happens on my screen,
00:41:23.300
my brain will know something has happened. It visually sees what's happened, and the brain starts
00:41:27.860
responding to that. So I think waterboarding will have nothing on what this can bring. And
00:41:36.020
equally, even things that seem innocent, the fact that this is what Spielberg shows in that film,
00:41:40.900
Ready Player One, that you can enter alternative worlds and live there and just stay there.
00:41:49.000
You don't ever have to come back into reality. And again, a big message we have in the book,
00:41:53.720
we looked at all the research, what's happening is people are, in theory, more connected because
00:41:59.520
of the internet. But in practice, there's less and less conversation, less and less human
00:42:04.780
connectedness and virtual reality and augmented reality take you to an even greater level of
00:42:10.680
disconnectedness and create a world where you think that certain things are normal that are
00:42:16.780
not at all normal in real life because what's created in a virtual reality world is comics
00:42:21.800
it's pictures and it's drawings but they're so real that you start to believe that's the
00:42:27.900
standard that i want that in my life but i do wonder i mean with your question francis right
00:42:36.500
I mean, right now, you can pay £30 or whatever it is for Netflix
00:42:54.820
to go and see some far inferior comedians by comparison in person.
00:43:02.220
i run a comedy club where i live right people will come and do that so do you think maybe we
00:43:07.180
sometimes we kind of overestimate or maybe underestimate people's desire to be physically
00:43:13.820
doing something physically present with others physically connected in a space that is without
00:43:19.900
screens that is without that thing where it's just right you're right there in that moment i think
00:43:25.180
there's a market for both and i think humans will be doing both but i also think that it's
00:43:32.980
Like one thing that I'm working on right now is being able to bring augmented reality to the audiences that I speak to when I do my public speaking.
00:43:41.780
And you guys will be doing the same at some stage.
00:43:44.040
You'll be able to say to the audience, if you pick your phone up and point it at the stage, I'm going to show you something.
00:43:49.340
And standing next to you on the stage is suddenly going to be a Pokemon-type thing that you've chosen, an example of something that's in your comedy routine.
00:43:57.240
and now you guys are creating an augmented reality experience for that audience so it's not either or
00:44:03.980
it's going to merge more and more but also it's a different level of risk if i decide to come
00:44:09.540
watch you guys live of course this is an extremely risky thing for me to do but there's a certain
00:44:15.440
level of risk that you might mess up right or you might surprise me because the mood in that audience
00:44:21.460
that night creates a different energy level in you guys then right because your routine changes
00:44:26.400
depending on the audience right versus going to watch it on in that kind of vr landscape might
00:44:32.640
actually be it's more like watching a pre-programmed thing so the level of risk that it might screw up
00:44:39.600
is much lower or different so it depends what your risk appetite is and you'll measure that
00:44:44.880
on any given day and decide what's the right venue to express that risk appetite but what i do think
00:44:53.160
that it'll reach the point we start finding it hard
00:45:13.640
I mean, it's also going to impact on people's mental health.
00:45:19.680
You're going to go into these virtual reality worlds where, in a sense, there are no consequences for your actions.
00:45:25.320
You can always get to the end of the game, start again, whatever else.
00:45:28.700
And then once you get into real life, the rules are completely different.
00:45:33.980
And when it comes as well, when things between relationships, sex, all the rest of it.
00:45:38.940
So in the book, we talk at some length about the fact that an ever larger proportion,
00:45:44.760
And now it looks like in some industrialized countries it's heading towards 15% of males under the age of 25 who have never had a relationship, because they live so much online, or are in a position where they can avoid human interaction, which means they have no capacity to negotiate with other human beings, because you have to learn that.
00:46:13.120
You have to learn how to negotiate with other human beings.
00:46:21.760
Let's say older people, people over the age of 40 who are not.
00:46:29.180
People over the age of 40 are not living together anymore.
00:46:33.440
If you're over the age of 40, you're elderly.
00:46:37.980
But even people over the age of 40 are not living together any longer.
00:46:42.760
They may share a roof, but they don't share a life.
00:46:54.140
So the real human consequences of all this technology seems to be that it is separating us from one another.
00:47:02.560
It's ironic, but that is what we're finding as the result.
00:47:09.220
We talk about it being an atomization, people becoming more and more isolated, and they don't identify, you know, with larger groups.
00:47:21.700
They feel alone and isolated, and maybe that helps account for the rise of identity politics.
00:47:28.520
It kind of feeds off of that sense of I'm alone.
00:47:35.540
You're uber-connected and simultaneously disconnected.
00:47:39.940
And are there any remedies to this on a personal level?
00:47:43.120
Like, is there something you do as a human being to kind of plug yourself back into reality in this?
00:47:49.380
So, you know, and isn't it interesting that all the magnates in Silicon Valley don't let their kids have mobile devices?
00:48:00.880
I've heard so many of these people who work on these tech products say, basically, we're using
00:48:06.800
all the techniques you use to entice gamblers. Yes. That's what your phone is. So yeah, maybe we
00:48:14.040
need to learn how to turn stuff off, switch off, actually have a conversation. And yes, that is
00:48:21.680
one of the things that we recommend. The problem is people are so impatient. So they
00:48:28.380
don't want to, that's just painful for them. They won't even wait two seconds. If a page
00:48:33.640
doesn't load on the web in two seconds, they're gone. So one of the things we have to figure
00:48:38.520
out is how to cultivate patience. There are lots of these things, but the societal impact
00:48:45.380
is quite profound. Actually, there's a section, if I could be really boring, there's one page,
00:48:50.440
page 228 we made a list of all the things that have profoundly changed so this is about the
00:49:00.500
whole landscape not just technology technology feeds this but it
00:49:06.500
used to be we the people now it's me the people we used to like thoughtful clear measured responses
00:49:12.540
now it's twitter at speed it's jargon it's emojis it's compressed information um we used to like to
00:49:18.920
study now it's all hacks and shortcuts uh we used to like saving now we like spending we used to
00:49:25.820
dress up now we dress down there's a whole swing it's almost like you know when alice in
00:49:30.920
wonderland drinks and she goes through that looking glass and she's ended up on the other
00:49:36.000
side of reality technology has done this and i think we should be aware of this list of what
00:49:43.020
it's doing to us, which is, you know, much more profound than just put your phone down a little
00:49:48.860
bit. Okay, yeah, but we have to learn how we're going to deal with a world where, you know,
00:49:54.560
free speech has been replaced by political correctness. These are super profound societal
00:49:59.980
changes. And do you think this technology is encouraging a narcissistic behavior in people?
00:50:05.640
Sure, absolutely. I mean, the rise of the selfie is only possible because of the camera.
00:50:10.860
and in that sense it's not we the people no me the people is a person with the selfie stick
00:50:18.740
and so yeah technology that's part of being atomized that we're not connected anymore it's
00:50:24.400
me and only me and how do i look there's a question i've always wanted to ask whenever
00:50:30.040
i speak to you i go i can't believe i forgot to ask that question you know when you go on google
00:50:34.860
and you go you know what i need some new trainers and on google an advert for trainers comes
00:50:39.880
up are they really listening or is that slight paranoia well like i said it's now in the
00:50:47.860
public domain that google and mastercard have a relationship and some kind of information
00:50:55.800
sharing is what's implied but sorry i think what francis means is your phone literally picking up
00:51:00.540
what you say and then processing it in that way well the answer is it's perfectly capable of doing
00:51:06.480
so and many other internet of things devices in your home and in your life are doing it as well
00:51:13.120
so for example the uh you know amazon echo it's already been subpoenaed in a murder trial as a
00:51:19.980
witness and of course the two of you guys are like whoa yeah and by the way it's a very good
00:51:25.380
witness it's a highly reliable witness well absolutely because it's not you know it doesn't
00:51:30.380
distort the memory in any shape or form and i have to say the story i love was the guy
00:51:38.320
and left his mobile phone at somebody else's house,
00:51:44.600
and then, you know, petrol and gasoline everywhere,
00:51:49.060
because he just wanted to get the insurance money,
00:51:52.300
and the pacemaker was broadcasting not only his location,
00:51:58.320
So this thing is you are constantly emitting data.
00:52:05.080
In Sweden, companies have recently introduced microchipping their employees in the thumb.
00:52:12.180
And amazingly, a lot of the employees in some of these companies are like,
00:52:15.540
I'd love to do that because I don't want to have to pay for my coffee.
00:52:24.400
Now you're like on the grid and everything you do is on the grid.
00:52:29.100
So the answer is you should assume that any Internet of Things device that is in your world
00:52:34.300
is perfectly capable of broadcasting not only your voice but your physical movement within the space.
00:52:42.760
Vacuum cleaners, automated vacuum cleaners are broadcasting the dimensions of the room
00:52:49.900
It seems in this family, they come in to watch television at 4.45.
00:52:54.600
So if you want to advertise spaghetti to these people,
00:52:58.120
4 o'clock might be a good time to hit that little reminder for them.
00:53:04.300
I mean, again, with social media, the bottom line is if you're not paying for it, you are the product.
00:53:13.620
We talk about all kinds of stuff on here about men and women, feminism, everything.
00:53:17.620
But fundamentally, what I love is when we do an interview and I kind of go, OK, there's some things I need to do differently in my life.
00:53:26.880
Among other things, she was talking about veganism.
00:53:29.740
And she was saying that actually it's very difficult to be vegan.
00:53:32.200
And therefore, if you want to reduce the quantity of suffering that you cause
00:53:36.160
and the number of deaths, you've got to start eating the biggest possible animal.
00:53:41.280
Because there's like a year's worth of meat in a cow.
00:53:44.340
And if you eat chicken, you're going to basically be killing another creature
00:53:48.820
So I pretty much don't eat chicken now after that conversation.
00:53:58.960
and that's what I love about this is like it's fascinating it's terrifying I think it is
00:54:06.400
absolutely terrifying what you're talking about it's also very encouraging that there's going to
00:54:10.800
be this the ability to resolve all these problems to fix the cancer whatever else but I mean I just
00:54:18.140
think we've got to be really careful I also think you know there's an element there there is a part
00:54:23.460
of me that's going yeah but we still have that need for personal contact we still
00:54:30.020
there's going to be some kind of pushback against all of this there's going to be some kind of
00:54:34.220
feeling like you know what actually no we want to get together we want to be physically in the
00:54:37.600
same like look at us people will listen to an hour's interview i mean we talk about you know
00:54:42.460
this kind of bite size everything is now shorter attention spans but actually there's a massive
00:54:47.540
pushback against that now we've interviewed some people who do youtube videos
00:54:51.960
they don't even show their face it's just a guy talking for 40 minutes yeah and he's
00:54:58.320
not even like doing kind of comedy like a little bit of setup punchline it's just a guy talking for
00:55:03.200
40 minutes and people love it and that guy's got a million subscribers and i totally agree i think
00:55:07.940
there's a market for these things but this idea that we're automatically jumping to that end
00:55:14.080
i don't know one of the really tricky parts of the book you guys are going to laugh we actually
00:55:18.820
talked about one kind of robotics and automation called teledildonics yes and oh yes it's worse
00:55:27.580
than you think extending your penis remotely is that basically right yeah and so you know people
00:55:32.680
having sex using devices where someone you've never met on the other side of the world is
00:55:36.320
controlling what happens like we are moving in both directions simultaneously i know i'm like oh my
00:55:43.540
god i was amazed the publisher let us put it in there but it's a real phenomenon like a real
00:55:48.820
phenomenon now elon musk actually no he's losing the plot but at any rate at one point not so long
00:55:54.820
ago he said there was some kind of artificial intelligence product that he'd been introduced to
00:56:04.020
and obviously it's something nobody else has seen yet and he basically said the thing i
00:56:08.980
experienced was something to do with love not sex but love and so you don't know is he talking about
00:56:16.340
some kind of a sex doll or is he talking about an interface with a computer i don't know but
00:56:21.780
what he said was it was so much better than the real thing that it's terrifying and it's going to
00:56:29.940
be the end of humanity and and i have to say this is a thing we really have to think about what are
00:56:37.540
our basic human needs and how do they get met remember that film with um joaquin phoenix i
00:56:43.700
think it was where he falls in love with yes artificial intelligence who had scarlett johansson's
00:56:49.540
voice i think it was yeah right but he literally begins to fall in love because the the ai knows
00:56:56.260
the answers and knows him so well and at the very end he's completely shocked that someone who
00:57:02.420
knows him so well can just abandon him because you feel the involvement and so again we're back
00:57:08.860
to political philosophy how do how do we handle this because we're vulnerable human beings are
00:57:13.820
vulnerable so the sex realm i mean this is what diana fleischman told us
00:57:19.160
there's quite a big difference between men and women in terms of kind of sexual preference yeah
00:57:24.940
So she's talking about the fact that men, basically, they have less disgust when it
00:57:35.900
So that means men, as we all know, tend to be more adventurous and more kind of, right?
00:57:40.220
So if you're a man in this new, brave new world, and there's stuff that you want to
00:57:44.260
do sexually, right, well, why would I, why would I meet a real woman and have to like
00:57:52.540
But not even dinner, but, like, I can do what I want as opposed to what we both agreed to do, right?
00:58:00.540
And, yeah, why would you meet a real woman in that situation, right?
00:58:05.720
No, and although this seems like a narrow thing we're sidelining into, this is so fundamental to the human condition.
00:58:15.280
There's a writer who I think is really interesting, like a California guru called David Deida.
00:58:21.380
and he writes about sex and what makes sex really good and he
00:58:27.300
says there are really three kinds of relationships there's the one where one
00:58:31.880
person is saying I want that's what you just described I want and they're
00:58:36.380
all about imposing what they want they just need another person to do it then
00:58:40.100
there's the I want but I know you do too so let's negotiate right and it's
00:58:45.200
transactional even if it's elegantly transactional it's still I'll take you to
00:58:50.120
dinner if you know but love love is i choose to love and it's unconditional and my interest
00:59:00.280
in loving you is for your individuation as a human being you know so how do we get more of that
00:59:07.320
when we're in this environment that's a really interesting question and in this book we kind
00:59:13.320
of dance a little around it but we kind of say really good leadership has love i mean now
00:59:20.020
And that just sounds so corny as to be ridiculous.
00:59:22.380
But in the context of what we're saying, you begin to understand love, the care for the
00:59:27.740
well-being of other human beings now is a fundamental issue for all of us.
00:59:32.640
We have to think about what's happening to people with a heartfelt perspective.
00:59:37.020
But doesn't it also come down to gratification and instant gratification?
00:59:41.920
Because if you go into a computer game, you can be a black belt martial artist in four
00:59:50.000
the reality is in life if you want to be that it takes what 15 20 years of constant discipline
00:59:55.140
eating right and the same with love love is incredibly hard it means going and meeting
01:00:00.280
lots of different people involves rejection and then you know it involves compromise maybe for
01:00:05.020
you mate maybe for you yeah i did i did i did i did yeah i did but it it's it's sure yeah it's
01:00:19.360
Well, and not only that, but again, virtual reality world, it's really interesting because now there have been computer games where you have people who are winning who don't fit what that model is.
01:00:32.680
So you could have a 50-year-old African-American woman emerge as the champion in a virtual reality world where in a physical world she wouldn't win that fight, but she can win it online.
01:00:46.500
so in virtual reality and the other thing is do you know about the marshmallow test
01:00:51.140
no no so i think it was harvard university but i can't remember but anyway the marshmallow test was
01:00:58.080
they took little kids five six years old and left them alone in a room and said
01:01:06.120
if you don't touch the marshmallow we'll give you three yeah and they left them for 20 minutes
01:01:14.580
And the ones who weren't able to wait 20 minutes and ate the marshmallow versus the ones who waited.
01:01:21.580
The ones who waited had much better human skills, an easier life.
01:01:29.860
The people who can't wait, their achievement levels are not good.
01:01:33.520
Their empathy and capacity for connection is not good.
01:01:37.580
We need better performance on the marshmallow test as a race.
01:01:41.780
humanity needs to do better at the marshmallow test and the problem is all the technology is
01:01:47.240
pushing us in the other direction so how do we make choices that balance it out because otherwise
01:01:52.820
we may end up in you know you said a dystopia we could end up in a dystopian place but i don't
01:01:59.500
see that we have to but we could well on the subject of impatience our time i'm afraid is up
01:02:06.180
so uh coming back as always to our final question what is something that we're not talking about
01:02:11.060
We've talked about all kinds of interesting things today.
01:02:13.420
What is something that we are not talking about that we ought to be talking about?
01:02:17.500
Oh, well, gosh, you should have asked me at the beginning
01:02:24.140
I still think all of the stuff we've talked about just now we're not talking about.
01:02:31.460
We're not talking, I think, look, in my world, in financial markets and stuff,
01:02:36.460
really everybody is just preparing for catastrophe all the time and I don't think we're talking
01:02:43.120
enough about what about if all of this technology all this stuff works and we end up in a world
01:02:48.760
where we can solve any food shortages I already know technology that can do this so
01:02:55.460
I think we're there how do we deal with a world that doesn't have any scarcity it has ubiquity
01:03:04.920
And I think we're moving towards a ubiquity world more than a scarcity world,
01:03:10.620
but with all the problems that we've talked about.
01:03:13.920
So to me, that's the interesting thing we're not talking about,
01:03:20.720
Well, if you've still not left the Internet after that,
01:03:23.980
not destroyed all your mobile devices in one big house fire,
01:03:28.760
Follow Dr. Pippa Malmgren on Twitter at DrPippaM.
01:03:33.260
Buy her book, The Leadership Lab, which is coming out on October 4th.
01:03:43.660
And as always, follow us at TriggerPod on Twitter, on Instagram, on Facebook.
01:03:48.680
Obviously, subscribe to this YouTube channel if you're watching it.
01:03:51.680
And click that bell button right next to the subscribe button,