Ep 48 | Anarchists Don’t Wear Antifa Masks | Tim Pool | The Glenn Beck Podcast
Episode Stats
Length
1 hour and 54 minutes
Words per Minute
187.6
Summary
In this episode, host Glenn Beck sits down with journalist Tim Pool to talk about how he got his start in the world of journalism. They talk about his early life, his unconventional education, and how he went on to become one of the most well-known independent journalists of his generation.
Transcript
00:00:00.000
So excited to talk to Tim. We're going to get to him in a minute. And the things we're going to
00:00:04.280
talk about are really, I think, freaky, might make you a little paranoid, probably for good reason.
00:00:10.940
But there is no reason to be paranoid or worried about your safety when you're in your own home.
00:00:17.020
SimpliSafe is our sponsor, and they have award-winning home security that protects
00:00:21.760
every door and every window with 24-7 professional monitoring around the clock. And unlike any other
00:00:28.340
company, SimpliSafe has no contract, no hidden fees, no fine print. It's $15 a month to protect
00:00:34.900
your home. But if that's not enough, here's what really makes them stand out. When you have your
00:00:40.520
home security triggered and it alerts police, police assume it's a false alarm because that's
00:00:46.700
usually what it is. So it goes to the bottom of the 911 list, but not with SimpliSafe because
00:00:51.920
SimpliSafe is the only company that offers video verification technology. This is included. This
00:00:58.040
is not something that you have to pay extra for. They're able to visually confirm that a break-in
00:01:03.160
is happening. That allows police to get to the scene 3.5 times faster, which is 45 minutes to
00:01:10.460
7 minutes. And just for you, they're giving a huge deal away. SimpliSafe.com slash Glenn Beck.
00:01:17.240
You'll get a free HD security camera when you order. That's a $100 value or more. They'll have eyes on
00:01:22.620
your home 24-7, video evidence if somebody tries to get in, and the police will be coming to your house
00:01:28.020
quickly. SimpliSafe.com slash Glenn Beck. That's SimpliSafe.com slash Glenn Beck.
00:01:52.620
As it stands today, journalism is changing and it will never be the same. Thank God. That's a good
00:02:03.740
thing. Except what is it going to change to? We don't know. At the moment, the mainstream media
00:02:09.240
is toppling and a whole new cast of people is replacing the old guard. Today's guest is proof.
00:02:15.720
He first rose to attention in 2011 when he live streamed the Occupy Wall Street protests.
00:02:21.300
Unlike the high-budget, heavily choreographed reporters, you know, in front of whole film
00:02:26.760
crews, he wandered around inside the protests using the new medium. He gave us a view of the
00:02:32.760
inner workings of a chaotic political movement. The reporting earned him a spot on the Time 100 list
00:02:39.200
and won him a Shorty Award for best journalist in social media. He's traveled all over the world for
00:02:45.280
various stories. In 2017, he accepted a challenge made by Paul Joseph Watson, who offered to pay
00:02:51.260
travel costs for any journalist willing to investigate the claim that there were Muslim
00:02:55.980
dominated no-go zones in Sweden. Like many of the people who are dismantling mainstream media,
00:03:02.100
he's kind of a black sheep. He has friends and millions of followers, six million per month on
00:03:08.540
YouTube to be exact. He has written for The Guardian, Reuters, The New York Times, NBC, even Al Jazeera
00:03:14.760
English. He was a founding member of Vice News. And in 2015, he became the director of media innovation
00:03:22.280
at the TV network Fusion, where he was tasked with reporting on new technologies such as live streaming,
00:03:29.540
aerial drones, mobile software, and hardware, even Google Glass. He has been attacked by the people on
00:03:36.200
both the left and the right, which usually means that what you're saying is important, something that
00:03:42.300
people don't want to hear, something we all need to know. This is my conversation with Tim Pool.
00:04:09.380
I have no idea. Rogue-like. Heterodox. Anti-authoritarian.
00:04:22.940
You know what I mean? So, I don't want to say contrarian, because that's not true, you know?
00:04:26.640
Although I certainly have contrarian positions. But I think for the most part, it's just,
00:04:31.380
I've often reflected on myself, like, I don't even know what I am. I didn't go to school. Maybe
00:04:36.420
that's why. You know, I dropped out of high school within the first couple months. Briefly
00:04:41.020
did a homeschooling thing and then just stopped. Didn't get a GED. Didn't go to college.
00:04:47.480
Yeah. Yeah. But I owe it to my mom, I guess. She was homeschooling me and my siblings before
00:04:54.040
I even started preschool. So, got a head start.
00:04:56.520
Right. So, you had, because you're not a, you know, when you think of an anarchist now,
00:05:02.400
you just think of somebody, well, dressed all in black.
00:05:05.780
Yeah. Who's, you know, just marching in the street and usually just a numbskull.
00:05:13.940
You know, I mean, some of them, actually. I've actually, I have met some, I was in San Bernardino.
00:05:18.840
I met an actual anarchist with Antifa who denounced the violence. And I'm like, then why would
00:05:23.380
you march with these guys? He's like, a real anarchist is peaceful.
00:05:28.440
And it's about cooperation and things like that. Whether you're market, you know, forces
00:05:33.820
or you're cooperative forces, like, anarchists do not hit each other, do not hit people to
00:05:40.160
get their way. You know, that's the antithesis of anarchy, you know.
00:05:44.180
But you are against the orthodoxy. You're, you're, you question the orthodoxy. Not necessarily
00:05:52.680
against it. Just question it. If you find it to be true, right?
00:05:56.580
Right, right, right. Exactly. That's why, that's why I think contrarian is probably the
00:05:58.680
wrong way to put it. So, you know, when I was younger, I, I was an anarchist skateboarder
00:06:03.240
very far left. And I, what, what pushed me away from all this was violence. Like these,
00:06:08.300
these, I was a teenager and I saw these kids picking fights for no reason. They wanted to
00:06:11.780
fight the jocks. I'm like, why would you start a fight with somebody? They'd yell at people.
00:06:15.700
And I was like, keep me away from that, man. That is not freedom and liberty and anarchy.
00:06:20.820
That is violence. And that is authority. And that's you imposing your will on others.
00:06:26.160
And so then I found myself more of this kind of like, um, more of a Bernie Sanders type.
00:06:30.980
This is when I was younger. Um, Hey, we should have more, you know, bigger government. We
00:06:35.380
should have free college. We should have all these things. I, and against the orthodoxy,
00:06:39.140
but, but this was when I was like 17 or 18. Okay. And then I had this really, this really
00:06:43.680
profound moment. So I, I, I was Catholic, um, from kindergarten until the end of fifth
00:06:48.720
grade. And then, uh, in sixth grade, we transferred to, uh, to a public school.
00:06:53.580
So I went to Catholic school. We were, my family were, were, um, great parishioners.
00:06:58.380
We donated, we fundraised, we were the top selling, you know, and my, put it simply,
00:07:04.320
we were disrespected as it were. And for family economic reasons, we decided to move to public
00:07:08.940
school. And so from there, I immediately became this very atheist man. These conservative pro-life,
00:07:15.260
you know, these people are idiots. They're all insane. And I had that very typical,
00:07:18.960
and then something happened. I, I was, I've been skateboarding my whole life and I met
00:07:23.720
the skateboarder guy who was prominent in Chicago. And so I was like, Whoa, like how cool I get to
00:07:29.980
hang out with this guy. He's like way older and he's really good at skating. And he said, Hey man,
00:07:33.860
me and my friends, we're going to go jam at my place later. You want to come with? And now I'm all
00:07:37.040
like, yeah, like how cool is this hanging out with the cool kids? Right. Right. And then when I get to
00:07:41.200
his apartment, I walk in his room, he's got a picture of Jesus on the wall. And I was like,
00:07:46.240
you have a picture of Jesus on the wall? Are you, like, Christian or something? Thinking I'm all like
00:07:50.220
tough punk rock. And he goes, no. And I was like, what? No. Then why do you have Jesus on the wall?
00:07:54.940
He was like, Oh, I just thought it was cool. There's a story about a guy who traveled around
00:07:57.440
helping people. And that was like a smack in the face to me. And I was like, maybe I'm misunderstanding
00:08:03.420
people by assuming I know what they think and feel. This guy wasn't Christian. He just thought it was a cool story,
00:08:09.200
right? Like he's a, he's a cool guy. He was like really nice to people. And I'm like, that is,
00:08:12.980
that is, that is a good story. He's right. And so that was like a big wake up for me as a young
00:08:17.440
person. And I decided more churches need to do that. Right. Uh, so I, I'm still, I wouldn't call
00:08:22.980
myself an atheist, but that was like a moment for me where I realized I better actually start listening
00:08:27.420
to people. Right. Cause I had all these assumptions about what this meant and who these people were.
00:08:31.920
And when did that change come? What year? Man, I was like, I think I was 18, you know,
00:08:36.720
so this is 15 years ago. And then from then on, I was more willing to like, listen. So that's,
00:08:42.040
that's the funny story. Uh, I mentioned off podcast. Uh, I used to work for Greenpeace
00:08:46.600
very briefly and then I worked for another, some other environmental nonprofits, but I remember
00:08:51.340
Greenpeace sent us out in Chicago to go canvas. So I'm one of those guys on the street corner,
00:08:55.720
wave into people like, Hey, you want to help the environment? Now it's been a long time. It's been
00:08:59.360
like 13 years. So I could be getting some of the, it's, it's maybe a bit apocryphal at this point,
00:09:03.600
but I, I, I, I vaguely recall them being for cap and trade, right. Government intervention.
00:09:09.680
And I remember, when I was canvassing, there's a bookstore next to me with your book.
00:09:14.620
I think, it's been a long time. An Inconvenient Truth? Yeah. Or, yeah, An Inconvenient Book. Yeah.
00:09:19.300
And so I remember like opening up and like reading through it and it was a very like the free market
00:09:22.920
solutions to or something. And I was just like, I was guys, guys, no, he was talking about,
00:09:26.620
I'm 20. I'm, I know. And I go back out and start telling people I have the solution. I thought it was
00:09:31.620
funny that, you know, now here I am. Now here you are. Yeah. Yeah. Um, uh, the, uh, the problem I
00:09:40.480
think of just real quickly on that is, uh, that is the, uh, that's the attitude of too many people,
00:09:47.240
no matter what side they're on is I just know better. I don't even need to read that where no,
00:09:53.200
no way that's kind of important. You know, if you really, you know, nobody has any credibility with
00:09:59.120
me if you're, uh, if you're asking for government solutions, but you're not a vegan and you don't
00:10:05.280
agree with nuclear energy. You know, uh, I have a friend and I made them cry because they're far
00:10:12.580
left anarchist, real anarchist though, not violent, denouncing violence, but very, very, you know, so
00:10:18.060
we were having a conversation and I said, basically my issue with the policies you propose is that you
00:10:24.980
don't care about anyone outside of your community. Everything you're proposing would just benefit you.
00:10:30.200
And she was like, no, no, no. I want to make the world a better place. And I'm like, you're using
00:10:33.680
a MacBook right now. Look, I under, look, no, no, no. I'm not saying you can't have it, but do you
00:10:37.860
recognize that people at the Foxconn labs were walking off the building because the conditions
00:10:42.260
are so horrifying, right? You've got to say no to the MacBook, right? You can use a computer,
00:10:46.320
but you got to avoid the Foxconn stuff. And, and I said, listen, you're not making the world a
00:10:50.840
better place by giving your resources to a company that has people committing suicide. And what they did
00:10:54.960
they put nets up to catch the bodies. That was their response. And, uh, that conversation led to her
00:11:00.240
crying when I said the fact that you're willing to support these horrifying circumstances, but here's
00:11:06.220
the thing. I recognize that personally. And, uh, it's just, it's, we have to work to solve that
00:11:13.580
problem, but I do try to avoid, you know, you don't use Apple products. Oh, I hate Apple. Yeah. I hate
00:11:20.340
Apple too, but they're great products. I'm pretty sure Foxconn still does Samsung stuff too.
00:11:24.760
So it's, do you want the tool or not? You have to, you have to recognize that Americans
00:11:27.980
are wealthy, wealthy, privileged, and there are people around the world who
00:11:31.620
don't have access to that. What are you going to do about it? That was one of my, uh, funny
00:11:35.520
highlights, I thought, of the Occupy Wall Street movement was that they were, they were taking
00:11:40.360
down Steve Jobs. They were taking down. That's not true. Well, the ones I, the ones I saw on
00:11:45.720
video were taking them down and, but they were using the products, but, but actually it's worse
00:11:50.620
than that. When Steve Jobs died, it was during Occupy. They put up a shrine for him. There,
00:11:55.880
there was a Steve and I'm, I'm just like, how is that possible? It's you look, it's, it's
00:12:00.560
not like everyone at Occupy was marching in lockstep. Uh, but some people there, that, that guy was
00:12:07.140
a ruthless businessman. He was really bad. He's peak capitalism. Yeah. He, uh, he, uh, he's,
00:12:12.960
he's, he's, uh, the dirty kind of cat, right, right, right, right. You know, right. Capitalist
00:12:19.500
capitalism, when it's done right, I think is a charity almost. If you, you know, if you,
00:12:25.220
if you know Moral Sentiments, you know that Wealth of Nations is just how it works, but
00:12:30.740
Moral Sentiments shows you what the invisible hand is going to give. If you, if you are a
00:12:36.140
society that wants to do good and you're a capitalist who wants to help people, right? How can I
00:12:42.820
help them have an easier life? Oh, I'm going to invent this. Great. I think we'll have some
00:12:47.860
disagreements on capitalism for sure, but yeah, but I do agree. I think, you know, the core of
00:12:53.800
what capitalism is, is the right to private property and to free trade for the individual, very simple.
00:12:59.080
It makes a ton of sense. And it will, socialism and communism will never work. So where do we
00:13:04.480
disagree? Um, I lean more towards government regulation probably. Oh, I'm assuming. I don't know.
00:13:09.400
Actually, I just made the assumption based on the global warming free market solutions and stuff.
00:13:14.200
I think, uh, left unchecked, like laissez-faire capitalism, you end up with, you know, humans
00:13:20.920
put their resources towards what will trigger their dopamine. And so
00:13:25.320
we have to have, I, I think a mixed economy makes a lot of sense. Um, I think the U.S. technically
00:13:30.820
leans a little more towards the right in terms of how much taxes go towards government spending.
00:13:34.540
I probably lean a little bit to the left on a lot of, you know, some of these issues,
00:13:38.280
but I'm, I'm rather centrist. I just think, you know, how much money do we allocate towards
00:13:43.460
curing things like baldness? And is it like, I get it, you know, you have a right to do it, but
00:13:49.020
it's that baldness isn't the big concern. It's just kind of a funny thing to poke at,
00:13:52.760
but I look at, um, toxic waste, uh, dead zones in the ocean and things like that.
00:13:58.060
And I absolutely think there can't be market solutions to environmental problems, but I
00:14:03.900
think, you know, look, left unchecked, humans are going to make virtual reality games where
00:14:08.080
they can just, I'll try not to be crude, but you know, pleasure themselves.
00:14:11.520
Oh no, it's coming. It's absolutely coming. Um, and I, no pun intended. And I, I, uh, you
00:14:18.340
know, what am I, what are you going to do about, what are you going to do about that?
00:14:21.060
And I, and, and this is the, this is the ethical conundrum that exists within me because
00:14:24.500
the media is the best example of capitalism going wrong in that we're at a point now where
00:14:30.160
these companies are in danger. So they, they reach to the bottom of the barrel, which is
00:14:33.700
outrage content and ideology. And you know what they're lying. And you know what their next
00:14:38.680
reach out for is the government protection. We have to have government protection because
00:14:45.400
we are a valuable asset. We need bailouts. We need all of that. And there's already advocacy
00:14:50.820
for government funding. Correct. I mean, wasn't there some program
00:14:54.260
where, uh, they were funding journalism? I, I don't want to get into if I don't know the
00:14:58.060
facts, but I'll, I'll, I'll leave that. But yeah, I think that my, my big criticism of
00:15:02.700
government, which you probably share is it's extremely capitalism is effective in that bad
00:15:07.300
systems die. Can't make it work. You're gone. The problem with communist command economy
00:15:12.860
systems is that they just mandate the expenditure and take the resources from someone else by force.
00:15:17.040
And if your system is failing, it doesn't matter. You're draining the system, right? So,
00:15:21.560
you know, I recently made this critique about Bernie Sanders, his campaign can't pay his own
00:15:25.740
staff $15 per hour. Right. So what does he do? He cuts hours to get the equivalent. Well,
00:15:30.760
here's the problem. His staff said $36K a year wasn't enough to buy food. Did Bernie say,
00:15:35.960
I'll give you more money? No. He said, okay, then don't work Saturdays. So do the same amount of
00:15:40.400
same work you have to do in shorter time, work harder. And we're still not going to pay you more
00:15:44.720
and let some of you go to pay. Some people are quitting, but so, so he's not going to increase
00:15:50.560
their wages. He's just making, he's just saying you better get your work done on time. And it's
00:15:53.980
really funny to come from Bernie. But the reason I think it's, it's a really good example of how
00:15:57.580
socialism doesn't work is that Bernie doesn't have the input to make the output work. The only thing
00:16:03.360
he can do is cut hours. Welcome to business, Bernie. Right. You know, his ideology just doesn't
00:16:08.700
function. And so he will, he will condemn that very act in others while doing it himself.
00:16:14.960
Yeah. So, you know, I was a big fan of Bernie in 2015, '16, but not particularly for his
00:16:21.200
more far left approaches to like free college. I think that's a terrible idea, but it was more so
00:16:25.560
about for one, he opposed the free trade agreements. He is for secure borders and he has a, he wasn't a
00:16:30.760
flip-flopper. He's for secure borders? Oh yeah. And in 2015, he said open borders is a
00:16:37.220
Koch brothers' proposal. We can't have that. And in fact, a few months ago, Bernie Sanders said,
00:16:41.980
he was asked, would you be for open borders? He said, my God, no, there's too many poor people
00:16:46.640
in this world. Bernie said that. Wow. And that's why I don't like him today because I think he's a
00:16:50.600
hypocrite. I think when he goes on stage and says, we're going to give healthcare to, you know,
00:16:54.200
undocumented immigrants. Correct. And, and then everyone, and then you've got Julian Castro saying
00:16:58.200
decriminalize border crossings. It's tacit open borders, but it's worse than that. These people are
00:17:03.700
advocating for a permanent underclass. That to me, freaks me out. You're saying you're going to
00:17:08.260
have people who aren't citizens who you won't deport them. What are they going to do? They're
00:17:14.200
going to work under the table. They're going to get government benefits, but I imagine it's a very
00:17:17.980
ivory tower elite position to welcome that. And it's not surprising to me. You can't, you can't have
00:17:23.660
a welfare state and open border, right? You have to pick one. You want a welfare state or do you want
00:17:30.920
open borders? But once you open up and say, Hey, free food for everybody, where do you think
00:17:36.140
everybody in the world who has cancer and we happen to be the best at curing cancer, where do you think
00:17:42.260
they're going? So you, so you know what? My, my, my conundrum in all this is for one, love the
00:17:47.060
environment. I'm very much an environmentalist. Like I said, I worked for Greenpeace. You get Ocasio
00:17:51.040
Cortez on the scene and she proposes this Green New Deal. When she first did, I was like, I'm down to
00:17:56.080
hear it. It sounds great. And then what does she propose? Free, free college, free healthcare,
00:18:01.000
guaranteed jobs, income. I'm like, no, no, no, no, hold on, hold on, hon. You're, you're killing
00:18:04.580
the environmental argument here. So what, you know, what I, uh, what I often say to a lot of my
00:18:09.280
less liberal friends, you know, more conservative types is if, if look, if you've got people who
00:18:13.820
don't believe climate change is manmade and you're coming to them and saying they're stupid or wrong,
00:18:18.120
you're not arguing anything. Yes. If you think we need a massive overhaul to save the planet
00:18:22.600
and you, you refuse to accept anything from conservatives, you're not making an argument.
00:18:27.340
Perhaps the argument is what can we both achieve? Even if we disagree, is there perhaps a market,
00:18:32.840
you know, a tax incentive program for certain technologies that can bolster business, make
00:18:37.780
the United States more competitive internationally and help, uh, you know, increase environmental
00:18:42.200
awareness and protect the environment. Can we agree on something like that? You don't make.
00:18:46.100
So, so, so, so where do we, so where do we, where do we go? And, and, and let me give
00:18:52.840
you, um, I'm, you know, I'm a demon on global warming. I'm the worst guy ever, except, except
00:19:00.760
my, my farm and my ranch are 100% green. Um, I have insulation over the top, new insulation
00:19:11.700
that makes sure that you can, you can live in my house at 20 below zero or a hundred degrees,
00:19:18.820
and it's going to stay without any real air conditioning and without any fireplace. I mean,
00:19:24.380
one fireplace, 3000 square feet, and I can keep it at 10 below zero outside. I can keep it at 65
00:19:31.520
with one fireplace. On private jets? Well, I was, um, I, uh, I'm not saying I'm an environmentalist
00:19:39.180
guy. I'm just saying I care about the environment, right? I do care about things that I can do.
00:19:45.200
There are some things that I'm, I'm not going to change my ways on, but I can afford the expensive
00:19:53.100
stuff for my house. And by, by, by putting those in my house, I'm helping making them cheaper for
00:19:59.940
other people. I do believe that man is probably involved. There's no way man can't be doing all
00:20:06.440
that we're doing and it's not affecting it somehow or another. Yeah, I agree. I also think you can look
00:20:11.240
at the thermometer and, you know, is it going up or down? It's hard to track it before 1900 because
00:20:19.380
the records were so bad. And I think some of it is screwy now, but we're getting better at tracking it.
00:20:26.960
Predicting it. I have no idea. The climate is always in flux and always changing,
00:20:32.940
but I'm willing to do the things that I think we should do. Uh, but I don't agree with the
00:20:40.600
solutions to, I don't agree with deals, not a solution. I know that that's most of it's not
00:20:47.140
a solution. Right, right, right. Most of it is about crippling, um, the West or keeping the rising
00:20:55.380
economies in poverty, which I think is absolutely immoral. I don't, I disagree with the first one.
00:21:01.560
I don't think there's an intention among like many people in power to say like, we want to,
00:21:06.980
we want to hurt America. AOC? Well, AOC is different though. Okay. Right. She's this far
00:21:11.660
left socialist type who has this identity. I mean, look, that, that Green New Deal was talking about
00:21:16.780
like racial equity. I'm like, what does that have to do with the environment? It is none of it. And,
00:21:21.080
and I'm, I'm deeply offended by that. You know, not like my offense matters to the greater picture,
00:21:26.160
but no, I am. I mean, I want, um, greener pastures and bluer skies and all this beauty
00:21:31.840
and everything. Like I'm sure most people do, but when she comes out, all she did was make the
00:21:36.780
environmental movement look ridiculous. And it's, and it, and then you get environmentalists just
00:21:40.960
backing her for no reason. But you know what? I've, I've talked to some prominent progressives
00:21:45.360
and they asked me if I support the Green New Deal. And I say, not hers. And they asked me why.
00:21:50.900
And then I asked him, did you read it? And they say, no. And that's, and there we are.
00:21:54.900
So they're, they're under the assumption the Green New Deal is going to be this like
00:21:58.700
government investment into solar power. And I'm like, no, it was like free college and free
00:22:04.580
healthcare, which had nothing to do with the environment. In fact, if you, if we had open
00:22:08.420
borders, if not even let's, let's, let's clarify the open borders things. I know the left is going to
00:22:12.560
jump and say, Oh, they're lying. They're misconstruing. Sure. If you don't criminalize illegal border
00:22:16.920
crossings and then provide government benefits to people to come here, you're incentivizing
00:22:20.400
the behavior, and the U.S. already produces too much carbon, right? So the last thing we need is
00:22:25.460
more people coming here and increasing the amount of carbon we're producing, right? Not only that,
00:22:29.420
but they're not vaccinated. So you've got all these arguments that doesn't, don't seem to make
00:22:33.100
sense. I think, you know, ultimately the way I reach out to people who are more conservative on
00:22:38.620
the issue is look, the government offers you a tax incentive if you buy an electric car. Good.
00:22:43.420
What can we, can we do more things like that, but aren't, aren't the changes and, and I'll give
00:22:49.900
you government regulation has encouraged changes, but haven't the changes in the combustion engine
00:22:57.260
now, a brand new combustion engine is actually better for the environment than an electric car.
00:23:06.400
Is it, is it? I'm not. Yeah, it is. That's, I can't quote it for you right now and we'll look
00:23:12.920
it up, but I'm pretty sure. Yeah. Operate on the assumption that that's true. Um, uh, and it's,
00:23:18.900
you know, it's close, but it's still getting better. Right. Yeah. Um, and that's just innovation.
00:23:25.320
And I think there's, I go back to the case for Moral Sentiments, which is, you know, Adam Smith is
00:23:32.780
who wrote the, you know, um, Wealth of Nations. The first book is Moral Sentiments. And it says,
00:23:39.020
if you're a part of a society that has moral sentiments, good moral sentiments that can come
00:23:46.860
from just, everybody is just in love and everybody somehow or another has just become, you know,
00:23:52.720
Gandhi and Christ all put together, or there is some system that has taught them how to be moral.
00:24:01.940
That is really important because that will direct what they build and what they want.
00:24:08.860
I agree. We are asking and needing more and more, more regulation and more laws because
00:24:15.000
we're screwing this up. We're not, uh, uh, moral, decent people. Uh, I can't say for the most part,
00:24:24.440
but we're, we're losing that. I think we're increasingly becoming, uh, we, we agree less
00:24:30.000
on what is moral and what isn't. Right. Um, uh, yes and no. I think we're, I think, I think we're
00:24:37.220
just becoming morally lazy. Well, I, I look at it this way. Um, you go back to the fifties where
00:24:43.660
most people are Christian, you know? Um, and so there's very common values and it's really easy
00:24:48.660
to trust someone when you know, you have very, very similar base values. Well, things are changing
00:24:55.540
now. I, I wouldn't go to Portland. I wouldn't either, but it's not because of Christianity.
00:25:00.740
It's because, it's because we don't agree on thou shalt not steal, thou shalt not kill. Take out
00:25:06.880
thou, just make it Bob's safety tips. The Big Ten, give me seven of the ten. We, it doesn't matter who
00:25:12.960
said them anymore. Um, it, it, it, they just don't, they don't agree. They believe it is moral
00:25:19.980
to lie. Yes. Um, I say they, as in the, whatever this, uh, more vocal and dominating faction on the
00:25:26.900
left is, you know, the Antifa types, the regressive, whatever you want to call them. They believe by any
00:25:32.480
means necessary, even if it means hurting innocent people, they don't care. And so we don't agree on
00:25:38.080
what is moral now. I certainly don't agree with them. And I think that's why, you know,
00:25:42.460
there's kind of like this weird alignment between centrists, former liberals and conservatives
00:25:46.700
today. We still agree, you know, at least we have these base rules, but this group doesn't have it.
00:25:52.180
They believe... That's a majority? I, here's, I don't think the majority of the left hold these views,
00:25:57.800
but I do think the majority of the left is bowing to it. Just ignoring it, backing away. I mean,
00:26:04.220
you know, I had a conversation with a boomer who is a lifelong Democrat who told me I'm scared by
00:26:14.320
what's happening with immigration. The Democrats aren't doing anything about it and I can't say
00:26:19.100
anything because I'll be destroyed. So do you think that plays into what? Well, my friends who voted for
00:26:30.400
Bernie felt cheated by the Democrats and they're not paying attention. They don't care anymore.
00:26:34.220
They're just like, I'm done. It's pointless. And there are other people who are paying attention
00:26:39.220
that I know that feel scared. If they speak up, they'll be called a Nazi. Right. They don't.
00:26:45.080
So what are they going to do? Nothing. They, uh, I've, it's, will they vote? Um, independent,
00:26:50.980
probably. So I'm talking about an anecdotal group of, right, right, right. Yeah. I know,
00:26:55.160
you know, it's my personal sphere and, um, my family, lifelong Democrats. And, uh, right now they're
00:27:00.980
all going, what the hell is going on? You know, we're just, I don't see it. I, I, I have no problem
00:27:06.000
speaking up. Of course, they call me a right-winger. Even though you're not. I'm, I'm on a lot of, a
00:27:11.680
lot of issues probably around where Obama was. Yeah. And that's, well, that's right wing today to
00:27:16.220
these people. Look at the Democrats, the 2020 Democrats. I mean, we had an article from the
00:27:20.600
New York Times saying the governors are worried because the 2020 candidates are going so far
00:27:25.760
left. Promising healthcare to undocumented immigrants is mind blowing to me. Obama promised universal
00:27:31.080
healthcare at the end of his first term. Couldn't get it done. Yet we've jumped over that. And now
00:27:35.180
we're like, we'll just give healthcare to everybody. And he promised, he promised it won't go there.
00:27:41.720
Yeah. It won't go there. We will never do that. You have, you have Nancy Pelosi saying in 2008,
00:27:48.160
and I believe this is a pretty accurate quote, the last thing we need are more undocumented
00:27:55.160
citizens. Well, the, first of all, they're not citizens if they're undocumented, but how did you
00:28:00.860
get from there to here? You know what I think it is? Twitter. I really think, you know, Twitter,
00:28:07.740
all these journalists have become addicted to it. And so they're all writing the same things and
00:28:12.880
they're listening to these extremist activists, but you know what it is? It's a combination of around
00:28:18.100
the 2010s when Huffington Post, BuzzFeed, and these other blogs started to pop up. They were
00:28:22.820
exploiting human sentiment on Facebook for shares. Outrage generates the most traffic. It's a fact,
00:28:28.960
as we know this to be true. So you think about what's going to work in terms of selling content,
00:28:33.920
orange man bad, right? That's why we're seeing the Trump bump because scare people, shock them,
00:28:38.740
make them angry. You know, there was a website that I could be getting this wrong. I want to make
00:28:43.280
sure I clarify that, but we got a lot of gray zone stuff here. Well, I mean, I say this,
00:28:48.460
I say, I say it could be wrong specifically to avoid one lawsuits. Yeah. Okay. And also just,
00:28:52.720
it's been years since I've gone over the data, but there was a website dedicated to nothing but
00:28:57.740
police brutality on Facebook that cracked like the top thousand websites in the world
00:29:01.460
because people on Facebook would see this police brutality video and click share. And that was money,
00:29:06.780
money, money. So they went nuts with it. And it's, you know, you, you, you see these,
00:29:11.120
you see sites like that. You saw the problem of, there was a lot of fake news on the right as well,
00:29:14.380
because they were exploiting outrage with fake news. And then what was left is Vox, BuzzFeed,
00:29:20.800
Huffington Post, these websites that still try and maintain this fringe identitarian ideology
00:29:26.200
for rage, you know, for, for clicks. And that's where we're at now. And so what happens is you take
00:29:31.660
millennials who for the past 10 years have been inundated nonstop with stories about privilege and
00:29:36.320
diversity and police brutality. And now they live in this crazy world of nonsense where they're like,
00:29:44.620
it's a delusional state. If you ask me, I'll actually give you a real example of,
00:29:48.300
of how this happened. So Facebook created an algorithm. That algorithm shows content based on
00:29:54.460
certain criteria: engagement. Outrage generates engagement. So BuzzFeed staff, actually, I'll give
00:29:59.860
you a real example, Mic.com. Uh, there was an exposé where, uh, I believe Tablet magazine did an
00:30:05.060
exposé talking to former employees who said, we have a formula, X does Y to X. Like basically this
00:30:10.500
is, we write this and it will get shares. Humans trying to cater to the algorithm, which was imperfect,
00:30:17.720
sent people into this weird world of chaos, but, but that's hard to parse through. But this,
00:30:24.080
this next story isn't. YouTube created an algorithm. Parents will give their tablet to a baby
00:30:30.760
and press play on a video. And then YouTube automatically plays videos that are similar.
00:30:36.520
The first videos that people were giving to their kids was a song called Finger Family.
00:30:41.380
I don't know if you've ever heard the nursery rhyme, like finger family, finger family.
00:30:44.920
And so you'd see these cartoons of someone singing, very happy, very normal, very wholesome.
00:30:49.320
People in India, seeing an opportunity to generate traffic, because, you know, these views,
00:30:56.420
these videos were getting watched for 30 minutes because the babies can't press buttons. So they
00:31:00.740
started making really low quality versions that started getting twisted and deranged. They wrote
00:31:05.460
programs to create the videos for them. They sought out keywords in the algorithm that were hot and
00:31:09.820
were more likely to be recommended. And all of a sudden babies were sitting in front of tablets
00:31:14.520
where they would watch the Incredible Hulk dance with Adolf Hitler while this creepy muffled voice
00:31:21.720
was singing this very out of tune song. And the parents didn't know their babies were watching it.
00:31:27.300
It was automatic. So YouTube fixed this problem, right? But that story gives you a better view
00:31:33.360
of what human, like the, the adults are getting into and not realizing it. Right. When you look at
00:31:38.240
a video of Hitler dancing with the Incredible Hulk while some Indian guy sings into a 1980s microphone,
00:31:43.540
you're like, this is insane. And it was like, it's, and there were different versions of like
00:31:48.160
a Spider-Man and Elsa doing it. We started seeing these really weird videos of Spider-Man and Elsa
00:31:53.160
giving each other injections for, cause it was, yep. And this got to the point where you actually
00:31:58.300
had a video in Russia where a father held his daughter down and gave her an injection on camera
00:32:02.180
and it got millions of views because the algorithm didn't know what was actually good content.
00:32:07.720
It only knew like it gets a lot of watch time, right? So now think about what we can visibly see
00:32:14.140
as insanity, delusional content where babies are watching just, it was, it was like an acid trip.
00:32:20.480
And now you have to realize the stuff that's being fed on Facebook by BuzzFeed, Vox. It's very,
00:32:25.900
very similar in its derangement, but it's masked. You know, adults can't see that they're being fed
00:32:31.400
this nightmare reality. And now here we are with the Democrats promising healthcare to people who
00:32:35.440
aren't citizens, which we can't afford, which makes no sense.
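The mechanism Tim describes, a recommender that optimizes a measurable proxy like watch time with no notion of quality, can be made concrete with a minimal sketch. This is a toy illustration, not any platform's actual system; the catalog, field names, and numbers are all invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    avg_watch_minutes: float  # the proxy the ranker can measure
    quality_score: float      # what it cannot see (never used below)

def rank_by_engagement(candidates, k=3):
    """Rank purely by watch time; note that quality_score never enters."""
    return sorted(candidates, key=lambda v: v.avg_watch_minutes, reverse=True)[:k]

catalog = [
    Video("Wholesome nursery rhyme", 4.0, 0.9),
    Video("Auto-generated keyword mashup", 30.0, 0.05),
    Video("Educational short", 2.5, 0.95),
]

for video in rank_by_engagement(catalog):
    print(video.title, video.avg_watch_minutes)
# The 30-minute auto-generated mashup ranks first, because a baby
# who can't press a button maximizes the only signal the ranker has.
```

The degenerate outcome follows directly from the objective: whatever maximizes the proxy wins, whether or not it is good content.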
00:32:38.540
So where does that end? Oh man. Um, civil war, chaos. I mean, look, I'm worried about these babies
00:32:49.820
who were inundated with this nightmarish content. That's, that's, that affects them. Oh yeah. It's
00:32:55.640
the most impressionable years of their life. They're going to grow up with weird things in their heads.
00:33:00.660
These people, these millennials who were teenagers, late teenagers started getting inundated with this
00:33:06.480
fringe outrage content are now living in this nightmare realm that can't be broken out of
00:33:11.180
because they've spent, you know, 40% of their life in this world that isn't real.
00:33:15.840
Okay. So, so, so, uh, before we go to the future, help me here. I was just talking to a senator just a
00:33:24.320
couple of days ago and, um, very intelligent, very well read, um, constitutional guy. And, uh,
00:33:32.840
we were talking about Google and YouTube and Facebook and everything else. And he said,
00:33:37.540
uh, he said, I'm really torn. He said, cause I don't want to regulate them out of business. He said,
00:33:46.420
because I believe in the free market, he said, but then again, we have to have some safeguards. So
00:33:52.620
this kind of stuff doesn't happen. Uh, and I said, well, but they are so fast and things can change
00:34:02.860
so rapidly and it's only going to get worse with machine learning. It'll just, it'll just happen
00:34:08.400
overnight. And I said, I, I really believe this election could be deeply affected by just algorithms
00:34:17.600
that they're designing. Now, you know, there's the Creepy Line project by the Harvard professor
00:34:22.580
that is documenting just the way they change the recommended, uh, videos, or by the way,
00:34:30.620
Google stacks the news. When you go in and search for something, they can swing independents 80% in
00:34:37.320
the other direction. Oh yeah. Yeah. Out of sight, out of mind. If the only thing you hear is from one
00:34:42.000
side. So, you know, I've said this before and it's, it's meant to be a little hyperbolic, but I think
00:34:46.540
if something isn't done, Republicans as they are today will cease to exist.
00:34:52.020
By, by when? Uh, I mean, I'm being hyperbolic within the next couple of elections. Uh, look at,
00:34:58.760
I'll, I'll give you a good example. Donald Trump, he's the president. A lot of predictions, uh,
00:35:02.480
models say he'll win 2020. I think that's a fair point, but Twitter said he breaks the rules.
00:35:08.420
He's not allowed to be on the platform, but here's what we're going to do. Twitter said,
00:35:11.640
we're going to put a notice on his tweet if they break the rules, but you know what they're really
00:35:15.340
saying? There's a line in the sand. As of today, no one, whoever speaks like the president will be
00:35:20.720
allowed to use this platform. Well, if you can't use Twitter, which is one of the dominant, you know,
00:35:25.820
spaces for public discourse, then we will never see the rise of someone like Donald Trump ever again.
00:35:32.480
Well, that's Google's stated goal. Facebook has said the same thing internally. The Good Censor,
00:35:39.120
yeah, we can't let this kind of stuff happen again. You know, they're there. They believe that
00:35:44.820
they are right. And as no offense, but as progressives usually do, they believe they
00:35:51.820
are the right ones. They have more information. And so I'm going to just take progressive steps
00:35:57.900
to get you dummies to go along. There's two ways of looking at progressivism that way.
00:36:03.500
Okay. If it is good and we all agree, that's authoritarian progressivism. Yes. Yeah. And
00:36:09.740
I disagree with it. And that's where they are. That's where they are. Yep. I mean, there's a,
00:36:14.180
there's a, a real problem in my opinion, that is going to happen before we even get up to speed
00:36:21.680
to where these companies, and I am for the free market, but these companies are getting double dealt
00:36:29.920
by the government. They're getting platform status and they're getting publishing status. So they get
00:36:36.820
all the complicated, complicated problem too, but they've got, but you can't be protected on all
00:36:43.920
fronts and then start to say, you know what? I want to do good in the world. And so I believe this is
00:36:50.100
good. And so I'm going to steer people this way. We're, we're talking about companies that
00:36:55.540
will not have, no one will have the protection of the constitution because the constitution applies
00:37:02.920
to the government, not to private companies. The, uh, tyranny is being outsourced. Yes. That's that
00:37:08.200
I find that nightmarish. So it's funny, you know, you get these woke activists, they'll, they'll be
00:37:13.140
like, here comes Tim to the defense of conservatives. And I'm like, first of all, remember that poem
00:37:17.120
"first they came for..."? Yeah. Right. I'm, I, I look to the conservatives in this country as doing what
00:37:22.540
conservatives always do. I've heard the arguments. I disagree with a lot of them, but Hey, we're all
00:37:26.900
Americans. And the point is we find that compromise. The problem now is nothing to do with
00:37:31.580
conservatives. It's a handful of billionaires running unaccountable corporations with government
00:37:36.640
protection. Where's the left? I mean, I grew up with liberals saying corporation bad. And I'm like,
00:37:42.380
I agree. I grew up making fun of things like Blade Runner going, please. I work for the
00:37:48.560
corporation. Shut up. I've always been there. And for the last 10 years, I find myself with a
00:37:55.640
growing number of conservatives all saying, ah, something's really wrong here. Yep. Something's
00:38:02.480
really wrong. And you know, you know, what really, what, what really annoys me is I've had a lot of
00:38:07.080
conversations with inaction conservatives. I'll call them because I'm not saying regulation is
00:38:11.740
necessarily, you know, what has to happen, but something has to happen. Yes. And there are so many
00:38:16.200
conservatives that say, no, no, no, we should let them do what they want. I'm like, okay, great.
00:38:20.600
You'll be banned first. I mean, your opinions won't exist anymore. Right. You know, here's the
00:38:24.740
thing I noticed. I was talking to this guy who's a far left activist. He told me he was a centrist.
00:38:30.380
He was a socialist. Like socialism is not the center. Quite literally on the political compass,
00:38:35.260
socialism is as far left as you can possibly go. And laissez-faire is as far right as you can
00:38:40.440
possibly go. And I'm talking about the compass, not like American cultural standards. And he thought he was in
00:38:45.000
the middle. And so you realize that, and you look at someone like, you know, Jack Dorsey or Sundar
00:38:50.480
Pichai, and they think they're in the middle when they're far left. So then they go, okay, so
00:38:56.440
Democrats are far right and communists are far left and socialists, you know, social, socialist
00:39:02.440
Democrats, you know, our democratic socialists are here in the middle. We should, we should ban
00:39:06.600
everyone to the right of a Democrat. That's literally what they're doing, man. I got to say,
00:39:10.780
you know, it's really scaring me too. I, it's my, we're going to derail a little bit,
00:39:13.500
I guess, but I was talking, uh, on the Jack, on the, on the Joe Rogan podcast with Jack
00:39:18.200
Dorsey. And right away I said, your rules are biased against conservatives. And the look
00:39:23.220
on his face, he didn't realize it. He said, what, what rule? He genuinely believed
00:39:27.160
his rules weren't. And I said, first of all, the misgendering policy. Like, I understand
00:39:31.040
why you want to protect the trans community. They have very high suicide rates. And I can
00:39:35.480
agree with, you know, I can respect that sentiment, but you have to realize more than half
00:39:39.020
this country, like substantial majority of this country, even liberals do not agree with
00:39:44.240
a lot of what's happening in this argument. Like, um, like transgender, uh, women competing
00:39:48.980
against biological women, right? I'm like lifelong Democrats are concerned about that. When you,
00:39:53.720
when you say you're going to ban every conservative because they hold this view, or, or they
00:39:58.300
have to bend the knee and never express that view. You've got a rule base that's, is basically
00:40:02.300
threatening them at any moment. The sword of Damocles over your head. You can be, you, kicked off
00:40:06.060
the platform. Now what do we see in Canada? A trans woman has filed a human rights
00:40:12.420
complaint against a Brazilian, uh, waxing salon because the trans woman has male parts and
00:40:19.160
is demanding the female make contact. And this, this, um, professional
00:40:27.280
waxer shut down her business. And we've just gone insane. That to me is crossing the line.
00:40:33.060
We are, we are denying reality. Yeah. At the same time, we're denying an individual right
00:40:41.720
to say, I don't want to participate in that. I don't care if you do it, but I don't want
00:40:47.740
to participate. This right here is like, uh, when it came to the baker in Colorado, whether
00:40:53.400
you should make a custom cake for a gay wedding, I've always been like, it's just a cake. You
00:40:57.640
don't have to, like my opinion was if you're accepting a public infrastructure, a taxpaying
00:41:03.460
citizen who helps contribute to the pipes, to the sidewalks, to this, to this pub, to
00:41:06.920
your safety and fire department, then just make the cake for them. Right. But I got to
00:41:10.320
admit, making a cake is simple. You might disagree with it, but waxing a male's privates is like
00:41:19.740
Right. So that's a principle, but it doesn't... A nightmare. I mean, you know, art is so sacred.
00:41:25.880
Well, my father was a cake decorator. I grew up in a bakery and his, he, that was his art.
00:41:32.540
Right. Right. And you're going to force people their art. And you know, you can define that
00:41:38.200
in many, many ways. What I do to some degree is, is an art form. You know, you're, you're
00:41:45.080
doing television, you're whatever. It's an art form. You, you can't tell people and force
00:41:51.820
people to do the things that are against what they feel, truly feel.
00:41:57.880
But this is, this has been so crazy for me because man, I've, I, even in the past few
00:42:02.180
months, I made videos where I said, listen, you know, that gay couple, they're paying taxes.
00:42:07.640
Okay. You're reaping the benefits of their income in our community to turn them away because
00:42:12.080
you don't want to write some words on a cake, but writing, but the issue then becomes, uh,
00:42:16.260
actually, I've got another really great point on this. We're getting now to this, you know,
00:42:19.460
waxing thing in Canada and it's stressing my view on the ethics of whether a business should have to
00:42:24.100
do anything. But I, but I thought about something similar in that it's really easy to hold a
00:42:29.180
principle when you don't challenge the line. For instance, um, you're familiar with Blackstone's
00:42:33.500
formulation. It is better that 10 guilty persons escape than one innocent suffer. Or, um, yeah.
00:42:38.800
So I thought about that and I said, okay, would you free 10 shoplifters if it meant one innocent
00:42:48.400
person accused of shoplifting would go free? And most people would say, oh, of course, would
00:42:53.940
you, would you free 10 child rapists if it meant one innocent person would go free? And
00:43:00.000
then all of a sudden people start questioning Blackstone's formulation because that's 10 people
00:43:03.540
did some nightmarish crime. And so, you know, it's easy to hold a principle when your perspective
00:43:09.500
is shoplifting, but you have to, it has to remain in place no matter, no matter how bad
00:43:16.100
the crime, right? And otherwise this is the, well, this is the ethical and moral conundrum
00:43:19.720
for me. Cause I'm like, am I going to be the person to tell that woman she has to touch
00:43:23.360
that man? No, that's that line is hard. That's hard. We, we, we have, but we have to respect
00:43:32.440
that people have different viewpoints, different lifestyles, different everything. But there's
00:43:39.080
if, if you don't protect the individual right to be themselves and the individual right, um,
00:43:47.680
to say, I'm a, I disagree with that. What, what do you have? You have, you have mom and
00:43:55.940
dad called into the room every time brother and sister are having an argument and then
00:44:00.580
they have to decide, right? They sometimes don't make the right decision. You know what
00:44:04.920
I mean? You have to say, you know what guys, you're going to have to work it out. You have
00:44:11.100
to get along cause we're all living in the same house. You know, it's really interesting
00:44:14.820
though is, uh, the cake argument, as I understand it, is often mis-, uh, understood
00:44:20.600
by liberals, the left. I hate saying liberals cause let's be real. Like the left is not,
00:44:24.800
but the people on the left think the issue was he said, get out of my store. When in
00:44:28.040
reality he said, you can have any cake you want, you're not going to get me to draw what
00:44:31.640
you want. What's interesting about that is it's a first amendment argument. I can't be compelled
00:44:36.840
to speak. That's Twitter's argument for banning whoever they want. Twitter's argument
00:44:41.240
is that speech is coming from Twitter and we can't be forced to permit it. It's a first
00:44:46.360
amendment argument, but then they are not a platform and should not have platform protections.
00:44:53.040
But outside of this, right? Yeah. The idea is why should a baker be forced to speak on behalf
00:44:58.960
of the gay couple, but Twitter not be forced to speak on behalf of the citizen. It's great.
00:45:03.400
Arbitrary lines, right? It's, it's, it's man. Um, you know, I think it's funny that there
00:45:09.200
are a lot of people who engage in like political commentary and debate that think you have to
00:45:12.680
have an answer to everything and they approach everything as a debate where one either is
00:45:16.560
or isn't. And I'm like, look, I can just flat out say, honestly, I don't know where that
00:45:19.280
line is. It's a hard, it's a hard ethical, you know, like the pro-choice pro-life thing
00:45:22.820
for me is one of the hardest ethical issues I've ever, you know, had to even try and figure
00:45:26.900
out. But we, well, we can go to the platform publisher thing, you know, to move on from
00:45:32.400
this. Hang on. Let me, let me stop at abortion. Yeah. The problem is the problem
00:45:37.400
with this is I am pro-life, but the thing that I have the hardest time doing is at conception
00:45:51.720
at two weeks, if it's rape, if it's incest, I don't want to be the dad that has to go to
00:46:00.540
my daughter who's just been raped. And so you got to carry that child. Right. However, for
00:46:05.320
me to be consistent, that's what I have to say, but I don't like it. And I think there's
00:46:11.700
a lot of people that are there. They're like, look, I don't want to get involved in somebody
00:46:15.560
else's life. I don't know all the facts. And so they, they will be mushy there, but we're
00:46:22.920
not that we're not having that conversation. Now we're having the conversation of kill it
00:46:28.220
at any time for any reason, even after they're born. I mean, it's right. Right. So, um, well,
00:46:34.200
we'll jump into this topic. Uh, I've grown up Democrat, Democrat family, pro-choice, except
00:46:41.080
we were always like, there's a limit at a certain point we can't. And so safe, safe, rare. And
00:46:49.560
what was the other one? It was like available or... Legal. Legal, safe, and rare. Yeah.
00:46:54.800
And now, uh, I remember seeing the segment on, um, I think her name is Michelle Wolf or something.
00:47:00.340
I can't remember her name. Was that the comedian? She had a show on Netflix and she was yelling
00:47:04.100
abortions for everyone. You had Lena Dunham say that she wished she had an abortion. And for me,
00:47:09.180
that was kind of like, what, what's happening? I saw, I saw a person in Seattle, an abortion
00:47:15.220
activist, and she was giving a speech in Seattle. And she said, you know, my first abortion was in
00:47:22.120
Seattle and it was my best abortion. I thought you go to a doctor, you go to a vet and the vet says,
00:47:30.760
Ooh, putting your dog down. That was my first, that was my first euthanizing an animal. That was
00:47:37.040
a good one. You go to everybody, you know, and say, this person has deep issues. This is,
00:47:41.600
this is a really good, uh, man, there's so many issues to talk about how the mainstream media
00:47:46.240
supported left is losing their minds. Polls show most Americans believe in a limitation at a certain
00:47:51.240
point. You know, it was like after the first trimester or something. I don't know. The way,
00:47:54.780
the way I've always viewed it is I grew up with very, you know, my, my dad was always kind of
00:48:00.520
conservative. My mom was pretty much a hippie and, but we were Democrat pro-choice and all that. And
00:48:05.600
it was, you know, my dad saying something like, look, you're, you should never be happy about an
00:48:10.740
abortion. You know, if you're ever with a woman and something, you know, you have to be responsible.
00:48:14.660
And there are certain circumstances where we begrudgingly accept we need the ability to have an
00:48:18.740
abortion. That was the growing up. It was like, it was really bad. And if you had to do it, you had
00:48:23.820
to, and it was something to be upset about today. It's a joke. Oh yeah. People like people wearing
00:48:29.100
shirts that are saying like, you know, going to go get an abortion or something. But most Americans
00:48:33.500
don't feel that way. So here I am as somebody who has always been a moderate, you know, Democrat leaning
00:48:39.180
to the left of my family saying, yeah, right. Um, for me, I'm pro-choice on libertarian grounds and it's
00:48:44.400
a very, very complicated ethical position. I am 100% within every inch of my body opposed to the
00:48:49.180
death penalty. And that makes it really hard when it comes to pro-life. I believe life begins at
00:48:54.480
conception. I don't, I don't see any way you can argue against that. It is life. It exists,
00:49:00.200
but I also think there's a challenge in having the government mandate that one person share their
00:49:05.260
body with another person, while recognizing, sure, you can have irresponsible young kids who got
00:49:10.840
themselves in that problem, but you also have victims of assault now being forced by the
00:49:14.600
government. And I don't know whether I should be involved in the moral, ethical, and health related
00:49:21.380
decisions of an individual at a governmental level. Do I want this? Do I want the cops to go in with guns and
00:49:26.600
force this person to do that thing. And so all I can really do is say within me, the only thing I can
00:49:32.280
do is lean towards more freedom, even though I understand there are two lives at stake. I can't
00:49:37.180
compel someone to provide their body to someone else. I just don't know. I don't know the answer.
00:49:50.040
That's the best I can do. I would, I would, um, see your reasoning. Um, if it stops at rape or incest.
00:49:50.040
Well, no, not even that, not even that, because I don't blame the baby for that. I don't blame them.
00:49:55.060
No, no, no. I know that, but you're saying I can't, the government can't compel you to have,
00:49:59.840
um, well, okay, but the government's not compelling you. Those are your consequences.
00:50:07.620
You know, I stand on the railroad track. Well, it's like, um, I'm going to get mowed over by a
00:50:13.060
train eventually. One of the challenges with like later term abortion is severe deformity or
00:50:18.780
inviability. And so it's, it's hard for me to believe the government has the ability to know
00:50:24.700
a one size fits all solution. And here's the bigger ethical conundrum. The data, at least I've
00:50:30.160
looked up shows that the majority of abortions are for no reason at all. And so it's like, how do you,
00:50:36.660
that, that I don't believe there's like a good, like compromise to this problem at all. I don't
00:50:40.900
know if there's a solution at all because you either have there, if you're talking about rape,
00:50:46.200
incest, horrible deformities, you can at least make a case to all reasonable people, right?
00:50:54.700
To say no abortions. You can, you could make the case to a moral person, but then it breaks apart
00:51:05.780
when you start to add in the human element of, Hey, but we don't know everything that's going on.
00:51:12.780
But the abortion anytime for any reason, I don't see the argument. Now here's the problem though.
00:51:18.660
When it comes to the arguments of rape and incest, how does the government determine
00:51:22.900
it was or wasn't, right? Is someone lying to get it? Or, my problem is, the baby isn't responsible.
00:51:29.620
Right. Exactly. This is a tough, tough problem. And, and, and, you know, I think, but that's,
00:51:34.760
but that the struggle to get to an answer is just as important as the answer. As long as we're
00:51:43.060
struggling to get to the answer, but I don't think we are, I think we've just abandoned all good
00:51:49.820
faith. Well, whatever this left leftism, there's not even like here we are having a conversation
00:51:55.320
trying to figure out the ethics of what is right and what is wrong, like normal human beings. And
00:51:59.120
the left is basically saying at any point for any reason. Right. And I'm like, wait, wait, hold on.
00:52:02.520
You, you are in the other room. We're not, you're not even in a conversation. What's going on?
00:52:06.020
You know, for me, it comes down to like, you have a victim of rape, incest, and do you have
00:52:12.020
to then have them present evidence? Do they go through some ordeal or is it like between you and
00:52:16.600
your doctor? I just lean towards libertarian, I guess. And I don't think I'm right or wrong. I
00:52:22.260
just, I actually think I'm wrong in a lot of ways on it, but I don't know what to do. You know what
00:52:25.940
I mean? So it's, it's, I think that's where most people are on a lot of big issues. Right. And they
00:52:32.680
just, what they want is an end to this nightmare of calling people names for, you know, you look at
00:52:39.960
identitarians. You've been using that word a lot. And yeah, the identitarians of the, of Europe
00:52:46.420
is wildly misunderstood in, in my, in my view. Yeah. Yeah. Yeah. There are those who are,
00:52:54.580
you know, Nazi nationalists, blah, blah, blah. But, and it's happening here in America. And it's one of
00:53:00.160
the reasons I think Donald Trump won, not for the Nazi reason that they keep putting out, but maybe
00:53:05.600
in one way, because of the Nazi thing, if you believe in your country, if you believe that,
00:53:12.260
you know, this is a good place, you're demonized over in Sweden. You're, you're a racist. If you,
00:53:18.880
if you won't fly the EU flag and you want to fly the Swedish flag, that makes people automatically
00:53:24.940
push back and go, you know what? My, my identity is important too. I don't think, I think, uh, so let me
00:53:33.460
clarify for those that might not understand: identitarian is identity-based government, right? So identity
00:53:38.280
meaning in this instance, your race or the races of individuals. So in Europe, you have these groups
00:53:44.340
that are concerned about mass migration and the displacement of the indigenous population.
00:53:49.000
It's actually interesting. They call themselves indigenous rights activists because white people
00:53:53.060
in France are the indigenous population. Uh, in the U S I don't think, um, just based on the people I've
00:53:58.960
talked to that white identitarianism was a major factor to Trump's victory. So I'm not talking about,
00:54:05.100
I'm not talking about white, um, nationalism. What I'm talking about are, is this feeling,
00:54:12.140
and it's always connected to white nationalism. Okay. Um, by the press and by people who are,
00:54:18.580
I think being insincere or just not well-read the, the idea that my culture, I don't care what color you
00:54:26.440
are. My culture is important, not to the expense of other cultures. I don't mind, come in and bring
00:54:34.080
the best of yours and let's blend in, but it's underneath the umbrella of America. Yes. Yeah.
00:54:40.900
And it's, and what the problem is, is that people are now saying, no, all cultures are equally great.
00:54:47.580
Well, you know what? There's been some bad cultures that have happened and I don't care what color it is.
00:54:53.020
Hey, what do you think about the culture of the, of Germany in 1930s and forties? It sucked. Okay.
00:54:59.560
So, so everything is not equal. There may be some great aspects of it. There might be things that
00:55:06.000
you're proud of, but why can't, why do I have to celebrate this and this is torn down? And that's,
00:55:13.220
that's the problem. That's the rub that I think people are feeling. And many politicians,
00:55:19.660
many groups, uh, and especially the media are exploiting it or they just don't understand what
00:55:27.260
this problem is. Well, first I'll clarify that's not identitarianism, right? So identitarianism is,
00:55:31.640
is usually about what your identity is, things you can't control, you know, be it male, female,
00:55:35.480
white, that just valuing American culture, altruism. I don't know what you want to call it.
00:55:40.720
Right. But what is happening and you probably will, I mean, please correct me if I'm wrong on this.
00:55:46.320
And what's happening in, for instance, Brexit, I think there are some people that are part of
00:55:53.420
Brexit that are white identitarian. Totally. Okay. But there are also people who are so beaten up
00:56:01.440
and see what's happening to their country and see what's going on. And no one is recognizing them.
00:56:07.360
Right. The powers that be are putting them into this group, so they just say, you know what? Nobody is freaking
00:56:13.640
talking about me. Nobody's helping me. Exactly. At least these guys, you know, are not calling me
00:56:20.220
names and I don't like what they stand for, but they're being pushed into that corner.
00:56:24.920
This is why I think, I do think that at current course, the way the left and, you know, look,
00:56:31.240
the left in this country has gone so far off the path and it's not my opinion. It's the New York
00:56:35.840
Times data. It's Quartz. It's Pew. It's Gallup. It's The Economist. I've got the charts. I show
00:56:40.600
them all the time. The media feels safe for whatever reason with this fringe far left identity
00:56:46.600
based world. I don't understand. You know what? And that's why the center and the right are kind
00:56:51.780
of, you know, more aligned. And the left is... the media just thinks it's safe. They've always felt
00:56:55.960
it safe. I have no idea. But what do you think happens when you have Sarah Jeong? Do you know who
00:57:01.240
she is? She's the, she was on Twitter for three years. Oh yeah. Yeah. Anti-white racist
00:57:05.420
posts. Yes. New York Times hires her. It gives her a prominent position. The alt-right responded
00:57:10.900
by saying, bless the New York Times. I wish we had a thousand more Sarah Jeongs. They didn't
00:57:15.360
hate it. They loved it. They wanted that because they know the kind of things she was putting
00:57:20.080
out will force white people to join the, well, they're hoping white people will then come
00:57:24.780
to the alt-right because. Well, if the alt-right is, the media is already saying that the,
00:57:32.700
the right, Trump supporters are part of the alt-right. They are not. It is a big, there
00:57:39.360
are some people in the alt-right that support Donald Trump. Yep. But there are very few people
00:57:44.060
that support Donald Trump that support the alt-right if they know what it is. But because
00:57:48.440
the media says you're all alt-right. Yep. A lot of people just go, well, I guess it can't
00:57:54.540
be that bad. I'm part of the alt-right. That's dangerous. Now that quote is going to get taken
00:57:58.580
out of context because you just said it. So I've actually talked to some young people who
00:58:05.340
told me they were, and I was just like, I, you know, I completely disagree where America
00:58:11.160
was not, it's not like, you know, look, it's not a European nation. White people didn't come
00:58:16.280
from here. We came here, and there are other people that were here too. And they're like,
00:58:19.380
oh no, I don't care about race. And I'm like, wait, what? Do you know what alt-right means?
00:58:23.460
Literally it's like neo-confederate white nationalism. And they were like, oh,
00:58:28.380
all they ever saw was people they thought were funny being accused of being alt-right and assumed
00:58:32.940
alt-right meant like. When Ben Shapiro is alt-right. Oh my God. You know, sign me up, I guess.
00:58:41.420
That's not what it is. But, but you've got, you know what I think? I think a lot of people
00:58:45.600
on the left are so far left, they look to the right and they actually, I'll put it this way.
00:58:53.560
One of the things I describe is this event that happened in Boston where you had the right and
00:58:58.360
the left on one side on the right, you had a guy waving a Confederate flag on the other side,
00:59:03.460
you had a guy waving a Soviet flag. So I was talking to one of the organizers of the protest
00:59:07.940
and I said, you know, I'm sure they would welcome you over there for conversation. And he was like,
00:59:12.660
are you kidding? Look, they're flying a Confederate flag. And I said, oh, that's one guy though.
00:59:16.400
And they're like, yeah, but they're, he's there with them. They support him. And I'm like, no,
00:59:20.360
it doesn't mean they support him. I'm like, you guys are flying a Soviet flag. And he goes,
00:59:23.420
no, we're not. And then I said, look behind you. He didn't realize the flag was behind him.
00:59:28.140
So here's what happens. On the right side, they look over the hill and they see the big Soviet flag
00:59:31.460
and they assume everybody over there must be a communist. They look over on the right,
00:59:34.520
see the Confederate flag. So I think that's, that's kind of a good way to explain what happens when the
00:59:38.200
left is so far left. They look to the right and the center, they can see the people in the center,
00:59:43.100
but then to the far right, they see the flags and assume it's one group. When actually we're far
00:59:46.780
away from each other, you know, they can't, they can't tell the difference.
00:59:49.360
Let me switch subjects. Cause I hate to have you here and not go deep on, um,
01:00:12.600
on tech. Um, China 2020, uh, is, I think, 1984. Oh man. And America 2020, whenever, I think is
01:00:30.960
Brave New World. I think we're headed the same direction. And when Peter Thiel, who is not an
01:00:38.180
extremist says Google should be at least looked into for treason. Wow. I have, yeah. Wow. And I
01:00:48.520
have a hard time not saying, I don't know about treason. It's the, it's very specific on what that
01:00:55.580
is, but that's really dangerous what they're doing. You know, I kind of feel like China has a
01:01:02.580
very authoritarian system. You do what China needs like, but that's good for them in the long
01:01:08.220
run. While life may suffer for the individual, they, I think they're on track to just trounce
01:01:13.800
the United States in a worrisome way. We've allowed, I mean, first of all, I think we have no purpose.
01:01:19.960
In the United States, millennials, the reason why we're seeing all this fringe, ridiculous activism is
01:01:24.780
because no one is, there's no mission. You know, they did a survey of the UK and the United States
01:01:29.660
about what you want to be when you grow up. Americans and Brits said vloggers on YouTube.
01:01:35.340
Chinese said astronauts. People in China have a mission. They want to be great. They want to
01:01:39.720
succeed. They want to fly their flag. People in America just want people to look at them.
01:01:44.040
And so what happens? They go on Twitter and they say silly things. So that says to me, you know,
01:01:48.660
if young people today don't care about the strength of our community, why would we have one in 20 years?
01:01:55.760
China will exist. It's going to be a crappy place to live in for sure. But they're going to have a
01:02:01.100
massive, they're going to have the commanding economy. They're going to have international
01:02:03.900
power. But as somebody who watches the tech world, um, it says something that Google will assist
01:02:12.580
China in some of the most, some of the worst things we've seen since, you know, fascism in the 1930s,
01:02:22.160
Pol Pot. I mean, they're, they're, they are brutal.
01:02:27.120
I think Google pulled out of the Dragonfly. There's like two things. There's Project Maven,
01:02:31.380
which was a government, I believe was like Chinese military in which they're like, oh,
01:02:34.860
we play a small role. Yeah. Right. If, if any at all, why should you? And this is really scary
01:02:40.640
actually. So, um, I believe they did pull out of Dragonfly, which was censored search,
01:02:44.880
but I'll tell you what's really scary is that Google starts in the United States built off the back of the
01:02:49.800
United States. They now, I believe they're headquartered in like Dublin, right? What do
01:02:54.000
you think happens if U.S. regulators come into Google and say, we want you to stop the
01:02:58.480
meddling. We want you to stop the search manipulation. Google can say, we make more
01:03:04.100
money in China. Screw you. And then they basically just do what China wants. What are we going to do?
01:03:09.820
Go to Ireland, stop them. We can't. So what's scary is the authoritarianism of China will force
01:03:15.660
Google to take action, but Google wants the benefit. So they'll bend the knee in the United
01:03:19.280
States. It just, it's an issue of which is more valuable to Google. Who is Google going to be
01:03:24.180
loyal to? If it's an international company at this point, that is scary. Absolutely terrifying.
01:03:28.860
Think about if, uh, you know, the conflict in the South China sea, which has been brewing for a long
01:03:32.980
time. And then China says, listen, Google, war is bad, right? Why don't you start weighing
01:03:38.440
things down in the U.S., so U.S. interests suffer. And then all of a sudden people in America are like, China's great.
01:03:42.800
We love China. You search for China. What do you see? It's all this really beautiful things about
01:03:47.340
China and Google saying it's better this way because war is bad. And then China's interests
01:03:52.160
supersede the United States. So, and that's not unrealistic. I think it's probably already
01:03:57.400
happened. I do too. I do too. Yeah. So, um, the China 2020, what I'm of, what I'm concerned about
01:04:05.960
is we're, we're just accepting it here. We're not even really standing up and saying, Hey, that's a,
01:04:15.320
that was an episode of Black Mirror. Yeah. Um, and we're accepting it and it's growing bigger
01:04:21.460
and bigger. Do you see a time where the doors close, where, where these big corporations have
01:04:31.600
the power and there's just no turning back from it? Absolutely. And you know, the thing is a hundred
01:04:37.640
years ago, if the doors closed, somebody would come and knock those doors down. Can't, you can't
01:04:42.760
anymore. You know why? As I explained earlier with the algorithms, you won't even know the door is
01:04:47.340
closed, right? You're all you're going to see when you open your phone is lollipops and rainbows and
01:04:51.660
you're going to believe life is good and you're not going to realize it. So it is the Matrix. Right,
01:04:56.460
right, right, right. It's, it's here's, I'll tell you what's scary about it. Right. At a certain
01:05:00.980
level, we can say, you know what, if we can avoid war, that's a good thing, right? Google has the
01:05:05.660
perspective. We should stop the bad speech and eventually there will be no bad speech. Well,
01:05:10.120
that sounds nice. Right. But here's the thing. As we learned from the YouTube story about Hitler and
01:05:15.460
the Hulk dancing, their algorithms aren't going to lead to what they want. Right. So the way,
01:05:19.960
the way I can actually explain it is YouTube thought their algorithm would lead people towards
01:05:25.140
funny sitcoms. That's what they had in their mind. They were like, what's a good show? Friends,
01:05:29.440
right? So let's tell the algorithm things like Friends are good: more than 10 minutes. People
01:05:34.080
watch more than 10 minutes. You know, it's got this amount of likes, this amount of views
01:05:37.200
and YouTube instead directed them towards an Indian guy singing into an eighties microphone of Hitler
01:05:42.100
dancing because it didn't know the difference. So they're doing this now. They think they're doing
01:05:47.040
the moral good and they're going to be working. It's going to be this big international thing where
01:05:51.120
it's like, actually, I believe what Jack Dorsey said was we're a global community.
01:05:55.140
They don't, they don't view themselves as catering to the American people.
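To make the proxy-metric point concrete, here is a minimal Python sketch of the kind of ranking Tim describes: the algorithm only sees engagement numbers like watch time, likes, and views, never what the content actually is. The videos, weights, and scoring formula are all invented for illustration; YouTube's real system is far more complex and not public.

    from dataclasses import dataclass

    @dataclass
    class Video:
        title: str
        avg_watch_minutes: float  # proxy signal: watch time
        likes: int                # proxy signal: likes
        views: int                # proxy signal: views

    def score(v: Video) -> float:
        # Hypothetical weights: the score rewards the "more than 10 minutes"
        # threshold and raw engagement, with no idea what the video contains.
        watch_bonus = 2.0 if v.avg_watch_minutes > 10 else 0.0
        return watch_bonus + 0.5 * (v.likes / max(v.views, 1)) + v.avg_watch_minutes

    catalog = [
        Video("Friends blooper reel", 8.0, 900, 10_000),
        Video("Hitler dancing with the Hulk", 14.0, 950, 9_000),
    ]

    # The absurd video outranks the sitcom because it happens to maximize
    # the proxy metrics -- the "it didn't know the difference" problem.
    for v in sorted(catalog, key=score, reverse=True):
        print(f"{score(v):6.2f}  {v.title}")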
01:05:58.240
I've talked to, I don't, for years I've talked to some of the people in Silicon Valley, really
01:06:03.980
deep thinkers that don't think the same way I do. Um, but they're fascinating to talk,
01:06:13.480
talk to because they see the world that is coming and they're designing the world that is coming.
01:06:18.900
They think they are. Yeah. They think they are. And they have said to me for years, um, Glenn,
01:06:24.640
you're, you're, you're, you're thinking old world. Um, the, we're entering a world where borders
01:06:33.180
won't matter. It's true though. It is, it is, but not in the way that we have ever felt of borders.
01:06:43.480
You know, the way we think of borders, they mean the, the corporations, uh, have no worry about
01:06:52.600
borders, uh, at all. And they, they are, it's a completely redesigned corporate entity.
01:07:01.680
The corporation will be, I mean, Facebook, if Facebook wanted to right now, they could be the
01:07:05.760
government. You'd never see, look, look, Elizabeth Warren came out and said she wants to break up big
01:07:10.940
tech, right? And then what happened almost right away, Facebook took her ads down and then brought
01:07:15.480
them back up later saying, oops, that was a mistake. Was it? Yeah, that's scary. YouTube, uh, I'm sorry,
01:07:20.920
Google and Facebook control nearly half, uh, or I believe they control the overwhelming majority of
01:07:26.220
maybe like, uh, I mean, each control around half of the digital ad market. They've just monopolized
01:07:30.200
the whole thing. Biggest ad agency in the world. Yep. So when it comes to the internet,
01:07:34.480
they can make you think whatever they want. You'll never hear a story again. And then, you know,
01:07:39.640
what blows my mind is why these people in media just love it. Defend Google all day and night. You know,
01:07:45.420
it's crazy. Uh, a few weeks ago, a bunch of Trump supporters and conservatives had a protest in DC
01:07:51.600
called Stop the Bias. Something about, I don't know what it was about. All I know is they were saying
01:07:56.020
these big corporations are silencing speech. Antifa showed up to protest the American citizens
01:08:00.420
fighting for their rights against corporations. The far left came out to protest, not authoritarianism,
01:08:06.420
but American citizens complaining and demanding a redress of grievances. It's, it's like the, um,
01:08:13.000
uh, what is it? Uh, the, the internet rules that they tried to get passed and, uh, Ajit Pai changed from
01:08:20.420
the FCC. Oh, I don't know. But, but I'll, but I'll say this when it comes to the talk about like
01:08:25.560
China and, you know, 1984 and how the algorithms are manipulating us. Think about the kind of
01:08:30.540
content that these people, these Antifa people see all, all, all day, every day, World War II
01:08:35.100
imagery, talks of parallels between Trump and Nazism. Google, whether they're doing it on purpose
01:08:40.720
or not, is manufacturing these people. Twitter is a great example. They create block lists. So you can
01:08:46.120
never see what I have to say. They only ever see the fringe far left and they are festering in this,
01:08:51.420
in this bubble where they just eventually get big and angry and then think they're morally just and
01:08:57.480
then act in defense of these massive corporations. What's the solution? I honestly have no idea. How
01:09:03.380
do you, how do you tell a company they're not allowed to have free enterprise? How do you tell
01:09:06.300
an individual they're not allowed to have free speech? Uh, I think perhaps there, uh, there was
01:09:12.480
one interesting option put forward. I can't remember which Senator did it. He's saying an algorithm-free
01:09:16.380
mandatory option for all social networks, meaning you can choose to see things as they are without
01:09:21.280
being manipulated. And then perhaps a guarantee of free speech on these platforms. Now the
01:09:28.080
conversation has to be had in a very particular way. We can't just say you have to have free speech,
01:09:32.020
but we can say publisher platform, right? This is an important distinction because legally there is no
01:09:37.440
publisher or platform distinction. The law just says, you know, a digital service will not be deemed
01:09:43.540
the speaker of, uh, something provided by a third party, which is a really interesting
01:09:49.380
conundrum in the United States specifically. In Australia, um, I believe a man recently sued media
01:09:56.300
organizations because they posted to Facebook and then someone commented defamatory information on that
01:10:02.060
post. And he said, that was the media company's speech. And Australia said, you're right. It is
01:10:05.780
because they created the space for it. They were obligated to regulate it. In the U.S., Section 230
01:10:11.220
protects them from this. But I ask if that's true, there's no distinction. Why can't, so you're
01:10:16.240
saying that the wall street journal can write, you know, Glenn Beck is an alien and make some ridiculous
01:10:20.920
false claim that he's a bigot racist. And here's proof. Make it all up. You can't sue him. You can,
01:10:26.320
of course, it's libel. You can, but, but hold on. Section 230 says you can't, you'd have to sue
01:10:32.100
the individual author, not the Wall Street Journal. Right? Because they have an editor. And so they look at
01:10:41.160
everything before it goes. The law doesn't say that though. Pardon me? Section 230 doesn't
01:10:45.440
clarify that. That's the thing. Legally, there's no distinction between platform and publisher. So how
01:10:50.640
can we be at a point now where the law accidentally gave the New York Times immunity? Right.
01:10:56.620
That doesn't make sense. Right. So there needs to be a clarification and, and I will make this point
01:11:00.960
stressing it too. When, when the left always says, you know, it's a private business. They can do what
01:11:05.660
they want. I say, that's wonderful. We create new regulations every day. End of story.
01:11:11.600
Sure. Today they can do whatever they want. I'm arguing they shouldn't be able to,
01:11:14.660
and now we should actually enforce something. Maybe what form regulation takes is a bigger
01:11:19.020
question. But the, the point is just because they can do it now doesn't mean they should be able to,
01:11:24.000
we need to either refine Section 230 to clarify, or we need to pass laws restricting, you know,
01:11:29.280
the banning of speech based on arbitrary criteria, you know, whatever. But, you know, I guess it is an
01:11:35.620
extremely complicated problem. It is the development of a new civics within our country that requires
01:11:40.700
the conversation and simply saying, do nothing. It's a private business. I think it's the wrong
01:11:44.580
approach. Where is the, where is the line where we are not going to know if free will exists?
01:11:55.660
You might be past it already. You know, uh, I'll tell you something creepy.
01:11:59.460
I watch on YouTube, anime videos like Dragon Ball Z and One Punch Man. And I don't watch politics
01:12:07.800
for the most part. Uh, even though I make political videos, I watch skateboarding and I watch cartoons
01:12:13.060
where people, Japanese people fight each other. And one day I got a video about people living in a van,
01:12:19.840
van life. And I thought that's weird. I've never watched anything having to do with living off the
01:12:25.480
grid or living in a van. I clicked it and I thought, wow, this is actually really interesting.
01:12:30.600
And then all of a sudden, all I I've been getting recently, every other video is van videos.
01:12:35.440
And I started thinking, why would, why would Google just send me this video? Now I'll be honest. I
01:12:40.180
ended up building a van. I now have a mobile production, you know, slash live shower in it.
01:12:44.720
So I thought it was a great idea, but something else happened. This woman, her name is like Jennelle
01:12:50.440
Eliana, in, in a couple of weeks with only two videos has like 1.5 million subscribers on YouTube.
01:12:57.320
It's one of the fastest growing channels ever for the amount of content she put out, which is almost none.
01:13:00.940
And now a lot of people are saying they believe it's an industry plant. I'm not saying that's true.
01:13:06.160
At the very least, we can say though, whether on purpose or an accident, the algorithm is putting
01:13:11.000
in the minds of young people not to buy homes and to live in vans. And think about what that's
01:13:16.020
going to do to the housing market in 10 years when millennials are like, I'd rather live in a van.
01:13:18.880
It sounds fun. Nobody's buying right now. Millennials aren't buying homes. Can't afford it.
01:13:23.080
So what happens then when no one can sell a home, the market bottoms out, everyone's
01:13:27.960
investments destroyed. See, that's where people don't understand this. For instance, um, it's
01:13:35.360
my understanding that Amazon is really looking to the future of just being a shipping company,
01:13:40.980
a prediction and shipping company. Once they have enough in their algorithm to predict you 90%,
01:13:47.720
95% of the time they'll just start shipping stuff before you even want to order it and
01:13:52.800
it'll be there. And if you want it, you'll, you'll keep it and pay for it. If not, you'll
01:13:57.780
ship it back. But do I really want it or did they shape me with their ads that are, were
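As a toy illustration of the "ship it before you order it" idea Glenn describes (Amazon has reportedly patented a version of this under the name anticipatory shipping), the whole decision reduces to a confidence threshold. The names and numbers below are hypothetical:

    SHIP_THRESHOLD = 0.90  # the "90%, 95%" confidence mentioned above

    def should_preship(p_order: float) -> bool:
        # Ship preemptively only when the model is confident enough that
        # you will order the item anyway.
        return p_order >= SHIP_THRESHOLD

    for p in (0.42, 0.88, 0.96):
        print(f"P(order) = {p:.2f} -> {'ship now' if should_preship(p) else 'wait'}")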
01:14:07.060
Well, there are two, two things. First, there's a story of a father who, uh, uh, picked
01:14:13.980
the mail up from his house and there were some, some coupons for his daughter for maternity products.
01:14:21.360
He got angry because he was like, why are they advertising to my young daughter? She'd be
01:14:24.540
a mother. And it turned out they actually knew she was already pregnant because of the
01:14:28.460
things she was looking at online, the things she was saying, they didn't mean anything by
01:14:32.060
it. They were like, Oh, she'll probably want this not realizing. But what's really scary
01:14:36.080
is, you know, a lot of people believe that Facebook spies on them because you'll be talking
01:14:40.860
about something and then you'll go on Facebook and see an ad for that very thing. And you're
01:14:44.400
thinking like they had to have been listening to me. It's actually, that's very naive.
01:14:48.220
The reality is scarier. Facebook can predict your behavior so well, they knew you were going
01:14:53.380
to talk about it. And this is true. There was an article that said YouTube, I'm sorry, Facebook
01:14:57.320
knows where you're going to eat lunch with a ridiculous accuracy. They know when you use the
01:15:01.980
bathroom, they can predict all of these things about you. And so they're not spying
01:15:06.040
on you. They don't need to. When you said to your dad or whatever, man, I'm really interested
01:15:09.740
in getting that Telecaster I saw at the Guitar Center. And then also you see an ad
01:15:13.420
for it. It's because of the behaviors before, seemingly irrelevant: I like nachos, it's raining. Those
01:15:19.740
behaviors are associated with someone wanting to buy a guitar. Somehow, in some way we don't
01:15:23.540
understand, the AI does. So you think they're listening to you? No, they're predicting your
01:15:28.300
behavior. Now you combine that with shaping behavior. And it could be as simple as if we show
01:15:35.160
you the post about nachos, we know you'll vote for Obama or whoever, right? You might
01:15:39.800
not think seeing endless posts about Taco Bell will impact you in a certain way, so you'll
01:15:44.420
never question it. But they know those seemingly, you know, irrelevant things do guide your behavior.
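A minimal sketch of the kind of prediction Tim is describing, assuming a simple logistic model: individually meaningless signals ("likes nachos," "it's raining") combine into a purchase probability. The feature names and weights are invented; real ad systems learn millions of such weights from behavioral data.

    import math

    # Hypothetical learned weights: no single signal means anything, but
    # together they correlate with buying a guitar.
    WEIGHTS = {"likes_nachos": 0.8, "is_raining": 0.5, "visited_guitar_center": 2.4}
    BIAS = -2.0

    def p_buys_guitar(features: dict) -> float:
        z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
        return 1 / (1 + math.exp(-z))  # logistic squashing to a probability

    user = {"likes_nachos": 1, "is_raining": 1, "visited_guitar_center": 1}
    print(f"P(buys guitar) = {p_buys_guitar(user):.2f}")  # high: show the guitar ad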
01:15:52.260
You know, what's scary is how they can control the stock market. They can control, which they
01:15:57.700
Well, AI is most, most of the stock market now is AI.
01:16:01.600
Right, right, right. But think of it this way. If they really wanted to, Twitter, Google,
01:16:06.340
Facebook, whatever, combat global warming or whatever, they could snap their fingers and
01:16:10.740
you'd only see news, you'd only see comments, you'd only see opinions. They could easily
01:16:16.140
restrict or limit speech. They're doing it. On Twitter, you know, people are shadow banned.
01:16:21.000
On YouTube, I'm sorry, I always mix them up. Google, you type in a search term, it won't appear.
01:16:26.880
Right? So like one common thing right now is you go on Google, type in Ilhan Omar marriage,
01:16:31.260
nothing comes up. You go on DuckDuckGo, marriage fraud pops up first thing, because that's the
01:16:35.940
question people are asking. Google is manipulating what people can see and shaping our behaviors.
01:16:41.900
They call it "The Good Censor." That's the document that was leaked and presented by Ted Cruz in
01:16:46.440
Congress. They know what they're doing. And what's scary is the example of, you know, Hitler and the
01:16:52.140
Hulk. They think they're guiding you in a direction that makes sense. They don't know.
01:17:01.120
You know, it's really funny. People always say like, you know, YouTube's coming to get me.
01:17:05.300
I'm next. I'm going to get censored. And I'm like, listen, you got to understand YouTube.
01:17:09.540
YouTube, they could get rid of me at any moment. They've definitely de-ranked independent commentary,
01:17:15.400
but my channel is doing better than ever. So if their intention is to get rid of my voice,
01:17:19.900
I don't think so. Actually, I have a theory on this, which is kind of a sidestep.
01:17:23.720
The media calls me right wing. I'm clearly not; I'm, you know, centrist at the very least.
01:17:28.780
But a lot of, you know, I'm supporting Tulsi Gabbard and Andrew Yang.
01:17:31.280
But you know what they want to happen? They want the wheel to shift over.
01:17:36.160
They want me to be conservative so that you, Glenn, are a fringe far-right extremist.
01:17:41.200
How do they shift the full opinions of everybody? How do they control that behavior?
01:17:45.260
They start telling everybody slowly Tim Pool's right wing. They prop up my YouTube channel
01:17:48.940
and then call it right wing. So it's the only thing people can see.
01:17:51.920
And then people get this view of a moderate liberal as conservative and it shifts the whole
01:17:55.820
wheel to the left. Wow. I'm not saying it's on purpose. It's an Overton window move.
01:18:00.240
Right. So right now I look at, like, Tucker Carlson. You know, he's conservative.
01:18:05.060
I'm definitely to the left of him and Ben Shapiro.
01:18:08.080
But Ben Shapiro won't use the proper pronouns for a trans person.
01:18:16.760
They'll now ban Ben Shapiro eventually. He's high profile, so they have to wait this one out.
01:18:20.520
But they're banning all the lower-tiered individuals who have less followers,
01:18:24.580
who have no protection and no chance of rectifying this for misgendering.
01:18:27.880
And they say, but you broke the rules of a private platform.
01:18:30.240
Once they're all gone, then they say, but Tim Pool is a conservative.
01:18:33.920
Now there's nothing to the right of me but the extremists who are banned.
01:18:37.240
And then the whole wheel shifts over one degree.
01:18:40.100
So why hasn't anyone done anything to at least build another platform?
01:18:56.180
And I think it requires people to use them, right?
01:19:01.020
So I've, I used to have the Twitter logo in my video thumbnails on YouTube because I want people to follow me.
01:19:14.580
It's kind of like Facebook, but it's decentralized, encrypted, privacy, free speech.
01:19:20.480
If you post egregiously offensive things that are like not safe for work, you get a not safe for work tag,
01:19:26.320
which means people have to turn the filter off to see it.
01:19:29.800
So they've got, I think, a couple million users.
01:19:32.640
I actually, I'm working with the CEO on a site, on a separate project, a news venture.
01:19:40.400
You've got BitChute, which is video hosting, which is doing pretty well.
01:19:44.780
And then you've, which is kind of like a YouTube alternative.
01:19:47.300
You've got Gab and Parler, which are Twitter alternatives.
01:19:52.600
And if, if Donald Trump right now signed up for Minds.com and then posted something related to Iran,
01:20:01.880
And that would, the market would just overnight, Twitter would lose a massive portion of its share.
01:20:08.480
But they would also immediately be deemed by, by media and social media as a far right extreme platform.
01:20:18.320
And I think, you know, I, I, I, I'm not going to claim these people are doing it on purpose,
01:20:23.800
but I am, it is disconcerting that you have these journalists acting in defense of massive,
01:20:28.620
unaccountable corporations smearing small businesses, decently small businesses.
01:20:34.660
There's another network called the Fediverse, which stands for Federated Universe.
01:20:44.020
Gab did something really brilliant and they overhauled their code and took the open source
01:20:48.680
code from a, from a group, uh, I don't want to call it a company, but an organization called Mastodon.
01:20:55.940
They even banned Wil Wheaton for being a transphobe.
01:21:05.920
Gab federates, meaning any browser that accesses this open source network can get you to Gab.
01:21:11.100
Can Google ban a browser because you can go to a bad website?
01:21:19.180
So this is, this is a step in, in, uh, you know, the right direction, whether you like Gab or not.
01:21:24.420
I'm not talking about the history of Gab, just the idea of unbannable networks.
01:21:28.640
What's really interesting about this Fediverse option is that you can make your own social media
01:21:36.040
And you can tell people, follow me on the Fediverse, glenn at glennbeck.com, like email
01:21:41.800
No one can ban you because it's your server hosting your speech.
01:21:45.480
The only thing that can take you down is a court order.
01:21:47.580
So this is, this is an alternative to these massive networks and censorship.
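For the curious, the email-style handle resolution Tim mentions is a real, open standard: WebFinger (RFC 7033) maps a handle like glenn@glennbeck.com to an actor document on whatever server hosts it, which is why no central company can ban the address. A minimal Python sketch, assuming the domain actually runs a federated (e.g., ActivityPub) server, which in this example is hypothetical:

    from typing import Optional
    import requests

    def resolve_handle(handle: str) -> Optional[str]:
        # WebFinger lookup: ask the user's own domain who they are.
        user, domain = handle.split("@")
        resp = requests.get(
            f"https://{domain}/.well-known/webfinger",
            params={"resource": f"acct:{user}@{domain}"},
            timeout=10,
        )
        resp.raise_for_status()
        # The "self" link points at the actor document on the home server;
        # only that server (or a court order) controls it.
        for link in resp.json().get("links", []):
            if link.get("rel") == "self":
                return link.get("href")
        return None

    # Works for any federated handle, e.g. resolve_handle("glenn@glennbeck.com"),
    # provided that domain really hosts a Fediverse server.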
01:21:51.440
What it's going to take though, high profile individuals using them.
01:21:54.620
So I've been promoting and using minds because it's like, I hate Facebook.
01:22:00.020
But you know, the easiest path to changing it is if Trump right now, if Trump made a
01:22:06.560
post on any one of these platforms, be it Gab, Minds, BitChute, whatever, the media would
01:22:10.540
have no choice but to show it front and center on every channel, on every website.
01:22:14.680
And then, you know, you know, Twitter was bleeding users for a while.
01:22:17.360
And then Trump came in, helped prop it back up because now there was a reason to be on
01:22:22.540
They're still kind of losing users, but until high profile people say, I'm going to use
01:22:27.140
something else, well, they control everything, you know?
01:22:30.020
I want to switch gears and keep it focused on what we're facing in this election right now.
01:22:47.900
For instance, I've been warning the audience of deep fakes for a long time.
01:22:54.420
Once, it's like people said in Germany, you know, in 1925, nobody knew what hyperinflation was.
01:23:06.080
By 29, that's all anyone was talking about, you know?
01:23:11.500
And I think it's going to be that way with the first deep fake, the first real deep fake.
01:23:16.620
I kind of, I kind of think deep fakes are going to be a really good thing for us.
01:23:20.820
And I, I, I, you know, sometimes the solution to problems are counterintuitive.
01:23:30.880
But I'm talking about the first real critical moment deep fake that everybody thinks is
01:23:38.460
real and could change an election or could start a war.
01:23:47.980
Remember that, you know, Covington, that wasn't, that was real footage.
01:23:52.020
And so because it was real footage, there was no fact checking to determine whether or not it happened.
01:23:58.840
Now, what happens when a video comes out of Trump saying something nightmarish and dangerous?
01:24:03.180
And then in three or four days, it turns out to be completely fake.
01:24:06.560
We need something to get, I guess, I don't know if journalists will ever get back on track on
01:24:12.560
But we need to get to a point where people are less trusting of random videos they see
01:24:16.940
So once we get that first big deep fake and the media says, oh my, that's, that was fake?
01:24:21.840
And people might take a step back for every other video following that.
01:24:25.000
So we need, we need something to say legitimately stop believing everything you see.
01:24:30.800
Because, you know, you can take a video and it's so easy to strip out not just the context
01:24:35.460
of the sentence, but the cultural context surrounding the conversation.
01:24:39.480
So it's amazing what DARPA is trying to do right now.
01:24:45.540
DARPA is, I'm sure you know more than I do, working on deep fakes because they want to
01:24:50.800
be able to have this algorithm that can spot them and then mark them as deep fakes.
01:25:00.760
I don't need, we don't need a little "approved by" marking.
01:25:05.260
We need to have people go back and say, wait a minute, does that make sense?
01:25:22.040
You've been taken out of context for over a decade.
01:25:31.640
What's really interesting about context too, is it's not just the paragraph you said, right?
01:25:36.680
Like if you were to say something like, you know, George Washington is quoted as saying
01:25:40.840
in a sentence, they can take that sentence and then say, you said it.
01:25:45.260
So in one instance, there was a particular news story the New York Times put out, specifically
01:25:51.400
And then there was commentary on it from an internet figure.
01:25:54.280
I said, I don't trust the New York Times, you know, but I do trust so-and-so.
01:25:59.580
The cultural context was that that week, a story came out that was, you know, you know,
01:26:06.320
And I wasn't saying I just don't trust the New York Times, period.
01:26:09.300
It was in that week we were talking about one issue.
01:26:11.160
And so it's not even, you know, so it's, it's true in that instance, I didn't trust
01:26:14.980
them, but people will take that and they use it.
01:26:19.820
Our society is extremely sensitive to whatever pops up on the internet at whatever moment.
01:26:25.160
You know, there's a story right now of Macy's canceling plates because one person complained.
01:26:29.880
Our businesses need to get to the point, corporations, marketing, and people to where they just say,
01:26:36.840
So that's what I, where I kind of wanted to go.
01:26:39.120
I was reading a book and I can't remember what it was.
01:26:41.720
I read it over the summer and it, it had a, it was about how the internet, how some people
01:26:50.660
not coordinated, just took someone by association and said, that person's bad.
01:27:01.140
So this person was being destroyed and just a regular citizen.
01:27:04.960
So somebody in the tech world decided the way to fight this is to fight it with a fire hose.
01:27:12.700
Just say everything about this person from the believable to the absolutely outrageous and
01:27:30.820
Um, and, and in some ways I kind of feel like that's where we're headed.
01:27:34.640
Not necessarily that somebody is doing it, but because everybody's a racist, everybody's
01:27:41.420
It's going to come to a place to where it's a little like the boy who cried wolf.
01:27:48.420
But you know what, what, what's worrying is I already feel that way as somebody who watches
01:27:53.660
a wide, you know, swath of different news from left to right.
01:27:56.980
But there are people on Twitter who, who purposefully block dissenting opinion and only follow people
01:28:02.960
It's, this is a uniquely left thing to do.
01:28:05.720
And so, you know, you'll see something like there's an interview I did with some alt-right figure.
01:28:14.480
I've done interviews with a former Soviet general with refugees, but you take one photograph
01:28:19.160
and then when you isolate yourself, these people lose their minds.
01:28:22.880
And so it's one of the reasons why I can't go on the ground and cover things the way I
01:28:27.220
And now I'm doing, you know, more political commentary and we're working on like documentary
01:28:33.320
These people believe insane things about me because they're in isolated groups and you'll
01:28:37.920
have one or two people who know they're lying, but they don't care.
01:28:40.320
The ends justify the means by any means necessary.
01:28:47.120
And it was, it's, it's kind of crazy, you know, in 2011, the first time I got attacked,
01:28:52.720
this may have been early 2012, by, uh, you know, black bloc anarchist types. MSNBC had
01:28:57.760
me on and they said, wow, how could this have happened? Today,
01:29:02.380
They're, they're tacitly supporting it, ignoring it at the very least.
01:29:06.320
And we're seeing high profile individuals prop them up and praise them.
01:29:09.080
Um, we just had, uh, The Guardian did a piece, I believe recently, on the John Brown Gun Club.
01:29:14.540
That guy who attacked the ICE facility was a member of that, of that Antifa organization.
01:29:20.220
I mean, CNN even, uh, uh, it was, um, I can't remember the guy's name, but he has a show
01:29:25.500
on CNN and he was saying you can donate to them.
01:29:27.580
I could, I could be wrong about that, but this was before the attack.
01:29:30.760
At the very least, I'm surprised that, could you imagine, you know, you're familiar with the Oath Keepers?
01:29:35.840
Are you, could you imagine if CNN was like, everybody donate to the Oath Keepers?
01:29:41.380
But they're, but they do it with Antifa, you know, and how many, how many pieces did we
01:29:45.660
There was a CNN piece that said Antifa seeks peace through violence.
01:29:49.960
Ah, that's a, that was, they changed the headline eventually, but they, they, they, you know
01:29:56.980
They disagree with whatever he says, no matter what he says, even if they're going to be
01:30:00.900
wrong about it in the end, just because that's what they, I, for whatever reason, that's what
01:30:07.940
What do you quickly, cause I've got very little time.
01:30:13.720
What do you think of, uh, what's happening on the ground?
01:30:21.320
Normal people just waking up and going, I don't want any of this, or are they polarizing and
01:30:27.620
I think a lot of people who are not conservative are moving to join conservatives.
01:30:33.580
Actually, Vox.com said in the Andy Ngo scenario, the narrative was divided by left and the center
01:30:42.060
Conservatives and centrists and like moderates are pretty much still in agreement, but the
01:30:45.640
left has veered off in a ridiculous path and the data shows it.
01:30:48.600
It's the New York Times, it's Gallup, Pew, uh, Axios, Quartz.
01:30:56.980
And, and when media chases after them, they're, they're going nuts too.
01:31:00.160
But are those people who were kind of on the moderate left who are Democrats?
01:31:04.920
Are they peeling off or are they going with them?
01:31:10.140
You know, my friends who, you know, were Bernie supporters in 2016 say count me out.
01:31:15.260
And then you have a lot of people who had voted for Obama.
01:31:20.200
They're saying they're, they're, they're with Trump now.
01:31:48.060
Um, I mean, there's a lot to criticize him for, I guess, but I think he's hilarious.
01:31:53.580
He's kind of like, he's like a real version of Tony Stark.
01:31:57.180
That's living the life that a billionaire, you, if you were a billionaire, you'd like
01:32:03.040
And jokingly, Hey, Elon Musk, why haven't you built yourself an Iron Man suit yet?
01:32:11.560
And it got, it went like decently viral and people were really excited that he would, you know.
01:32:18.300
So, um, uh, he was talking about, uh, oh shoot, I can't remember, Neuralink.
01:32:33.640
I've got, I've always got the catastrophe side for you.
01:32:41.560
Um, my daughter is currently, she's gone through a year of testing, uh, to be able
01:32:47.200
to have the implants put into her brain, uh, Neuralink, uh, not for Neuralink.
01:32:52.760
I want to call Elon or write to Elon and say, uh, Hey, can, can she test?
01:33:00.060
Um, she's got the, uh, current technology, which his is going to be 10,000 times more powerful than.
01:33:22.000
Man, if you like VR, imagine if you could actually just link your brain to a computer and then
01:33:29.140
Now The Matrix is a scary movie, but what I mean is like, imagine that fake reality of
01:33:33.860
I'm sure people would love to come home from work and just feel powerful for a few hours.
01:33:39.160
In reality, one of the hopeful things is that by being connected to the net, we won't be left behind.
01:33:45.900
Our, our, our brain computation will be in sync, but here's the, here's the more nightmare scenario.
01:33:51.200
The first one is I go back to the Hitler dancing with Hulk thing.
01:33:55.100
Just because we hook our brains into that, who knows what our, imagine you think you're
01:33:59.160
going to, you're going to go into a VR world to be Superman.
01:34:01.240
And instead it's a nightmare trip of like talking, you know, hammers exploding and just weird stuff.
01:34:12.540
So I could be getting my, my lore wrong, but my understanding is the Borg started out like
01:34:16.820
humans, integrated themselves into technology and eventually it formed a hive mind where individuality was lost.
01:34:23.540
That's exactly how I described it to a friend over the weekend.
01:34:33.920
Twitter, you know, um, we're not integrated at the speed of light, the speed of electricity,
01:34:39.600
but we open up our phones and we look at that Twitter feed and we're connected to that network
01:34:44.220
immediately to see their thoughts and opinions.
01:34:46.800
What happens when it's all instant to our brains and we can hear everything.
01:34:49.720
And when it's two directions, I mean, you want to talk about algorithms just nudging you.
01:34:56.820
I mean, if you read Cass Sunstein's work on Nudge.
01:35:03.380
It, you know, it sounds bad too, but I could maybe envision it being a good thing in some
01:35:08.760
You know, a lot of great, we'd be able to speak.
01:35:10.580
I mean, he, he talks about being able to affect, uh, language as far as, I mean, I could
01:35:16.580
go to, I could go visit, you know, Russia and speak Russian and understand it.
01:35:20.940
And it would, and it would somehow or another translate to English. They'd be speaking Russian, but you'd hear English.
01:35:27.500
But I also think, you know, one of the biggest problems in politics is the inability to understand
01:35:33.400
What if we did, what if you were like, that's what you think?
01:35:36.840
And then they really, and so I'll, I'll put it in the, um, I guess like the beneficial way.
01:35:44.180
And all of a sudden, one day their eyes were opened to the, to everything you knew.
01:35:48.680
Let me use your, let me use your Star Trek analogy.
01:35:52.600
Did you ever see the Vulcan mind meld work out poorly?
01:35:57.640
I mean, when he, yeah, I mean, he was, he was always like Vulcan mind meld and he would
01:36:08.380
In, in reality, I think, you know, left, right, up, down, religious, areligious, whatever
01:36:13.280
will, will really understand the perspectives of people and everything.
01:36:18.540
And, and, you know, uh, when it comes to human speech, look, I've got 15 billion ideas
01:36:23.880
in my head and I can bring them out in this tiny little string.
01:36:27.680
So, you know, it's, it's, I can tell a liberal, think of all those dumb conservatives
01:36:32.560
and all of a sudden they know everything, you know, that sounds great, doesn't it?
01:36:35.240
Cause then they'll, then, then they'll know you're right.
01:36:37.560
And you can say the same to conservatives in reality, everybody will learn from everybody
01:36:42.480
Although admittedly, it's like, it's the far left and the fringe wackos.
01:36:46.400
Isn't it the same thing though with, um, I mean, I don't know where you stand on, uh,
01:36:55.480
Isn't this, isn't this Libra for the brain? Who's controlling it?
01:36:59.880
You know, if I, if, you know, Stephen Hawking was so misunderstood at the end of his life,
01:37:08.340
By 2050, there's a good chance that homo sapiens don't exist.
01:37:15.600
It means what we define as a Homo sapiens today does not exist because of stuff like this.
01:37:21.380
But if, if they can control, if they're controlling speech, Libra can say, oh, well, you're not,
01:37:41.780
That's why I think Neuralink is fun for video games and for you choosing what you connect to.
01:37:47.540
But this like constantly on system, it's kind of scary.
01:37:54.300
MasterCard, depending on your source, there's a, it's disputed a bit between Patreon and MasterCard,
01:38:01.240
They said, you have to, you have to ban this man, Robert Spencer.
01:38:09.700
And Patreon said, I'm sorry, MasterCard forced us to do it.
01:38:13.300
MasterCard, I think denied it, but so what happens when banking institutions can tell
01:38:21.300
One of the things they're proposing with Libra is speeding up transactions by using the blockchain.
01:38:25.500
So MasterCard, when MasterCard wants to send money from the United States, from their bank,
01:38:29.960
they transfer it through all these different, you know, systems and there's like regulation
01:38:37.400
What happens when Facebook bans you for being an extremist?
01:38:40.020
And then you take your MasterCard and you swipe it at a gas station and it says denied
01:38:44.980
They say, well, we use Libra to do exchanges and you're banned from Libra.
01:38:52.880
Unfortunately, Libra is our intermediary and you're banned, so you can't use this card
01:38:57.200
Now Facebook is in between everything you do and can choose to shut you down.
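A toy model of the failure mode being described: once a card network settles through a third-party ledger, it silently inherits that ledger's ban list. Every class and name below is invented for illustration; it is not how Libra or MasterCard were actually specified.

    class Ledger:
        """Hypothetical Libra-like settlement layer with its own ban list."""
        def __init__(self, banned: set):
            self.banned = banned

        def settle(self, user: str) -> bool:
            return user not in self.banned

    class CardNetwork:
        def __init__(self, intermediary: Ledger):
            self.intermediary = intermediary

        def authorize(self, user: str) -> str:
            # The merchant and card network may have no problem with you;
            # the swipe still dies if the intermediary refuses to settle.
            if not self.intermediary.settle(user):
                return "DENIED (banned by intermediary ledger)"
            return "APPROVED"

    libra_like = Ledger(banned={"deplatformed_user"})
    card = CardNetwork(intermediary=libra_like)
    print(card.authorize("deplatformed_user"))  # the gas station swipe: denied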
01:39:02.600
You're seeing this, you're seeing this already being pushed by state governments with Cuomo
01:39:08.980
saying to the financial center of the world saying, hey, by the way, we have to send in
01:39:17.380
our state regulators and we're not telling you how to do business, but we think some of
01:39:24.080
these guys who are making guns or selling guns are kind of suspicious.
01:39:28.560
So it will probably extend your examination every year.
01:39:35.300
And so there he's saying, but you can avoid all that if you just don't do business with them.
01:39:45.980
But again, without the government's involvement.
01:39:51.560
This is what I find really scary about big tech and especially whatever, you know, what's
01:39:57.560
really worrying to me are these people who defend Google and Twitter and Facebook.
01:40:01.180
I'm like, Elizabeth Warren is coming after them.
01:40:07.680
But, but even I worry about that because of the history with FDR.
01:40:18.860
So let's go to Ford and GM and Chrysler and ask them how we should regulate.
01:40:27.480
Well, they put all of the small people out of business.
01:40:29.940
It bothers me that Zuckerberg is saying, hey, we need regulation.
01:40:40.320
One of the more terrifying things if we don't have A.I. is China.
01:40:47.720
I think it was Sam Harris's podcast where someone said something to the effect of: the moment that
01:40:54.120
A.I. defeated, I think, Kasparov or whatever, the greatest chess player.
01:40:57.980
There was, there will never again be a chess player who can beat our best A.I., period.
01:41:01.140
What happens if China beats us to the race and then declares war?
01:41:08.580
Win before declaring war, or something to that effect?
01:41:10.960
So if they have the A.I., they know our output.
01:41:13.940
They know the known knowns, the unknown unknowns, et cetera.
01:41:17.380
And then the A.I. says you have an 87 percent chance of victory.
01:41:21.820
And the U.S. can't counter that artificial intelligence.
01:41:26.780
Are you, do you believe that we will get to A.G.I.?
01:41:33.600
OK, so A.I. is artificial intelligence, which is what we have.
01:41:36.860
What you're talking about is at least artificial general intelligence where, you know, Kasparov
01:41:48.020
So A.G.I. is something that can look at all of it.
01:41:57.820
You know, one of the things that's really interesting I tell people is what makes humans
01:42:01.560
better than computers is that when you need to screw something into a piece of wood,
01:42:05.860
a screw, a screwdriver, we take it, we say the pointy end goes into the hole and then we turn it.
01:42:14.300
But, but computers, when they do this, have to calculate literally every step.
01:42:18.200
So when the A.I. defeats Kasparov, it literally calculates all of
01:42:23.560
these different outcomes, wasting time and energy.
01:42:26.680
If it had general intelligence, which I believe could happen in the right spaces, it could make
01:42:31.300
more assumptions and skip off the wasted time and energy and move faster and more quickly
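A rough back-of-the-envelope sketch of that "wasted time and energy" point. The figures are illustrative assumptions, not measured values: roughly 35 legal moves in an average chess position is a commonly cited ballpark, and a strong human seriously weighs only a couple of candidate moves per turn:

```python
# Illustrative arithmetic only: an exhaustive game-tree search examines
# branching**depth positions, while a player who "makes assumptions"
# considers only a few candidate moves per turn.

BRUTE_FORCE_BRANCHING = 35   # rough average legal moves per chess position
HUMAN_CANDIDATES = 2         # a strong human seriously weighs ~2-3 moves
DEPTH = 10                   # plies searched ahead

brute_force_nodes = BRUTE_FORCE_BRANCHING ** DEPTH
pruned_nodes = HUMAN_CANDIDATES ** DEPTH

print(f"exhaustive search:  {brute_force_nodes:,} positions")
print(f"assumption-driven:  {pruned_nodes:,} positions")
print(f"wasted work factor: {brute_force_nodes // pruned_nodes:,}x")
```

Even at this toy depth the exhaustive search visits quadrillions of positions against about a thousand for the assumption-driven one, which is the gap being described.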
01:42:37.160
So, well, when you add quantum computing, oh, then it's everything at once.
01:42:51.960
It's similar, in a way, to like Dungeons and Dragons.
01:42:54.640
A.I. can't win because it's just got so many variables that the computer has to calculate
01:43:01.620
way too much, whereas a human with general intelligence ignores the wasted time and energy.
01:43:12.580
You can only have four of each individual card.
01:43:14.360
So the human thinks I'm playing against this opponent in a professional game.
01:43:17.820
I know the likelihood they'll use a certain card.
01:43:20.240
Why would they use this really bad card in their deck?
01:43:24.920
It's very simple then for the human to go up against the human.
01:43:27.440
The computer has to calculate 13,000 different cards because it doesn't understand if someone
01:43:33.000
would or wouldn't use one, because it doesn't have the general understanding, the context.
01:43:41.020
So, for the time being, it's one of the most popular card games in the world, I think.
01:43:47.420
If a computer can't win that game, I'm not super concerned about global warfare, because...
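A quick sanity check on that combinatorial explosion, taking the speaker's rough figures (a pool of about 13,000 cards, a 60-card deck) as given. Even counting only decks with no repeated cards, which is a strict lower bound since the real rules also allow up to four copies of a card, the space is astronomical:

```python
# Back-of-the-envelope illustration using the speaker's figures, not
# exact tournament numbers: without human-style priors about which decks
# an opponent would plausibly run, brute-force calculation over "what
# could they be holding" is hopeless.
import math

CARD_POOL = 13_000   # distinct cards (speaker's figure)
DECK_SIZE = 60       # a common constructed-deck size

# Decks built from 60 *distinct* cards only -- a strict lower bound on
# the number of legal decks.
lower_bound = math.comb(CARD_POOL, DECK_SIZE)
print(f"at least {lower_bound:.3e} possible decks")
```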
01:43:53.020
Well, they've won Go, which is a Chinese game that they were...
01:44:02.920
When the Go masters in China did lose to the...
01:44:09.000
They thought that it was cheating because it was playing in a way that no human had ever played.
01:44:25.240
We don't know what it's going to do because it's not going to think like us.
01:44:30.340
A lot of people assume, not the experts, obviously, but they assume a Terminator scenario.
01:44:39.100
You know what would happen before the Terminator?
01:44:44.080
What would happen before the Matrix is a bunch of happy humans gleefully serving their
01:44:48.080
robot overlords, thinking ignorance is bliss, like, not realizing it.
01:44:52.480
You know, Olgotron, the giant robot overlord, loves us and we please him, not realizing they've
01:44:58.560
been manipulated, our behaviors, you know, at this...
01:45:03.740
Wouldn't it be great just to bow to your giant robot AI and not be aware at all that...
01:45:19.860
In the 1990s, I was saying, there's no freaking way I'll ever give my fingerprints to the government.
01:45:30.700
And in 2000, I was talking about facial recognition saying, this is really dangerous down the road.
01:45:39.820
Weren't social security numbers like shocking to people when they came out?
01:45:48.260
You know, we're walking down that path towards the Borg, I guess.
01:45:53.200
Unless we blow ourselves up or something before that.
01:45:55.520
I actually think, you know, World War III type scenarios of, like, international nuclear war
01:46:00.180
are ridiculously unlikely due to the decaying nature of borders.
01:46:05.580
So what's really interesting about the idea of borders today is, you know, you mentioned
01:46:08.600
how they said there won't be any, you know, it's really...
01:46:10.980
Listen, 20 years ago, 30 years ago, if you went to Canada
01:46:15.340
and did something to make money, you were in violation of your tourist visa.
01:46:25.140
Americans, you know, are the ones paying me, not Canadians.
01:46:27.660
So I can buy a house in whatever country I want and do the exact same work legally.
01:46:34.360
And it's really, really weird, because it gets even weirder.
01:46:37.460
I have people in the UK who donate to me through like, you know, PayPal or whatever.
01:46:43.840
If I go to the UK and make a video about delicious, you know, UK beer or something,
01:46:49.780
and then I receive a donation from a British person, did I just violate a tourist visa?
01:46:54.080
A British citizen paid me in exchange for producing content.
01:46:56.960
But they paid an American company, which then relayed the money to me, right?
01:47:02.060
Things are starting to break down in terms of...
01:47:05.460
If we move off of a production-based economy for the most part,
01:47:09.060
anybody who does a job that involves information can live wherever they want and work.
01:47:12.900
You've got American companies that hire developers like digital coders and stuff in Romania.
01:47:31.560
Where that's supposed to go and how it works, right?
01:47:37.400
Re-hypothecation is my signal to head to the mountains.
01:47:43.380
Whenever you read the word re-hypothecation in a news story, run for your life.
01:47:51.080
There's not enough gold for all the money that everybody has.
01:48:03.140
So now, when Germany demanded their money, their gold, back from the Fed, it took them five years.
01:48:12.120
And the Fed was like, we're not going to do this for everyone.
01:48:15.240
Right, because there's not enough gold for everybody.
01:48:27.000
When you have the bundles, the tranches, if you will, of CDOs and mortgages and everything
01:48:33.040
else, I have a mortgage with Chase Manhattan, but Chase Manhattan has sold it to somebody
01:48:39.820
else, yet they're keeping that on their balance sheet as an asset as well.
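A toy model of the re-hypothecation problem being described, with made-up numbers: the same vaulted asset gets pledged or booked again at every link of a chain, so total claims on it exceed what actually exists:

```python
# Toy model with hypothetical figures: each holder in the chain books a
# claim on the same underlying asset. Fine in normal times -- a shortfall
# only appears when more than one link tries to redeem at once.

UNDERLYING_GOLD = 100.0      # tons actually in the vault (made-up figure)
CHAIN = ["Fed", "Bank A", "Bank B", "Hedge Fund"]  # each re-pledges it

claims = {holder: UNDERLYING_GOLD for holder in CHAIN}
total_claims = sum(claims.values())

print(f"gold in vault:           {UNDERLYING_GOLD} tons")
print(f"claims on it:            {total_claims} tons across {len(CHAIN)} holders")
print(f"shortfall if all redeem: {total_claims - UNDERLYING_GOLD} tons")
```

That redemption shortfall is the "run for your life" scenario: the system works only as long as nobody asks for the underlying asset back at the same time.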
01:49:06.000
I think, like, it's an ever-exploitable system where people are going to figure out how to exploit it.
01:49:14.520
And, um, this is probably derailing, but it reminds me of what these
01:49:20.920
media companies were doing with what's called a traffic assignment, where, without naming any
01:49:26.620
companies to avoid litigation, some very prominent, high-profile media companies would
01:49:31.740
buy the assignment of viewership from a clickbait website.
01:49:36.820
So you had, you know, oh my God, there was one that was called, like, luckyfarmer.com.
01:49:43.540
And they would put up those articles where it's like, 25 celebrities who...
01:49:48.800
And every time you want to see a new photo, it's a new page.
01:49:53.400
Then they sell the assignment to a major network, who then claims, we got a billion views this...
01:49:58.700
And they go to advertisers and say, we got a billion views.
01:50:00.800
Don't you want to buy ads with us when they only had a million?
01:50:03.360
So the media economy was being built on fake numbers and manipulation.
01:50:09.520
And, you know, I was reminded of this just because when you're talking about all these different
01:50:13.260
banks owning different things, what I see are different companies trying to figure out how they
01:50:16.420
can sell something again: is this valuable to you?
01:50:19.220
So the clickbait website already made money off that view.
01:50:22.740
Now they can claim the view is part of a network.
01:50:25.940
And now the bigger company can say our network got X views and your ads will sell against those.
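The arithmetic of that scheme, sketched with hypothetical numbers chosen only to match the "a billion views when they only had a million" example: the slideshow multiplies pageviews per visitor, and the purchased assignment pads the network's headline total pitched to advertisers:

```python
# Hypothetical figures for the view-inflation scheme described above.

ORGANIC_VISITORS = 1_000_000            # real audience ("only had a million")
SLIDESHOW_PAGES = 25                    # one photo per page = 25 views per visitor
ASSIGNED_CLICKBAIT_VIEWS = 975_000_000  # bought from a luckyfarmer-style site

organic_views = ORGANIC_VISITORS * SLIDESHOW_PAGES
reported_views = organic_views + ASSIGNED_CLICKBAIT_VIEWS

print(f"real visitors:  {ORGANIC_VISITORS:,}")
print(f"reported views: {reported_views:,}")   # "we got a billion views"
print(f"inflation:      {reported_views // ORGANIC_VISITORS:,}x per real visitor")
```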
01:50:34.040
And that's another reason why these companies kind of started to fall apart, because eventually,
01:50:37.740
if someone pulls the pin on that one bank, then he says, okay, we've got to ask the next bank.
01:50:43.960
And then eventually it comes back to you, you know, and the few...
01:50:46.420
All right, real quick, we have already run a longer podcast.
01:50:57.460
Let me just do some real quick rapid fire: 2030, more free or less free, Americans?
01:51:25.360
Um, I think we're going to see the press becoming more and more independent.
01:51:31.220
So I think it'll be not better or worse, just very different.
01:51:48.960
He's done, he's done. His character, his foreign policy.
01:51:53.000
But if you think there's any charismatic Democrat on that stage today who's going to beat
01:51:56.900
Trump, I'll take that bet any day of the week.
01:52:11.220
Um, it's little things really, you know, I'm...
01:52:17.080
So freedom of speech, the loss of free speech is not something that keeps you awake at night.
01:52:21.640
But it's constantly there saying, how can we figure this out?
01:52:25.700
Well, I guess, you know, nothing keeps me awake at night, because I'm very anarchistic.
01:52:33.640
I am ready to go live in the mountains with a dog and go hunting for food.
01:52:36.660
So it's like, yeah, you want to tear down your society, you know?
01:52:39.460
But I do think big tech is the most pressing issue right now.
01:52:44.660
Um, thing that you're most excited about with tech, thing that you think nobody is really...
01:52:52.320
I'd love to fly around as Superman, you know, punching bad guys and shooting lasers.
01:52:56.520
Wouldn't that sound like fun, a good relaxing...
01:52:58.660
And you know, there's a lot of other things it can do, like downloading, like in
01:53:01.620
The Matrix, downloading kung fu, becoming a master at, like, think about, you know, I want...
01:53:08.200
So you just plug in and download tennis and all of a sudden you're a champion.
01:53:12.440
You wouldn't be a champion because everybody would be good, you'd be good, right?
01:53:16.620
Everybody, you wouldn't be that good because your muscles would still need to be, you know...
01:53:28.600
Just a reminder, I'd love you to rate and subscribe to the podcast and pass this on to a friend.