Ep 120 | Who Is More Evil: Government or Facebook? | Robby Soave | The Glenn Beck Podcast
Episode Stats
Words per Minute
174.56
Summary
Robby Soave is a writer and editor at Reason Magazine and a frequent contributor to the New York Times. He's also the author of a new book, "Tech Panic: Why We Shouldn't Fear Facebook and the Future," and a regular contributor to Rolling Stone.
Transcript
00:00:00.000
In 1997, a man on a giant stage ripped an article from a magazine, shoved it into a blender,
00:00:06.480
put some clear liquid in it, hit blend, and then drank the cloudy, pulpy substance.
00:00:11.520
The occasion was the sixth International World Wide Web Conference. The man was that year's
00:00:18.320
keynote speaker, Robert Metcalfe. In the 1970s, Metcalfe invented Ethernet and founded 3Com.
00:00:26.860
He helped create the Internet. So why was he chugging mulch in front of an audience?
00:00:33.440
Because two years earlier, he had written a column that included the sentence,
00:00:38.360
"I predict the Internet will soon go spectacularly supernova and in 1996 catastrophically collapse."
00:00:47.080
He ate his words. He was so certain that he was right, that he was willing to eat his words,
00:00:54.340
and he did. History is full of horrible predictions about technology, and not from idiots either,
00:01:01.000
from experts. In 1876, an inventor offered to sell the patent for one of his inventions to
00:01:07.160
William Orton, president of Western Union. Orton was outraged. In a memo, he wrote that the invention
00:01:15.580
had too many shortcomings to be seriously considered as any means of communication.
00:01:21.040
The inventor was Alexander Graham Bell, and the invention was the telephone. In 1927,
00:01:28.100
H.M. Warner, co-founder of Warner Brothers, said,
00:01:31.440
Who the hell wants to hear actors talk? Actually, that one I still agree with.
00:01:36.920
In 1966, Time Magazine predicted that online shopping would flop because women like to get
00:01:43.640
out of the house. They like to handle the merchandise. They like to be able to change their minds.
00:01:49.080
In 1936, the New York Times predicted a rocket will never be able to leave the Earth's atmosphere.
00:01:57.700
In 1995, an article in Newsweek predicted that no online database will replace your daily newspaper.
00:02:05.660
You can find that article on Newsweek's online database if they haven't gone out of business yet.
00:02:11.900
Predicting the future is a tough game. Even the most accurate guesses have giant errors and blind spots.
00:02:18.920
It's why fortune tellers keep things airy and vague, you know.
00:02:25.140
But, if we had to guess, most of us would say that social media is a technological behemoth.
00:02:31.980
There is no way to know how it's going to play out or what turn we should make.
00:02:36.660
Will it become an obsolete relic? A ridiculous old trend?
00:02:42.960
Or will it fuse into our society until we're inseparable from it?
00:02:50.340
We don't know. And not knowing really bothers us.
00:02:57.660
Today's guest is one of the brave souls willing to make a prediction.
00:03:10.300
It's an honest and generally refreshing take on the question.
00:03:15.520
When he isn't predicting the future, he's the senior editor of Reason.
00:03:36.580
I think the first time we spoke was when you exposed the rape hoax.
00:03:54.600
Now I feel like it's the most miserable place on earth. I mean, people wear masks outside by themselves on the street under circumstances where even the notoriously risk-averse
00:04:06.180
CDC doesn't think you have to wear a mask anymore.
00:04:09.480
They flip out if you express any disdain at this.
00:04:14.220
In fact, they express disdain at you for not doing it.
00:04:18.080
I think the mask is becoming a symbol for what I call the blue tribe, the way the MAGA hat was a symbol for the red tribe, for Trump's tribe.
00:04:27.840
The mask is their version of that because it's not really about health or safety anymore.
00:04:34.540
It's a way of letting people know who you voted for. You know, the University of Chicago, I just read this morning, is not only making you get the mandated vaccination.
00:04:43.100
But you have to sign your name in agreement with several statements, and I don't necessarily agree with any of those statements.
00:05:03.260
With the college campuses, elite liberal arts campuses, the Ivy League, Columbia, Harvard, Brown, other places.
00:05:09.760
They are implementing the most authoritarian campus COVID policies of all.
00:05:16.260
And they are places where they could have less rigorous mandates, because students, young people, are generally fairly protected from COVID.
00:05:26.040
And they're telling people, you know, don't take sips of water in class because you'd have to take your mask down for too long.
00:05:33.760
They're telling students the cafeterias are being shut down.
00:05:38.800
If you must interact with another human being, please do it outside.
00:05:41.740
And please don't make any new friends because we don't want you to expand your social circle.
00:05:47.220
And we've already seen how campus culture can infect the rest of society.
00:05:55.280
I'm very worried that this crop of students who will become the elites of our society will be accustomed to policies that are, frankly, insane.
00:06:06.960
But I read something the other day from an author.
00:06:46.160
But I guess the good news is, at some point, it'll be so bad.
00:06:53.500
So it must feel weird for you to be, you know, at this time,
00:06:59.340
Saying why we shouldn't fear Facebook and the future.
00:07:06.240
I'm not sure how much we're going to agree on.
00:07:08.940
And I think that's really good to have a healthy debate.
00:07:12.920
We might end up agreeing on more than I think.
00:07:18.620
Tell me why we shouldn't fear Facebook.
00:07:24.380
So I think most of the revelations that have come out in the last few weeks with this quote
00:07:28.600
unquote whistleblower, although I'm not sure she told us anything.
00:07:33.180
And her solution seems to be more government control and more of the sort of
00:07:37.040
limits on what we see on these platforms.
00:07:44.100
But her solution seems to be exactly the solution Zuckerberg himself wants.
00:07:51.280
And I don't want to hear it from another Senator that Instagram's
00:07:56.360
addiction problem is akin to cigarettes and big tobacco.
00:08:04.000
Instagram, there are some issues and I think parents would be well advised to limit the amount
00:08:11.020
of time their kids spend on it. Now, that doesn't require a government solution. Parents, please do that.
00:08:15.780
But there are a lot of kids who benefit also from social media.
00:08:19.000
We forced them to stay home and to avoid all social contact for the last two
00:08:23.560
years. On net, I would bet their mental health is better because they had some access.
00:08:31.220
And you know, again, we can talk about how we should limit their time using it.
00:08:37.740
I bet, I bet the majority of teenage girls would tell you that just going to high school
00:08:41.620
makes them feel sad or depressed or bullied because being a kid is hard.
00:08:45.200
Oh, and especially. I have three girls and one boy; my boy's going to high school.
00:08:52.600
He has a rough time, but not like that. Girls are vicious.
00:09:01.620
You know, and Facebook, I think, or social media, only amplifies that.
00:09:09.300
But these problems existed long before social media. I mean, we were talking about the body image
00:09:13.900
problems of glossy magazines and other things.
00:09:17.920
That's probably the main thrust of my book, that everything you're saying is wrong with
00:09:22.820
social media. I think there are some legitimate issues there too, but a lot of this is also just present
00:09:27.160
in the traditional media, the media that predates social media. And if you,
00:09:31.400
you know, if you want to get into the sort of treatment of conservative speech on social media,
00:09:35.220
this is a major issue for a lot of people on the right.
00:09:37.680
And I've agreed most of the time when they take down a piece of, you know, right wing
00:09:42.040
content or conservative content, that it's a bad call.
00:09:48.320
Still, I think on net social media is probably really good for conservatives.
00:09:55.760
Better than the alternative. So the New York Times will never run an op-ed by a Republican Senator again.
00:09:59.500
That was their lesson from the Tom Cotton thing, from him expressing an opinion, probably
00:10:03.220
a majority of the country agrees with, that these riots are out of control and the federal government should step in.
00:10:08.520
That opinion was considered unsayable in the mainstream media.
00:10:12.240
That's the level of hostility to conservatives, to non-liberal thought, even to probably
00:10:16.680
far left thought, anything contrarian or that doesn't fit in the liberal bubble.
00:10:23.260
They do sometimes take down stuff they should leave up, but I see the numbers.
00:10:27.560
I see, you know, Tucker, Ben Shapiro, The Blaze, other conservative news sites
00:10:34.600
actually doing really well on Facebook, which is why the liberals want to destroy Facebook
00:10:41.180
So I think we agree on, you know, capitalism and the internet and Facebook and everything.
00:10:55.120
What you put into it is what makes it either really good or really dangerous.
00:11:02.140
The problem is, and I am libertarian-minded and I don't like regulation,
00:11:11.220
and I'm firm on no regulation, no more laws.
00:11:19.400
But I'm stuck in this place, because I've always said private companies can do whatever
00:11:28.380
they want, but this is the one place where I'm beginning to feel the founders couldn't imagine
00:11:36.580
a corporation being far more powerful than a country.
00:11:42.400
And that's fine, but now they're merging the two of them.
00:11:49.620
So I'm not sure that the solution lies with regulating Facebook as much as it does
00:11:57.380
regulating the government and say, go back into your box.
00:12:02.000
Well, yeah, that I agree with completely.
00:12:05.220
Government is the one. You know, the marriage of big tech and big government
00:12:09.200
is scary, but the one we can have some control over, at least theoretically, because of our
00:12:13.880
constitutional structure, is the government. When, you know, Facebook takes down
00:12:18.960
so-called coronavirus misinformation, the lab leak theory, you're not allowed to
00:12:23.580
discuss it on Facebook. Terrible call. But the White House is pushing them to do that.
00:12:30.540
Everyone in the mainstream media is telling them to do that.
00:12:33.080
They rely on cues from what the New York Times tells them, right?
00:12:35.840
They shouldn't do that, but they feel scared to do anything else, because right now there's
00:12:39.780
an ax hanging over their head.
00:12:42.900
Everybody on Capitol Hill wants to regulate them.
00:12:50.220
So I think it's a mistake too, because at the end of the day, the regulators,
00:12:55.100
The people who actually investigate these companies.
00:12:58.140
When has more government ever worked out for liberty?
00:13:04.760
You think the bureaucrats who are investigating these companies will care
00:13:08.780
that they took down too much conservative speech?
00:13:10.860
No. It will be: these companies make too much money.
00:13:14.660
They're too big and powerful and we have to break them up because they're a threat to
00:13:20.340
And that's what it will be at the end of the day.
00:13:22.560
Do you believe that a corporation can ever... you know, because I've always
00:13:32.580
mocked and hated, you know, those dystopian future movies where they're like, I work for
00:13:39.800
the Corporation. You know what I mean?
00:13:43.820
But I kind of feel that way now, because it is becoming one.
00:13:54.100
And you know, if social media is not taking it down, then it'll be the FBI. And the
00:14:01.420
FBI is really kind of guided by the White House, but so is social media.
00:14:08.720
Well, these companies are making a lot of bad calls at the behest of their woke employees,
00:14:13.540
I don't think Mark Zuckerberg is actually inclined toward that sort of liberal censorship himself.
00:14:21.060
I think there are people who work at his company who graduated from places like Oberlin and Yale
00:14:26.960
and et cetera, and they hold hostile views toward kind of basic principles of a culture of free
00:14:33.660
speech and it's really bad that our education system has produced that.
00:14:38.760
And we need to fix that, you know, down the line.
00:14:40.960
We're not going to see change for many years. But if we're talking about
00:14:45.520
big, what kind of big substantive changes do we need to make to not have people who
00:14:51.920
feel this way running all of our society, being all the elites, that's the kind of thing we should focus on.
00:14:57.120
Just, you know, regulations aimed at the companies are, like, the last link in the chain.
00:15:03.820
So how do you deal with a company that has swallowed every competitor? And,
00:15:11.080
you know, hey, I've got something and they want to buy it.
00:15:16.020
But they've swallowed almost everybody. And to compete against them, because of regulations
00:15:23.660
that they've helped put into place, they can afford things that other companies can't
00:15:29.260
afford. So for startups, it's not a level playing
00:15:35.160
field. There is a market for the opposite of a Facebook. There is that market. But
00:15:44.520
they kill you. I mean, look at what happened on January 6th.
00:15:49.880
Yeah. So what happened to Parler, for instance, I think was very suspicious.
00:15:56.000
I don't like it at all. I've criticized it. Now, the reason that they took Parler
00:16:01.820
out of the App Store, they said, was that there was some violence being organized on
00:16:05.120
the platform, but of course violence was being organized on Facebook and still is every
00:16:08.460
day. And they didn't do anything about that. So it does seem unfair. Now you get into
00:16:13.120
just a technical issue where, you know, I hear a lot that Republicans are more interested
00:16:16.660
in antitrust law now, but existing antitrust law is about harm to the consumer. So the problem
00:16:25.320
posed by Facebook isn't really covered; essentially you would need new laws to address this
00:16:30.200
problem, right? The traditional monopoly is they own all of the resource or whatever the people
00:16:35.380
need, and then they can raise the price of it. But Facebook doesn't charge the
00:16:39.040
user for the service. So it's a different kind of company. I would
00:16:44.400
listen to whatever proposal is on the table to address this, but I still think that even though
00:16:50.180
they've, you know, they've crushed some competition, there's still more ways to speak online than
00:16:56.140
ever before. Like Twitter is a competitor. They have Instagram, but kids now like Snapchat
00:17:01.740
and TikTok. I think Facebook is kind of a dying star. They can't get new young users.
00:17:07.520
No, it's boomer book. That's what they call it. Right. And I remember when I was
00:17:11.640
younger, MySpace was the social media site I liked, and AOL Instant Messenger. And those
00:17:16.020
things are gone. Those things are dead. So what happened to MySpace? Why did
00:17:20.700
that fail and Facebook succeed? It was market competition. Facebook was a better product.
00:17:26.120
MySpace was glitchy. It had a bad user interface. It actually allowed too much customization, like
00:17:32.380
you could change the layout, the way it looked to have like your favorite band in the background.
00:17:35.980
And they really wanted to focus on music. That was their value add. Whereas Facebook was initially
00:17:41.300
just a site for college students and they quickly decided, let's open this up to everyone.
00:17:44.760
And it was just better. It was a cooler, sleeker product, and they beat
00:17:49.380
MySpace. That's it. That's really what happened. It's hard to imagine how anyone
00:17:52.920
else could do it, but it is possible that someone could do that to Facebook. In fact,
00:17:57.980
in the history of online companies, that has happened a lot.
00:18:04.640
Google is very dominant. No question there. They're mostly dominant because the search engine
00:18:10.540
they offer, everybody really likes. I mean, they have the best product. You know,
00:18:17.120
they just beat the search engines that were on the market previously. I think
00:18:20.460
they're closer to a traditional monopoly than other companies. And again, I know
00:18:25.540
they've made mistakes where conservative websites disappear.
00:18:33.360
Yeah. But again, the sort of investigation into them the feds
00:18:40.220
were doing was aimed at their monopoly power, and was saying they really didn't like
00:18:44.920
that it's the default search engine on Apple products. I think they pay Apple to
00:18:51.340
be the default search engine. So if somehow the government broke up that arrangement, you know,
00:18:55.320
what would happen is all of the Apple customers would just be like, well, how do I put Google
00:18:59.260
back on my phone? Right. It's what the customers want. So at some level, what we're
00:19:03.120
saying is we need to bring the government in between two companies whose users
00:19:08.520
willingly like this. This is the thing they like. And you're saying, why
00:19:13.020
are we involving the government again? Do you know what I mean?
00:19:14.640
So let me ask you this, because this is a place where I think the government maybe
00:19:20.600
should get involved. And that is: what I am, my thoughts, you know, my output, is me.
00:19:31.900
It's mine. My data, what I do, that's mine. Things would dramatically change if you could
00:19:40.840
just opt into that or say, no, you don't get any of that unless you want to pay me. Wouldn't
00:19:47.620
that flip the power back around? Because really, when it's free, you're the product,
00:19:53.740
you know? And I don't know if people really still get that.
00:20:01.240
But it's hard to put Pandora back in this box. Or, she doesn't go in the box; she opens the
00:20:06.460
box. Anyway, hard to shut this box. I think it's just her box. I'm not sure. She doesn't
00:20:11.780
live in the box. Sorry. So, you don't think there is a way to get information back?
00:20:17.520
Europe has a different approach to a lot of these things. In the speech-versus-privacy
00:20:22.800
trade-off, they're way more interested in privacy, and we are way more interested
00:20:28.460
in speech, in being able to speak, even if it makes people uncomfortable or if
00:20:33.740
it's revealing private information about people. That's the way our First Amendment kind
00:20:38.060
of understanding has gone. And there are a lot of benefits to that. You know,
00:20:42.060
I mean, you have hate speech laws in England, in France, et cetera,
00:20:46.520
in ways that I think are cruel and abusive to people. But they also have a right to be
00:20:51.480
forgotten, where you can petition Google to have your search results taken out.
00:20:56.420
And there's a process to do that, and Google grants a lot of those
00:21:01.440
in accordance with the law. I like privacy too. I think, unfortunately,
00:21:07.320
some of those things are a non-starter because of how our courts interpret the First Amendment.
00:21:11.440
There's only so much you can do at the end of the day.
00:21:13.880
So when you say don't fear Facebook: I fear the collusion. But the part that I do fear,
00:21:23.240
and it's not a Facebook problem, it's an AI problem, is this: when you can predict
00:21:30.800
and you know someone much better than they know themselves, you can move and sway people. For instance,
00:21:40.760
just the search engine and the search results for politicians. You know, just
00:21:49.900
by ordering them differently, you can sway people. That kind of stuff. How do you deal
00:21:58.120
with these companies being able to do that? It's kind of the trap of: what's free choice worth?
00:22:10.260
Yeah. Well, I mean, these are the conversations I think we should be having,
00:22:14.260
where the world is dramatically changing. And, you know, it was called propaganda. And then,
00:22:22.520
because propaganda got a bad name, they changed the name to advertising. Right. Now we're being
00:22:28.940
marketed to without even knowing we're being marketed to, and it can be very dangerous in
00:22:35.140
the wrong hands. It can be dangerous, but I think the left is very concerned about this because they
00:22:39.200
just hate capitalism. Whereas, look, I see advertisements when I watch TV, and most of them
00:22:45.680
are irrelevant to my interests. I live in DC. I'm not going to buy a car anytime soon. And I see tons
00:22:49.840
of car commercials. Whereas on Facebook, I get ads for things I might actually buy. So there's a
00:22:54.520
manner in which this is not nefarious. Now, sometimes it's nefarious, but in terms of the
00:22:59.060
propaganda and, you know, political stuff, I still have to compare it to everything
00:23:03.700
else. Cable news is a 24-hour infomercial for one political view or the other, depending on
00:23:09.180
which channel you're watching. There's so much open encouragement of
00:23:14.920
what you should think, and I have no problem with that. Everyone should make use of their
00:23:19.780
First Amendment right to advocate for whatever they want. I don't really know that social media is
00:23:25.120
doing that in any more direct or nefarious way than any of the traditional media does. The New
00:23:30.360
York Times endorses people for president. They tell you who to vote for directly. So it's weird
00:23:34.560
for them to complain that Facebook is, you know, Facebook convinced everyone to vote for Donald
00:23:38.560
Trump, and that's why he's president, when you told people to vote for him. Right. But I think
00:23:43.320
there's a difference between being open and being nefariously in the shadows. You know what I
00:23:49.980
mean? I have no problem. You want to tell me what your opinion is and this is what we do
00:23:54.040
here. Okay. That's cool. I know who you are and I can choose in or out. But when you say,
00:23:59.620
no, all we are is a search engine, right, and you want to find an answer,
00:24:04.940
we're stacking the deck so you believe the answer. They have a Black Lives Matter
00:24:09.520
banner in there, but they're neutral? No, no, no. They're absolutely not completely
00:24:14.320
neutral. We should dispense with that farce. No, I'm not making that claim whatsoever.
00:24:18.860
So, yeah, I think there are problems. I just think on balance, they've been good for heretical
00:24:26.520
views. Because decades ago, we had just a small number of media companies, and you could
00:24:32.880
only say something that was in their narrow range. Now those people have no control
00:24:37.300
of the conversation and they hate it. They want control. They want to shut down Facebook and
00:24:42.460
everything else so that they can again be the people who decide what we're talking about.
00:24:47.300
Again, it's the weirdest thing. They hate free speech.
00:24:54.100
They really do. They do. And it's so weird that companies that were built on "you be you"
00:25:01.660
Yeah, you be you. Now they don't want you to be you. You be you, as long as you fit into this
00:25:08.560
category. It's so bizarre. I mean, this is very philosophical, but with the
00:25:15.340
new kind of woke left, the idea of the individual is kind of anathema. You have to be part of some
00:25:20.920
group identity. And I don't have any... I'm a libertarian. I'm not really a social conservative.
00:25:25.360
I don't have hostility to these kinds of group identities they're talking about. I just think
00:25:29.320
but at some level, that's just your characteristics. That's just who you are. And that's great. But
00:25:34.700
you're just you. You don't have to subsume you to whatever group identity you're choosing.
00:25:39.860
I think that's one of the biggest problems. If you were a conservative, but you didn't
00:25:47.900
say that Donald Trump was the greatest person to ever live and all of the claims he made about
00:25:55.960
the most luxurious hotels you've ever stayed in. If you didn't buy into that, you're like,
00:26:02.280
no, I don't really... for a while there, right, you were dead. You were dead. That was weird.
00:26:08.700
It was really weird. Because that's what I expect from the left. Yeah. And they're light
00:26:15.700
years ahead. But still, on the conservative side, it's toe the line.
00:26:23.360
Have I just misread America for so long that I really thought that we could disagree on things
00:26:32.540
and get along? Or is that America just not real? I have such a bias against... I hate
00:26:39.460
the idea that things were better in the past and they're bad now, because that's sort of
00:26:43.100
a cognitive bias. There are a lot of ways in which the past was bad. But it does
00:26:48.420
feel like this has gotten worse, like the polarization has gotten worse.
00:26:54.220
I think because it has. You know, we used to be able to get along because
00:27:01.640
we all at least thought we all believed in the Bill of Rights. You know what I mean? Yeah.
00:27:09.380
We're Americans. That's what makes us American. Hey, you do what you do and I'll do what I do.
00:27:13.560
And we believe in this Bill of Rights. When we would violate those rights for blacks
00:27:20.040
or Native Americans or whoever, it would start to grind and eventually people would be out
00:27:26.600
in the streets protesting, and we would return to the actual meaning. We're not even talking
00:27:33.660
about rights anymore as things that are immovable and, you know, given by a higher
00:27:43.020
power than government. So government can't reach into them and change them. I think that's
00:27:50.000
what we've lost: we don't have an unum anymore. I mean, what is it we agree
00:27:58.400
on? Well, I also think the stakes feel higher because they are higher because the federal government
00:28:04.800
is more powerful and does more things. So it used to be easier for us. We could disagree and
00:28:10.220
go our separate ways or be friends. And because your rules are not going to be forced on me
00:28:15.080
and my rules are not going to be forced on you. Correct.
00:28:17.100
But now whoever wins will be forcing their rules on literally every person who lives in
00:28:21.700
this country. Do you believe that there is... because I don't believe in the Republican Party.
00:28:27.200
But do you believe that a reset of this system... I mean, because when your computer starts
00:28:34.300
acting like this, you turn it off and turn it back on again.
00:28:40.320
Yeah. And it reboots back to the original program, factory settings. Yes. Factory
00:28:45.820
settings. We do turn it off and turn it on again. Do you believe that
00:28:56.000
we are so far from those factory settings that whoever is in charge is just going to write the
00:29:02.240
new code? Yeah. I mean, I hear this, unfortunately, a lot, even from people on the right. They say,
00:29:07.420
no, when we take power, we have to use it, we have to wield it. Like they're Boromir
00:29:12.080
talking about, you know, the ring at the gathering. It's a gift. We have to...
00:29:16.260
You cannot use it. This tool is hostile to our views, to our limited government views. We can't use it to
00:29:21.880
accomplish limited government. You can use it to, you know, repeal bad laws, to reduce
00:29:27.260
regulations, to get rid of pointless bureaucracies. But don't
00:29:33.160
create these new institutions to enforce your worldview that in four years will not be enforcing
00:29:38.320
your worldview, and probably won't even be enforcing it in the meantime, because they'll be staffed
00:29:41.740
by progressives. So don't fall... and I see a lot of people on the right falling into this
00:29:46.740
trap. I mean, Trump, he did some things I agree with and some things I disagree
00:29:51.580
with. But I don't think anyone would say fairly that he accomplished a massive decrease in the size of
00:29:56.660
the federal government, right? No. So we just tried it. It's not going to work. So you have to
00:30:02.580
don't use the tools, break the tools. That would be my advice.
00:30:12.660
I've blown your mind. How do you know? I'm just trying to think how you hold things together. I mean,
00:30:18.140
our culture, the culture war, it's terrible. Terrible. You know, I go back,
00:30:28.240
we were talking earlier about the invisible hand of the market. The market is just a tool. And,
00:30:35.060
you know, moral sentiments: if you're not a moral society, the market's going to give you immoral
00:30:43.440
things, because that's what you want. So how do we change this? How do we change?
00:30:51.020
How do we use the tools that we're not supposed to fear, right, to hold us together in a reset?
00:31:01.460
We have to cling to things that are not political and try to zealously guard against
00:31:08.220
them becoming political. We need to, you know, think about community in a healthy way,
00:31:15.260
not in a "we have to force everyone to think one way or the other" way. We have to look outside
00:31:20.400
the structures of government for your kind of social cohesion. And this pandemic has
00:31:27.040
probably been the worst thing for that of all time, because the rule of this pandemic was don't
00:31:32.360
socialize, don't keep those social ties; those should fray, because we have to do that for
00:31:38.100
our health. There are many unhealthy things about the restrictions. We have decided it will be
00:31:44.220
fine that the government can snap its fingers at any time. If anyone anywhere is in danger of
00:31:49.380
contracting COVID-19, we're going to shut down all the schools and all the community services,
00:31:53.400
and you'll have to wear masks and not see anyone. Really bad. Really, really, really,
00:31:58.720
really bad. People should stop following those. And of course, the
00:32:03.840
only people following those are ordinary people. The elites don't follow those. The mayor of every
00:32:09.200
you know, blue city is not there still having birthday parties and weddings, et cetera. Um,
00:32:15.820
The pandemic has been toxic in that way. I mean, it's
00:32:22.080
ripe for a French Revolution type thing. Honestly, I feel this, and I'm not
00:32:27.400
really a resentful or populist type person whatsoever, and even I am like, off with their
00:32:32.420
heads. These people who will order you to wear a mask, but they're not
00:32:38.000
wearing one themselves at, you know, the Met Gala, the Emmys, et cetera. It really infuriates me. It's
00:32:43.760
absolute craziness. Yeah. So I keep wondering, what is the hold on people? Because,
00:32:54.200
I mean, we don't have anything, because of social media. Everything's been destroyed. We
00:33:01.100
don't have trust in anything anymore. And I don't know if it is because of social media,
00:33:07.280
although I think the left played a big part of this. But we also destroyed ourselves.
00:33:13.200
There wasn't anybody who said, no, I'm staying true to these principles. I mean,
00:33:19.580
name the media organization that deserves any credibility right now. There is no specific
00:33:26.500
media organization that deserves credibility. But you know what,
00:33:31.360
I still think this is a benefit of social media, because now on social media I see content
00:33:35.860
from all sorts of media organizations, and I get to, you know, read many perspectives and then decide
00:33:41.600
which one I think is right. I think the media environment of the pre-social-media era produced
00:33:47.700
a lot of really bad policy, really bad foreign policy. A lot of reckless intervention
00:33:52.780
that was cheered on the front page of the New York Times and other places, that was unthinkingly
00:33:57.500
celebrated. Now you have a lot of non-interventionists. Even on the right, there's a lot of
00:34:04.000
healthy anti-interventionist sentiment in dissident, non-liberal media content. And
00:34:10.740
even the weird, far-left stuff sometimes has a lot of value. People like Glenn
00:34:16.100
Greenwald, Matt Taibbi, et cetera, who have been able to thrive in a non-traditional media
00:34:21.920
environment. That's part of the breakup that social media provided. In some
00:34:26.800
ways we're more informed. We're also capable of being less informed, because there's a lot more
00:34:30.780
bad information out there, but sometimes the bad information is actually good information.
00:34:35.040
How do we know? We don't want one person in charge of deciding it.
00:34:38.220
I don't want Mark Zuckerberg to be that one person in charge of deciding it.
00:34:41.140
Ted Koppel thought he should. I mean, Ted Koppel actually said to me,
00:34:44.420
there should be licenses. Everyone should have a license, and we in the mainstream
00:34:51.520
media should vote on who gets those licenses. And I was like, no, no, Ted, no, that's a bad idea.
00:34:58.360
And he was directing it, I think, somewhat at me: like, you shouldn't have one.
00:35:06.400
But like, newspapers caused the Spanish-American War, right? The coverage of
00:35:11.940
the Maine. Every innovation in the communication space, and I actually have a lot of anecdotes about
00:35:16.940
this in my book, they're really funny, every innovation in the communication space had
00:35:21.780
whatever the previous communication thing was saying this will be the end of the world. The
00:35:25.720
New York Times coverage of radio, when it emerged: there were psychologists who talked
00:35:30.460
about how, with radio, well, we're all just dead now. We'll never have a conversation with a human being.
00:35:34.680
You'll be this mindless listening ape. And the phonograph: the New York Times
00:35:40.220
wrote an editorial calling for Alexander Graham Bell to be killed for inventing the phonograph.
00:35:44.960
Oh my gosh. Isn't that funny? Every kind of thing prompts this sort of panic. And
00:35:51.000
because all of these things have had some downsides. But I don't want to succumb to the panic.
00:35:55.120
I mean, I remember video games, where there was a whole freakout: violent video games are going to
00:35:59.560
make all your kids into crazy, you know, school shooters. It turns out that now the research shows
00:36:05.980
that not only is that not true, the tiny, small minority of kids who are
00:36:12.040
inclined to violence are probably made less violent by violent video games, because they have an outlet
00:36:16.860
for their violence other than actually going out and committing real violence. So that whole idea
00:36:21.860
was not true. And Scalia wrote what is, in my opinion, the greatest Supreme Court opinion
00:36:27.380
of all time in the Schwarzenegger California video game ban case, where he says these are expressive
00:36:33.520
ideas, just like, well, Grimm's fairy tales are violent and obviously we can't ban those. It's a great
00:36:39.120
decision that I would encourage everyone on the right who's in a kind of, you know, panic about
00:36:44.320
social media to read. He said cooler heads should prevail, on the opposite
00:36:48.740
side. I have a story you probably don't know. When the motion picture was put together
00:36:57.460
by Edison, I have an original copy of the investment portfolio, and in it,
00:37:10.940
the last couple of lines say: we will now forever know the truth, because it will be recorded.
00:37:19.840
And one person says, we'll know, because we'll have it on film.
00:37:27.920
Like, that didn't work out. Nope. That did not work out. Smart people can be very stupid in some ways.
00:37:34.140
Yeah. Let's talk a little bit about Donald Trump. Sure. His being banned. Yeah. The Ayatollah
00:37:42.380
is not banned. Right. Which is embarrassing and humiliating and ridiculous, that these companies
00:37:49.940
have these different practices. No disagreement. Yeah. Trump was on social media for almost
00:37:56.920
the entirety of his presidency. They kicked him off at the very end. I think they probably showed him,
00:38:02.900
honestly, and I feel like we're going to disagree with this, probably more leniency than they would have
00:38:07.220
someone not in his position. I think there are multiple times he tweeted something where, if he were
00:38:11.000
anyone else, they would have banned him. I've also seen the court cases. There
00:38:15.140
are people, lefty people, who sued Twitter saying that Donald Trump's tweets represent violence
00:38:20.700
or harassment or something: we're suing you, telling you you have to take Trump off Twitter.
00:38:26.360
And Twitter said, we're not going to do that, and here is the relevant law, Section 230;
00:38:31.000
because of this law, we don't have to do that. Which is funny, because Donald Trump
00:38:36.440
has tweeted about getting rid of this law. Right. But this law actually empowered them to keep
00:38:40.920
him on the platform. So that's an example of another way that, you know, changing these
00:38:45.460
kinds of regulatory things, I think, would be bad. And also, but wait, isn't that still just bowing
00:38:52.180
to the pressure from little groups? Right. And they bow. He's the president of the United States,
00:39:01.400
former president. There's no one who has used social media more effectively
00:39:09.040
than him. Yeah. Good or bad, he got his ideas out. He did. He used it very effectively. Now,
00:39:18.560
I think he could talk to the people in any format he wants. Like, he could be
00:39:24.700
on TV all day. Everyone will point a camera at him if he starts talking.
00:39:29.880
But he loves Twitter. He wants to be on Twitter. That's his favorite one. Right. And
00:39:34.920
they finally said, look. I think they should let him back on the platform. I've openly said that.
00:39:39.200
I don't think their decision to take him down in the moment when they did it
00:39:44.500
was necessarily wrong. I attended the Capitol, I covered it, the Capitol riots on
00:39:49.840
January 6th. You were there. I was there. Part of the clan? I was not. I was there covering
00:39:54.960
it. That's what you say. Right. Yes. Yes. No, that was me, right, with the horns.
00:40:00.800
I did get tear gassed in the face, because there was a cloud of tear gas and I
00:40:04.800
couldn't get away from it. And I thought to myself, well, how bad could this be? It's
00:40:08.280
time to have a real journalist experience. I'm tough. I can take this.
00:40:13.360
Several seconds later, I'm crying. I'm like, this is horrible. This is so horrible.
00:40:20.020
So look, it was a bad day. It wasn't an insurrection. They weren't in danger of
00:40:24.240
overthrowing the government. I agree with that, but it was still a pretty bad, embarrassing thing
00:40:28.280
that they did. It was very embarrassing. And Trump should have calmed it. I think he had
00:40:32.560
a moral responsibility to lower the temperature, and he didn't do that. Yeah. I think that was
00:40:38.680
his real error. I thought that was a turning point for a lot of Americans
00:40:45.080
who supported him, when he didn't immediately come out with the same reaction that I think
00:40:51.080
99% of Americans had: stop. Yeah. Stop this. This has got to stop. And you
00:40:58.260
had, you know, the left come to town and smash every window in sight during the summer
00:41:02.740
of protests right before that. It was horrible. It was condemned. It should have
00:41:07.080
been condemned more strongly, obviously, by people who weren't doing that. But then Trump
00:41:13.040
supporters came to town and they wrecked the place a little. Not every street,
00:41:17.340
but they did not, you know, respect the buildings at all. And there's a little bit
00:41:22.600
of excuse-making for that that doesn't sit right with me, because you shouldn't
00:41:26.840
smash the windows of places. No. And I think people are just getting so frustrated, and they
00:41:30.700
see the left doing it and it seems to be working for them, so: I'm going to do it too. And
00:41:35.080
it doesn't work for anyone. No, it does not. It doesn't work. Let's talk a little
00:41:40.160
bit about Section 230. Sure. Because you say it's not what everybody thinks it is.
00:41:49.380
And I think what everybody thinks it is, is government protection from litigation.
00:41:58.420
So Facebook is a publisher. I can be sued if we put something on the air that's not
00:42:07.340
right. I can be sued. They said, we can't have this if we're going to get sued every time.
00:42:14.500
Which makes sense, but then you'd just be a publisher
00:42:21.380
and not an editor. Right. Is that right? Yeah. Once you start to edit, you have the right to be sued,
00:42:30.140
because you're editing. But the issue is, they have to do some moderation. Everyone, I don't care who it is,
00:42:35.440
anyone who says no, these platforms shouldn't do any kind of moderation, they're
00:42:38.880
not publishers, they should just allow anything: I will show you things
00:42:43.200
and you'll say, well, of course I want that taken down. Right. Everybody wants craziness taken down.
00:42:48.180
Right. And I agree with that, but the rules aren't equally applied. I mean, it's clear there's an
00:42:56.700
editorial opinion. It's not just editing. It's editing for a certain opinion. Why should that be
00:43:05.800
protected by the government? The New York Times isn't. I'm not. Right. You know, honestly,
00:43:12.160
if I wanted to square this, I would probably say, well, maybe other media companies should have
00:43:16.240
that protection. I don't like suing everyone willy-nilly. I support tort reform. It's a gift
00:43:20.940
to trial lawyers, who are the most reliable Democratic Party people anyway. So you're
00:43:26.500
right: Section 230 is not like the First Amendment. It's not something sacred. They
00:43:30.200
could change it if they want to. I just think the immediate consequence of changing it in any way
00:43:34.300
would be less conservative speech online, because if you subject them to greater liability,
00:43:39.040
they're just going to take down more stuff. Or they're going to have a
00:43:43.220
system where only people they trust can post at will. And who's that going
00:43:48.240
to be? It's not going to be you. It's not going to be me. It's going to be the mainstream media. It's going to be
00:43:52.680
blue-checkmark people. There could be some system where only if you have a blue checkmark next to your name
00:43:56.580
can you post; everything else has to be reviewed by our attorneys to make sure we're not going to be
00:44:01.440
sued for it. This does not benefit the right at all. It's so obviously bad that I can't
00:44:07.420
believe anyone has seriously considered doing it, but they have. So let me go to
00:44:13.800
the places where it's quite obvious to some that it's bad, but you don't want to squash
00:44:22.780
it. I would imagine you don't. I don't. I've always believed more speech, not less. Let
00:44:31.160
ideas stand on their own. You know, I'm glad we didn't ban Hitler's writings. I know
00:44:40.760
when I first read Mein Kampf, I'm like, what part of this were the Germans surprised by when he
00:44:48.960
started killing all the Jews? He said it right here. And so I think, you know, you want to
00:44:57.240
expose people to everything. When I was at CNN in 2006, the share of people who believed we didn't go to the moon
00:45:05.460
was, I think, 6%. And I had already seen the problem with credibility. And I said,
00:45:14.760
that number is going to grow. And if we don't start being true to who we are, once people don't believe
00:45:24.380
anything, they'll believe anything. That number now, on we didn't
00:45:29.080
go to the moon, is I think 13 or 14%. And QAnon. QAnon is, I mean, I hear stuff from QAnon
00:45:40.920
and I'm like, are you kidding? You don't believe that, do you? How do you, what do you deal
00:45:49.500
with that? How do you deal with that? I struggle with this, because I think there's a certain level
00:46:01.660
of involvement in it that is not quite so scary and is in some ways, though it seems weird,
00:46:06.820
like people who believe in astrology. Or it's almost like a video game,
00:46:11.160
an online game where, oh, you're finding clues, you're doing
00:46:16.220
some kind of detective story. Do you really, really believe it? Or is it just
00:46:21.680
something you're involved in? And people have believed kooky things for,
00:46:27.120
like, forever. I was talking to Steven Pinker, the Harvard psychologist who's written
00:46:32.020
a lot of great books. You know, he points out that people used to kill animals and
00:46:36.620
then look for signs, like, in their blood, right? We used to torture animals for
00:46:43.260
fun. People used to believe that if birds landed on
00:46:47.020
the trees in a certain way, that meant the comet's coming. People have
00:46:52.840
believed kooky things. Now there's a political flavor to everything, and QAnon has this political
00:46:59.400
flavor that feels weird and new. But I don't know how many of these people actually believe it.
00:47:03.320
Maybe the most committed people are the ones being tricked by the FBI into doing
00:47:09.680
something, like the kidnapping plot. I'm from Michigan originally, and
00:47:13.380
the more you read about that kidnapping plot against the governor, the more you
00:47:19.380
see it was just, you know, people sitting around complaining who then kind of got talked into
00:47:24.020
it by federal officials. And they used to do that. They did this, by the
00:47:28.280
way, to a lot of Muslim people. Every time there's a bust of a so-called
00:47:33.580
ISIS-affiliated person: well, this person never met anyone from ISIS. They're a radical teen saying
00:47:37.900
radical, bad stuff online. And then they meet with a law enforcement officer who's like,
00:47:41.740
oh, you could purchase explosives from me. And now, great, now
00:47:45.880
you're going to jail forever. No, there's no actual threat that was averted here. And,
00:47:51.680
you know, they get a clap for the feds. No threat was averted.
00:47:51.680
There's a lot of that. And actually, this is one thing I admire about Trump especially, that has been
00:47:56.080
good: he's made the right more suspicious of what he calls the deep state, for
00:48:01.340
absolutely valid reasons. I remember the first time he
00:48:06.840
said things like the press is the enemy of the people. I'm like, oh God, what are you doing? Right.
00:48:12.940
And then the more you watch their reaction, the more you're like, well, you know, kind of, yeah.
00:48:19.400
Yeah. You know, and the deep state, when I first heard that, I thought, so conspiratorial. But
00:48:25.380
if you understand it as, no, just a group of career people, politicians and,
00:48:34.200
you know, salaried people, who just think, I don't care what the president says, I'm going to be here
00:48:38.540
a lot longer than he is. Right. That's the deep state. It's not a star chamber. Right. It's just
00:48:44.200
people who think they know better. The State Department. Right. The generals who are, you know,
00:48:48.760
just feeding totally wrong information about the state of Afghanistan to president after president
00:48:54.480
after president, knowing they'll outlast this guy, knowing they can't make themselves look
00:48:57.920
bad. I mean, and Milley has just totally embarrassed himself. But it is a deep state. What
00:49:06.000
can you say? It is. So, on that, we've had the Hunter Biden story, and the media and everybody
00:49:15.900
has turned that around and made you look like, not you, but, well, me look like a conspiracy theorist
00:49:22.800
for believing it and going, you know, I think we should at least look into this. I mean,
00:49:28.640
how do we navigate the world we're in? You're right that it's always been bad.
00:49:38.780
I mean, in some ways I pine for the days when, you know, we had to sit down together
00:49:48.000
as a family, or whatever, and watch a show at eight o'clock on Thursday night, because it kind of
00:49:53.520
brought all of us together. You know what I mean? I never thought of it that way. I always thought,
00:49:58.580
this sucks, I can't wait until you can get it when you want to get it and watch it how you want to
00:50:02.960
watch it. But there were some good things about that. And television was going to destroy us.
00:50:08.520
You know, it was destroying the fabric of America. This is the same thing. I agree,
00:50:14.360
but it's on steroids and it's so much faster. People do need to turn it off more. When I say
00:50:21.340
don't panic, I also don't mean make this your entire life and submit yourself to it.
00:50:27.020
Turn your phone off sometimes. Log out. Stop using social media sites that aren't contributing to your
00:50:32.940
own wellbeing. You don't have to be on them. Some of the people advocating the most
00:50:37.580
for restrictions or breaking up these companies, what I think is that they're addicts
00:50:41.680
and they think everyone's an addict. Look at The Social Dilemma, that Netflix documentary about
00:50:45.940
how terrible it all is. It's all these ex-Google people, ex-Facebook people. Well, they're the
00:50:50.320
most online people of all. They're talking like they're Manhattan Project scientists:
00:50:54.740
we invented the greatest, most powerful, destructive thing ever, and now we're warning you,
00:50:58.280
and it's so great and we're so great for inventing it, but it's really bad for you. Maybe it's bad for
00:51:02.200
you. You know, most people can walk into a casino and not bet away their life savings.
00:51:07.580
Some people would bet everything. They can't do it, because they're addicts, and
00:51:13.540
they should not do it. I think we're at a point where we don't
00:51:20.920
agree on something. No, maybe we don't. I mean, try to tell
00:51:27.940
people, yeah, put your phone down for the day. They will all tell you: oh no, no, no,
00:51:33.540
I can't, I can't. Wait a minute. Ten years ago, you could. I say this to my wife
00:51:39.400
all the time. She's like, no, I have to be able to. Why? I don't have a phone. I don't carry a
00:51:45.740
phone. I'm fine. Why does everybody have to have that? You tell them to turn it off, and most
00:51:55.420
people are like, no, no, no. Have to have this. Yeah. It's not healthy.
00:52:01.940
It's very bad. Yeah. I hope these companies will seriously consider implementing
00:52:11.500
things that make it easier to turn them off some of the time. Maybe they won't,
00:52:17.760
but they're right now going to face regulation. They are going to be regulated one way or the
00:52:22.240
other, unless they can fix themselves. They don't want to
00:52:28.080
fix themselves, but they don't want to be regulated either. They're going to be
00:52:30.780
the ones who write it. Well, that won't be good. I know Facebook supports Section 230 reform
00:52:35.280
because they know it's going to put Twitter out of business. Right. And you know,
00:52:40.120
also, the government. We're talking about people who are a thousand years old.
00:52:47.960
The gerontocracy. Yeah. I mean, talking to these people about AI or anything
00:52:56.040
on the horizon, they don't have any clue what it is. So they're just going to go to their
00:53:02.340
friends in Silicon Valley and say, why don't you write some stuff up? How would you regulate it?
00:53:10.120
That's what FDR did with GM and the big three automakers. Every hearing where you have these
00:53:15.100
senators yelling at Zuckerberg and Dorsey and everyone else, I mean, they're embarrassing.
00:53:18.800
They have no idea what they're talking about. Richard Blumenthal was yelling at someone from
00:53:23.560
Facebook about Finsta. And he's like, get rid of Finsta. Take down Finsta. You have to eliminate
00:53:29.160
Finsta. Finsta is not a feature. It's just having a fake Instagram account. It's just something
00:53:34.040
users came up with. It's not a service they offer. They can't take it down. And she's trying to explain this,
00:53:38.560
and he's so old, he doesn't know. That should disabuse anyone of the notion that
00:53:43.700
these people are competent. I think it was Congressman Johnson who was talking
00:53:50.380
to a general. They were talking about the bases on, I think, the Philippines.
00:53:59.820
And he said to the general, are you concerned at all that if we have too many people on that side of
00:54:08.220
the island, the island will capsize? And the general, without missing a beat, went, no, sir,
00:54:17.160
that's not anything we've considered. Oh yeah. The people running our
00:54:28.180
government are not good. A lot of the people running, and I think to a greater extent working
00:54:33.100
at, these companies are not good. They're hostile to our values. They're hostile to classically
00:54:37.960
liberal values, free speech values, and it's very bad. But what can we realistically do
00:54:43.100
about it? I don't know that it's a lot, from a policy standpoint, and I don't want to
00:54:47.940
make things worse. And I just don't want to give power back to the media environment we had before
00:54:52.840
this, because for every bad thing we have now, I honestly think it was worse when I was a young
00:54:58.060
person, before this: the dominance of the people who really hate you, who really disagree
00:55:04.020
with your values, who won't let you hear a single different perspective and just want you to listen to
00:55:08.500
them. Those are the people that would come to power again if we broke up these companies.
00:55:15.140
Let me ask you one last question. I'm sure you're
00:55:19.900
involved in cryptocurrency. Yeah. That's another place where the powers that be are actually talking
00:55:29.120
about making the post office a bank. Yep. I mean, if that's not thousand-year-old thinking,
00:55:36.960
I don't know what is. They've got to stop cryptocurrency, because they lose all
00:55:44.360
control. Once you don't control currency, you lose vast amounts of power.
00:55:55.680
Crypto could very well be the solution to some of the problems we're talking about. I am not an
00:56:00.500
expert in this subject. I'm very interested in it. I'm just frankly not an expert. But a lot of my
00:56:04.500
colleagues at Reason Magazine write about it. Zach Weissmueller is someone people should look up if
00:56:09.300
they're interested in this. And he writes very persuasively about how this could be the future:
00:56:14.660
a future where there is less central control, where there is privacy, where a lot of the things we're
00:56:19.320
worried about, like concentration, can be addressed by the decentralized
00:56:24.520
internet. It is possible. And that's all going to be innovation. That's not
00:56:29.340
going to be the government doing anything. At best, it will be the government not doing anything.
00:56:33.360
Right. So that could be the solution to a lot of what we're talking about.
00:56:37.920
Generally speaking, I can't get a feel from you whether you are hopeful for the future.
00:56:45.680
I think I'm hopeful for the future. I think things tend to get better.
00:56:48.820
Our political conversations have gotten worse, and the government always gets bigger,
00:56:52.460
but there's a lot more to life, and we should enjoy that. And it's been a bad,
00:56:57.800
it has been a bad two years. I don't think things can be as bad as they were.
00:57:04.040
You know, that's what everybody said last year: 2021 can't be as bad as 2020.
00:57:09.860
My personal 14 days to slow the curve are over. Let me tell you, I am going to enjoy my life.
00:57:17.100
I don't care what Dr. Fauci or any other health official says, ever again. So
00:57:23.700
it will be better. We will make it better if it has to be.
00:57:28.100
The name of the book is Tech Panic: Why We Shouldn't Fear Facebook and the Future.
00:57:37.800
Just a reminder, I'd love you to rate and subscribe to the podcast and pass this on to a friend.