RadixJournal - December 04, 2020


Fahrenheit 230


Episode Stats

Length: 50 minutes

Words per Minute: 154.9

Word Count: 7,765

Sentence Count: 466

Misogynist Sentences: 4

Hate Speech Sentences: 10


Summary

J.F. Garipay and I debate the future of the internet and the role of Section 230 of the Communications Decency Act, which shields websites and their users from being sued over content posted by other users.


Transcript

00:00:00.000 It's Friday, December 4th, 2020, and welcome back to The McSpencer Group, an unrehearsed,
00:00:08.060 hastily assembled program about meta-politics.
00:00:11.840 Joining me is the legendary J.F., Jean-François Garipay, a man who believes that eating
00:00:19.220 melted brie cheese is bad for you.
00:00:21.840 Main topic, Fahrenheit 230.
00:00:25.100 Social media deplatforming is an existential crisis for our time.
00:00:30.120 Twitter, Facebook, and YouTube have become, in effect, the mainstream media, replacing
00:00:35.480 the big networks of yesteryear.
00:00:38.120 As they've risen to prominence, they seem less like free speech bastions and more like
00:00:43.080 institutions with an agenda to promote their preferred messages and advertiser-friendly
00:00:48.680 content.
00:00:49.620 Donald Trump thinks we can crack this nut by removing big tech's immunity to lawsuits under
00:00:55.320 Section 230 of the Communications Decency Act.
00:00:58.260 J.F. and I debate the future of the internet.
00:01:02.720 All right.
00:01:03.420 So we can get going, maybe, on this.
00:01:04.860 Who's the host here, J.F.?
00:01:06.040 Yeah, yeah.
00:01:07.660 You're like, all right, next question.
00:01:14.040 It's hard to give it up.
00:01:14.960 This is my show.
00:01:16.700 We've been doing this for years.
00:01:18.940 I know.
00:01:19.520 We have.
00:01:20.060 So it's kind of weird.
00:01:21.180 Okay.
00:01:21.800 Let me take the reins.
00:01:23.640 So we have some big issues that I want to talk about.
00:01:31.520 And we have an interesting disagreement here.
00:01:34.260 So: de-platforming and Section 230.
00:01:38.920 Now, de-platforming, let me set the scene here and then we can kind of talk about it.
00:01:43.640 Because this is a very specific issue, in the sense of this aspect of the Communications
00:01:49.820 Decency Act.
00:01:50.180 But it's also a really big issue that affects you and me personally, in fact, and affects
00:01:57.320 millions of people.
00:01:58.720 So de-platforming is an issue that was around maybe five or 10 years ago.
00:02:08.040 You would occasionally hear of someone maybe selling marijuana or maybe publishing what,
00:02:15.900 you know, quote unquote, white supremacy articles on the internet.
00:02:20.480 And they were getting kicked off PayPal.
00:02:22.320 But it seemed to be a very marginal thing.
00:02:24.720 And in my experience, because I've been doing this for a while, I was not affected by de-platforming.
00:02:31.360 And in fact, Silicon Valley's platforms were vital to my business, such as it was, in
00:02:41.380 the sense that my ability to use Stripe, to use Squarespace, to use PayPal, to use all of
00:02:47.860 these kinds of layers was essential.
00:02:52.160 And by about 2016, things started to dramatically change.
00:02:57.680 I remember right after Trump's election, I was kicked off Twitter for really no reason.
00:03:03.020 I mean, my last tweets before I was kicked off... actually, I think my last tweet before
00:03:08.780 I was kicked off, ironically, was a response to David Frum.
00:03:13.040 And he actually wrote an article noting that I was about to respond back to him when the account
00:03:18.040 disappeared.
00:03:18.600 So anyway, there was no real reason.
00:03:21.960 It was just kind of like Trump movement, white supremacist, Richard Spencer, gone.
00:03:27.680 Now, I was let back on Twitter a month later in an equally mystical fashion.
00:03:33.240 But by 2017, we were all getting banned from payment platforms, from some web hosting companies.
00:03:41.460 I could go down the list.
00:03:43.300 Yes, there's hundreds of examples.
00:03:45.660 And a big ban occurred of a number of people, including Stefan Molyneux and myself, this summer
00:03:53.340 of 2020.
00:03:54.820 I think it was June or July, when we were unceremoniously banned.
00:03:59.880 And in a Kafkaesque manner of, you know, what did we do?
00:04:04.080 We didn't have strikes.
00:04:08.060 No one was yelling vulgar language or issuing death threats.
00:04:12.740 We were simply kicked off in a fashion that happened instantaneously and was clearly
00:04:17.960 coordinated and planned.
00:04:19.120 I was kicked off Facebook before that.
00:04:21.440 We've all faced these major issues.
00:04:22.900 So de-platforming is a thing.
00:04:24.140 And I think some average Trump supporters have faced de-platforming themselves.
00:04:30.100 QAnon has been just purged... I mean, I have a difficult time exactly defending QAnon.
00:04:37.360 But regardless, QAnon has been purged from all major social media networks and so on.
00:04:47.140 And so there's a general anxiety about de-platforming.
00:04:51.120 Um, the answer that has been given to us by Republicans and by Trump himself, who's tweeted
00:04:59.960 about this dozens of times now, is that we need to remove, uh, Section 230 of the Communications
00:05:08.860 Decency Act.
00:05:10.780 So let me just read this just to give everyone some background.
00:05:14.740 So Section 230 says: "No provider or user of an interactive computer service shall be treated
00:05:21.540 as the publisher or speaker of any information provided by another information content provider."
00:05:28.800 Now, what this means is that these companies like Twitter, Facebook, or YouTube are platforms
00:05:37.520 and they are not a publisher in the sense that the New York Times is a publisher.
00:05:42.260 If the New York Times issued death threats, then you can sue both the publisher and, you know,
00:05:50.380 maybe the speaker who said them, uh, and so on.
00:05:54.180 Uh, if they give out completely false information that damages you in some way, you could potentially
00:06:00.860 sue them.
00:06:01.380 Now, what this act did, and this came in 1996,
00:06:05.660 so kind of right at the burgeoning, you know, era of the internet, when people were just kind
00:06:11.520 of getting online, uh, was treat these companies like platforms.
00:06:16.540 And I think we kind of understand where they were coming from.
00:06:19.660 I think that this was actually a quite smart law, in the sense that if you have a blog,
00:06:26.040 yes, you might want to moderate comments to some degree, but you can't moderate
00:06:33.340 them all, that takes a lot of time, and someone might go into your comment section
00:06:39.460 and leave a death threat or give out false information, et cetera, and you are not
00:06:45.580 jeopardized by that as the platform or the person writing the blog.
00:06:51.660 So this makes a lot of sense. Now, as time has gone by, the web isn't just
00:07:00.300 some hobby place where you put out weird, wild information or, you know, put up a fan
00:07:09.040 page or have a blog that 20 people read. Um, with the development of social media, social
00:07:15.260 media has replaced the mainstream media in so many ways; the mainstream media actually
00:07:20.720 kind of feeds off social media to a large degree.
00:07:24.400 And so we have this situation where these formerly minor sites, which in recent memory
00:07:31.480 were the Wild West.
00:07:32.800 I mean, I remember going on Twitter and, you know, N-word here, crazy lunatic meme there,
00:07:39.220 uh, porn here, uh, you know, someone bizarrely selling marijuana there.
00:07:44.620 I mean, it was the Wild West.
00:07:46.680 It was almost like the dark web or something.
00:07:48.720 Well, now social media is increasingly driving mainstream media.
00:07:54.600 It's increasingly replacing mainstream media, uh, in terms of impact on all sorts of things:
00:08:00.500 culture, but also the impact on elections.
00:08:03.480 And so they're getting pressure from
00:08:10.280 journalists, and they're feeling a sense of responsibility to, uh, curate content.
00:08:16.160 And so just little things on Twitter: you know, like, you go to the search and they'll
00:08:20.940 give you the trending topics that are maybe algorithmically based, but they'll
00:08:25.460 curate them.
00:08:26.080 They'll kind of describe to you the latest thing of, you know, Kim Kardashian's new butt
00:08:32.020 pic and all these reactions to it, or whatever; they'll curate content for you.
00:08:37.160 So on some level, they are becoming a type of publisher.
00:08:41.880 Now, it's still mostly all user-generated, but it is a kind of curation where they're a publisher,
00:08:47.900 and they feel a certain responsibility, particularly with the kind of Russia-hacking, you know,
00:08:53.280 memes going around, that they can't allow bad, false information out there,
00:09:00.780 because it affects elections.
00:09:02.480 So social media is taking on this responsibility and they are banning people.
00:09:07.420 They are suppressing information, which we saw vividly with the Hunter Biden scandal of
00:09:12.580 a few months ago, and they are banning accounts, um, and so on.
00:09:17.700 And so we're in this transitional stage where we don't really know what social media is going
00:09:24.660 to be right now.
00:09:25.740 And they don't know either.
00:09:27.360 And conservatives feel anxious.
00:09:29.700 They feel like they're the ones getting suppressed and the left is not.
00:09:33.260 And that's, you know, true, uh, basically, I mean, with a few minor exceptions here and
00:09:40.080 there, and they want to do something about it.
00:09:43.060 And so the solution that Republicans have given is that we want to remove this waiver
00:09:50.460 of liability, a kind of shield, from the platforms.
00:09:54.580 And what that means from my perspective, and again, I'll avidly listen to your
00:10:01.860 rejoinder to this, is that you are going to start treating Twitter as a publisher, in the
00:10:07.680 sense that they are liable for, uh, content that is user-generated.
00:10:13.680 It is like, you know, the New York Times publishing a letter to the editor.
00:10:18.420 Well, they have to take responsibility for that user-generated content at some level.
00:10:23.820 So they could be sued.
00:10:26.060 Um, I have no idea how this helps us at all, outside of some kind of 4D-chess way where we
00:10:35.740 could sue them for millions of dollars, and, and what? They'd be out of a few million
00:10:41.180 dollars.
00:10:41.680 I think removing Section 230 might very well destroy their business model, uh, in
00:10:46.820 the sense that they aren't going to be able to push people to certain content, because that
00:10:52.400 is a kind of curation if they're just a platform.
00:10:55.160 Um, but the fact is, if you start threatening to treat them like a publisher, or in fact
00:11:00.840 treat them like a publisher, then why are you surprised when they act like a publisher?
00:11:06.800 The fact is, the New York Times prints what it wants to print, everything that's fit to print,
00:11:12.500 as they say. They have no responsibility to print my letter to the editor.
00:11:18.100 They can forget it, leave it unprinted if they want.
00:11:22.020 Um, that is what a publisher does.
00:11:24.620 It curates content.
00:11:25.940 It expresses a point of view, uh, and so on.
00:11:29.620 And the idea of Twitter, you know, being treated as a publisher and thus acting like a publisher
00:11:37.520 seems a recipe for even more intense content curation.
00:11:42.920 That is, banning, uh, of users and information.
00:11:47.080 And so I, I mean, I will wait for your rejoinder to this, but I feel like
00:11:54.400 focusing on Section 230 is the worst approach possible.
00:12:00.140 Um, the better approach would be to simply say that this is a platform.
00:12:06.580 It is not a publisher.
00:12:08.320 We are going to continue to shield it as a platform.
00:12:10.820 Section 230 was proper, but it is a platform for information.
00:12:14.880 And you as a citizen have a certain right to post there, much like if I go out onto a public
00:12:25.220 sidewalk, or even a private place, like say a shopping mall or a sports stadium or something,
00:12:31.760 I actually have expanded rights.
00:12:34.680 I can't go into your home and yell at you and hold up a sign saying the end is nigh, repent,
00:12:42.660 achieve salvation.
00:12:43.420 However, I do have expanded rights in public spaces, and even in spaces that are treated as public to
00:12:52.160 some degree, you do have more rights than you would have, say, in a private home.
00:12:56.440 There are distinctions. A blurry line, maybe, but there are distinctions.
00:13:00.620 Uh, and I certainly have the ability to use the phone system, uh, or the internet
00:13:08.760 over cable or internet over whatever, as a utility.
00:13:12.160 Um, you know, the government might be listening in to my phone calls, but they don't have
00:13:17.500 the right to ban me if I tell dirty jokes to you, uh, over the phone; it is treated as a utility.
00:13:26.240 That is the approach.
00:13:27.240 We need to treat them as privately held utilities that are public spaces, privately held public
00:13:34.820 spaces, and recognize that you as a citizen have a right to them.
00:13:38.340 And that means that if you get banned, you have some recourse to go to the platform and that if
00:13:45.360 you, I don't know, spend a year in exile or apologize or whatever, you are allowed back on.
00:13:52.140 Because, you know, if I rob a liquor store, I might be thrown in jail for five years, but I get
00:13:59.120 out of jail at some point and I can, uh, hold a job, hold a bank account, vote,
00:14:06.400 drive a car, et cetera.
00:14:08.060 You don't just lose all your rights by doing one bad thing.
00:14:13.000 And I think the same should hold for social media.
00:14:15.780 Let's say I tell a terrible joke or use an offensive word one time in the middle of the
00:14:21.200 night on Twitter; my bad, but should I therefore lose, you know, my access to Twitter for the rest
00:14:29.000 of my life?
00:14:29.740 That seems totally egregious.
00:14:32.420 And this is the proper approach to the de-platforming issue.
00:14:36.800 Section 230 is a red herring.
00:14:39.780 It will make things worse.
00:14:41.540 And it's a typical Republican trick.
00:14:44.760 Go.
00:14:45.320 Well, certainly, uh, abolishing Section 230 would make things worse.
00:14:52.680 And it is only through the lens of a scorched-earth policy that it's interesting to consider
00:15:06.260 the abolition of Section 230, uh, as part of warfare against big tech.
00:15:12.060 Uh, but ultimately the best outcome would be to fix Section 230, though we're probably
00:15:17.940 not going to get that.
00:15:19.220 But let's talk first about Section 230 as it is now, because the current status
00:15:26.260 of that law, and the way it is being applied, uh, is out of its spirit.
00:15:30.280 I would say essentially, um, we live in a lie.
00:15:34.260 Twitter is saying that they are a platform because they want to benefit from the legal protections
00:15:39.600 of being a platform, but they are not a platform, because they select users to ban based
00:15:45.080 on political ideology. But worse than this, they act as a publisher.
00:15:48.980 When they say "this claim is disputed," when they add this to any claim about voting, uh,
00:15:55.660 at this point, they are themselves creating the first event of dispute that exists on the
00:16:01.600 internet, because a claim shows up and within a second Twitter labels it as disputed. Who disputed
00:16:07.460 it? No one except Twitter, because a new claim cannot have been disputed before.
00:16:12.680 So all of this, and of course the shadow banning, the selection of the timeline, everything
00:16:20.120 that keeps, uh, HGIDs from reaching the timeline:
00:16:23.460 this is all the action of a publisher, of a selector.
00:16:27.160 So the law is not being respected.
00:16:29.540 The spirit of the law is being stepped over and is being exploited.
00:16:34.520 And we have a case of defamation within the protection: that is, this law is supposed to
00:16:42.420 protect, but it actually opens a space for the types of defamation that couldn't happen
00:16:49.300 at a publisher.
00:16:50.520 For example, if the New York Times were to say, oh, you know, this claim about election fraud
00:16:55.420 by JF is disputed.
00:16:57.240 I could sue them because they would be attacking my credibility.
00:16:59.980 And if I could show in court... well, it would be extremely complicated, but in principle,
00:17:04.880 at least, I have a right to say the New York Times has defamed me by putting this disputed
00:17:11.280 label on my claims, which were true.
00:17:13.220 I don't have this possibility against Twitter, and we have to examine in depth
00:17:19.060 why it is that Section 230 ends up protecting forms of defamation that couldn't
00:17:26.400 have existed elsewhere, or that could at least have been punished by law if they had happened
00:17:30.980 in the past, or at publishers.
00:17:33.820 And the answer to this question is that it stems from the "otherwise objectionable" clause.
00:17:41.600 It stems from the fact that when they designed that law, they foresaw that, okay, there are
00:17:46.180 a couple of things that platforms may legitimately want to keep out of their systems without those being acts
00:17:55.120 of a publisher.
00:17:56.120 It could be to exclude lewd content, violence, porn, that kind of thing.
00:18:01.300 But as they listed these potential objections, they also added "otherwise objectionable," which
00:18:09.720 leaves a complete blank slate for big tech media to decide what it is that we can talk about
00:18:17.640 on the internet.
00:18:19.740 And that is a big problem.
00:18:21.000 The ideal situation would be a constructive system where politicians
00:18:27.180 are actually interested in making freedom of speech apply to larger portions of
00:18:34.120 American life, including our online life.
00:18:36.160 And they would get together and say, all right, the otherwise objectionable clause is to be removed
00:18:42.080 and we need to specify better.
00:18:43.940 What do we mean when we say lewd content?
00:18:47.720 Because it's not been designed to ban center-right people who say some edgy joke.
00:18:54.440 It's been designed for extreme stuff and it should be limited to it.
00:18:59.320 Unfortunately, the original writing of the law was not limiting.
00:19:02.640 It just said "good faith" and "otherwise objectionable."
00:19:05.460 And one path through which we could specify these words is not even through legislation,
00:19:11.940 but through FCC intervention.
00:19:14.580 But Trump came to this much too late in his presidency.
00:19:19.080 Could have started this in 2016, but he didn't.
00:19:23.020 And so we find ourselves with no solution.
00:19:25.580 Now, with the little time left to Trump, assuming that he doesn't win his legal case and give
00:19:30.900 us four more years of presidency, we find ourselves with a little time, a little time in which
00:19:37.800 we can wreck some of the machine.
00:19:39.480 And so in this context, since we're not going to get a redrafting of "otherwise objectionable,"
00:19:45.980 why not say: abolition of Section 230?
00:19:50.080 It's a position of negotiation.
00:19:52.520 It's an unrealistic leveraging position.
00:19:54.820 You once told me that, you know, when you say we claim all of America, we don't want
00:20:00.900 to give up California even.
00:20:03.300 You once explained to me that in negotiation, you need to go a little further than what you
00:20:08.940 would want so that you can be well positioned.
00:20:11.600 And I think saying we want to abolish Section 230 is an extreme measure.
00:20:16.520 It won't happen anyway, by the way, because to abolish Section 230, imagine what you
00:20:21.740 need.
00:20:22.060 You need Nancy Pelosi, you need the Democrats to get together and say, oh, yeah, let's
00:20:26.460 let's abolish Section 230, which they will never do.
00:20:31.300 Actually, Biden has explicitly told the New York Times that he wants to get rid of Section
00:20:37.380 230.
00:20:38.920 He said that, and I can look back at the reasoning, but it was actually said in the interview
00:20:44.840 that he gave to them before they endorsed him for president.
00:20:48.100 So I don't know if he's going to do it or not.
00:20:50.260 I mean, the default position seems to just be the status quo, you know, misery continuing.
00:20:55.660 But he actually has come out in favor of that.
00:20:59.680 Let me, and I'll let you talk some more,
00:21:01.300 but let me just address some of these points that you brought up.
00:21:04.960 Why are we negotiating with big tech?
00:21:07.720 There need not be any negotiation with them.
00:21:12.900 Through net neutrality and through Section 230, they are driving their car on the government
00:21:22.640 road, basically.
00:21:24.500 They are not forging new paths out in the wilderness.
00:21:29.340 They are driving on a national highway and the government sets the speed limit.
00:21:34.540 So there's no negotiation that needs to take place.
00:21:37.580 It's simple. I mean, I'm not a lawyer or a constitutional lawyer, so I'll leave this
00:21:43.920 up to other people.
00:21:44.520 But it seems like you could even do this in an executive order.
00:21:48.500 Just basically say that if something is legal in public, then it should be legal on your
00:21:56.340 platform, which is what it is.
00:21:58.180 It's not a publisher.
00:21:58.900 So therefore, I mean, for instance, YouTube has a big problem with porn and some really
00:22:06.440 creepy stuff regarding children's videos.
00:22:11.480 And they've handled that by creating like kids YouTube.
00:22:14.460 And they've done many things in that, all of which I support, by the way.
00:22:19.180 I think that they need to do that.
00:22:20.740 And they are responsible for that, you know, morally, if not legally.
00:22:25.140 But I don't know if you remember, but a few years ago, there was a lot of kind of
00:22:28.900 creepiness going on with children's videos on YouTube.
00:22:33.660 Kids are using this platform more and more.
00:22:36.340 I think they absolutely have a moral and likely legal responsibility to do that.
00:22:41.960 But those kinds of things are illegal in public.
00:22:47.640 Most speech, I mean, even extreme speech that you and I don't take part in.
00:22:53.160 The N-word is just an easy example.
00:22:56.840 This is legal.
00:22:57.960 You can say that word.
00:23:00.600 I mean, Canada might be different, but in the United States, you could go around using
00:23:05.120 that word.
00:23:06.400 Now, at some point, it becomes harassment if you're yelling at someone or threatening them,
00:23:11.720 of course.
00:23:12.720 But you can use it.
00:23:14.560 And you can say pretty much 100% of everything I've ever said legally.
00:23:21.980 There is no law against it.
00:23:24.240 And so it's just an easy fix.
00:23:26.260 You don't need to negotiate with Jack or Zuck.
00:23:32.460 You just do it.
00:23:34.320 And they'll secretly thank you because it will save them money because they can fire a bunch
00:23:41.020 of blue-haired white people and Indian H-1B visa holders who are furiously and incoherently
00:23:51.480 censoring our speech.
00:23:55.100 And you just do it.
00:23:56.780 And I just don't understand why we're even going there.
00:24:00.840 I mean, I get it that you want scorched earth and just fuck-them-all kind of stuff.
00:24:06.620 But, you know, why?
00:24:08.760 And this is kind of my other issue with this.
00:24:11.220 I remember hearing this from, like, alt-lite personalities and the left.
00:24:15.940 So from like Elizabeth Warren to Laura Loomer, you hear this.
00:24:20.860 We need to break up big tech so that they stop censoring us.
00:24:25.560 Do we actually?
00:24:28.280 Actually, I want to be on YouTube.
00:24:30.660 I am really grateful for BitChute and I wish them the best luck in the world because without
00:24:38.740 them, I don't know what I would do.
00:24:40.620 But I'll just be honest.
00:24:42.260 I would prefer to be on the largest platform on the planet.
00:24:46.440 That is YouTube.
00:24:47.560 I get more views that way.
00:24:49.280 There's more interaction.
00:24:50.780 I am not against these companies.
00:24:53.200 I am not against monopoly.
00:24:54.760 I don't have some kind of almost, I don't know, desire that we should have all these
00:25:00.360 mom and pop video servicing companies.
00:25:04.120 I actually want to be on monopolistic global platforms.
00:25:09.300 That is, I don't want to go on 20 platforms and tweet the same thing.
00:25:13.660 I want to tweet once on Twitter and it can potentially reach everyone.
00:25:18.420 It can reach a housewife in Missouri.
00:25:20.900 It could reach the President of the United States.
00:25:22.500 I want to go on YouTube and take advantage of its monopoly and do that.
00:25:28.140 We don't want an internet that is fragmented and disconnected.
00:25:34.860 There are so many benefits to monopoly.
00:25:39.540 It's just, it benefits us.
00:25:42.100 It's easy.
00:25:42.680 I don't want to have 20 different payment processors and go to a customer or a donor and be like,
00:25:48.900 well, if you have a Visa card, use this one.
00:25:50.680 If you're a Canadian and you have American Express, use that one.
00:25:53.320 If you're like, I don't want that.
00:25:54.800 I want one thing.
00:25:56.520 It's just so much easier. Monopolies are the best friend of users and dissidents alike.
00:26:02.620 So I'm not, I'm not anti big tech.
00:26:05.500 I might dislike them personally, but they offer services that are vital to dissidents.
00:26:12.400 So why not just make their lives easier with an executive order or perhaps legislation,
00:26:18.560 which could get support or could have gotten support when the Republicans were in control
00:26:23.700 and might very well get support with Democrats in control.
00:26:26.420 You just have to pitch it differently.
00:26:28.520 You just be like: the left is being censored.
00:26:31.720 Black Lives Matter doesn't have a voice anymore.
00:26:33.840 What are we going to do about this stuff?
00:26:35.300 You just pitch it in some way and you present it as a privately held public
00:26:42.100 space.
00:26:43.280 End of statement.
00:26:44.620 No negotiation.
00:26:46.100 No need to burn it all down.
00:26:49.220 No need to play hardball.
00:26:51.780 Just do it.
00:26:52.860 It's the government.
00:26:53.840 They have this power.
00:26:55.220 The internet is literally a government operation.
00:27:00.180 I mean, it was developed by the government.
00:27:03.120 It is legally protected by the government.
00:27:06.260 The infrastructure is to a large degree, though not entirely, governmental.
00:27:10.840 What's the issue?
00:27:12.220 Just set the speed limit.
00:27:15.000 Well, uh, that's a beautiful dream, but the big problem we're faced with is that the
00:27:20.500 elite, uh, the people who control the government, are not interested in this.
00:27:26.460 They are very much interested in seeing, uh, big tech do the dirty work that they cannot
00:27:32.600 possibly do themselves under the First Amendment.
00:27:34.560 Uh, so you won't get it. Uh, and I'm reading the news item about Biden wanting to abolish Section
00:27:42.580 230, and they say, yes, he wants to abolish it for the contrary reason to Trump's.
00:27:47.320 He wants to abolish it because essentially he wants to sue platforms more.
00:27:52.080 Uh, that being said, studying this possibility, given that we won't have a reasonable fix of
00:27:59.020 the law, because there's just too much interest in silencing dissent, both on the side of
00:28:04.860 the Antifa who are like, oh my God, white supremacists are harassing me 24/7.
00:28:10.260 I hear it in my head, even when it's not there.
00:28:13.260 Uh, and of course I don't want to see it on the internet.
00:28:17.080 You could then sue Twitter for platforming Richard Spencer.
00:28:21.980 An Antifa could say: they're publishing Richard Spencer's content, which has traumatized
00:28:28.400 me, and I have damages, and therefore I'm going to sue Twitter.
00:28:33.420 I mean, why do we want this outcome?
00:28:35.960 Um, well, here it is: because first we have to recognize that the current state of the
00:28:41.700 law, uh, advantages the mainstream elite. The advantage on the side of the left is that it
00:28:48.860 helps them side with Antifa and say, oh my God, this whole harassment on the internet is
00:28:53.780 a big problem.
00:28:55.060 Racism is a big problem.
00:28:56.900 They side with them. On the side of the center-right,
00:29:00.000 uh, you have, within the mind of a Will Chamberlain, for example, within the mind of
00:29:05.660 these people, they are acquiring the monopoly power that they are removing from you.
00:29:13.280 From their point of view, they are the winners of this game of censorship, because they think
00:29:18.680 it's never going to reach them, probably, uh, wrongly, because it is going to; but for
00:29:24.480 now they think that they are the winners of this game where their competition is being
00:29:28.980 outlawed, effectively, by big tech.
00:29:31.640 So no one has an interest in pushing for this.
00:29:35.580 So in the circumstance where no one will do something reasonable, why not
00:29:40.800 agree that we can all do together something unreasonable, which is to...
00:29:49.920 I appreciate your honesty on this matter, but didn't Trump have an interest in this? Because Trump, and,
00:29:56.880 and I'm not just blowing smoke here,
00:29:59.300 I don't think Trump would have been elected in 2016 without the alt-right writ large.
00:30:05.720 That is without the crazy Twitter memes and the edginess, the bad boy quality to the alt-right.
00:30:13.560 It gave him this oomph, this intangible, maybe
00:30:21.140 non-measurable, but essential vibe that allowed him to be not just a
00:30:28.400 Republican candidate, but the candidate of the internet, the candidate of revolution.
00:30:34.560 And he needed that.
00:30:37.240 And he actually didn't have it.
00:30:39.140 He had a lot of wackiness.
00:30:41.100 He had QAnon, he had the conservatives and whatever, but he actually didn't have that
00:30:44.860 little thing.
00:30:45.480 That little thing was missing in 2020 and he didn't win.
00:30:48.660 And so wouldn't he be, or have been motivated to do this?
00:30:54.980 Um, the answer is he should have been, but Trump has no insight into the world.
00:31:01.580 Trump is surrounded by people who are deceiving him and he's not intelligent enough himself
00:31:07.180 to draw the right conclusion that you just drew.
00:31:10.200 Now I've been hanging out on 4chan throughout this election period of 2020, and I've seen
00:31:17.480 the memes; the memetic powers have reversed.
00:31:20.700 And, uh, earlier you mentioned that you may have given a nudge to Biden.
00:31:25.920 Actually, when I was on 4chan and I saw the people starting to meme with Biden, with the
00:31:31.860 ice cream cone, and he was like, I don't give a shit.
00:31:35.280 I'm just the, I'm the tough guy this time.
00:31:37.880 And I don't give a shit if my son has dick pics of himself and, uh, just hanging out with
00:31:43.800 Eastern prostitutes, no problem.
00:31:46.100 I'm eating my ice cream and I don't care.
00:31:48.520 Uh, this to me, combined with the analysis of the statistics of the vote, which shows
00:31:54.380 that essentially what led to the Trump defeat is that he gained votes among minorities in
00:32:00.180 big centers where he wouldn't have won anyway,
00:32:03.120 and he lost votes among white populations,
00:32:06.740 which were what was allowing him to win in 2016.
00:32:10.580 So I think you did more than a nudge.
00:32:12.300 I think you totally finished Trump.
00:32:14.900 You are absolutely the reason why.
00:32:19.400 Well, I mean...
00:32:20.760 Because with the amount of censorship, the amount of smearing, you could have
00:32:26.300 caused this; it could even cross my mind that Richard Spencer himself is responsible
00:32:31.880 for the entire result of 2020.
00:32:34.720 That is a wonderful thought.
00:32:37.200 It shows that you can do a lot with very little.
00:32:39.680 Well, I think your tongue might be a little bit in your cheek right there, but it's
00:32:45.560 not totally wrong, though, what you're saying.
00:32:49.360 And I'm not claiming that I did it, but it was a little piece of it.
00:32:55.360 And it is interesting.
00:32:57.200 Trump gained all of this support in all of these big cities where he's claiming fraud.
00:33:03.100 So like, it's funny, like he, he did amazing.
00:33:08.400 I mean, he lost, but he did amazing in like New York.
00:33:11.020 In the real vote.
00:33:12.160 In the legal votes.
00:33:13.520 Yeah.
00:33:13.780 In the legal votes.
00:33:14.100 He won the legal votes.
00:33:17.980 Anyway, it's weird, like, weird.
00:33:22.000 I mean, everything I've noticed, you can't predict; like, you can look at the broad
00:33:27.820 trajectory, but it's hard to predict current events, because it is stranger than
00:33:32.340 fiction.
00:33:32.660 It's like just the weirdest thing you'd imagine actually happens and you have to react to it.
00:33:36.860 But anyway, yeah.
00:33:40.040 So I don't know, but I think that, going forward... I mean, as we're recording
00:33:47.380 this, and I'll try to get this up right away, because this is quite topical:
00:33:51.280 they're doing, I believe, a defense appropriations act or something like that in Congress,
00:33:55.560 and Trump wanted Section 230 repeal to be added to the defense appropriations act.
00:34:01.720 And he said, this is a matter of national security.
00:34:04.040 Now, when you say that, that leads me to believe that you want to ban more people,
00:34:09.100 that you're claiming that, like, ISIS is tweeting and we've got a... or something.
00:34:13.440 I mean, that, that seems to be the implication.
00:34:15.200 I don't know if there's any intended implication to whatever Trump says.
00:34:19.620 He's just talking out of his, you know what. Uh, but we might actually get this. Now,
00:34:25.340 it's not been added as of today.
00:34:27.520 This is Friday.
00:34:28.520 Um, but we might get something like this.
00:34:31.420 There's reason to believe that we would get it under Biden.
00:34:34.800 And I think that what will happen, um, if Section 230 is taken away is just going to be
00:34:42.960 terrible for us.
00:34:44.140 Um, but let's explore this then, because we have a disagreement: uh, what happens when
00:34:51.860 Section 230 goes away?
00:34:53.500 So that's the scorched earth path.
00:34:55.200 And I love it because what do you have?
00:34:57.580 You have central providers like Twitter essentially unable to provide the diversity of opinion
00:35:03.900 that they've been providing,
00:35:05.140 so heavy on censorship, so much so that now the network effect that benefits only them
00:35:11.560 as a monopoly starts spreading.
00:35:14.520 And suddenly you have an interest in being located, having your servers located, outside
00:35:18.820 the U.S.; suddenly you have an interest in decentralized stuff like Bitcoin.
00:35:23.440 This is where I feel like a fish in water.
00:35:27.080 This is where I want the internet to be headed.
00:35:29.680 And so if Section 230 is abolished, I predict that in five years, a site like Gab will have
00:35:35.800 been gaining in power, and even perhaps an alternative like Parler, and certainly Bitcoin.
00:35:41.340 All of these things will rise because they are decentralized and, by nature, not suable.
00:35:47.260 Instead of having a law, which is weak because the law is subject to elite manipulation,
00:35:53.020 instead of having a law protecting free speech, why not have the infrastructure so resilient,
00:35:59.180 so powerful, that it's not possible anymore to sue a central entity for
00:36:05.580 the sentences of a random person on the internet?
00:36:09.420 Um, you make a really good case, but I think I would shout back at you:
00:36:15.180 that's a beautiful dream,
00:36:17.300 much as with what you said to me. I see where you're going and I think you've
00:36:22.640 made the best case for it, but I'm not sure I buy it.
00:36:26.020 Well, uh, you have to buy it, because it's happening even under Section 230. Bitcoin
00:36:31.940 is at the all-time high, back to the all-time high.
00:36:34.640 Parler has never been as popular.
00:36:37.400 Gab is at its top.
00:36:38.800 BitChute is at the top.
00:36:40.440 Everything is going well.
00:36:41.740 We just need a little nudge.
00:36:44.640 And this process of freeing the internet will continue.
00:36:48.420 The forces of freedom on the internet are profound and they never give up.
00:36:53.620 And you may think that they are, uh, kind of, uh, weak at some points.
00:36:58.080 They, they weaken, they slow down, but they're always there and they're always going to keep
00:37:02.640 advancing.
00:37:03.340 And if Section 230 were to be abolished, it could be the last nudge needed to push things over
00:37:08.900 the cliff.
00:37:10.860 You make a very good case.
00:37:12.420 I'll actually allow you to have the last word on this.
00:37:15.040 All right.
00:37:15.700 Well, those are my last words.
00:37:16.880 And my last words, I guess, would be: I'm extremely happy to have talked to you again,
00:37:21.120 because I don't invite you on the show anymore, but it's in no way a denial of our friendship.
00:37:27.300 Oh, well, that is good to hear.
00:37:28.740 I was personally miffed, but, uh, it's good to hear that it's just about YouTube censorship.
00:37:35.800 Yeah.
00:37:36.320 Uh, so anyway, we should, uh, we can check back in on this, because the story isn't over.
00:37:41.240 Um, if I were to make a prediction, I would say that the misery is going to continue.
00:37:46.860 In the sense that Section 230 won't be abolished and the status quo will just continue.
00:37:52.160 And we're going to have a difficult time navigating being on a big platform, um, which is half
00:37:58.620 platform, half curation site.
00:38:02.360 So, but then if Trump follows up on his threat, this will be the continuation of the status
00:38:07.160 quo, but less money in the pockets of the CIA and army and all these people who are going
00:38:13.240 to be funded by this defense act.
00:38:16.440 Well, then they, they might cut the payments they give to me then.
00:38:20.060 That would be.
00:38:20.600 So, I can't support this live.
00:38:39.700 Okay.
00:38:40.620 So recording is now on.
00:38:46.760 JF, how are you?
00:38:48.520 I'm doing great.
00:38:49.520 What about you?
00:38:50.960 Well, I'm doing fine.
00:38:53.140 It's snowy out here.
00:38:55.040 It's a winter wonderland.
00:38:57.300 Um, I'm sure it's the same in Montreal as well.
00:39:01.340 It's a beautiful city.
00:39:02.340 I'm no longer in Montreal.
00:39:03.700 I live removed from civilization, very close to the North Pole, but it's, uh, it's
00:39:09.680 very cold here.
00:39:10.680 Yes.
00:39:11.320 Well, say hello to Santa for me.
00:39:13.760 Yes.
00:39:14.980 I've been a bit naughty this year, but that's usual.
00:39:18.540 So, he's used to it.
00:39:20.540 Um, you've been naughty because you've been standing for Biden.
00:39:25.820 I've been seeing you, uh, full on the Biden, the Biden, uh, train.
00:39:31.000 Is it the train or is it a Cadillac?
00:39:33.700 It's a Cadillac convertible.
00:39:35.640 Yeah.
00:39:37.380 Yeah.
00:39:38.000 It's awesome.
00:39:38.800 It's a vintage, cool vibes.
00:39:41.600 We're wearing... he wears aviator Ray-Bans.
00:39:43.340 I wear Wayfarers, but it's all good.
00:39:47.300 Um, yeah, it's, it's been fun.
00:39:50.000 I've, um, yeah, my transformation to, um, white suburban liberal professional is complete.
00:39:58.120 Now, what was your goal with, uh, supporting Biden and did you accomplish it?
00:40:03.820 Well, I, I think I did to a certain degree.
00:40:08.560 I mean, my, my goal was complex, um, and I'll, I'll just sum it up real briefly.
00:40:18.280 Uh, I, I think there is a large degree of toxicity with the Trump movement, not, not
00:40:26.660 in the way that that word is often used by liberals or whatever, but just in the sense
00:40:31.840 that, um, we have a sense that we are winning and we can just trust the plan and he's one
00:40:38.900 of us and he's going to do all this stuff for us and so on.
00:40:42.380 And, uh, in 2016, I think there was a, a kind of hopefulness and potentiality to all
00:40:50.400 that.
00:40:50.740 And I think by at least 2018, but probably earlier, I, I felt like it was all a lie and
00:40:57.500 there is a complete absence of policy vision, uh, in the Trump administration.
00:41:03.620 We'll, we'll actually talk a little bit about that today.
00:41:06.280 I see, I think there's going to be a stark disagreement on Section 230 and that's,
00:41:10.700 that's good.
00:41:11.180 Um, we'll, uh, you know, have at it, uh, but there was just an absence of policy vision,
00:41:16.360 but this, you know, sense that we're winning and he's, he's one of us, he's our guy.
00:41:21.220 And I wanted to get rid of that.
00:41:23.200 Um, I think that the Democrats have set themselves up in a remarkable position
00:41:29.660 where they can be a hegemonic party if they want to, but they're a hegemonic party in a
00:41:37.400 moment of extreme polarization, uh, to the point where both sides think the other side
00:41:44.380 is evil.
00:41:44.780 And I think actually, no matter what happened on November 3rd, 2020, whoever the loser was
00:41:51.020 would be claiming election fraud or some, you know, thing like that.
00:41:55.660 We just hate each other.
00:41:57.420 We're just at this point where we have no trust in one another.
00:42:01.440 Um, and so the, the, the Democrats are in this just remarkable position, but I, I just
00:42:06.220 wanted to break away from this, um, break the chains, just get away from this notion that
00:42:12.320 we have to support the fake right, the phony right.
00:42:16.520 It is the phony right.
00:42:17.520 And that actually we can be ourselves and that as liberals are hegemonic, at least politically,
00:42:22.780 they're going to have a legitimacy crisis, but at least politically they're hegemonic.
00:42:26.860 Um, they can actually do some things that are good.
00:42:32.540 And the era of Republican congressional dominance has been one of just an
00:42:39.680 absence of vision.
00:42:40.640 I can't name a single thing that they've accomplished, you know, well before Trump.
00:42:46.220 And, uh, yeah, I just wanted to kind of break away and, uh, you know, to some degree, um,
00:42:52.820 kind of blow people's minds, you could say, but, but to some degree, kind of give the middle finger
00:42:58.440 to Biden himself who was saying, you know, like, oh, I decided to run for presidency after
00:43:04.400 Charlottesville, or some... just, obviously, I don't think a single person believes that.
00:43:10.640 Uh, that he's being genuine.
00:43:12.680 That's just some line that was fed to him.
00:43:14.940 Uh, but I, I, I kind of like saying, look, you don't understand me.
00:43:20.040 You don't understand what that event was about.
00:43:22.540 You don't understand what I want and how I think about things and, and just to kind of break free
00:43:26.240 of it.
00:43:26.520 So, you know, it's a liberating thing.
00:43:28.800 It's one thing to want to exert leverage on the Republican party, but, uh, are you, are
00:43:34.960 you enthusiastic about getting something better from the Democrat party?
00:43:38.380 Or are you just trying to destroy the Republican party as it is, to eventually reform something
00:43:44.200 better?
00:43:44.600 Uh, I, I think that the, the GOP is in a precarious position and, um, whether I added just a little
00:43:54.680 bit to that in terms of expressing my support for Biden may be arguable.
00:44:00.880 Um, millions of people did hear about this through some form, either through my Twitter
00:44:06.620 account or through news reports on it; I saw news reports in the Guardian and,
00:44:11.580 you know, so on.
00:44:12.920 Dinesh D'Souza's tweets no doubt have tons of, um, you know, impact.
00:44:20.360 Uh, so yeah, I might've had a little bit of, of nudge here and there.
00:44:25.300 It might've not been decisive, but it was a nudge.
00:44:27.660 Um, but I, I do want us to get beyond conservatism.
00:44:31.760 I think the American right is the immediate enemy, um, on the way towards, you know, having a real right
00:44:39.040 that's moving in the right direction.
00:44:40.120 But beyond that, uh, no, I think you could easily make arguments that on a whole
00:44:47.600 host of issues, um, between Biden and Trump, it's just a wash as to who exactly
00:44:55.160 is better.
00:44:55.940 And on certain issues, I think we actually, what liberals are putting forward are better.
00:45:01.480 I mean, if you actually care about Medicare for all or UBI, uh, or so on, um, which party
00:45:10.480 is more likely to bring that to us?
00:45:12.380 I think the answer is clear.
00:45:13.940 Which party is, at least in terms of what they explicitly promise, better for, um, Americans
00:45:23.140 who are struggling?
00:45:23.980 I think it's clearly the Democrats and, and in terms of foreign policy, I mean, it's again,
00:45:27.920 a bit of a wash, but, um, I, I think you could, I could make an argument that, that Trump, uh,
00:45:35.580 even in his last weeks in office might very well be, uh, pursuing something very bad with
00:45:41.480 Iran.
00:45:41.860 So, uh, yeah, but, uh, that's what I would say.
00:45:46.860 Well, I don't expect Trump to do anything interventionist, uh, in foreign policy, especially
00:45:53.400 given the, the amount of work he has to deploy on the fraud investigations.
00:45:57.620 Right.
00:45:58.100 But I think that the, the things you mentioned about Joe Biden, well, I don't want these,
00:46:02.120 I don't want Medicare for all.
00:46:03.800 I don't want a UBI, but I would understand that you want this. But it comes with a price; it's
00:46:09.920 a poison gift.
00:46:10.840 It comes with access for trans people to the bathrooms of the other sex.
00:46:16.640 It comes with, potentially, foreign wars and endless wars.
00:46:19.980 It's a large price to pay.
00:46:21.400 It comes with inflation too.
00:46:23.020 If he gets into money printing and stimulus checks.
00:46:25.840 Well, I mean, Trump has gotten into money printing, uh, to say the least, um, if you look back
00:46:33.300 at recent history, uh, in terms of deficit spending, I mean, the Republicans are clearly
00:46:38.620 worse, uh, in terms of the, uh, the, the tranny stuff and, and so on that is happening under
00:46:46.980 Trump.
00:46:47.400 I, I don't, I mean, yes, it's true that people who support Trump have healthy instincts on
00:46:54.360 these matters and think that it's, you know, an outrage, disgusting.
00:46:58.680 Um, but I don't think there's any evidence that a Trump presidency brings an end to this.
00:47:04.900 And, and to the contrary, um, the transformation of the right itself, at least aesthetically and
00:47:13.000 culturally, has become so much more pro-gay and pro-transsexualism or transvestitism than I could
00:47:22.420 ever imagine.
00:47:23.280 And now granted that maybe you don't fully blame Trump.
00:47:26.860 That's a long-term trajectory that he just happened to preside over.
00:47:31.020 But I wonder if he isn't uniquely responsible for the Lady MAGA-ization of his movement, that
00:47:41.840 there's something about him that actually is unique that brought all of this nonsense into
00:47:48.360 the right, because you never saw this.
00:47:49.900 I mean, in 2012, the idea of a gay male transvestite singing YMCA with Trump supporters while waving
00:48:00.380 a rainbow flag and an Israeli flag, that is not happening in the Brooks Brothers era of Mitt Romney.
00:48:09.580 And it is happening under Trump.
00:48:12.120 Now, whether that's his fault, uh, we could debate it, but he does seem to be kind of uniquely
00:48:16.380 responsible for this.
00:48:19.900 Deep, uh, in Trump, and we saw it in The Apprentice way before he got into politics, there's something
00:48:25.740 deeply pro-LGBT in him.
00:48:28.020 And I remember there was this scene in The Apprentice where he finds out that one of the
00:48:32.000 contestants is gay and he's asking the others: were you aware of this, that he's gay?
00:48:36.820 Uh, but he seems very open about it.
00:48:38.740 And he's always been extending the olive branch to the LGBT community.
00:48:42.880 But let's take an example on which Trump clearly will differentiate himself from Biden.
00:48:48.700 Critical race theory courses and white guilt being essentially promoted in the federal
00:48:54.380 government.
00:48:55.260 This is a price you're going to have to pay under Biden and that Trump clearly abolished.
00:49:00.000 He didn't clearly abolish it.
00:49:02.880 He basically wanted to look into funding, uh, with regard to critical race theory in the
00:49:09.860 federal government.
00:49:10.960 Um, now where is critical race theory coming from?
00:49:15.780 Uh, it is not coming from the government.
00:49:18.440 The government is in fact late to the game.
00:49:20.760 Critical race theory is coming from, uh, public intellectuals, it is coming from academia, and
00:49:27.440 it has been coming for the last 40 years.
00:49:31.360 Um, so, I mean, I'm, I'm obviously not opposed.
00:49:36.640 I mean, I, I, in fact, support just taking away funding from these people in the federal
00:49:42.780 government.
00:49:43.240 Um, but the notion that these kinds of things are anything other than a kind of cosmetic,
00:49:50.920 you know, uh, rebuff and a kind of social signal to his supporters,
00:49:57.440 I think, is a bit naive.