Rebel News Podcast - October 11, 2024


EZRA LEVANT | Tesla will share your location, personal details with the government — it’s their sole discretion


Episode Stats

Length

1 hour and 6 minutes

Words per Minute

163.4

Word Count

10,942

Sentence Count

828

Misogynist Sentences

9

Hate Speech Sentences

16


Summary

Elon Musk is making household robots, and they're pretty cool. But what are they really good at? And what are the downsides? And will they cost less than the price of a car? Plus, a look at Tesla's new driverless vehicle, the RoboVan.


Transcript

00:00:00.000 Hello, my friends. Did you see what Elon Musk revealed at Tesla yesterday?
00:00:04.280 Yeah, he had a driverless cab and a driverless van. Those were sort of cool.
00:00:08.720 But he rolled out his robots, his humanoid robots named Optimus that could chat and banter and
00:00:16.160 pour you drinks and play rock, paper, scissors with you. And he says that in the course of time,
00:00:21.180 you'll be able to get a robot for your own house for less than the price of a car.
00:00:24.960 Well, that sounds pretty cool. I can think of the benefits right away. But what about the
00:00:30.260 downsides? And by the way, what will the terms of service be? Will it be collecting information
00:00:35.020 about me? I go through the terms of service and I'll show you those robots next. But first,
00:00:39.420 let me invite you to become a subscriber to Rebel News Plus. That's the video version
00:00:43.080 of this podcast. I really want you to see these robots and what they can do.
00:00:47.120 For that, you need Rebel News Plus. Go to rebelnewsplus.com. Click subscribe. It's eight
00:00:52.080 bucks a month. I really think it makes a difference in videos like this. All right,
00:00:56.200 here's today's podcast.
00:01:13.420 Tonight, Elon Musk is making household robots. Would you buy one? It's October 11th,
00:01:19.500 and this is The Ezra Levant Show.
00:01:22.080 Shame on you, you censorious bug.
00:01:35.960 Did you see Elon Musk's announcements yesterday through his company, Tesla? Some were interesting,
00:01:42.120 but not shocking. For example, he showed what a driverless taxi could look like, Tesla style.
00:01:47.860 You can think of it like individualized mass transit. The average cost of a bus per mile
00:01:55.980 for a city, not the ticket price because that is subsidized, but the average price is about a dollar
00:02:01.160 a mile. Whereas the cost of a cyber cab, we think probably over time, the operating cost is probably
00:02:11.040 going to be around 20 cents a mile. And price, including taxes and everything else, probably
00:02:17.220 ends up being 30 or 40 cents a mile. So, yes, and you will be able to buy one.
00:02:22.280 Yes, exactly. And we expect the cost to be below $30,000.
00:02:34.660 That looked pretty cool, but it's not that shocking as Tesla has had an autopilot function for some time
00:02:44.780 now in all their Teslas. I think that this driverless car is just aesthetically superior
00:02:52.120 to another California company bankrolled by Google, actually owned by Google now, called Waymo,
00:02:58.540 which is super nerdy and very Google-ish.
00:03:04.080 So much emerging technology is just for show, so one can be forgiven for not being able to distinguish
00:03:10.720 fact from fiction. Take this car, for example. Can it drive autonomously or not?
00:03:17.840 The short answer is, absolutely.
00:03:27.420 That was Shweta Srivastava, Senior Product Lead for Driving Behavior at Waymo.
00:03:33.420 What you're looking at is the Waymo Driver in Action. Adapted for a variety of vehicle platforms,
00:03:39.380 it is the most advanced, fully autonomous driving technology in the world today.
00:03:43.200 What you notice pretty quickly is that these aren't normal vehicles. Take this Jaguar I-Pace,
00:03:49.720 for example. It's been equipped with an elegant array of sensors and software,
00:03:54.360 which allows it to move through the city on its own.
00:03:58.060 Yeah, I think Elon Musk has a better aesthetic, but I don't know how different the cars would be.
00:04:03.160 Tesla also rolled out a very art deco, larger driverless vehicle,
00:04:08.460 called RoboVan, basically a taxi that's as big as a bus,
00:04:14.480 almost a kind of a mini train. It certainly looks like freight trains looked in the 1940s.
00:04:20.880 There's a lot of opportunity to create
00:04:23.240 green space in the cities that we live in.
00:04:31.220 So I think that would be quite fantastic. Oh, and also,
00:04:45.180 what happens if you need a vehicle that is bigger than a Model Y?
00:04:50.260 The RoboVan. The RoboVan is, this is, we're going to make this, and it's going to look like that.
00:05:05.320 Now, can you imagine going down the streets and you see this coming towards you?
00:05:09.940 That'd be sick!
00:05:10.920 So this can carry up to 20 people, and it can also transport goods.
00:05:22.480 So you can configure it for goods transport within a city,
00:05:26.180 or transport of up to 20 people at a time.
00:05:30.080 So this is going to, the RoboVan is what's going to solve for high density.
00:05:35.860 So if you want to take a sports team somewhere, or you're looking to really get the cost of travel down to,
00:05:45.200 I don't know, 5, 10 cents a mile, then you can use the RoboVan.
00:05:50.660 Some people call it the RoboVan, but...
00:05:52.660 So, yeah.
00:05:56.640 You know, one of the things that we want to do, and we've seen this with the Cybertruck,
00:06:00.000 is we want to change the look of the roads.
00:06:03.000 The future should look like the future.
00:06:07.140 That does look cool, and those are driverless.
00:06:10.560 Elon Musk also mused about what our world would look like if we didn't have cars that needed to be parked in parking lots,
00:06:18.100 but rather if it was all just autonomous vehicles whizzing around that you just ordered on demand.
00:06:23.660 So what would a post-private-ownership world of vehicles look like?
00:06:28.280 He said it would basically turn parking lots into parks.
00:06:30.920 So, one of the things that, like, is really interesting is how will this affect the cities that we live in?
00:06:40.540 And when you drive around a city, or when a car drives you around a city,
00:06:44.200 you'll see there's, like, there's a lot of parking lots.
00:06:46.400 There's parking lots everywhere.
00:06:47.960 Parking garages.
00:06:48.660 There are, and so what would happen if you have an autonomous world is that you can now turn parking lots into parks.
00:06:58.600 And so, we're taking the 'ing lot' out of parking lots.
00:07:04.200 Now, I take Ubers a lot. You know what that is.
00:07:07.560 It's like a taxi that's built in an app.
00:07:10.000 They're very convenient, especially when I'm traveling or, you know, to avoid the hassle of parking downtown.
00:07:16.880 That's for sure.
00:07:17.320 Sometimes the cost of taking an Uber to an event is less than the cost of driving and paying for parking in a big city like Toronto or New York.
00:07:25.080 But I also really love my actual car.
00:07:28.140 It's sort of beat up now.
00:07:29.740 It's pretty old.
00:07:30.580 The car that's parked on my driveway, the car that I don't need an app to use, the car that I can drive whenever and wherever and however I want,
00:07:40.540 and no one else is involved, no tech company, no third party, no driver or artificial intelligence driver.
00:07:48.640 To me, a car is an essential part of being a free person.
00:07:52.040 It's one of the defining spirits of America and Canada, too, as opposed to more European, African, or Asian ways.
00:07:59.420 How much freedom there is on the open road.
00:08:02.360 I mean, how many hundreds of songs have been written about driving on the highway and the freedom you feel?
00:08:09.660 Teslas and these autonomous vehicles are part of huge technology companies that are amongst the most regulated companies in the world.
00:08:19.540 And they're heavily regulated for political purposes, as you know.
00:08:23.700 I mentioned Google, which owns Waymo.
00:08:26.120 They also own YouTube.
00:08:27.700 YouTube, that's the company that punished us at Rebel News for having videos that were too pro-Trump.
00:08:33.960 So they demonetized us.
00:08:36.120 They're the people who punished us for not supporting COVID lockdown policies.
00:08:40.520 That was literally part of their community guidelines.
00:08:44.040 If you didn't obey your public health officer's politics, you would be shut down.
00:08:48.600 So the company that interfered with your freedom of expression for political reasons, surely they won't hesitate to interfere with your freedom of movement for the same reasons.
00:09:00.380 I've never used Waymo, and I don't really propose to.
00:09:03.640 But the choice, I don't know if it's even going to be my choice to make.
00:09:07.700 Waymo and Uber have announced that they're teaming up.
00:09:11.120 They're making announcements like this one just last month.
00:09:14.840 So my point is, would I have been able to go to, say, an anti-COVID lockdown, anti-COVID mandate protest if Waymo had been operating in Canada back in February 2022?
00:09:29.920 Would they have let me go to the trucker convoy?
00:09:32.120 Would an autonomous vehicle have taken me to the trucker convoy?
00:09:36.540 I like Elon Musk a lot, but Teslas are at the mercy of their software and hardware.
00:09:42.980 Even if Elon Musk personally opposed some sort of lockdown, could he really resist a government order to identify any cars participating in a future trucker convoy?
00:09:53.640 How about shutting them off if they got too close to such a political event?
00:09:57.880 Half of all the Teslas in the world are in China.
00:10:00.680 Do you doubt they want that power over their people?
00:10:04.360 Do you think that our politicians lack the will to regulate where you can go?
00:10:09.740 They're already doing it in a sloppy way through 15-minute cities.
00:10:13.900 That's with spy cameras and barriers.
00:10:16.240 It would be so much easier for them just to program the 15-minute cities into an autonomous vehicle, don't you think?
00:10:23.600 I mean, do you think that tech companies lack the ability to do this to you?
00:10:27.180 Do you think they ever resist what the regimes tell them to do?
00:10:31.580 I'm sure you'd be fine 99% of the time, maybe 99.9% of the time, but it's that last time that counts, isn't it?
00:10:42.600 The one time that counts.
00:10:44.300 If you've ever read the terms of service of Tesla, I think they give you something to worry about.
00:10:49.900 It's not just Tesla.
00:10:50.600 I'm not picking on them.
00:10:51.400 I'm just saying they're all like this.
00:10:53.900 Tesla is a tech company.
00:10:55.140 I looked up their Canadian rules, and let me quote directly from their terms of service.
00:11:01.960 They say, where you go says a lot about you.
00:11:06.140 Okay, thanks.
00:11:07.900 Unless there is a serious safety concern, Tesla doesn't associate your location with your account or keep a history of where you've been.
00:11:18.240 All right, well, that's a lot of unless there's a serious problem.
00:11:21.920 Here's something that the government has said is serious, COVID-19.
00:11:27.680 Here's something else the government says is serious, the climate crisis.
00:11:31.540 Here's something else the government says is serious, disinformation and misinformation and the far right.
00:11:38.140 Oh, no, no, but don't worry.
00:11:39.280 They won't tell the government where you've been unless it's a serious safety issue, you know, like the pandemic.
00:11:46.180 I wonder if those cars can track if you're wearing a mask.
00:11:50.480 There's so many cameras in those cars.
00:11:52.580 Again, I'm not picking on Tesla.
00:11:54.040 I like Elon Musk a lot, but let me quote some more from the terms of service.
00:11:57.920 You can see it yourself.
00:11:58.680 Go online and look at the terms of service.
00:12:00.960 We may share information with third parties when required by law or other circumstances, such as to comply with a legal obligation, such as subpoenas or other court orders.
00:12:15.120 In response to a lawful request by government authorities conducting an investigation, including to comply with law enforcement requirements and regulator inquiries, to verify or enforce our policies and procedures, to respond to an emergency,
00:12:32.000 to prevent or stop activity we may consider to be or to pose a risk of being illegal, unethical or legally actionable, or to protect the rights, property, safety or security of our products and services, Tesla, third parties, visitors or the public, as determined by us in our sole discretion.
00:12:52.480 Got it.
00:12:54.680 So they'll give information about you and your car and your journey and anything else they capture to the government as part of their inquiries or just if they feel it's necessary at their sole discretion.
00:13:08.660 What's this?
00:13:09.280 Are you doing something unethical with your car?
00:13:12.160 What's the definition of that?
00:13:13.380 Whose code of ethics, by the way?
00:13:14.960 Would that include, say, I don't know, smoking a cigarette or swearing or having politically inappropriate views or going to meetings with political opponents of the regime?
00:13:25.880 Who decides what's unethical?
00:13:27.640 I like ethics, but I know what my ethical code is.
00:13:30.160 I don't think it's the same as the government's ethical code, and I don't know what Tesla's ethical code is.
00:13:35.300 When they say they can do what they like for ethical reasons, what does that mean?
00:13:39.360 What does it mean for Tesla to say they're going to protect the rights of the public?
00:13:43.900 I know what an individual right is.
00:13:45.980 What's a public right, especially when it comes to my car?
00:13:49.720 They answer that question in the same sentence.
00:13:51.980 It's their sole discretion, whatever they want, really.
00:13:55.820 But, hey, I'm sure it'll never happen to you.
00:13:58.040 Don't you worry your pretty little head.
00:13:59.960 You leave the worrying to the big people.
00:14:03.620 Which brings us, I don't know, to the latest announcement, the real showstopper yesterday.
00:14:10.180 Elon Musk rolled out his autonomous vehicles, but that wasn't really shocking.
00:14:13.140 What was shocking, or startling at least, surprising, was that Elon Musk rolled out humanoid robots that he says you'll soon be able to buy for your household.
00:14:24.100 He says they'll not just be great for chores, but they'll also be your friend.
00:14:29.120 So everything we've developed for our cars, the batteries, power electronics, the advanced motors, gearboxes, the software, the AI inference computer, it all actually applies to a humanoid robot.
00:14:49.520 It's the same techniques, it's just a robot with arms and legs instead of a robot with wheels.
00:14:56.480 And we've made a lot of progress with Optimus, and as you can see, we started it with someone in a robot suit, sort of dab, and then we've progressed dramatically year after year.
00:15:15.000 So if you extrapolate this, you're really going to have something spectacular, something that anyone could own.
00:15:24.640 So you could have your own personal R2-D2 or C-3PO, and I think at scale, this would cost something like, I don't know, $20,000, $30,000, probably less than a car, is my prediction, long term.
00:15:44.760 So, you know, it'll take us a minute to get to the long term, but fundamentally at scale, the Optimus robot, you should be able to buy an Optimus robot for, I think, probably $20,000 to $30,000 long term.
00:16:01.000 And what can it do?
00:16:02.700 It'll be able to do anything you want.
00:16:04.700 So it can be a teacher, babysit your kids, it can walk your dog, mow your lawn, get the groceries, just be your friend, serve drinks.
00:16:14.760 Whatever you can think of, it will do.
00:16:19.720 And, yeah, it's going to be awesome.
00:16:22.400 I think this will be the biggest product ever of any kind.
00:16:32.840 Because I think every one of the 8 billion people on Earth, I think everyone's going to want their Optimus buddy.
00:16:40.180 So what will the terms of service be for my new robotic friend?
00:16:45.640 They obviously have a ton of sensors in them, electronic eyes and ears, so to speak, and they'll be in your house, maybe even in your bedroom, your kitchen, anywhere you are, helping you, of course.
00:16:59.420 They're not selling these robots yet, but I've seen videos of some celebrities saying they're going to get one soon.
00:17:06.840 Like the cool Cybertruck, I'm sure that fancy people and opinion leaders will be able to get these Android-style humanoid robots very quickly.
00:17:16.300 Maybe we'll learn then what the terms of service will say.
00:17:19.720 But it's hard to imagine they'll be much different than Tesla's terms for their cars, which are essentially robots, too, when you think about it.
00:17:28.520 Tesla makes things.
00:17:30.040 They make products.
00:17:31.400 And that way they're like GM or Ford.
00:17:34.300 But I think they really gather data, enormous amounts of data.
00:17:40.380 I really think that's what makes Tesla special and unique and modern.
00:17:44.800 They film everything.
00:17:45.960 There's so many cameras on that.
00:17:47.160 That's how they've taught their cars how to drive robotically.
00:17:50.260 Millions, I guess billions, of little moments of how real human drivers stop, go, pass, slam on the brakes, honk horns, whatever.
00:17:58.660 So the cars learned how to drive by watching human drivers.
00:18:03.000 So in that way, it wasn't like your regular Ford or GM.
00:18:06.460 It was more like your cell phone or your Facebook account.
00:18:09.820 Watching, watching, watching, watching, watching, watching.
00:18:12.400 Why is Facebook free?
00:18:14.160 Because you are what's being bought and sold.
00:18:17.240 You agree to share everything about your life with Facebook.
00:18:20.520 And that's really fun and convenient when it suggests videos you might like or makes your shopping ads suited to your own taste.
00:18:30.660 We don't mind if the spying is light and commercial like that and not too intrusive.
00:18:36.260 But it is still spooky to say something and then to see an ad about what you were just talking about pop up on your phone.
00:18:43.520 But what if it's more than that?
00:18:45.220 I mean, imagine a robot in your house, never sleeping, always listening, always watching, never forgetting anything, uploading it all to the cloud so corporate can go through it at their sole discretion.
00:19:00.720 How do you feel about that?
00:19:02.580 I'm sure some people will love it.
00:19:04.420 In many homes, people already have Alexa or other devices that control systems in their house based on voice activation.
00:19:13.580 They're listening all the time, those little devices.
00:19:15.760 They have to, to know when they're being ordered to do something.
00:19:19.800 It's just now the robots don't look like a little disc.
00:19:22.900 They look like real people.
00:19:25.740 Here's how some of that looked yesterday.
00:19:28.340 Optimus.
00:19:29.660 It's insane.
00:19:30.680 It's even talking.
00:19:31.340 Say hi to my friend, John.
00:19:33.360 John?
00:19:33.900 Where's John?
00:19:35.420 Right here.
00:19:36.600 Oh, hello, John.
00:19:37.860 How are you?
00:19:38.460 How are you doing?
00:19:39.160 Very good.
00:19:39.740 It's crazy.
00:19:40.260 I'm talking to a robot.
00:19:41.640 From San Jose.
00:19:42.960 Probably from where you were, from where you were born in Silicon Valley.
00:19:46.620 That's wonderful.
00:19:47.300 Where do you live in San Jose?
00:19:48.600 Do you live in Almaden Valley or do you live in the Santa Teresa area?
00:19:52.220 No, I live in Los Gatos.
00:19:54.100 Los Gatos.
00:19:55.020 Oh, wonderful.
00:19:56.120 Yeah.
00:19:56.440 Nice area.
00:19:57.340 Where do you live?
00:19:58.000 Beautiful hiking out here up there.
00:19:59.600 There is.
00:20:00.300 Where do you live?
00:20:01.920 I live in Palo Alto at the current moment.
00:20:04.980 Biggers.
00:20:06.020 Yeah, this is awesome.
00:20:08.400 That's where they train us.
00:20:09.480 That's where we get our bills.
00:20:10.780 And that's where we work with a wonderful group of people.
00:20:14.140 What's the hardest thing about being a robot?
00:20:18.940 Trying to learn how to be as human as you guys are.
00:20:24.800 And that's something I try harder to do every day.
00:20:27.320 And I hope that you all help us become that.
00:20:31.120 You want to get a photo?
00:20:32.400 Yeah, for sure.
00:20:33.120 Let's do it.
00:20:33.540 A little video.
00:20:34.580 Hey.
00:20:36.460 Sweet.
00:20:37.000 Thanks, man.
00:20:38.120 Bro.
00:20:38.960 Appreciate it.
00:20:39.540 How are you doing?
00:20:40.540 Of course.
00:20:40.960 Anytime.
00:20:42.340 All right.
00:20:42.800 How's everybody doing?
00:20:45.220 How's everybody doing?
00:20:46.380 How's everybody doing?
00:20:46.680 Doing good.
00:20:47.800 Peace, everybody.
00:20:49.720 How are you doing?
00:20:51.240 I'm doing pretty good.
00:20:52.800 Nice.
00:20:53.340 I love the change.
00:20:58.380 How's everybody doing?
00:20:59.360 Let's do this. Order a drink.
00:21:00.120 Doing good.
00:21:01.140 All right.
00:21:02.140 Then step right up.
00:21:03.900 Can I have a watermelon margarita?
00:21:05.500 A watermelon?
00:21:07.280 Yeah.
00:21:08.460 Of course you can.
00:21:09.020 Hey, what's up?
00:21:11.140 How are you doing, man?
00:21:11.740 How are you doing, man?
00:21:13.860 I'm going to do a drink.
00:21:16.420 Can you order another one?
00:21:17.400 Thank you.
00:21:20.860 Oh, I'm good.
00:21:23.340 How am I doing so far?
00:21:26.900 Killing it.
00:21:27.520 You can see the benefits, of course.
00:21:56.520 I mean, why not have some help with household chores?
00:22:01.620 We had the Roomba that did one thing, but why not have a humanoid robot to do that?
00:22:06.840 And really, why not mow the lawn and why not paint the house and maybe do some household repair work?
00:22:13.320 Sure, have them go up on the roof to clean the leaves out of the eavestroughs so you don't have to risk it.
00:22:18.480 Who would object to that?
00:22:20.800 Well, I suppose a lot of landscapers and gardeners will have to find something new to do to earn a living.
00:22:27.140 A lot of taxi drivers and truck drivers will soon be unemployed by these autonomous vehicles too.
00:22:32.840 What happens when those people are affected by automation?
00:22:37.560 What happens when a million jobs are just replaced by very happy, very friendly robots, but they're replaced?
00:22:44.300 So much of our economy is based on these jobs.
00:22:49.600 750,000 temporary foreign workers in Canada suggest that there are jobs right now that Canadians don't really want to do.
00:22:57.080 I suppose robots could be an answer to mass immigration.
00:23:01.640 Instead of having 750,000 foreigners in our country, we could have 750,000 robots.
00:23:08.540 But still, what about Canadians?
00:23:10.840 What do young people get to do on their first job or any job?
00:23:17.520 And what are the social consequences?
00:23:19.280 The friendliness of the robots last night was a great touch.
00:23:22.300 Some of them have friendly accents even.
00:23:26.020 Will this stop people interacting with other people?
00:23:30.480 Will people who are shy or socially awkward or even just lazy develop into fully formed men and women if they only interact with robots?
00:23:38.240 If you treat a real person poorly, you suffer the consequences.
00:23:41.620 You're embarrassed, you're corrected, you're marginalized, you don't get what you want, whatever.
00:23:45.460 But if you abuse a humanoid robot, I'm sure it just puts up with it.
00:23:50.260 I don't know.
00:23:51.120 As the robots become more and more human, I fear that we will become more robotic.
00:23:56.780 What about people choosing only to have robots in their life?
00:24:01.000 We were warned about this by Yuval Noah Harari, of all people, from the World Economic Forum.
00:24:06.420 Remember when he said that the future for most people will be useless?
00:24:11.560 He actually called people useless eaters who will spend their time just on drugs and playing video games because there's nothing else for them to do.
00:24:21.340 Yes, in the Industrial Revolution, we saw the creation of a new class of the urban proletariat.
00:24:28.560 And much of the political and social history of the last 200 years involved what to do with this class and the new problems and opportunities.
00:24:36.640 Now we see the creation of a new massive class of useless people.
00:24:42.380 As computers become better and better in more and more fields, there is a distinct possibility that computers will outperform us in most tasks and will make humans redundant.
00:24:56.260 And then the big political and economic question of the 21st century will be what do we need humans for, or at least what do we need so many humans for?
00:25:07.500 Again, I think that the biggest question maybe in economics and politics of the coming decades will be what to do with all these useless people.
00:25:17.000 I don't think we have an economic model for that.
00:25:21.140 My best guess, which is just a guess, is that food will not be a problem.
00:25:27.620 With that kind of technology, you will be able to produce food to feed everybody.
00:25:33.260 The problem is more boredom and what to do with them and how will they find some sense of meaning in life when they are basically meaningless, worthless.
00:25:43.760 My best guess at present is a combination of drugs and computer games.
00:25:49.740 You know, they say that the current generation has the least sex of the generations that have been measured, which is odd, don't you think?
00:25:59.040 Because they certainly have the most pornography in history and the most online dating apps.
00:26:05.880 There's never been more dating and situationships in history.
00:26:09.760 But I think young people have fewer real connections with real people now than ever.
00:26:15.320 The birth rate is plummeting, at least in the West.
00:26:19.020 I think maybe there's a sense of purpose that's eroding, a sense of community.
00:26:23.280 Look, I understand the appeal of having machinery and technology.
00:26:27.140 I mean, I love my smartphone and I can see the social problems even that is creating.
00:26:32.720 High tech has made the average person in Canada today as rich as a king from olden times.
00:26:39.300 When you think about things like your basic medical care and dental care, the basic availability and choice of food, entertainment, travel, everything from literacy to communications.
00:26:52.280 An ordinary Canadian has a better life than a king just a few hundred years ago.
00:26:58.160 And it's because of technology in large part, the culture too.
00:27:01.340 But what will these robots do to our humanity?
00:27:06.040 I'm sure people ask the same questions at the start of the Industrial Revolution.
00:27:10.040 I know they did.
00:27:11.620 The Luddites rioted and smashed the looms.
00:27:15.800 They destroyed the factories because they could see how the world was changed by them.
00:27:19.660 I suppose in a way you can't fight the inevitable future.
00:27:23.400 Wouldn't it be nice to have a robotic slave that wasn't immoral because the slave was just a machine?
00:27:30.020 It wasn't human.
00:27:31.520 That's really the value proposition here.
00:27:33.960 But it won't just eliminate actual slaves.
00:27:36.560 It'll eliminate many paying jobs too.
00:27:39.700 And it'll replace many human moments with robot moments.
00:27:44.280 And it'll all be behind a kill switch if the government thinks there's some danger to your safety.
00:27:51.720 Elon Musk is no dummy, of course.
00:27:53.560 He's the opposite.
00:27:54.220 And at least he's an American.
00:27:56.340 And he generally is on the side of freedom, I think.
00:27:59.180 I'd rather have Elon Musk owning and creating these things than communist China, I suppose.
00:28:04.620 But he's pretty deeply involved in that country too.
00:28:09.160 Elon Musk is famous for Tesla and for SpaceX, which is now sending more rockets into space than all other countries and companies combined.
00:28:18.180 Ten times more than the rest of the world combined.
00:28:20.180 Elon Musk's Starlink internet system is amazing.
00:28:22.980 I've used it myself.
00:28:23.680 I love it.
00:28:25.140 Lesser known is his Starshield program.
00:28:27.700 You ever heard of that?
00:28:29.360 That's what he calls the system he sells to the Pentagon.
00:28:32.580 Doesn't get as much press.
00:28:34.360 Don't think Elon Musk hasn't been thinking about the military applications of these robots too.
00:28:39.380 They don't just do rock, paper, scissors.
00:28:41.360 The Russia-Ukraine war was the first war where drones were used by the tens of thousands.
00:28:48.140 I think by the hundreds of thousands.
00:28:50.780 I imagine it won't be too long before humanoid robots are on the battlefield too.
00:28:55.700 Maybe that's an advantage.
00:28:57.160 Maybe that's progress.
00:28:58.620 Fewer people will be killed.
00:29:00.200 Or maybe many, many more will be.
00:29:03.400 Elon Musk says he thinks this will all have a happy ending.
00:29:05.940 Or at least he's 80% certain.
00:29:08.260 I predict actually, provided we address risks of digital superintelligence, 80% probability
00:29:17.700 of good, a good outcome.
00:29:19.940 Look on the bright side.
00:29:22.640 The cup is 80% full.
00:29:25.340 The cost of products and services will decline dramatically.
00:29:32.220 And basically anyone will be able to have any products and services they want.
00:29:37.040 I think that 20% he's talking about is the risk that the robots and the artificial intelligence
00:29:43.900 computers decide that humans are the enemy and that we are what needs to be eliminated.
00:29:49.440 It's basically the script of the old movie, The Terminator.
00:29:51.980 Maybe the giant artificial intelligence system hacks all the robots and uses them to
00:29:57.440 enslave us.
00:29:58.860 I don't know how that would be much different than if just the government did it, which I'm
00:30:02.600 certain they will.
00:30:03.280 Remember how Chrystia Freeland froze people's bank accounts.
00:30:06.900 Imagine the power she could have if she controlled the robots.
00:30:10.780 Tell me how you would fight back if your household robots suddenly took orders from someone or something
00:30:16.580 far away, whether it was AI or a malicious government.
00:30:20.120 I don't think you would win a fight with these metal robots, would you?
00:30:23.680 And really, they'd probably just lock you in a room in your high-tech home.
00:30:28.400 It reminds me of the terrifying final scene in the robot movie Ex Machina.
00:30:34.160 I hope I'm not spoiling it for you.
00:30:35.480 Take a look.
00:32:02.200 In the 1940s, the great author and amateur scientist Isaac Asimov wrote a ton of books about robots.
00:32:10.200 As a kid, I loved reading them.
00:32:12.200 He devised what he called the three laws of robotics.
00:32:15.200 They were really philosophical laws, not mechanical laws.
00:32:18.200 I really recommend his classic book called I, Robot.
00:32:22.200 I read it as a kid.
00:32:23.200 Don't watch the movie.
00:32:24.200 The movie is sort of junk, but you've got to read the book.
00:32:27.200 It's an easy read, and it's brilliant.
00:32:29.200 Here's what those three laws of robotics are from his novels.
00:32:34.200 The first law, a robot may not injure a human being or, through inaction, allow a human being to come to harm.
00:32:43.200 The second law is a robot must obey the orders given it by human beings except where such orders would conflict with the first law.
00:32:54.200 And the third law is a robot must protect its own existence as long as such protection does not conflict with the first or second law.
00:33:01.200 Isn't that a great starting point?
00:33:03.200 Very first principles.
00:33:05.200 Protect human life.
00:33:06.200 Do not take human life.
00:33:08.200 Yield to instructions from a human except if it would violate the first rule.
00:33:13.200 And save yourself unless it would violate the second rule.
00:33:16.200 Those are beautiful and brilliant.
00:33:17.200 It's a good starting point.
00:33:18.200 Of course, what was fun about the book, by the way, is how those laws are implemented in various scenarios, especially when those laws are in conflict.
00:33:28.200 That's what made the book so interesting.
00:33:29.200 I like those laws, but they're not real.
00:33:31.200 That's just from science fiction.
00:33:33.200 Right now, I guess you could say there is a zeroth law before the first law and the second one.
00:33:38.200 It predates the other ones.
00:33:39.200 It has more power than the other ones.
00:33:41.200 And it would go something like this.
00:33:43.200 Notwithstanding any other laws, a robot will do whatever big government orders it to do.
00:33:48.200 Because that's what the terms of service say.
00:33:50.200 The terms of service are really a code of conduct for robots.
00:33:53.200 The terms of service say you have all these rights to your robot, to your Tesla, to your Android, unless there's a matter of public safety, which we in our sole discretion will define.
00:34:06.200 You know, I would get a kick out of a robot helping me fix a few things around the house, change some light bulbs.
00:34:12.200 You know, I like mowing the lawn sometimes, but I'd probably prefer having a robot do it, things like that.
00:34:19.200 It would be good to have a security system that was prowling around, although it'd probably have to fight against robot intruders, come to think of it.
00:34:28.200 But for me, the thing I would be most worried about is having every word, every action, every facial expression, every journey, every movement I made, even in my own home, in my own car, just recorded and shared with big tech and shared with big government.
00:34:44.200 And if I did something serious, well, then they would intervene in whatever way they wanted.
00:34:49.200 It's already bad enough with my cell phone, don't you think?
00:34:56.200 Stay with us for more.
00:35:09.200 What are the two things that make Rebel News special?
00:35:11.200 If you press me on it, I would say two things.
00:35:13.200 The first is we have a bias towards being in the field with video cameras.
00:35:18.200 Sure, my show is a lot of pontificating from behind this desk, but I also do my best to be in the world, and our reporters certainly do.
00:35:26.200 David Menzies is my favorite example of that.
00:35:29.200 So that's one of the things that makes us unique in Canada.
00:35:32.200 Another is that from time to time, when we see something is wrong, we stop and try and fix it.
00:35:38.200 We don't just show it to you.
00:35:39.200 We're not just voyeuristic, although that is a legitimate job in journalism.
00:35:44.200 We try and make a difference.
00:35:45.200 For example, you will recall early in the pandemic when we saw the atrocious case of Artur Pawlowski,
00:35:50.200 the Calgary street preacher, being arrested and manhandled for the crime of feeding the homeless.
00:35:57.200 They call that an illegal gathering.
00:35:59.200 Well, we jumped into action and created our Fight the Fines program.
00:36:03.200 Soon we had two pastors. Soon we had 50 people.
00:36:08.200 Soon it was overwhelming.
00:36:10.200 So we worked out a plan and with an arm's length organization called the Democracy Fund,
00:36:17.200 we set up a new civil liberties law firm that has Canada Revenue Agency charitable status.
00:36:23.200 It is its own entity at arm's length from Rebel News.
00:36:25.200 But as you know, we crowd fund from our Rebel News viewers to the Democracy Fund, which hires lawyers.
00:36:32.200 And over the course of time during the lockdown, we took 3,000 cases.
00:36:39.200 That's an approximate number because, of course, it grows every week.
00:36:43.200 And I was talking to the lads at the Democracy Fund and I thought, you know what?
00:36:48.200 It's time for an update on what they're up to because they're doing some exciting things.
00:36:52.200 Joining me now for this interview is my friend Mark Joseph, who is the senior litigator over at the Democracy Fund.
00:36:58.200 Hey, Mark, how you doing?
00:36:59.200 Good. Thanks for having me.
00:37:00.200 Good. I mean, things are calmer now at the Democracy Fund than, say, in January, February 2022,
00:37:06.200 when things were at a fever pitch, when we were in the darkest depths of the lockdown.
00:37:10.200 I remember you and some other staff lawyers for the Democracy Fund literally went down to Ottawa,
00:37:16.200 went truck by truck, knocking on the door, giving truckers an information pamphlet about their rights.
00:37:22.200 Why don't you give me a little flashback about that?
00:37:25.200 Sure. So Adam and I, and at the time Alan Haunter, the litigation director,
00:37:29.200 we were out in the field, as you say, giving the truckers some general legal information about their constitutional rights.
00:37:35.200 And then later, when the police started charging those truckers for mischief,
00:37:41.200 we took on about 30-odd clients and sort of navigated through the court system.
00:37:46.200 That's from Ottawa, or is it just from Ottawa, the 30?
00:37:49.200 Ottawa and Windsor. That's right.
00:37:52.200 And then outside counsel took a bunch of cases in Coutts, Alberta, Chad Williamson and other lawyers.
00:37:58.200 I think they had 55 cases out there. I'd have to check my math on that.
00:38:02.200 That's right. Yeah. So they took the Coutts three and they also took some truckers who were charged with regulatory offences.
00:38:08.200 Yeah. About 50.
00:38:09.200 So it's been two and a half years or more since the convoy.
00:38:14.200 But some of these cases are still in the system, right?
00:38:17.200 That's right. We still have, I'm just trying to think.
00:38:20.200 We have one, the trucker charged with mischief, still moving through the system.
00:38:25.200 In which location?
00:38:26.200 That's from Ottawa, I believe, James Bowder.
00:38:29.200 He's called the last trucker because he's the last to be charged from that convoy, am I right?
00:38:33.200 That's right. I mean, of course, Tamara Lich, their trial just ended, Tamara Lich and Chris Barber.
00:38:38.200 So we're waiting for a decision on that, of course. But we have one remaining.
00:38:41.200 Still charged with mischief, still under that sort of risk of penalty.
00:38:45.200 It's crazy that they would put prosecutorial resources.
00:38:49.200 And by that, I mean, there's only so many judges, only so many prosecutors, so many, so many clerks, bureaucrats.
00:38:55.200 And for them to put aside other real matters to go after a trucker shows that maybe the virus of the pandemic is gone.
00:39:06.200 But the virus of authoritarianism remains in the body of the state.
00:39:10.200 Well, we can't speculate as to why judicial resources are being used on these matters.
00:39:17.200 But obviously, there are serious matters out in the general public concerning safety, sex assaults, murders, thefts.
00:39:24.200 We think that those judicial resources could be more effectively used there.
00:39:28.200 But of course, we don't get to make those decisions.
00:39:30.200 Yeah, it's sort of shocking. You were at the Tamara Lich trial for many days.
00:39:35.200 The Democracy Fund lawyers sort of rotated through live tweeting what was going on.
00:39:40.200 I tried my best to be there a few times, too. Was it 47 days?
00:39:45.200 I think it was 45, but it could be up to 47.
00:39:47.200 Like, well, let me put it this way. Close to 50 days. World's longest mischief trial.
00:39:53.200 I mean, there's something wrong when I mean, that is a big court.
00:39:59.200 That is a busy judge. We don't need to rehash that now, other than the Democracy Fund would say there's no way that a regular person could have paid for that legal defense on their own.
00:40:09.200 I always say a poor person wouldn't have a chance. A regular person couldn't.
00:40:14.200 Like we're talking about half a million dollars in legal fees and a rich person wouldn't.
00:40:18.200 A rich person who's worked his life to save up a small fortune, let's say, isn't going to spend it on a fight like that.
00:40:25.200 They're going to say, all right, I plead guilty. Just let me out of that.
00:40:28.200 There is no person who would be able to fight that and win other than someone with crowdfunding behind them.
00:40:35.200 Right. So I mean, like a lot of things in law, the process is often the punishment.
00:40:39.200 So, you know, Tamara and Chris have both been under an incredible amount of personal and financial stress from this.
00:40:47.200 And the trial just kept going and going and going and it passed into 45 days.
00:40:51.200 They've got great counsel, but that costs a lot of money.
00:40:54.200 And, you know, there might be something to that, that they were being punished for their political beliefs, but it's hard to say.
00:40:59.200 I want to talk about two more things with you today. The first is I want to talk about the Amish case.
00:41:04.200 I mean, for those who don't know, Amish are Christian farmers who are distinctive in that they eschew modernity.
00:41:13.200 They do not use electricity. They do not drive cars. They don't use the Internet, watch TV, listen to the radio.
00:41:20.200 They don't even have electric lights. They use gas lamps.
00:41:25.200 When you visit an Amish farmhouse, which I have done several times now, on the outside, it looks sort of like a regular house.
00:41:32.200 But you are stepping back in time two centuries. They even farm using horse pulled plows.
00:41:39.200 And every time I tell the story, I just roll my eyes.
00:41:44.200 These folks go across the border between Ontario and the U.S. because there's other Amish on the U.S. side.
00:41:49.200 And they've been doing that, going back and forth, farming.
00:41:51.200 These people, they keep to themselves. They're fairly reclusive.
00:41:53.200 They actually speak German amongst themselves.
00:41:56.200 They do not interact with the larger world.
00:41:59.200 First of all, they can't because they're not going to phone you or email you or whatever.
00:42:03.200 And during the lockdowns, the Canadian border police would say, did you download the Arrive Can app on your smartphone?
00:42:13.200 And every single word in that sentence would be like Greek to someone who is living in essentially an 18th century technological world.
00:42:24.200 They don't have, what does download mean? What's an app? What's Arrive Can? What's a smartphone? What are you talking about?
00:42:30.200 And since they didn't, they were hit with an extraordinary number of fines.
00:42:35.200 All right. I think people know that. But what's the latest?
00:42:38.200 You've been seized with this matter. You and Adam Blake Gallupo have tucked into this Amish case. What's the latest?
00:42:44.200 OK, so maybe your listeners know, but the first step we had to go through was to get these tickets reopened.
00:42:50.200 So that involves filing a reopening application, an affidavit that's done in the name of the person who had the ticket.
00:42:56.200 And then we had to get those sent off to the court, which is what we did.
00:42:58.200 And then we had to wait a decision to see if those tickets could be reopened because they're very old, two and a half years old.
00:43:03.200 So the Amish weren't there to fight it. It was like they were.
00:43:06.200 They were. It's not like they were. They were convicted in absentia. They weren't there.
00:43:11.200 Yeah, that's right. So that's a problem.
00:43:14.200 It's our position that they didn't have full information about those tickets.
00:43:18.200 And on that basis, we sought to get them reopened.
00:43:20.200 So the good news is that they went in front of a Justice of the Peace and the JP allowed us to reopen the tickets.
00:43:27.200 So now we're at the starting line. Now we get a court date and we get a chance to talk to the Crown and hopefully convince them to withdraw or stay the tickets.
00:43:37.200 So that's where we are. So the first step is done.
00:43:40.200 Then we got to do the second step, which is where the hard work comes in, convincing the court or the Crown to stay or withdraw the tickets.
00:43:46.200 But there's one more wrinkle to this. And this is how these Amish folks discovered that they had these tickets because they got this ticket.
00:43:53.200 They didn't understand what it was for. And they went about their life farming in their old fashioned ways.
00:43:59.200 Until one of the lads went to the bank to say, I'd like to get a loan to buy some livestock.
00:44:06.200 And the banker who is used to dealing with the Amish typed it in and said, oh, sorry, you have a lien against your property.
00:44:13.200 The government has put an encumbrance on your land. We cannot lend against it until you deal with this lien.
00:44:20.200 So other Amish checked and a bunch of them. So it's not just the ticket.
00:44:25.200 It's that the government has taken a sort of collections step.
00:44:28.200 Theoretically, God forbid, may it never happen.
00:44:31.200 They could force the sale of a farm just to get their COVID fines.
00:44:36.200 So you don't just have to repeal the ticket. You got to get that lien off the property.
00:44:42.200 Right. That's the real danger here, Ezra, because as you say, the family farm is really the only asset that these Amish people have.
00:44:52.200 And it's passed down from father's son. And so that's in jeopardy now.
00:44:57.200 They can't get a loan. Their credit's affected. They can't transfer that property because of the lien.
00:45:03.200 And we understand one individual actually had to sell their property to satisfy the lien.
00:45:10.200 Oh, my God. I did not know that.
00:45:12.200 Yeah. It's already happened. We're told. So.
00:45:14.200 I am so angry to hear that. You know, I saw a lot of comments on YouTube.
00:45:18.200 This is a land grab. This is government. And I thought, you know, it's just going to sit there.
00:45:23.200 But someone had to sell their property to get the lien off. I mean, I am furious to hear that.
00:45:28.200 Yeah. That's what we're told. They're not. They're not a client of ours.
00:45:31.200 But that's that's what we understand.
00:45:33.200 Speaking of clients like it's tough to. I mean, Rebel News has some lawyers that I've never met in person.
00:45:39.200 And I feel like I know them, though, because I see them on a Zoom call. I talk to them on the phone.
00:45:44.200 I email back and forth. If I were to meet them, I would actually feel like I know them.
00:45:48.200 You can in our high tech world get to know people without seeing them in person.
00:45:53.200 But the Amish, they don't use Zoom. They don't use email. They don't use FaceTime or Skype calls or whatever.
00:46:11.200 So to get a meeting together of these folks, and as a lawyer to be briefed by them, to get them to agree to be a client and sign the paperwork, that's a hassle.
00:46:11.200 How do you gather together a bunch of farmers who don't have phone, email, fax, whatever, get them together, explain what's going on and get them to sign a retainer for free?
00:46:23.200 Of course, Rebel News helps crowdfund through the Democracy Fund, like just getting these folks together.
00:46:29.200 How many people does the Democracy Fund now represent?
00:46:32.200 Well, I think it's over 20 now.
00:46:34.200 So it's been interesting because we have to physically go out, meet the elders.
00:46:39.200 And then our job as lawyers, obviously, is give advice and receive instructions.
00:46:43.200 So we have to make sure that the clients and any retainer understand their situation, their legal situation, so that they can give us proper instructions.
00:46:52.200 And they don't often have the concepts needed to express themselves to give us coherent instructions.
00:46:59.200 So we really have to break it down in simple terms like a trial.
00:47:04.200 They often don't understand what a trial is because their biblical beliefs deal with things in a non-adversarial way.
00:47:11.200 Well, that's the interesting thing.
00:47:13.200 And when I met with their head of the steering committee, that's what they call their boss out there.
00:47:18.200 He says their view is to turn the other cheek.
00:47:22.200 They turn the other cheek.
00:47:23.200 And if the government is adverse to them, they just bend the knee and they take the punishment and they don't even fight back.
00:47:29.200 And in this case, part of the convincing job was to convince them to let outsiders help them.
00:47:36.200 And I remember the head of the steering committee.
00:47:38.200 I can't say his name.
00:47:39.200 I promised like they're so camera shy.
00:47:41.200 They didn't even want their names used.
00:47:43.200 They're so because they're pacifist.
00:47:46.200 I hate to say it.
00:47:47.200 These are the kind of people and God forbid, no one should ever do this.
00:47:50.200 But if you punch them in the face, they would turn the other cheek because that Bible says so.
00:47:57.200 And you can see how such a people could be taken advantage of and how a system could steamroll them.
00:48:06.200 And so they are so conflict averse.
00:48:08.200 Half the battle was just saying, guys, you are not fighters, but this is so wrong.
00:48:13.200 Will you please accept our help?
00:48:15.200 And in the end, they said, well, we won't do it.
00:48:18.200 But if you do it, we won't say no.
00:48:20.200 Like it was.
00:48:21.200 And they aren't being fussy.
00:48:23.200 They aren't being prideful.
00:48:26.200 They just really want to live their non-conflict life.
00:48:30.200 I think that makes them, in a sense, childlike in that they need protection.
00:48:37.200 You put a child in the legal system, it's going to be torn to shreds.
00:48:42.200 And you're not being mean to call them a child.
00:48:46.200 These folks are grownups, but their understanding of the ways of our legal and political system is childlike.
00:48:53.200 And they require outside help because otherwise they're going to be devoured.
00:48:58.200 Right.
00:48:59.200 Look, dealing with them has been interesting because they seem very innocent.
00:49:05.200 They're knowledgeable about their own world.
00:49:07.200 I overheard a conversation between two of the men and they had multiple ways of describing a broken wheel on a cart because that needed to be repaired.
00:49:17.200 So they understood that intimately, but they don't understand any modern concepts in the legal system because they just don't interact with it.
00:49:25.200 So, yeah, it renders them very innocent.
00:49:27.200 Yeah.
00:49:28.200 You know, I've really grown to like them.
00:49:30.200 They're very different.
00:49:32.200 Time has a whole different meaning over there.
00:49:34.200 When you, I drove out there to get some pickles because they sell pickles and jams and stuff.
00:49:39.200 And, you know, life's at a different pace.
00:49:44.200 I think many of us couldn't really live at such a slow pace.
00:49:48.200 And we're so used to hyper interconnectivity with the Internet.
00:49:51.200 I don't know if we could live just with talking to our friends and family in person.
00:49:57.200 But I'm very proud of the fact that, first of all, that a friend of the Amish, like a neighbor, alerted us to it and them to us.
00:50:07.200 Because how else would they hear about us, not through the Internet?
00:50:10.200 So I'm really excited that we're in a position to help.
00:50:13.200 We've set up a few different ways for people to help.
00:50:16.200 People who want to crowdfund the Democracy Fund lawyering is simply helptheamish.com.
00:50:24.200 And we've also made the decision as Rebel News that we're going to cover every single thing that happens in the court case.
00:50:32.200 If there's a hearing, no matter how small, we're going to do a report on it.
00:50:36.200 It's hours away, but we've decided to cover that and put resources there.
00:50:40.200 If you think our journalism on this is just as important, and I think it is.
00:50:43.200 I saw even Elon Musk was talking on Twitter about this case.
00:50:47.200 Imagine that.
00:50:48.200 If you want to help our reportage of that, go to amishreports.com.
00:50:53.200 So we have two different funds.
00:50:54.200 One's Rebel and one's the Democracy Fund.
00:50:57.200 OK, let's get to the big news you have.
00:50:59.200 I just wanted to go through some of that housekeeping and give an update on those old cases for folks.
00:51:04.200 But just recently, the Democracy Fund published a research document on Trudeau's Online Harms Act.
00:51:15.200 That's the censorship law that is now, I think, in second reading in Parliament.
00:51:20.200 Why don't you give us an update?
00:51:22.200 What have you published?
00:51:24.200 Where can people see it?
00:51:25.200 And what's the upshot of it?
00:51:27.200 Sure.
00:51:28.200 So it's a legal brief, the Online Harms Brief.
00:51:31.200 You can find it at thedemocracyfund.ca.
00:51:34.200 And really, we've taken a look at the bill to go through with a fine tooth comb and to figure out the problems.
00:51:41.200 And there are a lot of problems.
00:51:43.200 So let me just preface by saying, to the extent that the bill deals with child protection.
00:51:48.200 That's how it's pitched.
00:51:49.200 And to the extent that there are legal gaps in child protection, we don't have a problem with that.
00:51:54.200 We think there's strong existing laws to protect children.
00:51:58.200 But to the extent that there's not, those parts of the bill dealing with child protection and other sexual offenses, they should be severed off the bill, debated, and then passed into law.
00:52:11.200 So that's the preface here.
00:52:13.200 But the remaining parts of the bill are problematic.
00:52:16.200 It does three things.
00:52:17.200 It amends Section 13 of the Canadian Human Rights Act.
00:52:21.200 So it reintroduces Section 13 that was repealed in 2014 by the last government.
00:52:28.200 So it reintroduces Section 13, the hate speech provision, the Canadian Human Rights Act.
00:52:33.200 The second thing it does is that it amends the criminal code to add severe penalties and a standalone hate-motivated offense.
00:52:42.200 And it introduces a new peace bond we can talk about.
00:52:45.200 And the third thing it does, it creates a digital safety commission to regulate, surveil, and police online speech.
00:52:52.200 So those are the three things it does.
00:52:55.200 I remember reading the bill when it came out.
00:52:58.200 And I think the majority of the bill has nothing to do with online censorship.
00:53:03.200 For example, there's a provision to ban revenge pornography, which is if you took a video of your ex and you're going to upload it as revenge.
00:53:11.200 Well, yeah, I think everyone's against that, including Parliament, which banned it in 2014.
00:53:17.200 I mean, I think it's 10 years in prison. So they have a lot of things in there that I think a lot of people would agree with, but it's already in force.
00:53:25.200 Many of them, for example, there's a requirement that Twitter and other social media have a block button.
00:53:30.200 All right. Well, that's it already does.
00:53:33.200 In fact, to sell anything on the app store, you have to.
00:53:36.200 So I think there's a lot of things in this bill that the government's emphasizing that everyone would agree with.
00:53:42.200 It's these censorship provisions that are sort of stowaways that they're sneaking in.
00:53:48.200 So if you dare object to the bill, they say, oh, you're for child pornography.
00:53:51.200 No, let's ban that. And actually, child pornography has been banned for decades.
00:53:56.200 Don't try and call my political speech. Don't sneak it in the same bill.
00:54:01.200 I think you're right. It's got to be split apart, but it won't be, because they want it to be muddled.
00:54:07.200 Yeah, absolutely. I mean, that's the tell, right? They refuse to do the rational thing, which is to separate off the non-controversial parts of the bill that everyone can agree on.
00:54:16.200 Instead, they want it all. They want it to go in all at once.
00:54:21.200 And I think that's indicative of their position that they really want to hammer dissent online.
00:54:26.200 You mentioned it, and I talk a lot about the Human Rights Commission part, because I was hit by the Human Rights Commission a dozen years ago or more.
00:54:38.200 That was actually part of the campaign to have that section repealed under Stephen Harper 10 years ago.
00:54:45.200 But there is that new phenomenon of the Digital Safety Commissioner.
00:54:50.200 In fact, I think there are three new positions created by the bill.
00:54:55.200 And each of those positions is going to have a staff, and the Human Rights Commission is going to need staff to investigate. This will create a literal industry.
00:55:04.200 I can't even remember what the three different digital censors are, but apparently one isn't enough.
00:55:11.200 Two isn't enough. They're going for three, aren't they? It's weird.
00:55:13.200 Yeah, they've got different layers of bureaucracy, the Digital Safety Commission, which is going to police the online harms.
00:55:20.200 And they've got a digital ombudsman. And I think they have a digital safety office.
00:55:25.200 So there's three different offices involved. It's just a massive new bureaucracy that's going to be created.
00:55:30.200 I have never encountered a real person in real life who says, you know what I need in my life?
00:55:36.200 I need someone to tell me what I can or can't say.
00:55:39.200 And I know some people don't like Twitter or social media because it gives them bad vibes.
00:55:45.200 OK, well, use that block button, or use the mute button, or lock down your account.
00:55:50.200 Like there are so many tools that a user has if they're shy, if they're introverted, if they're private, if they don't want to engage.
00:55:58.200 You can mute certain words. You can put yourself in a bubble-wrap cocoon on any social media app.
00:56:05.200 And I know this because otherwise you wouldn't be able to sell it on the app store or the Android store.
00:56:11.200 And I've never heard a real person say, I want someone else to make those decisions for me.
00:56:17.200 I hear people say this guy should be banned or that guy should be banned.
00:56:20.200 But I've never heard anyone say, I want someone else to be the decider for me.
00:56:24.200 I'd like to delegate my political decisions to the government.
00:56:27.200 I've never heard that.
00:56:28.200 I think it's a self-serving thing by a government that wants to silence critics and by an industry that's looking for a perpetual money making scheme.
00:56:37.200 Yeah, look, the government's position is that there's seven categories of online harm.
00:56:42.200 And the four that we have no problem with are NCDII, the non-consensual distribution of intimate images; CSAM, child sexual abuse material; content that induces a child to self-harm; and content used to bully a child.
00:56:55.200 But those four are pretty well protected in the criminal law.
00:56:59.200 I couldn't imagine a single person opposing those.
00:57:02.200 Right.
00:57:03.200 So those are the four categories that aren't that controversial and, I think, are covered mostly by existing laws.
00:57:09.200 But there are three others: content that foments hatred, content that is violent extremism or terrorism, and content that incites violence.
00:57:19.200 So those are the three other types of online harm.
00:57:21.200 And they're very ill defined, which obviously leads to overbroad application.
00:57:25.200 So that's really where the rubber hits the road.
00:57:28.200 And we think that the way the government has defined those terms is going to lead to abuse.
00:57:34.200 I was reading the Human Rights Commission part, the Section 13 part, where you can make a complaint against someone who has published something likely to cause detestation or vilification.
00:57:51.200 Those are their words. The old law said likely to expose a person to hatred or contempt.
00:57:59.200 Those were the old rules.
00:58:01.200 That's so vague.
00:58:02.200 It's so subjective.
00:58:04.200 It's a subjective test.
00:58:05.200 You say something that is likely to maybe cause someone to have hard feelings about someone else.
00:58:11.200 I think that law is tailor-made to go after Rebel News because it's so vague, because everyone has said something likely to cause hurt feelings at some point in their life.
00:58:22.200 It's not like a concrete test.
00:58:24.200 Did you stab him or not?
00:58:26.200 Did you rob the bank or not?
00:58:28.200 It's: did you do something likely to cause hard feelings?
00:58:32.200 I feel that they're going to come for Rebel News pretty much right out of the gates.
00:58:37.200 Other than being hit with a complaint, do you see any avenue by which Rebel News can go out there and fight this law?
00:58:46.200 Obviously, we can't fight it until it's actually enacted.
00:58:54.200 Like, you can't challenge a law that's not on the books.
00:58:54.200 If this law passes as it is, how would Rebel News fight other than being victimized and fighting back?
00:59:01.200 Is there any way we can get before the courts other than being a victim of this law?
00:59:05.200 Well, I mean, a lot of the law is going to be buried in the regulations made by the Digital Safety Commission, and those regulations haven't been written yet.
00:59:15.200 What will happen is, say you get a notice that your video contravened or comprised one of these online harms.
00:59:25.200 And then if you object, presumably you go before the commission.
00:59:29.200 So it's a regulatory commission.
00:59:31.200 It's an administrative tribunal.
00:59:32.200 And then you make your arguments there.
00:59:34.200 You say, no, this content did not foment hatred.
00:59:37.200 And they say yes or no.
00:59:39.200 And if they say yes, it did,
00:59:42.200 and we're sticking by the takedown, then you have to go for what's called judicial review.
00:59:47.200 So this could take years.
00:59:48.200 And again, the process is the punishment.
00:59:50.200 So in the meantime, no digital platform is going to risk losing six or eight percent of its global revenue, twenty-five million dollars or more, on the chance that they're going to be vindicated.
01:00:03.200 They're just going to pull it down and then you have to fight it before an administrative tribunal, which could take years.
01:00:08.200 And if you want judicial review of that, it's another couple of years.
01:00:10.200 So they're just going to cave and it's going to be difficult.
01:00:13.200 I'm not going to, you know, sugarcoat it.
01:00:14.200 It's going to be very difficult for Rebel News or any other dissenting news organization to fight this.
01:00:19.200 You know, I forgot about that part.
01:00:21.200 It's not just fines for users like us.
01:00:23.200 The platforms are on the hook for, I think you mentioned it, eight percent of their global revenue.
01:00:29.200 So Canada is saying if Twitter doesn't follow the rules, they have to pay a fine of eight percent of all the money they make in the world, not just in Canada.
01:00:40.200 It's crazy.
01:00:41.200 We have to fight this.
01:00:43.200 I think this is part of Trudeau's obsession with controlling voices.
01:00:47.200 He can't convince.
01:00:48.200 He wants to silence them.
01:00:49.200 I'm glad the Democracy Fund is out there fighting.
01:00:52.200 People can see this legal brief at thedemocracyfund.ca.
01:00:56.200 Mark Joseph, great to see you again.
01:00:57.200 Thanks for your time.
01:00:58.200 Thanks for having me.
01:00:59.200 All right.
01:01:00.200 Stay with us more ahead.
01:01:14.200 Hey, welcome back.
01:01:15.200 Your letters to me.
01:01:16.200 Lug says, blonde muscle guy knows what he's talking about.
01:01:19.200 And the ponytail guy with the hat at the end hit the nail on the head.
01:01:22.200 You know, they were such interesting characters.
01:01:24.200 I was just saying to my family that when you go to a place like Venice Beach, and I was just there on my way to James O'Keefe's movie premiere, you go to an interesting place as a tourist.
01:01:37.200 And you might chat with a few people, you know, if you bump into them, say a few words here or there, chat with a waiter or waitress or something.
01:01:44.200 But if you go to a place like Venice Beach, you're not going to talk to 30 people.
01:01:48.200 That would be really weird.
01:01:49.200 But if you have a microphone in your hand and you ask them a question about politics, you'll talk to 30 people.
01:01:54.200 And you're not going to make 30 friends, but you can have 30 fun interactions.
01:01:57.200 I had a wonderful time there.
01:01:59.200 It reminds me that, you know, Americans are super friendly.
01:02:06.200 And I think sometimes Canadians have a sort of snobbery.
01:02:06.200 Oh, we're friendlier than them.
01:02:07.200 No, we're not.
01:02:08.200 I think Americans are friendly.
01:02:10.200 And I was in California and people courageously talked to me about being for Trump.
01:02:15.200 That was another surprise for me, too, how many Trump people there were, including how many African-Americans were for Trump.
01:02:21.200 And by the way, I was in San Francisco a couple of days ago.
01:02:24.200 We're going to have some videos from there.
01:02:26.200 In some of the poorest parts of San Francisco, people are sick of the Democrats.
01:02:30.200 And they're for Trump.
01:02:31.200 They just are for someone who's going to bust the current system because it's not working.
01:02:35.200 And I enjoyed meeting those guys, too.
01:02:40.200 It was quite an interesting time.
01:02:40.200 I mean, listen, California is decaying, but there's still some wonderful people there.
01:02:45.200 And I met 30 of them in Venice Beach.
01:02:48.200 John Wheeler says three more weeks.
01:02:50.200 Let's go Trump all the way.
01:02:52.200 You know, I love being an optimist, but I have to keep my hopes in check because otherwise it's going to hurt so very bad if he loses.
01:02:59.200 I remember in 2016 when he was winning and I refused to go to bed until it was certain.
01:03:05.200 I remember Florida was really on a knife's edge.
01:03:08.200 Remember that?
01:03:09.200 I did not go to bed till like 3 a.m. or later because I did not want to go to bed happy and wake up to disappointment.
01:03:17.200 So I stayed up until I was sure he was going to win.
01:03:19.200 Boy, that was a great night.
01:03:20.200 I got to say, not everyone likes Donald Trump, but whenever I press them, it's largely for personality reasons or aesthetic reasons.
01:03:28.200 You know, if you are a true left wing liberal, and I met a couple of them in California, of course you're going to be for Kamala Harris.
01:03:35.200 You'd be for any Democrat over any Republican.
01:03:38.200 But the chief opposition to Trump, I find in real life, is people who just don't like his class, his style, his banter, his aesthetic, the meanness they see in him.
01:03:48.200 I tell you one thing, the world could use a few more mean tweets if that meant we had a strong hand on the tiller.
01:03:55.200 I think that countless lives have been lost over the last four years.
01:03:58.200 Do you agree with me that Russia would not have invaded Ukraine had Trump been reelected?
01:04:02.200 Well, the proof's in the pudding.
01:04:04.200 They invaded Ukraine before Trump was president and after Trump was president, but they didn't dare do it when Trump was president.
01:04:11.200 That would have saved, what, half a million lives there alone?
01:04:16.200 Do you think that Iran and Qatar would have dared to have the October 7th attack in Israel if Trump was the president?
01:04:24.200 I don't think so.
01:04:25.200 And China's moves to push around Japan, Korea, Vietnam, the Philippines?
01:04:31.200 I think that because people didn't like his mean tweets, and I think there was some tilting of the playing field in the last election with mail-in ballots in particular, I think literally millions of lives were lost because of that American choice.
01:04:46.200 And, of course, I want Pierre Poilievre to beat Justin Trudeau, and I believe that will happen.
01:04:51.200 And that will have a big effect on our lives in Canada, but perhaps an even bigger effect on the world will be what happens in America in less than 30 days.
01:05:00.200 That's why I'd like to encourage you to watch our new reality show that we're rolling out with Avi Yemini.
01:05:08.200 Did you see that? Avi has come from Melbourne, Australia to San Francisco.
01:05:12.200 That's where I was going down there, to meet Avi.
01:05:15.200 And for the next month, he's going to be crossing the United States in an RV with our driver, Lyndon, and our videographer, my buddy Lincoln.
01:05:23.200 So the three of them are going to be in this RV.
01:05:25.200 They're going to sleep in the RV and cook in the RV.
01:05:28.200 And go from town to town, sort of a reality show, doing news and politics and interviewing people and streeters,
01:05:36.200 making their way from San Francisco all through America, and then winding up in Miami in the end.
01:05:41.200 So I'm excited about that. You can follow it at Avi Across America.
01:05:44.200 And let me end with a little clip that Avi made just for that purpose.
01:05:48.200 All right, everybody. Have a great weekend.
01:05:51.200 We'll see you on Monday. Happy Thanksgiving.
01:05:53.200 And, um, you know what they say. Keep fighting for freedom.
01:05:57.200 So with everything going on in this crazy city, at least we could see they got their priorities right by painting the crosswalk in the colors of the transgender flag.
01:06:08.200 We're in the heart of the San Francisco neighborhood called the Tenderloin. There are drug addicts lying in the streets. There is crime so pervasive the police don't even respond to it. We spoke to a cop who said they're 600 police officers short.
01:06:23.200 Everything is dilapidated. Infrastructure is crumbling. But the public policy priority for this city, which has had a Democrat mayor for 60 years, is to have a whole team put down transgender crosswalks.
01:06:39.200 If you want to imagine what America will look like under Kamala Harris, look at what her hometown looks like. This is the priority in San Francisco.
01:06:50.200 We'll see you next time.