Elon Musk is making household robots, and they're pretty cool. But what are they really good at? And what are the downsides? And will they cost less than the price of a car? Plus, a look at Tesla's new driverless vehicle, the RoboVan.
00:07:17.320Sometimes the cost of taking an Uber to an event is less than the cost of driving and paying for parking in a big city like Toronto or New York.
00:07:30.580The car that's parked on my driveway, the car that I don't need an app to use, the car that I can drive whenever and wherever and however I want,
00:07:40.540and no one else is involved, no tech company, no third party, no driver or artificial intelligence driver.
00:07:48.640To me, a car is an essential part of being a free person.
00:07:52.040It's one of the defining spirits of America and Canada, too, as opposed to more European, African, or Asian ways.
00:07:59.420How much freedom there is on the open road.
00:08:02.360I mean, how many hundreds of songs have been written about driving on the highway and the freedom you feel.
00:08:09.660Teslas and these autonomous vehicles are part of huge technology companies that are amongst the most regulated companies in the world.
00:08:19.540And they're heavily regulated for political purposes, as you know.
00:08:36.120They're the people who punished us for not supporting COVID lockdown policies.
00:08:40.520That was literally part of their community guidelines.
00:08:44.040If you didn't obey your public health officer's politics, you would be shut down.
00:08:48.600So the company that interfered with your freedom of expression for political reasons, surely they won't hesitate to interfere with your freedom of movement for the same reasons.
00:09:00.380I've never used Waymo, and I don't really propose to.
00:09:03.640But the choice, I don't know if it's even going to be my choice to make.
00:09:07.700Waymo and Uber have announced that they're teaming up.
00:09:11.120They're making announcements like this one just last month.
00:09:14.840So my point is, would I have been able to go to, say, an anti-COVID lockdown, anti-COVID mandate protest if Waymo had been operating in Canada back in February 2022?
00:09:29.920Would they have let me go to the trucker convoy?
00:09:32.120Would an autonomous vehicle have taken me to the trucker convoy?
00:09:36.540I like Elon Musk a lot, but Teslas are at the mercy of their software and hardware.
00:09:42.980Even if Elon Musk personally opposed some sort of lockdown, could he really resist a government order to identify any cars participating in a future trucker convoy?
00:09:53.640How about shutting them off if they got too close to such a political event?
00:09:57.880Half of all the Teslas in the world are in China.
00:10:00.680Do you doubt they want that power over their people?
00:10:04.360Do you think that our politicians lack the will to regulate where you can go?
00:10:09.740They're already doing it in a sloppy way through 15-minute cities.
00:11:58.680Go online and look at the terms of service.
00:12:00.960We may share information with third parties when required by law or other circumstances, such as to comply with a legal obligation, such as subpoenas or other court orders.
00:12:15.120In response to a lawful request by government authorities conducting an investigation, including to comply with law enforcement requirements and regulator inquiries, to verify or enforce our policies and procedures, to respond to an emergency,
00:12:32.000to prevent or stop activity we may consider to be or to pose a risk of being illegal, unethical or legally actionable, or to protect the rights, property, safety or security of our products and services, Tesla, third parties, visitors or the public, as determined by us in our sole discretion.
00:12:54.680So they'll give information about you and your car and your journey and anything else they capture to the government as part of their inquiries or just if they feel it's necessary at their sole discretion.
00:13:14.960Would that include, say, I don't know, smoking a cigarette or swearing or having politically inappropriate views or going to meetings with political opponents of the regime?
00:13:45.980What's a public right, especially when it comes to my car?
00:13:49.720They answer that question in the same sentence.
00:13:51.980It's their sole discretion, whatever they want, really.
00:13:55.820But, hey, I'm sure it'll never happen to you.
00:13:58.040Don't you worry your pretty little head.
00:13:59.960You leave the worrying to the big people.
00:14:03.620Which brings us, I don't know, to the latest announcement, the real showstopper yesterday.
00:14:10.180Elon Musk rolled out his autonomous vehicles, but that wasn't really shocking.
00:14:13.140What was shocking, or startling at least, surprising, was that Elon Musk rolled out humanoid robots that he says you'll soon be able to buy for your household.
00:14:24.100He says they'll not just be great for chores, but they'll also be your friend.
00:14:29.120So everything we've developed for our cars, the batteries, power electronics, the advanced motors, gearboxes, the software, the AI inference computer, it all actually applies to a humanoid robot.
00:14:49.520It's the same techniques, it's just a robot with arms and legs instead of a robot with wheels.
00:14:56.480And we've made a lot of progress with Optimus, and as you can see, we started it with someone in a robot suit, sort of dab, and then we've progressed dramatically year after year.
00:15:15.000So if you extrapolate this, you're really going to have something spectacular, something that anyone could own.
00:15:24.640So you could have your own personal R2-D2 or C-3PO, and I think at scale, this would cost something like, I don't know, $20,000, $30,000, probably less than a car, is my prediction, long term.
00:15:44.760So, you know, it'll take us a minute to get to the long term, but fundamentally at scale, the Optimus robot, you should be able to buy an Optimus robot for, I think, probably $20,000 to $30,000 long term.
00:16:22.400I think this will be the biggest product ever of any kind.
00:16:32.840Because I think every one of the 8 billion people on Earth, I think everyone's going to want their Optimus buddy.
00:16:40.180So what will the terms of service be for my new robotic friend?
00:16:45.640They obviously have a ton of sensors in them, electronic eyes and ears, so to speak, and it'll be in your house, maybe even in your bedroom, your kitchen, anywhere you are, helping you, of course.
00:16:59.420They're not selling these robots yet, but I've seen videos of some celebrities saying they're going to get one soon.
00:17:06.840Like the cool Cybertruck, I'm sure that fancy people and opinion leaders will be able to get these Android-style humanoid robots very quickly.
00:17:16.300Maybe we'll learn then what the terms of service will say.
00:17:19.720But it's hard to imagine they'll be much different than Tesla's terms for their cars, which are essentially robots, too, when you think about it.
00:18:45.220I mean, imagine a robot in your house, never sleeping, always listening, always watching, never forgetting anything, uploading it all to the cloud so corporate can go through it at their sole discretion.
00:23:51.120As the robots become more and more human, I fear that we will become more robotic.
00:23:56.780What about people choosing only to have robots in their life?
00:24:01.000We were warned about this by Yuval Noah Harari, of all people, from the World Economic Forum.
00:24:06.420Remember when he said that the future for most people will be useless?
00:24:11.560He actually called people useless eaters who will spend their time just on drugs and playing video games because there's nothing else for them to do.
00:24:21.340Yes, in the Industrial Revolution, we saw the creation of a new class of the urban proletariat.
00:24:28.560And much of the political and social history of the last 200 years involved what to do with this class and the new problems and opportunities.
00:24:36.640Now we see the creation of a new massive class of useless people.
00:24:42.380As computers become better and better in more and more fields, there is a distinct possibility that computers will outperform us in most tasks and will make humans redundant.
00:24:56.260And then the big political and economic question of the 21st century will be what do we need humans for, or at least what do we need so many humans for?
00:25:07.500Again, I think that the biggest question maybe in economics and politics of the coming decades will be what to do with all these useless people.
00:25:17.000I don't think we have an economic model for that.
00:25:21.140My best guess, which is just a guess, is that food will not be a problem.
00:25:27.620With that kind of technology, you will be able to produce food to feed everybody.
00:25:33.260The problem is more boredom and what to do with them and how will they find some sense of meaning in life when they are basically meaningless, worthless.
00:25:43.760My best guess at present is a combination of drugs and computer games.
00:25:49.740You know, they say that the current generation has the least sex of the generations that have been measured, which is odd, don't you think?
00:25:59.040Because they certainly have the most pornography in history and the most online dating apps.
00:26:05.880There's never been more dating and situationships in history.
00:26:09.760But I think young people have fewer real connections with real people now than ever.
00:26:15.320The birth rate is plummeting, at least in the West.
00:26:19.020I think maybe there's a sense of purpose that's eroding, a sense of community.
00:26:23.280Look, I understand the appeal of having machinery and technology.
00:26:27.140I mean, I love my smartphone and I can see the social problems even that is creating.
00:26:32.720High tech has made even the most average person in Canada today as rich as a king from olden times.
00:26:39.300When you think about things like your basic medical care and dental care, the basic availability and choice of food, entertainment, travel, everything from literacy to communications.
00:26:52.280An ordinary Canadian has a better life than a king just a few hundred years ago.
00:26:58.160And it's because of technology in large part, the culture too.
00:27:01.340But what will these robots do to our humanity?
00:27:06.040I'm sure people ask the same questions at the start of the Industrial Revolution.
00:27:56.340And he generally is on the side of freedom, I think.
00:27:59.180I'd rather have Elon Musk owning and creating these things than communist China, I suppose.
00:28:04.620But he's pretty deeply involved in that country too.
00:28:09.160Elon Musk is famous for Tesla and for SpaceX, which is now sending more rockets into space than all other countries and companies combined.
00:28:18.180Ten times more than the rest of the world combined.
00:28:20.180Elon Musk's Starlink internet system is amazing.
00:33:18.200Of course, what was fun about the book, by the way, is how those laws are implemented in various scenarios, especially when those laws are in conflict.
00:33:28.200That's what made the book so interesting.
00:33:29.200I like those laws, but they're not real.
00:33:43.200Notwithstanding any other laws, a robot will do whatever big government orders it to do.
00:33:48.200Because that's what the terms of service say.
00:33:50.200The terms of service are really a code of conduct for robots.
00:33:53.200The terms of service say you have all these rights to your robot, to your Tesla, to your Android, unless there's a matter of public safety, which we in our sole discretion will define.
00:34:06.200You know, I would get a kick out of a robot helping me fix a few things around the house, change some light bulbs.
00:34:12.200You know, I like mowing the lawn sometimes, but I'd probably prefer having a robot do it, things like that.
00:34:19.200It would be good to have a security system that was prowling around, although it'd probably have to fight against robot intruders, come to think of it.
00:34:28.200But for me, the thing I would be most worried about is having every word, every action, every facial expression, every journey, every movement I made, even in my own home, in my own car, just recorded and shared with big tech and shared with big government.
00:34:44.200And if I did something serious, well, then they would intervene in whatever way they wanted.
00:34:49.200It's already bad enough with my cell phone, don't you think?
00:38:26.200That's from Ottawa, I believe, James Bauder.
00:38:29.200He's called the last trucker because he's the last to be charged from that convoy, am I right?
00:38:33.200That's right. I mean, of course, Tamara Lich, their trial just ended, Tamara Lich and Chris Barber.
00:38:38.200So we're waiting for a decision on that, of course. But we have one remaining.
00:38:41.200He's still charged with mischief, still under that sort of risk of penalty.
00:38:45.200It's crazy that they would put prosecutorial resources.
00:38:49.200And by that, I mean, there are only so many judges, only so many prosecutors, so many clerks, so many bureaucrats.
00:38:55.200And for them to put aside other real matters to go after a trucker shows that the virus of the pandemic may be gone.
00:39:06.200But the virus of authoritarianism remains in the body of the state.
00:39:10.200Well, we can't speculate as to why judicial resources are being used on these matters.
00:39:17.200But obviously, there are serious matters out in the general public concerning safety, sex assaults, murders, thefts.
00:39:24.200We think that those judicial resources could be more effectively used there.
00:39:28.200But of course, we don't get to make those decisions.
00:39:30.200Yeah, it's sort of shocking. You were at the Tamara Lich trial for many days.
00:39:35.200The Democracy Fund lawyers sort of rotated through live tweeting what was going on.
00:39:40.200I tried my best to be there a few times, too. Was it 47 days?
00:39:45.200I think it was 45, but it could be up to 47.
00:39:47.200Like, well, let me put it this way. Close to 50 days. World's longest mischief trial.
00:39:53.200I mean, there's something wrong there. I mean, that is a big court.
00:39:59.200That is a busy judge. We don't need to rehash that now, other than the Democracy Fund would say there's no way that a regular person could have paid for that legal defense on their own.
00:40:09.200I always say a poor person wouldn't have a chance. A regular person couldn't.
00:40:14.200Like we're talking about half a million dollars in legal fees and a rich person wouldn't.
00:40:18.200A rich person who's worked his life to save up a small fortune, let's say, isn't going to spend it on a fight like that.
00:40:25.200They're going to say, all right, I plead guilty. Just let me out of that.
00:40:28.200There is no person who would be able to fight that and win other than someone with crowdfunding behind them.
00:40:35.200Right. So I mean, like a lot of things in law, the process is often the punishment.
00:40:39.200So, you know, Tamara and Chris have both been under an incredible amount of personal and financial stress from this.
00:40:47.200And the trial just kept going and going and going and it passed into 45 days.
00:40:51.200They've got great counsel, but that costs a lot of money.
00:40:54.200And, you know, there might be something to that, that they were being punished for their political beliefs, but it's hard to say.
00:40:59.200I want to talk about two more things with you today. The first is I want to talk about the Amish case.
00:41:04.200I mean, for those who don't know, Amish are Christian farmers who are distinctive in that they eschew modernity.
00:41:13.200They do not use electricity. They do not drive cars. They don't use the Internet, watch TV, listen to the radio.
00:41:20.200They don't even have electric lights. They use gas lamps.
00:41:25.200When you visit an Amish farmhouse, which I have done several times now, on the outside, it looks sort of like a regular house.
00:41:32.200But you are stepping back in time two centuries. They even farm using horse pulled plows.
00:41:39.200And every time I tell the story, I just roll my eyes.
00:41:44.200These folks go across the border between Ontario and the U.S. because there's other Amish on the U.S. side.
00:41:49.200And they've been doing that, going back and forth, farming.
00:41:51.200These people, they keep to themselves. They're fairly reclusive.
00:41:53.200They actually speak German amongst themselves.
00:41:56.200They do not interact with the larger world.
00:41:59.200First of all, they can't because they're not going to phone you or email you or whatever.
00:42:03.200And during the lockdowns, the Canadian border police would say, did you download the ArriveCAN app on your smartphone?
00:42:13.200And every single word in that sentence would be like Greek to someone who is living in essentially an 18th century technological world.
00:42:24.200They don't have any of that. What does download mean? What's an app? What's ArriveCAN? What's a smartphone? What are you talking about?
00:42:30.200And since they didn't, they were hit with an extraordinary number of fines.
00:42:35.200All right. I think people know that. But what's the latest?
00:42:38.200You've been seized with this matter. You and Adam Blake Gallupo have tucked into this Amish case. What's the latest?
00:42:44.200OK, so maybe your listeners know, but the first step we had to go through was to get these tickets reopened.
00:42:50.200So that involves filing a reopening application, an affidavit that's done in the name of the person who had the ticket.
00:42:56.200And then we had to get those sent off to the court, which is what we did.
00:42:58.200And then we had to wait for a decision to see if those tickets could be reopened because they're very old, two and a half years old.
00:43:03.200So the Amish weren't there to fight it?
00:43:06.200They weren't. They were convicted in absentia. They weren't there.
00:43:11.200Yeah, that's right. So that's a problem.
00:43:14.200It's our position that they didn't have full information about those tickets.
00:43:18.200And on that basis, we sought to get them reopened.
00:43:20.200So the good news is that they went in front of a Justice of the Peace and the JP allowed us to reopen the tickets.
00:43:27.200So now we're at the starting line. Now we get a court date and we get a chance to talk to the Crown and hopefully convince them to withdraw or stay the tickets.
00:43:37.200So that's where we are. So the first step is done.
00:43:40.200Then we got to do the second step, which is where the hard work comes in, convincing the court or the Crown to stay or withdraw the tickets.
00:43:46.200But there's one more wrinkle to this. And this is how these Amish folks discovered what these tickets really meant, because they got this ticket.
00:43:53.200They didn't understand what it was for. And they went about their life farming in their old fashioned ways.
00:43:59.200Until one of the lads went to the bank to say, I'd like to get a loan to buy some livestock.
00:44:06.200And the banker who is used to dealing with the Amish typed it in and said, oh, sorry, you have a lien against your property.
00:44:13.200The government has put an encumbrance on your land. We cannot lend against it until you deal with this lien.
00:44:20.200So other Amish checked, and a bunch of them had the same thing. So it's not just the ticket.
00:44:25.200It's that the government has taken a sort of collections step.
00:44:28.200Theoretically, God forbid, may it never happen.
00:44:31.200They could force the sale of a farm just to get their COVID fines.
00:44:36.200So you don't just have to get the ticket withdrawn. You've got to get that lien off the property.
00:44:42.200Right. That's the real danger here, Ezra, because as you say, the family farm is really the only asset that these Amish people have.
00:44:52.200And it's passed down from father to son. And so that's in jeopardy now.
00:44:57.200They can't get a loan. Their credit's affected. They can't transfer that property because of the lien.
00:45:03.200And we understand one individual actually had to sell their property to satisfy the lien.
00:45:33.200Speaking of clients, it's tough. I mean, Rebel News has some lawyers that I've never met in person.
00:45:39.200And I feel like I know them, though, because I see them on a Zoom call. I talk to them on the phone.
00:45:44.200I email back and forth. If I were to meet them, I would actually feel like I know them.
00:45:48.200You can in our high tech world get to know people without seeing them in person.
00:45:53.200But the Amish, they don't use Zoom. They don't use email. They don't use FaceTime or Skype calls or whatever.
00:45:58.200So to get a meeting together of these folks, and, as a lawyer, to be briefed by them, to get them to agree to be a client and sign the paperwork, that's a hassle.
00:46:11.200How do you gather together a bunch of farmers who don't have phone, email, fax, whatever, get them together, explain what's going on and get them to sign a retainer for free?
00:46:23.200Of course, Rebel News helps crowdfund through the Democracy Fund, like just getting these folks together.
00:46:29.200How many people does the Democracy Fund now represent?
00:46:34.200So it's been interesting because we have to physically go out, meet the elders.
00:46:39.200And then our job as lawyers, obviously, is give advice and receive instructions.
00:46:43.200So we have to make sure that the clients in any retainer understand their situation, their legal situation, so that they can give us proper instructions.
00:46:52.200And they don't often have the concepts needed to express themselves to give us coherent instructions.
00:46:59.200So we really have to break it down in simple terms like a trial.
00:47:04.200They often don't understand what a trial is because their biblical beliefs deal with things in a non-adversarial way.
00:48:59.200Look, dealing with them has been interesting because they seem very innocent.
00:49:05.200They're knowledgeable about their own world.
00:49:07.200I overheard a conversation between two of the men and they had multiple ways of describing a broken wheel on a cart because that needed to be repaired.
00:49:17.200So they understood that intimately, but they don't understand any modern concepts in the legal system because they just don't interact with it.
00:49:25.200So, yeah, it renders them very innocent.
00:51:49.200And to the extent that there are legal gaps in child protection, we don't have a problem with that.
00:51:54.200We think there's strong existing laws to protect children.
00:51:58.200But to the extent that there's not, those parts of the bill dealing with child protection and other sexual offenses, they should be severed off the bill, debated, and then passed into law.
00:52:17.200It amends Section 13 of the Canadian Human Rights Act.
00:52:21.200So it reintroduces Section 13, which was repealed in 2014 by the last government.
00:52:28.200So it reintroduces Section 13, the hate speech provision of the Canadian Human Rights Act.
00:52:33.200The second thing it does is that it amends the criminal code to add severe penalties and a standalone hate-motivated offense.
00:52:42.200And it introduces a new peace bond we can talk about.
00:52:45.200And the third thing it does, it creates a digital safety commission to regulate, surveil, and police online speech.
00:52:52.200So those are the three things it does.
00:52:55.200I remember reading the bill when it came out.
00:52:58.200And I think the majority of the bill has nothing to do with online censorship.
00:53:03.200For example, there's a provision to ban revenge pornography, which is if you took a video of your ex and you're going to upload it as revenge.
00:53:11.200Well, yeah, I think everyone's against that, including Parliament, which banned it in 2014.
00:53:17.200I mean, I think it's 10 years in prison. So they have a lot of things in there that I think a lot of people would agree with, but it's already in force.
00:53:25.200Many of them, for example, there's a requirement that Twitter and other social media have a block button.
00:53:30.200All right. Well, it already does.
00:53:33.200In fact, to sell anything on the app store, you have to have one.
00:53:36.200So I think there's a lot of things in this bill that the government's emphasizing that everyone would agree with.
00:53:42.200It's these censorship provisions that are sort of stowaways that they're sneaking in.
00:53:48.200So if you dare object to the bill, they say, oh, you're for child pornography.
00:53:51.200No, let's ban that. And actually, child pornography has been banned for decades.
00:53:56.200Don't try and call my political speech that. Don't sneak it into the same bill.
00:54:01.200I think you're right. It's got to be split apart, but it won't be, because they want it to be muddled.
00:54:07.200Yeah, absolutely. I mean, that's the tell, right? They refuse to do the rational thing, which is separate off the non-controversial parts of the bill that everyone can agree on.
00:54:16.200They want it all. They want it to go in all at once.
00:54:21.200And I think that's indicative of their position that they really want to hammer dissent online.
00:54:26.200You mentioned it. I talk a lot about the Human Rights Commission part because I was hit by the Human Rights Commission a dozen years ago or more.
00:54:38.200That was actually part of the campaign to have that section repealed by Stephen Harper 10 years ago.
00:54:45.200But there is that new phenomenon of the Digital Safety Commissioner.
00:54:55.200In fact, I think there are three new positions that are created by the bill.
00:55:04.200And each of those positions is going to have a staff, and the Human Rights Commission is going to need staff to investigate. This will create a literal industry.
00:55:11.200I can't even remember what the three different digital censors are, but it's like one isn't enough.
00:55:13.200Two isn't enough. They're going for three, aren't they? It's weird.
00:55:13.200Yeah, they've got different layers of bureaucracy, the Digital Safety Commission, which is going to police the online harms.
00:55:20.200And they've got a digital ombudsman. And I think they have a digital safety office.
00:55:25.200So there's three different offices involved. It's just a massive new bureaucracy that's going to be created.
00:55:30.200I have never encountered a real person in real life who says, you know what I need in my life?
00:55:36.200I need someone to tell me what I can or can't say.
00:55:39.200And I know some people don't like Twitter or social media because it gives them bad vibes.
00:55:45.200OK, well, use that block button or use the mute button or lock down your account.
00:55:50.200Like there are so many tools that a user has if they're shy, if they're introverted, if they're private, if they don't want to.
00:55:58.200You can mute certain words like you can put yourself in a bubble wrap cocoon on any social media app.
00:56:05.200And I know this because otherwise you wouldn't be able to sell it on the app store or the Android store.
00:56:11.200And I've never heard a real person say, I want someone else to make those decisions for me.
00:56:17.200I hear people say he should be banned or she should be banned.
00:56:20.200But I've never heard anyone say, I want someone to be the decider for myself.
00:56:24.200I'd like to delegate my political decisions to the government.
00:56:28.200I think it's a self-serving thing by a government that wants to silence critics and by an industry that's looking for a perpetual money making scheme.
00:56:37.200Yeah, look, the government's position is that there's seven categories of online harm.
00:56:42.200And the four that we have no problem with are NCDII, the non-consensual distribution of intimate images; CSAM, child sexual abuse material; content that induces a child to self-harm; and content used to bully a child.
00:56:55.200But those four are pretty well protected in the criminal law.
00:56:59.200I couldn't imagine a single person opposing those.
00:57:03.200So those are the four categories that aren't that controversial, and I think they're covered mostly by existing laws.
00:57:09.200But there are three others: content that foments hatred, content that is violent extremism or terrorism, and content that incites violence.
00:57:19.200So those are the three other types of online harm.
00:57:21.200And they're very ill defined, which obviously leads to overbroad application.
00:57:25.200So that's really where the rubber hits the road.
00:57:28.200And we think that the way the government has defined those terms is going to lead to abuse.
00:57:34.200I believe that, reading the Human Rights Commission part, the Section 13 part, you can make a complaint against someone who has published something likely to cause detestation or vilification.
00:57:51.200Those are their words. The old law used to be likely to expose a person to hatred or contempt.
00:58:05.200You say something that is likely to maybe cause someone to have hard feelings about him.
00:58:11.200I think that law is tailor-made to go after Rebel News because it's so vague, because everyone is guilty of likely causing hurt feelings at some point in their life.
00:58:28.200It's: did you do something likely to cause hard feelings?
00:58:32.200I feel that they're going to come for Rebel News pretty much right out of the gates.
00:58:37.200Other than being hit with a complaint, do you see any avenue by which Rebel News can go out there and fight this law?
00:58:46.200Obviously, we can't fight it until it's actually enacted.
00:58:50.200Like, you can't challenge a law that's not on the books.
00:58:54.200If this law passes as it is, how would Rebel News fight other than being victimized and fighting back?
00:59:01.200Is there any way we can get before the courts other than being a victim of this law?
00:59:05.200Well, I mean, a lot of the law is going to be buried in the regulations made by the Digital Safety Commission, and those regulations haven't been written yet.
00:59:15.200The way it will happen is, say you get a notice that your video contravened or comprised one of these online harms.
00:59:25.200And then if you object, presumably you go before the commission.
00:59:29.200So it's a regulatory commission.
00:59:48.200And again, the process is the punishment.
00:59:50.200So in the meantime, no digital platform is going to risk losing six or eight percent of their global revenue, up to twenty-five million dollars or more, on the chance that they're going to be vindicated.
01:00:03.200They're just going to pull it down and then you have to fight it before an administrative tribunal, which could take years.
01:00:08.200And if you want to review this, that's another couple of years.
01:00:10.200So they're just going to cave and it's going to be difficult.
01:00:13.200I'm not going to, you know, sugarcoat it.
01:00:14.200It's going to be very difficult for Rebel News or any other dissenting news organization to fight this.
01:00:21.200It's not just fines for users like us.
01:00:23.200The platforms are on the hook for, I think you mentioned it, eight percent of their global revenue.
01:00:29.200So Canada is saying if Twitter doesn't follow the rules, they have to pay a fine of eight percent of all the money they make in the world, not just in Canada.
01:01:16.200Lug says, blonde muscle guy knows what he's talking about.
01:01:19.200And the ponytail guy with the hat at the end hit the nail on the head.
01:01:22.200You know, they were such interesting characters.
01:01:24.200I was just saying to my family that when you go to a place like Venice Beach, and I was just there on my way to James O'Keefe's movie premiere, you go to an interesting place as a tourist.
01:01:37.200And you might chat with a few people, you know, if you bump into them, say a few words here or there, chat with a waiter or waitress or something.
01:01:44.200But if you go to a place like Venice Beach, you're not going to talk to 30 people.
01:03:20.200I got to say, not everyone likes Donald Trump, but whenever I press them, it's largely for personality reasons or aesthetic reasons.
01:03:28.200You know, if you are a true left wing liberal, and I met a couple of them in California, of course you're going to be for Kamala Harris.
01:03:35.200You'd be for any Democrat over any Republican.
01:03:38.200But the chief opposition to Trump, I find in real life, is from people who just don't like his class, his style, his banter, his aesthetic, the meanness they see in him.
01:03:48.200I tell you one thing, the world could use a few more mean tweets if that meant we had a strong hand on the tiller.
01:03:55.200I think that countless lives have been lost over the last four years.
01:03:58.200Do you agree with me that Russia would not have invaded Ukraine had Trump been reelected?
01:04:25.200And China's moves to push Japan, Korea, Vietnam, Philippines?
01:04:31.200I think that because people didn't like his mean tweets, and I think there was some tilting of the playing field in the last election with mail-in ballots in particular, I think literally millions of lives were lost because of that American choice.
01:04:46.200And, of course, I want Pierre Poilievre to beat Justin Trudeau, and I believe that will happen.
01:04:51.200And that will have a big effect on our lives in Canada, but perhaps an even bigger effect on the world will be what happens in America in less than 30 days.
01:05:00.200That's why I'd like to encourage you to watch our new reality show that we're rolling out with Avi Yamini.
01:05:08.200Did you see that? Avi has come from Melbourne, Australia to San Francisco.
01:05:12.200That's why I was going down there, to meet Avi.
01:05:15.200And for the next month, he's going to be crossing the United States in an RV with our driver, Lyndon, and our videographer, my buddy Lincoln.
01:05:23.200So the three of them are going to be in this RV.
01:05:25.200They're going to sleep in the RV and cook in the RV.
01:05:28.200And go from town to town, sort of a reality show, doing news and politics and interviewing people and streeters,
01:05:36.200making their way from San Francisco all through America, and then winding up in Miami in the end.
01:05:41.200So I'm excited about that. You can follow it at Avi Across America.
01:05:44.200And let me end with a little clip that Avi made just for that purpose.
01:05:48.200All right, everybody. Have a great weekend.
01:05:51.200We'll see you on Monday. Happy Thanksgiving.
01:05:53.200And, um, you know what they say. Keep fighting for freedom.
01:05:57.200So with everything going on in this crazy city, at least we could see they got their priorities right by painting the crosswalk in the colors of the transgender flag.
01:06:08.200We're in the heart of the San Francisco neighborhood called the Tenderloin. There are drug addicts lying in the streets. There is crime so pervasive the police don't even respond to it. We spoke to a cop who said they're 600 police officers short.
01:06:23.200Everything is dilapidated. Infrastructure is crumbling. But the public policy priority for this city, which has had a Democrat mayor for 60 years, is to have a whole team put down transgender crosswalks.
01:06:39.200If you want to imagine what America will look like under Kamala Harris, look at what her hometown looks like. This is the priority in San Francisco.