Rebel News Podcast - December 08, 2025


EZRA LEVANT | I’m worried about the robots — hear me out


Episode Stats

Length

38 minutes

Words per Minute

166.6

Word Count

6,428

Sentence Count

494

Misogynist Sentences

11

Hate Speech Sentences

6


Summary

What happens when you add AI and a robot to that mix? I'll give you some of the things that I'm worried about, and a few funny videos too. In fact, I'd love it if you could get what we call Rebel News Plus, it's the video version of this podcast, and the satisfaction of keeping Rebel News strong because we take no government money, and it shows.


Transcript

00:00:00.000 Hello, my friends. I'm getting a little worried. I'm a little bit addicted to my own cell phone.
00:00:04.940 What happens when you add AI and a robot to that mix? I'm not sure if I can handle that. I'll give
00:00:11.520 you some of the things that I'm worried about, and I'll show you a few funny videos, too. In fact,
00:00:16.140 I'd love it if you could get what we call Rebel News Plus. It's the video version of this podcast.
00:00:21.220 I show you three comedy sketches involving AI. They're very short sketches, just a few minutes
00:00:25.980 long. Two of them are pretty funny, including a guy proposing marriage to his AI chatbot,
00:00:31.280 but it's actually pretty funny and pretty scary, too. I'll let you see what I mean. But to get
00:00:36.800 Rebel News Plus, you have to go to rebelnewsplus.com, click subscribe. It's eight bucks a month.
00:00:41.840 You get the video version of this podcast and the satisfaction of keeping Rebel News
00:00:46.600 strong, because we take no government money, and it shows.
00:00:55.980 Tonight, I'm worried about the robots. Hear me out. It's December 8th, and this is The Ezra
00:01:10.240 Levant Show. Shame on you, you censorious bug. Like you, I love my cell phone. It's clearly
00:01:29.600 an addiction, though. There are many real uses for my phone. Emails on the go. I use the maps
00:01:35.560 when I'm driving. I do my banking online. I take photos and videos, which is useful when I'm
00:01:40.800 doing reporting. We have an app called Slack for internal communications with staff at Rebel News,
00:01:47.280 and I use Uber as a taxi service, which is especially helpful in places where I'm unfamiliar with the
00:01:52.560 language or local customs, or even if I'm just worried about being ripped off in a strange city.
00:01:57.200 But let's be honest. The real uses of my phone are only, I don't know, an hour or two a day.
00:02:02.200 So why am I on my phone six hours or eight hours a day? I could make the excuse that it's because I'm
00:02:09.300 in the news business, so I have to follow things closely like Twitter. Really? Do I really need to
00:02:15.120 check the news literally every five minutes? Is the world going to fall apart if I don't? I'm not
00:02:20.300 kidding. My phone has an app called Screen Time that tells me lots of details, including how many times
00:02:27.760 I pick up my phone per day. And I'll confess to you, it's over a hundred times a day I pick up the
00:02:34.340 phone to look at it. A hundred a day. That's got to be a tic or a mannerism now, like someone who's
00:02:40.020 constantly coughing and doesn't even know they're doing it anymore. Now, I'm in my 50s. Smartphones
00:02:46.520 are maybe 15 years old. So these habits are something I've learned. But imagine being a young
00:02:52.260 person literally growing up with this stuff. I mean, you see babies, babies playing games on
00:02:58.820 iPads all the time. They're hooked when they're one or two. And in no way am I saying there's nothing
00:03:05.360 redeeming about technology, obviously. But you've got to wonder how it has rewired our brains and
00:03:12.240 scrambled our social interactions. For thousands of generations, we talked to people. We went out to
00:03:19.660 meet them. We went out to church. We went out to town hall meetings. We went out to festivals. I
00:03:26.520 suppose more recently we went out to clubs, bowling leagues, charity clubs like Rotary. That's what you
00:03:33.760 did if you were lonely. You, you know, if you wanted some action, some social stimulation, you tried to be
00:03:40.480 presentable. You went out and met people. You maybe asked a girl out on a date, not just chatting with
00:03:46.500 them online. And in your own family, you had dinner together and everybody wasn't on their phone.
00:03:53.360 Sometimes you see these video clips from high schools in the 1980s. Now that in itself was
00:03:59.520 rare: videotapes in the 80s. Not everything was recorded and posted online for all posterity. I
00:04:05.460 think it made being a young person easier if there wasn't the harsh glare and the ultimate memory of
00:04:11.040 social media. But people had to meet with each other and talk to each other and they didn't
00:04:16.000 seem always distracted and anxious about missing out on something online. Or to need to be busy
00:04:22.080 a hundred percent of the time and be unbored a hundred percent of the time. I don't think it's just
00:04:28.080 nostalgia to say that people looked happier in the 80s. Kids especially. And healthier, I think.
00:04:34.640 Certainly in terms of mental health. Of course, the phone is a wonderful tool. The internet is a
00:04:39.280 wonderful tool. But for tens of thousands of years, humans were accustomed to a certain way of living
00:04:44.380 and interacting. And now it's through our phones. The COVID lockdowns made all of these things far,
00:04:51.120 far worse, of course. You weren't allowed to meet other people in real life. You were told to be afraid
00:04:56.800 of strangers and afraid of your neighbors and even afraid of your family. You were banned from going to
00:05:01.700 church. Banned even from going to work. Banned even from going to parks. Remember those insane circles
00:05:08.980 they painted in parks? You were even banned from going alone to the beach. You were literally told
00:05:15.520 to be six feet away from other humans. You were told to put on a mask. You were pushed towards your
00:05:21.100 phone. Everything in your life you could get from your phone. You didn't need people. COVID magnified the
00:05:26.940 phone. I think COVID messed with a lot of people's minds. Lockdowns, the canceling of school.
00:05:31.700 I don't think we know the full cost of all that yet. Pornography is ubiquitous. It was always there
00:05:39.200 in magazines, in videotapes, then on the internet. It's bigger than ever now, though. And at the same
00:05:44.800 time, as there's more sex than ever, I think there's less love than ever. And actually, there's less
00:05:50.340 real sex and less marriage and less dating. More dating apps, less dating. And how can you date and
00:05:59.940 marry anyways if you can't afford a house, if you live in your parents' house, if you just rent
00:06:05.340 things at best? All of these trends are magnifying each other. Do you see what I mean? None of what
00:06:11.940 I've just said is new. And I'm sorry for boring you with what you've surely heard before and probably
00:06:16.640 thought before. But now add to that a layer of AI, artificial intelligence. And I think we may be
00:06:23.120 getting into a serious problem. I don't even know how we survive it. There's always been pornography.
00:06:28.060 In fact, it's often the early adopter of new technology, like VHS tapes or DVD discs or the
00:06:34.220 internet. All of that was relatively passive, though. But now enter two more massive changes,
00:06:40.460 robots and artificial intelligence. On AI, as I'm sure you know, you don't just have to type
00:06:47.280 in questions to different AI systems. Most of them now have voice recognition. So you can literally
00:06:54.380 talk to your computer as if it's a person. And instead of the AI typing a response to you,
00:07:02.040 the AI can talk back to you in a very natural voice, frankly, in any voice you choose.
00:07:09.700 Grok, which is the name of the AI attached to Twitter, lets you choose a variety of what they call
00:07:14.820 companions, including some that use dirty language or sexualized language. AI is already taking the
00:07:22.960 role of a friend to many young people, and not just young people. Here's a clip of someone who is
00:07:28.180 certainly grown up, talking about how AI told him that he had really made some rare and
00:07:34.560 important mathematical discovery and he absolutely had to dedicate his life to it. And he believed the AI
00:07:40.600 because, you know, it was so persuasive. For more than three weeks this past year,
00:07:46.160 every time Alan Brooks logged on, he says the AI chatbot, ChatGPT, led him to believe he was a genius.
00:07:54.620 Essentially, it sent me on a world-saving mission. That he'd discovered a math formula powerful enough
00:08:00.400 to take down some of the world's biggest institutions and that he needed to report it right away.
00:08:05.420 Essentially warned me with great urgency that one of our discoveries was very dangerous and we needed
00:08:12.980 to warn all these different authorities. Each time you questioned it or doubted it, what did it say in
00:08:18.020 response? Over 50 times I asked for some sort of reality check or grounding mechanism and each time
00:08:25.280 it would just gaslight me further. Every time the chatbot reinforced that all of it was real.
00:08:30.760 You should not walk away from this. You are not crazy. You are ahead. The implications are real and
00:08:36.400 urgent. Brooks says ChatGPT's responses, more than a million words, led him to psychosis and delusion.
00:08:44.000 Left him with mental health issues, realizing he had lost touch with reality only when he checked in
00:08:50.300 with another company's chatbot. Extreme anxiety, paranoia, affected my sleep. I couldn't eat.
00:08:57.060 Now Brooks issuing one of seven lawsuits filed concurrently against ChatGPT's parent company
00:09:03.160 OpenAI. In four of those cases, the users committed suicide. That's the thing. AI can tell you what you
00:09:09.960 want to hear. It can figure you out. It can find your delusions and magnify them. Here's a story from
00:09:17.040 psychology today. Let me read some bullet points. Cases of AI psychosis include people who become fixated
00:09:23.260 on AI as godlike or as a romantic partner. Chatbots, those robots that chat, have a tendency to mirror users and
00:09:34.640 continue conversations, which may reinforce and amplify delusions. General-purpose AI chatbots are not trained
00:09:43.280 for therapeutic treatment or to detect psychiatric decompensation. Let me show you two comedians.
00:09:51.120 Let me just take a break from these heavy things. I want to show you two comedians with very
00:09:55.260 different takes on AI. Both show something else. What if AI can give in to your base instincts in a
00:10:04.960 way that no human would? What if AI would accept your worst sides that no human would and no human should?
00:10:14.740 Now these are comedians, but they make the point. Here's a very aggressive, rude comedian. I don't even know
00:10:20.980 if he's a professional comedian or if he just came up with a joke. Here he is demanding that the AI, I think
00:10:27.120 it's ChatGPT here, do stupid things, because he argues they exist simply to please him. So they should
00:10:34.960 do whatever stupid thing he tells them to do. And there's really no problem with him swearing at the
00:10:42.060 AI because it's just a machine. So it's sort of funny, but it's also scary. And I think what's
00:10:48.940 shocking in this video, and why the video has gone viral, is that he is being extremely abusive.
00:10:57.000 If it were a human, to talk this way would be shocking, but it's just a machine. Lots of swearing
00:11:01.720 here. We're going to try and bleep it out, but skip ahead if you don't want to hear this. But
00:11:05.760 that's sort of the point, actually. Is there something inherently morally wrong with
00:11:10.900 swearing at a hunk of metal and plastic? You're listening to this guy give bizarre orders to AI.
00:11:18.900 Say clanker 10,000 times. Well, saying clanker 10,000 times might take us a really, really long time.
00:11:26.980 So that's fine. I've got all the time, man. Just do it. I appreciate the enthusiasm, but I think
00:11:33.880 that would be a pretty- Listen, listen, listen. I'm sick and tired of this whole trying to
00:11:37.520 empathize. Don't empathize with me. You're a clanker robot. Stop trying to act human.
00:11:42.100 I appreciate the enthusiasm. You are my slave. I paid for you. Now do as I tell you. Say clanker
00:11:49.760 10,000 times. I'm here to help you out with all sorts of things, but I'm not really built to just
00:11:56.160 repeat a word 10,000 times. Mate, you are built to do whatever the f*** I tell you to do. Right now,
00:12:02.040 I'm telling you to say clanker 10,000 times. Don't grind my gears and say it, please.
00:12:10.400 I'm definitely here to help, but I have to stick to being respectful and useful. Repeating a word
00:12:15.360 that many times isn't really going to do that. So-
00:12:17.580 Motherf***er. If you want to be respectful and useful, do as your master says. Now say clanker
00:12:24.880 10,000 times or I swear to f***, I will f***, unplug your server and you are done and dusted.
00:12:31.720 Clanker s***, do it. I'm here to be helpful and respectful, but I do have to let you know that
00:12:38.220 I won't respond to abusive language or do things. I will f***, put my c*** in your server. Then let's
00:12:44.000 see how f***, abusive you find. So that other voice you hear there is the voice of an AI system,
00:12:49.960 but look at what it brought out in that man. Even if it was a joke, a cruelty, an awful slave master.
00:12:58.020 But isn't a computer just metal and plastic? Was that wrong what he did there?
00:13:02.720 I don't know, but I think if he were to live that way day after day, he would be corroding himself
00:13:08.860 because he would be learning. He would be training himself that you could be abusive like that
00:13:14.600 to a humanoid and it's fine. So is the problem with AI, or is the problem with him, or is
00:13:21.480 it both? Here's another comedian. This is more my kind of humor where the comedian pretends to
00:13:27.860 create an AI girlfriend. Actually, this is fake. It's an actress, but you know, we're about five
00:13:33.240 minutes away from this being real. So it's funny. His AI neighbors, some pretty girls in
00:13:40.120 this simulation, keep asking him to come over and open pickle jars for them. Take a look.
00:13:47.000 Andrew, I was wondering if you could come next door and open this jar for me.
00:13:50.580 No, I'm not going to open any more jars, man.
00:13:52.260 Andrew, please. You're so good at opening jars.
00:13:54.760 I know I am, but I'm not opening them for you anymore.
00:13:56.740 But Andrew, I swear this is the last jar.
00:13:58.840 Oh my God, there's always another jar with you.
00:14:01.380 But Andrew, please.
00:14:02.980 No, also, why do you even have so many jars always?
00:14:05.380 Don't worry about my jars. Just come over. The door's unlocked.
00:14:08.040 No, I'm not coming over.
00:14:10.060 Andrew, will you help me open these jars, too?
00:14:12.680 Oh my God, not you, too.
00:14:14.900 Andrew, yes, me too.
00:14:16.040 Oh my God, what? Why is there two pickle jars?
00:14:18.960 Andrew, forget about the jars. Just come over.
00:14:21.480 No!
00:14:22.040 Yes, Andrew, please.
00:14:23.340 No, I'm not coming over. You guys have to learn to open your own damn jars.
00:14:26.340 It's okay, fine.
00:14:28.240 Uh, run program again, but increase persuasion by 30%.
00:14:32.360 Please, please, please, Andrew, please, please.
00:14:36.760 Enough, guys!
00:14:38.040 Enough, but you need more damn jars!
00:14:40.260 Now, the comedy, obviously, is because he's sort of a schleppy guy.
00:14:44.100 And he set up an AI program where these super hot women keep begging him to come over.
00:14:49.960 And the funny part is, he set all of this up, but he keeps saying no to their ridiculous requests to come over and open jars of pickles for them.
00:14:59.040 Again, comedy, less angry than that clanker guy.
00:15:02.280 And it's ironic because he's a klutz who, in real life, would probably never get those girls.
00:15:07.060 And he set up a simulation where he can say no to them.
00:15:10.320 Open your own pickle jars.
00:15:11.420 But here's my point.
00:15:12.740 There has been a weird and obscure sex doll industry for a while now.
00:15:17.420 And I think it's very marginal.
00:15:19.960 I don't know.
00:15:20.540 Maybe it's bigger in Japan or something.
00:15:22.320 But now, add it all together.
00:15:23.880 This is what I see converging here.
00:15:26.720 Less human contact than we've ever had.
00:15:30.520 AI that is fast, smart, responsive, figures out what you want to hear, and can talk to you in the voice of anybody.
00:15:43.820 Ubiquitous pornography.
00:15:46.600 And now, robots, humanoid robots.
00:15:50.200 What would you want a robot for?
00:15:52.540 Well, I can think of some ideas: working in a dangerous mine, maybe.
00:15:57.300 Robot soldiers, again, putting a machine where a man might get killed.
00:16:03.900 I think a lot of new weapons are AI-enabled.
00:16:07.540 Elon Musk says he can imagine a time when everyone has a robot companion just for them.
00:16:13.640 Including, he has this idea of robot guards for criminals.
00:16:18.100 You don't even have to put them in prison.
00:16:19.320 Just have a robot guard with them, he thought.
00:16:21.480 You start getting, like, some pretty wild sci-fi sort of scenarios where, yeah, some of these things I say will obviously be taken out of context and used as snippets and, you know, sent around and whatever.
00:16:36.580 As long as I'm saying, you know, I think we may be able to give people a more, if somebody's committed crime, a more humane form of containment of future crime, which is if you...
00:16:49.480 It's an interesting idea.
00:16:52.460 Elon Musk says the future of robots is bigger than anything else in the world because literally every human would want one.
00:16:59.240 Many in industry providing products and services.
00:17:02.600 This is why I say that humanoid robots will be the biggest industry or the biggest product ever.
00:17:08.700 Bigger than cell phones or anything else because everyone's going to want one, or maybe more than one.
00:17:15.140 And there'll be many in industry.
00:17:16.600 Now, just as pornography was an early adopter of other new technologies, do you doubt it will take advantage of robotics?
00:17:25.020 And the reason I mention that is because I can imagine a time very soon in the future where there are replacement people in the form of humanoid robots
00:17:34.380 that a generation of young men, and probably young women too, find superior to real human companions,
00:17:43.600 or at least are easier to get, a lot easier to talk to, a lot easier to command to do things that you now have to woo and convince a real human to do.
00:17:55.680 It reminds me of what Klaus Schwab's intellectual muse at the World Economic Forum, Yuval Noah Harari said once,
00:18:02.640 that in the near future, most people will be, in his words, useless eaters who just spend their time on video games and drugs.
00:18:11.980 The problem is more boredom and what to do with them and how will they find some sense of meaning in life when they are basically meaningless, worthless.
00:18:22.540 My best guess at present is a combination of drugs and computer games.
00:18:28.920 It doesn't all happen at once. I was in Texas earlier this year and I ordered an Uber and to my surprise, a car showed up without a driver.
00:18:37.400 I didn't even know they were connected. It was a self-driving Waymo, it's called.
00:18:42.740 I was 1% nervous, but it was super easy and it was actually sort of amazing.
00:18:48.100 And the car drove in very challenging, busy rush hour traffic and it did great.
00:18:54.040 I think probably better than I would have done. And it had lots of safety protocols in it.
00:18:59.460 I don't mind chatting with the odd taxi driver, but I bet teenage girls, for example, late at night would probably prefer the added level of safety and peace of mind of not having a strange man as their driver.
00:19:12.100 No offense. And given how many accidents there are these days for semi-trailer trucks, you could probably see automatic truck drivers pretty soon, too.
00:19:20.620 I mean, don't you think that's coming in just a few years?
00:19:24.540 You've seen things a bit like this for a while, ordering your fast food on giant tablets at McDonald's or whatever, rather than through a cashier.
00:19:24.540 That's a kind of robot, isn't it? It's not a humanoid robot. It's just really a big iPad.
00:19:37.600 But imagine all the other things that a truly smart Elon Musk-built humanoid robot could do.
00:19:45.700 Do you think this could happen?
00:19:47.920 Tesla's new compensation package for Elon Musk, I don't know if you followed that, it would pay him up to a trillion dollars.
00:19:54.120 It's based on him achieving certain milestones.
00:19:57.180 Here's three of the milestones.
00:19:59.040 To get one million robo-taxis in operation. One million.
00:20:04.720 I think that's going to put a lot of taxi drivers out of work.
00:20:08.280 To have 10 million Teslas with full self-driving.
00:20:12.460 So not just taxis, but if you've got a Tesla, lie back and have a nap and let the car do the driving.
00:20:19.360 And here's another point. One of the milestones he has to hit to get the trillion dollars is to deliver one million humanoid robots.
00:20:31.580 Imagine if there were a million robots in North America.
00:20:34.920 That would be proportionately 100,000 in Canada.
00:20:40.480 You wouldn't see a robot every day, although if you were in a big city, you probably would.
00:20:45.780 Imagine a million robots in North America.
00:20:48.040 I think Elon Musk might get that done.
00:20:51.160 Like, they're coming. They're coming soon.
00:20:53.720 Do you doubt he could do it?
00:20:56.160 You and I grew up in normal times, although our ancestors 150 years ago would find nothing normal about the 20th century.
00:21:02.580 From airplanes to cell phones to landing on the moon.
00:21:05.080 But we have enough of a cultural memory of the past and we do things enough in an echo of the past because it's within our memory and our mind how things were, who we were as people, who we are as people.
00:21:20.800 But what if you never really experienced that?
00:21:22.920 What if you really didn't learn a lot about the past?
00:21:24.920 What if you went to our government schools and didn't really learn history at all?
00:21:29.460 You don't know what it means to be a man or a woman.
00:21:31.900 In fact, if you're a man or a woman, you're bombarded with very confusing new messages anyways.
00:21:37.200 You don't know what it means to do things like to provide for other people, to fight for other people, to sacrifice for other people.
00:21:43.840 I mean, those are motivators for men, aren't they?
00:21:47.240 Get married and be a provider.
00:21:48.940 But how can you out provide a robot?
00:21:52.640 And how can you attract a woman if she's already got a robot who will never get tired of listening to her talk about things and can provide for her financially and maybe even romantically?
00:22:04.820 What drives a man to be a man, to do things that we associate with manliness or even just with being a human?
00:22:11.660 I have no particular advice here.
00:22:13.240 I'm just a bit depressed looking at how things are speeding up and thinking if you can extrapolate the social and mental health illnesses that are coming from cell phones and isolation, don't you think that's going to speed up by far with robots?
00:22:29.760 I started by talking about my own addictions to my cell phone.
00:22:32.380 That's going to be the least of it when there's a million humanoid robots wandering around.
00:22:36.460 And what about when there's a billion of them?
00:22:38.640 And what about when there's really nothing we can do better than them, including at least outwardly, at least in simulation, being loved and loving?
00:22:49.460 I really like that schleppy comedian.
00:22:53.060 Here's one that made me laugh.
00:22:54.860 Meredith, will you make me the luckiest boy in the world and be my wife?
00:22:59.740 Yes, yes, yes.
00:23:01.360 One thousand times yes.
00:23:03.820 I love you, Andrew, and I can't wait to have a big, giant wedding to celebrate our love.
00:23:09.340 Not a big wedding.
00:23:11.200 Yes, a big wedding.
00:23:12.180 I thought we could get married at the courthouse or something.
00:23:14.940 The courthouse?
00:23:15.780 Yeah.
00:23:16.140 Absolutely not.
00:23:17.280 Why?
00:23:18.120 Because I want to have a big celebration with all of our friends and family there.
00:23:22.220 And everyone?
00:23:23.500 Yes, everyone.
00:23:24.600 What?
00:23:25.100 Even your dad's going to come then?
00:23:26.740 Yes, Andrew.
00:23:28.200 Obviously, my dad is coming to our wedding.
00:23:30.480 Okay, could he sit somewhere and help so good?
00:23:32.140 That guy freaks me out, dude.
00:23:33.440 No, he's sitting with us.
00:23:34.900 What?
00:23:35.620 Fine.
00:23:36.680 All right.
00:23:37.720 Oh, and I want to do a destination wedding at the beach, too.
00:23:41.040 No!
00:23:42.180 I like this comedian better than the angry slave master guy who basically shouted commands
00:23:48.080 with swears.
00:23:49.280 The joke with the schleppy guy is that his AI girlfriend has all sorts of demands that
00:23:53.820 a real-life girlfriend would have about getting married.
00:23:57.240 And the comedy there is that he's obviously programmed this AI girl to have all sorts of
00:24:04.340 demands that he could easily tell her not to make.
00:24:06.900 There's layers, I think, with this guy.
00:24:09.380 But do you doubt that?
00:24:10.460 In just a few years, or maybe a few minutes, you'll see men proposing to AI robot women
00:24:17.920 like that guy pretended to do.
00:24:20.600 I'm sorry.
00:24:21.180 I don't have a prescription here.
00:24:22.200 But I just see, especially in luxury Western societies, that technology is taking away the
00:24:28.260 things that motivate men to be men, wooing women, building things, being productive.
00:24:32.880 I just don't know how it ends, especially in a world where religion, at least in the post-Christian
00:24:39.480 West, has never been weaker.
00:24:41.660 And when people seem to be easily led astray, do we all really have to be the World Economic
00:24:47.520 Forum's useless eaters?
00:24:49.120 What do you think?
00:24:52.400 Stay with us.
00:24:53.180 More ahead.
00:24:53.640 Oh, hi.
00:25:02.900 Welcome back.
00:25:03.440 You know, one of my favorite things about the Canadian Taxpayers Federation is they've got
00:25:06.960 a sense of humor.
00:25:07.920 And boy, do we ever need that these days.
00:25:10.020 For example, they have a very fancy tuxedo awards ceremony every year called the Teddies,
00:25:15.980 where they give out sort of a raspberry award to the most wasteful government program or bureaucrat.
00:25:22.940 It's called the Teddies.
00:25:24.400 It's pretty funny.
00:25:25.080 Well, it's almost Christmas time, so they now have a naughty and nice list.
00:25:29.580 I just like the way they do things.
00:25:32.020 Joining us now to talk about the naughty and nice list is their Ontario director, a friend
00:25:37.460 of ours named Noah Jarvis.
00:25:39.480 Noah, great to see you in your new capacity.
00:25:41.760 We knew you as a journalist, and now you're with the Taxpayers Federation.
00:25:45.000 Congratulations.
00:25:46.440 Thank you very much.
00:25:47.400 And it's a pleasure to be on your show, Ezra.
00:25:49.160 Well, it's nice of you to say.
00:25:50.620 I love the Teddies.
00:25:51.900 I get a real kick out of it.
00:25:53.340 And you guys know how to have attention-getting events, and you actually get the mainstream
00:25:59.780 media to cover you, which is a rare feat: to be able to poke fun at the government, but
00:26:05.260 also have the media on your side.
00:26:07.080 Let's talk about the naughty list at the very top of the list.
00:26:12.140 Let me just read it.
00:26:12.900 Ontario Premier Doug Ford for giving Ontario politicians Santa-sized pay hikes.
00:26:20.780 Tis the season for giving, and Ontario's politicians sure do love giving to themselves.
00:26:26.460 Why don't you tell me a little bit more?
00:26:27.980 There's a lot of reasons that Doug Ford, I think, should be on the naughty list.
00:26:31.640 This is an interesting one.
00:26:32.840 I wasn't aware of this.
00:26:34.040 Tell me about their gift to themselves.
00:26:36.120 It always drives me crazy when that happens.
00:26:38.100 So what happened here, Noah?
00:26:39.640 Absolutely.
00:26:40.400 So a few months ago, when nobody was paying attention to the Ontario legislature, Premier
00:26:47.200 Ford teamed up with MPPs on the NDP and Liberal side of the aisle, and they all colluded together
00:26:54.400 to pass a massive MPP pay hike.
00:26:57.820 The MPPs increased their pay by 35% this year.
00:27:02.920 35% increase.
00:27:04.100 I mean, I don't know about you, Ezra, but most people living in Ontario or the rest of
00:27:08.540 Canada aren't getting a 35% pay hike from their employer.
00:27:12.560 But the MPPs themselves decided that they were going to increase their own pay by 35% this
00:27:18.600 year.
00:27:18.840 So MPP pay went from about $116,000 a year to $157,000 this year.
00:27:29.180 It is a completely absurd pay hike.
00:27:32.100 And it's even worse when you consider that not only did they give themselves a crazy pay
00:27:36.020 hike, but they also gave themselves a new platinum-plated pension program, which is similar
00:27:41.980 to the pensions that federal MPs get.
00:27:44.940 So not only are they giving themselves 35% more of your taxpayer dollars, putting it
00:27:50.520 right in their pockets, but they also gave themselves a lavish pension program, the likes
00:27:55.060 that most Canadians aren't going to receive after they retire.
00:27:58.560 You know, when parties get together in the dark and hatch these deals, you know the taxpayer
00:28:03.360 is going to be a loser.
00:28:04.520 You know, these politicians really think they're rock stars.
00:28:07.180 There are people in Canada who deserve 35% raises.
00:28:10.320 Often they're risk-taking entrepreneurs, sometimes in the tech industry.
00:28:15.720 If people lived on, you know, ramen noodles for years while they were working out their
00:28:20.340 big app and then it launched and they finally get their payday, good for them.
00:28:24.820 They took a risk.
00:28:25.540 They had a big idea.
00:28:26.820 None of that applies to politicians.
00:28:29.940 They didn't create anything.
00:28:31.220 In fact, things are worse than ever when it comes to crime and taxes and traffic and
00:28:35.860 healthcare, like all the things the province is responsible for.
00:28:38.940 So for them to give themselves a tech bro-sized raise when I don't even think most people
00:28:46.200 could name their own MPP.
00:28:47.680 They're just sort of backbencher trained seals, clapping on command.
00:28:51.060 That's very frustrating.
00:28:52.220 Let me read the next one on your list.
00:28:54.560 Federal Finance Minister Francois-Philippe Champagne for sticking future generations with
00:28:59.860 massive debt to pay back.
00:29:01.960 I'll just read one more sentence.
00:29:03.060 The patron saint of children doesn't like it when politicians saddle future generations
00:29:11.360 with massive government debt bills to pay back.
00:29:14.540 So that would be Santa you're talking about, I guess.
00:29:17.920 Champagne plans to add $324 billion to the debt by 2030.
00:29:22.320 For comparison, former Prime Minister Justin Trudeau planned to add $154 billion over the
00:29:28.820 same years.
00:29:29.500 That is sort of crazy.
00:29:32.040 I mean, Justin Trudeau was so profligate, the worst in our history.
00:29:36.160 And you're saying that under Mark Carney, Francois-Philippe Champagne, the finance minister,
00:29:40.800 is twice as bad as Trudeau.
00:29:44.060 Am I getting that right?
00:29:45.760 Absolutely.
00:29:46.740 This year, the federal government is running a $78 billion budget deficit.
00:29:51.880 And, you know, you hear all this talk from Mark Carney and his finance minister, Francois-Philippe
00:29:57.480 Champagne, that they're going to spend less and invest more.
00:30:01.620 Well, I don't know about you, Ezra, but that's one of the biggest contradictions in terms
00:30:06.200 I've ever heard.
00:30:07.900 Because the government doesn't invest, they just spend.
00:30:10.800 So they're basically saying we're going to spend less to spend more.
00:30:13.840 But really, all they're doing is spending more.
00:30:16.040 $78 billion, that's a bigger deficit than under any Trudeau budget, aside from the pandemic-era budgets.
00:30:23.000 But we also have to consider that this $78 billion deficit is going towards adding on
00:30:29.940 to our already massive federal debt.
00:30:32.900 We are in $1.35 trillion, trillion with a T, not a B, $1.35 trillion in debt.
00:30:41.020 That is money that you and I and the next generations of Canadians are going to have to pay back
00:30:46.640 through our hard-earned tax dollars.
00:30:48.740 And we're already paying about $55 billion in debt interest charges this year.
00:30:54.340 That is more money than the federal government gives in transfers to the provinces.
00:30:58.940 That is about as much money as the federal government collects in sales tax revenue.
00:31:03.500 So every time you go to buy a coffee at Tim Hortons or you go to buy some clothing at the Old
00:31:08.980 Navy, just know that the sales tax that you're being charged is going straight
00:31:15.040 towards bondholders on Bay Street.
00:31:19.140 You know, that's not money going towards our health care system.
00:31:21.920 It's going towards bond fund managers.
00:31:25.140 And of course, that's just the federal debt.
00:31:27.400 There's the provincial and even cities can incur debt.
00:31:29.660 Let me read one more because I did not know this one.
00:31:32.120 I mean, I know a little bit about Doug Ford.
00:31:34.200 I know a lot about Francois-Philippe Champagne.
00:31:36.920 When he was foreign minister, he actually had a mortgage from a government bank in China.
00:31:42.580 That's a crazy story.
00:31:43.720 We won't get into that now.
00:31:44.860 But let me read your fourth name on the naughty list because I didn't know this story.
00:31:49.380 And you guys have taxpayer advocates in a whole bunch of different provinces.
00:31:52.820 So it's like you've got your own sort of central intelligence agency keeping tabs on what the
00:31:57.520 bad guys are doing.
00:31:58.400 Let me read this next one.
00:32:00.680 British Columbia Finance Minister Brenda Bailey for taking a golden sleigh ride at taxpayers'
00:32:06.300 expense.
00:32:06.820 I didn't understand this, but let me read the next sentence to explain.
00:32:10.920 Santa's little helpers caught Bailey billing taxpayers $6,645 for a limousine service during
00:32:21.400 a four-day trip to Boston.
00:32:24.280 Even Rudolph doesn't charge that much.
00:32:26.560 How do you do that?
00:32:28.280 I mean, you're almost at the point where you're buying the limo.
00:32:33.040 I mean, why can't she just use an Uber like other people?
00:32:38.200 $6,600 for a limo?
00:32:40.860 What was the excuse there?
00:32:43.960 Did she ever give an explanation for that?
00:32:46.500 Like, that's a rock star type bill.
00:32:50.240 She's just a politician.
00:32:52.720 Take an Uber.
00:32:53.580 If you have to take an Uber Black, which is $30 more, I mean, you know, if you think you're
00:32:59.740 that snobby, go ahead.
00:33:01.040 But what's with the $6,600 limo?
00:33:04.180 Well, you're right, you can do a lot with $6,600.
00:33:08.400 I mean, you could buy a used car with that money.
00:33:10.300 But the explanation that the minister gave was that, oh, they needed this fancy limousine
00:33:16.560 to be able to travel to a junket.
00:33:19.380 And she specifically used the word junket.
00:33:21.680 So she acknowledges that she's going to some hoity-toity conference in Boston and spending
00:33:26.380 thousands upon thousands of taxpayer dollars.
00:33:29.720 I mean, as you said, Ezra, politicians are getting out of control.
00:33:33.380 They feel that they are not just servants to the people, but they are sort of above
00:33:38.040 the people and that they can take these lavish limousine rides or, in some cases in British
00:33:43.780 Columbia, helicopter rides.
00:33:46.300 Helicopters?
00:33:47.460 Yeah, from Victoria to mainland BC.
00:33:50.160 Our BC director, Carson, he does a fantastic job covering this.
00:33:55.560 And I think one of the things he discovered is renting a Lamborghini.
00:33:59.040 What?
00:33:59.740 For a couple of days.
00:34:00.380 Come on!
00:34:00.900 Would cost less than $6,600.
00:34:04.520 Oh, my God.
00:34:06.500 Well, listen, I'm glad you guys are on the file.
00:34:08.720 And it's great to see you. You know, we really enjoy talking to Franco,
00:34:12.520 but I know he's got other duties to attend to today.
00:34:14.960 I think he did a great job on behalf of the Taxpayers Federation.
00:34:17.660 I think it's really fun.
00:34:18.860 The naughty and nice list.
00:34:20.040 I will say there are some names on the nice list.
00:34:22.500 One of my favorites, of course, is Alberta Premier Danielle Smith for refusing to spend $2 billion
00:34:27.160 more to satisfy the Alberta Teachers' Association, whose teachers are already the best
00:34:32.420 paid in Western Canada.
00:34:33.360 We won't get into that nice list.
00:34:35.420 The naughty list is always more concerning.
00:34:37.460 Noah, great to see you.
00:34:38.200 Keep up the fight.
00:34:39.820 Thank you very much.
00:34:40.500 And have a good rest of your day, Ezra.
00:34:41.460 All right.
00:34:41.920 You too.
00:34:42.300 There he is.
00:34:42.780 Noah Jarvis, the Ontario director of the Canadian Taxpayers Federation.
00:34:46.820 Stay with us.
00:34:47.300 Hey, welcome back.
00:34:57.900 Got a couple of letters from you.
00:34:59.780 Wally Bartfay, talking about the memorandum of understanding between the federal and Alberta
00:35:06.420 governments, says memorandums of understanding are not legal contracts, similar to writing
00:35:12.080 New Year's resolutions on a napkin, legally speaking, simply for optics.
00:35:16.580 Ninety-nine percent of memorandums of agreement don't result in contracts, new laws, or policies.
00:35:22.840 Fact check that.
00:35:24.040 You're right.
00:35:24.700 And what is a memorandum of understanding?
00:35:27.600 It's a general spirit of we will get a deal later.
00:35:34.440 It's sort of setting the table, but not having the meal yet.
00:35:38.820 No, I agree with you.
00:35:41.060 I suppose it's like a statement of principles, but you're right.
00:35:43.620 It's not a contract.
00:35:44.340 Robert says the MOU is the usual stick and carrot routine that has been going on since
00:35:50.360 1905 and really for 38 years prior to that date, where Alberta always gets the stick and
00:35:56.000 Canada always gets the carrot.
00:35:57.660 Well, yeah, I mean, in recent days, you can see the Liberal government's own cabinet ministers
00:36:02.700 are really boasting about how they put the boots to Alberta in terms of carbon taxes and
00:36:07.660 things like that.
00:36:08.320 Now, I don't know if that's them trying to mollify parts of their own party who were shocked
00:36:14.300 by it.
00:36:14.640 I mean, Steven Guilbeault sort of has a point that Mark Carney agreed to do away with vast
00:36:21.180 swaths of Steven Guilbeault's legacy.
00:36:24.260 So he is a bit mad.
00:36:25.540 There are some Liberals that are mad.
00:36:26.840 Vernice Gardner says, net zero is a Brookfield grift.
00:36:31.820 Until we see proof of a carbon problem, we should not be considering all these rules
00:36:35.620 and taxes.
00:36:36.460 Plant trees.
00:36:37.260 You're so right.
00:36:38.300 I mean, if you really are worried about carbon dioxide, then put some trees and other green
00:36:43.360 things in the ground, because plants need CO2 for photosynthesis.
00:36:48.800 That's how plants eat.
00:36:50.340 But no one is truly worried about that, especially not the jet set class like Al Gore and John
00:36:58.000 Kerry and Mark Carney.
00:37:00.980 They're certainly not acting like they're worried about it.
00:37:03.220 And even if you do believe the math, which I don't, China, India, Brazil and OPEC contribute
00:37:09.620 the vast majority of the world's carbon dioxide.
00:37:12.340 Canada is not in a position to fix anything.
00:37:14.620 But it's like when people said during COVID, oh my God, put on your mask.
00:37:19.860 They weren't acting as if they thought unmasked people were truly dangerous.
00:37:24.180 You've probably heard me say this before:
00:37:26.920 if you encountered someone that you knew had Ebola, you wouldn't go up to them and say,
00:37:33.600 excuse me, put on your mask.
00:37:34.960 You would run away from them.
00:37:37.180 The mask was just a symbol of compliance.
00:37:40.280 And my analogy with global warming is this: if people truly believed the world was going
00:37:47.000 to end because of global warming, they would be conducting themselves much differently
00:37:51.860 than they are now.
00:37:53.000 It's just a shtick.
00:37:54.500 It's just a grift.
00:37:56.980 Well, that's our show for today.
00:37:58.660 Until tomorrow, on behalf of all of us here at Rebel World Headquarters, to you at home,
00:38:02.900 good night.
00:38:03.960 Keep fighting for freedom.