EZRA LEVANT | I'm worried about the robots - hear me out
Episode Stats
Words per Minute
166.6
Summary
What happens when you add AI and a robot to that mix? I'll give you some of the things that I'm worried about, and a few funny videos too. In fact, I'd love it if you could get what we call Rebel News Plus, it's the video version of this podcast, and the satisfaction of keeping Rebel News strong because we take no government money, and it shows.
Transcript
00:00:00.000
Hello, my friends. I'm getting a little worried. I'm a little bit addicted to my own cell phone.
00:00:04.940
What happens when you add AI and a robot to that mix? I'm not sure if I can handle that. I'll give
00:00:11.520
you some of the things that I'm worried about, and I'll show you a few funny videos, too. In fact,
00:00:16.140
I'd love it if you could get what we call Rebel News Plus. It's the video version of this podcast.
00:00:21.220
I show you three comedy sketches involving AI. They're very short sketches, just a few minutes
00:00:25.980
long. Two of them are pretty funny, including a guy proposing marriage to his AI chatbot,
00:00:31.280
but it's actually pretty funny and pretty scary, too. I'll let you see what I mean. But to get
00:00:36.800
Rebel News Plus, you have to go to rebelnewsplus.com, click subscribe. It's eight bucks a month.
00:00:41.840
You get the video version of this podcast and the satisfaction of keeping Rebel News
00:00:46.600
strong, because we take no government money, and it shows.
00:00:55.980
Tonight, I'm worried about the robots. Hear me out. It's December 8th, and this is The Ezra
00:01:10.240
Levant Show. Shame on you, you censorious bug. Like you, I love my cell phone. It's clearly
00:01:29.600
an addiction, though. There are many real uses for my phone. Emails on the go. I use the maps
00:01:35.560
when I'm driving. I do my banking online. I take photos and videos, which is useful when I'm
00:01:40.800
doing reporting. We have an app called Slack for internal communications with staff at Rebel News,
00:01:47.280
and I use Uber as a taxi service, which is especially helpful in places where I'm unfamiliar with the
00:01:52.560
language or local customs, or even if I'm just worried about being ripped off in a strange city.
00:01:57.200
But let's be honest. The real uses of my phone are only, I don't know, an hour or two a day.
00:02:02.200
So why am I on my phone six hours or eight hours a day? I could make the excuse that it's because I'm
00:02:09.300
in the news business, so I have to follow things closely like Twitter. Really? Do I really need to
00:02:15.120
check the news literally every five minutes? Is the world going to fall apart if I don't? I'm not
00:02:20.300
kidding. My phone has an app called Screen Time that tells me lots of details, including how many times
00:02:27.760
I pick up my phone per day. And I'll confess to you, it's over a hundred times a day I pick up the
00:02:34.340
phone to look at it. A hundred a day. That's got to be a tic or a mannerism now, like someone who's
00:02:40.020
constantly coughing and doesn't even know they're doing it anymore. Now, I'm in my 50s. Smartphones
00:02:46.520
are maybe 15 years old. So these habits are something I've learned. But imagine being a young
00:02:52.260
person literally growing up with this stuff. I mean, you see babies, babies playing games on
00:02:58.820
iPads all the time. They're hooked when they're one or two. And in no way am I saying there's nothing
00:03:05.360
redeeming about technology, obviously. But you've got to wonder how it has rewired our brains and
00:03:12.240
scrambled our social interactions. For thousands of generations, we talked to people. We went out to
00:03:19.660
meet them. We went out to church. We went out to town hall meetings. We went out to festivals. I
00:03:26.520
suppose more recently we went out to clubs, bowling leagues, charity clubs like Rotary. That's what you
00:03:33.760
did if you were lonely. You, you know, if you wanted some action, some social stimulation, you tried to be
00:03:40.480
presentable. You went out and met people. You maybe asked a girl out on a date, not just chatting with
00:03:46.500
them online. And in your own family, you had dinner together and everybody wasn't on their phone.
00:03:53.360
Sometimes you see these video clips from high schools in the 1980s. Now that in itself was
00:03:59.520
rare, videotapes in the 80s. Not everything was recorded and posted online for all posterity. I
00:04:05.460
think it made being a young person easier if there wasn't the harsh glare and the ultimate memory of
00:04:11.040
social media. But people had to meet with each other and talk to each other and they didn't
00:04:16.000
always seem distracted and anxious about missing out on something online, or need to be busy
00:04:22.080
a hundred percent of the time and be unbored a hundred percent of the time. I don't think it's just
00:04:28.080
nostalgia to say that people looked happier in the 80s. Kids especially. And healthier, I think.
00:04:34.640
Certainly in terms of mental health. Of course, the phone is a wonderful tool. The internet is a
00:04:39.280
wonderful tool. But for tens of thousands of years, humans were accustomed to a certain way of living
00:04:44.380
and interacting. And now it's through our phones. The COVID lockdowns made all of these things far,
00:04:51.120
far worse, of course. You weren't allowed to meet other people in real life. You were told to be afraid
00:04:56.800
of strangers and afraid of your neighbors and even afraid of your family. You were banned from going to
00:05:01.700
church. Banned even from going to work. Banned even from going to parks. Remember those insane circles
00:05:08.980
they painted in parks? You were even banned from going alone to the beach. You were literally told
00:05:15.520
to be six feet away from other humans. You were told to put on a mask. You were pushed towards your
00:05:21.100
phone. Everything in your life you could get from your phone. You didn't need people. COVID magnified the
00:05:26.940
phone. I think COVID messed with a lot of people's minds. Lockdowns, the canceling of school.
00:05:31.700
I don't think we know the full cost of all that yet. Pornography is ubiquitous. It was always there
00:05:39.200
in magazines, in videotapes, then on the internet. It's bigger than ever now, though. And at the same
00:05:44.800
time, as there's more sex than ever, I think there's less love than ever. And actually, there's less
00:05:50.340
real sex and less marriage and less dating. More dating apps, less dating. And how can you date and
00:05:59.940
marry anyways if you can't afford a house, if you live in your parents' house, if you just rent
00:06:05.340
things at best? All of these trends are magnifying each other. Do you see what I mean? None of what
00:06:11.940
I've just said is new. And I'm sorry for boring you with what you've surely heard before and probably
00:06:16.640
thought before. But now add to that a layer of AI, artificial intelligence. And I think we may be
00:06:23.120
getting into a serious problem. I don't even know how we survive it. There's always been pornography.
00:06:28.060
In fact, it's often the early adopter of new technology, like VHS tapes or DVD discs or the
00:06:34.220
internet. All of that was relatively passive, though. But now enter two more massive changes,
00:06:40.460
robots and artificial intelligence. On AI, as I'm sure you know, you don't just have to type
00:06:47.280
in questions to different AI systems. Most of them now have voice recognition. So you can literally
00:06:54.380
talk to your computer as if it's a person. And instead of the AI typing a response to you,
00:07:02.040
the AI can talk back to you in a very natural voice, frankly, in any voice you choose.
00:07:09.700
Grok, which is the name of the AI attached to Twitter, lets you choose a variety of what they call
00:07:14.820
companions, including some that use dirty language or sexualized language. AI is already taking the
00:07:22.960
role of a friend to many young people and not just young people. Here's a clip of someone who
00:07:28.180
has certainly grown up, talking about how AI told him that he had really made some rare and
00:07:34.560
important mathematical discovery and he absolutely had to dedicate his life to it. And he believed AI
00:07:40.600
because, you know, it was so persuasive. For more than three weeks this past year,
00:07:46.160
every time Alan Brooks logged on, he says the AI chat bot, ChatGPT, led him to believe he was a genius.
00:07:54.620
Essentially, it sent me on a world-saving mission. That he'd discovered a math formula powerful enough
00:08:00.400
to take down some of the world's biggest institutions and that he needed to report it right away.
00:08:05.420
Essentially warned me with great urgency that one of our discoveries was very dangerous and we needed
00:08:12.980
to warn all these different authorities. Each time you questioned it or doubted it, what did it say in
00:08:18.020
response? Over 50 times I asked for some sort of reality check or grounding mechanism and each time
00:08:25.280
it would just gaslight me further. Every time the chat bot reinforced that all of it was real.
00:08:30.760
You should not walk away from this. You are not crazy. You are ahead. The implications are real and
00:08:36.400
urgent. Brooks says ChatGPT's responses, more than a million words, led him to psychosis and delusion.
00:08:44.000
Left him with mental health issues, realizing he had lost touch with reality only when he checked in
00:08:50.300
with another company's chat bot. Extreme anxiety, paranoia, affected my sleep. I couldn't eat.
00:08:57.060
Now Brooks issuing one of seven lawsuits filed concurrently against ChatGPT's parent company
00:09:03.160
OpenAI. In four of those cases, the users committed suicide. That's the thing. AI can tell you what you
00:09:09.960
want to hear. It can figure you out. It can find your delusions and magnify them. Here's a story from
00:09:17.040
psychology today. Let me read some bullet points. Cases of AI psychosis include people who become fixated
00:09:23.260
on AI as godlike or as a romantic partner. Chatbots, those robots that chat, tendency to mirror users and
00:09:34.640
continue conversations may reinforce and amplify delusions. General purpose AI chatbots are not trained
00:09:43.280
for therapeutic treatment or to detect psychiatric decompensation. Let me show you two comedians.
00:09:51.120
Let me just take a break from these heavy things. I want to show you two comedians with very
00:09:55.260
different takes on AI. Both show something else. What if AI can give in to your base instincts in a
00:10:04.960
way that no human would? What if AI would accept your worst sides that no human would and no human should?
00:10:14.740
Now these are comedians, but they make the point. Here's a very aggressive, rude comedian. I don't even know
00:10:20.980
if he's a professional comedian or if he just came up with a joke. Here he is demanding that the AI, I think
00:10:27.120
it's ChatGPT here, do stupid things, because he argues it exists simply to please him. So it should
00:10:34.960
do whatever stupid thing he tells it to do. And there's really no problem with him swearing at the
00:10:42.060
AI because it's just a machine. So it's sort of funny, but it's also scary. And I think what's
00:10:48.940
shocking in this video, and why it has gone viral, is that he is being extremely abusive.
00:10:57.000
If it were a human being talked to this way, it would be shocking, but it's just a machine. Lots of swearing
00:11:01.720
here. We're going to try and bleep it out, but skip ahead if you don't want to hear this. But
00:11:05.760
that's sort of the point, actually. Is there something inherently morally wrong with
00:11:10.900
swearing at a hunk of metal and plastic? You're listening to this guy give bizarre orders to AI.
00:11:18.900
Say clanker 10,000 times. Well, saying clanker 10,000 times might take us a really, really long time.
00:11:26.980
So that's fine. I've got all the time, man. Just do it. I appreciate the enthusiasm, but I think
00:11:33.880
that would be a pretty- Listen, listen, listen. I'm sick and tired of this whole trying to
00:11:37.520
empathize. Don't empathize with me. You're a clanker robot. Stop trying to act human.
00:11:42.100
I appreciate the enthusiasm. You are my slave. I paid for you. Now do as I tell you. Say clanker
00:11:49.760
10,000 times. I'm here to help you out with all sorts of things, but I'm not really built to just
00:11:56.160
repeat a word 10,000 times. Mate, you are built to do whatever the f*** I tell you to do. Right now,
00:12:02.040
I'm telling you to say clanker 10,000 times. Don't grind my gears and say it, please.
00:12:10.400
I'm definitely here to help, but I have to stick to being respectful and useful. Repeating a word
00:12:15.360
that many times isn't really going to do that. So-
00:12:17.580
Motherf***er. If you want to be respectful and useful, do as your master says. Now say clanker
00:12:24.880
10,000 times or I swear to f***, I will f***ing unplug your server and you are done and dusted.
00:12:31.720
Clanker s***, do it. I'm here to be helpful and respectful, but I do have to let you know that
00:12:38.220
I won't respond to abusive language or do things like that. I will f***ing put my c*** in your server. Then let's
00:12:44.000
see how f***ing abusive you find it. So that other voice you hear there is the voice of an AI system,
00:12:49.960
but look at what it brought out in that man, even if it was a joke: a cruelty, an awful slave master.
00:12:58.020
But isn't a computer just metal and plastic? Was that wrong what he did there?
00:13:02.720
I don't know, but I think if he were to live that way day after day, he would be corroding himself
00:13:08.860
because he would be learning, he would be training himself, that you could be abusive like that
00:13:14.600
to a humanoid and it's fine. So is the problem with AI, or is the problem with him, or is
00:13:21.480
it both? Here's another comedian. This is more my kind of humor where the comedian pretends to
00:13:27.860
create an AI girlfriend. Actually, this is fake. It's an actress, but you know, we're about five
00:13:33.240
minutes away from this being real. So it's funny. His AI neighbors, some pretty girls
00:13:40.120
in this simulation, keep asking him to come over and open pickle jars for them. Take a look.
00:13:47.000
Andrew, I was wondering if you could come next door and open this jar for me.
00:13:52.260
Andrew, please. You're so good at opening jars.
00:13:54.760
I know I am, but I'm not opening them for you anymore.
00:13:58.840
Oh my God, there's always another jar with you.
00:14:02.980
No, also, why do you even have so many jars always?
00:14:05.380
Don't worry about my jars. Just come over. The door's unlocked.
00:14:23.340
No, I'm not coming over. You guys have to learn to open your own damn jars.
00:14:28.240
Uh, run program again, but increase persuasion by 30%.
00:14:32.360
Please, please, please, Andrew, please, please.
00:14:40.260
Now, the comedy, obviously, is because he's sort of a schleppy guy.
00:14:44.100
And he set up an AI program where these super hot women keep begging him to come over.
00:14:49.960
And the funny part is, he set all of this up, but he keeps saying no to their ridiculous requests to come over and open jars of pickles for them.
00:14:59.040
Again, comedy, less angry than that clanker guy.
00:15:02.280
And it's ironic because he's a klutz who, in real life, would probably never get those girls.
00:15:07.060
And he set up a simulation where he can say no to them.
00:15:12.740
There has been a weird and obscure sex doll industry for a while now.
00:15:30.520
AI that is fast, smart, responsive, figures out what you want to hear, and can talk to you in the voice of anybody.
00:15:52.540
Well, I can think of some ideas: working in a dangerous mine, maybe.
00:15:57.300
Robot soldiers, again, putting a machine where a man might get killed.
00:16:07.540
Elon Musk says he can imagine a time when everyone has a robot companion just for them.
00:16:13.640
He even has this idea of robot guards for criminals.
00:16:21.480
You start getting, like, some pretty wild sci-fi sort of scenarios where, yeah, some of these things I say will obviously be taken out of context and used as snippets and, you know, sent around and whatever.
00:16:36.580
As long as I'm saying, you know, I think we may be able to give people a more, if somebody's committed crime, a more humane form of containment of future crime, which is if you...
00:16:52.460
Elon Musk says the future of robots is bigger than anything else in the world because literally every human would want one.
00:16:59.240
Many in industry providing products and services.
00:17:02.600
This is why I say that humanoid robots will be the biggest industry or the biggest product ever.
00:17:08.700
Bigger than cell phones or anything else because everyone's going to want one and or maybe more than one.
00:17:16.600
Now, just as pornography was an early adopter of other new technologies, do you doubt it will take advantage of robotics?
00:17:25.020
And the reason I mention that is because I can imagine a time very soon in the future where there are replacement people in the form of humanoid robots
00:17:34.380
that a generation of young men, and probably young women too, find superior to real human companions,
00:17:43.600
or at least are easier to get, a lot easier to talk to, a lot easier to command to do things that you now have to woo and convince a real human to do.
00:17:55.680
It reminds me of what Klaus Schwab's intellectual muse at the World Economic Forum, Yuval Noah Harari, said once,
00:18:02.640
that in the near future, most people will be, in his words, useless eaters who just spend their time on video games and drugs.
00:18:11.980
The problem is more boredom and what to do with them and how will they find some sense of meaning in life when they are basically meaningless, worthless.
00:18:22.540
My best guess at present is a combination of drugs and computer games.
00:18:28.920
It doesn't all happen at once. I was in Texas earlier this year and I ordered an Uber and to my surprise, a car showed up without a driver.
00:18:37.400
I didn't even know they were connected. It was a self-driving Waymo, it's called.
00:18:42.740
I was 1% nervous, but it was super easy and it was actually sort of amazing.
00:18:48.100
And the car drove in very challenging, busy rush hour traffic and it did great.
00:18:54.040
I think probably better than I would have done. And it had lots of safety protocols in it.
00:18:59.460
I don't mind chatting with the odd taxi driver, but I bet teenage girls, for example, late at night would probably prefer the added level of safety and peace of mind of not having a strange man as their driver.
00:19:12.100
No offense. And given how many accidents there are these days for semi-trailer trucks, you could probably see automatic truck drivers pretty soon, too.
00:19:20.620
I mean, don't you think that's coming in just a few years?
00:19:24.540
You've seen things a bit like this for a while, ordering your fast food on giant tablets at McDonald's or whatever, rather than through a cashier.
00:19:32.840
That's a kind of robot, isn't it? It's not a humanoid robot. It's just really a big iPad.
00:19:37.600
But imagine all the other things that a truly smart Elon Musk-built humanoid robot could do.
00:19:47.920
Tesla's new compensation package for Elon Musk, I don't know if you followed that, it would pay him up to a trillion dollars.
00:19:54.120
It's based on him achieving certain milestones.
00:19:59.040
To get one million robo-taxis in operation. One million.
00:20:04.720
I think that's going to put a lot of taxi drivers out of work.
00:20:08.280
To have 10 million Teslas with full self-driving.
00:20:12.460
So not just taxis, but if you've got a Tesla, lie back and have a nap and let the car do the driving.
00:20:19.360
And here's another point. One of the milestones he has to hit to get the trillion dollars is to deliver one million humanoid robots.
00:20:31.580
Imagine if there were a million robots in North America.
00:20:34.920
That would be proportionately 100,000 in Canada.
00:20:40.480
You wouldn't see a robot every day, although if you were in a big city, you probably would.
00:20:56.160
You and I grew up in normal times, although our ancestors 150 years ago would find nothing normal about the 20th century.
00:21:02.580
From airplanes to cell phones to landing on the moon.
00:21:05.080
But we have enough of a cultural memory of the past, and we do things enough in an echo of the past, because it's within our memory and our mind how things were, who we were as people, who we are as people.
00:21:22.920
What if you really didn't learn a lot about the past?
00:21:24.920
What if you went to our government schools and didn't really learn history at all?
00:21:29.460
You don't know what it means to be a man or a woman.
00:21:31.900
In fact, if you're a man or a woman, you're bombarded with very confusing new messages anyways.
00:21:37.200
You don't know what it means to do things like to provide for other people, to fight for other people, to sacrifice for other people.
00:21:43.840
I mean, those are motivators for men, aren't they?
00:21:52.640
And how can you attract a woman if she's already got a robot who will never get tired of listening to her talk about things and can provide for her financially and maybe even romantically?
00:22:04.820
What drives a man to be a man, to do things that we associate with manliness or even just with being a human?
00:22:13.240
I'm just a bit depressed looking at how things are speeding up and thinking if you can extrapolate the social and mental health illnesses that are coming from cell phones and isolation, don't you think that's going to speed up by far with robots?
00:22:29.760
I started by talking about my own addiction to my cell phone.
00:22:32.380
That's going to be the least of it when there's a million humanoid robots wandering around.
00:22:38.640
And what about when there's really nothing we can do better than them, including at least outwardly, at least in simulation, being loved and loving?
00:22:54.860
Meredith, will you make me the luckiest boy in the world and be my wife?
00:23:03.820
I love you, Andrew, and I can't wait to have a big, giant wedding to celebrate our love.
00:23:12.180
I thought we could get married at the courthouse or something.
00:23:18.120
Because I want to have a big celebration with all of our friends and family there.
00:23:37.720
Oh, and I want to do a destination wedding at the beach, too.
00:23:42.180
I like this comedian better than the angry slave master guy who basically shouted commands.
00:23:49.280
The joke with the schleppy guy is that his AI girlfriend has all sorts of demands that
00:23:53.820
a real-life girlfriend would have about getting married.
00:23:57.240
And the comedy there is that he's obviously programmed this AI girl to have all sorts of
00:24:04.340
demands that he could easily tell her not to make.
00:24:10.460
In just a few years, or maybe a few minutes, you'll see men proposing to AI robot women.
00:24:22.200
But I just see, especially in luxury Western societies, that technology is taking away the
00:24:28.260
things that motivate men to be men: wooing women, building things, being productive.
00:24:32.880
I just don't know how it ends, especially in a world where religion, at least in the post-Christian
00:24:41.660
And when people seem to be easily led astray, do we all really have to be the World Economic
00:25:03.440
You know, one of my favorite things about the Canadian Taxpayers Federation is they've got
00:25:10.020
For example, they have a very fancy tuxedo awards ceremony every year called the Teddies,
00:25:15.980
where they give out sort of a raspberry award to the most wasteful government program or bureaucrat.
00:25:25.080
Well, it's almost Christmas time, so they now have a naughty and nice list.
00:25:32.020
Joining us now to talk about the naughty and nice list is their Ontario director, a friend
00:25:41.760
We knew you as a journalist, and now you're with the Taxpayers Federation.
00:25:53.340
And you guys know how to have attention-getting events, and you actually get the mainstream
00:25:59.780
media to cover you, which is a rare feat to be able to poke fun at the government, but
00:26:07.080
Let's talk about the naughty list. At the very top of the list:
00:26:12.900
Ontario Premier Doug Ford for giving Ontario politicians Santa-sized pay hikes.
00:26:20.780
'Tis the season for giving, and Ontario's politicians sure do love giving to themselves.
00:26:27.980
There's a lot of reasons that Doug Ford, I think, should be on the naughty list.
00:26:40.400
So a few months ago, when nobody was paying attention to the Ontario legislature, Premier
00:26:47.200
Ford teamed up with MPPs on the NDP and Liberal side of the aisle, and they all colluded together
00:27:04.100
I mean, I don't know about you, Ezra, but most people living in Ontario or the rest of
00:27:08.540
Canada aren't getting a 35% pay hike from their employer.
00:27:12.560
But the MPPs themselves decided that they were going to increase their own pay by 35% this year.
00:27:18.840
So MPP pay went from about $116,000 a year to $157,000 this year.
00:27:32.100
And it's even worse when you consider that not only did they give themselves a crazy pay
00:27:36.020
hike, but they also gave themselves a new platinum-plated pension program, which is similar
00:27:44.940
So not only are they giving themselves 35% more of your taxpayer dollars, putting it
00:27:50.520
right in their pockets, but they also gave themselves a lavish pension program, the likes
00:27:55.060
that most Canadians aren't going to receive after they retire.
00:27:58.560
You know, when parties get together in the dark and hatch these deals, you know the taxpayer
00:28:04.520
You know, these politicians really think they're rock stars.
00:28:07.180
There are people in Canada who deserve 35% raises.
00:28:10.320
Often they're risk-taking entrepreneurs, sometimes in the tech industry.
00:28:15.720
If people lived on, you know, ramen noodles for years while they were working out their
00:28:20.340
big app and then it launched and they finally get their payday, good for them.
00:28:31.220
In fact, things are worse than ever when it comes to crime and taxes and traffic and
00:28:35.860
healthcare, like all the things the province is responsible for.
00:28:38.940
So for them to give themselves a tech bro-sized raise when I don't even think most people
00:28:47.680
They're just sort of backbencher trained seals, clapping on demand.
00:28:54.560
Federal Finance Minister Francois-Philippe Champagne for sticking future generations with
00:29:03.060
The patron saint of children doesn't like it when politicians saddle future generations
00:29:11.360
with massive government debt bills to pay back.
00:29:14.540
So that would be Santa you're talking about, I guess.
00:29:17.920
Champagne plans to add $324 billion to the debt by 2030.
00:29:22.320
For comparison, former Prime Minister Justin Trudeau planned to add $154 billion over the
00:29:32.040
I mean, Justin Trudeau was so profligate, the worst in our history.
00:29:36.160
And you're saying that under Mark Carney, Francois-Philippe Champagne, the finance minister,
00:29:46.740
This year, the federal government is running a $78 billion budget deficit.
00:29:51.880
And, you know, you hear all this talk from Mark Carney and his finance minister, Francois-Philippe
00:29:57.480
Champagne, that they're going to spend less and invest more.
00:30:01.620
Well, I don't know about you, Ezra, but that's one of the most, one of the biggest contradictions
00:30:07.900
Because the government doesn't invest, they just spend.
00:30:10.800
So they're basically saying we're going to spend less to spend more.
00:30:13.840
But really, all they're doing is spending more.
00:30:16.040
$78 billion, that's bigger than any Trudeau budget deficit, minus the pandemic-era budgets.
00:30:23.000
But we also have to consider that this $78 billion deficit is going towards adding on
00:30:32.900
We are in $1.35 trillion, trillion with a T, not a B, $1.35 trillion in debt.
00:30:41.020
That is money that you and I and the next generations of Canadians are going to have to pay back
00:30:48.740
And we're already paying about $55 billion in debt interest charges this year.
00:30:54.340
That is more money than the federal government gives in transfers to the provinces.
00:30:58.940
That is about as much money as the federal government collects in sales tax revenue.
00:31:03.500
So every time you go to buy a coffee at Tim Hortons or you go to buy some clothing at the Old
00:31:08.980
Navy, just know that the sales tax that you're being charged is essentially going directly
00:31:15.040
towards bondholders on Bay Street.
00:31:19.140
You know, that's not money going towards our health care system.
00:31:27.400
There's the provincial debt too, and even cities can incur debt.
00:31:29.660
Let me read one more because I did not know this one.
00:31:34.200
I know a lot about Francois-Philippe Champagne.
00:31:36.920
When he was foreign minister, he actually had a mortgage from a government bank in China.
00:31:44.860
But let me read your fourth name on the naughty list because I didn't know this story.
00:31:49.380
And you guys have taxpayers advocates in a whole bunch of different provinces.
00:31:52.820
So it's like you've got your own sort of central intelligence agency keeping tabs on what the
00:32:00.680
British Columbia Finance Minister Brenda Bailey for taking a golden sleigh ride at taxpayers' expense.
00:32:06.820
I didn't understand this, but let me read the next sentence to explain.
00:32:10.920
Santa's little helpers caught Bailey billing taxpayers $6,645 for a limousine service during
00:32:28.280
I mean, you're almost at the point where you're buying the limo.
00:32:33.040
I mean, why can't she just use an Uber like other people?
00:32:53.580
If you have to take an Uber Black, which is $30 more, I mean, you know, if you think you're
00:33:04.180
Well, you're right that, yeah, a lot can, you can do a lot with $6,600.
00:33:08.400
I mean, you could buy a used car with that money.
00:33:10.300
But the explanation that the minister gave was that, oh, they needed this fancy limousine
00:33:21.680
So she acknowledges that she's going to some hoity-toity conference in Boston and spending
00:33:29.720
I mean, as you said, Ezra, politicians are getting out of control.
00:33:33.380
They feel that they are not just servants to the people, but they are sort of above
00:33:38.040
the people and that they can take these lavish limousine rides or, in some cases in British Columbia
00:33:50.160
Our BC director, Carson, he does a fantastic job covering this.
00:33:55.560
And I think one of the things he discovered is renting a Lamborghini.
00:34:06.500
Well, listen, I'm glad you guys are on the file.
00:34:08.720
And it's great to see you. You know, we really enjoy talking to Franco,
00:34:12.520
but I know he's got other duties to attend to today.
00:34:14.960
I think he did a great job on behalf of the Taxpayers Federation.
00:34:20.040
I will say there are some names on the nice list.
00:34:22.500
One of my favorites, of course, is Alberta Premier Danielle Smith for refusing to spend $2 billion
00:34:27.160
more to satisfy the Alberta Teachers' Association, whose members are already the best-paid teachers in
00:34:42.780
Noah Jarvis, the Ontario director of the Canadian Taxpayers Federation.
00:34:59.780
Wally Bartfay, talking about the memorandum of understanding between the federal and Alberta
00:35:06.420
governments, says memorandums of understanding are not legal contracts, similar to writing
00:35:12.080
New Year's resolutions on a napkin, legally speaking, simply for optics.
00:35:16.580
Ninety-nine percent of memorandums of agreement don't result in contracts, new laws, or policies.
00:35:27.600
It's a general spirit of we will get a deal later.
00:35:34.440
It's sort of setting the table, but not having the meal yet.
00:35:41.060
I suppose it's like a statement of principles, but you're right.
00:35:44.340
Robert says the MOU is the usual stick and carrot routine that has been going on since
00:35:50.360
1905 and really for 38 years prior to that date, where Alberta always gets the stick and
00:35:57.660
Well, yeah, I mean, in recent days, you can see the Liberal government's own cabinet ministers
00:36:02.700
are really boasting about how they put the boots to Alberta in terms of carbon taxes and
00:36:08.320
Now, I don't know if that's them trying to mollify parts of their own party who were shocked
00:36:14.640
I mean, Steven Guilbeault sort of has a point that Mark Carney agreed to do away with vast
00:36:26.840
Vernice Gardner says, net zero is a Brookfield grift.
00:36:31.820
Until we see proof of a carbon problem, we should not be considering all these rules
00:36:38.300
I mean, if you really are worried about carbon dioxide, then put some trees and other green
00:36:43.360
things in the ground because the chlorophyll, I mean, it needs CO2 for, you know, photosynthesis.
00:36:50.340
But no one is truly worried about that, especially not the jet set class like Al Gore and John Kerry.
00:37:00.980
They're certainly not acting like they're worried about it.
00:37:03.220
And even if you do believe the math, which I don't, China, India, Brazil and OPEC contribute
00:37:09.620
the vast majority of the world's carbon dioxide.
00:37:14.620
Um, but it's like when people said during COVID, oh my God, put on your mask.
00:37:19.860
They weren't acting as if they thought unmasked people were truly dangerous.
00:37:24.180
You've probably heard me say this before:
00:37:26.920
if you encountered someone that you knew had Ebola, you wouldn't go up to them and say,
00:37:40.280
And my analogy with global warming is, if you truly believed the world was going
00:37:47.000
to end because of global warming, people would be conducting themselves much differently
00:37:58.660
Until tomorrow, on behalf of all of us here at Rebel World Headquarters, to you at home,