The Glenn Beck Program - January 16, 2018


1/16/18 - 'Reclusive, Abusive and Bankrupt' (William Hertling & Tim Ballard from O.U.R. join Glenn)


Episode Stats

Length

1 hour and 52 minutes

Words per Minute

162.05

Word Count

18,251

Sentence Count

1,390

Misogynist Sentences

38

Hate Speech Sentences

21


Summary

David and Louise Turpin were arrested and charged with torture and child endangerment after a 911 call from one of their daughters led police to find their 13 children chained to their beds.


Transcript

00:00:00.000 The Blaze Radio Network, on demand.
00:00:10.440 Love. Courage. Truth. Glenn Beck.
You know, we have to start comparing ourselves to who we were yesterday.
00:00:24.360 And stop comparing ourselves to the things that we see on television or on Facebook.
00:00:31.860 Because it's a lie.
00:00:34.800 David and Louise loved each other. Oh, and they loved Elvis.
00:00:39.660 It was clear from their family Facebook page.
00:00:43.280 You see the pictures on their Facebook page?
00:00:45.940 Their three vow renewal ceremonies?
00:00:49.560 They had a tradition of going to the Elvis Chapel in Las Vegas.
00:00:55.300 David, Louise, and their 13 children, oh, and the king, all look very happy in these photos.
00:01:02.240 Every Facebook page lies.
00:01:07.660 Sometimes just by omission.
00:01:11.220 Sometimes you're not getting the whole picture.
00:01:13.660 Sometimes you're getting a complete fraud.
Today, you're looking at the Turpins' images in a whole different light.
00:01:26.700 And now they look dark and twisted.
If you don't know who the Turpins are, let me tell you.
Sunday, authorities responded to a 911 call by one of David and Louise's daughters.
00:01:41.220 The daughter frantically explained on this stolen phone that she had escaped the family home
00:01:48.580 and begged the police to come and rescue her and her siblings who were starving and chained to their beds.
00:01:55.480 The girl claimed to be 17 years old.
00:01:58.820 When police actually saw her, they didn't believe her.
00:02:01.820 They thought she was 10.
00:02:03.720 But she is 17.
They found the girl's 12 brothers and sisters chained, padlocked to their beds.
00:02:13.840 They were filthy.
00:02:15.460 They were emaciated.
00:02:18.020 The children, they thought, were all under the age of 15.
00:02:23.900 But they weren't.
00:02:25.480 The children ranged in age from 2 to 29.
They were so famished that the deputies were shocked to find out that seven of them were adults.
David and Louise were unable to immediately provide a logical reason why their children might have been, you know, chained to their beds.
00:02:49.980 I'm trying to think, as I read that this morning, I thought,
00:02:52.760 what would the logical reason be to chain your children to the bed?
00:02:57.820 But they apparently had none.
00:03:01.060 The children were taken to the hospital.
00:03:03.040 They're in the care of Child Protective Services.
00:03:06.320 The Turpins were arrested.
00:03:08.500 They were booked on charges of torture and child endangerment.
00:03:17.480 We're at a time when we have so much information about almost everything.
00:03:23.840 Except what is true.
00:03:25.040 In the age of social media, it is getting more and more difficult
00:03:29.680 to know what's real and what isn't.
00:03:35.160 Because we all have the ability to edit and manipulate our own narratives.
00:03:40.220 You know, I don't know if you saw the movie about P.T. Barnum,
00:03:43.360 The Greatest Showman.
00:03:46.000 But I left that movie and I thought, I want to read about P.T. Barnum.
00:03:49.800 So I went, and there's just not a lot of good books about him.
00:03:54.720 The one that gets the highest ratings, the one that is supposedly the best,
00:03:59.720 the author was P.T. Barnum.
00:04:03.600 How could I possibly believe anything that P.T. Barnum said about himself?
00:04:10.260 He was trying to give an image of himself that wasn't necessarily reality.
00:04:20.360 And isn't that what we're doing with Facebook?
Because of Facebook, the Turpins' friends and family had no idea that they were hideous monsters
00:04:29.700 who were torturing their children.
00:04:31.460 What's really frightening is
00:04:35.800 they're not the only ones.
00:04:39.500 This isn't going to be the first time.
00:04:41.520 It's not going to be the last time.
00:04:45.500 We'll continue to be shocked by the dichotomy of how people present themselves
00:04:50.300 and what the actual truth is.
00:04:52.660 How many people are we finding out are monsters?
00:04:57.820 The Turpin family provides us with a horrible, cautionary tale today.
00:05:07.300 Something that we all need to understand as we move forward.
00:05:11.940 And it is just really simple, but somehow or another, at times, it controls our life.
00:05:20.280 It even controls our own attitude and self-worth because we forget.
00:05:27.820 That when it comes to social media, you're not seeing the full picture.
00:05:34.140 And sometimes, you're not even anywhere close to the truth.
00:05:46.220 It's Tuesday, January 16th.
00:05:49.120 This is the Glenn Beck Program.
00:05:50.860 So I got up this morning, and I saw this Turpin story.
00:05:59.360 And I just, I, my first thought was, what the hell's wrong with us?
00:06:05.200 What happens to human beings?
00:06:07.220 What is happening to us?
00:06:09.080 And I guess nothing.
00:06:11.180 We're not getting worse.
00:06:12.400 If anything, maybe we're getting better.
00:06:14.300 But this was, I mean, this was the way of the world when, you know, in the Dark Ages
00:06:22.860 and when people were not living on top of each other.
00:06:27.720 This kind of stuff, you know, happened a lot.
00:06:31.020 We just hide it now, and hopefully we're getting better.
00:06:36.400 Yeah, I mean, I think, you know, hopefully it's not quite as common.
00:06:40.440 But, I mean, I guess these stories probably did happen.
00:06:42.660 We're hearing about them now.
00:06:44.120 It's weird that, because when the story first broke, you think to yourself, oh, yeah, that's
00:06:47.660 like one of those, like this story or that story or that story.
00:06:50.300 And you're like, why do I have other examples of this?
00:06:52.220 Yeah.
00:06:52.560 I shouldn't have any other examples to point to.
00:06:54.400 This should be a one-time thing.
00:06:55.640 But it's not.
00:06:57.060 These things do exist.
00:06:58.300 It's not something that is, it's not something that is new.
00:07:03.500 No.
00:07:03.560 I mean, you know the story of St. Nicholas.
One of the reasons why he was deemed a saint is because of the miracle of the three boys.
00:07:11.960 I can't remember exactly what it's called, but there were these boys that went from their
00:07:17.460 house.
00:07:17.880 They were supposed to go maybe, I think, to their uncle's house or something, and they
00:07:23.200 were halfway there.
00:07:24.160 They stopped at this inn, and this guy who ran the inn said, oh, yeah, you guys can stay
00:07:32.540 here.
00:07:33.220 You know what?
00:07:33.740 You can stay here for free.
00:07:34.600 I just need some help downstairs.
00:07:36.920 Went downstairs.
00:07:38.040 He killed them.
00:07:38.720 He ground them up into sausage, and he was making meat pies for the people who came to
00:07:46.300 his place.
00:07:49.580 That's what the meat was.
00:07:51.000 It wasn't cats.
00:07:52.540 It was kids.
00:07:54.680 St. Nicholas was the bishop of the town, knew those children.
00:07:58.700 He went on a search for the kids, stopped at this inn, sensed that something was really
00:08:07.280 wrong, went down into the basement, saw the condition.
He was canonized because they say that he raised them.
He assembled them back together somehow or another and brought them back from the dead.
00:08:21.700 I don't know about that part of the story, but that's the kind of stuff that people have
00:08:29.720 been doing to people forever.
00:08:31.060 Yeah.
00:08:33.380 You know, that's a weird, dark part of humanity, isn't it?
00:08:37.280 And I guess, you know, it's hard to understand how you could get to that position.
00:08:40.980 Like, first, you have to make that decision.
00:08:42.460 It's like, you know, the first person who ever drank milk.
00:08:45.000 You're like, what was the decision that led to that?
00:08:47.720 Milk.
00:08:48.440 Yeah.
00:08:48.920 You know, who was like, hey, I'll suck off that cow thing.
00:08:51.820 I think clams or oysters or lobsters are like, what?
00:08:57.300 You know, somebody had to try to eat a spider at one point.
00:09:00.280 I mean, a big, hairy spider.
00:09:01.980 And they were like, oh, that's not good.
00:09:03.700 Don't know.
00:09:04.300 Right.
And those decisions, like, you make that first decision and it influences the rest of your decisions, and it either catches on with humanity or it doesn't.
00:09:11.220 Yes.
00:09:11.580 It seems like that's, you know, most of us are not enslaving people and that's really positive.
00:09:15.860 But, like, you start that, you know, that family at some point made a decision for that
00:09:21.460 first kid to be chained up in a room and that lasted for how long and how many more people
00:09:30.160 were affected by it?
00:09:32.140 They continued to go down that road.
00:09:34.000 Maybe they didn't think they could reverse it.
00:09:35.560 Maybe they're so psychotic that that's all they wanted kids for.
00:09:40.020 You know, we're going to find out a lot about this story.
00:09:42.560 Because there's all these pictures of them happy.
00:09:45.860 And some of the kids were out.
They were seen, you know, as recently as a couple of weeks ago, one of the older ones driving
00:09:53.880 a car, going out.
00:09:55.060 I mean, the fear of what would happen, I guess, to the rest of the family must have been incredible.
00:10:05.600 Yeah.
00:10:06.160 Yeah.
You hit on an interesting point in the monologue.
00:10:07.880 I think most people are just going to take this to, you know, how horrible the situation
00:10:12.300 is, which is true.
00:10:13.720 But, I mean, we take Facebook pictures seriously for some reason.
00:10:21.480 Right?
00:10:21.940 It's the same way my wife, every time a new product comes out and that she wants to buy,
00:10:27.780 she comes and shows me the testimonial page of their website.
00:10:30.700 Like, to me, there's no impact at all from a testimonial page from a website.
00:10:36.000 Because I know you have an incentive of this company.
00:10:39.040 They're trying to sell it to me.
00:10:40.180 Of course, they found the nine people who really like the product.
00:10:43.260 Or they made them up.
00:10:44.640 Right?
00:10:44.860 Like, that's how I internalize that.
00:10:46.960 And I think a lot of people do.
00:10:48.140 Same thing with Facebook.
00:10:49.100 I don't know.
00:10:49.920 I look at Facebook and people are, you know, smiling photos.
00:10:52.200 And I think internally, for most people, you just be like, oh, wow, they must have a really
00:10:56.720 good life.
00:10:58.480 They're all happy.
00:10:59.580 They're having these wonderful times together.
00:11:01.260 What a great story.
And we all know that at some level, not to the level where you're chaining your children to the beds, hopefully, but we all, at some point, we don't post the really sad pictures.
00:11:15.740 We don't post, we don't take pictures when people are mourning.
00:11:18.740 We don't take pictures when people are fighting and are having troubled times.
00:11:23.180 You take pictures when things are happy.
00:11:24.620 You post them.
00:11:25.580 You are doing the same editing bubble stuff we complain about with the news.
00:11:29.640 Right?
00:11:30.200 Like, you're only exposing people to the happy points of your life.
And the fact that that could work on people, to the level where a family goes from happiness on Facebook to kids chained up to their beds, shows how powerful that is.
00:11:47.440 I mean, their own relatives weren't even questioning it.
00:11:50.440 Well, there's something wrong with the relatives, too.
00:11:52.840 I mean, the grandparents live in West Virginia.
00:11:56.440 They said they hadn't seen the kids in five years, hadn't seen the family in five years.
00:12:01.380 But they said, you know, the last couple of years, they've talked to the family lots of
00:12:06.780 times.
00:12:07.800 However, never the grandchildren.
00:12:10.860 Well, what?
00:12:12.860 I mean, did that strike you as weird that you have grandchildren from two to 29 and over
00:12:19.920 a two-year period, you've talked to people in the house, but never the children?
00:12:24.560 It is weird.
00:12:25.080 But I mean, family dynamics can be strange sometimes.
00:12:27.920 I mean, sometimes there's fights.
00:12:29.880 There's long-term battles.
00:12:31.420 There's, you know, there's all sorts of stuff that happens in families where it's possible
00:12:34.480 that that could happen.
00:12:35.340 You'd think, though, there'd be some indication of trouble.
00:12:37.080 And it's frightening as well, if you look at the aerial shots of this house, these houses
00:12:44.320 are, you know, not acres apart.
00:12:47.280 They're right on top of each other.
00:12:49.980 Neighbors said they didn't even know that, some of them didn't even know that there were
00:12:53.700 13 kids in there.
00:12:56.160 What?
00:12:58.540 What?
00:12:59.700 Wow.
One neighbor said they thought it was weird. I think two weeks ago, she went outside at night, it was like nine or 10 o'clock at night, and four of the kids were out there, all kneeling down in the grass together, just kind of rolling around in the grass, and the mother was standing in the back watching over them. And she called over the fence. You know, the neighbor said, hey, hey, guys, and the kids kind of looked at her and then just kind of froze, and the mother didn't even acknowledge the neighbor saying anything, and she said it was just really odd, but she didn't think anything about it.
00:13:44.880 It just, you know, it's weird that they were out at 930 at night, but, you know, and their
00:13:49.380 reaction is weird.
00:13:51.480 Well, we just have no, I mean, people on my street could have, you know, entire tribes in
00:13:56.820 their basement and I wouldn't even know.
00:13:58.560 I mean, you just, people just don't, you don't react, you don't interact with your neighbors
00:14:02.440 like you used to.
00:14:03.020 I mean, we always talk about, we always throw back to the times where you used to be able
00:14:06.200 to just let your kids out and they would go around the neighborhood and now we'd be terrified
00:14:10.740 to do those things, right?
00:14:12.160 Like you'd never want that to happen because we're sort of somewhat crazy, right?
00:14:17.440 Like the crime rates are all lower.
There's no reason to. We're overprotective now with our kids, but part of the reason
00:14:24.880 why it felt okay is because you knew everybody in the neighborhood.
00:14:28.580 You knew all the other parents.
00:14:29.900 You knew that they would look out for your kids.
00:14:32.180 You knew that they would feel okay punishing them if they did something wrong.
00:14:36.420 I think it was more, I think it was also more than that though.
00:14:38.940 We thought we all had the same values.
00:14:41.860 Yeah.
00:14:42.080 We were wrong, you know what I mean?
00:14:44.880 We were totally misled, but we all thought we had the same basic values.
00:14:52.020 And so we didn't question the parents because we thought, oh, well, you know, parents are
00:14:57.720 parents.
00:14:58.280 They believe the same thing we do and they're going to make sure.
00:15:01.700 No.
00:15:03.020 And I don't know if anything has changed because, I mean, I grew up, I grew up in the
00:15:08.100 Pacific Northwest, so, you know.
00:15:09.720 But I remember me and my friends going over to my friend's house and my friends, we were
00:15:17.580 all embarrassed for the kid whose parents were hippies and were always stoned.
00:15:22.700 And we were like, oh, your mom and dad stoned again?
00:15:26.260 Yeah.
00:15:27.060 And they would be like, you know, hey, kids, how's this?
00:15:30.900 You know?
And I mean, my parents didn't know that I was going over there. You know, I happened to be running with a group of decent kids who were all like, okay, the stoners are here.
00:15:43.060 But we all turned out to be stoners.
00:15:46.580 You know what I mean?
00:15:47.560 Yeah, yeah, yeah.
00:15:48.000 And part of it is because our parents were radically different.
00:15:54.140 My parents would not have been cool with that.
00:15:58.180 But my parents just thought everybody's parents were the same.
00:16:02.320 They didn't realize, oh, that's the 60s hippie family.
00:16:06.260 Yep.
00:16:06.700 How was your sleep last night?
00:16:18.380 Did you get a good night's sleep?
00:16:19.740 Did you wake up feeling refreshed, feeling good?
00:16:24.040 I want to talk to you about Casper.
00:16:26.020 Casper is a mattress that will really help you feel good at night.
00:16:29.900 Casper mattress has a unique combination of foams that provides the right pressure relief
00:16:35.000 and the comfort so you'll feel balanced when you get up in the morning.
00:16:39.740 Plus, it's made of breathable material so you don't wake up in the middle of the night,
00:16:45.860 you know, having to throw off the blankets because it's so hot.
00:16:51.100 I had a Tempur-Pedic mattress.
00:16:53.560 Oh, my gosh.
00:16:54.040 I hated that thing.
00:16:54.780 I hated it because it was a billion degrees every night.
00:17:00.360 Casper has come up with a foam mattress that is comfortable, supportive, and it breathes.
00:17:08.000 Casper, try it out in your own home for 100 nights, risk-free.
00:17:11.500 They'll ship it to you in a little compact box.
00:17:13.400 You can just take it in and open it up, and boom, there's the mattress.
00:17:16.900 But if you don't love it, you don't have to put it back in that tiny little box.
00:17:20.020 They'll come and pick it up and refund every single penny, no questions asked.
00:17:23.260 They want you to have a great night's sleep.
00:17:26.040 Try Casper in your own home for 100 nights.
00:17:29.560 Go to Casper.com slash Beck.
00:17:31.420 Use the promo code Beck and save $50 on the purchase of select mattresses.
00:17:35.920 That's Casper.com slash Beck.
00:17:38.680 Promo code Beck.
00:17:39.860 Save $50 off the purchase of select mattresses.
00:17:42.560 Terms and conditions do apply.
00:17:44.160 It's Casper.com slash Beck.
00:17:47.980 Glenn Beck Mercury.
We'll have a fascinating conversation next hour with a guy who can tell you a little bit about what the future is going to be like in a very entertaining way.
00:18:09.040 We will take you there next hour.
00:18:12.100 That's a terrible story, too, about these kids that were chained in the basement.
00:18:16.880 That's the big story today about this Turpin family.
00:18:19.420 When you were reading the story, did you have a moment at all, like I did, of thinking about Hank Johnson?
00:18:24.560 Remember Hank Johnson, the congressman?
00:18:26.880 And so he asked the military guy, do you think if we put too many troops in Guam, the whole thing will tip over and capsize?
00:18:38.480 If you remember his response, it was, we don't anticipate that.
00:18:43.480 Right.
00:18:43.660 And there's just something about the way that, like, officials respond to these things that's just, in the most serious circumstance, is really funny.
00:18:52.940 And so in this one, they went and they interviewed, or it was a police statement where they were talking about this terrible case.
00:18:59.120 13, you know, kids, and they're all, like, chained up in a basement.
00:19:02.060 And it's inexplicable.
00:19:02.860 These old, you know, 20-year-olds that look like 10-year-olds because they're so thin.
And they asked him what happened, and he said, and the quote was, David and Louise were unable to immediately provide a logical reason why their children were restrained in that manner.
00:19:17.720 What would the logical reason be?
00:19:20.980 I know.
00:19:21.560 That is such a great quote.
They were unable to immediately come up with a logical reason why they chained 13 kids to their beds.
00:19:27.460 I'll give you time on that.
00:19:29.060 I'll give you a couple hours.
00:19:30.660 Take a couple hours.
00:19:30.940 Take the week.
00:19:31.780 Can you come up with a logical reason to chain your kids to the bed?
00:19:38.700 I can't come up with it.
00:19:40.700 I got to say, I mean, let's see.
00:19:42.640 Maybe if there was a, you were worried about them escaping because there was a nuclear holocaust outside.
00:19:51.860 Nope.
00:19:52.460 No, I don't think that that would work.
00:19:55.020 You thought they were possessed by the devil.
00:19:57.000 Nope.
00:19:57.320 No, no, no.
You thought they were violent to themselves and others.
00:20:05.380 Nope.
00:20:06.000 They wanted to be prepared if gravity reversed itself.
00:20:09.660 That could be.
00:20:10.340 That could happen.
00:20:10.820 That could be.
00:20:11.560 That could happen.
00:20:12.320 At any moment.
00:20:13.140 At any moment.
00:20:13.840 They could be floating up.
Well, they wouldn't have floated away. All they would have to do is close the windows.
Everyone else would have floated up, though.
Well, yeah, that's true.
You would have hit the ceiling.
Just close the windows, and I've got to pull them down from the ceiling again.
00:20:24.640 Glenn.
00:20:25.480 Back.
00:20:26.640 Mercury.
00:20:26.960 Mercury.
00:20:28.360 You're listening to the Glenn Beck Program.
This Aziz Ansari story is really what's going to separate the men from the boys, the revolutionaries from the sane.
00:20:54.740 Um, Margaret Atwood.
She's the woman who wrote, um, The Handmaid's Tale.
00:21:01.900 She's been a, you know, a feminist for forever.
Um, she is now, I think, 78 years old, and she wrote this article for The Globe and Mail last week, um, entitled, Am I a Bad Feminist?
00:21:17.780 And so, I read this yesterday, and I read the whole thing, not just the highlights, like everybody else, and I read the whole thing.
She said, It seems like I'm a bad feminist, and I can add that to the other things that I've been accused of since 1972, such as climbing to fame up a pyramid of decapitated men's heads (that from a lefty journal), of being a dominatrix bent on the submission of men (a righty one), of being an awful person who can annihilate with her magic white witch powers anyone critical of her at Toronto dinner tables.
00:21:54.980 She said, I'm so scary, aren't I?
00:21:58.240 And now, it seems I'm conducting a war on women, like the misogynistic, rape-enabling, bad feminist that I am.
00:22:07.820 My fundamental position is that women are human beings, with the full range of saintly and demonic behaviors this entails, including criminal ones.
00:22:20.940 Women are not angels, incapable of wrongdoing.
00:22:24.960 If they were, we wouldn't need a legal system.
00:22:28.240 Nor do I believe that women are children, incapable of agency or making moral decisions.
00:22:35.200 If they were, we're back to the 19th century, and women should not own property, have credit cards, have access to higher education, control their own reproduction, or vote.
00:22:46.640 There are powerful groups in North America pushing this agenda, but they're not usually considered feminists.
00:22:54.020 Furthermore, I believe that in order to have civil and human rights for women, there have to be civil and human rights, period.
00:23:03.840 Including the right to fundamental justice.
00:23:07.960 Just as for women to have the vote, there has to be a vote.
00:23:12.520 Do good feminists believe that only women should have such rights?
00:23:18.260 Surely not.
00:23:19.780 That would be a flip of the coin on the old state of affairs in which only men had those rights.
00:23:26.360 So let me suppose that my good feminist accusers and the bad feminists, that is me, agree on the above points.
00:23:34.300 Where do we diverge?
00:23:36.140 How did I get into such hot water with good feminists?
00:23:39.440 In November 2016, I signed as a matter of principle, as I have signed many petitions, an open letter called UBC Accountable.
00:23:49.020 This is the University of British Columbia.
00:23:51.220 She goes into how the University of British Columbia treated one of its former employees, Stephen Galloway.
00:23:58.440 He was the former chair of the Department of Creative Writing.
00:24:02.780 He was accused of something.
00:24:08.440 He wasn't even allowed to know what he was accused of.
00:24:12.240 They did this inquiry.
00:24:14.560 They had multiple witnesses in interviews.
00:24:18.160 Finally, a judge said there was no sexual assault.
00:24:23.340 Yet he got fired anyway.
00:24:26.080 His life was destroyed.
And she's looking at this going, no, no, this is a witch hunt.
00:24:32.440 This is a witch hunt.
She said any fair-minded person would withhold judgment as to guilt until the report and the evidence are available for us to see.
00:24:42.680 But this is witch talk.
And she talks about how we're going into these witch trials now, and we've got to stop.
We can't have witch trial justice.
Now, another feminist writes.
00:25:01.500 Because Margaret Atwood was called a blood drinking monster.
00:25:05.620 The things that they said about her were just horrendous.
00:25:11.580 So another woman writes and she says.
00:25:14.640 Margaret Atwood, as an enemy of feminism, is a tough concept to get your head around.
She is, after all, the author of The Handmaid's Tale, the universally acclaimed dystopian fantasy in which women are enslaved to men.
00:25:27.620 Her impressive body of work, one that has profoundly informed the feminist zeitgeist, is a 50 year long attack on misogyny and the patriarchal state.
Ms. Atwood is probably the leading feminist author in the world.
00:25:41.460 So what happened?
00:25:42.980 Listen to this.
00:25:44.860 What happened is that the revolution has entered a new phase.
Having vanquished the reactionaries, the Jacobins are now sending the moderates to the guillotine.
Buildings have to be razed so society can begin anew.
00:26:02.340 And everyone who isn't for them is against them.
Moderates like Ms. Atwood and their odious ideas about due process and the presumption of innocence until proven guilty are traitors to the revolution.
One letter to The Globe put it another way.
00:26:18.340 Revolution isn't about justice.
00:26:20.660 It's about change.
00:26:21.960 This is a frightening thing to behold.
00:26:29.300 We as a society are now starting to go into a new period of this revolution.
00:26:39.300 It was not about justice.
00:26:42.440 It was about change.
00:26:45.480 What did we say in 2007-2008?
00:26:51.880 Change to what?
00:26:54.320 Change to what?
00:26:56.140 Nobody would ever define the change.
00:26:58.840 And so this revolution now has happened.
00:27:02.600 And now the revolutionaries are taking everyone, kicking and screaming.
00:27:08.560 And these are the same people who said George Bush was horrible because he said, if you're not with us, you're against us.
00:27:15.120 That was a horrible thing.
00:27:16.800 That's what they're saying now.
00:27:18.380 And there's this standard that has nothing to do with human rights.
00:27:32.000 It's against human rights.
00:27:34.360 This, uh, Aziz, Aziz Ansari, this story is, is remarkable.
And even Ashleigh Banfield, who I don't think I've ever agreed with on anything, she gets on CNN.
00:27:55.080 And I want you to listen to what she said about this.
00:27:57.340 But what you have done, in my opinion, is appalling.
00:28:01.760 You went to the press with the story of a bad date.
00:28:05.580 And you have potentially destroyed this man's career over it right after he received an award for which he was worthy.
00:28:12.240 If you were sexually assaulted, go to the cops.
00:28:15.500 If you were sexually harassed, jeopardizing your work, speak up and speak out loud.
00:28:21.420 But by your own descriptions, that is not what happened.
00:28:24.920 You had an unpleasant date, and you did not leave.
00:28:28.840 That is on you.
00:28:30.680 And all the gains that have been achieved on your behalf and mine are now being compromised by the allegations that you threw out there.
00:28:38.000 And I'm going to call them reckless and hollow.
I cannot name you publicly and sentence you to a similar career hit as Ansari's, because you chose to remain anonymous.
00:28:48.600 Lucky you.
00:28:50.540 Wow.
00:28:52.640 Wow.
00:28:53.080 I mean, jeez, if a guy said that, their career would be over, right?
00:28:57.900 Oh, if a guy said that?
00:28:59.480 Mm-hmm.
00:28:59.980 Let me give you some other things.
00:29:01.440 You know, babe.com is the rag that put this charge out?
00:29:08.140 Right.
00:29:09.220 From the unnamed source.
00:29:10.880 Yeah.
00:29:11.060 Who had a bad date.
00:29:13.100 Explain exactly what happened.
00:29:15.220 They went to his apartment, started the date at the apartment.
00:29:19.200 They had met at a party, and she was impressed by his celebrity.
00:29:21.840 I mean, if you don't know who he is, he's a comedian.
00:29:24.600 He does the show Master of None on Netflix, among other things.
00:29:27.480 And so they went to his apartment.
00:29:31.120 At one point, he offered her a glass of wine, and he brought her white wine.
00:29:38.780 And in the article, it was an interesting, revealing moment where she says, it was white wine.
00:29:44.120 Of course, I prefer red.
00:29:46.540 Like, as if she couldn't say, can I have some red wine?
00:29:49.780 It was like he was forcing her to accept this white wine, even though she preferred red.
00:29:56.020 And it's like such a strange detail to put in there, just showing that she...
That is, if it is true that he wouldn't let you have red wine, and you said, no, I prefer red, and he said, she'll have the white, that should have been the end of the date right there.
00:30:11.640 Right.
00:30:11.860 And that's not what happened.
00:30:13.040 She didn't even express, by her own telling, that she actually wanted red wine.
00:30:17.120 She just accepted the white wine.
00:30:18.720 And again, it's another part of, she has no agency.
00:30:20.900 She's not a person.
00:30:21.640 She's, you know, she is an underling in society.
00:30:26.640 She can't make her own decisions.
00:30:27.780 She can't express her own wants and desires.
00:30:30.440 So they go out.
00:30:31.480 They have a quick date.
00:30:32.360 They go through, as she puts it, they rush through a dinner at a fancy restaurant.
00:30:36.620 Now, so far, this guy's given her wine, and he's brought her to a fancy restaurant.
00:30:40.360 There's no indication he's been a jerk to her in any way.
00:30:44.200 They get back.
00:30:45.240 They start kissing.
00:30:46.540 They eventually perform oral sex on each other.
00:30:49.500 Which she said she was uncomfortable with.
00:30:55.280 Okay.
00:30:56.080 But then she left.
00:30:58.000 She came back.
00:30:59.580 And they did it again.
00:31:01.040 Right.
00:31:01.980 And so she claims, through this, she gave nonverbal cues, which should have told him to stop.
00:31:08.940 At one point, she did say something like, I kind of want to take it slow, when it came to the final act.
00:31:13.420 Right.
00:31:13.700 Like, it seemed like, by her description, she had drawn some sort of line that she didn't want to go all the way that night.
00:31:20.420 And he, she tried at one point.
00:31:23.060 She said, let's take it a little slow.
00:31:24.960 They stopped.
00:31:26.080 They started up again.
00:31:27.520 And it got close to that point again.
00:31:29.920 She decided, you know, I don't want to do this.
00:31:31.340 And she left.
00:31:32.840 He did not say, no, you will do this.
00:31:35.260 I demand it.
00:31:36.180 When she asked to slow down, he slowed down.
00:31:38.600 And she left.
00:31:39.880 And he was so sure that this was a consensual, fine evening, even though he wanted to have sex with her.
00:31:45.500 And she decided to stop it at one level that the next day he, he texted her and said, hey, it was really nice meeting you last night.
00:31:52.640 I had a great time.
00:31:53.560 And then she responded with this 45, you know, 100 word text of, I did not feel fine.
00:31:59.680 And here's why.
00:32:00.360 And she was very upset.
00:32:01.820 He responded and said, oh, my God, I had no idea you felt that way.
00:32:05.980 I'm really sorry if you felt that way.
00:32:07.940 I feel terrible.
00:32:09.460 And that was it until she decided to write an article anonymously about the incident.
00:32:12.900 Rupert Murdoch put six million dollars into Babe.
00:32:15.600 He was one of the first big investors of Babe.
00:32:17.540 The average writer is about 25 years old.
00:32:20.960 Female.
Now, I just, you said a minute ago, imagine if a woman or a man said what Ashleigh Banfield just said.
00:32:29.520 OK.
00:32:31.360 Let me give you a couple of the stories.
And I have to say, I can't read very much of these stories because they're absolutely pornographic.
00:32:39.700 OK.
00:32:40.320 Carolyn Finney.
Her headline is: How to trap your man who doesn't know he's your man, but is still your man, before Valentine's Day.
00:32:54.140 OK.
Now imagine if I said, hey, I'm going to write an article about how to trap that woman who doesn't know she's your girlfriend, but she's still your girlfriend, until Valentine's Day.
00:33:06.640 The men writing articles about trapping women doesn't go over that well, typically.
00:33:10.260 Yeah.
00:33:10.920 Yeah.
00:33:12.420 Cuffing season is officially over, which means your current victim is crafting a convenient
00:33:19.020 excuse to leave you before the big day of Valentine's Day.
00:33:22.860 Your current victim.
Imagine me writing that.
Yeah.
Writing that you have female victims is not usually looked upon kindly.
Kindly.
Right.
When men write that.
00:33:36.420 Now, listen to this.
00:33:37.200 Who can blame them?
You know you're manic, but you thought your head game would make up for it.
Nevertheless, three to five meltdowns and half a breakup later, he's weary and you're mad and he's still not sure why you think you're dating him in the first place.
00:34:03.420 This, this is only one.
00:34:06.520 Oh, the stuff.
00:34:07.420 I mean, I, the stuff you can't read, which you read to me and I'm internally scarred from
00:34:12.220 you reading to me earlier today.
00:34:14.020 It's just stunning.
00:34:15.360 It is stunning.
And what's amazing about it is it's all written from this perspective of, oh gosh, we're feminists.
00:34:21.960 We can do anything.
00:34:23.060 We're in control of our sexuality.
00:34:24.500 Look at us.
And then, at the same time, she can't even ask for red wine.
00:34:30.040 This talks about how you trap a man in bed, how you get him to do what you want him to
00:34:36.880 do by fooling him and trapping him.
00:34:39.420 So please don't lecture the rest of us on how to behave because if my daughter, if I
00:34:48.420 behaved this way, forget my daughter.
If I behaved the way you are, uh, condoning and encouraging, you would have the right to
00:35:00.020 describe me as a despicable human being.
00:35:04.200 Um, and by the way, uh, babe.net is the site.
00:35:07.700 Don't, don't try the other one.
00:35:08.940 Oh, don't try the other one.
00:35:09.660 Okay.
00:35:09.840 Sorry.
00:35:10.120 Sorry about that.
00:35:11.560 Whoops.
00:35:13.780 Blinds.com.
00:35:15.080 The best at what they do.
00:35:17.860 Blinds.com.
00:35:18.880 Marlene from Ohio.
She said this: Blinds.com is by far the most customer-friendly company I have ever experienced.
00:35:26.060 The customer service representatives are knowledgeable about the products.
00:35:29.680 They take a lot of time helping me with my selection of even just the color.
00:35:33.820 The blinds were easy to install.
00:35:35.720 They look beautiful.
00:35:36.600 The quality is excellent.
00:35:37.820 I highly recommend blinds.com.
00:35:39.880 Don't take it from me.
00:35:41.220 I've told you for years how great they are, how great the services.
00:35:44.360 Stu has used them.
He put, um, shutters in, uh, in his house and had exactly the same kind of experience.
00:35:53.100 In fact, did they down sell you?
00:35:55.540 Didn't they?
00:35:56.000 Yeah.
They actually downsold me. I guess I was looking at, well, of course my wife was choosing.
And so we were going to go with the top of the line, which was a stunning surprise, that was.
00:36:03.940 Uh, and they said, actually this other kind is the same quality and maybe even a little
00:36:08.640 bit better and it's actually cheaper.
00:36:09.980 You should try that one, which we did.
00:36:11.600 And it's, they were amazing.
00:36:12.720 They were totally right.
00:36:13.320 So their service is great.
00:36:14.980 Their product is great.
00:36:15.820 Their prices are great.
00:36:16.720 What are you waiting for?
00:36:17.940 Uh, go to blinds.com right now.
Blinds.com slash Beck. Uh, right now through January 28th, you're going to save 20% sitewide at blinds.com slash Beck. Rules and restrictions do apply.
00:36:32.760 Glenn Beck, Mercury.
00:36:35.080 See, there's a new app out now that allows you to create a consent contract before you
00:36:55.960 sleep with somebody like this.
00:36:58.160 This is, uh, now we can all have it all official.
00:37:00.260 Yeah.
00:37:00.680 You just, you know what?
00:37:01.760 Sign here, sign here, initial this and sign and I'm hot and ready to go now.
00:37:09.580 It's not sexy, but also aren't you giving away, like, can you imagine if you actually
got really assaulted after this, the person would have a legally binding contract saying you agreed. It's not a good move.
00:37:19.620 That'd be a form contract.
00:37:21.420 You need the, you know, the long extended version.
00:37:24.280 Mercury.
00:37:31.760 You know, Sweden is, is a fantastic, um, country.
00:37:44.480 It is a country that is known as the most refugee friendly country in the world.
00:37:49.500 And that, that is something that they are very proud of, uh, and something that they
00:37:55.340 really lived in World War II and they're living in now.
00:37:59.360 When you go to Sweden, if you're an immigrant, you're given free housing, money, language lessons,
00:38:05.180 even a salary while you search for a job.
00:38:09.940 They are leading the world in, uh, tolerance and acceptance.
00:38:15.440 Or so you would think they are leading the world unless you're a Christian.
There is a shocking story coming out of Sweden now that is kind of being buried, about Aideen Strandsson.
She was a popular TV and movie actress in her home country of Iran.
00:38:35.440 One day she witnessed a woman getting stoned to death and she thought, I gotta get, I gotta
00:38:40.600 get out of here.
00:38:41.840 Not long after she had a dream, uh, it was a dream of Jesus and she decided that she wanted
00:38:48.440 to convert, but she had to do it in private and secret because in Iran, converting to Christianity
00:38:53.720 is deadly.
00:38:55.620 So she left for Sweden because Sweden was taking in immigrants and refugees.
00:38:59.980 She decided to immigrate on a work visa.
00:39:04.080 When she got to Sweden, it's when she decided to make her conversion public to not live a lie
00:39:09.820 anymore.
00:39:10.700 Leaving the Islamic faith is illegal in Iran, punishable by death.
00:39:14.560 But Swedish immigration recently has decided now to deny her asylum request and block her
00:39:21.280 from getting a job.
00:39:23.200 The UN and Swedish immigration policy states that an immigrant cannot be denied asylum if
00:39:29.700 the seeker faces imminent danger upon arrival back at their home country.
00:39:34.460 She was a public figure.
00:39:36.460 She's getting death threats.
00:39:37.840 If she's deported back to Iran, she will face imprisonment, rape, and execution.
00:39:44.960 Why?
00:39:46.000 Because of her faith.
00:39:49.120 Kind of an odd story from the most tolerant and accepting society on the planet.
00:39:53.100 I don't know what's happening to Sweden.
A recent investigation done by a Swedish newspaper uncovered a program the Swedish government was running to protect ISIS terrorists arriving from Syria.
00:40:08.240 Apparently, these poor jihadists were having a hard time finding jobs because their pictures
00:40:12.980 and starring roles in propaganda videos were scaring off employers.
00:40:18.220 Imagine that.
00:40:18.920 So the Swedes did an undercover operation and fixed all that with brand new identities and protected status.
00:40:29.720 What the hell is what?
00:40:31.100 What are we doing?
00:40:32.720 I don't care what faith you're in.
If you're being persecuted because of your faith, we need to protect you.
But if you're being persecuted because you had a starring role in a terrorist recruiting film, I don't get it.
00:40:49.680 There is a global war being waged right now.
00:40:53.120 And in some places like here in the United States, the war on faith is being fought against ideology.
00:41:00.300 In other places like Iran, the war is literally life and death.
00:41:05.960 We've come full circle in the Middle East, a return to the first century.
00:41:10.640 And like then, the time has come to show the world that a church is meaningless.
00:41:17.460 A church is how you behave.
00:41:19.700 It's more than brick and mortar.
00:41:21.620 It's not a place.
00:41:22.620 It is a state of being.
00:41:24.720 I don't know what it means to be a Christian, quite honestly, anymore.
00:41:27.680 It's supposed to change us.
00:41:33.160 But a church is about people.
00:41:36.240 And millions of Christians and Yazidis and even Muslims who aren't Muslim enough live under the constant threat of persecution and death.
00:41:45.960 It is time we all stand shoulder to shoulder without any ill will or hatred in our hearts.
00:41:51.500 And we stand with them.
00:41:54.020 Because never again is now.
00:41:57.680 It's Tuesday, January 16th.
00:42:07.080 This is the Glenn Beck Program.
00:42:09.120 I have been immersing myself in future tech to try to understand what is coming our way and what the moral issues are of the near future.
00:42:28.440 What it means to each of us in our lives, what it means to be asked the question, am I alive?
00:42:39.200 Is this life?
00:42:43.600 We have so many questions that we have to answer and we're having trouble with just some of the basic things.
00:42:51.540 And no one is really thinking about the future.
00:42:53.720 When you think about the future and you think about robots or you think about AI, Americans generally think of the Terminator.
00:43:00.920 Well, that's not necessarily what's going to happen.
00:43:07.000 How do we educate our kids?
00:43:09.140 So I've been reading a lot of high tech stuff and I've in my spare time been trying to read some novels and I'm looking for the storytellers, the people who can actually tell a great story that is really based in what is coming.
00:43:27.160 The futurist or the near future sci-fi authors that can show us what's on the horizon.
00:43:38.600 And I found a series of books.
00:43:41.380 It's called the Singularity Series.
00:43:45.860 And I found them over the Christmas vacation and I've just last night finished the fourth one.
00:43:52.300 And they are really, really well done.
00:43:56.420 They are, they get a little dark, but it also shows the positive side of what could be.
00:44:03.160 And it was a balanced look and a way to really understand the future that is coming and is on the horizon.
William Hertling is the author and he joins us now.
00:44:14.860 William, how are you, sir?
00:44:16.660 I'm doing great.
00:44:17.740 Thanks so much for having me on.
00:44:19.440 Congratulations on a really good series.
00:44:23.000 This is self-published?
00:44:25.040 Yep, it is self-published.
00:44:27.120 I could not find a publisher who saw the vision of the series, but I've self-published it and people love it.
00:44:36.400 So you get the word out there.
00:44:38.640 Yeah, you've won several awards for it.
00:44:42.060 And I hope, you know, I don't know what your sales have been like, but I hope your sales are really good.
00:44:48.360 Because I think it, well, let me ask you this.
00:44:53.660 What was the intent of the series for you?
00:44:56.220 You know, what happened was about 10 years ago, I read two books back-to-back.
00:45:04.160 One was Ray Kurzweil's The Singularity is Near, which I know you've read as well.
00:45:08.980 Yep.
And the other one was Charles Stross's Accelerando, which is a fictional book about the singularity.
00:45:15.160 And what I realized at that point in time was that we had the biggest set of changes that were ever going to face humanity.
00:45:22.100 And they were coming, and they were in the very near future, right?
00:45:25.180 They're certainly coming in my lifetime.
00:45:27.040 They're probably coming within the next 10 years.
00:45:29.420 And there's very little out there about that.
00:45:31.860 And as you said, most of the stories that are in media today are about these Terminator-style stories.
00:45:37.780 AI rise up, they take control over the machines, and we fight them in a battle, which, of course, makes for a great movie.
00:45:43.380 I mean, I would love to see The Terminator many times over.
00:45:46.440 But what happens when it's not like that?
00:45:50.420 What happens when it's sort of the quiet AI kind of story?
00:45:53.240 And that's really what I wanted to explore.
00:45:55.220 What happens when there's this moment of emergence of the first AI that's out there, and people realize they're being manipulated by some entity?
00:46:05.200 And what do they do about it?
00:46:06.340 How do they react?
00:46:07.380 So I find this, first of all, you lay it out so well.
00:46:12.340 And you start, the first book starts with the emergence of AI, and then moves, I think the next book is what, 10 years later, five years later?
00:46:26.140 Yeah, all 10 years apart, yeah.
00:46:27.980 They can basically explore different points in technology in the future.
00:46:32.180 Right.
00:46:32.520 So the last one is in the 2040s or 2050s, and it's a very different thing then than it starts out as.
00:46:40.120 And the thing I wanted to talk to you about is, first of all, can you just define, because most people don't know the difference between AI, AGI, and ASI, which is really important to understand.
00:46:57.020 Sure.
00:46:57.740 So AI is out there today.
00:47:00.220 It's any time programmers write a piece of software that instead of having a set of rules, you know, if you see this, then do that.
00:47:09.040 Instead, the AI software is trained to make decisions on its own.
00:47:15.060 So AI is out there today.
00:47:16.520 It's how you have self-driving cars.
00:47:19.160 It's what selects the stories that you read in Facebook.
00:47:22.460 It's how Google search results come about.
00:47:25.320 And AGI is this notion that artificial intelligence will become more general, right?
00:47:31.080 All of those things I mentioned are very specific problems to be solved.
00:47:34.660 How to drive a car is a very specific problem.
So a good explanation of AI would be Deep Blue, the chess-playing IBM computer.
00:47:46.600 It has no general intelligence.
00:47:48.880 It does that.
00:47:50.580 Exactly.
00:47:51.140 Right.
00:47:51.320 Okay.
00:47:51.860 Right.
And we have IBM's Watson, which is really good at making diagnoses about cancer, but you can't have a conversation with it about how you're feeling.
00:48:00.740 Right.
00:48:01.760 But AGI would.
00:48:03.240 AGI would appear to be like a human being, conceivably, in that it could talk and reason about a wide variety of topics, make decisions, generally use its intelligence to solve problems that it hasn't even seen before.
00:48:17.020 Now, AGI can pass the Turing test?
00:48:21.620 Yeah.
00:48:22.240 So the Turing test is this idea that you've got a person in one room chatting with someone in another room, and they have to decide, is that a human being or is it a computer?
And if they can't figure it out, then the computer has passed the Turing test.
You've passed the Turing test if you can't distinguish between a computer and a person.
00:48:45.160 How close are we to that?
00:48:48.380 Well, I think we've probably all been fooled at least a few times when we've either gotten a phone call or made a phone call, and we think that we're talking to a human being on the other end, right?
But it actually turns out that we're talking to a machine that answered our phone call somewhere.
00:49:03.440 So, you know, we're there for like a couple of sentences, but we're still pretty far away if you're going to have any kind of a meaningful conversation.
00:49:12.540 And AGI is when a computer has the computing power of a human brain.
00:49:19.700 Yep.
00:49:20.060 Okay.
00:49:21.360 Now, that's not necessarily a scary thing, but it's what happens when you go from AGI to ASI, artificial superintelligence, and that can happen within a matter of hours, correct?
00:49:38.220 It can.
00:49:39.360 There's a couple of different philosophies on that.
00:49:41.780 But if you can imagine that, think about the computer that you have today versus the computer you had 10 years ago, right?
00:49:50.260 It's vastly more powerful, vastly more powerful than the one you had 20 years ago.
00:49:54.000 So, even if there's not these super rapid accelerations in intelligence, even if you just today had a computer that was the intelligence of a human being, you would imagine that 10 years from now, it's going to be able to think about vastly more stuff, much faster, right?
00:50:14.160 So, we could see even just taking advantage of increasing in computing power, we would get a much smarter machine.
00:50:20.840 But really dangerous, or not necessarily dangerous, the really rapid change comes from when the AI can start making changes to itself.
00:50:31.740 So, if you have today programmers create AI, but in the future, AI can create AI, and the smarter the AI gets, then in theory, the smarter the AI it can build.
00:50:43.460 And that's where you can get this thing that kind of spirals out of control.
So, to get a handle on how fast this can all change: if you had an Apple iPad 2 back in 1998, it would have been one of the top five supercomputers.
00:51:03.000 Okay?
00:51:04.000 That was a top five supercomputer.
00:51:08.520 That's how fast technology is growing on itself.
00:51:13.240 All right, so, William, I want you to kind of outline, we're going to take a break, and I want you to come back and kind of outline why all of this stuff matters.
00:51:25.120 What is in the near future that we're going to be wrestling with, and why people should care when we come back?
00:51:35.740 We have a burglary in America about every eight seconds, and you know what really thwarts them is somebody who has a security system.
00:51:50.340 They find, generally, houses that aren't protected.
00:51:53.120 That's why securing your home is a necessity.
00:51:55.060 But a security system can be really expensive, and if you go the old-fashioned way, you don't ever own it.
00:52:00.860 You're always paying on it, and, you know, you're charged $50 and $60 for the 24-7 monitoring.
00:52:07.740 It's just really ridiculous.
00:52:10.500 So, SimpliSafe has a new, much smarter system.
00:52:16.600 Its sensors are going to protect every point of access in your home.
If a burglar even just breaks the glass or tries to open anything, the sirens go off and, you know, let them know, hey, police are on the way.
00:52:30.260 Police have been called.
00:52:31.260 But it also will take a picture of whoever it is that was trying to break in that you can give to the police when they arrive.
00:52:38.700 SimpliSafe has 24-7 monitoring at $14.99 a month.
00:52:43.880 They never lock you into a long-term contract, and you own the system.
00:52:47.720 Plus, you get a 60-day money-back guarantee, so there's no reason not to try it.
00:52:52.400 SimpliSafe, the security system that you should have in your home, SimpliSafeBeck.com.
00:52:57.380 Go there now, SimpliSafeBeck.com.
00:53:02.060 Glenn Beck.
00:53:03.900 Mercury.
00:53:13.040 Glenn Beck.
00:53:13.960 As you know, if you're a long-term listener of the program, I am very fascinated with the future and what is coming, the future of tech and artificial intelligence.
William Hertling is an author and a futurist.
00:53:28.200 He is the author of what's called the Singularity Series.
00:53:31.220 It's a series of four novels that kind of break it down and tell you exactly what's coming and break it down in an entertaining fashion.
00:53:39.640 I highly recommend the Singularity Series.
00:53:43.320 If you are interested in any of this, you need to start reading that.
00:53:47.140 You will really enjoy it.
00:53:48.540 William, I know Glenn is a big fan of your work and has been reading a lot about technology.
00:53:52.700 I think a lot of people who are living their daily lives aren't as involved in this.
00:53:57.880 I think a third or a half of the audience, when you hear AI, don't even connect that to artificial intelligence until you say it.
00:54:05.740 I know as a long-term NBA fan, I think Allen Iverson, honestly, when I hear AI.
00:54:11.700 So can you make the case of with everything going on in the world, why should people put this at the top of their priority list?
00:54:18.800 Well, it's the scale of the change that's coming.
00:54:24.040 And probably the nearest thing that we're really going to see is over the next five years,
00:54:28.580 we're going to see a lot more self-driving cars and a lot more automation in the workplace.
00:54:35.040 So I think transportation jobs account for something like 5% of all jobs in the United States.
00:54:41.620 And whether you're talking about driving a car, a taxi, driving a delivery truck,
00:54:47.100 all of those things are potentially going to be automated, right?
00:54:50.120 This is one of the first really big problems that AI is tackling.
00:54:54.080 And AI is good at it.
00:54:56.020 So AI can drive a car and it can do a better job.
00:54:59.720 It doesn't get tired.
00:55:00.820 It doesn't go out and drink before it drives.
00:55:03.320 And it doesn't make mistakes.
00:55:05.980 Well, that's not quite true.
It's going to make mistakes, but it's going to make fewer mistakes than your typical human operator.
00:55:10.940 So business likes to save money and it likes to do things efficiently.
00:55:17.560 And self-driving cars are going to be more cost effective.
00:55:19.680 They're going to be more efficient.
00:55:20.980 So what happens to those 5% of the people today who have transportation jobs, right?
00:55:27.360 This is probably going to be the biggest thing that affects us.
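A quick back-of-the-envelope sketch of what that "5% of all jobs" figure could mean, in Python; the 150 million workforce total is an assumed round number for illustration, not something given on the air.

# Rough scale of the "something like 5%" transportation-jobs claim.
# total_us_jobs is an assumed round figure, not from the broadcast.
total_us_jobs = 150_000_000
transport_share = 0.05  # "something like 5%" per Hertling

workers_affected = total_us_jobs * transport_share
print(f"{workers_affected:,.0f} transportation jobs potentially automated")
# prints: 7,500,000 transportation jobs potentially automated

Even under those loose assumptions, that is millions of drivers, which is the scale Hertling is pointing at.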
00:55:30.060 I think, William, that Silicon Valley had better start telling the story in a better fashion, because as these things hit, we all know politicians on both sides will blame somebody.
00:55:45.960 They're telling everybody, I'm going to bring the jobs back.
00:55:49.040 The jobs aren't coming back.
00:55:50.340 In fact, many, many more are going to be lost, not to China, but by robotics and AI.
00:55:57.060 And when that happens, you know, I can see politicians turning and saying, it's these robot makers.
00:56:04.140 It's these AI people.
00:56:07.900 Yeah, naturally.
00:56:09.620 And yet, unfortunately, the AI genie is out of the bottle, right?
00:56:13.300 Because we're investing in it.
00:56:15.060 China's investing in it.
00:56:16.580 Tech companies around the world are investing in it.
00:56:18.880 If we stop investing in it, even if we said, hey, we don't want AI, we don't like it,
00:56:23.700 all that's going to do is put us at a disadvantage compared to the rest of the world.
00:56:28.400 So it's not like we can simply opt out.
00:56:30.820 Not really.
00:56:31.500 We don't have that option. It's moving forward.
00:56:33.480 So we need to participate in it, and we need to shape where it's going.
00:56:36.920 And I think this is the reason why it's so important to me that more people understand
00:56:41.180 what is AI and why it matters.
00:56:43.260 Because we need to be involved in a public conversation about what we want society to look like in the future.
00:56:48.640 As we go out, if even more jobs are eliminated by AI, what does that mean?
00:56:53.340 What if we don't have meaningful work for people?
00:56:55.800 I think the thing I like about your book series is it starts out really hopeful, and it shows that this technology is not going to be something that we are likely to refuse, because it's going to make our life incredibly stable and easy in some ways.
00:57:22.280 And I'd like you to talk a little bit about the stock market and the economy and war and everything else, something that you talk about in your first novel, and show us, when we come back, the good side and then what it could turn into.
00:57:39.900 So Allen Iverson is taking our transportation jobs?
00:57:42.640 Yes.
00:57:43.220 Okay.
00:57:43.520 That's what I got for that.
00:57:47.780 Glenn Beck.
00:57:49.760 Mercury.
00:57:52.280 This is the Glenn Beck Program.
00:58:03.600 We're talking to William Hertling.
00:58:05.260 He is an author and futurist, the author of many books.
00:58:09.620 His latest is Kill Process.
00:58:11.580 I'm talking to him about the Singularity series.
00:58:15.440 And the first one in the series is Avogadro Corp.
00:58:18.580 It starts out around this time, with a tech center in Portland and a guy working on a program that will help everybody with their email.
00:58:35.360 All of a sudden he makes a couple of changes, and unbeknownst to him, it grows into something that is thinking and acting and changing on its own.
00:58:44.220 And William, I'd like you to take us through this, because the first book starts out really kind of positive,
00:58:51.140 where you're looking at this, and there's some spooky consequences, but you're looking at it and going,
00:58:56.120 you know, I can see us.
00:58:57.680 I'd kind of like that.
00:58:59.020 And by the end, in the fourth book, you know, we have all been digitized,
00:59:03.760 and we're in a, you know, a missile leaving the solar system because Earth is lost.
00:59:12.680 Do you think this is your prediction, or do you just think this is a really good story?
00:59:18.320 Well, you know, I think a lot of it has the potential to be real.
00:59:26.160 And I think one of the things you probably know from reading my work is that I'm fairly balanced, and what I see is both the risks and the benefits.
00:59:32.080 I think there's both.
00:59:33.800 I get very upset; there are so many people that are very dogmatic about artificial intelligence and the future, and they either say, hey, it's all benefits and there are no risks, or they only talk about the risks without the benefits.
00:59:45.280 But there's a mix of both, and it's like any other technology, right?
00:59:49.060 We all love our smartphones.
00:59:51.480 We all find our smartphones to be indispensable.
00:59:54.160 And at the same point in time, they affect us, right?
00:59:57.080 And they have negative effects.
00:59:58.600 And society is different today than it was 10 years ago because of our smartphones.
01:00:03.920 This is different, though, from anything else that we've seen, like a smartphone, because this is like, you know, an alien intelligence.
01:00:12.500 And so we don't have any way to predict what it's going to be like or what it's going to do
01:00:16.540 because it will be thinking, and it most likely will not be thinking like a human.
01:00:20.940 But can we start at the beginning where, just give me some of the benefits that are going to be coming
01:00:25.640 in the next, you know, let's say 10 years that people are going to have a hard time saying no to?
01:00:32.000 Sure.
01:00:32.540 I mean, first of all, we already talked about self-driving cars, right?
01:00:36.080 I think we'd all like to get into our car and be able to do whatever we want to do
01:00:40.080 and not have to think about driving.
01:00:41.860 That's going to free us up from a mundane task.
01:00:44.880 We're going to see a lot more automation in the workplace,
01:00:49.500 which means that the cost of goods and services is going to go down.
01:00:53.400 So we're going to be able to get more for less.
01:00:56.220 So that's going to seem like an economic boom to those of us that can afford it, right?
01:01:01.220 We are going to enjoy more things.
01:01:04.740 We are going to have better experiences when we interact with AI.
01:01:09.720 So today, if you have to go to the doctor, you're going to wait to get a doctor's appointment.
01:01:14.220 You're going to go in.
01:01:15.140 You're going to have this rushed experience more than likely, at least here in the U.S., right?
01:01:20.340 And you're going to get five minutes of their time,
01:01:22.460 and you're hoping that they're going to make the right diagnosis in that five minutes that they're with you.
01:01:26.660 So that's going to be, I think, one of the really big changes over five to ten years from now
01:01:32.620 is we're going to see a lot more AI-driven diagnosis.
01:01:36.060 So when you're having medical issues, you can go in and you can talk to an AI.
01:01:40.160 That'll be more or less indistinguishable from talking to the nurse when you walk into the doctor's office.
01:01:45.620 And by the time the doctor sees you, there'll already be a diagnosis made by the AI,
01:01:50.620 and it'll likely be more accurate than what the doctor would have done,
01:01:54.460 and all they're going to do is sign off on it.
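For readers who want the shape of that workflow in code: a minimal, purely hypothetical Python sketch of AI-drafted diagnosis with physician sign-off. The symptom rules are invented placeholders, not medical logic and not anything Hertling describes in detail.

# Hypothetical sketch: the AI drafts a diagnosis, the doctor only signs off.
# Every rule below is a made-up placeholder, not real medical logic.
def ai_draft_diagnosis(symptoms):
    if {"fever", "cough"} <= symptoms:
        return "suspected respiratory infection"  # placeholder rule
    if "chest pain" in symptoms:
        return "escalate for cardiac workup"      # placeholder rule
    return "inconclusive: full human evaluation"

def doctor_sign_off(draft, approve):
    # The physician reviews the AI's draft instead of starting from scratch.
    return draft if approve else "doctor overrides: new exam ordered"

draft = ai_draft_diagnosis({"fever", "cough"})
print(doctor_sign_off(draft, approve=True))

The point of the sketch is only the division of labor: the machine does the intake and the first pass, and the human's role shrinks to review.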
01:01:56.080 Yeah, until I started reading about Watson, I had a hard time believing that, you know, people would accept something from a machine, but they are so far ahead of doctors if they're fed enough information.
01:02:11.980 They're so much further ahead than people at, you know, predicting cancer and diagnosing cancer.
01:02:17.460 I think it's going to be a quick change.
01:02:19.080 You're going to want to have the AI diagnose you.
01:02:23.000 Right, because that's going to be the best, right?
01:02:25.260 When we go to the doctor, we want the best.
01:02:27.160 We don't want the second best.
01:02:29.780 So we're going to see a lot of that.
01:02:32.900 And then, you know, 10, 15 years out, we start, and you know, it's funny.
01:02:37.960 I had a conversation with my daughter one day, and she asked,
01:02:40.760 hey, Dad, when am I going to get to drive a car?
01:02:42.860 And I thought about her age, and I thought about that, and I was like, well, I'm not sure you're
01:02:49.060 ever going to get to drive a car, because, you know, where you are and when self-driving
01:02:53.440 cars are coming, you may never drive a car.
01:02:56.460 And so you'll just get in one, and it'll take you where you want to go.
01:02:59.820 So there are going to be these changes that are both subtle and yet dramatic in society, when you think about, hey, we're going to have a generation of people who may never learn how to drive a car, right?
01:03:09.960 And their time will be free to do other things, but it'll be different from the way we are.
01:03:15.080 Do you see... in your first book, you talk about AI changing the emails that are being sent, doing things on its own, and really manipulating people.
01:03:29.920 We are already at the point where we accept the manipulation of what we see in our Facebook feed, but that's not a machine trying to do anything but give us what we want.
01:03:44.220 Right.
01:03:45.020 Or do you see us very far away from, you know, hedge fund computers that can really manipulate the markets in a positive way, or computers that can begin to manipulate for peace, as you put in your first book?
01:04:06.020 It's a good question.
01:04:07.920 We're definitely going to see that.
01:04:09.440 At a minimum, we can imagine that if you have an authoritarian government, they're going to distribute information to pacify people.
01:04:19.960 And that's not a good thing often, but in some ways it is.
01:04:26.800 I mean, you know, if you have armed unrest, people will die.
01:04:30.260 So there's a balance there.
01:04:32.520 I think what we're going to see is we're just going to see lots of different people using
01:04:35.500 technology in lots of different ways.
01:04:37.260 So maybe we don't have a, you know, a hedge fund manipulating the markets in a positive way.
01:04:43.740 Maybe it starts with a bunch of hackers in another country manipulating the markets to
01:04:49.980 make money, right?
01:04:51.260 But I think we are going to see that distribution, that manipulation of information.
01:04:55.600 And it's hard.
01:04:56.480 It's out there now, right?
01:04:57.560 There's content.
01:04:59.080 A lot of the content that you read on the web, whether it's a review of a restaurant or
01:05:03.540 a business, a lot of that is already generated by AI.
01:05:06.660 And it's hard to tell what's an AI versus a person writing a genuine review.
01:05:11.760 Talking to William Hertling, he's an author and futurist, author of a great series of novels
01:05:17.340 called the Singularity Series.
01:05:20.160 William, the idea that super intelligence, not AI, not narrow AI, but, you know, artificial general intelligence, just kind of comes out of nowhere, as it does in your first novel, where it wasn't the intent of the programmer, is interesting to me.
01:05:49.680 I sat with one of the bigger names from Silicon Valley just last week.
01:05:56.720 Uh, and we were talking about this and he said, whoever controls AI, whoever gets this first
01:06:05.100 is going to control the world.
01:06:06.640 He was talking to me privately about, um, a need for almost a Manhattan project for this.
01:06:12.800 Do you see this as something that's just going to be sprung on us, or will it be created, you know, in a lab intentionally?
01:06:23.820 I think the odds are probably strongly biased towards in a lab, both because they have the deeper knowledge and expertise and also because they have the raw computing power, right?
01:06:35.420 They have the computers today that the rest of us will have in 15 to 20 years.
01:06:52.460 That kind of computing power makes AI a lot easier of a problem to solve.
01:06:55.660 So I think it's most likely to come out of, um, a lab.
01:06:59.840 If you're looking at, for instance, the lawsuit that was just filed against Google, about the way they treat people with different opinions, et cetera, my first thought is, good God, what are those people putting into the programming?
01:07:16.720 I mean, that doesn't work out well for people.
01:07:22.500 Are there enough people concerned about what this can do and what this can be that we have the safeguards in place?
01:07:34.680 You know, I really think we don't.
01:07:38.720 I mean, think about the transportation system we have today and the robust set of safety
01:07:43.160 mechanisms we have around it, right?
01:07:44.680 So, um, we want to drive from one place to another.
01:07:47.580 We have a system of streets.
01:07:49.000 We have laws that govern how you drive on those streets.
01:07:51.280 We have traffic lights.
01:07:52.700 Cars have anti-lock brakes.
01:07:54.140 They have traction control.
01:07:55.240 All these things designed to prevent an accident.
01:07:58.200 If you get into an accident, we have all these harm reduction things, right?
01:08:01.580 We have seatbelts and airbags and crumple zones.
01:08:04.360 And after the fact, we have all this, we have a whole system of mitigation, right?
01:08:08.600 We have ambulances and paramedics and hospitals to take care of what damage does result.
01:08:14.540 And in the future, we're going to need that same sort of very robust system for AI.
01:08:19.300 And we don't have anything like that today.
01:08:23.080 And nobody's really thinking about it.
01:08:26.500 Yeah, nobody's thinking about it comprehensively.
01:08:31.360 And one thing you could imagine is, is, well, we'll wait until we have a problem and then
01:08:37.180 we'll put those safety mechanisms in place.
01:08:39.460 Well, the problem of course, is that AI operates at the speed of computers, not at the speed
01:08:44.020 of people.
01:08:44.800 And there's a scene in one of my books, I'm sure you remember reading it, where there's a character who witnesses a battle between two different AI factions.
01:08:54.880 Yes.
01:08:55.080 And the whole battle takes place.
01:08:57.920 A lot of things happen between the two different AI factions, all in the time it takes the human
01:09:03.580 character's adrenaline to get pumping.
01:09:05.720 And by the time he's like primed and ready to fight, the battle is over and they're into
01:09:09.700 negotiations and, and how to resolve it, right?
01:09:12.260 It's remarkable, reading that. That is a great illustration of how fast things will move.
01:09:23.980 It's like one of the best action war scenes I've ever read.
01:09:28.880 Really, really good, page after page after page of stuff happening, and you get to the end and you realize, oh my gosh, the human has hardly even moved.
01:09:39.900 He hasn't even had a chance to think about the first step that happened, and it's already over.
01:09:46.740 Exactly.
01:09:48.080 So this is, this is why we need to be thinking about how are we going to control AI?
01:09:53.460 How are we going to safeguard ahead of time?
01:09:56.100 We have to have these things in place long before we actually have AI.
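To put rough numbers on the speed gap Hertling is describing, here is a minimal Python sketch; both constants are assumed round figures for illustration, roughly a quarter-second human reaction time against a processor doing on the order of a billion simple operations per second.

# How much a machine "does" within one human reaction time.
# Both constants are assumed round numbers, purely for illustration.
human_reaction_s = 0.25   # ~250 ms, a typical human reaction time
machine_ops_per_s = 1e9   # ~1 billion simple operations per second

ops_before_you_react = machine_ops_per_s * human_reaction_s
print(f"{ops_before_you_react:,.0f} operations before a human can even react")
# prints: 250,000,000 operations before a human can even react

That is the asymmetry behind the battle scene: the whole exchange fits inside the time it takes a person's adrenaline to spike.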
01:09:59.940 Isn't it true, though, William, that eventually some bad actor is going to be able to develop this and not put those safeguards in, and we're not going to have a choice?
01:10:09.400 Eventually the downside of this is going to affect everybody.
01:10:14.040 You know, it's, it's very true.
01:10:16.020 And part of the reason why I say, right, we can't opt out of AI.
01:10:19.760 We can't not develop it because then we're just at a disadvantage to anyone who does.
01:10:24.920 And it gets even scarier as you move out.
01:10:27.980 So one of the things that I talk about in my third book, which is set around 2035, is neural implants.
01:10:36.120 I think neural implants, so basically a computer implanted in your brain, the purpose of which
01:10:42.660 is mostly to get information in and out, right?
01:10:44.760 So instead of having a smartphone in our hand where we're trying to read information on
01:10:48.240 the screen, we can get it directly in our head.
01:10:50.520 It makes the interaction much smoother, easier.
01:10:54.080 But it can also help tailor your brain chemistry, right?
01:11:00.860 And so you could imagine, if you're someone who had depression or anxiety or a severe mental disability, that a neural implant could correct for those things.
01:11:09.740 So you would basically be able to flip a switch and turn off depression or turn off anxiety.
01:11:13.420 William, I'm unfortunately out of time.
01:11:16.620 Could I ask you to come back tomorrow and talk and start there?
01:11:19.940 Because that's really the third book.
01:11:21.720 Start with the neural implants and where it kind of ends up with technology, because it
01:11:27.580 is remarkable.
01:11:29.840 And in reading the real science behind it, it's real.
01:11:34.000 It's real.
01:11:34.700 It sure is.
01:11:35.360 It's coming.
01:11:36.440 Could you come back maybe tomorrow?
01:11:38.800 Sure.
01:11:39.220 I'd be happy to.
01:11:39.900 Okay, great.
01:11:40.740 Thanks so much, William.
01:11:41.660 William Hertling, author and futurist.
01:11:44.300 He is the author of the Singularity series.
01:11:52.020 You should get one of those things, Glenn.
01:11:53.480 Uh, you know, that thing that'll alter your brain.
01:11:56.860 William Hertling is the author of all of these books.
01:12:00.960 There are four of them in the Singularity series, plus Kill Process just came out.
01:12:05.980 That's williamhertling.com.
01:12:07.620 Let me ask you this, Glenn.
01:12:08.340 Um, is this the right way to think about it?
01:12:10.680 Uh, this comes in from Twitter at World of Stew, uh, to understand the difference between
01:12:14.820 AI, artificial intelligence, and AGI, artificial general intelligence.
01:12:19.040 So, if there's a self-driving car, and it's AI, you say, take me to the bar, and it says,
01:12:24.680 calculating route, beginning travel.
01:12:27.300 Okay?
01:12:27.680 If you say it to AGI, take me to the bar, it responds, your wife says you drink too much, and my sensors indicate you've put on a few pounds. Routing to the gym.
01:12:36.900 Uh, I have a feeling you're exactly right.
01:12:40.400 That's terrible.
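The distinction in that tweet, as a toy Python sketch; both behaviors are hard-coded purely for illustration, since no real system works this way.

# Narrow AI: does exactly the one task it was built for, nothing more.
def narrow_ai(command):
    if command == "take me to the bar":
        return "Calculating route. Beginning travel."
    return "Command not recognized."

# Hypothetical AGI: weighs goals and context before acting on the request.
def agi(command, context):
    if command == "take me to the bar" and context.get("wife_objects"):
        return ("Your wife says you drink too much, and my sensors indicate "
                "you've put on a few pounds. Routing to the gym.")
    return narrow_ai(command)

print(narrow_ai("take me to the bar"))                    # does what it's told
print(agi("take me to the bar", {"wife_objects": True}))  # second-guesses you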
01:12:41.240 All right, let me tell you about Car Shield.
01:12:42.860 Um, I love my truck.
01:12:45.140 I have two old trucks.
01:12:47.300 One, I think, is a 2008, and the other a 2012 or 14.
01:12:52.680 Um, and, you know, the 2008 is starting to have some sensor problems, et cetera, et cetera.
01:12:57.360 And, you know, when these things start to happen, it's way out of warranty, but I have
01:13:02.360 the extended vehicle service protection from Car Shield, so I can still take it into the
01:13:06.760 mechanic, whether it's, you know, the guy at the dealership or just a mechanic, and
01:13:11.400 they get the mechanic paid, so I don't have to wait for a check or anything else, and I'm
01:13:15.040 not afraid every time I take that truck in to have it serviced, because just, you know,
01:13:20.180 one little sensor, it could be a thousand bucks, new fuel pump, 500, water pump, a thousand.
01:13:26.480 If you need repairs to your car, and we all do, it's all going to happen, they have a plan now that will cover everything from, you know, the water pump to the car's computer.
01:13:39.560 Car Shield, the ultimate in extended coverage. And they get the mechanic paid, so you don't have to pay out of your pocket.
01:13:48.120 Sign up today, get 24-7 roadside assistance in a rental car when yours is in the shop.
01:13:52.560 Do what I do, save yourself from high repair bills, get covered with Car Shield.
01:13:58.840 Call 1-800-CAR-6100, mention the promo code BECK, or visit carshield.com and use the code BECK, and you'll save 10%.
01:14:07.300 That's carshield.com, promo code BECK. A deductible may apply.
01:14:13.640 Glenn Beck Mercury.
01:14:29.100 Glenn Beck.
01:14:30.140 I'll post and tweet the links to William Hertling's books.
01:14:41.140 I've been looking for somebody who can really explain, in an entertaining way through a novel, what's coming our way.
01:14:54.060 Tomorrow we have another author, uh, one of my favorite authors is going to be on with us.
01:14:59.100 And he's got a similar novel, uh, and we'll talk to him as well.
01:15:03.640 More with William Hertling tomorrow.
01:15:14.340 Love.
01:15:15.560 Courage.
01:15:17.260 Truth.
01:15:19.000 Glenn Beck.
01:15:20.620 Good news and bad news, which do you want to hear first?
01:15:23.680 According to researchers at the University of London, it doesn't really matter which one you hear first.
01:15:31.340 Good news or bad news, it doesn't matter.
01:15:34.140 You're more likely to believe the good news, because of something called the desirability bias.
01:15:41.120 Desirability bias is when you consider information more credible because it makes you feel good.
01:15:46.000 It helps explain the whole social media fake news phenomenon.
01:15:50.880 When you see something, it's not confirmation bias.
01:15:54.420 It's desirability bias, and that is actually more difficult and more troubling.
01:16:01.640 Researchers at the University of London set up a study just before the 2016 presidential
01:16:06.080 election.
01:16:06.520 And they, they took 900 voters who were diehard Hillary Clinton or Donald Trump fans, and they
01:16:13.120 asked them, which one do you support and which one is going to win?
01:16:18.440 Researchers then separated the voters into the two groups.
01:16:22.860 They gave the first group polling results that indicated that Trump would win.
01:16:27.080 And they gave the second group polling results indicating that Hillary would win.
01:16:34.020 With this new information, they were asked, who do you think is going to win?
01:16:37.680 The result of the study was clear.
01:16:40.040 The desirability bias changes people's minds.
01:16:43.580 People believed the polling results that were given only when the poll indicated that their candidate would win.
01:16:50.540 They would even change their prediction.
01:16:52.260 If their candidate was shown to be winning in a strong poll, even people who had thought Clinton might win changed their prediction and dug their heels in.
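A minimal Python simulation of that asymmetric updating pattern; the update weights are invented for illustration, and the only number taken from the segment is the 900 voters.

import random

# Toy model of desirability bias: belief is the probability a voter assigns
# to their candidate winning. It moves a lot toward desirable poll news and
# barely at all toward undesirable news. Both weights are invented.
def update(belief, poll_favors_you):
    weight = 0.5 if poll_favors_you else 0.05
    target = 1.0 if poll_favors_you else 0.0
    return belief + weight * (target - belief)

random.seed(0)
voters = [random.uniform(0.4, 0.9) for _ in range(900)]  # 900 voters, per the study
favorable = [update(b, True) for b in voters]
unfavorable = [update(b, False) for b in voters]
print(f"mean belief after a favorable poll:    {sum(favorable)/900:.2f}")
print(f"mean belief after an unfavorable poll: {sum(unfavorable)/900:.2f}")

Run it and the favorable-poll group swings hard toward certainty while the unfavorable-poll group barely budges, which is the pattern the researchers reported.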
01:17:05.200 So what does all of this mean?
01:17:08.660 It means that we are listening to the things that let us believe what we want to believe.
01:17:15.680 The lesson for politics is really clear.
01:17:18.500 And it's something that is a lost art now on both sides of the aisle.
01:17:23.440 If you want to persuade people, you have to get them to want to agree with you.
01:17:31.740 This is the biggest problem.
01:17:33.340 Now people become monsters and pariahs, and they become strident.
01:17:40.140 And so you don't like them.
01:17:44.360 You don't want to agree with them.
01:17:45.820 Reagan was a guy who really understood this.
01:17:49.960 He was called the great communicator.
01:17:51.900 He won 49 states in the '84 election.
01:17:54.460 And that's because he, he said things that people wanted to believe.
01:18:00.300 Now we can't fathom a candidate appealing across the aisle.
01:18:04.600 In fact, I think if you see a candidate that tries to appeal to the other side, you immediately
01:18:08.800 mark them as a traitor.
01:18:10.020 We saw Obama supporters blinded by the desirability bias for eight years.
01:18:16.780 They would not believe reports about the IRS, that he was using the IRS because they didn't
01:18:24.600 want to believe that.
01:18:26.440 Now we're seeing the same thing with the Trump base.
01:18:29.980 We have to move past this concept of the presidency as the ultimate bully pulpit.
01:18:36.460 It's not what the executive office was designed to be, and it will not help heal our divisions.
01:18:44.180 It's Tuesday, January 16th.
01:18:56.180 This is the Glenn Beck program.
01:18:59.540 Well, we have some really good news to share with you here in just a second.
01:19:04.840 We have Tim Ballard, the founder and CEO of Operation Underground Railroad and the chairman
01:19:12.560 of the Nazarene Fund.
01:19:14.720 And Jessica Mass, she is the director of aftercare for Operation Underground Railroad, and we want to welcome you guys.
01:19:20.360 And we have some exciting stuff to talk about here that happened last week in a very ironic
01:19:25.460 sort of way.
01:19:26.300 But I want to start with the Turpin family in California.
01:19:33.520 How prevalent is that?
01:19:36.020 I mean, people were living right on top of this house.
01:19:40.020 They were just feet away.
01:19:42.740 Nobody knew.
01:19:44.280 And it had been going on for a long time.
01:19:46.360 Thirteen children in this house chained and living in squalor.
01:19:52.480 Is this common?
01:19:54.300 You know, it's more common than I think people want to believe.
01:19:58.100 You know, the whole thing of human bondage and human captivity, people want to, in this
01:20:02.160 country, want to put it far, far away from us.
01:20:04.420 There's hundreds of thousands of people, children, in the United States that are in captivity in
01:20:10.200 one form or another.
01:20:11.160 A lot of times it's sexual captivity.
01:20:13.880 And it's just, it's an eye-opener for everybody to look around.
01:20:17.520 I mean, this neighborhood, you saw the houses.
01:20:19.500 They were right next to each other.
01:20:21.100 Yeah.
01:20:21.500 And it was a nice neighborhood.
01:20:22.520 And it wasn't even a huge house and 13 kids are chained to beds.
01:20:26.260 And I mean, it's just, it's just insane, but it's not something that's shocking to me.
01:20:30.660 Any, any, have you ever seen anything like that, Tim?
01:20:34.340 Because this is what you've done for your whole life.
01:20:37.580 Yeah, I have.
01:20:38.940 Yeah.
01:20:39.120 We've seen things like that throughout our careers.
01:20:42.120 Absolutely.
01:20:42.560 What is usually the motivation?
01:20:44.440 You know, it's usually tied to some kind of sex crime.
01:20:49.180 It's usually child pornography being made, you know, people coming over.
01:20:53.300 We're still looking to see what in the world the intent was. No matter what, it's going to be a crazy intent, but what were they getting out of this?
01:21:02.320 I don't know yet.
01:21:03.300 So Jess, what you do is try to heal people like this.
01:21:09.560 29 years old.
01:21:11.180 They thought that they were young teenagers because they were so emaciated.
01:21:16.940 Does that 29 year old have any chance of life?
01:21:20.100 Well, my belief and based on my experience is that there's always hope.
01:21:25.280 I have seen people heal through things that, that they've said there's no hope for, and
01:21:30.520 I've seen people overcome things.
01:21:32.780 So in my personal opinion, I believe that there's always hope and the journey is really hard
01:21:38.340 and it's long and painful, but I, I hope that there are people that will come around this
01:21:45.760 29 year old and help them and, and really walk through that journey with them.
01:21:51.640 Boy, it's going to be a long road for all of those, all of those kids, all of those kids.
01:21:56.960 Okay.
01:21:57.400 So let's talk a little bit about why you guys are here.
01:22:00.520 Um, I've been so excited for this.
01:22:02.720 We've talked about it for a long time, but haven't been able to reveal it.
01:22:05.960 You did a mission in Haiti, as we know, a crap hole of a country. How long ago was the first one? It was Super Bowl Sunday of last year.
01:22:19.500 Okay.
01:22:20.200 And the problem is that the system can be corrupt.
01:22:25.040 And in a lot of places in Haiti, it is corrupt.
01:22:28.040 So what happened to the bad guys?
01:22:30.260 So in this operation, beautifully executed, everything was spot on.
01:22:33.600 And, uh, you know, we were warned, don't work in Haiti and other NGOs don't work in
01:22:38.240 Haiti because of this system.
01:22:39.300 But you know, guess no, Marty, the father of the boy who was kidnapped, which in Haiti,
01:22:43.360 which is why we were there.
01:22:44.480 And I said, guess no, I don't know if I can attach our name to, we don't know what's
01:22:48.280 going to happen.
01:22:48.740 There's going to be a corrupt judge potentially.
01:22:50.320 And he says, Tim, if you give up on this operation, you've given up on my son because
01:22:55.060 you've given up on Haiti.
01:22:56.060 And I said, you're right.
01:22:57.260 So we went forward, and Glenn, within 10 days, within 10 days, these traffickers were out, particularly this one, the kingpin; her name was Francois.
01:23:05.500 She holds kids in stables in the darkest parts of this country.
01:23:11.320 She brought a bunch of these young kids with her.
01:23:14.820 They were making child pornography, porn that was sent into the United States.
01:23:18.440 We got the U.S. authorities involved.
01:23:20.180 And they were let go there.
01:23:22.060 They found the right judges.
01:23:23.420 They paid their way out; over $80,000 they paid, and they got out.
01:23:27.600 And that phone call broke us.
01:23:29.400 I mean, Jessica and I were talking in the minutes after we found out, and we were in tears.
01:23:32.780 I mean, we were just paralyzed.
01:23:35.740 We couldn't believe it.
01:23:36.900 So then what happened next?
01:23:38.540 Well, we had a contingency plan, which was dangerous.
01:23:42.020 I didn't want to do it.
01:23:43.240 And basically the contingency plan was to go into the belly of Port-au-Prince and scream what happened.
01:23:51.100 Tell the media what happened, what really happened.
01:23:54.020 And the Rotary Club of Port-au-Prince supported us, and they said, you need to bring someone of some kind of celebrity status in Haiti or the media is not going to pay attention.
01:24:02.780 And they said, we have an idea for you.
01:24:05.100 Can you get ahold of this U.S. Congresswoman of Haitian descent named Mia Love?
01:24:11.160 Yeah.
01:24:11.460 Have you heard of her, Tim?
01:24:13.100 And I said, are you kidding me?
01:24:14.740 Not only is she my Congresswoman who lives like down the street, she's a good friend.
01:24:18.840 I couldn't believe it; it was a miracle.
01:24:20.380 I called Mia in.
01:24:21.400 I said, look, this is not necessarily a safe approach.
01:24:26.680 And she didn't even bat an eye.
01:24:28.620 She said, are you kidding me?
01:24:29.500 Like, when are we going? Let's go.
01:24:32.500 And, uh, we went down to Port-au-Prince.
01:24:34.340 We rented out this conference room.
01:24:36.900 People came from all over.
01:24:39.160 Uh, the media was there and this was the speech we gave.
01:24:42.060 We reminded the Haitians of their history.
01:24:44.460 In 1791, the Haitians did something that no country then or since has, has done.
01:24:49.740 And that is, it was a slave nation that rose up and pushed its European oppressors out of
01:24:56.780 the country, took the island by force, created a republic and abolished slavery.
01:25:02.740 Uh, the first nation to abolish slavery like this.
01:25:05.640 And what was interesting about this was the American abolitionists were watching this.
01:25:10.700 And when the United States finally eradicated or abolished slavery, at least legally, Frederick
01:25:16.600 Douglas, the great abolitionist stood up and gave a speech.
01:25:19.160 And he said, let us not forget the sons and daughters of Haiti, the true pioneer abolitionists
01:25:24.860 of the 19th century.
01:25:26.160 They inspired this whole movement.
01:25:27.880 And so this is the message that Mia and I gave, and people were just rising up.
01:25:33.540 I mean, they were rising up, and then we said, look, you rose up out of slavery the first time, do it again.
01:25:39.860 And letting out nine horrible traffickers who are sexually abusing children, that's going in the wrong direction right now.
01:25:48.880 Help us.
01:25:49.540 Let's do this.
01:25:50.760 Um, they rose up.
01:25:51.880 The media went nuts.
01:25:52.840 We got an invitation from the president, and this was another miracle: the president of Haiti, Jovenel Moise, was elected just days after the Super Bowl operation.
01:26:04.140 So he was able to say, I wasn't part of that.
01:26:06.380 And he wasn't.
01:26:07.380 And so it was easy for him to invite us in.
01:26:10.180 I couldn't believe it.
01:26:10.840 We're going into the presidential palace, Mia Love and I, and I just kind of sat back.
01:26:14.320 Mia speaks fluent Creole, you know, and I just watched her work her magic.
01:26:18.660 And she got right up in his face, you know, and just said, look, here's the evidence.
01:26:23.040 Here's what's going on.
01:26:24.520 He says, I'm not going to stand for this.
01:26:26.300 And I felt it when he said it. You know, you can look in someone's eyes and know if they mean it.
01:26:31.380 He said, I can't believe this. We'll put an end to this.
01:26:33.700 Um, within a couple of months, we knew the investigation was going.
01:26:37.260 We were supporting it in a lot of ways.
01:26:38.680 We had full-time people down there supporting, watching, and then the news broke.
01:26:43.500 Six judges ripped from the bench.
01:26:45.140 They found the judges who took the bribes.
01:26:47.500 Now this is unprecedented.
01:26:48.840 I mean, in Haiti, in Haiti, this is a big deal.
01:26:51.960 And the attorney general, his name was Aknam, he was instrumental in doing this.
01:26:58.880 I mean, this guy, a lot of his team live in Miami.
01:27:04.380 They can't even live in Haiti for fear that their children or their families will be killed, you know, because they are true corruption fighters, and they found those judges and ripped them from the bench.
01:27:14.660 And then they called us and said, let's go.
01:27:16.940 Will you come help us?
01:27:18.020 You guys have more intelligence than we do on the whereabouts of these traffickers.
01:27:21.520 Cause you found them all.
01:27:22.660 It was our team that went undercover and found them.
01:27:24.500 Let's go round them up.
01:27:25.940 So I know. I was with you in Haiti a year ago, maybe, and I've never seen anything like it.
01:27:34.840 And the corruption from the United States, quite honestly, from some of the big charitable organizations and the UN, saying, you know, we're going to help them.
01:27:44.760 No, the money is not going to the people and where it needs to be.
01:27:49.560 And so we've had these conversations before.
01:27:52.280 And so when you told me what was happening with the president and that he was serious now,
01:27:57.020 and he was going to take these people out, it was great news.
01:28:01.720 One of the reasons why I took what the president said last week so personally is because my wife and I had just gotten up off of our knees to pray for
01:28:10.600 you guys, because we knew you were doing this operation under the protection of the president
01:28:18.100 of Haiti.
01:28:22.560 Were you, were you already in the operation when you found out what happened?
01:28:28.040 It was the most awkward moment of any operation I've ever been on.
01:28:31.420 And I've been on a lot over my 16-year career. We were sitting in a kind of private restaurant.
01:28:40.560 The attorney general of the country is sitting across the table from me.
01:28:43.860 The chief of police is sitting next to him, and then my operators, and we're sitting around talking about this operation.
01:28:53.880 We literally have a recon team out on this woman, Francois, and we'll tell you about that in a second, what happened, how we found her.
01:28:59.540 Once we took her down, she's the kingpin, right?
01:29:02.740 And our recon team's looking for her and we're just on pins and needles.
01:29:05.300 When are we going to get the call?
01:29:06.280 We're all geared up and ready to go as soon as our team says, there she is.
01:29:10.180 And there are television screens all around.
01:29:11.960 This is like T minus 40 minutes.
01:29:14.080 Oh, geez.
01:29:15.460 And boom, I see this and I see it above their heads and I'm just reading this.
01:29:19.600 I'm like, no, it says Haiti.
01:29:21.660 No, don't turn around.
01:29:22.780 Don't turn around.
01:29:23.760 Don't turn around.
01:29:25.280 And again, I wasn't in the room and I've heard different people say different things.
01:29:28.300 Yeah.
01:29:28.420 So I don't know.
01:29:29.440 I'm just telling you my situation.
01:29:30.900 Yeah.
01:29:31.120 In this moment, this is the message coming from America to Haiti.
01:29:35.120 And I'm just, you know... And they turned around, and they looked at it, and they looked at me, and I didn't know what to say. I just was like, so, so, so how are things?
01:29:49.220 What, uh, how did you smooth it over?
01:29:52.920 How I smoothed it over was I just told them, my wife and I, as you know, are adopting two children from Haiti that we actually rescued in an earlier operation.
01:30:05.200 And these are our kids.
01:30:06.520 I mean, we love them like our own kids.
01:30:08.540 And, you know, I just said, you know how much we love your country.
01:30:15.640 I mean, my children are from your country, my children.
01:30:19.820 And that ended it.
01:30:23.460 Have you talked to the president since?
01:30:25.000 Not this president, but the Haitian president?
01:30:27.280 I have not. We're going down in a couple of weeks to meet.
01:30:30.580 He called in, though.
01:30:31.320 He called into the attorney general just an hour before, wishing us well and saying his prayers were with us.
01:30:37.900 I mean, that's also unprecedented, by the way, to have the president of a nation calling in, telling us, I'm backing you guys.
01:30:43.420 Get these guys; get them out of our country.
01:30:46.000 Yeah.
01:30:46.340 Yeah.
01:30:46.760 Okay.
01:30:47.060 We're going to come back in a minute and, uh, he's going to tell you exactly what happened.
01:30:51.540 And then tonight at five o'clock, we have video footage of this operation and what it
01:30:57.180 was like on the ground and a lot of stuff that you're going to want to see.
01:31:00.140 You can check that out tonight, five o'clock on the blaze TV back in a second with Tim
01:31:06.640 Ballard, uh, founder and CEO operation underground railroad and the chairman of the
01:31:12.200 Nazarene fund.
01:31:17.060 So if you're setting new goals for your business, it is really difficult to reach them without
01:31:21.500 the right people on your team.
01:31:22.940 And that's where ZipRecruiter comes in.
01:31:24.840 They have transformed the way that you can find really good people.
01:31:28.680 ZipRecruiter will not only post your job to over a hundred of the leading job boards with one single click.
01:31:36.620 It's a smart program, and it goes out and looks for the most qualified candidates and invites them to apply.
01:31:42.580 Hey, I don't know if you've seen this.
01:31:43.700 This is a really great job.
01:31:44.920 You should apply.
01:31:46.180 And that is why 80% of all of the people that use ZipRecruiter get a qualified candidate through the site in the first day.
01:31:53.780 ZipRecruiter, the smartest way to hire.
01:31:56.060 Find out today why ZipRecruiter has been used by businesses of all sizes and industries to find the most qualified job candidates with immediate results.
01:32:05.720 ZipRecruiter, try it for free.
01:32:07.460 ZipRecruiter.com slash Beck.
01:32:09.440 That's ZipRecruiter.com slash Beck.
01:32:13.980 Glenn Beck, Mercury.
01:32:26.280 Glenn Beck.
01:32:29.100 Operation Underground Railroad, ourrescue.org.
01:32:34.560 And the chairman of the Nazarene Fund, nazarenefund.org, is Tim Ballard.
01:32:39.020 Um, he is here to tell us about the operation that, uh, happened in Haiti just last week.
01:32:44.560 The day the president was, uh, saying what he said about Haiti.
01:32:48.720 Um, it was a really important moment because the president of Haiti was working with, uh,
01:32:55.060 Operation Underground Railroad to get some really bad guys.
01:32:59.100 So it wasn't a crap hole.
01:33:00.660 Uh, what happened?
01:33:02.480 So you, you're in that meeting, you get the go.
01:33:05.160 Tell me what.
01:33:06.220 So we get the go on this, on Francois.
01:33:08.440 I mean, France, she is, if she's the kingpin, they want her.
01:33:11.540 She, not only because she has the most kids that she's selling, but because she had paid
01:33:16.120 $80,000, uh, to certain government officials that they didn't know who they were.
01:33:20.540 And these guys want to clean up their country.
01:33:22.260 The president, the attorney general, they wanted her to find out who did you give that money
01:33:26.320 to that got all your minions out of jail.
01:33:28.600 So they really wanted her.
01:33:30.240 Do you think she's going to tell?
01:33:32.340 I mean, that's a really dangerous country.
01:33:34.500 Well, let me tell you what she said when we got her.
01:33:36.360 Glenn, I've never seen this.
01:33:37.400 She said, evil will protect me.
01:33:40.420 I'll never talk.
01:33:41.360 Evil will protect me.
01:33:42.540 Wow.
01:33:42.940 I mean, there are a lot of things on this op I had never seen before, where she's just like, I'm hand in hand with darkness.
01:33:51.820 She actually said the words?
01:33:53.160 She actually said that.
01:33:55.140 Wow.
01:33:55.500 Yeah.
01:33:57.980 Bizarre.
01:33:58.760 But this woman, she owns a street, essentially.
01:34:02.300 She sits on one side of the street, all dressed beautifully, you know, with her wig and all this stuff.
01:34:08.080 And these johns come, and she has all these kids in these stables across the street.
01:34:12.040 I call them stables.
01:34:13.360 I mean, they're these steel doors that close and lock. They have metal beds that fold down, literally, where a kid sits, and then a bedroom in the back.
01:34:23.940 We had been doing surveillance on, on Francois for about three weeks in advance.
01:34:28.840 And we never saw a single child out.
01:34:32.200 It was, it was adults out that she was running.
01:34:34.740 The day we rolled in, the minute and the second we roll in, we identify her and she's swarmed.
01:34:42.840 Well, something happened. There's a videographer we have named Justice.
01:34:46.820 I think you've met Justice, inspired dude. At the last minute, as we're driving to the street, to this brothel street, he jumps to the back seat, and in the van there are three SWAT guys, Haitian SWAT team members who are going to come out of the back of the van.
01:35:01.380 He jumps back there and starts wrapping duct tape around this guy's helmet, putting on a GoPro camera.
01:35:07.820 I'm like, Justice, what are you doing?
01:35:08.760 He's like, I don't know.
01:35:09.820 I just, this guy needs a camera.
01:35:11.360 Wow.
01:35:11.720 And you say it's divine providence.
01:35:15.660 We'll find out why, when we come back.
01:35:29.720 Glenn Beck, Mercury.
01:35:41.720 This is the Glenn Beck program.
01:35:45.380 We all talk about, oh, if, if I would have lived back in the day, I would have been an
01:35:50.440 abolitionist.
01:35:51.240 Really?
01:35:51.820 Would you?
01:35:53.160 Because there are more people being held as slaves today than in the entire 400-year history of the slave trade.
01:36:01.280 And yet, are we involved?
01:36:04.560 Are we doing things?
01:36:06.340 Are you an abolitionist?
01:36:07.620 I invite you to be an abolitionist by going to ourrescue.org.
01:36:13.680 Abraham Lincoln is on the $5 bill.
01:36:16.220 Once a month, just give five bucks and become an abolitionist.
01:36:20.300 If you can afford it, give more.
01:36:22.340 Ourrescue.org and the nazarenefund.org, which is also rescuing the slaves in the Middle East.
01:36:31.360 Tim Ballard is the founder and CEO of Operation Underground Railroad and the chairman of the
01:36:35.820 Nazarene Fund, just returning from Haiti, where there is a big shakeup happening, trying to
01:36:42.500 make their government work and get away from evil.
01:36:45.560 You just rescued a whole bunch of children and caught probably the kingpin.
01:36:52.680 The kingpin.
01:36:53.760 Yes.
01:36:54.300 The kingpin that we had arrested back in February of last year, and she bought her way out of jail.
01:36:59.580 We got the right people, the new president of Haiti.
01:37:01.800 And here we are going in to get her again two nights ago.
01:37:04.220 So you have, your photographer is in the back of one of your vehicles.
01:37:08.040 My videographer jumps in the back, and I'm like, Justice, what are you doing?
01:37:13.180 We're about to hit the brothel.
01:37:15.140 And he's like, I've got to get a camera on this guy's helmet.
01:37:16.820 So he's wrapping duct tape around it, this kind of awkward scene, and I'm laughing. I'm like, okay.
01:37:21.340 And the guy's like, all right, that's fine.
01:37:21.340 So we get to the spot.
01:37:22.940 The team gets out of the van.
01:37:24.180 There are like three vans, and guys jump out and get around the target and take her down.
01:37:28.180 She's in the middle of selling one of the girls while we do it, to a guy who has 1,200 condoms; he holds the bag for her.
01:37:36.680 But these two guys, one with the, with the camera on his helmet gets out of the van and
01:37:41.340 he's supposed to come with us, but I see him take off the other way.
01:37:45.220 And what had happened was one of the girls, one of the victims got scared, didn't know
01:37:49.120 what was going on and ran into what I can call a stable.
01:37:52.440 I mean, it's just, it's a stable door.
01:37:53.800 It's a steel door.
01:37:54.560 And she runs into it.
01:37:55.720 He follows her.
01:37:56.480 He says, something's not right.
01:37:57.980 He gets in there and Glenn, in my 16 years, I've never seen this.
01:38:01.000 And I've seen the camera footage of what he captured.
01:38:02.980 I've never seen this.
01:38:04.880 A girl, probably 13 years old.
01:38:07.480 And I would ask people, this is a tough thing to ask people, but who's 13 in your life?
01:38:11.660 You have a daughter.
01:38:12.820 Or I have a daughter who's 13.
01:38:14.380 These are babies.
01:38:15.720 Okay.
01:38:17.360 He goes in there and catches in the act a man who's raping her in the back bedroom and
01:38:26.760 pulls him off and goes to her rescue.
01:38:32.180 And we caught it all on camera, which will allow us to prosecute the heck out of this guy.
01:38:38.860 I've never seen that.
01:38:40.100 I mean, we've raided so many situations like this.
01:38:41.820 Never in the act.
01:38:43.600 And we had not seen kids, by the way, in the four weeks leading up to, you know, tracking
01:38:48.040 this woman because she keeps them so hidden.
01:38:50.900 It was providential.
01:38:52.000 And the fact that Justice, my videographer-
01:38:53.940 At the last minute.
01:38:54.820 Last minute, puts a camera on this guy's helmet for no apparent reason.
01:38:59.040 It was just kind of a strange act.
01:39:01.300 So you get the madam or the kingpin here of the Haitian slave trade.
01:39:08.480 Is she the kingpin, or one of them?
01:39:12.040 She is, according to the Attorney General of Haiti, she is the number one child trafficker
01:39:16.460 that they wanted.
01:39:17.180 Wow.
01:39:17.660 The number one.
01:39:18.140 So you get her and she says to you, evil will protect me.
01:39:24.460 Yes.
01:39:24.720 That's a quote.
01:39:25.700 That's it.
01:39:26.040 That's it.
01:39:26.520 It was translated to me, but that's the translation that I got.
01:39:30.260 I don't think people understand, knowing the history of Haiti: at the same time we made a pact with God, the same year, 1791, they made a pact with Satan, a literal pact with Satan.
01:39:51.140 And I think that's one of the reasons why that country has so many problems.
01:39:56.200 They've never really broken that.
01:39:59.280 I think a lot of people might dismiss it now, but I think it's, it's real there.
01:40:04.440 Evil is real there.
01:40:05.960 And people like her to say evil will protect me.
01:40:10.920 That's what she's talking about.
01:40:13.900 So how do you, how do you heal a country like that, Tim?
01:40:18.060 You know, what you do is you find those of light, and they're there.
01:40:24.560 And it was that very history you're talking about where we brought that to light and said, look, even in that darkness, because that is part of their history.
01:40:33.240 Anyone will tell you, yeah, there are elements that did that, but there was light that came with that too, that fought it from the beginning.
01:40:40.240 And it's about finding those people, people like Congresswoman Mia Love, who came down, and Attorney General Sean Reyes, who also accompanied us and helped us to bring the light to the country, through people like President Moise and the Attorney General Alknom, and then all the aftercare partners we have.
01:40:59.560 I mean, you've been to our safe house.
01:41:02.840 Jessica is, you know, the director of aftercare, and you're one of the most tender-hearted people I know.
01:41:09.880 And I just love watching you, because you are tender and yet you are a pit bull in those situations, because you know where the danger is.
01:41:20.680 How many people did you get out?
01:41:23.520 How many?
01:41:23.920 There's four.
01:41:25.300 Four children.
01:41:26.160 Yeah.
01:41:26.360 And, and, uh, the one big boss.
01:41:29.660 Yeah.
01:41:29.960 Well, four, and then three of her minions went down.
01:41:32.740 We nailed them right at the same time.
01:41:35.020 And they are going to, the president is all over this one.
01:41:38.120 Oh yes.
01:41:38.800 Yeah.
01:41:40.020 So, the situation with the children, you've been there so many times.
01:41:45.300 Tell me about what happened on the anniversary of the earthquake and how that tied in.
01:41:51.160 Do you know what I'm talking about?
01:41:54.020 Tim?
01:41:54.640 Oh yeah.
01:41:55.140 Sorry.
01:41:55.500 Yeah.
01:41:55.920 Yeah.
01:41:56.880 Yeah.
01:41:57.280 So with the four girls, two of them, their parents were killed during the earthquake.
01:42:03.800 And while we were there, when they were rescued, was the anniversary of the earthquake, eight years later.
01:42:11.240 And I was sitting with.
01:42:12.500 So they've been in that situation for eight years.
01:42:15.240 They were kidnapped because their parents were killed in the earthquake.
01:42:18.060 Just like my kids that I'm adopting.
01:42:19.560 The same thing happened.
01:42:20.340 It happened to thousands, tens of thousands of kids.
01:42:23.200 People don't understand that the earthquake, we think, oh, well, the Clinton Foundation
01:42:27.260 came down and built some roads.
01:42:28.480 No.
01:42:28.720 Well, what people didn't look at is all of the parents that were dead, and the kids who were kidnapped, and they're still slaves.
01:42:35.220 So they've been slaves for eight years.
01:42:36.960 The devastation that comes when someone becomes an orphan overnight and the vulnerabilities that
01:42:42.660 are there are ongoing.
01:42:45.580 And yes, these kids have been in this situation for eight years.
01:42:50.760 And so when you're sitting with a kid that has been through that, their parents killed, they've
01:42:58.000 been trafficked.
01:42:59.420 And she's looking into my eyes and she says, this is the first time I've ever felt like there's hope.
01:43:08.280 And tears start to roll down her face.
01:43:11.200 And she's like, I finally feel like there might be hope that I'll have something for my life.
01:43:17.200 Eight years later, there's the pain and the beauty that go hand in hand in the story because
01:43:24.320 she's been through this for so long.
01:43:26.420 But even this girl feels the hope in that moment.
01:43:33.300 We started the hour talking about what happened in California, what they found out from those
01:43:37.380 monsters of parents that chained their kids to the beds.
01:43:43.660 And I know because Jessica and I have talked privately about things that she has seen here
01:43:49.260 in the United States.
01:43:50.740 This happens everywhere.
01:43:54.280 This is not a Haitian problem or a Middle Eastern problem or an American problem.
01:43:58.900 It's a human problem.
01:44:00.820 And there is something inside of man that when it goes dark, it goes really dark.
01:44:07.360 And that's what the Nazarene Fund and Operation Underground Railroad, O.U.R., are really all about: rescuing, going into the darkest of dark places.
01:44:20.860 And I don't know how you guys do it.
01:44:22.880 I don't, Tim.
01:44:23.540 I don't know how you are as full of light as you are after all of the things that you
01:44:28.420 have seen.
01:44:28.860 But you're a miracle, and we thank you. Tonight at five o'clock, we're going to have the footage.
01:44:36.080 So you'll see some of the things that we're talking about.
01:44:39.100 And if we have time, and I don't know if we'll have time tonight, we were talking off the air about what's happening in Sweden right now, where Iran is going to be welcoming home one of their own, somebody who is a refugee who left Iran because she saw a woman being
01:45:00.380 She was a journalist over in Iran.
01:45:02.220 She left, she went to Sweden.
01:45:03.800 She got a work visa.
01:45:05.140 She was a refugee.
01:45:06.620 They accepted her.
01:45:07.780 Then she announced she felt comfortable enough to say, I'm not a Muslim.
01:45:12.740 I am a Christian.
01:45:14.140 And the Swedes turned on her, have revoked her visa, and are threatening to send her back.
01:45:20.800 I don't know what's happening in the world.
01:45:22.220 I mean, I just, it is like evil is protecting its own right now.
01:45:27.920 And we need your help.
01:45:31.260 So if you can help us, please become an abolitionist.
01:45:35.520 And you can do that by going to ourrescue.org. That's ourrescue.org. More tonight at five o'clock.
01:45:43.660 Thanks guys for coming in.
01:45:44.440 Thank you.
01:45:45.040 Researchers found two serious security flaws in chips used in every PC, server, smartphone,
01:46:04.860 tablet, anything that's been produced,
01:46:07.320 if it has an Intel chip in it.
01:46:10.500 Yeah.
01:46:11.480 Yeah.
01:46:11.760 It's got a backdoor.
01:46:13.060 Who would have thunk it?
01:46:14.840 Hackers can potentially make use of these flaws to steal data stored in memory, including
01:46:19.600 your passwords and your files.
01:46:21.420 This is the biggest flaw that we have found, the biggest backdoor, because it is in almost
01:46:28.860 literally everything.
01:46:30.080 And there's nothing you can do about it right now.
01:46:33.980 Operating system providers, they have released security patches.
01:46:39.180 Okay.
01:46:40.880 Is that really doing the job?
01:46:43.420 One in four people have already experienced identity theft.
01:46:46.500 And I tell you, that number is going to go up, and it will happen to you.
01:46:50.580 It will happen to your children.
01:46:52.520 Thieves can sell your information on the dark web or get an online payday loan in your name.
01:46:56.760 And LifeLock works to detect those wide-ranging identity threats.
01:47:02.660 If you have a problem, a U.S.-based restoration specialist is going to work to fix it.
01:47:07.480 Nobody can prevent all identity theft or monitor all transactions at all businesses.
01:47:10.980 But LifeLock is the best.
01:47:13.220 Join now and get 10% off with promo code BECK. Call 1-800-LIFELOCK or go to LifeLock.com.
01:47:18.660 Use the promo code BECK.
01:47:19.880 That's B-E-C-K, and save 10% now at LifeLock.com.
01:47:24.360 Glenn Beck, Mercury.
01:47:43.700 Glenn Beck.
01:47:45.820 You know what I love about living in this time period? Especially if you saw The Post.
01:47:52.080 Have you seen the movie The Post yet?
01:47:53.140 I have not seen that yet.
01:47:55.240 It's really good, worth seeing.
01:47:56.900 But you'll see in that, when they release the Pentagon Papers and The New York
01:48:00.620 Times is shut down, and they say you can't release any more,
01:48:04.660 there's no place to go.
01:48:06.260 If The Washington Post doesn't release them, there's no place to go.
01:48:09.760 Government wins.
01:48:10.840 You can't release the papers.
01:48:12.480 It seems so odd that that news couldn't come out, but that's the way it was.
01:48:21.040 The Monica Lewinsky stuff.
01:48:22.740 If it wasn't for Matt Drudge,
01:48:24.600 we may not have known. The internet changed everything.
01:48:28.480 Back in the eighties, I remember trickle-down economics, and it was always lampooned. You
01:48:34.400 could make a case, but you could never make a media case, because you
01:48:39.860 didn't have control of the media.
01:48:41.860 Going around the internet now, from the Washington Free Beacon:
01:48:45.780 here's what people said about trickle-down economics and the president's tax plan,
01:48:50.780 and what actually happened once it was passed.
01:48:54.720 It feels like you're relying on this tax cut of the corporations, of the wealthy, to trickle
01:49:00.100 down. Southwest and American Airlines both announcing they're going to give thousand-dollar bonuses
01:49:04.020 to employees following the tax overhaul.
01:49:06.320 Wage increases don't follow tax cuts like this.
01:49:08.840 The world's largest retailer giving its U.S. employees a bonus, a wage increase, and expanded
01:49:14.740 maternity and parental leave.
01:49:16.920 So you're creating a huge tax cut and you might not get wage growth.
01:49:21.840 Capital One Financial, which just confirmed to CNBC that they will raise the minimum wage
01:49:27.180 for all U.S.-based employees at Capital One to $15 per hour.
01:49:31.500 And anybody who thinks that this corporate tax cut is going to trickle down to lift wages
01:49:37.880 has a staggering ignorance of how public companies function.
01:49:41.780 Wells Fargo said it would raise its minimum wage to $15 per hour.
01:49:45.420 I mean, it's amazing.
01:49:47.340 I love that.
01:49:47.800 He's just so sure of himself on that last one.
01:49:50.260 Well, because you know what?
01:49:51.140 They could be sure of themselves, because there wasn't anybody who would have given that information
01:49:57.420 back in the eighties.
01:49:58.380 If CBS, NBC, or ABC didn't make those stories about what the companies were
01:50:06.420 actually doing, it didn't happen.
01:50:07.620 I mean, it was like it didn't happen.
01:50:10.620 Now you have enough outlets and you have control of the media yourself to where you can grab
01:50:17.780 those snippets.
01:50:18.640 You can edit those things and you can show, no, this is exactly what happens.
01:50:23.460 They had no fear. The liberals, with trickle-down economics, had no fear of this turning
01:50:30.860 around on them, because it never has.
01:50:33.420 Hmm.
01:50:34.240 But now we have the internet.
01:50:35.520 It's interesting too, to see these companies take these stands.
01:50:39.440 Normally companies, even companies that lean right,
01:50:43.100 don't want to take stands that associate themselves with Republicans publicly.
01:50:47.500 But this is such a clear win for companies. And, you know, companies really do this.
01:50:52.760 Yeah.
01:50:53.040 Companies really do this.
01:50:53.940 I mean, I think they want their employees happy.
01:50:56.260 There might be selfish reasons for it, right?
01:50:57.820 They want their employees happy.
01:51:00.880 They like the PR of saying, Hey, we got a bunch of extra money and you know, we're going
01:51:05.140 to distribute that to the people who work for us.
01:51:08.620 There are some selfish reasons for it, but who cares?
01:51:11.800 I mean, it's great.
01:51:12.500 It's great that people, you know, are able to make plans, long-term plans.
01:51:16.900 Now these are all permanent changes, I mean, as permanent as they get,
01:51:22.360 you know, with lawmaking, but permanent changes for corporations, and they are able
01:51:28.680 to really plan for their company's long-term well-being and their employees'
01:51:34.320 well-being.
01:51:34.860 And this is a big change.
01:51:35.980 I mean, it's not the boldest tax plan we've ever seen.
01:51:39.700 Imagine what would have happened had they done a flat tax rate.
01:51:46.000 Oh my gosh.
01:51:47.000 If they would have done a flat tax rate, can you imagine the money that would have poured
01:51:51.920 into the average family's home?
01:51:55.260 Oh my gosh.
01:51:56.060 Yeah.
01:51:56.200 I mean, 'cause this was really more of a corporate plan, right?
01:51:59.540 It was not particularly life-changing when it comes to the individual side.
01:52:04.880 I'll take anything, right?
01:52:06.080 I'll take any dollar amount you want to give me of my own money.
01:52:09.360 I'll willingly accept it and act like you're doing me a favor.
01:52:12.600 You know, on the corporation side, it actually is a really big difference.
01:52:15.960 I mean, they don't have to make these big changes.
01:52:19.200 There were so many people who said it wasn't going to be a big deal because their
01:52:21.680 effective rate was already this low.
01:52:22.760 Anyway, it's shown to be a big change in these companies.
01:52:25.600 A big deal.
01:52:34.800 Glenn Beck, Mercury.
01:52:37.160 Mercury.