1/16/18 - 'Reclusive, Abusive and Bankrupt' (William Hertling & Tim Ballard from O.U.R. join Glenn)
Episode Stats
Length
1 hour and 52 minutes
Words per Minute
162.05489
Summary
David and Louise Turpin were arrested and charged with torture and child endangerment after a 911 call from one of their daughters led police to find their 13 children chained to their beds.
Transcript
00:00:17.280
You know, we have to start comparing ourselves to ourselves yesterday.
00:00:24.360
And stop comparing ourselves to the things that we see on television or on Facebook.
00:00:34.800
David and Louise loved each other. Oh, and they loved Elvis.
00:00:49.560
They had a tradition of going to the Elvis Chapel in Las Vegas.
00:00:55.300
David, Louise, and their 13 children, oh, and the king, all look very happy in these photos.
00:01:11.220
Sometimes you're not getting the whole picture.
00:01:16.420
Today, you're looking at the Turpins' images in a whole different light.
00:01:30.380
If you don't know who the Turpins are, let me tell you.
00:01:34.800
Sunday, authorities responded to a 911 call from one of David and Louise's daughters.
00:01:41.220
The daughter frantically explained on this stolen phone that she had escaped the family home
00:01:48.580
and begged the police to come and rescue her and her siblings who were starving and chained to their beds.
00:01:58.820
When police actually saw her, they didn't believe her.
00:02:05.440
They found the girl's 12 brothers and sisters chained, padlocked to their beds.
00:02:18.020
The children, they thought, were all under the age of 15.
00:02:28.700
They were so famished that the deputies were shocked.
00:02:41.720
David and Louise were unable to immediately provide a logical reason
00:02:45.680
as to why their children might have been, you know, chained to their beds.
00:02:49.980
I'm trying to think, as I read that this morning, I thought,
00:02:52.760
what would the logical reason be to chain your children to the bed?
00:03:03.040
They're in the care of Child Protective Services.
00:03:08.500
They were booked on charges of torture and child endangerment.
00:03:17.480
We're at a time when we have so much information about almost everything.
00:03:25.040
In the age of social media, it is getting more and more difficult
00:03:35.160
Because we all have the ability to edit and manipulate our own narratives.
00:03:40.220
You know, I don't know if you saw the movie about P.T. Barnum,
00:03:46.000
But I left that movie and I thought, I want to read about P.T. Barnum.
00:03:49.800
So I went, and there's just not a lot of good books about him.
00:03:54.720
The one that gets the highest ratings, the one that is supposedly the best, is the one he wrote himself.
00:04:03.600
How could I possibly believe anything that P.T. Barnum said about himself?
00:04:10.260
He was trying to give an image of himself that wasn't necessarily reality.
00:04:23.580
Because of Facebook, the Turpins' friends and family had no idea that they were hideous monsters.
00:04:45.500
We'll continue to be shocked by the dichotomy of how people present themselves and who they really are.
00:04:52.660
How many people are we finding out are monsters?
00:04:57.820
The Turpin family provides us with a horrible, cautionary tale today.
00:05:07.300
Something that we all need to understand as we move forward.
00:05:11.940
And it is just really simple, but somehow or another, at times, it controls our life.
00:05:20.280
It even controls our own attitude and self-worth because we forget.
00:05:27.820
That when it comes to social media, you're not seeing the full picture.
00:05:34.140
And sometimes, you're not even anywhere close to the truth.
00:05:50.860
So I got up this morning, and I saw this Turpin story.
00:05:59.360
And I just, I, my first thought was, what the hell's wrong with us?
00:06:14.300
But this was, I mean, this was the way of the world, you know, in the Dark Ages
00:06:22.860
and when people were not living on top of each other.
00:06:31.020
We just hide it now, and hopefully we're getting better.
00:06:36.400
Yeah, I mean, I think, you know, hopefully it's not quite as common.
00:06:40.440
But, I mean, I guess these stories probably did happen.
00:06:44.120
It's weird that, because when the story first broke, you think to yourself, oh, yeah, that's
00:06:47.660
like one of those, like this story or that story or that story.
00:06:50.300
And you're like, why do I have other examples of this?
00:06:52.560
I shouldn't have any other examples to point to.
00:06:58.300
It's not something that is, it's not something that is new.
00:07:05.900
One of the reasons why he was deemed a saint is because of the miracle of the four boys.
00:07:11.960
I can't remember exactly what it's called, but there were these boys that went from their home.
00:07:17.880
They were supposed to go maybe, I think, to their uncle's house or something.
00:07:24.160
They stopped at this inn, and this guy who ran the inn said, oh, yeah, you guys can stay here.
00:07:38.720
He ground them up into sausage, and he was making meat pies for the people who came to the inn.
00:07:54.680
St. Nicholas was the bishop of the town, knew those children.
00:07:58.700
He went on a search for the kids, stopped at this inn, sensed that something was really
00:08:07.280
wrong, went down into the basement, saw the condition.
00:08:12.740
He was canonized because they say that he raised them.
00:08:15.740
He assembled them back together somehow or another and brought them back from the dead.
00:08:21.700
I don't know about that part of the story, but that's the kind of stuff that people have done.
00:08:33.380
You know, that's a weird, dark part of humanity, isn't it?
00:08:37.280
And I guess, you know, it's hard to understand how you could get to that position.
00:08:42.460
It's like, you know, the first person who ever drank milk.
00:08:45.000
You're like, what was the decision that led to that?
00:08:48.920
You know, who was like, hey, I'll suck off that cow thing.
00:08:51.820
I think clams or oysters or lobsters are like, what?
00:08:57.300
You know, somebody had to try to eat a spider at one point.
00:09:04.680
And those decisions, like, you make that first decision and it influences the rest of your
00:09:09.040
decisions and it doesn't catch on with humanity.
00:09:11.580
It seems like that's, you know, most of us are not enslaving people and that's really positive.
00:09:15.860
But, like, you start that, you know, that family at some point made a decision for that
00:09:21.460
first kid to be chained up in a room and that lasted for how long and how many more people
00:09:35.560
Maybe they're so psychotic that that's all they wanted kids for.
00:09:40.020
You know, we're going to find out a lot about this story.
00:09:42.560
Because there's all these pictures of them happy.
00:09:48.800
They were seen, you know, as early as a couple of weeks ago, one of the older ones driving
00:09:55.060
I mean, the fear of what would happen, I guess, to the rest of the family must have been incredible.
00:10:06.340
You hit on an interesting point in the monologues.
00:10:07.880
I think most people are just going to take this to, you know, how horrible the situation is.
00:10:13.720
But, I mean, we take Facebook pictures seriously for some reason.
00:10:21.940
It's the same way my wife, every time a new product comes out that she wants to buy,
00:10:27.780
she comes and shows me the testimonial page of their website.
00:10:30.700
Like, to me, there's no impact at all from a testimonial page from a website.
00:10:36.000
Because I know the company has an incentive.
00:10:40.180
Of course, they found the nine people who really like the product.
00:10:49.920
I look at Facebook and people are, you know, smiling photos.
00:10:52.200
And I think internally, for most people, you just be like, oh, wow, they must have a really great life.
00:11:02.840
And we all know that at some level, not to the level where you're chaining your children
00:11:08.880
to the beds, hopefully, but we all, at some point, we don't post the really sad pictures.
00:11:15.740
We don't post, we don't take pictures when people are mourning.
00:11:18.740
We don't take pictures when people are fighting and are having troubled times.
00:11:25.580
You are doing the same editing bubble stuff we complain about with the news.
00:11:30.200
Like, you're only exposing people to the happy points of your life.
00:11:34.800
And the fact that that could work on people to the level to separate a family from happiness
00:11:42.460
to kids chained up to their beds shows how powerful that is.
00:11:47.440
I mean, their own relatives weren't even questioning it.
00:11:50.440
Well, there's something wrong with the relatives, too.
00:11:52.840
I mean, the grandparents live in West Virginia.
00:11:56.440
They said they hadn't seen the kids in five years, hadn't seen the family in five years.
00:12:01.380
But they said, you know, the last couple of years, they've talked to the family lots of times.
00:12:12.860
I mean, did that strike you as weird that you have grandchildren from two to 29 and over
00:12:19.920
a two-year period, you've talked to people in the house, but never the children?
00:12:25.080
But I mean, family dynamics can be strange sometimes.
00:12:31.420
There's, you know, there's all sorts of stuff that happens in families where it's possible
00:12:35.340
You'd think, though, there'd be some indication of trouble.
00:12:37.080
And it's frightening as well, if you look at the aerial shots of this house, these houses
00:12:49.980
Neighbors said they didn't even know, some of them didn't even know, that there were kids living there.
00:13:01.040
One of the neighbors said they thought it was weird, but I think
00:13:07.800
two weeks ago she went outside at night and it was like nine or 10 o'clock at night and
00:13:15.040
the kids, four of them, were all kneeling down in the grass together and
00:13:20.020
they were all just kind of rolling around in the grass and the mother was standing in
00:13:24.300
the back watching over them, and she called over the fence.
00:13:27.860
You know, the neighbor said, hey, hey, guys, and the kids kind of looked at her and
00:13:34.080
then just kind of froze, and the mother didn't even acknowledge the neighbor saying anything,
00:13:40.680
and she said it was just really odd, but she didn't think anything about it.
00:13:44.880
It just, you know, it's weird that they were out at 9:30 at night, but, you know.
00:13:51.480
Well, we just have no idea. I mean, people on my street could have, you know, entire tribes in their homes.
00:13:58.560
I mean, people just don't, you don't interact with your neighbors anymore.
00:14:03.020
I mean, we always talk about, we always throw back to the times where you used to be able
00:14:06.200
to just let your kids out and they would go around the neighborhood and now we'd be terrified
00:14:12.160
Like you'd never want that to happen because we're sort of somewhat crazy, right?
00:14:19.480
There's no reason to. We're overprotective now with our kids, but part of the reason
00:14:24.880
why it felt okay is because you knew everybody in the neighborhood.
00:14:29.900
You knew that they would look out for your kids.
00:14:32.180
You knew that they would feel okay punishing them if they did something wrong.
00:14:36.420
I think it was more, I think it was also more than that though.
00:14:44.880
We were totally misled, but we all thought we had the same basic values.
00:14:52.020
And so we didn't question the parents because we thought, oh, well, you know, parents are
00:14:58.280
They believe the same thing we do and they're going to make sure.
00:15:03.020
And I don't know if anything has changed because, I mean, I grew up, I grew up in the
00:15:09.720
But I remember me and my friends going over to my friend's house and my friends, we were
00:15:17.580
all embarrassed for the kid whose parents were hippies and were always stoned.
00:15:22.700
And we were like, oh, your mom and dad stoned again?
00:15:27.060
And they would be like, you know, hey, kids, how's this?
00:15:31.400
And I mean, my parents didn't know that I was going over to, you know, I happen to be
00:15:37.760
running with a group of decent kids who were all like, okay, the stoners are here.
00:15:48.000
And part of it is because our parents were radically different.
00:15:58.180
But my parents just thought everybody's parents were the same.
00:16:02.320
They didn't realize, oh, that's the 60s hippie family.
00:16:19.740
Did you wake up feeling refreshed, feeling good?
00:16:26.020
Casper is a mattress that will really help you feel good at night.
00:16:29.900
Casper mattress has a unique combination of foams that provides the right pressure relief
00:16:35.000
and the comfort so you'll feel balanced when you get up in the morning.
00:16:39.740
Plus, it's made of breathable material so you don't wake up in the middle of the night,
00:16:45.860
you know, having to throw off the blankets because it's so hot.
00:16:54.780
I hated it because it was a billion degrees every night.
00:17:00.360
Casper has come up with a foam mattress that is comfortable, supportive, and it breathes.
00:17:08.000
Casper, try it out in your own home for 100 nights, risk-free.
00:17:11.500
They'll ship it to you in a little compact box.
00:17:13.400
You can just take it in and open it up, and boom, there's the mattress.
00:17:16.900
But if you don't love it, you don't have to put it back in that tiny little box.
00:17:20.020
They'll come and pick it up and refund every single penny, no questions asked.
00:17:31.420
Use the promo code Beck and save $50 on the purchase of select mattresses.
00:17:39.860
Save $50 off the purchase of select mattresses.
00:17:53.260
Have a fascinating conversation next hour with a guy who can tell you a little bit what the future is going to be like in a very entertaining way.
00:18:12.100
That's a terrible story, too, about these kids that were chained in the basement.
00:18:16.880
That's the big story today about this Turpin family.
00:18:19.420
When you were reading the story, did you have a moment at all, like I did, of thinking about Hank Johnson?
00:18:26.880
And so he asked the military guy, do you think if we put too many troops in Guam, the whole thing will tip over and capsize?
00:18:38.480
If you remember his response, it was, we don't anticipate that.
00:18:43.660
And there's just something about the way that, like, officials respond to these things that's just, in the most serious circumstance, is really funny.
00:18:52.940
And so in this one, they went and they interviewed, or it was a police statement where they were talking about this terrible case.
00:18:59.120
13, you know, kids, and they're all, like, chained up in a basement.
00:19:02.860
These old, you know, 20-year-olds that look like 10-year-olds because they're so thin.
00:19:06.180
And they asked him what happened, and he said, and the quote was, David and Louise were unable to immediately provide a logical reason why their children were restrained in that manner.
00:19:23.740
They were unable to immediately come up with a logical reason why they chained up 13 kids.
00:19:31.780
Can you come up with a logical reason to chain your kids to the bed?
00:19:42.640
Maybe if you were worried about them escaping because there was a nuclear holocaust outside.
00:19:58.360
Or maybe they were violent to themselves and others.
00:20:06.000
They wanted to be prepared if gravity reversed itself.
00:20:15.280
Well, they wouldn't have floated away. All they would have to do is close the windows.
00:20:21.220
Just close the windows, and I got to pull them down from the ceiling again.
00:20:39.960
This Aziz Ansari story is really what's going to separate the men from the boys, the revolutionaries from the sane.
00:20:58.300
She's the woman who wrote, um, The Handmaid's Tale.
00:21:01.900
She's been a, you know, a feminist for forever.
00:21:05.160
Um, she is now, I think, 78 years old, and she wrote this, she wrote this article for The Guardian last week, um, entitled, Am I a Bad Feminist?
00:21:17.780
And so, I read this yesterday, and I read the whole thing, not just the highlights, like everybody else, and I read the whole thing.
00:21:27.800
She said, it seems like I'm a bad feminist, and I can add that to the other things that I've been accused of since 1972, such as climbing to fame up a pyramid of decapitated men's heads, from a lefty journal; of being a dominatrix bent on the submission of men, from a right-wing one; of being an awful person who can annihilate with her magic white witch powers anyone critical of her at Toronto dinner tables.
00:21:58.240
And now, it seems I'm conducting a war on women, like the misogynistic, rape-enabling, bad feminist that I am.
00:22:07.820
My fundamental position is that women are human beings, with the full range of saintly and demonic behaviors this entails, including criminal ones.
00:22:28.240
Nor do I believe that women are children, incapable of agency or making moral decisions.
00:22:35.200
If they were, we're back to the 19th century, and women should not own property, have credit cards, have access to higher education, control their own reproduction, or vote.
00:22:46.640
There are powerful groups in North America pushing this agenda, but they're not usually considered feminists.
00:22:54.020
Furthermore, I believe that in order to have civil and human rights for women, there have to be civil and human rights, period.
00:23:07.960
Just as for women to have the vote, there has to be a vote.
00:23:12.520
Do good feminists believe that only women should have such rights?
00:23:19.780
That would be a flip of the coin on the old state of affairs in which only men had those rights.
00:23:26.360
So let me suppose that my good feminist accusers and the bad feminists, that is me, agree on the above points.
00:23:36.140
How did I get into such hot water with good feminists?
00:23:39.440
In November 2016, I signed as a matter of principle, as I have signed many petitions, an open letter called UBC Accountable.
00:23:51.220
She goes into how the University of British Columbia treated one of its former employees, Steven Galloway.
00:23:58.440
He was the former chair of the Department of Creative Writing.
00:24:08.440
He wasn't even allowed to know what he was accused of.
00:24:18.160
Finally, a judge said there was no sexual assault.
00:24:28.700
And she's looking at this going, no, no, this is a witch hunt.
00:24:34.820
She said any fair minded person would withhold judgment as to guilt until the report and the evidence is available for us to see.
00:24:45.080
And she talks about how we're going into these witch trials now, and we've got to stop.
00:25:01.500
Because Margaret Atwood was called a blood drinking monster.
00:25:05.620
The things that they said about her were just horrendous.
00:25:14.640
Margaret Atwood, as an enemy of feminism, is a tough concept to get your head around.
00:25:19.480
She is, after all, the author of The Handmaid's Tale, the universally acclaimed dystopian fantasy in which women are enslaved to men.
00:25:27.620
Her impressive body of work, one that has profoundly informed the feminist zeitgeist, is a 50 year long attack on misogyny and the patriarchal state.
00:25:37.880
Ms. Atwood is probably the leading feminist author in the world.
00:25:44.860
What happened is that the revolution has entered a new phase.
00:25:50.060
Having vanquished the reactionaries, the Jacobins are now sending the moderates to the guillotine.
00:25:57.620
Buildings have to be razed so society can begin anew.
00:26:02.340
And everyone who isn't for them is against them.
00:26:05.700
Moderates like Ms. Atwood and their odious ideas about due process and the presumption of innocence until proven guilty are traitors to the revolution.
00:26:15.980
One letter to the Globe put it another way.
00:26:29.300
We as a society are now starting to go into a new period of this revolution.
00:27:02.600
And now the revolutionaries are taking everyone, kicking and screaming.
00:27:08.560
And these are the same people who said George Bush was horrible because he said, if you're not with us, you're against us.
00:27:18.380
And there's this standard that has nothing to do with human rights.
00:27:34.360
This, uh, Aziz Ansari story is remarkable.
00:27:46.980
And even Ashleigh Banfield, who I don't think I've ever agreed with on anything she's ever said, gets on CNN.
00:27:55.080
And I want you to listen to what she said about this.
00:27:57.340
But what you have done, in my opinion, is appalling.
00:28:01.760
You went to the press with the story of a bad date.
00:28:05.580
And you have potentially destroyed this man's career over it right after he received an award for which he was worthy.
00:28:12.240
If you were sexually assaulted, go to the cops.
00:28:15.500
If you were sexually harassed, jeopardizing your work, speak up and speak out loud.
00:28:21.420
But by your own descriptions, that is not what happened.
00:28:24.920
You had an unpleasant date, and you did not leave.
00:28:30.680
And all the gains that have been achieved on your behalf and mine are now being compromised by the allegations that you threw out there.
00:28:38.000
And I'm going to call them reckless and hollow.
00:28:40.660
I cannot name you publicly and sentence you to a similar career hit as Ansari's, because you chose to remain anonymous.
00:28:53.080
I mean, jeez, if a guy said that, their career would be over, right?
00:29:01.440
You know, babe.com is the rag that put this charge out?
00:29:15.220
They went to his apartment, started the date at the apartment.
00:29:19.200
They had met at a party, and she was impressed by his celebrity.
00:29:21.840
I mean, if you don't know who he is, he's a comedian.
00:29:24.600
He does the show Master of None on Netflix, among other things.
00:29:31.120
At one point, he offered her a glass of wine, and he brought her white wine.
00:29:38.780
And in the article, it was an interesting, revealing moment where she says, it was white wine.
00:29:46.540
Like, as if she couldn't say, can I have some red wine?
00:29:49.780
It was like he was forcing her to accept this white wine, even though she preferred red.
00:29:56.020
And it's like such a strange detail to put in there, just showing that she...
00:30:00.360
That is, if it is true that he didn't want you to have red wine, and you said, no, I prefer red, and he said, she'll have the white, that should have been the end of the date right there.
00:30:13.040
She didn't even express, by her own telling, that she actually wanted red wine.
00:30:18.720
And again, it's another part of, she has no agency.
00:30:21.640
She's, you know, she is an underling in society.
00:30:32.360
They go through, as she puts it, they rush through a dinner at a fancy restaurant.
00:30:36.620
Now, so far, this guy's given her wine, and he's brought her to a fancy restaurant.
00:30:40.360
There's no indication he's been a jerk to her in any way.
00:30:46.540
They eventually perform oral sex on each other.
00:31:01.980
And so she claims, through this, she gave nonverbal cues, which should have told him to stop.
00:31:08.940
At one point, she did say something like, I kind of want to take it slow, when it came to the final act.
00:31:13.700
Like, it seemed like, by her description, she had drawn some sort of line that she didn't want to go all the way that night.
00:31:29.920
She decided, you know, I don't want to do this.
00:31:39.880
And he was so sure that this was a consensual, fine evening, even though he wanted to have sex with her.
00:31:45.500
And she decided to stop it at one level, that the next day he texted her and said, hey, it was really nice meeting you last night.
00:31:53.560
And then she responded with this 45, you know, 100 word text of, I did not feel fine.
00:32:01.820
He responded and said, oh, my God, I had no idea you felt that way.
00:32:09.460
And that was it until she decided to write an article anonymously about the incident.
00:32:12.900
Rupert Murdoch put six million dollars into Babe.
00:32:22.320
Now, you said a minute ago, imagine if a man said what Ashleigh Banfield just said.
00:32:33.280
And I have to I can't read very much of these stories because they're absolutely pornographic.
00:32:41.360
The headline is: how to trap your man who doesn't know he's your man, but is still your man.
00:32:55.000
Now imagine if I said, hey, I'm going to write an article about how to trap that
00:33:00.480
woman who doesn't know she's your girlfriend, but she's your girlfriend, until Valentine's Day.
00:33:06.640
The men writing articles about trapping women doesn't go over that well, typically.
00:33:12.420
Cuffing season is officially over, which means your current victim is crafting a convenient
00:33:19.020
excuse to leave you before the big day of Valentine's Day.
00:33:28.140
Writing that you have female victims is not usually looked upon kindly.
00:33:38.840
You know, you're manic, but you thought your head.
00:33:46.040
Nevertheless, three to five meltdowns and half a breakup later.
00:33:51.500
He's weary and you're mad and he's still not sure why you think you're dating him in the first place.
00:34:07.420
I mean, the stuff you can't read, which you read to me, and I'm internally scarred from it.
00:34:16.440
And what's amazing about it is it's all written from this perspective.
00:34:25.080
And then she can't even ask for red wine at the same time.
00:34:30.040
This talks about how you trap a man in bed, how you get him to do what you want him to do.
00:34:39.420
So please don't lecture the rest of us on how to behave because if my daughter, if I
00:34:50.100
If I behaved the way you are, are, uh, condoning and encouraging, you would have the right to
00:35:20.320
She said this blinds.com is by far the most customer friendly company I have ever experienced.
00:35:26.060
The customer service representatives are knowledgeable about the products.
00:35:29.680
They take a lot of time helping me with my selection of even just the color.
00:35:41.220
I've told you for years how great they are, how great the services.
00:35:45.760
He put, um, shutters in, uh, in his house and had exactly the same kind of experience.
00:35:56.120
They actually downsold me. I guess I was looking at, well, of course my wife was looking at, the top of the line.
00:35:59.860
And so we were going to go with the top of the line when it was a stunning surprise that
00:36:03.940
Uh, and they said, actually, this other kind is the same quality and maybe even a little better.
00:36:20.100
Blinds.com slash Beck. Uh, right now through January 28th, you're going to save 20% site
00:36:26.260
wide at blinds.com slash Beck. Rules and restrictions apply.
00:36:35.080
See, there's a new app out now that allows you to create a consent contract before you have sex.
00:36:58.160
This is, uh, now we can all have it all official.
00:37:01.760
Sign here, sign here, initial this and sign and I'm hot and ready to go now.
00:37:09.580
It's not sexy, but also aren't you giving away, like, can you imagine if you actually
00:37:12.900
got really assaulted after this, the person would have a legally binding contract saying
00:37:17.160
you agreed. It's not a good move.
00:37:21.420
You need the, you know, the long extended version.
00:37:31.760
You know, Sweden is, is a fantastic, um, country.
00:37:44.480
It is a country that is known as the most refugee friendly country in the world.
00:37:49.500
And that, that is something that they are very proud of, uh, and something that they
00:37:55.340
really lived by in World War II and are living by now.
00:37:59.360
When you go to Sweden, if you're an immigrant, you're given free housing, money, language lessons,
00:38:09.940
They are leading the world in, uh, tolerance and acceptance.
00:38:15.440
Or so you would think they are leading the world unless you're a Christian.
00:38:23.160
There is a shocking story coming out of Sweden now that is kind of being buried, about a woman named Aideen.
00:38:30.420
She was a popular TV and movie actress in her home country of Iran.
00:38:35.440
One day she witnessed a woman getting stoned to death and she thought, I gotta get, I gotta get out of here.
00:38:41.840
Not long after she had a dream, uh, it was a dream of Jesus and she decided that she wanted
00:38:48.440
to convert, but she had to do it in private and in secret because in Iran, converting to Christianity is illegal.
00:38:55.620
So she left for Sweden because Sweden was taking in immigrants and refugees.
00:39:04.080
When she got to Sweden, it's when she decided to make her conversion public to not live a lie
00:39:10.700
Leaving the Islamic faith is illegal in Iran, punishable by death.
00:39:14.560
But Swedish immigration has now decided to deny her asylum request and block her appeal.
00:39:23.200
The UN's and Sweden's immigration policies state that an immigrant cannot be denied asylum if
00:39:29.700
the seeker faces imminent danger upon return to their home country.
00:39:37.840
If she's deported back to Iran, she will face imprisonment, rape, and execution.
00:39:49.120
Kind of an odd story from the most tolerant and accepting society on the planet.
00:39:55.780
A recent investigation done by the Swedish newspaper uncovered a program that the
00:40:01.760
Swedish government was running to protect ISIS terrorists arriving from Syria.
00:40:08.240
Apparently, these poor jihadists were having a hard time finding jobs because their pictures
00:40:12.980
and starring roles in propaganda videos were scaring off employers.
00:40:18.920
So the Swedes did an undercover operation and fixed all that with brand new identities and protected status.
00:40:34.880
If you're being persecuted because of your faith, we need to protect you.
00:40:40.280
But if you're being persecuted because you had a starring role in a terrorist recruiting film, I don't get it.
00:40:53.120
And in some places like here in the United States, the war on faith is being fought against ideology.
00:41:00.300
In other places like Iran, the war is literally life and death.
00:41:05.960
We've come full circle in the Middle East, a return to the first century.
00:41:10.640
And like then, the time has come to show the world that a church is meaningless.
00:41:24.720
I don't know what it means to be a Christian, quite honestly, anymore.
00:41:36.240
And millions of Christians and Yazidis and even Muslims who aren't Muslim enough live under the constant threat of persecution and death.
00:41:45.960
It is time we all stand shoulder to shoulder without any ill will or hatred in our hearts.
00:42:09.120
I have been immersing myself in future tech to try to understand what is coming our way and what the moral issues are of the near future.
00:42:28.440
What it means to each of us in our lives, what it means to be asked the question, am I alive?
00:42:43.600
We have so many questions that we have to answer and we're having trouble with just some of the basic things.
00:42:51.540
And no one is really thinking about the future.
00:42:53.720
When you think about the future and you think about robots or you think about AI, Americans generally think of the Terminator.
00:43:00.920
Well, that's not necessarily what's going to happen.
00:43:09.140
So I've been reading a lot of high tech stuff and I've in my spare time been trying to read some novels and I'm looking for the storytellers, the people who can actually tell a great story that is really based in what is coming.
00:43:27.160
The futurist or the near future sci-fi authors that can show us what's on the horizon.
00:43:45.860
And I found them over the Christmas vacation and I've just last night finished the fourth one.
00:43:56.420
They get a little dark, but they also show the positive side of what could be.
00:44:03.160
And it was a balanced look and a way to really understand the future that is coming and is on the horizon.
00:44:10.740
William Hertling is the author and he joins us now.
00:44:27.120
I could not find a publisher who saw the vision of the series, but I've self-published it and people love it.
00:44:42.060
And I hope, you know, I don't know what your sales have been like, but I hope your sales are really good.
00:44:56.220
You know, what happened was about 10 years ago, I read two books back-to-back.
00:45:04.160
One was Ray Kurzweil's The Singularity is Near, which I know you've read as well.
00:45:09.080
And the other one was Charles Stross's Accelerando, which is a fictional book about the singularity.
00:45:15.160
And what I realized at that point in time was that we had the biggest set of changes that were ever going to face humanity.
00:45:22.100
And they were coming, and they were in the very near future, right?
00:45:27.040
They're probably coming within the next 10 years.
00:45:31.860
And as you said, most of the stories that are in media today are about these Terminator-style stories.
00:45:37.780
The AIs rise up, they take control of the machines, and we fight them in a battle, which, of course, makes for a great movie.
00:45:43.380
I mean, I would love to see The Terminator many times over.
00:45:50.420
What happens when it's sort of the quiet AI kind of story?
00:45:55.220
What happens when there's this moment of emergence of the first AI that's out there, and people realize they're being manipulated by some entity?
00:46:07.380
So I find this, first of all, you lay it out so well.
00:46:12.340
And you start, the first book starts with the emergence of AI, and then moves, I think the next book is what, 10 years later, five years later?
00:46:27.980
They can basically explore different points in technology in the future.
00:46:32.520
So the last one is in the 2040s or 2050s, and it's a very different thing then than it starts out as.
00:46:40.120
And the thing I wanted to talk to you about is, first of all, can you just define, because most people don't know the difference between AI, AGI, and ASI, which is really important to understand.
00:47:00.220
It's any time programmers write a piece of software that instead of having a set of rules, you know, if you see this, then do that.
00:47:09.040
Instead, the AI software is trained to make decisions on its own.
00:47:19.160
It's what selects the stories that you read in Facebook.
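[Editor's aside: to make Hertling's distinction concrete, here is a minimal sketch, not from the broadcast, contrasting the two approaches he describes: a hand-written rule versus software trained on labeled examples that makes the decision on its own. The spam-filter scenario and every name in it are hypothetical, and it assumes scikit-learn is installed.]

```python
# Hypothetical illustration (not from the show): hand-written rules vs. a trained model.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# The "set of rules" approach: a programmer writes "if you see this, then do that."
def rule_based_is_spam(message: str) -> bool:
    return "free money" in message.lower() or "act now" in message.lower()

# The trained approach: the software learns the decision from labeled examples.
train_messages = ["free money, act now!", "win a free prize today",
                  "lunch at noon?", "meeting notes attached"]
train_labels = [1, 1, 0, 0]  # 1 = spam, 0 = not spam

vectorizer = CountVectorizer()
model = MultinomialNB().fit(vectorizer.fit_transform(train_messages), train_labels)

test_message = "claim your free prize now"
print(rule_based_is_spam(test_message))                        # decision a person wrote
print(model.predict(vectorizer.transform([test_message]))[0])  # decision the model learned
```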
00:47:25.320
And AGI is this notion that artificial intelligence will become more general, right?
00:47:31.080
All of those things I mentioned are very specific problems to be solved.
00:47:36.740
So a good example of AI would be Deep Blue, the chess-playing IBM computer.
00:47:52.180
And we have IBM's Watson, which is really good at making diagnoses about cancer, but you can't have a conversation with it about how you're feeling.
00:48:03.240
AGI would appear to be like a human being, conceivably, in that it could talk and reason about a wide variety of topics, make decisions, generally use its intelligence to solve problems that it hasn't even seen before.
00:48:22.240
So the Turing test is this idea that you've got a person in one room chatting with someone in another room, and they have to decide, is that a human being or is it a computer?
00:48:34.420
And if they can't figure it out, then that is the Turing test.
00:48:39.540
And you've passed the Turing test if the judge can't distinguish between the computer and the person.
00:48:48.380
Well, I think we've probably all been fooled at least a few times when we've either gotten a phone call or made a phone call, and we think that we're talking to a human being on the other end, right?
00:48:59.400
But it actually turns out that we're talking to a machine that answered our phone call somewhere.
00:49:03.440
So, you know, we're there for like a couple of sentences, but we're still pretty far away if you're going to have any kind of a meaningful conversation.
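[Editor's aside: here is a toy sketch of the Turing test setup Hertling describes, with both respondents as invented stand-in functions; it is an illustration added for readers, not anything discussed on air, and no real chatbot is involved.]

```python
# Toy Turing-test setup: a judge questions two hidden rooms and guesses which is the machine.
import random

def human_reply(question: str) -> str:
    # Stand-in for the person in the other room.
    return f"Hmm, '{question}'... let me think about that for a second."

def machine_reply(question: str) -> str:
    # Stand-in for a chatbot; a real system would generate text here.
    return random.choice(["Interesting question.", "Why do you ask?", "Tell me more."])

def run_turing_test(questions):
    # Hide the two respondents behind anonymous room labels, in random order.
    rooms = dict(zip("AB", random.sample([human_reply, machine_reply], 2)))
    for q in questions:
        for label, respond in rooms.items():
            print(f"Judge: {q}\n  Room {label}: {respond(q)}")
    guess = input("Which room is the machine, A or B? ").strip().upper()
    machine_room = next(label for label, r in rooms.items() if r is machine_reply)
    # The machine "passes" if the judge cannot reliably pick it out.
    print("The machine passed." if guess != machine_room else "The judge spotted the machine.")

run_turing_test(["What did you have for breakfast?", "What's your favorite memory?"])
```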
00:49:12.540
And AGI is when a computer has the computing power of a human brain.
00:49:21.360
Now, that's not necessarily a scary thing, but it's what happens when you go from AGI to ASI, artificial superintelligence, and that can happen within a matter of hours, correct?
00:49:39.360
There's a couple of different philosophies on that.
00:49:41.780
But if you can imagine that, think about the computer that you have today versus the computer you had 10 years ago, right?
00:49:50.260
It's vastly more powerful, vastly more powerful than the one you had 20 years ago.
00:49:54.000
So, even if there's not these super rapid accelerations in intelligence, even if you just today had a computer that was the intelligence of a human being, you would imagine that 10 years from now, it's going to be able to think about vastly more stuff, much faster, right?
00:50:14.160
So, we could see even just taking advantage of increasing in computing power, we would get a much smarter machine.
00:50:20.840
But really dangerous, or not necessarily dangerous, the really rapid change comes from when the AI can start making changes to itself.
00:50:31.740
So, if you have today programmers create AI, but in the future, AI can create AI, and the smarter the AI gets, then in theory, the smarter the AI it can build.
00:50:43.460
And that's where you can get this thing that kind of spirals out of control.
00:50:46.780
So, to get a handle on how fast this can all change:
00:50:52.700
If you have an Apple iPad 2, you have what was one of the top five supercomputers in 1998.
00:51:08.520
That's how fast technology is growing on itself.
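[Editor's aside: for readers who want the arithmetic behind that kind of claim, here is a back-of-the-envelope sketch; the two-year doubling time is an assumption in the spirit of Moore's law, and the iPad 2 comparison above is the speaker's figure, not something computed here.]

```python
# Rough compounding of computing power under an assumed ~2-year doubling time.
def growth_factor(years: float, doubling_time_years: float = 2.0) -> float:
    return 2 ** (years / doubling_time_years)

for years in (10, 20):
    print(f"After {years} years: roughly {growth_factor(years):,.0f}x the computing power")
# After 10 years: ~32x; after 20 years: ~1,024x
```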
00:51:13.240
All right, so, William, I want you to kind of outline, we're going to take a break, and I want you to come back and kind of outline why all of this stuff matters.
00:51:25.120
What is in the near future that we're going to be wrestling with, and why people should care when we come back?
00:51:35.740
We have a burglary in America about every eight seconds, and you know what really thwarts them is somebody who has a security system.
00:51:50.340
They find, generally, houses that aren't protected.
00:51:55.060
But a security system can be really expensive, and if you go the old-fashioned way, you don't ever own it.
00:52:00.860
You're always paying on it, and, you know, you're charged $50 or $60 for the 24-7 monitoring.
00:52:16.600
Its sensors are going to protect every point of access in your home.
00:52:19.400
If a burglar even breaks the glass or tries to open anything, the sirens go off and let them know, hey, police are on the way.
00:52:31.260
But it also will take a picture of whoever it is that was trying to break in that you can give to the police when they arrive.
00:52:38.700
SimpliSafe has 24-7 monitoring at $14.99 a month.
00:52:43.880
They never lock you into a long-term contract, and you own the system.
00:52:47.720
Plus, you get a 60-day money-back guarantee, so there's no reason not to try it.
00:52:52.400
SimpliSafe, the security system that you should have in your home, SimpliSafeBeck.com.
00:53:13.960
As you know, if you're a long-term listener of the program, I am very fascinated with the future and what is coming, the future of tech and artificial intelligence.
00:53:28.200
William Hertling is the author of what's called the Singularity series.
00:53:31.220
It's a series of four novels that kind of break it down and tell you exactly what's coming and break it down in an entertaining fashion.
00:53:43.320
If you are interested in any of this, you need to start reading that.
00:53:48.540
William, I know Glenn is a big fan of your work and has been reading a lot about technology.
00:53:52.700
I think a lot of people who are living their daily lives aren't as involved in this.
00:53:57.880
I think a third or a half of the audience, when they hear AI, don't even connect that to artificial intelligence until you say it.
00:54:05.740
I know as a long-term NBA fan, I think Allen Iverson, honestly, when I hear AI.
00:54:11.700
So can you make the case of with everything going on in the world, why should people put this at the top of their priority list?
00:54:18.800
Well, it's the scale of the change that's coming.
00:54:24.040
And probably the nearest thing that we're really going to see is over the next five years,
00:54:28.580
we're going to see a lot more self-driving cars and a lot more automation in the workplace.
00:54:35.040
So I think transportation jobs account for something like 5% of all jobs in the United States.
00:54:41.620
And whether you're talking about driving a car, a taxi, driving a delivery truck,
00:54:47.100
all of those things are potentially going to be automated, right?
00:54:50.120
This is one of the first really big problems that AI is tackling.
00:54:56.020
So AI can drive a car and it can do a better job.
00:55:07.220
It's going to make mistakes, but it's going to make less mistakes than your typical human operator.
00:55:10.940
So business likes to save money and it likes to do things efficiently.
00:55:17.560
And self-driving cars are going to be more cost effective.
00:55:20.980
So what happens to those 5% of the people today who have transportation jobs, right?
00:55:27.360
This is probably going to be the biggest thing that affects us.
00:55:30.060
I think, William, that Silicon Valley had better start telling the story in a better fashion
00:55:39.680
because as these things hit, we all know politicians on both sides.
00:55:45.960
They're telling everybody that I'm going to bring the jobs back.
00:55:50.340
In fact, many, many more are going to be lost, not to China, but by robotics and AI.
00:55:57.060
And when that happens, you know, I can see, you know, politicians turning and saying,
00:56:09.620
And yet, unfortunately, the AI genie is out of the bottle, right?
00:56:16.580
Tech companies around the world are investing in it.
00:56:18.880
If we stop investing in it, even if we said, hey, we don't want AI, we don't like it,
00:56:23.700
all that's going to do is put us at a disadvantage compared to the rest of the world.
00:56:33.480
So we need to participate in it, and we need to shape where it's going.
00:56:36.920
And I think this is the reason why it's so important to me that more people understand this.
00:56:43.260
Because we need to be involved in a public conversation about what we want society to look like in the future.
00:56:48.640
As we go out, if even more jobs are eliminated by AI, what does that mean?
00:56:53.340
What if we don't have meaningful work for people?
00:56:55.800
I think that the thing I like about your book series is it starts out really hopeful,
00:57:02.500
and it shows that this technology is not going to be something that we really are likely to refuse
00:57:13.360
because it's going to make our life incredibly stable and easy in some ways.
00:57:22.280
And I'd kind of like you to talk a little bit about the stock market and the economy and war and everything else,
00:57:30.300
something that you would talk about in your first novel,
00:57:33.160
and then show us, when we come back, the good side and then what it could turn into.
00:57:39.900
So Allen Iverson is taking our transportation jobs?
00:58:05.260
He is the author and futurist, the author of many books.
00:58:11.580
I'm talking to him about the Singularity series.
00:58:15.440
And the first one in there is the Avogadro Corp.
00:58:18.580
And it starts out around this time, and it starts out with a tech center in Portland,
00:58:29.820
and a guy is working on a program that will help everybody with their email,
00:58:35.360
and all of a sudden he makes a couple of changes, and unbeknownst to him,
00:58:39.080
it grows into something that is thinking and acting and changing on its own.
00:58:44.220
And William, I'd like you to take us through this, because the first book starts out really kind of positive,
00:58:51.140
where you're looking at this, and there's some spooky consequences, but you're looking at it and going,
00:58:59.020
And by the end, in the fourth book, you know, we have all been digitized,
00:59:03.760
and we're in a, you know, a missile leaving the solar system because Earth is lost.
00:59:12.680
A, do you think this is, is this your prediction, or you just think this is a really kind of good story?
00:59:18.320
I, well, you know, I think a lot of, I think a lot of it has the potential to be real.
00:59:26.160
And I think one of the things you probably know from reading my books is that I'm fairly balanced,
00:59:29.540
and what I see is both the risks and the benefits.
00:59:35.700
There's so many people that are very dogmatic about artificial intelligence in the future,
00:59:39.900
and they either say, hey, it's either all benefits and there are no risks,
00:59:43.280
or they only talk about the risks without the benefits.
00:59:45.280
But, you know, and there's a mix of both, and it's like any other technology, right?
00:59:51.480
We all find our smartphones to be indispensable.
00:59:54.160
And at the same point in time, they affect us, right?
00:59:58.600
And society is different today than it was 10 years ago because of our smartphones.
01:00:03.920
This is different, though, than anything else that we've seen, like a smartphone,
01:00:08.780
because this is like, you know, an alien intelligence.
01:00:12.500
And so we don't have any way to predict what it's going to be like or what it's going to do
01:00:16.540
because it will be thinking, and it most likely will not be thinking like a human.
01:00:20.940
But can we start at the beginning where, just give me some of the benefits that are going to be coming
01:00:25.640
in the next, you know, let's say 10 years that people are going to have a hard time saying no to?
01:00:32.540
I mean, first of all, we already talked about self-driving cars, right?
01:00:36.080
I think we'd all like to get into our car and be able to do whatever we want to do
01:00:41.860
That's going to free us up from a mundane task.
01:00:44.880
We're going to see a lot more automation in the workplace,
01:00:49.500
which means that the cost of goods and services is going to go down.
01:00:53.400
So we're going to be able to get more for less.
01:00:56.220
So that's going to seem like an economic boom to those of us that can afford it, right?
01:01:04.740
We are going to have better experiences when we interact with AI.
01:01:09.720
So today, if you have to go to the doctor, you're going to wait to get a doctor's appointment.
01:01:15.140
You're going to have this rushed experience more than likely, at least here in the U.S., right?
01:01:20.340
And you're going to get five minutes of their time,
01:01:22.460
and you're hoping that they're going to make the right diagnosis in that five minutes that they're with you.
01:01:26.660
So that's going to be, I think, one of the really big changes over five to ten years from now
01:01:32.620
is we're going to see a lot more AI-driven diagnosis.
01:01:36.060
So when you're having medical issues, you can go in and you can talk to an AI.
01:01:40.160
That'll be more or less indistinguishable from talking to the nurse when you walk into the doctor's office.
01:01:45.620
And by the time the doctor sees you, there'll already be a diagnosis made by the AI,
01:01:50.620
and it'll likely be more accurate than what the doctor would have done,
01:01:56.080
Yeah, I had a hard time, until I started reading about Watson,
01:02:00.640
I had a hard time believing that, you know, people would accept something from a machine,
01:02:06.220
but they are so far ahead of doctors if they're fed enough information.
01:02:11.980
They're so far ahead on, you know, predicting cancer and diagnosing cancer than people are.
01:02:19.080
You're going to want to have the AI diagnose you.
01:02:23.000
Right, because that's going to be the best, right?
01:02:32.900
And then, you know, 10, 15 years out, we start, and you know, it's funny.
01:02:37.960
I had a conversation with my daughter one day, and she asked,
01:02:40.760
hey, Dad, when am I going to get to drive a car?
01:02:42.860
And I thought about her age, and I thought about that, and I was like, well, I'm not sure you're
01:02:49.060
ever going to get to drive a car, because of, you know, where you are in age and when self-driving cars are coming.
01:02:56.460
And so you'll just get in one, and it'll take you where you want to go.
01:02:59.820
So there's going to be these very sort of, they're both subtle and yet dramatic changes
01:03:04.600
in society when you think about, hey, we're going to have a generation of people who may
01:03:09.960
And their time will be free to do other things, but it'll be different than we are.
01:03:15.080
Do you see the, you know, in your first book, you talk about, you know, AI changing, you
01:03:23.780
know, the emails that are being sent and doing things on its own and really manipulating people.
01:03:29.920
We are already at the point where we accept the manipulation of what we see in our Facebook
01:03:35.900
feed, but that's not a machine trying to do that on its own.
01:03:45.020
Or do you see us very far away from, you know, hedge fund computers that can really
01:03:53.860
manipulate the markets in a positive way, or computers that can begin to manipulate for other purposes?
01:04:09.440
At least at a minimum, right, we can imagine that if you have an authoritarian
01:04:15.240
government, they're going to distribute information to pacify people.
01:04:19.960
And that's not a good thing often, but in some ways it is.
01:04:26.800
I mean, you know, if you have armed unrest, people will die.
01:04:32.520
I think what we're going to see is we're just going to see lots of different people using
01:04:37.260
So maybe we don't have a, you know, a hedge fund manipulating the markets in a positive way.
01:04:43.740
Maybe it starts with a bunch of hackers in another country manipulating the markets to their own advantage.
01:04:51.260
But I think we are going to see that distribution, that, that manipulation of information.
01:04:59.080
A lot of the content that you read on the web, whether it's a review of a restaurant or
01:05:03.540
a business, a lot of that is already generated by AI.
01:05:06.660
And it's hard to tell what's an AI versus a person.
01:05:11.760
Talking to William Hertling, he's an author and futurist, author of a great series of novels
01:05:20.160
William, the idea that intelligence, not AI, not narrow AI, but, you know, superintelligence
01:05:35.560
or artificial general intelligence just kind of comes out of nowhere as it does in your
01:05:42.460
first novel where it wasn't the intent of the programmer is interesting to me.
01:05:49.680
I sat with one of the bigger names from Silicon Valley just last week.
01:05:56.720
Uh, and we were talking about this and he said, whoever controls AI, whoever gets this first, wins.
01:06:06.640
He was talking to me privately about, um, a need for almost a Manhattan project for this.
01:06:12.800
Do you see this as something that's just going to be sprung on us, or will it be, uh, developed deliberately in a lab?
01:06:23.820
I think the odds are probably strongly biased towards it coming out of a lab, um, both because they have
01:06:30.720
the kind of deeper knowledge and expertise and also because they have the kind of raw computing power.
01:06:35.420
So, um, that alone is like, they have the computers that we'll have in 15 to 20 years, right?
01:06:52.460
And that makes AI a lot easier of a problem to solve.
01:06:55.660
So I think it's most likely to come out of, um, a lab.
01:06:59.840
If you're looking at, for instance, the lawsuit that was just filed, um, against Google,
01:07:04.680
about the way they, uh, treat people with different opinions, uh, et cetera, et cetera, my first
01:07:11.400
thought is, good God, what are those people putting into the programming?
01:07:16.720
Um, I mean, that, that, that doesn't, that doesn't work out well for people.
01:07:22.500
Is there enough, are there enough people that are concerned about what this can do and what
01:07:29.900
this can be, that we have the safeguards in place?
01:07:38.720
I mean, think about the transportation system we have today and the robust set of safety
01:07:44.680
So, um, we want to drive from one place to another.
01:07:49.000
We have laws that govern how you drive on those streets.
01:07:55.240
All these things designed to prevent an accident.
01:07:58.200
If you get into an accident, we have all these harm reduction things, right?
01:08:01.580
We have seatbelts and airbags and crumple zones.
01:08:04.360
And after the fact, we have all this, we have a whole system of mitigation, right?
01:08:08.600
We have ambulances and paramedics and hospitals to take care of what damage does result.
01:08:14.540
And in the future, we're going to need that same sort of very robust system for AI.
01:08:26.500
Um, uh, and yeah, nobody's thinking about it comprehensively.
01:08:31.360
And one thing you could imagine is, well, we'll wait until we have a problem and then we'll deal with it.
01:08:39.460
Well, the problem of course, is that AI operates at the speed of computers, not at the speed
01:08:44.800
Um, and there's a scene in one of my books, I'm sure you remember reading it where there's
01:08:49.480
a character who witnesses a battle between two different AI factions.
01:08:57.920
A lot of things happen between the two different AI factions, all in the time it takes the human
01:09:05.720
And by the time he's like primed and ready to fight, the battle is over and they're into
01:09:09.700
negotiations and, and how to resolve it, right?
01:09:14.960
That is a great, um, way of understanding how fast things will move.
01:09:23.980
It's, uh, like one of the best action war scenes I've ever read in a novel.
01:09:28.880
Really, really good, you know, page after page after page of stuff happening and you get
01:09:33.040
to the end and you realize, oh my gosh, the human hasn't hardly even moved.
01:09:39.900
He hasn't even had a chance to think about the first step that happened and it's already over.
01:09:48.080
So this is, this is why we need to be thinking about how are we going to control AI?
01:09:56.100
We have to have these things in place long before we actually have AI.
01:09:59.940
Isn't though, isn't it true though, William, that eventually some bad actor is going to
01:10:03.820
be able to develop this and not put those safeguards in and we're not going to have a
01:10:09.400
Eventually the downside of this is going to affect everybody.
01:10:16.020
And part of the reason why I say, right, we can't opt out of AI.
01:10:19.760
We can't not develop it because then we're just at a disadvantage to anyone who does.
01:10:27.980
So one of the things that I talk about in my third book, which is set in around like 2035,
01:10:36.120
I think neural implants, so basically a computer implanted in your brain, the purpose of which
01:10:42.660
is mostly to get information in and out, right?
01:10:44.760
So instead of having a smartphone in our hand where we're trying to read information on
01:10:48.240
the screen, we can get it directly in our head.
01:10:50.520
It makes the interaction much smoother, easier.
01:10:54.080
And, but it can also help tailor your brain chemistry, right?
01:11:00.860
And so if you could imagine if you're someone who had depression or anxiety or a severe mental
01:11:05.740
disability, that a neural implant could correct for those things.
01:11:09.740
So you would basically be able to flip a switch and turn off depression or turn off anxiety.
01:11:16.620
Could I ask you to come back tomorrow and talk and start there?
01:11:21.720
Start with the neural implants and where it kind of ends up with technology, because it
01:11:29.840
And in reading the real science behind it, it's real.
01:11:53.480
Uh, you know, that thing that'll alter your brain.
01:11:56.860
Uh, William Hertling is the author of, uh, of all of these books.
01:12:00.960
There's four of them in the Singularity series, plus Kill Process, which just came out.
01:12:10.680
Uh, this comes in from Twitter at World of Stew, uh, to understand the difference between
01:12:14.820
AI, artificial intelligence, and AGI, artificial general intelligence.
01:12:19.040
So, if there's a self-driving car, and it's AI, you say, take me to the bar, and it says, okay, and takes you to the bar.
01:12:27.680
If you say it to AGI, um, take me to the bar, it responds, your wife says you drink too much,
01:12:33.900
and my sensors indicate you've put on a few pounds. Routing to the gym.
01:12:52.680
Um, and, you know, the 2008 is starting to have some sensor problems, et cetera, et cetera.
01:12:57.360
And, you know, when these things start to happen, it's way out of warranty, but I have
01:13:02.360
the extended vehicle service protection from Car Shield, so I can still take it into the
01:13:06.760
mechanic, whether it's, you know, the guy at the dealership or just a mechanic, and
01:13:11.400
they get the mechanic paid, so I don't have to wait for a check or anything else, and I'm
01:13:15.040
not afraid every time I take that truck in to have it serviced, because just, you know,
01:13:20.180
one little sensor could be a thousand bucks; a new fuel pump, 500; a water pump, a thousand.
01:13:26.480
If you need repairs to your car, and we all do, it's all going to happen, they have
01:13:35.240
a plan now that will cover everything from, you know, the water pump to the car's computer,
01:13:39.560
Car Shield, the ultimate in extended coverage. And they get the mechanic paid, so...
01:13:48.120
Sign up today and get 24/7 roadside assistance and a rental car when yours is in the shop.
01:13:52.560
Do what I do: save yourself from high repair bills and get covered with Car Shield.
01:13:58.840
Call 1-800-CAR-6100, mention the promo code BEC, or visit carshield.com and use the code BEC,
01:14:07.300
That's carshield.com, promo code BEC. A deductible may apply.
01:14:30.140
I'll post and tweet the links to William Hertling's books.
01:14:41.140
I've been looking for somebody who can really explain, in an entertaining
01:14:49.380
way, through a novel, what's coming our way.
01:14:54.060
Tomorrow we have another author; one of my favorite authors is going to be on with us.
01:14:59.100
And he's got a similar novel, and we'll talk to him as well.
01:15:20.620
Good news and bad news, which do you want to hear first?
01:15:23.680
According to researchers at the University of London, it doesn't really matter which one you hear
01:15:31.340
Good news or bad news, it doesn't matter.
01:15:34.140
You're more likely to believe the good news because of something called the desirability bias.
01:15:41.120
Desirability bias is when you consider information more credible because it makes you feel good.
01:15:46.000
It helps explain the whole social media fake news phenomenon.
01:15:50.880
When you see something, it's not confirmation bias.
01:15:54.420
It's desirability bias, which is actually more difficult and more troubling.
01:16:01.640
Researchers at the University of London set up a study just before the 2016 presidential election.
01:16:06.520
They took 900 voters who were diehard Hillary Clinton or Donald Trump fans, and they
01:16:13.120
asked them, which one do you support and which one is going to win?
01:16:18.440
Researchers then separated the voters into the two groups.
01:16:22.860
They gave the first group polling results that indicated that Trump would win.
01:16:27.080
And the second group got polling results indicating Hillary would win. With this new information,
01:16:34.020
they were asked, who do you think is going to win?
01:16:43.580
People believed the polling results they were given only when the poll indicated that their candidate would win.
01:16:52.260
If their candidate was shown to be winning in a strong poll, even if they had thought
01:16:59.100
Clinton might win, they changed their prediction; otherwise, they dug their heels in.
01:17:08.660
It means that we are listening to the things that let us believe what we want to believe.
01:17:18.500
And it's something that is a lost art now on both sides of the aisle.
01:17:23.440
If you want to persuade people, you have to get them to want to agree with you.
01:17:33.340
Now people become monsters and pariahs, and they become strident.
01:17:40.140
And so you don't like them.
01:17:54.460
And that's because he said things that people wanted to believe.
01:18:00.300
Now we can't fathom a candidate appealing across the aisle.
01:18:04.600
In fact, I think if you see a candidate that tries to appeal to the other side, you immediately
01:18:10.020
We saw Obama supporters blinded by the desirability bias for eight years.
01:18:16.780
They would not believe reports about the IRS, that he was using the IRS, because they didn't want to believe them.
01:18:26.440
Now we're seeing the same thing with the Trump base.
01:18:29.980
We have to move past this concept of the presidency as the ultimate bully pulpit.
01:18:36.460
It's not what the executive office was designed to be, and it will not help heal our divisions.
01:18:59.540
Well, we have some really good news to share with you here in just a second.
01:19:04.840
We have Tim Ballard, the founder and CEO of Operation Underground Railroad and the chairman of the Nazarene Fund.
01:19:14.720
And Jessica Mass; she is the director of aftercare for Operation Underground Railroad, and we want...
01:19:20.360
And we have some exciting stuff to talk about here that happened last week in a very ironic way.
01:19:26.300
But I want to start with the Turpin family in California.
01:19:36.020
I mean, people were living right on top of this house.
01:19:46.360
Thirteen children in this house chained and living in squalor.
01:19:54.300
You know, it's more common than I think people want to believe.
01:19:58.100
You know, the whole thing of human bondage and human captivity, people want to, in this
01:20:04.420
There are hundreds of thousands of people, children, in the United States that are in captivity in...
01:20:13.880
And it's just, it's an eye-opener for everybody to look around.
01:20:22.520
And it wasn't even a huge house, and 13 kids were chained to beds.
01:20:26.260
And I mean, it's just insane, but it's not something that's shocking to me.
01:20:30.660
Have you ever seen anything like that, Tim?
01:20:34.340
Because this is what you've done for your whole life.
01:20:39.120
We've seen things like that throughout our careers.
01:20:44.440
You know, it's usually tied to some kind of sex crime.
01:20:49.180
It's usually child pornography being made, you know, people coming over.
01:20:53.300
We're still looking to see what in the world it was. No matter what, it's going to
01:20:57.760
be a crazy intent, whatever the intent was. But what was it? What were they getting out of it?
01:21:03.300
So Jess, what you do is try to heal people like this.
01:21:11.180
They thought that they were young teenagers because they were so emaciated.
01:21:20.100
Well, my belief and based on my experience is that there's always hope.
01:21:25.280
I have seen people heal through things that they've said there's no hope for, and
01:21:32.780
So in my personal opinion, I believe that there's always hope and the journey is really hard
01:21:38.340
and it's long and painful, but I hope that there are people that will come around this
01:21:45.760
29-year-old and help them and really walk through that journey with them.
01:21:51.640
Boy, it's going to be a long road for all of those kids.
01:21:57.400
So let's talk a little bit about why you guys are here.
01:22:02.720
We've talked about it for a long time, but haven't been able to reveal it.
01:22:05.960
You did a mission in Haiti, as we know, a crap hole of a country. How long ago
01:22:14.560
was the first one? It was Super Bowl Sunday of last year.
01:22:20.200
And the problem is, the system can be corrupt.
01:22:25.040
And in a lot of places in Haiti, it is corrupt.
01:22:30.260
So in this operation, beautifully executed, everything was spot on.
01:22:33.600
And, you know, we were warned, don't work in Haiti; other NGOs don't work in Haiti.
01:22:39.300
But you know, Guesno Mardy, the father of the boy who was kidnapped, which in Haiti...
01:22:44.480
And I said, Guesno, I don't know if I can attach our name to this; we don't know what's going to happen.
01:22:48.740
There's going to be a corrupt judge potentially.
01:22:50.320
And he says, Tim, if you give up on this operation, you've given up on my son, because...
01:22:57.260
So we went forward, and Glenn, within 10 days, these traffickers were...
01:23:03.040
particularly this one, the kingpin; her name was Francois.
01:23:05.500
She holds kids in stables in the darkest parts of this country.
01:23:11.320
And she brought a bunch of these kids with her, young kids.
01:23:14.820
They're making child pornography, porn that was sent into the United States.
01:23:23.420
They paid their way out; over $80,000 they paid, and got out.
01:23:29.400
I mean, Jessica and I were talking in the minutes after we found out; we were in tears.
01:23:38.540
Well, we had a contingency plan, which was dangerous.
01:23:43.240
And basically the contingency plan was to go into the belly of Port-au-Prince and
01:23:51.100
tell the media what happened, what really happened.
01:23:54.020
And the Rotary Club of Port-au-Prince supported us, and they said, you need to bring
01:23:57.820
someone of some kind of celebrity status in Haiti, or the media is not going to pay attention.
01:24:05.100
Can you get ahold of this U.S. Congresswoman of Haitian descent named Mia Love?
01:24:14.740
Not only is she my Congresswoman who lives like down the street, she's a good friend.
01:24:21.400
I said, look, this is not necessarily a safe approach.
01:24:39.160
Uh, the media was there and this was the speech we gave.
01:24:44.460
In 1791, the Haitians did something that no country then or since has done.
01:24:49.740
And that is, it was a slave nation that rose up and pushed its European oppressors out of
01:24:56.780
the country, took the island by force, created a republic and abolished slavery.
01:25:02.740
The first nation to abolish slavery like this.
01:25:05.640
And what was interesting about this was the American abolitionists were watching this.
01:25:10.700
And when the United States finally eradicated or abolished slavery, at least legally, Frederick
01:25:16.600
Douglass, the great abolitionist, stood up and gave a speech.
01:25:19.160
And he said, let us not forget the sons and daughters of Haiti, the true pioneer abolitionists
01:25:27.880
And so this is the message that Mia and I gave, and people were just rising up.
01:25:33.540
I mean, they were like... and then we said, look, you let us out of slavery the first time,
01:25:39.860
And letting out nine horrible traffickers who are sexually abusing children...
01:25:45.460
You're going in the wrong direction right now.
01:25:52.840
We got an invitation from the president, and this was another miracle: the president of
01:25:57.540
Haiti, Jovenel Moise, was elected just days after the Super Bowl operation.
01:26:10.840
We're going into the presidential palace, Mia Love and I, and I just kind of sat back.
01:26:14.320
Mia speaks fluent Creole, you know, and I just watched her work her magic.
01:26:18.660
And she got right up to him and just said, look, here's the...
01:26:26.300
And I felt it when he said it; you know, you can look in someone's eyes and know if they mean it.
01:26:33.700
Within a couple of months, we knew the investigation was going.
01:26:38.680
We had full-time people down there supporting, watching, and then the news broke.
01:26:48.840
I mean, in Haiti, this is a big deal.
01:26:51.960
And the attorney general, his name was Aknam, he was instrumental in...
01:26:58.880
I mean, this guy, his team, a lot of them live in Miami.
01:27:04.380
They can't even live in Haiti for fear that their children will be killed or their
01:27:08.420
families, you know, because they are true corruption fighters, and they found...
01:27:18.020
You guys have more intelligence than we do on the whereabouts of these traffickers.
01:27:22.660
It was our team that went undercover and found them.
01:27:25.940
So I know, I was with you in Haiti a year ago, maybe, and I've never seen anything like it.
01:27:34.840
And the corruption from the United States, quite honestly, from some of the big charitable
01:27:40.580
organizations, you know, and the UN, that say, we're going to help them.
01:27:44.760
No, the money is not going to the people and where it needs to be.
01:27:52.280
And so when you told me what was happening with the president and that he was serious now,
01:27:57.020
and he was going to take these people out, it was great news.
01:28:01.720
One of the reasons why I took what the president said last week
01:28:05.660
so personally is because my wife and I had just gotten up off of our knees to pray for
01:28:10.600
you guys, because we knew you were doing this operation under the protection of the president
01:28:22.560
Were you already in the operation when you found out what happened?
01:28:28.040
It was, it was the most awkward moment of any operation I've ever been on.
01:28:31.420
And I've been on a lot over my 16-year career. We were sitting in a...
01:28:40.560
The attorney general of the country is sitting across the table from me.
01:28:43.860
The chief of police is sitting next to him, and then my operators, and we're
01:28:48.860
sitting around, and we're talking about this operation.
01:28:53.880
We literally have a recon team out looking for this woman, Francois, and we'll tell
01:28:57.580
you about that in a second, what happened, how we found her.
01:28:59.540
Once we took her down, she's the kingpin, right?
01:29:02.740
And our recon team's looking for her and we're just on pins and needles.
01:29:07.440
As soon as, you know, our team says, there she is.
01:29:10.180
And, and there's television screens all around.
01:29:15.460
And boom, I see this and I see it above their heads and I'm just reading this.
01:29:25.280
And again, I wasn't in the room and I've heard different people say different things.
01:29:31.120
In this moment, this is the message coming from America to Haiti.
01:29:35.120
And I'm just, you know... and they turned around and they looked
01:29:39.900
at it and they looked at me, and I didn't know what to do. I just was like, so, how are...
01:29:52.920
How I smoothed it over was I just told them: my wife and I, as you know, we're adopting
01:29:58.340
two children from Haiti that we actually rescued in an earlier operation.
01:30:08.540
And, you know, I just said, you know how much we love your country.
01:30:15.640
I mean, my children are from your country. My children.
01:30:28.260
We're going down in a couple of weeks to...
01:30:31.320
He called in to the attorney general just an hour before, wishing us well and
01:30:34.880
saying his prayers were with us.
01:30:37.900
I mean, that's also unprecedented, by the way, to have the president of a nation calling in.
01:30:47.060
We're going to come back in a minute, and he's going to tell you exactly what happened.
01:30:51.540
And then tonight at five o'clock, we have video footage of this operation and what it
01:30:57.180
was like on the ground and a lot of stuff that you're going to want to see.
01:31:00.140
You can check that out tonight, five o'clock, on TheBlaze TV. Back in a second with Tim
01:31:06.640
Ballard, founder and CEO of Operation Underground Railroad and the chairman of the Nazarene Fund.
01:31:17.060
So if you're setting new goals for your business, it is really difficult to reach them without the right people.
01:31:24.840
They have transformed the way that you can find really good people.
01:31:28.680
ZipRecruiter will not only post your job to over a hundred of the leading
01:31:33.000
job boards with one single click.
01:31:36.620
It's a smart program, and it goes out and it looks for the most qualified candidates
01:31:46.180
And that is why 80% of all of the people that use ZipRecruiter get a qualified candidate in just one day.
01:31:56.060
Find out today why ZipRecruiter has been used by businesses of all sizes and industries
01:32:00.420
to find the most qualified job candidate with an immediate result right now.
01:32:29.100
Operation Underground Railroad, ourrescue.org.
01:32:34.560
And the chairman of the Nazarene Fund, thenazarenefund.org, is Tim Ballard.
01:32:39.020
He is here to tell us about the operation that happened in Haiti just last week.
01:32:44.560
The day the president was saying what he said about Haiti.
01:32:48.720
It was a really important moment because the president of Haiti was working with
01:32:55.060
Operation Underground Railroad to get some really bad guys.
01:33:02.480
So you're in that meeting, you get the go.
01:33:08.440
I mean, Francois, if she's the kingpin, they want her.
01:33:11.540
Not only because she has the most kids that she's selling, but because she had paid
01:33:16.120
$80,000 to certain government officials, and they didn't know who they were.
01:33:22.260
The president, the attorney general, they wanted her so they could find out: who did you give that money to?
01:33:30.240
Do you think she's going to tell?
01:33:34.500
Well, let me tell you what she said when we got her.
01:33:42.940
I mean, there's a lot of things on this op I had never
01:33:46.160
seen before, where she's just like, I'm hand in hand with darkness.
01:33:58.760
But this woman, she owns a street, essentially.
01:34:02.300
I mean, she sits on one side of the street, all dressed beautifully, you know, with...
01:34:08.080
And these johns come, and she has all these kids in these stables across the street.
01:34:13.360
I mean, there are these steel doors that close and lock; they have
01:34:17.280
metal beds that fold down, literally, where the kids sit, and then a bedroom in the back.
01:34:21.120
And the johns go in and do their thing.
01:34:23.940
We had been doing surveillance on Francois for about three weeks in advance.
01:34:32.200
It was adults that she was running.
01:34:34.740
The day we rolled in, the minute, the second we roll in, we...
01:34:42.840
Well, something happened. There's a videographer we have named Justice.
01:34:46.820
I think you've met Justice, an inspired dude. At the last minute, we're driving to the street,
01:34:52.900
He jumps to the back seat, and in the van there are three SWAT guys, Haitian SWAT team members
01:34:58.060
who are going to come out of the back of the van.
01:35:01.380
He jumps back there and starts wrapping duct tape around this guy's helmet and...
01:35:45.380
We all talk about, oh, if I would have lived back in the day, I would have been an abolitionist.
01:35:53.160
Because there are more people being held as slaves today than in the entire 400 years of the transatlantic slave trade.
01:36:07.620
I invite you to be an abolitionist by going to ourrescue.org.
01:36:16.220
Once a month, just give five bucks and become an abolitionist.
01:36:22.340
Ourrescue.org and the nazarenefund.org, which is also rescuing the slaves in the Middle East.
01:36:31.360
Tim Ballard is the founder and CEO of Operation Underground Railroad and the chairman of the
01:36:35.820
Nazarene Fund, just returning from Haiti, where there is a big shakeup happening, trying to
01:36:42.500
make their government work and get away from evil.
01:36:45.560
You just rescued a whole bunch of children and caught probably the kingpin.
01:36:54.300
The kingpin that we had arrested back in February of last year, and she bought her way out.
01:36:59.580
We got the right people: the new president of Haiti.
01:37:01.800
And here we are going in to get her again two nights ago.
01:37:04.220
So your photographer is in the back of one of your vehicles.
01:37:08.040
My videographer jumps in the back, and I'm like, Justice, what are you doing?
01:37:13.180
And he's like, I got to get a camera on this guy's helmet.
01:37:16.820
I'm like, okay, and he's wrapping duct tape around it, this kind of awkward scene.
01:37:24.180
There are like three vans that jump out and get around the target and take her down.
01:37:28.180
She's in the middle of selling one of the girls while we do it, with a guy who has...
01:37:36.680
But these two guys, one with the camera on his helmet, get out of the van, and
01:37:41.340
he's supposed to come with us, but I see him take off the other way.
01:37:45.220
And what had happened was one of the girls, one of the victims got scared, didn't know
01:37:49.120
what was going on and ran into what I can call a stable.
01:37:57.980
He gets in there and Glenn, in my 16 years, I've never seen this.
01:38:01.000
And I've seen the camera footage of what he captured.
01:38:07.480
And I would ask people, this is a tough thing to ask people, but who's 13 in your life?
01:38:17.360
He goes in there and catches in the act a man who's raping her in the back bedroom and
01:38:32.180
And we caught it all on camera, which will allow us to prosecute the heck out of this guy.
01:38:40.100
I mean, we've raided so many situations like this.
01:38:43.600
And we had not seen kids, by the way, in the four weeks leading up to it, you know, tracking her.
01:38:54.820
Last minute, puts a camera on this guy's helmet for no apparent reason.
01:39:01.300
So you get the madam or the kingpin here of the Haitian slave trade.
01:39:12.040
According to the Attorney General of Haiti, she is the number one child trafficker in the country.
01:39:18.140
So you get her and she says to you, evil will protect me.
01:39:26.520
It was translated to me, but that's as close as I can get; that's the translation that I got.
01:39:30.260
I don't think people understand, you know, knowing the history of Haiti: at
01:39:39.880
the same time we made a pact with God, the same year, 1791, they made a pact with Satan,
01:39:51.140
And I think that's one of the reasons why that country has so many problems.
01:39:59.280
I think a lot of people might dismiss it now, but I think it's, it's real there.
01:40:05.960
And people like her say, evil will protect me.
01:40:13.900
So how do you, how do you heal a country like that, Tim?
01:40:18.060
You know, what you do is you find those of light, and they're there.
01:40:24.560
And it was that very history you're talking about where we brought that to light and
01:40:29.860
said, look, even in that darkness, because that is part of their history.
01:40:33.240
Anyone will tell you, like, yeah, there's elements that did that,
01:40:36.340
but there was light that came with that too, that from the beginning fought...
01:40:40.240
And it's about finding those, and people like Congresswoman Mia Love, who came
01:40:44.560
down, and Attorney General Sean Reyes also accompanied us and helped us to bring
01:40:49.120
the light to the country through people like President Moise and the Attorney General
01:40:54.860
Alknom, and then all the aftercare partners we have.
01:40:59.560
I mean, you've been to our safe house.
01:41:02.840
Jessica is, you know, the director of the aftercare, and you're one of the most tender...
01:41:09.880
And I just love watching you, because you are tender and yet you are a pit bull
01:41:16.020
in those situations, because you know where the danger is.
01:41:29.960
Well, and four... and then three of her minions went down.
01:41:35.020
And they are going to... the president is all over this one.
01:41:40.020
So, the situation with the children, you've been there so many times.
01:41:45.300
Tell me about what happened on the anniversary of the earthquake and how that tied in.
01:41:57.280
So with the four girls, two of them, their parents were killed during the earthquake.
01:42:03.800
And while we were there, when they were rescued, was the anniversary of the earthquake
01:42:12.500
So they've been in that situation for eight years.
01:42:15.240
They were kidnapped because their parents were killed in the earthquake.
01:42:20.340
It happened to thousands, tens of thousands of kids.
01:42:23.200
People don't understand that the earthquake, we think, oh, well, the Clinton Foundation
01:42:28.720
Well, what people didn't look at is all of the parents that were dead and the kids who were left orphaned.
01:42:36.960
The devastation that comes when someone becomes an orphan overnight, and the vulnerabilities that come with that.
01:42:45.580
And yes, these kids have been in this situation for eight years.
01:42:50.760
And so when you're sitting with a kid that has been through that, their parents killed, they've
01:42:59.420
And she's looking in my eyes and she says, this is the first time I've ever felt...
01:43:11.200
And she's like, I finally feel like there might be hope that I'll have something for my life.
01:43:17.200
Eight years later, there's the pain and the beauty that go hand in hand in the story because
01:43:26.420
But even this girl feels the hope in that moment.
01:43:33.300
We started the hour talking about what happened in California, what they found out from those
01:43:37.380
monsters of parents who chained their kids to the beds.
01:43:43.660
And I know, because Jessica and I have talked privately about things that she has seen here in America.
01:43:54.280
This is not a Haitian problem or a Middle Eastern problem or an American problem.
01:44:00.820
And there is something inside of man that when it goes dark, it goes really dark.
01:44:07.360
And that's what the Nazarene Fund and Operation Underground Railroad,
01:44:14.780
O.U.R., are really all about: rescuing, and going into the darkest of dark places.
01:44:23.540
I don't know how you are as full of light as you are after all of the things that you have seen.
01:44:28.860
But you're a miracle, and we thank you. Tonight at five o'clock, we're going to air that footage.
01:44:36.080
So you'll see some of the things that we're talking about.
01:44:39.100
And if we have time, I don't know if we'll have time tonight, but we were
01:44:44.960
talking off the air about what's happening in Sweden right now, where Iran is going after
01:44:52.820
One of their own, somebody who is a refugee who left Iran because she saw a woman being
01:45:07.780
Then she announced she felt comfortable enough to say, I'm not a Muslim.
01:45:14.140
And Swedes turned on her and have revoked her visa and are threatening to send her back.
01:45:22.220
I mean, I just, it is like evil is protecting its own right now.
01:45:31.260
So if you can help us, please become an abolitionist.
01:45:35.520
And you can do that by going to ourrescue.org. More tonight at five o'clock.
01:45:45.040
Researchers found two serious security flaws in chips used in every PC, server, smartphone,
01:46:14.840
Hackers can potentially make use of these flaws to steal data stored in memory, including passwords.
01:46:21.420
This is the biggest flaw that we have found, the biggest backdoor, because it is in almost everything.
01:46:30.080
And there's nothing you can do about it right now.
01:46:33.980
Operating system providers have released security patches.
01:46:43.420
One in four people have already experienced identity theft.
01:46:46.500
And I tell you, that number is going to go up, and it will happen to you.
01:46:52.520
Thieves can sell your information on the dark web or get an online payday loan in your name.
01:46:56.760
And LifeLock works to detect those wide-ranging identity threats.
01:47:02.660
If you have a problem, a U.S.-based restoration specialist is going to work to fix it.
01:47:07.480
Nobody can prevent all identity theft or monitor all transactions at all businesses.
01:47:13.220
Join now and get 10% off with promo code BECK. Call 1-800-LIFELOCK or go to LifeLock.com.
01:47:45.820
You know what I love about living in this time period, especially if you saw The Post?
01:47:56.900
But you'll see in that, when they release the Pentagon Papers and The New York
01:48:00.620
Times is shut down and they say you can't release any more,
01:48:06.260
if The Washington Post doesn't release them, there's no place to go.
01:48:12.480
It seems so odd that that news couldn't come out, but that's the way it was.
01:48:24.600
We may not have realized it then, but the internet changed everything.
01:48:28.480
Back in the eighties, I remember trickle-down economics, and it was always lampooned, and you
01:48:34.400
could make a case, but you could never make a media case, because you...
01:48:41.860
Going around the internet now, from the Washington Free Beacon:
01:48:45.780
Here's what people said about trickle-down economics and the president's tax plan.
01:48:54.720
It feels like you're relying on this tax cut of the corporations, of the wealthy, to trickle
01:49:00.100
down. Southwest and American Airlines both announcing they're going to give thousand-dollar bonuses.
01:49:06.320
Wage increases don't follow tax cuts like this.
01:49:08.840
So the world's largest retailer giving its U.S. employees a bonus, a wage increase, and expanded benefits.
01:49:16.920
So you're creating a huge tax cut and you might not get wage growth.
01:49:21.840
Capital One Financial, which just confirmed to CNBC that they will raise the minimum wage
01:49:27.180
for all U.S.-based employees at Capital One to $15 per hour.
01:49:31.500
And anybody who thinks that this corporate tax cut is going to trickle down to lift wages
01:49:37.880
has a staggering ignorance of how public companies function.
01:49:41.780
Wells Fargo said it would raise its minimum wage to $15 per hour.
01:49:51.140
They could be sure of themselves because there wasn't anybody that would have given that information
01:49:58.380
If CBS, NBC, or ABC didn't make those stories about what the companies were doing...
01:50:10.620
Now you have enough outlets, and you have control of the media yourself, to where you can grab those clips yourself.
01:50:18.640
You can edit those things and you can show, no, this is exactly what happens.
01:50:23.460
They had no fear; the liberals, with trickle-down economics, had no fear of this turning...
01:50:35.520
It's interesting too, to see these companies take these stands.
01:50:39.440
Normally companies, even companies that lean right,
01:50:43.100
don't want to take stands that associate themselves with Republicans publicly.
01:50:47.500
But this is such a clear win for companies, and, you know, companies really do this.
01:50:53.940
I mean, I think people want their employees happy.
01:50:57.820
They want their employees happy.
01:51:00.880
They like the PR of saying, Hey, we got a bunch of extra money and you know, we're going
01:51:05.140
to distribute that to the people who work for us.
01:51:08.620
There are some selfish reasons for it, but who cares?
01:51:12.500
It's great that people, you know, are able to make plans, long-term plans.
01:51:16.900
Now these are all permanent changes, I mean, as permanent as they get,
01:51:22.360
you know, with lawmaking, but permanent changes for corporations, and they are able
01:51:28.680
to really plan for the long-term well-being of their companies and their employees.
01:51:35.980
I mean, it's not the boldest tax plan we've ever seen.
01:51:39.700
It's not. Imagine what would have happened had they done a flat tax rate.
01:51:47.000
If they had done a flat tax rate, can you imagine the money that would have poured in?
01:51:56.200
I mean, because this was really more of a corporate plan, right?
01:51:59.540
I mean, it was not particularly life-changing when it comes to the individual side.
01:52:06.080
I'll take any dollar amount you want to give me of my own money.
01:52:09.360
I'll willingly accept it and act like you're doing me a favor.
01:52:12.600
You know, for the corporation side, it actually is a really big difference.
01:52:15.960
They no longer have to, I mean, they don't have to make these big changes.
01:52:19.200
There were so many people who said it wasn't going to be a big deal because they're
01:52:22.760
Anyway, it has shown to be a big change for these companies.