On today's show, we have a special guest, Ross Anderson from Cambridge University, who is an expert in security. We talk about the border, COVID, and how to rein in these little dictators all around the state. Also, we did an hour of empowerment, where we were opening up the phones and asking people if they came from a really bad situation as a child and had no reason to believe they could make it, but they did.
00:00:08.720We talked a little bit about technology and what is coming our way and the plan for iPhones here in the U.S.
00:00:16.060that Apple is going to scan everything for child abuse imagery.
00:00:22.760Ross, who, again, his whole thing is technical security.
00:00:26.640He says this is a nightmare that is about to happen.
00:00:32.180Also, on today's podcast, we had Ken Paxton, our attorney general here from the great state of Texas.
00:00:40.740We were talking about the border, what the governor is doing, also COVID, and how to rein in these little teeny dictators all around the state.
00:00:50.520Also, we did an hour today of, I think, empowerment.
00:00:57.200It was opening up the phones and asking people if you came from a really bad situation as a child, and you had no reason to believe you could make it, but you did.
00:01:52.820Texas is under fire today because of the hospital situation here.
00:01:58.760It is not an emergency, but we have hospitals that are firing people if they don't want to take the vaccine, and we could use a few extra nurses, I understand.
00:02:08.860And so the run-of-the-mill request from the governor is now being distorted into a claim that there's a massive problem with the hospital system here in Texas, which is not true.
00:02:23.060Well, you're right about that, and the problems that we're having as they relate to COVID come, oddly enough, as you watch the border being pushed by thousands and thousands of people, potentially with COVID.
00:02:47.280Texas is overrun at the border, and, I mean, small little teeny towns are having to put COVID tents up.
00:02:56.280These people are just dumped into their city by the federal government, and they're just putting up little tents because they don't know what else to do.
00:03:07.080You know, what's amazing is that they are bussing people around the country and just dropping them off.
00:03:11.940I know for a fact, I was talking to some police officers in Dallas, and they said that these buses showed up in Dallas and just dumped a couple hundred people out of buses, and they just wandered around the streets, and that was it.
00:03:28.580First of all, I had a county commissioner on last week, and he was talking about a commissioner's court and a judge in a commissioner's court.
00:03:49.540This is like a city council for a county.
00:03:53.740There are a few counties in Texas that don't have enough courts, so the county judge acts as an actual judge, but for the most part in these large counties, a county commissioner's court or a county judge is not really a judicial position.
00:04:06.640It's more of a management or a city council type position.
00:04:09.640Okay, and this quote-unquote judge made a ruling against a commissioner and forced him to leave, even though he was vaccinated, because he wasn't wearing a mask.
00:04:21.200That is exactly what happened, and, of course, we have an executive order from the governor, who's acting under his power under the Disaster Relief Act, where he said, hey, you can't require a mask, and what that commissioner did is completely in violation of state law.
00:04:44.220So in the next, I think, hour or so, we're going to file to intervene in his case because he's already got litigation going, and we're going to try to stop what we would consider illegal conduct by this county judge.
00:04:56.080And I doubt he's going to take it well.
00:04:59.920He has been on Twitter just saying, you know, you've got to mask up.
00:06:06.360This is deja vu all over again for us.
00:06:08.360So we're very confident we're going to win.
00:06:11.080I think he's doing it for media coverage.
00:06:12.560But you just said, you know, if the law matters.
00:06:15.500I have to tell you, and I can't speak for Texas because Texas is, you know, there's a few states that are bucking this system.
00:06:23.900But it doesn't seem like the law matters anymore.
00:06:26.900To many Americans, the law doesn't matter anymore.
00:06:31.340Well, I think Obama set this up, you know, when he was president.
00:06:35.620He ignored federal law, didn't work through Congress, made up his own executive orders, had agencies make up the law, and just thumbed his nose at laws.
00:06:45.140And we had to sue, and we were very successful doing that.
00:06:47.400But I think he set sort of the mindset for a lot of Democrats, particularly Democratic elected officials.
00:06:52.800I don't have to follow the law either.
00:06:54.180If President Obama doesn't have to do it, I might adopt the same approach.
00:06:57.340So that's kind of the approach that you're seeing by mayors, by county judges, by elected officials all over the country who just say, why should I have to follow the law?
00:07:54.640But we've got thousands of fights going on, and these occur almost every day in my office where we have to pick among a number of choices of which battle can we go fight.
00:08:05.240And how are we fighting at the border, Ken?
00:08:09.660So that's another long battle for us because we've got six lawsuits as it relates to the border.
00:08:15.220You know, the governor issued his executive order where he said, hey, if there are people being transported who crossed the border illegally, they need to be sent back to the border, which is, in my mind, you know, perfectly normal.
00:08:25.720And you'd expect a governor to try to protect his state from the spread of COVID from the border and the crime that's associated with that.
00:08:32.500And yet we were sued by the federal government for trying to protect our state in a way that the federal government refused to.
00:08:52.560So there's this idea under this U.S. Supreme Court case, Arizona v. United States, that because the federal government has statutes providing them authority over immigration,
00:09:03.280the states still can't protect themselves when the federal government doesn't.
00:09:06.740I just don't think that's a correct understanding of what the law is.
00:09:10.280How can it possibly be that the state has to sit by while the federal government ignores the law and allows great harm to the citizens of that state?
00:09:18.160I can't believe that the governor has to sit by and let that happen.
00:09:21.780This is the old adage that the Constitution is not a suicide pact.
00:12:32.500And so I just worked dead-end job after dead-end job until finally, you know,
00:12:36.680I landed in a service industry that I was very successful in and my skills were great.
00:12:42.380And after working for many people, with a lot of prayer and counsel from church leaders,
00:12:49.400I started my own business, probably similar to the way that you did a lot of things in your life.
00:12:56.420I didn't have a lot of money or support.
00:12:58.060It was whatever I had at the time, I think I might have had eleven hundred dollars to start a company with.
00:13:04.260And fast forward five years later, the Lord has blessed me to the point where I know the Bible says,
00:13:09.980if you bring the tithe into the storehouse of the Lord, he will pour out blessings upon you more than you can ever contain.
00:13:14.220And sometimes I'm praying, Lord, I appreciate the blessings.
00:13:17.340But can I just have just a little bit less right now?
00:13:19.520Because this is a lot, and God is just so good.
00:13:23.940And, you know, I have more hope and joy than I have ever had in my life.
00:13:30.040And success to me is not the business.
00:13:32.440Success to me is not the possessions that I have.
00:13:35.600The success that I feel in my heart is my relationship with Jesus Christ and my love for people.
00:13:43.500Just the stories that I can tell them, the testimony that I have.
00:13:47.880And if you were to even look at the business profile that I have on Google or talk to the customers that I have, they'll tell me day after day.
00:13:54.720They say, we love doing business with you because you care about me, not the money.
00:13:59.360You come into my place of business or my home to help me with my service issue.
00:14:54.100Actually, they're not, I say they're always going to be my children, but they're 18 and 19 now.
00:14:58.200So they're older and very successful themselves.
00:15:01.340One of my crowning achievements is that day on August 10th when I put the bottle down for the last time.
00:15:07.380And my youngest daughter was actually not old enough to ever have recollection of me ever drinking.
00:15:12.380And still to this day, now 13, 14 years old, she still has no clue that I ever drank alcohol, and never will.
00:15:20.280She will never, ever, ever know that I ever did.
00:15:22.680She only knows through the stories I tell her, as cautionary tales of things to look out for, from the evils of the world and the vices that can grab ahold of you so tightly and easily.
00:15:30.580Wow, David, I am so glad that you called in.
00:17:46.940We were a nation of laws only because we agreed that those laws were pretty much eternal.
00:17:55.560That the things that we had in place in our Constitution and the directives of the Declaration of Independence didn't come from man.
00:18:31.100But I know that people who are kind of indifferent, when they're in the foxhole, whenever they're in trouble, their kids get sick, they suddenly find God.
00:21:15.240Apple plans to scan U.S. iPhones for child abuse imagery.
00:21:19.320And on the surface, if you're not paying attention to what's going on in the world, you think, oh, well, that's good.
00:21:24.700I got an email from a guy who used to be very, very high up at Yahoo about 20 years ago.
00:21:33.560And he said, we basically set up a direct line with the FBI because we frequently had websites submitted to us to be crawled or indexed that we believed may contain child pornography or other illegal things.
00:21:46.720At first, we were just emailing links over to the FBI task force.
00:21:49.680But the process for them to check on it and get back to us was just so arduous that we just eventually set the FBI up with an E3 terminal in their L.A. area offices.
00:22:02.040That was basically the same terminal that our editorial team members used. They were the keepers of the Internet at the time; they determined which websites and search results showed up for people and which ones we would block, trying to keep harmful or illegal content out of Yahoo's search results.
00:22:17.920So by giving the FBI an E3 terminal, anyone from the editorial team or the sales group could submit a website directly to them for review.
00:22:25.700It would pop up in their queue, just like a help desk ticket and someone at the FBI could review it and let us know if the content was legal.
00:22:33.180But they could also just flag something to be blocked and it would be blocked.
00:22:50.220This is a long-winded way of saying, yeah, I'm sure that a cozy relationship is developing between Apple and the other social, search, and tech giants and the government.
00:23:03.120We now know that this was a mistake to open up the door in the first place.
00:23:22.700Thank you so much for being on the program with me.
00:23:25.520I want to understand why this is so dangerous.
00:23:30.520Apple says it has all kinds of safety features and they're only scanning for faces of those that are, you know, in sex rings or have been sex trafficked, and maybe those children that are missing.
00:23:48.300Well, you can see how this is going to develop, but first of all, Apple will be scanning all the photos in everybody's camera roll everywhere in the USA and later everywhere in the world against a database of 200,000 abuse images that have been supplied by the National Center for Missing and Exploited Children.
00:24:11.540Now, given the way that their neural network is organized, it looks like it's going to scan for faces, as you say.
00:24:19.520And so you can imagine the kind of things that will happen, that there'll be some abuse image that's 10 or 20 years old.
00:24:27.400And so the reviewers at Apple will see a photograph which isn't of child sex abuse, but of a grown up with their clothes on.
00:24:35.780And the system will recognize this person.
00:24:38.400And the person at Apple then has to decide whether this is a survivor or a perpetrator and what to do with it.
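To make the mechanism being described a little more concrete, here is a minimal sketch in Python of perceptual-hash matching against a database of known images. Everything in it is a hypothetical stand-in: the hash function, threshold, and database are illustrative only, not Apple's actual NeuralHash system, which also wraps the comparison in cryptographic private set intersection.

```python
# Illustrative sketch only: a simple "difference hash" (dHash) matcher,
# standing in for the far more elaborate on-device system under discussion.
from PIL import Image

HAMMING_THRESHOLD = 10  # hypothetical tolerance for near-duplicate images

def dhash(path: str, size: int = 8) -> int:
    """Difference hash: encodes brightness gradients of a tiny grayscale image."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())  # row-major pixel values, width = size + 1
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits  # a 64-bit fingerprint when size == 8

def matches_database(path: str, known_hashes: set[int]) -> bool:
    """True if the photo is within HAMMING_THRESHOLD bits of any database entry."""
    h = dhash(path)
    return any(bin(h ^ k).count("1") <= HAMMING_THRESHOLD for k in known_hashes)
```

The governance question in the conversation maps directly onto this sketch: whoever supplies `known_hashes` decides what the phone flags, and the phone's owner has no way to inspect the set.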
00:24:46.240There's a small problem there that U.S. law says that if you do get suspected child abuse material, you should report it at once to NCMEC rather than reporting it to Apple.
00:24:56.300Right. So there's, first, the legal problem there.
00:24:59.040And then secondly, there's the problem of what happens when NCMEC extends that to missing children.
00:25:06.040Right. Because some of the children who go missing, you know, go missing for perfectly good reasons.
00:25:12.080For example, they might be getting abused at home.
00:25:14.280They might get beaten or even sexually abused.
00:25:16.320And so if you recognize runaways, there's all sorts of processes around that you have to think about.
00:25:24.100Apple doesn't seem to have thought about this.
00:25:26.200You know, they don't seem very keen to provide help desks and help lines and places where people can report stuff.
00:25:33.600And then the next problem is that once you've got a mechanism sitting in your iPhone that can scan your camera roll for faces, it's open to any government in the world to come along with a warrant and say, hey, Mr. Apple, we've got a file of 20,000 faces that we'd like you to scan for in our country.
00:25:52.600And you can guess what those faces might be. In China,
00:25:55.980it might be the faces of the Dalai Lama and the Pope. In Europe,
00:26:01.220if there's been some rowdy demonstration in Paris, for example, the police might feed in the faces of demonstrators.
00:26:08.120You know, your guess is as good as mine.
00:26:10.060But Apple is building a very, very dangerous mechanism into its iPhones.
00:26:14.400And there needs to be proper scrutiny and accountability of this.
00:26:18.120So is this actually based on, or is it just very similar to, something that happened in 2008 in China that kind of opened Pandora's box?
00:26:28.980And this is a worrying thing, you know, because if our civilization is going to be in Cold War 2.0 with China for the next 20 years or 30 years, we should watch what the Chinese do and understand it rather than copying them.
00:26:42.680And in China, what happened in 2008 is that they mandated everybody in the country to put software called Green Dam on their PCs.
00:26:52.080And Green Dam was sold to the population as being a porn filter.
00:26:57.200And it did that to some extent, but very badly.
00:27:00.060However, the real purpose of Green Dam was to look for words like Falun Gong and Dalai Lama and so on that were of interest to the Ministry of State Security.
00:27:08.980What Green Dam also did is it made your computer vulnerable because the government weren't very good at writing software and the software that they produced meant that everybody who used the Green Dam software was vulnerable to having their PC taken over by websites that they visited.
00:27:25.100Now, that's been fixed by now, but still, it's the case that in China and in Russia, we have this ecosystem of government scanning what's on people's PCs.
00:27:35.080And no doubt the Chinese will see to it that they get to scan stuff on people's phones as well.
00:27:41.220So, you know, we were talking about this the other day, and I said to my staff, can anybody name anything that, you know, is a bigger threat to your freedom and security than social media and technology?
00:27:59.440I mean, it's not necessarily the threat today, but we know what it can do.
00:28:06.800And yet, again, do you know anybody who has given it up and said, I'm just not going to be a part of this?
00:28:39.400You know, I used Android for many years and switched to Apple a couple of years ago when I was updating my security engineering book and I noted how much more secure Apple was.
00:28:49.580But the problem with Apple iPhones being tamper resistant is that I can't easily drill into them and find out what they're doing.
00:28:56.920I can't see the database of hashes of abuse images in my phone and check that it doesn't contain hashes of dissidents instead.
00:29:06.540With a less secure phone like an Android, you could perhaps do that and you could hold people to account.
00:29:12.020So here is a case where security is being used against us to undermine our privacy.
00:29:17.240And the tamper resistance of the iPhone means that the government can have an eye spy in your iPhone over which you've got no control whatsoever.
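The accountability gap Ross describes is almost a one-liner. Here is a hypothetical audit one could run on an open platform, assuming you could extract the hash set a device actually shipped with, which the iPhone's tamper resistance prevents. All names and values are made up for illustration.

```python
def audit_shipped_hashes(shipped: set[int], published: set[int]) -> set[int]:
    """Return hashes present on the device but absent from the publicly vetted list."""
    return shipped - published

# On an open platform you could dump `shipped` from the device image and
# diff it against an independently published reference list; any leftover
# entries would be unexplained additions (dissidents, demonstrators, ...).
unexplained = audit_shipped_hashes(shipped={0xA1B2, 0xC3D4}, published={0xA1B2})
print([hex(h) for h in unexplained])  # ['0xc3d4']
```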
00:29:28.380Ross, how far away are we from an easy police state with the wrong person taking charge?
00:29:40.320Well, that's exactly the problem here.
00:29:43.420You mustn't give the police too much power.
00:29:45.760You may think that it's nice to give the police power when your lot are in charge, but it never works because you end up with the other lot in charge and, you know, then you've had it.
00:29:55.220So to stay free, we have to see to it that the government can only do so much that it can't undermine our basic freedoms.
00:30:03.200And you're lucky in the United States, having your Constitution. Here in Europe,
00:30:08.180we have the European Convention on Human Rights.
00:30:10.660And once you get to those parts of the world where we don't have guarantees for basic freedoms, well, good luck.
00:30:16.300Well, I don't know if you've been paying attention much to America lately, Ross, but we're not following the Constitution.
00:30:22.180I mean, our Constitution says, you know, you can't quarter soldiers in a house and go through somebody's papers.
00:30:30.700Well, I think you already have that. If you're online and the government wants some information, they're just going to go to one of these tech companies, and they'll go through all of your papers.
00:30:43.020I mean, they'll watch you, or they'll scan your photos, and there's no such thing as privacy anymore.
00:30:49.720Well, indeed, and I'm not an expert in U.S. law, of course, being a Brit, but I hear from American friends that the argument which Apple and the FBI are going to use goes to a case around drug sniffer dogs, where somebody's in a traffic stop.
00:31:08.560A drug sniffer dog was brought around, and they found some weed in his boot, and he got convicted, and he said that was unfair.
00:31:16.520And the court said that if you've got a search that finds only contraband, that's OK.
00:31:22.980Now, it depends on what the government defines as contraband.
00:31:26.460But if you've got a government search engine that can look at all your most intimate stuff, you know, your photos, your emails, your texts, and it can use artificial intelligence to find out something that the government of the day considers to be contraband.
00:31:39.220And then that means it makes a mockery of the idea that you've got to get warrants, because suddenly you're turning the universe around so that the government, to do surveillance, doesn't have to get a warrant against a suspected person, but against a suspected idea or a suspected image or a suspected form of speech.
00:32:00.600That's what's changing here.
00:32:21.580So it comes down to, well, you know, my next phone isn't going to be an iPhone.
00:32:24.840In the meantime, Apple is saying that they will only scan your photos if you back them up in iCloud. So go buy yourself a disk drive, attach it to your laptop, back your phone up on your laptop, and back the laptop up on the disk drive.
00:32:51.120Well, and it's also too much bother for most people.
00:32:53.880You see, what Apple and the FBI will be relying on here is the fact that Apple nagged you really, really hard to get an iCloud account and to put more money in it and to back your phone up to iCloud and your MacBook, too, rather than using a disk drive.
00:33:08.480You know, for some years, my wife refused to get an iCloud account.
00:33:11.340And every time she connected her iPhone to the MacBook, it just complained and said, put in your iCloud password.
00:33:16.920And it's this kind of commercial nagging that is now going to be exploited by law enforcement to drive a coach and horses through security.
00:33:25.300But there is another thing here, which is that your photos in iCloud aren't properly encrypted when they're backed up anyway.
00:33:33.900So Apple could, if it wished, run child porn detection software over the photos, just like, for example, Facebook does over photos in Facebook Messenger and Google does over photos in Gmail.
00:33:47.040And it could then report people who already have illegal images in iCloud to NCMEC.
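For contrast, here is a sketch of that server-side alternative: because iCloud photo backups are not end-to-end encrypted, the provider could scan what it already stores, with no scanning code on the handset at all. Plain SHA-256 stands in for the robust perceptual hashes (PhotoDNA-style) that such systems reportedly use, and the digest list and paths are hypothetical.

```python
# Illustrative server-side scan over stored (non-end-to-end-encrypted) files.
import hashlib
from pathlib import Path

KNOWN_BAD_DIGESTS = {"0" * 64}  # hypothetical placeholder for a real digest list

def scan_stored_photos(storage_dir: str) -> list[str]:
    """Return stored files whose SHA-256 digest appears on the known-bad list."""
    hits = []
    for f in sorted(Path(storage_dir).rglob("*")):
        if f.is_file():
            if hashlib.sha256(f.read_bytes()).hexdigest() in KNOWN_BAD_DIGESTS:
                hits.append(str(f))
    return hits
```

The design difference is the heart of the debate: server-side scanning searches what a company already holds, while Apple's client-side design searches the user's own device, which is what makes the warrant question above so pointed.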
00:34:46.280Thank you for everything that you're doing and speaking the truth and letting people know what is possible with technology and what is coming our way.
00:34:56.040Ross Anderson, professor of security engineering at Cambridge University, on the Apple plan to now use iPhones to scan for child abuse imagery, and possibly much more than that.