A Chicago judge recently let a woman who was caught laughing the entire time walk away with just 200 hours of community service. On Thursday, a former police officer was acquitted of murder and manslaughter for an on-duty shooting. The same week, a jury found a man not guilty of murder and manslaughter after he killed a woman on a pier.
00:32:26.940 Well, the first problem is that Americans don't even know what the Bill of Rights is.
00:32:32.260 They don't know the three branches of government.
00:32:35.860 We have more people in this country who believe that UFOs exist than believe Social Security will exist by the time they retire.
00:32:42.560 We have more people in this country who can name the hometown of the Simpsons than where Abraham Lincoln was born.
00:32:51.500 More people can name Kardashians than can name members of the Supreme Court.
00:32:56.400 All of that scared the living hell out of me, because we know our pop culture absolutely to the last detail, and we know nothing about our Founding Fathers.
00:33:23.580 He is truly empathetic and can hear beyond the words.
00:33:31.040 Quite honestly, I think he is a solution to many of the things that ail us.
00:33:37.200 If more people will speak honestly and more people, like Frank, will listen... Please go to focuswithfrank.com and sign up to be part of his focus groups: focuswithfrank.com.
00:33:49.520 Frank Luntz, always a pleasure and a privilege to have you on the program.
00:34:49.440 They make it easy because they have FaceTime and a million different ways to connect with these people, who are interior designers, and it's free.
00:35:02.840 I mean, what is that going to look like when you put it up there?
00:35:05.440 What they do is, you FaceTime and they'll take a picture of the room or your house, and they will then superimpose the look of the drapes or the shutters or the blinds or whatever it is that you want.
00:35:19.000 And they will show you: this is what it's going to look like.
00:35:22.020 Then they'll send you a piece of the fabric or a piece of the blind.
00:35:24.480 So you know the color is exactly right.
00:35:50.400 Through Christmas Eve, you can enjoy the savings site-wide, plus a guaranteed 20% off, when you go to blinds.com and use the promo code BACK.
00:36:01.100 So if you use the promo code BACK now through Christmas Eve, you will get 20% off anything for your window coverings at blinds.com.
00:36:10.660 Blinds.com, promo code BACK. Rules and restrictions apply.
00:36:31.400 Chapter 2 of The Immortal Nicholas: Rafe and I are reading it every night, and we ask you to join us with your family. It's commercial-free, and it's only online at theblaze.com/TV for premium subscribers.
00:36:42.280 One other thing to add about Alabama is that the people of Alabama just don't trust the people presenting the information about Roy Moore.
00:36:49.260 And let me give you an example of this.
00:36:50.920 This is about Al Franken and the accusations against him, from the New York Times.
00:36:54.480 "In the grand cavalcade of sexual assault charges we've been hearing lately, his list, Franken's list, goes from fanny gropes to tongue thrusts.
00:37:01.260 It's appalling but pretty minor league.
00:37:03.720 And the picture of Franken feeling up the well-protected breasts of a sleeping colleague on a tour could have been subtitled 'portrait of a comedian who does not suspect he'll ever run for senator.'
00:37:13.440 Franken was a good politician, and many Democrats hoped he might grow into a presidential candidate.
00:37:17.260 But it was his destiny to serve history in a different way."
00:38:02.920 Democrats have driven themselves insane for about a year trying to figure out what happened when it comes to Donald Trump and Hillary Clinton.
00:38:11.320 Now the left can't understand why Doug Jones isn't running away with this race,
00:38:15.240 after the really creepy allegations against Roy Moore.
00:38:19.680 Why would conservatives vote for a creep?
00:38:50.880 For many conservatives in Alabama, this is the same struggle they had in the last presidential election, where they held their nose and voted for Trump.
00:42:21.740 And 10 years from now, our lives, our health, our jobs, possibly, hopefully, our politics are going to be completely different.
00:42:32.220 And it's very exciting, but it's also terrifying.
00:42:35.520 And it's only terrifying if you haven't thought of these things before, because they are coming and you can't put the genie back in the bottle.
00:43:31.780 So I don't even know where to begin with you.
00:43:36.000 I really kind of want to try to just introduce America to some of the thoughts that you put together in Augmented about what the world coming our way is going to be like.
00:43:53.980 There's jobs, there's education, there's health, and then we get into stuff like AI and robotics.
00:44:03.740 But let's just start with jobs, education, and health.
00:44:06.900 So there are four disruptive themes I identified in the book.
00:44:12.240 Obviously, the first and most disruptive technology we're going to deal with over the next 10, 20 years is artificial intelligence.
00:44:19.700 But that's going to spur on a whole range of other changes in society.
00:44:24.320 So the first notable impact is we'll be talking to computers.
00:44:29.540 We'll have computers embedded in the world around us that are collecting data through sensors and so forth.
00:44:34.840 And then that flows on to things like health care.
00:44:38.020 For example, you may have seen in the news, AliveCor just got approval from the FDA to launch a device, a band that essentially attaches to the Apple Watch, that can do sort of a full EKG/ECG monitoring of your heart rate over time.
00:44:54.980 But when you tie that with an artificial intelligence, they're now expecting within the next 18 months or two years it'll be able to predict whether you're going to have a heart attack based on that data.
00:45:05.700 So this is where we see the marriage of sensors and artificial intelligence really changing the way we think about things like health care.
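The screening idea described here, software watching a heart-rate stream for readings that break from the wearer's baseline, can be sketched in a few lines. This is a hypothetical illustration only: the function name, window size, and threshold are invented, and real products like the one discussed use far more sophisticated models, not a moving average.

```python
# Illustrative sketch: flag heart-rate samples that deviate sharply
# from a trailing moving-average baseline. All names and thresholds
# here are made up for illustration, not any vendor's actual method.

def flag_anomalies(bpm_series, window=5, threshold=25):
    """Return indices whose reading deviates from the trailing
    moving average of `window` samples by more than `threshold` bpm."""
    flagged = []
    for i in range(window, len(bpm_series)):
        baseline = sum(bpm_series[i - window:i]) / window
        if abs(bpm_series[i] - baseline) > threshold:
            flagged.append(i)
    return flagged

readings = [72, 74, 71, 73, 75, 74, 130, 72, 70]  # spike at index 6
print(flag_anomalies(readings))  # -> [6]
```

The point of the sketch is the pipeline shape: a sensor produces a time series, and software compares each new reading against learned context.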
00:45:15.940 Yeah, you talk about these sensors in a way that has made me want to wear my Apple smartwatch a little bit more, about the way it's going to be able to detect exactly what's happening in our body.
00:45:35.120 I mean, we would much rather go to the ingestibles you're talking about. Like for diabetes: you'll be able to swallow a computer in the future that will monitor your blood work.
00:45:51.300 And so it will look at your sugar levels, and then, you know, it won't be long before we have an internal device able to dispense insulin.
00:45:58.880 So, you know, regulate insulin in our body without having to inject it, and things like that.
00:46:03.920 And, you know, if you've got a complaint, we can get you to swallow a camera now and ingest that instead of having invasive surgery.
00:46:12.200 I mean, there's a lot of stuff happening on the sensor front in health.
00:46:15.440 Are we entering a time where it's possible to say disease goes away?
00:46:25.180 So the biggest shift in respect to disease won't necessarily be just diagnosis.
00:46:31.140 I think that, you know, what we can do with an imaging AI right now, with machine learning, is we can give an algorithm 3,000, 5,000 medical images with diagnosis data.
00:46:45.440 And it will be able to do a pretty good job of approximating the diagnosis that you would have got from your doctor.
00:46:51.680 So diagnostic technology is going to increase exponentially.
00:46:58.180 And essentially we're going to get these computers doing the best diagnosis possible.
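The core idea here, give a system labeled examples and let it approximate the label for a new case, can be shown with a deliberately tiny toy. Real imaging AIs train deep networks on thousands of scans; this nearest-neighbor sketch, with invented feature vectors and labels, only illustrates the "learn from labeled examples" principle.

```python
# Toy illustration of supervised diagnosis: given labeled examples
# (feature vector, diagnosis), predict the label of a new case by
# nearest neighbor. Features and labels are made up for illustration.

def nearest_label(examples, query):
    """examples: list of (features, label) pairs; returns the label
    of the example whose features are closest to `query`."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(examples, key=lambda ex: dist(ex[0], query))[1]

training = [
    ((0.1, 0.2), "benign"),
    ((0.9, 0.8), "malignant"),
    ((0.2, 0.1), "benign"),
]
print(nearest_label(training, (0.85, 0.9)))  # -> malignant
```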
00:47:03.060 Talk a little bit, combining all this...
00:47:04.900 Talk a little bit about the computer in New York.
00:47:08.140 This is kind of an offshoot of Watson.
00:47:11.660 Watson, you know, beat the best humans at Jeopardy.
00:47:15.440 They had the idea of, wait a minute:
00:47:18.520 What if we just put all of the medical information into it, and all of the different cases, and see if it can diagnose cancer? And it's far better than human doctors.
00:47:30.600 So right now, IBM Watson gets about a 96, 97% hit rate in terms of its diagnosis for specific types of cancer.
00:47:39.820 Now, when you compare that to the best oncologists in the U.S., who have 20 years of experience, they get it right about 50% of the time, which of course is why, you know, everyone tells you you should always get a second opinion.
00:47:55.240 It's obviously fairly new tech, but what would be really good is if we could eliminate cancer altogether.
00:48:01.580 And so what we're working on is technologies like gene editing. The two major streams of this are CRISPR and TALEN, where essentially we can now sequence your DNA, but the future is actually modifying your DNA.
00:48:15.760 So if you've got a disease, a protein switch that results in, say, leukemia, we'll be able to flick that switch to create antibodies instead of creating leukemia, just by changing the genome.
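One small, concrete piece of the gene-editing workflow being described can be shown in code. With the common CRISPR enzyme SpCas9, a cut can only be made next to a short "PAM" motif (NGG), so a first step in designing an edit is scanning the DNA sequence for those motifs. The sequence below is invented, and this is only a sketch of that one scanning step, not an editing pipeline.

```python
# Hedged sketch: scan a DNA string for NGG PAM motifs, the short
# landmarks the SpCas9 enzyme needs next to its cut site. The
# sequence is made up; real guide design checks much more than this.

def find_pam_sites(dna):
    """Return 0-based positions where an NGG motif starts
    (any base followed by two guanines)."""
    return [i for i in range(len(dna) - 2)
            if dna[i + 1:i + 3] == "GG"]

seq = "ATGGCCTGGAA"
print(find_pam_sites(seq))  # -> [1, 6]
```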
00:48:28.040 So I don't know if it was yours, Brad, I've been reading so much lately, but do you speak about telomeres in your book?
00:48:36.720 So, you know, telomere length is another element around longevity.
00:48:41.080 And there's a whole lot of new science coming out around longevity now, which is really interesting. But the ability to, you know, insert telomerase, which is the enzyme that rebuilds the telomeres... At the end of the DNA are these little strands that sort of hold the DNA together, sort of like the aglets you have on your shoelace, and they fray over time.
00:49:05.360 Now, if we can restore them, then it's believed that we can extend life.
00:49:09.560 So, you know, there's a lot of work going into longevity, because as those begin to fray, that's the aging process.
00:49:18.800 So if we can... Yeah, errors creep in, and that's how we age, exactly.
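The "fraying aglet" picture of aging can be put in toy numerical form: telomeres lose a chunk of base pairs with each cell division, and once they fall below a critical length the cell stops dividing. The numbers below are invented for illustration, not real biological parameters.

```python
# Illustrative model (invented numbers, not real biology): telomeres
# shorten with every cell division, and the cell stops dividing once
# they drop below a critical length.

def divisions_until_senescence(telomere_bp=10000, loss_per_div=100,
                               critical_bp=4000):
    """Count divisions before the telomere falls below critical_bp."""
    count = 0
    while telomere_bp - loss_per_div >= critical_bp:
        telomere_bp -= loss_per_div
        count += 1
    return count

print(divisions_until_senescence())  # -> 60
```

Restoring length (what telomerase does) would, in this toy picture, reset the counter and buy the cell more divisions.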
00:49:23.240 So, you know, I'm reading your book, and half of it, I am more excited about the future and more convinced that you just have to hold on to 2030, 2035 and the world's going to be different.
00:49:40.720 I mean, anything that you're dealing with, we're going to be able to take care of.
00:49:46.880 That's kind of the optimistic feeling that I get.
00:49:51.940 However, the other half of me... you know, you look at, for instance... talk about the climber.
00:50:00.440 I don't remember his name, the mountain climber that... yeah, tell the story.
00:50:07.940 So, Dr. Hugh Herr. This is a really interesting one.
00:50:10.820 When he was 17, he lost both of his legs to frostbite in a climbing accident where he was trapped on Mount Washington.
00:50:21.180 And so he was very inspired to, you know, fix that problem.
00:50:25.640 So he went to Harvard and MIT to learn biomedicine and robotics.
00:50:30.140 And he basically built himself new legs.
00:50:33.200 And today his friends joke that they're going to have to get amputations as well to keep up with him, in terms of his ability to climb a mountain now, because of the specialist prosthetics he's designed to climb mountains.
00:50:47.320 But this does raise the ethical concern: once we get to the point where prosthetics are able to perform at, or at a better level than, our own human limbs, what do we do when people start voluntarily having amputations to get prosthetics because they're going to get improved performance?
00:51:11.220 So obviously we have to start thinking about the ethics of things like artificial intelligence
00:51:17.320 and technology in a pretty structured manner.
00:51:20.660 We can't just let it happen as we have with the iPhone and the internet and so forth, where we just let the pure capitalist approach take over.
00:51:28.960 We need an ethical, structural approach to these technologies.
00:51:32.820 So there are some initiatives coming out like this, like DeepMind, the Google effort.
00:51:37.760 They've created an Ethics & Society unit to sort of put together or codify ethical standards.
00:51:43.960 But, you know, it's hard to decide on ethics in our society.
00:51:48.920 You know, we don't agree on things, as you pointed out at the start of the show.
00:51:53.280 How do we codify ethics when, as humans, we can't necessarily agree on a code of ethics amongst ourselves?
00:51:59.680 I worry about this because, you know, you also look at people like Vladimir Putin, who has recently come out and said whoever is first in with AI controls the world.
00:52:10.740 I don't think Putin cares about, you know, ethics, or at least the same kind of ethics.
00:53:50.400 If you've ever had chocolate-dipped strawberries, you know, when you order them online and stuff, they're usually crappy, small, unripe strawberries.
00:53:58.080 These are the biggest and the juiciest strawberries you've ever seen.
00:54:02.860 There is no better gift than Shari's Berries.
00:54:06.180 They've also added amazing treats like snowman brownie pops, cheesecake Christmas trees, chocolate truffles.
00:59:24.920 But, you know, it's not necessarily that machines are going to be, you know, malevolent or benevolent.
00:59:30.940 The one thing we're learning is that artificial intelligences don't think like us as humans.
00:59:37.180 So when we attribute to a superintelligence the idea that it's going to take over the world, like, you know, the T-1000 Terminator,
00:59:44.700 you know, we are thinking in human terms, but it's not necessarily the case that machines are going to act like humans.
00:59:51.840 So I think that's the one saving grace here: sufficiently advanced AIs may not really care about us that much.
01:00:00.160 They may have their own agenda, which we have to cope with.
01:00:05.280 But that's, again, where I think empathy is important.
01:00:07.760 I think if they have empathy for their creator, us, I think that will help us.
01:00:13.160 So I think building empathy and ethics into robotics is sort of key as a safety valve.
01:00:19.720 Well, I mean, you can already see the seeds planted: there's a robotic brothel in Germany now.
01:00:28.360 And people like to go there and, you know, they'll have their way with the robots, and the wives, I guess,
01:00:33.860 wait in the parking lot for the guys, because, you know, it's not like really cheating, and all this stuff.
01:00:38.820 And you think about how these robots, some of these robots, are going to be used and abused by people.
01:00:46.080 If it is AI at some point, as Kurzweil says, an age of spiritual machines, at some point it will say: don't, I'm hurt, I'm lonely.
01:00:56.180 And then, you know, you do have the human emotion.
01:01:00.140 Well, you know, the other element of this, of course, is that these AIs get very, very good at understanding human behavior and learning to adapt to our concerns.
01:01:11.600 If you have a personal AI encoded in your smartphone, for example, you know, it could become your best friend.
01:01:19.980 In fact, you know, people may fall in love with their AI.
01:01:25.360 You know, I think that's because if you've got someone who reacts to you in a perfect way, responding to your every need, then, you know, that's a great basis to build a friendship on.
01:01:34.260 I mean, quite honestly, Brad, they don't have to destroy us.
01:01:38.700 They just have to get us to fall in love with them and not procreate.
01:01:43.760 I mean, why would... I've thought about this for a long time.
01:01:46.880 If I could come home and it's the perfect woman, who has every trait that I love, physically, mentally, everything else...
01:02:19.360 This may be the resurgence of humanity as well.
01:02:22.880 I think we'll have an approach where we get totally into this technology.
01:02:27.660 It infuses into society and people get carried away with it.
01:02:30.780 But, you know, there may be an authenticity to the human experience that we miss after a time.
01:02:36.040 And I think, you know, that's probably where, as humans, we'll need to differentiate.
01:02:40.720 We'll have to differentiate in our very humanity.
01:02:43.540 So, you know, you talked about employment and education.
01:02:46.520 You know, if you want to be relevant in that future, you're going to have to be extremely adaptable.
01:02:52.220 But I think the skills that will come to the fore are those that are really human: those that cater for that real human contact, that human touch, that really authentic humanity.
01:03:03.860 Because of your book, I talked to my 13-year-old son, who is just really an empathetic kid and just loves people and loves children.
01:03:15.560 And I said, you know, have you ever thought about going into nursing and being a nurse practitioner?
01:03:20.360 And we talked about, you know, having your own robotics that you would be watching over several patients with, but you would be the one that would be able to come in, kind of telepresence, and be able to be there for people and have the actual person-to-person experience.
01:03:37.600 Nursing is not going to be like it is today.
01:03:40.280 Well, you know, if you look at how AI is going to impact jobs right now, the biggest impact we see, particularly in markets like the U.S. and even China, is where humans are involved in process: ticking the box, following a checklist, you know, these sorts of things. Accountants, lawyers, you know, bankers, bank tellers.
01:04:03.840 But where we see a lot of demand coming is those human elements, the creative elements: design and counselling. We think counselling and psychology and those sorts of elements, particularly as the role of work in society shifts and we become less defined by what we do and more defined by who we are.
01:04:25.260 You know, there's going to be a huge demand for those sorts of human elements of behavioral psychology and counselling.
01:04:32.840 So, you know, it's really easy to say, I'll never do this.
01:04:39.920 We'll all augment. Especially, you describe how artificial superintelligence will be so far ahead of us that we won't even be able to understand it.
01:04:54.640 You know, I look at it this way: if I'm an augmented human, I've augmented my brain and I'm connected to the, you know, Borg or whatever it would be.
01:05:05.720 I am looking at the world differently.
01:05:09.060 I have access to knowledge, and I'm talking to a non-augmented human.
01:05:16.020 Well, you know, Kurzweil and Musk, of course, say that for us to keep up with AI, we're going to have to augment our intelligence.
01:05:30.160 Now, this sounds pretty far-fetched: putting neural implants in so we can do a Google search in our head, for example.
01:05:37.540 But, you know, that's only one step away from where we are today, where we pick up the phone and we, you know, ask Google or Alexa to search for information.
01:05:46.560 You know, my kids will never have to pick an Encyclopedia Britannica off the shelf to learn about, you know, I don't know, how many moons are around Jupiter.
01:05:54.080 They can just ask their computer.
01:05:58.500 But this is, you know, when we talk about things like the robotic prosthesis: you know, if you have short-sightedness or a problem with your vision, would you be prepared to wear an implant that could give you 20/20 or better vision?
01:06:35.560 You'd just implant this? Just based on what I do for a living,
01:06:40.420 it would give me such an advantage that I would really be hard-pressed not to do it, because I would know also that if I don't do it, the other guys are going to do it, and there's no way I'll be able to compete.
01:06:54.440 I mean, it's going to be a really tough choice.
01:06:56.540 And this is where science fiction actually informs us about some of these things, because, you know, we've seen sci-fi writers write about this and talk about the fact that you've got natural humans versus augmented humans, and the battle between these two ethically.
01:07:12.340 And, you know, I think that's probably a pretty real thing that we're going to have to deal with.
01:07:16.160 Of course, some of it's a little bit more simple, you know. Like, for example, in respect to repairing damage that you might have to your body: you know, prosthesis is there, but we're working on 3D-printed organs.
01:07:30.540 So bioprinted kidneys and hearts and things like that.
01:07:33.940 So, you know, if you develop a heart disease in the future, we may be able to 3D print a new heart using your own stem cells, so that you don't need any rejection medicine anymore.
01:07:44.320 And, you know, you could get a new heart, and that could extend your life by 20 or 30 years.
01:07:57.100 And I know now, you know, exoskeletons are being developed that would give her full use of her arm and her hand.
01:08:05.680 And is there a time in her lifetime, she's 29 now, where she would have the fog of the way she thinks lifted?
01:08:18.960 Certainly, I think within the next 20 to 30 years that's a real possibility.
01:08:25.840 I think, with both gene therapy and with augmentation technologies, the word disability will disappear from our vernacular.
01:08:37.360 Are you concerned at all, especially when it comes to gene manipulation... Iceland has now become the only place that has a near-zero birth rate of Down syndrome.
01:08:54.460 And it's because of abortion and early detection.
01:09:01.280 I mean, you know, I just don't think it's a good... I don't know if it's a good thing.
01:09:05.880 I've met a lot of people with Down syndrome, and I quite honestly think, when I'm with them, I think, and excuse the use of this word, but it's appropriate for us...
01:09:21.000 Are you worried at all about this world that we're going into, that can just make everybody perfect?
01:09:30.120 Designer babies, you know, we've heard talk about it, you know: in vitro manipulation of DNA and so forth.
01:09:35.880 You know, obviously there's a huge ethical minefield in terms of where do you stop, where do you start.
01:09:41.760 Now, if there's a congenital disease that's going to debilitate, you know, that child for the rest of its life, and you can fix it, then why wouldn't you?
01:09:51.300 But at the same time, what if I was able to change your hair color or your skin color or your muscle tone, you know,
01:09:58.660 and get you to be more athletic or more mathematically inclined?
01:10:02.780 You know, this is where it's a slippery slope.
01:10:05.820 Having said that, I think what history teaches us, and this is the inevitability of this and why I choose to be optimistic,
01:10:14.240 is that with all of these technologies, if you look back to the start of the Industrial Revolution,
01:10:21.580 we don't have the self-control to limit humanity's experimentation with these things, generally.
01:10:28.780 We rush forward and embrace this and worry about the complications later.
01:10:33.800 Okay, so I've only got about a minute, and I'd love to have you back to talk about banks and everything else.
01:10:39.980 That's a new book, and of course you're...
01:15:39.560 But why is Mueller still bogged down over that?
01:15:44.060 Hand that off and get busy investigating, I don't know, the largest hostile intelligence operation conducted in United States history.
01:15:53.560 You know, the little thing where Russia attempted to interfere with the election and seemed to be influencing and worming its way into both parties.
01:16:03.860 You know, the crime you're supposed to be concentrating on.
01:16:07.920 The last time I checked, the election happened months before the Trump transition period.
01:16:14.480 If you see obstruction of justice there, great.
01:16:18.580 Well, I mean, not great, but get on that and take it on.
01:16:25.360 We still don't know how a company like Fusion GPS became so influential in this investigation.
01:16:31.980 Fusion GPS: a company that was employed by the Russians, the Clinton campaign, the DNC.
01:16:38.260 Does it sound like the information you'd get from them might be just a little biased?
01:16:42.800 Apparently, that knowledge was either lost on the FBI and DOJ or they didn't care, because we now know that the FBI was trying to pay to have the Fusion GPS operative continue his work.
01:16:58.020 When the House Intelligence Committee began pushing Deputy Attorney General Rosenstein to answer some of these things, the pushback began.
01:17:11.180 Now, what possible motive would he have to stonewall Congress?
01:17:17.000 Well, it came out last week that senior DOJ official Bruce Ohr was demoted after it was found out that he had secret meetings with the head of Fusion GPS.
01:17:26.720 Now, that alone sounds bad, but that was so last week.
01:17:30.500 Yesterday, Fox reported that it was not only Ohr holding secret meetings with Fusion GPS: his wife was also working for them.
01:17:38.420 She was employed by Fusion GPS while the Trump dossier was being compiled. Before working at Fusion GPS,
01:17:48.040 she worked at a Washington think tank, where her specialty was described as Russia expert.
01:17:54.120 Want to take any guess where Ohr's office is at the DOJ, or was?
01:17:59.700 Lo and behold, just a few doors down from the same Deputy Attorney General.
01:18:04.760 Now, whether something nefarious happened here or not, you could very easily make the case that this is the reason the outlandish Trump dossier was used to possibly get FISA warrants to spy on a presidential candidate.
01:18:18.580 You could also make the case that the bias is so thick that it's also the reason why a special counsel seems so hell-bent on proving obstruction of justice rather than finding out who was involved in possibly the worst foreign intelligence attack in our history.
01:18:36.300 Is the FBI dirty? Is the DOJ dirty?
01:18:43.760 They need to get their house in order.
01:21:04.480 And then, over the weekend, I don't know if you saw this brutal, brutal tape of police coming after a guy. He's lying down in the hallway; he's clearly drunk or drugged or something.
01:21:20.940 And the police, you know, have their flak jackets on and they've got an AR trained on him.
01:21:28.100 And the vest cam shows this guy, and they're just barking orders at him.