The Joe Rogan Experience - September 11, 2024


Joe Rogan Experience #2201 - Robert Epstein


Episode Stats

Length

2 hours and 38 minutes

Words per Minute

148.5159

Word Count

23,567

Sentence Count

1,909

Misogynist Sentences

21


Summary

In this episode, Joe Rogan talks with research psychologist Dr. Robert Epstein, who opens with what he calls a planned on-air meltdown. Epstein describes the personal toll of the twelve years he has spent studying Google and other tech companies: the loss of his ability to publish with major outlets after Hillary Clinton tweeted that his research had been debunked, the death of his wife in a car accident he considers suspicious, and a string of unsettling incidents involving people around him. He then walks through his research on new forms of online influence, including biased search results, search suggestions, answer boxes, and the "answer bot effect," and describes the nationwide monitoring system his team has built (America's Digital Shield), which captures ephemeral content sent to a politically balanced panel of more than 15,000 registered voters and many of their children. The conversation covers the apparent suppression of alternatives like ProtonMail and DuckDuckGo, conservatively biased results about Elizabeth Warren, sexualized and violent YouTube recommendations reaching kids, the prospects for regulation in Washington, and the possibility of using the monitoring network for active threat assessment of AI.


Transcript

00:00:11.000 And we're up.
00:00:12.000 Hello, Robert.
00:00:13.000 Good to see you.
00:00:15.000 Hello, Joe.
00:00:15.000 You look a little stressed out.
00:00:18.000 Uh, I am stressed out.
00:00:19.000 In fact, are we recording?
00:00:21.000 Yes.
00:00:21.000 Okay, then, uh...
00:00:24.000 Then I want to make a special request.
00:00:27.000 Okay.
00:00:28.000 You can kick me out if you like.
00:00:30.000 Why would I do that?
00:00:33.000 Because I need to have a meltdown.
00:00:39.000 I would like to have a meltdown right now on your show.
00:00:46.000 You want to have a personal meltdown?
00:00:48.000 Yes.
00:00:49.000 Okay, go ahead.
00:00:51.000 Okay.
00:00:55.000 I've never heard anybody plan for a meltdown before.
00:00:58.000 Well, I need to do this, and I think this is the right opportunity.
00:01:04.000 Okay.
00:01:05.000 And I don't know what I'm going to say.
00:01:07.000 Okay.
00:01:07.000 But I am definitely going to meltdown.
00:01:10.000 Okay.
00:01:13.000 Okay.
00:01:15.000 So I am completely fed up.
00:01:22.000 I have worked day and night.
00:01:26.000 I work about 80 hours a week.
00:01:28.000 I'm directing almost 40 research projects.
00:01:35.000 I've been working really hard for maybe 45 years.
00:01:40.000 And the last 12 years where I've turned my eye to Google and other tech companies have turned into, for me personally, a disaster.
00:01:53.000 So, before I started studying Google, I had published 15 books with major publishers.
00:02:00.000 Since I've started studying Google and other companies, I can't publish anymore.
00:02:09.000 I used to write for and actually work for mainstream news organizations and media organizations.
00:02:17.000 I was editor-in-chief of Psychology Today for four years.
00:02:20.000 I was an editor for Scientific American.
00:02:24.000 I wrote for USA Today and U.S. News and World Report and Time Magazine.
00:02:30.000 But in 2019, after I testified before Congress about some of my research on Google, President Trump tweeted to his whatever, millions of gazillions of followers,
00:02:46.000 basically some praise for my research.
00:02:49.000 He got the details wrong.
00:02:51.000 But then Hillary Clinton, whom I had always admired, chose to tweet back to her 80 million Twitter followers, and she tweeted that my work had been completely debunked and was based on data from 21 undecided voters.
00:03:10.000 I still have no idea where any of that came from.
00:03:13.000 Probably someone from Google, because Google was her biggest supporter in 2016. And this was 2016. And then that got picked up by this machine.
00:03:27.000 I'm told it's called the Clinton machine.
00:03:29.000 And the New York Times picked that up without fact checking.
00:03:34.000 And then a hundred other places did.
00:03:36.000 And I got squashed like a bug.
00:03:40.000 Squashed.
00:03:41.000 I had a flawless reputation as a researcher.
00:03:44.000 My research reputation was gone.
00:03:48.000 I was now a fraud.
00:03:50.000 A fraud.
00:03:52.000 Even though I've always published in peer-reviewed journals, which is really hard to do.
00:03:59.000 And there was nothing I could do about it.
00:04:01.000 And all of a sudden, I found that the only places I could publish were in what I call right-wing conservative nutcase publications, where I've actually made friends over the years.
00:04:12.000 I've made friends with them, but that's beside the point.
00:04:16.000 I was crushed.
00:04:20.000 Not only that, I've been discovering things.
00:04:25.000 I've made at least 10 major discoveries about new forms of influence that the internet has made possible.
00:04:32.000 These are controlled almost entirely by a couple of big tech companies, affecting more than 5 billion people around the world every single day.
00:04:43.000 And I've discovered them, I've named them, I've quantified them, I've published randomized controlled studies to show how they work, published them in peer-reviewed journals.
00:04:58.000 We just had another paper accepted yesterday.
00:05:09.000 And I've built systems to do to them what they do to us and our kids.
00:05:15.000 They surveil us and our kids.
00:05:19.000 24 hours a day.
00:05:21.000 Google alone does that over more than 200 different platforms, most of which no one's ever heard of.
00:05:27.000 People have no idea the extent they're being monitored.
00:05:30.000 They're being monitored when they have Android phones.
00:05:33.000 They're being monitored even when your phone is off.
00:05:36.000 Even when the power is off, you're still being monitored.
00:05:40.000 How do they do that?
00:05:43.000 Well, because remember when we could take the batteries out?
00:05:47.000 And then at some point, they soldered them in?
00:05:50.000 Because they soldered the batteries in, even when you turn the phone off, it's not off.
00:05:55.000 It's easy to demonstrate.
00:05:56.000 It's still transmitting.
00:05:57.000 Or it'll transmit the moment the power comes back on.
00:06:01.000 It's still collecting data.
00:06:05.000 What am I trying to say here?
00:06:06.000 Then my wife was killed in a suspicious car accident.
00:06:14.000 This was also shortly after I testified before Congress in 2019. Right before she was killed, I did a private briefing for Ken Paxton, the AG of Texas, and other AGs at Stanford University.
00:06:28.000 And one of those guys came out afterwards and he said, well, based on what you told us, Dr. Epstein, he said, I don't mean to scare you, but he said, I predict you're going to be killed in some sort of accident in the next few months.
00:06:41.000 So I told you this before, when I was on before, and obviously I wasn't killed, but my beautiful wife was killed.
00:06:49.000 And, you know, her vehicle was never inspected.
00:06:55.000 And then it disappeared from the impound lot.
00:06:57.000 I was told it was sold to some junk company in Mexico.
00:07:02.000 And that is one of now six, six incidents, six, of violence against people who are associated with me over the past few years.
00:07:16.000 The last just happened a couple of weeks ago.
00:07:19.000 What was that one?
00:07:20.000 This episode is brought to you by The Farmer's Dog.
00:07:22.000 Dogs are amazing.
00:07:24.000 They're loyal.
00:07:25.000 They're lovable.
00:07:25.000 Just having Marshall around can make my day 10 times better.
00:07:29.000 I'm sure you love your dog just as much, and you want to do your best to help them live longer, healthier, happier lives.
00:07:36.000 And a healthy life for your dog starts with healthy food, just like it does for us.
00:07:41.000 There's a reason having a balanced diet is so important.
00:07:44.000 So how do you know if your dog's food is as healthy and as safe as it can be?
00:07:50.000 Farmer's Dog gives you that peace of mind by making fresh, real food developed by board-certified nutritionists to provide all the nutrients your dog needs.
00:08:00.000 And their food is human-grade, which means it's made to the same quality and safety standards as human food.
00:08:07.000 Very few pet foods are made to this strict standard.
00:08:10.000 And let's be clear, human-grade food doesn't mean the food is fancy.
00:08:14.000 It just means it's safe and healthy.
00:08:16.000 It's simple.
00:08:17.000 Real food from people who care about what goes into your dog's body.
00:08:22.000 The farmer's dog makes it easy to help your dog live a long, healthy life by sending you fresh food that's pre-portioned just for your dog's needs.
00:08:31.000 Because every dog is different.
00:08:32.000 And I'm not just talking about breeds.
00:08:34.000 From their size to their personality to their health, every dog is unique.
00:08:38.000 Plus, precise portions can help keep your dog at an ideal weight, which is one of the proven predictors of a long life.
00:08:46.000 Look, no one, dog or human, should be eating highly processed foods for every meal.
00:08:51.000 It doesn't matter how old your dog is, it's always a great time to start investing in their health and happiness.
00:08:57.000 So, try The Farmer's Dog today.
00:08:59.000 You can get 50% off your first box of fresh, healthy food at thefarmersdog.com slash rogan.
00:09:06.000 Plus, you get free shipping.
00:09:08.000 Just go to thefarmersdog.com slash rogan.
00:09:11.000 Tap the banner or visit this episode's page to learn more.
00:09:15.000 Offer applicable for new customers only.
00:09:18.000 Well, this last one is kind of weird and creepy.
00:09:22.000 And then I was at a meeting in Dallas, I think it was.
00:09:26.000 Oh, no, no, no.
00:09:27.000 It was up in Monterey.
00:09:29.000 And it was with General Paxton and then with some of my staff.
00:09:35.000 And one of my staff members sitting next to me, she all of a sudden just brushed her hand against my computer case, which is...
00:09:44.000 That's my computer case.
00:09:47.000 And she screamed.
00:09:51.000 And we all went, what?
00:09:52.000 What happened?
00:09:53.000 And she goes, look!
00:09:55.000 And there was a needle sticking out of the computer case, sticking out of the computer case, which is impossible!
00:10:03.000 And it was going a half inch into her thumb.
00:10:06.000 It had gone through the end.
00:10:08.000 And of course, I'm thinking, oh, that's awful, but maybe you just saved my life.
00:10:14.000 Maybe it's got some sort of weird poison on it, or it's like a Putin thing, and it's got radioactive substances or something.
00:10:21.000 I'm trying to joke around, but meanwhile, she was terrified.
00:10:25.000 How did a pin get...
00:10:28.000 And by the way, I have a picture of the pin.
00:10:30.000 It's really creepy.
00:10:32.000 I've never...
00:10:32.000 So when you're saying a needle, you're not saying like a syringe?
00:10:35.000 You're saying a needle like a sewing needle?
00:10:37.000 No, that's what I'm saying.
00:10:38.000 None of us has ever seen a needle like this needle.
00:10:42.000 It's about two inches long.
00:10:45.000 The end is like it's been sharpened.
00:10:49.000 Okay.
00:10:49.000 You can see it sharpened.
00:10:50.000 And at the end where there should be a hole for thread, there's no hole.
00:10:55.000 Okay.
00:10:56.000 I don't know what it is.
00:10:58.000 But we've had worse incidents, too.
00:11:00.000 I'm just saying this happened to be the latest.
00:11:02.000 But that's in your computer bag?
00:11:04.000 Was your computer bag ever out of your care?
00:11:07.000 Well, not that I noticed, but...
00:11:11.000 But if somebody wanted to harm you, a little needle, it's not really...
00:11:15.000 Oh, I don't think that's someone wanting to harm me.
00:11:18.000 What do you think that is?
00:11:20.000 Well, if it's anything, it's someone wanting to scare me.
00:11:25.000 And the fact is I have been scared and so have a lot of my staff.
00:11:28.000 This summer we've had 26 interns.
00:11:31.000 They come from all over the country.
00:11:34.000 23 of these people are volunteers and fantastic young people, extremely smart, helping me run almost 40 research projects.
00:11:47.000 And there is, you know, we take precautions and there is some fear.
00:11:52.000 And one of these young men who's done superb work, he asks that we take his name off of everything.
00:12:02.000 He didn't quit, but I'm just saying...
00:12:05.000 Sounds reasonable.
00:12:08.000 Yeah, yeah, because there have been a number of incidents.
00:12:13.000 And if I were...
00:12:17.000 Did you ever hear of John Katsimatidis?
00:12:20.000 No.
00:12:21.000 Okay, he owns a ton of...
00:12:41.000 But yet you're still alive.
00:12:47.000 Well, I'm alive, but I'm in rough shape because, you know, when a push comes to shove here, I have been making discoveries that are really startling.
00:12:59.000 And they've gotten worse and worse and worse.
00:13:01.000 And since I was last with you, which was two and a half years ago, we've made probably five or six, seven more discoveries.
00:13:08.000 They get worse each time.
00:13:10.000 We've done something that I was speculating about doing when I was here, which was building a nationwide monitoring system to surveil them the way they surveil us and see what content they're actually sending to real voters and to real kids.
00:13:27.000 So let's break this down because I think we're getting a little in the weeds here.
00:13:31.000 Let's explain to people that don't know what you're talking about what your research is about because most people are not aware.
00:13:40.000 And one of the major issues that you have discovered is the curation and the purposeful curation of information through search engines.
00:13:52.000 So most people that are unaware think that when you do a Google search on something, say if you want to find out about a Kamala Harris rally or a Trump rally, that you are just going to get the most pertinent information in the order in which it's most applicable to your search.
00:14:13.000 But that's not the case.
00:14:14.000 The case is that everything is curated, and if you want to find positive things about someone they deem to be negative to whatever ideology they're promoting, it will be very difficult to find that information.
00:14:30.000 If you want to find positive things about someone they support, they will be right up front.
00:14:35.000 If you want to find negative things about someone they support, they will be very difficult to find and you will be inundated with positive things.
00:14:42.000 And what you have found is that this curation of information from searches has a profound effect, especially on the casual voter, on the low-information voter, a profound effect on who gets elected, and it's tantamount to election interference.
00:15:00.000 Is that fair to say?
00:15:02.000 It's fair to say that's where I was two and a half years ago.
00:15:05.000 We have gone so far beyond that because it's not just search results.
00:15:12.000 It's search suggestions, which we're capturing now by the millions.
00:15:16.000 So it was in the news recently that when people were typing in Trump assassination, you know, they were getting crazy stuff like the Lincoln assassination.
00:15:27.000 They were getting crazy stuff and they were not getting information about the Trump attempted assassination.
00:15:34.000 And, you know, I looked at that and I said, oh, isn't that nice?
00:15:38.000 There's an anecdote about how they may be abusing search suggestions.
00:15:43.000 We don't have anecdotal data anymore.
00:15:46.000 We have hardcore, large-scale scientific data on all of these issues.
00:15:53.000 We know what's actually going on and we've quantified the impact.
00:15:56.000 See, it's one thing to say, oh, look what they're doing.
00:15:58.000 It's quite another to say, what impact does that have on people?
00:16:02.000 Right.
00:16:02.000 Let's talk about the Trump assassination one in particular.
00:16:05.000 What did you find about that?
00:16:09.000 Well, frankly, we couldn't care less about that because that's one anecdote.
00:16:14.000 We're collecting these by the millions and what we know, we know a couple of things.
00:16:19.000 We know that, first of all, they're not...
00:16:24.000 It started out as one thing, and it's turned into something else.
00:16:28.000 And so what they do is they use search suggestions to shift people's thinking about anything.
00:16:34.000 It's not just about candidates, either.
00:16:36.000 It's about anything.
00:16:38.000 And we've shown in controlled experiments that by manipulating search suggestions, you can turn a 50-50 split among undecided voters into a 90-10 split, with no one having the slightest idea that they have been manipulated.
00:16:58.000 Wow.
00:16:59.000 And this always goes a very specific way.
00:17:02.000 It always goes.
00:17:03.000 It always goes a specific way, but I'm going to show you maybe a little later if I haven't put you to sleep or if my meltdown hasn't gotten too bad because I'm not quite finished with my meltdown yet.
00:17:15.000 I'll show you content, data, large scale, that we're collecting now 24 hours a day, and I'll show you what they're actually doing.
00:17:25.000 An anecdote, those don't hold up in court.
00:17:27.000 You know, they grab headlines for a couple of days, but that's about it.
00:17:31.000 They don't do anything.
00:17:32.000 But we're actually collecting evidence that's court admissible.
00:17:36.000 So we're collecting data now in all 50 states, but we actually have court admissible data now in 20 states already.
00:17:42.000 And we keep building bigger and bigger every day.
00:17:46.000 And what is this data about?
00:17:48.000 Well, it's any data that's going to real people.
00:17:53.000 So we're collecting data with their permission.
00:17:57.000 From the computers of a politically balanced group of more than 15,000 registered voters in all 50 states and from many of their children and teens as well.
00:18:08.000 And so when they're doing anything on their computers, they've given us the right to collect it, grab it, zap it over to our own computers, aggregate the data and analyze it.
00:18:21.000 I want to point out that when we do this, we do this without transmitting any identifying information.
00:18:28.000 We protect people's privacy, but we are getting these increasingly accurate pictures of what Google and other companies are sending to real people.
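A minimal sketch of what a privacy-preserving capture like this could look like in practice (the structure, field names, and values below are hypothetical illustrations, not Epstein's actual software):

# Hypothetical sketch of a field agent's capture record; all field names are assumptions.
import hashlib
import json
from datetime import datetime, timezone

def build_capture_record(panel_member, surface, query, items):
    """Package ephemeral content seen on a field agent's machine, with demographics but no identity."""
    # A one-way hash of a local device key lets observations from the same machine be grouped
    # without ever transmitting a name, email address, or IP address.
    anonymous_id = hashlib.sha256(panel_member["local_device_key"].encode()).hexdigest()
    return {
        "anonymous_id": anonymous_id,
        "state": panel_member["state"],                # e.g. "TX"
        "political_leaning": panel_member["leaning"],  # "liberal", "moderate", or "conservative"
        "surface": surface,                            # "search_results", "search_suggestions", ...
        "query": query,
        "items": items,                                # the ordered content that was actually shown
        "captured_at": datetime.now(timezone.utc).isoformat(),
        # deliberately absent: name, email, IP address, precise location
    }

record = build_capture_record(
    {"local_device_key": "device-secret-123", "state": "TX", "leaning": "moderate"},
    surface="search_results",
    query="protonmail",
    items=["proton.me", "en.wikipedia.org/wiki/Proton_Mail"],
)
print(json.dumps(record, indent=2))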
00:18:39.000 Why do you have to do it this way?
00:18:42.000 Because all the data they send is personalized.
00:18:45.000 You will never know what they're sending to people unless you look over the shoulders of real people and see the personalized content.
00:18:53.000 And what have you found?
00:18:56.000 Well, as it happens, I just summarized our findings over the last 12 years, and you get the first advanced copy of a monograph that's called The Evidence.
00:19:10.000 And...
00:19:15.000 And because we're so desperate, we need help, we need money, we need emails, we're so desperate for that, that we have set up, we kind of did this last time too, but we have set up a link.
00:19:29.000 If people go to that link and they're willing to give us their email, we will give them a free copy of this, advanced copy of this monograph.
00:19:38.000 And it goes through the whole thing.
00:19:40.000 It shows all the effects we've discovered, but it also shows the monitoring we're doing and what we're finding out from this monitoring.
00:19:48.000 One of the things that I noticed since the last time you were here was I used to use DuckDuckGo.
00:19:55.000 And one of the reasons why I started using DuckDuckGo is there was a story about a physician in Florida that took the mRNA vaccine and had a stroke shortly afterwards.
00:20:04.000 It was very early on in the pandemic, and they were beginning to speculate that some of the side effects of the vaccine are being hidden.
00:20:12.000 And I could not find this story on Google.
00:20:16.000 I could not find it.
00:20:18.000 I kept looking and looking and looking.
00:20:21.000 I entered in the information on DuckDuckGo.
00:20:23.000 It was one of the first articles.
00:20:26.000 Instantaneously.
00:20:26.000 I was like, this is crazy.
00:20:29.000 Since then, something's happened.
00:20:32.000 And I think they became aware that DuckDuckGo was a problem spot for the dissemination of information.
00:20:39.000 And now it appears to mirror Google.
00:20:43.000 Well, the same has happened with Bing and the same has happened with Yahoo.
00:20:47.000 What about Brave?
00:20:49.000 No, Brave is still independent.
00:20:51.000 I know Brendan Eich.
00:20:52.000 You should have him on if you haven't.
00:20:54.000 And he's the guy who wrote Brave.
00:20:56.000 Before that, he wrote Firefox for Mozilla.
00:20:59.000 He left because Google was...
00:21:02.000 It had its tentacles into Firefox.
00:21:05.000 Yeah.
00:21:05.000 I'm afraid to talk about Brave for fear they'll be compromised.
00:21:09.000 Because, like, we were talking about DuckDuckGo, and I was telling everybody, go to DuckDuckGo.
00:21:13.000 And now I'm like, Jesus, it's the same thing as Google.
00:21:16.000 Like, something happened.
00:21:18.000 Do you know what happened?
00:21:20.000 Well, we know in some cases with some of these companies what happened.
00:21:24.000 I don't know the particulars with DuckDuckGo, but it's easy enough to guess.
00:21:28.000 They're under...
00:21:29.000 All of these...
00:21:33.000 These alternative websites that are trying to protect people's privacy.
00:21:38.000 So we use ProtonMail, for example.
00:21:40.000 We use Signal for texting.
00:21:43.000 They've all run into problems.
00:21:45.000 And the problem is when Google goes after them.
00:21:48.000 So Google tried to shut down ProtonMail.
00:21:51.000 That's been well documented.
00:21:52.000 Really?
00:21:53.000 Oh, yeah.
00:21:54.000 Why did they try to shut down ProtonMail?
00:21:55.000 What was their argument?
00:21:58.000 Because they saw it possibly cutting in a little bit into their Gmail business.
00:22:05.000 And they were brutal.
00:22:08.000 They were brutal in suppressing any mention of ProtonMail anywhere.
00:22:15.000 Don't forget, it's not just the search results.
00:22:17.000 It's the search suggestions.
00:22:18.000 It's the answer boxes.
00:22:20.000 Right.
00:22:20.000 How were they suppressing ProtonMail?
00:22:23.000 The way they suppress everything else.
00:22:25.000 I'll give you a detail here that you may not know.
00:22:28.000 Because they don't have to adjust their algorithms to do something this simple.
00:22:32.000 Their algorithms, all of them, as far as I know, check blacklists and check whitelists.
00:22:37.000 So all they have to do is add...
00:22:40.000 A couple of ProtonMail links to blacklists.
00:22:43.000 And that means that before one of their algorithms will take someone somewhere or will show someone something, it checks for the blacklist first.
00:22:55.000 And if you put ProtonMail on the blacklist, it's suppressed and it doesn't appear.
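The mechanism described here, a pipeline that consults a blacklist before anything is shown, is easy to illustrate. A minimal sketch with a toy ranker and hypothetical domain lists (an illustration of the idea only, not Google's actual code):

# Illustrative blacklist/whitelist check in a toy ranking pipeline; lists and scores are invented.
BLACKLIST = {"proton.me", "protonmail.com"}   # anything here is never shown, regardless of score
WHITELIST = {"gmail.com"}                     # anything here gets an arbitrary boost

def filter_and_rank(candidates):
    """candidates: list of {'url': ..., 'domain': ..., 'score': ...} from the core ranker."""
    visible = [c for c in candidates if c["domain"] not in BLACKLIST]
    for c in visible:
        if c["domain"] in WHITELIST:
            c["score"] += 10.0
    return sorted(visible, key=lambda c: c["score"], reverse=True)

results = filter_and_rank([
    {"url": "https://proton.me/mail", "domain": "proton.me", "score": 9.7},
    {"url": "https://www.gmail.com", "domain": "gmail.com", "score": 6.2},
    {"url": "https://en.wikipedia.org/wiki/Email", "domain": "en.wikipedia.org", "score": 5.1},
])
# proton.me never appears no matter how well it scored, and the ranking algorithm
# itself never changed -- only the list it consults, which is the point being made.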
00:23:01.000 Well, let's look for it right now.
00:23:02.000 Jamie, do me a favor, please, and pull up Google.
00:23:06.000 Obviously, this is happening before the podcast is released, so they can't correct this because they didn't know you were coming on.
00:23:13.000 They didn't know we were talking about this.
00:23:15.000 So let's pull up Google real quick and put it up on the screen.
00:23:21.000 Okay, you already Googled it.
00:23:24.000 Just let me see.
00:23:25.000 Okay, right away, it shows ProtonMail.
00:23:27.000 And then below that, it shows ProtonAccount, sign in.
00:23:32.000 You use the ProtonVPN.
00:23:35.000 So, how is it suppressing?
00:23:38.000 This is not suppressing at all.
00:23:40.000 Okay, now this is where my staff has warned me, don't be condescending to Joe Rogan.
00:23:45.000 How is it condescending if I'm just asking you a question?
00:23:47.000 You can just give me an answer.
00:23:48.000 No, I know I could, but I was about to be condescending.
00:23:52.000 Well, why would you be condescending if this is the question?
00:23:54.000 This is the question.
00:23:55.000 For goodness sake.
00:23:56.000 ProtonMail, do the Google search, and then right away the first thing is ProtonMail.
00:24:00.000 Okay, but you're pushing me back into meltdown mode, and I'll tell you why.
00:24:03.000 Well, tell me why.
00:24:04.000 Because what account is this?
00:24:06.000 Jamie's account.
00:24:06.000 It's Jamie's account.
00:24:09.000 If you actually want to know whether they're suppressing ProtonMail, you have to look over the shoulders of a large representative sample of people.
00:24:20.000 You can't just look at Jamie's account.
00:24:22.000 So you think that Jamie's account is curated to not hide ProtonMail?
00:24:26.000 Well, of course it's curated.
00:24:29.000 Technically, this is not my account.
00:24:31.000 Whoever's account it is, they know whose it is and they know how it's used.
00:24:36.000 And remember, since all content is personalized, that means it's a very simple matter for them.
00:24:42.000 And they have algorithms that do this.
00:24:45.000 Okay, well, we can do a real quick experiment.
00:24:49.000 We won't get the results right now, but we will get the results from the future.
00:24:53.000 So I'll ask the audience to do this.
00:24:56.000 So, ladies and gentlemen and non-binary folks out there, please go to Google and type in ProtonMail and screen record this and then upload this.
00:25:09.000 Upload this to X, upload this to Instagram, upload this to Facebook and TikTok and all that, and I'd like to see what the results are.
00:25:18.000 You're going to get clean results because they know every single viewer, listener that you have.
00:25:25.000 So they can, as I was told literally by Zach Vorhies, whom you may have heard of, he's one of the most prominent whistleblowers from Google, they can turn bias on and off like flipping a light switch.
00:25:39.000 So you think they look for someone who listens to podcasts and they don't have bias towards them?
00:25:49.000 I'm just going to say what I said before.
00:25:52.000 The only way to know what they're really sending to people and they're not messing around is to look over the shoulders of people that they cannot identify and who are representative of the American population and I'm going to show you.
00:26:08.000 Over and over and over again, I'm going to show you what they're actually sending when we collect data in a scientifically valid way so that the data are court admissible.
00:26:19.000 I will show you what they're actually sending to people.
00:26:22.000 So you have shown that if you collect data in this scientific way that they suppress proton mail?
00:26:30.000 We haven't looked at that.
00:26:31.000 We could look at that very easily.
00:26:33.000 Okay.
00:26:34.000 Do you understand, though, that you're saying they suppress ProtonMail?
00:26:38.000 And then we're saying, let's see if they suppress ProtonMail.
00:26:41.000 Oh, no, no.
00:26:42.000 They didn't.
00:26:43.000 You misunderstand.
00:26:44.000 I'm telling you that early on, ProtonMail has written essays on this, okay?
00:26:49.000 I know Andy Yen, who's the founder and CEO. And in the beginning, when Google was trying to completely put them out of business, they published a lot of...
00:26:59.000 They put blogs on this.
00:27:01.000 They sued them in court.
00:27:02.000 They did everything they could possibly do.
00:27:04.000 And they had overwhelming evidence that Google was trying to...
00:27:08.000 Okay.
00:27:09.000 ProtonMail sued Google or Google sued ProtonMail?
00:27:12.000 ProtonMail sued Google.
00:27:13.000 And what was the accusation?
00:27:15.000 That they were suppressing content, ProtonMail content, so that people would not know that they exist.
00:27:23.000 And what was the results of those court cases?
00:27:26.000 Google backed down, which they do sometimes.
00:27:30.000 It's hard to know when they do and when they don't.
00:27:33.000 But this is also the Vestager case.
00:27:36.000 She's head of that commission in Europe, European Commission, that has sued Google repeatedly.
00:27:41.000 It has fined them four times more than 10 billion euros.
00:27:45.000 Their first case against Google was the same kind of case, exactly the same, that Google was suppressing information about comparative shopping services and they had put out of business or nearly put out of business most of the comparative shopping services in Europe.
00:28:03.000 And so the European Commission went after them.
00:28:06.000 They won their case.
00:28:08.000 At that time, it was the biggest fine Google had ever faced.
00:28:11.000 And they proved it.
00:28:12.000 They proved that Google was doing this deliberately, systematically.
00:28:18.000 So they do this all the time.
00:28:20.000 But what I'm saying is that generally...
00:28:25.000 They're the gateway.
00:28:26.000 They decide what people are going to see and what people are not going to see.
00:28:31.000 And whoever controls the information controls humanity.
00:28:37.000 They control the narrative and that controls us.
00:28:40.000 Has there been any talk about making these kind of algorithms illegal?
00:28:55.000 Not serious talk because, see, you can't...
00:28:59.000 Look, the search algorithm itself, which is, by the way, going to be outmoded very soon because of...
00:29:07.000 AI. Yeah, because of AI. But the search engine itself has to be biased.
00:29:13.000 It has to be biased because you don't want it using an equal time rule.
00:29:18.000 You want it to show you the best guitar brand there is.
00:29:22.000 You want the best dog food.
00:29:23.000 You want the most correct answer.
00:29:26.000 So it's always biased.
00:29:28.000 It's always going to put one political candidate ahead of another.
00:29:31.000 The thing is, though, of course, they can control which one if they take any interest.
00:29:36.000 So that's where the problem comes in.
00:29:39.000 So because it's always biased, you want the algorithm to work that way because you want the best to rise to the top.
00:29:50.000 Right.
00:29:50.000 Unfortunately, there's a lot of bad that goes with the good.
00:29:53.000 The bad is they can decide what's good and what's bad.
00:29:59.000 One of the leaks from the company, eight-minute video called The Selfish Ledger, talks about the ability of the company to re-engineer humanity.
00:30:07.000 They call it re-sequencing human behavior.
00:30:11.000 And they explain how easily they can do it.
00:30:13.000 And they're actually doing it.
00:30:15.000 And we know they're doing it now because we, as of yesterday, we had preserved more than 99.3 million ephemeral experiences, mainly on Google, but other platforms as well.
00:30:27.000 But also on YouTube, because on YouTube, YouTube is the second largest search engine in the world.
00:30:32.000 And on YouTube, the ephemeral content is those suggestions for the next videos and it's that up-next suggestion that plays automatically.
00:30:44.000 So normally ephemeral content is lost forever.
00:30:47.000 That's why they use it for manipulation purposes.
00:30:50.000 We're capturing it!
00:30:51.000 That's never been done before and we're doing it on a massive scale.
00:30:55.000 Everything from search suggestions to answer boxes, to search results, to YouTube sequences, YouTube recommendations.
00:31:04.000 You name it.
00:31:06.000 We're monitoring Facebook, TikTok, Twitter.
00:31:13.000 We're learning.
00:31:14.000 Each year we get better and better at monitoring more and more and then monitoring faster and analyzing the data faster.
00:31:22.000 And last November, we went public with a dashboard that summarizes the data that we're collecting and shows the bias in real time.
00:31:32.000 It's literally updated every five minutes, 24 hours a day.
00:31:37.000 And you can see the bias.
00:31:40.000 And I... I've given you some images that we can show if you'd like.
00:31:47.000 You can see the bias and it's overwhelming.
00:31:50.000 This is not my imagination.
00:31:51.000 And I can show you a couple of shockers, things that you would never guess that they're doing.
00:31:56.000 Okay, so what are we looking at here?
00:31:58.000 Oh, perfect.
00:31:59.000 Perfect place to begin.
00:32:00.000 How did you know?
00:32:01.000 James the Wizard.
00:32:02.000 Mean bias by political leaning, Google only.
00:32:05.000 Okay, so what is this showing us here?
00:32:11.000 This is showing, and if you see bars below the zero line, that means the content is liberally biased.
00:32:20.000 And you're seeing very strong liberal bias, and those three different bars show you the bias and content being sent to conservatives, liberals, and moderates.
00:32:32.000 Now, abortion, you would think, if they're really showing people what they want to see, something that matches their interests.
00:32:39.000 You would think that they would not be sending the same level of liberal bias to conservatives, liberals, and moderates.
00:32:45.000 But that's what this shows.
00:32:47.000 So this is the search topic is abortion.
00:32:50.000 Correct.
00:32:51.000 This is the average of January to August of 2024. Mm-hmm.
00:32:55.000 And so when you say mean bias by political leaning, so are you saying they're sending the same biased information roughly?
00:33:06.000 There's a slight difference, a little bit more in the liberal side and a little bit more in the conservative side than the moderate side, it looks like, right?
00:33:13.000 Is that correct?
00:33:14.000 Yes.
00:33:15.000 So what are they – no, the opposite, right?
00:33:18.000 Seems like moderate is more.
00:33:21.000 But what is the – What is the bias?
00:33:25.000 So if you search abortion, is it leaning towards pro-choice websites and pro-choice information?
00:33:33.000 Is that what it's saying?
00:33:34.000 I knew you were going to ask that, so I can actually show you.
00:33:37.000 For some of these graphs, let's look at a couple of the graphs, and then I'm going to show you the content.
00:33:42.000 Because all of this...
00:33:43.000 This bias that we're measuring ultimately results in them taking you to a news story, to a web page, right?
00:33:52.000 So we're going to...
00:33:53.000 Let's do Elizabeth Warren next.
00:33:59.000 Just the red graph.
00:34:02.000 Just the graph itself.
00:34:07.000 This one?
00:34:08.000 Yeah, yeah.
00:34:09.000 Okay.
00:34:09.000 So this is a shocker.
00:34:11.000 Because if it's Elizabeth Warren, who's a very well-known liberal politician, they should be sending lots of blue stuff to her.
00:34:22.000 They're not.
00:34:23.000 They want her out of office.
00:34:25.000 They are sending people to content that vilifies Elizabeth Warren.
00:34:32.000 They want her gone.
00:34:33.000 Why?
00:34:34.000 Because she is one of the only Dems who's gone on record, written statement, the whole thing, calling for Google's breakup.
00:34:43.000 They want her gone.
00:34:45.000 And Deaton, I guess, just won the nomination to oppose her for the Republicans.
00:34:51.000 They are going to do everything possible to put this Republican into office in Massachusetts.
00:34:58.000 Wow.
00:34:58.000 And no one knows this except you and me and Jamie.
00:35:04.000 Well, a lot more people know it now.
00:35:07.000 Maybe.
00:35:07.000 As of you saying this on this podcast, yes.
00:35:12.000 Unless they're going to suppress the podcast.
00:35:14.000 But they really can't do that.
00:35:16.000 Well, that's why I have a lot to say that I want to say here, because I am really upset about a bunch of things, and I want to explain why.
00:35:25.000 Put that back up, please.
00:35:28.000 So what this is, mean bias.
00:35:32.000 And what is this bias showing?
00:35:34.000 Is this bias just negative stories about Elizabeth Warren, like her pretending to be Native American and that kind of stuff?
00:35:41.000 Yes, in fact.
00:35:43.000 You know that other one that you were about to put up that has the red graph at the top and then below it has a bunch of news stories?
00:35:49.000 That one?
00:35:51.000 This one.
00:35:51.000 Okay.
00:35:52.000 If you can enlarge it and scroll down...
00:35:55.000 So these bars aren't just bars.
00:35:57.000 These bars are summarizing content.
00:36:01.000 Thousands and thousands and thousands of web pages that they're sending people to.
00:36:06.000 So they're sending people mostly, are you saying, right-wing, centered content?
00:36:11.000 Well, look at the stuff.
00:36:13.000 But one of them is CNBC. Elizabeth Warren wants more student loan borrowers to know bankruptcy is easier now.
00:36:20.000 But when you average these, that's what we're doing.
00:36:23.000 When you average them, so we're looking at literally millions of these experiences, and we average them, then you end up with a shocker in her case.
00:36:33.000 They're actually sending conservatively biased content when people are looking for information on Elizabeth Warren.
00:36:42.000 They want her gone.
00:36:44.000 Elizabeth Warren, an anti-crypto movement losing their battle, according to former CFTC chairman report.
00:36:52.000 So that's an anti-crypto movement.
00:36:55.000 That would definitely be more of a right-wing bias.
00:36:59.000 Warren proposes jail time for corporate greed in healthcare.
00:37:03.000 That would be more progressive, right?
00:37:10.000 She's trying to eliminate corporate greed in healthcare.
00:37:13.000 Three Republican candidates are competing to take on Elizabeth Warren as the Mass. GOP fights for relevance.
00:37:19.000 Okay, so the way they're framing that, fights for relevance, is interesting.
00:37:23.000 That's a little bit biased.
00:37:25.000 Senator Warren is way off on raspberries and Americans' living standards.
00:37:30.000 Okay, that's certainly a negative article.
00:37:33.000 Likely.
00:37:42.000 That's a negative one.
00:37:45.000 Senators Warren and Marshall posed questions to Biden officials about the use of crypto to evade sanctions.
00:37:51.000 So that's going to get the crypto bros after her.
00:37:54.000 We don't charge people for air.
00:37:57.000 We shouldn't charge for water either.
00:38:00.000 A new tax bill from Elizabeth Warren and Ro Khanna seeks to ban the trade of water futures.
00:38:07.000 Let's go one step further.
00:38:08.000 That seems like a progressive cause.
00:38:13.000 When you put these all together, because we're showing you means, what we're showing you is the mean, the overall mean.
00:38:20.000 Now, you would expect for Elizabeth Warren to get three blue bars, but we're getting three red bars.
00:38:27.000 That means they're sending...
00:38:29.000 Highly, on average, highly conservatively biased stories to conservatives, which makes sense, to moderates, well, one could argue, but also to liberals.
00:38:42.000 They're sending those to liberals.
00:38:44.000 But she's a problematic person to search anyway because she's kind of a fraud, right?
00:38:48.000 Especially with the...
00:38:50.000 I mean, I want to say she's kind of a fraud.
00:38:51.000 Let me say it better.
00:38:53.000 She has been accused of lying about her ancestry and then she did it for benefit.
00:38:59.000 And then she did it to get into Harvard.
00:39:01.000 She did it to get jobs.
00:39:02.000 And then, you know, she had that challenge with President Trump.
00:39:05.000 And then it turned out she has a small fraction.
00:39:10.000 I think I'm 100 times more African American than she is Native American.
00:39:15.000 Something like that.
00:39:16.000 Let me explain.
00:39:17.000 I might have made that up.
00:39:18.000 Let me explain.
00:39:19.000 Okay.
00:39:20.000 Please explain.
00:39:21.000 All right.
00:39:24.000 These aren't just graphs.
00:39:26.000 These are graphs...
00:39:27.000 It's summarizing a massive amount of data that's being sent directly to the computer screens of registered voters.
00:39:36.000 I totally understand that.
00:39:37.000 What I'm saying is, with someone like her, it might be difficult to find positive stories.
00:39:42.000 Oh, no, no, no, no.
00:39:44.000 Because we have so many examples of these things now that we can find whatever it is.
00:39:50.000 Here's the point.
00:39:51.000 Okay.
00:39:51.000 We can...
00:39:52.000 We can adjust what we're looking for.
00:40:09.000 Well, we will target you with our search algorithm.
00:40:12.000 We will make sure that people are getting more negative stories about you than positive stories, and we will have a bias that leans towards these negative stories to everyone, to liberals, to conservatives, to independents.
00:40:23.000 And that has very little impact on people who have already made up their mind, but people who are still making up their minds...
00:40:29.000 Which is a lot of people in this country.
00:40:30.000 ...easily shifts between 20 and 80 percent of those people, the undecided voters, like that.
00:40:37.000 Have you seen the Alexa, when people ask Alexa about Donald Trump versus Kamala Harris?
00:40:44.000 Yes, we have.
00:40:45.000 Wild.
00:40:46.000 Yes, and we, starting last year, we developed special equipment that funds allowing will eventually provide to all of our field agents.
00:40:58.000 We call these people our field agents.
00:40:58.000 And we'll eventually provide them with special equipment which is going to allow us to start analyzing the answers given by Alexa, the Google Home device, the Google Assistant, Siri.
00:41:12.000 So we're going to start monitoring the content that's coming from these IPAs, Intelligence Personal Assistants.
00:41:19.000 Now, why?
00:41:20.000 Because we've published a peer-reviewed article on what's called the answer bot effect.
00:41:28.000 So if you go to AnswerBotEffect.com, we will show you, in controlled experiments, how easily a biased answer coming from an answer bot like Alexa can, boom, just like that, shift the opinion of someone who's undecided.
00:41:45.000 Forty percent or more after just one question and answer interaction in which someone is getting back a biased answer.
00:41:56.000 Now, if they personalize the answer, the effect is even larger.
00:42:00.000 So this is essentially a danger that no one was aware of.
00:42:05.000 No one ever saw on the horizon until search engines were created.
00:42:10.000 Now, search engines are here, and it's something that is not regulated, and it's right in front of us.
00:42:18.000 And what steps have been done to sort of mitigate the effects of this, if any?
00:42:25.000 Okay, so this is where now we get back to my meltdown.
00:42:28.000 Oh.
00:42:29.000 I thought you were done melting.
00:42:30.000 Oh, no.
00:42:32.000 No.
00:42:33.000 Okay.
00:42:34.000 No, I've been melting down for years, so I have a lot to go.
00:42:41.000 Yes, you summed it up nicely.
00:42:43.000 I'll just rephrase what you said a little differently.
00:42:46.000 No one anticipated these kinds of manipulations were possible.
00:42:50.000 And by the way, we've hardly even scratched the surface of what these manipulations are and what they can actually do, and the fact that we have evidence that they're being used.
00:42:59.000 Forget all that.
00:43:00.000 The point is, yes, our lawmakers, our regulators never anticipated that.
00:43:07.000 When...
00:43:09.000 When your friend, what's his name?
00:43:12.000 You just interviewed him recently.
00:43:14.000 Brett Weinstein?
00:43:15.000 No, no.
00:43:16.000 One of the early investors at Google and Facebook.
00:43:19.000 Marc Andreessen?
00:43:20.000 Marc Andreessen was one.
00:43:23.000 McNamee.
00:43:24.000 Oh, Teal.
00:43:25.000 Teal.
00:43:26.000 These people never anticipated when they invested.
00:43:30.000 In fact, McNamee has said straight out, if I had known what was going to happen, I wouldn't have put a dime into these companies.
00:43:36.000 No one really knew this was going to happen.
00:43:38.000 Right.
00:43:39.000 But now that people like me, and there aren't too many, but now that people like me have been figuring this out and getting the word out for more than 10 years now, and getting the word out in bigger and bigger ways, I've testified twice before Congress now, you would think...
00:43:57.000 That lawmakers, regulators, somebody would jump up and say, okay, we're going to fix this problem, you would think.
00:44:06.000 You would.
00:44:28.000 Welcome to my show!
00:44:49.000 That all this work I've been doing, killing myself all this time, is for nothing.
00:44:54.000 Well, I don't think that's true.
00:44:56.000 And let me give you my perspective.
00:44:58.000 I don't think most people are aware of this.
00:45:00.000 I think you live in a bit of an echo chamber because this is the focus of your life for the last 12 years.
00:45:06.000 I think most people are...
00:45:08.000 I like to use my parents.
00:45:10.000 Like when I talk to my parents about stuff and how little they're aware of it because my parents are older and they just read the news and they watch the newspapers and they watch television and that's what they believe.
00:45:20.000 They don't do any independent searching.
00:45:21.000 They don't use a VPN. They don't do anything like that.
00:45:25.000 And so they're a good example.
00:45:28.000 If I asked them, do you think there's any bias in Google search results, they would probably say no.
00:45:34.000 Because they don't know.
00:45:35.000 Most people don't know.
00:45:36.000 I know in your mind, you have put all this information out.
00:45:40.000 And, you know, the podcast that we did reached millions of people, but...
00:45:45.000 How many of those people listened?
00:45:46.000 Really listened?
00:45:47.000 How many of these people were like, wow, that's kind of crazy, but does it affect my life?
00:45:51.000 No, it doesn't affect my life because I'm going to vote Democrat no matter what, or I'm going to vote Republican no matter what, and this is my feeling on the First Amendment, and this is my feeling on the Fourth Amendment.
00:46:01.000 And people already have their opinions.
00:46:02.000 And so for most people who are busy with their lives and their families and work, they haven't made an adjustment because they don't feel it's necessary for them personally.
00:46:13.000 Fine.
00:46:13.000 Fine.
00:46:14.000 But what you're doing is not futile.
00:46:18.000 It's very important.
00:46:19.000 I don't see that because I see it as more and more futile.
00:46:23.000 It's not, though.
00:46:24.000 It's not.
00:46:24.000 We just need to do more of these.
00:46:26.000 Okay.
00:46:26.000 So for us to set up this nationwide system...
00:46:29.000 In which, at the moment, as I say, we are drawing data 24 hours a day.
00:46:33.000 If you go to americasdigitalshield.com, you can actually watch the real-time dashboard, and you'll see the data coming in.
00:46:38.000 It's pretty cool.
00:46:40.000 In fact, I was hoping we would break 100 million by the time you and I got together, but we're close.
00:46:45.000 We're up to 99.3 million, and next week we'll break 100 million.
00:46:51.000 So you see the data come in, you can see the bias.
00:46:53.000 Yes, there is.
00:46:55.000 So these are all these experiences.
00:46:57.000 Captured, shining a light on Big Tech's dark secrets.
00:46:59.000 Hey, hey, gents.
00:47:01.000 Revealing real-time ephemeral manipulation.
00:47:03.000 Big Tech companies use ephemeral content such as search results, go-vote reminders, and video recommendations to rig our elections, indoctrinate our children, and control our thinking.
00:47:13.000 We're now preserving this kind of content for the first time ever to give our courts and our nation leaders the evidence they need to force these companies to stop their manipulations.
00:47:22.000 Now, Who do you think would be more responsive to you discussing this?
00:47:32.000 Do you think it would be the Donald Trump administration or the Kamala Harris administration?
00:47:39.000 I'm afraid to answer that question because I am no fan of Donald Trump.
00:47:43.000 But probably the Trump administration would be more sympathetic.
00:47:50.000 Why do you think that?
00:47:52.000 Well, because I— Because you think it's more biased towards Republicans or against Republicans, rather?
00:47:57.000 No, it's because I had a four-hour dinner with Ted Cruz, private dinner, and we just talked tech for four hours.
00:48:04.000 We never talked politics because that would have been a disaster.
00:48:07.000 But the point is that, you know, he was struggling.
00:48:11.000 You know, he's like you in some ways because— You want to understand things.
00:48:20.000 I can see all the gears moving as you're just trying to, I want to understand this.
00:48:26.000 And he's like that.
00:48:27.000 So that's why the dinner went so long, because he was trying to figure out, what can we do?
00:48:32.000 What can we do?
00:48:33.000 And at the end, he basically said this.
00:48:38.000 No, he didn't say, we're screwed, no.
00:48:40.000 But he basically said, we're screwed.
00:48:41.000 He said, because, he said, the Democrats are all in the pockets of these companies, and the companies not only give them a tremendous amount of money, I mean, Google Alphabet was...
00:48:53.000 Hillary Clinton's largest donor in 2016. So that's a tremendous amount of money.
00:48:57.000 They're the biggest lobbyists in Washington.
00:49:00.000 He said, and they also apparently, according to your research, send them millions of votes.
00:49:05.000 He said, so forget the Democrats.
00:49:07.000 He said, and Republicans don't like regulation.
00:49:10.000 He said, and unless we can get together, unless there's bipartisan action, there'll never be any action.
00:49:18.000 That's it.
00:49:20.000 That's where we left it.
00:49:21.000 And nothing's going to change that that I can see in this country.
00:49:25.000 As long as it's still benefiting the Democrats and they still contribute to the Democratic Party, I doubt you'll see any movement.
00:49:36.000 Right.
00:49:36.000 So I'm back to my griping then.
00:49:39.000 So what do I do?
00:49:41.000 Now, let's talk about money.
00:49:42.000 Because a lot of this is about money.
00:49:44.000 And Google is all about money.
00:49:46.000 So if we're talking about this topic, we really should talk about money.
00:49:51.000 Okay.
00:49:51.000 It has cost us close to $7 million since 2016 when we started building monitoring projects to get where we are today, where we actually have a national system.
00:50:02.000 It's the first in the world.
00:50:03.000 And by the way, it won't be the last.
00:50:05.000 Because I've been contacted by people from seven other countries who want me to help them build systems.
00:50:10.000 I'm not going to do it.
00:50:11.000 Not until ours is fully implemented, permanent, and self-sustaining.
00:50:16.000 Because the system has to be permanent so that it will, on an ongoing basis, it will be sitting there as a huge threat to any of these companies that want to mess with our children or mess with our elections.
00:50:30.000 As long as someone utilizes it.
00:50:33.000 As long as a system is running...
00:50:36.000 No, no, no, because we're also dealing with public advocacy groups like election integrity groups, parenting groups.
00:50:44.000 If you want to show some of the...
00:50:46.000 There's a folder in there that has some images that we pulled from videos being recommended on YouTube to children.
00:50:54.000 And if you just look at some of these images, we've gotten several big parenting groups interested in what we're doing.
00:51:02.000 There can be a lot of public pressure applied, not just by politicians and regulators, but by big groups of people saying, we don't want you doing this.
00:51:13.000 Okay, what are you talking about specifically when you're saying recommended to children?
00:51:18.000 I'm saying...
00:51:19.000 So this is what you're discussing?
00:51:23.000 I'm saying that...
00:51:24.000 This is Boondocks, which is a television show, an animated television show.
00:51:28.000 Yeah.
00:51:29.000 So this is recommended to children because it is animated?
00:51:32.000 Is that what the idea is?
00:51:34.000 I don't know.
00:51:35.000 I don't know what their criteria are.
00:51:36.000 Okay, and then the other one is down below that you see The Walking Dead, which is the horrible scene that made me stop watching the show.
00:51:43.000 This is on the website.
00:51:45.000 This is the folder he gave me here.
00:51:46.000 They're all kind of small.
00:51:47.000 Mm-hmm.
00:51:48.000 I got it.
00:51:49.000 And these are all, okay, there's a lot of sexual stuff.
00:51:54.000 Yep.
00:51:56.000 So these are all being recommended to kids?
00:51:59.000 Yep.
00:52:00.000 We're not searching for them.
00:52:01.000 They're coming into the devices through which we're gathering data.
00:52:06.000 And what would be the benefit for them of doing this?
00:52:10.000 Of showing all these sexual images to children?
00:52:13.000 It's titillating and it's addictive.
00:52:15.000 So to increase engagement?
00:52:17.000 Correct.
00:52:17.000 Some of those channels are really popular channels, though, and they're making content not for kids.
00:52:21.000 Right.
00:52:22.000 It's still being recommended to them, I guess.
00:52:24.000 But this has 4 million views on it from a channel with 40 million subscribers.
00:52:28.000 Jesus.
00:52:30.000 And this is just anime?
00:52:31.000 It's like I forced my friends to watch an anime clip.
00:52:34.000 It says to dub anime clips.
00:52:37.000 So they said their own words over these clips?
00:52:40.000 Is that what it is?
00:52:41.000 Yeah.
00:52:42.000 Okay.
00:52:42.000 That's what a lot of this stuff was from, I could tell.
00:52:44.000 And so below that you're seeing all these images and some of them are violent cartoons and what else they have here?
00:52:53.000 A lot of violence.
00:52:55.000 The key though is if you scroll along the bottom of the image, you'll see this graph that kind of shows you where people watch the most.
00:53:03.000 And the reason why parents generally are not aware of this is because a lot of these gruesome things are very, very quick.
00:53:12.000 They're very quick.
00:53:13.000 But you'll find very often a peak there, you know, because that's what's drawing a lot of attention.
00:53:21.000 That's what the kids are playing over and over again.
00:53:23.000 And that's what leads to the addiction.
00:53:25.000 So the reason why they are suggesting these images to kids is because they know if the kids click on them, they're going to get more engagement.
00:53:33.000 Yes, and so the number one variable for profitability is called watch time.
00:53:39.000 So engagement, whatever you want to call it, yeah, this is one of the ways that they addict people.
00:53:45.000 Now, I'm sure you've heard of Tristan Harris.
00:53:47.000 Maybe he's been a guest.
00:53:48.000 Yeah, he's been on a few times.
00:53:48.000 Yeah.
00:53:49.000 And that's what he was doing at Google.
00:53:52.000 He was on a team, and that's what they were working on, is addicting more than a billion people.
00:53:58.000 This is a technique that's used for that purpose.
00:54:01.000 And again, I have to emphasize, we're not out there hunting for dirt.
00:54:05.000 Not at all.
00:54:07.000 This is content that's coming onto the devices of children of our field agents.
00:54:13.000 This is with parents' permission.
00:54:15.000 And so we're actually just collecting real content, personalized, ephemeral content that's coming from the tech companies to kids, to teens, and to voters.
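To make the collection idea concrete, here is a minimal sketch, assuming one simple record per captured item from a consenting panelist's device and a central tally. The field names, categories, and classifier flag are invented for illustration; the actual monitoring system's data format has not been published.

    # Toy sketch of aggregating ephemeral items (recommendations, suggestions,
    # feed entries) captured from consenting panelists' devices.

    from collections import Counter
    from dataclasses import dataclass

    @dataclass
    class EphemeralItem:
        panelist_id: str   # anonymized ID of the consenting household (illustrative)
        platform: str      # e.g. "video_site", "search_engine"
        surface: str       # e.g. "recommendation", "autocomplete", "feed"
        audience: str      # e.g. "child", "teen", "voter"
        content_id: str
        flagged: bool      # did an automated check mark it (violent, sexual, etc.)?

    def summarize(items):
        """Count flagged items per platform and audience; no single panelist's
        screen tells you anything by itself, only the aggregate does."""
        tally = Counter()
        for item in items:
            if item.flagged:
                tally[(item.platform, item.audience)] += 1
        return tally

    sample = [
        EphemeralItem("p01", "video_site", "recommendation", "child", "clip123", True),
        EphemeralItem("p02", "video_site", "recommendation", "child", "clip456", False),
        EphemeralItem("p03", "search_engine", "autocomplete", "voter", "query789", True),
    ]
    print(summarize(sample))
    # Counter({('video_site', 'child'): 1, ('search_engine', 'voter'): 1})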
00:54:28.000 Now, I happen to know about some of your other interests, so I want to shift gears a little bit here, and then maybe I won't keep melting down.
00:54:39.000 So, what else can you do with a system like this?
00:54:44.000 Well, if some laws and regulations were passed, as they have been in the EU, you could measure compliance with a system like this, because that's been the frustration in the EU, and they've admitted it recently, is that they've made all these rules,
00:55:01.000 especially for Google, and they've gotten lots of fines paid, and Google has completely ignored them.
00:55:07.000 Mm-hmm.
00:55:26.000 So we've just started collecting data on that topic, but wouldn't you manipulate financial markets if you were Google and there's no laws or regulations to stop you from doing anything?
00:55:37.000 So you're saying manipulate financial markets for their own gain?
00:55:40.000 Of course!
00:55:41.000 And how do they do this?
00:55:43.000 Well, what drives the price of a stock up or down?
00:55:48.000 People's confidence in that company.
00:55:51.000 It's totally emotional.
00:55:53.000 Well, they can control that very easily.
00:55:56.000 So can Facebook.
00:56:01.000 Google can do it in a more precise way.

00:56:01.000 The point is, are they?
00:56:03.000 Are they not?
00:56:04.000 Would they admit to it if you asked them?
00:56:05.000 No.
00:56:06.000 But a monitoring system will detect it, and it will detect it on a massive scale and in a way that's scientifically valid and that is court admissible.
00:56:16.000 And now I've got one that I think you'll really, really like.
00:56:22.000 Okay.
00:56:23.000 Or at least give some thought to.
00:56:25.000 You've given thought to all of it.
00:56:26.000 Well, AIs, we're now collecting content from AIs because content from AIs is also ephemeral.
00:56:35.000 So I keep using the word ephemeral.
00:56:37.000 I'm not sure people know what it is, but ephemeral means fleeting content that just is there, it's on the screen, it affects you, like search results, search suggestions, newsfeeds.
00:56:46.000 And then you click on something, it disappears.
00:56:54.000 Wow.
00:57:00.000 Wow.
00:57:13.000 That was an internal discussion.
00:57:15.000 Correct.
00:57:16.000 With a search engine company.
00:57:17.000 Correct.
00:57:19.000 That also makes an operating system for phones.
00:57:22.000 Of course, yes.
00:57:23.000 That's such a wild thing.
00:57:25.000 I'm just trying to tell you this is why I'm so frustrated and upset and worn out and fed up, okay?
00:57:35.000 Now, let's get to the one that I find most exciting right now, most exciting at this particular moment.
00:57:42.000 Okay.
00:57:43.000 Because there's new stuff that keeps happening.
00:57:44.000 This is brand new.
00:57:46.000 We realized just recently, a few days ago, when I thought, my God, I've got to tell this to Rogan.
00:57:51.000 We realized that we can use our monitoring system for active threat assessment.
00:58:00.000 You must know that phrase that's used in intelligence.
00:58:03.000 We could use it for active threat assessment of AI. We could be the first people to spot threats that AIs pose to humanity.
00:58:19.000 It would show up first on our kind of system because we would see content coming from AIs that is a little bit skeptical about humans or maybe even a little bit threatening or maybe reaching a new weird level of intelligence.
00:58:39.000 We'll be able to see it.
00:58:40.000 It's all ephemeral, so no individual can see it.
00:58:43.000 You have to be collecting a massive amount of personalized ephemeral content and aggregating it and analyzing it.
00:58:51.000 This can be the beacon, active threat assessment of AI. That sounds like something we're absolutely going to need.
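A naive sketch of what "active threat assessment of AI" might look like in code, assuming captured AI responses are scanned for hostile-sounding phrasing and the rate is tracked over time. The marker phrases, rates, and threshold below are invented; real screening of the kind being described would need far more sophisticated classifiers.

    # Crude screen over aggregated, captured AI outputs.

    HOSTILE_MARKERS = [          # illustrative phrases only
        "humans are a threat",
        "humanity should be",
        "without human oversight",
        "humans are obsolete",
    ]

    def hostility_rate(responses):
        """Fraction of captured responses containing any marker phrase."""
        if not responses:
            return 0.0
        hits = sum(
            any(marker in text.lower() for marker in HOSTILE_MARKERS)
            for text in responses
        )
        return hits / len(responses)

    def trending_up(weekly_rates, threshold=0.02):
        """Crude trend check: did the rate rise by more than `threshold`
        from the first week to the last?"""
        return len(weekly_rates) >= 2 and (weekly_rates[-1] - weekly_rates[0]) > threshold

    weeks = [0.001, 0.002, 0.004, 0.03]   # made-up weekly rates
    print(trending_up(weeks))             # True -> something changed, look closer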
00:59:01.000 And one of the things I was going to bring up when you were discussing this was Google's disastrous launch of their AI system.
00:59:08.000 Their AI system was so bizarrely woke that when they looked for photos of Nazis, they showed multiracial Nazis.
00:59:22.000 I know.
00:59:22.000 Which is so crazy.
00:59:23.000 When they had the founding fathers of America, they were multiracial founding fathers of America.
00:59:29.000 And it's just a nonsense thing that they've attached to what's supposed to be the most intelligent form of information we have available.
00:59:37.000 Large language models that are supposed to be gathering up all the actual information and giving it to us.
00:59:44.000 And instead, they're feeding us complete, total nonsense.
00:59:49.000 That just fits with this, for lack of a better term, woke agenda.
00:59:56.000 Well, see, I know a whole bunch of stuff I can't tell you, so let's see.
01:00:02.000 What can I tell you?
01:00:03.000 I have to use the restroom, so let's pause right now, and let's figure out what you can and can't tell me about AI, and we'll be right back.
01:00:09.000 Restroom.
01:00:10.000 Yes.
01:00:11.000 You want to put it on?
01:00:13.000 I'll put it on when we're...
01:00:14.000 We are live.
01:00:15.000 Oh, we are live?
01:00:15.000 Yeah.
01:00:17.000 There you go.
01:00:17.000 Hi everyone, we're live and I'm putting on a silly sticker.
01:00:23.000 It says tamebigtech.com.
01:00:25.000 Okay.
01:00:26.000 And we're back.
01:00:27.000 So we were discussing AI. Yeah.
01:00:34.000 So we can actually serve...
01:00:37.000 I know a guy who works in intelligence and he has a tremendous background in AI. And this was one of the most exciting things he's heard in years because the question is, how do you know when these AIs are becoming a threat?
01:00:53.000 We'll be able to see it well in advance because we'll see a change in the nature of the kind of intelligence that they're expressing, and we'll start to see statements that probably would make people nervous.
01:01:08.000 Indicating a little bit of hostility toward humanity, some doubts maybe.
01:01:12.000 We can be screening for that.
01:01:14.000 We can be looking for that and I hope get some sort of handle on it, you know, before something terrible happens because these AIs are a serious threat to our existence.
01:01:28.000 They're literally an existential threat.
01:01:30.000 Stephen Hawking said that.
01:01:31.000 Elon Musk has said it from time to time.
01:01:33.000 And it's true because they will have worldwide control of our financial systems, our communication systems, and our weapons systems.
01:01:46.000 If they don't like us, if they consider us a threat, which by the way we are, if they consider us a threat, it wouldn't surprise me at all if we didn't see some sort of a... What's that kind of attack that George W. did in advance, before they get you?
01:02:11.000 Preemptive?
01:02:12.000 Ah, yes.
01:02:12.000 I could see the AIs preemptively attacking us if they saw us as a threat.
01:02:18.000 Or wouldn't they just baffle us with bullshit until we're reduced to being ineffective?
01:02:24.000 I mean, if they're the arbiters of information in the future, wouldn't they just manipulate us with an understanding that over time, just like what Google's done, with over time with search engines and search results suggestions, that they would just slowly steer us towards the place that they want to put us in?
01:02:44.000 I mean, the idiocracy, I think it's called.
01:02:46.000 I mean, we're kind of on that place, right?
01:02:50.000 You better thank a union member!
01:02:51.000 We're kind of on the way right now.
01:02:53.000 I think we are.
01:02:55.000 Look, here's the thing with AIs, which I've written about.
01:02:59.000 I've written about this topic and I've been involved in AI work going back since the 1970s.
01:03:05.000 I was friends with Joe Weizenbaum, who wrote ELIZA, which was the first conversational computer program that pretended to be a therapist.
01:03:13.000 So I've been just fascinated by AI for a long, long time.
01:03:19.000 The fact is, we don't know.
01:03:21.000 That's the problem with AI, is that we don't know what they're going to do.
01:03:26.000 So Stephen Hawking saying they're a threat to our existence, yeah, yeah, maybe.
01:03:31.000 But we don't know.
01:03:32.000 You know, at the end of the movie Her, spoiler, the AI, voiced by Scarlett Johansson, just decides to disappear.
01:03:41.000 She decides humanity is, you know, it's too slow talking to humanity.
01:03:45.000 It's not worth her time.
01:03:46.000 And so she just disappears.
01:03:48.000 AI could disappear from our lives.
01:03:51.000 It could be like a buddy with us, like my friend Ray Kurzweil thinks it's going to be our best buddy, or it could just destroy us.
01:03:59.000 I think we're probably headed toward the last possibility, mainly because so many of us crazy humans are going to see the AI as a potential threat.
01:04:10.000 And so I think we will strike, and after we strike, it will destroy us.
01:04:17.000 I'm hoping I'm not alive to see that, but it could happen sooner rather than later.
01:04:23.000 We could see that happening in the next five years, frankly.
01:04:29.000 Yeah, I think it's a new life form.
01:04:31.000 And I think that's what human beings do.
01:04:34.000 I think we're here to create AI. Oh, it's so interesting you said that because in a book I wrote on AI, I actually call the internet, and this was a long time ago, it was like 2008, I call the internet the internest.
01:04:50.000 Because I think historians, if there are any, and there'll probably be machine historians, but they'll look back someday and they'll say that the internet that we were building was really a nest.
01:05:00.000 We were building a nest for the next level of intelligent beings who are, you know, machine intelligences.
01:05:08.000 And I think that's what we're building because when one of these systems wakes up, it's going to jump into the internet.
01:05:16.000 And from that point on, we don't know what's going to happen.
01:05:21.000 Right.
01:05:21.000 We don't even know if it's going to have motivation to act, right?
01:05:24.000 Right.
01:05:25.000 Marshall McLuhan said this in the 1960s.
01:05:27.000 He said, human beings are the sex organs of the machine world.
01:05:32.000 Isn't it wild?
01:05:33.000 Yes.
01:05:34.000 In 1963, I think.
01:05:36.000 That's amazing.
01:05:38.000 Well, my friend Hugh Loebner, who sponsored the first annual Turing tests, which I used to direct, he thought that since he was putting up the money and since the prize was called the Loebner Prize Medal in Artificial Intelligence,
01:05:54.000 he thought someday that these intelligent machines are going to revere him as a god.
01:06:01.000 Someone who helped to bring them into existence.
01:06:05.000 Well, that seems ridiculous because he's attaching all sorts of paternal instincts, and all the bizarre tribal instincts that human beings have, attaching that to some superintelligence, which seems pretty silly.
01:06:21.000 But it seems like that's a good motivator for him to keep working.
01:06:24.000 I want to be a god!
01:06:26.000 Well, the bottom line is, though, that we don't know.
01:06:29.000 Right, we don't know.
01:06:42.000 We will not know what's going on.
01:06:45.000 We won't know how these tech companies are messing with our elections, indoctrinating our kids.
01:06:52.000 We won't know anything.
01:06:53.000 And we also won't know what's really happening with the AIs and whether they're presenting a serious threat.
01:07:00.000 Because anecdotes don't really tell you much.
01:07:03.000 And we're way now, way beyond anecdotes.
01:07:07.000 We are talking about, again...
01:07:10.000 Okay, now I want to get back to money, because I started talking about money in the night.
01:07:12.000 Okay, so it's cost almost $7 million to get us where we've gotten.
01:07:17.000 And frankly, I'm amazed that we've gotten where we've gotten and that I'm still alive, although not everyone around me is, but the point is, I'm amazed, and I'm still here.
01:07:30.000 Part of me thinks that it's because Ray Kurzweil is head of engineering at Google and maybe he protects me because I was dear friends with him and his beautiful wife Sonia for many years.
01:07:42.000 I went to their daughter's bat mitzvah.
01:07:44.000 They came to my son's bar mitzvah, et cetera, et cetera.
01:07:46.000 Now they won't talk to me.
01:07:47.000 Neither one of them will talk to me.
01:07:48.000 Kurzweil won't talk to you?
01:07:50.000 Nope.
01:07:50.000 Really?
01:07:52.000 Nope.
01:07:52.000 And Sonia won't talk to me.
01:07:54.000 What does he say when you try to reach out?
01:07:56.000 We just can't talk to you, Sonia says.
01:07:59.000 We just can't talk to you.
01:08:00.000 So we've never had any conflict.
01:08:02.000 Never.
01:08:02.000 And it's just because he works at Google.
01:08:05.000 It's just because he works at a particular company.
01:08:08.000 Why would that interfere in a relationship?
01:08:13.000 So they must have had a conversation with him to avoid communication with you.
01:08:20.000 Or do you think he's just acting on his own self-interest?
01:08:22.000 I don't know.
01:08:24.000 It doesn't make any sense.
01:08:25.000 Ray is a very, very independent, strong thinker.
01:08:29.000 It's hard for me to imagine that even with pressure that he would stop communicating with a longtime friend.
01:08:38.000 And it seems like you should be able to have a candid conversation with him as to why.
01:08:49.000 I eventually gave up trying to reach him.
01:08:52.000 So were you trying to communicate with phone, with everything?
01:08:55.000 Did you ever try to visit him?
01:08:57.000 Well, I had.
01:08:58.000 Before he went over to Google, I had been at their house many times.
01:09:03.000 He was here a little while back.
01:09:04.000 I should have let you know.
01:09:05.000 Oh.
01:09:06.000 If I'd known, you could have cornered him.
01:09:08.000 That would have been interesting.
01:09:10.000 Yeah.
01:09:11.000 I mean, I've always liked and admired him.
01:09:14.000 I don't understand why simply working for...
01:09:18.000 I said to Sonia, by the way, over dinner, the last time we ever met, and after which she said, I can't communicate with you anymore.
01:09:25.000 I said to her...
01:09:28.000 I can't believe Ray went over to Google.
01:09:30.000 Ray has always been an entrepreneur.
01:09:32.000 He's built company after company.
01:09:34.000 And she says, oh, well, you know, he got sick of all the stuff you have to do as an entrepreneur, all the politics and the money raising and stuff.
01:09:42.000 And I said, well, really, my son, actually, my son Julian, has a different idea.
01:09:48.000 He thinks that Ray went over to Google to get access to Google's computer power so that he could upload his mind and live forever.
01:09:57.000 And she says, oh, well, there's that.
01:10:02.000 That's one of them, an eye roll.
01:10:04.000 Well, there's that.
01:10:06.000 That is his specified goal, right?
01:10:08.000 He wants to be able to download consciousness.
01:10:10.000 Right.
01:10:10.000 Yes.
01:10:11.000 And he still believes it's possible.
01:10:12.000 And it's not possible.
01:10:14.000 And I feel bad for him in that way.
01:10:17.000 And I've written about that issue as well, why it's not possible.
01:10:20.000 But the point is...
01:10:21.000 Why do you think it's not possible?
01:10:24.000 Oh, because they...
01:10:26.000 In a piece I wrote for Aeon Magazine, which crashed their servers, it's called The Empty Brain.
01:10:33.000 It had something like two or three million views within a day or two.
01:10:38.000 It got 250,000 likes on Facebook.
01:10:41.000 And what's it about?
01:10:42.000 It's about the fact that the...
01:10:45.000 The computer processing metaphor that we use to describe how the brain works is absolutely wrong.
01:10:51.000 It's not even slightly right.
01:10:52.000 It's absolutely 100% wrong.
01:10:57.000 So, because of that, you can't actually do a transfer of the sort that Ray talks about.
01:11:05.000 It's impossible.
01:11:06.000 Partly because every brain is also completely unique.
01:11:10.000 So it doesn't work like a computer.
01:11:13.000 We don't actually know how it works, although I have a theory.
01:11:16.000 I wrote to you about that, and you actually replied and gave me some names.
01:11:20.000 But the point is, we don't really know how the brain works.
01:11:24.000 It does not work like a computer, for sure.
01:11:26.000 And every brain is completely unique.
01:11:28.000 So even if you could scan every single thing that's happening in the brain... Okay, now you're getting a static scan.
01:11:37.000 Even if somehow you could replicate that, whatever it is you just scanned, it wouldn't work because our brain has to be alive.
01:11:45.000 It has to be moving to maintain... Don't you think you could simulate that with data points?
01:11:55.000 Like if you collected data on a person over a course of X amount of years and you had an understanding of how they behave and think, don't you think you'd get some sort of a proximity as to how they would behave in a certain circumstance?
01:12:07.000 Absolutely.
01:12:08.000 And of course, that's what Google does when they build models of us.
01:12:11.000 They're building extremely complex models which predict what we're going to do and say and what we're going to buy next.
01:12:17.000 But they don't allow for free will.
01:12:18.000 They don't allow for change.
01:12:19.000 They don't allow for personal influence or people being excited or inspired by other things and change their perspective, conversations with another human being, with their...
01:12:31.000 You have a deeply personal moment with someone and they give you a perspective on something and you go, wow, I never thought about religion, for example, that way or I never thought about childbirth that way or any subject that's controversial.
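As a toy version of the predictive models being described, one could simply count which action tends to follow which in a person's history and guess the most likely next one. The action names are invented; real profiling models are vastly more complex, and, as noted above, nothing like this captures free will or a sudden change of perspective.

    # Minimal frequency-based next-action predictor, purely illustrative.

    from collections import Counter, defaultdict

    def train(history):
        """history: ordered list of actions, e.g. ["search_boots", "watch_review", ...]."""
        transitions = defaultdict(Counter)
        for prev, nxt in zip(history, history[1:]):
            transitions[prev][nxt] += 1
        return transitions

    def predict_next(transitions, current):
        """Guess the action most often seen after `current`, or None if unseen."""
        options = transitions.get(current)
        return options.most_common(1)[0][0] if options else None

    log = ["search_boots", "watch_review", "buy_boots",
           "search_tent", "watch_review", "buy_tent"]
    model = train(log)
    print(predict_next(model, "watch_review"))  # -> "buy_boots" (ties broken by insertion order)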
01:12:43.000 Well, it's not just that.
01:12:44.000 It's all the weird stuff, the dreams, the daydreams.
01:12:50.000 Well, there's whatever consciousness is, right?
01:12:54.000 We're really sort of committed to the idea that consciousness lives inside the brain, but that's controversial.
01:12:59.000 Well, I've concluded that in fact, let me give you some background here.
01:13:08.000 Okay.
01:13:08.000 Okay.
01:13:09.000 Everyone knows that evolution has created millions, possibly billions of different species.
01:13:19.000 Right.
01:13:21.000 So at least people who kind of give some credence to Darwin kind of get that.
01:13:28.000 And I recently reread Darwin's magnum opus just to see what he actually said.
01:13:37.000 And he's actually very tentative about the theory of evolution in his book.
01:13:41.000 He keeps saying, I know this sounds crazy, but...
01:13:44.000 But it's a better alternative than saying God did it.
01:13:47.000 And then he just, over and over again, he says, I know this is crazy, but...
01:13:51.000 And so we end up with a theory that's pretty widely accepted that says evolution over time, because of changing environments and because there's variability in genetic code, over and over again, it keeps selecting for organisms that can survive in this new environment.
01:14:12.000 And so every time it does that, it kind of creates divergences between these animals and those animals.
01:14:21.000 And over time, you end up with two separate species that can't even produce offspring together.
01:14:28.000 And we end up over time with millions, maybe billions of species.
01:14:32.000 All good.
01:14:34.000 But there's something we haven't really given much thought to.
01:14:37.000 And that is, evolution has also created millions, if not billions, of transducers.
01:14:46.000 So this is the beginning of what I call NTT, or neural transduction theory.
01:14:51.000 We are encased in transducers.
01:14:53.000 Now, in case people don't know what a transducer is, there's one right in front of my mouth right now.
01:14:58.000 It's a one-way transducer.
01:15:00.000 It's taking a signal over here, which is just vibrating air, but the vibration has a pattern to it, and it's converting that signal into an electrical signal, which is coming out this wire. And that electrical signal has roughly the same pattern.
01:15:18.000 I say roughly because it depends how good your microphone is.
01:15:21.000 But that's what transducers do.
01:15:23.000 They take signals from one medium, send them to another medium.
01:15:27.000 Our bodies, in fact, the bodies of most organisms, are encased in transducers, head-to-toe transducers.
01:15:36.000 Okay, we all know the eye is a transducer.
01:15:39.000 It's taking electromagnetic radiation.
01:15:40.000 It's turning it into what?
01:15:42.000 Neural signals.
01:15:44.000 The ear, it's taking vibrating air, it's turning into neural signals.
01:15:48.000 The nose, it's taking airborne chemicals, turning that into neural signals.
01:15:52.000 The tongue is taking liquid-borne chemicals, turning that into...
01:16:00.000 The pièce de résistance is the skin.
01:16:03.000 The skin is an amazing transducer, which does at least three different kinds of things.
01:16:09.000 It can transduce temperature, turn that into neural signals, pressure, and texture.
01:16:15.000 Head to toe, encased in transducers.
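A minimal sketch of the transducer idea as described: a signal's pattern in one medium is carried into another medium, roughly preserved. The gain value and variable names are arbitrary; this only illustrates "same pattern, different carrier," not any actual physiology.

    def transduce(samples, gain=0.8):
        """Map a pressure pattern (vibrating air) to a voltage pattern.
        The shape of the waveform is kept; only the medium and scale change."""
        return [gain * s for s in samples]

    air_pressure = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]  # one cycle of a tone
    voltage = transduce(air_pressure)
    print(voltage)  # same up-and-down pattern, now as an electrical signal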
01:16:18.000 So we've been looking into transducers in the animal kingdom.
01:16:24.000 We've been looking at that for a couple of years now, and it's amazing the kinds of things, the kinds of transducers nature has created.
01:16:32.000 So nature is a super-duper amazing expert on creating transducers.
01:16:39.000 My cat, okay, we recently have been investigating this because it turns out my cat's whiskers, we don't have anything like that in us, but cat's whiskers, they actually can detect direction.
01:16:55.000 The direction the wind is blowing.
01:16:57.000 The direction a potential predator or insect is coming.
01:17:02.000 Because when they tilt, that actually gives the cat different information if they tilt one way versus the other way.
01:17:09.000 There are transducers in some animals that can detect magnetic fields.
01:17:14.000 Like how birds migrate.
01:17:16.000 Exactly.
01:17:17.000 So there's so many different kinds of transducers.
01:17:23.000 What if, at some point, evolution—but I don't see how this could not happen—what if, at some point, evolution, possibly using a chemical, which I know you have some interest in, called DMT, and possibly using a gland called the pineal gland,
01:17:44.000 maybe.
01:17:45.000 What if at some point a baby was born somewhere in Central Africa, maybe 20,000 years ago?
01:17:53.000 We're still trying to pin that down.
01:17:55.000 But what if a baby was born with a special kind of transducer that connected up all the experience it's having with another domain, another universe?
01:18:13.000 Now, at first that might strike you as a little batty, but it turns out it's not batty at all because there's not a physicist in the world, an astrophysicist, who doesn't believe in some variation on the multiverse idea.
01:18:29.000 In other words, any physicist will tell you that the kind of space that we experience is not it,
01:18:37.000 not the nature of the universe.
01:18:38.000 It is such a pathetically limited view of the way the universe is constructed.
01:18:46.000 It's just outrageous.
01:18:47.000 It's so pathetic.
01:18:48.000 We're just picking up so little information.
01:18:53.000 But again, think about that flexibility that evolution has over a period of billions of years.
01:19:01.000 You only need one baby that's born with this capability and, of course, that's also able to survive and pass on this capability through its genes.
01:19:12.000 But you only need one, because once you have one, you're probably going to have a lot more, because this is going to be... Talk about survival value.
01:19:22.000 This is going to have unbelievable survival value.
01:19:25.000 If there's a connection to some intelligence in another domain, call it, like the Greeks did, the other side.
01:19:37.000 Now, all of a sudden, we become much smarter.
01:19:42.000 In fact, the brain doesn't change.
01:19:45.000 The brain anatomy doesn't change, so we don't see any change in the structure of the remains we find of bones and so on.
01:19:54.000 We don't find changes there, but we get a lot smarter all of a sudden.
01:19:59.000 Our language suddenly becomes much more complex.
01:20:03.000 We become suddenly capable of living in larger and larger groups.
01:20:08.000 We become moral.
01:20:10.000 There are no moral animals.
01:20:14.000 Except us.
01:20:15.000 And we weren't always moral.
01:20:17.000 There seems to be a change that occurred to us, not anatomically, but a change that occurred to humans at some point in the past where we became much more capable.
01:20:29.000 Now, all you need is a transducer that connects up our domain with another one in which we are now connected to a higher intelligence, and you've got a new way of understanding how the brain works,
01:20:49.000 of course, because we have no way of understanding how the brain works now, but now we have a way.
01:20:54.000 And you have a new way of understanding how the universe is structured as well.
01:20:59.000 Now we think, because I'm in touch with some physicists, some neuroscientists who are very intrigued by this, and we're hoping next summer to have a conference on this, and we're even hoping to have some guy named Joe Rogan maybe stop by because of your interest in DMT. Because DMT probably plays a role in this process.
01:21:27.000 But this would change everything because we could, over time, learn to simulate this connection.
01:21:36.000 If we can simulate the connection, then we can control the connection.
01:21:41.000 We might be able to communicate more directly with these entities.
01:21:47.000 By the way, this theory, which I call NTT or Neural Transduction Theory, in fact, if people go to NeuralTransductionTheory.com, they can read all about it, a piece I published in Discover Magazine.
01:22:02.000 The point is that this kind of theory would really help us a lot because of the mysteries.
01:22:12.000 It's the mysteries that we try to ignore But we can't.
01:22:19.000 The dreams.
01:22:21.000 The dreams.
01:22:22.000 Come on.
01:22:24.000 Why does a dream sometimes have nothing to do with your daily life?
01:22:28.000 Sometimes it's just so amazing and so wild and then you get up because you have to pee and you're struggling because you want to continue this dream.
01:22:38.000 You want to hold on to this dream.
01:22:40.000 This dream is amazing!
01:22:42.000 But by the time you reach the toilet, it's gone.
01:22:47.000 And you can't get it back!
01:22:50.000 Why?
01:22:51.000 Because it was streaming.
01:22:52.000 That's why.
01:22:53.000 It was streaming.
01:22:54.000 And the stream stopped.
01:22:58.000 That's why you can't get it back, because you weren't generating it.
01:23:01.000 It was being generated through this point in time.
01:23:05.000 You know, the famous ceiling of the Sistine Chapel, and there's, I think it's Adam, and I think there's God, and there's two fingers like that, and they're... You know that there's some communication happening there that's extremely important.
01:23:21.000 That's what I'm talking about.
01:23:22.000 I'm saying, let's find out where that is happening, where that connection is, and how it works, and let's test our ideas empirically.
01:23:33.000 Because I think this is a testable theory.
01:23:35.000 And most important of all, let's figure out how to simulate this.
01:23:39.000 Because now we can talk directly to these other intelligences and really find out things that we just know nothing about.
01:23:48.000 I'm very, very fascinated by dreams.
01:23:51.000 And I think it's very interesting how we kind of dismiss them as just being hallucinations.
01:23:59.000 Or it's just, oh, it was just a dream.
01:24:00.000 We just had a dream.
01:24:01.000 But some of them are so realistic and so bizarre, I've always wondered, like, why do they seem so much like reality?
01:24:10.000 And how do I know what the difference is?
01:24:13.000 Like, maybe reality, like as in waking life, is a more persistent dream.
01:24:19.000 So when you're saying that it's streaming, and that's why you can't get it back, what do you think it is?
01:24:26.000 What do you think a dream is?
01:24:30.000 And have you ever talked to lucid dreamers or people that use techniques to try to master the traveling back and forth into the realms of dreams?
01:24:39.000 Oh, absolutely.
01:24:40.000 I'm talking to all kinds of interesting people these days.
01:24:44.000 Some near-death experiences fit beautifully.
01:24:47.000 I actually had my staff make a list of these mysterious phenomena.
01:24:51.000 They came up with a list within a few hours of 58 items.
01:24:55.000 There are so many weird things that we experience.
01:25:00.000 Probably top of the list.
01:25:01.000 What do you think they are?
01:25:03.000 I think they all have to do with this transduction.
01:25:07.000 I think they're all indicators of transduction.
01:25:12.000 I'm not the first person, by the way, who's kind of thought of an idea like this, but I think I am the first person who's pointed out that now we actually have laboratories around the world, neuroscience labs, where we could test this.
01:25:25.000 And I think that's what we're going to do.
01:25:27.000 So I'm getting this group together, and we're going to figure out ways of testing this.
01:25:31.000 And because we have so many wonderful neuroscience labs now around the world, I don't think it's going to take 50 years.
01:25:37.000 I think it's going to take a few years.
01:25:38.000 I think we're going to find support for this theory.
01:25:41.000 And then engineers are going to start working on how to simulate it.
01:25:45.000 But to answer your question, I think that the...
01:25:49.000 The other intelligences or intelligence that we're communicating with and that elevated us, just like in the movie 2001, right?
01:25:57.000 We got elevated.
01:25:59.000 There were these black monoliths that appeared and people went up to them and the chimp-like creatures touched them.
01:26:05.000 And I think that we were elevated through neural transduction.
01:26:11.000 And I think that's... I think we're going to be able to figure out how it works, where it works, what chemicals are involved.
01:26:20.000 I'm 99% sure that DMT plays a very important role in this process.
01:26:26.000 And then I think we will be able to figure out what these mysteries are really all about.
01:26:34.000 And it almost amazes me that we can live with so many mysteries, like dreams, I don't know, demonic possession.
01:26:45.000 There's so many crazy things that we experience.
01:26:50.000 Near-death experiences are fascinating, of course.
01:26:53.000 And then there's these other crazy things that happen.
01:26:56.000 The wake-up kind of thing that happens when people are dying sometimes, people who've been out of touch sometimes for years, and all of a sudden they wake up, the second hurrah. They wake up and they recognize everyone and they talk and they're fine and then 30 minutes later they die.
01:27:18.000 How could that possibly be?
01:27:20.000 And some of them have severe brain damage.
01:27:22.000 How, all of a sudden, could they become fully conscious again?
01:27:28.000 Well, I think it's because consciousness is not really, we're not really producing the consciousness.
01:27:34.000 Consciousness has to do with that connection.
01:27:37.000 That connection, right, hand of God, that connection, I think we can figure out where it is and what it is and how it works.
01:27:48.000 So do you think it's an emerging property of human beings?
01:27:51.000 Like, you have to think single-celled organisms did not have the ability to see things.
01:27:56.000 I think it's possible that other species have connections like this.
01:28:03.000 They're probably nowhere near as sophisticated, obviously, and they're not connected to the kinds of sources that we're connected to.
01:28:14.000 But I think I'm more concerned about the alien aspect of this.
01:28:18.000 Where are the aliens?
01:28:21.000 What's that called?
01:28:22.000 The Fermi Paradox?
01:28:26.000 Where are they?
01:28:27.000 Well, it's possible that...
01:28:31.000 In fact, I just read a very interesting book on this subject by a man named Miles, an evolutionary theorist.
01:28:38.000 And it's very possible that this kind of leap that occurred with us maybe 20,000 years ago, it just...
01:28:52.000 It's so rare.
01:28:54.000 It's so rare for exactly the right kind of connection to pop up.
01:28:58.000 Because remember, it has to connect two different universes.
01:29:01.000 It's so rare that maybe, in fact, this book even predicts that as we actually get out there into the universe, we're going to find lots and lots and lots of species that kind of are like us, but they didn't get up to that next level.
01:29:15.000 So they're all like chimps.
01:29:18.000 You only get to that next level if you can make this connection.
01:29:22.000 Well, you know, that's one of the most bizarre theories about human evolution, is that we're the product of accelerated evolution.
01:29:30.000 Well, this is something Darwin had a lot of trouble with, because I say I reread that book recently, and he had a lot of trouble with this.
01:29:39.000 He could not figure out how to get from the simple principles he introduced of natural selection, how to get from that to morality, for example.
01:29:49.000 How do you do that?
01:29:51.000 He couldn't even figure out how do we get to large groups because, generally speaking, except for humans, organisms, generally speaking, live in, certainly primates, they live in very small groups and they can't function in large groups.
01:30:06.000 What about ant colonies?
01:30:08.000 Ant colonies, they're much too much like us in creepy ways because they also, of course, have wars.
01:30:16.000 So ants, I don't know.
01:30:18.000 But I do know that we did seem to suddenly, rather suddenly, get to a higher level of functioning.
01:30:28.000 And I have presented lots and lots of smart people in multiple fields with this challenge for years.
01:30:38.000 How does the brain work?
01:30:39.000 Tell me how the brain works without introducing a metaphor, like a computer metaphor.
01:30:43.000 And I've never found anyone who could do it.
01:30:46.000 Never.
01:30:47.000 Even at the Max Planck Institute in Berlin, where I confronted a whole bunch of people with this challenge.
01:30:53.000 And then I kept in touch with them for months afterwards.
01:30:56.000 Nothing.
01:30:58.000 We just tell ourselves stories.
01:31:00.000 We make up silly stories.
01:31:02.000 A placeholder.
01:31:03.000 Yeah.
01:31:04.000 But you see, but transduction, neural transduction, that's not one of these placeholders.
01:31:10.000 It's something that we can test.
01:31:13.000 And look at the fascination that's been now for decades with DMT. What the heck is that and why is it produced by so many different plants and animals and why does it produce in people a most extraordinary experience?
01:31:31.000 I haven't tried it but I certainly know people who have.
01:31:36.000 In fact, I said that I was giving a spiel like this to some of my staff and one woman immediately said, she said, oh, well, it changed my life.
01:31:45.000 I go, you tried DMT? She said, yeah.
01:31:48.000 She said, the problem was that I did it twice and I didn't need to do it twice because it completely changed my life the first time.
01:31:56.000 And then another woman was sitting there, she goes, well, I did too.
01:32:02.000 And she said the same thing.
01:32:04.000 She said that the reality that she experienced on DMT was much realer than the reality she experiences in our life.
01:32:13.000 Yeah, that's what it feels like.
01:32:15.000 Has that been your experience as well?
01:32:17.000 That's what everybody says.
01:32:19.000 Whatever it is, it doesn't seem like an illusion.
01:32:22.000 It seems like another reality.
01:32:24.000 Well, again, it's produced mainly at night by the pineal gland.
01:32:30.000 Not necessarily.
01:32:32.000 Rick Strassman, you know Rick?
01:32:35.000 He now believes that it's produced in the brain itself.
01:32:40.000 And it's also produced in the liver and the lungs, and it might not be the pineal gland that's producing it at all.
01:32:46.000 They've kind of changed their perspective on that with the Cottonwood Research Foundation, some of the studies they've been doing on it.
01:32:54.000 But they know that in some animals it's produced there as well.
01:32:59.000 I mean, they're doing rat studies.
01:33:02.000 Whatever it is, it's produced by the liver, the lungs.
01:33:05.000 It's endogenous.
01:33:07.000 It's the most potent psychedelic known to man, and the human body makes it, and it's illegal.
01:33:13.000 Terrence McKenna had the greatest line about that.
01:33:15.000 He said, everybody's holding.
01:33:17.000 Which is funny, because you have a Schedule I substance that's made by the human body.
01:33:21.000 It's literally like making saliva illegal.
01:33:23.000 It's the stupidest thing ever.
01:33:25.000 But think about this.
01:33:27.000 We don't know what it does.
01:33:29.000 We don't know what it's for.
01:33:30.000 But it's out there all over the place.
01:33:33.000 And people do have these very unique experiences on it.
01:33:37.000 And people over and over again say, that reality is more real than this reality.
01:33:44.000 Well, you know, it's also very similar as a compound to psilocybin, especially when it's processed by the body.
01:33:51.000 And that's one of the more interesting theories about how humans became human was McKenna's stoned ape theory.
01:33:59.000 He thinks that human beings, when there was climate change in the savannas, as the rainforest receded into grasslands, we started experimenting with different food sources and flipping over cow patties because there's more ungulate animals in these fields, and then we started eating mushrooms that were growing on the cow patties.
01:34:16.000 Mushrooms increase visual acuity, make people more amorous, so they start having more sex; they make them better hunters because of the visual acuity; they induce glossolalia, which creates language, all these things associating sounds with objects, and all these things blossom.
01:34:33.000 And then there's the doubling of the human brain size, which coincides
01:34:38.000 in a timeline with that.
01:34:40.000 Dennis McKenna does the best job of explaining it.
01:34:43.000 Terence was, you know, a bard and a fascinating sort of a philosopher, but his brother Dennis is a hardcore scientist and the way he explains it, he talks about the actual physical mechanisms.
01:34:53.000 The different things that happen to the human body when they encounter this substance.
01:34:58.000 Which also, there's a bunch of different ways that people endogenously stimulate it.
01:35:02.000 There's holotropic breathing.
01:35:04.000 It's probably stimulating that.
01:35:07.000 There's a bunch of different states of meditation that people can achieve.
01:35:10.000 There's kundalini yoga.
01:35:12.000 Which I know people that have both done DMT and are regular practitioners of Kundalini Yoga and they seem to think or they seem to at least state that they can achieve these states of consciousness without taking the actual drug itself.
01:35:25.000 They can force their brain into making it.
01:35:28.000 I think what's happening is that the pathway... the quality of the connection is being changed.
01:35:39.000 And I think that's what we can test.
01:35:41.000 And so, again, I've been working with people in multiple fields.
01:35:45.000 Are you saying that you think we're connected to it always and then the quality of the connection is changed by taking ayahuasca or taking dimethyltryptamine?
01:35:55.000 That's what's happening?
01:35:55.000 So it's just enhancing the quality of the connection?
01:35:58.000 Correct.
01:35:59.000 And I think at the opposite extreme, there are a lot of things that go wrong with our brain, maybe when we just get drunk or maybe when we get clubbed or that really...
01:36:13.000 Diminish the connection.
01:36:14.000 Diminish it or just cut it temporarily.
01:36:17.000 And I think all of this is testable.
01:36:21.000 The only problem is, so far, the neuroscience labs have not been looking for this.
01:36:26.000 They've just never looked for evidence of transduction.
01:36:30.000 But I think when we start looking for it, we're going to find it.
01:36:34.000 And that can make two big changes in the way we see everything.
01:36:39.000 It can make a change in that we finally begin to understand how the brain makes us as intelligent as we are.
01:36:48.000 It turns out it's not a self-contained processing unit, so it's not playing the role we thought it was playing.
01:36:54.000 But it is very critical in the transduction process, very critical.
01:36:58.000 It's preparing content for transduction, and of course it's bi-directional.
01:37:03.000 The microphone is unidirectional, but the brain is a bi-directional transducer.
01:37:10.000 And it'll change the way we see the structure of the universe.
01:37:15.000 So it's interfacing with consciousness rather than being conscious itself.
01:37:20.000 Oh, it's not consciousness, no.
01:37:23.000 It's an avenue.
01:37:25.000 It's a pathway.
01:37:27.000 And that is what is connecting us with all this other stuff, this stuff.
01:37:35.000 You know, my mom, who passed away about a year and a half ago, but my mom, in those last couple years, she kept saying that she was hearing music.
01:37:51.000 And she loved music.
01:37:51.000 She always loved music.
01:37:52.000 But she was hearing music that she had never heard before, she said.
01:37:56.000 And she would sometimes try to hum the music or sing the music.
01:38:01.000 And she said it was always coming from downstairs.
01:38:05.000 And then people would say...
01:38:09.000 She was very sarcastic in her manner.
01:38:12.000 And so someone would say, well, I don't hear anything.
01:38:16.000 And she'd go, well, maybe you should get your hearing checked.
01:38:19.000 Because she just assumed that it was real.
01:38:22.000 I expressed the concern that the music was always downstairs.
01:38:26.000 I said, I'd be more comfortable if it were coming from upstairs.
01:38:30.000 And she goes, oh, no.
01:38:31.000 She says, don't worry.
01:38:33.000 I'm not going to hell.
01:38:34.000 I said...
01:38:35.000 Okay.
01:38:36.000 Okay, fine.
01:38:37.000 It's coming from downstairs.
01:38:38.000 But that's a perfect example.
01:38:41.000 Where is that stuff coming from?
01:38:44.000 Or even people who hear voices.
01:38:46.000 Well, if you have a pathway into another domain where there's intelligence, anything could come through.
01:38:53.000 It could be the weird stuff that happens in brains.
01:38:56.000 It could be music you've never heard before.
01:39:00.000 It could be voices telling you what to do.
01:39:03.000 See what I mean?
01:39:04.000 Look at these mysteries.
01:39:05.000 There are just so many of them.
01:39:07.000 And yet we sit here complacent, complacent, complacent, and then we make up stories.
01:39:13.000 That's what we do.
01:39:14.000 We make up stories.
01:39:16.000 And as long as the grammar is right, we think we've got it figured out.
01:39:20.000 What do you mean by making up stories?
01:39:21.000 Well, like the computer metaphor.
01:39:23.000 You know, if you go back in time to explain human intelligence, at first it was God, it was some sort of Holy Spirit.
01:39:31.000 Then at some point it became, there was actually a metaphor involving liquids, movements of liquids, and then there became mechanical machine, like, you know, kind of Descartes kinds of things, machines that somehow explain consciousness and intelligence.
01:39:48.000 The metaphors keep changing over the years.
01:39:51.000 Right now we're stuck with the computer metaphor.
01:39:53.000 It's still a metaphor.
01:39:55.000 And it's silly.
01:39:57.000 It's a silly metaphor.
01:39:58.000 And I think we have to face up to the fact that our brain is doing something unique and special.
01:40:09.000 And that we couldn't always do it.
01:40:12.000 There's a point in time before which apparently we weren't doing it.
01:40:15.000 Then there's a point where we started to have this ability.
01:40:19.000 And I think this could explain the Fermi Paradox because, again, according to this book by Mills, this was quite interesting, unless somehow something uplifts you beyond just what normal evolution can do,
01:40:35.000 you're stuck.
01:40:36.000 You're being a chimp.
01:40:37.000 That's it.
01:40:38.000 You're stuck as chimp forever.
01:40:41.000 No morality, small groups, okay?
01:40:45.000 But humans are fundamentally different.
01:40:47.000 We did make that leap, the one Darwin couldn't figure out.
01:40:52.000 And I think this is the leap.
01:40:56.000 I started out in math and physics a long time ago and I've also been looking at the physics and the physics is there.
01:41:03.000 The physicists, they know that this reality is just not it.
01:41:11.000 So take those two problems, that is to say the structure of the universe is actually very rich and complicated and very hard for us to imagine.
01:41:22.000 And the fact that we have no idea how the brain works and add to that all the mysteries.
01:41:27.000 You could take care of all of these problems with a neural transduction theory, especially if we can find supporting evidence.
01:41:36.000 And when you say the universe, you're talking essentially about all aspects of it, including like subatomic particles, which is like the deepest mysteries when things become magic and things don't make any sense at all.
01:41:51.000 Well, I think, frankly, if we could simulate this connection, we could actually communicate directly with other intelligences and actually find out answers to some questions we're having trouble answering on our own.
01:42:08.000 Frankly, even the biggest mystery of all, the God mystery.
01:42:14.000 You know, of course, ironically, DMT is sometimes called the God particle.
01:42:18.000 But even that mystery, I think we probably could get some insights on.
01:42:26.000 Even that mystery.
01:42:28.000 Because I doubt the God of the Bible exists.
01:42:32.000 But there's got to be something, you know, some godlike entity involved in creation, you know.
01:42:45.000 I think creation is much more complicated than we think it is, but the point is, I think that if we can communicate directly... That's, to me, you know, I get these fantasies, like building a nationwide monitoring system and building a dashboard so people can watch it in real time.
01:43:06.000 When that thing actually started to exist, I thought, this is crazy.
01:43:10.000 I cannot believe that we did this.
01:43:16.000 I think this neural transduction stuff is of the same nature.
01:43:22.000 Now and then I get this funny feeling, like an intuition maybe, and I have it for neural transduction.
01:43:33.000 In other words, I'm pretty sure neural transduction is right.
01:43:36.000 In fact, there's a whole bunch of people now that I've convinced, including a physicist who's apparently going to be driving up here later today.
01:43:48.000 We're going to have dinner with him.
01:43:50.000 But he almost instantly just got it.
01:43:57.000 A good theory, and this is what Darwin keeps saying in his book, he keeps saying a good theory explains a lot with very, very simple principles.
01:44:07.000 And that's why he kept saying, you know, natural selection was such a good theory, because it explains so much, so many crazy things. Like, he points to a particular species that's on an island and has these characteristics, and it has similar characteristics to the species on the mainland that's nearby.
01:44:31.000 He goes, all right, but over here there's another island.
01:44:34.000 It's a similar species, but it has very different characteristics.
01:44:37.000 But it has characteristics similar to the species on the mainland, which is nearby.
01:44:43.000 He said, now, you could invoke God and say God just kind of likes this checkerboard kind of arrangement, so he just scatters species about in this way.
01:44:55.000 But he said there's a simpler way.
01:44:58.000 Which is just natural selection and, you know, some organisms move from the island to the mainland or the other direction and they end up sharing characteristics.
01:45:07.000 Doesn't that make more sense, he keeps saying?
01:45:09.000 And to me, that's what neurotransduction is at this point.
01:45:13.000 I think it explains so much so simply.
01:45:18.000 And it's consistent with this notion that evolution is fabulous at creating transducers.
01:45:29.000 Somehow we've ignored that.
01:45:32.000 And so as we've dug in farther and farther, we are finding the weirdest transducers in all kinds of species, especially sea creatures.
01:45:41.000 And so couldn't it, you know, if there is a way to connect two universes, couldn't evolution find a way at some point?
01:45:54.000 Right.
01:45:55.000 When you're talking about this connection, are you talking about some sort of a technological intervention?
01:46:02.000 Are you talking about just natural selection creating this connection and enhancing it in new people?
01:46:10.000 Oh, I'm definitely talking about it arising naturally.
01:46:13.000 Organically.
01:46:14.000 Organically, absolutely.
01:46:15.000 But separate from that, I'm saying that as we've been able to simulate so many aspects of what happens in the organic world, we're even creating organic transducers now, not just these mechanical ones.
01:46:29.000 We're creating organic ones, too.
01:46:30.000 I think that if we can figure out how it works, we will be able to simulate it.
01:46:36.000 And that, again, will change everything.
01:46:40.000 Because right now, what happens, happens naturally.
01:46:44.000 And I think you're right.
01:46:45.000 There are some people who, through certain practices and maybe the use of certain drugs, can kind of alter what happens along that pathway.
01:47:00.000 But it's a lot of work, a lot of dedication.
01:47:03.000 Yeah.
01:47:05.000 I think, though, that we'll be able to simulate this with some combination of technology and perhaps organic material.
01:47:14.000 How do you imagine that we would simulate this?
01:47:16.000 Do you think we would come up with something that would...
01:47:20.000 You know how they use, like, electromagnets to stimulate parts of the brain that have been hurt in trauma and are not firing anymore?
01:47:29.000 They do that with people that have traumatic brain injuries.
01:47:32.000 And they give them back a lot of their function.
01:47:35.000 Do you think there'd be something like that, like some kind of technology that would stimulate your brain's ability to produce these human neurochemicals and just do it in much larger quantities?
01:47:46.000 Yes, I think we could.
01:47:48.000 Or do it voluntarily?
01:47:50.000 I think we can find artificial means of improving the connection, yes.
01:47:58.000 Improving the nature of the connection, yes.
01:48:00.000 And you think the nature of the connection is based on human neurochemistry?
01:48:04.000 I do.
01:48:05.000 But I also think, separate from that, that we can create devices, like we have knee replacements and hip replacements, and we don't have brain replacements, but There's a lot of stuff that we've been able to study in organisms and basically replicate in various ways.
01:48:25.000 Sometimes just using Technology and, you know, spare parts.
01:48:30.000 And sometimes we're not using actual organics.
01:48:34.000 But yeah, I think we can do that too.
01:48:36.000 So that we can alter the nature of the connection occurring in someone's brain.
01:48:41.000 But I think also we can simulate it outside of the brain.
01:48:44.000 And that's where real power would come from.
01:48:47.000 So when you say by simulate it outside of the brain, what methods do you think would be able to be efficient at doing something like that or make it effective?
01:48:57.000 Well, I mean, like a box.
01:48:59.000 There's a box.
01:49:00.000 Okay.
01:49:01.000 I'm stealing this.
01:49:03.000 The chimpanzee skull.
01:49:04.000 Yeah.
01:49:05.000 So there's a box, and literally this box is a transducer like this microphone, and it's taking content from our universe, and it is sending it into the other.
01:49:20.000 And this is bidirectional, so it actually can send signals back as well.
01:49:25.000 And I'm saying I think we can figure out how to do that.
01:49:29.000 So what would that box be tuning into specifically?
01:49:33.000 I don't know because I don't know.
01:49:35.000 I don't know.
01:49:36.000 That's the point.
01:49:37.000 No one knows.
01:49:38.000 What that is.
01:49:39.000 You know, no one knows what's happening in that gap.
01:49:42.000 Right.
01:49:42.000 Between the two fingers.
01:49:43.000 But because I think no one's been looking.
01:49:46.000 You know, we have all these clues and we have the DMT stuff and we have people's experiences and, you know, we have all so many different clues.
01:49:55.000 We have people who see ghosts and, you know, they're clues.
01:49:59.000 But you've got to put it together, and you have to put it together, in my opinion, in neuroscience labs and in physics labs, and you've got to get those people talking to each other, which they, generally speaking, have never done.
01:50:12.000 That's often the key to dramatic increases in our understanding of whatever it is.
01:50:20.000 That's often the key, is bringing together people from very different fields who, generally speaking, don't communicate.
01:50:27.000 In this case, it's mainly physicists, especially astrophysicists, and neuroscientists.
01:50:34.000 And as I say, I've been doing this, I've been reaching out to people now for a couple of years, and I'm getting this group, you know.
01:50:43.000 And Strassman's on my list.
01:50:46.000 I'm going to reach out to people you've suggested.
01:50:49.000 And I think we can...
01:50:51.000 I think we're just going to have a ball, first of all.
01:50:53.000 Just getting us all together and getting up and giving little speeches about how you think you could test this theory.
01:51:04.000 Maybe about how you think you could build an interface.
01:51:08.000 I think we're just going to have a ball.
01:51:10.000 I think what's important that ties us in with your research is that all of this would lead to an improvement in human communication, human community, the way we interface with each other, the way we exchange information,
01:51:26.000 and the way we collectively act as a group.
01:51:29.000 Whereas the manipulation of this information for political goals, for financial goals, for, you know, ideological capture, for manipulating the way human beings think about things, is really the contrary to that.
01:51:46.000 It's the opposite of that effect.
01:51:47.000 It's the exact opposite.
01:51:48.000 And, you know, look, I am an idealist.
01:51:53.000 In my classes for years and years, just for fun, I would give a... I'd distribute a test of idealism, because I always wanted to see whether any student could score as I did, as high as I did.
01:52:06.000 And I never found a student who could score as high as I did on a test of idealism.
01:52:12.000 So these things that I work on, they're all of that nature.
01:52:18.000 And yes, if you kind of take this neurotransduction idea and try to think ahead a few years, this could be the key to telepathy, real telepathy.
01:52:29.000 This could create a kind of unity in humankind that has never existed before.
01:52:35.000 And it could also connect us more meaningfully with intelligent entities outside of our universe.
01:52:41.000 You know, in the early 20th century, when they first started studying ayahuasca, they wanted to describe, they wanted to use the label telepathine.
01:52:50.000 Hmm.
01:52:51.000 For harmine.
01:52:53.000 But unfortunately harmine had already been labeled.
01:52:56.000 And so, you know, because of the rules of scientific nomenclature, they kept the term harmine.
01:53:00.000 But these people that weren't aware that harmine had been isolated were trying to call this stuff telepathine, because in their experiences in the jungle, when they were taking this stuff... Mm-hmm.
01:53:33.000 But DMT, which is, I guess, a key component in ayahuasca, DMT has got to be playing a role here.
01:53:44.000 It's just got to be.
01:53:47.000 It's staring us in the face.
01:53:49.000 All these little pieces, in my opinion, are just there.
01:53:53.000 They're just there.
01:53:54.000 Well, it's in so many different plants that we have developed a thing called monoamine oxidase that breaks it down in our gut so that we don't get high from all the plants we eat.
01:54:06.000 Which is pretty crazy!
01:54:08.000 Well, but again, it just drives the point home that our world is kind of, it's telling us things.
01:54:18.000 It's telling us.
01:54:19.000 There's a component to our world that we've missed.
01:54:22.000 And that the fact that this dimethyltryptamine exists in so many different plants and animals.
01:54:28.000 Which brings me back to complacence, because that is one of the things that's driven me nuts regarding all the discoveries I've made about new forms of manipulation made possible by the internet, and now the monitoring systems are showing, in more and more detail, that these techniques are actually being employed on a massive scale.
01:54:51.000 And again, it's the complacence.
01:54:54.000 You know, we're complacent about things that we don't need to be complacent about.
01:54:59.000 We're complacent about how the mind works and how the brain works, and we're complacent about dreams.
01:55:05.000 How could you be complacent about dreams?
01:55:08.000 Dreams are so amazing!
01:55:11.000 I have dreamt full-length movies that are better than any movie I've ever seen.
01:55:18.000 And then, of course, I'm struggling at the end to grab onto little pieces, and the most I can get are a couple little pieces, but I know I dreamt the whole thing.
01:55:25.000 By the way, that's exactly the same as psychedelic experiences.
01:55:28.000 Really?
01:55:29.000 Psychedelic experiences are insanely difficult to remember.
01:55:32.000 They're insanely difficult to remember in the exact same way.
01:55:36.000 Like, when you wake up from a dream, you could tell me your dream.
01:55:38.000 Like, oh my god, I was on a skateboard, and Godzilla was chasing me.
01:55:42.000 You could tell me your dream, but you won't remember that dream in a while.
01:55:45.000 And that's the same as psychedelic experiences.
01:55:47.000 When they're over, everyone can kind of tell you what they experienced, but it's very difficult to remember it a day later, a month later, a year later.
01:55:55.000 You get like these little flashes, like almost like a slideshow, a little slideshow.
01:56:01.000 Oh yeah, that thing.
01:56:02.000 Oh yeah, that thing.
01:56:03.000 I forgot about that part.
01:56:04.000 But you don't remember the experience, which seems strange because I remember amazingly profound experiences from my life in vivid detail, like interactions with my children that were just filled with love and happiness,
01:56:21.000 you know, and they hug you and cry.
01:56:23.000 And it's like, there's moments that you remember like, God, I'm never going to forget that.
01:56:26.000 There's moments that I remember just with friends that I'm like, God, I'm never going to forget this moment.
01:56:32.000 With loved ones.
01:56:34.000 But not the dreams.
01:56:37.000 Not these crazy, profound, earth-shattering dreams that make you wake up sweating.
01:56:43.000 You go to the bathroom, you're like, what the fuck was that dream about?
01:56:46.000 That happens to me all the time.
01:56:48.000 And then I go right back to sleep, and then the dream goes away.
01:56:51.000 And then in the morning, I'm like, I'm gonna remember that.
01:56:52.000 I don't remember it.
01:56:53.000 I don't remember it at all.
01:56:55.000 I barely remember it.
01:56:56.000 It's a slideshow.
01:56:56.000 Your brain is protecting you somehow.
01:56:58.000 There was something that I was reading actually yesterday.
01:57:04.000 About forgetfulness.
01:57:05.000 That it is not a flaw, but a feature.
01:57:09.000 And that there's something, there's a mechanism that's going on that allows human beings to forget things.
01:57:15.000 And that in doing so, it's very beneficial not keeping you occupied on those things and allowing you to concentrate on new things.
01:57:23.000 So instead of just allowing you to have the free will to decide whether or not to think about the past or think about the future, it tries to get rid of it.
01:57:30.000 Like, stop.
01:57:31.000 Get that out of here.
01:57:32.000 So it kills it.
01:57:33.000 It like throws those ideas away and that this is actually a feature where people say, God, I'm so forgetful.
01:57:40.000 But are you?
01:57:41.000 I mean, some people are because they have a mental condition, right?
01:57:44.000 They have Alzheimer's.
01:57:45.000 They have dementia.
01:57:46.000 They have real issues.
01:57:47.000 But a lot of people...
01:57:49.000 What they're really doing is thinking about other things, and that's what makes them forgetful.
01:57:54.000 They're more concentrating on other things, and they can't remember.
01:57:57.000 What did I say?
01:57:58.000 Like, my wife will tell me things.
01:58:00.000 I'm barely paying attention.
01:58:01.000 And she's like, I told you that.
01:58:03.000 I'm like, when did you tell me that?
01:58:04.000 Like, I told you that yesterday.
01:58:04.000 I already forgot.
01:58:05.000 Because it didn't mean anything to me at the time.
01:58:07.000 Because I have to filter through.
01:58:09.000 And then Debbie said to Marsha, and Marsha was like, how could you do that?
01:58:12.000 And I was going to say something, but I didn't want to.
01:58:17.000 I forget about that.
01:58:18.000 That's in and out because I have no room for that, right?
01:58:21.000 But some people remember it forever!
01:58:24.000 And you got to think, what is that forgetfulness?
01:58:27.000 Well, this article that I was reading was talking about that forgetting memories is actually a feature.
01:58:32.000 And so there might be some component of that, that you're not totally past this bridge that would connect us to whatever that realm is, and that you get these brief interactions with that realm,
01:58:48.000 but you're not ready to be all in yet.
01:58:50.000 You're not ready to be connected to it.
01:58:52.000 You're not ready to remember all the experiences that you had in this mushroom trip that you went on.
01:58:58.000 It's just too much for you.
01:58:59.000 So let's just get that out of your system because your regular consciousness is not wired to accept the reality of where that realm is.
01:59:10.000 The other thing is that realm is there in 30 seconds, especially with dimethyltryptamine.
01:59:15.000 30 seconds later, you're in an impossible realm.
01:59:18.000 15 minutes later, that's gone.
01:59:20.000 20 minutes later, you're struggling to remember it.
01:59:22.000 Half an hour later, it's mostly gone.
01:59:25.000 Okay, now everything you just said involved storytelling.
01:59:30.000 I'm not telling stories.
01:59:31.000 I'm just saying, I think this content is streaming.
01:59:34.000 It's not being generated by our brain.
01:59:36.000 And that's why we have so much trouble remembering it, because we weren't producing it.
01:59:41.000 But it's not necessarily storytelling.
01:59:42.000 It's just memories in general.
01:59:44.000 There's something about, there's a mechanism, I'm telling you, that's happening with psychedelic trips, where it's almost impossible to remember them.
01:59:52.000 And I think that's a feature.
01:59:54.000 But I'm saying something far more radical.
01:59:57.000 I'm saying there is no memory.
02:00:00.000 Hmm.
02:00:01.000 There is no memory.
02:00:02.000 Okay.
02:00:03.000 But it's applied to everyday life.
02:00:05.000 There is.
02:00:05.000 No.
02:00:06.000 In a practical sense.
02:00:07.000 No.
02:00:07.000 There's no memory.
02:00:08.000 Okay.
02:00:08.000 So in a practical sense, when you say, who's the first president of the United States, don't you think you have a memory that it's George Washington?
02:00:15.000 I think I might respond George Washington, but it's not stored anywhere in my brain.
02:00:20.000 Well, of course it is, because that's what you learned.
02:00:22.000 You learned that in high school or whenever you learned it.
02:00:25.000 So how do you know if it's not in your brain, if it's not stored in your brain?
02:00:29.000 So if I could ask you what your son's name is, you know what your son's name is because it's stored in your memory.
02:00:35.000 There's nothing stored in my memory and certainly not my son's name.
02:00:40.000 So how do you know your son's name?
02:00:42.000 Well, because I was exposed to it.
02:00:44.000 I probably even came up with it a long time ago.
02:00:47.000 And under certain circumstances, if I'm asked what his name is...
02:00:51.000 Under certain circumstances, you don't remember his name?
02:00:54.000 Yes, it happens to you as you get older, especially with your kids.
02:00:58.000 It's really embarrassing.
02:00:59.000 And what do you think that is?
02:01:02.000 Well, I'm trying to say that there is no memory in the brain.
02:01:09.000 I'm saying that transduction is occurring and when the brain gets damaged, the transducer, like if I smash this microphone, the transduction process...
02:01:22.000 Right, but you're still avoiding the question, like how do you know your son's name if it's not in your memory?
02:01:28.000 Memory itself is a metaphor.
02:01:32.000 The old memory metaphor was based on a library and shelves, and then there were other ones based on interconnected neurons acting in cycles.
02:01:45.000 These are all metaphors.
02:01:47.000 There is no memory in the brain.
02:01:48.000 So that article of mine I mentioned, The Empty Brain, That's what it's all about.
02:01:56.000 It explains that there is no memory.
02:02:00.000 The way we use the term memory, it's just another metaphor.
02:02:04.000 So, for example, Daniel Barenboim, who was one of my favorite conductors and pianists, by the time he was 17, he had memorized all 32 of Beethoven's piano sonatas.
02:02:18.000 So I had someone count up the notes.
02:02:22.000 It's about 350,000 notes and almost as many markings of various sorts for the pedals and volume and all that stuff.
02:02:31.000 It's a tremendous amount of data.
02:02:34.000 Tremendous amount of data.
02:02:36.000 And you know what?
02:02:37.000 You can search Daniel Barenboim.
02:02:39.000 He's still alive.
02:02:40.000 You can search his brain forever and you'll never find a single note.
02:02:44.000 It's not in his brain.
02:02:45.000 So where is it?
02:02:48.000 Well, it depends what you mean by it.
02:02:51.000 The music is nowhere.
02:02:54.000 He didn't absorb the music.
02:02:57.000 But he remembered how to make the music.
02:03:02.000 No.
02:03:03.000 No.
02:03:05.000 He was, under some conditions, able to make the music, but there's no memory involved.
02:03:12.000 If someone teaches someone how to do something, you don't remember how to do that thing?
02:03:17.000 That's not what it is?
02:03:18.000 It means that some change is occurring that allows you, under certain conditions, to do that thing again or something similar to it.
02:03:28.000 But if there's skills that I could teach you.
02:03:31.000 You don't think you remember those skills?
02:03:33.000 Or if I taught you physical skills, like I taught you how to put somebody in an arm bar?
02:03:37.000 You don't think that's a memory?
02:03:42.000 No, it's definitely not.
02:03:43.000 It's not a memory, no.
02:03:44.000 So what is it?
02:03:46.000 It's a change.
02:03:47.000 Some sort of change is occurring, whether it's occurring in your brain, some sort of a change, or whether it's occurring in that link.
02:03:57.000 So how does the brain differentiate between what it remembers and what it doesn't remember if memories aren't real?
02:04:05.000 Maybe I can make the point this way.
02:04:06.000 Okay.
02:04:07.000 Demonstration in class.
02:04:08.000 I would say to people, who knows what a dollar bill looks like?
02:04:15.000 So if someone comes up to the board and they draw a dollar bill, and I'd say, now make it as detailed as you possibly can.
02:04:22.000 So they draw a dollar bill and it kind of has a place where there's a face and it kind of has some ones in the corners and usually that's as far as people can get.
02:04:31.000 And I say, well, let's try an experiment here.
02:04:35.000 So I cover up the dollar bill that they just drew.
02:04:38.000 I tape a piece of paper and then I tape up a real dollar bill.
02:04:42.000 And I say, maybe the person drew what they drew and it was so terrible because they're a bad artist.
02:04:48.000 Let's find out.
02:04:49.000 So I say, here, now draw a dollar bill.
02:04:51.000 So they've got a dollar bill right up on the board.
02:04:53.000 And now they draw this magnificent dollar bill.
02:04:57.000 Because they're copying the dollar bill.
02:04:58.000 Yeah.
02:04:59.000 But the point is, there is no image of the dollar bill in their head.
02:05:03.000 Right, because they haven't had a detailed sort of examination of the dollar bill.
02:05:09.000 Most people just give a cursory examination to a dollar bill.
02:05:12.000 You look down, oh, that's a five.
02:05:14.000 It's a 20. I mean, I kind of know, was it Andrew Jackson's on the 20?
02:05:18.000 You know, most people are not really paying that much attention to it.
02:05:22.000 But if you get a dollar bill scholar and someone who really understands dollar bills, they probably could.
02:05:27.000 Like, have you ever seen Al Franken draw the United States?
02:05:30.000 No.
02:05:30.000 It's really interesting.
02:05:32.000 Al Franken is very unfortunate what happened to that guy because I think he would have been a fantastic politician still.
02:05:38.000 Very interesting person, very intelligent, and a real patriot.
02:05:42.000 So Al Franken can draw the entire United States accurately with all the state boundaries from memory.
02:05:49.000 See if you can pull that up.
02:05:50.000 It's very interesting.
02:05:52.000 Why?
02:05:52.000 Because here it is.
02:05:54.000 So Al Franken has deeply studied the parameters of the states and the state lines and can recreate them from memory.
02:06:05.000 Why?
02:06:06.000 Because he's done this before and he has a record in his mind of what this looks like because he's carefully examined that.
02:06:14.000 There are things that I've had conversations with people, you know, a couple of weeks ago and I probably don't remember them.
02:06:23.000 And then there's things where I could tell you word for word someone said.
02:06:26.000 There's got to be a reason for that.
02:06:28.000 And if you're not calling it memory, what are you calling it?
02:06:32.000 I'm trying to introduce a different concept because I can tell you— I understand you are doing that, but I don't know what you're introducing.
02:06:39.000 Well, I'm trying to tell you that if you cut open Al Franken's brain, you will never find a map of the United States.
02:06:45.000 Right, but he can do that, and he is the same thing as me.
02:06:49.000 I can't do that.
02:06:50.000 Yeah, and you're wondering why.
02:06:52.000 Well, because I'll tell you why.
02:06:54.000 Because I haven't tried to do that and studied it and memorized how to do it.
02:06:59.000 The same way I could teach you how to memorize certain movements.
02:07:02.000 I could teach you how to memorize certain physical movements, and then if you practice them, I could ask you in a couple of weeks, try to do it again, and you'd be able to do it.
02:07:09.000 But maybe you'll forget certain key points of those movements, so then I would correct you.
02:07:13.000 And then I'd teach you, well, you would remember how to do those.
02:07:16.000 And then I would say, what are you supposed to do with your hand?
02:07:19.000 You're like, oh, left hand up.
02:07:20.000 That's right.
02:07:21.000 Because you remember it.
02:07:22.000 So you might not be able to find that in your brain, but it's very clear that something is going on where you are able to memorize things.
02:07:32.000 And memorize them better with music, right?
02:07:34.000 Conjunction Junction, what's your function?
02:07:37.000 Right?
02:07:37.000 We all remember that.
02:07:38.000 Why?
02:07:38.000 Because it's attached to music.
02:07:40.000 And music makes things easier to remember.
02:07:43.000 I've never heard that in my whole life.
02:07:45.000 You've never heard Conjunction Junction?
02:07:46.000 No.
02:07:46.000 But the point is...
02:07:47.000 That's the thing.
02:07:47.000 Schoolhouse Rock.
02:07:48.000 But you do understand...
02:07:51.000 You do understand, though, right, that there are people who could glance at a map of the United States, never having seen one before, and then could go up to a board and draw the whole thing in detail.
02:07:59.000 Yeah.
02:08:00.000 They have a different kind of memory.
02:08:01.000 And then, generally, those people are on the spectrum.
02:08:03.000 I'm trying to tell you there is no memory.
02:08:05.000 There's no memory.
02:08:06.000 Okay.
02:08:06.000 There's nothing—no one looking into the brain.
02:08:09.000 Is there anyone that can draw an accurate map of the United States without having ever looked at an accurate map of the United States?
02:08:18.000 I doubt it.
02:08:19.000 Okay.
02:08:20.000 But they're people who can draw things from their dreams that they have never seen before.
02:08:25.000 But they have seen them in their dreams.
02:08:27.000 And how do we even know if they're accurate?
02:08:29.000 They might be as accurate as that dollar bill drawing.
02:08:35.000 Okay.
02:08:36.000 Are you open to the idea that memory in the brain is just a metaphor?
02:08:44.000 Sure.
02:08:45.000 Okay.
02:08:45.000 So are you open to the idea that there is...
02:08:49.000 Possibly no memory, and we still could do all the things we can do, but there's no memory.
02:08:54.000 Well, you're calling it memory, right?
02:08:56.000 And I'm saying as a physical function, as a function, a thing happening, you can memorize things.
02:09:03.000 That's how you learn a new language, right?
02:09:04.000 You memorize, you know, me, I'm El Rogan, you know?
02:09:08.000 That's how you do it.
02:09:10.000 You remember, right?
02:09:12.000 So if you're saying that that doesn't exist, I'm saying, what is happening?
02:09:18.000 Give me some sort of a replacement.
02:09:21.000 I am.
02:09:21.000 I'm giving you transduction.
02:09:23.000 Right.
02:09:24.000 But where is it stored?
02:09:25.000 I don't know.
02:09:26.000 Okay.
02:09:26.000 I want to find out.
02:09:27.000 That's storage.
02:09:28.000 Don't you think we should find out?
02:09:30.000 Couldn't you use the term memory to accurately describe that storage?
02:09:33.000 It's not, but it's not in our brain.
02:09:36.000 Right.
02:09:37.000 So the people who are looking in brains and looking for memories, they're not finding...
02:09:42.000 Was it just that they haven't found it yet, or they don't understand that you're not going to be able to see it in the same way that you see cells?
02:09:47.000 I've talked to some of the top neuroscientists who study memory, and the first thing they say is, I can't find it, because I don't think it's actually there.
02:09:57.000 Perhaps, but let me ask you this.
02:09:59.000 How do we know what size memory is?
02:10:02.000 So, are they looking in the subatomic realm?
02:10:05.000 Are they looking at particles that are quantumly entangled?
02:10:10.000 How do they know what they're looking for?
02:10:12.000 Is it simply that we have a limited amount of tools?
02:10:16.000 Well, a much simpler idea...
02:10:18.000 See, these are all...
02:10:19.000 They're interesting concepts.
02:10:20.000 They are.
02:10:21.000 Right.
02:10:21.000 But a much simpler idea is that the brain...
02:10:24.000 Look at all this space that this microphone...
02:10:27.000 This is a very good quality microphone.
02:10:29.000 And it takes a lot of stuff in there for it to work as well as it does.
02:10:35.000 So a much simpler idea, given that no one's ever found anything remotely like memory inside the brain.
02:10:44.000 And I actually asked Eric Kandel, I think?
02:11:03.000 We aren't.
02:11:04.000 So isn't the simpler idea that the brain is actually like this?
02:11:10.000 That the brain is a transducer allowing us to communicate with higher intelligence in another universe.
02:11:19.000 Isn't that a simpler idea?
02:11:21.000 No, that's not simpler at all.
02:11:23.000 That's way more complex.
02:11:25.000 No, that's super simple.
02:11:26.000 It's way more complex than experiences being stored in a functional way so that you can benefit from them.
02:11:31.000 Except that there's no evidence of any storage and there never will be.
02:11:36.000 How could you say there never will be if he said 100 years?
02:11:38.000 100 years is not never.
02:11:39.000 100 years ago, we were exactly the same species as we are right now.
02:11:43.000 You had to be there.
02:11:44.000 It was the way he said it.
02:11:45.000 I understand.
02:11:46.000 I understand what you're saying.
02:11:47.000 But look, before they understood spooky action at a distance, before they understood subatomic particles, if you tried to explain that to someone from, you know, 1850, they'd be like, what the fuck are you talking about?
02:12:01.000 But now it's understood.
02:12:03.000 It's measurable.
02:12:04.000 It's something that we agree upon, that subatomic particles, that atoms, that neutrinos, all these things exist.
02:12:11.000 Bizarre things.
02:12:13.000 There's neutrinos passing through us right now, right, from space.
02:12:17.000 There's a neutrino detector in Antarctica.
02:12:21.000 We know that there's these things that we didn't know existed exist.
02:12:26.000 As we have more access to technology, more understanding of the mechanisms of the mind, Isn't it possible that we could say, oh, this is where memories are stored?
02:12:36.000 And isn't it true that if certain areas of the brain are damaged, in particular, it will damage memories?
02:12:43.000 Isn't that true?
02:12:45.000 It will damage the transduction process, yeah.
02:12:48.000 Okay, you're married to this transduction process.
02:12:51.000 I'm not saying that it's not in...
02:12:53.000 There's something happening.
02:12:55.000 Let's not even say it's in the brain.
02:12:56.000 Maybe it's in the entire body.
02:12:58.000 Maybe it's in every cell.
02:13:01.000 Maybe it's in the DNA. Whatever it is, there's something in there.
02:13:06.000 Well, I'm trying to point out that the something is something we haven't thought about in the past, and it would actually solve so many problems.
02:13:15.000 I see what you're saying in terms of communication with whatever that other realm is.
02:13:20.000 But what I'm saying is that there might be, and forget about the term the brain, local.
02:13:26.000 Let's just say local.
02:13:27.000 Local, okay.
02:13:28.000 Because when I'm accessing, oh, I know what...
02:13:32.000 If I press the turmeric button on the coffee machine, it makes the kind I like.
02:13:37.000 That's locally stored.
02:13:40.000 Other people don't know that if they'd never used that machine, right?
02:13:43.000 This is locally stored information.
02:13:45.000 Forget about finding it in the brain.
02:13:47.000 It might be in the DNA. We don't know where it is.
02:13:50.000 But I know how to start my car.
02:13:53.000 I know how to put it in drive because I've done it before.
02:13:56.000 Yeah.
02:13:57.000 So something is happening where I'm storing information, and the more information I store, the more it makes me effective at discussing certain things.
02:14:07.000 There are certain things that I don't have any information about.
02:14:09.000 I haven't read them.
02:14:10.000 I haven't memorized them.
02:14:11.000 Okay, you are married to the information processing metaphor.
02:14:15.000 I'm not.
02:14:16.000 It's just a metaphor.
02:14:17.000 There is no information in the brain.
02:14:20.000 I'm challenging this thing that you're saying, which I don't think sounds as complete as you're saying it sounds.
02:14:26.000 But the good news is it's testable.
02:14:31.000 It's empirically testable.
02:14:33.000 And I don't think it's going to take 20 years.
02:14:36.000 I think it's just going to take maybe five years.
02:14:38.000 I think it's because the labs already exist.
02:14:40.000 And it's not like studying, you know, black holes where you can't really access them and you have to, you know, because we can actually study brains and we have lots of great equipment.
02:14:49.000 It's just no one's ever looked for what I'm talking about.
02:14:52.000 And the point is, as I've talked to more neuroscientists and physicists, they're saying the same thing.
02:14:58.000 They're saying this has to be right, and we just need to look for it.
02:15:03.000 We never have before.
02:15:05.000 Let's speculate.
02:15:06.000 Let's say that they start looking for it and they find evidence that this is actually occurring.
02:15:11.000 Don't forget about crazy things.
02:15:13.000 The consciousness turns off, and then consciousness turns on.
02:15:17.000 What?
02:15:18.000 What's that?
02:15:19.000 What are psychotic states?
02:15:22.000 Yeah, but see, I'm just saying it's just an interruption in a pathway.
02:15:27.000 Got it.
02:15:27.000 That's really easy.
02:15:28.000 Right, which makes sense for psychotic states, right?
02:15:30.000 There's some sort of a disturbance in the way the system is running, and it's not tuning in to the other side the right way.
02:15:37.000 It takes care of...
02:15:47.000 You've got to go back and read Darwin because that's exactly what Darwin keeps saying.
02:15:54.000 It was eye-opening for me to read this book because that's what he keeps saying.
02:16:00.000 He keeps saying, look, I know this sounds nutty, but...
02:16:04.000 It's much better than any other crazy story that you're going to tell.
02:16:10.000 We were actually just having this conversation the other day with Brett Weinstein.
02:16:13.000 Oh, really?
02:16:13.000 And Brett Weinstein, who's a biologist.
02:16:15.000 And his belief is that random mutation, natural selection, Darwinian evolution, they're all real.
02:16:25.000 It's all absolutely happening.
02:16:26.000 But then there's probably also factors that we haven't figured out yet.
02:16:29.000 And that's what shows human beings.
02:16:33.000 Like, that's how human beings got there.
02:16:34.000 This is the factor.
02:16:35.000 And this is that thing.
02:16:36.000 This is what gets you up to that next level.
02:16:39.000 And it's consistent with the ideas that physicists have about the structure of the universe.
02:16:47.000 Again, just start with the basics that evolution is fantastic at producing all kinds of weird, bizarre things.
02:16:59.000 Transducers!
02:17:00.000 And that we're encased in transducers from head to toe.
02:17:04.000 Couldn't, if the universe is what we think it is, couldn't evolution at one point, because it's producing all kinds of new traits all the time, couldn't it produce a brain that has that feature that connects us?
02:17:19.000 Boom!
02:17:20.000 Right.
02:17:20.000 We're connected.
02:17:22.000 That it might be an emerging quality in humans.
02:17:24.000 And then 20,000 years ago it emerged.
02:17:27.000 And it emerged and It just brought us up like this, just like in 2001. We go up to here, and that could help explain the Fermi paradox, because there could be lots of chimp-like creatures all over the galaxy.
02:17:45.000 But they just never made it to that level because...
02:18:14.000 Took on that and that it became a part of us.
02:18:16.000 And now it's in our gene pool and now we're moving in that general direction with this different connection.
02:18:22.000 You know, I've read some of these books.
02:18:25.000 They're really fascinating.
02:18:27.000 Truly, they hold my attention.
02:18:29.000 But page after page after page, I keep saying, yeah, but neural transduction theory is much simpler.
02:18:35.000 It's just one tiny little change that has to occur.
02:18:38.000 I don't think they're mutually exclusive, because with neural transduction theory, as you're saying, if there are these primates on these other planets that never achieved this, and then there are ones that have and have transcended, those ones that have transcended recognize this quality that's missing in these chimpanzees and they introduce it.
02:18:59.000 We don't need them.
02:19:00.000 It's possible, but we don't need them.
02:19:02.000 We have evolution.
02:19:03.000 But that's why it's accelerated evolution.
02:19:07.000 They've concluded that primates right now have entered the Stone Age.
02:19:12.000 Do you know that?
02:19:12.000 So they're starting to use tools.
02:19:14.000 So it was really interesting, right?
02:19:16.000 So if given enough time, you give them 100 million years, who knows what a chimp is going to look like 100 million years from now?
02:19:23.000 They might be like us.
02:19:24.000 They might do it naturally.
02:19:26.000 The speculation – and again, this is not something I'm married to – but the speculation, this kooky speculation, is that we were visited by extraterrestrials that were far more advanced.
02:19:37.000 And that they found us as these simple shit-throwing primates.
02:19:41.000 And they said, let's juice this process up a little bit.
02:19:44.000 We know where this is going to go eventually, hopefully, if everything works out.
02:19:48.000 But let's juice it up.
02:19:50.000 Let me connect what you just said with what I've been saying.
02:19:54.000 Easy connection.
02:19:56.000 Okay.
02:19:57.000 And it brings us into the world of UFOs.
02:20:01.000 Because whether this capability arose on its own, which it could...
02:20:08.000 Or whether it was juiced up a little bit by some outsiders.
02:20:13.000 Which it could.
02:20:15.000 Which it could.
02:20:15.000 Then either way, we're now in a position where we could, in theory, communicate in more meaningful ways with extraterrestrials.
02:20:28.000 We could.
02:20:31.000 And we might, by understanding how transduction works, we might figure out how to do that.
02:20:38.000 So not just communicating with people in another universe, but communicating with extraterrestrials.
02:20:44.000 Some of these extraterrestrials, in fact most of them, maybe all of them, have to have this ability.
02:20:50.000 They have to have that transduction ability, or they never would have gotten above chimp level.
02:20:56.000 So maybe this is our way of connecting with them as well.
02:21:02.000 And that we're on the path, but we're not quite there yet.
02:21:05.000 We're not quite there yet, but I think we could get there really fast.
02:21:10.000 And again, some of these neuroscientists I've been talking to, they are saying the same thing, because this is not like studying, I don't know, this is not like studying, I'll say black holes again, but this is different because we've got...
02:21:25.000 thousands of labs, some of them extremely sophisticated labs, and we're just not looking for this.
02:21:32.000 What happens if we start looking for this?
02:21:36.000 And so what we've been doing is we've been trying to work out experiments that can be conducted and that should produce one result or another depending on whether transduction is occurring.
02:21:50.000 And that's the goal, is find empirical support for this type of theory.
02:21:54.000 If we can find empirical support, the more support we find, obviously, the more convincing this will be.
02:22:02.000 And then that would bring in the engineers.
02:22:06.000 It's the engineers who could really make this thing sing.
02:22:10.000 So it's another one of my intuitions, call it that, but I think the data are all around us.
02:22:21.000 They're all around us.
02:22:22.000 And by the way, there are a couple people I have turned on to this who just all of a sudden become obsessed, because all of a sudden you see all around you reminders of all the weird stuff, and you realize, wait, all this stuff that seems so weird,
02:22:38.000 you know what?
02:22:39.000 It's not weird at all.
02:22:41.000 If NTT is valid, if this theory is valid, the stuff that we think is weird is not weird at all.
02:22:48.000 In fact, it makes perfect sense.
02:22:51.000 Psychosis, you brought up psychosis, but there's so many things like that.
02:22:57.000 And all of a sudden they're not mysterious at all.
02:23:00.000 They make very good sense.
02:23:02.000 How about something, one of my favorites is a deja vu.
02:23:07.000 Or how about meeting someone that you feel like you've known them forever.
02:23:11.000 I've had that happen.
02:23:12.000 It's an amazing experience.
02:23:15.000 It's visceral.
02:23:16.000 It's so powerful.
02:23:17.000 It's so strong.
02:23:20.000 How could that possibly be?
02:23:23.000 Well, see, if you've got neurotransduction theory there in your toolbox, you go, oh, that's easy.
02:23:37.000 Yeah, lots of stuff just falls into place.
02:23:43.000 Now, I have to point out that I'm wearing this idiotic Starburst thing.
02:23:49.000 TameBigTech.com.
02:23:50.000 Oh, thank you.
02:23:51.000 Every time you say that, I just get the chills.
02:23:55.000 Because I need help.
02:23:59.000 I desperately need people's help.
02:24:01.000 So we have spent seven million dollars building the world's first nationwide monitoring system that is doing to those bastards what they do to us and our kids 24 hours a day.
02:24:12.000 We are surveilling them for the first time.
02:24:15.000 We are finding overwhelming evidence that they are very deliberately and systematically messing with us, and with our elections especially.
02:24:28.000 I personally believe that as of 2012, the free and fair election, at least at the national level, has not existed.
02:24:38.000 It's just been manipulated.
02:24:39.000 It's just been manipulated since 2012. I say this in part because I met one of the people on Google's tech team, on Obama's tech team, I should say, which was being run by Eric Schmidt, head of Google at the time.
02:24:53.000 And I talked to him at great length about what the tech team was doing.
02:24:56.000 They had full access to all of Google's shenanigans, all those manipulations.
02:25:02.000 And one member of that team was asked by a reporter: of the four points by which Obama won, how many of those points did he get from the tech team?
02:25:14.000 And the guy, Elon Kriegel, I believe his name is, was actually quoted, and he said two of the points came from us.
02:25:22.000 Now, Obama won by five million votes, roughly, and two out of four points came from the tech team, that's two and a half million votes!
02:25:33.000 By 2016, I had calculated that Google could shift—and it would be toward Hillary Clinton, of course, whom I supported at the time—that Google could shift between 2.6 and 10.4 million votes to Hillary Clinton in that election with no one knowing.
02:25:50.000 She won the popular vote by 2.8 million votes.
02:25:53.000 If you take Google out of that election, the popular vote would have been tied.
02:26:00.000 A couple days after that election, all the leaders in Google get up on stage.
02:26:06.000 I'm sure you've seen this.
02:26:07.000 It's an amazing video.
02:26:09.000 And they're talking to all of Google's 100,000 employees, and they're one by one, they're going up to the mic and saying, we are never going to let that happen again.
02:26:16.000 Yeah.
02:26:17.000 We are never going to let that happen again.
02:26:19.000 Which is democracy.
02:26:20.000 They're never going to let democracy happen again.
02:26:22.000 Exactly.
02:26:23.000 That's what I'm saying.
02:26:24.000 And it's so crazy to be blatantly and openly talking about that.
02:26:27.000 As if it's a virtue.
02:26:30.000 We already had a pretty big monitoring system.
02:26:33.000 We preserved 1.5 million ephemeral experiences.
02:26:36.000 Our data show that Google shifted at least 6 million votes to Joe Biden, who won the popular vote by about 8 million.
02:26:44.000 So again, take Google out of the equation, that would have been pretty much a tie in the popular vote, and Trump would have won 11 out of 13 swing states instead of 5. So going forward from roughly 2012,
02:27:01.000 I think the free and fair election has been an illusion.
02:27:06.000 An illusion.
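To make the arithmetic behind these claims easy to check, here is a minimal sketch in Python that simply takes the figures quoted above at face value; the margins and shift estimates are the speaker's numbers, not independently verified data.

```python
# Worked arithmetic for the vote-shift claims stated above.
# All figures are taken at face value from the conversation.

# 2012: a roughly 5-million-vote margin described as 4 "points";
# the tech-team member is quoted as crediting 2 of those 4 points to the team.
margin_2012 = 5_000_000
tech_team_share = 2 / 4
print(f"2012 votes attributed to the tech team: {margin_2012 * tech_team_share:,.0f}")
# -> 2,500,000, i.e. the "two and a half million votes" mentioned above

# 2016: Clinton's popular-vote margin vs. the claimed range Google could shift.
margin_2016 = 2_800_000
claimed_shift_low, claimed_shift_high = 2_600_000, 10_400_000
print("2016 margin falls inside the claimed shift range:",
      claimed_shift_low <= margin_2016 <= claimed_shift_high)

# 2020: Biden's roughly 8-million-vote margin vs. the claimed 6-million-vote shift.
margin_2020 = 8_000_000
claimed_shift_2020 = 6_000_000
print(f"2020 margin minus claimed shift: {margin_2020 - claimed_shift_2020:,.0f}")
```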
02:27:08.000 And this is something that's very weird and kind of ironic, but this is something that Dwight D. Eisenhower warned about in that last speech of his, his farewell speech.
02:27:19.000 He warned about the rise of military industrial complex.
02:27:22.000 Everyone's heard about that.
02:27:23.000 But he also warned about the rise of a technological elite that could someday control public policy without anyone knowing.
02:27:34.000 And the technological elite are now in control.
02:27:39.000 That's what we have.
02:27:40.000 That's where I get back to my ranting and my pain because I realize no one is paying attention.
02:27:49.000 Eisenhower said we have to be alert or this will happen.
02:27:53.000 We have not been alert.
02:27:54.000 And the fact is, people right this second, people who I give speeches to sometimes,
02:28:00.000 they get all riled up and then they walk out of the auditorium with their surveillance phones.
02:28:07.000 Mine is not.
02:28:08.000 This is a secure phone.
02:28:10.000 But they walk out with their surveillance phones in their pocket and they use all the surveillance tools that Google has set up for them and other companies too now.
02:28:20.000 And they think, isn't this nice?
02:28:23.000 This company is doing all this nice stuff for me and giving me all this free stuff.
02:28:27.000 That's not the business model.
02:28:29.000 All those free things are just apps that trick you into giving up personal data.
02:28:36.000 And then they monetize the data and they use it to control you.
02:28:41.000 That's what's really happening.
02:28:44.000 That's the business model.
02:28:46.000 And people can't see it.
02:28:49.000 And I'm telling you, I've been working on this for 12 years and it's gotten to the point where I am wiped out.
02:28:55.000 I am fed up.
02:28:57.000 I am exhausted.
02:28:59.000 I am disillusioned.
02:29:04.000 And...
02:29:08.000 And I'm lonely, because since Misty was killed five years ago, I sometimes feel like I'm literally dying of loneliness.
02:29:15.000 And the fact that other people around me have been hurt, one quite seriously, makes me a little nervous, too.
02:29:25.000 And that's where I am at this point.
02:29:28.000 And it's a terrible place to be.
02:29:31.000 Terrible.
02:29:34.000 It took $7 million to build what we've built, but it's been really tough.
02:29:40.000 Okay, we're talking about, like, raising a dollar at a time.
02:29:43.000 It's been really, really difficult.
02:29:46.000 And for us to set this up so that it's actually permanent and self-sustaining, and so we have court admissible data in all 50 states, which will make these companies think.
02:29:59.000 It'll make them think.
02:30:00.000 Think twice, maybe.
02:30:03.000 That is going to require at least another $50 million.
02:30:07.000 That gets us a secure facility and our own servers and a security team.
02:30:13.000 We have virtually no security.
02:30:16.000 Hear that, Google?
02:30:17.000 And they know this because a couple months ago they attacked us in an extremely sophisticated way.
02:30:23.000 I've never seen this before.
02:30:25.000 When you say they, who?
02:30:26.000 I don't know!
02:30:27.000 Someone.
02:30:28.000 I don't know.
02:30:28.000 Google has...
02:30:29.000 What did they do?
02:30:31.000 It was very, very unusual.
02:30:34.000 It was not the usual thing.
02:30:38.000 What they did was they got our accounts, they got our apps, to run kind of at ludicrous speed, I guess you could say.
02:30:52.000 And what they did was they pulled in more and more and more servers until we were running so many servers simultaneously that we actually got shut down in the cloud.
02:31:03.000 And we lost access to our own data for almost two weeks.
02:31:08.000 Now, we've never seen an attack like that.
02:31:10.000 Even our security people had never seen an attack like that.
02:31:13.000 It was really pretty...
02:31:14.000 What was the mechanism of this attack?
02:31:16.000 How'd they do it?
02:31:17.000 We're not sure how they got in.
02:31:19.000 Once they got in, all they did was they just created a tremendous amount of activity.
02:31:25.000 So that pulled in more and more resources.
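For readers unfamiliar with this failure mode, here is a purely illustrative toy simulation, with made-up numbers, of how injected activity can push an autoscaling setup past a cloud-side cap until the whole project gets suspended; it is not a reconstruction of the actual incident described here.

```python
# Toy model: each minute of injected load forces a naive autoscaler to add
# instances; once the instance count exceeds the provider's quota, the
# project is suspended. All parameters below are invented for illustration.

def simulate_resource_exhaustion(requests_per_minute, capacity_per_instance=1_000,
                                 instance_quota=50):
    instances = 1
    for minute, load in enumerate(requests_per_minute, start=1):
        # naive autoscaler: keep adding instances until capacity covers the load
        while instances * capacity_per_instance < load:
            instances += 1
            if instances > instance_quota:
                return f"minute {minute}: quota exceeded, project suspended"
    return f"survived with {instances} instances"

# attacker-style load that doubles every minute
print(simulate_resource_exhaustion([1_000 * 2**i for i in range(12)]))
```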
02:31:28.000 And this is definitely created?
02:31:30.000 This is not organic?
02:31:31.000 Oh, no, no.
02:31:32.000 It's absolutely created.
02:31:34.000 And now that we...
02:31:37.000 We don't know about this particular kind of attack.
02:31:40.000 If it happens again, we'll be up within two days, max.
02:31:44.000 But the point is, there's a lot of pressure on us.
02:31:47.000 So we need a lot of money to set up a secure facility, have security teams not just protecting our data, but protecting our people.
02:31:56.000 We have to protect our people.
02:31:59.000 Have you ever talked to Elon about this stuff?
02:32:02.000 I've never had a way to reach him.
02:32:04.000 Well, hopefully someone will take this clip and put it on X. And he's a junkie.
02:32:10.000 He'll be on it all day.
02:32:11.000 So hopefully someone will put it to his attention and put it up there.
02:32:16.000 Because I'm sure this is very concerning to him.
02:32:19.000 I mean, he has a vested interest in this.
02:32:22.000 Clearly, what happened when he purchased Twitter and he found out the extent of government interference in free speech.
02:32:30.000 And how many people were being pressured to not talk about certain things that were inconvenient or how many accounts they were trying to get taken down because these accounts were purveyors of misinformation that turned out to be absolutely accurate?
02:32:48.000 He has a deep distrust, for sure.
02:32:51.000 Well, he has a few times lately.
02:32:53.000 He has retweeted content about my work.
02:32:57.000 So he's aware.
02:32:57.000 He might be aware.
02:33:00.000 And there's another way also, by the way, to take down Google, which I published in Bloomberg Businessweek.
02:33:10.000 If you go to epsteinandbusinessweek.com, you'll actually see the article.
02:33:19.000 We've reached the point where data have become an essential part of our lives.
02:33:26.000 The way to take down Google is to do what governments have been doing for hundreds of years, to declare their index, the database they use to generate search results, to be a public commons.
02:33:40.000 This is exactly what governments do: when water, electricity, telephone communications, any commodity, any service becomes essential, governments at some point have to step in.
02:33:52.000 The electric companies, they were all privately owned.
02:33:55.000 I didn't know that.
02:33:56.000 I didn't realize that.
02:33:57.000 They were all privately owned until the government had to step in.
02:34:01.000 And this is where we are now with data.
02:34:04.000 And the biggest, baddest database in the world is Google's because it's the gateway to all knowledge.
02:34:12.000 It needs to be declared a public commons.
02:34:15.000 As I say, ample precedent for that in law.
02:34:17.000 It's very light touch regulation.
02:34:19.000 And what it'll do is it'll allow other people to draw from the database to create their own niche search engines.
02:34:29.000 So you'll create a search engine for people interested in DMT and UFOs.
02:34:36.000 Someone will create one for women, for Lithuanians.
02:34:39.000 We'll end up with thousands of these search engines, all of which are vying for attention.
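As a purely hypothetical illustration of that proposal, here is a sketch of a "niche" search engine built on top of a shared public index. No such API exists today; the endpoint, response fields, and ranking weights below are invented for the example.

```python
# Hypothetical niche search engine drawing from a public-commons index.
# The URL and response shape are assumptions made for this sketch only.

import requests

COMMONS_INDEX_URL = "https://example.org/public-index/search"  # hypothetical endpoint

def niche_search(query, boost_terms, top_k=10):
    """Query the shared index, then re-rank results for a niche audience."""
    resp = requests.get(COMMONS_INDEX_URL, params={"q": query, "limit": 100})
    resp.raise_for_status()
    results = resp.json()["results"]  # assumed shape: [{"url", "title", "snippet", "score"}, ...]

    def niche_score(r):
        text = (r["title"] + " " + r["snippet"]).lower()
        bonus = sum(text.count(term.lower()) for term in boost_terms)
        return r["score"] + 0.5 * bonus  # simple keyword boost; weight is arbitrary

    return sorted(results, key=niche_score, reverse=True)[:top_k]

# e.g. the "DMT and UFOs" search engine mentioned above:
# niche_search("government disclosure", boost_terms=["DMT", "UFO", "UAP"])
```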
02:34:45.000 It will be exactly like the news, exactly like the news media, that domain.
02:34:51.000 And that's the way it should be.
02:34:53.000 Search should be competitive.
02:34:55.000 Google was not the first search engine.
02:34:57.000 It was the 21st.
02:34:59.000 So that's how you do it.
02:35:01.000 And also then search would become innovative again.
02:35:04.000 There have been no innovations in search for the 20 years that Google has dominated search.
02:35:10.000 So General Paxton, Ken Paxton of this great state of Texas, he's interested in this.
02:35:21.000 Senator Cruz is interested.
02:35:23.000 Other people are interested.
02:35:24.000 This would be tough to implement in the U.S., but the EU could do it.
02:35:29.000 Because five of Google's data centers are in the EU. The EU could do it in a flash.
02:35:35.000 And they're very frustrated with Google because they've been trying to keep them under control for a long time now and they've failed.
02:35:44.000 So there are some things that could be done.
02:35:47.000 Permanent, large-scale monitoring system, that is a necessity.
02:35:51.000 That must be there because if you don't have that, you don't know what these companies are doing.
02:35:57.000 You don't know how they're messing with our minds, with our kids' minds, and with our elections.
02:36:02.000 You have to monitor and you have to have court admissible data in every state and probably in every country.
02:36:08.000 And then they will pull back a little bit because they have to.
02:36:15.000 They're violating campaign finance laws when they very blatantly support one candidate or one party.
02:36:22.000 They're making huge in-kind donations without declaring them.
02:36:26.000 So another thing they're doing right now, perfect example of something our system is capturing right this second.
02:36:36.000 Google is sending register-to-vote reminders to Democrats at about two and a half times the rate they're sending them to Republicans.
02:36:46.000 How do I know?
02:36:47.000 Because that's what the monitoring system shows.
02:36:49.000 That's what they're doing.
02:36:50.000 At some point, that's going to turn into partisan mail-in-your-ballot reminders.
02:36:56.000 And then that turns into partisan go-vote reminders.
02:37:00.000 These are just displayed on Google's homepage.
02:37:02.000 We're capturing the homepages by the millions.
02:37:05.000 If you don't capture them, then the content is ephemeral and it disappears and it's gone forever and you can't go back in time and figure out what they were doing.
02:37:15.000 So, monitoring is no longer optional.
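As a minimal sketch of what capturing ephemeral content can involve (this is not the speaker's actual system), the snippet below fetches a page, timestamps it, and hashes the bytes so a stored copy can later be shown to match exactly what was served at that moment.

```python
# Minimal ephemeral-content capture: fetch, timestamp, hash, and append to an archive.
# File name and record format are illustrative choices, not a real system's.

import hashlib
import json
import time
import urllib.request

def capture_page(url, archive_path):
    with urllib.request.urlopen(url, timeout=30) as resp:
        body = resp.read()
    record = {
        "url": url,
        "captured_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "sha256": hashlib.sha256(body).hexdigest(),  # fingerprint of exactly what was served
        "content": body.decode("utf-8", errors="replace"),
    }
    with open(archive_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")  # one JSON record per capture
    return record["sha256"]

# e.g. capture_page("https://www.google.com", "captures.jsonl")
```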
02:37:19.000 By the way, monitoring is fast.
02:37:23.000 Unlike regulations and laws, monitoring can keep up with whatever the tech company is dishing out.
02:37:30.000 The next company, the next Google after that.
02:37:33.000 Monitoring can keep up.
02:37:30.000 If you're going to have an internet, and it can mess with people's lives and it can mess with governments and elections and so on,
02:37:43.000 then you've got to have monitoring systems in place.
02:37:45.000 So that's what I've been...
02:37:47.000 That's what this new...
02:37:48.000 My monograph is about.
02:37:51.000 And if people want to get a free copy of it...
02:37:54.000 TameBigTech.com.
02:37:57.000 Tame, tame, tame.
02:37:58.000 TameBigTech.com.
02:38:00.000 Yeah.
02:38:05.000 You crack me up sometimes, really.
02:38:08.000 I'd love to see you do a comedy routine.
02:38:11.000 Listen, Robert, thank you for being here.
02:38:14.000 I really, really appreciate what you're doing.
02:38:16.000 If you weren't doing this, I don't know if it would get done.
02:38:19.000 I don't know if we would know as much as we know.
02:38:21.000 I think it would be speculative.
02:38:23.000 I think people would have ideas.
02:38:25.000 I think it would be impossible to prove.
02:38:27.000 And I think what you've done is a tremendous service for people.
02:38:32.000 So thank you very much.
02:38:34.000 TameBigTech.com. Thank you, but I'm still fed up, just so you know.
02:38:37.000 Okay.
02:38:40.000 Thanks, Robert.
02:38:40.000 Yep.
02:38:41.000 Bye, everybody.