00:00:38.000So what a rendezvous is, is it's not, you know, you go to those like, I don't even know what they're called, but people do like reenactments.
00:01:18.000That was considered like Jeremiah Johnson's time, like peak fur trapping.
00:01:22.000So there's people, you know, they dress like either revolutionary, like American revolutionaries, or they dress like mountain men or they dress like Indians.
00:01:59.000It softens the leather in a natural way.
00:02:01.000And what's cool about it is every animal, no matter what animal you kill, has the exact amount of brain needed in order to tan the hide.
00:02:08.000So you don't need any additional, like people use egg yolks or mayonnaise or something like that.
00:02:13.000All you do is you take the brain out of the cavity, you grind it up, you mix it into some water, and then after you've cleaned the leather and you've scraped it clean, you stretch it.
00:03:14.000We had to grind it back or he had to grind it back down.
00:03:17.000And then the sheath is traditional. Like, you know, the cool thing about doing rendezvous, and the cool thing about this, is you could take a DeLorean, drop this in 1840, and somebody would pick it up and think it was made yesterday.
00:03:31.000And so everything on there has been done traditionally; the quilling on the beadwork is made from porcupine quills.
00:03:56.000So when I was thinking about what I was, because I wanted to give you something for inviting me on because it's still a shock to me that you did it.
00:04:02.000Even though we've been talking for so long, I just never imagined a scenario where you'd want to have me on here.
00:04:34.000I mean, the guy who dated it said 1860 to 1890 is what they figured.
00:04:40.000And you can tell by the way it is around the hilt, and the pitting on it and stuff like that, and the way that it was made, that it fits that era.
00:04:49.000I mean, it could have been that somebody redid it in 1900, but it's definitely that old: the type of steel, the way that it was worked, and the way that it is around the hilt, around the bottom there.
00:05:24.000Like it's not something you'd be gutting an elk out with.
00:05:28.000Well, if we get attacked by zombies in the studio, it's a good thing to have on the desk.
00:05:32.000Yeah, I mean, if you're going to make a last stand, you know, that's a pretty good, that's a pretty good knife to make your last stand with.
00:06:31.000So did you bring your own food or did you have to hunt for food?
00:06:34.000So you bring your own food, but there are other rendezvous that are kind of invite only.
00:06:38.000And I don't even think a lot of people who do rendezvous know about these, but there's ones that I think they're called, I think I might be speaking out of school.
00:06:44.000Somebody might send me an email after this, but I'm going to talk about it anyway because I never got read the Riot Act.
00:07:19.000And it's always these weird, like, eccentric history teachers that run them, like guys who, you know, teaches history at Berkeley or something like that or other places.
00:07:29.000They just really enjoy living like this.
00:07:30.000And at those ones, if they're in season, you can hunt whatever's in season.
00:07:34.000You're hunting with traditional archery.
00:09:26.000It's just more like I want to act like it's 1840 for a couple of weeks and not look at my phone one time and not worry about the news.
00:09:33.000It's amazing after a week here, you really forget about the world and you like don't even know you're supposed to be stressed out about things.
00:09:40.000You're just out there doing your thing for a couple of weeks.
00:10:02.000So you'll walk into like a huge, they call them marquees, but it's like a huge 100-foot square lodge.
00:10:07.000There'll be three gambling tables in there, girls in like the low-cut shirts and dealing cards and smoking cigars and just having an amazing time.
00:10:14.000And there are people, you go by camp names while you're in there.
00:11:17.000And it was, you know, it's one of the things we're kind of missing in culture today, something that I'm trying to reinvigorate, especially with my son and with other young men that I run into: coming-of-age rites.
00:12:34.000And if you don't impose nature on yourself by undergoing those types of rites and understanding what it means to become a man, nature will impose itself on you: either A, you're never going to have children and therefore you're dead forever, or B, it will kill you because you're fat and in your mom's basement, you get diabetes, you get a foot chopped off, and you're 35.
00:12:51.000And, you know, we just don't tell men.
00:12:54.000We don't have that. The military did it for me.
00:12:56.000I had really put off responsibility or seeking meaning or any of those things until I was in the military.
00:13:04.000And like I said, my father died when I was five.
00:13:06.000So I really had no central male authority until I was about 13 or 14 when I met this guy, Steve.
00:13:13.000And he kind of initiated some of those rites for me and held me to account.
00:13:18.000But it was really the military, which was a turning point for me where there was a standard and I was expected to hold it.
00:13:26.000I think there's a reason why most ancient cultures, a lot of ancient religions, have these rites of passage where you are now officially a man.
00:13:35.000Officially, you know, you're responsible.
00:14:08.000There's this weird primal feeling that you're responsible for these like very vulnerable little people that you love more than life itself.
00:14:33.000God, we have so many rabbit holes we could go down on this.
00:14:35.000But I mean, it was, you know, growing up in the 80s and the early 90s, it was really like a divorce culture.
00:14:44.000And I obviously understand that if you're in a bad relationship or an abusive relationship or, you know, certainly there's a threshold where marriage should dissolve.
00:14:54.000But I kind of feel like the central thrust of a lot of culture at that time was about like divorce or not getting married or, you know, discovering yourself and that type of thing, which in some ways is good.
00:15:05.000But when it becomes a central thrust or a central narrative and divorce becomes very easy or it's happening everywhere, it's super normalized.
00:15:26.000And that is, you know, when you look up the stats on that, like remarriage and having a new family, that becomes the single most likely vector of abuse in a young child's life: that new person, right?
00:15:38.000Because now they're raising someone else's kid or whatever.
00:15:41.000I mean, that's in every old movie: the evil stepmother.
00:15:53.000And so, you know, I kind of resented that part of that time, that culture. I shouldn't say when I was a child.
00:16:02.000I should say as I got older, because I was in a single mom home.
00:16:04.000And the guy that my mother remarried right after my father died was abusive.
00:16:10.000And, you know, he really got hard on my younger brother.
00:16:13.000And, you know, my mother moved us out almost immediately.
00:16:16.000But when I re-examined that time, the message really was, you know, I don't know how to describe it, but: there are no rules when it comes to relationships and family.
00:16:27.000And every family is special and particular in its own way, and they all need to be venerated.
00:16:32.000And there's, of course, some truth to that.
00:16:33.000We shouldn't deride someone because they come from a broken family, but we shouldn't elevate it like it's at the same level as a unified family.
00:16:44.000But also, the people who are making those movies in that culture came from the 50s and 60s where divorce was just not in the cards.
00:16:51.000And so that was, you know, Hooke's law: as you bend any object, it wants to return back to its natural state.
00:16:58.000And Hooke's law kind of played out there, where nobody could get divorced in the 30s, 40s, 50s, and 60s.
00:17:04.000And then you had the baby boomers who kind of culturally said, you know, actually, it's not as bad as we think, but then it overcorrected and it became kind of part of that cultural zeitgeist.
00:17:27.000This episode is brought to you by Ketone IQ.
00:17:29.000The demands on my time, energy, and focus are immense.
00:17:33.000So when I need my brain to lock in for hours and hours and fire at its fastest, most alert state, I'm taking Ketone IQ.
00:17:42.000It's an energy shot powered by this little miracle molecule that your body already naturally makes and your brain especially loves ketones.
00:17:52.000I've been talking about ketones for over a decade and this company has finally figured out how to put them in a bottle.
00:17:57.000When I take Ketone IQ, I drop right into a state of laser-like focus and sustained mental clarity.
00:18:04.000Whether I'm podcasting, training in the gym, or just want to show up locked in when it matters, the difference is night and day with Ketone IQ.
00:18:13.000Visit ketone.com/Rogan for 30% off your subscription order, or find Ketone IQ at Target stores nationwide in the protein and electrolyte aisle and get your first shot free.
00:18:29.000Plus, they have a 60-day money-back guarantee.
00:18:33.000That's how confident they are that you're going to love the increased focus you get from Ketone IQ.
00:18:38.000And I would say that's the thing, and this isn't a political thing.
00:18:43.000That's mostly what makes me conservative in nature: I agree systems need to change, but they need to change slowly and pragmatically.
00:18:51.000Because, you know, any social scientist worth their salt will know a social experiment almost never has the outcome that we thought it was going to have.
00:19:01.000In other words, we thought doing something to society would form society this way, but it almost has the inverse, the anti-pattern like we talked about before, and almost ends up propagating itself.
00:19:12.000And so that makes me, I'm still a proponent for change, but it should be slow and thought out and done in pockets first.
00:19:51.000And there's, you know, even for that being 250 years ago, a profound amount of wisdom in that.
00:19:57.000Like, let's change things slowly, let social experiments take place, adopt the best parts of those things, and then integrate them into the culture overall as we move along.
00:20:06.000But, you know, let's not throw the baby out with the bathwater.
00:20:09.000Yeah, I think in this country, one of the primary problems that people have is a profound lack of respect for discipline and how important discipline is for your life.
00:20:20.000And discipline is associated with conservatism.
00:20:24.000And because of that, like a lot of people think that I'm, I don't think I'm anything.
00:20:29.000I think I have politically or ideologically, I have a lot of everything in me.
00:20:35.000I don't think I identify with one side or another.
00:20:37.000But one thing that I agree with conservative people on is that they lean more towards the importance of discipline.
00:20:44.000Hard work, discipline, don't complain, get things done, deal with the hand that you've been dealt, and just sort it out and get to work.
00:20:58.000And this is not something that's celebrated in society.
00:21:02.000It's thought of as a cruelty that if you say that you need discipline, that you're not treating these people that are victims of circumstance with the proper respect or with the proper empathy.
00:21:14.000And I think a certain amount of empathy is probably not so good for you at a certain point in time.
00:21:19.000There comes a point in time where you're letting people wallow in their bullshit and just make excuses for why they're not getting anything done.
00:21:25.000And in that sense, I think California is, that is a giant part of what's wrong with California.
00:21:31.000What's wrong with California when it comes to crime?
00:21:35.000You know, the way they address crime and the way they address homelessness and all these issues that they have, they don't put their foot down.
00:21:42.000And at a certain point in time, you've got to realize what Gad Saad calls suicidal empathy.
00:21:48.000Society can suffer from suicidal empathy.
00:21:50.000And at a certain point in time, you've got to enforce rules and you've got to make it so that people have to get their shit together.
00:21:56.000And that suicidal empathy becomes a way for the person who's imposing it on someone else to feel good about themselves, which makes it even trickier and even more insidious because they're feeling good from the weaponization of other people's lot in life.
00:22:14.000And the thing about that is none of the rules that you're going to impose, especially as a legislator or as somebody in a think tank, you'll never feel the repercussions of them.
00:22:24.000You'll never have to actually deal with it day to day.
00:22:26.000You're just imposing it on someone else and saying, I better understand the structure of reality and the fabric of the world.
00:22:44.000That's a giant part of government, for sure.
00:22:46.000That's a giant part of what's the problem with liberal governments.
00:22:49.000Liberal governments should get paid based on whether or not the city does better or worse financially than when they were in office.
00:23:01.000If their policies lead to greater domestic production of goods and services and GDP does better and everything does better, then you should get paid more.
00:23:12.000If there are more real estate sales, more people are making more money, median income rises, fewer homeless people, you should get paid more.
00:23:19.000And you should get paid less if homelessness goes up, if crime goes up, if there's more destruction, if there's more, you know, assaults and home invasions, you should get paid less.
00:23:51.000You're right to be cynical because that's what they do about everything.
00:23:54.000Someone was explaining to me yesterday that one of the problems with cleaning up fraud is that fraud is responsible for a giant percentage of GDP.
00:24:05.000And if you have hundreds of billions of dollars of fraud in this country and you eliminated that, you actually lower GDP because you actually lower the amount of money that's in circulation.
00:24:56.000You know, and that was some of the stuff that was uncovered during DOGE, you know, with the limited amount of access that DOGE had, just the beginning of it, where you got to see the curtain pulled back and get exposure of so many of these fraudulent, supposedly charitable organizations that were really just money laundering.
00:25:30.000The government has vetoed these audits and they have no idea where that $24 billion went and yet homelessness went up.
00:25:39.000But you've got a giant machine that is this homeless establishment, this homeless industrial complex that is being funneled money into that.
00:25:49.000And that actually aids the GDP, which is kind of crazy.
00:25:53.000Yeah, I mean, it was one of the things.
00:25:55.000My last three years in the military, I was advising a colonel and a two-star general, and they were in charge of all of the offensive cyber development, ethical hacking, offensive cyber development.
00:26:11.000And one of the things I kind of learned about government at that point was these systems have their own incentive.
00:26:19.000And the incentive is not the output of their purported mission.
00:26:22.000The incentive is the growing of the organization and the execution of budget.
00:26:27.000So while they're in there, you know, I've never seen a field-grade officer get dressed down more than when he didn't spend all of the money that he was budgeted for that year.
00:26:36.000He would go to the Pentagon and they'd be like, well, you didn't execute $300 million of OCO, of overseas contingency operations funds, here.
00:26:44.000And they would dress him down for an hour.
00:26:46.000And what people don't understand is if you don't spend that money, your budget for the next year will be lower because there's no need to have a higher budget.
00:26:55.000Instead of tying it to mission to say, did you achieve your mission objectives?
00:27:21.000Yeah, and that kind of shifted my thinking, in that these systems have their own incentive to exist and to grow, because those guys that were holding that general officer's or that O-6's, that colonel's, feet to the fire, they also have an incentive, because they were part of that trickle-down.
00:28:10.000And then what, like $150 trillion of unfunded liabilities.
00:28:14.000In other words, we've promised people money for the next 30 years.
00:28:18.000And it's debt that, you know, I don't see how we'll ever escape that debt.
00:28:23.000And the thing about it is, and I don't want to be pigeonholed, because I'm actually quite liberal when it comes to... my politics are like yours in that I'm kind of a man without a home, but they also change at different levels of analysis.
00:28:37.000I'm very liberal with my family and I'm very like communist.
00:28:46.000And even in my community, I'll help someone out out of pocket or do something for them that's a strain on my time or might hurt something else because there are really no solutions.
00:29:13.000Like I don't, I really have enough crap in my own life.
00:29:17.000As long as someone's not getting hurt.
00:29:18.000Yeah, as long as no one's getting hurt, consenting adults.
00:29:20.000Like I have enough problems and I screw up enough and people have, there's a laundry list of things that people could say about me, how I've screwed up in my life.
00:29:27.000But then as I go up to higher and higher levels of analysis, more conservatism takes place.
00:29:34.000And that's a result of just, you know, having an engineering mindset when I'm looking at life and understanding that it's just not Republican or Democrat or leftist or rightist or liberal or classically liberal.
00:29:50.000All of these monikers don't work for me because they break down at some level of analysis.
00:29:58.000I think the problem is these ideologies that people subscribe to, where you have a predetermined pattern of thinking that you're supposed to adopt.
00:30:28.000There's so many crazy things that people just adopt that don't make any sense.
00:30:32.000And, you know, when you subscribe to an ideology, the problem is if like if you define yourself as this person, I am this.
00:30:40.000I am a hardcore right-wing blah, blah, whatever it is.
00:30:44.000You immediately close the door to all the very productive and interesting things that the other side thinks.
00:30:49.000Yeah, and you're also making yourself into a tool of propaganda.
00:30:52.000Because if someone, if I meet someone and they just say, I'm this, it's like, well, I could reasonably predict everything that's going to come out of your mouth.
00:31:02.000I don't want to have a conversation with that person.
00:31:04.000I can't seek to learn from them because I could just pick up the Communist Manifesto or Mein Kampf and have a pretty good understanding of who I'm dealing with.
00:31:10.000And therefore, a conversation is not relevant.
00:31:14.000A lot of people are afraid of social ostracism too.
00:31:17.000So they're afraid of straying outside of the narrative, whatever side they're supposed to be on.
00:31:23.000And, you know, some groups are really good at making you feel like dog shit if you don't agree entirely with even things that don't even make any sense.
00:31:32.000So that's why people go along with stuff that's illogical, like open borders or whatever it is.
00:31:37.000They go along with things that's not in their best interest because they're scared.
00:31:43.000They're scared of being cast out of the kingdom.
00:31:45.000They're scared of being excommunicated.
00:31:47.000Yeah, I dealt with a lot of people first when I retired from the military and then more recently leading up to the last election where I was entertaining the deal of doing some work for government, believe it or not.
00:32:01.000And because as we talk more, you'll figure out I'm pretty anti-institutions.
00:32:07.000I'm really against those types of things.
00:32:10.000But I really felt, if you would have asked me three years ago how I felt about the Trump election and all of that stuff, I was very excited because he was saying a lot of things that I wanted someone to say.
00:32:21.000And this is what I think people kind of lack. My whole life is built around pattern analysis.
00:32:27.000I really enjoy patterns, examining and looking into patterns.
00:32:33.000And there's a pattern here. You'll laugh when I say this first part of the pattern, but then I'll make it make more sense later.
00:34:20.000And so we have to put up with all of this other stuff because we understand that when the system is corrupt at every level, you need someone who's outside of the system to come in and set the system right.
00:37:37.000And so when I examined Trump, I said, yeah, I don't like what he says.
00:37:41.000I wouldn't want him around my daughters.
00:37:43.000I wouldn't want him at a dinner party.
00:37:46.000But he seems to be saying these things like he's going to reset this system.
00:37:50.000You know, I think it was Chappelle was on your show or another show or someone like that where he talked about Hillary saying, you know, something about the tax loopholes or whatever.
00:37:58.000And he just hit right back at her and said, well, the people who are funding your campaign take advantage of those same loopholes.
00:38:04.000And if they're there, I'm going to take advantage of them.
00:38:06.000I wouldn't be a pragmatist if I didn't.
00:38:08.000When he started saying stuff like that, it seemed to me like he was going to upend this system.
00:38:12.000The jury's out on that because I don't know how I feel these days.
00:38:15.000We can get into that if you need to, if we want to.
00:38:17.000But he's an outsider personality, and I thought he was going to really reset this system.
00:38:23.000And there are good things that are happening.
00:38:26.000If I were to grade him, I would probably give him a C plus or a B minus.
00:38:30.000Certainly better than what was happening under Biden.
00:38:33.000I was still in the military when Biden was in charge, and it was awful to say the least.
00:39:08.000And it was, you know, I would sit there and say, you know, all of the friends, all the people that I know who've died during this war, not all of them, but 80% of them, and the numbers bear this out when you look at them.
00:39:19.000They're all white guys from the middle of the country who were on their farms or, you know, not all of them, 80% of them.
00:39:26.000I think the numbers bear out about 80% of them.
00:39:28.000Were these guys from the Midwest or these places where they didn't really have a lot going?
00:39:32.000And they went off to fight a war that we probably shouldn't have been fighting in the first place, especially in Iraq.
00:39:40.000And now you're saying that those people who make up the majority of the combat deaths are somehow part of this problem and that other people aren't benefiting from it.
00:39:50.000I don't believe race to me is disgusting.
00:39:52.000Even to talk about someone's race, even on both sides of the spectrum, when they were confirming that Supreme Court justice, I can't remember her name right now off the top of my head just because I'm a little nervous still.
00:40:07.000Yeah, they were talking about how it's historic because she's black.
00:40:10.000And Biden had said he's going to hire a black woman to do this job.
00:40:14.000If I had worked my whole life to do something, but now I'm only being elevated to this next position because of my gender and the color of my skin, I would turn that job down so fast because that's not what I want to be known for.
00:40:26.000These are immutable characteristics that I'm not in control of.
00:40:30.000I didn't choose to be born white or with blue eyes.
00:40:32.000I didn't choose to be born in a trailer park in the middle of nowhere without a dad at five.
00:40:52.000It's just a great way to control people because you pit people against each other that way.
00:40:57.000And it's just an awesome way that they can stay in control and make everybody walk on eggshells and think that they've victimized people in order to get to their position and they have to be shameful of who they are that they had no control over.
00:41:13.000It also gives people an easy rubric to judge other people.
00:41:28.000And it gives people, people want easy answers, really, at the end of the day.
00:41:32.000They want to be told the easy rubric to navigate life because really none of it's easy and it requires discipline, like you said before, and thought.
00:41:40.000And so it was that stuff in the military.
00:41:43.000I remember getting told in an equal opportunity briefing we were getting, it doesn't matter what you meant when you said what you were saying.
00:41:53.000It only matters what the person felt when you said it.
00:41:57.000They'd said that in a military briefing?
00:41:58.000This is a military equal opportunity briefing.
00:42:01.000And the example they gave was if a woman walks into the, like we worked with a lot of civilians at this military organization where we were developing these offensive cyber capabilities, a lot of civilians in there.
00:42:14.000And so if, you know, woman X walks in today and she's got a dress on, and the thought in your head is, I'd like to get my wife that dress or something like it or find out where she bought it.
00:42:24.000And you just say, that's a nice dress.
00:42:48.000But it was weaponized and it was carried out in a way where it's only about how people feel and not what a reasonable person standard would be in a particular situation.
00:42:56.000And from the time I joined the military until that time, we had been at war.
00:43:00.000My entire time in the military, we were at war.
00:43:43.000And but then someone would overhear that joke or something.
00:43:45.000And now you're looking down the barrel of a 15-6, which is a military investigation.
00:43:51.000And all of these things that could permanently impact your life and give you a scarlet letter, to where you could never be employed again or do anything ever again, because you were simply trying to relieve some pressure or you were trying to find out where to buy your wife her next dress, and now your life's being ruined.
00:44:08.000And I know guys who suffered under that sword.
00:44:10.000Like I wouldn't name them, but I know guys who, you know, their career met a terminal end because of a dumb joke or something.
00:44:18.000It's like you can't be expected to go out and shoot people in the face and then be sensitive to someone's feelings an hour later.
00:44:25.000It's just, it doesn't, it does not work.
00:45:41.000And the other thing that they were doing in this briefing, which is where I kind of, you know, the last couple of years of my military career, I got in trouble a couple of times, or I should say, called down.
00:45:59.000I wasn't high in the dominance hierarchy, but I was adjacent to people who were as an advisor.
00:46:05.000And the amount of in this briefing in particular, they had gotten into, you know, it's bad that there are so many white people.
00:46:20.000I'm doing high points here, but we need more diversity.
00:46:22.000I was part of an accepted career program that they were starting to call like the old white boys network because most of the people, so the requirements for this network were you had to speak a couple languages.
00:46:34.000You needed an engineering degree or some kind of demonstrated engineering background.
00:46:58.000You need somebody who speaks languages.
00:47:00.000Well, now they also need to be kind of, you know, speak French, speak Russian, whatever it was.
00:47:06.000So they had to have studied or lived in an area and done this.
00:47:09.000And they need to be able to go through these crazy tactical and strategic types of courses.
00:47:14.000By virtue of those things, you're going to get men.
00:47:18.000And there were lots of women, but then there'll be more white men.
00:47:21.000And it's not because the pool presented itself that way.
00:47:25.000Now you have to extract from that pool.
00:47:28.000And so in this briefing, when they were talking about like the old white boys network or how we need to change things, I said, you know, do you realize that most men have more in common with each other than with most women?
00:47:40.000Or like if I say I need more diversity in a particular room, if you said diversity of thought, I'd be fine with that.
00:47:48.000But Joe and a random black guy in the same program in the same office have far more in common with each other than with the white woman.
00:47:59.000But what you're saying is these people need to have all separate different colors and different like all of this needs to be this way.
00:48:06.000It's going to naturally present itself that way because men in the military generally are disagreeable.
00:48:11.000Men in the military who like engineering are generally hyper disagreeable.
00:48:16.000And the only difference between these two people is the pigment of their skin.
00:48:20.000So this fake diversity quota that they're putting on top of us doesn't achieve anything other than giving some officer a bullet on their OER.
00:48:29.000And I got pulled into the office afterward.
00:48:31.000I said way more than that, but essentially afterwards they're like, hey, Chief, you can't say that in those briefings, like the way that you were getting animated in there and what you're saying, what you're doing.
00:48:53.000And the similarities and the way that things stack up: you recruit from a pool of volunteers and candidates.
00:48:59.000If I'm recruiting from a pool of volunteers and candidates who are 80% male and white, I have to expect that the selected individuals are going to be male and white.
00:49:08.000The majority of people who join the military, I don't control this.
00:49:12.000I'm just, as an engineer, I'm looking at statistics.
00:49:15.000Also, if you want a highly functional, productive group, it's got to be based on meritocracy.
00:49:49.000Now, whether or not we should be using that all the time or how we use it, that's a separate question.
00:49:53.000But the entity itself needs to comport itself in this way.
00:49:56.000Otherwise, you are endangering this truly special experiment, which at least in its beginnings valued the individual.
00:50:05.000It valued individual rights and states' rights.
00:50:10.000And the founders, and this was another thing I said in that briefing, was the founders knew, yes, they were all slaveholders, but they knew that the Constitution and the Bill of Rights and the Declaration of Independence would eventually lead to a system where we had to acknowledge these people as people.
00:50:26.000And we fought a civil war where a million white dudes died to see this experiment through.
00:50:35.000You have to look at the things the zeitgeist of the time.
00:50:38.000If they had just said, nope, everyone's going to be free, there will be no slaves, you would have never gotten ratification through the southern states.
00:50:45.000But they knew that there were, and when you read the Federalist papers, they knew that they were erecting this system.
00:50:51.000When you look at Thomas Jefferson and some of these other great thinkers who, yes, he owns slaves, I get it.
00:50:56.000They knew what they were building and they knew what would ultimately terminate in.
00:51:00.000And then we had a civil war where we destroyed our country from the inside to see this dream come about.
00:51:07.000And now we're just going to all go back and say they're all slave owners.
00:51:10.000I know this has all been said here a million times, but this stuff animates me because it's built with blood and treasure.
00:51:16.000Well, it's also, you can't judge people from the past based on the standards of the present.
00:52:01.000Like someone, you know, stores your stuff for posterity's sake, for the future to hear about this.
00:52:06.000You know, I've always loved your podcast, Joe, and it was because you're a genuinely curious person, and I'm not kissing your ass right now.
00:52:15.000You're a genuinely curious person that was saying things that were not in the current zeitgeist at the time, and you refused to apologize for it.
00:52:23.000And it led to a lot of great things, but it led to an updating of the system.
00:52:28.000And you did it with dialogue, with dialogos, with two people trying to learn things about each other.
00:52:34.000And it led to an updating of a system.
00:52:36.000I think it's very important for culture to have free and open dialogue so we can update our system.
00:52:41.000So bad ideas can die instead of us, and we don't have to die for our bad ideas.
00:52:45.000Because if I can't express a bad idea, I have to act it out.
00:52:49.000And if I act out the bad idea, it could kill me.
00:52:55.000And it's just really, there's just been such a weird inversion in politics where the free, hippie-loving liberals of yesteryear are now the ones telling you what words you can use.
00:53:08.000There are no borders, all of these crazy things.
00:53:11.000And I always say to people, I said it to Andy on my last podcast with him.
00:53:16.000I'm like a 1996 Bill Clinton Democrat.
00:53:19.000If you go watch his State of the Union and he talks about lowering debt, getting out of debt, actually working with Newt Gingrich to get out of debt, securing the borders, making work and education freely accessible.
00:53:57.000Judging people by the standards of the past, you know, JFK doesn't look so good in the Me Too movement.
00:54:03.000You know, I mean, he would have got canceled.
00:54:05.000It's like you have to recognize that those, this ideological bubble that we find ourselves in left versus right, Bill Clinton does not fit in that.
00:54:15.000Bill Clinton is squarely on the right in terms of 1996 standards applied to today.
00:54:23.000No, he would never want to hear that because he's kind of shifted with the zeitgeist because that's what you kind of have to do if you want to stay in your party and be protected by your party.
00:56:11.000But generally, my principles are in place.
00:56:14.000And when you watch these people who get in their 30s, 40s, 50s, and 60s, and their core foundational principles are changing, it really should give you cause for concern.
00:56:24.000Because like you were saying this at this time, and now you're saying this at that time.
00:56:28.000Generally, my rubric, which I don't think will change about myself, is: I'm fervently for the individual, I'm fervently for truth, and I believe you should measure the world by looking not at what your intentions are but at what the outcomes are, and then evaluate the system and how it scales based on those outcomes.
00:56:51.000I try to hold myself up to that standard.
00:56:54.000I fall short of that standard all the time, but I try to live by it, and I feel like that will always be me, even into my 90s, unless something goes horribly wrong. And I've pretty much been here for the past seven or eight years or so; even into my 30s, I wasn't quite sure who I was as a human, but I'm pretty steadfast in that now,
00:57:20.000and the amount of opportunity and the amount of goodness in my life, and my children and my home and the things I've been able to do, have really been born out of that.
00:57:29.000For the last seven years, truth has been at the top of the decision matrix for me, the top of the hierarchy. I'm going to try not to cut corners whenever I can and help good people around me, and the truth is the way that I'll organize and function myself in life, and I will try to only judge people as individuals.
00:57:52.000You know, these are Christ's teachings from 2,000 years ago, but the world for me has just opened up in a way that I could have never predicted.
00:58:01.000Using a very simple rubric, it's not easy, but it's simple.
00:58:05.000And this isn't me, I didn't come up with this; this is the result of, you know, watching a bunch of experiments go bad. But if people just adopted that very simple thing and tried it for three months, you'll feel better about yourself, you'll feel better about the world, you'll feel better about the people proximately around you.
00:58:23.000It might make you hate the government more, yeah. Well, I don't think...
00:58:28.000If you don't hate the government, I think you're not paying attention.
00:58:55.000When I joined the military, I was in signals intelligence and essentially learning the ins and outs of radars, how radars work, what they do, how they function.
00:59:05.000Did you guys ever see any weird shit, like UFO shit?
00:59:13.000I was more in the signals intelligence side of the house, focusing first on electronic signals or emanations from radars, mapping them so that, you know, if we were going to go do the ground invasion and there was going to be some air support going in first and blowing shit up, we would tell them, hey, there's a man-packable SA-7 here.
00:59:33.000And then telling these pilots so they didn't get shot out of the sky.
00:59:37.000Quickly, when the war kicked off, that became irrelevant because there was no surface-to-air missiles, surface-to-surface missiles in Iraq.
00:59:44.000We had knocked them all out in the first few weeks.
00:59:47.000So then it shifted to communications intelligence.
00:59:49.000So I kind of retrained on communications intelligence, and that was at that time off of cell phones, off of push-to-talk radios, repeaters, long-haul networks, terrestrial networks, extraterrestrial networks.
01:00:02.000And what I mean by that is the satellites in the sky.
01:00:06.000And doing analysis on those to try to inform what we call the common operating picture of the battlefield for a combatant commander.
01:00:14.000So a combatant commander wants to know where the bad guys are, what they're doing, what they're saying.
01:00:18.000To the amount that we could, my job was to come up with solutions and conduct passive and active signals analysis on these things and then inform the commander so that we could mitigate risk.
01:00:37.000I'd been doing this for about seven years, eight years.
01:00:40.000And from there, it shifted to the phones getting smart.
01:00:43.000And essentially, it went from you walking around with a 2G phone or a 3G phone that had limited compute capability to now there's robust compute capability with the advent of like the iPhone.
01:00:55.000And now it's like, well, now we've got to get after guys who are essentially walking around with a computer in their pocket, with all this capability, that we could never have envisioned 20 years ago.
01:01:04.000Because the military and our forces that we're fighting against, it all comes down to our ability to shoot, move, and communicate.
01:01:11.000Communication being the part that I was focused on.
01:01:13.000So as the advent of the iPhone and those things came out, the Army realized we didn't have a computer network operations MOS.
01:01:20.000We didn't have an offensive cyber component.
01:01:23.000We didn't have a defensive cyber component.
01:01:25.000So we kind of, I was there at the ground floor when we were building out these new MOSs now that are all over the military.
01:01:31.000But at that time, there was a thought going into, you know, we need to have people who know how to be on-net operators.
01:01:37.000Ethical hacking, as paradoxical as that sounds.
01:01:41.000That's what the lawyers called it.
01:01:42.000So it's hacking at the end of the day, but ethical hacking because you've got the backing of the U.S. government.
01:01:47.000And so we set up that framework and really started launching into operations, you know, 2006, 7, 8, all the way into my last deployment in 2017.
01:02:00.000It was all focused on computer network operations and how they lash up with terrestrial networks.
01:02:05.000How do we exploit all of that was one facet of my job.
01:02:09.000And your question was, how did I get into all of that?
01:02:18.000What was the operational aspect of it?
01:02:20.000How did you actually, what did you do?
01:02:23.000So, you know, I'll stick to terms that are more generally understood by the public, but learning how to do things like war driving, collecting on networks, Wi-Fi endpoints, cell phones, understanding the ins and outs of them, understanding how to do forensic analysis of them.
01:02:42.000So after there was an operation and a bunch of gorillas had been sent in to kill a bad guy, we could derive maximum intelligence value from the handset to plan other operations.
01:02:54.000And so, you know, it would be passive monitoring of networks to inform the intelligence picture, which would lead to either combat operations or active computer network operations, where now it's like, well, there's, you know, a, I don't know, an Iraqi or an Afghani router that hasn't been patched in three years.
01:03:18.000And we think we can either write or find a zero day, which is just an exploit of those routers, where we can muck with their router in a way where they think they're getting good information and they're not, or they're erecting other things to mitigate risk for the commander.
01:03:39.000And so that really, you know, exploded at that point.
01:03:42.000And between that and human intelligence, which is kind of the actual gathering of intelligence from other people, you know, you would call it a spy, or James Bond, but James Bond was a horrible spy.
01:03:56.000I mean, yeah, you know, your job's to remain anonymous, and you're walking into a casino and there's Goldfinger calling you by your first and last name.
01:04:15.000And then my focus for the last 10 years was how does signals intelligence computer network operations become a force multiplier for people conducting overt and clandestine operations throughout the theater at that time.
01:04:29.000My deployments and my time was spent in Iraq, Afghanistan, Africa, Northern Africa.
01:04:36.000And then a lot of people don't know it, but we were in active combat operations in the southern Philippines as well for a fair amount of time.
01:04:43.000I want to say maybe seven or ten years.
01:04:44.000We were doing combat operations in the southern Philippines.
01:04:47.000My first deployment to the southern Philippines was 2007.
01:05:29.000So there's what's called the Autonomous Region in Muslim Mindanao, which is the southern part, from a place called Zamboanga down to Sulu or Jolo Island.
01:05:39.000And there's a, it's a funny joke, because if you zoom into Zamboanga, which is, God, look how many islands there are.
01:05:57.000The tip of that penis is called Zamboanga.
01:06:00.000All of our combat operations, now if you zoom out a little bit more and pan more south and zoom out just a little bit more so the joke hits, all that sperm south of the tip of the Zamboanga city, there are terrorist operations in here.
01:06:17.000Now, if you go to that main island called Sulu, there's Jolo Island; that's where I was, on this tiny island out in the middle of nowhere.
01:06:47.000In fact, there was a guy, and I believe I'm going to get his name wrong, perhaps, but I believe his name, it was either Isnilon Hapilon or, oh, it's Umar Patek.
01:06:59.000He was actually arrested outside of Osama bin Laden's compound the day after he was killed.
01:07:03.000We were trying to find him and kill him on that island, or in and around that island.
01:09:21.000I mean, I just, the people down there were fantastic.
01:09:24.000And it was awful because those guys would be bombing churches, Christian churches, and stuff like that.
01:09:28.000And we were doing, like I said, counterintelligence operations out there, doing intelligence collection to inform that battle picture.
01:09:38.000But those guys had direct links with Osama bin Laden and other people.
01:09:42.000Yeah, right after we, like I said, I think it was, I think if you look it up, I think his name is Patek, P-A-T-E-C, P-A-T-E-K.
01:10:15.000I prefer to call people out face to face, but I always make sure people know I was not a cool guy.
01:10:22.000Like sometimes I got to dress like one.
01:10:24.000For a few years, I didn't wear any uniforms, and I got to grow my beard out and act like a cool guy.
01:10:28.000But I was really a nerd for cool guys.
01:10:30.000I've literally got pictures of myself down in Jolo or in Afghanistan or anywhere else with tape around my glasses and a Pez dispenser and my radio and collection equipment, looking like a true blue American nerd.
01:10:44.000But I was not the guy who kicked the door in.
01:10:46.000I was always the guy who pointed the door out.
01:10:48.000So I'd be safe in the Humvee in the back, you know, eating an MRE, and somebody that looked like another gorilla, you know, like Andy Stumpf or Tim Kennedy or someone like that.
01:11:01.000I'll be out here or I'll be in an airplane above, you know.
01:11:05.000And yeah, it was being born in North Dakota, and, you know, my mother, a single mother after she left that first guy, a trailer house in the middle of this little town called Cavalier, North Dakota.
01:11:30.000But, you know, my mother, you know, I don't know if you would remember this, but maybe other people my age, you know, you'd get these scholastic book order forms that you'd bring home from school and you could order books.
01:11:45.000There'd always be like little cool stuff like you could get like, you know, a pair of gloves or a hat or something.
01:11:50.000Anyway, one time there was a coil radio that you could order, and you put this coil radio together with an earpiece, no battery.
01:11:59.000It was just that the electromagnetic radiation would activate the coil, and you could listen to radio chatter.
01:12:15.000And it was just kind of like a record, like, you know, how a needle hits a record.
01:12:20.000Electromagnetic radiation would hit the coil, the coil would feed up to an amplifier or up to an earpiece, and in the earpiece you could hear chatter.
01:14:12.000In this little town, Edinburgh, North Dakota, there was a guy who had a computer store in a basement of an old general store, and his name was Jeff Munzerbrotten.
01:14:20.000And I would go there and ask him questions about computers and just start learning ins and outs on how do I update the RAM?
01:16:27.000Like, you're wasting... like, obviously my CPU clocks high.
01:16:32.000I'm always thinking, even when I'm not thinking, and even as we're sitting here talking, I'm thinking about other things or stuff I want to do when I get back to my computer or stuff I want to do for my business.
01:16:47.000I had joined to be a military policeman, which I absolutely would have hated.
01:16:52.000All of them got turned into infantry or stood gate guard, which is a needed function in the military, but it doesn't apply to my personality.
01:16:59.000But when I went to the recruiter station out in Minneapolis, I think it was, I was a bonehead and I forgot my driver's license.
01:17:06.000And they're like, well, and I was supposed to leave.
01:17:08.000And at this time, I had dumped my girlfriend, told everyone goodbye.
01:17:12.000I'd wiped the dust off my boots, like left Cavalier, North Dakota.
01:17:16.000And I was like, hey, I'm not going back.
01:17:34.000And they're like, well, you're not leaving today without a driver's license.
01:17:39.000So I looked at my recruiter and I was like, I don't know what job you need to get me into, but it needs to be a different job.
01:17:44.000And they're like, well, you scored exceptionally high in your general technical part of your ASVAB, which is like understanding machines and objects and stuff.
01:17:52.000So we could get you into this like Intel job where you'd learn about radars and stuff.
01:20:41.000Like, amplitude modulation isn't as efficient as frequency modulation when it comes to the vocoder producing sound.
01:20:49.000Amplitude modulation travels farther, but it doesn't have the amount of information.
01:20:55.000The carrier wave can't be modulated with as much information as you need, whereas frequency modulation is much quicker, up in the megahertz, and you can add more sound, more information, which is why it sounds better.
01:21:08.000So FM sounds better, but it doesn't travel as far.
01:21:13.000When I was training people in the military on this, I always use the analogy of if a party is happening next door, you can hear the bass music, but you can't hear the treble.
01:21:21.000You can hear the bass music because that frequency travels farther because it's lower in the frequency band.
01:21:27.000But you can't hear the treble because it's higher frequency and there's more modulation.
01:21:34.000And so it disperses quicker and you can't hear it as well.
01:21:37.000And it's the same thing with like VLF comms coming off of like a submarine can travel underwater for a very long ways, but you can't put as much information in them as you could if you were doing, you know, VHF or UHF comms where there's lots of modulation.
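The fidelity-versus-reach tradeoff described above can be sketched numerically. This is a minimal illustration with made-up carrier and message frequencies, not anything specific to the comms gear being discussed: an AM signal occupies roughly twice the message bandwidth, while an FM signal with a large frequency deviation spreads over roughly twice the deviation plus the message frequency (Carson's rule), carrying more information per second at the cost of spectrum.

```python
import numpy as np

fs = 100_000                        # sample rate, Hz (arbitrary for the demo)
t = np.arange(0, 0.1, 1 / fs)       # 100 ms of signal
fc, fm, dev = 10_000, 1_000, 4_000  # carrier, message, FM peak deviation (Hz)
msg = np.sin(2 * np.pi * fm * t)

# AM: the message rides on the carrier's amplitude.
am = (1 + 0.8 * msg) * np.cos(2 * np.pi * fc * t)

# FM: the message rides on the carrier's instantaneous frequency.
fm_sig = np.cos(2 * np.pi * fc * t + (dev / fm) * np.sin(2 * np.pi * fm * t))

def occupied_bandwidth(x, frac=0.99):
    """Width of the band holding `frac` of the signal's power."""
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    kept, cum = [], 0.0
    for i in np.argsort(power)[::-1]:   # take the strongest bins first
        cum += power[i]
        kept.append(freqs[i])
        if cum >= frac * power.sum():
            break
    return max(kept) - min(kept)

bw_am = occupied_bandwidth(am)      # ~2 * fm = 2 kHz
bw_fm = occupied_bandwidth(fm_sig)  # ~2 * (dev + fm) = 10 kHz (Carson's rule)
print(f"AM bandwidth: {bw_am:.0f} Hz, FM bandwidth: {bw_fm:.0f} Hz")
```

The FM signal here carries the same one-tone message but spreads it across five times the bandwidth, which is the tradeoff being described: more modulation, better sound, shorter practical reach.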
01:21:54.000And, you know, a lot of my, you know, mid-part of my career was explaining this stuff to, you know, military guys who were trying to understand like, here's how a cell phone works, and this is how frequency works, and this is how we send information.
01:22:06.000And just kind of demystifying, you know, how a GSM network works.
01:22:12.000One of the things that I wanted to ask you about that is when new technology is emerging, how do you stay ahead of the ability to extract information from this technology, hack into networks before people understand the capability?
01:22:37.000And that's the beauty of the free market, is that the innovation to perform the function that you want someone to pay for will always move faster than your ability to exploit the technology.
01:22:48.000Then how do you explain things like Pegasus?
01:22:51.000Well, I mean, something like Pegasus, well, first off.
01:22:55.000Explain Pegasus to people that don't know.
01:22:57.000It was a persistent implant on cell phones for people.
01:23:05.000Initially, it was a one-click exploit, and then it became a no-click exploit.
01:23:08.000So in other words, you had to interact with something on the phone in order to initialize and install the implant.
01:23:14.000But the reason why it was so good is because it wasn't stored in the usual areas where you would expect a persistent implant to be.
01:23:27.000For instance, you know, you might want to put it in the application layer of an app or something like that where there's a binary that can run and execute commands or functions.
01:23:38.000And so they, I won't get into the very specifics of where and how they did this because I'm not sure if I got this information from the government or not, so I won't say it.
01:23:47.000But they stored it in a place where it wasn't normal.
01:23:50.000And you can read papers on your own and look at the forensics of it and how the actual implant was executed.
01:23:57.000But it essentially allowed people to own your phone and was the kind of implant I only dreamed of when I was helping develop my own implants in the military.
01:24:10.000Mostly what we would rely on is zero-day architecture and looking for something in a phone that either they hadn't patched or that the phone that you were looking at hadn't been patched.
01:24:20.000So phone makers have their own red teams going through the phone, because they want to sell a product that people will use, and people won't use stuff that can get hacked.
01:24:29.000So they'll do their own red teaming and they'll discover like, oh, you know, on this router we developed, we left this port open and it shouldn't have been open.
01:24:37.000So now we're going to write a patch that will close that port so that this port is no longer accessible by a guy like me.
01:24:42.000So I can't go in there and do something to this particular type of router.
01:24:46.000Another great thing, I'll say something good about the administration.
01:24:49.000They're doing some stuff right now to make sure that we're getting rid of Chinese technology and Chinese routers.
01:24:55.000And, you know, the PLA has a widespread botnet, and I can't remember the name of it, but they essentially implanted a bunch of old unpatched routers to get access to people proximal to government and business.
01:25:15.000And, you know, it looked like to me, I haven't read this anywhere, but if I were looking at this implant and how it was done, they were trying to really cause some trouble.
01:25:25.000It was being placed at critical places, think power, think energy, think banking.
01:25:31.000Like they really wanted to cause some ruckus.
01:25:34.000And I have not been part of this administration, so I'm not saying anything classified for those of you who are listening.
01:25:40.000But there was a decision to say, hey, we need to make sure that these things get patched, and also that we're not bringing in architecture from overseas, because they don't play by the same rules that we at least say we play by.
01:25:51.000So that's why they banned Huawei devices.
01:26:15.000And then, you know, I had heard some people say, oh, they're just trying to stop competition.
01:26:19.000It's like American companies are trying to stop it.
01:26:22.000And then I went into it deeper and I said, no, it seems like there were third-party inputs on some of their routers and some of their network devices, engineered so that they could be accessed by a third party.
01:26:38.000And because of whatever lack of understanding, lack of knowledge of how these things are constructed, the people that purchased them weren't aware of them.
01:26:49.000And these things had gotten into place.
01:26:51.000And they had gotten into place in universities.
01:26:53.000They got into place in military establishments.
01:26:56.000They were using them in cell phone towers that people had, you know, inadvertently bought from China.
01:27:02.000And that's really, I mean, I can tell you firsthand from having done some of the forensic exploitation on this stuff.
01:27:08.000Another large part of my career I didn't talk about was just on mobile forensics and media forensics, which is essentially, you think of, like, CSI: Miami, or CSI: whatever the city was.
01:27:35.000I would do this in the military so that when we did do an operation, and I was part of some of the largest ones ever done out in Afghanistan, there would be treasure troves of phones and all of these computers and stuff like that.
01:27:49.000And I had a great team that worked for me.
01:27:51.000In my deployment in 2015, we would go in afterwards, gather up all of this stuff.
01:27:57.000And, you know, the task force commander would literally be standing by and we would say, you know, here's the intelligence that we've derived.
01:28:10.000And those guys would be rolling like within moments after the last operation.
01:28:13.000Like some operations we'd do where we'd be rolling one after another target because we were getting really good at media forensics and intelligence that was there.
01:28:22.000And then getting into active media forensics, which is a different discipline.
01:28:25.000But essentially, I can get into that later if you want to.
01:28:28.000But launching and doing these follow-on operations off of, you know, dumping the binary from a phone and examining it at the ones-and-zeros level to see everything that was going on with this thing.
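A first pass at examining a raw dump at the ones-and-zeros level is usually just carving printable strings out of the binary, the way the classic Unix `strings` tool does. A toy sketch, where the embedded "artifacts" in the fake dump are invented purely for illustration:

```python
import re

def extract_strings(blob: bytes, min_len: int = 4) -> list[str]:
    """Pull runs of printable ASCII (at least min_len chars) out of raw bytes,
    the typical first pass of media forensics on a binary image."""
    pattern = rb"[\x20-\x7e]{%d,}" % min_len
    return [m.decode("ascii") for m in re.findall(pattern, blob)]

# Toy 'dump': non-printable bytes with two readable artifacts embedded.
dump = b"\x00\x01\xffcall +93-700-000-000\x00\x07\x00meet at safehouse\x9c\x02"
print(extract_strings(dump))
```

Real forensic tooling goes far beyond this (file carving, deleted-data reconstruction, parsing application databases), but string carving is the step that turns an opaque blob into leads.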
01:28:40.000Or if it was really high-value, like, the organization that I worked for at that time did the analysis of the Osama bin Laden media.
01:28:48.000And, you know, on that media, we're doing far more than we would for another piece of media and that we're, you know, x-raying it and we're looking at maybe what the disk looked like before or what was destroyed or reconstructing things, spending millions of dollars on that intelligence analysis because we wanted to fully understand everything that this guy was involved in and what he was doing and where he was and who he was talking to.
01:29:10.000And so that was another part of my career that I did for about five years or so.
01:29:14.000What was going on with the Huawei phones?
01:29:18.000I mean, some of them were coming out implanted.
01:29:22.000In other words, there was access built in for a foreign actor.
01:29:25.000And then in other terms, other places with routers, with the ZTE stuff, there were just things that you would patch or that you would fix as a company who was trying to protect the consumer and create a product that people would use.
01:29:39.000So they were creating persistent back doors either by actively placing code on there that would allow root access or they were leaving things open, especially in Africa, like the work that, you know, when I was working in Africa, the Chinese were just owning Africa.
01:29:54.000They were just giving them communications infrastructure.
01:29:58.000And they were doing that because they wanted their resources and they wanted to know what these people were saying and what they were doing.
01:30:04.000And so I'm a free market guy, like, I'm as free market as a guy can get.
01:30:09.000I want the best people building the best products and I want everyone to be able to compete.
01:30:13.000But in that case, I would never own a Huawei or a ZTE or anything else.
01:30:18.000On a consumer level, what were they doing with those phones?
01:30:21.000Like if they had imported them to the United States, if they didn't have that ban, what would have been the issue?
01:30:27.000Getting access to, you know, any number of people. The Chinese really want access to everybody.
01:30:34.000But you could start at the topical level of just saying, you know, getting Joe Rogan to use a ZTE would be, that would be my wet dream as a guy who used to do this work back in the day because you're talking to the president or you're talking to this guy or that guy.
01:30:46.000And I can build out a network of understanding who you're in contact with, who you're talking to, what's being talked about.
01:30:53.000But then also finding out this person's phone number and now doing a deep dive on there.
01:30:57.000So it's really about getting all of that data and constructing an analyst notebook, essentially, outline of who's talking to who, who do we need to implant.
01:31:10.000They would want this in the hands of somebody who's in charge of a business because they want their IP.
01:31:14.000They would want this in soldiers' hands so they would know deployment dates or who's going where and who's doing what.
01:31:18.000They want this in routers because routers are usually the most unpatched piece of technology, especially before these days of more automated patching.
01:31:28.000But back in the day, like you had to manually update a router.
01:31:31.000And if you didn't, well, then you had potential exploits that were sitting on that router where I could gain access to the router in your home, or I could gain access to a BGP router, which is like a border gateway, which is moving all of the internet data.
01:31:44.000Or I could get access to a microwave terminal.
01:31:47.000If you look at a cell phone tower, they've got the microwave terminals on there that are sending information in between them.
01:31:52.000If those are Chinese parts that are either being used for the processing, the CPU, or the physical infrastructure of that, the products that they were putting out would give me direct access to the information that's being passed on those terminals.
01:32:05.000So you're getting, you know, system-level, root-level access through machinery, through communication devices, and through things like routers where you can know everything you want to know about your enemy.
01:32:44.000Most of these phones, if you're just an average everyday citizen who's just going about your job, the phones today are pretty secure, especially versus a few years ago.
01:32:54.000If you're a reporter, now the nexus is, do you trust the government and do you trust Apple?
01:33:01.000If you trust the government and you trust Apple, then Apple's probably your best bet. You know, there's lockdown mode on an Apple phone, or whatever they used to call it back in the day.
01:33:11.000I think it was called reporter mode, but there was ways to encrypt the devices and to encrypt the chatter and the tunnel coming out of the phone, the RF coming out of the phone.
01:33:34.000They are more interested in monetizing people's data than they are providing them capability.
01:33:39.000So every time you take a photo, every time you upload a document, every time you talk to it, every time it asks you about your, you know, you'll get these questions where it says if your password's lost, you can back up your password in these ways.
01:33:55.000Tell us your mom's this, your mom's that.
01:33:57.000Lockdown mode is an extreme, optional protection.
01:33:59.000It should only be used if you believe you may be personally targeted by a highly sophisticated cyber attack.
01:34:03.000Most people are never targeted by attacks of this nature.
01:34:06.000When iPhone is in lockdown mode, it will not function as it typically does.
01:34:09.000Apps, websites, and features will be strictly limited for security, and some experiences will be completely unavailable.
01:34:16.000Yeah, so when I was advising guys back in the day on going out and doing like a high-risk source meet, so they're going to go meet a spy for another country, and you're a military guy and you're debriefing someone or doing something, I was always telling them to use lockdown mode.
01:35:52.000But what Apple and Meta want to do is, like, they're trying to build these new neural networks.
01:35:57.000They're trying to, you know, humans, and we can get into this too later if you want.
01:36:02.000Humans are the only thing, in my opinion, and I'm happy to have you disagree with me, and I love to have this conversation.
01:36:08.000In my opinion, we're the only ones that are.
01:36:10.000Meta announced plans to discontinue support for end-to-end encryption for chats on Instagram after May 8, 2026.
01:36:16.000If you have chats that are impacted by this change, you will see instructions on how you can download any media or messages you may want to keep.
01:36:23.000Social media giant said in a help document, if you're on an older version of Instagram, you may also need to update the app before you can download your affected chats.
01:36:31.000When reached for comment, this is what Meta had to say.
01:36:34.000Very few people are opting for end-to-end encrypted messages and DMs, so we're removing this option from Instagram in the coming months.
01:36:40.000Anyone who wants to keep messaging with end-to-end encryption can easily do that on WhatsApp.
01:36:44.000But WhatsApp is a little squirrely, right?
01:36:51.000And so you asked me why I don't trust them.
01:36:53.000It's because of what they want to use it for. Humans, in my opinion, and some animals, are the only things that have the ability to project consciousness.
01:37:04.000And projecting consciousness is how you train a neural network.
01:37:07.000And it's how you train all these large networks.
01:37:10.000A lot of my time also in the military is spent.
01:37:12.000I was doing artificial intelligence in 2012, 2011, before it was even a buzzword.
01:37:17.000We were using artificial intelligence to map dynamic networks and to do other things, more pragmatic uses of it than how it's being used today with large language models or convolutional neural networks.
01:37:27.000But they need consciousness to train their models.
01:37:30.000So when Google offers you meta or Instagram or whoever else offers you photo storage, it's because they want your face to train neural networks.
01:37:38.000If they're going to pay for the compute, if they're going to pay for the storage for these things, they're doing it because they're going to use the data.
01:37:46.000If you're getting a free app, in essence, any free app, if the product's free, then you're the product.
01:37:52.000So when Google is allowing you to use a Google Drive and get a gig of storage, they're going to use those photos to train neural networks to do better facial recognition.
01:38:00.000What if you're paying for Google Drive?
01:38:02.000I don't know about their terms of service now.
01:38:04.000That's one of the best uses I've found for large language models: for any product I download, I have the neural network examine the terms of service.
01:38:14.000And then you can pretty much understand like, here's my focus.
01:38:17.000Here's the 40-page terms of services document.
01:38:20.000When you click that link that you got, what are they able to do with my data?
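As a concrete illustration of that workflow, here's a minimal sketch of the chunk-and-ask pattern: split a long terms-of-service document into pieces small enough for a model's context window, label each piece, and attach your focused question to every chunk. The chunk size, the prompt wording, and the sample text are all arbitrary choices for illustration; the actual model call is left out.

```python
# Sketch: break a long terms-of-service document into labeled chunks
# and build one focused prompt per chunk. Swap the resulting strings
# into whatever chat API you actually use.

def chunk_text(text: str, max_chars: int = 4000) -> list[str]:
    """Split text into roughly max_chars pieces on paragraph boundaries."""
    chunks, current = [], ""
    for para in text.split("\n\n"):
        if len(current) + len(para) + 2 > max_chars and current:
            chunks.append(current)
            current = ""
        current += ("\n\n" if current else "") + para
    if current:
        chunks.append(current)
    return chunks

def build_prompts(tos_text: str, question: str) -> list[str]:
    """One prompt per chunk, labeled 'chunk i/n' so the model can keep track."""
    chunks = chunk_text(tos_text)
    n = len(chunks)
    return [
        f"Here is chunk {i}/{n} of a terms-of-service document.\n"
        f"My question: {question}\n\n{chunk}"
        for i, chunk in enumerate(chunks, start=1)
    ]

# Toy document standing in for a real 40-page terms of service.
tos = "Clause A." + "\n\nClause B." * 800
prompts = build_prompts(tos, "What can they do with my data?")
print(len(prompts))
```

Sending the focused question with every chunk, rather than only at the start, keeps the model anchored on it no matter which chunk it is reading.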
01:38:46.000But they're trying to build these hyper-competent artificial intelligences.
01:38:50.000And you need two things for that, really: training data and compute.
01:38:55.000And that's why you start seeing them coming out with things like Meta building its own nuclear facility or something like that.
01:39:04.000So if I want to build a replica of Joe Rogan that I can make hyper-realistic AI videos for, I need every picture of your face from every angle.
01:39:13.000I need every wince, every squint, everything you've ever done.
01:39:16.000So I can introduce more training data to better train that neural network in order to generate more hyper-realistic versions of yourself.
01:39:25.000And so when a company is offering you something for free, and it's fine, like if people are fine with that idea, then by all means, download all the free apps that you want.
01:39:34.000But if you're downloading a free app, it's because you are the product.
01:39:37.000They either want to see how you type, they want to see what you're saying, they want to see how you're thinking about things, they want to understand your political biases, they want to look at your photos.
01:39:45.000And this isn't because they're a deep-seated nation-state actor.
01:39:49.000They can become that, but it's because they're trying to build the best products because the big money is in AI.
01:39:57.000So anytime you're doing any of these things, and it's just been obvious to me not from the onset, but pretty close to the onset, that yeah, this is a good example, right?
01:40:06.000Pokemon Go players built a 30 billion photo map.
01:40:09.000That's now training robots to deliver your pizza.
01:40:14.000So you, you know, they view people, and they can say they don't.
01:40:18.000And maybe if someone from there catches this podcast, which they well could, they might put out a statement that's saying that that's not their doing.
01:40:24.000But I'm telling you, as a person who has done media forensics, who has done computer network operations, and who has trained artificial intelligence models, that is precisely what they are doing.
01:40:37.000What is the difference between using Apple and using Android?
01:40:41.000Well, Android will do the same things, and Google will do the same things.
01:40:43.000It's just that I can root my phone or I can install a custom operating system like graphene or something like that, which I'm not doing right now.
01:40:52.000I had to make a sacrifice when I started my company, SpartanForge.
01:40:56.000And the sacrifice was I had to be the face of this product.
01:40:59.000And so I never had a social media until I started the company.
01:41:03.000And I didn't upload things to the cloud until I started this company.
01:41:06.000And it became just like, I have to sell a product.
01:41:09.000I have to, you know, and I'm actually selling a product, not people's data or people's photos.
01:41:15.000People often don't know who is behind the company, or who the organizing principal is and what they care about in the company.
01:41:23.000And I just made that trade and said, I'm going to have to become a public person and start putting things out there.
01:41:28.000And so, you know, I started a company.
01:41:31.000We started our first Instagram, and my marketing team started my first Instagram.
01:41:36.000And I had to start uploading things and talking about how I felt about things because I wanted people to know that this company was not going to be like the other companies that are out there.
01:41:52.000You know, we've got millions of emails from people who have signed up for our apps.
01:41:55.000Other companies who are starting companies, they want to go out and reach marketing people.
01:42:01.000So if you're starting another hunting app, maybe for cameras or for a call, a turkey call or an owl call or something, and you found Spartan Forge and you said, man, they've got 2 million emails.
01:42:13.000I could pay them a half million dollars for those 2 million emails and start some top-of-funnel marketing and go blast them.
01:42:21.000So they would pay me a lot of money for those emails.
01:42:43.000Not now. But what I still can use, and what I still do use, is that Android also publishes its framework in an open-source fashion, where you can look at the Android source.
01:42:53.000It's called AOSP, Android open source project.
01:42:57.000So the basis of Android, the nuts, think of it as the nuts and bolts.
01:43:01.000I'll try not to talk in too technical terms here.
01:43:04.000But the basic framework, think about it like a car.
01:43:07.000The frame and the engine makeup is published so you can look at how things work on the inside.
01:43:12.000Apple goes the opposite way and they don't publish any of that and you can't see any of that stuff.
01:43:16.000I'm for the free and open version because at least if I'm worried about my phone having a problem, I can actually dump the binary, or I can create an E01 file and examine it.
01:43:27.000I can look at the binary and say, is my phone acting like it should or doing what it should?
01:43:31.000Or is there some kind of persistent implant?
01:43:33.000I wouldn't be able to do that with a – I would have to trust Apple and Apple's ecosystem and whoever they're – McAfee or whatever they're using.
01:43:41.000I would have to trust them, which I don't.
01:43:43.000So I like the Android because of that. Is that option available for the average consumer that's not that learned in computers?
01:43:51.000Well, the great part about large language models now is if you wanted to dump your own phone today, you could follow along with a large language model and do it, your own Android.
01:44:02.000Well, you'd either have to pay a firm to do it, or you could buy some expensive software and download things like Cellebrite.
01:44:12.000You could get a Cellebrite, or there's other things called Forensic Toolkit, other things like that, that allow you to examine your phone at a deeper level.
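Whatever forensic tool you use, the common first step is proving that the acquired image hasn't changed between acquisition and analysis. Here's a minimal sketch of that integrity check using Python's standard `hashlib`; the file name and contents are just stand-ins for a real multi-gigabyte image.

```python
# Sketch: hash an acquired image at acquisition time, then re-hash it
# before analysis to show the evidence hasn't been altered. The
# "phone.img" file here is a small simulated image, not a real dump.
import hashlib

def sha256_file(path: str, block_size: int = 1 << 20) -> str:
    """Stream the file in 1 MiB blocks so large images never need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(block_size):
            h.update(block)
    return h.hexdigest()

# Simulate an acquired image with a small file.
with open("phone.img", "wb") as f:
    f.write(b"\x00" * 4096 + b"userdata")

acquisition_hash = sha256_file("phone.img")   # recorded when the dump is made
verification_hash = sha256_file("phone.img")  # recomputed before analysis
print(acquisition_hash == verification_hash)
```

If the two digests ever differ, the image was modified somewhere between the dump and the exam, and anything found in it is suspect.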
01:44:39.000I've just got everything against foreign actors.
01:44:41.000Just if they're not an American company, that automatically kicks them down a level for me.
01:44:46.000So anyway, Android just makes it much easier to examine your phone, or to understand if you've got something funky going on, than it is on Apple.
01:44:58.000So for the average person, like for me, like if I got.
01:45:30.000You'd be able to look at all of those things and then determine because Android open source project is published, you could do a one-for-one and say, well, you know, at the kernel level, there's this weird code that's not in the Android build.
01:45:47.000And then with a neural network, you could probably, I've never done it, but I'm sure you could figure out what the intent is of that code, even for a lay person.
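The one-for-one comparison described here can be sketched as a hash diff: build a manifest of known-good file hashes from the matching AOSP build, then flag anything on the device that either differs from the manifest or isn't in it at all. The paths and file contents below are toy stand-ins, not real AOSP files.

```python
# Sketch: compare hashes of files dumped from a device against a
# known-good manifest built from the matching AOSP source tree.
import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Known-good manifest: path -> hash, as built from the AOSP build.
reference = {
    "system/bin/init": digest(b"init v1"),
    "system/lib/libc.so": digest(b"libc v1"),
}

# What was actually dumped from the phone (toy data).
device = {
    "system/bin/init": digest(b"init v1"),
    "system/lib/libc.so": digest(b"libc v1 PATCHED"),      # modified in place
    "system/xbin/.hidden": digest(b"persistent implant"),  # not in AOSP at all
}

# Files whose contents differ from the reference build.
modified = [p for p in device if p in reference and device[p] != reference[p]]
# Files with no counterpart in the reference build.
unknown = [p for p in device if p not in reference]
print("modified:", modified)
print("unknown:", unknown)
```

Anything in either list is what you'd hand to an analyst, or paste into a model, to work out the intent of the code.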
01:45:55.000So I could take that information, I could put it into Perplexity, and Perplexity would lay out what's going on with it?
01:46:01.000Ostensibly, it would be able to, yes, unless it was some type of weird code.
01:46:05.000I don't know if I haven't used Perplexity, so I don't know if they have something like ChatGPT's Codex.
01:46:11.000Sort of just try, just to be like, can you help me examine my Android phone, looking for any malicious actors?
01:46:16.000Yes, I can walk you through structured non-destructive check for malware or other shady activity on your Android phone.
01:46:23.000First, without tools or commands, quickly check for common warning signs: sudden big battery drain when you're not using the phone, unusual data usage, particularly in the background, apps you don't remember installing, or icons briefly appearing and then disappearing.
01:46:36.000Lots of pop-ups, redirects in the browser, or a new default search engine or launcher, strange calls, or SMS messages you didn't send yourself.
01:46:44.000If any of those ring a bell, we'll focus on them in later steps.
01:46:47.000Yeah, it's just asking you, like, why are you running?
01:46:49.000So this is just something that you could do with an Android phone that you just can't do with an iPhone.
01:48:40.000If it's very long, send it in chunks and tell me chunk 1/3, chunk 2/3, et cetera, so we can keep track.
01:48:48.000I can do static analysis here, read and reason about the code, but I cannot actually execute it in a sandbox.
01:48:54.000So this is more like a careful forensic read-through than a full dynamic malware analysis.
01:49:00.000Go ahead and paste the code snippet your friend flagged as not fitting with the rest of the system, and tell me in a sentence or two where in the E01 it came from.
01:49:09.000Example, app folder, system partition, random file path.
01:49:17.000And so you can do a forensic examination of an Apple, by the way.
01:49:22.000Sorry if I misspoke there, but you can't do it to the level that you can with Android, because the Android Open Source Project publishes all of the code, so I can get an understanding of the very inner workings.
01:49:32.000So if something's being done, for instance, at the kernel, or you could think about it as like the lowest level of the phone, something that wouldn't normally get caught in a forensic examination, I wouldn't be able to do that with Apple.
01:49:45.000And the nation state actors are doing things at very low levels in the code framework for that exact reason because most people who aren't very deep into forensics would miss that.
01:49:58.000It would be like the fingerprint under the couch cushion or something like that.
01:50:01.000And what is the difference between what someone can do with an Android phone with the standard Android operating system versus Graphene?
01:50:12.000So that gets into, you know, if you wanted to wardrive and sample Wi-Fi networks in an area, or if you wanted to run a barrage attack on a Wi-Fi endpoint, you could work that in there to do things with the phone that you couldn't otherwise do with a standard Android operating system.
01:50:31.000But as far as on a consumer level, what protections do you have by running graphene that you don't have by running Android?
01:50:40.000You're much more in control of the ecosystem.
01:50:45.000And again, you could use a large language model to do this to understand exactly what's being run on the phone.
01:50:50.000You control the background services that can be run on the phone.
01:50:53.000So if you're getting hot-mic'd, or if your camera's taking pictures of you when you're not looking, or it's listening to you for advertising content, stuff like that, you would be in control of all of that in a way that you're not in control of on a native Android build.
01:51:04.000In control, like how so would it alert you that this is happening?
01:51:07.000Or just the functionality wouldn't be there for it to take place.
01:51:10.000Right, because the functionality is only designed for the standard Android operating system.
01:51:15.000And I haven't installed graphene in a while.
01:51:18.000So a lot of this, all of this updates, and I could be saying things that are incorrect.
01:51:22.000I stopped doing this about three years ago.
01:51:24.000Well, I know that there was, I forget what country it was, but they were focusing on people who use Google Pixel phones, for example.
01:51:31.000Yeah, because that's because that's one of the phones that are more commonly rooted.
01:51:50.000Because you're going to jack things up.
01:51:51.000You have to get the bootloader, essentially the starting mechanism of the phone that launches all of the other things. You have to get down to that level and unlock it. Is that available for all Android phones? No, not all Android phones.
01:52:06.000Lots of them lock it down, so you can't do that.
01:52:24.000And the older Samsungs made it available.
01:52:28.000On older Galaxy S7s and S10s, you could do more than you can with, like, you know, I've got the Galaxy Fold here, and you can do almost none of that on here.
01:52:40.000But like I said, I went away from doing all that, A, because it was work.
01:52:44.000B, because I'm not working in national security anymore, and I'm not, you know, I haven't written an exploit in years.
01:52:51.000I don't do this type of work anymore, and I need to sell a product.
01:52:54.000And, you know, working with other employees that run my Instagram, or an assistant going through my email and all those other types of things, it just wasn't pragmatic anymore for me to keep doing that, and I had to give that up.
01:53:05.000Does your app, Spartan Forge, run on Graphene?
01:53:45.000So, yeah, I don't know why they do it.
01:53:47.000It might be people can, well, the Android open source project exists.
01:53:52.000So, it would stand to reason that you would want a way for someone, because what you want is people interacting with that code and red teaming it and making the code better and then offering bug bounties so that you can tell Android, like, hey, you've got a critical flaw in your system architecture here, and then they'll pay you 20 grand for that.
01:54:24.000And look, Eric's a wonderful guy, and the principles that he used for the first instantiation of that phone are the correct principles, which is we need to get, if you want, if you're security focused at all, you should get away from these big, large conglomerates because none of your data is private.
01:54:45.000An incorrect principle, and I'm going to get shit about this, but I told you in the beginning I care about the truth and I do care about the truth, is that when you're using a PKI subsystem that relies on Microsoft, then you're not in control of the PKI certificate signing, and Microsoft could cause a bunch of problems, and they were using that.
01:55:05.000So, the other thing being, if you're building on the Android open source project, that means the code that you're using as the engine, let's just call it that of your phone, is examinable by the public.
01:55:16.000So, you're relying on Android to publish these updates to the phone, and you're relying on those things to be as good as possible.
01:55:25.000Now, you might harden it some more, but as long as the code is out there, it can always be mucked with.
01:55:30.000As long as people have to interact with the device and type, and you have to see what you're typing, a phone's going to be, it's going to have Swiss cheese.
01:55:38.000So, when people say something is unhackable, as you said, that's just not true.
01:55:50.000Like I said, great guy, done lots of great things for the country.
01:55:53.000And it's just if they had just said something along the lines of it's hackable as any phone is hackable, because by virtue of you having to interact with it, it's hackable.
01:56:03.000It's just, like, if I came up with an app that had, you know, look at the TikTok terms of service on the first TikTok.
01:56:11.000With those terms of services, I will own your phone.
01:56:14.000And I'm not saying you can install TikTok on his phone, but what I'm saying is by virtue that you have to interact with the phone and see what you're doing and type passwords, and you've got those kinds of terms of service, I could easily put a key logger in that, and now I know your signal password or your signal pin.
01:56:29.000Or, you know, I get you, you know, you're going to China, so I stop you in secondary.
01:56:34.000And while you're in secondary, I've got a CCTV on you, and you unlock your phone.
01:57:04.000Just don't call it totally unhackable.
01:57:06.000Because a guy like me, I don't need but a week or two to tell you on this current build, like here, here's the hole in this Swiss cheese.
01:57:14.000Now, is it far better than having a Google phone with standard firmware and standard OS or an Apple phone?
01:57:23.000I don't know about Apple because, again, you asked me about Apple and I said, I don't know Apple.
01:57:27.000I don't know what's happening at the top of that company, but I know that they like to monetize people, and that's pervasive in my mind.
01:57:34.000And using data that people don't know is getting used, even though it's in a 40-page terms of services document, is pervasive.
01:57:40.000So I just don't know at that highest level of analysis.
01:57:43.000And that's why I said to answer your question about the safest phone, I would ask you what you're using it for, who you are, and what are you doing in the world is the best way to answer that question.
01:57:53.000So, me, like, what would you recommend I use?
01:57:56.000I mean, I wouldn't want to, I mean, okay, I'll tell you generally what I would say because you might ask me that question one day because we go back and forth about a lot of tech.
01:58:05.000I know specifically what I would recommend for you to do, and I'd even tell you to hire someone else to do it and not me, because just that, the checks and balances, is what I would want.
01:58:15.000But for you, I would say you should take something like a Raspberry Pi and you should run WireGuard on your phone, and you should route all of your internet traffic through something like a home terminal at your house through a Raspberry Pi using something like WireGuard, which is a VPN that I use that's very good.
01:58:35.000And everything should be routed through that.
01:58:39.000And if you trust Apple, continue using Apple.
01:58:43.000If you don't trust Apple, then use Android.
01:58:46.000And you could use a Pixel and do graphene, and you could use Signal on there and those other things.
01:58:52.000And you're going to be relatively safe.
01:58:54.000But again, if I'm a nation-state actor, I can create circumstances where I'm going to get access to your shit and I'm going to lock you down.
01:59:02.000And some of them are more expensive than other methods to do it.
01:59:06.000But I'm a pragmatist and you can always come up with a method to get a hold of somebody's shit.
01:59:09.000You can always create the circumstances, especially if you're a nation-state actor to get a hold of somebody's stuff.
01:59:15.000That would be the very high level of things that I would recommend to you just out the gate.
01:59:24.000Yeah, it's very concerning because it seems like these things keep getting stronger and more capable.
01:59:55.000You know, all of these things we think are added for layers of protection.
01:59:58.000For instance, you used to get that pop-up on your phone where it said, you know, there'd be like blocks of pictures and it would say, click all of the pictures with a traffic light in it.
02:00:10.000I was just going to say that, a traffic light in it.
02:00:20.000You think you're getting security out of it, but you're a product at that point because you're helping to educate a neural network on what traffic lights look like and how they can look and all those different instantiations of traffic lights.
02:00:31.000So, and again, like we have to separate causality and intention and outcomes in that the companies might do this because they want to create the greatest AI ever.
02:00:42.000But when you're issuing someone a 40-page terms of service document on everything they can do with your thing that you paid $2,000 for, it's just, you know, we need more ethical people.
02:00:53.000At least what Eric Prince was trying to do was right, which was we need to off-ramp from some of these big things because the way that this government is going, I'm very worried about the rights of the individual now and going forward because we have an uneducated class of people for all of the reasons in the world.
02:01:13.000Like if you want to just focus on your family and you're not thinking about these things, I don't hate that for you.
02:01:17.000But the idea of individual autonomy and rights has been so shit on in recent years, and as we get more uneducated, we rely on these models more. Large language models are great, but they're not a foundation of learning.
02:01:32.000In other words, we have a lot of people with access to information but no wisdom.
02:01:37.000It's like when your parents would say, learn how to do addition and subtraction on paper before you use a calculator.
02:01:43.000Like, understand how to do research and cite sources and understand, you know, how to conduct really good analysis before you just use a neural network for everything.
02:01:53.000Because as we lose focus of our civics and what our founders are trying to do and the uniqueness of it, which is truly unique, which is, you know, when I joined the Army, I joined the Army to get out of North Dakota.
02:02:04.000When I re-enlisted in the Army, it's because I believed in the experiment.
02:02:09.000But the foundation of the experiment is good, but we've eroded it in so many ways over the years and given up so many individual rights in the name of security.
02:02:20.000And I'm sure it's been said on here before, but Franklin said, anybody who gives up their individual rights,
02:02:28.000their freedoms, in the name of security, deserves neither.
02:02:31.000And it's some of the ways that they've done it have been really above the surface.
02:02:35.000And it frankly blows my mind that we let the government get away with some of these things that we let them get away with, where you even explain it to people and they're like, I don't see it.
02:02:45.000Like, I don't see how that was a big deal.
02:02:47.000And I'm like, it was a total recalibration of the system that allowed the Democratic Party and the Republican Party to usurp your rights in a way that if you knew any better, you'd probably be protesting.
02:03:00.000Like some of the ways that they've done this, you know, we can go with the easy stuff like the Patriot Act, right?
02:03:06.000In the name of security, we're going to start collecting on Americans.
02:03:09.000You know, and the Biden and Obama administration, I will say this at risk of, you know, getting in trouble because I used to have a clearance.
02:03:19.000They had a massive vacuum cleaner and they knew what it was vacuuming up.
02:03:23.000And they kept vacuuming it up anyway in the name of security.
02:03:26.000I'm not saying they were going after American citizens, but they certainly knew they were.
02:03:31.000And they just vacuumed shit up and collected it and stored it in a database.
02:03:37.000In case at some point we needed to, you know, come up with a narrative or get rid of somebody who's inconvenient or whatever else that just flies in the face of individual American rights and American autonomy and is really, in my mind, the anti-pattern to freedom.
02:03:56.000I mean, I'll give you one that people always crap on me whenever I talk to them about it, but there's two that really bother me.
02:04:01.000One of them being like the 17th Amendment.
02:04:03.000Do you know the 17th Amendment to the Constitution?
02:04:06.000So the 17th, so when the founders, when you read the Federalist papers and the Federalist papers, I really love reading the Federalist papers.
02:04:13.000I love reading how they informed the Constitution, the Bill of Rights, the Declaration even.
02:04:19.000John Jay and James Madison wrote these documents explaining the framework.
02:04:23.000And the 17th Amendment changed how the Senate, right,
02:04:27.000the 100 people there that are supposed to be representing us, was originally constructed, which was: a state would have a legislature, and the state legislature and the governor would appoint the senator.
02:04:37.000The reason that the founders did that was because the state governments had to give power to the federal government to exist.
02:04:45.000Back with the Articles of Confederation.
02:04:57.000Back before there was a strong centralized American government, we had problems with money, we had problems with interstate commerce and those types of things.
02:05:04.000And those articles eventually turned into what is the Constitution.
02:05:07.000But the states had to grant that power.
02:05:09.000And the signers of the Declaration of Independence and the Constitution knew that the states needed to be those small projects that we talked about before where if California wanted to go nuts, let them go nuts.
02:05:20.000But it shouldn't impact what's happening in Texas.
02:05:22.000It shouldn't impact what's happening over in New England.
02:05:24.000It shouldn't impact what's happening in the Midwest.
02:05:26.000But if that goes nuts and it fails, it needs to fail.
02:05:30.000So the state senators, I'm sorry, the state legislatures would come together and they would vote for a senator.
02:05:37.000And that senator's job was to go to the federal government and protect the rights of the state.
02:05:43.000Not to protect the rights of individuals per se, and certainly not to embolden the federal government.
02:05:49.000But with the 17th Amendment, what happened was the House of Representatives' function was to be the petulant children of government.
02:05:58.000So their job was to come up with crazy ideas, crazy laws, all of those things.
02:06:02.000The more liberal version of government jurisprudence would be the House of Representatives, your crazy ideas.
02:06:08.000And then you had state senators who were supposed to be between the House and the President who would say, well, here's a good idea, but the rest of this is retarded, AOC.
02:06:24.000And that's because it would erode the state's rights and the state's constitution and what made this state great.
02:06:29.000Because what the legislatures would do is say, hey, Joe Rogan, you've made a lot of money and you've got a big podcast and a big voice and you've learned some lessons around the way.
02:06:38.000And you were able to do that in Texas.
02:06:39.000And you decided to come to Texas because we had all of these things that California didn't have.
02:06:44.000We need you to go to the Senate for three years or six years or seven years, whatever it was back then, and represent those same principles.
02:06:52.000So when Obamacare comes through, you can say, not only no, but fuck no.
02:06:59.000But what the 17th Amendment did was make the Senate redundant with the House of Representatives, which was, in the founders' eyes, the only popular-vote part of the American government.
02:07:14.000And then you had, you know, the way the president gets elected through electors, but you had the state senate, which was appointed by the states.
02:07:20.000So the legislatures, and I'll use North Dakota where I'm from, you'll have one big city, two big cities, Fargo and Grand Forks, North Dakota.
02:07:32.000Crazy thought exists, hyper crazy ideas, but some of them are useful.
02:07:37.000The rest of the state's agriculture, right?
02:07:39.000So all of those legislators from all those counties, those legislative districts would get together and say, we're going to put Bill Thompson, that would never happen, but in charge of, he's going to be at the Senate representing North Dakota.
02:07:51.000But he has to represent the whole state.
02:07:54.000In other words, you can't do things that will help Grand Forks or Fargo because that's where the universities are.
02:07:59.000That's where all the crazy politics are.
02:08:01.000You also need to be thinking about the guys out in the western counties, LaMoure County in North Dakota, or way out west.
02:08:11.000That's what the 17th Amendment, under Woodrow Wilson, did: they really usurped the Constitution and made the Senate a redundant House of Representatives by using the popular vote.
02:08:25.000But if you want the popular vote in North Dakota, 85% of the population is in Fargo and Grand Forks.
02:08:30.000So now you've got, if I want to run for Senate in North Dakota, I'm just going to spend all of my time in Fargo and Grand Forks.
02:08:36.000Because if I can repeat back to those people all the ideas that they want to hear, I'm going to win that vote and I don't have to represent those people out in the rest of the state in anything.
02:08:45.000So they created a redundant House of Representatives.
02:08:48.000But another reason why it happened was they wanted popular vote because there is no amount of money that you could stick into a legislature out in the western part of North Dakota.
02:09:03.000So this guy will do whatever we tell him to do.
02:09:06.000And it has nothing to do with the state or representing the state's rights or the rest of those legislative districts.
02:09:11.000We're going to pick this senator and he's getting $300 million for his election bid.
02:09:15.000And this other guy, who's a slower-moving constitutional conservative, who might be a free market absolutist and a classical liberal, he's not being funded.
02:09:27.000But under the state architecture, you might have been a better representation of the state.
02:09:32.000And that's why the legislators had to vote for you to put you in as a senator.
02:09:38.000But now, all that someone who wants to be a senator needs to do is go to the Republican National Committee or the Democrat National Committee and say, I'll do all the things you tell me to do, fund my campaign, and I'm going to go stump in Fargo and Grand Forks, North Dakota, and the hell with the rest of the state.
02:09:56.000It's a very important sleight of hand.
02:09:57.000And when that happened, you made a redundant House of Representatives, and the state no longer was protected at the federal level.
02:10:05.000And what happened was all of the power from all of these states and these legislatures and these individuals got sucked up into the federal government.
02:10:13.000And then after that, you see all of these things that would never have been passed by a state getting passed, things like Obamacare, things like the Patriot Act, certain war resolutions, all kinds of things where it just further erodes the power of the state.
02:10:27.000And federal government wants that because it puts all of the power up in the federal government.
02:10:31.000And people always say we need to get money out of politics.
02:10:34.000No, we need to get power out of politics.
02:10:36.000That power that they've taken over the last 130 years or so used to exist at the state and local levels because they wanted these thought experiments happening where we could pluck the best things out of them and forget the rest.
02:10:49.000But all of that power has now gone up to the federal government and the federal government won't ever release that power and they only want more budget and more spending to execute that power.
02:10:59.000And that's also because the interest groups don't want to have to go and convince a whole state that something the people are going to vote on is good.
02:11:07.000They just want to go take a lobby and go up to the federal government because they want all of the power up there as well.
02:11:12.000And the federal government wants all the power up there as well, because they make $300,000 a year before they become a politician and they're worth $30 million when they're done being a politician, because all of the money has to go to the federal government, since they're in charge of which light bulbs we can use, which computers we can use, which flush toilets we can have, how our roads are going to look, and what our medical care looks like.
02:11:33.000None of those powers are explicitly written in the Constitution of the United States, and they use things like the Commerce Clause and other things in order to create things like Obamacare, where really we want competing states.
02:11:44.000If Texas comes up with a great way to do health care and North Dakota's isn't so great, they can look at that experiment and they can adopt the principles and they can have it at that level.
02:11:54.000But it's much easier to get change at the local level when the power is derived from the state and the individual, because if I want to change the way that my state does health care, I have three options.
02:12:05.000I can run for office, I can support someone who is going to go into office and do what I want, or I can move.
02:12:10.000But when everything's centralized at the federal government and everything flows from the federal government, all of the money, power, and gravity is up there.
02:12:17.000And the individual, the 300 million of us or so, have really no power now to exercise either state's rights or individual rights at the higher level.
02:12:26.000I hope I'm elucidating this correctly, but it's a real usurpation of individual and state autonomy that got rid of state power, which, if you read the Federalist Papers, was so important to the founders, because the state was where they wanted these thought experiments to happen.
02:12:45.000You read Thomas Hobbes' Leviathan, or John Locke, or Montesquieu.
02:12:50.000All of them talked about this great experiment that was being set up, how it was built on all this Western politics and everything that came before it, how we could have a government that was forced to respect the rights of individuals, that allowed for these competing think tanks of ideas, and where the power would never rest with the federal government.
02:13:06.000But the 17th Amendment was how a lot of that power went from the state level and the state legislatures up to the federal government.
02:13:13.000And now to become the president, they want to do a popular vote.
02:13:17.000And under a popular vote, you would just have to campaign in New York and L.A. You would get the popular vote out of the likely voters.
02:13:25.000And the rest of the country is not represented.
02:13:27.000That would be another sleight of hand, and yet you hear all these people saying we need a popular vote.
02:14:02.000And I could go on for 15 more things about that.
02:14:04.000I won't do it for the sake of your listeners, because I doubt this is what they wanted to hear.
02:14:08.000But similar things happened with the Supreme Court in Marbury v. Madison and allowing the Supreme Court to have judicial review.
02:14:15.000That was never a thing that was in the Constitution.
02:14:17.000And if the Supreme Court has the power to declare everything either constitutional or unconstitutional, then you're not ruled by a democracy.
02:14:33.000And that all started back in Marbury v. Madison, with Thomas Jefferson and these writs of mandamus, where, long story short, the Supreme Court essentially granted itself the power to conduct judicial review.
02:14:49.000The system that was ratified and that the founders approved was if a law was deemed unconstitutional, it would go before the Supreme Court and they just would rule in favor of the person.
02:15:00.000And then eventually the government would figure out, oh, this law doesn't work.
02:15:03.000But it was never on the Supreme Court to say constitutional, unconstitutional.
02:15:07.000You would get arrested for some law, and it would get appealed to the Supreme Court.
02:15:11.000The Supreme Court would say, we're not punishing this person.
02:15:15.000But the government would have to keep arresting people.
02:15:18.000Each case would have to keep going in front of the federal courts.
02:15:20.000So what I'm saying is, and I'm sorry to go off on this, we can go back to tech.
02:15:23.000But all I'm saying is the core of the American experiment in individual rights and what makes this country so great and why I was willing to die for it after my initial enlistment.
02:15:34.000And why I have such love for this is because it was the only experiment where the value of the individual was held at the top of the hierarchy and that people could truly be allowed to flourish.
02:15:43.000And in 250 years, we did more than any society could have hoped to have achieved in tens of thousands of years.
02:15:49.000Not that it's been around that long, but in thousands of years.
02:15:52.000Everything tends towards disorder, and power always gets centralized.
02:15:58.000And we had a framework to prevent that, but we were willing participants in our own demise.
02:16:03.000And now we're scratching our heads and wondering why there's no individual autonomy, why a guy can't smoke weed on the weekend or do X, Y, or Z, because we have centralized the authority and the power and the decision-making structure.
02:16:16.000There would be no problem with money in politics if the federal government had only the powers that were outlined for it in the Constitution.
02:17:46.000There's no sense of knowing there. I've read a lot of Penrose, on his Orch OR, if people want to read about that, I won't fully explain it.
02:17:57.000Orchestrated objective reduction, how the mind works, and these fleeting moments of consciousness that we have, these shimmers of consciousness, based around what he describes in the microtubules.
02:18:08.000We get conscious thought and that conscious thought we project into things.
02:18:12.000AI is very good consciousness projection, but it will never have consciousness or knowing because it has no system of values.
02:18:19.000And if we were to instill values in it, it would still be consciousness projection.
02:18:25.000My dad died when I was five, but I bought his cabin back and was working on it.
02:18:27.000And inside of his cabin, I got to learn a lot about my father by working on the cabin that he built.
02:18:34.000The way he would measure things or cut things, the writing on walls and that type of stuff.
02:18:37.000That's all consciousness projection that allowed me to get to know him in a way.
02:18:41.000I might not even have known him if he were alive, but I got to re-experience and understand my father and his thoroughness through that cabin.
02:18:50.000It's getting very good, but you could get the same thing out of a calculator that you get out of a neural network if you had sufficient time.
02:18:58.000I could present you a question just like you did on Perplexity.
02:19:01.000I could sit here with a rule book and I could type in a calculator.
02:19:06.000It might take me a million years, but I could do it and I could give you the same answer that a neural network would give you.
02:19:11.000That doesn't mean consciousness or knowing or AGI is present.
02:19:30.000And to me, it's just really fancy, clever math.
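The calculator point can be made concrete with a toy sketch: a neural network's forward pass is nothing but multiply-and-add plus a squashing function, arithmetic you could in principle punch into a calculator by hand, given enough time. The tiny network and weights below are made up purely for illustration.

```python
import math

# Hypothetical 2-input, 2-hidden-unit, 1-output network with fixed, made-up weights.
w1 = [[0.5, -0.2], [0.1, 0.4]]   # input -> hidden weights
b1 = [0.1, -0.1]                 # hidden biases
w2 = [0.3, 0.7]                  # hidden -> output weights
b2 = 0.05                        # output bias

def forward(x):
    # Hidden layer: a weighted sum per unit, then a sigmoid "squash".
    # Every step here is ordinary arithmetic: multiply, add, exponentiate.
    h = [1 / (1 + math.exp(-(x[0] * w1[0][i] + x[1] * w1[1][i] + b1[i])))
         for i in range(2)]
    # Output layer: one more weighted sum.
    return h[0] * w2[0] + h[1] * w2[1] + b2

print(forward([1.0, 2.0]))  # ~ 0.6927
```

Working through the same sums with pencil, paper, and a pocket calculator yields the identical answer, just far more slowly, which is the speaker's point: the computation is deterministic arithmetic, whatever else one thinks it implies.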
02:19:34.000And having trained these networks for a dozen years now and worked with them, they're just really clever consciousness projection.
02:19:43.000And so, yeah, that is four hours and we can do that next time.