The Joe Rogan Experience - March 25, 2026


Joe Rogan Experience #2473 - Bill Thompson


Episode Stats

Length

2 hours and 20 minutes

Words per Minute

197.99

Word Count

27,890

Sentence Count

1,908

Misogynist Sentences

27

Hate Speech Sentences

31
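The word count and words-per-minute figures above pin down the episode's exact running time, since WPM is just word count divided by duration in minutes; the "2 hours and 20 minutes" is a rounded value. A minimal arithmetic sketch (using the stated WPM rounded to two decimals):

```python
# Back out the implied duration from the stats above:
# duration (minutes) = word count / words-per-minute.
word_count = 27_890
wpm = 197.99  # stated WPM, rounded to two decimals

implied_minutes = word_count / wpm          # ~140.87 minutes
hours, rem = divmod(implied_minutes * 60, 3600)
minutes, seconds = divmod(rem, 60)

print(f"Implied duration: {int(hours)}h {int(minutes)}m {int(seconds)}s")
# -> Implied duration: 2h 20m 51s (the listed "2 hours and 20 minutes", rounded)
```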


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcripts from "The Joe Rogan Experience" are sourced from the Knowledge Fight Interactive Search Tool.
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:01.000 Joe Rogan podcast, check it out!
00:00:03.000 The Joe Rogan experience.
00:00:06.000 Train by day, Joe Rogan podcast by night, all day!
00:00:12.000 What's up, Bill?
00:00:13.000 How you doing, Joe?
00:00:15.000 This might be one of the coolest things anybody's ever given me.
00:00:18.000 So you gave me this knife.
00:00:21.000 Explain all this.
00:00:22.000 All right, so, I mean, there's a larger explanatory reason behind this.
00:00:26.000 My brother and I grew up, my father died when I was five.
00:00:30.000 My brother and I grew up doing these things called rendezvous.
00:00:34.000 Have you ever heard of them?
00:00:36.000 In what way?
00:00:36.000 What is a rendezvous?
00:00:37.000 So there you go.
00:00:38.000 So what a rendezvous is, is it's not, you know, you go to those like, I don't even know what they're called, but people do like reenactments.
00:00:46.000 Oh, okay.
00:00:46.000 Like Civil War reenactments?
00:00:48.000 It's not like that.
00:00:49.000 So that's probably the closest approximation to what it is.
00:00:53.000 You get invited to them, or these days they're easier to get to, but my stepfather, the guy my mother remarried, brought us to them.
00:01:00.000 All you do is camp, but you're only allowed to camp, and no one comes to the camp, or sometimes they might have people at the end.
00:01:05.000 But while you're in the camp, everything in the camp has to be 1840 or prior.
00:01:09.000 So there can be no modern appurtenances, nothing like a refrigerator or nothing like that.
00:01:14.000 1840?
00:01:15.000 Why?
00:01:15.000 1840.
00:01:16.000 At the end of the fur trapping.
00:01:18.000 That was considered like Jeremiah Johnson's time, like peak fur trapping.
00:01:22.000 So there's people, you know, they dress like either revolutionary, like American revolutionaries, or they dress like mountain men or they dress like Indians.
00:01:30.000 How'd you guys dress?
00:01:31.000 Mountain men.
00:01:32.000 So while we're there, you learn all kinds of stuff while you're reenacting.
00:01:36.000 Like I learned how to brain tan hides.
00:01:39.000 I learned how to traditionally art or do traditional archery.
00:01:43.000 Stuff like that.
00:01:43.000 So anyway, this knife was a knife I had actually started working on with my brother a while ago.
00:01:48.000 I do more of like the brain tanning tomahawk thing.
00:01:51.000 And when you're saying brain tanning, you talk about using brains to tan animal hides.
00:01:55.000 Right, using animal brains.
00:01:57.000 What do brains do?
00:01:58.000 Why do brains do it?
00:01:59.000 It softens the leather in a natural way.
00:02:01.000 And what's cool about it is every animal, no matter what animal you kill, has the exact amount of brain needed in order to tan the hide.
00:02:08.000 So you don't need any additional, like people use egg yolks or mayonnaise or something like that.
00:02:13.000 All you do is you take the brain out of the cavity, you grind it up, you mix it into some water, and then after you've cleaned the leather and you've scraped it clean, you stretch it.
00:02:23.000 I usually use like a dull shovel.
00:02:25.000 You stretch it over the dull shovel and then you soak it in the brain water mixture.
00:02:30.000 And then you just keep repeating that pattern and the leather gets like a really nice soft feel to it.
00:02:37.000 What is it about the brain?
00:02:38.000 Is it the fat?
00:02:39.000 It breaks down the leather.
00:02:41.000 I'm not sure if it's the fat or not.
00:02:42.000 I haven't gotten that deep into it, but it breaks down the leather and just makes it feel really soft, really nice.
00:02:47.000 So anyway, this knife here, I started, I killed that bear.
00:02:51.000 So the jaw is made out of two bear jaws, or out of one bear jaw split in half.
00:02:57.000 So that was a bear I killed in Canada in 2017.
00:03:01.000 It was my biggest black bear.
00:03:04.000 And so we split the jaw, put that together.
00:03:08.000 It's Irish linen threading.
00:03:09.000 Then that's a knife that my brother had picked up that was from 1860.
00:03:13.000 It was totally rusted.
00:03:14.000 We had to grind it back or he had to grind it back down.
00:03:17.000 And then the sheath is traditional. The cool thing about doing rendezvous, and the cool thing about this, is you could have a DeLorean and drop that in 1840 and somebody would pick it up and think it was made yesterday.
00:03:31.000 And so everything on there has been done traditionally from the quilling on the bead work is made from porcupine quills.
00:03:38.000 The backing is buffalo brain tan.
00:03:42.000 And then the front is beaver hide or a beaver tail.
00:03:45.000 I'm sorry.
00:03:47.000 And then the sides are horse and turkey hair hanging off of it.
00:03:51.000 And these are bear teeth?
00:03:53.000 And those are bear teeth, yep.
00:03:55.000 From the same bear.
00:03:56.000 So when I was thinking about what I was, because I wanted to give you something for inviting me on because it's still a shock to me that you did it.
00:04:02.000 Even though we've been talking for so long, I just never imagined a scenario where you'd want to have me on here.
00:04:07.000 Well, you're an interesting dude.
00:04:08.000 I thought, what could I give this guy that, you know, money or people or whatever couldn't get you?
00:04:14.000 And so I thought this is the right thing to do.
00:04:16.000 So it went from a me project to a you project.
00:04:18.000 And my brother Aaron helped me out with it tremendously.
00:04:23.000 So how'd you find this knife from the 1860s?
00:04:25.000 Well, he found it.
00:04:26.000 My brother is even more esoteric and odd than I am, believe it or not.
00:04:31.000 And he collects this kind of stuff.
00:04:34.000 I mean, the guy who dated it said 1860 to 1890 is what they figured.
00:04:40.000 And you can tell by the pitting around the hilt and the way that it was made that it fits that era.
00:04:49.000 I mean, somebody could have redone it in 1900, but it's definitely that old, judging by the type of steel, the way that it was worked, and the way that it is around the hilt at the bottom there.
00:04:59.000 Wow.
00:05:00.000 And so it's at least, you know, 130, 140, but most likely 160, 170 years.
00:05:07.000 It actually fits my hand perfect.
00:05:08.000 Yeah, so that's also something my brother and I talked about, about how long it was going to be.
00:05:13.000 And we made some educated guesses and put it all together.
00:05:16.000 So yeah, I mean, like I said, not something you can just go pick up somewhere or something that will, you know, hopefully mean something.
00:05:22.000 Not saying it's practical.
00:05:24.000 Like it's not something you'd be gutting an elk out with.
00:05:28.000 Well, if we get attacked by zombies in the studio, it's a good thing to have on the desk.
00:05:32.000 Yeah, I mean, if you're going to make a last stand, you know, that's a pretty good, that's a pretty good knife to make your last stand with.
00:05:38.000 It's a good way to go out.
00:05:39.000 Yeah, exactly.
00:05:40.000 That's awesome, man.
00:05:41.000 Yeah, so the rendezvous, we did those from...
00:05:44.000 How long did they last?
00:05:46.000 Uh...
00:05:46.000 They vary from a week and then some go up to three weeks.
00:05:49.000 And what do you do for food while you're out there?
00:05:52.000 So inside of your lodge.
00:05:53.000 So there's two types of rendezvous.
00:05:55.000 At most rendezvous inside of your lodge, you can have a cooler as long as it doesn't leave the lodge.
00:06:01.000 So I have like a 20-foot teepee that I take to these things.
00:06:04.000 And inside of my teepee, you can have a cooler and some modern appurtenances.
00:06:10.000 Did they have any kind of coolers in the 1800s?
00:06:12.000 I mean, they had ice boxes and like steel ice boxes and that type of thing, but nothing like we have today.
00:06:19.000 You know, stuff was getting dug out, buried in the ground, or put into the ground, like cool areas of the ground or dig outs.
00:06:24.000 And they dried everything.
00:06:26.000 So pemmican would have been the, you know, everyday thing to eat.
00:06:30.000 That's just dried.
00:06:31.000 So did you bring your own food or did you have to hunt for food?
00:06:34.000 So you bring your own food, but there are other rendezvous that are kind of invite only.
00:06:38.000 And I don't even think a lot of people who do rendezvous know about these, but there's ones that I think they're called, I think I might be speaking out of school.
00:06:44.000 Somebody might send me an email after this, but I'm going to talk about it anyway because I never got read the Riot Act.
00:06:49.000 They're called juried.
00:06:50.000 I think they called them juried southerns.
00:06:51.000 And I've only been to one of those.
00:06:53.000 And that's where everything in the camp has to be pre-1840.
00:06:56.000 And you meet down in a parking lot, you put everything on the back of a mule.
00:07:00.000 When I did mine, it was up in the, I think it was the Bighorns.
00:07:04.000 So, you know, you talk to a rancher, get everything packed up.
00:07:08.000 You go into the back of the Bighorns, and everything in camp has to be pre-1840, as close as it can get.
00:07:14.000 They'll even look at your stitching and say, oh, that was sewn with a sewing machine.
00:07:18.000 You got to take that off.
00:07:19.000 And it's always these weird, like, eccentric history teachers that run them, like guys who, you know, teach history at Berkeley or something like that or other places.
00:07:29.000 They just really enjoy living like this.
00:07:30.000 And at those ones, if they're in season, you can hunt whatever's in season.
00:07:34.000 You're hunting with traditional archery.
00:07:35.000 And it's really good for kids.
00:07:37.000 Like, the internet wasn't a problem as much when I was a kid.
00:07:40.000 I was certainly into computers.
00:07:41.000 I have been since I was a child.
00:07:43.000 But you could just detach.
00:07:45.000 Everyone's running around crazy.
00:07:47.000 You're sitting around the campfire at night.
00:07:51.000 People are singing songs with the guitar.
00:07:51.000 You're learning how to do things like this.
00:07:53.000 You're learning how to brain tan.
00:07:54.000 You're learning how to live traditionally.
00:07:56.000 And it's an eccentric cult, kind of.
00:07:59.000 It's not a cult.
00:08:00.000 It's an eccentric group of people.
00:08:01.000 It's a lot of fun.
00:08:02.000 It's a real community.
00:08:04.000 People take it very seriously.
00:08:07.000 There's more advertising surrounding it now than there used to be because numbers are kind of dwindling.
00:08:12.000 But I did my last one last year with my brother.
00:08:14.000 So if you go on my Instagram, there's a picture of my brother, my son, and I doing, I think, our second rendezvous together.
00:08:20.000 And we're just dressed like, you know, I've actually got an awesome war shirt.
00:08:24.000 I can show you the picture.
00:08:25.000 I've got an awesome war shirt that a friend of mine went to war with.
00:08:30.000 He was half Native American.
00:08:31.000 His grandfather was Ojibway or something, Chippewa, something like that.
00:08:37.000 And he was, I don't remember what his role was.
00:08:40.000 But anyway, we deployed to Iraq together and his grandpa made me this war shirt.
00:08:45.000 Oh, there you found it.
00:08:47.000 Jamie, he pulled it up.
00:08:48.000 That's my lodge.
00:08:51.000 How much do you enjoy a shower after you get out of here?
00:08:55.000 I mean, as long as you keep, you know, they have showers in camp.
00:09:00.000 They've got a showering area, a showering area where it's just like pallets.
00:09:03.000 That's the inside of my lodge.
00:09:05.000 So there's a cooler at this one.
00:09:06.000 This is not a jury rendezvous.
00:09:10.000 And so you can shower while you're in.
00:09:12.000 Some of them, they call them hooters.
00:09:14.000 They'll be like a latrine in a shower area in camp.
00:09:16.000 But also, like, some of them I don't, I don't do it at all.
00:09:20.000 This is wild.
00:09:21.000 And so there's no reenactment.
00:09:22.000 Like, there's not like civilians walking around.
00:09:24.000 It's not like Renaissance fairs.
00:09:25.000 Yeah, exactly.
00:09:26.000 It's just more like I want to act like it's 1840 for a couple of weeks and not look at my phone one time and not worry about the news.
00:09:33.000 It's amazing after a week here, you really forget about the world and you like don't even know you're supposed to be stressed out about things.
00:09:40.000 You're just out there doing your thing for a couple of weeks.
00:09:43.000 And you just cook over open fire.
00:09:45.000 Everything gets done traditionally that way.
00:09:47.000 And did you bring your own meat in there?
00:09:48.000 Yeah, you bring your own meat and stuff in the cooler.
00:09:51.000 And then there's also cooking classes where they teach you like all the recipes to do with like a Dutch oven, like an old cast iron oven.
00:09:59.000 And they do gambling at nights.
00:10:02.000 So you'll walk into like a huge, they call them marquees, but it's like a huge 100-foot square lodge.
00:10:07.000 There'll be three gambling tables in there, girls in like the low-cut shirts and dealing cards and smoking cigars and just having an amazing time.
00:10:14.000 And there are people, you go by camp names while you're in there.
00:10:17.000 Nobody uses their real name.
00:10:18.000 Well, some people use their real name.
00:10:20.000 I'd say 60% of people don't use their real names.
00:10:23.000 What was your camp name?
00:10:25.000 This is embarrassing.
00:10:27.000 It should be.
00:10:28.000 Yeah.
00:10:28.000 So I got my camp name.
00:10:30.000 I got christened with my camp name in the Bighorns when I was 14 or 13.
00:10:37.000 And it was talks a lot.
00:10:38.000 Talks a lot.
00:10:39.000 And in Sioux, it was pronounced Iaota.
00:10:39.000 Yeah.
00:10:42.000 Just because you talk a lot?
00:10:43.000 When I was a kid, I talked a lot.
00:10:45.000 Actually, as an adult, I don't talk that much unless I know you.
00:10:50.000 But as a kid, I would never shut up.
00:10:51.000 I had really bad ADHD.
00:10:53.000 They kind of diagnosed me with having some low-level version of Asperger's.
00:10:57.000 And I was a rapscallion in class, just never showed up, never listened, never did anything.
00:11:05.000 Those are the people that are the most fun.
00:11:06.000 Well, they didn't enjoy me in high school or in college.
00:11:09.000 That probably would have been your friend.
00:11:11.000 But yeah, they called me Iaota.
00:11:15.000 And we got christened.
00:11:17.000 And it's one of the things we're kind of missing in culture today, something that I'm trying to reinvigorate, especially with my son and with other young men that I run into: coming-of-age rites.
00:11:29.000 Yeah.
00:11:29.000 Something to say, you're a man, and I'm going to start treating you like a man from this moment forward.
00:11:34.000 Like, you know, what does that, there should be structure to that.
00:11:37.000 You know, we're tribal and it's important to me.
00:11:41.000 So I think that is really something that's missing from society.
00:11:45.000 I think that I used to think it was silly when I was young.
00:11:48.000 And then as I got older, I realized, well, I went through that.
00:11:51.000 I became a black belt and I started fighting.
00:11:54.000 And you had a group of men telling you, you're at this level, we're going to treat you like that.
00:11:57.000 And if you fall from grace, we're going to remind you right away.
00:12:01.000 And we just don't do that with young men.
00:12:03.000 And we have a society now where young men act like young men until they're 45 or 50 or 60.
00:12:08.000 And sometimes never stop.
00:12:09.000 Yeah.
00:12:10.000 And, you know, women, nature imposes itself on women.
00:12:14.000 They become fertile.
00:12:15.000 They're able to have babies.
00:12:17.000 And they've got to seek security or find a husband or a really good job that will supplement whatever a husband would provide.
00:12:24.000 And they got to start acting like a woman.
00:12:26.000 Whereas men can sit in a basement and it becomes very dangerous.
00:12:30.000 Especially men that never have children.
00:12:31.000 Yeah.
00:12:32.000 And they're perpetual children.
00:12:34.000 And if you don't impose nature on yourself by undergoing those types of rites and understanding what it means to become a man, nature will impose itself on you: either A, you never have children and therefore your line is dead forever, or B, it will kill you because you're fat in your mom's basement and you get diabetes and your foot chopped off at 35.
00:12:34.000 Yeah.
00:12:51.000 And, you know, we just don't tell men.
00:12:54.000 We don't have that. The military did it for me.
00:12:56.000 I had really put off responsibility or seeking meaning or any of those things until I was in the military.
00:13:04.000 And like I said, my father died when I was five.
00:13:06.000 So I really had no central male authority until I was about 13 or 14 when I met this guy, Steve.
00:13:13.000 And he kind of initiated some of those rites for me and held me to account.
00:13:18.000 But it was really the military, which was a turning point for me where there was a standard and I was expected to hold it.
00:13:26.000 I think there's a reason why most ancient cultures, a lot of ancient religions, have these rites of passage where you are now officially a man.
00:13:35.000 Officially, you know, you're responsible.
00:13:35.000 Yeah.
00:13:39.000 You have to think of yourself as a different thing now.
00:13:41.000 Whereas if you leave it up to your own decision, men sort of dwindle into this perpetual state of childhood.
00:13:49.000 Yep.
00:13:50.000 And it's not about you anymore.
00:13:51.000 It's about other people.
00:13:52.000 Like that, for me, having children, I've got four kids.
00:13:57.000 Really, you know, the military was kind of the first inkling of responsibility.
00:14:01.000 But then having children and realizing this isn't about me at all.
00:14:04.000 And I need to be willing to break my back for these people who depend on me.
00:14:04.000 Right.
00:14:08.000 There's this weird primal feeling that you're responsible for these like very vulnerable little people that you love more than life itself.
00:14:17.000 It just changes everything.
00:14:18.000 It just kicks you into gear.
00:14:20.000 But for some people, it doesn't.
00:14:22.000 Some people that are so stuck in that perpetual childhood thing, they just wind up deciding it's too much of a drag and they get divorced.
00:14:30.000 And then they fuck up the kids.
00:14:31.000 Yeah.
00:14:33.000 God, we have so many rabbit holes we could go down on this.
00:14:35.000 But I mean, it was, you know, growing up in the 80s and the early 90s, it was really like a divorce culture.
00:14:44.000 And I obviously understand that if you're in a bad relationship or an abusive relationship or, you know, certainly there's a threshold where marriage should dissolve.
00:14:52.000 No question.
00:14:54.000 But I kind of feel like the central thrust of a lot of culture at that time was about like divorce or not getting married or, you know, discovering yourself and that type of thing, which in some ways is good.
00:15:04.000 There's goodness there.
00:15:05.000 But when it becomes a central thrust or a central narrative and divorce becomes very easy or it's happening everywhere, it's super normalized.
00:15:13.000 It's super destructive.
00:15:13.000 And it's normalized.
00:15:14.000 Children are the ones who suffer the most on it.
00:15:16.000 And I think the data is clear on that.
00:15:18.000 When you look at single parent homes or no parent homes or being raised without an authority.
00:15:24.000 Or an abusive step person.
00:15:25.000 Or an abusive.
00:15:26.000 And that is, you know, when you look up the stats on that, like remarriage and having a new family, that becomes the single most likely vector of abuse in a young child's life, that new person, right?
00:15:38.000 Because now they're raising someone else's kid or whatever.
00:15:41.000 I mean, it's a that's in every old movie, the evil stepmother.
00:15:45.000 Yeah.
00:15:45.000 You know?
00:15:46.000 Yeah.
00:15:46.000 Or evil stepfather.
00:15:47.000 But in the old movies, it's always the stepmother that abuses the girl.
00:15:52.000 Yes.
00:15:53.000 And so, you know, I kind of resented that part of that time, that culture was, I shouldn't say when I was a child.
00:16:02.000 I should say as I got older, because I was in a single mom home.
00:16:04.000 And the guy that my mother remarried right after my father died was abusive.
00:16:10.000 And, you know, he really got hard on my younger brother.
00:16:13.000 And, you know, my mother moved us out almost immediately.
00:16:16.000 But when I re-examined that time, it really was, you know, I don't know how to describe it, but, you know, there are no rules when it comes to relationships and family.
00:16:27.000 And every family is special and particular in its own way, and they all need to be venerated.
00:16:32.000 And there's, of course, some truth to that.
00:16:33.000 We shouldn't deride someone because they come from a broken family, but we shouldn't elevate it like it's at the same level as a unified family.
00:16:42.000 And that's a tricky line to walk.
00:16:44.000 But also, the people who are making those movies in that culture came from the 50s and 60s where divorce was just not in the cards.
00:16:51.000 And so that was, you know, Hooke's law: as you bend any object, it wants to return back to its natural state.
00:16:58.000 And Hooke's law kind of played there, where nobody could get divorced in the 30s, 40s, 50s, and 60s.
00:17:04.000 And then you had the baby boomers who kind of culturally said, you know, actually, it's not as bad as we think, but then it overcorrected and it became kind of part of that cultural zeitgeist.
00:17:15.000 That's kind of what humans do, right?
00:17:17.000 We always overcorrect.
00:17:18.000 Yeah, we do.
00:17:19.000 Yeah.
00:17:19.000 We go in one direction until we realize it's destructive, and then we overcorrect until we realize that's destructive.
00:17:26.000 Yeah.
00:17:27.000 This episode is brought to you by Ketone IQ.
00:17:29.000 The demands on my time, energy, and focus are immense.
00:17:33.000 So when I need my brain to lock in for hours and hours and fire at its fastest, most alert state, I'm taking ketone IQ.
00:17:42.000 It's an energy shot powered by this little miracle molecule that your body already naturally makes and your brain especially loves ketones.
00:17:52.000 I've been talking about ketones for over a decade and this company has finally figured out how to put them in a bottle.
00:17:57.000 When I take ketone IQ, I drop right into a state of laser-like focus and sustained mental clarity.
00:18:04.000 Whether I'm podcasting, training in the gym, or just want to show up locked in when it matters, the difference is night and day with ketone IQ.
00:18:13.000 Visit ketone.com/Rogan for 30% off your subscription order, or find Ketone IQ at Target stores nationwide in the protein and electrolyte aisle and get your first shot free.
00:18:29.000 Plus, they have a 60-day money-back guarantee.
00:18:33.000 That's how confident they are that you're going to love the increased focus you get from ketone IQ.
00:18:38.000 And I would say that's the, and not, this isn't a political thing.
00:18:41.000 This is just the reality of it.
00:18:43.000 That's mostly what makes me conservative in nature is I agree systems need to change, but they need to change slowly and pragmatically.
00:18:51.000 Because, you know, any social scientist worth their salt will know a social experiment almost never has the outcome that we thought it was going to have.
00:19:01.000 In other words, we thought doing something to society would form society this way, but it almost has the inverse, the anti-pattern like we talked about before, and almost ends up propagating itself.
00:19:12.000 And so that makes me, I'm still a proponent for change, but it should be slow and thought out and done in pockets first.
00:19:22.000 Kind of, you know, federalism.
00:19:23.000 Let's do little changes here.
00:19:25.000 Let's let California be crazy for a while and see how that works out for them.
00:19:29.000 But let's not nationalize the craziness.
00:19:31.000 Let's learn from what they learned there.
00:19:33.000 And there'll be goodness, you know, hipsterism, they make great coffee and cool art.
00:19:38.000 And let's take those parts.
00:19:39.000 But how about the rampant homelessness?
00:19:42.000 Let's find out what caused that and solve for that.
00:19:45.000 And, you know, that was kind of the founder's intent with federalism.
00:19:49.000 They're really federalist-minded, state-minded.
00:19:51.000 And, you know, even for that being 250 years ago, there's real profundity in that.
00:19:57.000 Like, let's change things slowly and let social experience take place and adopt the best parts of those things and then integrate them to the culture overall as we move along.
00:20:06.000 But, you know, let's not throw the baby out with the bathwater.
00:20:09.000 Yeah, I think in this country, one of the primary problems that people have is a profound lack of respect for discipline and how important discipline is for your life.
00:20:20.000 And discipline is associated with conservatism.
00:20:24.000 And because of that, like a lot of people think that I'm, I don't think I'm anything.
00:20:29.000 I think I have politically or ideologically, I have a lot of everything in me.
00:20:35.000 I don't think I identify with one side or another.
00:20:37.000 But if there's one thing I agree with conservative people on, it's that conservative people lean more towards the importance of discipline.
00:20:44.000 Hard work, discipline, don't complain, get things done, deal with the hand that you've been dealt, and just sort it out and get to work.
00:20:54.000 Don't cry.
00:20:55.000 Don't look for other people to save you.
00:20:56.000 They're not going to.
00:20:58.000 And this is not something that's celebrated in society.
00:21:02.000 It's thought of as a cruelty that if you say that you need discipline, that you're not treating these people that are victims of circumstance with the proper respect or with the proper empathy.
00:21:14.000 And I think a certain amount of empathy is probably not so good for you at a certain point in time.
00:21:19.000 There comes a point in time where you're letting people wallow in their bullshit and just make excuses for why they're not getting anything done.
00:21:25.000 And in that sense, I think California is, that is a giant part of what's wrong with California.
00:21:31.000 What's wrong with California when it comes to crime?
00:21:33.000 What's wrong with California?
00:21:35.000 You know, the way they address crime and the way they address homelessness and all these issues that they have, they don't put their foot down.
00:21:42.000 And at a certain point in time, you've got to realize what Gad Saad calls suicidal empathy.
00:21:48.000 Society can suffer from suicidal empathy.
00:21:50.000 And at a certain point in time, you've got to enforce rules and you've got to make it so that people have to get their shit together.
00:21:56.000 Yeah.
00:21:56.000 And that suicidal empathy becomes a way for the person who's imposing it on someone else to feel good about themselves, which makes it even trickier and even more insidious because they're feeling good from the weaponization of other people's lot in life.
00:22:14.000 And the thing about that is none of the rules that you're going to impose, especially as a legislator or as somebody in a think tank, you'll never feel the repercussions of them.
00:22:24.000 You'll never have to actually deal with it day to day.
00:22:26.000 You're just imposing it on someone else and saying, I better understand the structure of reality and the fabric of the world.
00:22:33.000 And you can't help but be this way.
00:22:35.000 It's the system that's done this to you.
00:22:36.000 So let me give you pittance that I'm going to take from someone else.
00:22:40.000 And that makes me benevolent.
00:22:42.000 I get to feel good about that.
00:22:44.000 That's a giant part of government, for sure.
00:22:46.000 That's a giant part of what's the problem with liberal governments.
00:22:49.000 Liberal governments should get paid based on whether or not the city does better or worse financially than when they were in office.
00:23:01.000 If their policies lead to greater domestic production of goods and services and GDP does better and everything does better, then you should get paid more.
00:23:12.000 If there are more real estate sales, more people are making more money, median income rises, fewer homeless people, you should get paid more.
00:23:19.000 And you should get paid less if homelessness goes up, if crime goes up, if there's more destruction, if there's more, you know, assaults and home invasions, you should get paid less.
00:23:30.000 You're doing a shitty job.
00:23:32.000 And if you did that, I think they would impose laws that made it safer and healthier and made it better for society.
00:23:40.000 Yeah, and then they would just inevitably change the ways that we track and measure those things and pay themselves more.
00:23:45.000 Well, they shouldn't have the opportunity to do that.
00:23:47.000 Then you need some sort of an oversight that's going to be cynical.
00:23:50.000 You're right, though.
00:23:51.000 You're right to be cynical because that's what they do about everything.
00:23:54.000 Someone was explaining to me yesterday that one of the problems with cleaning up fraud is that fraud is responsible for a giant percentage of GDP.
00:24:05.000 And if you have hundreds of billions of dollars of fraud in this country and you eliminated that, you actually lower GDP because you actually lower the amount of money that's in circulation.
00:24:19.000 That's interesting.
00:24:20.000 I've never thought about that before.
00:24:21.000 He was explaining to me, and I was like, oh my God, that is crazy that a giant percentage of our GDP is fraud.
00:24:28.000 And if that was somehow or another eliminated, like one of the things that they do when they raise jobs, like they increase GDP.
00:24:37.000 We've added 200,000 jobs to the market.
00:24:41.000 Well, what are those jobs?
00:24:42.000 Like, what are those jobs?
00:24:43.000 Are these government jobs?
00:24:45.000 Because the government is a giant percentage of our GDP, government jobs.
00:24:49.000 It's way bigger than it should be.
00:24:51.000 And those jobs, a lot of them, are bullshit and waste, a lot of them.
00:24:51.000 Way bigger.
00:24:56.000 You know, and that was some of the stuff that was uncovered during Doge, you know, the limited amount of access that Doge had to it, just the beginning of it, where you got to see the curtain pulled back and get to see exposure of so many of these fraudulent, supposedly charitable organizations that were really just money laundering.
00:24:56.000 Yeah.
00:25:15.000 They're really just funneling money into these people's hands, like the homeless thing in California.
00:25:21.000 Oh, my goodness.
00:25:22.000 It's a bonkers situation where they've spent $24 billion.
00:25:27.000 They cannot track it.
00:25:28.000 They've tried to audit it.
00:25:30.000 The government has vetoed these audits and they have no idea where that $24 billion went and yet homelessness went up.
00:25:39.000 But you've got a giant machine that is this homeless establishment, this homeless industrial complex that is being funneled money into that.
00:25:49.000 And that actually aids the GDP, which is kind of crazy.
00:25:53.000 Yeah, I mean, it was one of the things.
00:25:55.000 My last three years in the military, I was advising a colonel and a two-star general, and they were in charge of all of the offensive cyber development, ethical hacking, offensive cyber development.
00:26:08.000 I was their technical advisor.
00:26:11.000 And one of the things I kind of learned about government at that point was these systems have their own incentive.
00:26:19.000 And the incentive is not the output of their purported mission.
00:26:22.000 The incentive is the growing of the organization and the execution of budget.
00:26:27.000 So while they're in there, you know, I've never seen a field-grade officer get dressed down more than when he didn't spend all of the money that he was budgeted for that year.
00:26:35.000 Isn't that crazy?
00:26:36.000 He would go to the Pentagon and they'd be like, well, you didn't execute $300 million of OCO, of Overseas Contingency Operations funds here.
00:26:44.000 And they would dress him down for an hour.
00:26:46.000 And what people don't understand is if you don't spend that money, your budget for the next year will be lower because there's no need to have a higher budget.
00:26:55.000 Instead of tying it to mission to say, did you achieve your mission objectives?
00:26:58.000 Right.
00:26:59.000 We started the year agreeing from the president's framework, the NIPF, the National Intelligence Priorities Framework.
00:27:05.000 We wanted to achieve these effects.
00:27:07.000 What you would want to hear is, we achieved them and we saved 25%.
00:27:11.000 But instead, it's we achieved them, but we didn't execute all of this money.
00:27:14.000 Well, you're fired.
00:27:15.000 And I literally have seen that happen.
00:27:17.000 I've literally seen that happen.
00:27:19.000 And that kind of...
00:27:20.000 What a sick society.
00:27:21.000 Yeah, and that kind of shifted my thinking, in that these systems have their own incentive to exist and to grow, because those guys that were holding that general officer's or that O-6's, that colonel's, feet to the fire, they also have an incentive, because they were part of that trickle-down.
00:27:21.000 Yeah.
00:27:38.000 And they've got bureaucracy that surrounds them.
00:27:40.000 And if they didn't execute it, that means they didn't execute it.
00:27:43.000 And that means they have to go to whomever.
00:27:45.000 This was during the Biden administration.
00:27:47.000 I believe Hegseth, for everything we could say, has actually tightened this up quite a bit.
00:27:51.000 And he's kind of re-hauled the way development works, especially on the offensive cyber side.
00:27:56.000 But they have bureaucracies, and the incentive of the bureaucracy is to make sure that we grow.
00:28:00.000 And that's it.
00:28:01.000 And then you think about that for a minute, and you're like, well, it's no longer a question why we have $30 trillion of debt.
00:28:08.000 $39.
00:28:09.000 $39 trillion.
00:28:10.000 And then what, like $150 trillion of unfunded liability.
00:28:14.000 In other words, we've promised people money for the next 30 years.
00:28:18.000 And it's debt that, you know, I don't see how we'll ever escape that debt.
00:28:23.000 And the thing about it is, and I don't want to be pigeonholed, because I'm actually quite liberal when it comes to... my politics are like yours in that I'm kind of a man without a home, but they also change at different levels of analysis.
00:28:37.000 I'm very liberal with my family and I'm very like communist.
00:28:41.000 I protect them.
00:28:42.000 I give them everything they need.
00:28:44.000 I'm trying to give them structure.
00:28:46.000 And even in my community, I'll help someone out out of pocket or do something for them that's a strain on my time or might hurt something else because there are really no solutions.
00:28:54.000 There's just trade-offs.
00:28:56.000 That's supportive for the community, though.
00:28:57.000 That's how people are supposed to do charity.
00:29:00.000 And I'm also very non-judgmental about how someone lives.
00:29:02.000 I don't care what they do in their house.
00:29:03.000 I don't care if it's a Roman orgy on the weekends.
00:29:06.000 Like be a predictable, productive person Monday through Friday and go do your Roman orgy on the weekend.
00:29:12.000 I won't judge you.
00:29:12.000 I don't care.
00:29:13.000 Like I don't, I really have enough crap in my own life.
00:29:17.000 As long as someone's not getting hurt.
00:29:18.000 Yeah, as long as no one's getting hurt, consenting adults.
00:29:20.000 Like I have enough problems and I screw up enough and people have, there's a laundry list of things that people could say about me, how I've screwed up in my life.
00:29:27.000 But then as I graduate and get higher and higher, more conservatism takes place.
00:29:34.000 And that's a result of just, you know, having an engineering mindset when I'm looking at life and understanding that it's just not Republican or Democrat or leftist or rightist or liberal or classically liberal.
00:29:50.000 All of these monikers don't work for me because they break down at some level of analysis.
00:29:56.000 And I think that's the problem.
00:29:56.000 Right.
00:29:58.000 I think the problem is these ideologies that people subscribe to, where you have a predetermined pattern of thinking that you're supposed to adopt.
00:30:04.000 Yes.
00:30:05.000 You're supposed to adopt these opinions.
00:30:07.000 And some of them just don't fit.
00:30:09.000 And that's how people get pigeonholed.
00:30:10.000 That's like people on the left.
00:30:11.000 They get pigeonholed into weird stuff that you can't really justify, like trans women in sports.
00:30:17.000 Like, what the fuck are you doing?
00:30:18.000 Like, we're being inclusive.
00:30:20.000 Like, no, you're not.
00:30:21.000 We're loving the borders of Ukraine while hating our own border.
00:30:24.000 Yeah.
00:30:25.000 Fucking bonkers.
00:30:26.000 There's so many crazy things.
00:30:26.000 Yeah.
00:30:28.000 There's so many crazy things that people just adopt that don't make any sense.
00:30:32.000 And, you know, when you subscribe to an ideology, the problem is if like if you define yourself as this person, I am this.
00:30:40.000 I am a hardcore right-wing blah, blah, whatever it is.
00:30:44.000 You immediately close the door to all the very productive and interesting things that the other side thinks.
00:30:49.000 Yeah, and you're also making yourself into a tool of propaganda.
00:30:52.000 Because if someone, if I meet someone and they just say, I'm this, it's like, well, I could reasonably predict everything that's going to come out of your mouth.
00:31:00.000 Yeah.
00:31:01.000 That's not entertaining.
00:31:02.000 I don't want to have a conversation with that person.
00:31:04.000 I can't seek to learn from them because I could just pick up the Communist Manifesto or Mein Kampf and have a pretty good understanding of who I'm dealing with.
00:31:10.000 And therefore, a conversation is not relevant.
00:31:13.000 It's not needed.
00:31:14.000 A lot of people are afraid of social ostracism too.
00:31:17.000 So they're afraid of straying outside of the narrative, whatever side they're supposed to be on.
00:31:23.000 And, you know, some groups are really good at making you feel like dog shit if you don't agree entirely with even things that don't even make any sense.
00:31:32.000 So that's why people go along with stuff that's illogical, like open borders or whatever it is.
00:31:37.000 They go along with things that are not in their best interest because they're scared.
00:31:37.000 Yeah.
00:31:41.000 They're scared of being ostracized.
00:31:43.000 They're scared of being cast out of the kingdom.
00:31:45.000 They're scared of being excommunicated.
00:31:47.000 Yeah, I dealt with a lot of people first when I retired from the military and then more recently leading up to the last election where I was entertaining the deal of doing some work for government, believe it or not.
00:32:01.000 And because as we talk more, you'll figure out I'm pretty anti-institutions.
00:32:07.000 I'm really against those types of things.
00:32:10.000 But I really felt, if you would have asked me three years ago how I felt about the Trump election and all of that stuff, I was very excited because he was saying a lot of things that I wanted someone to say.
00:32:19.000 Trump fits a pattern.
00:32:21.000 And this is what I think people kind of lack. My whole life is built around pattern analysis.
00:32:27.000 I really enjoyed patterns and exhuming and looking into patterns.
00:32:33.000 And there's a pattern here. You'll laugh when I say this first part of the pattern, but then I'll make it make more sense later.
00:32:41.000 But he fits the pattern.
00:32:42.000 Well, first, he's a Jacksonian, in that he's a pragmatic person in the way that he governs, which I liked, or at least I did.
00:32:50.000 And, you know, there's some things he's done recently that I don't enjoy.
00:32:55.000 But he's also an outsider or a savior type.
00:33:01.000 À la, you know, I don't remember the movie, but The Magnificent Seven back in the day.
00:33:05.000 I don't remember the actor's name.
00:33:07.000 There's this group of, you know, there's this Western town.
00:33:10.000 Everything's going to shit.
00:33:12.000 These seven guys walk in.
00:33:13.000 I think Chris Pratt remade it with Denzel Washington or someone else.
00:33:16.000 Oh, really?
00:33:17.000 I can't remember.
00:33:17.000 I think so.
00:33:18.000 But there's an old one that I used to watch with my grandpa.
00:33:20.000 God, there's too many movies.
00:33:21.000 Yeah.
00:33:22.000 And there's this pattern where you wouldn't invite these guys to a dinner party.
00:33:27.000 You wouldn't want them in church on Sunday.
00:33:29.000 But when a system is so corrupt and so horrible, you have to rely on these types of people to come in and be a check to the system.
00:33:36.000 But then also you don't want them to stick around when the system is reset.
00:33:40.000 So there's a scene in the movie where he says, you know, man, these seven guys are talking.
00:33:45.000 And they said, man, these people must have really wanted us.
00:33:47.000 Like, it's crazy.
00:33:48.000 They must be happy we're here.
00:33:50.000 And I think it's Gary Cooper or someone or one of these guys says, looks at him and says, they're going to be even happier when we leave.
00:33:55.000 And Trump kind of fits that narrative.
00:33:57.000 Wolverine from the X-Men would be another one who fits this narrative.
00:34:01.000 Like, is he going to be at the X-Men Christmas party?
00:34:04.000 No.
00:34:04.000 Right?
00:34:05.000 Is he trying to hit on Jean Grey, Cyclops' wife?
00:34:08.000 I'm a comic nerd, so I'm sorry.
00:34:09.000 Is he trying to hit on, sleep with, Cyclops' wife?
00:34:12.000 Yes.
00:34:13.000 Did he chop a guy's head off and throw it at a car?
00:34:16.000 But we're about to go face Galactus and we're going to need him.
00:34:16.000 Yes.
00:34:20.000 And so we have to put up with all of this other stuff because we understand that when the system is corrupt at every level, you need someone who's outside of the system to come in and set the system right.
00:34:31.000 It's a Western pattern as well.
00:34:34.000 Other people who fit this would be like Patton, right?
00:34:36.000 Married his cousin, slapped soldiers who had...
00:34:39.000 Did he really?
00:34:39.000 Oh, his cousin?
00:34:40.000 Yeah, I think it's his third cousin.
00:34:42.000 How many cousins removed before it becomes okay?
00:34:46.000 Is it third?
00:34:46.000 I don't know.
00:34:47.000 Fourth?
00:34:48.000 If there's blood.
00:34:49.000 Have you never met them?
00:34:50.000 I'm Icelandic, so I really can't say anything, right?
00:34:52.000 They literally have apps in Iceland.
00:34:53.000 Like my grandparents and my great-grandparents are all from Iceland.
00:34:57.000 They settled in Manitoba, Gimli, Manitoba, which is this Icelandic community.
00:35:01.000 And they literally have apps in Iceland to make sure you're not dating your cousin.
00:35:04.000 So, you know, less than a million people on one island.
00:35:11.000 So you're trying to prevent that stuff.
00:35:13.000 But anyway, Patton, yeah, slapped soldiers who had tuberculosis.
00:35:17.000 One of them probably had shell shock.
00:35:19.000 It got in the newspaper.
00:35:20.000 They wanted his head.
00:35:22.000 And thankfully, the generals were like, no, he's the guy that we need for the moment, right?
00:35:25.000 He had the ivory pistols and he dressed like not like a general.
00:35:29.000 He didn't talk like a general.
00:35:30.000 He wasn't like an Eisenhower where he had the veneer of a general.
00:35:36.000 But we knew he was the only guy we could have at the Battle of the Bulge.
00:35:39.000 Like the Germans talked about him like he was already a mythic legend in his lifetime.
00:35:45.000 But part of this pattern that people should understand, or when they examine this pattern, is it never ends well for these anti-heroes.
00:35:52.000 They're always killed or defamed in the final analysis.
00:36:00.000 So when the Magnificent Seven come in, they'll go to another town and all get killed.
00:36:00.000 When Patton retired, he died in some weird Jeep accident.
00:36:04.000 You know, Wolverine, he's the only guy left on this desolate world where the Hulk's in charge and it's a horrible existence.
00:36:12.000 Patton, or not Patton, Petraeus is another one.
00:36:16.000 I briefed Petraeus.
00:36:17.000 I worked for, not for him, but for people who worked for him in Iraq.
00:36:21.000 And he was the guy that got us through with the surge.
00:36:25.000 But he was really a weird guy when you would talk to him.
00:36:30.000 You knew that he knew something you didn't and that he was seeing things that you weren't.
00:36:35.000 But even for myself, as being like a chief warrant officer at that time, a low-level technician, he would ask questions like he got it.
00:36:41.000 He didn't act like other generals.
00:36:43.000 Like other generals would have their three things they want to talk about, then they'd want to get out of Dodge.
00:36:47.000 He would ask questions that really had implications.
00:36:49.000 And he is another one of these outsiders who came in to right a system that was not working vis-a-vis Iraq in 2006.
00:36:58.000 And then what happens to him when he leaves?
00:37:00.000 They put him in charge of the CIA.
00:37:02.000 They knew he had been screwing around with this woman.
00:37:04.000 And they're like, okay, he served his function.
00:37:07.000 Now he needs to get out of Dodge.
00:37:08.000 And then now he's, you know, got tried for all these things and sleeping with someone while he wasn't married.
00:37:15.000 And, you know, it's not a ceremonious end for these types.
00:37:18.000 I saw.
00:37:19.000 Is that really what happened to Petraeus?
00:37:20.000 That's how he ended?
00:37:22.000 Yeah, he was sleeping with some girl that was writing his book or something along those lines.
00:37:27.000 Well, I'm not saying that's the end of him.
00:37:30.000 All I'm saying is that history will remember the pattern is ending unfavorably.
00:37:36.000 You know what I'm saying?
00:37:37.000 And so when I examined Trump, I said, yeah, I don't like what he says.
00:37:41.000 I wouldn't want him around my daughters.
00:37:43.000 I wouldn't want him at a dinner party.
00:37:46.000 But he seems to be saying these things like he's going to reset this system.
00:37:50.000 You know, I think it was Chappelle, on your show or another show or someone like that, where he talked about Hillary saying, you know, something about the tax loopholes or whatever.
00:37:58.000 And he just hit right back at her and said, well, the people who are funding your campaign take advantage of those same loopholes.
00:38:04.000 And if they're there, I'm going to take advantage of them.
00:38:06.000 I wouldn't be a pragmatist if I didn't.
00:38:08.000 When he started saying stuff like that, it seemed to me like he was going to upend this system.
00:38:12.000 The jury's out on that because I don't know how I feel these days.
00:38:15.000 We can get into that if you need to, if we want to.
00:38:17.000 But he's an outsider personality, and I thought he was going to really reset this system.
00:38:23.000 And there are good things that are happening.
00:38:26.000 If I were to grade him, I would probably give him a C plus or a B minus.
00:38:30.000 Certainly better than what was happening under Biden.
00:38:33.000 I was still in the military when Biden was in charge, and it was awful to say the least.
00:38:38.000 What were the problems?
00:38:39.000 Oh, my goodness.
00:38:41.000 Books that general officers were being told to read and that I, as an advisor, was being told to read.
00:38:46.000 Books like White Rage, like understanding why you're a problem.
00:38:52.000 You as a white man are a problem in the modern day military because this whole thing's built on systemic racism.
00:38:59.000 You have inbuilt implicit bias that you can't escape even if you wanted to or you recognized it.
00:39:04.000 It's woke politics.
00:39:05.000 Yeah, it was woke politics.
00:39:08.000 And it was, you know, I would sit there and say, you know, all of the friends, all the people that I know who've died during this war, not all of them, but 80% of them, and the numbers bear this out when you look at them.
00:39:19.000 They're all white guys from the middle of the country who were on their farms or, you know, not all of them, 80% of them.
00:39:26.000 I think the numbers bear out about 80% of them.
00:39:28.000 Were these guys from the Midwest or these places where they didn't really have a lot going?
00:39:32.000 And they went off to fight a war that we probably shouldn't have been fighting in the first place, especially in Iraq.
00:39:38.000 And they died for their cause.
00:39:40.000 And now you're saying that those people who make up the majority of the combat deaths are somehow part of this problem and that other people aren't benefiting from it.
00:39:50.000 I don't believe in race. To me it's disgusting.
00:39:52.000 Even to talk about someone's race, even on both sides of the spectrum, when they were electing that Supreme Court justice, I can't remember her name right now off the top of my head just because I'm a little nervous still.
00:40:05.000 She was black.
00:40:06.000 Ketanji Brown Jackson.
00:40:07.000 Yeah, they were talking about how it's historic because she's black.
00:40:10.000 And Biden had said he's going to hire a black woman to do this job.
00:40:14.000 If I had worked my whole life to do something, but now I'm only being elevated to this next position because of my gender and the color of my skin, I would turn that job down so fast because that's not what I want to be known for.
00:40:26.000 These are immutable characteristics that I'm not in control of.
00:40:30.000 I didn't choose to be born white or with blue eyes.
00:40:32.000 I didn't choose to be born in a trailer park in the middle of nowhere without a dad at five.
00:40:36.000 I didn't choose any of those things.
00:40:38.000 I don't see how I benefit from these things at the individual level.
00:40:41.000 And, you know, the individual level of analysis for me is really the only way to evaluate someone for their pluses and their minuses.
00:40:48.000 And anything beyond that to me is discriminatory on its face.
00:40:52.000 Of course.
00:40:52.000 It's just a great way to control people because you pit people against each other that way.
00:40:57.000 And it's just an awesome way that they can stay in control and make everybody walk on eggshells and think that they've victimized people in order to get to their position and they have to be shameful of who they are that they had no control over.
00:41:13.000 It also gives people an easy rubric to judge other people.
00:41:15.000 Yeah.
00:41:16.000 Because nothing's easy, really.
00:41:18.000 And it gives some like white guy bad, black guy good.
00:41:22.000 Chinese guy, as long as he's not applying to the college I want to get into, he's good.
00:41:26.000 Right.
00:41:28.000 And it gives people, people want easy answers, really, at the end of the day.
00:41:32.000 They want to be told the easy rubric to navigate life because really none of it's easy and it requires discipline, like you said before, and thought.
00:41:40.000 And so it was that stuff in the military.
00:41:43.000 I remember getting told in an equal opportunity briefing we were getting, it doesn't matter what you meant when you said what you were saying.
00:41:53.000 It only matters what the person felt when you said it.
00:41:57.000 They'd said that in a military briefing?
00:41:58.000 This is a military equal opportunity briefing.
00:42:01.000 And the example they gave was if a woman walks into the, like we worked with a lot of civilians at this military organization where we were developing these offensive cyber capabilities, a lot of civilians in there.
00:42:14.000 And so if, you know, woman X walks in today and she's got a dress on, and the thought in your head is, I'd like to get my wife that dress or something like it or find out where she bought it.
00:42:24.000 And you just say, that's a nice dress.
00:42:26.000 Anyway, here's the TPS reports.
00:42:28.000 If she heard something sexual or didn't like the connotation or whatever, there's going to be an investigation.
00:42:36.000 You're going to be pulled out of that office.
00:42:38.000 This is all going to happen despite what you meant.
00:42:41.000 So the idea probably was good.
00:42:43.000 We want to prevent sexual harassment inside of the office.
00:42:47.000 But it was weaponized.
00:42:48.000 But it was weaponized and it was carried out in a way where it's only about how people feel and not what a reasonable person standard would be in a particular situation.
00:42:56.000 And from the time I joined the military until that time, we had been at war.
00:43:00.000 My entire time in the military, we were at war.
00:43:03.000 I deployed throughout my career.
00:43:05.000 And I wouldn't say that I was a war horse.
00:43:08.000 I was not a long tabber.
00:43:09.000 I was not a cool guy kicking indoors.
00:43:11.000 It was my job as the guy with tape over his glasses to point out the door for someone else and say, bad guys in there.
00:43:18.000 So I was not a super badass in that regard.
00:43:21.000 I was a nerd for super bad asses.
00:43:25.000 But we also all engaged in gallows humor.
00:43:28.000 And we would, you know, the jokes and stuff.
00:43:32.000 Even someone who had recently died, we would make a joke about.
00:43:34.000 It's because you have this tremendous pressure.
00:43:38.000 And comedy is the relief valve for that in a lot of ways.
00:43:42.000 Yeah, of course.
00:43:43.000 And but then someone would overhear that joke or something.
00:43:45.000 And now you're looking down the barrel of a 15-6, which is a military investigation.
00:44:08.000 And all of these things that could permanently impact your life and give you a scarlet letter, to where you could never be employed again or do anything ever again, because you were simply trying to relieve some pressure or you were trying to find out where to buy your wife the next dress, and now your life's being ruined.
00:44:08.000 And I know guys who suffered under that sword.
00:44:10.000 Like I wouldn't name them, but I know guys who, you know, their career met a terminal end because of a dumb joke or something.
00:44:18.000 It's like you can't be expected to go out and shoot people in the face and then be sensitive to someone's feelings an hour later.
00:44:25.000 It's just, it doesn't, it does not work.
00:44:25.000 Right.
00:44:27.000 Now, should you talk to that guy and say, hey, you know, you made Woman X feel so-and-so.
00:44:31.000 Be more cognizant of that whenever you're around her in the future.
00:44:34.000 Well, you should also have a rational discussion with the woman.
00:44:37.000 Yes.
00:44:37.000 And what did he ask you?
00:44:39.000 He said, where did you get that dress?
00:44:40.000 It's very lovely.
00:44:42.000 I'd like to get one for my wife.
00:44:44.000 Why were you upset at that?
00:44:45.000 Like, is this rational?
00:44:48.000 Like, you can't be in an office if you're that sensitive.
00:44:52.000 Like, it's one thing if the guy said, I'd like to get you out of that dress.
00:44:55.000 Well, for sure.
00:44:56.000 Now you're in a different world.
00:44:58.000 100%.
00:44:58.000 100%.
00:44:59.000 But if someone says, you look great, you know, have you lost weight?
00:44:59.000 Right.
00:45:03.000 You look fantastic.
00:45:04.000 That's a compliment.
00:45:06.000 And if someone gets upset, I felt sexually objectified.
00:45:09.000 I felt harassed.
00:45:10.000 Like, okay, he just said you look great.
00:45:12.000 Yeah.
00:45:13.000 That's it.
00:45:13.000 Healthy.
00:45:14.000 It's not, you look great.
00:45:15.000 I'd like to get you naked.
00:45:16.000 Now we've crossed the Rubicon, right?
00:45:18.000 Now we're into.
00:45:20.000 But just you look great or I like your dress.
00:45:23.000 That's like if you said that to a man, like, hey, great suit.
00:45:26.000 Yeah.
00:45:27.000 And he's like, oh, you've trimmed up.
00:45:29.000 Yeah.
00:45:29.000 You need to file a complaint.
00:45:30.000 Yeah, you've trimmed up, Joe.
00:45:31.000 You're looking good.
00:45:32.000 You're looking great, Bill.
00:45:33.000 Like, oh, my God, I am being harassed.
00:45:35.000 I need it, like, complaint.
00:45:36.000 That would have worked during the Biden administration.
00:45:38.000 That's fucking crazy.
00:45:39.000 That would have worked.
00:45:40.000 That's so crazy.
00:45:41.000 And the other thing that they were doing in this briefing, which is where I kind of, you know, the last couple of years of my military career, I got in trouble a couple of times, or I should say, called down.
00:45:49.000 I was a senior, I was a CW-4.
00:45:51.000 I was one rank from the top.
00:45:53.000 I was advising two-star generals, colonels on very important matters.
00:45:58.000 I wasn't high.
00:45:59.000 I wasn't high in the dominance hierarchy, but I was adjacent to people who were as an advisor.
00:46:05.000 And in this briefing in particular, they had gotten into, you know, it's bad that there are so many white people.
00:46:20.000 I'm doing high points here, but we need more diversity.
00:46:22.000 I was part of an accepted career program that they were starting to call like the old white boys network because of most of the people in it. So the requirements for this program were you had to speak a couple of languages.
00:46:34.000 You needed an engineering degree or some kind of demonstrated engineering background.
00:46:39.000 You had to have deployed.
00:46:41.000 They wanted you to speak the language very well.
00:46:44.000 They wanted you to be able to go through these engineering courses, these other things.
00:46:49.000 And what happens naturally is you now need people who are interested in engineering.
00:46:54.000 So you've got somebody who's maybe more constrained in their thinking.
00:46:54.000 All right.
00:46:58.000 You need somebody who speaks languages.
00:47:00.000 Well, now they also need to be kind of, you know, speak French, speak Russian, whatever it was.
00:47:06.000 So they had to have studied or lived in an area and done this.
00:47:09.000 And they need to be able to go through these crazy tactical and strategic types of courses.
00:47:14.000 By virtue of those things, you're going to get men.
00:47:18.000 And there were lots of women, but then there'll be more white men.
00:47:21.000 And it's not because the pool presented itself that way.
00:47:25.000 Now you have to extract from that pool.
00:47:28.000 And so in this briefing, when they were talking about like the old white boys network or how we need to change things, I said, you know, do you realize that most men have more in common with each other than with most women?
00:47:40.000 Or like if I say I need more diversity in a particular room, if you said diversity of thought, I'd be fine with that.
00:47:48.000 But Joe and random black guy in the same program in the same office have far more in common than the white woman.
00:47:59.000 But what you're saying is these people need to all be separate, different colors, different... like all of this needs to be this way.
00:48:06.000 It's going to naturally present itself that way because men in the military generally are disagreeable.
00:48:11.000 Men in the military who like engineering are generally hyper disagreeable.
00:48:16.000 And the only difference between these two people is the pigment of their skin.
00:48:20.000 So this fake diversity quota that they're putting on top of us doesn't achieve anything other than giving some officer a bullet on their OER.
00:48:29.000 And I got pulled into the office afterward.
00:48:31.000 I said way more than that, but essentially afterwards they're like, hey, Chief, you can't say that in those briefings, like the way that you were getting animated in there and what you're saying, what you're doing.
00:48:41.000 Yeah, this is not going to fly.
00:48:43.000 And this was like 2018 or 2019 or something.
00:48:46.000 Yeah, just trying to be rational and say that there's more difference within groups than there is between groups.
00:48:46.000 Just being rational.
00:48:53.000 And that the similarities and the way that things stack up, you recruit from a pool of volunteers and candidates.
00:48:59.000 If I'm recruiting from a pool of volunteers and candidates who are 80% male and white, I have to expect that the selected individuals are going to be male and white.
00:49:08.000 The majority of people who join the military, I don't control this.
00:49:12.000 I'm just, as an engineer, I'm looking at statistics.
00:49:15.000 Also, if you want a highly functional, productive group, it's got to be based on meritocracy.
00:49:19.000 Yeah, for sure.
00:49:20.000 For sure.
00:49:21.000 Anything other than that is literally a threat to national security.
00:49:24.000 Yeah, you're denigrating lethality.
00:49:27.000 The role of the army is to deter war through exuding superior military fighting and technology.
00:49:34.000 And, when deterrence fails, to win.
00:49:36.000 That's it.
00:49:38.000 Those are the two things that we need to do with our military.
00:49:41.000 It needs to look like the guy in the playground who you would not muck about with.
00:49:44.000 And if you were to muck with him, he will beat you senseless.
00:49:48.000 That's it.
00:49:49.000 Now, whether or not we should be using that all the time or how we use it, that's a separate question.
00:49:53.000 But the entity itself needs to comport itself in this way.
00:49:56.000 Otherwise, you are endangering this truly special experiment, which at least in its beginnings valued the individual.
00:50:05.000 It valued individual rights and states' rights.
00:50:10.000 And the founders, and this was another thing I said in that briefing, was the founders knew, yes, they were all slaveholders, but they knew that the Constitution and the Bill of Rights and the Declaration of Independence would eventually lead to a system where we had to acknowledge these people as people.
00:50:26.000 And we fought a civil war where a million white dudes died to see this experiment through.
00:50:33.000 The scaffolding was there.
00:50:35.000 You have to look at things through the zeitgeist of the time.
00:50:38.000 If they had just said, nope, everyone's going to be free, there will be no slaves, you would have never gotten ratification through the southern states.
00:50:45.000 But they knew, and when you read the Federalist Papers, you see that they knew what system they were erecting.
00:50:51.000 When you look at Thomas Jefferson and some of these other great thinkers, who, yes, owned slaves, I get it.
00:50:56.000 They knew what they were building and they knew what it would ultimately terminate in.
00:51:00.000 And then we had a civil war where we destroyed our country from the inside to see this dream come about.
00:51:07.000 And now we're just going to all go back and say they're all slave owners.
00:51:10.000 I know this has all been said here a million times, but this stuff animates me because it's built with blood and treasure.
00:51:16.000 Well, it's also, you can't judge people from the past based on the standards of the present.
00:51:21.000 For sure.
00:51:22.000 Because culture changes, people understand things better.
00:51:25.000 We have a much greater recognition of what was wrong with things 100 years ago, 200 years ago.
00:51:34.000 And I'm sure in the future, we're going to look back on today with the same lens.
00:51:40.000 It just always works that way.
00:51:41.000 Did you know Joe had a gas-powered car?
00:51:43.000 Exactly.
00:51:44.000 That kind of stuff.
00:51:45.000 Yeah.
00:51:45.000 Did you know that?
00:51:46.000 You consumed more, you flew more, you ate more meat, you did whatever you did.
00:51:51.000 You were a problem.
00:51:52.000 He was a problem.
00:51:53.000 And now why would we ever, like, I'm voting to get rid of the Joe Rogan experience from the National Archives because you drove a gas car.
00:51:53.000 Yeah.
00:52:00.000 Yeah.
00:52:01.000 You know what I mean?
00:52:01.000 Like someone, you know, stores your stuff for posterity's sake, for the future to hear about this.
00:52:06.000 You know, I've always loved your podcast, Joe, and it was because you're a genuinely curious person, and I'm not kissing your ass right now.
00:52:15.000 You're a genuinely curious person that was saying things that were not in the current zeitgeist at the time, and you refused to apologize for it.
00:52:23.000 And it led to a lot of great things, but it led to an updating of the system.
00:52:28.000 And you did it with dialogue, with the Dialogos, with two people trying to learn things about each other.
00:52:34.000 And it led to an updating of a system.
00:52:36.000 I think it's very important for culture to have free and open dialogue so we can update our system.
00:52:41.000 So bad ideas can die so we don't have to die instead of our bad ideas.
00:52:45.000 Because if I can't express a bad idea, I have to act it out.
00:52:49.000 And if I act out the bad idea, it could kill me.
00:52:51.000 And the celebration of good ideas.
00:52:53.000 And the celebration of good ideas.
00:52:55.000 And it's just really, there's just been such a weird inversion in politics where the free, hippie-loving liberals of yesteryear are now the ones telling you what words you can use.
00:53:08.000 There are no borders, all of these crazy things.
00:53:11.000 And I always say to people, I said it to Andy on my last podcast with him.
00:53:16.000 I'm like a 1996 Bill Clinton Democrat.
00:53:19.000 If you go watch his State of the Union and he talks about lowering debt, getting out of debt, actually, working with Newt Gingrich to get out of debt, securing the borders, making work and education freely accessible.
00:53:33.000 I'm voting for that guy.
00:53:34.000 I know.
00:53:35.000 Isn't it crazy? That's why labels don't work, ideological labels.
00:53:41.000 Because if you go back far enough and look at Clinton, for example, he's one of the best ones.
00:53:46.000 And by the way, did balance the budget.
00:53:47.000 Yeah, he did.
00:53:48.000 We had a surplus when he left office.
00:53:50.000 Yeah.
00:53:50.000 Amazing.
00:53:51.000 Did a fucking amazing job.
00:53:52.000 So he got his dick sucked.
00:53:54.000 Yeah, yeah.
00:53:55.000 Back then, that's the other thing.
00:53:55.000 Who didn't?
00:53:57.000 Judging people by the standards of the past, you know, JFK doesn't look so good in the Me Too movement.
00:54:03.000 You know, I mean, he would have got canceled.
00:54:05.000 It's like you have to recognize that those, this ideological bubble that we find ourselves in left versus right, Bill Clinton does not fit in that.
00:54:15.000 Bill Clinton is squarely on the right in terms of 1996 standards applied to today.
00:54:22.000 He would never want to hear that.
00:54:23.000 No, he would never want to hear that because he's kind of shifted with the zeitgeist because that's what you kind of have to do if you want to stay in your party and be protected by your party.
00:54:31.000 Yes.
00:54:32.000 You know, but he's essentially, he had a lot of the idea.
00:54:36.000 We've talked about this before.
00:54:37.000 We've played clips of Hillary Clinton from 2008 and she's more MAGA than MAGA.
00:54:42.000 I know.
00:54:43.000 You know, her take on the border was like hardcore.
00:54:47.000 It was hardcore.
00:54:48.000 If you've been convicted of a crime, get out.
00:54:50.000 You know, if you stay here, pay a stiff penalty and you have to get in line and you have to learn English and everybody cheers.
00:54:57.000 That is a hardcore right-wing 2026 perspective.
00:55:00.000 Obama did it too in 2012.
00:55:02.000 Absolutely.
00:55:03.000 And Obama deported more people than Trump did.
00:55:05.000 Yes, exactly.
00:55:06.000 This episode is brought to you by ThreatLocker.
00:55:08.000 Data breaches are happening more frequently than ever.
00:55:11.000 And it's not because of sophisticated tactics.
00:55:14.000 Attackers are using the same methods and exploiting the same vulnerabilities.
00:55:20.000 What's changed is speed and scale.
00:55:23.000 Reacting to breaches can leave you exhausted, constantly chasing threats instead of preventing them.
00:55:29.000 That's where ThreatLocker comes in.
00:55:32.000 With ThreatLocker Zero Trust, you only allow what you need and block everything else by default.
00:55:40.000 You control what runs, when, where, and how, blocking ransomware before it executes.
00:55:46.000 Because no matter how you respond, a fast response simply isn't fast enough.
00:55:51.000 Visit threatlocker.com/JRE to learn more.
00:55:56.000 And it's just, my thought is always: I'm always updating, I'm always updating my systems.
00:56:02.000 I'm always getting told things.
00:56:04.000 I don't have a pre-prescribed way of looking at the world; I'll have a good conversation with someone.
00:56:10.000 I'll update my system.
00:56:11.000 But generally, my principles are in place.
00:56:14.000 And when you watch these people who get in their 30s, 40s, 50s, and 60s, and their core foundational principles are changing, it really should give you cause for concern.
00:56:24.000 Because like you were saying this at this time, and now you're saying this at that time.
00:56:28.000 It's like, generally, my rubric that I don't think will change about myself is: I'm fervently for the individual and fervently for truth, and that you should measure the world by looking not at what your intentions are but at what the outcomes are, and then evaluate the system and how it scales based on those outcomes.
00:56:49.000 Those are my principles.
00:56:51.000 I try to hold myself to that standard.
00:56:54.000 I fall short of that standard all the time, but I try to live by it, and I feel like that will always be me, even into my 90s, unless something goes horribly wrong. And I've pretty much been here for the past seven or eight years or so. Even into my 30s, I wasn't quite sure who I was as a human, but I'm pretty steadfast in that now,
00:57:20.000 and the amount of opportunities and the amount of goodness in my life, my children, my home, and the things I've been able to do have really been born out of that.
00:57:29.000 These last seven years, the truth has been at the top of the decision matrix for me, the top of the hierarchy. I try not to cut corners whenever I can, I help the good people around me, the truth is how I organize and conduct myself in life, and I try to only judge people as individuals.
00:57:52.000 You know, these are Christ's teachings from 2,000 years ago, but the world for me has just opened up in a way that I could have never predicted.
00:58:01.000 Using a very simple rubric, it's not easy, but it's simple.
00:58:05.000 And if more people just took those, and this isn't me, I didn't come up with this, this is the result of, you know, watching a bunch of experiments go bad, but if people just adopted that very simple thing and just tried it for three months, you'll feel better about yourself, you'll feel better about the world, and you'll feel better about the people around you.
00:58:23.000 It might make you hate the government more yeah, but uh well, I don't think.
00:58:28.000 If you don't hate the government, I think you're not paying attention.
00:58:31.000 Yeah yeah, for sure.
00:58:32.000 I mean, when you were working in cyber... cyber defense?
00:58:36.000 Like, what, cyber offense?
00:58:37.000 Cyber offense.
00:58:38.000 What was the primary function?
00:58:41.000 Like, what did you do?
00:58:43.000 So, in the beginning, I have no short answers, and I apologize.
00:58:48.000 In the beginning.
00:58:49.000 I don't like short answers.
00:58:50.000 Yeah, I always feel like I'm.
00:58:52.000 I like a good long answer.
00:58:53.000 Yeah, don't worry about that.
00:58:54.000 Okay.
00:58:55.000 When I joined the military, I was in signals intelligence and essentially learning the ins and outs of radars, how radars work, what they do, how they function.
00:59:05.000 Did you guys ever see any weird shit, like UFO shit?
00:59:08.000 I wish I had.
00:59:10.000 I really do.
00:59:11.000 I wish you had.
00:59:12.000 Yeah, I really do.
00:59:13.000 I was more in the signals intelligence side of the house, focusing first on electronic signals or emanations from radars, mapping them so that, you know, if we were going to go do the ground invasion and there was going to be some air support going in first and blowing shit up, we would tell them, hey, there's a man-packable SA-7 here.
00:59:30.000 There's a SA-10 here.
00:59:32.000 There's this here, there's there.
00:59:33.000 And then telling these pilots so they didn't get shot out of the sky.
00:59:37.000 Quickly, when the war kicked off, that became irrelevant because there were no surface-to-air missiles or surface-to-surface missiles left in Iraq.
00:59:44.000 We had knocked them all out in the first few weeks.
00:59:47.000 So then it shifted to communications intelligence.
00:59:49.000 So I kind of retrained on communications intelligence, and that was at that time off of cell phones, off of push-to-talk radios, repeaters, long-haul networks, terrestrial networks, extraterrestrial networks.
01:00:02.000 And what I mean by that is stuff, the satellites in the sky.
01:00:06.000 And doing analysis on those to try to inform what we call the common operating picture of the battlefield for a combatant commander.
01:00:14.000 So a combatant commander wants to know where the bad guys are, what they're doing, what they're saying.
01:00:18.000 To the amount that we could, my job was to come up with solutions and conduct passive and active signals analysis on these things and then inform the commander so that we could mitigate risk.
01:00:32.000 It was all about mitigation of risk.
01:00:35.000 This is 2008 or so.
01:00:37.000 I'd been doing this for about seven years, eight years.
01:00:40.000 And from there, it shifted to the phones getting smart.
01:00:43.000 And essentially, it went from you walking around with a 2G phone or a 3G phone that had limited compute capability to now there's robust compute capability with the advent of like the iPhone.
01:00:55.000 And now it's like, well, now we've got to get after guys who are essentially walking on with a computer we could never have envisioned 20 years ago in their pocket with all this capability.
01:01:04.000 Because the military and our forces that we're fighting against, it all comes down to our ability to shoot, move, and communicate.
01:01:11.000 Communication being the part that I was focused on.
01:01:13.000 So as the advent of the iPhone and those things came out, the Army realized we didn't have a computer network operations MOS.
01:01:20.000 We didn't have an offensive cyber component.
01:01:23.000 We didn't have a defensive cyber component.
01:01:25.000 So we kind of, I was there at the ground floor when we were building out these new MOSs now that are all over the military.
01:01:31.000 But at that time, there was a thought going into, you know, we need to have people who know how to be on-net operators.
01:01:37.000 Ethical hacking, as paradoxical as that sounds.
01:01:41.000 That's how the lawyers called it that.
01:01:42.000 So it's hacking at the end of the day, but ethical hacking because you've got the backing of the U.S. government.
01:01:47.000 And so we set up that framework and really started launching into operations, you know, 2006, '07, '08, all the way into my last deployment in 2017.
01:02:00.000 It was all focused on computer network operations and how they lash up with terrestrial networks.
01:02:05.000 How do we exploit all of that was one facet of my job.
01:02:09.000 And your question was, how did I get into all of that?
01:02:14.000 And that was the...
01:02:15.000 How do you get into it?
01:02:16.000 What was like...
01:02:18.000 What was the operational aspect of it?
01:02:20.000 How did you actually, what did you do?
01:02:23.000 So, you know, I'll stick to terms that are more generally understood by the public, but learning how to do things like war driving, collecting on networks, Wi-Fi endpoints, cell phones, understanding the ins and outs of them, understanding how to do forensic analysis of them.
01:02:42.000 So after there was an operation and a bunch of gorillas had been sent in to kill a bad guy, we could derive maximum intelligence value from the handset to plan other operations.
01:02:54.000 And so, you know, it would be passive monitoring of networks to inform the intelligence picture, which would lead to either combat operations or active computer network operations, where now it's like, well, there's, you know, a, I don't know, an Iraqi or an Afghani router that hasn't been patched in three years.
01:03:18.000 And we think we can either write or find a zero day, which is just an exploit of those routers, where we can muck with their router in a way where they think they're getting good information and they're not, or they're erecting other things to mitigate risk for the commander.
01:03:39.000 And so that really, you know, exploded at that point.
01:03:42.000 And between that and human intelligence, which is kind of the actual gathering of intelligence from other people, you know, you would call it a spy or James Bond, but James Bond was a horrible spy.
01:03:55.000 Was he?
01:03:56.000 I mean, yeah, you know, your job's to remain anonymous, and you're walking into a casino and there's Goldfinger calling you by your first and last name.
01:04:04.000 It's not a great look.
01:04:06.000 You know, generally you don't want to be sleeping with your sources or using your real name or whatever.
01:04:13.000 So human intelligence.
01:04:15.000 And then my focus for the last 10 years was how does signals intelligence computer network operations become a force multiplier for people conducting overt and clandestine operations throughout the theater at that time.
01:04:29.000 My deployments and my time was spent in Iraq, Afghanistan, Africa, Northern Africa.
01:04:36.000 And then a lot of people don't know it, but we were in active combat operations in the southern Philippines as well for a fair amount of time.
01:04:43.000 I want to maybe say seven or ten years.
01:04:44.000 We were doing combat operations in the southern Philippines.
01:04:47.000 My first deployment to the southern Philippines was 2007.
01:04:54.000 Who were we doing operations against?
01:04:56.000 So there were terrorist elements down there that were traveling back and forth from Pakistan and Afghanistan.
01:05:07.000 And there was a terrorist organization down there called the Abu Sayyaf Group.
01:05:07.000 And there were other ones as well.
01:05:13.000 Jemaah Islamiyah, I think was the name of the other one.
01:05:13.000 And they were conducting their own terrorist anti-Christian operations in the southern part of the Philippines.
01:05:18.000 In the southern part of the Philippines, I don't, can I say it?
01:05:20.000 Can I say the word?
01:05:21.000 What do you mean?
01:05:22.000 Jamie, can you pull up a map of the Philippines?
01:05:24.000 Can you pull it up?
01:05:25.000 Oh, say that.
01:05:26.000 I'll say that.
01:05:27.000 I've been listening to it forever.
01:05:27.000 Yeah, pull it up, Champion.
01:05:29.000 So there's what's called the Autonomous Region in Muslim Mindanao, which is the southern part, from a place called Zamboanga down to Sulu, or Jolo Island.
01:05:39.000 And there's a, it's a funny joke, because if you zoom into Zamboanga, which is, God, look how many islands there are.
01:05:45.000 I know.
01:05:46.000 Go down to the south there.
01:05:47.000 See Zamboanga.
01:05:48.000 Go down right there.
01:05:50.000 Zoom right there on that island.
01:05:51.000 Now move to, sorry, now move to the southwest.
01:05:55.000 See that penis?
01:05:57.000 The tip of that penis is called Zamboanga.
01:06:00.000 All of our combat operations, now if you zoom out a little bit more and pan more south and zoom out just a little bit more so the joke hits, all that sperm south of the tip of Zamboanga city, that's where the terrorist operations were.
01:06:17.000 Now, if you go to that main island chain called Sulu, there's Jolo Island; that's where I was, on this tiny island out in the middle of nowhere.
01:06:25.000 And on that, there's a mountain.
01:06:27.000 That's all the Philippines?
01:06:28.000 Well, no, I mean, this is all the Philippines down here, yeah.
01:06:30.000 Wow.
01:06:31.000 So this is called, there's a mountain in there.
01:06:32.000 I think it was called Mount Tumatoc or something like that, near the eastern part of the island, by a place called Luuk.
01:06:38.000 It's called Luuk.
01:06:39.000 Yeah, so there's mountains.
01:06:40.000 There's a mountainous region there.
01:06:41.000 There are a bunch of terrorists up there.
01:06:43.000 They were killing people in the area, conducting bombings.
01:06:46.000 They were getting trained.
01:06:47.000 In fact, there was a guy, and I believe I'm going to get his name wrong, perhaps, but I believe his name was either Isnilon Hapilon or, oh, it's Umar Patek.
01:06:59.000 He was actually arrested outside of Osama bin Laden's compound the day after he was killed.
01:07:03.000 We were trying to kill him on that island or in and around that island is where we were trying to find him and kill him.
01:07:09.000 So they're terrorist facilitators.
01:07:14.000 They did the USS Cole bombing.
01:07:14.000 Zoom back out.
01:07:14.000 I want to see the Philippines one more time?
01:07:16.000 Like all the islands?
01:07:17.000 When you zoom all the way out, it's so nuts how many islands there are.
01:07:22.000 Yeah, so up north of Manila is mostly the Christian population.
01:07:27.000 And as you get down south, it's the Autonomous Region in Muslim Mindanao.
01:07:31.000 And that is all of where these terrorist operations were happening.
01:07:35.000 And I believe we mostly pulled out of there.
01:07:37.000 There might be still some people in Zamboanga.
01:07:39.000 I'm not sure anymore because it's been five years, four years since I retired.
01:07:44.000 But yeah, we were doing counterinsurgency operations down there and guys died down there and there were combat operations.
01:07:49.000 And I was out there.
01:07:52.000 I was in a tactical military intelligence battalion and I was attached to the first special forces group.
01:07:57.000 And we were down there a couple of times.
01:07:59.000 And a lot of people don't even know about it.
01:08:01.000 Yeah, I never heard about it.
01:08:03.000 Yeah, so anyway.
01:08:06.000 Sorry for the sidebar, but I'm so stunned at how many islands are in the Philippines, how spread out it is.
01:08:11.000 Yeah, it's insane.
01:08:13.000 And the thing about it is, I'd go to all of these little outposts and these out islands.
01:08:18.000 We were always debriefing these guys.
01:08:20.000 And I'm going to get these terms wrong.
01:08:21.000 So I'm sure there'll be people in the comments.
01:08:23.000 But I think they're called barangays or something like that.
01:08:25.000 But they were like these mayors of each one of these little islands.
01:08:29.000 And there'd be terrorists in and around those areas.
01:08:31.000 And we'd try to make friends with these guys so they'd give us some information.
01:08:35.000 And every one of those places was absolutely beautiful.
01:08:38.000 Like you'd go there and be like, man, Hilton could turn this into something in short order.
01:08:42.000 Right.
01:08:43.000 You know, when you're out at these places: beautiful beaches, lush jungles, the best swimming water.
01:08:48.000 Nicest people, too.
01:08:49.000 Oh, Filipino people are some of my favorite people, man.
01:08:53.000 Like, you want to talk.
01:08:53.000 The guys that we worked with out there, they're Scout...
01:08:56.000 I think they're called Scout Snipers or Scout Rangers.
01:08:58.000 And I think they were like their special forces.
01:09:01.000 We'd go to the range with these guys and show them stuff.
01:09:04.000 And they're the most ride or die type of guys you'll ever meet in your life.
01:09:09.000 Like, you know, so-and-so said this about you last week, and I could kill him.
01:09:12.000 It's like, no, dude, it's cool.
01:09:13.000 It's like, don't worry about it.
01:09:15.000 Fun fact, they're some of the best pool players on earth, too.
01:09:17.000 Oh, really?
01:09:18.000 Some of the greatest pool players of all time.
01:09:19.000 Came out of the Philippines.
01:09:20.000 They're just great people.
01:09:21.000 I mean, I just, the people down there were fantastic.
01:09:24.000 And it was awful because those guys would be bombing churches, Christian churches, and stuff like that.
01:09:28.000 And we were doing, like I said, counterinsurgency operations out there, intelligence collection operations to inform that battle picture.
01:09:38.000 But those guys had direct links with Osama bin Laden and other people.
01:09:42.000 Yeah, right after we, like I said, I think it was, I think if you look it up, I think his name is Patek, P-A-T-E-C, P-A-T-E-K.
01:09:42.000 I have no idea.
01:09:51.000 And he was arrested outside of Osama bin Laden's compound, and we had been chasing him in the Philippines.
01:09:55.000 Wow.
01:09:56.000 Because we thought he was still down there.
01:09:58.000 There was another guy that I believe we killed him.
01:10:00.000 His name was Albader Parad.
01:10:03.000 But yeah, my job was not, I always say this on podcasts because the veteran community is wild right now.
01:10:08.000 They love to cut each other down right now.
01:10:10.000 There's something weird going on where guys are, like, obviously lying.
01:10:13.000 Yeah, call the people out.
01:10:15.000 I prefer to call people out face to face, but I always make sure people know I was not a cool guy.
01:10:22.000 Like sometimes I got to dress like one.
01:10:24.000 For a few years, I didn't wear any uniforms, and I got to grow my beard out and act like a cool guy.
01:10:28.000 But I was really a nerd for cool guys.
01:10:30.000 I've literally got pictures of myself down in Jolo or in Afghanistan or anywhere else with tape around my glasses and a Pez dispenser and my radio and collection equipment, looking like a true blue American nerd.
01:10:44.000 But I was not the guy who kicked the door in.
01:10:46.000 I was always the guy who pointed the door out.
01:10:48.000 So I'd be safe in the Humvee in the back, you know, eating an MRE, and somebody that looked like another gorilla, you know, like Andy Stumpf or Tim Kennedy or someone like that.
01:10:56.000 He'd be like, is that the house?
01:10:57.000 I'd be like, pretty sure that's the house.
01:10:59.000 You guys want to be safe, but go ahead.
01:11:00.000 I'll be in the Humvee.
01:11:01.000 I'll be out here or I'll be in an airplane above, you know.
01:11:05.000 And yeah, it was being born in North Dakota, and, you know, my mother, a single mother after she left that first guy, in a trailer house in the middle of this little town called Cavalier, North Dakota.
01:11:20.000 I had no options.
01:11:21.000 I was a horrible student.
01:11:23.000 And what did you do?
01:11:24.000 That's crazy that you're so smart, but you were a horrible student.
01:11:26.000 I wouldn't, yeah, I'd call myself curious before I'd call myself smart.
01:11:30.000 But, you know, my mother, and I don't know if you would remember this, but maybe other people my age will, you'd get these Scholastic book order forms that you'd bring home from school, and you could order books.
01:11:43.000 There'd always be on the back page.
01:11:45.000 There'd always be like little cool stuff like you could get like, you know, a pair of gloves or a hat or something.
01:11:50.000 Anyway, one time there was a coil radio that you could order with an earpiece, and you put this coil radio together with the earpiece, no battery.
01:11:59.000 It was just that the electromagnetic radiation would activate the coil, and you could listen to radio chatter.
01:12:08.000 Really?
01:12:08.000 With no battery?
01:12:09.000 Yeah, yeah, just tiny little radio.
01:12:11.000 How did it, what was the power?
01:12:13.000 The electromagnetic radiation.
01:12:15.000 And it's kind of like a record, you know, how the needle hits a record.
01:12:20.000 Electromagnetic radiation would hit the coil, the coil would feed up to an amplifier or up to an earpiece, and in the earpiece you could hear chatter.
01:12:27.000 Did the earpiece have a battery?
01:12:29.000 I don't think anything had a battery on it.
01:12:31.000 I think it was just a...
01:12:32.000 Wow.
01:12:33.000 I could be mistaken, but I don't believe it was.
01:12:35.000 It was powered by electromagnetic radiation.
01:12:38.000 Yeah.
01:12:38.000 I mean, you can look it up, Jamie, if you want.
01:12:40.000 Sorry to say that again, but tighten that thing down.
01:12:42.000 That thing's driving me crazy.
01:12:44.000 Yeah, sorry.
01:12:45.000 Like here or here.
01:12:46.000 Look at my finger.
01:12:46.000 Right here.
01:12:47.000 It's right here.
01:12:48.000 Yeah, I've been meaning to do that literally when everybody uses this fucking thing.
01:12:52.000 It's wobbling around ready to fall off.
01:12:53.000 Yeah, but if you look up coil radio with small earpiece, I could be wrong.
01:12:58.000 I don't remember there being a battery on it.
01:12:59.000 Electromagnetic radiation powered it.
01:13:02.000 That's bananas.
01:13:03.000 Yeah, so it's kind of the same thing, like, you know, not at the same wattage, but a microwave, right?
01:13:08.000 Sends power through the air.
01:13:09.000 Right, but it uses DC.
01:13:11.000 But it uses power if you send it.
01:13:13.000 Yeah, but I could be wrong.
01:13:15.000 But at any rate, that was the first time I got a radio, and I was hearing things, and I'd put it together, and I'm listening to things.
01:13:22.000 Like what kind of things?
01:13:24.000 HF radio, VHF radio, people talking, that type of stuff.
01:13:28.000 And it was just, and then I found out how to get an antenna to make the antenna larger and started ordering auxiliary pieces for it.
01:13:36.000 And then what really changed me was something my mother let me get. My mother and I would clean houses.
01:13:41.000 She was a waitress, but we also would go around and clean houses.
01:13:43.000 And there was a lawyer that we worked for.
01:13:44.000 His name was Phil Culp.
01:13:47.000 And he had an old 286SX IBM.
01:13:51.000 And it was just sitting in his basement.
01:13:53.000 And I told my mom, I was like, hey, if I clean it for like a month, can I have that computer?
01:13:57.000 Like, he doesn't use it.
01:13:58.000 He's got a new 486 up in his place here.
01:14:00.000 And he instantly said I could have it.
01:14:02.000 And then that started me down the computer networking realm.
01:14:05.000 And like, look, how could I get this 286 to act like a 386?
01:14:08.000 Or how could I force it to run Windows?
01:14:09.000 Or how do I update the memory?
01:14:11.000 How do I do these things?
01:14:12.000 In this little town, Edinburgh, North Dakota, there was a guy who had a computer store in a basement of an old general store, and his name was Jeff Munzerbrotten.
01:14:20.000 And I would go there and ask him questions about computers and just start learning the ins and outs: how do I upgrade the RAM?
01:14:26.000 How do I get more memory?
01:14:28.000 How do I augment the storage?
01:14:30.000 How could I force this thing to run Windows 3.1 so I could have a GUI instead of using a command line?
01:14:36.000 GUI meaning graphical user interface.
01:14:37.000 Graphical user interface.
01:14:39.000 Yeah, sorry.
01:14:40.000 And so that kind of started me on that.
01:14:43.000 And that, for me, like I said, I had all kinds of problems with attention deficit disorder and not being able to pay attention.
01:14:55.000 That was the only thing where I would go for three days straight.
01:14:55.000 I don't believe in ADHD.
01:14:58.000 I might be wrong, but I think it's a superpower.
01:15:01.000 I mean, it certainly, I remember I would spend two days working on a problem and not sleeping.
01:15:05.000 That's what I'm saying.
01:15:06.000 I think it's a superpower.
01:15:07.000 I think it just keeps you from being interested in things you're not interested in.
01:15:11.000 Yeah, I have a theory on that too that I can get into after.
01:15:15.000 But that started me down that road.
01:15:17.000 But in school, I couldn't pay attention.
01:15:18.000 Me neither.
01:15:19.000 There was this teacher.
01:15:20.000 I always tell a story.
01:15:21.000 It's a great teacher.
01:15:22.000 She's still around.
01:15:23.000 Her name is Connie Trenbeth.
01:15:26.000 And she was my English teacher or literature teacher or something like that.
01:15:29.000 She might not even remember the story, but here I am telling it on your podcast.
01:15:32.000 I remember it.
01:15:34.000 She kept me after class once, and she goes, you know, I knew your dad, Bill.
01:15:38.000 And, you know, your uncles were all smart.
01:15:41.000 And my great uncle has an engineering wing of a school named after him out in western North Dakota.
01:15:48.000 And she goes, all these guys were thinkers, and your dad did all this great stuff and built all this stuff.
01:15:52.000 And essentially what she was telling me is, you're a waste of life.
01:15:58.000 Like all you do is you come in here, you disrupt the class, you upset people, no one can talk.
01:16:04.000 Sounds like me.
01:16:05.000 You're trying to dominate every conversation.
01:16:07.000 But when, you know, you had written one paper on something that interested you, and I don't remember what it was.
01:16:13.000 And she's like, that was a wonderful paper.
01:16:15.000 She's like, if you could just do that every time.
01:16:18.000 And I was not hearing it.
01:16:21.000 Like, I remember the conversation because I actually remember her.
01:16:24.000 I think she said waste of life.
01:16:25.000 I think she actually said that.
01:16:27.000 Like, you're wasting it. Like, obviously my RPM, my CPU clocks high.
01:16:32.000 I'm always thinking, even when I'm not thinking, and even as we're sitting here talking, I'm thinking about other things or stuff I want to do when I get back to my computer or stuff I want to do for my business.
01:16:41.000 And so I joined the military.
01:16:44.000 And the absurdity of life is this.
01:16:47.000 I had joined to be a military policeman, which I absolutely would have hated.
01:16:52.000 All of them got turned into infantry people or stand gate guard, which is a needed function in the military, but it doesn't apply to my personality.
01:16:59.000 But when I went to the recruiter station out in Minneapolis, I think it was, I was a bonehead and I forgot my driver's license.
01:17:06.000 And they're like, well, and I was supposed to leave.
01:17:08.000 And at this time, I had dumped my girlfriend, told everyone goodbye.
01:17:12.000 I'd wiped the dust off my boots, like left Cavalier, North Dakota.
01:17:16.000 And I was like, hey, I'm not going back.
01:17:21.000 So whatever we got to do right now.
01:17:24.000 And he's like, well, you can go home, get your license, because the MEPS station was in Minneapolis.
01:17:31.000 Was it Fargo?
01:17:32.000 It was five, six, seven hours away.
01:17:32.000 It doesn't matter.
01:17:34.000 And they're like, well, you're not leaving today without a driver's license.
01:17:39.000 So I looked at my recruiter and I was like, I don't know what job you need to get me into, but it needs to be a different job.
01:17:44.000 And they're like, well, you scored exceptionally high in your general technical part of your ASVAB, which is like understanding machines and objects and stuff.
01:17:52.000 So we could get you into this like Intel job where you'd learn about radars and stuff.
01:17:57.000 And that immediately clicked for me.
01:17:59.000 And then he's like, well, we got to go brief you in this SCIF.
01:18:02.000 That's a sensitive compartmented information facility.
01:18:05.000 There's only one guy who's got a clearance and he can brief you on the job.
01:18:08.000 And if you want that job, then you can leave tomorrow.
01:18:10.000 I instantly started hearing like the James Bond music, you know, dang it, yeah.
01:18:15.000 Yeah, and so they walked me in this back place and, you know, nothing super crazy and briefed me up on the job.
01:18:23.000 And I went back out and I said, yeah, this is actually the job for me.
01:18:26.000 So the absurdity of life is me forgetting my driver's license when I was 16.
01:18:29.000 I was 16 when I signed up.
01:18:32.000 Maybe 17.
01:18:33.000 No, I was turning 17 that December.
01:18:35.000 When I signed up for the military, I can connect with a string to forgetting my driver's license to being here with you today.
01:18:43.000 You can sign up when you're 16?
01:18:45.000 I think I was turning 17.
01:18:47.000 You can sign up when you...
01:18:48.000 I didn't even know you could sign up when you're that young.
01:18:49.000 I had signed my delayed entry program thing, and I left a little bit before my 18th birthday.
01:18:55.000 So I was graduated from high school.
01:18:57.000 But yeah, you can sign up when you're 16, I believe, as long as your parents signed the waiver.
01:19:01.000 My mother signed the waiver.
01:19:02.000 She was happy to get me out of the trailer.
01:19:05.000 So, yeah, I was 17, almost 18 when I left.
01:19:10.000 Yeah, right there.
01:19:11.000 So that's all the pieces.
01:19:12.000 They call it a crystal radio.
01:19:13.000 Yeah, I was going to say crystal controlled.
01:19:15.000 That's a radio?
01:19:16.000 There it is.
01:19:17.000 That's actually the exact thing.
01:19:18.000 That is almost exactly what it looks like.
01:19:22.000 Slinky made it.
01:19:23.000 Well, they bought the brand.
01:19:23.000 They just license the Slinky brand now.
01:19:26.000 There's a bunch of these all over the internet.
01:19:28.000 Yeah.
01:19:28.000 Wow.
01:19:30.000 Make your own working radio without battery.
01:19:32.000 Yeah, and it uses a, I was going to say crystal controlled radio because it uses a crystal diode on it.
01:19:38.000 Would you say Tesla coil, James?
01:19:39.000 So yeah, it's a Tesla coil.
01:19:41.000 This guy's explaining it.
01:19:42.000 So this thing is actually kind of cool too.
01:19:44.000 Let me find this thing.
01:19:46.000 A rocket radio, they called, which is like further development.
01:19:49.000 This thing.
01:19:50.000 It attached to a phone.
01:19:53.000 So you plug that onto a phone cable.
01:19:55.000 There's a picture of it somewhere on here, but it explains like you're picking up.
01:20:00.000 There you go.
01:20:02.000 Wow.
01:20:03.000 Wow.
01:20:04.000 No battery or current needed, hence no operating expense and long life.
01:20:08.000 Yeah, this is almost onto a phone.
01:20:11.000 What year was this?
01:20:12.000 Man, this is old.
01:20:14.000 Yeah.
01:20:15.000 So it also shows here, this is like you're picking up power from a radio tower.
01:20:19.000 Yeah.
01:20:19.000 Wow.
01:20:20.000 The more powerful the signal.
01:20:21.000 This is basically what they're paying for at the FCC.
01:20:23.000 The more powerful your radio tower, the longer and more people you can reach.
01:20:28.000 Crazy that has no battery.
01:20:31.000 And that's also why some radio signals come in very well on your radio and some don't.
01:20:35.000 And they sound like dog shit.
01:20:36.000 Yeah.
01:20:37.000 Weak power.
01:20:38.000 Yeah.
01:20:39.000 And then the frequency modulation.
01:20:41.000 Like amplitude modulation isn't as efficient as frequency modulation when it comes to the vocoder producing sound.
01:20:49.000 Amplitude modulation travels farther, but it doesn't have the amount of information.
01:20:55.000 The carrier wave can't be modulated with as much information as you need, whereas frequency modulation is much quicker, up in the megahertz, and you can add more sound, more information, which is why it sounds better.
01:21:08.000 So FM sounds better, but it doesn't travel as far.
01:21:11.000 And it sounds worse.
01:21:11.000 Right.
01:21:13.000 When I was training people in the military on this, I always use the analogy of if a party is happening next door, you can hear the bass music, but you can't hear the treble.
01:21:21.000 You can hear the bass music because that frequency travels farther because it's lower in the frequency band.
01:21:27.000 But you can't hear the treble because it's higher frequency and there's more modulation.
01:21:34.000 And so it disperses quicker and you can't hear it as well.
01:21:37.000 And it's the same thing with like VLF comms coming off of like a submarine can travel underwater for a very long ways, but you can't put as much information in them as you could if you were doing, you know, VHF or UHF comms where there's lots of modulation.
01:21:53.000 So it's the dispersal.
01:21:54.000 And, you know, a lot of my, you know, mid-part of my career was explaining this stuff to, you know, military guys who were trying to understand like, here's how a cell phone works, and this is how frequency works, and this is how we send information.
01:22:06.000 And just kind of demystifying, you know, how a GSM network works.
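The AM/FM trade-off described above can be sketched numerically. This is a minimal Python illustration (standard library only; the carrier, message, and deviation frequencies are arbitrary values chosen for the sketch) of how the same message rides on a carrier under each scheme:

```python
import math

def am_sample(t, carrier_hz, msg_hz, depth=0.5):
    """Amplitude modulation: the message scales the carrier's envelope."""
    message = math.sin(2 * math.pi * msg_hz * t)
    return (1 + depth * message) * math.sin(2 * math.pi * carrier_hz * t)

def fm_sample(t, carrier_hz, msg_hz, deviation_hz=75.0):
    """Frequency modulation: the message shifts the carrier's instantaneous frequency."""
    # The message is folded into the phase, so amplitude stays constant.
    phase = (2 * math.pi * carrier_hz * t
             - (deviation_hz / msg_hz) * math.cos(2 * math.pi * msg_hz * t))
    return math.sin(phase)

# AM's envelope rises and falls with the message, so anything in the channel
# that disturbs amplitude (noise, interference) corrupts the audio directly.
# FM keeps constant amplitude, so the receiver can discard amplitude noise,
# which is one reason FM sounds cleaner at the cost of range.
samples = [am_sample(t / 8000.0, carrier_hz=1000.0, msg_hz=100.0) for t in range(8000)]
print(max(samples), min(samples))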
01:22:12.000 One of the things that I wanted to ask you about that is when new technology is emerging, how do you stay ahead of the ability to extract information from this technology, hack into networks before people understand the capability?
01:22:35.000 You really can't.
01:22:36.000 You really can't.
01:22:37.000 And that's the beauty of the free market, is that the innovation to perform the function that you want someone to pay for will always move faster than your ability to exploit the technology.
01:22:48.000 Then how do you explain things like Pegasus?
01:22:51.000 Well, I mean, something like Pegasus, well, first off.
01:22:55.000 Explain Pegasus to people that don't know.
01:22:57.000 It was a persistent implant on cell phones for people.
01:23:02.000 Initially, you had to click it.
01:23:04.000 It was a click exploit.
01:23:05.000 Initially, it was a click, and then it became a non-click exploit.
01:23:08.000 So in other words, you had to interact with something on the phone in order to initialize and install the implant.
01:23:14.000 And then after... but the reason why it was so good is because it wasn't stored in the usual areas where you would want a persistent implant, or where you would normally have a persistent implant.
01:23:27.000 For instance, you know, you might want to put it in the application layer of an app or something like that where there's a binary that can run and execute commands or functions.
01:23:38.000 And so they, I won't get into the very specifics of where and how they did this because I'm not sure if I got this information from the government or not, so I won't say it.
01:23:47.000 But they stored it in a place where it wasn't normal.
01:23:50.000 And you can read papers on your own and look at the forensics of it and how the actual implant was executed.
01:23:57.000 But it essentially allowed people to own your phone and was the kind of implant I only dreamed of when I was helping develop my own implants in the military.
01:24:10.000 Mostly what we would rely on is zero-day architecture and looking for something in a phone that either they hadn't patched or that the phone that you were looking at hadn't been patched.
01:24:20.000 So phones, as they have their own red teams, are going through the phone for their own, because they want to sell a product that people will use and people won't use stuff that can get hacked.
01:24:29.000 So they'll do their own red teaming and they'll discover like, oh, you know, on this router we developed, we left this port open and it shouldn't have been open.
01:24:37.000 So now we're going to write a patch that will close that port so that this port is no longer accessible by a guy like me.
01:24:42.000 So I can't go in there and do something to this particular type of router.
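The open-port scenario he describes can be sketched with a basic TCP connect check. This is a standard-library sketch, not the tooling he used; the host and port below are placeholders, and a successful connect only shows that a port accepts connections, not that it is exploitable:

```python
import socket

def port_is_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Placeholder example: a vendor's patch for an accidentally exposed service
# amounts to making a check like this come back False from the outside.
print(port_is_open("127.0.0.1", 9))  # port 9 (discard) is usually closed
```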
01:24:46.000 Another great thing, I'll say something good about the administration.
01:24:49.000 They're doing some stuff right now to make sure that we're getting rid of Chinese technology and Chinese routers.
01:24:55.000 And, you know, there's a widespread network, the PLA has a botnet, and I can't remember the name of it, but they essentially implanted a bunch of old unpatched routers to get access to government and business-proximal people.
01:25:13.000 And it was widespread and huge.
01:25:15.000 And, you know, it looked like to me, I haven't read this anywhere, but if I were looking at this implant and how it was done, they were trying to really cause some trouble.
01:25:25.000 It was being placed at critical places, think power, think energy, think banking.
01:25:31.000 Like they really wanted to cause some ruckus.
01:25:34.000 And I have not been part of this administration, so I'm not saying anything classified for those of you who are listening.
01:25:40.000 But there was a decision to say, hey, we need to make sure that these things get patched, and also that we're not bringing in architecture from the overseas because they don't play by the same rules that we at least say we play by.
01:25:51.000 So that's why they banned Huawei devices.
01:25:53.000 Oh, yeah, and ZTE.
01:25:54.000 Yeah.
01:25:55.000 Huawei had a phone that I was really interested in back in the day.
01:25:59.000 They had a Porsche Design had partnered with Huawei and made this insane Android phone with like the best camera, the best battery.
01:26:07.000 It was like really high level.
01:26:09.000 And I was like, gonna buy it.
01:26:11.000 And then all of a sudden they banned all the Huawei phones.
01:26:13.000 And I was like, what's going on?
01:26:15.000 And then, you know, I had heard some people say, oh, they're just trying to stop competition.
01:26:19.000 It's like American companies are trying to stop it.
01:26:22.000 And then I went into it deeper and I said, no, it seems like there's third-party input on some of their routers and some of their network devices that they had engineered in order to be able to access them by third party.
01:26:38.000 And this, because of whatever, lack of understanding, lack of knowledge of how these things are constructed, the people that purchased them weren't aware of them.
01:26:49.000 And these things had gotten into place.
01:26:51.000 And they had gotten into place in universities.
01:26:53.000 They got into place in military establishments.
01:26:56.000 They were using them in cell phone towers that people had, you know, inadvertently bought from China.
01:27:01.000 Yep.
01:27:02.000 And that's really, I mean, I can tell you firsthand from having done some of the forensic exploitation on this stuff.
01:27:08.000 Another large part of my career I didn't talk about was just on mobile forensics and media forensics, which is essentially you think of like CSI, Miami, or CSI, whatever the city was.
01:27:18.000 There's a crime, someone was killed.
01:27:20.000 You have forensics that are doing forensics on like blood and fingerprints and blood splatter and all that stuff.
01:27:25.000 There's a whole another part of that same forensics branch that focuses on media forensics.
01:27:30.000 What was deleted off this phone at one point?
01:27:32.000 What remains on this phone?
01:27:34.000 What was it being used for?
01:27:35.000 I would do this in the military so that when we did do an operation, and I was part of some of the largest ones ever done out in Afghanistan, there would be treasure troves of phones and all of these computers and stuff like that.
01:27:47.000 And it was my job.
01:27:49.000 And I had a great team that worked for me.
01:27:51.000 In my deployment in 2015, we would go in afterwards, gather up all of this stuff.
01:27:57.000 And, you know, the task force commander would literally be standing by and we would say, you know, here's the intelligence that we've derived.
01:28:04.000 Here's the multi-point analysis.
01:28:06.000 You know, it was on this hard drive.
01:28:07.000 It was here.
01:28:08.000 You know, there's a bad guy place out here.
01:28:08.000 It was here.
01:28:10.000 And those guys would be rolling like within moments after the last operation.
01:28:13.000 Like some operations we'd do where we'd be rolling one after another target because we were getting really good at media forensics and intelligence that was there.
01:28:22.000 And then getting into active media forensics, which is a different discipline.
01:28:25.000 But essentially, I can get into that later if you want to.
01:28:28.000 But launching and doing these follow-on operations off, you know, dumping the binary from a phone and examining it at the ones and zeros level to say everything that was going on with this thing.
01:28:40.000 Or if it was a really high, like the organization that I worked for at that time did the analysis of the Osama bin Laden media.
01:28:48.000 And, you know, on that media, we're doing far more than we would for another piece of media and that we're, you know, x-raying it and we're looking at maybe what the disk looked like before or what was destroyed or reconstructing things, spending millions of dollars on that intelligence analysis because we wanted to fully understand everything that this guy was involved in and what he was doing and where he was and who he was talking to.
01:29:10.000 And so that was another part of my career that I did for about five years or so.
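Part of examining media "at the ones and zeros level" is file carving: scanning a raw dump for known file signatures, independent of any filesystem. This is a toy Python sketch of the idea using only the JPEG start/end markers (real forensic suites do far more, such as validating internal structure and handling fragmented files):

```python
def carve_jpegs(raw: bytes) -> list[bytes]:
    """Carve JPEG candidates out of a raw binary dump by magic bytes.

    JPEG files start with FF D8 FF and end with FF D9. Deleted files often
    survive in unallocated space, which is why carving can recover them even
    when the filesystem no longer references them.
    """
    SOI, EOI = b"\xff\xd8\xff", b"\xff\xd9"
    found, pos = [], 0
    while (start := raw.find(SOI, pos)) != -1:
        end = raw.find(EOI, start + len(SOI))
        if end == -1:
            break  # truncated candidate; a real tool would try harder
        found.append(raw[start:end + len(EOI)])
        pos = end + len(EOI)
    return found

# Toy "disk image": garbage, one JPEG-like blob, more garbage.
dump = b"\x00garbage\xff\xd8\xff\xe0fakejpegdata\xff\xd9trailing\x00"
print(len(carve_jpegs(dump)))  # 1 candidate carved
```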
01:29:14.000 What was going on with the Huawei phones?
01:29:16.000 Like, what were they doing with them?
01:29:18.000 I mean, some of them were coming out implanted.
01:29:22.000 In other words, there was access built in for a foreign actor.
01:29:25.000 And then in other terms, other places with routers, with the ZTE stuff, there were just things that you would patch or that you would fix as a company who was trying to protect the consumer and create a product that people would use.
01:29:38.000 And they weren't doing it.
01:29:39.000 So they were creating persistent back doors either by actively placing code on there that would allow root access or they were leaving things open, especially in Africa, like the work that, you know, when I was working in Africa, the Chinese were just owning Africa.
01:29:54.000 They were just giving them communications infrastructure.
01:29:58.000 And they were doing that because they wanted their resources and they wanted to know what these people were saying and what they were doing.
01:30:04.000 And so I'm a free market real, like I'm as free market as a guy can get.
01:30:09.000 I want the best people building the best products and I want everyone to be able to compete.
01:30:13.000 But in that case, I would never own a Huawei or a ZTE or anything else.
01:30:18.000 On a consumer level, what were they doing with those phones?
01:30:21.000 Like if they had imported them to the United States, if they didn't have that ban, what would have been the issue?
01:30:27.000 Getting access to, you know, any number of people that the Chinese really want access to everybody.
01:30:34.000 But you could start at the topical level of just saying, you know, getting Joe Rogan to use a ZTE would be, that would be my wet dream as a guy who used to do this work back in the day because you're talking to the president or you're talking to this guy or that guy.
01:30:46.000 And I can build out a network of understanding who you're in contact with, who you're talking to, what's being talked about.
01:30:53.000 But then also finding out this person's phone number and now doing a deep dive on there.
01:30:57.000 So it's really about getting all of that data and constructing an analyst notebook, essentially, outline of who's talking to who, who do we need to implant.
01:31:07.000 But it's for business as well.
01:31:10.000 They would want this in the hands of somebody who's in charge of a business because they want their IP.
01:31:14.000 They would want this in soldiers' hands so they would know deployment dates or who's going where and who's doing what.
01:31:18.000 They want this in routers because routers are usually the most unpatched piece of technology in that you're not, especially, you know, these days they're more automated patching.
01:31:28.000 But back in the day, like you had to manually update a router.
01:31:31.000 And if you didn't, well, then you had potential exploits that were sitting on that router where I could gain access to the router in your home, or I could gain access to a BGP router, which is like a border gateway, which is moving all of the internet data.
01:31:44.000 Or I could get access to a microwave terminal.
01:31:47.000 If you look at a cell phone, they've got the microwave terminals on there that are sending information in between them.
01:31:52.000 If those are Chinese parts that are either being used for the processing, the CPU, or the physical infrastructure of that, the products that they were putting out would give me direct access to the information that's being passed on those terminals.
01:32:05.000 So you're getting, you know, system-level, root-level access through machinery, through communication devices, and through things like routers where you can know everything you want to know about your enemy.
01:32:17.000 Wow.
01:32:18.000 And so as far as today's technology, I see you use an Android phone.
01:32:23.000 Is there a phone that is more secure or a platform that is more secure?
01:32:29.000 It all depends.
01:32:31.000 I always take this from Thomas Sowell.
01:32:33.000 There are no answers.
01:32:34.000 There are only trade-offs.
01:32:35.000 So there's like the way to answer that question would be, who are you?
01:32:39.000 What are you trying to do with your life?
01:32:41.000 What are you talking about on your phone?
01:32:42.000 What are you doing on your phone?
01:32:44.000 Most of these phones, if you're just an average everyday citizen who's just going about your job, the phones today are pretty secure, especially versus a few years ago.
01:32:54.000 If you're a reporter, now the nexus is, do you trust the government and do you trust Apple?
01:33:01.000 If you trust the government and you trust Apple, then Apple's probably your best bet. You know, there's lockdown mode on an Apple phone, or whatever they used to call it back in the day.
01:33:11.000 I think it was called reporter mode, but there was ways to encrypt the devices and to encrypt the chatter and the tunnel coming out of the phone, the RF coming out of the phone.
01:33:22.000 What is lockdown mode?
01:33:24.000 I don't know if that's exactly what it was called or not because I've never really used Apple just for my own personal reasons.
01:33:29.000 What personal reasons?
01:33:31.000 I don't trust Apple.
01:33:32.000 How so?
01:33:34.000 They are more interested in monetizing people's data than they are providing them capability.
01:33:39.000 So every time you take a photo, every time you upload a document, every time you talk to it, every time it asks you about your, you know, you'll get these questions where it says if your password's lost, you can back up your password in these ways.
01:33:52.000 Tell us where you were born.
01:33:54.000 Tell us your mom's maiden name.
01:33:55.000 Tell us your mom's this, your mom's that.
01:33:57.000 Lockdown mode is an extreme, optional protection.
01:33:59.000 It should only be used if you believe you may be personally targeted by a highly sophisticated cyber attack.
01:34:03.000 Most people are never targeted by attacks of this nature.
01:34:06.000 When iPhone is in lockdown mode, it will not function as it typically does.
01:34:09.000 Apps, websites, and features will be strictly limited for security, and some experiences will be completely unavailable.
01:34:16.000 Yeah, so when I was advising guys back in the day on going out and doing like a high-risk source meet, so they're going to go meet a spy for another country, and you're a military guy and you're debriefing someone or doing something, I was always telling them to use lockdown mode.
01:34:30.000 I knew that it did those things.
01:34:31.000 I didn't know if that was the term or if I'd thought about it.
01:34:33.000 So can you still send iMessages?
01:34:36.000 You can still text and call.
01:34:37.000 Text and call that stuff.
01:34:38.000 Yeah, but there's other things that you can't do.
01:34:41.000 Well, like Meta just recently announced they're no longer encrypting your DMs.
01:34:46.000 Why would they do that?
01:34:47.000 Well, they said that it's for protection or whatever, to make sure that people aren't doing bad things.
01:34:52.000 I don't know.
01:34:53.000 See what their explanation for it was.
01:34:57.000 Sorry, I'm worried about this reporter.
01:34:57.000 What was it?
01:34:59.000 I'm sorry.
01:35:00.000 Meta.
01:35:02.000 Meta recently announced that they're no longer encrypting your DMs on Instagram.
01:35:08.000 And a lot of people are up in arms and they're stopping using any DMs on Instagram and any of that stuff.
01:35:15.000 The idea is that other people can read your stuff now. Now, whether it's Meta that can read your stuff, or who... That's what I mean.
01:35:23.000 Yeah, you asked why I don't trust Apple.
01:35:24.000 It's the same reason I don't trust Meta.
01:35:26.000 They're not interested.
01:35:27.000 The dangers behind Meta killing end-to-end encryption for Instagram DMs.
01:35:31.000 Meta blamed users for not opting into the privacy protecting feature.
01:35:34.000 Experts fear the move could be the first major domino to fall for end-to-end encryption tech worldwide.
01:35:40.000 That's a horrible narrative.
01:35:42.000 Yeah, it seems squirrely.
01:35:47.000 So.
01:35:48.000 Oh, you've read your last free article.
01:35:50.000 Oh, my God.
01:35:51.000 Give me money, motherfucker.
01:35:52.000 But what Apple and Meta want to do is, like, they're trying to build these new neural networks.
01:35:57.000 They're trying to, you know, humans, and we can get into this too later if you want.
01:36:02.000 Humans are the only thing, in my opinion, and I'm happy to have you disagree with me, and I love to have this conversation.
01:36:08.000 In my opinion, we're the only ones that are.
01:36:10.000 After May 8, 2026, announced plans to discontinue support for end-to-end encryption for chats on Instagram.
01:36:16.000 If you have chats that are impacted by this change, you will see instructions on how you can download any media or messages you may want to keep.
01:36:23.000 Social media giant said in a help document, if you're on an older version of Instagram, you may also need to update the app before you can download your affected chats.
01:36:34.000 When reached for comment, this is what Meta had to say.
01:36:34.000 Very few people are opting for end-to-end encrypted messages and DMs, so we're removing this option from Instagram in the coming months.
01:36:40.000 Anyone who wants to keep messaging with end-to-end encryption can easily do that on WhatsApp.
01:36:44.000 But WhatsApp is a little squirrely, right?
01:36:46.000 WhatsApp.
01:36:47.000 Yeah, I mean, they're all squirrely.
01:36:49.000 And that's the problem.
01:36:51.000 And so you asked me why I don't trust them.
01:36:53.000 It's because they want to use... So humans, in my opinion, and some animals, are the only things that have the ability to project consciousness.
01:37:04.000 And projecting consciousness is how you train a neural network.
01:37:07.000 And it's how you train all these large networks.
01:37:10.000 A lot of my time also in the military is spent.
01:37:12.000 I was doing artificial intelligence in 2012, 2011, before it was even a catch term.
01:37:17.000 We were using artificial intelligence to map dynamic networks and to do other things, more pragmatic uses of it than how it's being used today with large language models or convolutional neural networks.
01:37:27.000 But they need consciousness to train their models.
01:37:30.000 So when Google offers you meta or Instagram or whoever else offers you photo storage, it's because they want your face to train neural networks.
01:37:38.000 If they're going to pay for the compute, if they're going to pay for the storage for these things, they're doing it because they're going to use the data.
01:37:46.000 If you're getting a free app, in essence, any free app, if the product's free, then you're the product.
01:37:52.000 So when Google is allowing you to use a Google Drive and get a gig of storage, they're going to use those photos to train neural networks to do better facial recognition.
01:38:00.000 What if you're paying for Google Drive?
01:38:02.000 I don't know about their terms of service now.
01:38:04.000 That is one of the best things that I use with large language models is any product I download, I have the neural network examine the terms of service.
01:38:14.000 And then you can pretty much understand like, here's my focus.
01:38:17.000 Here's the 40-page terms of services document.
01:38:20.000 When you click that link that you got, what are they able to do with my data?
01:38:23.000 So that's how I sign up for apps.
01:38:24.000 And that's one of the great uses of a large language model, in my opinion, is to quickly understand how these things are being used.
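The terms-of-service trick he describes amounts to prompt construction: hand the model the whole document plus one narrow question. This is a minimal Python sketch of that step (the question wording and prompt layout are assumptions; the actual provider call is omitted, since any chat-style LLM API would do):

```python
def build_tos_prompt(tos_text: str,
                     focus: str = "What are they able to do with my data?") -> str:
    """Wrap a terms-of-service document in one focused question for an LLM.

    The model gets the entire document as context plus a single narrow
    question, which is what turns a 40-page ToS into a usable answer.
    """
    return (
        "You are reviewing a terms-of-service document for a user.\n"
        f"Question: {focus}\n"
        "Answer concisely, quoting the relevant clauses.\n\n"
        "--- DOCUMENT ---\n"
        f"{tos_text}\n"
        "--- END DOCUMENT ---"
    )

# Hypothetical clause for illustration; send `prompt` to any chat-completion
# endpoint you use -- the structure of the prompt is the point here.
prompt = build_tos_prompt("Sample clause: we may share uploads with partners "
                          "to improve our services.")
print(len(prompt) > 0)
```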
01:38:31.000 And that's why I say with Apple, with Meta, with all of these large information companies, you are more the product than the product is the product.
01:38:37.000 And that is because they're trying to build the most powerful, capable artificial intelligences, which I think is a misnomer.
01:38:44.000 And again, we can get into it later.
01:38:46.000 But they're trying to build these hyper-competent artificial intelligences.
01:38:50.000 And you need two things for that, really, is training data and you need compute.
01:39:01.000 And that's why you start seeing them coming out with, like, Meta's building its own nuclear facility or something like that.
01:39:01.000 And they need more training data.
01:39:04.000 So if I want to build a replica of Joe Rogan that I can make hyper-realistic AI videos for, I need every picture of your face from every angle.
01:39:13.000 I need every wince, every squint, everything you've ever done.
01:39:16.000 So I can introduce more training data to better train that neural network in order to generate more hyper-realistic versions of yourself.
01:39:25.000 And so when a company is offering you something for free, and it's fine, like if people are fine with that idea, then by all means, download all the free apps that you want.
01:39:34.000 But if you're downloading a free app, it's because you are the product.
01:39:37.000 They either want to see how you type, they want to see what you're saying, they want to see how you're thinking about things, they want to understand your political biases, they want to look at your photos.
01:39:45.000 And this isn't because they're a deep-seated nation-state actor.
01:39:49.000 They can become that, but it's because they're trying to build the best products because the big money is in AI.
01:39:55.000 That's where the biggest money is.
01:39:57.000 So anytime you're doing any of these things, and it's just been obvious to me, not from the onset, but pretty close to the onset, that... yeah, this is a good example, right?
01:40:06.000 Pokemon Go players built a 30 billion photo map.
01:40:09.000 That's how training robots deliver your pizza.
01:40:12.000 There you go.
01:40:14.000 So you, you know, they view people, and they can say they don't.
01:40:18.000 And maybe if someone from there catches this podcast, which they well could, they might put out a statement that's saying that that's not their doing.
01:40:24.000 But I'm telling you, as a person who has done media forensics, who has done computer network operations, and who has trained artificial intelligence models, that is precisely what they are doing.
01:40:35.000 That is their statement.
01:40:37.000 What is the difference between using Apple and using Android?
01:40:41.000 Well, Android will do the same things, and Google will do the same things.
01:40:43.000 It's just that I can root my phone or I can install a custom operating system like graphene or something like that, which I'm not doing right now.
01:40:52.000 I had to make a sacrifice when I started my company, SpartanForge.
01:40:56.000 And the sacrifice was I had to be the face of this product.
01:40:59.000 And so I never had a social media until I started the company.
01:41:03.000 And I didn't upload things to the cloud until I started this company.
01:41:06.000 And it became just like, I have to sell a product.
01:41:09.000 I have to, you know, and I'm actually selling a product, not people's data or people's photos.
01:41:14.000 I have to sell this product.
01:41:15.000 I have to let people... People often don't know who the company is, or who the organizing principal is, and what they care about in the company.
01:41:23.000 And I just made that trade and said, I'm going to have to become a public person and start putting things out there.
01:41:28.000 And so, you know, I started a company.
01:41:31.000 We started our first Instagram. I started, well, my marketing team started, my first Instagram.
01:41:36.000 And I had to start uploading things and talking about how I felt about things because I wanted people to know that this company was not going to be like the other companies that are out there.
01:41:46.000 We don't sell their data.
01:41:47.000 We don't sell emails.
01:41:48.000 I can make a half million dollars off my email list tomorrow.
01:41:51.000 And I've been offered that money.
01:41:52.000 You know, we've got millions of emails from people who have signed up for our apps.
01:41:55.000 Other companies who are starting companies, they want to go out and reach marketing people.
01:42:01.000 So if you're starting another hunting app, maybe for cameras or for a call or a turkey call or an out call or something, and you found Spartan Forge and you said, man, they've got 2 million emails.
01:42:13.000 I could pay them a half million dollars for those 2 million emails and start some top-of-funnel marketing, and go blast them.
01:42:21.000 So they would pay me a lot of money for those emails.
01:42:23.000 I will never do that.
01:42:24.000 I'll never sell my company's emails, the people's emails.
01:42:27.000 I'll never do any of those things because the product is the product for my company.
01:42:31.000 It's not the people.
01:42:33.000 So the reason why you use Android over Apple is the ability to root it and install things like graphene.
01:42:40.000 Yeah, custom OSs.
01:42:42.000 But yet you don't use it.
01:42:43.000 Not now, but what I still can use and what I still do use is Android also publishes their framework in an open source fashion where you can look at the Android.
01:42:53.000 It's called AOSP, Android open source project.
01:42:57.000 So the basis of Android, the nuts, think of it as the nuts and bolts.
01:43:01.000 I'll try not to talk in too technical terms here.
01:43:04.000 But the basic framework, think about it like a car.
01:43:07.000 The frame and the engine makeup is published so you can look at how things work on the inside.
01:43:12.000 Apple goes the opposite way and they don't publish any of that and you can't see any of that stuff.
01:43:16.000 I'm for the free and open version, because at least if I'm worried about my phone having a problem, I can actually dump the binary, or I can create an E01 file and exhume it.
01:43:27.000 I can look at the binary and say, is my phone acting like it should or doing what it should?
01:43:31.000 Or is there some kind of persistent implant?
01:43:33.000 I wouldn't be able to do that with a – I would have to trust Apple and Apple's ecosystem and whoever they're – McAfee or whatever they're using.
01:43:41.000 I would have to trust them, which I don't.
01:43:43.000 So I like the Android because of that option. Is that option available for the average consumer that's not that learned in computers?
01:43:51.000 Well, the great part about large language models now is if you wanted to dump your own phone today, you could follow along with a large language model and do it, your own Android.
01:44:00.000 And how would you do that?
01:44:02.000 Well, you would have to buy some expensive tools. You'd either have to pay a firm to do it, or you could download something like Cellebrite.
01:44:12.000 You could get a Cellebrite, or there are other things called Forensic Toolkit, other things like that, that allow you to examine your phone at a deeper level.
01:44:21.000 And is this an app forensic?
01:44:23.000 They're products.
01:44:24.000 Products.
01:44:24.000 They're products.
01:44:25.000 So it's a physical product.
01:44:26.000 To dump your phone into?
01:44:27.000 Yeah, and they're software.
01:44:29.000 And there's connecting and all that type of stuff.
01:44:31.000 Tools I used throughout my military career, Cellebrite is one of them, but they're Israeli-owned.
01:44:37.000 I've got nothing against Israel.
01:44:39.000 I've just got everything against foreign actors.
01:44:41.000 Just if they're not an American company, that automatically kicks them down a level for me.
01:44:46.000 So anyway, Android just makes it much easier to examine your phone, or to understand if you've got something going on that's funky, than it is on Apple.
01:44:58.000 So for the average person, like for me, like if I got.
01:45:01.000 You're not the average person.
01:45:02.000 Well, let's pretend.
01:45:03.000 If I got an Android phone and I wanted to examine my phone, what would be the process?
01:45:09.000 You would download some of the software that I talked about.
01:45:11.000 You would jack your phone into it.
01:45:13.000 You would open your phone and then it would start carving the binary of your phone, everything in your phone.
01:45:21.000 You could create a one-to-one emulation of your phone if you wanted to.
01:45:24.000 And then you would be able to get under the hood and examine the apps.
01:45:27.000 You'd be able to examine the binary.
01:45:29.000 What's the executable code?
01:45:30.000 You'd be able to look at all of those things and then determine because Android open source project is published, you could do a one-for-one and say, well, you know, at the kernel level, there's this weird code that's not in the Android build.
01:45:45.000 So what is this code?
01:45:47.000 And then with a neural network, you could probably, I've never done it, but I'm sure you could figure out what the intent is of that code, even for a lay person.
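The comparison Bill describes, checking what was carved out of a phone image against the published AOSP build, can be sketched roughly like this. This is a toy illustration, not a forensic tool: the file paths and contents below are made up, and real work would use output from something like Cellebrite or Forensic Toolkit.

```python
import hashlib


def hash_file_bytes(data: bytes) -> str:
    """SHA-256 of a file's contents, used to compare builds file by file."""
    return hashlib.sha256(data).hexdigest()


def find_unknown_files(dumped: dict, reference: dict) -> dict:
    """Return files from the dump that are absent from, or differ from,
    the reference build (path -> reason)."""
    findings = {}
    for path, digest in dumped.items():
        if path not in reference:
            findings[path] = "not in reference build"
        elif reference[path] != digest:
            findings[path] = "contents differ from reference"
    return findings


# Toy stand-ins: hashes from a known-good AOSP build vs. hashes carved
# from the phone image. Paths and contents here are invented.
reference = {
    "/system/bin/init": hash_file_bytes(b"init-v1"),
    "/system/lib/libc.so": hash_file_bytes(b"libc-v1"),
}
dumped = {
    "/system/bin/init": hash_file_bytes(b"init-v1"),
    "/system/lib/libc.so": hash_file_bytes(b"libc-v1-patched"),
    "/system/bin/implant": hash_file_bytes(b"unknown-binary"),
}

print(find_unknown_files(dumped, reference))
```

Anything flagged here is "this weird code that's not in the Android build", which you could then hand to a model or an analyst to figure out intent.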
01:45:55.000 So I could take that information, I could put it into Perplexity, and Perplexity would lay out what's going on with it?
01:46:01.000 Ostensibly, it would be able to, yes, unless it was some type of weird code.
01:46:05.000 I don't know if I haven't used Perplexity, so I don't know if they have something like ChatGPT's Codex.
01:46:11.000 Let's just try it, just to be like, can you help me examine my Android phone, looking for any malicious actors?
01:46:16.000 Yes, I can walk you through structured non-destructive check for malware or other shady activity on your Android phone.
01:46:21.000 First, what are you noticing?
01:46:23.000 Before tools or commands, quickly check for common warning signs: sudden big battery drain when you're not using the phone, unusual data usage, particularly in the background, apps you don't remember installing, or icons briefly appearing and then disappearing.
01:46:36.000 Lots of pop-ups or redirects in the browser, a new default search or launcher, strange calls, or SMS messages you didn't send yourself.
01:46:44.000 If any of those ring a bell, we'll focus on them in later steps.
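One of those warning signs, apps you don't remember installing, can be approximated by diffing the phone's current package list (e.g. the output of `adb shell pm list packages`) against a baseline you saved earlier. A minimal sketch, with made-up package names:

```python
def new_packages(current: list, baseline: list) -> list:
    """Packages present now that were not in the saved baseline snapshot."""
    return sorted(set(current) - set(baseline))


# Hypothetical package lists; in practice these would come from
# `adb shell pm list packages` taken at two different times.
baseline = ["com.android.settings", "com.whatsapp"]
current = ["com.android.settings", "com.whatsapp", "com.example.dropper"]

print(new_packages(current, baseline))  # ['com.example.dropper']
```

A match here is only a lead, not proof of malware, since legitimate updates can add packages too.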
01:46:47.000 Yeah, it's just asking you, like, why are you running?
01:46:49.000 So this is just something that you could do with an Android phone that you just can't do with it.
01:46:53.000 Yeah, Apple's not open.
01:46:54.000 What are the reasons you don't trust Apple?
01:46:55.000 Well, could I ask, can I do one thing? But remember that question, because I don't want to forget it.
01:46:59.000 Could I give you a prompt?
01:47:01.000 Sure.
01:47:01.000 Because I want to answer your first question that we've already gone past.
01:47:05.000 Can you bring Perplexity back up, please?
01:47:07.000 You want to go in addition to that or start a new one?
01:47:08.000 No, this is fine.
01:47:10.000 Just say, my friend helped me carve an E01 file, Echo-Zero-One, an E01 file.
01:47:22.000 And he says that there is code in there that doesn't comport with the rest of the Android system.
01:47:33.000 Yeah, P-O-R-T.
01:47:35.000 The rest of the system.
01:47:39.000 Could I dump that code here and could you tell me what it means?
01:47:43.000 I'm sure the answer is yes, but I just didn't want to answer it because I've never done it.
01:47:49.000 Could you tell me?
01:47:50.000 Could you tell me, Jamie?
01:47:52.000 Could you tell me what it means?
01:47:53.000 Yeah.
01:47:55.000 Figured that out probably, though.
01:47:57.000 Get the U out.
01:47:59.000 Get the U. Tell me you.
01:48:01.000 Could I have some of your coffee, please?
01:48:03.000 It's for you.
01:48:03.000 Yeah, absolutely.
01:48:05.000 Okay, let's see what it says.
01:48:09.000 Yes, you can paste suspicious code here, and I can help explain what it appears to do line by line and whether anything looks malicious.
01:48:17.000 Before you paste, a few important notes.
01:48:19.000 Remove or redact anything that looks like private data, username, passwords, keys, tokens, IPs, email addresses, phone number.
01:48:26.000 Perplexity is our sponsor.
01:48:28.000 I love that.
01:48:28.000 Because you'd never get that from ChatGPT.
01:48:31.000 You'd want all that information.
01:48:32.000 Perplexity is the shit.
01:48:34.000 And so you're not sharing personal or sensitive information.
01:48:38.000 Wonderful, Perplexity.
01:48:40.000 If it's very long, send in chunks and tell me chunk 1-3, chunk 2-3, et cetera, so we can help keep track.
01:48:48.000 I can do static analysis here, read and reason about the code, but I cannot actually execute it in a sandbox.
01:48:54.000 So this is more like a careful forensic read-through than a full dynamic malware analysis.
01:49:00.000 Go ahead and paste the code snippet your friend flagged as not fitting with the rest of the system, and tell me in a sentence or two where in the E01 it came from.
01:49:09.000 Example, app folder, system partition, random file path.
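The redact-then-chunk advice above can be sketched in a few lines. This is a rough illustration, not a complete scrubber: the patterns only catch obvious emails, IPv4 addresses, and long digit runs, so you would still eyeball the text before pasting it anywhere.

```python
import re


def redact(text: str) -> str:
    """Blank out obvious private data before pasting code to a chatbot:
    emails, IPv4 addresses, and long phone-number-like digit runs."""
    text = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[EMAIL]", text)
    text = re.sub(r"\b(?:\d{1,3}\.){3}\d{1,3}\b", "[IP]", text)
    text = re.sub(r"\b\d{10,}\b", "[NUMBER]", text)
    return text


def chunk(text: str, size: int) -> list:
    """Split a long paste into labeled pieces: 'chunk 1/3', 'chunk 2/3', ..."""
    parts = [text[i:i + size] for i in range(0, len(text), size)]
    n = len(parts)
    return [f"chunk {i + 1}/{n}\n{p}" for i, p in enumerate(parts)]


sample = "connect to 10.0.0.7 as admin@example.com"
print(redact(sample))       # connect to [IP] as [EMAIL]
print(chunk("A" * 10, 4))   # three labeled chunks
```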
01:49:13.000 Yep, exactly.
01:49:14.000 So yeah, I thought that would be the answer.
01:49:16.000 I've just never done it.
01:49:17.000 And so you can do a forensic examination of an Apple, by the way.
01:49:22.000 Sorry if I misspoke there, but you can't do it to the level that you can with Android, because the Android Open Source Project publishes all of the code, so I can get an understanding of the very inner workings.
01:49:32.000 So if something's being done, for instance, at the kernel, or you could think about it as like the lowest level of the phone, something that wouldn't normally get caught in a forensic examination, I wouldn't be able to do that with Apple.
01:49:45.000 And the nation state actors are doing things at very low levels in the code framework for that exact reason because most people who aren't very deep into forensics would miss that.
01:49:58.000 It would be like the fingerprint under the couch cushion or something like that.
01:50:01.000 And what is the difference between what someone can do with an Android phone with the standard Android operating system versus Graphene?
01:50:12.000 So that gets into, you know, if you wanted to wardrive or sample Wi-Fi networks in an area, or if you wanted to run a barrage attack on a Wi-Fi endpoint, you could work that in there to do things with the phone that you couldn't otherwise do with a standard Android operating system.
01:50:31.000 But as far as on a consumer level, what protections do you have by running graphene that you don't have by running Android?
01:50:40.000 You're much more in control of the ecosystem.
01:50:44.000 You have a firmer understanding.
01:50:45.000 And again, you could use a large language model to do this to understand exactly what's being run on the phone.
01:50:50.000 You control the background services that can be run on the phone.
01:50:53.000 So if you're getting hot mic'd, or if your camera's taking pictures of you when you're not looking, or it's listening to you for advertising content, stuff like that, you would be in control of all of that in a way that you're not in control of on a native Android build.
01:51:04.000 In control, like how so would it alert you that this is happening?
01:51:07.000 Or just the functionality wouldn't be there for it to take place.
01:51:10.000 Right, because the functionality is only designed for the standard Android operating system.
01:51:15.000 And I haven't installed graphene in a while.
01:51:18.000 So a lot of this, all of this updates, and I could be saying things that are incorrect.
01:51:22.000 I stopped doing this about three years ago.
01:51:24.000 Well, I know that there was, I forget what country it was, but they were focusing on people who use Google Pixel phones, for example.
01:51:31.000 Yeah, because that's because that's one of the phones that are more commonly rooted.
01:51:35.000 Yeah, it's easy to do.
01:51:37.000 And you could do it with a large language model.
01:51:38.000 You could sit there and be walked through on how to do it, which is a great part of that.
01:51:42.000 Is it complicated?
01:51:43.000 For a person like me that's not that astute?
01:51:46.000 No, it's not something I would do with a phone that you care about the first few times.
01:51:50.000 Right.
01:51:50.000 Because you're going to jack things up.
You have to, you know, get the bootloader, essentially the starting mechanism of the phone that launches all of the other things. You have to get down to that level and unlock it so that you can... Is that available for all Android phones? No, not all Android phones.
01:52:06.000 Lots of them lock it down, so you can't do that.
01:52:08.000 Is that available for Samsung phones?
01:52:10.000 No, not this one.
01:52:11.000 So the question has to become, can you unlock the bootloader?
01:52:15.000 And that is the starting, think of it as the starting engine of the rest of the phone.
01:52:18.000 Why is that only available on Google Pixel phones?
01:52:21.000 I'm not sure why they do it that way.
01:52:22.000 I haven't looked into that.
01:52:23.000 It's just pixels.
01:52:24.000 And the older Samsung's made it available.
01:52:28.000 Older Galaxy S7s, S10s, you could do more than you can with like, you know, I've got the Galaxy fold here, and you can do almost none of that on here.
01:52:37.000 That is fucking sweet, though.
01:52:39.000 Yeah, I love this phone.
01:52:40.000 But like I said, I went away from doing all that, A, because it was work.
01:52:44.000 B, because I'm not working in national security anymore, and I'm not, you know, I haven't written an exploit in years.
01:52:51.000 I don't do this type of work anymore, and I need to sell a product.
01:52:54.000 And it just, you know, working with other employees, like that run my Instagram or, you know, assistant going through my email and all those other types of things, it just wasn't pragmatic anymore for me to keep doing that, and I had to give up that.
01:53:05.000 Would your Spartan Forge app run on Graphene?
01:53:09.000 Yeah, well, it could.
01:53:10.000 Yeah, it would.
01:53:11.000 You have to sideload the app.
01:53:12.000 But again, a large language model could walk you through doing that.
01:53:15.000 So we haven't gotten to that level of...
01:53:18.000 Does it make sense here that this says it's easier because Google makes it easier?
01:53:22.000 Yeah, he was just asking me why they make it easier.
01:53:22.000 Yeah.
01:53:25.000 And I don't know that answer.
01:53:27.000 So the process is officially supported in the Android settings under developer options, allowing users to toggle OEM locking.
01:53:34.000 Simple fast boot method.
01:53:35.000 Pixels use standard fast boot commands that work consistently across all models to unlock the bootloader, accessibility.
01:53:43.000 Yeah.
01:53:44.000 That's what I was talking about.
01:53:45.000 So, yeah, I don't know why they do it.
01:53:47.000 It might be people can, well, the Android open source project exists.
01:53:52.000 So, it would stand to reason that you would want a way for someone, because what you want is people interacting with that code and red teaming it and making the code better and then offering bug bounties so that you can tell Android, like, hey, you've got a critical flaw in your system architecture here, and then they'll pay you 20 grand for that.
01:54:09.000 I've got friends who do that.
01:54:11.000 So, you and I talked about Eric Prince's phone.
01:54:14.000 Yes.
01:54:18.000 So, the narrative is that that is an unhackable phone.
01:54:23.000 Yeah, it's just by virtue.
01:54:24.000 And look, Eric's a wonderful guy, and the principles that he used for the first instantiation of that phone are the correct principles, which is we need to get, if you want, if you're security focused at all, you should get away from these big, large conglomerates because none of your data is private.
01:54:43.000 That's a correct principle.
01:54:45.000 An incorrect principle, and I'm going to get shit about this, but I told you in the beginning I care about the truth and I do care about the truth, is that when you're using a PKI subsystem that relies on Microsoft, then you're not in control of the PKI certificate signing, and Microsoft could cause a bunch of problems, and they were using that.
01:55:05.000 So, the other thing being, if you're building on the Android open source project, that means the code that you're using as the engine, let's just call it that of your phone, is examinable by the public.
01:55:16.000 So, you're relying on Android to publish these updates to the phone, and you're relying on those things to be as good as possible.
01:55:25.000 Now, you might harden it some more, but as long as the code is out there, it can always be mucked with.
01:55:30.000 As long as people have to interact with the device and type, and you have to see what you're typing, a phone's going to be, it's going to have Swiss cheese.
01:55:38.000 So, when people say something is unhackable, as you said, that's just not true.
01:55:44.000 Yeah, it didn't make sense to me.
01:55:45.000 It's just not true.
01:55:47.000 Yeah, we talked about it quite a bit.
01:55:50.000 Like I said, great guy, done lots of great things for the country.
01:55:53.000 And it's just if they had just said something along the lines of it's hackable as any phone is hackable, because by virtue of you having to interact with it, it's hackable.
01:56:03.000 It just, like if I install, if I came up with an app that had a, you know, look at the TikTok terms of service on the first TikTok.
01:56:10.000 Oh, it's bonkers.
01:56:11.000 With those terms of services, I will own your phone.
01:56:14.000 And I'm not saying you can install TikTok on his phone, but what I'm saying is by virtue that you have to interact with the phone and see what you're doing and type passwords, and you've got those kinds of terms of service, I could easily put a key logger in that, and now I know your signal password or your signal pin.
01:56:29.000 Or, you know, I get you, you know, you're going to China, so I stop you in secondary.
01:56:34.000 And while you're in secondary, I've got a CCTV on you, and you unlock your phone.
01:56:37.000 Now I know how to unlock your phone.
01:56:39.000 And now I'm going to lock you up in secondary at customs in China or in Canada.
01:56:46.000 And I'm going to separate you from your phone.
01:56:48.000 And I've seen you unlock it.
01:56:49.000 Well, now I'm going to get in there with EnCase, or I'm going to get in there with FTK, or I'm going to get in there with Cellebrite.
01:56:54.000 And I'm going to dump your phone.
01:56:57.000 And just by virtue of it being built on the Android open source project, that's a great thing.
01:57:03.000 It's a good thing.
01:57:04.000 Just don't call it totally unhackable.
01:57:06.000 Because a guy like me, I don't need but a week or two to tell you on this current build, like here, here's the hole in this Swiss cheese.
01:57:14.000 Now, is it far better than having a Google phone with standard firmware and standard OS or an Apple phone?
01:57:23.000 I don't know about Apple because, again, you asked me about Apple and I said, I don't know Apple.
01:57:27.000 I don't know what's happening at the top of that company, but I know that they like to monetize people, and that's pervasive in my mind.
01:57:34.000 And using data that people don't know is getting used, even though it's in a 40-page terms of services document, is pervasive.
01:57:40.000 So I just don't know at that highest level of analysis.
01:57:43.000 And that's why I said to answer your question about the safest phone, I would ask you what you're using it for, who you are, and what are you doing in the world is the best way to answer that question.
01:57:53.000 So, me, like, what would you recommend I use?
01:57:56.000 I mean, I wouldn't want to, I mean, okay, I'll tell you generally what I would say because you might ask me that question one day because we go back and forth about a lot of tech.
01:58:05.000 I know specifically what I would recommend for you to do, and I'd even tell you to hire someone else to do it and not me, because that just that checks and balances is what I would want.
01:58:15.000 But for you, I would say you should take something like a Raspberry Pi and you should run WireGuard on your phone, and you should route all of your internet traffic through something like a home terminal at your house through a Raspberry Pi using something like WireGuard, which is a VPN that I use that's very good.
01:58:35.000 And everything should be routed through that.
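As a rough illustration of the setup he's describing, a phone-side WireGuard config that routes all traffic through a Raspberry Pi at home might look like the fragment below. Every key, address, port, and hostname here is a placeholder, not a real endpoint, and the Pi would need its own matching `[Interface]`/`[Peer]` config plus port forwarding on the home router.

```ini
# Phone-side WireGuard config (sketch; replace all placeholders).
[Interface]
PrivateKey = <phone-private-key>
Address = 10.8.0.2/32
DNS = 10.8.0.1

[Peer]
PublicKey = <raspberry-pi-public-key>
Endpoint = home.example.net:51820
# 0.0.0.0/0 sends ALL of the phone's traffic through the tunnel.
AllowedIPs = 0.0.0.0/0
PersistentKeepalive = 25
```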
01:58:39.000 And if you trust Apple, continue using Apple.
01:58:43.000 If you don't trust Apple, then use Android.
01:58:46.000 And you could use a Pixel and do graphene, and you could use Signal on there and those other things.
01:58:52.000 And you're going to be relatively safe.
01:58:54.000 But again, if I'm a nation-state actor, I can create circumstances where I'm going to get access to your shit and I'm going to lock you down.
01:59:02.000 And some of them are more expensive than other methods to do it.
01:59:06.000 But I'm a pragmatist and you can always come up with a method to get a hold of somebody's shit.
01:59:09.000 You can always create the circumstances, especially if you're a nation-state actor to get a hold of somebody's stuff.
01:59:15.000 That would be the very high level of things that I would recommend to you just out the gate.
01:59:24.000 Yeah, it's very concerning because it seems like these things keep getting stronger and more capable.
01:59:30.000 Yes.
01:59:30.000 Like the Pegasus 2 being a zero-click exploit.
01:59:33.000 Yes.
01:59:34.000 So all they have to do essentially is just know your number.
01:59:37.000 Yep.
01:59:39.000 And that's, you know, you just make yourself a difficult target would be my best recommendation.
01:59:46.000 When you're going to answer questions about password reset, don't answer them honestly.
01:59:50.000 Write down in a physical journal or something how you answered those questions.
01:59:53.000 Don't answer them honestly.
01:59:55.000 You know, all of these things we think are added for layers of protection.
01:59:58.000 For instance, you used to get that pop-up on your phone where it said, you know, there'd be like blocks of pictures and it would say, click all of the pictures with a traffic light in it.
02:00:10.000 I was just going to say that, a traffic light in it.
02:00:12.000 Part of that might be for security.
02:00:13.000 The other part of it is they're using the information of what you're clicking to train neural networks.
02:00:18.000 You're a product at that point.
02:00:20.000 You think you're getting security out of it, but you're a product at that point because you're helping to educate a neural network on what traffic lights look like and how they can look and all those different instantiations of traffic lights.
02:00:31.000 So, and again, like we have to separate causality and intention and outcomes in that the companies might do this because they want to create the greatest AI ever.
02:00:42.000 But when you're issuing someone a 40-page terms of service document on everything they can do with your thing that you paid $2,000 for, it's just, you know, we need more ethical people.
02:00:53.000 At least what Eric Prince was trying to do was right, which was we need to off-ramp from some of these big things because the way that this government is going, I'm very worried about the rights of the individual now and going forward because we have an uneducated class of people for all of the reasons in the world.
02:01:13.000 Like if you want to just focus on your family and you're not thinking about these things, I don't hate that for you.
02:01:17.000 But the idea of individual autonomy and rights has been so shit on in recent years, and we keep getting more uneducated and relying on large language models, which are great, but they're not a foundation of learning.
02:01:32.000 In other words, we have a lot of people with access to information but no wisdom.
02:01:37.000 It's like when your parents would say, learn how to do addition and subtraction on paper before you use a calculator.
02:01:43.000 Like, understand how to do research and cite sources and understand, you know, how to conduct really good analysis before you just use a neural network for everything.
02:01:53.000 Because as we lose focus of our civics and what our founders are trying to do and the uniqueness of it, which is truly unique, which is, you know, when I joined the Army, I joined the Army to get out of North Dakota.
02:02:04.000 When I re-enlisted in the Army, it's because I believed in the experiment.
02:02:07.000 And that's another five-hour podcast.
02:02:09.000 But the foundation of the experiment is good, but we've eroded it in so many ways over the years and given up so many individual rights in the name of security.
02:02:20.000 And I'm sure it's been said on here before, but Franklin said, anybody who gives up their individual rights and freedoms in the name of security deserves neither.
02:02:31.000 And it's some of the ways that they've done it have been really above the surface.
02:02:35.000 And it frankly blows my mind that we let the government get away with some of these things that we let them get away with, where you even explain it to people and they're like, I don't see it.
02:02:45.000 Like, I don't see how that was a big deal.
02:02:47.000 And I'm like, it was a total recalibration of the system that allowed the Democratic Party and the Republican Party to usurp your rights in a way that if you knew any better, you'd probably be protesting.
02:03:00.000 Like some of the ways that they've done this, you know, we can go with the easy stuff like the Patriot Act, right?
02:03:06.000 In the name of security, we're going to start collecting on Americans.
02:03:09.000 You know, and the Biden and Obama administration, I will say this at risk of, you know, getting in trouble because I used to have a clearance.
02:03:19.000 They had a massive vacuum cleaner and they knew what it was vacuuming up.
02:03:23.000 And they kept vacuuming it up anyway in the name of security.
02:03:26.000 I'm not saying they were going after American citizens, but they certainly knew they were.
02:03:31.000 And they just vacuumed shit up and collected it and stored it in a database.
02:03:36.000 In case they needed it.
02:03:37.000 In case at some point we needed to, you know, come up with a narrative or get rid of somebody who's inconvenient or whatever else that just flies in the face of individual American rights and American autonomy and is really, in my mind, the anti-pattern to freedom.
02:03:54.000 It's just really, really bad.
02:03:56.000 I mean, I'll give you one that people always crap on me whenever I talk to them about it, but there's two that really bother me.
02:04:01.000 One of them being like the 17th Amendment.
02:04:03.000 Do you know the 17th Amendment to the Constitution?
02:04:06.000 So the 17th, so when the founders, when you read the Federalist papers and the Federalist papers, I really love reading the Federalist papers.
02:04:13.000 I love reading how they informed the Constitution, the Bill of Rights, the Declaration even.
02:04:19.000 John Jay and James Madison wrote these documents explaining the framework.
02:04:23.000 And the 17th Amendment, essentially how the Senate, the Senate, right?
02:04:27.000 The 100 people there that are supposed to be representing us. The way it was originally constructed, a state would have legislatures, and the state legislature and the governor would appoint the senator.
02:04:37.000 The reason that the founders did that was because the state governments had to give power to the federal government to exist.
02:04:45.000 Back with the Articles of Confederation.
02:04:50.000 Confederation, is that right?
02:04:51.000 Articles of, I think it's the Articles of Confederation.
02:04:54.000 I'm blowing up, sorry, I'm going nuts.
02:04:57.000 Back before there was a strong centralized American government, we had problems with money, we had problems with interstate commerce and those types of things.
02:05:04.000 And those articles eventually turned into what is the Constitution.
02:05:07.000 But the states had to grant that power.
02:05:09.000 And the signers of the Declaration of Independence and the Constitution knew that the states needed to be those small projects that we talked about before where if California wanted to go nuts, let them go nuts.
02:05:20.000 But it shouldn't impact what's happening in Texas.
02:05:22.000 It shouldn't impact what's happening over in New England.
02:05:24.000 It shouldn't impact what's happening in the Midwest.
02:05:26.000 But if that goes nuts and it fails, it needs to fail.
02:05:30.000 So the state senators, I'm sorry, the state legislatures would come together and they would vote for a senator.
02:05:36.000 they would elect a senator.
02:05:37.000 And that senator's job was to go to the federal government and protect the rights of the state.
02:05:43.000 Not to protect the rights of individuals per se, and certainly not to embolden the federal government.
02:05:49.000 But with the 17th Amendment, what happened was the House of Representatives' function was to be the petulant children of government.
02:05:58.000 So their job was to come up with crazy ideas, crazy laws, all of those things.
02:06:02.000 The more liberal version of government jurisprudence would be the House of Representatives, your crazy ideas.
02:06:08.000 And then you had state senators who were supposed to be between the House and the President who would say, well, here's a good idea, but the rest of this is retarded, AOC.
02:06:16.000 Like, we're not doing all this.
02:06:17.000 That's crazy.
02:06:18.000 Or whoever else, name you a Republican who's an asshat as well.
02:06:23.000 We're not doing these things.
02:06:24.000 And that's because it would erode the state's rights and the state's constitution and what made this state great.
02:06:29.000 Because what the legislatures would do is say, hey, Joe Rogan, you've made a lot of money and you've got a big podcast and a big voice and you've learned some lessons around the way.
02:06:38.000 And you were able to do that in Texas.
02:06:39.000 And you decided to come to Texas because we had all of these things that California didn't have.
02:06:44.000 We need you to go to the Senate for three years or six years or seven years, whatever it was back then, and represent those same principles.
02:06:52.000 So when Obamacare comes through, you can say, not only no, but fuck no.
02:06:56.000 Like, I'm not voting for this thing.
02:06:58.000 And it was to protect the state.
02:06:59.000 But what the 17th Amendment did was make the Senate redundant with the House of Representatives, which was, in the founders' eyes, the only popular-vote part of the American government.
02:07:14.000 And then you had, you know, the way the president gets elected through electors, but you had the state senate, which was appointed by the states.
02:07:20.000 So the legislatures, and I'll use North Dakota where I'm from, you'll have one big city, two big cities, Fargo and Grand Forks, North Dakota.
02:07:29.000 It's where the universities are.
02:07:30.000 It's where your crazy kids are.
02:07:32.000 Crazy thought exists, hyper crazy ideas, but some of them are useful.
02:07:37.000 The rest of the state's agriculture, right?
02:07:39.000 So all of those legislators from all those counties, those legislative districts, would get together and say, we're going to put Bill Thompson, that would never happen, but he's going to be at the Senate representing North Dakota.
02:07:51.000 But he has to represent the whole state.
02:07:54.000 In other words, you can't do things that will help Grand Forks or Fargo because that's where the universities are.
02:07:59.000 That's where all the crazy politics are.
02:08:01.000 You also need to be thinking about the guys out in the western counties, LaMoure County, North Dakota, or way out west.
02:08:06.000 You have to protect agriculture.
02:08:07.000 You have to protect small businesses.
02:08:09.000 You have to protect families.
02:08:11.000 What the 17th Amendment did, under Woodrow Wilson, was really usurp the Constitution: they made the Senate a redundant House of Representatives by using the popular vote.
02:08:22.000 So now we use popular vote for that.
02:08:25.000 But if you want the popular vote in North Dakota, 85% of the population is in Fargo and Grand Forks.
02:08:30.000 So now you've got, if I want to run for Senate in North Dakota, I'm just going to spend all of my time in Fargo and Grand Forks.
02:08:36.000 Because if I can repeat back to those people all the ideas that they want to hear, I'm going to win that vote and I don't have to represent those people out in the rest of the state in anything.
02:08:45.000 So they created a redundant House of Representatives.
02:08:48.000 But another reason why it happened was they wanted popular vote because there is no amount of money that you could stick into a legislature out in the western part of North Dakota.
02:08:57.000 You can't bribe these people.
02:08:58.000 But the DNC and RNC now can say, look, these two senators are running.
02:09:02.000 We like this guy.
02:09:03.000 So this guy will do whatever we tell him to do.
02:09:06.000 And it has nothing to do with the state or representing the state's rights or the rest of those legislative districts.
02:09:11.000 We're going to pick this senator and he's getting $300 million for his election bid.
02:09:15.000 And this other guy, who's a slower-moving constitutional conservative, who might be a free market absolutist and a classical liberal, he's not being funded.
02:09:27.000 But under the state architecture, you might have been a better representation of the state.
02:09:32.000 And that's why the legislators had to vote for you to put you in as a senator.
02:09:36.000 You had to represent the whole state.
02:09:38.000 But now, all that someone who wants to be a senator needs to do is go to the Republican National Committee or the Democrat National Committee and say, I'll do all the things you tell me to do, fund my campaign, and I'm going to go stump in Fargo and Grand Forks, North Dakota, and the hell with the rest of the state.
02:09:55.000 It's very important.
02:09:56.000 It's a very important sleight of hand.
02:09:57.000 And when that happened, you made a redundant House of Representatives, and the state no longer was protected at the federal level.
02:10:05.000 And what happened was all of the power from all of these states and these legislatures and these individuals got sucked up into the federal government.
02:10:13.000 And then after that, you see all of these things that would never have been passed by a state getting passed, things like Obamacare, things like the Patriot Act, certain war resolutions, all kinds of things where it just further erodes the power of the state.
02:10:27.000 And federal government wants that because it puts all of the power up in the federal government.
02:10:31.000 And people always say we need to get money out of politics.
02:10:34.000 No, we need to get power out of politics.
02:10:36.000 That power that they've taken over the last 130 years or so used to exist at the state and local levels because they wanted these thought experiments happening where we could pluck the best things out of them and forget the rest.
02:10:49.000 But all of that power has now gone up to the federal government and the federal government won't ever release that power and they only want more budget and more spending to execute that power.
02:10:59.000 And that's also because the interest groups that want to go, they don't want to have to go and convince a whole state of whether or not something is good that people are going to vote on.
02:11:07.000 They just want to go take a lobby and go up to the federal government because they want all of the power up there as well.
02:11:12.000 And the federal government wants all the power up there as well because they make $300,000 a year before they become a politician and they're worth $30 million when they're done being a politician because all of the money has to go to the federal government because they're in charge of light bulbs we can use, computers we can use, flush toilets we can have, how our roads are going to look, what our medical care looks like.
02:11:33.000 None of those powers are explicitly written in the Constitution of the United States and they use things like the commerce law and other things in order to create things like Obamacare, where really we want competing states.
02:11:44.000 If Texas comes up with a great way to do health care and North Dakota's isn't so great, they can look at that experiment and they can adopt the principles and they can have it at that level.
02:11:54.000 But it's much easier to get change at the local level when the power is derived from the state and the individual because if I want to change the way that my state does health care, I have one of two options or three options.
02:12:05.000 I can run for office, I can support someone who is going to go into office and do what I want, or I can move.
02:12:10.000 But when everything's centralized at the federal government and everything flows from the federal government, all of the money, power, and gravity is up there.
02:12:17.000 And the individual, the 300 million of us or so, have really no power now to exercise either state's rights or individual rights at the higher level.
02:12:26.000 I hope I'm elucidating this correctly, but it's a real usurpation of individual and state autonomy that got rid of state power, which, if you read the Federalist Papers, was so important to the founders, because the state was where they wanted these thought experiments.
02:12:45.000 You read Thomas Hobbes's Leviathan or John Locke or Montesquieu.
02:12:50.000 All of them talked about this great experiment that was being set up and how it was built on all this Western politics and everything that came before it on how we could have a government that was forced to respect the rights of individuals and allowed for these competing think tanks of ideas and that the power would never rest at the federal government.
02:13:06.000 But the 17th Amendment was a way that a lot of that power went from the state level and the state legislatures.
02:13:13.000 And now to become the president, they want to do a popular vote.
02:13:17.000 And under a popular vote, you would just have to campaign in New York and L.A. You would get the popular vote out of the likely voters.
02:13:25.000 And now the rest of the country is not represented.
02:13:27.000 And that would be another, you hear all these people saying we need a popular vote.
02:13:30.000 Can't have the Electoral College.
02:13:32.000 We can't have all of these things.
02:13:34.000 Everything needs to be pure democracy, which allows 51% to rule 49%.
02:13:41.000 And that was another thing the founders were working fervently to get away from.
02:13:46.000 And that's why we had an electoral college.
02:13:48.000 And it's actually quite beautiful when you actually read about it and examine it.
02:13:51.000 It's why we had the state senate and state legislatures.
02:13:54.000 And it's why we had the House.
02:13:55.000 You had all levels of government that the founders cared about being represented in this body politic.
02:14:01.000 And it was a beautiful thing.
02:14:02.000 And I could go on for 15 more things about that.
02:14:04.000 I won't do it for the sake of your listeners because I doubt this is what they wanted to do.
02:14:08.000 But similar things happened with the Supreme Court in Marbury v. Madison and allowing the Supreme Court to have judicial review.
02:14:15.000 That was never a thing that was in the Constitution.
02:14:17.000 And the Supreme Court, if you like the Supreme Court being able to have the power to describe everything as being either constitutional or unconstitutional, then you're not ruled by a democracy.
02:14:25.000 You're ruled by an oligarchy.
02:14:27.000 You've got nine people in robes that are going to tell you whether or not laws are good or bad.
02:14:30.000 And that's not the founding of this country.
02:14:32.000 It's not how it was intended to work.
02:14:33.000 And that all started back in Marbury v. Madison, with Thomas Jefferson and these writs of mandamus, where the Supreme Court, long story short, essentially granted itself the power to conduct judicial review.
02:14:49.000 The system that was ratified and that the founders approved was if a law was deemed unconstitutional, it would go before the Supreme Court and they just would rule in favor of the person.
02:15:00.000 And then eventually the government would figure out, oh, this law doesn't work.
02:15:03.000 But it was never on the Supreme Court to say constitutional, unconstitutional.
02:15:07.000 You would get arrested for some law, and it would get appealed to the Supreme Court.
02:15:11.000 The Supreme Court would say, we're not punishing this person.
02:15:13.000 This is against the Constitution.
02:15:15.000 But the government would have to keep arresting people.
02:15:18.000 It would have to keep going in front of the federal government.
02:15:20.000 So what I'm saying is, and I'm sorry to go off on this, we can go back to tech.
02:15:23.000 But all I'm saying is the core of the American experiment in individual rights and what makes this country so great and why I was willing to die for it after my initial enlistment.
02:15:34.000 And why I have such love for this is because it was the only experiment where the value of the individual was held at the top of the hierarchy and that people could truly be allowed to flourish.
02:15:43.000 And in 250 years, we did more than any society could have hoped to have achieved in tens of thousands of years.
02:15:49.000 Not that it's been around that long, but in thousands of years.
02:15:52.000 Everything tends towards disorder and everything, power always gets centralized.
02:15:58.000 And we had a framework to do that, but we were willing participants in our own demise.
02:16:03.000 And now we're scratching our heads and wondering why there's no individual and why there's no individual autonomy and why a guy can't smoke weed on the weekend or why a guy can't do X, Y, or Z, because we have centralized the authority and the power and the decision-making structure.
02:16:16.000 And we're allowing them to be, there would be no problem with money in politics if the federal government had only the powers that were outlined to it in the Constitution.
02:16:25.000 I think that's very well said.
02:16:27.000 And I could have never said it the way you said it.
02:16:30.000 And I think there's a lot to absorb here.
02:16:32.000 I'm sorry.
02:16:33.000 No, no, it was great, dude.
02:16:34.000 It was great.
02:16:35.000 This is one of the things that I love about you.
02:16:37.000 You're very thorough.
02:16:38.000 Yeah, thorough is one thing.
02:16:40.000 My friends always say Bill's tism is starting to show.
02:16:43.000 You got a touch of the tism.
02:16:44.000 But I think that's good.
02:16:45.000 Like I said, just like ADHD, I think it's a superpower.
02:16:48.000 A lot to absorb.
02:16:49.000 So I think we'll wrap it up right here.
02:16:51.000 But thank you.
02:16:52.000 This was an awesome conversation.
02:16:53.000 I really appreciate it.
02:16:54.000 It was really great.
02:16:55.000 Yeah.
02:16:56.000 We could do this again, too.
02:16:57.000 I'm sure you probably have 30 or 40 of these.
02:16:59.000 We didn't even get to AI.
02:17:00.000 I wanted to get to AI because I think I have a very anti-pattern to AI and how you understand it.
02:17:05.000 But if you want, we can save that for another time.
02:17:08.000 Yeah, we'll do that for our next one because I think that's another four hours.
02:17:10.000 Yeah, probably.
02:17:11.000 Yeah, for sure.
02:17:13.000 And by then, who knows where it's going to.
02:17:14.000 I mean, Jensen Huang from NVIDIA recently declared that we've reached AGI.
02:17:20.000 Yeah, so I would, I would, yeah, I could, yeah, I just couldn't disagree more.
02:17:26.000 And I think I could, in the same way, I just elucidated.
02:17:28.000 You're not the only one.
02:17:29.000 Quite a few people.
02:17:30.000 Yeah, yeah.
02:17:31.000 I mean, it's consciousness projection.
02:17:32.000 And I'll sum it up in a minute.
02:17:34.000 At the end of the day, neural networks are mathematical functions.
02:17:38.000 They rest on, you know, weighting neurons based on training data and applying power to train models.
02:17:44.000 It's all mathematics.
02:17:46.000 There's no sense of knowing in that. You know, Penrose, I've read a lot on his Orch OR, if people want to read about that, I won't explain it.
02:17:57.000 Orchestrated objective reduction, and how the mind works, and these fleeting moments of consciousness that we have, these shimmers of consciousness, based around what he describes in the microtubules.
02:18:08.000 We get conscious thought and that conscious thought we project into things.
02:18:12.000 AI is very good conscious projection, but it will never have consciousness or knowing because it has no system of values.
02:18:19.000 And if we were to instill values in it, it would still be consciousness projection.
02:18:23.000 You saw my dad's cabin.
02:18:25.000 My dad died when I was five, but I bought it back and was working on it.
02:18:27.000 And inside of his cabin, I got to learn a lot about my father by working on the cabin that he built.
02:18:34.000 He wouldn't measure things or cut things right on walls and that type of stuff.
02:18:37.000 That's all consciousness projection that allowed me to get to know him in a way
02:18:41.000 I might not even have known him if he were alive, but I got to re-experience and understand my father and his thoroughness through that cabin.
02:18:47.000 AI is consciousness projection.
02:18:49.000 It's projected consciousness.
02:18:50.000 It's getting very good, but with a calculator, you could get the same thing that you get out of a neural network if you had sufficient time.
02:18:58.000 I could present you a question just like you did on Perplexity.
02:19:01.000 I could sit here with a rule book and I could type in a calculator.
02:19:06.000 It might take me a million years, but I could do it and I could give you the same answer that a neural network would give you.
02:19:11.000 That doesn't mean consciousness or knowing or AGI is present.
02:19:16.000 It relies on its training data.
02:19:18.000 It can only give you what the training data gives it.
02:19:21.000 It needs human consciousness projection like we talked about with the CAPTCHAs or we talked about with uploading photos to Google Drive.
02:19:28.000 It needs that training data.
02:19:30.000 And to me, it's just really fancy, clever math.
02:19:34.000 And having trained these networks for a dozen years now and working with them, they're just really clever consciousness projection.
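The "it's all just math" claim above can be made concrete. The sketch below is a purely illustrative toy, not anything from the episode: every weight is hand-picked, and it runs one forward pass of a tiny feedforward network using nothing but multiplication, addition, and an exponential, exactly the operations you could, given enough time, punch into a calculator.

```python
import math

def forward(x, weights, biases):
    """One tiny feedforward pass: each layer is a weighted sum plus a
    bias, squashed through a sigmoid. No knowing, just arithmetic."""
    activations = x
    for W, b in zip(weights, biases):
        activations = [
            1.0 / (1.0 + math.exp(-(sum(w * a for w, a in zip(row, activations)) + bb)))
            for row, bb in zip(W, b)
        ]
    return activations

# Two inputs -> two hidden units -> one output. Weights are invented
# for illustration; in a real network they would come from training.
weights = [
    [[0.5, -0.3], [0.8, 0.2]],   # hidden layer: 2x2 weight matrix
    [[1.0, -1.0]],               # output layer: 1x2 weight matrix
]
biases = [[0.1, -0.1], [0.0]]

out = forward([1.0, 2.0], weights, biases)
print(out)  # a single number between 0 and 1
```

In a trained model the weights are set by gradient descent over training data rather than by hand, but the forward computation is exactly this kind of multiply-add-squash, which is the point being made: the output is a deterministic function of inputs and weights.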
02:19:43.000 And so, yeah, that is four hours and we can do that next time.
02:19:46.000 We'll do that next time.
02:19:47.000 Definitely.
02:19:48.000 But if people, you mentioned the app.
02:19:49.000 By the time we do it next time, who knows what the fuck is going to be going on with AI, too?
02:19:53.000 But if people want to learn more about me or my company, if I can say that.
02:19:58.000 Yeah, please.
02:19:58.000 It's spartanforge.ai.
02:20:00.000 We're built under the rubric of individual freedom.
02:20:02.000 I want people outdoors.
02:20:03.000 I want people hunting.
02:20:04.000 I want people experiencing nature.
02:20:06.000 I want people providing for their families.
02:20:09.000 The best part of my day is when my kids are eating a backstrap of an animal that I took.
02:20:14.000 And I want to enable people to go out and do that.
02:20:16.000 And even though it's paradoxical through an app, you can get lost.
02:20:19.000 You've got to conserve time.
02:20:20.000 You've got to e-scout.
02:20:20.000 You've got to learn things before you go out there.
02:20:22.000 So we built this company under that.
02:20:24.000 It's one of my, I've got three other companies that I'm doing, but Spartan Forge is the one that I'm working on.
02:20:28.000 It's an awesome app.
02:20:29.000 Really working on.
02:20:30.000 Well, I really appreciate that.
02:20:31.000 We've put a lot of work into it, and we've got a lot more coming over the summer.
02:20:34.000 So if people want to support us or want to get out there and get some hunting done, please check it out.
02:20:38.000 And I answer all the Instagram DMs.
02:20:40.000 So if you have a question for me.
02:20:42.000 Good luck with that now.
02:20:43.000 Well, I try to.
02:20:44.000 I spent about two hours every morning doing it.
02:20:47.000 Good luck.
02:20:48.000 Thank you, Joe, for having me.
02:20:49.000 Thanks, brother.
02:20:49.000 Appreciate you very much.
02:20:50.000 Yeah, I get that.
02:20:51.000 All right, you too.
02:20:52.000 Bye everybody.