This Past Weekend with Theo Von - June 20, 2025


#591 - Ro Khanna


Episode Stats

Length: 1 hour and 55 minutes

Words per Minute: 196.04

Word Count: 22,677

Sentence Count: 1,807

Misogynist Sentences: 19

Hate Speech Sentences: 63


Summary

Today's guest is a United States Congressman from California's 17th District, Ro Khanna. His parents immigrated from India. He was a co-chair of Bernie Sanders' 2020 presidential campaign. He's a member of the Democratic Party.


Transcript

00:00:00.000 We hope you're enjoying your Air Canada flight.
00:00:02.300 Rockies vacation, here we come.
00:00:05.060 Whoa, is this economy?
00:00:07.180 Free beer, wine, and snacks.
00:00:09.620 Sweet!
00:00:10.720 Fast, free Wi-Fi means I can make dinner reservations before we land.
00:00:14.760 And with live TV, I'm not missing the game.
00:00:17.800 It's kind of like I'm already on vacation.
00:00:20.980 Nice!
00:00:22.240 On behalf of Air Canada, nice travels.
00:00:25.260 Wi-Fi available to Aeroplan members on equipped flights.
00:00:27.320 Sponsored by Bell. Conditions apply.
00:00:28.560 See AirCanada.com.
00:00:30.480 California, we've got some new tour dates for you.
00:00:33.500 I'll be in Los Angeles on August 14th at the Dolby Theater.
00:00:38.420 Anaheim on August 16th at the Honda Center.
00:00:41.460 And Oceanside on August 17th at Front Wave Arena.
00:00:46.500 Get your tickets early this Thursday, June 19th, with code RATKING, 10 a.m. local time.
00:00:52.600 General on sale begins Friday, June 20th.
00:00:55.780 Get your tickets only at TheoVon.com slash T-O-U-R.
00:01:01.920 Please avoid any scalper sites.
00:01:04.640 And thank you so much.
00:01:05.500 This is the Return of the Rat Tour.
00:01:06.800 Almost over.
00:01:07.800 I promise it's almost over.
00:01:09.640 And thank you guys so much for all the support.
00:01:11.220 Today's guest is a United States congressman from California's 17th district.
00:01:16.980 That is Silicon Valley.
00:01:19.040 He's from Philadelphia.
00:01:20.600 His parents immigrated from India.
00:01:22.840 He worked with Bernie Sanders.
00:01:25.040 He was a co-chair of Sanders' presidential campaign in 2020.
00:01:29.740 I'm grateful for his time.
00:01:31.620 He's a member of the Democratic Party.
00:01:33.200 Today's guest is Congressman Ro Khanna.
00:01:54.000 Ro Khanna.
00:01:55.640 Is that right?
00:01:56.180 That's right.
00:01:56.800 Perfect.
00:01:57.420 Nice.
00:01:57.800 Thank you for coming in today, man.
00:01:59.020 Thanks for meeting up and just being willing to come and chat.
00:02:03.800 You are Indian?
00:02:05.320 Your family is Indian?
00:02:06.240 Indian heritage, yeah.
00:02:07.320 Okay.
00:02:07.700 My parents immigrated from India.
00:02:10.800 My grandfather, actually, Amarnath Vidyalankar, spent years in jail fighting for India's independence
00:02:17.700 as part of Gandhi's independence movement.
00:02:20.920 And then my parents came and settled in Philadelphia.
00:02:25.340 That's where I was born.
00:02:26.380 Do a lot of Indian people, is that a, is that, you know, some places in America are certain
00:02:30.380 like kind of spots where people start to come.
00:02:32.180 Is that a, is that a popular place that a lot of Indian people have gone to over the
00:02:35.080 years?
00:02:35.560 You know, where I grew up in Bucks County, Pennsylvania, there were a few Indian families,
00:02:39.100 but it was 99% white.
00:02:41.760 The bigger places are New Jersey.
00:02:43.820 Have you ever been to Edison, New Jersey?
00:02:46.040 I don't think so.
00:02:47.500 Edison, it's got a lot of Indian food.
00:02:48.940 Have you eaten Indian food?
00:02:49.800 Oh yeah, I've had some good Indian food.
00:02:50.920 What's your favorite dish?
00:02:52.580 I think it's like the one, it's like where you all sit in a circle and everybody kind
00:02:56.960 of like catches a cold together or whatever.
00:03:00.580 Catches a cold.
00:03:01.420 No, I just mean like everybody's like kind of sharing off of the same food.
00:03:04.440 Yeah, it is a very shared thing.
00:03:05.560 You got to come to Fremont sometime in California.
00:03:08.160 That's the best Indian food in the, in the country.
00:03:10.700 But Fremont, Silicon Valley, a lot of Indian families, New York, New Jersey, you know, but
00:03:15.580 my upbringing was interesting because I was one of the few Indian kids growing up in Pennsylvania.
00:03:21.520 And, uh, you know, I grew up though with a lot of people who teachers who believed in
00:03:27.360 me, little league coaches who believed in me, uh, and, uh, at times I would get teased
00:03:31.940 being Indian, but by and large, uh, it made me very hopeful about the country.
00:03:37.000 Yeah, man.
00:03:37.600 Well, thanks.
00:03:38.260 Um, I know we wanted to talk about, I'm trying to think, I actually wrote some questions
00:03:41.780 down today.
00:03:42.200 Cause I feel like, I feel like this is, this is like meet the press.
00:03:46.040 You got, you got, you're, you're, you're on, on, on a different level.
00:03:49.580 I, I thought, you know, the thing I love about your show is it's kind of, uh, just you talking
00:03:54.120 like a regular person.
00:03:55.400 Maybe you're right.
00:03:55.840 Should I put these away?
00:03:56.540 You think?
00:03:57.140 I mean, no, you, you can give it a try.
00:03:59.420 I mean, I think you guys, you know why?
00:04:01.180 I think this is what I realized.
00:04:02.160 I was like, man, I have, sometimes I have conversations with people that I want to be
00:04:06.580 able to get more information from and I can't remember everything.
00:04:11.420 Right.
00:04:12.000 But you know what the thing about you is I'm not saying this to flatter you.
00:04:14.720 A lot of times you'll ask questions and they're like deeper and smarter than some of the
00:04:18.920 Sunday talk show hosts.
00:04:20.840 And I think it's cause you actually talk to real people and it's sort of like, well, how
00:04:24.400 does this affect my buddy?
00:04:25.760 How does this affect real people?
00:04:27.480 So we, you could give it a try when they, well, let me, we'll do a mix.
00:04:30.300 How about this?
00:04:30.860 We, um, I know how we got in touch is you and I have like similar thoughts on different
00:04:35.500 topics online.
00:04:36.640 Um, well, let's just start with Iran or Iran.
00:04:39.820 People say it differently, but are, do you think we're headed into world war three here?
00:04:42.720 What do you think's going on?
00:04:43.680 I'm really concerned.
00:04:45.700 I'm, I'm fearful.
00:04:47.220 Uh, we have a possibility where we're going to blunder into another war in the Middle East.
00:04:54.580 And just today, actually, to make some news, Thomas Massie, who's a principled Republican
00:04:59.260 and, uh, myself introduced a war powers resolution saying that the president should not get into
00:05:06.980 a war with Iran without coming to Congress.
00:05:10.160 Uh, you know, the president to his credit ran against the war in Iraq.
00:05:14.600 He ran saying, I don't want, uh, any wars.
00:05:17.900 And the reality is that if we strike Iran, they start hitting us back and hitting our troops
00:05:24.420 in the Middle East, terrorist activity.
00:05:26.840 That's going to cost this country a lot of money that should be being spent here at home.
00:05:31.420 It's going to mean people could die who are serving the country.
00:05:34.800 Oh yeah.
00:05:35.600 Well, let's take a look at this, at the bill that you guys were thinking about.
00:05:38.500 What is it called?
00:05:39.240 It's called, uh, the war powers resolution to stop a war in Iran, which basically says
00:05:43.940 before the president does anything to go into war, uh, with Iran, he's got to get a vote
00:05:48.280 in Congress.
00:05:49.740 And how long does that take though?
00:05:51.340 Like usually to get a vote in Congress, because there's a certain period of time.
00:05:54.040 Like, can they do it in 24 hours?
00:05:55.200 It takes weeks or what?
00:05:56.120 They could.
00:05:56.480 It's the decision of the speaker of the house to when he can bring it.
00:05:59.100 He has to bring it within 15 days, but we, what we want is today.
00:06:03.400 But by the way, you know, there are a lot of Trump supporters who are telling him, don't
00:06:07.220 go into this war.
00:06:08.520 Uh, Tucker Carlson, uh, Marjorie Taylor Greene.
00:06:12.140 Oh, Dave Smith was talking about it.
00:06:13.680 He's a, uh, political, he's a comedian who also talks a lot about politics, you know?
00:06:17.580 Um, oh, I think it's, this is a horrible idea, you know?
00:06:20.400 You know, and yeah, people say, well, you don't know a ton about the Middle East.
00:06:23.000 Well, that's fine.
00:06:23.680 I don't want people I know, my friends getting called up.
00:06:28.820 I don't want, uh, the children of my friends getting called over to die.
00:06:32.980 Um, it, I don't even understand how it's an option.
00:06:36.240 The only reason I even kind of want it to be possible is to give, is that if Israel like
00:06:43.260 goes to war with Iran, then it'll, they'll at least stop killing Palestinian people for
00:06:47.880 a day or two.
00:06:48.740 Right.
00:06:48.920 You know what I'm saying?
00:06:49.300 Like, that's like my own, it's like, at least point the guns that way, you know, kind
00:06:53.420 of, that's where some of my thoughts are.
00:06:55.000 Well, you've been such a courageous voice on, on Gaza.
00:06:58.320 I mean, you look at what's going on there, right?
00:07:01.620 55,000 people killed.
00:07:04.600 Now Israel says some terrorists and they're right, some terrorists, but most of them,
00:07:09.300 women and children.
00:07:10.260 Yeah.
00:07:11.100 You've got 130,000 people injured.
00:07:13.860 You've got over 600,000 kids who are not in school there.
00:07:18.280 You've got the entire population of Gaza being forced into a corner, Southern corner of Gaza.
00:07:26.640 And then you've got, uh, over 400,000 people who are starving, who don't have enough food
00:07:33.840 to live a decent life.
00:07:36.180 Yeah.
00:07:36.540 And the saddest part I feel like is you see this and you're like, somebody should come
00:07:40.220 help them.
00:07:40.740 That's my first thought a lot of times.
00:07:42.240 And then the saddest thought for me after that is we should come back.
00:07:46.460 Like, it feels like America is the group that would go to help that situation.
00:07:52.240 Usually that's, we're the good guys, right?
00:07:54.020 I mean, we were the good guys in world war two.
00:07:56.680 We saved the world from Nazism.
00:07:58.280 We were the good guys in the cold war.
00:08:00.500 We stood up to the Soviet union.
00:08:02.580 People want to think that we are the good guys.
00:08:05.080 Why are we helping Netanyahu in the killing of children, women, and so many Palestinian
00:08:15.320 civilians?
00:08:15.880 Now, look, Israel was hard hit October 7th.
00:08:19.240 I'm sure, like many people, you thought that was horrific, the October 7th attack.
00:08:25.100 Oh, yeah.
00:08:25.780 And neither of us have any sympathy for Hamas.
00:08:29.860 They're a terrorist group.
00:08:30.880 But what happened is I supported initially Israel saying, go get those terrorists who
00:08:36.260 committed those horrific attacks on October 7th.
00:08:39.360 But within three months, they had destroyed a lot of Hamas.
00:08:44.880 What now is happening over two years later, they're still going and bombing.
00:08:50.580 And they're bombing because Ben-Gvir and Smotrich, two of the people in Netanyahu's
00:08:57.280 cabinet are saying, don't stop the war, even though they know that they can't, you can't
00:09:01.880 bomb and eradicate Hamas.
00:09:04.020 They haven't been able to do it.
00:09:06.040 And what's the only thing that's happening is that civilians are getting killed and Israel
00:09:12.540 actually is becoming less popular in that region.
00:09:15.920 I mean, I was in Saudi Arabia and Jordan.
00:09:18.440 They're losing support.
00:09:20.020 Yeah.
00:09:20.500 So why is Netanyahu trying to drag us into it?
00:09:24.080 What do you think that relationship is like between him and Donald Trump?
00:09:27.500 So I think Donald Trump actually initially stood up to Netanyahu because he wanted to
00:09:32.920 bring this war to a close and he didn't want to get dragged in initially to a war in Iran.
00:09:38.500 I mean, you know what Trump's courage was while he became president in 2016?
00:09:42.980 Do you remember the Republican stage?
00:09:44.660 And they all asked, are you for the Iraq war?
00:09:47.740 And Jeb Bush, who was George Bush's brother, wasn't willing to say that his brother made a
00:09:52.120 mistake.
00:09:52.600 Oh, yeah.
00:09:52.840 He's a little oil bunny.
00:09:53.920 You know, and Donald Trump, to his credit, stood up on a Republican stage and said the
00:09:59.800 Iraq war was a disaster.
00:10:01.460 I think that's how he became president.
00:10:03.280 And he ran saying, I want to be a president who ends these wars.
00:10:07.340 I really believe that's what he thought at the time.
00:10:11.280 Oh, well, sure.
00:10:11.780 Well, I think recently, you know, we're doing OK without the notes so far, huh?
00:10:16.340 You're doing great.
00:10:16.980 I wish I wish people had more common sense like you than the people who are making this
00:10:22.320 foreign policy stuff.
00:10:23.480 Actually, my recommendation is that before Donald Trump makes a decision, he needs to
00:10:27.420 have a conversation with you.
00:10:28.820 Oh, God.
00:10:29.400 No, well, to remind him because I, you know, I watched your actually clip with him during
00:10:34.080 the presidential campaign.
00:10:35.160 And I said, you made him the most human that I've actually seen him.
00:10:40.160 I mean, your conversation with him about his brother and alcoholism.
00:10:44.320 And then he asks you, what is it?
00:10:46.240 What was it like for you to have cocaine and your addiction?
00:10:48.720 And I thought, well, this is like him having a normal conversation.
00:10:51.920 Yeah, it felt pretty normal.
00:10:53.260 And so, you know, but I think, do you know anyone, Theo, in your life who's like, yeah,
00:10:58.380 we should go out and go to war with Iran?
00:11:00.600 Like any of the folks you talk to?
00:11:02.240 Nobody, nobody, even like, even like hardhead, you know, like even guys who like Navy, former
00:11:09.860 Navy, no, none of them, nobody is saying yes.
00:11:13.100 And I felt like the biggest thing that's, I mean, there's so many things right now.
00:11:18.540 And it's like that I feel like in the world that are kind of stressing people and making
00:11:23.620 people nervous.
00:11:24.780 But one of the things is I felt like it was supposed to be America first.
00:11:27.560 Like we're focusing on like, what are we doing to get things back into America, right?
00:11:31.760 To like increase like the purpose of being an American, to refill like our hearts with
00:11:37.920 blood and like, and, and, and, and make us feel something again here and make us be excited
00:11:43.500 about being an American.
00:11:44.460 Right.
00:11:44.780 And I felt like that was what a lot of the energy was for.
00:11:49.220 And then now that we're caught up here and it feels like we are just working for Israel.
00:11:54.820 So I think to a lot of people, it's, I don't know, you just really start to feel very disillusioned
00:12:02.440 pretty quickly.
00:12:03.500 And, and I don't mean disillusioned in people.
00:12:05.580 I don't mean disillusioned in like the human beings, but this disillusioned in our leaders,
00:12:11.160 right?
00:12:11.440 You start to feel like, I don't know, at one point, neither one of these parties is helping
00:12:16.380 us.
00:12:16.960 Well, they both seem like parties that have gotten us into war.
00:12:20.680 I mean, $6 trillion it cost us into Iraq and Afghanistan.
00:12:25.620 And we need now an anti-war movement in this country.
00:12:29.620 And I don't think it has to be Democrat or Republican.
00:12:32.240 I think Americans are sick of war.
00:12:35.060 Here are the two things I think we could do.
00:12:37.000 Look on how do you end the war in Gaza?
00:12:41.060 Yeah.
00:12:41.300 That's a great question.
00:12:42.160 How do you, yeah.
00:12:42.720 What would you do to curtail things right now in Gaza?
00:12:45.260 Do you think they're really going to aim for like a two-party state over there?
00:12:48.520 To me, it looks like they're just going to try to sweep these people into the ocean,
00:12:53.100 you know?
00:12:53.960 Which I hate to say that.
00:12:54.940 The problem is you've got Netanyahu who is listening to his right wing, right?
00:13:00.020 Ben-Gvir and Smotrich.
00:13:02.700 These are two guys.
00:13:03.940 Smotrich is the other guy's name?
00:13:05.040 Smotrich.
00:13:05.380 Smotrich.
00:13:05.960 Smotrich, bring him up.
00:13:08.060 There he is.
00:13:08.660 Smotrich.
00:13:10.040 Bezalel Smotrich, yeah.
00:13:11.140 And these guys are basically telling, they're telling Netanyahu that if you stop bombing
00:13:18.360 Gaza before all of Hamas is killed, we won't support you for prime minister.
00:13:25.660 So if Netanyahu stops, he stops being prime minister.
00:13:29.780 Oh, I see what you're saying.
00:13:30.500 So he's in a bit of a catch-22.
00:13:31.940 He's in a catch-22.
00:13:33.220 But why don't, I don't understand if they're, because you always hear great things about
00:13:36.240 the Israeli military, right?
00:13:38.220 The IDF, you always hear, like, how amazing they are.
00:13:41.080 And I've met a lot of former people that have been in it and guys who were, like, in the
00:13:45.200 Krav Maga, some really talented warriors, really.
00:13:49.700 But then why not send, like, your Navy SEALs to get the actual Hamas people?
00:13:54.300 I don't understand this.
00:13:55.720 If you're so precise, then what is this, you're bug spraying this group, these people, you know?
00:14:02.580 Like, they got them in the first three months.
00:14:05.840 They destroyed most of Hamas's military capability.
00:14:09.080 But the reality is, Hamas is both a military movement, but it's also partly a political
00:14:15.520 movement.
00:14:16.100 I mean, there are people-
00:14:17.360 Some people in Palestine elected Hamas because they thought they needed a desperate political
00:14:21.700 party.
00:14:22.200 Exactly.
00:14:22.860 Right.
00:14:23.220 So then the question is, well, what do you do?
00:14:24.780 Because I agree with you that Hamas can't be governing Gaza.
00:14:28.020 And so there is a solution, and the solution is the president, Donald Trump, can call Netanyahu
00:14:33.880 and says, okay, enough with the bombing.
00:14:35.880 You know, we're giving them right now 2,000-pound bombs to hit these civilians.
00:14:41.900 We're giving them-
00:14:42.620 There's a video the other day.
00:14:43.760 Sorry to interrupt you, bro.
00:14:44.920 No, please.
00:14:45.360 Can you see if you can pull that up, Trevor?
00:14:46.920 And there was a video of a child picking up the casing of a missile and reading it, I think.
00:14:55.440 See if you find that.
00:14:58.020 Yeah, we can't find it.
00:14:59.700 That's okay.
00:15:01.580 Do you think there's any chance of Palestine getting its land back there?
00:15:05.940 I do.
00:15:06.920 But first of all, I mean, you talked about this child getting a missile.
00:15:10.180 I mean, how would it make you feel that our tax dollars actually were paying for a lot
00:15:15.800 of the missiles and the weapons that are being used to now kill Palestinians?
00:15:21.380 I agree.
00:15:21.760 I think we've given them $12 billion this year, Israel allegedly.
00:15:25.460 Can you look up what amount of war materials that we've given them, Trevor?
00:15:33.080 The United States has provided significant military aid to Israel throughout 2025.
00:15:36.960 The Trump administration, which took office in January 2025, has approved nearly $12 billion
00:15:43.640 in major foreign military sales to Israel.
00:15:47.360 This includes nearly a $3 billion arms sale.
00:15:50.040 Does that mean we're selling them to them?
00:15:51.440 We're selling part of them to them, and then we've also given aid.
00:15:54.940 Now, I voted against the offensive weapons aid about last year, but we approved it.
00:16:00.640 So some of it is in aid.
00:16:02.400 Some of it is in sales.
00:16:03.660 Trump is giving them 2,000-pound bombs that they can be using.
00:16:09.080 So here's, though, what I think that I'm hoping he or others may be listening to this,
00:16:14.420 because your voice really matters, because you know a lot of people who voted for him,
00:16:18.020 and they voted for him because they wanted American patriotism.
00:16:21.160 They wanted to focus on building our industrial base.
00:16:23.900 They wanted us out of these wars.
00:16:25.860 Donald Trump can call Netanyahu, and he says, enough.
00:16:29.260 We're not going to be giving you more offensive weapons to kill more civilians in Gaza.
00:16:35.020 But here's what I'm going to do.
00:16:36.620 I'm going to convene Saudi Arabia.
00:16:40.040 I'm going to convene Jordan.
00:16:41.120 What does convene mean?
00:16:42.380 Convene mean bring them together.
00:16:44.100 Okay.
00:16:44.580 Bring them together.
00:16:45.720 We're going to get the Arab states together, Jordan, Egypt, Saudi Arabia, people in Palestine,
00:16:53.260 Fatah, Israel, the United States.
00:16:55.920 And we're going to say we need a new government in Gaza.
00:16:58.500 It can't be Hamas.
00:16:59.980 You know, Hamas is a terrorist organization.
00:17:02.760 We're going to get rid of Hamas.
00:17:04.740 And we're going to make sure that this new government has Palestinian voices,
00:17:08.960 but also has Saudi, Jordan, Egypt, part of it to make it secure so that they don't attack Israel.
00:17:16.820 Because Saudi doesn't want to attack Israel.
00:17:19.180 You know, he could win the Nobel Prize.
00:17:21.540 You know, he wants to always win the Nobel Prize.
00:17:22.980 Donald Trump?
00:17:23.580 Donald Trump could win the Nobel Prize.
00:17:24.520 How could he win it, do you think, Ro?
00:17:25.700 I think if he comes in and he says to Netanyahu, stop.
00:17:29.220 Stop this.
00:17:30.440 I'm actually the president who's going to bring peace.
00:17:32.720 And here's how we're going to do it.
00:17:34.480 It's not just going to be Israel, the United States, and Palestine.
00:17:37.080 Because we agree Hamas can't be anywhere running Gaza.
00:17:41.000 I mean, Hamas has called for the destruction of Israel.
00:17:43.420 We have to stand for Israel's right to exist as a Jewish democratic state.
00:17:47.640 I mean, that is a principle we have to respect.
00:17:50.660 But we're going to get Saudi Arabia.
00:17:52.700 And they have to respect Palestine's right to exist, right?
00:17:55.880 They have to respect Palestine's right to self-determination and have a Palestinian state in Gaza and in the West Bank.
00:18:02.000 And you can get those two states.
00:18:05.000 And with land swaps, you could actually make it that even a lot of the settlers are still going to be able to stay in some of the places.
00:18:13.600 I mean, you could get to a lot of the deal we know that needs to happen.
00:18:16.980 But Trump has a good relationship with the Saudis.
00:18:20.600 We have a good relationship with Jordan.
00:18:22.920 We've got a good relationship with Egypt.
00:18:24.980 Get all of these countries on board and say, okay, what is the new Palestinian leadership going to look like?
00:18:31.140 Saudi, Egypt, Jordan, are you going to guarantee?
00:18:33.360 Qatar too.
00:18:33.980 Qatar.
00:18:34.460 My boys over there.
00:18:35.680 I was just over there, man.
00:18:37.380 They had us all dressed up like a bunch of Ku Klux Klans.
00:18:39.600 You were in Qatar?
00:18:40.420 Yeah.
00:18:40.820 What were you doing in Qatar?
00:18:42.000 They had me come and perform for the troops.
00:18:43.980 Wow.
00:18:44.540 It was really great.
00:18:45.160 It was pretty special, but sorry, I interrupted you just to get a joke in, but also was excited
00:18:50.520 and had a great time there.
00:18:51.700 They really treated me very nice.
00:18:52.980 People thought I was a spy.
00:18:54.020 I thought maybe they were going to-
00:18:54.920 Well, you could be part of this old Middle East plan.
00:18:56.820 Maybe you'll go perform if they get a state there.
00:19:01.660 I mean, look, what bothers me is that we know what we need to do.
00:19:05.480 Right.
00:19:06.040 So why is Trump afraid?
00:19:07.180 Is Donald Trump afraid of Netanyahu?
00:19:09.140 I don't understand that because Trump certainly feels like he is a guy who,
00:19:15.160 does not seem afraid to say what he wants to say.
00:19:18.260 I don't think he, I don't know what it is because I want Trump to do the right thing.
00:19:23.160 Like I want him to start to end the war in Gaza to actually figure out how we get peace
00:19:30.300 there.
00:19:30.580 And I think he could, he could, if he picks up the phone and he says, Netanyahu, you're
00:19:35.020 done.
00:19:35.800 Netanyahu is going to stop the bombing.
00:19:37.540 And then Netanyahu is going to say, okay, but we got Hamas.
00:19:39.920 I can't stay here with Hamas.
00:19:41.380 And Trump can say, okay, I'm going to help you get rid of Hamas, but you can't keep
00:19:45.800 bombing and killing women and children.
00:19:47.520 That's not getting rid of Hamas.
00:19:48.940 Yeah.
00:19:49.080 It's fricking crazy, dude.
00:19:50.640 I mean, it's just like.
00:19:52.280 And so you get Egypt, you get Jordan, you get Saudi Arabia.
00:19:54.860 I call it the 23 state solution.
00:19:57.460 You get all of those Arab states.
00:19:58.640 Put Mississippi.
00:19:59.520 Dude, I'll tell you this.
00:20:01.080 We can give Mississippi to Palestine.
00:20:03.020 I've been saying that for 15 years.
00:20:04.800 Have them come over, have Mississippi.
00:20:06.980 Have Mississippi?
00:20:07.540 Yeah, have the Palestinian people come and move there if they want.
00:20:10.400 Yeah, but they want Palestine.
00:20:11.760 I understand.
00:20:12.380 And I do understand that.
00:20:13.420 I was just trying to give them something that I think everybody could, you know, because
00:20:16.800 Mississippi could use an economic boom.
00:20:19.920 You know, they could use, I think, some.
00:20:22.360 You're for Palestinian immigrants coming, maybe.
00:20:24.440 Yeah.
00:20:24.980 Well, that's what I'm saying.
00:20:25.540 With Palestinian immigrants coming, it would just kind of like add a new texture to that
00:20:30.120 whole state.
00:20:30.560 I think it needs like a rebirth of thought.
00:20:33.960 You know, they have a lot of great arts there, but I think they need.
00:20:35.800 They had Faulkner.
00:20:36.420 Oh, yeah.
00:20:37.220 They had some greats.
00:20:37.900 They had Eudora Welty.
00:20:38.880 They had Faulkner.
00:20:39.900 They had Jaxson Dart.
00:20:42.480 I mean, they've got some great Morgan Freeman.
00:20:44.660 I mean, those are easy ones, some of them.
00:20:46.140 But no, I'm just saying if they had the Palestinians there, it would be interesting.
00:20:49.120 So I've just always thought that that would be interesting.
00:20:52.640 But I think your voice on this really matters because no one sees you as sort of like some
00:20:57.660 campus activist shutting down university buildings.
00:21:01.460 You're talking to regular people.
00:21:02.880 You're talking to people who are truck drivers and blue-collar workers.
00:21:06.800 And you're saying, you know what?
00:21:08.560 I'm always for the little guy.
00:21:10.260 That's – from the stuff I've seen about you, if there's like one Theo Vaughn political
00:21:15.520 philosophy, it seems like you're for the people who are being bullied.
00:21:19.320 You're for the people who are kind of shut out.
00:21:21.120 And that's how you're seeing this Palestinian issue.
00:21:23.980 It's not because you're – I mean, it's not Jewish or Palestinian or Muslim.
00:21:27.620 It's like how can we not be for both people and now you're the most powerful country in
00:21:33.480 the world and a lot of people you know supported Donald Trump and he has an opportunity to bring
00:21:39.700 the suffering to the end and make peace.
00:21:43.420 Hey, everyone.
00:21:44.340 I've been telling you about MoonPay.
00:21:48.600 Yep, MoonPay.
00:21:50.260 That's what I've been using because I've been getting back in the crypto, you know.
00:21:55.220 I've been tiptoeing in the crypto and you feel me?
00:21:59.060 Yep, MoonPay has got me back.
00:22:01.360 MoonPay is one of the most user-friendly apps out there for buying and selling crypto.
00:22:05.960 People call it the PayPal of crypto because just like you use PayPal to buy almost anything
00:22:12.460 online, you can use MoonPay to buy almost anything in crypto.
00:22:17.800 Nearly every crypto app you've heard of uses MoonPay and it works with credit cards, debit
00:22:23.800 cards, Venmo, and even PayPal.
00:22:26.080 Now, here's where it gets really interesting.
00:22:29.820 MoonPay powers payments on BonkBot, a cutting-edge AI trading bot.
00:22:35.760 Designed to optimize your crypto trading strategy with advanced algorithms.
00:22:41.380 Whether you're trading 24-7 or looking for a more hands-off approach, BonkBot ensures
00:22:47.000 you make data-driven, intelligent trades.
00:22:49.940 It's your AI-powered partner for smarter, more profitable trading.
00:22:54.240 You can top up with MoonPay and its many payment methods on BonkBot.
00:22:59.280 Remember, while MoonPay makes buying crypto straightforward, it's essential to do your own research and
00:23:04.880 understand the risks involved.
00:23:06.940 Crypto trading can be volatile and you could lose your investment.
00:23:10.440 MoonPay is a tool to facilitate your transactions, not a source of financial advice.
00:23:14.900 Trade responsibly.
00:23:16.820 Hey there, Summer Side Hustle.
00:23:19.240 It's getting hot, the birds are chirping, and you might have that itch to make some extra cash.
00:23:26.300 Go on a little vacation, whatever.
00:23:28.000 One thing that's really helped me with that is using Shopify to run my merch store.
00:23:34.060 That's what we use.
00:23:34.880 It's simple because if I'm out of town, I can just check on my phone and see what's going on.
00:23:39.640 Right there, boom.
00:23:40.520 I can see what's being ordered, what's not being ordered, how much, how, when, why, etc.
00:23:47.000 Yep.
00:23:47.640 Over the years, Shopify has been there as we moved out of a basement over there in Akron, Ohio,
00:23:53.480 and bigger areas to hold our merch.
00:23:57.200 Shopify makes it so easy for me to start and run my business, and it will for you too.
00:24:02.060 So if you're ready to build your own little empire or just start selling seashells by the
00:24:07.460 seashore, whether it's merch, a new product, or your new idea, go to shopify.com slash T-H-E-O
00:24:15.080 and make it happen.
00:24:16.560 That's right, shopify.com slash Theo.
00:24:20.440 So, you know what I realized the other day?
00:24:23.620 Because I've been thinking a lot about like voice, right?
00:24:26.220 Yeah.
00:24:26.560 Like what does it mean to have a voice?
00:24:28.180 And I don't mean like me having a voice in the world, but just even have a sound that
00:24:31.780 comes out of my throat that means something that's connected to my heart or to my feelings
00:24:35.780 or to my thoughts.
00:24:38.880 You know, and it's just, it's, you know, it's a gift.
00:24:41.220 It's interesting to have.
00:24:42.260 And then what it feels like when you're afraid to speak, right?
00:24:47.040 Like I feel like my whole life when I was a kid, I was just, I just, I wanted to have
00:24:51.460 a voice for myself.
00:24:52.500 You know, I just never had a voice.
00:24:54.540 We were in a pretty traumatic place when I was a kid.
00:24:56.840 And so I just never had, I couldn't even, I didn't even have any feelings to put together
00:25:01.440 to share what I wanted to share.
00:25:03.380 Right.
00:25:03.960 So I think like, um, like I'll just, I'll always go for the underdog.
00:25:09.040 I think that's all it is.
00:25:10.140 Dude, I will have my, my team.
00:25:11.680 That's kind of like your philosophy, right?
00:25:13.440 Look, my team will be in a series, right?
00:25:15.660 A best of five.
00:25:16.980 Yeah.
00:25:17.260 And if the other, and if the other, and if our, if my team goes up two to one, I'll
00:25:20.620 start cheering for the other team.
00:25:21.860 I just, for some reason, I always cheer for the underdog.
00:25:25.300 Um, you know, I grew up, uh, in Philadelphia, there was the Rocky story, right?
00:25:29.140 And that's, that's the American thing.
00:25:30.620 Like Rocky is the underdog.
00:25:32.080 Like, we're supposed to be the ones there to help, right?
00:25:34.980 We're supposed to be the ones there to help.
00:25:36.700 And if your country isn't even going to do that at a level, that's grandiose, right?
00:25:42.180 Then what, what makes you want to go home and do that at your own level?
00:25:46.560 You know, like what you'll, you'll still do that, but you won't do it in feeling that
00:25:49.980 you are supported by your country.
00:25:52.820 And, uh, yeah, I don't want people to go die over there.
00:25:55.320 I don't think that it's, it doesn't feel right to me.
00:25:58.140 Um, and that's okay.
00:25:59.440 Some people might be like, well, you're not a political pundit.
00:26:01.420 I don't give a shit, but you know what, you know, I don't care if I am or not.
00:26:04.700 I am a human being and you're a citizen and I am a citizen.
00:26:07.580 And I, some people are like, well, you're lucky to be a citizen.
00:26:09.740 Yeah, but I have to, I'm a citizen.
00:26:11.040 You know what I'm saying?
00:26:11.680 Like I'm a citizen.
00:26:13.200 I think the native Americans should get things back.
00:26:15.280 And you know what?
00:26:15.660 Your voice hasn't been corrupted.
00:26:19.640 I think part of the challenge in politics is you get surrounded with so many of these,
00:26:24.380 uh, interest groups.
00:26:25.880 You get surrounded by how you're going to raise money.
00:26:28.880 You get surrounded by all these foreign policy thinkers.
00:26:32.040 Dude, we're working together.
00:26:33.260 I didn't even realize it, Roe.
00:26:34.100 I hate to interrupt you.
00:26:34.820 But right now, my question is.
00:26:37.220 Yeah.
00:26:38.680 Roe, I, I, one thing I like about you is I feel like you're a black sheep.
00:26:41.760 You take a lot of unpopular stances.
00:26:44.000 That's something I really admire.
00:26:45.320 One of those is not accepting money from lobbyists or PACs.
00:26:49.760 We hear about them a lot of times.
00:26:50.940 Some of my listeners, you hear about these PACs.
00:26:53.020 It means political action committee, right?
00:26:55.840 And I'm reading this.
00:26:56.620 Just give me a break, guys, uh, real quick.
00:26:59.140 Uh, we are at a point, I believe, where a lot of people are starting to see that neither
00:27:02.380 party really represents the people.
00:27:04.560 That's the, that's the feeling that I'm hearing people say.
00:27:07.900 Um, I've heard Candace Owen say it.
00:27:09.580 And, you know, she's, you know, very much been a kind of Republican role model, I feel like,
00:27:13.920 or at least, uh, have some influence of that.
00:27:16.740 I feel like she's also kind of in a lane of her own.
00:27:19.300 Um, parties have been unable to keep businesses and lobbyists away from politics.
00:27:04.560 In 2025, lobbying was at an all-time high of $4.4 billion.
00:27:28.120 I believe that's a couple of sites said that online.
00:27:30.540 The sites could be lying.
00:27:32.060 Um, how can we know who is accepting lobbying money?
00:27:36.620 First of all, how do we know that?
00:27:38.100 Yeah.
00:27:39.060 So first of all, take a guess how many lobbyists there are in Washington right now.
00:27:43.920 Ballpark.
00:27:44.780 12,000.
00:27:46.660 Yeah.
00:27:47.240 10,000.
00:27:47.940 That's pretty good.
00:27:48.600 10,000.
00:27:49.640 They're hiding a couple thousand.
00:27:50.940 You know it.
00:27:52.400 10,000.
00:27:53.160 Think there.
00:27:53.460 500 members of Congress.
00:27:54.860 That's 535, right?
00:27:56.900 Members of Congress and senators.
00:27:57.840 That's 20 to one, right?
00:27:58.920 You play football.
00:27:59.740 Like imagine you got 20 people, your wide receiver, you got 20 people covering you.
00:28:03.840 Yeah.
00:28:04.340 Now, let me give you a concrete example of how this stuff works.
00:28:08.580 Pharma lobbyists, right?
00:28:09.700 Just big pharma.
00:28:10.520 There are about 1,200 big pharma lobbyists.
00:28:13.640 So it's about two and a half to one for every member of Congress.
00:28:17.820 I'm sure you know people who want to get some drug, prescription drugs I'm talking about,
00:28:23.740 and it's too much money, right?
00:28:25.260 And they skip it.
00:28:26.060 Do you know folks?
00:28:27.300 Oh, yeah.
00:28:27.620 Totally I do.
00:28:28.180 So Donald Trump, one of the good things he does is he puts out an executive order and
00:28:34.260 he says Americans shouldn't pay more for any of these prescription drugs than people in
00:28:39.560 other parts of the world.
00:28:40.660 Right.
00:28:41.120 You know, I've had people who like go to a foreign country to get the drug because they
00:28:45.060 don't want to pay them.
00:28:46.640 They can't afford to pay the drugs.
00:28:47.920 Oh, yeah.
00:28:48.400 One night I'm hanging out with a girl and, you know, we've been on a couple of dates.
00:28:51.620 I thought we were going to maybe smooch a little.
00:28:54.080 It's 11 p.m.
00:28:55.740 She's like, I have to drive to Mexico to get Ozempic.
00:28:59.400 Are you serious?
00:29:00.500 I'm not even joking.
00:29:01.800 She's like, I have to drive to Mexico to get Ozempic.
00:29:04.000 And she's like, if I leave now, the hours are best for driving there and back.
00:29:06.740 And I was like, oh, gosh.
00:29:07.780 The weight of her heart was like getting her this drug.
00:29:10.540 I'm not riding with her, you know, but I definitely, I gave her a little bit of gas money,
00:29:13.780 you know, because I hope to see her again.
00:29:15.020 But anyway.
00:29:16.360 Anyway, it's crazy.
00:29:17.980 Right.
00:29:18.100 I mean, we're paying for all these drugs to be developed.
00:29:21.080 You and I pay the NIH to develop all these drugs.
00:29:25.460 Then the big pharma sells them at a profit to Americans.
00:29:30.180 And basically the rest of the world gets these drugs at a fraction of the cost.
00:29:34.080 Right.
00:29:34.200 So we're subsidizing their cost over there.
00:29:36.160 We're subsidizing their cost.
00:29:37.020 Now, someone could say, OK, Mexico or some of these developing countries, fine, they should
00:29:42.460 have these drugs cheaper.
00:29:43.380 But we're subsidizing Great Britain.
00:29:45.660 We're subsidizing Japan.
00:29:46.960 We're subsidizing Europe.
00:29:48.440 These are rich countries.
00:29:49.660 We're subsidizing Switzerland.
00:29:51.240 So Donald Trump, one thing that I agree with, he comes out and he says, Americans shouldn't
00:29:55.600 pay a higher price than any other industrialized country.
00:29:58.640 I'm the first congressperson actually to introduce his executive order as a law.
00:30:05.720 I said, OK, because the executive order, you know, the pharma just will sue it and tie it
00:30:10.520 up in court and nothing's going to happen.
00:30:12.060 Yeah.
00:30:12.160 Because he had another one with price transparency, right?
00:30:14.480 Exactly.
00:30:14.940 Where it was like all the all the prices of of of if you go get an MRI, whatever it is,
00:30:20.880 it needs to be like a menu.
00:30:21.780 Like you go to McDonald's, you know how much it costs.
00:30:23.720 Right.
00:30:23.980 Because without a menu, they can just charge you later whatever they want.
00:30:26.760 Right.
00:30:26.980 Which is part of the whole bait and switch of the insurance system as part of the whole
00:30:30.680 shell game.
00:30:33.060 That was another executive order.
00:30:34.560 But that hasn't been enacted.
00:30:36.840 Like, how do you we get these things to actually happen?
00:30:40.960 Like, why do they stop at an executive order?
00:30:43.120 Because there's always this pomp and circumstance.
00:30:44.960 Right.
00:30:45.020 It's an executive order, but it never gets followed through because it gets tied up in
00:30:49.780 courts and they get sued and they say, well, Congress didn't authorize it.
00:30:54.120 So I look, I actually give Trump some credit for putting it out there.
00:30:58.080 But he knows everyone knows it can't actually make a difference until it becomes a law.
00:31:03.440 And who sues it?
00:31:04.260 The pharma company?
00:31:04.740 The pharma company sue it.
00:31:06.120 They've got a whole industry of lawyers, of lobbyists, and they come out and they they
00:31:13.040 sue them and they're, by the way, they're making record profits and they say, oh, we
00:31:17.000 need this money to do research and innovation.
00:31:19.700 Give me a break.
00:31:20.480 They're there.
00:31:21.140 They have those profits.
00:31:22.400 They're buying back their stock.
00:31:24.020 They're giving out dividends.
00:31:25.760 Their executives are making tons of money.
00:31:27.940 Oh, yeah.
00:31:28.520 They're buying fricking Corvettes, dude.
00:31:30.580 They're doing it.
00:31:31.280 But they are.
00:31:31.920 They're all probably all have Corvettes and they don't care.
00:31:34.460 That's the thing.
00:31:35.040 They don't care.
00:31:35.660 But how so how do we stop that system?
00:31:37.960 Right.
00:31:38.740 First of all, do you believe that system can actually be stopped?
00:31:42.100 I do.
00:31:42.500 And I do.
00:31:44.360 And I'll give you two examples why, because we did it with Big Tobacco.
00:31:47.820 You remember that when tobacco basically had Washington, D.C. bought.
00:31:51.460 And then there was a campaign that said free kids from tobacco.
00:31:56.160 And bring that up.
00:31:56.860 I want to see a picture that some kid would have smoked if they let them.
00:32:00.860 They used to.
00:32:02.660 There it is.
00:32:04.140 Campaign for Tobacco-Free Kids, an American nonprofit membership organization
00:32:08.880 established in 1995 with Bill Novelli as its first president.
00:32:13.200 OK, so that happened in 1995.
00:32:14.700 And that had an effect.
00:32:16.100 Well, it was them.
00:32:17.460 It was grassroots organizing.
00:32:19.320 There was a hearing in front of Congress.
00:32:21.060 They brought in all these tobacco executives and basically the tobacco executives are seen
00:32:26.340 lying to the American public saying, yeah, we know that this stuff is addictive.
00:32:30.800 And that had a huge impact.
00:32:32.760 There were laws that got passed that said you can't sell tobacco cigarettes to kids.
00:32:39.160 You have to disclose the addictive quality.
00:32:42.860 So we stood up to Big Tobacco.
00:32:44.620 We also did it one other time in this country, famously.
00:32:47.440 I don't know if you know what DDT is.
00:32:49.860 DDT.
00:32:50.320 Yeah, definitely.
00:32:51.200 It's in the yard, brother.
00:32:52.400 Yeah, exactly.
00:32:52.780 That's what they say.
00:32:53.040 It's in the grass, right?
00:32:53.940 You know, it's the fertilizers.
00:32:55.100 Yeah.
00:32:55.400 And there was this woman, Rachel Carson.
00:32:56.960 She exposed that it was hurting your health.
00:32:59.140 It was hurting the birds.
00:33:01.840 It was hurting the ecosystem.
00:33:03.400 And that led to Congress saying, no, we're going to ban DDT for a lot of different uses.
00:33:08.480 We got to do the same thing in standing up with pharma.
00:33:11.320 How much money does pharma spend?
00:33:13.020 $380 million a year.
00:33:16.500 In lobbying?
00:33:16.900 In lobbying.
00:33:17.260 Or in commercials, you mean?
00:33:18.140 Just in lobbying.
00:33:19.060 Now, does lobbying include commercials?
00:33:20.460 That means paying to congressmen and reps.
00:33:23.060 That's paying to congresspeople and reps.
00:33:24.780 That's like money that is going to congressmen, to representatives, to state legislators.
00:33:29.900 So I introduced this bill, right?
00:33:31.600 Saying, okay, let's do what Donald Trump wants.
00:33:34.180 Let's do what Bernie Sanders wants.
00:33:35.900 Like, how can we disagree on this?
00:33:37.440 Let's take it to pharma so Americans don't pay more.
00:33:40.480 I get two Republicans who co-sponsor it, Representative Luna and Representative Biggs.
00:33:45.980 Bring them up.
00:33:47.460 Bring up Luna and Biggs.
00:33:48.680 First of all, and if that's not an animated series, I don't even know.
00:33:53.320 It sounds also like it could be a rat group, Luna and Biggs.
00:33:55.860 We got a shot of them?
00:33:56.720 So that's Andy Biggs.
00:33:57.580 He's a legitimate guy?
00:33:58.780 He's a legitimate guy.
00:33:59.780 I mean, look, I disagree on things, but he's willing to stand up for what he believes.
00:34:04.900 Representative Luna, she's willing to stand up.
00:34:08.780 Oh, dude, I was just looking at her online yesterday.
00:34:11.660 I like her.
00:34:13.360 I mean, I like what she says.
00:34:15.500 You know what I'm saying?
00:34:15.980 I like her energy.
00:34:17.060 She has an independence.
00:34:19.620 And I got a few of the Democrats to be on board with this.
00:34:23.000 Obviously, they don't want Americans being overcharged for prescription drugs.
00:34:27.120 I mean, this was part of Bernie's whole campaign.
00:34:28.940 And now people are – are there also people telling you that they got on board or did it come down to a vote?
00:34:33.880 How did it end up?
00:34:34.640 So right now we don't even have a – we don't have a vote.
00:34:37.180 We should have every single person in Congress co-sponsoring this.
00:34:40.260 The way Congress works is you got to get to about 100 people to co-sponsor it and then the speaker calls up a vote on the issue.
00:34:47.280 But because you got big pharma's lobbying money, well, they're so cynical.
00:34:53.320 They don't say – they don't try to stop – convince people to vote against it.
00:34:58.200 They try to stop the congresspeople from – the speaker from bringing it for a vote.
00:35:02.800 So they would like – they would invest in Mike Johnson?
00:35:05.240 They'll invest not just in Mike Johnson.
00:35:07.040 They'll invest in –
00:35:07.340 Mike, is that libel or slander for saying that?
00:35:10.440 No.
00:35:11.060 I mean it's just –
00:35:11.540 They would lobby with – they would try to lobby Mike Johnson.
00:35:14.320 They'll try to lobby Mike Johnson.
00:35:15.520 But you know there's this quote that I love from Dr. King.
00:35:18.680 He said, attack the evil system, not the individuals who happen to be caught up in the evil system.
00:35:25.200 OK. Got it.
00:35:25.640 So yeah, there's some egregious cases.
00:35:28.560 The problem is the system is rotten.
00:35:30.500 OK.
00:35:30.920 And the – they are pouring money into different campaigns, into different lobbyists, and basically are not going to allow a vote on this.
00:35:41.020 Because they don't – because they know if there's a vote and someone votes against Donald Trump and against lowering prescription drugs, the American people will be furious.
00:35:50.720 Are going to know who they are.
00:35:51.460 Are going to know who they are.
00:35:52.300 Right.
00:35:52.460 And then we can call them out by name.
00:35:53.900 So they just don't want anyone to do it.
00:35:56.220 And this is how – this is the game.
00:35:58.000 This is why we don't get Medicare for all, why we don't have health insurance.
00:36:01.020 Health insurance is not that we have – we haven't had a vote on Medicare for all.
00:36:04.140 This is why we still have fossil fuel subsidies.
00:36:07.540 We haven't had a vote to get rid of fossil fuel subsidies.
00:36:10.560 This is why we still have a defense budget that is over a trillion dollars.
00:36:14.460 We don't get votes on cutting out the excessive contracts.
00:36:19.320 I mean it's – the game is to stop the vote.
00:36:22.460 And so it – let me ask you – the thing I wanted to ask you actually the most in this conversation is do you know people who are just like, yeah, I'm done with politics because I don't think I can make a difference?
00:36:31.420 Oh, yeah, I'm saying – that's what I'm – that's what I've noticed in the past few months on some of my favorite shows that I listen to and watch.
00:36:40.420 People are like, oh, I see now neither – nobody has our side.
00:36:45.680 Neither one of these groups it feels like have our side.
00:36:48.860 You feel like it's just all these articles.
00:36:50.880 It's like, well, here's an article about this is going to happen and then people get in an uproar about it.
00:36:55.400 But nobody – it's almost like it's all algorithmed perfectly to the amount of our attention span.
00:37:00.760 Man, where it just fades out of like the zeitgeist of discussion.
00:37:07.060 And zeitgeist just means like the circle of discussion at the moment I think.
00:37:12.000 But so it's just like – it just fades and then the next ball hits the air and you're like, oh, look at that one.
00:37:16.580 That's it.
00:37:17.320 Transgender something, you know.
00:37:18.400 Right, right.
00:37:18.720 And then that one, then it's like, oh, racism again.
00:37:21.060 Here I am.
00:37:21.940 You know, don't forget about me.
00:37:23.980 And then it – but it's the same.
00:37:25.260 And after a while, you're just like – I've seen this show so many times.
00:37:29.740 I think even our DNA is sick.
00:37:31.860 So I think you're having newborn children that are like, this is fucking bullshit.
00:37:34.980 Because even – I think it's in our DNA now how long the charade has been going on.
00:37:39.420 And that's where I see people are like at a point where they're just at a loss.
00:37:45.060 How can that be stopped?
00:37:46.840 How can that sort of thing actually be stopped?
00:37:49.620 Can we do anything or does it have to fall on our representatives?
00:37:53.020 The representatives aren't going to change things.
00:37:55.240 And this is my biggest concern for the country, that people keep getting disillusioned.
00:38:00.960 They're like, I can't change anything.
00:38:02.500 I'm just going to focus on my own life.
00:38:04.500 Every two years, every four years, you get congresspeople, presidents saying they're going to do something.
00:38:09.160 They do some executive order.
00:38:10.700 It gets tied up in courts.
00:38:12.400 Big pharma blocks any reform.
00:38:15.260 And what's the point, right?
00:38:16.740 And then the problem is that the less people get engaged, the more power these lobbyists and moneyed interests have.
00:38:22.720 Because they don't want the engagement.
00:38:24.820 No.
00:38:25.360 They're perfectly fine with the system.
00:38:28.260 And they're like, OK, we will keep supporting our members of Congress and presidents.
00:38:34.460 Every four years, they'll run on change.
00:38:36.000 Nothing actually changes and let people get more and more disillusioned.
00:38:39.780 That's what's going on in this country.
00:38:41.120 I mean, look, I believe that we've entered like a privatized communism.
00:38:43.800 That's what I say a lot.
00:38:44.640 That's exactly what it seems like we're in, right?
00:38:47.180 And you're sitting here telling me that our vote doesn't even really matter.
00:38:50.720 So, yeah, people are getting disillusioned.
00:38:52.220 Like, but what do we do?
00:38:54.760 But that's why I gave that example of tobacco and DDT.
00:38:58.340 When the country actually mobilizes.
00:39:01.260 When they're like, we're done with this.
00:39:02.980 Like, what do we need to do?
00:39:03.680 Buy a Jeep or whatever?
00:39:04.900 Well, I think one of the things you do is like look at who's taking money from these interest groups.
00:39:12.240 OK, that is there a site that we can look at where that's reported?
00:39:15.860 Opensecrets.org.
00:39:17.320 Opensecrets.org.
00:39:18.760 Crack that bit open, can't it?
00:39:21.700 I know Chuck Schumer's on it.
00:39:23.840 You know, and everyone is.
00:39:26.100 Is that slander or not?
00:39:28.000 Yeah, I think you're I think you're pretty.
00:39:29.980 You're well, not even close to the line of slander.
00:39:33.600 We're not?
00:39:34.140 No, we're not.
00:39:34.500 Oh, good.
00:39:35.000 Oh, yeah.
00:39:35.420 That little money monkey's taking something.
00:39:37.420 But you can find it on OpenSecrets.org.
00:39:39.480 You can find out how much each person gets and from where?
00:39:41.840 You can find out how much each.
00:39:43.200 So I don't take any PAC money, right?
00:39:44.760 I don't take any lobbyist money.
00:39:46.260 But you can even find out individuals aggregated, how much they're getting.
00:39:50.240 And then if they're getting money, fine.
00:39:52.480 But then look at their voting record.
00:39:54.120 Are they voting against big pharma?
00:40:00.700 Are they voting against the big insurance companies?
00:40:05.940 So first is just being aware.
00:40:08.460 But the second thing is we've got to demand that people have a vote on getting lower prescription drug prices.
00:40:16.100 Who do we ask that for?
00:40:17.200 From Mike Johnson?
00:40:17.800 Mike Johnson.
00:40:18.580 Okay.
00:40:18.820 We got to say.
00:40:19.240 He's from Louisiana.
00:40:20.140 He's from Louisiana.
00:40:20.940 And he's.
00:40:21.340 So I'll say it right now.
00:40:22.440 Mike Johnson, why don't you put the vote up?
00:40:25.660 I mean, everybody wants this.
00:40:27.120 Everybody is tired of paying, of watching people.
00:40:30.020 Medical debt's the number one cause of bankruptcy in America.
00:40:34.760 We know that.
00:40:35.720 It's bipartisan, right?
00:40:37.120 It means both partisans like it.
00:40:38.520 Yeah.
00:40:38.940 I mean, Trump, it's Trump's bill.
00:40:40.440 I mean, Trump's executive bill.
00:40:41.820 Just help us, dude.
00:40:42.740 Give us something.
00:40:43.820 You know what I'm saying?
00:40:44.740 Two things he could do a vote on.
00:40:46.720 One is stopping the war in Iran.
00:40:48.640 I mean, that he can.
00:40:49.640 Massey and my bill, he has to give us a vote on it unless he plays games.
00:40:53.920 And then two, let's just focus on big pharma.
00:40:56.680 Getting big pharma.
00:40:58.400 Yeah, let's start with one lobby.
00:40:59.680 Let's start with one big lobby.
00:41:01.380 I think it's just the, it's like, give us something, man.
00:41:04.060 I was like, cause you know what I'm saying?
00:41:05.600 He's from Louisiana.
00:41:06.800 So like that, Louisiana is a state where it's like, it's mostly about like people, right?
00:41:11.360 Peer to peer.
00:41:12.140 It's like, it's like the state, it doesn't have a lot of, um, I don't think we have one
00:41:16.900 top, um, we don't have one Fortune 500 company.
00:41:21.060 I don't think in the whole state, maybe energy, but I think that they left.
00:41:23.500 So we're a low income state, but the one thing that we do have is each other's backs.
00:41:28.140 Right.
00:41:28.500 And I just can't imagine that Mr. Johnson wouldn't bring this to the floor, to the floor.
00:41:36.260 Is that it?
00:41:36.760 To the floor.
00:41:37.360 Yeah.
00:41:37.420 Bring it to the floor, Mike.
00:41:38.760 You know what I'm saying, bro?
00:41:39.920 Like if it, people are sick, people are, people are sick, man.
00:41:44.580 People are dying.
00:41:45.260 You don't want to bring it to the floor.
00:41:46.420 I think that's fucking bullshit.
00:41:48.280 Now, if you are bringing it to the floor and Rose not telling me the truth, then I'm sorry,
00:41:51.840 Mike.
00:41:52.680 I'm sorry, dude.
00:41:54.020 And I, I got a good relationship with Mike Johnson.
00:41:56.520 Mike's awesome.
00:41:56.900 But, but someone has just got to say, he's a cool guy, but he's got keys to the janitor's
00:42:02.620 closet and it's time to fucking get the, it's time to get the broom out and, and sweep this
00:42:06.620 shit out from under the rug.
00:42:07.760 That's what I feel like.
00:42:08.660 And you know what will happen?
00:42:09.540 If there's one time we take on one of these lobbying groups, whether it's big pharma,
00:42:14.220 the rest will get scared, big insurance, big, the rest gets scared.
00:42:17.080 And it's going to give hope to this country.
00:42:19.340 Yeah.
00:42:19.740 My voice matters.
00:42:20.980 Yeah.
00:42:21.180 I can get involved.
00:42:22.600 Yeah.
00:42:22.780 I can bring change.
00:42:24.240 I mean, we have been voting for change in this country since Barack Obama, every, every
00:42:28.700 four years, every two years, like kick them out.
00:42:30.780 Let's try someone else.
00:42:31.920 Yeah.
00:42:32.340 And, and no one.
00:42:33.400 And we put in a mixed guy, you know, it's like, dang, we got an Indian guy showing up.
00:42:37.780 They're trying everything.
00:42:39.540 It's like, they're even trying the Indian guy.
00:42:41.360 I mean, you know, the country is, the country is like, who do we need?
00:42:45.700 They don't care about Indian, black woman.
00:42:49.520 They're just like, get me someone who's going to actually bring change.
00:42:53.160 Give me, it's like, give me somebody with some balls.
00:42:55.140 That's one reason why I support a lot of trans stuff.
00:42:57.560 Cause I'm like, and maybe one of these women will have enough balls on them to fucking get
00:43:02.060 us to where we need to be.
00:43:03.200 Like if, if that's where we're at, where we have to attach balls to an emotional sense of
00:43:08.200 empathy that a lot of time is in a woman and that's who we need to get us over this hill
00:43:12.000 of bullshit that we keep climbing up, then maybe that's what we need.
00:43:15.760 And you know what?
00:43:16.560 It struck me when you said Louisiana doesn't have one Fortune 500 company, you know, I represent
00:43:20.900 Silicon Valley, you know, in my district, hold on, you know, my district, dude, we're
00:43:28.260 about to go in.
00:43:29.140 All right.
00:43:29.680 And you and I spoke on the phone about, this is crazy, bro.
00:43:33.580 Yeah.
00:43:33.920 Did you read this somehow?
00:43:35.640 Which?
00:43:36.020 Are you that Oz mentalist guy?
00:43:37.460 I mean, I have.
00:43:40.600 No, you're not.
00:43:41.460 I'm not.
00:43:41.940 Sorry.
00:43:43.080 It said, and I'm sorry, I'm going back to my notes and thank you, man, for talking about
00:43:46.500 this stuff with me, man.
00:43:47.580 It even helps us.
00:43:48.440 Like one thing I noticed, even we spoke on the phone the other night, man.
00:43:51.000 And it just like, after I got on the phone with you, I felt a little better.
00:43:54.640 I was like, at least I have somebody to talk about this with.
00:43:57.260 And dude, yesterday I'm, I'm leaving this restaurant and a guy comes up to me, he's
00:44:01.520 missing a tooth or he's, he was missing one.
00:44:04.580 And he was like, Hey man, thanks for speaking up about, uh, Palestine stuff.
00:44:09.140 And I was like, I said, well, I don't, honestly, I don't know a ton about it.
00:44:12.120 It's just kind of how I feel.
00:44:13.240 And he goes, well, I think a lot of people are just afraid to speak up.
00:44:16.440 They're afraid if they speak against Israel, that they're going to be labeled an anti-Semite.
00:44:20.480 And I was like, Oh no.
00:44:21.900 I said, I understand that, but I think that's just a verbal trap.
00:44:25.780 You know, I'd said, you can have, like, I have tons of Jewish friends that I talk to
00:44:30.620 every day and, and I don't think what, uh, Netanyahu is doing is good.
00:44:34.580 But anyway, don't undersell yourself.
00:44:36.500 You have more common sense and better values than 90% of the foreign policy blob that has
00:44:42.540 gotten us into all these wars and that has compromised our humanity.
00:44:46.600 There's a reason the founders wanted Congress to be making decisions about war and peace.
00:44:51.760 They trusted the American people, right?
00:44:54.000 Like the Kings of the past, when the founders were there, they would just do wars for their
00:44:58.660 own glory.
00:44:59.720 And the American people were really, the founders were really suspicious of that.
00:45:03.460 And they, so they said, you know what?
00:45:04.960 We actually trust the farmers and the people in factories and ordinary Americans not to get
00:45:10.380 us into all this stuff.
00:45:12.020 And they trusted people like you, Theo.
00:45:14.680 And the problem is that your voice has gotten shut out of Washington and, and, and not, uh, central
00:45:21.140 to it.
00:45:21.480 But now we've got all these podcasts and they can't ignore you.
00:45:24.420 So good for you to try to take back citizen voices.
00:45:27.500 That's what this country was supposed to be.
00:45:30.120 Well, yeah.
00:45:30.780 Thanks, man.
00:45:31.360 I can't, sometimes I'm like, what the fuck do I have to do with anything, dude?
00:45:34.660 That's a shit that like, I'm like, good.
00:45:36.740 You're a citizen.
00:45:37.620 Right.
00:45:37.960 Now that I agree with.
00:45:39.100 Corrupted.
00:45:39.620 But then you're just shocked and you're like, this me, like it's, it's just, I think
00:45:44.260 it's just always blows my mind.
00:45:46.220 I think the test in politics is like pretty simple.
00:45:48.480 There are two basic tests in politics.
00:45:50.420 Do you believe that the American people are smarter and wiser and have good values?
00:45:55.520 Or do you believe in like a bunch of experts and elites?
00:45:58.020 Like I fundamentally believe like the, if we just allowed and listened to people, normal
00:46:02.480 people, we'd be much better off in this country.
00:46:05.240 And the second is like, do you love and believe in this country?
00:46:07.800 And I do.
00:46:08.320 I think we're a good country.
00:46:09.480 I think we're the good guys, right?
00:46:11.120 Most Americans want to be the good guys.
00:46:13.080 And like, we've gotten away from that.
00:46:14.980 Yeah.
00:46:15.340 Cause we've got so many intermediate moneyed interests, lobbying interests, PhD, foreign
00:46:21.880 policy experts who are corrupting the process.
00:46:24.860 Oh yeah.
00:46:25.780 They're corrupting, man.
00:46:26.840 They're bought and paid for.
00:46:27.820 They're spies.
00:46:28.880 They're spies.
00:46:29.620 And the sick part is to me, they're just killing themselves.
00:46:34.240 They're killing the future of their children's imagination.
00:46:37.520 Think of all the little things that you are hampering.
00:46:40.800 But then I guess, you know, money, you know, when you see things happen with money and it's
00:46:46.460 hard to, to, to prevent.
00:46:48.560 So like you said, don't point at the person, try and point at the overall issue.
00:46:53.980 But I love that opensecret.org or com.
00:46:57.500 Opensecrets.org.
00:46:58.380 They can, they can do an analysis on everyone.
00:47:00.340 They can do it on me.
00:47:01.100 They can do it.
00:47:01.780 And you know, it's not like people know, there's no one who's perfect in American politics.
00:47:06.020 Let's look at Mike Johnson.
00:47:07.280 All right.
00:47:07.780 I'll look at him right now.
00:47:08.620 He's not going to be happy about it.
00:47:09.620 I'm sorry, Mike.
00:47:10.400 He's a good man.
00:47:11.080 He's a man of faith.
00:47:11.920 But like, like the whole point is like, maybe he ends up actually putting this for a vote.
00:47:17.020 I mean, that would be, see, but like all these packs.
00:47:19.760 Imagine being fucking Paul Revere.
00:47:22.620 You know what I'm saying?
00:47:23.200 That's what he has a chance to be.
00:47:24.460 Because then we could all hold these people like you, you fucking vote for this or you will
00:47:29.900 never be back in this thing again.
00:47:32.200 Seriously.
00:47:32.780 I think that's what people can do.
00:47:33.900 Why doesn't Open Secrets have an app that makes it really easy?
00:47:37.500 Is it very easy, the interface, for people to be able to navigate how to find out who
00:47:42.160 gets what and whatever?
00:47:43.240 Or is it kind of tricky?
00:47:43.820 It's pretty easy.
00:47:44.120 You can do it.
00:47:44.600 Like I would look at two things.
00:47:45.900 One, how much money do they get from PACs?
00:47:47.820 Which are the PACs?
00:47:48.700 How much money do they get from lobbyists?
00:47:51.000 And what are the industry groups?
00:47:53.580 Now, if they're taking money and they're still voting against those groups, fine.
00:47:57.720 But what raises red flags is when they're taking money and they're not bringing something
00:48:03.080 for a vote or they're voting with those industry groups.
00:48:07.520 Right.
00:48:08.040 This says top contributors for 2022-23 for Mr. Johnson, American Israel Public Affairs
00:48:15.460 Committee.
00:48:16.560 And that gets to the vote on Iran, right?
00:48:18.240 That's AIPAC, right?
00:48:19.180 That's AIPAC.
00:48:19.860 And like Massie and I have shown independence on that.
00:48:23.580 So, okay, bring this vote so we don't get into war on Iran and don't be beholden to groups
00:48:30.800 who are telling you, no, no, no, let's just go along with Netanyahu and strike Iran.
00:48:35.140 It's absolutely crazy.
00:48:38.140 And you know what the craziest thing is, is that we've had Thomas Massie scheduled to come
00:48:42.600 in for like a month, right?
00:48:43.860 He's great.
00:48:44.520 Yeah.
00:48:44.940 Well, he's one of the smartest members of Congress.
00:48:46.820 He's a maverick.
00:48:47.440 That's what I like.
00:48:47.860 He's a maverick.
00:48:48.000 I think you and him both are mavericks.
00:48:49.200 That's one thing that I admire about you is that you make choices that feel like your
00:48:52.560 own, right?
00:48:54.700 And the odds that you guys have a bill together and then you're here today and he's coming
00:48:58.740 tomorrow.
00:48:59.500 Oh, is he coming tomorrow?
00:49:00.300 That's awesome.
00:49:01.080 I wanted to put you on the same day so you at least get to say, hey, but maybe in the
00:49:04.160 future.
00:49:04.640 We've done stuff.
00:49:05.360 We've done a lot of stuff together.
00:49:06.420 We both have like people in our own parties who get upset because we take stances that
00:49:11.580 are independent.
00:49:12.300 We work across the aisle.
00:49:13.240 But isn't that what Congress is supposed to be like?
00:49:15.440 I mean, when you have a conversation with your buddies, do you always agree with like
00:49:20.000 three of them and then have three of the others always disagree?
00:49:23.440 It's like so irrational.
00:49:25.720 Usually like you'll agree with someone on one thing and you'll disagree on something
00:49:29.700 else.
00:49:30.220 And yet in Congress, it's like you got these two teams and they're always supposed to
00:49:33.680 be with their own parties.
00:49:35.260 That's not how people actually are in real life.
00:49:37.680 They want people who just think for themselves.
00:49:39.780 I agree.
00:49:40.040 And the craziest part is they're on two teams, but they should be playing for the same group,
00:49:46.000 right?
00:49:46.340 Team America.
00:49:46.940 Yes.
00:49:47.740 And the fact that there, it's like, it always feels, the craziest part is it feels like
00:49:52.500 it's you as a citizen against your own, against, that's what it feels like now.
00:49:57.780 It doesn't feel like this person, it feels like that in the voting, like they're going
00:50:01.480 to, and then it feels like it just changes.
00:50:03.760 Once they get over that hill, you never see them again.
00:50:06.960 Um, but Mike Johnson's a Louisiana boy and I think that he'll do the right thing, man.
00:50:11.280 And I, uh, but I could help, um, keep tabs on who votes for the right thing that they're
00:50:16.920 supposed to vote for.
00:50:18.160 Um, because it's, it's unbelievable.
00:50:21.040 It's, it's, we are literally, it's still DDT, but it's prescription drugs and we're killing
00:50:27.640 our own people.
00:50:28.420 You know, this show is sponsored by liquid IV.
00:50:33.220 I like staying hydrated, baby.
00:50:36.160 It's important to me.
00:50:37.240 It's important to me just to get, just dang, just get to a cellular level.
00:50:42.620 I like to hear them little babies getting drenched inside of me.
00:50:46.100 That's who I am.
00:50:47.680 And liquid IV.
00:50:49.880 Well, they help me.
00:50:51.100 Now, mostly I enjoy the sugar-free options that they have.
00:50:54.700 White peach, lemon lime, rainbow sherbet, and more.
00:50:59.580 They've got them.
00:51:00.160 And they just released their new sugar-free flavor, Arctic Raspberry.
00:51:04.480 I just love, I just get my water bottle, open it up, hit it with that, a stick of liquid
00:51:09.360 IV, bam, shake it up, doused, doused, and my cells are feeling hydrated.
00:51:17.820 That's it.
00:51:18.340 No matter what your summer brings, tear, pour, and live more.
00:51:21.880 Go to liquidiv.com and get 20% off your first order with code THEO at checkout.
00:51:28.140 That's 20% off your first order with code THEO at liquidiv.com.
00:51:34.340 Getting your sweat on might seem like hard work, but with Symmetry Sauna, it's a work of art.
00:51:42.700 Premium custom saunas for your home or business, plus a series of sleek pre-built saunas.
00:51:49.300 I just got myself a Symmetry Sauna, and I'm sweating out bad decisions, like the time I tried to just fill my own chipped tooth in.
00:52:00.700 Why sauna?
00:52:01.800 Well, it relieves sore joints and muscles, improves skin, boosts heart health, and melts stress like hot butter.
00:52:09.500 Hmm, and I'm finally sleeping like a baby, which is rare since I'm usually up at 3am wondering if penguins have knees.
00:52:19.180 Pro athletes, fitness buffs, big folks getting small, small folks getting big, everybody's hot boxing.
00:52:26.560 And Symmetry Saunas can help.
00:52:29.000 Designed in the USA, made with aspen wood from ancient Estonian forests.
00:52:34.720 Ooh, yep, fancy trees.
00:52:38.180 Symmetry Sauna, the perfect balance of form and function.
00:52:41.560 Learn more about how to get your own premium home sauna from Symmetry Sauna at symmetriesauna.com slash T-H-E-O.
00:52:52.020 Okay, so you and I spoke on the phone about the tech lobby, right?
00:52:56.040 And that was something that I had never even heard the term before, the tech lobby.
00:52:59.180 Really, yeah.
00:52:59.520 Never heard of it.
00:53:00.160 Your district in California, because you're a Democratic congressman, right?
00:53:05.780 From California?
00:53:06.680 Yeah.
00:53:07.040 Your district is home to five companies worth over $1 trillion.
00:53:10.720 Yeah.
00:53:11.040 That includes Apple, Google, NVIDIA, Tesla, and Broadcom.
00:53:15.140 While half the country is de-industrialized and dependent.
00:53:20.600 You guys are, you're in part of a, you represent a very special district.
00:53:25.160 Yeah.
00:53:25.340 Um, how big is that tech lobby, uh, who are the biggest players, and are there multiple
00:53:30.840 tech lobbies fighting against each other?
00:53:33.940 Yeah.
00:53:34.260 Well, first, let me just say that part of the problem in this country is that we've got $14
00:53:39.280 trillion in my district and five companies over a trillion dollars, and Louisiana doesn't
00:53:45.440 have a single company in the Fortune 500.
00:53:47.320 Like, how did we allow this to happen in America, where all the wealth is piling up in Silicon
00:53:51.540 Valley?
00:53:51.700 Oh, wait, and one, sorry.
00:53:52.540 There are two Fortune 500 companies headquartered in Louisiana, Entergy and Lumen Technologies,
00:53:57.280 and I'm sorry about that, guys.
00:53:58.460 But even that, right?
00:53:59.560 You got two companies, and you got $5 trillion companies in one district, and a lot of what
00:54:03.620 we've got to figure out is all these places like Johnstown, Pennsylvania, Lorain, Ohio,
00:54:09.280 Downriver, Michigan, places totally hollowed out.
00:54:12.000 You go there, and there's 30% hotel occupancy, and then they're looking at Silicon Valley making
00:54:17.180 more money than any time in human history in the world.
00:54:19.940 Like, how in America did we allow this to happen?
00:54:22.180 It's crazy, the income and cloud.
00:54:24.300 It's sad, and it's unconscionable.
00:54:26.740 But is there a tech lobby?
00:54:28.120 Yeah, there's a tech lobby.
00:54:29.440 Let me tell you one of the things that's been in the news that I know you've discussed.
00:54:33.920 Palantir, right?
00:54:34.900 And the contracts that they're getting, $113 million to create this database on Americans.
00:54:42.240 And I know you asked someone.
00:54:44.800 Yeah, I did.
00:54:45.980 I asked J.D.
00:54:46.840 Vance about Palantir because I'm scared of it, right?
00:54:49.020 Yeah.
00:54:50.060 This is just on my thing.
00:54:51.480 That's so crazy, dude.
00:54:52.460 We are like, this is bananas.
00:54:53.960 I asked J.D.
00:54:54.760 Vance about Palantir because I'm scared of it.
00:54:56.220 I felt like he gave kind of a political answer to me, right?
00:54:59.120 Which is no judgment.
00:55:01.080 That's what I felt like.
00:55:03.500 I want our listeners to know why I'm concerned.
00:55:06.820 Well, one of the reasons is that Palantir got a $795 million contract with the U.S. Army for the Maven Smart System, which is using artificial intelligence tools for data fusion and target identification.
00:55:21.880 And then a $30 million contract from the U.S. Immigration and Customs Enforcement to develop an operating system that identifies undocumented immigrants and tracks self-deportations.
00:55:31.860 So one of the things that's going on is that it feels like to me that they are asking Palantir – or they've granted Palantir the opportunity to create this overall database.
00:55:44.000 And that is one thing that J.D. did talk about.
00:55:45.940 So maybe he didn't give that just a political answer.
00:55:47.600 He said he believes that it's just an overall database where like the local police department will now have them in the IRS or everything will be synced up so that if you're involved in something in the world, everything will – all the databases will be linked, right?
00:56:05.720 Now, I felt like he would maybe know more about it because he has a relationship with this guy, Peter Thiel, who is one of the founders of it or something.
00:56:15.460 But I don't know if that's true, right?
00:56:18.060 And maybe he didn't give me a political answer.
00:56:20.860 I think it felt – I don't know.
00:56:23.640 You know what?
00:56:24.680 Maybe it's just it wasn't the exact answer I wanted, right?
00:56:28.240 And so that's why I'm kind of framing it like that.
00:56:31.920 So, yeah, I don't want to say that.
00:56:33.960 It just felt – yeah, I think it wasn't kind of what I wanted to hear.
00:56:36.800 But that's also part of diplomacy, right?
00:56:39.200 It's like I learned that recently.
00:56:41.220 It's like everything is – you're not going to get everything, right?
00:56:45.700 Nobody gets everything.
00:56:47.720 But anyway, that's what it said.
00:56:49.600 It was this overall database.
00:56:51.500 And to me, I'm thinking, dude, if they have an overall database, right, and they know everything, they're going to know everything.
00:57:00.760 To me, it feels like they'll just have this – like all your information, your bank card, your blood type.
00:57:07.220 They'll know if you prayed this morning.
00:57:08.980 They'll know just by like biometrics or whatever.
00:57:11.460 If you've hit your knees, they'll know if you're feeling hopeful.
00:57:14.620 They'll know all of these things, right?
00:57:16.460 What are things that we don't see coming from this Palantir deal?
00:57:20.100 Do you think that those fears are realistic?
00:57:22.140 What do you think?
00:57:23.460 Well, I'm concerned.
00:57:23.940 Let me tell you how I see the facts.
00:57:26.840 First of all, I don't want to dunk on the vice president because I saw some of the interview.
00:57:30.740 And I love the fact that you have people here of different viewpoints.
00:57:34.280 But he's got a particular view.
00:57:35.920 I mean Peter Thiel founded Palantir.
00:57:40.460 Peter Thiel also – I mean it's public record – put $15 million into his campaign.
00:57:44.940 Into his campaign.
00:57:45.620 Yeah, we talked about that.
00:57:46.400 To make him a senator.
00:57:47.220 So I'm not saying – I'm not questioning him.
00:57:49.100 I'm just saying he's got a view of – a relationship with Palantir.
00:57:53.940 And we've got to figure out what are the objective facts.
00:57:57.940 I mean the Palantir was founded by Peter Thiel and it was founded by CIA.
00:58:02.880 It's In-Q-Tel, right?
00:58:03.740 This is what funds it.
00:58:04.800 CIA is – what do you mean?
00:58:06.120 The CIA has a venture capital firm called In-Q-Tel.
00:58:09.360 In-Q-Tel.
00:58:09.880 They gave the money initially for Palantir.
00:58:12.280 Okay.
00:58:12.600 There are certain things that Palantir does that are important to our country.
00:58:15.740 They're using AI for military applications so that our fighter pilots can identify targets, right?
00:58:22.840 We need certain things to make sure that we're the leading military in the world.
00:58:27.360 The problem is on this database that they're creating about Americans, right?
00:58:32.420 And here's what they can do.
00:58:33.920 They're not collecting the data.
00:58:35.500 They keep saying I'm a data aggregator.
00:58:37.160 That means they're creating the program, the software, where the government puts in the data, your financial records, your health records, your employment records.
00:58:46.960 They can buy your social media records.
00:58:48.980 And they put it in and Palantir spits out information.
00:58:53.340 Now, let me give you a concrete example of why this could be dangerous.
00:58:58.280 Let's say you're a comedian.
00:58:59.460 You know comedians, right?
00:59:02.080 My guess is before you became Theo Von, there were times where as a comedian you did shows that made you maybe $5,000 a night.
00:59:10.220 And maybe then you'd go a while and not make anything or make $500, right?
00:59:14.380 You got good shows, bad shows.
00:59:16.300 Oh, yeah.
00:59:16.740 I mean we made $25, yeah, $100, yeah, for sure.
00:59:19.560 And I'm sure you still have friends who are in stand-up comedy and who have good nights and bad nights.
00:59:26.320 Now, Palantir, this database, suddenly flags and does an algorithm.
00:59:31.760 And if someone's having income here and there, they say, well, is this person a risk for tax evasion?
00:59:37.800 And suddenly they're getting audited because they're making a predictive model that comedians, stand-up comedians should be making $70,000 a year and this person isn't disclosing their income.
00:59:49.160 Are they a risk?
00:59:50.480 We don't know with Palantir what their algorithm is targeting.
00:59:55.520 Are they targeting ordinary people who have incomes that are variable?
01:00:00.460 Or are they targeting the billionaires who are evading taxes?
01:00:04.200 It's just a black box.
01:00:05.460 We don't know what data of your friends or my friends is being collected and whether they're actually creating a database.
01:00:13.860 All we need to do is have transparency.
01:00:16.460 I have three simple things that I think we should be able to get consensus around.
01:00:21.460 Your data should not be collected without your consent.
01:00:25.540 You need to know what data of yours the federal government has.
01:00:29.340 The federal government should not have any of your data that they don't absolutely need to provide you a service.
01:00:36.600 And then when Palantir, we should know what their algorithms are.
01:00:40.160 Are they doing this to go audit people who are like making $50,000 because they're accusing them of tax evasion?
01:00:47.060 Are they targeting their algorithms on the very rich or on ordinary folks?
01:00:51.040 What if you have a social media post that says things that are critical of the government?
01:00:56.400 Is that information now the government has?
01:00:58.300 It's like a whole black box.
01:01:00.020 Yeah.
01:01:00.340 It's very scary.
01:01:02.020 It's very scary too because what if another country says, hey, Palantir, you're our friends.
01:01:12.540 Why don't you target these people in America so we can start to bring them down, right?
01:01:16.420 Why don't you start to adjust their scores?
01:01:18.180 Hey, why don't you make it so that it makes it look like these people have already paid their taxes if they haven't?
01:01:26.520 You know what I'm saying?
01:01:26.960 There's just all these little things that could be plausible based on whoever owns the information.
01:01:31.540 Like whoever owns Palantir, whoever owns that algorithm is going to be able to basically puppeteer so many things that could be possible.
01:01:39.600 And even if they're not doing it, there's the fear that they could be doing it.
01:01:42.660 There's the fear they could.
01:01:43.400 And there are no laws right now, right?
01:01:44.860 We need laws saying show us your algorithms.
01:01:48.920 Have an audit of the algorithms.
01:01:50.600 Who are you targeting?
01:01:51.600 How do we make sure that you're not using this in a negative way?
01:01:55.980 And basically like Americans should know what data people have about them.
01:01:59.920 I agree.
01:02:00.800 I agree with you 100%, man.
01:02:02.640 And I want to say also I don't want like a company like Palantir to think like you're bullying me or you're doing this or that.
01:02:08.780 They just happen to be the company that has been picked and it's a scary time.
01:02:15.340 It's already so scary with like technology, everything happening so fast.
01:02:20.180 It's spooky.
01:02:21.520 And you have a nuanced view.
01:02:22.720 Look, I have a nuanced view.
01:02:23.800 There's things that Palantir does that are important for a military.
01:02:27.260 Like just saying, oh, Palantir is all bad would be unfair to what they're doing on the pandemic.
01:02:33.960 They helped with distribution or what they've done on the military.
01:02:37.380 The point is, though, this American database building is scary and trying to predict whether someone is being a tax delinquent or not, trying to predict if they're going to be a financial risk, trying to figure out a profile on them.
01:02:50.280 And what we need – I really blame the lawmakers because what we need is the laws to protect people's privacy, to protect people's data, to protect the use of these algorithms.
01:03:04.180 Yeah, I agree 100%.
01:03:05.060 And we're going to get to that.
01:03:05.760 I know you have the Internet Bill of Rights, and so we're going to get to that in a second.
01:03:11.220 Yeah, this is – the push – I want to say this.
01:03:13.420 The push has put a key Palantir product called Foundry into at least four federal agencies, including DHS, HHS, and others.
01:03:25.000 Widely adopting Foundry, which organizes and analyzes data, paves the way for Mr. Trump to easily merge information from different agencies, the government officials said.
01:03:33.580 And that's what J.D. Vance said when he was here, right?
01:03:35.620 He was saying – and I'm reading that off of The New York Times, give or take what you think about them.
01:03:42.340 But that's what J.D. Vance had said, that he thought it was just this overall compiling, this kind of aggregating of information, right?
01:03:49.000 He didn't think it would get super specific.
01:03:51.360 So I do – I don't think he – it felt a little politically, but that also could just be what he exactly believes, and that could be exactly what's happening.
01:03:59.280 I think there's just a lot of fear with it.
01:04:02.100 A lot of – it says in here also, creating detailed portraits of Americans based on government data is not just a pipe dream.
01:04:08.420 The Trump administration has already sought access to hundreds of data points on citizens and others through government databases, including their bank account numbers, the amount of their student debt, medical claims, disability status, etc.
01:04:19.900 So this will just – I mean I think this whole thing just goes further down that line of just like knowing everything about everyone, and I don't even think that it's Republican or Democrat.
01:04:32.360 I just think it's where we are in time right now with technology and that it's very scary and that you can start to see how big the tech lobby is.
01:04:41.220 I want to say this.
01:04:44.300 One thing about Palantir though that a lot of people have had issues with is what they're doing in Gaza, right, with AI targeting.
01:04:52.920 And this is off of – I read this online, and so that makes it probably uncredible.
01:04:58.280 But allegedly Palantir software is reportedly used by the Israeli military to help select targets in Gaza.
01:05:04.380 Its data mining and AI systems can process intelligence reports, communication intercepts, and surveillance data to generate lists of potential targets in a matter of minutes, a process that previously took hours.
01:05:18.720 Lavender and Habsora, which are two of their kind of I guess programs or overall idea names, and Habsora means the gospel.
01:05:28.480 Investigations indicate that two AI systems, Lavender and Habsora, are at the core of this collaboration.
01:05:36.160 Lavender assigns Palestinians a threat score based on metadata, social media, and movement patterns.
01:05:42.300 These systems reportedly operate with significant error margins, and their recommendations can be enough to authorize deadly strikes, sometimes with minimal human oversight.
01:05:51.380 Gaza is described as a live laboratory for these AI technologies where new systems are trained, tested, and refined in real-time combat conditions before being marketed globally as battle-tested solutions.
01:06:05.400 I think that's where a lot of the fear is.
01:06:07.860 It really is.
01:06:08.500 If that can happen there and they're capable of that, then how do I know if when I'm walking down the street that some decision I made years ago or something, a bullet's not just going to fly out of anywhere and go through my head from a drone or something, you know?
01:06:25.000 That, I think, is the big fear about this, and I think some people don't even know it.
01:06:29.080 I think the big fear with technology—
01:06:31.280 Is that crazy for me to say that?
01:06:32.460 No, I mean, I don't think it's crazy to fear that machines should not control human beings, right?
01:06:39.460 I mean, we—look, technology can do a lot of good.
01:06:43.100 It can do a lot of bad.
01:06:44.220 But the key principle here is, are human beings going to be in charge?
01:06:48.080 From what you described, it looks like that AI is being used.
01:06:52.220 They're selecting targeting.
01:06:53.600 And the biggest negative in that whole paragraph you read is without human oversight.
01:06:58.520 I mean, you still need a human being saying, okay, wait, we're not just going to go do strikes because AI tells us there's a risk.
01:07:06.120 There's some women and children there.
01:07:07.540 Let's not have those strikes.
01:07:09.000 There's a hospital there, right?
01:07:10.680 And I think that one of the things we need to adopt for the United States is there has to be a human being making decisions before any military use, any military strike.
01:07:20.800 Yeah, at least give me the fat guy with the donut sitting there and be like, nah, forget it.
01:07:25.020 Nah, nah, nah.
01:07:25.900 He's like, we're now going to be—we're celebrating.
01:07:29.080 We're celebrating.
01:07:30.160 He would—I'd rather have a human being, right?
01:07:31.860 Because there's some compassion.
01:07:33.120 Oh, I agree.
01:07:33.920 There's some sense of guilt.
01:07:35.900 AI is not going to feel guilty.
01:07:37.280 AI is not going to feel compassion.
01:07:39.040 AI is not going to have any accountability.
01:07:41.140 AI doesn't even go to sleep at night.
01:07:43.660 How can you trust something that doesn't even go to sleep at night?
01:07:46.340 Right.
01:07:46.620 You know?
01:07:46.980 That doesn't even give God a chance to rebuild the inside of it while it's resting.
01:07:50.960 That's pretty beautiful, yeah.
01:07:52.140 I do not understand this, man.
01:07:53.940 And that's very scary.
01:07:55.340 And that happens—it's happening, allegedly, I want to say that.
01:07:58.700 It's allegedly—you can go look it up for yourself—happening in Gaza that this—and to me, it does start to feel like that.
01:08:05.400 You're like, it has been such a massacre.
01:08:07.220 They are so overabundant.
01:08:08.700 I mean, it's basically people climbing into holes of rubble looking for pieces of their children and family members.
01:08:16.820 And it's like we're watching it all.
01:08:18.760 It really does feel like, am I now part of some sick experiment?
01:08:25.260 Like, how long do I sit and watch before I raise my hand or say something?
01:08:28.600 Or at least some part of me that has a voice inside of me, like, says, I don't want—just take me out.
01:08:36.520 I don't want to be in this game anymore.
01:08:38.020 I don't like it, man.
01:08:39.780 I don't like it.
01:08:41.900 We're going to keep it moving right there because we have some other stuff that I wanted to say.
01:08:45.660 Oh, I wanted to think about this with you.
01:08:47.180 At some point, like, if we lay this layer of technology over our society, right, and whether we do it now or it's in the future with AI, this all-knowing kind of layer that knows, like, if you've eaten, knows your blood pressure, knows how far you've walked today, knows how much money you have, knows the emojis you sent recently to kind of get an aggregate idea of how you're even feeling as a human.
01:09:10.520 Then that almost becomes your God, right?
01:09:15.420 In a weird way.
01:09:16.220 I know that some people say, nothing will ever be a new God for me.
01:09:21.300 I feel that, and I agree with that, and I respect that.
01:09:25.020 But if you're – say future generations, if they're raised under this cloud of that's all-knowing, then that – would they almost at some point pray to this algorithm that it would give them a lottery ball of some sort of thing or something?
01:09:43.920 Does that even – I mean it's kind of Ray Bradwellian or whatever, Bradwellian or whatever, or Brad Burwellian.
01:09:51.700 But does that make any sense to you, man?
01:09:53.940 Well, that's why we have to keep our humanity – you know, some of the stuff in Silicon Valley is crazy where these tech leaders really are trying to be godlike.
01:10:04.820 Let me give you two examples.
01:10:06.340 Have you heard of this thing called cryonics?
01:10:09.780 You know, where –
01:10:10.780 We had a guy on who did the freezing.
01:10:12.540 Yeah.
01:10:12.900 Oh, you did?
01:10:13.280 Years ago.
01:10:13.600 Yep.
01:10:13.780 I mean basically they want to like freeze your body if you die, if I die, freeze your body and your brain so that some future medical technology could bring you back to life.
01:10:24.020 Some pervert probably too.
01:10:25.260 And Peter Thiel has done this.
01:10:27.600 Reportedly, Sam Altman at OpenAI has done this.
01:10:29.880 Have signed up for it?
01:10:30.540 Have signed up for it.
01:10:31.200 That's the reporting.
01:10:32.820 Now, I'll tell you why that's so bad to try to be immortal.
01:10:36.260 One of my favorite speeches of all time was Steve Jobs.
01:10:39.880 Steve Jobs founded Apple Computers.
01:10:41.860 He dies of pancreatic cancer.
01:10:43.440 He gives this commencement speech.
01:10:44.880 And he says that life invented death and death is a change agent because death allows for the removal of the old guard and for new ideas and new creativity to come about.
01:11:01.300 Like you're supposed to make way for a new generation.
01:11:06.520 We're supposed to be mortal.
01:11:08.900 So you have these people who are trying to be immortal.
01:11:11.980 Now, why does that matter?
01:11:12.800 Because this is partly AI, right?
01:11:15.560 If you don't have a sense of your own mortality, your own limitations, you're not going to have a humanistic sense of putting people over machines.
01:11:22.680 A second example with these tech leaders.
01:11:25.920 They want to build these tech cities.
01:11:27.160 Have you heard of that?
01:11:28.200 I've seen like pieces in Memphis and stuff where people are getting sick because of like data being uploaded and stuff in their neighborhood and shit like that.
01:11:35.440 Yeah, I mean that's storage, you know.
01:11:36.660 That's a big issue of how these data centers are being built and whether they're going to cause pollution or dislocation.
01:11:43.040 But they really want to build these tech cities, not just a city.
01:11:46.400 It's like a secular genesis.
01:11:48.740 Like they want to build a new civilization with algorithmic norms, with morality that they're setting, not like that was set by God or years of civilization.
01:12:00.480 And it's scary.
01:12:02.060 It's scary because it shows an arrogance, a lack of humility.
01:12:06.900 My sense is, look, AI can be an incredible tool if it helps you figure out how many steps you walked and when you're sleeping and how to be nutritious.
01:12:16.880 But it has to be subservient to human agency.
01:12:23.760 We have to have laws that say ultimately human beings are in charge.
01:12:29.660 And we have to make sure that we're teaching people to have the critical thinking skills and the autonomy to be able to use these programs and not be subservient to them.
01:12:40.520 And that's – to me, that's the biggest question for humanity.
01:12:44.000 It's ironic because you're asking almost deeper questions than any of these Sunday talk shows in Congress.
01:12:49.260 Like in Congress, we're busy fighting about like who insulted who and like who's up, who's down in the polls.
01:12:55.220 And we just have this technology revolution that's going to change how all of us work, that is going to make people afraid if they're going to lose their jobs, that's going to try to create the super intelligence wondering whether we're making the decisions or machines are making the decisions.
01:13:12.020 That's what we should be talking about.
01:13:14.940 Right, and we should be talking about it in a way that is – and I guess there are ways to talk about it in ways that are exciting.
01:13:20.200 But I just think right now with what you have seen like – to me, the destruction and the stuff caused by Israel now, it makes it very scary right now with that company because you've just seen that.
01:13:36.180 And then you're now – that's the company that's going to oversee our country.
01:13:38.400 What's going to keep me from being a Palestinian to the next guy who pushes the button in the morning from his cell phone and then goes off on his yacht and just –
01:13:48.040 The constitution hopefully.
01:13:49.760 Right, right.
01:13:50.220 Hopefully the constitution.
01:13:52.540 Thanks for your patience today, Mr. Khanna.
01:13:55.760 Ro is fine.
01:13:56.580 Ro.
01:13:57.400 Rohit.
01:13:58.140 That's your real name?
01:13:58.600 Yeah.
01:13:58.740 Rohit?
01:13:59.120 Yeah.
01:13:59.420 How do you say it?
01:14:00.460 Rohit.
01:14:01.000 You know, when I was a kid, people used to – I still remember this.
01:14:03.800 They used to say, Rohit, Rohit, Rohit is on fire.
01:14:07.960 And then I figured out they were like making fun of my full name.
01:14:11.400 And then when I used to go up to hit in Little League, they used to chant, Rohit can't hit.
01:14:17.180 Rohit can't hit.
01:14:18.300 Watch the bunt.
01:14:19.680 But, you know, we have this in common, right?
01:14:21.940 I looked up the – you've got a long name that –
01:14:24.260 Oh, yeah.
01:14:25.040 What's your full name?
01:14:26.100 It's Rohit Khanna.
01:14:27.460 Rohit Khanna.
01:14:28.360 But, you know, we're both – I don't want to geek out because I was – I loved literature.
01:14:32.760 But it's like quintessentially American to, in some ways, change your name and to make your story, right?
01:14:40.480 Like my favorite character in all of American literature is this guy, Jay Gatsby.
01:14:44.780 And his name was James Gats.
01:14:46.640 He was from North Dakota.
01:14:48.160 And he said, wow, James Gats is not a sophisticated name.
01:14:51.220 So I'm going to call myself Jay Gatsby.
01:14:53.120 And he reinvents himself.
01:14:54.840 And in different ways, you, Theo Von – like no one would have thought you'd be Theo Von growing up.
01:14:59.840 No one would have thought this geeky Indian kid is going to go represent Silicon Valley.
01:15:05.080 And Rohit Khanna is going to become Ro Khanna representing the most economically powerful place in the world.
01:15:10.200 But, like, that's the story of America, right?
01:15:12.480 That's the story of self-invention.
01:15:15.200 And it's something that we've kind of lost.
01:15:18.180 It's why my parents came here.
01:15:20.240 And the question is how do we get that back?
01:15:22.780 Like how – you know, we've talked about all the negative stuff.
01:15:25.680 That's a good point.
01:15:26.140 But how do we get people believing in this country again, believing that we can do big things again?
01:15:31.820 Well, I think there's some ways – and we'll go into this now then because we're on my list here.
01:15:36.380 And I am using a list today.
01:15:37.440 That's okay because it's an important conversation.
01:15:42.380 Because of AI, job replacement has already started to happen, right?
01:15:47.600 What kind of worker protections can be implemented, do you think?
01:15:50.560 Because I know you have an internet bill of rights, and then we can talk about that right after that.
01:15:53.660 Or it may be part of it, but what do you think – because we should be protecting the human, right, at a certain point.
01:15:59.600 We should be protecting the human.
01:16:00.900 And we could go two ways with AI, right?
01:16:03.320 One way of AI is like this could create a lot of new jobs, right?
01:16:07.280 We could build new factories now that are productive in this country.
01:16:11.100 You could have someone in a classroom in Kentucky now who – because AI doesn't get to go to Harvard or Stanford but suddenly has the knowledge of the whole world at their fingertips.
01:16:20.240 And it can get that kind of education.
01:16:22.500 You can have someone working in a small rural town making $80,000 or $100,000 because they're a content creator.
01:16:30.420 Or you know how many content creators there are in this country?
01:16:33.860 1.5 million who make some money doing content creation.
01:16:37.360 Or they can do digital marketing or they can help AI start a small business.
01:16:42.180 So like my point is how do we get AI to lessen the wage gap, to create economic opportunities as opposed to just eliminating jobs?
01:16:52.160 The scary part is what if they're going to start to eliminate jobs?
01:16:56.500 Like if you're a truck driver, what if they say, OK, we're just going to have self-driving trucks?
01:17:01.160 That's why we need to pass law saying, no, I need a truck driver for safety.
01:17:05.720 Like would you fly on a plane right now without pilots?
01:17:08.900 You know, same thing.
01:17:09.720 I wouldn't.
01:17:10.860 So let's make sure we have basic safety in terms of recognizing that we do want workers and workers get to have a say in how to use this technology.
01:17:19.420 You know what the scariest thing is?
01:17:21.040 The unemployment rate for kids between 21 and 29 who have a college degree.
01:17:26.440 Everyone often says, oh, if you have a college degree, you're going to be employed.
01:17:30.300 Actually, it turns out if you have a vocational skill, skilled trades, 21 to 29, you have a 2% unemployment rate right now.
01:17:37.240 College degrees, 15%.
01:17:38.980 That could go up to 30% with AI, with these entry-level jobs.
01:17:45.860 And one of the things we need, I think we need the federal government to help incentivize people getting apprenticeships after they're done with a trade school or a college.
01:17:57.680 So they can actually get a job, make sure they're subsidized.
01:18:00.300 Create a government future workforce administration that hires them so they get a few years of experience and then can go on to the private sector.
01:18:08.540 Most importantly, everyone needs to understand technology and AI.
01:18:12.940 The old cliche is you're not going to be replaced by AI, but you will be replaced by someone who knows AI.
01:18:19.400 And in our schools, K through 12, everyone should have a technology class.
01:18:24.960 China is aiming by 2030 to have universal literacy on AI.
01:18:29.020 No, yeah, not to interrupt you, but I think – well, one thing you said here that was really great was anybody – I agree.
01:18:35.380 A lot of jobs are going to – it's going to be scary.
01:18:37.840 I'm amazed.
01:18:38.640 I was talking to my brother yesterday.
01:18:39.640 I was like – I said, bro, do you have your kids learning about AI right now?
01:18:43.440 Like are they learning about it?
01:18:44.520 Are they starting to use it?
01:18:46.240 My brother's like, no, I don't think so.
01:18:47.520 I said, dude, I feel like it's coming so fast that it could change things really fast.
01:18:53.240 Now, one thing you mentioned is anybody can learn this right now.
01:18:57.880 It's almost like a great reset in a lot of ways.
01:19:00.580 So you have the kid who couldn't afford to go to Harvard, right?
01:19:03.480 Yeah.
01:19:03.700 Who couldn't afford to go to MIT, whose father doesn't have the nepotism because we certainly live in like a nepotiety now in a lot of ways, whose father didn't have that capability.
01:19:15.080 Now that kid can get on a computer for probably six weeks or something, really focus and crack down and could probably get a job.
01:19:25.280 Yeah.
01:19:25.620 I was literally looking the other day.
01:19:26.700 I was like I need somebody who knows everything about AI, right?
01:19:29.920 I need somebody who can program and design and help with AI.
01:19:33.360 Like I want to start like an AI studio, right?
01:19:35.420 Oh, that's pretty cool.
01:19:36.240 And I got to do it fast because everybody else is going to be doing it, right?
01:19:39.880 You could have a new indie.
01:19:41.080 So even someone like you needs AI, right?
01:19:42.800 A hundred percent.
01:19:43.540 I feel every day, and there's a little part of me, like a little string that pulls inside of me, that gives me a reminder.
01:19:49.160 Like, hey, man, you have to get on top of this.
01:19:51.280 I know other things are pressing.
01:19:52.140 I hope kids who are listening to this hear that because a lot of people want to be like, how do I be Theo Von?
01:19:56.280 How do I be this guy with a podcast or do stand up, be a stand up comedian?
01:20:01.520 And even someone like you, you want to do that, you got to know AI.
01:20:04.960 Look, one of the things I say is, look, I was really good as an Indian kid at math, right?
01:20:08.540 Like every Indian kid is.
01:20:09.780 But you don't need math.
01:20:10.860 Like you don't need algebra and calculus anymore to do AI.
01:20:13.720 You don't have to.
01:20:14.420 Dude, you're going to shake up all of Madras by saying that.
01:20:18.080 You don't have to be like I was, like some nerdy kid who did like math leagues and, you know, got good grades on math tests.
01:20:26.400 Don't need it anymore.
01:20:27.780 If you want to be an advanced coder, programmer, fine.
01:20:30.420 But the amazing thing with AI is now it does all that for you.
01:20:33.480 You know what jobs are actually being eliminated?
01:20:35.520 The coder jobs.
01:20:36.560 The guys who did all the math.
01:20:38.240 The AI is doing the coding for them.
01:20:40.240 Imagine there's a guy sitting there building AI and it's only going to take his job.
01:20:45.220 So now like, okay, you don't have to be the kid who did algebra and calculus.
01:20:49.600 Not to say that you shouldn't learn math.
01:20:51.500 You should learn it for your own sake.
01:20:52.660 But even if you don't like math, now you can just learn this technology.
01:20:56.040 You just have to be comfortable with it.
01:20:58.280 How do you use it to do a business?
01:21:00.980 How do you use it to learn about the world?
01:21:03.840 And you can be the one asking it questions and prompting it.
01:21:06.860 And it can create a whole new level of jobs.
01:21:10.640 And you know what the exciting thing is?
01:21:12.220 You don't have to leave Louisiana. For so long we said,
01:21:15.680 okay, you got to go move to Silicon Valley, move to New York.
01:21:18.400 And people in Appalachia-
01:21:18.640 You have a gold rush or whatever.
01:21:20.280 And people in Appalachia were like, yeah, no, it's really beautiful here.
01:21:23.340 No, I really like living in my small town, in my rural community.
01:21:26.340 Like, no, I don't want to go live in New York.
01:21:28.280 No, I don't want to go to Ro's district.
01:21:29.780 I often ask all these people in my district who are like, oh, okay,
01:21:33.440 just tell them to move here or move to a tech place.
01:21:35.800 I was like, well, are you going to move to Louisiana?
01:21:38.080 And no hands go up.
01:21:39.100 Then I was like, why are you so arrogant to think they want to live here?
01:21:41.640 They like it where they grew up with their families.
01:21:44.580 But now you can like live there and you learn a little bit of tech
01:21:48.340 and you can work for almost any company in the country, in the world.
01:21:52.100 That is the opportunity if we do this right.
01:21:55.560 And we should start with kindergarten through 12th grade.
01:21:58.260 It's just like reading or writing.
01:21:59.800 I agree.
01:22:00.580 Are your brother's kids going to do it?
01:22:02.100 I hope so.
01:22:02.980 I'll say this.
01:22:03.620 They're not working for me one day if they don't.
01:22:06.360 This is also interesting because it's like every now and then there's
01:22:08.580 something that makes it where you can jump.
01:22:10.940 You can jump up a caste or a system.
01:22:13.960 You can jump a couple levels.
01:22:15.520 And I believe that right now this is that thing.
01:22:18.040 I said the same exact thing.
01:22:19.180 You know, the wealth gap.
01:22:21.840 There are two huge wealth gaps in this country.
01:22:23.820 One is between places like Silicon Valley and factory towns,
01:22:27.180 rural towns that were decimated.
01:22:28.440 And the other is a racial wealth gap, 10 to 1.
01:22:31.440 White families and Asian-American families have ten times the wealth of people who are African-American,
01:22:37.960 Latino-American.
01:22:39.280 Technology – you can close that gap in one generation.
01:22:42.780 If we can get rural communities, factory towns, towns in the Black South,
01:22:48.100 the opportunity to do things with tech and build companies, build wealth, be content creators.
01:22:54.520 Oh, hell yeah.
01:22:55.460 Black AI, dude.
01:22:56.680 You know what I'm saying?
01:22:57.560 Build me a trap beat, homie.
01:22:58.940 Let's get this.
01:23:00.680 No, but I agree with you.
01:23:01.920 It's an opportunity, right?
01:23:04.260 And that is, it's like, it's an opportunity.
01:23:07.860 Some people are like, there's no, and I know that's just one thing.
01:23:10.360 But also another thing is that trade jobs, like you're saying,
01:23:13.040 you're still going to need plumbers for sure.
01:23:14.700 You're still going to need, you're still going to want the hands on the wheel of a human being,
01:23:18.580 I believe.
01:23:19.540 But there's a lot of trade jobs.
01:23:20.860 You're not going to be able to replace those.
01:23:23.240 You talked about an internet bill of rights, right?
01:23:25.640 Which you tried to pass, I'm guessing, or tell me about that.
01:23:29.040 Tell me a little bit about that experience and why do we need that?
01:23:32.400 Do you feel like we already answered that or no?
01:23:33.940 I can go through it briefly.
01:23:35.180 I mean, there's a guy, Tim Berners-Lee.
01:23:38.120 You should have him on sometime.
01:23:39.260 Tim, what's his name?
01:23:39.920 Tim Berners-Lee.
01:23:40.960 He is the founder, basically, of the World Wide Web.
01:23:43.780 Wow.
01:23:44.340 He's one of the people who made the internet, the internet.
01:23:48.200 Oh, pervert.
01:23:49.240 And I'm sure he was like knighted or something.
01:23:52.380 And so he and I talked, and we came out with this internet bill of rights.
01:23:58.160 You remember that Facebook Cambridge Analytica scandal where people lost their data on Facebook
01:24:03.980 and profiles were made?
01:24:05.560 And basically, it said, in this country, no one should take your data without your consent.
01:24:11.940 Agreed.
01:24:12.240 And you should know where your data is.
01:24:13.920 I mean, there are other parts to it, but it's pretty simple.
01:24:15.960 You own your data, you control your data, and you should know what's happening with your data.
01:24:21.440 And I think, and by the way, you should be compensated to some extent for the use of your
01:24:25.780 data if you consent.
01:24:27.340 There should be a data dividend in this country.
01:24:29.600 And I think that is not a partisan issue.
01:24:32.540 It is an issue that's going to become increasingly important with things like this database that
01:24:37.820 may be being made.
01:24:38.940 Like, you should know.
01:24:40.000 Do you know, Theo, right now what the government knows about you?
01:24:42.540 Do you know if there's a profile on you?
01:24:45.180 Do you know, like, what if that database is wrong?
01:24:47.620 Pisces, I know that.
01:24:48.340 I'm trying to think of what else.
01:24:50.880 I don't know.
01:24:51.720 Yeah.
01:24:52.220 Like, you should know.
01:24:54.280 You should be able to consent.
01:24:55.740 You should be able to get something deleted if they have incorrect information.
01:24:59.460 And that applies to the government.
01:25:01.240 And it applies to big corporations.
01:25:03.900 I don't want Google having some profile on Ro Khanna.
01:25:07.220 I mean, so, we introduced this in 2017.
01:25:11.920 Wow.
01:25:12.420 2017.
01:25:13.440 Congress, to this day, has not been able to introduce or pass any internet bill of rights.
01:25:19.760 There are no protections.
01:25:20.900 You know, when Josh Hawley—
01:25:22.100 When you say there's no protections.
01:25:22.980 Oh, you just mean there's no protections for people.
01:25:24.780 I mean, no protections for people.
01:25:26.540 Josh Hawley once, and there are places where I actually agree with Hawley, he had Mark Zuckerberg
01:25:31.280 stand up and apologize.
01:25:33.120 To those parents behind him.
01:25:33.860 To those parents.
01:25:34.400 I saw that.
01:25:34.880 And someone said to me, you know what he should have also done?
01:25:37.340 He should have had the whole Congress stand up and apologize.
01:25:39.980 The problem isn't just these tech companies.
01:25:41.920 It's the Congress.
01:25:43.260 We've had this stuff going on for 10 years.
01:25:46.540 And Congress has not passed an internet bill of rights.
01:25:49.380 And one of the things I say is, like, I know these people.
01:25:52.120 I know how they think.
01:25:53.860 They're going to run circles around you.
01:25:55.300 You've got to hold them accountable.
01:25:57.140 Have the tech guy saying, hold them accountable.
01:25:59.600 Why won't they do it?
01:26:00.480 The same reason as before?
01:26:02.320 Why won't the tech companies do it?
01:26:03.600 Who's lobbying?
01:26:04.180 Who's lobbying against them?
01:26:05.240 Big tech?
01:26:05.800 It's tech lobbying.
01:26:08.460 It's internet service provider lobbying.
01:26:10.860 It's these lobbyists.
01:26:13.460 And they don't want, and we haven't.
01:26:15.960 I mean, can you imagine this country that has a constitution that is a country founded on freedom?
01:26:21.240 And we don't have any federal legislation saying you own your data?
01:26:24.840 Oh, well, that's what it's all become.
01:26:27.560 It's all part of this thing where all you become is data, and then you don't own it.
01:26:32.560 And then you're part of a new country without you even realizing it.
01:26:36.640 You got put into a whole new continent titled or owned by whatever tech monarchy or oligarchy owns it, and then that's where you are.
01:26:48.480 People are like, I'm in America.
01:26:49.400 Like, that's fine.
01:26:50.480 You can believe that all you want, right?
01:26:52.240 But if another company owns your information, that company can't really even be sued.
01:26:57.900 There was also a part of that big, beautiful bill that blocks state-level, like, AI regulation.
01:27:03.960 It's the worst part of that bill.
01:27:05.500 The AI.
01:27:06.120 You can't even sue the people.
01:27:07.580 Bring it up for me.
01:27:08.140 Get to that really quick.
01:27:09.000 10 years they want no state to have any regulation on AI, 10 years to give a free pass to these big tech companies, no regulation.
01:27:19.300 I spoke out against that, and my phone blew up because I had all these angry tech leaders saying, Ro, how are you speaking out against this moratorium?
01:27:26.260 That's when I knew I was doing the right thing.
01:27:28.000 That's when you know you're doing the right thing, man.
01:27:29.520 But you know what's sad there in this country?
01:27:32.720 Because you and I both have gone to some of these towns that lost their factories, that lost their industries.
01:27:38.460 And we screwed them twice.
01:27:40.160 First, we took away their jobs.
01:27:42.080 We literally shipped their jobs offshore or watched their jobs go to China.
01:27:47.020 These towns started to deindustrialize.
01:27:49.620 There was pain.
01:27:50.360 There was opioids.
01:27:51.320 There were drugs.
01:27:52.140 And then what do we do?
01:27:53.220 We get social media companies to run algorithms on their data to start targeting them with content, which is causing young girls to commit suicide, which is getting some people more addicted to drugs,
01:28:07.260 which is getting people to be angry at each other.
01:28:10.080 It's like we screwed them taking away their jobs, and now we're taking their data and screwing them with social media.
01:28:15.760 I just feel like the devil is at the gates, man.
01:28:18.120 I feel like we are at such an important time, and I know people have probably said that before.
01:28:21.960 And what the fuck do I know, dude?
01:28:23.440 I am – I'm not mentally handicapped or whatever, but I am definitely adjacent, right?
01:28:28.400 But I do know some things.
01:28:30.020 I know this, that I'll say this.
01:28:31.680 This is in that big, beautiful bill.
01:28:33.860 And I don't know everything that's in it, but this is one thing that really stood out to me.
01:28:37.500 And I asked J.D. about this.
01:28:39.700 It says, for a full 10 years after enactment, no U.S. state or locality may enforce any law or regulation that limits, restricts, or otherwise regulates artificial intelligence models, artificial intelligence systems, or automated decision systems entered into interstate commerce.
01:28:57.960 Only laws encouraging use or deployment of AI are excluded.
01:29:02.420 That means that you won't have any legal recourse against a machine.
01:29:08.000 Basically.
01:29:08.660 Like, you know how you said you want hands on a wheel?
01:29:10.980 Yeah.
01:29:11.460 If now a state wants to pass a law saying you got to have a truck driver on a truck, you can't do that.
01:29:16.760 If a state wants to pass a law saying you can't have an algorithm in social media that gets young girls addicted to content that makes them more likely to have eating disorders, can't do that.
01:29:29.060 So it's just going to be federal will have the only –
01:29:30.920 Federal will be the only one.
01:29:31.920 And you know what –
01:29:32.340 Guy, do you not see how – do you not see what is at the gates?
01:29:37.300 And by the way, you're not going to get federal legislation because the way you get federal legislation in this country is when you get a bunch of states that pass legislation, then you get the industry coming to the federal government saying, oh, there are too many state laws.
01:29:48.900 Can you just get one federal standard?
01:29:50.720 But if you don't have the state laws, you're never going to have industry coming saying let's get a federal standard.
01:29:55.160 So it's basically like let's have AI just develop.
01:29:58.420 And that is the problem, right?
01:29:59.960 There's the beautiful AI vision of how your nephews could learn this and build wealth and we could have some people in one generation have economic security.
01:30:09.640 And then there's the dystopian AI vision where big government and big companies control our data or making decisions based on what predictive algorithms are telling about us where we lose control to machines.
01:30:25.980 And the question is which way are we going to go as a society and who's going to benefit?
01:30:31.020 Is the AI revolution going to be like globalization?
01:30:33.100 All the money piles up in my district, you know, we'll be sitting here five years from now instead of $14 trillion, it'll be $50 trillion.
01:30:40.980 And places like Louisiana will still be shafted.
01:30:44.420 Or is the AI revolution going to be owned by American citizens?
01:30:47.940 Like to me, that is the whole biggest question of our society right now.
01:30:52.700 I agree.
01:30:53.500 I agree, man.
01:30:54.340 It's one of the reasons why I was so grateful that you were willing to come here so quick and that we're able to talk about this stuff.
01:30:59.040 Because I wanted to have, I just, I don't know, sometimes I'll just get afraid that I'm not educated enough to have certain conversations.
01:31:05.600 You're perfectly – you're educated to the point where it's your values.
01:31:10.700 And that's, I think that's actually one of the things that the tech folks rely on.
01:31:14.400 And it's like what the foreign policy establishment relies on.
01:31:17.420 It's like the old priests.
01:31:19.020 Like we've got some secret language, secret vocabulary.
01:31:22.260 You can't talk about the Middle East because you haven't studied the Middle East as much as us.
01:31:26.120 You can't talk about technology because you don't know the math that went into the coding of AI.
01:31:30.920 Who cares?
01:31:31.920 These aren't questions of technological competence.
01:31:34.540 These aren't questions of foreign policy competence.
01:31:36.840 These are questions of human values.
01:31:38.780 It's a moral, yeah.
01:31:39.520 It's a moral issue.
01:31:41.240 But just, but just to be sure to go back.
01:31:42.540 So on, on that point, we're talking about that part that's in the big, beautiful bill.
01:31:46.720 It will not leave you very much legal recourse.
01:31:49.140 Is that correct?
01:31:50.280 It'll leave you no legal recourse if you're a state.
01:31:53.120 You can't do any regulations.
01:31:54.640 What if you're a citizen?
01:31:55.860 And if you're a citizen, you have almost no legal recourse.
01:31:58.740 It's basically saying, look, let's trust the AI developers to just develop AI.
01:32:03.660 And the argument they give is, well, we don't want to lose out to China.
01:32:07.640 Well, you know what?
01:32:08.380 I don't want to live in a society like China.
01:32:10.920 China has a social credit score on every single person.
01:32:14.000 Like if you're in China, you get ranked.
01:32:16.140 10, you're a great Chinese citizen.
01:32:18.340 One, you're not, you know, because you didn't pay your bills.
01:32:21.200 Because you were rude to someone.
01:32:22.840 Because you made fun of someone.
01:32:24.040 Because your karate sucks.
01:32:25.620 That's just an old joke.
01:32:26.160 The argument can't be like, we're going to stay ahead of China, so let's become like China.
01:32:31.320 Right.
01:32:31.440 How about we're going to stay ahead of China without being China?
01:32:34.440 And we're going to have regulations because we're Americans.
01:32:39.100 And it's here now, though.
01:32:40.420 People don't understand it.
01:32:41.040 It's here now, right?
01:32:41.980 Like we can't keep kicking this ball down the road.
01:32:44.360 I feel like we're on the goal line.
01:32:47.240 We're on the goal.
01:32:47.640 This is the thing that the tech leaders, why they're so involved suddenly in politics, they understand that this is the whole ballgame.
01:32:53.900 Like, well, we're fighting over all this other stuff.
01:32:57.780 Oh, where is the funding for Medicaid and tax?
01:33:00.620 All very important.
01:33:01.760 They're like, well, we just want the whole enchilada.
01:33:03.980 We just want to control data and control jobs and control the economy.
01:33:09.220 And these people in Washington are kind of clueless.
01:33:12.360 Like, they're people and members of Congress.
01:33:13.660 I'm not going to say the names who are asking the Google CEO about the iPhone.
01:33:17.560 They don't know that Apple makes the iPhone.
01:33:20.020 You know, they're technologically illiterate, and they're just being run circles around.
01:33:25.580 And now they're saying, like, Congress, you're so illiterate.
01:33:28.240 Just don't do anything.
01:33:29.280 Don't have any regulation.
01:33:30.340 Let us just do everything.
01:33:31.300 Like, really?
01:33:32.140 We didn't fight a revolution in this country to let the tech billionaires get to decide our future.
01:33:38.740 And I think what we need, you know, FDR was a traitor to his own class.
01:33:43.520 He came from New York, Wall Street Finance.
01:33:47.100 He said, no, I'm not going to let finance dictate America.
01:33:51.360 And in some ways, we need people from technology saying, yeah, we love technology.
01:33:55.400 But no, technologists don't get to call the shots in this country.
01:33:59.480 Theo's buddies get to call the shots.
01:34:01.380 And by the way, they got better judgment than you.
01:34:03.880 Or just human beings, people with some, I think, because I'm not, I don't have the best morals in the world or anything.
01:34:08.260 But I feel like I fucking try to be alive because that's what I'm, I'm sentenced to right now is being alive on earth.
01:34:15.220 So why don't, why don't, why isn't there a tech leader that comes out was like, this is the app that shows you exactly who's doing what.
01:34:21.420 So you can, and this is how you start to change.
01:34:23.680 Like, we need something that's step by step, almost for dummies.
01:34:27.320 And I hate to say that, but we've all become dumbed down by our phones and games.
01:34:32.020 But we need to know if we have, I mean, because I just feel like this is it.
01:34:35.600 We got to find a developer who's like, you know what?
01:34:38.500 I'm going to fucking try to save humanity.
01:34:41.060 You know who we could have?
01:34:42.280 Maybe you should have him on here is Steve Wozniak.
01:34:44.640 He's the, he's the guy who developed Apple with Steve Jobs.
01:34:47.420 Cool guy.
01:34:47.960 Cool guy.
01:34:48.500 And he's a humanist.
01:34:49.380 He's, he believes human values.
01:34:51.120 We need some people like Wozniak.
01:34:59.940 I mean, there are others – the Woz, you know. And, you know, Steve Jobs in many ways was a humanist.
01:34:59.940 People who understand technology, but are like, you know, technology in the service of human values.
01:35:05.120 We're not gods.
01:35:06.340 We don't want to rule.
01:35:08.200 I often say we need Silicon Valley in the service of America, not America in the service of Silicon Valley.
01:35:14.080 Amen.
01:35:14.600 I agree.
01:35:15.020 I would love to meet Mr. Wozniak and talk to him.
01:35:18.780 Okay.
01:35:19.220 You campaigned for, um, and I'm, yeah, I'm going back to my, some of my charts here, but that's all right today, guys.
01:35:24.740 That's where I'm at.
01:35:25.540 It's been a long week and it's been a long, uh, life.
01:35:29.060 Um, you campaigned for Bernie Sanders in 2016 and co-chaired his presidential run in 2020.
01:35:35.280 Yeah.
01:35:35.580 Is that right?
01:35:36.100 That's right.
01:35:37.520 We've, uh, asked him directly what happened, you know, and I was always – I wish that they made it so that, like, if the president was Republican, the vice president
01:35:49.200 had to be a Democrat, right?
01:35:50.560 Yeah.
01:35:50.840 I wish it was like that because then it seems like there would be a common ground or invited
01:35:55.760 them to govern together.
01:35:57.120 Right.
01:35:57.440 You know, like there has to be some sort of teamwork because otherwise you outsource it and it's
01:36:00.820 just, it seems like it's so much more expensive.
01:36:02.800 But what I was going to ask you right here was, um, and we've asked him directly what happened,
01:36:07.780 but in your opinion, why do you think that, uh, he lost out on the nomination both times?
01:36:11.700 Because I think that's had an effect on, um, how the democratic party has been perceived over
01:36:15.800 the years.
01:36:16.320 Well, the establishment screwed him.
01:36:17.780 I mean, there's no doubt about that.
01:36:20.000 He, he should have been the nominee in 2020.
01:36:22.540 You had the whole party basically like drop out, tell people to drop out and endorse Joe
01:36:27.960 Biden.
01:36:28.300 They resurrected Joe, Joe Biden.
01:36:30.220 He had lost Iowa.
01:36:31.820 He came in like fourth or fifth.
01:36:33.380 He lost New Hampshire fourth or fifth.
01:36:34.980 It's never happened in the history of the party in recent history that someone is losing
01:36:39.760 that badly and then wins.
01:36:41.220 And they suddenly see Bernie's winning.
01:36:43.080 Iowa, winning.
01:36:43.800 New Hampshire, winning.
01:36:44.580 Nevada, unstoppable.
01:36:46.260 Nate Silver had him at 70% odds to win the nomination at that point.
01:36:49.720 And the party, uh, just reacts and they say no.
01:36:53.680 And they convinced Buttigieg to drop out and endorse Biden.
01:36:57.120 They convinced Beto to endorse Biden.
01:36:59.740 They convinced Klobuchar to, uh, endorse Biden.
01:37:02.780 How did they convince them all?
01:37:03.400 Was it funding and stuff you think?
01:37:05.240 Well, I think they said, look, you, you want to have a future.
01:37:07.620 You can be in the, you can be in the cabinet.
01:37:09.760 You can, we're going to be for you.
01:37:11.820 And then they're like, Bernie can't win.
01:37:13.320 Bernie can't win.
01:37:13.860 Come on.
01:37:14.860 Donald Trump just won.
01:37:16.020 He's talking about invading Greenland.
01:37:17.840 And you're telling me we can't talk about Medicare for all.
01:37:19.980 I mean, like the one thing that Donald Trump has shown you is like, you can say what you
01:37:24.200 believe.
01:37:25.280 Anybody can win.
01:37:25.940 And that's like, you know, JD had that too.
01:37:29.360 He was like, you come on.
01:37:30.880 I disagree with him, but you know where he stands.
01:37:32.840 Like just campaign with your heart, campaign with your ideas.
01:37:36.720 That's what, and Bernie was speaking to people who were outsiders, who were underdogs.
01:37:42.700 You know, I think that's why you and I resonated.
01:37:44.680 I was an underdog growing up.
01:37:46.340 I was not, I was not supposed to be a politician.
01:37:49.800 I didn't have a father who was a Senator or a rich person or a Congress person.
01:37:55.200 Like no one would have thought this Indian kid of Hindu faith was going to have any shot
01:37:59.980 to get elected to dog catcher in this country.
01:38:02.100 But Bernie was speaking for all the folks who weren't supposed to make it, who aren't
01:38:06.720 making it in this country.
01:38:07.960 Bernie was an underdog.
01:38:08.800 He was an underdog.
01:38:09.620 That's another thing about Trump though.
01:38:10.720 Trump was also an underdog.
01:38:11.840 You know what I'm saying?
01:38:12.180 He did have part of that with them.
01:38:13.460 A lot of these, but I do, I do think there's a, there's people that love an underdog man.
01:38:17.640 Um, okay.
01:38:19.180 Let's build the primary early, right?
01:38:21.420 Uh, who should the Democrats run?
01:38:23.980 Well, first of all, do you think we need to still have just Republican and Democratic party?
01:38:27.580 Or do you think there's value in getting a big financier to support like a third party?
01:38:33.320 Like say if Elon Musk decided to go and support a third party, I think there's a value in
01:38:37.740 third parties.
01:38:38.520 Here's the problem I have in terms of just like third parties running for president.
01:38:41.820 A lot of times it has to be a billionaire.
01:38:43.660 And then you're like, okay, is that billionaire really going to have the values of the working
01:38:47.780 class?
01:38:48.220 But in terms of parties, and I'm one of the few people in the Democrat party who says this,
01:38:52.420 I think have more parties.
01:38:53.880 You know, if like the Democrats are like, oh, your third party is going to take away
01:38:57.700 votes from us.
01:38:58.340 I was like, be better.
01:38:59.360 Like, it's not the voter's job to vote for you.
01:39:01.960 It's your job to convince the voter.
01:39:03.720 I never understood this with like the whole Harris Trump race.
01:39:06.480 They're like, well, you know, Jill Stein is taking votes away from us.
01:39:11.540 I was like, her job is to run.
01:39:13.580 Your job is to convince people to vote for you.
01:39:16.320 You know, it's like, if like Apple computers wasn't selling iPhones, they're like, well, this
01:39:19.680 other third party company is taking iPhone sales away from you.
01:39:23.900 Okay, well, why aren't you making the iPhone better?
01:39:25.980 Right.
01:39:26.240 And so I think having more parties is good.
01:39:28.500 I still think the Democratic party is the vehicle where we can really have a progressive
01:39:33.000 working class agenda.
01:39:34.360 I think we've got to reform it in the way Bernie Sanders was trying.
01:39:38.380 And, you know, and I think the Republican party, I hope, will become more anti-war, will
01:39:43.680 become more working class centered so that we have reform in this country.
01:39:49.060 Yeah, I just don't know how we could keep going with nothing because that's what it
01:39:53.780 starts to almost feel like is that we're just, we're not even getting anything.
01:39:58.180 I've heard you talk about the idea of economic patriotism.
01:40:01.080 Yeah.
01:40:01.240 What does that kind of mean when you say that?
01:40:02.900 I like that terminology.
01:40:03.880 My parents came to this country in the 1960s.
01:40:08.020 It was the time John F. Kennedy said, let's go to the moon.
01:40:10.620 Let's dream big.
01:40:11.800 This country was humming.
01:40:13.240 We were building things.
01:40:14.540 We were making things.
01:40:16.220 We were doing things for civilization.
01:40:17.420 I think that's what can bring us together.
01:40:19.440 Let's have a Marshall Plan, not for Europe, a 21st century Marshall Plan for America, where
01:40:25.560 we start to build new factories, new industry, new AI centers, new AI academies, new universities,
01:40:33.060 new trade schools.
01:40:34.480 And we come together in this country, white, black, Latino, Asian, to build things in America.
01:40:40.360 And by the way, I think that's something even some of the MAGA folks could get behind.
01:40:44.780 Like, what is it, we need a new national purpose.
01:40:48.520 You know, and when John F. Kennedy was president, 60% of Americans had trust in government.
01:40:53.040 Today, it's probably like down to less than 20%.
01:40:55.680 And the reality is, you know, we talked about the PAC money and the lobbyist money, but most
01:41:00.180 people go into Congress, they are still, they want to do something for America.
01:41:04.120 They care.
01:41:04.840 Really?
01:41:05.100 You believe that?
01:41:05.660 I do.
01:41:06.460 I believe that about, you know, J.D. Vance and I have gotten into arguments.
01:41:09.260 I believe that about him.
01:41:10.600 Like, I believe that.
01:41:11.620 Oh, I do too about him.
01:41:12.600 I do believe that.
01:41:13.380 You know, I believe that about most people.
01:41:15.500 Like, I think they have a story.
01:41:17.420 They want to do something good.
01:41:19.340 The system is so broken.
01:41:20.900 But you know what we need, again, is a vision that's going to inspire us to come together
01:41:26.180 to do things.
01:41:28.200 And economic patriotism, I think, would be one of those visions.
01:41:31.380 Like, let's make a, let's redevelop communities.
01:41:34.400 Yeah, me and Mike, I was talking with Mike Rowe and we've kind of started to put the wheels
01:41:38.060 on creating like a, a place that's just American products, right?
01:41:42.540 Yeah.
01:41:42.640 Where you go to buy stuff that's just American, you know, and start to see what that would
01:41:45.680 look like.
01:41:46.240 So that would be really interesting to have like a new American kind of QVC, but all
01:41:50.820 this stuff is just American stuff.
01:41:52.680 And one of my bills is actually to give a tax credit at the end of the year.
01:41:56.900 If you keep your receipts and buy American.
01:41:59.060 So look, you buy an American glass, American book, you get to deduct it on your taxes.
01:42:05.440 I love that.
01:42:06.160 I mean, I love that if we could.
01:42:07.680 And also if we just didn't give money to some of these other places and stuff, it would
01:42:10.900 be, I think it would be, we'd have a lot of money for people.
01:42:13.780 Yeah.
01:42:13.960 It's funny that American, American manufacturing doesn't even seem to be a part of the democratic
01:42:17.320 talking points anymore.
01:42:18.700 Which is so ironic because FDR industrialized the whole country.
01:42:21.540 You know, the, the tragedy I would say of Elon Musk, who I've known is that he could have
01:42:26.680 been Bill Knudsen.
01:42:28.040 I know you're like American history.
01:42:29.460 Bill Knudsen was this guy who was running GM.
01:42:32.860 K-N-U-D-S-E-N.
01:42:33.920 K-N-U-D-S-O-N.
01:42:35.800 And you can look him up.
01:42:37.420 And he was running GM and FDR says, you know, we need you.
01:42:41.540 We need you to build American industry.
01:42:44.780 We were making a thousand planes a year.
01:42:47.180 Within a few years, Bill Knudsen takes it to 300,000 planes.
01:42:51.860 And instead, I mean, the tragedy in my view of Musk is he was undermining and destroying
01:42:56.360 like the NIH and all these agencies where what we needed is someone to come in and say,
01:43:01.320 let's industrialize this country.
01:43:02.740 Let's work with labor to build new industry, new jobs.
01:43:07.020 The Democrats did that.
01:43:09.020 Like we, I don't understand how we let Trump become the made in America guy.
01:43:13.240 We, we need to be the party that says, here's our vision for making things in America.
01:43:17.260 And wouldn't it be great if the argument in this country was who was going to build America
01:43:21.540 better?
01:43:21.940 And like, Trump's like, here are my ideas.
01:43:23.640 And we were like, here are our ideas.
01:43:25.140 And maybe you take the best of those ideas.
01:43:27.180 But how about a common goal again?
01:43:29.420 Like, I think the country is exhausted.
01:43:32.080 We're embarrassed.
01:43:33.840 They see all of us in Congress and I'm guilty of this too sometimes.
01:43:37.040 And we're arguing and name calling.
01:43:39.500 And they're like, yeah, but my life, like I can't afford a house.
01:43:44.820 My kids aren't going to have the same job I did.
01:43:47.180 I don't know where I'm going.
01:43:48.980 I don't know what's happening with this country.
01:43:50.940 Like, just get your act together and get some direction that's supposed to be the greatest
01:43:54.180 country in the world and start working together on team America.
01:43:57.980 Yeah.
01:43:58.080 Yeah, I think it's, the dream is fading, I feel like.
01:44:04.820 And I don't want to be a pessimist, but I don't, I feel like, I don't want to say the
01:44:09.680 dream is fading.
01:44:11.000 I don't want to say that. It feels like we don't all have the same dream that's fading.
01:44:19.100 That's the part that sometimes feels like it's fading.
01:44:21.620 It used to be that you'd go to bed at night and it felt like you all had this universal dream,
01:44:24.900 you know, I think.
01:44:25.580 Yeah, I think, and we've, we've lost that.
01:44:27.780 Like, what is the common American dream?
01:44:30.820 And, and we've, one, we're so divided, right?
01:44:32.980 Life is so different for some people in Silicon Valley than it is for folks who are like, how
01:44:37.140 am I going to pay my medical bills?
01:44:38.880 How am I going to pay for rent?
01:44:40.960 How are my kids going to have a decent job?
01:44:43.980 And the other thing is that we lack a common sense of vision.
01:44:47.580 Like, what are we doing as a society together as Americans?
01:44:51.300 And I, you know, I think if, if we get people talking about that, and, and that's why I
01:44:56.100 say a Marshall plan, economic patriotism, rebuilding this country, and that doesn't mean old factories.
01:45:01.340 Like, it's going to be the new modern factories, new industries.
01:45:04.300 Well, I think people, it's also going to get weird if people, if some of the greed doesn't
01:45:07.720 end there, people are just going to want to see the rich start to suffer.
01:45:10.140 And that's when shit can get really weird.
01:45:13.240 That's when you start getting people into revolutionary thought, but then maybe who knows, maybe we
01:45:18.140 just, you know, we just slowly drift off into the ether of our sofa with a Cheez-Its on our
01:45:24.520 chest, you know, I don't know.
01:45:25.920 Some man with fake breasts laying there eating Cheez-Its, you know, and just drifting into
01:45:30.480 the ether of existence.
01:45:31.220 Well, that would be sad if we don't, you know, we, but I hope that's not our Paul Revere,
01:45:35.660 right?
01:45:35.900 I hope that there is still like a lot of people who believe in something.
01:45:39.260 And I think there is, and I think there's people that want something new.
01:45:42.320 I'm glad, I'm glad that even, uh, just saying, asking Mike Johnson for help with that thing,
01:45:46.420 you know, I think that that's, um, neat just cause I didn't even know that we could do that,
01:45:50.340 you know?
01:45:51.480 Um, what was anything else we wanted to talk about?
01:45:54.240 We're almost there.
01:45:55.580 Let me see.
01:45:56.560 Um, oh, one thing that JD Vance said that was really interesting the first time I spoke
01:46:00.320 with him was he said that a lot of Congress people and representatives will go, uh, turn over
01:46:08.320 to be, or senators will turn over to be lobbyists because they don't get paid enough by the
01:46:14.540 government type of situation.
01:46:16.340 Is that true?
01:46:17.260 Take me down a little bit of that discussion where, um, the, the salary is better being
01:46:22.520 a lobbyist than it is to be on, on the, this side.
01:46:26.440 Well, he's absolutely right.
01:46:27.880 And it's, uh, terrible what happens.
01:46:29.900 I mean, you are in Congress, you're working on the armed services committee, overseeing
01:46:34.960 defense contractors, and then you'll retire.
01:46:37.420 And suddenly, uh, you'll become a lobbyist for the defense industry.
01:46:42.060 You'll go from making $170,000 a year to making a million dollars a year.
01:46:46.880 And that's why I have a bill.
01:46:48.540 Uh, it's called the Drain the Swamp Act or my political reform act.
01:46:51.920 And it says ban members of Congress from ever becoming lobbyists.
01:46:55.880 Now, you know, Harry Truman won World War II.
01:46:58.720 He did more than anyone in Congress put together and he started NATO and Harry Truman at the
01:47:05.160 end of his presidency, the guy who wins World War II, sets us up for the post-Cold, post-World
01:47:11.260 War II order, uh, writes to someone when he was invited to give a speech.
01:47:15.320 He said, I'm a little embarrassed.
01:47:16.980 Uh, Bess and I can't afford the train ride.
01:47:20.200 Can you, uh, pay for the train ride?
01:47:22.320 Happy to give the speech, right?
01:47:23.700 It used to be, you went into politics, not to get rich.
01:47:26.860 And I don't begrudge people who are wealthy.
01:47:29.100 There are a lot of people, uh, who are wealthy, but you shouldn't be in public service to make
01:47:33.560 money.
01:47:33.940 And afterwards you can do a lot of things.
01:47:36.560 Go, go do something in AI, go do something, you know, don't go and lobby.
01:47:40.920 Don't become a, uh, a member of Congress who's just going to cash out, uh, by lobbying.
01:47:46.500 It's something I will never do, but it's also a ban we should have.
01:47:49.660 And I, you know, maybe we can make that a bipartisan effort.
01:47:53.080 I don't see how it couldn't be because every human wants it on both sides.
01:47:56.560 So I don't see why it just keeps happening.
01:48:00.000 And yeah, if you're good at your job of being a Congressman or a Senator, are they different
01:48:04.520 ones?
01:48:04.780 Yeah.
01:48:05.100 Yeah.
01:48:05.660 Then afterwards you can do tour, do a tour at all the donut shops.
01:48:08.660 You show up over there.
01:48:09.880 Yeah.
01:48:10.200 People pay in a heartbeat to show up there, take photos and stuff, have a, get a hug or something.
01:48:15.060 Get a, um, get a glazed.
01:48:17.780 Um, just the one thing they shouldn't do is start a podcast.
01:48:20.480 Politicians podcasts are terrible.
01:48:22.260 Yeah.
01:48:22.760 They're like so boring.
01:48:23.700 Newsom tried it.
01:48:24.620 Yeah.
01:48:24.920 And I mean, it's just kind of, cause it's like so stilted and it's not, there's no humor.
01:48:29.420 You gotta have some humor.
01:48:30.280 Like, you know, the politician can be the straight guy, but you need someone who's funny.
01:48:33.560 You need like this conversation.
01:48:35.100 It's kind of all over the place.
01:48:36.960 It's not, it's not like, Oh, wow.
01:48:39.080 What are the poll numbers on this?
01:48:40.940 And so it's just, uh, I've never seen a politician who had a good, interesting podcast.
01:48:44.960 Yeah.
01:48:45.320 I think that that would be a bit much maybe, but I don't know.
01:48:47.800 Shit.
01:48:48.140 I mean, anybody could do it.
01:48:49.300 And sorry for saying shit right in front of you, sir.
01:48:50.880 Oh no, not at all.
01:48:51.900 Well, I should have done better.
01:48:53.620 No, not at all.
01:48:54.920 Oh, um, I did have this question though.
01:48:56.700 We're talking about that.
01:48:57.860 Um, you voted in the past about finances.
01:49:00.120 Um, you voted in the past to ban members of Congress from trading and holding individual stocks.
01:49:04.800 Uh, this hasn't happened yet.
01:49:07.140 Right.
01:49:07.740 Yeah.
01:49:08.060 Um, why do you think that hasn't happened?
01:49:10.000 You know, like if Martha Stewart went to prison, then Nancy Pelosi should go to prison.
01:49:14.080 It feels like, you know, well, I, I, I don't think that Pelosi, in my view, I mean, has
01:49:19.460 done, uh, things that are, uh, unethical, but I think that there's this, such a trust deficit
01:49:24.920 in the Congress that people think that the biggest problem is money in politics, right?
01:49:30.120 They're taking all this money from, uh, PACs.
01:49:32.660 They're taking all this money from lobbyists.
01:49:34.340 But at the same time, uh, people are concerned if members of Congress are individually trading
01:49:39.840 stock.
01:49:40.320 And so the bill I've supported calling for a ban on stock trading is the No Trust Act.
01:49:44.700 And it says, you know, put your money in a trust, have someone else manage it.
01:49:48.680 Don't be there trading stocks or telling your advisor to trade stocks while you're creating
01:49:55.780 a conflict.
01:49:56.760 And, uh, why hasn't it gotten a vote?
01:49:59.500 Uh, I don't know.
01:50:00.660 Johnson can, Johnson can get us a vote on it.
01:50:04.640 I, I think it would pass overwhelmingly.
01:50:06.300 And we've got a bipartisan group, uh, that's pushing for a vote on that.
01:50:10.000 But you, if you do the five parts of my political reform plan, ban PAC money, ban super PACs, ban
01:50:17.760 lobbyists from giving money, ban members of Congress from ever becoming a lobbyist and
01:50:22.540 ban stock trading in Congress.
01:50:24.140 I think you would start to restore trust in Congress.
01:50:28.100 And some of the MAGA folks actually have liked the plan the most.
01:50:31.240 I had the Drain the Swamp Act, which I said, you want to drain the swamp?
01:50:34.560 How about just telling lobbyists they can't give gifts to white house officials?
01:50:38.000 And someone's like, oh, you only want it to apply to Donald Trump.
01:50:40.460 No, I was like, I want it to apply whether it's Donald Trump or, or a Democratic president.
01:50:44.760 Yeah.
01:50:44.880 Oh yeah.
01:50:46.340 I mean, all of the things you're saying seem very, like, I think these are things that
01:50:52.640 are super important.
01:50:53.320 I feel like it's things that feel most people feel are super important and that's, but,
01:50:56.820 but it never, what happens?
01:50:58.000 It's like, it just like, we talk about these things, everybody's like, yeah.
01:51:02.360 And then it gets to some place where it all, nothing ever really evolves.
01:51:06.200 It feels like, and that's where I feel like we're at now.
01:51:08.120 Instead, Hollywood gives us like, here's a Diddy case, go like, you know, keep the black
01:51:12.640 population at ease, you know, like, or here's a, you know, we'll stretch the NBA finals another
01:51:19.520 week.
01:51:19.900 We'll send in a special ref that'll-
01:51:21.380 Are you for the Pacers?
01:51:22.520 Allegedly, I'm for the Pacers.
01:51:23.900 Yeah.
01:51:25.260 I like Haliburton.
01:51:25.260 I think he's just a super guy, but, but, or, yeah, I know it just, it feels like it never
01:51:30.740 gets there.
01:51:31.960 You had, let me see, my producer wrote also, you have a, but your family owned stock or
01:51:37.680 you had stock in your family before you got married?
01:51:40.260 Yeah.
01:51:40.460 How does, how would that relate into a congressman or, or Senator owning?
01:51:46.660 Well, under the, under the Trust Act, members of Congress can, they can be in
01:51:51.960 a trust.
01:51:52.580 They just can't trade stock.
01:51:54.300 Okay.
01:51:54.760 They can't trade stock.
01:51:55.480 And that, I think that, that's why I've sponsored the bill.
01:51:58.020 I think that we, what we want to do is eliminate conflicts.
01:52:02.420 And if you can have the No Trust Act happen, it says members of Congress cannot trade stock.
01:52:11.040 If there are any assets, they put them in a trust and that would eliminate conflicts.
01:52:15.640 But what, what, what is upsetting to people is people making trades while they're doing
01:52:22.460 policy.
01:52:22.920 And that's where the conflict is.
01:52:24.280 The federal employees currently aren't allowed to do that.
01:52:26.680 They have to have their money in, in some trust.
01:52:29.840 Yeah.
01:52:30.340 Yeah.
01:52:30.680 Because it would almost be, I mean, they, ethically it doesn't make sense, but it would
01:52:34.180 be hard not to, you're in a meeting and you're hearing about this or that, to get on your
01:52:37.420 phone and get on your app and, and, and make a trade, you know?
01:52:40.880 Heck, if I'm watching a SportsCenter highlight or something and I see like one of the players
01:52:44.140 has a call for something, I'm betting against that team the next night, you know?
01:52:47.120 Like if, you know, if I see Walker Buehler's, uh, you know, if he seems like he's angry about
01:52:51.680 something the next day, I'm going to, I'll probably bet he has a few more strikeouts.
01:52:54.920 Well, look, I think the honest truth is there are not many Pete Roses in the Congress.
01:52:58.480 There are not many people, honestly, uh, who are out there, uh, uh, trading on information.
01:53:04.760 You know, a lot of my party, for example, is going after Marjorie Taylor Greene.
01:53:07.900 And she says that she's got an independent advisor who's making the trades.
01:53:11.180 I actually believe her.
01:53:12.080 I believe her.
01:53:12.660 But the problem is that the, the, the perception that has been created is that people, uh, are
01:53:19.740 benefiting from this.
01:53:21.040 So just pass the, the No Trust Act.
01:53:23.260 It's got bipartisan support.
01:53:24.960 Uh, people, if they have, uh, assets, they can put them in a, in a trust.
01:53:29.360 They can have it independently managed.
01:53:30.760 Then you know that they're not sitting there, uh, making trades or telling people what to
01:53:35.680 do.
01:53:35.960 Can they legally advise the trust advisor?
01:53:38.860 I mean, wouldn't that be?
01:53:39.520 They, they, they, they, they can't in day, day to day trading.
01:53:42.200 They can't, they can, they can set it up, but they can't, uh, tell the trust advisor,
01:53:46.580 okay, trade here, trade there.
01:53:48.380 Uh, and it has to be all disclosed.
01:53:50.420 And that, that would assure the American people that you don't have people out there, uh, trading
01:53:55.940 on stocks in a way that is, uh, uh, that, that they can't.
01:53:59.600 Yeah.
01:54:00.220 You know, the craziest thing is, I just thought sometimes the, one of the benefits of the
01:54:04.300 surveillance system would be that finally we get to see what all these senators and Congress
01:54:08.680 people and, uh, and everybody's up to, you know?
01:54:10.720 Right.
01:54:11.060 I wish they could just tell us like, uh, do a, um, I wish they had a breathalyzer for
01:54:16.940 if people were being honest or not.
01:54:18.320 That's what we need.
01:54:19.340 That's what we need.
01:54:20.780 There you go.
01:54:21.540 You know?
01:54:22.040 They, they, they, they have those lie detector tests. Next, next time I'm on, maybe
01:54:26.080 I'm gonna have to like do a lie detector.
01:54:27.780 Take that back to, uh, to Silicon Valley with you.
01:54:30.280 Um, Ro Khanna.
01:54:32.360 That's your name.
01:54:32.960 You got that.
01:54:33.680 You got the pronunciation.
01:54:34.840 Perfect.
01:54:35.260 I'm getting better.
01:54:36.300 Uh, yeah, I got to go to India once.
01:54:37.480 I went to Madras one time and had a really great time.
01:54:39.680 Uh-huh.
01:54:40.120 Really loved it.
01:54:40.920 A lot of, uh, excitement, energy in the faces of the, in the eyes, uh, in the smiles of the
01:54:46.000 people there.
01:54:46.740 Thoroughly enjoyed it.
01:54:47.760 Um, best of luck to you, man.
01:54:49.360 I appreciate you coming on and talking.
01:54:51.500 Uh, keep us out of Iran unless we're already there.
01:54:53.880 I know we've been on the air for a couple hours, so.
01:54:56.260 But, uh.
01:54:56.720 Well, with Massie, we're gonna do it together.
01:54:58.620 And with your, your advocacy.
01:55:00.340 And keep up your voice.
01:55:01.400 You know, I, it gives me hope that we got, uh, people like you, uh, out there just speaking
01:55:06.980 from your heart.
01:55:08.060 Uh, and don't let anyone say you don't know what you're talking about.
01:55:10.420 You know, you know a lot more of what you're talking about than a lot of the people running
01:55:14.020 this country.
01:55:15.160 But, uh, thank you so much for your time.
01:55:17.040 Yep.
01:55:17.220 Appreciate you.
01:55:17.760 Now I'm just falling on the breeze and I feel I'm falling like these leaves.
01:55:23.980 I must be cornerstone.
01:55:27.180 Oh, but when I reach that ground, I'll share this peace of mind I found.
01:55:34.580 I can feel it in my bones, but it's gonna take.