00:04:26.000And so we have a little bit of time, if you could, Robbie.
00:04:31.000I have asked the class if they could watch the video you made about your lawsuit against Google and how important this lawsuit is to every single one of us watching
00:04:46.000because you're putting yourself out there to fight for everybody and their reputation and their family and their rights.
00:04:53.000So because we're going to, you know, be muted here and there, if you could just give us what's going on, tell us how it's going.
00:05:00.000And I'm particularly interested in your conversation with Rand Paul, Senator Rand Paul, and where you think this is headed.
00:05:13.000And as everybody knows, I was a massive fan of Scott Adams.
00:05:17.000So I'm glad you guys have carried this on to have his legacy continue in terms of the case with Google.
00:05:24.000You know, I feel like there's not very many people who are in a position where they could afford to fight Google
00:05:32.000and have the platform to be able to battle the immense PR advantage that they have. Right.
00:05:38.000And so I felt like it wasn't really a choice; I had to fight this.
00:05:42.000I mean, not only to correct the damage that it's done to me and the safety issues it's created for me.
00:05:49.000But beyond that, you know, just the fact that I see how incredibly dangerous it will be if we allow AI to lie with impunity, to defame people,
00:06:00.000to essentially cross the boundary of what I believe should be the first principle of AI.
00:06:05.000AI should never be able to harm humans.
00:06:07.000That should be the first principle built into any AI that there is, whether it is a chat bot or it moves on to, you know, being fused in with robotics, whatever it is.
00:06:21.000So, you know, when this happened to me, my initial impulse was like, hey, let's try to solve this with Google.
00:06:28.000And that's what I tried to do. I tried to go to Google, you know, had the best intentions of just like getting this fixed for everybody.
00:06:35.000And let's make sure there's a process in place so that if anything like this ever happens, it is super simple for somebody to be able to correct it.
00:06:41.000And, you know, unfortunately, that was not our experience.
00:06:44.000In fact, the first person that we really got deep into conversations with at Google, you know, we had essentially, you know, asked for this to be fixed.
00:06:53.000I'm going to fast forward a little bit months later.
00:06:56.000I check in and I say, hey, you know, where are we at with this?
00:06:59.000And they responded, I'm sorry, Robbie, I'm actually resigning.
00:07:04.000And we couldn't get anything done, essentially, to correct this.
00:07:09.000So, you know, that obviously sent a signal to me that there was no interest in correcting the problem.
00:07:15.000And the defamation not only continued for two years, but it actually got remarkably worse.
00:07:21.000It moved on from simply just stating a lie about me to adding fake sources, fake police records, fake court records, naming fake victims in detail, and even doing fake victim statements about the horrific, horrific crimes that it accused me of.
00:07:38.000And, of course, these are entirely fictitious people.
00:07:51.000We cannot find any basis for this to be in training data anywhere.
00:07:54.000So that's a scary side note because, you know, Google, when they were questioned by Senator Marsha Blackburn, their defense was these are hallucinations.
00:08:03.000Well, I've never seen hallucinations like this in my life before because hallucinations are usually melding together a bunch of stuff and doing it incorrectly, and it's super sporadic and random.
00:08:13.000This was a consistent, consistent stream of lies where it accused me of things ranging from sexual assault, child rape, shooting somebody, violent assault, drug use, drug selling, you know, I mean, every number of things, abusing a nanny.
00:08:29.000And it would repeatedly do these same crimes over and over again.
00:08:34.000So, you know, when it goes into detail like that, and if you, you know, what's interesting is if you challenge an AI with a lie, it typically will go, oh, yeah, I got that wrong.
00:08:44.000Because, like, it's being confronted with, you got this wrong.
00:08:47.000In this case, when it was confronted, it would usually, I mean, the vast majority of the time would just stick with the lie.
00:08:54.000And it would say, no, no, no, no, this is true.
00:09:16.000And it would even go so far as to print out, or I don't know if that's the right term, but it would send you back a full fake article that it wrote up in the name of a real journalist pretending to be whatever that media outlet was and pretending to be that journalist.
00:09:33.000In fact, there's one, Yashar Ali, who called this out because at one point he was cited as a source by Google's AI for a, I believe it was, I don't have it in front of me, but I believe it was an assault story.
00:09:46.000And Yashar actually called this out and was like, this is crazy.
00:09:50.000I never wrote anything like this about Robbie, and I never had any story even remotely like this about him.
00:10:03.000I even, you know, found out that it was continuing to happen up to this week.
00:10:07.000And part of the reason that there's going to be a massive issue with this is open source AIs, they have an inherent problem where if there's a big issue like this, the company that made it, they don't have control of it anymore, essentially.
00:10:23.000Because once they set it out into the wild, and we're talking well over 100 million downloads of Gemma, for instance, that's one of their AIs, they can't go and force an update to every Gemma download out there.
00:10:35.000Because if somebody's disconnected from the internet, they're using Gemma as their main AI for whatever it is, whether it be app building or whatever, they can't force an update to that.
00:10:46.000So Gemma there, and there's a bunch of websites, too, out there.
00:11:04.160So I'm not sure how that's even going to work in court, you know, let's say, you know, I win down the line, how they're going to enforce a stoppage to this, because there's all of these downloads out there that you're never going to be able to get back, never going to be able to get them to stop lying.
00:11:17.960And what's scarier about that is Gemma, one of the platforms that was the very worst of Google's, it is used for app building a lot of times.
00:11:25.860So imagine somebody's building, like, a reputation scoring app for banking or insurance or something like that.
00:11:31.160I mean, the long-term damage that can do is immense.
00:12:04.700In fact, we had one of the insurance companies come back who knew who I was, and they were sympathetic to my situation.
00:12:12.780And so they told my agent what was going on.
00:12:14.840And they said, you know, we've deemed him a high risk because of his career and online stuff and the Google stuff, is what they said.
00:12:26.100We don't know if that means the lies Google told or whether that means my lawsuit against Google or what.
00:12:31.920So we have to find this out in Discovery.
00:12:33.980Like, there's so many different rabbit holes we have to go down in Discovery to figure out exactly how far this all goes.
00:12:40.600But, you know, in general, I see the threat this poses because if this is just the beginning and AI is used this way long term, I see how it can be used to enforce ideological, you know, sort of obedience.
00:12:55.660Because if you live in a world where somebody who doesn't have my ability to fight back, just an average person, you know, doesn't have a ton of money to be able to fight this.
00:13:06.380If they're dealing with a reality where AI, one of the big frontier labs, doesn't like their politics and they decide to lie about them.
00:13:15.220And when prospective employers look them up, they get back fake crimes.
00:13:21.780You know, imagine how that's going to destroy people's lives.
00:13:23.980And if the only fix to that is don't speak out about politics, you know, don't speak out about what you actually think.
00:13:32.040You can see where it gets dangerous very fast.
00:13:34.680Did this start when you were running for office?
00:13:38.780And I want to point out that Senator Paul thinks you have a good chance of winning your case.
00:13:46.960And I agree based on hearing J-Cal from the All In podcast talking about something like this.
00:13:56.620And I also want to add, because I'm trying to get in without the noise coming back, but that, you know, Scott initially, before AI was even a thing, was like, oh, you know, AI.
00:14:07.160Like, I'd want an AI of me, like, after I die and da-da-da-da-da.
00:14:12.820But then when he saw all the flaws with the AI and how it was going and it could be programmed and whatever, he then said over a year ago, no, I don't want this.
00:14:23.640You know, it's not at all where it should be.
00:14:25.780And, you know, you see that it can assign opinions to people that those people never had.
00:14:31.620And essentially what I said about this kind of defamation is that it turns somebody into an actual puppet.
00:14:39.320And now somebody else is their puppet master.
00:14:41.840And it is very distressing, very upsetting.
00:14:46.860You know, it's their likeness, their voice, whatever.
00:14:49.700And for you, it's like, oh, okay, like, here's what we're saying about this guy.
00:15:04.120It'll ruin even the history of who you are, you know, long after you're gone.
00:15:11.440These things will live on forever, these lies.
00:15:13.740And I think that it's time to really go after, and if it's the AI company or the AI code and whoever wrote it and whoever owns it should have the living sued out of them, out of existence, and the people carrying it on also, same thing.
00:15:35.120And that was a lot to say, but I just wanted to get your take on all of those things.
00:15:39.680Yeah, you know, I don't know why a bunch of legacy media companies are not suing them over this, because they misrepresented every single one of them that was cited as a source and wrote fake stories and things along those lines.
00:15:55.640I would absolutely be suing if I was in a position to at one of those companies.
00:16:00.240Now, in terms of winning, yeah, I mean, some of the foremost legal experts have stood and said, yeah, I think there's very, very strong likelihood that Robbie will win this case.
00:16:11.560You know, there's a possibility we have a hearing, actually, in less than two weeks, and there's a possibility they may ask us to amend and come back,
00:16:20.380which is just a byproduct of the fact that we have had so many people come forward since we filed with more facts that we didn't know, including continued defamation.
00:16:32.960I hope they don't do that just because I feel like that adds a delay that, you know, Google, I'm sure, would love.
00:16:38.180But, you know, we do need to continue to be able to kind of like share what's happening as the defamation continues.
00:16:45.280But I do think we're in a great position, and I think we've got to take this, you know, all the way because I think we need to set a precedent here.
00:16:52.820And I think, you know, like you said, there's all these unintended consequences with AI if you don't get it right,
00:17:01.220if you don't have, you know, really serious principles in how you're building it.
00:17:06.320So, you know, in the case of, you know, what's happened with Scott,
00:17:11.520I can't even watch the video because it just makes me so mad.
00:17:14.460This AI Scott thing that they're doing.
00:17:16.640The truth is, imagine something like this in the hands of a foreign actor in a foreign government.
00:17:25.540And what's happening is you actually change the long-term view people have of that person
00:17:32.220because they're going to now confuse whatever happens here on out, clips of that AI here on out,
00:17:38.460with what the man Scott Adams actually said and believed.
00:17:41.920And so you materially change the way they're viewed by the public.
00:17:48.380You materially change even the way that they're viewed by people that knew them because people will start to go,
00:17:53.440wait, was that Scott or is that the thing?
00:19:35.420You know, but that's the world we're stepping into.
00:19:38.220And so I think people have to be clear-eyed about this because even if the major frontier labs do a good job of watermarking the videos
00:19:45.860or something like that to try to prevent this type of situation, there is going to be rogue AI.
00:19:50.760There is going to be AI that is developed in a country that is not friendly to the United States,
00:19:55.440that is going to seek to topple or I should say inject chaos into our elections and into our civil life.
00:20:01.860And so I am 100% positive when I tell you we are one or two elections away from a foreign actor using AI in this manner to flip an election.
00:20:10.500There is going to be a video indiscernible from reality, and I don't care which side this affects.
00:20:15.800What I care about is the fact that this dramatically alters the way that our elections and public service works in this country
00:20:23.280because it is going to happen to somebody.
00:20:25.360They're going to be accused of something absolutely heinous, and there's going to be video and audio that is compelling of it.
00:20:30.320And people are going to change the way they vote because of it.
00:20:48.680I'm going to grab a couple of questions.
00:20:51.480So the interesting part is that I'm an attorney, Robbie.
00:20:59.080So the interesting part about the lawsuit is, and I will, I have another question, but this is just a comment,
00:21:05.260that the hallucinations don't go into, oh, he won a prize of world peace, or he loves children, or he has, you know, like it goes into the negative.
00:21:16.980And my issue in regards to their claim that you're not proving malice, malice comes from also the actions that they have made.
00:21:25.140You've notified them, and yet they still keep on going and not doing anything about it.
00:21:31.280But I know that you probably can't comment further, you know, due to your attorney and your case is ongoing.
00:21:37.640But I had a question about Scott Adams.
00:21:40.400So Scott was a full supporter of you, loved you, always when I shared anything about you to him, he was like, Robbie, Robbie.
00:21:50.920He, he, he, like, he just loved you so much.
00:21:55.520And I wanted to know how, how Scott influenced your work.
00:22:02.040And I also wanted to highlight to everybody out there that Scott also had you in the Dilbert comic.
00:22:07.980So he, he, he not only loved you, but he made, I think, maybe one strip or two, I don't know, if you remember that.
00:22:17.720But I wanted to see what you're, to give me your thoughts on that.
00:22:23.600Yeah, so the malice question, you know, I'm a hard, I will say this, I feel bad for my lawyers, I'm a hard client to control.
00:22:30.600So I, I probably say things when I shouldn't, but I love them.
00:22:46.220It is incredibly malicious that they did not fix it in a timely manner, incredibly beyond malicious.
00:22:53.340Um, and it's, I, I think under the law, you know, you, you'd classify it as, you know, uh, gross negligence that, that is, you know, defined best as malicious.
00:23:04.040I don't think there's a better word for it.
00:23:06.280And so, um, I think we definitely crossed that threshold.
00:23:09.940Now, I think we actually go beyond crossing it. I think it's like, it's so obvious.
00:23:14.240It kind of slaps you in the face that it's malicious.
00:23:16.280And to your point, I can't believe the defense that this is hallucinations, which doesn't even matter as a defense, by the way, because pretend it was hallucinations.
00:23:26.720Uh, you still allowed it to continue for multiple years after being notified over and over and over again.
00:23:32.700So, uh, it kind of doesn't matter what the reason was.
00:23:35.420You built a product that does this to people and, and it's just, it's, it's untenable.
00:23:41.520So, you know, that's, that's one thing.
00:23:43.620But, um, in terms of, you know, sort of the, the long-term issue here, there has to be a better process for when this happens to people.
00:23:53.920And, you know, so I think it's incumbent on us to create that, but I want to talk about Scott for sure.
00:24:00.200Uh, Scott is one of a very small group of people whose persuasion crossed borders.
00:24:08.860You know, I mean, it really, like, it didn't matter where you were from as long as you could understand English, you know, um, Scott could have a massive influence on your life, your thinking and your process.
00:24:21.760One of the things that Scott, you know, really, uh, impacted me with was I feel like he was very measured in how he analyzed things and he didn't like to jump to the immediate gut reaction.
00:24:32.940He liked to kind of, you know, sit on something and, and, and take it in all the way.
00:24:36.720And so I sort of adopted that too, with a lot of things, uh, cause I saw that the more reactive you were immediately, the more likely it was that you might be wrong.
00:24:48.700And so you need to sit on something, collect all the facts, the evidence, and then make a determination.
00:24:52.580And so I think that has helped me a lot in how I see the world and how I sort of react to things.
00:24:58.840But, uh, yeah, no, one of the coolest things that has ever happened to me, maybe cause I'm a giant dork, but him putting me in the Dilbert comic was amazing.
00:25:08.100And, uh, I, he didn't even tell me ahead of time.
00:25:10.580I actually found out somebody called me.
00:25:12.020They were like, Hey, you're in Dilbert.
00:25:13.820And, uh, then I reached out to Scott and was like, this is crazy.
00:25:17.100Well, and, uh, he confirmed it. So, yeah, Scott, Scott has had such an immense influence on the right, especially because I know so many people who just kind of live by the idea that it's our job to be useful.
00:25:32.880And that's what I try to do every day.
00:25:34.380You know, like with the DEI project, I'd say, um, it was an amalgamation of a lot of things that made me do it.
00:25:41.180But one of them was frustration with this idea that one person can't make a difference.
00:25:45.080I think that so many people believe that, and it's probably one of the top, if not the single most toxic idea that we can have, that we can't make a difference, that we simply are not important enough or powerful enough as an individual to make a difference.
00:26:29.920Yeah, well, um, I, I wanted to ask a little more about your lawsuit with Meta, because I think this is actually the second lawsuit for defamation that you've been through.
00:26:38.680I know that one got settled and it sounded like from the little blurbs I've seen, at least that, that they were much more receptive to fixing it and they wanted to work with you on it and all that.
00:26:48.620But I wanted to get your perspective on that.
00:26:50.220And, you know, was, was, is that the type of response you were hoping for from Google?
00:26:54.260And, and do you think that it was, you know, the people at Meta were just, do you think they took the right approach for that?
00:27:02.200I think that, uh, you know, it was handled really well by Meta.
00:27:06.280You know, they understood that, you know, they, they, they don't want a biased AI.
00:27:12.580They want an AI that does a great job and is fair to people and tells the truth.
00:27:18.520And so, you know, I'm an advisor at Meta, for full disclosure to everybody.
00:27:22.840I'm an advisor for the AI for that very reason to ensure that there's not bias injected.
00:27:28.680You know, like I said earlier, I don't care who this affects.
00:27:30.800I don't care if it's Democrats, Republicans, independents, you know, if you're, you know, uh, yellow with pink polka dots, I don't think that AI should be able to lie about you, harm you in any way or misrepresent you.
00:27:42.220So, um, you know, I think they handled it the right way.
00:27:45.680It was my hope that Google would want to fix things and do the right thing.
00:27:50.360Unfortunately, Google has chosen a different path, but, you know, um, this happened independently of that situation.
00:27:56.680So it's, uh, it's just wild, you know, for the best I can tell, uh, the initial issue started with Google.
00:28:07.920And, and in terms of hallucinations, I, I would just say, I, I totally agree that I don't think it could be a hallucination when it's so consistent.
00:28:15.040I mean, I, I do have a pretty, I think, decent technical understanding of how AI works and how hallucinations work.
00:28:21.000And at least as I've understood the definition of what a hallucination is, is that it's mainly built on the randomness of the algorithm that these AIs depend on, where it has probabilities of predicting what's the next word in this response I'm giving.
00:28:36.180And it will often give you the most probable one, which would all be based on the training, but sometimes it will give you the second one or the third one or the fourth one.
00:28:45.060And so if that were what was happening, where it was just doing a hallucination, then it would only happen one time.
00:28:50.680It wouldn't happen over and over and over again.
00:28:52.940And it certainly wouldn't give you like a complete fake article with all that.
00:28:57.500So to me, that does seem like a ridiculous defense.
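The sampling behavior described above can be sketched in a few lines of Python. This is a toy model under assumed values, not any lab's actual decoding code; the vocabulary, the logit scores, and the function name are invented for illustration. It shows the point being made: sampling from the model's probability distribution picks a low-probability next word only occasionally, so a hallucination driven purely by sampling randomness shows up sporadically, not as the same false claim repeated consistently.

```python
import math
import random

def sample_next_token(logits, temperature=1.0, rng=random):
    """Pick a token index by sampling from softmax(logits / temperature)."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()                     # uniform draw in [0, 1)
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1                # guard against float rounding

# Hypothetical toy vocabulary and scores: the model strongly prefers
# "journalist" as the next word; "criminal" is a low-probability tail option.
vocab = ["journalist", "director", "criminal"]
logits = [4.0, 2.0, 0.5]

rng = random.Random(0)
counts = [0, 0, 0]
for _ in range(10_000):
    counts[sample_next_token(logits, rng=rng)] += 1

# The improbable continuation appears only sporadically, never consistently.
print({w: c for w, c in zip(vocab, counts)})
```

Under these assumed numbers the unlikely word is drawn only a few percent of the time, and a different draw happens on every run, which is why a truly random "hallucination" would not reproduce the same detailed accusation over and over.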
00:29:00.280Um, and I don't know what legal standing that would even have because they are still defaming you.
00:29:06.220I mean, regardless of whether it's a hallucination or not, they built this thing and I don't think they can stand on something like section 230.
00:29:13.920Maybe Marcel knows more about that than me, but, um, section 230, I think is, is where it's user generated content, but this is not user generated content.
00:29:23.620So they're the ones that are defaming you over and over and over again, every time someone asks about you knowingly.
00:29:31.220Section 230 has no application here at all.
00:29:34.980Um, I'd be surprised if they tried to apply it because they are the publisher of the content and frankly, they probably don't want anything remotely capable of piercing the veil of section 230.
00:29:45.520But, um, I do think there probably has to be an update, you know, in general, I think you've got to have broad ability to allow people to speak.
00:29:53.600I think freely online, but beyond that, you know, you have to have lines in the sand on malice and especially content that they are publishing through these AI chat bots.
00:30:02.360But yeah, you know, the absurdity of the hallucination defense is the fact that there, there's not anything that's like Robbie loves to save puppies, right?
00:30:22.680It's all like, you know, he, he was on drugs, sold drugs, shot somebody, um, groomed actresses.
00:30:28.920That was another one that was like mind blowing to me.
00:30:32.820Um, and so it, it, it's very clear there's this like absolute negative tilt to it that even just that on its own, you know, throwing out the fact that it's consistent all the time.
00:30:45.620Um, it just like, I, I don't know how you get to the place where you could try to pretend it's a hallucination.
00:30:51.340I think you'd have to be a non-technically savvy person to buy the excuses to hallucination.
00:30:56.140Um, but even then most non-technically proficient people, when they see something like this, if someone says it's a hallucination from a machine, they go like, what are you talking about?
00:31:06.980Like, you know, it's, it's, uh, it's hard to buy no matter where your vantage point is, and people should, people should think about that. It's so weird to me.
00:31:17.260And I'm like seeing this a lot with, is it Claude, how it starts to talk to itself and reveal what's happening.
00:31:26.380I'm like, you're like telling on yourself, AI, but okay.
00:31:29.940You know, and Robbie's like, you know, is this intentional?
00:31:35.420Like, you know, we don't want to do it, but it's making us do it.
00:31:37.740I'm like, this is the strangest thing.
00:31:39.500And I feel like, um, I am not a lawyer like Marcella and I have no idea, but, um, we are redoing our will, and I am putting into my will that nobody, you know, after I die or while I'm alive, can use my likeness, my voice, my whatever.
00:31:56.700Um, because I don't know who's protecting us.
00:32:01.280Um, yeah, I just figure it's a, it's a wild west out there with AI and, um, and I just, I feel like, you know, maybe make a public statement on your platforms.
00:32:14.880It's still in there and I pinned it that, you know, because obviously it wasn't good enough, you know, for Scott to make a video, a statement, have his brother make a statement, have his estate, make a statement.
00:32:24.740And because I'm like the face out there, like trying to defend this.
00:32:29.220Now people are like, Oh, Erica doesn't want this.
00:32:31.280And I'm like, it's not an Erica thing.
00:32:32.700This is a Scott Adams genius human being who worked his whole life to give himself to us, to be useful.
00:32:46.680And the, the, the, there's one person in particular who I won't name who was given really good advice to basically stop what he's doing.
00:32:55.920And it's also not permitted, but now is putting an AI version of Scott out there, teaching other people how to make an AI version of Scott.
00:33:07.480So for those of you that think Scott would love this, you are so off track and you give zero shits for him, his legacy, respect for his family.
00:33:21.260And if it was like your father or your mother or your son, brother, someone you love who was just like this amazing person and they passed away.
00:33:30.020And now someone else, this is, I'm sorry, but this is me, like shoves their hand up their back and now is their puppet.
00:34:29.320Give me your slings and arrows if, if, you know, you feel like this is never a good thing, but I'm, I'm here for it because, uh, he was my friend.
00:34:36.980I loved him and, and more than that, I respect him and I respect every single person watching this, that it would never happen to you because the pain it is causing and the trauma is like, you can't even imagine.
00:35:04.660The idea of like, you know, if I pass away that my kids could open up their phone and see a fake version of me saying something, maybe something horrible.
00:35:16.340Um, so it's, it's, it's unfathomable to me that somebody thinks this is a good idea.
00:35:21.860You know, that somebody, and especially if they're pretending that they were a fan of Scott or loved Scott in some way.
00:35:29.520Um, I hate to tell you, if you're doing something like this, you're nothing but disrespecting his legacy and who the man was.
00:35:37.400And so, you know, I hope, I hope if they see this, the person who's behind this, that they do the right thing and they stop doing this because it's just, it's horrible.
00:35:46.980And the family, the family, I think should come first, right?
00:36:11.820They're the ones who have to live here after I'm gone.
00:36:13.720Um, but you know, I, I think it's incredibly disturbing that somebody thinks this is a good idea and that they should be the one who gets to decide that it is okay.
00:36:26.300I know that somebody thinks, or they might feel, otherwise.
00:36:34.440Yeah, Robby, I have questions regarding, I have so many questions, Robby, but, um, um, I wanted you to talk about your background to the audience that might not know.
00:36:45.140And also about your music video directing and Hollywood and all of that.
00:36:50.520Though that came before, because, as you know, a lot of our audience loves Scott, obviously, and he believed
00:36:56.980in talent stacks, and you have one of the greatest talent stacks that I've seen, um, meaning that you've
00:37:07.160come from a different angle than most, uh, political commentators, and it's, it's so
00:37:14.540interesting, um, what you've done and what you're still doing. So I want them to know a little bit
00:37:21.240about that. You're still on mute. So sorry. Am I okay? There we go. Um, you know, I'd say this might take a
00:37:38.360while, but what really, to me, uh, provoked me to speak up and to take a political stance publicly, which
00:37:45.840pretty much set my career on fire in Hollywood, was the fact that my family came from Cuba. So my family
00:37:50.940already lost everything to communism once, and I mean everything. And so, you know, having had that
00:37:57.720experience in my direct family lineage, you know, you grow up with constant reminders of how lucky you are
00:38:05.160to be American. Just constant. And so, you know, I understand very clearly. My father figure in my
00:38:12.260life was my great-grandpa, and he reminded me all the time that the reason it was able to happen in
00:38:18.500Cuba was that so many people were afraid to speak up, to ruffle feathers, lose business, whatever it might
00:38:23.560be. And in the end, they lost all of those things anyways, right? So, I mean, they should have just stood
00:38:29.140up when they had the chance, but that was lost on them in the moment. In the moment, safety felt like
00:38:33.880silence. You know, if I was silent, I was going to be safe. And too few people spoke up. And so, you know,
00:38:40.020he imparted on me this need, not a want, but like an absolute need, a duty, to stand up if you ever see
00:38:48.040signs of this in America. So here I am, a young man. I come from a family that didn't have a lot. I didn't
00:38:53.220have Hollywood connections or anything, and I built a business, really bootstrapped it myself. I didn't even,
00:38:58.540I got denied the one time I went to a bank to try to get a loan. Um, so that, that tells you, like,
00:39:03.080this was like a bottom-up operation. And so I start this production company, and we got to be
00:39:09.220very successful. We had, you know, over 14 directors across the world. I directed Oscar-winning actors,
00:39:14.140actresses, some of the biggest music stars in the world. Um, you know, was nominated for and won a lot
00:39:19.560of the biggest and most prestigious awards out there. And, um, you know, what's funny about that, actually,
00:39:24.560side note: I never went to an award show when I was nominated. Um, not for the MMVAs, not the VMAs,
00:39:30.120none of it. I never went. Um, I always thought that Hollywood was kind of a disgusting hellhole, and so
00:39:35.740I didn't really want to be around the people unless it was directly work-related and I was getting paid
00:39:40.180for it. Um, it was just never a fit. I'd rather be home. And so, um, that speaks, I think, a little bit to
00:39:46.200my psyche. I'm, um, I'm much more happy and comfortable at home with my family than I am
00:39:50.280going and doing these things. Um, which probably makes me more effective, because I don't have to worry
00:39:54.400about the interpersonal relationship stuff that other people worry about, because, frankly, I don't give a
00:39:58.500shit if people like me, so I can just tell the truth, and, you know, I'm, I'm good. So, uh, you know,
00:40:05.820from there, you know, I come out in 2015, I endorsed Trump, and, you know, really came out full bore as
00:40:13.420a conservative to the right of Trump. And I think that surprised a lot of people, and I'm not sure why,
00:40:19.040because, you know, privately, I never really hid my views. You know, I was very, very pro-gun, very
00:40:24.420anti-communist, so I'm not sure why it was shocking to people. I do remember, after I came out, though,
00:40:29.960publicly, because I think I did it on Fox, Fox or Fox Business, I can't remember, it was like a TV
00:40:34.580interview, and I come out and I endorse him and do that, that whole thing, and, um, get a phone call from
00:40:39.920one of our big clients at the time. So we had, um, I can't say exactly who, but I can say this: it was one
00:40:45.280of the very largest movie studios in the world. They were one of our clients, and a long-term client, where
00:40:51.060we had a long-term contract. We provided extra production, um, for some of the big movies and things
00:40:56.580like that that they couldn't provide. So when they needed extra crew, they needed another director, they
00:41:02.380needed, you know, a side project, whatever, like, we did all that stuff. And so I get a call from our rep
00:41:08.440there, and they go, Robbie, Robbie, Robbie, Robbie, you know, it's not too late to turn back on this, right?
00:41:14.440I love you, I consider you a friend. You are going to destroy your company and your career. It is not too
00:41:20.280to turn back you just got to tell people you didn't understand his immigration policy you know
00:41:25.280you're latino and that'll work you know people buy that and you know you can move on and i go well
00:41:32.020you know the tough part about that is the immigration policy was kind of the settling point for me
00:41:38.500and i was like i i went into this knowing what i was doing like i know all of you people who pretend
00:41:45.620to be so tolerant are actually the most intolerant people on earth so i'm not shocked by what's
00:41:50.360coming um i i knew what was coming so uh you know as that's going on a lot of people didn't know this
00:41:57.000about me, and still don't know this about me: I'm an investor too. So I invest, and I'm, I would say,
00:42:02.040very wise about it. Maybe not Nancy Pelosi, but I'm pretty good. And so I sort of
00:42:09.660prolifically invested in real estate and in the markets and have done very well doing that.
00:42:14.720And I think that, for me, I've always felt it was very important to use whatever extra
00:42:21.240money I could to fight for what I believe in. And so we've done that. We've tried to put money to work
00:42:25.200for political things, but also non-political things that I feel are an inherent good
00:42:29.880in our country. Like when we had the horrible hurricane affect us here in Tennessee,
00:42:36.680it happened especially in East Tennessee: power was out, people didn't have electricity, they
00:42:41.980didn't have internet. And so we stepped in, actually faster than emergency response did, and we eventually
00:42:51.240even gave these to Tennessee's emergency management association. Starlinks: I bought up all the
00:42:58.660Starlinks I possibly could, and Elon helped get me with the right people at Starlink, and
00:43:04.320we were able to deploy them super fast here in East Tennessee. I'm not in East Tennessee, but
00:43:09.800we had our people there. And so things like that, as well as the political missions, right?
00:43:15.760Funding everything necessary to be able to be effective, changing laws and changing sort of the
00:43:23.340cultural fabric of how certain people look at things, whether it be the transitioning of
00:43:28.020children or the death penalty for pedophiles. That's another one that my wife and I have
00:43:33.120worked very hard on, and we got it done here in Tennessee, and now it's spreading to other states. You
00:43:37.480know, I think we've got Alabama and Florida now on board, and there are going to be many more that we
00:43:42.480have been talking to and pushing along that way. So I think that's sort of my life
00:43:49.800mission now: use my ability to create money out of thin air, and go and use it
00:43:59.900to do good in the world. And so I feel like if you have those talents, those abilities,
00:44:05.980you should use them for good, and that's what I'm trying to do. And what's wonderful too is I feel
00:44:10.700like I have a great following of very aligned people who have done well in life and have decided
00:44:20.400to back me for that same reason. They want to put their money to work too, and that, through the DEI
00:44:26.540project, was like the lifeblood of it, being able to pay for different things we needed to
00:44:31.280do. And it's cool to see that there's other people who feel the same way that I do, that we
00:44:36.900need to put our money to work to actually change this country and the world. I love that, a prime
00:44:42.640example of it all, which was Scott's motto. And one thing I want to say to Happy Eye Doc: yes, Robbie
00:44:50.860does resemble Clark Gable, yes he does. And also, Robbie, as far as you showing and telling us,
00:44:58.960you know, by all of your examples, what you as one person can do, so we as, you know, one
00:45:06.140person can also do useful things: in case the sippers aren't aware of what you did with the
00:45:14.420DEI, can you give us just like a little compact thing? Because that was one of Scott's
00:45:18.880most proud things that you were doing. He was just like, look at what Robbie's doing, he's
00:45:25.360one guy. Yes, Robbie. You know, a lot of times we would be in the pre-show before going live
00:45:30.520too, and we would talk about it more, but I would hear it, like, one after the other. I'd see your
00:45:36.400posts and see what you were doing, and you're like, holy shit, Robbie is taking on the world for
00:45:41.860us. It's for us, you guys, and that's how I always viewed it. Robbie's got a beautiful
00:45:47.320family, and I'm sure he wants the world to be great for them, but nobody takes on this kind of shit from
00:45:54.060people and puts themselves out there and, you know, takes slings and arrows if they're not doing it for
00:46:00.940the greater good, I'm telling you. And Robbie, give them just a quick version of what you did with the DEI,
00:46:06.200because I know we only have about 10 minutes, and I know Owen has more questions too.
00:46:10.240The quick rundown of it is, essentially, the right had not applied pressure
00:46:16.500at all in corporate America for decades. And so, simple physics: if all the pressure is
00:46:22.420coming from one direction, everybody knows what's going to happen, it's going to go that way. If you don't
00:46:26.180have a counterforce, nothing good, right? And what's interesting about it too is the left-wing
00:46:30.900pressure that there had been for decades was a total paper tiger. It was groups like the HRC, who
00:46:35.080actually have no real popularity or backing. And so as I see corporate America shift wildly
00:46:40.680left, I was like, we need to create a cultural counterweight and prove our actual power, but it
00:46:46.100has to be dynamically focused. You've got to go one by one. And I realized one of the big problems on the
00:46:50.860right was that we're all over the place, we're sporadic on a million different things, and so
00:46:54.520you never get this very focused pressure. And I think that's what we worked very hard at:
00:47:00.520ensuring that when we had a company that we were focused on, we brought pressure to bear from
00:47:06.200every side of the right. And so doing that, we'd go company by company, whoever had the craziest policies,
00:47:13.500you know, all the woke stuff, funding transitions for kids. Some of them were funding drag camps for kids
00:47:18.700in the summer, and we're talking about Fortune 500 companies, right? I would have our research put
00:47:24.480together into one video, expose all of it, and then the boycott that would ensue after that would change
00:47:32.900the policies. And we'd get in contact with the CEOs, talk with the CEOs, and
00:47:39.420sort of negotiate a surrender, so to speak. And we did so very effectively. So what's
00:47:46.040interesting is that public campaign, at a certain point, we didn't even really have to do anymore,
00:47:52.720because now we can just call these companies when we spot something and they're ready to change.
00:47:58.080No story required, right? And so I've actually gotten more done in the last six months
00:48:04.120changing policies at companies than we did in the first year and a half of this project. And I think
00:48:10.240that's really interesting, because I haven't been just pumping out videos of it like we initially did
00:48:15.160for the first year and a half, because I'm so focused on using the time effectively and being useful
00:48:19.760in the most efficient way possible. Sometimes it's going to take videos and public pressure, but
00:48:24.800sometimes you prove your ability to do this in such a way that you can just get a lot
00:48:29.760done behind the scenes. That's amazing, that's amazing, you guys. I can see that,
00:48:35.680and all the platforms that are just falling in love with you, as you should be. You know, please.
00:48:40.960And so, sorry for the echo, you guys. Let's see if
00:48:56.880Owen can jump in. Owen, are you there? Yeah, I'm here. So I wanted to ask about your run for
00:49:15.520office. You ran for office in Tennessee, and apparently the Republican Party didn't like that
00:49:19.680very much, and they somehow tried to claim you were not a bona fide Republican and kicked you
00:49:25.520off the ballot. Are you going to try and run for office again, and can you tell us a little bit about
00:49:29.600what happened there? So it's incredible. I thought I lived in America, right? I didn't know this was
00:49:36.160possible, so what I'm about to lay out for people is going to blow their minds. I ran for Congress, and
00:49:41.440when I ran for Congress, this was, I guess, about... how many years has
00:49:49.280it been? I guess six years. Time goes by faster than you think. And when I
00:49:56.000started the run, I thought, you know, this is America, anybody can do this, right? You just have to become
00:50:01.760popular with the voters, and they have to want you, and that's it, right? And, by the way,
00:50:07.680just a side note: I have been a registered Republican my entire life, okay? I have only voted for Republicans.
00:50:15.360I have literally never voted for a Democrat. And they try this thing, and I will say we've mended
00:50:23.040fences, so I don't want to be too hard on anybody, but just to explain what happened, because it still
00:50:27.520blows my mind. I run for office, I get a gigantic lead in the last poll done in the race before
00:50:33.600I was removed from the ballot, and by the way, I was endorsed by pretty much everybody.
00:50:37.680In the last poll before I left the race, there was a 30-point gap between me and the second person,
00:50:42.800who eventually became the congressman for the district, somebody I really don't like,
00:50:46.480but that's another story for another day. So when this happens, I get removed from the ballot, I sue,
00:50:52.960I win in state court. The state court says, yes, he absolutely should go back on the ballot, so
00:50:57.600that's what's going to happen. Then, last minute, I think it was like 48 hours before ballots
00:51:03.120are printed, the state Republican Party appeals to the
00:51:08.240state supreme court. The state supreme court, I think as everybody knows, is appointed by establishment
00:51:15.360governors, typically. And so they side with the party and say, yeah, they can just remove him for any
00:51:21.120reason they want. And so I was removed from the ballot and could not run as a Republican, and at that
00:51:28.160point it's too late to put your name on the ballot as an independent or something like that, which
00:51:31.840I wouldn't have wanted to do anyway, because I am a Republican, I've been one my whole life.
00:51:36.000So instead, what people did: there were the most protest votes in congressional history,
00:51:42.400write-ins of people writing my name in protest, because you can't realistically win a massive
00:51:47.840congressional race, a federal race like that, with write-in votes. And I actually told quite a few
00:51:53.280people I didn't want them to do this, because I wanted to ensure that the best person won that
00:51:59.440primary, which I don't think really ended up being the case. But yeah, that was it. That's
00:52:06.320America. That can happen. You can get kicked off the ballot because somebody decides they don't want
00:52:11.440you to run. So wild, but yeah, that can happen in America. Yeah, it seems like some of the most
00:52:20.880influential, persuasive, decent voices get canceled or shut down or refocused. It's a crazy thing.
00:52:31.280Oh, go ahead, Rob. I will add to that: I am so thankful that it happened. I thank God all the
00:52:39.680time that it happened. And this is actually sort of breaking news here: I recently was
00:52:46.560essentially offered up a congressional seat, okay? It's a long story, but I would have absolutely
00:52:53.920won this, and it would have been mine if I wanted it. It was a special election, so I think
00:53:00.400everybody can put two and two together if they just look at what special elections happened.
00:53:04.800And I ended up saying no, because I realized something in the time between when I ran for office and now:
00:53:11.280I actually want to be an effective person. The United States Congress is not a place for effective
00:53:18.160people, at least it hasn't been for a long time. I hope that changes. Maybe one day that changes, maybe
00:53:23.520one day it is a place that actually legislates and does their damn job. Today it does not. Today
00:53:29.120it's a clown show where people go to do, I guess, whatever clown show it is they want to be a part of.
00:53:34.160I don't want to be a part of a clown show. I want to get stuff done, so I'm much more effective
00:53:37.840outside of government at this point. The only thing I could ever really see myself running for,
00:53:42.400since I've been an executive my whole life, is some sort of executive position where I could
00:53:46.720actually just make decisions and change things on my own, whether that be local or state,
00:53:51.760whatever. That's the only thing I could see myself doing, because I have a very strong mind for
00:53:57.360that. Legislatively, I would be driven nuts. They'd have to throw me out, honestly,
00:54:02.640of Congress at this point. Had I been elected, I would have definitely been thrown out. They would
00:54:06.320have censured me or kicked me out for some reason, because I wouldn't have been able to hold my tongue
00:54:10.800at what a farce this is. How far are we into Trump's second term, and what the hell has Congress
00:54:18.640done? What have they done? Pretty much everything good has been an executive order, and the
00:54:25.680only things that they've codified, like a couple of them... I mean, they've renamed some post offices.
00:54:32.000I don't know what to tell people. I mean, everything they do is either incredibly corrupt
00:54:36.720uniparty favors or something self-serving. There's very little we get out
00:54:42.640of Congress. So I am incredibly thankful things worked out the way they did, because there's so
00:54:47.280much good that I was able to do from that point in time to today. And that's a good lesson for people,
00:54:52.400because when it happened to me, I was so upset, and I'm asking God, why is this happening,
00:54:58.640right? I feel like I was supposed to do this, why is this happening? And so it's a good lesson for
00:55:04.400people that you may be really upset something doesn't work out sometimes, but there's probably a
00:55:09.760damn good reason for it, and there may be more than a silver lining on the other side of it. There may be
00:55:14.800lessons that you needed to learn in that failure, and it's a stepping stone to something that
00:55:18.800is going to be your greatest success. So I'm thankful that it put me on a path to be able
00:55:24.960to accomplish what we've been able to do since then. That's a... I'll just wait for the...
00:55:31.920Me? Okay, thanks. That's a fantastic message, and I say that often: you fall down seven times, get up eight.
00:55:38.720And there are so many people... sometimes I have friends that'll say, oh, so-and-so should, you
00:55:44.320know, be in the administration or do this or do that, and I'm like, they will be so hamstrung
00:55:49.440if they are part of politics. And sometimes the best way to effect change is to be on the outside
00:55:55.440of it and influence all around it. Because you're right, I mean, I think Hillary Clinton's only accomplishment,
00:56:00.960not even an accomplishment, was naming a post office. I don't know what the post office fetish is,
00:56:06.960but it's all about changing the names of post offices over there. So, anyway. But Robbie, I would have been
00:56:13.760just like you if I had gone into office, because clearly I also am passionate about that. I can
00:56:19.600only imagine. I wish that you had made it into office, you know, but you're
00:56:27.440right: imagine, everything that you've done since then would not have happened. Your DEI campaigns,
00:56:33.920your War on Children documentary, all of the things that have been effective and have
00:56:41.840helped all of us. And we have to thank you, because you actually made such a difference.
00:56:48.720I don't think you would have been able to do that had you been in government.
00:56:52.640Yeah, we owe Robbie a big thanks. You're right about that. I would not have been able to
00:56:59.360do that stuff. I mean, The War on Children alone... I would have to say it's a close thing between
00:57:05.680that and the DEI work, but I almost feel like The War on Children was the most effective thing that
00:57:10.160I've ever done, because the number of child protection laws that it ended up inspiring in
00:57:17.120many different states is insane. There are so many states that have banned transgender surgeries
00:57:23.120for kids, and hormones, as a byproduct of that movie and the effectiveness of it. Over 60 million
00:57:29.680people watched that movie, which was wild. Elon Musk was a big supporter of it, and the Trump
00:57:36.000family was too. And so it crossed a lot of political boundaries, where it was a film
00:57:43.680that an independent can watch, a Democrat can watch, and if you're just a rational person who cares
00:57:50.560about kids, you're going to watch it and go, oh, we have a problem here. And it's not just about
00:57:54.560the transgender issue. It delves into everything from social media and technology, the way that it's
00:57:59.680affecting our kids, to the psychological aspects of what's affecting kids today, the education
00:58:05.520system and the failures inside of it, and how we fix it. That's a big part of this: how do we fix
00:58:11.520this? Because we have a generation of youth that believe in God the least, they report the highest levels
00:58:18.160of mental illness that we've ever seen in youth, and then you've got this
00:58:24.240transgender craze that started to happen during that time. This is a bad combination, right? And so
00:58:29.680the heartening side of this, though, is that in the time after that movie, there was this rapid turn to
00:58:35.600God that has begun to occur in this generation. So at the time we made it, it was the most faithless
00:58:40.800generation. Now we're veering out of that territory, and young men especially in that generation are veering
00:58:47.920toward being the most conservative generation in American history among young men, which is
00:58:54.800a very welcome development. And I think that is a reason for hope for all of us to kind of hang
00:58:59.760our hat on: we've got a lot of young men in this country who have been awakened
00:59:06.160to how dangerous leftism is. Amen. With that said, Robbie, I want to thank you on behalf of all of
00:59:15.360us at the Scott Adams School, all of the simultaneous sippers, all the beloved. It has been such an honor
00:59:22.000having you on. And I want to let people in the comments know: if you go to robbiestarbuck.com, you
00:59:28.080will find the movie The War on Children. It's there at robbiestarbuck.com, and Robbie also has a podcast,
00:59:35.360The Robbie Starbuck Show, and we'll link everything after the show. So Robbie, I hope you'll come
00:59:41.680back again, and we would love to talk about all the issues going on with children. We're having Cory
00:59:47.360DeAngelis on this Thursday, another amazing fighter for children. And did you want to say something?
00:59:55.600We always do a little closing sip, Robbie, but if you had a final message, I want you to have that.
01:00:01.200I will be a part of the closing sip. I have my Scott Adams mug here, which, by the way, for those who
01:00:06.000don't know, I got this mug a week ago, or a little over a week ago, and it is going to be on
01:00:12.400my desk until it breaks, okay? So when we do shows, when we film stuff, Scott's mug is going to be
01:00:17.200here as a reminder that even after we're gone, the effect we have on other people can live well
01:00:24.160beyond us. One day I will die. Might be tomorrow, might be 50 years from now. But one day
01:00:29.680I'll die, and my hope is that somebody else will have been affected by my work and will,
01:00:33.840you know, have some sort of memory there to remind people that we affect each other long after we're
01:00:38.720gone. There's this beautiful butterfly effect to everything that we do. And I always say to my kids,
01:00:44.880especially my oldest son, and I'll say it a lot to my younger son too as he
01:00:50.960develops, that who you are is defined by the choices you make that nobody ever knows about.
01:00:57.280It's defined by who you are when nobody's looking. It's very easy to do the right thing when everybody's
01:01:02.480watching you. It's very easy to pretend you're a holy person when everybody's watching. It's much
01:01:07.040harder when nobody's looking and the only person who knows is you and God. That's what I try to
01:01:12.960always remind my son of. And so I think that's something that, if we all live
01:01:19.600it out, and we try in those moments when nobody's looking to make the world a better place, to make the right
01:01:24.240choice even when it's hard, and I'd say especially when it's hard, then we will begin to see a truly better
01:01:31.200world all around us for our kids and our grandkids. Because I think we need to leave this world a
01:01:37.120better place for them than we found it. And unfortunately, this is the first generation where,
01:01:41.760because of AI and a lot of other things, it's not a sure thing that we will leave our kids
01:01:46.320a better world than the one we came into. As of right now, I can say this: when I was born in the 80s,
01:01:52.400the culture around America was a much better place than the one we have today, and so I hope we can get
01:01:59.360to a place where that's no longer the case. But right now I'm pining for the 90s. Like,
01:02:04.160the 90s rocked. I would say, like, every day, I wish I was waking up in the 90s.
01:02:09.120But thank you for having me. I really appreciate it. You guys are making a big difference, and I'm
01:02:12.880so glad you're carrying on Scott's legacy by continuing this and continuing to meet together
01:02:18.320and sip together. And I'll be tuning in sometimes. That's what I did with Scott.
01:02:23.840Unfortunately, with four kids and a farm and all of the projects we're doing, I don't get to do it
01:02:27.760every day, but if I'm going to watch something in the morning, it's going to be
01:02:32.960this, just like it was with Scott. We love that, love that. On behalf of Owen and Marcella and myself
01:02:39.600and everybody: thank you, Robbie Starbuck. And everyone, you know Robbie's the example. Scott
01:02:45.040was our example. Go out there and be useful. And to Scott, to Scott: see you tomorrow.