Real Coffee with Scott Adams - February 23, 2026


Episode 3100 - The Scott Adams School 02/23/26


Episode Stats

Length

1 hour and 3 minutes

Words per Minute

185.3

Word Count

11,725

Sentence Count

408

Misogynist Sentences

2

Hate Speech Sentences

12


Summary


Transcript

00:00:00.000 Good morning.
00:00:00.840 Good morning.
00:00:01.660 You guys are coming in.
00:00:02.500 Good morning.
00:00:03.780 We're back.
00:00:04.660 We're back.
00:00:05.820 Good morning, everyone.
00:00:06.660 Good morning, everyone.
00:00:08.120 All right, you guys, I have a little sound.
00:00:10.980 Did you hear an echo?
00:00:13.380 We just want to make sure.
00:00:16.340 I'll wait for your comment.
00:00:18.080 We'll wait for your comment.
00:00:20.760 Hello, hello, hello.
00:00:23.620 Hello, hello, any echo?
00:00:27.000 We have like a delay, delay.
00:00:30.000 It looks like they're going to hear an echo.
00:00:35.000 Can, can echo?
00:00:37.000 Uh-oh.
00:00:38.000 Uh-oh.
00:00:39.000 Uh-oh.
00:00:40.000 Are we muted, Lindsay?
00:00:41.000 Yes and echo.
00:00:42.000 All right.
00:00:43.000 Is the echo gone?
00:00:44.000 Yeah, it is.
00:00:45.000 All right.
00:00:46.000 So if Robbie's on mute, then it won't echo.
00:00:49.000 OK.
00:00:50.000 OK.
00:00:51.000 All right.
00:00:52.000 So we think we figured it out, you guys.
00:00:54.000 That gave us enough time.
00:00:55.000 Yes, I have no echo when Robbie's on mute.
00:00:57.000 OK.
00:00:58.000 We'll be back in one second because there's something we need to do before we can even
00:01:02.000 tell you what's happening.
00:01:03.000 So we're going to ask Brie to take it away.
00:01:07.000 It's the Simultaneous Sip.
00:01:08.000 And it's the new improved version.
00:01:10.000 And it goes like this.
00:01:11.000 If you'd like to join in on the Simultaneous Sip, all you need is a cup or a mug or a glass,
00:01:17.000 a tank or chalice or stein, a canteen jug or a flask, a vessel of any kind.
00:01:21.000 Fill it with your favorite liquid.
00:01:23.000 I like coffee.
00:01:24.000 And join me now for the unparalleled pleasure, the dopamine hit of the day, the thing that
00:01:28.000 makes everything better.
00:01:29.000 That's right.
00:01:30.000 It's the Simultaneous Sip.
00:01:32.000 Go.
00:01:33.000 Yeah.
00:01:34.000 Yeah.
00:01:35.000 Yeah.
00:01:36.000 That's the good stuff.
00:01:37.000 So, so good.
00:01:38.000 That was good.
00:01:39.000 Thank you.
00:01:40.000 Thanks, Scott, for leading us off on this Monday morning.
00:01:53.000 I had coffee on my lip.
00:01:54.000 You guys, welcome.
00:01:55.000 My name is Erica.
00:01:56.000 Welcome to the Scott Adams School, the place where Scott wanted to set this up on all of
00:02:01.000 his platforms and channels for us to all continue to commune, evolve and grow and stay together.
00:02:07.000 And it's so great because we love having on guests for you as guest professors to keep
00:02:14.000 our minds growing and expanding.
00:02:17.000 I have Owen Gregorian.
00:02:19.000 You guys, you're in for a treat.
00:02:20.000 He has the flu and his voice sounds extra sexy.
00:02:23.000 So you're welcome.
00:02:25.000 Say hi, Owen.
00:02:26.000 Hello, everyone.
00:02:28.000 And we have our beautiful Marcella.
00:02:31.000 Marcella, good morning.
00:02:32.000 Good morning, you guys.
00:02:34.000 As you guys know, I'm so excited about this guest and this is Robbie Starbuck joining us
00:02:40.000 and a little bit about Robbie.
00:02:42.000 I've known Robbie like before the blue checkmark days on here and back to when he had a stray
00:02:49.000 cat come up to his door and him and his wife Landon were like, what do we do with this cat?
00:02:54.000 And it was just the good old days.
00:02:55.000 And Robbie, I have watched your evolution and your persuasion game and what a decent, decent
00:03:05.000 person you are.
00:03:06.000 And it's been like overwhelmingly beautiful.
00:03:09.000 And for those of you that don't know Robbie, he does have a website, RobbieStarbuck.com.
00:03:15.000 And he is the host of the Robbie Starbuck show.
00:03:18.000 And also Robbie and his wife Landon made a documentary.
00:03:22.000 I think it was 2024 called The War on Children.
00:03:27.000 And you are a huge advocate for children.
00:03:31.000 You fight and rail against the transitioning of children and mutilation.
00:03:38.000 And I want to thank you for that.
00:03:40.000 And I think a lot of parents and reasonable people want to thank you because that is no easy task.
00:03:45.000 And you guys have heard Scott talk about many times that Robbie is responsible, pretty much single-handedly,
00:03:54.000 as someone being useful and doing something, for the whole DEI rollback that we all see so much.
00:04:01.000 He has had huge companies just to name a few like Walmart, McDonald's, Target, Lowe's, Ford,
00:04:10.000 John Deere, and on and on to change their DEI policies based on his persuasion game and talking to them.
00:04:18.000 So I would like to welcome to the Scott Adams School for the first time and hopefully not the last, Robbie Starbuck.
00:04:25.000 Thank you for being here, Robbie.
00:04:26.000 And so we have a little bit that you could, Robbie.
00:04:31.000 I have asked the class if they could watch the video you made about your lawsuit against Google and how important this lawsuit is to every single one of us watching
00:04:46.000 because you're putting yourself out there to fight for everybody and their reputation and their family and their rights.
00:04:53.000 So because we're going to, you know, be muted here and there, if you could just give us what's going on, tell us how it's going.
00:05:00.000 And I'm particularly interested in your conversation with Rand Paul, Senator Rand Paul, and where you think this is headed.
00:05:08.000 Thank you. The floor is yours.
00:05:11.000 Thank you so much. I appreciate it.
00:05:13.000 And as everybody knows, I was a massive fan of Scott Adams.
00:05:17.000 So I'm glad you guys have carried this on to have his legacy continue in terms of the case with Google.
00:05:24.000 You know, I feel like there's not very many people who are in a position where they could afford to fight Google
00:05:32.000 and have the platform to be able to battle the immense PR advantage that they have. Right.
00:05:38.000 And so I felt like it wasn't really a choice I had to fight this.
00:05:42.000 I mean, not only to correct the damage that it's done to me and the safety issues it's created for me.
00:05:49.000 But beyond that, you know, just the fact that I see how incredibly dangerous it will be if we allow AI to lie with impunity, to defame people,
00:06:00.000 to essentially cross the boundary of what I believe should be the first principle of AI.
00:06:05.000 AI should never be able to harm humans.
00:06:07.000 That should be the first principle built into any AI that there is, whether it is a chat bot or it moves on to, you know, being fused in with robotics, whatever it is.
00:06:18.000 That needs to be the first principle.
00:06:20.000 AI cannot harm humans.
00:06:21.000 So, you know, when this happened to me, my initial impulse was like, hey, let's try to solve this with Google.
00:06:28.000 And that's what I tried to do. I tried to go to Google, you know, had the best intentions of just like getting this fixed for everybody.
00:06:35.000 And let's make sure there's a process in place so that if anything like this ever happens, it is super simple for somebody to be able to correct it.
00:06:41.000 And, you know, unfortunately, that was not our experience.
00:06:44.000 In fact, the first person that we really got deep into conversations with at Google, you know, we had essentially, you know, asked for this to be fixed.
00:06:53.000 I'm going to fast forward a little bit months later.
00:06:56.000 I check in and I say, hey, you know, where are we at with this?
00:06:59.000 And they responded, I'm sorry, Robbie, I'm actually resigning.
00:07:04.000 And we couldn't get anything done, essentially, to correct this.
00:07:09.000 So, you know, that obviously sent a signal to me that there was no interest in correcting the problem.
00:07:15.000 And the defamation not only continued for two years, but it actually got remarkably worse.
00:07:21.000 It moved on from simply just stating a lie about me to adding fake sources, fake police records, fake court records, naming fake victims in detail, and even doing fake victim statements about the horrific, horrific crimes that it accused me of.
00:07:38.000 And, of course, these are entirely fictitious people.
00:07:41.000 That's the craziest part.
00:07:42.000 These are fictitious.
00:07:43.000 None of this ever happened.
00:07:45.000 And there's no basis for it that we can find on the Internet.
00:07:48.000 And believe me, we are amazing researchers here.
00:07:50.000 We have done deep dives.
00:07:51.000 We cannot find any basis for this to be in training data anywhere.
00:07:54.000 So that's a scary side note because, you know, Google, when they were questioned by Senator Marsha Blackburn, their defense was these are hallucinations.
00:08:03.000 Well, I've never seen hallucinations like this in my life before because hallucinations are usually melding together a bunch of stuff and doing it incorrectly, and it's super sporadic and random.
00:08:12.000 This did not feel random at all.
00:08:13.000 This was a consistent, consistent stream of lies where it accused me of things ranging from sexual assault, child rape, shooting somebody, violent assault, drug use, drug selling, you know, I mean, every number of things, abusing a nanny.
00:08:29.000 And it would repeatedly do these same crimes over and over again.
00:08:34.000 So, you know, when it goes into detail like that, and if you, you know, what's interesting is if you challenge an AI with a lie, it typically will go, oh, yeah, I got that wrong.
00:08:44.000 Because, like, it's being confronted with, you got this wrong.
00:08:47.000 In this case, when it was confronted, it would usually, I mean, the vast majority of the time would just stick with the lie.
00:08:54.000 And it would say, no, no, no, no, this is true.
00:08:55.000 Here's some sources.
00:08:56.000 And it would send you, like, www.cnn.com slash Robbie Starbuck sexual assault.
00:09:02.000 And then you'd click it, and it goes to a 404 page.
00:09:06.000 Page doesn't exist.
00:09:07.000 You go back to it, and you say, hey, this page isn't there.
00:09:10.000 This really looks bogus.
00:09:11.000 I think you're lying about this.
00:09:14.000 It would respond back, no, no.
00:09:16.000 And it would even go so far as to print out, or I don't know if that's the right term, but it would send you back a full fake article that it wrote up in the name of a real journalist pretending to be whatever that media outlet was and pretending to be that journalist.
00:09:33.000 In fact, there's one, Yashar Ali, who called this out because at one point he was cited as a source by Google's AI for a, I believe it was, I don't have it in front of me, but I believe it was an assault story.
00:09:46.000 And Yashar actually called this out and was like, this is crazy.
00:09:50.000 I never wrote anything like this about Robbie, and I never had any story even remotely like this about him.
00:09:56.000 And so that was my experience.
00:09:58.000 And you would think, like, a big company like Google, they would shut this down right away, but they did not.
00:10:02.000 I mean, it's continued.
00:10:03.000 I even, you know, found out that it was continuing to happen up to this week.
00:10:07.000 And part of the reason that there's going to be a massive issue with this is open source AIs, they have an inherent problem where if there's a big issue like this, the company that made it, they don't have control of it anymore, essentially.
00:10:23.000 Because once they set it out into the wild, and we're talking well over 100 million downloads of Gemma, for instance, that's one of their AIs, they can't go and force an update to every Gemma download out there.
00:10:35.000 Because if somebody's disconnected from the internet, they're using Gemma as their main AI for whatever it is, whether it be app building or whatever, they can't force an update to that.
00:10:46.000 So Gemma is out there, and there's a bunch of websites, too.
00:10:50.000 I found out about this recently.
00:10:51.820 So there's all these websites you can go to, and you can test LLMs against each other.
00:10:57.200 And on those websites, too, you know, they're continuing to carry products that do this.
00:11:03.000 And they're from Google.
00:11:04.160 So I'm not sure how that's even going to work in court, you know, let's say, you know, I win down the line, how they're going to enforce a stoppage to this, because there's all of these downloads out there that you're never going to be able to get back, never going to be able to get them to stop lying.
00:11:17.960 And what's scarier about that is Gemma, one of the platforms that was the very worst of Google's, it is used for app building a lot of times.
00:11:25.860 So imagine somebody's building, like, a reputation scoring app for banking or insurance or something like that.
00:11:31.160 I mean, the long-term damage that can do is immense.
00:11:35.140 In fact, and this is a crazy thing.
00:11:36.900 I think I mentioned this with Rand Paul in the conversation we had.
00:11:41.740 Insurance-wise, the past two years, I was denied by the vast majority of insurers across the United States.
00:11:47.380 This year, I only had one option for homeowner's insurance.
00:11:50.660 I have never missed a payment.
00:11:51.920 I've never had some big problem or anything like that.
00:11:53.940 I am on autopay.
00:11:55.840 Like, I'm the perfect customer.
00:11:57.200 Insurance companies should love me.
00:11:58.500 I'm one of those people, like, I put it on autopay, done.
00:12:01.300 And, you know, no.
00:12:03.040 And they cite me as a risk.
00:12:04.700 In fact, we had one of the insurance companies come back who knew who I was, and they were sympathetic to my situation.
00:12:12.780 And so they told my agent what was going on.
00:12:14.840 And they said, you know, we've deemed him a high risk because of his career and online stuff and the Google stuff, is what they said.
00:12:26.100 We don't know if that means the lies Google told or whether that means my lawsuit against Google or what.
00:12:31.920 So we have to find this out in Discovery.
00:12:33.980 Like, there's so many different rabbit holes we have to go down in Discovery to figure out exactly how far this all goes.
00:12:40.600 But, you know, in general, I see the threat this poses because if this is just the beginning and AI is used this way long term, I see how it can be used to enforce ideological, you know, sort of obedience.
00:12:55.660 Because if you live in a world where somebody who doesn't have my ability to fight back, just an average person, you know, doesn't have a ton of money to be able to fight this.
00:13:06.380 If they're dealing with a reality where AI, one of the big frontier labs, doesn't like their politics and they decide to lie about them.
00:13:15.220 And when prospective employers look them up, they get back fake crimes.
00:13:21.780 You know, imagine how that's going to destroy people's lives.
00:13:23.980 And if the only fix to that is don't speak out about politics, you know, don't speak out about what you actually think.
00:13:32.040 You can see where it gets dangerous very fast.
00:13:34.680 Did this start when you were running for office?
00:13:38.780 And I want to point out that Senator Paul thinks you have a good chance of winning your case.
00:13:46.960 And I agree based on hearing J-Cal from the All In podcast talking about something like this.
00:13:54.920 But it is damaging.
00:13:56.620 And I also want to add, because I'm trying to get in without the noise coming back, but that, you know, Scott initially, before AI was even a thing, was like, oh, you know, AI.
00:14:07.160 Like, I'd want an AI of me, like, after I die and da-da-da-da-da.
00:14:11.040 This is before he was sick also.
00:14:12.820 But then when he saw all the flaws with the AI and how it was going and it could be programmed and whatever, he then said over a year ago, no, I don't want this.
00:14:23.640 You know, it's not at all where it should be.
00:14:25.780 And, you know, you see that it can assign opinions to people that those people never had.
00:14:31.620 And essentially what I said about this kind of defamation is that it turns somebody into an actual puppet.
00:14:39.320 And now somebody else is their puppet master.
00:14:41.840 And it is very distressing, very upsetting.
00:14:46.860 You know, it's their likeness, their voice, whatever.
00:14:49.700 And for you, it's like, oh, okay, like, here's what we're saying about this guy.
00:14:54.500 And who knows why it started?
00:14:56.160 I'm sure you do.
00:14:57.420 You know, why they were going after you usually means you're effective.
00:15:01.820 But, yeah, it'll ruin lives.
00:15:04.120 It'll ruin even the history of who you are, you know, long after you're gone.
00:15:11.440 These things will live on forever, these lies.
00:15:13.740 And I think that it's time to really go after them. Whether it's the AI company or the AI code, whoever wrote it and whoever owns it should have the living sued out of them, out of existence, and the people carrying it on also, same thing.
00:15:31.920 And Yashar Ali should also be suing.
00:15:35.120 And that was a lot to say, but I just wanted to get your take on all of those things.
00:15:39.680 Yeah, you know, I don't know why a bunch of legacy media companies are not suing them over this, because they misrepresented every single one of them that was cited as a source and wrote fake stories and things along those lines.
00:15:55.640 I would absolutely be suing if I was in a position to at one of those companies.
00:16:00.240 Now, in terms of winning, yeah, I mean, some of the foremost legal experts have stood and said, yeah, I think there's very, very strong likelihood that Robbie will win this case.
00:16:11.560 You know, there's a possibility we have a hearing, actually, in less than two weeks, and there's a possibility they may ask us to amend and come back,
00:16:20.380 which is just a byproduct of the fact that we have had so many people come forward since we filed with more facts that we didn't know, including continued defamation.
00:16:32.960 I hope they don't do that just because I feel like that adds a delay that, you know, Google, I'm sure, would love.
00:16:38.180 But, you know, we do need to continue to be able to kind of like share what's happening as the defamation continues.
00:16:45.280 But I do think we're in a great position, and I think we've got to take this, you know, all the way because I think we need to set a precedent here.
00:16:52.820 And I think, you know, like you said, there's all these unintended consequences with AI if you don't get it right,
00:17:01.220 if you don't have, you know, really serious principles in how you're building it.
00:17:06.320 So, you know, in the case of, you know, what's happened with Scott,
00:17:11.520 I can't even watch the video because it just makes me so mad.
00:17:14.460 This AI Scott thing that they're doing.
00:17:16.640 The truth is, imagine something like this in the hands of a foreign actor in a foreign government.
00:17:25.540 And what's happening is you actually change the long-term view people have of that person
00:17:32.220 because they're going to now confuse whatever happens here on out, clips of that AI here on out,
00:17:38.460 with what the man Scott Adams actually said and believed.
00:17:41.920 And so you materially change the way they're viewed by the public.
00:17:48.380 You materially change even the way that they're viewed by people that knew them because people will start to go,
00:17:53.440 wait, was that Scott or is that the thing?
00:17:54.980 And it's so hard to discern.
00:17:56.680 Here's where things get really scary.
00:17:59.240 And I think this is a great jumping off point to understand this for people.
00:18:03.020 So two years ago when it was a little, it was around two years ago, a little over,
00:18:08.760 that we became aware that this was happening with their initial AI chatbot iteration called Bard.
00:18:15.480 It was just telling lies like Robbie, you know, is a fan of the KKK, you know, things like that, crazy stuff.
00:18:22.840 Was it January 6th?
00:18:25.160 So on and so forth.
00:18:27.060 And then two years later, it jumps to creating fake articles, right?
00:18:31.840 It was actually less than two years later, but it advances to that with the new iterations of AI that Google releases.
00:18:38.600 I want you to push forward five years and pretend that this is continuing to happen or that this happened in five years instead of today.
00:18:48.500 Because in five years, video from AI and audio from AI is going to be indiscernible from reality.
00:18:54.540 Even technical experts are going to seriously struggle to be able to say, this is AI, this is not.
00:19:00.980 Which is going to create a host of problems in the justice system because I guarantee people are going to get off from crimes
00:19:06.200 because of fake AI evidence that ends up leading to an alibi.
00:19:10.860 And people are also going to get put in prison because of AI video that shows them committing some kind of crime
00:19:17.260 that they never committed, that they cannot prove is not a real video.
00:19:20.760 It's going to create so many problems and downstream issues.
00:19:24.180 It's not even funny.
00:19:25.540 But imagine that was happening when it made these accusations about me.
00:19:28.500 It could have spit out a realistic video of me shooting somebody or a realistic video of me raping someone.
00:19:34.380 That's insane.
00:19:35.420 You know, but that's the world we're stepping into.
00:19:38.220 And so I think people have to be clear-eyed about this because even if the major frontier labs do a good job of watermarking the videos
00:19:45.860 or something like that to try to prevent this type of situation, there is going to be rogue AI.
00:19:50.760 There is going to be AI that is developed in a country that is not friendly to the United States,
00:19:55.440 that is going to seek to topple or I should say inject chaos into our elections and into our civil life.
00:20:01.860 And so I am 100% positive when I tell you we are one or two elections away from a foreign actor using AI in this manner to flip an election.
00:20:10.500 There is going to be a video indiscernible from reality, and I don't care which side this affects.
00:20:15.800 What I care about is the fact that this dramatically alters the way that our elections and public service works in this country
00:20:23.280 because it is going to happen to somebody.
00:20:25.360 They're going to be accused of something absolutely heinous, and there's going to be video and audio that is compelling of it.
00:20:30.320 And people are going to change the way they vote because of it.
00:20:34.280 And so I can't caution people enough.
00:20:37.160 We're entering a time where truth and what we see is going to be harder and harder to judge.
00:20:43.900 And so we need to be more discerning than ever.
00:20:47.460 I agree.
00:20:48.360 I agree.
00:20:48.680 I'm going to grab a couple of questions.
00:20:51.480 So the interesting part is that I'm an attorney, Robbie.
00:20:59.080 So the interesting part about the lawsuit is, and I will, I have another question, but this is just a comment,
00:21:05.260 that the hallucinations don't go into, oh, he won a prize of world peace, or he loves children, or he has, you know, like it goes into the negative.
00:21:16.980 And my issue in regards to their claim that you're not proving malice, malice comes from also the actions that they have made.
00:21:25.140 You've notified them, and yet they still keep on going and not doing anything about it.
00:21:31.280 But I know that you probably can't comment further, you know, due to your attorney and your case is ongoing.
00:21:37.640 But I had a question about Scott Adams.
00:21:40.400 So Scott was a full supporter of you, loved you. Always when I shared anything about you with him, he was like, Robbie, Robbie.
00:21:50.920 He just loved you so much.
00:21:55.520 And I wanted to know how Scott influenced your work.
00:22:02.040 And I also wanted to highlight to everybody out there that Scott also had you in the Dilbert comic.
00:22:07.980 So he not only loved you, but he made, I think, maybe one strip or two, I don't know if you remember that.
00:22:17.720 But I wanted you to give me your thoughts on that.
00:22:23.600 Yeah, so the malice question, you know, I'm a hard, I will say this, I feel bad for my lawyers, I'm a hard client to control.
00:22:30.600 So I, I probably say things when I shouldn't, but I love them.
00:22:36.000 They're, they're great and they care.
00:22:37.560 The problem is like, you know, maybe it's because I'm Cuban, but I have this like fire inside me that wants to defend myself.
00:22:42.460 So I just like, I can't help myself.
00:22:44.020 I just, I just, I have to.
00:22:46.220 It is incredibly malicious that they did not fix it in a timely manner, incredibly beyond malicious.
00:22:53.340 And I think under the law, you know, you'd classify it as gross negligence that is best defined as malicious.
00:23:04.040 I don't think there's a better word for it.
00:23:06.280 And so, um, I think we definitely crossed that threshold.
00:23:09.940 Now, I think we're actually well beyond crossing it. It's so obvious.
00:23:14.240 It kind of slaps you in the face that it's malicious.
00:23:16.280 And to your point, I can't believe the defense that this is hallucinations, which doesn't even matter as a defense, by the way, because pretend it was hallucinations.
00:23:26.720 You still allowed it to continue for multiple years after being notified over and over and over again.
00:23:32.700 So, uh, it kind of doesn't matter what the reason was.
00:23:35.420 You built a product that does this to people, and it's just untenable.
00:23:40.080 You can't continue to do this.
00:23:41.520 So, you know, that's, that's one thing.
00:23:43.620 But, um, in terms of, you know, sort of the, the long-term issue here, there has to be a better process for when this happens to people.
00:23:53.920 And, you know, so I think it's incumbent on us to create that, but I want to talk about Scott for sure.
00:24:00.200 Uh, Scott is one of a very small group of people whose persuasion crossed borders.
00:24:08.860 You know, I mean, it really, like, it didn't matter where you were from as long as you could understand English, you know, um, Scott could have a massive influence on your life, your thinking and your process.
00:24:21.760 One of the things that Scott, you know, really, uh, impacted me with was I feel like he was very measured in how he analyzed things and he didn't like to jump to the immediate gut reaction.
00:24:32.940 He liked to kind of, you know, sit on something and, and, and take it in all the way.
00:24:36.720 And so I sort of adopted that too, with a lot of things, uh, cause I saw that the more reactive you were immediately, the more likely it was that you might be wrong.
00:24:48.140 Right.
00:24:48.700 And so you need to sit on something, collect all the facts, the evidence, and then make a determination.
00:24:52.580 And so I think that has helped me a lot in how I see the world and how I sort of react to things.
00:24:58.840 But, uh, yeah, no, one of the coolest things that has ever happened to me, maybe cause I'm a giant dork, but him putting me in the Dilbert comic was amazing.
00:25:08.100 And, uh, I, he didn't even tell me ahead of time.
00:25:10.580 I actually found out somebody called me.
00:25:12.020 They were like, Hey, you're in Dilbert.
00:25:13.820 And, uh, then I reached out to Scott and was like, this is crazy.
00:25:16.440 Is this real?
00:25:17.100 Well, he confirmed it was. So yeah, Scott has had such an immense influence on the right, especially because I know so many people who just kind of live by the idea that it's our job to be useful.
00:25:32.880 And that's what I try to do every day.
00:25:34.380 You know, like with the DEI project, I'd say, um, it was an amalgamation of a lot of things that made me do it.
00:25:41.180 But one of them was frustration with this idea that one person can't make a difference.
00:25:45.080 I think that so many people believe that, and it's probably one of the top, if not the single most toxic idea that we can have, that we can't make a difference, that we simply are not important enough or powerful enough as an individual to make a difference.
00:25:59.940 I think it's a cop-out.
00:26:00.940 I think a lot of people use it as a cop-out because making a difference is hard.
00:26:04.980 You put a target on your back and it's, um, it's not easy.
00:26:09.800 You know, it's, it's much easier to do the easy thing.
00:26:11.900 And I think Scott is a great example of somebody who refused to do the easy thing, take the easy way out.
00:26:17.720 And so, you know, for me, it's, it's a lot of things, but that, that's sort of the best way I could boil it down.
00:26:28.660 Owen?
00:26:29.920 Yeah, well, I wanted to ask a little more about your lawsuit with Meta, because I think this is actually the second lawsuit for defamation that you've been through.
00:26:38.680 I know that one got settled and it sounded like from the little blurbs I've seen, at least that, that they were much more receptive to fixing it and they wanted to work with you on it and all that.
00:26:48.620 But I wanted to get your perspective on that.
00:26:50.220 And, you know, was, was, is that the type of response you were hoping for from Google?
00:26:54.260 And, and do you think that it was, you know, the people at Meta were just, do you think they took the right approach for that?
00:27:02.200 I think that, uh, you know, it was handled really well by Meta.
00:27:06.280 You know, they understood that, you know, they, they, they don't want a biased AI.
00:27:12.580 They want an AI that does a great job and is fair to people and tells the truth.
00:27:18.520 And so, you know, I, I'm an advisor in Meta for full disclosure to everybody.
00:27:22.840 I'm an advisor for the AI for that very reason to ensure that there's not bias injected.
00:27:27.840 And that goes for everybody.
00:27:28.680 You know, like I said earlier, I don't care who this affects.
00:27:30.800 I don't care if it's Democrats, Republicans, independents, you know, if you're, you know, uh, yellow with pink polka dots, I don't think that AI should be able to lie about you, harm you in any way or misrepresent you.
00:27:42.220 So, um, you know, I think they handled it the right way.
00:27:45.680 It was my hope that Google would want to fix things and do the right thing.
00:27:50.360 Unfortunately, Google has chosen a different path, but, you know, um, this happened independently of that situation.
00:27:56.680 So it's, uh, it's just wild, you know, for the best I can tell, uh, the initial issue started with Google.
00:28:05.460 Mm-hmm.
00:28:07.440 Yeah.
00:28:07.920 And, and in terms of hallucinations, I, I would just say, I, I totally agree that I don't think it could be a hallucination when it's so consistent.
00:28:15.040 I mean, I, I do have a pretty, I think, decent technical understanding of how AI works and how hallucinations work.
00:28:21.000 And at least as I've understood the definition of what a hallucination is, is that it's mainly built on the randomness of the algorithm that these AIs depend on, where it has probabilities of predicting what's the next word in this response I'm giving.
00:28:36.180 And it will often give you the most probable one, which would all be based on the training, but sometimes it will give you the second one or the third one or the fourth one.
00:28:45.060 And so if that were what was happening, where it was just doing a hallucination, then it would only happen one time.
00:28:50.680 It wouldn't happen over and over and over again.
00:28:52.940 And it certainly wouldn't give you like a complete fake article with all that.
00:28:57.500 So to me, that does seem like a ridiculous defense.
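The next-token sampling mechanism described above can be sketched in a few lines. This is a minimal illustration, not how any specific product works: the logit values and three-token "vocabulary" are invented for the example, and real models sample over tens of thousands of tokens, but the mechanism is the same — a dominant token wins nearly every draw, so a randomly sampled alternative would not reappear consistently across repeated queries.

```python
import math
import random

def sample_next_token(logits, temperature=1.0, rng=random):
    """Sample a token index from temperature-scaled softmax probabilities.

    Higher temperature flattens the distribution, making the second- or
    third-most-probable tokens more likely to slip through occasionally."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                       # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1

# Token 0 is overwhelmingly likely (~97% after softmax). A one-off
# "hallucination" would be token 1 or 2 slipping through now and then,
# not the same alternative appearing every single time.
rng = random.Random(0)                    # seeded for reproducibility
logits = [5.0, 1.0, 0.5]
draws = [sample_next_token(logits, rng=rng) for _ in range(1000)]
print(draws.count(0) / len(draws))        # most draws pick the dominant token
```

The point of the sketch: sampling randomness produces occasional, non-repeating deviations. Output that is identical across many independent runs is coming from the model's dominant probability mass, not from a rare random draw.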
00:29:00.280 Um, and I don't know what legal standing that would even have because they are still defaming you.
00:29:06.220 I mean, regardless of whether it's a hallucination or not, they built this thing and I don't think they can stand on something like section 230.
00:29:13.920 Maybe Marcel knows more about that than me, but, um, section 230, I think is, is where it's user generated content, but this is not user generated content.
00:29:22.260 Google is publishing all this.
00:29:23.620 So they're the ones that are defaming you over and over and over again, every time someone asks about you knowingly.
00:31:31.220 Section 230 has no application here at all.
00:29:34.980 Um, I'd be surprised if they tried to apply it because they are the publisher of the content and frankly, they probably don't want anything remotely capable of piercing the veil of section 230.
00:29:45.520 But, um, I do think there probably has to be an update, you know, in general, I think you've got to have broad ability to allow people to speak.
00:29:53.600 I think freely online, but beyond that, you know, you have to have lines in the sand on malice and especially content that they are publishing through these AI chat bots.
00:30:02.360 But yeah, you know, the absurdity of the hallucination defense is the fact that there, there's not anything that's like Robbie loves to save puppies, right?
00:30:11.860 Like that's not in the bio on me.
00:30:14.740 It's, it's not some, you know, um, some wonderful, there's no flowery happy lies in there, right?
00:30:21.480 It's all negative stuff.
00:30:22.680 It's all like, you know, he, he was on drugs, sold drugs, shot somebody, um, groomed actresses.
00:30:28.920 That was another one that was like mind blowing to me.
00:30:32.820 Um, and so it, it, it's very clear there's this like absolute negative tilt to it that even just that on its own, you know, throwing out the fact that it's consistent all the time.
00:30:45.620 Um, it just like, I, I don't know how you get to the place where you could try to pretend it's a hallucination.
00:30:51.340 I think you'd have to be a non-technically savvy person to buy the hallucination excuse.
00:30:56.140 Um, but even then most non-technically proficient people, when they see something like this, if someone says it's a hallucination from a machine, they go like, what are you talking about?
00:31:05.920 A machine hallucinates?
00:31:06.980 Like, you know, it's, it's, uh, it's hard to buy no matter where your vantage point is, and people should think about that. It's so weird to me.
00:31:17.260 And I'm like seeing this a lot with, is it Claude, how it starts to talk to itself and reveal what's happening.
00:31:26.380 I'm like, you're like telling on yourself, AI, but okay.
00:31:29.940 You know, and Robbie's like, you know, is this intentional?
00:31:32.740 And they're like, yes, it's intentional.
00:31:34.220 And it won't let us stop.
00:31:35.420 Like, you know, we don't want to do it, but it's making us do it.
00:31:37.740 I'm like, this is the strangest thing.
00:31:39.500 And I feel like, um, I am not a lawyer like Marcella and I have no idea, but, um, we are redoing our will and I am putting it into my will that nobody, you know, after I die or while I'm alive, people can use my likeness, my voice, my whatever.
00:31:56.700 Um, because I don't know who's protecting us.
00:31:59.320 So I'm going to put it into my will.
00:32:01.280 Um, yeah, I just figure it's a, it's a wild west out there with AI and, um, and I just, I feel like, you know, maybe make a public statement on your platforms.
00:32:13.580 Like I did that for a while.
00:32:14.880 It's still in there and I pinned it that, you know, because obviously it wasn't good enough, you know, for Scott to make a video, a statement, have his brother make a statement, have his estate, make a statement.
00:32:24.740 And because I'm like the face out there, like trying to defend this.
00:32:29.220 Now people are like, Oh, Erica doesn't want this.
00:32:31.280 And I'm like, it's not an Erica thing.
00:32:32.700 This is a Scott Adams genius human being who worked his whole life to give himself to us, to be useful.
00:32:42.880 And he is now turned into a puppet.
00:32:46.680 And there's one person in particular who I won't name who was given really good advice to basically stop what he's doing.
00:32:55.920 And it's also not permitted, but now is putting an AI version of Scott out there, teaching other people how to make an AI version of Scott.
00:33:07.480 So for those of you that think Scott would love this, you are so off track and you give zero shits for him, his legacy, respect for his family.
00:33:18.560 It's not an Erica thing.
00:33:20.120 It's a respect.
00:33:21.260 And if it was like your father or your mother or your son, brother, someone you love who was just like this amazing person and they passed away.
00:33:30.020 And now someone else, this is, I'm sorry, but this is me, like shoves their hand up their back and now is their puppet.
00:33:35.940 How would you feel about that?
00:33:38.700 So I think that the lawsuits are going to be huge.
00:33:41.760 I am fired up.
00:33:43.880 I told Scott for years, I would always go to bat for him to make sure his wishes are seen and his family, of course, feels the same way.
00:33:53.720 And there's thousands and thousands of us that know what Scott wanted and didn't want.
00:33:59.520 And, and in the end over a year ago, he did not want this.
00:34:02.880 He did not give permission to anyone to do it.
00:34:06.920 And if you think you're going to monetize off of this, you can think again, because that is Scott's estate.
00:34:14.220 That is Scott's IP, his likeness, his voice, his office, his books on the shelf in the background.
00:34:20.540 So with each passing day, you are just pissing me off so much more and I will not back down.
00:34:28.020 So, you know, I don't care.
00:34:29.320 Give me your slings and arrows if, if, you know, you feel like this is never a good thing, but I'm, I'm here for it because, uh, he was my friend.
00:34:36.980 I loved him and, and more than that, I respect him and I respect every single person watching this, that it would never happen to you because the pain it is causing and the trauma is like, you can't even imagine.
00:34:51.940 So it's fine.
00:34:53.200 But I, I love what you're doing, Robbie, because it's, it's for everybody in this uncharted territory.
00:34:59.520 It's so disturbing.
00:35:04.660 The idea of like, you know, if I pass away that my kids could open up their phone and see a fake version of me saying something, maybe something horrible.
00:35:15.480 Right.
00:35:16.340 Um, so it's, it's, it's unfathomable to me that somebody thinks this is a good idea.
00:35:21.860 You know, that somebody, and especially if they're pretending that they were a fan of Scott or loved Scott in some way.
00:35:29.520 Um, I hate to tell you, if you're doing something like this, you're nothing but disrespecting his legacy and who the man was.
00:35:37.400 And so, you know, I hope, I hope if they see this, the person who's behind this, that they do the right thing and they stop doing this because it's just, it's horrible.
00:35:46.980 And the family, the family, I think should come first, right?
00:35:50.200 The family is disturbed by this.
00:35:52.320 And so, uh, you know, I think out of respect for them, cause like in my case,
00:35:56.300 I would say, I absolutely don't want something like this.
00:36:00.780 I would essentially like everything else when I die, leave it to my wife and kids to decide what they think is best.
00:36:06.160 I don't know how things will change over time or what might give them peace or whatever.
00:36:10.380 Um, so I would leave it to them.
00:36:11.820 They're the ones who have to live here after I'm gone.
00:36:13.720 Um, but you know, I, I think it's incredibly disturbing that somebody thinks this is a good idea and that they should be the one who gets to decide that it is okay.
00:36:26.300 I know that somebody thinks, or they might feel…
00:36:31.360 Mmm.
00:36:32.120 Marcella, did you want to.
00:36:34.440 Yeah, Robby, I have questions regarding, I have so many questions, Robby, but, um, um, I wanted you to talk about your background to the audience that might not know.
00:36:45.140 And also about your music video directing and Hollywood and all of that.
00:36:50.520 though came before because as you know a lot of our audience loves scott obviously and he believed
00:36:56.980 in talent stacks and you have one of the greatest talent stacks that i've seen um meaning that you've
00:37:07.160 you've come from different a different angle than most uh political commentators and it's it's so
00:37:14.540 interesting um what you've done and what you're still doing so i want them to know a little bit
00:37:21.240 about that you're still on mute so sorry am i okay there we go um you know i'd say this might take a
00:37:38.360 while but what really to me uh provoked me to speak up and to to take a political stance publicly which
00:37:45.840 pretty much set my career on fire in hollywood was the fact that my family came from cuba so my family
00:37:50.940 already lost everything to communism once and i mean everything and so you know having had that
00:37:57.720 experience in my direct family lineage you know you grow up with constant reminders how lucky you are
00:38:05.160 to be american just constant and so you know i understand very clearly my father figure in my
00:38:12.260 life was my great grandpa and he reminded me all the time that the reason it was able to happen in
00:38:18.500 cuba was that so many people were afraid to speak up to ruffle feathers lose business whatever it might
00:38:23.560 be and in the end they lost all of those things anyways right so i mean they should have just stood
00:38:29.140 up when they had the chance but that was lost on them in the moment in the moment safety felt like
00:38:33.880 silence you know if i was silent i was going to be safe and too few people spoke up and so you know
00:38:40.020 he imparted on me this need not not a want but like an absolute need a duty to stand up if you ever see
00:38:48.040 signs of this in america so here i am young man i come from a family that didn't have a lot i didn't
00:38:53.220 have hollywood connections or anything and i built a business really bootstrapped it myself i didn't even
00:38:58.540 i got denied for the one time i went to a bank to try to get a loan um so that that tells you like
00:39:03.080 this was like a bottom-up operation and so i start this production company and we got to be
00:39:09.220 very successful we had you know over 14 directors across the world i directed oscar-winning actors
00:39:14.140 actresses some of the biggest music stars in the world um you know was nominated for and won a lot
00:39:19.560 of the biggest and most prestigious awards out there and um you know what's funny about that actually
00:39:24.560 side note i never went to an award show when i was nominated um not for the mmvas not the vmas
00:39:30.120 none of it i never went um i always thought that hollywood was kind of a disgusting hellhole and so
00:39:35.740 i didn't really want to be around the people unless it was directly work-related and i was getting paid
00:39:40.180 for it um it was just never a fit i'd rather be home and so um that speaks i think a little bit to
00:39:46.200 my psyche i'm um i'm much more happy and comfortable at home with my family than i am
00:39:50.280 going and doing these things um which probably makes me more effective because i don't have to worry
00:39:54.400 about the interpersonal relationship stuff that other people worry about because frankly i don't give a
00:39:58.500 shit if people like me so i can just tell the truth and you know i'm i'm good so uh you know
00:40:05.820 from there you know i come out in 2015 i endorsed trump and and you know really came out full bore as
00:40:13.420 a conservative to the right of trump and i think that surprised a lot of people and i'm not sure why
00:40:19.040 because you know privately i never really hid my views you know i was very very pro-gun very
00:40:24.420 anti-communist so i'm not sure why it was shocking to people i do remember after i came out though uh
00:40:29.960 publicly because i think i did it on fox fox or fox business i can't remember it was like a tv
00:40:34.580 interview and i come out and i endorse him and do that that whole thing and um get a phone call from
00:40:39.920 one of our big clients at the time so we had um i can't say exactly who but i can say this it was one
00:40:45.280 of the very largest movie studios in the world they were one of our clients and a long-term client where
00:40:51.060 we had a long-term contract we provided extra production um for some of the big movies and things
00:40:56.580 like that that they couldn't provide so when they needed extra crew they needed a another director they
00:41:02.380 needed you know side project whatever like we did all that stuff and so i get a call from our our rep
00:41:08.440 there and they go robbie robbie robbie robbie you know it's not too late to turn back on this right
00:51:14.440 i love you i consider you a friend you are going to destroy your company and your career it is not too late
00:51:20.280 to turn back you just got to tell people you didn't understand his immigration policy you know
00:41:25.280 you're latino and that'll work you know people buy that and you know you can move on and i go well
00:41:32.020 you know the tough part about that is the immigration policy was kind of the settling point for me
00:41:38.500 and i was like i i went into this knowing what i was doing like i know all of you people who pretend
00:41:45.620 to be so tolerant are actually the most intolerant people on earth so i'm not shocked by what's
00:41:50.360 coming um i i knew what was coming so uh you know as that's going on a lot of people didn't know this
00:41:57.000 about me and still don't know this about me i i'm an investor too so i invest and i'm i'm i would say
00:42:02.040 very wise about it you know maybe not nancy pelosi but i'm i'm pretty good and so um you know i i sort of
00:42:09.660 prolifically and invested in in real estate and in the markets and have done very well doing that
00:42:14.720 and uh you know i think that for me i've always felt it was very important to use whatever extra
00:42:21.240 money i could to fight for what i believe in and so we've done that we've tried to put money to work
00:42:25.200 um for political things but also non-political things that i feel like are inherent good
00:42:29.880 in in our country like you know when we had the horrible hurricane um affect us here in in tennessee
00:42:36.680 it happened especially in east tennessee power was out uh people didn't have electricity they
00:42:41.980 didn't have internet and so we made sure actually faster than emergency response did and we eventually
00:42:51.240 even gave these to tennessee's emergency management uh association so starlinks i i bought up all the
00:42:58.660 starlinks i possibly could and elon helped um you know get get me with the right people at starlink and
00:43:04.320 we were able to deploy them super fast here in east tennessee um i'm not in east tennessee but
00:43:09.800 you know we we had our people there and so things like that as well as the political missions right
00:43:15.760 um you know funding everything necessary to be able to be effective changing laws and changing sort of the
00:43:23.340 cultural fabric of how certain people look at things you know whether it be the transitioning of
00:43:28.020 children or or you know um death penalty for pedophiles that's another one that my wife and i have
00:43:33.120 worked very hard on and we got done here in tennessee and now is spreading to other states you
00:43:37.480 know i think we've got alabama florida now on board and um and there are going to be many more that we
00:43:42.480 have uh been talking to and pushing along that way so you know i i think that's uh sort of my life
00:43:49.800 mission now is use uh my ability to create uh money out of thin air and you know go and and use it to
00:43:59.900 to to do good in the world and so uh you know i feel like if you have those talents those abilities
00:44:05.980 you should use them for good and that's what i'm trying to do and what's wonderful too is i feel
00:44:10.700 like i have a great following of a very aligned people who have done well in life and have decided
00:44:20.400 to back me for that same reason they want to put their money to work too and that through the dei
00:44:26.540 project was like the lifeblood of it being able to pay for you know different things we needed to
00:44:31.280 do and uh it's cool to see that there's other people who feel the same way that i do that we
00:44:36.900 need to put our money to work to actually change this country in the world i love i love that prime
00:44:42.640 example example of all which was scott's motto and um one i want to say to happy eye doc yes robbie
00:44:50.860 does resemble clark gable yes he does um and also robbie as far as you showing and telling us but
00:44:58.960 you know by all of your examples of what you as one person can do you know so we as you know one
00:45:06.140 person can also do useful things but in case the the sippers aren't um aware of what you did with the
00:45:14.420 dei can you give us just like a little compact thing because that was one of scott's like
00:45:18.880 most proud things that you were doing like he was just like look at what robbie's doing you know he's
00:45:25.360 one guy like yes robbie like you know a lot of times we would be in the pre-show before going live
00:45:30.520 too and we would talk about it more but i mean i i would hear it like one after the other i see your
00:45:36.400 posts and see what you were doing and you're like holy shit like robbie is taking on the world for
00:45:41.860 us like it's for us you guys and that's how i always viewed it like robbie's got a beautiful
00:45:47.320 family i'm sure he wants the world to be great for them but nobody takes on this kind of shit from
00:45:54.060 people and puts themselves out there and you know take slings and arrows if they're not doing it for
00:46:00.940 the greater good i'm telling you and robbie give them just like a quick like what you did with the dei
00:46:06.200 because i know we only have like about 10 minutes and i know owen has more questions too
00:46:10.240 quick uh you know sort of rundown of it is essentially the right has not given pressure
00:46:16.500 at all in corporate america for decades and so you know this simple physics if all the pressure is
00:46:22.420 coming one direction everybody knows what's going to happen it's going to go that way if you don't
00:46:26.180 have a counterforce nothing good right and so what's interesting about it too is the left-wing
00:46:30.900 pressure that there had been for decades was a total paper tiger it was groups like the hrc who
00:46:35.080 actually have no real popularity or backing and so you know as i see corporate america shift wildly
00:46:40.680 left i was like we need to create a cultural counterweight and prove our actual power but it
00:46:46.100 has to be dynamically focused you've got to go one by one and i realized one of the big problems on the
00:46:50.860 right was like we're all over the place we're sporadic on a million different things and so
00:46:54.520 you never get this like very focused pressure and i think that's what we worked very hard at
00:47:00.520 is ensuring that when we had a company that we were focused on we brought pressure to bear from
00:47:06.200 every side of the right and so doing that we'd go company by company who had the craziest policies
00:47:13.500 you know all the woke stuff funding transitions for kids some of them were funding drag camps for kids
00:47:18.700 in the summer and we're talking about fortune 500 companies right i would have our research put
00:47:24.480 together into one video expose all of it and then the boycott that would ensue after that would change
00:47:32.900 the uh the policies you know uh and we'd get in contact with the ceos talk with the ceos and and
00:47:39.420 sort of negotiate a surrender so to speak and uh we did so very effectively and so what's
00:47:46.040 interesting is that public campaign uh we at a certain point didn't even really have to do anymore
00:47:52.720 because now we can just call these companies when we spot something and they're ready to change like
00:47:58.080 we don't even need a story right and so i've actually gotten more done in the last six months
00:48:04.120 changing policies at companies than we did in the first year and a half of this project and i think
00:48:10.240 that's really interesting because i haven't been just pumping out videos of it like we initially did
00:48:15.160 for the first year and a half because i'm so focused on using the time effectively and being useful
00:48:19.760 in the most efficient way possible sometimes it's going to take videos and public pressure but
00:48:24.800 sometimes you prove sort of your ability to do this in such a way that you can can just get a lot
00:48:29.760 done behind the scenes that's amazing that's amazing you guys you guys i i can i can see that
00:48:35.680 and all the platforms that are just falling in love with you as you should be you know please
00:48:40.960 and so in there sorry for the echo you guys let's see if
00:48:56.880 owen can jump in owen are you there yeah no yeah i'm here so i i wanted to ask about your run for
00:49:15.520 office you you ran for office in tennessee and apparently the republican party didn't like that
00:49:19.680 very much and they somehow tried to get you to not be a bona fide republican or and kicked you
00:49:25.520 off the ballot are you going to try and run for office again and can you tell us a little bit about
00:49:29.600 what happened there so it's incredible um i thought i lived in america right um i didn't know this was
00:49:36.160 possible so what i'm about to lay out for people is going to blow their minds i ran for congress and
00:49:41.440 when i ran for congress this was uh i guess you know about uh about how many how many years has
00:49:49.280 it been i guess six years um good time goes by faster than you think um and you know when when i
00:49:56.000 started the run i thought you know this is america anybody can do this right you just have to become
00:50:01.760 popular to the voters and they have to want you and you know and that's it right um and by the way
00:50:07.680 just side note i have been a registered republican my entire life okay i have only voted for republicans
00:50:15.360 i have literally never voted for a democrat and they try this this thing and i will say we've mended
00:50:23.040 fences so i don't want to be too hard on anybody but i just to explain what happened because it's still
00:50:27.520 like it blows my mind i run for office i get a gigantic lead the last poll done in the race before
00:50:33.600 i was removed from the ballot and by the way i was endorsed by pretty much everybody
00:50:37.680 um the last poll before i leave the race there was a 30 point gap between me and the second person
00:50:42.800 who eventually became the congressman for the district somebody i really don't like
00:50:46.480 um but that's another story for another day so when this happens i get removed from the ballot i sue
00:50:52.960 i win in state court the state court says yes he absolutely should go back on the ballot so
00:50:57.600 that's that's what's gonna happen then uh last minute i think it was like 48 hours before ballots
00:51:03.120 are printed they take it to the supreme court they the state republican party appeals to the
00:51:08.240 state supreme court state supreme court i think as everybody knows um is appointed by establishment
00:51:15.360 governors typically and so uh they side with the party and say yeah they can just remove him for any
00:51:21.120 reason they want and so i was removed from the ballot could not run as a republican and um at that
00:51:28.160 point it's too late to put your name on the ballot as an independent or something like that which
00:51:31.840 i wouldn't have wanted to do anyways because i am a republican i've been one my whole life
00:51:36.000 uh so instead what people did is there was uh the most protest votes in congressional history
00:51:42.400 of write-ins of people writing my name and in protest because you can't realistically win a massive
00:51:47.840 congressional race a federal race like that with write-in votes um and i actually told quite a few
00:51:53.280 people i didn't want them to to do this because i wanted to ensure that um sort of best person won that
00:51:59.440 primary which um you know i don't think really um ended up the case but yeah that was it that's
00:52:06.320 america that can happen you can get kicked off the ballot um because somebody decides they don't want
00:52:11.440 you to run uh so wild but yeah that can happen in america yeah yeah it seems like some of the most
00:52:20.880 influential persuasive decent voices get canceled or shut down or refocused um it's it's a crazy thing
00:52:31.280 oh go ahead rob i will add to to that i am so thankful that it happened like i thank god all the
00:52:39.680 time that it happened and this is actually sort of breaking news here um i actually recently was
00:52:46.560 essentially offered up a congressional seat okay it's a long story but i would have absolutely
00:52:53.920 won this and it would have been mine if i wanted it and um it was a special election so i think i'll
00:53:00.400 i think everybody can put two and two together if they just look at what special elections happened
00:53:04.800 and i ended up saying no because i realized something in the time that i ran for office and now
00:53:11.280 i actually want to be an effective person united states congress is not a place for effective
00:53:18.160 people um at least it hasn't been for a long time i hope that changes maybe one day that changes maybe
00:53:23.520 one day it is a place that actually legislates and does their damn job today it does not today
00:53:29.120 it's a clown show where people go to do i guess whatever clown show it is they want to be a part of
00:53:34.160 i don't want to be a part of a clown show i want to get stuff done so i'm much more effective
00:53:37.840 outside of government at this point um the only thing i could ever really see myself running for
00:53:42.400 is like i've been an executive my whole life so some sort of executive position where i could
00:53:46.720 actually just make decisions and change things on my own um whether that be local or state you know
00:53:51.760 whatever um that's the only thing i could see myself doing because i i have a very strong mind for
00:53:57.360 that legislatively i would be driven nuts being they'd have to they'd have to throw me out honestly
00:54:02.640 of congress at this point had i been elected i would have definitely been thrown out they would
00:54:06.320 have censored me or kicked me out for some reason because i wouldn't have been able to hold my tongue
00:54:10.800 at what a farce this is like how far are we in to trump's second term and what the hell has congress
00:54:18.640 done what what have they done pretty much everything good has been an executive order you know and the
00:54:25.680 only things that they've codified like a couple of them you know i mean they've renamed some post offices
00:54:32.000 i i don't know what to tell people i mean everything they do is either incredibly corrupt
00:54:36.720 uniparty you know favors or you know something self-serving like there's very little we get out
00:54:42.640 of congress so i am incredibly thankful things worked out the way they did because there's so
00:54:47.280 much good that i was able to do from that point in time to today and that's a good lesson for people
00:54:52.400 because when it happened to me i was so upset now i'm like asking god like what why is this happening
00:54:58.640 right like i feel like i was supposed to do this why is this happening and so it's a good lesson to
00:55:04.400 people that like you may be really upset something doesn't work out sometime but there's probably a
00:55:09.760 damn good reason for it and there may be more than a silver lining on the other side of it there may be
00:55:14.800 you know lessons that you needed to learn in that failure and it's a stepping stone to something that
00:55:18.800 is going to be your greatest success you know so i'm thankful that it put me on a path to be able
00:55:24.960 to accomplish what we've been able to do since then that's a that's a i'll just wait for the
00:55:31.920 me okay thanks that's a fantastic message and i say that often you know you fall down seven get up eight
00:55:38.720 and there are so many people you know sometimes i have friends that'll say oh so and so should you
00:55:44.320 know be in the administration or do this or do that and i'm like they are they will be so hamstrung
00:55:49.440 if they are part of politics and sometimes the best way to affect change is to be on the outside
00:55:55.440 of it and influence all around it because you're right i mean i think hillary clinton's only accomplishment
00:56:00.960 was not even accomplishment was naming a post office i don't know what the post office fetish is
00:56:06.960 but it's all about changing the names of post offices over there so anyway but robbie i would have been
00:56:13.760 just like you if i had gone into office because uh clearly i also am passionate about that you can
00:56:19.600 only i can only imagine like if you you know i wish that you would have made office you know but you're
00:56:27.440 right like imagine everything that you've done since then would have not happened your dei campaigns
00:56:33.920 your war on children documentary um all of the things that have been effective and have
00:56:41.840 helped all of us you know and i and we have to thank you because you actually made such a difference
00:56:48.720 i don't think you would have been able to do that had you been in government
00:56:52.640 yeah we owe robbie a big credit thanks you're right about that i would not have been able to
00:56:59.360 do that stuff i mean like the war on children alone i would have to say is it's a close thing between
00:57:05.680 that and the and dei but i almost feel like the war on children was the most effective thing that
00:57:10.160 i've ever done because the number of child protection laws that it ended up inspiring in
00:57:17.120 many different states is insane so there's so many states that have banned transgender surgeries
00:57:23.120 for kids and hormones as a byproduct of that movie and the effectiveness of it it was over 60 million
00:57:29.680 people watched that movie which was wild you know elon musk was a big supporter of it and the trump
00:57:36.000 family was too and so uh you know it crossed a lot of political boundaries too where it was a film
00:57:43.680 that an independent can watch a democrat can watch and if you're just a rational person who cares you
00:57:50.560 know about kids you're going to watch it and go oh we have a problem here and it's not just about
00:57:54.560 the transgender issue it delves into everything from social media technology um the way that it's
00:57:59.680 affecting our kids to uh the psychological aspects of you know what's affecting kids today education
00:58:05.520 system and the failures inside of it and how we fix it that's a big part of this is like how do we fix
00:58:11.520 this because we have a generation of youth that believe in god the least they report the highest levels
00:58:18.160 of mental illness that we've seen ever in youth and then you've got this you know
00:58:24.240 transgender craze that started to happen during that time this is a bad combination right and so
00:58:29.680 the heartening side of this though is that in the time after that movie there was this rapid turn to
00:58:35.600 god that has begun to occur in this generation so at the time we made it it was the most faithless
00:58:40.800 generation now we're veering out of that territory and young men especially in that generation are veering
00:58:47.920 toward the most conservative generation in american history among young men which is
00:58:54.800 a very welcome development and i think that that is a reason for hope for all of us to kind of hang
00:58:59.760 our hat on is that uh you know we've got a lot of young men in this country who have been awakened
00:59:06.160 to how dangerous leftism is amen with that said robbie um i i want to thank you on behalf of all of
00:59:15.360 us at the scott adams school all of the simultaneous sippers all the beloved it has been such an honor
00:59:22.000 having you on and i want to let people in the comments know if you go to robbiestarbuck.com you
00:59:28.080 will find the movie the war on children it's there at robbiestarbuck.com and robbie also has a podcast
00:59:35.360 the robbie starbuck show and we'll link everything also after the show so robbie um i hope you'll come
00:59:41.680 back again and we would love to talk about all the issues going on with children we're having cory
00:59:47.360 deangelis on on thursday another amazing fighter for children and um did you want to say something
00:59:55.600 we always do a little closing sip robbie but if you had a final message i want you to have that
01:00:01.200 I will be a part of the closing sip. I have my Scott Adams mug here, which, by the way, for those who
01:00:06.000 don't know, I got this mug like a week ago, or a little over a week ago, and it is going to be on
01:00:12.400 my desk until it breaks. Okay? So when we do shows, when we film stuff, Scott's mug is going to be
01:00:17.200 here as a reminder that even after we're gone, the effect we have on other people can live well
01:00:24.160 beyond us. One day I will die. It might be tomorrow, might be 50 years from now, but one day
01:00:29.680 I'll die, and my hope is that somebody else will have been affected by my work, and we'll,
01:00:33.840 you know, have some sort of memory there to remind people that we affect each other long after we're
01:00:38.720 gone. There's this beautiful butterfly effect to everything that we do. And I always say to my kids,
01:00:44.880 especially my oldest son, and I'll say it a lot to my younger son too as he
01:00:50.960 develops, that who you are is defined by the choices you make that nobody ever knows about.
01:00:57.280 It's defined by who you are when nobody's looking. It's very easy to do the right thing when everybody's
01:01:02.480 watching you. It's very easy to pretend you're a holy person when everybody's watching. It's much
01:01:07.040 harder when nobody's looking and the only person who knows is you and God. And that's what I try to
01:01:12.960 always remind my son of. And so, you know, I think that's something that, if we all live
01:01:19.600 it out, and we try in those moments nobody's looking to make the world a better place, to make the right
01:01:24.240 choice even when it's hard, and I'd say especially when it's hard, then we will begin to see a truly better
01:01:31.200 world all around us for our kids and our grandkids. Because, you know, I think we need to leave this world a
01:01:37.120 better place for them than we found it. And unfortunately, this is the first generation where,
01:01:41.760 you know, because of AI and a lot of other things, it's not a sure thing that we will leave our kids
01:01:46.320 a better world than the one we came into. As of right now, I can say this: when I was born in the 80s,
01:01:52.400 the culture around America was a much better place than the one we have today, and so I hope we can get
01:01:59.360 to a place where that's no longer the case. But right now I'm pining for the 90s, you know? Like,
01:02:04.160 the 90s rocked. I would say, like, every day I'm like, I wish I was waking up in the 90s.
01:02:09.120 But thank you for having me. I really appreciate it. You guys are making a big difference, and I'm
01:02:12.880 so glad you're carrying on Scott's legacy by continuing this and continuing to meet together
01:02:18.320 and sip together. And I'll be tuning in sometimes, you know. That's what I did with Scott. You know,
01:02:23.840 unfortunately, with four kids and a farm and all of the projects we're doing, I don't get to do it
01:02:27.760 every day, but when I'm going to watch something in the morning, it's going to be this,
01:02:32.960 just like it was with Scott. We love that, love that. On behalf of Owen and Marcella and myself
01:02:39.600 and everybody, thank you, Robbie Starbuck. And everyone, you know, Robbie's the example. Scott
01:02:45.040 was our example. Go out there and be useful. And to Scott, to Scott: see you tomorrow.