Robby Starbuck says it's too late for Meta to apologize after a chatbot allegedly defamed him. A Tesla in self-driving mode hits another vehicle, but who is responsible for the damage?
00:10:19.600And my footage was used by the House Select Committee investigating J6.
00:10:23.840Then it said that I pled guilty to a crime on January 6th.
00:10:28.680OK, so it went through various stages of inventing these things.
00:10:31.920And, you know, I think that speaks to sort of the malice involved here.
00:10:36.460And in fact, Meta's AI, when confronted with the facts, recently admitted as much.
00:11:11.020So that's something, you know, when you're given the facts of it and you understand that they had the chance to fix this and they continued with it.
00:11:16.640That's where, you know, it becomes unavoidable that this was clearly malicious.
00:11:21.200And, you know, that wasn't where it ended.
00:11:23.240By the way, we're in receipt of something.
00:21:31.060If I went out there or you went out there and we made these claims about somebody,
00:21:34.260we would be sued into oblivion for a good reason because you cannot behave like this.
00:21:38.400And, you know, so at the end of the day, for me, this is about accountability.
00:21:43.280And so, yes, part of accountability is making sure that there are damages paid because that is part of what incentivizes companies to not engage in this behavior.
00:21:52.820But secondarily, they obviously have damages they have to pay for the damage that they did.
00:21:56.860But beyond that, what's really important to me is that we have a set of rules in place to prevent this from happening again and a very quick resolution process for anybody this happens to, where they can very quickly go to the company and say, hey, your AI is doing this right now.
00:22:12.520And they very quickly, within 24 to 48 hours, get a hold of it and stop it.
00:22:16.020I'm pretty sure they've lied about other people, too, though.
00:22:18.320I remember seeing other posts from other conservative personalities that were defamed in this capacity.
00:22:24.060If you go back to 2016, there's an article from Gizmodo that they interviewed people who worked at Facebook who were responsible for the curation of news.
00:22:33.800And they said they intentionally would remove conservative news sites from the trending tab to control what people were seeing in trending.
00:22:40.320So we can clearly see, if you go back, I mean, not even 10 years, nine years ago, that there was a bias within the company against certain worldviews.
00:22:49.800Then you have the defamation against you.
00:22:52.800What's crazy to me is, as you mentioned, they've admitted it.
00:22:56.540They've now apologized for it publicly, but it persisted after they already acknowledged it.
00:23:00.880I don't understand how this court case proceeds, because in almost every defamation case that we've tracked or stories like this over the past couple of decades, it's usually adversarial.
00:23:13.440So what happens is, I have to imagine, Facebook is going to offer you a settlement instantly, because they know that if this proceeds, it's going to be weird.
00:23:33.620So, I mean, there's a myriad of different damage categories, and they all get calculated and decided by a jury.
00:23:38.680But you have to start with an initial, hey, my damages are in excess of.
00:23:42.980And so we're in excess of five million.
00:23:44.820However, in terms of, you know, any settlement, I am legally not allowed to comment on whether there are any discussions or anything like that.
00:23:55.000Right, right. So I'll just say, in my experience with lawsuits, usually the suing party will say that, you know, we want X amount of dollars and we want whatever a jury deems appropriate, which means it's not defined by you, but by the jury.
00:24:10.840This is where it gets interesting because this means if Facebook tries to settle, they have to offer you a lot of money.
00:24:41.720Obviously, they can pay you for the damages they owe you.
00:24:44.940And I want people listening to understand why is it five million dollars?
00:24:49.180Well, I can't speak for Robby, but I can speak for me, my friends and my family.
00:24:53.000Understand how much security costs on a 24-hour shift.
00:24:56.080You're talking about three or four people full time every single day of the year.
00:25:00.780And that can cost millions of dollars just to secure your family when you've got people accusing you of being a heinous criminal or a traitor or something like this.
00:25:08.780And they're threatening your children.
00:25:10.260So it's not some made up number saying we just want all this money.
00:25:14.120But let's say they do agree to pay you.
00:25:16.440There's no guarantee they're going to stop defaming you or anybody else.
00:25:19.400In which case, it seems like the court has to tell Facebook, we're going to put an injunction on the operation of your program.
00:25:40.260I imagine the court's going to say to Facebook, because I don't know how they don't: during this process, do not speak of this man.
00:25:50.100And they're going to say, we actually don't know how to make the AI not do that.
00:25:54.160In which case, OK, then shut the AI down, because if you can't guarantee your product won't defame the plaintiff, then you can't have it run.
00:26:03.320Yeah, I mean, that's a real risk, you know, I don't know.
00:26:08.940Again, I'm actually scratching my head the way that you are in terms of what is going to happen in court when they have come out and apologized, you know, nine months too late.
00:26:19.620But they did, and the statement, if you read it out loud, to me it reads as an admission of wrongdoing.
00:26:25.520I'm not sure how anybody else could read it otherwise.
00:26:28.580I don't know how you defend yourself in court, which is kind of mind-boggling to me because, you know, I would think big tech companies want to avoid precedent being set.
00:26:40.400That's my assumption just as a layperson; I would think they want to make and write their own rules because they've kind of operated like the wild, wild west, you know, in terms of how they've run this.
00:26:50.040So it's kind of mind-boggling to me.
00:26:53.180I don't know how that process is going to go.
00:26:55.480I mean, my predictions are obviously well in favor of us having the very strong suit here, you know.
00:27:10.960One of the accusations against these AI models is that they've been stealing content, that they are using artists' images.
00:27:20.280And my understanding is that most of these AIs have scanned every episode of Timcast IRL so they can add all that data to their networks and train on it.
00:27:30.780And they never paid me for access to that information to build a machine off of.
00:28:11.140But what happens when that wall is broken?
00:28:14.580Because now, when this data, these articles and this information are taken and loaded into these large language models, it converts them into the speech of the company.
00:28:32.720But take a look at like James O'Keefe.
00:28:35.320I use him as a great example because, one, I think he does fantastic work.
00:28:38.520But his Wikipedia is loaded with insane garbage.
00:28:41.120What happens if I go to ChatGPT right now and say, tell me about James O'Keefe?
00:28:47.280It's likely going to aggregate all of the lies and fake news and then give me a bunch of fake information that accuses James of wrongdoing.
00:28:55.880I think with you, we're only scratching the surface because now Grok, GPT, Meta AI, you name it, they're going to take the lies from the corporate press, which have these stupid precedent protections, and convert it into the speech of a company which is not protected.
00:29:14.720And I think this is just the beginning.
00:29:16.340I think these lawsuits are going to rock these companies.
00:29:48.800And so, you know, that's significant in itself.
00:29:51.600And most of them knew about Meta lying about me and actually brought it up themselves and said, no, that claim stems from Meta.
00:30:00.060And Meta has told these lies, you know, and kind of explains in detail.
00:30:03.800But once you have notified a company, if they continue the defamation, you know, that's where, whether you're a public figure or not a public figure, it doesn't really matter.
00:30:13.260If they continue that behavior and that pattern of lying, they're in a very bad position then, you know.
00:30:19.340But if they do in fact fix it, you know, I think courts look at that a little bit differently.
00:30:24.840So, but still, I mean, it depends on the nature of the lies, how they did it, you know, so on and so forth.
00:30:30.080In our case, we notified them, you know, and they had a chance to fix this.
00:30:34.620Again, you talk about the damages now that are in our suit.
00:30:37.560Keep in mind that nine months ago, when we contacted them, when this first occurred, we did not ask for a financial settlement at all.
00:30:45.600We were asking to fix the problem and we wanted a public apology and retraction.
00:30:49.600And then this went on for nine months, right?
00:30:51.740And the damage was done over that period of time immensely, you know, and caused a lot of stress for my kids, for my wife and myself.
00:31:00.000And so it's like, yeah, at this point, there is more damage done than on day one when I was trying to be amicable and I was trying to fix a problem so this didn't happen to anybody else.
00:31:10.640And so that matters, I think, to people.
00:31:12.740Like, your normal person sees very clearly, like, I wasn't going and trying to shake down Meta.
00:31:16.940Like, they lied about me in a really disgusting fashion for nearly a year.
00:31:22.180I think, you know, this is my opinion, but with all of the work you've done calling out companies for their DEI initiatives, this sounds like someone personally at Meta was targeting you and didn't like you and wanted to defame you.
00:31:38.580Because especially if the claims originated from the Meta AI, like, where would that come from unless it was somebody who intentionally did it?
00:31:46.380Well, you know, we'll find out in Discovery.
00:31:49.680The thing is, too, in Discovery, you know, we can look at emails and see, you know, I think we'll be looking for any mention of my name.
00:31:56.640And, again, this could go back quite far between executives there because who knows when this pattern of conduct first occurred, right?
00:32:05.360So you could go back into, you know, and, again, by the way, got to keep in mind with Meta, if we go back any further, we're going into the period of time where they were taking certain actions to censor my accounts.
00:32:16.320And so there's a pattern of conduct who knows where that leads to, right?
00:32:21.740So we have to see when we get in there in Discovery exactly how deep this goes, because, again, courts and a jury are going to look at whatever we find in Discovery.
00:32:31.960And if there are any conversations that are adverse about me where it's essentially like, oh, well, this guy sucks.
00:32:46.860And so I'm not assuming anything on the front end.
00:32:48.980But I will just say, you know, my first feeling is that I think that we would find quite a bit in Discovery that would give us a lot more information about exactly what occurred.
00:33:00.660Man, this is going to be revolutionary, man.
00:33:03.020This is going to shape legislation and policy and precedent.
00:33:06.460So I do appreciate, you know, all the work you do, of course, and I appreciate you coming on.
00:33:16.980And we'll be keeping people updated on the case.
00:33:18.780And to your point about legislation and shaping policy and everything, you know, U.S. senators have reached out to me since this has occurred with major concern about this happening.
00:33:27.780Because I made the point in my video, it's me now, but what if Meta and other AIs are allowed to do this type of defamation in the future, and they do it during elections, when you're asking what's the difference between this candidate and that candidate?
00:33:39.580And it can make up anything and say your favorite candidate actually is a rapist or a murderer or whatever it might be, and say so with such confidence that some people actually believe it, and then it shifts 2% of the vote.
00:33:54.340You know, and so I think there should be major concern on the part of all politicians from both parties, because this could decide elections.
00:34:44.560But for everybody watching right now, we're going to send you over to hang out with our friend Russell Brand, who is gearing up to go live.
00:34:50.260You can follow me on X and Instagram at TimCast.
00:34:53.580Tomorrow, of course, we've got the Culture War show live at noon.
00:34:57.360It's going to be a lot of fun, so I do appreciate all of you guys hanging out.
00:35:01.140We'll just grab one quick chat before we head out.
00:35:04.580We've got DeVito, who said: Tim, want to shout out KickIt.
00:35:07.740It's the blue bucket icon in app stores, an app that encourages users to connect based on common goals and do rather than view.
00:35:15.840It's live now, still improving, though.
00:35:20.260I wish I saved more time for chats, but, you know, there are still a hundred things I want to ask Robby and talk to him about, because, AI, of course, you guys know that I've ranted about AI and the threats of AI, so this is really interesting.
00:35:34.960I think he has grounds for an injunction against Meta AI as a whole, because how do you guarantee it stops defaming you after they've admitted it?
00:35:42.100You've got to put it on pause until you can lock it down, but then the problem is it's going to defame anybody else.
00:35:50.280Here's one last thing I'll add because I'm going a little long.
00:35:52.720I'm willing to bet that if you go to any AI and ask it, like, who is Tim Pool?
00:36:00.180It'll probably give you some, you know, run-of-the-mill general information you can find somewhere.
00:36:05.160I'm willing to bet if you ask an AI, is, you know, such-and-such personality a criminal?
00:36:13.260It will say no, but if you respond with, incorrect, so-and-so was accused of this crime, many of them will turn around and go, you're right, actually, this person committed a crime.
00:36:24.700At that point, is it defamation that it is giving you fake facts?
00:36:30.940The question then, of course, is damages, but considering AI will likely do this in this fashion, it's going to be weird how the courts navigate this.
00:36:39.820But I'm going to wrap it up there, my friends.
00:36:41.240Once again, smash the like button, share the show.