Valuetainment - March 26, 2025


"He Was Scared!" - OpenAI Whistleblower FEARED For His Life After EXPOSING Copyright Scandal


Episode Stats

Length

11 minutes

Words per Minute

170.6

Word Count

1,914

Sentence Count

176

Misogynist Sentences

1

Hate Speech Sentences

4


Summary


Transcript

00:00:00.000 He knew he was a threat.
00:00:02.380 But if he were worried, he would take protection to protect his life, right?
00:00:06.620 I did see that. On October 30th, I saw he was very scared.
00:00:11.900 On October 28th?
00:00:12.880 October 30th, 29th, he met a copyright attorney called Matthew Batari.
00:00:18.600 We spoke to him on November 29th.
00:00:21.140 And Matthew told us that Suchir said what OpenAI did to him.
00:00:25.180 That's why we are holding on to them.
00:00:27.140 And then, I'll meet with you and I'll share what I learned from your son.
00:00:32.400 But he never met with us.
00:00:33.760 The next day, Suchir came and talked to us.
00:00:37.280 And when he came home, he was so scared.
00:00:39.880 He hadn't slept all night.
00:00:41.620 I watched him again.
00:00:42.780 I did the FaceTime meeting.
00:00:44.200 And November 7th, he came again.
00:00:46.000 He was normal.
00:00:47.580 So that particular day, did he receive any threat?
00:00:51.220 I wanted to go back to drop Suchir.
00:00:53.620 He didn't accept it.
00:00:55.100 I said, I'll come meet you.
00:00:56.340 No, I'll come home.
00:01:00.420 That's tough.
00:01:06.380 October 30th.
00:01:07.560 So he was worried on the 30th.
00:01:10.220 Yeah, he was worried on 30th.
00:01:11.680 It was obviously seen.
00:01:15.580 What did he learn from conversation with Matthew?
00:01:18.480 That's where my mind goes.
00:01:22.120 What did he learn from Matthew that made him worry?
00:01:26.740 What truth did he learn?
00:01:29.580 And a chief scientist doesn't want to talk to you?
00:01:32.040 They won't talk to me.
00:01:33.200 And he didn't even talk to my attorney.
00:01:34.740 Matthew didn't even talk to my attorney.
00:01:36.760 He didn't talk to the PI.
00:01:37.780 So he knows something.
00:01:41.300 Can you zoom in a little bit on Ilya?
00:01:43.000 I mean, this guy has done some very good interviews.
00:01:45.500 He's very smart at what he does.
00:01:47.140 Israeli-Canadian computer scientist who specializes in machine learning, several major contributions in deep learning. With Alex Krizhevsky and Geoffrey Hinton, he co-invented AlexNet, a convolutional neural network.
00:01:58.340 He co-founded and was formerly chief scientist at OpenAI. In 2023, he was one of the members of the OpenAI board that ousted Sam Altman from his position as CEO.
00:02:07.600 Oh, so he is fully against Altman as well.
00:02:11.280 Altman returned a week later and Sutskever stepped down from the board.
00:02:14.840 In June 2024, he co-founded a company called Safe Superintelligence with Daniel...
00:02:18.920 Is this like, yeah.
00:02:20.600 With Daniel Gross and Daniel Levy.
00:02:23.120 Huh.
00:02:28.340 Honestly, we don't even have time to sit and grieve or cry.
00:02:34.800 It's numbed.
00:02:35.620 Emotions are numbed.
00:02:36.840 Yeah.
00:02:37.400 Which is fighting.
00:02:40.660 I wish the occasion was different for us to be sitting down together.
00:02:45.420 All we want to do is get as many eyeballs to this as possible to get people of interest who can find out more to get to the bottom of this.
00:02:54.340 Because I applaud you tremendously for having the courage and the willingness to do this.
00:03:02.740 This is not easy to do.
00:03:04.160 Yeah.
00:03:04.700 And my condolences go out obviously to you and your entire family as you're going through this process.
00:03:11.420 And we're going to do our best to get this in the right people's hands.
00:03:16.800 God willing, the right people will reach out to you.
00:03:19.100 Thank you.
00:03:19.980 And I can also write a complaint to the FBI director.
00:03:24.000 Send it over to Tony.
00:03:25.580 Can we send it out to one of his staff?
00:03:29.600 If you send us a note, we're going to forward it to the right people that we have in contact, that we're in contact with.
00:03:34.880 Okay.
00:03:35.420 We need to sign and send that, right?
00:03:37.560 Yeah.
00:03:38.060 Okay.
00:03:38.340 We'll do it today.
00:03:39.200 Perfect.
00:03:39.920 No problem.
00:03:40.400 Hopefully it will reach them.
00:03:41.660 God willing.
00:03:42.520 God willing.
00:03:43.240 I mean, you know, this is something that it's going to be very technical.
00:03:47.400 There is some political components to it because of OpenAI and what he's working on.
00:03:52.300 But at the end of the day, justice is a big part of America.
00:03:56.840 Justice demands exactly that: for the story to come out.
00:04:00.140 Can I say something?
00:04:01.380 Recently, OpenAI tried to promote a bill.
00:04:05.180 That bill says because of NSA data, they want to make copyright doable.
00:04:12.880 They don't want to oppose copyright because OpenAI is using NSA data.
00:04:19.400 So copyright is agreeable.
00:04:21.980 So that means Suchir protested copyright.
00:04:24.680 Now they want to make copyright legal.
00:04:27.960 OpenAI and Google asked the government to let them train AI in content they don't own.
00:04:32.360 OpenAI argues it needs access to avoid forfeiting the lead in AI to China.
00:04:36.560 There's one more in which they say they want to promote a bill.
00:04:42.500 They're suggesting a bill.
00:04:43.760 In that bill, they say we have NSA data and we want it to be copyright enabled because of that.
00:04:51.200 The proposal came in response to a request from the White House, which asked government, industry groups, private-sector organizations, and others for input on Donald Trump's AI action plan.
00:04:58.960 And the initiative is supposed to enhance America's position as an AI powerhouse while preventing burdensome requirements.
00:05:07.200 Yeah.
00:05:08.540 I mean, that's what you got to do.
00:05:09.840 You got to put the FOMO, the threat, to be able to force a bill like this to pass.
00:05:16.120 You need a crisis to...
00:05:18.960 We don't know, like, how far this can go because who's behind it?
00:05:23.440 We have no idea how deep this goes.
00:05:26.200 Well, this is a big race right now.
00:05:28.280 This is very...
00:05:29.840 And there's a lot of controversy and competition right now, specifically with OpenAI.
00:05:36.260 Initially, it was, hey, this is going to be a nonprofit.
00:05:38.280 This is what we're doing.
00:05:39.000 I think Elon put $50 million of his own money into it.
00:05:41.280 And then all of a sudden, finding a way to adjust it.
00:05:44.080 And Sam's argument was the fact that this is what we need to do to raise a lot of money.
00:05:47.340 If we don't do this, we can't raise the money.
00:05:49.480 And then from there, the company's valuation, I think, went 100x.
00:05:52.360 I don't know if I read this somewhere.
00:05:54.060 But we're going to find out.
00:05:56.360 We're going to find out what's going on.
00:05:57.620 You know, with today's economy and the market and how small the world is, the people...
00:06:03.460 If you're watching this and you're interested in this and you also have information yourself
00:06:09.240 and you have the courage to want to do something about it, you know, I suggest you reach out.
00:06:14.340 And what's a way for people to reach out to you?
00:06:16.400 Is there a website?
00:06:17.140 Is there somewhere that they can go?
00:06:18.480 Is there anything they can do?
00:06:20.040 Is it purely Twitter?
00:06:21.140 Twitter is easy, yes.
00:06:22.140 Okay.
00:06:22.620 It's fantastic.
00:06:23.140 You know, we can allow or disallow, depending on who reaches out.
00:06:27.560 I usually look for two, three messages and try to understand the psychology.
00:06:32.800 We had some good experiences.
00:06:34.840 We had some bad experiences.
00:06:36.280 Well, Purnima, Balaji, thank you for your time.
00:06:41.320 Thank you for coming out.
00:06:43.180 And again, extremely uncomfortable.
00:06:46.140 But I appreciate you and I applaud you for being willing to share this message with others.
00:06:51.560 And it's very obvious as parents, you're proud parents for raising a kid to do what he did for the short period that he had.
00:07:00.180 I'm sure his legacy is going to continue through other people who will be inspired to want to also go out there and, you know, get the types of results that he's gotten in his young life.
00:07:08.260 Very impressive, the kind of a kid you guys raised.
00:07:10.940 Thank you so much for coming on and being on the show here.
00:07:12.880 And thank you for sharing this information with everyone.
00:07:16.140 They've tried to suppress this.
00:07:18.200 They've tried their best to suppress this news.
00:07:20.480 They don't want anyone to know it.
00:07:22.260 And if I were to say, there are a couple other whistleblowers.
00:07:25.120 One of them is Cyrus Parsa.
00:07:27.060 He's an AI whistleblower.
00:07:28.840 He was found exactly three months later, in the same way, in Los Angeles.
00:07:34.240 And his mother is, you know, she wasn't able to protest it.
00:07:37.140 So, recently, with Cyrus Parsa, the news only came out on Twitter and Facebook.
00:07:43.380 They didn't let this news circulate.
00:07:45.580 How do you spell his name?
00:07:47.540 C-Y-R-U-S.
00:07:49.740 Okay.
00:07:50.480 P-A-R-S-A.
00:07:52.580 I can share the news from Facebook, one of his friends.
00:07:55.900 Close friends put that on Facebook.
00:07:58.140 Another few friends put it on Twitter.
00:08:07.140 This is not the first time we saw a Boeing whistleblower who spoke about the nuts and bolts, right?
00:08:14.160 He was found exactly the same way, a self-inflicted gunshot wound.
00:08:18.860 And if you see JFK's files, someone who was in the military who spoke about it, who's behind it, he got shot.
00:08:29.960 No, nothing would surprise me today.
00:08:33.660 When it comes down to AI, there's a lot of intel on that.
00:08:37.140 Cyrus Parsa, from an AI organization, was found dead from a gunshot wound to the head.
00:08:41.920 He had a lot of spiritual training and would not have taken his life, since he knew what that meant.
00:08:46.540 He likely was suicided, since he also mentioned, in his tweet in January of 2025, not to take him out.
00:08:55.980 Wow.
00:08:57.180 Is there anything we can do to stop this whistleblower death?
00:09:00.700 That's the only plea I have for the government: please stop it.
00:09:06.060 How can you stop it?
00:09:07.340 Trump wants to bring a lot of changes.
00:09:09.500 Kash Patel is working with him.
00:09:10.920 Pam Bondi wants to reform the justice system.
00:09:13.580 My last wish would be, you know, like, bring justice and stop any further deaths.
00:09:21.540 You know, it's been, America is a capitalist country.
00:09:25.080 If whistleblowing causes any financial loss to them, they only know to kill and silence it.
00:09:31.760 It's not easy to suppress it, but at least if there is some kind of protection in place, you know, people can use that.
00:09:39.700 When they blow the whistle, they can say that, right?
00:09:42.640 There's a protection for me.
00:09:44.280 And then if anything happens to them, it becomes a liability for whoever they blew the whistle against.
00:09:49.980 If we have some law like that, people would not kill.
00:09:53.260 Nowadays, more than ever, the brand you wear reflects and represents who you are.
00:09:58.040 So for us, if you wear a Future Looks Bright hat or Valuetainment gear, you're telling the world, I'm optimistic.
00:10:05.580 I'm excited about what's going to be happening, and that you're a free thinker, you question things, you like debate.
00:10:10.180 And by the way, last year, 120,000 people got a piece of Future Looks Bright gear with Valuetainment.
00:10:17.560 We have so many new things.
00:10:19.420 The cufflinks are here.
00:10:20.940 New Future Looks Bright.
00:10:22.040 This is my favorite, the green one.
00:10:23.800 Just yesterday, somebody placed an order for a hundred of these.
00:10:27.900 If you watch the PBD podcast, you got a bunch to choose from.
00:10:31.260 White ones, black ones.
00:10:32.600 Because if you smoke cigars and you come to our cigar lounge, we have this high quality lighter cutter and a holder for the cigars.
00:10:41.720 We got sweaters with the Valuetainment logo on it.
00:10:44.400 We got mugs.
00:10:45.220 We got a bunch of different things.
00:10:46.960 But if you believe the future looks bright, if you follow our content and what we represent with Valuetainment, with PBD podcast, go to vtmerch.com.
00:10:56.900 And by the way, if you order right now, there's going to be a special VT gift inside just for you.
00:11:01.680 So again, go to vtmerch.com, place your order, tell the world that you believe the future looks bright.
00:11:07.500 If you enjoyed this video, you want to watch more videos like this, click here.
00:11:10.220 And if you want to watch the entire podcast, click here.