Valuetainment - March 26, 2025


"Manipulative & Full Of Lies" - OpenAI's Sam Altman SLAMMED By Whistleblower Suchir Balaji's Parents


Episode Stats

Length: 9 minutes
Words per Minute: 175.07
Word Count: 1,584
Sentence Count: 156
Hate Speech Sentences: 2


Summary


Transcript

00:00:00.000 Are you currently in communication with Ilya?
00:00:03.000 He's not replied to my email.
00:00:05.860 We are trying to.
00:00:06.740 We really want to get in touch with him.
00:00:09.060 Is he still with OpenAI?
00:00:10.460 No, he left.
00:00:10.900 Or he's also left?
00:00:11.560 He left.
00:00:12.020 But he doesn't want to talk to anybody?
00:00:14.460 I believe, instead of blaming anyone,
00:00:17.060 I would think he's busy with his own startup.
00:00:19.440 He just started SSI, his own startup.
00:00:22.160 He's probably very busy with it.
00:00:24.040 If he knows, he might be open to talk.
00:00:27.580 But one thing which I shared in Tucker's interview,
00:00:30.000 as well, December 15th or so,
00:00:32.540 when he went to an AI summit or some conference,
00:00:35.480 Ilya had security guards around him,
00:00:37.840 armed gunmen around him.
00:00:39.740 Why?
00:00:40.260 He's scared of his life.
00:00:41.580 Why is that?
00:00:44.040 Who's against him?
00:00:45.300 Who would kill him?
00:00:46.200 Why?
00:00:48.420 Well, you know, one could argue and say,
00:00:51.160 you know, in the AI space right now,
00:00:53.560 it's extremely competitive.
00:00:55.240 People are recruiting each other.
00:00:57.440 And it's like the Oklahoma land rush times 1,000.
00:01:00.640 Everybody wants to own the AI revolution that is taking place.
00:01:05.480 So, you know, and I'm sure it's not the safest situation to be in,
00:01:09.360 especially when you have that kind of content and information.
00:01:11.540 So how much time did your son spend with him,
00:01:14.540 the chief scientist?
00:01:15.200 I think it was during one of his assignments,
00:01:18.880 maybe a few months, a year or so.
00:01:20.980 Suchir worked on different assignments.
00:01:23.020 First assignment was WebGPT research paper.
00:01:25.880 He might have been working closely with Ilya on that.
00:01:29.020 And then he worked on feeding the training data.
00:01:31.920 That was the most frustrating experience for him.
00:01:35.860 And then he tuned up ChatGPT
00:01:38.120 and made it work very efficiently and faster.
00:01:41.140 That's a significant contribution by Suchir.
00:01:44.880 Got it.
00:01:45.380 So if there's anybody that would have intel,
00:01:48.460 it would be him, Ilya.
00:01:49.860 Exactly.
00:01:50.300 He would have intel.
00:01:51.200 Everything he knows.
00:01:52.200 So he knows everything.
00:01:53.220 So he's either worried for his life,
00:01:55.800 doesn't want to get involved,
00:01:56.980 doesn't want to take attention away from his company,
00:01:58.880 and just kind of wants to be left alone.
00:02:00.740 It's one of those things.
00:02:02.420 Yeah, but Suchir knew the risk he was running into.
00:02:04.860 He writes in his journal.
00:02:06.180 I wish I could share it.
00:02:07.340 He writes in his journal that I'm a credible threat
00:02:10.220 because of New York Times.
00:02:12.140 And he writes...
00:02:12.980 About himself or about Ilya?
00:02:14.540 Yes.
00:02:14.700 Okay.
00:02:15.240 About himself.
00:02:16.120 I'm a credible threat.
00:02:17.320 Sorry, I changed the topic.
00:02:19.080 I'm a credible threat because of New York Times.
00:02:23.040 And did he write anything in his journal about Ilya?
00:02:26.480 Yes, he writes.
00:02:27.600 Is there anything you can share on what he says about Ilya?
00:02:30.120 He likes Ilya.
00:02:31.380 He's a good person.
00:02:32.000 So complimentary about Ilya.
00:02:33.420 Absolutely.
00:02:34.320 Complimenting Ilya.
00:02:35.040 In his journal that you read,
00:02:38.840 who were some of the names he was concerned about?
00:02:41.920 Where he said, I'm not sure if I...
00:02:43.340 Was there any patterns that you would say
00:02:44.720 the way he wrote it in his journal,
00:02:46.120 he doesn't trust this guy,
00:02:47.300 he doesn't trust this person?
00:02:48.560 He doesn't trust Sam Altman.
00:02:50.080 Why is that?
00:02:51.900 He is manipulative and full of lies.
00:02:55.500 For that, one of his friends was trying to convince Suchir,
00:02:58.800 look, they have to say that
00:03:00.360 because being a CEO and all the VCs are interested,
00:03:03.640 he has to blow up.
00:03:05.700 But Suchir said there's no integrity for him.
00:03:08.260 I'll give an example of that.
00:03:10.280 I think March 2024,
00:03:14.520 one of the OpenAI employees left OpenAI
00:03:17.480 and he wanted to whistleblow.
00:03:19.500 They put a condition on him that
00:03:21.340 he cannot speak negatively about the company.
00:03:24.560 If he spoke, he would not get his stock options.
00:03:28.520 That guy, he refused to take...
00:03:31.120 He accepted the loss of stock options,
00:03:34.120 but he blew the whistle on what they did.
00:03:37.060 Then Sam Altman said,
00:03:39.360 I never knew anything about it.
00:03:40.840 I'm so sorry, I'll change it.
00:03:42.240 But it has his signature on it.
00:03:43.860 How much time did they spend together,
00:03:49.000 your son and Sam Altman?
00:03:51.700 Were they interacting?
00:03:53.100 Was he hands-on?
00:03:54.540 Were they in the same room?
00:03:56.500 Was he in boardrooms?
00:03:57.780 Was he in...
00:03:58.860 He was an O-1 contributor.
00:04:01.360 He was a very significant contributor.
00:04:03.680 Even in the video of OpenAI employees,
00:04:06.240 my son comes there.
00:04:07.940 I mean, the introduction to OpenAI,
00:04:10.000 there's a video for the incoming employees.
00:04:11.980 My son is there.
00:04:12.800 So, in the introduction to working at OpenAI,
00:04:14.960 your son is in that video?
00:04:16.200 Yes.
00:04:16.700 Really?
00:04:17.280 They put your son in the intro of OpenAI?
00:04:19.960 Yes, he was there.
00:04:20.920 He was very shy to talk.
00:04:22.940 He didn't speak anything.
00:04:24.200 He's very shy as such.
00:04:27.860 He's there.
00:04:28.600 Very, very interesting.
00:04:29.980 Okay, so, but hour-wise,
00:04:32.000 how many hours do you think Sam Altman
00:04:34.660 and Suchir spent together?
00:04:36.760 Actual hours.
00:04:38.280 That I don't have in the journal.
00:04:40.400 Some of the employees should share.
00:04:42.380 He met once.
00:04:43.660 He met once.
00:04:44.600 But Sam Altman sent a note for us
00:04:46.640 after Suchir passed away.
00:04:48.500 He knew that Suchir made a significant contribution.
00:04:51.240 So, he sent a note to you?
00:04:52.420 Yes.
00:04:52.800 Did he call you?
00:04:53.940 He wanted to,
00:04:54.900 but we didn't want to talk to him.
00:04:56.740 So, he wanted to,
00:04:57.720 but you declined the call?
00:04:58.640 Yes.
00:04:59.140 Okay.
00:04:59.920 Respectfully.
00:05:00.320 So, he at least made the effort to reach out to you,
00:05:04.120 but you didn't?
00:05:05.260 He did not reach out to us.
00:05:06.520 We reached out to HR to ask about his beneficiary
00:05:09.400 and other information.
00:05:11.040 Then he said,
00:05:12.340 but they keep saying they'll support the family.
00:05:16.400 What have they done?
00:05:18.260 Now we have a request to them.
00:05:20.020 Let's keep it confidential.
00:05:21.520 We're going to go to them.
00:05:22.860 Let's see if they accommodate our request.
00:05:24.700 Is the request public or is it private?
00:05:28.520 It's a private request.
00:05:29.660 It's about...
00:05:30.260 It's about your son and OpenAI.
00:05:34.200 Let's see if they support.
00:05:35.520 As such,
00:05:36.480 they got back saying that, at OpenAI,
00:05:39.460 There's no email that originated from my son
00:05:41.940 about copyright.
00:05:43.440 Your son left OpenAI on very good terms.
00:05:46.640 He didn't do anything
00:05:47.580 until we saw in New York Times.
00:05:49.820 We didn't know about his whistleblowing activity.
00:05:52.080 That could not be true.
00:05:53.680 You know why?
00:05:55.120 Wait, wait.
00:05:56.800 So they're saying until the New York Times article,
00:06:00.140 they did not know about any of the whistleblowing activity.
00:06:03.040 They say that.
00:06:04.320 So what was their argument that they're saying?
00:06:07.740 We are looking for someone who's read his emails
00:06:10.220 or he's responded to emails.
00:06:12.880 So Suchir knew it, right?
00:06:14.160 Why would he go to New York Times in July end itself?
00:06:18.420 There's something, right?
00:06:19.440 We need to discover.
00:06:20.380 And what we know is emails could have been deleted.
00:06:24.660 Evidence could have been wiped out.
00:06:26.960 There are a few other employees,
00:06:28.580 former employees of OpenAI, who have a lawsuit against them.
00:06:32.080 They say their emails were deleted.
00:06:34.160 I don't want to make any conspiracy theory,
00:06:37.140 but we would not rule out any of those.
00:06:39.420 That's exactly why we are saying
00:06:41.140 FBI should get involved and investigate.
00:06:43.460 Who should be involved?
00:06:44.520 FBI.
00:06:45.260 FBI should get involved.
00:06:46.600 Have you had any luck?
00:06:47.960 Has the FBI, since the Tucker interview,
00:06:51.040 did you,
00:06:51.440 because I'm assuming when you do the Tucker interview afterwards,
00:06:53.580 that goes out there,
00:06:54.280 a lot of weird people are going to reach out to you.
00:06:55.980 Some people that are currently at open AI,
00:06:58.280 previous employees,
00:06:59.240 did anybody reach out to you after Tucker's interview?
00:07:01.720 No.
00:07:02.080 They're all scared for their life.
00:07:03.560 Are you serious?
00:07:04.060 So no one,
00:07:04.980 no one even from the agency reached out to you?
00:07:07.220 No.
00:07:08.480 So January 15th,
00:07:10.360 when was the interview done with Tucker?
00:07:12.340 Was it on January 15th?
00:07:13.860 Yes.
00:07:14.160 So that's five days before the inauguration,
00:07:18.040 while the president gets in.
00:07:19.560 I know.
00:07:19.560 So both Sam Altman and Elon
00:07:26.160 are involved with a relationship with the president.
00:07:28.700 The president has a good relationship with both of them.
00:07:30.840 Yeah.
00:07:33.220 It's a bit of a technical situation here
00:07:36.320 on who would want to really investigate
00:07:38.980 and get to the bottom of this.
00:07:40.580 But you're saying no one has yet reached out.
00:07:42.440 That's interesting.
00:07:43.040 Nowadays, more than ever,
00:07:44.880 the brand you wear reflects and represents who you are.
00:07:47.900 So for us,
00:07:49.020 if you wear a Future Looks Bright hat
00:07:51.540 or Valuetainment gear,
00:07:53.240 you're telling the world,
00:07:54.600 I'm optimistic.
00:07:55.820 I'm excited about what's going to be happening.
00:07:57.440 But you're a free thinker.
00:07:58.380 You question things.
00:07:59.120 You like debate.
00:08:00.040 And by the way,
00:08:00.740 last year,
00:08:01.240 120,000 people got a piece of Future Looks Bright gear
00:08:06.580 with Valuetainment.
00:08:07.400 We have so many new things.
00:08:09.280 The cufflinks are here.
00:08:10.740 New Future Looks Bright.
00:08:11.800 This is my favorite.
00:08:12.620 The green one.
00:08:13.660 Just yesterday,
00:08:14.500 somebody placed an order for a hundred of these.
00:08:17.640 If you watch the PBD podcast,
00:08:19.320 you got a bunch to choose from.
00:08:21.040 White ones,
00:08:21.840 black ones.
00:08:23.020 If you smoke cigars
00:08:25.540 and you come to our cigar lounge,
00:08:26.960 we have this high-quality
00:08:28.900 lighter cutter
00:08:29.900 and a holder for the cigars.
00:08:31.560 We got sweaters
00:08:32.540 with the Valuetainment logo on it.
00:08:34.260 We got mugs.
00:08:35.080 We got a bunch of different things.
00:08:36.760 But if you believe
00:08:37.740 the future looks bright,
00:08:39.260 if you follow our content
00:08:41.160 and what we represent
00:08:42.460 with Valuetainment,
00:08:43.740 with PBD podcast,
00:08:45.440 go to vtmerch.com.
00:08:46.980 And by the way,
00:08:47.340 if you order right now,
00:08:48.320 there's going to be a special VT gift inside
00:08:50.400 just for you.
00:08:51.540 So again,
00:08:51.820 go to vtmerch.com,
00:08:53.280 place your order,
00:08:54.620 tell the world
00:08:55.540 that you believe
00:08:56.260 the future looks bright.
00:08:57.120 If you enjoyed this video,
00:08:58.340 you want to watch more videos like this,
00:08:59.700 click here.
00:09:00.080 And if you want to watch
00:09:00.760 the entire podcast,
00:09:02.420 click here.