SHNEAKO - February 27, 2026
Sam Altman's Plan to Destroy the World
Episode Stats
Words per Minute
202.14551
Summary
Sam Altman is the gay Jew behind ChatGPT, the LLM that Netanyahu is proud of. He s Silicon Valley s golden child, the next model of tech CEO: humble and unassuming, but with a big vision for the future. His carefully managed public image has been designed to take the best bits from other CEOs, all while ironing out all of the wrinkles.
Transcript
00:00:00.000
Sam Altman is about to destroy society forever.
00:00:15.400
The next model of tech CEO, humble and unassuming, but with a big vision for the future.
00:00:20.720
His carefully managed public image has been designed to take the best bits from other CEOs,
00:00:35.340
Sam tries to put forward a humble image online,
00:00:37.840
leaving lowercase Twitter comments about how he wouldn't spend lots of money on a car.
00:00:41.600
But he's also been spotted driving cars worth far more than the Porsche he was talking about.
00:00:45.700
Like most tech CEOs, Altman loves to go on podcasts,
00:00:48.780
but he just can't resist his urge to dodge any uncomfortable questions.
00:00:52.200
Just listen to him dodge this very simple direct question on Theo Von's podcast.
00:00:56.260
a lot of these guys have bunkers zucky has a bunkie i know that somewhere out in hawaii
00:01:00.100
do you have a bunker i have like underground concrete heavy reinforced basements but i don't
00:01:05.460
have anything that's what a bunker is i have a bunker i have underground reinforced basements
00:01:13.940
that i've spent billions of dollars on that's exactly what a bunker is
00:01:18.580
i don't think that's really interesting the fact that bunkers exist the fact that bunkers exist
00:01:24.020
tells me everything we need to know and these are not normal five dollar bunkers these are
00:01:28.100
multi-million dollar bunkers if there's multi-million dollar bunkers and these people
00:01:32.900
are spending this much money on living underground um don't you think they're trying to prepare for
00:01:37.460
something right people aren't gonna spend that much money over speculation they're only gonna
00:01:42.660
spend that much money over what they know is gonna happen and same thing with uh they cry about
00:01:47.460
climate change all the time but people like bill gates they just bought you know multi-billion dollar
00:01:52.020
shorefront properties it's like okay if the sea levels are rising why would the people that have
00:01:57.220
overwhelming knowledge about the climate buy houses right by sea level you know especially
00:02:03.540
oh blackrock for example people crying about bitcoin me and claude were going back and forth
00:02:07.140
about that yesterday he was saying that oh bitcoin's escape okay then why is blackrock
00:02:11.060
investing money into it just look at where the wealthiest people the people with the most
00:02:14.580
amount of power are putting their money it's the best way to see where the future is going
00:02:18.820
he's just the same as the rest of them and the fact that he would try and bend the truth over
00:02:39.160
something that confirms that makes it all the more obvious and it's just the tip of the iceberg and
00:02:43.880
unfortunately there's mounting evidence that altman is far worse than the average tech
00:02:47.760
billionaire this person of the year has left a long trail of lawsuits allegations powerful enemies
00:02:53.240
and possibly even murders what's even worse is that so much of the future could one thing he has
00:02:59.060
in common with peter teal if i'm not mistaken allegedly in my opinion well peter teal does
00:03:03.280
have an ex-boyfriend that offed himself apparently so does sam altman they're both uh sam altman is a
00:03:09.820
gay jew peter teal is a gay person who works with israel a gay white guy works with israel but
00:03:15.640
they do have that in common. Depend on this man. The world economy is all in on AI with trillions
00:03:22.460
of dollars riding on Sam Altman's personal promises for the future. And if they turn out
00:03:26.780
to be just another set of lies, the consequences could be cataclysmic. That's why the future of
00:03:32.460
the world could very well hinge on the reasons why I hate Sam Altman. To many, it did just seem
00:03:37.760
like he came out of nowhere. As ChatGPT skyrocketed to popularity overnight, so did his profile to a
00:03:42.660
wider audience. But people more connected to Silicon Valley have known Sam Altman for years.
00:03:47.880
Far from being an outsider, he was a central figure at the very heart of the tech world.
00:03:52.420
He worked on the inside, brushing shoulders with billionaires like Elon Musk and Peter Thiel.
00:03:56.400
But even before that, Sam Altman was just a kid. He was the first child of four,
00:04:00.760
with a politically connected real estate developer father and a dermatologist mother. In other words,
00:04:05.040
Sam was born into high society. And Sam was a bright kid and clearly a prodigy when it came
00:04:09.820
to technology. His semi-official biography The Optimist writes that at just two years old he was
00:04:14.180
using his dad's VCR player to watch Sesame Street. It also doesn't shy away from how differently
00:04:18.600
wired he was. When he was still just two, Sam's mother took him to the playground, but instead
00:04:23.320
of running to play on the swings or to hang off the monkey bars Sam just sat down on a bench.
00:04:27.820
Quote: "No, Mum." Another thing he has in common with Peter Thiel: both were bullied kids. So many of the
00:04:32.200
people that run Silicon Valley and this tech world are just revenge of the nerds. They were bullied as
00:04:37.700
kids and hurt people hurt people and so they want to destroy the whole world because you know misery
00:04:43.360
loves company they had a bad childhood they just want to take it out on everybody else
00:04:46.960
i'm going to sit with you here and let's just watch the babies play. At the age of eight, Sam
00:05:03.240
was then given his first computer. He almost immediately learned how to program on it. In
00:05:07.440
The Optimist, Sam uses this as an opportunity to lay the seeds for his vision for AI. As the
00:05:11.780
author writes, quote, I just remember thinking that someday the computer was going to learn to
00:05:15.640
think. They then go on to explain how Sam needed to get away from the participation trophies and
00:05:20.200
into an environment that more suited his genius. At around the age of 13, Sam transferred to a
00:05:25.080
prestigious private school where he made his first real connections with upper society,
00:05:28.680
which mixed exclusivity with massive pressure for students to get good grades. He wasn't just
00:05:33.000
a stereotypical nerd though. Friends and family do describe a young Sam Altman as confident and
00:05:37.480
able to convince people of almost anything. His passion for whatever he was working towards was
00:05:41.700
infectious. His father had used this family trait to work on building affordable housing.
00:05:57.680
Sam Altman's plans were entirely different. For college he enrolled at Stanford to major in
00:06:02.780
computer science, although he didn't last very long. The connections, again, were far more
00:06:06.960
important for Sam than the lectures or the courses he took. At this point in the 2000s,
00:06:10.980
Stanford had cemented its reputation as the feeder school for Silicon Valley. Tiny startups
00:06:15.240
and tech giants alike prowled its corridors, looking for any young prodigy programmers and the next
00:06:19.760
generation of entrepreneurs. Altman's general intelligence and his technical knowledge combined with
00:06:24.220
Love these schizo streams with Warner. Hope you guys are well.
00:06:26.460
Thanks a lot, man. I appreciate it. Oh, that's the cabin here.
00:06:30.900
an ability to persuade people of his vision were a magic combo. within only two years at the school
00:06:38.560
he got caught up in the silicon valley fever leaving to begin his own startup admittedly
00:06:42.940
though it wasn't anything special. loopt, as it was called, was a social network designed to focus on
00:06:47.520
real life location tracking and a kind of check-in feature in 2008 a 23 year old sam altman gave this
00:06:53.500
presentation on the app obviously we know a lot more about what social media needs to work now
00:06:57.240
than we did back in 2008. It does have some good elements that look like things Snapchat would add
00:07:01.480
years later. But even then, it was clear that there were lots of problems. For this app to have ever
00:07:06.020
worked, it would have needed constant surveillance and tracking. At the same time, people would also
00:07:09.960
need to engage with the app almost constantly. They would always have to be updating their status
00:07:13.660
to make it work properly. Altman didn't really need to convince the general public about Loopt
00:07:17.240
yet though. Instead, his role as CEO was to sell the app to the expansive world of Silicon Valley
00:07:21.280
venture capitalism. This was funnily enough something he found very easy and you can kind
00:07:25.560
of see why. Take a listen to young Sam telling an interviewer why Loopt will take off based on his
00:07:29.580
vision for the future. He has this uncanny ability to make his vision for the future seem
00:07:53.040
completely believable. Loopt will succeed because people will soon stop caring about privacy and
00:07:57.160
want to share their location all the time. It'll happen because of how obviously valuable Loopt
00:08:00.940
is. Today, we know none of that really happened in the way he imagined it, but investors at the
00:08:05.000
time completely bought in. Straight out of the gate, Loopt received $5 million from various
00:08:09.500
Silicon Valley investors. But as the years went on, the app never really got off the ground.
00:08:13.800
Altman boasted about a large and healthy user base in interviews, but who actually remembers
00:08:17.140
this app at all? It didn't stop him from convincing investors to put in more money though.
00:08:20.680
Over its seven-year lifespan, the app was periodically injected with outside funds,
00:08:24.280
keeping it alive. On paper though, it was a failure. Despite all of his vision and his
00:08:29.260
confidence, Loopt became a footnote in tech history. But as we can speculate now, that
00:08:33.320
might have never been the purpose of Loopt for Sam Altman. He probably believed in the app with all
00:08:37.620
of his heart at one point. But as time went on, Sam became increasingly involved in the inner
00:08:41.920
workings of Silicon Valley's investment engine. His status as a fast-rising startup entrepreneur
00:08:46.080
was all he needed to make connections with powerful people like Peter Thiel and Reid
00:08:49.580
Hoffman. Eventually, his friends at Sequoia Capital would step in and take Loopt off of his
00:08:53.380
hands, allowing Sam to net a few million dollars in the process, despite having pretty much nothing
00:09:00.700
concrete to show for years and years of work. But Loopt had given Sam a foothold in Silicon Valley.
00:09:06.620
But before we continue, I want to tell you about War Thunder.
00:09:13.620
Does Warner do Umrah tours for public? Would love to do it if possible.
00:09:18.300
great question no only private oh sorry little bro get your money up
00:09:30.660
would make him its unofficial king in 2011 sam joined y combinator it wasn't just an investment
00:09:37.820
fund it was the startup accelerator in silicon valley airbnb twitch reddit stripe doordash
00:09:43.600
coinbase and dozens of other household names all owed their existence to it in one way or another
00:09:50.160
He came to decide which startups got funding, which were favoured,
00:09:53.220
and which were ignored and starved of cash before they could rock the boat.
00:09:56.100
It also gave him the opportunity to get in on the ground floor with tons of potentially massive companies.
00:10:00.580
A scattering of small investments could net him millions down the line.
00:10:03.420
Only a fool or someone with completely pure morals would have resisted getting rich off of this insider knowledge.
00:10:09.780
He was right on the front lines of venture capitalism,
00:10:12.120
picking out the prize apples for them to pluck from the trees.
00:10:14.700
But while extending his own web of connections and influence,
00:10:17.000
Sam still found a way to make it sound like he was working towards some sort of greater good
00:10:21.120
for society. In 2014, Sam was then made president of Y Combinator and his personal reach extended
00:10:34.280
even further. He was only 28 years old, and to many, it was obvious how much power he had in
00:10:38.980
his hands. But to deflect from those kinds of questions, Sam had a special answer. You could
00:10:43.080
be forgiven for thinking that Silicon Valley had just no real ideology or philosophy, other than
00:10:47.720
to make as much money as possible, whatever the cost. But for years, a philosophical movement had
00:10:52.260
been building in Silicon Valley that seemed to offer guilty tech bros a way to justify their
00:10:55.980
actions. Effective altruism is pretty simple on the surface. Springing from the work of the Australian
00:11:00.700
ethicist Peter Singer, its basic principle is that you should do whatever you can to make as many
00:11:05.280
people as happy as possible. For Sam Altman, this logic justified his career choices. He decided
00:11:09.980
that he would be wasting his life if he spent it on working at a soup kitchen or even running a
00:11:13.780
regular charity. Instead, he could use his skills to make as much money as possible. He could then
00:11:18.160
use the profits to fund more charity work than he could ever accomplish directly working for charity.
00:11:22.760
Plus, if he ever got to the top of the tech world, it would prevent someone more evil taking his place
00:11:26.740
and misusing the power for their own gain. Some of Silicon Valley's biggest scam artists have
00:11:31.060
cloaked themselves in this exact ideology. When Elizabeth Holmes was-
00:11:34.880
Do you know about Sam Bankman-Fried, the guy right here?
00:11:38.860
Sam Begman-Freedy, locked up right now, Jewish guy of course,
00:11:42.760
behind one of the biggest crypto scams of all time, FTX.
00:11:47.940
artists have cloaked themselves in this exact ideology.
00:11:51.020
When Elizabeth Holmes was asked how she felt being the youngest billionaire in the world,
00:11:54.800
she immediately leant on this to make herself look humble.
00:12:04.520
and what matters is how well we do in trying to make people's lives better.
00:12:09.260
That's why I'm doing this, that's why I work the way that I work
00:12:14.380
Sam Bankman-Fried was also a big fan of the movement,
00:12:17.080
later using it to try and excuse his actions in an interview taken while FTX
00:12:22.160
You know, I was thinking a lot about, you know, bed nets and malaria,
00:12:25.940
about, you know, saving people from diseases no one should die from,
00:12:28.960
about animal welfare, about pandemic prevention
00:12:31.380
and you know what could be done on large scales help mitigate those. Sam Altman would use these
00:12:36.540
ideas to his own benefit. Just like Bankman-Fried and Holmes, he uses them as a shield but
00:12:41.620
he also used them as an excuse to gain even more power and start OpenAI. Obviously if tech oligarchs
00:12:47.300
had actually stuck to real effective altruism it would have been great. They could have spent their
00:12:51.300
vast excesses of money on food for the hungry or by investing in vital infrastructure. Instead the
00:12:56.920
movement took a much different turn top we were speaking about this yesterday but it's like
00:13:01.600
i think one of the biggest problems that's not spoken about is the fact that these tech
00:13:05.440
billionaires they're investing so much money into mars into ai mass surveillance when they have all
00:13:10.380
the funds right now to feed every hungry person in the world there's no reason that people need
00:13:14.720
to be starving they have all the capital to feed them and every waking moment they choose to not
00:13:20.560
do it it's an active decision to keep people hungry and did you notice that they love talking
00:13:34.720
The reality, they know nothing about the ocean,
00:13:38.700
and do this, this, and that. We can't even go to Antarctica.
00:13:54.800
city anybody you can go but you have to profess that there's only one worthy of worship if you're
00:14:00.300
going to worship idols you're going to worship different things then you can't go to the place
00:14:03.060
of worship of god the house of worship of god is only for the people that worship god alone
00:14:07.640
worship other things go somewhere else and you know something interesting about antarctica so
00:14:12.980
antarctica is one of the only continents right that are every single country in the world has
00:14:18.220
signed a treaty that no one can own it it's only used for scientific purposes don't you find that
00:14:24.180
so fascinating no one is allowed to own this land it's only used for scientific purposes so there's
00:14:29.400
something there and they don't want the rest of the world to know and then since we're young they
00:14:33.840
told us hey guys don't worry there's nothing there it's nothing but ice penguins and polar bears you
00:14:38.800
don't want to go over there and through all the media mr beast all these people what are they they
00:14:43.480
try to push that same narrative for me personally i think gog and magog are hiding there who knows i
00:14:48.940
could be mistaken but i personally think gog and magog would be hiding there that's the only land
00:14:53.180
on earth that we as humans cannot explore openly and that's probably where they're hiding a lot of
00:14:58.000
you know the secrets of the world so yeah forget about mars let's go see that let's go explore our
00:15:03.320
own world first before we spend all this money on spacex minds in the tech world got bored with
00:15:10.220
this grounded version of effective altruism they instead became obsessed with the idea of doing
00:15:14.340
the best thing for the billions or even trillions of people who might exist in our distant future
00:15:18.840
in their minds they needed to save these people from a coming threat one that could potentially
00:15:23.100
wipe out humanity completely. Evil AGI. While it was still a niche position at the time,
00:15:28.320
people like Elon Musk and even some top AI researchers had begun to become incredibly
00:15:32.660
worried about artificial general intelligence. They thought that if we kept on making breakthroughs
00:15:45.140
with AI, then we would quickly stumble across an artificial intelligence capable of improving
00:15:49.480
itself and escaping its digital chains. If it decided that humans weren't worth keeping around,
00:15:54.840
then it could go full Skynet and just destroy humanity. Preventing this outcome outweighed
00:15:59.480
all the mosquito nets and aid packages that you could dream of. Musk saw this as an imminent
00:16:04.040
threat, especially considering that Google was well ahead in AI development and seemed to have
00:16:08.520
no care about human life. They had all the power, all the funding and they had hired pretty much all
00:16:13.240
of the top AI researchers. With their profit incentive, Musk believed that they could easily
00:16:17.320
create the next HAL 9000. To beat Google and save humanity, Musk needed a startup unlike any other.
00:16:23.440
This was what brought him into contact with Sam Altman and his network of wealthy investors who
00:16:27.480
shared his concerns with AI. Over the course of a few months, they came up with a plan to beat
00:16:31.320
Google. Without any other way to compete, Musk, Altman and the gang would have to attack them on
00:16:35.440
an ideological level. In her book Empire of AI and in interviews since it was released, Karen Hao
00:16:40.060
describes how this all took place. Because of that fear, Altman and Musk then thought we need to
00:16:45.380
do a non-profit, not have these profit-driven incentives. We're going to focus on being
00:16:49.860
completely open, transparent, and also collaborative to the point of self-sacrificing,
00:16:54.500
if necessary. I have come to speculate. This is not based on any documents that I read or anything.
00:17:00.180
That's not cozy. Stop being racist. I've come to speculate that part of the reason why they
00:17:04.580
started as a non-profit in the first place is because it was a great recruitment tool for
00:17:09.540
getting at that bottleneck. They could not compete on salaries with Google, but they could
00:17:14.900
compete on a sense of mission this is the reason that musk named the company open ai and today it's
00:17:20.420
just completely ironic but back then it was truly an open attempt at making ai without any profit
00:17:25.300
motives the ideological approach worked at first it enabled the company to poach lots of google's
00:17:29.780
top ai researchers despite the drop in salary it also meant that the company had tons of funds to
00:17:34.180
work with a large portion of them coming from musk himself obviously today we know that open ai is
00:17:38.980
one of the most profit incentivized capitalistic companies around someone in the chat says so elon
00:17:44.100
is trying to save us yeah well he was talking about the dangers of open ai and then created
00:17:48.300
a direct competitor which is almost identical to it grok is the same exact thing as chat gpt
00:17:52.940
this guy instead of trying to push legislation to restrict mass surveillance from ai he just
00:17:59.020
created a competitor burger king to the mcdonald's good and also that's just a form of manipulation
00:18:04.320
too hey guys oh you know just to make it seem like he's a good guy you know i'm saying
00:18:08.500
Musk is currently in the process of suing Altman and OpenAI,
00:18:13.720
accusing them of having always been profit motivated and of duping him for years into
00:18:17.360
funding their schemes. Whether or not that's true will come out in the courts, but the emails we've
00:18:21.560
seen already from the legal process are just incredibly telling. Just take a look at this
00:18:25.260
exchange between Altman and Musk, which seems to suggest they were all in on the grift by 2017.
00:18:29.840
In the next email on the exchange, Musk offers to give them all prototype Teslas,
00:18:33.520
but this happy state of affairs didn't last for long, as Altman pushed to make the company just
00:18:37.880
as profit-seeking as the competition, he also worked to muscle Musk out of the business altogether.
00:18:42.580
A few months later, Musk was angry at being taken along for a ride. Instead of preventing a dark AI
00:18:46.960
future, he might have just added another malicious influence into the mix. As time went on, momentum
00:18:51.520
at OpenAI grew and it attracted more and more investors. This was Altman's strength. His pitch
00:18:56.680
was relatively simple. AGI was going to change the world entirely. Getting in on the ground level
00:19:02.240
with OpenAI, who had a good chance of being the first to make the breakthrough, could therefore
00:19:06.100
be the most profitable investment ever. It was amazing because he didn't need to back it up with
00:19:10.680
much. Just being one of the biggest players in the game was already enough, and the potential of AI
00:19:14.780
sold itself. When ChatGPT went huge in late 2022, it did seem to confirm everything he had been
00:19:20.100
saying. Investors then poured in their investments in hopes of catching the wave. OpenAI went from a
00:19:24.900
relatively small tech company to one of the biggest tech names in the entire world overnight.
00:19:29.720
Altman's world tour then began, where he spent months talking to world leaders and billionaires
00:19:33.580
alike. It was the next part of the plan, getting those juicy government contracts and expanding
00:19:38.160
his power beyond the limits of Silicon Valley and into the halls of government. Other tech
00:19:42.180
companies were quick to join the craze, pouring in their own billions to push their own AI research
00:19:46.160
and products. Even during this golden age, Altman was still playing a dangerous game. Behind the
00:19:50.860
scenes, he ran OpenAI in a strange, secretive way. He hid the development and the release of
00:19:55.220
ChatGPT from the rest of the company's board. He took all the credit for its creation. He seemed to
00:19:59.460
position himself to get all of the recognition, the power, and the glory for himself. It was
00:20:03.680
these resentments that led to the failed coup in 2023. But because of both Microsoft and the
00:20:08.080
company's regular employees backing him up, it completely failed and only increased Altman's
00:20:12.300
power even more. Today, it's clear that he enjoys full control. It's only recently that we've seen
00:20:16.800
Altman's manipulation skills out in the open. His job changed when ChatGPT became so massive
00:20:21.700
and the investment into AI got so crazy. Instead of just having to convince investors, he now has to
00:20:26.420
convince the entire world that openai specifically is worth the trillions that are in play recently
00:20:30.820
he's been doing the rounds on podcasts making the standard claims that ai is the future: ai tutors,
00:20:35.780
incredible ai medical advisors. but personally speaking i'm so excited for ai for science. but
00:20:41.860
he's had to change his tune when chat gpt was the only game in town he was keen to make it
00:20:45.780
clear that it was hopeless for anyone else to compete with their model models how should we
00:20:49.380
think about that where is it that a team from india you know three super smart engineers with
00:20:53.780
you know not 100 million but let's say 10 million could actually build something truly substantial
00:20:58.260
look the way this works is we're going to tell you it's totally hopeless to compete with us on
00:21:01.620
training foundation models you shouldn't try and it's your job to like try anyway and i believe
00:21:06.260
both of those things and i think it i think it is pretty hopeless but he was clearly kind of joking
00:21:12.260
but his comment at the end makes it clear that this was what he really thought chat gpt was a
00:21:16.900
major breakthrough of course but the competitors caught up quickly google and other companies that
00:21:20.820
rivaled OpenAI's resources were quick to respond to the challenge. But even much smaller companies
00:21:25.060
could compete as well, like the Chinese startup DeepSeek and their AI model. Now that OpenAI are
00:21:29.140
losing when it comes to how powerful their model actually is, Altman has begun pushing the idea
00:21:33.460
that none of that even mattered in the first place. When I was a kid, the race was like the
00:21:37.380
megahertz race and then it became the gigahertz race. Everybody wanted a computer with a faster
00:21:40.660
processor and, you know, Intel would come out with this one and then AMD would come out with this one
00:21:44.740
and every like it turned out that those
00:21:53.380
gigahertz measurements eventually were not even that helpful like you could have one that had a
00:21:56.820
lower number it's also a great example of how he tries to appeal to regular people but just ends
00:22:01.620
up looking patronizing it's not the only time he began a story with the phrase when i was a kid
00:22:06.580
like that's the only way he could think of relating to normal people by remembering things
00:22:10.580
from when his brain hadn't properly formed yet on different podcasts altman exposes himself in
00:22:14.980
other ways just watch this clip of how defensive he said again before he was he talked about he
00:22:20.020
was a kid when what so every time he says hey when i was a kid before i was a homosexual oh my god
00:22:26.020
okay tucker carlson brought up the open ai whistleblower you had complaints from one
00:22:30.260
programmer who said you guys were basically stealing people's stuff and not paying them
00:22:33.300
and then he wound up murdered what was that also a great tragedy uh he committed suicide
00:22:37.940
do you think he committed suicide i really do that's right medical record does it not look like
00:22:41.060
one to you no he was definitely murdered i think um there were signs of a struggle of course the
00:22:45.780
surveillance camera the wires had been cut um and his mother claims he was murdered on your orders
00:22:51.940
do you believe that i'm well i'm asking i mean you just said it so do you do you believe that
00:22:56.740
when you get deep into this case and look at the details revealed by the family's investigation
00:23:01.380
it becomes incredibly suspicious altman's reaction to the line of questioning does him
00:23:06.580
and his company absolutely no favors here either. He clams up either giving short answers or long
00:23:11.820
winded nothing responses with seemingly no emotions behind his eyes. He tries to shut
00:23:15.640
down the question by appealing to respect for the family. Meanwhile Tucker had the real respect to
00:23:28.240
give Suchir Balaji's family time to make their case in front of his audience. When you consider
00:23:32.680
the strange circumstances the fact what was his name suchir balaji was it when he timed to make
00:23:39.020
failure meanwhile tucker has the real respect to give suchir balaji's family time oh balaji okay
00:23:45.740
not okay okay all right you said yeah you thought it was jeet it's very close right
00:23:50.900
very close no the way that he announced it yeah right look at them justice for jeet bro balaji
00:23:56.560
come on be respectful guys be respectful the real warning signs and the discrepancies of the scene
00:24:00.920
you can really see where they were coming from. And it only adds to the pressure on future potential
00:24:19.240
whistleblowers to not say anything that could damage OpenAI or Sam Altman. Then when you add
00:24:23.580
in the predatory, incredibly restrictive NDAs that he makes employees sign, it does get a little
00:24:28.520
worrying. Meanwhile, the other effects that OpenAI has had on the world today are just incredibly
00:24:32.480
alarming. In her book, Empire of AI, Karen Howe investigates these in excruciating detail.
00:24:37.540
She reveals the communities sucked dry by OpenAI's data centers. She talks to people in the
00:24:41.960
developing world hired on poverty wages to trawl through heartbreaking and mentally scarring
00:24:46.020
content, all for ChatGPT's training data. Now, you could say that this is all justified to bring
00:24:50.900
the wonders of AI to the world, but really it's not a bright future either way. Think about the
00:24:55.020
options. Even if you assume that Sam Altman is correct, he doesn't look like a man
00:25:00.320
that deserves all of this power. If AI really is this transformational, then he's going to be one
00:25:04.720
of the most powerful men in history. And just look at how he's used his power already to see
00:25:08.780
how that could turn out. On the other hand, he could end up being another Silicon Valley tragedy,
00:25:13.000
someone who built a whole lot of hype which eventually led to a spectacular downfall.
00:25:16.740
Sometimes in the aftermath of a collapse, a moment from the past can come back and stand out.
00:25:20.220
It can feel like the problem was hiding in plain sight, just waiting for someone to notice it.
00:25:23.000
when sam bankman-fried and his fraudulent ftx empire came crashing down it was his infamous
00:25:26.680
box interview he had given a few months before. In the clip he pretty much describes a Ponzi scheme,
00:25:30.040
but with new crypto words and extra steps. this year in x tokens being given out for it that's
00:25:34.440
a 16% return that's pretty good we'll put a little bit more in right and maybe that happens until
00:25:38.360
there are 200 million dollars in the box so well remember the words from jeffrey epstein in the
00:25:42.200
files he said let the goyim work in the real world the jew will work in finance stocks ships shorting
00:25:49.640
we'll work on wall street we'll get the transaction fees we'll let the the goyim
00:25:53.000
work in the real world sam beckman free does embody that completely
00:25:58.040
so translation we work to manipulate the money while you work for the money
00:26:05.320
you know sophisticated traders and or people on crypto twitter or others or similar parties
00:26:09.240
go and put 200 million dollars in box collectively and they start getting these x tokens for it
00:26:12.760
right and now all of a sudden it's like wow people just decide to put 200 million dollars in the box
00:26:16.440
this is a pretty cool box. Now all of a sudden, of course, the smart money is like, oh wow,
00:26:19.320
like this thing's now yielding like 60% a year in X token. Of course I'll take my 60% yield.
00:26:22.820
He ends his explanation not with people figuring out the box was worthless, but with this line
00:26:26.300
instead. Right, so they go, they pour another $300 million in the box, and you get a site,
00:26:29.660
and then it goes to infinity. Of course, this wasn't how it ended for FTX. It didn't go to
00:26:32.860
infinity, and eventually the speculation ended. When he was challenged on how shady this all
00:26:35.960
seemed, he didn't even defend it. Altman was asked a very similar question in an interview
00:26:39.160
about OpenAI. He was questioned on the astronomical difference between their revenue
00:26:42.040
and their investment promises. Quite simply, how could someone promise hundreds of billions
00:26:44.880
of investment with 1,000 times less revenue? He took a different approach, angrily denying the
00:26:48.600
question entirely and pretty much saying that it doesn't even matter.
00:27:14.880
to some of the people who are making the most noise on twitter whatever about this very quickly
00:27:17.360
You can tell it's not how anyone expected him to respond. The Microsoft CEO was also on the call, and
00:27:21.120
as the ceo of microsoft and one of openai's biggest so you remember from professor uh james lecture
00:27:26.400
yesterday where we were talking about the the psychology of evil people right like and he
00:27:31.920
talked about how their sociopath they lacked emotion and you could clearly see that guy like
00:27:36.560
he just lacks like empathy you know certain characteristics of emotion that normally humans
00:27:42.880
have you can see he's just lacking that and we just studied that yesterday i think it's like
00:27:46.780
well to have that much power and control you can't be a normal function you have to be a sociopath
00:27:51.100
it's the psycho versus schizo versus normie theory you're gonna play
00:27:54.580
the psychos are the ones manipulating the normies and the greatest threat to the psychos are the
00:27:59.860
schizos good point yeah investors, he clearly didn't like it. He immediately laughs awkwardly
00:28:06.060
in response to the hostility and tries to change the subject. And let me just say one thing, uh, Brad,
00:28:10.240
as both a partner and an investor, there has not been a single business plan that I've seen from
00:28:15.700
OpenAI that they've put in and not beaten it. So in some sense, this is the one place where,
00:28:20.580
you know, in terms of their growth, and just even the business, it's been an unbelievable
00:28:23.640
execution, quite frankly. I mean, obviously, OpenAI, everyone talks about all the success
00:28:26.440
and usage and what have you. But even, I'd say, all up, the business execution has been just
00:28:30.200
pretty unbelievable. At the same time, it's clear from how defensive Sam Altman got that the
00:28:33.380
question must bother him as well. It's the kind of hard truth that even he finds hard to dodge,
00:28:36.760
and this is the crux of the problem with Sam Altman.
00:28:38.820
ChatGPT is undoubtedly a great piece of technology,
00:28:48.440
Well, depending on what happens next to the global economy,
00:28:53.460
Again, thank you to War Thunder for sponsoring.
00:28:58.560
You gotta see the psychos that are controlling the world
00:29:01.080
and getting everybody to be used to the AI surveillance state