Stay Free - Russell Brand - May 07, 2023


Bari Weiss (AI: Apocalypse or Revolution?)


Episode Stats

Length

29 minutes

Words per Minute

185.05695

Word Count

5,416

Sentence Count

306

Misogynist Sentences

5

Hate Speech Sentences

2
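For readers curious how the Words per Minute figure above relates to the other stats, here is a minimal sketch of the presumed calculation. It is only an illustration: the 1,756-second duration is a back-calculated assumption (roughly 29 minutes 16 seconds), since the stats above give only the rounded 29-minute length.

# Minimal sketch of the presumed words-per-minute calculation (Python).
# Assumption: the episode ran about 1,756 seconds; the stats above only state a rounded 29 minutes.
word_count = 5416
length_seconds = 1756

words_per_minute = word_count / (length_seconds / 60)
print(round(words_per_minute, 5))  # approx. 185.05695, matching the stat above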


Summary

As companies race to integrate artificial intelligence into our everyday lives, one man behind that technology has resigned from Google after more than a decade. Dr. Geoffrey Hinton, known as the godfather of artificial intelligence, says he stepped down because he didn't want to personify it anymore. And what's next for the coronation of King Charles III of the United Kingdom? A simple, modern coronation that doesn't need all sorts of fancy accessories. Just silk breeches and a pair of pants. And a shiny new hat.


Transcript

00:00:00.000 You are awakening and you are wonderful in spite of everything, in spite of it all.
00:00:05.000 Look at you, continuing to provide, continuing to connect, continuing to awaken.
00:00:11.000 It's going to be a great show.
00:00:12.000 If you are watching this on YouTube or Elon Musk's Citadel of Home Trooves, 80% of you, you're...
00:00:22.000 Let's have a look at the Google dude. Let's have a look at this.
00:00:23.000 ...and indeed the vast majority of the show by joining us on Rumble, or even better, go deeper, take a
00:00:29.000 deeper dive to the very, very depths, right down in Locals. There's a red button on your screen if you're watching us
00:00:34.000 on Rumble, where you can join our community, nature's child. Let's have a look at the Google dude. Let's have a look at
00:00:39.000 this. Let's see how the mainstream media cover this thing.
00:00:42.000 This morning as companies race to integrate artificial intelligence into our everyday lives one man behind that
00:00:49.000 technology has resigned from Google after more than a decade.
00:00:54.000 Dr. Geoffrey Hinton, known as the godfather of artificial intelligence, says he stepped down.
00:00:58.000 I don't want to personify it.
00:01:00.000 Right.
00:01:01.000 Don't give it a godparent.
00:01:02.000 Like in the event that its actual parent dies.
00:01:02.000 No.
00:01:05.000 Do you think that he wanted that label, or do you think it's... Like, how many people do it?
00:01:09.000 Like, do they approach him at Google and go, Godfather, hello?
00:01:12.000 I don't think he wants it.
00:01:14.000 I mean, he's quit his job.
00:01:15.000 You come to me on my calculator's wedding day and you do not offer respect and you don't think to call me Godfather.
00:01:23.000 Yeah, I don't know if it's... A lot of nicknames.
00:01:25.000 People work hard for a nickname, don't they?
00:01:27.000 A lot of people you can tell really want a nickname to stick.
00:01:29.000 They do, yeah.
00:01:30.000 I wonder if old Geoffrey Hinton likes being called Godfather.
00:01:34.000 It's difficult to know.
00:01:35.000 Look, really what interests me is... Look, let's not pretend that a king isn't going to get a shiny new hat this weekend.
00:01:44.000 Here we are, worrying about AI.
00:01:46.000 Well, it's a bit worrying.
00:01:47.000 I mean, Elon Musk is worrying about it as well.
00:01:49.000 Elon Musk is a bag of nerves.
00:01:51.000 He's always worried about something.
00:01:52.000 Is that what it is?
00:01:53.000 Something he's worried about?
00:01:54.000 He gets jittery, doesn't he?
00:01:55.000 Oh no, this Twitter, that's not very well run.
00:01:57.000 We're about to sack half the people that work here.
00:01:59.000 Eighty percent.
00:02:00.000 Eighty percent, right.
00:02:02.000 You're not going to use the job.
00:02:03.000 You are not going to use it.
00:02:04.000 You are going to use, you are, you are, you are... 80%, eh?
00:02:24.000 Maybe I could do me, a cuddly German shepherd, dear old loyal Dan with his bizarre ankles.
00:02:30.000 But no, if you have a look at that gallery there, that's a neat, lean, sparse team.
00:02:35.000 If you defrost that window, have a look.
00:02:37.000 I can't, you couldn't do without them. They're a lovely little bunch.
00:02:40.000 We're a bit lean and light today.
00:02:43.000 Often we are on the Friday show because we want to be able to focus on Bari Weiss.
00:02:48.000 We want to be able to focus on the coronation.
00:02:50.000 What's funny when they talk about the coronation, let's have a look at some of the headlines, is they talk as if like there's ways of making it more sensible and practical.
00:02:58.000 Look at this: King Charles to do away with outdated silk stockings and breeches for coronation.
00:03:05.000 One doesn't need all these outdated breeches.
00:03:07.000 A simple coronation, a simple modern coronation for a modern world.
00:03:12.000 The whole idea is you've been anointed by God to be the figurehead of a nation.
00:03:19.000 And all of the, however you shake this down, the wealth of the royal family is accumulated through plunder over centuries.
00:03:28.000 Yeah, he's not losing those stockings for budgetary reasons, is he?
00:03:30.000 Let's get rid of these expensive stockings!
00:03:33.000 And we all remember that phone call and some of the things he wished he was, didn't he?
00:03:36.000 Yeah.
00:03:37.000 He wanted to be a sanitary product, a pair of pants.
00:03:39.000 Oh, of course, yeah.
00:03:40.000 He said he wanted to be all sorts of things.
00:03:40.000 Didn't he?
00:03:41.000 I knew you'd remember those details.
00:03:43.000 I think about that sometimes.
00:03:45.000 I'll tell you what you want to have a look at, and this is something you literally have to be careful about talking about on YouTube, because we're going to be giving you some of the best secrets about the Royals and some of the best conspiracies.
00:03:53.000 If you watch that documentary, bizarrely made by Keith Allen, the actor, you might not be able to find it on YouTube, but you can definitely find it on Rumble.
00:04:01.000 If you have a look at Rumble, the one about Diana, what's it called?
00:04:05.000 Unlawful Killing.
00:04:06.000 Diana.
00:04:08.000 Ooh, have a little look at that.
00:04:10.000 Was it on the telly?
00:04:12.000 No, mate.
00:04:12.000 No.
00:04:13.000 Like Keith Allen, what's amazing about it is Keith Allen, the actor, Lily Allen's dad, just made it himself.
00:04:19.000 So what, did you have it on a legal VHS or something?
00:04:21.000 You know sometimes when you look at a Russian version of YouTube and it's all Russian mad letters, you don't know what they are.
00:04:26.000 They're like spaceships and noughts and crosses and things like that.
00:04:29.000 I watched it on... Nice reductive...
00:04:32.000 Do you know Russia?
00:04:33.000 Interpretation of the Russian language?
00:04:34.000 Spaceships, Pac-Man, TV aerial ones, one that looks like a fish.
00:04:42.000 That's their culture.
00:04:44.000 Going to war with Russia.
00:04:46.000 It's an easy business.
00:04:46.000 Apparently Russia aren't hard anymore.
00:04:48.000 They don't need to worry about provoking Russia.
00:04:49.000 What are they going to do?
00:04:51.000 Well, use their considerable military might to endlessly respond and grind down NATO forces.
00:04:58.000 Well, let me know in the chat, let me know in the comments.
00:05:00.000 You can join us, you can join us on locals and join the chat, participate in this stuff with a delightful community wherever you're from in the world.
00:05:09.000 Certainly don't mean to make a mockery of the set of semaphores that the Russian people use to communicate with.
00:05:14.000 No, I think people are... I think anyone would have understood what you meant by that.
00:05:18.000 Just mucking about.
00:05:19.000 But anyway, so you found it on, you think, some kind of Russian website.
00:05:22.000 Yeah.
00:05:22.000 Russian YouTube.
00:05:23.000 It's pretty interesting.
00:05:25.000 A lot of stuff went on.
00:05:25.000 Let me know what you think about that.
00:05:27.000 I don't think I would like to get into the potentially murky territory around the sad and tragic death of Diana, but Keith Allen doesn't mind, so have a look at his documentary.
00:05:38.000 Yeah, it's worth having a look at.
00:05:38.000 Wow.
00:05:40.000 I heartily recommend it, even though...
00:05:42.000 Do you remember, in the old days, you used to just, like, look at curiosities and things that are a bit peculiar? It's like, you know, before you had to have a sort of banal diet of pre-chewed slop, like some grey Ready Brek diet, like you're not allowed any spice or flavour. I used to be able to look at things and go, well, I think that's a bit mad.
00:06:00.000 I don't really agree with that.
00:06:01.000 Quite a peculiar and wonderful theory, but I'm not sure that's actually true.
00:06:04.000 Let's have a look at some of that evidence.
00:06:06.000 You used to be able to decide for yourself.
00:06:07.000 The whole of censorship is underwritten by the idea that we're too bloody stupid to understand anything, and perhaps to a degree we are, because we're willing to put up with expensive ceremonies to anoint further royalty.
00:06:20.000 Let's face it, the death of Queen Elizabeth II meant that this is a time for a radical appraisal and review of whether or not we... Hello!
00:06:37.000 Joining me now is Bari Weiss, founder and editor of The Free Press, former Twitter Files journalist and New York Times editor.
00:06:43.000 Thanks for joining us today, Bari.
00:06:45.000 It's great to see you on the screen I'm looking at.
00:06:47.000 Great to see you too, Russell.
00:06:49.000 What's going on over there?
00:06:50.000 What are you up to?
00:06:52.000 What am I up to?
00:06:53.000 Raising a baby, starting a company, you know, trying to do media the right way.
00:07:00.000 I presume you're also having a mental breakdown if you're trying to simultaneously start a company and raise a baby.
00:07:05.000 How old's your baby?
00:07:06.000 She's seven and a half months, yeah.
00:07:08.000 I mean, it's a lot.
00:07:09.000 A lot going on at one time.
00:07:11.000 What about the sleep and everything, mate?
00:07:14.000 I mean, I don't look my best, but we only live one life.
00:07:19.000 When else are we going to do it?
00:07:20.000 And I should add, by the way, that I'm building the company alongside some of my close friends and also my wife, a journalist that I met at the New York Times, who also left to do this with me.
00:07:31.000 So there's a lot going on.
00:07:33.000 There's not really like a work-life balance, I would say, in my life, Russell.
00:07:36.000 Doesn't sound like there's very much balance.
00:07:38.000 I suppose at least if both of you are doing nights, then I suppose that's something.
00:07:43.000 But I won't spend any more time poring over your private business.
00:07:47.000 Sleep train the baby.
00:07:50.000 She sleeps the entire night.
00:07:52.000 I don't understand people that don't choose to do that.
00:07:56.000 Nor do I!
00:07:56.000 I don't know what you're using, Benadryl?
00:07:59.000 Hey Bari, I wanted to ask you some questions about the news, but afterwards I'd like to talk to you about how you're convincing your daughter to sleep that long.
00:08:08.000 Mate, I wanted to ask firstly a little bit about that dude quitting Google and everything, Geoffrey Hinton, and your conversation with him, and whether you learned anything more detailed about his concerns around AI.
00:08:25.000 Is it kind of an existential threat?
00:08:27.000 A pragmatic threat?
00:08:28.000 Economics?
00:08:29.000 Is it to do with jobs?
00:08:30.000 Is it to do with some sci-fi type end of the world scenario?
00:08:34.000 What did you glean from that, Bari?
00:08:37.000 The guy that left Google on the podcast.
00:08:39.000 I had on Sam Altman, the CEO of OpenAI, which is the company that runs ChatGPT.
00:08:44.000 And the title of the podcast, I think, summarizes where a lot of people are thinking this goes.
00:08:49.000 You know, is AI the end of the world or is it the dawn of a new one?
00:08:53.000 There's a tremendous amount of hyperbole going on around this new technology.
00:08:58.000 Some are comparing it to fire.
00:08:59.000 That's Sundar Pichai, the CEO of Google.
00:09:02.000 Others are comparing it to agriculture, the wheel, electricity, Gutenberg Press, you name it.
00:09:07.000 Here's what I know.
00:09:09.000 In the past decade, Russell, as you've surely watched, crypto is the thing that has been absolutely hyped.
00:09:15.000 This was the thing that was going to get rid of state currency.
00:09:18.000 It was going to get rid of the dollar bill.
00:09:19.000 It was going to change the world.
00:09:20.000 It was going to democratize money.
00:09:22.000 But my wife was just in Austin at a crypto conference, and you still pay for the swag in dollar bills.
00:09:28.000 In other words, people are still sort of casting about for the use case of the thing that was meant to change the world as we know it.
00:09:34.000 Think about ChatGPT in comparison.
00:09:37.000 That came out basically a week ago.
00:09:39.000 Something like a few months ago, ChatGPT 4 was unveiled.
00:09:43.000 100 million people are using that app every day, and it's already changing the way that people work, the way they do research, the way they cheat on tests in college.
00:09:52.000 News organizations have announced that they're getting rid of certain jobs because they're already outsourcing them to this technology.
00:09:58.000 So, It's already proven its use, which is extremely exciting and also extremely unnerving.
00:10:05.000 There's an economist that I love named Tyler Cowen who writes this incredible blog, Marginal Revolution, if your listeners aren't aware of it.
00:10:12.000 And he had this incredibly succinct, excellent post about this, where he basically says, As much as we have believed that the internet was a seismic technological revolution, the truth is, is that most of us that are alive, save very, very old people that lived through, you know, World War II and the advent of nuclear weapons, we really haven't lived through a fundamental technological revolution.
00:10:36.000 We haven't lived through what he calls moving history, where we're actually feeling like the tectonic plates shift.
00:10:43.000 This is that thing.
00:10:45.000 And as human beings who are only able to think so far into the future, it's really scary.
00:10:51.000 But probably the cavemen who watched their neighbor invent fire felt the same way.
00:10:55.000 They probably thought, holy shit, this thing allows us to cook food and stay warm, but also holy shit, someone can come and burn our whole village to the ground.
00:11:04.000 In other words, every single time this new technology comes into being, there's a kind of moral panic around it.
00:11:11.000 There's this really amazing newsletter called Pessimist Archive and they keep track of the panics that are the reaction to new technology.
00:11:11.000 Right?
00:11:19.000 I read one the other day where it was like, it was a poem that they unearthed from 250 BC, freaking out about the sundial, right?
00:11:27.000 There's articles about, you know, the extinction of the slide rule and how the calculator is going to ruin education forever for kids.
00:11:35.000 For people that were living in the 1600s in Central Europe, the printing press probably meant to them war and bloodshed.
00:11:43.000 To us it meant the advent of the industrial revolution and the scientific revolution.
00:11:48.000 So my feeling about this new technology, sorry to go on about this, I'm really excited about it because I feel like it's huge, is it's not a question of yes or no.
00:12:00.000 It's going to happen.
00:12:02.000 The question is, who is going to do it and what are the guardrails going to be around it?
00:12:06.000 And those, I think, are the real pressing questions that some of the smartest people in the world, way smarter than me, are grappling with right now.
00:12:13.000 One of them being the CEO of OpenAI, Sam Altman.
00:12:17.000 It's interesting because when you talk about regulation with something like this it can sometimes seem to be at odds with where we might stand elsewhere on the subject of censorship.
00:12:30.000 I've heard people say that if this isn't like that Elon Musk, for example, said this ought
00:12:37.000 to be regulated and it's not regulated.
00:12:39.000 And now I know that when people talk about regulation elsewhere within social media,
00:12:45.000 the problem ends up being that it's not about regulation of monopolies, it's regulation
00:12:49.000 of... it ends up being censorship of free speech, essentially.
00:12:53.000 I'm fascinated, Bari, to hear you say that this is a seismic shift and is epochal and
00:12:59.000 that you don't think that everyone having a phone in their pocket represents that or
00:13:04.000 the ability to be contacted.
00:13:07.000 You think this is beyond that because this is beyond utility because it can actually
00:13:10.000 transform. It's not like, well, it's just a tool we use.
00:13:13.000 It can become a tool that uses us.
00:13:15.000 Is this what you're saying?
00:13:17.000 I mean, that is what I'm increasingly convinced by.
00:13:19.000 Don't get me wrong.
00:13:21.000 I am not a futurist.
00:13:22.000 I am not a technologist.
00:13:25.000 When everyone was freaking out about Bitcoin and crypto in the beginning of the pandemic, I went and bought $10,000 and then promptly lost my password forever, thus losing the $10,000.
00:13:36.000 I'm not a sophisticated technologist.
00:13:38.000 What I know is, having spent a little bit of time with ChatGPT, it is eerie the way that it can imitate human intelligence.
00:13:47.000 And do I...
00:13:49.000 Far be it from me to suggest that the phone in my pocket that contains more computing power than, you know, what sent rockets to the moon...
00:13:57.000 I mean, of course I'm blown away by it.
00:13:59.000 I'm just saying that this thing, in its very, very short few months, has already proven to be extraordinarily transformative.
00:14:09.000 And so I'm not saying that the internet and the fact that we're talking through a screen right now, and I'm in LA and you're in the UK, it's unbelievable.
00:14:17.000 I'm just suggesting that this has the ability to be perhaps even more unbelievable, and people that are more sophisticated than me are suggesting so.
00:14:26.000 And so I think it's incumbent upon all of us to learn about it.
00:14:29.000 Now, as for the question of regulation and censorship, that really, really scares me.
00:14:34.000 I mean, go in there and ask... other people have done this, but go in there and type something controversial into ChatGPT.
00:14:40.000 Type in, tell me about Mao.
00:14:42.000 Type in, tell me about Jordan Peterson.
00:14:44.000 You'll immediately see that because, you know, because all technology is ultimately created by human beings, that it has biases.
00:14:51.000 And unlike Twitter, right, where we could go into the archive, because Elon
00:14:56.000 Musk allowed journalists into the archive, of course, through the Twitter files, we could
00:15:00.000 see the choices they were making.
00:15:02.000 This thing is built on a text corpus of billions and billions of texts, articles, books, documents,
00:15:10.000 lyrics. It's much harder, I think it's going to be much harder, to sort of ascertain the biases,
00:15:16.000 because you're not like, you know, it's just different. The scale of it is completely different.
00:15:20.000 I think that's really worrisome. The other thing that's worrisome, as we saw in the Twitter files,
00:15:24.000 the amazing Twitter files hearings, where Matt Taibbi and Michael Schellenberger went before
00:15:29.000 Congress, and we saw, you know, it was a lot of different things being done, and it was a lot of
00:15:32.000 an incredible display, let's say, by some American politicians who didn't know what Substack was,
00:15:37.000 who asked if me and Matt and Schellenberger were in a threesome.
00:15:40.000 I mean, it was incredible.
00:15:41.000 Like, do we really trust the people who don't know what Substack is to regulate, you know,
00:15:46.000 chat GBT and open AI? Like, I don't even know if they know what a modem is or know how the
00:15:52.000 internet works.
00:15:53.000 And so that I think is really worrisome to me.
00:15:56.000 And so there are people who are suggesting other kinds of, you know... Sam Altman, CEO of OpenAI, suggests that maybe whoever is ultimately the head of OpenAI...
00:16:05.000 Maybe that's a position that should be democratically elected because that's how significant and important it will be.
00:16:12.000 Um, so, you know, the jury's out, but when I look at the people who are in Washington and their average age, frankly, the idea of them regulating this technology is worrisome to me.
00:16:22.000 Yeah, that is cause for concern.
00:16:24.000 When we have in the media landscape cozy relationships, as evidenced by the recent White House Correspondents' Dinner, and then adversarial, aggressive, punitive relationships, as with the aforementioned Taibbi and Schellenberger, what do you think this tells us about the shifting landscape between the media and the powerful?
00:16:50.000 In particular, I'm noting Matt Taibbi's IRS visit, the threat of jail for perjury or whatever.
00:17:00.000 How do you feel, Bari, operating in a comparable space, and with Matt Taibbi being a peer and indeed colleague of yours?
00:17:08.000 I think it is the job of journalists to hold power to account and do that even when it's politically inconvenient for your side.
00:17:19.000 You know, I think that Matt Taibbi, Michael Schellenberger, me, we're never going to be invited to the White House Correspondents' Dinner.
00:17:26.000 And I'm okay with that.
00:17:27.000 Because when I became a journalist, I didn't do it for the money.
00:17:31.000 I didn't do it for the accolades, and I didn't do it so that I could, you know, drink champagne next to powerful people.
00:17:38.000 I did it because it's a vocation that allows you to pursue your curiosity and in which you get, you know, a salary to take your flashlight and look into the darkest corners, into the kind of corners that the powers that be don't want you to look.
00:17:55.000 So, you know, when I see the IRS seemingly being weaponized against someone like Matt Taibbi,
00:18:02.000 I think that that is something that every single journalist in this country, whether they work for an independent site, whether they write for a substack, whether they work at the Washington Post or the New York Times, should be absolutely up in arms about that.
00:18:14.000 And I think it tells you something really concerning about the state of the legacy press in this country that, you know, the Wall Street Journal thankfully had an editorial, but there should have been editorials about that visit in every single newspaper across the West, in my view.
00:18:29.000 Why isn't Biden likely to conduct primary debates?
00:18:35.000 I think many of us would be interested to hear debates between, for example, Robert F. Kennedy and Biden, and Marianne Williamson's doing pretty well also.
00:18:46.000 Why is the Democrat Party becoming so censorial, so afraid of conversation?
00:18:52.000 What's going on, Bari?
00:18:55.000 I mean, look, it tells you a lot about the popularity of Joe Biden among voters that Marianne Williamson is polling at something like nine or 10% and RFK Jr.
00:18:55.000 What do you think?
00:19:05.000 who announced like two weeks ago, I think, is polling at something like 20% already.
00:19:11.000 Who knows what will happen when Gavin Newsom, California governor, is reportedly maybe going to get into the race at some point.
00:19:16.000 People realize that Joe Biden, though he won the last election, is getting slipped the questions in press conferences to sort of be prepped.
00:19:29.000 He's someone that they're sort of I don't want to say hiding, but trying to protect from the probing questions of the press as much as possible.
00:19:38.000 Why?
00:19:39.000 Why isn't he doing a debate?
00:19:40.000 Well, for all of those reasons.
00:19:41.000 How do you think he would fare in a debate against Marianne Williamson, RFK, and to say nothing of other people that might join the race?
00:19:49.000 So essentially, you have someone in a position of power that's being protected.
00:19:54.000 You have a relationship between the mainstream press and the government that is consensual, as we saw with a recent report around the Pentagon Papers Part 2, that the content of the leaks was ignored.
00:20:09.000 You had the ludicrous spectacle of Biden saying that we must protect the free press and that
00:20:15.000 journalism is not a crime while Assange is still away in a maximum security prison.
00:20:21.000 And adding to this, this potentially unprecedented tool that we were previously discussing, which
00:20:28.000 will ultimately, I suppose, end up in the hands of the powerful.
00:20:32.000 And it seems based on what you're saying about the inflections that AI already bears
00:20:38.000 culturally that it's a system, and of course we know from the Twitter files what the relationship
00:20:43.000 is between big tech and the Democrat party in particular, of course, I'm sure they would
00:20:47.000 be flexible depending on which of those two parties were in power.
00:20:50.000 It seems that the potential to govern the population is about to become, I would say, what do I want to say, sort of overwhelming.
00:21:04.000 Overwhelming.
00:21:05.000 With these new tools, it's possible that freedom could be further eroded.
00:21:10.000 So, really, at a point where we ought to be insisting on new independent movements, a point where we should be insisting on transparency, there is more surveillance, militarization of the police, more protest laws, an inability to conduct public discourse by the most powerful person in the world.
00:21:30.000 What do you imagine is most immediately required, Bari?
00:21:35.000 What I think is most immediately required and what I see already happening, and I guess this is the silver lining, is, you know, look at both of us in this moment right now.
00:21:47.000 I don't even know how big your audience is at this point.
00:21:49.000 It's astronomical.
00:21:51.000 Here I am, thinking that I was going to be, you know, I spent my career in the legacy press, the Wall Street Journal, the New York Times, left the New York Times in 2020 for reasons maybe we can discuss, had no plan, had never, I barely had a credit card, you know, to say nothing of being an entrepreneur.
00:22:10.000 That was like the furthest thing from my mind.
00:22:12.000 And now I'm building a media company and I have 20 people working with me.
00:22:16.000 So the great news is that the technological revolution we're living through, yes, can be used in extraordinarily oppressive ways, and it can also be democratizing.
00:22:27.000 It's like all technology.
00:22:29.000 It's neutral.
00:22:30.000 It can be used for good or bad, like fire, like the printing press, like the iPhones in our pocket, right?
00:22:38.000 And so while I think we should be concerned, and while I think this technology, AI, as we were talking about before, has the potential to be the big one, so to speak. I think
00:22:49.000 that, you know, if the past is prelude, it can be used in both ways. And so am I worried that you
00:22:58.000 can go right now and create a conversation between the two of us, as someone did between Joe Rogan and
00:23:04.000 Sam Altman, and created an episode of the Joe Rogan experience that looked kind of like them and
00:23:09.000 sounded kind of like them? Yeah, that really worries me when I think about actual disinformation,
00:23:14.000 not what people want to believe is disinformation. Very concerning to me. But there's also
00:23:18.000 incredible things that are going to come from it. So this is something that I'm watching
00:23:23.000 more as a journalist, wanting to track it, wanting to understand it, wanting to understand who the
00:23:28.000 players are.
00:23:29.000 What their motivations are.
00:23:31.000 Did the people that signed that letter, including Elon Musk and Steve Wozniak and others, calling for a six-month pause in the advent of increased AI capabilities beyond ChatGPT...
00:23:44.000 Did they sign that letter because they're pure of heart?
00:23:47.000 Did they sign that letter because they want to catch up to the competition?
00:23:51.000 What are the motivations?
00:23:52.000 What are driving people?
00:23:53.000 And by the way, what's driving other countries?
00:23:56.000 What's driving China?
00:23:57.000 Where are they in terms of AI capabilities?
00:24:00.000 These are the kind of questions that I think are going to be driving the next years of our life, the next years of stories, and it's one that I'm following incredibly, incredibly closely as a journalist above anything else.
00:24:10.000 Bari, I'm so grateful to you for asking these questions.
00:24:13.000 I mean, I admire incredibly what you've done and the organization that you are evidently building, not to mention your ability, along with your wife, to expertly manage this child through the night in ways that seem to me to be unprecedented.
00:24:27.000 I think what's interesting also about what you're saying is that you are journaling what's happening, but increasingly I think it's likely that to become a legitimate journalist is to become a de facto activist, and perhaps this is something that began with Greenwald and Assange, and certainly it seems likely, due to the ongoing increase of censorship, to be a necessity: if you're going to tell the truth, you are an enemy of the powerful.
00:24:57.000 So I'm glad that we at least have an allegiance.
00:25:01.000 Of course, I'll give you the chance to respond.
00:25:02.000 Yeah, I think.
00:25:05.000 Look, I'm old school.
00:25:06.000 I think the job of a journalist, there are different roles in the world, right?
00:25:10.000 There's the job of the advocate.
00:25:11.000 There's the job of the columnist.
00:25:13.000 There's the job of the like all there's there's room for activists.
00:25:16.000 There's room for all of these things.
00:25:18.000 I think journalism, I think the way to do journalism that maintains integrity and maintains the trust of people, has to hew to sort of old school rules that, frankly, a lot of the legacy press has turned their back on, right?
00:25:36.000 The thing that used to happen at the New York Times was very clear.
00:25:39.000 You know, if a certain op-ed, and I was an op-ed editor there for years and then also wrote my own columns,
00:25:45.000 if an op-ed sort of hewed to the ideological narrative, if an op-ed argued that
00:25:49.000 Donald Trump was a moral monster that had to be taken down, if an op-ed
00:25:53.000 claimed that, you know, Joe Biden was the savior of the world, we could go on and on
00:25:56.000 and on, you know what the arguments are. It would sort of sail into the paper.
00:26:00.000 And arguments that contradicted that, arguments that complicated it.
00:26:05.000 Those were ones that sort of were subjected to a much, much, much more rigorous test.
00:26:10.000 In other words, I think that that was to the detriment of the audience, to the reader.
00:26:14.000 And I think that when you think about the old manifesto of the New York Times, the idea of all the news that's fit to print, and the way that it sort of has transformed, and many other papers as well, to all the news that fits the narrative.
00:26:30.000 I just think that there is a huge, wide-open space for people that are actually interested in treating readers like adults.
00:26:37.000 that are actually interested in treating listeners as sophisticated people that can make their own decisions, not just shoving propaganda down their throat.
00:26:45.000 And so that's what we're about at The Free Press.
00:26:48.000 We're about telling honest stories.
00:26:49.000 We're about, you know, telling the truth about the world as it actually is, not as we wish it to be.
00:26:54.000 And we put a special emphasis on stories that are either ignored or misconstrued by the mainstream press.
00:27:00.000 And God knows there are a lot of those these days.
00:27:02.000 Bari, thank you so much.
00:27:04.000 That sounds like a fantastic endeavour and I'm grateful to you for undertaking it.
00:27:08.000 You can learn more from Bari Weiss by reading The Free Press, listening to her podcast Honestly, and reading her book, How to Fight Anti-Semitism.
00:27:18.000 She's an incredibly creative person!
00:27:21.000 She doesn't stop!
00:27:22.000 This is the only liquid that Bari will consume throughout the livelong day.
00:27:26.000 Bari, thanks for joining us and thanks for your fantastic contribution.
00:27:30.000 Thanks for having me, Russell.
00:27:33.000 Next week on Rumble, our special guests will include presidential candidate RFK Jr.
00:27:40.000 We thought long and hard about the potential blowback and trouble that may ensue from this booking, but we gotta give RFK the 20% now.
00:27:51.000 Legit candidate.
00:27:52.000 Joe Biden may not want to debate him.
00:27:54.000 People may not want to admit that his book about Anthony Fauci was an incredible success.
00:27:59.000 But we want to hear from RFK.
00:28:02.000 You want to hear from RFK.
00:28:03.000 He's coming next week.
00:28:05.000 We've got international security expert Max Abrahms coming on.
00:28:08.000 We've got a whole variety of it.
00:28:11.000 Look at him just looking off wistfully.
00:28:12.000 Oh, national security.
00:28:13.000 It's a bloody nightmare, he seems to be saying, almost to himself as much as anything else.
00:28:18.000 Thank you so much for joining us.
00:28:21.000 We've got a fantastic show.
00:28:22.000 Oh no, this is next week now, isn't it?
00:28:26.000 Yeah, that's it for the week.
00:28:28.000 We can't work on Sundays, can we, Gal?
00:28:30.000 Well, no, we can't.
00:28:31.000 We must.
00:28:32.000 Maybe to prepare the forthcoming week, but only for that.
00:28:36.000 All right, guys.
00:28:37.000 Thank you so much for joining us for another fantastic week of freedom.
00:28:42.000 Wow.
00:28:42.000 Look at what we've created on Rumble.
00:28:44.000 Isn't it extraordinary?
00:28:45.000 It's wonderful.
00:28:46.000 What began from a simple dream by a narcissist.
00:28:51.000 We're not even saying who that is.
00:28:52.000 It could be me.
00:28:52.000 No, it's me.
00:28:53.000 It could be you.
00:28:54.000 You are that sweet narcissist.
00:28:56.000 I'm really enjoying that shirt.
00:28:59.000 Oh, thank you.
00:29:00.000 Absolutely fantastic.
00:29:01.000 Bring him up the credits a little bit.
00:29:03.000 Have you noticed how low down the credits Gareth is?
00:29:05.000 He's about 9th or 10th.
00:29:06.000 There are ChatGPT bots higher up the credits than him.
00:29:10.000 Mind you, they deserve it, don't they?
00:29:12.000 Join us next week on Rumble.
00:29:14.000 Not for more of the same, but for more of the different.