Timcast IRL - Tim Pool - February 17, 2021


Timcast IRL - YouTube Is Giving Us the Business, Won't Let Us Stream w/ James O'Keefe


Episode Stats

Length

2 hours and 28 minutes

Words per Minute

197.93

Word Count

29,356

Sentence Count

2,393

Misogynist Sentences

19

Hate Speech Sentences

23


Summary

James O'Keefe joins us on the show to talk about why Greta Thunberg was burned in effigy, how her tweet got an activist arrested for sedition, and much, much more. We also have a special guest, Luke Rudkowski of WeAreChange.org.


Transcript

00:00:00.000 We were late by a few minutes.
00:00:19.000 For some reason, YouTube would not let us stream.
00:00:22.000 And every time we tried, it kept saying, live streaming is not available right now.
00:00:28.000 And I tried refreshing.
00:00:29.000 We tried stopping the broadcasting software and turning it back on.
00:00:32.000 Nothing worked.
00:00:33.000 I think it may have something to do with the title, which is Greta Thunberg burned an effigy, gets activist arrested for sedition with our guest, James O'Keefe.
00:00:43.000 And no matter what I did, it didn't work.
00:00:44.000 And so I tried something.
00:00:46.000 I changed the title to YouTube is giving us the business and won't let us stream.
00:00:51.000 And all of a sudden, streaming worked.
00:00:53.000 So it's funny, a lot of people in the chat were saying, I wonder if this is a stream that gets Tim banned because we're having James O'Keefe on the show.
00:01:01.000 And talking about Greta Thunberg, for those that don't know the story, she accidentally tweeted out an activist toolkit.
00:01:07.000 And now activists in India are freaking out saying she's essentially interfering in their politics and burning images of her, burning her in effigy.
00:01:16.000 And we also have James O'Keefe because he's got a story he's breaking about Mark Zuckerberg and some leaked videos.
00:01:21.000 So I wonder if the reason the stream wasn't working was because his name was in the title, I guess?
00:01:26.000 I have no idea, but it was really strange because it kept... this overlay would appear that said live stream is not available, and you could click try again or finish.
00:01:34.000 If you clicked finish, it would take you out of the streaming studio and then just put you back in regular YouTube.
00:01:40.000 It was the weirdest thing.
00:01:41.000 So I went in, I changed the title, and apparently that worked.
00:01:44.000 So, um...
00:01:46.000 As you already know, we've got a bunch of different things to talk about.
00:01:49.000 We are joined today, obviously, by our guest, James O'Keefe.
00:01:53.000 James, would you like to introduce yourself?
00:01:55.000 Now, as you may notice, James is a small gorilla standing on the desk because there's not actually a James O'Keefe in that chair.
00:02:04.000 And it's because he's late, so y'all can blame him.
00:02:07.000 But he should be here in about 10 minutes or so.
00:02:10.000 Tsk, tsk, tsk.
00:02:11.000 Oh, James.
00:02:12.000 We got to make him do push-ups when he comes here.
00:02:17.000 He's got to take a shot for every five minutes he was late.
00:02:19.000 So he's at one shot.
00:02:21.000 Of what?
00:02:22.000 Whiskey!
00:02:24.000 We've got that Conor McGregor stuff, don't we?
00:02:25.000 Yeah.
00:02:26.000 That's pretty good, right?
00:02:27.000 I never even had a taste.
00:02:29.000 Oh, really?
00:02:29.000 Oh, you're weird.
00:02:31.000 You don't want the Polish to drink.
00:02:32.000 It's trouble.
00:02:32.000 You do not.
00:02:33.000 They were offering me beers yesterday.
00:02:35.000 I'm like, trust me, you don't want this.
00:02:38.000 If you think YouTube is giving you the business.
00:02:40.000 Wait until a Polish person starts drinking.
00:02:43.000 I just don't like booze anyway.
00:02:44.000 Yeah, I'm not a big fan.
00:02:44.000 I don't know.
00:02:45.000 This'll be like the first time the guest is like running in covered in sweat.
00:02:48.000 Like, I'm here!
00:02:50.000 I'm here!
00:02:51.000 Let's start asking him really complex, hard questions just to make it harder on him.
00:02:51.000 I'm James O'Keefe!
00:02:56.000 Like as soon as he gets here?
00:02:57.000 Yes.
00:02:57.000 Yes.
00:02:58.000 Did you guys... Why E equals MC squared?
00:03:00.000 Tell us now!
00:03:01.000 Did you guys ever go through... Actually, that's actually... Whatever.
00:03:04.000 You know.
00:03:06.000 Did you guys ever go through hard drinking phases?
00:03:08.000 No.
00:03:09.000 No.
00:03:09.000 I mean, when I was like late teens.
00:03:11.000 High schooler.
00:03:12.000 Yeah, I went through a hard one in my 20s.
00:03:13.000 Yeah.
00:03:14.000 A lot of beer.
00:03:15.000 You never did, Luke?
00:03:15.000 A lot of beer has fluoride.
00:03:17.000 I'm watching my third eye.
00:03:18.000 Oh, that's great.
00:03:19.000 Just pure, filtered, organic hipster stuff.
00:03:21.000 Most beer has a lot of fluoride in there.
00:03:24.000 Interesting.
00:03:24.000 Why?
00:03:24.000 Why?
00:03:24.000 Because they use regular tap or something?
00:03:26.000 I think it's the way that they use the hops.
00:03:28.000 I forgot exactly why.
00:03:29.000 Because they dump fluoride into it?
00:03:31.000 Yeah.
00:03:32.000 Luke, you're a conspiracy theorist.
00:03:34.000 Some would say that.
00:03:36.000 So Luke's here.
00:03:38.000 Yes, they are burning effigies of Greta Thunberg in India.
00:03:42.000 I never thought I would get to say that.
00:03:43.000 Welcome back, beautiful and amazing human beings.
00:03:45.000 My name is Luke Rudkowski of WeAreChange.org, and if you like me and want to support my voluntary efforts, you can on WeAreChange.org forward slash donate.
00:03:53.000 There's many ways you could get involved with what I'm doing, ways without even spending a dime.
00:03:58.000 And because of that, I'm still here.
00:04:00.000 So thank you very much.
00:04:01.000 I'm also tweeting up and memeing up a live storm right now on Twitter.com forward slash Luke WeAreChange.
00:04:06.000 Thanks for following me on there as well.
00:04:09.000 Sometimes the titles really make me laugh.
00:04:11.000 Will it be like, Greta Thunberg burned an effigy by Indians or something, whatever it is, with James O'Keefe.
00:04:17.000 So you're like, James was there too?
00:04:20.000 Yeah, like, James was the one who was leading the protest.
00:04:23.000 What's up, everybody?
00:04:25.000 Good to be back.
00:04:25.000 I'll probably, maybe, turn my mic down a touch, because I think I'm coming in a little hot.
00:04:29.000 Oh, yeah.
00:04:30.000 But I did start iancrossland.net.
00:04:32.000 I got it revamped.
00:04:33.000 I'm selling merch on the website, including things like this awesome free to code mug that you can get over there.
00:04:39.000 And you can follow me on all my socials over there.
00:04:40.000 So it's great to be back.
00:04:42.000 Capitalism.
00:04:42.000 Very entrepreneurial of you.
00:04:44.000 And when James gets here, we'll introduce him.
00:04:49.000 But for now, the gorilla is standing in for him.
00:04:51.000 This is our guest.
00:04:53.000 He's very angry.
00:04:53.000 I don't know why, but he's got scars on his chest.
00:04:55.000 Maybe somebody hit him.
00:04:59.000 We also have Sour Patch.
00:04:59.000 Let's press on.
00:05:00.000 I am here in the corner, laughing at the gorilla.
00:05:02.000 And these guys are pretty great over here.
00:05:03.000 Sorry we're running late.
00:05:04.000 Ladies and gentlemen, we have an awesome sponsor today.
00:05:06.000 Thank you so much.
00:05:07.000 Give a shout out to Virtual Shield, a virtual private network service.
00:05:11.000 You can go to surfinginternetsafe.com, click the link in the description below, and you can get access to a 50% off VPN service, Virtual Shield.
00:05:20.000 They have been with me the whole time.
00:05:21.000 They're my first sponsor and they're great.
00:05:23.000 A virtual private network basically provides a simple layer of security for you while you browse the internet.
00:05:28.000 They make it much harder for bad actors, hackers, and even the government.
00:05:32.000 They make it harder for them to spy on you.
00:05:34.000 So the way I describe it, just to make it simpler for everybody:
00:05:38.000 You know, we all lock our doors and windows.
00:05:39.000 We don't really expect anyone to break in.
00:05:41.000 But you still have that lock.
00:05:43.000 That's what a virtual private network basically does for you.
00:05:45.000 It gives you that very simple layer of security.
00:05:47.000 So if you go to surfinginternetsafe.com, you can get 50% off 24 months of online security from the world's easiest and fastest VPN for only $59.88.
00:05:56.000 That's actually 77% off, they say.
00:05:58.000 Plus you get 50% off all add-ons and other great discounts on their other plans.
00:06:03.000 They're also proud to announce that this month all discounts are guaranteed for life.
00:06:06.000 That means 50% off as long as you are a customer.
00:06:09.000 Check it out.
00:06:10.000 Go to surfinginternetsafe.com.
00:06:12.000 Link is in the description below.
00:06:13.000 And seriously, guys...
00:06:15.000 But when I first started doing all this virtual shit, they were the first company.
00:06:17.000 They were like, yo, dude, you're cool.
00:06:19.000 Can we sponsor you?
00:06:20.000 And it was just like this tiny little channel.
00:06:22.000 And I was like, yes.
00:06:23.000 And they really helped get us where we are today.
00:06:26.000 So serious shout out, guys.
00:06:27.000 Thank you so much.
00:06:28.000 Surfinginternetsafe.com.
00:06:29.000 And don't forget!
00:06:30.000 Go to TimCast.com and become a member because we have exclusive podcast episodes you can't get anywhere else.
00:06:36.000 We did a little jokey introduction of the OurPillow.
00:06:39.000 So if you actually look at the screen, you can see I'm holding up OurPillow.
00:06:41.000 It's a burlap sack full of packing peanuts.
00:06:43.000 But we had a very serious half-hour discussion with Will Chamberlain talking about why Parler isn't necessarily the answer in terms of social media censorship.
00:06:50.000 And we have a bunch of other episodes talking about alien technology, and we're planning much, much more.
00:06:55.000 So we keep saying we're gonna go to the range.
00:06:56.000 We may actually get to do it, because of a really awesome guest coming up later this week who's gonna give us the full rundown on crazy guns, and it's gonna be a lot of fun.
00:07:05.000 And we're hoping to actually get a chance this time.
00:07:07.000 It was like snowing the past several weekends.
00:07:09.000 So TimCast.com, go and check it out.
00:07:11.000 And as for now, we are still waiting for James O'Keefe, who's probably going 120 miles an hour down the highway, or something.
00:07:19.000 Yeah, drive safe.
00:07:19.000 It's snowy out there, I guess.
00:07:21.000 But in the meantime, let's jump over to the first story.
00:07:23.000 Hey, I found a little trick on the website I wanted to let people know about.
00:07:26.000 If you click reply on the video to a comment, and then it says, it'll pop up and it'll say, logged in as, and it's your name.
00:07:33.000 If you click that, you can change your avatar.
00:07:36.000 So when you make a comment, you have a little picture next to it.
00:07:38.000 Cool.
00:07:39.000 I love it.
00:07:39.000 Right on.
00:07:40.000 A little Easter egg.
00:07:41.000 Let's jump into this first story.
00:07:43.000 Greta Thunberg effigies burned in Delhi after tweets on farmers' protests.
00:07:49.000 Celebrity interventions inflame sentiments in India as police investigate pro-farmers' toolkit.
00:07:56.000 So, basically what happened is, a couple weeks ago, Greta Thunberg accidentally tweeted out what's called a toolkit.
00:08:02.000 It's basically just a manual explaining what to say, where to say it, how to do all this stuff.
00:08:08.000 A lot of people find this to be particularly nefarious.
00:08:10.000 They call it... Well, they say it's similar to astroturfing.
00:08:14.000 Do you guys know what astroturfing is?
00:08:15.000 I've heard of it.
00:08:16.000 Luke, you're familiar?
00:08:17.000 Yeah, I've heard of it.
00:08:17.000 What is it?
00:08:18.000 So basically you'll have these big organizations, special interests, that want to get activists to do a thing.
00:08:24.000 And so they'll hire a bunch of people to show up and wave signs.
00:08:29.000 Sometimes it won't be as direct as saying, like, we want to pay an activist to do this.
00:08:34.000 Like, hardcore astroturfing, the most nefarious would be getting random people who don't care about the politics to wave a sign.
00:08:41.000 But typically what it is, they'll put up an ad saying like, hey, do you care about climate change?
00:08:46.000 Well then come on down and protest and you'll get your food and travel paid for.
00:08:51.000 And essentially those funds being provided to activists allows them the ability to be there.
00:08:56.000 It's considered inorganic, right?
00:08:58.000 Yeah.
00:08:58.000 So when you have Greta Thunberg putting out this toolkit where it's basically telling people what to say or how to say it, how to protect yourself and things like that, a lot of people view that as nefarious.
00:09:07.000 So now you have, well, Greta Thunberg leaked this.
00:09:10.000 They say this from The Guardian.
00:09:12.000 Thunberg became embroiled in allegations of an international criminal conspiracy against India after she tweeted a toolkit for people who wanted to show support for the farmers.
00:09:22.000 So there's a big farmer protest going on right now.
00:09:24.000 The document included campaigning tips, such as suggested hashtags and advice on how to sign petitions.
00:09:30.000 Though not named in the police case that was filed, Thunberg's tweet was said to have brought the Delhi police's attention to the existence of the toolkit.
00:09:38.000 Leaders in the ruling Bharatiya Janata Party said the toolkit was evidence of international plans for attacks against India.
00:09:46.000 Now, I'm just going to say this because I know a lot of people are like, whoa, Greta Thunberg is caught.
00:09:49.000 She's astroturfing.
00:09:50.000 I don't think it's that big of a deal.
00:09:52.000 Like, if you say like, hey, here's the message we're using, here's the hashtags we're using, and you're telling someone how to get active, I don't think that's that big of a deal.
00:10:00.000 But I do think it's interesting the way Delhi is handling it.
00:10:02.000 They're basically saying: international, you know, foreign celebrities interfering in our politics? We ain't gonna have none of that.
00:10:08.000 So Greta Thunberg, they're just burning images of her in effigy, but here's actually the big news.
00:10:14.000 Not this one, it's the next one.
00:10:16.000 Indian activists arrested for creating protest toolkit shared by Greta Thunberg.
00:10:23.000 This is a huge, huge screw up on Greta's part.
00:10:25.000 Look, how old is she now?
00:10:27.000 Greta Thunberg?
00:10:27.000 She's 18?
00:10:29.000 I don't want to drag her too heavily because she's inexperienced and naive, but this was such a serious failure of operational security.
00:10:38.000 This woman is facing, I think she's facing life in prison.
00:10:42.000 Let's check out this story.
00:10:43.000 They say Disha Ravi, 22, could face up to life in prison following her arrest Sunday in connection to the online document, which calls for organized support for India's deadly farmers protests.
00:10:54.000 Ravi, a leader of the Indian arm of Thunberg's Fridays for Future Protest movement, is being investigated for possible sedition, a charge that carries a penalty of life imprisonment, a police source told Reuters.
00:11:04.000 Well, you know, look, I just heard a lot from Democrats about the dangers of insurrection and sedition.
00:11:10.000 And, you know, they say these people should be locked up.
00:11:12.000 So what's the difference?
00:11:14.000 What's the difference in this story?
00:11:16.000 It sounds like a precedent was set over here in the U.S.
00:11:18.000 and now other countries are going to take that and make it totalitarian.
00:11:21.000 That's dangerous.
00:11:22.000 That's what they do.
00:14:22.000 Often when we see something happen in the US, other countries go,
00:11:26.000 okay, then we can do it too, right? And so that's actually criticism that was levied at Donald Trump.
00:11:30.000 They were saying that when he was going after the press, all of a sudden a bunch of other world leaders started
00:11:33.000 going after the press too. I think that's kind of silly, you know,
00:11:36.000 criticize people if they do something wrong.
00:11:38.000 Trump expressing his opinions on the media, I don't, whatever, people have opinions on the media.
00:11:43.000 In other countries like Turkey, where they're arresting... Yeah, Erdogan was the guy I was thinking of.
00:11:48.000 Erdogan.
00:11:49.000 Erdogan, Erdogan.
00:11:50.000 Turkey arrests more journalists than anybody else.
00:11:52.000 So here, so for those that aren't familiar with what's going on in India, I just did
00:11:56.000 a quick run through earlier.
00:11:58.000 It's fairly complicated, I don't know too much, but it's basically a bunch of farmers
00:12:03.000 have this system where it's, they have like guaranteed pricing from the government on
00:12:07.000 the crops they grow.
00:12:09.000 So it sustains them.
00:12:10.000 And there's new laws that are coming in that will basically take that away.
00:12:13.000 And so they're protesting it.
00:12:14.000 And this makes a lot of sense as to why the left is getting involved and saying, support the farmers.
00:12:19.000 Because the farmers want a guaranteed minimum government intervention over the work they're doing, and the new laws would be more, I guess, capitalist.
00:12:27.000 According to CNN, they'd be allowed to sell whatever they grow to whoever wants it.
00:12:32.000 Whereas right now, the government guarantees a set minimum price, and only certain people are allowed to bid on these crops.
00:12:40.000 I guess you can call it deregulation.
00:12:41.000 It's resulting in these massive, deadly protests.
00:12:45.000 Greta Thunberg gets involved.
00:12:46.000 For what reason?
00:12:47.000 I don't really know.
00:12:50.000 Well, I don't think anyone should ever be arrested for posting anything online, especially when it comes to activist toolkits.
00:12:58.000 But when you look at the larger kind of international efforts, especially by massively funded PR campaigns, there's a lot of ties between intelligence agencies and the supposed popular uprisings that usually work to the benefit of American geopolitics, like we saw in Ukraine, Syria, and Libya. Those were very organized, did have talking points and hashtags, and also had the coordination of the media, which pushed for a certain agenda and was able to get it because of these larger disinformation efforts.
00:13:28.000 And if you study things like the economic hit man, you see these efforts... Confessions of an Economic Hit Man.
00:13:33.000 Confessions of an Economic Hit Man.
00:13:33.000 Yeah, yeah.
00:13:35.000 Great book.
00:13:36.000 Definitely recommend you read it.
00:13:37.000 I read it in high school.
00:13:38.000 It changed my vision of the world.
00:13:39.000 When you look at the kind of ways that they push for unrest in order to get their way, it really makes you think, what's really going on here?
00:13:48.000 And we have seen a lot of pressure on India by individuals like Bill Gates that are criticizing India.
00:13:54.000 Also, just recently, their manufacturing plants.
00:13:56.000 So, you know, there is something to wonder here about what's happening on a global scale, whether it's a conspiracy or not.
00:14:02.000 We do not know, but I still think it's an overreaction for India to arrest a person for sharing this, and it's only going to hurt them in the long run.
00:14:10.000 I was in India before, I got some wild stories there.
00:14:13.000 I actually landed in India when they did a currency reset, and the whole country went in panic.
00:14:17.000 Oh, what a mess.
00:14:19.000 Yes, there was riots, there was people dying, there was massive civil unrest.
00:14:24.000 Isn't that crazy that some guy can be like, oh, that currency we have, it's now a different currency, and everyone just believes it.
00:14:30.000 Everything in India is pretty much cash, except for some places in Delhi.
00:14:35.000 And when I was in Delhi, it was fine.
00:14:36.000 I went to Goa, everything's in cash, which was absolutely absurd during the currency reset, where people were sleeping outside of ATMs.
00:14:43.000 That was tourists.
00:14:45.000 It was also weird to see ATMs segregated for local Indians and tourists.
00:14:50.000 And the tourists were literally sleeping outside.
00:14:53.000 The Indians' lines were even way longer, because people didn't have any money, didn't have any cash.
00:14:57.000 Oh, because they reset the currency and it was something different.
00:14:59.000 They got rid of a large swath of currency because the government wanted all the accounting.
00:15:04.000 Because a lot of people, because it's such a cash-driven country, a lot of people have their money figuratively underneath the mattress.
00:15:12.000 The government was like, you're not paying taxes on that.
00:15:14.000 So we're just going to make all of these bills illegal.
00:15:16.000 You're going to have to give them to us so we can see exactly how much you have, so we can get you the accurate amount of money that you owe the government.
00:15:25.000 Crazy time during the currency reset.
00:15:28.000 Crazy story, but we got to keep it on.
00:15:31.000 But one thing that kind of reverted to in India that was very interesting was people literally doing an honor system.
00:15:37.000 You go to the store, you just tell them your name, you just take down your name, you would have to come back later pay cash.
00:15:42.000 But a lot of civil unrest happens in India.
00:15:45.000 The country geopolitically is at a very important place, especially countering China.
00:15:50.000 So there's a lot of interest in India.
00:15:52.000 They're stabilizing India right now with China.
00:15:55.000 China is basically on the border of India and they're fighting, right?
00:15:58.000 Well, they're having these weird clashes without any weapons.
00:16:02.000 Like clubs.
00:16:03.000 But they're using archaic weapons like clubs and sticks and nails and barbed wire on baseball bats.
00:16:10.000 That's literally what they're using to fight each other, because they have this weird peace treaty where they can't use firearms against each other and they can't use live ammunition against each other, but they can beat the crap out of each other, which they do. But I've heard recently, within the last few days,
00:16:24.000 they actually settled previous conflicts that were arising just two weeks ago, and they're at a stalemate now. So they're in a position where there's more peace now than there was just two weeks ago. Here's the crazy thing about the toolkit that gets released: you have to ask yourself, who makes these things? Who spends the time to develop an activist guide for protesting?
00:16:45.000 And you mentioned this, Luke, you've got intelligence agencies.
00:16:48.000 You think about what's going on with China and India and the conflict on the border, plus, I mean, you have, like, what, Pakistan, like, Kashmir, all that stuff.
00:16:56.000 So there's good reason for any adversary of India or us to create that conflict.
00:17:02.000 So when you have international interests spreading information that can destabilize a country, what did the United States say about it?
00:17:10.000 When we had, what was it like, a handful of Facebook pages with almost no followers created by Russia.
00:17:16.000 Yeah.
00:17:16.000 They went for years claiming Russia was trying to take over the country and destabilize everything and they were freaking out.
00:17:22.000 In fact, they're still freaking out.
00:17:23.000 They're foaming at the mouth like...
00:17:25.000 Over some Facebook pages.
00:17:27.000 Now, what would happen if Russia was disseminating pamphlets, booklets, toolkits, telling people how to bypass arrest, how to win in the legal system, how to protest, how to hashtag, how to spread the ideology and win?
00:17:39.000 They would have gone even crazier.
00:17:41.000 If you think it was nuts now with Russiagate, imagine if Russia actually put together a toolkit telling people how to do these things.
00:17:50.000 It didn't get that bad.
00:17:52.000 And they still went insane.
00:17:54.000 That's what's happening in India and Greta Thunberg is facilitating it and she's not Indian.
00:17:58.000 Look, if you are an American, if you are an Indian, and you put together an activist guide
00:18:02.000 because you want to help people and you believe in your cause, well, that's cool.
00:18:05.000 Yeah, AOC did that.
00:18:06.000 If you're a foreign actor of high profile with tons of money and massive backing from
00:18:11.000 special interests, and then you are pushing out ideological and tactical information
00:18:18.000 that can exacerbate riots and protests.
00:18:21.000 Well, now you got a problem.
00:18:23.000 You got a serious problem.
00:18:24.000 Because what happens if India actually becomes destabilized?
00:18:27.000 I wonder if, you know, look.
00:18:29.000 I kind of feel bad for Greta Thunberg because she's just a kid.
00:18:33.000 The things she's called for in terms of abandoning fossil fuels, outright banning fossil fuels in the next year or whatever, not even phasing it in, just "we want it now," would kill millions of people.
00:18:41.000 Look at what's happening in Texas with these windmills freezing and the gas lines freezing.
00:18:45.000 We need to make sure we can get the electricity running.
00:18:47.000 Already, we can see what happens when you do not have sufficient electrical equipment, weatherization, etc.
00:18:56.000 I know a lot of people are blaming the wind turbines for freezing, but everything basically froze.
00:19:00.000 Nobody thought this was going to happen.
00:19:01.000 Yeah, 15 states are having power outages.
00:19:03.000 There's 4.4 million people that went without power in Texas.
00:19:06.000 The National Guard had to be deployed.
00:19:08.000 Right, right, right.
00:19:08.000 And there's a huge argument now happening whether it's the kind of free energy or whether it's the fuels that are, you know, that- Right, right, right, right.
00:19:16.000 But just the basic point with Greta Thunberg I'm trying to make is she doesn't understand the things she's advocating for are going to kill millions.
00:19:25.000 From getting rid of the fossil fuels, fine.
00:19:27.000 But if, look, If riots, protests, and mass protests, mass civil unrest sweeps India because the people of India are challenging their legal system, okay then.
00:19:38.000 We want to make sure people stay safe.
00:19:39.000 We don't want a collapse that results in mass death.
00:19:42.000 We want reform.
00:19:42.000 We want people to do the right thing, and we don't want fighting in the streets.
00:19:47.000 Greta Thunberg doesn't understand what destabilization or collapse really means, and I don't think she... I don't think... I mean, I guess she would care if she knew, but I don't think she knows.
00:19:55.000 I don't think she realizes what it means for someone of her profile to be having these conversations.
00:20:00.000 Apparently, there were, like, some leaked messages as well, where she's saying, like, oh, these hate campaigns, they happen.
00:20:05.000 It's like, dude, the hate campaigns are happening because you're interfering in a foreign country's politics, and you have massive clout that's getting tons of attention.
00:20:16.000 and spreading this artificial, you know, external ideology inside the country.
00:20:20.000 It's no surprise people are protesting her.
00:20:24.000 They're calling it an international criminal conspiracy, man.
00:20:27.000 Yeah.
00:20:27.000 A lot of eyes are on India, and I think we're going to be hearing about India a lot more, especially in the near future, especially with their very strategic, very important position in the world, because as China is going to be expanding their influence, they're going to also be propping up Pakistan.
00:20:43.000 These are all nuclear powers.
00:20:46.000 And when you look at India, I mean, they have a vast emerging middle class that speaks English, that is ready to help, even by some estimates, even by some experts, be the next hegemonic world power, even over China, because they are in a better position than China on the world map, comparatively as well.
00:21:06.000 So India is going to be a very key, crucial place to sway your influence in, especially in the not so distant future.
00:21:13.000 And it's not a surprise that this is happening to me, in my opinion.
00:21:16.000 I got mixed feelings about what this is all about, this farmer protest.
00:21:20.000 Now, I'm trying to figure it out, too, so maybe we can clarify.
00:21:23.000 It sounds like what they're doing is, right now, if a big multi-corporate farming industry in India wanted to sell at bare-bottom prices and outmatch all the local farmers, they could sell their corn for $3, and the government would have to step in. Tim, can you clarify this?
00:21:43.000 I don't know a whole lot about it.
00:21:45.000 The government would subsidize it, basically.
00:21:47.000 It's basically, right now, government-regulated sales versus absolute free markets.
00:21:51.000 That's the general idea.
00:21:52.000 If Monsanto wanted to sell $3 corn and the farmer was like, I can't afford to sell my corn for $3, they're putting me out of business, okay, the government would say, okay, you can sell your corn for $10, we'll subsidize $7 of it, and then it'll go for $3 to the market.
00:22:04.000 Maybe that sounds like what is going on right now, but they want to change it so that the government gets out of the way and corporations can sell at whatever price, but the danger is that Monsanto could undercut these local farmers.
00:22:15.000 We don't know that.
00:22:15.000 This is what it sounds like they're protesting.
00:22:18.000 We don't know that.
00:22:20.000 We don't know exactly the limits of this law.
00:22:23.000 It's a foreign country and I can't tell you.
00:22:25.000 I can only tell you that CNN said right now the current system is a government minimum guarantee for crops versus a sell-to-anyone-at-any-price model.
00:22:32.000 Right.
00:22:33.000 The sell-to-anyone model could be dangerous because if you have Monsanto over there and they want to sell $3 corn and the local farmers can't afford it, it'll bankrupt local farmers.
00:22:41.000 So that's why government is in there to help subsidize local farmers.
00:22:48.000 It's never this simple and we just don't know.
00:22:50.000 And ultimately, maybe it doesn't even matter what it is.
00:22:52.000 It's just that there's foreign nationals influence.
00:22:55.000 But I mean, that's the name of the game right now.
00:22:57.000 The globe is connected, you know?
00:22:59.000 It's not even a national game anymore.
00:23:01.000 It's the same thing with China, bro.
00:23:03.000 It's not that America wants cheaper products made in China.
00:23:07.000 They don't.
00:23:07.000 The American people want jobs.
00:23:09.000 They want to be able to work.
00:23:10.000 They want to feel fulfilled.
00:23:11.000 I find it hilarious that we have so many Gen Z memes, where it's like, there was one where it's from the Incredibles, where, you know, have you guys seen the Incredibles?
00:23:19.000 The family of superheroes.
00:23:21.000 So basically, like, superheroes are outlawed, and then, like, this Mr. Incredible is now an insurance, you know, adjuster or whatever.
00:23:27.000 And it said, like, me, it was like a meme, it's like me working in an office, you know, getting paid for, you know, for an unfulfilling existence or something.
00:23:34.000 And I'm like, yeah, people want meaningful jobs.
00:23:36.000 They want to make things to be proud of.
00:23:38.000 But all of those are being shipped away.
00:23:40.000 So these jobs are going to China.
00:23:42.000 Why?
00:23:42.000 Because billionaires... Because the billionaires and the millionaires in this country!
00:23:47.000 That's why.
00:23:48.000 Now Bernie Sanders, you know, I mock him, essentially, because he's no longer fighting against the millionaires and the billionaires.
00:23:53.000 But same thing with, you mentioned these companies, the international ties.
00:23:57.000 Regular people don't want this stuff.
00:23:59.000 They just want to be with their families, they want to make enough, they want to get by, they want to survive.
00:24:02.000 They don't want a boot on their neck from the government, and they don't want their jobs being sent overseas.
00:24:09.000 Oh my.
00:24:10.000 Look who's late.
00:24:11.000 You are 27 minutes late.
00:24:14.000 That's five shots of whiskey.
00:24:16.000 We want 27 push-ups.
00:24:18.000 Immediately.
00:24:19.000 So we're going to frame you up here.
00:24:22.000 And Ian, surprisingly, also very interestingly, India's Supreme Court just a few years ago ruled that Monsanto seeds couldn't be patented in the country.
00:24:32.000 So there's, you know, a lot of interesting news surrounding India, especially a lot of media figures now saying that they're very surprised that India dealt with the coronavirus a lot better than any other country.
00:24:43.000 But of course, that's not a surprise to us, especially after talking to Chris Martenson of Peak Prosperity about why they did so well.
00:24:49.000 You could see our conversations.
00:24:50.000 I did a video with him.
00:24:51.000 Tim did a video with him.
00:24:52.000 But ladies and gentlemen, James O'Keefe, the news of the hour.
00:24:56.000 Oh, what up, dog?
00:24:59.000 I had to charter a plane.
00:25:01.000 Oh my God.
00:25:02.000 Tell us about that.
00:25:03.000 I'm so glad you're here.
00:25:03.000 At a secret location doing some guerrilla.
00:25:07.000 Yes, guerrilla warfare.
00:25:08.000 Get the mic, like, direct.
00:25:10.000 Straight in, tube it, and maybe, I don't know, turn it a little bit.
00:25:15.000 How you doing, dude?
00:25:16.000 How was your journey?
00:25:17.000 It's been crazy, a crazy day.
00:25:19.000 We just broke a story on Mark Zuckerberg, CEO of Facebook, who was covertly recorded taking an anti-vaccine stance in violation of his own policy at Facebook.
00:25:31.000 So that's very interesting and a lot to talk about there, Tim.
00:25:35.000 Well, do you want to talk about it, or do you want to tell us why you had to charter a plane, or what's going on?
00:25:39.000 Yeah, it's quite the adventure.
00:25:39.000 I can't.
00:25:40.000 If I told you, I'd have to kill you.
00:25:41.000 Okay.
00:25:42.000 No, no, no.
00:25:43.000 Just been a crazy 24 hours, and I apologize I'm late, but I have a good excuse.
00:25:48.000 Oh, it is what it is.
00:25:49.000 I was getting text messages.
00:25:50.000 I'm like, we're doing the show.
00:25:51.000 We put the gorilla in your place.
00:25:53.000 Gorilla journalism.
00:25:54.000 That's right.
00:25:56.000 He was doing all right.
00:25:56.000 He wasn't saying a whole lot, but you know, he looks angry, so it riles people up.
00:26:01.000 What's up?
00:26:02.000 Headphones.
00:26:02.000 I think you got them.
00:26:03.000 Oh, yeah, yeah.
00:26:04.000 See, normally this is the stuff we do before the show starts.
00:26:09.000 What is up with Project Veritas?
00:26:10.000 Punctuality is not my strength.
00:26:13.000 Now we know.
00:26:13.000 Yeah.
00:26:14.000 So what's going on, man?
00:26:14.000 Welcome to the show.
00:26:15.000 Thanks for having me, Tim.
00:26:16.000 I appreciate it.
00:26:17.000 Yeah, yeah.
00:26:18.000 So just well, give us the latest.
00:26:19.000 Then we can talk about that.
00:26:20.000 We just broke a story as I'm getting here.
00:26:23.000 This is the CEO of Facebook, Mark Zuckerberg.
00:26:25.000 We just broke it.
00:26:26.000 I broke it on Twitter.
00:26:27.000 I'm still... I'm back on Twitter.
00:26:29.000 Project Veritas has been permanently suspended.
00:26:31.000 Before you get started, I have to tell you what happened to us when we tried to start this show.
00:26:36.000 So the story we led with is about Greta Thunberg.
00:26:39.000 I don't know if you heard the news.
00:26:40.000 She shared this activist toolkit and now protesters are burning her effigy because they view her as an international entity who's fueling these deadly protests that are happening in India.
00:26:53.000 So the title of the video was Greta Thunberg burned an effigy and activist is charged with sedition with James O'Keefe.
00:27:00.000 Activist.
00:27:01.000 The first thing that happened was when we clicked create stream, it said we weren't allowed to do it.
00:27:07.000 Like an error occurred or something.
00:27:09.000 And so we had to go back and recreate.
00:27:12.000 So, so for those that aren't familiar, you have to like, you, you click start a stream and then it gives you the title, all that, you know, timing and everything.
00:27:18.000 We fill out the forms, we click to go and then it worked and we're like, okay, thank you.
00:27:21.000 Finally.
00:27:22.000 That's awesome.
00:27:23.000 We go in.
00:27:24.000 I start, I click streaming on the broadcast software and then all of a sudden YouTube says live streaming is not available right now.
00:27:30.000 Hmm.
00:27:31.000 So I refresh.
00:27:32.000 Try again.
00:27:33.000 Refresh.
00:27:33.000 Try again.
00:27:33.000 Turn the software off.
00:27:35.000 Turn it back on.
00:27:36.000 None of it works.
00:27:37.000 But I took the title and changed it to YouTube is giving us the business, won't let us stream.
00:27:43.000 Worked.
00:27:44.000 Oh, that's ingenious.
00:27:45.000 So the funny thing is people were commenting, is this the episode that's going to get Tim banned having James O'Keefe on right now?
00:27:51.000 And I took a picture.
00:27:52.000 I tweeted the image.
00:27:53.000 I am not exaggerating.
00:27:54.000 I tweeted out the image as live streaming is not available.
00:27:57.000 When I changed the title, which removed your name and Greta Thunberg's, I don't know what caused it.
00:28:02.000 Maybe it was burned in effigy.
00:28:03.000 Maybe it was sedition.
00:28:04.000 I have no idea.
00:28:05.000 It's working now.
00:28:06.000 Now it's working.
00:28:06.000 Well, I'm on the show, so I don't know if... I mean, our YouTube is still... This is a YouTube, right?
00:28:11.000 Yeah.
00:28:11.000 This is YouTube streaming, so they haven't banned us on YouTube.
00:28:14.000 But a lot to unpack here, a lot to talk about.
00:28:16.000 I'll let you be the captain and, you know, tell us what direction you want me to go in.
00:28:20.000 I think that was... Look, you were late, but we were also several minutes late as well because we couldn't get the stream to work.
00:28:25.000 Couldn't get the stream going.
00:28:26.000 And I just want to stress, as soon as, you know, Greta Thunberg, James O'Keefe, the title was changed, it worked.
00:28:32.000 Don't ask me why.
00:28:33.000 It's not the first time something like this has happened.
00:28:35.000 But you know, I was joking with my colleague, my colleagues are standing over there, the entourage, the Project Veritas entourage.
00:28:41.000 All flew on a private plane.
00:28:42.000 Yeah.
00:28:43.000 It was a crazy day.
00:28:44.000 You wouldn't even believe it if I told you.
00:28:46.000 You'll know soon.
00:28:47.000 I was saying that whoever these people in San Francisco or wherever they are, like, you know, shadow boosting, banning, censoring, they're worried because they're thinking there might be a whistleblower who's watching this Who might record us banning and censoring, so maybe they're deterred from doing it too, like the psychological deterrent effect.
00:29:06.000 So we just broke a story, Mark Zuckerberg on the hidden camera, or he's covertly recorded on one of his staff calls, taking this anti-COVID vaccine stance in violation of Facebook's own policy.
00:29:18.000 Zuckerberg says, my understanding, this is a quote from Zuckerberg now, not me, my understanding is that these vaccines... Wow.
00:29:25.000 ...modify DNA and/or RNA. Now, Facebook, because of the policy implemented recently, you're not
00:29:31.000 allowed to say that. So I was thinking, would me quoting Zuckerberg,
00:29:35.000 playing videotape of Mark Zuckerberg,
00:29:38.000 get banned on Facebook? Yes, it's in violation of Facebook's own policy. It probably
00:29:42.000 would. Well, they'll try.
00:29:43.000 So we had, uh, you guys know Jack Murphy?
00:29:47.000 We had Jack Murphy on the show.
00:29:49.000 He said on this show, he was like, Donald Trump gave, I'm not even going to quote what Jack said because they've, they've come after us for it.
00:29:56.000 He basically said... Jack Murphy said that Donald Trump said something about voter fraud, and they used that and accused us as a show of making those claims.
00:30:07.000 Right.
00:30:07.000 when we've not.
00:30:09.000 We've actually argued against them.
00:30:10.000 We've even had some prominent Trump supporters give a more balanced approach to what's going on
00:30:15.000 with the whole election stuff, and they tried accusing us of being conspiracy theorists
00:30:18.000 for simply saying Trump said a thing.
00:30:20.000 So you quoting Zuckerberg, they're gonna claim James O'Keefe
00:30:25.000 pushed the conspiracy theory.
00:30:27.000 Simply.
00:30:28.000 And they won't tell people.
00:30:29.000 Push the anti-vaccine conspiracy theory.
00:30:33.000 So my headline on the story was Facebook CEO Mark Zuckerberg takes anti-vaccine stance in violation of his own platform's policy.
00:30:41.000 So it'd be funny to see if we get a comment from, is it Andy Stone, spokesperson of Facebook?
00:30:46.000 Has he responded yet?
00:30:48.000 So we have emailed him, we've carrier pigeoned him, we do everything, text him, what?
00:30:54.000 Text him, call him.
00:30:56.000 But Tim, you know, Twitter has permanently suspended Project Veritas' Twitter account.
00:31:03.000 The official comment that Twitter gave Brian Fung of CNN Business is they said that we have published people's private information.
00:31:14.000 Twitter did not say that I'm misinformation or fake news.
00:31:17.000 No, no.
00:31:18.000 They said the reason that we were suspended is because we're doxing people.
00:31:21.000 Project Veritas is publishing people's private information.
00:31:24.000 The evidence of that was us doing a routine TV doorstop.
00:31:28.000 That is to say, you know, television reporters go outside someone, you know, someone's driveway.
00:31:32.000 CNN does it all the time, like they do to memers.
00:31:35.000 Didn't CNN do that with that old woman?
00:31:37.000 Yes!
00:31:38.000 And you could actually see the numbers on her house.
00:31:43.000 So it's completely illogical.
00:31:46.000 But that's the reality.
00:31:52.000 But here's the interesting part.
00:31:53.000 And I have the documents in front of me because I want to get my facts correct.
00:31:57.000 Brian Fung of CNN.
00:31:58.000 says that we were in repeated violation of the company's, quote, anti-doxing policies.
00:32:04.000 But then, just Sunday, the host, Ana Cabrera, was on with Brian Stelter.
00:32:11.000 And she said that Project Veritas was removed for misinformation.
00:32:15.000 So CNN's own reporters are contradicting each other.
00:32:19.000 They're just making stuff up.
00:32:20.000 So what you're saying is you're going to get a retracto out of this.
00:32:23.000 Our lawyer has sent a letter to CNN corporate. David Vigilante is the general counsel of CNN, and we are going to get a retracto.
00:32:33.000 Retracto, the Correction Alpaca.
00:32:36.000 He's coming at you.
00:32:37.000 It sounds like it wasn't so much that you guys got busted for doxing, but that it was the who you were doxing, because wasn't it somebody that was high profile?
00:32:44.000 So this is a little nuanced, but I think it's important that we take your audience through the facts because I think it's important because I think Twitter has crossed a Rubicon here that they have not previously crossed.
00:32:53.000 I mean, look at the Hunter Biden stuff, man.
00:32:55.000 They suppressed a very relevant news story to help someone win a presidential election.
00:33:00.000 That is true. I mean, these people are evil, man. That is true. So what happened here was...
00:33:04.000 We... I don't know if you've ever experienced this, but when you are flagged, Twitter gave us the option.
00:33:11.000 It said, you can simply get your account reinstated at Project Veritas by deleting this tweet.
00:33:15.000 They want us to delete the tweet, the video of my reporter Christian
00:33:19.000 Hartsock with a camera and a flag mic. You could see the flag mic, there's no
00:33:23.000 hidden camera recording here, in a street on a sidewalk outside the house of a
00:33:28.000 Facebook vice president. Was this public property? What was his name? Guy Rosen.
00:33:33.000 Was this public property? This was a public street, a residential street, and
00:33:37.000 his residence. And we're following the lead of CNN.
00:33:41.000 CNN goes and door stops people, local TV news.
00:33:44.000 So Christian Hartsock asks Guy Rosen for comment.
00:33:47.000 This is maybe the biggest public policy story of our lives.
00:33:50.000 We're not confronting a private citizen.
00:33:52.000 Well, he is a private citizen.
00:33:54.000 He's vice president of Facebook.
00:33:56.000 And then Twitter says, you need to take down this video clip off Twitter.
00:34:00.000 But I decided I'm not going to do that, Tim.
00:34:03.000 I would be against my conscience.
00:34:04.000 I'm going to appeal it.
00:34:05.000 I knew it was a risk.
00:34:07.000 And usually when you appeal, that takes a week or two to do.
00:34:10.000 And then in the middle of the day, Twitter got bombarded with phone calls by the New York Times, CNN and everyone else, and Twitter changed their minds.
00:34:17.000 It said, nope, we've changed our mind.
00:34:19.000 We're now going to ban Project Veritas forever.
00:34:22.000 Pressure.
00:34:23.000 It's just media.
00:34:23.000 This is what the game is for these, I don't want to call them journalists, these are activists.
00:34:27.000 because both me and Project Veritas tweet the same stuff.
00:34:37.000 Because I broke a story, I think it was a year and a half ago, where someone, someone
00:34:41.000 who worked at a major bank leaked to me internal documents about, I think we actually, I think
00:34:49.000 we may have talked about this after the fact.
00:34:51.000 They leaked to me these documents, and it was basically showing that this bank had banned... Well, I gotta be very careful here.
00:35:00.000 It appeared to be that certain right-wing individuals had their accounts closed because
00:35:05.000 an email from a journalist said, why are you supporting white supremacy or something to
00:35:10.000 that effect.
00:35:11.000 So what they do is a real journalist is going to email someone and say, you know, a story
00:35:16.000 came out and, you know, actually I got to say this.
00:35:20.000 There's literally no news in finding out that someone has a bank account and then emailing
00:35:24.000 the bank and saying, why does this person have a bank account?
00:35:27.000 We know why people have bank accounts. What they do is they contact the bank, and the email then says something like, this individual has been accused of being a white supremacist, and why do you support white supremacy?
00:35:40.000 The bank gets the message.
00:35:42.000 It's mafioso.
00:35:43.000 They're basically saying, we are going to write a story claiming you support white supremacist organizations.
00:35:50.000 And then the bank says, oh, we don't.
00:35:51.000 We banned those people.
00:35:53.000 And then the activists pretending to be journalists go, ah, okay, great.
00:35:56.000 Getting financial institutions or other companies to destroy their enemies.
00:36:00.000 So it's not too dissimilar.
00:36:02.000 You, James... I think Veritas is one of the very few actual news organizations. Look, I think there's a lot of news outlets and there's a lot of good journalists and they work hard and they do well, but you look at how awful things have gotten with these major corporate news outlets.
00:36:17.000 There's no risk to the establishment, to the corporate powers, to the special interests from CNN or The New York Times.
00:36:24.000 I mean, very little.
00:36:25.000 But it's like you were saying, these people are looking over their shoulders, not knowing who is going to hold them accountable by recording their unethical behaviors and exposing them doing wrong.
00:36:38.000 They don't know who that's going to be.
00:36:39.000 And so they're worried about it.
00:36:39.000 And they're worried about you.
00:36:41.000 Getting rid of you will make their lives very difficult.
00:36:43.000 Well, I think there's a psychological, like I say, right now, as I speak, if there are coders, engineers, wherever they may be, Silicon Valley, when they do whatever they do that's unethical or at least dishonest, I know they're private companies, they can do whatever they want constitutionally, but there's that little thought inside their head, like the jury on your shoulder, sort of worried, deterred, maybe someone's observing me do this thing.
00:37:05.000 But what's most interesting about this situation that Project Veritas was permanently suspended was Twitter changed their mind, Tim, just as you said, once they got bombarded with all that pressure from New York Times and CNN being like, what's the deal?
00:37:17.000 Are they suspended permanently or not?
00:37:19.000 And someone high up at Twitter said, nope.
00:37:22.000 We're going to suspend him forever.
00:37:24.000 And that shows me that the Washington Post and the New York Times and CNN actually have tremendous power.
00:37:30.000 And Jeff Bezos, who is whatever he is, a trillionaire, billionaire, trillion dollar market cap, one of the wealthiest men in the history of the world, is most proud of his ownership of that woke clickbait rag Washington Post.
00:37:43.000 That's his crowning achievement in life.
00:37:46.000 I actually am not one of those people who say, these companies, CNN has tremendous power to get Twitter to change their mind by making a phone call.
00:37:58.000 Think about it.
00:37:58.000 Yep.
00:37:59.000 CNN spreads baseless theories.
00:38:01.000 They eat human brains.
00:38:03.000 They dox people.
00:38:04.000 They even had an article not so long ago that was titled, quote, How CNN Found the Reddit User Behind the Trump Wrestling GIF.
00:38:11.000 Yep.
00:38:12.000 What's the date on that one?
00:38:13.000 What's the date on that one?
00:38:14.000 July 5th, 2017, where they knowingly went after someone because of that famous GIF that Donald Trump
00:38:22.000 tweeted, and they said, publicly apologize right now or we're going to dox you. They
00:38:28.000 said, promise never to do it again. Exactly. Otherwise, we'll release your private information. And this is
00:38:32.000 the organization that's spearheading the censorship movement that never gets called out for their
00:38:37.000 baseless theories, for their fake news and their lies. Even the New York Times, yesterday we were talking about a specific story where the New York Times was caught fabricating the story about the fire extinguisher killing and beating to death this Capitol Police officer.
00:38:52.000 Or at the very least getting burned by their sources and refusing to correct until a month later.
00:38:56.000 Exactly.
00:38:57.000 And then now today, they also have another article that's titled, unfettered conversations are taking place on Clubhouse, talking about how it's dangerous, this new kind of platform that's rising in popularity because they don't know how to deal with harassment, misinformation, and quote, privacy.
00:39:15.000 Can I point out the beautiful irony of, was it Ana Cabrera, I think you said?
00:39:20.000 Ana Cabrera.
00:39:21.000 Saying that you were suspended permanently for misinformation, and that statement itself is misinformation.
00:39:27.000 I mean, I have the two statements.
00:39:28.000 It's like a fire truck on fire.
00:39:29.000 By the way, we were going to get our lawyer, we could always just sue CNN, we've sued the New York Times, we'll talk about that in a minute, but I've got two statements from CNN in front of me right now.
00:39:38.000 One is from Brian Fung, CNN Business, who's actually a pretty nice, I think he's a serious reporter.
00:39:43.000 And he says that Veritas was suspended for, quote, threats of sharing other people's private information.
00:39:48.000 Wait, wait, wait.
00:39:49.000 Threats?
00:39:50.000 This is threats of sharing other people's private information.
00:39:54.000 That's literally what CNN did.
00:39:55.000 Luke just cited that story from them.
00:39:58.000 And then I have a statement from Anna Cabrera.
00:39:59.000 This is on the air with Brian Stelter on Sunday.
00:40:03.000 Quote, Twitter has suspended the account of Project Veritas.
00:40:06.000 I'm quoting her now, so forgive.
00:40:07.000 It's not my words.
00:40:08.000 Forgive me.
00:40:09.000 Quote, a conservative activist organization, at least that's how they couch themselves.
00:40:17.000 I have never in my life referred to myself as a conservative activist, not once. I would swear to it under oath in a defamation lawsuit against CNN, unless they retract that.
00:40:27.000 And then she goes on, quote, This is a much broader crackdown by social giants on accounts promoting misinformation.
00:40:35.000 So our General Counsel has just sent a letter to David Vigilante today.
00:40:40.000 I have the letter in front of me, signed.
00:40:42.000 A failure to retract, and we will sue.
00:40:43.000 By the way, these organizations are used to throwing the spear at civilian innocent people.
00:40:50.000 They're not used to being the spear being thrown at them.
00:40:53.000 I've sued the New York Times for defamation in the New York Supreme Court, and we're going to get past motion to dismiss in a lawsuit.
00:41:00.000 We talked about this last time.
00:41:01.000 You and I talked about this.
00:41:03.000 We need to sue the shit out of them.
00:41:05.000 And I'm not doing it to harass.
00:41:07.000 I'm doing it because that's justice.
00:41:09.000 If you lie, if you intentionally disregard the truth, that's called actual malice.
00:41:13.000 I'm a public figure.
00:41:15.000 You can't, under the law of New York State, intentionally lie.
00:41:20.000 We'll get past motion to dismiss, and then you get into depositions.
00:41:24.000 That means I put Dean Baquet, executive editor of the New York Times, in a chair, and he's
00:41:28.000 required by the law to answer my questions under oath.
00:41:31.000 Not only that, but isn't there an opportunity for discovery?
00:41:33.000 Maybe get access to communications?
00:41:34.000 Discovery, depositions, interrogatories, they are required to answer my questions.
00:41:40.000 When I confront Dean Baquet in the street in Los Angeles, that's the executive editor of the New York Times, he just ran away and laughed maniacally.
00:41:47.000 But when the judge says, no, you have to answer Project Veritas' questions, why did you intentionally lie about James O'Keefe?
00:41:54.000 We have so many lawsuits, Tim, I just think we're going to start suing people.
00:42:00.000 I think it's time.
00:42:01.000 Because you can't defame people and get away with it anymore.
00:42:06.000 So long as people don't fight back, you can.
00:42:08.000 But now it seems like they picked a fight.
00:42:10.000 It's really expensive.
00:42:12.000 The problem with litigation is that each of these lawsuits is a million bucks.
00:42:16.000 But we talked about this last time.
00:42:17.000 Remember we said someone should start a non-profit and just start suing people who lie?
00:42:22.000 Well, that might actually just be project, you know, one third of our budget at this point.
00:42:26.000 You'll make money on it.
00:42:28.000 Yeah, the People's Defamation Defense Fund.
00:42:30.000 Right.
00:42:30.000 Exactly.
00:42:31.000 Just get a big fund and then track when... You know, the issue is there are a lot of people now who are finding themselves in the public sphere who have small careers as personalities, pundits, and journalists.
00:42:31.000 Perfect.
00:42:44.000 They can't go up against the New York Times.
00:42:46.000 Small individual, maybe there's a journalist who's got 70,000 Twitter followers and they're making, you know, 30 to 50k a year off of being an independent journalist.
00:42:53.000 Well, Tim, do you think these people, do you think it's psychologically too, people want to be liked by the New York Times or want their book reviewed?
00:42:59.000 Is it that psychological effect?
00:43:02.000 They don't want to take on an institution as revered, as beloved.
00:43:06.000 And what do you think about that?
00:43:07.000 Do you ever wonder, I was reading something that was really interesting, they talk about
00:43:09.000 how the old mythological stories of heroism were individuals challenging the system, fighting
00:43:15.000 against the gods, and the superheroes of today are supporting the system and fighting on
00:43:20.000 behalf of the establishment.
00:43:23.000 So I wondered about that.
00:43:24.000 You have people right now that don't want to challenge the machine, no matter how evil or wrong it may be, no matter how much they oppress and harm the working class, the little guy, the individual.
00:43:34.000 They want to be a part of the machine.
00:43:36.000 They want to be the one hazing the other person, not the one getting hazed.
00:43:40.000 So when you look at the unfettered power the New York Times or CNN has to destroy lives, why would anyone choose to be on the other end of that?
00:43:48.000 I mean, it's a legitimate question and I think the same psychological effect has on people.
00:43:53.000 I've noticed it since Project Veritas has been banned.
00:43:57.000 I don't want to share Project Veritas because I might be banned.
00:43:59.000 I want people to draw an analogy.
00:44:01.000 That's like saying I don't want to tell the truth because the New York Times might be mean to me.
00:44:05.000 OK, let's treat Twitter like the New York Times.
00:44:08.000 Don't draw a distinction between big tech and big media.
00:44:12.000 They are one and the same.
00:44:13.000 In fact, the algorithms prefer CNN, right?
00:44:16.000 Well, YouTube creates a special carousel section on their front page for the establishment news outlets.
00:44:22.000 You're not going to find...
00:44:24.000 Well, you may get recommended shows like this.
00:44:26.000 You may get recommended my other channels, which is news, commentary and journalism.
00:44:30.000 But CNN gets special privilege.
00:44:32.000 And what's funny is, download VidIQ.
00:44:35.000 I love this plugin, I use it.
00:44:36.000 And it shows you a percentage next to the title of every video of the thumbs-up to thumbs-down ratio.
00:44:42.000 You'll see the recommended videos are usually overwhelmingly thumbs-up.
00:44:46.000 Why?
00:44:46.000 Well, they're being recommended.
00:44:48.000 It'll be videos from maybe me or Steven Crowder, or maybe people like Kyle Kulinski, depending on your politics, or Jimmy Dore.
00:44:53.000 But then you'll see this special mainstream media carousel, and it's all 10% thumbs up, 5%, just overwhelmingly rejected by the users of YouTube.
00:45:03.000 But YouTube, desperate for the approval of CNN, the New York Times, ABC, etc., drop to their knees and beg CNN to like them.
00:45:12.000 Beg.
00:45:13.000 I think it's fair to say, as you were mentioning James there, that the mainstream media and social media companies definitely share the same activist toolkits as some people would say.
00:45:22.000 But this is interesting.
00:45:23.000 I think this is the bigger story here, the legal aspect of it, because if you remember, not so long ago, CNN had to settle with the Covington kids.
00:45:30.000 Yes.
00:45:31.000 So there is a precedent for this.
00:45:33.000 We don't know exactly how much, but if that's the case, that we don't know how much, that means it's a good amount.
00:45:33.000 Undisclosed.
00:45:38.000 It's a lot.
00:45:39.000 Not necessarily, but I do want to address that.
00:45:42.000 So I see a lot from the left saying this was clearly a nuisance fee.
00:45:47.000 They probably paid the bare minimum just to make him go away.
00:45:49.000 I don't think that's true.
00:45:50.000 That's ironic.
00:45:51.000 I don't think that's true though, I don't.
00:45:53.000 Because I believe the motion to dismiss failed, and they were potentially looking at depositions and discovery.
00:45:59.000 That sounds to me like the Covington kids and their lawyer probably said, we don't need a nuisance fee, we can put more pressure on you.
00:46:05.000 CNN probably, they definitely didn't pay $250 million to settle.
00:46:10.000 But it probably was...
00:46:12.000 It's hard to know.
00:46:12.000 Nice.
00:46:13.000 That was the lawyer in that.
00:46:14.000 Lynne Wood was the attorney for Covington.
00:46:15.000 Yeah, isn't that crazy?
00:46:16.000 But it's also the Washington Post that settled.
00:46:18.000 So it's not just CNN.
00:46:19.000 It's a number of news organizations.
00:46:19.000 It's true.
00:46:20.000 But I think, Tim, you're right.
00:46:21.000 What they fear most, and I've been through 12 of these depositions.
00:46:25.000 I don't know if you guys have been sued.
00:46:26.000 I've been sued so many times.
00:46:28.000 And I've won.
00:46:28.000 Project Veritas has never lost a lawsuit.
00:46:30.000 I always talk about that.
00:46:31.000 We've never lost a lawsuit.
00:46:33.000 Never settled a lawsuit.
00:46:34.000 So they fear me.
00:46:35.000 They fear litigation.
00:46:37.000 And the reason they fear it is the reason you said.
00:46:39.000 Discovery.
00:46:40.000 When you people don't understand depositions, they get all the emails.
00:46:43.000 They've looked at all my emails.
00:46:45.000 That's why I'm so clean.
00:46:46.000 I've never broken the law at Project Veritas.
00:46:49.000 We're so clean because they look at all of our emails.
00:46:51.000 They can't find anything.
00:46:53.000 But what happens when you open up CNN's emails, Tim?
00:46:56.000 When you get a Brian Stelter, Ana Cabrera, and Brian Fung in a deposition chair under oath.
00:47:03.000 So Cabrera says one thing.
00:47:05.000 They're going to be attacking each other.
00:47:08.000 Yes.
00:47:08.000 Their lawyer is going to be like, how do we solve this one?
00:47:11.000 And you can't perjure yourself in federal court.
00:47:13.000 You can't contradict yourself.
00:47:15.000 So I actually came to this epiphany in the last week or two.
00:47:18.000 We just need to start suing the shit out of them.
00:47:22.000 When you have a case, when you have a case for defamation, when you get past motion to dismiss, that's when people start settling lawsuits.
00:47:31.000 You're not going to settle though, are you?
00:47:33.000 I don't think so.
00:47:34.000 I mean, I've never settled on defense.
00:47:36.000 On offense, I mean, it's going to cost me over a million dollars to get to a jury verdict in the New York Times litigation.
00:47:42.000 That's a lot of money.
00:47:43.000 Plus, you have to consider, I think that judges oftentimes in these cases will accept apologies and retractions, too.
00:47:49.000 If the New York Times retracted the... I talked to you about this last time.
00:47:53.000 This was the Minnesota video and Ilhan Omar and the New York Times said I was part of
00:47:57.000 a coordinated disinformation campaign.
00:48:00.000 The New York Times said that because they cited some idiot professor in some random
00:48:05.000 place just saying, I think James is part of a coordinated disinformation campaign.
00:47:10.000 And then they cross-cited that on USA Today, and then Facebook cited USA Today
00:47:14.000 and then banned our video.
00:48:16.000 So if they offered money, I'd consider it.
00:48:19.000 But I think my conscience says I'm going to go all the way to a jury verdict.
00:48:23.000 I would also ask for an apology and for CNN to eat more human brains on national television.
00:48:28.000 I think it's pure punishment.
00:48:30.000 We will never pass over an opportunity to mention that a CNN host on TV ate human brain, and it's still on YouTube right now on numerous channels.
00:48:40.000 Recommended and promoted while alternative independent media gets screwed over because the mainstream media giants always get promoted in the algorithms no matter what.
00:48:48.000 And they get caught so many times with their pants down lying to the American public, whether through their sources, whether unknowingly or knowingly, we don't know until we have this discovery.
00:48:58.000 The people watching this, and obviously every comment I see is, what are you going to do about it?
00:49:02.000 I mean, people are just tired of complaining.
00:49:04.000 So I think a solution, this is a solution.
00:49:06.000 We sue them, okay?
00:49:08.000 Because I can't tell you how many... I have a dozen cases.
00:49:11.000 I'm, you know, trying to be the change I wish to see in the world, as Gandhi said.
00:49:15.000 I'm doing it.
00:49:16.000 We're suing the New York Times for defamation in New York Supreme Court.
00:49:19.000 We will get past motion to dismiss.
00:49:21.000 And Tim, there's a very strong chance the New York Times tries to settle and give me money.
00:49:27.000 You need to do a, like, remix, remastered, orchestral retracto with, like, old London Symphony Orchestra.
00:49:34.000 We're bringing in live llamas.
00:49:36.000 We're bringing alpacas.
00:49:37.000 We're going to get an alpaca farm.
00:49:37.000 Let's go to India.
00:49:39.000 Yes.
00:49:39.000 An actual llama.
00:49:41.000 So by the way, I didn't... this was like a dumb idea that came up.
00:49:41.000 I insist.
00:49:44.000 Andrew Breitbart came up with this 10 years ago.
00:49:46.000 People actually, their favorite videos are the retracto videos.
00:49:49.000 It's a good song.
00:49:49.000 I love it.
00:49:50.000 I think they love the fact that we actually get them to admit that they're lying.
00:49:56.000 You know, it's pretty cool.
00:49:57.000 But the question, do you think that I should take the money or go all the way to a jury verdict?
00:50:01.000 I think what you should do, if I could recommend something, is you cover your costs, but then you take the money out of it and you create it and you put it in this fund where you protect other people.
00:50:08.000 The People's Defamation Defense Fund.
00:50:09.000 The People's, what do you call it?
00:50:10.000 The People's Defamation Defense Fund.
00:50:12.000 I like that.
00:50:12.000 Yeah.
00:50:13.000 When you protect the Redditors and the memers that CNN keeps going after, or the grandma that they door-stepped, you help those individuals in a fund after recouping some of your initial costs.
00:50:22.000 That's what I would do.
00:50:23.000 You take the dividends and you start a fund with that.
00:50:27.000 Exactly.
00:50:28.000 The one thing I would say is just weigh the risks.
00:50:30.000 If the New York Times comes to you and offers apology, retraction, and settlement, and you say no, the judge might tell you to screw off.
00:50:37.000 He might be like, what do you mean?
00:50:38.000 You won.
00:50:38.000 They came to you and they said they're going to give you everything you want.
00:50:41.000 So it may be great to get a jury victory, but that's only if they fight you all the way.
00:50:46.000 I gotta say, we covered this story the other day a couple times, you know, as the story has been emerging.
00:50:51.000 The officer in the Capitol riots, New York Times put out fake information.
00:50:55.000 The next day, and I missed this, because I trust the New York Times, it was reported in Houston, the officer died of an unrelated stroke.
00:51:02.000 The New York Times didn't change the article until a month and a week or so later.
00:51:07.000 So I don't think the New York Times, they're going to resist.
00:51:11.000 I think they're going to resist as much as possible any kind of admitting fault.
00:51:16.000 But I'll tell you this, man, if you get the New York Times to, you know, issue a full retraction apology, they've done that before for you, haven't they?
00:51:22.000 That's a win, yes.
00:51:23.000 We have hundreds of retractions and corrections, including a couple from the New York Times.
00:51:29.000 If they offer to retract the article and apologize, to me that's the case.
00:51:35.000 We win.
00:51:36.000 That's what I wanted in the first place.
00:51:37.000 But, Tim, they refused to do that.
00:51:39.000 Right.
00:51:40.000 And I think they didn't – I think I called their bluff.
00:51:43.000 I think they're like, oh, screw you.
00:51:45.000 And they're not used to people fighting back.
00:51:48.000 They're not – they're used to railroading people.
00:51:51.000 And the biggest epiphany for me was because we got Raw Story to retract an entire paragraph.
00:51:56.000 I don't know if you've heard of Raw Story.
00:51:57.000 It's a woke, clickbait, verified sort of situation.
00:52:01.000 Are they connected to Soros?
00:52:04.000 I don't know.
00:52:05.000 Raw Story just retracted, Tim, an entire paragraph.
00:52:09.000 They just made something up.
00:52:11.000 What did they make up, Eric?
00:52:12.000 Raw Story made up an entire paragraph about Project Veritas.
00:52:16.000 Oh, they said that I said that Antifa was behind the Capitol riots.
00:52:20.000 I never said that.
00:52:22.000 So our attorney said, they just deleted it.
00:52:25.000 And I was like, that's great.
00:52:27.000 I got a retraction.
00:52:28.000 I'm framing the retracto.
00:52:30.000 But then I thought, for every one of these, there's 50,000 that never got deleted.
00:52:38.000 And I think to myself, they should be thanking us for doing a service.
00:52:43.000 Because, but for Project Veritas, that paragraph would still be on the internet forever.
00:52:49.000 On Wikipedia, cited by other organizations.
00:52:53.000 It blows your mind.
00:52:54.000 Wikipedia for like seven years claimed I invented a remote control Zeppelin for live broadcasting.
00:53:01.000 I have no idea how!
00:53:02.000 But that's the editorial standard of Wikipedia, I suppose.
00:53:07.000 And the New York Times, in their response to our motion to dismiss, cited Wikipedia in the legal motion.
00:53:15.000 They had footnotes, but James O'Keefe's Wikipedia page says that he's no good!
00:53:20.000 I am not making this up.
00:53:23.000 You know what, man?
00:53:24.000 I'm not making this up.
00:53:25.000 The Gray Lady hath fallen.
00:53:26.000 So what do you do?
00:53:28.000 You gotta start this defense fund, and then next time you come on, blow it up on the show?
00:53:31.000 Like, talk about it and get people to start investing in it?
00:53:34.000 Well, we're gonna start by... We already sued the New York Times, and we're gonna start suing a few other outlets.
00:53:40.000 I think we're onto something here.
00:53:41.000 Malice.
00:53:43.000 Defamation law in the United States.
00:53:45.000 This is why discovery and depositions are so dangerous to these organizations.
00:53:51.000 I've been in these private slack groups.
00:53:54.000 You guys know what Slack is?
00:53:56.000 It's basically Discord for private corporations.
00:53:59.000 I have seen these people say things that would get them in very, very serious trouble.
00:54:04.000 I have had, there's one particular instance I cite over and over again about a very prominent individual who now works for the New York Times, who basically told me, hey, we engage in unethical behavior, so don't report on other people doing the same kind of unethical behavior.
00:54:16.000 And I was like, wow.
00:54:18.000 If a story broke, and they got sued, and that chat was released in Discovery, the company would be, their credibility would be gone forever.
00:54:26.000 So how does Discovery work?
00:54:27.000 So I've been through this so many times.
00:54:29.000 I'm not an attorney.
00:54:30.000 I've got a few dozen working for us.
00:54:32.000 I dropped out of law school.
00:54:34.000 You do a motion to dismiss, you pass this motion to dismiss, that's when these companies start settling lawsuits, like in the case of Nick Covington, or Nick from the Covington- Sandman.
00:54:44.000 Nick Sandman, thank you.
00:54:45.000 Once you get past motion to dismiss, then the company has realized you are headed towards discovery.
00:54:50.000 A deposition is where they videotape me.
00:54:54.000 For 12 hours, I'm under oath, and the lawyers pepper me with questions.
00:54:58.000 They look at your emails, they get everything related to the litigation, so I can open up the New York Times' emails, I can look at every communication.
00:55:06.000 It's against the law for them to delete emails about me.
00:55:09.000 They have to preserve their records about me.
00:55:12.000 And I get, and by the way, a deposition, nobody likes it.
00:55:15.000 It's very uncomfortable.
00:55:17.000 Unethical people, which we are not unethical at Project Veritas, we always behave like people are watching, fear the deposition.
00:55:24.000 Have you ever been deposed?
00:55:25.000 Negative.
00:55:26.000 Have you ever been deposed?
00:55:29.000 Not in this kind of context.
00:55:31.000 I've been in trial before.
00:55:32.000 I plead the fifth.
00:55:33.000 That's a yes.
00:55:34.000 I've been deposed like a dozen times.
00:55:36.000 I've been to federal jury trials.
00:55:38.000 I understand this process, and it's very uncomfortable.
00:55:43.000 I mean, I've also been arrested by the FBI, so there's that.
00:55:47.000 I've been through all this, and they're not used to this.
00:55:51.000 They're not used to this.
00:55:53.000 And it's almost like a form of content when you get the executive editor of the New York Times in a chair under oath.
00:56:01.000 It's almost like he's required to answer questions.
00:56:05.000 And their brain probably hurts because they're legally bound to tell the truth.
00:56:09.000 They have to.
00:56:10.000 And their brain isn't used to that.
00:56:11.000 So the withered husk in their brain that is the truth center is like struggling to function.
00:56:17.000 Pouring water on a robot.
00:56:18.000 It's like malfunctioning.
00:56:19.000 When you do discovery, you start pulling from the emails.
00:56:22.000 If you find, like, different infractions, this guy defamed you, this guy defamed you, this guy defamed you, do they then become separate?
00:56:29.000 It's so beautiful.
00:56:30.000 In the New York Times case already, the reporter in that case is a New York Times reporter.
00:56:36.000 Maggie Astor wrote an article.
00:56:39.000 This is so beautiful.
00:56:40.000 In the article, she said that Project Veritas relies solely on anonymous sources in the Minnesota video.
00:56:46.000 When you can see the people's faces, it was just preposterous.
00:56:49.000 It was self-evidently incorrect.
00:56:51.000 And then under oath, Maggie Astor changed her tune.
00:56:51.000 Under oath, she already printed a retracto. In the motion, she already changed the fact. She said, no, no, many of the people are named. So under oath, they changed their tune. Well, that's a retraction. That's a victory for you. That's a victory already. They've now admitted the facts are different from what they proposed. She knows it, and the judge knows it, and the jury knows it.
00:57:19.000 I'm telling you, this is the People's Litigation Defense Fund, Tim.
00:57:22.000 It's a good idea.
00:57:23.000 They've got to settle with you.
00:57:24.000 They've got to just keep quiet.
00:57:25.000 But what concerns me is the damages that that guy, that girl went on CNN in front of, I don't know, 100,000 people.
00:57:30.000 Do you have how many people were watching that episode?
00:57:32.000 A million people.
00:57:32.000 And they lie about, they said that you were spreading false information on Twitter.
00:57:36.000 Well, this is what they did to me when I got invited to the White House.
00:57:38.000 So, me and Bill, and you were there as well.
00:57:41.000 When we got invited to the White House, The Today Show showed my picture and claimed that I was a proponent of the Seth Rich conspiracy theory.
00:57:50.000 They made it up.
00:57:51.000 Because what happened was, Fox News reported definitively—it was Fox Business, I believe—information about the Seth Rich conspiracy theory.
00:57:59.000 I then read that and said, wow, well, I don't believe it's entirely true, but we'll see what happens.
00:58:05.000 Because I read a story that was later retracted, they ignore all of the context around the fact that it was retracted, that I came out when it was retracted and said, wow, the story was fake the whole time, who'd have thought?
00:58:14.000 But simply by reading a story that later got retracted, they claimed I was pushing a conspiracy theory.
00:58:18.000 You see how that works?
00:58:20.000 That's a great trick, because they'll argue, but it is true, it was a conspiracy theory.
00:58:24.000 Yeah, but at the time, the context is relevant.
00:58:26.000 It was a definitively reported story from Fox Business, later retracted.
00:58:30.000 I reported on the retraction.
00:58:31.000 That's the dirt of these organizations.
00:58:34.000 They know it's not true.
00:58:36.000 They know it's a manipulation.
00:58:37.000 Think about every single journalist who pushed the insane Russiagate lies.
00:58:41.000 Are they all conspiracy theorists now?
00:58:43.000 Well, there's a great book called Disinformation by a Soviet bloc dissident, which I tweeted this picture.
00:58:49.000 This guy died yesterday.
00:58:50.000 I can't even pronounce his name, but he wrote a book called Disinformation.
00:58:52.000 He was a defector, and he talks about how disinformation works in the Soviet Union, or in his case, Romania, I believe.
00:58:58.000 Spell out his name.
00:58:59.000 Ion, M-I-H-A-I, Pacepa.
00:59:04.000 He died yesterday, and he wrote a book called Disinformation.
00:59:06.000 It's like $1,000 on Amazon because everyone's trying to buy it right now.
00:59:09.000 I have a copy of it.
00:59:11.000 And he says that there's always a little kernel of truth in their disinformation, and they always go back to that little... It's mostly untrue, but there's some weird, like you just said, little factoid, and that's how they do it.
00:59:23.000 That's how disinformation works.
00:59:24.000 They'll use some iota of truth, and it's surrounded by falsehoods.
00:59:29.000 Assumptive language.
00:59:30.000 Saying things like, you know, James O'Keefe, they'll say, James O'Keefe pushed the conspiracy theory that Antifa was responsible for the Capitol riots.
00:59:30.000 Yeah.
00:59:39.000 Right.
00:59:39.000 And then they'll argue in court, we didn't say that he affirmed, asserted as true.
00:59:43.000 We say he pushed the theory because he was talking about it.
00:59:46.000 Like if I ask a question, you're pushing the conspiracy theory.
00:59:49.000 I'm a reporter, right?
00:59:50.000 So I ask questions for a living.
00:59:51.000 So if I, if I ask questions about whether Antifa was present, And then one reporter from Bloomberg says he implied it.
00:59:59.000 Raw Story says he's pushing the theory.
01:00:01.000 It's this crazy thing.
01:00:04.000 Once it makes it through a few sources, it'll be, James O'Keefe asked a question about X. Then someone else copies that and says, according to source A, James O'Keefe was implying that X was true.
01:00:18.000 And then the third source says James O'Keefe said X, then Wikipedia takes it, runs it,
01:00:23.000 James O'Keefe believes in, you know, space dinosaurs.
01:00:25.000 It's like a game of telephone. And again, the further it goes, the crazier it gets.
01:00:30.000 And people don't understand this because they don't see themselves doing this. But people need
01:00:33.000 to understand the levels of depravity, the amount of lies, the amount of selective editing that goes
01:00:40.000 into a lot of these mainstream media pieces. When they have an agenda, when they have something that
01:00:44.000 they want to push, they will do everything in their power.
01:00:47.000 They will stoop to levels that are unimaginable just to get that agenda through. Well, the
01:00:51.000 editing thing, that's what I say in our videos, it's a form of psychological projection. This
00:56:46.000 is, you know, I'm not a psychologist, but this is projection. They
01:01:00.000 quite literally accuse their enemy of what they themselves do. And we're not like that. You and I are, that's not
01:01:06.000 how we operate. We're fairly ethical people. We're pretty honest people, pretty down to earth
01:01:11.000 people.
01:01:12.000 These people project onto us what they do.
01:01:15.000 And when they say editing, I mean, all journalism is edited selectively.
01:01:18.000 So, I mean, these words are arranged into sentences in a fairly deceptive manner.
01:01:23.000 They write things.
01:01:25.000 I mean, we do the purest form of journalism is people's lips moving.
01:01:29.000 You can hear the words coming out of their mouth, but they practice this form of projection and it's rather evil.
01:01:36.000 They just, and it works, Tim.
01:01:39.000 It's effective.
01:01:40.000 And they have, like, $100 million set aside annually for lawsuits.
01:01:44.000 Getting ready to pay out $40 million a year in lawsuits.
01:01:47.000 I'm just throwing a number out.
01:01:48.000 I don't know exactly how much.
01:01:49.000 Yeah, they do, though.
01:01:49.000 They're well prepared for lawsuits.
01:01:51.000 They budget them in.
01:01:52.000 They know they're coming.
01:01:53.000 And they know they can crush the little guy.
01:01:55.000 And that's what you were saying.
01:01:56.000 They railroad people.
01:01:57.000 They railroad people, and I think back in the New York Times' case, that article about our Minnesota video was the night of the presidential debate.
01:02:06.000 They had a researcher lined up from Stanford.
01:02:09.000 It wasn't Stanford University, it's some other Stanford, some unknown research group called Stanford.
01:02:14.000 I mean, you can't make this stuff up.
01:02:16.000 They called the guy, the reporter called the guy, they had a canned remark.
01:02:20.000 Yeah, it's, quote, probably part of a disinformation campaign.
01:02:24.000 Probably.
01:02:25.000 And then the New York Times cited that researcher saying it's probably part.
01:02:28.000 They put it in the headline.
01:02:29.000 It was cross-referenced by USA Today.
01:02:31.000 Facebook uses USA Today as their fact checker.
01:02:33.000 And before you know it, 50 million people getting a notification saying the video is fake news.
01:02:38.000 It's unbelievably effective.
01:02:40.000 Well, let's talk about the story you just broke.
01:02:43.000 Yeah.
01:02:43.000 And, you know, talking about the censorship now.
01:02:46.000 So you guys just put out this tweet.
01:02:48.000 Mark Zuckerberg.
01:02:49.000 Actually, I have the tweet right here.
01:02:50.000 You tweeted, CEO Mark Zuckerberg takes anti-vax stance in violation of his own platform's policy.
01:02:56.000 I'm not going to read his quote because, you know, even though I know you already did, but they might try.
01:03:00.000 I'll tell you what I do when I read quotes.
01:03:02.000 I'll do this.
01:03:04.000 Quote, I share some caution on this vaccine because we, and this is Mark Zuckerberg saying this, just don't know the long term, and again, Mark Zuckerberg, side effects of basically, and again, this is from Mark Zuckerberg's quote, modifying people's DNA, Mark Zuckerberg here, and RNA.
01:03:19.000 No, no, I have to do that because- They'll selectively edit you.
01:03:22.000 Yes, yes.
01:03:23.000 No joke, no joke.
01:03:24.000 Me reading something from Mark Zuckerberg and they will splice it together.
01:03:28.000 Well, go ahead and splice it that many times if you really want to do it.
01:03:30.000 Some people will.
01:03:32.000 But this is Mark Zuckerberg basically saying something that, as far as I can tell, was never believed to be correct.
01:03:38.000 When the research in the news was coming out about the mRNA vaccine, it was never true.
01:03:44.000 It was always a conspiracy theory that it modified DNA.
01:03:48.000 Now, at the time, it was not against Facebook's rules to claim that.
01:03:52.000 That's my understanding.
01:03:52.000 That's correct, right?
01:03:54.000 At the time, but recently Facebook announced that they are, quote, expanding their efforts to remove false claims on Facebook about the vaccine.
01:04:03.000 So Mark Zuckerberg makes the statement in July where he's essentially violating Facebook's own policy.
01:04:10.000 And the question is, I guess Mark Zuckerberg's thinking has evolved on the vaccine.
01:04:16.000 And it's a little bit strange because he's now just instituted this policy prohibiting what he was saying a few months ago.
01:04:25.000 Well, I think it's fair to say people's opinions change.
01:04:29.000 And Mark Zuckerberg was clearly wrong in this.
01:04:32.000 The mRNA, it essentially, it doesn't change your DNA.
01:04:36.000 It just configures certain cells to produce the spike protein.
01:04:39.000 And then when those cells die off, that's it.
01:04:40.000 That's the end of it.
01:04:41.000 That's why there's actually multiple doses.
01:04:42.000 That's why they say it doesn't work necessarily in the long term.
01:04:45.000 But this was, I don't know what he was reading.
01:04:48.000 Mark Zuckerberg was reading fake news.
01:04:50.000 Now, it's fine, in my opinion, that he wants to have an opinion that's incorrect, because people are wrong.
01:04:55.000 Even Mark Zuckerberg.
01:04:56.000 The real issue here, as you can see, and we were talking about this earlier, Zuckerberg is afforded the right to pontificate on things that were widely considered to be conspiracy theories at the time.
01:05:06.000 Whereas if I said the same thing, I would have been annihilated in the media.
01:05:09.000 He's telling his own staffers this stuff.
01:05:11.000 Now he's implementing this rule.
01:05:13.000 And again, I'm not holding it against him that he changed his opinion, okay?
01:05:16.000 No.
01:05:16.000 But the authority is him.
01:05:17.000 This is the problem.
01:05:18.000 Who appointed Mark Zuckerberg to decide when and how we're allowed to think about things and share opinions?
01:05:24.000 Well, his own vice president, Nick Clegg, we had him on a leaked tape, basically saying, they're just making this stuff up as they go.
01:05:30.000 This is Nick Clegg, vice president of Facebook.
01:05:32.000 He's a British guy and he's on the, I guess he has their own oversight board.
01:05:37.000 And they're basically playing the role of judge and jury of the Supreme Court.
01:05:41.000 The Financial Times reported there is a growing trend.
01:05:43.000 Tech companies think they should be deciding public policy, not government.
01:05:48.000 And this is a prime example.
01:05:49.000 This has nothing to do with my opinion about the vaccine or whatever it or Europe.
01:05:52.000 It doesn't.
01:05:53.000 The point is Zuckerberg is the CEO of Facebook.
01:05:56.000 And just a couple of months ago, he was anti-vax.
01:05:59.000 This is Mark Zuckerberg.
01:06:00.000 Not me, not you, not us.
01:06:02.000 It's Mark Zuckerberg.
01:06:04.000 The CEO of maybe the most powerful company in the history of the planet Earth.
01:06:07.000 At a time when if you tweeted anything remotely anti-vax, they went after you hard.
01:06:12.000 But this is the CEO of Facebook anti-vax.
01:06:15.000 On the record, not on the record, he's off the record, he's behind closed doors.
01:06:18.000 But to his employees, to a major... I guess he's on the record to his own staff, his own executive committee, saying this.
01:06:25.000 This is a whistleblower inside Facebook, folks, has given us this tape of Mark Zuckerberg.
01:06:28.000 That is one of Mark's own colleagues, who is a source to James O'Keefe, and is still a source to James O'Keefe. That's why those
01:06:36.000 engineers are very careful.
01:06:37.000 That's pretty cool.
01:06:38.000 Pretty cool, right? And Mark Zuckerberg just a few months ago said, I'm anti-vax.
01:06:42.000 I think it changes your DNA. Again, Mark Zuckerberg's words, not mine.
01:06:45.000 It's too late. They got you.
01:06:46.000 But suddenly now Mark Zuckerberg has changed the policy.
01:06:52.000 You're not allowed to even pontificate about the vaccine.
01:06:55.000 Doesn't that seem a little capricious to you?
01:06:57.000 I think it's good that he finally realized he was wrong.
01:07:01.000 I think this is actually included in your report that he did a public conversation with Dr. Fauci.
01:07:05.000 And here's what I love about this, because I texted you when I saw this, and I was like, well, look, I'd get it if he changed his opinion.
01:07:11.000 You guys fully included the dates of when he said it, the date of when he had a change of opinion, you're not hiding the fact that, and I think it's fair to point out that sometimes people's opinions change, and good for Mark Zuckerberg on realizing he was wrong.
01:07:21.000 The issue is, if it's wrong to say it now, we knew it was wrong to say it, you know, back in July when he was telling his employees this.
01:07:28.000 The point is, why was the rule just implemented?
01:07:31.000 Why was Mark Zuckerberg allowed to push insane conspiracy theories to a massive company, Facebook?
01:07:38.000 I guess the idea is if we're gonna live under this rule about disinformation, you can clearly see that Mark Zuckerberg shares disinformation, and then it only changes when he realizes he was spreading disinformation.
01:07:49.000 In many ways, and this Nick Clegg, the vice president of Facebook, is admitting as much in one of these leaked tapes: we're kind of having to come up with these rules.
01:08:00.000 Think of it like the U.S.
01:08:01.000 Constitution.
01:08:01.000 Imagine the United States Constitution had very specific things that you're not allowed to say.
01:08:06.000 One of them is you cannot speak about the COVID vaccine.
01:08:09.000 I mean, that's what Facebook is saying.
01:08:11.000 They're just making new things up that you can't talk about.
01:08:15.000 And the analogy I draw is the Constitution of the United States is very clear.
01:08:19.000 Congress shall make no law.
01:08:21.000 prohibiting the free exercise of the freedom of speech.
01:08:25.000 But Facebook has these rules about certain subjects you're not allowed to pontificate about.
01:08:31.000 And the audience, well, they're a private company.
01:08:34.000 They can say what they want.
01:08:36.000 They're a private company.
01:08:37.000 I get they're a private company.
01:08:38.000 But you can't lie.
01:08:40.000 Lying is wrong.
01:08:41.000 It's just unethical.
01:08:42.000 I would even question the private remark there because they work hand in hand with the government
01:08:48.000 in many different ways.
01:08:49.000 So I would say they're even a quasi-government agency, not even from their seed funding,
01:08:52.000 but from what they're doing now.
01:08:54.000 So it's interesting because, I mean, he's afforded to change his mind.
01:08:58.000 He's allowed to pontificate, but we're not.
01:09:01.000 When you get rid of conversation, when you censor it, when you stifle it, you spread a lot of these crazier theories.
01:09:06.000 And then, again, it deserves to be called out, and I'm happy someone's calling it out.
01:09:10.000 Look at Twitter and Hunter Biden.
01:09:12.000 This is one of the biggest media scandals of our generation.
01:09:16.000 A news story broke about Hunter Biden, now President Joe Biden's son, at the time, running for the highest office in the land.
01:09:24.000 And his son was implicated in some very serious crimes, and his brother!
01:09:27.000 So Twitter intervened, Facebook intervened, and they publicly bragged, don't worry, we're suppressing this story so people can't see it.
01:09:36.000 And then, what, two weeks after the election was over, they were like, oh, that story?
01:09:39.000 That was true the whole time.
01:09:40.000 It was true the whole time.
01:09:42.000 That is horrifying.
01:09:44.000 And so this goes hand-in-hand.
01:09:47.000 I think you're looking for some consistency.
01:09:50.000 The consistency would be, Mark Zuckerberg says, don't spread disinformation.
01:09:54.000 We derank disinformation.
01:09:56.000 They give power to these third-party fact-checking organizations, which just make stuff up, and they put out fake news themselves.
01:10:02.000 Like, my favorite was Snope's article.
01:10:05.000 Did Ocasio-Cortez exaggerate the danger she experienced in the Capitol?
01:10:09.000 False!
01:10:10.000 While she wasn't in the Capitol, the Capitol was stormed.
01:10:13.000 No.
01:10:14.000 AOC's story about the Capitol took place a full hour and ten minutes before the building had been breached.
01:10:21.000 So I think it is fair to have the opinion she exaggerated.
01:10:24.000 Why then is Snopes given the ability to slap a warning label on my posts?
01:10:30.000 Which is, in my opinion, definitely- They didn't actually do that to me, I'm just saying.
01:10:33.000 On your or anyone else's posts, Snopes can now put their opinion over your opinion.
01:10:38.000 If we're supposed to be living by a standard to stop disinformation, I think what we've now realized from, you know, the video you've put out just now, this is the context I think you need to make sure people understand.
01:10:47.000 Right.
01:10:48.000 That in July, it was a conspiracy theory to say what Mark Zuckerberg was saying, and he didn't care.
01:10:56.000 He let people say it.
01:10:57.000 Now he had the epiphany, so he makes the rule.
01:11:00.000 That's the problem.
01:11:01.000 Are we allowed to have these ideas and communicate, or is it only on Mark Zuckerberg's whims the rules change?
01:11:06.000 I mean, this is an extraordinary point, Tim.
01:11:08.000 I mean, it's like a constitution.
01:11:10.000 People need to understand these companies just, OK, now we're going to create a rule this week saying you're not allowed to talk about this.
01:11:16.000 These are private companies.
01:11:17.000 Again, this is from the Financial Times.
01:11:19.000 Momentous decisions in the hands of private companies is not a long-term public policy solution.
01:11:24.000 They neither have the legitimacy nor the capacity to make such decisions in the public interest.
01:11:31.000 These are akin to the Supreme Court decisions for Facebook to say, OK, now you are not allowed to talk.
01:11:38.000 about vaccines.
01:11:39.000 We will ban you.
01:11:41.000 It's more powerful than if the United States Supreme Court were to cast a decision tomorrow saying you are not allowed to talk about red delicious apples.
01:11:51.000 So hold on.
01:11:51.000 Is the rule actually that you can't talk about vaccines at all?
01:11:54.000 I'm going to read the rule to you.
01:11:55.000 This is last week, Mario, I believe.
01:11:56.000 This is Facebook announced last week a new rule that said, quote, We are expanding efforts to remove false claims on Facebook and Instagram about COVID vaccines and vaccines in general.
01:12:10.000 So I wonder, you know it'd be really funny if like one day, Mark Zuckerberg just like, he meets somebody who has this epiphany and he becomes this crazy fruitarian who thinks meat is wrong.
01:12:20.000 Right.
01:12:20.000 And then he passes a rule where he's like, I think that eating meat is just wrong and eating vegetables is also wrong.
01:12:26.000 You know, a lot of people talk about not doing harm to animals, but you also harm the plant when you eat the vegetable.
01:12:31.000 Fruits are actually meant to be eaten, so I think everyone should have to eat nothing but fruit and spread the seeds.
01:12:36.000 So now we're gonna ban anyone who promotes meat or advertises it.
01:12:39.000 That's exactly right.
01:12:40.000 That's exactly what happened.
01:12:42.000 Because in July, he said, no, this thing is, I'm an anti-vax guy and I think it modifies your DNA.
01:12:46.000 This is like not five, 10 years ago.
01:12:48.000 This was in 2020, he said this.
01:12:51.000 And suddenly he had a change of heart.
01:12:53.000 I mean, so I think that Facebook is more powerful than the United States Supreme Court.
01:12:58.000 In fact, these tech companies are more powerful than all three branches of government.
01:13:01.000 So when they make a rule change that says you can't talk about something, it's as if the United States Supreme Court made a change to, codified, the United States Constitution.
01:13:12.000 Give me the ability to control what people say, and I care not who makes the laws.
01:13:16.000 Well, the thing is, okay, this is the big debate.
01:13:19.000 This is it.
01:13:19.000 This is a big debate of our times.
01:13:20.000 They're private companies right now, and we don't have the right to seize their means of production as a company.
01:13:26.000 That's very communist if we were to say, okay, Facebook, now you're government controlled.
01:13:30.000 We're taking the company.
01:13:31.000 We're not going to do that.
01:13:32.000 We're not seizing the means of production.
01:13:33.000 So we're not going to make them create their own terms of service.
01:13:36.000 We can't do that.
01:13:37.000 That's a communistic takeover.
01:13:38.000 We can't do that.
01:13:39.000 Listen to me.
01:13:41.000 The problem is these companies have a right to create their own terms, as does a restaurant.
01:13:47.000 They don't want you talking about certain things in the restaurant.
01:13:48.000 Get out of the restaurant.
01:13:49.000 That's the restaurant owner's right.
01:13:51.000 My concern is how do we move forward in the society with that still being the case?
01:13:55.000 And I can only find... Tim, please let me finish this thought: we need to free the software code.
01:14:01.000 It's the only way to allow other people to create software as powerful as Facebook with their own terms of service.
01:14:07.000 So you made several points there, and the premise of your final conclusion is incorrect, which is what I'm trying to address.
01:14:13.000 Scale exists.
01:14:15.000 If there is one person buying up all the farmland, and now no one has food, the people have a right to say, nah, you can't do that.
01:14:23.000 One small plot of farmland has a right to their sovereignty to grow their crops and be left alone.
01:14:28.000 One person seizing every single opportunity for people to eat is very different.
01:14:33.000 Facebook is not some small platform people use sometimes.
01:14:36.000 It is one of the largest, if not the second largest, website on the planet, I think, behind Google.
01:14:41.000 And they essentially control the flow of information by monopolizing it.
01:14:45.000 They've now used that monopoly power to control our politics.
01:14:49.000 It is not the same thing as a restaurant.
01:14:52.000 There's 50 different restaurants that are not chains I can go to.
01:14:55.000 There's no other Facebook.
01:14:56.000 The president is not on these other platforms.
01:14:58.000 Okay, I agree.
01:14:59.000 It is a monopoly.
01:15:00.000 But if you look at Rockefeller's Standard Oil, they broke it up.
01:15:03.000 That was the first monopoly.
01:15:04.000 And all it did was make Rockefeller more rich.
01:15:07.000 They created six new oil companies, and he had stock in all six.
01:15:10.000 If they did this to Facebook, you'd have Facebook Prime, you'd have Facebook Messenger, you'd have Instagram.
01:15:14.000 That's not the answer.
01:15:15.000 No, it's not the answer.
01:15:15.000 Breaking up Facebook is not the answer.
01:15:17.000 Regulating it by freeing the software code, yes.
01:15:20.000 Why would freeing the software code do anything?
01:15:21.000 Because it would give people the opportunity to create software as powerful with their own terms.
01:15:25.000 And then people would migrate to the place with the best terms.
01:15:27.000 Ian, you just advocated against seizing the means of production.
01:15:30.000 Now you're advocating for it.
01:15:31.000 No.
01:15:31.000 I think that Facebook has the right to function, and I would never want to take it away from Mark.
01:15:35.000 But the code is their means of production.
01:15:37.000 No, the means of production is the website.
01:15:39.000 No, the factory is... Yes, what's the website made of?
01:15:42.000 The code, right?
01:15:43.000 Yeah.
01:15:43.000 So you want to give the secret sauce that makes their system work.
01:15:46.000 Yeah, but I'm not going to take their system.
01:15:48.000 That's what you're advocating.
01:15:49.000 No, I'm not taking... I don't want to take Facebook.
01:15:50.000 I want to free the software code so other people can create more of them.
01:15:54.000 So you're taking... It's different.
01:15:55.000 They're allowed to keep their brand, but not the actual company.
01:15:57.000 And the website, and all the marketing that goes through the website.
01:16:00.000 Yes, of course.
01:16:01.000 That's the only capitalist...
01:16:04.000 I think you're talking about transparency.
01:16:09.000 I mean, these platforms, they have the ability to amplify certain voices while excluding others.
01:16:15.000 It's a product of scale economies, and their power comes from squeezing out alternative platforms while fueling this virality. And I think, I mean, I agree with you on one hand, I think it's about transparency. I'm biased, Veritas is my mission, I believe in it: if we just expose how they do what they do... You're talking about, like...
01:16:39.000 Interesting.
01:16:39.000 Letting their competitors know how... What do you mean by showing what they're doing?
01:16:44.000 Access to the source code.
01:16:45.000 Access to the code.
01:16:46.000 So you could see how the algorithms work, how the code is built, how the software is built.
01:16:50.000 So you could reuse it.
01:16:52.000 And if you make it a free software code, if you take the software and change it and make it better, they would have access to your changes as well.
01:16:58.000 Interesting.
01:17:00.000 Interesting.
01:17:02.000 I don't know.
01:17:03.000 I think the answer is, you know, we had Will Chamberlain on the show talking about how parlor isn't the answer.
01:17:10.000 We need platform access as a civil right to make sure people have the ability to use these platforms.
01:17:15.000 But I've got to be honest, the problem is multifaceted.
01:17:18.000 So I'll give you one example.
01:17:20.000 Just because I'm allowed to use Twitter doesn't mean Twitter is functioning properly, right?
01:17:24.000 So we talk about Facebook's censorship and the capriciousness of their rule changes.
01:17:28.000 Well, recently I quote-tweeted Cassandra Fairbanks.
01:17:32.000 She did this story about vans pulling up to the TCF Center and questions being raised about it.
01:17:37.000 I said I didn't think the videos mattered.
01:17:38.000 Whatever they do prove or don't prove, I don't think they matter.
01:17:41.000 Because Time Magazine wrote an article basically saying that the election was rigged.
01:17:45.000 And then I put, I'm sorry, they didn't say it was rigged.
01:17:47.000 They say it was fortified by changing election laws and manipulating the flow of information.
01:17:52.000 That's a fact.
01:17:53.000 They didn't say it was rigged.
01:17:54.000 That's what I said.
01:17:55.000 They said it was fortified.
01:17:56.000 Time Magazine wrote this article.
01:17:58.000 Twitter blocked my tweet in all forms.
01:18:01.000 You can't retweet it.
01:18:02.000 You can't quote.
01:18:02.000 You can't respond.
01:18:03.000 You can't like.
01:18:04.000 No, no, I'm sorry.
01:18:04.000 You can't quote.
01:18:04.000 You can't respond, retweet, or like.
01:18:07.000 And they put a tag saying, this claim about fraud is disputed.
01:18:10.000 But it's literally not disputed.
01:18:12.000 Time Magazine wrote it.
01:18:13.000 I was quoting Time Magazine.
01:18:15.000 So I actually reached out to Jack and I said, could you remove this?
01:18:18.000 It is a false statement of fact.
01:18:20.000 Jack did not respond.
01:18:21.000 Well, he responds sometimes.
01:18:22.000 You know, I talk to him every so often.
01:18:24.000 We have this other story that's really funny, actually, and this exemplifies the great problem.
01:18:28.000 Check this out.
01:18:29.000 Twitter labels Indiana AG's tweet a Valentine's Day meme that election was stolen from Trump.
01:18:36.000 This is hilarious.
01:18:38.000 Todd Rokita tweeted, Happy Valentine's Day.
01:18:41.000 And there's an image that says, You stole my heart like a 2020 election.
01:18:45.000 Happy Valentine's Day.
01:18:47.000 And Twitter says, This claim of election fraud is disputed and this tweet can't be replied to, retweeted, or liked due to a risk of violence.
01:18:55.000 Chef's kiss.
01:18:56.000 Memes!
01:18:57.000 It's a meme!
01:18:58.000 He's not literally saying there was fraud, he's mocking the idea.
01:19:02.000 It's Donald Trump as a cartoon and he's making fun of it.
01:19:04.000 You can't even post memes!
01:19:05.000 Yeah, they went after a couple of my memes too on Instagram and Twitter, just deleted them, and I didn't even know, and I'm like, wait, what's going on here?
01:19:11.000 Why me?
01:19:12.000 I mean, they're satirical, they're meant to be funny.
01:19:14.000 That's funny.
01:19:15.000 That's... seriously, I don't think that's going to make anyone violent at all.
01:19:20.000 And now we're at a phase where Instagram just announced a few days ago that they're going to be going and snooping through your DMs, your private messages, looking for hate speech to go after.
01:19:31.000 That's not new.
01:19:33.000 I know, but them publicly announcing it, them saying and being so emboldened by it is new.
01:19:39.000 We have to acknowledge the reality that these platforms, and I think we've all been talking about this, have a form of concentrated political power akin to, basically, a loaded gun on the table.
01:19:50.000 They swing elections, they sway elections, so what do we do about it, right?
01:19:54.000 That's what everyone says to me.
01:19:55.000 It's like, okay, stop complaining, be brave, do something.
01:19:59.000 I kind of agree with you.
01:20:01.000 I think it's exposing the reality and we need an army of whistleblowers inside big tech.
01:20:10.000 Well, you know, I've said this several times.
01:20:12.000 I think the work you guys do is some of the most important work we have right now.
01:20:16.000 Your show brought us Richard Hopkins.
01:20:18.000 Last time I was on the show, the mailman from Erie, Pennsylvania, sent us an email after watching your show and blew the whistle on the Postal Service.
01:20:33.000 Well, that's all you, man. I mean, we're just a bunch of people who sit around talking about stuff on the internet.
01:20:36.000 But for whatever reason, he watches your show more than any other show, and he came to VeritasTips at ProtonMail.com.
01:20:43.000 That's awesome.
01:20:44.000 Shameless plug.
01:20:44.000 That's V-E-R-I-T-A-S Tips at ProtonMail.com.
01:20:48.000 It's encrypted.
01:20:49.000 Hopefully the NSA is not intercepting that email.
01:20:53.000 And that Richard came to us and Richard lost his job, Tim.
01:20:57.000 Richard was fired from the Postal Service.
01:21:00.000 First he was suspended, then he was fired.
01:21:01.000 By the way, I think that's important because he has skin in the game.
01:21:06.000 He's an actual person who is going to lose his career for the public's right to know something.
01:21:12.000 He's not just saying something, he's giving up his livelihood.
01:21:16.000 Here's why I say what you do is so important, and actually why what Richard and others have done is even more important.
01:21:22.000 I mean, for one, you can't do your job unless there's brave people willing to stand up and work with you to get that information out.
01:21:28.000 More importantly, those brave people who are standing up are creating this atmosphere where these big shots, these fat cats, these corrupt individuals now have to keep looking over their shoulders wondering, who's going to say, you can't do that?
01:21:39.000 I'm essentially going to blow the whistle.
01:21:41.000 It's like you were mentioning earlier, people at Facebook now have to worry, who's the person there who's going to share what they're saying?
01:21:47.000 Luke, when they go to look at DMs, which is unconscionable and unethical for someone to browse private DMs, in the back of that, whatever, engineer coder's mind is, wait, before I look at the DM, should I be worried if someone's looking at me?
01:22:03.000 That psychological effect.
01:22:05.000 Exactly and this is why they don't have any transparency or accountability to their actions because if they did people would know what they can and cannot say.
01:22:14.000 They don't want you knowing it because they want this fear effect.
01:22:17.000 They want this chilling effect where you have to worry about your butt and worry about what you're thinking about, because when you're censoring words, you're censoring what people can think, essentially. And we also have to understand, on the bigger point here: these are not private capitalistic entrepreneurship organizations.
01:22:33.000 You look at the seed funding, look where they came from, look right now at their tax incentives, the visa programs, their data sharing surveillance programs, the government contracts that they receive.
01:22:42.000 These are entities that are in line with a lot of government institutions that are working hand in hand together.
01:22:48.000 In many ways, government is downstream from these institutions.
01:22:50.000 Entities that are private companies that pick and choose? They're not that at all. They are monopolies, at least.
01:22:58.000 That's a great observation.
01:23:02.000 I was thinking as we were talking that there could be some software developer that's like, I'm going to blow the whistle on Facebook. I'm going to deliver all the code to Project Veritas.
01:23:11.000 That wouldn't work, because unless we use the government to regulate these people and to actually change the software license and do it right, you could get rid of the code and they can just change it.
01:23:23.000 So we have to acknowledge that these guys are basically quasi-government organizations and should be treated as such.
01:23:31.000 Yeah, well, for the time being, I mean, things are demoralizing.
01:23:36.000 Yeah, I want to leave Twitter.
01:23:37.000 I don't want to leave Twitter, but when I see the way they're treating you, like, this ridiculous... Well, I can tell you right now there's someone watching this who does work for Twitter or Google or Facebook, I guarantee it.
01:23:47.000 And we have so many sources and whistleblowers coming to us right now, it's beyond what you can fathom.
01:23:52.000 I mean, we have untold dozens, over a hundred whistleblowers right now, recording, watching.
01:23:59.000 I'm talking about the CEOs of these companies.
01:24:01.000 And if you're watching this right now, just like Richard Hopkins a few months ago, send us a note.
01:24:08.000 I think the solution is exposure.
01:24:09.000 I think they don't fear the United States government anymore.
01:24:12.000 All these people, these politicians, almost like they suck up to CNN.
01:24:17.000 All they ever do is go on CNN.
01:24:19.000 I don't even know what these Congress critters actually do.
01:24:22.000 What do they actually do?
01:24:23.000 Because all I ever see them doing is going on Instagram, which is owned by Mark Zuckerberg, and going on television.
01:24:30.000 Do they actually create any laws?
01:24:32.000 I guess they don't because government is in a state of sclerosis, so they can't pass legislation.
01:24:36.000 So all they do is go on tech platforms.
01:24:38.000 They are beholden to tech.
01:24:41.000 They suck on the teat of big tech.
01:24:43.000 Where would AOC be without Twitter and Instagram?
01:24:45.000 Nowhere.
01:24:46.000 Just like Alexander Solzhenitsyn said, the media has more power than all three branches of the government.
01:24:54.000 And what laws are they beholden to when they say, you are not allowed to talk about this or that?
01:25:01.000 It is akin to the United States Supreme Court changing the Constitution and saying: except when you talk about, you know, applesauce.
01:25:10.000 You wanna know something really crazy that I was reading?
01:25:13.000 Guys, definitely, if you're listening, fact check me on this one.
01:25:15.000 I was reading some post, it was a conservative post of this, that Trump's own lawyer in the impeachment trial believed the Very Fine People hoax up until he had to research the evidence to defend Donald Trump, then realized the media had been selectively editing what Trump had said, and Trump definitively denounced white supremacy after Charlottesville.
01:25:36.000 And well, the media cuts it out of context, like they did with the Shinzo Abe fish feeding thing.
01:25:41.000 And this lawyer didn't know that about Trump until he was investigating because he had to go over the evidence.
01:25:47.000 How many people think they're informed because they read the news?
01:25:51.000 They read what Facebook allows to be shared.
01:25:54.000 And they actually are just being fed garbage.
01:25:56.000 The amount of fake news, disinformation, and propaganda out there is absolutely insane to even fathom sometimes.
01:26:05.000 It's like 1984; two plus two equals five just keeps being repeated.
01:26:08.000 You know that's a real thing though, right?
01:26:10.000 Yes, I know, I know, I know.
01:26:12.000 I've seen everything.
01:26:13.000 I've seen it unfold.
01:26:14.000 But James, the one question I really wanted to ask you here, when you got hit by Twitter, was there any alternative social media platform that you were looking into fully transitioning away from Twitter?
01:26:23.000 So I take a unique position on this because I may take a stance that Tim, you don't agree with, or Luke, you don't agree with.
01:26:31.000 I think that content is king.
01:26:33.000 I believe in distribution by proxy.
01:26:36.000 In other words, if they ban me everywhere, which everyone's warning me about, okay, whatever.
01:26:40.000 They ban me everywhere and I get a hidden camera video of a federal judge taking a bribe.
01:26:45.000 That's a 30 second video.
01:26:48.000 I would simply send an email to my 500,000 people on my email list saying, please download this video and upload it to your Twitter.
01:26:55.000 They can't ban everyone.
01:26:56.000 So content is king.
01:26:58.000 Content, not platform, is king.
01:27:00.000 I've always believed that.
01:27:01.000 Everyone wants to create another platform.
01:27:04.000 I thought this in the back of my head about Parler.
01:27:06.000 By the way, Parler got banned.
01:27:07.000 OK, we're going to create another Parler.
01:27:09.000 Parler's back.
01:27:09.000 Parler's back.
01:27:11.000 And then I saw Parler banned Milo today.
01:27:13.000 And now he's back.
01:27:14.000 Oh, he's back.
01:27:15.000 OK, well, there's all these updates about the platform.
01:27:18.000 But at the end of the day, content is king.
01:27:20.000 I didn't coin that phrase.
01:27:22.000 That was the Viacom CEO.
01:27:24.000 I forgot his name.
01:27:25.000 But I believe that what was going through my mind, Luke, was that content is king.
01:27:31.000 A good story will effuse its way and distribute its way, sort of like the black goo in the movie Prometheus.
01:27:39.000 It will just sort of get out there and there'll be a new paradigm.
01:27:43.000 I don't know what the name of the next or the future platform will be.
01:27:46.000 But what I do know is a good story is always lurking behind every corner.
01:27:51.000 And again, distribution by proxy.
01:27:53.000 So, Telegram... Wow.
01:27:54.000 Mm-hmm.
01:27:54.000 We had 4,000 followers a couple weeks ago.
01:27:56.000 Now we have 350,000 followers.
01:27:59.000 And what we'll do is we'll send a Vimeo link and say, please download this clip and upload it into your Twitter.
01:28:06.000 So if we're banned on Twitter, citizens can do that.
01:28:09.000 What are they going to do, ban everyone?
01:28:12.000 They can't do that.
01:28:13.000 This is the funny thing.
01:28:14.000 You know, when TikTok had a bullying problem, there were people being mocked for their appearance or their weight.
01:28:21.000 They realized something.
01:28:23.000 Whereas Twitter bans the hate speech to protect the minority, TikTok realized: well, if we want to stop the bullying and we ban, say, a thousand people bullying this person, that's a thousand users we lose in our annual reports for shareholders.
01:28:38.000 Why don't we ban the person being bullied?
01:28:41.000 Then there's no more bullying!
01:28:42.000 So TikTok actually targeted the person who was a victim of the insults.
01:28:47.000 That was clever.
01:28:48.000 I bring that up just because, in the instance of Twitter: well, they can't ban everybody, because then they're gonna have to issue a report saying, we lost X many users this quarter, oh, we banned them because we didn't like what they had to say.
01:28:59.000 Yeah, and they can't ban us today, because if they ban us for the Zuckerberg story, every reporter that talks about it will have to say, there's no doxxing here, it's Zuckerberg on tape.
01:29:08.000 You know what the problem is right now, though?
01:29:10.000 It's not fun anymore.
01:29:12.000 It's not what?
01:29:12.000 It's not fun.
01:29:13.000 Which part?
01:29:14.000 Twitter used to be fun.
01:29:15.000 Oh, it's terrible.
01:29:17.000 But they're getting rid of interesting people.
01:29:19.000 Seeing a Trump tweet was hilarious.
01:29:21.000 You know, the journalists would have panic attacks when Trump would tweet.
01:29:23.000 That's weird.
01:29:24.000 But I thought it would be funny when Trump would tweet something really silly, you know, like, not that he should have most of the time, but sometimes he would tweet funny and silly things.
01:29:33.000 What did he tweet?
01:29:34.000 Oh yeah, when Elizabeth Warren was drinking the beer, remember this?
01:29:37.000 And her husband walks up.
01:29:39.000 And then she's like, thanks for being here.
01:29:41.000 Trump tweets.
01:29:41.000 What do you mean?
01:29:42.000 He's supposed to be there.
01:29:43.000 It's his house.
01:29:44.000 That was hilarious.
01:29:45.000 It made it exciting.
01:29:47.000 They're getting rid of anything that's even remotely interesting on the platform.
01:29:50.000 And if they ban us today for publishing a videotape of Zuckerberg, every reporter who reports on the ban will have to include the facts about what Mark Zuckerberg said.
01:30:00.000 That's why they won't ban us.
01:30:02.000 Tim, I think there's a lot to this whole litigation.
01:30:04.000 That's a good one.
01:30:05.000 I mean that.
01:30:06.000 My producer just texted me, that's genius, Tim.
01:30:09.000 Helping others with a litigation fund.
01:30:11.000 That's a really important idea.
01:30:13.000 Because so many people get defamed.
01:30:16.000 And we have a recourse in the theory of defamation law.
01:30:21.000 Especially if they're not public figures.
01:30:23.000 And I'm a public figure, I have to prove what's called actual malice.
01:30:26.000 I have to prove the person knew they were lying about me when they lied.
01:30:30.000 But a non-public figure, Tim, if you're railroaded?
01:30:33.000 Discovery.
01:30:34.000 This is a really good idea.
01:30:36.000 I think the reason people haven't done this is because they don't know they can do it, and it takes, I call it, balls, resources, and willpower.
01:30:46.000 Balls or huevos.
01:30:48.000 Fortitude.
01:30:49.000 Huevos.
01:30:50.000 Cojones.
01:30:51.000 Huevos.
01:30:51.000 That's the politically correct terminology.
01:30:53.000 Eggs.
01:30:55.000 Eggs.
01:30:55.000 And I think it's a great idea.
01:30:57.000 I think it's an excellent idea.
01:30:58.000 You can call it the People's Defamation Fund PDF.
01:31:00.000 Yeah, PDF files.
01:31:01.000 PDF.
01:31:02.000 And people already, you know, it's already in the back of their mind.
01:31:04.000 PDF.
01:31:04.000 They know what it is.
01:31:05.000 Well, they say the idea is 1% of it.
01:31:06.000 99% is the execution.
01:31:06.000 But still.
01:31:10.000 I think if you launched, not a GoFundMe, because they would ban you in two seconds.
01:31:13.000 They'd ban you, yeah.
01:31:14.000 But if you launched any one of these other fundraisers, you'd instantly raise enough to just hire an executive manager or whatever to run the fund.
01:31:23.000 Listen, I'm going to say it on your show right now.
01:31:25.000 If we get the New York Times to give us money, which I'm betting my reputation on, it's going to happen.
01:31:31.000 We're going to use some of those proceeds to start this idea that you have.
01:31:35.000 I like this idea.
01:31:36.000 You mean the idea that I brought up?
01:31:38.000 The idea that you had.
01:31:39.000 It doesn't matter.
01:31:40.000 I named it!
01:31:41.000 But another thing, I mean, I remember starting off when I was still just, you know, no one, not a big reputation, getting slandered and attacked by the mainstream media.
01:31:50.000 And also other instances where they literally took my videos that I did, my live reporting on protest, and they claimed it as theirs.
01:31:58.000 So many times.
01:31:58.000 Didn't they call you, like, a non-shrinking violet or something?
01:32:01.000 Yes, that was the New York Times.
01:32:02.000 What was that quote?
01:32:03.000 That was in the, uh... Luke Hradowski is armed with a video camera and a YouTube channel and definitely is not a shrinking violet.
01:32:10.000 Nice.
01:32:11.000 That was the New York Times perspective.
01:32:13.000 And that was actually, you know, fair reporting on their part, which is really interesting because they actually quoted me correctly when other news organizations literally took quotes out of context, put them together to make them sound bad, or as if I was pushing for some kind of violent action, which I, of course, never was.
01:32:28.000 I believe in total nonviolence.
01:32:30.000 Always have.
01:32:30.000 Always have advocated for that.
01:32:32.000 But for them, especially when I was starting off, especially when I didn't have a big following, they would just take everything I did.
01:32:38.000 Take my live reporting, take my photos, take my videos, claim it as theirs.
01:32:42.000 And then when I became prominent, they started slandering me.
01:32:44.000 And I'm like, what's going on here?
01:32:45.000 I actually started talking to some lawyers and let's just say they don't steal my videos.
01:32:50.000 They don't steal my photos anymore.
01:32:52.000 But a lot has changed in the last seven, eight years.
01:32:54.000 The world has changed.
01:32:55.000 The New York Times went real bad.
01:32:57.000 Yeah, we were talking about this yesterday with Will, because even with that New York Times weapons of mass destruction lie, there is still some kind of understanding that the New York Times tries to tell the story.
01:33:10.000 And even then, when they got caught with the WMDs, they admitted it, they talked about it, there were some repercussions behind it.
01:33:18.000 Yeah, they faced some real actions because of that.
01:33:23.000 Still, not enough compared to the lie that they sold the American public, the hundreds of thousands of people that died, and the radicalization and immigration crisis that came out of it.
01:33:33.000 But there was still this kind of veiled understanding that they're trying, that they made a mistake, that they're going to do their best.
01:33:41.000 They even interviewed me a couple times and they did honest, real reporting, but now all of that's gone.
01:33:46.000 That whole perception, that whole understanding is gone; they're not even trying anymore.
01:33:50.000 They're blatantly slapping the American people upside the face.
01:33:54.000 They're obsessing about race.
01:33:55.000 They're obsessing about all this other stuff that divides and conquers us so we fight each other and we don't focus on the real issues, the actual things that do affect us, especially after 2012.
01:34:05.000 Their decline is evident and they're not the same New York Times that they're known for.
01:34:09.000 Let's jump to Super Chats, and, you know, normally I like to read as many as possible, but I'm gonna try and find as many questions for James as possible, you know, because a lot of the comments are just general, you know, comments on stories, and forgive me if you guys- I will read a bunch of Super Chats, don't get me wrong, but I think it's a good opportunity for people to get questions in for James if they haven't had the chance to.
01:34:29.000 I'll start with one of the most important.
01:34:31.000 Aurora Isabella says, this is a statement, James O'Keefe is hot AF.
01:34:36.000 With a sweating face.
01:34:37.000 That was very important to let you know that.
01:34:39.000 I was going to say about that.
01:34:40.000 You know, maybe I should... No, I'm not going to say it.
01:34:42.000 Modeling shots.
01:34:43.000 No, I was going to say... No, I'm not going to say it.
01:34:46.000 Riley Luan says, James, how can I help?
01:34:49.000 I've got media skills, CGI, but no real revenue source at the moment, and I'm certainly not afraid of speaking out online or in real life.
01:34:56.000 How can I best help to take these corrupt a-holes down?
01:34:58.000 So Prager says there's three sorts of people in this life.
01:35:04.000 Those who fight, those who donate to those who fight and those who do nothing.
01:35:09.000 So I suppose you're not the third, so you can either wear a camera, strap a camera to your body.
01:35:15.000 Tim, you can, um, what's her name?
01:35:17.000 What's the person?
01:35:19.000 What's the person who asked the question?
01:35:20.000 Oh, I, you can either put a camera on your body.
01:35:23.000 You can, you can find someone in your network who works for an organization that wants to blow the whistle to be a citizen reporter.
01:35:28.000 Riley Luan.
01:35:29.000 Riley, or you could donate a tax exempt donation to Project Veritas.
01:35:33.000 Those are your two options.
01:35:34.000 Pick one.
01:35:35.000 Very cool.
01:35:36.000 Colin Stevens says, Tim, please recommend James reach out to Nick Rekieta of Rekieta Law.
01:35:42.000 He is also a YouTuber and has talked about building a support network with lawyers for situations like this before.
01:35:46.000 Nick Rekieta?
01:35:48.000 I think that's a guy whose meme was banned.
01:35:50.000 Rekieta?
01:35:51.000 Meme?
01:35:51.000 Yeah.
01:35:51.000 How do you spell that?
01:35:52.000 That's not the... How do you spell the last name?
01:35:55.000 R-E-K-I-E-T-A.
01:35:56.000 Alright.
01:35:57.000 This is a YouTuber.
01:35:58.000 Sent to my team.
01:35:59.000 Thank you.
01:36:00.000 Yeah.
01:36:08.000 Screenshot. Tim, this is the same thing with your live stream title. James is the new Alex Jones.
01:36:12.000 I mean, listen, if that happens, videotape your screen. I mean, I get messages like this all the time. Some of them are bunk. Some of them are real.
01:36:20.000 I mean, I'm sure it happens.
01:36:22.000 Again, what's more interesting is a hidden camera video of the engineer or the coder in real time eating Doritos, high-fiving each other.
01:36:31.000 Let's mess with James O'Keefe's Instapage.
01:36:33.000 That's the video I want.
01:36:35.000 Or stopping us from going... Or looking at the... This is not me.
01:36:38.000 I'm quoting somebody.
01:36:39.000 The dick pics on Twitter.
01:36:41.000 Remember that guy, Clay... Clay Haynes from Twitter?
01:36:45.000 Clay Haynes from Twitter.
01:36:46.000 We got him on hidden camera in San Francisco bragging about, yeah, we look at people's dick pics on the DMs.
01:36:52.000 I mean, that was extraordinary.
01:36:54.000 I want to see those videotapes.
01:36:56.000 The NSA was doing that before it was cool, though.
01:36:58.000 They were doing that over 10 years ago.
01:37:00.000 But we need the videotape of the NSA people doing it.
01:37:04.000 Well, NSA officers got caught spanking it to people's private images.
01:37:08.000 I confronted General Hayden about this.
01:37:10.000 He denied it, and I had to confront him with the actual news article, and then he ran away.
01:37:15.000 Is Hayden the one that lied under oath about the Fourth Amendment stuff?
01:37:17.000 It's hard to know because all of them lie under oath, especially if they're former CIA.
01:37:22.000 And we have Mike Pompeo bragging about it.
01:37:24.000 One of them, like, scratched his head, like, under oath.
01:37:26.000 James Clapper?
01:37:27.000 Clapper.
01:37:28.000 It was like, this guy's a spy, and he's, like, the worst liar ever.
01:37:31.000 He's like, not wittingly.
01:37:33.000 We don't spy on people.
01:37:35.000 If you're gonna lie, at least try to, like... Well, Mike Pompeo says they get taught how to lie, steal, and cheat.
01:37:40.000 That was the worst lying ever.
01:37:42.000 It's like a spook.
01:37:43.000 Like, I did not spy.
01:37:46.000 I mean, at least... I mean, these are bad spies.
01:37:48.000 Think about the amount of viewership we've lost because we couldn't include your name in the title.
01:37:54.000 The amount of viewership you lost because you couldn't include my name?
01:37:57.000 In the title.
01:37:58.000 We put the name of our guests in the title of the show so people know.
01:38:02.000 Because often, you know, when we have, say, Jack Murphy or Will or, you know, or James O'Keefe, there are a lot of people who maybe wouldn't normally watch the show but want to see this particular perspective.
01:38:11.000 When we tried to stream with your name in it, it didn't work.
01:38:15.000 When I changed the title, removing your name, it worked.
01:38:17.000 Now the title is, YouTube is giving us the business, won't let us stream.
01:38:21.000 A lot of people are clicking to see what's going on, and they're going, oh, whoa, it's James!
01:38:24.000 Oh, man.
01:38:24.000 Imagine if they got an email notification saying, James O'Keefe on the show.
01:38:28.000 You'd get like times 40, 50 percent.
01:38:31.000 Times 80 percent.
01:38:32.000 All the people who want to hear what you had to say.
01:38:34.000 And it's not just about you, James.
01:38:35.000 It's about any other guest we would have where for some reason it's not allowing us to do it.
01:38:39.000 Now, I'll tell you this.
01:38:40.000 Is it possible?
01:38:42.000 Crazy old glitch!
01:38:42.000 It is possible.
01:38:43.000 Just a glitch, you know.
01:38:44.000 Yeah, well, I'm sorry.
01:38:45.000 I'm at the point in my life where the glitches always flow in one direction.
01:38:49.000 Never go full conspiracy.
01:38:51.000 No, no, so what I'm saying is, at a certain point, you want the explanation that makes the least amount of assumptions.
01:38:58.000 Someone correct me on Occam's razor.
01:39:00.000 What is more likely?
01:39:01.000 Well, when we know they actively censor conservatives, when they literally just censored, suspended Project Veritas on Twitter for ridiculous BS reasons, and we know they did, and then I tried to have James on the show with his name in the title, and it doesn't work!
01:39:15.000 I think it's simpler to say glitches happen, though, but they only... come on, man.
01:39:22.000 At a certain point, there are more assumptions, and it just so happens that James is being censored at the exact same time.
01:39:28.000 They're not letting our stream go live.
01:39:29.000 I'd get messages like, these glitches only happen when it's me doing it. But you know how many people are having that experience?
01:39:35.000 Like, yeah, it only happens to you.
01:39:36.000 The truth is very powerful.
01:39:38.000 St. Augustine said the truth is like a lion.
01:39:40.000 I'd love to hear more, more questions if you have them.
01:39:43.000 Oh yeah.
01:39:43.000 We got more.
01:39:44.000 Oh yeah, yes, but I've got, you know, I'll have to go through and find them all.
01:39:46.000 Sure.
01:39:46.000 All right, let's see.
01:39:48.000 Have to go through them.
01:39:51.000 So I'll just, I'll grab some in the meantime as we scroll through, because there's a lot.
01:39:55.000 I don't want to leave people out.
01:39:56.000 Milton Bradley said, I'm sorry, Elusive Gator says, Tim, would you be willing to invite Don Jr., Trump's son, on your show?
01:40:02.000 Would you invite him?
01:40:03.000 Yes.
01:40:03.000 That'd be fun.
01:40:04.000 100%.
01:40:05.000 Whenever he would like to come on.
01:40:07.000 A real fact-checking website?
01:40:08.000 Hi Tim, could you and your crew please start a real fact-checking website?
01:40:12.000 We are absolutely planning on doing that, but I'm also curious if you guys are planning on doing that, James.
01:40:18.000 Any kind of...
01:40:19.000 A real fact-checking website?
01:40:21.000 Or just general reporting. I know you guys do investigative work.
01:40:24.000 Yeah.
01:40:24.000 But what about...
01:40:25.000 I think we have to focus on our sort of mission, which is to be the answer to the question, what can I do?
01:40:33.000 I was talking to a very popular host of a major news television and he said, Everyone asks, what can I do?
01:40:42.000 And I feel like that's my purpose in life, is to be the answer to that.
01:40:46.000 So I have to keep focused.
01:40:48.000 I don't want to lose my focus and be a fact-checking of everything.
01:40:52.000 Well, that being said, we have another question.
01:40:53.000 Stan T00 says, James, love your work.
01:40:56.000 Curious, has any leftist submitted stories versus conservative parties to Project Veritas?
01:41:01.000 Keep up the great work.
01:41:02.000 If someone's doing something wrong or corrupt, I'll publish it.
01:41:04.000 In terms of exposing non-conservatives?
01:41:06.000 I think he's saying, have you gotten leftists trying to expose conservatives or people on the right?
01:41:11.000 I don't know.
01:41:12.000 If someone's doing something wrong or corrupt, I'll publish it.
01:41:15.000 I did a story on a Republican ballot harvester in Texas, Raquel Rodriguez, who was arrested by the attorney general.
01:41:25.000 And in New Hampshire, with a Republican Attorney General, we exposed someone who voted multiple times.
01:41:29.000 So at the end of the day, there was the Jeffrey Epstein clip that we got from ABC News owned by Disney.
01:41:35.000 And this was a leaked recording of Amy Robach a year and a half ago.
01:41:39.000 And this was a non Republican.
01:41:41.000 So I think Veritas over time will expand in many directions.
01:41:44.000 I love the idea when they claim that you target leftist groups, and you've investigated Google, The Washington Post, The New York Times.
01:41:52.000 Sort of a tacit implication these institutions are leftist.
01:41:54.000 You only go after left-wing groups like The New York Times and The Washington Post, OK?
01:41:59.000 Well, there's your answer, folks.
01:42:01.000 Yeah, yeah, yeah.
01:42:02.000 All right, Michael Smith says...
01:42:05.000 With all these barriers to open discussion, lockdowns and censorship, what do you see we can do to help affect public opinion?
01:42:11.000 I have thought about getting more involved in local government.
01:42:13.000 Thoughts?
01:42:14.000 Keep it simple.
01:42:14.000 I am a gorilla.
01:42:15.000 Much love.
01:42:17.000 Gorilla?
01:42:18.000 I thought you put this because we're guerrilla journalists.
01:42:21.000 I didn't know if that was a pun.
01:42:22.000 Alex Jones created a meme.
01:42:24.000 Coincidental.
01:42:25.000 Skin in the game is a term that my colleague Eric Cochran used, who blew the whistle on Pinterest.
01:42:31.000 Eric Cochran said, I have to have skin in the game, because we only live our lives for so long.
01:42:37.000 Life is short.
01:42:38.000 My advice to you, sir, is to get involved locally and to be an investigative reporter on the local level, as Luke has done.
01:42:50.000 It'll mean something after you're gone.
01:42:52.000 And I think we don't realize, you know, and I think people are beginning to wake up and understand that we have to be the media.
01:43:00.000 Stop complaining about how biased they are and just go out and do their jobs.
01:43:05.000 I mean, when I did the ACORN story ten years ago, I walked in, in Baltimore, Maryland.
01:43:12.000 My colleague, Hannah, was dressed like a Miami hooker.
01:43:15.000 She was 20 years old.
01:43:16.000 And within minutes, these pseudo-government agents were telling us how to disguise underage hookers as dependents on our tax forms.
01:43:24.000 It was amazing how easy it was to just get people to just talk.
01:43:29.000 Just be a journalist.
01:43:31.000 Go out and do it.
01:43:33.000 You don't need that much training.
01:43:35.000 Just take a camera and go start asking questions.
01:43:37.000 And Jon Stewart sang your praises, saying, journalists, where are you?
01:43:41.000 Look at this kid.
01:43:42.000 Look what he's doing.
01:43:43.000 And the United States Senate voted 83 to 7, in a Democratically controlled House and Senate, to defund ACORN.
01:43:48.000 That was 10 years ago.
01:43:49.000 And this is a girl that messaged me.
01:43:52.000 Hannah messaged me on Facebook.
01:43:53.000 I've never met her before.
01:43:54.000 She said, James, you think it'd be a good idea to go into ACORN dressed as a hooker?
01:43:58.000 Now most people get these sorts of DMs.
01:44:01.000 Slide into the DMs.
01:44:03.000 Slide into the DMs.
01:44:04.000 They delete them as spam.
01:44:05.000 I said, that's a great idea.
01:44:07.000 There should be a pimp in the situation.
01:44:09.000 And you know, be a citizen journalist.
01:44:10.000 Go out there and just go do it.
01:44:14.000 Robert Bettle says, he who controls the feed controls the world. I propose forcing by law the algorithm to be open source, and the choice of which algorithm and third-party filter service be moved to the end user.
01:44:26.000 I think it gets really, I don't know, esoteric when talking about the code and the source and how we deal with that, and that's one of the challenges.
01:44:36.000 They rely on this not only opaque system, but even if they did release the code, most people are gonna look at it and say, I have no idea what that is.
01:44:43.000 See, I have a different perspective a little bit than you.
01:44:45.000 I think the solution is for the CEO or the Vice President of these companies to come out and say, we are trying to elect Democrats.
01:44:54.000 If they just said that, I'd be happy.
01:44:56.000 I wouldn't be happy.
01:44:57.000 It'd be a start.
01:44:59.000 It'd be a good start.
01:45:00.000 Acknowledgement.
01:45:01.000 Because if the code, or whatever it is (I'm not an engineer, so excuse my ignorance), but whatever it is that we're talking about here, somewhere in that code shows them targeting, shadow banning.
01:45:12.000 Let's say, Tim, we had a San Francisco engineer at Google quite literally writing code as we speak, targeting this podcast.
01:45:21.000 And he was under oath in Congress, and he said, you know, Congressman Smith, I targeted Tim Pool's podcast, and here's the code showing me doing it because I hate Tim Pool, I want to destroy James O'Keefe, and I want to elect Democrats.
01:45:35.000 Thank you very much.
01:45:36.000 I'd be very happy with that.
01:48:37.000 My problem with relying on the CEOs, the executives, is that... it's not a good and evil thing for me.
01:45:42.000 It's a thing about, it's justice.
01:45:44.000 Because if, if, if you rely on the CEO to do the right thing and reveal themselves, they're going to sell the company.
01:45:50.000 And then you're going to have to rely on another human and another human, and that allows for corruption to breed.
01:45:56.000 I just think the system should be transparent, and not even open source, but free software, like the GPL, the General Public License, or the MIT license.
01:46:04.000 So that if changes are made to it, those changes are also free and open.
01:46:08.000 The challenge, James, with what you were saying about the guy saying, like, here's the code, is that they're not going to say that.
01:46:13.000 I mean, I'd love it too.
01:46:14.000 They're going to do what we've already seen these big tech companies do.
01:46:17.000 There is no bias.
01:46:19.000 We do not ban people for public opinion, you know, for their political opinions.
01:46:22.000 And then they just say, we only ban for rule violations.
01:46:27.000 And our rules are extremely specific and politically motivated.
01:46:30.000 Do you think it's a cognitive dissonance?
01:46:32.000 Do you think they've convinced themselves?
01:46:33.000 Or do you think secretly?
01:46:34.000 I know it gets brought up so often, but the Joe Rogan episode with me, Joe, and Jack, when Jack didn't understand, when I told him to his face, your rules are biased against conservatives.
01:46:48.000 He said, we don't ban people for being conservative for their political opinions.
01:46:51.000 I said, yes you do, because your rules are biased against conservatives.
01:46:54.000 So this is from the Orwell book, 1984, where their cognitive dissonance is to such a degree that they've actually convinced themselves not to internalize or even think of the word doublethink, because in doing so, they're admitting they're tampering with reality.
01:47:08.000 So you're saying psychologically, even when they tuck themselves under the covers at night, they don't individually and personally think that they're helping Democrats?
01:47:17.000 Is that what you're saying?
01:47:18.000 Right.
01:47:18.000 So in the instance of Jack Dorsey, it was a really fascinating revelation when I told him You have a policy against misgendering people.
01:47:26.000 To conservative Americans, which is about half the country, a little bit more or less: they believe that if someone is born biologically male, you must use he, him pronouns.
01:47:36.000 Whereas the left believes it's based on identity.
01:47:39.000 Those are two different worldviews.
01:47:41.000 Your rule set penalizes the conservative worldview.
01:47:45.000 Therefore, A conservative who expresses their worldview will be banned from your platform.
01:47:50.000 Your rules are biased against the conservative worldview.
01:47:53.000 And it was like the first time he had ever heard that.
01:47:55.000 Like, he didn't realize.
01:47:57.000 In their minds, 99.9% of Americans hate Donald Trump, are all progressives, and it's just this silly fringe far right. Oh, those dang far right.
01:48:07.000 How did 75 million people vote for Donald Trump? They were tricked by lies. That must be it.
01:48:12.000 You know the principal Skinner meme? Am I out of touch? No, it's the children who are wrong.
01:48:17.000 That's them. They don't get it no matter how many times you tell them.
01:48:21.000 That's straight out of 1984.
01:48:23.000 That's out of the manual, the Emmanuel Goldstein manual that George Orwell hypothesized, which is to believe that black is white, and then to change your mind, and never to have previously thought that black is white.
01:48:37.000 And you can't even talk about doublethink.
01:48:40.000 It's a strange psychosis then, Tim, if what you're saying is true.
01:48:44.000 I don't know if I agree with you.
01:48:45.000 I think there's a substantial minority of these engineers and folks in Silicon Valley that have an intentionality behind how they code, that do have an agenda, and that would privately admit this.
01:48:55.000 No, definitely.
01:48:56.000 I think that there's a substantial minority, and therefore the mission is to expose that intentionality.
01:49:01.000 But I think you're also right that there's also a substantial minority of people who are like the Goldstein manual in Orwell's 1984, and they have this cognitive dissonance where they don't even concede the notion of doublethink, and they've convinced themselves through some type of strange psychosis.
01:49:19.000 Do you know what happens when someone is experiencing cognitive dissonance and you present absolute proof that their worldview is incorrect, or at least a portion of it?
01:49:27.000 You know what their reaction is?
01:49:29.000 Blind rage.
01:49:30.000 This is why you see those videos of the famous memes of people, like, their veins are popping out and they're screaming like that.
01:49:36.000 And that's why you see so often the meme of the cool-as-a-cucumber guy, you know, poking fun at them.
01:49:42.000 The person who is not emotionally agitated, who has thought through their problems, who has thought through the problems and looked up the information, has the arguments right at hand, for the most part.
01:49:51.000 I don't think every single person on the right has all the answers.
01:50:08.000 But you'll see a lot of these ideologically driven, woke, culty, leftist-type individuals, pro-censorship, experiencing the cognitive dissonance of trying to claim they're for free speech while simultaneously being in favor of censorship; they can't handle it.
01:50:19.000 They make up... like, there's that famous comic where the guy cites the paradox of tolerance, which is a paradox itself; the comic itself makes literally no sense because it contradicts itself.
01:50:19.000 They can't see through this.
01:50:21.000 Eventually, when they're confronted by information that proves it, they snap, they get angry, they get violent.
01:50:25.000 They get rage.
01:50:27.000 Right.
01:50:27.000 It's a defense mechanism.
01:50:28.000 And when you have a social media admin that's feeling rage, that's when the bans happen.
01:50:33.000 Exactly.
01:50:34.000 So, the way it was explained to me, and this could be absolutely wrong, it's been a long time, it was explained to me by some psychologist, I think.
01:50:42.000 What they basically said was, you grow up, you know, you're a little kid, you're a preteen, you're a teenager, you're a young adult, And you're building a worldview.
01:50:52.000 You're, in your mind, you are determining what is true and what isn't.
01:50:55.000 By the time you're, you know, getting into your late teens, into your early twenties, your brain has sort of now constructed a solidified worldview that is nearly complete.
01:51:04.000 At a certain point, your brain says, this is true, it must be true, because you've survived for this long.
01:51:11.000 Whatever it is that you've learned has helped you survive in this dangerous, treacherous world.
01:51:17.000 So it is dangerous to have those ideas and that worldview challenged, because if it turns out you're actually believing something that could be dangerous for you, it could put your life at risk.
01:51:26.000 The simplest way to explain it is, back when humans were running through the fields in the savannas, we learned, hey, you know, fire hot.
01:51:33.000 And your brain builds this worldview that fire is hot.
01:51:35.000 People believe these things, and their social circles, they basically solidify and fortify this worldview.
01:51:42.000 Once you get to a certain age, you need to maintain that worldview because it will keep you alive.
01:51:48.000 All of a sudden, someone comes around and completely shatters that worldview, putting your life at risk.
01:51:52.000 It's a defense mechanism where, by getting emotionally enraged, it shuts down your ability to process the information to protect what has kept you alive for this long.
01:52:01.000 It's essentially a failsafe that actually backfires in the long run.
01:52:05.000 I think what concerns me, and I would love to hear your opinion on this, is that people are, for the most part, lacking critical thought.
01:52:12.000 Maybe not all, and I know we can learn critical thought, but even people that are massively intelligent developers or fantastically wealthy business tycoons lack critical thought. Even if someone challenges your worldview and says something you think is threatening your safety, you're still supposed to support their ability to do that under the U.S. Constitution.
01:52:33.000 I go back to Orwell constantly because I feel like this book answers so many of these questions.
01:52:38.000 And what scares me about the Orwell book is when Winston, the protagonist, Winston Smith, is being tortured by O'Brien, the tyrant.
01:52:47.000 And every one of us has got fears.
01:52:49.000 Imagine your worst fear in the world.
01:52:50.000 I'm not going to say to my enemies what it is.
01:52:53.000 Is it bees?
01:52:55.000 What?
01:52:55.000 Is it bees?
01:52:56.000 Close.
01:52:58.000 You're on the right track.
01:53:00.000 And the tyrants have figured out that they have Winston.
01:53:04.000 They've put a cage on his face and they put a rat.
01:53:06.000 The biggest fear of Winston is a rat.
01:53:08.000 And he says, 2 plus 2 equals 5.
01:53:09.000 Please tell us that 2 plus 2 equals 5.
01:53:11.000 And Winston says, no, no, 2 plus 2 equals 4.
01:53:13.000 And the tyrant opens up the door and the rat's coming towards his face.
01:53:17.000 And finally, Winston says, OK, 2 plus 2 is whatever you want it to be.
01:53:22.000 Two plus two is whatever you want.
01:53:23.000 Just don't put the rat in my face.
01:53:26.000 None of us are that strong.
01:53:28.000 None of us are that... I don't care.
01:53:29.000 Don't tell... Think of what your worst fear is in the whole wide world.
01:53:32.000 Could be, I don't know, spiders, bees, whatever.
01:53:36.000 None of us are that strong.
01:53:37.000 I don't, I don't, I don't agree.
01:53:39.000 I don't, I don't agree with you.
01:53:40.000 There are four lights.
01:53:43.000 Do you know that reference?
01:53:45.000 Do you know the reference, there are four lights?
01:53:47.000 A bunch of people watching are going, yes!
01:53:49.000 It's from The Next Generation, Star Trek.
01:53:52.000 Captain Picard is kidnapped and tortured by the Cardassians, and there are four lights in front of him, and he keeps saying, there are five lights.
01:53:59.000 Now tell me, how many lights are there?
01:54:01.000 And Picard refuses to say, there are five lights, and he says, there are four lights!
01:54:05.000 Was he tortured?
01:54:06.000 Yeah, yeah, he was tortured, deprived of sleep, all of the, you know, yeah, he was tortured, and he's like, disheveled and shaking, and he refused to back down.
01:54:13.000 Well, I mean, this is the question, to go to your point, but yet we live in a society where there's courage, there's the courage to run up a hill with a bayonet.
01:54:25.000 You know, Von Clausewitz talks about two different types of courage, and then there's moral courage.
01:54:29.000 Where the hell is the moral courage in Washington, D.C.?
01:54:32.000 We're not asking people to give up their lives, we're asking them to give up their reputation, and Orwell's hypothesis is that no one is that strong.
01:54:39.000 The protagonist, Winston Smith, is forced to say that 2 plus 2 equals 5, or a rat eats his face, okay?
01:54:46.000 And Winston says, okay, whatever, 2 plus 2 equals 5.
01:54:50.000 And Orwell says to tell deliberate lies while genuinely believing them to forget any fact that has become inconvenient and then when it becomes necessary again to draw it back from oblivion for just so long as it is needed.
01:55:02.000 To deny the existence of objective reality.
01:55:05.000 So these Silicon Valley people Are these tyrants telling us this 2 plus 2 equals 5?
01:55:13.000 It's a great metaphor because I feel like we're being scared through social media into the spirit of race or social rejection.
01:55:23.000 I have people in the conservative movement, I don't even consider myself part of that movement, afraid to retweet me for fear of being what?
01:55:31.000 We're not talking about a rat.
01:55:33.000 Those are cowards.
01:55:34.000 Well, there's a lot of them, my friend.
01:55:35.000 Absolutely.
01:55:36.000 It's like a method to scare people so that they become willing to do that.
01:55:41.000 We're not talking about spiders and bees and rats.
01:55:43.000 No, ostracization.
01:55:44.000 We're talking about the possibility of being censored on Twitter.
01:55:48.000 If that's the thing that they're scared of, we're in trouble, fellas.
01:55:51.000 James, you mentioned that you think nobody has that strength to reject the rat, right?
01:55:55.000 People forget rats and bees and tarantulas.
01:55:58.000 They're afraid of censorship.
01:56:00.000 You do it every day.
01:56:01.000 What?
01:56:02.000 You do it every day.
01:56:02.000 You spit in the eye of the establishment machine every single day, and so do we.
01:56:08.000 You do that.
01:56:09.000 You do that.
01:56:09.000 And you do!
01:56:10.000 I mean, listen, I can come on here and talk about movies sometimes, like I did Gina Carano, Star Wars, you know, and it's kind of wagging the finger at these corporations.
01:56:19.000 They don't care that I'm mad at them because they fired Gina Carano.
01:56:22.000 You put a camera in the faces of these people.
01:56:24.000 So, you know, I think There are people who are willing to say, there are four lights, two plus two equals four.
01:56:30.000 You're one of them.
01:56:32.000 And I think there are a lot of people who are cowards.
01:56:34.000 And the problem is, you know what the thing is?
01:56:36.000 I'll tell you this.
01:56:38.000 I've been having this kind of joking thing.
01:56:40.000 It's sort of a joke.
01:56:41.000 Say one good thing about Antifa.
01:56:43.000 It's kind of a joke about lowering the temperature and stuff.
01:56:47.000 And I'll tell you this, they're brave, they're angry, they're passionate, and they're willing to sacrifice everything to get what they want.
01:56:54.000 And I think they're dangerously incorrect.
01:56:56.000 Too many people on the right are absolutely not willing to sacrifice anything.
01:57:01.000 Why is that?
01:57:01.000 I don't know, man.
01:57:02.000 Because they're scared and they're fat and happy.
01:57:04.000 Because they get to sit back in their lounge chair watching, you know, whatever Hollywood movie they claim to hate, but still get to be a part of something.
01:57:12.000 They've been given enough money to survive and then they're threatened with it being taken away.
01:57:16.000 So they're living in fear so they don't speak up.
01:57:19.000 So I often say, it's not black and white.
01:57:22.000 It's not like, I'm gonna say, we will always defy the machine.
01:57:27.000 When it comes to saying a certain name, which we can't say on YouTube, I've chosen... We're not gonna say it.
01:57:34.000 Why?
01:57:34.000 I prefer to have this show live with James O'Keefe and Ian and Luke and Lydia and we're talking about what's going on and why it's dangerous and important.
01:57:42.000 I could choose to say one word or one phrase and get the whole conversation removed.
01:57:47.000 I won't do that.
01:57:48.000 That's not about fear.
01:57:49.000 That's about tact.
01:57:50.000 Yeah, but James is talking about torture.
01:57:52.000 When you're under duress of torture, none of us have ever been there.
01:57:55.000 We would have told each other if we had been.
01:57:57.000 I think more people are willing to risk their lives than their own reputations.
01:58:01.000 Think about it.
01:58:02.000 How many people in Washington, D.C. are willing to risk their reputation for a cause greater than themselves?
01:58:08.000 I can't think of maybe one or two.
01:58:09.000 I can think of many folks who've made the ultimate sacrifice, sacrifice that is beyond anything I can comprehend, go overseas and die for their country.
01:58:18.000 I can't think of many members of Congress or people working in government that would be willing to make even a sacrifice of their own reputation for a cause greater than themselves.
01:58:29.000 Forget torture.
01:58:30.000 Because they're not interested in values.
01:58:35.000 They're not interested in preserving freedom, liberty, and accountability.
01:58:38.000 Well, that's a problem.
01:58:39.000 It absolutely is.
01:58:41.000 That's the main problem.
01:58:43.000 The left, it's not so much that Antifa has a core set of values they're fighting for, it's that they're extremely angry, and they can be weaponized by the collective because they fall in line.
01:58:52.000 Because they have toolkits, they have these plans.
01:58:54.000 And the right is more individualist.
01:58:56.000 That, and there are a lot of people on the right who are entirely self-interested.
01:59:01.000 And that's why they wouldn't retweet you?
01:59:02.000 That's ridiculous.
01:59:04.000 That to me is sad and pathetic.
01:59:06.000 So you think sacrificing our reputations is?
01:59:11.000 I think, I think this road to truth...
01:59:14.000 I've been doing this for 12 years and I've been through a lot of pain.
01:59:18.000 I've been through moments of ups and downs where I thought it was over.
01:59:22.000 I mean, I've been in federal prison, falsely accused, I mean, of something I did not do.
01:59:28.000 And everyone abandoned me in that moment.
01:59:30.000 No, everyone thought I was done.
01:59:32.000 James, you're over, including my mentor.
01:59:35.000 So I know what it's like, and I know that the road to truth and justice, you're going to go through this, and I don't think people are willing to.
01:59:45.000 And political prosecution, and to be tarred and feathered, in the psychology of the American people, is almost worse than death.
01:59:58.000 Because I can see people with the courage to run up a hill with a bayonet, but a more rare form of courage is moral courage.
02:00:08.000 That is to stick by a cause which is losing.
02:00:10.000 They don't believe in themselves.
02:00:12.000 I don't know why that is.
02:00:14.000 I think that's the question.
02:00:17.000 Because if you can solve that question, you can actually change things.
02:00:19.000 Are you spiritual?
02:00:20.000 I'll tell you what it is.
02:00:21.000 I'm arrogant.
02:00:23.000 I know.
02:00:24.000 I don't think I know everything.
02:00:26.000 But when I do the research and I assess the situation, I look at these political positions and I come to a conclusion.
02:00:32.000 I don't come to it easily, which is why people call me a milquetoast fence-sitter.
02:00:35.000 A milquetoast what?
02:00:38.000 Fence-sitter.
02:00:38.000 Fence-sitter.
02:00:39.000 Yeah, because I often don't take very strong opinions on a lot of political issues because it takes a lot of research and a lot of confidence to assert something to be true.
02:00:47.000 We must have universal health care and ban all private, you know, health insurance I think is a ridiculous position because it's too absolute.
02:00:54.000 But the things I do know that make sense, I am very, very assertive when it comes to these things, and arrogant.
02:01:00.000 So when someone comes out, when every single person, it could be literally everyone in the world, comes out and says, no, we refuse, I'll just say, I don't care.
02:01:08.000 You are wrong.
02:01:09.000 Period.
02:01:10.000 When all these socialists and communists call for centralized economics, I just say, that has never worked.
02:01:17.000 It will not work.
02:01:18.000 We can look to computing to see why a decentralized network works better than a centralized one.
02:01:23.000 You are just wrong on so many levels.
02:01:25.000 I will never back down from that position.
02:01:27.000 Unless, of course, new information changes things.
02:01:29.000 The two hardest moments of my life, or two of them, were number one, my Wikipedia page when I got started.
02:01:36.000 It was very emotional for me.
02:01:38.000 I couldn't change it.
02:01:39.000 It was all lies.
02:01:41.000 And everyone I knew in my life was like, wow, is that really true?
02:01:43.000 Did you really do that?
02:01:44.000 It was all fake nonsense.
02:01:46.000 I remember this is like 10 years ago, a moment of vulnerability.
02:01:50.000 I was tearing up.
02:01:52.000 Looking at my own Wikipedia page, there's nothing I can do about it.
02:01:55.000 And I remember how painful it was to be thought ill of by a website that was the first thing that showed up when you googled my name.
02:02:04.000 It was painful.
02:02:06.000 Number two, a federal judge destroying my videotape in Louisiana and nobody cared.
02:02:11.000 A federal judge destroyed the evidence that would exculpate me in New Orleans and everyone in the media said, good!
02:02:20.000 Off with his head, like the French Revolution.
02:02:23.000 Off with his head.
02:02:24.000 Dave Weigel laughing.
02:02:26.000 Laughing that they had incarcerated a reporter and destroyed federal evidence.
02:02:30.000 And again, too, this was the second hardest moment of my life.
02:02:33.000 And I thought, how can reporters, how can news media think it's a good thing to jail journalists simply because they don't like me?
02:02:44.000 These two moments of pain were for me the hardest things to endure.
02:02:50.000 And it was pain probably beyond physical pain because it was a form of injustice.
02:02:59.000 And as Martin Luther King would say, you know, injustice anywhere is a threat to justice everywhere.
02:03:03.000 And for someone who's motivated by justice to endure that type of injustice was a pain that was almost too much for me to bear.
02:03:09.000 Okay.
02:03:10.000 Do you believe or feel that there's a higher, like a greater force?
02:03:14.000 I think there's something wrong with me.
02:03:16.000 I think there's something wrong with my brain, right?
02:03:19.000 I'm probably not thinking about or motivated by the same sorts of incentives.
02:03:25.000 And I think the team I have, they're also like the guy who climbs up Yosemite without a harness.
02:03:30.000 They did an MRI scan.
02:03:31.000 He just isn't afraid of falling.
02:03:33.000 I don't know.
02:03:34.000 I don't know what it is.
02:03:35.000 Alex Honnold?
02:03:36.000 Alex Honnold.
02:03:38.000 I think our people are so focused on truth that they're not worried about these other things.
02:03:44.000 But I can tell you, I think this is very important, having endured a federal judge destroyed my tape and nobody cared.
02:03:52.000 Nobody gave a damn.
02:03:53.000 That was the hardest thing I've ever been through.
02:03:57.000 And I almost wasn't strong enough to get through it.
02:03:59.000 I told my mom and dad, I'm like, I don't know if I can get through this.
02:04:03.000 Why continue?
02:04:04.000 What happened exactly?
02:04:06.000 It's a long story.
02:04:07.000 I wrote a book about it.
02:04:08.000 What's the book called?
02:04:09.000 Breakthrough.
02:04:11.000 You had, I would guess, presumably a normal upbringing when you were a kid?
02:04:18.000 Not normal, but good upbringing.
02:04:21.000 Did your parents helicopter over you or snowplow things away from you, or did you just go out and do your thing when you were young?
02:04:26.000 Very independent, hard-working father and grandfather.
02:04:29.000 Did a lot of manual labor with my father and grandfather.
02:04:31.000 So there are a couple things in my life that I credit my probably tenacity and arrogance to, and confidence.
02:04:38.000 I'm not trying to be too self-deprecating.
02:04:40.000 It's that my family started a business when I was like nine years old and I had to go from the south side of Chicago to the north side of Chicago as a nine-year-old by myself.
02:04:48.000 And so this is from the orange line to the... I guess you could take the orange line to the red line or blue line up to Wrigley Field and that was for almost two years and I was entirely on my own.
02:05:00.000 I had to figure things out on my own.
02:05:01.000 I had to ask people for help when I needed help.
02:05:03.000 But then there was another point in my life where I was homeless.
02:05:06.000 And actually, I was homeless a couple times.
02:05:08.000 And so for me, I'm kind of like, there is nothing you can take from me that has me worried.
02:05:13.000 There is nothing you can accuse me of, literally nothing, because I've slept on park benches.
02:05:17.000 So like nothing left to lose.
02:05:19.000 I, as far as I'm concerned, I would be happy sleeping in a mud hut in the middle of the woods, because it's freedom.
02:05:25.000 Man, all that stress removed, all I gotta do is figure out how many rabbits I can eat before I die of rabbit starvation.
02:05:30.000 Then I gotta find some fat and some vegetables.
02:05:32.000 I think that you acknowledge your arrogance is a sign of humility.
02:05:35.000 And being humble and being humiliated is important to destroy... It's an ego buster.
02:05:41.000 And once you've had your ego... I've had it destroyed so many times in my life.
02:05:45.000 I mean, once you get... You're right, Ian.
02:05:47.000 Once you get the ego destroyed a few times... Have any of these reporters ever had their ego destroyed?
02:05:54.000 They're nothing but ego.
02:05:56.000 Have they ever had to, were they ever locked out?
02:06:00.000 You know what I mean?
02:06:00.000 It's like, a lot of these kids, a lot of these people, I say kids, probably when they were kids were bullied and they wanted to find that source of power to oppress others to get that feeling back.
02:06:10.000 It's a lot of what you see in adults who are oppressive and arrogant and nasty to other people.
02:06:15.000 It's because it's kind of revenge for them.
02:06:17.000 Like I said, they want to be the one doing the hazing, not the one getting hazed.
02:06:21.000 You know what it was for me?
02:06:22.000 I'll let you guys do the hazing and whatever.
02:06:24.000 I'll be over there by myself.
02:06:26.000 Don't come near me.
02:06:27.000 That's how I've always been.
02:06:27.000 When you've been accused of a felony you have not committed and been arrested by the FBI, put in shackles, thrown in federal prison, it is amazingly humbling.
02:06:38.000 And of course you're, and I said this story, the first temptation is to talk.
02:06:42.000 If you're... let's say the feds arrest you tomorrow and just made up some bullshit, totally false.
02:06:47.000 The first thing you would think psychologically is, and everyone thinks, oh, I'm smart enough.
02:06:52.000 I would... Miranda rights. No, no, no, my friend.
02:06:54.000 When you're shackled at 24 years old, you're pissing your pants.
02:06:57.000 You don't think rationally.
02:06:59.000 When you're in, not handcuffs, but shackles.
02:07:01.000 I talked to the FBI because I didn't know any better.
02:07:04.000 I thought, no, I didn't do it.
02:07:05.000 And then they use every word against you because that's what they do, but it humbles you, and you think, okay, now I know the level I'm playing at.
02:07:14.000 It's not that they will use everything you say against you, it's that they can say whatever they want after you've talked.
02:07:19.000 That's right, that's what I meant to say.
02:07:21.000 There's a very famous, I think it was a Supreme Court justice or a lawyer who wrote this long, profound statement on why you never talk to law enforcement, no matter what.
02:07:30.000 I saw that.
02:07:31.000 Not even beat cops.
02:07:32.000 And they said it's because the moment you open your mouth, that law enforcement officer can say in a court of law, James O'Keefe admitted to everything to me.
02:07:40.000 That's exactly what happened.
02:07:41.000 And you'll say, that's not true.
02:07:42.000 I never said that.
02:07:43.000 Well, he certainly did.
02:07:45.000 And then the prosecutors say, were you talking with the officer?
02:07:47.000 Yes, I was.
02:07:48.000 And you had a long conversation.
02:07:49.000 It's that kernel of truth part of disinformation.
02:07:51.000 You admitted you were there that day.
02:07:53.000 It's like talking to the officer.
02:07:54.000 Well, there's a reason the FBI writes their transcript in pencil.
02:08:00.000 I'm serious.
02:08:01.000 I'm serious.
02:08:01.000 I was actually interviewed by the FBI too.
02:08:03.000 I was arrested a number of times for crimes I never committed.
02:08:06.000 And your first instinct is like, hey, I didn't do it.
02:08:08.000 I didn't do it.
02:08:08.000 What are you guys doing?
02:08:09.000 I'm innocent.
02:08:09.000 I'm innocent.
02:08:10.000 And then you learn, like, it doesn't matter.
02:08:12.000 They're punishing you because of your political stance.
02:08:15.000 And being through the wringer, it definitely wakes you up.
02:08:18.000 And I totally understand this kind of sacrificial feeling as well.
02:08:22.000 And it's important to have.
02:08:24.000 And I also question, why don't other people have this too?
02:08:26.000 Because one thing you mentioned is this moment at the end of your life that you're going to be looking back on.
02:08:33.000 It's going to be a very important moment.
02:08:34.000 It's going to be the key instrumental moment of your life.
02:08:37.000 Are you going to have any regrets?
02:08:38.000 Are you going to be happy the way you lived your life without any regrets?
02:08:41.000 And to me, I'm going 100 miles an hour as well.
02:08:43.000 And I'm happy there's other individuals like yourself who are doing it as well.
02:08:46.000 And maybe, maybe we can inspire other people to do the same.
02:08:50.000 You will.
02:08:51.000 You will.
02:08:52.000 Well, let's read some more of these superchats.
02:08:53.000 We got Dan Carmo says, Love the show, mate.
02:08:55.000 James O'Keefe is a bloody worldwide hero and make no mistakes in your Timcast.
02:08:59.000 Keep speaking truth like you do and you, my friend, slot right into the same category.
02:09:03.000 Peace out, legends.
02:09:05.000 Oh, thank you very much.
02:09:06.000 A lot of people... Any questions?
02:09:08.000 Given that hope, you know what I mean?
02:09:10.000 Well, questions are hard to find because I have to go through them and, you know... A lot of statements?
02:09:13.000 A lot of statements.
02:09:14.000 And so, typically, I'm like, I'm reading through and there's a lot of stuff where people are saying, you know, really nice things.
02:09:20.000 We have... Lawrence Van Don says, James, I can all but guarantee that Robert Barnes and Viva Frei would team up with you on the People's Defamation Defense Fund.
02:09:28.000 They're lawyers who are amazing to listen to on YouTube.
02:09:30.000 Barnes is a lawyer for the John Doe Covington kids' defamation.
02:09:34.000 I mean, I love this idea.
02:09:36.000 Luke and Tim, People's Defamation Fund, when we get, not if, but when we get money from the lawsuits, we'll start it.
02:09:45.000 And I think you do have to be willing to be defamed and arrested.
02:09:48.000 That's the question.
02:09:49.000 Here's the real question.
02:09:51.000 If telling the truth and being a good investigative reporter, that 50 years ago would have won you a Pulitzer, now gets you defamed and arrested, would you do it?
02:10:00.000 Yes or no?
02:10:01.000 Binary question, yes or no?
02:10:03.000 If you actually were honest with me about this, the majority of people would say no.
02:10:08.000 I'm not willing, I have a family.
02:10:10.000 You know, I think about George Washington because he was willing to subvert and lie to accomplish his goals.
02:10:18.000 He wasn't an overt truth teller, maybe until he became president and then he decided to change.
02:10:23.000 So there is a willingness to not speak the truth when that will benefit you that is, uh, the essence of the United States.
02:10:32.000 Okay.
02:10:32.000 I get it.
02:10:33.000 To be smart with when you speak doesn't mean, and I've also found that being honest doesn't mean that you always say everything on your mind.
02:10:39.000 There's a time and a place for it.
02:10:40.000 Discretion.
02:10:40.000 Yeah.
02:10:41.000 Discretion.
02:10:42.000 I understand.
02:10:43.000 Maybe, maybe in that sense, Washington and Lincoln were politicians and how they, how they did what they did.
02:10:49.000 But I'm saying something a little different, which was if it was required for you to be arrested and or defamed.
02:10:56.000 By everyone you wanted to be liked by, would you tell the truth?
02:10:59.000 Yes or no?
02:11:00.000 And I think the majority of people would not.
02:11:02.000 I would.
02:11:03.000 Well, so, uh, we have a lot of people asking the same question, so I think we'll get this one out of the way.
02:11:07.000 Who are the people standing behind Ian with cameras?
02:11:10.000 What up, homies?
02:11:11.000 Oh my goodness.
02:11:12.000 You guys are in frame.
02:11:13.000 That's the geek squad from Best Buy.
02:11:15.000 Introduce yourselves.
02:11:18.000 James had to show off so he picked them up at Best Buy.
02:11:20.000 They don't have to give out their names if they don't want to or anything like that.
02:11:23.000 If they're anonymous they should know there's a camera pointed right at them.
02:11:26.000 Just a heads up.
02:11:28.000 Can you introduce the team?
02:11:29.000 These are some of my Project Veritas colleagues.
02:11:32.000 Eric Spracklin, Chief of Staff, Comms.
02:11:34.000 Mario works on our Comms team.
02:11:36.000 We got a couple Project Veritas videographers with us here.
02:11:41.000 We were coming from another assignment and then tomorrow headed to yet another assignment.
02:11:45.000 Very cool.
02:11:46.000 Nice to meet you guys.
02:11:47.000 What up, guys?
02:11:50.000 All right, I am just looking through to try and find some, uh... All right, let's see.
02:11:55.000 I think I know the answer to this question, but, well, someone's asking.
02:11:58.000 So, Gor Before Don says, Would Project Veritas consider branching into entertainment?
02:12:03.000 The truth is needed on those fronts as well.
02:12:05.000 Well, we'd be interested in exposing corruption in entertainment, particularly in California.
02:12:11.000 We have a few big sources right now in that community in Los Angeles and Hollywood.
02:12:15.000 VeritasTipsAtProtonMail.com.
02:12:17.000 Listen, It's a target-rich environment.
02:12:21.000 I'll never be out of work.
02:12:22.000 That's for damn sure.
02:12:23.000 There's way too much going on, and that's why we need insiders and whistleblowers.
02:12:28.000 I want to record a song.
02:12:30.000 We've talked about this for months, and I feel like six months can slip by if we don't just... You and I record a song?
02:12:36.000 Done.
02:12:36.000 We'll schedule it.
02:12:38.000 So we're actually working on a song, but did you guys want to do something about the political messaging?
02:12:42.000 Doesn't have to be.
02:12:43.000 Nah, just fun.
02:12:44.000 As long as the song's awesome.
02:12:45.000 Yeah.
02:12:46.000 Yeah, yeah.
02:12:46.000 We're gonna record.
02:12:48.000 James, we were jamming before.
02:12:50.000 In a different life, I would have been a thespian or a musician.
02:12:53.000 I love that.
02:12:55.000 And DJ.
02:12:56.000 And I'm getting better at that.
02:12:57.000 I started as an amateur.
02:12:58.000 But yes, we will make a song.
02:13:01.000 What's next?
02:13:01.000 That's the question.
02:13:04.000 Oh, man.
02:13:04.000 Jumping around.
02:13:05.000 I finally found a question and then Super Chat jumps because when they load at the same time.
02:13:10.000 Well, um, let's see.
02:13:12.000 Do we have any questions?
02:13:14.000 I'm not sure if it's a question, but, uh, well, here we go.
02:13:16.000 Woodworker Anon says, Mr. O'Keefe, the reputation is worse than death.
02:13:19.000 Their legacy is at stake.
02:13:21.000 If they make a stand and fail, they are a stain on the back page of a history book.
02:13:27.000 They no longer work for us.
02:13:28.000 They want to be a Kennedy.
02:13:29.000 Is that a song reference or something?
02:13:31.000 I'm not sure what that is. That's sort of cryptic, like a poem.
02:13:34.000 I think what they're trying to say is that many of these people just want their name to be favorable in the history books.
02:13:38.000 The people that are the problem.
02:13:40.000 politicians, media, establishment.
02:13:42.000 That goes back to my statement that if doing the right thing required, for a temporary amount
02:13:51.000 of time, maybe even some of your life, being thought ill of and defamed by all the people
02:13:58.000 you want to be liked by...
02:13:59.000 The most prescient statement I've ever heard on the topic was by Rush Limbaugh, who once
02:14:04.000 told me, and I think he said this publicly, but he also told me one on one.
02:14:09.000 It was, James, the hardest thing to accept about my life is being hated by all the people that I wanted to be liked by.
02:14:18.000 And it sounds like a cliche.
02:14:21.000 Of course.
02:14:21.000 Yes, of course.
02:14:22.000 But no, no, no.
02:14:23.000 That's actually the hardest part about this job for me.
02:14:26.000 And I'm being honest.
02:14:28.000 You know, my Wikipedia page is awful.
02:14:32.000 And everyone I go speak to, one of the first things, God, your Wikipedia page is terrible!
02:14:37.000 I mean, it's awful!
02:14:38.000 You realize it's because you're not playing the game the same way they are.
02:14:41.000 But let's just pause and think about that for a minute, because it sounds like a cliche, of course they're gonna... No, no, no, no.
02:14:46.000 This is a really hard thing to psychologically accept.
02:14:49.000 And I don't care who you are, if you're a human, it's no fun to be shit on by everyone.
02:14:57.000 CNN, the New York Times.
02:14:59.000 I mean, they just... Sunday, criticizing... And it's almost become a cliche.
02:15:05.000 Well, of course.
02:15:06.000 No, not of course.
02:15:08.000 This is a hard thing to accept.
02:15:10.000 And it's a hard thing to endure.
02:15:12.000 And it's a bitter pill that most humans cannot swallow.
02:15:16.000 Because you want to be liked.
02:15:17.000 Nobody likes to be hated.
02:15:19.000 I'm not a masochist.
02:15:20.000 I'm not a sadist.
02:15:22.000 I didn't get into this to be hated, shit on, defamed, jailed, sued, lied about, okay?
02:15:31.000 You have to be sick in the head if you do this to get that flack.
02:15:36.000 What I'm trying to tell you is that that is a necessary byproduct of being effective.
02:15:41.000 One of my early mentors, when we were enduring this, he said, James, it's a sign of respect
02:15:49.000 for them to do that to you.
02:15:52.000 And we have to change people's methodology in this country.
02:15:55.000 They're afraid of us.
02:15:57.000 CNN is afraid of us.
02:15:58.000 Twitter is afraid of us.
02:15:59.000 Google is afraid of us.
02:16:01.000 Pinterest is afraid of us because of these brave heroes.
02:16:04.000 Russell Strasser, the deep state federal agent that we recorded interrogating Richard.
02:16:11.000 And we put Russell Strasser's face in the YouTube video.
02:16:15.000 Russell Strasser is currently in hiding because he knows we're gonna doorstep him.
02:16:19.000 Okay?
02:16:20.000 You know what's funny, though, is I haven't had that experience where... Maybe it's because I'm a milquetoast fence-sitter, I guess, whatever.
02:16:28.000 I haven't had the people... Well, actually, let me slow down.
02:16:30.000 There's not a whole lot of people I've ever wanted the respect of.
02:16:32.000 You know, a lot of people I play music, they say, who do you look up to in music?
02:16:35.000 I'm like, I don't look up to anybody.
02:16:37.000 I think they're all bad.
02:16:38.000 I think I have to make music.
02:16:40.000 It's always been very like, I don't know, arrogant, I guess.
02:16:43.000 Skateboarding, too? Who's your favorite skateboarder?
02:16:44.000 Meh.
02:16:45.000 But you know what's interesting for me is something changed with the internet and the ability for people to investigate on their own and build their own communities.
02:16:53.000 I've actually had, there are some pro skateboarders that I grew up watching all of their videos.
02:16:58.000 The legends messaged me on Instagram and Twitter being like, dude, you're the best.
02:17:02.000 And I'm like, whoa.
02:17:03.000 To have, like, this dude, when I was, like, 14, watching his video in my friend's basement on VHS, being like, man, I wish I was as good at skateboarding as that guy.
02:17:11.000 Now he's mentioning me, being like, dude, you're one of the best people covering news and talking about it.
02:17:14.000 I'm so grateful.
02:17:15.000 I'm like, that's awesome.
02:17:17.000 When your heroes become your rivals.
02:17:19.000 I want to tell a one or two minute story about Dean Baquet, executive editor of the New York Times.
02:17:23.000 I was at a conference at Duquesne University two and a half years ago giving a speech, and they invited me to my shock.
02:17:29.000 And I walked up to Dean Baquet and I tried to shake his hand.
02:17:31.000 This is the head of the New York Times.
02:17:34.000 And I've ambushed him a few times, but, you know, we're at this conference and I thought maybe we could have a small talk.
02:17:38.000 Hey, Dean, how you doing?
02:17:40.000 He literally turned his back to me and cowered like a little coward.
02:17:46.000 And he and he pretended like I wasn't there.
02:17:48.000 And he said, please go away.
02:17:49.000 And at that moment, there was still like four percent of me that cared about what The New York Times thinks of me.
02:17:56.000 But that just vanished.
02:17:58.000 The moment you stop caring about what they think of you is the moment you're finally free.
02:18:04.000 And we say, okay, that's easy to do, but it's not.
02:18:08.000 The moment you stop caring about what the New York Times thinks of you... And you know what?
02:18:13.000 Most people in the conservative movement care so much about their Twitter accounts.
02:18:18.000 If you start... If I said... And they lie about it, Tim.
02:18:23.000 They say they don't, but they do.
02:18:25.000 And the moment you stop caring about what Jack Dorsey thinks about you is the moment you're free.
02:18:32.000 Conservatives, for the most part, are trying to... It's like...
02:18:36.000 That nerdy kid who's trying to desperately make the cool kids think they're cool.
02:18:40.000 And it's kind of sad.
02:18:41.000 You just gotta stop doing it.
02:18:42.000 You just gotta be yourself, stand up for what you believe in.
02:18:44.000 And there are a lot of prominent conservatives who do this and just say, whatever man, I don't care, shut up.
02:18:48.000 And they're really funny about it.
02:18:50.000 A lot of these trolls who just, you know, they just roll with it.
02:18:53.000 I love it.
02:18:53.000 But then there are some people who...
02:18:55.000 A lot of these mainstream, high-profile personalities, they want Hollywood to like them.
02:19:00.000 They want the networks to like them.
02:19:02.000 And it's like, but they hate you no matter what.
02:19:03.000 They're not gonna book you.
02:19:05.000 You can pretend to support them.
02:19:06.000 This is the problem.
02:19:07.000 Because when something like the Covington kids happened, how many conservatives immediately just agreed with the narrative that had emerged about these kids?
02:19:14.000 That was not true.
02:19:15.000 And then changed their story once the Washington Post and CNN gave them permission to do so.
02:19:21.000 There were a few people credited with initially rejecting the narrative outright.
02:19:26.000 The first was Robby Soave.
02:19:28.000 And then I'm assuming, you know, some people have said I was also one of the very early on.
02:19:32.000 I did a video the next day being like, what is this?
02:19:35.000 This is wrong.
02:19:36.000 Because when someone sends me a video, and it's just a kid smiling in front of a guy playing a drum, and they tell me to be angry, I'm like, for what?
02:19:43.000 I'm not assuming anything.
02:19:44.000 I have no idea what this is.
02:19:46.000 So I looked for a video, and I watched it.
02:19:48.000 Then I found a longer video of the Native American guy walking up to the kids, and I'm like, I have no idea what's happening.
02:19:53.000 Why is everybody angry?
02:19:54.000 So I saw Phil DeFranco was angry.
02:19:56.000 I'm like, what's he mad about?
02:19:57.000 What's happening?
02:19:58.000 And then I made a video, I'm like, these people are nuts.
02:20:00.000 And then a bunch of people actually got on board, helped source a two hour long live stream, which I downloaded, and then showed clips where I'm like, dude, these kids didn't do anything wrong.
02:20:10.000 But so many people were just desperate to fit in.
02:20:12.000 The same thing is true with some of these, you know, these more tragic incidents related to Black Lives Matter protests.
02:20:18.000 Before the full story comes out, and it's happened several times, people just jump the gun and make assumptions based on the leftist narrative.
02:20:23.000 Or how about something I walked right into and fell for, believing the New York Times narrative on the officer at the Capitol and how he died.
02:20:31.000 And it turns out it was an unrelated stroke.
02:20:34.000 And I believed it.
02:20:35.000 I just believed what the New York Times said.
02:20:38.000 You know what?
02:20:39.000 It's because, to a certain degree, you do still trust big major news outlets.
02:20:44.000 Not unless they show me primary sources and they show it raw.
02:20:47.000 They have lost all credibility.
02:20:49.000 I don't believe them.
02:20:50.000 I don't believe you unless you show me your source material.
02:20:53.000 They have given us no reason to trust them.
02:20:55.000 And I think what we're talking about right now is the real issue, Tim.
02:20:59.000 It's the perverse incentive for people that supposedly fight for truth to sell out in order to get positive ink by all the people they want to be liked by.
02:21:09.000 The moment you don't care is the moment you're free.
02:21:12.000 We do have several ladies asking if James is single.
02:21:18.000 What's the verdict?
02:21:19.000 Veritas Tips at ProtonMail.com.
02:21:23.000 That's V-E-R... 1-877-Kars4Kids.
02:21:26.000 Ever heard that commercial?
02:21:28.000 Veritas Tips at ProtonMail.com.
02:21:30.000 It's V-E-R-I-T-A-S Tips.
02:21:33.000 Are you giving that out now for the single ladies?
02:21:35.000 At ProtonMail.com.
02:21:37.000 It's a one-stop shop, everyone.
02:21:39.000 It's incredible.
02:21:40.000 You can be a whistleblower.
02:21:42.000 You can donate.
02:21:44.000 I mean, there's a lot of things you can do.
02:21:45.000 Just sign up.
02:21:46.000 Here's a real question, though.
02:21:48.000 Kimchi93 says, James, do you ever get nervous confronting people?
02:21:51.000 No.
02:21:52.000 No, I don't.
02:21:52.000 Did you used to?
02:21:53.000 No.
02:21:57.000 What's the guy, the climber that climbs Yosemite?
02:21:58.000 Honnold.
02:21:59.000 Alex Honnold.
02:21:59.000 Honnold.
02:21:59.000 Is that the guy?
02:22:00.000 That was a great documentary.
02:22:01.000 He climbs in Yosemite, California, without a harness and they had to scan his brain for fear.
02:22:07.000 I don't get scared at all doing this.
02:22:10.000 I'm trying to think about, I'm definitely, sometimes I'm not afraid confronting the people.
02:22:17.000 What am I afraid?
02:22:19.000 There's a part of me that is afraid of the legal threats, like going to jail sucked.
02:22:26.000 But, you know, you choose to overcome your fear, and you do the right thing despite your fear.
02:22:31.000 Luke has absolutely gone after hundreds of high-profile people as well, so I'll throw it to you as well, Luke.
02:22:36.000 It's fun.
02:22:37.000 It's exciting.
02:22:38.000 There is a little bit of nervousness, but once you're doing it, you're kind of in this flow state.
02:22:42.000 And when you're locking eyes with, like, a Lord Jacob Rothschild or a Henry Kissinger or Zbigniew Brzezinski or Bill Clinton or Tony Blair... Yes, Luke, we know!
02:22:50.000 How much time we got here?
02:22:52.000 When you're locking eyes with them, you're in the moment and it feels amazing and incredible to be able to get rid of this facade of them getting their butts kissed all the time and just slap them with reality and they're shocked and they quiver.
02:23:04.000 You mean the confrontation?
02:23:05.000 Yeah, the confrontation and the way they react.
02:23:07.000 It's so telling because they're so cowardly.
02:23:10.000 They expect you to kiss their ass and then once you're like, let's talk about this and this and this and this real issue, What I'm worried about is when I do it, like two examples, when I called into Jeff Zucker's conference line in December, I dialed into the CEO of CNN's 9 a.m.
02:23:24.000 conference call.
02:23:25.000 My concern is not fear, it's like I only have one take.
02:23:29.000 Like I can't do this more than, I have one shot at this.
02:23:32.000 Don't mess up.
02:23:33.000 I just, okay, when do I interject and say, hello Jeff, this is James.
02:23:36.000 What's the moment in the dialogue?
02:23:40.000 And then when I was in the ACORN thing as the pimp, I'm not afraid.
02:23:44.000 This is when the hidden cameras are this big and they're strapped to your chest with velcro.
02:23:49.000 I remember those.
02:23:49.000 So I'm looking in my pocket like, is this still filming?
02:23:52.000 Oh, the battery's dying!
02:23:54.000 So the producer in me is just technically worried.
02:23:59.000 Is the camera filming?
02:24:00.000 Logistics?
02:24:01.000 Do we have the audio?
02:24:02.000 I'm always just worried about the little things.
02:24:04.000 Yeah, and they go wrong a lot.
02:24:06.000 I confronted John Bolton. I asked him, you know, my friend's drawing a portrait of him.
02:24:11.000 Does he want the blood on his hands, on his left hand, on his right hand?
02:24:14.000 He freaked out.
02:24:16.000 And I remember being so happy and giddy because it was a real interview.
02:24:18.000 And at the end, I just hit him with that.
02:24:21.000 And there was no audio of it.
02:24:22.000 No audio of it at all.
02:24:25.000 My team will always tell me, it's always the little things, like, do we have backup? The worst thing about this business is when you do the thing and the camera failed or the batteries died.
02:24:36.000 I'm more worried than scared.
02:24:38.000 We have another question from Pom Fum.
02:24:40.000 James, have you ever considered branch offices in other countries, i.e. Canada?
02:24:44.000 Yes.
02:24:45.000 Yeah, we got a lot of messages from Canada, from Europe.
02:24:48.000 There are laws, you know, Luke and I have been to Greece.
02:24:51.000 Yeah, we got tear gassed together.
02:24:53.000 We got tear gassed in Athens in 2015.
02:24:55.000 Tear gassed.
02:24:58.000 Veritas is going global.
02:24:59.000 We got whistleblowers all over the world.
02:25:01.000 There are certain laws we can't break, but we're protected by the United States Supreme Court's Bartnicki case from 2001.
02:25:05.000 A whistleblower can themselves send us a recording, even if they potentially violated an NDA or broke the law.
02:25:13.000 So we will branch out. We are branching out. We're going global.
02:25:17.000 Right on.
02:25:20.000 Well, I think we've gone a little bit over, but we'll sort of wrap up this portion of the show here.
02:25:25.000 Do you want to stick around and do a members segment?
02:25:28.000 We'll talk a little bit more.
02:25:28.000 That sounds great.
02:25:29.000 Right on.
02:25:30.000 Well, for those that are listening, smash that like button on your way out.
02:25:32.000 Make sure you go to TimCast.com.
02:25:35.000 Become a member because we're going to continue the conversation in a mostly uncensored manner.
02:25:40.000 I say mostly because we still have some, you know, general standards. Like, there are families who listen, so we try to keep things reasonable, but we swear a whole lot.
02:25:47.000 We'll speak about things that typically YouTube won't let us.
02:25:49.000 Go to TimCast.com.
02:25:50.000 That should be up maybe about an hour or so.
02:25:52.000 Don't forget to, again, like, subscribe, hit the notification bell, leave us a good comment.
02:25:56.000 You can follow me on Parler or Minds at TimCast.
02:25:59.000 My other YouTube channels are YouTube.com slash TimCast and YouTube.com slash TimCast News.
02:26:04.000 This show is live Monday through Friday at 8 p.m.
02:26:06.000 James, do you want to shout out anything else before we wrap this portion up?
02:26:09.000 The only thing I would shout out is if you're watching this and you work for a Silicon Valley tech giant or the government, or you see fraud happening in your municipality, corporation, or government bureau, contact us on the inside.
02:26:24.000 VeritasTips at ProtonMail.com. Right on.
02:26:27.000 I think the shirt I'm wearing is very fitting for today's discussion.
02:26:31.000 It says practice media distancing, and you yourself could exclusively get this shirt by going to thebestpoliticalshirts.com.
02:26:40.000 If you want to see the former head of the CIA squirm and sweat balls after asking him some serious questions, you can on my YouTube channel, WeAreChange.
02:26:49.000 And James, you brought up a very important point when I asked you about alternative social media platforms.
02:26:53.000 You brought up email lists.
02:26:54.000 Those are key because you actually have the list.
02:26:57.000 You actually have contact with your members.
02:26:59.000 No one's standing in the way.
02:27:01.000 I've been building mine up.
02:27:02.000 WeAreChange.org, top right-hand corner.
02:27:04.000 It means a lot to me.
02:27:05.000 Thank you, James, for coming on and thanks for having me.
02:27:07.000 I also want to shout out this shirt that you can get on the TimCast.com store.
02:27:10.000 I think it is Harumph.
02:27:13.000 James, one time Tim and I went to your office and met with some of the reporters.
02:27:16.000 They asked me, what can we do?
02:27:18.000 What's the best thing we can expose?
02:27:19.000 And at the time, I didn't have an answer, but I've thought about it.
02:27:21.000 And I'm hoping that at some point someone can get inside the Federal Reserve and break the books on that company, because the government's not allowed to audit them at present.
02:27:31.000 And I find that despicable.
02:27:33.000 That's the Holy Grail.
02:27:34.000 If you're out there listening, that's my advice.
02:27:37.000 And check out iancrossland.net, new and improved.
02:27:39.000 You can get one of these free-the-code mugs if you really believe in free software and breaking up this monopoly of internet censorship.
02:27:47.000 Love it.
02:27:47.000 Love you, man.
02:27:48.000 And then me, pushing all the buttons for these wild guys.
02:27:51.000 I'm Sour Patch Lids on Twitter.
02:27:53.000 I am also on Minds.
02:27:55.000 I am on Gab as RealSourPatchLids, and I'm also on Instagram as RealSourPatchLids.
02:28:00.000 We're gonna be jumping over to TimCast.com for the exclusive members-only portion of the show, but for everybody that hung out thus far and super chatted, seriously, thank you guys so much.
02:28:10.000 Make this all possible, and I really do appreciate it.
02:28:13.000 So again, smash that like button, and we will see you all in about an hour or so, or whenever you come by TimCast.com.