Timcast IRL - Tim Pool - October 20, 2020


Timcast IRL - James O'Keefe In Studio With BREAKING Expose On Google Election Interference


Episode Stats

Length

2 hours and 9 minutes

Words per Minute

200.86487

Word Count

26,012

Sentence Count

2,192

Misogynist Sentences

20

Hate Speech Sentences

13


Summary

Jeffrey Toobin is caught on camera during a Zoom call with colleagues. James O'Keefe joins the show to talk about it and more. Plus, Project Veritas releases a new undercover video proving that Google's algorithms are skewed in favor of Joe Biden.


Transcript

00:00:00.000 Ladies and gentlemen, we have some very serious, breaking news which is going to affect the very fabric of
00:00:33.000 our society.
00:00:34.000 And we are joined by a very, very important guest who has nothing to do with this story.
00:00:38.000 I'm talking about Jeffrey Toobin, the CNN analyst who was caught whacking off on a Zoom call.
00:00:42.000 This is crazy!
00:00:44.000 You guys heard about this, right?
00:00:46.000 Yes.
00:00:47.000 Yeah.
00:00:47.000 Okay, that's crazy news, but we actually do have real news.
00:00:51.000 We do.
00:00:51.000 And James O'Keefe is here.
00:00:52.000 We do.
00:00:53.000 Hello.
00:00:53.000 I'd like to see that Toobin picture, by the way.
00:00:55.000 No, you don't.
00:00:56.000 Why would you want to see that?
00:00:58.000 I'm just curious.
00:00:59.000 Okay, so Jeffrey Toobin is from The New Yorker and a CNN analyst, and apparently on a Zoom meeting, he was whacking off.
00:01:07.000 He was like an anti-Trump guy, Trump is racist, all that stuff.
00:01:11.000 Why would anyone, for any reason, be on like a work meeting call and decide to just start cranking it out?
00:01:18.000 Because?
00:01:18.000 He said it was a mistake!
00:01:20.000 I didn't think they could see me!
00:01:21.000 It was a mistake.
00:01:22.000 Wait, hold on a minute.
00:01:23.000 Like, at least when Louis C.K.
00:01:25.000 did this, he asked for permission first.
00:01:27.000 You know, and he still got cancelled for that.
00:01:29.000 This guy just was like, well, I'm at a work meeting, better just, you know, go at it.
00:01:32.000 And then, oops, oh no, they saw me!
00:01:34.000 Oh no!
00:01:34.000 I didn't realize what a Zoomie was.
00:01:35.000 So he meant to turn off his camera and then jerk it while no one could see him.
00:01:38.000 Yes!
00:01:39.000 I think I'm looking at the picture right now.
00:01:42.000 They covered up his thing with a birthday hat.
00:01:45.000 Is it the Vice article?
00:01:47.000 Yeah, that's what it looks like right there.
00:01:49.000 Oh no, I'm scared!
00:01:50.000 Oh no!
00:01:52.000 You can see his stomach and his body and his gut and everyone's like putting their hands over their mouth.
00:01:59.000 I'm not sure if I'm looking at the real thing.
00:02:01.000 You are, I think that's right.
00:02:02.000 Maybe this is a coordinated disinformation campaign.
00:02:04.000 Just a picture of someone.
00:02:06.000 I made the Toobin pic SFW-ish.
00:02:09.000 Safe for work-ish.
00:02:10.000 It's like two, is that it?
00:02:14.000 Is that what it looks like?
00:02:15.000 Why are you making me look at this?
00:02:17.000 Why did you look it up?
00:02:18.000 I mean, I didn't look it up.
00:02:19.000 It showed up on my Twitter.
00:02:20.000 It just turned up.
00:02:21.000 It shows up.
00:02:22.000 It's the algorithm.
00:02:23.000 Yeah.
00:02:24.000 Groundbreaking journalist James O'Keefe sources image.
00:02:27.000 I don't know if this is, yeah.
00:02:28.000 We don't, we can't confirm.
00:02:29.000 No, maybe someone took a fake.
00:02:31.000 It might, it might be a weird recreation thing.
00:02:34.000 Yeah, I don't know.
00:02:35.000 Couldn't confirm it.
00:02:36.000 I don't want to find out.
00:02:37.000 I'm not going to look for it.
00:02:38.000 I don't want to know.
00:02:39.000 Just seeing his face.
00:02:40.000 You know, it may surprise many of you listening that we actually do have, like, real important news.
00:02:44.000 That we have important things to talk about.
00:02:45.000 And that's why James is here.
00:02:47.000 I don't know why I talked about that for a few minutes.
00:02:50.000 Not planned.
00:02:51.000 Good use of time.
00:02:52.000 Anyway, welcome to the show.
00:02:53.000 Yeah, that's James O'Keefe.
00:02:55.000 I think most of you know who he is and he's got a big story he just dropped, so we're
00:02:59.000 going to talk about that.
00:03:00.000 Of course, you know him, you know me.
00:03:02.000 We got Ian, he's hanging out.
00:03:03.000 Yo!
00:03:04.000 Ian's in the corner.
00:03:05.000 And Lydia's here as well.
00:03:07.000 So I think that's as good as an intro is ever gonna get.
00:03:09.000 So subscribe, hit the like button, hit the notification bell, and oh man, how do you top that?
00:03:15.000 I don't know.
00:03:16.000 I guess we'll just start talking about stuff.
00:03:18.000 So James, you just dropped a big story.
00:03:20.000 Yes, sir.
00:03:21.000 Not as big as the Jeffrey Toobin thing, but you know.
00:03:22.000 Correct, yes.
00:03:23.000 Depends upon how big that is.
00:03:24.000 Before we go in.
00:03:25.000 Oh my gosh, James.
00:03:26.000 James, for anyone that doesn't know, you run Project Veritas.
00:03:29.000 I run Project Veritas.
00:03:30.000 We're the nation's premier, perhaps the nation's only, undercover or even investigative reporting organization.
00:03:37.000 We use video, incontrovertible video evidence, and we just launched a story an hour ago on Google.
00:03:42.000 A Google engineer discussing how the algorithms are skewed to favor Joe Biden in the presidential election.
00:03:48.000 So we will definitely lead with this, and I've got your website pulled up.
00:03:52.000 We'll look at it.
00:03:53.000 But I'll just add, you may be the only undercover news outlet in the country, and man do they hate you.
00:03:59.000 But we'll get into all the fake news.
00:04:00.000 There's so much to say about that.
00:04:01.000 Yeah, so we'll lead with this big story.
00:04:03.000 So we can pull this up real quick, just to give people a quick glimpse.
00:04:05.000 But instead of reading through what your website says, you're here, you can tell us.
00:04:09.000 But I'll read the headline.
00:04:10.000 Senior Google Manager on Search Engine's Power.
00:04:13.000 You are just plain and simple trying to play God.
00:04:16.000 The power's in the search.
00:04:18.000 Trump says something, misinformation.
00:04:19.000 You're gonna delete.
00:04:20.000 If a Democratic leader says that, then you're gonna leave it.
00:04:23.000 So how about you just tell us what's going on?
00:04:25.000 What's this breaking story?
00:04:26.000 Well, this is a guy, Ritesh Lakhar, technical program manager at Google, and he works for the cloud.
00:04:32.000 And he's kind of an unwitting whistleblower.
00:04:35.000 He talks about how the corporation plays God.
00:04:37.000 And I'm reading here, like, if it was fraud, it doesn't matter.
00:04:40.000 But for Trump or for Melania, it matters.
00:04:43.000 On the other side, Trump says something, misinformation, you're going to delete that because it's illegal or whatever pretext.
00:04:47.000 He talks about the double standard, something we all suspect to be true, but he says if a Democratic leader says that, then you're just going to leave it.
00:04:54.000 And he talks about how the corporation, Google, is outsourcing jobs in order to spy on Americans, and he feels guilty about this.
00:05:01.000 And he describes the algorithm, how people cry in the corridors of Google when Trump won.
00:05:07.000 So a lot of what Veritas does is confirm suspicions, right?
00:05:10.000 None of this is surprising to you, but it may be one of the first times we've ever heard them say it as a current employee.
00:05:16.000 That's why we use hidden cameras.
00:05:18.000 People are more honest.
00:05:19.000 Sometimes the newsworthiness is such that people talk about the ethics of recording maybe a good man saying this, but the public's right to know is paramount.
00:05:28.000 It's important that we show this information to the masses.
00:05:31.000 This guy says something about training Chinese people to spy on America and something like that.
00:05:36.000 What was that?
00:05:37.000 He says in this tape that he feels guilty.
00:05:40.000 He feels, quote, suffocated at Google, and particularly guilty for outsourcing jobs overseas in order to spy on the American people.
00:05:49.000 At one point in the conversation, we ask, why is it that Google prefers Democrats?
00:05:53.000 And his answer, well, this is America.
00:05:56.000 Very enlightening video, and this is part one of a series of videos.
00:06:00.000 We'll release a video every day this week on Google.
00:06:03.000 Different employee.
00:06:04.000 I tweeted out your video, and the first responses from people was... Can't see it.
00:06:10.000 I can't see it.
00:06:10.000 It's unavailable.
00:06:11.000 Done.
00:06:12.000 So there's a tweet, I've been getting the same thing sitting here right now, and it says, quote, this tweet might include sensitive content, to view it you need to change your privacy settings.
00:06:20.000 You know what I love?
00:06:21.000 I love the irony of you literally exposing manipulation, interference, and censorship And then they censor you.
00:06:31.000 It's the perfect... You know, I always say this.
00:06:33.000 The best way to describe irony, to define irony, is a firetruck on fire.
00:06:39.000 That's irony.
00:06:40.000 The firetruck is supposed to put the fire out, not be in flames.
00:06:42.000 You're exposing these big tech companies, and they're actively shutting you down in real time.
00:06:46.000 They're actively doing it.
00:06:47.000 They've always done it.
00:06:49.000 We have a saying at Project Veritas, content is king.
00:06:53.000 You just gotta do the good work.
00:06:54.000 You gotta expose it.
00:06:55.000 You gotta put it out there.
00:06:57.000 Get proxies to put it out there.
00:06:59.000 We embargo clips with people ahead of time.
00:07:01.000 Sometimes I do that with you, Tim.
00:07:02.000 I'll share a little link.
00:07:03.000 That's true.
00:07:04.000 And that's why the New York Times called it a misinformation campaign, because we were embargoing clips with people ahead of time, which is what reporters always do.
00:07:13.000 But that's why I brought this book with me, 1984.
00:07:16.000 If you haven't read it since you were 15, reread the book, listen to it on Audible, whatever you do.
00:07:21.000 That looks like it's a 1984 printing, too.
00:07:23.000 Yeah, I think this is my high school book that I took.
00:07:28.000 So the embargo thing is important, too, because it is extremely common.
00:07:33.000 People don't realize it.
00:07:33.000 And the New York Times tried smearing you simply because you did it.
00:07:36.000 Simply because we did it.
00:07:37.000 This is something we should talk about at some point.
00:07:39.000 I can mention it now.
00:07:40.000 The New York Times talked to researchers at Stanford University, interviewed them, and they said, well, it's probably part of a disinformation campaign because James O'Keefe shared it with the MyPillow guy two days before the event.
00:07:52.000 Is that who he shared it with?
00:07:52.000 I met with Mike Lindell because he's a big-time deal in Minnesota.
00:07:55.000 He's a Minnesota guy, and we're doing a Minnesota story.
00:07:57.000 He's got a big following.
00:07:57.000 I said, hey Mike, here's a tease of what we're doing.
00:08:00.000 And Stanford University told the New York Times, well, that's the reason it's probably disinformation.
00:08:05.000 USA Today quotes the disinformation part and suddenly Facebook is censoring the video because USA Today says it's disinformation.
00:08:11.000 Straight out of Orwell.
00:08:13.000 So, I don't know what else there is to say about, you know, you've got another guy.
00:08:18.000 It's not the first time you've caught somebody.
00:08:20.000 But here's what's crazy to me.
00:08:21.000 We've seen other videos like this.
00:08:23.000 The Verge, actually.
00:08:24.000 I think it was The Verge who published this video.
00:08:26.000 Yes.
00:08:26.000 Where you see them having a meeting about how Donald Trump won, and we're all scared and sad.
00:08:31.000 Yeah.
00:08:32.000 And people are crying.
00:08:33.000 That was a left-wing publication.
00:08:34.000 You also had Gizmodo.
00:08:36.000 This was back in, I think, 2018.
00:08:37.000 Yeah.
00:08:38.000 Where they wrote one of the first stories saying Facebook actively censored conservative news websites.
00:08:43.000 Yes.
00:08:44.000 You confirm all this, and it's misinformation.
00:08:46.000 They publish it, and now it's fine.
00:08:48.000 Well, I think it's the courage to continue doing it.
00:08:50.000 You know, there's so much cynicism and hopelessness.
00:08:51.000 We were talking last night about how cynical people are.
00:08:54.000 I think you just have to keep going.
00:08:56.000 I really do.
00:08:56.000 Because you get caught in a trap of being defensive, like responding to their bias and whining about them.
00:09:03.000 What I've found is if you just keep reporting, they keep attacking me, I just keep reporting.
00:09:07.000 I think the audience that you're serving right here, this stream right now, and the audience I serve, these are new audiences.
00:09:13.000 These are people who don't necessarily watch CNN or maybe even watch Fox News.
00:09:17.000 They're just people that are being enlightened little by little.
00:09:20.000 And the truth is, Tim, I think we're winning.
00:09:22.000 I mean, I think I've got more sources coming to me than ever before.
00:09:25.000 People, people, the sources tell me I've got no place else to go.
00:09:28.000 They say, I can't go to the New York Times.
00:09:30.000 I can't go to USA Today.
00:09:32.000 I don't even know what USA Today is except they write pieces about us.
00:09:35.000 So let's do this.
00:09:35.000 I think I'm going to really enjoy ragging on the media, but we should give that- Yeah, we should talk about other things.
00:09:41.000 So let me do this.
00:09:42.000 You've exposed this.
00:09:43.000 You had another video of a director at Google talking about censorship.
00:09:47.000 That woman, I forgot what her position was.
00:09:49.000 What was the name of the woman at Google?
00:09:52.000 The Irish woman?
00:09:53.000 We've had a couple.
00:09:53.000 Jen Gennai.
00:09:54.000 Jen Gennai.
00:09:55.000 The big one was Jen Gennai saying we need to have algorithmic fairness.
00:09:59.000 Yeah.
00:09:59.000 Because God forbid you Google search something and you get reality or facts.
00:10:03.000 What if the facts are unfair?
00:10:04.000 So what we have to do is censor the reality such that it makes the world more fair.
00:10:08.000 Who defines fair?
00:10:09.000 We don't know.
00:10:10.000 And then Jen Gennai says, we got to prevent the next Trump situation.
00:10:14.000 This is what Gennai says, who's a senior individual at Google, head of Google's innovation department.
00:10:19.000 So when she said we have to prevent the next Trump situation, was it a smoking gun? Not quite, because what did she mean
00:10:26.000 by that?
00:10:27.000 Now she'll say, on Medium, she wrote an article: no, no, I meant stop Russian interference in the election.
00:10:32.000 So they'll say whatever they need to say to explain away their comments on the tape.
00:12:38.000 Have you seen The Social Dilemma?
00:12:40.000 Just a few minutes of it downstairs.
00:10:42.000 Yeah, we were playing just a bit of it downstairs.
00:10:45.000 And what's really interesting is they basically confirm everything you've got on camera.
00:10:53.000 You've got former tech executives.
00:10:54.000 They don't go as deep as to say, yeah, when we were there, we were like, we're gonna make sure the Democrats win.
00:11:00.000 But they were straight up saying, we manipulate people into doing what we want them to do.
00:11:04.000 We figure out how to persuade them into doing what we want them to do.
00:11:07.000 And then I see stories like that, like, you know, so in the documentary, they have this mock version of political extremism called the Extreme Center and Don't Vote, like these two groups that are fighting each other.
00:11:19.000 But it's very clear that they're straight up saying social media companies are encouraging, you know, are favoring political factions or institutions.
00:11:28.000 And considering we're seeing something like this, it reminds me of what we heard from the CEO of Reddit.
00:11:32.000 I don't know if you're familiar with this, but he actually said, I'm confident Reddit could sway an election.
00:11:39.000 We wouldn't do it.
00:11:40.000 Then what do they do?
00:11:41.000 Now they're actively censoring, you know, conservatives, Trump supporters.
00:11:45.000 So we're seeing this kind of thing just basically across the board.
00:11:48.000 I guess I would just ask you, we have the story.
00:11:51.000 All right.
00:11:56.000 So, you know, I don't know what else there is to say about, well, this one individual.
00:12:06.000 So I'm going to ask you, with all of the exposés you've done on Google so far, give us the big picture of what's happening with Google, with Facebook, with Twitter, from this story to the other stories you've done.
00:12:06.000 Right, well, you know, and I think video, I've said this many times, video transfixes in a way that words don't.
00:12:12.000 So kind of hearing these people talk about how they do this and what they do with the, first with the Google clip we were talking about a moment ago, a woman said we've got to prevent the next Trump situation.
00:12:24.000 This guy saying it seems they favor Democrats with their algorithms.
00:12:30.000 It's becoming more obvious, it's becoming more clear that there is no shame on these platforms.
00:12:37.000 And I think the recent New York Post story was a good example of that.
00:12:41.000 Where it's just, it's no longer you need a hidden camera.
00:12:44.000 They're just doing it out in the open.
00:12:46.000 And I still believe that if the content, if the story is good enough, it can circumvent the powers that be because people are hungry for real, raw, enlightened information.
00:12:56.000 So that's my premise.
00:12:58.000 If that premise is off, or if that's wrong, then maybe this escalates into the next DEFCON stage, which is civil unrest in this country.
00:13:06.000 But I still believe fundamentally that people are intelligent enough to be receptive to real and raw information like you're seeing with Project Veritas.
00:13:15.000 And what's more, Tim, is, and I can't emphasize this enough for your show, Veritas' vision is insiders and whistleblowers coming public.
00:13:23.000 Brave people.
00:13:24.000 That's why our motto is Be Brave, Do Something.
00:13:26.000 So when I do something like this, what happens is six Google insiders will contact me and say, Hey bro, I've seen this.
00:13:33.000 I don't want to lose my job, but here's an encrypted.
00:13:36.000 And then it's like this army of people because they can stop one man, but they can't stop a thousand people.
00:13:42.000 You know what I mean?
00:13:43.000 That's like this whole idea of an army, a thousand insiders sending information at the same time.
00:13:47.000 Well, this guy, Ritesh, I feel bad for him.
00:13:50.000 I mean, he's going to get a lot of unwanted attention.
00:13:52.000 He's being honest.
00:13:53.000 Yeah.
00:13:54.000 You know, he says in this clip, if the, what does he say, if Trump wins, there'll be riots.
00:13:58.000 If the left wins, they'll be ecstatic.
00:13:59.000 Yes.
00:14:00.000 I mean, he's being, he's being honest about what's happening in this country.
00:14:02.000 He tells you, you know, candidly what's happening.
00:14:05.000 I feel bad because, you know, I don't want, I don't want this guy to go through any hardship or anything because of this.
00:14:11.000 The problem is, what happens when no one is willing to step forward and tell you what's actually happening and they're actively participating in it?
00:14:18.000 I had this conversation with a guy named Eric Weinstein on his podcast for like two hours about the ethics of blurring faces.
00:14:24.000 Very fascinating, very intellectual conversation, but I was telling my staff that Ernest Hemingway once said, what is moral is what you feel good after and what is immoral is what you feel bad after.
00:14:35.000 Morally defensible journalism is really what you feel good about afterward.
00:14:39.000 It is only that which makes you feel better than you would otherwise.
00:14:42.000 So it's like eating coal in the forest to help your stomach.
00:14:45.000 It never feels good, but you feel better after you do it.
00:14:48.000 Undercover work is tough.
00:14:51.000 People don't like it.
00:14:52.000 It's like you're filming someone without them knowing.
00:14:54.000 Sometimes they're good people, sometimes they're bad people.
00:14:56.000 In this case, this guy is probably some combination of somewhere in between.
00:15:01.000 But the public's right to know is paramount.
00:15:03.000 And we're better off for knowing this information.
00:15:06.000 I don't like harming people.
00:15:08.000 That's not the intent of this.
00:15:09.000 We're not trying to shame people.
00:15:11.000 We're trying to... What's more important than a monopoly on a search engine telling you we're trying to elect this guy?
00:15:18.000 Nothing's more important than that.
00:15:20.000 And it's a public place, we're in a restaurant, so it's a very interesting ethical conversation I suppose, but I think we need more of this sort of thing, not less of it.
00:15:27.000 I agree though.
00:15:28.000 Yeah, yeah, yeah.
00:15:28.000 So I feel bad, you know, because I empathize with somebody who is scared to come forward, won't be honest about it.
00:15:35.000 But I also feel like at a certain point, if you know what you're doing is destroying something, someone, society, and you're like, well, I'm not going to stick my neck out.
00:15:46.000 I mean, that's kind of messed up.
00:15:47.000 Yeah.
00:15:48.000 And there's no other option.
00:15:49.000 It's a choiceless choice to do this work because people are not going to speak on the record.
00:15:53.000 Nobody will ever do that.
00:15:54.000 I mean, well, once in a while. I mean, you had Eric Cochran, our colleague at Pinterest, who was making a ton of money, and he gave it all up.
00:16:03.000 And he said, I just, what am I going to do?
00:16:05.000 I'm going to be ashes to ashes one day and go to the grave, and I'm young and want to do this.
00:16:10.000 I hope to find more of these people.
00:16:12.000 We have a tip line, Veritas, that's V-E-R-I-T-A-S, tips at protonmail.com, for those of you watching, if you want to be brave and do something.
00:16:21.000 So there's a lot of options.
00:16:22.000 You know man, I worked for a Disney company.
00:16:27.000 It was ABC Univision.
00:16:29.000 And it's interesting, you said something that reminded me of what they told me.
00:16:34.000 Side with the audience.
00:16:36.000 The goal of this news outlet, Fusion, as I was told multiple times by the president, was to just say things that the audience would agree with.
00:16:44.000 And so I asked him, if there is a story that is factually true but would upset our audience, we wouldn't report it?
00:16:50.000 And he said, I think that's fair.
00:16:52.000 How does that translate?
00:16:54.000 Let's say you have raw footage of a Proud Boy walking down the street, and then Antifa runs up to him and punches him in the face.
00:17:01.000 The Proud Boy gets up and punches back.
00:17:04.000 The full context would offend our audience.
00:17:06.000 We don't report it.
00:17:07.000 The shorter context would not offend our audience.
00:17:09.000 The Proud Boy hitting Antifa.
00:17:11.000 That's the story that complements their worldview.
00:17:15.000 The Proud Boys are bad.
00:17:16.000 So ultimately, I never blew the whistle or anything while I was there, but I refused to play ball with them.
00:17:23.000 I ended up leaving and then basically explaining everything about what these companies do, how they do it, how the media manipulates.
00:17:30.000 And I think too many people Have said to me things like, oh, well, this is really funny to me.
00:17:36.000 They say, yeah, but you could support yourself, but, you know, you weren't worried about what, you know, what you were, like, you were gonna lose your job.
00:17:42.000 You didn't care.
00:17:43.000 Like, I cared about all that stuff.
00:17:44.000 I was worried that coming out, I did a video that Sargon of Akkad put on his channel, where it was me breaking down four different ways, I think it was four, that the media was manipulating everybody.
00:17:55.000 And I knew, I'm like, I'm gonna put out a video on this, you know, this YouTuber, you know, anti-feminist guy's channel.
00:18:00.000 They're not going to hire me.
00:18:01.000 This is it.
00:18:02.000 I'm going to say this right now, let everybody know, and this is how we're going to get it out, because these companies won't let me say it.
00:18:08.000 And then I better start something on my own, because I'm never going back to these companies.
00:18:11.000 I had actually gone to a bunch of these, I'm not going to name them, but you know who they are, these big New York digital firms.
00:18:17.000 And some of them were like, you name your job, you name your salary.
00:18:19.000 And I said, I am not going to get, I saw some of the people working there, I'm not going to get myself in a situation like that anymore.
00:18:25.000 You're an entrepreneur, and that's very important what you're doing.
00:18:30.000 It's like you have to be a journalist and a businessman at the same time.
00:18:34.000 You are an entrepreneur.
00:18:34.000 I mean, it's very important.
00:18:35.000 I'm at your place right now, and it's amazing what you are building and what you've built.
00:18:40.000 And what I've learned in my career is that when I started, I was in a garage with nothing but a mic and a laptop.
00:18:46.000 And I realized, OK, in order to make videos, I have to do something else and I have to be the chairman of a company and I don't settle litigation.
00:18:52.000 I have to raise millions of dollars.
00:18:53.000 So in order to do what I think is right, in order to not compromise or ever sell out, I haven't sold out to anyone.
00:18:59.000 I'm not answering to anybody.
00:19:01.000 I have no advertisers.
00:19:02.000 No one can boycott me.
00:19:05.000 I had to build a company and learn how to be a CEO in order to do my passion, which is investigative reporting.
00:19:12.000 Otherwise, I can't do investigative reporting because I have to be owned by somebody.
00:19:15.000 And that's what you're doing and more power to you.
00:19:18.000 Well so the reason I tell this story is just because my own personal experience with not
00:19:23.000 – I'm not going to pretend it's as brave as some of the whistleblowers you've seen
00:19:27.000 because my contract expired.
00:19:28.000 I was in golden handcuffs.
00:19:30.000 I couldn't do anything.
00:19:32.000 I was straight up told by the president I wasn't allowed to participate in a presidential event because of my race.
00:19:38.000 Because I looked too white.
00:19:40.000 I was told that, and I was actually contemplating, like, do I sue them for this?
00:19:44.000 And I was like, I don't want to get involved in any of that.
00:19:45.000 I just want to carry on.
00:19:47.000 But what I'm trying to get to is, When you say be brave, that's exactly it.
00:19:53.000 A lot of people seem to think that they're like, they look to me, they look to successful people and they say, you're the exception.
00:19:59.000 I know that if I strike out on my own, that I'm going to fail.
00:20:03.000 No, no, you have to just try.
00:20:05.000 You've got to, you've got to throw it out there.
00:20:07.000 So if you're, I'll tell you this, man, you have two choices.
00:20:09.000 If you work for one of these companies and you know they're doing something wrong and you say, well, I'm not getting my hands dirty or sticking my neck out.
00:20:16.000 Then, then how are you not a part of the problem?
00:20:20.000 You are.
00:20:21.000 If you see something and you decide, I am going to stand up and let people know what they're doing, you are solving that problem.
00:20:27.000 But there are too many people who are more than happy to just let the problem carry on.
00:20:27.000 Yes.
00:20:31.000 Well, you know, that's true.
00:20:32.000 And I think you've got to be a leader in this life.
00:20:35.000 You've got to take risks.
00:20:37.000 Courage is the virtue that sustains all others.
00:20:40.000 I don't think you need to have a hundred thousand people or a million.
00:20:43.000 You just need a few.
00:20:44.000 You need like a couple dozen.
00:20:45.000 Again, going back to this example of Eric Cochran.
00:20:48.000 You meet this guy.
00:20:49.000 He works with us.
00:20:49.000 He runs our... This is the Pinterest whistleblower.
00:20:51.000 Pinterest whistleblower.
00:20:52.000 And it just brings tears to your eyes to hear him talk about... He left a salary, a very high salary.
00:20:58.000 And he describes why he did it and he says a lot of people go through life and they just want the material things in life and that's not what life is about.
00:21:05.000 And you get a lot more fulfillment in standing up for something that is right.
00:21:10.000 And I don't think I could ever do anything but Project Veritas.
00:21:14.000 I've tried to live a normal life.
00:21:16.000 I've tried to go to law school and do all these things.
00:21:20.000 I just feel so passionately that if you stand up for what's right and moral and decent, like you have done in your career, you know, good things will happen.
00:21:29.000 And I don't know what the alternative is.
00:21:30.000 Do you know the story of how the Nobel Peace Prize or the Nobel Prize came about?
00:21:33.000 No.
00:21:34.000 It's the dude Nobel, inventor of dynamite.
00:21:37.000 Someone at the news outlet, I guess, accidentally published his obituary early.
00:21:41.000 And I'm probably flubbing some of the details because I'm not pulling it off the website or anything.
00:21:44.000 But apparently they accidentally published his obituary early.
00:21:48.000 It happens.
00:21:49.000 And they called him the Merchant of Death.
00:21:51.000 And so he apparently got panicked.
00:21:52.000 It was like, is that my legacy?
00:21:54.000 So he decided to create something to be, you know, to be better.
00:21:56.000 The Nobel Prize and all the chemistry, engineering, mathematics, Peace Prize, etc.
00:22:01.000 I think people... It's... Maybe it's something that not all people have, this feeling about what you're going to leave behind.
00:22:08.000 But I'm curious if, you know, these people at Google who refuse to speak up, the people who are there... Like, look, clearly not everybody at Google is a zealot who wants to manipulate people and control the world, thinking their ideology is legit.
00:22:21.000 This guy clearly doesn't.
00:22:22.000 He thinks they're playing God and there's something wrong with what they're doing.
00:22:24.000 It's unethical.
00:22:25.000 But why wouldn't he just actually come out and give that statement, put out that video, and say, hey guys, this is happening?
00:22:33.000 Do people not care about the legacy that they leave behind?
00:22:36.000 I think people are afraid.
00:22:38.000 I think that the only thing you have to fear is fear itself.
00:22:42.000 But the biggest question that I get asked is, what can I do?
00:22:46.000 Do you get asked that question?
00:22:47.000 All the time.
00:22:48.000 It's like, what can I do?
00:22:48.000 What can I do?
00:22:49.000 And, you know, Dennis Prager says there's three types of people in the world.
00:22:52.000 There's people who fight, people who support those who fight, maybe financially, and those who do nothing.
00:22:57.000 So, you know, you could certainly financially support those who do things, or you can do things, or you can do nothing.
00:23:03.000 Those are the three options.
00:23:05.000 The biggest question I get asked is, how can I help?
00:23:07.000 How can I contribute?
00:23:09.000 And I'm not saying it's for everybody, but there's a tiny fraction of people who... I mean, this guy Zach Vorhies, another one I didn't mention at Google, algorithmic unfairness.
00:23:18.000 Zach leaked the document out of Google, and he was terminated by Google.
00:23:23.000 And he said, quote, this was an act of atonement, an attempt to make my conscience clear.
00:23:28.000 Very powerful.
00:23:29.000 Wow.
00:23:29.000 And he meant it.
00:23:30.000 And he was tearing up.
00:23:32.000 And, you know, it's just, some people can, it's a choiceless choice.
00:23:36.000 They can't, they can't do anything.
00:23:38.000 And whistleblowing, which is sort of what some of us do at Project Veritas, because a lot of the people are not my employees.
00:23:43.000 They're actually people who just send me stuff.
00:23:47.000 It's like you're an astronaut who is let go from a spacecraft.
00:23:50.000 You're not part of any organization.
00:23:51.000 You're not part of Veritas.
00:23:52.000 You're not part of Google.
00:23:54.000 You're just out there floating out in the wind with no safety net for a higher purpose.
00:24:00.000 And in the 20th century, you had Jeffrey Wigand and these sorts of whistleblowers, and their lives were over.
00:24:05.000 All they had was 60 minutes.
00:24:06.000 Yeah.
00:24:07.000 But in this day and age, there's GoFundMe, there's a life after whistleblowing, although that's an ideal.
00:24:12.000 Right, the GoFundMe thing.
00:24:13.000 But there's a new market for this sort of thing.
00:24:17.000 It used to be in the 20th century you'd get divorced, you'd be bankrupt, your life would be over.
00:24:21.000 Most whistleblowers have reportedly said that they regretted doing it.
00:24:24.000 Wow.
00:24:25.000 They wished they hadn't done it.
00:24:26.000 They were naive.
00:24:27.000 I think there's a new era.
00:24:28.000 I think there's a new age.
00:24:29.000 I think we're living in an era of... These are the times that try men's souls, to quote Thomas Paine.
00:24:33.000 I think There's like this new movement of people that are about to do this, Tim.
00:24:39.000 Yeah.
00:24:39.000 Because you were talking last night, oh, we're gonna go to war.
00:24:41.000 Okay, well, before we go to war, there's gonna be whistleblowers recording everything, I think.
00:24:44.000 To quote Breitbart.
00:24:46.000 War.
00:24:46.000 Yes, to quote Breitbart.
00:24:48.000 Exactly.
00:24:48.000 So, you know, yeah, so last night we were talking about where do we go from here?
00:24:54.000 And you were saying, I think, civil unrest.
00:24:55.000 And I said, yeah, I do believe that the trajectory we're on leads to full-scale conflict of some kind.
00:25:02.000 What that looks like, I don't know.
00:25:04.000 And I know it's interesting, you know, a lot of people don't like the conversation around potential civil war or anything like that, but I think people get confused about what it really means.
00:25:13.000 I'm not talking about factions marching in the streets anytime soon or anything like that.
00:25:16.000 But I do think that we're seeing this, and I'm going to pause right here, and I wanted to say this as soon as we pull up this article.
00:25:24.000 I got bad news for you, man.
00:25:27.000 This doesn't shock anybody anymore.
00:25:29.000 I think it's really, really important.
00:25:31.000 You're putting this stuff out.
00:25:32.000 You're confirming it.
00:25:32.000 You're showing it.
00:25:33.000 You're proving it.
00:25:34.000 And it's at a point now where I see this and I'm like, I know.
00:25:38.000 I know.
00:25:39.000 So what do we do?
00:25:40.000 It's like we're hitting a wall, you know what I mean?
00:25:42.000 Well, I don't know if I agree completely.
00:25:45.000 I think that you need to do more of it.
00:25:47.000 This is like, I do these things.
00:25:49.000 I used to do these things once a month.
00:25:50.000 Now I'm doing them like three times a week.
00:25:52.000 It was, you know, Andrew Breitbart wrote a book called Righteous Indignation, which I believe was coined by Ida Tarbell, the investigative reporter.
00:25:58.000 Investigative reporting, there's another book called Custodians of Conscience, about how investigative reporting tests and affirms what is moral and what is not moral, and that there's that Overton window where you're kind of testing where the boundaries are, and you're setting the threshold of anger.
00:26:13.000 It's hard!
00:26:14.000 Because there's so much noise and crap.
00:26:17.000 How do you break through that?
00:26:18.000 Investigative reporting is the fiercest of indignation fused with the hardest of fact.
00:26:25.000 So you're trying to find these facts that are outrageous to people.
00:26:29.000 And it is hard, but I think it's working, Eric Sprachman.
00:26:35.000 I think we're getting there.
00:26:37.000 I think we just have to do more of it.
00:26:39.000 I think you need a hundred videos.
00:26:42.000 And the audience is not going to watch CNN because they can't stand it.
00:26:44.000 I mean, CNN has no viewers, by the way.
00:26:46.000 Nobody watches CNN.
00:26:47.000 Actually, their viewers have been going up.
00:26:49.000 Oh yeah.
00:26:50.000 Yeah, it's a weird phenomenon right now.
00:26:51.000 Tucker Carlson, of course, as you probably heard, has the highest rated cable news show in history.
00:26:56.000 Over 5 million in the past, like a week or two ago, massive.
00:27:00.000 But around the same time, Rachel Maddow and Anderson Cooper have been skyrocketing as well.
00:27:05.000 So, it's... What numbers are we talking about?
00:27:07.000 I'm talking about Anderson Cooper hitting 3 million.
00:27:09.000 3 million.
00:27:09.000 3 million.
00:27:10.000 And Rachel Maddow hitting 4.
00:27:11.000 Like in one night?
00:27:12.000 We can average over the week.
00:27:14.000 We can beat that.
00:27:16.000 Well, I'll tell you this.
00:27:17.000 Across my channels, I'm getting over 100.
00:27:19.000 There you go.
00:27:20.000 And on YouTube, CNN's getting 190.
00:27:22.000 So, for one person, you know... Not bad.
00:27:25.000 And this show actually has a couple people.
00:27:27.000 But compared to CNN's, you know, Yeah, I think they're on the way out.
00:27:32.000 I think they're doing really well now because of the Orange Man narrative.
00:27:35.000 The Trump bump, you know, and Tucker sort of cuts through a lot of the BS.
00:27:39.000 So, you know, naturally his show does well.
00:27:41.000 But none of these companies, none of these media organizations are willing to be the tip of the spear.
00:27:47.000 They don't go there.
00:27:48.000 Vice's unique value proposition was, we go there.
00:27:51.000 Imagine how broken the news business is when your unique value proposition to your customers is, we go to the place we're reporting on.
00:27:59.000 None of these people will ever have me on TV, obviously, but they'll talk about this stuff.
00:28:03.000 I think it's that idea of the tip of the spear being in an exposed position where you're not just responding to events but you're actually creating events.
00:28:13.000 The Google thing tonight is new information into the matrix.
00:28:17.000 It's like not just us talking about pre-existing information.
00:28:20.000 And most media corporations, because they're commercial, that economic imperative compromises what they do. I don't have an
00:28:32.000 economic imperative. We have never generated a penny of profit, which is a pain
00:28:37.000 in the ass, but I have to go work a hundred hours a week raising money to pay the bills.
00:28:40.000 But, you know what I'm saying, I think it's economics, not ideology. Isn't there a partisan element to it? There is.
00:28:47.000 There exists.
00:28:48.000 But Jeff Zucker, I mean, everything I've learned about the guy, I mean, he kind of built Trump up and then he tried to tear him down.
00:28:53.000 And I don't know what these companies are going to do after Trump.
00:28:55.000 So you do fundraising relentlessly?
00:28:59.000 Constantly.
00:29:00.000 Would you consider your donors to be conservative?
00:29:04.000 Some, many of them, some of them.
00:29:07.000 There seems to be an overlap between conservatism, I don't even know what that, I don't even know what that means anymore.
00:29:13.000 To be honest, I agree.
00:29:14.000 And reality.
00:29:16.000 There seems to be a, because.
00:29:18.000 That's the inverse Colbert proposition.
00:29:20.000 Because CNN, they just keep yammering about crap.
00:29:23.000 I mean, I've been through the wringer with this. The New York Times and USA Today, they literally say the opposite, we should talk about this, the opposite of what is real.
00:29:29.000 So if you, if you just, you know, extrapolate that over everything they say.
00:29:34.000 If I just aim my camera in any direction, any direction, it's going to show things that are contrary to what they show you.
00:29:41.000 Is that conservative?
00:29:43.000 I don't really consider myself an ideologue.
00:29:45.000 I really don't.
00:29:46.000 I tend to focus on the sacred cows, the organizations that these places won't go, won't talk about, because I believe in justice.
00:29:53.000 I always have, since I was a college student.
00:29:55.000 My professors were telling me how great Stalin was, and I said, no, no, that's not true.
00:29:59.000 So I guess I'm more of a contrarian?
00:30:01.000 If the media was doing the inverse, I'd probably, my cameras would show something different, right?
00:30:05.000 I think, how about anti-establishment, in a sense?
00:30:09.000 Perhaps.
00:30:10.000 So I consider myself to be fairly anti-establishment, and that's why I'm often ragging on, I find myself ragging on Democrats.
00:30:15.000 Because I don't like the Republicans either, but right now, with Trump, you have something very different.
00:30:22.000 And so, when it comes to the things that we see on CNN and MSNBC, on mainstream news outlets, these are supposed to be the news outlets that are informing everybody and being fair, and they're not.
00:30:33.000 They're clearly pandering to one side and catering to one political faction in this country.
00:30:38.000 So I view that as, do I care what Breitbart is doing?
00:30:41.000 Why would I care?
00:30:41.000 Why?
00:30:41.000 Breitbart's not that big.
00:30:42.000 Do I care about what the New York Times is doing?
00:30:44.000 Absolutely!
00:30:45.000 They're the Grey Lady, the paper of record.
00:30:47.000 When they're acting, you know, a fool, I'm worried about that.
00:30:51.000 Look, I get it.
00:30:51.000 There's always going to be some little blog, some right-wing channel, some right-wing news outlet or something.
00:30:56.000 And then I hear all of these, you know, you see like Brian Stelter on CNN saying, Fox News does this.
00:31:01.000 And I'm like, yes, so what?
00:31:02.000 It's one channel.
00:31:03.000 They do this sometimes, they do that sometimes.
00:31:05.000 What about you guys?
00:31:06.000 What about ABC, NBC, CBS?
00:31:08.000 All of these, Washington Post, they're all endorsing Biden.
00:31:11.000 They're all, you know, not every single one of them, but a lot of papers are endorsing a political candidate.
00:31:15.000 They're putting out information.
00:31:16.000 They're clearly lying about it.
00:31:18.000 Like the Twitter fact check.
00:31:19.000 Did you see this?
00:31:20.000 When the New York Post story got censored, the Washington Post fact check put up on Twitter said, Joe Biden played no role in ousting the Ukrainian prosecutor, even though he's on video saying he did!
00:31:31.000 This is what they're going to start doing now, is just saying the quite literal opposite of reality that's happened to us.
00:31:37.000 I'm aware of it.
00:31:39.000 Anyway, my point is, I view things as like, the establishment is breaking the rules, and that's a problem to me.
00:31:47.000 If there are people who oppose the establishment, who are much smaller, who are breaking the rules, I'll bring it up when it matters, but I don't think it's that significant.
00:31:53.000 Like, I understand it can be in concert with all these other channels, but anyway, that's just me kind of breaking down my point of view.
00:31:59.000 As it relates to you, you guys have been accused of being, you know, partisan, conservative, and going after left-wing organizations and trying to smear them or whatever.
00:32:09.000 Well, I mean, there's a lot of examples to the contrary.
00:32:12.000 I mean, we went after the Republican Attorney General of New Hampshire recently because his deputy said that there was no such thing as voter ID in the state, and I confronted Gordon McDonald, who is a Republican, who works for Chris Sununu, who is a Republican.
00:32:23.000 There are examples in Amy Robach, the Jeffrey Epstein video.
00:32:26.000 That wasn't conservative or liberal, that was actually plotted by liberals, and suddenly the New Republic was like, is James O'Keefe legitimate now?
00:32:33.000 They actually have a headline that said that.
00:32:35.000 It's like the moment I go after the NRA, we live in a world not of angels but angles, where people don't ever think that our motives are pure.
00:32:43.000 Not everything's political.
00:32:48.000 What's going to happen eventually is that whistleblowers are just going to trust me because I don't settle lawsuits and don't back down.
00:32:53.000 And there's going to be 10, 20, 30, 40% of our sources that just bring us straight up corruption.
00:32:58.000 If someone comes to me, Republicans ripping up ballots, I will air that video.
00:33:01.000 I guarantee, I swear to God, if someone gives me a video of Republican congressmen, I'm publishing the video, but I don't get videos of Republicans.
00:33:12.000 I get a guy in Minnesota who says, absentee ballot.
00:33:16.000 We don't care that it's illegal.
00:33:18.000 The guy is saying, I don't care that it's illegal.
00:33:21.000 And the New York Times says, there's no evidence.
00:33:22.000 This is unsubstantiated.
00:33:24.000 Well, what is evidence?
00:33:28.000 The guy's on tape.
00:33:30.000 It's like a South Park episode.
00:33:32.000 It's like Parker and Stone come up with the most outrageous video.
00:33:35.000 They're writing 2020.
00:33:36.000 Let's get some Somalis.
00:33:37.000 Let's put them in a car.
00:33:39.000 Let's say, I'm still hustling at two in the morning, ripping up absentee ballots, blank ballots.
00:33:43.000 I break the law.
00:33:44.000 It looks like a South Park episode.
00:33:46.000 And the New York Times goes, this is unsubstantiated video. So it's his own video. It's
00:33:51.000 a self-fulfilling prophecy because if you have the incontrovertible evidence and the New York
00:33:56.000 Times says it's unsubstantiated and the information monopolists at Facebook only cite quote-unquote
00:34:02.000 verified... you guys all know this. And Jeff Bezos, who owns the Washington
00:34:05.000 Post, I have a quote here.
00:34:06.000 This is Jeff Bezos, of maybe the biggest, most powerful company in the world, Amazon.
00:34:11.000 Quote, even though the Washington Post is a complexifier for me, I do not at all regret my investment.
00:34:17.000 The Post is a critical institution with a critical mission.
00:34:20.000 This is, quote, something I will be most proud of.
00:34:24.000 I think it's very telling the CEO of Amazon thinks that his investment in some woke clickbait rag in Washington
00:34:30.000 is the most important thing he will have done in his life.
00:34:34.000 Wow.
00:34:35.000 The media is more powerful than all three branches of the government of the United States.
00:34:40.000 Yep.
00:34:41.000 And that's truth.
00:34:42.000 So anyway, what I was getting to in terms of the bias accusations against you, what I really love about this was when you're publishing Google.
00:34:51.000 Is Google a left-wing institution now?
00:34:51.000 When they say that you're targeting liberal organizations or leftists, they're almost admitting that these big tech companies are.
00:35:00.000 Or the newspapers are.
00:35:00.000 They are.
00:35:01.000 The Democrat media complex.
00:35:03.000 I think Ritesh Lakhar tonight.
00:35:04.000 We're going to have more videos tomorrow.
00:35:05.000 And he does say that they are, they lean that way.
00:35:08.000 And it appears to be, Tim, the case that they are partisan institutions.
00:35:12.000 And some of the evidence in these tapes, you'll see one tomorrow, which it might even trigger an FEC complaint.
00:35:15.000 There is a hearing next week in D.C., the Senate committee.
00:35:18.000 The Senate Commerce Committee is having a hearing, and I'm sure they'll play some of this.
00:35:18.000 What is it, Eric?
00:35:23.000 If they're helping left-wing parties Democrat parties in advertising on Google, that triggered an FEC violation.
00:35:30.000 I think, speaking of the Washington Post, and how social media amplifies a lot of what these journalists do, verifying them, giving them credibility, I love it when I see a verified journalist with 300 followers.
00:35:42.000 Well, they're verified, they work for the news outlet, that's why it's important, I guess.
00:35:45.000 And let me tell you a story.
00:35:48.000 In, I think it may have been 2016, it's been a while now, the Washington Post published a story that insinuated, out of thin air, that Kim Dotcom, the notorious Megaupload hacker, fabricated a trove of emails, or something. It's been a long time since I've covered this story, but they basically wrote this story claiming that he may have hacked Seth Rich's email account, or tried to break into Seth Rich's email account,
00:36:13.000 to plant a trove of fake emails to quote-unquote prove the Seth Rich conspiracy theory, which was that the DNC staffer Seth Rich leaked emails to WikiLeaks, that's the conspiracy, I guess. They made this whole story up, completely fabricated. So the general story was that someone in the Rich family got a Megaupload email saying they signed up.
00:36:35.000 And from that tidbit, this guy, Dave Weigel, at the Washington Post, fabricated this whole thing.
00:36:42.000 So I went on Twitter, I found his number, and I called him.
00:36:45.000 And I was like, you know, I think I sent him a Signal message, like, I was very polite, very professional, saying, Hi, my name is Tim Pool, I'm a journalist.
00:36:54.000 I'm curious as to your sources, to verify, blah, blah, blah.
00:36:57.000 And he just was like, Oh, what is this?
00:36:59.000 What is this?
00:37:00.000 Basically, the sum of the story.
00:37:02.000 A fake story was written to smear Kim Dotcom and people who were concerned about WikiLeaks' release, with no evidence, and then I think it was maybe six or eight months later, they went back and changed the entire story.
00:37:16.000 And you can track all of this on newsdiffs.org, which I'm not sure if they exist anymore, but at the time, they showed you the difference from the original article and the article as it was changed in November.
00:37:27.000 I think it was published in May, I think.
00:37:29.000 It was a smear, a hit job, and then after the dust settles, in November, they go in and they change all the language without telling anybody.
00:37:37.000 I've seen this so many times.
00:37:40.000 So when you have the Washington Post, for instance, that does something like that, and I've seen it, they endorse Joe Biden, and then they put up this false Twitter fact check.
00:37:48.000 Twitter had this fake fact check up for days.
00:37:48.000 I love it.
00:37:52.000 Were you able to pull it up?
00:37:53.000 I'm trying to right now.
00:37:55.000 But I just wanna say, while you're talking to me about this, the people on your show, the people watching this, there are insiders that are reaching out to our encrypted ProtonMail address right now.
00:38:04.000 Already.
00:38:05.000 Like right now.
00:38:06.000 My chief of operations just sent me a message.
00:38:09.000 There's people, literally, right now, insiders are coming to the email address.
00:38:13.000 Wow.
00:38:14.000 They're watching your show.
00:38:15.000 We're in a good band.
00:38:15.000 Veritas.
00:38:16.000 I'm gonna do another ad.
00:38:17.000 Oh man.
00:38:18.000 VeritasTipsAtProtonMail.com.
00:38:20.000 You know that show, Cars for Kids?
00:38:21.000 Do you ever hear that on there?
00:38:22.000 Yes.
00:38:22.000 Yeah, Cars for Kids.
00:38:23.000 One, eight, seven, seven cars for kids.
00:38:25.000 Veritas tips at ProtonMail.
00:38:28.000 We need a MyPillow jingle.
00:38:30.000 Right now, Eric, there are insiders coming to us watching Tim Pool's show.
00:38:34.000 I feel like we're doing one of those... What are those things they used to do where they would raise money on public access TV?
00:38:41.000 The phones are ringing off the hook!
00:38:44.000 The ProtonMail is being inundated with whistleblowers as we speak!
00:38:47.000 Like the Live Aid concert.
00:38:49.000 There are people working for tech companies watching Tim Pool's livecast right now who are emailing Project Veritas right now, on the inside.
00:38:56.000 Yeah, give me one of those cameras.
00:38:58.000 We're gonna get banned.
00:38:59.000 You are changing the world and you don't even know it.
00:39:02.000 I mean, I think so.
00:39:04.000 It's the goal every day to just do something.
00:39:07.000 Yes.
00:39:08.000 Do you know Jack Murphy?
00:39:09.000 Yes.
00:39:11.000 He was like, you work so much, Tim.
00:39:13.000 It's like 16-hour days.
00:39:14.000 And I was like, I'm just trying not to be bored.
00:39:16.000 If I wasn't working, I'd be sitting there just staring at the wall.
00:39:18.000 Could you do anything but what you're doing?
00:39:20.000 Could you really go back to doing this, right?
00:39:22.000 Oh, I do a ton of stuff.
00:39:24.000 I showed you my music video.
00:39:25.000 Well, that's true.
00:39:26.000 Yeah, I'm crazy.
00:39:27.000 I just work all the time.
00:39:28.000 Do you think you'll ever stop doing the sort of journalistic stuff in your life?
00:39:33.000 No.
00:39:33.000 And I think it's a blessing in a lot of ways because it's something you can always contribute to.
00:39:39.000 I can be bedridden and I can be writing something down or I can be researching.
00:39:44.000 You know, I could break my leg skating.
00:39:46.000 My hands could be ripped off by some kind of strange shark for some reason.
00:39:50.000 And I'd still be able to talk and read and use my feet for my computer or whatever.
00:39:54.000 Voice to text?
00:39:55.000 Yeah.
00:39:56.000 But you know what?
00:39:57.000 It's always been a passion of mine to read things, to understand things.
00:40:01.000 I've been on the internet since I was little, researching, reading, and understanding.
00:40:05.000 And you know what it really is?
00:40:06.000 I don't really trust people.
00:40:09.000 It's probably a problem.
00:40:10.000 But I would hear people say things and I'd be like, I don't know if that's true, and I'd check.
00:40:13.000 And then I would check again.
00:40:15.000 And then so it turned into me just being like, here's what I think about things and here's, you know, here's my experience.
00:40:21.000 And from going on the ground and traveling around, it was like, I like traveling.
00:40:25.000 I would travel and then tell people what I thought.
00:40:27.000 Now it's a little too dangerous because people know who I am and Antifa are, you know, threatening me and stuff.
00:40:31.000 So now it's more of just- You used to do more field work.
00:40:33.000 You used to be in there.
00:40:34.000 Used to be all the field.
00:40:34.000 Yeah, all the field.
00:40:35.000 All I did was- You did a bunch of field work in the early days.
00:40:38.000 I still do.
00:40:39.000 You know, I'm like, you know, sometimes I'd be padded up and go in there like I'm a tank, you know.
00:40:42.000 But you don't do any field work anymore?
00:40:45.000 Uh, I would say there's a couple things we might end up doing soon.
00:40:49.000 Like, I have the van ready to go.
00:40:50.000 Maybe after the election, but the challenge is... I'll tell you this.
00:40:53.000 I don't think people realize, you know, my... I'm not super political, personally.
00:40:59.000 Like, I clearly have my opinions on, you know, Trump and the Democrats and Republicans and stuff like that.
00:41:04.000 But my interest is always more just like, what's the biggest story?
00:41:06.000 And that's why if you go back like a year or two and look at my YouTube channel, there's stuff about, you know, Jordan Peterson.
00:41:11.000 There's stuff about cultural issues.
00:41:13.000 I have a video from a couple years ago that's got over a million views about men not wanting to help women and children because they might get accused or something like that.
00:41:20.000 And so it was always just like, what was the news cycle?
00:41:23.000 What was important?
00:41:24.000 What did I think mattered?
00:41:25.000 And now that we're in this big election cycle, especially coming after 2018, they've ramped up like crazy all the anti-Trump stuff and the street violence.
00:41:33.000 So it's just dominated everything.
00:41:35.000 Look, if all of this stuff goes away after the election night, you know, Joe Biden wins in a landslide, Trump shakes his hand, they wave, and then everything's back to the Obama years, which is never going to happen, then, you know, I'll be off, you know, in, you know, I don't know, Middle East or something, covering some conflict or crisis like I was doing eight years ago.
00:41:52.000 But it's the way I explain it to people.
00:41:54.000 I had a friend hit me up and say, you know, there's this huge unrest in Thailand right now.
00:41:58.000 Will you cover it?
00:41:59.000 And I said, I've been following it, but I'm not going to cover it.
00:42:02.000 You know why?
00:42:02.000 There's an election happening right now, and the most important story to me that I care about is the US election, media censorship, manipulation, and what's going to happen to us after January 20th.
00:42:15.000 If, you know, six years ago, I was in Thailand, you know, I was in Turkey, I was in these countries because that was the most important thing I could see.
00:42:22.000 Things in the U.S.
00:42:22.000 were kind of just, you know, moseying along, but everything started to change.
00:42:27.000 I don't know where we're ending up, but I tell you this, there's a lot of people who seem to think that voting for Joe Biden will bring them back to those good old days, the Obama years.
00:42:34.000 No, you know, it's never going back to normal.
00:42:37.000 The world has changed fundamentally.
00:42:39.000 Social media has changed the game.
00:42:41.000 We are not returning to some bygone era of even five years ago.
00:42:46.000 This is it.
00:42:47.000 It's from here, wherever we go, I don't know.
00:42:49.000 What's your prediction with the social media companies after the election?
00:42:52.000 What do you think is going to happen?
00:42:53.000 The social media companies?
00:42:54.000 The next six months, what's going to happen?
00:42:55.000 If Trump and the Republicans win in a landslide across the board, then maybe some kind of Section 230 reform.
00:43:02.000 I'm not confident.
00:43:03.000 I mean, the Republicans controlled everything from 2016 to 2018.
00:43:05.000 They did nothing.
00:43:06.000 They couldn't even get the funding for the wall and stuff like that.
00:43:09.000 That's true.
00:43:09.000 Why do you think that is?
00:43:10.000 That's an interesting topic.
00:43:11.000 Why are the Republicans so weak when they get the power?
00:43:14.000 I have a theory.
00:43:15.000 Well, my theory for 2016 is that it was corporate crony establishment politicians. I've never seen
00:43:22.000 a difference between the Republican and the Democrat. I don't care. I
00:43:24.000 used to, I made a meme back in the day on Facebook of the Republicans and the Democrats
00:43:27.000 like holding hands behind their backs, because I view them as
00:43:30.000 fundamentally the same thing. They don't care about principles. I think it's the Dinesh
00:43:33.000 D'Souza line about how the Republicans fear the terrifying and humiliating power
00:43:38.000 of the press.
00:43:40.000 Like, they don't do the right thing, the thing they're reluctant to do, because
00:43:43.000 the media will crap on them and ultimately what they want is to be praised by the New
00:43:47.000 York Times.
00:43:48.000 Because it hurts.
00:43:49.000 I mean, I can tell you, they've probably given Dean Baquet satisfaction saying this.
00:43:52.000 It's a difference between seeking their recognition and knowing that it's hurtful.
00:43:56.000 You know no one wants to be boycotted, targeted, you know, tarred and feathered.
00:44:00.000 Yep.
00:44:01.000 So I think that's my theory of why the Republicans are so weak on Capitol Hill.
00:44:03.000 You know what it is?
00:44:03.000 Republicans aren't cool.
00:44:05.000 No.
00:44:05.000 And people want to be cool.
00:44:07.000 And so the Republicans have seemed to be, especially now, desperately trying to be like, hey, I'm hip.
00:44:13.000 New York Times, look at me.
00:44:14.000 I'm cool.
00:44:15.000 I'm like those celebrities.
00:44:17.000 But it's changing.
00:44:18.000 It's definitely changing.
00:44:19.000 And I had a conversation, believe it or not, there's a lot of Hollywood celebrities, rock stars, musicians, artists, et cetera, who are secretly pro-Trump.
00:44:26.000 And many of them hit me up, pro-skateboarders even, you know.
00:44:30.000 Really?
00:44:30.000 Definitely.
00:44:31.000 Oh man, you'd be surprised.
00:44:32.000 It's like the Soviet Union, they can't tell their friends or their family.
00:44:35.000 Skateboarders, this is crazy to me, are not leftists.
00:44:40.000 They used to be.
00:44:41.000 Like when I was growing up it was like, When it was cool?
00:44:43.000 So what do they do?
00:44:44.000 You know what it is? It's because skateboarders oppose the moral authoritarians. Skateboarders want to be
00:44:49.000 left alone. So now you have this moral authoritarian left insulting and
00:44:53.000 attacking skateboarders, who like making edgy art. They like pushing security guards, running away, and skating.
00:44:57.000 And just, what do they do? They just message you, hey, don't tell anybody?
00:45:00.000 Like how do they reach out to you?
00:45:02.000 I had someone message me crying.
00:45:04.000 On the phone?
00:45:04.000 Just text message?
00:45:05.000 Like, sending me a message being like, I'm in tears right now, I'm crying, I'm so scared, I, you know, I can't believe what's happening in this country.
00:45:11.000 And these are celebrities and things like that, so, you know, I bring this up just to say... Oh, I forgot exactly where I was gonna go with this, but my, you know, my point from there is that there's a lot of people who, believe it or not, are cool and don't want to be on the left.
00:45:26.000 So that was kind of the point I was saying is that Republicans aren't cool.
00:45:29.000 Right. But I've gotten hit up by now probably like three or four different
00:45:33.000 cool types.
00:45:34.000 And I mean, air quotes, like stereotypically celebrity hip, saying things
00:45:39.000 like the reason why they think what I do is so important is because it's kind of
00:45:43.000 anti-establishment. But I'm not some suit wearing Republican.
00:45:46.000 Yeah. You know, like beanie and skateboards.
00:45:48.000 Even getting to know you, and this is like the second time I've met you.
00:45:52.000 You're singing.
00:45:53.000 You're singing music that's beautiful music.
00:45:55.000 Good animation.
00:45:56.000 You've got a good voice.
00:45:57.000 You're an artistic, creative individual.
00:46:00.000 And a lot of these people in DC are stodgy.
00:46:02.000 They write white papers.
00:46:03.000 They can't dance.
00:46:06.000 There's a skateboard park on this property that you built.
00:46:09.000 And you sing.
00:46:11.000 I find myself the same way.
00:46:13.000 I was a thespian before I was a journalist.
00:46:16.000 But there's something to that.
00:46:18.000 There's a creative aspect.
00:46:20.000 Charisma.
00:46:20.000 It's charismatic.
00:46:22.000 It's creativity.
00:46:23.000 Republicans lack charisma.
00:46:24.000 Terrible.
00:46:25.000 They're terrible storytellers.
00:46:27.000 Have you ever seen a movie funded by conservative money?
00:46:30.000 I'm not going to name any names.
00:46:31.000 It's actually just so funny.
00:46:33.000 It's terrible.
00:46:34.000 It's bad.
00:46:36.000 It's so bad.
00:46:38.000 And when the left makes a movie, they just make a movie.
00:46:43.000 Let's make a conservative movie.
00:46:45.000 I gotta point this out, though.
00:46:46.000 I think things are changing, and this is really, really important for... I don't think it's about the right.
00:46:52.000 It's about freedom, liberty.
00:46:55.000 It's about respect, individualism, and civil rights.
00:46:59.000 And right now, the left doesn't have that.
00:47:01.000 They're moral authoritarians.
00:47:02.000 It used to be the right back when I was growing up, and, you know, the religious right, moral authoritarians, putting labels on things and banning things.
00:47:08.000 It's not anymore.
00:47:10.000 So, I'm gonna point something out.
00:47:11.000 Can you change it to James' camera again?
00:47:13.000 Yeah, I can.
00:47:14.000 Up in the corner of the screen, right behind you, there's a picture of Joe Biden eating a small child.
00:47:18.000 I like that one.
00:47:19.000 I love it.
00:47:19.000 This is some of the best art I've ever seen.
00:47:23.000 I ordered a bunch of these prints from George Alexopoulos.
00:47:25.000 He's gprime85 on Twitter and Instagram.
00:47:28.000 And he makes these really amazing, like, comic critiques, and it's art.
00:47:34.000 And it's a gruesome and grotesque image of Joe Biden eating a child.
00:47:39.000 And it's incredible art.
00:47:41.000 I saw this on Twitter and I was trying to find this guy.
00:47:44.000 It's like a satirical exaggeration of what you'd imagine a Joe Biden nightmare to be.
00:47:51.000 Yes, it is.
00:47:54.000 What's that artist, Ralph Steadman?
00:47:55.000 It's sort of like I got a Ralph Steadman.
00:47:58.000 There's a fear and loathing aspect to it.
00:48:00.000 I bring this up because this is incredible art that normally wouldn't be associated, you don't see these things associated with the right.
00:48:06.000 Never.
00:48:07.000 But now this change is happening, and I think there's a couple different ways to view what's happening, whatever it is, on the right.
00:48:15.000 You've got laughter, comedy, you know they try to cancel all the comedians claiming that they're bigots or far right or whatever.
00:48:22.000 So if you can't have a good time, what do you got?
00:48:24.000 But now you've got artists, you've got video games, you've got people that are now associating themselves with either Trump or just the right in terms of the culture war, and they are creative and charismatic and interesting and exciting.
00:48:37.000 And the right needs more of that.
00:48:39.000 And I don't know if it's the right or whatever you want to call it, but the world.
00:48:44.000 Well, the world's had it.
00:48:46.000 But whatever's happening right now, you have a lot of people just pretending to be on the left, I guess, or they are.
00:48:51.000 And the left is becoming so boring and corporate and stodgy and just like plastics.
00:48:55.000 It's like anti-punk rock.
00:48:57.000 The right has become punk.
00:48:58.000 What's that, the Sex Pistols guy?
00:49:01.000 Johnny Rotten.
00:49:01.000 Make America Great Again.
00:49:03.000 He's in for Trump.
00:49:04.000 And he voted for Hillary in 2016.
00:49:06.000 So I don't, I don't know if Trump's going to win again.
00:49:08.000 I kind of think he, you know, I kind of feel like he is.
00:49:12.000 And the Republicans I think are going to win, but I, I, I have no basis for this other than.
00:49:16.000 I don't think anybody knows what's going to happen.
00:49:18.000 Right.
00:49:18.000 Nobody does.
00:49:18.000 And I think hard predictions are dumb, but, but, but I'll say this.
00:49:21.000 I think there's a fracture happening in, in reality.
00:49:25.000 I mean, you, you even hear it from the left that there's two different realities right now.
00:49:29.000 But I'll tell you what, man.
00:49:30.000 Whatever we're doing, this is the real world.
00:49:34.000 The work you're doing, the work we're doing here on the show, is reality.
00:49:37.000 And whatever CNN is doing is this weird fantasy realm of Russian spies and Trump peeing on beds and stuff.
00:49:47.000 But it really is insane.
00:49:48.000 It's insane.
00:49:49.000 But people believe it.
00:49:50.000 The difference between the Soviet propaganda and the American propaganda is that in the United States, people actually believe what they read.
00:49:55.000 In the Soviet Union, everyone thought it was a joke.
00:49:57.000 Privately when they're in their cottages and they're whispering to some of their kids,
00:50:02.000 some of whom they didn't even trust, they'd sell each other out,
00:50:05.000 they'd say, we all know this is BS.
00:50:07.000 But in the United States, what I've discovered and learned in doing this for 12 years
00:50:13.000 is a lot of people actually believe what they see on TV.
00:50:17.000 They believe it.
00:50:17.000 They see it and they go, that is the truth.
00:50:19.000 Hitler's Minister of Propaganda, Joseph Goebbels, I think was his name, would say if a lie is told enough,
00:50:25.000 people believe it's truth.
00:50:27.000 Yep.
00:50:28.000 It's like a phenomenon that's real.
00:50:29.000 Well, so, look, you put out a video of a guy saying Google is doing this.
00:50:36.000 Yeah.
00:50:36.000 And they say it's a smear.
00:50:37.000 Yeah.
00:50:39.000 Look at all the ballots in my hand!
00:50:41.000 I'm gonna get paid for this!
00:50:42.000 And they're like- No evidence.
00:50:44.000 No evidence, no evidence.
00:50:46.000 So let's do this.
00:50:47.000 We'll use this to segue into the hit piece and how the fake news operates.
00:50:51.000 Amazing story, amazing story.
00:50:52.000 Tell me the story, James.
00:50:53.000 Oh my goodness.
00:50:54.000 I don't think you've ever seen me as angry and passionate as I was that week I did those response videos.
00:50:59.000 I was angry.
00:51:00.000 I was pissed.
00:51:01.000 I don't get angry.
00:51:02.000 I was angry.
00:51:03.000 I was upset.
00:51:05.000 I've never seen anything like this in my career.
00:51:06.000 We put out this Minnesota video.
00:51:08.000 And it's got this guy in the car with all the ballots.
00:51:10.000 And this was recorded on his Snapchat.
00:51:13.000 I did not secretly record him.
00:51:14.000 Now you might ask, how did I get it?
00:51:16.000 This is a private Snapchat that one of my sources obtained.
00:51:19.000 So he posted it to an individual or two, and that person sent it to me.
00:51:25.000 It has a timestamp on the Snapchat.
00:51:27.000 It's July 2nd.
00:51:29.000 So we didn't even record this, Tim.
00:51:30.000 It was recorded by himself, of himself, in his car with many, many, many absentee ballots.
00:51:36.000 It's illegal in Minnesota to have more than three.
00:51:38.000 We put this out, all these clips of people saying, we're breaking the law, we don't care.
00:51:42.000 All these Somalis doing cash transactions for ballots.
00:51:45.000 It's all on the tape.
00:51:47.000 And the New York Times puts out an article, Tim, two hours before the presidential debate, because I'm pretty sure Trump was going to mention this.
00:51:53.000 You know, he's talking about fraud, ballot fraud.
00:51:55.000 And it says, you know, videos are part of a coordinated disinformation campaign, experts say.
00:52:02.000 Who are these experts? Stanford University researchers who they pay.
00:52:07.000 So what's interesting about this, and this is something that, you know, again going back to the Soviet Union,
00:52:11.000 they tend to project onto me that which they are.
00:52:16.000 They are the thing that they hate, and they accuse me of doing the thing that they do.
00:52:21.000 And they do it first, and they do it well.
00:52:23.000 And it's an incredibly effective Machiavellian tactic.
00:52:26.000 So they come out and say it's a coordinated disinformation campaign, when that's precisely what the New York Times is doing.
00:52:30.000 Now I'm going to speculate for a minute.
00:52:32.000 I think the executive editor of the New York Times knew that was defamatory, and it is, and I would win on summary judgment if I sue them, and I'm about to, but they told me they're going to retract the article.
00:52:39.000 We'll see if they retract in the coming days, and if they don't, I'll sue them for defamation.
00:52:43.000 So that pulls the teeth out of any suit?
00:52:45.000 The fact that they said, oh yeah, we're going to retract?
00:52:46.000 Yeah, the General Counsel said that they're going to make a correction.
00:52:49.000 But the USA Today has already used the article, so the USA Today is going to have to print a retraction.
00:52:53.000 So walk us through that, though.
00:52:55.000 Let's get to that point.
00:52:55.000 Well, let me just finish the point about... So I threatened to sue the New York Times for defamation.
00:53:02.000 And they basically say that it's a disinformation campaign.
00:53:05.000 The New York Times, I think they made a utilitarian calculus and said, we know this is defamatory, but we need to muddy the waters of O'Keefe's work three weeks before the election, and we'll just deal with it two years from now in a trial.
00:53:19.000 And they knew they were part of it.
00:53:22.000 They are part of their own disinformation campaign, and it's genius to call me a disinformation expert.
00:53:29.000 They're the ones engaging in disinformation by contacting Stanford, quoting some idiot in a room who says, it's probably disinformation.
00:53:37.000 It is literally, I've never been more pissed.
00:53:40.000 So anyways, here's what happens.
00:53:41.000 We put a video out, USA Today quotes the New York Times, Facebook views USA Today as credible, and sends a notification out to every single person who has
00:53:51.000 ever shared that Minnesota video. I got texts from everyone in the country:
00:53:54.000 James, Facebook just told me you're fake news. And I think we've got... my
00:53:58.000 attorney says, and I don't want to be in the litigation business, Tim, because
00:54:01.000 there's only so many people I can sue, but my general counsel told me today, what about suing
00:54:06.000 Facebook?
00:54:06.000 I mean it's just the only thing these people respond to is raw power.
00:54:12.000 That's what they want.
00:54:13.000 They only care, the only thing they respond to is a threat to their power.
00:54:18.000 And I have not, and this is a pretty extraordinary accomplishment, I don't mean to brag, Project Veritas Corporation has never lost a lawsuit.
00:54:25.000 Not once.
00:54:25.000 We've been sued over a dozen times.
00:54:27.000 We've won eight straight lawsuits.
00:54:29.000 Why?
00:54:29.000 Because they'll take it all the way to the Supreme Court.
00:54:32.000 And at the end of the day, I'm in the right.
00:54:34.000 The facts and law are on our side.
00:54:36.000 So, a similar thing happened to me.
00:54:38.000 Someone took a tweet of mine that was factually correct about Bill Clinton, Epstein Island, and then reposted the screenshot.
00:54:45.000 Facebook said it was fake news.
00:54:47.000 Facebook's liaison for politics told me they can't do anything about it because it's the organizations that post the fact check.
00:54:55.000 The fact-checking organization told me to basically shove it, even though they knew what I posted was true, they have the power to deem it fake news.
00:55:01.000 Even though the link they put on it confirmed everything I said.
00:55:05.000 Which made me think, no, Facebook's the one who published that.
00:55:09.000 When Facebook sends a notification to your followers, or to people who have shared the story, that you are fake news, that's Facebook making a statement.
00:55:17.000 I think you should sue Facebook.
00:55:18.000 Yes.
00:55:19.000 I think you should challenge their Section 230 protections under the rules that they are not fairly moderating, and they personally made, Facebook made, a defamatory statement in that presumably algorithmic response they put out.
00:55:32.000 This is a very important point you're making, and I want your audience to know that it's very important we fight back.
00:55:37.000 It is a pain in the butt, but I have to sue these people.
00:55:41.000 And I will sue the New York Times.
00:55:42.000 If they don't retract the article, I'm suing them.
00:55:43.000 I swear, it's going to happen.
00:55:45.000 And I have to go through Discovery.
00:55:46.000 They'll go through Discovery.
00:55:47.000 It's very labor-intensive.
00:55:49.000 It's a headache.
00:55:50.000 I went to a trial once, a jury trial.
00:55:53.000 You're putting your company on the line.
00:55:54.000 But you have to fight back against these people in the courts.
00:55:57.000 And there's a lot of federal judges out there who do believe in the rule of law.
00:56:01.000 You can't defame someone.
00:56:02.000 You can't intentionally, maliciously... you know, if you're a public figure like me or Tim, it's actual malice.
00:56:09.000 You have to prove that they knew that they lied.
00:56:11.000 Yep.
00:56:12.000 That's hard.
00:56:13.000 And it's hard to prove.
00:56:14.000 But I think we'd win on summary judgment after discovery against the New York Times.
00:56:17.000 And Nick Sandman sued the New York Times and CNN.
00:56:19.000 He settled out of court.
00:56:22.000 Would you get them to downscale the value of USA Today as a fact-checking organization?
00:56:28.000 I don't know how I could do that.
00:56:30.000 What do you mean by downscale?
00:56:32.000 Well, if USA Today came out and said that what you did was fake news and they were wrong, can you force Facebook through litigation to discredit USA Today henceforth?
00:56:43.000 It's a war of attrition.
00:56:46.000 All these litigation battles.
00:56:47.000 But USA Today is going to have to print a correction, because the New York Times is going to have to print a correction.
00:56:52.000 And when that happens, we can go to Facebook.
00:56:54.000 But the problem is, while all this is happening, we're two weeks away from a presidential election.
00:56:58.000 I continue breaking videos.
00:57:01.000 And media establishments... No, no, no.
00:57:02.000 It's a disinformation campaign, according to the New York Times.
00:57:05.000 So, you have to... As an entrepreneur, I've had to build a whole division within Project Veritas to just fire legal letters off to publications every day.
00:57:12.000 We have, like, two people doing that every single day now.
00:57:14.000 For people that want to start an organization like this, it's very hard.
00:57:17.000 Well, how would they do it?
00:57:18.000 So walk me through it. How did you start the organization and how did you build it out?
00:57:23.000 I mean, as they say in economics, the barriers to entry... The only other person that I know of that has tried to do this and was very effective is David Daleiden of the Center for Medical Progress.
00:57:36.000 He did the Planned Parenthood baby harvesting videos, and he was instantly sued by everyone.
00:57:41.000 Kamala Harris.
00:57:42.000 Is it Kamala or Kamala?
00:57:44.000 Kamala.
00:57:44.000 What is it?
00:57:45.000 Kamala.
00:57:46.000 Kamala.
00:57:46.000 I don't want to get that wrong.
00:57:48.000 Yeah.
00:57:48.000 Kamala Harris, then Attorney General of California, raided his home, you know, took his drives and they charged him with umpteen felonies.
00:57:57.000 I gotta jump in on this.
00:58:00.000 You've got a successful organization.
00:58:02.000 You're bringing in money from donors, you're getting millions and millions of views, you know, you're breaking these huge stories that I think, what, you got like 8 million views or what was it the last?
00:58:09.000 Yeah, the Minnesota thing was like...
00:58:11.000 20 million, on Twitter, one video is 20 million.
00:58:14.000 Well, so what's a regular person going to do?
00:58:17.000 I mean, look, when I got defamed by the Today, you know, the Today show put a picture of my face up on TV and claimed I was a conspiracy theorist.
00:58:25.000 And it was totally, what they accused me of was made up entirely.
00:58:28.000 And I couldn't do anything about it.
00:58:29.000 Yeah.
00:58:29.000 Just, they just, they can lie and they can get away with it.
00:58:32.000 I don't have the ability to go up against NBCUniversal, which is, like, Comcast, a multi-billion dollar international corporation.
00:58:40.000 Well, Dan Boorstin, this is the quote we use at Project Veritas to answer your question, what can I do?
00:58:45.000 You're going back to the what can I do question, which I think is the most important question in the world.
00:58:49.000 In this world of illusion and quasi-illusion, our heroes tend to be anonymous.
00:58:52.000 So school teachers, janitors, cops, just honest, decent, salt-of-the-earth people.
00:58:57.000 And most of them are just drinking their coffee, going to work, don't mess with my life.
00:59:02.000 But there's like 0.1 or 0.01% of the people that say, you know what?
00:59:05.000 Screw it.
00:59:07.000 There are some things more important in life than bread.
00:59:11.000 There are just things that are more eternal.
00:59:14.000 So I don't know if many people can, you can contribute one dollar to Project Veritas, I suppose.
00:59:20.000 That could be something you could do.
00:59:22.000 But I think there's, Tim, there is a tiny fraction of people that are willing to make the ultimate sacrifice for a cause greater than themselves.
00:59:30.000 And the greatest sacrifice right now in this world appears to be giving up your reputation. And that is a sacrifice that few people are willing to make, but there are people willing to do that, and that's what they can do.
00:59:42.000 Well, I guess what I mean is if John Doe, plumber in Minnesota, gets defamed by the New York Times, what's he gonna do about it?
00:59:52.000 I mean, granted, public figures have different standards.
00:59:54.000 This regular guy can, you know, have an easier go of it for defamation.
00:59:57.000 But there are smaller public figures.
00:59:59.000 You know, when Nick Sandman was smeared across the board, they actually argued he was an involuntary public figure, meaning they can put a camera in your face and then claim you're a public figure and that's their defense.
01:00:08.000 Granted, there needs to be more organizational entrepreneurs like you.
01:00:13.000 Like others, like you.
01:00:15.000 Someone needs to start an organization raising money, suing the New York Times for defaming ordinary citizens.
01:00:20.000 I can't do that.
01:00:21.000 I'm only one man.
01:00:21.000 I work 110 hours a week.
01:00:23.000 If I work any more, I'll get sick and die.
01:00:26.000 There needs to be more people with, I'm just going to say it, people with balls.
01:00:30.000 And there just has to be more people that do this sort of thing.
01:00:34.000 I can't do all of it.
01:00:36.000 I encourage other people to start organizations and have balls.
01:00:39.000 Oh yeah, for sure.
01:00:40.000 Right. There is a great idea. Start an organization and sue the New York Times for defaming ordinary
01:00:46.000 citizens. And by the way, you'll raise $50 million.
01:00:49.000 Oh yeah, for sure.
01:00:50.000 Easy, done. I'd probably do it in another life.
01:00:53.000 But now they're going to be like, that's trying to destroy real journalism.
01:00:57.000 No, no, no, no, no, no. It is, I mean, there are philosophical and constitutional arguments
01:01:03.000 against defamation in this country. You can't just maliciously lie about someone.
01:01:07.000 They do it every day.
01:01:08.000 Well, yeah, but we're a nation of laws. And in the federal court system, you can win.
01:01:14.000 In the federal court system in this country, I mean, Trump appointed, like, I don't know how many, 200 federal judges.
01:01:19.000 There are Article 3 constitutionally appointed federal judges who believe in the rule of law.
01:01:24.000 I took it all the way to a jury verdict in this case in North Carolina with this Democracy Partners individual who sued me for quoting her.
01:01:31.000 Wow.
01:01:32.000 I quoted a guy talking about this woman.
01:01:34.000 The woman sues me.
01:01:35.000 I'm like, why the hell are you suing me?
01:01:36.000 He's the one who said it.
01:01:38.000 And I quoted him.
01:01:39.000 And it got all the way to a jury verdict.
01:01:40.000 Wow.
01:01:41.000 A jury verdict.
01:01:42.000 There's an incredible documentary on YouTube about this.
01:01:44.000 We produced it.
01:01:45.000 And it gets all the way.
01:01:46.000 The jury is coming out of the box.
01:01:47.000 They sit down.
01:01:48.000 I'm like, I can't believe them.
01:01:49.000 There's a jury about to issue a verdict.
01:01:51.000 Multi-million dollar defamation case.
01:01:53.000 And the federal judge, a guy named, what was his name?
01:01:57.000 Last name, federal judge.
01:01:59.000 In North Carolina.
01:02:00.000 He says, stop, stop it.
01:02:02.000 And he pulls out a rare Rule 50 directed verdict in federal court.
01:02:06.000 Case dismissed.
01:02:07.000 Wow.
01:02:07.000 And he says, this is outrageous.
01:02:10.000 If Mike Wallace were being sued, people would laugh.
01:02:13.000 Those are his words.
01:02:14.000 And he said, O'Keefe was just quoting, they're just quoting the guy.
01:02:18.000 Why did it get that far?
01:02:19.000 It got that far because of my brand.
01:02:21.000 I edit videos.
01:02:22.000 When you actually deconstruct all the BS, and you actually take it all the way to a jury and to an appeals court.
01:02:30.000 There's still some semblance of the rule of law in this country, and people need to sue media corporations, and they have to go all the way.
01:02:38.000 It's expensive, though, man.
01:02:39.000 Well, like I said, have some guy like you start a company with that as the sole mission statement.
01:02:44.000 Don't make me do it.
01:02:45.000 Not you, but someone, if it were a different life.
01:02:48.000 Maybe I can clone myself, and it's Project Sue the New York Times.
01:02:53.000 You will raise $50 million, and you'll have so much business.
01:02:58.000 project to sue the New York Times.
01:03:00.000 I love it.
01:03:00.000 It's very on-the-nose.
01:03:01.000 So like a non-profit that raises money to help clients sue?
01:03:04.000 Has to be.
01:03:05.000 501c3.
01:03:05.000 5-1-C.
01:03:06.000 Tax-deductible donations with a stated mission to expose fraud and hypocrisy.
01:03:10.000 How about the Citizens Defamation Defense Fund?
01:03:13.000 That's very clever.
01:03:14.000 I like that.
01:03:14.000 What is that?
01:03:15.000 CDDF?
01:03:15.000 Yes.
01:03:16.000 You know what?
01:03:16.000 That sounds too much like Washington, D.C., you know?
01:03:19.000 Get them.
01:03:20.000 It sounds like an organization that doesn't do anything.
01:03:22.000 How about the Citizens Retribution Center?
01:03:24.000 Yes.
01:03:24.000 I like it.
01:03:25.000 That's better.
01:03:26.000 Defamatory Retribution Legion.
01:03:29.000 You could raise millions.
01:03:30.000 CRC.
01:03:31.000 I mean, if the Southern Poverty Law Center can raise 500 mil.
01:03:33.000 And store it all, what, in the Bahamas or something?
01:03:37.000 But you've got to have balls.
01:03:39.000 You've got to stand up.
01:03:40.000 There were many times, I'll give you one quick story, anecdote.
01:03:42.000 from that North Carolina case, where they put me in a room with these plaintiffs, the people suing me, and they have an arbitration hearing.
01:03:50.000 They try to negotiate, and the judge comes in there and he goes, now O'Keefe, you could lose, you could lose your company, you could risk it all.
01:03:56.000 Why don't you just give this person $25,000 and call it?
01:04:00.000 You're going to spend a million dollars, O'Keefe.
01:04:02.000 Just give her $25,000.
01:04:03.000 And I said something to the effect of, so help me God, I will not settle this lawsuit.
01:04:08.000 So what they try to do is threaten you and scare you and all this crap.
01:04:12.000 And I just say, you know what?
01:04:13.000 If I have to go bankrupt, if I have to give it all away, if I have to liquidate the clothes and my watch and go bankrupt, I have to be willing to risk everything in order to stand on principle.
01:04:24.000 You have to be willing to do that if you want to play in this game.
01:04:29.000 If you are going to give them an inch or sacrifice your principles or sell out or fear anyone, don't even start it.
01:04:36.000 I don't think people realize when I say, you know, I've had people message me saying, Tim, Tim, you know, you've talked about you get banned and you get in your van and just go off, live down by the river.
01:04:45.000 No, you gotta tell people to be strong, you're gonna keep fighting.
01:04:48.000 And I'm like, well, listen, you know what I'm saying when I say I'm absolutely happy living out down by the river with nothing?
01:04:55.000 There's nothing you can take from me.
01:04:57.000 I will throw everything and the kitchen sink at you if you try and screw with me or what I believe in.
01:05:02.000 I will sacrifice literally everything to win.
01:05:06.000 I completely agree with you, and I applaud you for it.
01:05:08.000 Don't settle.
01:05:09.000 Other people shouldn't.
01:05:10.000 This is something that's always bothered me my entire life, is people saying things like, why won't someone else do it for me?
01:05:17.000 Terrible.
01:05:17.000 No, if you want it, you stand up for it, and if they don't give it to you, you walk away.
01:05:21.000 But when you just cave and give in, then what do you actually have?
01:05:24.000 A lot of the people that are doing the change the world thing,
01:05:27.000 they get co-opted or they get distracted. Their mission is not to do it, it's to raise money.
01:05:31.000 So, I run a 501c3 corporation under the IRS. We're a tax-deductible, charitable organization.
01:05:37.000 So if you're in the business of running a... you end up...
01:05:39.000 your business model is, I need to raise money.
01:05:42.000 So your focus becomes, my deal every day is to make more money.
01:05:47.000 That is not my objective.
01:05:49.000 My stated objective is to do journalism and win every lawsuit that I file and is filed against me.
01:05:54.000 So in order to accomplish that objective, money is a means to an end.
01:05:57.000 It's not my end.
01:05:59.000 I'm not a for-profit organization.
01:06:01.000 And I hate to put it in that way, it makes me sound like I'm not a capitalist.
01:06:04.000 But I do believe, Tim, that the economic imperative, just like that ABC News guy said, Eric Spracklin, in February.
01:06:11.000 Remember the ABC News guy?
01:06:12.000 He said the economic imperative is corrupting, Disney Corporation is corrupting ABC News.
01:06:16.000 And they sell Marvel comic books instead of telling the news.
01:06:19.000 It's just investigative reporting.
01:06:21.000 There's no business model for it.
01:06:24.000 And anyone who comes up with some ingenious paradoxical way to make a profit doing real investigative reporting, it costs us a million dollars sometimes to do a story.
01:06:32.000 A million dollars.
01:06:33.000 You want to know what I was?
01:06:34.000 I was at a big investors meeting for a major, there's a major, major company that invests in all these different media companies, and I'm not going to say these names or anything, but you know what they told me?
01:06:42.000 They said, if you are an investigative journalist, and you come to me, an investor, and make me a proposition, you give me $300,000, and I'm gonna investigate this story for the next year.
01:06:53.000 My first question is, what do you project to accomplish by the end of that year?
01:06:58.000 And you know what every investigative journalist says?
01:07:00.000 What's that?
01:07:00.000 I have no idea.
01:07:01.000 We're gonna investigate, we'll see what we find.
01:07:03.000 So you mean to tell me I'm gonna give you 300 grand to investigate something, and I have no idea what I'm getting back in return?
01:07:08.000 Yeah.
01:07:08.000 Not interested.
01:07:09.000 I tell you what, though.
01:07:11.000 You come to me and say, we're going to write rage-bait articles about Donald Trump.
01:07:14.000 How much is that going to cost?
01:07:15.000 Eh, five grand.
01:07:16.000 How much do I get back?
01:07:17.000 Oh, we're going to make half a million by the end of the year.
01:07:18.000 Done!
01:07:19.000 Here's a check right now, on the spot.
01:07:21.000 That is corrupting media.
01:07:22.000 That is corrupting journalism.
01:07:24.000 Yeah, yeah, the rage, the clickbait, the woke clickbait stuff is, you know, what's the guy who, the girl who resigned from the New York Times and said, I'm just sick and tired of seeing a thousand anti-Trump, what was her name?
01:07:37.000 Bari Weiss.
01:07:38.000 Bari Weiss said that quote, it was very eloquently stated, I don't remember, it was something to the effect of, I'm just tired of seeing all the anti-Trump stuff.
01:07:45.000 How many more articles on the op-ed page do you need to see?
01:07:47.000 So there's a hunger for information, but I think the real meat and potatoes of what fundamentally the issues are is people need to have more courage, and there needs to be more people like you, and people who are organizational entrepreneurs, so you're a creative person who also runs your own company.
01:08:04.000 And you have to, and Tom Fitton and I were talking about this,
01:08:06.000 the Judicial Watch guy, and he said, James, you're an entrepreneur journalist.
01:08:11.000 And I realized in my life that, okay, so I can't settle the lawsuit,
01:08:16.000 but in order not to settle the lawsuit, I have to raise millions of dollars.
01:08:20.000 How am I going to raise, I had to figure this stuff out.
01:08:24.000 And I never thought, in my craziest imagination, that I'd have 50 plus employees,
01:08:30.000 and traveling around the United States, jet-setting around, getting checks from foundations.
01:08:36.000 But it wasn't my stated goal to do that.
01:08:39.000 I wanted to do whatever was right, and I had to learn how to do all this other stuff
01:08:44.000 to do the thing that was my passion.
01:08:46.000 Why don't you guys start a straightforward news website, actually writing articles, fact-checking?
01:08:53.000 Because the reason why I don't do that is because there's only so much time and I have to be very specific.
01:08:59.000 I have to focus on the critical thing that we do.
01:09:03.000 I believe that video is, you know, this should be self-evident, but the medium is the message to quote Marshall McLuhan.
01:09:11.000 It has to be hot.
01:09:12.000 It has to be smoking-gun, videotaped evidence.
01:09:15.000 It has to be incontrovertible.
01:09:16.000 Even when it is incontrovertible, they say it isn't evidence.
01:09:19.000 So I just have to focus, Tim.
01:09:21.000 I can't do mission creep on these other things.
01:09:24.000 I have to focus on the task at hand.
01:09:25.000 I think, you know, you have a 501c3 and a 501c4, correct?
01:09:28.000 So what's the difference between the two?
01:09:28.000 Yes.
01:09:31.000 So the IRS, I think this is absurd, but the IRS says if I investigate a politician within 90 days of an election, I am doing, you know, political action.
01:09:40.000 Political advocacy.
01:09:41.000 I've never advocated for anything in my life.
01:09:43.000 I expose, but I've never advocated for anybody.
01:09:46.000 So one of our attorneys said, just to be careful, you should start a 501c4.
01:09:50.000 The money's not tax deductible, but it is a donation to the C-4.
01:09:56.000 And that's this Project Veritas, just with the word action.
01:10:01.000 Correct.
01:10:01.000 If we're investigating Hillary Clinton or a senator from Arizona, a lot of these people in the states, these employees for Senate candidates will privately admit that their bosses are lying about their progressive views.
01:10:17.000 They're minimizing them in order to get elected.
01:10:20.000 And once they get elected, they'll do all the things that they told you they couldn't do.
01:10:23.000 I'm going to ask you this question, I'm pretty sure everyone can guess what your answer is going to be.
01:10:27.000 Have you ever deceptively edited a video for any reason?
01:10:33.000 No.
01:10:34.000 We've done hundreds of investigations, and I will list right here, right now, the specific things that they talk about, like mistakes I've made, if you want to call them that.
01:10:45.000 In the Acorn investigation, which is the one most of my critics will bring up, they'll say, he didn't wear the pimp costume!
01:10:50.000 In the video!
01:10:52.000 Well, in some of the offices, in fact, Hannah was dressed like a prostitute in every office.
01:10:56.000 In one of the offices we had the pimp fur.
01:10:58.000 But no, I wasn't dressed like that flamboyant pimp in the video.
01:11:01.000 But pimp protocol doesn't require the wearing of a coat to be a pimp.
01:11:05.000 I said I wanted to whore out girls, classify them as dependents on the tax form.
01:11:09.000 So that's kind of a ridiculous criticism.
01:11:11.000 The second example they used, we were talking about this last night, was in the NPR video in 2011.
01:11:14.000 That was nine years ago.
01:11:16.000 There was one scene where Ron Schiller, the vice president of National Public Radio, tells someone he thinks is a Muslim fundamentalist.
01:11:23.000 He's kind of quoting somebody else, but mid-sentence it becomes his own thought.
01:11:29.000 So an editorial decision was made in the timeline to create an in-point, to cut, when he went mid-sentence.
01:11:36.000 That's it.
01:11:36.000 The video was 12 minutes long.
01:11:39.000 It was one part.
01:11:40.000 And the way the media works is they yammer about this one specific editorial decision.
01:11:46.000 I don't think it was a material edit, Tim.
01:11:48.000 That is the only two examples in hundreds of investigations.
01:11:53.000 And look at my Wikipedia page.
01:11:54.000 All it talks about are those two examples.
01:11:57.000 And this is an outrage, this is a God standard they're trying to hold you to.
01:12:01.000 No journalist is God.
01:12:04.000 Everyone makes mistakes.
01:12:05.000 So I bring this up because I remember when, I can't remember exactly what expose you did, but they said you deceptively edit videos and you secretly record people so, you know, it's unethical.
01:12:16.000 And at the same time, Channel 4 in the UK did this big undercover expose on like Brexit supporters or something.
01:12:21.000 I don't know if you remember this.
01:12:22.000 And I'm sitting on Twitter and I'm like, wait a minute, they're praising this and condemning this at the exact same time, it's the exact same thing.
01:12:28.000 No, it's the deceptive editing is just, I mean, all journalism is edited in a selective fashion.
01:12:34.000 Words are arranged onto newsprint.
01:12:36.000 Bob Woodward, in his book Fear, and the new book, which I don't even know the name of the new book he wrote, but Bob Woodward uses deep background sourcing.
01:12:44.000 And something that Tom Wolfe invented, like this new journalism, this thing where, Tim, he meets with a source, and the source tells him a story on hearsay, and then he puts the quotes from the one source in other quotes, and he tells it like it's a direct quote.
01:13:00.000 What is that if not selective editing?
01:13:02.000 Our stuff, every word, you can see the person's lips moving. It's almost asinine, it's pathological if you think about it, that journalists like Bob Woodward, who quote people who quote other people and then put that stuff in quotes, are saying that I selectively edit when you can see the person's lips moving.
01:13:23.000 So what you begin to realize when you open your eyes is that it's worse than 1984.
01:13:28.000 As Gavin McInnes says, it's become almost a Kafkaesque nightmare.
01:13:33.000 It's no longer Orwellian.
01:13:35.000 It's just this pathologically insane double standard.
01:13:38.000 And the only solution is to just keep moving forward.
01:13:41.000 You just have to keep putting out the work.
01:13:44.000 You gotta keep moving.
01:13:45.000 There's a metaphor I was thinking about that is if you're in the military and you're under fire in the military, if you hunker down and sit there, they're gonna kill you.
01:13:52.000 They're gonna zero in on you and surround you and kill you.
01:13:55.000 Same in journalism.
01:13:55.000 If you stop to defend yourself, they're gonna circle you.
01:13:58.000 You just keep pushing even though it doesn't make sense.
01:13:58.000 Well, you have to do both.
01:14:01.000 No, you have to do both.
01:14:02.000 So we'll put the Google video out today, put another one out tomorrow, put another one out tomorrow.
01:14:05.000 But you simultaneously have to have a division within Project Veritas with lawyers that just files defamation lawsuits.
01:14:10.000 You have to actually do both things concurrently, which is to answer your question, who can do this?
01:14:15.000 You'd have to be a masochist to do this.
01:14:18.000 You have to love inflicting self-pain.
01:14:21.000 I'm sure there are humans who can do it, but I was put in federal prison.
01:14:25.000 I was charged with a crime I didn't commit.
01:14:27.000 I thought my life was over in Louisiana ten years ago.
01:14:31.000 And I was on federal pretrial release living with my parents.
01:14:36.000 It was all nonsense.
01:14:37.000 I didn't do it.
01:14:38.000 But you're right.
01:14:39.000 You have to keep moving forward.
01:14:41.000 You can't stop.
01:14:43.000 No matter what.
01:14:44.000 Unless, as Elon Musk says, you're dead or incapacitated.
01:14:48.000 You just have to keep putting out product.
01:14:50.000 More stories.
01:14:52.000 And I don't think they would stop.
01:14:53.000 People say you're going to go to jail once Biden gets elected.
01:14:56.000 Truth and Reconciliation.
01:14:57.000 Truth and Reconciliation.
01:14:58.000 Robert Reich.
01:14:58.000 Who said that?
01:14:59.000 Truth and Reconciliation Commission to name the politicians, the moguls, the collaborators of Donald Trump.
01:15:06.000 Truth and Reconciliation Committee.
01:15:08.000 That's right out of the Emmanuel Goldstein manual, 1984.
01:15:11.000 Truth and Reconciliation.
01:15:12.000 So will we go?
01:15:13.000 What does that look like when we go to jail?
01:15:16.000 And, you know, I think it will be the removal from society.
01:15:19.000 What does that mean?
01:15:20.000 So if you look at Laura Loomer.
01:15:23.000 Purged from every platform.
01:15:26.000 She's been banned from some banks, I think.
01:15:27.000 Has she?
01:15:28.000 I think so.
01:15:29.000 So they start removing your ability to function in society.
01:15:32.000 They start stripping your access to financial institutions, communications, etc.
01:15:36.000 I think that's what it means.
01:15:37.000 And so it reminds me of... I can't remember which dystopian novel this is, where they excise you from society.
01:15:44.000 You're free to move around, but everyone just ignores you, and they won't talk to you, and you're like an other, and you can't shop or anything.
01:15:50.000 Yes.
01:15:50.000 That SWIFT global payment system.
01:15:52.000 If anyone out there has info about the SWIFT global payment system and wants to contact Project Veritas to uncover that corrupt, centralized thing.
01:16:00.000 Because if they ban you off the SWIFT payment system, you can't use Visa, you can't use MasterCard.
01:16:03.000 I mean, think about this.
01:16:04.000 If you're banned from Visa, MasterCard, pick one.
01:16:07.000 If one of those companies bans you, you got no credit card.
01:16:09.000 The way I look at it, I hear what you're saying.
01:16:11.000 I understand the argument, you know, everyone gets banned and information gets censored.
01:16:14.000 But I also think that, like, let's take a very clear example, like a videotape.
01:16:18.000 Let's think of a nonpartisan example, like a federal judge on tape accepting a briefcase full of cash.
01:16:24.000 Yeah.
01:16:24.000 And let's say that I'm banned, or you're banned, or anyone's banned.
01:16:27.000 But let's just say we have this clear, incontrovertible videotape of a federal judge taking a bribe.
01:16:33.000 It just seems to me that the people who are not banned, you can use them as proxies and text them the video and they can send it out.
01:16:42.000 I just think that content is king.
01:16:44.000 I hear you, but what happens when you can't pay your rent anymore?
01:16:49.000 Well?
01:16:49.000 Because you have no bank.
01:16:50.000 Crypto.
01:16:51.000 And your money is in a shoebox under your mattress or whatever.
01:16:54.000 So that's what I fear.
01:16:55.000 I do want to bring something else up though because I want to talk about how the media plays this.
01:17:00.000 There's a really funny phenomenon in news about how journalism fact-checks itself.
01:17:09.000 There's an inherent bias that the New York Times is credible.
01:17:12.000 The Washington Post is credible.
01:17:14.000 So from that standpoint, they'll start using the New York Times and the Washington Post as a gauge of whether or not another organization is credible, creating this bias feedback loop.
01:17:24.000 Yes, feedback loop is right.
01:17:26.000 So the example I use is NewsGuard.
01:17:28.000 It's the easiest way to explain what's broken in media because they've quantified it.
01:17:33.000 Before, you'd be like, why are they saying James O'Keefe and Project Veritas are fake news or deceptive?
01:17:38.000 Well, I think they have an agenda.
01:17:39.000 They don't like that you're exposing these long-standing institutions and you're hurting them politically.
01:17:44.000 So what they'll do is, they'll say that you're deceptive, you're biased, or part of a coordinated smear campaign.
01:17:49.000 People then just trust the New York Times.
01:17:51.000 But I'll tell you this.
01:17:53.000 NewsGuard is an organization that rates the credibility of various news organizations.
01:17:58.000 They give you guys a proceed with caution.
01:18:01.000 That's crazy.
01:18:01.000 The website fails to adhere to several basic journalistic standards.
01:18:04.000 But they do say you do not repeatedly publish false content, but for a variety of reasons
01:18:09.000 they say you're not credible, which are the examples you've actually given.
01:18:13.000 Those two things, among some other points about not knowing where your fundraising comes
01:18:17.000 from.
01:18:18.000 They give Media Matters of America green checks across the board.
01:18:22.000 For those that don't know, Media Matters doesn't do journalism at all.
01:18:25.000 It is just a group of activists that complain about their opinions on other organizations and make things up, quite literally fabricate things that aren't true.
01:18:34.000 You know what's funny?
01:18:35.000 Can I just jump in here for a second?
01:18:36.000 Like I'm not verified on Twitter and I don't, how many followers do I have?
01:18:39.000 I think you got more than I do.
01:18:41.000 I think I have like eight, 800,000 or close to that.
01:18:44.000 So it's time to verify James.
01:18:45.000 Okay, but let me just say something about this because this is pretty, this is 778,000 followers on Twitter and I'm not a verified human being.
01:18:53.000 And I think at some point it's like, it's actually a badge of honor to not be verified.
01:18:59.000 Like, with this whole thing, I think the inverse is true: we're not verified, we're not credible. But when people like that do that to me, and the powers that be, like USA Today, have to do that thing and Facebook has to send that ridiculous notification.
01:19:12.000 It actually has an inverse effect.
01:19:14.000 People go, wow, it's like the World War II bombers are over the target.
01:19:18.000 Like, I've never seen anything like, everyone's, James, I just got a notification on Facebook, they said your video is fake.
01:19:24.000 The fact that Facebook did that is almost a verification in and of itself.
01:19:29.000 And it draws more attention.
01:19:30.000 It's kind of a, you know, I'm pioneering here with my logic, but it seems to me, and I've thought about this not being verified thing.
01:19:38.000 I don't know many, there's only a handful of people that are in this league, where they've got a million followers and they do all this work that everyone's always talking about, and they're not verified.
01:19:49.000 In many ways, that is a kind of verification, isn't it?
01:19:51.000 Yeah, you're kind of like, what's the right word for it?
01:19:56.000 A rebel.
01:19:57.000 Rebel, rogue, eyepatch.
01:20:00.000 If everyone's saying, don't believe this man, he's fake, he's terrible.
01:20:03.000 If people are loudly saying that all the time, then you must be doing something right.
01:20:09.000 Going back to the cliche from Churchill, if you have haters, good, it means you stand for something.
01:20:14.000 So I think they're walking a very slippery slope.
01:20:17.000 Constantly saying how fake everything is.
01:20:19.000 If anything, they're kind of just quiet.
01:20:22.000 Like, unable to engage in defamation, journalists are typically rendered mute.
01:20:25.000 They don't say anything at all.
01:20:26.000 And this Dave Weigel character is a great example.
01:20:28.000 They don't even open their mouths if they can't say anything negative.
01:20:31.000 So anyway... What's his story?
01:20:33.000 Well, he was the guy who fabricated that thing I mentioned.
01:20:35.000 So we have this thing called Retracto the Correction Alpaca.
01:20:37.000 I love it.
01:20:38.000 That song's good, by the way.
01:20:40.000 Professionally get that recorded.
01:20:41.000 We're making plushies, and everyone loves the song.
01:20:44.000 It was written 10 years ago.
01:20:46.000 It was an Andrew Breitbart creation.
01:20:47.000 It was a quick story.
01:20:49.000 I was arrested, and everyone said I committed a felony, felony, felony, and we kept getting corrections and retractions.
01:20:54.000 And Andrew Breitbart says, at like 11 o'clock one night, he calls me, and he goes, James, what if there was a mascot for the corrections?
01:21:00.000 And he's like, let's think of it.
01:21:01.000 And we're on the phone, completely exhausted, and Andrew Breitbart says to me, is it retraction alpaca?
01:21:09.000 And literally, I shit you not, I laughed until I cried.
01:21:13.000 We were just laughing for an hour about this ridiculous creation.
01:21:16.000 And then Andrew Breitbart died two years later and then it went away.
01:21:18.000 And then like a year ago I resurrected the retraction alpaca.
01:21:21.000 And every time a journalist prints a retraction, an angel gets its wings, and Retracto the Correction Alpaca...
01:21:28.000 So Dave Weigel prints two, we get two retractions on Dave Weigel, the Washington Post.
01:21:32.000 Here's a guy, I'm going to say something about Mr. Weigel.
01:21:35.000 He was a friend of my comms director, Steve Gordon.
01:21:37.000 Steve Gordon passed away.
01:21:38.000 Weigel wouldn't even email me.
01:21:41.000 Just, these people are so filled with hate. And so what we do is, you know, we frame the retractions, put them on the wall. Yep, Dave Weigel has two retractions in his honor. And he'll hate me for the rest of his life for it, I'm sure.
01:21:57.000 You're gonna run out of space.
01:21:59.000 We need a whole new building just to put all the retractions in.
01:22:01.000 Seriously.
01:22:01.000 Hundreds of retractions.
01:22:02.000 A huge wall of all the retractions.
01:22:04.000 Well, so here's what I'm thinking, right?
01:22:06.000 You keep exposing this stuff and it becomes more and more apparent and evident that there's some kind of...
01:22:12.000 I mean, look, you've got corrupted individuals at high-ranking institutions, media companies are lying to us, social media companies.
01:22:20.000 We see it in the documentary, The Social Dilemma.
01:22:22.000 We've got a bunch of former executives saying straight up, they're manipulating you on purpose, and it's making people go crazy, and it's creating this massive partisan divide.
01:22:30.000 Watch The Social Dilemma.
01:22:31.000 I don't know, we watch tidbits, but I say, for everybody listening.
01:22:34.000 Where do you watch it?
01:22:35.000 Netflix, I think.
01:22:35.000 Netflix.
01:22:36.000 Check this out.
01:22:37.000 They show, have you seen that Pew Research, where it shows the left and the right moving far apart?
01:22:41.000 Yes.
01:22:41.000 But the right barely moves.
01:22:43.000 The right barely moves to the right, and the Democrats go very far to the left.
01:22:47.000 This is what I see happening.
01:22:49.000 Social media companies are banning conservatives and anti-establishment voices. It's not necessarily just conservatives, and I don't necessarily know what that means anymore anyway, but people who are not associated with the establishment left and who challenge the system are more likely to get banned, and many of them happen to be conservative.
01:23:05.000 By getting rid of the fringe and bombastic elements of the right, the only people left are regular-looking conservative individuals. Like, you look like a normal guy, James O'Keefe.
01:23:16.000 You've got people like Will Chamberlain, who, just suit-wearing, lawyer, Trump supporter, very calm, rational guy.
01:23:21.000 Very, very, you know, easygoing.
01:23:24.000 On the left, however, they're pushing things in favor of this more extreme, insane, cultish rhetoric.
01:23:32.000 Antifa is given a free pass across the board.
01:23:35.000 And so politicians see the insane rhetoric, believe that's where the Democrats and the left are today, and embrace it.
01:23:42.000 And it leaves the regular people behind.
01:23:44.000 So you've got social media.
01:23:46.000 Here's what I'm trying to say.
01:23:47.000 When you look at the Pew research showing the Democrats have moved very, very far left in terms of... It's consistently liberal, meaning they don't hold any conservative views or negotiate with conservatives, and conservatives have barely moved at all to the right.
01:24:00.000 Conservatives are staying where they are.
01:24:01.000 They're regular people.
01:24:03.000 The media is manipulating people and driving them into this insane leftward direction, which eventually at some point... Well, we'll see.
01:24:10.000 We'll see what happens on November 3rd.
01:24:12.000 No, what I'm saying, I'm not saying that every single person is.
01:24:15.000 I'm saying they've captured people in a storm that's going crazy.
01:24:18.000 That's true.
01:24:19.000 It's leaving the regular people behind.
01:24:21.000 So this is what we saw.
01:24:23.000 My main segment today covered Gallup's party affiliation poll. The latest update they have, from September 14th to 28th, shows there is a Republican plus-one advantage in party affiliation in this country.
01:24:38.000 Four years ago, in the same time duration, it was Democrat plus five.
01:24:43.000 Independent voters are also up two.
01:24:45.000 So that means what we're seeing now is several percentage points of Democrats no longer identify as Democrat.
01:24:51.000 A couple now identify as independent and some identify as Republican.
01:24:55.000 So they're being driven away from the Democratic Party.
01:24:57.000 What I think is happening is the things you expose, the thing we're seeing, I think, well, I absolutely think you're having an impact.
01:25:04.000 And I think that people are starting to see through the facade, realizing these people are going insane, what they're promoting is insane, and now they believe they're Republican or Independent, they're leaving the Democratic Party because of it.
01:25:16.000 I think what people need is hope.
01:25:17.000 I think the thing I fight in speeches and wherever I go is just the sheer amount of cynicism.
01:25:23.000 Nothing, you see this in the comments, nothing matters, nothing will be done, nothing will be done about it.
01:25:26.000 Like, I see this top comment every time: nothing will happen to Ilhan Omar, no one will get arrested.
01:25:31.000 It's just this cynical, nasty, nihilistic sentiment.
01:25:36.000 And I say stop whining and complaining about it.
01:25:38.000 I say be brave, do something.
01:25:40.000 It's such an obvious motto.
01:25:41.000 It's just like a kindergartner can understand it.
01:25:45.000 So here's what gets scary to me.
01:25:47.000 The average person can't do anything to Ilhan Omar.
01:25:51.000 What do regular people do?
01:25:52.000 Well, I said it earlier, and it's a very simplistic answer, but I think it's true.
01:25:56.000 You can fight.
01:25:57.000 In other words, you can jump on a grenade.
01:25:58.000 I mean, nothing that I do, or we do, or any journalist does, is even close to a comparison with someone like Marcus Luttrell, who saw his fellow combat veterans go overseas and die for their country.
01:26:12.000 Think about that.
01:26:13.000 And I don't know if people ask them whether they fear for their lives, but these are people who go overseas and make the ultimate sacrifice for a cause.
01:26:21.000 How many people in Washington, D.C. do that?
01:26:23.000 Can you think of one?
01:26:25.000 Can you think of one member of Congress?
01:26:27.000 No, no, I'm serious.
01:26:28.000 I'm sorry, a member of Congress?
01:26:31.000 Can you think of a person who will make a sacrifice in a civic way, in the same vein? You know what I mean?
01:26:43.000 You say that there are people in the military who would jump on a grenade to save their platoon.
01:26:48.000 There are people who would travel overseas and risk their lives for the good of America.
01:26:53.000 In D.C., those are the people that would pick up a child and hold it in front of them to shield themselves from the grenade.
01:26:58.000 That's what I'm talking about.
01:27:01.000 You have to do that.
01:27:02.000 There's no easy answer.
01:27:03.000 It's like, what can I do?
01:27:05.000 Well, most people aren't going to do it, but you don't need most people.
01:27:08.000 You just need .001% of the people to do it.
01:27:10.000 You know what you can do is call her office and you can coordinate calling campaigns because if you get another thousand people to call her office at 2 p.m.
01:27:18.000 on Thursday and then do it again on Friday and again on Monday, you will change people's minds.
01:27:23.000 Let me tell you what someone did do.
01:27:25.000 An insider in the Somali community.
01:27:27.000 And the Somali community is very, you know, it's hard to infiltrate or do undercover work there because they're very tight-knit.
01:27:33.000 There was an insider who was upset by what they saw, and they contacted us.
01:27:37.000 That's what they did, and it made an impact.
01:27:41.000 That investigation... So, Tim, what can people do?
01:27:45.000 It's the Dennis Prager line.
01:27:47.000 You can fight, you can donate a dollar or ten dollars or a thousand dollars to those who fight, or you can do nothing.
01:27:54.000 Those are your options.
01:27:55.000 Very simple.
01:27:56.000 Pick one.
01:27:57.000 You can afford to... I'm not gonna raise money on your show, but you can afford to send money, $10 to a cause you care about.
01:28:03.000 I think the first and most important thing everyone can do is speak up.
01:28:08.000 That's it.
01:28:09.000 In fact, share shows like this and share the videos from James.
01:28:13.000 The video you just put out.
01:28:16.000 Did you know that my... I have four YouTube channels.
01:28:20.000 I control three of them because Scanner is editorially independent.
01:28:23.000 They do their thing.
01:28:25.000 Of the channels I control, two of them are blacklisted on Google.
01:28:28.000 You cannot Google search TimCast or TimCast News.
01:28:31.000 If you do, only this show comes up, because this show is new.
01:28:34.000 At some point, they removed my main YouTube channels from Google so they cannot be searched for.
01:28:40.000 Taking away all of that information, and the only reason any of it works is because people on YouTube might link to it, or people are sharing it.
01:28:48.000 So I tell you this, man: if you're watching, well, I'll say there's two things you can do.
01:28:52.000 If you think what James is doing is important, and the information you're exposing, then share those videos, and then, just to throw back to what you said, what was your email address?
01:29:02.000 It's VeritasTips, that's V-E-R-I-T-A-S Tips, at protonmail.com, fully encrypted. Also, projectveritas.com/brave if you want to sign up to work for us or do anything else like that.
01:29:17.000 Blow the whistle.
01:29:18.000 VeritasTips at ProtonMail.com.
01:29:19.000 There are a lot of people working for tech companies right now watching this, and actually they're emailing me as I speak.
01:29:24.000 So I just want to stress this.
01:29:27.000 I think people need to realize you're much more powerful than you realize, and there are things you can do. Literally, you go to work and you can be like, hey, did you guys see that thing from Project Veritas?
01:29:39.000 Yes.
01:29:39.000 What Google is doing?
01:29:40.000 And then when they're like, what is this?
01:29:42.000 You just talk about it, share the ideas.
01:29:44.000 That's how you change it.
01:29:45.000 Everyone on this podcast right now watching this, go to Twitter and just tweet the video out.
01:29:52.000 There's, I don't know how many people, 100,000 people watching this right now.
01:29:54.000 Just do that.
01:29:57.000 50,000 people tweeting that Google video out will change things.
01:29:59.000 You can also start using the Brave browser and DuckDuckGo search engine.
01:30:04.000 Google, I love you, but your browser is proprietary and it's biased as all get out from what I can tell.
01:30:10.000 No offense, but that's what the algorithm is doing.
01:30:12.000 So something like the Brave browser and DuckDuckGo will circumvent that.
01:30:17.000 I guess you would call it.
01:30:18.000 People are scared to say words.
01:30:20.000 They're scared to say who they're going to vote for.
01:30:22.000 There was research from a group of PhDs.
01:30:25.000 They found 10% of people who are going to vote for Trump are likely to lie about who
01:30:30.000 they actually support to pollsters.
01:30:32.000 You just got to say it.
01:30:34.000 Be brave is a great... I guess, what is it, your motto?
01:30:37.000 Slogan?
01:30:38.000 It's like got milk, you know.
01:30:40.000 We were just sitting there reading the Harvard Business Review trying to come up with some motto that came to us.
01:30:46.000 Yes, you got it.
01:30:47.000 Listen, courage is the virtue that sustains every other virtue.
01:30:50.000 So you have to.
01:30:52.000 I go back to my analogy about sacrifice.
01:30:54.000 You have to make a sacrifice, Tim, if you want to actually truly make a difference.
01:30:59.000 You have to sacrifice something.
01:31:01.000 And I think the thing that people fear most is not their life.
01:31:05.000 Right now in this society, and I'm talking domestically, I think they fear their reputation.
01:31:12.000 And I can tell you as someone who's taken a lot of arrows over the years, and it's very personal.
01:31:16.000 And I do this, and I'm a sensitive guy.
01:31:19.000 I have a heart.
01:31:20.000 If I didn't have a heart, I would not be good at my job.
01:31:23.000 The hardest thing for me to accept was to be hated.
01:31:29.000 I don't want to be hated.
01:31:31.000 I don't need to be loved, but I don't want to be despised by everyone that I grew up respecting.
01:31:39.000 And I remember right after, this is about my first year, you know, 10, 11 years ago, I remember looking at my Wikipedia page.
01:31:44.000 It was just, it was awful.
01:31:46.000 And I remember thinking, it's breaking my heart to read this, the first thing you see.
01:31:51.000 You all know how Wikipedia works, and how ridiculous it is, and they quote this guy, and that was retracted, but the citation's still up.
01:31:57.000 And I can't change it.
01:31:59.000 There's nothing I can do.
01:32:01.000 And to me, that was the hardest part, was accepting, was literally saying, I'm not going to read the comments, and I'm not going to give a shit about Wikipedia, I'm just going to move forward.
01:32:11.000 For me, it was very hard to accept being hated.
01:32:14.000 And not, you know what?
01:32:17.000 I had a guy, an advisor, who said to me many years ago, I was disgusted by all the attacks, vicious attacks, people trying to infiltrate me, women, all types of weird situations.
01:32:27.000 And he said, James, it's a badge of honor that they do this to you.
01:32:32.000 And his name was Steve, and he told me this, and I went, oh my God.
01:32:36.000 It's a badge of honor that they attack you in this sick, twisted, underhanded way.
01:32:42.000 You're effective.
01:32:42.000 This guy was a Vietnam veteran.
01:32:44.000 And I said, he's right.
01:32:47.000 And I've had to ignore the Wikipedia, and I've had to ignore the haters, and I've had to just press on.
01:32:51.000 Right?
01:32:52.000 The slogan of human history.
01:32:53.000 Just press on.
01:32:55.000 So is there an example of anybody you looked up to growing up that doesn't like you or has said bad things about you?
01:33:02.000 That's a good question.
01:33:05.000 I get what you're saying.
01:33:06.000 I don't want... No, no, no.
01:33:07.000 Specific... Well, the New York Times I did respect in high school.
01:33:12.000 I'd read it.
01:33:13.000 I think the New York Times has gotten a lot worse.
01:33:15.000 Definitely.
01:33:16.000 There are a lot of institutions...
01:33:20.000 You know, CNN, I wanted to be liked initially.
01:33:26.000 There's a large part of me that wanted to be respected by these establishments because I respect their power.
01:33:33.000 And the moment you stop caring about what the executive editor of the New York Times thinks, the moment you don't care is the moment you are truly free.
01:33:41.000 This is the moment you are free to do the good things that you need to do.
01:33:45.000 But there's always 4% of you, because you're human, there's 4%, maybe for me it's down to almost 1%, where you're like, you know what, part of me still cares.
01:33:54.000 I bring this up because I think you will get your due.
01:33:59.000 I'll tell you this right now, I'm pretty sure there are young people who look up to you and you're the person they're looking up to as a young person.
01:34:07.000 They're going to grow up, and they're going to give you that respect that you wanted from these institutions.
01:34:11.000 As long as you don't sell out, back down, compromise, settle a lawsuit.
01:34:16.000 The arc of the moral universe, I always quote Martin Luther King, the arc of the moral universe is long and it bends towards justice.
01:34:23.000 But I remember reading the New York Times in my dining hall at Rutgers.
01:34:27.000 I was a freshman, sophomore, and I just sat there and read the New York Times back to back.
01:34:31.000 And I was so disgusted by it, and I decided that someone should do something.
01:34:37.000 I started filming my professors and ambushing them.
01:34:42.000 This is two years before YouTube.
01:34:44.000 This is 2005.
01:34:44.000 So it was a Kodak digital camera with QuickTime on a website.
01:34:49.000 There was no YouTube.
01:34:50.000 Wow. 2005.
01:34:51.000 So James, do you think that Andrew Breitbart would be proud of what you've done?
01:34:56.000 I hope so.
01:34:56.000 I mean, he was someone who... Andrew Breitbart was someone who was very much a... Him and I were very different people in many respects.
01:35:06.000 Very different.
01:35:07.000 But I talked to him every day.
01:35:09.000 And what he would do is understand... He would tweet a thousand times a day.
01:35:14.000 He was very aggressively in the weeds on everything and fighting every battle.
01:35:21.000 Probably why he died so young, 42, 43.
01:35:23.000 And we were very much aligned in that way.
01:35:25.000 And when he died, you know, nature abhors a vacuum.
01:35:28.000 Nobody has quite filled his shoes.
01:35:30.000 I think it's you.
01:35:31.000 Many others have filled it.
01:35:33.000 But, you know, I think so.
01:35:37.000 I wonder what life would be like and what the world would be like if he never died.
01:35:40.000 Well, Jon Stewart gave you praise.
01:35:43.000 Back in the day.
01:35:43.000 That was many moons ago.
01:35:46.000 I don't know.
01:35:47.000 What's the new guy at Daily Show?
01:35:48.000 They're all awful.
01:35:50.000 So unfunny.
01:35:51.000 But here's the thing.
01:35:51.000 Jon Stewart, back in the day, he was looking at what you did.
01:35:55.000 I think it was the ACORN thing, right?
01:35:57.000 It was ACORN, yeah?
01:35:58.000 It was ACORN.
01:35:58.000 He praised it.
01:35:59.000 He was like, journalists, where are you?
01:36:00.000 Look at this guy.
01:36:02.000 And that was amazing.
01:36:03.000 Two kids from the cast of High School Musical 3?
01:36:05.000 Journalist, where the hell were you?
01:36:07.000 Meet me at camera three.
01:36:08.000 Where were you?
01:36:09.000 He was like, what did he say?
01:36:11.000 You know who broke this story?
01:36:12.000 These two.
01:36:13.000 It's like two kids, a 25 year old and a 20 year old with a Sony minicam.
01:36:17.000 He was foretelling the future.
01:36:20.000 I'll tell you this, Jon Stewart has praised Trump on the 9-11 victims fund.
01:36:24.000 Jon Stewart's been a pretty level-headed guy, but he left the space.
01:36:30.000 And now we've got a bunch of plastic people in his wake.
01:36:33.000 There were people that, you know, gave you respect, although many of the institutions were still against you early on.
01:36:38.000 Here's what I'm trying to get to.
01:36:40.000 I felt similarly.
01:36:41.000 I was like, I'm gonna keep saying my thing.
01:36:44.000 I'm gonna say what I believe.
01:36:45.000 I don't care if, you know, people get mad at me.
01:36:47.000 And there have been instances where I'm like, you know, there are some high-profile people that I've looked up to when I was younger, and I'm like, I wonder what they would think about me and everything I do.
01:36:55.000 I'll tell you this, man.
01:36:57.000 I get messages from the people I used to look up to when I was 14, these pro skateboarders, who message me saying, I love your show, you're the best.
01:37:04.000 There you go.
01:37:05.000 And so, what I think is, it's, you just gotta stay true to yourself, and if you have principles, if you're honest, if you have integrity, you do good work, you never back down.
01:37:16.000 I think then any honest person worth having respect of will give you the respect.
01:37:20.000 But you must understand, and we have a curriculum, we have an ethics curriculum at Project Veritas, like we have a whole week seminar where we bring in journalists, and I spend the first day talking about ethics, like I'm obsessed with the history of undercover work, because if you don't have integrity, nothing matters.
01:37:36.000 Now what's interesting and ironic about this is that they will, the first thing they will do is say you have no integrity.
01:37:42.000 So the more integrity you have, The more you will be attacked for having no integrity.
01:37:48.000 And that creates a feedback loop, a self-fulfilling prophecy, a perverse incentive, where you will actually not have integrity so that you are branded as having integrity.
01:38:00.000 And I'm telling you, the hardest part, if you want to do this, anything like this, is to accept being hated by all the people you want to be liked by.
01:38:10.000 You have to be hated by... Who's the guy, Tom Arnold, in True Lies?
01:38:17.000 Here's a guy who... I grew up watching these movies with Tom Arnold, and suddenly he's doing tweet storms about me.
01:38:22.000 It's very ironic.
01:38:23.000 I mean, Tom Arnold's an affable guy, you know, he's a good actor, I enjoyed his movies, and he's just...
01:38:29.000 I don't know what he's on, a drug-induced binge, tweet-storming.
01:38:34.000 Seriously?
01:38:35.000 I tweeted the song White Lines by Grandmaster Flash at him when he was tweeting at me last time.
01:38:40.000 So anyways, it's crazy, huh?
01:38:42.000 Hey, regarding the future, I want to ask a question about deepfakes, because your business model is to get video, and in the future there are going to be deepfakes. That's when they take video and change it, a computer modulates it so it looks real, but it's completely, you know, whatever you want to call it, fabricated.
01:38:59.000 Exactly.
01:39:00.000 How are you going to deal with that?
01:39:02.000 Well, this goes back to the defamation law.
01:39:06.000 You can't just lie about someone.
01:39:08.000 Newspaper reporters don't even use video.
01:39:10.000 They use words.
01:39:12.000 They can make them up.
01:39:13.000 They can write any words they want.
01:39:15.000 So if you think of it through that prism, there's a new paradigm, and the paradigm will be that there will be fake things out there.
01:39:22.000 But there's always been fake things.
01:39:23.000 There's been fake language.
01:39:24.000 There's been fake arguments.
01:39:25.000 There's been fallacious arguments.
01:39:27.000 There's been newspaper articles that have no evidence.
01:39:29.000 There's been forgeries.
01:39:30.000 There's been anonymously sourced crap in the New York Times with no documentation.
01:39:34.000 So this is not a new issue.
01:39:38.000 McLuhan would say it's a hot medium, television is a hot medium, so I suppose it's a bit more concentrated a problem, but you still can't intentionally lie about someone. In a court of law, you would lose that defamation case.
01:39:54.000 So I cannot create a fake video about someone.
01:39:57.000 They would sue me and win.
01:39:59.000 So I think there's going to be litigation around these deep fake videos.
01:40:02.000 I think people will bring fake videos to you.
01:40:05.000 You have to vet them.
01:40:06.000 You have to do the hard shoe leather reporting, the sourcing, the corroboration.
01:40:10.000 And by the way, a lot of people give me a lot of stuff.
01:40:13.000 I'm sitting on some stuff.
01:40:14.000 We should have a dead man switch and release all of it, by the way.
01:40:17.000 I'm sitting on some crazy... You mean you don't?
01:40:19.000 Don't admit that.
01:40:19.000 I'm sitting on some crazy shit right now.
01:40:21.000 Like, beyond anything you can imagine.
01:40:24.000 I can't publish it.
01:40:24.000 You know why?
01:40:25.000 Because I haven't corroborated it.
01:40:27.000 Mm-hmm.
01:40:27.000 Yeah.
01:40:28.000 Well, how about we take some super chats?
01:40:30.000 Let's do it.
01:40:31.000 Some questions.
01:40:32.000 Make sure, before we do, you smash that like button.
01:40:35.000 Correct.
01:40:35.000 We have a lot of people who are hanging out.
01:40:37.000 We have some really important points.
01:40:38.000 I'm not going to be able to read out, obviously, every single super chat, but I'm going to try and read as many as I can.
01:40:41.000 This is a very important one.
01:40:43.000 Scott Hale says, Tim, I'm a Trump supporter for very similar reasons you are.
01:40:47.000 I appreciate everything you do, but sadly, mainstream media is winning my family.
01:40:51.000 My family is majority Democrats and they told me to my face:
01:40:54.000 Any proof I physically show them is a lie.
01:40:57.000 So I guess I would ask you, James: what would you tell someone if, no matter what they show them, their friends and family don't believe it, even if you have video?
01:41:05.000 That's a good question.
01:41:06.000 I've been asked this before.
01:41:07.000 I'm tired, so I'm trying to remember how I respond.
01:41:12.000 You're never going to be able to convince everyone.
01:41:19.000 I would probably ask them, what is a lie about it?
01:41:21.000 Facts.
01:41:22.000 Just say, this guy O'Keefe hasn't lost one lawsuit.
01:41:25.000 Did you know that?
01:41:25.000 I did not know that.
01:41:26.000 Just ask them questions.
01:41:28.000 Just take another look.
01:41:29.000 It's kind of like a cross to a vampire.
01:41:33.000 Just look at this and tell me what about this is not real.
01:41:37.000 Are you saying that that's not the person?
01:41:39.000 You know, just sort of have a come-to-Jesus moment with the person.
01:41:41.000 Just look at the tape together and re-watch it.
01:41:45.000 I don't know.
01:41:46.000 Some people don't want to believe it.
01:41:48.000 People believe what they want to believe, I've found in life.
01:41:51.000 And it's a very hard thing to change people's minds.
01:41:55.000 But I think you've got to be persistent, you've got to watch the video with them, and you've got to ask questions.
01:42:00.000 What's the next one?
01:42:01.000 Donut Donut says, I'm working for FB.
01:42:03.000 Light it up, James.
01:42:04.000 It's about time.
01:42:05.000 Tired of internal comms from Zuck about how to listen, soul searching, BLM on every poster, and internal tools for coding.
01:42:12.000 So what that person needs to do is email VeritasTips.
01:42:12.000 Oh, snap.
01:42:15.000 VeritasTips at ProtonMail.
01:42:18.000 VeritasTips at ProtonMail.
01:42:20.000 Email VeritasTips at ProtonMail.com, please.
01:42:22.000 And you can use an encrypted deal.
01:42:24.000 You can have a secret account, a phone number, burner phone, whatever you need to do.
01:42:28.000 I gotta be honest, a lot of these superchats are just, you know, just praising you.
01:42:31.000 I'm not gonna read all of them, but Pensive says, James, you are a true American hero.
01:42:36.000 And I mean that in every sense of the word.
01:42:38.000 Thank you more than words can say.
01:42:40.000 Well, that means a lot.
01:42:41.000 Thank you.
01:42:42.000 We've got this one from Joseph.
01:42:42.000 Thank you.
01:42:44.000 He says, Every question asked by mainstream news media should be prefaced with, quote, I'd like to preface my answer with Trump denounces white supremacy and racism.
01:42:52.000 Pass on to Kayleigh McEnany.
01:42:54.000 We got another thank you for James.
01:42:57.000 David Walker says, Not that I think Biden will drop out at this point.
01:43:00.000 Would that mean Kamala would be the Democratic candidate, even though she is a VP pick and not the primary runner-up?
01:43:06.000 I don't know, but I do think Pelosi is setting up the 25th Amendment panel to remove Joe Biden, which we've talked about.
01:43:13.000 Wouldn't the DNC just get to pick somebody if Biden dropped out?
01:43:15.000 It'd probably just be Kamala.
01:43:16.000 Yeah.
01:43:18.000 Mark G says, does Veritas need any programmers?
01:43:20.000 I work for one of the companies involved in the Senate hearing.
01:43:23.000 I just got told that I can't use the terms whitelist and blacklist since it's racist.
01:43:27.000 This company wasn't political five years ago, but now it's evil.
01:43:31.000 Yes, we do need tech people.
01:43:33.000 We always need people.
01:43:34.000 We're just running a company.
01:43:35.000 Email?
01:43:36.000 Email.
01:43:36.000 You can email jobs at ProjectVeritas.com.
01:43:40.000 We're looking for IT support.
01:43:40.000 Excellent.
01:43:42.000 Yes, I know people, but they're going to infiltrate you.
01:43:45.000 We have a very extensive vetting process.
01:43:48.000 We have a big filter.
01:43:49.000 I know people say that about me.
01:43:51.000 And you know, I have to behave in a way that's fairly ethical, such that not if but when I get infiltrated, there's no big story.
01:43:58.000 Because the people that worship you are eventually going to come and try and infiltrate you.
01:44:02.000 They've already done it.
01:44:02.000 Oh, they've already tried.
01:44:04.000 I've been through that hell.
01:44:08.000 And, you know, it is what it is.
01:44:09.000 You have to run the risk of people infiltrating me.
01:44:15.000 But to what end?
01:44:16.000 The only thing I really have to protect are the identity of our sources.
01:44:19.000 And those are quarantined.
01:44:22.000 So, you know.
01:44:23.000 All right, Vsidia says, your Epstein video is the biggest reason I will never trust the mainstream media.
01:44:27.000 If they can't be trusted to report that, why trust them with anything?
01:44:33.000 Thank you for your work.
01:44:34.000 I appreciate that.
01:44:35.000 Yeah.
01:44:36.000 Daniel Rudwick, Bundwick, Bundrick, sorry.
01:44:40.000 And we have a ton of Super Chats, by the way, so I'm just trying to go through them as fast as we can.
01:44:43.000 I'm trying to read through them and find good questions, but I can't, you know.
01:44:46.000 He says, we think we fight evil people, but we forget the line between good and evil cuts through the heart of everyone.
01:44:50.000 That is amazing.
01:44:51.000 That's my favorite.
01:44:53.000 Who said that?
01:44:54.000 So this is Daniel Bundrick, he said.
01:44:56.000 Let me read the full thing for you.
01:44:58.000 We think we fight evil people, but we forget the line between good and evil cuts through the heart of everyone.
01:45:03.000 Blowing the whistle means calling attention to the evil that crawls out of you as well.
01:45:07.000 That's a Solzhenitsyn line from Gulag Archipelago.
01:45:10.000 The line that separates good and evil runs through every human heart.
01:45:13.000 Perhaps the most profound thing said in the 20th century.
01:45:17.000 In a YouTube comment?
01:45:18.000 And we all have this line.
01:45:22.000 Every one of us.
01:45:23.000 It's like what Peterson talked about: if we were born in Nazi Germany, how many of us would do the right thing and have the balls to take on the regime? Most of us don't want to admit the truth. You know, people say, oh, I would be the heroic one that would take on Hitler. Well, you know, the line that separates good and evil runs through every human heart. And that goes full circle to the Google guy today, working for Google, saying this stuff. Um, you know, how culpable is he?
01:45:53.000 I don't know.
01:45:55.000 So I, I love this one.
01:45:56.000 Uh, Curtis McLaughlin Jr.
01:45:57.000 says, Tim, you should take the photo of Biden devouring a child out of frame of your guests.
01:46:02.000 Is that in the frame?
01:46:03.000 Keep up the fantastic work, Tim.
01:46:04.000 It is.
01:46:04.000 That is brilliant.
01:46:05.000 Sorry, Dave.
01:46:08.000 I gotta see this.
01:46:09.000 We were, uh, we're actually planning on rotating the photos.
01:46:13.000 And that's just the one that we put up.
01:46:14.000 Brilliant.
01:46:15.000 So I think it is art.
01:46:18.000 I don't think it is meant... It's just meant to be silly and absurd and it's really great art.
01:46:23.000 You and I have similar tastes because I saw that on Twitter and I was like, this is genius.
01:46:26.000 Isn't it some of the best art ever?
01:46:28.000 It's G prime 85.
01:46:29.000 Who's the guy in the lower right over there?
01:46:31.000 That's Trump.
01:46:32.000 That's Trump?
01:46:33.000 Yes.
01:46:33.000 Is that baby Trump?
01:46:34.000 Yep, he's making fun of Trump.
01:46:35.000 He's got tiny hands.
01:46:36.000 Yes.
01:46:37.000 So that's actually satirizing the insults of Trump.
01:46:42.000 Like the point he was making was that that's what everyone says.
01:46:44.000 He's like his tiny hands or whatever.
01:46:46.000 That's genius.
01:46:47.000 Yeah, yeah, yeah, yeah.
01:46:49.000 Let's see.
01:46:49.000 What do we got here?
01:46:50.000 Dave Kruppel says, I'm a tradesman, proud aviation maintainer, blue collar, and I have neither the brains nor the will to own my own business.
01:46:57.000 I can't have my name and face out there with a pro-Trump opinion.
01:47:00.000 Mm hmm.
01:47:01.000 Dang.
01:47:02.000 That's brutal, huh?
01:47:03.000 A lot of people in Hollywood, you know, do this where they're, again, it's like the Samizdat, which was the revolutionary publication in the Soviet Union, and they have to whisper, you know, I agree with you.
01:47:15.000 I guess we had secret voting.
01:47:17.000 We've had it since the beginning, right?
01:47:18.000 Where you don't have to disclose who you're voting for because of that, because you don't want to get harassed by your neighbors or have people come to your house and be like, you voted for this guy, so we're all coming.
01:47:28.000 Yeah.
01:47:30.000 Jennifer Scott says, I love your work, but it seems like nothing ever changes.
01:47:33.000 We never see criminal charges like in the case of Omar, or people losing their jobs like in the case of what happened in Denver.
01:47:39.000 So I have to ask myself and you, what's the point?
01:47:43.000 This is that cynicism, that hopelessness that is so pervasive and is toxic.
01:47:49.000 The journey of a thousand miles begins with a single step and the price of liberty is eternal vigilance.
01:47:57.000 It's like the metaphor I use, like the black goo in Ridley Scott's Prometheus.
01:48:02.000 It manifests.
01:48:05.000 You have to start somewhere.
01:48:07.000 And the vision of this, which people don't sometimes see, is that one video becomes ten videos become a hundred.
01:48:12.000 Like one whistleblower becomes a thousand.
01:48:14.000 They can stop one man.
01:48:15.000 Now, do people get arrested? I don't have the authority of the Attorney General of the United States.
01:48:20.000 The DOJ may in fact be prosecuting someone, but they can't talk about that
01:48:25.000 because they don't comment on ongoing investigations.
01:48:27.000 Nor should they.
01:48:28.000 Nor should they. It's against the policy. So let's see what happens.
01:48:31.000 Apparently in the case of the New York Post story, the FBI does comment that they're investigating.
01:48:37.000 This is the one time.
01:48:38.000 Then we need to blow the whistle on the FBI.
01:48:41.000 Everyone needs to think bigger for a second.
01:48:42.000 Stop whining.
01:48:44.000 Stop bitching.
01:48:44.000 I'm tired. I'm just going to speak to the person, right?
01:48:47.000 Is that the camera? Yes.
01:48:48.000 Stop. Stop.
01:48:50.000 Can I say stop whining?
01:48:52.000 Stop bitching.
01:48:54.000 Yes. Stop complaining.
01:48:55.000 You know think big for a second.
01:48:58.000 Think big.
01:49:00.000 Imagine if someone had the stones at the FBI.
01:49:03.000 Now, there are two types of courage to fight a war, to quote von Clausewitz, military philosopher.
01:49:08.000 There's the courage to march up a hill with a bayonet and to take the hill.
01:49:12.000 Okay, that's physical courage.
01:49:13.000 And then there's a more rare or an equally amazing form of courage, which is moral courage.
01:49:18.000 And just imagine if someone at that Bureau of Investigation in D.C.
01:49:23.000 wore one of those little hidden cameras and recorded Comey and recorded Christopher Wray.
01:49:28.000 Can you imagine the difference that would make?
01:49:31.000 So stop whining about things and instead think about how you can contribute to that objective in any way, manner, shape, or form.
01:49:42.000 But the negative energy and the cynicism is starting to piss me off, because I happen to represent an organization that has a lot of people right this minute working for tech companies emailing us asking for hidden cameras.
01:49:55.000 So, you can choose to live your life cynical and pissed off and whining, or you can contribute to the vision and the mission of actually trying to do something about it.
01:50:05.000 I got a super chat for you.
01:50:06.000 I'm sorry, but that one gets me a little bit.
01:50:09.000 Hold on, hold on.
01:50:10.000 This one's gonna cheer you up.
01:50:11.000 Tempest83 says, I must not fear.
01:50:14.000 Fear is the mind killer.
01:50:15.000 Fear is the little death that brings total obliteration.
01:50:18.000 I will face my fear.
01:50:20.000 Be brave.
01:50:22.000 Very well said.
01:50:22.000 I think it's from Dune.
01:50:23.000 It's from Dune, yeah.
01:50:24.000 Very well said.
01:50:25.000 I fear, you know.
01:50:26.000 Yo, what about authority?
01:50:28.000 This word keeps coming up.
01:50:29.000 People just like adhere to authority.
01:50:29.000 They kind of believe authority, whether that's, whatever, the New York Times. Authority is kind of derived from the word author.
01:50:35.000 But the word authority, to Tim's point earlier, is changing.
01:50:39.000 Tim is saying that young people don't... Do young people really look up to these bozos on networks?
01:50:44.000 Well, they look up to the author of whoever's speaking to their reality.
01:50:46.000 And if you're the author, or if your message is being proliferated, and there are now many authors, then maybe that's the new authority.
01:50:53.000 Correct.
01:50:53.000 I agree with that.
01:50:54.000 I think the authority is changing depending upon whose stories are most credible.
01:51:00.000 and prevalent and you know if you build it they will come.
01:51:04.000 News has a way of attracting an audience. If people can't get actual news from CNN they'll
01:51:09.000 go to where they can get news. They'll go to Tim. They'll go to people who have new authority. Yeah.
01:51:14.000 What's the next one? Charles Balyozian. Big fan. Want to talk about Armenian war.
01:51:21.000 Biden came out siding with Turkey, saying Armenia can't occupy territory
01:51:25.000 there since 500 BC, while Trump says we're working on it. Just want to shine light on the issue.
01:51:31.000 Artsakh is Armenia.
01:51:33.000 Not entirely sure, you know.
01:51:34.000 I don't know what that is.
01:51:35.000 No, it's Armenia stuff.
01:51:36.000 There's just a lot of, so I'm reading through a lot of these, but a lot of them are, James, you're awesome.
01:51:40.000 James, you're awesome.
01:51:41.000 Yeah, just get some hard questions.
01:51:42.000 I know, and it's difficult.
01:51:43.000 That's a hard one.
01:51:44.000 It's too hard.
01:51:44.000 James, why?
01:51:45.000 James, everybody loves you.
01:51:46.000 I can't, you know.
01:51:47.000 Just let's find a good question.
01:51:49.000 Well, I'll read this one.
01:51:50.000 N.A.
01:51:50.000 says, if you have haters, that literally means you stand for something.
01:51:54.000 Churchill.
01:51:54.000 You do.
01:51:55.000 You do.
01:51:56.000 I'm an independent.
01:51:57.000 Time to make a stand for something.
01:51:58.000 Another Churchill quote that I love.
01:52:00.000 I read a biography of him last year, and it was, it's better to be an actor than a critic.
01:52:06.000 It's better to be better to be making the news than taking it.
01:52:10.000 It's better to be an actor than a critic.
01:52:12.000 So it's another way of saying, I guess, be the man in the arena, you know, to go out there and go there.
01:52:18.000 Again, this is the idea of going to the location.
01:52:21.000 Go there.
01:52:21.000 Don't sit in an air-conditioned booth in Manhattan and, you know, read teleprompters.
01:52:26.000 So, better to be an actor than a critic.
01:52:28.000 Indeed.
01:52:28.000 Benjamin the Rogue says, while I was in the Democratic Party, they accidentally gave me access to a secret server.
01:52:33.000 What was in it has haunted me ever since, and I did tours in Afghanistan.
01:52:38.000 I have lived with the guilt of not being able to get the info out.
01:52:41.000 Thank you for your work.
01:52:43.000 I think this is a really important super chat.
01:52:45.000 I have no idea what this person saw, but imagine you are somebody who is witnessing now something you know is wrong, you know is very wrong.
01:52:52.000 Do you just sit there and do nothing or do you be brave?
01:52:55.000 That is a very good question, Tim.
01:52:59.000 To be or not to be.
01:53:00.000 To blow the whistle or not.
01:53:02.000 And again, a lot of the whistleblowers that I've read about and spoken about view it as this choiceless choice.
01:53:08.000 Eric Cochran said, I had no other option.
01:53:12.000 There was no option for me.
01:53:14.000 There's no optionality in these people's hearts.
01:53:17.000 They must do this.
01:53:19.000 It is a necessary consequence of their destiny.
01:53:23.000 You just know.
01:53:24.000 And if you're sitting there next to your boss, your supervisor, doing some illegal activity, you're like, well, I'm going to lose my home, my mortgage, my reputation.
01:53:33.000 But what if there was an organization that had my back?
01:53:36.000 So CNN, Cary Poarch, the guy that blew the whistle on Jeff Zucker's phone calls, I think he raised $120,000 on GoFundMe after that.
01:53:42.000 Is GoFundMe going to keep allowing that?
01:53:47.000 There will be some other platform.
01:53:48.000 There's a solution, market solution.
01:53:50.000 That was twice his yearly salary.
01:53:52.000 And he got hired by someone who was a freedom person.
01:53:56.000 Again, I think the 21st century is different than the 20th as it pertains to whistleblowers.
01:54:00.000 So if you're sitting there in a work environment and you're thinking, right now someone's thinking, maybe I should report on my colleague racketeering or stealing money from the federal government, all I need to do is take a picture of that deal.
01:54:14.000 But if I do that, then I'll lose my job.
01:54:19.000 99.9% of you won't do it, but if 0.1% of you do it, you'll change the world.
01:54:23.000 There's a lot of people out there.
01:54:24.000 If you have a million people and 1% stand up, that's a lot of people making a difference.
01:54:28.000 And there's so much more satisfaction in taking that other road less traveled.
01:54:35.000 Uh, and there will be a safety net for you at Project Veritas.
01:54:38.000 We will have your back.
01:54:39.000 We will tell your story in a way that nobody else will.
01:54:41.000 And in some cases, and I'll give you a specific example, ABC News Amy Robach, that was given to me by a current ABC News employee.
01:54:49.000 She, he gave me a...
01:54:52.000 Tape.
01:54:53.000 And that person, identity was protected, and that person does in fact still work for ABC News.
01:54:58.000 And they fired the wrong person.
01:54:59.000 And they fired Ashley Bianco, who never was my source.
01:55:02.000 Wow!
01:55:02.000 And she has a wrongful termination lawsuit in the works.
01:55:05.000 Wow!
01:55:05.000 Good!
01:55:05.000 Excellent.
01:55:06.000 So we've got Stumbling Saint says, James, based on everything you know, is there hope coming?
01:55:11.000 I think we went over the hope thing.
01:55:13.000 I think yes.
01:55:14.000 I think we've answered that.
01:55:16.000 I think we are seeing the tide change.
01:55:18.000 You know what really bothers me about that whole Gretchen Whitmer thing, with these guys who wanted to kidnap her or whatever, is that she had already lost.
01:55:24.000 The system worked, surprisingly.
01:55:26.000 I was thinking in 2016 there's no way Trump would win because the system is under control, it's rigged, right?
01:55:32.000 Then Trump won and I was kind of like, wow, if Trump can win, anybody can win.
01:55:36.000 It must be real.
01:55:38.000 That gave me confidence in the system.
01:55:40.000 You look at what happened in Michigan, and the state legislature rules against Whitmer.
01:55:44.000 She defies them.
01:55:45.000 The Supreme Court rules against her.
01:55:46.000 She defies them.
01:55:47.000 The AG says, you have no powers anymore.
01:55:49.000 She lost.
01:55:52.000 The checks and balances worked.
01:55:53.000 Her AG has defied her.
01:55:54.000 The courts and the legislature ruled against her.
01:55:57.000 That, to me, is extremely hopeful.
01:55:58.000 And these lunatics who are planning some dumb garbage mission or whatever just risked everything because it worked.
01:56:05.000 The stuff you're doing exposing them It's going to work.
01:56:08.000 There's people watching right now who are seeing this.
01:56:10.000 There's people emailing right now.
01:56:11.000 Well, it has to.
01:56:13.000 There's no other, other than what you talked about last night, other than physical conflict.
01:56:19.000 You know, Abraham Lincoln once said, public opinion is everything.
01:56:22.000 You know, going back to Walter Lippmann and a lot of these authors that talk about how important sentiment is.
01:56:29.000 Politics is downstream from culture.
01:56:30.000 Culture is downstream from data and information.
01:56:33.000 And what is a hotter medium of information than people speaking in their own words in the most honest, pure fashion?
01:56:40.000 So if that doesn't change things, then nothing will change things.
01:56:46.000 Another way of saying this is, a builder can build faster than a destroyer can destroy.
01:56:52.000 It's kind of a rule of political movements, or rock beats scissors.
01:56:56.000 In other words, a videotape of someone talking defeats propaganda against that thing.
01:57:03.000 Yeah.
01:57:03.000 And if that isn't the case, then I guess we're headed towards, um...
01:57:07.000 Chaos.
01:57:08.000 Civil war. Chaos. Anarchy.
01:57:09.000 7seed says you should set up those dead man's switches right away.
01:57:13.000 Yeah, if that happens, there are all types of weird stuff that's gonna happen.
01:57:16.000 Uncorroborated videos are going to be released into the ether.
01:57:20.000 Price Man says, James, Tim, is there a solution to this problem?
01:57:23.000 Companies manipulating their algorithms, government regulation, breaking up tech monopolies?
01:57:27.000 If there's one thing that communists fear more than anything else, it's being exposed.
01:57:32.000 To expose them.
01:57:33.000 Whittaker Chambers writes about this in Witness.
01:57:35.000 To expose is the power.
01:57:37.000 The solution is exposure.
01:57:39.000 The solution is to simply do what we've been talking about this entire show.
01:57:43.000 This is interesting.
01:57:44.000 Leslie Elizabeth says, Hey guys, my father is the judge that presided over James's criminal case in Louisiana in 2010.
01:57:50.000 I'm now a lawyer in New Orleans.
01:57:52.000 I'm a fan of your work.
01:57:53.000 Wow.
01:57:53.000 What you both do is important.
01:57:55.000 Keep it up.
01:57:55.000 David Knowles.
01:57:56.000 What's his name?
01:57:57.000 Well, this is the super chat comes from Leslie Elizabeth.
01:58:00.000 Leslie Elizabeth.
01:58:01.000 So there was a federal judge and a magistrate judge in that case.
01:58:05.000 The magistrate judge was the one who... I have a misdemeanor conviction in Louisiana, and it was a magistrate judge, David Noll.
01:58:11.000 Anyway, great, Elizabeth.
01:58:12.000 Nice to hear from you.
01:58:13.000 Awesome.
01:58:13.000 Mr. Boogie says, your views and thoughts on Snowden and Assange.
01:58:17.000 Interesting topic.
01:58:19.000 My views and thoughts.
01:58:22.000 I think that a lot of people view this in a black and white way, you know, like he's good or bad, and I would say that I don't view it in that dichotomy.
01:58:30.000 I think it's a false dichotomy.
01:58:31.000 I think someone can both be a hero and a traitor, and I don't want to cast moral judgments on what he did.
01:58:39.000 I viewed Citizenfour, the documentary where, you know, he's being interviewed by Laura Poitras. He comes across as very sincere to me, maybe slightly naive in the way that I was when I started this organization, not knowing the shitstorm that would ensue.
01:58:54.000 But I think that he could be quite literally, in some literal sense, a traitor, but also a hero, and the greater good is served.
01:59:02.000 But that's a very complicated question, and I don't like viewing it in... Because a lot of the people that are betraying these organizations, Tim, that I work with, they are kind of traitors.
01:59:15.000 If someone betrayed Project Veritas, you put anyone's life under a magnifying glass.
01:59:20.000 You're going to find human foibles.
01:59:23.000 You're going to find issues and problems.
01:59:25.000 So, to betray one's country for your country, can't both things be true simultaneously?
01:59:32.000 I think with Assange, he's not an American citizen, and he's a publisher.
01:59:37.000 He receives leaks, he publishes them, so what they've actively been doing to Assange is criminalizing the act of journalism, whatever is political.
01:59:45.000 Are we talking about Snowden or Assange?
01:59:46.000 Both.
01:59:47.000 So, in Assange's case, I view him as a publisher who publishes leaks.
01:59:51.000 Yes.
01:59:51.000 And if you don't like the information, and he's encouraging... Well, that's too bad.
01:59:55.000 His organization can do that.
01:59:57.000 It depends upon the facts.
01:59:58.000 There was a recent criminal complaint against him from Virginia this year or last year.
02:00:03.000 I think... Well, it's the ongoing extradition thing?
02:00:05.000 Extradition thing, but there was an actual complaint filed...
02:00:10.000 Right.
02:00:10.000 criminal thing filed against Assange in the last year, right? It depends upon the facts in that
02:00:14.000 complaint. I'm trying to remember the specifics, but if he coordinated with the leaker, if he
02:00:20.000 constructed it, he said, you know, curious eyes never run dry, right? So that statement
02:00:25.000 to the source about hacking, if he told him to hack, so it really comes down to discovery in
02:00:32.000 the criminal case about what he told the person.
02:00:34.000 Right, right, right.
02:00:35.000 Because I'm protected under the United States Supreme Court case Bartnicki, where if someone goes in there and hacks some stuff and then just mails it to me, of course I can publish it.
02:00:43.000 Yeah.
02:00:43.000 But if I coordinated with that hacker, well that's a violation of federal law.
02:00:48.000 The thing about Snowden, however, is that I don't view Snowden as a whistleblower.
02:00:52.000 What do you view him as?
02:00:52.000 He's a leaker.
02:00:54.000 Uh, this was evidenced by an interview he did with, I think it was John Oliver, where he wasn't familiar with some of the information he leaked.
02:01:00.000 So a whistleblower says, here's the thing that's bad, or here are multiple things that are bad.
02:01:03.000 I better let people know about this.
02:01:05.000 A leaker just gives you documents.
02:01:07.000 I'm not saying that as a, uh, to condone or condemn anything.
02:01:10.000 I'm just saying that's the fact.
02:01:11.000 Yeah.
02:01:12.000 So if you don't like that, he did it.
02:01:13.000 If you do like that, that's on you.
02:01:14.000 I'm just pointing out a lot of people say he's a whistleblower.
02:01:17.000 Well, he did blow the whistle on some things, but he released a ton of documents that he didn't know.
02:01:20.000 But I think this is more of a philosophical question inherent in the question, which is that, well, what do you think about him?
02:01:26.000 Is he good or bad?
02:01:27.000 And I think on some level, you know, it's like undercover journalism.
02:01:32.000 It's like, can you be an ethical undercover person?
02:01:35.000 It's like trying to create dry water or fireproof coal.
02:01:39.000 You can't, it's like, is he good or bad?
02:01:41.000 Well, he did betray his country.
02:01:44.000 He did break the law, many federal laws probably.
02:01:47.000 So it's a very strange philosophical dichotomy.
02:01:52.000 I think overall, with exposing people spying and violating our Fourth Amendment rights, the public's right to know is paramount, but you have to break laws that are serious laws.
02:02:03.000 And by the way, you know what one of our ethical rules at Project Veritas is?
02:02:06.000 Don't break the law.
02:02:08.000 I can't do that.
02:02:10.000 But other people can, and give me information.
02:02:13.000 So, it's very interesting, and I've read the literature on this.
02:02:17.000 It's fascinating, and it's not an easy question to answer, so I would urge your audience not to view it in black and white terms.
02:02:24.000 Alright, you ready for the heavy one?
02:02:26.000 Go ahead.
02:02:27.000 Ford2016 says, how do you deal with the pressure of knowing that people want you 86'd?
02:02:31.000 Powerful people.
02:02:32.000 Is there a limitation, a wall, to the depth of your investigations that you will not move past?
02:02:38.000 86'd, I should know this, killed.
02:02:40.000 Yes.
02:02:42.000 Like 187 on the cop.
02:02:46.000 What is 86?
02:02:46.000 What is the etymology?
02:02:47.000 We don't have any more of that.
02:02:49.000 I can tell you actually.
02:02:51.000 No one really knows.
02:02:52.000 It's a bit of folklore.
02:02:53.000 Some say it has to do with old pre-World War II electronics about lockout switches.
02:02:59.000 Some say it has to do with a Prohibition-era business that was on 86th Street, where the cops would tip off the bar, saying, 86 the customers.
02:03:07.000 Push them out the door to 86th Street, the cops come in on the other street.
02:03:10.000 So others say it means 80 miles out and six feet under.
02:03:14.000 UrbanDictionary.com has the answer.
02:03:15.000 Exactly.
02:03:16.000 Well, that's what they say, 80 miles out, six feet under.
02:03:19.000 What do I think about getting killed?
02:03:20.000 I mean, this is also a very tough question to answer, because there's no... You got to answer it the right way, otherwise you, you know...
02:03:30.000 I'm not a fatalist.
02:03:32.000 I take precautions that I can't talk about.
02:03:35.000 I don't think about it really, Tim.
02:03:37.000 I don't worry about that too much, but we do take precautions.
02:03:42.000 I think the way that they've tried to hurt us so far is to use the justice system against us, like jail and prosecution and reputational threats.
02:03:52.000 But I think we're getting to the point now where you do have to be worried about not someone trying to assassinate you, but a crazy lunatic who's an anomaly.
02:04:03.000 And you have to do some smart things.
02:04:07.000 People always say to me, be careful.
02:04:12.000 I'm like, I'm not going to be careful.
02:04:14.000 I'm not going to be careful at all.
02:04:16.000 But I try not to think about it too much.
02:04:19.000 And there are limits to, I suppose, what I'd be willing to do.
02:04:23.000 There are some topics that would scare the crap out of me, like investigating the Mexican drug cartel.
02:04:28.000 I don't know if that's my beat, you know?
02:04:31.000 They will kill you.
02:04:33.000 There are some stories where you just get killed.
02:04:36.000 And I haven't... I've yet to find a one... I've yet to find a story, and this is honest to God truth, that I've gotten that I didn't publish because I was afraid.
02:04:44.000 So far, so far.
02:04:46.000 There are organizations that will kill you.
02:04:49.000 Yes.
02:04:50.000 That Voltaire quote that if you tell people the truth, they'll kill you unless you make them laugh.
02:04:55.000 Interesting.
02:04:56.000 Had not heard that.
02:04:57.000 A lot of our stuff is pretty serious, you know.
02:04:59.000 It's very serious.
02:05:01.000 We'll do one more that you've already answered, but just to wrap up on.
02:05:05.000 Chris Stroud says, James, if someone brought you info on a right-wing organization, Fox News for example, and offered to expose them via hidden camera, would you do that, or do you only focus on exposing left-wing organizations?
02:05:16.000 I would do it.
02:05:17.000 Of course.
02:05:17.000 Depends upon the information.
02:07:19.000 Someone says, if... The problem with the question is, if someone brought you information on Fox News, would you publish it?
02:05:25.000 The answer is, that depends on what the information is.
02:05:28.000 If someone brought me information on CNN, that depends.
02:05:31.000 Like if someone brought you a video of, say, Jeffrey Toobin cranking it on a Zoom meeting, would you publish that information?
02:05:37.000 I'm kidding.
02:05:39.000 That's interesting.
02:05:40.000 That's a really interesting philosophical question.
02:05:42.000 If someone's jerking off on a camera.
02:05:45.000 It depends.
02:05:45.000 It depends.
02:05:46.000 The whole Kogan thing.
02:05:47.000 It depends.
02:05:47.000 The whole Kogan thing.
02:05:48.000 Remember that?
02:05:49.000 It depends upon what the Zoom call was.
02:05:51.000 I mean, you know, the 9am conference calls at CNN, for example.
02:05:53.000 Probably full of it.
02:05:55.000 The thing about ethics, the beauty and bane of ethics, is it's completely circumstantial.
02:06:02.000 There are no universal rules about journalism ethics.
02:06:05.000 I go back to the Hemingway quote, that you're better off, the real ethical issue, this is a little deep, but you're better off for having done it.
02:06:15.000 Because it's not just that journalism harms people.
02:06:20.000 Journalism harms good people.
02:06:22.000 Information harms people.
02:06:23.000 Truth harms people.
02:06:25.000 If your objective is to reduce harm in the Kantian sense, in the categorical imperative sense, never use someone as a means, you can't actually do journalism.
02:06:34.000 So in every case it's a situational test
02:06:37.000 in which you evaluate the specific factors. But the answer, the bottom line is, I would publish the
02:06:42.000 story about Republicans, conservatives, and I think that we will do more of that,
02:06:49.000 and I think the media is going to have a huge problem on their hands because they're not going
02:06:53.000 to be able to say, well, he's just an ideologue. Well, what are they going to say when it's
02:06:56.000 someone that they don't like, the NRA or... They cheered you for it with the Epstein thing.
02:07:00.000 In one story, it was like, suddenly they're like, oh, can we like this guy now?
02:07:05.000 And I think that says a lot about them.
02:07:07.000 I really do.
02:07:08.000 Asking their rich overlords.
02:07:09.000 You had Seymour Hersh and all these people. Go throughout the 20th century.
02:07:13.000 Upton Sinclair was an avowed socialist, and I say more power to him.
02:07:17.000 He didn't write The Jungle to expose rotting meat conditions.
02:07:22.000 He did it because he supported unionization of workers.
02:07:26.000 If he's an advocate, I don't care as long as the information is real.
02:07:31.000 Seymour Hersh, very anti-war guy, anti-Vietnam War.
02:07:35.000 More power to him, as long as he's reporting the facts.
02:07:38.000 There are plenty of awards.
02:07:40.000 History is replete with unbelievable reporters who were 100% focused on a specific ideology and didn't do anything else.
02:07:48.000 And they all won Pulitzer Prizes.
02:07:50.000 And history views them fondly.
02:07:52.000 So if your argument against me is that I only expose big tech and government bureaus, if your argument is that I'm discredited because of my beat, that's not consistent with any historical view of investigative reporting.
02:08:09.000 The question is, is what you're posting true?
02:08:12.000 Exactly.
02:08:13.000 Well, we're a little bit over, but this is tremendous, man.
02:08:17.000 Seriously, thanks for coming on.
02:08:18.000 This has been... VeritasTips at ProtonMail, VeritasTips at... Be brave.
02:08:23.000 What are your other social links?
02:08:25.000 What do you mean?
02:08:28.000 ProjectVeritas.com, Twitter, it's James O'Keefe III, that's James O-K-E-E-F-E, capital I-I-I, at Twitter.
02:08:39.000 You can follow us there.
02:08:40.000 Right on.
02:08:41.000 Well, thanks for hanging out, everybody.
02:08:42.000 Make sure you smash that like button on the way out.
02:08:44.000 We've got awesome guests throughout the week.
02:08:47.000 Of course, Thursday is going to be the debate, so I don't think we'll be live then, but we're going to have a really great show tomorrow talking big tech censorship, kind of following up on some of your reporting and probably bringing up more of your reporting.
02:08:56.000 So make sure to smash the like button.
02:08:57.000 You can subscribe.
02:08:58.000 We'll have clips up from the show all throughout tomorrow.
02:09:00.000 You can follow me on Twitter, Instagram, and Parler at TimCast.
02:09:03.000 Check out my other channels, youtube.com slash TimCastNews and youtube.com slash TimCast.
02:09:09.000 And of course, you can follow Mr. Ian Crossland.
02:09:10.000 Follow me anywhere and everywhere at Ian Crossland.
02:09:13.000 Oh, very nice.
02:09:14.000 I like that.
02:09:15.000 And that's me.
02:09:16.000 It's Sour Patch Lyds.
02:09:17.000 L-I-D-S?
02:09:18.000 Sour Patch L-Y-D-S.
02:09:20.000 She spelled it with a Y. Yes.
02:09:23.000 But seriously, again, James, thanks for coming on.
02:09:25.000 Thank you.
02:09:25.000 Come back any time.
02:09:26.000 Everybody else will be live tomorrow at 8 p.m.
02:09:29.000 And again, smash the like button.