The Jordan B. Peterson Podcast


490. Bringing Woke Capitalism to a Shuddering Halt | Robby Starbuck


Summary

Robby Starbuck is a man who has been mounting a single-handed war against woke capitalism. He has taken on Tractor Supply, John Deere, Harley-Davidson, and a host of other corporations named in the interview, and he has been successful. So I wanted to know just who this guy is and why he has been so effective in his campaign against woke corporate activism. In this episode, I talk to him about who he is and how he got started, what motivates him, whether he can be trusted, what his strategy is, and how he views its long-term impact.


Transcript

00:00:00.000 Hello, everybody.
00:00:15.100 I had the opportunity today to talk to Robby Starbuck, and I came across him on X.
00:00:24.240 He's been mounting a single-handed war, although he has a team.
00:00:30.000 Against woke capitalism, and I'm not a fan of woke capitalism, I can't imagine anything more preposterous than woke capitalism, because the woke movement is essentially Marxist.
00:00:42.320 It's a deadly enemy of anything vaguely smacking of capitalism.
00:00:46.520 And so the notion that giant corporations are promoting a pathological form of compassionate neo-Marxism is absolutely preposterous.
00:00:56.420 In any case, of all the people that I've been watching over the last couple of years, Robby Starbuck seems to have mounted the most effective sequential campaigns against the woke capitalists.
00:01:09.220 Now, there's been other contenders in that regard.
00:01:11.380 The Republican Treasurers Association, I probably got that name wrong, has done a pretty good job of pushing back against the ESG mongers like BlackRock.
00:01:21.100 But Robby Starbuck has gone after Tractor Supply, John Deere, Harley-Davidson, and a host of other corporations who are named in the interview, and very effectively.
00:01:35.000 And so I've been wondering just who this guy is.
00:01:37.780 You know, from my perspective, he just popped up on the landscape.
00:01:40.680 That doesn't mean that I'm informed enough to know what I should know about where he came from, and we delve into that too.
00:01:46.880 But the first time he went after Tractor Supply, and I thought, well, that's really interesting.
00:01:52.760 And then John Deere, and I thought, oh, that's twice, you know, and he's successful both times.
00:01:57.140 And then Harley-Davidson, which was really the killer as far as I was concerned, because there's nothing more absurd that you can possibly contemplate than woke Harley-Davidson.
00:02:07.740 And so I wanted to know who he was, and what's he up to, and whether he can be trusted, and what's motivating him, and how he views this in the long run, and what his strategy is in his approach.
00:02:19.020 And we covered all of that.
00:02:20.640 And so if you're very interested in how to conduct yourself so that you can be an agent of appropriate change while taking the responsibility necessary to do so, you could do a lot worse than to listen to Robby Starbuck for 90 minutes.
00:02:35.900 Well, thanks for agreeing to talk to me today, Robby.
00:02:39.920 I've been following you for a while on X, as have an increasing number of people, and you sort of, as far as I was concerned, you sort of popped out of nowhere.
00:02:51.160 And all of a sudden, you're causing major trouble to corporations everywhere.
00:02:56.720 Corporations who richly deserve it.
00:02:58.960 And I got to say, Harley-Davidson tops the list in terms of foolish corporate maneuvering, contrary to the interests of their committed consumer base.
00:03:09.100 I think they did something even more foolish than Budweiser, which is really quite a hard contest to win.
00:03:16.320 So the first thing I'd like to know, and perhaps everybody watching and listening, is, well, who are you, and what have you been doing?
00:03:24.480 Let's lay out your plan and your strategy and how you developed that.
00:03:28.140 So just a history of you, and then what it is that you've been up to.
00:03:33.520 Maybe let's reverse that.
00:03:35.120 Let's start with what you've been up to and then do a history of you.
00:03:39.380 Okay.
00:03:40.020 So, well, the way this all started, you know, we've had sources giving us great stories for a long time.
00:03:45.560 And so there's a measure of trust that gets built up.
00:03:48.080 You know, like, for instance, one of those stories was my wife and I, we put out the story about Vanderbilt's transgender pediatric clinic early on.
00:03:54.680 And then Matt made it a huge national story, did a fantastic job.
00:03:57.780 And together we were able to convince the legislature here in Tennessee to ban hormones and puberty blockers and these surgeries for minors.
00:04:08.460 And so as a byproduct of that.
00:04:10.160 When was that?
00:04:11.300 Oh, that would have been last year at the very beginning of the year, maybe even at the tail end of 2022.
00:04:17.440 I'm terrible with dates, but it was, you know, it was fairly recent history.
00:04:20.840 Okay, but that's about when.
00:04:21.900 Yeah.
00:04:22.140 And so, you know, as a byproduct of that trust you build up, people, they start to think of you when they've got a story themselves that they feel like is newsworthy.
00:04:31.140 And so one of those stories was someone came forward from Tractor Supply and they'd worked there a very long time.
00:04:37.240 And they said, you know, you would not believe the stuff happening at this company.
00:04:40.780 I don't recognize it anymore.
00:04:42.240 I love working here.
00:04:43.660 This has been the best place, you know, best decision that I had made, at least I thought, until recent years.
00:04:48.940 And then they laid out the evidence of kind of what had been going on.
00:04:51.660 And to be really candid with you, Jordan, I didn't really believe some of the stuff they were saying.
00:04:56.980 I had to send out a couple of people from my team and myself to verify some of the things they had said, but it all checked out.
00:05:03.760 And so I had sort of this epiphany where I said, this is Tractor Supply.
00:05:08.380 And for those who are not familiar, maybe in a different part of the world, Tractor Supply is like the most American, American place you could go to shop.
00:05:16.000 You know, I mean, this place is filled with like American flags and it's a farm store, you know.
00:05:20.080 So I've got cattle, for instance, Jordan, and chickens.
00:05:22.780 And so like we'd go there sometimes to get some things we needed if there was just like something we needed in a pinch.
00:05:29.140 And my kids love that store.
00:05:30.600 So I realized as a byproduct of this whistleblower coming forward that I myself am helping fund things that are directly opposed to my values.
00:05:39.360 You know, like this company was funding pride events in my own community, in my own state, things that I don't believe kids should be exposed to.
00:05:47.660 No matter what the orientation of the people involved is, I think that it's wrong to expose kids to sexually charged material.
00:05:54.100 So, you know, if I'm not okay with that, I assume a lot of other customers at Tractor Supply are not okay with it.
00:05:59.040 So I said, you know what, we're going to go at this from a very different vantage point.
00:06:02.520 I'm going to look at sort of the history of boycotts, what has worked, what has not worked.
00:06:05.980 And I want to have something I believe is something we can replicate.
00:06:09.640 Because if this is happening at Tractor Supply, this is happening in many other places.
00:06:13.480 So from that look backwards, I realized, number one, all of this craziness really has accelerated since George Floyd.
00:06:20.200 That's where the vast majority of it originated.
00:06:23.380 And so, and maybe not originated, but where it exploded, I should say.
00:06:27.420 And as a byproduct of that, the Overton window shifted wildly, like we've never seen, in my lifetime at least.
00:06:34.260 And so I realized for us to be able to take that back to some semblance of sanity,
00:06:38.700 what we really have to do is make sure we're focused on the right targets at the right period of time and think about this strategically.
00:06:44.920 And we need to avoid the pitfalls of two of the biggest responses PR departments think about.
00:06:50.760 Number one is, let this blow over.
00:06:52.620 So these stories can't blow over.
00:06:53.940 They can't be 24-hour news cycles.
00:06:55.340 And number two is that we understand that the old system was something where these big companies could go to major media outlets.
00:07:04.540 If they knew a bad story was coming, they could say, hey, listen, we're going to up our ad spend, kill the story.
00:07:10.020 Well, we live in a new time now where that's not possible.
00:07:12.240 And I know for a fact, I won't say who, but at least two of the companies that we focused on tried that.
00:07:17.740 They tried going to major media outlets to kill the story.
00:07:20.160 And it worked in that area, but it wasn't enough to kill the story in total because we live in a new dynamic where more people were watching my video than were watching those news networks' reports, you know, on any given night.
00:07:32.520 So, you know, some things have fundamentally changed.
00:07:35.620 And we saw pieces of that through what happened with Bud Light.
00:07:38.440 And so we took the good pieces, left the bad pieces, which was that Bud Light was terribly unfocused in terms of telling people what they needed to do.
00:07:46.040 And Bud Light was also an anomaly in the sense that they were lucky they had such a large product category.
00:07:51.100 Because many people who were trying to punish Bud Light went out and bought a beer that actually was owned by AB, the same parent company that owns Bud Light.
00:07:59.000 So in many respects, it was unsuccessful in that regard.
00:08:02.000 But we said if we do this the right way and we sort of act as, you know, sort of a mouthpiece in general for like how you do this effectively, I think we can make real change happen.
00:08:11.180 And so when it worked with Tractor Supply, we felt like our model was correct.
00:08:15.880 And that essentially relies on shifting the window back to sanity by focusing on the companies first that primarily depend on the conservative consumer walking through their door.
00:08:26.320 And then as time goes on, you kind of slowly shift to like the 50-50, what I call jump ball companies, where it kind of depends on everybody.
00:08:33.140 And then you can look at the ones where maybe conservative consumers are in the minority, which is few and far between, honestly.
00:08:40.040 But once you get there, suddenly you realize, take an eagle-eye view of the situation, that these companies on the far left that would still adhere to these values, they're going to be looked at as the weird ones.
00:08:51.120 And so that's sort of, you know, where we're hoping to get with this project.
00:08:56.480 Okay, okay.
00:08:57.380 So I have a bunch of questions that emerge from that.
00:08:59.720 Let's walk through, in chronological order, if you would, the companies that you've been going after.
00:09:07.140 And just tell the story of each company, if you would.
00:09:09.960 And so you said you started out in this broad space, not exactly going after corporations per se, but you were concentrating on the transgender butchery in Tennessee.
00:09:21.100 Was it in Tennessee specifically?
00:09:23.400 Well, we've done it, you know, we've looked at it in total.
00:09:25.920 Well, we also made the film The War on Children, which at this point is the most watched documentary of the year.
00:09:31.120 I think Matt will surpass us at some point with his new film, and I'm happy for him to do so because it's an incredible film.
00:09:37.300 But that focused on it from a broad national level, all the issues facing kids, which included gender ideology.
00:09:44.180 But yeah, in Tennessee, as far as a news story goes, that was a big focus for us a couple years ago, making sure that at least here, where we live, that that was something that would not be happening to children.
00:09:54.400 Okay, okay. And when did The War on Children come out?
00:09:58.040 It came out in February of this year.
00:10:01.340 February '24, okay.
00:10:02.920 Now, so before we go into the corporations, why don't you let everybody know who you are?
00:10:09.660 I mean, like I said, for me, you burst onto the landscape relatively suddenly.
00:10:13.860 And so just give us a history of your endeavors and how you got involved in this enterprise.
00:10:22.440 I think, you know, it's interesting the way the world works now because it definitely can feel like that.
00:10:30.320 You know, I think in any sense, no matter what somebody does, it's like people feel like, oh, they just kind of popped up, right?
00:10:36.320 I've been around a long time, though.
00:10:37.700 You know, I started out actually as a director-producer in Hollywood.
00:10:40.840 So I directed Oscar-winning actors, actresses, some of the biggest music stars, people like Natalie Portman, you know, Smashing Pumpkins, Megadeth, all across the board.
00:10:49.580 And so that's a unique sort of unicorn-like background for somebody who comes out as openly conservative.
00:10:56.860 But in 2015—
00:10:57.500 That's for sure.
00:10:58.200 Yeah, in 2015, I saw a very dark picture of what our country could become if we did not make the right decisions.
00:11:05.960 And that was something that I think was largely informed by my family history.
00:11:10.460 So my family originally is from Cuba.
00:11:12.940 And so they lost everything to communism.
00:11:14.920 And that anti-communist, you know, sort of education I got as a child about what it steals from you made me first a voracious reader because I realized that information was the most dangerous thing to authoritarians, especially on the left, but authoritarians in general.
00:11:32.900 And so I wanted to read everything.
00:11:34.520 I wanted to know everything I could possibly know.
00:11:36.960 And as a byproduct of that, I became increasingly conservative, you know.
00:11:40.720 But when you try to make it in Hollywood, the reality is you simply cannot be open about it or you're going to be blacklisted.
00:11:47.780 But in 2015, I decided, you know what?
00:11:50.080 I have to be open about it.
00:11:51.620 I've got to talk about it because we're at this crossroads where if we go down one of these roads, America can become Cuba.
00:11:58.680 And if we head down that road, the future for my own kids is a dark one.
00:12:04.100 And my great-grandfather, he was like my father figure.
00:12:07.080 He warned me so many times as a young man, if you ever see these warning signs, you need to speak up and do whatever you can.
00:12:16.900 And so at that point, you know, I had to drop the cowardice because there's a lot of cowardice involved in, you know, sort of that operative mode of thinking where you think, okay, I have to be quiet or I'm going to lose X, whatever that may be.
00:12:31.400 Because that really is how people lose their countries.
00:12:34.320 They lose their countries inch by inch through silence.
00:12:37.900 And so I said, you know what?
00:12:39.740 If this burns down my career and everything I've built up, so be it.
00:12:43.020 I believe in my ability to go and figure out something else.
00:12:46.460 And so I came out.
00:12:47.520 I endorsed Trump in 2015.
00:12:48.700 That's a crucial point.
00:12:50.340 That's a crucial point that you made there.
00:12:53.080 I had two crucial points.
00:12:54.180 You know, the fact that you lose, a country slides into totalitarianism an inch at a time.
00:13:00.240 And it does that because people are unwilling to give up something they have, they think they have, and will remain silent.
00:13:08.180 And then you might say, well, what's the counterposition to that?
00:13:11.340 And the counterposition is that your best way forward is to say what you think.
00:13:16.880 And if you're a credible and able person, then that will open up all sorts of new opportunities to you.
00:13:23.160 It might mean that you lose some of the things that you depended on.
00:13:26.420 But the thing you have to think about in that situation is that if you're in a situation now that is already so rotten that its maintenance requires your silence,
00:13:35.740 you've already turned three quarters into a braying donkey, if you want to use the Pinocchio imagery, or a slave.
00:13:43.160 And maybe you want to cling to your slavery, but if it means you're voiceless, then you've lost.
00:13:49.460 Now all that security is illusory.
00:13:51.180 It's an illusion.
00:13:52.280 So you're giving up something that you actually no longer have.
00:13:55.120 But you also had enough faith in yourself, in yourself specifically, or what?
00:14:00.440 In the pursuit of the truth to motivate you to take the risk of, and exactly what did you do in Hollywood?
00:14:07.360 I mean, I don't mean as a career because you laid that out a bit.
00:14:10.180 But how was it that you revealed your true proclivities?
00:14:14.660 Yeah, you know, there's so many really, I think, important poignant things you just said.
00:14:19.380 I mean, one of them being that Hollywood and entertainment in general and the mindset that sort of permeates throughout it,
00:14:27.820 it is a prison and it is sort of like a mental slavery.
00:14:31.560 And I think that's one thing there's just a really, I think, broken perception about.
00:14:37.540 Because there's this belief, I think, that's pervasive throughout society that celebrities, let's say, have it all, right?
00:14:43.220 And that they're just, they're able to kind of have the world as their oyster.
00:14:47.340 But the truth is, in my experience, having been very close to a lot of these big stars,
00:14:51.240 is that I've never met a more depressed group of people who are more broken internally,
00:14:56.440 who don't know who to trust, and who really have their life kind of in shambles in the most personal ways.
00:15:03.740 And so that was just something I recognized early on in my career.
00:15:07.320 And it made me second guess my own desire to be in that industry.
00:15:11.280 But at the same time, I was a young dad who was focused on, I need to make money.
00:15:15.180 I'm lucky to be here.
00:15:16.320 I've got a great job that's, you know, really paying me well.
00:15:18.920 I need to protect that for my kids.
00:15:20.620 And that's something that a lot of people convince themselves, you know, sort of that
00:15:24.540 that manufactured consent of silence is born through that excuse in your head that,
00:15:29.060 oh, well, I need the money to take care of my family, right?
00:15:31.440 But like you said, the belief is so important in yourself that you can exist in some other way.
00:15:38.120 And anytime you're trapped in a system that requires your silence in order for you to be
00:15:42.220 remunerated, that's probably a really dark system.
00:15:46.060 You know, it's not.
00:15:47.220 Yeah, right, exactly.
00:15:48.080 And there's these people who say as well, when you break out of that, that you're going
00:15:53.240 to lose friends.
00:15:54.280 Well, that in itself is a really, really, you know, kind of poisoned well, because the
00:15:58.600 truth is they're not your friends if they don't know who you are and if they don't love
00:16:02.040 and appreciate you for who you truly are.
00:16:04.680 So it's actually one of the most freeing things is for people to speak up to a group of
00:16:08.100 people and realize, oh, these people who I sort of had this illusion were my friends.
00:16:13.240 They're not actually my friends.
00:16:14.900 And so that's something I encourage young people to do all the time is be yourself, be
00:16:19.120 true to yourself and what you believe in.
00:16:21.300 Because if you're putting on a mask for somebody else so that they will like you, they don't
00:16:26.320 actually like you.
00:16:27.840 They don't care about you.
00:16:29.740 They care about this fictitious person you've made up because you believe that's what's
00:16:34.000 likable.
00:16:34.880 Just be you.
00:16:35.780 And so in that sense, you know, I think your desires when it becomes, you know, sort of
00:16:41.200 onto the subject of work or politics or what have you, it's sort of a similar thing.
00:16:46.040 You just have to be honest and true.
00:16:47.880 And that's what I've always believed, you know, that if I follow what I know is right,
00:16:52.500 things are going to work out.
00:16:53.740 Are you tired of feeling sluggish, run down, or just not your best self?
00:16:58.300 Take control of your health and vitality today with Balance of Nature.
00:17:01.540 With Balance of Nature, there's never been a more convenient dietary supplement to ensure
00:17:05.200 you get a wide variety of fruits and vegetables every day.
00:17:08.120 Balance of Nature takes fruits and vegetables, they freeze dry them, turn them into a powder,
00:17:12.040 and then they put them into a capsule.
00:17:13.860 The capsules are completely void of additives, fillers, extracts, synthetics, pesticides, or
00:17:18.460 added sugar.
00:17:19.420 The only thing in Balance of Nature fruit and veggie capsules are fruits and veggies.
00:17:22.820 Right now, you can order with promo code JORDAN to get 35% off your first order, plus get
00:17:27.640 a free bottle of fiber and spice.
00:17:29.480 Experience Balance of Nature for yourself today.
00:17:32.040 Go to balanceofnature.com and use promo code JORDAN for 35% off your first order as a preferred
00:17:36.520 customer, plus get a free bottle of fiber and spice.
00:17:39.440 That's balanceofnature.com, promo code JORDAN for 35% off your first preferred order, plus
00:17:44.540 a free bottle of fiber and spice.
00:17:47.940 You know, and some of that is religious faith.
00:17:50.500 That's the hallmark of faith.
00:17:52.000 Well, that's the hallmark of faith.
00:17:53.780 It's a kind of courage, first of all, to be willing to take the risks that go along with
00:17:59.240 the truth.
00:18:00.120 But it's also faith in the proposition that the truth does set you free.
00:18:03.980 That doesn't mean that it won't come with interesting twists and turns, let's say, of
00:18:09.940 the sort that you just described.
00:18:11.740 You'll discover who your false friends are.
00:18:14.100 Well, that's painful, but it's useful.
00:18:16.080 The other thing that happens too, and I'm sure this has happened to you, although you haven't
00:18:20.140 mentioned it yet, is that once you do say what you think, you may lose a certain number of
00:18:25.440 people around you, although other relationships will strengthen.
00:18:27.980 But there'll be all sorts of other people that flock to you in consequence who are, I
00:18:34.200 wouldn't say precisely of like mind, but also willing to, let's say, risk the truth.
00:18:39.960 And so whatever you lose on the friendship side, you'll gain in terms of true allies.
00:18:45.620 And that's actually a good deal.
00:18:46.980 Now, you have to wait out the lag period, you know, and there's some unpleasantness that
00:18:52.280 might come along with that.
00:18:53.320 But that's also partly why that faith is necessary, right?
00:18:57.300 I mean, there's an assumption that I've come to, which is twofold, I would say.
00:19:02.980 The first is that there's no difference between speaking the truth and having the adventure
00:19:08.120 of your life.
00:19:08.940 Those are the same thing.
00:19:11.040 And the second thing is that whatever happens to you if you speak the truth is the best thing
00:19:17.280 that could happen under those circumstances, regardless of how it looks to you in the moment.
00:19:22.280 And, you know, your momentary view isn't omniscient.
00:19:25.560 And so the fact that, you know, imagine you suffer for three weeks and then things are
00:19:30.440 really good for a year because of it.
00:19:32.960 Well, those three weeks are still going to be miserable.
00:19:35.300 And if you used your judgment then, you'd think, oh, God, this is a complete catastrophe.
00:19:39.040 But you don't have that longer term view.
00:19:41.840 And I think faith in the redeeming power of the truth is equivalent to the longest possible
00:19:48.220 term view.
00:19:49.000 So, all right, so you were in Hollywood working.
00:19:53.520 What period of time was that?
00:19:55.080 And do you want to just flesh out what you did a bit so we have some sense of what it
00:19:58.840 was that you were involved in and also what you risked when you decided you were going
00:20:03.160 to make what you actually thought clear?
00:20:07.800 Yeah.
00:20:08.120 So in 2015, that's when I came out and endorsed Trump in the Republican primary.
00:20:12.480 And I did so publicly and I had a following already at the time because I guess I was
00:20:17.140 sort of a new crop of directors where we had kind of our own followings online.
00:20:21.840 And a large reason I even broke into the industry without any family connections or being from
00:20:26.600 a wealthy family or anything like that was because I was lucky enough to kind of like break
00:20:30.900 into the industry at a time where digital was starting to be a problem for film.
00:20:35.740 And so I was able to come into record labels and say, hey, we can do the music videos you
00:20:40.180 guys want at half the price.
00:20:41.480 You guys are doing them already because digital doesn't require the expensive film.
00:20:45.360 And this is sort of the area that we're fantastic at.
00:20:48.260 And as a byproduct of that, we were able to grow quite a successful company.
00:20:51.560 And so I think at our strongest point, we had over a dozen directors across the world,
00:20:57.080 Europe, Canada, U.S.
00:20:58.860 And in general, we did a lot of different music videos, commercials, documentaries.
00:21:03.060 And then we worked with big, you know, motion pictures like Paramount Pictures.
00:21:07.540 We did a lot of projects with them.
00:21:09.920 So things that we would do that, that, you know, people would sort of be familiar with
00:21:14.100 is, you know, any of the big movies like from Paramount, basically like Transformers, Terminator,
00:21:19.180 anytime they had a music video, you know, that would air with the film or, you know,
00:21:23.020 on MTV and stuff like that, we would do a lot of those, you know, tons of music videos
00:21:27.900 in general.
00:21:28.480 That's sort of the side of the business everybody likes because they're all popular.
00:21:32.500 You know, those get hundreds of millions or billions of views.
00:21:34.920 And so people like that stuff.
00:21:36.620 But honestly, the stuff that makes money is the commercial stuff.
00:21:39.680 And, you know, so we did a lot of that stuff as well.
00:21:42.320 And in general, you know, it's something that makes a decent living and everything.
00:21:47.680 You know, I'm not going to lie.
00:21:48.500 I, it's, you know, great in that respect, but if it requires your silence, it's not worth
00:21:53.700 it.
00:21:54.040 And so that's, that's sort of where we made the decision, my wife and I, that this was
00:21:59.260 not right for us.
00:22:00.220 And we were also at the same time sort of having our faith transformed in many different
00:22:04.560 ways or strengthened, I should say.
00:22:07.020 And so, you know, that, that choice to leap strengthened our faith because as dangerous
00:22:12.000 as it was to do, it required a level of trust in God, I had not handed over previously.
00:22:19.460 And I had always been reluctant to hand anybody that trust because I was a very precocious
00:22:23.520 kid.
00:22:23.860 I was all about like, you know, some sense of, of leadership and control on anything I did
00:22:28.500 so that everything was perfect.
00:22:29.760 Right.
00:22:31.200 And that was the first time in my life.
00:22:33.020 I just handed over control and said, all right, I'm going to jump, you know, and we literally,
00:22:36.800 we picked up our kids and we moved across country to Tennessee and that would be about
00:22:41.520 six years ago.
00:22:42.400 And it was not only the best decision we've ever made outside of getting married and having
00:22:46.660 kids, but it was one that really set me up to have the courage to be able to do the
00:22:52.960 things we're doing now because we trusted God and, you know, God was there for us every
00:22:57.780 step of the way in ways that, you know, I'm still figuring out where I'm constantly surprised
00:23:02.980 by just sort of the, the amazing nature of how things can happen.
00:23:06.900 Right.
00:23:07.320 And some of them happen in ways where it just, it does feel miraculous.
00:23:10.660 And, um, we didn't know anybody here.
00:23:13.240 We just jumped.
00:23:14.080 We knew this was where we were supposed to go and we did it and it all worked out.
00:23:18.360 And, you know, as far as these projects, I just became increasingly vocal online about
00:23:24.300 my political views.
00:23:25.620 And, you know, we had a number of different projects like our film where we were trying
00:23:30.460 to make as much of a difference as we can, but I always go back to with this project, what,
00:23:35.220 what sort of became different inside of me is that for a very long time, I had this
00:23:40.080 belief, um, sort of a naive belief that, Hey, we've got this party of people whose job
00:23:45.460 it is to fight for us.
00:23:46.540 Right.
00:23:46.820 So like, if we get the right people elected, they're going to fight, they're going to win.
00:23:51.240 And it's a very naive belief.
00:23:53.120 Yes, we do need those people in office because there are going to be change makers.
00:23:57.780 But I think one of the great mistakes we've made on the conservative side is depending on
00:24:02.660 these people to save us.
00:24:04.740 They're not going to save us.
00:24:07.380 We, we need to take individual responsibility.
00:24:09.460 And I think that's, what's been missing.
00:24:11.200 And if you look at the left and how they've arrived at this moment where they have total
00:24:14.960 control of every cultural institution in our country, and you could argue globally, they
00:24:20.240 got there not through the trust of their leaders, but because they have an incredibly active
00:24:25.440 activist base that is willing to do whatever is necessary to win.
00:24:29.140 And our side has kind of lacked in a lot of respects when it comes to that, because I think
00:24:34.360 it comes down to, honestly, just the very simple truth that a lot of us tend to be individualists
00:24:39.020 and on the left, they're collectivists.
00:24:41.440 Right.
00:24:41.940 And so they're, they're willing to sort of be like bees where, you know, they'll all die
00:24:46.460 for, for the queen bee.
00:24:47.860 Right.
00:24:48.120 And they're willing to kind of like carry out whatever it is that's necessary for the survival
00:24:52.420 of their ideology.
00:24:53.560 For us, we're all kind of individual.
00:24:55.340 And I think one of the things we have to do is be able to match the energy that they
00:25:00.180 have, but in very intelligent ways.
00:25:02.460 And that's kind of what we've done to approach this project is make sure that we're active
00:25:07.420 in every sense that, you know, we're, we're inspiring people to believe you can make a
00:25:12.680 difference.
00:25:13.120 One person can make a difference.
00:25:14.360 Every person on this chain of events from the person who's the initial whistleblower to
00:25:18.240 us about a company to the very end where we're in conversations with the executives, every
00:25:23.260 individual involved is necessary to be able to bring these wins that we've been bringing,
00:25:27.660 where we bring a bunch of corporations back to sanity.
00:25:31.720 Yeah.
00:25:32.300 Okay.
00:25:32.660 So a couple of things that I wanted to go through with you there.
00:25:36.020 So, yeah.
00:25:36.760 So you said you came out in 2015 in Hollywood to support Trump.
00:25:40.560 So that was pretty daring and pretty early.
00:25:42.320 I mean, it seems to me that 2014, 2015 is about when things had got out of hand enough
00:25:50.100 to go sideways.
00:25:50.900 Something shifted at that point.
00:25:53.220 And part of that, I think was probably the increasing dominance of cell phone technology
00:25:58.040 and the fact that everybody's so interconnected and that information was moving around so
00:26:02.280 insanely fast, but something definitely shifted.
00:26:04.980 I started to become aware of things at the University of Toronto, for example, that just
00:26:09.240 hadn't been there before.
00:26:10.520 My graduate students were starting to talk to me about being afraid to broach certain topics
00:26:14.580 in class.
00:26:15.580 Like that had never happened before.
00:26:17.140 I was never nervous at Harvard or at the University of Toronto to review what I had learned from
00:26:23.580 the research literature to my undergraduates.
00:26:26.080 And my attitude was, too, that if any student was perturbed by the content, they were more
00:26:31.120 than welcome to leave.
00:26:32.420 And I made that clear and I never had any trouble.
00:26:34.560 But then all of a sudden, you know, my students were starting to report to me that they were
00:26:38.360 nervous.
00:26:38.720 And then I started to get nervous about talking about gender differences, which is a core element
00:26:43.340 of my field, actually, because I'm a personality psychologist.
00:26:47.100 And the university started to move in a real DEI direction.
00:26:51.100 Something shifted in 2014.
00:26:53.700 OK, so now we've established that you had a genuine career, that you were up and coming,
00:26:58.780 that you were on the cutting edge of the technological revolution in Hollywood.
00:27:02.340 And so you had something at stake.
00:27:04.780 You decided that you were going to publicly support Trump.
00:27:07.780 And you said at the same time that you were undergoing something approximating a re-evaluation
00:27:12.320 of your faith and that you had, and your wife, because you mentioned her, had decided
00:27:17.660 to, what, in a sense, throw caution to the wind.
00:27:21.080 And so tell me a little bit more about how you came out in support of Trump and why you
00:27:27.120 did it then, how you discussed that with your wife and how that was tangled up with your
00:27:32.140 progression, let's say, with regard to your faith.
00:27:35.600 Yeah, you know, my wife is just the treasure of my life in so many ways, because she's the
00:27:44.400 person who made me believe I could just be me, you know, and I didn't need to put anything
00:27:50.600 else on.
00:27:51.180 And that's kind of, I think, the mark of a great woman.
00:27:53.620 And I think a great partner in general in life is that they're going to give you that
00:27:58.520 courage to just be you.
00:28:00.940 And, you know, as much as many of us just want to have that innately, sometimes we don't.
00:28:05.320 And you need that validation in your life that, like, hey, you be you, we will take
00:28:10.760 whatever comes along with that.
00:28:11.960 And so my wife, she's as conservative as I am, I think, maybe even on some things she
00:28:16.340 might even beat me.
00:28:17.620 And, you know, she was a big believer from the very beginning,
00:28:21.320 she grew up in the South, that California was a horrible place.
00:28:24.460 But increasingly, during the Obama years, it got much worse.
00:28:28.260 And you could see the writing on the wall.
00:28:30.160 I mean, to give you an insight visually, because I'm a very visual person, the day we moved
00:28:35.800 out of California, I will never forget this.
00:28:38.380 We used to take our kids all the time to Malibu Pier.
00:28:41.400 It was they have a breakfast spot there and stuff.
00:28:43.360 And our kids just always love that spot.
00:28:45.240 And it was kind of like our family's spot.
00:28:47.540 We went every week, pretty much.
00:28:49.520 And so we went there as our last thing all together in California.
00:28:52.960 And we were in the sand, and our kids are playing, and our youngest one, who at the
00:28:59.000 time probably was around two years old, she says, Dad, what's that?
00:29:05.460 She said it in a much cuter voice than that, because she's still trying to pronounce things
00:29:09.560 correctly.
00:29:10.240 I looked down, it's a hypodermic needle.
00:29:12.800 And that to me was like, I still had that feeling, am I doing the right thing, right?
00:29:17.300 Am I crazy doing this, you know?
00:29:19.400 I looked down, and I saw that needle, and it was like confirmation from the heavens.
00:29:23.480 Like, you're not only doing the right thing.
00:29:25.640 This is the writing on the wall.
00:29:26.880 This is where this place is going.
00:29:29.100 And it couldn't have been, I think, any more eye-opening as to what the future was going
00:29:34.480 to look like.
00:29:35.020 And it's just like all encapsulated in that little moment, this little, beautiful, precious,
00:29:39.240 innocent child.
00:29:40.320 What's that, Daddy?
00:29:41.400 And it's, you know, this symbol of the brokenness of the state, you know, in California, and really
00:29:47.960 of leftism in total.
00:29:49.000 And what it produces, because it's anti-family, it's anti-child, and it's really producing
00:29:53.760 a future where hedonism reigns.
00:29:56.600 And hedonism has more rights than goodness.
00:29:59.480 And I think that that's something that was definitely very motivating to me going forward,
00:30:05.960 that this was a fight not just about Republican versus Democrat, right versus left, which can
00:30:10.680 be kind of boring.
00:30:11.720 It was really a fight of good versus evil.
00:30:13.940 And that's something that became more apparent in the years afterward.
00:30:18.860 And I think it's the most apparent it's ever been now.
00:30:21.420 I think you even have very non-political people now waking up and going, this is sort of good
00:30:28.380 versus evil.
00:30:29.040 That's what it looks like.
00:30:30.300 And I think that's going to become increasingly the conversation, because, you know, I've got
00:30:34.860 friends who had been atheists their whole life, and I find it fascinating.
00:30:38.660 I have never seen a wave of atheists turning to God the way I have in recent years.
00:30:43.920 And it really is fascinating.
00:30:46.560 And for them, you know, because they're very analytical, very based in like sort of like,
00:30:50.340 what can I see, measure, feel, and touch, right?
00:30:53.360 And for them, the thing that turns them is not historical evidence.
00:30:57.440 It's not things like that.
00:30:58.620 The thing that has been turning them is just simply visually watching the world around them
00:31:03.560 move and feeling the evil that is coming forth in the United States from the left primarily.
00:31:10.620 And that has been moving.
00:31:12.180 And you put your finger on that.
00:31:13.740 You put your finger on that with regards to the war on children.
00:31:16.920 I mean, I haven't seen anything worse in my entire life.
00:31:19.700 And I've spent, so personally, you know, what I've seen firsthand, but also what I've
00:31:26.760 investigated historically, I don't think I've ever seen anything worse than the trans butchery
00:31:31.580 in relationship to children.
00:31:33.240 Like that rivals or exceeds anything I only fortunately read about in relationship, say,
00:31:39.860 to Auschwitz.
00:31:40.640 But it also rivals the worst of the Japanese atrocities in China, which were the worst things
00:31:45.600 that I'd ever come across historically.
00:31:47.720 And so it's a level of pathology that I would have regarded as inconceivable, especially with
00:31:57.100 regards to both the medical and the psychological community.
00:32:00.200 Like, as far as I can tell, what the medical community is doing to children, and we know
00:32:04.780 now it's about 8,000 young women who have insurance coverage in the United States.
00:32:09.860 So it's far more women than that because these are just the insurance cases.
00:32:13.580 8,000 have had double mastectomies since this idiocy began.
00:32:17.140 And as far as I can tell, that is literally a crime against humanity, that with the sterilization
00:32:23.340 and the mutilation.
00:32:24.260 Because minors cannot provide informed consent to procedures like that.
00:32:29.640 Anyone with any sense understands that.
00:32:32.020 If informed consent means anything, it means that minors cannot consent to their own sterilization
00:32:39.320 and mutilation.
00:32:40.060 And yet no one's being prosecuted, not to the degree they should have.
00:32:44.920 And we know that the scale of this catastrophe is much wider than this mere 8,000 girls, merely.
00:32:53.780 Like, it's a terrible number of people.
00:32:56.020 We've turned the country upside down for far less consequential occurrences than that.
00:33:02.400 And so, like, I can't characterize that as anything other than evil.
00:33:06.440 And, you know, we should leave that term for when it's actually useful.
00:33:10.000 And that does point to something moving below the surface that's more than merely political.
00:33:14.840 And there is a transformation of belief in the air.
00:33:18.020 There's no doubt about that.
00:33:19.280 And it is, you know, in large part for the reasons that you're describing.
00:33:23.120 Okay, so delve a bit more into the faith element, if you would.
00:33:27.440 And where did you move in Tennessee?
00:33:28.900 So, we're just outside of Nashville.
00:33:31.660 We went full farm.
00:33:32.680 We've got a farm.
00:33:33.560 Yeah, we've got cattle.
00:33:34.860 We've got chickens.
00:33:35.680 You know, like, we went big.
00:33:36.920 We were like, we're going to change our life.
00:33:38.280 We're going to change our life.
00:33:39.200 We're going to drink raw milk.
00:33:40.300 We're going to have cows.
00:33:41.440 We're going to do the whole thing.
00:33:43.360 And, you know, I think it's something interesting as you were speaking.
00:33:48.920 You know, I never gave much credence to the idea of parallel universes.
00:33:53.000 You know, it just seemed kind of, you know, complex.
00:33:56.520 But then again, I don't doubt God's ability to create complex things, right?
00:34:00.940 But the change you're describing that happened around those Obama years in the United States,
00:34:05.680 at times, I almost wonder if we slipped into a parallel universe.
00:34:10.140 Because the things that you were just describing are so absurd.
00:34:13.880 8,000 women, you know, many girls having double mastectomies.
00:34:18.720 Those are all girls.
00:34:19.600 Those are all girls.
00:34:20.460 All under 18.
00:34:21.060 Minors, the 8,000 figure, all of them, 8,000 under 18.
00:34:25.100 That's not the full total of the double mastectomies.
00:34:27.720 That's just minors in the U.S.
00:34:29.740 8,000 children.
00:34:31.400 And, you know, I've interviewed one of them, Layla Jane, who at 13 had hers.
00:34:36.560 The barbarity involved is unthinkable.
00:34:41.100 And I remember just a decade ago, I could have gone up to anybody.
00:34:45.460 Well, it would have been a little more, probably, you know, 12 years ago.
00:34:47.940 I could have gone up to anybody, regardless of their political belief, who they voted for.
00:34:51.900 And I could have asked them, is it okay to give a double mastectomy to a 13 or 14-year-old
00:34:56.180 who was confused about their gender?
00:34:58.100 Every single person would have said, absolutely not.
00:35:00.100 That's abuse.
00:35:00.700 You belong in jail if you do that, right?
00:35:02.800 So what changes that in such a short period of time?
00:35:07.460 You know, the only answer I have is evil or a parallel universe.
00:35:12.640 Yeah, well, it's not only, see, it's even worse than that.
00:35:16.440 Because it's not only that these procedures, which are experimental and sadistic and profit-oriented
00:35:24.660 and ideologically addled and cruel and counterproductive and rife with side effects,
00:35:30.720 but it's not only that those are being conducted, conducted en masse, lied about, this is something
00:35:39.680 that never happens.
00:35:40.660 It's like, no, it's happening, and it's happening a lot.
00:35:42.900 I know, for example, there's a black market in puberty blockers.
00:35:46.420 So whatever the figures are for children that are put on puberty blockers, which is part
00:35:50.820 of the pathway to surgical transformation, the true number of kids who are experimenting
00:35:54.460 with puberty blockers is much greater than that.
00:35:56.480 Yeah, but so it's happening.
00:35:59.060 It's happening at large scale.
00:36:01.300 The people who are doing it are lying about doing it and covering it up.
00:36:04.900 And this is the capper.
00:36:07.420 Opposition to it is essentially criminalized.
00:36:10.480 And so that's like, that's the, that's a perfect trifecta.
00:36:13.380 So, you know, I'm in trouble in Canada, for example, because I objected to Elliot Page.
00:36:20.320 Attention men who still believe in the American dream.
00:36:22.640 In a world gone mad, the Precision 5 from Jeremy's Razors stands as a beacon of sanity.
00:36:27.440 Five blades of superior engineering offer a shave as unshakable as your faith that the
00:36:31.660 nation's best days still lie ahead.
00:36:33.660 Experience an exceptionally smooth, remarkably close shave and a testament to the fact that
00:36:37.680 merit still matters.
00:36:38.940 Stop giving your money to woke corporations that hate you.
00:36:41.500 Get Jeremy's Razors Precision 5 instead.
00:36:43.760 Available now at jeremysrazors.com, walmart.com, and Amazon Prime.
00:36:47.680 Ellen Page displaying herself so wonderfully after her surgical transformation to 1.4 million
00:36:57.600 followers, which was, you know, unconscionable in my regard.
00:37:01.100 I know that she had her problems and, and they were genuine.
00:37:05.860 And I have some personal sorrow, let's say, for her confusion because her situation is catastrophic.
00:37:11.720 But once you advertise that to your 1.4 million followers and you're a celebrity, you're not
00:37:18.300 a victim.
00:37:19.440 You're a perpetrator.
00:37:21.220 And enough is enough.
00:37:22.960 You know, and if she only convinced one other girl to go down that road, that's like one
00:37:27.640 girl too many for me.
00:37:28.560 It's criminal.
00:37:29.380 It's barbaric.
00:37:30.600 It's criminal.
00:37:31.360 Well, it's, it's crimes against humanity level criminality.
00:37:35.220 And my sense is it won't stop until there are people prosecuted on that basis.
00:37:40.500 It's so, and it's such a growth industry, you know, it's so pathological.
00:37:45.600 Yeah, they're, they're making a lot of money off this, which is a whole other layer of sick.
00:37:49.160 But you know what this is like?
00:37:51.040 It would be like if I went on my social media channels and I started doing heroin in front
00:37:56.220 of people and I started telling them how good it feels and how great it is.
00:38:00.380 And some people pick up heroin as a byproduct of it and they kill themselves on it.
00:38:05.360 That's on me.
00:38:07.040 You know, that's, that's principally on me.
00:38:10.180 It doesn't matter how good it made me feel.
00:38:13.180 It's still something that is going to have the capacity to break and kill people.
00:38:17.560 And that's what this ideology is.
00:38:19.580 And when you're doing that, especially to children, I mean, there's something about
00:38:23.660 that that's just incredibly demonic.
00:38:25.740 Yeah, it's, it's beyond, it's every, well, you know, I talked to, uh, um, who broke the
00:38:31.860 WPATH Files?
00:38:32.880 Shellenberger, Shellenberger.
00:38:34.000 I talked to him about this when he broke the WPATH Files and he had said he had watched
00:38:38.260 my conversation with Abigail Shrier.
00:38:41.220 Um, when I came, I was ill for a while.
00:38:43.360 And when I came back and got my podcast going again, the first podcast I did was
00:38:47.640 with Abigail Shrier, who wrote Irreversible Damage.
00:38:51.420 And, uh, I was very nervous about doing the podcast because at that point it was, your
00:38:57.500 reputation was on the line if you objected to the transgender surgery crowd.
00:39:03.620 And so we walked through a book, which is, and I looked into the surgical procedures, which
00:39:09.920 are like, they're brutal and barbaric and experimental beyond belief.
00:39:14.040 And that's what I wanted to point out is, you know, Shellenberger talked to me about
00:39:17.640 that because he watched that podcast and, you know, and Shellenberger is a pretty brave
00:39:23.600 journalist.
00:39:24.200 And his basic response was something like, this is so awful.
00:39:28.820 There's, it must be exaggerated.
00:39:30.220 There's no way it can be happening.
00:39:32.100 And it took him basically two years to wrap his head around the fact that, no, this was
00:39:37.120 happening.
00:39:37.540 That major, the major medical establishment, American Medical Association, American Psychological
00:39:43.120 Association, were not only on board with this, but promoting it and persecuting people who
00:39:48.740 objected to it, you know, and it's, it's really something, it's really something that's
00:39:55.860 actually, as you're, as you also alluded to, it's really incomprehensible.
00:39:59.740 It's so terrible.
00:40:01.180 It's no wonder people don't believe it, right?
00:40:02.860 It's, it's no wonder that people are turning a blind eye to it because it's very, but it
00:40:07.100 also makes me think, you know, that must've been what was happening in Nazi Germany when,
00:40:12.080 when rumors of the, rumors of the persecution of the Jews and their demolition and all the
00:40:18.600 other people the Nazis went after started to circulate and people weren't nearly as connected
00:40:23.060 then as they are now.
00:40:25.360 It was going to be much more easy for them to turn a blind eye to things they couldn't
00:40:29.100 possibly believe were happening, whereas we're in that situation now.
00:40:32.160 It's so sickening.
00:40:34.060 It just, I just can't believe that it's happening and I can't believe that it's happening while
00:40:40.540 people are proclaiming that that's the moral pathway.
00:40:44.920 It's like, what the hell?
00:40:47.320 Okay.
00:40:47.700 So fine.
00:40:48.280 So fine.
00:40:50.280 You saw something was happening in 2015.
00:40:53.660 You had a good career in Hollywood.
00:40:55.380 You decided that you're going to move to Tennessee, to Nashville, which Nashville, two thumbs up for
00:41:00.280 Nashville.
00:41:00.840 It's a great place.
00:41:02.380 And you've done that.
00:41:03.900 And then, okay, so let's get the timeline going here.
00:41:07.280 And so you started the, was the activism, the, the more conservative level public activism,
00:41:15.860 did that initiate on your part with the transgender issue and then move from there?
00:41:21.400 Well, I had been outspoken about education and mandates and things like that around COVID,
00:41:27.100 but it's one of those central issues where actually I was going to say this, people should
00:41:31.180 know you and I didn't plan to talk about the transgender issue, honestly, but it's, it,
00:41:35.240 it's actually important because I have noticed this issue more than any other issue when it
00:41:40.880 comes to transitioning children to just beyond barbaric, barbaric sounds like too nice a word
00:41:46.060 for what is happening.
00:41:48.020 That issue has been what has activated more people, especially in spheres of influence
00:41:53.980 that have never involved themselves for very practical business oriented reasons into this
00:41:59.820 sphere and to say, I am going to fight and put it all on the line because this is a line
00:42:04.440 that means so many other things.
00:42:07.240 You know, what they've done to children tells you everything about what they're willing to
00:42:10.600 do in every other sense.
00:42:12.020 That's a truth that, I'd say, the most intelligent people I know have all figured
00:42:17.900 out.
00:42:18.560 This line means a thousand other things and they're all supremely evil.
00:42:24.020 And that's why you're seeing so many prominent people that you would have never expected stand
00:42:28.480 up, speak up, put their money and their time into the fight like Elon Musk.
00:42:32.180 You know, I mean, if you had told me years ago, Elon Musk is going to come out and support
00:42:37.120 your movie and, and promote it, you know, and it's going to end up getting almost 60 million
00:42:41.000 views largely as a byproduct of him and Donald Trump Jr. promoting it.
00:42:44.560 I would have said you're crazy because I'm very practical and pragmatic and I recognize
00:42:50.240 as somebody who owns all these incredible businesses, even if he does agree with me,
00:42:54.040 that's a very dangerous position for him to take with the government contracts and things
00:42:58.020 like that, that he has to deal with.
00:42:59.400 So you have to ask yourself, why would somebody like that be willing to risk everything that
00:43:05.060 they've achieved?
00:43:06.100 And it's very simple.
00:43:07.520 This line marks an evil that will beset not just our nation, but the world for the next
00:43:13.800 generation and the generation after it.
00:43:15.440 And you're going to have to get to the darkest of places for a small group of people to rise
00:43:20.520 up and do whatever's necessary to bring back some semblance of sanity and liberty.
00:43:25.420 I don't want us to get there.
00:43:26.880 And that's the thing that some people would say like, oh, well, you're a fascist, you're
00:43:29.260 radical or whatever it may be.
00:43:30.440 What I think is actually fascist is what's going on now, where you've got these people
00:43:34.900 in these institutions who believe it is fundamentally their right to shove their ideology down everybody
00:43:39.640 else's throat.
00:43:40.660 That's what's happening in corporate America.
00:43:42.540 That's what's happening in education.
00:43:43.900 It's happening all over the place.
00:43:45.400 That's what I find fundamentally to be wrong.
00:43:47.840 But it has a cascading effect that will continue on and will destroy the lives of our kids and
00:43:52.700 grandkids.
00:43:53.500 And so I think it's important people recognize that about that transgender issue when it comes
00:43:57.060 to kids.
00:43:57.440 It is something that means so many other things.
00:44:02.040 Well, we can elaborate on that briefly.
00:44:05.080 I mean, in 2016, when I released, I released a couple of videos that caused a major kerfuffle
00:44:14.140 that really hasn't died down around me since.
00:44:16.440 And the videos were essentially objecting to the government of Justin Trudeau deciding that
00:44:21.620 it was OK to put words in people's mouths, including mine, with regards to the transgender issue.
00:44:27.440 And I felt two things at that point.
00:44:29.640 I thought, OK, well, you've jumped out of your bailiwick there, buddy, because you don't
00:44:33.820 get to.
00:44:34.360 There was never legislation in any Western country ever that compelled speech among private
00:44:39.940 citizens ever.
00:44:41.260 So there were certain types of commercial speech that were regulated, but for commercial reasons.
00:44:47.940 And so then I thought, too, well, now we have this idea that we can mess with the fundamental
00:44:54.160 category of sex and make that a social construction and generate confusion around that.
00:45:00.400 And I thought, well, that's the most fundamental perceptual category, I think, sex.
00:45:05.520 It's more fundamental than black and white, up and down, night and day, like male and female.
00:45:10.780 If you don't get that right, you don't reproduce.
00:45:13.980 And so that disappeared 650 million years ago.
00:45:17.620 Like the sexual differentiation is hardwired at the most fundamental level.
00:45:23.240 And what that means is that if you can confuse people about that, they will swallow any lie.
00:45:28.180 I warned the Canadian Senate about this in 2016.
00:45:31.820 I said, if you start to confuse young people about sex, you will produce an epidemic among
00:45:38.500 young women, like a contagious epidemic.
00:45:40.720 Because I knew the literature on contagious psychological epidemics.
00:45:44.200 It goes back 300 years.
00:45:46.480 And it's always young women.
00:45:48.060 It's likely because they hit puberty earlier than young men and so have to contend with the
00:45:54.060 brute force of biological transformation when they're still relatively immature, comparatively speaking.
00:46:02.820 And so for whatever reason, they're more susceptible to psychological epidemics.
00:46:06.500 And so, you know, you have two things going on.
00:46:10.120 You enforce the lie that men can be women.
00:46:13.380 And so you prime people for the lie.
00:46:17.600 And then you confuse the most vulnerable people and tilt them into like an irrecoverable pathology.
00:46:23.360 And then it got even worse because I didn't think at that time that we would have an epidemic of like mutilating and sterilizing surgery.
00:46:31.420 Like, you know, I have an imagination for evil, but I got to say it didn't extend that far.
00:46:37.000 Because it seems absurd.
00:46:38.300 Well, it is, it is. It's the ultimate evil clown.
00:46:45.000 Yeah.
00:46:45.260 Like, it's the worst thing that you can imagine.
00:46:47.260 It's got this horrible element of the blackest of satirical comedy, you know.
00:46:55.520 And, and you can see that in the transgender movement on the political side.
00:46:58.620 You know, in all sorts of other ways, because the drag phenomenon is a satire of femininity.
00:47:08.920 And so it's got that dark edge to it.
00:47:10.980 And, you know, when that was a fringe thing for theaters, well, it wasn't disrupting all of the world.
00:47:20.160 But when it moves to the center, whenever the fringe moves to the center, all hell breaks loose.
00:47:26.660 Because the fringe is multiplicity.
00:47:29.020 And the center can't be multiplicity, obviously.
00:47:32.420 It's not a center then.
00:47:34.160 It's chaos.
00:47:35.600 So, okay.
00:47:36.580 So, war on children, investigation into the transgender phenomenon.
00:47:41.640 Now that, now you start taking on the corporations.
00:47:44.760 And you start with Tractor Supply.
00:47:46.460 So, walk us through that story, if you would.
00:47:48.940 And then let's walk through the corporations one by one.
00:47:51.600 And if you would, detail out how your strategy developed and your influence grew.
00:47:58.200 Yeah.
00:47:58.680 So, with Tractor Supply, essentially, the way we approached it is, number one, we had to make sure we didn't play into the PR strategy of "this will blow over."
00:48:05.960 So, what we did is we said, okay, the first video is going to be like a knockout where it's got to be really strong.
00:48:11.360 But we can't put everything in, because we need to be able to have pieces of information dripping out every day, and many of those pieces are still very impactful.
00:48:21.820 So, we hold back some of the stuff that we would consider the best stuff as long as we can make sure we have a decent portion of sort of explaining the problem in that first video.
00:48:30.860 Our first video on a company tends to be somewhere between 7 and 10 minutes.
00:48:36.000 Any longer than that, people, you know, I think if you're talking about the attention span of millions of people who are swiping on social media, you start to lose people, you know.
00:48:44.340 So, we stick to that time span.
00:48:47.260 And then each day, we sort of drip out different pieces of what we've found through our investigations.
00:48:52.360 And our investigations are a combination of what the whistleblowers bring forward to us.
00:48:57.840 And then, secondary, we do really great open source investigations on the companies.
00:49:02.900 You'd be shocked, Jordan.
00:49:04.680 Actually, I don't think you would be because of your background in psychology.
00:49:07.900 I think a lot of people would be shocked by the types of interviews that executives at major companies grant where there's like five views ever all time on the interview.
00:49:17.920 But they do it because of their own narcissism, you know.
00:49:20.620 And they're filled with mistakes.
00:49:23.360 And so, nobody's ever really gone through all these, logged them, cut them, labeled them, and saved them for the future.
00:49:29.800 We've been doing that now for a couple of months on a lot of companies, okay?
00:49:34.360 Companies that we've never said a word about.
00:49:36.200 We've already got every crazy interview their executives have ever given.
00:49:39.820 And we've got them cut, labeled, and ready to go when we move on to them.
00:49:43.660 The other part of this is we recognize going one by one is important.
00:49:46.640 Because just like, you know, any sort of hunt, when you've got animals all together as a herd, they're much stronger.
00:49:52.640 So, if you try to take on corporate America by going and attacking a group of them who are all together as 100 corporations, you're going to get nowhere.
00:49:59.100 But if you focus in on one and make them the target of the ire of customers that they need in their stores, that's a totally different prospect.
00:50:08.580 Because at the end of the day, these are public companies, by and large, that we do this with.
00:50:13.680 Because private companies are a little bit different in terms of their ability to kind of wiggle and skate out of a lot of this.
00:50:18.560 But with public companies, the board has a fiduciary duty to shareholders.
00:50:22.260 So, if the board is aware that conservative consumers make up a cross-section of their customer base that is anything beyond 20%, it's malpractice for them to allow a story like this to go and grow legs for over a month and reach hundreds of millions of people.
00:50:36.600 Because each company we focus on has generated hundreds of millions of impressions.
00:50:40.060 So, that's better than a lot of national ad campaigns can do.
00:50:43.940 And that's not counting, by the way, the mainstream media coverage of what we've done.
00:50:48.920 I don't even know what those numbers would be.
00:50:50.480 I mean, I'm less and less impressed by the stuff that they're able to pull in because it seems like people like yourself or me.
00:50:57.920 They're dying.
00:50:58.740 They really are.
00:50:59.760 They're a dying breed.
00:51:00.560 But in general, you know, for these companies.
00:51:02.940 No, they're a suicidal breed.
00:51:05.880 They're a suicidal breed.
00:51:06.360 That too.
00:51:07.340 That too.
00:51:07.800 But in general, you know, we're crossing into this new paradigm of how information works, right?
00:51:13.100 And so, I think this is one of those early stories where corporations are having to learn some very hard lessons.
00:51:19.260 But I will say a lot of them are learning quickly, because if you look at the timeline of what we did: we started with Tractor Supply, and it took about three weeks to get a statement from them where they changed all their policies.
00:51:28.880 I mean, they dropped every woke thing that we had put out there.
00:51:32.340 Then after that, we focused on John Deere.
00:51:34.420 For those who are unfamiliar with John Deere, big tractor company, I mean, we're talking again about a company that depends on probably 90% conservative consumers, right?
00:51:43.520 And for them, it took about three weeks as well to get them to flip.
00:51:48.360 Then we got to Harley.
00:51:49.860 Harley was one that I think was psychologically very important to what we have done going forward.
00:51:55.420 But you know what's interesting, Jordan, is that they're one of the smallest companies that we have flipped.
00:52:01.980 But I would say psychologically maybe the most important one, because their CEO was not a typical CEO. Many of the CEOs of these companies that we have flipped
00:52:11.660 are kind of agnostic about the whole thing, if not opponents of the wokeness, but they didn't know what the heck was going on at their own companies.
00:52:18.920 And there's this like pervasive ignorance about how bad it's gotten.
00:52:22.340 And then they're like, oh, gosh, we didn't know that.
00:52:23.880 Yeah, let's fix it.
00:52:24.980 But in the case of Harley, this CEO is a true believer.
00:52:28.140 This is a guy who founded the B team with Richard Branson.
00:52:30.680 And the B team's explicit purpose is to force wokeness through corporate America by bringing in new leaders who believe in the woke ideology in general.
00:52:40.280 You know, so they want to do this on a global scale.
00:52:42.400 And they've been quite successful at a number of companies forcing these new leaders in and bringing their ideologies with them.
00:52:49.940 And so that was one where I said, you know what?
00:52:53.220 They may dig in their heels, but we need to do this right to where if they do dig in their heels, they're not going to recover from it.
00:52:59.840 Because we have to make this just absolutely clear to the consumer how far gone they are from the values of their consumer base.
00:53:07.660 And I think we did a good job of that.
00:53:08.940 Again, we started with the long video explaining the problem, but we had such a large amount of material.
00:53:14.680 You know, like one of the videos we held back initially was the interview of their CEO describing himself as the Taliban of sustainability.
00:53:22.540 And so for people, I thought it was very important we break that video down.
00:53:26.360 Taliban of sustainability, what does that actually mean?
00:53:29.120 To me, what it means, if you describe yourself that way, the Taliban of anything, it means you are willing to do anything for what you believe in.
00:53:37.300 There is no red line you won't cross.
00:53:39.220 You are a terrorist for that cause.
00:53:41.380 That's what it means to me.
00:53:42.980 And sustainability is just a buzzword to describe wokeness in total, right?
00:53:46.640 So I hear, when I hear Taliban of sustainability, I hear, I am a terrorist for this left-wing ideology.
00:53:52.640 I will do anything to make sure that it comes to fruition.
00:53:55.700 And if you look at sort of the policies they had adopted, well, it's very clear that seems like that was the direction they were going.
00:54:01.420 And he even had an explicit, you know, desire to "reshape capitalism," as they say it.
00:54:07.300 Marxists love to do that.
00:54:08.740 You know, reshape capitalism.
00:54:10.680 You know, you can eliminate cronyism from capitalism, but that's not what they were describing.
00:54:14.920 They weren't describing removing cronyism.
00:54:16.600 They were describing a system where essentially, you know, businesses operate as a social benefit to society.
00:54:22.900 That's not what capitalism was ever meant to be.
00:54:25.240 That's not capitalism.
00:54:26.160 And the way they describe it is just a pathway to pure Marxism.
00:54:30.320 And it's something actually I think is a very important, interesting point that I think you will find interesting and maybe have some things to say on.
00:54:37.140 You know, I think coming from a family who had communism steal everything, I think it's very important people understand the modern left, they are a new age version of the Communist Party.
00:54:47.780 And when I say that, there's some fundamentally important differences.
00:54:51.120 But I think once people hear them, if they scoffed at the beginning at me using the term communist to describe what they're doing, I think they won't after they hear this.
00:55:51.800 You know, the fundamental difference here is that they no longer believe in the need to seize the means of production because they realize something incredibly important.
00:56:16.560 The power structure on the left realized that, optically, it's going to be really hard to sell the idea of communism to a populace that understands that communism killed so many people in such a brutal way, right?
00:56:28.300 Like, that's not going to be a popular sell to come out and say, hey, we want to be communists.
00:56:32.040 We want to take over industry.
00:56:33.500 We want to control your life.
00:56:34.780 That's not something people generally are going to be really amenable to, right?
00:56:37.920 Not if you say it openly.
00:56:38.920 What they realized was, you don't need to go seize the means of production.
00:56:42.540 You just need to control the minds of the people in charge of production.
00:56:46.920 And if people doubt that that's the path we're on, look no further than big tech.
00:56:51.200 Look at Google.
00:56:52.300 Look at Facebook.
00:56:53.480 Every one of these companies acts as an arm of the state's ruling party.
00:56:58.620 That's not capitalism.
00:57:01.020 That's not, you know, business just doing its thing, answering to the free market.
00:57:04.920 We fundamentally shifted from a system where customer is king to one where the needs and desires of the Democratic Party are king.
00:57:13.320 And secondary to that are the needs and desires of BlackRock, State Street, and Vanguard.
00:57:17.640 Now, we're fundamentally shifting the reality back with what we're doing to say, no, actually, customer is king.
00:57:23.880 Because you know what?
00:57:24.460 If you own John Deere, you own Tractor Supply, you own Harley Davidson, guess who's not walking in your door to buy your products?
00:57:31.020 BlackRock's not.
00:57:32.280 Vanguard's not.
00:57:33.000 State Street's not.
00:57:33.740 The Democratic Party is not.
00:57:35.680 My people are.
00:57:37.380 And so fundamentally shifting that reality, you know, I think psychologically a lot of important things happen from forcing Harley to change.
00:57:45.280 One of those things was that people felt like, okay, one, maybe a fluke.
00:57:50.900 Two, possibly just extremely lucky.
00:57:53.700 Three companies in a row, it was like, oh, this is actually working.
00:57:58.080 This is a trend.
00:57:59.040 This is something that can be replicated and continue.
00:58:01.460 And we are a force in the market where we're shifting.
00:58:04.480 You know, in the case of Tractor Supply, it was an almost $3 billion loss in market cap during our campaign.
00:58:09.640 With John Deere, almost $10 billion loss in market cap during our campaign.
00:58:13.720 $10 billion.
00:58:14.280 You know, and as you look at sort of what's happening in the market, one of the fears that a lot of people tried to sell is that you can't leave wokeness, that it's going to lose you money, right?
00:58:24.160 Every company that has come out and rejected these policies as a product of our campaigns has seen their stock go up the day that they announced that they were dropping these policies.
00:58:33.440 Every single one had their stock go up the day they announced.
00:58:36.340 That's not an accident.
00:58:37.600 The market's not going to punish you because retail is more involved than ever in the market.
00:58:41.120 And these bigger players like Vanguard, State Street, and BlackRock, they actually, I think, don't know what to do with us.
00:58:47.580 They don't know what to do now that there's an activated consumer base on the right who is willing to use their wallet as a weapon because they fundamentally understand that I'm right in terms of the thesis that, at the end of the day, you need us to walk in the doors.
00:58:59.680 If we don't walk in the doors at these places, you're toast.
00:59:02.540 You can go and have institutional investors try to prop you up until the next quarter, but when you have to report earnings and people see that you had a really sizable loss in one strategic area of your business or maybe the business in total, you're going to have big problems.
00:59:17.040 You're not going to get your bonus.
00:59:18.120 There's financial consequences for the decisions you've been making.
00:59:22.080 And so that's what we're trying to make abundantly clear.
00:59:24.080 So as a byproduct of that, one of these other positive things is psychologically in the minds of executives, okay, things have changed.
00:59:30.160 Now, these people are somebody you need to be afraid of.
00:59:32.940 If they come to your door, they mean business.
00:59:35.360 They have the ability to reach hundreds of millions of eyeballs, and you don't want that.
00:59:39.280 You don't want to be the story anymore.
00:59:40.940 So we fundamentally altered our approach a little bit.
00:59:43.480 Now, instead of just going straight with the story, we started to reach out ahead of time and say, hey, we've been investigating.
00:59:49.580 Here's what we found.
00:59:50.340 We're planning a story.
00:59:51.720 And essentially, we want to make sure our story is right.
00:59:54.760 If you have any corrections, or you want to let us know about any changes you're considering as a byproduct of reading our reporting, let us know; you have until this date to give us a comment, correction, or feedback as far as changes go.
01:00:07.380 And we offer an off-the-record conversation between Robbie and your executive if they want to talk through this issue at all.
01:00:14.480 We're happy to do so.
01:00:16.760 And I've had those off-the-record talks with a lot of these companies and their executives.
01:00:20.560 And they've, frankly, been very productive talks.
01:00:23.680 You know, I'm very pragmatic.
01:00:24.840 I mean, to them, I'm sure they would love if I was, like, you know, kind of an idiot and I was, like, trying to strong-arm them into this.
01:00:31.020 But I'm not.
01:00:31.560 The truth is, for us, it's fundamentally simple.
01:00:33.640 We're going to report one way or the other.
01:00:35.360 It's just either going to be about your past failures or you changing things.
01:00:38.600 That's up to the companies, though.
01:00:40.000 You know, we're not like these shakedown artists on the left.
01:00:42.900 We're not asking for money.
01:00:44.320 We don't want money, which a lot of them, I think, find quite surprising.
01:00:47.220 Like, I'm willing to fund whatever we need to myself, and then we've got great subscribers on X who help fund our growing research team.
01:00:54.300 Because I'd say the hardest part of what we do right now is we've got over 5,000 whistleblowers.
01:00:58.400 So it's important that people continue doing that.
01:01:01.420 5,000.
01:01:02.040 Yes, 5,000.
01:01:03.160 And we don't want to discourage people from doing it because we're scaling up our team to be able to meet the needs of the number of whistleblowers we're getting.
01:01:11.640 And they're fantastic.
01:01:12.880 You know, people are sending us evidence probably right now as you and I speak.
01:01:15.480 How many corporations are implicated in that network of 5,000 whistleblowers?
01:01:19.880 We don't have a solid number on that because there is crossover between the 5,000.
01:01:23.560 So, you know, maybe 10 of them are at the same company type thing.
01:01:26.400 I don't think we've broken that down to see what that actually looks like.
01:01:30.500 But, I mean, we're talking, you know, well into the hundreds and hundreds, if not over.
01:01:34.520 Okay, okay.
01:01:34.800 Yeah.
01:01:34.960 So that's the order of magnitude.
01:01:37.120 Yeah.
01:01:37.580 And so then we have metrics for, you know, how do we do this to make sure we choose the right companies to be able to move us in the direction we want to go to sort of change the norm?
01:01:47.160 You know, because I'm a big believer, everything I do in life, I always teach my kids this.
01:01:51.060 Any big decision you're going to make or change you're trying to make, always try to go outside of yourself.
01:01:56.020 Go into an eagle-eye view.
01:01:57.160 You know, you're just kind of looking at things from overhead in as unbiased of a way as you possibly can.
01:02:02.160 And when I look at sort of the lay of the land here as if this is a battlefield, I see very clearly you have to make the right decisions or you're going to get, you know, cut off at the knees very easily.
01:02:12.860 And so for us, it's pick the right companies.
01:02:14.940 And those metrics are fairly simple right now.
01:02:16.920 It's like we look at who the customer base is.
01:02:19.320 We look at the regions that they do well in.
01:02:21.420 We look at, you know, what do they sell?
01:02:23.940 Who are they selling to in general?
01:02:25.680 Because there's different subgroups within the demographics of who their customers are.
01:02:30.060 And then secondary to that, this is a big one.
01:02:31.980 And I think this is more of a feel than a science.
01:02:34.660 We do look very deeply at the board and at the executives and the psychology of who they are as people.
01:02:40.800 So we look at the psychology of these board members because I think that gives us a really strong window into, you know, sort of how we fix things, right?
01:02:47.660 Because each one of them is different.
01:02:48.820 Some of them are true believers.
01:02:50.040 Some of them are not.
01:02:50.720 And they're just simply there at that moment in time because after George Floyd, they gave license to these crazed lunatics in the HR and PR departments to go and apply all these policies that would mean that that executive was not a racist because that's really what they were concerned about.
01:03:06.580 They didn't want to be pitted as a racist because at the time they felt like that would have been fatal.
01:03:11.640 And now we're in a fundamentally different time because, four-ish years later, whole companies have had to experience what this looks like, right?
01:03:18.980 And so what was sold to them is, you know, this unifying, diverse, inclusive thing is anything but.
01:03:24.880 Everybody's experienced it.
01:03:26.020 They'd rather, you know, have their eyeballs poked out than do another DEI training because, frankly, we all know that it's mind-numbingly stupid.
01:03:32.500 And it's beyond farce because it's simply propaganda at this point.
01:03:36.580 I've done over 100 of these trainings, these big DEI trainings that the major companies use.
01:03:41.100 They are some of the most ludicrous trash I've ever read in my life.
01:03:44.640 And every single one of them has one thing that I found in common.
01:03:48.160 They pretty much all have resources, okay?
01:03:50.200 At the end of the training, they've got this resources section.
01:03:53.040 I have yet to find one resource that is even center lane politically.
01:03:57.780 Every single resource that is recommended is extremely far left.
01:04:01.580 And the number one most recommended resource in these DEI trainings will always be kind of darkly funny to me.
01:04:08.700 It's Ibram Kendi's How to Be an Antiracist.
01:04:10.900 Right, right, right.
01:04:11.500 Which I find really ironic since these are major corporations telling their employees,
01:04:15.860 you should read this book as part of your DEI training.
01:04:17.880 And the book says to be an anti-racist, you must be an anti-capitalist.
01:04:21.620 That is the thesis, the core thesis of the book.
01:04:24.880 And you've got the largest companies in the Fortune 100 telling their employees,
01:04:29.680 yeah, go read this guy's book.
01:04:31.600 It's signing your own death warrant ideologically, which is the most bizarre part of the whole thing.
01:04:36.940 But the executives at the top are largely ignorant to it.
01:04:40.640 You know, you've got a couple of true believers, but most of them tell me in truth.
01:04:45.020 And I actually believe them that they had no idea.
01:04:48.380 When I show them these trainings, they're like in shock and they go, no, we've got to get rid of this.
01:04:53.320 This is crazy.
01:04:54.460 And they always go, we didn't know we had activists in the company.
01:04:57.680 Yes, every single one of these companies has activists across the board in these different areas.
01:05:01.320 And what people don't realize at these executive positions, because they've largely worked to get there over the course of some odd, like 20, 30 years, right?
01:05:08.680 They don't realize the kids coming out of college today are fundamentally different from the kids you went to college with.
01:05:15.420 You went to college in a time where people went to college to become a professional at something.
01:05:19.920 Kids today, the profession, the major, it's all a veneer in large part for many of the kids.
01:05:26.180 Not all of them, but for many of them.
01:05:28.020 In truth, what they're meant to become as a byproduct of going to college is a trained activist.
01:05:33.520 So they go into whatever job they've been given a diploma to fit into with the intended purpose of spreading the poison, this ideology.
01:05:44.000 You know, I mean, they're spreading the poison.
01:05:45.580 It's become a cancer throughout these companies, but that is their purpose.
01:05:49.360 They believe the same way that somebody who is religious believes that they need to evangelize.
01:05:55.920 These people are religiously captured by this ideology.
01:05:59.200 It is their God.
01:06:00.220 And so the way that you are willing, if you're a religious person, to do anything for God, these people are willing to do the same for their ideology.
01:06:08.700 And the sooner we understand that, the sooner we're going to understand why it requires us to speak up.
01:06:13.640 Because I will say this, and this may be the most important thing I say throughout this whole thing.
01:06:17.640 The biggest mistake conservatives and normal people have made over the last 30 years is not only celebrating, but promoting the idea we should be a silent majority.
01:06:29.020 Silent majorities get people killed.
01:06:31.820 Silent majorities destroy countries.
01:06:34.060 They are the most poisonous thing you can possibly be.
01:06:37.560 Because it allows a very loud, deranged minority to take over the entirety of your country, every major institution, and they guide the path of the future that your children are going to have to live in.
01:06:50.440 It is shameful to be a silent majority.
01:06:52.920 There has never been a time where it was more important for people to understand that fundamental fact.
01:06:56.580 It is time for people to take personal responsibility, stop waiting for a politician to save you, step up, do the work yourself to make a difference where you live.
01:07:04.940 Because the truth is, if we all did that and we all took personal responsibility and you took that eagle eye view we talked about earlier, you would see very clearly that each one of these pockets of our country was protected as a byproduct of each individual community taking control and saying, we're not going to allow this crazy here.
01:07:21.060 If that was happening, instead of people waiting for a politician to save them, we'd be in a much better place.
01:07:26.040 And don't get me wrong, voting is very important.
01:07:28.760 Who we elect is very important.
01:07:30.780 But more important is what we do on an individual level.
01:07:33.700 And people need to start believing in their ability to make a difference again.
01:07:36.360 I find it very difficult to disagree with any of that.
01:07:40.980 I'm going to summarize the strategy that you laid out, and then I'm going to zero in on the board analysis a little bit.
01:07:48.800 And then I want to talk to you about the three companies that you've gone after, Tractor Supply, John Deere and Harley Davidson and the strategy that goes along with that.
01:07:56.840 So you said you start by distributing a video that's about seven to 10 minutes.
01:08:02.500 That captures attention optimally, but it's also long enough to provide some real information.
01:08:07.180 You save some of the material that your crew has documented so that you can do a protracted campaign so you're not part of the 24-hour news cycle.
01:08:17.340 You investigate all the open-source material that a given company has produced so that you can use their own words as an illustration of either what they're doing
01:08:26.600 or what they know or what they don't know.
01:08:28.700 You go after companies one by one, you're focusing on companies that have at least a 20% conservative market share
01:08:35.340 and you pointed out that if the company is doing anything to violate their implicit contract or explicit contract with those consumers that they're in breach of their fiduciary duty.
01:08:45.400 And then you talked about doing a board analysis and one of the things you pointed out there, which is you can't make this stuff up, you know.
01:08:54.040 I mean, I've been struck as you have by the fact that, so the first question for me was,
01:09:00.400 why in the world is corporate America promoting a radical anti-capitalist, quasi-Marxist, postmodern activist movement?
01:09:16.220 Because it's preposterous.
01:09:17.820 It's like, don't these people know that they're funding and promoting a viewpoint that's worse than Marxism,
01:09:25.180 that's completely antithetical to everything that they themselves not only purport to believe in,
01:09:31.320 but have actually lived by in some sense for the real actors, let's say.
01:09:36.800 Now, your solution to that, and I think it's the right one in all but the minimum number of cases,
01:09:42.400 is that, well, they don't know, and you think, well, can people possibly be that blind?
01:09:48.400 But one of the things that I've learned is that, and maybe this is even more true for conservatives, it's possible, you know.
01:09:58.100 Lots of people live in 1995.
01:09:59.820 And it's not 1995.
01:10:04.660 It's not even 2015.
01:10:07.020 And in some ways, it's not even 2024, because things are changing so fast, we have no idea where we are even.
01:10:14.640 And so I think you're right with regards to these boards, and this is actually a positive thing in a way.
01:10:19.500 It's like, the people who are putting forth these policies are,
01:10:22.960 A, trying to protect themselves against the accusations of racism that could bring them down as individual actors,
01:10:29.540 and B, they're not interested in the systems of ideas that underlie these movements, let's say.
01:10:35.420 They have no idea that the systems of ideas exist.
01:10:37.780 They have no idea how pathological they are.
01:10:39.980 And they have almost no appreciation whatsoever for the force of philosophical ideas.
01:10:44.660 And that might also be part of the conservative temperament, right?
01:10:47.740 Because conservatives tend to be detail-oriented and practical.
01:10:50.900 And so when you talk to them about abstracted ideas, they're not that interested,
01:10:56.260 and they also don't think they have much power.
01:10:58.300 And that's really a bad idea, especially in the situation that we're in right now.
01:11:02.200 But, you know, it was striking to me the fact that you concentrated on your realization,
01:11:06.920 your discovery that most of the people who are involved in this at the corporate level
01:11:11.080 actually have no idea whatsoever what they're fostering.
01:11:13.980 They don't know who Robin DiAngelo is.
01:11:16.480 No clue.
01:11:17.020 They don't know that—no idea.
01:11:18.880 They don't know anything about Ibram Kendi.
01:11:20.900 They don't know anything about the implicit association test.
01:11:24.100 Republicans don't know anything about the implicit association test.
01:11:27.380 And the fact that it's provided hypothetical scientific justification for the idea of implicit
01:11:33.360 bias and the whole bloody woke movement.
01:11:35.840 And people are—they're living far behind the times.
01:11:39.140 And, you know, it's not that surprising in some way because things are changing very,
01:11:44.040 very fast.
01:11:44.680 And it is not an easy thing to be on the cutting edge.
01:11:47.460 Okay, so the upside, the positive conclusion that can be derived from what you described, is that
01:11:53.680 given that a fair bit of what is happening is a consequence of blindness,
01:11:59.620 sometimes willful, and much more seldom of direct propagandistic intent,
01:12:05.880 at least on the part of the corporate leaders,
01:12:08.780 it's easier than you might think to shift the direction of the movement.
01:12:13.120 Okay, so let's move to that for a minute.
01:12:15.040 So it was very, very interesting to me.
01:12:18.300 Like, I really started to take what you were doing seriously,
01:12:21.000 and you made an allusion to this, after you'd done it three times.
01:12:24.180 Because I used the three principle, the principle of three, as a verification index.
01:12:29.900 That was our principle, too, going into it. I didn't even take it seriously,
01:12:34.720 that we had a winning strategy, until we hit three.
01:12:38.720 Yeah, yeah.
01:12:39.320 Three establishes the pattern, right?
01:12:41.880 Two, you can still write off and probably should.
01:12:44.220 But three, you think, okay, something's going on here.
01:12:47.240 Now, you also very carefully, likely, or at least it appeared like that from the outside,
01:12:52.660 picked very emblematic corporations.
01:12:55.360 I mean, Tractor Supply, John Deere, like, Bedrock, Middle America,
01:13:03.140 small C conservative to the core.
01:13:05.440 It's like, this is not, this should not be a woke company, obviously.
01:13:09.420 And then you went for Harley-Davidson, which I thought was insanely comical
01:13:13.420 in this terrible way that we've been describing.
01:13:15.380 It's like, Harley Davidson, tattooed bikers, and now you have a woke CEO.
01:13:21.220 I mean, are these people, are they completely out of their mind?
01:13:26.120 It's like, it seemed to me like Budweiser on steroids.
01:13:29.240 What are you going to do?
01:13:30.140 You're going to make fun of your customer base.
01:13:33.280 That's your sales and marketing strategy, is it?
01:13:36.860 You're going to take the people who actually buy your products,
01:13:39.540 especially with Harley Davidson, because that's an emblematic brand
01:13:42.400 of that sort of freedom-loving, you know, helmetless,
01:13:45.980 on-the-fringe biker.
01:13:49.020 Yeah, well, and it's the rough edges of America.
01:13:51.800 Now you're going to turn them into DEI princesses and cry.
01:13:55.920 And it, you know, it's so ridiculous.
01:13:58.300 And so I saw the three companies you picked.
01:14:00.820 I thought, oh yeah, that's pretty, it's very unlikely that that's fluke.
01:14:06.320 Okay, now you also said, now-
01:14:08.060 Wait, you got, you identified something there, Jordan.
01:14:10.420 I mean, those are psychological choices.
01:14:13.520 We do choose emblematic companies.
01:14:15.340 We have a whole host of companies that have crazy stuff
01:14:17.880 that we could choose to be next.
01:14:19.840 But we do choose emblematic companies because at this point,
01:14:22.520 they represent more than themselves.
01:14:24.140 They represent the entity of corporate America.
01:14:27.140 Because if you can make the emblems fall back into alignment with sanity,
01:14:31.240 the other ones are going to convince themselves
01:14:33.580 that they're important in a way that they're not.
01:14:35.320 But they're going to convince themselves
01:14:36.800 that they need to change so they're not next.
01:14:39.260 And I already, we already see this in a number of cases
01:14:42.360 where I don't find the companies particularly interesting,
01:14:44.940 but we found out that they're changing policy
01:14:46.740 as a byproduct of seeing what's happening with this movement.
01:14:49.660 And there's going to be non-consumer-facing companies that do this.
01:14:52.560 There's going to be also consumer-facing ones that do.
01:14:54.980 There's going to be new CEOs who get into surprisingly,
01:14:58.160 you know, high-level companies
01:14:59.160 who do have more of a conservative view on the world,
01:15:01.600 who are going to go in with the intended purpose
01:15:03.660 to sort of defang this ideology.
01:15:05.560 Even in companies where you really would not expect them to turn around on this,
01:15:10.740 I think people are going to be surprised in the coming year
01:15:13.160 by the types of companies that defang this ideology.
01:15:16.420 And we're seeing it, you know, I think across the board,
01:15:18.940 even in industries where people don't expect it,
01:15:21.520 where they're making cuts.
01:15:22.480 Because there's also a financial perspective to this, Jordan,
01:15:24.800 where, you know, companies are living through an economy
01:15:27.420 that is not fantastic on the consumer side.
01:15:30.640 And so you may have a market propped up by a bunch of artificial things,
01:15:34.020 but we all know that those are things
01:15:36.100 that can go by the wayside very quickly.
01:15:37.640 And at the end of the day, your real value is your customer side.
01:15:40.840 And so that's the other side of this,
01:15:43.300 is like they're all tightening their belts in many different ways.
01:15:46.160 And one of the primary ways you could do that
01:15:48.120 if you're a corporate CEO right now
01:15:49.640 is get rid of your DEI department
01:15:51.240 because you have wasted untold millions
01:15:53.720 on this one department that produces exactly nothing
01:15:56.820 and only produces a detriment to your business.
01:15:59.680 The only thing they've ever produced
01:16:01.360 is a detriment to your business.
01:16:02.860 They're a potential liability.
01:16:04.660 Who in their right mind would start a business and say,
01:16:06.500 I want an entire department that is only a potential liability
01:16:09.680 because that's what they've turned into.
01:16:11.540 And, you know, a lot of this was also predicated
01:16:13.460 on a McKinsey study.
01:16:14.560 A bunch of companies got,
01:16:15.520 I don't know why they listened to McKinsey.
01:16:17.220 I mean, it blows my mind, but 2015 McKinsey-
01:16:20.080 Because they pay them a lot of money.
01:16:21.480 Which is, again, even more ridiculous to me.
01:16:24.640 Like you pay for stupid advice,
01:16:27.280 but they're paying for stupid advice.
01:16:29.100 McKinsey comes out with this study
01:16:30.860 that tells these companies you're going to be rich
01:16:34.000 if you embrace this left-wing, you know, woke DEI ideology.
01:16:38.060 Wall Street Journal-
01:16:38.500 Yeah, because there's nothing more obvious than that.
01:16:40.720 Yeah, nothing more obvious than that as a moneymaker, right?
01:16:43.200 So Wall Street Journal comes out recently.
01:16:44.980 So we're almost a,
01:16:46.440 I'd say we're about eight and a half years removed
01:16:48.880 from when McKinsey did this.
01:16:50.320 And Wall Street Journal just came out
01:16:52.200 with the most brutal takedown of the McKinsey study,
01:16:55.960 proving it was entirely a farce.
01:16:57.660 The whole thing was a farce.
01:16:58.940 It was all predicated off of lies.
01:17:00.880 And when you look at the actual reality of the market,
01:17:03.660 none of this makes money.
01:17:04.720 In fact, in London, this is actually quite interesting.
01:17:07.260 They have a diversity ETF.
01:17:09.300 It underperforms every other ETF on the market.
01:17:12.380 So, you know, there's other metrics as well.
01:17:14.840 I won't bore people with,
01:17:15.780 because a lot of it's kind of boring financial nonsense.
01:17:17.720 But the truth is, this loses you money.
01:17:21.420 So who in their right mind would sign up for that?
01:17:23.280 Because there's no small business in America
01:17:24.840 that if you told them,
01:17:25.900 hey, I've got a great idea for you,
01:17:27.140 you're going to open a segment of your business.
01:17:28.720 It costs you an extraordinary amount of money.
01:17:30.380 It's going to make you nothing.
01:17:31.680 And people might get really upset with you
01:17:33.300 and stop shopping at your store.
01:17:35.980 No small business owner in America would sign up for that.
01:17:38.540 So why are the biggest companies in the world?
01:17:40.860 And I think there's multiple layers to that.
01:17:43.600 You can go from the BlackRock, Vanguard, State Street side,
01:17:46.340 or you can go from the side of the activists.
01:17:47.740 I particularly actually think the activists
01:17:49.860 are an even bigger issue because, you know,
01:17:52.520 BlackRock can desire something,
01:17:54.040 but at the end of the day,
01:17:54.820 if you don't have your soldiers in place to implement it,
01:17:57.800 it cannot be implemented.
01:17:59.520 And there's going to be too many different stages
01:18:02.080 where it can kind of be defanged.
01:18:04.440 The problem is they have these people
01:18:06.300 in central nodes of power within a corporation
01:18:09.080 that are not really recognized from the outside
01:18:11.280 as the nodes of power.
01:18:12.840 So it's not your CEO, it's not your COO,
01:18:15.160 but it's your, you know, VP of marketing.
01:18:17.980 It's your head of HR.
01:18:19.880 It's those people who are really the ones
01:18:22.500 driving a lot of this.
01:19:18.340 Yeah.
01:19:19.080 Okay, okay.
01:19:20.060 So, let me...
01:19:22.740 I'm going to harass you a bit
01:19:24.460 because there's a danger in what you're doing,
01:19:27.160 and I want to discuss that with you.
01:19:29.380 So, well, the first danger, I would say,
01:19:32.320 is one of power.
01:19:33.420 Like, now you're in a position
01:19:34.640 where you can call up a corporation
01:19:37.120 or make contact with them
01:19:38.420 and lay out a set of demands.
01:19:41.880 That's one way of putting it.
01:19:43.360 And that's a lot of power.
01:19:45.380 Okay, so that's the first thing.
01:19:47.040 And so, obviously,
01:19:47.980 there's a danger associated with that
01:19:50.040 that has to be regulated.
01:19:51.580 Okay, the second thing is
01:19:52.740 you're taking a page in a way
01:19:55.600 from the activist playbook of the left.
01:19:57.760 And the activist playbook of the left
01:20:00.820 has produced a lot of social pathology.
01:20:04.020 A lot.
01:20:05.080 And so, I guess I wonder
01:20:07.500 how you distinguish the activism
01:20:10.580 that you're engaging in personally
01:20:13.020 from the cancel culture,
01:20:14.780 let's say, of the left, right?
01:20:16.020 Because you can see how it could go
01:20:17.620 in that direction.
01:20:18.920 And then, okay.
01:20:20.200 And then, the next issue is
01:20:22.420 why are you convinced
01:20:24.900 or are you convinced
01:20:25.920 that these policy changes
01:20:27.740 that these corporations are announcing
01:20:29.860 have any teeth?
01:20:31.340 You know, because what I'm seeing
01:20:32.480 happening at the universities,
01:20:33.780 for example, is they say,
01:20:35.000 well, you know,
01:20:35.640 we'll abide by the Supreme Court ruling
01:20:37.660 that made affirmative action
01:20:39.400 in its more progressive manifestations illegal.
01:20:43.840 But they don't.
01:20:44.980 They just move the deck chairs
01:20:46.300 around on the Titanic, right?
01:20:47.700 And they lie through their bloody teeth.
01:20:50.340 And you saw that
01:20:51.160 with the Texas Children's Hospital,
01:20:52.700 for example, too.
01:20:53.920 And so, and, you know,
01:20:55.260 and it's partly because,
01:20:57.200 well, the people are still there.
01:20:58.660 So, the names of their departments
01:21:00.920 are going to change
01:21:01.900 and they'll use some new terminology
01:21:03.660 to describe what they're doing.
01:21:05.200 But as you already pointed out,
01:21:06.620 they're committed bloody activists
01:21:07.980 and you'd have to fire all of them
01:21:09.740 to actually get rid of them.
01:21:11.640 I mean, that's one approach anyways.
01:21:14.000 Some people can be salvaged,
01:21:15.820 but they're more like
01:21:16.560 the board of governor types
01:21:17.700 that you described
01:21:18.460 that don't know
01:21:18.960 what the hell's going on.
01:21:20.520 So, okay.
01:21:21.320 So, danger of power for you.
01:21:23.640 Let's start there.
01:21:24.420 Danger of power.
01:21:25.360 Yeah, yeah, yeah.
01:21:26.800 So, danger of power,
01:21:28.360 you know,
01:21:28.720 I take that very seriously.
01:21:30.680 So, you know,
01:21:31.340 in our team,
01:21:32.160 first of all,
01:21:32.900 we have an ethics,
01:21:34.700 a set of ethics
01:21:35.660 that we abide by.
01:21:36.560 You know, like one is,
01:21:37.760 you know,
01:21:38.020 we will not go and short
01:21:39.320 a company we're reporting on.
01:21:40.840 We're not going to go
01:21:41.420 and trade on it.
01:21:42.300 We're not going to tell anybody
01:21:43.400 who could go and trade on it
01:21:45.280 because I think that would be unethical.
01:21:47.440 It may be legal in many places,
01:21:49.100 but I think it's unethical.
01:21:50.740 Secondary to that,
01:21:52.060 you know,
01:21:52.300 I think that the way
01:21:53.240 that we approach this
01:21:54.180 is that it is never a shakedown.
01:21:55.980 It is never blackmail.
01:21:57.400 We never treat it that way
01:21:58.820 because that's entirely inappropriate
01:22:00.180 and that's not our goal.
01:22:01.640 We're not the mafia.
01:22:02.720 We never want to be the mafia.
01:22:04.040 And I refuse to be a shakedown artist
01:22:05.620 like these people on the left.
01:22:06.700 I've had a lot of people
01:22:07.620 come and approach and say,
01:22:09.180 hey, we've got this idea,
01:22:10.240 you know,
01:22:10.460 ways you could,
01:22:11.260 you know,
01:22:11.440 get these companies to pay you,
01:22:13.240 you know,
01:22:13.480 I don't want their money.
01:22:15.080 I will not take their money.
01:22:16.460 I'm principally ideological.
01:22:17.940 I believe that God put me
01:22:19.080 in the position I'm in for a purpose.
01:22:20.900 I'm going to do the right thing,
01:22:22.220 wield that power wisely
01:22:23.420 in a way that I feel like
01:22:25.200 is going to be responsible.
01:22:27.840 It has to be measured.
01:22:28.820 The other thing to remember too
01:22:30.060 is that this path
01:22:33.240 we've set out here,
01:22:34.620 it's only successful
01:22:36.020 because I'm not acting crazy.
01:22:38.100 The minute I act crazy
01:22:39.680 and I demand companies,
01:22:41.520 you know,
01:22:42.100 do crazy things,
01:22:43.360 it's going to look wildly different,
01:22:44.540 right?
01:22:45.400 See,
01:22:45.820 if you look at this
01:22:46.480 from an outsider's perspective,
01:22:47.100 like if you're somebody non-political
01:22:49.520 and you're looking at these two sides,
01:22:50.980 you're reading the news,
01:22:52.440 you know,
01:22:52.820 or whatever,
01:22:53.920 and you see one side
01:22:55.280 would like to force their ideology,
01:22:57.320 the other side
01:22:58.280 is over here saying,
01:22:59.560 actually,
01:23:00.220 hey,
01:23:00.420 I think I just want everybody
01:23:01.520 to get along.
01:23:02.240 Things should be fairly neutral
01:23:03.380 and let's just like
01:23:04.360 not talk about
01:23:04.980 what kind of sex
01:23:05.500 you like to have at work
01:23:06.400 and,
01:23:07.060 you know,
01:23:07.260 maybe the company
01:23:07.900 should only sponsor things
01:23:09.400 that are dedicated
01:23:10.260 to the core business
01:23:10.960 and maybe not sponsor events
01:23:12.340 where they support
01:23:13.740 sex changes for kids.
01:23:15.100 I think that would probably
01:23:15.940 be a good level
01:23:16.820 of playing field
01:23:17.420 for everybody
01:23:18.020 and everybody
01:23:18.540 just be nice to each other.
01:23:20.120 I don't care,
01:23:20.940 you know,
01:23:21.300 sort of what your gender is,
01:23:22.960 what race you are,
01:23:23.940 who you want to have sex with,
01:23:25.360 it's work.
01:23:26.080 Let's just do work
01:23:26.980 and be nice to each other
01:23:27.920 and get our job done.
01:23:29.200 If you look at that
01:23:29.900 from the outsider perspective,
01:23:30.840 it's like,
01:23:31.200 okay,
01:23:31.380 well,
01:23:31.560 that side over there
01:23:32.500 who's asking
01:23:33.420 to shove their ideology
01:23:34.700 down everybody's throat
01:23:35.820 seems sort of crazy
01:23:37.140 and fascist-y
01:23:37.920 and this side over here
01:23:38.760 is just asking
01:23:39.340 for everybody
01:23:39.840 to just sort of be neutral.
01:23:41.340 That seems like
01:23:42.100 a more sane position,
01:23:43.200 right?
01:23:43.760 So if we go to these companies
01:23:45.120 and I say,
01:23:45.560 actually,
01:23:46.080 you know what,
01:23:46.400 I want you to adopt
01:23:47.200 my ideology.
01:23:48.300 You better start donating
01:23:49.220 to the groups
01:23:49.720 that I want you to donate to.
01:23:51.760 That fundamentally alters
01:23:53.800 the reason that this works.
01:23:55.540 So that in itself
01:23:56.640 confines the power
01:23:57.660 that we've sort of,
01:23:58.760 you know,
01:23:59.280 been able to acquire
01:24:00.160 through this campaign
01:24:01.120 because it's predicated
01:24:02.800 on a set of,
01:24:03.980 you know,
01:24:04.280 sort of rules
01:24:04.920 that we've made
01:24:05.980 from the outset.
01:24:07.080 The minute you stray
01:24:07.980 outside of them,
01:24:08.820 the power wanes.
01:24:09.720 And so if you want
01:24:10.480 to be effective,
01:24:11.500 you have to stay
01:24:12.300 within the confines
01:24:13.080 of why you're effective.
01:24:14.760 But your second question,
01:24:16.800 I think-
01:23:16.920 Sorry, do you have people helping you out with that?
01:24:20.420 I mean,
01:24:21.160 you know,
01:24:21.440 you mentioned,
01:24:22.040 for example,
01:24:22.540 earlier that you were
01:24:24.080 more prone
01:24:25.080 to take the seriousness
01:24:26.540 of your own pathway
01:24:28.000 to heart
01:24:30.520 as a consequence
01:24:31.840 of your wife's support
01:24:33.320 of your vision,
01:24:34.360 let's say,
01:24:34.800 and your ability.
01:24:35.800 And, like,
01:24:36.380 it is helpful
01:24:36.920 to have people around you
01:24:38.040 keeping you on the straight
01:24:39.040 and narrow,
01:24:39.520 let's say.
01:24:39.740 She'd be the first one.
01:24:40.760 She'd be the first one
01:24:41.720 to check me.
01:24:42.340 I always tell people,
01:24:43.520 if you're worried
01:24:44.280 about me at all,
01:24:45.480 just remember,
01:24:46.240 I have to sleep in bed
01:24:47.260 next to Landon Starbuck,
01:24:48.940 and that lady
01:24:49.620 will be the first one
01:24:50.940 to let me know
01:24:52.160 if I have strayed
01:24:53.260 from what is right
01:24:54.300 and good and righteous.
01:24:56.060 She's my compass,
01:24:58.040 you know,
01:24:58.480 and I'm very lucky
01:24:59.400 as a man
01:24:59.820 to have a wife like that
01:25:00.780 because it's not lost on me
01:25:01.920 how rare that is
01:25:02.760 to have somebody like that
01:25:03.840 who will be a compass.
01:25:04.900 And sometimes that's hard.
01:25:05.820 That's something people
01:25:06.900 should know is,
01:25:08.040 like,
01:25:08.420 as a man,
01:25:09.720 you know,
01:25:10.080 there's lines there
01:25:11.220 where it's like,
01:25:11.780 okay,
01:25:12.120 you want to lead,
01:25:13.480 obviously,
01:25:14.260 but your wife
01:25:15.080 can really be a compass
01:25:16.200 of when your leadership
01:25:17.540 can stray in a direction
01:25:18.820 where you're not being
01:25:20.220 true to yourself.
01:25:21.280 And you may even struggle
01:25:23.640 up against that idea,
01:25:25.040 you know,
01:25:25.460 because it's uncomfortable,
01:25:26.700 you know,
01:25:27.020 to get checked like that.
01:25:28.640 But if you have a partner
01:25:30.060 who really loves you,
01:25:31.180 wants to see you succeed,
01:25:33.180 those things are the things
01:25:35.000 that forge you
01:25:35.700 into the greatest weapon
01:25:36.620 you can be.
01:25:37.760 And so,
01:25:38.340 yeah,
01:25:39.160 I would name her
01:25:40.000 first and foremost.
01:25:41.020 I mean,
01:25:41.160 she's the first one
01:25:42.060 who would check me
01:25:42.760 and be like,
01:25:43.460 you know,
01:25:44.320 let's say I got really
01:25:45.700 ego-driven about it,
01:25:47.480 right?
01:25:47.680 Like,
01:25:47.860 she'd be the first one
01:25:49.080 to be like,
01:25:49.600 drop the ego,
01:25:50.640 okay?
01:25:51.360 Yeah,
01:25:51.640 and she'd go through
01:25:52.420 all the reasons why,
01:25:54.120 you know?
01:25:54.520 And so I'm thankful for that
01:25:56.220 because,
01:25:56.580 you know,
01:25:56.760 it's easy for any man,
01:25:57.860 any woman
01:25:58.520 to lose their,
01:26:00.220 their sort of like path,
01:26:02.240 right?
01:26:02.600 And,
01:26:03.020 and if you don't have
01:26:03.840 somebody who's there
01:26:04.740 kind of like,
01:26:05.460 hey,
01:26:05.900 you're straying off the path,
01:26:07.160 that can happen to anybody.
01:26:09.580 And I think that's,
01:26:10.580 that's kind of like
01:26:11.220 one of the beautiful
01:26:12.460 but difficult things
01:26:13.140 about humanity,
01:26:13.660 right?
01:26:14.000 We're imperfect
01:26:14.660 and there's all these things
01:26:16.800 about us
01:26:17.260 that make us who we are
01:26:18.120 and they're not all
01:26:18.780 beautiful and good,
01:26:20.400 you know?
01:26:21.560 But our experiences
01:26:22.740 drive who we want to be.
01:26:23.840 And I think in general,
01:26:24.640 the things that
01:26:25.460 kind of separate us,
01:26:26.760 good and bad,
01:26:27.820 are largely driven off
01:26:29.540 of who we want to be.
01:26:31.000 And for me,
01:26:31.800 I know very clearly,
01:26:32.860 I want to be
01:26:33.960 the thing that if
01:26:35.040 somebody said,
01:26:35.740 boil down what matters
01:26:36.720 most to you,
01:26:37.800 it would be one moment.
01:26:40.040 When I die
01:26:41.040 and I,
01:26:42.700 I pretend in this
01:26:43.720 that I'm coherent
01:26:44.440 on my deathbed,
01:26:45.300 right?
01:26:45.480 I get to die in the way
01:26:46.420 that I would most
01:26:47.180 perceive of value.
01:26:48.300 So all my kids
01:26:49.160 are around me.
01:26:49.840 We're about to have
01:26:50.400 baby number four right now.
01:26:51.480 So all four of my kids
01:26:52.380 are around.
01:26:52.820 My wife's there.
01:26:54.480 On that day,
01:26:55.600 in my coherent mind,
01:26:56.800 I want to know inside,
01:26:58.440 in the deepest parts
01:26:59.220 of my soul,
01:26:59.760 I want to know
01:27:00.300 that each one of my kids
01:27:01.420 is thinking in their head
01:27:02.720 that my dad had integrity.
01:27:05.060 My dad always did
01:27:06.300 the right thing.
01:27:07.300 He always stood up
01:27:08.300 even when it was hard.
01:27:09.520 He always loved us.
01:27:10.800 He always loved our mom.
01:27:11.860 He always did right by us.
01:27:13.380 But he always had faith.
01:27:15.440 And all those things,
01:27:17.080 like if that's what
01:27:17.800 I'm focused on
01:27:18.640 as my prize in life
01:27:19.880 is that I will get
01:27:20.760 that moment.
01:27:21.640 Whether I get it or not,
01:27:22.860 that's the thing
01:27:23.400 that matters to me
01:27:24.160 is that at the end
01:27:24.880 of my life,
01:27:25.460 my kids can think
01:27:26.500 of me that way.
01:27:28.500 And they can't think
01:27:30.000 of me like that
01:27:30.620 if I stray.
01:27:32.020 Okay.
01:27:32.480 Okay.
01:27:33.060 Yep.
01:27:33.520 Yep.
01:27:34.400 So let's talk about
01:27:35.840 the activism issue
01:27:37.740 and then the lip service issue.
01:27:39.920 And then we'll close
01:27:40.680 this part of the interview.
01:27:41.680 I think what we'll do
01:27:43.240 on the Daily Wire side,
01:27:45.180 I think,
01:27:45.540 is to delve a bit more
01:27:46.700 into your personal history
01:27:47.780 because I'm curious
01:27:48.620 about what shaped you
01:27:50.180 along the way.
01:27:51.680 I often do that
01:27:52.500 with my guests
01:27:53.140 on the Daily Wire
01:27:54.460 half an hour.
01:27:55.260 And I think we'll also
01:27:56.400 talk about
01:27:57.080 the distinction.
01:27:58.440 We can touch on that here.
01:27:59.620 The distinction between
01:28:00.720 the ideology,
01:28:03.080 so to speak,
01:28:03.660 that you're pursuing
01:28:05.060 and the ideology
01:28:06.420 of the left
01:28:07.060 because I don't think
01:28:08.280 that they're
01:28:08.820 the same
01:28:10.120 manifestation
01:28:11.560 of cognitive apparatus.
01:28:13.280 Yeah,
01:28:13.840 I think I can explain
01:28:14.660 it pretty well
01:28:15.280 why they're different.
01:28:16.820 Okay,
01:28:17.300 let's do that
01:28:17.920 on the Daily Wire side.
01:28:18.940 Let's cover the activism issue
01:28:20.460 and the lip service issue
01:28:21.580 and close this part off.
01:28:23.120 I just want to know
01:28:24.100 how you circumvent
01:28:26.840 the danger
01:28:27.440 of having activism
01:28:28.920 on the right
01:28:29.800 start to become
01:28:30.680 the equivalent
01:28:31.540 of activism
01:28:32.160 on the left.
01:28:32.700 Now,
01:28:32.840 you already answered
01:28:33.400 that to some degree.
01:28:34.440 You know,
01:28:34.640 you said you take pains
01:28:36.260 to ensure,
01:28:37.000 for example,
01:28:37.440 that you're not
01:28:38.040 insider trading,
01:28:39.500 so to speak,
01:28:40.080 even though it would be legal.
01:28:41.140 And you're not
01:28:42.200 shaking down
01:28:43.460 the companies
01:28:44.080 for donations
01:28:45.200 to any of the causes
01:28:46.320 that you might support.
01:28:47.380 Like,
01:28:47.540 you have reasons for that.
01:28:48.620 But I think
01:28:51.240 the closest analog
01:28:52.320 in some ways
01:28:53.200 on the critical side
01:28:54.180 to what you're doing,
01:28:55.720 you know,
01:28:55.940 a critic would say,
01:28:56.740 well,
01:28:56.860 that's just right-wing
01:28:57.820 cancel culture.
01:28:58.940 So how would you respond
01:29:00.240 to an allegation like that?
01:29:02.060 Yeah,
01:29:02.340 you know,
01:29:02.620 I think that's one
01:29:03.380 that is easy to make
01:29:05.000 because it's born
01:29:05.880 out of their frustration
01:29:07.040 with the reality
01:29:07.840 that the left
01:29:08.460 has embraced
01:29:09.000 a cancel culture
01:29:09.860 that tries to attack
01:29:10.900 and destroy individuals.
01:29:12.740 What we do
01:29:13.760 is quite different.
01:29:14.940 You know,
01:29:15.100 our focus is on
01:29:17.860 these major corporations,
01:29:19.140 and it's really
01:29:19.140 about educating
01:29:19.860 the consumer.
01:29:20.840 You know,
01:29:21.020 so for us,
01:29:21.580 these major corporations,
01:29:22.560 we're not punching down.
01:29:23.560 We're going up,
01:29:24.340 you know,
01:29:24.580 to the very top
01:29:25.440 of the financial system here.
01:29:27.060 And we're saying
01:29:28.240 there's a fundamental
01:29:29.540 difference between
01:29:30.640 what the image
01:29:31.940 of a company is
01:29:33.100 perceived to be
01:29:34.180 and what the reality
01:29:35.440 actually is.
01:29:36.660 And so our job
01:29:37.920 is to fill
01:29:38.400 the education gap there
01:29:40.060 because the story
01:29:41.080 hasn't simply been told.
01:29:42.460 You can't blame
01:29:42.900 a consumer
01:29:43.400 for giving money
01:29:44.080 to a company
01:29:44.640 that they don't know
01:29:45.560 has been funding
01:29:46.400 some crazy,
01:29:47.220 awful thing,
01:29:47.920 right?
01:29:48.640 So we're just
01:29:49.740 educating them.
01:29:50.620 In terms of canceling them,
01:29:52.620 you know,
01:29:52.880 like,
01:29:53.260 it's really in their court;
01:29:54.900 what happens
01:29:55.640 is a byproduct of that
01:29:56.780 because the consumer
01:29:57.700 has to decide
01:29:58.580 at the end of the day
01:29:59.340 if they feel comfortable
01:30:00.340 spending their money
01:30:01.080 somewhere.
01:30:01.960 There's sort of
01:30:02.620 a larger story here,
01:30:03.780 though,
01:30:04.040 in terms of the fact
01:30:05.040 that the natural consumer,
01:30:07.020 like,
01:30:07.140 let's go through this.
01:30:08.040 So we do,
01:30:09.220 Tractor Supply was the first one.
01:30:11.460 John Deere was the second one.
01:30:12.660 Harley-Davidson,
01:30:13.660 Polaris,
01:30:14.320 which included
01:30:14.780 Indian Motorcycle,
01:30:16.500 Lowe's,
01:30:17.600 Ford,
01:30:19.280 Stanley Black & Decker,
01:30:20.420 which owns Craftsman,
01:30:21.980 Stanley,
01:30:23.020 Black & Decker,
01:30:24.260 and DeWalt,
01:30:25.780 which are the major
01:30:26.440 tool companies in America.
01:30:28.440 And then just yesterday,
01:30:30.800 you know,
01:30:31.640 we were able to announce
01:30:33.120 the largest market cap
01:30:34.460 of any company we've flipped,
01:30:35.600 which was Caterpillar.
01:30:36.560 A lot of people don't realize
01:30:37.380 what a big company Caterpillar is.
01:30:39.140 It's over 113,000
01:30:40.460 global employees,
01:30:41.540 170 plus billion
01:30:43.140 in market cap
01:30:43.900 of that company.
01:30:44.800 And we were able
01:30:45.640 to get policy changes
01:30:47.060 out of them.
01:30:47.520 So going through
01:30:48.200 all these companies,
01:30:49.120 oh,
01:30:49.320 and Jack Daniels
01:30:50.360 and their parent company
01:30:51.180 as well,
01:30:51.780 and Molson Coors,
01:30:53.300 the beer brand.
01:30:54.500 See,
01:30:54.680 we're getting to the point
01:30:55.360 where we've flipped so many
01:30:56.360 that I'm starting to forget
01:30:57.320 some when I do interviews.
01:30:58.660 And that's a good thing.
01:30:59.800 You know,
01:31:00.000 eventually I hope I forget
01:31:01.120 a number of them,
01:31:01.780 but it goes into
01:31:02.580 your third question
01:31:03.440 as well.
01:31:04.720 You know,
01:31:05.440 how do we make sure
01:31:06.480 that these companies
01:31:07.180 are actually abiding
01:31:08.380 by the changes
01:31:09.040 they say they're going to make?
01:31:11.100 And I think that's
01:31:11.980 a very critical question
01:31:13.420 because without accountability,
01:31:15.120 nothing matters.
01:31:16.540 And so I think
01:31:17.420 that's fundamentally born
01:31:18.620 out of why
01:31:19.500 we designed this
01:31:20.980 the way we did,
01:31:21.900 where it's predicated
01:31:22.960 on the need
01:31:23.500 to have whistleblowers
01:31:24.460 in the company.
01:31:25.700 We won't go cover
01:31:26.920 a company
01:31:27.360 we don't have
01:31:28.140 whistleblowers in
01:31:28.940 because those whistleblowers
01:31:30.020 are our eyes
01:31:30.900 and ears essentially.
01:31:31.780 And we know
01:31:32.520 they're the ones
01:31:33.220 who are activated
01:31:34.000 enough
01:31:35.120 and really motivated
01:31:37.040 enough to come forward
01:31:38.280 if things are still
01:31:39.960 going negatively
01:31:40.900 at the company,
01:31:41.740 if they're not
01:31:42.360 seeing things change.
01:31:44.020 So there's one company
01:31:45.580 actually that we've
01:31:46.440 had our eye on
01:31:47.240 that is one of these
01:31:48.200 companies that's
01:31:48.720 made a statement.
01:31:50.000 And it seems like
01:31:51.460 they're veering off path
01:31:52.680 in some ways.
01:31:54.040 And so we've been
01:31:54.740 documenting,
01:31:55.380 and I will say this,
01:31:56.300 I've said it many times
01:31:57.120 before,
01:31:57.960 when the day comes
01:31:58.940 where we do have
01:31:59.800 to go back
01:32:00.620 and report
01:32:01.120 on a company
01:32:01.720 that has already
01:32:02.640 made a statement,
01:32:03.720 we will be much
01:32:04.700 more aggressive
01:32:05.260 in our reporting
01:32:05.980 and it's going
01:32:06.660 to be like,
01:32:07.240 you know,
01:32:07.640 no kid gloves.
01:32:08.740 It's going to be
01:32:09.460 everything.
01:32:10.100 We're going to put
01:32:10.780 everything out
01:32:11.540 that we have
01:32:12.500 because in every
01:32:13.080 one of these cases
01:32:13.720 we did not put
01:32:14.400 out everything we had.
01:32:15.480 We put out
01:32:16.200 a good deal
01:32:16.840 of stuff,
01:32:17.640 but we have more.
01:32:19.440 You have something
01:32:19.860 in reserve.
01:32:20.720 Yes.
01:32:21.260 And so I think
01:32:22.020 the companies are aware
01:32:23.040 with eyes on the inside,
01:32:24.860 that's sort of
01:32:25.640 what predicated
01:32:26.260 the change
01:32:26.700 in the first place
01:32:27.440 is that,
01:32:27.840 okay,
01:32:27.920 our own employees
01:32:28.720 are outing us
01:32:29.420 for being crazy.
01:32:30.460 So,
01:32:31.140 you know,
01:32:31.520 if they did that once,
01:32:32.360 what makes you think
01:32:33.000 they're not going
01:32:33.380 to do it again,
01:32:34.000 right?
01:32:35.020 And so I think
01:32:36.940 that's probably
01:32:37.740 the saving grace here
01:32:39.160 is that we've got
01:32:39.920 people in these companies.
01:32:41.120 They're not
01:32:41.780 ideologically homogenous
01:32:43.180 in that sense.
01:32:43.900 Like,
01:32:44.520 you do have
01:32:45.640 these breakaway,
01:32:46.860 you know,
01:32:47.280 people who are like,
01:32:48.060 no,
01:32:48.260 I'm ready to stand up
01:32:49.540 for my values
01:32:50.180 at this point.
01:32:50.820 I'm sick of being
01:32:51.580 basically treated
01:32:52.360 like I'm a racist
01:32:53.080 and being forced
01:32:53.840 to do white privilege
01:32:54.800 quizzes and trainings
01:32:55.840 and, you know,
01:32:56.240 all these things.
01:32:56.960 I'm going to speak up.
01:32:58.560 So I think that
01:32:59.420 that on its own
01:33:00.560 has certain value,
01:33:01.620 but the secondary value
01:33:02.580 here now
01:33:03.140 is that you've got
01:33:04.060 the DEI activists
01:33:05.520 scared.
01:33:06.680 So I thought
01:33:07.680 this was really interesting.
01:33:09.400 It was Bloomberg,
01:33:10.640 I believe.
01:33:11.440 It was either Bloomberg
01:33:12.420 or Wall Street Journal.
01:33:13.800 One of them
01:33:14.180 did an interview
01:33:15.020 with a bunch
01:33:16.120 of DEI professionals,
01:33:17.660 right?
01:33:17.840 That's what they call themselves.
01:33:18.880 I struggle to say
01:33:19.620 they're professional about it.
01:33:20.860 I mean,
01:33:21.160 we'll just call them
01:33:21.840 race hustlers,
01:33:22.500 but they're
01:33:23.080 DEI professionals,
01:33:24.640 okay?
01:33:25.440 So they asked them,
01:33:26.740 you know,
01:33:27.020 if this guy comes
01:33:28.160 to your company,
01:33:29.440 what are you guys
01:33:30.440 going to do?
01:33:31.320 You know,
01:33:31.600 because you guys
01:33:32.180 are leading major
01:33:32.920 DEI departments.
01:33:33.760 What is your plan?
01:33:35.080 And the reporter
01:33:35.860 said they were shocked.
01:33:36.920 They were expecting
01:33:37.620 these people
01:33:38.200 to all say,
01:33:39.180 we're just going
01:33:40.400 to ignore him
01:33:41.020 or we're going
01:33:41.460 to dig in our heels
01:33:42.140 or whatever.
01:33:42.700 Every single
01:33:43.500 DEI person
01:33:44.260 they talked to
01:33:45.160 said we'd have
01:33:46.340 to reevaluate
01:33:47.220 our policies
01:33:48.000 because there's
01:33:49.040 no denying
01:33:49.680 that there's
01:33:50.180 a real sizable
01:33:51.160 interest in movement
01:33:52.280 there.
01:33:53.340 And, you know,
01:33:54.200 for our company,
01:33:54.960 we have to look
01:33:55.540 at what is
01:33:56.140 the potential,
01:33:57.140 you know,
01:33:57.500 net loss
01:33:58.220 of us
01:33:58.840 digging in on this.
01:34:00.300 So we're talking
01:34:01.100 about them interviewing,
01:34:02.000 mainstream media
01:34:02.560 interviewing some
01:34:03.300 of the ideological
01:34:04.780 creatures who are
01:34:05.880 responsible for a lot
01:34:06.920 of this mess
01:34:07.360 and they're even
01:34:07.840 admitting at this
01:34:08.440 point that they'd
01:34:09.240 have to go to
01:34:09.640 the drawing board
01:34:10.260 with their executives
01:34:10.980 and they're doing
01:34:11.940 that out of a
01:34:12.440 survival instinct.
01:34:13.960 You know,
01:34:14.240 it's something
01:34:15.140 that I think
01:34:16.100 is not surprising
01:34:17.360 because if they
01:34:17.900 want to survive
01:34:18.560 in their workplace,
01:34:19.360 they want to,
01:34:20.220 you know,
01:34:20.560 survive to exist
01:34:21.520 and maybe reform
01:34:22.740 in some other
01:34:23.200 version later on,
01:34:24.320 they have to find
01:34:26.040 a way to seem
01:34:26.700 somewhat sensible
01:34:27.460 at the moment.
01:34:28.280 But that on its own
01:34:29.580 is a sign
01:34:30.420 that we are
01:34:31.180 making sizable,
01:34:32.760 you know,
01:34:33.260 sort of change
01:34:33.940 within the workplace
01:34:34.760 because that's
01:34:36.160 putting them
01:34:36.160 into a new reality
01:34:37.320 where they have
01:34:38.080 to go back
01:34:38.840 to hiding
01:34:40.020 in some sense.
01:34:41.340 And, you know,
01:34:42.480 I hate the idea
01:34:43.480 that, you know,
01:34:44.360 anything we do
01:34:45.080 is to force anybody
01:34:46.020 into hiding
01:34:46.600 because I actually,
01:34:47.480 I prefer a world
01:34:48.100 where everything's
01:34:48.580 kind of out in the open
01:34:49.200 and people debate
01:34:49.800 and things like that.
01:34:50.420 But there are
01:34:50.800 certain places,
01:34:51.380 certain institutions,
01:34:52.000 really, like workplaces,
01:34:52.900 where I feel like
01:34:55.440 your politics
01:34:56.120 and your views
01:34:57.240 on sex
01:34:57.820 or whatever it may be
01:34:58.660 don't belong at work,
01:34:59.740 you know,
01:35:00.060 unless it's a central
01:35:00.980 thing to your job,
01:35:02.660 that's never been
01:35:03.700 an acceptable thing
01:35:04.560 at work.
01:35:05.060 You know,
01:35:05.340 if I had gone to work
01:35:06.420 in the early 2000s
01:35:07.760 and started talking
01:35:08.500 about who I like
01:35:09.360 to have sex with,
01:35:10.720 you know,
01:35:11.080 or, you know,
01:35:12.460 whatever it may be,
01:35:13.900 people,
01:35:14.300 I would have been fired
01:35:15.040 for sexual harassment,
01:35:16.120 right?
01:35:17.240 That's fundamentally
01:35:18.460 changed today.
01:35:19.240 And I think like
01:35:19.860 as we alter
01:35:20.720 the path
01:35:21.240 of going forward,
01:35:23.100 we're going to see
01:35:23.880 more of that come back,
01:35:25.060 the idea that like,
01:35:25.900 hey,
01:35:26.000 certain things
01:35:26.480 are just not
01:35:26.940 appropriate for work.
01:35:28.740 Yep.
01:35:29.440 All right, sir.
01:35:30.140 Well, I think
01:35:30.600 that's a good place
01:35:31.380 to stop.
01:35:32.780 We covered what you're
01:35:35.120 doing relatively
01:35:35.800 comprehensively.
01:35:36.600 Is there anything else
01:35:37.680 that you'd like to
01:35:38.820 bring to people's
01:35:39.460 attention before we
01:35:40.400 switch to the
01:35:40.940 daily wire side?
01:35:42.380 Well, if people want
01:35:43.400 to be a whistleblower
01:35:44.500 in their own workplace
01:35:45.400 to stop this wokeness,
01:35:46.640 they can go to
01:35:47.140 robbystarbuck.com
01:35:48.060 slash DEI,
01:35:49.220 and we've got the
01:35:50.060 ability for you to
01:35:50.780 give us a tip there
01:35:51.660 and tell us what's
01:35:52.340 going on,
01:35:52.800 and we'll have
01:35:53.680 somebody on the
01:35:54.120 research team or
01:35:54.800 maybe myself reach
01:35:55.620 out and, you know,
01:35:57.020 be able to sort of
01:35:57.860 go from there to learn
01:35:58.660 what's going on in
01:35:59.200 your company.
01:35:59.740 And that's an
01:36:00.260 intensive process,
01:36:01.140 by the way,
01:36:01.500 the way we vet
01:36:02.240 the information
01:36:02.800 because we also are
01:36:04.400 protecting against
01:36:05.060 fake information.
01:36:06.240 You know,
01:36:06.580 there's bad actors
01:36:07.760 out there,
01:36:08.860 you know,
01:36:09.080 people who themselves
01:36:10.080 are traders and
01:36:11.300 trying to trade
01:36:11.860 stocks against
01:36:12.400 companies and
01:36:12.820 things like that,
01:36:13.300 where they will
01:36:13.760 make things up
01:36:14.540 and send it to us.
01:36:15.120 But we have a
01:36:15.640 very good team,
01:36:16.460 and we're very
01:36:17.500 responsible about
01:36:18.320 sort of, you know,
01:36:19.180 making sure that
01:36:19.800 we're putting out
01:36:20.740 information that is
01:36:21.500 true.
01:36:21.760 Because, again,
01:36:22.260 there's so many
01:36:23.160 pitfalls that could
01:36:24.180 endanger sort of
01:36:25.600 what we do,
01:36:26.300 and so we are very
01:36:27.320 careful to make sure
01:36:28.500 because the right's
01:36:29.240 not allowed to make
01:36:29.860 the types of mistakes
01:36:30.620 the left is.
01:36:31.500 You know,
01:36:31.840 so for us,
01:36:32.800 everything has to be
01:36:33.440 perfect every time.
01:36:34.160 That's what I always
01:36:34.600 say.
01:36:35.160 I'd rather be a day
01:36:36.180 late with what we,
01:36:37.640 you know,
01:36:38.120 plan to do
01:36:38.900 than do it on time
01:36:40.780 and realize that we
01:36:41.920 were sloppy with
01:36:42.740 something.
01:36:44.080 Now, you also
01:36:45.060 mentioned that
01:36:45.860 you're dependent,
01:36:46.740 your operation is
01:36:47.500 dependent to some
01:36:48.200 degree on public
01:36:49.040 support.
01:36:49.600 Is there a place
01:36:50.320 that people can go
01:36:51.260 to find out how
01:36:52.440 they can contribute
01:36:53.200 to your efforts?
01:36:54.740 Yeah, they can
01:36:55.200 subscribe to my
01:36:55.900 X page.
01:36:56.440 That's a simple
01:36:56.920 way for people,
01:36:57.620 and it's not,
01:36:58.120 it doesn't cost
01:36:58.740 very much,
01:36:59.220 five dollars a
01:36:59.800 month, but that
01:37:00.580 goes principally
01:37:01.420 100% to funding
01:37:02.800 our research team.
01:37:04.160 None of it goes
01:37:04.740 to me.
01:37:05.400 It goes to funding
01:37:05.900 research.
01:37:07.180 And so, outside of
01:37:08.500 that, you know,
01:37:09.140 on that DEI page
01:37:10.220 I mentioned,
01:37:11.220 there's also a link
01:37:12.200 there where people
01:37:12.780 can give one time
01:37:13.660 if they want to.
01:37:15.240 And so, that's at
01:37:15.920 robbystarbuck.com
01:37:16.740 slash DEI.
01:37:17.460 So, that helps us
01:37:18.080 grow because we
01:37:18.880 really, we need to
01:37:19.940 hire more researchers
01:37:20.800 and we've got some
01:37:21.800 good friends who are
01:37:23.120 very trustworthy that
01:37:24.060 I'd like to bring in
01:37:24.900 to help on the
01:37:25.480 research side of
01:37:26.180 things because I
01:37:27.780 don't think people
01:37:28.260 realize how intensive
01:37:29.260 it is.
01:37:29.780 Like, I mean, if you
01:37:30.380 just talk about
01:37:30.920 executive interviews,
01:37:31.920 going through those,
01:37:33.080 you're talking
01:37:33.600 about hundreds of
01:37:34.560 hours with just
01:37:35.400 one company,
01:37:36.320 of interviews that
01:37:37.180 you have to go
01:37:37.740 through and find
01:37:39.520 the crazy, you
01:37:40.340 know, like, it's
01:37:40.920 like a search for
01:37:41.820 the crazy, right?
01:37:42.760 And so, that's not
01:37:44.140 something that's
01:37:45.520 terribly exciting to
01:37:46.600 do.
01:37:46.820 In fact, it's very
01:37:47.440 boring a lot of
01:37:48.120 times, but it's
01:37:49.140 something that you
01:37:49.720 have to do to
01:37:50.320 really complete
01:37:50.840 these investigations
01:37:51.820 appropriately and
01:37:52.920 you've got to
01:37:53.700 have people, you
01:37:54.580 know, compensated
01:37:55.160 to be able to
01:37:55.920 sit there
01:37:56.480 and do it.
01:37:56.860 So, it is very
01:37:57.480 helpful.
01:37:57.900 We're very
01:37:58.160 appreciative of all
01:37:59.100 of our subscribers.
01:38:00.620 All right, sir.
01:38:01.420 All right.
01:38:01.800 So, I think for
01:38:02.540 everybody watching
01:38:03.260 and listening, if you
01:38:04.040 want to join us
01:38:04.620 on the Daily Wire
01:38:05.260 side, I think
01:38:05.800 we'll do two
01:38:06.320 things.
01:38:06.700 I want to delve
01:38:07.920 more into the
01:38:09.740 biographical details
01:38:11.860 of Robbie's
01:38:12.640 life to get some
01:38:13.620 sense of how his
01:38:16.320 orientation towards
01:38:17.540 this sort of
01:38:18.320 enterprise emerged.
01:38:19.880 And I would also
01:38:20.940 like to clarify a
01:38:21.940 little bit the
01:38:22.900 distinction between,
01:38:24.880 I mean, if this is
01:38:25.660 just a war between
01:38:26.660 ideologies, it's
01:38:27.540 arbitrary in some
01:38:28.400 sense, but you
01:38:29.000 made an allusion to
01:38:29.840 the fact that
01:38:30.460 there's a moral
01:38:32.460 force, there's a
01:38:34.280 moral battle being
01:38:35.060 played out at the
01:38:35.900 moment underneath
01:38:36.560 the political, that
01:38:38.460 isn't arbitrary, it
01:38:40.160 isn't one arbitrary
01:38:41.840 viewpoint against
01:38:42.720 another.
01:38:43.520 There's something more
01:38:44.180 fundamental at stake,
01:38:45.200 and I do think that
01:38:46.040 the
01:38:47.420 emblematic proof of
01:38:49.080 that is the
01:38:49.840 trans-butchery issue,
01:38:51.000 which is so far
01:38:51.880 beyond the pale that
01:38:52.800 you have to be
01:38:53.820 blind willfully or
01:38:55.160 otherwise or malevolent
01:38:56.240 to feel that that's
01:38:57.120 acceptable in any
01:38:57.980 manner whatsoever.
01:38:58.940 So, I think we
01:38:59.680 should delve more
01:39:00.260 deeply into that
01:39:01.020 and clarify it, and
01:39:02.340 that'll allow us
01:39:03.160 also to touch upon
01:39:04.520 something else you
01:39:05.240 alluded to, which is
01:39:06.180 the tilt, the
01:39:08.340 pronounced tilt,
01:39:09.540 especially over the
01:39:10.320 last couple of years
01:39:11.280 away from the kind of
01:39:13.000 radical atheism that
01:39:14.300 was purveyed by the
01:39:15.460 four horsemen of the
01:39:16.300 atheist movement, for
01:39:17.160 example.
01:39:17.700 And so, we'll discuss
01:39:19.060 all that on the
01:39:19.720 Daily Wire side.
01:39:20.400 So, everybody who's
01:39:21.180 watching and listening,
01:39:22.460 you know, join us
01:39:23.720 there so we can
01:39:24.620 continue this discussion.
01:39:26.500 Thank you very much.
01:39:27.540 That was extremely
01:39:28.320 interesting.
01:39:30.180 What you're doing is,
01:39:31.920 well, it's striking.
01:39:32.920 It's striking.
01:39:33.540 And it is really an
01:39:34.360 example, I would say,
01:39:35.540 of something else you
01:39:36.380 pointed to, which is
01:39:37.320 the necessary refusal
01:39:39.960 of the individual to
01:39:41.180 assume that someone
01:39:42.120 else will take care of
01:39:43.340 the problem.
01:39:44.280 You know, part of the
01:39:45.120 problem with that
01:39:45.840 attitude, this is why
01:39:47.120 there's a pervasive
01:39:47.860 sense of meaninglessness,
01:39:49.240 particularly on the
01:39:50.060 left, and that is
01:39:50.880 marked on the clinical
01:39:52.100 side, by the way, is
01:39:53.520 that if you assume
01:39:54.920 that all the
01:39:55.480 important things will
01:39:56.420 be done by someone
01:39:57.200 else, then there's
01:39:57.880 nothing for you to do.
01:39:59.240 And if there's nothing
01:39:59.840 for you to do, well,
01:40:00.700 you don't have any
01:40:01.360 responsibility.
01:40:02.380 Well, exactly.
01:40:03.480 You've got a lot of
01:40:04.440 misery to grind your
01:40:05.760 way through, and you
01:40:06.560 have no reason
01:40:07.180 whatsoever for putting
01:40:08.140 up with it, so why
01:40:09.300 the hell wouldn't you
01:40:10.100 be demoralized and
01:40:11.100 unhappy?
01:40:11.640 And that's exactly
01:40:12.320 what we're seeing
01:40:12.960 arise, particularly on
01:40:14.600 the left, and most
01:40:15.860 interestingly,
01:40:16.580 particularly, although
01:40:17.520 not uniquely, among
01:40:18.640 the young women who
01:40:19.860 are the most likely
01:40:20.620 to be supporting
01:40:21.360 these very woke
01:40:22.340 policies that we
01:40:23.180 described.
01:40:24.300 All right, so,
01:40:24.960 thank you very
01:40:26.540 much for agreeing
01:40:27.220 to talk to me
01:40:27.960 today.
01:40:28.300 It was a very
01:40:28.900 enlightening conversation,
01:40:30.880 not least because you
01:40:32.080 managed to meld a
01:40:35.020 certain elevated
01:40:36.420 degree of
01:40:36.940 philosophical
01:40:37.560 sophistication with
01:40:39.400 an extremely
01:40:40.520 programmatic and
01:40:43.120 strategic approach to
01:40:44.260 the problem that's
01:40:45.460 boots on the
01:40:46.100 ground, right?
01:40:46.760 I mean, you're
01:40:47.520 moving all the way
01:40:48.200 from the level of
01:40:49.280 abstract idea to
01:40:50.340 the implementation
01:40:52.080 phase and the
01:40:53.660 documentation of the
01:40:54.640 consequence of that
01:40:55.480 implementation as
01:40:56.340 well.
01:40:56.600 So, it's really a
01:40:57.360 full-fledged battle
01:40:59.260 strategy.
01:41:00.240 And it's, well, it's
01:41:01.760 already had remarkable
01:41:03.100 consequences.
01:41:04.040 And so, you know, it's
01:41:05.420 really something to see.
01:41:06.700 So, you know, thank you
01:41:08.420 for that.
01:41:09.580 Thank you, Jordan.
01:41:10.100 I appreciate it.
01:41:10.960 And this has been one of
01:41:12.360 my favorite interviews I've
01:41:13.400 done just because you're
01:41:14.660 incredibly detailed.
01:41:15.960 And I think the format
01:41:17.800 here offers the ability
01:41:19.240 to be more detailed in
01:41:20.540 the way that we kind of
01:41:21.220 explain things and talk
01:41:22.180 about it.
01:41:22.780 I always find that
01:41:23.480 interesting.
01:41:23.980 I like the longer stuff
01:41:25.020 versus the shorter
01:41:25.720 stuff.
01:41:26.080 That's probably one of my
01:41:26.960 great regrets of our
01:41:27.840 videos is that they are
01:41:28.740 seven to ten minutes
01:41:29.420 long because if I had it
01:41:30.440 my way, everybody would
01:41:31.440 have my attention span
01:41:32.420 and watch like, you
01:41:33.360 know, an hour and a
01:41:33.920 half, two hour long,
01:41:34.860 longer explanation of
01:41:36.220 what's going on.
01:41:37.220 So, I appreciate what
01:41:38.360 you're doing.
01:41:39.540 Yeah, yeah.
01:41:39.960 Well, I'm looking forward
01:41:40.760 to putting this out for
01:41:41.840 people because they can
01:41:43.040 go back, they can go
01:41:44.180 into it in more detail
01:41:45.420 now and see what you're
01:41:46.220 up to.
01:41:46.740 And, you know, the thing
01:41:47.620 is on the video side is
01:41:48.880 that there's utility in
01:41:50.680 virtually every length of
01:41:52.240 video, right?
01:41:53.440 It's interesting.
01:41:54.680 It's different than the
01:41:55.800 print environment in some
01:41:57.120 fundamental way because
01:41:58.380 on video, it's like you
01:42:01.340 could sell ideas by the
01:42:02.660 phrase, the sentence, the
01:42:04.060 paragraph, the chapter,
01:42:05.620 the book.
01:42:06.260 Like, there's a domain for
01:42:08.000 every length of cut, but
01:42:10.040 it is really useful to be
01:42:11.240 able to drill down the way
01:42:12.260 we did today so that
01:42:13.620 people can get a real
01:42:14.400 sense of the landscape.
01:42:15.420 All right, well, and
01:42:16.580 thank you to everybody
01:42:17.380 who's watching and
01:42:18.120 listening on the YouTube
01:42:18.880 side and film crew here
01:42:20.480 in Toronto today because
01:42:21.940 I'm in Toronto, you
01:42:24.740 know, one of the hotbeds
01:42:26.300 of woke activism in North
01:42:27.800 America, especially at the
01:42:28.920 school board level.
01:42:30.220 I mean, I think the
01:42:30.860 Toronto District School
01:42:31.780 Board is maybe the worst
01:42:32.900 school board for woke
01:42:34.420 nonsense in North America
01:42:35.860 and that's a hard contest
01:42:37.140 to win.
01:42:37.640 So, shout out to the
01:42:38.720 Toronto District School
01:42:39.640 Board for obtaining such a
01:42:41.280 difficult victory in that
01:42:42.300 regard.
01:42:42.600 All right, sir.
01:42:43.520 Thank you very much.
01:42:44.500 Very nice talking to you.
01:42:46.060 Thank you.
01:42:48.460 Thank you.