FREEMIUM: Brokenomics | Best of 2025
Episode Stats
Words per Minute
174
Summary
In this episode of Brokenomics, we take a look back at the best episodes of the last year and discuss what we loved the most and what we'd like to see in the next one. There's also a special Christmas bonus for you, the audience.
Transcript
00:00:00.000
Hello and welcome, welcome, welcome, welcome
00:00:22.120
to Brokenomics. Hello, Merry Christmas, Happy New Year to all
00:00:27.680
the Brokenomics viewers. Now, have you ever watched a TV series that's quite good,
00:00:35.680
and even a really good series will do this, and at some point you're just given a filler
00:00:41.260
episode, an episode that serves no purpose apart from saving time and effort for the
00:00:47.380
writers. So I'm thinking of an episode like The Fly in Breaking Bad, which is, you know,
00:00:53.840
the only new bit of content they recorded is Walter White chasing a fly down in
00:00:58.760
his lab thing. But then they did loads and loads of flashbacks to previous things that happened.
00:01:05.760
It was just pure filler. And I also saw this when me and the woman re-watched Stargate and
00:01:13.060
Stargate Atlantis recently. And like every season, they do that. They record a minimal amount
00:01:19.100
of content and then they just do a whole load of flashbacks. And it's really annoying and
00:01:23.660
it's really lazy. And I've always hated it. And I thought, if ever I have a creative
00:01:30.420
endeavor, I am not going to do that shit. Well, anyway, I'm now doing that shit. And the reason
00:01:35.000
is because I am off for Christmas and the studio's closed and I was busy and I didn't have
00:01:42.120
time to record an extra thing. So I'm just doing filler crap. But I'm going to try and make it good
00:01:49.400
for you by highlighting the very best stuff that I have done over the course of the last year and
00:01:56.520
maybe a little bit beyond it as well to find the very best stuff. And so we're going to play some
00:02:02.320
clips of the absolute highlights of Brokenomics. We're going to make it freemium so that all of you
00:02:08.460
can watch it. Because when I go through these, I'll tell you the view count, but they're the view
00:02:12.480
count of the master race, the people who are actually Lotus Eaters subscribers. Every so often, I get
00:02:18.280
people come up to me in the streets and say, oh, you're Dan, aren't you? And I love Brokenomics and
00:02:22.380
stuff like that. And then I'm thinking, oh, this is good. So I'm shaking their hand and patting them
00:02:27.780
on the back and then saying, oh, so how long have you been a subscriber? And they say, oh, no, I'm not a
00:02:32.500
subscriber. I just watch it on YouTube. And then I'm thinking, oh, you bastard. It's only £5 a month,
00:02:38.000
right? That is clearly less than you spend on stuff that will make you fat, that you
00:02:46.660
don't need, just mouth pleasure, not that kind. You could just buy one
00:02:54.340
less bloody coffee or milkshake or something and subscribe and then we can pay editors, right? It's
00:03:00.060
it's important. But anyway, some of you don't subscribe. And maybe if I give you a bit
00:03:08.180
of a talking-to, you might find the Christmas cheer in you just to bloody subscribe on lotuseaters.com.
00:03:15.020
It's literally only £5 a month. Even if you're proper poor, you can afford that.
00:03:20.940
Anyway. Right. So let's get on with the episodes that we did. All of this is going to be possible
00:03:31.940
thanks to my main man, Thomas, the editor. He's really my partner in creating Brokenomics.
00:03:41.180
He does lots of good stuff and he's put this all together. And so what I've got here is a list of
00:03:45.340
things that you, the audience, really liked. And then I've tagged on a few of my own personal favorites
00:03:49.440
as well. And we talk about what was particularly great. Thomas did the
00:03:57.760
one that you should have seen perhaps last week, the one where I was having a conversation with
00:04:03.400
Santa. So he's bloody brilliant. I should give him a... well, this is his Christmas bonus, me giving him
00:04:09.580
some praise. And I went to him and said, I want to have an argument with myself. Can you make it
00:04:13.720
happen? He was like, yes, I can make that happen. So solid stuff. You'll see his name
00:04:18.640
often gets attached to the top of episodes. So give him his due, but don't poach him. If you
00:04:24.940
need an editor, no, he's no good for that. Leave him alone. I want him here.
00:04:32.120
First episode that apparently you people quite liked was Brokenomics: Economics with
00:04:45.580
Crayons for Socialists. Interesting. That one got 5,000-plus views.
00:04:54.840
It's really interesting to me that that episode did so well. And the reason that it did so well
00:05:00.360
is because that was probably my lowest-effort video of the entire year. Basically I'd been
00:05:06.740
busy with a whole bunch of other content and some other business stuff that we're doing
00:05:11.800
here at Lotus Eaters. And basically that week I was just run ragged. And I thought, what is the
00:05:17.620
lowest effort that I can put into a Brokenomics? And I thought, I know,
00:05:23.640
I'll just get a glossary of basic economic terms and sit there and explain them,
00:05:29.400
but really simplistically, as if I'm talking to a socialist. And you guys absolutely bloody loved
00:05:35.520
it. And this is a weird thing with content creators, right? They often
00:05:41.220
notice the more effort you put into something, the less the audience likes it. And the lower
00:05:46.120
effort you put in, the more they like it. But anyway, it's a bit weird. Watch this and then
00:05:51.120
I'll meet you on the other side. Next basic idea I want to explain: the Laffer curve.
00:05:55.560
The basics of the Laffer curve is that there's a peak point in taxation. So a very basic
00:06:02.080
level chart (actually, crayons would have been great for this). Imagine a chart that has
00:06:07.560
tax rate along the bottom, 0% to 100%. And imagine the vertical axis is the amount of tax
00:06:16.720
collected. And the Laffer curve basically goes up and then down again. The reason:
00:06:24.880
if your tax rate is 0%, then you will get zero tax revenues, obviously because you're not taxing
00:06:32.000
anything. But if you tax a hundred percent, then you will also get zero. And the reason is that
00:06:39.360
if you are going to be taxed a hundred percent, you know, everything you do is immediately taken
00:06:44.960
off you. Why would you bother doing anything at all? You just wouldn't work. There's
00:06:50.080
literally no point in working at all. And when the Laffer curve is shown in books,
00:06:56.560
it is often this smooth hump, which might lead you to assume that the maximum point on the chart
00:07:06.000
is 50%. That's often how it's shown in books. In practice, we don't know what it is,
00:07:14.880
because we don't yet have enough real-world examples. It would be great if we could set up,
00:07:20.960
you know, a second Earth and try taxing everyone half as much on that and see what
00:07:26.800
happened. I think the productive output would soar on that second Earth. Maybe in time, with enough
00:07:32.000
computation and AI, we'll be able to run models on this kind of stuff. But I suspect that
00:07:38.800
the revenue-maximizing point on that curve, i.e. the one that raises the most tax,
00:07:45.360
is probably a tax rate somewhere around the 15% mark. Right. Anyway, hope you liked that one.
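To make that hump concrete, here is a minimal illustrative sketch in Python. The functional form and the 15% peak are assumptions picked purely for demonstration, not a real model of any economy and not anything derived in the episode:

```python
# Illustrative only: a stylized Laffer curve, revenue = rate * taxable activity(rate).
# The form activity = (1 - rate)**K is an assumption chosen so the peak lands at
# the 15% guessed above; real economies are not this tidy.

K = 17 / 3  # with R(t) = t * (1 - t)**K, revenue peaks at t = 1 / (K + 1) = 0.15

def revenue(rate: float) -> float:
    """Tax revenue at a given rate: zero at 0% (nothing taxed), zero at 100% (no one works)."""
    return rate * (1 - rate) ** K

rates = [i / 100 for i in range(0, 101, 5)]
for r in rates:
    print(f"{r:4.0%} {'#' * round(revenue(r) * 300)}")  # crude text plot of the hump
print(f"Revenue-maximising rate in this toy model: {max(rates, key=revenue):.0%}")
```

Run it and the little text chart rises from zero, humps, and falls back to zero at 100%, with the grid maximum landing on the assumed 15%.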
00:07:52.240
What have we got next? We've got Brokenomics Debate: Sex, Feminism
00:07:58.560
and the Death of Civilization, with almost 6,000 views on that one. And that's
00:08:05.840
proper views, not YouTube views. If you add YouTube views, it goes well beyond that.
00:08:10.640
And this is the Dutton, Sula Kowalski and Rainer debate. So, I had Dr.
00:08:17.520
Sula Kowalski on. She was absolutely brilliant, and she reframed feminism for me.
00:08:23.360
And when I heard the reframe, I just immediately got it. And what she was saying is that feminism
00:08:30.560
is an evolutionary strategy developed by women to try and prevent other women from having children.
00:08:39.120
Because if you can prevent other women from having children, that means that you raise
00:08:45.200
the prospects of your own offspring by virtue of the fact they're competing with fewer other
00:08:52.320
offspring. So the successful feminist is one who lives a traditional life and has children,
00:08:59.360
but convinces other women to basically stay working in a cubicle, get in debt, hate men,
00:09:07.760
and not have children. And apparently it works for women, but it doesn't
00:09:12.320
work for men, because for men, there's literally no point trying to stop other men from having children.
00:09:16.960
Because if you miss one, he'll just go around and have all the children anyway. Whereas with women,
00:09:21.760
because fertility comes in such longer chunks, you know, nine months, as opposed to,
00:09:30.320
I don't know, 45 minutes, maybe a bit less, maybe 10 minutes. There is literally no point in trying
00:09:37.040
to gatekeep men in this way. So that's why feminism is a
00:09:42.560
female sabotage strategy. And she explained it brilliantly. And then after that, loads of people
00:09:48.720
in the comments were like, oh, you should get her into a debate with Ed Dutton. So I did, with him and his coauthor.
00:09:53.840
So anyway, loads of you really liked this debate, but I think both of them side by side are
00:09:57.760
excellent. You should check them out. Anyway, let's watch the highlights from that one.
00:10:02.400
I think you're right that it is cyclical. That's for sure.
00:10:07.680
But then the best explanation is that which explains the most. And I think that
00:10:14.080
just to say, oh, well, it's a sexual competition mechanism... I mean,
00:10:20.640
first of all, one of the things you said, I wrote down earlier. You talked about the
00:10:26.080
reproduction game changing and, you know, nobody's children dying off
00:10:32.880
or something to that effect, or fewer... less trouble.
00:10:38.880
I'm not sure that's quite right because what you see under conditions of
00:10:43.120
harsh Darwinian selection, for example, if you look at the research by Gregory Clark,
00:10:46.960
in The Son Also Rises, what has been shown in that and other research is that there was
00:10:51.360
substantial differential completed fertility among different social classes.
00:10:58.880
Yeah. So the richer 50% of the society had about double
00:11:03.760
the completed fertility of the poorer 50%. And people knew this, people were aware of this.
00:11:09.680
I forget which, it was some book I've cited somewhere that was written in about 1670 about
00:11:14.000
demographics. And people were aware of this, that the children of the poor died off
00:11:20.960
and they didn't reproduce. And so, it's a phrase that Ollie has coined, in a sense:
00:11:25.280
our society was being self-genocided by its upper class every generation. And the genes
00:11:32.240
of the higher classes, the genes, therefore, of the more intelligent,
00:11:36.480
were moving their way down every generation, which is why all of us English
00:11:41.040
people are descended from Edward the Third, because that process was occurring. So I think the
00:11:47.680
most parsimonious explanation has to bring into its purview
00:11:54.960
the genetic explanations... sorry, the genetic changes, which are congruent with the
00:12:03.120
behavioural changes that we note in wokeness, which would predict that happening, and other changes.
00:12:10.720
It has to explain the lot. And your explanation is only explaining an element of it.
00:12:16.800
I think your explanation could be taken and drawn under the purview
00:12:20.320
of a larger explanation. Which aspects of wokeness are you suggesting that I'm
00:12:25.200
not explaining? Because your explanation then, you sort of pointed out how it was always the case
00:12:30.400
that the wealthiest were having the most completed reproduction, right? And so
00:12:35.840
the less wealthy were not. And so what began happening when societies began getting wealthier,
00:12:42.960
that began to change, right? Once you're in a society where it is...
00:12:49.680
so this is my argument. Once you're in a society where the differential wealth of the wealthiest is
00:12:55.040
no longer sufficient to guarantee differential reproductive success, the women begin to change
00:13:00.880
their behaviour such that they maintain that differential reproductive success. And the way
00:13:06.400
they change their behaviour is to engage in manipulative reproductive suppression of the women
00:13:11.280
below them, because they can't rely on the lack of wealth and the poverty of those women to kill
00:13:16.240
those women's babies for them. And so what we see now that we are currently calling woke is just the
00:13:21.920
current manifestation of this female manipulative reproductive suppression by the wealthy women whose
00:13:27.280
wealth is no longer affording them the differential reproductive success that it used to. Which is
00:13:32.000
why this stuff emerges when societies get sufficiently wealthy that the poorest 50% are no longer just
00:13:38.480
having babies that die all the time. You're not getting the pressure. Surely,
00:13:46.320
in a context of harsh Darwinian selection and high child mortality,
00:13:52.480
if you're a wealthy woman, there is every possibility that your offspring will die off or
00:13:59.760
whatever. And so there would be a very strong pressure to be very competitive.
00:14:05.040
Okay. And next up is Brokenomics Conspiracies. Over 6,000 views on that one. Again,
00:14:12.000
proper views, not YouTube views; well beyond that on YouTube, of course. With Beau. Now,
00:14:18.720
this is an interesting episode. This is something that I'd wanted to do
00:14:24.080
for a long time. I like talking about this stuff, but I thought, well, I can't really get away with
00:14:28.240
it, can I? Because Brokenomics is not really supposed to be that. But eventually I just
00:14:33.360
couldn't contain myself anymore. And so I think on a previous episode, I casually mentioned that,
00:14:38.480
oh, maybe I should do a conspiracy episode with Beau. And, you know,
00:14:45.520
tell me in the comments if you'd like that. And everybody was like, yes, do that. So that
00:14:49.520
gave me the permission to do what I actually really wanted to do anyway,
00:14:53.760
which was to talk about conspiracies with Beau. And I'm glad I did. God knows what
00:14:58.720
it's got to do with Brokenomics, but it was a fun episode and you all liked it, because it was
00:15:03.280
really popular. So let's check out a bit of that. Flat Earth comes out of, not entirely,
00:15:08.400
but often, it can be an early Christian thing. Oh, that's definitely what's
00:15:13.920
going on here. And the dinosaurs, often it's a creationist thing, because, if you read,
00:15:18.480
they don't want to accept that the earth is in fact billions of years old because,
00:15:22.160
they go like 9,000 years or something, don't they? Some, yeah.
00:15:25.520
So therefore there can't be dinosaurs. So I can imagine if you're a Christian,
00:15:29.760
you have to get to this position. It's not all Christians. It's a particular type of creationist.
00:15:33.840
Yes. It's not all Christians at all. You get geologists that are Christians.
00:15:40.160
Yes. It's just a particular type of creationism. Yes.
00:15:43.760
Where they won't accept evolution. People that won't accept Darwin and the theory
00:15:49.760
of evolution, it's all wrapped up in that dinosaurs aren't real or that the geological record is,
00:15:55.600
is not what you've been led to believe it is. And once again, for me, I put it in the same
00:16:01.440
category as Flat Earth, in that you're just not accepting it, you're failing to reason properly.
00:16:07.600
Yes. Geology is real. So the Earth is billions of years old. And if you go down
00:16:13.600
to certain strata, it must be millions of years old. And there you will find fossilized remains of
00:16:20.240
megafauna and flora. So it's just complete nonsense and cope that dinosaurs aren't
00:16:29.120
real. Yes, for me. I agree with you on that one. Next: ongoing debates, Clinton body count.
00:16:35.440
That's not so much of a conspiracy. I mean, I think the Clintons have been killing people. What do you
00:16:39.360
think? Well, again, it's a matter of record that loads and loads of people that have been close to
00:16:46.400
them, loads have died in suspicious circumstances or committed suicide. Statistically aberrant.
00:16:51.760
Yeah. Completely aberrant. I think it's fair to say that's a matter of record.
00:16:57.200
Hmm. It's not like a conspiracy theory where one person claimed it and actually people have looked
00:17:02.560
into it and it's just not true. It is true. Yes. Your average person will know a small
00:17:09.920
number of people that died, were murdered or committed suicide. Yes. Very small number. You
00:17:14.960
might know one or two or half a dozen your whole life. Maybe if it's over 10 or 20, that's weird. If it's
00:17:21.600
in the dozens and dozens and dozens, I mean, even someone that's a public figure, that's an ex-president or
00:17:26.720
whatever, an ex-senator. They will know more people than a normal person like you or I. So,
00:17:32.400
okay, fair enough, and all that sort of stuff. But still, the numbers are weirdly high.
00:17:40.480
Okay. Hope that was good. Next up, amongst the audience favorites, is Brokenomics
00:17:47.680
Geopolitical Flashpoint with Firas Modad. And so this was the second conversation that I had
00:17:55.280
with Firas. The first one I thought was absolutely brilliant, and that's on my list.
00:18:01.120
This one has ended up here because I put the first one on my own list.
00:18:08.800
So this was the second conversation. I think we've got the first one coming up in a minute.
00:18:12.560
So this was a follow-up conversation. And you can see why, well, after the first one,
00:18:17.280
why I wanted to invite him back. And why, after the second one, I had to go and have a word with Carl and
00:18:21.520
Alan, like, you've got to check this guy out. He is solid. We need to get this guy, because he will
00:18:30.400
really flesh out what we're offering here at the Lotus Eaters because he can add something that,
00:18:35.760
you know, Carl, Beau and I tend to dip into geopolitics, but we're kind of tourists. I mean,
00:18:43.280
we can do a bit of it. But Firas, I mean, he just has absolute expertise on this and he's
00:18:50.240
super based. Really. There's always this competition... well, Beau and Harry think it's a
00:18:56.080
competition, but there's this competition between, you know, me, Beau and Harry as to who's the most
00:19:00.320
based Lotus Eater. And, I mean, in fairness, Firas just effortlessly wins.
00:19:06.720
So he's a solid chap. So, yeah, watch this from the follow-up, at which point
00:19:13.280
it was obvious we had to have him at Lotus Eaters. If the relationship between Russia
00:19:17.280
and China is strong, it's going to severely reduce one of the US's big trump cards. Which is, if
00:19:23.040
Russia is dealing with its, you know, wider regional orbit of people, and it's doing it through
00:19:28.800
US controlled shipping lanes, the US Navy has the ability to interdict those, which always gives
00:19:34.000
it that trump card in the pocket if it needs it. If the relationship between Russia and
00:19:39.280
China continues to develop in the way it is, all of this stuff is going to be done by transcontinental
00:19:44.320
rail lines, which the US Navy has no hope of interdicting. Exactly. And it basically solidifies
00:19:50.480
that relationship. Yes. And I don't know how far along the path that relationship is, but I'd imagine
00:19:56.880
it's full steam at the moment. It's full steam. They're building new pipelines and expanding
00:20:01.760
existing pipelines. They're building more infrastructure. And essentially the policy
00:20:08.720
from Clinton through Bush, through Obama and Biden has given China its Lebensraum. They've given it the
00:20:16.800
sort of expanse and the natural resources that it needs to be a dominant power. The question is how
00:20:26.640
reversible is this? And if it isn't reversible, can the West be insulated from it? But is it even
00:20:34.560
believable? Because let's say I'm Putin and Witkoff turns up and he convinces me that there
00:20:42.480
was a change of direction in the US and I believe him. I've still got to believe that at the next
00:20:48.480
election or possibly the one after that, it's going to flip back the other way. Yes. So why the hell
00:20:53.520
should I... even if I agree with Witkoff, why would I go there? Yeah. And this is something
00:21:00.400
that Putin has mentioned repeatedly, that the Americans are not deal ready, or rather
00:21:05.520
not deal capable, sorry. And that he meets with these envoys and he meets
00:21:10.640
with these presidents and they keep changing and they all promise one thing, go back home and the men
00:21:17.600
in gray suits sit them down and tell them what it's really like and everything changes. And nothing
00:21:23.120
that was agreed between Putin and the American president actually happens. So Putin's side of
00:21:29.440
the story is that he cannot take the word of an American president. What might be different
00:21:37.040
is if there is a severe enough institutional purge in the United States and there is a change in dogma
00:21:46.000
in the establishment, which is a multi-year task. It's not about appointing a few cabinet secretaries
00:21:53.360
here and there; it's transformational change. That might make America deal ready.
00:22:02.160
And there's an argument that it can only happen in the event of a massive shock that hits the American
00:22:08.880
system. Are we talking a civil war level shock or something? Either an internal conflict or an
00:22:17.200
external defeat. Okay. And the last, I believe, of the audience favorites, although actually the rest
00:22:24.720
of them were pretty highly rated by the audience as well, is the Imperium of Dan. Well, it was supposed
00:22:31.200
to be the Imperium of Dan. I told the guy who does the loading of it onto
00:22:36.800
the website. I said, I want this episode called the Imperium of Dan, because I thought that'd be
00:22:41.920
a clever pun. And he changed it to the Imperium of Man, which I suppose is more accurate. But anyway,
00:22:46.560
I was annoyed by that. And what did this get? This got well over 6,000 views.
00:22:52.640
So this again is another Brokenomics episode. What the hell has it got to do with Brokenomics? I mean,
00:22:56.880
I kind of smudged it in to fit with Brokenomics, but it was me and Samson, the producer guy
00:23:04.320
here who does the button stuff when we're doing the podcasts, and some other bits. He's a big
00:23:09.840
Warhammer fan. So we had a chat about it, and this was framed as a very high-level
00:23:16.320
introduction to the theme to see if people liked it so that we could have an excuse to do it again.
00:23:22.720
And evidently, looking at the numbers, we need to do this again. We need more content
00:23:27.760
on this. So I need to have a think about how I can, again, shoehorn this in somehow to being
00:23:34.560
about Brokenomics. But actually, the way we framed it, I mean, there are economic systems.
00:23:39.520
There's a lot of economic systems going on in the Imperium. It's Warhammer stuff,
00:23:43.680
by the way, if you don't know: Warhammer stuff, which is set in the far future and it's a bit
00:23:48.080
grim and dark and all that kind of stuff. So anyway, yeah, watch this quick highlight.
00:23:53.840
And if you want more of it, let me know because I need excuses to, to do this kind of stuff.
00:24:18.080
Hello and welcome to Brokenomics. Now on this episode, I'm going to be fixing the Imperium of
00:24:34.400
Man, which is a big project. But in order to help me, I've got
00:24:39.360
Samson. The Emperor has called for me to assist him. I think you're one of
00:24:45.680
the Emperor's most loyal servants at the moment. Exactly. Yes. I'm going to call this episode
00:24:49.520
the Imperium of Dan. Yes. Yes. If you know about Warhammer, you'll find that incredibly
00:24:54.000
funny. Yes. Backwardness and chaos and barbarity, and humanity looks proper done
00:24:59.760
at that point. Until we get the emergence of the God-Emperor, and he basically comes up
00:25:07.280
and says, no, enough of this. I'm going to put this place in order. And he begins
00:25:13.600
conquering the entire Earth and he's very successful. He does that. And once he's finished
00:25:18.560
that, he's like, now I'm going to conquer everything else. Yep. He starts... I believe
00:25:25.520
it's called the Great Crusade. Yes. Basically unites humanity on Earth and then begins his,
00:25:31.600
well, soon-to-be-called holy conquest across the stars, to reunite this post-utopian society.
00:25:38.880
Because he knows there's all these other human worlds out there. Yep. Now he, he's convinced
00:25:44.320
that because the massive space perverts screwed up once before, if humanity doesn't get its
00:25:50.080
shit together, something like that is going to happen again. And the only
00:25:54.800
way to stop that is if humanity is united under him and then they exterminate everything that isn't
00:26:01.680
humanity, which, you know, give him credit, bold plan. Yeah. Fair enough. Yeah. It has its merits.
00:26:10.640
Yes. It has its merits. So anyway, now we're almost done with the
00:26:14.800
lore bit and then we can get on with the actual problem with the Imperium. In order to conquer the
00:26:19.440
galaxy, it occurs to him that it might be useful to have an entire race of super soldiers built.
00:26:24.880
So he promptly goes off and builds them. He builds himself 20 generals. And these
00:26:30.880
are basically, you know, transhuman demigods. They're like 14 foot tall and they've got multiple
00:26:36.320
hearts. Yes. Yeah. Multiple extra organs that they keep inventing, basically, to make them
00:26:42.880
the most powerful soldiers the galaxy has ever seen. Well, the Primarchs, he calls them his sons.
00:26:47.840
There's 20 of those. And these are absolute superhuman-type beings. I mean, they're
00:26:54.000
absolutely singular, apart from him. Right. Next up is the first conversation that I had
00:27:06.880
with Firas. Firas just nailed it in this one. I didn't know what I was getting with him,
00:27:14.800
but he came in and he just completely nailed it. And, I mean, it's just so
00:27:22.000
good. Oh, by the way, if you're not watching Realpolitik, you really should, because he's got his
00:27:27.040
own show now on Lotus Eaters and it's quite superb. And he really gets under the skin
00:27:32.880
of things. And the other cool thing about Firas is that, because he's
00:27:37.280
Middle Eastern, he gets a pass on certain questions, so he can just go there,
00:27:43.280
which is kind of necessary for covering geopolitics fairly. So do check out the
00:27:49.440
work that he's been doing. But yeah, here's some clips from the first conversation
00:27:54.480
we had that really cemented him as a man to listen to. It's 60, but yes. Okay. Okay.
00:28:00.560
50, 60, they're 50, 60 grand, are they, these drones? Okay. A bit more expensive than I thought,
00:28:06.320
but an aircraft carrier, I don't know how many hundreds of millions that runs into.
00:28:11.200
So let's put all of this in context. First thing, the fact that the plane fell off meant that it
00:28:19.120
wasn't secured properly. And that's a training issue. In a crisis, somebody forgot to secure
00:28:25.520
the plane properly. Well, and also the fact the aircraft carrier was turning so sharply because
00:28:29.840
all the previous lines of defense. Yes. So that's the second issue. But in training for
00:28:36.320
maneuvers, you would train on this kind of sharp turn as part of your regular work in the Navy.
00:28:48.880
And you were supposed to know what to put where and how to stabilize it. Yes.
00:28:56.000
The second part is that it penetrated all of this air defense and that the aircraft carrier did have to
00:29:01.120
maneuver in the first place. Now we don't know exactly what was penetrated and how much was
00:29:06.720
penetrated, and how much of it was sort of just-in-case, or whether the air defense was fully penetrated.
00:29:16.000
The third part is that it's been two or three months of this bombardment
00:29:20.160
and Biden obviously had been doing it for a while longer. And yet this kind of capability
00:29:28.400
and anti-air capability with the Houthis still persists. So the American Navy is doing all of this
00:29:35.200
fighting and they haven't been able to contain the problem. The final point is that,
00:29:43.360
when this war started, I said Yemen was going to be a huge problem because of the nature of the
00:29:52.240
terrain, the extent of weaponry that's available to the Houthi, the cost of these missiles,
00:30:00.480
et cetera, et cetera. Then I saw what happened with Hezbollah and I thought, well, okay,
00:30:06.160
if the Americans can fight this way and have that kind of intelligence on Hezbollah's leadership,
00:30:14.240
then they might be able to turn it around and force the Yemenis to comply.
00:30:20.640
So far, we haven't seen that. Okay. When you say American intelligence on Hezbollah,
00:30:25.440
I mean, do you mean Israeli? Well, it's the same thing. Right. Okay. But do the Israelis
00:30:31.520
have an interest in Yemen? Yes, because it's their shipping lanes that are being shut. It's over
00:30:35.760
their shipping that this is happening. But I don't think that they have anywhere near the same
00:30:42.960
collection capability against the Houthi. The question is how much would they
00:30:50.640
have collected from the Iranians that would have translated into understanding of the Houthi?
00:30:58.160
Okay. Because the Israelis have penetrated the Iranians, as well as Hezbollah, as well as Hamas,
00:31:03.440
to be able to do the kinds of things that they're doing. Right. And now, some clips of
00:31:10.320
episodes that don't fall neatly into the audience top five. These are some of my
00:31:15.680
favorites. And actually some of these go back to 2024. So they don't even count in the remit that
00:31:20.320
I was given by my editor, but I just wanted to include them anyway. And actually this one did
00:31:24.480
extremely well. So this is Brokenomics IQ: 6,000 views, very solid episode.
00:31:31.360
And it was when I interviewed Ed Dutton, and Ed is brilliant. Flamboyantly eccentric, with
00:31:42.400
touches of genius interwoven into that as well. Quite a few touches of genius, actually. He's a very
00:31:47.600
smart guy. And we had a conversation about IQ. And again, I like episodes where I thought
00:31:55.840
I knew what I was talking about. And then it's like, oh, bloody hell, there's all these angles
00:31:59.840
that I hadn't considered. And I end up thinking about them for quite a lot longer after having
00:32:06.240
recorded them, because, like, oh yeah. You know what I mean? Right. Anyway, here's
00:32:10.880
the episode. Well, hello, hello, hello. Welcome to Brokenomics. I am here today
00:32:16.480
with Edward Dutton. Hello. Evolutionary psychologist. Actually, you tell me what
00:32:22.640
you are. Yeah, I suppose I'm an evolutionary psychologist. Yes. Professor Ed Dutton,
00:32:28.400
a man who knows a thing or two about intelligence. Because, you know, I've got the sneaking suspicion
00:32:32.960
that on Brokenomics, we are always talking about broken systems, things going wrong. Quite possibly
00:32:39.120
there is a more fundamental cause than the ones that we've been discussing up till now.
00:32:43.840
The, whatever, the chavs, whatever you want to call them, you know,
00:32:46.640
the Deanos, this new word I've learned, they simply didn't exist in 1880.
00:32:55.120
IQ had not gone down that far. They didn't exist.
00:32:59.680
And the people that are the sort of working class of today, as it were, whatever,
00:33:03.920
working in a shop, that kind of thing, they would be on the streets, in the workhouse, dying.
00:33:09.120
In 1880, they'd be right at the bottom of society.
00:33:11.280
Because society was basically harder and you needed to be sharp to make your way through it.
00:33:15.280
Just that they were right at the bottom. They were the bottom. And now there's a new
00:33:19.360
further standard deviation that's come in at the bottom, which is the underclass, which just
00:33:23.840
didn't exist. Those kind of people just didn't exist. That's the opposite of what we used to have.
00:33:28.960
Up until about 1800, it was exactly the opposite. We were breeding for intelligence. And intelligence
00:33:34.000
was going up every generation. That is interesting. Now, another fundamental
00:33:38.480
question that I have before I start to get into a little bit more depth on this stuff is,
00:33:41.680
am I right in any way when I think there are different types of high intelligence?
00:33:47.600
Because I'm pretty sure that I would have made a rubbish auditor because I'm rubbish at detail,
00:33:52.320
but I've made a pretty good investor because I'm good at seeing broad patterns and I don't mind,
00:33:57.040
in fact, I quite enjoy being contrarian. And I often meet people who are clearly highly
00:34:02.000
intelligent, but they just seem to be intelligent to me in a different way.
00:34:05.040
So are there different types of high intelligence, or is it just how you apply yourself
00:34:12.160
over your life that gets you to that point? So first of all, with the intelligence pyramid
00:34:19.440
I mentioned earlier, there is a correlation of, I don't know what it is, 0.5 or whatever,
00:34:24.800
on average, between spatial intelligence, mathematical intelligence,
00:34:28.720
and linguistic intelligence. Normally people that are good at one are good at all three.
00:34:31.680
And then you get this g factor that underpins it, for that reason.
00:34:35.840
But there are variations. So you're going to get some people that are very, very linguistically
00:34:40.000
tilted, very, very high in linguistic intelligence or whatever, very good poets or writers or lawyers,
00:34:46.800
but they might have very poor spatial intelligence, to the extent they can't drive a car.
00:34:50.560
Like A.J. Ayer was an example of that. Couldn't drive a car. Or Einstein used to just get lost.
00:34:56.400
And then you're getting other people, again, very good at maths, whatever,
00:35:03.040
but the tongue-tied physicist or whatever, not very linguistically intelligent. They're unusual,
00:35:08.000
but they exist. Or you might even get someone that has very high linguistic intelligence or whatever,
00:35:12.960
but he's weak in something lower down the pyramid of intelligence, as it were. Maybe he has very slow
00:35:17.120
reaction time, very slow processing speed. So these kinds of variations exist, and that's going to affect
00:35:23.600
how good you are at certain tasks. What you also get is what's called the cognitive integration effort
00:35:28.960
hypothesis, is that the more intelligent you are, the weaker is the positive manifold between the
00:35:37.920
different kinds of intelligence. So, let's say you've got a really high IQ, 145,
00:35:44.000
really, really high IQ, then you might be absolutely rubbish at some of the things that are at the base
00:35:50.720
of the pyramid that I mentioned earlier. The specialised cognitive abilities
00:35:57.120
are modules that are weakly associated with general intelligence.
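The roughly 0.5 correlation and the g factor described here lend themselves to a toy simulation. In the sketch below, a single shared factor drives three abilities; the loading of sqrt(0.5) is an assumption chosen purely to reproduce that 0.5 figure, not an estimate from real test data:

```python
# Toy single-factor ("g") model: one shared factor drives three abilities, so
# spatial, mathematical and linguistic scores come out pairwise correlated ~0.5.
import random
import statistics

LOADING = 0.5 ** 0.5                  # ability = LOADING*g + noise -> pairwise r = LOADING**2 = 0.5
RESIDUAL = (1 - LOADING ** 2) ** 0.5  # keeps each ability at unit variance

rng = random.Random(42)

def person() -> tuple[float, ...]:
    """Spatial, mathematical and linguistic scores for one simulated person."""
    g = rng.gauss(0, 1)  # the general intelligence factor
    return tuple(LOADING * g + RESIDUAL * rng.gauss(0, 1) for _ in range(3))

spatial, maths, verbal = zip(*(person() for _ in range(50_000)))
print(f"r(spatial, maths): {statistics.correlation(spatial, maths):.2f}")  # ~0.50
print(f"r(maths, verbal):  {statistics.correlation(maths, verbal):.2f}")   # ~0.50
```

Raising the loading for one ability and lowering it for another produces exactly the lopsided profiles described above: the tongue-tied physicist, or the poet who can't drive.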
00:36:00.720
Right, next up, I've got Brokenomics Industrialising Space. And yes, this was another standout episode.
00:36:06.960
This was... how did this happen? Oh, I know what it was. So me and Beau were arguing on the pod,
00:36:13.840
because Beau likes to do space segments every now and again. And he did one about living on
00:36:18.720
Mars. And my view is that we should colonise everywhere in the solar system apart from Mars.
00:36:24.480
I don't like Mars. And I then put up the case for why Venus is better. And for those of you who
00:36:31.840
know a little bit about space, you might be thinking, no, that's wrong. Venus is awful. No,
00:36:35.920
I'm right. Venus is better. And I explained it in that episode. And anyway, so there was a planetary
00:36:43.680
scientist who was watching that, who got in touch. How did he get in touch? Maybe a comment,
00:36:49.680
maybe an email, I can't remember now. But he got in touch to say, oh yeah, I'm a planetary scientist,
00:36:53.280
and I agree with you. And he made some other good points. So I thought, oh, he sounds like
00:36:59.280
an interesting person to talk to. So I got in touch, connected up on one of the messaging apps,
00:37:05.280
whatever one it was. And I was speaking to this guy and quickly realised, yeah, OK, he's got talent
00:37:13.040
for this. Let's get him on. So I did an episode with him. And again, because I was trying to shoehorn
00:37:18.640
it into the Brokenomics frame, I thought, let's make it about industrialising space. And it just
00:37:23.120
happened to work brilliantly. And it was such a farsighted episode. I mean, we literally go from
00:37:31.280
low Earth orbit industrialisation up to the colonisation of the local cluster. So I mean,
00:37:37.120
it's just superb. Anyway, the audience loved this one. And I've had him back a couple more times,
00:37:41.920
and we'll definitely have him back again. So yeah, watch the clip. And if you haven't,
00:37:46.240
for some reason, seen it, go and watch the whole thing. And I'm delighted to say that I have found
00:37:51.280
just the man, Grant Donahue, planetary scientist. Welcome to Brokenomics.
00:37:58.480
Good. Well, can you tell us about yourself? What is a planetary scientist?
00:38:02.560
So planetary science is a subfield of geology, which is an annoying term because geology specifically
00:38:08.400
refers to the Earth. But I was trained as a sedimentary petrologist. Professionally,
00:38:14.000
I've worked as both a petroleum and an environmental geologist. But more broadly,
00:38:21.040
it means that I study the evolution of systems. We don't tend to say geology in isolation because
00:38:30.000
we've discovered that there's an enormous amount of interaction between the lithosphere,
00:38:33.360
the hydrosphere and the atmosphere. Considering one without the others is simply not reasonable.
00:38:40.640
Mm hmm. I see. OK, so you do Earth-based geology as well. I mean, that is probably
00:38:47.440
helpful because one of the things I was thinking leading into this is how do planetary scientists
00:38:52.320
get jobs? Because, you know, there can't be that much demand for it,
00:38:56.400
given that we're not actually in space yet. But if you can do the geology bit as well,
00:39:01.520
Yes. So there are a number of Ph.D. programs, positions in academia, NASA, for example,
00:39:08.640
who are looking into ISRU, a word I'm going to use a lot: in-situ resource utilization, for whom
00:39:15.760
planetary scientists are of great interest. But that would be for someone who's
00:39:20.720
more specifically focused on that. I am a mere mortal, and the allure of the petrochemical industry and
00:39:27.280
the greater opportunities there won out. They pay well. Yes. Got it. Right. But nevertheless,
00:39:33.600
we can dream that, you know, 20 years down the line, there'll be real demand
00:39:37.600
for the planetary side of your skill set. Yes. Getting to that point, where do you
00:39:43.440
think we're kind of going to go first in terms of making use of space industry? Because I know there is,
00:39:50.160
I mean, there's obviously quite a bit of space industry at the moment with the satellite systems
00:39:53.360
that we have going around. And I understand things are getting pushed in that direction more and
00:39:57.040
more. So, for example, because there's so much data basically up there in low Earth orbit,
00:40:02.800
firms are starting to realize that actually it's cheaper to send data centers up into
00:40:06.640
Earth orbit and process the data there than it is to sort of bounce it all backwards and forwards all
00:40:11.360
day. And once you start, you know, pushing more and more of the value chain up into orbit,
00:40:16.800
well, then you've got a growing industry, and then it's natural at some point to go beyond that.
00:40:20.560
So what do you think are the sort of key drivers that are going to make this a proper industry?
00:40:26.000
Well, there's a wonderful expression in geology, which was by Charles Lyell, who's,
00:40:33.360
along with Hutton, considered the father of geology. And he said, the present is the key
00:40:37.920
to the past. And I think a principle we can extrapolate from that is that if we want to
00:40:43.520
look at how we might expect the industrialization of space to play out, it might be useful
00:40:49.280
to examine how industrialization played out or did not play out in the past.
00:40:52.880
And so in that regard, I would say that where we are in space right now is similar to where the
00:40:59.120
Romans were with the aeolipile, you know, the little thing that spun around,
00:41:03.200
in that the engineering and the technology is there, but because of a lack of resources or
00:41:09.760
contravening economic motives, you know, for the Romans, it was the fact that they had no real call
00:41:15.120
for labor-saving devices. Labor was the one thing they had no shortage of.
00:41:19.440
That did much to halt the advancement of that technology. There was no virtuous cycle that
00:41:26.480
led to reinvestment and reinvestment and reinvestment. But by contrast, when industrialization did kick
00:41:31.280
off in the United Kingdom and the Low Countries and eventually the rest of Europe, there were a
00:41:37.040
series of factors that drove it, most principally when it comes to geology, coal. England and Wales
00:41:43.440
have the most bountiful resources of anthracite in the world, more or less. High-quality coal that
00:41:49.120
burns incredibly cleanly, and it runs right to the surface. And so we've been extracting coal
00:41:55.760
since the 16th century. Railways actually predate the steam engine by 200 years, and a lot of people
00:42:00.160
made a lot of money mining coal around southern England. But one of the interesting things
00:42:06.560
that affected that was that in Rome and Britain, the main thing that limited mining was that you
00:42:13.200
wouldn't exhaust the mineral seams, you'd reach the water table. And unlike many other
00:42:18.160
applications where alternative forms of power like wind or water were appropriate, if you want to pump
00:42:23.840
water out of a mine, you can't then stop those pumps. If the water gets back in, you rot the supports and
00:42:28.720
the whole thing falls in; it becomes unsafe. And so you need a form of power that is portable,
00:42:33.760
that doesn't need to be attached to a river, and that runs constantly; it can't be periodic.
00:42:37.600
Right, next up, Brokenomics Energy. Nearly 5,000 views on this one. It really should have gone easily
00:42:45.200
over 5,000. I think people just thought, oh, it's about steam engines and stuff, and didn't
00:42:50.880
didn't, didn't properly apply themselves to it. It was actually a bloody brilliant episode.
00:42:56.880
It was with Alex Masters, who was the guy that I had on,
00:43:04.000
you know, the election stream, where I turned up to do the green screen in a green shirt without
00:43:09.680
really thinking about it and accidentally discovered the most amazing thing ever. Well, he was my
00:43:15.040
co-pilot on that green screen stuff. He wasn't wearing a green shirt, because he
00:43:19.200
thought about it a bit more. Anyway, so I had him in for Brokenomics and we talked through
00:43:26.560
how Britain got industrialized in the first place. And honestly, you might think,
00:43:34.400
oh, it's history, how interesting. But it is fascinating. It is fascinating. And it's
00:43:41.040
difficult to explain why. So I'll just play the bloody clip and then you can see why it's so fascinating.
00:43:46.400
But again, it was one of those conversations that just rewrote my mental model of the past
00:43:51.920
and how we got here and why England industrialized and how it all fits together. And,
00:43:58.640
for me, this is probably one of my very favorite episodes, because,
00:44:04.000
you know, the wiring changed. Anyway, check it out.
00:44:09.040
Hello and welcome to Brokenomics. Now in this episode, I thought I'd get into that whole net
00:44:13.680
zero business. And I thought the right person to do that with would be Alex Masters. Alex.
00:44:18.960
Hello. Good afternoon. Yes. Welcome to Brokenomics. Now, we may know you as that steam guy
00:44:26.480
from Twitter and various other places. Tell us about yourself.
00:44:30.320
Yes. Well, it does turn out I'm actually a real person. Yes. So, I'm Alex, that steam guy.
00:44:36.560
I am the chairman of an industrial history charity down in Devon called the Robey Trust.
00:44:43.440
I potter about making sustainable fuels as part of that, and generally just sort of making
00:44:50.800
myself a nuisance in the community driving rollers about. But what I really am at heart is a
00:44:58.160
sort of practical industrial historian, growing up on episodes of Fred Dibnah and things like
00:45:03.200
that. You know, so really trying to live up to that. And it was the best of British when we were doing
00:45:08.880
steam things and stuff. England was fascinating because we're at this weird, like, Goldilocks
00:45:16.320
country where everything is a bit of a pain, but it's not insurmountable. And because England was
00:45:23.200
quite densely populated, basically all the forests had been chopped down for charcoal making by about
00:45:29.040
the Norman Conquest. There's no virgin woodland in England at all by the Tudor period.
00:45:35.680
And one of the things the Tudors have real problems with is securing
00:45:42.480
sources of charcoal. What then happens, because that becomes a pain, is Abraham Darby in 1709
00:45:52.080
realizes that you can smelt iron with coke. And all coke is, is coal that's gone through
00:45:58.800
the charcoal process. So you've expelled all the volatiles and the sulfur and all the rest of
00:46:03.120
it. So this is the set it on fire, bury it, and then dig it up later. Yes. Warm it up. Yes.
00:46:10.640
Well, get it nice and hot and then, yeah. Coke is weird; it feels like pumice.
00:46:16.400
It's a very light, quite expansive sort of rock. But you chuck that in with the limestone
00:46:25.520
and the iron ore and you can make... But it burns hotter than coal, does it?
00:46:29.120
Much hotter than coal. And it's the impurities as well. If you start adding things like sulfur
00:46:34.640
to iron, you'll make it very brittle. So it's unusable. Oh, and there's sulfur in coal. There's
00:46:39.840
a lot of sulfur in coal, which is what it smells of. Ah, so is coke pretty much pure carbon?
00:46:44.960
Pretty much pure carbon. Yeah, you've driven off the volatiles, as they say,
00:46:49.920
and you're left with the sort of carbon matrix that held them all in. And that's what...
00:46:55.200
And when did we figure that out? 1709. Okay. So right at the beginning of the 18th century,
00:47:01.440
that's in... Isn't that just when the whole industrial revolution thing started to...
00:47:05.600
This is the Industrial Revolution, because what he does there is, you don't have to wait for a tree
00:47:10.560
to grow, to chop it down, to turn it into charcoal, to do...
00:47:17.200
What you do is you mine basically an infinite amount of fuel.
00:47:24.400
You can have as much as you want whenever you want and you can make...
00:47:28.000
And so what happens there is he's now got a competitive advantage over everybody else.
00:47:31.520
Okay. Last episode for this little roundup and then you can get back to your,
00:47:37.840
whatever it is you're doing. This probably goes out slightly after Christmas. So you're just kind
00:47:43.440
of floating around, waiting to go back to work, having to find reasons to talk to your in-laws,
00:47:48.720
stuff like that. Anyway, so last one before you can get back to all that: Brokenomics House Prices.
00:47:54.720
Now I'm definitely breaking the rule of favourite episodes of 2025 here, because this is from 2023.
00:48:00.480
It was actually one of the earliest episodes that I did. And it was with a guy,
00:48:05.600
Peter Laurie, who is a very knowledgeable man in finance, who reached out very early on in my
00:48:12.960
Brokenomics journey and said, I love the stuff you're doing. And we ended up having a couple of
00:48:17.040
chats and it was really solid. But he's also got a housing company and him and his partner came on
00:48:23.760
and they talked about the realities of trying to build houses in this country. Because for most of
00:48:31.120
you, I would imagine house prices are a bit bloody annoying: they're high, they're really high,
00:48:37.840
they keep going up and they're growing faster than your wages. So it's tempting to take a shot at
00:48:44.720
housing developers. It's like, why are you making it so sodding expensive? But when they explain the steps that
00:48:51.360
they are forced to go through in order to build a bloody house, you can easily understand why
00:48:57.840
houses are so sodding expensive. And again, it was one of those episodes that is necessary to watch to
00:49:03.520
understand why we're in such a systemic bloody mess in this country. I mean, there's one bit,
00:49:09.520
I don't know if it's in the clip because Thomas basically did all of this for me, but there was
00:49:13.920
one bit where they explained that before they can build a house, they need to send people down there at night
00:49:19.040
to stand in the plot with a ticker tracker thing and count how many bats fly overhead. And that is
00:49:26.800
just one of hundreds of bits of bullshit that they need to comply with before they can build a bloody
00:49:33.840
house. It's one of those things, whenever you think Britain is screwed up, when you really drill
00:49:43.040
into the detail and speak to people who know stuff, you realise it is considerably more screwed up than
00:49:48.720
you thought it was. Anyway, here's a clip. Hello and welcome to Brokenomics. Now in this episode,
00:49:53.440
we're going to be answering the question of why house prices are so high. And to address this question,
00:49:58.320
I've got back Peter Laurie, friend of the show. Good to have you back. Thanks so much, good to see you
00:50:03.440
again. But also Ian Lush. Thank you very much for coming on, Ian. Now, you've flagged this on
00:50:09.440
previous Brokenomics episodes. You've got a sort of a window into property developers.
00:50:14.160
Yeah, the three of us run a small niche house builder, if you like. I suspect housing supply
00:50:21.200
is going to drop because of some of the factors we talk about here. And this is why I said to you,
00:50:27.600
when we started, I want to talk about the banking industry as well. I think
00:50:32.000
we're never going to get to the 300,000. And actually, it's entirely possible that this
00:50:36.320
year we don't produce the same, that we'll be lower than we produced last year. Because every data
00:50:42.240
point you've given me so far shows that supply just goes down and demand just goes up. So the lines are
00:50:49.120
heading away from each other quite fast. And what really worries me is, every recession you
00:50:55.760
have, you see a drop in the number of house builders. That's also been clear to me.
00:51:00.960
Yes. So Ian, he just painted a fairly pessimistic picture from supply and demand. Help us get into
00:51:09.600
the nitty-gritty. Give us a basic overview of the process of actually building a house.
00:51:14.720
Okay. So you start by trying to identify a possible site that you could develop,
00:51:21.680
which in our experience is mostly brownfield. So existing urban land, if you like.
00:51:31.440
So having identified something that you think might be suitable for development,
00:51:35.120
you've then got to negotiate with the owner in order to strike a deal to be able to buy that.
00:51:40.160
And that's after you've checked that it's within the urban boundary. Therefore, it's likely to be
00:51:45.600
developable. If it's outside the urban boundary, there's every chance that you wouldn't be able
00:51:50.160
to develop it in the first place. So you've gone through a bit of thinking and a bit of process
00:51:56.560
and some negotiation, and you've had to do searches to make sure there's not pipes or electric cables.
00:52:01.120
A whole raft of research to make sure that lots of different factors aren't going to be an immovable
00:52:11.120
problem. They can fall into lots of different categories from ecology to categorization of land to
00:52:22.080
where it is to drainage, a whole raft of things that could be an absolute brick wall that stops you
00:52:29.360
ever developing it. And this is something I'd throw in here as well, sorry to interrupt.
00:52:33.680
No, go for it. There's still a certain amount of EU legislation which makes a lot of developable land
00:52:39.600
undevelopable. I'm referring to SSSI here. If you are within a certain distance of a,
00:52:45.760
I'm going to get it wrong now... Site of Special Scientific Interest. You can't develop.
00:52:49.760
So if you're within 400 meters of the boundary of an SSSI-categorized piece of land, you can't
00:52:57.200
build. You won't get residential permission. How often does that come up?
00:53:02.160
A lot. They're all over the country. You go off and have a look at possible housing
00:53:07.120
sites all the time. One of the first things we check is how close is it to the nearest SSSI?
00:53:12.880
Oh, so they're all over the place. Oh yeah. Oh yeah. And they come in lots of different sizes as well,
00:53:17.840
because the definition is all about how scientifically valuable the site is.
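As an aside, the 400-metre rule just described is mechanical enough to sketch in code. Everything in this snippet is hypothetical: the coordinates are invented and a real screen would use the official SSSI boundary polygons, but it shows the kind of filter that gets run on every candidate plot:

```python
# Hypothetical sketch of the SSSI screen: flag a candidate plot if it sits within
# 400 m of any sampled boundary point. Invented data; real checks use official polygons.
import math

EXCLUSION_RADIUS_M = 400  # within 400 m of an SSSI boundary -> no residential permission

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two latitude/longitude points."""
    r = 6_371_000  # mean Earth radius, metres
    dlat, dlon = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2)) * math.sin(dlon / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def plot_is_blocked(plot: tuple[float, float], boundary: list[tuple[float, float]]) -> bool:
    """True if any sampled boundary point falls inside the plot's exclusion radius."""
    return any(haversine_m(*plot, *point) <= EXCLUSION_RADIUS_M for point in boundary)

plot = (51.5560, -1.7800)                        # invented candidate plot
sssi = [(51.5587, -1.7800), (51.5700, -1.7500)]  # invented boundary samples
print(plot_is_blocked(plot, sssi))  # True: roughly 300 m from the first sample
```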
00:53:22.240
So how often do you... I'm trying to think through the ratio of how often somebody
00:53:27.440
gets in touch and says, I've got a bit of land I might be interested in selling. Do you actually
00:53:31.680
make an offer? You could, for example, send out 200 letters across an area, a county, and you might
00:53:42.080
get two or three replies. If you're lucky, you might get one piece of land that becomes a deal.
00:53:48.400
But that might take a year to get there. Okay. Well, with the process you've described,
00:53:54.560
it sounds like it would take at least a year. Well, in terms of negotiating and figuring out
00:53:59.200
whether it's developable. Right. Well, I hope you enjoyed all of those. If you haven't seen all
00:54:04.240
of those episodes, then go and watch them, because they're really good. In fact,
00:54:08.480
while you're off, until you get back to work, just rewatch the entire back catalogue of Brokenomics.
00:54:14.080
That's probably the best thing you can do. And also, as I mentioned earlier, if you are not
00:54:19.200
subscribed, it is literally only £5 a month. There is no conceivable way that you will miss that amount
00:54:27.200
of money every month. But if you do it, then we can pay editors, and then we can do better work. And
00:54:32.880
you know, maybe at some point, we can hire more presenters and do other stuff as well. But only
00:54:37.680
if you support us. So get in the Christmas spirit and subscribe. You get not just Brokenomics, you get all
00:54:43.920
of it. You get Chronicles from Luca. You get Realpolitik from Firas. You get the history stuff from
00:54:52.320
Beau. You get the deep dive analysis that me and Carl often do. And you get to watch the podcast live
00:54:59.920
and leave comments, send in videos if you pay a bit more. There is so much high-quality content.
00:55:07.840
And if you're so busy you can't watch any of it, just subscribe and give us the £5 a month anyway
00:55:12.800
as support, because we cannot do this without you. You are the partner that makes all of this possible.
00:55:20.240
Otherwise, you're just going to get BBC'd forever. You know, you're going to get that liberal propaganda.
00:55:26.240
So thank you very much for your support. Thank you for your Christmas spirit.
00:55:30.080
I can see you now. You are reaching into your pocket. You're taking out your wallet. You're
00:55:34.080
removing your credit card. You're going to lotuseaters.com and you're subscribing because
00:55:39.280
you want to make the world a better place. Thank you and Merry Christmas and Happy New Year.