421R_transcript_The origins of scaling in cities

Check out the episode:

You can find the shownotes through this link.


Are you interested in the connection between cities' evolution and their population size?


Our debate today is based on the 2023 article titled The origins of scaling in cities, by Luís M. A. Bettencourt, published in the journal Science.

This is great preparation for our next interview with Greg Lindsay in episode 422, talking about the city as an engine for creative collisions.

Since we are investigating the future of cities, I thought it would be interesting to see the scaling relations of urban areas. This article suggests that urban efficiency can be measured by balancing the benefits of social interactions against the energy costs of moving people and information.

[intro music]


Welcome to today’s What is The Future For Cities podcast and its Research episode; my name is Fanni, and today we will introduce a research article by summarising it. The episode really is just a short summary of the original investigation, and, in case it is interesting enough, I would encourage everyone to check out the whole paper. This conversation was produced and generated with NotebookLM as two hosts dissecting the whole research.


[music]

Speaker 1: So if you double the size of a city, you might naturally expect everything inside it to just double right along with it.

Speaker 2: Right? Like the crime, the wealth, the number of roads, you think it all just scales up linearly.

Speaker 1: Exactly. But they don’t, the amount of roads only increases by about 85%. Meanwhile, total wages and innovation, they jumped by 115%. Today we are looking at Luis Ma Betten Kurz paper, the origins of scaling in Cities, and we’re asking a pretty fundamental question. Can the complex, seemingly chaotic nature of human cities actually be reduced to a unified set of mathematical scaling laws?

Speaker 2: It is a massive question.

Speaker 1: It really is, and I’m taking the position that yes, it absolutely can. The data shows cities aren’t just these random, messy collections of people. They universally function as predictable, open-ended social reactors, and they’re governed by strict physical scaling laws and deterministic infrastructure networks, regardless of their unique histories.

Speaker 2: I take the opposing view here. While aggregate statistical patterns definitely exist in the data, a purely deterministic framework really overlooks the vital constraints of bounded human effort. I’m arguing that this model oversimplifies the suboptimal outliers. It ignores the fragile, localized limits of urban infrastructure that actually define a real-world city’s survival.

Speaker 1: Let’s start right with those mathematical scaling laws, because they really are the engine of this whole theory. The paper maps data from over 3,600 cities worldwide. It’s a massive data set, and it reveals something incredible: when a city grows, its physical infrastructure, so things like roads, water pipes, electrical cables, scales sublinearly.

Speaker 2: Meaning it basically scales at a slower rate than the actual population growth.

Speaker 1: Exactly. If the population doubles, you don’t need double the gas lines. You only need about 85% more, because as things get denser, people just naturally share that infrastructure. It’s mathematically more efficient. But at the exact same time, the socioeconomic outputs, wages, patents, even violent crime, unfortunately, scale superlinearly.

Speaker 2: They grow faster,

Speaker 1: Faster than the population, yes. You double the people, you get more than double the wealth and interaction. This continuous, predictable math predicts the evolution of a city’s spatial and social reality.
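[The scaling relations the hosts describe can be sketched numerically. The snippet below is a rough illustration of the power-law form Y = Y0 · N^β commonly cited in urban scaling work, not the paper's actual fits; note that with an exponent of about 0.85, doubling the population multiplies infrastructure by 2^0.85 ≈ 1.80, i.e. roughly the "85% more" figure quoted in the episode.]

```python
# Stylized urban scaling sketch (illustrative, not the paper's fits):
# outputs follow a power law Y = Y0 * N**beta, with beta ≈ 0.85 for
# infrastructure (sublinear) and beta ≈ 1.15 for socioeconomic
# outputs (superlinear).

def scaled_output(y0: float, population: float, beta: float) -> float:
    """Power-law scaling: Y = Y0 * N**beta."""
    return y0 * population ** beta

# Doubling the population (N -> 2N) multiplies Y by 2**beta,
# independent of the starting size N.
infra_factor = 2 ** 0.85   # sublinear: less than doubling
socio_factor = 2 ** 1.15   # superlinear: more than doubling

print(f"infrastructure grows by about {100 * (infra_factor - 1):.0f}%")
print(f"socioeconomic output grows by about {100 * (socio_factor - 1):.0f}%")
```

[With these illustrative exponents, doubling a city yields roughly 80% more infrastructure and roughly 120% more socioeconomic output, matching the sublinear/superlinear split discussed above.]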

Speaker 2: Okay, so the math is undeniably elegant, I’ll give you that. But looking purely at average, global properties really obscures the fragility of these systems. A city is not just a clean equation on a whiteboard.

Speaker 1: Sure, but

Speaker 2: wait, let me just lay this out. The paper itself talks about bounded human effort. There are mental and physical limits to what a person can actually endure in a growing city. There are maximum limits to energy dissipation. For decades, urban planners just loved comparing cities to biological organisms,

Speaker 1: the old metaphors,

Speaker 2: veins as roads, the heart as downtown. And biology teaches us that if an organism scales too large, it collapses under its own metabolic weight. It simply can’t pump the resources fast enough to survive. So if a city is a massive network pumping resources through its arteries, there has to be a natural, biological limit to its growth before the sheer cost of moving that energy around just destroys it.

Speaker 1: Wait, I see why you think that, but let me give you a different perspective. That biological analogy is exactly where our intuition leads us astray here. A city isn’t an animal. It’s a star.

Speaker 2: Like literally.

Speaker 1: Yes, like a star. Think about how a star actually works in physics. Gravity pulls matter together, which creates this intense heat and pressure. A city does the exact same thing with people. The gravitational pull of a dense urban centre crushes people together, and the friction of them bumping into each other creates the fusion of new ideas, wealth and innovation.

Speaker 2: Okay, I follow the metaphor, but

Speaker 1: But just like a star, a city burns through energy much faster to sustain that reaction. Biological organisms evolve to minimize energy dissipation; they want to be as efficient as possible so they don’t starve. Cities do the exact opposite: they actively increase energy loss to maximize social interactions. How can you deny the universality of this reactor model, when empirical measurements of German urban power grids perfectly map onto these superlinear dissipation rates?

Speaker 2: I admit that’s a fascinating distinction. The idea that the waste or the friction of a city is actually its defining feature.

Speaker 1: It is, and the data proves it holding up across totally different cultures.

Speaker 2: It holds up in the abstract, but let’s bring this down from theoretical physics and astrophysics to the actual pavement, right? If cities are just burning energy to create social interaction, there has to be a breaking point where the physical cost of moving around outweighs the benefit of meeting people.

Speaker 1: Sure.

Speaker 2: What happens to a city when it hits that wall? The model hinges on this concept of G*, which is basically the theoretical sweet spot. It’s the optimal balance where a city maximizes its social interactions while keeping its transport costs manageable,

Speaker 1: right? It’s the perfect equilibrium between the benefits of density and the cost of mobility,

Speaker 2: right? But Bettencourt’s own paper points to these glaring outliers where this equilibrium is just a complete fantasy. Look at Riverside, California, or Brownsville, Texas: the math shows their social potential is vastly underdeveloped compared to their actual size. Or take Bridgeport, Connecticut, where escalating mobility costs, literally the sheer friction and traffic of getting around, are actively threatening the city’s stability. These aren’t just minor data blips. These are massive human settlements. I’d argue that these outliers prove that deterministic math fails precisely at the margins where real-world urban planning actually happens.

Speaker 1: I really have to push back there, because those outliers don’t break the model at all. They validate it.

Speaker 2: How does failing to reach potential validate the theory?

Speaker 1: Because the theory isn’t just some rosy picture of infinite growth. Its own equation for net urban output explicitly predicts those exact failure states. The model identifies a mathematical ceiling called Gmax.

Speaker 2: The point of maximum dissipation.

Speaker 1: Exactly. It’s the tipping point where the friction and the cost of moving around finally overcome the social and economic benefits of living there. When a city hits Gmax, it becomes unstable and starts to fracture: commutes get too long, people stop interacting. If this mathematical framework can calculate exactly when a city approaches that ceiling and starts tearing itself apart, doesn’t that prove the universality of the framework rather than its weakness? It’s not ignoring Bridgeport. It’s providing the exact diagnostic tool to explain precisely why Bridgeport is suffering.
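[The tipping-point logic here can be made concrete with a toy model. The functional forms and constants below are invented purely for illustration (they are not Bettencourt's actual equations): superlinear social benefits minus transport costs that grow faster still, so net output peaks and then turns negative at a stand-in for the Gmax ceiling the hosts debate.]

```python
# Toy model of net urban output (NOT Bettencourt's actual equations):
# superlinear social/economic benefits minus transport/dissipation
# costs that rise faster still. The crossover where costs overtake
# benefits is a stand-in for the "Gmax" ceiling. All exponents and
# constants are made up for illustration.

def net_output(n: float, g: float = 1.0, c: float = 0.01) -> float:
    """Net output = superlinear benefits minus faster-growing costs."""
    benefit = g * n ** 1.15   # social/economic gains (superlinear)
    cost = c * n ** 1.5       # mobility costs rising faster still
    return benefit - cost

# Scan populations: net output is positive for smaller cities, then
# turns negative once mobility costs overwhelm interaction benefits
# (the "instability" point in the debate above).
for n in (10 ** k for k in range(3, 8)):
    print(f"N={n:>10,}  net={net_output(n):>15,.0f}")
```

[In this toy parameterisation the sign flips between N = 10^5 and N = 10^6; the point is the qualitative shape (benefits first, collapse past a ceiling), not the specific numbers.]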

Speaker 2: Okay. But let’s look at the foundational assumption required for that diagnostic tool to even work in the first place. The entire framework rests on the idea of a mixing population,

Speaker 1: right? Meaning that anyone in the city can, in principle, interact with anyone else,

Speaker 2: right? But if you’re a listener sitting in a marginalized neighbourhood with terrible public transit, you know firsthand that this just isn’t true. The text explicitly acknowledges that cities contain isolated spatial and social pockets of lower mobility. So assuming an idealized average distance between individuals completely ignores the gritty reality of urban segregation. Human effort is bounded. A resident in an underserved zip code cannot frictionlessly access the economic engine of the city centre. The math assumes an equality of access that physically, geographically does not exist.

Speaker 1: That’s an interesting point though. I would frame it differently. The model doesn’t claim everyone travels everywhere equally, or that the playing field is perfectly level. The mixing population is defined by a very specific entry condition,

Speaker 2: which is

Speaker 1: the minimum resources accessible to a person just have to match the basic cost of reaching anywhere in the city. We know this holds true on average because of behavioural scaling.

Speaker 2: You’re talking about the cell phone data now?

Speaker 1: The cell phone data, yes. But more importantly, the historical wage data: we see invariant scaling of wages across US metropolitan areas spanning 40 years.

Speaker 2: 40 years is a long time. Sure.

Speaker 1: It is. Through recessions, economic booms, the rise of the internet, major shifts in public transit infrastructure, the superlinear scaling of wages remained absolutely constant. Even with those isolated pockets and the segregation you mentioned, the overarching network physics of the city continually balance area, infrastructure and interaction. Doesn’t that 40-year consistency prove that, on average, the mixing population mechanic is functioning exactly as the physics model predicts?

Speaker 2: It proves that the macro-level average holds up statistically. I’ll definitely concede that the mathematical gravity, as you put it, is real. But relying entirely on that macro average for urban policy is incredibly dangerous. We have to acknowledge the boundaries, the edge cases, and the localized limits.

Speaker 1: I agree. We have to acknowledge them. Yes,

Speaker 2: because the equations themselves reveal that unrestricted growth eventually pushes transportation and energy costs past the point of societal benefit. If we just view cities as these perpetual motion machines, as invulnerable stars, we really risk ignoring the internal friction that tears them apart at the neighbourhood level.

Speaker 1: That is the exact brilliance of Bettencourt’s framework. It proves that we can no longer rely on superficial aesthetic analogies. We can’t just look at city streets and call them river basins or veins anymore. The underlying network physics provides a real quantitative roadmap,

Speaker 2: right? It gives planners the precise metrics they actually need to delicately balance density, mobility, and social connectivity, so we don’t accidentally push our cities past those maximum dissipation points. And on that we absolutely converge. Urban planning has to grapple with the strict physics of density and mobility. The math gives us the parameters, no question, but I maintain that the human element, the bounded effort of actual people navigating that physical space, it constantly strains against those mathematical boundaries.

Speaker 1: It really is a profound shift in perspective: viewing our cities not merely as places we happen to live, but as these open-ended social reactors, vast, interconnected networks that actually become denser and more mathematically productive with scale. There’s still so much more to uncover in this framework, especially regarding how global economic hubs operate versus localized urban economies.

Speaker 2: Absolutely. The tension between social interactivity and the sheer physical cost of human settlement is, I think, the defining struggle of modern civilization. We invite you to look at the data, consider these boundaries, and form your own conclusion about the forces shaping the streets you walk every single day.

Speaker 1: So the next time you look down at a sprawling city at night, ask yourself: are you just looking at a chaotic scatter of puzzle pieces, or are you witnessing the precise mathematical fusion of a human star?


[music]

What is The Future For Cities podcast


Episode and transcript generated with Descript assistance (affiliate link).
