Planning for the Future
In August 2020, the Ministry of Housing, Communities and Local Government (MHCLG) published a White Paper called ‘Planning for the Future’, which sets out the most ambitious reforms to the planning system in England since the system was first created in 1948.
It is a matter of almost universal agreement that our planning system is badly in need of an upgrade, and the proposals in the White Paper represent a serious, smart and genuinely radical plan to do it. At the same time, it is also inevitable (and in fact, good) that such proposals will be a subject of vigorous disagreement, and contain massive scope for unintended consequences, because even small changes to the planning system have big impacts on our democracy, our society, our economy and our environment.
The planning system is a hugely complex thing, with many moving parts, many conflicting aims, and many direct and indirect effects on people’s lives and livelihoods; especially on the lives of people who historically are least likely to participate in it.
Planning is also an area beset by myths and misunderstandings on all sides of the political spectrum: especially when it comes to the way that the planning system interacts with our broken land market.
So, I will structure this response around seven key areas of the proposals where I feel most able to contribute something, based on our work.
1. A digital planning system
One of the most obvious characteristics of the planning system is that it was designed before computers or the internet. Its founding operating system was paper: forms, documents, paper maps and drawings. A world where knowledge existed only in the heads of experts, or in books, written by, and readable by, humans only.
So far, we have only transplanted that paper-based system onto computers (eg PDFs), but we haven’t fundamentally changed it.
The result, as the white paper describes very clearly, is a slow, opaque, unpredictable, inconsistent and costly process, where millions of pounds and millions of hours every year are spent writing, reading and re-reading policies and legislation. For example, even a homeowner who wants to make relatively simple changes to their home has to dig through multiple websites, documents and pieces of legislation, most of which they cannot reasonably be expected to know even exist in advance (ask most people on the street whether any ‘Article 4 directions’ apply to their home and they will quite reasonably look baffled).
All this results in huge cost and waste to the private sector and private households, and a prohibitive barrier to economic development activity, but it also represents a huge cost burden on planning authorities. For example, over 50% of all planning applications are invalid (e.g. required information is missing) and need to be resubmitted, sometimes several times over. A vast amount of planning officers’ time is taken up with this kind of repetitive, procedural work, rather than with planning at all.
As the paper says, this makes the planning system intrinsically undemocratic: favouring those with the financial resources to hire experts to engage in long, protracted negotiations, or those with the spare time to participate. For most citizens and small businesses, the cost and risk of participation in the planning process is prohibitive.
What makes the planning system feel even more conspicuously bad is that today many of the other services in our everyday lives are increasingly digital. Almost everything we do today, from booking a holiday to buying insurance, is done online via simple, user-friendly digital services that use automation to improve the experience, lower administrative costs and reduce barriers to access. For many people, that is no longer considered innovative — it is just normal. So it is a serious problem that democratic processes and institutions are being so badly left behind.
Fortunately, organisations such as the Government Digital Service (GDS) have blazed a trail, setting new standards of quality for digital, user-centered government services that run on data, not documents. The challenge we face is to bring the basic principles of the web to planning: to make it faster, more accessible, more democratic, and to allow planners to spend more time planning.
Over the last two years or more, we have been working with a number of local planning authorities, in collaboration with teams at the Connected Places Catapult, MHCLG and elsewhere, to build digital planning services and tools that begin to do this.
1.1 Digital planning services
When we talk about the planning system, the first thing many people think about is the planning process itself: the arduous journey of preparing and submitting a planning application and trying to get planning permission. It is a trail of tears, littered with multiple web pages, fees, drawings, documents, emails, inconsistency and frustrated phone calls.
Digital planning services (such as ‘Apply for Planning Permission’) that make the shift from exchanging documents to exchanging data have the potential to save millions of hours and millions of pounds every year: not just making planning applications easier and faster for applicants to prepare, but also saving labour for local authorities, who today are overburdened with the work of processing applications, managing failure demand and performing repetitive admin tasks such as manual data entry.
The shift from documents to data even allows proposals to be automatically pre-checked against relevant policies and legislation before they are submitted. For example, we have worked with councils to develop a service called ‘Find out if you need planning permission’, allowing most users to find out in just a few minutes whether their proposed project requires planning permission; an activity that previously took hours of research or meetings with planning officers.
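As an illustration of the kind of logic such a service runs on, here is a minimal sketch. The rule, threshold and property attributes below are invented for illustration; the real service encodes a far larger and more nuanced body of rules.

```python
# Minimal sketch of a 'do I need planning permission?' check.
# The rule and threshold below are invented for illustration;
# real permitted development rules are far more extensive.

def needs_planning_permission(project: str, site: dict) -> bool:
    # In protected designations, permitted development rights are
    # commonly restricted, so assume permission is needed.
    if site.get("in_conservation_area") or site.get("listed_building"):
        return True
    # Hypothetical rule: a modest rear extension is permitted development.
    if project == "rear_extension" and site.get("depth_m", 0) <= 4:
        return False
    # Default to requiring permission when no rule says otherwise.
    return True

# Resolved in seconds, rather than hours of research:
needs_planning_permission("rear_extension", {"in_conservation_area": False, "depth_m": 3})
```

The value for the user is not the individual rule, which any planner knows, but that a service can test all the rules at once against a specific address and proposal.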
The basic principles at work here are not new; they underpin most of the digital services we use every day. Digital planning services are different only in the breadth, complexity and continuously changing nature of the data, standards, policies and legislation, both national and local, that underpin them.
One thing we have found is that all this localised content cannot be centrally created or maintained by any one organisation. The right strategy therefore is to empower local authorities to create, control and customise their digital planning services, but to enable them to do so by building common platforms and service patterns that make it simple to standardise, customise and collaborate so they do not duplicate effort unnecessarily.
Continue to put local planning authorities at the forefront of creating, controlling and deploying digital planning services, but make it easy for them to do this by creating common national data registers (e.g. a new national planning register), platforms, patterns and standards.
1.2 Digital planning policies
The white paper sets out an aim to publish policies as machine-readable data, which we strongly support.
Planning policies can be thought of as comprising two kinds of data. First, geospatial data (usually a map boundary, such as a Conservation Area) that describes where that policy applies. Second, semantic data that describes the meaning of that policy (i.e. the planning rule that applies in that area).
The first is best done by publishing that spatial boundary data on a GIS platform with an open API, so anyone can look up which policies apply to a given address. The second is a bit more nuanced. Digital planning services can capture the application of a given policy to a given proposal (e.g. ‘if a house is in a Conservation Area then planning permission is required to add a side extension’), but that is only possible if:
a. That policy has a standardised structure and meaning. Some (e.g. Conservation Areas) do, but many local policies, decisions and orders (e.g. planning conditions attached to a decision) do not, even though they easily could.
b. The meaning of that policy is clear in the first place. A lot of planning policies are surprisingly blurry, for example saying things like ‘visual amenity must be taken into consideration’, which in practice could mean almost anything. The reason planners do this is completely understandable: it leaves scope for nuance and discretion at the development control stage (‘I’ll know it when I see it’). However, the cost is that it creates a huge amount of often unnecessary work, uncertainty and risk. Anecdotally, many planning officers say that the bulk of most planning reports submitted to them essentially consists of long paragraphs of their own policy documents regurgitated back at them (with added stock photographs). They then have to pick through the reports to find and interpret the salient information about a scheme.
In many cases, when you sit down with officers, you discover that it is possible to standardise the meaning of many conditions, orders and policies, and to set much clearer policy rules about the outcomes that would or would not be acceptable (there are caveats, which I will discuss in the ‘Rules-based planning’ section of this response). However, there is one observation worth noting here.
Creating a clearer, more transparent policy rule usually involves adding an additional set of nuanced dependencies or preconditions (‘if this, then that, unless that’). There is an elegant irony here: in order to become more transparent and accessible, policies must often first become more, not less, complex. The good news is that in the age of computers and the web, this kind of complexity is not a problem. Once these dependencies and preconditions are documented as code, policies can be orders of magnitude more detailed and nuanced, and yet still take a fraction of the time to navigate: you can tell a service where you are, what your property is and what you would like to do, and it can test that against an array of policies instantaneously; a process that previously took weeks of picking through documents to find which policies apply.
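To make the ‘if this, then that, unless that’ structure concrete, here is a hedged sketch of one such rule expressed as data, plus a tiny evaluator. The rule content, field names and exception are all invented for illustration, not real policy.

```python
# One invented policy rule with explicit preconditions and an exception
# ('if this, then that, unless that'). Field names are illustrative only.
RULES = [
    {
        "id": "side-extension-in-conservation-area",
        "if": {"project": "side_extension", "in_conservation_area": True},
        "then": "planning permission required",
        "unless": {"behind_building_line": True, "single_storey": True},
    },
]

def matches(conditions, facts):
    """True when every stated condition is satisfied by the facts."""
    return all(facts.get(key) == value for key, value in conditions.items())

def evaluate(facts):
    """Test a proposal's facts against every rule instantaneously."""
    outcomes = []
    for rule in RULES:
        if matches(rule["if"], facts) and not matches(rule.get("unless", {}), facts):
            outcomes.append(rule["then"])
    return outcomes

# A two-storey side extension in a Conservation Area triggers the rule:
print(evaluate({"project": "side_extension", "in_conservation_area": True,
                "behind_building_line": True, "single_storey": False}))
# → ['planning permission required']
```

Adding the ‘unless’ clause makes the rule longer on paper, but because a machine walks the preconditions, the extra nuance costs the user nothing.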
There is a general principle we can take from this.
When we talk about making planning more efficient by ‘cutting red tape’, the solution is not to deregulate (i.e. relax the policies, to the detriment of social, economic and environmental outcomes) but to do the opposite: make them tighter and more nuanced, but also transparent and rules-based, so they can be understood both as human-readable rules and as machine-readable code. Less bureaucracy, more democracy.
1.3 Availability of data
The potential benefits — to everyone — of creating digital planning services are hopefully obvious. However, all this can only happen when the base data is available: digital maps, address data, property data, spatial policies, planning registers, all digitised and openly queryable online via an open API. These data can be thought of as the basic raw fuel of a 21st century digital planning system. If they are inaccessible or locked behind expensive and poorly designed paywalls, all the digital public services I have described above simply will not happen, and neither will the myriad private and civic sector innovations that can be built on top of those services. As the early Government Digital Service (GDS) team captured in their vision of ‘Government as Platform’, shared, trusted, open data registers are the foundation upon which digital government and a digital economy will be built.
Although there have been strong steps towards improving and opening up some of these geospatial datasets, the truth is that many of the datasets needed to support really good digital planning services are still unavailable, inaccessible or proprietary (and expensive). For example, most people would agree that it should be possible for anyone in the UK to enter a property’s address into a service and immediately see a detailed map, overlaid with the property boundary (known as a ‘cadastral parcel’). This would allow a digital planning service to immediately tell you whether any part of your property is, for example, in or adjacent to a Conservation Area. However, those Land Registry parcels are still closed, aside from occasional releases. Generally speaking, open land, property, map and address registers are an area where the UK lags behind many other developed economies.
In some cases (e.g. historical planning conditions), this data is not available because it simply has not been digitised: it has only been scanned in (or in some cases has not been, and is still sitting in filing cabinets). More often, however, the data exists but is held behind badly designed paywalls or locked into legacy software for commercial reasons.
Do whatever it takes to get key common maps and data registers digitised and published with open APIs, and invest in the work of standardising geospatial and planning data in a way that makes it as easy as possible for anyone to access (and in some cases, improve) it. They are the foundations of a digital planning system.
1.4 Transparency, auditability and redress
As we shift towards a more digital planning system, it will be incredibly important to avoid the perils of ‘black box’ algorithms, where the rules and decisions that shape our lives and livelihoods become not less, but more opaque. This is especially a risk wherever machine learning might be used. It is a well-known issue with this method that it makes projections based on patterns derived from historical data, with no basis for justification other than ‘that is what has happened in the past’. So we end up recreating the system as it was, instead of how we want it to be. If we want the digital planning system to be trusted, we need to exclude these kinds of ‘black box’ systems from the outset.
Demand transparency and auditability throughout the ecosystem of digital planning services, platforms and registers, so it is always possible to see what a given output is based on, and trace it back to a particular source (such as an organisation, a planning decision or a piece of policy or legislation).
1.5 Openness and interoperability
One of the reasons we have not already seen more widespread digital planning innovation is not that the required technology (e.g. the web) does not exist, but that local authorities are bound into legacy IT and software applications, which are often outdated or difficult to get data into or out of. Often this is an intentional feature rather than a bug, since it makes it very difficult for customers (local authorities) to switch to an alternative piece of software, or forces them to buy proprietary plug-ins or connectors. This is often referred to as ‘supplier lock-in’. The prevalence of these kinds of de facto monopolies and duopolies in public sector IT is a huge barrier to innovation, competition and public value for money.
So it is very important that, as we shift towards a more data-driven, web-based planning system, we avoid creating a new generation of suppliers with lock-ins or monopolies. The White Paper rightly refers to “a new more modular software landscape”. To achieve this, it is essential that all new data registers, platforms and services meet common standards for interoperability — for example, they must always have a full, flexible API to allow different modules of the landscape to exchange and publish data (within appropriate permissions), and to allow councils to extract their data and switch to a competitor at any time.
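As a sketch of what ‘interoperable by default’ means in practice: every record held in a planning system should be retrievable as plain, standards-shaped data through an open API, so a council can bulk-export everything and switch supplier at any time. The field names and values below are invented placeholders, not an agreed standard.

```python
import json

# One planning application represented as plain JSON rather than a locked
# proprietary record. All values here are invented placeholders.
application = {
    "reference": "21/00042/FUL",
    "received": "2021-03-01",
    "status": "decided",
    "decision": "granted",
    "site": {
        "uprn": "100000000001",  # made-up unique property reference number
        "point": {"lat": 51.501, "lon": -0.142},
    },
    "documents": [
        {"type": "site_plan", "url": "https://example.org/docs/site-plan.pdf"},
    ],
}

# Because it is plain data, export (and re-import elsewhere) is trivial.
export = json.dumps(application, indent=2)
```

The test of a ‘modular software landscape’ is precisely this: whether a council can extract all of its records in a form a competitor's product can ingest.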
The field where this struggle plays out is public IT procurement. In 2018 we, along with collaborators at Connected Places Catapult and contributors from across industry, government and civic organisations created an open procurement checklist for local authorities — which has since been improved upon by many others. In simple terms, our suggestion is that local authorities should not sign a contract with a software supplier unless their product or service passes these 14 health checks.
Encourage, and where necessary enforce, local government procurement standards that ensure that all new digital planning software or services provided by suppliers to local planning authorities are interoperable, auditable and represent good public value.
1.6 Innovation support for local authorities
Local planning authorities are the front line of the planning system. The digital transformation of the planning system means the digital transformation of how local authorities do planning. It is impossible to overstate the difficulty of navigating the transition from legacy IT systems, digitising and standardising data, building platforms, innovating internal processes and recruiting the digital service teams required to build and run 21st century planning services, especially at a time when many authorities already feel overstretched due to the current crisis. And all this before you even get to the challenges of culture change and trust-building that are inevitably part of system change.
All this is to say that the vision of a digital planning system set out in the White Paper is absolutely achievable, but realising it will take time — and sustained innovation funding and support for local planning authorities — boosted by central coordination to build infrastructure, create shared standards, and help LPAs share their learning of what works and what doesn’t. That funding should be conditional on aligning with good standards in design and procurement. The work of the Local Digital and Digital Land teams is an impressive starting point for this work.
Plan for a sustained digital transformation support programme for Local Planning Authorities, lasting at least five years. Although proposals to allow LPAs to use land value capture returns to fund planning departments will hopefully help improve their resourcing, other forms of funding, support and collaboration will be critical to success.
2. Faster plan making
The white paper also proposes to set shorter statutory timetables for the creation and adoption of local plans. This is not an area in which we have done any significant work ourselves, but the same underlying principle applies. While it is clearly absurd for local plans to take seven years to prepare and adopt (it would have been impossible for us to imagine in 2013 the challenges our towns and cities would be facing in 2020), oversimplifying and reducing the scope for community involvement would be a false solution.
Instead, significant exploration and investment should be put into digital data registers, platforms and design patterns (more on those later) that make the plan-making process faster, more accessible, inclusive and better informed.
For example, today, building an evidence base for a local plan involves commissioning consultants to engage in long investigations, the findings of which are buried in long PDF reports. And yet Google Maps can tell me whether there is currently a traffic jam on a minor road on the other side of the world. We need to start building (or inviting non-profit organisations to build) open, live evidence bases based on open standards: for example, maps of population, deprivation, air pollution, wildlife activity, noise, health outcomes, and so on. The potential of citizen science and open source sensor networks to contribute to these datasets should not be ignored. Indeed, the metrics, standards and provenance of the data that those registers use will be (and should be) contested, in the same way that today the findings of an Environmental Impact Assessment are argued over and used to advance one interest or another. Therefore the neutrality of those data platforms, and the trustworthiness (and open contestability) of the data on them, is crucial. The work of ODI Leeds and Centric Lab is especially worth pointing to here.
If we can begin to build these kinds of shared, live evidence bases, we can begin to imagine a world where plan-making is an almost continuous process, where planners and citizens can immediately see what the likely impact of a given alteration to the plan is likely to be. The only part that should take time is the democratic deliberation over what the long-term vision for a place should be: the kind of places we — by coordinating development over many years — want to create.
This, we would suggest, is a good vision for the proposed single test of ‘sustainable development’: a more, rather than less, sophisticated way of assessing new developments against their likely impacts.
Work with non-profit organisations and others to explore building common data registers, platforms and ‘city models’ that will provide a continuous, common evidence-base to inform local plans and assess the likely impact of development proposals.
3. A rules-based planning system
One of the great paradoxes of the UK planning system today is that the activity that takes up most of the time and effort is not actually planning at all, but development control.
The original principle of town planning is to coordinate development ahead of time, based on what is in the common strategic interest — to set a framework against which new development is tested. And yet somehow that process of testing individual proposals against the framework has dwarfed the original activity of creating the framework in the first place. In this ‘discretionary’ system, what ends up getting developed often bears little resemblance (in design or outcomes) to the original plan.
This reactive approach to planning also creates a highly adversarial system, where unless you are a property owner, in all likelihood your first interaction with the planning system is when you see a laminated piece of paper tied to the lamppost outside your house, like a threat: ‘This is about to be done to you.’ Little surprise, then, that for most people, their main experience of the planning system is essentially protesting against development. By that time, the developer has already invested so much money into the process that the cost of change is too high: they have already paid for the land and for the designs. Communities instinctively understand this, and recognise that the consultation process is unlikely to fundamentally change the basic economics of the scheme; so it is simply a battle to get it stopped.
The uncertainty and unpredictability of this contested, negotiated process is bad for the market (because it creates huge risk and up-front cost, shutting out small players), bad for democracy (because it erodes trust and creates a sense of angry powerlessness) and bad for planning outcomes, because it encourages speculative developers to negotiate down quality, because they can, and therefore must.
Some have argued that the shift towards a more rules-based planning system described in the White Paper represents an undermining of local democracy, but I would argue that the opposite should be true.
Rules-based planning is about moving democracy from its current reactive position, downstream of the market, to a proactive position, upstream of the market.
This is good for democracy, because it gives it genuine power; it is good for the market (especially SMEs and self-builders), because it gives predictability. It is also good for development outcomes, because it creates a ‘take it or leave it’ development requirement which is factored into the original land price. Far from representing deregulation, the rules can be far tighter and clearer about what kind of development would and would not be acceptable, if we want them to be.
However, this only works if the ‘rules’ have genuine power (that is to say, developers cannot rely on getting them easily overruled on appeal), and planners and local communities have the ability to participate meaningfully in deciding what the rules that shape development in their neighbourhood should be. In the words of the White Paper, we need to “radically and profoundly [re-invent] engagement with local communities so that more democracy takes place effectively at the plan-making stage”. This means resisting the temptation to centralise too much, but it must also factor in the needs of those who live nearby, and of those who do not yet live in a neighbourhood but will in future.
Policies and plans should, as far as possible, take a federated approach from the national to the local to the neighbourhood. Objectives and parameters should be set nationally, along with default codes. These default codes can be overridden or customised by local authorities and in turn by local communities in the form of a more specific neighbourhood plan and codes if they wish to, but the overall objectives and parameters are non-negotiable. This will create a genuine sense of community power over development upstream of the market.
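A minimal sketch of how such a federated code could work as data: national defaults merged with local and then neighbourhood overrides, with non-negotiable national parameters protected. All parameter names and values here are invented for illustration.

```python
# Federated planning codes: national defaults, overridden in turn by
# local and neighbourhood layers. Parameter names/values are invented.

NON_NEGOTIABLE = {"min_energy_standard"}  # national objectives that cannot be relaxed

national = {"min_energy_standard": "A", "max_storeys": 6, "roof_form": "any"}
local = {"max_storeys": 4}                # the council tightens a default
neighbourhood = {"roof_form": "pitched"}  # the community customises further

def merge(base, override):
    """Apply an override layer, skipping protected national parameters."""
    merged = dict(base)
    for key, value in override.items():
        if key in NON_NEGOTIABLE:
            continue  # non-negotiable parameters pass through unchanged
        merged[key] = value
    return merged

effective = merge(merge(national, local), neighbourhood)
print(effective)
# → {'min_energy_standard': 'A', 'max_storeys': 4, 'roof_form': 'pitched'}
```

The design choice worth noting is that the override chain runs in one direction only: each layer can add specificity, but the nationally set objectives survive every merge.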
Explore and test a range of online and in-person forums and infrastructures with the objective of maximising participation in local plan-making and making it much easier for communities to see, create, debate, test and agree neighbourhood plans.
An equally important challenge to overcome in shifting towards more rules-based planning approaches will be encouraging planners to use them.
Although there have been a number of examples where Local Development Orders and local design codes have proven to be very successful, especially in enabling development by small businesses and self-builders, through our work on digitising policies we have found that some policy officers are still hesitant to use them, at least initially.
Two things we have found:
a. Planners and policy officers feel immediately more comfortable with a less vague, more rules-based approach to policy when they realise that they, as a planning authority, are in control of what those rules are, rather than having them imposed upon them in a way that feels like a loss of control.
b. It is usually quite easy to find rules that apply to most cases but much harder to agree rules that apply in all cases. Therefore plans and legislation may contain provision for ‘edge cases’. In other words, instead of seeing rules-based planning as a ‘green light / red light’ system, give officers the opportunity to set rules for ‘amber’ scenarios, where discretion or consultation may be needed on a specific bounded issue (not unlike prior approval).
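The ‘amber’ idea in point (b) above can be sketched as a rule with three outcomes rather than two, where amber flags one bounded issue for officer discretion or consultation. The thresholds below are invented for illustration, not drawn from any real code.

```python
# A three-outcome ('traffic light') rule sketch. Green and red are decided
# automatically; amber refers a single bounded issue for discretion.
# The height thresholds are invented for illustration.

def classify_height(height_m: float) -> str:
    if height_m <= 9:
        return "green: permitted under the code"
    if height_m <= 12:
        # Discretion is confined to this one issue, not the whole scheme.
        return "amber: height requires discretionary review"
    return "red: outside the code; a full application is required"
```

The point is that ‘amber’ narrows discretion to a named issue rather than reopening the entire application, much as prior approval does today.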
4. Design Codes
Another fear that is often raised around rules-based planning is that it is automatically associated with the blunt 20th century instrument of ‘zoning’, used in some countries, which often results in poor quality, monocultural neighbourhoods.
This does not have to be the case. As discussed in section 1 of this response, the opportunity of digital technology and the web is that we can now create machine-readable ‘rules’ or codes that are significantly more nuanced and detailed than could ever fit into a document, and still be much simpler to use. If individual property owners want to go outside these rules, they should still be able to apply for planning permission in the usual way, but their deviation from the local codes should be compared against the default: that is to say, what they propose must be self-evidently equivalent or better in social and environmental terms, and in the eyes of the community, not worse.
The question is, how can we make these design codes easy to create, as well as to use, and conducive to good social, environmental and economic outcomes?
One solution lies in the use of ‘patterns’. A pattern can be a common design pattern (such as ‘row of townhouses’) or it can be a rule of thumb (eg. a street height to width ratio, or a maximum building size) that encourages a particular outcome (eg. gentle density).
The use of these kinds of patterns is well tested and understood, for example in what are referred to as ‘form-based codes’, and they are a way of setting rules that encourage more beautiful, human-friendly places.
However, in the 21st century we should not only be interested in form-based codes, but also in performance-based codes. That is to say, we are not interested only in beauty that is skin deep; we are interested in how those places then perform socially, environmentally and economically. After all, one of the original reasons the planning system was created was, essentially, preventative health.
This means we need to collect and document design patterns not just in terms of their spatial beauty and popularity, but also map those patterns to any available data we can collect about the performance and impact of that pattern. For example, there is plenty of peer-reviewed science showing that patients in hospitals with a view of trees from their beds have measurably faster recovery times. Similarly, we know that providing protected cycle lanes decreases car use and improves air quality. That performance data can be documented against the pattern, and used as part of an off-the-peg evidence base.
We should develop a live, national library of planning patterns, clearly documented in a variety of human-readable (e.g. text, diagrams) and machine-readable (e.g. 3D parametric models, JSON) formats, and mapped against any available performance data. Planners and local communities can use these patterns almost as building blocks to rapidly explore, test and create local and neighbourhood plans, backed by an evidence base.
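A hedged sketch of what one machine-readable entry in such a library might look like. Every name, value and evidence reference below is an invented placeholder, intended only to show the shape: form parameters plus mapped performance evidence.

```python
import json

# One illustrative pattern library entry. All values are placeholders.
pattern = {
    "id": "row-of-townhouses",
    "name": "Row of townhouses",
    "formats": ["text", "diagram", "3d-parametric-model", "json"],
    "form": {
        "storeys": {"min": 2, "max": 4},
        "street_height_to_width_ratio": {"min": 0.5, "max": 1.2},
    },
    "performance_evidence": [
        {
            "metric": "dwellings_per_hectare",
            "observed_range": [40, 80],
            "source": "placeholder-study-reference",
        },
    ],
}

# Machine-readable, so entries can be published, versioned and queried.
print(json.dumps(pattern["form"]["storeys"]))  # → {"min": 2, "max": 4}
```

Because each entry is plain data, an open-source library of them can accept suggested improvements and new evidence in the same way a code repository accepts contributions.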
One criticism that might be brought against the proposals set out in the White Paper is that the government is trying to impose one particular definition of beauty or good design.
The national planning pattern library should be open source, such that anyone can suggest an improvement to a pattern, submit new evidence against a given pattern, or propose a new pattern to be added to the library. If a submitted pattern is sufficiently promising and popular it can be added to the library. Local planning authority design teams and local communities should also be encouraged to create/customise, document and share their own patterns that suit the local character of their area. This way, the national planning pattern library becomes an evolving social and cultural commons, like the English language itself.
For reasons I will explore in more detail in part 5 of this response, design codes should generally avoid prescribing specific materials for buildings, and instead focus mostly on form and performance. For one thing, it is perfectly possible to build ugly, inappropriate development using ‘local materials’, whereas patterns that are beautiful and human-friendly in scale, form and performance can be built using a wide array of materials.
5. Land value capture
Another of the fundamental original principles of planning is that the planning system should capture some (or all) of the land value uplift that is created by the public and the community when we install infrastructure and give our collective consent for development. The moment a piece of land is given outline planning permission for homes, its value skyrockets from (in theory) a few thousand pounds per hectare to, in some cases, many millions. That value is created by government and the community, not by the landowner. As Winston Churchill said, all the landowner has to do is “sit still”. Alongside ensuring that development is beautiful, sustainable and in the public interest, arguably the most important function of the planning system is to capture this publicly-created value and use it to pay for community infrastructure and goods, or to lock-in more affordable forms of land tenure. If the planning system fails to do this, the taxpayer has to pick up these costs (for example the cost of infrastructure projects or housing benefits) elsewhere through their tax bill. So aside from ensuring good development, getting land value capture right is an opportunity to save the taxpayer many billions of pounds every year.
No one should lament the end of Section 106. It was always a corrupting mechanism that left local communities begging for crumbs under their own table. It was not even necessarily good for developers. Because developers could negotiate down the amount of community infrastructure or affordable housing they provided, they arguably had a fiduciary responsibility to do so. And in an environment where developers hold all the cards (i.e. land), and local authorities are under pressure to deliver numbers (and are less well-resourced than their private sector counterparts), the developers have invariably been able to do so. It also created a perverse incentive for local authorities to give consent to poor quality development because of the scale of S106 contributions on offer. And it created a situation where large developers had more leverage than small ones, and therefore a significant market advantage when it came to bidding on land.
This touches again upon an area which is widely misunderstood: the effect of the planning system on land values. Land value is based on ‘residual value’: you start with the end value of a property in that location (how much someone is able to pay for it), subtract the cost of development (and a profit margin, if you are a for-profit developer), and the remaining amount is how much you have left to pay for the land. Because developers could negotiate down quality and S106 contributions, any land bidding war will always be won by the developer most confident that they can squeeze down design quality and S106 contributions. Having paid the higher land price, the developer can then claim that it is no longer ‘viable’ to provide better development or more affordable homes.
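The residual valuation described above is simple arithmetic, and can be sketched in a few lines. This is an illustrative sketch only: the function name and the figures are our assumptions, not from the white paper.

```python
def residual_land_value(end_value, build_cost, profit_margin_rate=0.15):
    """Residual valuation: whatever is left over for the land once
    construction costs and the developer's profit margin are
    subtracted from the end (sale) value of the property."""
    profit = end_value * profit_margin_rate
    return end_value - build_cost - profit

# Illustrative figures: a home selling for £400,000 that costs
# £150,000 to build, with a 15% developer margin (£60,000),
# leaves £190,000 available to bid for the land.
print(residual_land_value(400_000, 150_000))  # 190000.0
```

Note how every pound a developer expects to save on quality or S106 contributions flows straight through this calculation into a higher land bid.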
Shifting towards a fixed, transparent sum (and deferring payment to completion, when new properties are refinanced) is an inherently sensible thing to do. It is good for the market (especially small players) because it takes away risk and effort, and it sets clear expectations that will be reflected in the land value. It is also good for democracy because it removes developers’ fiduciary duty to try to get a better deal. In short, it will save everyone a lot of money, time and hassle. That is all energy that can instead go into better development.
However, a key question then becomes: at what level should the new Single Infrastructure Levy be set? Set it too low, and the community and taxpayers will effectively be gifting billions to landowners overnight, for nothing, because land values will rise by more than the levy. As we have seen in other countries that use zoning-style approaches, this can also create a corruption problem.
With this in mind, arguably the single most important words in the white paper are those setting the intention for the new Levy to raise “more” than the existing arrangements and deliver “at least as much” affordable tenure housing. Although landowners and their agents will undoubtedly argue that this figure is too high (because it is their job to do so), it is absolutely right that it should be higher, since the new rules-based system is taking away a huge part of the risk and cost of development (referred to as ‘planning risk’). There will, no doubt, be friction in the transition, but this is a commitment that should be stuck to, and robustly defended.
First, we should remember that 100% of land value is created by the community. Historically, S106 and CIL have only managed to recapture around 27% of the publicly-created land value uplift, compared to around 90% in the Netherlands, which benefits from higher quality development and better infrastructure as a result.
Second, while some owners of land may complain that their land value may fall as a result, we should remember that that high valuation is a function only of ‘hope value speculation’ or a presumption of poor quality development — anyone who paid it knew the risk they were taking. The community have no obligation to reward or underwrite their speculations. The new system arguably should result in slightly lower land values in some places.
Equally, if speculative developers claim that the new levy rate (combined with high quality design codes) would make development ‘unviable’ (by which they mean ‘unprofitable’) and therefore they cannot develop the land, it should be remembered that they always have the option of selling the land (probably for little or no loss) to smaller developers, non-profit developers or self-builders who can develop to that quality.
The Single Infrastructure Levy should be set higher than the previous S106 + CIL contributions.
However, there are some caveats to flag here that should influence the detailed design of the levy.
4.1 Land value vs building value
The paper suggests that the levy would be based on a fixed percentage of the end property value. There is a danger here that this would disproportionately tax development in areas where land values are lower. Compare a property in one part of the country that is worth, say, £400,000 with an identical one in a different part of the country that is worth £200,000: the difference is not in the construction value of the house (which is, let’s say, £150,000 in both cases), but in the value of the land (location).
So if we were to apply a levy of, say, 25% of the property value to both these projects, the property in the high value location would pay £100,000. If construction costs £150,000, that leaves £150,000 for land / profit margin. Plenty.
The property in the lower value location would pay £50,000. After subtracting the construction cost of £150,000, that leaves a land value / margin of £0. Any less, and the land would even have a negative value.
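The two worked examples can be written out explicitly. This is a sketch only (the function name is our assumption); the figures follow the examples above, with the low-value case's £50,000 levy implying an end value of £200,000.

```python
def residual_after_levy(end_value, build_cost, levy_rate):
    """What remains for land and profit margin once a levy charged
    as a fixed percentage of the end property value, plus the
    construction cost, are subtracted from that end value."""
    levy = end_value * levy_rate
    return end_value - levy - build_cost

# High-value location: £400,000 home, £150,000 build cost, 25% levy.
print(residual_after_levy(400_000, 150_000, 0.25))  # 150000.0

# Low-value location: £200,000 home, same build cost, same levy.
print(residual_after_levy(200_000, 150_000, 0.25))  # 0.0
```

The same flat percentage leaves a comfortable £150,000 residual in the first case and nothing at all in the second, which is the distortion the text describes.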
The aim of the infrastructure levy should be to tax uplifts in the land value, not to tax or prohibit good quality construction or land remediation.
The levy should be designed so that it is only paid on value uplifts beyond the typical cost of (good quality) construction, so that it does not prohibit development on low- or zero-value sites.
4.2 Making allowance for more affordable tenures
Another flaw in Section 106 was the way that it effectively locked planning authorities into a Faustian pact with the land speculation market and landlords — even though they were not delivering the best place outcomes. Councils became dependent on private investors to pay for affordable homes and community infrastructure. In a future system where planning authorities rely on Single Infrastructure Levy payments even to fund their own planning teams, there is a similar risk.
This could have the unintended side-effect of crowding-out social landlords or community organisations (such as Community Land Trusts) who intend to rent or sell properties at affordable tenures below normal market value.
The levy should make allowances / discounts for non-profit organisations who are creating / providing genuinely affordable tenure homes.
As with the current CIL, this may include measures for self-builders (who do not intend to sell or rent their homes). For example, levy payments might even be deferred to the point when the property is sold.
4.3 Infrastructure-first development
The white paper proposes that local authorities should be allowed to borrow ahead of time to pay for the up-front cost of infrastructure. It also proposes that on-site provision of affordable homes will still be a feature of the new levy.
There is reasonable doubt that this will deliver the number or quality of affordable tenure homes, or the truly ambitious neighbourhood infrastructure, required to build whole new zero-carbon neighbourhoods. Whilst the mechanism is welcome, it should probably not be relied upon entirely for developments beyond a certain scale.
Alongside the changes set out in this paper, the government should legislate to update the Land Compensation Act 1961 to allow local authorities, in the case of large strategic developments at least, to purchase areas of land at current use value before classifying it as a ‘Growth Area’. This would allow local authorities to capture a much greater percentage of the land value upfront and to use that money to install world-class community infrastructure before selling or leasing plots to a diverse array of families, groups, organisations and developers, under a variety of tenures, to build out within design codes.
Simply put, this is the approach most likely to deliver truly beautiful, sustainable places: what Nicholas Boys Smith calls ‘the Conservation Areas of the future’.
5. Modern Methods of Construction
In several places, the white paper mentions encouraging the use of Modern Methods of Construction (MMC), although it is not specific as to exactly how this will be done.
As an R&D lab that has dedicated years of work to digital innovation in construction, it might sound strange for us to say this, but we would suggest avoiding the kinds of measures that have been used in the UAE, whereby use of a certain percentage of MMC is mandated in a development. This kind of approach mostly serves to create perverse incentives, weird interpretations and odd market distortions.
As a general rule, the planning system should avoid legislating on how buildings are constructed. Rather, it should set the outcomes, and allow people and businesses to innovate.
In truth, the best things the planning system can do to promote the use of MMC are:
5.1 Maximise predictability
This is what the rules and design code approaches set out in the White Paper will tend to do, and it will allow companies to develop products and solutions to deliver on these patterns.
5.2 Diversify development
Wherever possible, break larger sites into smaller projects, diversifying development (as recommended in the Letwin Review) to create more continuous, steady development activity. This sustains demand for local and national MMC SMEs.
5.3 Set high performance requirements
For example, requiring that new developments should be zero carbon in both construction and use. Increasingly the only way of doing this is to build using modern methods.
5.4 Do not allow design codes to prescribe materials, except possibly in conservation areas
For example, bricks look beautiful, but they are massively skill- and carbon-intensive. Instead of mandating the use of bricks, design codes might set a range of materials, or prescribe that facades must have the same kind of small-scale texture that brick affords, and fit a palette of materials, shades and tones.
5.5 Update the General Permitted Development Order 2015 to revise the rules on materials for extensions
At present the GPDO requires that all permitted extensions be built using materials that match the existing building (more often than not, that means bricks). There is now a well-established practice of building beautiful new extensions onto old buildings that use distinct but complementary materials such as timber. A design code for these kinds of development would open up a market for small scale projects using MMC.
6. Land supply and demand
In a number of places, the White Paper refers to “increas(ing) the supply of land available for new homes where it is needed to address affordability”, or associating the undersupply of homes with unaffordability.
This is where we encounter two of the big myths where planning and land supply are concerned.
First, although suddenly adding a large volume of new homes to a specific area may temporarily drop local prices (known as ‘exceeding the absorption rate’), local prices will soon return to normal afterwards. In the long run, increasing supply has only a very small downward effect on property prices. In 2018, MHCLG’s own analysis found that a 1% increase in housing stock exerts a downward pressure of only around 2% on house prices. In other words, undersupply was only a very minor factor in the wildly inflating prices we have seen in recent decades.
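To see how weak that effect is, the MHCLG estimate can be extrapolated, crudely and on the assumption that the relationship stays roughly linear at larger scales. The function name and the 10% scenario are ours, for illustration only.

```python
def price_effect(stock_increase_pct, elasticity=-2.0):
    """Straight-line extrapolation of MHCLG's 2018 estimate: each 1%
    added to the housing stock exerts roughly 2% downward pressure
    on prices. A crude linear assumption, for illustration only."""
    return stock_increase_pct * elasticity

print(price_effect(1))   # the MHCLG figure itself: about -2%
print(price_effect(10))  # even a vast 10% stock increase: about -20%
```

On these numbers, even a decade's worth of extraordinary building effort would claw back only a fraction of the price inflation of recent decades.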
Second, even increasing the supply of land with planning permission will not necessarily lead to an increase in the supply of homes if that land is owned by speculative developers because, for a number of reasons, those companies have a strong incentive to build out at a steady pace and trickle new properties onto the market, holding years of land supply in reserve. The 2018 Letwin Review explained this very articulately.
Although the changes set out in the white paper should have a hugely positive effect in eliminating much of the lose-lose brinkmanship involved in negotiated planning permissions, they will not alter these two fundamental truths.
In truth, the value of a property is primarily a function not of supply / volume of stock, but of the value of living at that location. That is influenced first by how much people can afford to pay / borrow (or how much a landlord or land speculator thinks people will be able to pay / borrow in future), and second by the relative desirability of that location versus other alternative locations, in terms of jobs, schools, infrastructure, connections, beauty, cool-factor and so on.
Indeed, as counterintuitive as it may seem, sometimes adding supply can have the opposite effect. If you develop an area with lots of beautiful streets, big houses, amazing schools, restaurants, lots of economic infrastructure and so on, it will become cool, and therefore more, not less, desirable (this is called the ‘agglomeration’ effect). We are, basically, a social species. If you think about it, there is no longer really any geographic reason for London to be where it is. People only want to be in London because other people want to be in London.
This recognition that just building more homes will not reduce prices is what John Stuart Mill might have referred to as a ‘Pons Asinorum’, but it has significant implications for how we need to think and talk about land supply allocations.
It would be comforting to treat land allocations as a rational, mathematical exercise: simply a problem of allocating land “where it is needed”. And certainly, it would be strange not to designate additional land for development in areas that are growing and prospering. However, if we only do this, all that will happen is that hot areas will get even more overheated, and cool areas will get cooler: the opposite of levelling up.
The idea that new land allocations should not simply be left up to local authorities and communities is still reasonably sound. In our current system, there is a strong possibility that NIMBYism would politically overrule the rewards of social and economic development, exacerbating the problem of undersupply even further. However, there is no cold science that can tell us what those allocations should be, and the justification for those allocations cannot be that house prices might fall as a result. Rather, it is a bigger, collective conversation about placemaking: creating new social and economic infrastructure to support growth and wellbeing, and ensuring that the places we do build remain affordable to as many people as possible.
In reality, the real question is not ‘how many housing units can we create?’, but ‘what kind of places do we want to create, and with what mix of tenures?’. And it should be focused more on the economic prosperity, social inclusiveness and, yes, beauty of the places we create, which will shape people’s lives for the next hundred years.
To be clear, this is not a choice between quantity and quality. Through measures such as a Right to Replace, it would be entirely possible to add half a million beautiful, zero carbon homes to London’s suburbs alone if we decide we want to, and the city would be improved for it. But treating land allocation as a problem of supply and demand, a problem of numbers alone, is a false lens; one that will only lead to poor quality development in areas with high land values.
In the long run, the way to level the stark differences between the South East and the rest of England is not to bombard the South East (and the railway network) with poor quality, overcrowded housing units in the hope that it will bring down prices (it won’t, except inasmuch as the quality of those places may be so bad that families will want to move out) but rather to focus on creating new location value in places across Britain, using bold, green, infrastructure-first development to create beautiful, successful places where people want to move, stay and set up businesses.
That is where planning started, and it is where these reforms can take it next.
Alastair Parvin is the co-founder and CEO of Open Systems Lab, a non-profit R&D company working on digital and systems innovation for the built environment
A digital version of this response is available to view and comment here