Smart Digital Urbanism: creating the conditions for equitably distributed opportunity in the digital age

(The sound artists FA-TECH [http://fa-tech.tumblr.com/] improvising in Shoreditch, London. Shoreditch’s combination of urban character, cheap rents and proximity to London’s business, financial centres and culture led to the emergence of a thriving technology startup community – although that community’s success is now driving rents up, challenging some of the characteristics that enabled it.)

(I first learned of the architect Kelvin Campbell’s concept of “massive/small” just over two years ago – the idea that certain characteristics of policy and the physical environment in cities could encourage “massive amounts of small-scale innovation” to occur. Kelvin recently launched a collaborative campaign to capture ideas, tools and tactics for massive/small “Smart Urbanism”. This is my first contribution to that campaign.)

Over the past 5 years, enormous interest has developed in the potential for digital technologies to contribute to the construction and development of cities, and to the operation of the services and infrastructures that support them. These ideas are often referred to as “Smart Cities” or “Future Cities”.

Indeed, as the price of digital technologies such as smartphones, sensors, analytics, open source software and cloud platforms reduces rapidly, market dynamics will drive their aggressive adoption to make construction, infrastructure and city services more efficient, and hence make their providers more competitive.

But those market dynamics do not guarantee that we will get everything we want for the future of our cities: efficiency and resilience are not the same as health, happiness and opportunity for every citizen.

Is it realistic to believe that we can achieve those objectives? It has to be.

Many of us believe in that possibility, and spend a lot of our efforts finding ways to achieve it. And over the same timeframe that interest in “smart” and “future” cities has emerged, a belief has developed around the world that the governance institutions of cities – local authorities and elected mayors, rather than the governments of nations – are the most likely political entities to implement the policies that lead to a sustainable, resilient future with more equitably distributed economic growth.

Consequently, many Mayors and City Councils are considering or implementing legislation and policy frameworks that change the economic and financial context in which construction, infrastructure and city services are deployed and operated. The British Standards Institution recently published guidance on this topic as part of its overall Smart Cities Standards programme.

But whilst in principle these trends and ideas are incredibly exciting in their potential to create better cities, communities, places and lives in the future, in practice many debates about applying them falter on a destructive and misleading argument between “top-down” and “bottom-up” approaches – the same chasm that Smart Urbanism seeks to bridge in the physical world.

Policies and programmes driven by central government organisations or implemented by technology and infrastructure corporations that drive digital technology into large-scale infrastructures and public services are often criticised as crude, “top-down” initiatives that prioritise resilience and efficiency at the expense of the concerns and values of ordinary people, businesses and communities. However, the organic, “bottom-up” innovation that critics of these initiatives champion as the better, alternative approach is ineffective at creating equality.

(“Lives on the Line” by James Cheshire at UCL’s Centre for Advanced Spatial Analysis, showing the variation in life expectancy and correlation to child poverty in London. From Cheshire, J. 2012. Lives on the Line: Mapping Life Expectancy Along the London Tube Network. Environment and Planning A. 44 (7). Doi: 10.1068/a45341)

“Bottom-up innovation” is what every person, community and business does every day: using our innate creativity to find ways to use the resources and opportunities available to us to make a better life.

But the degree to which we fail to distribute those resources and opportunities equally is illustrated by the stark variation in life expectancy between the richest and poorest areas of cities in the UK: often this variation is as much as 20 years within a single city.

The “design pattern” – a tool invented in the 1970s by the town planner Christopher Alexander – is probably the single most influential concept behind the development of the digital technology we all use today. In the same way, two recent movements in town planning and urban design – “human scale cities” and “smart urbanism” – offer the analogies that can connect “top-down” technology policies and infrastructure with the factors that determine the success of “bottom-up” creativity, creating “massive/small” success: future, digital cities that generate “massive amounts of small-scale innovation”.

The tools to achieve this are relatively cheap, and the right policy environment could make it fairly straightforward to augment the business case for efficient, resilient “smart city” infrastructures to ensure that they are deployed. They are the digital equivalents of the physical concepts of Smart Urbanism – the use of open grid structures for spatial layouts, and the provision of basic infrastructure components such as street layouts and party walls in areas expected to attract high growth in informal housing. Some will be delivered as a natural consequence of market forces driving technology adoption; but others will only become economically viable when local or national government policies shape the market by requiring them:

  • Broadband, wi-fi and 3G/4G connectivity should be broadly available so that everyone can participate in the digital economy.
  • The data from city services should be made available as Open Data and published through “Application Programming Interfaces” (APIs) so that everybody knows how they work; and can adapt them to their own individual needs.
  • The data and APIs should be made available in the form of Open Standards so that everybody can understand them; and so that the systems that we rely on can work together.
  • The data and APIs should be available to developers working on Cloud Computing platforms with Open Source software so that anyone with a great idea for a new service to offer to people or businesses can get started for free.
  • The technology systems that support the services and infrastructures we rely on should be based on Open Architectures, so that we have the freedom to choose which technologies we use, and to change our minds.
  • Governments, institutions, businesses and communities should participate in an open dialogue about the places we live and work in, informed by open data, enabled by social media and smartphones, and enlightened by empathy.
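These principles are easiest to appreciate in code. The sketch below is purely illustrative – the feed format and field names are invented, not taken from any real city API – but it shows how open, machine-readable data lets anyone adapt a public service to their own needs; here, filtering a transport feed for wheelchair-accessible stops:

```python
import json

# Hypothetical open-data payload - the structure and field names are
# invented for illustration, not taken from any real city API.
payload = json.dumps({
    "stops": [
        {"name": "High Street",   "wheelchair_accessible": True,  "next_bus_mins": 4},
        {"name": "Market Square", "wheelchair_accessible": False, "next_bus_mins": 11},
        {"name": "Station Road",  "wheelchair_accessible": True,  "next_bus_mins": 2},
    ]
})

def accessible_stops(raw_json):
    """Filter a transport feed to wheelchair-accessible stops, sorted by
    waiting time - adapting a city service to one person's needs."""
    stops = json.loads(raw_json)["stops"]
    usable = [s for s in stops if s["wheelchair_accessible"]]
    return sorted(usable, key=lambda s: s["next_bus_mins"])

for stop in accessible_stops(payload):
    print(stop["name"], "-", stop["next_bus_mins"], "mins")
```

With genuinely open data, the payload would arrive over HTTP from a published, documented API rather than being embedded in the script – which is precisely why open standards and APIs matter.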

(Casserole Club, a social enterprise developed by FutureGov, uses social media to connect people who have difficulty cooking for themselves with others who are happy to cook an extra portion for a neighbour; a great example of a locally-focused “sharing economy” business model which creates financially sustainable social value.)

These principles would encourage good “digital placemaking”: they would help to align the investments that will be made in improving cities using technology with the needs and motivations of the public sector, the private sector, communities and businesses. They would create “Smart Digital Urbanism”: the conditions and environment in which vibrant, fair digital cities grow from the successful innovations of their citizens, communities and businesses in the information economy.

In my new role at Amey, a vast organisation in the UK that delivers public services and operates and supports public infrastructure, I’m leading a set of innovative projects with our customers and technology partners to explore these ideas and to understand how we can collaboratively create economic, social and environmental value for ourselves; for our customers; and for the people, communities and businesses who live in the areas our services support.

It’s a terrifically exciting role; and I’ll soon be hiring a small team of passionate, creative people to help me identify, shape and deliver those projects. I’ll post an update here with details of the skills, experience and characteristics I’m looking for. I hope some of you will find them attractive and get in touch.

11 reasons computers can’t understand or solve our problems without human judgement

(Photo by Matt Gidley)

Why data is uncertain, cities are not programmable, and the world is not “algorithmic”.

Many people are not convinced that the Smart Cities movement will result in the use of technology to make places, communities and businesses in cities better. Outside their consumer enjoyment of smartphones, social media and online entertainment – to the degree that they have access to them – they don’t believe that technology or the companies that sell it will improve their lives.

The technology industry itself contributes significantly to this lack of trust. Too often we overstate the benefits of technology, or play down its limitations and the challenges involved in using it well.

Most recently, the idea that traditional processes of government should be replaced by “algorithmic regulation” – the comparison of the outcomes of public systems to desired objectives through the measurement of data, and the automatic adjustment of those systems by algorithms in order to achieve them – has been proposed by Tim O’Reilly and other prominent technologists.

These approaches work in many mechanical and engineering systems – the autopilots that fly planes or the anti-lock braking systems that we rely on to stop our cars. But should we extend them into human realms – how we educate our children or how we rehabilitate convicted criminals?

It’s clearly important to ask whether it would be desirable for our society to adopt such approaches. That is a complex debate, but my personal view is that in most cases the incredible technologies available to us today – and which I write about frequently on this blog – should not be used to take automatic decisions about such issues. They are usually more valuable when they are used to improve the information and insight available to human decision-makers – whether they are politicians, public workers or individual citizens – who are then in a better position to exercise good judgement.

More fundamentally, though, I want to challenge whether “algorithmic regulation” or any other highly deterministic approach to human issues is even possible. Quite simply, it is not.

It is true that our ability to collect, analyse and interpret data about the world has advanced to an astonishing degree in recent years. However, that ability is far from perfect, and strongly established scientific and philosophical principles tell us that it is impossible to definitively measure human outcomes from underlying data in physical or computing systems; and that it is impossible to create algorithmic rules that exactly predict them.

Sometimes automated systems succeed despite these limitations – anti-lock braking technology has become nearly ubiquitous because it is more effective than most human drivers at slowing down cars in a controlled way. But in other cases they create such great uncertainties that we must build in safeguards to account for the very real possibility that insights drawn from data are wrong. I do this every time I leave my home with a small umbrella packed in my bag despite the fact that weather forecasts created using enormous amounts of computing power predict a sunny day.

(No matter how sophisticated computer models of cities become, there are fundamental reasons why they will always be simplifications of reality. It is only by understanding those constraints that we can understand which insights from computer models are valuable, and which may be misleading. Image of Sim City by haljackey)

Only by understanding these limitations can we judge where an “algorithmic” approach can be trusted; where it needs safeguards; and where it is wholly inadequate. Some of the limitations are practical, and bounded only by the sensitivity of today’s sensors and the power of today’s computers. But others are fundamental laws of physics and limitations of logical systems.

When technology companies assert that Smart Cities can create “autonomous, intelligently functioning IT systems that will have perfect knowledge of users’ habits” (as London School of Economics Professor Adam Greenfield rightly criticised in his book “Against the Smart City”), they are ignoring these challenges.

A blog post published by the highly influential magazine Wired recently made similar overstatements: “The Universe is Programmable” argues that we should extend the concept of an “Application Programming Interface (API)” – a facility usually offered by technology systems to allow external computer programmes to control or interact with them – to every aspect of the world, including our own biology.

To compare complex, unpredictable, emergent biological and social systems to the very logical, deterministic world of computer software is at best a dramatic oversimplification. The systems that comprise the human body range from the armies of symbiotic microbes that help us digest food in our stomachs to the consequences of using corn syrup to sweeten food to the cultural pressure associated with “size 0” celebrities. Many of those systems can’t be well modelled in their own right, let alone deterministically related to each other; let alone formally represented in an accurate, detailed way by technology systems (or even in mathematics).

We should avoid the hubris that creates distrust of technology by overstating its capabilities and failing to recognise its challenges and limitations. That distrust is a barrier that prevents us from achieving the very real benefits that data and technology can bring – benefits that have been convincingly demonstrated in the past.

For example, an enormous contribution to our knowledge of how to treat and prevent disease was made by John Snow who used data to analyse outbreaks of cholera in London in the 19th century. Snow used a map to correlate cases of cholera to the location of communal water pipes, leading to the insight that water-borne germs were responsible for spreading the disease. We wash our hands to prevent diseases spreading through germs in part because of what we would now call the “geospatial data analysis” performed by John Snow.
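Snow’s method can be sketched in a few lines of code: assign each case to its nearest pump, and count. The coordinates below are invented for illustration – they are not Snow’s actual data – but the clustering logic is the same:

```python
from collections import Counter

# Invented coordinates for illustration only - not Snow's actual data.
pumps = {"Pump A": (0.0, 0.0), "Pump B": (5.0, 1.0)}
cases = [(0.4, 0.1), (0.2, -0.3), (4.8, 1.2), (0.1, 0.2), (-0.5, 0.1)]

def nearest_pump(case):
    """Assign a cholera case to the closest water pump (squared distance)."""
    cx, cy = case
    return min(pumps, key=lambda p: (pumps[p][0] - cx) ** 2 + (pumps[p][1] - cy) ** 2)

# Counting cases by nearest pump reveals the cluster around one source.
counts = Counter(nearest_pump(c) for c in cases)
print(counts.most_common())
```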

Many of the insights that we seek from analytic and smart city systems are human in nature, not physical or mathematical – for example identifying when and where to apply social care interventions in order to reduce the occurrence of emotional domestic abuse. Such questions are complex and uncertain: what is “emotional domestic abuse”? Is it abuse inflicted by a live-in boyfriend, or by an estranged husband who lives separately but makes threatening telephone calls? Does it consist of physical violence or bullying? And what is “bullying”?

(John Snow’s map of cholera outbreaks in 19th century London)

We attempt to create structured, quantitative data about complex human and social issues by using approximations and categorisations; by tolerating ranges and uncertainties in numeric measurements; by making subjective judgements; and by looking for patterns and clusters across different categories of data. Whilst these techniques can be very powerful, just how difficult it is to be sure what these conventions and interpretations should be is illustrated by the controversies that regularly arise around “who knew what, when?” whenever there is a high profile failure in social care or any other public service.

These challenges are not limited to “high level” social, economic and biological systems. In fact, they extend throughout the worlds of physics and chemistry into the basic nature of matter and the universe. They fundamentally limit the degree to which we can measure the world, and our ability to draw insight from that information.

By being aware of these limitations we are able to design systems and practises to use data and technology effectively. We know more about the weather through modelling it using scientific and mathematical algorithms in computers than we would without those techniques; but we don’t expect those forecasts to be entirely accurate. Similarly, supermarkets can use data about past purchases to make sufficiently accurate predictions about future spending patterns to boost their profits, without needing to predict exactly what each individual customer will buy.

We underestimate the limitations and flaws of these approaches at our peril. Whilst Tim O’Reilly cites several automated financial systems as good examples of “algorithmic regulation”, the financial crash of 2008 showed the terrible consequences of risk management systems that were thoroughly inadequate compared to the complexity of the system that the world’s financial institutions sought to profit from. The few institutions that realised that market conditions had changed and that their models for risk management were no longer valid relied instead on the expertise of their staff, and avoided the worst effects. Others continued to rely on models that had started to produce increasingly misleading guidance, leading to the recession that we are only now emerging from six years later, and that has damaged countless lives around the world.

Every day in their work, scientists, engineers and statisticians draw conclusions from data and analytics, but they temper those conclusions with an awareness of their limitations and any uncertainties inherent in them. By taking and communicating such a balanced and informed approach to applying similar techniques in cities, we will create more trust in these technologies than by overstating their capabilities.

What follows is a description of some of the scientific, philosophical and practical issues that lead inevitably to uncertainty in data, and to limitations in our ability to draw conclusions from it.

But I’ll finish with an explanation of why we can still draw great value from data and analytics if we are aware of those issues and take them properly into account.

Three reasons why we can’t measure data perfectly

(How Heisenberg’s Uncertainty Principle results from the dual wave/particle nature of matter. Explanation by HyperPhysics at Georgia State University)

1. Heisenberg’s Uncertainty Principle and the fundamental impossibility of knowing everything about anything

Heisenberg’s Uncertainty Principle is a cornerstone of Quantum Mechanics, which, along with General Relativity, is one of the two most fundamental theories scientists use to understand our world. It defines a limit to the precision with which certain pairs of properties of the basic particles which make up the world – such as protons, neutrons and electrons – can be known at the same time. For instance, the more accurately we measure the position of such particles, the more uncertain their speed and direction of movement become.

The explanation of the Uncertainty Principle is subtle, and lies in the strange fact that very small “particles” such as electrons and neutrons also behave like “waves”; and that “waves” like beams of light also behave like very small “particles” called “photons”. But we can use an analogy to understand it.

In order to measure something, we have to interact with it. In everyday life, we do this by using our eyes to measure lightwaves that are created by lightbulbs or the sun and that then reflect off objects in the world around us.

But when we shine light on an object, what we are actually doing is showering it with billions of photons, and observing the way that they scatter. When the object is quite large – a car, a person, or a football – the photons are so small in comparison that they bounce off without affecting it. But when the object is very small – such as an atom – the photons colliding with it are large enough to knock it out of its original position. In other words, measuring the current position of an object involves a collision which causes it to move in a random way.

This analogy isn’t exact; but it conveys the general idea. (For a full explanation, see the figure and link above). Most of the time, we don’t notice the effects of Heisenberg’s Uncertainty Principle because it applies at extremely small scales. But it is perhaps the most fundamental law asserting that “perfect knowledge” is simply impossible; and it illustrates a wider point: any measurement or observation affects what is measured or observed. Sometimes the effects are negligible, but often they are not – if we observe workers in a time and motion study, for example, we need to be careful to understand the effect our presence and observations have on their behaviour.
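The scale of the effect is easy to compute from the uncertainty relation Δx·Δp ≥ ħ/2. A rough sketch, using standard physical constants and illustrative distances, shows why the principle dominates at atomic scales and is utterly negligible for everyday objects:

```python
# The uncertainty relation: dx * dp >= hbar / 2.
hbar = 1.054571817e-34   # reduced Planck constant, J s
m_electron = 9.109e-31   # electron mass, kg
m_football = 0.43        # football mass, kg (illustrative)

# An electron whose position is pinned down to atomic scale (~0.1 nm):
dv_electron = hbar / (2 * 1e-10) / m_electron
# A football whose position is pinned down to a millimetre:
dv_football = hbar / (2 * 1e-3) / m_football

print(f"electron speed uncertainty: at least {dv_electron:.0f} m/s")
print(f"football speed uncertainty: at least {dv_football:.1e} m/s")
```

The electron’s minimum speed uncertainty comes out at hundreds of kilometres per second; the football’s is some thirty orders of magnitude smaller – far too small ever to notice.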

2. Accuracy, precision, noise, uncertainty and error: why measurements are never fully reliable

Outside the world of Quantum Mechanics, there are more practical issues that limit the accuracy of all measurements and data.

(A measurement of the electrical properties of a superconducting device from my PhD thesis. Theoretically, the behaviour should appear as a smooth, wavy line; but the experimental measurement is affected by noise and interference that cause the signal to become “fuzzy”. In this case, the effects of noise and interference – the degree to which the signal appears “fuzzy” – are relatively small compared to the strength of the signal, and the device is usable)

We live in a “warm” world – roughly 300 degrees Celsius above what scientists call “absolute zero”, the coldest temperature possible. What we experience as warmth is in fact movement: the atoms from which we and our world are made “jiggle about” – they move randomly. When we touch a hot object and feel pain it is because this movement is too violent to bear – it’s like being pricked by billions of tiny pins.

This random movement creates “noise” in every physical system, like the static we hear in analogue radio stations or on poor quality telephone connections.

We also live in a busy world, and this activity leads to other sources of noise. All electronic equipment creates electrical and magnetic fields that spread beyond the equipment itself, and in turn affect other equipment – we can hear this as a buzzing noise when we leave smartphones near radios.

Generally speaking, all measurements are affected by random noise created by heat, vibrations or electrical interference; are limited by the precision and accuracy of the measuring devices we use; and are affected by inconsistencies and errors that arise because it is always impossible to completely separate the measurement we want to make from all other environmental factors.

Scientists, engineers and statisticians are familiar with these challenges, and use techniques developed over the course of more than a century to determine and describe the degree to which they can trust and rely on the measurements they make. They do not claim “perfect knowledge” of anything; on the contrary, they are diligent in describing the unavoidable uncertainty that is inherent in their work.
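A simple way to see this practice at work is to simulate a noisy instrument and report the measurement together with its uncertainty, as a scientist would. The values below are invented for illustration:

```python
import random
random.seed(42)  # fixed seed so the illustration is repeatable

true_value = 3.00    # the quantity being measured (arbitrary units)
noise_level = 0.25   # standard deviation of thermal/electrical noise

# Each reading is the true value plus random noise.
readings = [true_value + random.gauss(0, noise_level) for _ in range(1000)]

mean = sum(readings) / len(readings)
variance = sum((r - mean) ** 2 for r in readings) / (len(readings) - 1)
std_error = (variance / len(readings)) ** 0.5

# A scientist reports the value together with its uncertainty:
print(f"measurement: {mean:.3f} +/- {std_error:.3f}")
```

No single reading is reliable; but averaging many readings, and quoting the standard error, lets us say how far we can trust the result – which is exactly the habit this section describes.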

3. The limitations of measuring the natural world using digital systems

One of the techniques we’ve adopted over the last half century to overcome the effects of noise and to make information easier to process is to convert “analogue” information about the real world (information that varies smoothly) into digital information – i.e. information that is expressed as sequences of zeros and ones in computer systems.

(When analogue signals are amplified, so is the noise that they contain. Digital signals are interpreted using thresholds: above an upper threshold, the signal means “1”, whilst below a lower threshold, the signal means “0”. A long string of “0”s and “1”s can be used to encode the same information as contained in analogue waves. By making the difference between the thresholds large compared to the level of signal noise, digital signals can be recreated to remove noise. Further explanation and image by Science Aid)

This process involves a trade-off between the accuracy with which analogue information is measured and described, and the length of the string of digits required to do so – and hence the amount of computer storage and processing power needed.

This trade-off can be clearly seen in the difference in quality between an internet video viewed on a smartphone over a 3G connection and one viewed on a high definition television using a cable network. Neither video will be affected by the static noise that affects weak analogue television signals, but the limited bandwidth of a 3G connection dramatically limits the clarity and resolution of the image transmitted.

The Nyquist–Shannon sampling theorem defines this trade-off and the limit to the quality that can be achieved in storing and processing digital information created from analogue sources. It determines the quality of digital data that we are able to create about any real-world system – from weather patterns to the location of moving objects to the fidelity of sound and video recordings. As computers and communications networks continue to grow more powerful, the quality of digital information will improve, but it will never be a perfect representation of the real world.
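The aliasing at the heart of the theorem can be demonstrated in a few lines. Sampled at only 10 times per second, a 9 Hz wave produces exactly the readings of a sign-flipped 1 Hz wave – the digital data simply cannot tell the two signals apart. A minimal sketch:

```python
import math

def sample(freq_hz, rate_hz, n):
    """Take n samples of a sine wave at the given sampling rate."""
    return [math.sin(2 * math.pi * freq_hz * k / rate_hz) for k in range(n)]

# Nyquist-Shannon: to capture a 9 Hz signal we must sample above 18 Hz.
# At only 10 samples per second, every reading of a 9 Hz wave coincides
# with a (sign-flipped) 1 Hz wave - the samples cannot tell them apart.
nine_hz = sample(9.0, 10.0, 10)
one_hz = sample(1.0, 10.0, 10)

indistinguishable = all(abs(a + b) < 1e-9 for a, b in zip(nine_hz, one_hz))
print("aliased:", indistinguishable)
```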

Three limits to our ability to analyse data and draw insights from it

1. Gödel’s Incompleteness Theorem and the inconsistency of algorithms

Kurt Gödel’s Incompleteness Theorem sets a limit on what can be achieved by any “closed logical system”. Examples of “closed logical systems” include computer programming languages, any system for creating algorithms – and mathematics itself.

We use “closed logical systems” whenever we create insights and conclusions by combining and extrapolating from basic data and facts. This is how all reporting, calculating, business intelligence, “analytics” and “big data” technologies work.

Gödel’s Incompleteness Theorem proves that any closed logical system can be used to create conclusions that it is not possible to show are true or false using the same system. In other words, whilst computer systems can produce extremely useful information, we cannot rely on them to prove that that information is completely accurate and valid. We have to do that ourselves.

Gödel’s theorem doesn’t stop computer algorithms that have been verified by humans using the scientific method from working; but it does mean that we can’t rely on computers to both generate algorithms and guarantee their validity.

2. The behaviour of many real-world systems can’t be reduced analytically to simple rules

Many systems in the real-world are complex: they cannot be described by simple rules that predict their behaviour based on measurements of their initial conditions.

A simple example is the “three body problem”. Imagine a sun, a planet and a moon all orbiting each other. The movement of these three objects is governed by the force of gravity, which can be described by relatively simple mathematical equations. However, even with just three objects involved, it is not possible to use these equations to directly predict their long-term behaviour – whether they will continue to orbit each other indefinitely, or will eventually collide with each other, or spin off into the distance.

(A computer simulation by Hawk Express of a Belousov–Zhabotinsky reaction, in which reactions between liquid chemicals create oscillating patterns of colour. The simulation is carried out using “cellular automata”, a technique based on a grid of squares which can take different colours. In each “turn” of the simulation, like a turn in a board game, the colour of each square is changed using simple rules based on the colours of adjacent squares. Such simulations have been used to reproduce a variety of real-world phenomena)

As Stephen Wolfram argued in his controversial book “A New Kind of Science” in 2002, we need to take a different approach to understanding such complex systems. Rather than using mathematics and logic to analyse them, we need to simulate them, often using computers to create models of the elements from which complex systems are composed, and the interactions between them. By running simulations based on a large number of starting points and comparing the results to real-world observations, insights into the behaviour of the real-world system can be derived. This is how weather forecasts are created, for example. 

But as we all know, weather forecasts are not always accurate. Simulations are approximations to real-world systems, and their accuracy is restricted by the degree to which digital data can be used to represent a non-digital world. For this reason, conclusions and predictions drawn from simulations are usually “average” or “probable” outcomes for the system as a whole, not precise predictions of the behaviour of the system or any individual element of it. This is why weather forecasts are often wrong; and why they predict likely levels of rain and windspeed rather than the shape and movement of individual clouds.
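A tiny example of this simulation-based approach is a one-dimensional cellular automaton of the kind Wolfram studied: each cell follows a trivial local rule, yet the global pattern that emerges resists analytical prediction. A minimal sketch of “Rule 30”:

```python
# "Rule 30", a one-dimensional cellular automaton studied by Wolfram:
# each cell's next state depends only on itself and its two neighbours,
# yet the overall pattern that emerges resists analytical prediction.
RULE = 30  # the 8-bit lookup table, encoded as an integer

def step(cells):
    """Apply the rule to every cell (with wrap-around at the edges)."""
    n = len(cells)
    return [
        (RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

row = [0] * 31
row[15] = 1  # start from a single "on" cell
for _ in range(15):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

The only way to discover what pattern this rule produces is to run it – there is no shortcut formula, which is exactly the point of the section above.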

(A simple and famous example of a computer programme that never stops running because it calls itself. The output continually varies by printing out characters based on random number generation. Image by Prosthetic Knowledge)

3. Some problems can’t be solved by computing machines

If I consider a simple question such as “how many letters are in the word ‘calculation’?”, I can easily convince myself that a computer programme could be written to answer the question; and that it would find the answer within a relatively short amount of time. But some problems are much harder to solve, or can’t even be solved at all.

For example, a “Wang Tile” (see image below) is a square tile formed from four triangles of different colours. Imagine that you have bought a set of tiles of various colour combinations in order to tile a wall in a kitchen or bathroom. Given the set of tiles that you have bought, is it possible to tile your wall so that triangles of the same colour line up with each other, forming a pattern of “Wang Tile” squares?

In 1966 Robert Berger proved that no algorithm exists that can answer that question. There is no way to solve the problem – or to determine how long it will take to solve the problem – without actually solving it. You just have to try to tile the room and find out the hard way.

One of the most famous examples of this type of problem is the “halting problem” in computer science. Some computer programmes finish executing their commands relatively quickly. Others can run indefinitely if they contain a “loop” instruction that never ends. For others which contain complex sequences of loops and calls from one section of code to another, it may be very hard to tell whether the programme finishes quickly, or takes a long time to complete, or never finishes its execution at all.

Alan Turing, one of the most important figures in the development of computing, proved in 1936 that a general algorithm to determine whether or not any computer programme finishes its execution does not exist. In other words, whilst there are many useful computer programmes in the world, there are also problems that computer programmes simply cannot solve.
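Turing's argument can itself be sketched as a programme. Given any candidate "halting decider" you care to supply, we can construct a programme that the decider must get wrong – which is why no correct decider can exist. The function names here are illustrative:

```python
def refute(candidate_halts):
    """Build a programme that defeats any proposed halting decider.
    candidate_halts(p) is meant to return True if programme p halts."""
    def paradox():
        if candidate_halts(paradox):
            while True:   # loop forever exactly when the decider says we halt
                pass
        # otherwise: halt immediately

    if candidate_halts(paradox):
        # The decider claims paradox halts; but paradox would then loop forever.
        return "decider says HALTS, but paradox would loop forever"
    else:
        paradox()  # safe to run: the decider claims it loops, yet it returns
        return "decider says LOOPS, but paradox just halted"

print(refute(lambda p: True))
print(refute(lambda p: False))
```

Whatever answer the candidate gives, `paradox` does the opposite, so every candidate decider is wrong about at least one programme.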

(A set of Wang Tiles, and a pattern of coloured squares created by tiling them. Given any random set of tiles of different colour combinations, there is no set of rules that can be relied on to determine whether a valid pattern of coloured squares can be created from them. Sometimes, you have to find out by trial and error. Images from Wikipedia)

Five reasons why the human world is messy, unpredictable, and can’t be perfectly described using data and logic

1. Our actions create disorder

The 2nd Law of Thermodynamics is a good candidate for the most fundamental law of science. It states that as time progresses, the universe becomes more disorganised. It guarantees that ultimately – in billions of years – the Universe will die as all of the energy and activity within it dissipates.

An everyday practical consequence of this law is that every time we act to create value – building a shed, using a car to get from one place to another, cooking a meal – our actions eventually cause a greater amount of disorder to be created as a consequence – as noise, pollution, waste heat or landfill refuse.

For example, if I spend a day building a shed, then to create that order and value from raw materials, I consume structured food and turn it into sewage. Or if I use an electric forklift to stack a pile of boxes, I use electricity that has been created by burning structured coal into smog and ash.

So it is literally impossible to create a “perfect world”. Whenever we act to make a part of the world more ordered, we create disorder elsewhere. And ultimately – thankfully, long after you and I are dead – disorder is all that will be left.

2. The failure of Logical Atomism: why the human world can’t be perfectly described using data and logic

In the 20th Century two of the most famous and accomplished philosophers in history, Bertrand Russell and Ludwig Wittgenstein, invented “Logical Atomism”, a theory that the entire world could be described by using “atomic facts” – independent and irreducible pieces of knowledge – combined with logic.

But despite 40 years of work, these two supremely intelligent people could not get their theory to work: “Logical Atomism” failed. It is not possible to describe our world in that way.

One cause of the failure was the insurmountable difficulty of identifying truly independent, irreducible atomic facts. “The box is red” and “the circle is blue”, for example, aren’t independent or irreducible facts for many reasons. “Red” and “blue” are two conventions of human language used to describe the perceptions created when electro-magnetic waves of different frequencies arrive at our retinas. In other words, they depend on and relate to each other through a number of sophisticated systems.

Despite centuries of scientific and philosophical effort, we do not have a complete understanding of how to describe our world at its most basic level. As physicists have explored the world at smaller and smaller scales, Quantum Mechanics has emerged as the most fundamental theory for describing it – it is the closest we have come to finding the “irreducible facts” that Russell and Wittgenstein were looking for. But whilst the mathematical equations of Quantum Mechanics predict the outcomes of experiments very well, after nearly a century, physicists still don’t really agree about what those equations mean. And as we have already seen, Heisenberg’s Uncertainty Principle prevents us from ever having perfect knowledge of the world at this level.

Perhaps the most important failure of logical atomism, though, was that it proved impossible to use logical rules to turn “facts” at one level of abstraction – for example, “blood cells carry oxygen”, “nerves conduct electricity”, “muscle fibres contract” – into facts at another level of abstraction – such as “physical assault is a crime”. The human world and the things that we care about can’t be described using logical combinations of “atomic facts”. For example, how would you define the set of all possible uses of a screwdriver, from prising the lids off paint tins to causing a short-circuit by jamming it into a switchboard?

Our world is messy, subjective and opportunistic. It defies universal categorisation and logical analysis.

(A Pescheria in Bari, Puglia, where a fish-market price information service makes it easier for local fishermen to identify the best buyers and prices for their daily catch. Photo by Vito Palmi)

3. The importance and inaccessibility of “local knowledge” 

Because the tool we use for calculating and agreeing value when we exchange goods and services is money, economics is the discipline that is often used to understand the large-scale behaviour of society. We often quantify the “growth” of society using economic measures, for example.

But this approach is notorious for overlooking social and environmental characteristics such as health, happiness and sustainability. Alternatives exist, such as the Social Progress Index, or the measurement framework adopted by the United Nations 2014 Human Development Report on world poverty; but they are still high level and abstract.

Such approaches struggle to explain localised variations, and in particular cannot predict the behaviours or outcomes of individual people with any accuracy. This “local knowledge problem” is caused by the fact that a great deal of the information that determines individual actions is personal and local, and not measurable at a distance – the experienced eye of the fruit buyer assessing not just the quality of the fruit but the quality of the farm and farmers that produce it, as a measure of the likely consistency of supply; the emotional attachments that cause us to favour one brand over another; or the degree of community ties between local businesses that influence their propensity to trade with each other.

“Sharing economy” business models that use social media and reputation systems to enable suppliers and consumers of goods and services to find each other and transact online are opening up this local knowledge to some degree. Local food networks, freecycling networks, and land-sharing schemes all use this technology to the benefit of local communities whilst potentially making information about detailed transactions more widely available. And to some degree, the human knowledge that influences how transactions take place can be encoded in “expert systems” which allow computer systems to codify the quantitative and heuristic rules by which people take decisions.

But these technologies are only used in a subset of the interactions that take place between people and businesses across the world, and it is unlikely that they’ll become ubiquitous in the foreseeable future (or that we would want them to become so). Will we ever reach the point where prospective house-buyers delegate decisions about where to live to computer programmes operating in online marketplaces rather than by visiting places and imagining themselves living there? Will we somehow automate the process of testing the freshness of fish by observing the clarity of their eyes and the freshness of their smell before buying them to cook and eat?

In many cases, while technology may play a role introducing potential buyers and sellers of goods and services to each other, it will not replace – or predict – the human behaviours involved in the transaction itself.

(Medway Youth Trust use predictive and textual analytics to draw insight into their work helping vulnerable children. They use technology to inform expert case workers, not to take decisions on their behalf.)

4. “Wicked problems” cannot be described using data and logic

Despite all of the challenges associated with problems in mathematics and the physical sciences, it is nevertheless relatively straightforward to frame and then attempt to solve problems in those domains; and to determine whether the resulting solutions are valid.

As the failure of Logical Atomism showed, though, problems in the human domain are much more difficult to describe in any systematic, complete and precise way – a challenge known as the “frame problem” in artificial intelligence. This is particularly true of “wicked problems” – challenges such as social mobility or vulnerable families that are multi-faceted, and consist of a variety of interdependent issues.

Take job creation, for example. Is that best accomplished through creating employment in taxpayer-funded public sector organisations? Or by allowing private-sector wealth to grow, creating employment through “trickle-down” effects? Or by maximising overall consumer spending power as suggested by “middle-out” economics? All of these ideas are described not using the language of mathematics or other formal logical systems, but using natural human language which is subjective and inconsistent in use.

The failure of Logical Atomism to fully represent such concepts in formal logical systems through which truth and falsehood can be determined with certainty emphasises what we all understand intuitively: there is no single “right” answer to many human problems, and no single “right” action in many human situations.

(An electricity bill containing information provided by OPower comparing one household’s energy usage to their neighbours. Image from Grist)

5. Behavioural economics and the caprice of human behaviour

“Behavioural economics” attempts to predict the way that humans behave when taking choices that have a measurable impact on them – for example, whether to put the washing machine on at 5pm when electricity is expensive, or at 11pm when it is cheap.

But predicting human behaviour is notoriously unreliable.

For example, in a smart water-meter project in Dubuque, Iowa, households that were told how their water conservation compared to that of their near neighbours were found to be twice as likely to take action to improve their efficiency as those who were only told the details of their own water use. In other words, people who were given quantified evidence that they were less responsible water users than their neighbours changed their behaviour. OPower have used similar techniques to help US households save 1.9 terawatt hours of power simply by including a report based on data from smart meters in a printed letter sent with customers’ electricity bills.
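The mechanics of such a comparison report are simple; it is the behavioural response to it that is hard to predict. A minimal sketch follows – the figures and wording are illustrative, not the actual algorithm used in Dubuque or by OPower:

```python
def comparison_report(household_use, neighbour_uses):
    """Compare one household's consumption to its neighbours' average
    and phrase the difference as a behavioural nudge."""
    avg = sum(neighbour_uses) / len(neighbour_uses)
    diff_pct = 100.0 * (household_use - avg) / avg
    if diff_pct > 0:
        return f"You used {diff_pct:.0f}% more than your neighbours' average."
    return f"You used {abs(diff_pct):.0f}% less than your neighbours' average."

print(comparison_report(130.0, [100.0, 110.0, 90.0]))
# → You used 30% more than your neighbours' average.
```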

These are impressive achievements; but they are not always repeatable. A recycling scheme in the UK that adopted a similar approach found instead that it lowered recycling rates across the community: households who learned that they were putting more effort into recycling than their neighbours asked themselves “if my neighbours aren’t contributing to this initiative, then why should I?”

Low carbon engineering technologies like electric vehicles have clearly defined environmental benefits and clearly defined costs. But most Smart Cities solutions are less straightforward. They are complex socio-technical systems whose outcomes are emergent. Our ability to predict their performance and impact will certainly improve as more are deployed and analysed, and as University researchers, politicians, journalists and the public assess them. But we will never predict individual actions using these techniques, only the average statistical behaviour of groups of people. This can be seen from OPower’s own comparison of their predicted energy savings against those actually achieved – the predictions are good, but the actual behaviour of OPower’s customers shows a high degree of apparently random variation. Those variations are the result of the subjective, unpredictable and sometimes irrational behaviour of real people.

We can take insight from Behavioural Economics and other techniques for analysing human behaviour in order to create appropriate strategies, policies and environments that encourage the right outcomes in cities; but none of them can be relied on to give definitive solutions to any individual person or situation. They can inform decision-making, but are always associated with some degree of uncertainty. In some cases, the uncertainty will be so small as to be negligible, and the predictions can be treated as deterministic rules for achieving the desired outcome. But in many cases, the uncertainty will be so great that predictions can only be treated as general indications of what might happen; whilst individual actions and outcomes will vary greatly.

(Of course it is impossible to predict individual criminal actions as portrayed in the film “Minority Report”. But it is very possible to analyse past patterns of criminal activity, compare them to related data such as weather and social events, and predict the likelihood of crimes of certain types occurring in certain areas. Cities such as Memphis and Chicago have used these insights to achieve significant reductions in crime)

Learning to value insight without certainty

Mathematics and digital technology are incredibly powerful; but they will never perfectly and completely describe and predict our world in human terms. In many cases, our focus for using them should not be on automation: it should be on the enablement of human judgement through better availability and communication of information. And in particular, we should concentrate on communicating accurately the meaning of information in the context of its limitations and uncertainties.

There are exceptions where we automate systems because of a combination of a low-level of uncertainty in data and a large advantage in acting autonomously on it. For example, anti-lock braking systems save lives by using automated technology to take thousands of decisions more quickly than most humans would realise that even a single decision needed to be made; and do so based on data with an extremely low degree of uncertainty.

But the most exciting opportunity for us all is to learn to become sophisticated users of information that is uncertain. The results of textual analysis of sentiment towards products and brands expressed in social media are far from certain; but they are still of great value. Similar technology can extract insights from medical research papers, case notes in social care systems, maintenance logs of machinery and many other sources. Those insights will rarely be certain; but properly assessed by people with good judgement they can still be immensely valuable.
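A toy illustration of what "information that is uncertain" looks like in practice. Real sentiment analysis uses statistical models trained on large corpora; this keyword lexicon is only a sketch, and every name in it is my own invention. The point is that the score comes paired with a crude coverage measure indicating how much of the text the method actually recognised:

```python
POSITIVE = {"great", "love", "excellent", "good"}
NEGATIVE = {"poor", "hate", "terrible", "bad"}

def sentiment(text):
    """Return (score, coverage): score in [-1, 1] from a tiny lexicon;
    coverage is the fraction of words the lexicon recognised, a rough
    proxy for how uncertain the score is."""
    words = text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    matched = pos + neg
    if matched == 0:
        return 0.0, 0.0   # no evidence either way
    return (pos - neg) / matched, matched / len(words)

print(sentiment("I love this great phone"))  # → (1.0, 0.4)
```

A score of 1.0 with coverage of 0.4 is a very different claim from a score of 1.0 with coverage of 0.05; a sophisticated user of uncertain information weighs both numbers.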

This is a much better way to understand the value of technology than ideas like “perfect knowledge” and “algorithmic regulation”. And it is much more likely that people will trust the benefits that we claim new technologies can bring if we are open about their limitations. People won’t use technologies that they don’t trust; and they won’t invest their money in them or vote for politicians who say they’ll spend taxes on them.

Thank you to Richard Brown and Adrian McEwen for discussions on Twitter that helped me to prepare this article. A more in-depth discussion of some of the scientific and philosophical issues I’ve described, and an exploration of the nature of human intelligence and its non-deterministic characteristics, can be found in the excellent paper “Answering Descartes: Beyond Turing” by Stuart Kauffman, published by MIT Press.

What’s the risk of investing in a Smarter City?

(The two towers of the Bosco Verticale in Milan will be home to more than 10,000 plants that create shade and improve air quality. But to what degree do such characteristics make buildings more attractive to potential tenants than traditional structures, creating the potential for financial returns that would reward more widespread investment in this approach? Photo by Marco Trovo)

(Or “how to buy a Smarter City that won’t go bump in the night”)

There are good reasons why the current condition and future outlook of the world’s cities have been the subject of great debate in recent years. Their population will double from 3 billion to 6 billion by 2050; and while those in the developing world are growing at such a rate that they are challenging our ability to construct resilient, efficient infrastructure, those in developed countries often have significant levels of inequality and areas of persistent poverty and social immobility.

Many people involved in the debate are convinced that new approaches are needed to transport, food supply, economic development, water and energy management, social and healthcare, public safety and all of the other services and infrastructures that support cities.

As a consequence, analysts such as Frost &amp; Sullivan have estimated that the market for “Smart City” solutions that exploit technology to address these issues will be $1.5 trillion by 2020.

But anyone who has tried to secure investment in an initiative to apply “smart” technology in a city knows that it is not always easy to turn that theoretical market value into actual investment in projects, technology, infrastructure and expertise.

It’s not difficult to see why this is the case. Most investments are made in order to generate a financial return, but profit is not the objective of “Smart Cities” initiatives: they are intended to create economic, environmental or social outcomes. So some mechanism – an investment vehicle, a government regulation or a business model – is needed to create an incentive to invest in achieving those outcomes.

Institutions, Business, Infrastructure and Investment

Citizens expect national and local governments to use their tax revenues to deliver these objectives, of course. But they are also very concerned that the taxes they pay are spent wisely on programmes with transparent, predictable, deliverable outcomes, as the current controversy over the UK’s proposed “HS2” high speed train network and previous controversies over the effectiveness of public sector IT programmes show.

Nevertheless, the past year has seen a growing trend for cities in Europe and North America to invest in Smart Cities technologies from their own operational budgets, on the basis of their ability to deliver cost savings or improvements in outcomes.

For example, some cities are replacing traditional parking management and enforcement services with “smart parking” schemes that are reducing congestion and pollution whilst paying for themselves through increased enforcement revenues. Others are investing their allocation of central government infrastructure funds in Smart solutions – such as Cambridge, Ontario’s use of the Canadian government’s Gas Tax Fund to invest in a sensor network and analytics infrastructure to manage the city’s physical assets intelligently.

The providers of Smart Cities solutions are investing too, by implementing their services on Cloud computing platforms so that cities can pay incrementally for their use of them, rather than investing up-front in their deployment. Minneapolis, Minnesota and Montpellier, France, recently announced that they are using IBM’s Cloud-based solutions for smarter water, transport and emergency management in this way. And entrepreneurial businesses, backed by Venture Capital investment, are also investing in the development of new solutions.

However, we have not yet tapped the largest potential investment streams: property and large-scale infrastructure. The British Property Federation, for example, estimates that £14 billion is invested in the development of new property in the UK each year. For the most part, those funds do not currently flow into “Smart City” solutions.

To understand why that is the case – and how we might change it – we need to understand how three types of risk differ between smart and traditional infrastructures: construction risk; the impact of operational failures; and confidence in outcomes.

(A cyclist’s protest in 2012 about the disruption caused in Edinburgh by the overrunning construction of the city’s new tram system. Photo by Andy A)

Construction Risk

At a discussion in March of the financing of future city initiatives held within the Lord Mayor of the City of London’s “Tomorrow’s Cities” programme, Daniel Wong, Head of Infrastructure and Real Estate for Macquarie Capital Europe, said that only a “tiny fraction” – a few percent – of the investable resources of the pension and sovereign wealth funds often referred to as the “wall of money” seeking profitable long-term investment opportunities in infrastructure were available to invest in infrastructure projects that carry “construction risk” – the risk of financial loss or cost overruns during construction.

For conventional infrastructure, construction risk is relatively well understood. At the Tomorrow’s Cities event, Jason Robinson, Bechtel’s General Manager for Urban Development, said that the construction sector was well able to manage that risk on behalf of investors. There are exceptions – such as the delays, cost increases and reduction in scale of Edinburgh’s new tram system – but they are rare.

So are we similarly well placed to manage the additional “construction risk” created when we add new technology to infrastructure projects?

Unfortunately, research carried out in 2013 by the Standish Group on behalf of Computerworld suggests not. Standish Group used data describing 3,555 IT projects between 2003 and 2012 that had labour costs of at least $10 million, and found that only 6.4% were wholly successful. 52% were delivered, but cost more than expected, took longer than expected, or failed to deliver everything that was expected of them. The rest – 41.4% – either failed completely or had to be stopped and re-started from scratch. Anecdotally, we are familiar with the press coverage of high profile examples of IT projects that do not succeed.

We should not be surprised that it is so challenging to deliver IT projects. They are almost always driven by requirements that represent an aspiration to change the way that an organisation or system works: such requirements are inevitably uncertain and often change as projects proceed. In today’s interconnected world, many IT projects involve the integration of several existing IT systems operated by different organisations: most of those systems will not have been designed to support integration. And because technology changes so quickly, many projects use technologies that are new to the teams delivering them. All of these things will usually be true for the technology solutions required for Smart City projects.

By analogy, then, an IT project often feels like an exercise in building an ambitiously new style of building, using new materials whose weight, strength and stiffness isn’t wholly certain, and standing on a mixture of sand, gravel and wetland. It is not surprising that only 6.4% deliver everything they intend to, on time and on budget – though it is also disappointing that as many as 41.4% fail so completely.

However, the real insight is that the characteristics of uncertainty, risk, timescales and governance for IT projects are very different from those of construction and infrastructure projects. All of these issues can be managed; but they are managed in very different ways. It will therefore take time and experience for the cultures of IT and construction to reconcile their approaches to risk and project management, and so to present a confident joint approach to investors.

The implementation of Smart Cities IT solutions on Cloud Computing platforms by their providers mitigates this risk to an extent by “pre-fabricating” these components of smart infrastructure. But there is still risk associated with the integration of these solutions with physical infrastructure and engineering systems. As we gain further experience of carrying out that integration, IT vendors, investors, construction companies and their customers will collectively increase their confidence in managing this risk, unlocking investment at greater scale.

(The unfortunate consequence of a driver who put more trust in their satellite navigation and GPS technology than its designers expected. Photo by Salmon Assessors)

Operational Risk

We are all familiar with IT systems failing.

Our laptops, notebooks and tablets crash, and we lose work as a consequence. Our television set-top boxes reboot themselves midway through recording programmes. Websites become unresponsive or lose data from our shopping carts.

But when failures occur in IT systems that monitor and control physical systems such as cars, trains and traffic lights, the consequences could be severe: damage to property, injury and death. Organisations that invest in and operate infrastructure are conscious of these risks, and balance them against the potential benefits of new technologies when deciding whether to use them.

The real-world risks of technology failure are already becoming more severe as all of us adopt consumer technologies such as smartphones and social media into every aspect of our lives (as the driver who followed his satellite navigation system off the roads of Paris onto the pavement, and then all the way down the steps into the Paris Metro, discovered).

The noted urbanist Jane Jacobs defined cities by their ability to provide privacy and safety amongst citizens who are usually strangers to each other; and her thinking is still regarded today by many urbanists as the basis of our understanding of cities. As digital technology becomes more pervasive in city systems, it is vital that we evolve the policies that govern digital privacy to ensure that those systems continue to support our lives, communities and businesses successfully.

Google’s careful exploration of self-driving cars in partnership with driver licensing organisations is an example of that process working well; the discovery of a suspected 3D-printing gun factory in Manchester last year is an example of it working poorly.

These issues are already affecting the technologies involved in Smart Cities solutions. An Argentinian researcher recently demonstrated that traffic sensors used around the world could be hacked and made to report misleading information. At the time of their installation it was assumed that no one would have a motive to attack them, so they were deployed with insufficient security. We will have to ensure that future deployments are much more secure.

Conversely, we routinely trust automated technology in many aspects of our lives – the automatic pilots that land the planes we fly in, and the anti-lock braking systems that slow and stop our cars far more effectively than we are able to ourselves.

If we are to build the same level of trust and confidence in Smart City solutions, we need to be open and honest about their risks as well as their benefits; and clear how we are addressing them.

(Cars from the car club “car2go” ready to hire in Vancouver. Despite succeeding in many cities around the world, the business recently withdrew from the UK after failing to attract sufficient customers to two pilot deployments in London and Birmingham. The UK’s cultural attraction of private car ownership has proved too strong at present for a shared ownership business model to succeed. Photo by Stephen Rees).

Outcomes Risk

Smart infrastructures such as Stockholm’s road-use charging scheme and London’s congestion charge were constructed in the knowledge that they would be financially sustainable, and with the belief that they would create economic and environmental benefits. Subsequent studies have shown that they did achieve those benefits, but data to predict them confidently in advance did not exist because they were amongst the first of their kind in the world.

The benefits of “Smart” schemes such as road-use charging and smart metering cannot be calculated deterministically in advance because they depend on citizens changing their behaviour – deciding to ride a bus rather than to drive a car; or deciding to use dishwashers and washing machines overnight rather than during the day.

There are many examples of Smart Cities projects that have successfully used technology to encourage behaviour change. In a smart water meter project in Dubuque, for example, households were given information that told them whether their domestic appliances were being used efficiently, and alerted to any leaks in their supply of water. To a certain extent, households acted on this information to improve the efficiency of their water usage. But households who were additionally given a “green points” score telling them how their water conservation compared to that of their near neighbours were found to be twice as likely to take action to improve their efficiency.

However, these techniques are notoriously difficult to apply successfully. A recycling scheme that adopted a similar approach found instead that it lowered recycling rates across the community: households who learned that they were putting more effort into recycling than their neighbours asked themselves “if my neighbours aren’t contributing to this initiative, then why should I?”

The financial vehicles that enable investment in infrastructure and property are either government-backed instruments that reward economic and social outcomes such as reductions in carbon footprint or the creation of jobs; or market-based instruments based on the creation of direct financial returns.

So are we able to predict those outcomes confidently enough to enable investment in Smart Cities solutions?

I put that question to the debating panel at the Tomorrow’s Cities meeting. In particular, I asked whether investors would be willing to purchase bonds in smart metering infrastructures with a rate of return dependent on the success of those infrastructures in encouraging consumers to reduce their use of water and energy.

The response was a clear “no”. The application of those technologies and their effectiveness in reducing the use of water and electricity by families and businesses is too uncertain for such investment vehicles to be used.

Smart Cities solutions are not straightforward engineering solutions such as electric vehicles whose cost, efficiency and environmental impacts can be calculated in a deterministic way. They are complex socio-technical systems whose outcomes are emergent and uncertain.

Our ability to predict their performance and impact will certainly improve as more are deployed and analysed, and as University researchers, politicians, journalists and the public assess them. As that happens, investors will be more willing to fund them; or, with government support, to create new financial vehicles that reward investment in initiatives that use smart technology to create social, environmental and economic improvements – just as the World Bank’s Green Bonds, launched in 2008, support environmental schemes today.

(Recycling bins in Curitiba, Brazil. As Mayor of Curitiba, Jaime Lerner started one of the world’s earliest and most effective city recycling programmes by harnessing the enthusiasm of children to influence the behaviour of their parents. Lerner’s many initiatives to transform Curitiba have the characteristic of entrepreneurial leadership. Photo by Ana Elisa Ribeiro)

Evidence and Leadership

The evidence base needed to support new investment vehicles is already being created. In Canada, for example, a collaboration between Canadian insurers and cities has developed a set of tools to create a common understanding of the financial risk created by the effects of climate change on the resilience of city infrastructures.

At an international level, the “Little Rock Accord” between the Madrid Club of former national Presidents and Prime Ministers and the P80 group of pension funds agreed to create a task force to increase the degree to which pension and sovereign wealth funds invest in the deployment of technology to address climate change, shortages in resources such as energy, water and food, and sustainable, resilient growth. My colleague the economist Mary Keeling has been working for IBM’s Institute for Business Value to analyse and express more clearly the benefits of Smart approaches – in water management and transportation, for example. And Peter Head’s Ecological Sequestration Trust and Robert Bishop’s International Centre for Earth Simulation are both pooling international data and expertise to create models that explore how more sustainable cities and societies might work.

But the Smart City programmes which courageously drive the field forward will not always be those that demand a complete and detailed cost/benefit analysis in advance. Writing in “The Plundered Planet”, the economist Paul Collier asserts that any proposed infrastructure of reasonable novelty and significant scale is effectively so unique – especially when considered in its geographic, political, social and economic context – that an accurate cost/benefit case simply cannot be constructed.

Instead, initiatives such as London’s congestion charge and bicycle hire scheme, Sunderland’s City Cloud and Bogota’s bikeways and parks were created by courageous leaders with a passionate belief that they could make their cities better. As more of those leaders come to trust technology and the people who deliver it, their passion will be another force behind the adoption of technology in city systems and infrastructure.

What’s the risk of not investing in a Smarter City?

For at least the last 50 years, we have been observing that life is speeding up and becoming more complicated. In his 1964 work “Notes on the Synthesis of Form“, the architect Christopher Alexander wrote:

“At the same time that the problems increase in quantity, complexity and difficulty, they also change faster than ever before. New materials are developed all the time, social patterns alter quickly, the culture itself is changing faster than it has ever changed before … To match the growing complexity of problems, there is a growing body of information and specialist experience … [but] not only is the quantity of information itself beyond the reach of single designers, but the various specialists who retail it are narrow and unfamiliar with the form-makers’ peculiar problems.”

(Alexander’s 1977 work “A Pattern Language: Towns, Buildings, Construction” is one of the most widely read books on urban design; it was also an enormous influence on the development of the computer software industry).

The physicist Geoffrey West has shown that this process is alive and well in cities today. As the world’s cities grow, life in them speeds up, and they create ideas and wealth more rapidly, leading to further growth. West has observed that, in a world with constrained resources, this process will lead to a catastrophic failure when demand for fresh water, food and energy outstrips supply – unless we change that process, and change the way that we consume resources in order to create rewarding lives for ourselves.
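West’s observation is often summarised as superlinear scaling: socio-economic outputs such as wealth and innovation grow as a power law of city population, with an exponent somewhat greater than one. The sketch below illustrates the consequence – doubling a city’s population more than doubles its output – using an exponent of 1.15 as an illustrative assumption, not a measured value:

```python
# Illustrative sketch of superlinear urban scaling:
# a socio-economic output Y grows as Y = c * N**beta, beta > 1.
# The constant c and exponent beta here are assumptions for illustration.

def scaled_output(population, c=1.0, beta=1.15):
    """Power-law scaling of a socio-economic quantity with population N."""
    return c * population ** beta

city = scaled_output(1_000_000)
doubled = scaled_output(2_000_000)

# Doubling the population more than doubles the output:
ratio = doubled / city
print(round(ratio, 2))  # 2**1.15, roughly 2.22
```

This is why growth feeds on itself in West’s account: each increment of population yields a disproportionate increment of activity, and with it a disproportionate demand for resources.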

There are two sides to that challenge: changing what we value; and changing how we create what we value from the resources around us.

(...)

(“Makers” at the Old Print Works in Balsall Heath, Birmingham, sharing the tools, skills, contacts and ideas that create successful small businesses in local communities)

The Transition movement, started by Rob Hopkins in Totnes in 2006, is tackling both parts of that challenge. “Transition Towns” are communities who have decided to act collectively to transition to a way of life which is less resource-intensive, and to value the characteristics of such lifestyles in their own right – where possible trading regionally, recycling and re-using materials and producing and consuming food locally.

The movement does not advocate isolation from the global industrial economy, but it does advocate that local, alternative products and services in some cases can be more sustainable than mass-produced commodities; that the process of producing them can be its own reward; and that acting at community level is for many people the most effective way to contribute to sustainability. From local currencies, to food-trading networks to community energy schemes, many “Smart” initiatives have emerged from the transition movement.

We will need the ideas and philosophy of Transition to create sustainable cities and communities – and without them we will fail. But those ideas alone will not create a sustainable world. With current technologies, for example, one hectare of highly fertile, intensively farmed land can feed 10 people. Birmingham, my home city, has an area of 60,000 hectares of relatively infertile land, most of which is not available for farming at all; and a population of around 1 million. Those numbers don’t add up to food self-sufficiency. And Birmingham is a very low-density city – between one-half and one-tenth as dense as the growing megacities of Asia and South America.
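The arithmetic behind that claim can be checked directly, using only the figures quoted above and the deliberately generous assumption that every hectare of the city were fertile farmland:

```python
# Back-of-envelope check of Birmingham's food self-sufficiency,
# using the figures quoted in the text.
people_fed_per_hectare = 10      # highly fertile, intensively farmed land
birmingham_hectares = 60_000     # total city area; mostly infertile in practice
population = 1_000_000

# Even if every hectare were farmable - which it is not - the city
# could feed at most:
max_fed = people_fed_per_hectare * birmingham_hectares
print(max_fed)                # 600000
print(max_fed >= population)  # False: the numbers don't add up
```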

Cities depend on vast infrastructures and supply-chains, and they create complex networks of transactions supported by transportation and communications. Community initiatives will adapt these infrastructures to create local value in more sustainable, resilient ways, and by doing so will reduce demand. But they will not affect the underlying efficiency of the systems themselves. And I do not personally believe that, in a world of 7 billion people in which resources and opportunity are distributed extremely unevenly, community initiatives alone will reduce demand significantly enough to achieve sustainability.

We cannot simply scale these systems up as the world’s population grows to 9 billion by 2050; we need to change the way they work. That means changing the technology they use, or changing the way they use technology. We need to make them smarter.

Six ways to design humanity and localism into Smart Cities

(Birmingham’s Social Media Cafe, where individuals from every part of the city share their experience using social media to promote their businesses and community initiatives. Photograph by Meshed Media)

The Smart Cities movement is sometimes criticised for appearing to focus mainly on the application of technology to large-scale city infrastructures such as smart energy grids and intelligent transportation.

It’s certainly vital that we manage and operate city services and infrastructure as intelligently as possible – there’s no other way to deal with the rapid urbanisation taking place in emerging economies; with the increasing demand for services such as health and social care in the developed world whilst city budgets shrink dramatically; or with the need for improved resilience in the face of climate change everywhere.

But to focus too much on this aspect of Smart Cities and to overlook the social needs of cities and communities risks forgetting what the full purpose of cities is: to enable a huge number of individual citizens to live not just safe, but rewarding lives with their families.

Maslow’s Hierarchy of Needs identifies our most basic requirements to be food, water, shelter and security. The purpose of many city infrastructures is to answer those needs, either directly (buildings, utility infrastructures and food supply chains) or indirectly (the transport systems that support us and the businesses that we work for).

Important as those needs are, though – particularly to the billions of people in the world for whom they are not reliably met – life would be dull and unrewarding if they were all that we aspired to.

Maslow’s hierarchy next relates the importance of family, friends and “self-actualisation” (which can crudely be described as the process of achieving things that we care about). These are the more elusive qualities that it’s harder to design cities to provide. But unless cities provide them, they will not be successful. At best they will be dull, unrewarding places to live and work, and will see their populations fall as those who can do so migrate elsewhere. At worst, they will create poverty, poor health and ultimately short, unrewarding lives.

A Smart City should not only be efficient, resilient and sustainable; it should improve all of these qualities of life for its citizens.

So how do we design and engineer them to do that?

(Maslow’s Hierarchy of Needs, image by Factoryjoe via Wikimedia Commons)

Tales of the Smart City

Stories about the people whose lives and businesses have been made better by technology tell us how we might answer that question.

In the Community Lover’s Guide to Birmingham, for example, Nick Booth describes the way his volunteer-led social media surgeries helped the Central Birmingham Neighbourhood Forum, Brandwood End Cemetery and Jubilee Debt Campaign to benefit from technology.

Another Birmingham initiative, the Northfield Ecocentre, crowdfunded £10,000 to support their “Urban Harvest” project. The funds helped the Ecocentre pick unwanted fruit from trees in domestic gardens in Birmingham and distribute it between volunteers, children’s centres, food bank customers and organisations promoting healthy eating; and to make some of it into jams, pickles and chutneys to raise money so that in future years the initiative can become self-sustaining.

In the village of Chale on the Isle of Wight, a community not served by the national gas network and with significant levels of fuel poverty, my colleague Andy Stanford-Clark has helped an initiative not only to deploy smart meters to measure the energy use of each household, but to co-design with residents how they will use that technology, so that the whole community feels a sense of ownership and inclusion in the initiative. The project has resulted in a significant drop in rent arrears as residents use the technology to reduce their utility bills, in some cases by up to 50 percent. Less obviously, the sense of shared purpose has extended to the creation of a communal allotment area in the village and a successful campaign to halve bus fares in the area.

There are countless other examples. Play Fitness “gamify” exercise to persuade children to get fit, and work very hard to ensure that their products are accessible to children in communities of any level of wealth. Casserole Club use social media to introduce people who can’t cook for themselves to people who are prepared to volunteer to cook for others. The West Midlands Collaborative Commerce Marketplace uses analytics technology to help its 10,000 member businesses win more than £4 billion in new contracts each year. … and so on.

None of these initiatives are purely to do with technology. But they all use technologies that simply were not available and accessible as recently as a few years ago to achieve outcomes that are important to cities and communities. By understanding how the potential of technology was apparent to the stakeholders in such initiatives, why it was affordable and accessible to them, and how they acquired the skills to exploit it, we can learn how to design Smart Cities in a way that encourages widespread grass-roots, localised innovation.

(Top: Birmingham’s Masshouse Circus roundabout, part of the inner-city ringroad that famously impeded the city’s growth until it was demolished. Photo by Birmingham City Council. Bottom: Pedestrian roundabout in Lujiazui, China, constructed over a busy road junction, is a large-scale city infrastructure that balances the need to support traffic flows through the city with the importance that Jane Jacobs first described of allowing people to walk freely about the areas where they live and work. Photo by ChrisUK)

A tale of two roundabouts

History tells us that we should not assume that it will be straightforward to design Smart Cities to achieve that objective, however.

A measure of our success in building the cities we know today from the generations of technology that shaped them – concrete, cars and lifts – is the variation in life expectancy across them. In the UK, it’s common for life expectancy to vary by around 20 years between the poorest and richest parts of the same city.

That staggering difference is the outcome of a complex set of issues including the availability of education and opportunity, lifestyle factors such as diet and exercise, and the accessibility of city services. But a significant influence on many of those issues is the degree to which the large-scale infrastructures built to support our physiological needs and the demands of the economy also create a high-quality environment for daily life.

The photographs above show two city transport infrastructures that are visually similar, but that couldn’t be more different in their influence on the success of the cities that they are part of.

The picture at the top shows Masshouse Circus in Birmingham in 2001 shortly before it was demolished. It was constructed in the 1960s as part of the city’s inner ring-road, intended to improve connectivity to the national economy through the road network. However, the impact of the physical barrier that it created to pedestrian traffic can be seen by the stark difference in land value inside and outside the “concrete collar” of the ring-road. Inside the collar, land is valuable enough for tall office blocks to be constructed on it; whilst outside it is of such low value that it is used as a ground-level carpark.

In contrast, the pedestrian roundabout in Lujiazui, China pictured at the bottom, constructed over a busy road junction, balances the need to support traffic flows through the city with the need for people to walk freely about the areas in which they live and work. As can be seen from the people walking all around it, it preserves the human vitality of an area that many busy roads flow through. 

We should take insight from these experiences when considering the design of Smart City infrastructures. Unless those infrastructures are designed to be accessible to and usable by citizens, communities and local businesses, they will be as damaging as poorly constructed buildings and poorly designed transport networks. If that sounds extreme, then consider the dangers of cyber-stalking, or the implications of the gun-parts confiscated from a suspected 3D printing gun factory in Manchester last year that had been created on general purpose machinery from digital designs shared through the internet. Digital technology has life and death implications in the real world.

For a start, we cannot take for granted that city residents have the basic ability to access the internet and digital technology. Some 18% of adults in the UK have never been online; and children today without access to the internet at home and in school are at an enormous disadvantage. As digital technology becomes even more pervasive and important, the impact of this digital divide – within and between people, cities and nations – will become more severe. This is why so many people care passionately about the principle of “Net Neutrality” – that the shared infrastructure of the internet provides the same service to all of its users; and does not offer preferential access to those individuals or corporations able to pay for it.

These issues are very relevant to cities and their digital strategies and governance. The operation of any form of network requires physical infrastructure such as broadband cables, wi-fi and 4G antennae and satellite dishes. That infrastructure is regulated by city planning policies. In turn, those planning policies are tools that cities can and should use to influence the way in which technology infrastructure is deployed by private sector service providers.

(Photograph of Aesop’s fable “The Lion and the Mouse” by Liz West)

Little and big

Cities are enormous places in which what matters most is that millions of individually small matters have good outcomes. They work well when their large scale systems support the fine detail of life for every one of their very many citizens: when “big things” and “little things” work well together.

A modest European or US city might have 200,000 to 500,000 inhabitants; a large one might have between one and ten million. The United Nations World Urbanisation Prospects 2011 revision recorded 23 cities with more than 10 million population in 2011 (only six of them in the developed world); and predicted that there would be nearly 40 by 2025 (only eight of them in the developed world – as we define it today). Overall, between now and 2050 the world’s urban population will double from 3 billion to 6 billion. 

A good example of the challenges that this enormous level of urbanisation is already creating is the supply of food. One hectare of highly fertile, intensively farmed land can feed 10 people. Birmingham, my home city, has an area of 60,000 hectares of relatively infertile land, most of which is not available for farming at all; and a population of around 1 million. Those numbers don’t add up to food self-sufficiency; and Birmingham is a very low-density city – between one-half and one-tenth as dense as the growing megacities of Asia and South America. Feeding the 7 to 10 billion people who will inhabit the planet between now and 2050, and the 3 to 6 billion of them that will live in dense cities, is certainly a challenge on an industrial scale.

In contrast, Casserole Club, the Northfield Eco-Centre, the Chale Project and many other initiatives around the world have demonstrated the social, health and environmental benefits of producing and distributing food locally. Understanding how to combine the need to supply food at city-scale with the benefits of producing it locally and socially could make a huge difference to the quality of urban lives.

The challenge of providing affordable broadband connectivity throughout cities demonstrates similar issues. Most cities and countries have not yet addressed that challenge: private sector network providers will not deploy connectivity in areas which are insufficiently economically active for them to make a profit, and Government funding is not yet sufficient to close the gap.

In his enjoyable and insightful book “Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia“, Anthony Townsend describes a grass-roots effort by civic activists to provide New York with free wi-fi connectivity. I have to admire the vision and motivation of those involved, but – rightly or wrongly, and as Anthony describes – wi-fi has ultimately evolved to be dominated by commercial organisations.

As technology continues to improve and to reduce in price, the balance of power between large, commercial, resource-rich institutions and small, agile, resourceful grassroots innovators will continue to change. Technologies such as Cloud Computing, social media, 3D printing and small-scale power generation are reducing the scale at which many previously industrial technologies are economically feasible; however, it will remain the case for the foreseeable future that many city infrastructures – physical and digital – will be large-scale, expensive affairs requiring the buying power and governance of city-scale authorities and the implementation resources of large companies.

But more importantly, neither small-scale nor large-scale solutions alone will meet all of our needs. Many areas in cities – usually those that are the least wealthy – haven’t yet been provided with wi-fi or broadband connectivity by either.  

(Cars in Frederiksberg, Copenhagen wishing to join a main road must give way to cyclists and pedestrians)

(A well designed urban interface between people and infrastructure. Cars in Frederiksberg, Copenhagen wishing to join a main road must give way to cyclists and pedestrians passing along it)

We need to find the middle ground between the motivations, abilities and cultures of large companies and formal institutions on one hand; and those of agile, local innovators and community initiatives on the other. The pilot project to provide broadband connectivity and help using the internet to Castle Vale in Birmingham is a good example of finding that balance.

And I am optimistic that we can find it more often. Whilst Anthony is rightly critical of approaches to designing and building city systems that are led by technology, or that overlook the down-to-earth and sometimes downright “messy” needs of people and communities in favour of unrealistic technocratic and corporate utopias, the reality of the people I know who are employed by large corporations on Smart City projects is that they are acutely aware of the limitations as well as the value of technology, and are passionately committed to the human value of their work. That passion is often reflected in their volunteered commitment to “civic hacking“, open data initiatives, the teaching of technology in schools and other activities that help the communities in which they live to benefit from technology.

But rather than relying on individual passion and integrity, how do we encourage and ensure that large-scale investments in city infrastructures and technology enable small-scale innovation, rather than stifle it?

Smart urbanism and massive/small innovation

I’ve taken enormous inspiration in recent years from the architect Kelvin Campbell, whose “Massive / Small” concept and theory of “Smart Urbanism” are based on the belief that successful cities emerge from physical environments that encourage “massive” amounts of “small”-scale innovation – the “lively, diversified city, capable of continual, close-grained improvement and change” that Jane Jacobs described in “The Death and Life of Great American Cities“.

We’ll have to apply similar principles in order for large-scale city technology infrastructures to support localised innovation and value-creation. But what are the practical steps that we can take to put those principles into practice?

Step 1: Make institutions accessible

There’s a very basic behaviour that most of us are quite bad at – listening. In particular, if the institutions of Smart Cities are to successfully create the environment in which massive amounts of small-scale innovation can emerge, then they must listen to and understand what local activists, communities, social innovators and entrepreneurs want and need.

Many large organisations – whether they are local authorities or private sector companies – are poor at listening to smaller organisations. Their decision-makers are very busy; and communications, engagement and purchasing occur through formally defined processes with legal, financial and confidentiality clauses that can be difficult for small or informal organisations to comply with. The more that we address these barriers, the more that our cities will stimulate and support small-scale innovation. One way to do so is through innovations in procurement; another is through the creation of effective engagement programmes, such as the Birmingham Community Healthcare Trust’s “Healthy Villages” project, which is listening to communities expressing their need for support for health and wellbeing. This is why IBM started our “Smarter Cities Challenge”, which has engaged hundreds of IBM’s Executives and technology experts in addressing the opportunities and challenges of city communities; and in so doing immersed them in very varied urban cultures, economies and issues.

But listening is also a personal and cultural attitude. For example, in contrast to the current enthusiasm for cities to make as much data as possible available as “open data”, the Knight Foundation counsel a process of engagement and understanding between institutions and communities, in order to identify the specific information and resources that can be most usefully made available by city institutions to individual citizens, businesses and social organisations.

(Delegates at Gov Camp 2013 at IBM’s Southbank office, London. Gov Camp is an annual conference which brings together anyone interested in the use of digital technology in public services. Photo by W N Bishop)

In IBM, we’ve realised that it’s important to us to engage with, listen to and support small-scale innovation in its many forms when helping our customers and partners pursue Smarter City initiatives; from working with social enterprises, to supporting technology start-ups through our Global Entrepreneur Programme, to engaging with the open data and civic hacking movements.

More widely, it is often talented, individual leaders who overcome the barriers to engagement and collaboration between city institutions and localised innovation. In “Resilience: why things bounce back“, Andrew Zolli describes many examples of initiatives that have successfully created meaningful change. A common feature is the presence of an individual who shows what Zolli calls “translational leadership”: the ability to engage with both small-scale, informal innovation in communities and large-scale, formal institutions with resources.

Step 2: Make infrastructure and technology accessible

Whilst we have a long way to go to address the digital divide, Governments around the world recognise the importance of access to digital technology and connectivity; and many are taking steps to address it, such as Australia’s national deployment of broadband internet connectivity and the UK’s Urban Broadband Fund. However, in most cases, those programmes are not sufficient to provide coverage everywhere.

Some businesses and social initiatives are seeking to address this shortfall. CommunityUK, for example, are developing sustainable business models for providing affordable, accessible connectivity, and assistance using it, and are behind the Castle Vale project in Birmingham. And some local authorities, such as Sunderland and Birmingham, have attempted to provide complete coverage for their citizens – although just how hard it is to achieve that whilst avoiding anti-competition issues is illustrated by Birmingham’s subsequent legal challenges.

We should also tap into the enormous sums spent on the physical regeneration of cities and development of property in them. As I first described in June last year, while cities everywhere are seeking funds for Smarter City initiatives, and often relying on central government or research grants to do so, billions of Pounds, Euros, and Dollars are being spent on relatively conventional property development and infrastructure projects that don’t contribute to cities’ technology infrastructures or “Smart” objectives.

Local authorities could use planning regulations to steer some of that investment into providing Smart infrastructure, basic connectivity, and access to information from city infrastructures to citizens, communities and businesses. Last year, I developed a set of “Smart City Design Principles” on behalf of a city Council considering such an approach, including:

Principle 4: New or renovated buildings should be built to contain sufficient space for current and anticipated future needs for technology infrastructure such as broadband cables; and of materials and structures that do not impede wireless networks. Spaces for the support of fixed cabling and other infrastructures should be easily accessible in order to facilitate future changes in use.

Principle 6: Any development should ensure wired and wireless connectivity is available throughout it, to the highest standards of current bandwidth, and with the capacity to expand to any foreseeable growth in that standard.

(The Birmingham-based Droplet smartphone payment service, now also operating in London, is a Smart City start-up that has won backing from Finance Birmingham, a venture capital company owned by Birmingham City Council)

Step 3: Support collaborative innovation

Small-scale, local innovations will always take place, and many of them will be successful; but they are more likely to have significant, lasting, widespread impact when they are supported by city institutions with resources.

That support might vary from introducing local technology entrepreneurs to mentors and investors through the networks of contacts of city leaders and their business partners; through to practical assistance for social enterprises, helping them to put in place very basic but costly administration processes to support their operations.

City institutions can also help local innovations to thrive simply by becoming their customers. If Councils, Universities and major local employers buy services from innovative local providers – whether they be local food initiatives such as the Northfield Ecocentre or high-tech innovations such as Birmingham’s Droplet smartphone payment service – then they provide direct support to the success of those businesses.

In Birmingham, for example, Finance Birmingham (a Council-owned venture capital company) and the Entrepreneurs for the Future (e4F) scheme provide real, material support to the city’s innovative companies; whilst Bristol’s Mayor George Ferguson and Lambeth Council both support their local currencies by allowing salaries to be paid in them.

It becomes more obvious why stakeholders in a city might become involved in collaborative innovation when they have the opportunity to co-create a clear set of shared priorities. Those priorities can be compared to the objectives of innovative proposals seeking support, whether from social initiatives or businesses; used as the basis of procurement criteria for goods, services and infrastructure; set as the objectives for civic hacking and other grass-roots creative events; or even used as the criteria for funding programmes for new city services, such as the “Future Streets Incubator” that will shortly be launched in London as a result of the Mayor of London’s Roads Task Force.

In this context, businesses are not just suppliers of products and services, but also local institutions with significant supply chains, carbon and economic footprints, purchasing power and a huge number of local employees. There are many ways such organisations can play a role in supporting the development of an open, Smarter, more sustainable city.

The following “Smart City Design Principles” promote collaborative innovation in cities by encouraging support from development and regeneration initiatives:

Principle 12: Consultations on plans for new developments should fully exploit the capabilities of social media, virtual worlds and other technologies to ensure that communities affected by them are given the widest, most immersive opportunity possible to contribute to their design.

Principle 13: Management companies, local authorities and developers should have a genuinely engaging presence in social media so that they are approachable informally.

Principle 14: Local authorities should support awareness and enablement programmes for social media and related technologies, particularly “grass roots” initiatives within local communities.

Step 4: Promote open systems

A common principle between the open data movement; civic hacking; localism; the open government movement; and those who support “bottom-up” innovations in Smart Cities is that public systems and infrastructure – in cities and elsewhere – should be “open”. That might mean open and transparent in their operation; accessible to all; or providing open data and API interfaces to their technology systems so that citizens, communities and businesses can adapt them to their own needs. Even better, it might mean all of those things.

The “Dublinked” information sharing partnership, in which Dublin City Council, three surrounding County Councils and service providers to the city share information and make it available to their communities as “open data”, is a good example of the benefits that openness can bring. Dublinked now makes 3,000 datasets available to local authority analysts; to researchers from IBM Research and the National University of Ireland; and to businesses, entrepreneurs and citizens. The partnership is identifying new ways for the city’s public services and transport, energy and water systems to work; and enabling the formation of new, information-based businesses with the potential to export the solutions they develop in Dublin to cities internationally. It is putting the power of technology and of city information not only at the disposal of the city authority and its agencies, but also into the hands of communities and innovators.
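
Working with a dataset of this kind typically starts with something as simple as downloading a file and aggregating it. The sketch below is illustrative only – the column names and figures are invented for the example, not Dublinked’s actual schema:

```python
import csv
import io

# An inline sample standing in for a downloaded open-data file; the
# column names and values are invented, not Dublinked's actual schema.
SAMPLE_CSV = """\
stop_id,route,arrivals_per_hour
1001,46A,12
1002,46A,9
1003,145,15
"""

def busiest_stop(csv_text):
    """Return the (stop_id, arrivals) pair with the most arrivals per hour."""
    rows = csv.DictReader(io.StringIO(csv_text))
    best = max(rows, key=lambda row: int(row["arrivals_per_hour"]))
    return best["stop_id"], int(best["arrivals_per_hour"])

print(busiest_stop(SAMPLE_CSV))  # ('1003', 15)
```

The point is less the ten lines of code than the fact that, once data is published openly, analysis like this is available to any citizen or small business, not only to the authority that collected it.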

(I was delighted this year to join Innovation Birmingham as a non-Executive Director in addition to my role with IBM. Technology incubators – particularly those, like Innovation Birmingham and Sunderland Software City, that are located in city centres – are playing an increasingly important role in making the support of city institutions and major technology corporations available to local communities of entrepreneurs and technology activists)

In a digital future, the more that city infrastructures and services provide open data interfaces and APIs, the more that citizens, communities and businesses will be able to adapt the city to their own needs. This is the modern equivalent of the grid system that Jane Jacobs promoted as the most adaptable urban form. A grid structure is the basis of Edinburgh’s “New Town”, often regarded as a masterpiece of urban planning that has proved adaptable and successful through the economic and social changes of the past 250 years, and is also the starting point for Kelvin Campbell’s work.

But open data interfaces and APIs will only be widely exploitable if they conform to common standards. In order to make it possible to do something as simple as changing a lightbulb, we rely on open standards for the levels of voltage and power from our electricity supply; the physical dimensions of the socket and bulb and the characteristics of their fastenings; specifications of the bulb’s light and heat output; and the tolerance of the bulb and the fitting for the levels of moisture found in bathrooms and kitchens. Cities are much more complicated than lightbulbs; and many more standards will be required in order for us to connect to and re-configure their systems easily and reliably.

Open standards are also an important tool in avoiding city systems becoming “locked-in” to any particular supplier. By specifying common characteristics that all systems are required to demonstrate, it becomes more straightforward to exchange one supplier’s implementation for another.

Some standards that Smarter City infrastructures can use are already in place – for example, Web services and REST that specify the general ways in which computer systems interact, and the Common Alerting Protocol which is more specific to interactions between systems that monitor and control the physical world. But many others will need to be invented and encouraged to spread. The City Protocol Society is one organisation seeking to develop those new standards; and the British Standards Institute recently published the first set of national standards for Smarter Cities in the UK, including a standard for the interoperability of data between Smart City systems.
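To make the role of such standards concrete, the sketch below builds a minimal alert message using the structure defined by the Common Alerting Protocol (CAP) 1.2. It includes only a handful of the standard’s elements, and the identifiers and addresses are invented; its value is that any CAP-conformant system, from any supplier, can parse the result:

```python
import xml.etree.ElementTree as ET

# The CAP 1.2 XML namespace defined by the OASIS standard.
CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"

def make_alert(identifier, sender, sent, event, area_desc):
    """Build a minimal CAP 1.2 alert document as an XML string.
    Illustrative only: a complete implementation covers many more elements."""
    alert = ET.Element("{%s}alert" % CAP_NS)
    for tag, text in [("identifier", identifier), ("sender", sender),
                      ("sent", sent), ("status", "Actual"),
                      ("msgType", "Alert"), ("scope", "Public")]:
        ET.SubElement(alert, "{%s}%s" % (CAP_NS, tag)).text = text
    info = ET.SubElement(alert, "{%s}info" % CAP_NS)
    ET.SubElement(info, "{%s}event" % CAP_NS).text = event
    area = ET.SubElement(info, "{%s}area" % CAP_NS)
    ET.SubElement(area, "{%s}areaDesc" % CAP_NS).text = area_desc
    return ET.tostring(alert, encoding="unicode")

# Hypothetical flood warning for a city-centre ward.
xml_text = make_alert("flood-2013-11-28-001", "floods@example-city.gov",
                      "2013-11-28T09:00:00-00:00", "River flooding",
                      "City centre riverside wards")
root = ET.fromstring(xml_text)
print(root.findtext("{%s}identifier" % CAP_NS))  # flood-2013-11-28-001
```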

Some open source technologies will also be pivotal. Open source software (whose source code is freely available to anyone, and which is often written by unpaid volunteers) is not the same as open standards (independently governed conventions that define the way that technology from any provider behaves). But some open source technologies are so widely used to operate the internet infrastructures that we have become accustomed to – the “LAMP” stack of operating system, web server, database and web programming language, for example – that they are “de facto” standards, conveying some of the benefits of wide usability and interoperability that open standards bring. For example, IBM recently donated MQTT, a protocol for exchanging information between small devices such as sensors and actuators in Smart City systems, to the open source community, and it is becoming increasingly widely adopted as a consequence.
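
Part of MQTT’s appeal is its simple publish/subscribe model: devices publish to hierarchical topics, and subscribers match them with the wildcards “+” (exactly one level) and “#” (all remaining levels). The toy function below sketches just those matching rules; a real client library such as Eclipse Paho implements the full specification:

```python
def topic_matches(pattern, topic):
    """Toy MQTT topic-filter matching: '+' matches exactly one topic level,
    '#' (which must come last) matches all remaining levels, including none.
    Illustrative only -- the full MQTT specification has further rules
    (for example, for topics beginning with '$')."""
    p_parts = pattern.split("/")
    t_parts = topic.split("/")
    for i, part in enumerate(p_parts):
        if part == "#":
            return True          # multi-level wildcard swallows the rest
        if i >= len(t_parts):
            return False         # pattern is longer than the topic
        if part != "+" and part != t_parts[i]:
            return False         # a literal level must match exactly
    return len(p_parts) == len(t_parts)

print(topic_matches("city/+/temperature", "city/sensor42/temperature"))  # True
print(topic_matches("city/#", "city/streetlights/zone3/status"))         # True
print(topic_matches("city/+", "city/a/b"))                               # False
```

This topic structure is what lets a city system subscribe to, say, every temperature sensor in the city with a single filter, without knowing in advance which devices exist.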

Once again, local authorities can contribute to the adoption of open standards through planning frameworks and procurement practices:

Principle 7: Any new development should demonstrate that all reasonable steps have been taken to ensure that information from its technology systems can be made openly available without additional expenditure. Whether information is actually made available will depend on commercial and legal agreements, but it should not be subject to additional, unreasonable expenditure. And where there is no compelling commercial or legal reason to keep data closed, it should actually be made open.

Principle 8: The information systems of any new development should conform to the best available current standards for interoperability between IT systems in general; and for interoperability in the built environment, physical infrastructures and Smarter Cities specifically.

(The town plan for Edinburgh’s New Town, clearly showing the grid structure that gives rise to the adaptability that it is famous for showing for the past 250 years. Image from the JR James archive)

Finally, design skills will be crucial both to creating interfaces to city infrastructures that are truly useful and that encourage innovation; and in creating innovations that exploit them that in turn are useful to citizens.

At the technical level, there is already a rich corpus of best practice in the design of interfaces to technology systems and in the architecture of the technology infrastructures that provide them.

But the creativity that imagines new ways to use these capabilities in business and in community initiatives will also be crucial. The new academic discipline of “Service Science” describes how designers can use technology to create new value in local contexts; and treats services such as open data and APIs as “affordances” – capabilities of infrastructure that can be adapted to the needs of an individual. In the creative industries, “design thinkers” apply their imagination and skills to similar subjects.

Step 5: Provide common services

At the 3rd EU Summit on Future Internet, Juanjo Hierro, Chief Architect for the FI-WARE “future internet platform” project, identified the specific tools that local innovators need in order to exploit city information infrastructures. They include real-time access to information from physical city infrastructures; tools for analysing “big data”; and access to technologies to ensure privacy and trust.

The Dublinked information sharing partnership is already putting some of these ideas into practice. It provides assistance to innovators in using, analysing and visualising data; and now makes available real-time data showing the location and movements of buses in the city. The partnership is based on specific governance processes that protect data privacy and manage the risk associated with sharing data.

As we continue to engage with communities of innovators in cities, we will discover further requirements of this sort. Imperial College’s “Digital Cities Exchange” research programme is investigating the specific digital services that could be provided as enabling infrastructure to support innovation and economic growth in cities, for example. And the British Standards Institute’s Smart Cities programme includes work on standards that will enable small businesses to benefit from Smart City infrastructure.

Local authorities can adapt planning frameworks to encourage the provision of these services:

Principle 9: New developments should demonstrate that they have considered the commercial viability of providing the digital civic infrastructure services recommended by credible research sources.

Step 6: Establish governance of the information economy

From the exponential growth in digital information we’ve seen in recent years, to the emergence of digital currencies such as Bitcoin, to the disruption of traditional industries by digital technology; it’s clear that we are experiencing an “information revolution” just as significant as the “industrial revolution” of the 18th and 19th centuries. We often refer to the resulting changes to business and society as the development of an “information economy”.

But can we speak with confidence of an information economy when the basis for establishing the ownership and value of its fundamental resource – digital information – is not yet properly established?

(Our gestures when using smartphones may be directed towards the phones, or the people we are communicating with through them; but how are they interpreted by the people around us? “Oh, yeah? Well, if you point your smartphone at me, I’m gonna point my smartphone at you!” by Ed Yourdon)

A great deal of law and regulation already applies to information, of course – such as the European Union’s data privacy legislation. But practice in this area is far less established than the laws governing the ownership of physical and intellectual property and the behaviour of the financial system that underlie the rest of the economy. This is evident in the repeated controversies concerning the use of personal information by social media businesses, consumer loyalty schemes, healthcare providers and telecommunications companies.

The privacy, security and ownership of information, especially personal information, are perhaps the greatest challenges of the digital age. But that is also a reflection of their importance to all aspects of our lives. Jane Jacobs’ description of urban systems in terms of human and community behaviour was based on those concepts, and is still regarded as the basis of our understanding of cities. New technologies for creating and using information are developing so rapidly that it is not only laws specifically concerning them that are failing to keep up with progress; laws concerning the other aspects of city systems that technology is transforming are failing to adapt quickly enough too.

A start might be to adapt city planning regulations to reflect and enforce the importance of the personal information that will be increasingly accessed, created and manipulated by city systems:

Principle 21: Any information system in a city development should provide a clear policy for the use of personal information. Any use of that information should be with the consent of the individual.
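
In software terms, Principle 21 implies a default-deny rule: no recorded consent means no use. A hypothetical sketch – the data structure, identifiers and purposes are invented for illustration:

```python
# A hypothetical consent register for a city information system; the
# structure, identifiers and purposes here are invented for illustration.
consents = {
    "citizen-42": {"journey-planning": True, "targeted-advertising": False},
}

def may_use(citizen_id, purpose):
    """Default-deny: personal data may be used only where explicit,
    recorded consent exists for that citizen and that purpose."""
    return consents.get(citizen_id, {}).get(purpose, False)

print(may_use("citizen-42", "journey-planning"))  # True
print(may_use("citizen-99", "journey-planning"))  # False: no record, no use
```

The important design choice is the direction of the default: an unknown citizen or an unlisted purpose yields “no”, so consent must be granted explicitly rather than assumed.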

The triumph of the commons

I wrote last week that Smarter Cities should be a “middle-out” economic investment – in other words, an investment in common interests – and compared them to the Economist’s report on the efforts involved in distributing the benefits of the industrial revolution to society at large rather than solely to business owners and the professional classes.

One of the major drivers for the current level of interest in Smarter Cities and technology is the need for us to adapt to a more sustainable way of living in the face of rising global populations and finite resources. At large scale, the resources of the world are common; and at local scale, the resources of cities are common too.

For four decades, it has been widely assumed that those with access to common resources will exploit them for short-term gain at the expense of long-term sustainability – this is the “tragedy of the commons” first described by the ecologist Garrett Hardin. But in 2009, Elinor Ostrom won the Nobel Prize for economics by demonstrating that the “tragedy” could be avoided, and that a community could manage and use shared resources in a way that was sustainable in the long term.

Ostrom’s conceptual framework for managing common resources successfully is a set of criteria for designing “institutions” that consist of people, processes, resources and behaviours. These need not necessarily be formal political or commercial institutions; they can also be social structures. It is interesting to note that some of those criteria – for example, the need for mechanisms of conflict resolution that are local, public, and accessible to all the members of a community – are reflected in the development over the last decade of effective business models for carrying out peer-to-peer exchanges using social media, supported by technologies such as reputation systems.

Of course, there are many people and communities who have championed and practised the common ownership of resources regardless of the supposed “tragedy” – not least those involved in the Transition movement founded by Rob Hopkins, which has developed a rich understanding of how to successfully change communities for the better using good ideas; or the translational leaders described by Andrew Zolli. But Elinor Ostrom’s ideas are particularly interesting because they could help us to link the design, engineering and governance of Smarter Cities to the achievement of sustainable economic and social objectives based on the behaviour of citizens, communities and businesses.

Combined with an understanding of the stories of people who have improved their lives and communities using technology, I hope that the work of Kelvin Campbell, Rob Hopkins, Andrew Zolli, Elinor Ostrom and many others can inspire technologists, urban designers, architects and city leaders to develop future cities that fully exploit modern technology to be efficient, resilient and sustainable; but that are also the best places to live and work that we can imagine, or that we would wish for our children.

Cities created by people like that really would be Smart.

Information and choice: nine reasons our future is in the balance

(The Bandra pedestrian skywalk in Mumbai, photo taken from the Collaborative Research Initiative Trust‘s study of Mumbai, “Being Nicely Messy“, produced for the 2012 Audi Urban Futures awards)

The 19th and 20th centuries saw the flowering and maturation of the Industrial Revolution and the creation of the modern world. Standards of living worldwide increased dramatically as a consequence – though so did inequality.

The 21st century is already proving to be different. We are reaching the limits of supply of the natural resources and cheap energy that supported the last two centuries of development; and are starting to widely exploit the most powerful man-made resource in history: digital information.

Our current situation isn’t simply an evolution of the trends of the previous two centuries; nine “tipping points” in economics, society, technology and the environment indicate that our future will be fundamentally different to the past, not just different by degree.

Three of those tipping points represent changes that are happening as the ultimate consequences of the Industrial Revolution and the economic globalisation and population growth it created; three of them are the reasons I think it’s accurate to characterise the changes we see today as an Information Revolution; and the remaining three represent challenges for us to face in the future.

The difficulty faced in addressing those challenges internationally through global governance institutions is illustrated by the current status of world trade deal and climate change negotiations; but our ability to respond to them is not limited to national and international governments. It is in the hands of businesses, communities and each of us as individuals as new business models emerge.

The structure of the economy is changing

In 2012, the Collaborative Research Initiatives Trust were commissioned by the Audi Urban Futures Awards to develop a vision for the future of work and life in Mumbai. In the introduction to their report, “Being Nicely Messy“, they cite a set of statistics describing Mumbai’s development that nicely illustrate the changing nature of the city:

“While the population in Mumbai grew by 25% between 1991 and 2010, the number of people travelling by trains during the same years increased by 66% and the number of vehicles grew by 181%. At the same time, the number of enterprises in the city increased by 56%.

All of this indicates a restructuring of the economy, where the nature of work and movement has changed.”

(From “Being Nicely Messy“, 2011, Collaborative Research Initiatives Trust)

Following CRIT’s inspiration, over the last year I’ve been struck by several similar but more widely applicable sets of data that, taken together, indicate that a similar restructuring is taking place across the world.

(Professor Robert Gordon’s analysis of historic growth in productivity, as discussed by the famous investor Jeremy Grantham, showing that the unusual growth experienced through the Industrial Revolution may have come to an end. Source: Gordon, Robert J., “Is U.S. Economic Growth Over? Faltering Innovation Confronts the Six Headwinds,” NBER Working Paper 18315, August 2012)

The twilight of the Industrial Revolution

Tipping point 1: the slowing of economic growth

According to the respected investor Jeremy Grantham, economic growth has slowed systemically and permanently. He states that: “Resource costs have been rising, conservatively, at 7% a year since 2000 … in a world growing at under 4% and [in the] developed world at under 1.5%”

Grantham’s analysis is that the rapid economic growth of the last century was a historical anomaly, driven by the productivity improvements made possible by the Industrial Revolution before that revolution reached such a scale as to create global competition for resources and energy. Property and technology bubbles extended that growth into the early 21st Century, but it has now reduced to much more modest levels, where Grantham expects it to remain. The economist Tyler Cowen came to similar conclusions in his 2011 book, “The Great Stagnation”.

This analysis was supported by the property developers I met at a recent conference in Birmingham. They told me that indicators in their market today are the most positive they have been since the start of the 1980s property boom; but none of them expect that boom to be repeated. The market is far more cautious concerning medium and long-term prospects for growth.

We have passed permanently into an era of more modest economic growth than we have become accustomed to; or at very least into an era whereby we need to restructure the relationship between economic growth and the consumption of resources and energy in ways that we have not yet determined before higher growth does return. We have passed a tipping point; the world has changed.

(Growth in the world’s urban population as reported by “World Urbanization Prospects”, 2007 Revision, Department of Economic and Social Affairs, United Nations)

Tipping point 2: urbanisation and the industrialisation of food supply 

As has been widely quoted in recent years, more than half the world’s population has lived in cities since 2010 according to the United Nations Department of Economic and Social Affairs. That percentage is expected to increase to 70% by 2050.

The implications of those facts concern not just where we live, but the nature of the economy. Cities became possible when we industrialised the production and distribution of food, rather than providing it for ourselves on a subsistence basis; or producing it in collaboration with our neighbours. For this reason, many developing nations still undergoing urbanisation and industrialisation – such as Tanzania, Turkmenistan and Tajikistan – still formally define cities by criteria including “the pre-dominance of non-agricultural workers and their families” (as referenced in the United Nations’ “World Urbanization Prospects” 2007 Revision).

So for the first time more than half the world’s population now lives in cities; and is provided with food by industrial supply chains rather than by families or neighbours. We have passed a tipping point; the world has changed.

(Estimated damage in $US billion caused by natural disasters between 1900 and 2012 as reported by EM-DAT)

Tipping point 3: the frequency and impact of extreme weather conditions

As our climate changes, we are experiencing more unusual and extreme weather. In addition to the recent devastating impact of Typhoon Haiyan in the Philippines, cities everywhere are regularly experiencing the effects to a more modest degree.

One city in the UK told me recently that in the last 12 months they have dealt with such an increase in incidents of flooding severe enough to require coordinated cross-city action that it has become an urgent priority for local Councillors. We are working with other cities in Europe to understand the effect of rising average flood levels – historic building construction codes mean that a rise in average levels of a metre or more could put significant numbers of buildings at risk of collapse. The current prediction from the United Nations Intergovernmental Panel on Climate Change is that sea levels will rise somewhere between 26cm and 82cm by the end of this century – close enough for concern.

The EM-DAT International Disasters Database has calculated the financial impact of natural disasters over the past century. It shows that in recent years the increased occurrence of unusual and extreme weather, combined with the increasing concentration of populations and economic activity in cities, has caused this impact to rise at unprecedented rates.

The investment markets have identified and responded to this trend. In their recent report “Global Investor Survey on Climate Change”, the Global Investor Coalition on Climate Change reported this year that 53% of fund managers collectively responsible for $14 trillion of assets indicated that they had divested stocks, or chosen not to invest in stocks, due to concerns over the impact of climate change on the businesses concerned. We have passed a tipping point; the world has changed.

(The prediction of exponential growth in digital information from EMC’s Digital Universe report)

The dawn of the Information Revolution

Tipping point 4: exponential growth in the world’s most powerful man-made resource, digital information

Information has always been crucial to our world. Our use of language to share it is arguably a defining characteristic of what it means to be human; it is the basis of monetary systems for mediating the exchange of goods and services; and it is a core component of quantum mechanics, one of the most fundamental physical theories that describes how our universe behaves.

But the emergence of broadband and mobile connectivity over the last decade has utterly transformed the quantity of recorded information in the world and our ability to exploit it.

EMC’s Digital Universe report shows that between 2010 and 2012 more information was recorded than in all of previous human history. They predict that the quantity of information recorded will double every 2 years, meaning that at any point in the next two decades it will remain true to assert that “more information was recorded in the last two years than in all of previous history”. In 2011, McKinsey described the “information economy” that has emerged to exploit this information as a fundamental shift in the basis of the economy as a whole.
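
That assertion is a simple consequence of the arithmetic of doubling: if the cumulative stock of information doubles every two years, then the amount created in the most recent two-year period always equals everything recorded before it. A quick check:

```python
# If the cumulative stock of recorded information doubles every 2 years,
# the information created in the latest 2-year period always equals the
# total of all previous history: 2^n - 2^(n-1) == 2^(n-1).
stock = [2 ** n for n in range(12)]  # cumulative stock at 2-year intervals
for previous, current in zip(stock, stock[1:]):
    created_in_last_period = current - previous
    assert created_in_last_period == previous  # equals all prior history
print("the assertion renews itself every two years")
```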

Not only that, but information has literally been turned into money. The virtual currency Bitcoin is based not on the value of a raw material such as gold, whose availability is physically limited, but on the outcomes of extremely complex cryptographic calculations whose performance is limited by the speed at which computers can process information. The value of Bitcoins is currently rising incredibly quickly – from $20 to $1,000 since January – although it is also subject to significant fluctuations.

Ultimately, Bitcoin itself may succeed or fail – and it is certainly used in some unethical and dangerous transactions as well as by ordinary people and businesses. But its model has demonstrated in principle that a decentralised, non-national, information-based currency can operate successfully, as my colleague Richard Brown recently explained.
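
Those “extremely complex cryptographic calculations” are a proof-of-work scheme: miners search for a number (a “nonce”) that makes the hash of a block meet a target. The toy sketch below illustrates the idea with a far easier target; real Bitcoin mining applies double SHA-256 to a binary block header and compares against a vastly harder numeric threshold:

```python
import hashlib

def mine(block_data, difficulty=2):
    """Toy proof-of-work: find a nonce such that the SHA-256 hex digest of
    block_data + nonce starts with `difficulty` zero characters.
    Illustrative only -- not Bitcoin's actual block or hashing format."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

nonce, digest = mine("example-block")
print(nonce, digest)
```

Finding the nonce takes many attempts, but anyone can verify it with a single hash; that asymmetry is what lets a decentralised network agree on who did the work.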

Digital information is the most valuable man-made resource ever invented; it began a period of exponential growth just three years ago and has literally been turned into money. We have passed a tipping point; the world has changed.

Tipping point 5: the disappearing boundary between humans, information and the physical world

In the 1990s the internet began to change the world despite the fact that it could only be accessed by using an expensive, heavy personal computer; a slow and inconvenient telephone modem; and the QWERTY keyboard that was designed in the 19th Century to prevent typists from typing faster than the levers in mechanical typewriters could move.

Three years ago, my then 2-year-old son taught himself how to use a touchscreen tablet to watch cartoons from around the world before he could read or write. Two years ago, scientists at the University of California at Berkeley used a Magnetic Resonance Imaging facility to capture images from the thoughts of a person watching a film. A less sensitive mind-reading technology is already available as a headset from Emotiv, which my colleagues in IBM’s Emerging Technologies team have used to help a paralysed person communicate by thinking directional instructions to a computer.

Earlier this year, a paralysed woman controlled a robotic arm by thought; and prosthetic limbs, a working gun and living biological structures such as muscle fibre and skin are just some of the things that can be 3D printed on demand from raw materials and digital designs.

Our thoughts can control information in computer systems; and information in those systems can quite literally shape the world around us. The boundaries between our minds, information and the physical world are disappearing. We have passed a tipping point; the world has changed.

(A personalised prosthetic limb constructed using 3D printing technology. Photo by kerolic)

Tipping point 6: the miniaturisation of industry

The emergence of the internet as a platform for enabling sales, marketing and logistics over the last decade has enabled small and micro-businesses to reach markets across the world that were previously accessible only to much larger organisations with international sales and distribution networks.

More recently, the emergence and maturation of technologies such as 3D printing, open-source manufacturing and small-scale energy generation are enabling small businesses and community initiatives to succeed in new sectors by reducing the scale at which it is economically viable to carry out what were previously industrial activities – a trend recently labelled by the Economist magazine as the “Third Industrial Revolution”. The continuing development of social media and pervasive technology enables them to rapidly form and adapt supply and exchange networks with other small-scale producers and consumers.

Estimates of the size of the resulting “sharing economy”, defined by Wikipedia as “economic and social systems that enable shared access to goods, services, data and talent”, vary widely, but are certainly significant. The UK Economist magazine reports one estimate that it is already a $26 billion economy, whilst the 2 Degrees Network report that just one aspect of it – small-scale energy generation – could save UK businesses £33 billion annually by 2030. Airbnb – a peer-to-peer accommodation service – recently reported that it had contributed $632 million in value to New York’s economy in 2012, by enabling nearly 5,000 residents to earn an average of $7,500 by renting their spare rooms to travellers, and through those travellers additionally spending an average of $880 in the city during their stay. Overall, there has been a significant rise in self-employment and “micro-entrepreneurial” enterprises over the last few years, which now account for 14% of the US economy.

Organisations participating in the sharing economy exhibit a range of motivations and ethics – some are aggressively commercial, whilst others are “social enterprises” with a commitment to reinvest profits in social growth. The social enterprise sector, comprised of mutuals, co-operatives, employee-owned businesses and enterprises who submit to “triple bottom line” accounting of financial, social and environmental capital, is about 15% of the value of most economies, and has been growing and creating jobs faster than traditional business since the 2008 crash.

In the first decade of the 21st Century, mobile and internet technologies caused a convergence between the technology, communications and media sectors of the economy. In this decade, we will see far more widespread disruptions and convergences in the technology, manufacturing, creative arts, healthcare and utilities industries; and enormous growth in the number of small and social enterprises creating innovative business models that cut across them. We have passed a tipping point; the world has changed.

Rebalancing the world

Tipping point 7: how we respond to climate change and resource constraints

There is now agreement amongst scientists, expressed most conclusively by the United Nations Intergovernmental Panel on Climate Change this year, that the world is undergoing a period of overall warming resulting from the impact of human activity. But there is not yet a consensus on how we should respond.

Views vary from taking immediate, sweeping measures to drastically cut carbon and greenhouse gas emissions, to the belief that we should accept climate change as inevitable and focus investment instead on adapting to it, as suggested by the “Skeptical Environmentalist” Bjørn Lomborg and the conservative think-tank the American Enterprise Institute. As a result of this divergence of opinion, and of the challenge of negotiating between the interests of countries, communities and businesses across the world, the agreement reached by last year’s climate change negotiations in Doha was generally regarded as relatively weak.

Professor Chris Rogers of the University of Birmingham and his colleagues in the Urban Futures initiative have assessed over 450 proposed future scenarios and identified four archetypes (described in his presentation to Base Cities Birmingham) against which they assess the cost and effectiveness of environmental and climate interventions. The “Fortress World” scenario is divided between an authoritarian elite who control the world’s resources from their protected enclaves and a wider population living in poverty. In “Market Forces”, free markets encourage materialist consumerism to wholly override social and environmental values; whilst in “Policy Reform” a combination of legislation and citizen behaviour change achieve a balanced outcome. And in the “New Sustainability Paradigm” the pursuit of wealth gives way to a widespread aspiration to achieve social equality and environmental sustainability. (Chris is optimistic enough that his team dismissed another scenario, “Breakdown”, as unrealistic).

Decisions that are taken today affect the degree to which our world will evolve to resemble those scenarios. As the impact of weather and competition for resources affect the stability of supply of energy and food, many cities are responding to the relative lack of national and international action by taking steps themselves. Some businesses are also building strategies for long-term success and profit growth around sustainability; in part because investing in a resilient world is a good basis for a resilient business, and in part because they believe that a genuine commitment to sustainability will appeal to consumers. Unilever demonstrated that they are following this strategy recently by committing to buy all of their palm oil – of which they consume one third of the world’s supply – from traceable sources by the end of 2014.

At some point, we will all – individuals, businesses, communities, governments – be forced to change our behaviour to account for climate change and the limits of resource availability: as the prices of raw materials, food and energy rise; and as we are more and more directly affected by the consequences of a changing environment.

The questions are: to what extent have these challenges become urgent to us already; and how and when will we respond?

(“Makers” at the Old Print Works in Balsall Heath, Birmingham, sharing the tools, skills and ideas that create successful small businesses)

Tipping point 8: the end of the average career

In “Average Is Over“, the economist Tyler Cowen observed that about 60% of the jobs lost during the 2008 recession were in mid-wage occupations; and the Economist magazine reported that many jobs lost from professional industries in the UK had been replaced in artisan trades and small-scale industry such as food, furniture and design.

Echoing Jeremy Grantham, Cowen further observes that these changes take place within a much longer-term trend: a 28% decline in middle-income wages in the US between 1969 and 2009 which has no single identifiable cause. Cowen worries that this is a sign that the economy is beginning to diverge into the authoritarian elite and the impoverished masses of Chris Rogers’ “Fortress World” scenario.

Other evidence points to a more complex picture. Jake Dunagan, Research Director of the Institute for the Future, believes that the widespread availability of digital technology and information is extending democracy and empowerment – just as the printing press and education did in the last millennium as they dramatically increased the extent to which people were informed and able to make themselves heard. Dunagan notes that through our reliance on technology and social media to find and share information, our thoughts and beliefs are already formed by, and having an effect on, society in a way that is fundamentally new.

The miniaturisation of industry (tipping point 6 above) and the disappearance of the boundary between our minds and bodies, information and the physical world (tipping point 5 above) are changing the ways in which resources and value are exchanged and processed out of all recognition. Just imagine how different the world would be if a 3D-printing service such as Shapeways transformed the manufacturing industry as dramatically as iTunes transformed the music industry 10 years ago. The futurist Thomas Frey recently described 55 “jobs of the future” that he thought might appear as a result.

(Activities comprising the “Informal Economy” and their linkages to the mainstream economy, by Claro Partners)

In both developed and emerging countries, informal, social and micro-businesses are significant elements of the economy, and are growing more quickly than traditional sectors. Claro Partners estimate that the informal economy (in which they include alternative currencies, peer-to-peer businesses, temporary exchange networks and micro-businesses – see diagram, right) is worth $10 trillion worldwide, and that it employs up to 80% of the workforce in emerging markets.

In developed countries, the Industrial Revolution drove a transformation of such activity into a more formal economy – a transformation which may now be in part reversing. In developing nations today, digital technology may make part of that transformation unnecessary. 

To be successful in this changing economy, we will need to change the way we learn, and the way we teach our children. Cowen wrote that “We will move from a society based on the pretense that everyone is given an okay standard of living to a society in which people are expected to fend for themselves much more than they do now”; and expressed a hope that online education offers the potential for cheaper and more widespread access to new skills to enable people to do so. This thinking echoes a finding of the Centre for Cities report “Cities Outlook 1901” that the major factor driving the relative success or failure of UK cities throughout the 20th Century was their ability to provide their populations with the right skills at the right time as technology and industry developed.

The marketeer and former Yahoo Executive Seth Godin’s polemic “Stop Stealing Dreams” attacked the education system for continuing to prepare learners for stable, traditional careers rather than the collaborative entrepreneurialism that he and other futurists expect to be required. Many educators would assert that their industry is already adapting and will continue to do so – great change is certainly expected as the ability to share information online disrupts an industry that developed historically to share it in classrooms and through books.

Many of the businesses, jobs and careers of 2020, 2050 and 2100 will be unrecognisable or even unimaginable to us today; as are the skills that will be needed to be successful in them. Conversely, many post-industrial cities today are still grappling with challenges created by the loss of jobs in manufacturing, coalmining and shipbuilding industries in the last century.

The question for our future is: will we adapt more successfully this time to the sweeping changes that will surely come to the industries that employ us today?

("Lives on the Line" by James Cheshire at UCL's Centre for Advanced Spatial Analysis, showing the variation in life expectancy and correlation to child poverty in London. From Cheshire, J. 2012. Lives on the Line: Mapping Life Expectancy Along the London Tube Network. Environment and Planning A. 44 (7). Doi: 10.1068/a45341)

Tipping point 9: inequality

The benefits of living in cities are distributed extremely unevenly.

The difference in life expectancy of children born into the poorest and wealthiest areas of UK cities today is often as much as 20 years – for boys in Glasgow the difference is 28 years. That’s a deep inequality in the opportunity to live.

There are many causes of that inequality, of course: health, diet, wealth, environmental quality, peace and public safety, for example. All of them are complex, and the issues that arise from them to create inequality – social deprivation and immobility, economic disengagement, social isolation, crime and lawlessness – are notoriously difficult to address.

But a fundamental element of addressing them is choosing to try to do so. That’s a trite observation, but it is nonetheless the case that in many of our activities we do not make that choice – or, more accurately, as individuals, communities and businesses we take choices primarily in our own interests rather than based on their wider impact.

Writing about cities in the 1960s, the urbanist Jane Jacobs observed that:

“Private investment shapes cities, but social ideas (and laws) shape private investment. First comes the image of what we want, then the machinery is adapted to turn out that image. The financial machinery has been adjusted to create anti-city images because, and only because, we as a society thought this would be good for us. If and when we think that lively, diversified city, capable of continual, close-grained improvement and change, is desirable, then we will adjust the financial machinery to get that.”

In many respects, we have not shaped the financial machinery of the world to achieve equality. The Nobel Laureate Joseph Stiglitz wrote recently that the financial machinery of the United States and the UK in particular creates considerable inequality in those countries; and the Economist magazine reminds us of the enormous investments made in public institutions in the past in order to distribute the benefits of the Industrial Revolution to society at large rather than concentrate them on behalf of business owners and the professional classes – with only partial success.

New legislation in banking has been widely debated and enacted since the 2008 financial crisis – enforcing the separation of commercial and investment banking, for example. But addressing inequality is a much broader challenge than the regulation of banking, and will not only be addressed by legislation. Business models such as social enterprise, cross-city collaborations and the sharing economy are emerging to develop sustainable businesses in industries such as food, energy, transportation and finance, in addition to the contribution made by traditional businesses building sustainability into their strategies.

Whenever we vote, buy something or make a choice in business, we contribute to our overall choice to develop a fairer, more sustainable world in which everyone has a chance to participate. The question is not just whether we will take those choices; but the degree to which their impact on the wider world will be apparent to us so that we can do so in an informed way.

That is a challenge that technology can help with.

(A smartphone alert sent to a commuter in a San Francisco pilot project by IBM Research and Caltrans that provides personalised daily predictions of commuting journey times. The predictions gave commuters the opportunity to take a better-informed choice about their travel to work.)

Data and Choice

Like the printing press, the vote and education, access to data allows us to make more of a difference than we were able to without it.

Niall Firth’s November editorial for the New Scientist magazine describes how citizens of developing nations are using open data to hold their governments to account, from basic information about election candidates to the monitoring of government spending. In the UK, a crowd-sourced analysis of politicians’ expenses claims that had been leaked to the press resulted in resignations, the repayment of improperly claimed expenses, and in the most severe cases, imprisonment.

Unilever are committing to making their supply chain for palm oil traceable precisely because that data is what will enable them to next improve its sustainability; and in Almere, city data and analytics are being used to plan future development of the city in a way that doesn’t cause harmful impacts to existing citizens and residents. Neither initiative would have been possible or affordable without recent improvements in technology.

Data and technology, appropriately applied, give us an unprecedented ability to achieve our long-term objectives by taking better-informed, more forward-looking decisions every day, in the course of our normal work and lives. They tell us more than we could ever previously have known about the impact of those decisions.

That’s why the tipping points I’ve described in this article matter to me. They translate my general awareness that I should “do the right thing” into a specific knowledge that at this point in time, my choices in many aspects of daily work and life contribute to powerful forces that will shape the next century that we share on this planet; and that they could help to tip the balance in all of our favour.

Can digital technology help us build better cities? A workshop at the Academy of Urbanism Annual Congress, Bradford, Thursday 16th May

(Protesters at Occupy Wall Street using digital technology to coordinate their demonstration. Photo by David Shankbone)

Over the course of the last two decades, digital technologies such as the Internet, mobile telephone and touchscreen have transformed the way we communicate, work and live; and in so doing have caused industries such as publishing and music to change out of all recognition.

These developments clearly change the way that we behave in cities – the way we travel; and where and when we work, shop and communicate.

And they lead to new demands on the urban environment from residents, visitors, businesses and communities: the availability of mobile and broadband connectivity; open data portals; and transient working environments such as the Hub Westminster collaborative workspace – or simply cafes with wi-fi and power outlets.

Should these technologies change the way we design and build cities, and if so, how? Do technologies offer solutions to difficult problems such as providing more flexible, coordinated transport services? Or are they a distraction from what really matters – the physical, social and economic needs of people and their communities? And how do they compare to long-standing debates within the more traditional domains of urbanism about how good cities are created, regardless of technology?

(The collaborative working space of Hub Westminster which is constantly refactored to support new uses, exploiting furniture and spatial technology laser-cut from digital designs)

The Academy of Urbanism, a body of several hundred professionals, researchers and policy-makers involved in the design and operation of cities from perspectives as diverse as town planning, social science and technology, is holding a workshop at its Annual Congress in Bradford this year to explore these issues.

The workshop will feature opening contributions from speakers from a variety of backgrounds, and with differing opinions on the value and relevance of digital technology to good urbanism. Our intention is to stimulate an informed and frank debate, from which we hope that useful, practical insights will emerge on whether and how the technology agenda is relevant to cities.

Some of the questions we’d like to consider in the debate are:

  • Do emerging uses of technology in cities have implications for spatial or master-planning – for example, the provision of physical space for cabling, or the specification of policies or standards for information from city infrastructures to be made openly available?
  • What implications do technology trends such as online commerce and virtual working have for requirements for physical space and transport in cities?
  • If cities need the flexibility in their physical infrastructure implied by such approaches as “Smart Urbanism“, then can technology enable that flexibility? And what are the design principles for technology that should be applied in order to do so?
  • If technology professionals and urban designers are applying their skills in the same domain – city systems – can we use tools common to both professions, such as design patterns, to combine and share our expertise?
  • What are the new investment and management models for funding, delivering and governing “smart” systems? How do they reflect the achievement of long term social, economic and environment objectives? How can the achievements of entrepreneurial and social enterprises be replicated at city-scale?

Our plans are still forming; so I’d value your thoughts on the theme and scope of the workshop; the structure of the debate; questions that will stimulate a constructive and worthwhile discussion … and any speakers on this topic – whether they are proponents or sceptics of technology in cities – who you think would be particularly interesting. (I’ll update this blog soon with our initial speakers once I’ve confirmed them).

And of course, I’d love you to simply attend the conference and the workshop and join the debate! I hope to see some of you there.

Little/big; producer/consumer; and the story of the Smarter City

(Photo of me wearing the Emotiv headset)

I have a four year old son. By the time I die he’ll be about my age if I’m lucky.

If I could see him now as he will be then, I would struggle to recognise his interactions with the world as human behaviour in the terms I am used to understanding it.

When he was two years old, I showed him a cartoon on the touchscreen tablet I’d just bought. When it finished, he pressed the thumbnail of the cartoon he wanted to watch next.

The implications of that instinctive and correct action are profound, and mark the start of the disappearance of the boundary between information and the physical world.

Just as the way that we communicate with each other has changed increasingly rapidly from the telephone to e-mail to social media; so the way that we interact with information systems will transform out of all recognition as technology evolves beyond the keyboard, mouse and touchscreen.

The Emotiv headset I’m wearing in the photo above can interpret patterns in the electrical waves created by my brain activity as simple commands that can be understood by computers. My thoughts can influence the world of information; and they can even be captured as images, as shown in this recent work using Magnetic Resonance Imaging (MRI).

And information can influence the physical world. From control technology implanted in the muscles of insects; to prosthetic limbs and living tissues that are created from digital designs by general-purpose 3D printers. As the way we interact with information systems and use them to affect the world around us becomes so natural that we’re barely conscious of it, the Information Revolution will change our world in ways that we are only beginning to imagine.

These technologies offer striking possibilities; and we face striking challenges. The two will come together where the activity of the world is most concentrated: in cities.

In the last revolution, the Industrial Revolution, we built the centres of cities upwards around lifts powered by the steam engine invented by James Watt and commercialised by Matthew Boulton in Birmingham. In the last century we expanded them outwards around the car as we became used to driving to work, shops, parks and schools.

(Photo of 3D printer by Media Lab Prado)

We believe we can afford a lifestyle based on driving cars because its long-term social and environmental costs are not included in its financial price. But as the world’s population grows towards 9 billion by 2050, mostly in cities that are becoming more affluent in what it’s increasingly inaccurate to call “emerging economies”; that illusion will be shattered.

We’re already paying more for our food and energy as a proportion of income. That’s not because we’re experiencing a “double-dip recession”; it’s because the structure of the economy is changing. There is more competition for grain to feed the world’s fuel and food needs; and droughts caused by climate change are increasing uncertainty in its supply.

We have choices to make. Do we consume less? Can we use technology to address the inefficiencies of supply chains which waste almost half the food they produce whilst transporting it thousands of miles around the world, without disrupting them and endangering the billions of lives they support? Or do we disintermediate the natural stages of food supply by growing artificial meat in laboratories?

These choices go to the heart of our relationship with the natural world; what it means to be human; and to live in an ethical society. I think of a Smarter City as one which is taking those choices successfully; and using technology to address its challenges in a way that is both sustainable, and sympathetic to us as human beings and as communities.

Three trends are appearing across technology, urbanism and research into resilient systems to show us how to do that. The first is for little things and big things to work constructively together.

The attraction of opposites part 1: little and big

(Photo of Masshouse Circus, Birmingham, before its redevelopment, by Birmingham City Council)

Some physical interventions in cities have been “blunt”. Birmingham’s post-war economy needed traffic to be able to circulate around the city centre; but the resulting ringroad strangled it, until it was knocked down a decade ago. It didn’t meet the needs of individuals and communities within the city to live and interact.

By contrast, Exhibition Road in London – a free-for-all where anyone can walk, drive, sit, park or catch a bus, anywhere they like – knits the city together. Elevated pedestrian roundabouts and city parks similarly provide infrastructures that support fluid movement by people cycling and walking; modes of transport in which it is easy to stop and interact with the city.

These big infrastructures are compatible with the life of the little people who inhabit the city around them; and who are the reason for its existence.

The same concepts apply to technology infrastructures.

Technology offers great promise in cities. We can collect data from people and infrastructures – the movement of cars, or the concentration of carbon dioxide. We can aggregate that data to provide information about city systems – how fast traffic is moving, or the level of carbon emissions of buildings. And we can draw insight from that information into the performance of cities – the impacts of congestion on GDP, and of environmental quality on life expectancy.
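To make that chain from data to information to insight concrete, here is a minimal Python sketch. The sensor names, speed readings and the 60%-of-free-flow congestion threshold are all invented assumptions for illustration, not values from any real city deployment.

```python
from statistics import mean

# "Data": raw speed readings (mph) from individual road sensors.
# These figures are made up for the example.
raw_readings = {
    "sensor_a": [28, 31, 12, 9],
    "sensor_b": [42, 40, 44, 41],
}

def road_summary(readings):
    """Aggregate raw data into information: average speed per sensor."""
    return {sensor: mean(speeds) for sensor, speeds in readings.items()}

def congestion_insight(summary, free_flow_mph=40):
    """Draw an insight from the information: which sensors report
    traffic running well below free-flow speed (under 60% of it)."""
    return [s for s, avg in summary.items() if avg < 0.6 * free_flow_mph]

summary = road_summary(raw_readings)
print(congestion_insight(summary))  # sensor_a averages 20 mph
```

Each step discards detail and adds meaning: the same pattern applies whether the raw data is traffic speed, carbon concentration or energy consumption.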

Cities are deploying mobile and broadband infrastructures to enable the flow of this data; and “open data” platforms to make it available to developers and entrepreneurs for them to explore new business opportunities and develop novel urban services.

But how does deploying broadband infrastructure in a poor neighbourhood create growth if the people who live there can’t afford subscriptions to it? Or if businesses there don’t have access to computer programming skills?

Connectivity and open data are the “big infrastructures” of the information age; how do we ensure that they are properly adapted to the “little” needs of individual citizens, businesses and communities?

We will do that by concerning ourselves with people and places, rather than information and infrastructures.

(Delay times at traffic junctions visualised by the Dublinked city information partnership.)

Where civic information infrastructures are successful in creating economic and social growth, they are not deployed; they are co-created in a process of listening and learning between city institutions; businesses; communities; and individuals.

This process requires us to visit new places, such as the “Container City” incubation facility for social enterprise in Sunderland; to learn new languages; and understand different systems of value, such as the “triple bottom line” of social, environmental and financial capital.

If we design infrastructures by listening to and then enabling ideas, then we put the resources of big institutions and companies into the hands of people and businesses in a way that makes it less difficult to create many, more effective “little” innovations in hyper-local contexts – the “Massive Small” change first described by Kelvin Campbell.

By following this process, Dublin’s “Dublinked” partnership between the City and surrounding County Councils; the National University of Ireland, businesses and entrepreneurs is now sharing 3,000 city datasets; using increasingly sophisticated tools to draw value from them; identifying new ways for the city’s transport, energy and water systems to work; and starting new, viable, information-based businesses.

As a sustained process, these conversations and the trust they create form a “soft infrastructure” for a city, connecting its little and big inhabitants.

This soft infrastructure is what turns civic information into services that can become part of the fabric of life of cities and communities; and that can enable sustainable growth by weaving information into that fabric that describes the impact of choices that are about to be made.

(A smartphone alert sent to a commuter in a San Francisco pilot project by IBM Research and Caltrans that provides personalised daily predictions of commuting journey times – and suggestions for alternative routes.)

For example, a project in San Francisco used algorithms that are capable of predicting traffic speeds and volume in the city one hour into the future with 85% accuracy. These algorithms were developed in a project in Singapore, where the resulting predictions were made available to traffic managers, so that they could set lane priorities and traffic light sequences to attempt to prevent any predicted congestion.

But in California, the predictions were made available instead to individual commuters, who were told in advance the likely duration of their journey each day, including the impact of any congestion that would develop whilst the journey was underway. This gave them a new opportunity to take an informed choice: to travel at a different time; by a different route or mode; or not to travel at all.

The California project shows that it’s far more powerful to use the information resulting from city data and predictive algorithms not to influence a handful of traffic managers who respond to congestion; but to influence the hundreds or thousands of individual travellers who create it; and who have the power to choose not to create it.
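The decision logic behind such a commuter alert can be sketched in a few lines of Python. Everything specific here – the 25% threshold, the route names and the journey times – is an invented assumption to make the idea concrete; it is not the pilot's actual algorithm.

```python
def commute_alert(predicted_minutes, typical_minutes, alternatives):
    """Return an alert suggesting the fastest alternative when the
    predicted journey is substantially slower than usual; return None
    when it is close to normal (within 25%, an assumed threshold)."""
    if predicted_minutes <= typical_minutes * 1.25:
        return None  # journey close to normal: no alert needed
    route, minutes = min(alternatives, key=lambda alt: alt[1])
    if minutes < predicted_minutes:
        return (f"Expect {predicted_minutes} min on your usual route; "
                f"{route} is predicted at {minutes} min")
    return f"Expect {predicted_minutes} min; consider travelling later"

# Invented example: a commute that usually takes 35 minutes.
alternatives = [("Route 101 + transit", 48), ("surface streets", 65)]
print(commute_alert(55, 35, alternatives))
print(commute_alert(38, 35, alternatives))  # prints None: no alert
```

The point is where the information is delivered: the same prediction sent to thousands of individual travellers, rather than a handful of traffic managers, changes behaviour before congestion forms.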

And in designing information systems such as this, we can appeal not just to selfish interests, but to our sense of community and place.

A project in Dubuque, Iowa uses Smart water meters to tell householders whether they are using domestic appliances efficiently; and can detect weak underlying signals that indicate leaks. People who are given this information can choose to act on it; and to a certain extent, they do.

But something remarkable happened in a second group of households who were also given a “green points” score comparing their water efficiency to that of their neighbours. They were twice as likely to improve their water efficiency as people who were only told about their own water use.
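A "green points" comparison of this kind can be sketched very simply. The scoring scheme, the water-use figures and the use of the neighbourhood median below are invented assumptions for illustration, not Dubuque's actual method.

```python
from statistics import median

# Invented example figures: daily water use in litres per household.
neighbourhood_use = {"household_1": 310, "household_2": 255,
                     "household_3": 180, "you": 290}

def green_points(household, usage):
    """Score a household against the median of its neighbours:
    the further below the median, the more points (assumed scheme)."""
    others = [v for k, v in usage.items() if k != household]
    benchmark = median(others)
    saving = (benchmark - usage[household]) / benchmark
    return round(max(0.0, saving) * 100)  # 0 points if above the median

print(green_points("household_3", neighbourhood_use))  # well below median
print(green_points("you", neighbourhood_use))          # above median: 0
```

The data involved is trivial; the design insight is that the comparison to neighbours, not the raw consumption figure, is what motivates change.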

Maslow’s hierarchy of needs tells us that once the immediate physical needs of our families are secured, our motivations are next driven by our relationships with the people around us. Technology gives us the ability to design new information-based services that appeal directly to those values, rather than to more distant general environmental concerns.

The attraction of opposites part 2: producer and consumer

(Photo of 3D-printed objects by Shapeways)

This information is at our fingertips; we are its producers and consumers. For the last decade, we have used and created it when we share photos in social media or buy and sell in online marketplaces.

But the disappearance of the boundaries between information systems, the physical world and our own biology means that it is not just information that we will be producing and consuming in the next decade, but physical goods and services too.

As a result, new peer-to-peer markets can already be seen in food production; parking spaces; car journeys; the manufacture of custom objects; and the production of energy from sources such as bio-matter and domestic solar panels.

Of course, we have all been producers and consumers since humans first began to farm and create societies with diversified economies. What’s new is the ability of technology to dramatically improve the flexibility, timeliness and efficiency of interactions between producers and consumers; creating interactions that are more sustainable than those enabled by conventional supply chains.

Even more tantalising is the possibility of using new rates of exchange in those transactions.

In Switzerland, a complementary currency, the Wir, has contributed to economic stability over the last century by allowing some debt repayments to be bartered locally when they cannot be repaid in universal currency. And last year, Bristol became the 5th UK town or city to operate its own currency.

These currencies are increasingly using advanced technologies, such as the “Droplet” smartphone payment scheme now operating in Birmingham and London. This combination of information technology and local currencies could be used to calculate rates of exchange that compare the complete social, environmental and economic cost of goods and services to their immediate, contextual value to the participants in the transaction.
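As a purely speculative sketch of that idea, the function below adjusts a headline price towards the full social and environmental cost of a good. Every weight and figure is an invented assumption, intended only to make the concept concrete; no local currency scheme I know of works exactly this way.

```python
def true_cost_price(financial_price, social_cost, environmental_cost,
                    local_weight=0.5):
    """Blend the headline price with externalised costs: a local
    currency could thereby discount goods whose full cost is low.
    local_weight is an assumed tuning parameter, not a real policy."""
    full_cost = financial_price + social_cost + environmental_cost
    return round(financial_price * (1 - local_weight)
                 + full_cost * local_weight, 2)

# Locally grown vegetables: low externalised costs, small mark-up.
print(true_cost_price(4.00, social_cost=0.10, environmental_cost=0.20))
# Air-freighted equivalent: the same headline price buys less locally.
print(true_cost_price(4.00, social_cost=0.60, environmental_cost=1.80))
```

The hard problem, of course, is not the arithmetic but estimating the externalised costs in the first place – which is exactly where supply-chain data and analytics would come in.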

That really could create a market infrastructure to support Smarter, sustainable, and more equitable city systems; and it sounds like a great idea to me.

But if it’s such a good idea, why aren’t markets based on it ubiquitous already?

Collaborative governance; and better stories for Smarter Cities

(Stories of Mumbai: an exploration of Mumbai’s history of urban development, and its prospects for the future, using storytelling and puppet shows, by the BMW Guggenheim Lab)

If we are going to use the technologies and ideas I’ve described to transform cities, then technologists like me need to learn from the best of urbanism.

Jan Gehl taught us to design liveable cities not by considering the buildings in them, but by considering how people use the spaces between buildings.

In Smarter Cities, our analogous challenge is not to concentrate only on information infrastructures and the financial efficiencies that they provide – not least because “Smart” ideas cut across city systems, and so gains in efficiency don’t always reward those who invest in infrastructure.

Our objective instead is to create the harder to quantify personal, social and environmental value that results when those infrastructures enable people to afford to eat better food or to heat their homes properly in winter; to access affordable transport to places of employment; and to live longer, independent lives as productive contributors to their communities.

These are the stories we need to tell about Smarter Cities.

These stories are of vital importance because the third trend we observe is that cities only really get smarter when their leaders and communities coordinate the use of public and private assets to achieve a collective vision of the future, and to secure external investment in it.

Doing so needs the commitment not just of the owners and managers of those assets, but of the shareholders, voters, employees and other stakeholders that they are accountable to.

To win the commitment of such a broad array of people we need to appeal to common instincts: our understanding of narrative, and our ability to empathise. Ultimately we will need the formal languages of finance and technology, but they are not where we should start.

(Dickson Despommier, inventor of the vertical farm, speaking at TEDxWarwick 2013)

It’s imperative that we tell these stories to inspire the evolution of our cities. The changes in coming decades will be so fast and so profound that cities that do not embrace them successfully will suffer severe decline.

Luckily, our ability to respond successfully to those changes depends on a technology that is freely available: language, used face to face in conversations. I can’t think of a more essential challenge than to use it to tell stories about how our world can become smarter, fairer and more sustainable.

And there’s no limit to what any one of us can achieve by doing this. Because it is collaborative governance rather than institutional authority that enables Smarter Cities, there are no rules defining where the leadership to establish that governance will come from.

Whether you are a politician, academic, technologist, business person, community activist or simply a passionate individual; and whether your aim is to create a new partnership across a city, or simply to start an independent social enterprise within it; that leadership could come from you.

(This article is based on the script I wrote in preparation for my TEDxWarwick presentation on 13th March 2013).

Pens, paper and conversations. And the other technologies that will make cities Smarter.

(Akihabara Street in Tokyo, a centre of high technology, photographed by Trey Ratcliff)


A great many factors will determine the future of our cities – for example, human behaviour, demographics, economics, and evolving thinking in urban planning and architecture.

The specific terms “Smart Cities” and “Smarter Cities”, though, are commonly applied to the concept that cities can exploit technology to find new ways to face their challenges. Boyd Cohen of Fast Company offered a useful definition in his article “The Top 10 Smart Cities On The Planet“:

“Smart cities use information and communication technologies (ICT) to be more intelligent and efficient in the use of resources, resulting in cost and energy savings, improved service delivery and quality of life, and reduced environmental footprint–all supporting innovation and the low-carbon economy.”

Some technology developments – such as Service-Oriented Architecture and distributed computing – are technically cohesive and can be defined by a particular architecture. Others, however, are more loosely defined. For instance, “Web 2.0” – a term associated with the emergence of social media, smartphones and businesses such as e-Bay, Facebook and Twitter – was coined by Tim O’Reilly in 2004 as a banner to capture the idea that internet and related technologies had once again become valuable sources of innovation following the “dot-com crash”.

So what are the technologies that will make cities Smart?

To answer that question, we need to examine the convergence of two domains of staggering complexity, and of which the outcomes are hard to predict.

The first is the domain of cities: vast, overlapping systems of systems. Their behaviour is the aggregated behaviour of their hundreds of thousands or millions of citizens. Whilst early work, such as the City Protocol initiative, is starting to understand the relationships between those systems in a quantitative and deterministic way, we are just at the start of that journey.

(An early example of the emerging technologies that are blurring the boundary between the physical world and information: Professor Kevin Warwick, who in 2002 embedded a silicon chip with 100 spiked electrodes directly into his nervous system. Photo by M1K3Y)

The second domain is technology. We are experiencing phenomenal growth in the availability of information and the invention of new forms of communication. In 2007, more new information was created in one year than in the preceding 5000 years. And whilst the telephone, invented in the mid-19th Century, took around 100 years to become widespread, internet-based communication tools such as Twitter can spread to hundreds of millions of users within a few years.

If we define a “new form of communication” as a means of enabling new patterns of exchange of information between individuals, rather than as a new underlying infrastructure, then we are inventing them – such as foursquare, StumbleUpon and Pinterest – at a faster rate than at any previous time in history.

The discovery and exchange of ideas enabled by these technologies is increasing the rate of invention across many other fields of endeavour, including science and engineering. Indeed, this was deliberate: the evolution of the internet is closely entwined with the need of scientists and engineers to collaborate with each other. I recently surveyed some of the surprising new technologies, and their applications in cities, that are emerging as a result – including materials that grow themselves, 3D printing and mind-reading headsets.

So whilst common patterns are emerging from some Smarter City solutions – for example, the “Digital Cities Exchange” research programme at Imperial College, London; the “FI-WARE” project researching the core platform for the “future internet”; the “European Platform for Intelligent Cities (EPIC)“; and IBM’s own “Intelligent Operations Centre” all share a similar architecture – there is no single platform, architecture or technology that defines “Smart Cities”. Rather, the term defines a period in time in which we have collectively realized that it is critically important to explore the application of new technologies to change the way city systems work to make them more efficient, more equitable and more resilient in the face of the economic, environmental and social challenges facing us.

My own profession is information technology; and I spend much of my time focussed on the latest developments in that field. But in the context of cities, it is a relatively narrow domain. More broadly, developments in many disciplines of science, engineering and technology offer new possibilities for cities of the future.

I find the following framework useful in understanding the various engineering, information and communication technologies that can support Smart City projects. As with the other articles I post to this blog, this is not intended to be comprehensive or definitive – it’s far too early in the field for that; but I hope it is nevertheless a useful contribution.

And I will also find a place in it for one of the oldest and most important technologies that our species has invented: language, and its exploitation in “Smart” systems such as pens, paper and conversations.

1. Re-engineering the physical components of city systems

(Kohei Hayamizu’s first attempt to capture energy from pedestrian footfall in Shibuya, Tokyo)

The machinery that supports city systems generally converts raw materials and energy into some useful output. The efficiency of that machinery is limited by theory and engineering. The theoretical limit is created by the fact that machinery operates by transforming energy from one form – such as electricity – into another form – such as movement or heat. Physical laws, such as the Laws of Thermodynamics, limit the efficiency of those processes.

For example, the efficiency of a refrigerator is limited by the fact that it will always use some energy to create a temperature gradient in order that heat can be removed from the contents of the fridge; it then requires additional energy to actually perform that heat removal. Engineering challenges then further reduce efficiency – in the example of the fridge, because its moving components create heat and noise.
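That theoretical limit can be made concrete. For an ideal refrigerator, the Carnot coefficient of performance is COP = T_cold / (T_hot − T_cold), with temperatures in kelvin; real machinery always falls well short of it. A minimal sketch (the temperatures chosen are purely illustrative):

```python
def carnot_cop(t_cold_c: float, t_hot_c: float) -> float:
    """Theoretical maximum coefficient of performance for a refrigerator
    moving heat from t_cold_c into surroundings at t_hot_c (Celsius)."""
    t_cold = t_cold_c + 273.15  # convert to kelvin
    t_hot = t_hot_c + 273.15
    return t_cold / (t_hot - t_cold)

# A fridge interior at 4°C rejecting heat into a 22°C kitchen:
print(round(carnot_cop(4.0, 22.0), 1))  # -> 15.4 (ideal; real fridges achieve far less)
```

Note how steeply the ideal limit falls as the temperature gap widens: a freezer at −18°C in the same kitchen has an ideal COP of only about 6.4, which is one reason engineering improvements matter so much in practice.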

One way to improve the efficiency of city systems is to improve the efficiency of the machinery that supports them; either by adopting new approaches (for example, switching from petrol-fuelled to hydrogen-fuelled vehicles), or by increasing the engineering efficiency of existing approaches (for example, using turbo-chargers to increase the efficiency of petrol and diesel engines).

Examples of this approach include:

  • Using new forms of energy exchange, for example, capturing energy from vibrations caused by footfall;
  • Using more efficient energy generation or exchange technologies – such as re-using the heat from computers to heat offices, or using renewable bio-, wind-, or solar energy sources;
  • Using new transport technologies for people, resources or goods that changes the economics of the size and frequency of transport; or of the endpoints and routes – such as underground recycling networks;
  • Replacing transport with other technologies – such as online collaboration;
  • Reducing wastage and inefficiencies in operation, such as the creation of heat and noise – for example, by switching to lighting technologies such as LED that create less heat.

2. Using information to optimise the operation of city systems

In principle, we can instrument and collect data from any aspect of the systems that support cities; use that data to draw insight into their performance; and use that insight to improve their performance and efficiency in realtime. The ability to do this in practical and affordable ways is relatively new; and offers us the possibility to support larger populations at a higher standard of living whilst using resources more efficiently.
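As a toy illustration of that instrument, analyse and optimise loop, the sketch below flags anomalous readings from a hypothetical water-flow sensor against a rolling baseline; the data and threshold are invented, and real deployments use far more sophisticated analytics:

```python
from statistics import mean

def detect_anomalies(readings, window=5, threshold=1.5):
    """Flag readings that exceed the rolling mean of the previous
    `window` readings by a fixed ratio - the 'insight' step of an
    instrument -> analyse -> optimise loop."""
    alerts = []
    for i in range(window, len(readings)):
        baseline = mean(readings[i - window:i])
        if readings[i] > baseline * threshold:
            alerts.append((i, readings[i]))
    return alerts

# Hypothetical water-flow readings with a leak-like spike at index 7:
flow = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 17.5, 10.0]
print(detect_anomalies(flow))  # -> [(7, 17.5)]
```

The "act" step – dispatching a repair crew, or throttling a valve – is where such insight becomes efficiency.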

There are challenges, of course. The availability of communication networks to transmit data from where it can be measured to where it can be analysed cannot be assumed. 3G and Wi-Fi coverage is much less complete at ground level, where many city infrastructure components are located, than at head height where humans use mobile phones. And these technologies require expensive, power-hungry transmitters and receivers. New initiatives and startups such as Weightless and SigFox are exploring the creation of communication technologies that promise widespread connectivity at low cost and with low power usage, but they are not yet proven or established.

Despite those challenges, a variety of successful examples exist. Shutl and Carbon Voyage, for example, both use recently emerged technologies to match capacity and demand across networks of transport suppliers; thereby increasing the overall efficiency of the transport systems in the cities where they operate. The Eco-Island Community Interest Company on the Isle of Wight are applying similar concepts to the supply and demand of renewable energy.

Some of the common technologies that enable these solutions at appropriate levels of cost and complexity are:

3. Co-ordinating the behaviour of multiple systems to contribute to city-wide outcomes

Many city systems are “silos” that have developed around engineering infrastructures or business and operational models that have evolved since city infrastructures were first laid down. In developed markets, those infrastructures may be more than a century old – London’s underground railway was constructed in the mid 19th Century, for example.

But the “outcomes” sought by cities, neighbourhoods and communities – such as social mobility, economic growth, wellbeing and happiness, safety and sustainability – are usually a consequence of a complex mix of effects of the behaviour of many of those systems – energy, economy, transport, healthcare, retail, education, policing and so on.

As information about the operation and performance of those systems becomes increasingly available; and as our ability to make sense of and exploit that information increases; we can start to analyse, model and predict how the behaviour of city systems affects each other, and how those interactions contribute to the overall outcomes of cities, and of the people and communities in them.

IBM’s recent “Smarter Cities Challenge” in my home city of Birmingham studied detailed maps of the systems in the city and their inputs and outputs, and helped Birmingham City Council understand how to develop those maps into a tool to predict the outcomes of proposed policy changes. In the city of Portland, Oregon, a similar interactive tool has already been produced. And Amsterdam and Dublin have both formed regional partnerships to share and exploit city information and co-ordinate portfolios of projects across city systems and agencies driven by common, city-wide objectives.

(A video describing the “systems dynamics” project carried out by IBM in Portland, Oregon to model the interactions between city systems)
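In the same spirit as the Portland model – though vastly simpler, and with coefficients invented purely for illustration – a systems-dynamics simulation couples "stocks" through feedback loops and steps them through time:

```python
def simulate(years=10, population=100_000, transit_share=0.1):
    """Toy systems-dynamics sketch: congestion and transit use are two
    coupled quantities influencing each other over annual time steps.
    All coefficients are illustrative, not calibrated to any real city."""
    history = []
    for _ in range(years):
        car_trips = population * (1.0 - transit_share)
        congestion = car_trips / 80_000  # arbitrary road-capacity unit
        # Feedback loop: worse congestion nudges more trips onto transit.
        transit_share = min(0.8, transit_share + 0.02 * (congestion - 1.0))
        history.append((round(congestion, 3), round(transit_share, 3)))
    return history

for year, (congestion, share) in enumerate(simulate(), start=1):
    print(year, congestion, share)
```

Even this crude model exhibits the behaviour that makes such tools valuable to policy-makers: the feedback loop gradually shifts trips onto transit and eases congestion, and changing a single coefficient changes the long-run equilibrium.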

We are in the very early stages of developing our ability to quantitatively understand the interrelationships between city systems in this way; but it is already possible to identify some of the technologies that will assist us in that process – in addition to those I mentioned in the previous section:

  • Cloud computing platforms, which enable data from multiple city systems to be co-located on a single infrastructure; and that can provide the “capacity on demand” to apply analytics and visualisation to that data when required.
  • Information and transaction integration technologies which join up data from multiple sources at a technical level; including master data management, and Service-Oriented Architecture.
  • Information models for city systems that model the quantitative and semantic relationships between those systems.
  • Service brokerage capabilities to co-ordinate the behaviour of the IT systems that monitor and control city systems; and the service and data catalogues that make those systems and their information available to those brokers.
  • Federated security and identity management to enable citizens and city workers to seamlessly interact with services and information across city systems.
  • Dashboards and other user interface technologies which can present information and services from multiple sources to humans in an understandable and meaningful way.
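The integration step in the list above can be reduced to a very small sketch: joining records from two city systems on a shared district identifier. The datasets and field names here are invented; master data management tools industrialise exactly this kind of join across messier, larger real-world sources:

```python
# Hypothetical extracts from two separate city systems, keyed by district:
energy = {"D1": 420, "D2": 515}       # kWh per household (invented figures)
transport = {"D1": 0.31, "D2": 0.58}  # transit mode share (invented figures)

# Join on the districts present in both systems, using dict key-view
# set intersection, to produce one integrated record per district.
combined = {
    district: {"kwh": energy[district], "transit_share": transport[district]}
    for district in energy.keys() & transport.keys()
}
print(combined["D2"])  # -> {'kwh': 515, 'transit_share': 0.58}
```

The hard part in practice is not the join itself but agreeing the shared identifiers and semantics across agencies – which is why information models and data catalogues appear in the list above.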

4. Creating new marketplaces to encourage sustainable choices, and attract investment

As I’ve argued on many occasions on this blog, it is often important or useful to conceive of Smarter City solutions as marketplaces. Such thinking encourages us to consider how the information associated with city services can be used to influence individual choices and their collective impact; and the money-flows in marketplaces can be used to create business cases to support investment in new infrastructure.

The examples in transport innovation that I mentioned earlier in this article, Shutl and Carbon Voyage, can both be thought of as businesses that exploit information to operate new marketplaces for transport capacity. Eco-Island have applied the same concept in energy; Streetline in car-parking; and Big Barn and Sustaination in business-to-consumer and business-to-business models for food distribution.

In addition to those I’ve previously described, systems that operate as transactional marketplaces often involve the following technologies:

Conversations, paper, technology

The articles I write on this blog cover many aspects of technology, future cities, and urbanism. In several recent articles, including this one, I have focussed in particular on issues concerning the application of technology to city systems.

I believe these issues are important. It is inarguable that technology has been changing our world since human beings first used tools; and overall the rate of change has been accelerating ever since. That acceleration has been particularly rapid in the past few decades. The fact that this blog, which costs me nothing to write other than my own time, has been read by people from 117 countries this year – including you – is just one very mundane example of something that would have been completely unthinkable when I started my University education.

But I absolutely do not want to give the impression that technology is the most important element of the future of cities; or that every “Smarter City” project requires all – or even any – of the technologies that I’ve described in this article.

Cities are about people; life is about people. Nothing matters unless it matters to people. In themselves, these are obvious statements; but consequently, our future cities will be successful only if they are built by consensus to meet the needs of all of the people who inhabit them. “Smarter” solutions will only achieve their objectives if they are designed and implemented so as to seamlessly integrate into the fabric of our lives. And sometimes the simplest ideas, using the simplest technology – or no technology at all – will be the most powerful.

Smarter Cities start with conversations between people; conversations build trust and understanding, and lead to the creation of new ideas. Many of those ideas are first shaped on pen and paper – often still the least invasive technology for co-creating and recording information that we have. Some of those ideas will be realised through the application of more recent technologies – and in fact will only be possible at all because of them. That is the real value that new technology brings to the future of cities.

But it’s important to get the order right, or we will not achieve the outcomes that we need. Conversations, paper, technology – that might just be the real roadmap for Smarter Cities.

(I would like to thank Steven Boxall for his comments on a previous article on this blog, “No-one is going to pay cities to become Smarter“, in the Academy of Urbanism‘s discussion group on LinkedIn. Those comments helped me to shape the balance that I hope that I have achieved in this article between the roles that technology, people and conversations will play in creating the future of our cities).

Four avatars of the metropolis: technologies that will change our cities

(Photo of Chicago by Trey Ratcliff)

Many cities I work with are encouraging clusters of innovative, high-value, technology-based businesses to grow at the heart of their economies. They are looking to their Universities and technology partners to assist those clusters in identifying the emerging sciences and technologies that will disrupt existing industries and provide opportunities to break into new markets.

In advising customers and partners on this subject, I’ve found myself drawn to four themes. Each has the potential to cause significant disruptions, and to create opportunities that innovative businesses can exploit. Each one will also cause enormous changes in our lives, and in the cities where most of us live and work.

The intelligent web

(Diagram of internet tags associated with “Trafalgar” and their connections relevant to the perception of London by visitors to the city by unclesond)

My colleague and friend Dr Phil Tetlow characterises the world wide web as the biggest socio-technical information-computing space that has ever been created; and he is not alone (I’ve paraphrased his words slightly, but I hope he’ll agree I’ve kept the spirit of them intact).

The sheer size and interconnected complexity of the web is remarkable. At the peak of “web 2.0” in 2007 more new information was created in one year than in the preceding 5000 years. More important, though, are the number and speed of transactions that are processed through the web as people and automated systems use it to exchange information, and to buy and sell products and services.

Larger-scale emergent phenomena are already resulting from this mass of interactions. They include universal patterns in the networks of links that form between webpages; and the fact that the informal collective activity of “tagging” links on social bookmarking sites tends to result in relatively stable vocabularies that describe the content of the pages that are linked to.

New such phenomena of increasing complexity and significance will emerge as the ability of computers to understand and process information in the forms in which it is used by humans grows; and as that ability is integrated into real-world systems. For example, the IBM “Watson” computer that competed successfully against the human champions of the television quiz show “Jeopardy” is now being used to help healthcare professionals identify candidate diagnoses based on massive volumes of research literature that they don’t have the time to read. Some investment funds now use automated engines to make investment decisions by analysing sentiments expressed on Twitter; and many people believe that self-driving cars will become the norm in the future following the award of a driving license to a Google computer by the State of Nevada.

As these astonishing advances become entwined with the growth in the volume and richness of information on the web, the effects will be profound and unpredictable. The new academic discipline of “Web Science” attempts to understand the emergent phenomena that might arise from a human-computer information processing system of such unprecedented scale. Many believe that our own intelligence emerges from complex information flows within the brain; some researchers in web science are considering the possibility that intelligence in some form might emerge from the web, or from systems like it.

That may seem a leap too far; and for now, it probably is. But as cities such as Birmingham, Sunderland and Dublin pursue the “open data” agenda and make progress towards the ideal of an “urban observatory“, the quantity, scope and richness of the data available on the web concerning city systems will increase many-fold. At the same time, the ability of intelligent agents such as Apple’s “Siri” smartphone technology, and social recommendation (or “decision support”) engines such as FourSquare will evolve too. Indeed, the domain of Smarter Cities is in large part concerned with the application of intelligent analytic software to data from city systems. Between the web of information and analytic technologies that are available now, and the possibilities for emergent artificial intelligence in the future, there lies a rich seam of opportunity for innovative individuals, businesses and communities to exploit the intelligent analysis of city data.
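As a hint of what the intelligent analysis of such open city data might look like, the sketch below ranks entirely fictional wards by a weighted composite indicator. The dataset, field names and weights are all invented for illustration; a real urban observatory would draw on live feeds and far richer models:

```python
# Hypothetical ward-level indicators, as might be drawn from open datasets
# (all names and values invented; scores normalised to the range 0-1):
wards = [
    {"name": "Ward A", "air_quality": 0.7, "transit_access": 0.9},
    {"name": "Ward B", "air_quality": 0.4, "transit_access": 0.6},
    {"name": "Ward C", "air_quality": 0.9, "transit_access": 0.3},
]

# Policy weights expressing how much each indicator matters (invented):
WEIGHTS = {"air_quality": 0.5, "transit_access": 0.5}

def liveability(ward):
    """Weighted composite score across the chosen indicators."""
    return sum(ward[key] * weight for key, weight in WEIGHTS.items())

ranked = sorted(wards, key=liveability, reverse=True)
print([w["name"] for w in ranked])  # -> ['Ward A', 'Ward C', 'Ward B']
```

The interesting questions are not in the arithmetic but in who chooses the weights, and whether the underlying data is trusted – which is exactly why open, scrutinisable city data matters.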

Things that make themselves

(Photo of a structure created by a superparamagnetic fluid containing magnetic nanoparticles in suspension, by Steve Jurvetson)

Can you imagine downloading designs for chocolate, training shoes and toys and then making them in your own home, whenever you like? What if you could do that for prosthetic limbs or even weapons?

3D printing makes all of this possible today. While 3D printers are still complex and expensive, they are rapidly becoming cheaper and easier to use. In time, more and more of us will own and use them. My one-time colleague Ian Hughes has long been an advocate; and Staffordshire University make their 3D printer available to businesses for prototyping and exploratory use.

Their spread will have profound consequences. Gun laws currently control weapons which are relatively large and need to be kept somewhere; and which leave a unique signature on each bullet they fire. But if guns can be “printed” from downloadable designs whenever they are required – and thrown away afterwards because they are so easy to replace – then forensics will rarely have the opportunity to match a bullet to a gun that has been fired before. Enforcement of gun ownership will require the restriction of access to digital descriptions of gun designs. The existing widespread piracy of music and films shows how hard it will be to do that.

3D printers, combined with technologies such as social media, smart materials, nano- and bio-technology and mass customisation, will create dramatic changes in the way that physical products are designed and manufactured – or even grown. For example, CocoWorks, a collaboration involving Warwick University, uses a combination of social media and 3D printing to allow groups of friends to collectively design confectionery that they can then “print out” and eat.

These changes will have significant implications for city economies. The reduction in wage differentials between developed and emerging economies already means that in some cases it is more profitable to manufacture locally in rapid response to market demand than to manufacture globally at lowest cost. In the near future, technology advances will accelerate a convergence between the advanced manufacturing, design, communication and information technology industries; city economic strategies cannot afford to focus on any of these sectors separately. Instead, they should look for new value at the evolving intersections between them.

Of mice, men and cyborgs

(Professor Kevin Warwick, who in 2002 embedded a silicon chip with 100 spiked electrodes directly into his nervous system. Photo by M1K3Y)

If the previous theme represents the convergence of the information world and products and materials in the physical world; then we should also consider convergence between the information world and living beings.

The “mouse” that defined computer usage from the 1980s through to the 2000s was the first widely successful innovation in human/computer interaction for decades; more recently, the touchscreen has once again made computing devices accessible or acceptable to new communities. I have seen many people who would never choose to use a laptop become inseparable from their iPads; and two-year-old children understand them instinctively. The world will change as these people interact with information in new ways.

More exciting human-computer interfaces are already here – Apple’s intelligent agent for smartphones, “Siri”; Birmingham City University’s MotivPro motion-capture and vibration suit; the Emotiv headset that measures thoughts and can interpret them; and Google’s augmented reality glasses.

Even these innovations have been surpassed by yet more intimate connections between ourselves and the information world. Professor Kevin Warwick at Reading University has pioneered the embedding of technology into the human body (his own body, to be precise) since 2002; and in the effort to create ever-smaller pilotless drone aircraft, control technology has been implanted into insects. There are immense ethical and legal challenges associated with these developments, of course. But it is certain that boundaries will crumble between the information that is processed on a silicon substrate; information that is processed by DNA; and the actions taken by living people and animals.

Historically, growth in Internet coverage and bandwidth and the progress of digitisation technology led to the disintermediation of value chains in industries such as retail, publishing and music. As evolving human/computer interfaces make it possible to digitise new aspects of experience and expression, we will see a continuing impact on the media, communication and information industries. But we will also see unexpected impacts on industries that we have assumed so far to be relatively immune to such disruptions: surgery, construction, waste management, landscape gardening and arbitration are a few that spring to mind as possibilities. (Google futurist Thomas Frey speculated along similar lines in his excellent article “55 Jobs of the Future“).

Early examples are already here, such as Paul Jennings’ work at Warwick University on the engineering of the emotional responses of drivers to the cars they are driving. Looking ahead, there is enormous scope amidst this convergence for the academic, entrepreneurial and technology partners within city ecosystems to collaborate to create valuable new ideas and businesses.

Bartering 2.0

(Photo of the Brixton Pound by Matt Brown)

Civilisation has grown through the specialisation of trades and the diversification of economies. Urbanisation is defined in part by these concepts. They are made possible by the use of money, which provides an abstract quantification of the value of diverse goods and services.

However, we are increasingly questioning whether this quantification is complete and accurate, particularly in accounting for the impact of goods and services on the environments and societies in which they are made and delivered.

Historically, money replaced bartering, a negotiation of the comparative value of goods and services within an immediate personal context, as the means of quantifying transactions. The abstraction inherent in money dilutes some of the values central to the bartering process. The growing availability of alternatives to traditional bartering and money is making us more conscious of those shortcomings and trade-offs.

Social media, which enables us to make new connections and perform new transactions, combined with new technology-based local currencies and trading systems, offer the opportunity to extend our personalised concepts of value in space and time when negotiating exchanges; and to encourage transactions that improve communities and their environments.

It is by no means clear what effect these grass-roots innovations will have on the vast system of global finance; nor on the social and environmental impact of our activities. But examples are appearing everywhere; from the local, “values-led” banks making an impact in America; to the widespread phenomenon of social enterprise; to the Brixton and Bristol local currencies; and to Droplet, who are aiming to make Birmingham the first city with a mobile currency.

These local currency mechanisms have the ability to support marketplaces trading goods and services such as food, energy, transport, expertise and many of the other commodities vital to the functioning of city economies; and those marketplaces can be designed to promote local social and environmental priorities. They have an ability that we are only just beginning to explore to augment and accelerate existing innovations such as the business-to-consumer and business-to-business markets in sustainable food production operated by Big Barn and Sustaination; or what are so far simply community self-help networks such as Growing Birmingham.
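To make the idea of a marketplace designed to promote local priorities concrete, here is a toy local-currency ledger. The bonus rule and all names are invented for illustration, and bear no relation to how Droplet or the Brixton Pound actually operate:

```python
class LocalCurrency:
    """Toy ledger for a community currency with one invented marketplace
    rule: payments to local businesses earn the payer a small bonus,
    encoding a local-spend priority directly into the currency."""

    def __init__(self):
        self.balances = {}

    def issue(self, holder, amount):
        self.balances[holder] = self.balances.get(holder, 0) + amount

    def pay(self, payer, payee, amount, local_business=False):
        if self.balances.get(payer, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[payer] -= amount
        # The local priority: 5% of a local-business payment flows back
        # to the payer, nudging spending toward local traders.
        if local_business:
            self.balances[payer] += round(amount * 0.05)
        self.balances[payee] = self.balances.get(payee, 0) + amount

ledger = LocalCurrency()
ledger.issue("alice", 100)
ledger.pay("alice", "corner-shop", 40, local_business=True)
print(ledger.balances)  # -> {'alice': 62, 'corner-shop': 40}
```

Varying that one rule – or adding rules for, say, demurrage or community projects – is how such marketplaces can be tuned to local social and environmental priorities.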

As Smarter City infrastructures expose increasingly powerful and important capabilities to such enterprises – including the “civic hacking” movement – there is great potential for their innovations to contribute in significant ways to the sustainable growth and evolution of cities.

Some things never change

Despite these incredible changes, some things will stay the same. We will still travel to meet in person. We like to interact face-to-face where body language is clear and naturally understood, and where it’s pleasant to share food and drink. And the world will not be wholly equal. Humans are competitive, and human ingenuity will create things that are worth competing for. We will do so, sometimes fairly, sometimes not.

It’s also the case that predictions are usually wrong and futurologists are usually mistaken; so you have good cause to disregard everything you’ve just read.

But whether or not I have the details right, these trends are real, significant, and closer to the mainstream than we might expect. Somewhere in a city near you, entrepreneurs are starting new businesses based on them. Who knows which ones will succeed, and how?

How cities can exploit the Information Revolution

(This post was first published as part of the “Growth Factory” report from the thinktank TLG Lab).

(Graphic of New York’s ethnic diversity from Eric Fischer)

Cities and regions in the UK face ever-increasing economic, social and environmental challenges. They compete for investment in what is now a single global economy. Demographics are changing: more than 90% of the population now lives in urban areas, and the number of people aged over 65 will double to 19 million by 2050. The resources we consume are becoming more expensive, with cities especially vulnerable to disruptions in supply.

The concept of “Smarter systems” has captured the imagination of experts as an approach to turn these challenges into opportunities for more sustainable economic and social growth; particularly in cities, where most of us live and work. Smarter systems – in cities, transportation, government and industry – can analyse the vast amounts of data being generated around us to help make more informed decisions, operate more efficiently or even predict the future.

These systems enable city planners around the world to design urban environments that promote safety, community vitality and economic growth. They can bring real-time information together from city transportation, social media, emergency services and leisure facilities to better enable cities, such as Rio de Janeiro, to manage major public events. They can enable transport systems to better manage traffic flow and reduce congestion, as in Singapore. They can stimulate economic growth by enabling small businesses to better compete for business in collaboration with regional trading partners, in systems such as that operated by the University of Warwick.

Government policies such as Open Data, personal care budgets and open public services will dramatically increase the information available to citizens to help them take well-informed decisions. This information will be rich, complex and associated with caveats and conditions. Making it usable by the broad population is an immense challenge which will not be addressed by technology alone. Data needs not only to be made available, but understandable so that it can inform better decision-making.

Where does Smarter city data come from?

Raw data for Smarter systems is derived from three sources: the city’s inhabitants, existing IT systems and readings from the physical environment.

Information from people has become more accessible with the continued spread of connected mobile devices, such as smartphones. Open Street Map, for example, provides a global mapping information service sourced from the activities of volunteers with portable satellite navigation devices. However, the quality and availability of crowd-sourced information depends on the availability and resources of volunteers, who cannot be held accountable for whether information is accurate, complete or up-to-date.

It is also important to understand data ownership and the associated privacy concerns. There is a difference between data freely and knowingly contributed by an individual for a specific purpose and information created as a side-effect of their activity – for example, the record of a person’s movements created by the GPS sensor in their smartphone.

The Open Data movement, supported by central government, will dramatically increase the availability of data from public systems. For example, efforts are underway to make NHS healthcare data available, with appropriate security measures, to Life Sciences organisations to reinforce the UK’s pre-eminent position in drug discovery research. However, the infrastructure required to make large volumes of data widely and rapidly available in a usable form will not be created for free. Until their cost is included in future government procurements – or until commercial systems of funding are created – then much data will likely only be made open on a more limited “best efforts” basis.

Furthermore, not all city data is held by public bodies. Many transportation and utility systems are owned and operated by the private sector, and it is not generally established what information they should make available, and how. Many Smarter city systems that use data from such sources are private partnerships rather than open systems.

Meanwhile, certain kinds of data are becoming far more accessible through the advancing ability of computer systems to understand human language. IBM’s Watson computer demonstrated this recently by competing and winning against world champions in the American television quiz show, Jeopardy! Wellpoint is using this kind of technology to draw insight from medical information held in natural-language form. Its aim is to better tackle diseases such as cancer by empowering physicians to rapidly evaluate potential diagnoses and explore the latest supporting medical evidence. Similar technology can draw insight from case notes in social care systems, as Medway Youth Trust is doing, or from the reports of engineers maintaining roads, sewers, and other city systems.

(An early “mashup” application using open data from Chicago’s police force)

Information is also becoming more readily available from the physical environment. In Galway Bay, a network of underwater microphones is connected to a system that can identify and locate the sounds of dolphins and porpoises. Their location provides a dynamic indication of which parts of the Bay have the cleanest water. That information is made available to companies in the Bay to allow them to control their discharges of water; and to the fishing and leisure industries who are dependent on marine life. This Open Data approach is being used by cities across the world such as Dublin, Chicago and London as a resource for citizens and businesses.

Whilst advances in technology have lowered the cost of generating information from physical environments, challenges remain. From the perspective of a mobile telephone user, much of the UK has signal coverage. However, telephones are used one metre or more above ground level; at ground level, where many parts of our transport and utility infrastructures are located, coverage is much poorer. Additionally, mobile transmitters and receivers are relatively expensive and power-hungry. Cheaper, lower-power technologies are needed to improve coverage, such as the “Weightless” standard being developed to use transmission bandwidth no longer needed by analogue television.

Using and combining data appropriately

In order to make information from multiple sources available appropriately and usefully, several issues need to be tackled.

When computer systems are used to analyse information and take decisions, then the data formats and protocols used by those systems need to be matched. Information as simple as locations and dates may need to be converted between formats. At an engineering level, the protocols used to transmit data across cities using wired or wireless communications behave differently and require systems that integrate them.
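The date-and-location problem is mundane but pervasive. As a minimal sketch of the kind of format matching involved – the feed conventions here are illustrative, not taken from any particular city system – consider normalising a UK-style date from one data source into the ISO 8601 form another expects:

```python
# Illustrative sketch: matching data formats between two city data feeds.
# One source reports dates as "DD/MM/YYYY"; a consuming system expects
# ISO 8601 ("YYYY-MM-DD"). The feed conventions are assumptions for the
# example, not taken from a real deployment.
from datetime import datetime

def normalise_date(uk_date: str) -> str:
    """Convert a UK-style date (DD/MM/YYYY) to ISO 8601 (YYYY-MM-DD)."""
    return datetime.strptime(uk_date, "%d/%m/%Y").date().isoformat()

print(normalise_date("31/12/2012"))  # 2012-12-31
```

The same pattern – parse each source’s private convention, emit one shared representation – applies equally to coordinate systems, units and identifiers.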

The meaning of information from related sources also needs to be understood and adapted to context. Citizens who go shopping in wheelchairs need to know how to get between car parks and shops with lifts, accessible public toilets and cash points. However, the computer systems of the organisations that own those facilities will encode the information separately, in ways that support their efficient management rather than journey-planning between them.
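A hypothetical sketch of that adaptation problem: two organisations encode accessibility in their own management-oriented schemas, and a journey-planner must translate both into one shared question before it can chain facilities together. All record structures and field names here are invented for illustration:

```python
# Hypothetical records from two separate organisations' systems, each
# encoding wheelchair accessibility in its own management-oriented way.
car_parks = [{"name": "Moor St", "disabled_bays": 12, "lift": "Y"}]
shops = [{"name": "Market Hall", "access": {"wheelchair": True, "toilet": True}}]

def accessible(record: dict) -> bool:
    """Adapt each source's encoding to one shared question:
    can a wheelchair user make use of this facility?"""
    if "lift" in record:  # car-park schema
        return record["lift"] == "Y" and record["disabled_bays"] > 0
    return record.get("access", {}).get("wheelchair", False)  # shop schema

print([p["name"] for p in car_parks + shops if accessible(p)])
```

The adapter layer is trivial here, but in practice each new data source adds another private encoding to be mapped onto the planner’s shared model.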

The City of Portland in Oregon has gone further in a project to understand how information from systems across the city is related. They are now able to better predict the impact that key decisions will have on the entire city, years in advance.

Privacy and ownership of data may constrain its subsequent use, with terms and conditions often governing access. Furthermore, safeguards are required to ensure that sensitive information cannot be inferred from a combination of sources – for example, the location of a safe house or shelter being identified from building usage, building ownership and/or records of taxi journeys by the employees of particular council agencies.

The human dimension

Smarter systems will only succeed in improving cities if there is wide consumer engagement. To be of value, information will likely need to be timely and presented in a manner appropriate to the consumer’s context. Individual behaviour will only change where new information delivers personal value – a saving in time or money, or access to something that matters to their family.

(Photo of traffic in Dhaka, Bangladesh, from Joisey Showa)

Many cities are experimenting with technologies that predict the future build up of traffic, by comparing real-time measurements to databases of past patterns of traffic flow. In Stockholm, this information is used by a road-use charging system that supports variable pricing. In California, commuters in a pilot project were given personalised predictions of their commuting time each day. Both systems encourage individuals to make choices based on new information.
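The core of such prediction schemes can be sketched very simply – find the historical day whose pattern so far most resembles today’s real-time measurements, then read off what happened next on that day. This is a toy nearest-neighbour illustration of the general idea, not the method used in Stockholm or California:

```python
# Toy sketch of prediction from historical traffic patterns: find the
# past day most similar to today's readings so far, and use what
# followed on that day as the forecast. Data values are invented.
def predict_next(today_so_far: list, history: list) -> float:
    """history: full-day flow series; today_so_far: readings up to now."""
    n = len(today_so_far)
    def distance(day):
        # squared difference between today's readings and the same
        # time-of-day readings on the historical day
        return sum((a - b) ** 2 for a, b in zip(day[:n], today_so_far))
    best_match = min(history, key=distance)
    return best_match[n]  # the flow that followed the best-matching pattern

history = [
    [100, 120, 180, 240],  # a day that built up to heavy congestion
    [100, 110, 115, 120],  # a quiet day
]
print(predict_next([100, 118], history))  # 180: today resembles the busy day
```

Real systems use far richer models, but the principle – match live measurements against past patterns to anticipate what comes next – is the same.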

Utility providers are exploring how information from smart meters can encourage water and energy users to change behaviour. A recent study in Dubuque, Iowa, showed that when householders were shown how their water usage compared to the average for their neighbours, they became better at conserving water – by fixing leaks, or using domestic appliances more efficiently. Skills across artistic and engineering disciplines are helping us understand how this type of information can be communicated more effectively. Many people will not want to study figures and charts on a smart meter or website; instead “ambient” information sources may be more effective – such as a glow-globe that changes colour from green to orange to red depending on household electricity use.
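A toy version of the “ambient” display idea shows how little machinery it needs: map a household’s use, relative to the neighbourhood average, onto a traffic-light colour rather than a chart. The thresholds below are illustrative assumptions, not figures from the Dubuque study:

```python
# Illustrative glow-globe logic: translate household electricity use,
# relative to the neighbourhood average, into an ambient colour.
# Threshold values are assumptions chosen for the example.
def globe_colour(household_kwh: float, neighbourhood_avg_kwh: float) -> str:
    ratio = household_kwh / neighbourhood_avg_kwh
    if ratio <= 1.0:
        return "green"   # at or below the neighbourhood average
    elif ratio <= 1.25:
        return "orange"  # moderately above average
    return "red"         # well above average

print(globe_colour(9.0, 10.0))   # green
print(globe_colour(13.0, 10.0))  # red
```

The design choice is the point: a single colour in the corner of a room communicates the comparison continuously, without asking anyone to read a meter.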

Systems that improve the sustainability of cities could also affect economic development. Lowering congestion through Smarter transportation schemes can improve productivity by reducing time lost by workers delayed by traffic. By making information and educational resources widely available, Smarter systems could improve access to opportunity across city communities. A city with vibrant communities of well-informed citizens may appear a more forward-looking and attractive place to live for educated professionals and, in turn, for businesses considering relocation. New York has improved its attractiveness since the 1970s by lowering the fear of crime. One of its tools is a “real-time crime centre” that brings information together from across the city in order to better react to crime and public order incidents. The system can even help to prevent crime by intelligently deploying police resources to the areas most likely to experience incidents based on past patterns of activity – on days with similar weather, transportation conditions or public events.

Success in delivering against these broader objectives is much more likely to be achieved where the cities themselves are more clearly accountable for them.

So where do we start?

Investments in Smarter systems often cut across organisations and budgets and many have objectives that are macro-economic, social and environmental, as well as financial. As such, they challenge existing accounting mechanisms. Whilst central government and the financial markets offer new investment solutions such as ethical funds, social impact bonds and city deals, so far these have not been used to fund the majority of Smarter solutions – many of which are supported by research programmes. The Technology Strategy Board’s investment in areas such as “Future Cities” and the “Connected Digital Economy” will provide a tremendous boost, but there is much to be done to assist cities in using new investment sources to fund Smarter initiatives – or to develop sustainable commercial or social-enterprise business models to deliver them.

Although progress can be driven by strong leadership, the issues of governance and fragmented budgets will need to be overcome if we are to take full advantage of the benefits technology can bring.

We live in an era of major global challenges – well described in the recent “People and the Planet” report by the Royal Society. At the same time, we have access to powerful new technologies and ideas to address them, such as those proposed by the 100 Academics who contributed essays to the book “The New Optimists”. When we focus those resources on cities, we focus on the structures in which we can have the greatest impact on the most people.

Already many forward-looking cities in the UK such as Sunderland and Birmingham are joining others around the world by investing in Smarter systems. If we can meet the technical, organisational and investment challenges, we will not only provide citizens, businesses and agencies with new choices and exciting opportunities; we’ll also position the UK economy to succeed as the Information Revolution gathers pace.
