Computation isn’t just performed by computers; systems across scales are understood to process information. Information itself is a spacious term that can refer to the 1s and 0s behind your operating system, the grammatical nuances of human language, or even the waggle dance of a honeybee. It’s no secret that our digital wanderings generate stockpiles of information, and that this material can be mined and analyzed, particularly in pursuit of solutions to urban planning problems. People make information, and information makes cities.
Some biologists believe that an entire biological system can be understood as a computer that tries to solve a “fitness function.” Put simply, natural selection computes solutions (species) to the problem of genetic survival. That may seem a little far-fetched. On smaller scales, though, the nature of computation is far easier to define. For example, artificial neural networks use machine-learning and pattern-recognition algorithms to compute answers to problems we don’t have time to solve by hand. The Media Lab’s Place Pulse project is based on a machine-learning algorithm. On the flipside, organic computers built entirely from biological materials (DNA-based computation) are an established domain of exciting research, especially here at MIT. The point is, many things can be computers, including living systems. This means that it’s not entirely out of our scope to entertain the idea that a city could compute.
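To make the “natural selection computes solutions to a fitness function” idea concrete, here is a minimal sketch of selection as search. Everything in it (the target environment, the bit-string genome, the mutation rate) is an illustrative assumption, not a model from any of the research mentioned here:

```python
import random

# Toy "fitness function": how well a bit-string genome matches a fixed
# environment. The target and mutation scheme are invented for illustration.
TARGET = [1, 0, 1, 1, 0, 0, 1, 0]

def fitness(genome):
    """Count positions where the genome matches the environment."""
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.1):
    """Flip each bit with a small probability."""
    return [1 - g if random.random() < rate else g for g in genome]

def evolve(generations=200, pop_size=20):
    """Selection loop: keep the fittest genome, breed mutated copies of it."""
    best = [random.randint(0, 1) for _ in TARGET]
    for _ in range(generations):
        offspring = [mutate(best) for _ in range(pop_size)]
        candidate = max(offspring, key=fitness)
        if fitness(candidate) >= fitness(best):
            best = candidate
    return best

random.seed(0)
solution = evolve()
print(fitness(solution), "/", len(TARGET))  # fitness climbs toward the maximum
```

No individual genome “knows” the answer; the population-plus-selection loop is what does the computing, which is the sense in which a biological (or urban) system can be said to process information.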
But if a city were processing information, how would it do so? There are two classical models of computation: the Turing machine and the lambda calculus. In 1936, Alan Turing came up with a conceptual machine that operates much like DNA transcription: a strip of encoded information is decoded, bit by bit, by a central processing unit. Remarkably, without knowledge of DNA or of electronic computers (the one yet to be discovered, the other yet to be invented), Turing conceived of the basic architecture required for computation both within our own bodies and within our machines. It turns out that information and processing power are all that’s needed for computation to occur. (And if you’re really excited about the math, I invite you to explore the Stanford Encyclopedia of Philosophy’s entry on the lambda calculus.)
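Turing’s architecture is simple enough to fit in a few lines. Below is a minimal sketch of a Turing machine: a tape of symbols and a head that reads, writes, and moves one cell at a time. The sample program (inverting a string of bits) is my own illustrative choice:

```python
# A minimal Turing machine: a tape plus a head that reads, writes, and moves.
def run_turing_machine(tape, rules, state="start"):
    """rules maps (state, symbol) -> (new_state, symbol_to_write, move)."""
    tape = dict(enumerate(tape))  # sparse tape; unwritten cells read as "_"
    head = 0
    while state != "halt":
        symbol = tape.get(head, "_")
        state, write, move = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape) if tape[i] != "_")

# Program: scan right, inverting each bit; halt at the first blank cell.
invert = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}

print(run_turing_machine("10110", invert))  # prints 01001
```

The strip of encoded information (the tape) and the small rule table (the processing power) are the whole machine, which is the point: nothing about this architecture requires silicon.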
We know that sources of information generated by people living in cities are growing in size and scope exponentially. But who has the processing power? Who should be running the program?
This is an incredibly important question for 21st-century city planners to ask themselves. Unfortunately, planners have a bad taste in their mouths when it comes to analytical methods of policy and urban design. New traditions in planning after WWII emphasized policy analysis tied to the engineer’s logic of system optimization. Planners were banished to dusty cubicles, operating as technicians feeding data into the mouths of heavy-handed decision-makers. This “Rational Comprehensive” era of planning generated some pitiful social housing projects, playing into the commodification of urban development guided by the invisible hand. As a result, data analysis is a thorn in many a designer’s side, indelibly linked to the value-free optimization of social problems and painful ghosts of the past.
We must recognize that we are now in a different setting: today’s data is not only ubiquitous but also profoundly social. Social media and the like capture a variety of human experiences, which can be interpreted and explored creatively. Smart City initiatives envision a future of ubiquitous computing, where computers (for better or worse) are incorporated into the physical urban fabric (trashcans that spit back the wrong kind of trash!?), generating data of higher resolution and greater nuance than any public survey or social media application can offer. Growing research in the field suggests that the gap between this imagined future and reality is beginning to close. With the development of ubiquitous computing in cities comes a new generation of high-resolution information, presenting a solution space that deserves to be explored. But planners are hesitant to do so, and perhaps as a result, the slack is being picked up in unusual places.
Both IBM and Microsoft have launched research labs focused entirely on “urban computing,” that is, “the process of acquisition, integration and analysis of big and heterogeneous data generated by a diversity of sources in urban spaces…to tackle major issues that cities face.” And their papers are fascinating to boot. Take, for example, “Urban Computing with Taxicabs,” in which data generated by 30,000 GPS-equipped taxis in Beijing was analyzed to track changing traffic patterns as a consequence of specific planning interventions. Or “Discovering Regions of Different Functions in a City,” which correctly predicts emerging “points of interest” resulting from federally executed land use plans. These papers are remarkable in that they innovate data analysis methodologies to complement planning practice. It’s incredibly exciting to know that the largest technology corporations in the world are tackling urban issues head on. But where are the planners?
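The core move in this kind of work is simpler than it sounds: bin trips by origin and destination zone, then compare two time periods to see which flows changed. The sketch below is not the papers’ actual methodology; the trip records, zone names, and comparison are invented to show the shape of the analysis:

```python
from collections import Counter

# Toy origin-destination analysis in the spirit of taxi-trace studies.
# All data here is fabricated for illustration.
def od_matrix(trips):
    """Count trips per (origin_zone, destination_zone) pair."""
    return Counter((t["origin"], t["dest"]) for t in trips)

def flow_changes(before, after):
    """Zone pairs whose trip counts shifted between two time periods."""
    b, a = od_matrix(before), od_matrix(after)
    return {p: a[p] - b[p] for p in set(b) | set(a) if a[p] != b[p]}

# Hypothetical trips before and after a planning intervention.
before = [{"origin": "A", "dest": "B"}] * 5 + [{"origin": "B", "dest": "C"}] * 2
after = [{"origin": "A", "dest": "B"}] * 1 + [{"origin": "B", "dest": "C"}] * 6

print(flow_changes(before, after))  # prints {('A', 'B'): -4, ('B', 'C'): 4}
```

A planner reading that output would recognize it immediately: traffic shifted off one corridor and onto another. The analysis is mechanical; interpreting *why* is where planning expertise comes in.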
If we don’t reinvent ourselves, shed our fear of algorithms, and embrace what it means to be literate in the 21st century, there may not be room for us. Next-generation planners must also be programmers: willing to recognize the beauty of math as a language, to understand its constraints and limits, and to innovate new ways of improving the human condition using this tool to access data-rich resources. In turn, today’s growing population of urban scientists needs the expertise of people who love and understand cities, but a productive relationship requires finding common ground. Calculus isn’t Klingon, so hit the books.
Emily Royall is an MCP student in City Design & Development at MIT DUSP.
More on these issues:
Reprogramming the City
Hacking the City
Near Future Laboratory
Batty’s Complex City
Urban Morphology Institute