Big Data

Civic leaders have more data than ever to guide their decisions, but the tech is not in charge

Most big cities have bar and club zones that get rowdy when drunk patrons spill out at closing time. But the party strip in Eindhoven, a tech hub of 230,000 people in southeastern Holland near the Belgian and German borders, is such a compressed space that the crowding can become dangerous. On some nights, as many as 15,000 people will stream into the “Stratumseind,” a downtown pedestrian zone about 300 metres long and just 15 metres wide that’s lined with pubs and discos. Fights broke out regularly, and rising violence became a significant public safety issue.

In 2015, Tinus Kanter, a municipal official who looks like a roadie, began working on a smart-city approach to public safety. In partnership with Stratumseind businesses, local police and tech firms, as well as the lighting giant Philips, which is headquartered in Eindhoven, the municipality transformed the stretch into a “living lab” with a range of technologies designed to drain some of the negative energy out of Saturday night revelries.

Video cameras were installed at each end of the strip to tally how many people were entering or leaving (without capturing faces). Audio sensors were programmed to detect aggressive sounds, while an off-site AI algorithm scanned social media for posts mentioning Stratumseind or containing geo-tagged images of the strip. Software devised by the city and its tech partners combined these data streams and sent red flags, including to the police, when trouble was detected. Depending on the signals about the crowd’s behaviour, lighting provided by Philips would shift to softer hues when things got ugly.
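The city's actual fusion rules are not public, but the basic idea — combining crowd counts, audio scores and social media flags into a single alert signal — can be sketched in a few lines. Everything here (the thresholds, the field names, the four alert levels) is invented for illustration, not a description of Eindhoven's real system:

```python
def alert_level(crowd_count, aggression_score, social_flags):
    """Fuse the living lab's three data streams into a coarse alert level.

    Thresholds are hypothetical; the real system's rules are not public.
    """
    score = 0
    if crowd_count > 10_000:      # camera tallies at each end of the strip
        score += 1
    if aggression_score > 0.7:    # audio sensors detecting aggressive sounds
        score += 1
    if social_flags > 5:          # flagged geo-tagged social media posts
        score += 1
    return ["calm", "watch", "alert", "intervene"][score]
```

A quiet Tuesday might score "calm," while a packed Saturday with aggressive audio readings would climb toward "alert" — the point at which the software would notify police and nudge the lighting toward softer hues.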

Kanter comes by his interest in crowd control honestly: before joining the civil service, he ran a heavy metal music festival. He stresses that the city insisted on “privacy by design,” so the systems do not capture personal information. The municipality also took more conventional steps, adding planters, terraces and seating to break up the space. “What I see now is that the street is becoming nicer and more open,” says Kanter, who adds that Eindhoven has been diligently tracking data. “We think that gathering numbers is a good thing because (they) provide scientific proof.”


However, what the data shows in terms of safety isn’t clear. Kanter insists there has been less fighting, although he can’t prove the lighting and the sensors were the reason. Albert Meijer, a University of Utrecht professor of public innovation who has studied Stratumseind, says the technology alone didn’t markedly improve safety. What did change, he says, is that media coverage of the area shifted from the brawling to the devices, which, in turn, has attracted municipal delegations from abroad — which may have been the point all along. “Philips,” he says, “wanted to show off its new street lighting to sell it around the world.”

Meijer describes Stratumseind as a “quantified street” and the label speaks to the complex, and long-running, triangular relationship between planning, urban data and the diverse array of technologies deployed to gather it. One of the core promises of smart-city technology is that by collecting and interpreting granular, real-time data drawn from a wide range of sources, such as sensors, municipalities will be able to make more responsive and more efficient planning and operational decisions. It is, fundamentally, a technocratic idea that suggests that evidence and facts gathered from the city itself will guide the best course of action.

There is truth in this thinking. For example, when city officials can track cycling activity using apps installed on cyclists’ phones, they can “see” where bike lanes are used and needed. Similarly, if transportation or transit planners can track daily traffic or ridership volumes over an extended period, using data from cellphone signals or tap-on-tap-off fare cards, they can add service, or identify areas experiencing increases in work-related car trips. Such insights could lead to planning that informs infrastructure and private investment, as well as choices about programming public spaces.


The planning profession, which dates back to the beginning of the 20th century, didn’t originally rely on data. Rather, growth was driven by the construction of new civic infrastructure (streetcar lines, bridges, etc.) and prevailing planning ideologies — for example, the importance of creating garden-city suburbs or physically separating residential, commercial and industrial areas.

By the 1920s and 1930s, municipal officials were collecting more information, such as the condition of housing stock or infectious-disease outbreaks, and using it to plan. In the postwar era, systematically gathered and synthesized data took on a much more prominent role. According to MIT science and technology historian Jennifer Light, planners learned how to map growth, housing conditions and urban “blight” by combining census and household survey data with aerial photography and military satellites.

By the mid-1970s, Light wrote in her 2003 book, “From Warfare to Welfare,” analysts working for the City of Los Angeles were developing mathematical models that combined information from databases of digitized aerial images, census statistics and building-inspection reports to make predictions about future housing-development scenarios — a precursor of today’s smart city analytics.

Also since the 1960s, local governments (and many others) have used “geographic information systems” to support their analyses and planning. Originally conceptualized by an Ottawa geographer, GIS are densely layered digital maps that contain a range of information associated with a particular place — natural features, buildings, boundaries, infrastructure, businesses, land use and zoning rules, census data, aerial photos, pollution sources, etc.

Transportation planners also rely on extensive annual travel surveys, the results of which are combined with census data to produce so-called travel demand forecasts — information that municipalities can use to estimate transit usage and traffic around regions such as Greater Toronto.

Land-use planners, both in government and the private sector, also draw on diverse information sources, including traffic, cycling and pedestrian counts, visualizations of zoning policies, interactive simulation software, consultation session feedback and surveys of residents. Numerous planning apps have also emerged, such as “Walk Score,” which rates neighbourhood walkability in cities around the world.


In recent years, the extensive deployment of sensors, as well as large-scale smart phone and social media usage, and newly digitized municipal records have combined to change the playing field, yielding torrents of raw data instead of the more processed statistics that planners had relied on in the past.

What’s more, remarkable advances in computing power and coding tools, as well as the long-anticipated maturation of artificial intelligence software engineering, have created entirely new ways of leveraging data and observing what’s happening in a city. For example, the City of Stockholm, through a partnership with MIT and Sweden’s KTH Royal Institute of Technology, has a project to install solar-powered sensors on city vehicles — buses, garbage trucks, taxis — to gather data on noise, air and road quality that can provide planners with granular information about “hyperlocal” environmental conditions.

Other sources of planning data are coming from outside municipalities. InsideAirbnb is a website that “scrapes” address, rate and other host details from Airbnb’s main site. It cross-references this information with housing and rental market data, and then maps it all. The site was created by a handful of New York City affordable housing activists. Through it, visitors can see the density and locations of Airbnb units in any neighbourhood in any city. The site, in effect, is a data visualization tool that gives planners and residents valuable housing market information (and policy insights) into phenomena such as condo towers that have become overrun by short-term rental investors and ghost-hotel operators.
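The cross-referencing step is conceptually simple: count the scraped listings in each neighbourhood and divide by the size of the local rental stock. A minimal sketch of the idea — the field names and data structures are hypothetical, not InsideAirbnb's actual schema:

```python
from collections import Counter

def airbnb_density(listings, rental_stock):
    """Cross-reference scraped listing records with rental-stock counts.

    listings: iterable of dicts, each with a 'neighbourhood' key
              (hypothetical schema for illustration).
    rental_stock: dict mapping neighbourhood -> total rental units.
    Returns: dict mapping neighbourhood -> (listing count, share of stock).
    """
    counts = Counter(l["neighbourhood"] for l in listings)
    return {
        hood: (counts[hood], counts[hood] / units if units else 0.0)
        for hood, units in rental_stock.items()
    }
```

Fed into a mapping layer, output like this is what lets a visitor spot, say, a condo tower where short-term listings account for a large share of the units.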

The coronavirus pandemic has created more public-health applications for big data culled from cellphone signals. Researchers, governments and data firms, including Google, Apple and Environics Analytics, have been publishing analyses on mobility patterns, both nationally and locally, as a means of assessing how physical distancing measures have impacted travel. For example, Google and Apple aggregated cellphone mobility data by location type in Toronto in the fall. The data, published in charts in The Star, showed a sharp increase in park usage compared to pre-pandemic levels. Such evidence helped city officials make decisions about extending park-focused programs and services into the cold-weather months.

However, not all of the emerging applications are convincing or useful.

In 2017, for instance, a Harvard-MIT research team published a study about an experiment using Google Street View, itself a vast trove of urban data. Using 1.6 million images of street scenes in five U.S. cities, taken first in 2007 and again in 2014, the researchers amassed a database of paired photos of the same places, from the same perspectives, at two points in time. The investigators then developed an algorithm to assess “perceptions of safety” based on a “crowd-sourced study” of street scenes in New York and Boston, and used this formula to rate the perceived safety of the images they had gathered.

Finally, they cross-referenced the street-level safety scores with census and other socio-economic data. Not surprisingly, the study concluded that denser areas populated by more affluent residents were less likely to experience physical decline — “tipping,” as the authors put it — over time. The upshot: a forbiddingly complex technical process yielded an obvious and well-understood conclusion.

Less experimentally, an Israeli company called Zen City has been selling a software-based service using AI that gathers and assesses citizen feedback on local planning — a process it calls “sentiment analysis.” The feedback comes from online surveys, but also scans social-media chatter, tourist ratings, complaints to municipal 311 lines, etc. The company presents its system as a decision-making tool for municipal politicians and officials — “it’s representing the voices of the silent majority,” says Zen City solutions architect Nir Zernyak.


Zernyak cites an example from Oregon. Beaverton, a suburb of Portland, has attempted to ban so-called “car camping” by setting up two “safe parking” locations for homeless people who live in their cars, where the city or partner agencies help users access social services. In its marketing materials, Zen City claims its “actionable, data-based insights” revealed that Beaverton may have to change its zoning rules to allow for these sites, and that not all neighbourhoods wanted one — conclusions that may not have required this kind of outsourced data-crunching. (Beaverton officials did not return calls for comment.)

The push to mobilize new sources of “smart” urban data often comes from private firms that stand to benefit. In New South Wales, Street Furniture Australia, an industry group, recruited academic planners and landscape architects to evaluate street furniture equipped with wirelessly connected sensors that perform tasks such as monitoring when a trash can needs to be emptied or how park benches are used. Other applications include park-based work stations, with Wi-Fi and USB ports, as well as tables that can be booked via a phone app.



The data generated from these “smart social spaces” is aggregated on a “smart asset management dashboard” which municipal officials use to monitor how these hubs are used. The idea behind the pilot, explains Nancy Marshall, a University of Sydney planning professor who is part of the evaluation team, is to find ways to encourage people to use public spaces, but the group also wants to conduct “behaviour mapping.” She says none of the sensors gather personal information. (The team met with Sidewalk Labs officials in Toronto last year.)

How this intel gets used is an open question. Information that flows from park bench or picnic table sensors could prompt municipal planners to add amenities where heavy traffic is indicated. But it’s not difficult to imagine less positive applications. If the data shows a lot of late-night traffic, for example, local residents worried about crime might use the information as fodder to push municipal officials to remove benches, or as tips for police to increase patrols. Marshall stresses that the data from the pilot projects isn’t shared with law enforcement, but such assurances are no guarantee that other municipalities that purchase these systems will be as restrained.

New York University planning and urban analytics expert Constantine Kontokosta offers another caution. Trash bin sensors designed to monitor when a container needs emptying could, in theory, provide data that lets city officials apply algorithms to optimize collection routes by using GPS mapping tools to direct trucks only to full bins, thus saving money on fuel and labour. However, in a 2018 paper, Kontokosta writes that such analysis might conflict with other municipal policies, such as the need to abide by collective agreements. “The computing challenges are solvable,” he notes. “(T)he real uncertainty lies with how to integrate data-driven processes into public sector management.”
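Kontokosta's point is about governance, not feasibility: the optimization itself is well within reach. As a toy illustration — not any vendor's or city's actual routing system — a greedy nearest-neighbour pass over only the bins reporting themselves full might look like this:

```python
import math

def collection_route(depot, bins, fill_threshold=0.8):
    """Greedy nearest-neighbour route over bins whose sensors report a
    fill level at or above the threshold.

    A simple sketch of the route-optimization idea; the data format
    (id / xy coordinates / fill fraction) is hypothetical.
    """
    # Keep only the bins worth a truck visit.
    pending = [b for b in bins if b["fill"] >= fill_threshold]
    route, pos = [], depot
    while pending:
        # Always drive to the nearest remaining full bin.
        nxt = min(pending, key=lambda b: math.dist(pos, b["xy"]))
        route.append(nxt["id"])
        pos = nxt["xy"]
        pending.remove(nxt)
    return route
```

Skipping the half-empty bins is where the fuel and labour savings come from — and, as Kontokosta notes, precisely where the data-driven schedule can collide with routes fixed in a collective agreement.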

The broader point is that existing and new forms of urban data, some of it automated, can inform a city’s policy-making machinery and deliver fresh insights, but they don’t supplant institutional or political judgment, as well as human experience. “Planning is a social science, which balances these pieces with an artistic element as well,” says Toronto planner Blair Scorgie, an associate at SvN, a planning and design firm. “It’s what I love about it.”

Paul Steely White, executive director of Transportation Alternatives, is photographed in New York's revamped Times Square in 2014.

Some of the most compelling examples of smart, data-driven planning pre-date the rise of the smart-city tech industry. In the early 1960s, Danish architect and planner Jan Gehl began meticulously documenting pedestrian activity in a newly created car-free zone in central Copenhagen to prove to area merchants that they weren’t going to lose business. Carried out by volunteers, Gehl’s “public life surveys” tracked pedestrian and cyclist activity, bench usage, sidewalk café seating and so on, with the results painting a picture of how and when people use their streets; it’s not a tech-driven exercise involving sensors, but rather one based on much more nuanced observations about the human rhythms of city life.

In the late 2000s, New York City hired Gehl to conduct similar surveys and analysis of Times Square and several of Broadway’s intersections. The surveys revealed a conspicuous dearth of younger and older pedestrians — a detail non-video sensors wouldn’t pick up — while an analysis of the chronically congested intersection showed the road allowance occupied almost 90 per cent of all the open space in the square. In 2008, NYC’s transportation czar used Gehl’s findings to order a radical remake of Times Square, closing large segments of the street and creating public spaces fitted with tables and chairs. The model has been replicated elsewhere in Manhattan, reclaiming 400,000 square metres from traffic.

In Toronto, the King Street Pilot Project, which launched in 2017, offers a compelling example of how city officials succeeded in integrating technology and planning judgment to improve public services and public space. In 2015, the city set up a “big data innovation team” to tease out insights from information generated by electronic traffic counters, cycling apps, vehicle detectors and other sources that produced continuous flows of digitized transportation information.

A TTC streetcar makes its way along the King street pilot project (near Bathurst) past some empty chairs and tables in 2018. The city used data gleaned via Bluetooth and other sources to confirm that the project worked as hoped.

The plan envisioned significantly restricted private-vehicle use on King Street in order to improve streetcar service. Data analysts used low-resolution cameras installed in traffic signal controllers to monitor pedestrian and vehicle volumes, and then drew on anonymized Bluetooth signals from phones to calculate how much time riders spent on streetcars traversing the area. Project officials also tracked daily revenues through point-of-sale payment devices to assess how declines in vehicle traffic impacted King Street businesses.

The city published a monthly dashboard of key metrics to demonstrate changes in travel times, cyclist and pedestrian activity and commerce. Restaurants, in turn, were allowed to build partially enclosed patios extending into the street — a move that laid the groundwork for the city’s CafeTO pandemic program, which let scores of eateries expand into cordoned-off parking spaces.

The metrics affirmed the experiences of commuters, residents and local businesses: that streetcars were moving faster, pedestrian and cycling activity was up and merchants hadn’t seen a drop in business, as some had feared. In 2019, council voted to make the King Street corridor permanent.

As with downtown Copenhagen and Times Square, the King Street project illustrated how planners and analytics experts can make innovative uses of granular urban data in order to deliver city-building goals, and that it was possible to do so without compromising privacy or directing scarce funds to smart city tech firms.

Getting the real numbers out of 311

The City of Toronto 311 operations centre in Metro Hall.

When municipalities across North America began setting up 311 call centres to handle requests and complaints, the centres weren’t positioned as “smart city” systems. Rather, proponents saw 311 as a means of improving citizen engagement and bureaucratic accountability. Over the years, 311 services, including Toronto’s, have become increasingly tech-enabled, with social-media accounts, apps and the release of machine-readable complaint-tracking records through open data portals.

Municipalities now sit on vast troves of data from 311 calls — hundreds of thousands or even millions per year — that can be mined and analyzed, and then used to inform municipal planning and budgeting. A proliferation of calls about basement floods, missed garbage pickups or dubious odours from factories can provide important clues, both about what’s happening in a neighbourhood as well as the performance of city departments. If scanned carefully for longer-term patterns, 311 calls can also offer predictions about future problems.

These digital mountains of call records certainly qualify as “big data.” But the ways in which this information is or can be used also offers important lessons, both positive and negative, about applications for other large urban data sets that might be generated by smart-city technologies.

The most obvious application is how municipal agencies respond to residents’ requests for service. New York University urban analytics expert Constantine Kontokosta observes that many municipalities tend to make such decisions in a “black box,” with little transparency as to whose needs take priority (first-come-first-served, a triage system, etc.). He and other 311 researchers say that these data sets also contain important signals that could assist in making service delivery either more efficient or more equitable (which aren’t necessarily the same thing).

One pattern, noted by a New York State Health Foundation/Harvard research team in a 2020 study, is that spikes in calls about a particular problem may be the product of orchestrated community campaigns. The study described the practice as a “misuse” that could lead city officials to “erroneously” conclude that an area was experiencing some kind of decline.

Another evaluation, published by Kontokosta in 2017, looked at New Yorkers’ complaints about hot-water problems in their buildings. Drawing on 311 data, inspection reports, census tract information and other records, the study found that neighbourhoods with high rents and incomes, better educated residents and larger non-Hispanic white populations “tend to over-report”: “Based on these results, we find that socioeconomic status, householder characteristics and language proficiency have a non-trivial effect on the propensity to use 311 across the city.”

Still other analysts have mined 311 data sets to show how they correlate to broader trends, such as the spread of urban “blight.” Those patterns, according to a 2016 analysis by NYU and the Center for Urban Science and Progress, could theoretically be used to predict future real-estate prices.

In 2017, a team of geographers and artificial intelligence scholars at the University of Illinois at Urbana-Champaign used six years of Chicago 311 sanitation service requests (e.g., overflowing garbage cans) to develop what they said was the first algorithm capable of generating predictions to help guide decisions about scheduling and routes.
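The Illinois model itself is far more elaborate, but the basic shape of such a forecaster — learn from each area's recent request history, predict the next period — can be sketched with a simple moving average. All names and numbers here are hypothetical, a stand-in for the team's actual method:

```python
def forecast_requests(history, window=4):
    """Forecast next week's 311 sanitation requests per area as the
    moving average of the last `window` weeks of requests.

    history: dict mapping area -> list of weekly request counts
             (oldest first). A toy model, not the published algorithm.
    """
    return {
        area: sum(weeks[-window:]) / min(window, len(weeks))
        for area, weeks in history.items()
    }
```

A forecast like this, fed into the routing step, is what would let a sanitation department schedule trucks for next week's likely hot spots rather than last week's complaints.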

Kontokosta, whose work focuses more on fairness and equity than efficient management, contends that such algorithms will eventually be available commercially, but notes that one limiting factor is that many local governments still use older mainframe computers that don’t have the chops to process so much data.

The other is a dearth of data scientists and mathematicians on municipal payrolls. “People with these skills,” he says, “aren’t working for cities.”

This is part of the 2019-2020 Atkinson Fellowship in Public Policy series on the politics and governance of smart city technology. The series, by Toronto journalist and editor John Lorinc, will examine data and privacy, mobility applications, predictive policing, sustainable smart cities, data and planning and smart city megaprojects. It concludes with a discussion about how these systems can fit into accountable, progressive and democratic city-building efforts.

Next: Vendor lock-in, function creep and the minefield of digital tech procurement
