Snow also set the stage for another data-driven mayor to take office in Chicago. Just one month after New York’s blizzard, Chicago mayor Richard Daley faced down an even worse snowstorm. In another bungled response, Daley’s chief of staff delayed a decision to close down Lake Shore Drive, and hundreds of people were stranded as cars and buses were trapped by drifting snow. Daley faced some of the harshest criticism of his twenty-two-year reign.
When Rahm Emanuel, former White House chief of staff and mayor-elect, arrived at Chicago’s City Hall in May 2011, the memory of the fiasco was still fresh. As summer turned to fall, forecasters predicted a harsh winter ahead. Unlike New York, where plowing progress was tracked manually by radio reports from drivers during the 2010 blizzard, Chicago had installed GPS trackers on all its plows in 2001. City officials could follow the plows on a real-time map, but citizens had no way to access this information. Accusations of preferential snow removal on streets and in neighborhoods of the mayor’s political supporters were common.
The lack of transparency around Chicago’s plowing operations is far more typical of how city governments operate than the free-for-all data giveaways of apps contests. The vast majority of the data that city governments collect remains hidden. Department heads guard this data closely and resist sharing it even with each other, let alone with the public. It is the source of their power, and it can expose their shortcomings.
But as Emanuel said in the weeks following Barack Obama’s election in November 2008 amid the global economic meltdown, “You never want a serious crisis to go to waste.... This crisis provides the opportunity for us to do things that you could not do before.” John Tolva, Emanuel’s new chief technology officer, had a simple solution—open up the plow map. The result, Chicago Shovels, sported a gamelike Plow Tracker map that showed the progress of plows during major storms. But Tolva also saw the map as a way to recruit citizens to help with snow removal, and developed a tool called Snow Corps to match shovel-ready volunteers with snowbound senior citizens. Tolva’s approach to data-driven reforms couldn’t have been more different from Goldsmith’s in New York. Instead of data-mining organizational charts and performance records to right-size the city workforce, he opened up operational data to mobilize citizens. And he had his own ideas about using technology to make government more cost-effective.
Tolva’s path to public service began on the windswept platform of the L, as the city’s elevated trains are called. As he recalls, “During the mayoral campaign, Rahm did a tour of over a hundred L stops. It was December, it was freezing, it was early and I went into my L stop. He and I were the only people there, so I approached him and said, ‘What do you think about open data?’ ” Emanuel countered, “Do you mean, like, transparency?” Tolva told me.31
“If I was going to hook him I would have to hit him where it hurts,” Tolva recounts. “No, I mean saving money.” People streamed into the station around them, but Emanuel ignored them, momentarily fixed on Tolva. Before turning to greet the throng of prospective voters flooding into the station, he locked hands with Tolva. “We should talk,” the candidate told him. Five months later, Tolva received an invitation to join the mayor-elect’s transition team.
Bloomberg may be fond of numbers, but Tolva is a data junkie, obsessed with the stuff and always on the hunt for more. Before taking the job as Chicago’s chief technology officer, he had spent some thirteen years at IBM—most recently as the head of the company’s City Forward project, an effort to evangelize the virtues of data-driven decision making in local government. One of the projects he oversaw was the deployment of City Forward’s Web app, which let people create benchmark comparisons between cities around the world using a variety of vital statistics.
By early 2012 Tolva was working hard to live up to the promise he’d made to the mayor on that train platform. He was busy building an early warning system of his own, like the UN’s Global Pulse, to scour the city’s data for trouble spots. As we spoke by phone, he overflowed with excitement about all of the free technology at his disposal, rattling off a laundry list of powerful open-source software tools that were rapidly democratizing the ability to manage and analyze big data. They include MongoDB, a tool for managing huge databases (which Tolva learned about from the Foursquare crew), and R, a language for statistical analysis.
Early results of these number-crunching explorations of the city’s big data are tantalizing. “Deep analytics,” he says, borrowing IBM’s jargon for the collection of tools and techniques for dissecting big data, “is about more than just performance management and transparency. It’s about showing us where there are connections that we did not realize.” In one experiment, his team cross-referenced Meals on Wheels delivery logs with the city’s own tax records to generate a map of the elderly living alone. “We can start to build up a list of people that need to be checked on during heat and cold emergencies,” he says. “Is that a cost-saving tool? Yes. But it is also a lifesaving tool.” In Chicago’s harsh climate, extreme weather routinely claims the lives of dozens of seniors.
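To make the mechanics of that experiment concrete, here is a minimal sketch of the cross-referencing step, written in Python with the pandas library. The column names, sample addresses, and the single-occupant rule are all invented for illustration; Chicago’s actual schemas and Tolva’s actual code are not described in the source.

```python
# A hypothetical sketch: join meal-delivery logs against property-tax
# rolls to flag addresses where a senior appears to live alone.
# All field names and sample data are invented for illustration.
import pandas as pd

deliveries = pd.DataFrame({
    "address": ["100 W Oak St", "215 S Elm Ave", "77 N Pine Rd"],
    "recipient_age": [81, 74, 88],
})

tax_rolls = pd.DataFrame({
    "address": ["100 W Oak St", "215 S Elm Ave", "77 N Pine Rd"],
    "registered_occupants": [1, 3, 1],
})

# Join the two datasets on street address, then keep only addresses
# with a single registered occupant of advanced age.
merged = deliveries.merge(tax_rolls, on="address")
check_on = merged[(merged["registered_occupants"] == 1)
                  & (merged["recipient_age"] >= 65)]

# Addresses to visit during a heat or cold emergency.
print(check_on["address"].tolist())
```

The operation itself is an ordinary database join; what made it novel inside city government was simply that the two datasets had never been placed side by side.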
Inspired by popular data-driven online indexes like WalkScore, which computes a numerical measure of walkability for any US street address, Tolva was also working on a Neighborhood Health Index. A massive mash-up, it would synthesize “all the indicators that we have block by block and infer the probability that an undesirable outcome will result.” While Chicago’s effort looked at real data, not some abstract model, there was an eerie similarity to the cybernetic missteps of the 1960s that tried to compute urban decay. But Tolva wasn’t entirely seduced by data. He understood that it is nothing more than a diagnostic tool: “A single data point that does not tell you that a house is going to fall into blight but [the index could signal] that there is a higher than normal probability that it will be in disrepair.” The data could then be used as an input when allocating revitalization funds or directing social workers to trouble spots. It was a strategy cut from the same cloth as Goldsmith’s vision for transforming bureaucrats and civil servants into knowledge workers, but without the union busting.
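Tolva never published the index’s formula, but the general shape of such a composite score is straightforward to sketch: standardize each block-level indicator, combine the standardized values with weights, and squash the sum into a 0-to-1, probability-like score. In the Python sketch below, the indicator names, weights, and figures are all hypothetical.

```python
# A hedged sketch of a composite neighborhood index: weighted z-scores
# passed through a logistic function to yield a 0-1 "risk" score.
# Indicators, weights, and numbers are invented for illustration.
import math

def risk_score(indicators, means, stdevs, weights):
    """Combine standardized indicators and map the sum to (0, 1)."""
    z = sum(weights[k] * (indicators[k] - means[k]) / stdevs[k]
            for k in indicators)
    return 1.0 / (1.0 + math.exp(-z))

# One census block's (hypothetical) indicators, citywide baselines,
# and the weights a policy analyst might assign.
block   = {"vacant_lots": 9.0, "code_violations": 14.0, "tax_delinquency": 0.22}
means   = {"vacant_lots": 4.0, "code_violations": 6.0,  "tax_delinquency": 0.08}
stdevs  = {"vacant_lots": 3.0, "code_violations": 5.0,  "tax_delinquency": 0.06}
weights = {"vacant_lots": 0.5, "code_violations": 0.3,  "tax_delinquency": 0.2}

print(f"blight risk: {risk_score(block, means, stdevs, weights):.2f}")
```

A score like this is exactly the kind of diagnostic Tolva describes: it does not say that a house will fall into blight, only that the odds on that block run higher than normal.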
As a triage tool for stretching scarce city resources, it’s hard to argue against this kind of data-driven management. But as data becomes more central to how we measure government performance, it can create perverse incentives. One of the largest and longest-running data-driven management systems of any American city is the New York City Police Department’s CompStat program. Since 1994 CompStat has combined computerized mapping of crime reports with weekly roll-call meetings where commanders are grilled by their superiors over any errant localized spikes in lawlessness. In practice, it allows the NYPD to shift resources to wipe out crime hot spots before they can undermine a community’s sense of order. For many years, the program was widely credited with the stunning decline in New York’s crime rate in the 1990s, though many other theories have been put forth to explain it (for instance, the reduction in the number of at-risk teens following the legalization of abortion decades earlier, and the end of the crack epidemic). Regardless of its efficacy, in recent years criticisms of CompStat’s impact on policing have mounted.34 It turned out that, in their quest to maintain steady reductions in the reported crime rate, police officers allegedly reclassified crimes as less serious offenses on a routine basis and even discouraged citizens from reporting them in the first place. CompStat shows that when data drives decisions, the decisions about how to record that data become targets for distortion.
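The spike hunting at the heart of those weekly meetings can be approximated with very simple statistics, which is part of why the incentive problem bites: anyone reporting the numbers can see exactly which threshold not to cross. Here is a rough Python sketch, with made-up precincts and counts; the NYPD’s real methodology is more elaborate and not documented in the source.

```python
# A hypothetical spike check: flag any precinct whose weekly count
# exceeds its trailing average by more than two standard deviations.
# Precinct names and counts are invented for illustration.
import statistics

history = {  # past eight weeks of reported incidents per precinct
    "precinct_14": [41, 38, 45, 40, 39, 44, 42, 43],
    "precinct_32": [20, 22, 19, 21, 23, 20, 22, 21],
}
this_week = {"precinct_14": 44, "precinct_32": 35}

for precinct, counts in history.items():
    mean = statistics.mean(counts)
    sd = statistics.stdev(counts)
    if this_week[precinct] > mean + 2 * sd:
        print(f"{precinct}: spike ({this_week[precinct]} vs. average {mean:.1f})")
```

Note what a fixed threshold rewards: an officer who quietly downgrades a few reports keeps the precinct under the line, and the map stays green.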