Chicago is often praised as a leader in civic technology, its analytics projects hailed as exemplary. But does the city truly merit all the kudos?
After reading yet another reference to Chicago's innovative use of text analytics (to battle rat infestations, among other things), I poked around a bit to learn more about Chicago's award-winning approach to civic tech.
Chicago's Mayor Rahm Emanuel made open data part of his transition plan, announced at the start of his administration in 2011. Open data and predictive analytics are a major tool -- and talking point -- in his quest to improve Chicago.
Chicago Mayor Rahm Emanuel in a meeting with city technology leaders.
(Source: Daniel X. O'Neil via Flickr)
Some critics have argued that Chicago's big-data projects are more flash than substance. Some even question whether Chicago's much-touted crime analytics project has reduced crime, or whether increased police presence deserves the credit. Other critics contend that the online apps themselves aren't as transparent as they could be, including the fellow who posted the following image online:
City of Chicago Pothole tracking app. Posted by Steven Vance on Flickr, with the caption: "Pothole data, but not the measurement the Chicago Forward Action Agenda called for over 2.5 years ago, in May 2011."
Still, there is a lot of visible progress. The city won $1 million from Bloomberg Philanthropies in March 2013 to "build the first citywide real-time predictive analytics platform." The Chicago Department of Innovation and Technology (DoIT) calls this project the SmartData platform, formerly nicknamed "Windy Grid," a real-time analysis system that works alongside a series of separate tools to boost Chicago's analytics capabilities.
The SmartData Platform is still in development, says Brenna Berman, Chicago's DoIT Commissioner and CIO. "While it won't be complete for 2 years, components are operational now," she wrote in an email to me.
SmartData is just part of Chicago's multipronged effort to stay up to the minute with big data. There are also the CrimeScan and CityScan applications, developed with Carnegie Mellon University (the former used in Chicago's well-documented crime tracking system). Then there is the city's data dictionary, the work of a specially appointed open data unit and data science team; not to mention online tools for small businesses; and the Pothole Tracker mentioned above, which is part of the Open Data Portal, along with the rat-catching software and other apps.
On the face of it, this sounds impressive. But it also seems a bit unwieldy, reminiscent of Google, which has a tendency to barge into projects on a grand scale, only to lose interest later.
Is Chicago going the way of Google?
When I asked Brenna Berman about integrating all of Chicago's data efforts, she said it may not even make sense to do so. "The SmartData Platform targets issues related to big spatial data and the City does have a few other analytics tools that target different issues," she stated. "Sometimes technical integration between these tools makes sense, but more often we focus on educating our users about which tool is best in given situations to meet their business goals."
Nothing Google-like about that. Indeed, it seems realistic to accept that when it comes to city analytics, one tool does not fit all purposes, and probably never will.
Chicago Department of Innovation and Technology (DoIT) Commissioner and CIO Brenna Berman, at a civic hacking event last year.
(Source: Daniel X. O'Neil via Flickr)
I don't have figures to indicate what Chicago has spent on its enormous analytics push or what it's saved as a result of the work. (Yes, I've asked for this information, but nothing has come from the city press office yet, and the city's website presents plenty of data, albeit in a hard-to-read spreadsheet format.) Still, there are many signs that the approach has been a well-considered one.
For one thing, Chicago has aggressively insisted on open-source development tools, and has not displaced existing systems. "In government, often when we want to do something big and different like this, we replace everything," said Brett Goldstein, the city's former CIO and chief data scientist, in an interview with Future Cities sister site InformationWeek. As Chicago continues to develop its solutions with help from corporate and academic partnerships, it will share what it learns with the open-source community, Goldstein said.
Another promising sign is that Chicago is proceeding apace with its big-data work, undeterred by setbacks, criticism, or personnel changes. That shows me there's coherence in the plan. For instance, Goldstein, whose appointment in Chicago made tech press headlines for months, resigned his post last spring, making way for Brenna Berman's rise to CIO. A new chief data scientist will be hired to replace Goldstein, Berman says: "I expect exciting news in this area soon," she wrote me this week.
Also heartening is that Chicago's DoIT isn't trying to boil the ocean, but is building on what it learns and tests. As InformationWeek's Chris Murphy stated last April:
Graffiti and rodents are mere starting points, says Brenna Berman, an ex-IBMer who's now first deputy CIO. "This is the approach for how this department will be part of the answer for tackling the murder rate or addressing complex emergencies like snowstorms or improving the water infrastructure," she says. The harder-to-solve problems will take more data and analysis of more variables, but "it's the exact same story...."
I think Chicago's on the right track. To outsiders, the city's IT presents a confusing flurry of projects, but any analytics expert will tell you that data gets "smarter" the more you work with it. If the philosophy and strategy are sound, the results can be surprisingly effective. It requires a process, though, involving iterative trials.
Kudos, Chicago. Can't wait to see what solutions you produce!
— Mary Jander, Managing Editor, UBM's Future Cities