Municipal governments do countless mundane things: processing building permits, collecting residential taxes, distributing school supplies to needy families. Many of these services are provided inefficiently, and they require large, expensive staffs.
The rise of technologies such as artificial intelligence (AI) has created an obvious opportunity: automating municipal services to make them more efficient. It has also created many ways for things to go wrong.
Although AI dominates the media spotlight, automation encompasses a wider range of technologies, including traditional algorithmic prediction and digital identity. The automation of municipal services has enormous potential, but it comes with significant risks.
Chicago’s Restaurant Inspection Algorithm
2025 marks the tenth anniversary of Chicago's use of technology to determine which restaurants need health inspections.
The city of Chicago has more than 15,000 restaurants, and in a typical year about 15% of them receive a critical health violation. In 2015, Chicago's Department of Public Health developed an algorithm that analyzed factors such as a restaurant's history of past critical violations, its tobacco and liquor licenses, how long it had been in business and the number of nearby burglaries. Based on this analysis, the algorithm told city health inspectors which restaurants to inspect first.
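In its simplest form, prioritization like this can be thought of as a weighted risk score computed over the factors listed above, with inspections ordered from highest score to lowest. The sketch below is purely illustrative: the feature names and weights are invented for this article and do not reflect Chicago's actual model or data.

```python
# Illustrative sketch of factor-weighted inspection prioritization.
# All feature names and weights are hypothetical, not Chicago's real model.

def risk_score(restaurant: dict) -> float:
    """Combine risk factors into a single score; higher means inspect sooner."""
    weights = {
        "past_critical_violations": 3.0,  # hypothetically the strongest signal
        "has_tobacco_license": 0.5,
        "has_liquor_license": 0.8,
        "years_in_business": -0.1,        # longer-running, slightly lower risk
        "nearby_burglaries": 0.2,
    }
    return sum(weights[k] * restaurant.get(k, 0) for k in weights)

def prioritize(restaurants: list[dict]) -> list[dict]:
    """Return restaurants ordered by descending risk score."""
    return sorted(restaurants, key=risk_score, reverse=True)

if __name__ == "__main__":
    sample = [
        {"name": "A", "past_critical_violations": 2, "has_liquor_license": 1,
         "years_in_business": 3, "nearby_burglaries": 4},
        {"name": "B", "past_critical_violations": 0, "has_tobacco_license": 1,
         "years_in_business": 20, "nearby_burglaries": 1},
    ]
    for r in prioritize(sample):
        print(r["name"], round(risk_score(r), 2))
```

A production system would learn the weights from historical inspection outcomes rather than hand-tuning them, but the basic shape, features in, ranked inspection queue out, is the same.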
The program was successful enough that the city significantly expanded it the following year. The new version also scans restaurants' public social media posts and draws on many new sources of data. Chicago has since expanded its use of predictive algorithms once again: in 2023, the city began incorporating AI into the process.
The program's success is measurable. Chicago carried out an extensive two-month evaluation with the insurance company Allstate to determine whether the algorithms were working. During the study, one team of inspectors performed regular inspections using traditional methods, while a second team used the algorithm to decide which restaurants to evaluate. The algorithm-guided team uncovered critical health violations in an average of 7.5 days; the traditional team took 14.
Even the most ardent believers in small government want restaurants to be safe. Food safety is a relatively uncontroversial use case. But what if a similar technique were used to predict who would commit violent crimes? There is significant correlational data showing that men are more violent than women, that black men commit more violent crimes than white men and that poor people are more likely to be arrested for crimes than rich people. An algorithm that told police officers whom to investigate based on gender, race and wealth would be clearly discriminatory.
The question then becomes: should algorithmic discrimination based on other factors, such as a restaurant's location or nearby burglaries, be allowed? How might such data correlate with the religion or ethnicity of the owners? Could it become a justification for discrimination? Where do we draw the line? There will never be a clear answer.
A restaurant with a history of past health violations merits more scrutiny than a restaurant with a perfect record. The same might be said for a violent criminal. There are some types of data that, as a society, we are comfortable using as the basis for future investigations – and some types that we’re not.
Editor’s note: Next week, Part II will explore these questions further by looking at California’s widely popular mobile driver’s licenses. Part III will review Hangzhou’s City Brain, a massive and somewhat creepy artificial intelligence system that is housed in a municipal database.
Thibault Serlet is a partner at Key State Capital, a venture capital consortium that invests in digital identity technology. He previously served as the president of the Adrianople Group, a business intelligence firm that helped investors finance the creation of new Special Economic Zones. He led the creation of several large-scale datasets, including the Web of Trust, a global database of decentralized digital identity projects; Open Zone Map, the world's first global map of free zones; and the Charter Cities Institute's New Cities Map.