Proposed Chicago Data Sensors Raise Concerns over Privacy, Hidden Bias

By Michael Holloway, John McElligott

Beginning in mid-July, Chicagoans may notice decorative metal boxes appearing on downtown light poles. They may not know that the boxes will contain sophisticated data sensors that will continuously collect data on “air quality, light intensity, sound volume, heat, precipitation, and wind.” The sensors will also gauge nearby foot traffic by counting signals from passing cell phones. According to the Chicago Tribune, project leader Charlie Catlett says the project will “give scientists the tools to make Chicago a safer, more efficient and cleaner place to live.” Catlett’s group is seeking funding to install hundreds of the sensors throughout the city. But the sensors raise concerns about potential invasions of privacy, as well as the risk of data sets with hidden biases that may then guide policy to the disadvantage of poor and elderly people and members of minority groups.

Privacy

Project leaders and City officials deny that the sensors raise privacy concerns. According to Catlett, a computer scientist, the sensors will “count contact with the signal rather than record the digital address of each device,” and “information collected by the sensors will not be connected to a specific device or IP address.” Brenna Berman, the City’s commissioner of information and technology, said that “privacy concerns are unfounded because no identifying data will be collected.” However, Alderman Robert Fioretti has called for a public hearing on the data sensors. Fioretti notes that the City Council was never consulted about the plan, an Emanuel administration initiative, and says the sensors raise “obvious invasion-of-privacy concerns.”
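
To make that distinction concrete, the sketch below (hypothetical Python, not the project’s actual software) shows what “counting contacts” without retaining identifiers could look like: each device’s address is observed in passing, but only an aggregate tally per time window is ever stored.

```python
from collections import Counter

def count_contacts(probe_requests):
    """Tally device contacts in five-minute windows.

    `probe_requests` is an iterable of (timestamp, mac_address)
    pairs as a sensor might observe them. The address is read to
    detect a contact, then discarded; only counts are retained.
    """
    counts = Counter()
    for timestamp, _mac in probe_requests:   # address never stored
        window = int(timestamp) // 300       # 5-minute bucket
        counts[window] += 1
    return dict(counts)

# Three contacts in the first window, one in the second.
print(count_contacts([(0, "aa:bb:cc:dd:ee:01"),
                      (120, "aa:bb:cc:dd:ee:02"),
                      (250, "aa:bb:cc:dd:ee:01"),
                      (400, "aa:bb:cc:dd:ee:03")]))
# {0: 3, 1: 1}
```

Whether the deployed sensors actually work this way is exactly what critics like Fioretti want examined; the design choice that matters is that identifiers never persist beyond the moment of counting.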

Raising a note of skepticism about the City’s privacy assurances, Professor Fred Cate of Indiana University’s Maurer School of Law noted how difficult it is to avoid collecting personally identifiable information, even when protections designed to prevent such collection are in place: “Almost any data that starts with an individual is going to be identifiable.” Cate’s statement accords with research showing that, in practice, supposedly anonymous or anonymized data can in many cases be linked back to a specific individual. Cate also raised the question of oversight: “If you spend a million dollars wiring these boxes, and a company comes in and says ‘We’ll pay you a million dollars to collect personally identifiable information,’ what’s the oversight over those companies?”
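
Cate’s point is easy to demonstrate. Suppose a system tried to “anonymize” device addresses by hashing them, a hypothetical design rather than anything the project has proposed. Because the space of plausible MAC addresses is small (a handful of known vendor prefixes, each followed by only 2^24 device suffixes), the hashes can be reversed by brute force:

```python
import hashlib

def anonymize(mac: str) -> str:
    """A naive 'anonymization' scheme: hash the MAC address."""
    return hashlib.sha256(mac.encode()).hexdigest()

def reidentify(target_hash: str, vendor_prefix: str):
    """Recover the original MAC by enumerating all 2**24 device
    suffixes behind a known vendor prefix (OUI)."""
    for n in range(2**24):
        candidate = "{}:{:02x}:{:02x}:{:02x}".format(
            vendor_prefix, n >> 16, (n >> 8) & 0xFF, n & 0xFF)
        if anonymize(candidate) == target_hash:
            return candidate
    return None

# A made-up device address: recovering it from its hash takes
# seconds to minutes in pure Python, far less with optimized tools.
original = "3c:22:fb:1a:2b:3c"
assert reidentify(anonymize(original), "3c:22:fb") == original
```

Published re-identification studies rely on richer side information, such as location traces and timing, but the lesson is the same: hashing an identifier is not the same as removing it.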

In light of the potential privacy concerns, Dean Harold Krent of IIT Chicago-Kent College of Law noted that transparency will be key to Chicago’s operation of the sensors: the City must be clear about how many sensors there are and how they are used, and must ensure that the data they capture is easily accessible to public officials.

Hidden Bias

Jeremy Gillula, a staff technologist at the Electronic Frontier Foundation (EFF), pointed out that the proposed system may create unintentionally biased data sets. The sensors will track contacts with signals from Wi-Fi- and Bluetooth-enabled devices, but those counts will reflect only a subset of overall foot traffic, since not all passers-by carry such devices. In Boston, the use of a mobile app called Street Bump to track potholes produced biased data because smartphone owners tended to live in wealthier areas. Similarly, many tweets during Hurricane Sandy originated in the largely affluent borough of Manhattan, giving the impression that it was among the hardest-hit areas, when in fact lower-income, outlying areas such as Breezy Point, Coney Island and Rockaway suffered more damage.

These examples reflect the fact that large data sets, while seemingly objective and abstract, are “intricately linked to physical place and human culture.” As the EFF has noted, “many groups are under-represented in today’s digital world (especially the elderly, minorities, and the poor). These groups run the risk of being disadvantaged if community resources are allocated based on big data, since there may not be any data about them in the first place.” Chicago will need to carefully validate the data collected from the proposed sensors to avoid introducing similar biases into policy and planning decisions.
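
A toy calculation (with invented numbers) makes the mechanism plain: if two blocks have identical foot traffic but different rates of detectable-device ownership, raw sensor counts will make the wealthier block appear busier.

```python
# Identical true foot traffic on two hypothetical blocks.
actual_foot_traffic = {"wealthy_block": 1000, "lower_income_block": 1000}

# Invented share of passers-by carrying a detectable Wi-Fi/Bluetooth device.
device_ownership_rate = {"wealthy_block": 0.85, "lower_income_block": 0.55}

# What the sensors would actually report.
raw_counts = {
    block: int(traffic * device_ownership_rate[block])
    for block, traffic in actual_foot_traffic.items()
}
print(raw_counts)   # {'wealthy_block': 850, 'lower_income_block': 550}

# A naive correction divides by the ownership rate, but that rate must
# itself be measured per neighborhood (the validation step noted above).
corrected = {
    block: round(count / device_ownership_rate[block])
    for block, count in raw_counts.items()
}
print(corrected)    # {'wealthy_block': 1000, 'lower_income_block': 1000}
```

Here two equally busy streets differ by more than 50 percent in raw counts, which is precisely the kind of distortion that turned Street Bump and the Sandy tweets into misleading guides.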

Michael Holloway is a Legal Fellow at IIT Chicago-Kent’s Institute for Science, Law and Technology.

John McElligott is a Research Assistant at the IIT Chicago-Kent Institute for Science, Law and Technology. He is a second-year law student at IIT Chicago-Kent College of Law.
