
Scenario: Planning

Debora Weber-Wulff & Christina B. Class

Michaela works at the local police station in Neustatt. Due to decreasing tax revenues and cuts in federal subsidies, significantly less funding will be available for police work in the coming years. The department will probably not be able to refill positions lost to natural attrition, such as retirements. A complete overhaul of the police presence in the city is needed. This is a serious problem, because petty crime, thefts, and burglaries have risen sharply recently. After several tourists were robbed while visiting the famous city center and the Museum of Modern Art, the tourist board and the hotel and restaurant industry are putting additional pressure on the police department and calling for an increased police presence.

Michaela’s friend Sandra is involved in an interdisciplinary research project that combines data mining and artificial intelligence methods with sociological findings to develop a new approach to urban development. Current results suggest that the prototype developed can better predict criminal activity in certain areas. As the second phase of the project gets underway, a proposal is on the table to include additional municipalities and test the approach with detailed, albeit anonymized, information on crimes and offenders.

Michaela arranges for Sandra to meet with the mayor at his office. The mayor listens carefully to what Sandra says and is interested in working together on the project. He invites Sandra to the next city council meeting. After Sandra’s presentation, a heated discussion ensues. Some council members raise concerns about data protection. Sandra explains the measures being taken to anonymize and protect personal data. Peter objects that members of small, select groups would still be identifiable in practice. Anton then interjects that crime is often linked to a combination of factors, such as poverty, unemployment, and education. He has heard that the sociology professor involved in the project focuses almost exclusively on ethnic origin at the expense of many other factors, which he considers discriminatory. Werner, the owner of a large catering business, advocates for a judicious use of the police force to protect people and businesses from increasing crime.

Michaela is confused. She sees the potential for this approach to make good use of police resources. However, she also understands the concerns. But haven’t people always been assigned to certain categories in our society, and all the more so since data records began to be analyzed electronically? Recently, her car insurance premiums went up because she belonged to a category of insured persons whose accident rate had risen. That also strikes her as unfair.

Questions:

  1. How problematic is the categorization of people in our society? Is it really that big an issue, and is it exacerbated by IT?

  2. Should municipalities be permitted to release sensitive personal data in anonymized form for research projects? What ethical problems might arise here? How trustworthy are methods for anonymizing data?

  3. Planning for increased police presence in certain geographic areas can protect residents from crime. But doesn’t this necessarily involve a certain degree of prejudgment of the people who live there? When is this type of prejudgment merited?

  4. Should the whole project and the prototype be scrapped just because one participant has used dubious criteria for classification?

  5. Determining which variables are relevant is a critical element of this project. How can we guarantee that prejudices do not guide this process?

  6. How can we prevent correlations from suddenly being interpreted as causal relationships? How great is the danger this presents in the context of analyzing personal data?

Published in Informatik-Spektrum 35(3), 2012, pp. 236–237

Translated from German by Lillian M. Banks
