Center for Applied Cybersecurity Research report examines data protection, personal privacy

Dec. 4, 2013


BLOOMINGTON, Ind. -- A report issued today from Indiana University’s Center for Applied Cybersecurity Research paves the way for updated privacy protection by focusing on the uses -- both positive and negative -- of personal information.

The Data Use and Impact report is based on a global workshop the center hosted in London in May. Participants included 25 senior representatives from industry, government, academia and advocacy groups in Australia, Canada, France, Israel, Italy, Mexico, New Zealand, the United Kingdom and the United States. The workshop and report were funded by The Privacy Projects, a nonprofit organization dedicated to improving privacy policies, practices and technologies through research, collaboration and education.

Fred H. Cate, Center for Applied Cybersecurity Research director and co-convener of the CACR workshop and of a global privacy summit organized by Microsoft Corp., said the sheer volume of data being generated today creates overwhelming challenges for the creation and enforcement of provisions to guide its use. For the average consumer, the implications arising from the use of their personal information may not be immediately clear.

“When was the last time anyone scrolled through the 50-plus screen pages of Apple’s terms and conditions when registering their new iPhone or bothered to read the privacy policies of social media sites?” Cate asked.

Those are but two common scenarios consumers face, and as the ubiquity of data -- be it geolocational, biomedical, financial or otherwise -- grows rapidly, those who have access to it face an array of questions about how it may be used.

“Only by focusing on how personal data are used and the potential harms and benefits likely to result from those uses can we ensure that data are used responsibly, that individual privacy is protected, and that data users are accountable stewards of the data they possess,” Cate said.

The Data Use and Impact report offers 10 conclusions, including:

  • The current focus of most data protection regimes on notice and consent at time of data collection is not working. Privacy notices are too complex; many privacy policies don’t provide meaningful terms, choices or restrictions on data use; individuals do not read them; there is a lack of clarity as to what constitutes a harm or risk of harm in connection with personal data; and our preoccupation with notice and choice focuses too much attention on the “bureaucracy of privacy” rather than on meaningful privacy protection.
  • Existing data protection systems should evolve to focus more on use of data and less on notice and choice, so that data users would evaluate the appropriateness of an intended use of personal data not by focusing primarily on the terms under which the data were originally collected, but rather on the likelihood of benefits or harms associated with the proposed use of the data. A greater focus on use would “make privacy relevant again” and enhance both privacy protection (including enforcement) and individual trust.
  • The real benefit to individuals and society doesn’t come just from a greater focus on uses of data, but rather from assessing the likely impacts -- good and bad -- of proposed uses. As a result, a critical component of the evolution toward a more use-focused data protection system is the development of a simple, transparent approach to risk assessment.
  • The goal of a risk management approach focused on data uses is to reduce or eliminate the harm that personal information can cause to individuals. Accomplishing this, however, requires a clearer understanding of what constitutes “harm” in the privacy context.
  • The evolution to a more use-based approach is particularly important with the advent of “big data” and the analytical tools that have accompanied it because personal data may have substantial valuable uses that were wholly unanticipated when the data were collected. In fact, the analysis of big data doesn’t always start with a question or hypothesis, but rather may reveal insights that were never anticipated. As a result, data protection that is based on a notice specifying intended uses of data and consent for collection based on that notice can result in blocking socially valuable uses of data, lead to meaninglessly broad notices or require exceptions to the terms under which the individual consented.
  • Implementing a more use-focused data protection system must be achieved through evolution, rather than revolution. This evolution actually recaptures the early focus of data protection in Europe and the United States on risk assessment to prevent harm, rather than on protecting individual privacy rights. Laws implementing use-based risk analysis should specify objectives and outcomes, rather than telling information users how to achieve them.

The report is the most recent in a series of initiatives designed to make privacy protection more workable and more effective. The series began with global data protection dialogues convened in 2012 by Microsoft in Washington, D.C., Brussels, Singapore, Sydney and São Paulo for small groups of leading regulators, industry executives, public interest advocates and academic experts.

These events culminated in a global privacy summit in Redmond, Wash., at which Microsoft convened more than 70 privacy and data protection experts from 19 countries on five continents to consider the future of data sources and uses and practical steps to enhance privacy protection. The summit called for the examination of data uses and impacts contained in today’s report, along with a re-examination of the OECD Fair Information Privacy Principles, which is also being released today by the Oxford Internet Institute. That report, "Data Protection Principles for the 21st Century," is co-authored by Cate; Peter Cullen, Microsoft’s general manager for trustworthy computing governance; and Viktor Mayer-Schönberger, professor of Internet governance and regulation at the University of Oxford’s Oxford Internet Institute.

The next step in this reconsideration of privacy protection is a series of events focusing on assessing and managing risks surrounding the use of data. The Center for Applied Cybersecurity Research hosted one of those events -- a tutorial on risk management for data protection experts -- in November and will be hosting another -- a workshop to help create frameworks for identifying and assessing risks presented by data uses -- in late spring. Both events have been funded by The Privacy Projects.

About CACR

The Center for Applied Cybersecurity Research is a National Center for Academic Excellence in both Information Assurance Education and Research. It works to improve the quality of cybersecurity practice, research and education by combining technical expertise with an interdisciplinary focus on law, policy, economics and behavioral sciences. The center is part of the Indiana University Pervasive Technology Institute.