Sarah Spiekermann, Ph.D.   –   Business Informatics Professor and Thinker

EU-US Privacy Bridges after the end of Safe Harbor

  • 29 October 2015, 11:53

Many companies face the dilemma that their business models, and the data sharing practices that follow from them, are ethically incompatible with the beliefs and expectations of their customers. A new Bridge Report addresses this lamentable situation.

People working in the field of privacy and data protection these days are observing incredible dynamics: The European Court of Justice has brought down the Safe Harbor Agreement on personal data exchange between the US and the EU. This finally forces US companies to respect European jurisdiction on privacy matters (e.g. the right to delete data, or links to data, that undermine one's dignity). A European data protection regulation is about to come into force, one that foresees major financial sanctions for companies that do not comply with our European privacy norms. All of these legal developments are a great victory for what we in Europe consider to be a human right to privacy. I am a great enthusiast of these developments.

However, we should never forget one thing: the law is not about good or ethical behaviour. It is not about user trust or customer loyalty, brand building or sustainable competition. The law and its enforcement are no more than a red line. A 'hard' boundary. The edge of what we consider to be criminal or unlawful behaviour.

In order for digitalized companies to win and maintain sustainable user trust, they need to go beyond the law. No matter what legal baseline the EU and US legislators may agree on for data protection, institutions need to become really good at managing and safeguarding personal data. They need to have Total Quality Data Management in place. They need to steer through a rapidly changing IT service environment. And they need to protect their customers' privacy from disproportionate governmental surveillance. This is not easily done. Moreover, the Internet economy is (today) fundamentally based on the trade in personal data (the new "oil"). Many companies face the dilemma that their business models, and the data sharing practices that follow from them, are ethically incompatible with the beliefs and expectations of their customers.

Against this background, 19 privacy experts from the EU and US convened over 18 months to create a Bridge Report that addresses this lamentable situation. All 19 are internationally renowned legal and technical privacy experts, but they also have (and this is important) substantial knowledge of economic interrelations and company practices. They were chosen to represent the European and US markets, because Europe and the US have entered into a kind of hate relationship over data exchange. The quarrels and challenges described above have led to mutual accusations. Europe and the US are mentally drifting apart. But a huge amount of Europe's personal data is still processed in the US. So, bridges are inevitable. They are desperately needed, beyond the law, in order to stop the rising distrust.

The Bridge Report can be found here. It formulates 10 EU-US privacy bridges. Since I was a member of this expert group, I want to pull out a few thoughts I have on some of the bridges. What is particularly important to me when reading the details? And what do I find missing?

The Transparency Bridge

Personally, I was responsible for Bridge 3 (together with Joel Reidenberg and Ira Rubinstein from the US), which calls for new approaches to transparency of data processing activities. In my personal life I have found that more transparency leads to more trust and less conflict, even though it is hard ... So this is what I recommend to companies as well. I am aware that companies have difficulty creating transparency on data processing, because IT backend infrastructures are a bloody mess these days. For this very reason, companies have good reasons to work towards "simple", "meaningful" and "standardized" reporting of data handling practices. The more truthfully they work towards this external goal, the more clarity they will win for themselves internally.

Most important to me in this bridge are two further aspects. The first is that reporting of data handling practices needs to be "machine-readable". I find it absurd that highly digitalized and professional companies play the "paper mountain" game with customers and data protection authorities (DPAs). If external parties like DPAs and customers can automatically pull and compare information on data handling performance and compliance, then companies can compete on data management quality. This should be a long-term goal.
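To make this less abstract, here is a purely hypothetical sketch (in Python) of what such a machine-readable report could look like: a small JSON document that a company publishes at a well-known address and that a DPA's crawler could pull and compare across companies. All field names and the URL convention are my own invention for illustration; no such standard is prescribed by the Bridge Report.

```python
# Hypothetical sketch: a machine-readable data-handling report as JSON.
# All field names are invented for illustration; no such standard exists (yet).
import json

data_handling_report = {
    "controller": "ExampleCorp GmbH",
    "last_updated": "2015-10-29",
    "data_categories": ["contact_data", "usage_data"],
    "purposes": {
        "contact_data": ["service_delivery", "billing"],
        "usage_data": ["service_improvement"],
    },
    "third_party_sharing": False,
    "retention_days": {"contact_data": 365, "usage_data": 90},
    "automated_decision_making": False,
}

# Published at e.g. https://example.com/.well-known/data-handling.json,
# a DPA's crawler could diff this against last year's report, or rank
# companies by retention periods and sharing practices.
print(json.dumps(data_handling_report, indent=2))
```

The point of such a format is exactly the competition effect described above: once reports are standardized and pullable, quality of data management becomes comparable and visible.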

The second aspect important to me is the provision of information on "automated decision making". When companies segment their customers, categorize and judge them, I think people should know about it. Marketing folks won't be amused by this advice. After all, psycho-profile based targeting is one of the holiest cows of Internet marketing right now. But I don't think it is a long-term strategy. Research shows it is not. And ethically it is not ok. When I teach my business students at WU Vienna, I always tell them that in order to see whether a practice is ok or not, ethically fine or not, they should imagine making it public. Would they be comfortable seeing themselves on the front page of a newspaper, associated with that practice? If not, they should avoid it.

User Controls

There is always this argument that people would be overwhelmed by control over their data if that control were given to them. It is argued that future technologies in particular, like the Internet of Things, will make it impossible for people to control their data. So why not deprive users of control altogether from the start? I am happy that the 19 privacy experts agreed that this story is simply false. We can put people in the driver's seat for their data. We just need to build the right tools for people to use.

What you also see in the Bridge Report, though, is that none of us 19 believes that true user control is realistically achievable in a self-regulatory process. The W3C lamentably failed twice over the past 15 years to achieve industry consensus on and acceptance of user control standards, even though the technology was there. Insiders know the stories of the failed privacy control standards P3P and DNT. In our Bridge Report we therefore outline that the outcome of a "usable technology" needs to be supported by "clear regulatory guidance from both EU and US regulators".
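For readers who do not know DNT: the standard itself was technically trivial, a single HTTP header that the browser sends and that the server is free to honour or ignore, which is exactly why regulatory backing matters more than technology here. A minimal sketch of the mechanics (the server logic is my own illustration):

```python
# Minimal sketch of how the DNT signal worked on the wire: the browser
# sends the HTTP header "DNT: 1", and it is entirely up to the server
# whether to honour it.
from http.server import BaseHTTPRequestHandler, HTTPServer

class DNTHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        dnt = self.headers.get("DNT")  # "1" = user opts out of tracking
        tracking_allowed = dnt != "1"  # nothing forces us to respect this
        body = f"DNT header: {dnt!r}; tracking allowed: {tracking_allowed}\n"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body.encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), DNTHandler).serve_forever()
```

Nothing in the protocol forces the `tracking_allowed` branch to be respected; that enforcement gap is precisely what the call for regulatory guidance aims to close.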

EU and US Regulator Cooperation

I think one core achievement of the Bridge Report and its preparation is that high-level representatives (or former top executives) of the FTC and the Art. 29 Group came together and agreed that in many of today's tech developments we would benefit from coordinated action between EU and US regulatory bodies. Bridges 1 and 9 address these cooperation potentials. If I just think about the transparency and control standards I described above, I don't think that Europe can do it alone. If the FTC and the Art. 29 Group had gotten together earlier and had pushed the folks in the standardization bodies (like the W3C) to reach and embrace a consumer-friendly solution, then DNT or even P3P might have taken another path.

Personally, I do believe in the absolute sovereignty of our European democracy and regulatory bodies. I don't want any US body or lobby group to mess with our business. I am looking forward to watching the film "Democracy". But I do think that co-operation, information exchange and strategic alliances are absolutely important and beneficial when it comes to setting the right boundaries for corporate technical developments that will have a deep impact on the lives of citizens on both sides of the Atlantic.

De-identification or Anonymization

The necessity of US-EU cooperation in regulation comes up again when we talk about de-identification. I prefer the term anonymization. The Bridge Report of course recognizes that anonymization is not perfect. Big Data has the technical potential to re-identify people. Some datasets, like genetic information, cannot be reliably anonymized.

But I am very happy that the group of experts agreed that anonymization is still a very important and valid approach to protecting user privacy. The degree of defamation of anonymization practices that has popped up in academic papers is a pity and a loss for the privacy world. All it does is point out that you can break into a car that is locked. So what? Do we not continue to lock our cars? And don't we feel somewhat more protected by locking our cars and apartment doors?
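For readers who want to see what this "lock" looks like in code, here is a minimal sketch of one classic technique: k-anonymity through generalization of quasi-identifiers. The data and field names are invented for illustration, and the technique is standard textbook material, not something prescribed by the Bridge Report.

```python
# Minimal sketch of k-anonymity: coarsen quasi-identifiers (ZIP code, age)
# until every record is indistinguishable from at least k-1 others.
# Data and field names are illustrative only.
from collections import Counter

records = [
    {"zip": "10115", "age": 34, "diagnosis": "flu"},
    {"zip": "10117", "age": 37, "diagnosis": "asthma"},
    {"zip": "10119", "age": 31, "diagnosis": "flu"},
    {"zip": "10243", "age": 52, "diagnosis": "diabetes"},
    {"zip": "10245", "age": 58, "diagnosis": "flu"},
]

def generalize(r):
    """Coarsen quasi-identifiers: 3-digit ZIP prefix, 10-year age band."""
    decade = r["age"] // 10 * 10
    return {"zip": r["zip"][:3] + "**",
            "age": f"{decade}-{decade + 9}",
            "diagnosis": r["diagnosis"]}

anonymized = [generalize(r) for r in records]

# Check k-anonymity: count how often each quasi-identifier combination occurs.
groups = Counter((r["zip"], r["age"]) for r in anonymized)
k = min(groups.values())
print(f"dataset is {k}-anonymous")  # here: 2-anonymous
```

Like a car lock, generalization raises the cost of an attack without making re-identification strictly impossible, which is why the report pairs it with a legal backstop, as discussed next.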

Anonymization, I think, is an extremely important practice for creating datasets that companies and research institutions can work with and do research on. However, and this is crucial in our report: technical anonymization alone is not enough to protect users' personal data. The lock on a car is also not enough to prevent theft. What we need in parallel to the technical protection is the regulator. We write: "regulators might also agree to develop model clauses for … civil or criminal statutes prohibiting the re-identification and/or disclosure of de-identified personal information." And again, co-operation between US and EU regulatory bodies becomes important.

What I Am Missing

The most important bridge I am missing in the report is a true commitment to user choice and control. I wanted to see in the report a user right to a privacy-friendly service version. When I go to Facebook, I want to be allowed to pay them €/$ 3 a month to buy the service under the condition that they don't share my personal data with others who are not involved in immediate service delivery. It is this choice to opt for a privacy-friendly service version that puts users in control. At least some control, because in monopolistic Internet markets people often don't have the choice to switch to another service provider. So as long as I use Facebook (or any other monopolistic service I like), I am condemned to accept that they use my data for a lot of purposes that I do not appreciate at all. I believe that the unavailability of privacy-friendly service options generally undermines trust in Internet markets. I have extensively laid out my economic and scientific reasoning around this form of user control in a journal article (which can be found here). The music portal Spotify, I learned, is a great example of how such a privacy-friendly and ad-free service version can work. The biggest enigma for me, working in the BRIDGE team, is why they would not get this point. But this is probably the life reality of every democratic system: some good ideas simply don't make it.

The second point I miss is in Bridge 10 on education: On February 18th, 2014, the ACM (Association for Computing Machinery) and the IEEE Computer Society released jointly developed new curriculum guidelines for undergraduate degree programs in computer science. The program they communicated promised to take a "Big Tent" view of computing, fostering the integration of computing with other disciplines. When I read these powerful organizations' recommendations on the privacy (or any kind of ethical) education of future engineers, I was simply shocked. They are shamefully blank on privacy matters. Personally, I think we need to do a much better job of educating engineers to reflect on the ethical issues in their IT designs, in order to enable them to choose the right technical controls in the IT systems they conceive. The Bridge project would have had the power to reach out to these organizations. It does not.

Taken together, I am standing behind the Bridge Report. I do believe in co-operation across the Atlantic beyond the law in order to address future technological challenges. (Sarah Spiekermann, 29.10.2015)

Sarah Spiekermann is a university professor, chairing the Institute for Management Information Systems at Vienna University of Economics and Business. She does research on social questions of the Internet economy and technology design.

Link

37th International Privacy Conference

