Posted: March 9, 2012 at 5:15 pm | By: lisavanpappelendam | Tags: confidentiality, control, engineering, Gürses, infrastructure, practice, privacy, responsibilization, Seda, Seda Gürses, SNS, SPION, technology, UnlikeUs#2
For session #3, The Private in the Public, Seda Gürses, coordinator of the interdisciplinary project Security and Privacy in Online Social Networks (SPION), was invited to speak about technological responses to present-day privacy issues. SPION proved to be a challenging and highly necessary endeavor to reconcile notions of privacy with technical privacy solutions within a particular social context.
(Click here for the video of Seda Gürses’ presentation)
Gürses starts off by explaining the engineering perspective on privacy within computer science, the field of her own academic background. The main question there can be simplified to the following: what does a system need to do to enable privacy? The difficulty engineers face is that there is no singular notion of privacy, while exactly such singularity is one of their critical requirements.
She goes on to ask the audience what role our mother's maiden name – just to name one example – plays in online security. In the offline world, financial institutions were held responsible for guaranteeing consumer security. In the online world, with the rise of online banking, we see responsibility for security shift to both financial institutions and the consumers themselves. Suddenly the risk of identity theft reared its ugly head, making the quest for security solutions a pressing matter. Gürses clarifies how, from then on, security questions such as your mother's maiden name, encryption, and password models became part and parcel of our everyday (online) existence. This forces consumers to keep certain information about their lives secret from others. Gürses refers to this process as responsibilization in design, whereby subjects are rendered responsible for their own privacy and security.
Before attempting to formulate a definition of privacy from a computer science perspective, Gürses playfully recounts various opinions popular among engineers regarding users. Users are considered indifferent, stupid, and the weakest link. A different perspective, however, might modify these opinions, Gürses claims. What if engineers and researchers kept in mind that, on social networks or social networking sites (SNS), users assume it is the SNS provider that is responsible for guaranteeing them a private and secure online environment? Research has shown that social media users are overwhelmed by the potential dangers and feel incapable of judging the long-term risks.
Gürses therefore proposes that technical specialists take these responsibilities away from users and place them within the technology – in other words, the infrastructure – where they might indeed belong. Mechanisms would be activated to mitigate the negative effects of SNS. It all comes down to making SNS providers the true bearers of responsibility for privacy and security issues.
But what is privacy then? Gürses discusses three circulating definitions in detail. In the classical view of computer scientists, privacy is either confidentiality or the right to be let alone (Samuel D. Warren & Louis D. Brandeis, 1890). If confidentiality is at stake, the solution can be anonymous communications. Such mechanisms acknowledge that users must give up certain data about their behavior, but they can keep the identity of the source strictly confidential. Other solutions to privacy breaches can be dummy traffic, mixes, or encryption. This is not to oversimplify matters: there are in fact intangible factors within a user's environment that can pose obstacles in this process of hiding information and identity. Within SPION, researchers are working on tools to protect confidentiality, for instance query forgery.
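To make the idea of query forgery concrete: the technique hides a user's genuine search query among automatically generated decoy queries, so that an observer of the traffic cannot single out the real one. The following is only an illustrative sketch of that principle – the dummy pool and function names are hypothetical, not SPION's actual tool:

```python
import random

# Hypothetical pool of plausible decoy queries (illustration only).
DUMMY_POOL = [
    "weather tomorrow", "train schedule brussels", "news headlines",
    "pasta recipe", "football results", "currency exchange rates",
]

def forge_queries(real_query, n_dummies=3, rng=random):
    """Hide the real query among n_dummies decoys, in random order."""
    decoys = rng.sample(DUMMY_POOL, n_dummies)
    batch = decoys + [real_query]
    rng.shuffle(batch)
    return batch

batch = forge_queries("privacy research SPION")
# An observer sees four queries and cannot tell which one is genuine.
```

In a real system the decoys would have to be statistically indistinguishable from genuine queries, which is the hard part of the problem; this sketch only shows the basic mixing step.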
In the second definition, privacy is considered as control (again drawing from Samuel D. Warren). Central to this definition are the separation of identities and the protection of data along a given principle: individuals have the right to decide for themselves what information they wish to share, and with whom. Driven by this second definition, the SPION team is working on robust access control models, agents that assist users, information flows controlled at the level of scripting, and browser security in communications.
Finally, privacy is defined as a practice. Gürses illustrates how, on SNS, users do not decide individually about the use of data and identity: it is a social decision, and one that constrains the individual user's freedom. The aspirations of SPION therefore also lie in advancing transparency, feedback, and awareness on these platforms.
To round off, Gürses concludes that making privacy decisions is an extremely complex process, bounded by users' cognitive capacities. She lists a number of initiatives by SPION and its partners that aim to ease this process. For instance, together with researchers from Carnegie Mellon University, experiments in nudging are conducted to steer users toward behavior that protects their privacy. Furthermore, SPION develops educational programs alongside the Department of Educational Studies of the University of Ghent to increase awareness among users of the risks to privacy and security. Lastly, legal matters concerning privacy are addressed. Together with the Interdisciplinary Centre for Law and ICT of the KU Leuven, SPION searches for suitable answers to questions such as "Which legal frameworks apply to SNS?" and "What are SNS providers' liabilities?" in order to reduce the responsibilities that have increasingly and subtly been allocated to individual users.
Gürses’ talk at Unlike Us #2 has certainly given the audience grounds for continuing to follow the pioneering efforts of the researchers and engineers of SPION and partners in issues of privacy and security.