
Benefits of Privacy-Enhancing Identity Management


In everyday life, individuals frequently and naturally play different roles, for example as family members, citizens or patients. Typically, when a person performs a certain role, he does not reveal all personal data about himself to the respective communication partners but only parts of his personal data (i.e. parts of his identity, also called his "partial identities"). Figure 1 below illustrates that a person named John reveals different partial identities to different communication partners depending on the roles that he is performing. In the non-electronic world, individuals naturally had control over the release of their partial identities to other parties. In the information age, users have more or less lost effective control over their personal spheres. When communicating via the Internet, users leave many personal data traces at various sites, which can easily be compiled into extensive personal profiles. Besides, due to the low costs and technical advances of media storage, masses of data can easily be stored and processed, and are hardly ever deleted or forgotten. These processes of personal data collection, storage and processing are often not transparent to the individuals concerned. Emerging pervasive computing technologies, where individuals are usually unaware of the constant data collection and processing in their surroundings, will heighten this problem even further.

Privacy, as an expression of human dignity, is considered a core value in democratic societies and is recognized either explicitly or implicitly as a fundamental human right by most constitutions of democratic societies. Today, in many legal systems, privacy is in fact defined as the right to informational self-determination, i.e. the right of individuals to determine for themselves when, how, to what extent and for what purposes information about them is communicated to others. To reinforce their right to informational self-determination, users need technical tools that allow them to manage their (partial) identities and to control what personal data about them is revealed to others under which conditions. Identity Management (IDM) can be defined to subsume all functionality that supports the use of multiple identities, by the identity owners (user-side IDM) and by those parties with whom the owners interact (services-side IDM). According to Pfitzmann and Hansen, identity management means managing various partial identities (i.e. sets of attributes, usually denoted by pseudonyms) of a person, i.e. the administration of identity attributes, including the development and choice of the partial identity and pseudonym to be (re-)used in a specific context or role (Pfitzmann and Hansen 2008). Besides, transparency-enhancing tools (which are often part of privacy-enhancing identity management systems, see below) can help users make the processing of their personal data more transparent, i.e. visible and understandable to them, and support them in their decision making by informing them about the consequences of personal data releases.


The objective of this paper is to discuss the benefits of privacy-enhancing identity management systems, including transparency-enhancing tools. For this, we will especially explore the potential of the PRIME IDM system, which has been developed within the EU Framework Programme 6 project PRIME (1) (Privacy and Identity Management for Europe). In the next section, we present how the PRIME system and its technical components can help to protect the user's privacy when he is communicating with the services sides with which he is interacting, by enforcing legal privacy principles such as data minimisation, purpose binding and transparency while also establishing trust of the user in the services side and vice versa. In section 3, we will then illustrate how PRIME technologies can be applied in an e-shopping scenario to enhance both privacy and trust of online consumers while still allowing Internet shops to pursue their legitimate business interests or activities with less or no personal information. Finally, we will also discuss the benefits in financial terms that organisations can gain when deploying privacy-enhancing identity management.

Privacy-enhancing Identity Management

Several identity management systems are available today or are currently being developed and/or standardised, such as Microsoft's CardSpace, Higgins or the Liberty Alliance set of protocols. The PRIME project has developed a user-centric privacy-enhancing identity management system providing maximum privacy by integrating state-of-the-art privacy-enhancing and transparency-enhancing technologies. In particular, PRIME enforces the privacy features presented in the following subsections:

Data minimisation

PRIME enforces data minimisation for transactions between users and their communication partners (services sides or other users), so that the least possible amount of information is revealed. This limits the communication partner's ability to profile users. This data minimisation principle can be derived from the EU Data Protection Directive 95/46/EC: The processing of personal data must be limited to data that are adequate, relevant and not excessive (Art.6 I (c)). Besides, data should not be kept in a personally identifiable form any longer than necessary (Art.6 I (e)). Indeed, privacy of individuals is best protected if no personal data about them are collected or processed at all. PRIME achieves data minimisation by the following technical means:

* All interactions are a priori made anonymous with the help of anonymous communication technologies, i.e. the user's IP address or location is not revealed to his communication partners.

* On application level, PRIME allows individuals to act under different pseudonyms with respect to communication partners, roles or activities. Whether or not interactions can be linked to each other or to a certain pseudonym is under the individual's control. The degree of anonymity protection provided by pseudonyms depends on the amount of personal data that can be linked to the pseudonym, which in particular also depends on how often the pseudonym is used in various contexts or for various transactions. The best protection can be achieved if a new so-called transaction pseudonym is used for each transaction, one that is unlinkable to any other transaction pseudonyms and at least initially unlinkable to any other personal data items of its holder (see also (Pfitzmann and Hansen 2008)). However, in some situations linkability of transactions might be desired by the user. For instance, a user could choose to re-use the same pseudonym for purchasing items at a certain online shop in order to build up a reputation or to collect bonus points under this so-called relationship pseudonym, i.e. a pseudonym that he re-uses with respect to a specific communication partner (but to nobody else), which is in this case the online shop.

* Moreover, if a user needs to provide some personal information, e.g. for getting a service, the anonymous credential system implemented in PRIME (Camenisch and Lysyanskaya 2001) can enforce that no further personal data are revealed than the ones requested. A credential is a set of attributes signed by the issuer of the credential and bound to the credential owner. Traditional credentials require that all attributes are disclosed together if the holder wants to prove certain properties, and they have the drawback that different uses of the same certificate can be linked to each other. In contrast, anonymous credentials make it possible for a user to prove only specific properties to a communication partner without revealing the complete certificate itself or any extra information. For example, if a user named John has an anonymous passport credential issued by his government stating attributes about him such as his age, and he wants to download a video online which is only permitted for adults, he can prove via cryptographic zero-knowledge proofs with his credential just the fact that he is older than 18, without revealing his birth date, his age or anything else. PRIME's anonymous credentials also have the property that different shows of a credential cannot be linked with each other. If, for instance, John later wants to purchase another video for adults online at the same video shop, he can again prove that he is over 18 with his anonymous credential without the video shop being able to recognise that the two proofs are based on the same credential. This means that the two purchases by John cannot be linked as purchases done by the same person. The anonymous credential system used in PRIME also allows for conditional anonymity or pseudonymity, i.e. in case of misuse, the identity of the user can be revealed with the help of a Trusted Revocation Authority.

* While traditional authorisation schemes first identify a user in order to decide whether he is authorised to access a service or resource, anonymous credentials in combination with privacy-enabling access control allow users to prove that they have a valid service subscription or the authorisation to access resources without the need to reveal the users' identities, i.e. users can be enabled to access certain services or resources anonymously.
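The selective-disclosure interface that anonymous credentials provide can be sketched in a few lines (a deliberately simplified mock, not the actual Camenisch-Lysyanskaya zero-knowledge protocol used in PRIME; all class and function names here are hypothetical):

```python
import os
from datetime import date

class Credential:
    """Mock credential: attributes signed by an issuer (signature omitted)."""
    def __init__(self, attributes):
        self._attributes = attributes  # kept secret by the holder

    def prove(self, predicate):
        """Reveal only whether the predicate holds, plus a fresh random
        token, so two shows of the same credential are unlinkable.
        A real system would return a zero-knowledge proof instead."""
        return {"claim_holds": predicate(self._attributes),
                "proof_token": os.urandom(16).hex()}

def is_adult(attrs):
    """Predicate over hidden attributes: holder is at least 18."""
    today = date.today()
    birth = attrs["birth_date"]
    age = today.year - birth.year - ((today.month, today.day) < (birth.month, birth.day))
    return age >= 18

# John's passport credential; the shop never sees the attributes themselves.
passport = Credential({"name": "John", "birth_date": date(1980, 5, 1)})
proof1 = passport.prove(is_adult)
proof2 = passport.prove(is_adult)
assert proof1["claim_holds"] and proof2["claim_holds"]
assert proof1["proof_token"] != proof2["proof_token"]  # the two shows are unlinkable
```

The point of the sketch is only the interface: the verifier learns a single boolean, and repeated proofs carry no common identifier that would let the shop link them.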

Assurances (Trust) & Life Cycle Data Management

While data minimisation is the best strategy for minimising privacy impacts, minimisation alone is often not enough. In daily life there are many situations where users have to reveal personal data, e.g. for home delivery of purchased items, eHealth or eGovernment services, and thus cannot act anonymously. Besides, privacy impacts can be high even if the amount of data is minimised. For these cases, PRIME also includes mechanisms that enable users to establish trust in service providers, as well as technical means to reliably enforce the providers' privacy promises at their sides. These include the following:

* Automated Privacy & Trust Policy Negotiation: A negotiation can be defined as a set of messages exchanged between a user and a service provider in which both parties agree on the data to be released by the user in exchange for a service and the data to be released by the service provider to give the user assurance. More specifically, trust negotiation involves the user releasing properties about himself endorsed by credentials in order to get access to a resource, and the services side giving evidence that it is sufficiently "trustworthy" as required by the user's preferences, e.g. by providing privacy seals or evidence regarding its reputation. In addition, privacy policy negotiation is conducted, that is, the data handling policy for the data to be released by the user is agreed on, typically stating for what purposes the requested data may be processed, how long the data should be retained, and whether the data may be transferred to a third party. Besides, the data handling policy can also include obligations to be enforced by the services side.

* Assurance Management: One of the important tasks in building trust between different parties is assuring the sending party that the receiving party will play by the rules. In a privacy context, this means assuring the user that the system of the data controller (2) will manage the received data according to the negotiated privacy policies and security requirements. In PRIME, three building blocks together implement this functionality under the heading of assurance management: the Assurance Control, the Reputation Manager and the Trust Manager. The Trust Manager's role is to check and assess the trustworthiness and security level of the underlying PRIME components, e.g. verifying the integrity of the components and their runtime environment by using Trusted Computing. The Reputation Manager is responsible for collecting, managing and presenting reputation-based assessments of the data controller; typical examples are consumer organisation blacklists, privacy seals awarded by data protection commissioners, or security/privacy breach notification lists. Finally, the Assurance Control is responsible for assessing whether the system is following the agreed privacy & trust policies, based on the policy and the information that it gets from the other two components, and for communicating this assurance information to the enquiring PRIME component or the end user.

* Privacy-enhancing access control: When a services side requests data from a user, it needs to specify the purposes for which the data will be processed (cf. Art. 10 EU Data Protection Directive). The use of the data will then be legally limited to those purposes (cf. Art. 6.1 b EU Data Protection Directive). As discussed above, the data processing purposes therefore need to be part of the privacy policy (data handling policy) that the user and the services side negotiate. The negotiated policy is associated with the personal data (i.e., it "sticks" to the data). If the data are transmitted to third parties, the policy remains associated and is enforced appropriately. In order to assure the user that the policy is upheld when his personal data are processed, the PRIME services side implements privacy-enhanced access control that not only checks "who" has the right to access the data but also checks whether the data are requested for one of the purposes for which the data were collected and, possibly, whether other conditions are fulfilled as well.

* Life Cycle Management: A primary task of life cycle data management in PRIME is the management of privacy obligations. At the PRIME services side, an obligation management component enforces privacy obligations, such as the anonymisation of data once keeping them in a personally identifiable form is no longer necessary, or the deletion of data at a certain time. As mentioned above, obligations can be part of the data handling policy that is negotiated between the user and the services side. Hence, certain obligations can be dictated by the user, such as the obligation to notify the user if his data are accessed or transferred to third parties.
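The "sticky policy" and obligation ideas behind privacy-enhancing access control and life cycle management can be illustrated with a toy sketch (this is not PRIME's actual policy engine; all names, fields and the policy representation are hypothetical):

```python
from datetime import datetime, timedelta

class StickyPolicy:
    """Data handling policy that travels with the data (simplified)."""
    def __init__(self, purposes, retention_days):
        self.purposes = set(purposes)                      # agreed processing purposes
        self.deadline = datetime.now() + timedelta(days=retention_days)  # retention obligation

class PersonalDatum:
    """A personal data item with its negotiated policy attached."""
    def __init__(self, value, policy):
        self.value, self.policy = value, policy

def access(datum, purpose, now=None):
    """Purpose-based access control: grant access only for purposes the
    policy allows, and only within the agreed retention period."""
    now = now or datetime.now()
    if now > datum.policy.deadline:
        raise PermissionError("retention expired; datum must be deleted or anonymised")
    if purpose not in datum.policy.purposes:
        raise PermissionError(f"purpose '{purpose}' not covered by the negotiated policy")
    return datum.value

# The user agreed to disclosure for delivery only, retained for 30 days:
email = PersonalDatum("john@example.org",
                      StickyPolicy(purposes={"delivery"}, retention_days=30))
assert access(email, "delivery") == "john@example.org"
try:
    access(email, "marketing")   # not an agreed purpose
except PermissionError:
    pass
```

The design point is that the policy is checked at every access, so even a third party that receives the datum together with its sticky policy can only use it for the originally agreed purposes.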

Transparency for end users

Transparency comes in many shapes and sizes and can have different meanings in different contexts. Within the privacy-enhancing context, it is used to describe the ability of an individual to get information on the personal data that are stored about him, where these data originate from, and how and particularly for what purposes the data are used and processed. In order to provide transparency for end users, legal, social and technical instruments are needed. We refer to these tools as transparency tools for privacy purposes, or simply transparency tools or transparency-enhancing tools (TETs). A number of definitions of what constitutes a transparency tool exist (Hildebrandt 2008, Hansen 2007 and Hedbom 2008). There are many reasons why transparency tools are needed. First of all, they help users to make informed decisions and help establish trust in services sides by making it possible (to some extent) for users to verify that the services side is playing by the rules and upholding the negotiated policies. The EU Data Protection Directive protects the privacy principle of transparency of personal data processing by providing data subjects with extensive information and access rights. Pursuant to Art. 10, data subjects have the right to information about at least the identity of the controller, the data processing purposes and any further information necessary for guaranteeing fair data processing. If the data are not obtained from the data subject, the data subjects have the right to be notified about these details pursuant to Art. 11. Further rights of the data subjects include the right of access to data (Art. 12 a), including the right to obtain knowledge of the logic involved in any automatic processing of data concerning them, the right to object to the processing of personal data (Art. 14), and the right to correction, erasure or blocking of incorrect or illegally stored data (Art. 12 b).

Emerging ubiquitous computing technologies and Ambient Intelligence environments will make it very hard to uphold the concealment paradigm in the future. In this context, transparency tools will play an important role in minimising information asymmetries and helping users to understand where information about them originates and how it will affect them (Weitzner et al. 2006, Sackmann et al. 2006 and Hildebrandt 2007).

The Data Track function in PRIME provides transparency about who has received what personal data related to the user and offers possibilities to trace personal data being passed on. The Data Track is a history function of the user-side IDM system that keeps track of all the user's data releases. It stores all transaction records comprising personal data sent, pseudonyms used for transactions, credentials that were disclosed, date of transmission, purposes of data collection, recipient (i.e., the data controller) and all further details of the privacy policy (data handling policy) that the user and recipient have agreed upon. Easy-to-use tools for finding relevant records about past data disclosures are part of the Data Track. Functions integrated in the Data Track should allow users to exercise their rights of access to data online. Once the user has "tracked" specific transaction records, the Data Track user interface provides buttons that the user can click to activate such online functions. These online functions provide the individual with direct access to his data stored at the data controller, or at least help him in finding out the address of the data controller (from the privacy policy), generating requests, providing the needed authentication (even if a pseudonym is used), monitoring the complaint status, compiling reminders, and, in case of problems, addressing the supervisory authority in charge (see also (Fischer-Hubner et al. 2008)).
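A minimal sketch of such a user-side history function might look as follows (illustrative only; the record fields follow the description above, but all class and method names are hypothetical):

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class DisclosureRecord:
    """One data release: what was sent, to whom, under which pseudonym and purposes."""
    pseudonym: str
    recipient: str            # the data controller
    data_sent: dict
    purposes: list
    timestamp: datetime = field(default_factory=datetime.now)

class DataTrack:
    """User-side history of all personal data releases."""
    def __init__(self):
        self._records = []

    def log(self, record):
        self._records.append(record)

    def releases_to(self, recipient):
        """Find all past disclosures made to a given data controller."""
        return [r for r in self._records if r.recipient == recipient]

track = DataTrack()
track.log(DisclosureRecord("pseud-7f3a", "videoshop.example",
                           {"over_18": True}, ["age check"]))
track.log(DisclosureRecord("pseud-91bc", "bookshop.example",
                           {"address": "..."}, ["delivery"]))
assert len(track.releases_to("videoshop.example")) == 1
```

In a full system, each record would also carry the agreed data handling policy, which is what enables the online rights-exercising functions (access, correction, deletion requests) described above.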

PRIME-enabled E-Shopping Scenario

This section describes a privacy-enhanced e-shopping scenario using PRIME technology, which allows e-shopping customers to regain control over their private spheres and which aims at enhancing user trust (see also (Andersson et al. 2005) and, for an extended version, (Fischer-Hubner and Hedbom 2008)). The privacy-enhanced e-shopping scenario illustrated in Figure 2 involves the customer (i.e. user) on the user side and the Internet shop (i.e. e-shopping service provider) on the services side, as well as the bank and delivery service as transaction-facilitating services. Both the customer and the shop use computing devices which are equipped with PRIME's identity management middleware. By utilizing PRIME's anonymous communication component and anonymous credential system, all activities within the scenario's phases can remain unlinkable to each other. The masks in Figure 2 symbolize different pseudonyms under which the user can act. Hence, the privacy principle of data minimisation is enforced with respect to the transaction. PRIME's Data Track functions help to enforce the privacy principle of transparency with respect to the data involved in the transaction.

Scenario phases

The privacy-enhanced e-shopping scenario can be divided into the phases browsing, negotiation and purchase, payment, delivery and data tracking as described below.

Browsing: The customer browses the Internet shop's web site anonymously. There is no need to provide any customer identity data or Internet address within the browsing phase. If a customer participates in the shop's loyalty programme or wants to receive personalised advice, he can show a proper credential without revealing any other personal information; this should be sufficient for taking part in the programme or receiving personalised advice.

Negotiation and Purchase: Also for the actual purchase of items, it is sufficient that the customer uses pseudonyms for communicating with the Internet shop. If the shop needs to know certain properties of the customer (e.g., whether he is over a certain age in order to conclude a contract), his PRIME IDM application can prove this without releasing identifying information by using PRIME's anonymous credential system. A new transaction pseudonym can be used for each purchase, so that individual purchase transactions remain unlinkable to each other and to the browsing phase. If reputation in the eyes of the shop is to be built up, the same relationship pseudonym can be reused every time the customer gets in contact with a specific shop. This still prevents linkage of the customer's profile with his profiles at other merchants.

Nevertheless, according to current business practices, the shop may request further personal data from the customer, e.g. for the purposes of delivery and payment (e.g., name, credit card details). Together with the data request, the shop's privacy policy is provided to the user and is subject to the user's agreement. The user has the possibility to customise this policy, for instance by allowing the use of his personal data for additional purposes, such as direct marketing for receiving special offers, and/or by dictating privacy obligations, for instance to be regularly notified about the status of his data. Before releasing any personal data, the user's IDM application can check (anonymously, if desired) that the services side (Internet shop) complies with the user's preferences, including trust requirements. Depending on the user-related data being requested, the user's IDM system can request assurance claims from the shop, such as proofs of the Internet shop's platform properties, privacy seals issued by Data Protection Commissioners, or reputation metrics. Once the Internet shop has provided the assurance claims as evidence of trustworthiness, the user's access control has accepted them as sufficient, agreement on a data handling policy has been established, and the user's consent has been given, the requested data are disclosed to the Internet shop.
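The compliance check performed before disclosure can be sketched as a simple comparison of the user's preferences against the shop's proposed data handling policy (a toy illustration with hypothetical field names; PRIME's actual policy languages and negotiation protocol are far richer):

```python
def policy_acceptable(user_prefs, shop_policy):
    """Return True if the shop's proposal stays within the user's limits:
    no purposes beyond those allowed, no longer retention than accepted,
    and all assurances the user requires are actually offered."""
    return (set(shop_policy["purposes"]) <= set(user_prefs["allowed_purposes"])
            and shop_policy["retention_days"] <= user_prefs["max_retention_days"]
            and set(user_prefs["required_assurances"]) <= set(shop_policy["assurances"]))

prefs = {"allowed_purposes": {"delivery", "payment"},
         "max_retention_days": 90,
         "required_assurances": {"privacy_seal"}}

proposal = {"purposes": ["delivery"], "retention_days": 30,
            "assurances": {"privacy_seal", "reputation_score"}}
assert policy_acceptable(prefs, proposal)

proposal["purposes"].append("marketing")   # the user never allowed this purpose
assert not policy_acceptable(prefs, proposal)
```

In a real negotiation the user could, as described above, customise the policy (e.g. add "marketing" to the allowed purposes in exchange for special offers) rather than simply reject the proposal.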

Payment: The payment might be performed by means of a credit card, mediated by PayPal, a card processor and a bank, or by means of anonymous eCoins based on PRIME's anonymous credential system that are sent directly to the Internet shop. The use of eCoins is preferable from a privacy standpoint because it prevents any linkage between payment and purchase, due to the unlinkability properties of credential transactions. If payment is done via credit card, the customer's credit card details should preferably not be requested by the shop, but directly by the credit card institute. For this purpose, the shop would send the user the price information and a transaction-ID for the purchase transaction. The customer's IDM system would then contact a credit card institute of the customer's choice, which would request the credit card details and subsequently inform the shop that the payment for the purchase transaction-ID has been completed.
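The card-based flow described above can be sketched as a simple three-party simulation (hypothetical names throughout; a real payment protocol would add authorisation, signatures and a fresh random transaction-ID, all omitted here). The key property is that the card details go to the institute, never to the shop, while the shop learns only that the transaction-ID was paid:

```python
class Shop:
    """Knows what was bought, but never sees the card details."""
    def __init__(self):
        self.paid = set()

    def start_purchase(self, price):
        txn_id = "txn-001"   # in practice a fresh unpredictable identifier
        return price, txn_id

    def confirm_payment(self, txn_id):
        self.paid.add(txn_id)

class CardInstitute:
    """Learns the card details and the amount, but not what was bought."""
    def charge(self, card_details, price, txn_id, shop):
        # ... authorise the charge with the card network (omitted) ...
        shop.confirm_payment(txn_id)

shop = Shop()
price, txn_id = shop.start_purchase(price=9.99)
# The customer's IDM system sends card details to the institute, not the shop:
CardInstitute().charge({"number": "4111..."}, price, txn_id, shop)
assert txn_id in shop.paid
```

This split of knowledge (shop: purchase content; institute: card details) is the same data-minimisation pattern that the delivery phase below applies to the customer's address.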

Delivery: Digital goods can be downloaded anonymously. If a delivery service is used to deliver tangible goods, the customer's IDM system should, similarly to the payment procedure described above, send his address details directly to the delivery service. The customer's personal address is then only known to the delivery service, while the content of the goods being delivered is only known to the shop. In PRIME, a user interface wizard has been developed to guide users through such multi-party transactions involving the shop, a payment institute and a delivery service. If the customer does not wish to disclose his personal address at all, a pick-up point could be used, such as a 24/7 gasoline station, from which the goods can be fetched.

Data Tracking: If personal data have been released to the Internet shop by the user in either identifiable or pseudonymous form, he can use the data track function to obtain information about the status of his data. In particular, he can use the Data Track functions to access his data, to check the fulfilment of agreed privacy obligations, or to request to delete or block his data if the current processing of his data does not comply with the agreed data handling policy or with legal privacy provisions, or if he simply wants to revoke his consent for the further usage of his data, for example for direct marketing.


Conclusions from the scenario

In the example presented here, the design started from maximum privacy and maximum flexibility for the users. The end users can choose and control whether they act (browse, shop) anonymously, pseudonymously or whether they release personal data under specified privacy and trust policies. Hence, the privacy principle of data minimisation is enforced for the browsing and transaction phases. The access control component at the Internet shop's side enforces the customer-agreed data handling policies.

Assurance claims, such as third-party endorsed statements, can help companies to prove that the policy enforcement will be "trustworthy". Furthermore, as pointed out above, the Data Track function makes the data processing transparent to the users and allows them to exercise their basic rights. This scenario not only illustrates that PRIME can enhance privacy for its end users. The example also shows that PRIME applications still allow companies such as the Internet shop to pursue their legitimate business interests or activities with less or no personal information. Business reasons for collecting personal data, such as better serving customers, developing better services and products, recognising returning customers, and/or conducting targeted or personalised marketing, can also be well achieved through the use of relationship pseudonyms. Companies could offer special bonus programmes, special awards or direct marketing offers to returning customers. (Financial) risks can also be mitigated, because PRIME's anonymous credential system allows customers to prove certain attributes, such as having passed an age limit, e.g. to conclude legally binding contracts, without revealing their exact age or any other personal details. Finally, defaulting customers (for instance, customers who do not pay) can still be addressed, as can users when the payment process fails or when law enforcement requires it. If anonymous eCoins based on PRIME's anonymous credential system are used for payments, the identity of the paying customer could still be revealed with the help of a trusted revocation authority.

Adoption Factors for Companies: The Benefits of Privacy-Enhancing Technologies

It is hard to assess the real cost-benefits of privacy-enhancing technologies (PETs) and of privacy-enhancing IDM, as well as the costs of privacy incidents. According to (Ribbers 2008, F3), the economics of privacy is a rather new research area. Furthermore, very few of the existing studies present empirical data (Ribbers 2008, F3). Some of the studies that do were performed on US and UK companies by the Ponemon Institute together with different security and computer companies (Ponemon 2007a, Ponemon 2008a, IBM 2004). There is, however, only a limited number of statistical figures or "hard facts" that can prove the statements about potential business drivers provided below. They are therefore also based on the evidence that we have gained and on our own reasoning. Potential business drivers for privacy-enhancing technologies in general and privacy-enhancing IDM in particular are the following:

Compliance with Data Protection Legislation

According to most data protection legislation, organisations that process personal data have to implement appropriate technical and organisational measures to protect the confidentiality and integrity of personal data and to prevent any unlawful form of processing (cf. Art. 17 EU Data Protection Directive). Privacy-enhancing IDM can provide such technical measures. There are also indications that privacy-enhancing technologies might reduce the costs of complying with data protection laws (Interior 2004). Non-compliance with data protection legislation may result in sanctions, such as fines, imposed on the data controller (cf. Art. 24 EU Directive). Furthermore, compensation payments may be a consequence of privacy breaches if a breach can be shown to have caused damage to the data subjects (e.g. ChoicePoint Inc. paid $10 million and $5 million in redress to customers as a result of a privacy breach in 2005 (Trade Commission 2006) and later paid another $10 million to settle a class action lawsuit (Bosworth 2008)). PETs can be used to mitigate the risks of such privacy breaches.

High Costs for Demanded Improvements of Systems or Applications

According to Art. 28 EU Data Protection Directive, data protection commissioners in Europe have the power to impose a temporary or definitive ban on data processing, to engage in legal proceedings where the national provisions adopted pursuant to this Directive have been violated, or to bring these violations to the attention of the judicial authorities. There have been several cases where privacy-intrusive systems, applications or projects were banned or required to be improved to make them legally compliant, which resulted in high costs for the responsible organisations. It is a recognized fact in software engineering that requirement changes cost more the later in the development/maintenance cycle they appear, i.e. it is better to make things right from the start than to correct faults later. Within the computer security area there is also the notion that security has to be designed into a system and is hard to achieve as an afterthought. We have no reason to believe that privacy mechanisms follow any other pattern, since there are strong ties between security and privacy. Privacy-enhancing identity management systems are designed to enforce legal privacy requirements and thus to guarantee privacy-compliant data processing from the start.

Loss of Reputation and the Damage of the Brand

When breaches happen, customers lose faith and trust in the company. The actual costs of such incidents can be hard to measure. However, according to a US study (Ponemon 2007a), lost business accounted for 64% of the total cost of a data breach, with an average cost of $4.1 million, or $128 per compromised record. Further, in a consumer study encompassing 1795 customers (Ponemon 2008b), 57% said that they lost trust and confidence in the organisation after a breach notification. To strengthen the argument, both the UK and the US studies (Ponemon 2007a, Ponemon 2008a) showed that the sectors traditionally associated with trust, e.g. the financial and banking sectors, had a higher average cost for breaches. Privacy-enhancing technologies mitigate such risks of privacy breaches.

Competitive Advantages

This is a statement that we believe to be reasonable, but currently there are very few research reports within this area, so more research is needed to verify the claim. However, there are some indications that support it. Ixquick, a meta search engine company, invested in privacy features and introduced a new privacy policy based on anonymity as an explicit strategy. As a result, according to (Ribbers 2008, F3), their traffic increased by 16% in 2006 and 17% in 2007. They were also the first company to be awarded the EuroPrise Privacy Seal (3), which helped the company to significantly increase its customer base (Computer Sweden 2008), as the seal visibly contributed to the company's good privacy reputation. There are also a number of surveys of European citizens (e.g. (Eurobarometer 2008)) that indicate that citizens have privacy concerns and care about privacy. Thus, they might choose a less privacy-invasive alternative if such an alternative existed.

User Trust in the Services Side

Transparency-enhancing tools, particularly those such as the PRIME data track that allow users to exercise their rights online, can enhance trust in the services side: research on social trust factors in PRIME suggests that trust in a service provider can be increased if procedures are transparent and reversible (Andersson et al. 2005). Trust is an important factor for customers adopting a system. As Johnston et al. (2003) conclude: "Trust is important because if a person is to use a system to its full potential, be it an e-commerce site or a computer program, it is essential for her to trust the system".

Data Integrity

PETs can contribute to better data integrity. We believe that the correctness of customer data plays an important part both in making the right decisions regarding customers and in creating customised, user-centred services. Making the customer part of the correction process, by allowing them to view and request corrections of their own data and by giving them some form of control over profiling processes, will most probably increase the likelihood that these data are both correct and relevant. An example of very limited profile control is the Amazon book recommendation service (4), which lets Amazon customers view how their customer profiles, and the recommendations based on them, are derived by Amazon, and permits customers to correct profile data.
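The correction mechanism described above can be sketched in a few lines. This is only an illustrative sketch, not the interface of any deployed system such as Amazon's or PRIME's; the `Profile` class, its method names and the audit log are our own hypothetical constructions:

```python
# Illustrative sketch: a data subject can inspect the profile a service
# holds about them and request corrections, which are logged for
# accountability. All names here are hypothetical.

class Profile:
    def __init__(self, data):
        self.data = dict(data)
        self.correction_log = []  # audit trail of user-requested changes

    def view(self):
        """Transparency: let the data subject see what is stored."""
        return dict(self.data)

    def request_correction(self, attribute, new_value):
        """Integrity: apply the subject's correction and record it."""
        old = self.data.get(attribute)
        self.data[attribute] = new_value
        self.correction_log.append((attribute, old, new_value))

p = Profile({"interests": ["cooking"], "city": "Karlstd"})
p.request_correction("city", "Karlstad")  # customer fixes a misspelling
print(p.view()["city"])        # prints "Karlstad"
print(len(p.correction_log))   # prints 1
```

The point of the audit log is that corrections remain traceable on the services side, which supports both the "reversible procedures" trust factor mentioned above and the data subject's right of rectification.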

Lower Data Management Costs and Increased Security

Privacy-enhancing technologies that enforce data minimisation enhance security and lower the costs of data management. Anonymous communication technologies and anonymous credentials allow users to act anonymously or to release only the minimally required data, which results in lower data management costs at the services side, as only the data that are really necessary are processed. PETs enforcing data minimisation and privacy-enhanced authorisation and access control also implement the "need-to-know" security principle and can thus also protect the confidentiality of an organisation's (non-personal) data. Anonymous communication tools are today also often used by military and commercial organisations that want to keep their communication or business relationships confidential.
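The data-minimisation idea can be illustrated with a minimal sketch. Note that this is only a conceptual toy, not how anonymous credential systems (e.g. Camenisch and Lysyanskaya 2001) are actually implemented; those use cryptographic proofs so that a derived property can be shown without the verifier having to trust the user's claim. The profile contents and function names below are our own illustrative assumptions:

```python
# Conceptual sketch of data minimisation / "need-to-know": the user holds
# a full profile but releases only what a service actually requires, or
# only a derived yes/no property instead of the underlying attribute.

from datetime import date

FULL_PROFILE = {
    "name": "John",
    "birth_date": date(1980, 5, 1),
    "address": "Universitetsgatan 2, Karlstad",
    "email": "john@example.org",
}

def minimal_release(profile, required_attributes):
    """Release only the attributes the service declared as necessary."""
    return {k: profile[k] for k in required_attributes if k in profile}

def prove_over_18(profile, today):
    """Derived claim: a boolean is disclosed instead of the birth date."""
    bd = profile["birth_date"]
    age = today.year - bd.year - ((today.month, today.day) < (bd.month, bd.day))
    return age >= 18

# A shop shipping goods needs only a delivery address, not the full identity:
print(minimal_release(FULL_PROFILE, ["address"]))
# An age-restricted service needs only the property "over 18":
print(prove_over_18(FULL_PROFILE, today=date(2008, 10, 1)))  # prints True
```

The services side thus stores fewer personal data, which directly lowers the cost and risk of managing them; in real systems the boolean in the second case would be backed by a cryptographic credential proof rather than self-asserted.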

Ethical Business Practices and Increased Stock Values

Ethical and privacy-respecting business practices can increase the stock value of a company. In one of the company workshops conducted within the PRIME project (Ribbers 2008, F4), a representative of a large telecom company stated the following: "We conduct privacy risk reporting every quarter, such reporting is useful for ethical investors (Calvert Group, F&C, Dow Jones) and can be benchmarked against the Dow Jones Sustainability Index, which now includes an additional variable for minimizing reputational risk. Last year they included a number of questions on privacy and how we manage privacy. Inclusion in such indices reduces our cost of capital.... It minimizes the reputational risk and investors like that. They do not like to be mentioned in the newspapers with bad news". There are also cases where privacy breaches have led to a drop in stock value: according to (Ribbers 2008, F3), DoubleClick's stock fell by 20% after a privacy violation, and ChoicePoint's stock fell by 17% after the announcement of a privacy breach.

Employee Trust

Employees have higher trust in their company if the company is known to respect its customers' privacy, e.g. by deploying PETs. Employees who notice that their employer is negligent with customers' private data might fear that their own personal data are not well protected either. In many companies, privacy protection is today part of the corporate social responsibility plan, which also contributes to a good workplace atmosphere. The economic impacts of some of these factors are hard to measure. Hence, it is hard to answer the question whether these benefits economically justify investments in PETs/TETs. However, an analysis of business cases for PETs (Ribbers 2008, F3) conducted within the PRIME project came to the following conclusion: "At least in the case that we analysed, it could be demonstrated that Privacy investments may pay off."


Conclusions

In this paper, we discussed the benefits that privacy-enhancing IDM can provide in technical, social and economic terms for end users and for the services sides deploying it. Privacy-enhancing IDM provides technical means to protect legal privacy principles and to establish a trust relationship between a user and a service provider. The field of the economics of privacy is still in its infancy, and privacy-enhancing technologies and identity management systems are not yet widely deployed. Still, there is evidence that there is a business case for PETs and for privacy-enhancing IDM.


Acknowledgements

The work reported in this paper was partly supported by the EU FP6 project PRIME, which received funding from the Community's Sixth Framework Programme and the Swiss Federal Office for Education and Science. We want to thank all partners in PRIME, especially our colleagues from the PRIME Framework work package, for fruitful discussions which also inspired this paper.


References

Andersson, C., Camenisch, J., Crane, S., Fischer-Hubner, S., Leenes, R., Pearson, S., Pettersson, J.S., and Sommer, D. (2005), "Trust in PRIME", Proceedings of the 5th IEEE Int. Symposium on Signal Processing and IT, December 18-21, Athens, Greece.

Bosworth, M.H. (2008), ChoicePoint Settles Data Breach Lawsuit, 27 January, Available at:

Camenisch, J. and Lysyanskaya A. (2001), "Efficient non-transferable anonymous multi-show credential system with optional anonymity revocation", In Advances in Cryptology--Eurocrypt, Vol. 2045, pp. 93-118.

Computer Sweden (2008), "EU-stämpel för integriteten" [EU seal for privacy], Computer Sweden, IDG, available at: http://, 17 November.

EU (1995), European Union, "Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data," Official Journal L, Vol. 281.

Eurobarometer (2008), European Commission, Flash Eurobarometer, Data Protection in the European Union -Citizens' perceptions. Analytical Report, available at:

Fischer-Hubner, S., Pettersson, J.S., Bergmann, M., Hansen, M., Pearson, S., and Casassa Mont, M. (2008), "HCI Designs for Privacy-enhancing Identity Management", in: Acquisti et al. (Eds.), Digital Privacy - Theory, Technologies, and Practices, Auerbach Publications.

Fischer-Hubner, S., and Hedbom, H. (2008), PRIME Framework V3, PRIME Deliverable D14.1.c, 17 March.

Hansen, M. (2007), "Marrying Transparency Tools with User-Controlled Identity Management". In proceedings of Third International Summer School organized by IFIP WG 9.2, 9.6/11.7, 11.6 in cooperation with FIDIS Network of Excellence and Human-IT, Karlstad, Sweden, Springer.

Hedbom, H. (2008), A Survey on Transparency Tools for Privacy Purposes. Proceedings of the 4th FIDIS/ IFIP Summer School, Brno, 1-7 September. To be published by Springer.

Hildebrandt, M. (2008), (ed.) "FIDIS D 7.12: Biometric Behavioural Profiling and Transparency Enhancing Tools", FIDIS work in progress.

Hildebrandt, M. (2007), FIDIS Deliverable D7.9 A Vision of Ambient Law, 2007, available at:

IBM (2004), IBM & Ponemon Institute, The Cost of Privacy Study, 17 February.

Interior (2004), Ministry of the Interior and Kingdom Relations, The Netherlands: Privacy Enhancing Technologies, Whitepaper for Decision-Makers.

Johnston, J., J. H.P. Eloff & L. Labuschagne (2003), "Security and human computer interfaces", Computers & Security, Vol. 22 (8), pp. 675-684.

Pfitzmann, A., and Hansen, M. (2008), Anonymity, Unlinkability, Undetectability, Unobservability, Pseudonymity, and Identity Management - A Consolidated Proposal for Terminology, Version v0.31, 15 February, Anon_Terminology_v0.31.doc

Ponemon (2007a), Ponemon Institute, LLC, 2007 Annual Study: U.S. Cost of a Data Breach - Understanding Financial Impact, Customer Turnover, and Preventive Solutions, November.

Ponemon (2008a), Ponemon Institute, LLC, 2007 Annual Study: U.K. Cost of a Data Breach - Understanding Financial Impact, Customer Turnover, and Preventative Solutions, February.

Ponemon (2008b), Ponemon Institute, LLC, Consumers Report Card on Data Breach Notification, April.

PRIME Tutorials (2008), PRIME tutorials, https://

Ribbers, P. (2008), 'F' Series of Deliverables - Business Processes and Business Case, 25 May.

Sackmann, S., Struker, J., and Accorsi, R. (2006), "Personalization in Privacy-aware Highly Dynamic Systems", Communications of the ACM, Vol. 49 (9).

Trade Commission (2006), The Federal Trade Commission's Settlement with ChoicePoint. Available at:

Weitzner, D., Abelson, H., Berners-Lee, T., Hanson, C., Hendler, J., Kagal, L., McGuinness, D.L., Sussman, G.J., and Waterman, K. (2006), "Transparent Accountable Data Mining: New Strategies for Privacy Protection", Computer Science and Artificial Intelligence Laboratory Technical Report MIT-CSAIL-TR-2006-007, Massachusetts Institute of Technology, Cambridge, MA, USA.


(2) The entity defined to be responsible for personal data processing according to national or Community laws or regulations (see Art. 2.d EU Directive 95/46/



Simone Fischer-Hubner * and Hans Hedbom **

Department of Computer Science, Karlstad University, Universitetsgatan 2, 65188 Karlstad, Sweden

* E-mail: simone.fischer-huebner@kause, ** E-mail:
COPYRIGHT 2008 Asia-Pacific Institute of Management

Author:Fischer-Hubner, Simone; Hedbom, Hans
Publication:Asia-Pacific Business Review
Date:Oct 1, 2008