The encryption factor.
Web customers need to trust that their personal data is secure but, explains Juliet Hoskins, privacy does not mean anonymity, and attacks on new forms of tracking technology are counter-productive
The concept of privacy lies at the heart of internet business, particularly among new users, already uncertain about the security of transactions, who are now concerned about how their information may be used. Privacy does not mean anonymity. While we may wish for anonymity ourselves, we mistrust it for others. "A signed love letter is flattering; an anonymous love letter is creepy," points out Stuart Baker, a solicitor at US firm Steptoe & Johnson.
Privacy should mean that personal information is seen only by those who need to see it and is not passed to unscrupulous third parties. The debate centres on whether to preserve anonymity, or to implement strong authentication so that the information trail can be properly controlled.
The argument over privacy and authentication is similar to that over encryption. Do you limit strong encryption in case it gets into the wrong hands, or do you regulate against the people who would abuse it? Originally, governments argued that strong authentication was more important for secure networks than strong encryption and did not try to stop strong authentication. Now many are trying to do just that. Attacks on authentication technologies are a worrying trend.
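The worry about strong encryption "getting into the wrong hands" comes down to simple arithmetic: each additional bit of key length doubles the number of keys a brute-force attacker must try. A minimal sketch (the function name is illustrative):

```python
# Each extra bit of key length doubles the brute-force search space,
# which is why "strong" key sizes must grow as computers get faster.
def keyspace(bits: int) -> int:
    """Number of possible keys for a given key length in bits."""
    return 2 ** bits

print(keyspace(57) // keyspace(56))    # one extra bit doubles the work: 2
print(keyspace(256) // keyspace(128))  # a 256-bit keyspace dwarfs a 128-bit one
```

This exponential growth is why regulators historically drew the line at a particular key length rather than at encryption as such.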
Intel's new Pentium III processor, for instance, contains a serial number unique to each chip and can be set to reveal that number to software. Similarly, Microsoft has incorporated into its software a method of identifying each document with a globally unique identifier (GUID) that includes information about the machine on which it was produced. This has already been used to track the author of the Melissa virus.
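To see why such identifiers alarmed privacy groups, consider that a version-1 GUID of the kind described embeds a timestamp and the machine's 48-bit hardware (MAC) address, so a document carrying one can be traced back to the computer that generated it. A small sketch using Python's standard library:

```python
import uuid

# A version-1 GUID mixes a timestamp with the machine's 48-bit network
# (MAC) address, so the generating computer can be read back out of it.
g = uuid.uuid1()
print(hex(g.node))  # the hardware address embedded in the identifier

# The identifier survives being written into a document as plain text:
assert uuid.UUID(str(g)).node == g.node
```

A random (version-4) GUID carries no such machine fingerprint, which is the alternative privacy advocates pressed for.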
Privacy groups attacked these developments as soon as they were announced and governments followed. The US Federal Trade Commission and state attorneys general did not sue Intel and Microsoft, and Europe claims not to regulate technology, but several government agencies conveyed a simple message: "disable those technologies or face unpleasant sanctions."
"The attack on these technologies is ironic," Baker says. Privacy advocates led the fight against the FBI's effort to restrict encryption technology, claiming that technologists, not government, should determine which technologies are deployed, particularly if they provided security for responsible users. The FBI was restricting technology simply because it might be misused by a few. Surely it was better to regulate the misuse than to deny everyone better security.
But when it came to authentication, the arguments were reversed. Privacy advocates admitted that the serial number and GUID may make network users more accountable and secure, but they could be misused to track users through cyberspace. Their answer, however, was not to regulate unscrupulous users -- they wanted these capabilities removed from the hardware and software to preserve anonymity.
But if we do that, we end up trusting networks of anonymous, unaccountable users. We want to share information with some people and not with others. If there's no way to tell who's using the network or who's accessing our data, then we can't tell whether or not our expectations have been met. International law is still a long way behind technology and the protection of data is a key debate. While the US relies on self-regulation, the EU believes in laws.
This divergence has caused severe problems for those sending data from Europe to America. "EU directive 95/46 on the protection of individuals with regard to the processing of personal data and on the free movement of such data protects users and prohibits data transmissions to countries without the same level of protection," explains Dr Andreas Mitrakas of GlobalSign NV. "Data collected in Europe cannot be transmitted to the US for storage or processing."
An issue of substance may also arise over the personal data of applicants for digital certificates. Many of these services are offered by European and overseas providers on the world wide web. The EU directive says that personal data collected in Europe should remain in Europe.
Of course there have been scandals. "A well-known advertising banner company briefly took its stock off the market this year, after its data collection practices were investigated," says Eric Arnum, US editor at the European Forum for Electronic Commerce (EEMA). "When the company serves up a banner on one of its 11,000 websites, it gives out a `cookie', a unique code which signals to the company when it has a return visitor. People who don't erase their cookies.txt file provide details about how often they visit all those sites.
"The company crossed the line when it launched a personalisation service to collect names, ages, incomes, education, home location, ages of children etc. The company could attach a name and an income to a cookie -- it could tell which rich suburban men bought on-line porn, which kids bought the latest NSync album, and where their mothers worked."
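The tracking mechanism Arnum describes is simple to sketch. In the hypothetical handler below (the cookie name and function are illustrative, not the company's actual system), the first response issues a unique cookie ID and every later request carrying it is counted as a return visit:

```python
import uuid

def handle_request(cookies: dict, visit_log: dict) -> str:
    """Return the visitor's tracking ID, issuing a fresh one on first visit."""
    visitor_id = cookies.get("banner_id")
    if visitor_id is None:
        visitor_id = uuid.uuid4().hex  # set once; the browser echoes it back
    visit_log[visitor_id] = visit_log.get(visitor_id, 0) + 1
    return visitor_id

log = {}
first = handle_request({}, log)            # new visitor: cookie issued
handle_request({"banner_id": first}, log)  # cookie sent back on return
print(log[first])  # two visits tied to one anonymous ID
```

On its own the ID is anonymous; attaching a name and income to it, as the personalisation service did, is the step that turns a visit counter into a personal profile.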
Juliet Hoskins is editor of EEMA Briefings for The European Forum for Electronic Commerce. EEMA will be exhibiting at e-business Expo 2000, 7-9 November 2000, Olympia, London.
In the US, data protection is based on self-regulation. The Federal Trade Commission said in May that this was not working. A survey of websites found that just 20 per cent adhered to principles of notice, consent, access and security:
* Notice -- companies must provide clear and conspicuous notice about what information will be collected and how it may be used;
* Consent -- consumers will "opt in" before personally identifiable information is collected, used or disclosed. Consumers must be able to opt out when non-personally identifiable information is collected;
* Access -- companies must provide access to personal data and an opportunity to correct it;
* Security -- companies must protect the security and confidentiality of information. Notice of breaches in security must also be provided.