Back in 2012, the Girls Around Me app made headlines when women’s personal profiles, shared willingly to keep in touch with their girlfriends, started appearing in an app used by strangers on their smartphones.
Such incidents gave rise to the “creep factor”, a term describing the unease consumers report after experiencing an infringement of their privacy. At the heart of the matter is the “surprise” consumers feel when personal information shared in one context pops up unexpectedly in another.
People are increasingly going online for work and play, social life and shopping. Placing an order on an e-commerce website or booking a taxi via your smartphone on a community platform has become commonplace these days. Meanwhile, the growing monetization of personal data has led some operators to collect data on individuals on a massive scale. Every time we connect to the Internet, we leave our footprints in the form of private information. Such data is collected, segmented, sold and used, allegedly to enhance our quality of life and better meet our needs, but more truthfully for commercial purposes that are often beyond our control.
Online trust has hit an all-time low with up to 89 % of UK Internet users – and a staggering 92 % in the USA – worrying about their privacy online, according to a 2014 study by global data privacy company TRUSTe.
Obligation to disclose
Consumers are often oblivious to the data being collected about them and what happens to this information. Providing personal data is a necessary evil for the convenience of accessing goods and services online, explains Norma McCormick, past Chair of the Consumer and Public Interest Committee at the Standards Council of Canada, the ISO member for the country. Consumers are frequently obliged to consent to requests for information as the price of access and cannot always fully restrict the type of details they hand over.
“Once the information is provided,” she warns, “the consumer has more limited, and sometimes no control, on how it is used”. Consumers can, however, reduce the risk of misuse or abuse of their personal data by looking for verification schemes, which can increase confidence in the recipient of the information.
When it comes to dealing with companies online, individuals are faced with the organization’s privacy policy and the settings it has determined. “No surprise then that research shows the median time users spend on licence agreements is only six seconds; and no more than 8 % of users read the licence agreement in full,” says Richard Bates, Head of Digital Initiatives at Consumers International, a world federation of consumer groups.
Big bucks for big data
With Internet users worldwide nearing the three billion mark (42 % of the world’s population), according to the International Telecommunication Union (ITU), the mind boggles as to the amount of digital personal data stored on the Web. What’s more, an estimated 90 % of that data has been collected over the past two years. With this rapid growth, regulators must be proactive about putting frameworks in place to harness the social benefits of this enormous data pool while protecting legitimate consumer concerns.
Any data circulating on the Web are by definition “highly vulnerable”. Once captured, they are saved, analysed, processed and sometimes transmitted to other databases. Information is often stored in regions with a cold climate, where lower cooling costs make storage cheaper. Also, a data item might be routed through numerous networks before it reaches its destination.
Personal data have a high monetary value and are thus subject to market pressures, but also to all kinds of malicious and criminal behaviour. Incredibly, dates of birth fetch two dollars apiece, according to the Organisation for Economic Co-operation and Development (OECD), meaning the birthdays of the world’s three billion Internet users alone have a potential market value of USD 6 billion. This necessarily raises questions of data security and jurisdiction.
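As a back-of-the-envelope check of that figure (assuming roughly three billion Internet users, per the ITU estimate above, and the OECD’s USD 2 per date of birth):

```python
# Rough sanity check of the USD 6 billion figure.
# Assumptions: ~3 billion Internet users (ITU estimate) and
# USD 2 per date of birth (OECD figure cited in the text).
internet_users = 3_000_000_000
price_per_birth_date_usd = 2

market_value_usd = internet_users * price_per_birth_date_usd
print(f"Potential market value: USD {market_value_usd / 1e9:.0f} billion")
# → Potential market value: USD 6 billion
```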
The laws of the land
Cases of online data being stolen have grown over the years. That’s because many privacy elements can be captured and digitized through an Internet connection, but also via a wide variety of intrusive connected devices. In a number of countries, “digitized privacy elements” are classified under personal data or personally identifiable information (PII), explains Dr. Kernaghan Webb, Associate Professor, Law and Business Department, at Ryerson University, Toronto, Canada. “However,” he deplores, “a unified international definition of ‘digitized privacy elements’ is not yet available.”
Airbnb, Uber and Facebook are examples of a particularly data-rich form of online multi-actor marketplace interaction mechanisms (OMAMIMs). They raise significant questions about the accountability of consumers, businesses and governments in the digital marketplace in terms of who can do what, and how. “Understanding the distinctive features of these OMAMIMs is an important first step to determining the appropriate roles and approaches of all parties in protecting consumers while encouraging the creation of innovative products and services,” explains Dr. Webb.
Beyond government dictates, which are overly restrictive and prescriptive, self-regulation can be an effective strategy for protecting customer privacy. This is already happening as most companies now have privacy policies and internal processes to regulate data collection, usage and customer choice. These let consumers know how and what personal data is being collected and used, and allow them to opt out of the marketing process. It makes pragmatic sense, as most businesses want to do right by their customers to avoid losing them and protect their brand.
Although industry players must show willing by safeguarding consumers’ rights and fighting misleading advertising, connected customers have their own role to play, with its specific responsibilities. And regulators must take a new look at ways to empower and protect them.
To be fully accountable, consumers need access to valid, complete and documented information on online products and services, highlighting their advantages and disadvantages. “Finding a more meaningful solution to this problem requires developing mechanisms that enable consumers to express their terms; and move beyond a one-size-fits-all model of consent,” emphasizes Bates.
So what are the tools and services that could empower consumers to better manage their data? A new market for personal data management services is already emerging that can help individuals assert more control over how their data is collected. Such services take on many forms from browser plug-ins that block tracking applications to government initiatives prompting companies to return the data they hold to their rightful owners.
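To make the first category concrete, here is a minimal sketch of the logic behind a tracker-blocking browser plug-in: outgoing requests are checked against a blocklist of known tracking domains and dropped on a match. The domain names and function below are illustrative assumptions, not any real extension’s API or filter list.

```python
from urllib.parse import urlparse

# Illustrative blocklist; real plug-ins ship curated lists of
# thousands of known tracking domains (these entries are made up).
TRACKER_DOMAINS = {"tracker.example", "ads.example", "analytics.example"}

def is_tracking_request(url: str) -> bool:
    """Return True if the request host is a blocklisted domain
    or one of its subdomains."""
    host = urlparse(url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in TRACKER_DOMAINS)

# A browser extension would hook a check like this into its request
# pipeline and cancel matching requests before they leave the machine.
print(is_tracking_request("https://ads.example/pixel.gif"))   # True
print(is_tracking_request("https://news.example/article"))    # False
```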
Fragmentary, country-specific privacy laws burden companies without protecting consumers. There has been considerable pressure in some countries to ease restrictions on the handling of personal data, yet relying solely on self-regulatory programmes and customer acumen is problematic in the absence of widely recognized and accredited standards. Standardization seems a good way to strike a balance between governments, business and consumers.
“International Standards, I suspect, can support interoperability between and across the different tools and systems that will constitute this ecosystem of data-empowering services. And they have a role in defining and promoting what a consumer-centric approach to terms and conditions should look like,” says Bates. Global standards addressing issues of transparency, data protection and dispute resolution mechanisms can provide the overarching framework that facilitates transborder data flows, with appropriate obligations attached.
That said, ISO standards are voluntary. They are not a replacement for conventional regulation but a supplement that enhances regulators’ ability to mount a cohesive cross-border response built on the self-regulatory capabilities of private-sector platforms and the joint accountability of buyers and sellers. It is a robust process that should begin to narrow the issues dividing nations on data protection law.