That the technology mega-trends predicted for 2020 and beyond will continue their march seems to me inevitable; we’re just left debating the timeframe. But it’s the counter-trends that, I believe, will determine whether privacy is a winner or a loser.
Business models that put the individual in control: Today, data about people is almost exclusively controlled by organisations, whether public or private sector. People have very little control over their own personal data. If data is power, then the scales are tipped heavily in favour of corporations and governments against the individual.
But the cost and complexity of processing, storing, transferring and analysing data have fallen to the point where it is perfectly feasible for individuals to control their own data – in fact, billions of people now do this daily in rudimentary form, as they manage profiles on social media and use smartphones to capture, manipulate and share data. There is no longer any reason why the organisation should be the default point of control for personal data.
What’s more, where organisations function as the default data controllers, the economic potential of personal data is limited, because data remains locked up in corporate silos (even silos as big as those controlled by Google are still silos). The utility of much of this data cannot be unleashed because it cannot easily, legitimately or lawfully be connected with other data from other sources. This data only becomes really valuable when it can be combined with relevant data across all the services that relate to a person’s life – online, retail, financial, governmental and the myriad other sources becoming available.
New entrepreneurs recognise this and are developing solutions that put the individual back in control. By making the individual “the single point of control and integration of data about their lives”, they are able to aggregate data about an individual from all sources and services. In doing so, they are creating an entirely new, and enormously valuable, asset class that is currently diminished by being spread across the myriad data silos owned by the many hundreds of corporations and government agencies we interact with. And there is good evidence that this will enable entirely new services, and significant new economic growth and value.
Aside from enabling economic growth, these new models also happen to offer a market-driven solution to many of the privacy problems we are facing with the onward march of data-generating technology where the organisation is the default controller of that data. Shifting the balance of power back towards the individual must produce a positive outcome for privacy. And because it also offers the possibility of enabling innovation and economic growth, privacy is no longer trapped in a one-sided conflict with forces it cannot hope to defeat. It does not require a balance, or a trade-off, between privacy and growth – it enables both.
A typical example of the sort of new service provider beginning to emerge is the personal data vault or bank. A personal data bank provides the single point of integration for personal data under the control of the individual, and provides related services (much as a normal bank does with your money) that enable the individual to get value from their data – from eliminating repetitive form filling (providing address, delivery and payment data to online merchants), to monetising one’s own data through purchase preferences and ‘intent-casting’, to enabling new, complex ‘decision support’ services. In this model, the individual becomes the curator of their own personal data, able to volunteer more, or more relevant, data and to manage that data to ensure it is relevant, accurate and as comprehensive as they want it to be.
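To make the concept concrete, the following is a minimal sketch, in Python, of how such a vault might mediate access to personal data. Every name and the interface itself are hypothetical – no actual personal data bank is being described – but it illustrates the core idea that disclosure flows from grants made by the individual:

```python
from dataclasses import dataclass, field

@dataclass
class PersonalDataVault:
    """Toy model of a personal data bank: the individual holds the data
    and grants (or revokes) attribute-level access to organisations."""
    owner: str
    data: dict = field(default_factory=dict)    # e.g. {"address": "..."}
    grants: dict = field(default_factory=dict)  # organisation -> permitted attributes

    def update(self, attribute: str, value) -> None:
        # The individual curates their own data, keeping it current.
        self.data[attribute] = value

    def grant(self, organisation: str, attributes: set) -> None:
        # Permission flows from the individual, not the organisation.
        self.grants.setdefault(organisation, set()).update(attributes)

    def revoke(self, organisation: str) -> None:
        self.grants.pop(organisation, None)

    def release(self, organisation: str, attribute: str):
        # Data is disclosed only where an explicit grant exists.
        if attribute not in self.grants.get(organisation, set()):
            raise PermissionError(f"no grant for {organisation} on {attribute!r}")
        return self.data[attribute]

# Usage: eliminating repetitive form filling for an online merchant.
vault = PersonalDataVault(owner="alice")
vault.update("address", "1 High Street, London")
vault.grant("merchant.example", {"address"})
print(vault.release("merchant.example", "address"))  # disclosed with consent
vault.revoke("merchant.example")                     # consent withdrawn again
```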
Once consumers have realistic alternatives, we can expect to see an end to the ‘privacy paradox’ – individuals’ actual behaviours defying their expressed attitudes – as it becomes possible, without disproportionate consequences, to act upon those attitudes by making meaningful choices.
While personal data banks and similar business models do not in and of themselves prevent organisations from collecting and exercising control over personal data regardless, they have the potential to disrupt this simply by being inherently more valuable. Because the value of personal data is closely connected to its relevance and currency – think of personal data as having a ‘half-life’ – ‘personally curated’ sources of data will have higher value simply because they represent the actual wishes and desires of an individual, rather than presumed wishes and desires based on derived data. Moreover, our personal data changes all the time (think of musical tastes, favourite bars or hangouts, travel interests and, for many people, even where they live or the job they are doing). Maintaining personal data at the level of accuracy and currency needed for many applications to be optimally effective is an impossible task for an organisation without the individual’s direct involvement. Conversely, it is practically impossible for individuals to manage their own personal data, and keep it accurate and up to date, when it is spread across hundreds of organisations, each with its own interfaces and approaches.
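As a toy illustration of the ‘half-life’ metaphor (the numbers here are invented for illustration, not taken from the source), the usable value of a data point can be modelled as decaying exponentially as it goes stale:

```python
def data_value(initial_value: float, age_days: float, half_life_days: float) -> float:
    """Illustrative model: a data point's value halves every `half_life_days`."""
    return initial_value * 0.5 ** (age_days / half_life_days)

# A 'favourite bar' signal assumed to halve in value every 90 days:
print(data_value(100.0, age_days=0, half_life_days=90))    # 100.0 - freshly curated
print(data_value(100.0, age_days=180, half_life_days=90))  # 25.0  - a stale, derived guess
```

On this model, a personally curated record effectively resets the clock every time the individual updates it, which is why curated sources hold their value.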
Technology development that supports social norms and values: It’s a cliché that technology is disruptive. And too often we hear that we should accept disruption to our sense of privacy because technology has made it an outdated and redundant concept, and we can’t turn back the clock. Not infrequently, the people who express these views are the very people who helped to create the technology that has brought these things to pass in the first place. This is simply a form of technological determinism.
But technology can and should develop in a way that reflects and supports social norms and values. Since technology is created by people, we are perfectly capable of creating it in ways that take account of privacy and other values. Urban architects have learnt to do this with our physical environment – concerning themselves not just with function and aesthetics, but also with broader environmental impacts, the need to build communal living spaces and the creation of a sense of community.
More significantly, technology is largely the product of private enterprise. To understand why technology has developed the way it has, or how it will develop in future, we need to understand the economic motivations and drivers of those who create it, and the business models that justify investment.
Early applications of data processing technology were focused on efficiency – replacing manual processes with automated ones. Automated data processing required data as input, but once used, that data was surplus to requirements. Personal data was relatively scarce, and even though it was recognised that data needed to flow across borders, it was not seen as a valuable asset in and of itself. But it was recognised that automated data processing had the potential to harm people’s privacy, and so new codes and regulations were created that essentially treated personal data like ‘toxic waste’, to be contained and made safe. Today, rather than being a mere by-product of digitisation, data is a resource defined by superabundance, and has become perhaps the most important driver of economic growth in the digital economy. This will only become more pronounced as we move towards 2020. Organisations are therefore incentivised to create and capture personal data and to exercise control over it.
In short, technology continually causes friction with privacy because commercial organisations haven’t really tried to address the problem. While “Privacy Enhancing Technologies” have a reasonably long history, particularly within academia, they have failed to achieve commercial adoption at any meaningful scale. For instance, cryptographic tools have not been adopted by the general user because there has been no commercial investment in embedding them seamlessly into products that consumers want. This is because, beyond mere legal compliance, privacy hasn’t featured as a strategic priority, and correspondingly organisations have under-invested in the broader range of skills and expertise needed to create and deploy privacy-enhancing products or services – in product marketing, engineering and user experience, for example. There simply hasn’t been a sufficient incentive to do so. And now there is precisely the opposite incentive – to generate and use data as a revenue driver in and of itself.
However, if the individual becomes the point of control, businesses that want to leverage the vast pool of personal data assets available will need to compete with each other to provide the most attractive destination for people’s data. And if businesses are competing to provide individuals with the best ‘personal data banks’ and other tools that enable them to gain control of their own data, and to ‘invest’ it on their own terms, then it will become a business imperative to find innovative and attractive approaches to issues such as individual control and permission, transparency and usability, data portability and ownership, as well as data protection, anonymisation and other counter-surveillance measures. There will be an economic incentive to encourage technology development where personal data control and privacy are functional necessities, not regulatory pipe dreams.
This in turn will create demand from organisations for new skills from technologists and service designers, enabling them to create products that embed respect for privacy-related values from the outset. Universities and colleges will seek to meet this demand by providing not only courses and modules on the fundamentals of what privacy is and why it’s important, but also qualifications in new fields like privacy engineering and privacy design.
The contrast in this respect between privacy and security couldn’t be greater. On the one hand, the security industry has been estimated to be worth $350 billion in the US alone; security is a sophisticated and maturing market. The ‘privacy industry’, on the other hand, is hardly recognisable at all. The reason is simple – in an organisation-centric world, where data is valuable and corporations control it, it is in their self-interest to secure that data. Hence, supply meets demand. But in the privacy arena, there has simply been insufficient demand to stimulate a supply.
But this is changing. Something approximating a privacy marketplace is now becoming a reality, consisting of anti-tracking tools and other counter-surveillance services on the one hand, and personal data vaults and banks that enable individuals to curate and manage their own data on the other. Major players in the internet and communications space have also begun to lay down their markers. As this market develops, consumers will benefit from the greater control over their personal data that results.
Second generation regulation: Nevertheless, we must be wary of replacing technological utopianism with economic utopianism. These competitive forces can be harnessed, but they are unlikely to create change for the good all by themselves. Regulation has an important role to play. But we need a different type of regulation from the data protection and privacy regulation we have today.
Existing data protection regulation emerged in the 1970s and 1980s in response to the computing and data processing developments that began in the 1960s. The underlying assumption was that data processing would always be a complex and resource-intensive activity, and hence would always be the preserve of large, well-resourced organisations. Individuals needed the protection of regulation against the impacts of automated data processing and the decisions it enabled. The regulatory frameworks were generally “command and control” style frameworks: they provided rules regulating the behaviour of the large, static organisation (the ‘data controller’), and were designed to protect the individual who lacked any means to exercise control themselves (the ‘data subject’).
This assumption that the organisation is the natural point of control for personal data no longer holds. Yet our current data protection frameworks are built upon it. Even the latest EU proposals are still essentially based on this model. But with the real possibility of personal control over personal data, and business models emerging to support it, policy makers need to focus on helping this nascent market develop, rather than trying to stem the tide of technology with rules and guidelines.
What’s more, policy makers have struggled to regulate technology in a way that produces commercially deployed technologies that reflect or support privacy norms and values, rather than disturbing them. While there are regulatory restrictions surrounding the use of personal data, these have predominantly resulted in legalistic methods of compliance. I would contend that they have had no significant impact on the design of technologies themselves, how they generate data, or how they make that data available.
Issuing decisions and guidelines after a technology has already been commercially adopted and has started to negatively impact privacy is like closing the stable door after the horse has bolted. And while concepts like data protection or privacy ‘by design’ are constructive ideas, they are unlikely to translate into better technology design at scale simply because they happen to appear in a regulatory instrument. What so many aspects of privacy need is creativity and innovation, and you cannot command an organisation to innovate.
But you can incentivise it to innovate. If a market is encouraged to develop in which individuals are placed in a controlling position at the centre of a personal data market and ecosystem, there will be economic incentives to look for better solutions to the issues people care about. The role of regulation should then become less about issuing detailed rules and requirements (e.g. telling companies what to include in their privacy statements, or specifically how they should capture consent, or whether they need to seek regulatory approval to use data for certain purposes), and more about ensuring that fair and open competition develops and operates to produce beneficial privacy outcomes for individuals, while also allowing innovation and growth with data. This type of regulation has been called “second generation” regulation, a term coined by Professor Dennis Hirsch in the context of evolving environmental regulation. Hirsch describes the evolution from the not-so-effective early post-war environmental “command and control” regulation to the more sophisticated and effective frameworks we see today, which embrace a broad understanding of how economic incentives can stimulate innovation. Hirsch sees a parallel between regulating information privacy and regulating environmental degradation – both require innovation if they are to achieve satisfactory and effective outcomes without stifling economic growth.
However, one very important principle that has emerged within Europe’s attempt to modernize its data protection regime is “data portability”. This principle will require organisations to allow personal data to be exported to another entity at an individual’s request. While the mechanisms for achieving this are by no means trivial (look at how long it took the mobile industry to implement mobile number portability, which is a far simpler undertaking), this is the sort of measure that will facilitate a personal data market to develop and grow. It is both a typical “second generation” form of regulation, and an essential component of an individual taking control of their personal data.
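As a rough sketch of what portability could look like in practice, the snippet below serialises an individual’s records into a structured, commonly used format (JSON) that another provider could import. The envelope fields and function name are invented for illustration; they are not drawn from the draft Regulation or from any standard:

```python
import json
from datetime import date

def export_portable_record(subject_id: str, records: dict) -> str:
    """Hypothetical export: package an individual's data for transfer
    to another provider in a structured, machine-readable envelope."""
    envelope = {
        "subject": subject_id,
        "exported_on": date.today().isoformat(),
        "format_version": "0.1",  # invented versioning, for illustration
        "records": records,
    }
    return json.dumps(envelope, indent=2)

# A new personal data bank could ingest this at the individual's request:
print(export_portable_record("alice", {
    "address": "1 High Street, London",
    "preferences": {"music": ["jazz"]},
}))
```

The hard part, as with number portability, is not the serialisation itself but agreeing common schemas and identity checks across hundreds of providers.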
25. Alan Mitchell, Strategy Director, Ctrl-Shift, speaking on “The Business and Economic Case” at Personal Information Economy 2014, available at: https://www.youtube.com/watch?v=xbQh0DNzAlA&feature=youtu.be&t=5m2s (accessed 17/11/2014)
26. World Economic Forum, “Personal Data: The emergence of a new asset class”, available at: http://www3.weforum.org/docs/WEF_ITTC_PersonalDataNewAsset_Report_2011.pdf (accessed 10/12/2014)
27. Ctrl-Shift, “Personal Information Management Services: An analysis of an emerging market”, available at: https://www.ctrl-shift.co.uk/research/product/90 (accessed 12/12/2014)
28. Some examples are You Technology (http://you.tc/), Personal.com (https://www.personal.com/) and QIY (https://www.qiy.nl/)
29. An example of a complex decision support service would, for instance, enable a household to recalibrate its domestic energy consumption needs. For more information, see “Personal Information Management Services: An analysis of an emerging market”, supra note 27.
30. Martin Doyle, “The Half Life of Data”, available at: http://www.business2community.com/infographics/half-life-data-infographic-0971429 (accessed 10/12/2014)
31. Online contact books, like Plaxo (http://www.plaxo.com/), and social networking services, like Facebook (https://www.facebook.com/) and LinkedIn (https://www.linkedin.com/home), are good examples of how there has already been a shift of control to the individual. In these cases, the process of giving out contact information (e.g. via business cards) and allowing others to manage one’s contact data is replaced by the individual managing their own contact information and creating stable connections online with people they want to stay in touch with.
32. Somewhat ironically, urban architecture is also concerned with other social issues, such as how to reduce crime in urban planning and design through ‘natural surveillance’.
33. The 1980 OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (http://www.oecd.org/internet/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm), the 1981 Council of Europe Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (http://conventions.coe.int/Treaty/en/Treaties/html/108.htm), and the 1995 EU Data Protection Directive 95/46/EC (http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31995L0046:en:HTML)
34. As this recent academic paper illustrates, solutions are available to many of the privacy problems highlighted with pervasive technologies: “A Roadmap for IoT/Cloud/Distributed Sensor Net Privacy Mechanisms”, available at: http://internet-science.eu/publication/1141 (accessed 15/12/2014)
35. Justin Troutman, “People Want Safe Communications, Not Usable Cryptography”, MIT Technology Review, available at: http://www.technologyreview.com/view/533456/people-want-safe-communications-not-usable-cryptography/ (accessed 12/12/2014)
36. ASIS International, “Groundbreaking Study Finds US Security Industry to be Worth $350 Billion Market”, available at: https://www.asisonline.org/News/Press-Room/Press-Releases/2013/Pages/Groundbreaking-Study-Finds-U.S.-Security-Industry-to-be-$350-Billion-Market.aspx (accessed 17/12/2014)
37. Mark Little, Ovum, “Personal Data and the Big Trust Opportunity”, available at: http://www.ovum.com/big-trust-is-big-datas-missing-dna/ (accessed 10/12/2014)
38. For example, Ghostery, Inc., website available at: https://www.ghostery.com/en-GB/
39. For example, devices like the Blackphone are designed to ensure highly secure and encrypted mobile communications. Website available at: https://www.blackphone.ch/
40. Supra note 27.
41. CNET, “Google to encrypt data on new version of Android by default”, available at: http://www.cnet.com/uk/news/google-to-encrypt-data-by-default-on-new-version-of-android/ (accessed 17/12/2014); and see supra note 14.
42. The current draft of the EU Data Protection Regulation is available at: http://ec.europa.eu/justice/data-protection/document/review2012/com_2012_11_en.pdf (accessed 17/12/2014)
43. The controversy over the European Court of Justice decision in the so-called ‘right to be forgotten’ case against Google is illustrative of this, where traditional data protection rules are applied to a technology, i.e. search engines, that was never designed to ‘forget’, to ‘age’ search results, or otherwise address the privacy issues with indexing against individuals’ names. The European Commission’s factsheet on the case is available at: http://ec.europa.eu/justice/data-protection/files/factsheets/factsheet_data_protection_en.pdf (accessed 12/12/2014)
44. Article 23 (Data Protection by Design and Default) of the draft Data Protection Regulation, available at: http://ec.europa.eu/justice/data-protection/document/review2012/com_2012_11_en.pdf (accessed 17/12/2014)
45. Dennis D. Hirsch, “Protecting the Inner Environment: What Privacy Regulation Can Learn from Environmental Law”, available at: http://users.law.capital.edu/dhirsch/articles/hirschprivacyarticle.pdf (accessed 01/12/2014)
46. Article 18 (Right to Data Portability) of the draft Data Protection Regulation, available at: http://ec.europa.eu/justice/data-protection/document/review2012/com_2012_11_en.pdf (accessed 17/12/2014)
47. For a general description of mobile number portability, see http://en.wikipedia.org/wiki/Mobile_number_portability (accessed 15/12/2014)