The right to privacy finds its expression in all the major international human rights instruments. They were all, without exception, drafted and agreed in times very different from those we find ourselves in today. Even as we contemplate the years ahead, there is almost universal acknowledgement of the continuing value and relevance of these instruments and the rights they enshrine. Yet the subject of privacy has never been more in flux, facing a seemingly endless barrage of pressures. Privacy is becoming one of the most vexing public issues of our time, and will remain so in 2020.

Contemporary concerns and debates about privacy are essentially debates about technology and the role and impact of technology on our lives and societies. Practically every mega-trend in the world of technology is creating tensions for privacy, personal freedom and autonomy – ubiquitous connectivity, big data, the cloud, wearable tech, artificial intelligence, the internet of everything, connected health, drones – the list goes on.

It’s no longer just a case of leaving digital footprints from our movements around a digital landscape. As the size of computing continues to shrink to nanotech levels, and the cost continues to fall, technology will become embedded in both the physical world and our physical bodies. We will be living in a world where we are ‘surrounded by computational intelligence’[1].

Technology is becoming invisible. And its unobtrusiveness will aid its pervasiveness – there are already estimated to be 16 billion connected objects today and this is predicted to reach 40 billion by 2020[2]. And this pervasive connected technology will create ever more data. IDC estimates that by 2020 people and connected objects will generate 40 trillion gigabytes of data that will have an impact on daily life in one way or another[3]. This data will make known about us things that were previously unknown or unknowable (including to ourselves). And in doing that, it will enable actions and decisions to be taken about us that will have profound consequences far beyond the display of adverts on our variously sized screens, or personalised pricing based on profiles of our income and propensity to pay[4].

Evgeny Morozov, the author[5] and researcher, gave an example of this recently in his talk at the Observer Ideas festival 2014 in London[6]. In the Philippines, sensors have been placed in public toilets which emit an alarm if someone uses one of the stalls and then tries to leave without using the soap dispenser. You can only turn off the alarm by using the soap dispenser. The sensor thereby has a deliberately regulating effect on the behaviour of users, in this case encouraging hand washing. This is just a logical extension of the seat belt alarms fitted to most new cars built today or the use of speed cameras, the purpose in both cases being to use technology to regulate our behaviour and thereby reduce injury and the cost to health services of car accidents.

Let’s stick with cars for a moment. The installation of a wide range of new sensors in vehicles is already transforming other aspects of motoring, such as insurance. Usage-based insurance schemes use sensors that collect data on location, speed, braking and acceleration to determine the risk profile of the driver, and consequently their insurance premium. The other touted benefit is that such technology acts to discourage risky driving behaviours. In return, we subject ourselves to a degree of surveillance. It will not be long before we see the same technology used for other ostensibly worthy purposes – perhaps identifying that you are too tired to drive and automatically disabling the engine.
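The scoring mechanics behind such a scheme can be sketched in a few lines. This is a minimal, hypothetical illustration – the field names, weights and discount/loading factors below are invented for the example, not drawn from any real insurer’s actuarial model:

```python
def risk_score(trips):
    """Return a 0-100 risk score from a list of trip records.

    Each trip is a dict with hypothetical keys:
      'hard_brakes' - count of harsh braking events
      'speeding_km' - kilometres driven above the speed limit
      'night_km'    - kilometres driven late at night
      'total_km'    - total kilometres driven
    """
    total_km = sum(t['total_km'] for t in trips)
    if total_km == 0:
        return 0.0
    # Normalise risky behaviours by distance driven.
    brakes_per_100km = 100 * sum(t['hard_brakes'] for t in trips) / total_km
    speeding_share = sum(t['speeding_km'] for t in trips) / total_km
    night_share = sum(t['night_km'] for t in trips) / total_km
    # Weighted sum, capped at 100 (weights invented for illustration).
    return min(10 * brakes_per_100km + 50 * speeding_share + 30 * night_share, 100.0)


def premium(base_premium, trips):
    """Scale a base premium by the driver's risk score: a score of 0
    earns a 20% discount, a score of 100 adds a 40% loading."""
    factor = 0.8 + 0.6 * (risk_score(trips) / 100)
    return round(base_premium * factor, 2)
```

A real telematics model would of course draw on far richer data (time of day, road type, cornering forces) and calibrated actuarial weights; the point is simply that a handful of sensor-derived numbers can move a premium up or down.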

Of course, it might be argued that none of this compels us to allow sensors into our cars, homes and other parts of our lives, or the collection of data about us – we are not compelled to use usage-based insurance or to drive “intelligent” cars, and so we have a choice. But if refusing to allow the collection of data by sensors becomes a costly decision (e.g. increased car, home or health insurance[7] premiums), it’s a choice that is easier to make for those who can afford it. And, of course, once sensors and data-generating technologies become embedded in products as standard, there will come a point when there are few realistic alternatives.

This rise of technology that not only observes, but intervenes (I’ll term it “bossy tech”), is a consequence of placing sensing technology in more and more places where these ‘interventions’ can be automated, based upon the exponential increase in data sources that can be analysed in real time with intelligent computing. And as bossy tech gets a lot smarter it will no doubt get bossier, as public authorities acquiesce in the notion that technology can regulate our behaviour far more efficiently than traditional enforcement methods – why waste money on policing public spaces if cameras and audio sensors can detect potentially unsociable behaviours, use facial and voice recognition to identify the individuals involved, and then order them to stop or else face the consequences?

The value of digital identity, i.e. the sum of all digitally available information about an individual, has been estimated to be worth €1 trillion to the European economy by 2020[8]. The internet of things is predicted to generate a value-add of $1.9 trillion globally by 2020[9]. Much of that value is likely to come not from the ‘things’ themselves, but from the data derived about those things, which promises to transform every sector, bringing efficiencies and cost savings but also entirely new service possibilities[10]. Whatever the figures, there is undoubtedly a huge economic incentive to generate and collect data from whatever sources become available. As more data from more things becomes available, we can expect to see a data “land grab” by organisations.

The control of data provides organisations with valuable insights and enables influence over purchasing decisions and other behaviours. Increasingly, therefore, data is power, economic or otherwise. There is already an undoubted asymmetry of power between organisations and individuals: organisations have an abundance of information about consumers and the analytics tools to interrogate it, while consumers suffer information scarcity and possess few tools to make sense of their own data[11]. And this appears to be getting worse, according to Sir Tim Berners-Lee[12]. The 2014–15 Web Index, an annual report measuring the Web’s contribution to social, economic and political progress published by the World Wide Web Foundation, reveals that the web is becoming less free and more unequal.

In the absence of any countervailing forces, the current technology mega-trends look set to create further asymmetries in power resulting in less privacy for individuals in 2020.

[1] Brian David Johnson, Intel, Wired UK retail talk, available at: (accessed 10/12/2014)
[2] ABI Research, “The Internet of Things Will Drive Wireless Connected Devices to 40.9 Billion in 2020”, available at: (accessed 10/12/2014)
[3] IDC white paper, “The Digital Universe of Opportunities: Rich Data and the Increasing Value of the Internet of Things”, April 2014, available at: (accessed 10/12/2014)
[4] Blogger Alistair Croll declares that “personalization is another word for discrimination” in his post titled “Big data is our generation’s civil rights issue”, available at: (accessed 23/11/2014)
[5] Evgeny Morozov homepage, available at: (accessed 01/12/2014)
[6] Observer Ideas - A Festival for the Mind, 12 October 2014. For an introduction: (accessed 17/12/2014)
[7] Barclay Ballad, “Now you can get financial reward for your personal fitness data”, 9 December 2014, available at: (accessed 17/12/2014)
[8] Liberty Global, “The Value of Our Digital Identity”, available at: (accessed 10/12/2014)
[9] Gartner, Inc. newsroom, “Gartner Says the Internet of Things Installed Base Will Grow to 26 Billion Units By 2020”, available at: (accessed 09/12/2014)
[10] Harbour Research, “Where Will Value Be Created In The Internet Of Things & People?”, available at: (accessed 09/12/2014)
[11] Mark Little, Ovum, “Personal Data and the Big Trust Opportunity”, available at: (accessed 10/11/2014)
[12] World Wide Web Foundation, “Recognise the Internet as a human right, says Sir Tim Berners-Lee as he launches annual Web Index”, available at: (accessed 17/12/2014)
  • julianranger

    To discuss privacy fully I think we need to define what it is in the digital sense.
    We seem to be caught between two schools of thought on privacy – either privacy is dead (aka Mark
    Zuckerberg, and more recent posts in the same vein) or the Go Dark movement. To me this seems to be looking at the issues incorrectly, because we haven’t defined what privacy is.

    Specifically, being private doesn’t mean not sharing anything – it means being in control of what you
    share, to whom and when. For example, I am a private person, but I share sex with my wife, I share family issues within my family group, I share my finances with my financial advisor, I am happy for my supermarket to know what I buy. The point is that in the physical world I am largely (but never completely) in control of my privacy and that includes what I share and with whom.

    So privacy does NOT mean no sharing. This is important as sharing is the grease to the future economy – combining different data sets that I share will enable radically new services and experiences that I have yet to even think of. Privacy equates to controlled sharing. There is a spectrum of sharing for data items: from items I keep solely to myself, to items I share with one or a few people and ask not to be shared further, to data I may share more widely and allow to be reshared, to data which I share with the world (either as me or in anonymised form).

    I’d like to include “for what purpose” in the above definition of what privacy implies regarding control – and I think most people would too. If I disclose a secret to a close friend so I can get feedback, for example, I do not expect that secret to be disclosed to others – it was only for the purpose of our conversation. However, I can’t control my friend directly and he may tell others, in which case of course he has lost my trust and I probably won’t share with him again – or at least will share more carefully. It is of course the same in the digital world. If I share with you for a purpose and you use it for another purpose, then I am unlikely to want to share with you again.
    So, I propose we define digital privacy as “the ability to control your personal data, including who you share it with, when and for what purpose”.
    (Note: the dictionary defines Privacy as the “condition of being secret”. In my digital privacy definition I propose this is equivalent to “being in control of who is in on the secret”.)

    • Patrick

      Hi Julian, I love the simplicity with which you’ve spelled out that privacy does not equate to no sharing.

  • Tim Jones

    Knowing the Unknown
    Building on the existing point about ‘Knowing the Unknown’ (by 2020 people and connected objects will generate 40 trillion gigabytes of data that will have an impact on daily life in one way or another; this data will make known about us things that were previously unknown or unknowable), DC workshop participants saw that there will be increasing tension between the ability to predict and manage behaviour and conditions on the one hand, and self-determination and free will on the other. In addition there may be a greater information imbalance, with others knowing more about you than you know yourself. Views on free will vs. societal good may well differ around the world.