The Future of Privacy – Options and Possibilities

There are plenty of predictions about technology – from the utopian visions of a bright new hyper-efficient world where robots free humanity from drudgery, to doom-laden predictions of pervasive surveillance and the demise of personal autonomy at the hands of governments and corporations. But there are a number of counter-trends emerging that present their own narrative about how the future will play out.

Privacy is a public issue: The public’s perception of the threats to privacy, personal freedom and autonomy – whether from corporations or governments – is growing. Privacy has moved beyond a niche, specialist concern to become a mainstream public issue. It seems that almost weekly new research is released revealing increasing public concern about privacy and declining levels of trust in organisations’ handling of people’s personal data[1].

In addition, a lesson the public has learnt from Edward Snowden’s revelations is that data controlled by organisations will always be susceptible to access by governments using extensive legal powers of disclosure and surveillance. This is becoming a liability for communications and technology companies which, under pressure from their users, are beginning to take measures to put some control back into users’ hands[2].

This growing consumer and citizen awareness and distrust looks set to accelerate and will increasingly become a factor in decision making for ordinary people – decisions about the products we use or abandon, the brands we associate with, the political leaders we elect. And as data insights become increasingly actioned by bossy tech, this will exacerbate the trend – behavioural observations, and the interventions that result, will increasingly be seen as unwarranted intrusions and restrictions on personal freedom and autonomy.

Digital activism will expand the digital commons: Consumers are taking matters into their own hands. A 2013 study from the Pew Research Internet project found that “86% of internet users have taken steps online to remove or mask their digital footprints—ranging from clearing cookies to encrypting their email, from avoiding using their name to using virtual networks that mask their internet protocol (IP) address”[3].

The plummeting cost and complexity, and increased ‘consumerisation’, of computing, processing and storage mean that activists are now able to harness technology for themselves, without the aid of corporations and governments. The ‘digital commons’[4] will continue to grow, empowering more and more citizens and consumers to take matters into their own hands – deploying end-to-end encryption and anonymizers[5], and “watching the watchers”[6].

Business model disruption is inevitable: The default internet business model – advertising – is showing some signs of strain, and even the biggest players such as Google are openly exploring new models[7]. Yet the value in personal data is so great, and the levels of public mistrust in organisations’ handling and use of personal data are so high, that it is inconceivable to me that entrepreneurs will not make a serious effort to exploit this disparity. What we are already witnessing is the emergence of new business models that threaten to disrupt not just the default internet business model, but more broadly the assumption that the organisation is the natural and legitimate point of control and ownership of personal data. Instead, new disruptive providers are seeking to put the individual in control of their personal data[8]. In the process, they are seeking to disintermediate data-intensive businesses from their existing sources of data.

Regulation will get tougher: Policy makers will act to toughen laws, even though they move at geological speeds compared to the rate of technology development.

New laws and regulations are being promulgated around the world, many following the European model[9]. And Europe is on a journey to update and toughen its data protection laws[10]. The EU proposals will increase fines, place tougher requirements on organisations for obtaining consent, and create a new ‘data protection by design’ obligation. The fines alone will focus attention, forcing organisations to devote more time and resources to compliance.

[1] The Royal Statistical Society, “New research finds data trust deficit with lessons for policymakers”, available at: (accessed 10/12/2014)
[2] Apple, Inc, “A message from Tim Cook about Apple’s commitment to your privacy”, available at: (accessed 10/12/2014)
[3] Pew Internet Research, “Anonymity, Privacy and Security Online”, 5th September 2013, available at: (accessed 12/12/2014)
[4] In her 2012 book, “Consent of the Networked”, Rebecca MacKinnon describes how activist individuals play a key role in influencing the shape of technologies and the balance of power in her chapter on the Rise of the Digital Commons. Summary available at:
[5] For example, The Onion Router (TOR). See the Wikipedia entry available at:
[6] An example is the TrackMap project, whose aim is to show where data travels when people visit their favourite news websites through visualization, available at: (accessed 15/12/2014)
[7] CITEworld, “Google for business: Now 100 percent ad-free”, 16th May 2014, available at: (accessed 10/12/2014)
[8] Ctrl-Shift, “New market for ‘empowering’ personal data services will transform relationships between customers and brands”, 20th March 2014, available at:  (accessed 10/12/2014)
[9] For example, in South Africa the Protection of Personal Information Act 4 of 2013, in Ghana the Data Protection Act 2012, and in India proposals in the form of a Privacy Bill.
[10] European Commission Data Protection newsroom, available at:

The Future of Privacy – Impacts and Implications

Threats to privacy from new trends and developments in technology look set to continue in 2020 and beyond. But the impact of the counter-trends, and the effect they may have in constraining or shaping technology, has received less attention – perhaps with the exception of law and regulation. As someone who has spent most of my professional life helping large organisations comply with law and regulation, I am often surprised at the level of faith placed in law and regulation alone to deliver acceptable outcomes to complex problems like the impact of technology on our privacy.

Law and regulation is very effective at creating momentum and movement. By creating fear in board rooms, it can galvanise organisations to focus on compliance. But this does not guarantee that the things organisations do as a result will be pleasing to all concerned, even if they appear to meet the requirements of the law, and organisations can claim to be fully compliant. This is the problem we have faced to date with technology and privacy – there is no lack of law, legal opinion and guidance; yet there is continuing dissatisfaction with how things are, i.e. the outcomes we are left with.

This is because very often policy makers do not know what those outcomes should be and it would be a mistake for the law to try to determine them. While we are capable of identifying what we don’t like, it’s much harder to say what we do like – or more to the point, how we would like the future to actually look.

It’s therefore a case of sticks and carrots. Hit the donkey with a stick and the donkey will move. But it’s unlikely to go in the direction we want it to. Dangle a carrot under its nose in the direction we do want it to go, and it will generally follow the carrot. Law and regulation is good at creating impetus and momentum, but it won’t guarantee that we get to a desirable destination. To do that, we need incentives. Fortunately, the green shoots of these incentives can be found among the other counter-trends.

The possibility that individuals can now begin to take control of their own personal data is upending long-established norms about the control of personal data – the assumption that the organisation is the default point of control. This is heralding the emergence of new entrepreneurs who see an opportunity to strike a new deal with consumers, offering them control. But not control simply for its own sake (worthy though that may be); rather, control as a way to exercise greater autonomy over many aspects of their lives that today are made too complex and too difficult by data being controlled elsewhere. And in doing so, there is the potential to unlock enormous economic value from personal data.

This potential for economic disruption to come to the aid of privacy (if not its complete rescue) by shifting power over data from the organisation to the individual is one of the most significant trends emerging as we look to 2020. It needs to be harnessed if we want to shape the development of technology to preserve the rights enshrined in all the major human rights instruments.

The 19th August 2014 was the 25th anniversary of the Web. This year, 2015, is the 800th anniversary of one of the most important legal developments in history – the Magna Carta. The Magna Carta was all about a shift in power – from the English King to the nobles – but in defining the principles for how power is distributed and constrained, it laid down the foundations of England’s legal system and has influenced legal systems across the world. In celebration of the 25th anniversary of the web and the 800th anniversary of the Magna Carta, Sir Tim Berners-Lee has called for the creation of a ‘Magna Carta for the Web’ in 2015[1], and has declared that we need to “hardwire the rights to privacy, freedom of expression, affordable access and net neutrality into the rules of the game”[2].

This is a fitting aspiration. But just as the Magna Carta was a response to the shift of power from King to nobles, hardwiring the web in order to protect privacy will require a shift of power over personal data from the organisation to the individual.

[1] “Tim Berners-Lee calls for internet bill of rights to ensure greater privacy”, The Guardian, available at: (accessed 17/12/2014)
[2] Supra note 12

Introduction: Defining loyalty

Before discussing the challenges facing those of us who work in the ‘loyalty space’ in more depth, it is probably worth providing an overview of what the term means to us.

To us, loyalty is a particular way of thinking about the relationship between brands and consumers. It is about what happens beyond the moment of simple transactions, and the specific products being bought and sold; beyond even the sometimes powerful messages contained in advertising. Instead, loyalty describes the long-term relationship and value-exchanges between brands and their customers, of which those momentary transactions are just a part.

Of course the word ‘loyalty’ covers a range of emotions and behaviours that go far beyond the commercial space, including our relationships with family and friends, political parties, nation states, religions, football teams and so on. In fact, the question “where do your loyalties lie?” goes a long way toward the formation of our very self-identity. And we are well aware that commercial or, dare I say it, brand loyalty lies at one end (perhaps the less invested end) of the human loyalty spectrum. Nevertheless, a person’s consumer loyalty does lie on the spectrum and can still involve similar kinds of emotional attachments and accompanying behaviours. The implication is that even when talking solely about the future of consumer loyalty, we should still bear in mind the future of loyalty more generally, and the evolving ways in which people will emotionally align themselves with different values, ideas and propositions.

The Future of Loyalty – The Global Challenge

Loyalty in the future will not be like loyalty in the past. This much we know. Where once simple equations ruled (the customer collects points, the customer saves), there is now a chaotic, multi-channel hubbub increasingly driven by fast transactions and instant gratification, and the need for brands to think more deeply about the emotional, less rational, drivers behind the kinds of loyalty behaviours that might once have been exemplified by your grandmother insisting on her monthly trip to the local department store.

For brands that aspire to create customer loyalty in this new disorderly world, there is a fundamental question: quite simply, what will ‘loyalty’ in the future be? Already the conversation has long since moved on from the traditional points and prizes models, through ideas of personalised loyalty experiences for individual loyal customers, and on to the challenge of customer- and context-led customisation of loyalty experiences. But where will this conversation lead us? And where, in terms of a customer’s emotional relationship with a brand, will ‘loyalty’ begin and, indeed, end?

The key drivers behind the evolutionary changes to the loyalty model have of course been technological, both in terms of our ability to collect and store more customer data, and in terms of communications platforms that allow consumers to talk to each other in the same spaces (social media and mobile platforms in particular) that also allow for real-time, in-context marketing and brand-consumer interactions. These new technologies have brought new possibilities, and theoretically at least, brands now have a dizzying array of tools with which to create new kinds of long- and short-term emotional connections with their customers. But those same tools have also presaged a new kind of consumer, with new and distinct expectations, some of which look determinedly disloyal.

However, reports of the ‘death of loyalty’, evidenced by increasingly brand-fickle consumer behaviours, perhaps driven by consumers now being empowered by access to different choices and information, may be exaggerated. It is always worth remembering the two sides of the loyalty coin: on the one, those customer behaviours that look, for all intents and purposes, like loyalty; and on the other, the brand-created, customer experiences that are designed to drive those behaviours. Brands may have been mistaken in assuming that ‘loyalty’ behaviour was ever more than ephemeral, dependent on loyalty schemes with a specific shelf-life; but that does not mean that brands cannot seek to redefine loyalty experiences and find new ways to drive loyal behaviours. The challenge lies in understanding the consumer of the future, and their redefined needs and expectations.

Loyalty has actually always been about creating an exchange of value between brands and consumers and especially about the value brands can provide beyond the specific features of a product being bought and sold, creating an emotive loyalty. This is unlikely to change. But understanding what kinds of value are likely to be exchanged in the future is a challenge. We need to answer the question fast, since, in this age of digital engagement and interaction, in which one-way advertising messages are now only part of the picture, the consumer is empowered to quickly seek, find and even demand, gratification of his or her own personal needs. Brands will need to respond to this, or find that their once ‘loyal’ customers are enticed elsewhere. In particular they will need to start seriously addressing the ‘harder to quantify’ aspects of the value exchange, and reconcile the rational value exchange with the less rational emotional value exchange.

Let’s get down to the nitty-gritty.

One of the tools that brands increasingly have at their disposal is data (or ‘big data’ to use the fashionable term). We can now know a lot more about consumer behaviour at both the individual and group level. But we need to learn how to harness it, to make sense out of it, and to create beauty out of it. This challenge brings a number of attendant questions, such as: how can we build data collection into business models? How can we know what the best or most relevant kinds of data are to collect? And of course, how can we use this data to create new kinds of loyalty experiences and value exchanges? Lurking ominously in the background is the question of how far consumers will allow us to collect and use their personal information, and what they will expect in return. The backlash is already beginning in some quarters, although the question of whether there are generational differences in the value placed on personal information is an interesting one. Either way, for brands, providing genuine value in new ways and making commitments to honesty and transparency look like inevitable first steps.

Assuming we answer some of these questions, we then face another immediate challenge: the ‘fat wallet’ problem. As data collection and storage become ubiquitous, and the ability to contact and interact with customers does too, there are more and more opportunities for brands to move into the loyalty space and offer their own, unique loyalty experiences. Banks, airlines and hotels are the traditional players in the space, but already we have seen multiple other entrants, not least, of course, the likes of Google and Facebook, the very architects of many of the changes we are seeing in customer behaviour.

Consumers will increasingly face the literal and metaphorical problem of having a wallet (or purse) fat with loyalty cards. In this scenario, the value of loyalty may become diluted, the consumer may become overloaded, eventually disengaging from loyalty altogether, and brands will face an increasingly uphill struggle to remain ‘front of mind’, even when the value they offer is particularly relevant. One solution to this may be to start thinking away from ‘pro-active’ loyalty, in which the consumer must actively and consciously take part in a loyalty scheme (too many of these and wallets become fat), and on to more ‘passive loyalty’ models that demand less of the consumer. On the other hand, consumers may be happy to put up with fat wallets, in order to ‘smarten up’ their consumption patterns, using loyalty schemes strategically.

Behind these more broadly conceived challenges lie the questions and uncertainties surrounding the physical (or digital) mechanisms and infrastructure that will underpin loyalty experiences themselves. As already noted, technology has driven many of the changes we have seen, and it is likely to do so in the future. We might, for example, see a proliferation of payment systems, or indeed a convergence. Loyalty currencies (points, air-miles etc.) might become instantly convertible and flexible enough to be used across contexts and/or borders (a question which raises others around creating loyalty experiences that are relevant in different cultural contexts – are loyalty behaviours in China driven by the same set of value propositions?). The mobile wallet is both a certainty and an uncertainty for those of us thinking about the future of loyalty. It may have little impact beyond changing the mechanism of payments, or the effects could be more profound.

Similarly, the channels for brand-consumer communication and interaction are likely to increase. Mobile is a certainty, but what about the so-called ‘internet of things’ or wearable technologies? Which inventions and innovations are the most likely to be adopted, and which will prove the most effective channels for the types of relationship-building that drive loyalty?

Associated with all this comes the question of the impact of real-time, in-context feedback, interaction and marketing. Will the ability to make prices dynamic, rewards instant, and responses to consumer demands individually relevant all mean that traditional, long-term loyalty models become meaningless or (to use an excruciating pun) pointless? More likely, perhaps, is that short-term transactional consumer behaviours and longer-term loyalty-driven value exchanges will co-exist, and it will be more a question of which consumers are looking for which type, and which sectors and brands can generate the different types of services to deliver to those different needs: providing mechanisms that address the relatively simple needs of the instant transaction as well as the more complex and diverse variables that go into shaping what makes a consumer loyal.


The Future of Loyalty – Options and Possibilities

As I have already hinted, there are a number of possibilities for the future of loyalty. Change is certain, but little else is. That said, there are some fundamentals upon which we can rely. Consumers will still shop, spend and almost certainly continue to look for value propositions beyond just the features offered by specific products. In other words, there is still likely to be a space for loyalty. The idea of ‘knowing your customer’ is also going to remain, albeit transformed into a new challenge defined by the tensions between the ubiquity (and inevitability) of access to ever more customer data, the right to collect that data, how and where you can store or share it, and the puzzle of what to do with it once you have it. Alongside this, the death of the traditional media model (if it is even still alive) will finally sink in; what are now considered novel channels of communication will become the norm.

These certainties are more than likely to lead to an enhanced role for high-quality data managers and analysts (or data management and analysis systems). They will lead to a period of re-definition, evolution and innovation in the kinds of value exchanges and exchange mechanisms that define loyalty offers. They will lead to a different set of consumer expectations, perhaps to the point that brands will no longer be able to deliver to them on their own. Strategic brand alliances, designed to deliver sophisticated choice and content to complex consumer needs, are likely to emerge.

Less certain are the changes that new technologies will bring, especially in terms of payment mechanisms, mobile wallets and communications technologies. We know that consumers will face choices in all of these areas, but which ones they will adopt en masse remains uncertain. Will consumers opt to keep personal information private, while expecting to enjoy the benefits of dynamic prices and rewards from multiple brands in multiple contexts? Or will the increasing demand from consumers for relevancy and personalised content tip the balance in favour of greater sharing? Ultimately, can brands create sufficiently tempting, relevant offers and experiences using the tools at their disposal (gamification, curating, understanding and so on) to hold the consumer’s attention and make them more willing to engage and invest? The only certainty here is that the consumer is likely to gain the upper hand in the power dynamic, and principles such as ‘great customer service’ will no longer be negotiable.

The Future of Loyalty – Proposed Way Forward

In practical terms, there are a number of ways forward. There is an immediate need to understand the changes being wrought on consumer needs and expectations. Significant investment in consumer research and data management and analysis seems to be a no-brainer. This research will itself have to be mindful of what we know is coming, and specifically aimed at solving the problems outlined already: how to understand ‘big data’ and make it useful, and how to analyse and explore the impacts of new technologies on attitudes and behaviour so as to feed directly into reformulations of truly customer-led value propositions.

In tandem with this, and using a method that has been made much easier by the very same technologies we have been discussing, brands need to be unafraid of testing. We don’t know what will succeed in the future, nor which of today’s offerings will fail, so brands face a dilemma: continue to innovate and test a wide variety of solutions and technologies and see what works (which brings the risk of spreading focus and investment too thin and failing with all); or pick a winning horse or horses, focus there, and be successful, but be exposed when consumers grow tired of that platform and switch to something new.

As the pace of uptake of new solutions increases exponentially, especially among younger generations, it is ever harder to decide on the right strategy. The savvy business will be prepared to fail in this environment, but also to learn from that failure, just as much as it must be prepared to respond to successes quickly.

In terms of actively innovating, brands will need to explore different possibilities and be open to new models. Innovation might be encouraged through strategic alliances with unlikely bedfellows for example, perhaps from different sectors, or from clever acquisition, or investment in or promotion of (lean) start-ups or suppliers.

Above all though, brands must place the customer at the heart of business models. This is likely to involve creating new business models and organisational structures that allow for customer engagement and management to become a core function that cuts across traditional silos, and helps to focus entire businesses on the contextual needs and value opportunities for different audiences at different stages of a customer journey or experience.

The Future of Loyalty – Impacts and Implications

The implications of everything I have discussed are broad.

Consumers’ ideas of utility value, and likewise their expectations of loyalty, are likely to move from a recognition of the value in standard, ‘always available’ loyalty propositions to dynamic, exciting, changing and variable experiences that are ‘here today’ and ‘gone tomorrow’. This will mean an increase in customer-driven engagement in order to see what is or isn’t available at any given moment, rather than the annual ‘collect, save, spend’ patterns. However, we must address exactly what kinds of emotional connections can be created between brands and consumers, and explore the levers that brands might be able to pull to create them, beyond the rational economic levers of points, rewards and monetary value. In doing so, of course, we may discover that the irrational emotional connections are even more valuable than the rational economic ones that have so far dominated.

Finally, behind all of these discussions – and the fact that brands and consumers are beginning to interact more frequently and directly, with more customer information sought, collected and utilised – we are also likely to see increases in external (governmental) intervention, and the possibility of regional or national ‘balkanisation’ in the ways in which brand-consumer relationships are regulated. This could happen even as companies attempt to move against such trends by, for example, initiating cross-platform integrations of customer management in which every brand touchpoint is connected (without recognition of borders) and actively collecting customer data.

In economic terms, the need for brands to have access to the resources (especially the technical resources) to take part in this new world of customer engagement may begin to crowd out smaller players, at least in the short term. And competition for loyalty is likely to mean squeezed margins even for the bigger players. In the coming years, brands will need to be disruptive in their thinking about loyalty, seeking new kinds of value proposition, exploring different models and redefining the very ways in which loyalty is conceived.

The Future of Learning – The Global Challenge

The main global challenges pertaining to learning are related to the curation, contextualisation and control of a rapidly increasing amount of data, information and learning content. As the O3B (‘other three billion’) initiatives make continuous efforts to provide internet connectivity to the world’s developing markets, there is going to be a definite shift in the use, makeup and function of the internet as its usership reaches unfathomable numbers. With this considerable expansion in connectivity, as well as the increase in widely available cheap devices at a time when 60% of online traffic is already mobile, there is going to be a tidal wave of content that is accessible all of the time, anywhere. As the ability to learn whatever, whenever continues to empower the individual learner, traditional learning content providers and distributors will face the challenge of repositioning themselves within the new ecosystem that is emerging.

For learners, everything they will need to know in order to progress in their chosen discipline will be available online, but it is going to be vital that there is a way of filtering and curating this overwhelming wealth of information in a way that is simple, intuitive and valuable. A learner needs to feel confident that the answers they are getting are accurate, up-to-date and the best input for meeting their needs.

With learning taking place across a vast range of content types and platforms another challenge will be providing an assessment and accreditation framework that is able to reflect the investment and aspirations of learners around the globe. The learning that takes place on a mobile device at the instigation of an inquisitive learner needs to have the same status as courses delivered in the traditional learning environments of schools and universities.

A key question that arises is whether virtual, online learning is able to replicate the powerfully immersive interactions that form the basis of face-to-face exchanges. Learning is grounded in the interplay of conversation, experience and meaning. Are applications and algorithms capable of creating meaningful and relevant learning opportunities that are based on actually understanding the learner and responding to their needs?

Furthermore, is the world in danger of losing the ability to ‘learn’ properly? With every answer to every question only a touch screen away, will learners simply thread together an uninterrupted sequence of hastily-consumed information chunks rather than internalising and applying their knowledge in ways that are personal to them? Or, conversely, is a new learning skill being developed as a result of the immense amount of information at our disposal? This skill could enable learners to locate, extract and apply precisely what they need, precisely when they need it, without having to rack their memories for classes they took years previously. The words “I don’t know” would become redundant.

It is certain that learning material will no longer be delivered in discrete packets of content as with the print model. Instead, publishers and content creators will have all of their content available in the cloud for learners to access as and when they need to. Learning content will emulate the model of music streaming; rather than purchasing the music as a product, the listener pays for access. As such, a learner will be able to engage with valuable learning content as and when they need to without needing to subscribe to full courses or a full set of materials.

We would also predict that the flexibility and responsiveness of digital learning platforms and approaches will greatly influence the way that learning is promoted in traditional environments. As adaptive and personalised learning develops thanks to the considerable data that is being captured on the behaviours and abilities of learners, so too will classrooms and other physical learning spaces become less rigid and passive in their arrangement and use. In all levels of education, from reception to university, learning spaces will evolve into configurable, inductive interfaces that empower the learner to create an environment that works best for them. The ancient paradigm of a teacher-led learning approach as represented by rows of identical desks or chairs facing the same single point of reference at the front of the room will be replaced by a more fluid, collaborative pedagogical method.

Furthermore, we predict that there will be a movement away from a top-down, broadcast approach of learning to a hyper-collaborative global network consisting of learners, institutions and content providers. Larger entities will emerge within that network but there will no longer be any oligopolies in the learning sector. Well-established learning institutions will need to learn how to best position themselves within this new learning ecosystem.

It’s uncertain whether the adaptive learning technologies that are able to leverage the immense amount of data generated by and about each individual learner will be able to provide the same quality of learning that face-to-face instruction has done historically. An adaptive learning engine is able to identify what content a learner needs to cover in order to achieve predetermined objectives, for example, but can it help a learner discover for themselves what it is they need to learn in order to reach their own set of goals? Despite the personalisation that is provided through adaptive learning products there is still the challenge of maintaining the focus on the individual and their desires and ambitions when it comes to their learning.

It’s also uncertain how learning institutions and the hyper-collaborative network paradigm are going to exist in combination. It can be argued that there will remain a place and a use for institutions that implement a more deductive pedagogical approach, but how such institutions will communicate and contribute to the network of connected and highly-motivated learner/users is difficult to anticipate.

Furthermore, it’s not clear what the impact will be of the overwhelming amount of information that is going to be available once internet connectivity reaches the O3b markets and as mobile interactions continue to represent the lion’s share of internet traffic. How will learners navigate and filter the vast volume of material at their disposal in order to locate content that is directly going to be of benefit to them?

The Future of Learning – Options and Possibilities

With the rise of adaptive learning in increasingly digital learning environments, we would view the ‘rehumanisation’ of the learning space as a compelling option for addressing the attendant challenges. This would entail the promotion of productive crowdsourced learning networks whereby individuals are able to elicit answers or input from a globally dispersed community of learner-users. These communities would be self-organising, self-regulating and capable of responding quickly and reliably to an individual learner’s needs. This ecosystem of P2P connections would act as an organic filter for the learner, collaboratively curating the vast amount of information available and providing responses and recommendations based on collective experience.

An adaptive learning layer may be added to this model that would then make recommendations or suggestions based on the learner’s online history or search behaviour. Rather than making suggestions in the form of content chunks to cover, however, the adaptive learning layer could suggest topics, themes or areas of study that are relevant or related to the material the learner is choosing to interact with. In a sense the adaptive learning element would become a virtual curriculum developer that responded to the preferences of the individual learner.
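To make the idea concrete, here is a purely illustrative sketch of how such a layer might rank suggested themes rather than prescribe content chunks. The data and names (`CO_OCCURRENCE`, `suggest_topics`) are invented for this example; a real system would mine these relationships from large-scale learner behaviour:

```python
from collections import Counter

# Hypothetical co-occurrence data: which topics learners tend to study together.
# In a real adaptive layer this would be derived from behavioural data at scale.
CO_OCCURRENCE = {
    "statistics": ["probability", "python", "data visualisation"],
    "probability": ["statistics", "combinatorics"],
    "python": ["statistics", "data visualisation", "algorithms"],
}

def suggest_topics(history, top_n=3):
    """Rank themes related to the learner's history, excluding topics already covered."""
    counts = Counter()
    for topic in history:
        for related in CO_OCCURRENCE.get(topic, []):
            if related not in history:
                counts[related] += 1
    return [topic for topic, _ in counts.most_common(top_n)]

print(suggest_topics(["statistics", "python"]))
# → ['data visualisation', 'probability', 'algorithms']
```

The design point is that the output is a set of themes for the learner to explore, not a prescribed sequence of content, which keeps the choice of direction with the individual.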

In addition, the evolution of the Semantic Web by the World Wide Web Consortium (W3C) will potentially provide an in-built solution to navigating the vast amount of data when looking for applicable learning material. The Semantic Web will present online data in terms of relationships and relevance rather than as straightforward text-based search criteria. A learner will be able to engage with online content that understands what they are looking for and how it relates to and impacts other topics.
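As a loose illustration of what this means in practice (the data here is invented), Semantic Web content is typically expressed as subject–predicate–object triples, so a query can follow relationships between topics rather than match text strings:

```python
# Hypothetical triples describing learning content, in the spirit of RDF.
TRIPLES = [
    ("Photosynthesis", "isPartOf", "Plant Biology"),
    ("Photosynthesis", "requires", "Chlorophyll"),
    ("Cellular Respiration", "isRelatedTo", "Photosynthesis"),
    ("Plant Biology", "isPartOf", "Biology"),
]

def related_to(topic):
    """Collect everything connected to a topic, following links in either direction."""
    out = set()
    for subject, predicate, obj in TRIPLES:
        if subject == topic:
            out.add((predicate, obj))
        elif obj == topic:
            out.add((predicate, subject))
    return out

print(sorted(related_to("Photosynthesis")))
# → [('isPartOf', 'Plant Biology'), ('isRelatedTo', 'Cellular Respiration'), ('requires', 'Chlorophyll')]
```

A learner searching for “photosynthesis” would thus be offered the surrounding web of related topics, not just documents containing that word.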

The Future of Learning – Proposed Way Forward

In the first instance, we would propose the implementation and integration of ongoing assessment of the use of technology within traditional learning environments. It’s already apparent that technology is becoming embedded in classrooms and lecture theatres, so it would seem a logical progression of that evolution to start observing how that tech is being used by the learners themselves. Educators could carry out regular review sessions with their students to gain an insight into how learning tech and online resources are being leveraged in the attainment of identified learning goals. This could then contribute to a new model of adaptive curricula that are realised at the intersection of teacher and technology.

The deliberate observation of technology-enabled learning would help attitudes towards educational technology to become more proactive and engaged, as opposed to reactive and resistant. This phase of observation can be global as well as local, especially in the light of the O3b initiatives that are going to dramatically increase the number of people with access to the Internet. How are learners in India using their tablets compared to learners in Mexico, for example? Are learners gravitating towards similar sites or applications? What questions are being asked?

To complement this observation we would suggest that educators encourage their learners to source information from their own Personal Learning Networks (PLNs) and to also actively contribute themselves to requests from other individuals within their communities.

We would also propose the widespread use of adaptive learning technologies in conjunction with teacher-led enquiry. This would allow the creators of learning technologies to learn from the application of their products and to refine them further. The out-of-hand rejection of such technologies would only delay the creation of more advanced, more intuitive systems that are better able to meet the needs of the learner. In the meantime, it would also capture an enormous amount of quantitative data on how learners are interacting with technology and how they are engaging with their learning materials. This will in turn help to inform how learning content can be created.

The Future of Learning – Impacts and Implications

Embracing adaptive learning and encouraging crowd-sourced learning solutions would help to radically change the culture surrounding learning and promote the shift from a top-down model to one of collaboration and exchange. There needs to be an alignment of learning potential and practice in order to allow the extensive benefits of learning technology to be realised. This requires the active participation of all parties within the learning space: educators, learners, content creators, publishers and tech developers. We would even go so far as to predict that there will be less and less distinction between those functions across the learning space as connectivity continues to improve.

The Future of Health – The Global Challenge

The healthcare and wellness industry is going to drive the world economy of the 21st century. Globally healthcare is already well over a $6 trillion industry. But, despite its size, it only addresses about 30 per cent of the world population; nearly 70 per cent is nowhere near receiving decent healthcare services. We need a revolution in order to service the entire market.

The major issue primarily revolves around access to healthcare. The world’s first heart surgery was performed in Oslo in 1895 – well over a century ago. A hundred and twenty years later, only 10% of the world’s population can afford it. We can and must do better. The future cannot be just an extension of the past. It must embrace new technology, implement innovative approaches and aim higher than people previously thought possible.

The 21st century will see a rapidly growing demand for healthcare, but this demand looks unlikely to be met in the way it was in the past century. For one thing, to treat the 21st century’s problems with a 20th century approach to healthcare would require an impossible number of doctors. For another, caring for the chronic diseases that are growing in prevalence is not what doctors are best at.

Before we explore the future challenges and options, we should however recognize that over recent years we have already achieved a good deal. Globally, on average, we have never been so healthy, wealthy and educated. Although there have been long-term improvements in health delivery and care, it is over the past few decades that progress has really started to build momentum. This has happened partly because advances in technology, public health and governance have all aligned, and partly because there has been shared understanding of what the big issues are and how to address them. As the IMF has highlighted, child death rates have fallen by more than 30%, with about three million children’s lives saved each year compared to 2000. Deaths from malaria have fallen by one quarter in the same period.

But, as the WHO points out, we still have major challenges to address:

  • The average annual rate of decline in the number of women dying due to complications during pregnancy and childbirth is far below the target needed to reach the Millennium Development Goal
  • While HIV infections have declined by 33% globally, sub-Saharan Africa still accounts for 70% of all new infections
  • Although the global tuberculosis mortality rate has fallen by 45% since 1990, multi-drug-resistant TB continues to pose problems, with an estimated 450,000 people developing it each year.
  • In 2012 almost half the world’s population were still estimated to be at risk of malaria with Africa bearing 80% of new cases and 90% of associated deaths
  • Moreover, as the current Ebola epidemic in West Africa highlights, our ability to prevent such disease epidemics is limited, primarily due to low levels of public health provision in many major population centres.

Increasing access to affordable essential medicines is vitally important but several factors undermine availability in many countries. These include poor medicine supply and distribution, insufficient health facilities and staff, low investment in health and the high cost of many medicines.

Contrasting the world’s most developed healthcare market with that in India we can see many significant issues. US healthcare spend is spiraling upwards above 18% of GDP while in India, for example, the figure is just over 4% against a global average of 10%. Worldwide health spending is expected to increase by 5% next year. In India, where the government has now promised to introduce universal health insurance, spending is expected to rise by 18%.

While US life expectancy at birth is now around 80, in India we have just reached 67. Over the past thirty years, our infant mortality rates have dropped from 118 to 42 per 1,000 births, compared to fewer than 5 in the US. In the US the prevailing market means that a healthy person can expect to spend $142,000 on out-of-pocket health expenses in the 20 years after turning 65. If they have a chronic disease this figure doubles, and if they live until 90, they will need an extra $75,000. In the US there are 2.5 physicians per 1,000 population; in India we have 0.7.

The Future of Health – Options and Possibilities

Many in the ‘developed’ world are focused on the benefits of technology improving the effectiveness and the efficiency of healthcare. With many countries expecting to be spending up to a fifth of GDP on healthcare by 2050, the need for more effective use of resources is clear.

Certainly the potential to use information to drive more personalised care may well open up access and raise quality while controlling costs. Especially in the pharmaceutical arena, personalised medicine and the prospect of customised therapies based on more sophisticated diagnostics are a major focus for many researchers, and the opportunities for genetically oriented pharmacogenetics are substantial. With most current medicines working for only 1 in 10 patients, and many $1bn blockbuster cancer drugs effective in just 25% of patients, the potential for bespoke treatments is significant. However, some see that, in the short term, these innovations will be primarily focused on the developed world’s more established healthcare markets and will take time to have global impact.

Tele-health, and especially ‘m-health’, has already shown great promise globally. Especially in sub-Saharan Africa and India but also elsewhere in Asia, the opportunity to use mobile as a platform for both curative and preventative healthcare has been attracting much attention from governments, entrepreneurs and the mobile networks alike. With real-time monitoring an increasing norm and the entrance of major global technology companies such as Apple and Google into the area of personal and remote monitoring, the potential is indeed significant. While the business model for preventative healthcare is yet to be fully defined, those such as McKinsey and the GSMA see this as a means of saving $200bn a year just in treating chronic diseases across the OECD and BRIC countries.

Alongside these significant new platform shifts there is also the need to improve access to effective treatment of fast-rising chronic diseases. According to WHO figures, by 2020 major chronic diseases are expected to contribute to 73% of all deaths and 60% of the global disease burden. Moreover, 79% of the deaths attributed to these diseases will continue to occur in developing countries. Addressing this requires both behavioural changes across many areas of society around consumption and exercise as well as structural change in the way healthcare and sick-care is provided. If we are going to stem the rising tide of chronic disease and deal with its consequences we need a far more integrated approach to wellness and healthcare that works across all societies and not just a select few. We need to integrate primary, secondary and tertiary prevention and health promotion across sectors and different disciplines.

The Future of Health – Impacts and Implications

Healthcare is a unique industry that creates millions of jobs for millions of households, both skilled and unskilled.  Unlike manufacturing, healthcare is not dependent on any finite components. It is dependent mostly on human skill. And human skill is replenishable. We can technically reduce the price of any service to any level we want: Surgeons are like technicians – the more surgeries they perform, the better they get at it. But behind every skilled doctor you need to have at least two highly skilled nurses, at least four or five technicians, and good administrators.

By 2022 India needs to have 200,000 specialists, 450,000 doctors and over 1.2m nurses. If every country has an adequate number of surgeons, radiologists, anaesthetists and cardiac surgeons, believe me, costs will come down by more than 50%. It is a question of demand and supply.

In global forums everyone talks about reducing the cost of healthcare. But no one knows how much they are spending today. At Narayana Health we have invested in technology. Every day at noon I get an SMS on my cell phone with yesterday’s revenue, expenses and EBIDTA (earnings before interest, depreciation, taxation and amortization) margin. For us looking at a profit and loss account at the end of the month is like reading a post-mortem report. You cannot do anything about it. Whereas, if you monitor it on a daily basis, it works as a diagnostic tool. You can take remedial measures.

The principles that we have developed and refined in India can certainly be applied elsewhere. We have developed what some see as a ‘frugal’ innovation approach to several healthcare challenges and hence have proven design solutions for low-income populations. These solutions can also be applied to higher income economies with even greater efficiency benefits.

As an example of how the Indian approach can provide more efficient high quality healthcare, you can look at Health City in the Cayman Islands that we opened in 2014. Health City is not only a lower cost alternative for patients needing heart, cancer and eye surgery in North and South America, it will make clear how overpriced and inefficient hospitals in the US really are. Health City in the Cayman Islands will show that lower costs and better outcomes can be done outside India just as well as in Bangalore. In the US it currently costs approximately $1.25 million per bed to build a hospital. Health City is costing only $250,000 per bed. Furthermore, in the Cayman facility prices are less than half the average US costs for surgical procedures with quality outcomes matching the very best.

Global healthcare affordability will not come from the United States or any of the current world leaders, but rather from those nations of the world that have little today and have no choice but to perform at the highest levels possible in the future.

The Future of Health – Proposed Way Forward

I want to enable every man, woman and child to have access to high-tech healthcare within the next 15 to 20 years, including in the poorest regions of the world. Today, most healthcare interventions are not accessible to nearly 90% of the world’s population. The way forward is not a new medicine or a new scanner or a new operation – it is a process innovation to bring healthcare to everyone.

Most countries suffer from a simple mismatch: the demand for health care is rising faster than the supply of doctors. One approach to making doctors more effective is to focus what they do. This is something that we in India have been dedicated to.

At Narayana Health our focus has been on offering as many operations as possible using the core resource without compromising on quality. Surgeons do the most complex procedures and other medical staff do everything else. In addition, by using the latest technologies such as tablets in the ICU instead of patient charts, simulations to train critical care nurses and telemedicine to access those patients in remote parts of the country, a far higher quality of healthcare is delivered than the global norm.

Alongside our process innovation priority, this means that surgeries in the organisation’s 18 hospitals across 14 Indian cities typically cost between $1,600 and $2,000 each – less than half that of other Indian hospitals and about one-fiftieth as much as a similar procedure in the US: two per cent of the cost, with outcomes that rival the best in the US.

Equally in other areas of Indian healthcare, similar efficiencies are also being achieved. LifeSpring hospitals have reduced the price of childbirth by augmenting doctors with less expensive midwives: Their costs are about 20% of those in a private clinic. In addition, Aravind Eyecare provides cataract surgery to about 350,000 patients each year for around $50 each: Operating rooms have at least two beds so that surgeons can quickly move from one patient to the next and, for every surgeon, there are six ‘eye-care technicians’ specifically trained by Aravind to perform many of the other tasks in the operating theatre that, in other countries, require a surgeon’s training.

Japanese companies reinvented the process of making cars. That’s what we are doing in healthcare: What healthcare needs is process innovation, not product innovation. It’s all about numbers. Because we do a large number of operations, our overheads are distributed over a larger number of patients. Equally, because we implant the largest number of heart valves in the world we get heart valves at a lesser price.

Looking ahead, I see that the efficiencies we have achieved through the approaches that we have taken in India can be applied globally. With an aging society and escalating costs, the 20th century model of healthcare still practiced in many countries today is unsustainable and we need to shift the model forward.

In addition, I also see a need to change the world of health insurance. There has to be an alternative way of funding healthcare. 10 years ago we convinced our local government to launch a health insurance programme and convinced 1.7m farmers to contribute 5 INR (8c) per month and the government became the reinsurer. Today the premium has risen to 18 INR (US$0.27) per month. In 10 years, 450,000 farmers have had treatment and 60,000 of them have had a heart operation all because of the power of 5 rupees per month. Today we are covering high technology healthcare for nearly 3 million farmers.

Now we are trying to convince policy makers that micro-health insurance is the best model for the whole of society. In India we have 850 million mobile phone subscribers who are spending 150 rupees per month just to speak on the phone. So if we can collect 20 rupees from each mobile phone subscriber, we can cover the healthcare of another 850 million people. The Indian government will soon become not only a healthcare provider but also a health insurance provider.
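The arithmetic behind this proposal is straightforward. The sketch below uses the figures quoted above; the rupee-to-dollar conversion rate is an assumption added for illustration:

```python
# Back-of-envelope sketch of the proposed micro-health-insurance pool.
# Subscriber count and premium come from the text; the exchange rate is assumed.
subscribers = 850_000_000        # mobile phone subscribers in India
premium_inr_per_month = 20       # proposed monthly contribution in rupees
inr_per_usd = 60                 # assumed exchange rate, circa 2014

annual_pool_inr = subscribers * premium_inr_per_month * 12
annual_pool_usd = annual_pool_inr / inr_per_usd

print(annual_pool_inr)                    # → 204000000000 (204 billion INR per year)
print(round(annual_pool_usd / 1e9, 1))    # → 3.4 (billion USD per year)
```

On these assumptions, even a 20-rupee monthly premium pools roughly $3.4bn a year, which suggests the scale at which universal micro-insurance becomes plausible.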

The Future of Government – The Global Challenge

In recent years, the debate in contemporary political science has centred around the political institutions that limit or check power, like democratic accountability and the rule of law. However, as Francis Fukuyama has pointed out in his article, “What is Governance”, little attention has been paid to the institution that actually accumulates and uses this power – the state. While there have been repeated claims of the withering of the state over the past decades, few of these have proven accurate. In fact, there has been a need for increased government capacity to deal with the increased demands placed on the state. In many countries, this has been exacerbated by an underinvestment in public sector capacity over the past few decades. We need to go beyond the usual conversation about how the state carries out the business of governance and back to the more fundamental questions of what is the role of the state and why this is important.

To understand the trends that affect the role of the state, we have to consider the context in which the state operates. Governance falls roughly between the fast- and slow-moving components of society, nature and culture on the one hand and infrastructure, commerce and fashion on the other. This presents an interesting challenge for states because the components that change quickly get all the attention, but those that change slowly have all the power. The fast learn, propose, and absorb shocks; the slow remember, integrate, and constrain. Managing the tension between the fast- and slow-moving components of society is core to the role of the state and how it will evolve. In Singapore, it might mean that while it is relatively quick to change policies with regard to home loan restrictions, cultural norms and values around home ownership can take a longer time to shift.

In his book, “The End of Power”, Moises Naím suggested that we were “on the verge of a revolutionary wave of positive political and institutional innovations”. Naím described the shift in power through three revolutions, which in turn would impact the role of the state:

The More Revolution: As people became more numerous and were living fuller and longer lives, they became more difficult to regiment and control.

The Mobile Revolution: As people became more mobile with the ease of migration, power lost its captive audience.

The Mentality Revolution: As people became more affluent they had higher expectations of living standards.

Looking at this from the perspective of relative rates of change, one observes that these revolutions have taken place within the timespan of one to two generations, much more quickly than similar changes that have taken place in the history of societies. This has led to a compression of timescales within which the state operates. The middle-class uprising in countries like Brazil, where there has been a mismatch of expectations around the sustainability of economic growth and improved standards of living, is a manifestation of the tensions that can emerge from these revolutions.

So the key question to answer is can governance keep pace with the changes in the rest of society?

According to David Ronfeldt, new information and communication technologies have enabled dispersed, often small actors to connect, coordinate and act jointly as never before. This favours and strengthens network forms of organisation and represents a structural change in the operating environment for states.

When institutions and markets were the dominant organisational form, there were economies of scale allowing for the efficient management of large units, in many cases by the state. However, in a network, the state is but one of many stakeholders. Without economies of scale through centralisation, common market-based measures of state performance, like efficiency and productivity, also become less useful.

Not all participants in a network are equal, and leadership still matters. In a network structure, the state would have to adapt the way it exercises power and performs its role. Leaders can have a louder voice, but have to build the legitimacy to exercise it. This would increasingly become the challenge for states operating within the network. Ronfeldt therefore suggests that power and influence appear to be migrating to actors who are skilled at developing multi-organisational networks, and at operating in environments where networks are the dominant organisational form. In general, non-state actors are ahead of state actors operating in this environment and this may present a shock to established centres of power, as will be described in the following section.

In a network form, other entities compete with the state for influence within the web, like environmental, human rights, and other activist nongovernmental groups, which operate at many levels of government around the world. This new dynamic changes the role of the state. Non-state actors are starting to have state-like power and capability, ranging from diplomacy to urban planning to provision of public services. For example, Zappos’ founder, Tony Hsieh, invested $350 million to transform the decaying and blighted part of the old Vegas Strip into the most community-focused large city in the world. The Downtown Project has already funded over 60 tech start-ups and 21 small businesses with the ultimate goal being to invest in 100-200 entrepreneurs. This makes Tony Hsieh the de facto mayor of downtown Las Vegas. This type of activity is not limited to entrepreneurs. According to a CNN report in 2006, “Hezbollah did everything that a government should do, from collecting the garbage to running hospitals and repairing schools”.

Globalisation and the free movement of capital have enabled multi-national corporations to become a network of supranational entities, exporting goods and services as well as culture and ideology to the states in which they operate. For example, Procter & Gamble was the first company to hire women in Saudi Arabia. Although Saudi labour laws have a provision for employing women, many companies have been unwilling to cause cultural controversy. Multinationals also form the basis of connectivity in a transnational network, providing air travel, sea freight and global telecommunications capabilities. What results is that domestically, multinationals have assets and access to resources that can rival some states. They have a disproportionate say in the regulation and public policy agenda when, as a result of their global supply chains, they represent industry lobbies for national safety standards.

The state is relatively good at dealing with the problems that are defined in terms of the Westphalian concept of state, for example, sovereignty and international trade. It typically has established mechanisms to safeguard its interest and power. However, it has become increasingly difficult to establish what the state actually has jurisdiction over and this creates new forms of market failures. While states retain the jurisdiction to manage resources within their physical and geographical boundaries, many resource and public-good problems resist a state-centric approach. For example, governance by norms, spheres of influence and interlocking societal relations rather than comparatively inflexible international law could make the management of trans-boundary problems easier.

In a G-Zero world, where every state is for itself, ineffective mechanisms to deal with the growing trans-boundary nature of problems will lead to more pressure for a distributed, bottom-up model of global governance system. Small states like Singapore have a clear interest in an open, rule-based system as they face heightened risk in a system where there are no longer strong institutional platforms to safeguard their interests. Such states may find themselves shifting from playing price-taker or “pivot” roles to advocating for strong international rule of law and no unilateral actions.

Today, many individuals regard themselves as “city-zens”, that is, their residency in a city is core to their identity regardless of their actual citizenship and voting rights. However, the current governance system is not good at taking into account factors such as the preferences of the non-voter (for example, city-zens), the environment and future generations. What results is not only rising expectations on the part of citizens (voters in the political process), but that the state increasingly also has to look at the interests of non-voters as well.

As technology expands at an ever-increasing rate, society struggles to keep up. This has led to the erosion of social mobility: the rise of robotics and automation is wiping out many middle-skill jobs, and, coupled with the expansion of higher education opportunities in emerging markets, this means there will be fierce competition for those jobs that remain. In addition, the structure of the modern economy is changing. The increased demand for high value services imposes a high barrier to entry. Only a fraction of the workforce is able to participate in the value creation that these sectors provide. What results is what Kenichi Ohmae called the “M-shaped society”, where income distribution in Japan is becoming polarised due to the impact of technological change and globalisation. The ability to provide education and middle-skilled high-paying jobs was one of the state’s levers for upward social mobility in the past, but this has eroded over time.

The rise of social media and surveillance technologies has led to changing expectations of the policy making process. On the one hand, individuals are more empowered; on the other, empowered individuals demand more from the state. What results is what John Keane calls “monitory democracy”, where “the powerful consequently come to feel the constant pinch of the powerless”. New technology also presents governance challenges as the state struggles to regulate in an increasingly complex and uncertain environment. For example, stringent IP laws may become obsolete with new production technologies like 3D printing, and autonomous vehicles could change the transport landscape, creating new liability issues.

The Future of Government – Options and Possibilities

What are the implications for the role of the state? In response to these trends, we should consider how the state’s role might change. We will also highlight weak signals that suggest how the role of the state might evolve in Singapore. Broadly, the state faces two challenges to its role, as follows:

The first is the redistribution of wealth through taxation and the provision of public services. Globally, austerity measures have forced states to cut back on their fiscal spending and this has constrained their ability to supply public services. In Singapore, one of the fiscal challenges highlighted in the “Singapore Public Sector Outcomes Review” is how to raise sufficient revenue to invest in the range of capabilities and infrastructure that Singapore needs to survive and succeed in the future. In this constrained environment, the state needs to find other ways to increase the “supply” of the state.

Secondly, governance is a competitive marketplace. There can be both private and public supply of social services and individuals are mostly free to choose which they prefer. For example, in a society where there is a widening gulf between rich and poor, the rich may live increasingly separate lives and provide for their own “public services”. On one hand, this could allow the Government greater focus in providing services for the needy; on the other, the rise of gated communities and privatised social services could signal the beginning of deterioration in the quality of public services as the rich opt out. The state also needs to consider what public services it has a role in supplying vis-a-vis other stakeholders, and how it might partner them to deliver better services. The provision of public services by the state may not necessarily keep pace with the increase in demand; in fact, sometimes the increase in supply of public services also increases the demand. In this case, the role of the state might be to play specific coordination functions, and allow civil society or private sector partners the space to grow as new providers of public services.

The Future of Government – Proposed Way Forward

Joseph Nye argues that transactional hard power skills, like organisational ability and political acumen, are just as important as transformational soft power skills, like communications, vision and emotional intelligence. The state must develop a kind of “contextual intelligence” to be able to apply the best combination of hard and soft power skills in different situations. It is worth considering what new capabilities the state should invest in to ensure “supply” for the future, both in its ability to deliver on its promises and in its ability to shape the direction in which it is moving. In retail parlance, “consumer insights” provide a key to what the “supply” should be. Likewise, for the state to undertake this type of sense-making work, it has become important to get not only data from economists and engineers but also insights from sociologists and anthropologists.

As Singapore approaches fifty years of rapid progress, sense-making would also have to take into account the development of its slower-moving components – in terms of its history, culture and heritage. In August 2011, the Government launched the Singapore Memory Project, a nationwide movement that aimed to capture and document precious moments and memories related to Singapore. Intangible assets such as collective memory are important in maintaining the resilience of our country, as Singapore seeks to become more adept at managing its pace of change. As the state seeks to be more responsive to growing public pressure, how can it work with new or existing providers of public services to split the load? What capability gaps have arisen because of the change in the operating environment? What new capabilities should the state invest in to ensure “supply” for the future?

The rise of the network structure and the expanding influence of non-state actors also present opportunities for states to facilitate networks of responsibility and to build inclusive institutions in place of traditionally more extractive ones. What results is greater experimentation and decentralisation, leading to more robust processes and outcomes. There are weak signals of this happening in Singapore. In 2013, local social enterprise SYINC launched a collaborative, community-focused project, “Under the Hood”, to crowdsource innovative solutions to Singapore’s urban poverty challenges. The initiative brought together a range of organisations from the private and people sectors, and acted as a lab to prototype micro-level, local solutions that are scalable if proven successful. The potential for greater collaboration with such initiatives creates a specific role for the state in the network: to identify successful ideas and scale them, leveraging its resources and existing infrastructure to augment the delivery of public services.

Some argue that only looking at increasing the “supply” of the state with limited resources leads to a vicious cycle. One of the reasons for this is that increasing the “supply” of the state can enlarge the issues that come under the purview of the state, thereby creating its own demand. When there is surplus demand for public services, the instinct is for the state to fill the gap. However, this sometimes generates more demand for said services. Therefore, a more sustainable solution might be to find ways to reduce the “demand” on the state that can lead to a more virtuous cycle.

The nature of trust may be different in a networked structure. Even though the quality of public services has improved, there has still been a declining level of trust in governments, institutions and elites. There is a growing sense amongst the middle class that the “system” is rigged in a self-serving way and that it lacks the capacity to deal with emerging challenges.

Trust in a network structure depends on long-term reciprocity of relationships: there need to be fair outcomes for stakeholders in these networks, and a perceived “fair” allocation of costs and benefits. Contribution, participation and reciprocity then lead to trust over time. In this environment, the appropriate scale of decision-making may be smaller, which can favour small states like Singapore, although it is worth considering how we might further localise decision-making to build more trust.

Efforts to invite participation from the network have to be designed with care. In 2006, the New Zealand government undertook a review of its Policing Act. One stage was to open the Act up on a wiki for two weeks so that the public could contribute. However, the Parliamentary Counsel Office expressed concerns about the format required and about whether the public had the expertise to contribute meaningfully to drafting legislation. Furthermore, in a low-trust environment, the public may question the role of a preventative government in protecting its citizenry and the potential legality of an infallible prosecutor.

How might the state create more space for network actors to take greater responsibility?

The state often retains the reputational risk and overall accountability for outcomes.

How can the state share responsibility while maintaining the influence over outcomes?

One of the ways the state can legitimize itself to its constituents is to facilitate the building of relationships with the people and other sectors to co-provide solutions to problems. There are many well-studied factors that contribute to the demand for the state – for example, the origins of crime, educational failure, indebtedness, family breakdown, psychological trauma and ill health – yet the demand for the state is derivative: people are actually demanding that certain services be provided, not necessarily that the state provide them. This delineation opens up many possibilities for the state to co-opt other partners into the picture, with the state retaining an important role in designing the architecture of the networks in a sector and facilitating access. In Singapore, for instance, a single transport app can function as a gateway for all things to do with transportation by aggregating available data, facilitating greater access for other non-state partners, and enabling the public to find solutions for themselves.

One of the challenges facing the state, especially in the area of public policy innovation, is how to balance equity and autonomy. A centralised system is often viewed as more equitable at the expense of autonomy. However, as the governance system gets more complex, hidden forms of inequity appear even in a centralised system, such as the difficulty of navigating it. Decentralised service provision at the hyper-local level can actually help to reduce this inequity. The emergence of charter schools is a good example of this decentralised approach working in practice, because the focus was on outcomes rather than on process. This represents a shift in the role of the state from ensuring equity in process to ensuring equity in outcomes.

The Future of Government – Impacts and Implications

One of the roles of the state is to ensure parity in process, if not in outcomes. However, in certain areas, enforcing strict levels of compliance generates greater demand for state intervention. In Singapore, for example, the Workplace Safety and Health Act was amended in 2006 to focus on Workplace Safety and Health systems and outcomes, rather than merely on compliance, giving the regulation the flexibility and robustness to keep pace with technology and the changing nature of work. Setting and monitoring the outcomes of individual agencies, while useful, is insufficient. In recognition of this, the Ministry of Finance and other Ministries have worked to jointly establish whole-of-government outcomes, along with suitable indicators to track progress towards achieving them. In addition, when the state is better able to measure outcomes, greater possibilities in funding design, beyond grant funding, open up, allowing states to more effectively measure and manage their resources and increase their impact, for example through the incorporation of behavioural insights.

The operating environment for the state has changed. Networks have displaced institutions as the dominant organisational form. The influence of non-state actors, in particular multinationals, has expanded. Jurisdiction has grown beyond national boundaries. Technological change has outpaced society. Consequently, the role of the state has had to evolve; to succeed in this new operating environment, the state needs both to increase the “supply” of the state and to reduce the “demand” for the state.


The Future of Food – The Global Challenge

Food is fundamental to human existence and health, yet many of the world’s inhabitants experience ongoing hunger. For some this is due to drought, for others war, and for many it is a lack of money to buy food. The United Nations Food and Agriculture Organization estimates that 850 million people worldwide are hungry, and a greater number suffer from nutrient deficiencies. Approximately one billion people have inadequate nutrient intake, while others have excessive calorie intake. Obesity has become an epidemic in developed countries, while in some developing societies the double burden of nutrient deficiency and obesity is apparent. The challenge of preventing hunger and malnutrition will become even greater as the global population grows from the current 7 billion people to nearly 10 billion by 2050.

Not only is the global population increasing, we are living longer and becoming more affluent. As incomes increase, diets become more energy-dense and meat becomes a larger proportion of the diet. These changes in population and cuisine have led to a tremendous rise in the demand for animal-source protein. The competition between livestock and humans for grains and other high quality plant foods, whether real or perceived, is recognised as a major challenge. This has become more complicated with the diversion of grain to the production of biofuel.

For many years there has been an ongoing debate about the benefits or otherwise of animal-source foods, especially red meat consumption. In the past, claims of the detrimental effect of animal-source foods on human health were made without rigorous scientific investigation. There is no doubt, however, that animal-source foods, including lean meat, poultry, eggs and milk, are an excellent source of protein and micronutrients. Fish can be added to this list, but wild fisheries are rapidly being depleted. It should not be forgotten that humans evolved as ‘meat eaters’. It is unlikely that we will lose our appetite for meat, but we must curb it. In many instances, the mechanism that allows impoverished families to improve their income and wellbeing is access to livestock or poultry.

Whatever diet we choose in the future our food will need to be produced more efficiently. Increased agricultural productivity must come from a reduced land area and resource base. Arable land continues to be lost due to soil degradation and urbanisation. We will need to be less dependent on resources that are becoming scarce, like arable land and water, or more costly, like energy and petrochemical-based inputs, including fertilizers. Some would argue that it is how we manage the nexus between food, water and energy that is our biggest challenge for global food security.

At the same time, the environmental impact of agriculture should not be forgotten. There is no doubt that agriculture exerts considerable pressure on water supplies, especially where irrigation is used. What form of energy will agriculture use in the future to produce, process and transport our food? The impact of agriculture on plant and animal biodiversity and other ecosystem services must also be addressed. Pollination of crops by bees is an integral component of agricultural production, and any disruption to this ecosystem service could have devastating consequences for food production.

Climate change will accentuate the challenges identified above. Pest and disease problems of plants and animals are likely to increase partly in response to climate change. Consensus exists regarding impacts of agricultural production, processing and distribution of food on global climate change. A significant proportion of anthropogenic emissions of greenhouse gasses come from agriculture and these emissions need to be reduced.

Just as the climate system is global, so is our food system. While globalisation may create opportunities and increase food distribution, the benefits predominantly flow to those with a developed and secure food supply. Government subsidies, import restrictions and food safety legislation all militate against equitable distribution and pricing of food. In some situations this will lead to civil unrest.

The Future of Food – Options and Possibilities

In developing countries, where many people exist as subsistence farmers, the food system is relatively straightforward. In developed economies, by contrast, the food system or agricultural supply chain includes all aspects of crop and animal production, aquaculture, processing, storage, and the distribution of food products through wholesale and retail systems. More opportunities exist to guard against adversity and to increase productivity when the food system is complex and not reliant on a few staples.


Food production must increase substantially, but over the next decade both systems must cope with more severe climate events (2014 was the hottest year on record) and increased globalisation as more free trade agreements are signed. The increased amount of food required will need to be produced with finite water supplies on existing areas of arable land. There is general agreement that another “Green Revolution” is required, but today’s revolution must be different in order to overcome existing environmental, financial and societal constraints. It is no longer possible or responsible to use unlimited water and chemical inputs to increase production. Other approaches to food production and processing must be found that use existing and new technologies in conjunction with appropriate, sustainable social policies. Policies must ensure the conservation of global biodiversity and animal welfare. The Commission on Sustainable Agriculture and Climate Change identified seven critical areas for the transition to a sustainable global food system:

  1. Integrate food security and sustainable agriculture into global and national policies
  2. Significantly raise the level of global investment in sustainable agriculture and food systems in the next decade
  3. Sustainably intensify agricultural production while reducing greenhouse gas emissions and other negative environmental impacts of agriculture
  4. Develop specific programs and policies to assist populations and sectors that are most vulnerable to climate change and food insecurity
  5. Reshape food access and consumption patterns to ensure basic nutritional needs are met and to foster healthy and sustainable eating patterns worldwide
  6. Reduce loss and waste in food systems, targeting infrastructure, farming practices, processing, distribution and household habits
  7. Create comprehensive, shared, integrated information systems that encompass human and ecological dimensions

We must achieve all of these goals. Future food production must combine vastly increased productivity with good environmental practice. Meeting these goals will require the effective use of science. Biotechnology, with its evolving “omics” tools (genomics, proteomics, metabolomics), will allow the development of new approaches to counter some of the complex problems we now face. With these approaches it will be possible to fast-track the improvement of current crop plants, conferring agronomic traits such as yield and tolerance to environmental stress while using the same or diminished inputs, along with the ability to withstand pathogen attack and potential contamination with mycotoxins. The coming generation of crop plants may have value-added outputs such as improved nutrient content and food functionality, and may serve as sources of biomass for biofuel production and human therapeutics.

Another important area that will undergo a major renaissance is microbial ecology, with the application of molecular biology techniques. While microbial ecology is not a new concept, it is pivotal to understanding the presence and functioning of microbes in complex and dynamic food environments, both outside and inside the gastrointestinal tract. As we understand more about the complex and dynamic microbial ecology of foods, we will be in a better position to manipulate the biotic and abiotic factors that enhance food quality and human health. Similar improvements will be made to animal health; indeed, it is the unique microbial ecology of ruminant livestock (cattle and sheep) that allows them to convert human-inedible plant feeds and by-products into nutritious human foods.

The other platform that should permit a major leap forward is nanotechnology. It holds promise for responding to the need for more precise management of resources such as water and fertilizers, improving crop and livestock production, controlling pests, diseases, and weeds, monitoring plant disease and environmental stresses, improving postharvest technology, including waste management and food safety. It will allow the application of precision agriculture in both developed and developing economies.

However, without consumer acceptance, new technologies will not succeed. This will require education and communication of the benefits that will accrue from their application, achieved against a backdrop of increased consumer interest in locally produced food and organic agriculture. These “feel-good” approaches to agriculture will not meet the food demands of the future, but the more useful aspects of these practices must be part of future food production.

The Future of Food – Proposed Way Forward

Despite daunting challenges, the application of contemporary food production and processing practices along with scientific advances combined with appropriate social policies can underpin sustainable food production systems. Clearly, the solution to the challenge of meeting future food demands lies in increased agricultural productivity everywhere, but particularly among small-holder farmers, of whom there are millions worldwide. Mixed crop and livestock production systems produce about half of the world’s food supply. Targeting these systems should be a priority for policies to sustainably intensify production by carefully managed inputs of fertilizer, water, and feed to minimize waste and environmental impact, supported by improved access to markets, new varieties, and technologies.

The global food system is extremely complex and the gap between developing and developed nations is not only in economics but also in science, governance, and public information. Thus, to tackle these issues, a number of areas must be addressed urgently:

  • Science and research: There has been a global decline in agricultural R&D over the past four decades, and there is now an urgent need to redouble the agricultural research effort. The new food-producing system has to be science-based with low resource inputs. To ensure this occurs, there must be definable career paths to encourage the next generation to enter agriculture and food research.
  • Economics and education: Increased economic development is required in developing countries, hand-in-hand with education. These improvements will ultimately decrease the birth rate. In many economies, women manage the food cycle, and their recognition and education should be a priority. In developed economies, education will be equally important, as consumer attitudes will strongly influence the eventual acceptance of new technologies and the adoption of different patterns of food consumption. Part of the economic equation must be to pay farmers more for their products.
  • Sustainable diet: Part of the solution to feeding the planet is the development of consumption patterns that meet requirements in a safe, nutritious and affordable manner. In developed countries this will mean learning to eat sustainably, with less reliance on meat. Through the application of the tools of molecular biotechnology, future nutrition will be personalised to account for individual variation and to improve health and wellbeing.
  • Waste: Postharvest losses of plant foods in developing countries can amount to 30 to 50% of production due to a lack of storage infrastructure. In developed countries we throw away a similar proportion of all food produced. The combined loss would feed about 3 billion people. Reducing wastage will provide breathing space to allow the development and adoption of new food production technologies.
  • Governance: Addressing these complex issues will take commitment and collaborative effort at both international and national government levels. It must also involve government agencies, private enterprise and non-governmental organisations. An atmosphere of collective goodwill will ensure that research investment is appropriate and will enable the development of policy that allows the integrated implementation of new food production systems.
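The scale of the waste figures above can be sanity-checked with a quick back-of-envelope calculation. This is only a sketch, assuming a blended system-wide loss rate of roughly 30%, the low end of the ranges quoted:

```python
# Back-of-envelope check on the "combined loss would feed about
# 3 billion people" claim, assuming (hypothetically) that roughly
# 30% of all food produced is lost or wasted across the system.

current_population_fed = 7.0   # billions, per the figures cited above
loss_fraction = 0.30           # assumed blended loss/waste rate

# If 7 billion people are fed from the ~70% of production that
# survives, total production could in principle feed:
potential_fed = current_population_fed / (1 - loss_fraction)

# People who could be fed by the lost/wasted share:
fed_by_waste = potential_fed - current_population_fed

print(round(potential_fed, 1))  # 10.0
print(round(fed_by_waste, 1))   # 3.0
```

On these assumptions the lost share would indeed feed about 3 billion people, consistent with the figure quoted in the text.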

The Future of Food – Impacts and Implications

Over the next decade and beyond maintaining global food security will become much more difficult as the population increases. We must double food production in a sustainable manner. Greater quantities of food will need to be produced with reduced inputs of water, energy and nutrients on the same or reduced area of arable land in a changing environment. To do otherwise will court significant human conflict.

The increasing urbanisation of the global community exacerbates this situation, as more and more people become isolated from the land and farming. Moreover, urban populations are more vulnerable to disruptions in the food supply chain. City dwellers need to understand where their food comes from. This will require education, which is beginning to happen with the realisation that nutrition is an important component of human health. The nutrients supplied in our food reflect agricultural practices and food processing.

The link between human health and agriculture is through food: its sources, composition and distribution. Food sources include both plants and animals, and the availability and composition of the latter is largely determined by the cost of plant-based feedstuffs. It is not surprising, therefore, that any consideration of population demographics demonstrates the importance of agricultural production as a major determinant of public health. This would appear to be a straightforward proposition, embracing the adage ‘we are what we eat’, especially in developing societies. However, the relationship between agricultural production and human health is complex in a modern, developed society, and measuring the impacts is difficult.

Our relationship with food must change. We will need to reinvent our diets to meet our nutritional requirements for optimal health and in so doing consume fewer calories and less meat. To maintain a viable food supply we must be prepared to pay realistic prices and reduce waste throughout the food supply chain. All of the required changes must be underpinned by rigorous research. This will require substantial public and private sector investment.

Visionary public policy, both national and international, must be a major instrument if our food systems are to evolve in a sustainable manner.

The Future of Data – The Global Challenge

In the last ten years we have seen an explosion in the amount of structured data we produce through our everyday activities.  All on-line activity, such as credit card payments, web searches and mobile phone calls, leaves a data exhaust, little puffs of evidence about our behaviour, what we do and how we think.  This can now be stored, shared and analyzed, transforming it from meaningless numbers into life-changing tools.

Like it or not, we live in a world where personal information can be accessed at the click of a key on a massive scale. Although there are myriad benefits (medicine, education and the allocation of resources are obvious areas), there are also significant risks. The threat of cyber warfare is a good example.   There is no turning back, so what does this mean for society going ahead? I believe that in order to maximize the benefits and minimize the risks over the next ten years we will have to fundamentally change our behaviours, our structures and our businesses.

Writing today, my real concern is that we haven’t yet got a clear understanding of the risks this new data-fuelled world brings and therefore even less about how to deal with them. That doesn’t mean we should over-react. Indeed the opposite: if we haven’t thought them through, we are more likely to over-react in some areas and under-prepare in others.  We are obviously severely under-prepared against cyber-terrorism, as we see with the recent Sony debacle.

As an example of over-reaction, look at concerns about health data, which, in the main, can be addressed through the judicious use of sandbox technologies and severe penalties for misuse. Surely it would be perverse to miss out on the enormous social benefit of sharing health data because we haven’t thought properly about how to deal with the potential risks? How do we exploit data knowledge to positive effect, and what are the key challenges going forward?

The first big issue is how to keep the opportunities equal. I believe that all levels of society should benefit from the information that data crunching can deliver. But just because the capability exists, there is no guarantee that it will be shared universally. This is an area where new inequalities could grow and existing inequalities could worsen. Data sharing, and the science of extracting value from data, is obviously much more developed in the advanced economies. It is quite possible that these skills will be used to accelerate their own national wellbeing, both commercial and social, leaving less technologically advanced societies behind. It would be wrong to assume that technology will always be a leveler. Yes, it has the potential, but the hope that it will have an equalizing effect is by no means assured.

There are obvious tensions between sharing, privacy and freedom. But we must be wary of erecting a virtual net curtain, hiding the voyeur and leaving the public vulnerable. Why shouldn’t youthful misdemeanors be left in the ether? I think they should. After all, we know that silly things sometimes happen – even to ourselves. The trick for us all is to know and acknowledge what is public, and to act accordingly. Years ago, we lived in small communities; our doors were unlocked and our neighbours knew our every move. It was considered normal. Our community is now global, but the principle remains the same. Some guidelines do need to be established if we are to maximize the social benefit of data; we must develop an agreement about what privacy really is, in reality as well as in the virtual world. This will involve thinking afresh about the relationship between citizens, governments and corporations.

Understanding data ownership will become an even bigger issue than it is today. Consumers and end users will want to own and control their personal data, but this seemingly straightforward goal grows more difficult to achieve with each passing day. There isn’t much information that we can easily say belongs to just one person. Consider two people having a chat in a café: the content belongs to both of them; the fact of their meeting belongs to all who observe it. If I have a contagious disease, we don’t consider that information my personal property. When a doctor takes your temperature, does that information belong to you, the doctor or the hospital? Data is useful to everyone, so we must get used to sharing it, particularly as more and more of our lives become digitised and new issues arise. The challenge is to develop our ethical and legal apparatus for this, establishing a set of agreed principles and a regulatory framework that can act as its basis.

History is littered with evidence of how consistently we fail to identify the next big threat. The Trojans didn’t recognize the Trojan Horse for what it was; the Allies in the First World War weren’t initially concerned about aerial warfare. Similarly, I believe we are currently under-playing the potential impact of cyber-attack. As more control systems are connected to the web, more vulnerabilities will inevitably appear.

Cyber-security, which involves protecting both data and people, is facing multiple threats; cybercrime and online industrial espionage are growing rapidly. Last year, for example, over 800 million records were lost, mainly through cyber attacks. A recent estimate by the think tank, Centre for Strategic and International Studies (CSIS), puts the annual global cost of digital crime and intellectual property theft at $445 billion—a sum roughly equivalent to the GDP of a smallish, rich European country such as Austria.

Although the attacks on Target, eBay and Sony have recently raised the risk profile in boardrooms around the world, law enforcement authorities are only now grappling with the implications of a complex online threat that knows no national boundaries. Protection against hackers remains weak, and security software is continuously behind the curve. Wider concerns have been raised by revelations of mass surveillance by the state; a growing number of countries now see cyber space as a new stage for battle, and are actively recruiting hackers as cyber warriors.  How to minimize this threat is key to all of our futures.

The Future of Data – Options and Possibilities

The way data will be optimized is changing. It is not enough to know single lines of information. Data must be connected and multi-layered to be relevant. It means knowing not one thing, or ten things, or even 100 things about consumers, but tens and hundreds of thousands of things. It is not big data but rather connected data – the confluence of big data and structured data – that matters. Furthermore, with the growth in social tools, applications and services, the data in the spider’s web of social networks will release greater value. In the UK alone, YouGov now knows 120,000 pieces of information about over 190,000 people, and this is being augmented every day. The analysis of this data allows organisations, both public and private, to shape their strategy for the years ahead.
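The idea of connected data can be made concrete with a minimal sketch: joining several separate data sources about the same people on a shared identifier, so that each record becomes multi-layered rather than a single line of information. The sources, identifiers and fields below are entirely hypothetical:

```python
# Three hypothetical data sources, each keyed by the same user id.
purchases = {"u1": ["coffee", "rail ticket"], "u2": ["books"]}
survey = {"u1": {"age_band": "25-34"}, "u2": {"age_band": "45-54"}}
social = {"u1": {"follows": ["news", "cycling"]}}

def connect(user_id):
    """Merge every source's view of one user into a single profile."""
    return {
        "id": user_id,
        "purchases": purchases.get(user_id, []),
        "survey": survey.get(user_id, {}),
        "social": social.get(user_id, {}),
    }

profiles = [connect(uid) for uid in ("u1", "u2")]
# Each profile now layers behavioural, demographic and social data;
# it is this connection across sources, not sheer volume, that gives
# "connected data" its analytical value.
```

In practice the joins span thousands of attributes and many more sources, but the principle is the same: value comes from linking the layers, not from any single one.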

We are also growing a huge data-store of over a million people’s opinions and reported behaviours. These are explicitly shared with us by our panelists to use commercially as well as for wider social benefit (indeed we pay our panelists for most of the data shared).

But many companies exploit data that has been collected without genuine permission; it’s used in ways that people do not realize, and might object to if they did. This creates risks and obstacles for optimising the value of all data.  Failure to address this will undermine public trust.  We all have the right to know what data others have and how they are using it, so effective regulation about transparency and the use of data is needed.  Europe is leading the way in this respect.

Governments, however, are the richest sources of data, accounting for the largest proportion of organised human activity (think health, transport, taxation and welfare). Although the principle that publicly funded data belongs to the public holds true – certainly in the UK – we can expect to see more companies working with, through and around governments. Having the largest coherent public-sector datasets gives Britain huge advantages in this new world.

It is clear that encouraging business innovation through open data could transform public services and policy making, increasing efficiency and effectiveness. The recent Shakespeare Review found that data has the potential to deliver a £2bn boost to the economy in the short term, with a further £6-7bn down the line[1]. However, the use of public data becomes limited when it involves private companies. To address this in the future, when companies pitch to work with governments, preference should be given to those that share an open data policy, or at least the relevant parts of one. Furthermore, where there is a clear public interest in wide access to privately generated data – such as trials of new medicines – there is a strong argument for even greater transparency.

Aside from governments (whose data provision is by no means perfect), access to large, cheap data sets is difficult. The assumption is that everything is available for crunching and that the crunching will be worth the effort. In reality, there are different chunks of big data – scientific, business and consumer – which are collected, stored and managed in multiple ways. Gaining access to relevant information, let alone crunching it, will take some doing. On top of this, much corporate and medical data is still locked away on legacy systems that will take years to unpick. Many would say the sensible thing is to adopt a policy of standardisation, particularly for the medical industry, given the growing number of patients living with complex long-term conditions. And yet competing standards abound. So in addition to regulation around transparency, over the next ten years we can expect to see agreement on standardisation in key areas.

But the potential benefits from this wealth of information are only available if the skills exist to interpret the data. Despite Google’s chief economist, Hal Varian, saying that “the sexy job of the next ten years will be statisticians”, number crunchers are in short supply (or at least not always available in the right locations at the right time). By 2018 there will be a “talent gap” of between 140,000 and 190,000 people, says the McKinsey Global Institute. The shortage of analytical and managerial talent is a pressing challenge, one that companies and policy makers must address.

Separately, it is entirely plausible that the infrastructure required for the storage and transmission of data may struggle to keep pace with the increasing amounts of data being made available. Data generation is expanding at an eye-popping pace: IBM estimates that 2.5 quintillion bytes are being created every day and that 90% of the world’s stock of data is less than two years old. A growing share of this is being kept not on desktops but in data centres such as the one in Prineville, Oregon, which houses huge warehouses containing rack after rack of computers for the likes of Facebook, Apple and Google. These buildings require significant amounts of capital investment and even more energy. Locations where electricity generation can be unreliable or where investment is limited may be unable to effectively process data and convert it to useful, actionable knowledge. Yet, it is the growing populations in these same areas – parts of Asia and Africa, for example – that will accelerate data creation, as more of its inhabitants develop online activities and exhibit all the expected desires of a newly emerging middle class.  How should this be managed?
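The scale of these figures is easier to grasp with some back-of-envelope arithmetic. Here is a short sketch, taking the IBM estimate quoted above at face value:

```python
# IBM's estimate: 2.5 quintillion (2.5e18) bytes created every day.
BYTES_PER_DAY = 2.5e18
EXABYTE = 1e18

daily_eb = BYTES_PER_DAY / EXABYTE      # exabytes created per day
yearly_eb = daily_eb * 365              # exabytes created per year

# If 90% of the world's stock of data is under two years old, the total
# stock is roughly the last two years' output divided by 0.9.
stock_eb = (yearly_eb * 2) / 0.9

print(f"~{yearly_eb:.0f} EB/year, implied total stock ~{stock_eb:.0f} EB")
```

On these assumptions the world is producing on the order of 900 exabytes a year – a useful yardstick for the data-centre and energy investment discussed above.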

[1] Shakespeare Review: An independent Review of Public Sector Information, May 2013

The Future of Data – Proposed Way Forward

Economically connected data can clearly benefit not only private commerce but also national economies and their citizens. For example, the judicious analysis of data can provide the public sector with a whole new world of performance potential. In a recent report, consultancy firm McKinsey suggested that if US healthcare were to use big data effectively, the sector could create more than $300 billion in value every year, while in the developed economies of Europe, government administrators could save more than €100 billion ($149 billion) in operational efficiency improvements alone.

It is understandable that many citizens around the world regard the collection of personal information with deep suspicion, seeing the data flood as nothing more than a state or commercial intrusion into their privacy. But there is scant evidence that these sorts of concerns are causing a fundamental change in the way data is used and stored.

That said, we must all have a care. As public understanding increases, so will concerns about privacy violation and data ownership. If it is discovered that companies are exploiting data that has been collected without genuine permission, and are using it in ways that have no societal benefit, there is a considerable risk of a public backlash that will limit opportunities for everyone. The shelf life of the don’t-know-so-don’t-ask approach to data collection will be short.

Some in the industry believe governments need to intervene to protect privacy. In Britain, for instance, the Information Commissioner’s Office is working to develop new standards to publicly certify an organisation’s compliance with data-protection laws. But critics think such proposals fall short of the mark – especially in light of revelations that America’s National Security Agency (NSA) ran a surveillance programme, PRISM, which collected information directly from the servers of big technology companies such as Microsoft, Google and Facebook.

From a marketing perspective, detailed awareness of customer habits will enable technology to discriminate in subtle ways. Some online retailers already use “predictive pricing” algorithms that charge different prices to customers based on a myriad of factors, such as where they live, or even whether they use a Mac or a PC.
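A minimal sketch of how such a “predictive pricing” rule might look. The signals, weights and surcharges here are invented purely for illustration; no real retailer’s algorithm is implied:

```python
BASE_PRICE = 100.0

def quoted_price(postcode_affluence: float, uses_mac: bool) -> float:
    """Quote a price from a 0-1 postcode affluence score and device type."""
    price = BASE_PRICE
    price *= 1.0 + 0.10 * postcode_affluence   # up to +10% in affluent areas
    if uses_mac:
        price *= 1.05                          # Mac users quoted 5% more
    return round(price, 2)

print(quoted_price(0.8, True))    # affluent postcode, Mac user
print(quoted_price(0.2, False))   # less affluent postcode, PC user
```

Even this toy version shows why safeguards matter: two customers see different prices for the same product, based on data they never knowingly supplied.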

Transport companies provide another interesting use case for connected data. Instead of simply offering peak and off-peak pricing, they can introduce a far more granular, segmented model. Customers can see the cost of catching a train, and the savings that can be made by waiting half an hour for the next one. They can also see the relative real-time costs of alternative transport to the same destination, and perhaps decide to take a bus rather than a train. They have the ability to make informed, value-based judgments on the form of travel that will best suit their requirements. Such dynamic systems will provide greater visibility of loading and so allow the use of variable pricing to nudge passengers into making alternative choices that can improve the efficiency of the overall network.  Benefits all round.  That said, although there may be innocuous reasons for price discrimination, there are currently few safeguards to ensure that the technology does not perpetuate unfair approaches.
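The granular, load-based pricing described above can be sketched in a few lines; the linear pricing curve and the predicted loadings are illustrative assumptions:

```python
def fare(base: float, predicted_load: float) -> float:
    """Fare scales linearly from 80% of base (empty) to 140% (full)."""
    return round(base * (0.8 + 0.6 * predicted_load), 2)

# Predicted loading (0-1) for the next three departures, on a £10 base fare.
departures = {"17:30": 0.95, "18:00": 0.60, "18:30": 0.30}
for time, load in departures.items():
    print(time, fare(10.0, load))
```

Showing the passenger the saving to be made by waiting half an hour is exactly the nudge the paragraph above describes.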

Open access to data is reaping its own rewards. London’s Datastore makes information available on everything from crime statistics to tube delays in order to, as its website states, “encourage the masses of technical talent that we have in London to transform rows of text and numbers into apps, websites or mobile products which people can actually find useful.” Many are taking up the challenge and delivering real social benefits. A professor at UCL, for example, has mapped how many people enter and exit Tube stations, and how this has changed over time; Transport for London has since used this information to improve the system.

The Future of Data – Impacts and Implications

Looking ahead, I believe the best approach to future-proof access to big data is to ensure there is agreement around its use, not its collection.  Governments should define a core reference dataset, designed to strategically identify and combine the data that is most effective in driving social and economic gain. This will then become the backbone of public sector information, making it possible for other organisations to discover innovative applications for information that were never considered when it was collected.

This approach has the potential for huge societal benefit. The shorter-term economic advantages of open data clearly outweigh the potential costs. A recent Deloitte analysis quantifies the direct value of public sector information in Britain at around £1.8bn, with wider social and economic benefits taking that up to around £6.8bn. Even though these estimates are undoubtedly conservative, they are quite compelling.

And yet, at the same time individuals need to be protected. There are instances where, for very good reasons, ‘open’ cannot be applied in its widest context. I therefore suggest we acknowledge a spectrum of uses and degrees of openness.

For example, with health data, access even to pseudonymous case level data should be limited to approved, legitimate parties whose use can be tracked (and against whom penalties for misuse can be applied). Access should also be limited to secure sandbox technologies that give access to researchers in a controlled way, while respecting the privacy of individuals and the confidential nature of data. Under these conditions, we can create access that spans the whole health system, more quickly and to more practitioners, than is currently the case. The result: We gain the benefits of ‘open’ but without a significant increase of risk.
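A toy gateway illustrates the “tracked access” idea: unapproved parties are refused outright, and every query by an approved party is logged so misuse can be audited and penalised. The party names, log format and API below are hypothetical:

```python
import datetime

APPROVED_PARTIES = {"ucl-research-01", "nhs-analytics"}
audit_log = []   # (timestamp, party, query) records kept for later audit

def query_sandbox(party_id: str, query: str) -> list:
    """Run a query inside the sandbox on behalf of an approved party."""
    if party_id not in APPROVED_PARTIES:
        raise PermissionError(f"{party_id} is not an approved party")
    timestamp = datetime.datetime.now(datetime.timezone.utc).isoformat()
    audit_log.append((timestamp, party_id, query))
    # ... execute against pseudonymised case-level data, return rows ...
    return []

query_sandbox("ucl-research-01", "count cases with condition X by region")
print(len(audit_log))   # every access leaves a trace
```

The sandbox itself would hold the pseudonymised data; the point is that access is gated and every use is attributable to a named party.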

Nor should we consider ‘free’ (that is, at marginal cost) to be the only condition that maximises the value of public information. There may be particular cases when greater benefits accrue to the public with an appropriate charge. Finally, as big data unquestionably increases the potential for government power to accrue unchecked, rules and regulations should be put in place to restrict data mining for national security purposes.

We will also have to look at how we focus resources within academia. The massive increase in the volume of data generated, its varied structure and the high rate at which it flows have led to the development of a new branch of science – data science. Many existing businesses will have to engage with big data to survive. But unless we improve our base of high-level skills, few will have the capacity to create new approaches and methodologies that are simply orders of magnitude better than what went before. We should invest in developing real-time, scalable machine-learning algorithms for the analysis of large data sets, to provide users with the information to understand their behaviour and make informed decisions.
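As a concrete illustration of real-time learning, here is a minimal online stochastic gradient descent learner, which updates a linear model one observation at a time rather than re-fitting on the full data set. It is a sketch of the idea, not a production algorithm:

```python
def sgd_step(weights, features, target, lr=0.1):
    """One online update: adjust weights from a single observation."""
    prediction = sum(w * x for w, x in zip(weights, features))
    error = prediction - target
    return [w - lr * error * x for w, x in zip(weights, features)]

# Learn y = 2x + 1 from a repeating stream of noiseless observations;
# the constant feature 1.0 plays the role of the intercept.
weights = [0.0, 0.0]
stream = [([1.0, i / 100.0], 2.0 * (i / 100.0) + 1.0) for i in range(100)]
for _ in range(50):                      # several passes over the stream
    for features, target in stream:
        weights = sgd_step(weights, features, target)

print(weights)  # converges towards [1.0, 2.0]
```

Because each update touches only one observation, the same loop scales to data that arrives continuously and never fits in memory – the “real-time, scalable” property the paragraph above calls for.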

We should of course strive for an increased shift in capital allocations by governments and companies to support the development of efficient energy supply and robust infrastructure. These investments can prepare us for serving continued growth in world productivity – and help offset the increasing risk of the massive, destructive disruptions in the system that will inevitably come with our growing dependency on data and data storage.

Innovation in storage capabilities should also be considered. Take legacy innovation, for example. The clever people at CERN use good old-fashioned magnetic tape to store their data, arguing that it has four advantages over hard disks for the long-term preservation of data: speed (extracting data from tape is about four times as fast as reading from a hard disk); reliability (when a tape snaps it can be spliced back together, whereas when a terabyte hard disk fails all the data is lost); energy conservation (tapes need no power to preserve the data held on them); and security (if the 50 petabytes of data in CERN’s data centre were stored on disk, a hacker could delete it all in minutes, while deleting the same amount from the organisation’s tapes would take years).
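The security point can be sanity-checked with rough arithmetic; the drive throughput and drive count below are illustrative assumptions, not CERN’s actual figures:

```python
# How long to physically overwrite 50 PB held on tape, if every byte
# must pass through a drive? (Assumed: 250 MB/s per drive, 2 drives.)
PETABYTE = 1e15
data_bytes = 50 * PETABYTE

tape_mb_s = 250        # assumed sustained throughput of one tape drive
drives = 2             # assumed number of drives working in parallel

seconds = data_bytes / (tape_mb_s * 1e6 * drives)
print(f"~{seconds / (86400 * 365):.1f} years of continuous drive time")
```

Roughly three years under these assumptions – before counting robot mount times and human handling – against minutes for issuing delete commands to networked disks.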

The key thing to remember is that numbers, even lots of numbers, simply cannot speak for themselves. To make proper sense of them we need people who understand them and their impact on the world we live in. To do this we need to massively spread academia vertically and horizontally, engaging globally at all levels, from universities to government to places of work. The current semi-fractured structure of academia is actually an advantage; it will help us ensure a plurality of ideas and approaches. Remember, we’re not just playing with numbers; we’re dealing with fundamental human behaviours. We need philosophers and artists as well as mathematicians, and we must allow them to develop a consensus collectively.

If we get it right, over the next 10 years I would expect to see individuals becoming more comfortable with living in the metaphorical glass house, allowing their personal information to be widely accessible in return for the understanding that it will enable them to enjoy a richer, more ‘attuned’ life. I would also expect to see a maturing of our individual data usage – a coming of age in how we appreciate and integrate data, with less fascination at its very existence. We will also perhaps see a new segment appear: those who elect to reduce their data noise by avoiding needless posts of photos of their lunch and such.


We will also see a structural shift in employment, markets and economies as the focus in maturing economies continues to shift away from manufacturing and production and toward a new tier of data-enabled jobs and businesses. As we demand more from our data, we will need to match it with a skilled workforce that can better exploit the information available.


After all the noise, perhaps it would be wise to remember that big data, like all research, is not a crystal ball, and statisticians are not fortune tellers. More information, and the increasing ability to analyse it, simply allows us to be less wrong. I believe we will see continued growth in world productivity, probably accelerating over the next ten years, even as the risk of massive, destructive disruptions in the system increases. There will be huge challenges and even dangers, but I am confident we will be the better for it. Every time humans have faced a major crisis, they have emerged stronger. Although we can’t be sure this will always be the case, now is the time to be bold and ambitious.

The Future of Connectivity – The Global Challenge

The telecoms industry not only faces a massive increase in data demand; it also needs to boost profitability and deliver a more personalized experience at the same time. To meet this challenge, by 2025 mobile networks will need to support up to 1,000 times more capacity, reduce latency to milliseconds, reinvent telcos for the cloud and flatten total energy consumption.

One gigabyte per day equates to a 60-fold increase – roughly a doubling of traffic per user every 18 months – compared with the average 500MB per user per month that some mobile networks in mature markets see today. This demand will be driven by hundreds of thousands of data apps sharing the same network, each with its own requirements of the network. Every user, human as well as machine, will expect the optimal experience from the network for its personalized set of applications.
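The arithmetic behind these figures can be checked directly (assuming a 30-day month):

```python
import math

# 500 MB per user per month today versus 1 GB per user per day.
today_mb_month = 500
target_mb_month = 1000 * 30          # 1 GB/day over a ~30-day month

growth = target_mb_month / today_mb_month
doublings = math.log2(growth)        # how many doublings is that growth?
years = doublings * 18 / 12          # at one doubling every 18 months

print(f"{growth:.0f}x growth = {doublings:.1f} doublings over ~{years:.1f} years")
```

Sixty-fold growth is just under six doublings, so at one doubling every 18 months the target is reached in roughly nine years – consistent with the 2025 horizon above.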

Why do we believe demand for mobile broadband will grow to these dimensions? What will it mean for operators and their networks? And even more importantly, what are the vital capabilities and technologies we need to explore and develop in the next decade to make this happen?

The Future of Connectivity – Options and Possibilities

Demand will continue to grow exponentially in the next decade: Demand for mobile broadband is closely related to the evolution of device and screen technologies, one of the fastest evolving areas in the Information and Communication Technology (ICT) industry. In 2011, the Retina display of an iPad already had nearly twice as many pixels to fill with content as a Full-HD television. New device form factors such as Google Glass, another hot topic introduced in 2012, continue to drive this evolution, and ultimately only the human eye will set the limits on the amount of digital content consumed by a mobile device. And these devices will not only consume content – ubiquitous integrated cameras with high resolution and frame rates are producing exabytes of digital content to be distributed via networks.

Enabled by these powerful new devices, the app ecosystem continues to fuel demand for mobile data by continuously inventing new categories of applications that test the limits of the network. It started with mobile web browsing in 2007; by 2012, video already accounted for more than 50% of mobile data traffic. And by 2020, people might demand mobile networks that allow them to broadcast live video feeds from their glasses to thousands of other users in real time.

Many of the apps will be cloud based or rely on content stored in the cloud. IDC estimates in their digital universe study that by 2020 30% of all digital information will be stored in the cloud – and thus be accessed through networks.

An even broader range of use cases for networks will develop as communication technologies and applications proliferate into all industries and billions of machines and objects get connected. They will go far beyond the classical examples of the smart grid or home automation. Just imagine the potential – but also the requirements – that remotely controlled unmanned vehicles would bring to mobile broadband networks.

In summary, we believe that device evolution, cloud based application innovation and proliferation of communication technologies into all industries will ensure that the exponential growth in demand for mobile broadband we have seen in the last few years will continue in the next decade.

The Future of Connectivity – Proposed Way Forward

Having understood what drives demand, we can define the requirements for future mobile networks. As stated earlier, one gigabyte of data traffic per user per day is about 60 times the average data traffic seen in mature mobile operator networks today. On top of this, the growth in mobile broadband penetration and the surge of connected objects will lead to around ten times more endpoints attached to mobile operator networks than today. To prepare for this, we need to find ways to radically push the capacity and data rates of mobile networks into new dimensions to handle this traffic.

Yet being able to deal with this traffic growth is just one aspect. An increasing number of real-time apps will test the performance of the networks. To support them with a good user experience, we need to find ways to reduce the end-to-end latency imposed by the network to milliseconds. Tactile (touch/response) and machine-to-machine interactions in particular can demand latencies as low as the single-digit-millisecond range.

To ensure mobile broadband remains affordable even while supporting the capacity and real-time requirements described previously, we also need to radically reduce the network Total Cost of Ownership (TCO) per Gigabyte of traffic. We believe one important lever to address this will be to automate all tasks of network and service operation by teaching networks to be self-aware, self-adapting and intelligent. This will help to reduce CAPEX/IMPEX for network installation as well as OPEX for network and service management. In addition to lower TCO, self-aware and intelligent networks will be able to understand their user’s needs and automatically act to deliver the best personalized experience.

To further reduce costs per GB, we need to share network resources both within a single operator network and between operators. This will include physical infrastructure, software platforms, sites, spectrum assets or even the network as a whole. We must also find ways to increase energy efficiency. In addition to their environmental impact, energy costs today account for up to 10% of an operator’s network OPEX in mature markets and up to 50% in emerging markets, and they have grown constantly in recent years.

The most powerful way of course to deal with the cost pressure will be to identify new revenue streams. Are end customers and termination fees really the sole revenue source for operators, or will technologies enable new business models that allow operators to better monetize all their assets?

Ultimately we of course need to admit that due to the fast pace of change in the industry it is simply not possible to predict all requirements future networks will face. There will be many use cases that are simply not known today. To cope with this uncertainty, flexibility must be a key requirement as well.