The Future of Government – Proposed Way Forward

Joseph Nye argues that transactional hard power skills, like organisational ability and political acumen, are just as important as transformational soft power skills, like communications, vision and emotional intelligence. The state must develop a kind of “contextual intelligence” to apply the best combination of hard and soft power skills in different situations. It bears consideration what new capabilities the state should invest in to ensure “supply” for the future, both in the ability to deliver on its promises and the ability to shape the direction in which it is moving. In retail parlance, “consumer insights” provide a key to what the “supply” should be. Likewise, for the state to undertake this type of sense-making work, it has become important to draw not only on data from economists and engineers but also on insights from sociologists and anthropologists.

As Singapore approaches fifty years of rapid progress, sense-making will also have to take into account the development of its slower-moving components – its history, culture and heritage. In August 2011, the Government launched the Singapore Memory Project, a nationwide movement that aims to capture and document precious moments and memories related to Singapore. Intangible assets such as collective memory are important in maintaining the resilience of our country as Singapore seeks to become more adept at managing its pace of change. As the state seeks to be more responsive to growing public pressure, how can it work with new or existing providers of public services to share the load? What capability gaps have arisen because of the change in the operating environment? What new capabilities should the state invest in to ensure “supply” for the future?

The rise of the network structure and the expanding influence of non-state actors also present opportunities for states to facilitate networks of responsibility and build inclusive institutions in place of traditionally more extractive ones. What results is greater experimentation and decentralisation, leading to more robust processes and outcomes. There are weak signals of this happening in Singapore. In 2013, local social enterprise SYINC launched “Under the Hood”, a collaborative, community-focused project to crowdsource innovative solutions to Singapore’s urban poverty challenges. The initiative brought together a range of organisations from the private and people sectors, and acted as a lab to prototype micro-level, local solutions that can be scaled if proven successful. The potential for greater collaboration with such initiatives creates a specific role for the state in the network: identifying successful ideas and scaling them, leveraging its resources and existing infrastructure to augment the delivery of public services.

Some argue that looking only at increasing the “supply” of the state with limited resources leads to a vicious cycle. One reason is that increasing the “supply” of the state can enlarge the set of issues that come under its purview, thereby creating its own demand. When there is unmet demand for public services, the instinct is for the state to fill the gap. However, this sometimes generates yet more demand for those services. A more sustainable solution might therefore be to find ways to reduce the “demand” on the state, which can lead to a more virtuous cycle.

The nature of trust may be different in a networked structure. Even though the quality of public services has improved, trust in governments, institutions and elites has continued to decline. There is a growing sense amongst the middle class that the “system” is rigged in a self-serving way and that it lacks the capacity to deal with emerging challenges.

Trust in a network structure depends on the long-term reciprocity of relationships: there need to be fair outcomes for stakeholders in these networks, and a perceived “fair” allocation of costs and benefits. Contribution, participation and reciprocity then lead to trust over time. In this environment, the appropriate scale of decision-making may be smaller, which can favour small states like Singapore, although it bears consideration how we might further localise decision-making to build more trust.

Efforts to invite participation from the network have to be designed with care. In 2006, the New Zealand government undertook a review of its Policing Act. At one stage, the act was opened up on a wiki for two weeks so that the public could contribute. However, the Parliamentary Counsel Office expressed concerns about the format required and about whether the public had the expertise to contribute meaningfully to drafting legislation. Furthermore, in a low-trust environment, the public may question the role of a preventative government in protecting its citizenry and the potential legality of an infallible prosecutor.

How might the state create more space for network actors to take greater responsibility?

The state often retains the reputational risk and overall accountability for outcomes.

How can the state share responsibility while maintaining the influence over outcomes?

One way the state can legitimise itself to its constituents might be to facilitate the building of relationships with the people and other sectors to co-provide solutions to problems. There are many well-studied factors that contribute to the demand for the state, for example the origins of crime, educational failure, indebtedness, family breakdown, psychological trauma and ill health – yet the demand for the state is derivative: people are actually demanding that certain services be provided, not necessarily that the state provide them. This delineation opens up many possibilities for the state to co-opt other partners, with the state retaining an important role in designing the architecture of the networks in the sector and facilitating access. In Singapore, the mytransport.sg app functions as a gateway for all things to do with transportation by aggregating available data, facilitating greater access for other non-state partners, and enabling the public to find solutions for themselves.

One of the challenges facing the state, especially in the area of public policy innovation, is how to balance equity and autonomy. A centralised system is often viewed as more equitable, at the expense of autonomy. However, as the governance system becomes more complex, hidden forms of inequity also arise in a centralised system, such as the difficulty of navigating it. Decentralised service provision at the hyper-local level can actually help to reduce this inequity. The emergence of charter schools, for example, shows how this decentralised approach has worked in practice, because the focus was on outcomes rather than process. This represents a shift in the role of the state from ensuring equity in process to ensuring equity in outcomes.

The Future of Government – Impacts and Implications

One of the roles of the state is to ensure parity in process, if not in outcomes. However, in certain areas, enforcing strict levels of compliance generates greater demand for state intervention. In Singapore, for example, the Workplace Safety and Health Act was amended in 2006 to focus on workplace safety and health systems and outcomes, rather than merely on compliance, allowing the regulation the flexibility and robustness to keep pace with technology and the changing nature of work. Setting and monitoring the outcomes of individual agencies, while useful, is insufficient. In recognition of this, the Ministry of Finance and other Ministries have worked to jointly establish whole-of-government outcomes, along with suitable indicators to track progress towards achieving them. In addition, when the state is better able to measure outcomes, greater possibilities in funding design, beyond grant funding, open up, allowing states to measure and manage their resources more effectively and increase their impact, for example through the incorporation of behavioural insights.

The operating environment for the state has changed. Networks have overtaken institutions as the dominant organisational form. The influence of non-state actors, in particular multinationals, has expanded. Issues have grown beyond jurisdictional boundaries. Technological change has outpaced society. Consequently, the role of the state has had to evolve, and to succeed in this new operating environment, the state needs both to increase the “supply” of the state and to reduce the “demand” for the state.


The Future of Food – The Global Challenge

Food is fundamental to human existence and health, yet many of the world’s inhabitants experience ongoing hunger. For some this is due to drought, for others war, and for many it is a lack of money to buy food. The United Nations Food and Agriculture Organization estimates that 850 million people worldwide are hungry, and a greater number suffer from nutrient deficiencies. Approximately one billion people have inadequate nutrient intake, while others have excessive calorie intake. Obesity has become an epidemic in developed countries, while in some developing societies the double burden of nutrient deficiency and obesity is apparent. The challenge of preventing hunger and malnutrition will become even greater as the global population grows from the current 7 billion people to nearly 10 billion by 2050.

Not only is the global population increasing, we are living longer and becoming more affluent. As incomes increase, diets become more energy-dense and meat becomes a larger proportion of the diet. These changes in population and cuisine have led to a tremendous rise in the demand for animal-source protein. The competition between livestock and humans for grains and other high quality plant foods, whether real or perceived, is recognised as a major challenge. This has become more complicated with the diversion of grain to the production of biofuel.

For many years there has been an ongoing debate about the benefit or otherwise of animal-source foods, especially red meat consumption. In the past, claims of the detrimental effect of animal-source foods on human health were made without rigorous scientific investigation. There is no doubt, however, that animal-source foods, including lean meat, poultry, eggs and milk, are an excellent source of protein and micronutrients. Fish can be added to this list, but wild fisheries are rapidly being depleted. It should not be forgotten that humans evolved as ‘meat eaters’. It is unlikely that we will lose our appetite for meat, but we must curb it. In many instances, access to livestock or poultry is the mechanism that allows impoverished families to improve their income and wellbeing.

Whatever diet we choose in the future, our food will need to be produced more efficiently. Increased agricultural productivity must come from a reduced land area and resource base. Arable land continues to be lost due to soil degradation and urbanisation. We will need to be less dependent on resources that are becoming scarce, like arable land and water, or more costly, like energy and petrochemical-based inputs, including fertilizers. Some would argue that how we manage the nexus between food, water and energy is our biggest challenge for global food security.

At the same time, the environmental impact of agriculture must not be forgotten. There is no doubt that agriculture exerts considerable pressure on water supplies, especially where irrigation is used. What form of energy will agriculture use in the future to produce, process and transport our food? The impact of agriculture on plant and animal biodiversity and other ecosystem services must also be addressed. Pollination of crops by bees is an integral component of agricultural production, and any disruption to this ecosystem service could have devastating consequences for food production.

Climate change will accentuate the challenges identified above. Pest and disease problems in plants and animals are likely to increase, partly in response to climate change. There is also consensus that the production, processing and distribution of food contribute to global climate change: a significant proportion of anthropogenic emissions of greenhouse gases comes from agriculture, and these emissions need to be reduced.

Just as the climate system is global, so is our food system. While globalisation may create opportunities and increase food distribution, the benefits flow predominantly to those with a developed and secure food supply. Government subsidies, import restrictions and food safety legislation all militate against an equitable distribution and pricing of food. In some situations this will lead to civil unrest.

The Future of Food – Options and Possibilities

In developing countries, where much of the population exists as subsistence farmers, the food system is relatively straightforward. In developed economies, by contrast, the food system or agricultural supply chain includes all aspects of crop and animal production, aquaculture, processing, storage, and distribution of food products through the wholesale and retail systems. More opportunities exist to guard against adversity and to increase productivity when the food system is complex and not reliant on a few food staples.


Food production must increase substantially, but over the next decade both systems must cope with more severe climate events (2014 was the hottest year on record) and increased globalisation as more free trade agreements are signed. The increased amount of food required will need to be produced with finite water supplies on existing areas of arable land. There is general agreement that another “Green Revolution” is required, but today’s revolution must be different if it is to overcome existing environmental, financial and societal constraints. It is no longer possible or responsible to use unlimited water and chemical inputs to increase production. Other approaches to food production and processing must be found that use existing and new technologies in conjunction with appropriate, sustainable social policies. Policies must ensure conservation of global biodiversity and animal welfare. The Commission on Sustainable Agriculture and Climate Change identified seven critical areas for the transition to a sustainable global food system:

  1. Integrate food security and sustainable agriculture into global and national policies
  2. Significantly raise the level of global investment in sustainable agriculture and food systems in the next decade
  3. Sustainably intensify agricultural production while reducing greenhouse gas emissions and other negative environmental impacts of agriculture
  4. Develop specific programs and policies to assist populations and sectors that are most vulnerable to climate changes and food insecurity
  5. Reshape food access and consumption patterns to ensure basic nutritional needs are met and to foster healthy and sustainable eating patterns worldwide
  6. Reduce loss and waste in food systems, targeting infrastructure, farming practices, processing, distribution and household habits
  7. Create comprehensive, shared, integrated information systems that encompass human and ecological dimensions

We must achieve all of these goals. Future food production must combine vastly increased productivity with good environmental practices. Meeting these goals will require the effective use of science. Biotechnology, with its evolving “omics” tools (genomics, proteomics, metabolomics), will allow the development of new approaches to counter some of the complex problems we now face. With these approaches it will be possible to fast-track the development of crop plants with agronomic traits such as yield and tolerance to environmental stress using the same or diminished inputs, together with the ability to withstand pathogen attack and potential contamination with mycotoxins. The coming generation of crop plants may have value-added outputs such as improved nutrient content and food functionality, and may serve as sources of biomass for biofuel production and human therapeutics.

Another important area that will undergo a major renaissance is microbial ecology, with the application of molecular biology techniques. While microbial ecology is not a new concept, it is pivotal to understanding the presence and functioning of microbes in complex and dynamic food environments, both outside and inside the gastrointestinal tract. As we understand more about the complex and dynamic microbial ecology of foods, we will be in a better position to manipulate the biotic and abiotic factors that enhance food quality and human health. Similar improvements will be made to animal health; indeed, it is the unique microbial ecology of ruminant livestock (cattle and sheep) that allows them to convert human-inedible plant feeds and by-products into nutritious human foods.

The other platform that should permit a major leap forward is nanotechnology. It holds promise for more precise management of resources such as water and fertilizers; improved crop and livestock production; control of pests, diseases and weeds; monitoring of plant disease and environmental stresses; and improved postharvest technology, including waste management and food safety. It will allow the application of precision agriculture in both developed and developing economies.

However, without consumer acceptance, new technologies will not succeed. Acceptance will require education and communication of the benefits that will accrue from their application, achieved against a backdrop of increased consumer interest in locally produced food and organic agriculture. These “feel-good” approaches to agriculture will not meet the food demands of the future, but their more useful aspects must be part of future food production.

The Future of Food – Proposed Way Forward

Despite daunting challenges, the application of contemporary food production and processing practices along with scientific advances combined with appropriate social policies can underpin sustainable food production systems. Clearly, the solution to the challenge of meeting future food demands lies in increased agricultural productivity everywhere, but particularly among small-holder farmers, of whom there are millions worldwide. Mixed crop and livestock production systems produce about half of the world’s food supply. Targeting these systems should be a priority for policies to sustainably intensify production by carefully managed inputs of fertilizer, water, and feed to minimize waste and environmental impact, supported by improved access to markets, new varieties, and technologies.

The global food system is extremely complex and the gap between developing and developed nations is not only in economics but also in science, governance, and public information. Thus, to tackle these issues, a number of areas must be addressed urgently:

  • Science and research: There has been a global decline in agricultural R&D over the past four decades, and there is now an urgent need to redouble the agricultural research effort. The new food-producing system has to be science-based with low resource input. To ensure this occurs, there must be definable career paths to encourage the next generation to enter agriculture and food research.
  • Economics and education: Increased economic development is required in developing countries, hand-in-hand with education. These improvements will ultimately decrease the birth rate. In many economies, women manage the food cycle, and their recognition and education should be a priority. In developed economies, education will be equally important, as consumer attitudes will strongly influence the eventual acceptance of new technologies and the adoption of different patterns of food consumption. Part of the economic equation must be to pay farmers more for their products.
  • Sustainable diet: Part of the solution to feeding the planet is the development of consumption patterns that meet requirements in a safe, nutritious and affordable manner. In developed countries this will mean learning to eat sustainably, with less reliance on meat. Through the application of the tools of molecular biotechnology, future nutrition will be personalised to account for individual variation and to improve health and well-being.
  • Waste: Postharvest losses of plant foods can be substantial in developing countries, amounting to 30 to 50% of production due to a lack of storage infrastructure. In developed countries we throw away a similar proportion of all food produced. The combined loss would feed about 3 billion people. Reducing wastage will provide breathing space to allow the development and adoption of new food production technologies.
  • Governance: Addressing these complex issues will take commitment and collaborative effort at both international and national government levels. It must also involve government agencies, private enterprise and non-governmental organizations. An atmosphere of collective goodwill will ensure that research investment is appropriate and will enable the development of policy that allows the integrated implementation of new food production systems.
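The scale of the waste problem can be sanity-checked with a back-of-envelope sketch. The figures below are illustrative assumptions, not data from this report, beyond the 30 to 50% loss range it cites: we take a current population of roughly 7 billion fed by the food that survives the supply chain, and apply the lower end of that loss range.

```python
# Back-of-envelope check of the claim that combined food losses could
# feed about 3 billion people.
# Assumptions (illustrative): the food that actually reaches consumers
# feeds ~7 billion people, and losses sit at the lower end of the cited
# 30-50% range across the whole chain.
people_fed = 7_000_000_000   # fed by the food that survives the chain
loss_fraction = 0.30         # lower end of the cited loss range

# If 30% of production is lost, the surviving 70% feeds 7 billion,
# so the lost 30% is equivalent to feeding:
people_equivalent = people_fed * loss_fraction / (1 - loss_fraction)
print(f"{people_equivalent / 1e9:.1f} billion people")  # → 3.0 billion people
```

Under these assumptions the arithmetic lands close to the 3 billion figure; with losses nearer 50% the equivalent would be far larger, which is why reducing waste buys so much breathing space.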

The Future of Food – Impacts and Implications

Over the next decade and beyond, maintaining global food security will become much more difficult as the population increases. We must double food production in a sustainable manner. Greater quantities of food will need to be produced with reduced inputs of water, energy and nutrients, on the same or a reduced area of arable land, in a changing environment. To do otherwise will court significant human conflict.

The increasing urbanisation of the global community exacerbates this situation, as more and more people become isolated from the land and farming. Moreover, urban populations are more vulnerable to disruptions in the food supply chain. City dwellers need to understand where their food comes from. This will require education, which is starting to happen with the realisation that nutrition is an important component of human health. The nutrients supplied in our food reflect agricultural practices and food processing.

The link between human health and agriculture is through food: its sources, composition and distribution. Food sources include both plants and animals, and the availability and composition of the latter are largely determined by the cost of plant-based feedstuffs. It is not surprising, therefore, that any consideration of population demographics demonstrates the importance of agricultural production as a major determinant of public health. This would appear to be a straightforward proposition, embracing the adage ‘we are what we eat’, especially in developing societies. However, the relationship between agricultural production and human health is complex in a modern, developed society, and measuring the impacts is difficult.

Our relationship with food must change. We will need to reinvent our diets to meet our nutritional requirements for optimal health and in so doing consume fewer calories and less meat. To maintain a viable food supply we must be prepared to pay realistic prices and reduce waste throughout the food supply chain. All of the required changes must be underpinned by rigorous research. This will require substantial public and private sector investment.

Visionary public policy, both national and international, must be a major instrument if our food systems are to evolve in a sustainable manner.

The Future of Data – The Global Challenge

In the last ten years we have seen an explosion in the amount of structured data we produce through our everyday activities.  All on-line activity, such as credit card payments, web searches and mobile phone calls, leaves a data exhaust, little puffs of evidence about our behaviour, what we do and how we think.  This can now be stored, shared and analyzed, transforming it from meaningless numbers into life-changing tools.

Like it or not, we live in a world where personal information can be accessed at the click of a key on a massive scale. Although there are myriad benefits (medicine, education and the allocation of resources are obvious areas), there are also significant risks. The threat of cyber warfare is a good example.   There is no turning back, so what does this mean for society going ahead? I believe that in order to maximize the benefits and minimize the risks over the next ten years we will have to fundamentally change our behaviours, our structures and our businesses.

Writing today, my real concern is that we do not yet have a clear understanding of the risks this new data-fuelled world brings, and even less understanding of how to deal with them. That doesn’t mean we should over-react. Indeed, the opposite: if we haven’t thought the risks through, we are more likely to over-react in some areas and under-prepare in others. We are obviously severely under-prepared against cyber-terrorism, as we have seen with the recent Sony debacle.

As an example of over-reaction, look at concerns about health data, which, in the main, can be addressed through the judicious use of sandbox technologies and severe penalties for misuse. Surely it is counterproductive to miss out on the enormous social benefit of sharing health data because we haven’t thought properly about how to deal with the potential risks? How do we exploit data knowledge to positive effect, and what are the key challenges going forward?

The first big issue is how to keep the opportunities equal. I believe that all levels of society should benefit from the information data crunching can deliver. But just because the capability is there, there is no guarantee that it will be shared universally. This is an area where new inequalities could grow and existing inequalities worsen. Data sharing, and the science of getting value from data, is obviously much more advanced in the advanced economies. It is quite possible that these skills will be used to accelerate their own national well-being, both commercial and social, leaving less technologically advanced societies behind. It would be wrong to assume that technology will always be a leveller. Yes, it has the potential, but the hope that it will have an equalising effect is by no means assured.

There are obvious tensions between sharing, privacy and freedom, but we must be wary of erecting a virtual net curtain, hiding the voyeur and leaving the public vulnerable. Why shouldn’t youthful misdemeanours be left in the ether? I think they should. After all, we know that silly things sometimes happen, even to ourselves. The trick for us all is to know and acknowledge what is public, and to act accordingly. Years ago, we lived in small communities; our doors were unlocked and our neighbours knew our every move. It was considered normal. Our community is now global, but the principle remains the same. Some guidelines do need to be established if we are to maximise the social benefit of data; we must develop an agreement about what privacy really is, in the real as well as the virtual world. This will involve thinking afresh about the relationship between citizens, governments and corporations.

Understanding data ownership will become an even bigger issue than it is today. Consumers and end users will want to own and control their personal data, but this seemingly straightforward goal grows more difficult to achieve with each passing day. There isn’t much information that we can easily say belongs to just one person. Consider two people having a chat in a café: the content belongs to both of them; the fact of their meeting belongs to all who observe it. If I have a contagious disease, we don’t consider that information my personal property. When a doctor takes your temperature, does that information belong to you, the doctor or the hospital? Data is useful to everyone, so we must get used to sharing, particularly as more and more of our lives become digitised and new issues arise. The challenge is to develop our ethical and legal apparatus for this, establishing a set of agreed principles and a regulatory framework that can act as the basis for how data is shared and used.

History is littered with evidence that shows how we consistently fail to identify the next big threat. The Greeks didn’t recognize the Trojan Horse; the Allies in the First World War weren’t initially concerned about aerial warfare. Similarly, I believe we are currently under-playing the potential impact of cyber-attack. As more control systems are connected to the web, more vulnerability will inevitably appear.

Cyber-security, which involves protecting both data and people, is facing multiple threats; cybercrime and online industrial espionage are growing rapidly. Last year, for example, over 800 million records were lost, mainly through cyber attacks. A recent estimate by the think tank, Centre for Strategic and International Studies (CSIS), puts the annual global cost of digital crime and intellectual property theft at $445 billion—a sum roughly equivalent to the GDP of a smallish, rich European country such as Austria.

Although the attacks on Target, eBay and Sony have recently raised the risk profile in boardrooms around the world, law enforcement authorities are only now grappling with the implications of a complex online threat that knows no national boundaries. Protection against hackers remains weak, and security software is continuously behind the curve. Wider concerns have been raised by revelations of mass surveillance by the state; a growing number of countries now see cyber space as a new stage for battle, and are actively recruiting hackers as cyber warriors.  How to minimize this threat is key to all of our futures.

The Future of Data – Options and Possibilities

The way data will be optimised is changing. It is not enough to know single lines of information. Data must be connected and multi-layered to be relevant. It means knowing not one thing, or ten things, or even 100 things about consumers, but tens and hundreds of thousands of things. It is not big data but rather connected data – the confluence of big data and structured data – that matters. Furthermore, with the growth in social tools, applications and services, the data in the spider’s web of social networks will release greater value. In the UK alone, YouGov now knows 120,000 pieces of information about over 190,000 people, and this is being augmented every day. The analysis of this data allows organisations, both public and private, to shape their strategy for the years ahead.
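As a minimal sketch of what “connected data” means in practice, separate data sources keyed on the same person can be merged into one multi-layered profile. All identifiers, fields and values below are hypothetical illustrations, not YouGov’s actual schema.

```python
# Minimal sketch of "connected data": records about the same person from
# separate sources are joined into one multi-layered profile.
# All names, fields and values here are hypothetical illustrations.
from collections import defaultdict

survey = {"p001": {"party": "undecided", "news_source": "online"}}
purchases = {"p001": {"grocery_spend_per_week": 85.0}}
social = {"p001": {"follows_politics": True}}

profiles = defaultdict(dict)
for source in (survey, purchases, social):
    for person_id, attributes in source.items():
        profiles[person_id].update(attributes)

# One connected record is far more informative than any single source.
print(profiles["p001"])
```

The value comes not from any one source but from the join: each layer on its own says little, while the combined profile supports the kind of multi-dimensional analysis described above.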

We are also growing a huge data-store of over a million people’s opinions and reported behaviours. These are explicitly shared with us by our panelists to use commercially as well as for wider social benefit (indeed we pay our panelists for most of the data shared).

But many companies exploit data that has been collected without genuine permission; it’s used in ways that people do not realize, and might object to if they did. This creates risks and obstacles for optimising the value of all data.  Failure to address this will undermine public trust.  We all have the right to know what data others have and how they are using it, so effective regulation about transparency and the use of data is needed.  Europe is leading the way in this respect.

Governments, however, are the richest sources of data, accounting for the largest proportion of organised human activity (think health, transport, taxation and welfare). Although the principle that publicly funded data belongs to the public remains true, certainly in the UK, we can expect to see more companies working with, through and around governments. Having the largest coherent public sector datasets gives Britain huge advantages in this new world.

It is clear that encouraging business innovation through open data could transform public services and policy making, increasing both efficiency and effectiveness. The recent Shakespeare Review found that data has the potential to deliver a £2bn boost to the economy in the short term, with a further £6–7bn down the line[1]. However, the use of public data becomes limited when it involves private companies. To address this, when companies pitch to work with governments in future, preference should be given to those that adopt an open data policy, or at least open the relevant parts. Furthermore, where there is a clear public interest in wide access to privately generated data – such as trials of new medicines – there is a strong argument for even greater transparency.

Aside from governments (whose data provision is by no means perfect), access to large, cheap data sets is difficult. The assumption is that everything is available for crunching and that the crunching will be worth the effort. The reality is that there are different chunks of big data – scientific, business and consumer – which are collected, stored and managed in multiple ways. Access to relevant information, let alone the crunching of it, will take some doing. On top of this, much corporate and medical data is still locked away, stuck on legacy systems that will take years to unpick. Many would say the sensible thing is to adopt a policy of standardisation, particularly for the medical industry, given the growing number of patients living with complex long-term conditions. And yet competing standards abound. So in addition to regulation around transparency, over the next ten years we can expect to see agreement on standardisation in key areas.

But the potential benefits from this wealth of information are only available if there are the skills to interpret the data. Despite Google's chief economist, Hal Varian, saying that "the sexy job of the next ten years will be statisticians", number crunchers are in short supply (or at least not always available in the right locations at the right time). By 2018 there will be a "talent gap" of between 140,000 and 190,000 people, says the McKinsey Global Institute. The shortage of analytical and managerial talent is a pressing challenge, one that companies and policy makers must address.

Separately, it is entirely plausible that the infrastructure required for the storage and transmission of data may struggle to keep pace with the increasing amounts of data being made available. Data generation is expanding at an eye-popping pace: IBM estimates that 2.5 quintillion bytes are being created every day and that 90% of the world's stock of data is less than two years old. A growing share of this is being kept not on desktops but in data centres such as the one in Prineville, Oregon, which houses huge warehouses containing rack after rack of computers for the likes of Facebook, Apple and Google. These buildings require significant amounts of capital investment and even more energy. Locations where electricity generation can be unreliable or where investment is limited may be unable to effectively process data and convert it to useful, actionable knowledge. Yet it is the growing populations in these same areas – parts of Asia and Africa, for example – that will accelerate data creation, as more of their inhabitants come online and exhibit all the expected desires of a newly emerging middle class. How should this be managed?

[1] Shakespeare Review: An independent Review of Public Sector Information, May 2013

The Future of Data – Proposed Way Forward

Economically, connected data can clearly benefit not only private commerce but also national economies and their citizens. For example, the judicious analysis of data can open up a whole new world of performance potential for the public sector. In a recent report, the consultancy McKinsey suggested that if US healthcare were to use big data effectively, the sector could create more than $300 billion in value every year, while in the developed economies of Europe, government administrators could save more than €100 billion ($149 billion) through operational efficiency improvements alone.

It is understandable that many citizens around the world regard the collection of personal information with deep suspicion, seeing the data flood as nothing more than a state or commercial intrusion into their privacy. But there is scant evidence that these sorts of concerns are causing a fundamental change in the way data is used and stored.

That said, we must all have a care. As public understanding increases, so will concerns about privacy violation and data ownership. If it is discovered that companies are exploiting data collected without genuine permission, and using it in ways that have no societal benefit, there is a considerable risk of a public backlash that will limit opportunities for everyone. The shelf life of the don't-know-so-don't-ask approach to data collection will be short.

Some in the industry believe governments need to intervene to protect privacy. In Britain, for instance, the Information Commissioner's Office is working to develop new standards to publicly certify an organisation's compliance with data-protection laws. But critics think such proposals fall short of the mark – especially in light of revelations that America's National Security Agency (NSA) ran a surveillance programme, PRISM, which collected information directly from the servers of big technology companies such as Microsoft, Google and Facebook.

From a marketing perspective, detailed awareness of customer habits will enable technology to discriminate in subtle ways. Some online retailers already use “predictive pricing” algorithms that charge different prices to customers based on a myriad of factors, such as where they live, or even whether they use a Mac or a PC.
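To make the idea concrete, here is a deliberately simplified sketch of how such price discrimination can work. Every name and multiplier below is invented for illustration; real predictive-pricing systems use far richer statistical models built on the "myriad of factors" described above.

```python
# Hypothetical sketch of "predictive pricing": nudging a base price
# using a couple of observable customer signals. All values invented.
BASE_PRICE = 100.0

def quote(customer: dict) -> float:
    price = BASE_PRICE
    if customer.get("device") == "mac":          # proxy for willingness to pay
        price *= 1.10
    if customer.get("region") == "high_income":  # location-based adjustment
        price *= 1.05
    return round(price, 2)

print(quote({"device": "mac", "region": "high_income"}))  # 115.5
print(quote({"device": "pc"}))                            # 100.0
```

The point of the sketch is how invisible the logic is to the customer: two browsers requesting the same product can receive different prices with no indication that any adjustment took place.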

Transport companies provide another interesting use case for connected data. Instead of simply offering peak and off-peak pricing, they can introduce a far more granular, segmented model. Customers can see the cost of catching a train, and the savings that can be made by waiting half an hour for the next one. They can also see the relative real-time costs of alternative transport to the same destination, and perhaps decide to take a bus rather than a train. They have the ability to make informed, value-based judgments on the form of travel that will best suit their requirements. Such dynamic systems will provide greater visibility of loading and so allow the use of variable pricing to nudge passengers into making alternative choices that can improve the efficiency of the overall network.  Benefits all round.  That said, although there may be innocuous reasons for price discrimination, there are currently few safeguards to ensure that the technology does not perpetuate unfair approaches.

Open access to data is reaping its own rewards. London's Datastore makes information available on everything from crime statistics to Tube delays to, as its website states, "encourage the masses of technical talent that we have in London to transform rows of text and numbers into apps, websites or mobile products which people can actually find useful." Many are taking up the challenge and delivering real social benefits. A professor at UCL, for example, has mapped how many people enter and exit Tube stations, and how this has changed over time. This information has now been used by Transport for London to improve the system.

The Future of Data – Impacts and Implications

Looking ahead, I believe the best approach to future-proof access to big data is to ensure there is agreement around its use, not its collection.  Governments should define a core reference dataset, designed to strategically identify and combine the data that is most effective in driving social and economic gain. This will then become the backbone of public sector information, making it possible for other organisations to discover innovative applications for information that were never considered when it was collected.

This approach has the potential for huge societal benefit. The shorter-term economic advantages of open data clearly outweigh the potential costs. A recent Deloitte analysis quantifies the direct value of public sector information in Britain at around £1.8bn, with wider social and economic benefits taking that up to around £6.8bn. Even though these estimates are undoubtedly conservative, they are quite compelling.

And yet, at the same time individuals need to be protected. There are instances where, for very good reasons, ‘open’ cannot be applied in its widest context. I therefore suggest we acknowledge a spectrum of uses and degrees of openness.

For example, with health data, access even to pseudonymous case level data should be limited to approved, legitimate parties whose use can be tracked (and against whom penalties for misuse can be applied). Access should also be limited to secure sandbox technologies that give access to researchers in a controlled way, while respecting the privacy of individuals and the confidential nature of data. Under these conditions, we can create access that spans the whole health system, more quickly and to more practitioners, than is currently the case. The result: We gain the benefits of ‘open’ but without a significant increase of risk.

Nor should we consider 'free' (that is, at marginal cost) to be the only condition that maximises the value of public information. There may be particular cases in which greater benefits accrue to the public with an appropriate charge. Finally, as big data unquestionably increases the potential for government power to accrue unchecked, rules and regulations should be put in place to restrict data mining for national security purposes.

We will also have to look to how we focus resources within academia. The massive increase in the volume of data generated, its varied structure and the high rate at which it flows have led to the development of a new branch of science – data science. Many existing businesses will have to engage with big data to survive. But unless we improve our base of high-level skills, few will have the capacity to create new approaches and methodologies that are simply orders of magnitude better than what went before. We should invest in developing real-time, scalable machine-learning algorithms for the analysis of large data sets, to provide users with the information to understand their behaviour and make informed decisions.

We should of course strive for an increased shift in capital allocations by governments and companies to support the development of efficient energy supply and robust infrastructure. These investments can prepare us for serving continued growth in world productivity – and help offset the increasing risk of the massive, destructive disruptions in the system that will inevitably come with our growing dependency on data and data storage.

Innovation in storage capabilities should also be considered. Take legacy innovation, for example. The clever people at CERN use good old-fashioned magnetic tape to store their data, arguing that it has four advantages over hard disks for the long-term preservation of data: speed (extracting data from tape is about four times as fast as reading from a hard disk); reliability (when a tape snaps, it can be spliced back together; when a terabyte hard disk fails, all the data is lost); energy conservation (tapes need no power to preserve the data held on them); and security (if the 50 petabytes of data in CERN's data centre were stored on disk, a hacker could delete it all in minutes; deleting the same amount from the organisation's tapes would take years).
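CERN's "years to delete" claim is plausible on the back of an envelope. The single-drive throughput below is an assumed LTO-era figure, not one taken from the text, so treat the result as an order-of-magnitude check rather than CERN's own arithmetic.

```python
# Rough check of the tape-deletion claim, under an assumed drive speed.
petabytes = 50
megabytes = petabytes * 1e9          # decimal units: 1 PB = 1e9 MB
tape_drive_mb_per_s = 160            # assumption: one LTO-era tape drive

years = (megabytes / tape_drive_mb_per_s) / (3600 * 24 * 365)
print(f"~{years:.0f} years of continuous streaming on a single drive")
```

Even with several drives running in parallel, wiping the archive would take years rather than minutes, which is exactly the security property CERN values.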

The key thing to remember is that numbers, even lots of numbers, simply cannot speak for themselves.  In order to make proper sense of them we need people who understand them and their impact on the world we live in.  To do this we need to massively spread academia vertically and horizontally, engaging globally at all levels, from universities to government to places of work.  The current semi-fractured structure of academia is actually an advantage; it will help us ensure plurality of ideas and approaches. Remember, we’re not just playing with numbers; we’re dealing with fundamental human behaviors. We need philosophers and artists as well as mathematicians, and we must allow them to collectively develop the consensus.

If we get it right, over the next 10 years I would expect to see individuals being more comfortable with living in the metaphorical glass house, allowing their personal information to be widely accessible in return for the understanding that it will enable them to enjoy a richer, more ‘attuned’ life. I would also expect to see a maturing of our individual data usage, a coming of age with regards to appreciating and integrating data and less of a fascination at its very existence. We will also perhaps see a new segment appearing, those who elect to reduce their data noise by avoiding needless posts of photos of their lunch and such.

 

We will also see a structural shift in employment, markets and economies as the focus in maturing economies continues to shift away from manufacturing and production and toward a new tier of data-enabled jobs and businesses. As we demand more from our data, we will need to match it with a skilled workforce that can better exploit the information available.

 

After all the noise perhaps it would be wise to remember that big data, like all research, is not a crystal ball and statisticians are not fortune tellers. More information, and the increasing ability to analyse it, simply allows us to be less wrong. I believe that we will have continued growth in world productivity, probably accelerating over the next ten years, even as the risk for massive destructive disruptions in the system increases.  There will be huge challenges and even dangers, but I am confident we will be the better for it.  Every time humans have faced a bigger crisis, they have emerged stronger. Although we can’t be sure that this will always be the case, now is the time to be bold and ambitious.

The Future of Connectivity – The Global Challenge

The telecoms industry not only faces a massive increase in data demand; it also needs to boost profitability and deliver a personalized experience at the same time. To meet this challenge, by 2025 mobile networks will need to support up to 1,000 times more capacity, reduce latency to milliseconds, reinvent telcos for the cloud and flatten total energy consumption.

A target of one gigabyte of traffic per user per day equates to a 60-fold increase – or roughly a doubling of traffic per user every 18 months – compared to the average of 500MB per user per month that some mobile networks in mature markets see today. This demand will be driven by hundreds of thousands of data apps sharing the same network, each with its own requirements of the network. Every user, human as well as machine, will expect the optimal experience from the network for its personalized set of applications.
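The growth figures quoted above can be sanity-checked with a little arithmetic; the values below are the ones from the text, with decimal units assumed for the megabyte/gigabyte conversion.

```python
import math

# Sanity check of the figures quoted above (values from the text).
current_mb_per_month = 500            # mature-market average today
target_mb_per_month = 1 * 1000 * 30   # 1 GB/day ~ 30 GB/month (decimal units)

growth_factor = target_mb_per_month / current_mb_per_month   # 60x

# If traffic doubles every 18 months, how long does a 60-fold rise take?
doublings = math.log2(growth_factor)      # ~5.9 doublings
years = doublings * 1.5                   # ~8.9 years, i.e. within a decade

print(f"{growth_factor:.0f}x growth in ~{years:.1f} years")
```

So a sustained 18-month doubling does indeed deliver roughly a 60-fold increase within the decade the text is discussing.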

Why do we believe demand for mobile broadband will grow to these dimensions? What will it mean for operators and their networks? And even more importantly, what are the vital capabilities and technologies we need to explore and develop in the next decade to make this happen?

The Future of Connectivity – Options and Possibilities

Demand will continue to grow exponentially in the next decade: demand for mobile broadband is closely related to the evolution of device and screen technologies, one of the fastest evolving areas in the Information and Communication Technology (ICT) industry. In 2011, the Retina display of an iPad already had nearly twice as many pixels to fill with content as a Full-HD television. New device form factors such as Google Glass, another hot topic introduced in 2012, continue to drive this evolution, and ultimately only the human eye will set the limits on the amount of digital content consumed by a mobile device. And these devices will not only consume content – ubiquitous integrated cameras with high resolution and frame rates are producing exabytes of digital content to be distributed via networks.

Enabled by these powerful new devices, the app ecosystem continues to fuel demand for mobile data by continuously inventing new categories of applications that test the limits of the network. It started with mobile web browsing in 2007; by 2012, video already accounted for more than 50% of mobile traffic. And by 2020, people might demand mobile networks that allow them to broadcast live video feeds from their glasses to thousands of other users in real time.

Many of the apps will be cloud based or rely on content stored in the cloud. IDC estimates in its Digital Universe study that by 2020, 30% of all digital information will be stored in the cloud – and thus be accessed through networks.

An even broader range of use cases for networks will develop as communication technologies and applications proliferate into all industries and billions of machines and objects get connected. They will go far beyond the classical examples of the smart grid or home automation. Just imagine the potential – but also the requirements – that remotely controlled unmanned vehicles would bring to mobile broadband networks.

In summary, we believe that device evolution, cloud based application innovation and proliferation of communication technologies into all industries will ensure that the exponential growth in demand for mobile broadband we have seen in the last few years will continue in the next decade.

The Future of Connectivity – Proposed Way Forward

Having understood what drives demand, we can define the requirements for future mobile networks. As stated earlier, one gigabyte of data traffic per user per day is about 60 times the average data traffic seen in mature mobile operator networks today. On top of this, growth in mobile broadband penetration and the surge of connected objects will lead to around ten times more endpoints attached to mobile operator networks than today. To prepare for this, we need to find ways to push the capacity and data rates of mobile networks into radically new dimensions.

Yet being able to deal with this traffic growth is just one aspect. An increasing number of real-time apps will test the performance of the networks. To support them with a good user experience, we need to find ways to reduce the end-to-end latency imposed by the network to milliseconds. Tactile (touch/response) and machine-to-machine interactions in particular have latency demands that can be as low as single-digit milliseconds.

To ensure mobile broadband remains affordable even while supporting the capacity and real-time requirements described previously, we also need to radically reduce the network Total Cost of Ownership (TCO) per Gigabyte of traffic. We believe one important lever to address this will be to automate all tasks of network and service operation by teaching networks to be self-aware, self-adapting and intelligent. This will help to reduce CAPEX/IMPEX for network installation as well as OPEX for network and service management. In addition to lower TCO, self-aware and intelligent networks will be able to understand their user’s needs and automatically act to deliver the best personalized experience.

To further reduce costs per gigabyte, we need to share network resources, both within a single operator network and between operators. Sharing will include physical infrastructure, software platforms, sites, spectrum assets or even the network as a whole. We must also find ways to increase energy efficiency. Beyond their environmental impact, energy costs today account for up to 10% (in mature markets) and up to 50% (in emerging markets) of an operator's network OPEX, and they have grown steadily in recent years.

The most powerful way of course to deal with the cost pressure will be to identify new revenue streams. Are end customers and termination fees really the sole revenue source for operators, or will technologies enable new business models that allow operators to better monetize all their assets?

Ultimately we of course need to admit that due to the fast pace of change in the industry it is simply not possible to predict all requirements future networks will face. There will be many use cases that are simply not known today. To cope with this uncertainty, flexibility must be a key requirement as well.

The Future of Connectivity – Impacts and Implications

More spectrum, high spectral efficiency and small cells will provide up to 1000 times more capacity in wireless access. In the world of wireless, Shannon’s law is the one fundamental rule that defines the physical limits for the amount of data that can be transferred across a single wireless link. It says that the capacity is determined by the available bandwidth and the signal to noise ratio – which in a cellular system typically is constrained by the interference.
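Shannon's formula, C = B·log2(1 + SINR), can be evaluated directly to see why bandwidth and interference are the two levers discussed below. The 20 MHz carrier and 10 dB SINR in this sketch are illustrative assumptions, not figures from the text.

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, sinr_linear: float) -> float:
    """Shannon limit for a single link: C = B * log2(1 + SINR)."""
    return bandwidth_hz * math.log2(1 + sinr_linear)

# Illustrative assumption: a 20 MHz carrier at 10 dB SINR.
sinr_linear = 10 ** (10 / 10)            # 10 dB -> linear factor of 10
capacity = shannon_capacity_bps(20e6, sinr_linear)
print(f"~{capacity / 1e6:.0f} Mbit/s upper bound for this link")
```

Doubling the bandwidth doubles the bound, whereas doubling the SINR adds only one more bit per hertz – which is why the text treats extra spectrum and interference reduction as complementary rather than interchangeable levers.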

Therefore the first lever to increase capacity will be simply to utilize more spectrum for mobile broadband. In total, the spectrum demanded for mobile broadband amounts to more than 1,100 MHz, and a large amount (about 500 MHz) of unlicensed spectrum at 2.4 GHz and 5 GHz can provide additional capacity for mobile data. Of course, reaching agreement on spectrum usage requires significant alignment efforts by the industry and is a rather time-consuming process. Therefore it is also necessary to look at complementary approaches such as the Authorized Shared Access (ASA) licensing model, which allows fast and flexible sharing of underutilized spectrum currently assigned to other spectrum holders such as broadcasters, public safety, defence or aeronautical services.

A key challenge associated with more spectrum is enabling base stations and devices to utilize this larger, potentially fragmented spectrum. Here, technologies such as intra- and inter-band Carrier Aggregation will be essential to make efficient use of fragmented spectrum.

The second lever for more capacity will be to address the interference part of Shannon's equation. This can be achieved, for example, through beamforming techniques, which concentrate the transmit power into smaller spatial regions. A combination of multiple spatial paths through Coordinated Multipoint Transmissions (CoMP) can further increase the capacities available to individual users. We believe that with the sum of these techniques the spectral efficiency of the system can be increased by up to 10 times compared to HSPA today.

Advanced technologies and more spectrum will help to grow capacity by upgrading existing macro sites for some time yet. However, a point will be reached when macro upgrades hit their limits. By 2020 we believe mobile networks will consist of 10 to 100 times more cells, forming a heterogeneous network of macro, micro, pico and femto cells. Part of this will also be non-cellular technologies such as Wi-Fi, which need to be seamlessly integrated with cellular technologies for an optimal user experience.

Although the industry today has not defined what 5G will look like and the discussions about this are just starting, we believe that flexible spectrum usage, more base stations and high spectral efficiency will be key cornerstones.

The capacity demand and multitude of deployment scenarios for heterogeneous radio access networks will make mobile backhaul key to network evolution in the next decade. The backhaul requirements of future base stations will easily exceed the practical limits of copper lines. From a pure technology perspective, therefore, fiber seems the solution of choice: it provides virtually unlimited bandwidth and can be used to connect macro cells in rural areas and some of the small cells in urban areas. However, high deployment costs will in many cases prevent dedicated fiber deployments just to connect base stations. Given the range of deployment scenarios for small cells, from outdoor lamp-post installations to indoor, we believe a wide range of wireless backhaul options will coexist, including microwave point-to-point and point-to-multipoint links and millimetre-wave backhaul technologies. For many small cell deployment scenarios (e.g. installations below rooftop level) a non-line-of-sight backhaul will be needed. The main options here are either to utilize wireless technologies in the spectrum below 7 GHz or to establish meshed topologies.

Besides pure network capacity, the user experience for many data applications depends heavily on the end-to-end network latency. For example, users expect a full web page to be loaded in less than 1000ms. As loading web pages typically involves multiple requests to multiple servers, this can translate to network latency requirements lower than 50ms. Real-time voice and video communication requires network latencies below 100ms and advanced apps like cloud gaming, tactile touch/response applications or remotely controlled vehicles can push latency requirements down to even single digit milliseconds.
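The step from a 1000 ms page-load budget to a sub-50 ms network requirement can be made explicit. The 16 sequential round trips and 200 ms of server and rendering time below are hypothetical assumptions chosen only to show how quickly the budget collapses.

```python
# Hypothetical latency budget for a 1000 ms page load.
page_budget_ms = 1000
server_and_render_ms = 200      # assumed processing and rendering time
sequential_round_trips = 16     # assumed chain of dependent requests
                                # (DNS, TCP/TLS setup, HTML, CSS, scripts...)

per_round_trip_ms = (page_budget_ms - server_and_render_ms) / sequential_round_trips
print(f"each round trip may take at most {per_round_trip_ms:.0f} ms")
```

Under these assumptions each network round trip may consume no more than 50 ms, matching the requirement stated above; pages with longer dependency chains push the figure lower still.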

The majority of mobile networks today show end-to-end latencies in the range of 200–500ms, mainly determined by slow and capacity-limited radio access networks. Therefore the high bandwidth provided by future radio access technologies, together with fast data processing and transmission, will make a major contribution to reducing network latency. Because of the amount of data being transferred, user-perceived latency can be much higher than the plain round-trip time; thinking of future ultra-high-definition (UHD) real-time video applications, this clearly motivates the need for further technology evolution.

Equally important is the real traffic load along the end-to-end path in the network. A high traffic load leads to queuing of packets, which significantly delays their delivery. When attempting to solve this, it is not efficient to just overprovision bandwidth in all network domains. Instead latency sensitive media traffic might take a different path through the network or receive preferred treatment over plain data transfers. This needs to be supported by continuously managing latency as a network quality parameter to identify and improve the bottlenecks. In return, low latency traffic could be charged at a premium, providing network operators with new monetization opportunities.

One physical constraint on latency remains: distance and the speed of light. A user in Europe accessing a server in the US will face a round-trip time of roughly 50ms due simply to the physical distance involved, no matter how fast and efficient the network is. As the speed of light is constant, the only way to improve this is to reduce the distance between devices and the content and applications they access. Many future applications, such as cloud gaming, depend on dynamically generated content that cannot be cached. Therefore the processing and storage for time-critical services also need to be moved closer to the edge of the network.
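The ~50ms floor can be reproduced from first principles. The 7,000 km path length below is an assumed Europe-to-US distance; note that in optical fibre, where light travels at roughly two-thirds of c, the bound is higher still.

```python
# Physical lower bound on round-trip time, Europe to US.
distance_km = 7000               # assumed one-way path length
c_km_per_ms = 300                # speed of light in vacuum, ~300 km/ms

round_trip_vacuum_ms = 2 * distance_km / c_km_per_ms          # ~47 ms
round_trip_fibre_ms = 2 * distance_km / (c_km_per_ms * 2 / 3) # 70 ms in fibre

print(f"vacuum: {round_trip_vacuum_ms:.0f} ms, fibre: {round_trip_fibre_ms:.0f} ms")
```

No protocol optimisation can go below these figures, which is why moving content and computation to the network edge is the only remaining lever for time-critical services.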

The introduction of additional radio access technologies, multiple cell layers and diverse backhaul options will increase complexity and carries the risk that network OPEX will rise substantially. This is why the Self-Optimizing Network (SON) is so important. SON not only increases operational efficiency but also improves the network experience through higher network quality and better coverage, capacity and reliability. Extending SON principles to a heterogeneous network environment is a challenge and an opportunity at the same time.

Fortunately, big data analytics and artificial intelligence (AI) technologies have matured in recent years, mainly driven by the need to interpret the rapidly growing amount of digital data on the Internet. Applied to communication networks, they are a great foundation for analyzing terabytes of raw network data and proposing meaningful actions. In combination with AI technologies, actionable insights can be derived even from incomplete data; for example, machine-learning techniques can find patterns in large and noisy data sets. Knowledge representation schemes provide techniques for describing and storing the network's knowledge base, and reasoning techniques use this to propose decisions even with uncertain and incomplete information. Ultimately we believe that both big data analytics and AI technologies will help to evolve SON into what we call a "Cognitive Network": one that is able to handle complex end-to-end optimization tasks autonomously and in real time.
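As a toy illustration of finding patterns in noisy network data, a simple z-score filter can flag anomalous cells from KPI samples. The KPI values and threshold below are invented; a real cognitive network would use far richer models over far larger data sets.

```python
import statistics

# Toy sketch: flag anomalous cells in noisy KPI samples via a z-score.
def anomalies(samples: dict, threshold: float = 1.5) -> list:
    """Return cells whose KPI deviates from the mean by > threshold stdevs."""
    values = list(samples.values())
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [cell for cell, v in samples.items()
            if abs(v - mean) > threshold * stdev]

# Invented latency KPIs (ms) per cell; cell_d is the obvious outlier.
kpis = {"cell_a": 48, "cell_b": 51, "cell_c": 50, "cell_d": 95, "cell_e": 49}
print(anomalies(kpis))  # ['cell_d']
```

Even this crude statistical filter surfaces the misbehaving cell without any hand-written rule for it, which is the essence of the data-driven approach described above.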

Customer Experience Management (CEM) can provide insights that will enable operators to optimize the balance of customer experience, revenues and network utilization. Cognitive Networks will help to increase the automation of CEM, enabling network performance metrics – as well as experience and business metrics – to govern the insight/action control loop. This again increases operational efficiency and at the same time will be the prerequisite for delivering a truly personalized network experience for every single user.

The big data analytics and AI technologies introduced with Cognitive Networks will be the foundation for advanced customer experience metrics. The ability to deal with arbitrary amounts of data in real time will allow much more detailed sensing of network conditions and the resulting user experience.

It also will be the foundation for large-scale correlations with other data sources such as demographics, location data, social network data, weather conditions and more. This will add a completely new dimension to user experience insights.

Cloud technologies, and the ability to provide computing and storage resources on demand, have transformed the IT industry in recent years. Virtualization of computing and storage resources has enabled the sharing of resources and thus improved their overall efficiency. Virtual cloud resources can also be scaled up and down almost instantly in response to changing demand. This flexibility has created completely new business models: instead of owning infrastructure or applications, it is possible to obtain them on demand from cloud service providers. So far this approach has mainly revolutionized IT datacentres. We believe that similar gains in efficiency and flexibility can be achieved by applying cloud technologies to telco networks. Virtualization will allow the decoupling of traditional vertically integrated network elements into hardware and software, creating network elements that consist of just applications on top of virtualized IT resources. The hardware will be standard IT hardware, hosted in datacentres and either owned by the network operator or sourced on demand from third-party cloud service providers. The network applications will run on top of these datacentres, leveraging the benefits of shared resources and flexible scaling.

User-plane network elements such as base stations will also be subject to this paradigm shift. Over time, the migration of network elements, in combination with software-defined networking, will transform today's networks into a fully software-defined infrastructure that is highly efficient and flexible at the same time.

Efficient radio technologies, high utilization and network modernization will reduce network energy consumption, another important cost factor for operators. With the forecast traffic growth in mind, reducing network energy consumption must be a major objective. The focal point for improving network energy efficiency will be the radio access, which accounts for around 80% of all mobile network energy consumption. Ultimately, the energy efficiency that can be achieved depends on the pace of network modernization: efficiency gains materialize only when new technologies are introduced into the live network. Determining the right pace for modernization requires careful balancing of CAPEX and OPEX. We believe that energy efficiency can outpace traffic growth – which makes keeping network energy consumption at least flat a challenging but achievable goal.
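Keeping energy flat while carrying the 60-fold traffic growth discussed earlier implies a matching fall in energy per bit. The compounding below simply restates that target over a ten-year horizon; the figures are the text's, not a measured efficiency roadmap.

```python
# If traffic grows 60-fold over a decade while total energy stays flat,
# energy per bit must also fall 60-fold over the same period.
traffic_growth = 60
years = 10

annual_gain = traffic_growth ** (1 / years)   # ~1.51x per year
print(f"~{(annual_gain - 1) * 100:.0f}% energy-per-bit improvement needed per year")
```

A sustained improvement of roughly 50% per year is demanding, which is why the text ties the flat-energy goal to the pace of network modernization rather than treating it as automatic.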

The Future of Connectivity – Conclusion

We believe that device evolution and application innovation will continue to drive the exponential growth in demand for personalized mobile broadband in the next decade. This demand and the associated usage profile define the key requirements for future mobile networks in terms of capacity, latency, automation, resource utilization and energy efficiency.

For each of these requirements, we have shown an essential set of technologies that needs to be explored and developed in the next decade. This technology evolution leads to our vision of a fully software defined "liquid" network architecture – a network architecture that combines the highest efficiency with flexibility and forms the foundation for delivering the best experience to every mobile broadband user.

The Future of Cities – The Global Challenge

In 1800, less than two percent of the global population lived in cities. Today one out of every two people is a city dweller, and by 2050 it is likely that over 70% of people will live in a city. The growth of mega-cities in Africa, Asia and South America, and the rebirth of post-industrial cities in Europe and North America, are creating a new wave of urbanisation. Such mass urbanisation requires a rethink of how we plan and design cities. If we want the cities of the future to be sustainable and healthy places for people to live, the city of 2025 will need to look radically different.

Cities are the engines of the global economy: just 600 urban centres generate 80% of global GDP. Today, the economic power of cities is primarily found in the developed world, with 20 percent of global GDP contributed by North American cities alone. However, this is changing and the trend is likely to accelerate. By 2025 the centre of growth will move to the emerging economies, with cities in China, India and Latin America forming the largest city economies and supplanting those in Europe and North America from the list.

Cities consume 75% of the world’s natural resources, and produce more than 60% of greenhouse gas emissions. As a result, while the economic power of cities continues to grow, they remain vulnerable to the by-products of their success.

Rapid urbanisation is placing strains on the economic, environmental and social fabrics of cities. Pressures caused by a growing population, such as traffic congestion, pollution and social tensions, as well as diseases such as cancer, obesity and depression, represent a growing challenge to policy makers.

Climate change poses a new and worrying challenge for cities. Already 50% of cities are dealing with its effects, and nearly all are at risk. Over 90% of all urban areas are coastal, putting most cities on earth at risk of flooding from rising sea levels and powerful storms.

Our cities are also home to a sizeable and increasing older population. By 2050 there will be two billion people aged over 60 worldwide, a 250% increase on today’s figures. Many of these people will live in cities. In developed countries, 80% of older people are expected to live in cities by 2050, while cities in developing countries will house a quarter of the older population.

Japan has confronted this population change earlier than many countries and faces an enormous challenge, with extra pressure on public services and appropriate housing. With more than 30% of the Japanese population aged over 60 – far higher than any other country – Japanese architects and planners have taken a major role in adapting urban environments to support healthy ageing of populations. This experience will soon be of global interest: by 2050, there will be another 64 countries where over-60s represent over 30% of the population.

This combination of environmental pressures, changing economic patterns and demographic change means that the cities of the future will need to be designed to operate differently. These challenges also present huge opportunities. With the right focus and resources, cities can become more sustainable – urban planning, design, technological and governance models could all facilitate this.

The Future of Cities – Options and Possibilities

Globally, cities have a long history of fostering social and practical innovation. New technology has enabled cities to evolve and reinvent themselves, securing a better quality of life for their inhabitants in the face of huge social, environmental and technological upheaval.

An understanding of an area's demographics, problems, capabilities and environmental constraints could play a key role in informing the design and planning of cities, enabling as many people as possible to achieve a fulfilling, social and active life.

The planning of cities has already been transformed and can go much further with the right resources in place. Pen and paper have long been supplanted in most cities by a wide range of electronic data devices, geographic information systems, satellite mapping and visualisation software. These offer urban planners and designers deeper insight into human behaviour, as well as a greater understanding of the physical attributes of sites, to inform design and how it is delivered. As the technology becomes more sophisticated, these new approaches can combine to create place-based design that, for example, addresses the health and environmental impacts of cities: integrating routes that make residents more likely to walk and cycle, improving public transport, and making denser development and more compact spaces more appealing to potential residents.

New approaches are also enabling architects and planners to better understand how cities impact their environments. Increasing the use of natural features helps reduce flooding by improving sustainable drainage, and prevents cities from overheating. Incorporating green infrastructure also helps to support mental wellbeing, thereby yielding savings in future health budgets.

Technology can help plan growth in a more integrated way – addressing societal, environmental and design issues across a range of locations. Interesting examples can be seen in cities such as Rio de Janeiro, which is pioneering new digital transport and governance systems through a citywide operations centre that connects all the city's 30 agencies, from transport to the emergency services. On a day-to-day basis, it helps officials from across the city collaborate on running public services more smoothly and efficiently. In the event of a crisis, such as a collapsing building, the operations centre helps roll out a coordinated response. Transport systems can be shut down, emergency services mobilised and gas supplies cut off, while citizens can be informed of alternative routes via Twitter.

Cities are also starting to use digital platforms to plan ahead and encourage public engagement in the future of their cities. In Asia a number of emerging cities are working with partners to develop models for sustainable growth that learn from the current generation of cities.

Developing these models further will be crucial to generating popular support if the city of 2025 is to benefit from new approaches. In the UK, RIBA has explored the idea of a digitised planning system, using new technology and big data to support strategic planning of a city and help improve public engagement with the process. Public consultation software, online forums and social media are now increasingly used to capture public opinion, test ideas, evolve proposals and disseminate information.

New approaches can also inform the way we design for an ageing population. Urban design can help older people live healthier and more socially active lives by creating more inclusive spaces. Their wellbeing can be enhanced by designing affordable, accessible housing that connects directly with local amenities.

Understanding this group's needs will become increasingly important at the city scale, helping local authorities develop innovative housing that brings out the best in older people while also benefiting younger age groups. Designing inclusively for all generations is the way to create successful integrated communities.

The Future of Cities – Proposed Way Forward

The planners and architects of tomorrow will have a range of tools available to them that their predecessors could scarcely have dreamed of. Predicting which of these developments will be truly transformative is impossible, and the answer will vary significantly from city to city. But exploring the potential implications and applications of a range of technologies will highlight the possibilities ahead of us – leaving us better prepared and better placed to control the fate of cities.

In order to do so successfully, it will be crucial to retain a focus on utilising technology as a means to anticipate and manage change within urban areas, creating and maintaining good-quality, sustainable environments.

We will need measures at the national level to help enable new technology to play a role across boundaries. Globally, a strong cultural shift will be required – moving away from the model of business as usual to an approach that enables the economy to thrive within resource constraints.

2015 will be crucial to the development trajectory of cities in 2025. In September, the United Nations is expected to agree a new set of Sustainable Development Goals defining international development objectives; one objective expected to be included is making cities more sustainable. In December, the Paris summit will attempt to finalise a new climate change agreement. Although these two global agreements will be crucial in ensuring future prosperity for cities, national, regional and local governments should nevertheless seek to develop smart city solutions to ensure cities can be future-proofed effectively.

There will be no one-size-fits-all or quick solutions to the complex interests and failings accumulated over centuries of development. Local governments will therefore be crucial in creating ambitious, proactive, area-specific policies and programmes that integrate climate change, public health and ageing-population priorities into planning and development, to achieve a long-term approach.

The Future of Cities – Impacts and Implications

If a strong global commitment to sustainable development and tackling climate change is set in 2015, and bold leadership and new technology are fostered by national and local governments, cities could start to look completely different by 2025. They can become cleaner, greener, healthier and more pleasant places to live, while still driving economic growth and fostering innovation.

This comes with a significant caveat. Creating new places proactively, and with future changes in mind, will require a culture shift among those who plan, build and design our cities.

In planning, more multi-disciplinary thinking will need to be applied to urban development strategies and design, to ensure a variety of changes can be accounted for and addressed. Greater participation from the public will be required to gain a deeper insight into their needs and preferences. In an era where the public voice is becoming easier to access and harder to suppress, it will become increasingly hard to generate support for new initiatives without taking public views into account. The era when planners, architects and builders could create new cities from a blank canvas without heed to the social or environmental impacts is over. City leaders, planners and designers will need to incorporate continuous feedback loops that provide information about a range of social, economic and environmental changes into their thinking to maintain public and political support.

Modelling and testing various approaches will be important to arrive at the optimal design or policy intervention. This will not only require new technologies to aid this process, but also a willingness among local and central governments to adopt longer-term development approaches, and to increase public participation in design and planning processes.

In construction, this will necessitate a shift to a circular economy that is restorative, both naturally (e.g. one that replenishes fresh drinking water) and technically (e.g. building materials can be reused without polluting the environment). Buildings would also have to be built to anticipate future change, rather than using design standards based on existing conditions. History has taught us that the cities which fail to react to the changing world face decline. With the tools at their disposal today, cities have never been better equipped to rise to the challenge. Their success in 2025 and beyond will be determined by how well they do so.

The Future of Ageing – The Global Challenge

Advances in science and technology, coupled with large-scale changes in health practices involving improved sanitation, water purification and a host of lifestyle changes, have led to dramatic increases in longevity in developed nations around the world. On a global scale, life expectancies in developed regions are continuing to rise in the 21st century and, although most people assume that there are biological limits on life span, so far there is little evidence that we are approaching them.[1] Because fertility declined across the same years that life expectancies increased, the distribution of age in the global population has changed irrevocably. The once-universal pyramid shapes of age distributions in the western world, with many young people at the bottom narrowing to tiny peaks at the top, are being rectangularized, reflecting the fact that most people, not just an exceptional few, are living into old age.

To the extent that the importance of ageing societies is recognized at all, anxiety is the typical response. Terms like "grey tsunami" imply that larger numbers of older citizens will become a drain on societies. Concern is warranted: the demographic changes underway are fundamentally altering virtually all aspects of life as we know it. Workforces are becoming older and more age-diversified than ever in history. Families are having fewer children, yet four and five generations are alive at the same time. Education has come to predict well-being and even length of life, yet it is unevenly distributed, heightening disparities across socioeconomic strata and widening the gap in old-age outcomes between rich and poor.

[1] Oeppen, J. and J. W. Vaupel: Broken limits to life expectancy. Science 296, 1029-1031 (2002).

The Future of Ageing – Options and Possibilities

To date, however, the concern has been largely misplaced, with the emphasis on ageing itself rather than on the cultures that surround very long lives. By culture, we mean the crucible that holds science, technology, large-scale behavioral practices and social norms. We maintain that the more serious problems concern antiquated social norms and the lack of cultural supports for people 50 and older: medical treatments for the common diseases of old age, technologies and services that allow people to age in place, and social norms that encourage life-long participation in communities, families and workplaces. The culture that guides people through life today evolved around shorter lives. The urgent challenge now is to create cultures that support people through ten or more decades of life.

Although predictions about the future are always perilous, we can comfortably predict that life will change, and can change such that longer lives improve quality of life at all ages. Unfortunately, to date we have been decidedly uncreative about ways to use added years of life: these years have been tacitly tacked on to the end of life, with old age the only stage that has grown longer. Rather than moving forward by happenstance, we need strategic thinking about how best to use added decades of life. Helping individuals and nations visualize, plan and prepare is essential to ensure that longer lives are high-quality lives.

Changing the nature, timing and duration of work will be key. Individuals and societies must effectively finance very long lives, and so far we are doing a poor job. Life expectancy at age 65 for the world's population increased by roughly fifty percent from the 1950s to the present, while the average age of retirement has remained relatively constant.[1] Between now and 2030, the number of people in developed countries over the "conventional" retirement age of 65 will increase by more than thirty percent. At the same time, the size of the conventional working-age population in developed countries is projected to decline by four percent. If nothing changes, the ratio of the working-age population to retirees will therefore steadily decrease for the foreseeable future. Of course, these projections are based on the assumption that people continue to retire at relatively young ages. One obvious, although surprisingly ignored, way to address the challenges posed by a declining working-age population is to expand the workforce by increasing the participation of older workers and, in some countries, women.
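The implied shift in the support ratio can be checked with back-of-envelope arithmetic. Only the +30% and -4% changes come from the text above; the baseline age split per 100 people is an illustrative assumption.

```python
# Workers per retiree in developed countries, 2015 -> 2030 (per 100 people).
# The -4% and +30% changes are from the text; the 60/18 baseline split
# (working-age / over-65) is an illustrative assumption.

working_age_2015 = 60.0
retirees_2015 = 18.0

working_age_2030 = working_age_2015 * 0.96   # working-age population -4%
retirees_2030 = retirees_2015 * 1.30         # over-65 population +30%

ratio_2015 = working_age_2015 / retirees_2015
ratio_2030 = working_age_2030 / retirees_2030
print(round(ratio_2015, 2), round(ratio_2030, 2))
```

Under these assumptions the support ratio falls from roughly 3.3 to roughly 2.5 workers per retiree in just fifteen years, which is why raising the participation of older workers has such leverage.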

Increasingly, research findings suggest that this is feasible. A substantial majority of people 60 to 70 years of age report that they are physically able to work. A 2014 paper published in the Journal of Gerontology found that 85% of Americans aged 65-69 report no health-based limitations on paid work or housework.[2] Similar trends are evident in Europe.[3] To be sure, the number of disabled individuals has increased, and will continue to increase, in ageing societies, and it is extremely important to have policies that support people who cannot work. We maintain that the generosity of disability insurance should increase; at the same time, we must recognize that chronological age is a poor predictor of the ability to work. Even at very advanced ages, substantial numbers of people are sufficiently healthy to contribute to workplaces. Societies that find ways to tap older people's contributions will benefit greatly.

Although the idea of longer working lives often meets resistance, evidence for the benefits of work to individuals is growing. Arguably, the most obvious reason to work longer is financial. For many, retirement at age 65 is economically infeasible. In the words of Stanford economist John Shoven, "the reality is that few workers can fund a 30 year retirement with a 40 year career".[4] Neither can societies. In recent years it has become clear that remaining active and engaged in work is also associated with physical, socioemotional and cognitive benefits. Studies of healthy aging suggest that older adults who are engaged have lower mortality rates, are less likely to experience various physical and mental illnesses, and are more likely to have a strong sense of identity and well-being.[5] Working longer also has protective effects against cognitive decline,[6] ostensibly by providing a mentally engaging environment where workers can "use it" so they don't "lose it." Research suggests that both paid and unpaid work are associated with enhanced well-being, delayed disability, decreased mortality risk, and fewer diseases and associated functional impairments.[7],[8],[9],[10] New models of working longer can relieve some of the pressure to save large sums of money for extended periods of leisure. Importantly, working longer can mean working differently. Many workers would happily exchange decades-long retirements in old age for four-day work weeks, regular time off for sabbaticals and retraining, and part-time work when children are young as well as at advanced ages, as people ease into retirement.

[1] "Population Facts", United Nations Department of Economic and Social Affairs, Population Division, December 2013

[2] Lowsky, Olshansky, Bhattacharya, Goldman, “Heterogeneity in Healthy Aging”, J Gerontol A Biol Sci Med Sci first published online November 17, 2013 doi:10.1093/gerona/glt162

[3] A. Börsch-Supan, Myths, Scientific Evidence and Economic Policy in an Aging World, J. Econ. Ageing, 1–2 (2013), pp. 3–15

[4] Ford, John Patrick. 2014. “How to support a 30-year retirement.” San Diego Source. http://www.sddt.com/Commentary/article.cfm?SourceCode=20141106tza&Commentary_ID=12&_t=How+to+support+a+30year+retirement#.VL6ykS6AyjI

[5] Rowe, John W. and Robert L. Kahn. 1998. Successful Aging. New York: Pantheon; Cohen, Sheldon. 2004. “Social Relationship and Health.” American Psychologist 59:676-684.

[6] Rohwedder, Susann and Robert J. Willis. 2010. “Mental Retirement.” Journal of Economic Perspectives. 24:119-138

[7] Rohwedder, Susann, and Robert J. Willis. 2010. “Mental Retirement.” Journal of Economic Perspectives, 24(1): 119-38.

[8] Carr DC, Komp K, editors. “Gerontology in the era of the third age: implications and next steps.” New York: Springer Publishers; 2011: 207-224

[9] Morrow-Howell N, Hinterlong J, Rozario PA, Tang F. "Effects of volunteering on the well-being of older adults." J Gerontol B Psychol Sci Soc Sci. 2003; 58B:S137–S145. doi: 10.1093/geronb/58.3.S137

[10] Matz-Costa C, Besen E, James JB, Pitt-Catsouphes M. "Differential impact of multiple levels of productive activity engagement on psychological well-being in middle and later life." The Gerontologist. 2012. doi: 10.1093/geront/gns148

The Future of Ageing – Proposed Way Forward

From a societal perspective, there is a pressing need to make use of the human capital represented in older people. General knowledge and expertise increase with age, as do emotional stability and the motivation to invest in important activities. If appropriately utilized, older populations can benefit national and global economies. Yet the clarion call to workers today is to save for increasingly long retirements, rather than to actively plan to work longer. In the US, the responsibility for saving has shifted to the individual, reflecting the move from defined benefit plans to defined contribution plans. Unfortunately, the change has resulted in considerable undersaving for retirement. In the 2014 Retirement Confidence Survey, only 64% of workers age 25 and older reported that they and their spouse had saved at all for retirement, a decrease from 75% in 2009.[1] Overall, 60% of workers report that they have less than $25,000 in total savings and investments, with over one third reporting less than $1,000.[2] If appropriate steps are not taken, the economic implications for both individuals and societies could be catastrophic, with low retirement savings placing major strains on economies.


For those who have inadequate retirement savings, the most obvious solution is to work longer. This approach may hold benefits that extend beyond income to better physical health and cognitive functioning. One major potential barrier, however, is that employers remain ambivalent about older workers. Currently, most employers view older workers as expensive and sometimes less productive than younger workers. Research findings increasingly suggest that the latter view reflects stereotypes more than evidence. The productivity of workers tends to increase with age. This is especially true for knowledge workers, yet blue-collar workers can also retain (and perhaps increase) productivity.[3] One study that measured the performance of more than 400 McDonald's restaurants across the UK found that restaurants employing mixed-age workforces, including workers age 60 and above, delivered an average increase of twenty percent in customer satisfaction compared to less age-diverse workforces.[4] Moreover, intergenerational teams bring a net benefit to workplace productivity, including a broader range of skills and experience across the workforce, increased mentorship opportunities and skills transfer, reduced turnover, and higher staff morale.[5],[6] Companies that adapt to older workers' needs using cost-efficient measures such as flexible work arrangements, workplace modifications and on-the-job training can benefit from age diversity in the workforce.[7] BMW's older-worker production line at Dingolfing is an example of how thoughtful design of blue-collar workplaces can support high levels of productivity among older workers. The company collaborated with its older production workers to tailor one of its most labor-intensive manufacturing lines to an average worker age of 47.
The resultant line reached its production goals, with defect rates and worker absenteeism meeting or exceeding the levels achieved by "younger" lines.[8] The cost of older workers is a real issue for employers. By leveraging older workers as a source of human capital, employers can better manage their talent, facilitate knowledge transfer to younger workers, and help older workers phase gradually into retirement. Offering bridge jobs or flexible work arrangements such as flex hours and part-time work will allow employers to retain the expertise of older workers while reducing costs.[9],[10]

[1] Employee Benefit Research Institute. 2014. “2014 Retirement Confidence Survey.” http://www.ebri.org/pdf/surveys/rcs/2014/RCS14.FS-6.Prep-Ret.Final.pdf

[2] Ibid.

[3] Burtless, G. (2013). The Impact of Population Aging and Delayed Retirement on Workforce Productivity. Tech. rep., Center for Retirement Research at Boston College.

[4] Department for Work and Pensions (UK). 2011. “Employing Older Worker Case Studies.” https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/142752/employing-older-workers-case-studies.pdf

[5] Ilmakunnas et al. “Diversity at the workplace: Whom does it benefit?” Helsinki School of Economics. http://www.eale.nl/conference2009/Programme/PapersC/add102508_wKXraqYSnk.pdf

[6] Department for Work and Pensions (UK). 2013. "Employing an Older Workforce, An Employer's Guide to Today's Multi-generational Workforce." https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/142751/emplying-older-workers.pdf

[7] Brooks. 2014. "Productivity and Age." Age UK. http://www.50plusworks.com/downloads/Age%20and%20productivity%20briefing%20(March%202014).pdf

[8] Loch, C.H, Sting, F.J, Bauer, N, & Mauermann, H. (2010). How BMW Is Defusing the Demographic Time Bomb. Harvard Business Review, 88(3), 99–102. Retrieved from http://hdl.handle.net/1765/20802

[9] CIPD. 2012. “Managing a Healthy Ageing Workforce, A National Business Imperative.” http://www.cipd.co.uk/binaries/managing-a-healthy-ageing-workforce-a-national-business-imperative_2012.pdf

[10] Backes-Gellner & Veen. 2009. "The Impact of Aging and Age Diversity on Company Performance." Institute for Strategy and Business Economics, University of Zurich. http://www.zora.uzh.ch/48541/1/Backes_Gellner_The_impact_of_aging_and_age_diversity_on_company_performance-V.pdf

The Future of Ageing – Impacts and Implications

Dire predictions that a "grey tsunami" will overwhelm economies with unproductive societies hearken back to Thomas Malthus' 1798 "Essay on the Principle of Population". There, Malthus predicted that growing populations would outrun the food supply, leading to poverty and starvation. The prediction failed to foresee the development of agricultural technologies that greatly increased food production. In the case of older populations, predictions of economic disaster give way to discussions of economic growth if people remain productive into advanced ages. Rather than a problem, we may be facing one of the greatest opportunities in history to dramatically improve quality of life at all ages.

Don is EVP with The Futures Company in New York, where he leads Trends and Futures Consulting. In his time with TFC, and previously at Social Technologies and the Ernst & Young Centre for Business Innovation, Don has supported many organisations in the areas of innovation and future strategy.

For more details about Don please see his LinkedIn profile.

James is a Partner at The Foundation, Chair of GEN Community and a trustee of GreenThing and the RSPB. He was previously co-founder and UK CEO of Zopa, the world's first peer-to-peer lending company. Prior to this he was Strategy Director at Egg and a consultant at LEK.

For more details about James please see his LinkedIn profile.

Nicky co-founded Best Foot Forward, a sustainability consultancy that pioneered the development of carbon and ecological footprinting. She is now Strategy Advisor at Anthesis Consulting Group and helps organisations respond to, and benefit from, the threats and opportunities of sustainability.

For more details about Nicky please see her LinkedIn profile.

Anthesis is a specialist global environmental and social sustainability consultancy founded on the belief that commercial success and sustainability go hand in hand. We bring together pioneering "thinkers and doers" from across continents to offer a unique combination of commercial relevance, technical depth and global reach. Our approach is based on commercial relevance, technical depth and experience, and a specialist and global service. Wherever you are on your journey towards sustainability, we provide passionate, creative and committed teams who proudly offer exceptional client care.
Anthesis is hosting the UK Future of Resources discussion

Ashoka is the largest network of social entrepreneurs worldwide, with nearly 3,000 Ashoka Fellows in 70 countries putting their system-changing ideas into practice on a global scale. Founded by Bill Drayton in 1980, Ashoka has provided start-up financing, professional support services, connections to a global network across the business and social sectors, and a platform for people dedicated to changing the world. Ashoka launched the field of social entrepreneurship and has activated multi-sector partners across the world who increasingly look to entrepreneurial talent and new ideas to solve social problems.
Ashoka is hosting an India Future of Health discussion


Arup is an independent firm of designers, planners, engineers, consultants and technical specialists offering a broad range of professional services. Founded in 1946 with an initial focus on structural engineering, Arup first came to the world’s attention with the structural design of the Sydney Opera House, followed by its work on the Centre Pompidou in Paris. Arup has since grown into a truly multidisciplinary organisation. Arup’s Foresight team identifies and monitors the trends and issues likely to have a significant impact upon the built environment and society at large. Arup is hosting the global Future of Cities discussions. http://www.arup.com

Biomin, a leader in animal nutrition and health, develops and produces feed additives, premixes and services for the improvement of animal performance in an economically viable way. Its products cover solutions for mycotoxin risk management, holistic approaches towards promoting growth naturally, as well as specific solutions that address dietary requirements for swine, poultry, dairy and beef cattle as well as aquaculture. Biomin is part of the Erber group of companies, which focuses on improving the efficacy, quality and safety of food and feed products in a healthy and sustainable way.

Biomin is hosting the Future of Food discussions in Singapore and Austria.

http://www.biomin.net 
