The Future of Water – Options and Possibilities

The UN has sagely noted that “water is the primary medium through which climate change impacts will be felt by humans, society and the environment”, and accordingly climate change will necessitate improvements in water resilience systems in cities across the globe. Increasingly, cities will have to focus on local water sourcing, reuse and recycling in order to sustain their ever-expanding populations. There are multiple ways in which efficiency can be improved, not least through significant investment in green infrastructure, the adoption of smart technology and widespread public education, which will help manage water demand by building a broader understanding of the water cycle. Water is fundamental to life; we need to remind ourselves of this constantly and take action.

Many countries are currently working to maintain and improve the quality of their water sources. About 96% of the earth’s total water supply is found in the oceans, and there is broad agreement that extensive use of desalination will be required to meet the needs of a growing world population. Worldwide, desalination plants are producing over 323 million cubic metres of fresh water per day, although energy costs are currently the principal barrier to greater use. The city-state of Singapore has innovative water technology and aims, despite its size and population density, to become fully self-sufficient by 2061. Plans include tripling its desalinated water supply by 2030, large-scale collection of rainwater, and water recycling which, alongside standard treatment, uses microfiltration, reverse osmosis and UV treatment to deliver potable water to its citizens. In short, Singapore is converting its city into a catchment and focusing on source diversity.

Elsewhere, efficiencies will be improved by the use of intelligent robots, which will play a greater role in the inspection of infrastructure. New materials such as graphene that are lighter, stronger, smarter and greener will also become more popular, replacing traditional materials such as stainless steel in pipework.

Growing concern for the environment and for public health means that water companies will be held to greater account for their environmental impact and water quality. A stronger emphasis on green infrastructure will support a trend for companies to move from providing base utilities to creating a system of amenities that support the water cycle. An example can be found at the Illinois Institute of Technology (IIT), where rain gardens have been repurposed as communal meeting spaces, through-ways turned into permeable walkways, and three acres of new native plant communities with underground cisterns collect rainwater for future non-potable reuse. Once all the changes are implemented, IIT predicts a 70–80% reduction in run-off into Chicago’s sewer system, while the collected non-potable water becomes available for irrigation. Expect this repurposing of public spaces for multi-functionality, serving both amenity and wider sustainability purposes, to be widely adopted.

Alongside making improvements to the infrastructure, there is a pressing need to do more with less water. Smart technology and big data will help. However, changing public behavior is a huge challenge. Although there is widespread understanding that rising consumption of raw materials is both intensifying resource scarcity and increasing competition, most people, certainly in the developed world, live materialistic lifestyles that result in high levels of waste. In Australia, for example, around 20 million tonnes of waste is thrown away each year, at a value of AUD10.5bn. Digital lifestyles can increasingly link consumer behavior to consumption, and growing connectivity through the Internet of Things will make it possible to monitor the consumption and cost of water in real time, allowing consumers to understand their impacts and take action.
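The real-time monitoring loop described above can be sketched in a few lines. This is an illustrative model only; the `SmartMeter` class, the pulse volumes and the flat per-litre tariff are assumptions for the sketch, not any vendor's API:

```python
from dataclasses import dataclass

# Illustrative flat tariff; real utilities typically use tiered or
# time-of-use pricing, so treat this figure as an assumption.
PRICE_PER_LITRE_AUD = 0.002

@dataclass
class SmartMeter:
    """Minimal model of an IoT water meter reporting usage and cost in real time."""
    litres_used: float = 0.0

    def record_pulse(self, litres: float) -> None:
        # Each pulse from the meter hardware reports a small volume of flow.
        self.litres_used += litres

    @property
    def running_cost(self) -> float:
        return self.litres_used * PRICE_PER_LITRE_AUD

meter = SmartMeter()
for litres in (150.0, 40.0, 9.5):   # e.g. shower, dishwasher, kettle
    meter.record_pulse(litres)
print(f"{meter.litres_used:.1f} L used, AUD {meter.running_cost:.2f} so far")
```

The point of such a device is the feedback loop: surfacing the running cost while water is being used, rather than in a quarterly bill, is what lets consumers connect behavior to consumption.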

Data analytics can help build understanding of how to use the water cycle to respond to the challenges of climate change. It can also lead to increased scrutiny of water utilities and a better understanding of cost, allowing companies to integrate the true cost of water into their decision-making. In addition, the availability of data provides an opportunity to educate customers about consumption. Publicity campaigns and a growing sense of urgency will nudge consumers to reduce consumption, and should be used in partnership with economic levers that recognize the true value of water.

Growing populations and changes in diet mean that we need to produce more food, and water is a fundamental part of this process. In Australia, for example, the agricultural sector accounts for around 65% of total water consumption. This could be greatly reduced if we could change consumer behavior: it is estimated that Australians throw away AUD5.3bn of food waste every year, and every tonne of wasted food also represents wasted water. There is a real need to change this approach, and developments in this sector will continue to have tangible knock-on effects for the water supply industry and the natural environment from which this water is sourced.

Science will also have a key role in reducing the amount of water we use. Nanotechnology and biotechnology are potential game-changers for the water industry, enabling breakthrough products and technologies to tackle pressing global challenges such as reducing environmental footprints, using less and cleaner energy, and decreasing water usage and waste generation. For example, microorganisms are now being used to treat water that has been contaminated by hazardous materials. The global market for nanostructured products used in water treatment was worth an estimated USD1.4bn in 2010 and is expected to rise to USD2.1bn in 2015.[1] Initial success in this area has also raised the possibility of the utility as a self-healing ecosystem.

Greater efficiency is the driving force for manufacturing companies, where energy and water can account for as much as 50% of total manufacturing cost. In the future, expect more green manufacturing and increased co-operation as companies forge alliances across traditional boundaries, for example to share common costs. In the water industry this will manifest itself in knowledge sharing and contributions to joint research and development across catchment boundaries. By using resources more efficiently, countries could also become more active trading partners, allowing for a more equal redistribution of water amongst users. This could include a water-balance concept similar to carbon emissions reduction strategies, where water saved in one country offsets additional water use in another.

Looking ahead, users are likely to have to pay the real cost of infrastructure. One short-term option is the financial recycling of assets and capital, where old assets are sold or leased to fund the new. In the longer term, however, we will have to pay the true value of key resources. This shift could also lead to greater application of the circular economy, which will help stretch resources through end-of-life recycling and reuse. Greater awareness will lead to increased scrutiny of water utilities and the pricing of services, as the widespread availability of data provides the opportunity to educate customers about consumption and managing resource use. Looking through an international lens, water trading would allow for the efficient redistribution of water amongst users, so countries could become active trading partners. As the amount of water used in agriculture in arid regions is two to three times higher than in rain-fed regions, water trading could help save water on a global scale.

Once efficiencies and improvements are made, consideration should be given to the most cost-effective way to provide access to basic services. The fixed nature of water supply infrastructure, and its history as an essential government-supplied service, gives rise to natural monopolies within supply areas. Governments need to ensure that pricing policy appropriately balances the essential need for water, the impacts on consumers (particularly those on lower incomes) and the requirement for suppliers to remain financially viable. To do this there should be better integration between urban water planning and urban development planning, including consideration of limits on green-field development.

Recognizing innovation opportunities for the future, more and more companies are tapping into the public’s intellectual capital by crowdsourcing product ideas and solutions. In exchange, they are giving creative consumers a direct say in what gets developed, designed or manufactured. Crowdfunding had added around 270,000 jobs and injected more than USD65bn into the global economy by the end of 2014, with an expected industry growth of 92%.

[1] Nanotechnology Now. “Nanotechnology in Water Treatment”, 2012. Available from: http://www.nanotech-now.com/news.cgi?story_id=45894

The Future of Connectivity – Proposed Way Forward

Having understood what drives demand, we can define the requirements for future mobile networks. As stated earlier, one gigabyte of data traffic per user per day is about 60 times the average data traffic seen in mature mobile operator networks today. On top of this, the growth in mobile broadband penetration and the surge of connected objects will lead to around ten times more endpoints attached to mobile operator networks than today. To prepare for this, we need to find ways to push the capacity and data rates of mobile networks radically into new dimensions to handle this amount of data traffic.
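The scale of these requirements can be made concrete with a back-of-envelope calculation. The 60x and 10x factors come from the text; combining them into a single upper bound is our own illustration, and overstates reality because many connected objects send very little traffic:

```python
# Back-of-envelope scaling of the aggregate traffic demand described above.
gb_per_user_per_day_future = 1.0
traffic_multiple_per_user = 60      # 1 GB/day is ~60x today's average (from the text)
endpoint_multiple = 10              # ~10x more endpoints attached to networks

# Today's implied average per active user:
gb_today = gb_per_user_per_day_future / traffic_multiple_per_user
print(f"Today's average: ~{gb_today * 1024:.0f} MB/user/day")

# If every endpoint carried full user-level traffic (an upper bound, since
# most IoT endpoints are far lighter), aggregate demand could grow by up to:
print(f"Upper-bound aggregate growth: {traffic_multiple_per_user * endpoint_multiple}x")
```

Even with the caveat on IoT traffic volumes, the arithmetic shows why incremental tuning will not suffice and capacity must move "into new dimensions".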

Yet being able to deal with this traffic growth is just one aspect. An increasing number of real-time apps will test the performance of the networks. To support them with a good user experience, we need to find ways to reduce the end-to-end latency imposed by the network to milliseconds. Tactile (touch/response) and machine-to-machine interactions in particular have latency demands that can be as low as single-digit milliseconds.

To ensure mobile broadband remains affordable even while supporting the capacity and real-time requirements described previously, we also need to radically reduce the network Total Cost of Ownership (TCO) per gigabyte of traffic. We believe one important lever will be to automate all tasks of network and service operation by teaching networks to be self-aware, self-adapting and intelligent. This will help to reduce CAPEX/IMPEX for network installation as well as OPEX for network and service management. In addition to lowering TCO, self-aware and intelligent networks will be able to understand their users’ needs and automatically act to deliver the best personalized experience.

To further reduce costs per GB, we need to share network resources, both within a single operator network and between operators. Shared resources will include physical infrastructure, software platforms, sites, spectrum assets, or even the network as a whole. We must also find ways to increase energy efficiency. Beyond their environmental impact, energy costs today account for up to 10% (in mature markets) and up to 50% (in emerging markets) of an operator’s network OPEX, and they have grown constantly in recent years.

The most powerful way to deal with the cost pressure will of course be to identify new revenue streams. Are end customers and termination fees really the sole revenue sources for operators, or will technologies enable new business models that allow operators to better monetize all their assets?

Ultimately we of course need to admit that due to the fast pace of change in the industry it is simply not possible to predict all requirements future networks will face. There will be many use cases that are simply not known today. To cope with this uncertainty, flexibility must be a key requirement as well.

The Future of Connectivity – Impacts and Implications

More spectrum, higher spectral efficiency and small cells will provide up to 1,000 times more capacity in wireless access. In the world of wireless, Shannon’s law is the fundamental rule that defines the physical limit on the amount of data that can be transferred across a single wireless link. It says that capacity is determined by the available bandwidth and the signal-to-noise ratio, which in a cellular system is typically constrained by interference.
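Shannon's law (the Shannon-Hartley theorem) can be written as C = B·log2(1 + S/N). A minimal sketch of the relationship, using illustrative bandwidth and SINR values rather than any specific system:

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, sinr_linear: float) -> float:
    """Shannon-Hartley limit: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + sinr_linear)

# A 20 MHz channel at 20 dB SINR (illustrative values):
sinr = 10 ** (20 / 10)              # convert dB to a linear ratio -> 100
capacity = shannon_capacity_bps(20e6, sinr)
print(f"{capacity / 1e6:.0f} Mbit/s")
```

The two levers discussed in the following paragraphs map directly onto the two factors: acquiring more spectrum raises B, while interference management raises the effective S/N.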

Therefore the first lever to increase capacity will be simply to utilize more spectrum for mobile broadband. In total, the spectrum demanded for mobile broadband amounts to more than 1,100 MHz, and a large amount (about 500 MHz) of unlicensed spectrum at 2.4 GHz and 5 GHz can provide additional capacity for mobile data. Of course, reaching agreement on spectrum usage requires significant alignment efforts by the industry and is a rather time-consuming process. It is therefore also necessary to look at complementary approaches such as the Authorized Shared Access (ASA) licensing model, which allows fast and flexible sharing of underutilized spectrum currently assigned to other spectrum holders such as broadcasters, public safety, defence or aeronautical services.

A key challenge associated with more spectrum is enabling base stations and devices to utilize this larger and potentially fragmented spectrum. Here, technologies such as intra- and inter-band Carrier Aggregation will be essential to make efficient use of fragmented spectrum.

The second lever for more capacity will be to address the interference part of Shannon’s equation. This can be achieved, for example, through beamforming techniques, which concentrate the transmit power into smaller spatial regions. A combination of multiple spatial paths through Coordinated Multipoint Transmission (CoMP) can further increase the capacity available to individual users. We believe that with the sum of these techniques the spectral efficiency of the system can be increased by up to 10 times compared to HSPA today.

Advanced technologies and more spectrum will help grow capacity through upgrades to existing macro sites for some time yet. However, a point will be reached when macro upgrades hit their limits. By 2020 we believe mobile networks will consist of 10–100x more cells, forming a heterogeneous network of macro, micro, pico and femto cells. Part of this will also be non-cellular technologies such as Wi-Fi, which need to be seamlessly integrated with cellular technologies for an optimal user experience.

Although the industry today has not defined what 5G will look like and the discussions about this are just starting, we believe that flexible spectrum usage, more base stations and high spectral efficiency will be key cornerstones.

The capacity demand and the multitude of deployment scenarios for heterogeneous radio access networks will make mobile backhaul key to network evolution in the next decade. The backhaul requirements of future base stations will easily exceed the practical limits of copper lines, so from a pure technology perspective fiber seems to be the solution of choice: it provides virtually unlimited bandwidth and can be used to connect macro cells in rural areas and some of the small cells in urban areas. However, high deployment costs will in many cases prevent dedicated fiber deployments just to connect base stations. Given the number of deployment scenarios for small cells, from outdoor lamp-post installations to indoor sites, we believe a wide range of wireless backhaul options will coexist, including microwave point-to-point and point-to-multipoint links and millimetre-wave backhaul technologies. For many small cell deployment scenarios (e.g. installations below rooftop level) a non-line-of-sight backhaul will be needed. The main options here are either to utilize wireless technologies in the spectrum below 7 GHz or to establish meshed topologies.

Besides pure network capacity, the user experience for many data applications depends heavily on the end-to-end network latency. For example, users expect a full web page to be loaded in less than 1000ms. As loading web pages typically involves multiple requests to multiple servers, this can translate to network latency requirements lower than 50ms. Real-time voice and video communication requires network latencies below 100ms and advanced apps like cloud gaming, tactile touch/response applications or remotely controlled vehicles can push latency requirements down to even single digit milliseconds.
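The step from a 1000ms page-load budget to a sub-50ms network latency requirement follows from simple division. The number of sequential round trips assumed here is an illustration, not a measured figure:

```python
# A full page load is budgeted at 1000 ms (user expectation from the text).
page_budget_ms = 1000
# Assume ~20 request/response round trips that cannot be fully parallelized
# (DNS, TLS handshake, HTML, then dependent CSS/JS/image fetches).
sequential_round_trips = 20

per_rtt_budget_ms = page_budget_ms / sequential_round_trips
print(f"Allowed network round-trip time: {per_rtt_budget_ms:.0f} ms")
```

The same budgeting logic applies to the other figures in the paragraph: the more round trips an application chains together, the smaller the share of the experience budget each one may consume.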

The majority of mobile networks today show end-to-end latencies in the range of 200ms-500ms, mainly determined by slow and capacity-limited radio access networks. Therefore the high bandwidth provided by future radio access technologies, together with fast data processing and transmission, will make a major contribution to reducing network latency. Due to the amount of data being transferred, user-perceived latency can be much higher than the plain round-trip time. Thinking of future ultra-high-definition (UHD) real-time video applications, this clearly motivates the need for further technology evolution.

Equally important is the actual traffic load along the end-to-end path in the network. A high traffic load leads to queuing of packets, which significantly delays their delivery. It is not efficient simply to overprovision bandwidth in all network domains to solve this. Instead, latency-sensitive media traffic might take a different path through the network or receive preferred treatment over plain data transfers. This needs to be supported by continuously managing latency as a network quality parameter to identify and improve bottlenecks. In return, low-latency traffic could be charged at a premium, providing network operators with new monetization opportunities.
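The "preferred treatment" idea can be illustrated with a toy strict-priority scheduler. The traffic classes and their ranking here are assumptions for the sketch, not a real QoS standard:

```python
import heapq

# Lower number = higher priority; classes and ranking are illustrative.
PRIORITY = {"voice": 0, "video": 1, "data": 2}

queue = []
seq = 0  # tie-breaker that preserves FIFO order within a class
for pkt in ["data", "voice", "data", "video", "voice"]:
    heapq.heappush(queue, (PRIORITY[pkt], seq, pkt))
    seq += 1

# Latency-sensitive packets leave the queue first, plain data last.
order = []
while queue:
    order.append(heapq.heappop(queue)[2])
print(order)  # ['voice', 'voice', 'video', 'data', 'data']
```

Real schedulers use weighted variants to avoid starving low-priority traffic, but the principle is the same: queuing delay is shifted from latency-sensitive flows onto flows that can tolerate it.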

One physical constraint for latency remains: distance and the speed of light. A user located in Europe accessing a server in the US will face a round-trip time of around 50ms due simply to the physical distance involved, no matter how fast and efficient the network is. As the speed of light is constant, the only way to improve this is to reduce the distance between devices and the content and applications they are accessing. Many future applications such as cloud gaming depend on dynamically generated content that cannot be cached, so the processing and storage for time-critical services also needs to move closer to the edge of the network.
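The 50ms figure can be reproduced from propagation delay alone. Light in optical fiber travels at roughly two thirds of its vacuum speed, so the text's figure corresponds to vacuum-speed propagation; the path length below is an assumed round figure:

```python
VACUUM_KM_PER_MS = 300.0   # speed of light in vacuum
FIBER_KM_PER_MS = 200.0    # ~2/3 of c in optical fiber

def min_rtt_ms(distance_km: float, speed_km_per_ms: float) -> float:
    """Lower bound on round-trip time from propagation delay alone."""
    return 2 * distance_km / speed_km_per_ms

# Europe to the US East Coast, roughly 7,500 km of path (assumed figure):
print(f"Vacuum floor: {min_rtt_ms(7500, VACUUM_KM_PER_MS):.0f} ms")  # 50 ms
print(f"Fiber floor:  {min_rtt_ms(7500, FIBER_KM_PER_MS):.0f} ms")   # 75 ms
```

Since real routes are longer than the great-circle distance and fiber is slower than vacuum, the only remaining lever is the one the text names: shrinking the distance by moving content and processing to the network edge.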

The introduction of additional radio access technologies, multiple cell layers and diverse backhaul options will increase complexity and carries the risk that network OPEX will rise substantially. This is why the Self-Optimizing Network (SON) is so important. SON not only increases operational efficiency, but also improves the network experience through higher network quality, better coverage, capacity and reliability. Extending SON principles to a heterogeneous network environment is a challenge and an opportunity at the same time.

Fortunately, big data analytics and artificial intelligence (AI) technologies have matured in recent years, mainly driven by the need to interpret the rapidly growing amount of digital data on the Internet. Applied to communication networks, they are a great foundation for analyzing terabytes of raw network data and proposing meaningful actions. In combination with AI technologies, actionable insights can be derived even from incomplete data; for example, machine-learning techniques can find patterns in large and noisy data sets. Knowledge representation schemes provide techniques for describing and storing the network’s knowledge base, and reasoning techniques use this to propose decisions even with uncertain and incomplete information. Ultimately we believe that both big data analytics and AI technologies will help to evolve SON into what we call a “Cognitive Network”: one that is able to handle complex end-to-end optimization tasks autonomously and in real time.

Customer Experience Management (CEM) can provide insights that will enable operators to optimize the balance of customer experience, revenues and network utilization. Cognitive Networks will help to increase the automation of CEM, enabling network performance metrics, as well as experience and business metrics, to be used to govern the insight/action control loop. This again increases operational efficiency and at the same time will be the prerequisite for delivering a truly personalized network experience for every single user.

The big data analytics and AI technologies introduced with Cognitive Networks will be the foundation for advanced customer experience metrics. The ability to deal with arbitrary amounts of data in real time will allow much more detailed sensing of network conditions and the resulting user experience.

It also will be the foundation for large-scale correlations with other data sources such as demographics, location data, social network data, weather conditions and more. This will add a completely new dimension to user experience insights.

Cloud technologies and the ability to provide computing and storage resources on demand have transformed the IT industry in recent years. Virtualization of computing and storage resources has enabled the sharing of resources and thus improved their overall efficiency. Virtual cloud resources can also be scaled up and down almost instantly in response to changing demand. This flexibility has created completely new business models: instead of owning infrastructure or applications, it is possible to obtain them on demand from cloud service providers. So far this approach has mainly revolutionized IT datacenters. We believe that similar gains in efficiency and flexibility can be achieved by applying cloud technologies to telco networks. Virtualization will allow traditional vertically integrated network elements to be decoupled into hardware and software, creating network elements that consist simply of applications on top of virtualized IT resources. The hardware will be standard IT hardware, hosted in datacenters and either owned by the network operator or sourced on demand from third-party cloud service providers. The network applications will run on top of these datacenters, leveraging the benefits of shared resources and flexible scaling.

User plane network elements such as base stations will also be subject to this paradigm shift. Over time, the migration of network elements, in combination with software-defined networking, will transform today’s networks into a fully software-defined infrastructure that is highly efficient and flexible at the same time.

Efficient radio technologies, high utilization and network modernization will reduce network energy consumption, another important cost factor for operators. With the forecast traffic growth in mind, reducing network energy consumption must be a major objective. The focal point for improving network energy efficiency will be the radio access network, which accounts for around 80% of all mobile network energy consumption. Ultimately, the energy efficiency that can be achieved depends on the pace of network modernization: efficiency gains materialize only when new technologies are introduced into the live network. Determining the right pace for modernization requires careful balancing of CAPEX and OPEX. We believe that energy efficiency gains can outpace traffic growth, which makes keeping network energy consumption at least flat a challenging but achievable goal.