Is the Internet threatened by the rising seas?

In the collective imagination the internet lives in the cloud, yet it is far more tangible than we like to think. It spreads across the globe through an underground network that grows with the demand for internet access. It just so happens that this network is in danger: the rise in sea levels caused by global warming threatens the parts of the network located on the coast. The damage caused by this rising water could greatly affect our modern lifestyle.

A tangible and massive internet network

The internet is described by the Cambridge Dictionary as "the large system of connected computers around the world that allows people to share information and communicate with each other". It has three main components: end-user equipment, data centres and the internet network.

This network is itself composed of several elements, such as fibre-optic cables, hardware servers, data-transfer stations and power stations. These interconnected elements weave a web that carries information from one end of the world to the other, and its total length is difficult to estimate. In 2014, there were 285 submarine communication cables, representing about 550,000 miles. The size of the terrestrial network is even harder to gauge, as it grows with demand and newly installed cables intermix with the old ones.

In the United States, most internet infrastructure is estimated to have been built in the 1990s and 2000s. At the time, the network grew along with the major American cities. Today, operators tend to lay network extensions alongside other infrastructure such as roads, railways and power lines. In many parts of the world and throughout history, cities and megacities have developed along coastlines: port cities synonymous with wealth, opportunity and business. These attractive and often densely populated cities now face a new danger: the flooding of their internet network.

The rising seas gaining internet ground

Paul Barford, a computer scientist, and his student Ramakrishnan Durairajan undertook a mapping of US internet infrastructure. As the infrastructure is private and belongs to the operators, its locations are mostly kept secret to avoid possible damage. In mapping the network, they observed that it was densest in areas of high population, which are often coastal cities.

They presented their findings to Carole Barford, a climate scientist, and together they became aware of the risk of part of the network flooding. They decided to superimpose their map onto the sea-level-rise projections published by the National Oceanic and Atmospheric Administration (NOAA). Through their research, they estimated that by 2033 about 4,000 miles of cable and 1,100 traffic hubs in the US would be underwater. In New York City alone, about 20% of the internet network would be submerged.

We should not underestimate the repercussions such flooding would have on our current lifestyles. Many services depend on the internet, such as traffic lights, medical monitoring and cash dispensers. Some cities have already suffered blackouts due to flooding: in 2012, during Hurricane Sandy, 10% of New York City was deprived of electricity.

The problem is that the terrestrial network is designed to be water-resistant, not to operate underwater.

Unlike submarine cables, cables buried in the earth are protected mainly by plastic. They are not adequately protected against floods or frost. And since part of the network has been in the ground for years, it may be even more fragile than the newer extensions.

The three scientists presented their study of US territory at the Applied Networking Research Workshop in Montreal on 16 July 2018. "The 15-year predictions are really kind of locked in," said Carole Barford: nobody can change what will happen. The main cities affected are New York, Miami and Seattle.

Saving the Internet … from itself?

"If we want to be able to function like we expect every day, we're going to have to spend money and make allowances and plans to accommodate what's coming," said Carole Barford. "Most of the damage that's going to be done in the next 100 years will be done sooner than later… That surprised us. The expectation was 50 years to plan for it. We do not have 50 years," added Paul Barford.

So, what are the solutions to avoid this submersion of the network?

The first would be to locate all the infrastructure that forms the internet network. Despite the risk of deliberate damage, which is why locations are kept secret, it is necessary to identify the infrastructure that will be underwater in a few years. The study predicts that about 4,000 miles of cable and 1,100 traffic hubs will eventually be submerged, an estimate based only on the networks the researchers knew about. The study should also be extended to every continent and country: as rising sea levels are a global effect of climate change, many coastal cities are likely to be affected.

To limit the impact of rising water on the internet, operators can envisage several solutions: strengthening the current network, moving it further inland, or routing computer signals around submerged areas. None of these solutions is perfect or permanent. Strengthening infrastructure will only work for so long. Avoiding submerged areas will degrade the quality of the network and could cause latency. Moving existing infrastructure or building new infrastructure will require significant financial investment that could affect the end user.

Our internet use seems to be in danger, but does it contribute to its own destruction? The internet is not as green as it seems. Data centres, one of its main components, are largely powered by unsustainable energy sources, creating carbon emissions. Forbes estimated that the carbon footprint of data centres alone is equivalent to that of the global aviation industry, or 2% of global emissions. The carbon dioxide emitted by our ever-increasing internet use is one of the causes of melting ice caps and rising sea levels.

Wouldn’t it be ironic if our growing internet addiction was its own worst enemy?

The invisible pollution of the internet

What if the internet became the primary cause of global warming? Ian Bitterlin, a data-centre expert, estimates that by 2030 the internet will consume 20% of the world's electricity. Today, the energy consumed by the internet is, for the most part, not of green origin. It generates an ever-growing carbon footprint and has a detrimental impact on global warming. Facing social pressure and increasingly frequent investigations by independent organisations, large companies are now embarking on a race for a green internet.

The internet's greed for energy

A power-hungry global network

To determine the energy consumption of the internet, one must first ask what the internet is. According to the Cambridge Dictionary, it is "the large system of connected computers around the world that allows people to share information and communicate with each other". A study conducted by Ericsson and TeliaSonera determined that the three most energy-hungry components of this large system are end-user equipment, data centres and networks.

The end-user equipment

According to a 2017 study from the Center for the Digital Future in the United States, Americans spend an average of one full day per week connected to the internet. A study from Statista indicates that teenagers are even more exposed: they spend about four hours a day online, a little over a full day per week. These numbers are further evidence of the constant connectivity we experience daily. To stay connected, we use devices that we recharge regularly, thus consuming energy.

The data centres

Data centres are also very energy-hungry. A data centre is "a place where computers can be kept safely", according to the Cambridge Dictionary. Every click, every message sent, every video watched solicits these computer farms. They use electricity to operate, but above all to keep cool: the cooling of the computers alone accounts for 40 to 50% of the electricity consumed. McKinsey & Company estimates that only 6% to 12% of the power is actually used to compute; much of the rest is held in reserve to absorb surges in activity that could otherwise crash operations.

To illustrate the amount of energy a data centre consumes, Peter Gross, an engineer who designs power systems for data centres, said: "A single data centre can take more power than a medium-size town". In France, data centres consume more electricity than the city of Lyon (French Electricity Union, 2015). Globally, data centres account for up to 3% of energy consumption, The Independent wrote in 2016.

The internet network

The networks that provide access to the internet are also growing. They rely on technologies such as DSL, cable modems and fibre, all of which consume energy as well.

 

To determine how energy consumption splits between these three major components, the ACEEE estimated in 2012 that downloading one gigabyte of data consumes 5.12 kWh of power: 48% in data centres, 38% in end-user equipment and 14% in internet networks.
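As a rough illustration of the ACEEE split, here is a minimal back-of-the-envelope sketch (the function and constant names are my own, not from the report):

```python
# ACEEE 2012 estimate: 5.12 kWh consumed per gigabyte downloaded,
# split across the three main internet components.
TOTAL_KWH_PER_GB = 5.12
SHARES = {
    "data centres": 0.48,
    "end-user equipment": 0.38,
    "internet networks": 0.14,
}

def energy_breakdown(gigabytes):
    """Return the kWh consumed by each component for a given download size."""
    total = gigabytes * TOTAL_KWH_PER_GB
    return {component: round(total * share, 3)
            for component, share in SHARES.items()}

# Example: a 2 GB video download
for component, kwh in energy_breakdown(2.0).items():
    print(f"{component}: {kwh} kWh")
```

For a single gigabyte, this attributes roughly 2.46 kWh to data centres, 1.95 kWh to end-user equipment and 0.72 kWh to the networks themselves.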

 

The vagueness of exact global consumption

Determining the global energy consumption of the internet is complicated. The Centre for Energy-Efficient Telecommunications (CEET) tried once: it estimated that the internet accounted for 1.5% to 2% of the world's total energy consumption in 2013. If the internet were a country, it would be the fifth-largest energy consumer in the world. In 2014, Jonathan Koomey, a Stanford University professor known for describing Koomey's law, put this consumption at around 10%. In 2017, however, Greenpeace estimated it at the lower rate of 7%.

A few reasons can explain this critical difference. The main one is that when end-user equipment consumes energy, that energy is not necessarily used to connect to the internet: a laptop can just as well be used offline to play video games. Allocating the share of electricity attributable to the internet connection is therefore very complicated, and some experts prefer not to count these devices at all so as not to distort the numbers. Moreover, experts expect this power consumption to double every four years: The Guardian predicted that by 2020 the internet would reach 12% of global energy consumption.
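To see what "doubling every four years" implies, a minimal sketch of the compound-growth arithmetic behind such projections (the function name and starting figure are illustrative assumptions, not from any of the cited studies):

```python
def projected_share(current_share, years, doubling_period=4.0):
    """Project a consumption share that doubles every `doubling_period` years."""
    return current_share * 2 ** (years / doubling_period)

# Starting from Greenpeace's 2017 estimate of 7%, a four-year doubling
# period would put the share near 14% four years later and 28% after eight.
print(projected_share(7.0, 4))
print(projected_share(7.0, 8))
```

Exponential growth like this is exactly why small differences in the starting estimate (7% versus 10%) lead to wildly different long-term predictions.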

With great power come great sustainable responsibilities

The dark side of the power

The problem with the internet's energy consumption lies in tracking what kind of energy the network uses. As Gary Cook, a senior policy analyst at Greenpeace, said: "How we power our digital infrastructure is rapidly becoming critical to whether we will be able to arrest climate change in time. […] If the sector simply grew on its current path without any thought as to where its energy came from, it would become a major contributor to climate change far beyond what it already is." Indeed, in 2016 The Independent wrote that the carbon footprint of data centres worldwide was equivalent to that of the global aviation industry, up to 2% of global CO2 emissions.

Some organisations have therefore investigated the share of renewable energy that data centres consume. The Environmental Leader estimated that in 2015 Google and Amazon used at least 30% fossil energy to power their data centres. The same year, the Lux Research company benchmarked data centres owned by Google and found that four out of seven depended on coal. In 2012, Greenpeace released the report "How Clean is your Cloud?", assessing how environmentally responsible some companies were in the operation of their clouds and data centres.

The Green Power Race

These studies by different organisations have sparked a race for green power among the data centres of large companies. Google, Apple, Facebook and Amazon now power their data centres with 100% renewable energy or are working towards that goal. Amazon, for example, has claimed to power its servers with at least 50% renewable energy since 2018; however, Greenpeace recently disputed this figure, estimating the share at only 12%. Greenpeace also points out that the change driven by these big Western companies is not enough. Sizeable Chinese web companies such as Baidu and Tencent show very little transparency, communicating little about their energy consumption or their use of green energy, and they have little access to renewable energy because of monopoly utilities. And while the GAFA are under the spotlight, medium and small data centres remain off the radar.

Nonetheless, the International Energy Agency (IEA) announced that despite an expected increase of about 30% in data-centre workload by 2020, the electricity they use would grow by only about 3%: data centres are becoming more and more energy-efficient.

 

The internet remains the most important source of information and has also made less polluting solutions possible. Reading an email is more eco-friendly than printing it on paper. Using an app to find a parking space is more environmentally friendly than driving around in circles to find one. If you find yourself worrying about this invisible pollution we generate daily, rest easy in the knowledge that the internet also contains tips for reducing its own electricity consumption.

 


BIG DATA & CRM: Is Customer Data Platform the new DMP?

It's a question that should already have aroused your interest as a marketer. Why? These last few months have been fuelled by discussions, talks and market moves around the technology stack called the Data Management Platform (DMP). There is a shift in the data-driven approach to collecting, activating and analysing data. Have you felt it coming?
Indeed, at the recent Boston MarTech Conference (2-4 October), the question of "finding success and efficiency from a centralized customer data center", in other words the rise of the Customer Data Platform, was one of the main focuses. It relates directly to the DMP, a tool which until recently has mostly been used to maximise media-audience activation by tracking user navigation. Today's need is for a holistic view of the customer in order to drive loyalty and enhance customer lifetime value at a larger scale. The AdTech and MarTech industry aims to gather as much high-quality PII and non-PII data (Personally Identifiable Information or not) as possible and make greater use of the DMP. So what about the Customer Data Platform: is it really new? What has been done so far on the market? And why, as a marketer, should you bother reading this whole article? Well, there are no promises, just facts, to give you a close-up of the major challenges around data management that are impacting the entire digital ecosystem, including you!

A Data-driven perspective…

Let's get a clear understanding of what a DMP is, its evolving applications and the advent of the "Customer Data Platform" vision.

According to Forrester Research, a DMP is a "unified technology platform that intakes disparate first-, second- and third-party datasets, provides normalization and segmentation on that data, and allows a user to push the resulting segmentation into live interactive channel environments". This technology stack has been used for the past eight years to mine data originating from big data. The data stored combines first-party data, the most valuable asset, including the brand's audience, media, browsing and mobile data (options, visited pages, clicked banners, etc.); second-party data, which comes from partnering agreements; and third-party data, which is collected and sold to other companies for audience targeting. The particularity of the DMP is that it mostly gathers non-PII information and therefore differs from a data-lake technology, which is reserved for internal use (for example, the Hadoop ecosystem).

To what extent are marketers concerned?

As a marketer, you certainly realise that data is a key financial asset at the centre of your marketing strategy, giving you a competitive advantage on the market.
Using data-management technology has become standard for scoring users in real time across every channel and touchpoint. To leverage your DMP, you have to build actionable and valuable segments by collecting behavioural data. Your segment strategies will then serve different purposes: activating new leads, using lookalike modelling to find new customers, consolidating data through user matching, qualifying cookies and performing analytics.
Moreover, agreeing to peer-to-peer data sharing with publishers can also help maximise your media strategy by targeting more high-end audiences. It will contribute to lowering your CPA (cost per acquisition: the marketing and sales cost of acquiring a new customer), a key indicator of a successful marketing campaign.

Issues and challenges around Data Management

According to the study "DMP Europe 2016", led by ExchangeWire, Weborama and Stratégies, 37% of companies equipped with a DMP complained about its limited concrete marketing-activation possibilities (sending emails, custom content). The DMP is rather under-used: it was originally created to monetise data through programmatic display buying, so people undervalue its scope of application. Moreover, a Converteo study highlights that in France the use of DMPs is not yet widespread among companies.
Another issue concerns the DMP's scaling limits: if the platform does not drive enough media-buying volume, it is not good value for money.

The advent of the Customer Data Platform vision

That is why there is a shift on the market today: agencies and brands want to fully embrace a data-driven strategy and converge toward a single platform that integrates DMP behavioural data with CRM data. They are building what is called a "Customer Data Platform", a unified solution relying on a single identifier for each customer in order to operate at a larger scale. The marketing analyst David Raab introduced the acronym back in 2013 to describe global companies' need to link big data and data insights. The Customer Data Platform is similar to a marketing hub, designed to centralise and store all types of customer data, unlike DMP solutions, where the stored data (first-, second- and third-party) is anonymous. According to Cyril Fekete, consulting director at the French specialist Artefact, the CDP corresponds to the following formula: "CDP = DMP + CRM + data lake".
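The "single identifier" idea behind that formula can be sketched in a few lines. This is an illustrative toy model only, not any vendor's actual API: the class name, function name and sample records are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class CustomerProfile:
    """One unified record per customer, keyed on a single identifier."""
    customer_id: str
    crm: dict = field(default_factory=dict)        # identified CRM data (email, orders...)
    behaviour: dict = field(default_factory=dict)  # DMP-style behavioural data (segments...)

def merge_into_cdp(crm_records, dmp_records):
    """Build unified profiles from separate CRM and DMP stores."""
    profiles = {}
    for cid in set(crm_records) | set(dmp_records):
        profiles[cid] = CustomerProfile(
            customer_id=cid,
            crm=crm_records.get(cid, {}),
            behaviour=dmp_records.get(cid, {}),
        )
    return profiles

crm = {"c42": {"email": "jane@example.com", "orders": 3}}
dmp = {"c42": {"segments": ["sport", "travel"]}, "c7": {"segments": ["tech"]}}
profiles = merge_into_cdp(crm, dmp)
print(profiles["c42"].crm["orders"], profiles["c42"].behaviour["segments"])
```

Note that a customer known only to the DMP (here "c7") still gets a profile with an empty CRM side: the CDP holds the full population, identified or not, under one key.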

A broad application scope and a promising future

As Emily Macdonald, head of programmatic at DigitasLBi, said, there is a "march of martech", with big SaaS players moving in and acquiring DMP, CRM and marketing-automation solutions: Salesforce with the Krux DMP, Oracle with the Eloqua marketing solution, or Adobe with the Neolane CRM tool (see Gartner's 2017 Magic Quadrant for Digital Marketing Hubs).
On the French market, companies such as the startups Eulerian and Makazi have already broadened the DMP's scope of application by activating browsing information through emailing.
Another reason for using such technology stacks is that marketers and data owners want to dive deeper into their data to get a greater return on assets. They want to trigger purchases and increase retention among their target audience by enhancing their customer knowledge.
It also means that data owners want better control of their own data, to optimise their marketing ROI and manage price-setting with publishers and other ad-tech partners. In fact, there is a need for transparency around the use of data (what is shared) and its business model (fixed price, CPM, etc.), as in the past there were no clear rules (cf. AdExchanger article) and there is always a privacy risk around hijacked cookies.

But what about the impact of the GDPR?

All these changes and the need for transparency are linked to the General Data Protection Regulation (GDPR) and the ePrivacy Regulation, coming into play in May 2018. They are going to sweep away the opaque layer of privacy practices and impose rules protecting users from data misuse by the entities responsible for collecting personal data. The EU Commission wants to end the cookie banners displayed each time a user visits a website and replace them with a single consent given directly through the browser settings, meaning a user could reject all cookies at once. That is why the draft regulation could have a huge negative impact on the advertising industry and might well change the rules of the game.

So what’s next ?

Hence, there is a lot going on around data management! The future of marketing stacks certainly includes the Customer Data Platform, a vision encompassing a unified omnichannel view of online and offline data. DMP technologies are not finished yet, as their applications have not been fully exploited on the market. Furthermore, advanced artificial intelligence, machine learning and cloud computing will play a great role in data provisioning, storage and analytics. So, whatever name you choose for your single data-management platform, opt for a solution that merges all cross-device and cross-channel customer data and favours the individual customer journey. As Paul Graham said, your goal as a marketer is to "make something people want". Enrich and leverage your data to better engage your audience, increase conversion and maximise your customer-lifetime-value metric.

What is your viewpoint on this topic? Share your thoughts with us right here.


Trump, Macron…how big data made its way to politics

On 20 January 2017, Donald Trump was inaugurated as president of the United States, a very unexpected event for many people. In May 2017, Emmanuel Macron became the eighth president of France's Fifth Republic. Not much brings these two public figures together, but the way they used technological tools to get where they are today is one thing they share.

Presidents Trump and Macron are far from the only ones who used big data to take their campaigns to a higher level; most candidates in France actually did (1). This is where we are today: technology is reaching into the most social and human parts of our lives. Politics involves people, culture, social issues… Yet the current U.S. and French presidents used one of the newest and most powerful technologies available to reinforce and widen their base of voters, with the objective of being elected. Big data: a revolution in the political sector.

So…what exactly is big data?

"It means population data as opposed to sample data. Until fairly recently, campaigns relied on survey data, which is obviously limited. But big data has changed that. Campaigns now have access to data about the entire population, not just a narrow sample," says Eitan Hersh (2). Indeed, politicians today can gain access to personal data about any group of people, obtaining it through digital agencies.


