In the collective imagination, the internet lives in the cloud, when in fact it is far more tangible than we like to think. It spreads across the globe through an underground network that grows with the demand for internet access. And this network is in danger: the rise in sea levels due to global warming threatens the parts of it located on the coast. The damage caused by this rising water could greatly affect our modern lifestyle.
A tangible and massive internet network
The internet is described by the Cambridge Dictionary as “the large system of connected computers around the world that allows people to share information and communicate with each other”. It has three main components: end-user equipment, data centres and the internet network.
This network is itself composed of several elements: fibre-optic cables, hardware servers, data transfer stations and power stations. These elements, all interconnected, weave a web that transmits information from one end of the world to the other, and its total length is quite difficult to estimate. In 2014, there were 285 submarine communication cables, totalling about 550,000 miles. The size of the terrestrial network is harder to gauge, as it grows with demand and newly installed cables intermix with the old ones.
In the United States, it is estimated that most internet infrastructure was built in the 1990s and 2000s. At that time, the development of the network followed the development of major American cities. Today, operators tend to install network extensions alongside other infrastructure such as roads, railways or power lines. In many areas of the world, and throughout history, cities and megacities have developed along the coastlines: port cities that were synonymous with wealth, opportunity and business. These attractive and often densely populated cities now face a new danger: the flooding of their internet network.
The rising seas gaining internet ground
Paul Barford, a computer scientist, and his graduate student, Ramakrishnan Durairajan, undertook a mapping of US internet infrastructure. As the infrastructure is private and belongs to the operators, its locations are kept mostly secret in order to avoid deliberate damage. In mapping the network, they observed that it becomes denser in areas of high population. These are often coastal cities.
They presented their findings to Carole Barford, a climate scientist, and together they became aware of the risk of flooding for part of the network. They decided to overlay their map with sea-level-rise projections from the National Oceanic and Atmospheric Administration (NOAA). Through their research, they estimated that by 2033 about 4,000 miles of cable and 1,100 traffic hubs would be underwater in the US. In New York City, about 20% of the internet network would be underwater.
We should not underestimate the repercussions this flooding would have on our current lifestyles. Many services run on the internet, such as traffic lights, medical monitoring and cash dispensers. In the past, some cities have suffered blackouts due to flooding: in 2012, during Hurricane Sandy, 10% of New York City was deprived of electricity.
The problem is that the terrestrial network is designed to be water-resistant, not to operate under water.
Unlike submarine cables, cables buried in the earth are protected mainly with plastic. They are not adequately protected against floods or frost. And with part of the network already a good few years old, it may be even more fragile than the newer extensions.
It was at the Applied Networking Research Workshop in Montreal, on July 16, 2018, that the three scientists presented their study of US territory. “The 15-year predictions are really kind of locked in,” said Carole Barford: nobody can change what will happen. The main cities involved are New York, Miami and Seattle.
Saving the Internet … from itself?
“If we want to be able to function like we expect every day, we’re going to have to spend money and make allowances and plans to accommodate what’s coming,” said Carole Barford. “Most of the damage that’s going to be done in the next 100 years will be done sooner than later … That surprised us. The expectation was that we’d have 50 years to plan for it. We don’t have 50 years,” added Paul Barford.
So, what are the solutions to avoid this submersion of the network?
The first would be to locate all the infrastructure that makes up the internet network. Despite the risk of deliberate damage, it is necessary to identify the infrastructure that will be underwater within a few years. The study predicts that about 4,000 miles of cable and 1,100 traffic hubs will eventually be underwater, an estimate based only on the networks the researchers knew about. The study must also be extended to all continents and countries: as rising water levels are a global effect of climate change, many coastal cities are likely to be affected.
In order to limit the impact of rising water on the internet, operators can envisage several solutions: strengthening the current network, moving it further inland, or routing signals around submerged areas. However, none of these solutions is perfect or permanent. Strengthening infrastructure will only work for so long. Routing around submerged areas will degrade the quality of the network and could cause latency. Moving existing infrastructure or creating new infrastructure will require significant financial investment that could be passed on to the end user.
Our internet use seems to be in danger, but does the internet contribute to its own destruction? The internet is not as green as it seems. We power data centres, one of its main components, with unsustainable energy sources, creating carbon emissions. Forbes estimated that the carbon footprint of data centres alone is equivalent to that of the global aviation industry, or 2% of global emissions. These carbon dioxide emissions, fuelled by our increasing use of the internet, are one of the causes of melting ice caps and rising water levels.
Wouldn’t it be ironic if our growing internet addiction was its own worst enemy?
Phygital is all about combining the physical and the digital. It creates an ecosystem between the brand and the consumer that spans the virtual and the real. This article will go through the different steps that can help businesses adapt to this transformation. It will focus on the relation between the real and the digital in marketing, and then explain why it is important to balance the marriage between the two.
Thou shalt adopt a user-friendly experience and invite the client to your den
According to Barbara Prose, an expert journalist, this relation between the physical and the digital is similar to organic food: once omnipresent, its value now needs to be redefined. In reality, the two worlds – physical and digital – cross each other but do not necessarily overlap. Most French retailers believe that the main challenge for successful marketing is to make these worlds overlap.
It is a hard task for brands, as consumers have a vast wealth of choices and many different purchase channels. For example, Amazon sets the bar high by offering fast delivery, instant customer service and large stock availability. Therefore, to restore the balance in physical shops while maintaining the added value of new tools, marketers should look for ways to infuse the digital into the physical.
For Michael Miramond, Director of Retail, Luxury and Consumer Products at IBM, the luxury and clothing sector, which historically relied on physical marketing, has a card to play with artificial intelligence. IBM is developing the Watson virtual assistant, which can interpret a series of human “markers” such as the voice or facial expressions. This technology is able to “learn” and hence to better know and “understand” consumers, which could facilitate the virtual personalisation of products. Luxury and clothing retailers hope to become a “concept” space designed as a two-way path: on the one hand, customers can test before buying directly from their computers; on the other, sales teams can use the data collected online to enhance the in-store user experience, delivering the most complete “phygital” experience to date.
Thou shalt tailor-make the client’s experience, but beware: if unprepared, it might result in a flop
As “phygital” is rooted in segmenting clients and personalising their tastes, it needs to be well managed, and the firm should be prepared to adopt a more agile structure. Charles-Alexandre Peretz, Director of Marketing and Operations at Atelier NA, a specialist in custom-made men’s suits launched 8 years ago, argues that although digital is in the brand’s DNA, it is still important to convince the client to visit the store.
The website represents the first date with the client. It is a means of exchanging information: from the client’s point of view, it allows them to understand the brand and its products; from the company’s perspective, it provides information on the client’s profile. Consequently, when the client arrives at the shop, the seller will already understand their expectations and will be better armed to propose a unique tailor-made experience.
But sometimes customising can backfire. For example, Starbucks US tested a new city-wide concept based on a beverage pre-reservation system, focusing more on the digital promise of the campaign. Lured by the promise of retrieving their order within 45 seconds, a high number of people rushed to the store, leaving the staff and the machines overwhelmed and unable to manage all of these orders simultaneously. Bottom line: the digital and the real experience must intertwine in order to respond to the customer’s demand.
Thou shalt put faith and trust in mobile
To bring “phygital” into the shopping experience, a company must switch from a classical business model to a more integrated one, implementing ‘data-driven’ and ‘data-centric’ strategies at its core. For Olivier Guillet, adopting a mobile marketing strategy is an opportunity to reach new customers. Baby boomers are retiring, and millennials will soon take over the workforce; this takeover is mainly driven by the mobile. It is a device that is always on, an intimate extension of ourselves, a guide to reach physical stores: in brief, the perfect game-changer in ‘phygital’ marketing.
Thou shalt provide extensive training to all co-workers
With great innovative sales tools comes great responsibility, one that is to be shared with all of the company’s stakeholders, especially the employees. There is no excuse when it comes to continuous team development, as it is the main engine behind gaining an edge with the client. A change of organisational culture is, therefore, an important element to plan and put in place.
Thou shalt honour the social media by using them as sales channels and reputation proof
Sales have been dramatically changed by the technological advances brought by social media. This affects many aspects of selling, from public consumption to the tools these platforms offer companies (payment integration, paid ads, etc.). Facebook and Twitter, for example, are central to clients’ satisfaction and their close relationship with brands: they are the new form of forum for voicing concerns about different products and services (Marketo, 2010).
That said, expectations are increasingly high, and the public constantly demands that brands reward their loyalty or react by improving their offers. In return, companies can take to social media to educate, to dig deeper, and to create an intimate relationship between the product or service and the client.
As a matter of fact, creating a social presence is no simple task: it is simultaneously a sales channel, a brand-management exercise and a reputation operation. It is important to understand that the balance of power has moved from the company to the customer (Baer, 2010).
These are the rules that you shalt set before them
A one-size-fits-all approach no longer works on social media. Although it is relatively easy and cheap to budget for social media strategies, in the long term they commit the brand to building a strong coalition inside and outside the company. Sales, marketing, customer relationship management and the client must work in concert on the digital platforms and relay back to the physical store in order to grow their affinity for each other.
Finally, physical and digital are not opponents and certainly not opposites. They are like your right hand and your left: you need both to clap. Therefore, honouring these five commandments will give a great understanding of the opportunity offered by both worlds and will certainly be key in shaping the next generation of market space.
Marketing Magazine (2018), Physique et Digital – Les deux c’est mieux, September 2018 p.52.
What if the internet became a primary cause of global warming? Ian Bitterlin, a data centre expert, estimates that by 2030 the internet will consume 20% of the world’s electricity. Today, the energy consumed by the internet is, for the most part, not of green origin. It generates an ever-increasing carbon footprint and has a detrimental impact on global warming. Large companies, facing social pressure and increasingly frequent investigations from independent organisations, are now embarking on a race for a green internet.
The energy greed of the internet
A power-hungry global network
To determine the energy consumption of the internet, one must first ask what the internet is. According to the Cambridge Dictionary, the internet is “the large system of connected computers around the world that allows people to share information and communicate with each other”. A study conducted by Ericsson and TeliaSonera determined that the three most energy-hungry components of this “large system” are end-user equipment, data centres and networks.
The end-user equipment
According to a 2017 study from the Center for the Digital Future in the United States, the average American spends one full day per week connected to the internet. A study from Statista indicates that teenagers are even more exposed: they spend about 4 hours a day on the internet, which adds up to a little over a full day each week. These numbers are further evidence of the constant connectivity we experience daily. To stay connected, we use devices that we regularly recharge, thus consuming energy.
The data centres
Data centres are also very greedy. A data centre is “a place where computers can be kept safely”, according to the Cambridge Dictionary. Each click, each message sent, each video watched solicits these computer farms. They use electricity to operate, but above all to keep cool: cooling alone accounts for 40 to 50% of the electricity consumed. McKinsey & Company estimates that only 6% to 12% of the power is used for actual computation; much of the rest keeps servers idling, ready to absorb a surge in activity that could otherwise crash their operations.
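As a rough illustration of this breakdown, here is a minimal Python sketch that splits a data centre's power draw using the midpoints of the figures above; the 10 MW total is a hypothetical facility size, not a number from the sources.

```python
# Hypothetical facility: 10 MW total draw (illustrative, not from the sources).
total_mw = 10.0

# Midpoints of the shares cited above: cooling 40-50%, computation 6-12%
# (McKinsey); the remainder largely keeps idle servers running.
cooling_share = (0.40 + 0.50) / 2
compute_share = (0.06 + 0.12) / 2
idle_share = 1.0 - cooling_share - compute_share

print(f"Cooling:        {total_mw * cooling_share:.1f} MW")
print(f"Computation:    {total_mw * compute_share:.1f} MW")
print(f"Idle/overhead:  {total_mw * idle_share:.1f} MW")
```

On these midpoint figures, cooling and idle capacity together dwarf the power spent on actual computation.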
To illustrate the amount of energy consumed by a data centre, Peter Gross, an engineer and designer of power systems for data centres, said: “A single data centre can take more power than a medium-size town.” In France, the energy consumption of data centres is higher than the electricity consumption of the city of Lyon (French Electricity Union, 2015). Globally, data centres account for up to 3% of energy consumption, The Independent wrote in 2016.
The internet network
We can also see an increase in the development of the networks that provide access to the internet, such as DSL, cable modem and fibre. These networks, too, run on energy.
To determine the energy consumption shares of these three major internet components, the ACEEE estimated in 2012 that downloading a gigabyte of data consumes 5.12 kWh: 48% in data centres, 38% in end-user equipment, and 14% in the internet networks themselves.
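A quick sketch in Python, taking the ACEEE figures above at face value, shows what each component costs per gigabyte downloaded:

```python
# ACEEE (2012): downloading 1 GB consumes 5.12 kWh, split across components.
kwh_per_gb = 5.12
shares = {"data centres": 0.48, "end-user equipment": 0.38, "networks": 0.14}

# Prints about 2.46, 1.95 and 0.72 kWh respectively.
for component, share in shares.items():
    print(f"{component}: {kwh_per_gb * share:.2f} kWh per GB")
```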
The uncertainty around exact global consumption
Determining the global energy consumption of the internet is complicated. The Centre for Energy-Efficient Telecommunications (CEET) attempted it, estimating that the internet accounted for 1.5% to 2% of the world’s total energy consumption in 2013. If the internet were a country, it would be the fifth-largest energy consumer in the world. In 2014, Jonathan Koomey, a Stanford University professor known for describing Koomey’s law, estimated this consumption at around 10%. In 2017, however, Greenpeace put it at the lower rate of 7%.
A few reasons can explain this critical difference, the main one being that when end-user equipment consumes energy, that energy is not necessarily used to connect to the internet: a laptop or computer can be used offline to play video games. Allocating the share of electricity used for the internet connection is therefore very complicated, and some experts prefer not to count the energy consumption of these devices at all, so as not to distort the numbers. Moreover, experts expect this power consumption to double every four years; The Guardian predicts that by 2020 the internet will reach 12% of global energy consumption.
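The two figures are at least mutually consistent. A small Python sketch (the function name is mine) compounds Greenpeace's 7% estimate for 2017 at a doubling-every-four-years rate:

```python
def project_share(base_share, base_year, target_year, doubling_years=4):
    """Compound a consumption share, doubling once per doubling period."""
    return base_share * 2 ** ((target_year - base_year) / doubling_years)

# Greenpeace's 7% in 2017, projected to 2020:
share_2020 = project_share(7.0, 2017, 2020)
print(f"Projected 2020 share: {share_2020:.1f}%")  # about 11.8%, near The Guardian's 12%
```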
With great power comes great sustainable responsibility
The dark side of the power
The problem with the energy consumption of the internet lies in tracking what kind of energy the network is using. As Gary Cook, a senior policy analyst at Greenpeace, said: “How we power our digital infrastructure is rapidly becoming critical to whether we will be able to arrest climate change in time. […] If the sector simply grew on its current path without any thought as to where its energy came from, it would become a major contributor to climate change far beyond what it already is.” Indeed, in 2016 The Independent wrote that the carbon footprint of data centres worldwide was equivalent to that of the global aviation industry, or up to 2% of global CO2 emissions.
Some organisations have therefore investigated the share of renewable energy that data centres actually use. The Environmental Leader estimated that in 2015 Google and Amazon used at least 30% fossil energy to power their data centres. The same year, the Lux Research company found, in a benchmark of data centres owned by Google, that 4 out of 7 depended on coal energy. In 2012, Greenpeace released the report “How Clean is your Cloud?”, assessing how environmentally responsible companies were in their use of the cloud and their data centres.
The Green Power Race
These studies by different organisations have triggered a race for green power among the data centres of large companies. Google, Apple, Facebook and Amazon now power their data centres with 100% renewable energy or are aiming for that objective. Amazon, for example, has claimed to power its servers with at least 50% renewable energy since 2018; however, Greenpeace recently contradicted this, estimating the figure at only 12%. Greenpeace also points out that the change triggered by these big Western companies is not enough. Sizeable Chinese web companies such as Baidu and Tencent show very little transparency, communicating little about their energy consumption or their use of green energy, and they have limited access to renewable energy due to monopoly utilities. And while the GAFA are under the spotlight, medium and small data centres remain off the radar.
Nonetheless, the International Energy Agency (IEA) announced that despite the increase in workload for data centres (about 30% by 2020), their electricity use would grow by only about 3%: data centres are becoming more and more energy efficient.
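The efficiency gain implied by the IEA figures can be made explicit with a line of arithmetic; a minimal sketch, assuming the 30% workload growth and 3% electricity growth apply over the same period:

```python
# If workload grows ~30% while electricity use grows only ~3%,
# the energy needed per unit of work falls noticeably.
workload_growth = 1.30
electricity_growth = 1.03

energy_per_unit_work = electricity_growth / workload_growth
print(f"Relative energy per unit of work: {energy_per_unit_work:.2f}")      # about 0.79
print(f"Implied efficiency gain: {(1 - energy_per_unit_work) * 100:.0f}%")  # about 21%
```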
The internet remains our most important source of information and has also made it possible to create less polluting solutions. Reading an email is more eco-friendly than printing it on paper; using an app to find a parking space is more environmentally friendly than driving around in circles looking for one. If you find yourself feeling concerned about this invisible pollution that we generate daily, rest easy in the knowledge that the internet also contains tips for reducing its own electricity consumption.
Monday, 9 am: I am discovering my new workspace: sofas, high tables, meeting bubbles, phone-call spaces, plants… I am officially working in a flex office.
According to a study by Actinéo, nearly 50% of professionals work from outside their office at least occasionally, and 28% do so regularly. From home office to desk sharing and coworking spaces, the workplace is becoming more and more flexible.
The flex office means having no assigned desk in the workplace: imagine arriving each morning with your laptop and settling wherever you like. But is it a concept worth adopting? In this article, you will find the advantages, but also the risks, of the flex office, along with personal feedback.
The flex office emerged from several observations: the office occupancy rate in France is only 60%, and the digital revolution allows many workers to communicate with their team from anywhere. In addition, employees are increasingly drawn to companies that offer a certain level of comfort. By moving away from the constraints of schedules and commutes, they can gain in serenity and productivity.
The flex office meets different needs and desires:
1. A workspace adapted to the day's activity

An employee’s tasks vary from day to day: analysis, data entry, calls, content creation, brainstorming, meetings… Choosing your work environment according to the activity of the day is a real factor of productivity.
2. Simplified communication

Usually grouped within the same department, employees rarely get to exchange ideas outside it. With the flex office concept, barriers are removed so that everyone can connect easily, sharing ideas and knowledge.

Collaborative spaces allow two people from different departments to meet face-to-face and take the time to discuss current issues. Communication is simplified, and employees gain a better understanding of all the company’s businesses, becoming more collaborative and engaged.
3. Space optimization
For the company, every square metre of unoccupied space represents a high cost. The layout of the spaces is therefore important, given that offices are, on average, only about 60% occupied on a daily basis.
The flex office makes it possible to reclaim a large part of these vacant spaces and turn them into user-friendly, useful areas.
Flex office is not for everyone
This new spatial organization is not without consequences for employees. According to a 2017 OpinionWay barometer, 68% of the thousand employees surveyed are afraid of desk sharing.
Depending on each person’s role, the flex office can thus be experienced in very different ways. Those who work in project mode and spend their time running from one place to another in the open space will probably get used to this mode of spatial organization. For others, the change can be brutal, and employees can feel lost: where is the picture of my loved ones, and what happened to that pile of files that used to sit on my desk?
Here are some examples of tension created by the flex office:
1. Loss of personal space
As I mentioned earlier, the flex office completely changes employees’ workspace. For those used to a dedicated desk personalized in their own image, it can be difficult to imagine becoming a nomad.
2. Loss of team cohesion

The flex office, if poorly managed, can have the opposite effect to the one expected. It is important to provide many collaborative spaces and to bring your teams together regularly, so that they get used to gathering on their own.
3. Caring management
For the flex office concept to work, the company must be built on trust, an essential value for happy and productive employees. As positions are no longer allocated and compartmentalized, it is difficult for managers to keep a constant eye on the presence of their teams: between teleworking, collaborative spaces, the cafeteria and meeting booths, an employee can be anywhere at any time.
Hence the need to organize regular meetings with the team in order to monitor the progress of projects.
Many companies have adopted this new work organization, such as Sanofi and BNP Paribas. For the flex office concept to work, the company must prepare its employees: organizational issues have to be considered beforehand to reassure people and allow a smooth transition. And you, are you ready to move on to desk sharing?
Agile methodologies have become increasingly popular in the last few years, and some large companies like IBM have managed to adopt them (see details in the article How IBM’s Biggest Business Unit Got Agile). What makes them so attractive? And why are they interesting for large companies? These new approaches to project management enable companies to stay relevant and competitive in the digital world.
1. What are Agile methodologies?
Thanks to the contribution of 17 software developers, the Agile Manifesto was born in 2001. Its goal was to help people develop software more efficiently by establishing key values:
Individuals and interactions over processes and tools.
Working software over comprehensive documentation.
Customer collaboration over contract negotiation.
Responding to change over following a plan.
In this respect, these new ways of working promote iterative development: regular check-ins with the user (see my previous article on Test and Learn) and tested increments, which accelerate delivery and minimise the cost of change.
In addition, these methods value the human factor by promoting collaboration and multidisciplinarity within teams. This naturally raises the satisfaction of the client, who sits at the core of the design process, and deepens stakeholder involvement. So are Agile methodologies a mere fad or a real game changer?
2. Why are they more relevant for the current market needs?
Today’s business environment is tumultuous, with tough competitors and ever more demanding customers looking for a unique experience. Established companies have to face assaults from digital natives such as startups and collaborative platforms like Uber, Airbnb or BlaBlaCar. This is why a successful digital transformation depends on a company’s ability to use digital technology across all of its areas to deliver the best value to its consumers.
In practice, Agile methodologies make it possible for large companies to adapt to dynamic market expectations throughout the entire project life cycle. They accelerate digital transformation by providing a framework for anticipating and responding rapidly to new needs.
3. Difficulties in adopting this model in large companies
The traditional conception of innovation needs new life
Agile methodologies were originally developed for small teams and projects. This is why many long-established companies today struggle to adapt these methods to their structure. Why?
Companies can no longer think about innovation as they did under the industrial culture. In France particularly, there is often an exclusively engineering-based approach that follows the classic Waterfall model: the project is conducted in a linear fashion, passing from one department to another without adapting the delivery to the client’s needs or to new technical constraints.
However, this does not mean that we have to abandon it altogether, but rather that companies must adjust it and take from agile methods what is relevant and valuable for their business.
The digital natives have an organisational advantage that makes them more flexible and responsive to change. Unlike them, large companies need to reinvent themselves: they have to devise a new strategy to progressively change their culture and involve their collaborators and clients.
Therefore, this kind of transformation has a big impact on departments and individuals. It requires a huge capacity for coordination and communication to frame organisational change in such a large structure. It is also important to point out that not all projects are suitable for agile methods; and there are many such methods, the best known being Extreme Programming, Lean and Scrum.
But don’t believe that you can convert all your departments in one shot with a turnkey solution. You need to organise the change cleverly and progressively; it takes time, patience and the ability to convince sceptics and those resistant to change.
Finally, Agile methods challenge our traditional conceptions of project management as well as corporate culture. These new forms of work, qualified as ‘demand pull’ rather than ‘technology push’, will dominate tomorrow’s market. If you have not started already, you should start considering their benefits.
If you have any good or bad experiences with Agile methodologies to share, or you just want to discuss the topic, don’t hesitate to contact me at firstname.lastname@example.org. I am currently working on this subject for my thesis; thank you for your help!