Big FMCG companies may have a wide range of products in their portfolio, which makes forecast simulation more difficult and less accurate. So what could be the impact of an inaccurate forecast on such companies? And how can they use BI and AI to improve this accuracy?
As a marketer in an FMCG company, I will take this opportunity to briefly share my forecasting experience and process, the problems I used to face along with the reasons behind them, my analysis of what the solution could be as technology progresses in this field, and how technology and humans can work in parallel to improve processes and accuracy.
To understand the real problems I used to face during the forecasting process, I will explain what forecasting represents inside the company and why it is so important to improve its accuracy!
Forecasting is, in general, an act of calculated prediction, in other words, calculating the future! To do that, we need the historical data, the present data and marketplace situation, and the company's targeted growth per product and targeted market share, and we must simulate sales to market against the market trend, taking into consideration the current stock on hand, the stock en route, and the upcoming promotional and marketing events that could affect sales positively or negatively.
What if the forecast is lower than the actual sales-to-market need? You will have stock shortages not only in your warehouse but also a shortfall in supply to the market, thus empty shelves and lost sales! This is a nightmare: you will miss the monthly sales target, your competitors will take advantage, and you risk damaging your brand image!
Now let's take the opposite scenario: what if the forecast is higher than the actual sales to market for just one month? As a direct consequence, you will have slow-moving items in your warehouse, additional stocking costs, and slow movers blocking the space in front of the fast movers, thus an unbalanced stocking level.
Not to forget that, as a golden rule in supply chain, the forecast simulation is placed four months ahead; based on the forecast, the Procurement Department must order the raw materials needed for production (raw materials sometimes need 60 days to reach the warehouse), then the Production team must produce and build up physical stock so that the Logistics Department has a safety stock to supply the warehouses in different areas, and perhaps to ship to different countries!
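The buffer hiding behind that four-month rule can be made concrete with a standard textbook safety-stock formula. This is not from the article but a common supply-chain rule of thumb, and all the numbers below are hypothetical:

```python
import math

def safety_stock(demand_std_per_day: float, lead_time_days: int, z: float = 1.65) -> float:
    """Textbook rule of thumb: z * daily demand std-dev * sqrt(lead time).
    z = 1.65 targets roughly a 95% service level."""
    return z * demand_std_per_day * math.sqrt(lead_time_days)

# Hypothetical numbers: demand varies by ~120 units/day, and raw
# materials take the 60 days mentioned above to reach the warehouse.
print(round(safety_stock(120, 60)))  # extra units to hold as a buffer
```

The longer the lead time, the bigger the buffer grows, which is exactly why forecast errors four months out are so costly.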
What if we’re forecasting for a new product about to be launched? Isn’t it even more difficult? And what if we’re forecasting for more than 600 SKUs? Can you imagine the nightmare without proper solutions and clear data analysis?
Despite having different quantitative and qualitative methods and approaches in forecasting, they are still not enough to avoid forecast inaccuracy. There must be a harmony between all the processes to ensure a smoother mechanism and accurate execution based on solid data analysis and simulation to get clearer solutions and data readings.
Although there are many BI solutions on the market, such as Oracle, IBM, Microsoft SQL Server and JD Edwards, that can be used to improve forecasting, it remains the trickiest operation in almost all industries. Market fluctuations, varying business conditions, economic crises and uncertainty, and shifts in supply and demand can make forecasting more guesswork than science!
So how, through BI and AI, can we optimize this process and increase the accuracy percentage to above 95%?
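One common way to quantify that accuracy percentage is 1 minus the mean absolute percentage error (MAPE). A minimal sketch, using hypothetical monthly figures rather than anything from the article:

```python
def forecast_accuracy(actual, forecast):
    """Forecast accuracy expressed as 1 - MAPE (mean absolute percentage error)."""
    errors = [abs(a - f) / a for a, f in zip(actual, forecast)]
    return 1 - sum(errors) / len(errors)

# Hypothetical monthly figures: units actually sold vs. units forecast.
actual   = [1000, 1200, 900, 1100]
forecast = [ 980, 1150, 950, 1080]
print(f"{forecast_accuracy(actual, forecast):.1%}")  # ~96.6%, above the 95% bar
```

Tracking this number per SKU each month is a simple way to see whether process changes are actually moving accuracy in the right direction.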
What marketing and business managers need is a new set of technologies driven by data science: an approach based on predictive analytics as a complement to forecasting, relying on a combination of historical data, statistics, machine learning, data mining and modelling. These processes enable business leaders to connect data to effective action plans by drawing reliable conclusions about the present situation and future events.
So, how does it work?
By using data, data and more data from internal and external sources via predictive algorithms:
Internal sources essentially include the company’s internal marketing-automation data; sales history by channel, sector, product category and SKU; product labeling and mapping; the current sales situation; stock-level data; business-plan data and targeted sales growth by SKU; marketing-plan data and projected promotional activities and events; and projected extra sales deals, new product launches, area expansions…
External sources consist of data points such as market volume; the current market situation, such as market share by product category; competitors’ offline and online activities by season; product price changes; market trends; the economic situation of the area or country; and customer orientation, including uploaded survey data used to meet customer expectations.
Predictive algorithms use data science to spot correlations between thousands of variables (historical data) and the outcome (sales) to predict the likelihood of closing each prospect. These algorithms can rapidly recalibrate themselves in response to emerging patterns of data.
For instance, if a company makes an acquisition or expands geographically, the algorithms can quickly pick up the nuances and adapt to changing circumstances, ensuring precise predictions even in the most dynamic business environments.
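At its smallest, such a predictive algorithm is just a regression fitted to history. A minimal pure-Python sketch, with hypothetical sales numbers and a single predictor where a production system would use thousands of variables:

```python
def fit_ols(x, y):
    """Ordinary least squares for y = a + b * x with a single predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b  # intercept a, slope b

# Hypothetical monthly sales history; predict each month from the previous one.
sales = [1000, 1040, 1100, 1130, 1180, 1250]
a, b = fit_ols(sales[:-1], sales[1:])
print(round(a + b * sales[-1]))  # projected sales for the next month
```

Refitting the model every time a new month of data arrives is the crude analogue of the rapid recalibration described above.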
Unearthing actionable insights while gazing into the future is great, but the more important task is the internal homework within the company’s departments: not only finding the right data, but also improving internal processes and properly mapping, labeling and gathering those data, so that the algorithms can run more easily and managers can read the data, correct problems and improve solutions.
The point of making data easier to read and solutions easier to find is to enhance business value within the company.
With objective and accurate predictions, it is easier to stay in total control of the pipeline and know exactly what will close and what won’t. This ensures shorter sales cycles, higher rep quota attainment and an increase in average deal size, while reducing sales and marketing costs. Furthermore, it guides smarter decision-making by solving complex business questions in a fraction of the time and uncovers new business opportunities. It is for these reasons that many organizations see predictive analytics as an ROI decision rather than a cost factor.
In conclusion, while it may be galling to discover that a computer thinking for one second can get a better grip on the data than all our human intuition, one can’t really argue with what works. Predictive analytics is revolutionizing sales forecasting by replacing the constraints of human inference and bias with objective models based on forecasting algorithms.
But at the same time, these models cannot work properly and give us the best solutions and readings if humans don’t do their own work properly and provide the right data, with business value and synergy across all the departments and operations concerned!
Why not listen to good music while reading? My inspiration for this article about women in tech is Run the World – Beyoncé. Play it now!
In her book, Lean In: Women, Work, and the Will to Lead, Sheryl Sandberg writes:
The promise of equality is not the same as true equality
These words perfectly illustrate the reality of the tech industry. New technologies are omnipresent today and everybody is impacted by them in one way or another. Forecasts are positive when it comes to jobs and opportunities for all, raising hopes for a more equal world. Yet there is one thing you should know: gender inequality is still clearly present in the tech industry.
WOMEN IN TECH: HISTORY AS A PROOF
Technology, computing, digital… all that we know today was not exclusively created by men. Many women actively contributed to it and they should be thanked for their work. Let’s review some of them!
Ada Lovelace – (1815-1852)
Born in 1815, Ada Lovelace wrote the first algorithm intended to be carried out by a machine, which earned her the nickname of “first computer programmer”. She also raised important questions about how society could treat technology as a collaborative tool. Impressive for the time!
Kay McNulty, Betty Jennings, Betty Snyder, Marlyn Meltzer, Fran Bilas and Ruth Lichterman – 1946
Called the ENIAC 6, these six women programmed one of the first computers in history, the ENIAC. Interestingly, some of them never received any recognition for their work during their lifetimes. Many people, historians especially, assumed they were only “refrigerator ladies”, models posing in front of the machine, when in fact their job was to program it; they were creating something big.
Grace Hopper – (1906-1992)
This woman created the first compiler and led the work behind COBOL, one of the first programming languages. For her work, she received the Computer Science Man of the Year award from the Data Processing Management Association in 1969. Great for a woman! Later, in 1991, she was given the National Medal of Technology, one of the highest honours in the USA for people working in the tech industry.
Radia Perlman – (1951-)
Have you heard of the Spanning Tree Protocol? Well, Radia Perlman has! She invented the algorithm behind it, which provides the “basic traffic rules” of the networks that make up the Internet we know today. She has been called “The Mother of the Internet”.
WOMEN IN TECH: A SYMBOL OF GENDER INEQUALITY
Women making history was unfortunately not enough to achieve gender equality in the tech industry. Many reports published over the last 10 years illustrate this, showing that the trend remains the same. Here are four figures to help you measure it:
1. Syntec Numérique carried out a study highlighting that in 2016 only 33% of all employees in the French digital industry were women. Among them, only 16% held technical jobs such as developer; the others worked in human resources, communication or administration, for example.
Percentage of computing occupations held by women – Author: Julie Compagny
3. The same report showed that 41% of all women employed in the high-tech industry quit in 2015. This is huge compared to men: only 17% of them did so. The report makes clear that the main reason women left their jobs had nothing to do with family concerns. Rather, they saw no possibility of growth and development in the companies they worked for, leading them to change jobs.
4. In 2017, 30% of all Google employees worldwide were women, a proportion unchanged since 2014. Three female employees took the company to court in 2017 because they were paid less than men. It seems that Google still has a lot to learn!
Distribution of Google employees worldwide from 2014 to 2018, by gender – Author: Julie Compagny
WOMEN IN TECH: LET’S BE POSITIVE
The benefits of gender equality
Of course, there are many benefits to gender equality in the tech industry. Two insights are particularly interesting to look at, and we will focus on them.
The second one is that over 1.4 million computing jobs will open by 2020 in the USA. At the country’s current rate of computing graduates, however, only 30% of those jobs could be filled. This is a huge opportunity for women, who are sorely needed!
How to encourage gender equality?
Here again, there are many ways to encourage gender equality. The most logical solutions are very often the ones we do not think about.
For companies, simplifying job descriptions and being honest about the must-haves is a good way to get more women applying. Studies suggest that a woman tends to apply for a job only if she meets 100% of the criteria, whereas a man will apply if he meets just 60% of them. Making job descriptions simpler is therefore a source of opportunity for women. Companies can also promote inclusion and diversity and make them a real priority for all employees, men and women. Encouraging female employees to develop their competencies and reach higher levels is necessary and will favour gender equality.
Other solutions exist and you should try them: find a mentor to help you, join an association or a digital community, have a role model to project yourself onto and, above all, believe in yourself and do not forget that yes, it is possible! If others have succeeded, why wouldn’t you?
Gender inequality in tech can be challenged, and it is the responsibility of all of us to make things change.
In her book, Sheryl Sandberg also writes:
We cannot change what we are not aware of, and once we are aware, we cannot help but change
I really hope that these words will be meaningful to you and that after having read this article, you will actively fight for gender equality in the tech industry.
In the collective imagination, the internet lives in the cloud, when it is actually much more tangible than we like to think. It spreads across the globe through a largely underground network that grows with the demand for internet access. It just so happens that this network is in danger. Indeed, the rise in sea levels due to global warming threatens the parts of this network located along the coasts. The damage caused by this rising water could greatly affect our modern lifestyle.
A tangible and massive internet network
The internet is described by the Cambridge dictionary as “the broad system of connected computers around the world that allows people to share information and communicate with each other”. It has three main components: the end-user equipment, the data centres and the internet network.
This network is itself composed of several elements, such as optical fibre cables, hardware servers, data-transfer stations and power stations. These interconnected elements weave a web that transmits information from one end of the world to the other, and its total length is quite difficult to estimate. In 2014, there were 285 submarine communication cables, totalling about 550,000 miles. The size of the terrestrial network is harder to gauge, as it grows with demand and newly installed cables intermix with the old ones.
In the United States, most internet infrastructure is estimated to have been built in the 1990s and 2000s. At that time, the network’s development followed the growth of major American cities. Today, operators tend to install network extensions alongside other infrastructure such as roads, railways or power lines. In some parts of the world, and throughout history, cities and megacities have developed along the coastlines: port cities synonymous with wealth, opportunity and business. These attractive and often densely populated cities now face a danger: the flooding of their internet network.
The rising seas gaining internet ground
Paul Barford, a computer scientist, and his graduate student, Ramakrishnan Durairajan, undertook a mapping of US internet infrastructure. As the infrastructure is private and belongs to the operators, its locations are kept mostly secret in order to avoid any possible damage. In mapping the network, they observed that it becomes denser in areas of high population, which are often coastal cities.
They presented their work to Carole Barford, a climate scientist, and together they became aware of the risk of flooding for part of the network. They decided to overlay their map with the National Oceanic and Atmospheric Administration’s (NOAA) projections of sea-level rise due to global warming. Through this research, they estimated that by 2033 about 4,000 miles of cable and 1,100 traffic hubs in the US would be underwater. For New York City, about 20% of its internet network would be underwater.
We should not underestimate the repercussions such flooding would have on our current lifestyle. Many services depend on the internet, from traffic lights to medical monitoring and cash dispensers. Cities have suffered blackouts due to flooding before: in 2012, during Hurricane Sandy, 10% of New York City was deprived of electricity.
The problem is that the terrestrial network is designed to be water-resistant, not to operate underwater.
Unlike submarine cables, cables buried in the earth are protected mainly by plastic. They are not adequately protected against floods or frost. And since part of the network is already quite old, it may be even more fragile than the newer extensions.
The three scientists presented their study of US territory at the Applied Networking Research Workshop in Montreal on July 16, 2018. As Carole Barford put it, “The 15-year predictions are really kind of locked in”: nobody can change what will happen. The main cities involved are New York, Miami and Seattle.
Saving the Internet … from itself?
“If we want to be able to function like we expect every day, we’re going to have to spend money and make allowances and plans to accommodate what’s coming,” said Carole Barford. “Most of the damage that’s going to be in the next 100 years will be done sooner than later … That surprised us. The expectation was 50 years to plan for it. We do not have 50 years,” added Paul Barford.
So, what are the solutions to avoid this submersion of the network?
The first would be to locate all the infrastructure that makes up the internet network. Despite the risk of deliberate damage, it is necessary to identify the infrastructure that will be underwater in a few years. The study predicts that about 4,000 miles of cable and 1,100 traffic hubs will eventually be underwater, an estimate based only on the networks the researchers knew about. The study must also be extended to all continents and countries: as rising water levels are a global effect of climate change, many coastal cities are likely to be affected.
To limit the impact of rising water on the internet, operators can consider different solutions: strengthening the current network, moving it further inland, or routing signals around submerged areas. However, none of these solutions is perfect or permanent. Strengthening infrastructure will only work for so long. Routing around submerged areas will degrade the quality of the network and could cause latency. Moving existing infrastructure or building new infrastructure will require significant financial investment that could affect the end user.
Our internet use seems to be in danger, but does the internet contribute to its own destruction? The internet is not as green as it seems. We power data centres, one of its main components, with unsustainable energy sources, creating carbon emissions. Forbes estimated that the carbon footprint of data centres alone is equivalent to that of the global aviation industry, or 2% of global emissions. These carbon dioxide emissions, driven by our increasing use of the internet, are one of the causes of melting ice caps and rising water levels.
Wouldn’t it be ironic if our growing internet addiction was its own worst enemy?
What if the internet became a primary cause of global warming? Ian Bitterlin, a data-centre expert, estimates that by 2030 the internet will consume 20% of the world’s electricity. Today, the energy consumed by the internet is, for the most part, not green. It generates an ever-increasing carbon footprint and has a detrimental impact on global warming. Facing social pressure and increasingly frequent investigations by independent organisations, large companies are now embarking on a race for a green internet.
The internet’s greed for energy
A power-hungry global network
To determine the energy consumption of the internet, one must first ask what the internet is. According to the Cambridge Dictionary, the internet is “the large system of connected computers around the world that allows people to share information and communicate with each other”. A study conducted by Ericsson and TeliaSonera determined that the three most energy-hungry components of this “large system” are end-user equipment, data centres and networks.
The end-user equipment
According to a 2017 study from the Center for the Digital Future, Americans spend an average of one full day per week connected to the internet. A Statista study indicates that teenagers are even more exposed: they spend about 4 hours a day on the internet, slightly more than a full day per week. These numbers are further evidence of the constant connectivity we experience daily. To stay connected, we use devices that we recharge regularly, thus consuming energy.
The data centres
Data centres are also very greedy. A data centre is “a place where computers can be kept safely”, according to the Cambridge Dictionary. Each click, each message sent, each video watched solicits these computer farms. They use electricity to operate, but above all to keep cool: cooling alone accounts for 40 to 50% of the electricity consumed. McKinsey & Company estimates that only 6% to 12% of the power is actually used for computing; much of the rest is held in reserve to absorb surges in activity that could otherwise crash operations.
To illustrate the amount of energy a data centre consumes, Peter Gross, an engineer who designs power systems for data centres, said: “A single data centre can take more power than a medium-size town”. In France, the energy consumption of data centres is higher than the electricity consumption of the city of Lyon (French Electricity Union, 2015). Globally, data centres account for up to 3% of energy consumption, The Independent wrote in 2016.
The internet network
The networks that provide access to the internet are also growing. They include, for example, DSL, cable modem and fibre connections, and they too run on energy.
To determine how energy consumption breaks down across these three major components, the ACEEE estimated in 2012 that downloading a gigabyte of data consumes 5.12 kWh: 48% in data centres, 38% in end-user equipment, and 14% in the networks.
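Those percentages translate directly into per-gigabyte figures; a quick back-of-the-envelope check using the article's numbers:

```python
total_kwh_per_gb = 5.12  # ACEEE 2012 figure for one downloaded gigabyte
shares = {"data centres": 0.48, "end-user equipment": 0.38, "networks": 0.14}

# Split the per-gigabyte total across the three components.
for component, share in shares.items():
    print(f"{component}: {total_kwh_per_gb * share:.2f} kWh")
```

So, per the estimate, a single downloaded gigabyte costs roughly 2.46 kWh in the data centres alone.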
The uncertainty surrounding exact global consumption
Determining the global energy consumption of the internet is complicated. The Centre for Energy-Efficient Telecommunications (CEET) once tried: it estimated that the internet accounted for 1.5% to 2% of the world’s total energy consumption in 2013. Compared with countries, that would make the internet the world’s fifth-largest energy consumer. In 2014, Jonathan Koomey, the Stanford University researcher known for Koomey’s law, estimated this consumption at around 10%, while in 2017 Greenpeace put it lower, at 7%.
A few reasons explain this large difference. The main one is that when end-user equipment consumes energy, that energy is not necessarily used to connect to the internet: a laptop or computer can be used offline to play video games. Allocating the share of electricity used for the internet connection is therefore very complicated, and some experts prefer not to count these devices at all so as not to distort the numbers. Moreover, experts expect this power consumption to double every four years; The Guardian predicts that by 2020 the internet will account for 12% of global energy consumption.
With great power come great sustainable responsibilities
The dark side of the power
The problem with the internet’s energy consumption lies in tracking what kind of energy the network uses. As Gary Cook, a senior policy analyst at Greenpeace, said: “How we power our digital infrastructure is rapidly becoming critical to whether we will be able to arrest climate change in time. […] If the sector simply grew on its current path without any thought as to where its energy came from, it would become a major contributor to climate change far beyond what it already is.” Indeed, in 2016 The Independent wrote that the carbon footprint of data centres worldwide was equivalent to that of the global aviation industry, up to 2% of global CO2 emissions.
Some organisations have therefore investigated the share of renewable energy that data centres consume. The Environmental Leader estimated that in 2015 Google and Amazon used at least 30% fossil energy to power their data centres. In 2015, the Lux Research company found, through a benchmark of data centres owned by Google, that 4 out of 7 depended on coal power. In 2012, Greenpeace released the report “How Clean is your Cloud?”, detailing how well (or badly) some companies respected the environment through their cloud and data-centre operations.
The Green Power Race
These studies by different organisations have triggered a race for green power among the data centres of large companies. Google, Apple, Facebook and Amazon now power their data centres with 100% renewable energy, or are aiming for that objective. Amazon, for example, has claimed to power its servers with at least 50% renewable energy since 2018; however, Greenpeace recently contradicted this, estimating the share at only 12%. Greenpeace also points out that the change triggered by these big Western companies is not enough. Sizeable Chinese web companies such as Baidu and Tencent show very little transparency, communicating little about their energy consumption or their use of green energy, and they have little access to renewables because of monopoly utilities. And while the GAFA are under the spotlight, medium and small data centres remain off the radar.
Nonetheless, the International Energy Agency (IEA) announced that despite a roughly 30% increase in data-centre workload by 2020, the electricity they use should grow by only about 3%: data centres are becoming more and more energy efficient.
The internet remains the most important source of information and has also made less-polluting solutions possible. Reading an email is more eco-friendly than printing it on paper. Using an app to find a parking space is more environmentally friendly than driving around in circles to find one. If you find yourself worrying about this invisible pollution we generate daily, rest easy in the knowledge that the internet also contains tips for reducing its own electricity consumption.
“Ok Google, turn on the living-room light” is one of the first sentences I say when I get home. Like many people, I gave in to temptation and got the must-have gadget, a smart speaker, for Christmas.
All these products have in common that they are connected to the Internet via WiFi and have a virtual assistant capable of answering oral questions. With their synthetic voice, they promise to make life easier by providing access to a multitude of services faster and more naturally than by typing on a smartphone’s virtual keyboard.
But it is important to understand a few things before buying one, because it might also be a threat to your privacy. Instead of comparing all the features offered by these smart speakers, here are some questions I asked myself before purchasing my Google Home.
Do they add real value?
Access to certain features faster than on a mobile phone
“Alexa, launch episode 2 of season 3 of Orange Is the New Black”, “Hey Siri, play my favorite playlist”, “Okay Google, turn on the TV on M6”. These are easy commands to execute if you have a Netflix subscription or a Spotify account.
Plan your daily life
“Okay Google, what are my appointments today?” The smart speaker will read out the entries in your Google Calendar. It can also wake you up at a pre-defined time with the sound of birds, then give you the weather report and the latest news.
Control connected objects in the house
Turning on the lights, managing the thermostat, opening your shutters, etc. Be careful: you must check beforehand that your devices are compatible with the speaker, and you will need to be very patient with the configuration. Once these hurdles are cleared, you can program your objects. For example: open the shutters at 8:00, turn on the bedroom light at 8:10 and start the coffee machine at 8:15, simply by saying “Hey Siri, hello”. Freaky or not?
This new technology will help us know everything whenever we need it. Announced as our digitized future, information is everywhere and nowhere at the same time.
What can’t they do?
Smart speakers are still imperfect; after all, nobody’s perfect, are they? Sometimes you have to ask several times to make yourself understood, especially when requesting foreign-language music.
These connected speakers cannot yet hold a conversation like a real human. It is better to stick to the command lists provided by each company (Google, Amazon, Apple) to be sure of being understood.
Are they spies in the middle of your living room?
Installed in a corner of your living room, connected speakers are a new gateway for Amazon, Google and Apple. They allow these companies to offer services directly in consumers’ homes and to learn more about their habits and behavior.
Before purchasing this type of product, keep in mind that it is an internet-connected device equipped with several microphones. They normally stand by and activate only when the speaker’s wake words, such as “Alexa”, are spoken. Despite this precaution, there is a risk of hacking, as with any object connected over WiFi.
Several incidents of involuntary microphone activation have already made the news. For example, Washington TV station KIRO 7 was contacted by an American couple whose conversation had been recorded by an Amazon speaker and sent to one of the husband’s employees in Seattle.
This isn’t the first time Alexa has gone wrong: users of Amazon’s digital assistant have reported their Echos laughing at them for no apparent reason.
This type of incident can be very dangerous given the data these devices collect. The speakers gather a lot of information about their users’ habits and behaviors, and that data says a lot about their private lives.
How to limit their intrusiveness?
If you are wary of these connected objects, you can limit their intrusiveness by following a few basic principles. Here is some advice from the CNIL, the French personal data protection authority:
Turn off the microphone when you are not using your speaker or when you have friends over.
Regularly delete the history of your conversations.
Don’t entrust them with sensitive functions such as alarms or locks.
These simple tips will protect you in case of hacking or malfunction. You should also be aware that when you ask these speakers something, they select the information sources for the answer, whereas your computer or smartphone offers you several sources each time. That selection may influence you.
In the ideal connected home, humans converse easily and safely with their digital personal assistants, asking them for a whole range of services. Collecting our data allows the assistants to deliver more targeted services and improve the quality of what we get from them, but is it really worth losing our privacy?