Is Google’s Algorithm more human than you think?


Search Quality Raters: these users evaluate the quality of Google’s search results. Behind the magic of automation, users’ queries, intentions and search results are analysed and evaluated, tightly circumscribed by guidelines provided by Google. Their feedback is used by engineers to improve the quality of the search results offered to everyday users. However, SEO professionals worry about the impact of these teams’ work on URL rankings. Despite claims to the contrary by Google managers, minds remain sceptical about how the collected data is actually used.

The humans behind the algorithm

An unknown profession, but not secret

Since the early 2000s, people have been working on and analysing the results of Google’s algorithm. Today, there are approximately 10,000 of them worldwide. They are ordinary people, search engine users like everyone else. They applied for a part-time job offer at a third-party company such as Lionbridge or Leapforce and had to pass two tests to be selected: one testing their reasoning through questions, the other composed of ‘nearly real-life’ exercises. From home, they spend between 10 and 20 hours per week (paid $12-15/hour) studying and giving feedback on searches that have already taken place.

“In-our-shoes” analyses

The analysed results are mainly organic: texts, images, videos and news results (and sometimes paid ad results as well). Each day, they are assigned different tasks to evaluate search results. They can, for example, test a given URL and assess its relevance to a query on desktop or mobile. They also make side-by-side comparisons of the organic results of the same search and select the results that best match the query.

The companies provide them with information such as the language of the search, the location and sometimes a query map (a map of previously searched queries) to better understand the user’s intention. Their purpose: to put themselves in the shoes of any user and determine whether the results are relevant to the intent behind the search.


A closely monitored job

Each task has an estimated completion time, and the agencies time Search Quality Raters during their tasks to judge their effectiveness. For example, evaluating the quality of a URL is estimated at 1 minute and 48 seconds. To ensure that the analysis is done without bias and with diligence, the same tasks are assigned to several Search Quality Raters. If their results diverge, they are asked to reach an agreement together. In case of persistent disagreement, a moderator decides.


The Guidelines: Quality Made in Google

To frame the evaluation of search result quality as tightly as possible, Google transmits guidelines via the third-party companies. In 2015, after many leaks, Google finally decided to publish them officially.

Google regularly updates them according to the algorithm’s new objectives. The last official publication dates back to July 20, 2018, and is 164 pages long.

In its guidelines, Google explains to Search Quality Raters how to evaluate the quality of the pages in its search engine. This involves three ratings.

Needs Met

The objective is to verify that the result matches the query and the user’s intention. For this, Google identifies four kinds of queries: those whose objective is to learn something (Know), to act (Do), to go to a specific site (Website), or to visit a place (Visit-in-Person). The Search Quality Rater evaluates whether the result meets the need by placing a cursor on a scale from FailsM (Fails to Meet) to FullyM (Fully Meets). Some queries can be a mixture of several types.

Scale of the Needs Met Rating

A Search Quality Rater may decide not to assign a rating and to “flag” the content in certain cases: if the material is pornographic, presented in a language different from that of the query, does not load, or is upsetting and/or offensive.


The E-A-T

The acronym E-A-T stands for Expertise, Authoritativeness, Trustworthiness. Search Quality Raters assess the level of expertise of the content by verifying that the author of the main content has enough personal experience for it to be considered relevant.

They then assess the authority of the main content, the site and the author. A Search Quality Rater must find evidence of their reputation, and recommendations from entities whose authority is already clearly established.

Finally, Trustworthiness is the confidence that the user can have towards the site. It is established with the main content, the website and the author.

This evaluation is in no way related to the query. Through these criteria, Google emphasizes assessing the benefit that content brings to users. As it says on the Google Blog: “We built Google for the users, not for websites”. Through this rating, Google is also fighting back against the rise of fake news.

We built Google for the users, not for websites – The Google Blog

The Overall Page Quality rating


This rating is based on the query and the user’s intent. It covers five criteria: the purpose of the page, the E-A-T rating, the quality of the main content, the information found, and the reputation of the website and the author.

Scale of the Overall Page Quality Rating

The YMYL pages

Some pages are rated more strictly than others. The “Your Money or Your Life” (YMYL) page category, created by Google, groups together pages containing medical, financial, legal, news and public/official information, as well as pages for shopping or financial transactions. Their content can have a significant impact on the lives of the users reading them, which is why they must contain high-quality information.

A quarter of the guidelines’ pages are dedicated to mobile queries and the assessment of their content, especially for “visit-in-person” queries. Both the main content and the quality of the pages’ mobile optimisation play a full part in this.

Grey Areas around the ratings

The impact on the SERP ranking

Many experts have expressed concerns about the role of Search Quality Raters in the Search Engine Results Page (SERP). Can the evaluation of URL quality and the feedback from Search Quality Raters cause a downgrade? Is the collected data reused beyond tuning the algorithm? In response, Matt Cutts, the head of the webspam team at Google, said the feedback is only used to refine the algorithm: the webspam and quality-rater teams have two separate goals and are not connected.


Indeed, the process would be to evaluate the quality of sites first. Then, when engineers change the algorithm, Search Quality Raters assess the difference in quality during side-by-side evaluations, without knowing which side contains the product of the change and which is the old version. Engineers then modify and improve the algorithm based on the Search Quality Raters’ feedback, and can run a live test on a small percentage of users who are not Search Quality Raters.

However, even if in the short term the ranking of a page judged to be of poor quality by Google is not altered, we can imagine that this will happen in the long term. If a page presents characteristics considered to be of bad quality, the fact that a Search Quality Rater notes it as such will not directly impact its ranking. On the other hand, the engineers will make sure that only high-quality results appear among the top results as the algorithm changes.

The Search Quality Evaluator Guidelines as SEO bedtime reading  

The ratings of Search Quality Raters are therefore essential. Unfortunately, Google does not communicate them to site owners, but the guidelines framing these ratings are public, which is why the Search Quality Evaluator Guidelines are an essential document for evaluating one’s own content. By doing our own assessment, we are more than likely to find areas for improvement. Moreover, as SEO is an ongoing effort, this evaluation should be repeated regularly, and especially whenever the guidelines are reworked.




Mobile Proximity Marketing in France


Location-based data at your fingertips 


Le Nouvel Économiste ran a headline two months ago announcing that we have moved into the « Mobile moment ». Indeed, Google announced last September that, in France, mobile queries had exceeded those made via desktop over the summer of 2017.

French players are clearly lagging behind when it comes to brands leveraging mobile to increase conversion, but the wind is shifting rapidly and they are on their way to catching up. The main updates and fields of improvement revolve around improving page load times, investing in Accelerated Mobile Pages (AMP) provided by Google, adopting mobile-first content, etc.
On this matter, according to Thomas Husson, analyst at Forrester, this revolution is not only about « a change in mobile usage » but, more widely, an « evolution of consumers’ states of mind »: they want to receive services in real time, whenever they feel the need. In terms of improving customer experience and building brand loyalty, he adds that brands should properly perform upstream measurements to evaluate the full value of mobile in driving in-store traffic, as 85% to 90% of transactions are still made offline. That is why collecting contextual data linked to consumers is key to anticipating their needs.


Another key milestone coming this year is the launch of Google’s mobile-first index, which will impact your overall mobile marketing strategy. Will it become your new SEO Bible? In any case, it will change the whole search-result experience by giving full credit to the mobile version of your site when ranking it on Google. More than ever, you will need to understand your users’ behaviour and preferences in order to truly engage them with mobile-optimised content and an optimised website, and so increase your mobile conversion rate.


According to the 2018 Baromobile study released by OMD (Omnicom Media Group) together with the drive-to-store expert S4M, awareness around « data connection » has risen and consumers are using mobile more and more.


Furthermore, according to the 2016 study by the Mobile Marketing Association together with ARCEP, « 80% of mobile users receiving a commercial SMS declare themselves interested in its content », while « 20% of them have already clicked on a promotional link ». So we can ask ourselves:

  • Which SMS strategies are brands leveraging to push their in-store conversion?

For local acquisition campaigns, SMS geofencing has become an essential tool. Since 2015, the commercialisation of SMS for client acquisition has been rising in France, as retailers and brands want to grow their SMS databases and increase conversion. According to a 2016 study, SMS boasts a tremendous open rate of 95% (with 90% of messages read within 4 minutes, per the MMA/ARCEP study) and a memorisation rate of 60% (INSEE, 2016).

  • What simple content should you push?

Flash-sale notifications, discounts with flash codes, links redirecting to a landing page, videos. You can also personalise your SMS by creating a custom sender ID (or OADC), but remember: 160 characters maximum, and the STOP mention must be included to stay privacy-friendly.
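The two constraints above (the 160-character limit and the mandatory STOP mention) lend themselves to a simple automated pre-send check. Here is a minimal sketch; the function name and the exact STOP wording are illustrative assumptions, not any carrier's official API.

```python
# Minimal pre-send compliance check for a promotional SMS.
# The 160-character limit and the STOP mention come from the article;
# everything else (names, wording) is illustrative.

MAX_SMS_LENGTH = 160

def validate_promo_sms(message: str) -> list:
    """Return a list of compliance problems (empty list means OK to send)."""
    problems = []
    if len(message) > MAX_SMS_LENGTH:
        problems.append(
            f"message is {len(message)} chars, max is {MAX_SMS_LENGTH}"
        )
    if "STOP" not in message.upper():
        problems.append("missing the mandatory STOP opt-out mention")
    return problems

sms = "Flash sale this weekend! -30% on all shoes in store. STOP 36180"
print(validate_promo_sms(sms))  # [] -> compliant
```

In practice a real gateway would also check the character encoding (GSM-7 vs UCS-2, which halves the limit), but the idea is the same.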

  • How to use geofencing in your SMS geotargeting?

Retailers use geolocation as a real-life « cookie » to drive traffic in store thanks to GPS, beacon or wifi technologies. One condition for using geofencing properly: before sending any commercial offer, you must first get permission from the mobile number’s owner to text them. This can be done through an SMS keyword sent to a short code (« shoes » to 3638) or an opt-in obtained through a previous campaign. Once customers give their permission to share their location, you can set up your campaign and choose, for instance, a radius of 20 km (the maximum distance) around potential clients’ postal addresses, or a radius of 1 km around your retail store to reach them.
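At its core, the geofencing test described above is just a distance check: is a device inside a given radius around a point of interest? A toy sketch, with illustrative coordinates (real platforms rely on GPS, wifi or beacon fixes and much more machinery):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def in_geofence(device, store, radius_km):
    """True if the device's (lat, lon) fix falls within the fence."""
    return haversine_km(*device, *store) <= radius_km

store = (48.8566, 2.3522)          # central Paris (illustrative)
nearby_device = (48.8530, 2.3499)  # a few hundred metres away
print(in_geofence(nearby_device, store, radius_km=1.0))  # True
```

The same check with `radius_km=20.0` would implement the maximum fence mentioned above.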

  • What’s the main advantage of SMS geotargeting?

It provides you with high-precision targeting based on past and real-time results and, thanks to optimisation algorithms, you will be able to consistently improve your target audience over time, predicting needs and purchase intent while analysing POS traffic and the factors influencing it.

Some French specialists in geo-profiling and geolocation for retailers, such as Ubudu in the tourism industry, can help you build your proximity marketing strategy by providing high-precision geolocation features (accurate to within 10 cm!). You will be able to send push notifications according to customers’ location, send them personalised welcome messages, let customers skip queues in store and help them find products and services as fast as possible, delivering unprecedented experiences!

  • Toward an omnichannel strategy?

Click-and-mortar and pure-player businesses are consistently leveraging the mobile experience to fuel their omnichannel strategy, multiplying acquisition campaigns and activations. Sephora is a good example of a makeup retailer using SMS within its 360° « Wonder » Christmas campaign, for which it was recently rewarded (read the dedicated article here).


Mobile is key when we think of cross-channel and ROPO (Research Online, Purchase Offline) strategies because, according to the 2016 study by L’Observatoire du ROPO², on average, out of 100 visits to a retailer’s store (over a period of 3 months), 46% of visitors are cross-channel (visiting both the website and the store), and the conversion rate via mobile in France reaches 56%. That is why brands should perform various drive-to-store activations such as drive-to-store RTB, Facebook Offers or SMS.

Let’s focus on mobile display advertising, which offers huge potential for activating local customers:
→ You can stimulate real-time store visits by precisely targeting your customer catchment area. Platforms such as Adsquare, a mobile-first data exchange, allow you to broadcast custom offers based on contextual information (opening hours, weather, events and dedicated locations), adjust the radius of your reach and add ads redirecting to Google Maps.
→ One insightful business case is the partnership between the French adtech startup Databerries (now Teemo) and Intersport around its « Real life targeting » solution. It includes an SDK (a set of programming tools for your app) that easily integrates geolocation data from 40 app publishers (including Marmiton and L’Équipe). For Intersport, a « visit » means a 15-minute stay in store. Thanks to Databerries, Intersport confirms that for every euro invested, it gets back 4 to 6 euros of margin (study available here).
In terms of your overall customer journey, what could the next steps be once you have attracted potential customers in-store? Leveraging the collected data in real time is essential:

2/ During the visits :

  • In-store wifi: offer free in-store wifi access to identify your visitors and activate marketing triggers (push notifications, trigger content on the wifi-connection landing page) to collect data and new opt-ins
  • Geofenced local notifications

3/ Post-visit: behavioural retargeting
By collecting individual visit data and sending:

  • SMS / Email offering complementary/lookalike products
  • SMS/Email to encourage revisiting the store with a welcome back message



Transforming your store into a « data generator » and enhancing your database platform to serve your drive-to-store and omnichannel strategy is the way to proceed to boost your sales. Retailers are thus tending to adopt a « cookie-less » targeting model thanks to unique identifiers.

→ A great variety of first-party location data is now at your disposal, such as the mobile ID (MAC address), mobile number and opt-in, thanks to the synchronisation between a device carrying a unique identifier and emitters such as wifi sensors, IoT signals, GPS and beacons. You will therefore be able to cross historical location data, detailed behavioural patterns and intent data within the same place. Adtech vendors are improving their capabilities accordingly, now offering location-smart SDKs. On the media-buying platform side (DSPs, DMPs, trading desks), French players are forming more and more partnerships to access millions of unique IDs and combine them with online identifiers and geolocation data. For instance, Vectaury couples its DSP with its SDK to propose ultra-qualified audience segments and, thanks to partnerships with Mondadori and Madvertise, has access to data on more than 10 million located users, which it offers to publishers to maximise conversion.
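Joining location events to a device via its unique identifier, as described above, is often done on a salted hash of the raw identifier rather than the identifier itself, so location histories can be matched without storing raw MAC addresses. A sketch of that common pattern (the salt value and the 16-character truncation are illustrative assumptions, not any vendor's documented scheme):

```python
import hashlib

SALT = "2018-campaign-salt"  # in practice, rotated per campaign or period

def pseudonymous_id(raw_identifier: str) -> str:
    """Derive a stable pseudonym from a MAC address or advertising ID."""
    digest = hashlib.sha256((SALT + raw_identifier.lower()).encode("utf-8"))
    return digest.hexdigest()[:16]

# Two events carrying the same device identifier map to the same
# pseudonym, so they can be joined across datasets.
event_a = pseudonymous_id("AA:BB:CC:DD:EE:FF")
event_b = pseudonymous_id("aa:bb:cc:dd:ee:ff")
print(event_a == event_b)  # True
```

Rotating the salt limits how long a pseudonym stays linkable, which matters for the consent and anonymisation requirements discussed later in this article.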

« Mobility data », as it is called, has therefore become a new source of information and is used as a proxy for second- and third-party engagement data (read the article here).
For instance, Fidzup can provide you with a clear measurement of your drive-to-store ROI by performing behavioural targeting in and out of the store (buying intent, catchment area, competitors’ customers) and can collect granular data from store sections, fitting rooms, floors and cash registers. (To know more about Fidzup: here.)

However, in order to ensure clean, high-quality data, your mobile advertisers must cross-reference location data from different sources (for example GPS, phone operators, beacons or specific algorithms) and detect any possible ad fraud, in order to collect compliant, actionable data. On the compliance side, the French mobile player Vectaury has been offering a new opt-in since the beginning of 2018: a « geo-transparent opt-in » banner to simplify consent collection from mobile users and perform anonymised profiling (read the article here).
As a matter of fact, the overall marketing need to collect contextual data is becoming widespread, as confirmed by the latest trends in the adtech market. Since the beginning of 2018, the adtech sector attracting the most investment in France has been drive-to-store. French players like Teemo (ex-Databerries), Mozoo, Vectaury and Fidzup are leveraging offline market opportunities, given that this market is not dominated by Google and Facebook. For instance, the French player Tabmo is trying to make mobile DSPs more accessible to agencies supporting brands, thanks to its creative modules: GPstore (drive to store), MScroll (canvas type) and SlideMotion (read the article here).

It is no longer a question of proximity advertising but of the uniqueness of location data, which leads to new marketing strategies: online-to-offline attribution, retargeting and re-engagement. What is important to keep in mind is to leverage your CRM data and onboarding, thanks to cross-device data combination. According to Ray Kingman, CEO of Semcasting: « By using IP, location and device matching, marketers can leverage mobile to execute deterministic attribution with 70%-90% coverage matching across all channels and platforms – both online and offline ».

New marketing metrics are therefore emerging to measure the impact of online ads on offline sales, with KPIs such as visit rate, cost per visit (CPV), visit duration, number of visits and areas visited (location intelligence).
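Two of the KPIs listed above have straightforward definitions and can be sketched directly; all campaign figures below are made up for illustration.

```python
def visit_rate(store_visits: int, ad_exposures: int) -> float:
    """Share of ad-exposed users who then visited the store."""
    return store_visits / ad_exposures

def cost_per_visit(ad_spend_eur: float, store_visits: int) -> float:
    """Media spend divided by the store visits it generated (CPV)."""
    return ad_spend_eur / store_visits

exposures, visits, spend = 200_000, 3_000, 9_000.0
print(f"visit rate: {visit_rate(visits, exposures):.1%}")  # visit rate: 1.5%
print(f"CPV: {cost_per_visit(spend, visits):.2f} EUR")     # CPV: 3.00 EUR
```

The hard part in practice is not the arithmetic but attributing a physical visit to an ad exposure, which is exactly what the measurement companies discussed below are competing on.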

To go further on the metrics subject: at the Mobile World Congress held in Barcelona earlier this year (from February 26th to March 1st), the company Ogury took the opportunity to publicise its first complete app-ecosystem intelligence solution by launching « Active Insights », which combines cross-app data and provides full cross-device reports. The Ogury mobile data-collection platform holds data from more than 400 million users, with audience segments based on mobile users’ app usage. It gives marketers competitor insights, helps them deal with churn, and lets them segment their targets to deploy mobile ad targeting beyond the apps themselves, at an unprecedented scale in terms of insights.


After casting a wide net around mobile strategies, what if blockchain came into play? Mobiquity Networks, a mobile location-data intelligence provider, has adopted a blockchain layer in its marketing strategy. Its platform gathers IDs and IDFAs (Apple identifiers) with anonymised location data (such as footfall traffic in-store) via third-party, permission-based mobile apps (primarily shopping apps), combines them with other data, and sells the result to clients. Adding a hybrid blockchain layer to your database can help your business deal with data-separation problems, providing structures to guide your data pointers and break down each « event » within your blockchain, together with the source IDs written into it. Blockchain is therefore an extra consideration not to be underestimated in the digital marketing and advertising data-management landscape.

To conclude, drive-to-store marketing strategies have not yet reached maturity, as their overall impact measurement (whatever the device used) is still not entirely mastered by measurement companies. The latter are trying to provide real-time insights to measure the entire online and near-store/in-store customer journey, which will allow brands to perform hyper-local marketing campaigns. The advent of contactless mobile payments is another factor that will reinforce the practice of real-time targeted actions, as well as analysis based on these new consumer shopping habits.

The « mobile-ization » of retailers and pure-player brands is therefore under way in France, blending greater ease and convenience of the shopping experience for mobile users with customers looking for better on-the-go experiences.





Our first steps on Google Analytics

Google Analytics

You have probably heard about this analytics tool. Everyone can access it freely. For some of us, it seems to be a bête noire. It’s time to demystify it! At first glance, sure, it can be intimidating… But don’t panic! I will show that this tool can be easily managed by everyone.

Interface Google Analytics

Even if we all agree that the tool is powerful, seriously, Google: why is the interface so user-unfriendly? Frankly speaking, plenty of more beautiful dashboards exist (I don’t want to start a debate… and by the way, as a piece of first-hand information, Google is going to revamp the interface!).

Anyway, believe me, this tool is worth a visit. Why?

5 ways to analyse your data effectively

Because data is precious! In my opinion, here are the main features:

1.    Understand the impact of our communication actions

We often spend time creating amazing Facebook posts, sending truly beautiful newsletters or even writing killer articles. But what is the actual ROI of these frantic online endeavours? Google Analytics lets you analyse which actions and channels are the most efficient and bring traffic to the website.

Audience on Google Analytics

In this example (accessible from “Audience > Overview”), we sent a newsletter announcing the new website. The number of visits spiked on that very date. No doubt about it: the newsletter campaign was successful (with an open rate above 50%).

2.    Get to know your visitors better

In GA, it’s possible to analyse your visitors’ profiles: demographic data, interests, behaviour, and so on.

Visitor overview on Google Analytics

This information lets you understand which type of person you reach: first, to adapt your communication to the core target; second, it may reveal that your targeting is not efficient, which should perhaps lead you to redefine the strategy to better qualify your marketing campaigns.

3.    Understand the user journey

Another interesting tab is “Behaviour”. This data lets you visualise your visitors’ user journey: what are the different stages of navigation (Homepage > Blog > Article 1…)? At which point does the visitor leave your website?

Behaviour on Google Analytics

Finally, what type of information do we collect? Here, we can easily interpret the data. It’s interesting to identify the pages where we lose a lot of traffic, and also to try to understand the reasons why.

In e-commerce, the pattern is very similar across websites: many people leave the website at the payment stage when they discover the shipping costs.
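Reading a funnel like that boils down to one number per stage: what share of visitors do we lose before the next step? A minimal sketch, with made-up stage names and counts:

```python
def drop_off_rates(funnel):
    """Given [(stage, visitor_count), ...] ordered along the journey,
    return the share of visitors lost between each stage and the next."""
    rates = {}
    for (stage, count), (_, next_count) in zip(funnel, funnel[1:]):
        rates[stage] = 1 - next_count / count
    return rates

funnel = [
    ("product page", 1000),
    ("cart", 400),
    ("shipping costs", 300),
    ("payment", 120),
]
print(drop_off_rates(funnel))
# {'product page': 0.6, 'cart': 0.25, 'shipping costs': 0.6}
```

Here the biggest leaks are leaving the product page and abandoning once shipping costs appear, which is exactly the kind of diagnosis the Behaviour reports support.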

4.    Identify the different entry points of the website

Directly linked to the first point (the impact of communication actions), it’s interesting to know how people arrive on your website, and where they land.

It could be via Facebook, where you shared your blog articles, via the Google search bar (if you have optimised your SEO correctly), or even via direct access (when people type the URL directly).

5.    Measure the ROI of your AdWords campaigns

SEA campaigns on Google are connected to GA. When you are selling something on your website (e-commerce, services, or even direct contact…), it is interesting to calculate their ROI (Return On Investment) and analyse your paid campaigns to see whether they bring in money.
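The ROI calculation itself is simple once GA attributes revenue to the paid channel. A quick sketch with made-up campaign figures:

```python
def roi(revenue: float, cost: float) -> float:
    """Return on investment: profit relative to what was spent."""
    return (revenue - cost) / cost

adwords_cost = 2_500.0     # monthly ad spend (illustrative)
adwords_revenue = 9_000.0  # revenue attributed to paid search (illustrative)
print(f"ROI: {roi(adwords_revenue, adwords_cost):.0%}")  # ROI: 260%
```

A positive ROI means the campaign pays for itself; anything near or below 0% is a signal to rework keywords, bids or landing pages.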

I asked Alexandra, co-founder of, to share with us the analytics of this e-commerce website.

Acquisition channel on Google Analytics

In this screenshot, we can analyse which channel is the most efficient in terms of conversion. In this case, visitors who clicked on a paid search ad (Google AdWords) account for around 41% of the people who ended up buying the product. This means that almost half of the turnover comes from paid search! On the contrary, social network conversion is quite low compared to other acquisition channels, even though it brings immeasurable value through awareness and word of mouth.

To go further

  • Have a look at the Google tutorials!

Google offers videos on the Analytics Academy platform. If you want to know more, you can follow these tutorials (plan for between 4 and 6 hours).

  • Be certified!

If you want to put these skills on your resume or your LinkedIn profile, Google offers a free certification through an online test. You can also check the preparation guide to make sure you are totally ready!

To conclude…

I hope you have found this piece helpful, and I encourage you to interact with me through the comments section. I’ll be more than happy to answer your questions.


UX: Why is it so important and for which businesses?

Facing the surge of articles on user experience, I felt the need to go back to fundamentals. When we talk about UX, what are we really talking about? User experience is the way a user perceives a service as a whole. A good user experience must induce a propensity to use the service durably, or even to contribute to its evolution. On the other hand, a bad experience makes the customer flee. The challenge is therefore huge, because the quality of the user experience determines the crystallisation and development of a service relationship. By definition, user experience is “the responses and perceptions of a person resulting from the use or anticipated use of a product, service or system”.
This definition is clearly a matter of ergonomics. The term user experience, as formulated in the 1990s by Donald Norman, a professor of cognitive science at the University of California, refers to psychology and therefore to emotions. Its aim is to provide the most appropriate approach to a targeted audience for any given offer (products, services, companies, etc.). The better adapted the approach, the greater the satisfaction with the offer.


Why is it so important?

The whole purpose of UX is to reduce cognitive effort, a somewhat barbaric expression which simply means sparing the user from having to spend all their mental capacity to fulfil a task at a given moment.

Companies are taking User Experience more and more seriously for several reasons:

The first one is competition. In fact, what differentiates Trainline from Voyages-SNCF? Nothing, because both sell train tickets. And that was the crazy bet Captain Train made: staking everything on the interface and offering one of the best user experiences on the market. The interface is clear, really easy to understand, and it takes less than 3 minutes to order a train ticket. Nowadays, that’s exactly what users are looking for: being able to access a specific service or buy a specific thing without any frustration or friction throughout the customer journey.


The second reason, one of the most important, is the following: nowadays, with the rise of internet usage, there are no longer physical salespeople helping customers. Salespeople have been replaced by the interface. The consumer is alone in front of their device, and a smooth user experience is one of the main reasons why consumers will choose to buy online or leave the website.

Moreover, the challenge for online businesses is to truly understand the customer journey. In fact, since 2007, investments from online businesses have focused on acquisition. Acquisition consists of bringing and attracting as many people as possible to the website. The problem is: with acquisition strategies, it is very difficult to prove the return on investment. Attracting people to a website is necessary and extremely important, but one has to make sure those people convert. On top of that, acquisition costs have been multiplied by 40 over the past five years. Today, e-commerce companies focus on understanding customer paths to increase conversion rates and business performance. 94% of CMOs declared that user experience would be their priority marketing investment in 2017, their first goal being to improve the web and mobile customer journey on their website to increase revenue.

Which businesses are concerned?

ALL OF THEM. Businesses in every sector are affected by the importance of a better user experience.

Obviously, at the very beginning of the growing awareness of the importance of UX, the most mature players were, and still are, e-retailers. In fact, a better user experience and a smooth customer journey can make a HUGE difference to their revenue. But today, all businesses feel concerned, through the different challenges they face. For example, a retailer will not have the same challenges as a bank, a luxury brand or an online media outlet.

Banks, for example, face very specific challenges. The retail banking market is known to be particularly cumbersome and volatile: even though, on average, 25% of French people say they are unsatisfied with their bank, only 3% of them change banks each year. Banks need to provide a great user experience, making it easy to open an account or transfer money. A good interface coupled with great services and functionality is the key to performing well.

Online media are even more specific. Indeed, they have two objectives: monetising their audience and increasing their subscription rate. To monetise their audience as much as possible, they need to provide a great user experience that makes readers willing to stay on the website as long as possible. By boosting user retention, they can display more ads.


User experience and customer journey are no longer just trendy buzzwords; they remain real challenges for online businesses, regardless of sector.

To provide a great user experience, the first and most important thing is to KNOW WHO your customers are and UNDERSTAND THEM. Companies use several tools and solutions to optimise their websites. Traffic analytics, such as Adobe, Google Analytics and AT Internet, help them establish the facts: bounce rate, time spent, number of pages viewed. Behavioural analytics, such as ContentSquare, help them understand how users interact with their website, mobile site and app; basically, it answers the question ‘why did my user leave the website without buying or subscribing?’ And then, once all of this has been understood, come the e-merchandising, customisation, content and testing solutions such as Qubit, AB Tasty or Kameleoon.
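The first of those facts, bounce rate, has a simple definition worth making concrete: the share of sessions that viewed only a single page. A toy sketch with made-up session data (analytics suites compute this from tracked page views, not from a list like this):

```python
def bounce_rate(page_views_per_session):
    """Share of sessions that viewed exactly one page."""
    bounces = sum(1 for views in page_views_per_session if views == 1)
    return bounces / len(page_views_per_session)

sessions = [1, 3, 1, 5, 2, 1, 1, 4, 2, 1]  # pages viewed in 10 sessions
print(f"bounce rate: {bounce_rate(sessions):.0%}")  # bounce rate: 50%
```

A high bounce rate on a landing page is often the first symptom that behavioural analytics and A/B testing are then used to explain and fix.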


Google launches a free version of Optimize and Data Studio

The Google Analytics 360 Suite takes shape with the availability of Data Studio in France and the announcement of a free version of Optimize.

Google Data Studio is now available in France

The Google Analytics team is now tackling the subject of consolidating and visualizing data. Already accessible in the United States since this summer, Data Studio launched in France on September 28th, in both a free version (limited to 5 different reports) and a paid version, Data Studio 360. This data-visualization-oriented reporting tool expands the possibilities by aggregating data from multiple APIs and data sources such as:

• Google AdWords,
• Google Analytics,
• Google BigQuery, MySQL and Google Cloud SQL,
• YouTube Analytics,
• Google Sheets (usable as an intermediate destination to export any other data source).

Once the data has been aggregated and processed by connectors, it can be used in one or more reports: this architecture guarantees the consistency of the data presented to end users.
Google provides a set of report templates whose designs clearly highlight the value of the data. Building a report is very similar to working with the other tools of the Google Drive suite (particularly Google Sheets). The reporting interface is built, among other things, from blocks of tables or charts that you can position and customise very intuitively.
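To illustrate the connector architecture described above, here is a small sketch, with entirely hypothetical source names and fields, of the underlying idea: records from several sources are normalized once into a common schema, and every report then reads from that single aggregated table, which is what keeps all reports consistent.

```python
def normalize(source_name, records):
    """Map source-specific field names onto one common schema.

    The sources and field names here are made up for illustration;
    real connectors (AdWords, Analytics, BigQuery...) do far more.
    """
    field_map = {
        "analytics": {"date": "date", "visits": "sessions"},
        "adwords": {"day": "date", "clicks": "sessions"},
    }
    mapping = field_map[source_name]
    return [{common: r[raw] for raw, common in mapping.items()} for r in records]

# two fake extracts standing in for API responses
analytics_rows = [{"date": "2016-09-28", "visits": 1200}]
adwords_rows = [{"day": "2016-09-28", "clicks": 300}]

# aggregate once...
table = normalize("analytics", analytics_rows) + normalize("adwords", adwords_rows)

# ...then every "report" reads from the same table, so they cannot disagree
total_sessions = sum(row["sessions"] for row in table)
dates_covered = {row["date"] for row in table}
print(total_sessions, dates_covered)
```

Because both reports are views over the same normalized table, a correction to a connector propagates everywhere at once.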

Google Data Studio
Another strong point of Google Data Studio is that all reports are dynamic. In the template below, the analysis period can be changed directly from the interface, at the end user's discretion.


On the maps and graphs, data can be animated by moving the mouse, which gives data visualization a playful aspect. For more details: here is the official Data Studio documentation link.

Optimize, the free version of Google Optimize 360

Another piece of good news is the upcoming availability of Optimize, the free variant of Optimize 360, the premium version currently in beta. Optimize 360, announced in spring 2016, is an all-in-one personalization tool developed to improve the user experience.
Google Optimize is meant to enable marketing departments to create and easily deploy a multitude of targeted adjustments on a website (content, block positions) using A/B testing, multivariate testing or personalization. Here are a few strengths of the product:

• deploying Optimize from Google Tag Manager, or as a plugin in Google Analytics Universal using an asynchronous tag, is really simple,
• the solution is fully integrated with the data collected by Google Analytics: there is real synergy and consistency, both in targeting an audience and in evaluating the performance of a deployed test. With the Google 360 suite, there are no more contradictions between tools,
• access and roles are secured, backed by the quality of the Google infrastructure.
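Optimize handles the statistics of A/B testing for you, but it can help to see what "evaluating the performance of a test" means underneath. Here is a standard two-proportion z-test, a common (though not necessarily Optimize's own) way to decide whether a variant's conversion rate differs significantly from the original's; the traffic numbers are invented.

```python
from math import sqrt, erf

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B experiment.

    conv_*: number of conversions, n_*: number of visitors.
    Returns the z statistic and the two-sided p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# made-up experiment: variant B converts 6.5% vs 5% for original A
z, p = ab_test_z(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at the 5% level if p < 0.05
```

A testing tool adds a lot on top of this (sequential monitoring, multivariate designs, segmentation), but the core question it answers is the same.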

Until September 28th, beta access was restricted and paid. A recent post on the official Google Analytics blog announced that a free version will gradually become accessible to those who request it via this form.

Google Optimize

Optimize integrates reporting to measure the performance of each test.
Google has not released more details about the functional differences between Optimize 360 and the free version of Optimize; this information will be published shortly.

With these free versions of Optimize and Data Studio, Google wants to democratize the use of A/B testing and reporting for both very small and very large web players. It could also raise the visibility of the whole Google Analytics 360 Suite, the successor to Google Analytics Premium.

Other new features in the Google Analytics 360 suite

Besides the imminent launch of Google Optimize and the immediate availability of Data Studio in France, Google announced a new Google Analytics functionality: Session Quality Score. The Mountain View firm will use machine learning to distinguish good sessions from bad ones on websites, making it possible to analyze results more accurately. The functionality will be deployed shortly. Google Tag Manager was also improved and is now compatible with 20 additional third-party tags, from vendors such as Twitter, Quantcast, Nielsen and Microsoft Bing.
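Google has not disclosed how Session Quality Score actually works, so the following is purely illustrative of the general idea: score each session from simple engagement features with a logistic model whose weights would, in a real system, be learned from labelled sessions. All feature names and weights here are made up.

```python
from math import exp

# hypothetical feature weights (a real model would learn these from data)
WEIGHTS = {"pages": 0.4, "minutes": 0.3, "added_to_cart": 2.0}
BIAS = -3.0

def session_quality(session):
    """Return a score in (0, 1): closer to 1 = more likely a 'good' session."""
    score = BIAS + sum(WEIGHTS[f] * session[f] for f in WEIGHTS)
    return 1 / (1 + exp(-score))   # logistic (sigmoid) function

good = {"pages": 8, "minutes": 6, "added_to_cart": 1}
bad = {"pages": 1, "minutes": 0, "added_to_cart": 0}
print(round(session_quality(good), 2), round(session_quality(bad), 2))
# prints: 0.98 0.05
```

Whatever model Google ships, the output is the same kind of thing: a per-session quality signal that lets analysts filter noise out of their results.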