AT Internet Headquarters. Credit: Xavier Bellenger
In recent years, the requirements for analytics, as well as the range of data user profiles, have shifted dramatically. At the same time, demand for flexibility, customisation and power has kept growing. As the number of technical providers continues to grow, the offering of digital performance measurement solutions has become somewhat diluted – resulting in a complex, uneven and incomplete market.
As a long-term player and digital analytics pioneer since 1999, AT Internet has taken an innovative approach to the current market challenges and completely redefined its technologies. With an approach strongly focused on the needs of our users, while adhering to our values (ethical and eco-responsible), we have set out on a genuine product revolution. Our goal was never simply to pack in the most features – an approach that runs counter to today’s need for tools that are accessible to everyone and let users act quickly. Instead, we’ve aimed to apply the most recent technological developments – to provide an intelligent solution to marketing, product strategy, UX and General Management challenges – while remaining focused on the need for energy conservation and the importance of privacy by design.
Here’s the full rundown of our vision, technical approach and how AT Internet carried out this technological shift.
A diverse and complex digital analytics market
When AT Internet arrived on the scene, we essentially measured interactions between computers and web pages. 20 years later, the market has come a long way. Users now combine a range of devices (smartphones, connected watches, voice assistants etc.) to access the wide range of platforms each brand offers (website, mobile site, iOS application, Android application, etc.). To respond to this, Digital Analytics solutions now need to measure these complex journeys in their entirety and as exhaustively as possible. And because this data is used so widely and strategically, it must also be highly reliable.
However, the reality of today’s Digital Analytics market is complex. Even though everyone agrees on the importance of a complete, coherent and accurate picture of real-world usage, a company looking to purchase the most appropriate solution still needs to understand the range of different approaches on the market.
Digital Marketing Analytics tools, designed to address specific digital marketing issues, come with a range of specialised concepts (traffic acquisition, monetisation, numerous metrics and specific analyses…). AT Internet, Adobe Analytics and the market share leader Google Analytics are clearly positioned in this category. The quantity and variety of standard information is probably the main advantage of these players, especially for marketing management roles. Nevertheless, these tools can quickly reach their limits when it comes to analysing very specific company concepts.
Product Analytics tools have been able to take advantage of the drawbacks of Digital Marketing Analytics to gain a place in the market. American providers such as Mixpanel, Amplitude or Heap Analytics offer a high degree of flexibility, allowing them to measure interactions that are very specific to the development of a product or service. Their perceived simplicity is a strong selling point for profiles in the product domain (Product Managers, Product Owners etc.). Their disadvantage is that this same flexibility can make reliable and exhaustive data hard to achieve – very few analyses are ready to use. Reconstructing visit metrics, or rebuilding the dozens of marketing analyses available and necessary in digital analytics, is both tedious and error-prone. The risks of error are countless, and it is unrealistic to imagine certifying audiences in this way.
Finally, in-house tools have long been available only to more mature companies. However, the availability of cloud platforms has democratised the tooling needed for in-house projects. These tools can offer end-to-end customisation capabilities (nature of the data collected, processing applied, restitution interfaces) while offering configurable computing power. But beyond the exceptional skills required to successfully complete this type of project, this approach often requires numerous functional trade-offs to reach an economic balance that is very hard to maintain over time (development costs, risks, technical debt, maintenance, etc.). In addition, technical developments (cookies, adblocking, new media) and recent legislation (e.g. the GDPR) make it difficult to ensure reliability and compliance. In practice, companies that have the skills to deliver and, above all, maintain such projects are few and far between.
As a result, companies may be tempted to combine Digital Marketing Analytics and Product Analytics tools to answer all the questions they may have about their users, while still having the flexibility of an in-house solution. However, the high cost of doing this makes it unsustainable in the long term, especially for tools that work in silos at the expense of data quality.
The elements of a new technological platform
In addition to the robustness and reliability of our solution’s analytical core, developed and proven for over 20 years by tens of thousands of users, we are adding three new key ingredients to deliver the best of Digital Marketing, Product and In-house analytics.
- A strongly user-centric vision: we are going to offer even more flexibility in the ability to analyse the entire user journey, while making it easier to manipulate metrics such as Unique users (which are costly in terms of computing resources).
- A higher degree of flexibility: on the basis of a more complete data model, each company will be able to take into account its own requirements thanks to almost unlimited customisation capabilities (customised variables, metrics or custom events).
- Unlimited computing power: this is probably the most anticipated feature. To respond to increasingly numerous and complex challenges (deduplication, segmentation, combining datasets), the analysis and calculation power is multiplied. Some of our customers are already collecting tens of billions of interactions per month.
This new phase of our digital analytics solution is the result of one of the most ambitious projects in the history of AT Internet. To intelligently combine the 3 strengths mentioned above with our expertise, we started from scratch to build a brand-new platform – the New Data Factory. The project is particularly far-reaching as we have chosen to carry it out in complete transparency for our clients, delivering functionalities as the development progresses. Without realising it, they have been using both the old measurement system and this new technological platform for months. Some of the product’s flagship functions are entirely based on this new system: our Navigation analysis, e-commerce Sales Insights analyses, or more recently the new version of our data mining tool Data Query 3.
Another foundation of our product strategy is ethics by design. Respect for the privacy of Internet users is an integral part of our company DNA. An efficient and reliable analytics solution cannot be designed without a relationship of total trust between the solution, the site and its users. In addition to guaranteeing complete technical transparency and independence, our business is exclusively analytics, and ownership of the data remains entirely with the client (without any secondary use). Moreover, we are convinced that the main value of analytics is above all to better understand in order to better serve. There is no sustainable value creation from data without a clear contract of trust with Internet users. The guarantees we offer in terms of privacy and the work we have done with the CNIL (the French data protection authority responsible for enforcing the GDPR) allow us to benefit in France from an exemption from the collection of consent. This exemption is subject to conditions, but it allows us to make the completeness of the data considerably more reliable while respecting the essential trust of Internet users.
Finally, in addition to protecting privacy, our ethical approach also consists of minimising the impact of our activities on the planet. Collecting and processing information en masse is a particularly energy-intensive activity. We are committed to developing a solution that will reduce its carbon impact, notably through intelligent resource management and by systematically minimising the information collected and stored. Indiscriminate Big Data collection – with no concern for the purpose of the data gathered or the quality of what is ingested – seems to us irresponsible and disrespectful of Internet users’ expectations.
Respect for privacy, the minimisation of the data collected and calculated, and the economic balance are part of a virtuous circle where the data is fairer, more reliable, more respectful and less harmful for the planet.
The Analytics Suite: enhanced relevance and value
Our product strategy is to deliver value-added functionalities as this new technology platform is developed. Our customers have therefore already been able to test a few components of this enhanced version of the Analytics Suite.
Priority on data activation via Machine Learning
The Analytics Suite offers Data Science features based on Machine Learning algorithms: anomaly detection, prediction, contribution and automatic clustering. We are constantly enhancing our offer with intelligent functions (AXON) to support the analyst’s work by automatically suggesting insights. Traditional core analytics remain available and also continue to evolve with releases. Sunburst, for example, useful for analysing navigational routes, or funnels to illustrate churn points during the shopping journey, are still available in the solution.
We remain committed to offering a system that is fully open and ready to interconnect with our customers’ entire ecosystem. A new-generation API is being introduced, complemented by unprecedented mechanisms for extracting large volumes of data to ensure the widest possible distribution of value. There will also be more ready-to-use AT Connect connectors natively integrated into the platform. These information flows will ease the use of the data via partner tools, subject to the informed consent of the Internet user.
Flexibility and customisation, up to 1000 variables
Each company has its own specific requirements that call for a high degree of customisation. Expectations are even higher when it comes to meeting challenges other than those related to digital marketing, such as UX or product strategy. Our solution is designed to allow our clients to collect a wide range of information about users’ interactions with their brand. When an event is measured (page load, video play, etc.), it is qualified by a number of parameters that we call properties. It is this structuring that makes the information usable and relevant – each type of interaction can have specific properties. We will allow our customers to add a large number of custom properties (up to 1000).
In addition to the thirty or so predefined events offered by AT Internet, our users will be able to create an unlimited number of custom events. These include downloading a brochure, adding music to a playlist, or contacting a vendor on a classified ad site. They will be able to measure interactions that are in line with their strategic objectives or that will enrich the data collected so far, using their own jargon. These events will also be qualified by properties available in the AT Internet data model or customised.
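To make the event/property model concrete, here is a minimal sketch. The event names, property names and structure are purely illustrative assumptions for this article – they are not the actual AT Internet SDK API.

```python
# Hypothetical sketch of the event/property model described above.
# Names and structure are illustrative, not the real AT Internet SDK.

# A predefined event, qualified by standard properties...
page_view = {
    "event": "page.display",           # one of the ~30 predefined events
    "properties": {
        "page_name": "home",           # standard property
        "site_level": "landing",
    },
}

# ...and a fully custom event expressed in the company's own jargon,
# mixing standard and custom properties.
brochure_download = {
    "event": "brochure.download",      # custom event
    "properties": {
        "page_name": "products/heat-pumps",
        "brochure_ref": "HP-2020-FR",  # custom property
        "language": "fr",
    },
}

def property_names(event: dict) -> set:
    """Return the set of properties qualifying an event."""
    return set(event["properties"])

print(sorted(property_names(brochure_download)))
```

The key idea is that both kinds of event share one shape: a name plus a bag of qualifying properties, which is what makes standard and custom data queryable in the same way.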
Tagging: the tag first approach
To achieve this high degree of customisation, we will apply the tag first approach. This means that any data present in the collection library (JS, SDK) is automatically collected. Gone are the previous configurations and declarations when it comes to adding new types of events or new custom properties. To ensure data quality and seamless governance, authorised users will be notified when changes are applied to the data model, and a simple approval will make those changes effective.
For all the information that does not belong in the tagging libraries, we now offer the ability to easily import data into the platform:
- User criteria
- Product catalogues (colours, model, brand, stock, etc.)
- Content catalogues (genre, duration, director, etc.)
- Campaign listings (type, cost, formats, etc.)
- Generally speaking, any information that can be associated with a key (ID).
Of course, it will still be possible to easily create personalised processing rules to correct or enrich the data collected before it is stored in the platform.
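The import mechanism above boils down to a join on a shared key. The following toy sketch (an assumption for illustration, not AT Internet's actual import pipeline) shows collected interactions being enriched from an imported product catalogue keyed by ID:

```python
# Illustrative sketch: enriching collected interactions with an
# imported product catalogue, joined on an ID key.

product_catalogue = {  # imported file, keyed by product ID
    "SKU-001": {"colour": "red", "brand": "Acme", "stock": 12},
    "SKU-002": {"colour": "blue", "brand": "Acme", "stock": 0},
}

interactions = [
    {"event": "product.display", "product_id": "SKU-001"},
    {"event": "cart.add", "product_id": "SKU-002"},
]

def enrich(events, catalogue, key="product_id"):
    """Attach catalogue attributes to each event whose key matches an entry."""
    for event in events:
        yield {**event, **catalogue.get(event.get(key), {})}

enriched = list(enrich(interactions, product_catalogue))
print(enriched[0]["brand"])  # "Acme"
```

The same pattern covers user criteria, content catalogues or campaign listings: as the list above says, any information that can be associated with a key.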
A ready-to-use tool with 400 properties and 120 standard metrics
The self-service analytics feature, which is already a major strength of the product, will be further enhanced. The platform’s new data model is enriched, bringing the number of standard properties (traffic sources, e-commerce, audio/video, advertising, technologies…) to more than 400 and the number of metrics to 120. The use of this data model combines significant time savings with high quality analysis.
An exceptionally smooth experience
An analyst must be able to identify the potential for optimisation in Explorer, continue their investigation in Data Query and share the results of their analysis in a customised dashboard. Adaptive and scalable, the interface will let each user choose which features to access based on their profile.
Behind the scenes of a technological revolution
In line with our product vision, the development of a new technological platform has been the subject of a number of decisive technical choices.
Architecture: adapted technologies and a DevOps approach
Starting from scratch to develop a product allows you to choose the best in terms of technology. In the past, our platform has been almost entirely based on a single range of technology solutions. We are changing our approach and now choose to use as many technologies as necessary to achieve our goals. Some are genuine industry benchmarks, such as Kafka for real-time processing, or Snowflake, probably the most powerful data warehouse on the market. But beyond the technologies themselves, it is the quality of the tool combinations that is key to the expected benefits. To ensure that these different components work together smoothly, scalably and flexibly, we are applying a DevOps approach within our development teams and making extensive use of container orchestration (Kubernetes).
Storage: a radically different approach to unifying data
The new technology platform is based on a unified storage of all data collected on behalf of a company. Whether standard or custom, all properties become a column in this data storage structure. The measured interactions are rows in this structure (potentially several billion per month), regardless of the sites or devices they come from.
This radical change in technical approach has two advantages:
- A truly unified view: all the data generated by a user is gathered and available. An analyst will be able to study a very large scope (all platforms), a set of sites (all applications) or even a section of a site, without constraints.
- Unparalleled performance: this storage method, backed by the technological choices made, allows unprecedented query speed. The volume of data and the complexity of queries no longer impose limits.
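A toy model can make this unified columnar layout tangible. This is a deliberately simplified sketch (plain Python, not the platform's actual storage engine): every property, standard or custom, is a column, and every measured interaction is a row, whatever site or device produced it.

```python
# Toy model of unified columnar storage (illustrative only).
# Each property is a column; each interaction is a row index.

columns = {
    "site":    ["web", "ios_app", "web", "android_app"],
    "event":   ["page.display", "screen.display", "cart.add", "cart.add"],
    "user_id": ["u1", "u1", "u2", "u2"],
    "amount":  [None, None, 59.0, 19.0],  # custom property; sparse column
}

def count_where(cols, column, value):
    """Scan a single column -- the essence of columnar efficiency:
    a query only reads the columns it actually touches."""
    return sum(1 for v in cols[column] if v == value)

# One query spans every platform at once: no per-site silos to stitch.
print(count_where(columns, "event", "cart.add"))  # 2
```

Because the query reads only the `event` column, the cost of a scan is proportional to the columns touched rather than to the full width of the data model – which is what keeps hundreds of custom properties cheap.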
In a future article, we will discuss in detail the technical advantages of unified column storage compared to a more traditional approach.