This post provides a (non-comprehensive) list of the points which should definitely not be overlooked when undertaking a digital analytics project. To create the list, we asked several of our expert consultants for their advice. Please note that points #6 and #7 are the errors that occur most often and require all of your attention…

#1 – Loosely choosing a Web Analytics tool, without having tested it

The web analytics tool must not be chosen lightly, or without having carried out tests beforehand. The tool must be selected according to the technical resources available to the company and its level of maturity in the field of web analytics. To evaluate your level of maturity, you can use the maturity model developed by AT Insight: “Maturity models: why, how, for who?” There is little point in setting up a very advanced tool that is complicated to implement if the teams have neither the time nor the expertise to exploit it. In addition, a poor implementation will prevent users from taking full advantage of a high-performing tool, and there is a risk of ending up with unreliable data. On the other hand, you shouldn’t choose an easy-to-implement solution that does not meet the needs of the final users. As you select the tool, you must take user needs into consideration, along with other elements such as data ownership management, the level of support and assistance provided to your teams, and the availability of data in real time.

Another essential point: the trial period. Not all vendors will offer you a full, free trial. Make sure that the trial takes place in real conditions, with operational tagging, and over a period of time long enough to be conclusive.

Further information to help you choose a web analytics tool is available in our blog post: “12 essential criteria to help you choose your web analytics solution.”

#2 – Not thinking about “final users” when implementing the solution

You can implement the best solution with the most insightful analyses possible, but if users don’t have the time to consult them, or if the teams are not mature enough to exploit them, then you will have wasted your time integrating and setting up a complex solution. This time could have been spent on other urgent projects; worse still, you might have hired an agency for the integration and been invoiced for several days of development work. The opposite is also true: the solution is implemented, but users only get half of the information they wanted. Disappointment guaranteed, and it might even lead senior management to question the choice of the WA solution that was selected.

#3 – Wanting to measure everything … at all costs

It is not worth measuring everything on the website and collecting data which will never be analysed. The tools available offer a multitude of analyses that make it possible to track everything on your site, so it is tempting to collect as much data as possible. Let’s take an example from the AT Internet solution: setting up ClickZone on all pages is not worthwhile; however, it is interesting to implement it on a few key pages of the site that you have identified. Likewise, tagging every link on the site can prove to be a long-winded, tedious task, even though the final users do not need to measure all of them. Tag implementation must be driven by the needs expressed by the final users.

#4 – Not considering the constraints of your technical teams

A user needs study is not a wish list to Father Christmas. As interesting and advanced as your needs might be, if the teams responsible for integration face major restrictions (linked to the CMS, an intranet, or other items associated with the technology used on the site), then the tagging will not be implemented. Once again, disappointment is guaranteed, but this time it is the users who will be disappointed, as not all of their needs (their Christmas list) will be fulfilled.

#5 – Not considering the specific technical features of the site

Writing a tagging plan and integrating tags depend on the technical characteristics of the site. Ask yourself the following questions: is my site optimised for different devices (desktop, mobile, tablets…)? If so, is it built with responsive design, or is it delivered as several different versions with specific URLs? Is AJAX technology used? Are there videos? Is the site managed through a CMS, and is the content generated dynamically by that CMS?
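The AJAX question matters because on an AJAX or single-page site, content changes without a full page reload, so a page-load tag fires only once. The sketch below illustrates the idea with a hypothetical `sendPageView` stand-in; it is not a real vendor API, just an assumption to show where the tag call must be moved.

```javascript
// Hypothetical sketch: firing a "virtual" page view on an AJAX navigation.
// `sendPageView` is an illustrative stand-in for your web analytics tool's
// tagging call -- it is NOT a real library function.
const pageViewLog = [];

function sendPageView(pageName) {
  // A real integration would call the vendor's tag here;
  // we simply record the hit so the flow can be inspected.
  pageViewLog.push({ page: pageName, timestamp: Date.now() });
}

// On a classic multi-page site, each full page load fires one tag.
// On an AJAX/single-page site no reload happens, so the tag must be
// fired manually whenever the displayed content changes:
function onAjaxNavigation(newSection) {
  sendPageView(newSection);
}

onAjaxNavigation("home");
onAjaxNavigation("product::shoes"); // second view, with no page reload
```

The same reasoning applies to videos or dynamically generated CMS content: each technical specificity changes where and when the tag must fire, which is exactly why these questions belong in the tagging plan.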

#6 – Skipping test phases

One of the main pitfalls in a WA project is to settle for implementing a tagging plan without scheduling any time for testing. If no verification is made after the tags have been integrated, there is a strong chance that the conclusions you draw will be completely incorrect, because the data is not reliable. In addition to setting time aside, you also need the technical skills to carry out the testing, and assistance is available to help you with this step. If problems are encountered during the first test, we of course recommend correcting them and running a new test to check that everything is working correctly and that the fix has not introduced problems elsewhere.
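A simple way to structure such a test phase is to compare the hits a test page actually fired against the hits the tagging plan expects. The sketch below assumes the fired hits have been captured (for example with a debugging proxy or the tool's debugger); the hit names and the capture itself are illustrative.

```javascript
// Hypothetical sketch of a minimal tagging check: any hit listed in the
// tagging plan but never fired on the test page points to a tagging gap.
const expectedHits = ["home", "home::banner_click"]; // from the tagging plan
const firedHits = ["home"]; // e.g. captured while browsing the test page

function missingHits(expected, fired) {
  // Keep only the expected hits that were never actually sent.
  return expected.filter((hit) => !fired.includes(hit));
}

const gaps = missingHits(expectedHits, firedHits);
// gaps contains "home::banner_click": planned, but never fired.
```

Running such a comparison after every correction also covers the second recommendation above: a re-test confirms both that the fix works and that previously passing tags still fire.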

#7 – Forgetting to tag the site as it evolves

The web analytics project should not come to a stop once the tool is first implemented. The tagging plan is not fixed; it must evolve over time at the same pace as the site. With functional, ergonomic and technical changes, tagging can quickly become obsolete. Tagging audits help reveal such gaps quickly and allow you to react in time to continue providing high-quality data. For example, not taking analytics into consideration during a site revamp is a big mistake, because users will then be unable to evaluate the benefits of the revamp. During a site revamp, it is important to gather requirements once again from the final users of the solution and from the technical teams, to adapt the tagging plan to new needs if necessary. Ideally, a process should be defined so that the web analyst and the technical contact person responsible for tagging are included in the discussions whenever a site update is planned. To make such updates and evolutions easier, soft tagging solutions such as Datamanager can be used.

#8 – Not choosing the right key performance indicators

The KPIs must be defined prior to the web analytics project; otherwise, the web analytics tool will not allow you to monitor your company’s digital performance. These indicators must stem from your strategy and be adapted to both your sector of activity and your site. Analysing them will let you see where you stand in relation to your goals and identify areas for improvement. The choice of these indicators determines the tagging plan, which is why it is necessary to know them beforehand. However, beware of KPI overload! As the site and the company’s strategy evolve, the KPIs will also have to change, so be prepared to redefine them when necessary.
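To make the idea concrete, here is a minimal example of one common KPI derived from collected data. Both the "conversion rate" choice and the figures are illustrative assumptions, not recommendations: as noted above, the right KPIs depend on your strategy and sector.

```javascript
// Illustrative KPI: conversion rate as a percentage of visits that led
// to an order. The numbers are made up for the example.
function conversionRate(orders, visits) {
  if (visits === 0) return 0; // avoid dividing by zero on empty periods
  return (orders / visits) * 100;
}

// e.g. 120 orders out of 4,800 visits over the period:
const kpi = conversionRate(120, 4800); // 2.5 (%)
```

Because this KPI needs both visits and orders, the tagging plan must collect both reliably, which is exactly why the indicators have to be known before the tags are designed.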

#9 – Not defining dashboards at the beginning

After having defined your KPIs, you need to think about how they will be displayed in order to help you make decisions. It is important to create a dashboard and a reporting template. The two do not serve the same purpose: the dashboard is a management and decision-making tool, whereas the report is an overview of the activity used to measure your performance on a regular basis. The dashboard must be adapted to its recipient and must answer the following questions: what is the recipient’s level in the company? How often should the dashboard be sent (daily, weekly, monthly, quarterly)?

Further information on creating dashboards is available on our blog post “Don’t be satisfied with dashboards that have been designed to just give an overview of your online activity.”

It is also important to anticipate how data and information will be shared over the long term. Do all my co-workers need access to the web analytics tool? Do I want only certain departments to have access to certain information? These questions must be asked prior to the web analytics project, because the answers will have an impact on the tagging plan and on the configuration of access rights in the tool itself.

#10 – Forgetting the data management policy as part of an international web analytics project

During the implementation phase of a web analytics project in a group with several distinct entities, or for an international site, you need to think about how you would like to aggregate the data in the tool. This will depend on governance within the group: will one single entity manage the tool and be responsible for distributing data across the entire group, or will each entity (for example, each country) manage its own data independently? This will have an impact on the choices made when creating the tagging plan.
