“Collaboration for innovation” is a big part of AT Internet’s company culture. And our human scale means we can collaborate extensively (both internally amongst teams, and externally with clients) to create tools that continuously adapt to our customers’ needs. In this series of articles, get an insider’s look at how we bring our collaborative approach to product development. Join us behind the scenes of Explorer, our streamlined tool for exploratory data analysis.
In 2016, we undertook a major initiative: the complete revamp of our exploratory analysis tool, Analyzer NX. Rather than updating the existing tool, we chose to build something entirely new from a blank slate. Given the strategic importance of this project, the involvement and support of all AT Internet employees were vital from the earliest stages of design.
We therefore implemented a process allowing us to present elements of the new product and adapt its direction based on the resulting feedback.
Step 1: Wireframing
Wireframes are sketches that allow users to envisage the tool they might use. Essentially very basic mock-ups of the interface, they help users imagine how they would interact with the product, enabling us to validate the fundamental concepts.
Before even writing the first line of code, we used wireframes to test and validate the functional and ergonomic approaches we had adopted in order to ensure the three major promises of Explorer and the Analytics Suite 2:
- Improve communication between our different tools
- Reduce the number of tools
- Improve user perception upon sign-in (by providing data immediately)
For example, we worked on the following two wireframes:
From within each analysis, we added links allowing users to send the analysis in question to a dashboard, or to continue their exploration in Data Query. Explorer needed to behave like a document creation assistant, or as a starting point for more advanced exploration.
EXPLORER: SEND AN ANALYSIS TO ANOTHER APPLICATION
The diversity of tools available in our solution is often cited as a source of confusion for some of our users.
During a single session, they may very well use several different tools, depending on the purpose of their analysis. Without ties between these tools, the user experience can become chaotic, forcing users to systematically return to the solution homepage and giving them the impression of having to “channel hop”.
To simplify the journey between these different tools, we suggested adding a menu allowing users to toggle between tools without having to leave the current page.
With a view to reducing the number of tools, we envisioned a solution that would both combine our tools and make them more practical to use on a daily basis. We therefore suggested gathering all documents created by our users within a centralised “Documents” catalogue.
This approach would enable us to go from four tools (Dashboards, Dashboard Manager, Reports, Report Manager) to a single tool. It would also allow users to access their documents much more quickly, from anywhere within the solution interface.
“DOCUMENTS” ZONE IN THE ANALYTICS SUITE 2: FIRST DRAFT OF WIREFRAMES
“DOCUMENTS” ZONE IN THE ANALYTICS SUITE 2: THE CLEANED-UP WIREFRAME
For each hypothesis to be validated, we created a wireframe. All wireframes were presented to our colleagues, who shared their feedback, challenged our envisioned solutions, and contributed their own ideas. Following these sessions, and after integrating that feedback, we were able to validate solutions that would let us deliver on the promises cited above.
Step 2: Prototyping
Once the concept was validated, the next step was to bring these mock-ups to life by giving them colour and making them interactive.
We created several mock-ups to test the Analytics Suite 2’s visual and ergonomic approach for each of our tools. This would help us best understand the overall impact and see if the experience was streamlined between the redesigned interfaces.
The design needed to be more modern, but also needed to highlight the most essential part of our products: the data being measured.
After several iterations, the prototype was born and everyone in the company was able to test it during workshops.
GETTING TO KNOW THE EXPLORER PROTOTYPE DURING A WORKSHOP
One thing was left to do: evaluate the overall user experience. To do this, we sent out an AttrakDiff questionnaire internally, which confirmed the product’s positive perception and usability.
RESULTS OF THE INTERNAL ATTRAKDIFF QUESTIONNAIRE
Step 3: Testing the prototype
Just one crucial step remained before launching development: presenting this prototype and product vision to our customers. Indeed, it was unthinkable for us to begin development if the product did not meet our customers’ expectations. During our 2016 Digital Analytics Forum event, we offered our customers the chance to play with the prototype for the first time. About 10 of our customers took the prototype for a spin, shared their impressions, and told us whether they could imagine using such a tool every day, should it be developed.
Little by little, we refined the prototype, adding or removing features based on the iterations presented to our co-designer customers (learn more about our co-design initiatives in part two of this series). In doing so, we were able to offer a lighter interface, a more modern design, and usability approved by our customers.