Social media monitoring: the myth of full automation

If you have already tried to implement any kind of monitoring on the social web, you have probably been confronted with a daunting volume of data. Faced with such a mass of data to sift through and process, the temptation is often to save time (and money) by having computers do a large part of the work, from monitoring through to analysing the data. A wide range of “fully automated” solutions is available on the market.

Of course, automation in social media monitoring is possible, and it has many advantages: applied intelligently and sparingly, it can be of great help. While automation is particularly reliable for collecting, storing and archiving data, or for simple calculations, the same cannot be said when we try to replace human intelligence with algorithms, for example to determine an author’s influence or the sentiment expressed about a subject in a given text. In a field as sensitive as monitoring, automation carries several significant risks.

Many experts agree (and have demonstrated) that automating certain of these “calculations” remains wishful thinking.
Benchmarks of automated sentiment analysis show an accuracy of around 70% in the best-case scenario, meaning that the languages are recognised and handled by the system, the texts are written in plain (non-coded) language, they are neither too short nor too long, they contain no irony, and their vocabulary is close to that of the training corpus. And even this figure, high as it may seem, holds only if we settle for detecting whether a text’s tone is positive or negative, without considering its subjective character (in other words neutral, unopinionated, or mixed in tone). The further we move away from this ideal context, the more dramatically the rate falls. In a benchmark carried out in May 2010, FreshNetworks reported an average accuracy of 30%, falling to 7% in some cases!
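To make these limits more concrete, here is a minimal sketch (in Python) of the kind of lexicon-based scoring that many fully automated tools rely on. The word lists and example texts are hypothetical, invented purely for illustration, and are not taken from any of the benchmarks mentioned above. The scorer handles a clear-cut sentence correctly, but is blind to irony, negation and slang, precisely the cases that drag accuracy down.

# Minimal lexicon-based sentiment scorer (illustrative sketch only).
# The word lists and example texts below are hypothetical.

POSITIVE = {"great", "love", "excellent", "fast", "reliable"}
NEGATIVE = {"bad", "hate", "slow", "broken", "useless"}

def naive_sentiment(text: str) -> str:
    """Label a text positive, negative or neutral by counting lexicon hits."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

if __name__ == "__main__":
    examples = [
        "The new dashboard is fast and reliable, I love it.",  # clear-cut: classified correctly
        "Great, the app crashed again. Just great.",           # irony: wrongly classified as positive
        "Not bad at all.",                                     # negation: wrongly classified as negative
        "lol that update tho",                                 # slang, no lexicon hits: neutral by default
    ]
    for text in examples:
        print(f"{naive_sentiment(text):8} <- {text}")

Real tools use richer models than this, but the same failure modes (irony, negation, slang, out-of-vocabulary wording) are exactly what pulls their accuracy far below the headline figures.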

The relevance of online influence and scoring tools has been the subject of similarly heated debates [1].

Yet it seems clear that any qualitative analysis should be able to distinguish between what is relevant and what is not. Beyond a good knowledge of Internet users, their languages and the way they express themselves on different channels, it is essential to be able to pick up on strong signals and ignore the weaker ones, and to know (or determine) who is influential and who is not. It is also necessary to have an in-depth understanding of how Internet users react towards you and your competitors.
For that, you need a thorough understanding of your business sector, your company, your products and the issues you may be faced with.

Is it realistic to believe that a machine, however well equipped, is capable of combining all of these qualities and adapting to the subjectivity of each situation?

Valuing human intelligence

As has always been the case, reliability and human intelligence are core values at AT Internet: they are part of our DNA. The quality of our customer service, recognised by the market research firm Forrester as among the best in the world, only confirms the importance of human intervention in the field of analytics, a subject we care deeply about.

We are convinced that no social media monitoring tool, however good it may be, will ever equal the understanding and finesse of human judgement. We would like to demonstrate our expertise in this field through the range of turnkey “online reputation” packages we offer.
Our Social Analytics offer includes manual detection of tone and influence, noise elimination, on-demand sourcing, training programmes, assistance, and an analysis of your online reputation. Further information is available at www.atinternet.com

Further reading:

http://commetrics.com/articles/fails-validity-test/

http://jennifered.wordpress.com/2010/07/06/why-sentiment-analysis-cant-work-and-why-it%E2%80%99s-a-damned-good-thing/


[1] We will not go into detail here, as these topics have been covered time and time again on the web.

 

Author

Product Marketing Manager
Having studied marketing, Mélanie has worn the product manager hat at AT Internet for nearly 10 years. She is responsible for introducing innovative dataviz products while keeping the user-centric approach front and centre.
