By Neda Eneva
On 6 June ESOMAR hosted a seminar on the topic of Media Return on Investment. Does modern media need to be redefined? Do we understand the impact of ad fraud on media measurement? How can we measure online consumer behavior more effectively? How can we tackle ad fraud? Is TV as we know it dead? These were only some of the questions raised and discussed during a day full of insights… here are some of the highlights:
The event was opened by Gian Fulgoni, Co-Founder and Executive Chairman of comScore, who dived straight into the modern overload of data, which offers as many opportunities as it does challenges. Data generated by those who do not know how to use it properly, coupled with the pressures of the modern age, such as short timelines and the demand for real-time reaction, often pushes marketers in the wrong direction. Gian highlighted that it is not enough to measure media ROI; it is even more important to understand the particulars of what is being delivered within the media plan. One of his most interesting findings was "Forget the click": the click is not a metric one should use to measure effectiveness because, despite being simpler and cheaper to use, it is often misleading. Instead, he recommends measuring attitudinal impact or changes in brand behavior when assessing media ROI.
Is an article online the same as an article in a print magazine? This is how Leendert van Meerem opened his presentation on definitions. Do we know what we are measuring, and do we need to keep changing the definitions of media? With new platforms, distributors and formats, old silos are forgotten and new ones emerge, making media measurement ever more complex. Leendert suggests consumer behavior and online measurement as the new central points in this wide array of media platforms and formats. Within this, he identifies the merging of existing media data with new data sources, perhaps shaping them into one another, which in turn demands new levels of transparency and clarity for non-experts.
The morning session of the seminar was concluded by two innovative case studies on media measurement. Mariana Irazoqui presented SKO's live hybrid measurement system, based on a hybrid method that employs both panel and census data. Menno van der Steen from IPG Mediabrands took an econometric approach, presenting how ROI can be calculated using econometric modeling.
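Econometric approaches like the one Menno described typically estimate ROI by regressing an outcome such as sales on media spend per channel. Below is a minimal sketch of that mechanical core; all figures and variable names are invented purely for illustration, and the sales series is constructed to follow the model exactly, which real data never does:

```python
import numpy as np

# Hypothetical weekly data (all figures invented for illustration).
tv_spend     = np.array([10.0, 12.0,  8.0, 15.0, 11.0,  9.0, 14.0, 13.0])
online_spend = np.array([ 5.0,  4.0,  6.0,  7.0,  5.5,  4.5,  6.5,  7.5])
# Sales constructed as 50 + 5*tv + 4*online, so the fit recovers these exactly.
sales        = np.array([120.0, 126.0, 114.0, 153.0, 127.0, 113.0, 146.0, 145.0])

# Linear model: sales = base + b_tv * tv_spend + b_online * online_spend.
X = np.column_stack([np.ones_like(tv_spend), tv_spend, online_spend])
coefs, *_ = np.linalg.lstsq(X, sales, rcond=None)
base, b_tv, b_online = coefs

# Each coefficient estimates the incremental sales per unit of spend in
# that channel, i.e. a simple marginal-ROI figure.
print(f"baseline sales: {base:.1f}")
print(f"TV: {b_tv:.2f} extra sales per unit spend")
print(f"online: {b_online:.2f} extra sales per unit spend")
```

Real econometric models also control for seasonality, pricing and carry-over (adstock) effects; this sketch shows only the basic regression step.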
Angel Cuevas Rumin from Universidad Carlos III de Madrid offered a very practical session on ad fraud. Opening with the shocking statistic that 10-30% of all online advertising is fraudulent, he highlighted that ad fraud is a serious problem that could heavily impact the industry. He identified the main examples of ad fraud as adware, auto-refreshing, traffic redirection and botnets. While underlining that publishers and marketers should be very worried, he noted that the industry largely ignores the issue. Ad fraud was also discussed by Augustine Fou from Marketing Science Consulting Group, who identified the "bot" as the key cause of ad fraud affecting digital advertising and measurement, and, interestingly, pointed to the link between ad fraud and ad blocking. With the rise of IoT, Augustine predicts that things will get worse before they get better, as millions upon millions of devices are now susceptible to being taken over by bots. To tackle the issue, he urges the industry to measure bots specifically and filter them out to avoid distorted metrics. Since ad-blocking figures are also affected by bots, he suggests focusing on more reliable "ads loaded" metrics.
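Augustine's advice to measure bots and filter them out before computing metrics can be sketched as follows. The log records, field names and the naive bot heuristic here are all invented for illustration; real bot detection is far more sophisticated than a user-agent and request-rate check:

```python
# Hypothetical impression log (records and field names invented for illustration).
impressions = [
    {"user_agent": "Mozilla/5.0", "requests_per_min": 3,   "clicked": False},
    {"user_agent": "Mozilla/5.0", "requests_per_min": 8,   "clicked": True},
    {"user_agent": "curl/7.58",   "requests_per_min": 600, "clicked": True},
    {"user_agent": "badbot/1.0",  "requests_per_min": 900, "clicked": True},
]

KNOWN_BOT_MARKERS = ("bot", "curl", "spider")

def looks_like_bot(imp):
    """Very naive heuristic: known bot user agents or inhuman request rates."""
    agent = imp["user_agent"].lower()
    return any(marker in agent for marker in KNOWN_BOT_MARKERS) \
        or imp["requests_per_min"] > 300

# Filter suspected bots out BEFORE computing any metric.
human = [imp for imp in impressions if not looks_like_bot(imp)]

# The same metric computed on raw vs. filtered traffic diverges sharply,
# which is exactly why unfiltered click metrics mislead.
raw_ctr = sum(i["clicked"] for i in impressions) / len(impressions)
clean_ctr = sum(i["clicked"] for i in human) / len(human)
print(f"raw CTR: {raw_ctr:.2f}, filtered CTR: {clean_ctr:.2f}")
```

Bots click far more readily than people, so filtering them out typically deflates click-based metrics, illustrating both Augustine's point and Gian's "forget the click" warning.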
In a session on transforming data from a liability to a safe fuel, ESOMAR's Kim Smouter highlighted the importance of trust in maintaining a healthy and functioning industry moving forward. The correct management and handling of data is of great importance if we are to maintain the trust of legislators and the general public. He presented the Data Serenity programme as the steps each professional and company can take to ensure this: be human in your data collection; invest in resources to manage data flows correctly; know your flows; express yourself very simply; and always think about possible liabilities.
The afternoon section of the programme dived into platform-specific case studies. Christian Kurz from Viacom International Media Networks opened with TV RE(DEFINED), looking at TV from the perspective of the consumer and at how Viacom as a company responded. Zooming in on the evolution of TV into a two-way communication platform, available whenever convenient for the viewer, Christian concluded that TV is in fact not dead but needs a new, broader definition. He urged the industry to help make multi-platform measurement a reality through transparency and collaboration (more data in silos is not the way forward), to feed the super consumers and to use technology to unlock growth. Caroline Brasseur from egta also focused on the changing nature of TV and the resulting viewer fragmentation. TV is still TV, Caroline argued, but this evolution creates measurement problems, as measurement is currently not keeping up with the consumer. She highlighted that TV broadcasters would like a new system delivering multi-screen measurement based on the TAM methodology, taking a viewer-centric rather than a per-device approach.
The last section of the day gave the floor to key innovation companies sharing insights on some of their latest measurement and marketing tools and strategies. Lanier Pietras from Google presented the latest Google tools for consumer and brand measurement, such as the Google Cookie List Surveys and Google Geolocation Surveys. Jennifer Brett from LinkedIn showcased the move towards content marketing as the focus of the LinkedIn advertising business. She offered useful tips for B2B marketers, suggesting two ways to measure content marketing effectiveness: a content marketing score and a content effectiveness study. Nathaniel Greywood from Twitter reminded the audience that there is no silver-bullet solution to measuring Twitter; it is the connection with other platforms that gives a holistic picture of consumer behavior. Finally, Michalis Michael from DigitalMR presented social media listening as the right method for understanding campaign success. He recommended using sentiment technology with semantically accurate reporting, integrating data from multiple sources on one dashboard.
From platforms and definitions to measurement techniques and ad fraud, the seminar touched upon various challenges and opportunities in researching and measuring media ROI. And what better way to end the discussion than to return to a fundamental aspect of research: how do we best make an impact with our findings? The event was closed by the fantastic Emma Whitehead, Graphic and Creative Director at Kantar, who reminded the audience of the importance of communicating research. It is in human nature to process and anticipate good communication through storytelling, Emma argued. Storytelling makes people concentrate, absorb information faster and care more about what is presented to them. Correlation is not causation, and data does not press our buttons, she added. The part of the brain that deals with data does not connect to the brain's sensory-experience regions, which is why turning data into a sensory experience should be a vital component of any research. And so, researchers: how many of you can fit your story into five slides? Or, even more shocking, would you be able to tell the story of your findings without a PowerPoint and in two sentences? Food for thought…
Neda Eneva is Marketing and Communications Manager at ESOMAR. @esomar