By the ESOMAR USA Representatives
Although North America was unable to retain its position as the fastest-growing market for research for a second year running, the overall market size has increased by almost 20% according to the latest ESOMAR Global Market Research report. And while the overall growth figure may imply a slowdown, the market measure has expanded to include two new sectors, giving the region a net growth of 0.5%. Following on from this, we asked the ESOMAR representatives in the US about the challenges, opportunities and trends in their market.
Did someone say mobile? Again?
As this series of articles continues, having looked at the LATAM, APAC and MENAP regions, we do indeed see the same patterns emerging, even more so in a developed market such as the United States. It is, of course, the conversation about mobile and internet penetration rearing its head again. Jackie Lorch, Vice President, Global Knowledge Management, SSI USA, comments, “With online penetration approaching 90%, online is the go-to data collection methodology, and don’t even think about fielding a questionnaire that can’t be completed on a mobile phone.” Yet, although this knowledge is commonplace, it doesn’t mean the industry has caught up. With US smartphone penetration near 60%, survey participants are increasingly choosing to take surveys on mobile devices. Lorch observes, “The industry has not made it a priority to put participants first and design mobile-friendly questionnaires. Likewise mobile in-the-moment research presents wonderful opportunities to interact at the moment of decision-making with video or image capture. Yet we have largely failed to engage.”
This sentiment is echoed by Melanie Courtright, EVP, Products and Client Services, Research Now. “The biggest challenge is learning how to evolve for mobile devices — the questionnaires themselves have to change, and we are really struggling with moving fast enough in America.” Although the market research world might be lagging behind, there are big opportunities here, comments Courtright, “Biggest opportunity is in automation of basic research types so that through standardization we can integrate other forms of data better and spend more time on interpretation and make decisions more quickly.”
While the rest of the world might hold an inbuilt stereotype of Americans (can anyone blame them? Donald Trump, anyone?), it couldn’t be further from the truth. Lorch comments, “Most of the stereotypes you have heard about America are exaggerations. Most Americans enjoy foods other than burgers, fries and buckets of cola and many are well-informed about and interested in other countries in the world!” Indeed, Courtright observes, “The US is very diverse, both in business and with consumers. It’s like many small countries grouped together, so to try and approach it as one market is not possible.”
The US shouldn’t be treated as one country – this is a population of almost 319 million people spanning more than 9 million km². That’s a lot of people; can we really expect them all to have the same thoughts and opinions? Lorch expands on this further, “Different geographies, attitudes and cultures can be found within its borders. You’re likely to find doing business in the fast-paced, intense, “in-your-face” culture of New York City different from the more laid-back, proudly non-conformist, technology-driven Northern California, for example. Street signs you may see along the way help tell the story!”
Very visible in the US, though by no means limited to it, is a major new societal trend that will impact research in almost every aspect of modern life: fragmentation. Lorch notes, “From people’s time and attention, the data sources they use, their digital device habits, to the diversity of their beliefs, lifestyles, attitudes and interests. Institutions in the media, government and communities that used to help define large groups among the population have largely vanished, replaced by customisation of the individual experience to a massive degree. As society fragments, are our traditional research taxonomies relevant anymore? We still group people by age, by ethnicity, by geography, in ways that haven’t changed for generations. We should instead consider life stages and new attitudinal groupings as ways to better understand the consumer. This is equally true for B2B research, where titles, responsibilities and purchase patterns are changing rapidly and we need to target based on the reality of today’s job functions and responsibilities.”
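To make the idea of attitudinal groupings concrete, here is a minimal sketch of how a researcher might segment respondents by attitudes rather than demographics, using k-means clustering. This is our illustration, not Lorch’s method; the Likert-scale data, segment count and variable names are all invented:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical Likert-scale (1-5) answers to six attitudinal statements,
# one row per respondent. Real data would come from a survey, not an RNG.
rng = np.random.default_rng(seed=0)
responses = rng.integers(1, 6, size=(200, 6)).astype(float)

# Standardise so every statement contributes equally to the distance metric.
responses = (responses - responses.mean(axis=0)) / responses.std(axis=0)

# Group respondents into attitudinal segments rather than demographic cells.
segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(responses)

for seg in range(3):
    print(f"Segment {seg}: {(segments == seg).sum()} respondents")
```

In practice the statements would come from a designed attitudinal battery and the number of segments would be validated, but the mechanics are the same.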
There are many trends impacting America, and indeed society at large, none more so than technology. Lorch comments, “The idea of technology as not just enabler, but also driver of our business is a phenomenon noted by Unilever’s Stan Sthanunathan. Technology has made research more efficient, and improved its quality for companies who have invested in it. Now technology is doing more: actively directing where research will be and go in the future. It is taking over many research tasks that humans used to do. The challenge is that powerful technology and the expertise to run it is usually only available to the larger players, so many smaller enterprises need to find a new raison d’etre, or risk being swallowed up.”
Big data and the Internet of Things will also shape the future of market research, but the obstacles need to be tackled first. “The practical and operational obstacles in the way of getting value from all the data now available are not trivial, yet the potential rewards are massive. If we can overcome the obstacles, research can use big data to answer the what, when and where questions and surveys to get at the why and what next – resulting in shorter, more interesting surveys and more accurate factual data.”
And a further trend to look out for comes from Courtright, who comments, “A trend we’re seeing is definitely privacy and what that means in a world of cookies and meters and observational data collection. And in turn, society’s reaction to those practices, along with their expectations of transparency and responsibility.”
How to do business here
While we know we need to let go of those stereotypes of Americans we see in the media, how do we do business here?
Lorch has some sound advice, “Americans are informal and direct in business dealings and make decisions relatively quickly – so don’t be afraid to ask for the business and discuss specifics like delivery times and costs.” But, don’t mistake that good old American positivity for success. “A positive attitude is much admired in the US, so even if someone tells you they’re “incredibly excited” about meeting you and hearing about your product it doesn’t mean you’ve made the sale!” observes Lorch.
So what have we learnt about doing business in the United States? Don’t treat this country as one…
Special thanks to Jackie and Melanie for this article.
Jackie Lorch, Vice President, Global Knowledge Management, SSI USA
Melanie Courtright, EVP, Products and Client Services, Research Now
London and Amsterdam – 29 June 2016
ESOMAR, the world association for market, social and opinion research, has appointed Language Connect as its official translation partner.
In providing the global voice for the data, research and insights community, ESOMAR strives to unite, support and guide professionals from all over the globe. Recognising the need for such guidance and support in local languages, ESOMAR is delighted to announce the launch of multi-language online portals for its website (now available in French, Japanese, Arabic, Spanish, Chinese and German). These portals are an introduction to the key services ESOMAR provides its members and the industry, and are accessible live on esomar.org.
Furthermore, ESOMAR now offers all its guidelines, as well as its professional standards resources translated into the same key languages: French, Japanese, Arabic, Spanish, Chinese and German, available here.
David Brett, Global Sales Director at Language Connect, comments: “Due to the ever-expanding global application of its services and reference materials, ESOMAR is seeking to have its key communication platforms and reference documents available in multiple languages. We are delighted to be able to offer ESOMAR the breadth of expertise and Market Research sector knowledge to support this growth.”
Finn Raben, ESOMAR’s Director General stated: “We are delighted to have established this relationship with Language Connect, whose expertise in the translation arena, coupled with a thorough appreciation of our language needs across the world, has helped us to deliver a truly global resource for all our members and partners.”
Language Connect is a specialist Market Research language services provider and has been working in the sector for over a decade, servicing hundreds of clients across all competencies of Market Research. The company was voted Best Support Services in the MRS Operations Awards last year and is headquartered in London with offices in Munich, Singapore, New York, Istanbul, Melbourne and Dubai.
ESOMAR is the global voice for the data, research and insights community.
For 70 years, ESOMAR has brought together the research sector to share knowledge, promote best practice and agree upon the future of the industry as a community. With more than 4,900 individual members and over 500 corporate members in more than 130 countries, ESOMAR represents a network of over 35,000 researchers, all of whom agree to uphold the ICC/ESOMAR International Code, which is endorsed or adopted by over 60 national research associations worldwide.
ESOMAR is committed to advancing market research, facilitating ongoing industry dialogue through a comprehensive programme of industry-specific and thematic conferences, publications and communications as well as actively advocating self-regulation and the worldwide code of practice.
By Finn Raben
This morning’s news that the people of Britain had voted to leave the EU has ensured that June 23rd will be an historic day in the annals of the EU… whether it will be regarded as a “good” day or a “bad” day for either Britain or the EU will only become apparent over the next two years.
That said, I suspect that the euphoria expressed by the winning side will soon dissipate as the true complexity of dealing with the outcome becomes clearer.
There was a very interesting article published in The Guardian newspaper yesterday, entitled ‘The UK is now two nations, staring across a political chasm’, written by John Harris. This article took the position that the story of the referendum was the restive mood of millions in the UK, and that the (in Harris’ words) “disgraceful tricks” of political messaging were not sufficiently counterbalanced by responsible broadcast journalism, leading to emotive themes becoming the predominant communication tactic.
Now that the result is known, the chasm is probably a lot wider, and a lot more complex, than a simple In / Out decision might have implied…
- Almost half of Britain does not want to leave the EU:
  – the split of 52% to 48% is by no means an overwhelming majority.
- England and Wales voted to leave the EU; Scotland and Northern Ireland voted to remain…
  – Will this require a second independence referendum in Scotland?
  – Will the travel, customs and security border need to be reinstated between the 26 counties of the Republic and the 6 counties of the North of Ireland?
  – Does this signify the beginning of the end for the Union of Home Nations?
- London voted to remain, much of the North voted to leave…
  – Is a capital city syndrome becoming apparent?
- Most of the House of Commons supported a Remain position…
  – Is the current government now out of sync with the electorate?
  – Is there a need for a general election?
- The young (under 50) voted to remain in the EU, the older generation to leave…
  – Most young people (all over the EU and the world) are pro-Europe, so should older voters have stood back, reminded themselves that the future belongs to their children and grandchildren, and trusted the next generation’s instincts by voting in a way that best suited their future desires?
- What will happen to British expatriates living and working in the EU, and their EU counterparts living and working in the UK?
- The Brexit vote will put immediate stress on transatlantic political unity (amid growing tensions with Russia), and will complicate US trade ties, particularly on issues such as TTIP and the Privacy Shield.
- If we assume that the Brexit vote was largely a referendum on elites and immigration, these are the same themes that Republican nominee Donald Trump has put at the centre of his bid for the White House – will he seize on these results as a vindication of his position and campaign?
- Right-wing conservatives in the Netherlands and in France are using the result to call for referenda in each of the member states; Russia will be delighted at any sign of weakness in the EU structure, and at any evidence of transatlantic disunity, putting pressure on the EU to “resolve” the situation quickly.
The essence of a referendum is that the people speak – not the politicians. The British people have now spoken, and the resulting task facing Britain’s government is a daunting one: effect the withdrawal whilst managing and repairing the rifts that the voting results above have clearly highlighted.
However, an equally daunting challenge now also exists for the European Union: if Britain has reached a point where dissatisfaction with its EU membership has hit breaking point, how long will it be until the next country seeks a similar referendum? The danger is to assume that the EU is “ok”, when it is far from it. Reform will be needed on both sides of the European divide to ensure both entities are “fit for purpose” for the coming years. If this is not recognised – if the political classes do not notice that people are discontented and feel they are not being listened to – then politicians play into the hands of populists who leverage emotions and prejudices… so listen, or beware.
Finn Raben is ESOMAR Director General. @Finn01 #esomar
By Jan Willem Knibbe
Text and data mining is a hot topic. Some see it as a new revenue source for market research and insights professionals, with its promise of effective analysis of vast amounts of data. You don’t have to ask respondents to fill in cumbersome questionnaires but you can analyse what they post online. Fast, cheap and versatile data. However, in Europe the current copyright framework poses a big hurdle for these types of research projects.
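At its simplest, text and data mining boils down to programmatically collecting text and extracting patterns from it. As a rough illustration (the posts below are invented, and any real project would need the rights to copy and mine the underlying content, which is exactly the legal question discussed next), a term-frequency count over a tiny corpus might look like this in Python:

```python
import re
from collections import Counter

# Invented posts standing in for scraped online content; a real project
# would need permission to copy and mine the underlying text.
posts = [
    "Loving the new phone, the camera is amazing",
    "The camera on this phone is terrible, and so is the battery",
    "Battery life is amazing, best phone I have owned",
]

# Tokenise every post and count term frequencies across the corpus.
tokens = [t for post in posts for t in re.findall(r"[a-z']+", post.lower())]
print(Counter(tokens).most_common(5))
```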
Under the current legislation, researchers interested in data mining can only copy copyrighted data with the permission of the author. Unlike in the USA, there is no fair-use provision in the legal framework that would enable using these data for research purposes without the explicit permission of the author. This gap has also been identified by the European Union. As part of the Digital Single Market agenda, a review of the copyright legislation is foreseen. One of the aspects that this review should cover is creating an exemption for research purposes in the copyright law. We have previously published an article about these developments: A new copyright directive in Europe: will there be room for data mining?
Since we published that article, it has become clear that the European Commission will champion a research exemption in the new framework. However, a debate has emerged over whether only non-commercial research should benefit from the exemption. This is a revival of a notion first seen in the General Data Protection Regulation, where it was suggested that only non-commercial research should be allowed to make use of any derogations foreseen for research purposes.
This position, however, is not unanimously supported. Support for broadening the exemption to all research comes not just from the business research community, but also from the academic research community, which has issued a clear call that any distinction between commercial and non-commercial research is artificial and will only lead to legal uncertainty. For example, LERU (the League of European Research Universities) writes that “the potential of Text and Data Mining (TDM) is acknowledged by researchers, who see the benefits of using automated tools to mine the literature and supporting research data. (…) However, the legal basis to allow the use of TDM techniques, certainly in licensed commercial literature, is unclear. What is needed at a European level is a Fair Dealing Exception, certainly for the purposes of research, in the EU Copyright and Database Directives to facilitate the sharing and re-use of research data.”
One of the arguments in favour of not making a distinction is that it would be legally difficult to determine where to draw the line between commercial and non-commercial purposes. It has been noted by many that commercial research projects are carried out by universities, while commercial companies execute research projects for the public good. This has been partly recognised by DG Connect, which focuses on the purpose of the project rather than the body executing it. Nevertheless, the question of where to draw the line, and consequently which projects might benefit from the Commission’s proposals, remains open to interpretation.
Another issue being addressed is the currently voluntary nature of any exemption in the InfoSoc Directive, the European legislation governing copyright. This means that every Member State is free to choose whether or not it provides for an exemption, including for text and data mining. This creates a complex, heterogeneous legal framework to contend with, especially in the case of international research projects, not to mention the uneven playing field created across the different Member States.
Unlike the discussion around putting limits on research, this seems to be a more widely accepted change to the legislation. Nevertheless, European harmonisation is an increasingly sensitive topic for most national governments, which have to approve any new directive in the Council of the European Union. It remains to be seen whether this proposal will make it to the finishing line. A new legislative proposal covering data mining is expected in September 2016, but it is as yet unclear when it could be adopted by the Parliament and the Council. For an explanation of the process, please consult our law making in Europe beginners’ guide.
All in all, there is still a long road ahead of us. ESOMAR will continue to monitor this dossier and try to convince European legislators that the distinction between commercial and non-commercial research is artificial, and that it would prevent all research actors, including applied research actors like market research agencies, from using text and data mining to improve the speed, relevance, and effectiveness of insights for the betterment of businesses and society alike.
Jan Willem Knibbe is Policy & Public Affairs Assistant at ESOMAR.
By Neda Eneva
On 6 June ESOMAR hosted a seminar on the topic of Media Return on Investment. Does modern media need to be redefined? Do we understand the impact of ad fraud on media measurement? How can we measure online consumer behavior more effectively? How can we tackle ad fraud? Is TV as we know it dead? These were only some of the questions raised and discussed during a day full of insights… here are some of the highlights:
The event was opened by Gian Fulgoni, Co-Founder and Executive Chairman of comScore, who dived straight into the issues of the modern overload of data, which offers as many opportunities as it does challenges. Data generated largely by data scientists who do not know how to use it properly, coupled with the pressures of the modern age, such as short timelines and the demand for real-time reaction, often pushes marketers in the wrong direction. Gian highlighted that it is not enough to measure media ROI; it is even more important to understand the particulars of what is being delivered within the media plan. One of the most interesting findings he presented was “Forget the click”: the “click” is not a metric one should use to measure effectiveness because, despite being simpler and cheaper to use, it is often misleading. Instead, he recommends measuring attitudinal impact or changes in brand behavior when assessing media ROI.
Is an article online the same as an article in a print magazine? This is how Leendert van Meerem opened his presentation on definitions. Do we know what we are measuring, and do we need to keep changing the definitions of media? With new platforms, distributors and formats, old silos are being forgotten and new ones emerge, making media measurement more and more complex. Leendert suggests consumer behavior and online measurement as the new central points in this wide array of media platforms and formats. Within this, he identifies merging existing media data with new data sources, perhaps shaping them into one another, which in turn demands new levels of transparency and clarity for non-experts.
The morning session of the seminar was concluded by two innovative case studies on media measurement. Mariana Irazoqui presented SKO’s live hybrid measurement system, which employs both panel and census data. Menno van der Steen from IPG Mediabrands took an econometric approach to media measurement, presenting how to calculate ROI using econometric modeling.
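As a loose illustration of the econometric idea (a generic sketch rather than SKO’s or IPG Mediabrands’ actual model, with simulated spend and sales figures), one can regress a sales series on per-channel media spend and read the fitted coefficients as estimates of the return per unit of spend:

```python
import numpy as np

# Simulated weekly data: spend per channel (TV, online, radio) and sales.
rng = np.random.default_rng(seed=1)
spend = rng.uniform(10, 100, size=(52, 3))      # 52 weeks x 3 channels
true_roi = np.array([2.0, 3.5, 1.2])            # unknown in a real study
sales = 500 + spend @ true_roi + rng.normal(0, 20, size=52)

# Ordinary least squares: sales ~ intercept + spend per channel.
X = np.column_stack([np.ones(52), spend])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
print("estimated base sales:", round(coef[0], 1))
print("estimated ROI per channel:", np.round(coef[1:], 2))
```

A production model would also capture carry-over effects (adstock) and diminishing returns, but the regression above conveys the core mechanics.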
Angel Cuevas Rumin from Universidad Carlos III de Madrid offered a very practical session on ad fraud. Opening with the shocking statistic that 10-30% of overall online advertising is fraudulent, he highlighted that ad fraud is a serious problem that could heavily impact the industry. He identified the main ad fraud examples as adware, auto-refreshing, redirecting traffic and botnets. While underlining that publishers and marketers should be very worried about this issue, he identified ad fraud as an issue largely ignored by the industry. The topic of ad fraud was also discussed by Augustine Fou from Marketing Science Consulting Group, who identified the “bot” as the key cause of ad fraud affecting digital advertising and measurement and, interestingly enough, the link between ad fraud and ad blocking. With the rise of IoT, Augustine predicts that things will get worse before they get better, as millions and millions of devices are now susceptible to an attack by bots. To tackle the issue, he urges the industry to measure bots specifically and filter them out to avoid incorrect metrics. Because ad blocking metrics are also affected by bots, he suggests focusing on more reliable “ads loaded” metrics.
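As a deliberately crude sketch of what “measuring bots specifically and filtering them out” could look like, the snippet below uses invented log records and hypothetical field names; real fraud detection draws on far richer signals than a user-agent blocklist and a rate threshold:

```python
# Invented impression records; the field names are hypothetical.
impressions = [
    {"ua": "Mozilla/5.0", "ip": "203.0.113.7", "events_per_min": 4},
    {"ua": "BadBot/1.0", "ip": "198.51.100.2", "events_per_min": 3},
    {"ua": "Mozilla/5.0", "ip": "203.0.113.9", "events_per_min": 400},
]

KNOWN_BOT_UAS = {"BadBot/1.0"}   # e.g. sourced from an industry blocklist
MAX_HUMAN_RATE = 120             # events/minute a person could plausibly produce

def looks_like_bot(record):
    """Crude heuristics: a declared bot user-agent or an inhuman event rate."""
    return record["ua"] in KNOWN_BOT_UAS or record["events_per_min"] > MAX_HUMAN_RATE

# Keep only traffic that passes the heuristics before computing metrics.
human_traffic = [r for r in impressions if not looks_like_bot(r)]
print(f"kept {len(human_traffic)} of {len(impressions)} impressions")
```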
In a session on transforming data from a liability to a safe fuel, ESOMAR’s Kim Smouter highlighted the importance of trust in maintaining a healthy and functioning industry moving forward. The correct management and handling of data is of great importance if we are to maintain the trust of legislators and the general public. He presented the Data Serenity programme as the steps each professional and company can take to ensure this: be human in your data collection; invest in resources to manage data flows correctly; know your flows; express yourself very simply; and always think about possible liabilities.
The afternoon section of the programme dived into platform-specific case studies. Christian Kurz from Viacom International Media Networks opened with TV RE(DEFINED) from the perspective of the consumer, and how Viacom as a company responded to that. Zooming in on the evolution of TV as a two-way communication platform, available whenever convenient for the viewer, Christian concluded that TV is in fact not dead, but needs a new, broadened definition. He urged the industry to help make multi-platform measurement a reality (with transparency and collaboration across the industry; more data in silos is not the way forward), to feed the super consumers, and to use technology to unlock growth in the industry. Caroline Brasseur from egta also focused on the changing nature of TV and the resulting viewer fragmentation. TV is still TV, Caroline argued, but this evolution raises measurement issues, as measurement currently does not follow the consumer. She highlighted that TV broadcasters would like a new measurement system delivering multi-screen measurement based on the TAM methodology – a system built on a viewer-centric approach rather than a per-device one.
The last section of the day gave the floor to key innovation companies sharing insights on some of their latest measurement and marketing tools and strategies. Lanier Pietras from Google presented the latest Google tools for consumer and brand measurement, such as the Google Cookie List Surveys and Google Geolocation Surveys. Jennifer Brett from LinkedIn showcased the move towards content marketing as the focus of the LinkedIn advertising business. She had useful tips for B2B marketers, suggesting two types of measurement of content marketing effectiveness: the content marketing score and the content effectiveness study. Nathaniel Greywood from Twitter reminded the audience that there is no silver-bullet solution to measuring Twitter, and that it is the connection with other platforms that gives a holistic picture of consumer behavior. Finally, Michalis Michael from DigitalMR presented social media listening as the right method for understanding campaign success. He recommended the use of sentiment technology with semantically accurate reporting, integrating data from multiple sources on one dashboard.
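For readers unfamiliar with sentiment scoring, here is its most basic, lexicon-based form, far removed from the semantically accurate technology Michael describes; the lexicon and posts are invented:

```python
# Invented lexicon and posts; production systems use far richer semantics
# than bag-of-words matching.
POSITIVE = {"love", "amazing", "best", "great"}
NEGATIVE = {"terrible", "worst", "broken", "hate"}

def sentiment(text):
    """Score a post by lexicon hits: positive minus negative word counts."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

posts = ["This campaign is amazing", "Worst ad ever, just terrible"]
print([sentiment(p) for p in posts])  # e.g. [1, -2]
```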
From platforms and definitions to measurement techniques and ad fraud, the seminar touched upon various challenges and opportunities in researching and measuring media ROI. And what better way to end the discussion than to go back to a fundamental aspect of research: how do we best make an impact with our findings? The event was closed by the fantastic Emma Whitehead, Graphic and Creative Director at Kantar, who reminded the audience of the importance of communicating research. It is in human nature to process and anticipate good communication through storytelling, Emma argued. Storytelling makes people concentrate, absorb information more quickly and care more about what is presented to them. Correlation is not causation, and data does not press our buttons, she added. The part of the brain that deals with data does not connect to the sensory-experience part of the brain, which is why turning data into a sensory experience should be a vital component of any research. And so, researchers: how many of you can fit your story into five slides? Or, even more shocking, would you be able to tell the story of your findings without a PowerPoint, in two sentences? Food for thought…
Neda Eneva is Marketing and Communications Manager at ESOMAR. @esomar