London and Amsterdam – 29 June 2016
ESOMAR, the world association for market, social and opinion research, has appointed Language Connect as its official translation partner.
In providing the global voice for the data, research and insights community, ESOMAR strives to unite, support and guide professionals from all over the globe. Recognising the pressing need for such guidance and support in local languages, ESOMAR is delighted to announce the launch of multi-language online portals for its website (now available in French, Japanese, Arabic, Spanish, Chinese and German). These portals provide an introduction to the key services ESOMAR offers its members and the industry, and are accessible live on esomar.org.
Furthermore, ESOMAR now offers all its guidelines, as well as its professional standards resources, translated into the same key languages: French, Japanese, Arabic, Spanish, Chinese and German, available here.
David Brett, Global Sales Director at Language Connect, comments: “Due to the ever expanding global application of its services and reference materials, ESOMAR is seeking to have its key communication platforms and reference documents available in multiple languages. We are delighted to be able to offer ESOMAR the breadth of expertise and Market Research sector knowledge to support this growth.”
Finn Raben, ESOMAR’s Director General stated: “We are delighted to have established this relationship with Language Connect, whose expertise in the translation arena, coupled with a thorough appreciation of our language needs across the world, has helped us to deliver a truly global resource for all our members and partners.”
Language Connect is a specialist Market Research language services provider and has been working in the sector for over a decade, servicing hundreds of clients across all competencies of Market Research. The company was voted Best Support Services in the MRS Operations Awards last year and is headquartered in London with offices in Munich, Singapore, New York, Istanbul, Melbourne and Dubai.
ESOMAR is the global voice for the data, research and insights community.
For 70 years, ESOMAR has brought together the research sector to share knowledge, promote best practice and agree upon the future of the industry as a community. With more than 4,900 individual members and over 500 corporate members in more than 130 countries, ESOMAR represents a network of over 35,000 researchers, all of whom agree to uphold the ICC/ESOMAR International Code, which is endorsed or adopted by over 60 national research associations worldwide.
ESOMAR is committed to advancing market research, facilitating ongoing industry dialogue through a comprehensive programme of industry-specific and thematic conferences, publications and communications as well as actively advocating self-regulation and the worldwide code of practice.
By Finn Raben
This morning’s news that the people of Britain had voted to leave the EU has ensured that June 23rd will be an historic day in the annals of the EU… whether it will be regarded as a “good” day or a “bad” day for either Britain or the EU will only become apparent during the next two years.
That said, I suspect that the euphoria expressed by the winning side will soon dissipate as the true complexity of dealing with the outcome becomes clearer.
There was a very interesting article published in The Guardian newspaper yesterday, entitled ‘The UK is now two nations, staring across a political chasm’, written by John Harris. This article took the position that the story of the referendum was the restive mood of millions in the UK, and that the (in Harris’ words) “disgraceful tricks” of political messaging were not sufficiently counterbalanced by responsible broadcast journalism, leading to emotive themes becoming the predominant communication tactic.
Now that the result is known, the chasm is probably a lot wider, and a lot more complex, than a simple In / Out decision might have implied…
- Almost half of Britain does not want to leave the EU
– the split of 52% – 48% is by no means an overwhelming majority
- England and Wales voted to Leave the EU
- Scotland and Northern Ireland voted to remain in the EU…
– Will this require a second independence referendum in Scotland?
– Will the travel, customs and security border need to be reinstated between the 26 counties of the Republic and the 6 counties of the North of Ireland?
– Does this signify the beginning of the end for the Union of Home Nations?
- London voted to remain, much of the North voted to leave…
– is there a capital city syndrome becoming apparent?
- Most of the House of Commons supported a Remain position;
– Is the current government now out of sync with the electorate?
– Is there a need for a general election?
- The young (under 50) voted to remain in the EU, the older generation to leave….
– Most young people (all over the EU and the world) are pro Europe, so should voters have stood back, reminded themselves that the future is about their children and grandchildren, trusted the next generation’s instincts, and voted in a way that best suited their future desires?
- What will happen to British expatriates, living and working in the EU, and their EU counterparts, living and working in the UK?
- The Brexit vote will put immediate stress on Transatlantic political unity (amid growing tensions with Russia), and will complicate U.S. trade ties, particularly on issues such as TTIP and the Privacy Shield.
- If we assume that the Brexit vote was largely a referendum on elites and immigration, these are the same themes that Republican nominee Donald Trump has put at the center of his bid for the White House – will he seize on these results as a vindication of his position and campaign?
- Right-wing conservatives in the Netherlands and in France are using the result to call for referenda in each of the member states. Russia, meanwhile, will be delighted at any sign of weakness in the EU structure and at any evidence of transatlantic disunity, putting pressure on the EU to “resolve” the situation quickly
The essence of a referendum is that people speak – not the politicians. The British people have now spoken, and the resulting task facing Britain’s government is a daunting one: effect the withdrawal, whilst managing and repairing the rifts that the voting results above have clearly highlighted.
However, an equally daunting challenge now also exists for the European Union: if Britain has reached a point where dissatisfaction with its EU membership has hit breaking point, how long will it be until the next country seeks a similar referendum? The danger is to assume that the EU is “ok”, whereas it is far from it. Reform will be needed on both sides of the European divide to ensure both entities are “fit for purpose” for the coming years. If this is not recognised, if the political classes do not notice that people are discontented and feel they are not being listened to, then politicians play into the hands of populists who leverage emotions and prejudices… so listen, or beware.
Finn Raben is ESOMAR Director General. @Finn01 #esomar
by Jan Willem Knibbe
Text and data mining is a hot topic. Some see it as a new revenue source for market research and insights professionals, with its promise of effective analysis of vast amounts of data. You don’t have to ask respondents to fill in cumbersome questionnaires but you can analyse what they post online. Fast, cheap and versatile data. However, in Europe the current copyright framework poses a big hurdle for these types of research projects.
Under the current legislation, researchers interested in data mining can only copy copyrighted data with the permission of the author. Unlike in the USA, there is no fair-use provision in the legal framework that would enable using these data for research purposes without the explicit permission of the author. This gap has also been identified by the European Union. As part of the Digital Single Market agenda, a review of the copyright legislation is foreseen. One of the aspects that this review should cover is creating an exemption for research purposes in the copyright law. We have previously published an article about these developments: A new copyright directive in Europe: will there be room for data mining?
Since we published that article, it has become clear that the European Commission will champion a research exemption in the new framework. However, a debate has emerged over whether only non-commercial research should benefit from the exemption. This revives a notion first proposed during the drafting of the General Data Protection Regulation, where it was suggested that only non-commercial research should be allowed to make use of any derogations foreseen for research purposes.
This position, however, is not unanimously supported. Support for broadening the exemption to all research comes not just from the business research community, but also from the academic research community, which has issued a clear call that any distinction between commercial and non-commercial research is artificial and will only lead to legal uncertainty. For example, LERU (the League of European Research Universities) writes: “The potential of Text and Data Mining (TDM) is acknowledged by researchers, who see the benefits of using automated tools to mine the literature and supporting research data. (…) However, the legal basis to allow the use of TDM techniques, certainly in licensed commercial literature, is unclear. What is needed at a European level is a Fair Dealing Exception, certainly for the purposes of research, in the EU Copyright and Database Directives to facilitate the sharing and re-use of research data.”
One of the arguments in favour of not making a distinction is that it would be legally difficult to determine where to draw the line between commercial and non-commercial purposes. It has been noted by many that commercial research projects are carried out by universities, while commercial companies execute research projects for the public good. This has been partly recognised by DG Connect, which focuses on the purpose of the project rather than the body executing it. Nevertheless, the question of where to draw the line, and consequently which projects might benefit from the Commission’s proposals, remains open to interpretation.
Another issue being addressed is the current voluntary nature of any exemption in the InfoSoc Directive, the European legislation governing copyright. This means that every Member State is free to choose whether or not it provides for an exemption, including one for text and data mining. This creates a complex, heterogeneous legal framework to contend with, especially in the case of international research projects, not to mention the uneven playing field created across the different Member States.
Unlike the discussion around putting limits on research, this seems to be a more widely accepted change to the legislation. Nevertheless, European harmonisation is an increasingly sensitive topic for most national governments, which have to approve any new directive in the Council of the European Union. It remains to be seen whether this proposal will make it to the finishing line. A new legislative proposal covering data mining is expected in September 2016, but it is as yet unclear when it could be adopted by the Parliament and Council. For an explanation of the process, please consult our law making in Europe beginners’ guide.
All in all, there is still a long road ahead of us. ESOMAR will continue to monitor this dossier, and try to convince European legislators that the distinction between commercial and non-commercial research is artificial and would prevent all research actors, including applied research actors like market research agencies, from using text and data mining to improve the speed, relevance and effectiveness of insights for the betterment of businesses and society alike.
Jan Willem Knibbe is Policy & Public Affairs Assistant at ESOMAR.
By Neda Eneva
On 6 June ESOMAR hosted a seminar on the topic of Media Return on Investment. Does modern media need to be redefined? Do we understand the impact of ad fraud in media measurement? How can we measure online consumer behavior more effectively? How can we tackle ad fraud? Is TV as we know it dead? These were only some of the questions raised and discussed at this day full of insights… here are some of the highlights:
The event was opened by Gian Fulgoni, Co-Founder and Executive Chairman of comScore, who dived immediately into the issues of the modern overload of data, which offers as many opportunities as it does challenges. Data generated largely by data scientists who do not know how to use it properly, coupled with the pressures of the modern age, such as short timelines and demand for real-time reaction, often pushes marketers in the wrong direction. Gian highlighted that it is not enough to measure media ROI; it is even more important to understand the particulars of what is being delivered within the media plan. One of his most interesting findings was “Forget the click”: the “click” is not a metric one should use to measure effectiveness, because despite being simpler and cheaper to use, it is often misleading. Instead, he recommends measuring attitudinal impact or changes in brand behavior when assessing media ROI.
Is an article online the same as an article in a print magazine? This is how Leendert van Meerem opened his presentation on definitions. Do we know what we are measuring, and do we need to keep changing the definitions of media? With new platforms, distributors and formats, old silos are being forgotten and new ones are emerging, making media measurement more and more complex. Leendert suggests consumer behavior and online measurement as the new central points in this wide array of media platforms and formats. Within this, he identifies merging existing media data with new data sources, perhaps shaping them into one another, which in turn demands new levels of transparency and clarity for non-experts.
The morning session of the seminar was concluded by two innovative case studies on media measurement. Mariana Irazoqui presented SKO’s live hybrid measurement system, based on a hybrid measuring method that employs both panel and census data. Menno van der Steen from IPG Mediabrands offered an econometric take on media measurement, presenting how to calculate ROI using econometric modeling.
Angel Cuevas Rumin from Universidad Carlos III de Madrid offered a very practical session on ad fraud. Opening with the shocking statistic that 10-30% of overall online advertising is fraudulent, he highlighted that ad fraud is a serious problem that could heavily impact the industry. He identified the main ad fraud examples as adware, auto refreshing, redirecting traffic and botnets. While underlining that publishers and marketers should be very worried about this issue, he identified ad fraud as an issue largely ignored by the industry. The topic of ad fraud was also discussed by Augustine Fou from Marketing Science Consulting Group, who identified the “bot” as the key cause of ad fraud affecting digital advertising and measurement, and, interestingly, the link between ad fraud and ad blocking. With the rise of IoT, Augustine predicts that things will get worse before they get better, as millions and millions of devices are now susceptible to attack by bots. To tackle these issues, he urges the industry to measure bots specifically and filter them out to avoid incorrect metrics. Because ad blocking is also affected by bots, he suggests focusing on more reliable “ads loaded” metrics.
In a session on transforming data from a liability to a safe fuel, ESOMAR’s Kim Smouter highlighted the importance of trust in maintaining a healthy and functioning industry moving forward. The correct management and handling of data is of great importance if we are to maintain the trust of legislators and the general public. He presented the Data Serenity programme as the steps each professional and company can take to ensure this: be human in your data collection; invest in resources to manage data flows correctly; know your flows; express yourself very simply; and always think about possible liabilities.
The afternoon section of the programme dived into platform-specific case studies. Christian Kurz from Viacom International Media Networks opened with TV RE(DEFINED), seen from the perspective of the consumer and how Viacom as a company responded to that. Zooming in on the evolution of TV as a two-way communication platform, available whenever convenient for the viewer, Christian concluded that TV is in fact not dead, but needs a new, broadened definition. He urged the industry to help make multi-platform measurement a reality (with transparency and collaboration across the industry – more data in silos is not the way forward), to feed the super consumers, and to use technology to unlock growth in the industry. Caroline Brasseur from egta also focused on the changing nature of TV and the resulting viewer fragmentation. TV is still TV, Caroline argued, but this evolution leads to issues concerning measurement, which currently does not follow the consumer. She highlighted that TV broadcasters would like a new measurement system delivering multi-screen measurement based on the TAM methodology – a system built on a viewer-centric approach, rather than measuring per device.
The last section of the day gave the floor to key innovation companies sharing insights on some of their latest measurement and marketing tools and strategies. Lanier Pietras from Google presented the latest Google tools on consumer and brand measurement, such as the Google Cookie List Surveys and Google Geolocation Surveys. Jennifer Brett from LinkedIn showcased the move towards content marketing as the focus point of the LinkedIn advertising business. She had useful tips for B2B marketers, suggesting two types of measurement of content marketing effectiveness – the content marketing score and the content effectiveness study. Nathaniel Greywood from Twitter reminded the audience that there is no silver bullet solution to measuring Twitter, and that it is the connection with other platforms that gives a holistic picture of consumer behavior. Finally, Michalis Michael from DigitalMR presented social media listening as the correct method to understand campaign success. He recommended the use of sentiment technology with semantically accurate reporting, integrating data from multiple sources on one dashboard.
From platforms and definitions to measurement techniques and ad fraud, the seminar touched upon various challenges and opportunities in researching and measuring media ROI. And what better way to end this discussion than to go back to a fundamental aspect of research – how are we best to make an impact with our findings? The event was closed by the fantastic Emma Whitehead, Graphic and Creative Director at Kantar, who reminded the audience of the importance of communicating research. It is in human nature to process and anticipate good communication through storytelling, Emma argued. Storytelling makes people concentrate, absorb quicker and care more about the information presented to them. Correlation is not causation, and data does not press our buttons, she added. The part of the brain that deals with data does not relate to the sensory experience section of the brain, which is why turning data into a sensory experience should be a vital component of any research. And so, researchers, how many of you can fit your story in five slides or, even more shocking, would you be able to tell the story of your findings without a PowerPoint and in two sentences? Food for thought…
Neda Eneva is Marketing and Communications Manager at ESOMAR. @esomar
Ahead of the European Pharmaceutical Market Research Association (EphMRA)‘s flagship Business Intelligence/Analysis Conference in Frankfurt, Germany, held June 21st – 23rd 2016, EphMRA President Thomas Hein outlines the latest trends changing the day-to-day lives of healthcare researchers.
By Thomas Hein
Market research is an essential activity for all healthcare companies.
It provides the unbiased, independent voice of the customer to the healthcare companies and, therefore, has to follow several regulations especially with regard to data privacy.
Market research is used to guide decisions in several areas of the business: identification of unmet customer needs, development of the product portfolio, communication strategies, and the awareness and utilisation of products, to mention just a few. It is therefore imperative that all companies keep up to speed with an ever-changing market research landscape.
These exciting and fast-paced changes also have an impact on researcher activities as professionals, across both companies and agencies, must always be aware of the newest trends, anticipate the needs of their customers and address them with the best methodologies.
The vision of EphMRA is to create excellence in professional standards and practices, enabling healthcare market researchers to become highly valued business partners.
The role of the market researcher in this sector has evolved over the last 10 years from a data analyst providing information towards a customer and markets insights expert providing decision support.
There are several key trends currently impacting the healthcare industry which market researchers have to be aware of and address appropriately.
A holistic patient focused approach
For pharmaceutical companies, one of these trends is patient centricity, which is becoming a core strategy as companies recognise that patients are increasingly involved in the decision-making process around healthcare services and the prescription of drugs.
A patient-focused market research approach with a holistic view of the patient (rather than looking at one disease state and its treatment) requires different methods compared to conducting research with healthcare professionals.
Very often market researchers do not have an understanding of the patient and their role, as they do not have direct contact with patients. Market researchers have to provide the patient perspective and recommend patient-oriented strategies to their companies.
In traditional ethnographic research, a patient is followed daily for hours during their day-to-day life and observed on key topics. Questions are asked within the observed situation to gain a deeper understanding about attitudes and motivation.
Now, new technologies will make this method, which has been conducted more and more in recent years, less cost-intensive and even more observational, almost completely eliminating interviewer influence.
One of the stand-out new technologies now available is Google Glass which allows researchers to see the world through the patient’s eyes.
Usage of such a technology has still to be explored, and the first agencies are now embracing it. Additional technology is already available via smartphones: patients can make audio and video recordings either ‘in the moment’ or after it, such as after a physician visit.
This leads to another important trend for the future, mobile health – which is the practice of medicine and public health supported by mobile devices.
Mobile health applications include the use of mobile devices in collecting community and clinical health data; delivering healthcare information to practitioners, researchers and patients; real-time monitoring of patient vital signs; collection of personal health-related data by consumers, including patients; and direct provision of care.
This leads to a huge amount of data which has not previously been available. If patients start to collect data on their lifestyle, disease state, physician visits, diet and reasons for their decisions, it will allow a holistic view of the patient across all healthcare-related aspects, which has not been available so far.
Mobile health will also change the way healthcare companies are communicating with their customers, especially healthcare professionals and patients.
The various digital channels currently available have different advantages and disadvantages compared to the traditional model of sales representatives visiting healthcare professionals. The communication tools will get more interactive and will allow healthcare professionals to receive the information they need at the point in time they need it.
Data privacy and security
With all the data now available, and more patient-level data generated by primary market research, the topic of data privacy and security becomes increasingly important.
For Europe, the fundamental right to the protection of personal data is already explicitly recognised in Article 8 of the Charter of Fundamental Rights of the European Union. There are special regulations around the processing of health data.
Similar regulations exist for other geographies, and this has implications for primary market research as well as for the analysis of patient level healthcare data.
EphMRA is in continuous contact with the respective authorities to explain the nature of market research and for which objectives data is used, and informs its members about changes in regulations in Europe as well as other major geographies to ensure the companies and agencies are compliant.
Several of the topics mentioned above and the implications for market research will be addressed at the EphMRA Business Intelligence/Analysis Conference in Frankfurt, Germany from June 21st – 23rd 2016.
Registration is open now for the industry leading event. For full details visit www.ephmraconference.org
Thomas Hein is President of EphMRA and Global Director Customer Insights and Strategy Immunodiagnostics, Thermo Fisher Scientific
EphMRA (European Pharmaceutical Market Research Association) strives to create excellence in professional standards and practices to enable healthcare market researchers to become highly valued business partners.