A New Privacy Shield will be in place to protect EU/US data flows starting 1 August 2016
By Kim Smouter
A MAJOR RELIEF?
Europe and the United States have announced that they have come to an agreement on a replacement mechanism for the EU/US Safe Harbour. The Safe Harbour scheme had been struck down by the European Court of Justice last year, forcing European and American authorities to scramble to set up a replacement mechanism allowing the free flow of data between the world’s two largest data markets.
In February, authorities had announced the Privacy Shield, which sought to address the European Court of Justice’s opposition to indiscriminate mass surveillance of Europeans and the non-equivalent level of redress afforded to them. But following negative feedback from European politicians and from European and national data protection authorities about the new scheme, it was uncertain whether the Privacy Shield would ever see the light of day.
Companies wishing to sign up to the new Privacy Shield will be invited to do so starting 1 August 2016, noting that at the moment data transfers using the old Safe Harbour are illegal and subject to enforcement actions. German data protection authorities have already begun issuing fines to companies that are still transferring data using the old scheme.
THE PRIVACY SHIELD SURVIVES SCRUTINY AND POLITICAL OPPOSITION
So, despite political opposition to the new Shield, representatives of EU Member States and the European Commission gave their final nod of approval to the proposed scheme. A new version of the text was prepared to address the negative reviews of the national data protection authorities and the European Data Protection Supervisor who will eventually have enforcement responsibility over the scheme.
The Privacy Shield is a slightly different animal from its predecessor, but for those involved in the previous scheme it should be seen as an evolution of the pre-existing requirements.
Nonetheless, there are a number of changes to highlight from a company’s perspective, including:
STRICTER NOTIFICATION REQUIREMENTS
- The Privacy Shield requires additional information be provided to individuals in the Notice Principle, including a declaration of the organization’s participation in the Privacy Shield, a statement of the individual’s right to access personal data, and the identification of the relevant independent dispute resolution body;
STRICTER CONTRACTUAL REQUIREMENTS
- The Privacy Shield strengthens protection of personal data that is transferred from a Privacy Shield organization to a third party controller by requiring contracts that provide that personal data may only be processed for limited and specified purposes consistent with the consent provided by the individual, and that the recipient will provide the same level of protection as the Principles;
GREATER EMPHASIS ON DATA CHAIN RESPONSIBILITIES
- The Privacy Shield strengthens protection of personal data that is transferred from a Privacy Shield organization to a third party agent, requiring a Privacy Shield organization to:
- take reasonable and appropriate steps to ensure that the agent effectively processes the personal information transferred in a manner consistent with the organization’s obligations under the Principles;
- upon notice, take reasonable and appropriate steps to stop and remediate unauthorized processing; and provide a summary or a representative copy of the relevant privacy provisions of its contract with that agent to the Department upon request;
CLARIFICATION OF LIABILITIES
- The Privacy Shield organization (the data importer) is responsible for the processing of personal information it receives under the Privacy Shield and subsequently transfers to a third party acting as an agent on its behalf.
- The Privacy Shield organization remains liable under the Principles if its agent processes such personal information in a manner inconsistent with the Principles, unless the organization proves that it is not responsible for the event giving rise to the damage;
- The Privacy Shield also clarifies that Privacy Shield organizations must limit personal information to the information that is relevant for the purposes of processing;
ANNUAL CERTIFICATION REQUIREMENTS
- If an organization leaves the Privacy Shield and chooses to keep the data it received while participating, the Privacy Shield requires it to certify annually to the US Department of Commerce its commitment to continue applying the Principles to that information;
- It also requires that an independent recourse mechanism be provided at no cost to the individual;
STRONG EXPECTATIONS TO RESPOND PROMPTLY TO REQUESTS
- The Privacy Shield requires organizations and their selected independent recourse mechanisms to respond promptly to inquiries and requests by the Department for information relating to the Privacy Shield;
- The Privacy Shield also requires organizations to respond expeditiously to complaints regarding compliance with the Principles referred by EU Member State authorities through the Department;
- It further requires a Privacy Shield organization to make public any relevant Privacy Shield-related sections of any compliance or assessment report submitted to the FTC if it becomes subject to an FTC or court order based on non-compliance.
MORE FLEXIBLE RETENTION PERIODS FOR RESEARCH AND STATISTICAL ANALYSIS
- The Privacy Shield hasn’t forgotten about offering a differentiated regime for research, as organizations may retain personal information for the time and to the extent such processing reasonably serves the purposes of archiving in the public interest, journalism, literature and art, scientific or historical research, and statistical analysis.
THAT’S GREAT, BUT WHAT’S THE ADVICE FOR MARKET RESEARCHERS?
Clearly the adoption of a new Privacy Shield offers a much more “user-friendly” mechanism to re-enable data transfers between the EU and the US, much as the Safe Harbour scheme previously allowed more than 4,000 companies to transfer data easily between the two data markets. Market, opinion, and social researchers also benefited from that scheme, as both leading agencies and many suppliers to the industry were using it.
The alternatives, like binding corporate rules and the standard contractual clauses provided by the Commission, can be cumbersome and sometimes simply cannot be applied to the use case. So having this scheme can be a relief.
There is, nonetheless, a word of caution about rushing to adopt the Privacy Shield. As highlighted by the European Parliament’s rapporteur on the General Data Protection Regulation, Jan Albrecht, there are many who think the new Privacy Shield will not pass muster in front of the courts, and privacy advocates will be rushing to challenge the new decision.
There is therefore a real risk that in the not too distant future, the Privacy Shield may be struck down like its predecessor by the Court of Justice. Companies should think carefully about whether they wish to invest the time and resources to certify under the new scheme in light of this uncertainty.
In light of these developments, our advice to our members can be summarised as follows. It is consistent with the advice we have been providing since the Court of Justice decision:
- Conducting an audit of any data transfers that may route via the US is crucial to determine your exposure to the Court of Justice ruling that personal data transfers to the US under the Safe Harbour scheme are illegal.
- Updating your privacy policies to highlight the existence of these data transfers, if you haven’t already, is a crucial step. The aim should be to state as clearly as possible what data is transferred to the US, the conditions under which that data travels, and the risks involved. It’s important that this is understood as an indication of goodwill and not mistaken for a compliance measure by the organisation.
- Seeking alternatives to transfers to the US remains a useful step to consider as all transfer schemes currently in existence have proven subject to potential legal challenges.
- Where possible, partner with European-based services to execute your data processing tasks involving Europeans’ personal data, as this will reduce exposure to legal problems stemming from the non-equivalent levels of protection that can result from using a non-EU partner. Anonymised data is not subject to these restrictions, so it may be wise to process the data in Europe and then send it to US entities as anonymised data sets.
- If this is not possible or practical, then an alternative mechanism like binding corporate rules, standard contractual clauses, or the Privacy Shield (starting 1 August 2016) must be in place before personal data transfers from the EU to the US can take place. If you’ve already adopted one of the other alternative mechanisms, there is little reason to switch to the Privacy Shield.
- If you intend to use the Privacy Shield, we recommend that partners you use for data processing be subject to an annual audit of their Privacy Shield certification along with meeting the requirements referenced above.
- It may also be useful to consider adding a safeguard clause into your contracts which allows you to require your partner to work with you to find alternatives should the Privacy Shield be subject to a new legal challenge, and should an alternative not exist, allow the termination of the partnership without any additional fees.
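The anonymisation approach suggested above (processing in Europe, then sharing only de-identified data sets with US entities) can be sketched in code. This is a minimal illustration only, not legal advice: the field names and the keyed-hash pseudonymisation shown here are assumptions, and true anonymisation under EU law requires removing or generalising all re-identifying attributes, not just direct identifiers.

```python
import hashlib
import hmac

# Direct identifiers to strip before any transfer (hypothetical field names).
DIRECT_IDENTIFIERS = {"name", "email", "phone"}

def pseudonymise(records, secret_key):
    """Strip direct identifiers and replace the respondent ID with a keyed
    hash. The secret key stays with the EU-based processor, so the US
    recipient cannot reverse the mapping on its own."""
    out = []
    for rec in records:
        clean = {k: v for k, v in rec.items() if k not in DIRECT_IDENTIFIERS}
        digest = hmac.new(secret_key, str(rec["respondent_id"]).encode(), hashlib.sha256)
        clean["respondent_id"] = digest.hexdigest()[:16]
        out.append(clean)
    return out
```

Because the same input always maps to the same pseudonym, results can still be linked across waves of a study without exposing who the respondent is.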
WE’RE HERE TO HELP YOU!
ESOMAR members may feel the need to reach out to determine whether the Privacy Shield is the right mechanism for them. The document itself can be quite daunting! That’s why ESOMAR’s Professional Standards service operates a free queries service that can assist members in their reflections. Members can contact our services at email@example.com. So if you have any questions, don’t hesitate to get in touch.
Kim Leonard Smouter is Head of Public Affairs & Professional Standards at ESOMAR.
By the ESOMAR USA Representatives
Although North America was unable to retain its position as the fastest-growing market for research for a second year running, the overall market size has increased by almost 20% according to the latest ESOMAR Global Market Research report. And while the overall growth figure may have implied a slowdown, the market measure has expanded to include two new sectors, giving the region a net growth of 0.5%. Following on from this, we asked the ESOMAR representatives in the US about the challenges, opportunities and trends in their market.
Did someone say mobile? Again?
As this series of articles continues, having looked at the LATAM, APAC and MENAP regions, we do indeed see the same patterns emerging, even more so in a developed market such as the United States. It is, of course, the conversation about mobile and internet penetration rearing its head again. Jackie Lorch, Vice President, Global Knowledge Management, SSI USA, comments, “With online penetration approaching 90%, online is the go-to data collection methodology, and don’t even think about fielding a questionnaire that can’t be completed on a mobile phone.” Yet, although this knowledge is commonplace, it doesn’t mean the industry has caught up yet. With US smartphone penetration near 60%, survey participants are increasingly choosing to take surveys on mobile devices. Lorch observes, “The industry has not made it a priority to put participants first and design mobile-friendly questionnaires. Likewise mobile in-the-moment research presents wonderful opportunities to interact at the moment of decision-making with video or image capture. Yet we have largely failed to engage.”
This sentiment is echoed by Melanie Courtright, EVP, Products and Client Services, Research Now. “The biggest challenge is learning how to evolve for mobile devices — the questionnaires themselves have to change, and we are really struggling with moving fast enough in America.” Although the market research world might be lagging behind, there are big opportunities here, comments Courtright, “Biggest opportunity is in automation of basic research types so that through standardization we can integrate other forms of data better and spend more time on interpretation and make decisions more quickly.”
While the rest of the world might have an inbuilt stereotype of Americans (can anyone blame them, Donald Trump anyone?) – it couldn’t be further from the truth. Lorch comments, “Most of the stereotypes you have heard about America are exaggerations. Most Americans enjoy foods other than burgers, fries and buckets of cola and many are well-informed about and interested in other countries in the world!” Indeed, Courtright observes, “The US is very diverse, both in business and with consumers. It’s like many small countries grouped together, so to try and approach it as one market is not possible.”
The US shouldn’t be treated as one country – this is a population of almost 319 million people spanning more than 9 million square kilometres. That’s a lot of people; can we really expect them to have the same thoughts and opinions? Lorch expands on this further, “Different geographies, attitudes and cultures can be found within its borders. You’re likely to find doing business in the fast-paced, intense, “in-your-face” culture of New York City different from the more laid-back, proudly non-conformist, technology-driven Northern California, for example. Street signs you may see along the way help tell the story!”
Very visible in the US, but not just limited to here, is a major new societal trend that will impact research – fragmentation – in almost every aspect of modern life. Lorch notes, “From people’s time and attention, the data sources they use, their digital device habits, to the diversity of their beliefs, lifestyles, attitudes and interests. Institutions in the media, government and communities that used to help define large groups among the population have largely vanished to be replaced by customisation of the individual experience to a massive degree. As society fragments, are our traditional research taxonomies relevant anymore? We still group people by age, by ethnicity, by geography, in ways that haven’t changed for generations. We should instead consider life stages, and new attitudinal groupings as ways to better understand the consumer. This is equally true for B2B research where titles, responsibilities and purchase patterns are changing rapidly and we need to target based on the reality of today’s job functions and responsibilities.”
There are many trends impacting on America, and indeed society at large, none more so than technology. Lorch comments, “The idea of technology as not just enabler, but also driver of our business is a phenomenon noted by Unilever’s Stan Sthanunathan. Technology has made research more efficient, and improved its quality for companies who have invested in it. Now technology is doing more: actively directing where research will be and go in the future. It is taking over many research tasks that humans used to do. The challenge is that powerful technology and the expertise to run it is usually only available to the larger players, so many smaller enterprises need to find a new raison d’etre, or risk being swallowed up.”
Big data and the internet of things will also shape the future of market research, but we first need to get over the problems. “The practical and operational obstacles in the way of getting value from all the data now available are not trivial, yet the potential rewards are massive. If we can overcome the obstacles, research can use big data to answer the what, when and where questions and surveys to get at the why and what next – resulting in shorter, more interesting surveys and more accurate factual data.”
And a further trend to look out for, comes from Courtright, who comments, “A trend we’re seeing is definitely privacy and what that means in a world of cookies and meters and observational data collection. And in turn, society’s reaction to those practices, along with their expectations of transparency and responsibility.”
How to do business here
While we know we need to let go of those stereotypes of Americans we see in the media, how do we do business here?
Lorch has some sound advice, “Americans are informal and direct in business dealings and make decisions relatively quickly – so don’t be afraid to ask for the business and discuss specifics like delivery times and costs.” But, don’t mistake that good old American positivity for success. “A positive attitude is much admired in the US, so even if someone tells you they’re “incredibly excited” about meeting you and hearing about your product it doesn’t mean you’ve made the sale!” observes Lorch.
So what have we learnt about doing business in the United States? Don’t treat this country as one…
Special thanks to Jackie and Melanie for this article.
Jackie Lorch, Vice President, Global Knowledge Management, SSI USA
Melanie Courtright, EVP, Products and Client Services, Research Now
by Jan Willem Knibbe
Text and data mining is a hot topic. Some see it as a new revenue source for market research and insights professionals, with its promise of effective analysis of vast amounts of data. You don’t have to ask respondents to fill in cumbersome questionnaires but you can analyse what they post online. Fast, cheap and versatile data. However, in Europe the current copyright framework poses a big hurdle for these types of research projects.
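As a tiny, purely illustrative sketch of what text and data mining involves in practice (and assuming the posts in question may lawfully be copied and analysed), counting term frequencies across a set of public posts might look like this:

```python
import re
from collections import Counter

def term_frequencies(posts, top_n=5):
    """Count the most frequent words across a collection of text posts,
    a very basic building block of text mining."""
    words = re.findall(r"[a-z']+", " ".join(posts).lower())
    return Counter(words).most_common(top_n)
```

Real TDM pipelines add far more (tokenisation, stop-word removal, entity extraction, sentiment models), but even this trivial step requires making copies of the underlying texts, which is exactly where the copyright question arises.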
Under the current legislation, researchers interested in data mining can only copy copyrighted data with the permission of the author. Unlike in the USA, there is no fair-use provision in the legal framework that would enable using these data for research purposes without the explicit permission of the author. This gap has also been identified by the European Union. As part of the Digital Single Market agenda, a review of the copyright legislation is foreseen. One of the aspects that this review should cover is creating an exemption for research purposes in the copyright law. We have previously published an article about these developments: A new copyright directive in Europe: will there be room for data mining?
Since we published that article, it has become clear that the European Commission will champion a research exemption in the new framework. However, a debate has emerged over whether only non-commercial research should benefit from the exemption. This revives a notion first proposed in the General Data Protection Regulation, where it was suggested that only non-commercial research should be allowed to make use of any derogations foreseen for research purposes.
This position, however, is not unanimously supported. Support for broadening the exemption to all research isn’t coming just from the business research community, but also from the academic research community, which issued a clear call that any distinction between commercial and non-commercial research is artificial and will only lead to legal uncertainty. For example, LERU (League of European Research Universities) writes that “the potential of Text and Data Mining (TDM) is acknowledged by researchers, who see the benefits of using automated tools to mine the literature and supporting research data. (…) However, the legal basis to allow the use of TDM techniques, certainly in licensed commercial literature, is unclear. What is needed at a European level is a Fair Dealing Exception, certainly for the purposes of research, in the EU Copyright and Database Directives to facilitate the sharing and re-use of research data.”
One of the arguments in favour of not making a distinction is that it would be legally difficult to determine where to draw the line between commercial and non-commercial purposes. It has been noted by many that commercial research projects are carried out by universities, while commercial companies execute research projects for the public good. This has been partly recognised by DG Connect, which focuses on the purpose of the project rather than the body executing it. Nevertheless, the question of where to draw the line, and consequently which projects might benefit from the Commission’s proposals, remains open to interpretation.
Another issue being addressed is the currently voluntary nature of any exemption in the InfoSoc Directive, the European legislation governing copyright. This means that every Member State is free to choose whether or not it provides for an exemption, including for text and data mining. This creates a complex, heterogeneous legal framework to contend with, especially in the case of international research projects, not to mention the uneven playing field created across the different Member States.
Unlike the discussion around putting limits on research, this seems to be a more accepted change in the legislation. Nevertheless, European harmonisation is an increasingly sensitive topic for most national governments, which have to approve any new directive in the Council of the European Union. It remains to be seen if this proposal will make it to the finishing line. A new legislative proposal covering data mining is expected in September 2016, but it is as yet unclear when it could be adopted by the Parliament and Council. For an explanation of the process, please consult our law making in Europe beginners’ guide.
All in all, there is still a long road ahead of us. ESOMAR will continue to monitor this dossier and try to convince European legislators that the distinction between commercial and non-commercial research is artificial and would prevent research actors, including applied research actors like market research agencies, from using text and data mining to improve the speed, relevance, and effectiveness of insights for the betterment of businesses and society alike.
Jan Willem Knibbe is Policy & Public Affairs Assistant at ESOMAR.
By Alieke Stubbe
Every day we interact continuously with technology by using our smartphones, wearing activity trackers and being online every single moment of the day. Thanks to this, we get smart notifications and real-time information. Think about Google Calendar that tells you exactly when to leave for work based on real-time calculations. My Garmin tracks my activity every day and I just love the fact that I can follow my own activity and optimise my daily behaviour via a personal app.
Technologies enable us to do things we could not imagine a few years ago, but there are still some issues today. First of all, it’s not a seamless experience. We need PIN codes and passwords or have to carry extra devices, which can be rather messy. Overall, nothing is really connected and this creates a very fragmented experience. Secondly, and this is the biggest issue, there is a blind spot. We don’t always know when we are disclosing information or what will be done with it. We have no control over our own information!
Imagine we could “uberize” the whole idea of data collection: what if humans could leverage and exploit all the data they are collecting anyway? Just like pretty much everyone can drive a car, what if everyone could extract value from their own data? Moreover, how can the market research industry benefit more from technology and consumers’ addiction to tracking just about everything? How can we combine the value these technologies bring consumers with our need to collect data?
In comes the chip: a non-painful chip implant that can be (de)activated by the chip carrier every month. A chip so smart that it captures your behaviour, the brands you use, your emotions, moods, thoughts and attitudes. A chip that can connect with your debit card, smartphone, car and even your home security system. A chip that uploads all data in real time. All these metrics are tracked and can be used anonymously for commercial purposes (like in traditional market research).
And what’s in it for you, the chip carrier? The chip provides you with the ultimate personal coach! You can set personal goals according to your weight or your activity level. You can get personalized working schedules, spending alerts and so on. Companies can offer you customized services and products based on your data (e.g. bank accounts, insurances, etc.). And all this very conveniently and seamlessly. Above all, you are the one in charge of your data, you decide when you want to share your personal information and to whom.
Think about it, no more self-reported data, no more ad-hoc set-ups, no more ‘annoying’ questionnaires. Think about real-time human data and having access to everything consumers do, think and feel. This way we can move from researching to monitoring. Furthermore, we can forget about looking at the what, who, how and when questions; the chip will tell us everything. The only thing we still need to figure out is the WHY, why do consumers do what they do?
So what do you think? What would you want in return for sharing everything you do, think and feel? How much would you want for having a (market research) chip implanted in your wrist, tracking every movement? What price would you want for giving up your privacy? 10 000 euro a year? 500 euro for every month you share your data? Would you have your studies paid in exchange for sharing all your data during those studies? What would be a fair transaction between research agency and participant?
Alieke joined InSites Consulting as a Qualitative Research Consultant, after completing a Master’s degree in industrial psychology & human resources and a Postgraduate degree in marketing management. As part of the InSites Consulting Technology & Services team, she is currently working for a range of local and global clients. With this idea on the future of market research she was rewarded with the Febelmar Young Talent Award at the annual Febelmar Congress in Brussels.
Original post by ESOMAR Foundation
In 2013, the World Food Programme was faced with a challenge when conflict broke out in the North Kivu region of the Democratic Republic of Congo. They needed updated food security indicator data, but could not collect it via traditional methods. This led them to develop a new way to collect food security data: through mobile survey company GeoPoll, which had the ability to send SMS messages directly to the phones of citizens in North Kivu.