100,000+ data points: “According to our data, the diapers should fit”
Moms and babies: “Check out these photos – not so much”
This was the essence of the findings in a case study with a major consumer goods company. The company was losing market share when babies transitioned from one diaper size up to the next. All of its technical data indicated that fit was not an issue. Yet even with all that data at the company's disposal, it was a digital qualitative study that uncovered that fit was very much an issue at size transition, enabling the company to take steps to address the challenge.
In the new world of Big Data, some might question whether qualitative research is still necessary. Indeed, qualitative research has always been a small part of the market research industry compared to quantitative research. But as the example above demonstrates, qualitative still has a place in the research toolkit. In addition to its stand-alone role of generating deep consumer insight, qual has always fulfilled an additional and fundamental role – that of delivering the “why” behind quantitative research and data trends. And ultimately understanding why something is happening is the key to knowing what to do about it.
People are more than a collection of data points. There is irrefutable power in seeing customers' faces and hearing their stories to illuminate the why behind the patterns in data, and to give companies insight on how to move forward. Behind every data point is a human story that holds the key to that understanding; qualitative research brings the human element to a data-driven world.
But that is not to say that qualitative research doesn't have its own 'big data' challenges. One of the great things about mobile and online research is the ability to capture a large amount of expression from engaged participants very quickly. However, this also presents a challenge for analysis. In a recent project, for example, 70 participants produced 400 pages of text, over 1,100 images, and many hours of video in just five days of activities.
While the size of all this data can be measured in mere gigabytes vs. exabytes, to a qualitative researcher, it is a huge mountain of in-depth information – call it “Little Big Data.”
Faced with Little Big Data, qual researchers have understandably looked to technology for help, especially text analytics. I have talked to researchers who have used a range of text analytics tools, some integrated into qualitative analysis packages, others fully dedicated text analysis applications. Quite a few of them reported being less than satisfied with the approach.
And there's the rub. Even accounting for the inherently interpretive nature of qual, analysis remains an inefficient process for its practitioners. The effort to set up and tune the tools is often greater than the perceived benefit. At the end of the day, many of the researchers I talked to felt they weren't much better off than if they had printed out transcripts and whipped out their trusty highlighters.
The frustrating part is that the foundational technology that could significantly speed up the analysis and communication of insights actually exists, but no single company has had the resources to pull it all together. With apologies and respect to all of the companies in the digital qual and analysis category, we all do specific things very well, but none of us has achieved the economic scale necessary to make the investments that deliver the big leap. Here's what a qualitative research technology company needs in order to keep up with the world of Little Big Data:
- Data from face-to-face, mobile, and web-based interviewing streams needs to be integrated. According to the ESOMAR Global Market Research Report, 95% of qual is still conducted face to face, so a company or solution needs to be able to stream data from face-to-face research and integrate it with web and mobile sources.
- Collection and analysis need to be integrated. One of the biggest sources of inefficiency in qualitative analysis is that it often requires moving data from collection systems to analysis systems. This is akin to building a database of information for collection, then pulling it all out and transferring it to another database for analysis. The result is lost time and repetitive tasks.
- Analysis tools need to leverage the pattern-surfacing abilities of computers and algorithms to support humans' unique ability to assign meaning to those patterns, all within a fluid user experience that researchers will actually want to use.
- The system needs to reduce or automate as many as possible of the mundane and repetitive tasks that every reporting effort requires.
As the speed of insight generation grows ever faster, qual needs to keep up. We are already seeing the front end of the qualitative research process speed up, as several qualitative offerings now feature instant recruiting options. The bigger challenge is dramatically reducing the time from collection to insight communication while maintaining quality. We will need to leverage technology in smart ways to make that happen.
This solution does not yet exist, but I believe its development will be crucial to keeping qualitative research relevant in the coming years. I also believe the next few years will see a consolidation of the qualitative marketplace, especially in the digital and mobile areas. The acquisition of Revelation by FocusVision represents the first such consolidation, and by the points outlined above, the combined organisation is well positioned to make good on the promise of keeping qual relevant in the Big Data (and Little Big Data) age.
Steve August is CEO at Revelation