It’s not uncommon for managers and executives to not only need research, but need the results yesterday. After all, staying ahead of the competition and striving to deliver excellence is their top priority. Knowing more than your competitors is the ultimate advantage.
With advances in technology and the need for greater efficiency in businesses, the demand for market research to be real-time is dramatically increasing. The world we live in is becoming increasingly automated and as a result, so is the market research industry.
Helene Protopapas refers to automation as ‘the use of systems with minimal human intervention, reducing or eliminating unnecessary human labour activities and thus allowing people to focus on high-intelligence processes.’ Or, in other words, automation in research allows researchers to free themselves of the time-consuming, tedious tasks and focus their attention on generating insight.
Recently, we have seen the process of requesting and sourcing sample become ever more automated. This is, in part, due to the increase in the number of specialist providers offering both panel and aggregation services.
Aggregation refers to sample vendors who use multiple sources to create larger, more accessible panels. While the premise of aggregation is simple, it is the filtering and selection technology behind it which thrusts vendors into automated territory. By simply selecting a few basic parameters, these panel providers send invites to relevant panel members, making it easy to fill nearly any request.
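To make the mechanics concrete, here is a minimal sketch of that kind of parameter-based filtering in Python. The panel fields, values and member IDs are invented for illustration and bear no relation to any particular vendor's system.

```python
# A simplified sketch of the parameter-based filtering an aggregated panel
# might perform; field names and values are illustrative only.
panel = [
    {"id": 101, "country": "UK", "age": 29, "owns_car": True},
    {"id": 102, "country": "UK", "age": 45, "owns_car": False},
    {"id": 103, "country": "DE", "age": 33, "owns_car": True},
]

def matching_members(panel, **criteria):
    """Return panel members whose profile matches every requested parameter."""
    return [m for m in panel
            if all(m.get(field) == value for field, value in criteria.items())]

# e.g. a request for UK car owners would automatically invite member 101.
invite_list = matching_members(panel, country="UK", owns_car=True)
print([m["id"] for m in invite_list])
```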
Survey routing is perhaps one of the oldest forms of research automation, with roots in basic paper-based surveys. The typical ‘If Yes, Skip to Question XX’ text found on many surveys guides participants through the process, ensuring they only answer the questions relevant to them. In turn, this improves the experience for both the participant taking the survey and the researcher analysing it.
Online survey tools take this one step further, by only displaying the questions each participant needs to see. Answers to previous questions and demographic data guide survey takers through a custom route, allowing for cross-comparisons between groups and in-depth probing of individual responses.
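The underlying logic can be sketched in a few lines. The example below assumes a simple rule-based representation of a questionnaire; the question IDs, wording and conditions are hypothetical rather than taken from any specific survey tool.

```python
# A minimal sketch of rule-based survey routing. Questions with no
# "show_if" condition are asked of everyone; the rest are shown only
# when the participant's earlier answers satisfy the condition.
questions = [
    {"id": "Q1", "text": "Have you bought from us in the last 12 months?"},
    {"id": "Q2", "text": "Which product did you buy?",
     "show_if": lambda answers: answers.get("Q1") == "Yes"},
    {"id": "Q3", "text": "What stopped you from buying?",
     "show_if": lambda answers: answers.get("Q1") == "No"},
    {"id": "Q4", "text": "How likely are you to recommend us?"},
]

def route(answers):
    """Yield only the questions this participant should see,
    based on the answers they have given so far."""
    for q in questions:
        condition = q.get("show_if")
        if condition is None or condition(answers):
            yield q["id"], q["text"]

# A participant who answered "Yes" to Q1 skips Q3 entirely.
for qid, text in route({"Q1": "Yes"}):
    print(qid, text)
```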
Sending reminder notifications to participants taking part in a research task is an essential way to ensure excellent response rates. However, the task itself can be time-consuming and (ironically) you still often have to send each email manually according to your reminder schedule.
Taking a leaf from social media software, many online tools now offer the ability to create a custom schedule that sends reminder notifications at pre-determined times. In the future, we expect this to develop into flexible reminder systems that automatically adjust schedules based on known response rates and previous data.
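As a rough illustration of the fixed-schedule approach, the sketch below builds a list of pre-determined reminder times and only contacts participants who have not yet responded. The `send_reminder` function and the schedule offsets are stand-ins for whatever a real platform provides.

```python
from datetime import datetime, timedelta

# Illustrative only: send_reminder stands in for whatever email or
# notification call your survey platform exposes.
def send_reminder(participant):
    print(f"Reminder sent to {participant}")

def build_schedule(launch_time, offsets_in_days=(2, 5, 7)):
    """Return the pre-determined times at which reminders should go out."""
    return [launch_time + timedelta(days=d) for d in offsets_in_days]

def run_due_reminders(schedule, non_responders, now=None):
    """Send reminders for any scheduled slot that has passed,
    but only to participants who have not yet responded."""
    now = now or datetime.utcnow()
    if any(t <= now for t in schedule):
        for participant in non_responders:
            send_reminder(participant)
    # Return the slots still to come, so the caller can re-check later.
    return [t for t in schedule if t > now]

schedule = build_schedule(datetime(2024, 3, 1, 9, 0))
remaining = run_due_reminders(schedule, ["participant_042"])
```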
There is no doubt that data cleaning is one of the most boring research tasks imaginable. No-one wants to sift through pages of data just to look for errors, mistakes and formatting discrepancies. Fortunately, manual data cleaning may soon be a thing of the past. New software developments are bringing with them more creative data cleaning tools, from those that predict missing information based on previous responses to those that apply consistent formatting across all data.
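By way of example, the snippet below shows the kind of common formatting rules such a tool might apply automatically, written here with pandas. The column names and rules are illustrative assumptions, not a description of any specific product.

```python
import pandas as pd

# A rough sketch of automated formatting rules applied to a raw survey export.
def apply_common_formatting(df: pd.DataFrame) -> pd.DataFrame:
    cleaned = df.copy()
    # Standardise free-text fields: trim stray whitespace, unify case.
    cleaned["country"] = cleaned["country"].str.strip().str.title()
    # Coerce numeric answers, flagging anything unparseable as missing.
    cleaned["age"] = pd.to_numeric(cleaned["age"], errors="coerce")
    # Replace a predictable gap with a consistent placeholder.
    cleaned["region"] = cleaned["region"].fillna("Unknown")
    return cleaned

raw = pd.DataFrame({
    "country": ["  united kingdom", "FRANCE ", "Spain"],
    "age": ["34", "not given", "52"],
    "region": ["North", None, "East"],
})
print(apply_common_formatting(raw))
```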
The only danger with automated data cleaning is the lack of human oversight – which can lead to data quality and accuracy issues. When assessing data cleaning software it’s important to look at the potential risks, and ask if they are ones you are willing to take.
The volumes of data collected in market research can sometimes seem overwhelming. This is even more common when dealing with qualitative studies where reams of transcripts may be involved.
When it comes to automating data analysis, though, there are arguments both for and against its use. The obvious benefits are saving time and, in turn, saving money. Software such as SPSS also allows you to categorise responses and integrate results with other survey data for better insight and statistical analysis. However, critics will argue that automated data analysis comes with an increased level of risk.
Sentiment analysis is another good example: taking what someone says or writes and uncovering the emotional reaction behind it. At FlexMR, we use this within our SmartboardMR tool, where we allow respondents to tag their comments as positive, neutral or negative and map them to different areas of an image. Within a few clicks, we’re able to see at a glance which areas of the stimulus performed well and which need improving.
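Stripped of the interface, the aggregation behind that at-a-glance view can be as simple as tallying tagged comments per area. The sketch below is a generic illustration, not the SmartboardMR implementation; the areas and tags are invented.

```python
from collections import Counter, defaultdict

# Hypothetical tagged comments: each one records which area of the stimulus
# it refers to and the sentiment the respondent assigned.
comments = [
    {"area": "header", "sentiment": "positive"},
    {"area": "header", "sentiment": "negative"},
    {"area": "pricing table", "sentiment": "negative"},
    {"area": "pricing table", "sentiment": "negative"},
    {"area": "imagery", "sentiment": "positive"},
]

def sentiment_by_area(comments):
    """Count positive / neutral / negative tags for each area of the stimulus."""
    tallies = defaultdict(Counter)
    for c in comments:
        tallies[c["area"]][c["sentiment"]] += 1
    return tallies

for area, counts in sentiment_by_area(comments).items():
    net = counts["positive"] - counts["negative"]
    print(f"{area}: {dict(counts)} (net {net:+d})")
```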
Taking this example further, what the automated analysis doesn’t give us is the reason why certain aspects performed well and others did not. This is where the researcher and their interpretation are key. As researchers, particularly when dealing with qualitative analysis, we often live and breathe the data to uncover every possible finding. That interpretive work remains a natural part of the research process that automation cannot replace.
There is a popular misconception that data visualisation requires an extensive knowledge of design. This is rarely the case. In fact, while some tools offer the ability to create charts and graphs with ease – others will automate the entire process from start to finish. Perhaps the best example of such data visualisation is provided by real-time dashboards.
These, while still rare in market research circles, are becoming integral to marketing and executive teams across the country. Why? Because dashboards are capable of tracking behaviour in real time, providing time-critical information and guiding responses. As automated data visualisation tools begin to creep into the research industry, two uses stand out in particular.
The first is live reporting of campaign engagement. Examples of how this can be implemented are provided by Bing Pulse and its applications both within classrooms and alongside live events. Secondly, real-time dashboards can be used to monitor engagement with large-scale research studies. This means a huge range of questions can be answered. How many respondents have completed the task? Where are respondents located? What demographic qualities do respondents possess?
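In code, the headline numbers such a dashboard surfaces amount to a few simple aggregations over incoming responses. The sketch below assumes a hypothetical feed of response records; a live dashboard would simply recompute these counts as each new response arrives.

```python
from collections import Counter

# A hypothetical feed of response events; field names are illustrative.
responses = [
    {"id": 1, "completed": True,  "location": "Manchester", "age_band": "25-34"},
    {"id": 2, "completed": False, "location": "Leeds",      "age_band": "35-44"},
    {"id": 3, "completed": True,  "location": "Manchester", "age_band": "25-34"},
]

def dashboard_snapshot(responses):
    """Answer the headline questions a live dashboard is asked:
    how many have completed, where respondents are, and who they are."""
    completed = sum(1 for r in responses if r["completed"])
    return {
        "completed": completed,
        "completion_rate": completed / len(responses),
        "locations": Counter(r["location"] for r in responses),
        "age_bands": Counter(r["age_band"] for r in responses),
    }

print(dashboard_snapshot(responses))
```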
In research, we often need to run the same design repeatedly to track changes and spot trends in behaviour. A number of online platforms now offer the option to duplicate research tasks, meaning the time spent programming that repetitive 50-question survey is reduced.
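Under the hood, duplication can be as simple as deep-copying the existing task definition and resetting the wave-specific details. The structure below is a hypothetical data model used purely for illustration, not any platform's actual format.

```python
import copy
from datetime import date

def duplicate_survey(survey, wave_label):
    """Clone an existing survey so the next wave needs no re-programming,
    keeping every question but resetting wave-specific details."""
    new_survey = copy.deepcopy(survey)
    new_survey["name"] = f"{survey['name']} - {wave_label}"
    new_survey["launched"] = None  # the new wave has not gone live yet
    new_survey["created"] = date.today().isoformat()
    return new_survey

wave_one = {"name": "Brand tracker", "launched": "2024-01-08",
            "created": "2024-01-02", "questions": ["Q1", "Q2", "Q3"]}
wave_two = duplicate_survey(wave_one, "Wave 2")
print(wave_two["name"], len(wave_two["questions"]), "questions carried over")
```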
Combined with other forms of automation, repeated research tasks ensure changes in behaviour are tracked over time, while still freeing up researchers’ time to generate new ideas and probe deeper into the analysis.
As with everything, there is a time and a place for automation. Just because tasks can be automated does not mean we should rush to it as the solution to every research problem. Research automation should be approached with caution, a metaphorical pinch of salt, to ensure we do not degrade quality in the process. Used in moderation and with oversight, automation offers wonderful opportunities, but only so long as researchers remain in control and do not let automated processes govern them.