Every day the news brings more stories about the misuse of our data by social media companies, political and campaigning organisations, big brands and the rest. The data that is out there about each and every one of us right now is not just big data, it's off-the-scale enormous. It includes data we know we've provided and think we know why, data we know we've provided but aren't sure what for, data that is repurposed, and data we had no idea we were even generating.
Let's take my average morning. I wake up (to an old-fashioned, non-connected alarm clock, so you can't get me there) and check my email and social media on a tablet. If I wake up before the alarm, I might override the heating on my Hive. So, not five minutes into my day, I've let Google, Facebook, Outlook, Lenovo, Hive and, crucially, anyone else they might pass data on to have a whole bunch of information about what I've been doing, my heating preferences and my exposure to ads.
I sign up for a site that will tell me where the cheapest fuel is in the area, providing not only my email address and name but also my car type and location. On my drive to work I pass over car-counting strips and traffic cameras, generating more data in my wake. I stop at the petrol station, fill up and hand over my credit card, and maybe a loyalty card, before continuing to the office. If, on the other hand, I cycle to work, I log my ride on my GPS, which syncs to at least three different fitness and training apps on my phone.
What I'm generating here, without even explicitly taking part in research, is a huge amount of information that can be collated, sliced and analysed by researchers, big data experts and marketing professionals. Of course, when I sign up for many of the services I use, I accept terms and conditions (which I, like everyone else, skim-read at best, as I don't have a spare 25 days a year) that set out what companies can do with my data. For explicit research activities, however, the MRS Code of Conduct guidelines on the use of data make things clear: data collection must be done with the informed consent of subjects; excessive data must not be collected; data must be held securely and retained for no longer than necessary; and it must be made clear what is being done with the data. Further, GDPR has focussed everyone's minds on where data sits and who is responsible for it.
However, there have been many recent examples of companies and organisations flagrantly breaching professional and legislative guidelines. Only last month, the Bounty parenting club received a massive £400,000 fine for selling on data collected from expectant and new parents in hospitals and birthing centres, as well as from users of its app and website, to other companies – no fewer than 39 of them.
The firm stands accused of collecting sensitive personal data from parents at a vulnerable time without being clear about what was being done with it; the same company is tasked with distributing Child Benefit forms on behalf of the state, alongside marketing promotions and free samples. Bounty is now said to have reviewed its processes and put measures in place to handle data in a more compliant way.
If the Bounty case is news to you, this next example definitely won't be. The most famous recent case was the Cambridge Analytica scandal, for which Facebook was fined £500,000 after 50 million profiles were harvested from its unsuspecting users. Just over a quarter of a million users signed up for an app called "This Is Your Digital Life", originally created to support a piece of academic research, but the app also gave its creators access to the data of those users' Facebook friends.
This huge data set was then famously used to profile US voters and thereby formulate strategies to influence the outcome of the US presidential election. The breaches here are multiple: users originally thought their data was being used for an academic exercise, but not only was their own data passed on, so was that of their likely hundreds of Facebook friends, and the data was then used in ways well outside anyone's original expectations.
So, what good practices can we adopt as researchers, insight professionals, marketers and developers to ensure we are collecting and using market research data in a responsible way?
With trust in brands, organisations and the media on the decrease amongst many consumers, it is imperative that we don't abuse what trust we have; instead, we must ensure clarity, transparency and rigour in our market research.