Just one year after news broke of the shocking relationship between Facebook and Cambridge Analytica, the scandal has lost none of its relevance to how organisations worldwide handle consumer and employee data.
The UK-based analytics company created an app linked with Facebook that uncovered personal information about millions of the social network’s users through a personality quiz. Answers given in the quiz were fed into psychological profiles that were subsequently used to create pro-Trump adverts in the run-up to the 2016 US presidential election.
Whistleblower Christopher Wylie, who worked with a Cambridge University academic to obtain the data, told the Observer:
“We exploited Facebook to harvest millions of people’s profiles. And built models to exploit what we knew about them and target their inner demons. That was the basis the entire company was built on.”
It’s safe to say that the privacy scandal has impacted organisations worldwide across every sector. Under the rules of the General Data Protection Regulation, which came into force on 25 May 2018, organisations dealing with the personal data of EU-based citizens must obtain consent through GDPR-compliant means before that data can be used in any way.
One of the means of obtaining consent is explicit user opt-in. Organisations must also clearly say how the data they collect is to be used, with whom it will be shared and why, and for how long the data will be kept.
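To make those requirements concrete, the sketch below shows one possible shape for a consent record that captures an explicit opt-in alongside the stated purpose, recipients and retention period. It is purely illustrative, not a compliance tool, and every name in it is hypothetical rather than drawn from any real system.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional

# Hypothetical sketch of a consent record covering the points the article lists:
# purpose of use, who the data is shared with, and how long it is retained.
@dataclass
class ConsentRecord:
    user_id: str
    purpose: str                     # how the collected data will be used
    shared_with: List[str]           # with whom it will be shared, and why
    retention_period: timedelta      # how long the data will be kept
    opted_in: bool = False           # explicit opt-in; never defaults to True
    opt_in_timestamp: Optional[datetime] = None

    def record_opt_in(self) -> None:
        """Record an explicit, affirmative opt-in from the user."""
        self.opted_in = True
        self.opt_in_timestamp = datetime.utcnow()

# Example usage
consent = ConsentRecord(
    user_id="user-123",
    purpose="Email newsletter about product updates",
    shared_with=["EmailDeliveryProvider Ltd (to send the newsletter)"],
    retention_period=timedelta(days=365),
)
consent.record_opt_in()
```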
Wylie continues to feud with Facebook, Cambridge Analytica and other big corporations around the world over the way they use consumers’ data. You can follow his views on his Twitter feed.
While there have been some major changes in the data protection sector, Terry Ray, SVP and Imperva Fellow, believes that this is not the end.
“Data breaches of the Facebook magnitude are certain to happen again,” Mr Ray says.
It’s clear that organisations are opting to dedicate resources to the data protection areas of the business, but expertise remains scarce. “Experts are hard to find for data security, so companies either have to learn as they go, pay for an expert service or more commonly look toward AI and machine learning to supplement their human manual expertise,” he adds.
Jasmit Sagoo is senior director at data protection and analytics firm Veritas Technologies. Speaking to GDPR: Report, Sagoo says:
“Over the last year, the way that consumers create and share data has changed – and so has the way that they expect businesses to store and process it. High-profile breaches, scandals, and the introduction of the GDPR last May have made consumers more cautious about what data they share, where it’s being stored and who it’s being accessed by.
“We’ve seen what can happen when this trust is broken, for both the user and the business they shared their data with. Our research has found that poor data protection can have a dire commercial impact on companies – 56% of consumers would dump a business that fails to protect their data, and 47% would abandon their loyalty and turn to a competitor.
“Over the past year, consumers have grown far warier of what data they are giving away and how it is safeguarded, even if they feel like they are getting something in return. This means that for businesses built on data, caution is the watchword of today.
“In the modern data economy, businesses have two responsibilities: firstly, to understand their customers, and secondly, to properly protect their data. The businesses that fail to do so will find themselves a cautionary tale.”