Released this summer on Netflix, The Great Hack is a documentary that explores the murky data-sharing relationship between Facebook and Cambridge Analytica.
To briefly recap: in 2014, the private data of up to 87 million Facebook users fell into the hands of UK-based Cambridge Analytica, a data intelligence company – or “full-service propaganda machine” if you ask data scientist Christopher Wylie, who helped to set the company up.
The harvested data was used for psychological profiling to target American voters with pro-Trump content in the run-up to the 2016 presidential election. When the story hit the headlines in spring of last year, it was also alleged that the leaked data could have been used to sway the Brexit referendum. The freedoms upon which global democracy is founded were under direct attack.
The Great Hack presents this scandal as unravelled by directly affected persons: Cambridge Analytica insider and whistle-blower Brittany Kaiser, journalist Carole Cadwalladr, and David Carroll, an American professor who sued Cambridge Analytica following an insufficient response to a data subject access request.
In a world full of data jargon, you couldn’t blame the person in the street for finding the film’s subject matter rather inaccessible, even nebulous: data privacy is the concern of the tech gods of Silicon Valley; the GDPR is the thing that caused last year’s deluge of privacy emails.
The Great Hack pulls apart and reconstructs these themes with sobering clarity by showing what happened when that person in the street simply asked a company: what do you know about me?
The answer exposes the rotten core of big data processing, and shows that the main players have long since lost their ethical compass in their attempts to control a monster that each of us blindly feeds with every like and share.
If ever society needed a wakeup call as to the urgent need for the policing powers of the GDPR, this film is it.
Prior to the film’s launch, I spoke with David Carroll, to find out more about his role in pulling the thread on the scandal of our digital times.
How interested were you in data protection rights in the run-up to submitting the DSAR?
I was a rare American worried about data privacy issues before it was cool. In 2014, I was vocal about Apple’s content blocking in iOS, and there was a big debate in advertising and technology circles around why consumers were installing ad blockers at an increasing rate. The industry said it was because adverts were annoying.
I was arguing that there were privacy concerns at play as well. I was an early advocate, trying to warn the industry that the facts were going to blow up in their face.
I attended major meetings where I was the voice in the room saying that the industry was reckless, that this was going to explode. I couldn’t have predicted that Cambridge Analytica would be the event that inevitably resulted from that reckless behaviour.
Could you tell us about the lead-up to your submission of the data subject access request (DSAR) to Cambridge Analytica?
After Donald Trump was elected, I got to know data protection expert Paul-Olivier Dehaye, who helped convince me to submit the DSAR, which was delivered to Cambridge Analytica in January 2017.
The response I posted on Twitter got attention from British data protection experts and academics who suggested it contained unlawful elements, and my solicitor agreed that there were serious breaches of the Data Protection Act [1998].
We began filing complaints with the Information Commissioner’s Office (ICO). I then filed a legal claim in the High Court, which was served on Cambridge Analytica on March 16th 2018 – the same day that Facebook suspended the company. That weekend, the Observer, the Guardian, Channel 4 and the New York Times published the stories that blew the lid off, making the issue a global phenomenon.
Once the story was public, the ICO obtained a criminal warrant in response to the complaint I had filed, but Cambridge Analytica filed for insolvency within the 30-day response window and the company was essentially untouchable. I think the moral of the story, in the end, is that the ultimate loophole in data protection enforcement is bankruptcy and insolvency.
I’m glad Paul convinced me; the DSAR alone showed that we were right from the beginning, and every subsequent step was a vindication. It taught me how badly the United States needs to grant its citizens data protection rights equivalent to those that Europeans enjoy.
Ironically, had Cambridge Analytica not exported voter data to Britain, where citizens’ rights are vigorously supported by the ICO, we would have had no recourse to even do a DSAR and they could have lawfully ignored the request.
What do you hope viewers take from the film?
I hope viewers understand that data rights are fundamental rights. Your country doesn’t simply give them to you – you have to demand them, and that’s how it is in the United States right now.
We need to be aware of the relationship between our data and our democracy. I hope this becomes more of a political issue, especially as we go into another election season, and candidates who are strong on data rights will resonate well with people who care about that.
I hope that the film is as non-partisan as possible. In the US, privacy is one of the few political topics that Republicans, Democrats and independents agree on. Unfortunately, it’s a highly charged topic so it remains to be seen how that will play out, but I think that the movie will make people think differently about what the GDPR is.
Most people were introduced to the GDPR last year with a barrage of emails that nobody was going to read. But the movie makes it understandable why you would even need something like the GDPR. I think it will make the idea of data protection, data law and data rights more of a mainstream conversation.
How effective is The Great Hack in dealing with the themes around data protection?
I think the movie is successful in making the complex topics much more accessible to a general audience. It explores aspects of the story that have not been covered by the media and press.
The movie’s very valuable in that regard; the story moves beyond misappropriated Facebook data, and I think viewers will ask why the press focussed so much on Facebook data when there were so many other forces at play.
The film also succeeds in making the invisible visible, through great visual effects that appear as I explain what’s going on.
Did you face any challenges in making the movie?
One challenge was agreeing to be a subject in a documentary while participating in a legal action; I had to be very careful about that.
It was also a risky undertaking from a personal financial perspective. I was often asked whether I feared for my safety, which underscored the general sentiment that I was going up against a nefarious entity.
Brittany (Kaiser) was in greater peril because of her knowledge and because she was inside until the very end. She knew way more than I knew. She had a bodyguard in the film, which I didn’t have. Cambridge Analytica employees also tried to intimidate me in deceptive ways, but nobody else discouraged me.
Are consumers in the US waking up to the value of their personal data and the importance of keeping it secure?
Yes, I definitely believe that the Cambridge Analytica scandal was a cataclysm in terms of causing an awakening, where before there was a sense of negligence on the part of consumers and technology companies.
Consumers and businesses were reckless because there was no penalty, not even bad press, for data breaches, and so the whole system has been realigned. The movie will move the needle even more because it makes things more understandable.
I think people thought big tech could do no wrong, that it was a benevolent force in society. After the election, the Cambridge Analytica scandal demonstrated abuse on a massive scale. But it was the canary in the coal mine and has proved a focal point in the dark. It’s a useful narrative in what has until now been the domain of privacy professionals, lawyers and policymakers because data privacy is difficult to grasp.
To what extent has democracy been undermined by the powers depicted in the film?
We don’t know if the techniques depicted in the film were effective or not. Some experts believe that they were. The ICO will publish its final report on the matter in the autumn, which should offer the most definitive and trustworthy forensic assessment of what Cambridge Analytica was offering. I will wait until then to make a judgement.
I think as technology advances and privacy is further eroded, it is very important to be concerned – to recognise that limits on data collection and processing, and enforcement against violations of those limits, are an important part of democracy in the 21st century.
I can’t say that Cambridge Analytica is the only reason why Trump was elected or why Brexit is happening, but it is one of many factors that we should consider.
Cambridge Analytica was not allowed to work on Marine Le Pen’s campaign in the French elections, even though they wanted to, because the CNIL is a fearsome adversary to their interests. Countries with reputations for robust data protection and enforcement have already had a deterrent effect on bad actors like Cambridge Analytica.
Has the film made you reflect on how private data can be used more ethically?
I think the key lesson is the right to know. The right of access is a foundational data protection right and every citizen in the world deserves it. Yet not every country provides it.
The fundamental right of access and the right to know are what all data protection stands on, and the foundation of every other right. By exercising your right of access and your right to know you can strengthen personal data sovereignty.
The movie shows why that matters. The United States does not have a right of access, but it must be included in any new privacy legislation. I will be carefully watching and advocating as the debates continue.
I’m glad to see that it is part of the California Consumer Privacy Act and it is part of the bills in progress in states such as New York.
Will the US introduce national data protection legislation?
It’s a fascinating dynamic that is characteristically American, whereby the states are moving faster than Washington DC. It’s creating pressure on the industry to convince lawmakers in Washington to pass a national privacy law and pre-empt state law.
It’s a big trade; Republicans and Democrats alike are not going to pre-empt state law for nothing. The industry will have to win a big concession to get pre-emption. The debate will be fascinating to watch, and you will see the industry lobby work overtime, because industry is fearful of a patchwork of different state laws. I understand the appeal of a national law, but I would expect it to be even stronger than the CCPA.
Is the GDPR seen as an example of how consumer privacy should be dealt with in the US?
A remarkable moment occurred in the Mark Zuckerberg Congressional hearings after Cambridge Analytica.
A Republican lawmaker asked Zuckerberg to talk about the merits of the GDPR. Before Cambridge Analytica that would have been impossible to even imagine: a conservative Republican asking a Silicon Valley tech giant what he thought about the good aspects of European law. Brussels is known as the Silicon Valley of regulation. I even heard a high-profile Senator say that something was going very wrong.
The fact that very conservative Republicans are open to regulating the tech industry in the US is a sea change. There was zero appetite for that before Cambridge Analytica. In a strange way, we can thank Steve Bannon for creating the Cambridge Analytica scandal, because it has opened the world’s eyes to the data protection problem.
To what extent are privacy and security intrinsically linked?
To me, privacy and security are two sides of the same coin. You can’t have one without the other.
That coin expresses a trade-off: the greater the convenience, the more you sacrifice security and privacy, and to get more privacy and security you have to sacrifice convenience. It’s a delicate balance.
The status quo in the US is that individuals are responsible for their own security and privacy, but this puts the burden on the individual, which I think is fundamentally flawed.
We need a more enlightened view of privacy and security – to understand it as a collective responsibility. When we understand it that way, we will advance the cause, because it takes the burden off the individual and puts it onto communities.
What trends do you see developing in your industry?
I would say that regulation in the US will be an opportunity for entrepreneurs because there are new businesses to be created from new requirements and regulations.
If a DSAR equivalent is instituted in the US, then an entrepreneur could start a business to help people find all the companies and organisations that have collected their data, and create products that streamline the process and make the law more useable.
Whether or not the recapture of control can be monetised is another thing. It is too difficult to do a DSAR as it stands, so new businesses that provide identification platforms will be a new area of business growth. Many people say that regulation will hurt start-ups; I would argue that, on the contrary, it will be a bonanza for them.
Hear David Carroll live
You can hear the views of David Carroll at PrivSec Conference New York.
Taking place on 5th and 6th November at Columbia University, this two-day conference brings global business leaders and IT experts together with senior figures in cyber protection and security at a decisive point in the evolution of global data privacy.
David Carroll joins a distinguished roster of representatives from other global organisations including Uber, the New York Times, BNY Mellon, the Bank of England, Raytheon and many more.
For more information on this landmark conference, or to book your ticket, click here.