Editor’s Note: Julian Wheatland was COO & CFO of SCL during the period of its growth until 2018, when he took over as acting CEO of the Cambridge Analytica group in order to close down operations. Wheatland now serves as an adviser to corporate clients on data management issues and speaks publicly on “how to avoid becoming the next Cambridge Analytica.” The opinions expressed in this commentary are his own.

Cambridge Analytica made many mistakes on the path to notoriety. But its biggest mistake was believing that complying with government regulations was enough, and in doing so ignoring broader questions of data ethics and public perception. As the man who stepped in as its CEO during one of the biggest data privacy controversies of the past decade, I should know.

That there was a scandal like this seems, in hindsight, to have been inevitable. As I comment in “The Great Hack” (a Netflix original documentary released on July 24), the scandal that engulfed Cambridge Analytica could easily have occurred at a different company; Cambridge was not alone in analyzing personal data to target communications. Now, more than a year later, as Facebook faces a $5 billion fine for misusing data and Twitter announces yet another bug that may have shared identifiable user data with advertisers, I’m left wondering what has really changed in how our personal data is collected and processed.

As it stands today, none of the companies in the Cambridge Analytica/SCL group have been found to be in material breach of any rules or regulations, although investigators on both sides of the Atlantic are still looking.

The near-universal response to the events surrounding Cambridge Analytica and Facebook has been a call for greater regulation. There’s a frenzy in the air for governments to step in and keep the public ‘safe.’ But to expect regulation alone to solve all of the challenges of personal data privacy is naïve. Laws and regulations can, at best, address technology that is already in the marketplace or that can be anticipated in the near future, and they typically arrive long after the technology has come online, by which point it is often outdated or has already been misused.

If this data technology is to thrive, the public must have trust in its practitioners. We need regulation, but companies that are developing and using the technology must own responsibility for public confidence and deliver it through transparency, effective management, honesty and choice.

Transparency

All companies should clearly state what information they hold on their customers, how they use it and why, along with a guiding set of ethical principles.

I can already hear the shrieks from Silicon Valley as executives envisage their competitive advantage (and advertising income) being eroded, but I’m not asking them to publish their algorithms or secret sauce. The public has both a right and a need to know what is being done with their personal information, and tech companies shouldn’t wait for the law to require it.

Google, for example, does this fairly well today in its Privacy Policy, which explains what data it collects, how it uses it and why. If Cambridge Analytica had been this transparent about the data it collected and what it did with it, the company might have eased people’s concerns and seemed less intimidating.

Effective management

Company boards should set up internal procedures that give clear policy guidance to employees so they can assess whether new technology or data uses contravene the company’s publicly stated ethical principles. And those procedures should make sure that anything unclear or out of the ordinary is escalated to the board for sign-off. Effectively managing these issues is a matter of corporate governance and brand protection; it’s not good enough when the CEO claims ignorance.

In addition, external auditing of these control procedures would generate trust by assuring customers and business partners that the company is really doing what it says it’s doing.

Cambridge Analytica had no such ethical procedures in place. When the company launched a new initiative, the project was signed off as soon as the legal and regulatory review was done, without any further ethical review. I doubt the company would ever have taken Facebook data if tools, policies or guidelines for acting on ethical concerns had been in place.

Honesty

Companies should appoint a chief ethics officer (or equivalent) whose role is to ensure that the company is actually doing what it says it’s doing. This person, supported by internal audit procedures, would be responsible for making sure the company complies with its own data ethics policies and, frankly, for thinking two steps ahead and anticipating danger in precisely the way that regulators can’t.

Had someone held such a role at Cambridge Analytica, things might have turned out differently. Ethical questions such as, “Is it okay to build a personality profile of someone without their knowledge?” would have been considered proactively, before the work was put into practice.

Choice

Finally, individuals must have the ability to opt out simply and easily if they decide they don’t want to share their data. Cambridge Analytica took the view that US citizens didn’t have that right, but this was a legal assessment, not an ethical one, and, in my view, it was a mistake. Individual choice should extend to third-party companies that may acquire your data, and it should include the ability to undo any previous agreement to have your data shared.

Regulation can provide the necessary minimum safeguards, but it can’t deliver public confidence, nor should we ask it to; if we do, we will stifle innovation and international competitiveness. Companies of all shapes and sizes would be wise to own this issue and to recognize what getting it wrong would do to their brands.

We cannot look at issues of privacy in terms of box-ticking or backward-looking metrics. If we just leave this conversation to regulators, the world will inevitably see more data controversies, and next time there could be even more at stake.