Another Day, Another Conundrum

Guest post by Paul Carlin, South Eastern Health and Social Care Trust

Another day, another conundrum for those who deal with data in all its myriad shapes and forms.

With the Cambridge Analytica/Facebook debacle, we have been presented with a seemingly apocalyptic vision of the potential misuse of personal data, one that threatens the notion, however flawed, of Western democracy.

Various opinion leaders, politicians, talking heads and news outlets have presented us with a prima facie case of social engineering: an intervention that used the personal data of 87 million Facebook users without their knowledge or consent.

So what’s all the fuss about?

Is it the fact that they did it at all, and could actually build algorithms for targeted intervention to try to manipulate voting outcomes on the basis of psychological profiles?

Is it the fact that they didn’t tell anybody, except the political movers and shakers who utilised the algorithms?

Is it the fact that they didn’t gain any individual’s consent?

Or is it a combination of all three?

I don’t know, and I would hazard a guess that most other people wouldn’t either.

I’ll begin by addressing the last two questions, since they define the user context within a governance, legislative and ethical frame.

Facebook collects data, and whilst they may be perceived as a service provider, they are not. Facebook is a data broker whose sole function is to collect, collate, structure and analyse data to leverage value at both a micro and a macro level. This is wrapped up in an interface that encourages the release of personal information, and they cover the issue of sharing with third parties as follows:

“We transfer information to vendors, service providers and other partners who globally support our business, such as providing technical infrastructure services, analysing how our Services are used, measuring the effectiveness of ads and services, providing customer service, facilitating payments or conducting academic research and surveys. These partners must adhere to strict confidentiality obligations in a way that is consistent with this Data Policy and the agreements that we enter into with them.”

Facebook clearly state that academic research will be conducted and facilitated in order to generate new knowledge, which may or may not have a commercial use. One could perhaps argue that Cambridge Analytica, who appear to have had some involvement with academia through Cambridge University, were justified in developing their algorithms: prior to their commercial exploitation, the work fell squarely within the domain of academic research.

The issue then progresses to the methodology and the specific intent of targeting individuals. If we accept that the user base’s data must have been interrogated using the developed technology, which identified user profiles and matched them to targeted information, then any suggested misuse may lie not at the development end but in the utilisation.

Cambridge Analytica created the technology and Facebook facilitated its creation, but who used it, and why? Perhaps that important question should be our focus.

What strikes me is the lack of public outrage. Certainly the press is awash with statements about the abuse and scandalous use of individuals’ data, but is the public really that concerned?

One need only look at the Facebook share price over recent weeks: on 27th March it reached its lowest point over the Cambridge Analytica issue, at 152.22 USD, but on 10th May it closed at 185.53 USD.

The stock has rebounded, completely wiping out any losses. Alongside this, user numbers have grown 13% year on year from 2017, with no appreciable slowdown in usage despite the scandal.

Yes, there may be reputational risk, bad publicity and heightened awareness, but the indignation appears to be limited to a select grouping within society.

The issues here are multiple, and they manifest in the implication that society can be, and potentially already is being, manipulated and controlled. Nobody has coherently questioned why someone would do this type of work, or addressed the point that just because we can, it doesn’t necessarily mean we should!

Be that as it may, the cat is now well and truly out of the bag. The risks appear to have been accepted as negligible by the public, despite being reported as cataclysmic by the political and reporting classes. So one may well ask: since the electorate are unconcerned, why should our political representatives take this issue seriously at all?

This blog is a challenge to think about these issues, and perhaps to look beyond the hand-wringing rhetoric to the ethical issues and potential risks that this episode actually identifies, as data and its use become ever more personally targeted and autonomous.