Thursday, 8 October 2015

Privacy sometimes means secrets

Ipswitch survey results infographic
A recent survey (Sep 2015) for Ipswitch was widely picked up by the tech press and highlighted IT professionals' concerns about the looming EU General Data Protection Regulation (GDPR): 69% say their business will need to invest in new technologies or services to prepare for the impact of the GDPR, including:

  • 62%: encryption
  • 61%: analytics and reporting
  • 53%: perimeter security
  • 42%: file sharing
This was shortly followed by the European Court of Justice ruling in the Schrems case concerning the Safe Harbour arrangements (Oct 2015). The ruling has variously provoked doom-laden stories and more measured pieces pointing out that many large companies saw it coming and have taken steps to put in place other legal means to ensure continued operations.

That said, these two stories relate to only two of the eight principles of the UK Data Protection Act (see below); the remaining six present a whole series of further compliance challenges. And that's the 1998 Act, not even the pending GDPR.

Perhaps the increasing costs and complications of processing Personal Data might lead us to ask how we design at least some of our future IT systems to avoid the issue in the first place. In particular, within the domain of the Internet of Things there is a widespread presumption that the value in the data the things communicate is only realised when the data is simultaneously shared (with a second, possibly third, party) and tied to an identified person. We need to challenge this assumption from the perspectives of both the technology and the business model.

Projects such as the hoax drug-testing toilet and the IoT toilet roll holder raise plenty of questions around sharing data that we would do well to keep in mind before building technology that at its heart presumes sharing is a good idea (*).

So, IoT-developing folks, ask yourselves some questions:
  • Does this heating control, accessible from a roaming mobile phone, need to pass unencrypted data through a middleman, or should we take a leaf out of iMessage's book and just encrypt it end-to-end (see the sketch after this list)?
  • Ditto baby monitors.
  • Does this thermostat really need to know who I am, with name, address, etc., or could it simply operate anonymously?
  • Smart washing machine doing condition monitoring – yes, supply anonymous statistics of operation to the manufacturer, but maybe as a monthly digest rather than a second-by-second stream of washing machine consciousness... in fact, why not use email to send it and bcc me?
  • What value do you derive from data fusion across users – or is it simply that you want to obtain an even more detailed profile of me to sell to marketeers?
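To make the end-to-end encryption point concrete, here is a minimal sketch (an illustration only, not any particular vendor's design) in which the heating controller and the owner's phone share a key established at pairing time, so the relay in the middle only ever forwards opaque ciphertext. It assumes Python and the 'cryptography' package; the names and the reading format are made up.

    # Sketch: device and phone share a pairing key; the cloud relay between
    # them only ever sees ciphertext.
    from cryptography.fernet import Fernet
    import json

    pairing_key = Fernet.generate_key()   # exchanged once, device <-> phone, never sent to the vendor
    device = Fernet(pairing_key)          # runs on the heating controller
    phone = Fernet(pairing_key)           # runs in the owner's app

    reading = json.dumps({"room": "hall", "temp_c": 19.5}).encode()
    ciphertext = device.encrypt(reading)  # this is all the middleman gets to store or forward

    # Only the phone holds the key, so only the phone can read the data.
    print(phone.decrypt(ciphertext).decode())

The same pattern would serve the baby monitor: the relay becomes a dumb pipe, and no change to its Ts&Cs can expose the content.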
Fundamentally, is the majority of the value in IoT really in sharing data, or in providing an enhanced product and added value to customers? And, like the scorpion in the fable, is your desire to slurp data simply "in your nature...", but ultimately a bad choice for many IoT products?

(*) If you need a daily reminder of some of the lunacy out there, follow @internetofshit (too much toilet humour. ed.).

From the ICO website...

Schedule 1 to the Data Protection Act lists the data protection principles in the following terms:
  1. Personal data shall be processed fairly and lawfully and, in particular, shall not be processed unless: (a) at least one of the conditions in Schedule 2 is met, and (b) in the case of sensitive personal data, at least one of the conditions in Schedule 3 is also met.
  2. Personal data shall be obtained only for one or more specified and lawful purposes, and shall not be further processed in any manner incompatible with that purpose or those purposes.
  3. Personal data shall be adequate, relevant and not excessive in relation to the purpose or purposes for which they are processed.
  4. Personal data shall be accurate and, where necessary, kept up to date.
  5. Personal data processed for any purpose or purposes shall not be kept for longer than is necessary for that purpose or those purposes.
  6. Personal data shall be processed in accordance with the rights of data subjects under this Act.
  7. Appropriate technical and organisational measures shall be taken against unauthorised or unlawful processing of personal data and against accidental loss or destruction of, or damage to, personal data.
  8. Personal data shall not be transferred to a country or territory outside the European Economic Area unless that country or territory ensures an adequate level of protection for the rights and freedoms of data subjects in relation to the processing of personal data.

Thursday, 24 September 2015

I, Robot; and privacy by design

Text of submission to Gikii 2015...

Sonny, the modified NS-5 robot in the 2004 film I, Robot, exhibits several key elements designed to serve his mission of averting the robotic revolution:

  1. Keep secrets;
  2. Heterogeneity of processing;
  3. Separation from central authority;
  4. Denser alloy…

How can we reflect upon this for technology in general, and privacy by design in particular?

1. Keep Secrets
The recent report from the Digital Catapult highlights the importance of trust in growing the opportunities for deriving value from personal data. However, the report exhibits the continued mental blockage the technological community seems to have: confounding the value in personal data with the need to share that data.
The simple technical architecture prevalent today is software / app / device / whatever providing a user interface, while all data is stored in “the cloud”. These systems often come with complex Ts&Cs wherein (whether you understand it or not) you have agreed to share your data with the provider, and often given them licence to do unspecified things with it – usually everything just short of publicly publishing it. Laughably, this is often referred to as a “privacy policy” – it is not privacy; at best it is confidentiality.
I sit here editing this file on my personal computer using Word – this document is currently private, and the use of this “editing” app does not require me to share my data (the document) with anyone. Contrast that with Google Docs, where, in order to use the functionality, I am required to share my data with Google. In this regard, I ‘trust’ Word to maintain my privacy, as it takes no view on where I store my data and I can choose to keep it secret or send it for review; whereas I have to hope and pray that Google is able to live up to its claimed confidentiality, as “the data is out there”.
Hopefully forthcoming legislation will require “privacy by design” – in our simple example, since an editor can be designed to keep things secret, that should be a mandatory option.
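As an illustration of what that mandatory option could look like (a sketch only, assuming Python and the 'cryptography' package, and not a description of any existing editor), the application can encrypt the document with a key that never leaves the user's machine, so any cloud store or sync service only ever holds ciphertext:

    # Sketch: the key stays local; whatever is synced to "the cloud" is opaque.
    from pathlib import Path
    from cryptography.fernet import Fernet

    key_file = Path("editor.key")                 # never leaves the user's machine
    if not key_file.exists():
        key_file.write_bytes(Fernet.generate_key())
    cipher = Fernet(key_file.read_bytes())

    document = b"my private draft"                # stand-in for the document contents
    Path("draft.enc").write_bytes(cipher.encrypt(document))   # safe to sync anywhere

    # Reading it back is a purely local operation; the sync provider sees only ciphertext.
    plaintext = cipher.decrypt(Path("draft.enc").read_bytes())

The provider's “privacy policy” then becomes largely irrelevant: confidentiality is enforced by design rather than promised by contract.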

2. Heterogeneity of processing
Sonny is fitted with a secondary processing system, one that is capable of overriding the default “3 laws” behaviour. However, Dr. Alfred J. Lanning, who created Sonny, also realised the importance of ensuring that Del Spooner, the automatonophobic cop, was sufficiently piqued by Lanning’s death to investigate and follow the breadcrumbs.
We can draw two distinct and complementary lessons from this:

  • Aside from the system performing some useful function, we need observers watching out for undesirable behaviours; such technology is widespread in corporations, where independent “intrusion detection systems” monitor networks for anomalous traffic – where are the personal information intrusion detection systems? (A sketch follows this list.)
  • Since we are dealing with personal data, there is, by definition, some living individual whom it concerns. They are an essential part of the heterogeneous processing and must be considered part of the system – for this we must ensure that humans have legibility, agency and negotiability concerning their data.
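For the first point, a personal information intrusion detection system could be as simple as a watcher, separate from the device doing the useful work, that inspects outbound payloads for personal identifiers. The sketch below is hypothetical (the watchlist entries, names and destinations are invented for illustration), written in Python:

    import re

    # Identifiers the household has registered as "personal" (illustrative values only).
    WATCHLIST = [
        re.compile(r"alice@example\.com", re.IGNORECASE),
        re.compile(r"\bSW1A 1AA\b"),      # a postcode
        re.compile(r"\bAlice Smith\b"),   # a name
    ]

    def inspect_outbound(payload: bytes, destination: str) -> list:
        """Return alerts for personal data spotted in an outgoing message."""
        text = payload.decode(errors="ignore")
        return [f"ALERT: {p.pattern!r} leaving the network towards {destination}"
                for p in WATCHLIST if p.search(text)]

    # Example: a 'smart' appliance phoning home with rather more than it needs to.
    for alert in inspect_outbound(b'{"user": "Alice Smith", "cycle": "eco"}',
                                  "telemetry.vendor.example"):
        print(alert)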

Keeping humans in the loop is essential for “privacy by design”, extending significantly beyond what we consider informed consent.

3. Separation from central authority
It is the stuff of endless Hollywood movies (from Dr. Strangelove to Captain America: The Winter Soldier) that centralized command and control systems are a danger – a central point of attack, emergent intelligence or simply bugs, and the world ends.
While centralized command and control systems are the dream of military commanders, even they delegate authority to commanders in the field to make independent decisions, ensuring an appropriate and timely response to changing circumstances and communications interference (many movies here too).
Sonny was not only provided with heterogeneity of processing so that he could challenge the default “3 laws” behaviour; he was also aware of commands from V.I.K.I. but could choose to ignore them.
Plans for centralized control of domestic appliances for demand-side management of energy consumption are both technically dangerous and require intrusive monitoring (which many confound with the roll-out of smart meters). A similar problem arose in the telephone network in the mid-1980s and was solved with a highly distributed algorithm that requires no centralized state – such analytic insights need to be at the forefront for those campaigning for “privacy by design” when confronted with the spurious technical argument that “we must centralize the data or it doesn’t work”.
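By way of illustration only (this is a well-known decentralised pattern, frequency-responsive load control, and not the 1980s telephony algorithm alluded to above), demand-side response can be driven entirely by locally measured grid frequency, with each appliance deferring flexible load probabilistically; no central controller, and no per-household data ever leaves the home:

    import random

    NOMINAL_HZ = 50.0
    FULL_SHED_DEVIATION = 0.5     # at 49.5 Hz every flexible load defers

    def should_defer(measured_hz: float) -> bool:
        """Defer this appliance's flexible load with a probability proportional
        to how far the locally measured frequency has sagged below nominal."""
        sag = max(0.0, NOMINAL_HZ - measured_hz)
        return random.random() < min(1.0, sag / FULL_SHED_DEVIATION)

    # e.g. a freezer deciding whether to postpone its defrost cycle during a dip to 49.8 Hz
    if should_defer(measured_hz=49.8):
        print("deferring flexible load for a few minutes")

The aggregate effect across millions of appliances is the demand reduction the grid needs, yet no party ever learns which household did what.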

4. Denser Alloy…
Exhorting commercial entities to modify their practices to take “privacy by design” on board is a merit-worthy activity, but we’ll need some ghosts in the machine to catch them when they succumb to bad practice or plain carelessness.
Sonny: Do you think we were all created for a purpose? I’d like to think so.
[looks at his hand]
Sonny: Denser alloy. My father gave it to me. I think he wanted me to kill you.
The talk will take examples from the domestic Internet of Things to illustrate the points made herein…