A year after being alerted to the fact that the Royal Free NHS Trust in London had transferred some 1.6 million partial patient records to Google's DeepMind business, the Information Commissioner's Office has issued a draft Undertaking that it requires the Royal Free to give, in respect of various breaches of the Data Protection Act (see here).
The collaboration between the Royal Free and DeepMind was very quickly identified by data protection professionals as troubling. Personal data relating to patients always requires special consideration: the manner in which the data was processed raised concerns; the sheer amount of data being processed raised concerns; the information provided to the underlying patients about the nature of the processing appeared weak or lacking; and the arrangements between the parties did not appear to stand up to real scrutiny. On all four counts, the ICO has found the Royal Free lacking (see here).
The ICO's letter to the Royal Free provides some detail of the alleged breaches of the Act, and in particular the breaches of Principles 1, 3, 6 and 7 of the Act.
Principle 1 requires that personal data be processed fairly and lawfully, which includes informing individuals how and for what purposes their data will be processed. Here, the ICO found that the Free "did not provide an appropriate level of transparency to patients about the use of their personal data" and, like the National Data Guardian, Dame Fiona Caldicott, found that the Free had failed to meet the conditions for processing in Schedules 2 and 3 of the Act.
Principle 3 (personal data shall be adequate, relevant and not excessive in relation to the purpose or purposes for which they are processed) was also found to be breached: the Free had provided some 1.6 million partial patient records to DeepMind, and the Commissioner found that transferring that number of records was neither necessary nor proportionate. This is an interesting finding, as the DPA gives no guidance on Principle 3's meaning: is proportionality a test of the total use of all personal data held, or of the data held about a given individual? Here, the ICO has found the Free wanting on the former count.
Principle 6 requires that personal data be processed in accordance with the rights of data subjects under the Act, and the ICO found that the Free had failed to respect those rights.
Finally, Principle 7 requires controllers to take appropriate measures when using third-party processors, including putting in place a written contract. Whilst a contract was concluded in September 2015, the ICO concluded that it did not go far enough to ensure that "only the minimal possible data would be processed by DeepMind and that the processing would only be conducted for limited purposes". Further, "the Commissioner [was] also concerned to note that the processing of such a large volume of records containing sensitive health data was not subject to a privacy impact assessment ahead of the project's commencement."
One might have thought that, given the seriousness of the breaches, the volume of sensitive personal data handed over to Google and the interest in getting this very issue right, the Commissioner would levy a fine on the Free close to the maximum permitted of £500,000. Instead, she has required the Free to give undertakings around its future performance. Whether this approach will act as a 'nudge' to others to perform better, or indicate that breaches of this scale aren't taken as seriously as many may have expected, will be the test for future projects of this kind. One thing is certain: the use of machine learning and artificial intelligence in assessing medical risk and care processes will only continue to expand, and how the health sector engages with it, in the light of the DPA and the General Data Protection Regulation, will play a crucial role in shaping healthcare and care outcomes for the foreseeable future.