HMRC has announced that it will comply with a direction from the Information Commissioner's Office (ICO) and delete five million voice records of callers to its hotlines, after the ICO found that the capture and retention of the recordings breached data protection law.
The ICO investigation, prompted by media coverage and a complaint from the campaign group Big Brother Watch, focused on the use of voice authentication for customer verification on some of HMRC's helplines between January 2017 and October 2018. It is understood that the ICO's findings will reveal that HMRC had not obtained adequate consent from the people whose voices were recorded.
The ICO has confirmed that it will publish a formal enforcement notice giving HMRC 28 days from the date of publication to complete the deletion.
In a statement, ICO Deputy Commissioner Steve Wood said the investigation had exposed a significant breach of data protection law. He warned that although digital services help to make lives easier, this must not come at the expense of people's "fundamental right to privacy", and that if organisations did not obtain the necessary consent, the ICO would "take action to protect the public."
HMRC's use of the technology spanned periods both before and after the General Data Protection Regulation (GDPR) took effect; however, the ICO's investigation itself was carried out using powers under the GDPR. The GDPR treats biometric data (such as voice recognition recordings) as special category data, subject to stricter conditions on processing. Data controllers who use new or invasive technologies to process biometric data will generally be required to conduct a data protection impact assessment (DPIA) beforehand. It is not yet known whether HMRC conducted a DPIA.
Mishcon de Reya has expert lawyers and specialists who can assist clients with DPIAs, and with the legal and technical issues that the processing of biometric data involves.