<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>Latest from Mishcon de Reya</title>
    <link>https://www.mishcon.com</link>
    <atom:link href="https://www.mishcon.com/datamatters/rss" rel="self" type="application/rss+xml" />
    <description>This RSS feed displays the very latest news, events, articles and briefings from the Mishcon de Reya Solicitors web site at www.mishcon.com</description>
    <language>en-gb</language>
    <copyright>Copyright 2026 Mishcon de Reya LLP. All rights reserved. www.mishcon.com</copyright>
    <image>
      <title>Latest from Mishcon de Reya</title>
      <url>https://www.mishcon.com/assets/managed/images/cache/AD7QWAAASEAEYAAAAAAABEAAJQAP777774BQAAAAVECHMAQAAA.jpg</url>
      <link>https://www.mishcon.com</link>
    </image>
    <generator>PilotSite RSS Generator</generator>
    <webMaster>feedback@mishcon.com (Mishcon De Reya)</webMaster>
    <lastBuildDate>Wed, 22 Apr 2026 08:52:43 +0000</lastBuildDate>
    <ttl>20</ttl>
    <item>
      <title><![CDATA[DSG v ICO: which perspective applies when determining duties?]]></title>
      <link>https://www.mishcon.com/news/dsg-v-ico-which-perspective-applies-when-determining-duties</link>
      <guid>https://www.mishcon.com/news/dsg-v-ico-which-perspective-applies-when-determining-duties</guid>
      <description><![CDATA[]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Wed, 25 Feb 2026 12:42:00 GMT</pubDate>
<content:encoded><![CDATA[<p>The <a href="https://www.judiciary.uk/wp-content/uploads/2026/02/ICO-v-DSG-2026-EWCA-Civ-140-FINAL-for-hand-down.pdf">Court of Appeal has upheld an appeal by the Information Commissioner</a>, in long-running regulatory litigation arising from a cyber breach DSG Retail Ltd suffered in 2017-2018, which led the ICO to impose a fine of the then-maximum &pound;500,000 under the now-repealed Data Protection Act 1998. The breach involved the taking, by the attackers, of a large volume of financial card information, but in the very large majority of cases the information consisted only of the cards&#39; PAN (the 16-digit number on the front of a card) and their expiry date (the &quot;EMV data&quot;). So (in that majority of cases) no names or CVV numbers were disclosed. It is common ground that the EMV data is personal data in the hands of DSG Retail, because it can match it with the other information it holds and identify the individuals to whom the data relates. What has been at issue is whether the EMV data is personal data in the hands of the attacker.</p>

<p>On initial appeal, the First-tier Tribunal (FTT) agreed with the ICO that DSG Retail had contravened the data security principle requirement to take <em>&quot;appropriate technical and organisational measures...against unauthorised or unlawful processing of personal data and against accidental loss or destruction of or damage to personal data&quot;</em>. However, the FTT reduced the fine to &pound;250,000, holding that the contravention was not as serious as the ICO had determined.</p>

<p>On next appeal to the Upper Tribunal, DSG Retail focused its case on an argument that, when assessing whether a controller has complied with the data security principle, the ICO - and the courts - must consider whether the information accessed by the attacker was personal data <em>from the perspective of the attacker</em>. As the very large majority of exfiltrated information in this case was incapable of being connected by the attacker to any identifiable individual, it was not personal data in the attacker&#39;s hands. In agreeing with this submission, and upholding DSG&#39;s appeal, the Upper Tribunal held that the ICO had fallen into error in finding that there had been a contravention of the data security principle without considering whether the data that was rendered vulnerable would be &quot;personal data&quot; in the hands of third parties who could access it.</p>

<p>The Court of Appeal has now overturned the Upper Tribunal&#39;s decision, with the court saying that the data security principle applies if the data is &quot;personal&quot; from the perspective of the data controller, and that it is unnecessary to consider whether it is the personal data &quot;in the hands of&quot; or &quot;from the perspective&quot; of any other person.</p>

<p>This is the latest in a long line of cases, including from the European courts, which deal with the questions of &quot;identifiability&quot; and &quot;perspective&quot;, and the issues remain both complex and controversial. It is quite possible that there will be a further appeal to the Supreme Court, although DSG Retail is not believed to have made any announcement as yet.</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ABICWAAA7AA7YAAAAAAAB6AB7QAP777777AACAAA7IG72BQAAE.jpg" length="40981" />
    </item>
    <item>
      <title><![CDATA[Unlawful cookies: a new avenue for the ICO to issue fines?]]></title>
      <link>https://www.mishcon.com/news/unlawful-cookies-a-new-avenue-for-the-ico-to-issue-fines</link>
      <guid>https://www.mishcon.com/news/unlawful-cookies-a-new-avenue-for-the-ico-to-issue-fines</guid>
      <description><![CDATA[Until 5 February 2026, the Information Commissioner's Office could only issue a fine for contravening cookies law if it was a "serious" contravention and one that was of a kind "likely to cause substantial damage or substantial distress".]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Thu, 12 Feb 2026 21:30:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>Until 5 February 2026, the Information Commissioner&#39;s Office (ICO) could only issue a fine for contravening cookies law if it was a &quot;serious&quot; contravention and one that was of a kind &quot;likely to cause substantial damage or substantial distress&quot;.</p>

<p>With the commencement of section 115 and Schedule 1 of the Data (Use and Access) Act 2025 (DUAA), those &quot;seriousness&quot; <em>and</em> &quot;substantial damage or substantial distress&quot; requirements are removed, and, in principle, <em>any</em> contravention is punishable by a fine.</p>

<p>In reality, this may not present an immediate regulatory risk: in contrast to its approach to unsolicited direct electronic marketing (where the similar &quot;substantial damage or substantial distress&quot; requirement was repealed in 2015), the ICO has not shown much appetite for enforcing the cookie provisions of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR). However, those deploying cookies and similar technologies, and those advising them, would do well to bear the changes in mind. They would also do well to watch for any changes in messaging coming out of the ICO.</p>

<p>Regulation 6 of PECR, as amended, provides that, in general, a person &quot;must not store information, or gain access to information stored, in the terminal equipment of a subscriber or user&quot;. As the ICO explains, &quot;Cookies are used by many websites and can do a number of things, e.g. remembering your preferences, recording what you have put in your shopping basket, and counting the number of people looking at a website&quot;. They are also an essential feature of much online programmatic advertising targeted at individual users, through profiling.</p>

<p>Although cookies and similar tracking technology are ubiquitous online, they should not be deployed unless certain conditions, as laid out in new Schedule A1 of PECR, are met. These conditions include: where the user has consented to the cookies (provided they have been given clear and comprehensive information about them); where the cookies are required for transmitting a communication over a network; where the cookies are strictly necessary to provide an &quot;information society service&quot; (broadly, this means most online services, such as apps, search engines, social media platforms, online marketplaces, content streaming services, online games, news websites, and any websites offering other goods or services, but it will generally not include websites operated by public authorities); and - in some cases - where the cookies are used to collect website/user analytics.</p>

<p>Understanding aspects of PECR requires knowledge of wider communications law and technological understanding of how communications are made across online networks. Some of the amendments recently made by the DUAA involve these complex aspects. Shortly after the DUAA was enacted, the ICO indicated that updated PECR guidance was due in &quot;Winter 2025/2026&quot;, but it has not emerged yet. Practitioners and advisers would do well to look out for it though, as it may give an indication of the ICO&rsquo;s possible areas of regulatory focus.</p>

<p>The DUAA also increased the maximum penalty for PECR contraventions from &pound;500,000 to &pound;17.5 million or 4% of global annual turnover (whichever is higher). Even under the current low-interventionist ICO regime, this increase, together with the technical complexity of PECR, represents a risk that should be understood all the way up to board level.</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/AAEEEAAA7AA7YAAAAAAAB6AB7QAP777774AAASQAXAF5YBIAAE.jpg" length="34405" />
    </item>
    <item>
      <title><![CDATA[Data protection and electronic privacy reform: what’s hot and what’s not?]]></title>
      <link>https://www.mishcon.com/news/data-protection-and-electronic-privacy-reform-whats-hot-and-whats-not</link>
      <guid>https://www.mishcon.com/news/data-protection-and-electronic-privacy-reform-whats-hot-and-whats-not</guid>
      <description><![CDATA[Various aspects of data protection and eprivacy reforms take effect in the UK on 5 February 2026.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Thu, 05 Feb 2026 15:11:00 GMT</pubDate>
      <content:encoded><![CDATA[<h2>In brief</h2>

<ul>
	<li>Various aspects of data protection and eprivacy reforms take effect in the UK on 5 February 2026.</li>
	<li>Significant automated decision-making undergoes a major shift, from a &ldquo;can&rsquo;t, unless&hellip;&rdquo; framework, to a &ldquo;can, so long as&hellip;&rdquo; one.</li>
	<li>Charities&rsquo; ability to send electronic marketing to their supporters becomes much wider.</li>
	<li>Potential maximum fines for eprivacy breaches rise from &pound;500,000 to &pound;17.5m or 4% of global annual turnover.</li>
	<li>A new concept of &ldquo;recognised legitimate interests&rdquo; is introduced which does not require the usual balancing tests for reliance on the existing legitimate interests lawful basis.</li>
	<li>With the making of The Data (Use and Access) Act 2025 (Commencement No. 6 and Transitional and Saving Provisions) Regulations 2026, the most significant data protection and eprivacy reform provisions since the UK left the EU have now finally been commenced.</li>
</ul>

<p>Almost as soon as Brexit took effect, the then government introduced a <a href="http://www.gov.uk/guidance/national-data-strategy">National Data Strategy</a> and proposed a &ldquo;bold new data regime outside the EU&rdquo;. Two failed Parliamentary Bills followed (the Data Protection and Digital Information Bills 1 and 2) before Parliament finally passed the Data (Use and Access) Act 2025 (DUAA) in June last year. The final version of the reform package was notably less bold than what earlier governments had proposed, but practitioners and advisers have still been waiting, with some anticipation, for the commencement of Part 5 of the DUAA, which deals primarily with amendments to the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003. With the Commencement No. 6 Regulations now made, that wait is over.</p>

<h2>What changes are made?</h2>

<h3>Automated decision-making</h3>

<p>Possibly the most significant change, given both the current stage and likely future stages of technological advancement, is to Article 22 of the UK GDPR. Prior to amendment, Article 22 placed a general prohibition on automated processing, including profiling, which produced legal effects on a data subject or similarly significantly affected them. This prohibition (cast as a data subject right) was subject to exceptions where that automated processing was necessary for performance of a contract with the data subject, was required or authorised by law, or was based on the data subject&rsquo;s consent. These exceptions were, obviously, limited in effect.</p>

<p>Article 22 is now replaced by Article 22A, the effect of which is that the general prohibition, and its exceptions, continue to apply only where automated processing (producing legal effects on a data subject or similarly significantly affecting them) is carried out on special category data. With &ldquo;ordinary&rdquo; personal data, meanwhile, automated decision-making can take place, provided specific safeguards are observed, such as: providing information about the decision-making; the ability for the data subject to make representations; the ability to obtain human intervention; and the ability to contest the decision made.</p>

<p>In many cases, provided they observe those safeguards, and provided that the rights, freedoms and interests of data subjects are not overriding, businesses will be able to rely on their (or others&rsquo;) legitimate interests as a legal basis for automated decision-making. This change from a &ldquo;can&rsquo;t, unless&hellip;&rdquo; to a &ldquo;can, so long as&hellip;&rdquo; scheme represents a potentially fundamental shift in how the law will treat such processing, including in the context of AI systems.</p>

<h3>Charities and electronic direct marketing, and PECR fines</h3>

<p>The law governing the sending of direct marketing to individuals by email and SMS was passed as far back as 2003 - the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR). It only allows direct electronic marketing to be sent with the recipient&rsquo;s consent, but with an important exception for organisations that have sold products or services (or entered into negotiations for the sale of such products or services) to the individuals to be marketed to - as long as those organisations provided information on opting out, they can send such electronic marketing without consent.</p>

<p>This exception has come to be called the &ldquo;soft opt-in&rdquo;. Because of the requirement for there to be a &ldquo;sale&rdquo; of a product or service, charities have not generally been able to avail themselves of it, until now. Regulation 22 of PECR has been amended to allow charities to send direct marketing to individuals where: the sole purpose of the marketing is to further one or more of the charity&rsquo;s charitable purposes; the individual provided their details when expressing an interest in one or more of those purposes, or when offering or providing support for one or more of those purposes; and the appropriate opt-out information is provided at all stages.</p>

<p>This amendment is potentially very significant for the charity sector and its fundraising capabilities. But charities would be wise to step cautiously and take appropriate advice. Unlawful direct electronic marketing has long been a target for enforcement action by the Information Commissioner (ICO), who is also receiving increased enforcement powers in relation to PECR, including, notably, an increase in the maximum fine for PECR infringement from &pound;500,000 to &pound;17.5m or 4% of global annual turnover (whichever is higher) - the equivalent of the maximum fines under the UK GDPR and Data Protection Act 2018.</p>

<h3>&ldquo;Recognised&rdquo; legitimate interests</h3>

<p>For processing of personal data to be lawful under the UK GDPR, it must, at least, have a lawful basis under Article 6(1). Perhaps the basis most commonly relied upon is that the processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party (Article 6(1)(f)), which applies unless those interests are overridden by the interests or fundamental rights and freedoms of the data subject. As the case law has established, to determine properly whether this basis is available requires a careful set of at least three staged tests: i) are legitimate interests being pursued (and by whom)? ii) is the processing necessary for the purposes of those interests? iii) are those interests outweighed by the data subject&rsquo;s interests, rights or freedoms?</p>

<p>With the amendments to Article 6, Parliament has now introduced the concept of a &ldquo;recognised&rdquo; legitimate interest. Where such an interest exists, the third test above is effectively knocked away: Parliament has determined that certain processing activities are for legitimate interests and those interests are never outweighed by the data subject&rsquo;s rights. These include: processing for the purposes of safeguarding national security, of protecting public security, for defence purposes, and for safeguarding a vulnerable individual.</p>

<p>It should be stressed however that the existence of a recognised legitimate interest does not imply an exemption from the obligation to comply with all other aspects of the data protection principles.</p>

<h3>Regulation-making powers</h3>

<p>A notable feature of the DUAA &ndash; and indeed of much modern legislation &ndash; is that it confers on the Secretary of State the power to make further regulations. There are multiple instances of this in the Act, but, as one example, further &quot;recognised legitimate interests&quot; can be created by this route.</p>

<h2>Other amendments</h2>

<p>We will be providing further commentary in the coming weeks on other significant changes to the UK&#39;s data protection and eprivacy regime. These include in the areas of scientific research, cookies, data subject complaints and rights, and reform of the Information Commissioner&#39;s Office.</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/AALECAAA7AA7YAAAAAAAB6AB7QAP777776XQGAAAMII3CCAAAE.jpg" length="115548" />
    </item>
    <item>
      <title><![CDATA[Synthetic identity fraud: understanding the threat and protecting your interests]]></title>
      <link>https://www.mishcon.com/news/synthetic-identity-fraud-understanding-the-threat-and-protecting-your-interests</link>
      <guid>https://www.mishcon.com/news/synthetic-identity-fraud-understanding-the-threat-and-protecting-your-interests</guid>
      <description><![CDATA[Synthetic identity fraud involves the creation of fictitious identities by combining real personal information (such as National Insurance numbers) with fabricated details (false names, addresses, dates of birth).]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Thu, 22 Jan 2026 10:10:00 GMT</pubDate>
<content:encoded><![CDATA[<h2>In brief</h2>

<ul>
	<li><strong>Synthetic identity fraud</strong> involves the creation of fictitious identities by combining real personal information (such as National Insurance numbers) with fabricated details (false names, addresses, dates of birth). Unlike traditional identity theft, no real person corresponds to the identity created.</li>
	<li><strong>The threat is growing rapidly</strong>: synthetic identity fraud is now one of the fastest-growing financial crimes in the UK and globally, with annual losses estimated in the billions. Financial services, credit providers, and e-commerce platforms are particularly vulnerable.</li>
	<li><strong>Victims include individuals whose genuine details are misused, lenders and businesses suffering direct financial losses, and the broader economy</strong> through the facilitation of money laundering and other serious crimes.</li>
	<li><strong>Legal remedies are available</strong>: victims and affected businesses can pursue freezing orders, asset tracing, proprietary injunctions, and Norwich Pharmacal relief, whilst lawyers play a crucial role in coordinating enforcement action and advising on prevention frameworks.</li>
</ul>

<h2>What is synthetic identity fraud?</h2>

<p>Synthetic identity fraud involves the construction of fictitious identities by blending genuine personal information - often a real National Insurance number or date of birth - with fabricated elements such as false names, addresses, or telephone numbers. The resulting &quot;synthetic identity&quot; does not correspond to any real person, which distinguishes it from traditional identity theft, where a fraudster assumes the complete identity of an existing individual.</p>

<p>These synthetic identities are then cultivated over time. Fraudsters apply for credit, make small purchases, and build a credit history, establishing apparent legitimacy. After months or even years of &quot;nurturing&quot; the identity, the fraudster executes a &quot;bust-out&quot;: maximising credit lines and then disappearing, leaving victims - primarily lenders - with substantial losses and no real person to pursue.</p>

<h2>Scale and prevalence</h2>

<p>Synthetic identity fraud has emerged as one of the most significant and fastest-growing threats in financial crime. In the United States, the Federal Reserve has identified it as the fastest-growing type of financial crime, accounting for an estimated 10&ndash;15% of credit losses and costing lenders approximately $6 billion annually. UK-specific data remains more limited, but UK Finance and the Financial Conduct Authority have flagged synthetic identity fraud as an escalating concern, particularly as digital onboarding and remote account opening have accelerated post-pandemic.</p>

<p>The financial services sector bears the brunt of losses, but e-commerce platforms, telecommunications providers, and any business extending credit or processing high-value transactions face exposure.</p>

<h2>Who is at risk and what are the harms?</h2>

<p>Synthetic identity fraud creates multiple categories of victim:</p>

<p><strong>Individuals:</strong> persons whose genuine personal details (often children, the elderly, or individuals with thin credit files) are harvested and incorporated into synthetic identities may suffer credit damage, difficulty accessing finance, and the burden of proving they are victims rather than perpetrators. Such personal data is frequently obtained through cyber-attacks on third parties.</p>

<p><strong>Lenders and businesses</strong>: these entities face direct financial losses from unpaid credit and chargebacks, and fraud-related write-offs can be substantial. Reputational damage and regulatory scrutiny may follow, particularly where inadequate controls are identified.</p>

<p><strong>Consumers and the economy:</strong> synthetic identities underpin wider criminal enterprises, from money laundering to fraud-as-a-service operations, eroding trust in digital commerce and increasing costs across the financial system.</p>

<h2>Prevention and protection</h2>

<p>Effective prevention requires a multi-layered approach:</p>

<p><strong>For individuals:</strong> regularly monitor credit reports for unfamiliar accounts or enquiries; consider credit freezes or fraud alerts; and safeguard personal information, particularly National Insurance numbers and identity documents.</p>

<p><strong>For businesses</strong>: implement robust Know Your Customer (KYC), Anti-Money Laundering (AML) and cyber-security controls; deploy advanced identity verification technologies, including biometric authentication, document verification, and AI-driven anomaly detection; monitor account behaviour for patterns consistent with synthetic identity cultivation (e.g., rapid credit-building followed by sudden high-value applications); and share intelligence through industry fraud-prevention networks.</p>

<h2>Legal remedies and the role of legal practitioners</h2>

<p>When synthetic identity fraud is detected, swift legal action is essential to mitigate losses and hold perpetrators accountable. Mishcon de Reya&#39;s Fraud team advises clients on the full spectrum of remedies:</p>

<p><strong>Freezing orders and asset preservation:</strong> obtaining urgent injunctive relief to freeze assets and prevent dissipation, including worldwide freezing orders where fraud spans multiple jurisdictions.</p>

<p><strong>Norwich Pharmacal relief</strong>: compelling third parties (banks, telecommunications providers, online platforms) to disclose information identifying fraudsters and tracing the flow of funds.</p>

<p><strong>Proprietary claims and tracing:</strong> pursuing assets through complex transaction chains, asserting proprietary interests, and recovering misappropriated funds.</p>

<p><strong>Coordination with law enforcement:</strong> liaising with the National Crime Agency, Report Fraud, and international authorities to support criminal investigations and enhance prospects of recovery.</p>

<p><strong>Regulatory compliance and prevention:</strong> advising financial institutions and businesses on regulatory obligations, designing and stress-testing KYC/AML frameworks, and implementing technology-driven controls to detect and prevent synthetic identity fraud before losses occur.</p>

<p>Synthetic identity fraud presents a sophisticated and evolving threat, but it is not insurmountable. With the right combination of vigilance, technology controls, and timely legal action, individuals and businesses can protect themselves and pursue effective remedies when fraud of this kind occurs.</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ABYSUAAA7AA7YAAAAAAAB6AB7QAP777774AABPAAXMHF4BYAAE.jpg" length="35745" />
    </item>
    <item>
      <title><![CDATA[Privacy is dead? Long live privacy]]></title>
      <link>https://www.mishcon.com/news/privacy-is-dead-long-live-privacy</link>
      <guid>https://www.mishcon.com/news/privacy-is-dead-long-live-privacy</guid>
      <description><![CDATA[It is often said that privacy is dead. Over a decade ago, Facebook's founder Mark Zuckerberg claimed that privacy was no longer a "social norm"; that people had become so comfortable sharing more information, more openly and with more people, that the notion of holding back and protecting personal information was antiquated, even obsolete.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Mon, 22 Dec 2025 10:25:00 GMT</pubDate>
      <content:encoded><![CDATA[<h2>In brief</h2>

<ul>
	<li>Although the technological and societal threats to privacy are greater than ever, privacy is not dead and is worth fighting for.</li>
	<li>There is more private information in use, therefore more scope for misuse, and scrutiny has been turbocharged by technology, including AI.</li>
	<li>Even the most exposed and scrutinised families and public figures have the right to respect for their private and family lives, and to prevent unwarranted intrusion.</li>
	<li>The law recognises individual autonomy: a person&#39;s right to exercise close control over particular information about their private life.</li>
	<li>Being &quot;for&quot; privacy does not mean being against freedom of expression. They are two equal but competing rights, always in tension. The legal approach to privacy has built-in safeguards, including to prevent trivial claims, and it will not stop the publication of material that is genuinely in the public interest.</li>
</ul>

<p>It is often said that privacy is dead. Over a decade ago, Facebook&#39;s founder Mark Zuckerberg claimed that privacy was no longer a &quot;<em>social norm</em>&quot;; that people had become so comfortable sharing more information, more openly and with more people, that the notion of holding back and protecting personal information was antiquated, even obsolete.</p>

<p>Technological and societal threats to privacy have only increased since then. Sceptics and certain sections of the media would have you believe that those who seek to assert their right to privacy must have something to hide - they are branded as &quot;<em>furtive</em>&quot;, &quot;<em>evasive</em>&quot; or &quot;<em>secretive</em>&quot;. Yet the human right to respect for private and family life is intrinsically connected to our respect for human dignity and autonomy, and legal protections for private information and &ndash; crucially &ndash; against unwarranted intrusion into private life, remain alive and well, and are consistently being applied in novel contemporary situations.</p>

<p>It is, without doubt, harder now to protect private information. For one thing, our private data and information is more in use, so the scope for misuse is higher. We are just at the start of an AI revolution, fuelled by data gathering on a massive scale, as well as increasingly sophisticated surveillance. Some technology, such as smart fridges and tracking apps, we can opt out of, but much of it, like biometric data collection, is built into everyday transactions, especially for security. It is almost impossible to live &#39;off grid&#39;. Technology has also turbocharged scrutiny, not just by the press but by citizen journalists, activists and hackers. Private moments are more likely to be captured &ndash; from video doorbells to drones &ndash; and have never been easier to spread. The Coldplay concert embrace this summer, and the viral storm which followed, is just one example of the internet turned sleuth, judge and jury.</p>

<p>Even the most exposed and scrutinised families and public figures have privacy rights, which can &ndash; indeed must - be enforced confidently and consistently if we are not to concede the fight entirely. The best way to guard against loss of privacy is by being mindful about what we share, with whom, and how it is received by the recipient. It is prudent to set clear expectations before sharing &ndash; so that if information is not for onward dissemination, that is understood and agreed in advance. Equally, if privacy boundaries are set, they must be actively policed in order to be effective &ndash; and unlawful interference addressed.</p>

<p>It is often forgotten that privacy law does not just protect the keeping of secrets, but also guards against intrusion into the private spheres of our lives. The Supreme Court has upheld privacy injunctions, even in the face of considerable online speculation about the parties&#39; identities, appreciating the significance of the additional scrutiny and media intrusion that would result were the injunction to be lifted.</p>

<p>The law also recognises individual autonomy. At one time it was thought that disclosures in a given &quot;<em>zone</em>&quot; of a person&#39;s private life could defeat, or at least greatly reduce, the weight of any claim for privacy in respect of other information in the same &quot;<em>zone</em>&quot;. That theory has been discredited. The starting point now is that a person has the right to exercise close control over particular information about their private life: to decide whether to disclose anything about a given aspect of that life and, if so, what to disclose, when and to whom. For public figures in particular, this fits with the established principle that they are not &quot;fair game&quot; simply because they seek favourable publicity in other aspects of their lives.</p>

<p>Being &quot;<em>for</em>&quot; privacy should not be equated with being &quot;<em>against</em>&quot; freedom of expression. These are equal but often competing rights, always in tension, and any interference with either must be necessary and proportionate. In other words, the legal approach to privacy has built-in safeguards, including to prevent trivial claims, and it will not stop the publication of material that is genuinely in the public interest. Privacy is worth fighting for. It is a human right, and should not be a dirty word.</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ACKUEAAA7AA7YAAAAAAAB6AB7QAP777774AAAKADVMDFMAYAAA.jpg" length="93237" />
    </item>
    <item>
      <title><![CDATA[When AI impersonates - taking action against deepfakes in the UK]]></title>
      <link>https://www.mishcon.com/news/when-ai-impersonates-taking-action-against-deepfakes-in-the-uk</link>
      <guid>https://www.mishcon.com/news/when-ai-impersonates-taking-action-against-deepfakes-in-the-uk</guid>
      <description><![CDATA[The UK lacks overarching deepfake legislation, leaving victims facing a complex patchwork of existing laws including intellectual property, data protection, defamation and malicious falsehood. While the Government has recently introduced criminal sanctions relating to non-consensual intimate deepfakes, significant gaps remain.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Fri, 10 Oct 2025 15:59:00 GMT</pubDate>
      <content:encoded><![CDATA[<h2>In brief&nbsp;</h2>

<ul>
	<li>The UK lacks overarching deepfake legislation, leaving victims facing a complex patchwork of existing laws including intellectual property (IP), data protection, defamation and malicious falsehood.&nbsp;&nbsp;</li>
	<li>While the Government recently introduced criminal sanctions for sharing non-consensual intimate deepfakes (via the Online Safety Act 2023), and provisions criminalising their creation (in the Data (Use and Access) Act 2025), significant gaps remain.&nbsp;</li>
	<li>Detection and enforcement present substantial challenges for individuals, with perpetrators often difficult to identify and frequently based overseas, beyond UK regulatory reach, whilst platforms are often slow to remove deepfake content.&nbsp;</li>
	<li>The Government&#39;s current consultation on AI and copyright may include consideration of whether more controls should be given to performers over the use of their likenesses and performances.&nbsp;&nbsp;</li>
	<li>The EU AI Act meanwhile includes transparency requirements for deepfakes including machine-readable marking and disclosure obligations, though practical implementation challenges remain.&nbsp;</li>
</ul>

<p>The use of AI tools is proliferating and becoming mainstream. Allied to fast-moving developments in the underlying technology, it is becoming increasingly difficult to distinguish AI-generated content &ndash; including deepfakes (i.e. images, video or audio intended to impersonate an individual&#39;s likeness or voice) &ndash; from human-generated and authentic content. Deepfake technology isn&#39;t, in itself, particularly new, but the ease and scale with which deepfakes can now be produced and disseminated, without easy detection or challenge, has led to urgent calls for a review of regulation in this area.&nbsp;</p>

<p>&#39;Digital replicas&#39; (a more benign expression for &#39;deepfakes&#39;) can, of course, be created for positive uses. The technology has been used to de-age the actor Harrison Ford in the movie <em>Indiana Jones and the Dial of Destiny</em> and to reanimate deceased actors (such as Carrie Fisher) on screen. But, when digital replicas are made without consent, they can be put to more nefarious uses. <a href="https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/deepfakes-demean-defraud-disinform">Ofcom</a> summarised these risks well in a recent study when it noted that deepfakes can be used to &quot;<em>demean, defraud and disinform</em>&quot;. Many famous people have been the subject of deepfakes, from Taylor Swift through to Stephen Fry and the financial journalist Martin Lewis, but the problem also impacts non-celebrities, and sometimes in devastating ways.&nbsp;&nbsp;&nbsp;</p>

<h2>No overarching regulatory framework&nbsp;</h2>

<p>The problem for those impacted is that there is no overarching law regulating deepfakes in the UK. Instead, there is a patchwork of existing laws (for example, IP, data protection, defamation, malicious falsehood) alongside laws that address particular harms (such as the use of deepfakes in <a href="https://www.mishcon.com/news/deepfake-fraud-protecting-your-organisation">fraudulent activity</a>). Importantly, current regulatory focus is on the creation and dissemination of non-consensual intimate images in the form of deepfakes, where the Government has taken a number of steps to introduce criminal sanctions, with more developments to come shortly. These developments have been hard fought for, and greatly welcomed by campaigners, but there are still gaps in the legislation: for example, there is nothing yet to address &quot;nudifying&quot; or &quot;undressing&quot; apps, which remove clothing from images.&nbsp;&nbsp;</p>

<h2>Difficulties in detection and enforcement&nbsp;</h2>

<p>In addition to the overly complex nature of the current regulatory framework, those impacted by deepfakes face the additional difficulty of tracking down those who create or disseminate such images. Even if they can be identified, the perpetrators are often based overseas and out of reach of the UK regulatory authorities. While contractual protections may assist for some individuals (for example, performers, who may wish to contract against having their performance used to train an AI model), there is no one-size-fits-all approach to this enforcement question. Accordingly, in addition to enhanced regulation, many are looking to the role of the AI model developers and the large tech platforms (including social media) in detecting and expeditiously removing such content, enforcing their terms of use, and, where possible, preventing such content being generated in the first place. But our experience has been that the platforms are often slow to react, which can be detrimental where content can go viral rapidly online.&nbsp;</p>

<h2>Potential claims against deepfakes&nbsp;</h2>

<p>There are some potential claims that an individual might make in relation to the use of their likeness (image or voice) in a deepfake, some of which are currently less relevant to non-celebrities, but where we may see calls to broaden out the protection available.&nbsp;</p>

<h3>Intellectual property rights&nbsp;&nbsp;</h3>

<p>In the UK, there are certain forms of IP rights that might be available to provide protection for an individual&#39;s likeness. However, there is no form of personality right or image right in the UK (unlike in some other countries). Potential IP rights that might arise include:&nbsp;</p>

<ul>
	<li><em>Copyright</em>: while there is no copyright in an individual&#39;s voice or image, there is likely to be copyright in a photograph or video of an individual, or in a sound recording of their voice. If those copyright works are reproduced without the copyright owner&#39;s consent (e.g. during the training of an AI model), arguments of copyright infringement may arise. However, the difficulty here is that often the individual who is the subject of e.g. the photograph is not the copyright owner of that photograph. Individuals may also find that any relevant copyright works have been licensed to AI model developers to train their models.&nbsp;<br />
	<br />
	Separately, performers have certain rights in their performances (there is no requirement to be a celebrity to rely upon these rights), as well as certain moral rights (though in practice moral rights are often waived by performers). While these rights may be relied upon to tackle unauthorised uses of a performance, the performers&#39; union, <a href="https://www.equity.org.uk/news/2025/equity-calls-for-rights-based-approach-to-uk-government-s-plans-to-unleash-ai">Equity</a>, has called on the government to strengthen performers&#39; rights to encourage licensing and prevent unauthorised AI-related uses. In particular, Equity is lobbying for increased transparency measures and additional rights, including in relation to performance synthesisation, image rights and unwaivable moral rights. It is also concerned about the terms of contracts used by production companies for training AI models/generating digital replicas, citing the example of a performer whose likeness was used as a &#39;performance avatar&#39; and who later discovered it being used to promote the Venezuelan government.&nbsp;<br />
	<br />
	Copyright may, however, have a greater role to play in relation to deepfakes going forward. The Danish government is considering using copyright law to regulate deepfakes by making unauthorised sharing of AI-generated deepfakes illegal, including in relation to deepfakes of non-celebrities. Individuals would be able to demand removal of the images, as well as compensation, and the right would last for up to 50 years after their death. Meanwhile, the central proposal of the US Copyright Office&#39;s report on <a href="https://www.mishcon.com/download/digital-replicas">Digital Replicas</a> is a new federal law to deal with unauthorised digital replicas (again, which would be available for all individuals, not just celebrities), on the grounds that existing laws in the US do not provide sufficient legal redress. A number of US states have also proposed such laws.&nbsp;<br />
	<br />
	In the UK, the Government is currently conducting a <a href="https://www.mishcon.com/news/uk-government-consultation-on-copyright-and-ai-a-win-win">consultation process</a> in relation to AI and copyright. While the consultation does not formally consult on specific proposals on digital replicas and personality rights, the Government has said that it is keen to hear views on the topic. This could include whether the current legal framework provides sufficient control to performers over the use of their likenesses/performances (perhaps involving consideration of whether performers should be able to opt their performances out of being used to train AI models).&nbsp;<br />
	&nbsp;</li>
	<li><em>Trade marks and passing off</em>: Celebrities, such as Rihanna and former motor racing driver Eddie Irvine, have had some success in bringing passing off proceedings for the use of their image to advertise a product, on the grounds that this amounts to a false endorsement. While such claims may assist in similar situations involving deepfakes of celebrities, it will be much more difficult for a non-celebrity to get such a claim off the ground.&nbsp;</li>
</ul>

<h3>Data protection&nbsp;&nbsp;</h3>

<p>Information which &quot;relates to&quot; an identified or identifiable individual is their &quot;personal data&quot;; as a general principle, this means that the data subject has rights in respect of it, and that those who process the personal data have obligations imposed on them. &quot;Inaccurate&quot; data is still personal data, and, by extension, there is certainly a strong argument that a deepfake of an identifiable individual will also be their personal data. This means that affected individuals potentially have the right to request erasure, or to bring complaints or claims, under the UK GDPR.&nbsp;</p>

<h3>Defamation&nbsp;</h3>

<p>A deepfake could give rise to a claim in defamation if it contains false and defamatory information and causes the subject serious reputational harm. Consider a politician who becomes the subject of a fake video where they admit to wrongdoing. The merits will depend on multiple factors including the meaning, nature and extent of publication of the deepfake, and the evidence of reputational harm. There may also be problems locating and identifying the source of the deepfake/its author, problems establishing the liability of any platform hosting the deepfake, and jurisdictional hurdles if they/the platform are based outside of the UK.&nbsp;&nbsp;</p>

<h3>Breach of privacy and/or confidence&nbsp;&nbsp;</h3>

<p>Where a deepfake contains true but private and/or confidential information, the subject may be able to bring a claim for misuse of private information and/or breach of confidence if they did not consent to the information being used and shared in this way. What constitutes &quot;private information&quot; is not defined in law, but it is established that it includes information such as: medical information, details of a person&#39;s sexuality and sex life, and details of their home or family life.&nbsp;</p>

<h3>Non-consensual intimate image deepfakes&nbsp;</h3>

<p>The UK Government recently introduced various pieces of legislation aimed at criminalising conduct around non-consensual intimate deepfakes. As of 31 January 2024, legislation brought in by the Online Safety Act 2023 and inserted into the Sexual Offences Act 2003 criminalises the <em>sharing, or threatening to share</em>, of intimate deepfakes without consent. In addition, the Data (Use and Access) Act 2025&nbsp;contains provisions criminalising the <em>creation, and requesting of the creation</em>, of intimate deepfakes without consent.&nbsp;&nbsp;</p>

<h2>Wider regulatory responses&nbsp;</h2>

<p>The EU&#39;s AI Act is a wide-ranging piece of legislation regulating the development and deployment of AI, including generative AI. One of the bedrocks of ensuring trustworthiness and integrity of AI systems is a robust framework of transparency requirements which enables people to know when they are interacting with or are exposed to AI systems and their outputs (including deepfakes or other manipulated content). In that context, the EU AI Act contains a number of transparency requirements, including in relation to deepfakes, which will start to apply from 2 August 2026.&nbsp;</p>

<p>The European Commission is considering how best to implement the AI Act&#39;s transparency requirements and has recently published a draft&nbsp;<a href="http://digital-strategy.ec.europa.eu/en/library/first-draft-code-practice-transparency-ai-generated-content">Code of Practice</a> on the detection and labelling of artificially generated or manipulated content.&nbsp;&nbsp;</p>

<p>Specifically, in relation to deepfakes and other generated content, Article 50 of the EU AI Act requires:&nbsp;</p>

<ul>
	<li>Providers of AI systems that directly interact with individuals to ensure they are informed they are interacting with an AI system and not a human (unless this is obvious to a reasonably well-informed, observant and circumspect individual in the circumstances and context of use). For example, the Archival Producers Alliance has published <a href="https://www.mishcon.com/download/apa-genai-best-practices">guidance</a> on best practices for the use of Generative AI in Documentaries which include providing visual vocabulary that alerts the audience to GenAI use, such as a unique frame around the material, change of aspect ratio etc.&nbsp;</li>
	<li>Providers of AI systems to facilitate detection and identification of AI-generated or manipulated content by marking such content in a machine-readable manner and enabling related detection mechanisms (e.g. metadata identification, cryptographic techniques and watermarking).&nbsp;</li>
	<li>Deployers of AI systems generating or manipulating deepfake content to provide information about the origin of the content. However, where the content forms part of an evidently artistic, creative, satirical, fictional or analogous work or programme, these obligations are limited to disclosing the existence of the generated or manipulated content in an appropriate manner that does not hamper the display or enjoyment of the work.&nbsp;&nbsp;</li>
</ul>

<p>Of course, the position in relation to transparency and labelling of AI content is not straightforward, both legally and practically. Many organisations, for example, have partnered with the Coalition for Content Provenance and Authenticity (C2PA) to add labels to AI-generated content (e.g. LinkedIn). These tags are automatically added based on provenance data embedded in the images, as identified by the C2PA process. However, this may easily be circumvented by stripping the metadata from digital files. It must therefore be anticipated that the discussions around the proposed Code of Practice will lead to a range of (potentially conflicting) viewpoints that may require compromises to be reached in certain areas.&nbsp;&nbsp;</p>
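<p>To make the circumvention risk concrete, the sketch below is an illustration only &ndash; the function name is our own, and this is not a description of any platform&#39;s actual implementation. It shows why metadata-based labelling is fragile: JPEG metadata (EXIF, XMP and C2PA-style manifests) lives in ancillary &quot;APPn&quot; segments alongside the image data, so rewriting the file without those segments removes the provenance information while leaving the pixels untouched.</p>

```python
# Illustrative sketch only: JPEG metadata (EXIF, XMP, C2PA manifests) is carried
# in APPn marker segments that sit alongside the compressed image data. Dropping
# those segments yields a valid image with no provenance labels, which is why
# metadata-based labelling is easy to circumvent.

def strip_app_segments(jpeg: bytes) -> bytes:
    """Return the JPEG with all APP1..APP15 metadata segments removed."""
    assert jpeg[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out, i = bytearray(b"\xff\xd8"), 2
    while i + 4 <= len(jpeg) and jpeg[i] == 0xFF:
        marker = jpeg[i + 1]
        if marker == 0xDA:                      # start-of-scan: pixel data follows
            out += jpeg[i:]
            return bytes(out)
        seg_len = int.from_bytes(jpeg[i + 2:i + 4], "big")
        if not (0xE1 <= marker <= 0xEF):        # keep non-metadata segments
            out += jpeg[i:i + 2 + seg_len]
        i += 2 + seg_len                        # skip past this segment
    out += jpeg[i:]                             # tolerate truncated input
    return bytes(out)

# A tiny synthetic JPEG: SOI + APP1 ("Exif" payload) + a quantisation table
# + start-of-scan with dummy pixel data.
app1 = b"\xff\xe1" + (2 + 10).to_bytes(2, "big") + b"Exif\x00\x00meta"
dqt = b"\xff\xdb" + (2 + 3).to_bytes(2, "big") + b"\x00\x01\x02"
sos = b"\xff\xda" + (2 + 1).to_bytes(2, "big") + b"\x00" + b"pixels\xff\xd9"
stripped = strip_app_segments(b"\xff\xd8" + app1 + dqt + sos)
assert b"Exif" not in stripped and b"pixels" in stripped
```

<p>Re-encoding an image through any pipeline that does not copy these segments has the same effect, which is one reason the draft Code of Practice also contemplates techniques such as watermarking that are harder to strip.</p>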

<h2>Practical steps&nbsp;</h2>

<p>While a number of legal measures are available for individuals who find that their likeness or voice has been used in a deepfake (as well as preventative measures to protect against creation in the first place), the framework for taking action remains a complex one, and so we would recommend that anyone impacted seeks specialist legal advice. Those needing support with non-consensual intimate image deepfakes can contact services such as the <a href="https://revengepornhelpline.org.uk/">Revenge Porn Helpline</a>, which provides free assistance with the removal of intimate images, including deepfakes, shared without consent on the internet. The police have also published <a href="https://www.police.uk/advice/advice-and-information/online-safety/online-safety/deepfakes-what-is-a-deepfake/">guidance</a> on reporting potential criminal offences involving deepfakes. &nbsp;&nbsp;</p>

<p>If you would like to discuss issues relating to deepfakes, including how to take action to protect against digital replicas being created and shared, please <a href="https://www.mishcon.com/contact">get in touch</a> with a member of the team.&nbsp;&nbsp;</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/AABCCAAA7AA7YAAAAAAAB6AB7QAP777774AAAJYA5AB7IAIAAA.jpg" length="89913" />
    </item>
    <item>
      <title><![CDATA[Criminal prosecution under data protection laws]]></title>
      <link>https://www.mishcon.com/news/criminal-prosecution-under-data-protection-laws</link>
      <guid>https://www.mishcon.com/news/criminal-prosecution-under-data-protection-laws</guid>
      <description><![CDATA[Where a data controller fails to comply lawfully with a data subject access request (DSAR) under the UK GDPR, that failure will constitute a breach of statutory duty, potentially attracting civil enforcement action by the Information Commissioner’s Office (ICO). However, a recent prosecution illustrates that a failure can, in some circumstances, also be a criminal offence.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Fri, 05 Sep 2025 14:19:00 GMT</pubDate>
      <content:encoded><![CDATA[<h2>In brief&nbsp;</h2>

<ul>
	<li>A care home director has been prosecuted and fined by the Information Commissioner in relation to a Data Subject Access Request (DSAR). This is believed to be the first such prosecution of its type.&nbsp;</li>
	<li>While failure to comply with a DSAR is usually treated as a civil matter, section 173 of the Data Protection Act 2018 makes it a criminal offence &ndash; once a DSAR is received &ndash; to alter, erase, or conceal information to prevent disclosure.&nbsp;</li>
	<li>Criminal liability under section 173 can lie with the data controller itself, or with individual directors and staff, and all organisations should be alive to the possibility of prosecutions being brought.&nbsp;</li>
</ul>

<p>Where a data controller fails to comply lawfully with a data subject access request (DSAR) under the UK GDPR, that failure will constitute a breach of statutory duty, potentially attracting civil enforcement action by the Information Commissioner&rsquo;s Office (ICO). However, a <a href="https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2025/09/care-home-director-found-guilty-of-ignoring-request-for-personal-information/">recent prosecution</a> illustrates that a failure can, in some circumstances, also be a criminal offence.&nbsp;</p>

<p>The right to know how one&rsquo;s personal data is being processed is recognised in law as especially important &ndash; it has been described as a &quot;lynchpin&quot; of data protection law. &nbsp;</p>

<p>When a DSAR is made to a data controller, the controller must supply, normally within one month &ndash; and subject to the application of exemptions &ndash; an explanation of the processing, and copies of the personal data undergoing processing. &nbsp;</p>

<p>Ordinarily, a failure to comply will be treated as a civil wrong, potentially resulting in civil enforcement action by the Information Commissioner (ICO) or civil proceedings by the requester to secure compliance. However, section 173 of the Data Protection Act 2018 (DPA) also provides that a controller (or an employee or officer of the controller) will commit an offence if &ndash; after receiving a DSAR &ndash; it alters, defaces, blocks, erases, destroys or conceals information with the intention of preventing disclosure of all or part of the information that the requester would have been entitled to receive.&nbsp;</p>

<p>The recent prosecution by the ICO &ndash; believed to be the first such section 173 DPA case &ndash; was of a director of Bridlington Lodge, a care home in Yorkshire, who was found to have blocked, erased, or concealed records held by the care home, to prevent this information being disclosed. The request had been made by a woman who had lasting power of attorney over her father&rsquo;s affairs (and so was authorised to make the request on his behalf). &nbsp;</p>

<p>At Beverley Magistrates Court on Wednesday 3 September 2025, the director was convicted and ordered to pay a fine of &pound;1,100 and additional costs of &pound;5,440.&nbsp;</p>

<p>The ICO has informed Mishcon de Reya that the director offered an unsuccessful defence that, variously, claimed that: the information requested had in fact been provided by a member of staff; the care home manager was responsible for responding to the DSAR, not him; the company had been deregistered from the ICO in 2016 (not that this could conceivably have been relevant); Bridlington Lodge was a building, not a data controller.&nbsp;</p>

<p>The ICO also explained to this firm that the requester now has the requested personal data.&nbsp;</p>

<p>The ICO has attracted some criticism in recent times for the relatively low volume of civil enforcement actions it brings. In particular, it has rarely shown a willingness to intervene in DSARs where the requester has been faced with a recalcitrant data controller. Whether this criminal enforcement case indicates a shift in approach is not yet clear &ndash; it may be that the behaviour of the director in this particular case was simply so egregious that it warranted exceptional action. However, all data controllers &ndash; and indeed their employees and directors, who might be directly criminally liable &ndash; should be aware that prosecutions under section 173 of the DPA can be brought, and, in appropriate cases, might well be pursued by the ICO.&nbsp;</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ADOSAAAA7AA7YAAAAAAAB6AB7QAP777774AABNID4IE7CBAAAA.jpg" length="15176" />
    </item>
    <item>
      <title><![CDATA[Online safety: Protection of Children Codes come into force]]></title>
      <link>https://www.mishcon.com/news/online-safety-protection-of-children-codes-come-into-force</link>
      <guid>https://www.mishcon.com/news/online-safety-protection-of-children-codes-come-into-force</guid>
      <description><![CDATA[The Protection of Children Codes (Codes) set out safety measures recommended for providers of Part 3 Services (user-to-user or search engine services) to comply with their duties under the Online Safety Act 2023 (OSA) and come into effect from 25 July 2025.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Thu, 24 Jul 2025 13:50:00 GMT</pubDate>
      <content:encoded><![CDATA[<h2>In brief&nbsp;</h2>

<ul>
	<li>The <a href="https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/statement-protecting-children-from-harms-online">Protection of Children Codes (Codes)</a> set out safety measures recommended for providers of Part 3 Services (user-to-user or search engine services) to comply with their duties under the Online Safety Act 2023 (<strong>OSA</strong>) and come into effect from 25 July 2025.&nbsp;</li>
	<li>Services that implement the recommended measures within the Codes will be treated as complying with their relevant duties under the OSA.&nbsp;</li>
	<li>However, service providers are not required to comply with the Codes to meet their OSA duties. They can implement alternative measures, provided they maintain a record of the measures and explain how these meet the duty.&nbsp;</li>
	<li>These Codes come into force ahead of the deadline of 7 August 2025 for certain services to disclose their <a href="https://www.mishcon.com/news/online-safety-act-deadline-approaches-for-childrens-risk-assessments">children&#39;s risk assessments (CRAs)</a> to Ofcom.&nbsp;</li>
</ul>

<h2>The Codes&nbsp;</h2>

<p>There are two Codes issued by Ofcom (the regulator for the OSA) that set out the safety measures recommended for Part 3 Services, one for <a href="https://www.ofcom.org.uk/siteassets/resources/documents/consultations/category-1-10-weeks/statement-protecting-children-from-harms-online/main-document/protection-of-children-code-of-practice-for-user-to-user-services.pdf?v=399754">user-to-user services</a> and one for <a href="https://www.ofcom.org.uk/siteassets/resources/documents/consultations/category-1-10-weeks/statement-protecting-children-from-harms-online/main-document/protection-of-children-code-of-practice-for-search-services2.pdf?v=399753">search engine services</a>. Where services comply with and implement the recommended measures, the Codes will act as a &quot;safe harbour&quot; and will mean that the services will be treated as having complied with their duties under the OSA in relation to the protection of children from the risk of harm. The Codes do not remove the requirement to complete any of the risk assessments required by the OSA, and services will still be required to complete these and evidence the mitigations that they put in place to reduce risk.&nbsp;</p>

<h2>Measures&nbsp;</h2>

<p>The Codes set out measures by thematic area, and not all services will need to comply with all measures to benefit from the Codes&#39; safe harbour effect. The Codes are not prescriptive, and services can choose to implement other measures provided they maintain a record of the measures implemented and how they consider these to meet the relevant duty.&nbsp;</p>

<p>Where services do not meet the standard outlined in the Codes and fail to evidence and explain their alternative approaches, it is likely that any Ofcom enforcement will require the implementation of the measures specified in the Codes. As Ofcom continues its enforcement efforts under its powers granted by the OSA, we will gain greater insight into how Ofcom will assess alternative measures to those in the Codes.&nbsp;</p>

<h2>CRA disclosure to Ofcom and the public&nbsp;</h2>

<p>To comply with their duty in relation to the completion of CRAs, some of the Categorised Services (Category 1 and Category 2A Services) are expected to submit their completed CRA(s) to Ofcom by <a href="https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/important-dates-for-online-safety-compliance">7 August 2025</a>.&nbsp;&nbsp;</p>

<p>Categorised Services are considered to be &quot;higher risk&quot; services by Ofcom. Category 1 Services are user-to-user services that have a content recommender system and either (i) have more than 34 million UK users, or (ii) have more than seven million UK users and allow those users to forward or share user-generated content. Category 2A Services are search engine services that are not vertical search services and have more than seven million UK users. Category 1 and 2A Services must also publish notices of the findings and conclusions from their completed CRA(s) in a publicly available statement.&nbsp;</p>
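<p>The categorisation thresholds above can be expressed as a simple rule. The sketch below is purely illustrative &ndash; the field and function names are our own shorthand, and the OSA and its secondary legislation, not this logic, remain authoritative:</p>

```python
# Illustrative only: encodes the Category 1 / Category 2A thresholds described
# above. Field and function names are our own shorthand, not terms from the OSA.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Service:
    kind: str                        # "user-to-user" or "search"
    uk_users: int
    has_recommender: bool = False    # content recommender system present?
    allows_forwarding: bool = False  # users can forward/share user content?
    is_vertical_search: bool = False

def category(s: Service) -> Optional[str]:
    """Apply the Category 1 / 2A user-number and functionality conditions."""
    if s.kind == "user-to-user" and s.has_recommender and (
        s.uk_users > 34_000_000
        or (s.uk_users > 7_000_000 and s.allows_forwarding)
    ):
        return "Category 1"
    if s.kind == "search" and not s.is_vertical_search and s.uk_users > 7_000_000:
        return "Category 2A"
    return None                      # not a Categorised Service on these tests

print(category(Service("user-to-user", 40_000_000, has_recommender=True)))  # Category 1
print(category(Service("search", 8_000_000)))                               # Category 2A
```

<p>Note that a service falling outside both categories still owes the underlying OSA duties; categorisation only adds obligations such as the CRA disclosure described above.</p>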
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/AC2DYAAA7AA7YAAAAAAAB6AB7QAP777774AAAYIBSMIEUCAAAA.jpg" length="26312" />
    </item>
    <item>
      <title><![CDATA[Data Protection risks to life: Should more be done?]]></title>
      <link>https://www.mishcon.com/news/data-protection-risks-to-life-should-more-be-done</link>
      <guid>https://www.mishcon.com/news/data-protection-risks-to-life-should-more-be-done</guid>
      <description><![CDATA[The Secretary of State for Defence announced in Parliament, on 16 July, that in February 2022, a “significant data protection breach” relating to the Afghan Relocations and Assistance Policy (ARAP) had resulted in the information of 18,714 applicants - and in some cases their family members - to the ARAP and associated schemes, being mistakenly sent to multiple recipients. Some of the information ultimately ended up on Facebook. It presented such a serious threat that the Government was compelled to apply for what became a super-injunction.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Wed, 16 Jul 2025 12:41:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>The Secretary of State for Defence <a href="https://www.gov.uk/government/speeches/oral-statement-on-afghan-data-breach">announced in Parliament</a>, on 16 July, that in February 2022, a <em>&ldquo;significant data protection breach&rdquo;</em>  relating to the Afghan Relocations and Assistance Policy (ARAP) had resulted in the information of 18,714 applicants - and in some cases their family members - to the ARAP and associated schemes, being mistakenly sent to multiple recipients. Some of the information ultimately ended up on Facebook. It presented such a serious threat that the Government was compelled to apply for what became a super-injunction.&nbsp;</p>

<p>ARAP is the scheme for the resettlement of certain Afghan citizens who worked for, or with, UK Armed Forces during the combat years in Afghanistan. It is evident that information about those who might apply or qualify is of the utmost sensitivity, and that if it fell into the wrong hands it could result in a risk to life. This is why, at enormous cost, as the Secretary of State explained, the UK has had to commit to a specific settlement scheme designed for people not eligible for ARAP, but judged to be at the highest risk of reprisals by the Taliban, as a result of the data breach. &nbsp;</p>

<p>The Secretary of State also told Parliament that the incident occurred when a defence official emailed an ARAP case working file outside of authorised government systems, believing it to contain the details of only 150 applicants. Instead, it contained all 18,714.&nbsp;</p>

<p>The risks of disclosure of <em>&ldquo;hidden&rdquo;</em> data in spreadsheets have been known about for many years. As far back as 2013 the author wrote about it <a href="https://www.theguardian.com/public-leaders-network/blog/2013/sep/06/data-leaks-private-information-risk">for the Guardian</a>. However, the fact that it continues to happen suggests that it is not sufficiently widely known, and that many who use spreadsheets and share data do not have the appropriate policies and controls in place to prevent such incidents. In 2024 the Information Commissioner <a href="https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2024/10/what-price-privacy-poor-psni-procedures-culminate-in-750k-fine/">fined the Police Service of Northern Ireland</a> for a worryingly similar infringement.&nbsp;</p>
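<p>The failure mode is mechanical, and it can be checked for mechanically. An .xlsx file is simply a ZIP archive, and any worksheet marked state=&quot;hidden&quot; (or &quot;veryHidden&quot;) in its xl/workbook.xml is invisible in Excel&#39;s interface but fully present in the file. The sketch below &ndash; an illustration using only the Python standard library, with our own function names, and no substitute for proper data-handling policies &ndash; shows one such pre-sharing check:</p>

```python
# Sketch only: scans an .xlsx (which is just a ZIP archive) for worksheets
# marked hidden. Hidden sheets are invisible in Excel's UI but their contents
# travel with the file -- one way "hidden" data escapes in shared spreadsheets.
import io
import zipfile
import xml.etree.ElementTree as ET

NS = "{http://schemas.openxmlformats.org/spreadsheetml/2006/main}"

def hidden_sheets(xlsx_bytes: bytes) -> list:
    """Return names of sheets whose state is 'hidden' or 'veryHidden'."""
    with zipfile.ZipFile(io.BytesIO(xlsx_bytes)) as zf:
        root = ET.fromstring(zf.read("xl/workbook.xml"))
    return [s.get("name") for s in root.iter(NS + "sheet")
            if s.get("state") in ("hidden", "veryHidden")]

# Stand-in workbook part with one visible and one hidden sheet (the only part
# of a real .xlsx this check needs to read).
workbook_xml = (
    b'<?xml version="1.0"?>'
    b'<workbook xmlns="http://schemas.openxmlformats.org/spreadsheetml/2006/main">'
    b'<sheets>'
    b'<sheet name="Summary" sheetId="1"/>'
    b'<sheet name="AllApplicants" sheetId="2" state="hidden"/>'
    b'</sheets></workbook>'
)
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("xl/workbook.xml", workbook_xml)
print(hidden_sheets(buf.getvalue()))  # ['AllApplicants']
```

<p>Hidden rows and columns, pivot caches and document metadata pose the same risk; the safest control remains exporting only the data actually intended for disclosure.</p>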

<p>The Commissioner is tasked with regulating and enforcing data protection law and has powers to serve fines (to a maximum of &pound;17.5 million, or 4% of global annual turnover, whichever is higher). In a <a href="https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2025/07/ico-statement-in-response-to-2022-mod-data-breach/">parallel statement on 15 July</a>, a Deputy Commissioner explained that the incident involved <em>&ldquo;hidden data in a spreadsheet&rdquo;</em>, and that it was <em>&ldquo;unacceptable&rdquo;</em>. However, the Commissioner was <em>&ldquo;satisfied that no further regulatory action is required at this time in this case&rdquo;</em>.&nbsp;</p>

<p>The Commissioner operates what he calls a <em>&ldquo;public sector approach&rdquo;</em>, under which he will generally fine a public authority only if an infringement is <em>&ldquo;egregious&rdquo;</em>. In documents published earlier this year, on <a href="https://ico.org.uk/about-the-ico/ico-and-stakeholder-consultations/2025/02/ico-consultation-on-the-revised-approach-to-public-sector-regulation/">proposals to extend the public sector approach</a>, the Commissioner&#39;s office explained that the non-exhaustive list of what might qualify as <em>&ldquo;egregious&rdquo;</em> included:&nbsp;</p>

<ul>
	<li>Actual or potential harm to people: this could be physical or bodily harm. For example, evidence of:&nbsp;&nbsp;
	<ul>
		<li>a high risk of actual or potential harm to affected people or their family members, including a threat to life following a data breach;&nbsp;</li>
		<li>a high degree of negligence; and&nbsp;</li>
		<li>relevant previous infringements, or recent infringements, by the controller.&nbsp;</li>
	</ul>
	</li>
</ul>

<p>Given that this recent incident put tens of thousands of lives at risk, was (as the Secretary of State told Parliament) a <em>&ldquo;serious departmental error&rdquo;</em> and a <em>&ldquo;clear breach of strict data protection protocols&rdquo;</em>, and happened around the same time that the Commissioner was fining the Ministry of Defence for <a href="https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2023/12/ico-fines-ministry-of-defence-for-afghan-evacuation-data-breach/">another very serious data breach involving high risk to Afghan citizens</a>, it is very difficult to understand how this incident did not meet the threshold for regulatory action.&nbsp;</p>

<p>But if, for whatever reason, a fine was not felt appropriate, it is also worth noting that the Commissioner has various other powers available, including the power, under section 139(3) of the Data Protection Act 2018, to lay a report before Parliament on any matter relating to the carrying out of his functions.&nbsp;</p>

<p>This data protection issue of hidden data in spreadsheets, which has twice in recent years exposed many thousands of people to the risk of loss of life, is surely something that the ICO should consider worthy of drawing to Parliament&rsquo;s attention by way of a statutory report.&nbsp;</p>

<h3>Related coverage</h3>

<p><a href="https://www.independent.co.uk/news/uk/politics/afghan-data-breach-starmer-information-commisioner-b2790305.html">Independent</a></p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ADFT6AAA7AA7YAAAAAAAB6AB7QAP777774AAB7YASMIEUCAAAE.jpg" length="88694" />
    </item>
    <item>
      <title><![CDATA[Fines for cookie contraventions more likely as a result of law change]]></title>
      <link>https://www.mishcon.com/news/fines-for-cookie-contraventions-more-likely-as-a-result-of-law-change</link>
      <guid>https://www.mishcon.com/news/fines-for-cookie-contraventions-more-likely-as-a-result-of-law-change</guid>
      <description><![CDATA[The Data (Use and Access) Act 2025 (DUAA) will make some significant changes to the enforcement regime for cookies and direct electronic marketing. The increase in the maximum fines, from £500,000 to £17.5 million or 4% of global annual turnover (whichever is higher), has received some attention. However, another change – which will make it much easier for fines to be issued for serious cookie infringements – is potentially of more significance, and it has received surprisingly little notice.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Wed, 02 Jul 2025 11:11:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>The <a href="https://www.mishcon.com/news/how-will-the-data-use-and-access-act-reshape-data-protection">Data (Use and Access) Act 2025 (DUAA)</a> will make some significant changes to the enforcement regime for cookies and direct electronic marketing. The increase in the maximum fines, from &pound;500,000 to &pound;17.5 million or 4% of global annual turnover (whichever is higher), has received some attention. However, another change &ndash; which will make it much easier for fines to be issued for serious cookie infringements &ndash; is potentially of more significance, and it has received surprisingly little notice. &nbsp;</p>
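<p>The &quot;whichever is higher&quot; formulation is simply the greater of two figures. A minimal arithmetic sketch (the function name is illustrative; the figures are those quoted above):</p>

```python
def max_pecr_fine_after_duaa(global_annual_turnover_gbp):
    """Greater of a fixed GBP 17.5m ceiling and 4% of global annual
    turnover, per the DUAA maximum described above (illustrative)."""
    return max(17_500_000, 0.04 * global_annual_turnover_gbp)
```

<p>On those figures, a business with &pound;1 billion of turnover would face a theoretical maximum of &pound;40 million, while the &pound;17.5 million figure governs for any turnover below &pound;437.5 million.</p>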

<p>The <a href="https://www.legislation.gov.uk/uksi/2003/2426/contents">Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR)</a> is the primary law when it comes to cookies and direct electronic marketing. When the GDPR (now the UK GDPR) came into effect in 2018, the Data Protection Act 1998 (DPA98) was repealed, except that its enforcement sections remained in place for PECR infringements.&nbsp;</p>

<p>Under section 55A of DPA98, as modified by PECR, the Information Commissioner&#39;s Office (ICO) may fine a person if it is satisfied that there has been a serious contravention by that person of the cookie provisions in regulation 6 of PECR, that the contravention was of a kind &quot;likely to cause substantial damage or substantial distress&quot;, and that the contravention was either deliberate or the person acted negligently in allowing it to happen. The requirement for a contravention to be likely to cause &quot;substantial damage or distress&quot; also originally applied to direct electronic marketing contraventions, but it was removed by secondary legislation in 2015.&nbsp;</p>

<p>The upshot is that, as the law currently stands, a cookie contravention would have to be both serious <em>and</em> likely to cause substantial damage or substantial distress before the ICO could even consider issuing a fine. It is difficult to imagine that many, if any, contraventions would meet that threshold.&nbsp;</p>

<p>All that is set to change under the DUAA: once its section 115 and schedule 13 are commenced (at a date yet to be announced), both PECR and the Data Protection Act 2018 will be amended so that any contravention of regulation 6 of PECR is potentially subject to a fine. The ICO will still have to have regard to factors such as the nature, gravity and duration, and the intentional or negligent character of the contravention, but there will be no seriousness threshold and no &quot;harm&quot; threshold.&nbsp;</p>
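<p>The change can be summarised as the removal of two of the three cumulative conditions. A sketch of the threshold logic as described above (the predicate and its parameter names are illustrative, not statutory language; real enforcement decisions also involve discretionary factors):</p>

```python
def cookie_fine_threshold_met(serious, likely_substantial_damage_or_distress,
                              deliberate_or_negligent, duaa_commenced):
    """Before DUAA schedule 13 is commenced, all three DPA98 s.55A
    conditions must hold; afterwards, any regulation 6 contravention
    is potentially subject to a fine."""
    if duaa_commenced:
        return True  # no seriousness or "harm" threshold remains
    return (serious
            and likely_substantial_damage_or_distress
            and deliberate_or_negligent)
```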

<p>So, does this mean that, when the amended powers come into effect, the ICO will be issuing a swathe of cookie fines? That seems unlikely: although the ICO has <a href="https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2025/01/ico-takes-action-to-tackle-cookie-compliance-across-the-uk-s-top-1-000-websites/">adopted an &quot;online tracking strategy&quot;</a>, which involves assessing some large websites&#39; compliance, there has been no indication that this strategy will lead to multiple fines being deployed.&nbsp;&nbsp;</p>

<p>However, the possibility cannot be ruled out, especially if the ICO were to encounter cookie contraventions which are serious and egregious.&nbsp;</p>

<p><em>Thanks are due to Tim Turner of <a href="https://2040training.co.uk/">2040 Training</a>, who first drew the author&#39;s attention to this topic.&nbsp;</em></p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ADXTYAAA7AA7YAAAAAAAB6AB7QAP777774AABIAAE4DZIAYAAA.jpg" length="209400" />
    </item>
    <item>
      <title><![CDATA[How will the Data (Use and Access) Act reshape data protection?]]></title>
      <link>https://www.mishcon.com/news/how-will-the-data-use-and-access-act-reshape-data-protection</link>
      <guid>https://www.mishcon.com/news/how-will-the-data-use-and-access-act-reshape-data-protection</guid>
      <description><![CDATA[On 19 June, the Data (Use and Access) Act (DUAA) received Royal Assent. DUAA deals with a number of areas, such as digital verification services and smart-metering, but in this article we consider what changes it will bring in terms of data protection law and practice, and what the impact might be on both organisations and individuals.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Mon, 30 Jun 2025 10:42:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>On 19 June, the Data (Use and Access) Act (DUAA) received Royal Assent. DUAA deals with a number of areas, such as digital verification services and smart-metering, but in this article we consider what changes it will bring in terms of data protection law and practice, and what the impact might be on both organisations and individuals.&nbsp;&nbsp;</p>

<p>The first thing to note is that this is an amending Act: the core pieces of data protection legislation in the UK will remain the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003. The significance of DUAA lies in the changes it will make to those existing laws.&nbsp;</p>

<p>Although the Act will not introduce wholesale changes to data protection law, companies should certainly take note of certain aspects. The large majority of the data protection provisions are not yet in effect, and will require secondary legislation to bring them into force. This secondary legislation (and associated guidance) may not emerge for some months yet.&nbsp;</p>

<p>However, we draw attention to the following key points:&nbsp;</p>

<ul>
	<li>Changes to cookies rules &ndash; once the relevant provisions are commenced, those who operate websites and apps may be able to deploy some &quot;analytics&quot; cookies, without seeking visitors&#39; consent. However, it is important to note that, although the UK law will change in this respect, the EU law will not (at least for the time being). This means that any organisation which is operating in the EU, or even simply making its website or app available in the EU, will have to consider whether there is any benefit to be gained from having different approaches according to the jurisdiction (or even whether it is technically practical to do so);&nbsp;</li>
	<li>Clarification that, when responding to a subject access request, data controllers only need to undertake a &quot;reasonable and proportionate search&quot; (the reference to &quot;proportionality&quot; may well mean that a small organisation, faced with a request of no obvious value for the requester, may need to do a less comprehensive search than a large company dealing with an obviously serious request). This change simply introduces onto the statute book something that the courts stated some 20 years ago, but, nonetheless, it may well be helpful to be able to point to the provision in cases where requesters dispute how a controller has responded to a request. This provision came into force immediately upon enactment of DUAA.&nbsp;</li>
	<li>A requirement to have a &quot;complaints procedure&quot; for data protection matters. Data controllers will, once the provisions have commenced, have to acknowledge complaints within 30 days and respond to them &quot;without undue delay&quot;. Those organisations who already have complaints procedures in place should be able to incorporate data protection matters into existing procedures (although they will need to be aware that, unlike with most consumer complaints, there are statutory deadlines when it comes to data protection matters).&nbsp;</li>
	<li>Charities will, once the provisions have commenced, be able to avail themselves of provisions already available to commercial organisations, and send &quot;unsolicited&quot; marketing emails and text messages to supporters (and those who have expressed an interest in the charity), as long as they offer an &quot;opt-out&quot;. This has the potential to transform the fundraising market. However, charities should note that unlawful direct electronic marketing continues to be one of the areas that the Information Commissioner targets for enforcement: getting it wrong can be risky.&nbsp;</li>
	<li>On a related note, the Information Commissioner will (again, subject to commencement of the provisions) have increased enforcement powers: the maximum fine for direct marketing (and, indeed, cookie infringements) is currently &pound;500,000 &ndash; this will increase to match the UK GDPR maximum of &pound;17.5 million or (in the case of an undertaking) 4% of global annual turnover (whichever is higher).&nbsp;</li>
</ul>

<p>These are by no means the only significant changes, and we will provide further information and updates as the changes take effect. However, organisations should certainly be reviewing their data protection compliance programmes and policies, both to ensure they will be (or remain) compliant, but also to consider whether there are any opportunities to use the data they have to deliver benefits to data subjects and to the organisations themselves.&nbsp;</p>

]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ACWCUAAA7AA7YAAAAAAAB6AB7QAP777774AABAYEAUEYGBAAAA.jpg" length="102315" />
    </item>
    <item>
      <title><![CDATA[Online Safety Act: What you need to know about Illegal Content Risk Assessments]]></title>
      <link>https://www.mishcon.com/news/online-safety-act-what-you-need-to-know-about-illegal-content-risk-assessments</link>
      <guid>https://www.mishcon.com/news/online-safety-act-what-you-need-to-know-about-illegal-content-risk-assessments</guid>
      <description><![CDATA[As of 16 March 2025, all platforms, sites, and apps in scope of Part 3 of the UK's Online Safety Act 2023 (OSA) must assess the risks posed on and by their service that are associated with Priority Offences (as defined in the OSA) and other illegal content by completing an Illegal Content Risk Assessment (ICRA). The completion of an ICRA is a separate duty from the duties to complete a Children's Access Assessment (CAA) and a Children's Risk Assessment (CRA).]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Tue, 17 Jun 2025 11:42:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>As of 16 March 2025, all platforms, sites, and apps in scope of Part 3 of the UK&#39;s Online Safety Act 2023 (<strong>OSA</strong>) must assess the risks posed on and by their service that are associated with Priority Offences (as defined in the OSA) and other illegal content by completing an Illegal Content Risk Assessment (<strong>ICRA</strong>). The completion of an ICRA is a separate duty from the duties to complete a <a href="https://www.mishcon.com/news/online-safety-act-deadline-approaches-for-compliance-with-childrens-access-assessment">Children&#39;s Access Assessment</a> (<strong>CAA</strong>) and a <a href="https://www.mishcon.com/news/online-safety-act-deadline-approaches-for-childrens-risk-assessments">Children&#39;s Risk Assessment</a> (<strong>CRA</strong>).&nbsp;</p>

<h2>What services are in scope?&nbsp;&nbsp;</h2>

<p>The OSA creates a legal obligation on all Part 3 Services to complete an ICRA and outlines certain requirements for the assessment.&nbsp;&nbsp;&nbsp;&nbsp;</p>

<p>Part 3 Services are those which are either:&nbsp;&nbsp;</p>

<ul>
	<li><strong>User-to-user services:</strong> where content is generated, uploaded to, or shared on the service by a user, and may be encountered by another user or users of the service; or&nbsp;&nbsp;</li>
	<li><strong>Search services: </strong>services that are or include a search engine, or have the ability to search websites, databases or other aspects of a service.&nbsp;</li>
</ul>

<h2>What needs to be assessed?&nbsp;</h2>

<p>The requirement to complete an ICRA is set out in the OSA, which outlines the elements needed to complete the assessment. However, the process, form and considerations to be taken into account during the assessment are dictated by <a href="https://www.ofcom.org.uk/siteassets/resources/documents/online-safety/information-for-industry/illegal-harms/risk-assessment-guidance-and-risk-profiles.pdf?v=390984">guidance issued by Ofcom</a>, the online safety regulator, published on 16 December 2024. Ofcom&#39;s guidance follows the same four-stage method that it advises is followed when completing a CRA:&nbsp;</p>

<ul>
	<li>Assess the content on the service which may be priority illegal content, other illegal content or used to commit a Priority Offence (see further below as to Priority Offences);&nbsp;</li>
	<li>Assess the risk of harm to users that encounter the content identified and the risk of harm should the content be used to commit a Priority Offence;&nbsp;</li>
	<li>Identify and implement measures to address the risks identified; and&nbsp;&nbsp;&nbsp;</li>
	<li>Ensure that appropriate reports and monitoring are completed.&nbsp;&nbsp;</li>
</ul>

<details><summary>Stage 1: Assessment of illegal content&nbsp;</summary>

<p>In stage 1, services must determine the types of illegal content likely to be found on their service. This can be a particularly tricky assessment as there are 130 Priority Offences listed in the OSA, which services must consider when completing an ICRA. To assist with this assessment, Ofcom&#39;s guidance narrows the 130 Priority Offences into 17 categories which it refers to as &quot;priority illegal content&quot;.&nbsp;&nbsp;</p>

<p>The guidance requires services to consider for each category of priority illegal content:&nbsp;</p>

<ul>
	<li>The risk of users encountering the content; and&nbsp;</li>
	<li>The risk of the service being used to facilitate or commit a Priority Offence (user-to-user services only).&nbsp;</li>
</ul>

<p>Services are also required to assess the risk of encountering other illegal content, not included in the 17 categories, and if illegal content is identified, complete the same consideration of risk.&nbsp;</p>

<p>Once services have identified the risk of encountering illegal content, they must identify the risk of harm which may occur. Ofcom&#39;s guidance provides Risk Profiles, which identify risk factors associated with Priority Offences, to assist services with determining harms that could arise on their service. Ofcom is clear in its guidance that these lists are not exhaustive, and there should be consideration of other harms and risks when completing an ICRA.&nbsp;</p>

<p>The considerations to be made for user-to-user services are slightly more onerous than for search services, as they must also consider specific types of Child Sexual Exploitation and Abuse (<strong>CSEA</strong>) and Child Sexual Abuse Material (<strong>CSAM</strong>) and the risks associated with this content.&nbsp;</p>
</details>

<details><summary>Stage 2: Assessment of the risk of harm&nbsp;</summary>

<p>Following the completion of stage 1, services should have identified illegal content that may be present on their service and the risk factors which relate to them. In stage 2, services should utilise the list of illegal content and risks of harm identified by their risk factors, and assess the level of harm presented by the service for each of these.&nbsp;</p>

<p>Consideration should be given to:&nbsp;</p>

<ul>
	<li>Existing controls in place to reduce risk of harm;&nbsp;</li>
	<li>Evidence of risk of harm on the service from existing records; and&nbsp;</li>
	<li>Characteristics and functions of the service which might increase the risk of harm.&nbsp;</li>
</ul>

<p>Services should individually assess each category of illegal content identified in stage 1 as reasonably believed, or otherwise likely, to be on the service, in order to ascertain the risk of harm. Ofcom has provided General Risk Level Tables in its guidance to assist with this assessment. During this stage, services should also evaluate the likelihood and impact of harm for each category of illegal content identified in stage 1 by assigning each category a risk level.&nbsp;</p>

<p>CSAM and CSEA content has specific Risk Level Tables within the guidance, and user-to-user services should give special consideration to ensure that risk of harm is suitably assessed.&nbsp;</p>

<p>To assist with the gathering and analysis of evidence, Ofcom provides a list of &quot;inputs&quot; within its guidance which are indicative of the types of evidence to be relied upon during the completion of an ICRA. As with the other reference tables provided by Ofcom, these lists are not exhaustive and services are encouraged to consider other sources of evidence when completing their ICRA.&nbsp;</p>
</details>

<details><summary>Stage 3: Identification and implementation of measures&nbsp;</summary>

<p>Once the level of risk and harm has been identified by stage 2, services should determine appropriate measures that need to be taken to mitigate and reduce the risk of harm.&nbsp;</p>

<p>Services should refer to the <a href="https://www.ofcom.org.uk/siteassets/resources/documents/online-safety/information-for-industry/illegal-harms/illegal-content-codes-of-practice-for-user-to-user-services.pdf">Illegal Content Codes of Practice</a>, which outline recommended measures for services based on their size, function and/or risk level. Some measures in the Codes of Practice are applicable to all services, while others will be relevant only to certain types of risk identified in stage 2. The Codes of Practice operate as a &quot;safe harbour&quot;: if a service chooses to implement all applicable measures set out in the Codes of Practice, Ofcom will consider it to have complied with its duties under the OSA.&nbsp;</p>

<p>Services are not obligated to follow the Codes of Practice, and can choose to implement alternative measures. Ofcom&#39;s guidance encourages services to implement measures in addition to those contained in the Codes of Practice, where they consider it suitable to do so. The guidance also reminds services that any measures taken should not infringe users&#39; right to freedom of expression, or violate privacy or data protection laws.&nbsp;</p>

<p>Where mitigation or other measures are undertaken, services should keep a record of these measures, and explain how these reduce or mitigate harm and comply with the OSA duties.&nbsp;</p>
</details>

<details><summary>Stage 4: Reporting and monitoring&nbsp;</summary>

<p>Once an ICRA is completed, a written record must be kept and reported internally to ensure appropriate governance.&nbsp;&nbsp;</p>

<p>Where services are a <a href="https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/additional-duties-for-categorised-online-services">Category 1 or 2A service</a>, they must also share a copy of their ICRA with Ofcom, as soon as practicable after completion or revision. Category 1 services must include a summary of the ICRA in their Terms of Service. Category 2A services must include a summary of the ICRA in a publicly available statement. Category 1 and 2A services are typically large providers with more than seven million UK users that either allow user-to-user content, have a content recommender system, or are a search service. &nbsp;</p>

<p>The effectiveness of implemented measures also needs to be monitored and adjusted where necessary. The outcome of monitoring should assist with keeping the ICRA up to date, as it will need to be reassessed on an annual basis, or where there is a significant change to the service, which could impact the ICRA that has already been carried out. Significant changes include adjustments to the service&#39;s design or operation, new evidence regarding the risk of harm to children, or new evidence of an increase in the number of children using the service.&nbsp;</p>
</details>


<h2>When must ICRAs be completed or updated?&nbsp;</h2>

<p>For services that are already in operation and in scope of the obligation to complete an ICRA, the assessment should have been completed by 16 March 2025.&nbsp;</p>

<p>For services that come into scope of the obligation to complete an ICRA (whether due to a change in the service or due to the launch of a new service), an ICRA must be completed within the first three months of operation.&nbsp;</p>

<h2>What is the penalty for non-compliance?&nbsp;</h2>

<p>If an appropriate ICRA has not been carried out, Ofcom may use its powers to investigate and could impose a penalty of up to 10% of qualifying worldwide revenue or &pound;18 million, whichever is greater. Ofcom may also require remedial action to be taken. &nbsp;</p>
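<p>The &quot;whichever is greater&quot; ceiling is a simple maximum of two figures. A minimal sketch (the function name is illustrative; the figures are those quoted above):</p>

```python
def max_osa_penalty_gbp(qualifying_worldwide_revenue_gbp):
    """Greater of GBP 18m and 10% of qualifying worldwide revenue,
    per the penalty ceiling described above (illustrative only)."""
    return max(18_000_000, 0.10 * qualifying_worldwide_revenue_gbp)
```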

<p><a href="https://www.mishcon.com/news/online-safety-act-ofcom-illegal-harms-enforcement-action">Ofcom recently set out its plans for enforcement under the OSA</a>, and has begun to announce investigations that it is undertaking. It is expected that Ofcom will continue to actively enforce compliance. Mishcon de Reya has advised several clients on compliance with the OSA, as well as on the commercial and practical implications that it poses, and is available to assist. Please contact a member of our <a href="https://www.mishcon.com/services/the-online-safety-act/team">Online Safety Team</a> if you wish to discuss this further. &nbsp;</p>

]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ACPCKAAA7AA7YAAAAAAAB6AB7QAP777774AAAXQBEEGZCBQAAA.jpg" length="25988" />
    </item>
    <item>
      <title><![CDATA[Charities and the soft opt-in: a change for fundraising emails?]]></title>
      <link>https://www.mishcon.com/news/charities-and-the-soft-opt-in-a-change-for-fundraising-emails</link>
      <guid>https://www.mishcon.com/news/charities-and-the-soft-opt-in-a-change-for-fundraising-emails</guid>
      <description><![CDATA[When the Data (Use and Access) Bill passes, it will amend direct electronic marketing laws to allow charities to send marketing emails to individual supporters, even when they haven't specifically consented to receive them. But what sort of marketing will be permitted? In particular, will charities be able to send unsolicited fundraising emails?]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Wed, 11 Jun 2025 10:51:00 GMT</pubDate>
      <content:encoded><![CDATA[<p class="is-lead">When the <a href="https://bills.parliament.uk/bills/3825">Data (Use and Access) Bill </a>passes, it will amend direct electronic marketing laws to allow charities to send marketing emails to individual supporters, even when they haven&#39;t specifically consented to receive them. But what sort of marketing will be permitted? In particular, will charities be able to send unsolicited fundraising emails?&nbsp;</p>

<p><a href="https://www.legislation.gov.uk/uksi/2003/2426/regulation/22">Regulation 22</a> of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (&quot;PECR&quot;) creates a rule that, in most circumstances, prevents a person from sending unsolicited direct electronic marketing (which these days mostly means email and text messages) to an &quot;individual subscriber&quot; unless the recipient has specifically consented to receive it. An &quot;individual subscriber&quot; is a person who has a contract with the service provider (e.g. the email provider or phone provider), and in most cases it equates to one&#39;s personal email account or phone number.&nbsp;</p>

<p>But there is an exception to the rule: regulation 22(3) says that where the sender obtained the contact details of the recipient in the course of the sale, or negotiations for the sale, of a product or service, then unsolicited marketing can be sent without consent, provided that the recipient was offered an opt-out at the time their details were obtained, and is offered one each time marketing is sent. This exception is commonly referred to as the &quot;soft opt-in&quot;.&nbsp;</p>

<p>Charities, though, do not, in the main, sell products or services, and so have not been able to avail themselves of the soft opt-in. The Data (Use and Access) Bill would change that. Clause 114 says that a charity would be able to send or instigate the sending of direct electronic marketing where it obtained the recipient&#39;s details when they expressed an interest in, or offered support for, its charitable purposes, and where &quot;the sole purpose of the direct marketing is to further one or more of the charity&#39;s charitable purposes&quot;.&nbsp;</p>

<p>So, would an email which solicited funds (and perhaps did nothing else) be permissible? Everything a charity does should further its charitable purpose(s), though sometimes it also has the power to do things that are &#39;incidental&#39; or &#39;ancillary&#39; to this. Fundraising is an activity that charities carry out to raise money to spend on their charitable purposes, not a charitable purpose in itself. It is a way of furthering a charity&#39;s purpose or purposes indirectly, but can indirect furthering of purposes be allowed under clause 114?&nbsp;&nbsp;</p>

<p>Ultimately, it will be for a court to decide (no doubt assisted by guidance from the Information Commissioner which one imagines will be produced once the Bill is passed). But clause 114 was introduced by the government, by an amendment at report stage, and received cross-party support. <a href="https://hansard.parliament.uk/Lords/2025-01-21/debates/24423D96-CD94-4AFB-A47B-DCF3AB3B350B/Data(UseAndAccess)Bill(HL)#contribution-C7A95091-6987-46CA-A48D-E85ABC89B9E4">When proposing the amendment, the minister, Lord Vallance, said:&nbsp;</a></p>

<p>&quot;This amendment will permit charities to send marketing material&mdash;for example, promoting campaigns or <strong>fundraising activities</strong>&mdash;to people who have previously expressed an interest in their charitable purposes, without seeking express consent.&quot; (emphasis added).&nbsp;</p>

<p>This is surely a clear steer as to how the clause should be interpreted, and it would be very surprising if the ICO or a court subsequently saw it differently.&nbsp;</p>

<p>None of this is to say that charities should see this as free rein to send masses of spam emails, however important the cause or urgent the fundraising need. In particular, they should be aware that the opt-out provisions must be complied with, and that the ICO is particularly active in its enforcement of PECR, and quite rigid in its approach. Add to this the fact that the Data (Use and Access) Bill will increase the maximum fine for a PECR infringement from the current &pound;500,000 to &pound;17.5 million or 4% of global annual turnover (whichever is higher), and there will be significant regulatory and legal risk in getting it wrong. But at a time of financial constraint for charities, the loosening of restrictions on fundraising emails should be welcomed by the charity sector.&nbsp;</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ABKBWAAA7AA7YAAAAAAAB6AB7QAP777774AABNICL4K3ACQAAA.jpg" length="75896" />
    </item>
    <item>
      <title><![CDATA[Online Safety Act: deadline approaches for Children's Risk Assessments]]></title>
      <link>https://www.mishcon.com/news/online-safety-act-deadline-approaches-for-childrens-risk-assessments</link>
      <guid>https://www.mishcon.com/news/online-safety-act-deadline-approaches-for-childrens-risk-assessments</guid>
      <description><![CDATA[Our specialists examine how platforms, apps and sites under the UK's Online Safety Act 2023 must complete a Children's Risk Assessment by 24 July 2025, if they are likely to be accessed by children, following a Children's Access Assessment. This requirement is distinct from the duty to complete an Illegal Content Risk Assessment.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Tue, 20 May 2025 17:20:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>By 24 July 2025, all platforms, sites and apps in scope of Part 3 of the UK&#39;s Online Safety Act 2023 (<strong>OSA</strong>) that have identified they are likely to be accessed by children following the completion of a <a href="https://www.mishcon.com/news/online-safety-act-deadline-approaches-for-compliance-with-childrens-access-assessment">Children&#39;s Access Assessment </a>(<strong>CAA</strong>) must complete a Children&#39;s Risk Assessment (<strong>CRA</strong>) to comply with their OSA duties. Providers should be aware that the duty to complete a CRA is separate to the duty to complete an <a href="https://www.mishcon.com/news/online-safety-act-what-you-need-to-know-about-illegal-content-risk-assessments">Illegal Content Risk Assessment</a> (<strong>ICRA</strong>).&nbsp;</p>

<h2>What services are in scope?&nbsp;</h2>

<p>All Part 3 Services that have identified in their CAA a likelihood that their service will be accessed by children are in scope of the requirement to complete a CRA under the OSA.&nbsp;&nbsp;</p>

<p>Part 3 Services are those which are either:&nbsp;</p>

<ul>
	<li><strong>User-to-user services</strong>: where content is generated, uploaded to, or shared on the service by a user, and may be encountered by another user or users of the service; or&nbsp;</li>
	<li><strong>Search services</strong>: services that are or include a search engine, or have the ability to search websites, databases or other aspects of a service.&nbsp;</li>
</ul>

<h2>What needs to be assessed?&nbsp;</h2>

<p>The requirement for a CRA is set out in the OSA. However, the process and form of the assessment are dictated by <a href="https://www.ofcom.org.uk/siteassets/resources/documents/consultations/category-1-10-weeks/statement-protecting-children-from-harms-online/main-document/childrens-risk-assessment-guidance-and-childrens-risk-profiles.pdf?v=396653">guidance issued by Ofcom</a> on 7 May 2025. Ofcom&#39;s guidance follows the same four-stage method that it advises for completing an ICRA:&nbsp;</p>

<ul>
	<li>assess the content on the service which may be harmful to children&nbsp;&nbsp;</li>
	<li>assess the level of risk of harm that the content poses to children&nbsp;&nbsp;</li>
	<li>identify and implement measures to address the risks identified&nbsp;&nbsp;</li>
	<li>ensure that appropriate reports and monitoring are completed&nbsp;</li>
</ul>

<h3>Stage 1: Assessment of harmful content&nbsp;</h3>

<p>In the first stage, consideration must be given to the type of content on the service. Ofcom&#39;s guidance identifies three categories of content that may be harmful:&nbsp;</p>

<ul>
	<li>Primary Priority Content, which includes content that relates to&nbsp;
	<ul>
		<li>pornography;&nbsp;</li>
		<li>suicide;&nbsp;&nbsp;</li>
		<li>self-harm; or&nbsp;</li>
		<li>eating disorders.&nbsp;</li>
	</ul>
	</li>
	<li>Priority Content, which includes a range of content including that which:&nbsp;
	<ul>
		<li>is abusive;&nbsp;</li>
		<li>incites hate;&nbsp;</li>
		<li>encourages violence;&nbsp;</li>
		<li>depicts violence; or&nbsp;</li>
		<li>encourages substance abuse.&nbsp;</li>
	</ul>
	</li>
	<li>Non-designated Content, which includes all other content which presents a material risk of significant harm to children, such as:&nbsp;
	<ul>
		<li>content which creates body stigmas; and&nbsp;</li>
		<li>content that promotes depression.&nbsp;</li>
	</ul>
	</li>
</ul>

<p>The CRA must assess each category of content separately and consider how the assessed service may be used harmfully and the level of risk posed. The CRA should consider Ofcom&#39;s <a href="https://www.ofcom.org.uk/siteassets/resources/documents/consultations/category-1-10-weeks/statement-protecting-children-from-harms-online/main-document/childrens-register-of-risks.pdf?v=396667">Children&rsquo;s Register of Risks </a>and <a href="https://www.ofcom.org.uk/siteassets/resources/documents/consultations/category-1-10-weeks/statement-protecting-children-from-harms-online/main-document/guidance-on-content-harmful-to-children.pdf?v=395445">Guidance on Content Harmful to Children</a>, which will assist in identifying risk factors.&nbsp;</p>

<h3>Stage 2: Assessment of the risk of harm to children&nbsp;</h3>

<p>This stage involves using evidence to assess and assign a level of risk to each category of content based on the likelihood and impact of children encountering the content.&nbsp;&nbsp;</p>

<p>This stage should consider the different ages of children that may access the service and the differing levels of harm they may face. Ofcom has created a series of Children&#39;s Risk Profiles as part of its guidance on CRAs. These Children&#39;s Risk Profiles should be used to assess harm when completing a CRA and contain a series of risk factors and content that Ofcom considers to be indicative of harm to children. This stage of the CRA may involve considering additional factors and characteristics not contained in the Children&#39;s Risk Profiles that may increase the risk of harm. Services should also consider the existing measures in place which may mitigate harm.&nbsp;</p>

<h3>Stage 3: Identification and implementation of measures&nbsp;</h3>

<p>In stage three, services should consider how they may mitigate the risk to children and identify measures that can be taken to comply with the children&#39;s safety duties in the OSA. The <a href="https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/quick-guide-to-childrens-safety-codes">Protection of Children Codes</a> (<strong>Codes</strong>) offer some measures recommended by Ofcom based on a Service&#39;s size, functionality and risk level. Where a service complies with all the applicable measures in the Codes, it will be considered to have complied with its OSA duties.&nbsp;&nbsp;</p>

<p>Services should also consider if there are additional measures that can be taken to reduce risk and implement them as needed.&nbsp;</p>

<h3>Stage 4: Reporting and monitoring&nbsp;</h3>

<p>Once a CRA is completed, services must report on the outcome to senior risk management to ensure appropriate internal governance.&nbsp;</p>

<p>Where services are a <a href="https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/additional-duties-for-categorised-online-services">Category 1 or 2A service</a>, they must also share a copy of their CRA with Ofcom, as soon as is practicable after completion or revision. Category 1 and 2A services are typically large providers with more than seven million UK users which either host user-to-user content, operate a content recommender system, or are a search service.&nbsp;</p>

<p>Services are also required to report non-designated content identified on the service to Ofcom and provide details and examples of the content.&nbsp;&nbsp;</p>

<p>The effectiveness of measures implemented will also need to be monitored and adjusted where necessary. The outcome of monitoring should assist with keeping the CRA up to date, as it will need to be reassessed on an annual basis, or where there is a significant change to the service, which could impact the CRA that has already been carried out. Significant changes include adjustments to the service&#39;s design or operation, new evidence regarding the risk of harm to children or new evidence of an increase in the number of children using the service.&nbsp;</p>

<h2>When must CRAs be completed?&nbsp;</h2>

<p>For services that are already in operation and in scope of the obligation to complete a CRA, the assessment must be completed by 24 July 2025. Providers have been encouraged to begin the assessment process well in advance of the deadline to allow sufficient time for any required changes to be made.&nbsp;</p>

<p>For services that come into scope of the obligation to complete a CRA (whether due to a change in the service or due to the launch of a new service), a CRA must be completed within the first three months of operation.&nbsp;</p>

<h2>What is the penalty for non-compliance?&nbsp;</h2>

<p>If an appropriate CRA has not been carried out, Ofcom may use its powers to investigate and could impose a penalty of up to 10% of qualifying worldwide revenue or &pound;18 million, whichever is greater. Ofcom may also require remedial action to be taken.&nbsp;</p>

<p><a href="https://www.mishcon.com/news/online-safety-act-ofcom-illegal-harms-enforcement-action">Ofcom recently set out its plans for enforcement under the OSA,</a> and has begun to announce investigations that it is undertaking as a result of these plans. It is expected that Ofcom will continue to actively enforce compliance. Mishcon de Reya has advised several clients on compliance with the OSA, as well as the commercial and practical implications that it poses and is available to assist. Please contact a member of our <a href="https://www.mishcon.com/services/the-online-safety-act/team">Online Safety Team</a> if you wish to discuss this further.&nbsp;&nbsp;</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ADBTKAAA7AA7YAAAAAAAB6AB7QAP777774AAAIAFOAEDQBAAAA.jpg" length="20003" />
    </item>
    <item>
      <title><![CDATA[BT is subject to Environmental Information law, says Information Commissioner]]></title>
      <link>https://www.mishcon.com/news/bt-is-subject-to-environmental-information-law-says-information-commissioner</link>
      <guid>https://www.mishcon.com/news/bt-is-subject-to-environmental-information-law-says-information-commissioner</guid>
      <description><![CDATA[The Information Commissioner (ICO) has issued decision notices ruling that British Telecommunications plc and Openreach Limited are public authorities for the purposes of the Environmental Information Regulations 2004 (EIR).]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Fri, 02 May 2025 10:17:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>The Information Commissioner (ICO) has issued <a href="https://www.mishcon.com/download/the-information-commissioner-ico-decision-notices">decision notices</a> ruling that British Telecommunications plc and Openreach Limited are public authorities for the purposes of the <a href="https://www.legislation.gov.uk/uksi/2004/3391/regulation/2">Environmental Information Regulations 2004</a> (EIR).&nbsp;</p>

<p>The EIR are a piece of legislation that runs in parallel to the Freedom of Information Act 2000 (FOIA). They implemented an EU Directive, which in turn gave effect to an international treaty (the &quot;<a href="https://unece.org/environment-policy/public-participation/aarhus-convention/text">Aarhus Convention</a>&quot;) on access to environmental justice. The EIR require public authorities to disclose environmental information (so long as it is held) to any person who requests it, subject to the application of exemptions.&nbsp;</p>

<p>However, deciding which bodies qualify as public authorities under the EIR is sometimes complicated. Any body which is a public authority under FOIA counts, but the EIR also say that bodies which carry out &quot;functions of public administration&quot; will also do so. The test for determining this derives from a case involving water companies which went, pre-Brexit, to the Court of Justice of the European Union, but which was then clarified further by the <a href="https://www.bailii.org/uk/cases/UKUT/AAC/2015/52.html">Upper Tribunal</a>. The test boils down to the following criteria, which are laid out in the ICO decisions:&nbsp;</p>

<ul>
	<li>The body must be doing a task that the state normally does, or would otherwise do.&nbsp;&nbsp;</li>
	<li>The state must have required it to do this task and there must be a statutory basis.&nbsp;&nbsp;&nbsp;</li>
	<li>The task must have an environmental impact.&nbsp;&nbsp;</li>
	<li>The body must have &quot;special powers&quot;, beyond those available in private law, for the purpose of carrying out the task.&nbsp;</li>
</ul>

<p>In deciding that BT and Openreach are public authorities, the ICO noted that the state, acting via Ofcom, issues directions to telecoms providers which then entitles the providers to exercise powers under the Communications Code, and has thereby entrusted BT and Openreach with administrative functions, with a clear basis in statute.&nbsp;&nbsp;</p>

<p>Furthermore, operation of a mobile network involves emission of radio waves, and a cabled network requires laying of cables &ndash; usually underground &ndash; which requires disturbance and movement of earth. All of these affect the elements of the environment.&nbsp;</p>

<p>Finally, the powers exercisable under the Communications Code include the right to install equipment on, over or under land, the right to maintain equipment and the right to enter onto private land if necessary for such purposes. These are not powers available to persons under private law, and so constitute &quot;special powers&quot;.&nbsp;</p>

<p>Accordingly, BT and Openreach are public authorities for the purposes of the EIR, and must comply with requests for environmental information. Although the ICO does not specifically say so, one assumes that this would extend to other telecoms providers.&nbsp;</p>

<p>It is worth mentioning one other point though &ndash; in recent years the ICO has made decisions that a housing association and Heathrow Airport were EIR public authorities. In both cases this was overturned on appeal. It is possible, therefore, that these latest decisions may get challenged.&nbsp;</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/AA6TQAAA7AA7YAAAAAAAB6AB7QAP777774AAAIIBRAG4IBQAAA.jpg" length="162411" />
    </item>
    <item>
      <title><![CDATA[Belgian data protection authority declares FATCA processing illegal]]></title>
      <link>https://www.mishcon.com/news/belgian-data-protection-authority-declares-fatca-processing-illegal</link>
      <guid>https://www.mishcon.com/news/belgian-data-protection-authority-declares-fatca-processing-illegal</guid>
      <description><![CDATA[In a seminal decision in a case brought by the Association of Accidental Americans, the Belgian data protection authority confirmed its previous finding that the FATCA-related processing of sensitive personal and financial data by the tax authorities under the relevant FATCA Agreement is illegal.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Thu, 01 May 2025 16:31:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>In a seminal decision in a case brought by the Association of Accidental Americans, the Belgian data protection authority confirmed its <a href="https://www.dataprotectionauthority.be/citizen/belgian-dpa-prohibits-the-transfer-of-tax-data-of-belgian-accidental-americans-to-the-usa">previous finding</a> that the FATCA-related processing of sensitive personal and financial data by the tax authorities under the relevant FATCA Agreement is illegal.&nbsp;</p>

<p>The judgment follows a small number of claims throughout Europe, including <a href="https://www.mishcon.com/news/filippo-noseda-in-tax-notes-international-on-two-english-judgments-over-fatca">Jenny&#39;s case</a> before the High Court in London. &nbsp;</p>

<p><a href="https://www.mishcon.com/download/24-apr-2025-decision-from-belgian-data-protection-authority">Read our summary of the decision</a>.&nbsp;</p>

<p><a href="https://www.mishcon.com/people/filippo-noseda">Filippo Noseda</a>, the Mishcon de Reya Partner who has been leading the firm&#39;s work on the data protection implications of the US Foreign Accounts Tax Compliance Act (FATCA), as well as the OECD&#39;s Common Reporting Standard (CRS) and beneficial ownership registers commented: <em>&ldquo;The Belgian decision represents a victory for data protection and the Rule of Law in an extremely politicised context.&rdquo; &nbsp;</em></p>

<p><em> &ldquo;Our research shows that the European Commission believed as early as 2012 that processing sensitive personal and financial information under FATCA (the US Foreign Accounts Tax Compliance Act) was disproportionate and that the US offered lower levels of data protection than Europe.  However, following the UK&#39;s signing of the first bilateral Agreement against the negative advice of EU data protection experts, the Commission went along with the idea of &quot;temporary&quot; agreements to be signed by EU Member States.  Once that happened, our research shows that new Commissioners started resisting calls from campaigners and data protection authorities to review the position. We published a <a href="https://www.mishcon.com/download/timeline-2011-2025-eu">timeline </a>and <a href="https://www.mishcon.com/assets/managed/docs/downloads/doc_3370/27%20Sept%20to%20COM%20re%20substantive%20response%202.PDF">chronology </a>of events based on internal EU documents and have been <a href="https://www.mishcon.com/download/aeoi-and-gdpr-access-to-internal-eu-documents">pressing the EU </a>for the disclosure of additional documents that show the true extent of the problem.&nbsp;</em></p>

<p><em> &ldquo;Nobody should engage in tax evasion. However, the fight against tax evasion must be conducted respecting the fundamental rights of compliant citizens. FATCA entails the automatic processing and transfer of sensitive personal data without any indicia of tax evasion, which exposes compliant citizens to serious risks for their data and the Belgian data protection authority confirmed its previous finding that a generalised processing of personal data violates the principle of proportionality enshrined in the EU Charter of fundamental rights and the GDPR.  The Belgian data protection authority also found that the relevant FATCA Agreement violates basic data protection principles, including the principle of purpose limitation and the prohibition against excessive data retention.  What&#39;s more, the decision confirms a previous finding from the UK&#39;s Information Commissioner&#39;s Office that affected citizens are not provided with sufficient information over the existence and extent of data processing, which violates the principle of transparency of data processing.&quot;&nbsp;</em></p>

<p>Filippo added: <em>&quot;This is the second time that the Belgian data protection authority found that FATCA is illegal.  However, a previous decision handed down in May 2023 was appealed by the Belgian State on procedural grounds.  This is reminiscent of the approach taken by HMRC in Jenny&#39;s case.  Instead of engaging with the substance of Jenny&#39;s claim, HMRC mounted a procedural battle aimed at blocking any criticism of HMRC&#39;s handling of FATCA against the better judgment of the European Commission and the EU&#39;s data protection working party that had already reached the conclusions contained in the Belgian decision before HMRC jumped the gun and signed up to FATCA, thus opening the floodgates. Jenny&#39;s case <a href="https://www.mishcon.com/download/fatca-complaint-against-hmrc">remains pending</a> before the UK&#39;s Information Commissioner&#39;s Office.&nbsp;</em></p>

<p>Today&rsquo;s judgment is likely to have practical implications beyond the EU, including in the UK. The UK was the first country to sign a FATCA agreement with the US.  While the UK is no longer a member of the EU, the GDPR continues to survive in the UK, albeit with a new name (UK GDPR). Also, the EU Charter of Fundamental Rights (which no longer applies to the UK) is based on the European Convention on Human Rights, which continues to apply to the UK.&nbsp;</p>

<p>[The European Convention on Human Rights was the <a href="https://www.jerseylaw.je/publications/jglr/Pages/JLR1702_Noseda.aspx">brainchild of Winston Churchill</a> and remains relevant for the UK.  Art. 8 of that Convention introduced a right to privacy aimed at giving citizens back control over their lives after the horrors perpetrated by fascist States during World War II.  While the right to privacy is not absolute, any restriction is subject to the principle of proportionality, and the Belgian decision confirms that the indiscriminate and generalised processing and transfer of data is disproportionate, and thus illegal.  At a time when populism is on the rise and the Rule of Law is under attack in liberal countries, the right to privacy and data protection is a principle <a href="https://www.ft.com/content/d89ebb7f-5efd-41ea-9f0d-aa2db77664db">worth fighting for</a>.]&nbsp;</p>

<p>Mishcon de Reya has been <a href="https://www.mishcon.com/services/transparency-vs-privacy">at the forefront</a> of addressing the data protection implications of systems of automatic exchange of information under FATCA and the CRS, as well as public registers of beneficial ownership.  In 2018, Mishcon Academy, the firm&rsquo;s think-tank, published a report entitled <em>&lsquo;<a href="https://www.mishcon.com/services/the-common-reporting-standard/crs-the-report">The Great Debate: Privacy vs Transparency&rsquo;</a></em>. The firm has also published <a href="https://www.mishcon.com/services/transparency-vs-privacy">its research into internal EU documents</a> and <a href="https://www.mishcon.com/services/fatca/correspondence">its correspondence with the EU</a>, the OECD, as well as the UK Information Commissioner&rsquo;s Office to ensure accountability of the work carried out by public authorities and raise awareness over the underlying data protection issues.  &nbsp;</p>

<p>The firm&#39;s work in this area has been widely reported in the media (including <em>The Financial Times, The Economist, TIME Magazine, Forbes and Bloomberg</em>) and specialised press (including <em><a href="https://www.mishcon.com/news/filippo-noseda-in-tax-notes-international-on-two-english-judgments-over-fatca">Tax Analyst</a></em>  in the US and <em><a href="https://www.mishcon.com/news/filippo-noseda-in-trusts-and-trustees-on-the-quantitative-analysis-of-transparency-measures">Trusts &amp; Trustees</a></em> in Europe). &nbsp;</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/AAZC4AAA7AA7YAAAAAAAB6AB7QAP777774AABAIEA4EYIBAAAA.jpg" length="118972" />
    </item>
    <item>
      <title><![CDATA[Online Safety Act: Deadline approaches for compliance with Children's Access Assessment]]></title>
      <link>https://www.mishcon.com/news/online-safety-act-deadline-approaches-for-compliance-with-childrens-access-assessment</link>
      <guid>https://www.mishcon.com/news/online-safety-act-deadline-approaches-for-compliance-with-childrens-access-assessment</guid>
      <description><![CDATA[By 16 April 2025, all platforms, sites, and apps in scope of Part 3 of the UK's Online Safety Act 2023 (OSA) must assess, by completing a Children's Access Assessment (CAA), whether it is possible for children to access the service they provide.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Fri, 11 Apr 2025 17:16:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>By 16 April 2025, all platforms, sites, and apps in scope of Part 3 of the UK&#39;s Online Safety Act 2023 (<strong>OSA</strong>) must assess, by completing a Children&#39;s Access Assessment (<strong>CAA</strong>), whether it is possible for children to access the service they provide. Ofcom, the online safety regulator, has reminded in-scope providers (both user-to-user services and search engines) of their obligations to complete such CAAs in order to determine whether they will then need to complete <a href="https://www.mishcon.com/news/online-safety-act-deadline-approaches-for-childrens-risk-assessments">Children&#39;s Risk Assessments</a> (<strong>CRAs</strong>) and implement protection measures in order to comply fully with their duties under Part 3 of the OSA.&nbsp;</p>

<h2>What services are in scope?&nbsp;</h2>

<p>All Part 3 services are in scope of the requirement to complete a CAA under the OSA. Part 3 services are services which are either:&nbsp;</p>

<ul>
	<li><strong>User-to-user services</strong>: services where content is generated, uploaded to, or shared on the service by a user, and may be encountered by another user or users of the service; or&nbsp;</li>
	<li><strong>Search services</strong>: services that are or include a search engine, or have the ability to search websites, databases or other aspects of a service.&nbsp;</li>
</ul>

<p>If more than one Part 3 service is provided by a platform, site, or app, separate CAAs must be carried out for each service. If only part of a service is within scope, a CAA must still be carried out.&nbsp;</p>

<h2>What needs to be assessed?&nbsp;</h2>

<p>The requirement for a CAA is set out in the OSA. However, the process and form of the assessment are dictated by <a href="https://www.mishcon.com/download/childrens-access-assessments-guidance">guidance issued by Ofcom</a> on 16 January 2025. Ofcom&#39;s guidance defines two stages of assessment in a CAA, though both stages may not always need to be carried out. The first stage is to consider whether it is possible for children to normally access the service. The second stage is to assess whether the &quot;child user condition&quot; is met.&nbsp;</p>

<h3>Stage 1: Possible for children to access the service?&nbsp;</h3>

<p>Ofcom says that, where no highly effective age assurance (<strong>HEAA</strong>) is in place, it must be concluded that it is possible for children normally to access the service. HEAA is a form of age assurance where robust checks are completed to ensure users are over 18 years of age (or another appropriate age). <a href="https://www.mishcon.com/download/heaa-for-part-3-services">Ofcom has also issued guidance about HEAA for Part 3 Services</a>, including how to identify whether age assurance methods meet the criteria to be classified as HEAA and other considerations such as data protection compliance.&nbsp;&nbsp;</p>

<p>If no HEAA is identified as being in place, services must conclude that children can normally access the service and go on to complete stage 2 of the CAA.&nbsp;</p>

<h3>Stage 2: The child user condition&nbsp;</h3>

<p>This stage contains two parts. If the first part concludes positively that there is a significant number of children who access the service, the second part, which asks whether the service is likely to attract children, does not need to be completed.&nbsp;</p>

<h3>Part 1: Significant number of children&nbsp;</h3>

<p>In this part of the CAA, platforms must determine if a significant number of children are accessing their service. This involves analysing user data to identify the proportion of users who are under 18 and assess whether this is a significant number. The number of children who access the service will be significant if either:&nbsp;</p>

<ul>
	<li>The number of children reflects a significant number in proportion to the total userbase of the service; or&nbsp;</li>
	<li>The number of children is inherently significant.&nbsp;</li>
</ul>

<p>If the data indicates that a significant number of users are children, the service must proceed to complete a CRA and consider whether protection measures need to be implemented. Factors such as user demographics, content type, and marketing strategies should be considered to make an informed determination.&nbsp;</p>

<h3>Part 2: Likely to attract children&nbsp;</h3>

<p>If the first part of stage 2 concludes that a significant number of children do not access the service, the second part must be completed. This involves assessing whether the service is likely to attract children. Considerations that should be made include the nature of the content, the design and functionality of the service, and any marketing or promotional activities that may appeal to children. Services that are designed in a way that is appealing to children, or that feature content popular with younger audiences, are more likely to meet this condition.&nbsp;</p>

<h2>When must CAAs be completed?&nbsp;</h2>

<p>For services that are already in operation and in scope of the obligation to complete a CAA, the assessment must be completed by 16 April 2025. Providers have been encouraged to begin the assessment process well in advance of the deadline to allow sufficient time for any required changes to be made.&nbsp;</p>

<p>For services that, in the future, come into scope of the obligation to complete a CAA (whether due to a change in the service or due to the launch of a new service), a CAA must be completed within the first three months of operation.&nbsp;</p>

<p>Once a CAA is completed, it will need to be re-assessed on an annual basis, or where there is a significant change to the service which could impact the CAA that has already been carried out. Significant changes include adjustments to the service&#39;s design or operation, new evidence regarding the efficacy of age assurance or new evidence of an increase in the number of children using the service.&nbsp;</p>

<h2>When must CRAs be completed?&nbsp;</h2>

<p>If a CRA is needed, it must be completed within three months of the release of Ofcom&#39;s final guidance about completing CRAs. This guidance is expected to be released in April 2025. <a href="https://www.mishcon.com/download/ofcoms-draft-guidance">Ofcom&#39;s draft guidance</a> provides insight into the expected requirements and should be used by services to prepare for when the final guidance is released.&nbsp;</p>

<h2>What is the penalty for non-compliance?&nbsp;</h2>

<p>If an appropriate CAA has not been carried out, Ofcom may use its powers to investigate and could impose a penalty of up to 10% of qualifying worldwide revenue or &pound;18 million, whichever is greater. Ofcom may also require remedial action to be taken.&nbsp;</p>

<p><a href="https://www.mishcon.com/news/online-safety-act-ofcom-illegal-harms-enforcement-action">Ofcom recently set out its plans for enforcement under the OSA</a>, and it is expected that it will continue to actively enforce compliance. Mishcon continues to advise multiple clients on compliance with the OSA, as well as the commercial and practical implications that it poses.&nbsp;&nbsp;</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/AC2DYAAA7AA7YAAAAAAAB6AB7QAP777774AAAYIBSMIEUCAAAA.jpg" length="26312" />
    </item>
    <item>
      <title><![CDATA[Online Safety Act: Ofcom illegal harms enforcement action]]></title>
      <link>https://www.mishcon.com/news/online-safety-act-ofcom-illegal-harms-enforcement-action</link>
      <guid>https://www.mishcon.com/news/online-safety-act-ofcom-illegal-harms-enforcement-action</guid>
      <description><![CDATA[As of 17 March 2025, platforms, sites and apps in scope of the UK's Online Safety Act 2023 (OSA) must take steps to tackle criminal content on their services, with the next set of illegal harms duties coming into force on that date. Ofcom, the online safety regulator, has reminded in-scope service providers (both user-to-user services and search engines) of their obligations, which flow from the work they were required to have done (by 16 March) to carry out 'suitable and sufficient' illegal harms risk assessments.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Fri, 04 Apr 2025 15:05:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>As of 17 March 2025, platforms, sites and apps in scope of the UK&#39;s Online Safety Act 2023 (<strong>OSA</strong>) must take steps to tackle criminal content on their services, with the next set of illegal harms duties coming into force on that date. Ofcom, the online safety regulator, has reminded in-scope service providers (both user-to-user services and search engines) of their obligations, which flow from the work they were required to have done (by 16 March) to carry out <em>&#39;suitable and sufficient&#39;</em>&nbsp; illegal harms risk assessments. Ofcom has <a href="https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/enforcing-the-online-safety-act-platforms-must-start-tackling-illegal-material-from-today/">published</a> details of its illegal harms enforcement programme, which will, over the coming months, see it assess platforms&#39; compliance with their illegal harm obligations and, where necessary, commence targeted enforcement action to achieve industry compliance.&nbsp;</p>

<h2>Ongoing enforcement&nbsp;</h2>

<p>Alongside enforcement action following illegal harm risk assessments, Ofcom has launched two further enforcement programmes, tackling those areas it considers to be key priorities. These programmes signal its intention, as the online safety regulator, to ensure compliance with the OSA, as well as its <a href="https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/statement-protecting-people-from-illegal-harms-online/">illegal content Codes of Practice</a>. The areas subject to ongoing enforcement activity are:&nbsp;</p>

<ul>
	<li><u>Age assurance measures in the adult sector:</u> From 17 January 2025, Ofcom has been writing to all service providers that display or publish pornographic content to inform them of their obligations under the OSA and to request confirmation of the age assurance measures they are implementing to achieve compliance. Ofcom has also signalled its intentions in this area, announcing on 27 March 2025 that it had fined the provider of <a href="https://www.ofcom.org.uk/online-safety/protecting-children/ofcom-fines-provider-of-onlyfans-1.05-million">OnlyFans &pound;1.05 million</a> for failing to respond accurately to requests for information about its age assurance practices. Whilst the enforcement action against OnlyFans was commenced under the previous video-sharing platform (VSP) regime, it provides an indication of the approach that Ofcom is likely to take under the OSA.&nbsp;&nbsp;</li>
	<li><u>Dissemination of child sexual abuse material (CSAM) by offenders:</u> The level of harm from CSAM, and the risk it poses, is acute. On 17 March 2025, given the high-risk nature of file-sharing and file-storage platforms and their susceptibility to being used for the sharing and distribution of CSAM, Ofcom also announced <a href="https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/enforcement-programme-into-measures-being-taken-by-file-sharing-and-file-storage-services-to-prevent-users-from-encountering-or-sharing-child-sexual-abuse-material-csam/">the launch of an enforcement programme</a> that will require providers to demonstrate how they are tackling this issue.&nbsp;</li>
</ul>

<p>These enforcement programmes are aimed at assessing the safety measures that are being taken, or that will soon be taken, in relation to the identified risks around adult content and CSAM. It is likely that Ofcom will announce further enforcement programmes in the coming months, and in-scope platforms should be prepared to evidence to Ofcom their compliance across all the risks and duties the OSA places on them.&nbsp;</p>

<p>If in-scope platforms fail to engage with its enforcement programmes, Ofcom has confirmed that it will not hesitate to open investigations into individual services and, if required, use its <a href="https://www.mishcon.com/news/online-safety-onlyfans-fine-and-the-future-of-ofcom-enforcement">enforcement powers</a>. As its Enforcement Director commented: <em>&quot;Any provider who fails to include the necessary protections can expect to face the full force of our enforcement action&quot;</em>. Ofcom&#39;s enforcement powers under the OSA are significant: they include the ability to issue fines of up to 10% of worldwide turnover or &pound;18 million, whichever is greater, as well as extensive information-gathering powers. Certain infringements of the OSA may lead to criminal liability (including for senior managers) and to business disruption measures.&nbsp;&nbsp;</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/AAGDGAAA7AA7YAAAAAAAB6AB7QAP777774AAACAAXMHF4BYAAA.jpg" length="96745" />
    </item>
    <item>
      <title><![CDATA[Mishcon de Reya acts for journalist in environmental information precedent case]]></title>
      <link>https://www.mishcon.com/news/mishcon-de-reya-acts-for-journalist-in-environmental-information-precedent-case</link>
      <guid>https://www.mishcon.com/news/mishcon-de-reya-acts-for-journalist-in-environmental-information-precedent-case</guid>
      <description><![CDATA[Jon Baines, who, along with partner Adam Rose, heads up Mishcon's Freedom of Information practice, recently acted for investigative journalist Lucas Amin of the "Democracy for Sale" newsletter.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Thu, 03 Apr 2025 15:28:00 GMT</pubDate>
      <content:encoded><![CDATA[<p><a href="https://www.mishcon.com/people/jon-baines">Jon Baines</a>, who along with partner <a href="https://www.mishcon.com/people/adam-rose">Adam Rose</a>, heads up Mishcon&#39;s <a href="https://www.mishcon.com/services/freedom-of-information">Freedom of Information practice</a>, recently acted for investigative journalist <a href="https://muckrack.com/lucas-amin">Lucas Amin of the &quot;Democracy for Sale&quot;</a> newsletter. Mr Amin has been seeking clarity on whether the <a href="https://www.aria.org.uk/">Advanced Research and Information Agency</a> (ARIA) is subject to the Environmental Information Regulations 2004 (EIR), and, if it is, to get disclosure of information about research grants.&nbsp;<br />
&nbsp;<br />
When ARIA was set up in January 2023, it was excluded from the Freedom of Information Act 2000 (FOIA). At the time, the Campaign for Freedom of Information said, &quot;<em>It is extraordinary that a body responsible for spending &pound;800 million of public funds&hellip; should be freed from the scrutiny that applies to the whole public sector</em>&quot;.&nbsp;<br />
&nbsp;<br />
However, the EIR, which run in parallel to FOIA, did not expressly exclude ARIA from their ambit, and so Mr Amin requested grant information about its &quot;<a href="https://www.aria.org.uk/opportunity-spaces/scoping-our-planet/scoping-our-planet">Scoping our Planet</a>&quot; project, which seeks to provide funding in order to &quot;fill gaps in Earth system measurement to respond confidently to the climate crisis&quot;.&nbsp;<br />
&nbsp;<br />
Perhaps surprisingly, ARIA did not dispute that it was subject to the EIR, but, extraordinarily, did dispute that the information sought was &quot;environmental&quot;. Mishcon de Reya assisted Mr Amin in making a complaint to the Information Commissioner&#39;s Office (ICO). During the ICO&#39;s investigation, ARIA disclosed much of the requested information to Mr Amin and, in March 2025, the ICO published a formal <a href="https://www.mishcon.com/download/ico-decision-notice">decision notice</a> which upheld Mr Amin&#39;s complaint. The decision notice establishes, for the first time, that ARIA is subject to the EIR, and that it breached the EIR when responding to Mr Amin&#39;s request.&nbsp;<br />
&nbsp;<br />
Commenting on the case, Jon Baines said: &quot;<em>I think Parliament got it wrong by excluding ARIA from the scope of FOIA, but it&#39;s always been clear that they must still be subject to the EIR. It&#39;s good that the ICO agrees, and that Lucas has managed to get this mark down in the sand</em>&quot;.&nbsp;</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/AAMSOAAA7AA7YAAAAAAAB6AB7QAP777774AABAYEAUEYGBAAAA.jpg" length="28174" />
    </item>
    <item>
      <title><![CDATA[UK Government launches AI Opportunities Action Plan]]></title>
      <link>https://www.mishcon.com/news/uk-government-launches-ai-opportunities-action-plan</link>
      <guid>https://www.mishcon.com/news/uk-government-launches-ai-opportunities-action-plan</guid>
      <description><![CDATA[On 13 January 2025, the UK's Prime Minister, Sir Keir Starmer, gave a speech designed to showcase the new AI Opportunities Action Plan.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Tue, 14 Jan 2025 14:00:00 GMT</pubDate>
      <content:encoded><![CDATA[<p paraeid="{26629a38-f0f1-4435-9a74-25f9cb310e09}{217}" paraid="765946685"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">On 13 January 2025, the UK&#39;s Prime Minister, Sir Keir Starmer, gave&nbsp;a </span><a href="https://www.gov.uk/government/speeches/pm-speech-on-ai-opportunities-action-plan-13-january-2025" rel="noreferrer noopener" target="_blank"><span data-contrast="none" lang="EN-GB" xml:lang="EN-GB"><span data-ccp-charstyle="Hyperlink">speech</span></span></a><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB"> designed to showcase the new </span><a href="https://www.gov.uk/government/publications/ai-opportunities-action-plan/ai-opportunities-action-plan" rel="noreferrer noopener" target="_blank"><span data-contrast="none" lang="EN-GB" xml:lang="EN-GB"><span data-ccp-charstyle="Hyperlink">AI Opportunities Action Plan</span></span></a><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">. This was commissioned by the Government and led by Matt Clifford CBE, a tech entrepreneur and the Chair of the Advanced Research and Invention Agency (ARIA).&nbsp; Containing 50 recommendations for the Government to implement, the Action Plan&#39;s broad objective is to make the UK <em>&quot;</em></span><em><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">one of the great AI superpowers</span></em><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB"><em>&quot;</em>, building on its existing status as the third largest AI market in the world, behind the USA and China.&nbsp;</span><span data-ccp-props="{'201341983':0,'335559739':120,'335559740':240}">&nbsp;</span></p>

<div>
<p paraeid="{26629a38-f0f1-4435-9a74-25f9cb310e09}{241}" paraid="1499846349"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">The Action Plan, the Government&#39;s response to which is accessible </span><a href="https://www.gov.uk/government/publications/ai-opportunities-action-plan-government-response/ai-opportunities-action-plan-government-response" rel="noreferrer noopener" target="_blank"><span data-contrast="none" lang="EN-GB" xml:lang="EN-GB"><span data-ccp-charstyle="Hyperlink">here</span></span></a><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">, gives rise to significant implications for AI developers, rights holders, and individuals as regards intellectual property, data protection, and the overall direction of travel in the UK concerning the regulation of AI-powered technologies.</span><span data-ccp-props="{'201341983':0,'335559739':240,'335559740':240}">&nbsp;</span></p>
</div>

<div>
<h2 paraeid="{69a1c038-408c-4cbd-8963-01ae35bdfb37}{1}" paraid="1094478692"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">What does the AI Opportunities Action Plan propose?</span><span data-ccp-props="{'201341983':0,'335559739':120,'335559740':240}">&nbsp;</span></h2>
</div>

<div>
<p paraeid="{69a1c038-408c-4cbd-8963-01ae35bdfb37}{9}" paraid="1732032828"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">Key takeaways from the Action Plan include the following:</span><span data-ccp-props="{'201341983':0,'335559739':120,'335559740':240}">&nbsp;</span></p>
</div>

<div>
<ul>
	<li paraeid="{69a1c038-408c-4cbd-8963-01ae35bdfb37}{21}" paraid="556267115"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB"><strong>Expanding the capacity of the AI Research Resource (AIRR)</strong>. </span><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">The Action Plan envisages increasing the capacity of the AIRR&mdash;a cluster of advanced supercomputers for AI research&mdash;twentyfold by the end of the decade. It recommends the establishment of AI Growth Zones (AIGZs), areas with enhanced power access and fast-tracked planning regulations intended to accelerate the spread of AI infrastructure in the UK. The first is intended to be in Culham, Oxfordshire. Already home to the UK Atomic Energy Authority, the site is earmarked for the development of a pilot AI data centre beginning with 100MW of capacity, potentially scaling up to 500MW.</span><span data-ccp-props="{'201341983':0,'335559739':120,'335559740':240}">&nbsp;</span></li>
</ul>
</div>

<div>
<ul>
	<li paraeid="{69a1c038-408c-4cbd-8963-01ae35bdfb37}{41}" paraid="1994623660"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB"><strong>Attracting AI talent.</strong> </span><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">To bridge the estimated gap between supply and demand of skilled AI professionals, new educational pathways will be explored. While the Action Plan also advises of changes to the existing immigration system to attract the required talent, the Government&#39;s response does not indicate full agreement with this aspect of the recommendation, stressing that the UK already offers a variety of visa routes for this purpose.</span><span data-ccp-props="{'201341983':0,'335559739':120,'335559740':240}">&nbsp;</span></li>
</ul>
</div>

<div>
<ul>
	<li paraeid="{69a1c038-408c-4cbd-8963-01ae35bdfb37}{53}" paraid="1099836118"><strong><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">Adopting AI in the public sector.</span></strong><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB"> Large-scale adoption of AI within the public sector is recommended not only to maximise productivity but also to encourage private-sector adoption.&nbsp; Examples include medics using AI to complete forms and reports and teachers using these technologies to plan lessons. A range of use cases are to be piloted, with the best deployed nationwide.</span><span data-ccp-props="{'201341983':0,'335559739':120,'335559740':240}">&nbsp;</span></li>
</ul>
</div>

<div>
<ul>
	<li paraeid="{69a1c038-408c-4cbd-8963-01ae35bdfb37}{65}" paraid="919652787"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB"><strong>Unlocking the UK&#39;s data assets</strong>. </span><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">This includes revisiting the UK&#39;s legal framework concerning copyright-protected assets that might be used by AI companies to train AI models and creating a new National Data Library (NDL) that comprises public-sector data, potentially including health-related data from the NHS. The Action Plan additionally recommends establishing a copyright-cleared UK media asset training data set for licensing internationally, which might include content from institutions such as the BBC, the National Archives, and the British Library.&nbsp;</span><span data-ccp-props="{'201341983':0,'335559739':120,'335559740':240}">&nbsp;</span></li>
</ul>
</div>

<div>
<ul>
	<li paraeid="{69a1c038-408c-4cbd-8963-01ae35bdfb37}{81}" paraid="2006752203"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB"><strong>Adopting a pro-innovation approach.</strong> </span><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">Although the Action Plan initially refers to this particular principle in terms of <em>&quot;</em></span><em><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">enabling safe and trusted AI development and adoption through regulation, safety and assurance</span></em><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB"><em>&quot;</em>, it is at pains to emphasise opportunities for maximising economic growth and innovation, urging the Government to ask itself whether each recommended action <em>&quot;</em></span><em><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">benefit[s] people and organisations trying to do new and ambitious things in the UK</span></em><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB"><em>&quot;</em> and warning that, if the answer to this is negative, then untapped potential may not be realised.</span><span data-ccp-props="{'201341983':0,'335559685':714,'335559739':240,'335559740':240,'335559991':357}">&nbsp;</span></li>
</ul>
</div>

<div>
<h2 paraeid="{69a1c038-408c-4cbd-8963-01ae35bdfb37}{97}" paraid="1001459246"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">How will this affect AI regulation in the UK?</span><span data-ccp-props="{'201341983':0,'335559739':120,'335559740':240}">&nbsp;</span></h2>
</div>

<div>
<p paraeid="{69a1c038-408c-4cbd-8963-01ae35bdfb37}{103}" paraid="1484211634"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">While the Action Plan reaffirms support of the UK&#39;s Artificial Intelligence Safety Institute, it does represent a shift in emphasis towards underlining AI technologies&#39; potential to boost economic growth. The Government&#39;s broad agreement to the Action Plan indicates that the UK will not pursue wholesale regulation of these technologies on a statutory basis. With the UK no longer part of the EU, it understandably makes little political sense for the Government to adopt an approach equivalent to the EU AI Act, the world&#39;s first comprehensive and legally binding framework for AI development and use. Additionally, Britain&#39;s sluggish economic growth means there is little appetite to create further deterrents to potential investors by way of regulation. It may be no coincidence that the Prime Minister announced the Action Plan on the same day the pound hit a 14-month low against the US dollar.&nbsp;</span><span data-ccp-props="{'201341983':0,'335559739':120,'335559740':240}">&nbsp;</span></p>
</div>

<div>
<p paraeid="{69a1c038-408c-4cbd-8963-01ae35bdfb37}{117}" paraid="1234470254"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">The Action Plan reinforces its pro-innovation agenda by going as far as to state that regulators entrusted with oversight should be asked to report on how they have helped to foster AI-driven growth. Further, it adds that if those reports demonstrate that regulation is in fact stifling innovation, the Government should consider transferring the relevant regulatory powers to a central body with <em>&quot;</em></span><em><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">higher risk tolerance</span></em><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB"><em>&quot;</em>.&nbsp;</span><span data-ccp-props="{'201341983':0,'335559739':120,'335559740':240}">&nbsp;</span></p>
</div>

<div>
<p paraeid="{69a1c038-408c-4cbd-8963-01ae35bdfb37}{135}" paraid="564451442"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">It is therefore clear that, while alignment with the EU&#39;s approach regarding text and data mining (as discussed below) may be explored by the Government, any attempt to mimic the EU AI Act in the UK is now unlikely. Nonetheless, UK businesses wishing to export AI technologies to the EU will still be bound by the requirements of the EU AI Act, whose effects may, in this way, still be felt in Britain.&nbsp;</span><span data-ccp-props="{'201341983':0,'335559739':240,'335559740':240}">&nbsp;</span></p>
</div>

<div>
<h2 paraeid="{69a1c038-408c-4cbd-8963-01ae35bdfb37}{145}" paraid="2517431"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">What does this mean for the UK&#39;s intellectual property (IP) framework concerning content used to train AI models?</span><span data-ccp-props="{'201341983':0,'335559739':120,'335559740':240}">&nbsp;</span></h2>
</div>

<div>
<p paraeid="{69a1c038-408c-4cbd-8963-01ae35bdfb37}{157}" paraid="662086323"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">There is an ongoing tension between the need for AI developers to use a significant number, and broad range, of materials as training data for their models and the interests of IP rights holders and creators. The current text and data mining (TDM) exception in UK copyright law only covers the use of copyright-protected materials for non-commercial research, and the previous Conservative Government&#39;s attempts to broaden its scope (ie a broad TDM exception with no opt-out for rights holders) were abandoned, with attempts to broker a voluntary resolution subsequently failing.  </span><span data-ccp-props="{'201341983':0,'335559739':120,'335559740':240}">&nbsp;</span></p>
</div>

<div>
<p paraeid="{69a1c038-408c-4cbd-8963-01ae35bdfb37}{171}" paraid="1791532826"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">As discussed in </span><a href="https://www.mishcon.com/news/uk-government-consultation-on-copyright-and-ai-a-win-win" rel="noreferrer noopener" target="_blank"><span data-contrast="none" lang="EN-GB" xml:lang="EN-GB"><span data-ccp-charstyle="Hyperlink">our December article</span></span></a><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">, the UK Government has issued a fresh consultation on this subject, proposing a TDM exception for commercial purposes, subject to an opt-out right by rights holders.&nbsp; Although stakeholders can contribute to the consultation until 25 February 2025, the Action Plan considers the current position unsatisfactory, claiming it is <em>&quot;</em></span><em><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">hindering innovation and undermining our broader ambitions for AI</span></em><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB"><em>&quot;</em>. This indicates a clearer desire for the UK to be at least as competitive as the EU as regards making data sets accessible to AI developers.</span><span data-ccp-props="{'201341983':0,'335559739':120,'335559740':240}">&nbsp;</span></p>
</div>

<div>
<p paraeid="{69a1c038-408c-4cbd-8963-01ae35bdfb37}{190}" paraid="1187553000"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">The Government&#39;s published response to this aspect of the Action Plan simply states that the issue is open for consultation. However, the fact that </span><a href="https://www.gov.uk/government/news/prime-minister-sets-out-blueprint-to-turbocharge-ai" rel="noreferrer noopener" target="_blank"><span data-contrast="none" lang="EN-GB" xml:lang="EN-GB"><span data-ccp-charstyle="Hyperlink">its own publicity materials</span></span></a><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB"> refer to the Prime Minister<em> &quot;</em></span><em><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">agreeing to take forward all </span><a href="https://www.gov.uk/government/publications/ai-opportunities-action-plan-government-response/ai-opportunities-action-plan-government-response" rel="noreferrer noopener" target="_blank"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">50 recommendations</span></a></em><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB"><em>&quot;</em> has already left some rights holders concerned as to whether responses to the consultation will be heeded. 
Within hours of the Action Plan&#39;s publication, for example, </span><a href="https://www.publishers.org.uk/publishers-associations-response-to-ai-opportunities-action-plan/" rel="noreferrer noopener" target="_blank"><span data-contrast="none" lang="EN-GB" xml:lang="EN-GB"><span data-ccp-charstyle="Hyperlink">the CEO of the Publishers Association</span><span data-ccp-charstyle="Hyperlink"> </span><span data-ccp-charstyle="Hyperlink">cautioned</span></span></a><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB"> that the outcome of the consultation should not be a </span><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">fait accompli</span><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">, adding his opinion that </span><em><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">&quot;[t]he UK can [&hellip;] seize all the growth opportunities associated with AI without facilitating a US tech-led heist of UK copyright work</span></em><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB"><em>&quot;</em>.</span><span data-ccp-props="{'201341983':0,'335559739':240,'335559740':240}">&nbsp;</span></p>
</div>

<div>
<h2 paraeid="{7c47b7e8-3212-4658-98c5-b7456fe538b8}{32}" paraid="1497572338"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">Does this address the use of NHS health data in an AI context?</span><span data-ccp-props="{'201341983':0,'335559739':120,'335559740':240}">&nbsp;</span></h2>
</div>

<div>
<p paraeid="{7c47b7e8-3212-4658-98c5-b7456fe538b8}{38}" paraid="1009681618"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">The NHS is one of the world&#39;s largest single sources of patient-level health data. It is no surprise, therefore, that its data assets are very attractive for AI developers that have already exhausted various pools of training data.&nbsp;</span><span data-ccp-props="{'201341983':0,'335559739':120,'335559740':240}">&nbsp;</span></p>
</div>

<div>
<p paraeid="{7c47b7e8-3212-4658-98c5-b7456fe538b8}{48}" paraid="766136235"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">While this is one opportunity that affords the UK a unique selling point compared with other competing economies, any move which might involve large technology companies accessing NHS data is not without political or legal risk.&nbsp;</span><span data-ccp-props="{'201341983':0,'335559739':120,'335559740':240}">&nbsp;</span></p>
</div>

<div>
<p paraeid="{7c47b7e8-3212-4658-98c5-b7456fe538b8}{80}" paraid="1948618987"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">To alleviate these concerns, the Action Plan envisages holding the relevant data sets in the NDL, with the Prime Minister clarifying in his speech that such data would be anonymised.&nbsp; If the data is truly anonymised such that no individual can be re-identified, its use would fall outside of the scope of data protection law. Nonetheless, in response to the Government&#39;s announcement, the Information Commissioner&#39;s Office has </span><a href="https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2025/01/statement-in-response-to-ai-action-plan/" rel="noreferrer noopener" target="_blank"><span data-contrast="none" lang="EN-GB" xml:lang="EN-GB"><span data-ccp-charstyle="Hyperlink">stressed</span></span></a><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB"> that </span><em><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">&quot;[d]ata protection is essential to realising this opportunity and ensuring that the public can have trust in AI</span></em><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB"><em>&quot;</em>.</span><span data-ccp-props="{'201341983':0,'335559739':240,'335559740':240}">&nbsp;</span></p>
</div>

<div>
<p paraeid="{7c47b7e8-3212-4658-98c5-b7456fe538b8}{107}" paraid="267837157"><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB">There also remain questions regarding the feasibility of some of the proposals, not least in terms of how some aspects&mdash;such as the expansion of the AIRR&#39;s capacity&mdash;will be funded given the current economic climate. However, as mentioned above, the </span><a href="https://www.gov.uk/government/publications/ai-opportunities-action-plan-government-response/ai-opportunities-action-plan-government-response" rel="noreferrer noopener" target="_blank"><span data-contrast="none" lang="EN-GB" xml:lang="EN-GB"><span data-ccp-charstyle="Hyperlink">Government&#39;s response to the Action Plan</span></span></a><span data-contrast="auto" lang="EN-GB" xml:lang="EN-GB"> indicates that it broadly agrees with all recommendations.</span><span data-ccp-props="{'201341983':0,'335559740':240}">&nbsp;</span></p>
</div>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/AB3DUAAA7AA7YAAAAAAAB6AB7QAP777774AAAVAA5AB7IAIAAI.jpg" length="86926" />
    </item>
    <item>
      <title><![CDATA[The new Data Bill - unfair to non-profits?]]></title>
      <link>https://www.mishcon.com/news/the-new-data-bill-unfair-to-non-profits</link>
      <guid>https://www.mishcon.com/news/the-new-data-bill-unfair-to-non-profits</guid>
      <description><![CDATA[The new Data (Use and Access) Bill could have provided an opportunity for the Government to show its commitment to a planned new "covenant" with the charity sector, by reviving a proposal to change the law and allow charities and other nonprofits to promote their services by email and text message, in the same way that profit-making companies can.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Wed, 06 Nov 2024 10:16:00 GMT</pubDate>
      <content:encoded><![CDATA[<p><em>The Government has subsequently tabled <a href="https://www.mishcon.com/news/data-use-and-access-bill-amendment-good-news-for-charities">an amendment to the Bill </a>which would extend the soft opt-in to charities.</em></p>

<p>The new <a href="https://www.mishcon.com/news/data-protection-reform-right-back-on-the-agenda">Data (Use and Access) Bill </a>could have provided an opportunity for the Government to show its commitment to a planned <a href="https://www.civilsociety.co.uk/news/covenant-agreement-between-charities-and-government-to-launch-in-the-new-year.html">new &quot;covenant&quot; with the charity sector</a>, by reviving a proposal to change the law and allow charities and other non-profits to promote their services by email and text message, in the same way that profit-making companies can. The previous Data Protection and Digital Information Bill, which was dropped just before the general election, proposed to extend what is known as the electronic direct marketing &ldquo;soft opt-in&rdquo; to non-profits. However, a similar clause does not appear in the new Bill.</p>

<p>Regulation 22 of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (&quot;PECR&quot;) deals with the circumstances under which a person can send an unsolicited direct marketing communication by email or text message.</p>

<p>In simple terms, a person cannot send an unsolicited direct marketing email or text message to an individual&rsquo;s private email account, unless the individual has consented to receive it. &ldquo;Consent&rdquo;, here, takes its definition from the UK GDPR.</p>

<p>(The actual law is more complex &ndash; it talks of an &ldquo;individual subscriber&rdquo;. This is the person who is a party to a contract with a provider of public electronic communications (for which, read &ldquo;email&rdquo; and &ldquo;text message&rdquo;) services for the supply of such services. So, if you have signed up for, say, a Gmail account, you have a contract with Google, and you are &ndash; if you are an individual &ndash; an individual subscriber.)</p>

<p>The exception to the consent requirement is at regulation 22(3) of PECR, under which the sender does not need the prior consent of the recipient where:</p>

<ul>
	<li>the sender obtained the contact details of the recipient in the course of the sale, or negotiations for the sale, of a product or service to that recipient;</li>
	<li>the direct marketing is in respect of the sender&rsquo;s similar products and services only; and</li>
	<li>the recipient has been given a simple means of refusing the use of their contact details for the purposes of such direct marketing, both at the time the details were initially collected and at the time of each subsequent communication.</li>
</ul>

<p>This exception has long (and perhaps unhelpfully) been known as the &ldquo;soft opt-in&rdquo;.</p>

<p>Note though, that the recipient&rsquo;s contact details must have been collected &ldquo;in the course of the sale or negotiations for the sale of a product or service&rdquo;.</p>

<p>There are various types of non-profit who might wish to send promotional emails and text messages to individuals, but which don&rsquo;t as a rule sell products or services. Perhaps the most obvious of these are charities, but political parties also fall into this category.</p>

<p>The Information Commissioner has <a href="https://ico.org.uk/for-organisations/direct-marketing-and-privacy-and-electronic-communications/guide-to-pecr/electronic-and-telephone-marketing/#directmarketing">long held</a> that promotional communications sent by such non-profits do constitute &ldquo;marketing&rdquo; (and the Information Tribunal has upheld this).</p>

<p>But the combined effect of regulation 22(3), and the interpretation of &ldquo;marketing&rdquo; as covering promotional emails and text messages sent by charities, is that those charities (and political parties etc.) cannot rely on the soft opt-in.</p>

<p>PECR is, in technological terms, an old piece of law: it gave effect to an EU Directive drafted at a time when e-commerce was in its infancy. There is a good argument that the drafters did not envisage a time when non-profits would habitually use, or wish to use, electronic communications to promote their services and ideals. But in current times, there appears to be no rationale for the UK to favour the commercial sector over the non-profit one.</p>

<p>It is noteworthy that the Government did not reintroduce the clause from the previous Bill. Particularly in light of concerns from the sector about <a href="https://www.civilsociety.co.uk/news/charities-respond-as-chancellor-announces-40bn-tax-rise-in-first-budget.html">increased employer National Insurance Contributions</a>, and about funding in general, many charities and non-profits may wish to avail themselves of lobbying opportunities as the Data (Use and Access) Bill proceeds in Parliament.</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ADCS6AAA7AA7YAAAAAAAB6AB7QAP777774AAA5QAXMHF4BYAAA.jpg" length="90893" />
    </item>
    <item>
      <title><![CDATA[Data Protection reform right back on the agenda]]></title>
      <link>https://www.mishcon.com/news/data-protection-reform-right-back-on-the-agenda</link>
      <guid>https://www.mishcon.com/news/data-protection-reform-right-back-on-the-agenda</guid>
      <description><![CDATA[The government has, on 23 October, introduced a Data (Use and Access) Bill (DUA Bill) into Parliament. It revives many of the provisions of the Data Protection and Digital Information Bill (DPDI Bill) which failed to get passed prior to July's General Election,]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Thu, 24 Oct 2024 11:06:00 GMT</pubDate>
<content:encoded><![CDATA[<p>The government has, on 23 October, introduced a <a href="https://www.mishcon.com/download/data-use-and-acess-bill">Data (Use and Access)</a> Bill (DUA Bill) into Parliament. It revives many of the provisions of the Data Protection and Digital Information Bill (DPDI Bill) which failed to pass prior to July&#39;s General Election, but drops some of the more controversial ones. And, for good measure, there are some notable new proposals.&nbsp;</p>

<h2>The UK GDPR&nbsp;</h2>

<details><summary>Not repealed &ndash; accountability provisions&nbsp;</summary>

<p>In terms of what has not been revived, there is no longer a proposal to jettison the requirements for certain data controllers to appoint data protection officers, nor to conduct &quot;data protection impact assessments&quot; of high risk processing, or to maintain &quot;records of processing activities&quot;. Though the retention of these provisions will reassure many, questions may remain about whether they impose unnecessary compliance burdens on some SMEs.</p>
</details>

<details><summary>Amendments of note&nbsp;</summary>

<p>Of particular interest to data controllers are clauses in the DUA Bill which would amend data subject access request (DSAR) rights and obligations, &quot;privacy notice&quot; requirements, rules and penalties in the area of cookies and electronic marketing, and the way the Information Commissioner functions.</p>
</details>

<details><summary>DSARs&nbsp;</summary>

<p>Data controllers would &ndash; assuming the Bill passed in its current form &ndash; be able to require a data subject to identify which information or activities a DSAR relates to, for instance where the controller &quot;processes a large amount of information concerning the data subject&quot;. In such circumstances, the time for compliance would be &quot;paused&quot;. Although this is something that often currently happens in practice, it would be put on a statutory footing. Similarly, although it is common for a controller to decide that the time for compliance with a DSAR does not begin until the controller is satisfied as to the identity of the requester, the DUA Bill would make it clear in law that this was the case.</p>

<p>The Bill would also put into statute the point that courts have made on several occasions: that when searching for personal data in response to a DSAR, the search need only be a &quot;reasonable and proportionate one&quot;.</p>
</details>

<details><summary>Privacy notices&nbsp;</summary>

<p>&quot;Privacy notices&quot; are the means whereby controllers meet their current obligations under Articles 13 and 14 of the UK GDPR to provide information to data subjects about processing. The DUA Bill contains clauses which were not in the prior bill, and which are of real significance. The Bill proposes that the obligation to give a privacy notice to data subjects from whom data is directly collected will not apply to the extent that providing it &quot;is impossible or would involve a disproportionate effort&quot;. It gives examples of factors that might be taken into account when considering whether there would be a &quot;disproportionate effort&quot;, such as &quot;the number of data subjects, the age of the personal data and any appropriate safeguards applied to the processing&quot;. Similar wording is proposed for the Article 14 case where personal data is collected, but not directly from the data subject. It seems likely that if these clauses are enacted, the obligation on data controllers to notify data subjects of processing will be greatly reduced. Correspondingly, these clauses are likely to be highly controversial, and subject to parliamentary debate.</p>
</details>

<details><summary>Complaints procedures&nbsp;</summary>

<p>An interesting side note is that the Bill would require data controllers to have a complaints procedure for data subjects, and it would give the Secretary of State the power to make regulations requiring data controllers to notify the Information Commissioner of how many complaints they had received.</p>
</details>

<h2>PECR&nbsp;</h2>

<p>There are some interesting proposed amendments to the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR).</p>

<details><summary>Spam sent to no one&nbsp;</summary>

<p>For instance, currently, when someone sends huge volumes of &quot;spam&quot; emails (or text messages) but they are not received by anyone, these do not count as potentially offending communications: the DUA Bill would change this &ndash; such communications would be treated as having been sent to a &quot;recipient&quot;. This would mean that those who send enormous volumes of speculative &quot;spam&quot; would be more at risk of enforcement action.</p>
</details>

<details><summary>Analytics without consent&nbsp;</summary>

<p>What reappears in the proposed amendments to PECR is the proposal that was in the DPDI Bill to permit the use of first-party cookies (and similar technology) for website analytics purposes, without the need to get users&#39; consent. Additionally, the DUA Bill would grant the Secretary of State the power, by making regulations, to introduce other circumstances where cookies might be deployed without consent.</p>
</details>

<details><summary>PECR fines to equal UK GDPR level&nbsp;</summary>

<p>Furthermore, the DUA Bill revives the proposal to increase the potential fine for PECR infringements to UK GDPR levels (&pound;17.5m for the most serious infringements).</p>
</details>

<h2>The Information Commissioner&nbsp;</h2>

<p>The proposal to recast the Commissioner (a &quot;corporation sole&quot;) as a Commission, with a chief executive, is also revived. However, also revived is the intention that the Secretary of State would have considerable ability to affect the operation of the Commission - for instance, they would be able to determine the number of members, would be able to appoint non-executive members and would have to be consulted on the matter of the appointment of a chief executive.</p>

<h2>What does it all mean?&nbsp;</h2>

<p>&quot;Never let a good bill go to waste&quot; may well have been the thought of ministers in the new Labour administration, when they took power in July. Certainly, by breathing life back into the expired DPDI Bill, they have declined the opportunity a) to decide to prepare a wholly new bill, or b) to decide there was no need for change at all. And many of the returning provisions are sensible (and some of those which have been dispensed with are not going to be mourned).</p>

<p>What now needs to be observed closely is how any final enactment lands with business, and how it lands with the European Commission. The UK-EU &quot;adequacy agreement&quot;, which enables effective free movement of personal data between the two jurisdictions, is due to expire (and be renegotiated) in 2025. If the EU member states, and the European Commission, decide that the UK has diverged too far from the EU model, they may want to take the opportunity to give the UK a bloody nose. And that, in itself, would inevitably have an economic impact.</p>

<p>There is a long road ahead.</p>

]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ADKB4AAA7AA7YAAAAAAAB6AB7QAP777774AAAIQBQYG4GBQAAA.jpg" length="38495" />
    </item>
    <item>
      <title><![CDATA[Data centres as critical national infrastructure: a new era for digital resilience]]></title>
      <link>https://www.mishcon.com/news/data-centres-as-critical-national-infrastructure-a-new-era-for-digital-resilience</link>
      <guid>https://www.mishcon.com/news/data-centres-as-critical-national-infrastructure-a-new-era-for-digital-resilience</guid>
      <description><![CDATA[So much of our lives depend on digital services. The Government's decision to designate Data Centres as Critical National Infrastructure (CNI) is therefore a welcome step. The change effectively codifies what was already a widely held assumption, even among the general public, regarding the importance of data centres.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Mon, 16 Sep 2024 11:05:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>So much of our lives depend on digital services. The Government&#39;s decision to designate Data Centres as Critical National Infrastructure (<strong>CNI</strong>) is therefore a welcome step. The change effectively codifies what was already a widely held assumption, even among the general public, regarding the importance of data centres.&nbsp;&nbsp;</p>

<p>The UK has an estimated 500 data centres. This puts it ahead of many countries, including China and most of Europe, although it is dwarfed by America (which has circa 5,000). Moreover, it is highly likely that demand for data centre capacity will grow significantly, as businesses seek to meet the computing needs of Artificial Intelligence alongside more routine services.&nbsp;&nbsp;</p>

<p>The recognition of data centres as a vital part of the country&#39;s infrastructure is therefore not before time. Undoubtedly it will bring some benefits to the sector, in particular in justifying greater focus and support at a governmental level. However, it is no panacea, and much will still fall to the providers and consumers of these services.&nbsp;</p>

<h2>Strong and stable?&nbsp;</h2>

<p>Security is often the headline issue when it comes to digital services. For a data centre client, ensuring that the data centre can securely house sensitive data is crucial. As a result, many view the sector as already being a leader in physical security, and there are well-trodden practices to manage physical risks.&nbsp;</p>

<p>However, from the customer&#39;s viewpoint, resilience often takes precedence. Recent outages in data centres, such as the October 2021 Meta misconfiguration, show that availability issues causing service disruption create headlines. In that case, although the fault led to only around six hours of service outage, it was a highly public event and was linked to a drop in Meta&#39;s share price.&nbsp;</p>

<h2>Clouds on the horizon?&nbsp;</h2>

<p>Perhaps the most interesting part of the announcement is contained in a footnote that &quot;CNI will include both the physical data centres and the cloud operators that use them to supply ordinary services&quot;.&nbsp;&nbsp;</p>

<p>The extent of support for major cloud providers is yet to be disclosed. However, it is a positive sign to see government support for these service providers being potentially formalised.&nbsp;&nbsp;</p>

<h2>Benefits of designation&nbsp;</h2>

<p>Risk management in data centres is a collaborative effort, with strategies between operators and clients being mutually dependent.&nbsp;</p>

<p>The UK Government&#39;s National Protective Security Authority already publishes guidance for the data centre sector and its customers. The advice is well developed, focusing on specific risks driven by considerations ranging from geography and asset ownership to the management of data halls and &#39;meet-me rooms&#39; where network interconnectivity occurs.&nbsp;&nbsp;</p>

<p>The Government&#39;s designation of data centres as CNI will likely lead to greater support in the sharing of threat intelligence, and in the recovery from crisis events. The changes are not focused solely on cyber security, but will also improve overall resilience &ndash; for example against extreme weather events or other disruptions.&nbsp;&nbsp;</p>

<p>This designation may also offer further, more subtle, benefits to data centre operators, who (like other sectors) have likely recognised their susceptibility to disruption from physical protests; their status as CNI may now mean that data centre operators will have an easier argument when seeking injunctive relief to safeguard their services. Additionally, it could influence planning and construction practices to improve access to essential utilities like electricity and water, making construction more viable.&nbsp;&nbsp;</p>

<p>However, a belief that this status alone will deter cybercriminals may be somewhat optimistic. It will need to be backed by clear measures that target those who attack critical infrastructure, to send a message that it is off-limits.&nbsp;&nbsp;</p>

<p>Further, the change will potentially bring more scrutiny with it for data centre operators. In the context of the broader economic environment, it appears that businesses will bear the brunt of any implementation costs, barring extraordinary events or civil contingencies.&nbsp;</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/AC4DEAAA7AA7YAAAAAAAB6AB7QAP777774AABAYEAUEYGBAAAA.jpg" length="114687" />
    </item>
    <item>
      <title><![CDATA[Dutch Data Protection Authority releases guidance on "data scraping"]]></title>
      <link>https://www.mishcon.com/news/dutch-data-protection-authority-releases-guidance-on-data-scraping</link>
      <guid>https://www.mishcon.com/news/dutch-data-protection-authority-releases-guidance-on-data-scraping</guid>
      <description><![CDATA[New guidance (currently only available in Dutch) has been released by the Dutch Data Protection Authority (Dutch DPA)  in relation to "data scraping".]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Thu, 08 Aug 2024 14:41:00 GMT</pubDate>
      <content:encoded><![CDATA[<p><a href="https://autoriteitpersoonsgegevens.nl/actueel/ap-scraping-bijna-altijd-illegaal?" >New guidance (currently only available in Dutch)</a> has been released by the Dutch Data Protection Authority (Dutch DPA) in relation to &quot;data scraping&quot;. The Dutch DPA advise that data scrapers with commercial interests cannot rely on &quot;legitimate interests&quot; as the basis for the collection and processing of scraped data. Although it is perhaps unlikely that a similar approach would be taken by the Information Commissioner, in the UK, it will be important to track how the issue develops.</p>

<h2>Data scraping</h2>

<p>Data scraping is where a &quot;bot&quot; or automated computer program captures information that is stored online on a mass scale. One of the uses of data scraping in recent years has been to collect data to train AI (Artificial Intelligence) and LLMs (Large Language Models) such as that used in ChatGPT.</p>

<h2>The Dutch DPA&#39;s guidance</h2>

<p>For processing of personal data to be lawful under the EU GDPR one of the bases listed in Article 6(1) must apply. Article 6(1)(f) provides such a basis where processing is necessary for the legitimate interests of the data controller, or any other person, as long as those interests are not overridden by the interests, rights or freedoms of data subjects.</p>

<p>In the Dutch DPA&#39;s view, commercial entities cannot rely on legitimate interests to collect and process data that has been scraped; rather, explicit consent is always required from the data subjects.</p>

<p>The Dutch DPA recognises that the collection of consent on this scale from data subjects who may not be contactable would be impractical. Given the impracticalities, the Dutch DPA says that there is, therefore, no legal basis under EU GDPR for the practice of data scraping by commercial entities.</p>

<p>The Dutch DPA does accept that scraping of personal data for personal, non-commercial, purposes may be compliant without consent. However, if that data was then shared on a public repository (such as GitHub) then it is unlikely that further processing would be lawful.</p>

<p>The guidance requires certain risk assessments and safeguards to be put in place to protect data subjects in the instances where data scraping is compliant. The Dutch DPA also suggests that if a high risk to privacy is identified when completing a DPIA (Data Protection Impact Assessment) then a controller would have an obligation under Article 36 of the GDPR to consult with the Dutch DPA directly, so that it can assess intended processing and what measures may be needed before any processing starts.</p>

<p>The guidance is particularly critical of data scraping that is used to train AI. This is because it identifies that information stored on the internet may contain incorrect, biased, and discriminatory information which may pose a risk to fundamental rights when the AI is later deployed.</p>

<p>The guidance does not address enforcement, which may raise questions, at least for the time being, about what realistic deterrent there is to poor compliance.</p>

<h2>Is this indicative of other Regulators&#39; direction of travel?</h2>

<p>The approach taken by the Dutch DPA to whether commercial entities can rely on the &quot;legitimate interests&quot; basis is one which has been the subject of some criticism and which has been referred to the <a href="https://curia.europa.eu/juris/document/document.jsf?text=&amp;docid=269046&amp;pageIndex=0&amp;doclang=EN&amp;mode=req&amp;dir=&amp;occ=first&amp;part=1&amp;cid=289964" >Court of Justice of the European Union</a> (CJEU) by the Dutch court in another case. It is certainly not shared by all (if any) other European data protection authorities.</p>

<p>Nonetheless, data scraping is an area of increasing scrutiny for some regulators, and we are aware that the Polish Data Protection Authority fined a Swedish scraper &euro;220,000 in 2019. However, this was in response to a failure to notify individuals that their data had been scraped, as opposed to the scraping itself. Similarly, the Spanish regulator fined Equifax Inc. &euro;1 million in 2021 for failing to inform data subjects, limit their collection to what was necessary, ensure the data&#39;s accuracy, and satisfy the balancing test required to rely on the legitimate interest basis for collection and processing.</p>

<p>It is difficult to say whether other data protection authorities and regulators will take similar positions to this guidance from the Dutch DPA (whilst probably not adopting the full &quot;legitimate interests&quot; analysis). It may be that other authorities wait to see what happens following this guidance, and how the CJEU rules on the &quot;legitimate interests&quot; point, before implementing their own.</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ACWCUAAA7AA7YAAAAAAAB6AB7QAP777774AABAYEAUEYGBAAAA.jpg" length="102315" />
    </item>
    <item>
      <title><![CDATA[School receives statutory reprimand for using Facial Recognition Technology]]></title>
      <link>https://www.mishcon.com/news/school-receives-statutory-reprimand-for-using-facial-recognition-technology</link>
      <guid>https://www.mishcon.com/news/school-receives-statutory-reprimand-for-using-facial-recognition-technology</guid>
      <description><![CDATA[The Information Commissioner’s Office (ICO) has issued a statutory reprimand, under the UK GDPR, to an academy school in Essex, in relation to how it introduced and operated facial recognition technology (FRT) to take cashless canteen payments from students.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Wed, 24 Jul 2024 10:35:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>The Information Commissioner&rsquo;s Office (ICO) has issued a <a href="https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2024/07/essex-school-reprimanded-after-using-facial-recognition-technology-for-canteen-payments/" >statutory reprimand</a>, under the UK GDPR, to an academy school in Essex, in relation to how it introduced and operated facial recognition technology (FRT) to take cashless canteen payments from students.</p>

<p>There are some key takeaways from the ICO&#39;s action, some of which extend beyond schools and into a wider business and retail context.</p>

<p>FRT systems uniquely identify individuals by capturing an image of their face in real time and matching it with a pre-existing database of images. This constitutes &ldquo;processing&rdquo; of the individual&rsquo;s personal data, but, because of how the FRT works, it also qualifies as &ldquo;biometric processing&rdquo;, and it therefore requires extra measures under data protection law to be taken to ensure it is done fairly and lawfully.</p>

<p>Prior to doing such biometric processing - and therefore prior to implementing FRT - a data controller must undertake a <a href="https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/accountability-and-governance/data-protection-impact-assessments-dpias/" >Data Protection Impact Assessment</a> (DPIA) - a type of risk assessment which needs to take into account the necessity and proportionality of the processing, and the risks to the rights and freedoms of the data subjects. In the case of the Essex school, a DPIA had only been undertaken after the FRT system had been introduced.</p>

<p>If FRT systems are used in schools, and in the workplace (for instance for access control to certain spaces, services or items), the data controller has to have a lawful justification for doing so. And it may be unlikely that there are sufficiently compelling reasons to impose the system without asking the students, or the workers, for their explicit consent. In the Essex school example, the ICO found that, instead, it had merely been inferred that students consented, in the absence of a parental opt-out. The ICO pointed out that this was insufficient: &quot;the law does not deem &lsquo;opt out&rsquo; a valid form of consent and requires explicit permission&quot;. And, furthermore, the ICO noted that most of the students were of an age and competence that meant they were able to take their own decision on whether to partake in the scheme, regardless of their parents&#39; wishes. In fact, the ICO did not refer, as it could have done, to the fact that under the <a href="https://www.legislation.gov.uk/ukpga/2012/9/part/1/chapter/2" >Protection of Freedoms Act 2012</a>, students in schools have their own express rights, and even if a parent has said that it is ok, or not ok, for the school to use biometric processing, the student still has the right to override that parental decision.&nbsp;</p>

<p>When employers look to introduce FRT in the workplace for access control reasons, they should also bear in mind that data protection law considers that where there is an &ldquo;imbalance of power&rdquo; &ndash; for instance between a worker and an employer &ndash; it can be difficult to rely on the worker&rsquo;s consent to the processing. At the very least, the employer will need to offer the option not to be exposed to the FRT, which may well mean providing an alternative means of access &ndash; for instance through the use of codes or PINs.</p>

<p>It is also important that those purchasing and deploying FRT systems and software exercise due diligence. Suppliers might seek to provide advice and reassurance, but they are not the ones who will be liable for any data protection infringements, and investigation by the ICO.</p>

<p>Although the ICO chose not to impose a fine on the school, perhaps because it has a policy of rarely, if ever, fining public bodies, it is not impossible that a more punitive sanction could be imposed on a private sector organisation that failed to undertake a DPIA before introducing FRT, or failed properly to take into account the legal issues around consent. Anyone proposing to use FRT should consider the legal issues carefully, and take advice where appropriate.</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/AB3DQAAA7AA7YAAAAAAAB6AB7QAP777774AAAAAAYIGGCBQAAA.jpg" length="96818" />
    </item>
    <item>
      <title><![CDATA[What do the main party manifestos say about data?]]></title>
      <link>https://www.mishcon.com/news/what-do-the-main-party-manifestos-say-about-data</link>
      <guid>https://www.mishcon.com/news/what-do-the-main-party-manifestos-say-about-data</guid>
      <description><![CDATA[Shortly after the announcement that there would be a July 4 general election, it became apparent that the Data Protection and Digital Information Bill, which had been mooted in various forms for two years, had lapsed.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Thu, 13 Jun 2024 13:52:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>Shortly after the announcement that there would be a July 4 general election, it became apparent that the Data Protection and Digital Information Bill, which had been mooted in various forms for two years, had lapsed. What had been the focus of many practitioners and lawyers was no more.</p>

<p>But what do the main political parties propose, in the area of data protection itself, in their pre-election manifestos? The main answer is not a great deal: it does not look like a return to data protection reform is high on any of the parties&#39; agendas. This may come as some relief, especially for those who have concerns that too great a divergence between the UK and the EU data protection frameworks could threaten the European Commission&#39;s position that the UK has an <em>&quot;adequate&quot;</em> regime, for the purposes of transfers of data from the EU to the UK. Despite this, it is likely that the next Government, whatever party or parties form it, will still need to give attention to aspects of the current regime which warrant amendment.</p>

<p>The Conservative Party &ndash; the party which sponsored the Data Protection and Digital Information Bill &ndash; also says nothing specific in its manifesto on the topic of data protection. It does say that the Conservatives would <em>&quot;invest &pound;3.4 billion in new technology to transform the NHS for staff and for patients&quot;</em>, and that this would result in the digitisation of NHS processes through the proposed Federated Data Platform. And there is a proposal to legislate for comparable data across the UK, in order that performance of public services can be accurately compared. There is also reference to accelerating AI development, and investment into the sector, although nothing on whether or how it might be subject to future regulation.</p>

<p>Similarly, the Labour Party makes no express reference to data protection, although it does propose a number of points on digital policy, especially in the area of AI. For instance, it would establish a &#39;Regulatory Innovation Office&#39; to <em>&quot;co-ordinate issues that span existing boundaries&quot;</em>. And it says Labour will introduce <em>&quot;binding regulation on the handful of companies developing the most powerful AI models and by banning the creation of sexually explicit deepfakes&quot;</em>. There is also a suggestion of an expansion of the Online Safety Act, but no specific details.</p>

<p>The Liberal Democrat manifesto does refer to data protection, in the context of a wider proposal for a Digital Bill of Rights <em>&quot;to protect everyone&rsquo;s rights online, including the rights to privacy, free expression, and participation without being subjected to harassment and abuse&quot;</em>. It also proposes repeal of the <em>&quot;immigration exemption&quot;</em> in the Data Protection Act 2018, and that <em>&quot;all products&quot;</em> will be required to provide a &quot;short, clear version of their terms and conditions, setting out the key facts as they relate to individuals&rsquo; data and privacy&quot;. There would also be a &#39;Patients&#39; Charter&#39;, which would seek, among other things, to protect patient data and <em>&quot;patients&rsquo; rights to opt out of data sharing&quot;</em>.</p>

<p>The Green Party also proposes a Digital Bill of Rights in its manifesto, to establish the UK as<em> &quot;a leading voice on standards for the rule of law and democracy in digital spaces&quot;</em>, and to ensure that <em>&quot;UK data protection is as strong as any other regulatory regime&quot;</em>. Meanwhile, the Green Party would adopt a <em>&quot;precautionary regulatory approach to the harms and risk of AI&quot;</em> and would seek to <em>&quot;align the UK approach with our neighbours in Europe, UNESCO and global efforts to support a coordinated response to future risks of AI&quot;</em>.</p>

<p>None of the party manifestos covered above make reference to Freedom of Information, although it is perhaps notable that the Liberal Democrat manifesto does propose that <em>&quot;all Ministers&rsquo; instant-messaging conversations involving government business must be placed on the departmental record&quot;</em> and that <em>&quot;all lobbying of Ministers via instant messages, emails, letters and phone calls is published as part of quarterly transparency releases&quot;</em>.</p>

<p>None of this is to suggest that data protection, and wider digital rights issues, might not become subject to policy focus, once the next administration is in power, but as things stand none of the parties appears to see information rights as a topic to boost their campaigns.</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ABMTCAAA7AA7YAAAAAAAB6AB7QAP777774YQGAAAYIGGCBQAAE.jpg" length="81874" />
    </item>
    <item>
      <title><![CDATA[Important news on data subject access requests: requesters entitled to identities of recipients]]></title>
      <link>https://www.mishcon.com/news/important-news-on-data-subject-access-requests-requesters-entitled-to-identities-of-recipients</link>
      <guid>https://www.mishcon.com/news/important-news-on-data-subject-access-requests-requesters-entitled-to-identities-of-recipients</guid>
      <description><![CDATA[A very significant data protection subject access judgment was recently handed down in the High Court, in the case of Harrison v Cameron & Another. As a judgment of the High Court it has binding effect, and unless appealed, its findings must be followed.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Tue, 11 Jun 2024 17:01:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>A very significant data protection subject access judgment was recently handed down in the High Court, in the case of <a href="https://www.bailii.org/ew/cases/EWHC/KB/2024/1377.html" >Harrison v Cameron &amp; Another</a>. As a judgment of the High Court it has binding effect, and unless appealed, its findings must be followed.</p>

<p>It clarifies, or confirms, some key points for all those who make, respond to or advise on such requests, whether they are made under the <a href="https://www.mishcon.com/uk-gdpr">UK GDPR</a> or - in the case of subject access requests to law enforcement authorities &ndash; under part 3 of the Data Protection Act 2018.</p>

<p>Key rulings were made in particular to the effect that:</p>

<ol>
	<li>requesters are entitled, in principle, to be informed of the identities of the recipients of their personal data (not just the categories of recipient)</li>
	<li>the subject access regime has a &ldquo;specific and limited purpose, which is to enable a person to check whether a data controller&rsquo;s processing of his or her personal data unlawfully infringes privacy rights and, if so, to take such steps as the data protection law provides&rdquo;</li>
	<li>a director of a company, when acting as such, will not be a &ldquo;controller&rdquo;.</li>
</ol>

<p>The underlying details of the case are striking. A director of a gardening company (Mr C) had covertly recorded threatening calls made by a wealthy homeowner working in the property investment industry (Mr H) with whom the company had come into dispute, and subsequently circulated the recordings to a limited number of unnamed family members and others.</p>

<p>The recordings found their way to a wider circle of people, including some of Mr H&rsquo;s peers and competitors in the property investment sector. Mr H contended that the circulation of the recordings had caused his own company to lose out on a significant property acquisition. Accordingly, he made subject access requests, under Article 15 of the UK GDPR, both to Mr C and to Mr C&rsquo;s company (&ldquo;ACL&rdquo;). Those requests were rejected on the grounds that i) Mr C, when circulating the recordings, was processing Mr H&rsquo;s personal data in a &ldquo;purely personal and household&rdquo; context, and so the processing was out of scope of the UK GDPR, ii) Mr C was not personally a controller under the UK GDPR, and iii) ACL could rely on the exemption from disclosure where it would involve disclosing information relating to another individual who did not consent to disclosure, and where &ndash; in the absence of such consent &ndash; it was not reasonable in the circumstances to disclose, having regard to the balancing test required under Article 15(4) of the UK GDPR and paragraph 16 of Schedule 2 to the Data Protection Act 2018 (DPA 2018).</p>

<p>In a lengthy judgment (dealing mostly with the facts and evidence) Mrs Justice Steyn held that Mr C&rsquo;s processing was not for purely personal and household reasons: he was clearly acting as a director of ACL in making the recordings and circulating them. However, she agreed that he was not a controller &ndash; he was acting in his capacity as a director, and &ndash; following the case law in <a href="https://www.bailii.org/ew/cases/EWCA/Civ/2017/121.html" >Ittihadieh</a> and <a href="https://www.bailii.org/ew/cases/EWHC/Ch/2013/2485.html" >In re Southern Pacific Loans</a> &ndash; a director processing data in the course of their duties for their company is not a controller; the company is.</p>

<p>A crucial part of the judgment, in terms of wider relevance, is on the interpretation of Article 15(1)(c) of the UK GDPR. This provides that a data subject should be given information on &ldquo;the recipients or categories of recipient&rdquo; to whom personal data have been or will be disclosed. Many practitioners and lawyers have taken this to be an option available to the controller (i.e. the controller can decide whether to provide information on the specific recipient or just on categories thereof). Not so, said Steyn J, agreeing with the CJEU in the <a href="https://curia.europa.eu/juris/document/document.jsf?docid=269146&amp;doclang=en" >Austrian Post</a> case (which, as a post-Brexit case, wasn&rsquo;t binding on her, but to which she could have regard, so far as it was relevant to the issues (see section 6(2) of the EU (Withdrawal) Act 2018)): the choice lies with the data subject, and, if the data subject chooses to receive information on individual recipients, he or she is entitled, in principle, to that information, unless it would be impossible or manifestly excessive to provide it.</p>

<p>Notwithstanding this, Mr H was not entitled in this case to have the identities. Mr H had previously sent subject access requests individually to at least 23 employees of ACL, and he intended to pursue further legal options other than under the UK GDPR, if he was able to identify potential claimants. ACL believed that disclosing identities of recipients of the recordings would put them at &ldquo;significant risk of being the object of intimidating, harassing and hostile legal correspondence and litigation&rdquo;. The judge agreed that it was &ldquo;not unreasonable for the Defendants to give significant weight to [Mr H&rsquo;s] sustained and menacing behaviour in considering whether to protect or disclose the identities of friends, colleagues and family members&rdquo;. The fact that &ldquo;hostile litigation&rdquo;, against the third parties to whom the recordings were disclosed, was being contemplated was a relevant factor to take into account when balancing their interests with Mr H&rsquo;s access rights, under paragraph 16 of Schedule 2. The judge agreed with the court in the case of <a href="https://www.bailii.org/ew/cases/EWHC/KB/2023/1092.html" >X v The Transcription Agency</a> that the subject access regime &ldquo;has a specific and limited purpose, which is to enable a person to check whether a data controller&rsquo;s processing of his or her &lsquo;personal data&rsquo; unlawfully infringes privacy rights and, if so, to take such steps as the DPA 2018 provides&hellip;[and so] it was reasonable for the Defendants to give weight to their desire to protect family, friends and colleagues from hostile litigation going beyond the exercise of rights under the UK GDPR and the DPA 2018.&rdquo;</p>

<p>The question of whether a requester&rsquo;s motive is relevant when responding to a subject access request has long been a subject of debate, under the UK GDPR and the prior law, and it rears its head again here. The judge&rsquo;s analysis in Harrison v Cameron is compelling, and so it certainly appears that &ndash; at the very least when it comes to the balancing test implied by paragraph 16 of Schedule 2 to the DPA 2018 &ndash; the motive is capable of being taken into account.</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ABOCAAAA7AA7YAAAAAAAB6AB7QAP777774AAAIIBRAG4IBQAAA.jpg" length="134448" />
    </item>
    <item>
      <title><![CDATA[The end of the Data Protection and Digital Information Bill]]></title>
      <link>https://www.mishcon.com/news/the-end-of-the-data-protection-and-digital-information-bill</link>
      <guid>https://www.mishcon.com/news/the-end-of-the-data-protection-and-digital-information-bill</guid>
      <description><![CDATA[With the announcement by the Prime Minister, on 22 May, that the King had granted a request for dissolution of Parliament, and that a General Election will be held on 4 July, the thoughts of data protection professionals inevitably turned to the question of what would happen with the Data Protection and Digital Information Bill.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Fri, 24 May 2024 08:26:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>With the announcement by the Prime Minister, on 22 May, that the King had granted a request for dissolution of Parliament, and that a General Election will be held on 4 July, the thoughts of data protection professionals inevitably turned to the question of what would happen with the Data Protection and Digital Information Bill.</p>

<p>The Bill, which was approaching report stage in the House of Lords, had, prior to 22 May, seemed destined to pass, provided there was enough parliamentary time: it was largely unopposed by the main opposition, and was well advanced in its passage through Parliament.</p>

<p>However, it now appears that, with <a href="https://privycouncil.independent.gov.uk/wp-content/uploads/2024/05/2024-05-23-List-of-Business.pdf">Parliament due to be dissolved no later than 31 May</a>, and no indication that there is any likelihood of the Bill being included in the &quot;wash up&quot; process, whereby some priority legislation is fast-tracked before dissolution, it has lapsed. Indeed, there has been a strong indication from <a href="https://hansard.parliament.uk/Lords/2024-05-23/debates/2FF1DA69-732C-4CC5-927A-DCB095BD5231/DigitalMarketsCompetitionAndConsumersBill#contribution-26B9A187-A77B-449B-BE4C-DDC3B8BC52C6" >opposition peers</a> that the Bill has now &quot;failed&quot;.</p>

<p>It will now have to be seen whether the next administration &ndash; whatever its political colour &ndash; has any appetite to revive the Bill in something like its current form. Should Rishi Sunak be Prime Minister of the next government, with something like his current cabinet in place, it would seem quite likely the Bill would be relatively swiftly resurrected. Should, instead, a Labour government assume power, it is probably unlikely that an identical data protection bill would be high on its agenda, but legal news outlet <a href="https://www.lexology.com/pro/content/uk-gdpr-reform-dead-now" >Lexology</a> has reported rumours that Labour might look to introduce a &ldquo;digital bill in the autumn on entirely different lines&rdquo; which would include legislation on Artificial Intelligence.</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ADXCCAAA7AA7YAAAAAAAB6AB7QAP777774AAB7IDSEE4SBAAAA.jpg" length="155287" />
    </item>
    <item>
      <title><![CDATA[Report highlights a ‘Wild West’ for personal data which undermines UK human rights]]></title>
      <link>https://www.mishcon.com/news/report-highlights-a-wild-west-for-personal-data-which-undermines-uk-human-rights</link>
      <guid>https://www.mishcon.com/news/report-highlights-a-wild-west-for-personal-data-which-undermines-uk-human-rights</guid>
      <description><![CDATA[The media platform OpenDemocracy, and journalist Jenna Corderoy, have produced an important and damning report into the generally parlous situation data subjects are faced with when trying to access their information from public bodies.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Thu, 02 May 2024 10:10:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>The media platform OpenDemocracy, and journalist Jenna Corderoy, have produced an <a href="https://www.opendemocracy.net/en/freedom-of-information/personal-data-sar-request-undermines-basic-legal-human-rights/" >important and damning report</a> into the generally parlous situation data subjects are faced with when trying to access their information from public bodies.&nbsp;</p>

<p>The subject access right (&quot;SAR&quot;) under the <a href="https://www.mishcon.com/uk-gdpr">UK GDPR</a> and the Data Protection Act 2018, is one that has existed in the UK since 1984, and the report,&nbsp;<em>Getting Personal: Accountability and Personal Data in the UK</em>, notes that it is&nbsp;</p>

<p>&quot;&hellip;an extremely powerful right, covering public authorities as well as private companies, charities, political parties and other organisations. Over the years, SARs have been responsible for exposing injustices, supporting legal claims and revealing the extent of surveillance.&quot;&nbsp;</p>

<p>However, data from freedom of information requests, and interviews with a number of people who have tried to exercise their rights, reveals what the report describes as&nbsp;&nbsp;</p>

<p>&quot;&hellip;long and unjustified delays &ndash; sometimes lasting several months [which] are depriving citizens of their personal information and undermining their legal and human rights.&quot;&nbsp;</p>

<p>The Information Commissioner&#39;s Office (ICO) has an enforcement role, but the report notes that &quot;formal action is vanishingly rare&quot; and that citizens are often left only with the possibility of bringing legal claims &ndash; something that is beyond the means of most.&nbsp;</p>

<p>The report includes an interview with Mishcon de Reya client John Pring (for whom we acted on a <em>pro bono</em> basis), who had to wait an astonishing two and a half years to <a href="https://www.disabilitynewsservice.com/dwp-finally-admits-defeat-in-information-battle-with-dns-after-two-and-a-half-years/" >get access to his personal data from the Department for Work and Pensions</a>, and who feels that a &ldquo;policy of delaying the release of potentially embarrassing information, often for years, has gradually become ingrained within DWP&quot;.&nbsp;</p>

<p>There are a number of recommendations in the report. These include that the ICO should take more enforcement action and, in particular, that this should include intervention on individual SAR cases where there has been a clear breach of data protection laws, instead of only taking action about an organisation&rsquo;s overall compliance levels.&nbsp;&nbsp;&nbsp;</p>

<p>Although there are some changes to the SAR process proposed in the <a href="https://bills.parliament.uk/bills/3430" >Data Protection and Digital Information Bill</a> currently before Parliament, none of these looks set to address the systematic, and structural, problems with compliance and enforcement. One hopes, though, that OpenDemocracy&#39;s report will at least generate a debate on the issue and how things might be improved.&nbsp;</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ADJCMAAA7AA7YAAAAAAAB6AB7QAP777774AAB4YGAUEYGBAAAA.jpg" length="92532" />
    </item>
    <item>
      <title><![CDATA[New connectable products regime: Enforcement update]]></title>
      <link>https://www.mishcon.com/news/new-connectable-products-regime-enforcement-update</link>
      <guid>https://www.mishcon.com/news/new-connectable-products-regime-enforcement-update</guid>
      <description><![CDATA[We recently reported that the Product Security and Telecommunications Infrastructure Act 2022 (PSTIA) is due to come into force on 29 April 2024. Part 1 of the PSTIA aims to protect consumers from unsafe connectable products entering the UK market by requiring compliance with minimum security requirements for products that may pose a cyber security risk.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Fri, 26 Apr 2024 15:26:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>We recently <a href="https://www.mishcon.com/news/a-new-regime-for-connectable-products-from-april-2024">reported</a> that the <a href="https://www.legislation.gov.uk/ukpga/2022/46/part/1/enacted" >Product Security and Telecommunications Infrastructure Act 2022</a> (PSTIA) is due to come into force on 29 April 2024. Part 1 of the PSTIA aims to protect consumers from unsafe connectable products entering the UK market by requiring compliance with minimum security requirements for products that may pose a cyber security risk.&nbsp;</p>

<p>The Office for Product Safety and Standards (OPSS) has now issued <a href="https://www.gov.uk/government/publications/opss-enforcement-enforcement-actions/consumer-connectable-product-security-regulations#guidance-on-enforcement-actions-and-associated-rights" >guidance</a> setting out how it intends to approach enforcement of the requirements of the PSTIA.&nbsp;&nbsp;</p>

<h2>Enforcement actions by the OPSS&nbsp;</h2>

<p>The OPSS <a href="https://www.gov.uk/guidance/regulations-consumer-connectable-product-security" >announced</a> in January 2024 that it will take a risk-based approach to non-compliance with the PSTIA in line with its existing <a href="https://www.gov.uk/government/publications/safety-and-standards-enforcement-enforcement-policy/opss-enforcement-policy#our-approach-to-addressing-non-compliance-and-product-safety-risk" >Enforcement Policy</a>. The new guidance, specific to the PSTIA, outlines how the OPSS intends to implement the five actions available for enforcement under the PSTIA where there has been a breach of a duty under <a href="https://www.legislation.gov.uk/ukpga/2022/46/part/1/chapter/2/enacted" >Chapter 2 of Part 1 of the PSTIA</a>. The five actions are:&nbsp;</p>

<ol>
	<li><strong>Compliance Notices&nbsp;</strong><br />
	These set out the steps the OPSS requires a business to take to comply with their statutory duties and bring themselves into compliance.&nbsp;</li>
	<li><strong>Stop Notices&nbsp;</strong><br />
	These prohibit non-compliant activities and restrict non-compliant product availability on the market until compliance is achieved via the steps set out in the notice by the OPSS.&nbsp;</li>
	<li><strong>Recall Notices&nbsp;</strong><br />
	These are issued where the OPSS believes that there has been a compliance failure in relation to any consumer-facing product and/or the action taken by the business to mitigate the failure is inadequate. The guidance explains that the OPSS considers a recall to be an appropriate compliance action given the risk that non-compliance may pose. However, the PSTIA does not create a duty on businesses to recall products.&nbsp;</li>
	<li><strong>Monetary Penalty Notices</strong>&nbsp;<br />
	These may be issued where the OPSS is satisfied that there has been a failure to comply with a duty. These penalties can be fixed, consisting of a one-off penalty, or incurred daily, where a further penalty is due in respect of each day that non-compliance continues. The OPSS will issue Monetary Penalties in line with its Enforcement Policy and the circumstances of the case. However, it should be noted that penalties can be severe and amount to the greater of &pound;10m or 4% of the business&#39;s qualifying worldwide revenue during its most recent complete accounting period.&nbsp;</li>
	<li><strong>Forfeiture Orders&nbsp;</strong><br />
	The OPSS may issue a Forfeiture Order where it requires that non-compliant products, as defined by section 42(1) of the PSTIA, are delivered up, destroyed or disposed of. The OPSS must apply to the court to obtain the order.&nbsp;</li>
</ol>

<p>The OPSS may choose to issue a combination of the above actions, or issue one in isolation. Before taking any of these actions the OPSS will notify the affected businesses via a Notice of Intent and provide an opportunity to respond. The guidance encourages businesses to engage with the OPSS and states that any response will be considered, and a decision will be made as to whether to continue proceeding with the enforcement action.&nbsp;</p>

<p>Compensation may be available where a Stop Notice or Recall Notice is issued and loss has been suffered. Businesses must apply for compensation within 45 days of the Notice; the OPSS will then assess whether compensation is due (such decisions will be appealable).&nbsp;&nbsp;</p>

<p>Details of any Notice given, varied, or revoked by the OPSS may be published, and businesses need to be aware that this may have reputational implications.&nbsp;&nbsp;</p>

<p>If a business ignores or fails to comply with a Notice, the OPSS may choose to prosecute or pursue a civil claim where a Monetary Penalty has not been paid.&nbsp;&nbsp;</p>

<h2>Right to appeal&nbsp;</h2>

<p>Businesses have a statutory right to appeal the Notices and compensation decisions by the OPSS to the First-tier Tribunal within 28 days of the Notice or decision being served or varied. Appeals against Forfeiture Orders must be made to the relevant court within the same timeframe.&nbsp;</p>

<h2>What does the guidance mean for businesses?&nbsp;</h2>

<p>The guidance suggests that the OPSS is looking to take a risk-based approach, having consulted with businesses about how to comply with the PSTIA. We have seen similar approaches taken by the Information Commissioner&#39;s Office and other government regulators such as Ofcom, which suggests that the risk-based approach is a trend we will continue to see in enforcement.&nbsp;</p>

<p>Time will tell how actively the OPSS pursues compliance with the PSTIA. However, there are severe penalties available, which it may choose to use as a deterrent in the early days of enforcement. Businesses should actively work with the OPSS where possible, particularly given both the monetary and reputational risks associated with enforcement action.&nbsp;</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/AADRWAAA7AA7YAAAAAAAB6AB7QAP777774AABVQAAAFAABIAAA.jpg" length="61254" />
    </item>
    <item>
      <title><![CDATA[Clarity of consent in fertility treatment]]></title>
      <link>https://www.mishcon.com/news/clarity-of-consent-in-fertility-treatment</link>
      <guid>https://www.mishcon.com/news/clarity-of-consent-in-fertility-treatment</guid>
      <description><![CDATA[A recent case in the Family Division of the High Court illustrates the importance of clarity of wording when arrangements are made for donations in the context of fertility treatment.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Wed, 17 Apr 2024 11:06:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>A recent case in the Family Division of the High Court illustrates the importance of clarity of wording when arrangements are made for donations in the context of fertility treatment.&nbsp;&nbsp;</p>

<p>The judgment in <a href="https://www.bailii.org/ew/cases/EWHC/Fam/2024/587.html" ><em>Wessex Fertility Ltd &amp; Ors v Donor Conception Network</em> [2024] EWHC 587 (Fam) </a>deals with whether it was lawful to request that an egg donor (&quot;Donor A&quot;) provide a DNA sample for the purposes of genetic analysis, in circumstances a) where Donor A&#39;s eggs had been used in fertility treatment for a couple (&quot;Mr and Mrs H&quot;) b) where the resultant child was born with a number of health problems, and c) where Donor A had, at the time of donation, indicated on a &quot;consent form&quot; that she did not want subsequently to be notified that she had a previously unsuspected genetic disease, or that she was a carrier of a harmful inherited condition.</p>

<p>The court considered the position under both Article 8 of the European Convention on Human Rights, which provides a right to respect for a private and family life, and under data protection law - specifically, whether the processing of Donor A&#39;s own personal data could be held to be necessary for the purpose of medical diagnosis and/or in the provision of health treatment and proportionate to any interference with her rights (including her prior objections).</p>

<p>Ultimately, the court appears to have found it relatively straightforward to decide that the compelling circumstances warranted the interference under Article 8, and the data protection provisions fell accordingly into place. Perhaps key, though, were the court&#39;s findings to the effect that Donor A would retain the right to her own decision as to whether she provides a sample upon request or not, and that she could also decide that she would not wish to know the results of any DNA testing (and that request would be respected).</p>

<p>Nonetheless, the Court also found that the &quot;consent form&quot; had been ambiguous in the first place as to whether it actually covered the situation in question. There was at least an argument that all the form had done was convey Donor A&#39;s wish not to be informed if she herself had &quot;a previously unsuspected genetic disease, or that (she was) a carrier of a harmful inherited condition&quot;, rather than whether any child subsequently born as a result of her donation had such a condition.&nbsp;</p>

<p>Probably the key lesson for clinics, clinicians and donors, therefore, is to ensure that terms of any agreements or expression of wishes, made at the time of donation, are as carefully drafted, and carefully read, as possible.&nbsp;</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ACZRWAAA7AA7YAAAAAAAB6AB7QAP777774AAASIA5AB7IAIAAA.jpg" length="80228" />
    </item>
    <item>
      <title><![CDATA[Data breach crisis in central government, time for ICO to act?]]></title>
      <link>https://www.mishcon.com/news/data-breach-crisis-in-central-government-time-for-ico-to-act</link>
      <guid>https://www.mishcon.com/news/data-breach-crisis-in-central-government-time-for-ico-to-act</guid>
      <description><![CDATA[Official figures from the Information Commissioner's Office suggest that there was an 8000% increase in the number of people affected by financial data breaches in central government between 2019 and 2023.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Fri, 12 Apr 2024 12:42:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>Official figures from the Information Commissioner&#39;s Office suggest that there was an 8000% increase in the number of people affected by financial data breaches in central government between 2019 and 2023.</p>

<p>There are estimated to be around 67.5 million people in the United Kingdom. Each of those is a data subject under our data protection laws. Yet in 2023 alone &ndash; according to the <a href="https://ico.org.uk/about-the-ico/our-information/disclosure-log/ic-287363-p8d9/" >Information Commissioner&#39;s Office&#39;s statistics</a> &ndash; there were approximately 195 million data subjects whose rights and freedoms were put at a likely risk by breaches of data security in central government, in relation to &ldquo;economic or financial data&rdquo;. This means that, in a single calendar year, the rights and freedoms of every single person in the country were put at likely risk almost three times, on average, by a Government breach of data security.</p>

<p>It is possible that some of the 195 million were outside the UK, or that some people were less affected (or not affected at all), and some were more affected. It&#39;s also important to note this finding only relates to &quot;economic and financial data&quot;, so full figures for all personal data will be much higher.</p>

<h2>A crisis in data security in central government?</h2>

<p>The figures derive from reports of &#39;personal data breaches&#39; (PDBs) made under <a href="https://www.mishcon.com/uk-gdpr/article-33" >Article 33 of the UK GDPR</a> to the Information Commissioner&rsquo;s Office (ICO). A PDB is defined as &quot;a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed&quot;.</p>

<p>There was for a while a problem with &#39;over-reporting&#39; of PDBs, but the ICO has been <a href="https://informationrightsandwrongs.com/2020/06/04/ico-report-a-databreach-to-us-and-we-might-take-action-against-you/" >very vocal in discouraging such over-reporting</a> in recent years, so - all things being equal - one might actually have expected a drop in figures. It should also be noted that when a data controller makes a report under Article 33, estimates of people affected are rarely going to be 100% accurate.&nbsp;</p>

<p>Not every PDB indicates a serious failure warranting enforcement action, and some will end up being &#39;near misses&#39;. However, what the figures do unequivocally show is a massive increase in the numbers potentially affected between 2019 and 2023 (from 2.4 million in 2019) with a notable upswing between 2022 and 2023 (from 70 million to 195 million).</p>

<p>There have certainly been several damaging data security breaches in recent months. Examples such as the <a href="https://www.electoralcommission.org.uk/privacy-policy/public-notification-cyber-attack-electoral-commission-systems%5d" >compromise of the England and Wales electoral register</a> as well as ransomware incidents involving the <a href="https://www.bl.uk/cyber-incident/" >British Library</a> and a number of other UK public authorities may have upped the figures, but not all of those will have involved economic and financial data, and it is not immediately obvious how they could be categorised as &#39;central government&#39;.</p>

<h2>The ICO&#39;s response</h2>

<p>The Information Commissioner John Edwards was <a href="https://www.linkedin.com/posts/odiakagan_dataprivacy-dataprotection-privacyfomo-activity-7181651002238849026-wIia?utm_source=share&amp;utm_medium=member_ios" >only recently reported</a> as saying that his policy of not fining the public sector but instead issuing non-binding reprimands was <em>&ldquo;very effective,&nbsp;especially in the public sector where reputation is worth more than the purse&rdquo;</em>. The evidence in fact points rather starkly the opposite way. Since his softer-touch approach for public authorities was adopted, it appears that data security failings at least in central government have skyrocketed.</p>

<p>It is important to note that the softer-touch approach was<a href="https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2022/06/open-letter-from-uk-information-commissioner-john-edwards-to-public-authorities" > introduced as a &ldquo;trial&rdquo;</a>, and Mr Edwards did say <em>&ldquo;if I&nbsp;do not see the improvements that I hope to see, I will look again&rdquo;</em>. However, the trial is soon to end (assuming it is still a two-year trial, as announced in June 2022) and data security failings in central government appear to be on the rise. It is true that the numbers of PDBs and people affected do not necessarily indicate a failure of the trial, but - as yet - there appears to be very little indication of what evidence or metrics <em>will </em>be used to gauge success or failure.</p>

<p>The ICO was asked to comment but did not state whether the increase in central government data breaches required action. It responded: <em>&quot;We are continuously engaging and working with government departments to remind them of their legal obligations, and offer guidance and advice with the aim of improving practices. Over the past two years, we&rsquo;ve also taken formal action against a number of central government departments, using the full range of our regulatory powers to uphold people&rsquo;s information rights&hellip;We can confirm there will be a review [of the revised approach to public sector enforcement, after the two year trial]&quot;</em>.</p>

<h2>The issue of transparency</h2>

<p>These figures are buried away in a freedom of information disclosure by the ICO: they were not proactively published, there appears to be no explanation for the scale of the issue, and nor does there seem to be any transparency within central government about how such security issues are happening and what is being done about them.</p>

<p>Regardless, the evidence points to a pressing need for government to get its house in order, and for the ICO to take a fresh look at whether there is a need for more robust enforcement in the public sector.</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ABBDGAAA7AA7YAAAAAAAB6AB7QAP777774AAA5QAXQHF4BYAAE.jpg" length="29490" />
    </item>
    <item>
      <title><![CDATA[New keeling schedules published for Data Protection Bill]]></title>
      <link>https://www.mishcon.com/news/new-keeling-schedules-published-for-data-protection-bill</link>
      <guid>https://www.mishcon.com/news/new-keeling-schedules-published-for-data-protection-bill</guid>
      <description><![CDATA[Although the jury is out on whether, or when, the Data Protection and Digital Information Bill (DPDI Bill) gets passed, all lawyers and practitioners should be preparing themselves for the eventuality.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Wed, 10 Apr 2024 14:59:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>Although the jury is out on whether, or when, the Data Protection and Digital Information Bill (DPDI Bill) gets passed, all lawyers and practitioners should be preparing themselves for the eventuality. The Government&rsquo;s &ldquo;Keeling Schedules&rdquo; will help keep track of the changes the Bill would make to existing laws.</p>

<p>In 1938, Sir Edward Herbert Keeling complained in Parliament about the difficulties of understanding the impact of draft laws which would have the effect of amending existing laws. In response, the Prime Minister, Neville Chamberlain, announced that, in appropriate cases,</p>

<p><em>&ldquo;a Bill amending or applying an existing enactment by reference should contain a Schedule setting out the enactment as it will read when amended by the Bill and showing by typographical devices the Amendments proposed&rdquo;.</em></p>

<p>These are what came to be known as &ldquo;Keeling Schedules&rdquo;, and when they are made publicly available, they can be an invaluable reference tool to understand potential legislative changes.</p>

<p>The Government published Keeling Schedules in May 2023 on an earlier iteration of the DPDI Bill, but has since published <a href="https://depositedpapers.parliament.uk/depositedpaper/2286275/files#collapse-details" >updated versions</a>.</p>

<p>They illustrate how, respectively, the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003 would be affected by the DPDI Bill.</p>

<p>It&rsquo;s important to note, of course, that until the DPDI Bill gets enacted, none of these changes will happen, and also that the Bill will almost inevitably emerge from the current committee stage in the House of Lords with further adjustments. It is an ever-changing situation, but at Mishcon de Reya we will aim to continue to provide updates.</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/AAJDEAAA7AA7YAAAAAAAB6AB7QAP777777AACAAA7IG72BQAAE.jpg" length="80455" />
    </item>
    <item>
      <title><![CDATA[Anonymisation in the life sciences sector: challenges and legal considerations]]></title>
      <link>https://www.mishcon.com/news/anonymisation-in-the-life-sciences-sector-challenges-and-legal-considerations</link>
      <guid>https://www.mishcon.com/news/anonymisation-in-the-life-sciences-sector-challenges-and-legal-considerations</guid>
      <description><![CDATA[Anonymisation is a critical process for life sciences organisations to protect individuals' privacy while handling personal data.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Fri, 22 Mar 2024 09:54:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>Anonymisation is a critical process for life sciences organisations to protect individuals&#39; privacy while handling personal data. However, the specificity and detail of the personal data used in this sector poses significant challenges to effective anonymisation, sparking extensive academic and judicial debate.</p>

<p>This article delves into the concept of &quot;anonymised&quot; data within the data protection framework, addresses the complexities of the topic, and offers recommendations for organisations in the life sciences field.</p>

<h2><strong>The challenge for the life sciences sector</strong></h2>

<p>Life sciences organisations may collect personal data from many sources. They might collect human samples from sponsored studies, receive datasets that have already been processed outside of the organisation, or work with public data.</p>

<p>In many cases it may be possible to combine data held within the organisation, or that is easily accessible from collaborators, to identify an individual from the data. To avoid some of the obligations that come from processing personal data, many organisations would like to be able to treat data that they work with, and disclose to third parties, as anonymised data.</p>

<h2><strong>The legal background</strong></h2>

<p>Anonymous data falls outside the scope of EU and UK data protection laws, namely the GDPR and UK GDPR, which govern the processing of &quot;personal data.&quot; Personal data is defined as information relating to an identifiable person.</p>

<p>Pseudonymous data, by contrast, is still personal data, and subject to data protection laws. The GDPR defines &quot;pseudonymisation&quot; as processing data such that it cannot be attributed to a specific individual without additional information, which must be kept separate and protected.</p>

<p>The crux of determining whether information is anonymous involves assessing if a living individual can be identified using &quot;<em>all the means reasonably likely to be used</em>.&quot; This includes considering the costs, time, and available technology for identification. The test is binary: the question is whether an individual can be identified using reasonably likely means, not how likely identification is.</p>

<p>One interesting issue that can arise is that the same dataset can be considered personal data or anonymised data depending on who holds it. For instance, if Person A has a pseudonymised dataset and a key to identify individuals, it constitutes personal data for them. However, if Person B holds the same dataset without the key, it constitutes anonymised data for them.</p>
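<p>As an illustrative sketch only (the names, records and tokens below are invented, and this is not a statement of any required method), the Person A/Person B situation can be modelled by keeping the re-identification key strictly separate from the shared dataset:</p>

```python
import secrets

# Invented example records; the names and fields are purely illustrative.
records = [
    {"name": "Alice Smith", "condition": "asthma"},
    {"name": "Bob Jones", "condition": "diabetes"},
]

key = {}      # token -> identity: held only by Person A
dataset = []  # shared with Person B, who never receives the key

for record in records:
    token = secrets.token_hex(8)  # random pseudonym, not derivable from the name
    key[token] = record["name"]
    dataset.append({"id": token, "condition": record["condition"]})

# Person A, holding the key, can re-identify any record in the dataset.
first_id = dataset[0]["id"]
assert key[first_id] == "Alice Smith"

# Person B holds only `dataset`: the random tokens carry no information
# about identity, so in Person B's hands the data may be anonymised.
```

<p>Whether the data really is anonymised in Person B&#39;s hands still depends on the wider circumstances, in particular what other information Person B can reasonably access, which is ultimately a legal question rather than a purely technical one.</p>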

<h2><strong>Issues arising</strong></h2>

<p>In most circumstances, the complexities are unlikely to present an issue, but it is necessary to draw attention to them, if only to be aware that this is not always a straightforward area of law.</p>

<p>For example, anomalies within a dataset such as a patient in an age demographic of 30 - 40 with dementia are valuable data points and cannot easily be excluded from the dataset without impacting the effectiveness of the research. However, it is much easier to re-identify outliers from limited data compared to routine cases.</p>

<h2><strong>Regulatory guidance in the UK</strong></h2>

<p>The UK&#39;s Information Commissioner&#39;s Office (ICO) produced its <a href="https://ico.org.uk/media/1061/anonymisation-code.pdf" >Anonymisation Code of Practice</a> in 2014, which remains relevant despite being based on the pre-GDPR regime. In May 2021, however, the ICO began drafting new guidance on anonymisation, pseudonymisation and privacy enhancing technologies (PETs), and held a <a href="https://ico.org.uk/about-the-ico/ico-and-stakeholder-consultations/ico-call-for-views-anonymisation-pseudonymisation-and-privacy-enhancing-technologies-guidance/" >consultation</a> that closed in December 2022. However, development of the aspects of the guidance relating to anonymisation and pseudonymisation is on hold pending the passage of the Data Protection and Digital Information Bill.</p>

<h2><strong>Risk-based steps, measures, and mitigations</strong></h2>

<p>Organisations can take several steps to mitigate the risks of identification and to comply with regulatory requirements:</p>

<ol>
	<li><strong>Reduce identifiability</strong>: Aim to make identification as remote as possible, using the ICO&#39;s &quot;motivated intruder&quot; test as a benchmark. Such an intruder is described as &quot;a person who starts without any prior knowledge but who wishes to identify the individual from whose personal data the anonymised data has been derived&quot;.</li>
	<li><strong>Technical and organisational measures</strong>: Implement measures such as access controls, secure data transfer methods, and encryption to reduce the risk of identification.</li>
	<li><strong>Privacy-enhancing technologies (PETs)</strong>: The <a href="https://ico.org.uk/media/for-organisations/uk-gdpr-guidance-and-resources/data-sharing/privacy-enhancing-technologies-1-0.pdf" >ICO&#39;s June 2023 guidance on PETs</a> includes measures such as differential privacy, synthetic data, encryption, and trusted execution environments.</li>
	<li><strong>Risk assessments</strong>: Even when the processing involved might not strictly mandate Data Protection Impact Assessments (DPIAs), it would be sensible to undertake them. A DPIA brings potentially at least three benefits: it informs decision-making; it helps mitigate risk; and it serves as documentation to rely on in the event of any subsequent complaints or challenges. Again, the <a href="https://ico.org.uk/media/for-organisations/uk-gdpr-guidance-and-resources/accountability-and-governance/data-protection-impact-assessments-dpias-1-0.pdf" >ICO has guidance</a> on undertaking DPIAs.</li>
	<li><strong>Person A/Person B Approach</strong>: This approach means that only one party, such as the collaborator, holds the key and the other party, such as the sponsor, does not. In these circumstances the collaborator would hold pseudonymised data, but it would be anonymised in the sponsor&#39;s hands.</li>
</ol>
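<p>The &quot;reduce identifiability&quot; step above is often approximated in practice with checks in the spirit of k-anonymity: counting how many records share each combination of quasi-identifiers, since small groups (like the outlier example earlier) are the easiest for a motivated intruder to re-identify. The following Python sketch uses invented values and is illustrative only, not a method prescribed by the ICO:</p>

```python
from collections import Counter

# Invented records: (age band, sex, condition). The lone 30-40 male
# record is an outlier, mirroring the dementia example above.
rows = [
    ("30-40", "F", "asthma"),
    ("30-40", "F", "asthma"),
    ("30-40", "F", "asthma"),
    ("30-40", "M", "dementia"),
]

# Count how many records share each quasi-identifier combination,
# then flag any group smaller than k as a re-identification risk.
k = 2
groups = Counter((age, sex) for age, sex, _condition in rows)
risky = [combo for combo, count in groups.items() if count < k]

# The unique ("30-40", "M") record is flagged as risky.
```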

<p>Mishcon de Reya, with its deep-rooted expertise in working with life sciences organisations, can assist in reviewing and negotiating agreements to ensure responsibilities for data protection are appropriately allocated.</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ADXCCAAA7AA7YAAAAAAAB6AB7QAP777774AAB7IDSEE4SBAAAA.jpg" length="155287" />
    </item>
    <item>
      <title><![CDATA[Princess of Wales hospital privacy breach claims: Jon Baines comments]]></title>
      <link>https://www.mishcon.com/news/princess-of-wales-hospital-privacy-breach-claims-jon-baines-comments</link>
      <guid>https://www.mishcon.com/news/princess-of-wales-hospital-privacy-breach-claims-jon-baines-comments</guid>
      <description><![CDATA[Following media reports that the ICO is assessing reports that patient notes of The Princess of Wales were inappropriately accessed by staff at the "London Clinic", Jon Baines, Senior Data Protection Specialist at Mishcon de Reya, commented.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Thu, 21 Mar 2024 10:34:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>Following media reports that the Information Commissioner&#39;s Office (ICO) is assessing reports that patient notes of The Princess of Wales were inappropriately accessed by staff at the &quot;London Clinic&quot;, <a href="https://www.mishcon.com/people/jon-baines">Jon Baines</a>, Senior Data Protection Specialist at Mishcon de Reya, commented:</p>

<p><em>&quot;Any investigation by the ICO is likely to consider whether a criminal offence might have been committed by an individual or individuals. Section 170 of the Data Protection Act 2018 says that a person commits an offence if they obtain or disclose personal data &quot;without the consent of the controller&quot;. Here, the &quot;controller&quot; will be the clinic itself. The ICO themselves have the power to bring prosecutions.</em></p>

<p><em>&quot;Although there are defences available to someone charged with the offence - such as that they reasonably believed they had the right to &quot;obtain&quot; the personal data, or on grounds of public interest - such defences are unlikely to apply where someone knowingly accesses patient notes for no valid or justifiable reason.</em></p>

<p><em>&quot;The section 170 offence is (in England and Wales) a &quot;recordable offence&quot; (one where the police may keep a record of a conviction on the police national computer), in contrast to the equivalent offence under the prior Data Protection Act. However, it remains an offence only punishable by a fine. In England and Wales, although the maximum fine is unlimited, there is no possibility of any custodial sentence. Recent prosecutions by the ICO under section 170 have seen a council officer fined for unlawfully accessing social services records, and a tracing agent fined for illegally obtaining personal information to check if customers of a high street bank could repay their debts.</em></p>

<p><em>&quot;A further area of potential investigation for the ICO will be whether the clinic itself complied with its obligations under the UK GDPR to have &quot;appropriate technical or organisational measures&quot; in place to keep personal data secure. Serious failures to comply with that obligation could lead to civil monetary penalties from the ICO, to a maximum of &pound;17.5m although, in reality, given that such civil &quot;fines&quot; must be proportionate, it is rare that such large sums are even considered by the ICO.</em></p>

<p><em>&quot;Individuals, such as - in this case - The Princess of Wales, can also bring claims for compensation under the UK GDPR, and for &quot;misuse of private information&quot;, where their data protection and privacy rights have been infringed.</em></p>

<p><em>&quot;Whatever the outcome from the ICO, anyone working in an environment where they might have access to personal data, particularly of a sensitive nature, should be aware that there are potential criminal law implications arising from unauthorised access, and any organisation holding such information should ensure it has appropriate measures in place to prevent, or at least reduce the risk, of such access.&quot;</em></p>

<h3>Related coverage</h3>

<p><a href="https://url.uk.m.mimecastprotect.com/s/h-QdCMjYwcWvEQmskT0QS?domain=dailystar.co.uk" >Daily Star</a><br />
<a href="https://url.uk.m.mimecastprotect.com/s/JQ5ECNxZLcXwv2yTjHnSU?domain=thisismoney.co.uk" >Daily Mail</a><br />
<a href="https://url.uk.m.mimecastprotect.com/s/1lpRCP127TMJy7VF6d4a8?domain=forbes.com" >Forbes</a><br />
<a href="https://www.telegraph.co.uk/royal-family/2024/03/20/three-london-clinic-staff-investigated-princess-of-wales/" >Daily Telegraph</a><br />
<a href="https://url.uk.m.mimecastprotect.com/s/aoSxCLgX7i4lEJ8uBz03V?domain=itv.com" >ITV News</a><br />
<a href="https://url.uk.m.mimecastprotect.com/s/ZUrmCO71Ms4Eq6QuriQPE?domain=mirror.co.uk" >The Mirror</a><br />
<a href="https://www.newsweek.com/kate-middleton-sue-medical-records-breach-hospital-1881676" >Newsweek</a><br />
<a href="https://stylecaster.com/entertainment/celebrity-news/1742619/kate-middleton-hospital-security-breach/" >stylecaster</a></p>

]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ABICWAAA7AA7YAAAAAAAB6AB7QAP777777AACAAA7IG72BQAAE.jpg" length="40981" />
    </item>
    <item>
      <title><![CDATA[The Princess of Wales and possible data protection offences and infringements]]></title>
      <link>https://www.mishcon.com/news/the-princess-of-wales-and-possible-data-protection-offences-and-infringements</link>
      <guid>https://www.mishcon.com/news/the-princess-of-wales-and-possible-data-protection-offences-and-infringements</guid>
      <description><![CDATA[Media outlets, including the BBC, have indicated that the Information Commissioner's Office (ICO) is assessing reports that patient notes of the Princess of Wales were inappropriately accessed by staff at the "London Clinic".]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Wed, 20 Mar 2024 14:24:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>Media outlets, including the <a href="https://www.bbc.co.uk/news/uk-68613057" >BBC</a>, have indicated that the Information Commissioner&#39;s Office (ICO) is assessing reports that patient notes of The Princess of Wales were inappropriately accessed by staff at the &quot;London Clinic&quot;.</p>

<p>According to the reports at least one member of staff at the clinic, where The Princess of Wales recently underwent abdominal surgery, &quot;was said to have been caught trying to access&quot; the notes.</p>

<p>Any investigation by the ICO is likely to consider whether a criminal offence might have been committed by an individual or individuals. <a href="https://www.legislation.gov.uk/ukpga/2018/12/section/170/enacted" >Section 170 of the Data Protection Act 2018 </a>says that a person commits an offence if they obtain or disclose personal data &quot;without the consent of the controller&quot;. Here, the &quot;controller&quot; will be the clinic itself. The ICO themselves have the power to bring prosecutions.</p>

<p>Although there are defences available to someone charged with the offence - such as that they reasonably believed they had the right to &quot;obtain&quot; the personal data, or on grounds of public interest - such defences are unlikely to apply where someone knowingly accesses patient notes for no valid or justifiable reason.</p>

<p>The section 170 offence is (in England and Wales) a &quot;recordable offence&quot; (one where the police may keep a record of a conviction on the police national computer), in contrast to the equivalent offence under the prior Data Protection Act. However, it remains an offence only punishable by a fine. In England and Wales, although the maximum fine is unlimited, there is no possibility of any custodial sentence. Recent prosecutions by the ICO under section 170 have seen a council officer fined for unlawfully accessing social services records, and a tracing agent fined for illegally obtaining personal information to check if customers of a high street bank could repay their debts.</p>

<p>A further area of potential investigation for the ICO will be whether the clinic itself complied with its obligations under the <a href="https://www.mishcon.com/uk-gdpr">UK GDPR </a>to have &quot;appropriate technical or organisational measures&quot; in place to keep personal data secure (Article 5(1)(f)). Serious failures to comply with that obligation could lead to civil monetary penalties from the ICO, to a maximum of &pound;17.5m (although, in reality, given that such civil &quot;fines&quot; must be proportionate, it is rare that such large sums are even considered by the ICO).</p>

<p>Individuals, such as - in this case - the princess, can also bring claims for compensation under the UK GDPR, and for &quot;misuse of private information&quot;, where their data protection and privacy rights have been infringed.</p>

<p>Whatever the outcome from the ICO, anyone working in an environment where they might have access to personal data, particularly of a sensitive nature, should be aware that there are potential criminal law implications arising from unauthorised access, and any organisation holding such information should ensure it has appropriate measures in place to prevent, or at least reduce the risk, of such access.</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ABHSWAAA7AA7YAAAAAAAB6AB7QAP777774AABEQARAHEIBYAAE.jpg" length="123745" />
    </item>
    <item>
      <title><![CDATA[What can a business do if an employee steals its confidential information?]]></title>
      <link>https://www.mishcon.com/news/what-can-a-business-do-if-an-employee-steals-its-confidential-information</link>
      <guid>https://www.mishcon.com/news/what-can-a-business-do-if-an-employee-steals-its-confidential-information</guid>
      <description><![CDATA[Unlawful removal of confidential information can have serious consequences for a business, including significant financial loss and reputational damage.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Tue, 19 Mar 2024 10:08:00 GMT</pubDate>
<content:encoded><![CDATA[<p>Unlawful removal of confidential information can have serious consequences for a business, including significant financial loss and reputational damage. Where a senior employee removes confidential information and resigns to join a competitor, this is of even more concern. Below, we set out the key steps a business should take in such situations and consider some of the possible remedies that can be obtained.</p>

<ol>
	<li><strong>Investigate: </strong>Investigations should be started early as they can be complex and time intensive because of the large amounts of data involved or due to difficulties with getting access to relevant devices. Choose a team of trusted people to carry out the investigation to avoid tipping off potential co-conspirators. When reviewing large amounts of data, carry out a data protection impact assessment. &nbsp;</li>
	<li><strong>Consider other potential wrongdoing: </strong>Theft of confidential information may be part of a bigger problem, so don&#39;t make the focus of the investigation too narrow. Other unlawful conduct often also happens, and the employee may not be acting alone. For example, the employee may have been soliciting clients and/or staff, or evidence of a team move may come to light.</li>
	<li><strong>Stabilise the business:</strong> To prevent any further damage to the business, inform relevant senior leaders of the confidential information that has been taken and what the consequences could be for the business and the relevant stakeholders. Business leaders should have early conversations with senior people who are in a position of trust. Consider what actions are needed to stabilise the business, such as suspending or limiting staff access to systems and placing employees on garden leave if they have not already left employment.</li>
	<li><strong>Secure the confidential information: </strong>Problem employees still owe continuing obligations to their employers. These obligations include a mix of express and implied duties and in some cases include restrictive covenants and fiduciary duties. If the employee has already left the business, to secure the confidential information and flush out instances of wrongdoing, remind the employee of their continuing obligations and demand the return all confidential information. It may also be appropriate to request certain undertakings from them at this stage too. Carefully consider, however, whether putting the employee on notice of the findings will cause further harm to the business: will the employee destroy or tamper with the evidence? If so, consider other options instead such as applying to the court for relief and protection without giving the employee notice of the application.</li>
	<li><strong>Put the new employer on notice:</strong> If the employee has resigned to go to a competitor, it is important to ensure that the new employer is made aware of the employee&#39;s obligations to their current employer and of their wrongdoing. Businesses may also want to require a set of undertakings from the competitor, putting them under pressure to confirm that they will not act unlawfully or induce breaches of contract.</li>
	<li><strong>Engage external counsel and other advisers: </strong>Consider what outside support may be needed, such as forensic services providers, external lawyers and PR advisers. Engaging advisers at an early stage will help ensure the business has the best possible protection.</li>
</ol>

<p>The remedies in this context include:</p>

<ol>
	<li><strong>Court injunctions:</strong> In certain circumstances the court may grant an injunction, either on an interim or final basis. Appropriate injunctions include: immediate delivery up of the confidential information, an imaging order to copy the information held on electronic devices (which could include personal phones and laptops and personal email accounts), search orders (allowing entry and search of premises to seize evidence), freezing orders (to freeze assets) or springboard relief (used to tackle any head start gained).</li>
	<li><strong>Financial remedies: </strong>Another option is a claim for damages to put the business in the position it would have been in had it not been for the breach. In certain less common situations, the company may have a claim for an account of profits.</li>
	<li><strong>Criminal liability:&nbsp;</strong>The employee&#39;s actions may also amount to a criminal offence, for example under data protection law or the Computer Misuse Act 1990. While financial remedies and securing stolen data is not the focus of criminal actions in the same way as civil actions, the threat of criminal liability may persuade the employee to take matters seriously.</li>
</ol>

<p>Mishcon&#39;s High Court team has extensive and market leading experience advising on some of the most prominent, high value and complex commercial employment disputes. We help employers recover stolen confidential information and trade secrets, prevent the publication, dissemination or use of such information by competitors and former employees, and obtain other appropriate remedies. If you have any questions, contact our experts at <a href="mailto:mdremploymenthighcourt@mishcon.com">mdremploymenthighcourt@mishcon.com</a>.</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/AD5SQAAA7AA7YAAAAAAAB6AB7QAP777774AAAHQBRYG4OBQAAA.jpg" length="40958" />
    </item>
    <item>
      <title><![CDATA[AI in the workplace: Data protection issues]]></title>
      <link>https://www.mishcon.com/news/use-of-ai-by-employers-data-protection-issues</link>
      <guid>https://www.mishcon.com/news/use-of-ai-by-employers-data-protection-issues</guid>
      <description><![CDATA[While the use of AI systems in recruitment and during employment continues to grow, it is essential for both creators and for employers, as users of AI systems, to carefully consider what types of data they will be handling and collecting, and the key data protection requirements.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Wed, 13 Mar 2024 12:33:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>While the use of AI systems in recruitment and during employment continues to grow, it is essential for both creators and for employers, as users of AI systems, to carefully consider what types of data they will be handling and collecting, and the key data protection requirements. &nbsp;</p>

<p>This article introduces the data protection framework that surrounds the use of AI platforms. We summarise the key legal considerations that employers should be aware of when using AI technologies in recruitment and employment.</p>

<h2>Controller vs Processor?</h2>

<p>When an organisation decides to process personal data for any activity, the first thing it should consider is whether it is a controller or processor of that data. If the organisation decides the purpose and means of processing data (i.e. what personal data is processed, and why &ndash; for example an employer obtaining employee or candidate data), then it is likely to be a controller.</p>

<p>If the organisation provides a service to a third party and the third party decides what data is to be processed and why, or the organisation is processing the personal data on the third party&#39;s instructions &ndash; for example a company that handles payroll administration for another employer - the organisation is likely to be a processor.</p>

<p>This role-based classification is important as the obligations on an employer depend on whether it is a controller or a processor.</p>

<p>As a controller, the employer is required under the UK GDPR to provide specific information (usually in the form of what is termed a &quot;privacy notice&quot;) to employees and job candidates etc. This privacy notice should set out key information about the processing activity, such as how the personal data is going to be used, the lawful basis relied on by the employer, and the rights of employees in this regard. For example, if an employer uses an AI system to select candidates in a recruitment process, the employer should set this out in its privacy notice together with the associated lawful basis. &nbsp;&nbsp;&nbsp;</p>

<p>Another key requirement for controllers considering the use of AI systems is to assess the risks associated with the use of an AI system before engaging in the activity - this is likely to constitute a DPIA as outlined below. &nbsp;</p>

<h2>Data Protection Impact Assessment (DPIA)</h2>

<p>When an organisation decides to carry out a &quot;high-risk&quot; processing activity using personal data, it is required to assess the risks associated with the activity by carrying out a DPIA. In an employment context potential &quot;high-risk&quot; activities using personal data include using AI platforms in recruitment, making employment decisions on task allocation, promotion and termination, and monitoring or evaluating employees.</p>

<p>Use of AI platforms for activities such as candidate selection or reviewing employee performance is likely to be a &quot;high-risk&quot; activity because it involves what the Information Commissioner has classed as &quot;innovative technology&quot; and, furthermore, these processes can have a significant impact on candidates and employees.</p>

<p>Some common risks associated with the use of AI based technologies include:</p>

<ul>
	<li>inherent inbuilt bias in the AI platform;</li>
	<li>lack of transparency;</li>
	<li>unfair decision making; and</li>
	<li>accessing personal data without the knowledge or consent of individuals (also known as data scraping).</li>
</ul>

<p>Consequently, when using AI-based technologies, employers should be aware of their data protection obligations. For instance, in addition to providing the usual information necessary to comply with the UK GDPR, transparency requires employers to inform employees when they are using AI systems to handle their personal data. &nbsp;</p>

<p>Where use of AI involves automated decision-making about individuals, they also have the right under the UK GDPR to receive meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing. As a responsible operator of an AI system, an employer must be able to explain to its staff how its system works and how it reaches the decisions it does, in a way that a typical member of the public can understand.</p>

<p>To the extent that employers may be operating in the EU, or otherwise affected by extra-territorial provisions, these overarching principles are also echoed in the <a href="https://www.mishcon.com/news/the-eu-ai-act-is-one-step-closer-to-becoming-law">EU&#39;s incoming AI Act</a>.</p>

<h2>Data Subject Access Requests (DSARs)</h2>

<p>Explainability requirements are particularly important because employees, candidates and other individuals have the right under the UK GDPR to make a DSAR. This is a formal request made by an individual to an organisation, seeking information about and access to the personal data that the organisation holds about them. This helps individuals be aware of and verify the lawfulness of the processing of their personal data.</p>

<p>It is therefore important for creators of AI systems to consider how to develop the AI system to comply with the DSAR right, and for employers as users of an AI system to consider how well the system can respond to these requests.</p>

<h2>Practical Measures</h2>

<p>More broadly, creators and employers using AI systems should ensure the following practical measures are implemented where appropriate. This will help ensure compliance with the data protection framework applicable to the use of AI platforms, and also help manage potential risks.</p>

<ul>
	<li>Be clear and up front with employees about how and why you are using data (in your privacy notices and relevant policies).</li>
	<li>If the employer is scraping data to train its AI model (such as extracting information from a website), it will need to complete a DPIA (and there may be legal implications beyond just data protection law).</li>
	<li>Be prepared to explain how your AI model works. You should consider this a mandatory requirement if you use an AI system in a recruitment or employment context.</li>
	<li>Build the AI model so that a human is involved in the decision-making process.</li>
	<li>If relevant, expect questions from investors, and others, around where you acquired your data, and be able to confirm that data was collected lawfully.</li>
</ul>

<p>You can achieve this by using factsheets (a collection of information about how an AI model was developed and deployed), DPIAs (describing the capabilities and limitations of the system) and conformity assessments (a demonstration that the AI system meets legal and regulatory requirements).</p>

<h2>AI v Data Protection Compliance</h2>

<p>The use of AI is increasingly coming under the regulatory spotlight, and in the UK the Information Commissioner&#39;s Office has<a href="https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2024/01/information-commissioner-s-office-launches-consultation-series-on-generative-ai/"> launched the first of a series of consultations on generative AI</a>, &quot;examining how aspects of data protection law should apply to the development and use of the technology&quot;. It will be essential for employers to keep up to date not just with technological and legal developments in this area, but also with developments in regulatory approach and risk.</p>

<p>The effective use of AI in the employment context requires a comprehensive understanding of data protection laws. As AI continues to evolve, staying on top of the legal obligations in this area is crucial for both the creators of AI systems and the employers that use them. This helps not only with regulatory compliance but also fosters trust and transparency in AI technologies.</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ACWCUAAA7AA7YAAAAAAAB6AB7QAP777774AABAYEAUEYGBAAAA.jpg" length="102315" />
    </item>
    <item>
      <title><![CDATA[CJEU rules in relation to personal data and advertising]]></title>
      <link>https://www.mishcon.com/news/cjeu-rules-in-relation-to-personal-data-and-advertising</link>
      <guid>https://www.mishcon.com/news/cjeu-rules-in-relation-to-personal-data-and-advertising</guid>
      <description><![CDATA[On 7 March 2024 the Court of Justice of the European Union (CJEU) issued its judgment in response to two questions that were referred to it by the Belgian Court of Appeal, about protocols related to what is known as “Real-Time Bidding” (RTB) for targeted online advertising.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Fri, 08 Mar 2024 14:35:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>On 7 March 2024 the Court of Justice of the European Union (CJEU) issued its&nbsp;judgment in response to two questions that were referred to it by the Belgian Court of Appeal, about protocols related to what is known as &ldquo;Real-Time Bidding&rdquo; (RTB) for targeted online advertising.<br />
<br />
The questions stem from an appeal by IAB Europe in relation to the Belgian Data Protection Authority&#39;s (DPA) assessment of IAB Europe&rsquo;s status as a data controller under the EU GDPR. The DPA&#39;s litigation department had issued corrective measures and imposed an administrative fine against IAB Europe following a series of complaints received since 2019.<br />
<br />
The judgment provides clarity on how tracking numbers referred to as the &ldquo;TC String&rdquo; might be categorised as &ldquo;personal data&rdquo;, and the situations in which a sectoral organisation might be a controller or joint controller of data under the EU GDPR.<br />
<br />
Most people will be familiar with the cookie pop-ups on websites that request consent for advertising purposes. RTB is the most common system for placing adverts on websites. The RTB system involves advertisers in a variety of industries live bidding during an individual&#39;s use of a website for the advertising on the site. Under the EU GDPR, RTB had become difficult to operate as the tracking and targeting of individuals required prior consent which was difficult to achieve in a compliant manner, particularly as the concept of &quot;consent&quot; has developed and the penalties for non-compliance have increased under EU GDPR.</p>

<p>IAB Europe is a European advertising group, which developed, implemented and promoted a framework and platform allowing for the collection of consent, to enable the RTB system. The framework involves the large-scale processing of data and the sharing with advertisers and data brokers of a string of character-based code to track the individual&#39;s consent and interests for targeted advertising. This code, referred to as the &ldquo;TC String&rdquo;, facilitates the RTB process by interacting with a cookie placed on the individual&#39;s browser, to identify whether they have consented to the processing of their information for the purpose of targeted advertising. The cookie is also connected to the IP address of the individual.</p>

<p>The data used for the targeted advertising included various data shared and collected by advertisers who rely on the &ldquo;legitimate interests&rdquo; lawful basis for processing contained within Article 6(1)(f) of the EU GDPR. The personal data included gender, age, information on the individual&#39;s location, and recent search and purchase history.</p>

<p>IAB Europe argued that it was&nbsp;not a controller as it&nbsp;did not process data itself, but only established the system and rules for processing.</p>

<p>The two questions referred to the CJEU can be distilled into the following: (i) was the TC String personal data, and (ii) was IAB Europe, as the implementer of the framework for the TC String, a &quot;controller&quot; for the purpose of the EU GDPR?</p>

<h2>TC String as Personal Data</h2>

<p>The DPA argued that the TC String is personal data according to the definition at Article 4(1) of the EU GDPR, on the basis that when linked with other data, the individual user to whom it relates becomes identifiable. The CJEU found that the scope of Article 4(1) was intentionally wide and, in line with existing case law on the matter, that where it is possible that the individual <em>could </em>be identified, the data is personal data. The court agreed with the view of the DPA and found that IAB Europe and its members had sufficient access to information which, combined with the TC String, would render the individual to whom it relates identifiable, meaning that the TC String was personal data within the scope of Article 4(1).</p>

<h2>IAB Europe as a Controller</h2>

<p>Under Article 4(7), a controller is an entity which, alone or jointly with others, determines the purposes and means of processing personal data. The CJEU recalled that an entity that exerts influence over the processing of personal data is also a controller, referencing by analogy <em>Jehovan todistajat</em> (C-25/17, 10 July 2018). The CJEU also noted that IAB Europe set out rules within the framework as to how the TC String containing the details of an individual&#39;s consent may be used, stored and shared, and would revoke access to the Strings should members fail to abide by the rules. IAB Europe was accordingly exerting influence over the processing of personal data. The court therefore considered that IAB Europe was a joint controller, despite the fact that the framework was organised such that IAB Europe would never have direct access to personal data beyond the TC Strings. However, the joint controllership does not extend to the subsequent processing carried out by third parties to target adverts based on individuals&#39; preferences.</p>

<p>This decision will have an impact on all digital processing in the EU, as it highlights just how broadly the protections under the GDPR are to be applied. It continues a trend of the CJEU being willing to find joint controllership. It may also mean that, in time, we see significant changes in how consent to data processing is obtained and shared within the digital advertising industry which, for consumers, may mean changes to cookie banners.</p>

<p>It&rsquo;s important to note that, post-Brexit, judgments of the CJEU in relation to the EU GDPR do not apply in the UK, let alone bind the domestic courts or the Information Commissioner&rsquo;s Office (though the UK courts may &#39;have regard&#39; to those decisions). So, although the CJEU&rsquo;s findings should be noted, especially by anyone who makes use of the TCF framework, it remains uncertain what the effect of the judgment will be in the UK.</p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ABVCOAAA7AA7YAAAAAAAB6AB7QAP777774AABNQBAAEAABAAAA.jpg" length="46767" />
    </item>
    <item>
      <title><![CDATA[UK Home Office data protection law breach: Jon Baines comments]]></title>
      <link>https://www.mishcon.com/news/uk-home-office-data-protection-law-breach-jon-baines-comments</link>
      <guid>https://www.mishcon.com/news/uk-home-office-data-protection-law-breach-jon-baines-comments</guid>
      <description><![CDATA[The Information Commissioner's Office (ICO) has found that a pilot scheme from the UK Home Office that placed ankle tags on migrants on immigration bail breached UK data protection law.]]></description>
      <author>feedback@mishcon.com (Mishcon De Reya)</author>
      <pubDate>Fri, 01 Mar 2024 14:15:00 GMT</pubDate>
      <content:encoded><![CDATA[<p>The Information Commissioner&#39;s Office (ICO) has found that a pilot scheme from the UK Home Office that placed ankle tags on migrants on immigration bail breached UK data protection law.</p>

<p>Senior Data Protection Specialist <a href="https://www.mishcon.com/people/jon-baines/">Jon Baines </a>provided comment on the decision, commending the ICO for their action, and has been quoted across various media publications.</p>

<p>Jon commented: <em>&ldquo;It is interesting - and encouraging in many ways - to see the ICO taking formal enforcement action in this area. Although data protection law gives rights to all individuals, there&rsquo;s a strong argument for saying that regulatory action should be focused in particular on respecting and enforcing the rights of the most vulnerable in society.</em></p>

<p><em>&ldquo;The ICO is making clear with this action that monitoring someone&rsquo;s movements 24/7 is incredibly intrusive, and in the context of affected asylum claimants, disproportionate and unlawful.</em></p>

<p><em>&ldquo;Recently, the ICO has been in the habit of issuing &lsquo;reprimands&rsquo; for serious infringements, but the problem with reprimands is that they do not compel recipients to do - or refrain from doing - anything. By contrast, an enforcement notice - like this one - can require a recipient to stop activities, and failure to comply can be treated as a contempt of court. It will be interesting, therefore, to see if the Home Office decides to appeal this notice.&rdquo;</em></p>

<h3>Related coverage</h3>

<p><a href="https://nationaltechnology.co.uk/Home_Office_Migrant_GPS_Tags_Broke_Data_Protection_Law.php" >National Technology News</a><br />
<a href="https://www.infosecurity-magazine.com/news/home-office-data-protection-migrant/" >Infosecurity Magazine</a></p>
]]></content:encoded>
      <category>Article</category>
      <enclosure type="image/jpeg" url="https://www.mishcon.com/assets/managed/images/cache/ADBSAAAA7AA7YAAAAAAAB6AB7QAP777774AAAVAA5AB7IAIAAA.jpg" length="45374" />
    </item>
  </channel>
</rss>