In brief
- Various aspects of data protection and eprivacy reforms take effect in the UK on 5 February 2026.
- Significant automated decision-making undergoes a major shift, from a “can’t, unless…” framework to a “can, so long as…” one.
- Charities’ ability to send electronic marketing to their supporters becomes much wider.
- Potential maximum fines for eprivacy breaches rise from £500,000 to £17.5m or 4% of global annual turnover.
- A new concept of “recognised legitimate interests” is introduced, which dispenses with the balancing test otherwise required for reliance on the existing legitimate interests lawful basis.
- With the making of the Data (Use and Access) Act 2025 (Commencement No. 6 and Transitional and Saving Provisions) Regulations 2026, the most significant data protection and eprivacy reform provisions since the UK left the EU have now finally been commenced.
Almost as soon as Brexit took effect, the then government introduced a National Data Strategy and proposed a “bold new data regime outside the EU”. Two failed Parliamentary Bills followed (the Data Protection and Digital Information Bills 1 and 2) before Parliament finally passed the Data (Use and Access) Act 2025 (DUAA) in June last year. The final version of the reform package was notably less bold than what earlier governments had proposed, but practitioners and advisers have still been waiting, with some anticipation, for the commencement of Part 5 of the DUAA, which deals primarily with amendments to the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003. With the Commencement No. 6 Regulations now made, that wait is over.
What changes are made?
Automated decision-making
Possibly the most significant change, given both the current state and likely future direction of technological advancement, is to Article 22 of the UK GDPR. Prior to amendment, Article 22 placed a general prohibition on decisions based solely on automated processing, including profiling, which produced legal effects concerning a data subject or similarly significantly affected them. This prohibition (cast as a data subject right) was subject to exceptions where that automated processing was necessary for the performance of a contract with the data subject, was required or authorised by law, or was based on the data subject’s explicit consent. These exceptions were narrow in effect.
Article 22 is now replaced by new Articles 22A to 22D, the effect of which is that it is only when automated decision-making (producing legal effects on a data subject or similarly significantly affecting them) involves special category data that the general prohibition, and exceptions, still apply. With “ordinary” personal data, meanwhile, automated decision-making can take place, provided specific safeguards are observed, such as: providing information about the decision-making; the ability for the data subject to make representations; the ability to obtain human intervention; and the ability to contest the decision made.
In many cases, provided they observe those safeguards, and provided that the rights, freedoms and interests of data subjects are not overriding, businesses will be able to rely on their (or others’) legitimate interests as a legal basis for automated decision-making. This change from a “can’t, unless…” to a “can, so long as…” scheme represents a potentially fundamental shift in how the law will treat such processing, including in the context of AI systems.
Charities and electronic direct marketing, and PECR fines
The law governing the sending of direct marketing to individuals by email and SMS was passed as far back as 2003 - the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR). PECR only allows direct electronic marketing to be sent with the recipient’s consent, but with an important exception for organisations that have sold products or services to the individuals to be marketed to (or entered into negotiations for such a sale): as long as those organisations provided information on opting out, they can send such electronic marketing without consent.
This exception has come to be called the “soft opt-in”. Because of the requirement for there to be a “sale” of a product or service, charities have not generally been able to avail themselves of it, until now. Regulation 22 of PECR has been amended to allow charities to send direct marketing to individuals where: the sole purpose of the marketing is to further one or more of the charity’s charitable purposes; the individual provided their details when expressing an interest in one or more of those purposes, or when offering or providing support for one or more of those purposes; and the appropriate opt-out information is provided at all stages.
This amendment is potentially very significant for the charity sector and its fundraising capabilities. But charities would be wise to step cautiously and take appropriate advice. Unlawful direct electronic marketing has long been a target for enforcement action by the Information Commissioner’s Office (ICO), which is also receiving increased enforcement powers in relation to PECR, including, notably, an increase in the maximum fine for PECR infringements from £500,000 to £17.5m or 4% of global annual turnover (whichever is higher) - the equivalent of the maximum fines under the UK GDPR and the Data Protection Act 2018.
“Recognised” legitimate interests
For processing of personal data to be lawful under the UK GDPR, it must, at least, have a lawful basis under Article 6(1). Perhaps the basis most commonly relied upon is that the processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party (Article 6(1)(f)), which applies unless those interests are overridden by the interests or fundamental rights and freedoms of the data subject. As the case law has established, determining whether this basis is available requires a careful, staged analysis of at least three questions: i) are legitimate interests being pursued (and by whom)? ii) is the processing necessary for the purposes of those interests? iii) are those interests outweighed by the data subject’s interests, rights or freedoms?
With the amendments to Article 6, Parliament has now introduced the concept of a “recognised” legitimate interest. Where such an interest applies, the third test above is effectively knocked away: Parliament has determined that certain processing activities are carried out for legitimate interests which need not be balanced against the data subject’s rights. The recognised legitimate interests include: disclosures to public bodies which need the data to perform their public tasks; processing necessary for national security, public security or defence; responding to emergencies; detecting, investigating or preventing crime; and safeguarding vulnerable individuals. Separately, the amended Article 6 confirms that processing for the purposes of direct marketing, intra-group transmission of personal data for internal administrative purposes, and ensuring the security of network and information systems may be carried out in reliance on the ordinary legitimate interests basis, subject to the usual balancing test.
It should be stressed, however, that the existence of a recognised legitimate interest does not exempt a controller from compliance with the other data protection principles and obligations.
Regulation-making powers
A notable feature of the DUAA – and indeed of much modern legislation – is that it confers on the Secretary of State the power to make further regulations. There are multiple instances of this in the Act; as one example, further “recognised legitimate interests” can be created by this route.
Other amendments
We will be providing further commentary in the coming weeks on other significant changes to the UK’s data protection and eprivacy regime. These include changes in the areas of scientific research, cookies, data subject complaints and rights, and reform of the Information Commissioner’s Office.