
Automated Decision-Making (ADM) in Law Enforcement


The changes made under the Data (Use and Access) Act (DUAA) 2025 (1) signal a seismic shift in the way law enforcement agencies must think about information governance. From HR departments to intelligence units, automated decision-making (ADM) now sits at the heart of legal compliance and ethical risk.


The amendments made by DUAA 2025 allow you to use people’s personal information to make significant automated decisions about them in more circumstances, so long as you continue to apply appropriate safeguards. It also introduces a new safeguard: a requirement to proactively reconsider a decision, with human involvement, where this is necessary for certain public interest reasons.


Why does this matter? Because for Data Protection Officers (DPOs) and operational leads, it’s no longer just about collecting data; it’s about understanding how decisions are made, challenged, and reconsidered in real time.

 

Comparative definitions of ADM

General processing updates to ADM: the human behind the machine

S80(1) of the DUAA replaces Article 22 of the UK General Data Protection Regulation (GDPR) (2) with new Articles 22A–22D, which update how ADM is handled.


Article 22A sets out the key terms: a decision is based solely on automated processing where there is no meaningful human involvement in taking it, and it is significant where it produces a legal effect for the data subject OR has a similarly significant effect on them.


Article 22B places restrictions on this processing, stating that an automated decision cannot be taken unless there is explicit consent from the data subject, the decision is necessary for entering into or performing a contract between the data subject and the controller, or it is required or authorised by law.


Article 22C stipulates the safeguards for automated decision-making: where a significant decision is taken about a data subject based solely on automated processing, the controller must ensure that safeguards are in place. These must consist of measures which keep the data subject informed about such decisions, enable them to make representations, enable them to obtain human intervention, and enable them to contest such decisions.


Article 22D builds on this, empowering the Secretary of State to make further provision via regulations.
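
For teams who want to turn this into a working checklist, the short Python sketch below walks through the same logic: is the decision solely automated, is it significant, does a permitted basis apply, and which Article 22C-style safeguards must then follow. It is a simplified illustration only; the field names and review model are assumptions, not statutory wording.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    meaningful_human_involvement: bool   # e.g. a reviewer who can change the outcome
    produces_legal_effect: bool          # e.g. refusing a benefit or contract
    similarly_significant_effect: bool   # e.g. materially affecting the person's circumstances
    lawful_basis: str                    # "explicit_consent", "contract" or "required_by_law"

def regime_engaged(d: Decision) -> bool:
    """A significant decision based solely on automated processing (Article 22A terms)."""
    solely_automated = not d.meaningful_human_involvement
    significant = d.produces_legal_effect or d.similarly_significant_effect
    return solely_automated and significant

def required_safeguards(d: Decision) -> list:
    """Safeguards the controller must ensure are in place (Article 22C-style)."""
    if not regime_engaged(d):
        return []
    if d.lawful_basis not in ("explicit_consent", "contract", "required_by_law"):
        raise ValueError("No permitted basis for this significant automated decision (Article 22B)")
    return [
        "keep the data subject informed about the decision",
        "enable them to make representations",
        "enable them to obtain human intervention",
        "enable them to contest the decision",
    ]

# Example: a fully automated shortlisting decision relying on explicit consent
print(required_safeguards(Decision(False, True, False, "explicit_consent")))
```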

 

Law enforcement processing updates to ADM: operational activity

S80(2) of DUAA 2025 amends the Data Protection Act (DPA) 2018 (3) by replacing Sections 49 and 50 with new Sections 50A–50D.


S50A defines automated decision-making for law enforcement as a decision taken with no meaningful human involvement that is significant because it produces an adverse legal effect for the data subject OR has a similarly significant adverse effect on them.


S50A(2) then stipulates that, when considering the extent to which there is meaningful human involvement in taking a decision, regard must be had to, among other things, the extent to which the decision is reached by means of profiling.


S50B places restrictions on automated decision-making based on sensitive processing, stating that an automated decision cannot be taken unless there is explicit consent from the data subject, or the decision is required or authorised by law.


S50C stipulates the safeguards for automated decision-making: where a significant decision is taken about a data subject based solely on automated processing, the controller must ensure that safeguards are in place. These must consist of measures which keep the data subject informed about such decisions, enable them to make representations, enable them to obtain human intervention, and enable them to contest such decisions.


These safeguards do not apply, however, where this is necessary for one of the reasons listed in S50C(4):

  • Avoiding obstruction of an official or legal inquiry, investigation or procedure

  • Protecting public safety

  • Protecting national security

  • Protecting the rights and freedoms of others


The safeguards also do not apply if the controller reconsiders the decision as soon as reasonably practicable, and there is meaningful human involvement in that reconsideration.
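
To show how this might look in practice, here is a minimal sketch of a case-management log that records reliance on one of the public interest reasons above and the follow-up human reconsideration. The reason codes, function names and log structure are illustrative assumptions, not a prescribed format.

```python
from datetime import datetime, timezone

# Public interest reasons mirroring the list above (codes are illustrative)
EXEMPTION_REASONS = {
    "avoid_obstruction_of_inquiry",
    "public_safety",
    "national_security",
    "rights_and_freedoms_of_others",
}

def record_exemption(case_log: list, reason: str) -> None:
    """Log the reason relied upon when the usual safeguards are not applied."""
    if reason not in EXEMPTION_REASONS:
        raise ValueError(f"Unrecognised exemption reason: {reason}")
    case_log.append({
        "event": "safeguards_withheld",
        "reason": reason,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

def record_reconsideration(case_log: list, reviewer: str, outcome: str) -> None:
    """Log the reconsideration, naming the human able to change the outcome."""
    case_log.append({
        "event": "decision_reconsidered",
        "reviewer": reviewer,   # evidence of meaningful human involvement
        "outcome": outcome,     # e.g. "upheld" or "overturned"
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

# Example usage
log = []
record_exemption(log, "public_safety")
record_reconsideration(log, reviewer="duty_inspector", outcome="upheld")
```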


S50D builds on this, allowing the Secretary of State to make further provision via regulations.

 

ADM in intelligence services

The DUAA also amends the definition of automated decision-making for the intelligence services, stipulating that, for the purposes of Sections 96 and 97 of the DPA 2018, a decision is based entirely on automated processing if the decision-making process does not include an opportunity for a human being to accept, reject or influence the decision.

 

Why these differences matter

  • Legal exposure: Misclassifying ADM under the wrong regime can lead to unlawful processing and regulatory scrutiny.

  • Operational clarity: Staff must know when a decision is automated and what rights individuals have.

  • Governance: DPOs must ensure that systems are configured to reflect the correct regime, especially in hybrid environments (e.g. HR + policing + intelligence).


Practical applications

HR systems (Part 2)

  • Scenario: Automated scoring of job applicants or internal promotion decisions.

  • Impact: Must offer clear routes for human intervention and contestation. Legal basis must be explicit.

  • DPO Action: Ensure HR platforms allow staff to challenge decisions and that audit trails show meaningful human review, as in the sketch below.
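
As an illustration, a decision record along the lines below could capture both the automated score and the human review, giving the audit trail the evidence it needs. It assumes a hypothetical HR scoring platform; every field name is illustrative.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ApplicantDecisionRecord:
    applicant_id: str
    automated_score: float
    reviewer_id: Optional[str] = None        # who looked at the score
    reviewer_changed_outcome: bool = False   # evidence the reviewer could, and did, intervene
    challenge_received: bool = False         # the contestation route was exercised
    challenge_outcome: Optional[str] = None
    notes: list = field(default_factory=list)

    def has_human_review_on_record(self) -> bool:
        # The minimum this sketch treats as evidence of review; what counts as
        # "meaningful" involvement should be defined and documented in your DPIA.
        return self.reviewer_id is not None

# Example: a scored applicant whose outcome was reviewed and challenged
record = ApplicantDecisionRecord("A-1042", automated_score=0.37,
                                 reviewer_id="hr_manager_07",
                                 challenge_received=True,
                                 challenge_outcome="re-scored after representations")
print(record.has_human_review_on_record())
```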

Predictive policing tools (Part 3)

  • Scenario: Risk assessment algorithms flag people for intervention.

  • Impact: If profiling is used and decisions are adverse (e.g. denial of bail), safeguards must be in place unless exemptions apply.

  • DPO Action: Evaluate whether profiling contributes to the decision. If so, document legal basis and ensure human oversight is embedded.

Joint intelligence operations (Part 4)

  • Scenario: Intelligence-led decisions on threat levels or covert surveillance.

  • Impact: ADM is triggered even if a human sees the output but cannot influence it.

  • DPO Action: Map workflows to identify where human influence is absent, as in the sketch below. Apply Part 4 safeguards and ensure decisions are not solely machine-led.
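
A simple starting point for that mapping is to tag each step of the decision pipeline with whether a human can accept, reject or influence its output. The step names below are hypothetical.

```python
# Tag each step in the pipeline with whether a human can accept, reject or
# influence its output; fully automated steps are where Part 4 ADM provisions bite.
workflow = [
    {"step": "collect_intelligence",  "human_can_influence": True},
    {"step": "risk_scoring_model",    "human_can_influence": False},
    {"step": "threat_level_decision", "human_can_influence": False},  # human sees output only
]

fully_automated = [s["step"] for s in workflow if not s["human_can_influence"]]
if fully_automated:
    print("Review Part 4 safeguards at:", ", ".join(fully_automated))
```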


Key takeaways

To stay ahead of the regulatory curve:

✅ Map your data flows across all three regimes to help visualise where ADM occurs and under which regime (see the sketch after this list).

✅ Audit automated systems for legal triggers and safeguards.

✅ Evaluate profiling and consent dependencies.

✅ Document your decision impact assessments by evaluating whether decisions meet the threshold for ADM and what safeguards apply.

✅ Train operational staff on intervention and contestation procedures, including how to identify ADM and respond to data subject challenges.

✅ Document exemptions with clear public interest rationale.
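
One lightweight way to begin the data-flow mapping in the first point above is a simple register that tags each system with the regime it falls under and whether ADM occurs. The systems and labels shown are examples only.

```python
# A simple register tagging each system with its regime and whether ADM occurs.
DATA_FLOWS = [
    {"system": "HR recruitment platform",  "regime": "UK GDPR (general processing)",          "adm": True},
    {"system": "Predictive policing tool", "regime": "DPA 2018 Part 3 (law enforcement)",     "adm": True},
    {"system": "Joint intelligence feed",  "regime": "DPA 2018 Part 4 (intelligence services)", "adm": True},
    {"system": "Payroll",                  "regime": "UK GDPR (general processing)",          "adm": False},
]

for flow in DATA_FLOWS:
    if flow["adm"]:
        print(f'{flow["system"]}: audit ADM safeguards under {flow["regime"]}')
```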

 

References

(1)  HM Government, (2025). Data (Use and Access) Act 2025. Available at: https://www.legislation.gov.uk/ukpga/2025/18/introduction/enacted (Accessed: 28 July 2025).


(2) HM Government, (2016). Regulation (EU) 2016/679 of the European Parliament and of the Council. Available at: https://www.legislation.gov.uk/eur/2016/679/contents (Accessed: 28 July 2025).


(3) HM Government, (2018). Data Protection Act 2018. Available at: https://www.legislation.gov.uk/ukpga/2018/12/contents (Accessed: 28 July 2025).

