EFF, Civil Society Groups, Academics Call on UK Home Secretary to Address Flawed Data Bill

Posted by Deeplinks on 2025-04-08 14:58:39
Discussion Points:
  • Bias in Automated Decision-Making: How can we ensure that automated decision-making tools are fair, transparent, and unbiased, particularly in the context of law enforcement?
  • Data Protection Safeguards: What role should data protection safeguards play in preventing the misuse of automated decision-making tools, and how can they be strengthened?
  • Marginalized Groups and Discrimination: How can policymakers address the risks of discrimination against marginalized groups through the use of automated decision-making tools in law enforcement contexts?
Summary:

The Electronic Frontier Foundation (EFF) has joined civil society groups and academics in warning UK Home Secretary Yvette Cooper and Department for Science, Innovation & Technology Secretary Peter Kyle about the dangers of the draft Data Use and Access Bill (DUA Bill). Clause 80 weakens safeguards for solely automated decisions in the law enforcement context, opening the door to biased and discriminatory outcomes that could disproportionately affect marginalized groups. The government's own assessment acknowledges historical biases in datasets, yet politicians are pushing forward regardless. Safeguards must be strengthened to prevent opaque, unfair, and harmful decisions. Immediate action is required.


Original Message:

Last week, EFF joined 30 civil society groups and academics in warning UK Home Secretary Yvette Cooper and Department for Science, Innovation & Technology Secretary Peter Kyle about the law enforcement risks contained within the draft Data Use and Access Bill (DUA Bill).


Clause 80 of the DUA Bill weakens protections against solely automated decisions in the law enforcement context and dilutes crucial data protection safeguards.


Under sections 49 and 50 of the Data Protection Act 2018, solely automated decisions are prohibited in the law enforcement context unless the decision is required or authorised by law. Clause 80 reverses this presumption: such decisions would be permitted in all scenarios unless the processing involves special category data.


In short, this would let law enforcement make solely automated decisions about people based on, for example, their socioeconomic status, regional or postcode data, inferred emotions, or even regional accents. This widens the already broad scope for bias, discrimination, and lack of transparency at the hands of law enforcement.


In its own Impact Assessment for the DUA Bill, the government acknowledged that “those with protected characteristics such as race, gender, and age are more likely to face discrimination from ADM due to historical biases in datasets.” Yet politicians in the UK have decided to push forward with this discriminatory and dangerous agenda regardless.


Further, given the already minimal transparency around automated decision-making, individuals affected in the law enforcement context would have few, if any, routes to redress.


The DUA Bill puts marginalised groups at risk of opaque, unfair and harmful automated decisions. Yvette Cooper and Peter Kyle must address the lack of safeguards governing law enforcement use of automated decision-making tools before time runs out.


The full letter can be found here.



Source: Deeplinks
