Brought to you by Dentons

UK People Reward and Mobility Hub

The latest updates in employment, benefits, pensions and immigration

Human or machine: computer says “no”

By Elouisa Crichton
March 2, 2022
  • Data protection
  • Equality Act
  • GDPR

Organisations are increasingly using algorithms and automated decision-making to assist them in making decisions about individuals, but to what extent is this a step in the right direction?

Many employers now include algorithms and automated decision-making in hiring and other personnel processes. The London School of Economics and Political Science recently reported that more than 60% of firms had adopted new digital technologies and management practices as a result of COVID-19. Whilst the use of these AI tools provides benefits to an organisation, such as speed and cost savings, employers should be mindful of the legal implications of placing too much reliance on AI.

Consideration of data protection law

The UK implementation of the GDPR provides that data must be processed “lawfully, fairly and in a transparent manner”. When organisations use algorithms to process special category data (e.g. health, race and religion), they must ensure this does not have an unjustified, adverse effect on the individual.

UK GDPR specifically prohibits “solely automated decision-making that has a legal or similarly significant effect” unless:

  • you have explicit consent from the individual;
  • the decision is necessary to enter into or perform a contract; or
  • it is authorised by domestic law.

These exemptions are a high bar for employers to satisfy. Consent might appear to be the most relevant in an employment context, but there is a risk that the power imbalance between a job candidate and a prospective employer could result in consent not being considered freely given (and, as such, invalid). Where consent is relied upon as a basis for processing, organisations also need to keep in mind that individuals are entitled to refuse or withdraw consent at any time, without suffering any detriment; in practice, that means they could have a right to switch to a process that does not involve automation.

What is "necessary" to enter into a contract can also be difficult to establish. The Information Commissioner's Office guidance states that the processing must be a targeted and proportionate step which is integral to delivering the contractual service or taking the requested action. This exemption will not apply if another decision-making process with human intervention was available. It seems that relying solely on automated decision-making would run into some GDPR hurdles.
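The gating logic above can be sketched as a simple pre-flight check. This is a hypothetical helper for illustration only (not legal advice, and not an ICO tool); the three exemption flags mirror the list in UK GDPR Article 22:

```python
from dataclasses import dataclass

@dataclass
class AutomatedDecision:
    """Hypothetical record of a proposed solely automated decision."""
    has_legal_or_similar_effect: bool
    explicit_consent: bool           # freely given, specific, informed
    necessary_for_contract: bool     # targeted, proportionate, integral
    authorised_by_domestic_law: bool

def is_permitted(d: AutomatedDecision) -> bool:
    """Solely automated decisions with a legal or similarly significant
    effect are prohibited unless one of the three exemptions applies."""
    if not d.has_legal_or_similar_effect:
        return True  # the Article 22 prohibition is not engaged
    return (d.explicit_consent
            or d.necessary_for_contract
            or d.authorised_by_domestic_law)

# An automated CV sift that rejects candidates outright, with no
# exemption in place, would not be permitted:
sift = AutomatedDecision(True, False, False, False)
print(is_permitted(sift))  # False
```

In practice the consent flag is the fragile one: as noted above, consent can be withdrawn at any time, so a compliant process also needs a non-automated fallback route.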

That said, most organisations using automation do so alongside traditional methods (interviews, applications, assessments, appraisals, etc.). Before introducing algorithms and automated decision-making as part of any process, organisations must prepare a Data Protection Impact Assessment (DPIA) to analyse, identify and minimise the data protection risks and ensure compliance with the UK GDPR; failure to do so risks a fine of up to £8.7 million or 2% of global annual turnover, whichever is higher.
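The "whichever is higher" fine cap is simply a maximum of two figures. A quick illustration (turnover figures are hypothetical; amounts in pounds):

```python
STANDARD_MAX_FINE = 8_700_000  # standard maximum amount under the UK GDPR

def max_fine(global_annual_turnover: float) -> float:
    """Cap is £8.7 million or 2% of global annual turnover,
    whichever is higher."""
    return max(STANDARD_MAX_FINE, 0.02 * global_annual_turnover)

print(max_fine(100_000_000))    # 8700000 (the £8.7m floor applies)
print(max_fine(1_000_000_000))  # 20000000.0 (2% of £1bn exceeds the floor)
```

So for any organisation with global turnover above £435 million, the 2% limb is the operative cap.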

Consideration of the Equality Act 2010

Algorithms are human-made and, as such, they are inherently at risk of featuring some bias. A significant concern could arise if the algorithm inadvertently leads to discrimination in breach of the Equality Act.

For example, an automated recruitment system could discriminate if it:

  • favours one gender over another (including scoring language more typically used by male candidates more highly than language more commonly used by female candidates);
  • values length of service in past roles disproportionately over experience/skills, which could lead to age discrimination risks; or
  • does not recognise overseas qualifications on a par with those from the UK (potentially exposing an employer to race discrimination claims).
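One common way to sanity-check outcomes like those above is to compare selection rates across groups. The sketch below uses the "four-fifths" adverse-impact heuristic purely as an illustrative screen; it is a statistical rule of thumb, not a legal test under the Equality Act, and the group labels and figures are invented:

```python
def selection_rates(outcomes: dict) -> dict:
    """outcomes maps group name -> (selected, applied)."""
    return {g: sel / applied for g, (sel, applied) in outcomes.items()}

def adverse_impact_flags(outcomes: dict, threshold: float = 0.8) -> dict:
    """Flag any group whose selection rate falls below `threshold`
    times the highest group's rate (the four-fifths rule of thumb)."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: r / best < threshold for g, r in rates.items()}

# Hypothetical automated-sift results: (shortlisted, applied) per group
flags = adverse_impact_flags({"men": (45, 100), "women": (28, 100)})
print(flags)  # {'men': False, 'women': True} — 0.28/0.45 ≈ 0.62 < 0.8
```

A flag is a prompt for human investigation of the algorithm's criteria, not proof of discrimination; equally, a clean screen does not rule it out.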

Any automated decision-making process that does not build in disability discrimination safeguards and reasonable adjustments could also place the employer at risk. There are examples of individuals whose disability affects their ability to complete multiple-choice tests satisfactorily, despite being able to answer the same questions using free text. An automated process that does not build in flexibility (including appropriate triggers for human checks) could lead to equality concerns.
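The "appropriate triggers for human checks" mentioned above could be as simple as routing any candidate who requests an adjustment, or whom the algorithm would otherwise reject, to a human reviewer. The field names below are assumptions for illustration, not part of any real system:

```python
def needs_human_review(candidate: dict) -> bool:
    """Route to a human reviewer rather than auto-rejecting where the
    automated path may disadvantage the candidate (illustrative rules)."""
    return (
        candidate.get("adjustment_requested", False)   # reasonable adjustments
        or candidate.get("auto_score", 1.0) < 0.5      # would be auto-rejected
        or candidate.get("used_free_text", False)      # alternative test format
    )

print(needs_human_review({"adjustment_requested": True}))  # True
print(needs_human_review({"auto_score": 0.9}))             # False
```

The point is that rejection is never "solely automated" for the at-risk cases, which also helps with the UK GDPR Article 22 analysis discussed earlier.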

A robust AI tool may recommend candidates for recruitment that surprise an organisation. We know that diverse teams work well, but that does not always play out in recruitment decisions. Diversity and a range of personality types can challenge existing (often unconscious) preferences related to team cohesion. This could leave recruiters wondering whether the AI tool has got it wrong and needs to be changed or overruled, or whether it has instead shone a spotlight on potential bias in a human decision-making process that had gone unchecked until now.

Takeaway considerations for employers

Bias and discrimination can unfortunately be found in AI tools, often stemming unintentionally from the humans who program them. Notwithstanding this, AI may also be the solution (or at least a helpful part of it) to achieving more equitable decisions. As technology continues to develop, algorithms can be programmed to detect and hopefully reduce discrimination and bias in decision-making. And, perhaps, we should be prepared to embrace some surprise outcomes from AI that in fact redress unidentified bias in the human decision-making process (robot 1:0 human).
