Beyond the algorithm: balancing artificial intelligence in recruitment with the human touch

AI in recruitment promises efficiency, but what are the risks? Discover how to leverage recruitment technology without losing the essential human element.

September 3, 2025

The rise of AI in recruitment: a promise of efficiency?

Artificial intelligence in recruitment is no longer a future concept; it's a present-day reality. For hiring managers in the pharmaceutical and healthcare sectors, the appeal is obvious: the promise of faster hiring cycles, lower operational costs, and data-driven decisions.

AI recruitment tools offer clear and tangible benefits that feel like a ‘no-brainer’ in a world where budgets are squeezed on all sides. An estimated 87% of companies now use AI for recruitment. This technology can source vast numbers of passive candidates, automate the initial screening of CVs, and deploy chatbots for scheduling. The potential for efficiency is tempting, but does this efficiency come at the cost of quality, fairness, and the very human elements that define a successful hire?

The hidden risks: when algorithms get it wrong

While AI offers powerful process improvements, an over-reliance on automation carries significant and well-documented risks. For organisations where the quality of talent is non-negotiable, these risks cannot be ignored.

The danger of algorithmic bias

One of the most significant risks of AI in recruitment is algorithmic bias in hiring. This is not a theoretical concern; it is the subject of high-profile legal challenges and academic research. The ongoing collective-action lawsuit against HR software company Workday in the US serves as a cautionary tale, with claims that its screening tools discriminated against applicants based on race, age, and disability.

This case highlights several ways bias can become embedded in AI tools:

  • Data bias: This occurs when an AI is trained on historical data that contains an overrepresentation of certain groups. If past hires have been predominantly from one demographic, the AI learns to favour that profile and can automatically reject qualified candidates from underrepresented groups.
  • Proxy data bias: This is where an algorithm uses a seemingly neutral data point as a substitute for a protected characteristic. For example, an AI that prioritises candidates from Oxbridge or Russell Group universities could inadvertently apply a negative filter to candidates who attended schools or colleges in certain locations.
  • Algorithmic bias: This happens when a developer’s own unconscious biases become coded into an algorithm. An AI could be designed to flag leadership terms like “captain” or “president,” which could disadvantage candidates from lower socioeconomic backgrounds whose leadership experience was gained outside of these traditional structures.

This is not a new problem. Amazon famously scrapped an AI recruiting tool in 2018 after discovering it discriminated against female candidates. And the issue persists here in the UK. A landmark 2025 study from the London School of Economics (LSE) investigating AI tools used in social care found that some models systematically downplayed women's health needs. When analysing identical case notes, the AI used language like 'disabled', 'unable', and 'complex' significantly more often in descriptions of men.

With the new 'Fit for the Future' 10-Year Plan encouraging wider AI adoption across the NHS, these findings are a timely warning. The same underlying bias that leads to unequal care provision can lead to unequal hiring opportunities, screening out exceptional talent and exposing organisations to legal and ethical challenges.

Losing the unquantifiables: culture fit, passion, and potential

The goals set out in national strategies like the 'Fit for the Future' plan and the Life Sciences Sector Plan depend on more than just technical qualifications. They require people with creativity, resilience, and an ability to collaborate across complex systems.

An algorithm cannot measure a candidate's passion for patient outcomes. It cannot assess their ability to navigate the nuanced relationship between industry and the NHS in a partnership role. And it cannot identify the raw leadership or positive disruption potential in a candidate whose CV doesn't fit a standard template. These are the unquantifiable attributes that an expert human consultant identifies through conversation, relationship-building, and industry insight.

The impersonal candidate experience

In a competitive market for specialist talent, the candidate experience is a real differentiator. Top-tier professionals, particularly for senior or highly technical roles in the pharma recruitment space, expect a high-touch, personal, and respectful process. An entirely automated journey of chatbot interactions and algorithm-driven rejections can alienate the very people you want to attract, damaging your employer brand and driving exceptional talent to competitors. This need for human interaction is equally relevant for graduates, who in today's market can experience ten or more impersonal, AI-driven interactions before having a single human interaction for a role.

The CHASE approach: AI-powered, human-driven recruitment

We believe the optimal approach is not a battle between technology and humanity, but a strategic partnership. True recruitment excellence is AI-powered, but fundamentally human-driven.

Using technology as a tool, not a decision-maker

At CHASE, we leverage market-leading technology for the "heavy lifting." Our systems help us with market mapping, identifying potential candidate pools, managing communications, and ensuring an efficient process. This frees up our most valuable asset: the time of our expert consultants. We use technology to augment our capabilities, but the critical stages - short-listing, interviewing, cultural assessment, and the final recommendation - are always driven by expert human judgment.

The irreplaceable value of expert consultants

Our human-centric recruitment model is built on the value that an expert CHASE consultant provides that AI cannot replicate:

  • Deep sector knowledge: An innate understanding of the pharmaceutical, medtech, and healthcare landscape, including the specific demands of roles within commercial teams and NHS-Industry Partnerships.
  • A curated network: We don't just find candidates; we build long-term relationships. Our network is extensive, deep and built on trust, giving us access to high-calibre passive talent.
  • Nuanced understanding: The ability to go beyond a job description to grasp the subtle nuances of your company culture and the specific challenges of the role.
  • Skilled engagement: The expertise to approach, persuade, and engage senior, high-value candidates who would never respond to an automated message.

Actively championing diversity and inclusion

The human consultant is the ultimate safeguard against the passive bias of an algorithm. Our role is to intelligently challenge assumptions, look beyond the keywords on a CV, and present a diverse, qualified shortlist. We actively work to widen the talent pool, ensuring fairness and providing our clients with access to the very best talent, regardless of background.

How to harness AI safely in your hiring process

For hiring managers looking to leverage AI in hiring, we recommend three key actions to ensure a balanced and effective approach:

  • Keep a human in the loop: Never allow technology to be the final decision-maker on a candidate's viability. Insist on human oversight at all critical decision points in your process.
  • Audit your tech for bias: Challenge your technology vendors. Insist on transparency regarding how their tools are trained and what measures are in place to ensure fairness and mitigate bias.
  • Prioritise partnership: Work with a recruitment partner who understands this balance. A specialist partner can help you design a process that is both efficient and fair, leveraging the best of technology while safeguarding your organisation with expert human judgment.

Conclusion: technology should serve people, not replace them

Artificial intelligence in recruitment is a valuable assistant. It can bring speed and scale to the top of the recruitment funnel. However, it is not, and should not be, a replacement for human connection, intuition, and expertise.

In a people-focused industry like life sciences, where innovation, collaboration, and patient-centricity are paramount, the most important decisions will always be human ones. At CHASE, our core belief is in the power of people to achieve the extraordinary.

Ready to build a high-performing team with a recruitment process that’s both efficient and fair? Contact CHASE today to learn how our expert-led, human-centric approach delivers exceptional talent.
