People vs. Machines: How can we ensure diversity in recruitment?
To recruit more diverse employees and eliminate the bias that has plagued recruiting for decades, Dr Grace Lordan argues that AI must play a greater role in businesses' wider recruitment processes.
The Great Resignation has shone a light on the many flaws in workplace culture. People are overworked, underpaid and inevitably unmotivated by corporate motives that don't serve them. The result has been more empowered employees advocating for what they want their company culture to represent, putting more pressure on HR professionals to deliver on these demands.
On the other side, many industries are facing tighter margins and, because diversity and inclusion (D&I) is erroneously viewed as a nice-to-have, investment in D&I threatens to wane. In the long run, sacrificing D&I investment will bite, given that access to diverse talent, particularly for capital-intensive businesses, provides a competitive edge. More than that, growing external pressure from investors, clients and regulators means that sooner or later the excuses available to firms whose workforces do not reflect the diversity of the population will have run dry.
So how can businesses access diverse talent in the midst of tighter budget constraints? It is not easy. There is good evidence that current hiring processes are plagued by cronyism and bias. In other words, people far too often hire their friends, or those who are 'like them' in thinking style. Yet businesses will only thrive with cognitive diversity. What can be done? Why not substitute more of the humans involved in the recruitment process with machines?
Worried that a machine might be inferior to humans? Think again.
Alongside my colleagues Paris Will and Dario Krpan at LSE, I conducted research that highlights the efficiency and effectiveness of Artificial Intelligence (AI). Overall, our findings illustrated that AI improves efficiency in hiring by being faster, filling open positions at a higher rate, and recommending candidates with a greater likelihood of being hired after an interview. And of course, it frees up people in the organisation to do other tasks at the same time. While AI's ability to predict employee outcomes after hiring was limited, it was still a substantial improvement over humans. We also assessed whether AI could decrease biased decision-making and improve the diversity of selected candidates. Although the evidence here was more mixed, the answer is best captured as 'most of the time'.
We were also interested in how humans react to the idea of AI taking on more of the recruitment process. After all, machines aren't going to get the recruiting job if humans don't agree. We found an overwhelmingly negative response to the use of AI in hiring, which clearly holds back the rate at which the technology is being adopted. Put simply, it is humans who are the friction in a wider adoption of AI in hiring. Yes, machines are still biased, but they are less so than humans.
The verdict is that AI hiring has the potential to create great strides in recruitment, in terms of both diversity and lower costs. But without human buy-in, we become our own saboteurs, forfeiting these benefits by failing to adopt the technology.
AI hiring tools are built on algorithms developed from human knowledge. This is the main route through which bias can become embedded in AI hiring, but it can be mitigated by applying more stringent compliance guidelines and by monitoring the tool's outcomes. Take care, though: if humans are involved in the monitoring, they should not have skin in the hiring decisions, nor should the monitoring become so time-consuming that it costs as much as keeping humans in the process.
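To make that kind of monitoring concrete, here is a minimal illustrative sketch, not drawn from our research, of an outcome check that a compliance team with no stake in individual hires could run. The data and group labels are hypothetical; the check applies the well-known 'four-fifths' rule of thumb, flagging the tool for review if any group's selection rate falls below 80 per cent of the highest group's rate.

```python
# Illustrative sketch only: a compliance-side check of an AI hiring tool's
# recommendations, using the "four-fifths" rule of thumb for adverse impact.
# The decision log and group labels below are hypothetical.

from collections import defaultdict

# Hypothetical log of the tool's decisions: (candidate_group, recommended)
decisions = [
    ("group_a", True), ("group_a", False), ("group_a", True), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

def selection_rates(decisions):
    """Share of candidates the tool recommended, per group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [recommended, total]
    for group, recommended in decisions:
        counts[group][1] += 1
        if recommended:
            counts[group][0] += 1
    return {g: rec / total for g, (rec, total) in counts.items()}

def adverse_impact_flags(rates, threshold=0.8):
    """Flag any group whose selection rate is below `threshold` times the best rate."""
    best = max(rates.values())
    return {g: (rate / best) < threshold for g, rate in rates.items()}

rates = selection_rates(decisions)
flags = adverse_impact_flags(rates)
print(rates)  # e.g. {'group_a': 0.75, 'group_b': 0.25}
print(flags)  # e.g. {'group_a': False, 'group_b': True} -> send the tool for review
```

Because a check like this looks only at aggregate outcomes, the people running it never touch individual hiring decisions, and it is cheap enough to repeat regularly, avoiding both of the pitfalls noted above.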
Given the potential AI hiring holds, it is time AI was used in recruitment to improve diversity in hiring. It is also time that those who develop AI became more diverse themselves, so that the biases that remain move towards nil. Is it any wonder that biases persist when the technology sector is one of the worst for diversity among the people creating its products?
The main message I leave you with is that there is evidence that current hiring processes are plagued by cronyism and bias.
It is time that humans handed over the hiring process to machines that do not have these tendencies. Biases embedded in algorithms can be mitigated, at least somewhat, with more care from those who write them, and compliance staff who have no skin in the hiring process can monitor outcomes to allay any concerns about fairness. Let's advance AI in recruitment and workplace inclusivity at the same time.