Artificial intelligence solutions are increasingly being considered for the management of human resources and talent in our workspaces, but how do you ensure they are used fairly and without bias? Dr Amy Lui Abel discusses some of the key considerations and how critical employee data can be managed.
Dr Amy Lui Abel has a rich and diverse record of success as both a practitioner and an academic in the human resources and human capital space. She gained a PhD from New York University on the topic of corporate universities and talent management organizations and has more than 25 years’ experience as a senior executive and expert in human capital and talent management.
Now Dr Abel catches up with GRC World Forums ahead of her upcoming appearance on the panel, Are You Biased? How Fair is AI Use in Your HR Processes?, at PrivSec Global later this month.
Combining a pragmatic outlook with a thorough grounding in theory, Dr Abel has spent the past 10 years at The Conference Board, a research think tank, where she has researched the intersection of HR and AI and how critical employee data should be managed in the digitised workspace.
“There are good guidelines that we can all learn from; this is not a mystery, this is not a secret…being transparent, being open”
With the uptake of artificial intelligence in workspaces during the pandemic, organisations and HR leaders are grappling with rapid digital transformation. Dr Abel says remote working has brought with it a new reliance on technical infrastructure that society may not be completely prepared for.
In her recent research on Artificial Intelligence for HR: Separating the Potential from the Hype, Dr Abel and her colleagues at The Conference Board argue that artificial intelligence has significant limitations, including bias and a lack of regulation, but “that doesn’t mean that HR teams should eschew AI altogether, simply because they don’t understand it or see it as too dangerous,” they say. “Rather, they ought to proceed with eyes wide open and with appropriate safeguards and controls in place.”
The potential of AI in HR can be significant, but Dr Abel and her fellow researchers believe that it is not the sole responsibility of HR to address AI’s limitations.
Rather, they stress that HR leaders must collaborate with company stakeholders and consult with external experts “to mitigate AI risks and seize AI opportunities.”
The Conference Board research highlights several actions HR can and should take during this time of heightened digital reliance. Firstly, they can develop their senior HR leaders’ understanding of AI and prepare HR employees for AI’s “expanding role,” they state:
“The HR function can evaluate HR tasks in terms of their variability and their need for human intervention—the two dimensions that help define whether a process or task is suited to assisted or augmented intelligence, or whether it is better suited to automation.”
Secondly, HR should identify which processes could be redesigned to capitalise on both machine and human intelligence. And though HR should take the lead in identifying the processes where AI can add value, the report states that successful implementation of AI in HR will also depend on collaboration with external legal and security experts alike.
Lastly, HR should encourage the creation of new roles that will facilitate the adoption of AI, such as cognitive trainers, results explainers and AI sustainers, according to AI expert Thomas Davenport.
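To make the two dimensions quoted above more concrete, the following is a minimal, purely illustrative sketch in Python. The task names, scores, thresholds and the mapping to automation, assisted or augmented intelligence are hypothetical assumptions chosen for illustration, not values or definitions from The Conference Board’s report:

```python
from dataclasses import dataclass

@dataclass
class HRTask:
    name: str
    variability: float       # 0.0 = highly repetitive, 1.0 = highly variable
    human_judgement: float   # 0.0 = little human judgement needed, 1.0 = essential

def classify(task: HRTask) -> str:
    """Illustrative mapping of the two dimensions; thresholds are arbitrary placeholders."""
    if task.variability < 0.3 and task.human_judgement < 0.3:
        return "automation"              # stable, repeatable work needing little judgement
    if task.variability >= 0.7 and task.human_judgement >= 0.7:
        return "augmented intelligence"  # variable work where human and AI decide together
    return "assisted intelligence"       # AI handles parts of the task, a human oversees it

# Hypothetical example tasks, scored for illustration only
for task in [
    HRTask("payroll processing", variability=0.1, human_judgement=0.1),
    HRTask("CV screening", variability=0.5, human_judgement=0.5),
    HRTask("final hiring decision", variability=0.8, human_judgement=0.9),
]:
    print(f"{task.name}: {classify(task)}")
```

On these assumed scores, payroll processing falls to automation, CV screening to assisted intelligence and the final hiring decision to augmented intelligence; in practice, any such scoring would need the safeguards, human oversight and expert review the report describes.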
“There are good guidelines that we can all learn from; this is not a mystery, this is not a secret - being transparent, being open,” Dr Abel says. “I think if we continue in that vein, hopefully we can mitigate any sort of serious problems that come up, or at least we should try. We should definitely try.”
Dr Amy Lui Abel will be speaking at “Are You Biased? How Fair is AI Use in Your HR Processes?” at PrivSec Global on March 23.