A Fairness Trilemma in Hiring

Economics often draws hard trade-offs as triangles: pick any two corners, and the third is out of reach. The classic example is the monetary-policy trilemma, in which a country cannot simultaneously maintain a fixed exchange rate, free capital movement, and an independent monetary policy. Hiring under unequal starting conditions poses a similar triangle, one that is often overlooked in discussions of fairness.

When companies turn to algorithms to streamline their hiring, they pursue three goals: efficiency (selecting the strongest candidates at the lowest cost), representation (outcomes that roughly reflect the applicant population), and formal neutrality (applying the same explicit rule to every applicant). The catch is that under unequal starting conditions they cannot have all three at once. They must choose two, knowing the third will be compromised. This “fairness trilemma” explains much of what goes wrong when algorithms are asked to make hiring fair.

The comfortable belief that algorithms can remove bias while improving efficiency, with no trade-offs, has not survived contact with practice. Companies invested heavily in Diversity, Equity, and Inclusion (DEI) programs and algorithmic hiring tools, hoping for better outcomes for underrepresented groups. But algorithms do not eliminate bias; they relocate it, into model design, feature choices, and the selection of training data.

Amazon’s experimental hiring algorithm is the cautionary tale. Trained on historical data, it learned to favor candidates who resembled past hires, who were predominantly male, and penalized signals associated with women. The model was efficient and formally neutral, yet it could not produce representative results from non-representative data, and Amazon ultimately abandoned it.

Similarly, HireVue’s AI video interviews drew criticism for facial analysis whose outputs correlated with demographic traits rather than job performance; the company eventually dropped the feature.

A simple model makes the trilemma concrete. Imagine two applicant groups whose qualification scores differ on average because of unequal starting conditions. A group-blind rule that hires the top scorers is efficient and formally neutral, but it under-selects the disadvantaged group. Adding a representation constraint restores balance, but it lowers the average score of those hired and abandons strict neutrality, since the rule now depends on group membership. There is no rule that delivers all three properties at once; the choice is only about which corner of the triangle to give up.
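The two-group model above can be sketched as a toy simulation. All the numbers here (group sizes, score distributions, number of slots) are illustrative assumptions, not data from any real hiring system: a group-blind top-k rule maximizes the average score but under-selects group B, while a simple per-group quota restores representation at the cost of a lower average.

```python
import random

random.seed(0)

# Hypothetical population: two equal-sized groups whose score
# distributions differ, standing in for unequal starting conditions.
POP_A, POP_B = 500, 500
scores = (
    [("A", random.gauss(60, 10)) for _ in range(POP_A)]
    + [("B", random.gauss(50, 10)) for _ in range(POP_B)]
)

K = 100  # positions to fill


def top_k_blind(pool, k):
    """Formally neutral rule: ignore group, take the k highest scores."""
    return sorted(pool, key=lambda gs: gs[1], reverse=True)[:k]


def top_k_quota(pool, k):
    """Representation-aware rule: fill half the slots from each group."""
    by_group = {"A": [], "B": []}
    for g, s in pool:
        by_group[g].append((g, s))
    picked = []
    for g in by_group:
        picked += sorted(by_group[g], key=lambda gs: gs[1], reverse=True)[: k // 2]
    return picked


def summarize(selection):
    """Return (share of group B, average score) for a selection."""
    share_b = sum(1 for g, _ in selection if g == "B") / len(selection)
    avg = sum(s for _, s in selection) / len(selection)
    return share_b, avg


blind_share_b, blind_avg = summarize(top_k_blind(scores, K))
quota_share_b, quota_avg = summarize(top_k_quota(scores, K))

# The neutral rule under-selects group B; the quota rule restores
# representation but lowers the average score of those hired.
print(f"blind: B share={blind_share_b:.2f}, avg score={blind_avg:.1f}")
print(f"quota: B share={quota_share_b:.2f}, avg score={quota_avg:.1f}")
```

Because the blind rule maximizes the average score over all possible size-K selections, any rule that changes the selection, including the quota, can only lower that average; the simulation simply makes the size of each sacrifice visible.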

Just as price controls in economics do not eliminate scarcity but change how it appears (as queues, black markets, or declining quality), constraints on hiring algorithms do not eliminate the trade-off; they push it into less visible forms, such as opaque model adjustments and subjective overrides. Companies must acknowledge these trade-offs and decide deliberately where discretion will live in their hiring process.

Moving forward, organizations should be transparent about their priorities, place discretion in visible areas, and refrain from marketing algorithms as a perfect solution to complex issues. While algorithms can assist in decision-making, they cannot eliminate the underlying trade-offs caused by systemic inequalities. The key is to acknowledge these constraints, make informed decisions, and take responsibility for the outcomes.

By understanding and embracing the fairness trilemma, companies can strive for legitimacy in their hiring practices, openly addressing the challenges they face and making conscious decisions to navigate them effectively.
