In 2015, Jildert and his team at Randstad worked on a revolutionary matching algorithm. They were convinced they held the future of recruitment in their hands. The results were indeed astonishing: candidates and vacancies were matched with unprecedented precision based on a skills ontology. But there was a catch. The algorithm was trained on historical data, and that data turned out to be anything but unbiased.

The algorithm adopted the patterns of the past, including the biases that lay dormant within them. As a result, certain groups of candidates were systematically disadvantaged: women, people with a migration background, and older applicants simply received fewer opportunities. Jildert and his team had created a powerful tool, but one that unintentionally facilitated discrimination.
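
To make the mechanism concrete, here is a minimal, hypothetical sketch (not Randstad's actual system, and the groups and numbers are invented): a model trained only to mimic historical invitation decisions reproduces a bias that was baked into those decisions, even when candidates are equally skilled.

```python
# Hypothetical illustration: a model that learns from biased historical
# hiring decisions will reproduce that bias on new candidates.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Synthetic "historical" data: skill is distributed identically across groups,
# but past recruiters invited group B less often at the same skill level.
group = rng.integers(0, 2, n)            # 0 = group A, 1 = group B
skill = rng.normal(0, 1, n)              # equally skilled on average
p_invite = 1 / (1 + np.exp(-(skill - 0.8 * group)))  # built-in historical bias
invited = rng.random(n) < p_invite

# The model only learns to imitate past decisions.
X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, invited)

# On new, equally skilled candidates the learned bias persists.
new_skill = np.zeros(1000)
pred_a = model.predict_proba(np.column_stack([new_skill, np.zeros(1000)]))[:, 1].mean()
pred_b = model.predict_proba(np.column_stack([new_skill, np.ones(1000)]))[:, 1].mean()
print(f"Predicted invite rate - group A: {pred_a:.2f}, group B: {pred_b:.2f}")
```

Nothing in this sketch is malicious: the model simply optimizes for agreement with historical outcomes, and the disparity comes along for free.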