
The biggest DE&I data misconceptions: A 6-point guide for recruiters.

by Hayley Bakker

As Head of Product at Diversely, I’m lucky to work with hiring teams that are genuinely forward-thinking and ambitious when it comes to their DE&I goals and strategies. I deliver regular demos and onboarding sessions and love the depth of conversations, relevant questions and hearing about practitioner challenges (and how we can help resolve these).

It’s a large part of how Diversely evolves as a product and how we make sure our platform solves recruiters’ most pressing issues, now as part of Access Volcanic’s inclusive approach.

From the 100+ conversations I’ve had this year alone, here’s what I’ve learned: what you may be getting wrong about unconscious bias, and how it could be harming your DE&I progress. By addressing these head on, I hope we’ll be able to move from talk (and doubt) to action (and impact)!

Here are the top 6 misconceptions I hear most often:

  1. AI MUST BE AVOIDED IN HIRING, IT INTRODUCES A LOT OF BIAS

  2. WE DON’T HAVE ANY UNCONSCIOUS BIAS, NOW THAT WE’VE BEEN TRAINED

  3. MEASURING - AND LABELING - DIVERSITY, JUST PERPETUATES BIASES

  4. SHARING OUR INITIAL DE&I DATA AND GOALS WILL MAKE US LOOK BAD

  5. SETTING AND SHARING DIVERSITY GOALS WILL DEMOTIVATE ‘THE MAJORITY’

  6. ANONYMISING PROFILES MAKES IT MORE DIFFICULT TO RECRUIT UNDER-REPRESENTED GROUPS

 

1. AI must be avoided in hiring, it introduces a lot of bias

Unfortunately, and in some cases rightly so, Artificial Intelligence (AI) has earned a bad reputation when it comes to hiring. Most of us have heard horror stories where AI has failed, introducing significant bias against minorities and causing unfair hiring processes at scale. At Diversely we get asked about this a lot, and here’s my usual response:

AI - like humans - can be good or bad, but it’s not inherently either. It all comes down to how we train and program it, the decision-making power we give it and how we hold it accountable.

  • AI should be trained and programmed based on a large, diverse and representative data set. If this is not the case, you might as well stop there. It’s a huge topic, and one we’ve considered when training our own bias analyser.

  • We should consider how we use AI, so that it isn’t making any unsupervised decisions without human intervention. Within Diversely we mainly use AI (NLP) to create awareness of biased language in job ads, suggest more inclusive language and remove identifiable characteristics and demographics from applicant profiles. No decision-making, simply assisting humans.

  • We hold our platform and its users accountable through diversity analytics, which create insights into a number of decision-making points (applied, shortlisted, hired), broken down by source, along with the diversity of applicants at each stage (this relates closely to topic 3 below).

AI and automation, used in the right ways to support humans and combined with the right training and accountability, improve outcomes, and reduce bias rather than introduce it.
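The funnel analytics described above can be sketched in a few lines. This is a minimal illustration with made-up stage and group names, not Diversely’s actual implementation; in practice the records would come from aggregated, consented self-identification data.

```python
from collections import Counter

# Hypothetical applicant records as (stage, group) pairs.
applicants = [
    ("applied", "group_a"), ("applied", "group_b"), ("applied", "group_b"),
    ("shortlisted", "group_a"), ("shortlisted", "group_b"),
    ("hired", "group_a"),
]

def stage_breakdown(records):
    """Share of each group at each funnel stage."""
    by_stage = {}
    for stage, group in records:
        by_stage.setdefault(stage, Counter())[group] += 1
    return {
        stage: {g: n / sum(c.values()) for g, n in c.items()}
        for stage, c in by_stage.items()
    }

breakdown = stage_breakdown(applicants)
# e.g. breakdown["applied"]["group_b"] is 2/3: two of the three
# applicants at the "applied" stage belong to group_b.
```

Comparing these shares across stages is what surfaces where under-represented groups drop out of the funnel.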

2. We’ve overcome our unconscious biases through training

In short, we all have biases, some conscious and some unconscious, which in part come from humans’ basic survival instincts and in part are fuelled by our social upbringing (family, school, neighbourhood etc.).

Fortunately, we’re becoming more aware of how these biases impact, sometimes negatively, the way we operate within businesses, and specifically our ability to fairly assess someone’s suitability for a role. Most companies we work with have gone through some form of unconscious bias training, which we applaud!

But let’s accept training for what it is: a valuable starting point, not a final solution.

Unconscious bias training can create awareness and lead to buy-in from the team for greater diversity and inclusion measures. But more frequently than we’d like, participants walk out of the training and go on with their lives (and work) in exactly the same ways they did before.

Even if the learnings did stick with some participants, it’s not called UNCONSCIOUS bias without a reason.

We have to admit to ourselves that, as much as we educate ourselves, we’ve been programmed since birth (and through our DNA), and some form of bias will still slip in. After training, the next step is to incorporate new ways of working that mitigate bias, and to hold each other accountable. The work does not stop at training.

3. Measuring - and labelling - diversity perpetuates bias

Diversity self-identification and tracking is a sensitive topic for cultural reasons, data privacy considerations and anti-discrimination purposes. You can find out more about how to go about compliant DEI data collection and reporting in our e-guide, right here.

Some would argue that asking applicants and employees about their demographics - and reporting on diversity - puts individuals into ‘boxes’ and that these labels work divisively. To this we’d say: what gets measured gets done, and some labels are helpful. It all depends on how the data and labels are used.

In any other area of business, like finance or digital marketing, taking action without measuring its impact in the data would be considered crazy. Why should this be different for DE&I?

In short, if we’re not able to understand our diversity at a meaningful level, how do we know if:

  • We’re actually representative of our communities

  • Our DE&I strategies and initiatives are yielding any results

  • Under-represented groups are being supported/ enabled

  • And how do we hold each other accountable?

Diversity data should be used to understand trends and inform action to support individuals from various backgrounds. For instance, if we find that a specific minority is not being selected within a specific area of the company, that’s a useful prompt to deep-dive for root causes and address them. Similarly, if we see that an increasing share of our talent pool is neurodivergent, we can put initiatives in place (manager communication training, flexible schedules, adjusted screens) to improve our workplace for neurodivergent people.

It’s about levelling the playing field for all, not adding bias.

4. Sharing our D&I scores and data will make us look bad

An important part of Diversely is creating greater transparency for applicants when they’re applying for a job. Three out of four applicants consider a company’s approach to DE&I an important criterion when selecting their next employer.

Want to understand your or your client’s diversity status? Share this six-minute survey to get an accurate picture, report & score.

To put things into perspective, most companies we work with have started on their DE&I journey, but are certainly not (yet) where they’d like to ideally be. And that’s okay. But that doesn’t mean they shouldn’t share where they are and what their plans are.

In fact, applicants will be more inclined to take a company’s DE&I statements seriously if they’re backed up by data and real initiatives, even if the scores are not yet ‘ideal’.


5. Setting & sharing diversity goals demotivates ‘the majority’

Diversity quotas are a greatly disputed measure. Proponents of quotas cite data showing it could take centuries to close the gender (and other) gaps organically, without regulations and targets in place. Opponents, in turn, argue that quotas lead to negative attitudes and demotivation, both among ‘the majority’ who feel unfairly treated and among the minority who will now be perceived as having got the job only because of their demographics.

At Diversely, we’d argue it all depends on how the diversity goals are set, and how measures are implemented and communicated.

Here’s how we approach this for recruiters and candidates.

At Diversely we speak about diversity goals, which can be set across 8 elements of diversity. We find a lot of companies setting unrealistic goals: setting a 50-50 recruitment goal where the talent pool is not a 50-50 representation can lead to bad behaviours. That’s why we’d always recommend starting by looking at the accessible market and setting relevant goals on that basis.
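The recommendation above, anchoring goals to the accessible market rather than a flat 50-50, can be expressed as a simple calculation. This is an illustrative sketch, not Diversely’s goal-setting method; the `stretch` and `cap` parameters are hypothetical knobs.

```python
def realistic_goal(accessible_market_share, stretch=1.1, cap=0.5):
    """Anchor a hiring goal to the accessible talent pool, with a modest
    stretch factor, instead of imposing a flat 50-50 target."""
    return min(accessible_market_share * stretch, cap)

# If a group makes up 25% of the accessible pool for a role, a goal of
# roughly 27.5% is ambitious but achievable, where 50% likely is not.
goal = realistic_goal(0.25)
```

The point of the cap and the market anchor is exactly the one in the text: goals detached from the accessible market invite bad behaviours, while goals slightly above it drive progress.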

We hear the argument ‘We just want to hire the best for the job’ a lot. And that’s great and fair. It’s not about recruiting, hiring or keeping people who are not suitable for the job. It’s about levelling the playing field to ensure your talent funnel is a fair representation at the top, your company and jobs are attractive to people from various backgrounds, and all candidates have an equitable opportunity to be selected based on merit.

Sounds simple? Unfortunately, it’s not.

To truly achieve this, you need to implement processes to improve reach, attractiveness and inclusion, and bias-free selection for all and especially under-represented groups.

If implemented and communicated well, how can anyone disagree with that?

6. Anonymising profiles makes it more difficult to recruit under-represented groups

At Diversely we believe in providing a fair opportunity to all applicants. We achieve this in part by anonymising applicant profiles (resumes/CVs) by removing names, gendered pronouns, dates (of birth), ethnic and racial references and even school names.
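The kind of redaction described above can be sketched with simple patterns. This is a minimal illustration assuming the candidate’s name is already known; a production system like Diversely’s would rely on proper entity recognition rather than regexes like these.

```python
import re

# Crude patterns for pronouns and four-digit years (birth/graduation dates).
PRONOUNS = r"\b(he|she|him|her|his|hers)\b"
YEARS = r"\b(19|20)\d{2}\b"

def anonymise(text, candidate_name):
    """Replace the name, gendered pronouns and years with placeholders."""
    text = text.replace(candidate_name, "[CANDIDATE]")
    text = re.sub(PRONOUNS, "[PRONOUN]", text, flags=re.IGNORECASE)
    text = re.sub(YEARS, "[YEAR]", text)
    return text

cv = "Jane Smith graduated in 1998. She led a team of five."
redacted = anonymise(cv, "Jane Smith")
# "[CANDIDATE] graduated in [YEAR]. [PRONOUN] led a team of five."
```

A first screening on the redacted text is then based purely on experience and skills, which is the fairness argument the next points build on.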

Some companies applying ‘positive discrimination’ might find this gets in their way of shortlisting under-represented candidates. They may for instance try to actively seek out and invite women for tech role interviews.

We generally have two lines of thought on this:

  • If you’ve been purposeful in your outreach to broader and more diverse talent pools, and you’ve defined your job description and ad in a way that is inclusive, you should already have a more diverse set of applicants in your inbox. Applying an anonymised first screening, based purely on experience and skills, and inviting those who best fit the role seems like a fair next step. There’s not much use inviting people who don’t meet your basic criteria.

  • In some cases you may be on the fence about an applicant and could lean either way: rejecting, or inviting for next steps. This is where Diversely has included an optional diversity indicator. When turned on, this will indicate which applicants are part of your ‘under-represented group’ based on your diversity goals. You still wouldn’t know whether they’re an ethnic, age, gender or other minority, but you can decide to give them the benefit of the doubt.

Now’s the time to move ahead with DE&I

I truly hope these questions and answers clarify some of the common misconceptions about unconscious bias and diversity. It can be quite a tricky space to navigate, and I see a lot of fear getting in the way of proper action.

I’d love to hear your thoughts and experiences on this. By addressing these topics proactively and head-on, I hope we can move forward from talk (and doubt) to action (and impact)!

Diversely’s data & bias-free approach to recruitment is now exclusively available as part of Access Volcanic. Want to learn more? We’d love to talk!
