How to challenge automatic (biased) thinking

Most of our thinking is automatic. As we go through our day-to-day lives, much of our thinking requires little effort and is generally reliable when it only impacts ourselves. But Catherine Garrod, author and Founder of Compelling Culture, argues that automatic thinking is unreliable when it comes to making decisions that will impact other people, because our brains can’t draw on those people’s experiences, knowledge and needs. Here, she discusses how we can all challenge our automatic, biased thinking.

Most of our thinking is automatic, requires little effort and is brilliant for helping us make decisions throughout each day. Safe or dangerous? Yes or no? Take part or decline? Like or dislike? etc. When we’re making decisions that will impact only us, this automatic thinking is pretty reliable as it’s based on what we’ve experienced before, what we already know, what feels familiar etc.

But automatic thinking is unreliable when it comes to making decisions that will impact other people, because our brains can’t draw on all of their experiences, knowledge and needs.

That presents a bias risk when developing policies, products and services for colleagues, consumers and communities, because we tend to design things in a way that makes sense for us. The risk is higher for teams made up of people who have similar educational backgrounds, lived experiences and industry experience.

Just one example of automatic (biased) thinking is captured in a one-minute clip on YouTube: the ‘racist soap dispenser’ at a Facebook office that didn’t work for black people. The product was designed to dispense soap when a motion sensor registered a hand underneath, but the sensor only recognised hands with pale skin. Did the people in that company decide to develop a sensor that didn’t register dark skin? Of course not. But they clearly hadn’t done enough research to check it would work for a mix of consumers. That product had been through research, design, testing, marketing, sales, manufacturing, logistics and installation before a consumer realised it was defective. The outcome was a reputational risk, an operational challenge to redevelop the product and a financial burden to refund or replace the faulty units.

Biased thinking goes beyond skin tone. During the COVID-19 pandemic there was a rush to provide personal protective equipment to hospital and care workers. The templates were based on six-foot men, yet here in the UK 75% of NHS workers are women, so that safety equipment was putting lives at risk. Globally, health disparities are worse for populations that are underrepresented in clinical trials. And it’s significantly harder to travel if you have a physical disability, because planes, trains, platforms and so on weren’t designed with disabled people in mind.

There are thousands more examples and you likely have your own. The point is, when diverse groups of people aren’t involved in research, design and testing, colleagues, customers and communities are let down.

And this is a form of societal discrimination, even though in the majority of cases, there was no deliberate intention to overlook the mix of lived experiences and needs.

It’s estimated that over 90% of our thinking is automatic, so when our work impacts other people, we need to use the part of our brain that is conscious. That part is slower; it knows it doesn’t have all the answers, and it looks for a broader perspective.

Here are some examples of conscious thinking:

• Does our marketing appeal to people over 25?
• Is our communication style friendly for neurodiverse people?
• Could someone using mobility aids access the event and toilets?
• Is our mental health support intersectional?
• Are our digital platforms accessible for people with visual impairments?
• Do our family policies recognise single parents, same-sex couples and non-binary people?
• Does our building provide space for people to pray?
• Is the new packaging easy to open for people over 70?
• Have we considered suppliers owned/led by women and people from underrepresented groups?

To build confidence using the conscious part of your brain, here are three simple habits you can practise:

1) Get to know people who aren’t just like you

As you’re building your network, mentoring and social circles, think about who you spend time with most often and see if you can create a deliberate mix.

Think about the five or six people closest to you in your work or home life. Chances are you have lots in common, as that’s how we’re socialised and how we form friendships. And that’s ok. If those five or six people are all a lot like you, consider how heavily influenced you are by their opinions and views on the world. Maybe it’s the sports you enjoy, how you vote in political campaigns, your sense of humour etc.

Getting to know people who aren’t just like you helps you realise you have more in common than you think, which is a great place to build from. It also broadens your awareness and increases your empathy for experiences that differ from your own. And that helps limit your bias, so you’re more successful when contributing to or making decisions that might affect them.

At work you could spend time with other teams, or join networks or social groups. And there are plenty of ways to do this in your own time too. If you’re a regular on social media, look at who you follow and see if you can create a greater mix. If you like books, look at the authors; podcasts, look at the hosts and guests; box sets and movies, look at the lead characters, writers and producers. The algorithms behind all these things will recommend more of the same, so you can teach them to serve you a greater mix and boost your knowledge while enjoying the things you love.

2) Deliberately seek alternative perspectives

An important part of career development is gathering feedback to develop your thinking and approach. Whenever you do this, make sure you go beyond your usual ‘go-to’ people: those you spend the most time with and who already know your work. To challenge bias, you also need to approach people who are less familiar with what you’re working on, as they can usually offer perspectives you hadn’t considered.

Write down a list of ten people you’d like to approach. Then look at your list and if the people are all a bit like you with similar work and life experiences, consider who else you could approach. Then either switch half the names or add another ten people.

3) Disaggregate your data

Whenever you share data to demonstrate research, design or testing, make sure the data is broken down by demographic groups.

Data presented as an average of the whole group often masks the reality that people from underrepresented groups are having a worse experience, because most things in the workplace and society have been designed by people from overrepresented groups who share similar experiences and needs.

To protect people’s private information, only a limited number of people will be able to produce disaggregated data. So whenever you’re working with data insight specialists or third parties, ask them to present the data in a way that shows which demographic groups are having the best and worst experiences, or which demographic groups are going to benefit most and least from potential new designs, policy decisions and so on.
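As a minimal sketch of what that breakdown looks like in practice, the short Python example below uses pandas with a hypothetical survey table (the group labels, column names and scores are purely illustrative, not from any real dataset) to show how a healthy-looking overall average can hide big gaps between groups:

# A minimal sketch of disaggregating results with pandas.
# The table and column names ("demographic_group", "satisfaction")
# are hypothetical and purely illustrative.
import pandas as pd

survey = pd.DataFrame({
    "demographic_group": ["A", "A", "A", "B", "B", "C"],
    "satisfaction":      [8, 9, 8, 4, 5, 3],
})

# The overall average looks reasonable on its own...
print("Overall average:", survey["satisfaction"].mean().round(1))

# ...but the same data broken down by group shows who is having
# the best and worst experience.
by_group = (
    survey.groupby("demographic_group")["satisfaction"]
          .agg(["mean", "count"])
          .sort_values("mean")
)
print(by_group)

On this toy data the overall average is 6.2, which hides the fact that group C scores 3 and group B averages 4.5: exactly the kind of gap a disaggregated view is there to surface.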

Challenging your automatic (biased) thinking makes your work better and contributes to a world that works better for everyone. Because unless you’re consciously including people, you’re almost certainly unconsciously excluding people.

Catherine Garrod is the Founder of Compelling Culture and author of Conscious Inclusion: How to ‘do’ EDI, one decision at a time. She is also a guest lecturer on inclusive leadership at Cambridge Judge Business School.
