Wait! Before you start throwing hate into my inbox, please read on!
In April 2019, I read a story about an uproar within Microsoft over its diversity and inclusion efforts. Reportedly, one female employee wrote:
“We still lack any empirical evidence that the demographic distribution in tech is rationally and logically detrimental to the success of the business in this industry….We have a plethora of data available that demonstrate women are less likely to be interested in engineering AT ALL than men, and it’s not because of any *ism or *phobia or ‘unconscious bias’- it’s because men and women think very differently from each other, and the specific types of thought process and problem solving required for engineering of all kinds (software or otherwise) are simply less prevalent among women.”
Okay, then. Have you ever heard “we can’t be it if we can’t see it”?
Hey, Ms. Words, I wanted to let you know that MAYBE your “plethora of data” might be – wait for it – BIASED! We even have scientific evidence that your conclusion is wrong.
“Neuroscientists have found few sex differences in children’s brains beyond the larger volume of boys’ brains and the earlier completion of girls’ brain growth, neither of which is known to relate to learning.”
I am not here to debate you; I am here to talk about bias. We know that humans can be biased. However, have you thought about how our data can be biased? Machine bias is a very real, very important discussion that we need to have. Let’s take a look at these biases, how they impact data, and what we can do to reduce bias in our personal and professional lives.
Human bias is broken down into two categories: conscious (also known as Explicit Bias) and unconscious (Implicit Bias).
- Explicit Bias is a bias of which you are aware. For example, you may be a female hiring manager who sees older women as less likely to be valued employees because of their age, so you chuck their résumés in the trash without giving it another thought.
- Implicit Bias is one that you do not know you have, such as social stereotypes. For example, all little girls love kittens and want to be princesses. Or only men are great at tech.
There are four types of machine bias: Sample, Prejudice, Measurement, and Algorithm Bias. Each reflects a problem in gathering or using data, where a system draws improper conclusions from a data set because of human intervention or a lack of understanding of the nuances of the data.
- Sample bias results when data does not accurately represent the environment the model will operate in. For example, if you create an automated hand dryer that uses a light-triggered motor and only test it on people with light-colored skin, those with darker skin will walk around with wet hands… or wet pant legs.
- Prejudice bias is influenced by cultural or other stereotypes. What is the name of your virtual assistant? Alexa? Siri? What gender of voice is used? Female. Why not a gender-fluid name and voice? Because those who designed and coded these assistants assumed that society views women as administrative assistants more readily than men. I personally would prefer to have Thor or Jarvis give me the assistance and guidance I need.
- Measurement bias happens when there is an issue with the device used to observe or measure, which skews the data in one particular direction. Take, for example, a political poll that asks, “How would you describe Candidate A?” and then gives the following choices:
- (1) Is the strongest man alive.
- (2) Is the smartest man alive.
- (3) Is the bravest man alive.
None of the three choices allows for an honest or critical observation. They are all measured on the “best” curve.
- Algorithm bias happens when a computer system reflects the unconscious values of those who wrote the code, collected the samples, or trained the data. Correctional Offender Management Profiling for Alternative Sanctions (“COMPAS”) is an algorithm widely used in the US to guide sentencing by predicting the likelihood of a criminal re-offending. According to published analyses, the system predicts that black defendants pose a higher risk of recidivism than they actually do, and the reverse for white defendants. I ran the data several different ways: when race was considered, the data showed that black men posed a higher risk of recidivism than white defendants did; take the race data out, and the data showed that those who lived in poorer neighborhoods were more likely to re-offend, which still shows bias.
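One common way auditors surface this kind of algorithm bias is to compare error rates across groups: a model can look accurate overall while wrongly flagging people in one group as “high risk” far more often than people in another. Here is a minimal sketch of that check, using made-up audit records (the group labels, field names, and numbers are invented for illustration, not real COMPAS data):

```python
# Sketch: compare false positive rates by group to surface algorithm bias.
# A false positive here is a person who did NOT re-offend but was still
# labeled high risk by the model.

def false_positive_rate(records, group):
    """Share of non-re-offenders in `group` who were flagged high risk."""
    non_reoffenders = [r for r in records
                       if r["group"] == group and not r["reoffended"]]
    flagged = [r for r in non_reoffenders if r["predicted_high_risk"]]
    return len(flagged) / len(non_reoffenders)

# Hypothetical audit records: model prediction vs. actual outcome.
records = [
    {"group": "A", "predicted_high_risk": True,  "reoffended": False},
    {"group": "A", "predicted_high_risk": True,  "reoffended": False},
    {"group": "A", "predicted_high_risk": False, "reoffended": False},
    {"group": "A", "predicted_high_risk": True,  "reoffended": True},
    {"group": "B", "predicted_high_risk": False, "reoffended": False},
    {"group": "B", "predicted_high_risk": False, "reoffended": False},
    {"group": "B", "predicted_high_risk": True,  "reoffended": False},
    {"group": "B", "predicted_high_risk": True,  "reoffended": True},
]

for g in ("A", "B"):
    print(f"Group {g} false positive rate: {false_positive_rate(records, g):.0%}")
```

With these toy numbers, group A's false positive rate is twice group B's, even though the model's predictions look plausible record by record. That gap, not overall accuracy, is what the COMPAS critiques were pointing at.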
What is the solution?
As much as we try to take human bias out of advanced analytics (AI/machine learning), unless we also work to decrease bias in ourselves, we are fighting an uphill battle. The absolute best solution is diversity and inclusion. Diverse groups of people working together for a common goal will provide numerous experiences from which to pull, will represent varied societal norms, and will help ensure that more under-represented groups are given a voice.
Diversity includes a variety of genders, ages, sexes, socio-economic backgrounds, races, creeds, and education levels; introverts and extroverts; those who question, those who dream, those who want to work behind the scenes and those who want to work out front.
What does that look like? To me, that looks a lot like success.
Still Need Convincing?
Still not convinced that women belong in tech? Let’s take a look at Chloe Condon (t|b|MSFT) and just ONE of the amazing things she has created. It is not uncommon for any of us to get caught in awkward social situations, and it is awesome that Chloe came up with a great solution to help rescue you!
Women have been instrumental in advancing technology for years: from Ada Lovelace (who developed an algorithm for a computer that didn’t yet exist, making her the world’s first computer programmer) to contemporary greats like Dr. Shirley Jackson (whose scientific research enabled others to invent the portable fax, the touch-tone telephone, solar cells, and fiber-optic cables, among other things) and Margarita Salas Falgueras (who invented a faster, simpler, and more reliable way to replicate trace amounts of DNA into quantities large enough for full genomic testing).
If you are a person who needs lists, check out: Mothers of Technology
Lasting Impacts, Lasting Words
Before you go out there saying that tech doesn’t need diversity and inclusion, think about all the things that women, people of color, and people from varying backgrounds have done to improve the quality of YOUR life. For example:
- Who invented the modern refrigerator that keeps your beer cold? Let’s talk about Florence Parpart
- Who created CCTV (think about your Ring Doorbell that alerts you when your pizza has arrived)? That would be Marie Van Brittan Brown
- Want some ice cream for dessert? You can thank Nancy Johnson!
Yell It From The Rooftops
Ladies and gentlemen, yell it from the rooftops with me: “Women are NOT bad at tech!” Ladies, get out there and make your mark, and never, ever, ever let someone tell you that you are bad at tech. Technology is not a man’s-only game. Math, science, coding: none of it is male-only. Make your mark, stand up, be heard, and inspire the female leaders who will come after us!
Gentlemen, here is your call to action. Stop allowing your brothers to make such foolish statements as “women can’t think like men so they won’t be good at tech.” Promote the ideas and amplify the voices of women in science, technology, and math. Please be part of the solution, not more of the problem.
We need you, our collective future needs you!
“I was told I’d never make it to VP rank because I was too outspoken. Maybe so, but I think men will always find an excuse for keeping women in their ‘place.’ So, let’s make that place the executive suite and start more of our own companies.” – Jean Bartik, one of the first programmers of the ENIAC.