This Plugin Removes Sexist Bias as you Search

Reading Time: 7 minutes.

Showing the difference between search results with and without S.H.E. installed

How would you feel if you searched for “School Boy” and got sexy photos of young men without their shirts on? Surprised? Disgusted? Upset? Probably. Now head over to your browser and type in “School Girl.” Yeah. You probably don’t want to do that.

 

The results have references to sexy costumes, a news story from a hate speech site attacking transgender school girls, and a story about a murdered school girl.

Google’s search results for “School Girl.”

On Google, you’ll get two stories from the same source of anti-LGBTQ and misogynist hate speech. It’s a site well known for hate speech, especially against trans women, and it advocates for restricting women’s rights. That’s Google’s idea of a top result. The article misrepresented facts, was one-sided, and didn’t give advocates an equal opportunity to make their case. Right off the bat, Google comes out sexist.

The other story is about a girl who was murdered. So that’s two stories attacking girls and one about a murdered girl. The rest of the results are about sexy costumes. That’s what Google thinks of “School Girls.”

What about DuckDuckGo?

DuckDuckGo search results show the same sexism, with a racist twist

On DuckDuckGo, not only are underage girls sexualized with sexy school girl costumes and cosplay, but there are also racist results fetishizing Asian women. The biggest result on the page is pornography.

Let’s compare the first row of image results for “School Boy” and “School Girl.”

School boy vs school girl. The boys are stock photos of children. The girls have sexy photos in them

You might ask why I didn’t show the Google results for videos. Well, that’s because the very first result from Google for “School Girl” was porn. It showed an uncensored thumbnail of two people in the act of intercourse. She was wearing, predictably, a plaid pleated skirt, and, of course, no underwear. The image results weren’t much better. I had SafeSearch set to moderate. Google still thought it was okay to show me thumbnails of porn.

Search Human Equalizer

This is where S.H.E. comes in. S.H.E., or “Search Human Equalizer,” filters sexist results out of your searches. It’s a Chrome plugin, and it serves up less sexist results. It can save you from unintentionally surfacing porn, help you find a female scientist (and not a “sexy female scientist” costume), and make the web a safer place for young girls. It’ll also make your computer a little safer to use at work, or during presentations.

Let’s take a look at everything SHE has to do to make the web a more equal place.

Also, let’s get ready to ask Google why this isn’t a default part of their search engine, instead of a side project by another company that’s only available for Chrome.

I kid, we all know why Google hasn’t made this their default behavior (spoiler: it’s because they’re extremely sexist).

Why SHE is Needed

A Tweet from Katrina Lake (@kmlake) pointing out that her phone autocorrects "CEO" to an emoji of a man in a suit, instead of a woman.

Sexist bias in algorithms is not limited to search, but search is a good start.

Bias in our algorithms comes from us. But it also reinforces our own prejudices. If you believe a restaurant is better than the one down the block, you’ll search for ratings. If Google only shows you positive results for the restaurant you already liked, throwing out nearly all the bad ratings, then you’d think it was the best even more. You’d feel as though you’ve been validated, and your opinion is the right one.

This happens with bias against women as well. A small bias, like thinking women might not be as good at STEM jobs as men, is made worse when search results overinflate the number of men you see when searching for “engineer.” If you can’t imagine a woman CEO and you search for “CEOs” and don’t see women, it’ll make you think women can’t be CEOs. Even at a subconscious level, validation of small ideas reinforces them, and it has real world effects.

Take guys like Joe Biden. Does he think the fact that he touches women’s and girls’ faces and bodies without permission is a bad or sexist thing? No. Because, subconsciously, he doesn’t see a woman’s personal space as being as valid as a man’s. That comes from years of bias reinforcement. That’s how you can turn an otherwise decent person into a sexist sleazeball.

Bias is honed by the wrong experiences. And Google’s serving up the wrong experiences with each search.

Jobs

When you search for “CEO,” only 10% of the results will be women. The problem? Women actually make up 28% of CEOs. And even that figure is only a little more than half of what it should be, since women, and this is going to surprise some people, it would seem, make up a little over half the population.

This isn’t uncommon. Women are underrepresented in search results with prestigious connotations. In fact, they’re underrepresented in all working roles. The Pew Research Center found that, while women account for 46% of the workforce, they make up only 40% of search results for jobs. That’s a statistically significant margin.

People don’t like to think of women doing jobs.

STEM

The same happens when you search for famous scientists or engineers. You’ll find almost all male names. Search for “famous computer scientist” on Google. Down the list, off in the sidebar section of the page, you’ll find Grace Hopper and Ada Lovelace. What did they do? Oh, well, Ada Lovelace invented computer programming and is credited with writing the first computer program. She’s 8th on the list. The first programmer is 8th on a list of famous computer scientists.

Grace Hopper? Well, every computer scientist should consider her a hero. She invented the compiler. Hopper pioneered the idea of a programming language that’s easy to read and write and compiles down to machine code. She made programming a far more efficient task. Our jobs would be hellish nightmares without her.

Why aren’t these two women first? With the exception of Alan Turing, who cracked the Enigma code during WWII and likely had more impact on winning the war than any other single person, everyone else on the list built on technologies stemming from Grace Hopper’s contributions, which in turn stemmed from Ada Lovelace’s idea of a programmable computer. But these women are hidden off to the side.

The next woman on the list is 15 places down.

Stock Images

This goes beyond jobs and work. Looking for a stock image of a woman? You’ll find a shocking number of “sexy” images. Specify her race, especially, and, rather than just finding images of women with your specified background, you’ll instead find fetishized results. For many nonwhite races, if you search for “<race> woman” vs “<race> man,” you’ll find pornographic results. I even tried to reduce this by searching for “businesswoman” and “businessman.” It still happened.

How are women supposed to be taken seriously when society doesn’t think of us as holding jobs, positions of power, or making scientific discoveries?

If society even reduces results like “businesswomen” to porn, how are we ever going to be taken seriously as bosses?

What SHE Does

The difference between a search for “school girl” and “school boy” is shocking.

The difference the SHE search plugin makes is just as surprising. Fortunately, it has a toggle, so we can easily compare results with and without it.

Take a look at the results below. You can she the difference (sorry, I had to).

Difference between searches with SHE, without SHE, and for a search about school boys without SHE. The SHE results filter out the “sexy school girl” results. The search for boys didn’t require that, because society doesn’t fetishize boys.

SHE works by grabbing your normal search results. If more results are needed to eliminate bias, it pulls more from the server than you’d typically get. Then it throws out the results that actually have nothing to do with what you searched for, removing the sexist ones. Not enough female CEOs on the page? SHE brings the count up to a number that actually reflects reality, because there are typically more female CEOs in the world than there are female results for a search of “CEO.”
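I haven’t seen S.H.E.’s actual code, so take this as a minimal sketch of how a client-side filter like it could work as a Chrome content script. The blocklist terms, the “.result” selector, and the “q” URL parameter are all illustrative assumptions, not the extension’s real implementation.

```typescript
// Hypothetical sketch of a client-side result filter — not S.H.E.'s actual code.
// Hide results whose text contains sexualizing terms the user never searched for.

const BLOCKLIST = ["sexy", "hot", "costume", "lingerie"]; // illustrative terms only

function isBiasedResult(resultText: string, query: string): boolean {
  const text = resultText.toLowerCase();
  const q = query.toLowerCase();
  // Only hide a term if the user didn't explicitly ask for it.
  return BLOCKLIST.some((term) => text.includes(term) && !q.includes(term));
}

function filterResults(query: string): void {
  // ".result" is a placeholder selector; every search engine's markup differs.
  document.querySelectorAll<HTMLElement>(".result").forEach((el) => {
    if (isBiasedResult(el.innerText, query)) {
      el.style.display = "none"; // hide rather than delete, so a toggle can restore it
    }
  });
}

// Most search engines put the query in the "q" URL parameter.
const query = new URLSearchParams(window.location.search).get("q") ?? "";
filterResults(query);
```

Hiding results rather than deleting them is also what would make the on/off toggle cheap: flipping it back just restores the original display style.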

It’ll replace sexy school girl costumes with school girls. Of course, if you actually want sexy school girl costumes, that’s what you search for! You’ll still get that. SHE just removes the sexist bias from everyday searches. In doing so, it can make the people browsing with this extension installed less sexist. It’ll show them that women can be scientists, smart engineers, and, yes, even school girls without being porn stars.

Seriously, I can’t believe the last part is a discussion we even need to have.

What SHE Also Does

Two search results. One with SHE on shows Pantene search results instead of local businesses and other links.

S.H.E. automatically adds ads for Pantene

Okay, SHE sounds perfect, right? Well, it’s not. I’m not just talking about the fact that it doesn’t work on all search engines, or about the limited set of search terms it covers. And I’m not going to moan about the lag, which shows you the original results for a fraction of a second before the filtering kicks in. It’s a client-side browser extension; what did you expect?

No, this is bad because it’s an ad. When you do a search for “great hair,” for example, you’ll notice it replaces your search results with links to Pantene hair products. Of course Pantene couldn’t just make an extension for marketing and goodwill; they also had to make it serve up ads. As such, you can’t trust it for most search results related to Pantene products, or even Procter & Gamble products. You can also expect that your searches won’t just be seen by Google but also by Procter & Gamble, Pantene’s owner.

That’s why, as much as I love the idea behind this extension, I just can’t recommend you actually use it.

Why Isn’t This the Default?

Google search showing text for "Stop sexist search bias."

Someone at Google should be searching for this right now.

Obviously I’m not talking about the ads for Pantene. Those don’t belong anywhere. However, Google should filter results better. Instead of perpetuating bias, they should be using their algorithms to help fight bias. Instead, they’re making it worse. I’ve found DuckDuckGo is even worse than Google, often serving up results from hate speech websites.

Pantene is likely making subtle changes to your request, receiving results from Google’s servers and changing the order locally. But if someone could do all of this without Google’s help, imagine what someone on Google’s end could do. They could easily filter search results to show less bias.
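To make that “changing the order locally” idea concrete, here’s a hedged sketch of what a rebalancing pass might look like. It assumes each result has already been labeled as featuring the underrepresented group, which is the genuinely hard part and isn’t shown; the Result shape and rebalance function are hypothetical, not anything from S.H.E. or Google.

```typescript
// Hypothetical rebalancing pass: fetch extra results, then reorder them so the
// top of the page reflects a target share for the underrepresented group.
// Not S.H.E.'s or Google's actual code.

interface Result {
  title: string;
  url: string;
  underrepresented: boolean; // e.g. a result featuring a woman for a "CEO" query
}

function rebalance(results: Result[], targetShare: number): Result[] {
  const minority = results.filter((r) => r.underrepresented);
  const majority = results.filter((r) => !r.underrepresented);
  const output: Result[] = [];

  while (minority.length > 0 || majority.length > 0) {
    const shareSoFar =
      output.length === 0
        ? 0
        : output.filter((r) => r.underrepresented).length / output.length;
    // Draw from whichever pool keeps the running share close to the target.
    if (minority.length > 0 && (shareSoFar < targetShare || majority.length === 0)) {
      output.push(minority.shift()!);
    } else {
      output.push(majority.shift()!);
    }
  }
  return output;
}

// For example, aim for roughly 28% of "CEO" results to feature women, matching
// the share of women among actual CEOs cited above:
// const reordered = rebalance(fetchedResults, 0.28);
```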

Google is a source of information for a large portion of the planet. If they pass on information they know has misogynistic bias, they are intentionally increasing misogyny in our society.

This is a problem that has hit developers working on machine learning and AI. Self-driving cars are more likely to hit black people than white people because they’re less likely to recognize a black person as a human. Why? The engineers didn’t program that in themselves. But they trained the software on faces familiar to them. Because there is a dearth of women and people of color at tech companies, facial recognition is less accurate with women and people of color. A car may “decide” to hit a black woman while dodging a white man, because it was less sure she was actually a human.

Search is Broken. Fix it.

Google logo with male and female symbols. The male symbol is above the female symbol.

Google continues to elevate men at the cost of women

This is just one example of bias playing out in tech, but Google gives us a perfect one. Women are underrepresented in search results. Searching for “school girl” should show stock images of school girls, not sexy costumes. Could you imagine if search operated like this for other things? Search for “acorn” and, instead of photos of acorns, you find photos of sexy acorn costumes. I Googled that, by the way. Yes, there are results. Google’s search is broken. It’s not giving people what they’re searching for. Instead, it’s reinforcing and exacerbating existing sexist beliefs.

Our search engines are broken. Instead of returning even, unbiased results, they’re clearly biased against women. Because so many people use search engines as a source of truth, Google has a responsibility to fix this. They’re making sexism worse by ignoring their problems. The fact that another business had to swoop in and rig up a client-side solution is utterly absurd. A third-party company had to fix another company’s broken product. Google is a joke. The shameful truth is, no one’s doing better.

