Despite all the talk of fixing the technology industry’s persistent gender problem, the numbers don’t look too good. Microsoft, Facebook and Google, for example, all employ a workforce that’s under 20% female. In response, these titans of tech have launched diversity initiatives and released regular transparency reports on hiring and inclusion programs. But do their actual job postings demonstrate a desire for a more inclusive workplace?
We used Textio and Gender Decoder—algorithms that analyze job postings for the presence of coded, gender-biased language—to find out. We first came across these tools while reading MIT’s Rahul Bhargava on how men can self-audit for gender equality in the classroom. Looking at roughly a dozen listings for open engineering positions at each of Facebook, Google, and Microsoft, we found a slight bias towards what the tools deem more “masculine” language. While the majority of job postings were “neutral,” senior-level positions that require leadership skills and technical prowess were particularly gendered male.
Textio was co-founded by Kieran Snyder and Jensen Harris, who were inspired by their experience in the tech industry. Snyder, who holds a PhD in linguistics and worked as a Microsoft executive, says the tool can help employers across industries, whether they’re trying to optimize for more women, as in tech, or more men, as in education and nursing. The bias of language, when made visible, isn’t inherently malicious. It can be useful when trying to attract an underrepresented demographic. But too often, the biases of language go unaddressed.
By analyzing the job postings and subsequent interviews of an “early list of participating companies,” the founders learned that different language attracts male and female applicants. The algorithm they built works by scouring the job posts for words that the service has coded as being “feminine” (interpersonal, collaborate, communicate, support, responsive, reasonable, learn, nurture, connect, partner, and balance) versus “masculine” (active, ambitious, dominant, driven, objective, take-charge, individual, opinionated, competitive, manage, and analyze).
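The word-matching approach described above can be sketched in a few lines of code. This is only an illustration built from the two word lists quoted in this article; Textio’s actual lexicon and scoring model are proprietary and far more sophisticated, and the thresholds below (a net difference of three or more coded words counting as “very” gendered) are assumptions made for the sake of the example.

```python
import re

# Coded word lists as quoted in the article; Textio's real lexicon is
# proprietary and much larger -- this is an illustrative sketch only.
FEMININE = ["interpersonal", "collaborate", "communicate", "support",
            "responsive", "reasonable", "learn", "nurture", "connect",
            "partner", "balance"]
MASCULINE = ["active", "ambitious", "dominant", "driven", "objective",
             "take-charge", "individual", "opinionated", "competitive",
             "manage", "analyze"]

def gender_tone(posting: str) -> str:
    """Classify a job posting by its net count of coded words."""
    # Tokenize on letters (keeping hyphens so "take-charge" survives).
    words = re.findall(r"[a-z][a-z-]*", posting.lower())
    fem = sum(words.count(w) for w in FEMININE)
    masc = sum(words.count(w) for w in MASCULINE)
    diff = masc - fem
    if diff == 0:
        return "neutral"
    lean = "masculine" if diff > 0 else "feminine"
    # Threshold of 3 is an arbitrary choice for this sketch.
    return f"{'very' if abs(diff) >= 3 else 'slightly'} {lean}"

print(gender_tone("We want a driven, competitive engineer to manage a team."))
# "driven", "competitive", and "manage" are all coded masculine
```

Running this on the example sentence returns “very masculine,” since it contains three masculine-coded words and no feminine-coded ones.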
Here are the results:
Eight job postings were “neutral” in gender balance, two “slightly masculine,” two “slightly feminine,” and three “very masculine.” All three labeled “very masculine” were senior-level positions that courted applicants with a “relentless drive” and repeated the importance of “managing” others. The “slightly feminine” postings sought candidates with “interpersonal skills” who could “establish strong relationships” with other teams and “partner” with other engineers on projects.
Five job postings were “neutral” in gender balance, five “slightly feminine,” one “slightly masculine,” and one “very masculine.” The “very masculine” posting was for a senior software engineer role where “managing” skills and “drive” are “essential,” but not communication or a supportive nature. The “slightly feminine” postings emphasized the importance of contributing to “our team” using “interpersonal” skills and “collaborative” design reviews.
Three were “neutral,” seven “slightly masculine,” and two “slightly feminine.” Somewhat of an outlier, Google’s job postings emphasize “expertise” in technical skills without highlighting the importance of teamwork or collaboration. This was true of every Google posting we analyzed, not just the senior-level positions. Thus, none were “very masculine,” but all were slightly skewed that way. Interestingly, Textio flagged the two “slightly feminine” postings because they both described code as “beautiful.”
Our findings lend support to the idea that job postings for senior positions—statistically the most gender-skewed level—are coded masculine because they emphasize domination and hierarchy as opposed to egalitarianism, collective thinking or more “soft” interpersonal skills like listening.
Of course, an orientation towards collaboration and communication is not an inherently feminine trait. But women aren’t conditioned for competitive leadership the way men are. Snyder says that when evaluating hundreds of performance reviews for gender bias, she found that women are often punished for behavior associated with men: “The word ‘abrasive’…ended up being used 17 [times] out of a couple hundred women’s reviews and zero times in men’s reviews. ‘Aggressive’ [in] a man’s reviews was used as an exhortation to be more of it, and [in] women’s reviews was used as a term of some judgment.” Put simply, women are admonished at work for displaying the exact “dominant” and “take charge” traits courted in these job postings.
Though tools like these are a good first step toward attracting more female applicants, the tech industry’s gender bias won’t be undone by an algorithm alone. We need to acknowledge how the culture of tech itself is skewed towards a gendered framework that overly rewards aggression and assertiveness, traits that women are discouraged from displaying in the workplace. Data can make these biases more visible, but it’s up to humans, particularly those with power, to confront and correct them.