Sunday, April 1, 2018

How to Address Gender Bias in the Workplace? Look at Your Slack Messages

Gender bias can lurk in unexpected corners of the workplace. One place you might be overlooking it? Your company's Slack account--and Slack wants to help you with that.


Quartz reported Wednesday that at the Wharton People Analytics Conference, Slack CEO Stewart Butterfield revealed that his company may be building tools to analyze gendered communication trends on its platform.


Studies show that women and men communicate differently. For example, men tend to interrupt women more often, and women tend to apologize when expressing their opinions. These same tendencies can carry over when people communicate online.


Although Butterfield said that bias isn't a big problem on Slack, it's enough of a concern that he wants to give users a way to analyze their own messages for potential bias, something he referred to as "personal analytics."


"There are analytics that no one else has access to you except for you," he told the audience. "And they don"t present you with any real moral value either way, but [they answer questions like], do you talk to men differently than you talk to women? Do you speak to support groups differently than you speak to superiors? Do you speak in public differently than you speak in private?" 


If and when Slack does develop such a tool, it could have a big impact. Launched in 2013, the internal messaging tool has more than six million daily active users and more than 50,000 teams, including 43 percent of the 100 largest U.S. companies by revenue. The company reportedly brings in $200 million in annual revenue.


In a statement to Inc., however, Slack downplayed the idea that it's actively working on such tools.



"Our Search, Learning and Intelligence team in New York is focused on improving search and making Slack more useful for our users. That work includes developing analytics tools, including potentially the "personal analytics" that Stewart mentioned, but these initiatives are in the early stages and will continue to develop over the next couple of years."



Butterfield"s comments highlight what could be a shift in how tech companies might think about eliminating bias in their products. Instead of trying to build algorithms and tools that are supposed to automatically predict and identify bias--a problematic task because they"re designed by human beings with bias--Butterfield suggests instead that tech companies can help users by simply showing them their own behavior. 


Textio is another startup thinking hard about bias in online communication. The company analyzes job postings and applicant data for enterprise customers like Slack, Atlassian, Johnson & Johnson, and Expedia, and suggests gender-neutral language to make job ads less biased. Co-founder Kieran Snyder says Slack could be on to something if it builds tools that help you understand your own patterns of behavior.
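Textio's models are far more sophisticated than this, but the basic idea--flag gender-coded wording in a job ad and offer a neutral alternative--can be shown with a toy sketch. The word list and replacements below are invented examples, not Textio's data or methodology:

    # Toy sketch of the idea behind gender-neutral job-ad suggestions.
    # This is not Textio's methodology; the word list and replacements are
    # invented examples of gender-coded wording and neutral alternatives.
    CODED_TERMS = {
        "ninja": "expert",
        "rockstar": "high performer",
        "dominant": "leading",
        "aggressive": "proactive",
        "nurturing": "supportive",
    }

    def suggest_neutral_wording(ad_text):
        """Return (flagged word, suggested replacement) pairs found in a job ad."""
        flagged = []
        for word in ad_text.lower().split():
            word = word.strip(".,!?")
            if word in CODED_TERMS:
                flagged.append((word, CODED_TERMS[word]))
        return flagged

    print(suggest_neutral_wording(
        "We need an aggressive sales ninja to dominate the market."
    ))
    # [('aggressive', 'proactive'), ('ninja', 'expert')]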


"It"s wonderful if I get personal insight [into] how I"m relating to colleagues differently based on factors I have not been aware of. But the actual thing that is interesting to me as a person or a company is, does it change the behavior?," says Snyder, who also has a PhD in linguistics.


For Textio"s purposes, collecting voluntary, aggregated data on gender and ethnicity is crucial to its work. The company uses this information--which you elect (or not) to provide on job applications--to try to predict how different groups of people apply for a job. "If you have really solid data where everyone has voluntarily indicated their gender identity, you can have relatively low bias," she says.


It"s not clear whether Slack would collect voluntary data on users" personal information such as gender and ethnicity. But what is clear is that this type of data collection is complicated. Women often turn to digital messaging tools for the sense of anonymity, which can allow them to participate more equally in conversations with men than when speaking face-to-face, according to Susan C. Herring, a professor at Indiana University, Bloomingdale in her study "Gender and Power in Online Communication." So collecting this information brings up privacy concerns. (Snyder mentions that people are less likely to give information on their ethnicity than gender possibly due to even greater discrimination for not being white.)


Without those data points, however, companies can only infer someone's identity, which risks baking deeper bias into the very tools meant to fix the problem.


"If you have data where you"re guessing people"s gender identity--let"s say from their name or other patterns of communication--then you have really high bias," she says.
