Over the course of the election, we saw online discourse reach a new low.
“Trolls are no longer just a 17-year-old sitting in their mom’s basement hacking away,” said reporter Kristen Brown while introducing a panel about online civility at the Real Future Fair in Oakland. “Everybody has the potential to be a troll, and we are seeing trolling becoming a really big problem.”
Joined by Caroline Sinders, a BuzzFeed Eyebeam OpenLab fellow, and Danielle Leong, an engineer on the community and safety team at code repository service GitHub, the panel discussed, in short, how not to be an asshole online.
The first step for any platform, according to Leong, is outlining a definition of harassment. “If you never have a consensus of what harassment is, you are never going to get anywhere,” she said, adding that it took GitHub 19 months to finalize its code of conduct, which centers on discouraging behavior that impedes productivity. Only when that’s done, she explained, can the work of identifying consequences begin.
Sinders reiterated how these definitions shift from network to network. “So many companies do things that are partially right for their audience,” she said. While Facebook takes harassment very seriously, it has a problem with fake news and filter bubbles. And though Twitter is effective as a live news source, its response to abuse on the platform has lagged so far behind that it has reportedly dissuaded potential buyers from acquiring the company.
Curbing online harassment means focusing on it specifically and hiring ethnographers and journalists who specialize in identifying trends. Leong said that when building a product, it’s necessary to ask, “What is your end goal here? Is it to protect the victim?”
One way to keep these issues constantly at the fore is to have diverse product and development teams. “A downvote never would have been implemented by a woman of color,” said Sinders of the Reddit feature that is often used to silence underrepresented voices, despite being designed to do just the opposite.
Half of Leong’s team at GitHub are women of color, 30 percent are transgender, and in the course of their work they use a checklist that’s meant to help actively identify opportunities for harassment or non-consensual behavior. This diversity of perspectives should be seen as imperative, a way of including the concern and experiences of many different possible users in advance.
“If you are designing something it’s important to think about what your best case scenario is, but also all the things that could go wrong,” Sinders said, warning that if you don’t, “you are always going to be three steps behind, and you are going to pick things that are autonomous over things that are intelligent.”
It’s quite simple, the way Leong puts it: “Design systems as if there’s a stalker trying to find someone.”
The panel closed by discussing how an important part of dealing with harassment without curbing free speech is considering context. There’s a difference between being an asshole and being a troll. But one thing we should all agree on is that everyone should be safe online.
“Basic human safety should be a human right,” Leong said. “And as it is right now on the internet, it’s not.”