How female CEOs building social networks are putting user safety first

In the wake of Bumble’s successful IPO, we examined what happens when women are the ones leading and building social networks.

Over the past few years, Big Tech has faced growing scrutiny over user safety. Since their launch, large social media platforms have operated largely unmoderated, allowing significant, and in some cases devastating, events to transpire online. Many of these platforms were designed by men with a singular focus on growth at all costs, putting user safety on the back burner.

In recent years, a wave of female-led social media platforms has emerged: think Bumble, Peanut, Nextdoor, and Houseparty, to name a few. Several of these companies have been vocal about prioritising the safety of their users through a variety of measures. For example, in 2019 Bumble urged Texas lawmakers to make unwanted sexting a crime. “If indecent exposure is a crime on the streets, then why is it not on your phone or computer?” CEO Whitney Wolfe Herd asked lawmakers on the House Criminal Jurisprudence Committee.

As more users — particularly women — are asking for increased transparency and safety, female founders are using their lived experience to inform design decisions that make their platforms safer. 

Traditionally, product and engineering teams have skewed white and male, which may account for safety gaps on mainstream social platforms. Tracy Chou, the founder of the anti-harassment plug-in Block Party, saw those implications firsthand while working at Quora. One of her first projects there was building a block function. “There’s that perspective of having been on the inside and seeing how product and engineering teams work,” Chou told TechCrunch. “But also being a DEI activist and seeing how lack of representation on teams has impacted product decisions for the worst.” Her personal experiences with harassment informed the product’s design, she told ABC News, along with “knowing the psychological impact of having to see all this negativity.”  

“Social platforms have traditionally been built with the paradigm of status, [and] the idea that status is procured based on numbers of followers or likes or sharing opinions as facts,” says Peanut Founder & CEO Michelle Kennedy. “These actions make sense for a masculine economy, in particular privileged ideals of people who have not had the experience of being marginalised or harassed.” As a result, safety features on these platforms were designed to be reactive against bad actors, and not proactive in preventing harassment in the first place. “When you approach products and tools from the viewpoint of having this experience, you think about things differently,” she adds. “You think about safety differently and you know it’s imperative to get it right.” On an upcoming episode of our Two Percent Podcast, Kennedy stresses the importance of scaling moderation and safety along with Peanut itself.  

“Being a mom has really influenced my viewpoint,” says Houseparty CEO Sima Sistani, who has a sticker on her phone that says “Social Media Seriously Harms Your Mental Health.” Her video-chat app allows users as young as 13, and she thinks of her own children when she considers young users. “When thinking about decisions at Houseparty, my kids are always in my mind, my mother is always in mind, being a woman is always in my mind.”

All of these perspectives have influenced the way female leaders are building social apps. We interviewed a few top leaders — Michelle Kennedy of Peanut, Sima Sistani of Houseparty, Sarah Friar of Nextdoor, and others — who are building social networks, to see how they’ve been proactive around user safety.

Verify user identities

Starting in 2016, a number of social apps (including Tinder, Bumble, and Peanut) began requiring users to upload a selfie to verify their identity. Bumble, Peanut, and Hinge took the extra step of introducing in-app video chat, which can help users get a sense of a match’s personality from a safe distance.

Group video chat apps have long had to deal with strangers crashing chats and harassing participants. (Think: Zoom-bombing.) To prevent this, Houseparty limits group video chats to verified contacts, and “friends-of-friends” video chats are capped at 10 people. There’s a locking mechanism to prevent unwanted participants from joining. “We prioritized reporting and blocking mechanisms early in our lifecycle, even at the expense of growth,” Sistani says. 
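
To make those mechanics concrete, here is a minimal sketch of how join rules like these might be enforced. The names (Room, can_join, FOF_ROOM_CAP) are hypothetical and illustrative only, not Houseparty’s actual code.

    # Hypothetical sketch of Houseparty-style join rules: verified contacts
    # only, a cap on "friends-of-friends" rooms, and a host-controlled lock.
    from dataclasses import dataclass, field

    FOF_ROOM_CAP = 10  # friends-of-friends chats capped at 10 people

    @dataclass
    class Room:
        members: set = field(default_factory=set)
        friends_of_friends: bool = False  # True if open beyond direct contacts
        locked: bool = False              # host can lock the room

    def can_join(room: Room, user: str, contacts_of_members: set) -> bool:
        """Return True only if the user passes every safety gate."""
        if room.locked:
            return False  # the lock blocks unwanted participants
        if user not in contacts_of_members:
            return False  # strangers can't crash the chat
        if room.friends_of_friends and len(room.members) >= FOF_ROOM_CAP:
            return False  # enforce the participant cap
        return True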

Nextdoor’s onboarding process ensures that users are connecting with real neighbours in their own neighbourhood. That verification is why Nextdoor can offer authentic local perspectives that drive meaningful connections, trust, and value. “It’s not growth at any cost,” Friar told Wired.

Encourage reflection

Some offensive posts are fueled by impulsiveness. By encouraging users to reflect before sharing, founders have seen a dip in inflammatory posts. Within the community forum, Peanut uses artificial intelligence to encourage “supportive” language and flag language “of negative sentiment.” The user will then be prompted to consider rephrasing their post, “because Peanut is a place for supportive conversation.”
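
As a rough illustration of that flow (not Peanut’s actual system, which the company says is AI-driven), a tiny word list can stand in for the sentiment model; every name here is a hypothetical:

    # Sketch of a "consider rephrasing" prompt. A small word list stands in
    # for Peanut's sentiment model; names and wording are illustrative.
    NEGATIVE_WORDS = {"stupid", "hate", "awful", "idiot"}

    def review_post(text: str) -> str:
        words = set(text.lower().split())
        if words & NEGATIVE_WORDS:
            return ("Some of your language reads as negative. Consider "
                    "rephrasing, because Peanut is a place for supportive "
                    "conversation.")
        return "ok"

    print(review_post("I hate this advice"))    # prompts a rephrase
    print(review_post("Thanks, this helped!"))  # posts normally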

“There’s always empathy around the product features that we build. To be honest, that is our secret sauce,” Kennedy tells us on The Two Percent Podcast.

Reflection is the entire conceit of Co—Star, which “is built to bring your closest relationships closer,” says founder Banu Guler. “While we’ve launched features with automatic toxicity detectors and block by default, the true emphasis is on building an environment oriented toward vulnerability and meaningful relationships.”

Nextdoor has long dealt with members bringing their offline racial bias to the app. To combat this, the company enlisted Biased author Dr. Jennifer Eberhardt, a social scientist and Stanford professor whose work explores the causes and effects of racial bias. After participating in exercises with Dr. Eberhardt and her students, the Nextdoor team first added friction to the posting flow, forcing people to slow down before they post, reducing racially biased posts by a reported 75%. Later, with Eberhardt, they developed a “Kindness Reminder”: if a member replies to a neighbour’s post with a potentially offensive or hurtful comment, the reminder appears before the comment goes live. “The member is then given the chance to reference our Community Guidelines, reconsider and edit their reply, or ultimately refrain from posting,” Friar says. “We have since seen a 30% decrease in reported comments between neighbors.” Even though this might lead to fewer posts, she adds, “a dip in engagement is something we are willing to trade for stronger connections.”
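
A hedged sketch of that pattern: a hurtfulness check runs before a reply goes live, and a flagged author may edit, post anyway, or refrain. The classifier stub and option names are assumptions, not Nextdoor’s code.

    # "Kindness Reminder"-style gate: check the reply before it goes live.
    def looks_hurtful(comment: str) -> bool:
        # Stand-in for a trained model scoring the comment.
        return any(w in comment.lower() for w in ("shut up", "pathetic"))

    def submit_reply(comment, decision, edited=None):
        """decision is the member's choice after seeing the reminder:
        'edit', 'post', or 'refrain'. Returns the published text or None."""
        if not looks_hurtful(comment):
            return comment                 # publish immediately, no friction
        if decision == "edit" and edited is not None:
            return edited if not looks_hurtful(edited) else None
        if decision == "post":
            return comment                 # the member may still post
        return None                        # the member refrains

    print(submit_reply("Great point, thanks!", "post"))  # publishes
    print(submit_reply("You're pathetic", "refrain"))    # None: held back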

Use A.I. to your advantage

Beyond interpersonal safety, Peanut gives users automated tools to protect their mental wellbeing. For example, members of the community can flag triggering content, which is then covered with a “sensitive content” filter; users who wish to view it can opt in. Similarly, Bumble banned unsolicited lewd images in 2019, blurring them out. In its SEC filing, the dating app said the feature is powered by machine-learning capabilities that identify lewd images in the chat function.
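
In code, the opt-in cover described above reduces to a simple render-time check. This is a sketch under assumed field names (flagged_sensitive), not either company’s implementation:

    # Opt-in "sensitive content" cover: flagged posts stay hidden until the
    # viewer chooses to reveal them. Field names are assumptions.
    def render_post(post: dict, viewer_opted_in: bool) -> str:
        if post.get("flagged_sensitive") and not viewer_opted_in:
            return "[Sensitive content hidden. Tap to view.]"
        return post["body"]

    post = {"body": "A difficult birth story...", "flagged_sensitive": True}
    print(render_post(post, viewer_opted_in=False))  # covered by default
    print(render_post(post, viewer_opted_in=True))   # shown after opt-in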

Enforce the rules

Peanut bans certain types of triggering content, and curbs misinformation by prohibiting anti-vaccine content. “For Peanut to exist as users’ safe space, their anti-social network so to speak, safety has to be fundamental to everything that we build,” Kennedy says. The app uses automation tools and empowers the community to self-police in the forum.

Similarly, Bumble made headlines in January 2021 for banning body-shaming. “If you’re not sure if a message will come across as body shaming, a good rule of thumb is simply not to comment on another user’s body or health at all,” the company wrote in a blog post. “People who use body shaming language in their profile or through the Bumble app’s chat function will receive a warning for their inappropriate behavior, and repeated incidents or particularly harmful comments will result in being banned from the platform.” Wolfe Herd has long spoken out about the need for regulation to catch up with our digital lifestyles. “It is time that our laws mirror this way we lead double lives, in the physical and the digital,” she told Inc. “You look at government right now, it only protects the physical world. But our youth are spending a lot more time in the digital world than they are in the physical.”

Meanwhile, the hyperlocal neighbourhood app Nextdoor has strict Community Guidelines. “We’re a community-moderated platform because we want the neighbourhoods on Nextdoor to reflect the nature of neighbourhoods in the real world,” says CEO Sarah Friar. She explains that Nextdoor uses a “layered” approach to moderation: members can flag content that doesn’t adhere to guidelines, algorithms spot and remove unsafe content and misinformation, and real people on the Neighbourhood Operations Team evaluate content. Beyond that, a seven-person Neighbourhood Vitality Advisory Board, made up of leading academics, the head of the NAACP, and a top U.S. loneliness expert, advises leadership on what healthy communities and connections require.

Allow for customisation

Other startups are building safety plug-ins for existing social networks. Chou’s app Block Party lets Twitter users filter out harassing or offensive content to protect their mental wellbeing. Filtered content gets diverted into a “Lockout Folder,” which users can review later or hand to law enforcement if they’re pursuing legal action against harassers. “It’s a lot easier for someone else to help you process it and flag something that is a concern,” Chou told TechCrunch. “It’s nice to be able to share that burden. The current design of most of these platforms is to put the burden of dealing with it solely on the person who’s being abused.”
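
The key design choice is that filtered content is diverted rather than deleted, so it survives as a record. A minimal sketch of that pattern, with a keyword check standing in for Block Party’s filtering logic:

    # "Lockout Folder" pattern: hide matching mentions from the main feed
    # but preserve them for later review or as evidence. Illustrative only.
    BLOCKED_TERMS = {"ugly", "loser"}
    lockout_folder = []  # kept for later review or export

    def triage_mention(mention: str) -> bool:
        """Return True if the mention reaches the user's main feed."""
        if any(term in mention.lower() for term in BLOCKED_TERMS):
            lockout_folder.append(mention)  # hidden, but not destroyed
            return False
        return True

    triage_mention("what a loser")   # False: diverted to lockout_folder
    triage_mention("great thread!")  # True: shows in the feed normally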

Nextdoor’s bread and butter is the hyperlocal. Any product it launches is designed to give neighbours a more local “insider” perspective from the people who actually live and work in the neighbourhood: neighbours, local businesses, and trusted public agencies. As Friar told Wired, “we are authentically hyper local.”

To protect themselves from personally triggering content, Peanut users can turn to a “mute keywords” feature that removes certain types of discussions from their feeds and notifications. “Some women have used this to create a COVID-free feed to protect their mental wellbeing,” Kennedy says. “Others who are trying to conceive have used this feature to mute conversations around ‘pregnancy,’ which they found emotionally triggering.”
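
A minimal sketch of how such a mute filter might work, assuming hypothetical names (apply_mutes, a text field on each post); it is not Peanut’s actual code:

    # Per-user "mute keywords": drop matching posts from feed/notifications.
    def apply_mutes(posts, muted_keywords):
        muted = {k.lower() for k in muted_keywords}
        return [p for p in posts
                if not any(k in p["text"].lower() for k in muted)]

    feed = [{"text": "COVID update for parents"},
            {"text": "Sleep training tips"}]
    print(apply_mutes(feed, ["covid", "pregnancy"]))
    # only the sleep-training post remains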

Put content before followers

Many founders believe that we’re entering a new era where social media won’t place as much emphasis on follower counts. “We are not a ‘follower’ network; friendship requires mutual agreement between users,” Sistani says of Houseparty. “I think we are moving to a world where people are seeking to get away from the highlight reel to more authentic interactions. The last decade was about sharing, and there was something very aspirational about it. But now we are seeing platforms where it’s more about participation, conversation, and empathy.”

When you reimagine social media from a feminine perspective, Kennedy believes, “collaboration, collectiveness, and support” emerge as the most important qualities for an app. “Anything we do with safety and moderation has to be absolutely front and centre, and it’s why we think of it so obsessively.” 

In Closing…

“You can still drive massive profit and be a good business model while pushing the needle on safety and privacy for users,” Wolfe Herd told Inc.

At the end of the day, user safety should be a priority for all founders, not just female founders. However, “as female founders, we are not uniquely positioned [to improve online safety], but we do have a unique point-of-view,” Sistani says. “I think the most important thing is to have a diverse group around the table from gender identity to different races, religions, and ages.” And of course: “we always reinforce the importance of proceeding carefully online as one does in real life.” 

