Some women shared the messages they get on Instagram. It's not pretty

Women on Instagram are exposed to an epidemic of misogynist abuse, according to a new report. REUTERS/Lucas Jackson/File Photo

Published Apr 6, 2022

Women on Instagram are exposed to an "epidemic of misogynist abuse," according to a new report.

The Center for Countering Digital Hate (CCDH), a nonprofit focused on online hate and misinformation, worked with five high-profile women, including actor Amber Heard, to analyze 8,717 direct messages the women received.

The report charges that Instagram fails to act on reports of abuse, and it documents the fundamental struggles high-profile women face in using the platform's safety tools.

In one shocking statistic, the CCDH found that Instagram didn't act on 90% of abuse sent via direct message to the women in this study, despite the messages being reported to moderators.

Instagram's direct message, or DM, function is private and operates like an email inbox. It's also long been a less-visible hotbed for hate, in part because of its private nature. While public gender-based violence on digital platforms is common, direct messages are monitored less, so harassers can operate in secret.

"Harassment, violent threats, image-based sexual abuse can be sent by strangers, at any time and in large volumes, directly into your DMs without consent and platforms do nothing to stop it," the report warns.

Instagram strongly rebutted the report.

"While we disagree with many of the CCDH's conclusions, we do agree that the harassment of women is unacceptable. That's why we don't allow gender-based hate or any threat of sexual violence, and last year we announced stronger protections for female public figures," Cindy Southworth, Facebook's head of women's safety, said in a statement.

Last April, Facebook-owned Instagram launched new tools to protect users from abuse, including stricter penalties for people who send abusive messages, new capabilities to block unwanted accounts and filters that, when turned on, should automatically screen DM requests containing offensive words, phrases and emoji. Users can also create their own custom lists of offensive terms that can be automatically blocked.

The company says the report wrongly concludes that it does not penalize users because it does not always disable their accounts. But Instagram says it does penalize users in stages: A single violation results in a strike, a warning and the blocking of a person's ability to send direct messages for a period of time.

Harassment against women has long been a problem on Instagram. Last year, 16% of women journalists reported incidents of online violence to Instagram, according to a report on online attacks by the United Nations Educational, Scientific and Cultural Organization (UNESCO) and the International Center for Journalists (ICFJ). Young women have reported being harassed by "hate pages" on the app, set up specifically to troll them. In 2020, a poll conducted by the women's rights group Plan International found that online abuse is driving girls to quit social media platforms including Facebook, Instagram and Twitter, with nearly 60% experiencing harassment.

For years, women have urged Instagram to crack down on harassment taking place over DMs specifically. In 2020, writer Nicola Thorp wrote that when she received rape threats over Instagram DM, the company offered her "no help at all." While Instagram says it has taken steps to combat online attacks against women, the CCDH report found notable holes in the system.

For instance, the report notes, these high-profile users can only ensure their safety by cutting themselves off from the platform's essential features. Users must decide whether to allow DM requests from people they don't know or to opt out of them entirely. Those who keep messaging on see messages from people they aren't connected with in a separate "requests" box, which the women said they had to check in order to catch messages from friends, avoid missing business opportunities, and respond to corporate partners. Shutting message requests off completely would eliminate a valuable channel for receiving business offers and finding networking and media opportunities.

"Press requests come in for me to talk about my activism," Jamie Klingler, a U.K.-based writer and activist, said in the report. She said she feels she can't turn it off.

CCDH's research shows that 1 in 15 Instagram DMs sent by strangers to high-profile women contains content that violates Instagram's own community guidelines.

"Instagram is not women-first about this, they're not safety-first about anything," Klingler said.

Instagram DMs are regularly used to send image-based sexual abuse and pornographic content, according to the report. Users send these illicit photos and abusive messages to women through private messaging to escape the scrutiny that comes with a public post.

"On Instagram, anyone can privately send you something that should be illegal," said Rachel Riley, a U.K.-based television host, in the report. "If they did it on the street, they'd be arrested."

Many users send abusive or threatening messages using voice notes. CCDH's research showed that 1 in 7 voice notes sent to the study's participants was abusive. One voice note sent to Heard said, "You, I don't like you, you are bad people. Die! Die! Die! Die! DIE!" The only action she could take was to react with an emoji.

CCDH's researchers reported the account to Instagram, but it remained active as of last month. Instagram says users can report the entire chat history by reporting the account for bullying and harassment, and that when a chat is reported, the company will listen to the messages. However, women often end up listening to the messages themselves before realizing they're harassment.

Instagram also says its system allows people to receive voice calls only from accounts whose DM requests they have already accepted, which should provide protection from unwanted calls.

Still, the system can be easily exploited because often those intent on stalking or harassing a woman online will start by sending innocuous messages of support, or purport to be offering a business opportunity, the type of messages women are likely to accept. Once the harasser has gained access, they begin their attack.

Another issue arises in "vanish mode." Messages sent in vanish mode disappear after the recipient has viewed them. To report harmful messages or content sent in vanish mode, women must first view the content.

Instagram says that because only people who follow each other can use vanish mode, people cannot technically receive a vanish mode message from a stranger.

The report also found Instagram's "hidden words" feature, which is supposed to hide messages containing words users don't want to see, was largely ineffective at filtering out abusive language and phrases for the women surveyed. Hidden words can also still get through if they're written on an image.

Instagram says it does not screen direct messages in the same way as public content because it considers such content to be private.

It was also difficult for these women to download their data or evidence of abusive messages, the report found. No one in the CCDH's study was provided with a record of messages previously sent to them by blocked accounts, despite requesting their full messaging history from Instagram. Having a paper trail of abuse is crucial when contacting authorities or cataloguing abusers across platforms.

Instagram says that when blocking someone, it will also preemptively block any new accounts that person may create. However, this feature often doesn't work, and many harassers will simply log in from a new device and begin the same behavior.

When women are met with an unrelenting barrage of online hate in intimate spaces such as DMs, it has a chilling effect on free speech, the report said. Women in the report described fearing for their safety when speaking out and said the online violence left them feeling frightened and isolated.

"Social media is a really important way we establish our brand, maintain relationships, and transact commerce," Imran Ahmed, chief executive of CCDH, told The Washington Post. "Are we now saying the cost for women doing that is this level of abuse?"

Black women and women of color, LGBTQ people and other systematically marginalized groups are especially likely to experience online attacks. One in four Black Americans has faced online harassment because of their race or ethnicity, according to a 2017 Pew Research Center study. The messages women of color receive often mix racism with misogyny.

Despite the tools Instagram provides users, the primary issue, Ahmed said, is Instagram's failure to act on content that is reported.

Instagram has helped create "a culture in which abusers expect no consequences, denying women dignity and their ability to use digital spaces without harassment," he said.

WASHINGTON POST
