
By Trisha Prabhu

“How do I report bullying and harassment online? How do I get a platform to notice me?”

Hi there, and welcome back to another week of Ask Trish.

Thank you so much to this week’s questioner for the always-relevant, deeply important question (and I’m so, so sorry to hear that you’re experiencing online bullying and harassment. That’s never easy, and I applaud you for reaching out to learn more about how to take it on). Reporting cyber abuse is a crucial first step toward tackling it, but the process can seem shrouded in mystery, and many of y’all seem to have had very frustrating experiences with reporting, e.g., you report something…but the platform never gets back to you. That’s why, even though we’ve covered this topic on Ask Trish before, I’m revisiting it this week. I hope to offer y’all a little more clarity on 1) how reporting mechanisms work and 2) some tips and tricks to help ensure that platforms notice you. It’s far from a perfect process, but when it works, reporting can be a powerful way to stop online hate in its tracks – so re-familiarizing ourselves with this topic is definitely a good idea. With that said, let’s get into it:

First, what are reporting mechanisms, and how do they work? Reporting mechanisms are the channels through which you can flag content that violates a platform’s policies, e.g., bullying content, CSAM (child sexual abuse material), content that violates intellectual property rights, etc. Nearly every major platform has at least one (very, very lengthy) policy that spells out what content it will and will not allow. A lot of platforms also screen for certain particularly harmful types of content, e.g., CSAM, which they try to take down automatically, without any human reporting. But their screening technologies aren’t perfect, so reporting can be one of the best ways to signal to platforms that some content may violate their policies. In terms of how the process works, on your end, it’s generally pretty simple: depending on the platform, you can either tap on the content and immediately report it (you’ll often be asked to explain why you’re reporting it), or you’ll be redirected to a platform Help Center, where you can then report the content. As for platforms, on their end, most have both technologies and human moderators reviewing reports and taking content down. (X, or Twitter, is a notable exception to this rule.)

Okay – now we have an overview of reporting. But you might still be wondering: Okay, Trish, but how do I make my case? And how do I make sure platforms notice me? It’s not a scientific process by any means, but below, I offer my suggested tips and tricks for reporting content. (Keep in mind that none of these are silver-bullet solutions – they’re just my best attempt to give you all some additional techniques to use when you’re reporting content.)

  • Go to the platform’s policy (or policies) and try to determine exactly how and why, per the platform’s criteria, the content violates its rules. To you, it might be cyberbullying, but under the platform’s policy, the content may actually fall into a different category of prohibited content. That distinction may not matter to you, but it does to the platform – and identifying the specific way the content violates its policies can help you make a stronger case for getting it removed (especially if the platform asks you to categorize what type of offensive content it is).
  • Relatedly, leverage the many online resources that can help you report more precisely and effectively. There are so many fantastic tools out there that can help you distinguish between different types of harmful content and even guide you through the reporting process. Report Harmful Content is one such great resource.
  • Report offending content multiple times. The first time is not always the charm when it comes to reporting…report as much as you can, as often as you can, to make the offending content more visible to (and a higher priority for) the platform. If you feel comfortable, have your parent(s)/guardian(s) and friends report the content too.
  • If things escalate, seek help from law enforcement to remove content. If you’re a minor, you’re entitled to help from law enforcement to get certain types of content, e.g., sexual content of you, removed from the internet. (As you can imagine, platforms may be much quicker to respond to a police officer!) When applicable, work with your parent(s)/guardian(s) to seek help from law enforcement – and I promise, you won’t get in trouble.

I hope that insight and advice is helpful – let me know if these tips work for you. And to the questioner (and anyone else experiencing cyber abuse), please know that this community is here for you. I strongly encourage you to check out Ask Trish’s many other resources on cyberbullying and online hate. Remember: you are worthy and loved, not what your abusers say you are.

To the other readers, if this post sparked a question, or if you’ve more generally been harboring any tech or internet-related concerns or musings, please do share them here. I’m excited to learn more about what’s on your mind, and hopefully, keep sharing valuable advice. Thank you for contributing!

Have a great week,

Trish

