Social Media Sites: The Content Moderation Dilemma

A top priority should be to put into place a review and appeals process that is fair and less arbitrary.

For social media sites, moderating content is no easy task. With billions of daily users on the top networks, sifting through posts and determining whether each one violates community guidelines is a seemingly insurmountable feat.

The challenges presented by online content moderation took center stage during recent presidential elections, and social media sites have since grappled with creating appropriate posting guidelines and content removal procedures. There’s a delicate balance between permitting the free exchange of ideas and prohibiting content that is abusive, hateful, or life-threatening — especially when there are billions of posts per day.

The realities of the situation have created a perfect storm on many sites, making it possible for anonymous trolls to take advantage of content violation reporting procedures. Reporting content takes a single click and can trigger the instantaneous suspension of a post, followed by a laughable appeals process that often ends in the arbitrary removal of legitimate and educational content, with little if any avenue for further review. And for people who post educational content on hot-button topics such as sexism, racism, sexuality, or politics, it’s easy for trolls to repeatedly report videos, resulting in legitimate creators being banned from social media platforms.

For lawyers, who often post about educational topics that some people deem divisive or offensive, arbitrary content moderation policies can be a big problem. I’ve seen this occur time and time again on TikTok, with lawyers and other professionals who post relatively benign content getting banned from the platform after accumulating repeated alleged violations. Because of the limited appeals process, they often have to start new accounts from scratch and attempt to rebuild their audiences.

I’ve experienced arbitrary bans myself. I recently had a video on TikTok removed for violating community guidelines, after which I posted a video asking my followers who are legal professionals to share similar experiences. You can read their replies in the comments to that video, but the overall theme was that there was little to no recourse available when an alleged violation occurred.

Of course, TikTok is a private platform, and the moderation — and deletion — of content is discretionary. The Terms of Service determine the rights of users and the path for any type of recourse. I am not disputing that fact.

That being said, the current state of affairs is untenable for platforms that are monetized and supported by advertising dollars. For that reason, social media platforms like TikTok need to find better ways to moderate content fairly so that the end result encourages the thoughtful exchange of ideas. And a top priority should be to put in place a review and appeals process that is fair and less arbitrary.

Otherwise, professionals such as attorneys will ultimately abandon the platform for one that is more welcoming to substantive, educational content. This should concern TikTok because professionals are among the groups most likely to advertise on the platform in an effort to reach clients or customers.

If TikTok burns bridges with this segment of users by creating an environment that is hostile to the open exchange of ideas, then it runs the risk of alienating well-funded advertisers. After all, why spend money on a platform that is unwelcoming and stifles creativity and the free expression of thought, a principle that I would argue attorneys tend to value more than most?

I realize that moderating billions of videos or posts each day is a challenging endeavor. But social media networks come and go, and users are fickle. When a site stops serving their needs, becomes overrun with spam and uninteresting content, or the next big thing comes along, they’ll migrate quickly.

TikTok has an opportunity to take a different path and create a site that continues to be engaging, entertaining, and educational. However, that won’t occur if it continues to arbitrarily stifle and remove legitimate and educational content simply because anonymous trolls can mass report content that contradicts their particular worldview.

If this trend continues, I have no doubt that lawyers and other professionals will ultimately leave the platform for greener pastures. But there are steps TikTok can take to reduce that likelihood. Whether it takes them remains to be seen.

The ball’s in your court, TikTok.


Nicole Black is a Rochester, New York attorney and Director of Business and Community Relations at MyCase, web-based law practice management software. She’s been blogging since 2005, has written a weekly column for the Daily Record since 2007, is the author of Cloud Computing for Lawyers, and co-authors Social Media for Lawyers: The Next Frontier and Criminal Law in New York. She’s easily distracted by the potential of bright and shiny tech gadgets, along with good food and wine. You can follow her on Twitter at @nikiblack and she can be reached at niki.black@mycase.com.
