Mark Zuckerberg, FOSTA-SESTA, and the Challenges of Content Moderation
Source: Sarah Altschuller


In his testimony before Congress last week, Facebook CEO Mark Zuckerberg observed that, on issues ranging from fake news to hate speech, the company “didn’t take a broad enough view of our responsibility, and that was a big mistake.”

Looking ahead, it remains to be seen what a “broad enough view” means for companies that host online content. When the content that you and I see on various websites is determined by a complex ecosystem of content creators, human moderators, algorithms, and artificial intelligence, it will not be easy to get the balance right. We expect companies to provide open platforms and to protect free expression while also taking action to address harmful content in all its forms. It does seem as if we are at a moment when the normative expectations for online service providers are shifting, such that intermediaries will be expected to take a “broader” view. These expectations present significant operational challenges, especially for companies with a global reach.

In this context, it is notable that on April 11, the same day that Mr. Zuckerberg testified before the U.S. House of Representatives, President Trump signed FOSTA-SESTA into law, thereby enacting controversial amendments to Section 230 of the Communications Decency Act. The amendments removed liability protections for online platforms that knowingly assist, support, or facilitate sex trafficking. Most observers agree that the policy objectives at issue in FOSTA-SESTA are undeniably laudable. That said, there are significant questions as to how companies that host content will respond to FOSTA-SESTA.

With new concerns about liability, will companies aggressively moderate content to avoid potential litigation? How can such efforts be calibrated to address real risks? Or will companies steer clear of content moderation due to concerns that such activities may support a finding that they “knowingly” facilitated trafficking activity if permitted content is ultimately found to be linked to wrongful acts? How can good-faith corporate efforts be rewarded?

The answers to these questions may depend on the size of each company, the resources that any individual company can apply to content moderation efforts, and future case law. More generally, as governments around the world seek to enact new requirements specific to content moderation — with a mix of both commendable and repressive policy objectives — we will likely see events in the United States this month influence both rhetoric and regulation in other countries.

