EARN IT could offer a framework for better platform moderation
The EARN IT Act, recently approved by the Senate Judiciary Committee, remains a controversial bill, primarily due to concerns that it could deter technology vendors from using encryption. But amid the ongoing debate over Section 230 and the role of technology platforms in our public discourse, legislation like EARN IT could, if paired with carefully crafted procedural protections, offer a blueprint for how Congress can address bipartisan concerns about child sexual abuse material (CSAM) and other illegal content online.
Debates about Section 230 and the liability protection it affords digital platforms generally focus either on how to get platforms to remove hate speech, misinformation and other disfavored content, or on how to prevent them from censoring certain political speech, particularly that of conservatives. But such discussions fundamentally ignore what Section 230 was meant to do: assign responsibility for content-moderation decisions in the way that best strikes the balance between expression and potentially harmful content.
No moderation system will ever be perfect. Some harmful content will always exist. But there’s no reason to assume that the status quo, rooted in assumptions about the online environment of more than two decades ago, necessarily strikes that balance in a way that makes sense today.
To the extent that the law currently permits harms that outweigh the benefits of expression, it should be adjusted to deter those harms where that can be done at sufficiently low cost. Almost everyone would agree that harmful content should be removed if removal has no effect on lawful expression. The question, then, is how to find the right compromise: one that deters harm without imposing legal liability so massive that it bankrupts online platforms. It can be done, but it requires careful thought.
The EARN IT Act traces the contours of the problem, but without a truly holistic approach it could do more harm than good. Although Section 230 is largely beneficial, its grant of near-total immunity prevents the legal system from adapting to new developments. Certainly, as platforms uncover new forms of harm, other pressures guide their behavior, such as reputational concerns and the need to grow and maintain a user base. But without legal consequences for making unreasonably bad decisions, such pressures may not provide enough incentive to find optimal solutions.
Rather than a blanket grant of legal immunity, Section 230 protections should be conditioned on platforms demonstrating reasonable behavior. That is, an online service provider should have a duty to ensure reasonable moderation of illegal content. The idea of “reasonable moderation” implicitly acknowledges that platforms will not be able to catch all malicious content.
It could turn out that platforms are already operating as reasonably as possible, within the limits of economic efficiency. But determining that should involve at least some oversight by a neutral tribunal.
Analyzing whether a platform has behaved reasonably could include examining its use of encryption, as contemplated by the EARN IT Act. Since many malicious actors seek to steal user data, it can be quite reasonable to encrypt communications. But there may also be edge cases where a platform unreasonably allowed encryption to be used to hide what it had good reason to believe was criminal behavior. Flexible reasonableness standards, informed by well-developed industry best practices, can address either situation.
Given that courts have largely not had the opportunity to assess these issues through a gradual, iterative process over the quarter century that Section 230 has been in force, it would be ill-advised to simply throw every question surrounding online moderation into the courts all at once. That would invite a torrent of litigation that threatens to do more harm than good.
To make the transition less chaotic, there should be procedural limitations, such as heightened pleading standards and an explicit safe harbor to reduce litigation at the pleading stages. These reforms should also incorporate industry standards and best practices, as well as a judicial review mechanism that can provide feedback on the process.
There are legitimate concerns about federal lawmakers amending Section 230. Many public statements by lawmakers suggest they want regulations that are entirely inconsistent with the First Amendment. But more can be done, within the limits of the Constitution, to address the very real problem of harmful and illegal content online. The EARN IT Act isn’t perfect, but it outlines a framework that could be developed into a more balanced reform of Section 230.
Kristian Stout is Director of Innovation Policy at the International Center for Law and Economics and co-author of the working paper “Who Moderates the Moderators?: A Law & Economics Approach to Holding Online Platforms Accountable Without Destroying the Internet.”