
Disinformation report suggests major legal reforms

We’re suffering from “information disorder,” according to a report recently issued by the Aspen Institute. That’s not news. The harder question is how to solve the problem, and the commission’s answer is that doing so will require substantial legal changes, as well as action by platforms, professionals, and academics.

Legal responses to our disinformation pandemic are controversial. Traditional free speech advocates balk at restricting speech based on its content, and partisans on both the right and the left worry that new laws will suppress the content they favor.

Against this background, the report of Aspen Digital’s Commission on Information Disorder tells us that legal changes will inevitably be part of the debate over handling misinformation and disinformation. The commission, co-chaired by prominent newswoman Katie Couric, conducted a comprehensive review of the disinformation problem, consulting with many leading scholars and participants in the digital information ecosystem. It concluded that our information disorder is dire—“a world disordered by lies”—and requires structural changes and new rules.

The commission’s well-written 80-page report offers a thoughtful analysis of today’s mis- and dis-information in the economic, social, and political context in which it developed. The commission has also made a number of research reports and explanatory videos available on its Knowledge Center.

For lawyers, the most important parts of the report are its recommendations for new or amended laws. These include proposed laws or regulations that would compel transparency by platforms about data collection and use, and disclosure of content moderation standards and practices. And to address the social inequities that the commission found to foster much disinformation, it recommended establishing truth, racial healing, and transformation processes at different levels of government.

Two of the recommendations seek to address concerns at the heart of the disinformation pandemic—the economic weakening of the traditional news media, and the current legal immunity of internet platforms.

The commission found that the distress of local news media “denies [communities] access to trustworthy information, and leaves them ill-equipped to access or respond to inaccurate stories that fill the void.” It noted that civic engagement has decreased substantially where trusted news sources have gone out of business. The commission suggested using tax policy to support local news media in various ways, including federal tax credits to subsidize local news subscriptions, state-level taxes on digital advertising, and tax incentives for local outlets serving their communities. Surprisingly, though, the commission did not directly address whether such government support is consistent with the editorial independence of the press.

The report also recommended the establishment of a “Public Restoration Fund,” financed in part by the federal government, which would support an independent non-profit organization “mandated to invest in systemic misinformation counter-measures.” It even suggested that this effort could be supported by cy pres awards from class actions against disinformation providers.

The commission’s recommendations on platform immunity under section 230 of the Communications Act are likely to get the most attention. Section 230 immunity was established in 1996 to encourage the development of communications on the Internet and responsible content moderation practices. But the commission concluded that section 230 “has also been used to shield companies from liability for non-speech practices.” It therefore proposed two amendments to the statute.

The first would remove section 230 immunity for “paid advertising content.” According to the commission, “Tech platforms should have the same liability for ad content as television networks or newspapers, which would require them to take appropriate steps to ensure that they meet the established standards for paid advertising in other industries.” The exclusion would cover only promotional advertising, not paid content services or subscriptions.  

The second proposed exclusion would remove platforms’ immunity with respect to their “product design features”—things like recommendation algorithms, friend suggestions, and “watch next” prompts. These features, the report noted, are not user speech (which section 230 sought to protect) and hence should not be protected when the features themselves cause harm. This recommendation appears to address concerns that Facebook’s algorithms, and similar design features on other platforms, tend to promote extreme content, including mis- and dis-information.

The commission presented its section 230 recommendations as a kind of embrace of the law’s original purpose, to encourage user-generated content and online collaboration, while addressing its unforeseen consequences. It cited one case in which “the product tools enabled a malicious user to wage a campaign of impersonation and harassment against their ex-partner,” yet the platform was granted immunity. The report stated that new rules for platforms should be considered in light of current circumstances, and that “there should be a higher standard of care, when platforms are offering tools that enable amplification beyond organic reach of the service.”

Interestingly, while the report also identified misinformation “superspreaders” as a major problem, it proposed no legal changes to address that phenomenon. Rather, it recommended that online platforms adopt and apply policies that penalize and restrict repeat dissemination of misinformation by users.

And this report may not just sit on library shelves. A West Coast incubator, Aspen Tech Policy Hub, has launched the Information Disorder Prize Competition, which seeks to fund projects that implement the commission’s recommendations.

We’re a long way from knowing how today’s information disorder will be addressed, but the Aspen commission’s report strongly suggests that major changes in laws and regulations will be considered, debated, and possibly enacted as an important part of this process.

Mark Sableman is a partner in Thompson Coburn’s Intellectual Property group. He is the editorial director of Internet Law Twists & Turns. You can find Mark on Twitter, and reach him at (314) 552-6103 or msableman@thompsoncoburn.com.
