Snapchat’s Speed Filter Opens the Door to Lawsuits

The following guest post was authored by Tariq Akeel, a J.D. candidate at Michigan State University School of Law and a 2018 Summer Associate at Warner Norcross + Judd LLP.

Can you sue an online advertisement forum for allowing users to post sex trafficking ads? How about Yahoo for failing to remove offensive content created under a fake profile? Traditionally, the answer has been no. Under the federal Communications Decency Act (CDA), internet service providers are granted immunity from lawsuits arising out of content posted by third-party users. However, in a recent decision, the Georgia Court of Appeals has begun to define how far this immunity may extend.

On September 10, 2015, Christal McGee allegedly drove 113 mph to record her speed on Snapchat’s “Speed Filter.” For those of you unfamiliar, the Speed Filter is an in-app feature that measures the user’s moving speed. Users can record their speed, take a picture, and share it with the world. However, before she had the opportunity to post her Snap, McGee crashed into another vehicle, causing one of its passengers, Wentworth Maynard, to sustain permanent brain damage.

The Maynards’ attorney initiated a lawsuit against Snapchat, the internet service provider of the Speed Filter. The plaintiffs argued that Snapchat knew its users might use the Speed Filter in a manner that would distract them from obeying the law, and that, on top of that, the Filter affirmatively encourages dangerous speeding.

In response, Snapchat asserted that Communications Decency Act § 230 granted it immunity because the lawsuit was based on the actions of Christal McGee, a third-party user. The trial court agreed and dismissed the case — a decision discussed on this blog here. The Georgia Court of Appeals, however, recently overturned that decision and remanded the case for reconsideration.

What is CDA § 230?

CDA § 230 is a federal law that protects internet service providers from lawsuits arising from content uploaded by third-party users. In other words, online platforms that host or publish speech are granted immunity from a range of laws under which they would otherwise be held accountable. Third-party content is exactly what it sounds like: this blog post, YouTube videos, and Instagram pictures are all examples of third-party content falling under CDA § 230 immunity.

Congress enacted this law to uphold freedom of speech online. It would be nearly impossible for online platforms to police and monitor all of the content uploaded to their sites. If these platforms were required to do so, most would choose not to host any third-party user content at all.

Why did CDA § 230 immunity not apply?

The main issue of the lawsuit came down to this: can Snapchat still claim immunity from liability even though McGee’s post never made it onto the internet?

The defense argued that it does not matter whether an actual online post was made; either way, CDA § 230 immunity applies. The plaintiffs, however, asserted that they were not seeking to hold Snapchat liable for McGee’s post, or her failure to post. Rather, they sought to hold Snapchat liable for the negligent creation, design, and maintenance of the Speed Filter itself, because it encourages excessive speeding. Moreover, the defense was relying on case law in which the publication of third-party users’ posts caused the harm, and the claims depended on the content of the posts themselves. Here, although McGee was preparing to post online, she never had the opportunity to do so.

Ultimately, the court agreed with the plaintiffs. Because the lawsuit was based on the use of the Speed Filter, and not on any content to be published, CDA § 230 immunity did not apply. Snapchat may therefore be sued for its allegedly negligent creation of the Speed Filter, though it remains uncertain whether the plaintiffs will prevail on that claim.

What does this mean?

Going forward, this decision has provided an avenue for lawsuits against online intermediaries. CDA § 230 is no longer a blanket immunity that online platforms or mobile app creators can rely on. The emerging distinction is that these providers may be held liable not for third-party content, but for the features and tools they themselves design and offer. While it is uncertain whether the plaintiffs’ negligence claim will prevail on remand in the trial court, this decision has helped further define the boundaries of CDA § 230 and its future application.