Volume 53, Issue 1

Writing Competition 2nd Place:
Deepfakes are Taking Over Social Media: Can the Law Keep Up?

By Kavyasri Nagumotu, J.D. Candidate, University of New Hampshire School of Law
ABSTRACT
Public figures are being subjected to deepfakes portraying artificially created circumstances that never actually occurred. Digital impersonation is becoming increasingly realistic and convincing. Online platforms such as Facebook, Twitter, and YouTube are fueling the rapid and widespread diffusion of user-created deepfakes. Intellectual property doctrines and recent “fake news” rules are unable to handle published deepfakes. The current Section 230 of the Communications Decency Act completely shields online platforms from liability for publishing users’ deepfakes. The online platforms, controlled by a few private companies, are essentially governing large parts of the digital world, leading to a crisis of legitimacy.
Technological and legal solutions are necessary to deter deepfakes that are primarily used to spread misinformation. At present, the only recourse available to public figures is to rely on property or tort law to pursue civil claims against individual deepfake creators. However, civil liability cannot ameliorate the harms because plaintiffs are not always able to identify the deepfake creator, and creators may be located beyond the effective reach of the U.S. legal process. Since online platforms play a key role in enabling the distribution of deepfakes, a more effective approach would be to shift the focus and impose liability on the platforms. A discussion of First Amendment rights will remain in the background of these claims, and the courts must decide how to balance free speech rights against the societal harm that deepfakes cause. While we wait for legal mechanisms to potentially fall into place, the technology behind deepfakes will only improve, causing chaos. We need to discuss the harms of deepfakes and possible solutions to prevent the spread of misinformation now.
© 2022 Boston Patent Law Association