Meta Platforms, the company formerly known as Facebook, faces continued legal action after a US judge rejected its attempt to dismiss a lawsuit filed by Australian billionaire Andrew Forrest. The lawsuit centers on fraudulent cryptocurrency ads featuring Forrest’s image that were displayed on Facebook.

Forrest alleges that Meta failed to adequately vet the advertisements, which used deepfake technology to make it appear as if he were endorsing the crypto schemes. According to the lawsuit, these deceptive ads damaged his reputation.

Meta attempted to shield itself from liability using Section 230 of the Communications Decency Act, a law that protects online platforms from being held responsible for content posted by users. However, Judge Casey Pitts of the US District Court for the Northern District of California ruled that Meta’s actions could potentially fall outside the protections offered by Section 230.

The judge highlighted Meta’s potential negligence in allowing the fraudulent ads to be displayed. Forrest’s lawsuit argues that Meta failed to uphold its responsibility to operate in a “commercially reasonable manner” by not having sufficient safeguards in place to prevent such scams.

The decision paves the way for Forrest to refile his lawsuit and argue his case based on Meta’s alleged negligence. If he succeeds, the case could set a precedent for holding social media platforms accountable when they fail to adequately prevent the spread of deceptive content.

The case has significant implications for both Meta and the broader social media landscape. It raises questions about the responsibility of platforms to monitor and vet advertisements, particularly those involving deepfakes and other potentially misleading technologies.

For Meta, this lawsuit represents a potential financial liability and a reputational blow. It underscores the ongoing challenge social media platforms face in balancing user freedom with preventing the spread of misinformation and scams. The outcome of this case will be closely watched by industry observers and could influence future regulations and practices regarding online advertising and content moderation on social media platforms.
