From Lawsuits to a Face Scanning System, Discord Has Had a Busy Week

Discord has landed itself in some hot water this week, starting with a lawsuit filed against it over child safety issues. In parallel, Discord is also testing face-scanning age verification to restrict adult content to actual adult users. These stories aren’t connected per se, as the Discord lawsuit was filed in New Jersey while the age verification “experiments” are being conducted in the UK and Australia, but the outcome of one could greatly influence progress on the other front.

Discord is known for hosting thousands of niche communities serving different interests, but like many other social media and networking platforms, it also hosts graphic content that flies under the radar. Adult content is allowed on the platform, but the new lawsuit alleges that Discord’s child safety issues are getting out of hand, with the company’s safety measures falling short of keeping underage users away from that content.

Accused of maintaining insufficient safety checks despite its claims about protection features, Discord will now have to defend itself against the lawsuit and the scrutiny that is sure to follow.


The result of the Discord lawsuit should be a turning point for other social networking platforms.

Discord Lawsuit Explained: Why Is the Company in Trouble?

Let’s break down the allegations first. On April 17, 2025, New Jersey Attorney General Matthew J. Platkin, along with the Division of Consumer Affairs, filed a lawsuit against Discord, Inc. in the Superior Court of New Jersey, Chancery Division, Essex County. 

The lawsuit accuses Discord of unlawful practices that expose kids in New Jersey to child predators and violent sexual content, and it follows a multiyear investigation into the company.

What Does the Discord Lawsuit Claim?

The lawsuit alleges that Discord was aware that its safety features didn’t sufficiently safeguard its young user base from online threats. It states that the company intentionally misled parents and their kids about the safety settings it did have in place for direct messages (DMs).

According to the Discord lawsuit, despite the company reassuring parents and claiming that the app was safe for younger users, the Safe Direct Messaging feature, which is supposed to automatically scan and delete private DMs containing explicit media, did not function as promised.

Despite the existing measures, the lawsuit alleges that predators are able to “stalk, connect, and victimize children.” Often posing as children themselves, these predators were able to share and solicit explicit images through the app, even turning to sextortion as a means of getting their way.

To sum up Discord’s child safety issues mentioned in the lawsuit:

  • Misleading safety claims and faulty features that the state says were ineffective in protecting children from child sexual abuse material (CSAM), violent content, and other harmful media
  • A default safety setting, “My friends are nice,” that scanned direct messages from everyone except friends, creating a loophole for predators (see the sketch after this list)
  • Inadequate age verification that allowed children under 13 to gain access to the platform with ease
  • A platform design that lets users within a common server exchange direct messages, opening the door for predators to misuse it
  • Consumer fraud violations, as the lawsuit alleges the company knew its measures were lax but continued to mislead parents for profit
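
To make the alleged loophole concrete, here is a minimal sketch of the scanning decision the lawsuit describes, written in TypeScript. The setting names mirror the ones Discord reportedly used, but the function and its logic are our own illustration, not Discord’s actual code.

```typescript
// Illustrative sketch only: shouldScanDm and its types are hypothetical.
type SafetySetting = "keep_me_safe" | "my_friends_are_nice" | "do_not_scan";

// Decide whether an incoming direct message gets scanned for explicit media.
function shouldScanDm(setting: SafetySetting, senderIsFriend: boolean): boolean {
  switch (setting) {
    case "keep_me_safe":
      return true; // scan DMs from everyone
    case "my_friends_are_nice":
      return !senderIsFriend; // friends are exempt: the alleged loophole
    case "do_not_scan":
      return false; // nothing is scanned
  }
}

// Once a predator talks a child into accepting a friend request,
// the default setting stops scanning their messages entirely:
console.log(shouldScanDm("my_friends_are_nice", true)); // -> false
```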

The lawsuit cited criminal cases in New Jersey where there was evidence of grooming, solicitation, and exploitation on the platform.

What Are the Remedies Sought in the Discord Lawsuit?

The lawsuit aims to make Discord more active in its efforts to limit usage among kids and to protect users aged 13 and older who are allowed on the platform. It seeks several remedies from the company, including a court order to stop Discord from further violating the Consumer Fraud Act (CFA), the payment of civil penalties, and the requirement that Discord surrender any profits earned in New Jersey through these unlawful actions.

Forcing Discord to be more active in making its platform safe is part of the Office of the Attorney General’s effort to keep children safe online. The office previously took action against Meta for similar unlawful conduct, and against TikTok for designing features that kept children and teens hooked on the app.

How Has Discord Responded to the Allegations?

A spokesperson for Discord claimed they were surprised by the lawsuit as they had already engaged with the attorney general’s office over the matter. “Discord is proud of our continuous efforts and investments in features and tools that help make Discord safer,” they said in a statement. 

The company emphasized that it had already taken multiple measures to keep its users safe, pointing in particular to a feature implemented in 2023 that automatically detects CSAM and takes action against it.

New Jersey is the first state in the country to file a suit like this against Discord, but given the current conversation around online safety, more states could move to push the company into action. Recent cases in California have also brought focus to the safety concerns around Discord and Roblox.


How will users respond to Discord’s face scanning system? That remains to be seen.

Could the Discord Face Scanning Tests Be Part of the Solution?

The safety concerns around Discord should not be downplayed: there are very real risks for children going online, and in 2025, it’s much easier for predators to reach and manipulate this vulnerable section of the population.

Keeping that in mind, the UK’s Online Safety Act (OSA) requires all platforms operating in the region to enforce robust age-checking systems by July. Organizations that fail to comply could face fines of up to 10% of their global turnover.

What Do We Know About Discord’s Age Verification Test?

Currently, Discord’s new verification system is being tested exclusively in the UK and Australia. Discord users in these regions may soon receive a message asking them to verify their age group by taking a video selfie with their device camera or by scanning their ID. After users upload either, they will receive a notification about the status of their age verification through a DM from Discord.

The process should only take a few minutes and will not need to be repeated once complete. Those who disagree with the results of the Discord age verification test can attempt it again.

Not all users will be asked to verify their age. For now, the prompt should appear for those who encounter “content flagged by [Discord’s] sensitive media filter” or those who try to modify their sensitive content filter settings.
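
For readers who want the reported verification flow at a glance, here is a minimal sketch that models it as code. Every name here (needsAgeCheck, submitAgeCheck, and so on) is hypothetical; Discord has not published an API for this process, so this only mirrors the steps described above.

```typescript
// Hypothetical model of the reported flow; none of this is Discord's real API.
type VerificationStatus = "unverified" | "adult" | "minor";

interface UserSession {
  status: VerificationStatus;
  hitSensitiveMediaFilter: boolean; // tried to view flagged content
  changedFilterSettings: boolean;   // tried to modify the sensitive content filter
}

// Only users who trip the sensitive media filter or touch its settings are prompted.
function needsAgeCheck(user: UserSession): boolean {
  if (user.status !== "unverified") {
    return false; // the check is one-time once it completes
  }
  return user.hitSensitiveMediaFilter || user.changedFilterSettings;
}

// Simulate submitting a video selfie or an ID scan; the outcome arrives via DM.
function submitAgeCheck(
  user: UserSession,
  method: "video_selfie" | "id_scan",
  judgedAdult: boolean
): UserSession {
  console.log(`Submitted ${method}; the result arrives as a DM from Discord.`);
  return { ...user, status: judgedAdult ? "adult" : "minor" };
}

// Example: a user trips the filter, is judged a minor by the selfie check,
// disagrees with the result, and retries with an ID scan instead.
let session: UserSession = {
  status: "unverified",
  hitSensitiveMediaFilter: true,
  changedFilterSettings: false,
};
if (needsAgeCheck(session)) {
  session = submitAgeCheck(session, "video_selfie", false);
  session = submitAgeCheck(session, "id_scan", true);
}
```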

Is the Discord Face Scanning System the Right Move?

Let’s face it, most of us don’t want to give tech companies more data than they already possess, especially when they never go into detail about how they plan to keep it safe. There are also concerns about users’ preference for anonymity online and how face scanning systems feel invasive in that regard. Discord states it won’t keep a record of the age verification scans, which may make the system easier to accept.

Until we see the Discord face scanning system in action, we won’t know how good it is at its job. Kids who look a little older may be able to get away with accessing unsafe content, and adults who look younger may feel some frustration at being blocked by Discord’s new verification system.

With such age scanning systems, there is also the possibility of kids turning to more unsafe and unregulated platforms to access the content they want, which would ultimately defeat the purpose of making these platforms safer. Still, making platforms uniformly safer is the ultimate goal, and Discord’s age verification test may improve safety and address some of the concerns raised by the lawsuit.

Discord Child Safety Issues Are Part of a More Systemic Problem

The issues with Discord’s allowance of graphic content highlight a larger problem: it is very easy to access just about anything online, especially with the rise of trolls and bot accounts that help disguise the real predators. Putting a firm and conclusive end to all adult content would mean over-censoring adults, similar to the chaos that ensued on Tumblr circa 2018.

With the Discord face scanning plans and other safety regulations, the safety factor might go up, but adults, who make up the majority of users on the platform, may migrate elsewhere if it becomes too cumbersome to use. Businesses have to determine what the right mix of safety tools and measures is for their platform, while also ensuring that they don’t alienate their free-spirited audience.

Discord’s age scanning feature could be the solution to preventing younger kids from accessing the platform, and their safety has to come first. If the system succeeds, it could go into effect more globally, not just on Discord but on other platforms that adopt it as well. For now, we’ll have to watch how the Discord lawsuit and face scanning systems take shape.

Need more insights into how the tech world is evolving? Subscribe to Technowize. It’s that simple.
