Once the home of chaotic gaming voice chats and spontaneous memes, Discord is now testing stricter content access policies. As the platform grows beyond gaming, its moderation tools have gotten more serious. In a bid to protect users under the age of 13, Discord is rolling out age verification via face scan or ID scan. Verification will be required to change content filter settings or to view media flagged as sensitive.
According to Discord’s official FAQ, this one-time age verification process is currently being tested in the UK and Australia. It activates when users try to unblur media flagged as sensitive or attempt to access age-restricted settings. If the user has not verified their age, they will be prompted to do so before proceeding.

The voice chat app offers two options: Face Scan and ID Scan. For Face Scan, Discord will request camera access and walk users through an on-device verification. For ID Scan, users will need to scan a QR code with their phone, then take a clear photo of their government-issued ID. Once verified, Discord will send a confirmation DM, and users won't have to verify again.
Discord claims no biometric data is stored or sent to its servers: Face Scans operate on-device, and ID images are deleted after verification. Notably, the system isn't limited to the PC and mobile apps; it applies to Discord on PlayStation and Xbox as well. It's also unclear how long this test will last, or whether verification will be rolled out to other regions later.
The timing does little to ease broader concerns about where the platform is headed. Discord is reportedly preparing to go public, raising fears of aggressive monetization and other drastic changes. “RIP Discord, brought into the cycle of infinite growth at any cost,” one Redditor lamented. With Revolt and other alternatives gaining attention, some longtime users appear ready to jump ship.