Rahul Matthan: How age tokens used to get past age gates favour free speech

India could leverage Aadhaar to generate zero-knowledge tokens of proof that verify a user’s age without revealing any other personal information. (Mint)
Summary

In the US, keeping children safe from exposure to harmful stuff online has run into a free-speech muddle. But in India, we have the digital infrastructure needed to reliably ensure that kids aren’t exposed to content meant for adults.

In September 2022, California governor Gavin Newsom signed the Age-Appropriate Design Code Act (CAADCA) into law, introducing a new legislative framework to protect children online. The law required businesses whose services were likely to be accessed by minors to conduct ‘data protection impact assessments’ before launching new features. These assessments had to evaluate whether platforms could expose children to “harmful or potentially harmful materials” and set out mitigation strategies for any risks identified.

NetChoice, a tech industry trade group, immediately sued California Attorney General Rob Bonta, arguing that the law violated the US First Amendment and other constitutional provisions. A district court issued a preliminary injunction in September 2023, which was subsequently broadened on 13 March 2025.


The court’s decision was based on the finding that the law violated the constitutional guarantee of free speech by compelling businesses to “opine on and mitigate the risk that children may be exposed to harmful or potentially harmful materials online.” This, the court held, essentially forced platforms to make editorial judgments about speech content, which was not permitted under the near-absolute right to speech available under the US Constitution.

A closer look at the court’s reasoning suggests that the problem is not just that the CAADCA restricted free speech. Since there is no reliable way in the US to distinguish children from adults online, platforms under pressure to deny children access end up restricting what adults can see as well. Without a mechanism to accurately identify which of their users are children, platforms know that under-restricting access risks non-compliance with the law. As a result, most will choose to over-restrict, preventing adults from uploading or accessing content that they have a constitutional right to see and share.


The situation in India is somewhat different. While the Indian Constitution also grants citizens the right to freedom of speech and expression, it is subject to “reasonable restrictions.” As a result, when the state insists that intermediaries ought to police harmful content, courts rarely interfere, even if this means that these platforms have to exercise the sort of ‘editorial’ judgement over content that US courts have struck down as unconstitutional.

That said, India has something that the US lacks: digital infrastructure that offers an elegant solution to the age verification problem that has resulted in US companies over-compensating in order to comply. This means that even though Indian courts may not interfere with laws that require online intermediaries to take editorial decisions on content, Indian platforms are better equipped than their American counterparts to strike an appropriate balance while moderating content.


In an earlier article, I had described how India could leverage Aadhaar to generate zero-knowledge tokens of proof that verify a user’s age without revealing any other personal information. I had discussed this solution in the context of India’s Digital Personal Data Protection (DPDP) Act of 2023, which imposes strict age-based restrictions on the processing of personal data by requiring verifiable parental consent before any such data of a child can be processed. That idea has now been incorporated into the draft DPDP rules, which expressly contemplate the use of “virtual tokens mapped to identity and age.”

If we can apply the same technological solution to content moderation, it should be possible for Indian intermediaries to avoid the overreach problem that has plagued content regulation in the US. 

If platforms can accurately determine which of their users are children and which are adults, they will be able to narrowly tailor content restrictions so that these apply only to those below the permitted age. This will mean that children attempting to access age-inappropriate material can be redirected away from such content without affecting the ability of adults to access it. This will enable platforms to uphold the fundamental right to freedom of speech and expression while also ensuring that reasonable restrictions can be applied with surgical precision.
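The gating logic described above can be sketched in code. The following is a purely illustrative Python sketch, not the actual Aadhaar or DPDP-rules mechanism: a hypothetical trusted issuer signs a minimal claim containing only an over-18 flag, and the platform verifies the token and routes the user accordingly without learning anything else about them. A real deployment would rely on zero-knowledge proofs or public-key signatures rather than the shared HMAC key used here for brevity; the function names and token format are invented for this example.

```python
import hmac
import hashlib
import json
import secrets

# Assumption: the issuer and verifier share this key. In practice the
# issuer would sign with a private key (or emit a zero-knowledge proof)
# and platforms would verify without any shared secret.
ISSUER_KEY = secrets.token_bytes(32)

def issue_age_token(over_18: bool) -> dict:
    """Issuer side: sign a claim that carries only an age flag and a nonce,
    so the platform learns nothing else about the user."""
    claim = {"over_18": over_18, "nonce": secrets.token_hex(8)}
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def gate_content(token: dict) -> str:
    """Platform side: verify the token, then restrict only under-age users."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["tag"]):
        return "blocked"  # forged or tampered token
    return "adult-content" if token["claim"]["over_18"] else "child-safe"
```

The point of the sketch is the asymmetry it makes visible: the restriction binds only to the verified age flag, so adults pass through untouched while a tampered or under-age token is narrowly redirected.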

As a result, even though India has narrower constitutional protections for speech than the US has, it is capable of striking a far better balance when protecting children from harmful content. Since we have a technological infrastructure that offers a simple, low-friction solution to age verification, Indian platforms are able to build solutions that can target content more effectively than is possible in the US.


Content moderation is a global challenge. Every country needs to strike a balance between over-moderation and children’s exposure to harm that is appropriate in its own unique context. Given the volume and velocity at which content is generated online, no country will be able to do this effectively unless it puts in place an effective age-verification mechanism.

India’s age-token solution represents a remarkable inversion of the way in which content moderation is currently conceptualized. It offers an answer that not only preserves privacy, but ensures that content restrictions can be devised to apply only to those in need of protection. Sometimes, solutions lie not in the laws we enact, but in the infrastructure we erect.

The author is a partner at Trilegal and the author of ‘The Third Way: India’s Revolutionary Approach to Data Governance’. His X handle is @matthan.
