The Office of the Privacy Commissioner of Canada (OPC) recently concluded its exploratory consultation on age assurance and has indicated that its next step will be to prepare a draft guidance document on the use and design of age-assurance systems. We discuss the results of the consultation and compare the approaches to children’s privacy taken in Canada, the U.S. and the EU.
1. Background
Increasing awareness of the sensitivity of children’s personal information has led businesses to try to limit risk by avoiding the collection of such information, or by treating it differently.
Children’s privacy has also become a key focus for privacy regulators around the world. As concerns mount over young people’s exposure to harmful content and data practices online, age assurance technologies – tools used to verify or estimate a user’s age – are gaining prominence as a potential safeguard.
In Canada, the OPC recently conducted an exploratory consultation on age assurance, seeking input from stakeholders across industry, civil society, academia, and international data protection authorities. It has been signaling its interest in this area for some time (see, for instance, the 2023 Joint Resolution and Privacy Commissioner Philippe Dufresne’s comments during his appearance before Parliament). The OPC’s strategic priorities for 2024-2027 include protecting children’s privacy, with age assurance being an area of focus.
Meanwhile, the European Data Protection Board has set out 10 guiding principles applicable to online use cases of age assurance technologies, and in the United States, state-level age assurance laws are triggering constitutional challenges over free speech and privacy.
This article explores how the regulation of age assurance technologies is evolving in Canada and abroad, and what Canadian businesses should be doing now to prepare.
2. OPC’s consultation on age assurance
From June to September 2024, the OPC held an exploratory consultation on age assurance, receiving over 40 responses. With age-based restrictions becoming more common online, the stated goal of the consultation was to increase the OPC’s “understanding of the benefits, concerns, and existing research or writing associated with age assurance” and support “…the next step(s) of our work, which will include the creation of a draft guidance document about the use and design of age-assurance systems that will also be subject to consultation.”
The OPC published the results of its consultation in March 2025. In summarizing and responding to the feedback on its preliminary position, the OPC acknowledged the trade-offs associated with age assurance tools – while they can reduce harm, they may also undermine anonymity, facilitate discrimination, or create barriers for some users. Drawing on submissions from a range of stakeholders, the OPC identified the following six themes:
A. Differentiate between forms and uses of age assurance: Respondents emphasized that “age assurance” is an umbrella term that includes a range of techniques, from simple self-declaration to biometric age estimation. Each method comes with its own implications for privacy, accuracy, and suitability. Many urged the OPC to tailor its future guidance to the specific context and use case, rather than applying a one-size-fits-all approach.
B. The impacts associated with the use or misuse of age assurance should not be underestimated: Stakeholders noted that age assurance can help prevent serious harms to youth, such as exposure to violent content or other explicit material. However, they also warned that improperly implemented systems could restrict access to critical content, particularly for LGBTQ+ youth and marginalized groups. The privacy risks of breaches or misuse were also highlighted, especially when sensitive browsing habits could be exposed.
C. Age assurance is a tool – not the goal: Many submissions emphasized that the ultimate goal is to protect children, not to verify age for its own sake. Age assurance should be treated as just one option among several, alongside tools like parental controls, education, and safer design practices. Respondents called for a focus on outcomes, noting that effectiveness should be measured by harm reduction, not just technical precision. Even partial or imperfect methods may contribute meaningfully to youth protection.
D. Clarify who is responsible: Submissions varied on where accountability should lie. Some advocated for device-level controls or parental choice, while others argued that platforms and content providers must be accountable. Most agreed that putting the onus solely on families or individuals would be insufficient.
E. Use caution with age estimation: Age estimation tools vary, but respondents highlighted that those processing biometric characteristics raised concerns about bias, data sensitivity, and privacy. Critics questioned the appropriateness of using facial recognition or similar techniques on young users. That said, others viewed age estimation as a more privacy-conscious alternative to collecting government ID, particularly where measurements of physical characteristics are not involved.
F. Apply a risk-based, proportional approach: The OPC’s initial position was to limit age assurance to high-risk scenarios, which many supported, but others found too restrictive. Critics argued that this could create loopholes and allow companies to avoid responsibility in moderate-risk cases. A more flexible, proportional approach was recommended, where the level of assurance matches the risk posed by the content or service.
The OPC intends to issue formal guidance elaborating on and defining its expectations for the design, development, and use of privacy-protective age assurance techniques; however, it may engage in further consultations before doing so.
3. Global approaches to age assurance
The OPC’s consultation fits within a broader global trend, with the EU and the U.S. each advancing its own strategy for governing the use of age assurance technologies.
A. The European/UK perspective: Guiding principles for GDPR-compliant age assurance
In February 2025, the European Data Protection Board (EDPB) released its first major statement on age assurance. The statement outlines the following 10 principles for ensuring General Data Protection Regulation (GDPR)-compliant practices when implementing age assurance technologies online:
- Enjoyment of Rights and Freedoms: Age assurance must respect all fundamental rights, with the best interests of the child as a primary consideration.
- Risk-Based Proportionality: Use of age assurance should be necessary, proportionate to risk, and supported by a Data Protection Impact Assessment or Child Rights Impact Assessment, where appropriate.
- Preventing Data Protection Risks: Systems must avoid tracking, profiling, or repurposing data beyond age assurance itself.
- Purpose Limitation and Data Minimization: Controllers should only collect what’s strictly needed.
- Effectiveness: Systems must be accurate, accessible, and resilient, with redress mechanisms in place.
- Lawfulness, Fairness and Transparency: Processing must have a clear legal basis, and users – especially children – must be informed in understandable terms.
- Automated Decision-Making: If automation is used, include human oversight and appeal options, particularly where children are affected.
- Data Protection by Design and Default: Systems must use privacy-preserving technologies like local processing, unlinkability, and selective disclosure.
- Security: Providers must implement safeguards, anticipate breaches, and ensure continuity in case of system failures.
- Accountability: Providers must maintain governance frameworks to define responsibilities, enable audits, and promote transparency and trust.
The EDPB may issue further guidance addressing specific use cases. The EDPB’s guidance is expected to influence enforcement under the GDPR and to inform related regulatory frameworks, including those in Canada.
In the UK, children’s privacy is also protected by the Children’s Code (also known as the Age Appropriate Design Code or AADC). The Children’s Code, introduced by the Information Commissioner’s Office, is a legal requirement for online services likely to be accessed by children. It mandates high privacy protections by default for children, requiring services to collect, share, and use children’s data responsibly.
B. In the U.S.: a wave of legislative activity and litigation
In contrast to the guidance-based approach seen in the EU and anticipated in Canada, various U.S. states have taken a legislative approach. These efforts to introduce age assurance laws, particularly those aimed at restricting minors’ access to online pornography and social media, have triggered a flurry of legal challenges. Courts are now being asked to weigh child protection against long-standing constitutional rights, including free speech and privacy. Key developments include:
Indiana and Texas’ age verification mandates: A coalition of businesses, including adult content platforms, sued Indiana’s Attorney General over a new law that requires users to verify they are over 18 by presenting an identification document or completing age verification through a third-party provider. The plaintiffs argue the law violates multiple constitutional rights, poses privacy and data security risks, and is easily circumvented by minors. They also claim the law unfairly excludes major platforms like search engines and social media, which often host similar content.
This case is now on hold, as it is closely tied to a parallel challenge in Texas, which is currently before the U.S. Supreme Court. The Texas law not only mandates age verification for adult websites, but also requires these sites to display “health warnings” deemed pseudoscientific by critics. The outcome of the Texas case is expected to shape how courts assess similar laws nationwide, including Indiana’s, and could significantly influence the future of online age assurance laws in the United States.
Mississippi’s age-based content restrictions under review: In Mississippi, a lobby group representing major players in the tech industry is challenging a law that requires age verification and restricts access to online content deemed harmful to minors. Critics say the law sweeps too broadly, jeopardizing access to protected speech, including historical texts and pop culture references, and puts sensitive personal data at risk. The court granted a preliminary injunction, agreeing that the law likely violated First Amendment rights and raised serious privacy concerns by requiring the collection of sensitive personal data. The State of Mississippi has since appealed the decision. The appeal was heard in February 2025, and a ruling remains pending.
Arkansas’ Act deemed unconstitutional: The U.S. District Court for the Western District of Arkansas permanently struck down Arkansas’s age verification law, the Social Media Safety Act, finding it unconstitutional. The law would have required platforms to collect and verify sensitive personal data, like identification documents, before granting access to digital communication services. The court ruled in March 2025 that the law violated the First Amendment and imposed burdens on all users, not just minors. It also raised cybersecurity risks by compelling platforms to obtain sensitive personal information.
New York’s SAFE for Kids Act and Child Data Protection Act: In contrast to age verification-focused laws, New York has passed broader digital safety legislation. The SAFE for Kids Act restricts addictive social media feeds for minors without parental consent, while the Child Data Protection Act restricts the collection and use of data from individuals under 18. Supporters say the laws give parents stronger tools to protect children’s mental health. Critics caution that the measures could lead to vague enforcement standards and unintended censorship.
California: California was one of the first states to address the issue and has specific laws regarding children’s privacy, built upon the broader frameworks of the California Consumer Privacy Act (CCPA) and the California Privacy Rights Act (CPRA). These laws require businesses to obtain affirmative consent for the sale or sharing of personal information of children under 16. For children under 13, this consent must come from a parent or guardian; those aged 13 to 15 can provide consent themselves. The laws also address the “right to be forgotten,” allowing minors to request deletion of their personal data.
California was also the first state to pass legislation specifically focused on children’s privacy. The California Age-Appropriate Design Code (CAADC) is similar to the UK’s AADC. Among other things, it requires businesses whose services are “likely to be accessed by” minors to implement age estimation; if a business is unable to do so, it must apply the more stringent privacy and data protections afforded to children to all users.
4. Takeaways for Canadian businesses
As regulators and legislators grapple with the issues of protecting youth and preserving privacy, age assurance is fast becoming a new frontier in the digital governance conversation. With new standards and laws emerging globally, and the OPC poised to issue guidance, businesses operating online platforms in Canada – particularly those accessible to minors – should be proactive, not reactive, in how they protect children and safeguard their privacy. Here are five key takeaways:
Guidance is coming: The OPC is expected to release guidance clarifying when age assurance should be used and how it can be implemented in a privacy-protective manner. Businesses that target or attract minors should be ready to align with these expectations as they take shape.
Data hygiene: Organizations that have collected the information of minors should consider whether their current data governance processes (i) allow for it to be identified; and (ii) allow for it to be treated differently from the information of other users. Similar considerations will apply on a go-forward basis once guidance and/or laws are in effect. Organizations should not underestimate the time and resources this will take and should consider addressing it sooner rather than later. For some businesses, this may mean adjusting business models or customer flows.
Proportionality matters: Businesses should start evaluating their platforms: Do they attract a significant number of youth users? Is the content sensitive, regulated, or potentially harmful to minors? Organizations may soon be expected to show that any use, or non-use, of age assurance is justified and proportionate to the potential risks involved.
Design with users in mind: If age assurance is implemented, it should be simple, transparent, and privacy-forward. Minimize data collection, avoid repurposing age data, and offer alternatives wherever feasible. A burdensome or opaque process could erode trust and trigger regulatory scrutiny. Businesses should begin acquainting themselves with the various age assurance tools and age appropriate design considerations.
Watch the global trend: While Canadian guidance and regulation may unfold more gradually than in the U.S. or Europe, global expectations are converging around the need to safeguard children’s rights and interests in a privacy-protective manner.
For more information on this topic, please contact Jaime Cardy, Janice Philteos, or other members of the Dentons Privacy and Cybersecurity group.