Children’s privacy is increasingly in the regulatory spotlight, and a new consultation paper from the UK suggests that even organizations that do not specifically target children may have regulatory obligations. Canada does not currently have legislation specific to children’s privacy, but in light of the Office of the Privacy Commissioner of Canada’s (“OPC”) recent Guidelines for Obtaining Meaningful Consent (“Meaningful Consent Guidelines”), the OPC may take a similarly broad approach to the interpretation and application of privacy laws.
UK Information Commissioner’s Approach
On May 31, 2019, the UK’s Information Commissioner’s Office (“ICO”) wrapped up a month-and-a-half-long public consultation on its draft Age appropriate design code of practice (“Code of Practice”). The ICO states that the draft Code of Practice provides standards for the collection and use of personal information by organizations providing “online services likely to be accessed by children”. A closer review of the draft Code of Practice shows, however, that it is likely to apply much more broadly.
The draft Code of Practice also applies to companies that provide online products or services that aren’t specifically aimed at or targeting children under the age of 18, but that are nonetheless still likely to be accessed by them. If a provider believes its service is likely to be used only by adults, and therefore escapes the application of the draft Code of Practice, the provider must be able to demonstrate that this is in fact the case:
You may be able to rely on market research, the nature and context of the service, or specific measures you have taken to limit access by children. The important point is that even if the service is aimed at adults, you must be able to point to specific documented evidence to demonstrate that children are not likely to access the service in practice.
The draft Code of Practice imposes 16 standards that, in practice, would significantly limit providers’ flexibility in designing their electronic content and in collecting personal information. A sampling of the standards requires providers to:
- design services with the best interest of the child as a consideration;
- set privacy settings at ‘high-privacy’ as a default;
- switch off profiling and geolocation options by default; and
- avoid the use of nudge techniques that could encourage children to provide unnecessary personal data or to weaken or turn off their privacy protections.
A full list of the 16 standards and the full draft Code of Practice can be found here.
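For teams building online services, the default-settings standards above might translate into something like the following. This is only a minimal, hypothetical sketch (the interface, field names and default values are illustrative, not drawn from the Code of Practice): new accounts start at the most privacy-protective settings, with profiling and geolocation switched off until the user makes an active choice to change them.

```typescript
// Hypothetical settings model illustrating "high privacy by default":
// profiling and geolocation are off, and visibility is private, unless
// the user later makes an active, informed choice to change them.
interface PrivacySettings {
  profileVisibility: "private" | "contacts" | "public";
  profilingEnabled: boolean;   // behavioural profiling / personalization
  geolocationEnabled: boolean; // location tracking
}

// Defaults applied to every new account before any user interaction.
const HIGH_PRIVACY_DEFAULTS: PrivacySettings = {
  profileVisibility: "private",
  profilingEnabled: false,
  geolocationEnabled: false,
};

function createAccountSettings(overrides: Partial<PrivacySettings> = {}): PrivacySettings {
  // Overrides should only ever reflect an explicit user choice,
  // never a nudge or a pre-ticked box.
  return { ...HIGH_PRIVACY_DEFAULTS, ...overrides };
}
```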
In addition to imposing significant standards and controls, the draft Code of Practice would expose providers to potential regulatory action for failure to comply with it and with other applicable UK legislation such as the Data Protection Act, the General Data Protection Regulation and the Privacy and Electronic Communications Regulations. The ICO would be empowered to issue fines of up to €20 million or 4% of annual worldwide turnover, whichever is greater.
This draft Code of Practice would apply to UK-based services as well as, importantly for many organizations, any online service based outside the UK that has a branch, office or other ‘establishment’ in the UK and that processes personal data in the context of the activities of that establishment.
How Does the Canadian Model Measure Up?
Canada does not have a similar code of practice and, in marked contrast, does not have any legislation specific to the collection of children’s personal information.
Organizations that collect, use or disclose personal information in the course of their commercial activities are governed by the Personal Information Protection and Electronic Documents Act (PIPEDA), which requires that organizations obtain consent for the collection, use or disclosure of personal information, with some exceptions. PIPEDA also requires that organizations make reasonable efforts to ensure that individuals are advised of the purposes for the collection, use or disclosure of their personal information, and that those purposes be stated in a manner the individual can reasonably understand. What PIPEDA does not do, however, is differentiate between adults and individuals under the age of 18.
However, in its Meaningful Consent Guidelines, which it began enforcing on January 1, 2019, the OPC imposes on organizations an obligation to take into account the ability of users to understand what they are consenting to, and specifically states that “the level of maturity” of a user is one such consideration (footnotes omitted, emphasis added):
The OPC is of the view that while a child’s capacity to consent can vary from individual to individual, there is nonetheless a threshold age below which young children are not likely to fully understand the consequences of their privacy choices, particularly in this age of complex data-flows. On the other hand, the OIPC-AB, OIPC-BC and Quebec CAI do not set a specific age threshold, but rather consider whether the individual understands the nature and consequences of the exercise of the right or power in question. As such, where a child is unable to meaningfully consent to the collection, use and disclosure of personal information (the OPC takes the position that, in all but exceptional circumstances, this means anyone under the age of 13), consent must instead be obtained from their parents or guardians. For minors able to provide meaningful consent, consent can only be considered meaningful if organizations have reasonably taken into account their level of maturity in developing their consent processes and adapted them accordingly. Organizations undertaking such collections, uses or disclosures should pay special mind to Guiding Principle 7, and stand ready to demonstrate on demand that their chosen process leads to meaningful and valid consent.
There is no specific commentary about whether services or products are targeted to children or not. Instead, this would simply be part of the context which determines the nature of the consent required.
Takeaways for Business
While the Canadian approach to online privacy and the protection of children’s personal information has thus far been less prescriptive than the proposed approach in the UK, the OPC’s recent announcements signal that this is a topic of importance to its Office and other policymakers. Privacy regulators globally are generally in contact with each other, and it would not be surprising if the OPC adopted the approach of the draft Code of Practice.
Businesses that believe their services are being provided to a largely adult demographic should consider why they believe that to be the case, and consider gathering support for that assertion. Online providers may want to consider implementing age-gating mechanisms as part of their overall privacy compliance program.
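As a rough illustration of the age-gating point (and not a statement of what any regulator requires), a simple gate might ask for a date of birth and route users below an assumed threshold into a parental-consent flow, with an adapted consent process for older minors. The thresholds and function names below are assumptions for the sketch only, loosely following the OPC position described above.

```typescript
// Illustrative age gate: classify a user by self-declared date of birth
// so the service can require parental consent or an adapted consent
// process. Thresholds (13 and 18) are assumptions for illustration,
// not legal advice.
type ConsentRoute = "parental-consent" | "adapted-minor-consent" | "standard-consent";

function ageInYears(dateOfBirth: Date, now: Date = new Date()): number {
  let age = now.getFullYear() - dateOfBirth.getFullYear();
  const hadBirthdayThisYear =
    now.getMonth() > dateOfBirth.getMonth() ||
    (now.getMonth() === dateOfBirth.getMonth() && now.getDate() >= dateOfBirth.getDate());
  return hadBirthdayThisYear ? age : age - 1;
}

function consentRouteFor(dateOfBirth: Date): ConsentRoute {
  const age = ageInYears(dateOfBirth);
  if (age < 13) return "parental-consent";      // consent from a parent or guardian
  if (age < 18) return "adapted-minor-consent"; // consent process adapted to maturity
  return "standard-consent";
}
```

Whatever mechanism is used, the point of the documentation it generates is evidentiary: it helps a provider show, if asked, who its users actually are and how consent was obtained.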
Organizations providing services to children between the ages of 13 and 18 should review their consent processes and privacy policies against the Meaningful Consent Guidelines to determine whether they are adequate.
__
For more information about Dentons’ data expertise and how we can help, please see our unique Dentons Data suite of data solutions for every business, including data mapping, contractual review, and consent benchmarking. Our Transformative Technologies and Data Strategy page has more information about our sophisticated tech practice, which focuses on data-driven technologies such as artificial intelligence, data analytics, and digital identity.