In February, the European Data Protection Board (“EDPB”) adopted Guidelines on deceptive design patterns in social media platform interfaces – how to recognize and avoid them (the “Guidelines”).
The Guidelines focus on providing recommendations and guidance for the design of social media platform interfaces; however, all organizations and users within the digital ecosystem will likely have some familiarity with the techniques discussed. The EDPB has adopted the term “deceptive design patterns” in an effort to define these practices more descriptively and inclusively; they are more commonly referred to as “dark patterns”.
The EDPB seemingly signals its expectation that provisions of the General Data Protection Regulation (“GDPR”) will be enforced in the EU to help users effectively protect their personal information and make conscious choices. It is quite likely that regulators in other jurisdictions, including Canada, will soon be turning their minds to this issue, to the extent that they have not already.
What are Dark Patterns?
Definitions vary, but the Guidelines define “deceptive design patterns” as interfaces and user journeys that “attempt to influence users into making unintended, unwilling and potentially harmful decisions, often toward a decision that is against the users’ best interest and in favour of the social media platforms interests, regarding the processing of their personal data”.
We have likely all encountered some form of dark pattern in our own online user journeys – from nudging practices that persuade individuals to volunteer more personal information than intended, to mailing lists that make signing up easy but unsubscribing nearly impossible. These nudging techniques go by different names (roach motels, “privacy zuckering”, confirmshaming, etc.), but from a data privacy perspective they share a common feature: an individual’s meaningful consent may not be obtained before personal information is collected, and the transparency obligations required under privacy laws may not be met.
Combatting Dark Patterns through Privacy Laws
There is a clear overlap between consumer protection and privacy laws in this area. However, in light of heightened powers of enforcement being proposed under the Consumer Privacy Protection Act (“CPPA”) and recent guidance and legislation from abroad, this is an area to which privacy regulators in Canada are likely to direct their attention.
Canada’s federal private sector privacy law, the Personal Information Protection and Electronic Documents Act (“PIPEDA”), requires organizations to obtain the “knowledge and consent” of an individual before collecting, using or disclosing his or her personal information. The “knowledge” requirement means organizations must be transparent about the purposes for which the personal information will be used. The “consent” requirement means that any consent must be meaningful and that information must be collected by fair and lawful means (which excludes deceptive practices).
Because “meaningful” consent is generally the only ground for the lawful processing of personal information in Canada, this raises the questions: what tools does the federal Privacy Commissioner currently have to deal with these deceptive design patterns, what tools may it have in the future, and how do other jurisdictions address this issue through the lens of privacy laws?
The table below, while not comprehensive, outlines some of these considerations:
PIPEDA | Under PIPEDA, the requirement that personal information be collected by fair and lawful means is intended to prevent organizations from misleading or deceiving individuals about the purposes for which information is being collected. This requirement implies that consent to collection must not be obtained through deception. [Principle 4.4.2, PIPEDA] |
CPPA (draft) | The CPPA goes further than PIPEDA, expressly stating that any consent obtained by providing false or misleading information, or by using deceptive or misleading practices, is not valid. There is no specific definition of dark patterns, presumably to allow for flexible interpretation of these evolving practices, but dark patterns would likely be caught by this provision. [Section 16, CPPA] |
Quebec | Quebec, having already updated its privacy law through Bill 64, will implement a “privacy by default” requirement in September 2023: any entity that collects personal information when offering a technological product or service to the public must, by default, provide users with the highest level of confidentiality. This measure attempts to prevent dark patterns from occurring in the first place. [Section 108, An Act to modernize legislative provisions as regards the protection of personal information] |
GDPR | The GDPR requires data controllers to build data protection into the processing of personal data by design and by default, throughout the processing. Such measures must ensure that, by default, personal data are not made accessible to an indefinite number of natural persons without the individual’s intervention. Consent must also be “informed” and “freely given”. Similar to Quebec’s law, users must by default be given a choice before their personal information is disseminated, and this principle is integrated into the design process. [Article 25 and Article 7 of the GDPR, respectively] |
California | The California Privacy Rights Act (“CPRA”) amends and adds to the California Consumer Privacy Act and further targets the use of dark patterns. The CPRA defines the term “dark patterns” and, within the definition of consent, expressly states that agreement obtained through the use of dark patterns does not constitute consent. [Section 1798.140(h), Civil Code] |
These are all mechanisms through which privacy regulators may seek to protect consumers from being deprived of the ability to make conscious choices about how their personal data is processed and how much of it is collected. We will be following this topic closely.
Takeaways for businesses
Given the attention dedicated to dark patterns by global data protection regulators, we anticipate that Canadian privacy commissioners will be looking for their own cases to investigate. Businesses should review their websites, apps and other digital properties to assess whether they are being transparent and whether consent is meaningful.
Businesses should consider conducting privacy impact assessments, whether formal or informal, on user interfaces and interface changes, and the privacy officer should be brought in early, at the design stage. In-house and third-party developers should not be left to design interfaces on their own without privacy input.
For more information about this and other topics and how we can help, please see our unique Dentons Data suite of data solutions for every business, including enterprise privacy audits, privacy program reviews and implementation, data mapping and gap analysis, and training in respect of personal information.