Bill C-11 (the Digital Charter Implementation Act, 2020) was introduced on November 17, 2020. It consists of two parts: Part 1, which would enact the new Consumer Privacy Protection Act (CPPA), and Part 2, which would enact legislation establishing the Personal Information and Data Protection Tribunal (Tribunal). It also incorporates the amendments previously made to PIPEDA in 2015 via the Digital Privacy Act. The current PIPEDA would continue to exist, but it would be focused on the electronic documents aspects of e-commerce.
Notably, PIPEDA’s Schedule 1 Principles (based on the OECD Principles) are gone, replaced by the actual language of the proposed CPPA. This will be a welcome development for many businesses that found the unusual structure and vague language of the Schedule 1 Principles challenging.
The key proposals are set out below. For more information, see the links to our CPPA: In Depth series of posts at the end of this post.
Scope
Notably, the Bill retains the balancing language of PIPEDA: the stated goal is “to support and promote electronic commerce” by protecting personal information (PI). As with PIPEDA, the scope is limited to PI used in the course of commercial activities, which are now defined (see definitions, below). This need for balancing is emphasized again in the Bill’s Purpose section (section 5).
New definitions
There are several new definitions, which would have an impact on businesses and their privacy practices.
“Automated decision systems” would now be brought into the ambit of the CPPA. They are defined as “any technology that assists or replaces the judgement of human decision-makers using techniques such as rules-based systems, regression analysis, predictive analytics, machine learning, deep learning and neural nets”. This is an ambitious definition, and businesses will likely want to understand better what exactly is meant by the phrase “assists or replaces” humans, as this could potentially include everything from artificial intelligence to Magic 8 Balls.
As mentioned above, “commercial activity” is now defined and qualified. It means (as it does in PIPEDA) “any particular transaction, act or conduct or any regular course of conduct that is of a commercial character” but is now qualified by adding “taking into account an organization’s objectives for carrying out the transaction, act or conduct, the context in which it takes place, the persons involved and its outcome.”
With the increased demand for and use of de-identified and anonymized information, the Bill now adds a definition of what it means to de-identify PI: it means “to modify personal information — or create information from personal information — by using technical processes to ensure that the information does not identify an individual or could not be used in reasonably foreseeable circumstances, alone or in combination with other information, to identify an individual.”
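By way of a purely illustrative sketch, the example below shows the kinds of “technical processes” the definition appears to contemplate: removing direct identifiers, substituting non-reversible tokens, and generalizing quasi-identifiers. The field names, keyed-hash approach and generalization choices are our own assumptions; the Bill does not prescribe any particular method.

```python
import hmac, hashlib

# Illustrative only: a hypothetical de-identification routine, not a method
# mandated by the Bill. Assumes the key is stored separately from the data.
SECRET_KEY = b"replace-with-a-securely-stored-key"

def pseudonym(value: str) -> str:
    """Replace a direct identifier with a keyed, non-reversible token."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

def de_identify(record: dict) -> dict:
    """Strip or transform fields so the record no longer identifies an individual."""
    return {
        "person_token": pseudonym(record["email"]),    # direct identifier replaced
        "region": record["postal_code"][:3],           # generalize to the forward sortation area
        "age_band": f"{(record['age'] // 10) * 10}s",  # generalize exact age to a decade band
        "purchase_total": record["purchase_total"],    # retain the analytically useful value
    }

print(de_identify({
    "email": "jane.doe@example.com",
    "postal_code": "M5V 3L9",
    "age": 42,
    "purchase_total": 129.99,
}))
```

Whether measures like these would meet the “reasonably foreseeable circumstances” threshold in any given case would depend on the context and on what other information is available, a point returned to below.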
Businesses will be concerned about the lack of a definition of “anonymization”, which, under other privacy regimes, is a carve-out from the application of privacy laws. Instead, there is a definition of “disposal” – the permanent and irreversible deletion of PI. However, this appears to contemplate only actual deletion, not anonymization. This is a significant gap.
The term “service provider” is now defined and means an “organization, including a parent corporation, subsidiary, affiliate, contractor or subcontractor, that provides services for or on behalf of another organization to assist the organization in fulfilling its purposes”. While many businesses will welcome the clarity, there are likely others that will be surprised to find themselves caught by this definition (e.g., parent corporations).
Clarity on accountability
The Bill introduces a distinction similar to the controller/processor distinction found in the GDPR. Under the Bill, the organization that is accountable for PI is the one that collects it and that “determines the purposes for its collection, use or disclosure, regardless of whether the information is collected, used or disclosed by the organization itself or by a service provider on behalf of the organization.”
Explicit requirement to have a privacy management program
Section 9(1) of the Bill requires every organization to implement a privacy management program (Program), and such Program must include “the organization’s policies, practices and procedures” that it puts in place to fulfil its obligations, including those pertaining to the protection of PI, requests for access to PI, training for staff, and explanatory materials.
Interestingly, the Program must take into account both the sensitivity and the volume of the PI under the organization’s control. This is likely a nod to the risk of re-identification of data, but could mean that organizations with large volumes of very low sensitivity information could be required to enhance their existing programs.
Organizations would also be required to give access to the OPC to “policies, practices and procedures” included in the Program. If this provision of the Bill survives, organizations should consider carefully scoping their “privacy program” and defining which documents are in scope (and therefore accessible to the OPC) and which documents are out of scope (and therefore protected from access).
Appropriate purpose now has “factors” that must be considered
PIPEDA allowed the collection of PI for undefined “appropriate purposes”. The Bill now sets out factors that would have to be considered when an organization is determining whether a collection, use or disclosure of PI is appropriate.
These purposes for the collection, use or disclosure must be recorded prior to collection, and any new purpose must be recorded prior to use or disclosure. Presumably, this could be done by revising a privacy policy. A failure to record the purposes appropriately would mean a violation of section 13, which limits collection of PI to such recorded purposes.
Use or disclosure for purposes other than those recorded requires valid consent. However, there are exceptions: an organization may use PI for purposes other than those recorded where those purposes fall within certain enumerated business activities, where the PI is de-identified, where business transactions are being contemplated, for the prevention of fraud, and in a number of other listed circumstances.
Still consent based
Valid consent to collection, use and disclosure is still required, but the validity of consent is now contingent upon certain information being provided in “plain language”:
- The purpose of the collection, use or disclosure;
- The way in which the PI is to be collected, used or disclosed;
- “Any reasonably foreseeable consequences” of such collection, use or disclosure;
- The specific type of PI that is to be collected, used or disclosed; and
- The names of any third parties or types of third parties to which the organization may disclose the PI.
There is likely to be much debate about what is meant by “reasonably foreseeable consequences” of a collection, use or disclosure, and what standard will be used.
Express consent now the default requirement
Under the Bill, consent would have to be expressly obtained, unless an organization could establish that implied consent would be appropriate. This is likely to increase the documentation requirements for organizations.
Withdrawal of consent exception now includes contract terms
On giving reasonable notice to an organization, an individual may, at any time, “subject to this Act, to federal or provincial law or to the reasonable terms of a contract,” withdraw their consent in whole or in part. Once so notified, an organization must cease the collection, use or disclosure of that person’s PI.
The “reasonable terms of a contract” is a modification of the language of PIPEDA. If this language survives into the final Act, organizations should consider including such language expressly in their contracts.
Additional exceptions to consent
In a marked departure from PIPEDA, but in alignment with other international privacy laws, neither knowledge nor consent is required to collect or use PI where it is done for one of a number of listed “business activities” AND it is reasonable and expected, AND it is not being “collected or used for the purpose of influencing the individual’s behaviour or decisions.”
While this would be good news for many businesses in respect of their ordinary and routine activities, the latter prohibition will likely have a significant impact on marketing/targeted ads, as it essentially forces an organization to seek consent.
Also expressly permitted without knowledge or consent are:
- Transfers to service providers (see the expanded definition of service provider, above);
- Use of PI for internal research and development purposes provided the PI is de-identified prior to doing so;
- Use and disclosure of PI for prospective and completed business transactions. This was provided for in PIPEDA, and the Bill version now includes a requirement to de-identify the PI prior to using and disclosing it in the context of a proposed business transaction;
- Collection, use or disclosure of PI if it was produced by the individual in the course of their employment, business or profession and the collection, use or disclosure is consistent with the purposes for which the information was produced;
- Disclosures to lawyers or notaries representing an organization;
- Disclosures for “socially beneficial purposes”, provided the PI is de-identified and the disclosure is made to a government, health care, post-secondary or similar institution. “Socially beneficial purpose” is defined to mean “a purpose related to health, the provision or improvement of public amenities or infrastructure, the protection of the environment”; and
- Disclosures by a breached organization to other organizations or government that may be in a position to mitigate harm to individuals.
Existing PIPEDA exceptions remain, including disclosures to government institutions such as law enforcement. Disappointingly, the PIPEDA exception for disclosure in the face of “lawful authority”, an ambiguous term with no clear definition, remains. In fact, more instances of the term have been introduced, extending that lack of clarity to those sections as well.
Publicly available information still covered
Publicly available information would still be protected unless listed in the regulations. It would be helpful if the regulations were expanded to include more categories of routinely publicly available PI.
New requirement to dispose of personal information
Organizations must now dispose of PI at the end of its lifecycle (see the definition of disposal, above). As well, individuals would now have a clear right to request the disposal of their PI, subject to certain limited exceptions. An organization disposing of PI must also instruct its service providers to dispose of it and obtain confirmation that they have done so. As discussed earlier, disposal does not appear to include anonymization, which could affect a large number of current business models.
Breach notification obligations would apply directly to service providers
Under the Bill, if a service provider determines that any breach of security safeguards has occurred that involves PI, it would be required to notify the organization that controls the personal information as soon as feasible. It is unclear from the language what the threshold is for “involves”.
More specificity and plain language required in privacy policies
Organizations would be required to use plain language. In addition, any required policies or practices would need to contain certain types of information, including an explanation of the organization’s exceptions to the requirement to obtain consent, a general account of the use of any automated decision system that could have “significant impacts” on individuals, and whether an organization carries out either interprovincial or international transfers that “may have reasonably foreseeable privacy implications”.
Algorithmic transparency introduced
Organizations would be required to provide plain language explanations of the prediction, recommendation or decision made by automated means, and of how the PI that was used to make the prediction, recommendation or decision was obtained. This is a new requirement, and it appears that it would catch organizations that have already adopted such technologies, with no grandfathering. Organizations considering adopting such technologies in the near future would be wise to consider incorporating this into their anticipated deployments.
Data mobility would be required
The Bill would allow an individual to request that an organization disclose the PI that it has collected from the individual to an organization designated by the individual, if both organizations are subject to a “data mobility framework” provided under the regulations. Regulations regarding data mobility frameworks would be made by the Governor in Council.
The current wording only applies to PI collected by the originating organization; it presumably excludes PI that the organization has generated from the PI it collected, although this is not stated expressly. Some clarity on the scope of this right would be beneficial.
De-identification efforts are not “one size fits all”
An organization that de-identifies PI must ensure that any technical and administrative measures applied are proportionate to the purpose for which the information is de-identified and the sensitivity of the personal information. Re-identification efforts are prohibited, except in the context of testing security safeguards.
De-identified information appears to be still subject to the Act, since the definition of “de-identify” suggests it is simply a modification of personal information. In addition, the threshold is whether the de-identified information could be used “in reasonably foreseeable circumstances, alone or in combination with other information, to identify an individual”. This formulation is a departure from the current “serious possibility” test, and its vagueness is unlikely to provide businesses with the certainty they need on this point.
Creation of codes of practice and certification programs
The Office of the Privacy Commissioner of Canada (OPC) would now have the ability to approve codes of practice and certification programs.
The ability to apply for such approval is not limited to “organizations” but extends to all “entities”, which would presumably include industry associations, interest groups and other loosely organized affiliations. Codes of practice must provide protection that is “substantially the same or greater” than that offered under the CPPA.
Certification is more robust, requiring (in addition to a code of practice) interpretation guidelines, certification mechanisms, a mechanism for independent verification, disciplinary measures for noncompliance by members, and other measures.
Compliance with a code of practice or a certification program does not relieve an organization of its obligations under the Act. However, certification is a factor that the OPC must consider when it is deciding whether to launch an investigation as the result of a complaint.
“Inquiries” in addition to “investigations”, and the requirement that “decisions” be made
Subsequent to an investigation, where the matter is not resolved, the OPC would now have the power to launch an “inquiry”. Unlike the PIPEDA provisions in respect of an investigation, an inquiry under CPPA would have basic rules of evidence, the organization would have a right to be heard and be assisted by counsel, and the OPC would be required to complete an inquiry by rendering an actual decision (as opposed to a “finding” under the investigation stage). A decision, unlike a finding, is open to legal challenge.
Privacy Commissioner would have new powers
OPC may make limited orders: In the context of an inquiry, the OPC may make orders requiring an organization to do, or refrain from doing, something. It may also issue interim orders in the context of an investigation.
OPC may “recommend” (but not impose) fines: Subsequent to completing an inquiry, the OPC may recommend to the Tribunal that it impose fines for certain violations of the limitation on collection provisions (and recording requirements for purpose), consent provisions, retention and disposal provisions, security safeguard provisions, and breach reporting obligations.
The maximum penalty for all the contraventions in a recommendation taken together is the higher of $10,000,000 and 3% of the organization’s gross global revenue in its financial year before the one in which the penalty is imposed.
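To illustrate how this cap operates, the short sketch below computes the “higher of” formula using hypothetical revenue figures of our own choosing; it is arithmetic illustration only, not a prediction of any penalty the Tribunal would actually impose.

```python
def administrative_penalty_cap(prior_year_gross_global_revenue: float) -> float:
    """Maximum administrative penalty under the Bill: the higher of $10,000,000
    and 3% of gross global revenue for the financial year before the one in
    which the penalty is imposed."""
    return max(10_000_000, 0.03 * prior_year_gross_global_revenue)

# Hypothetical revenue figures, for illustration only
for revenue in (50_000_000, 200_000_000, 1_000_000_000):
    cap = administrative_penalty_cap(revenue)
    print(f"Gross global revenue ${revenue:,}: maximum penalty ${cap:,.0f}")
# $50M  -> 3% is $1.5M, so the $10M floor governs
# $200M -> 3% is $6M,   so the $10M floor governs
# $1B   -> 3% is $30M,  which exceeds $10M and becomes the cap
```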
The OPC would have to take into account listed factors in determining the quantum of its recommended penalty.
It is the Tribunal that determines whether to impose a penalty; it may choose to rely on the OPC’s decision or substitute its own.
Due diligence defence: Organizations may avail themselves of a due diligence defence which, if successful, prevents the Tribunal from imposing a penalty.
OPC audit powers: The OPC can audit any organization’s personal information management practices, but it must provide the organization with a report of its findings and recommendations. Of concern to organizations will be the fact that under the Bill, these reports may be made public in the OPC’s annual report.
Organizations (and complainants) would have appeal rights
Organizations and complainants would have the right to appeal to the Tribunal (within 30 days) from a finding made in a decision rendered in an inquiry, an order made in such a decision, or a decision not to recommend a penalty. Interim orders issued pursuant to an investigation may be appealed with leave of the Tribunal. The Tribunal can dismiss or allow an appeal, or substitute its own finding, order or decision.
Private right of action introduced
Individuals affected by the acts or omissions of an organization (these individuals do not necessarily need to be complainants) may sue the organization for damages for loss or injury in cases where the Commissioner or Tribunal has made a finding that the organization violated the Act. The action must be brought within two years.
Large penalties
Organizations guilty of an indictable offence are liable to a fine not exceeding the higher of $25,000,000 and 5% of the organization’s gross global revenue in its financial year before the one in which the organization is sentenced. Summary convictions may result in fines not exceeding the higher of $20,000,000 and 4% of the organization’s gross global revenue in its financial year before the one in which the organization is sentenced.
Coming into force
The Bill, once enacted, would come into effect on a day to be fixed by order of the Governor in Council.
Other posts in the CPPA: In Depth series:
Part 5: CPPA: An in-depth look at the data mobility provisions in Canada’s proposed new privacy law
Part 6: CPPA: An in-depth look at the disposal provisions in Canada’s proposed new privacy law
Part 7: CPPA: An in-depth look at the consent provisions in Canada’s proposed new privacy law
Part 8: CPPA: An in-depth look at the access request provisions in Canada’s proposed new privacy law
Part 10: CPPA: An in-depth look at the privacy policy provisions in Canada’s proposed new privacy law
For more information about Dentons’ data expertise and how we can help, please see our Transformative Technologies and Data Strategy page and our unique Dentons Data suite of data solutions for every business, including enterprise privacy audits, privacy program reviews and implementation, and training in respect of personal information. Subscribe and stay updated.