Top Five Privacy Developments in Canada: A Year in Review 2023
Overview
As one year ends and another begins, we take this opportunity to reflect on a number of significant changes to Canadian privacy law. From promising developments to proposed legislation to a groundbreaking investigation, there is much to review as we head into 2024. Let’s take a look at the top five developments we encountered in 2023.
1. The Second Phase of Québec’s Law 25 Came Into Force
On September 22, 2023, the second part of Québec’s “Act to modernize legislative provisions as regards the protection of personal information”, also known as “Law 25”, came into effect.
Along with the second set of requirements discussed below, the administrative penalties for non-compliance came into effect this year. Law 25 introduced three different enforcement mechanisms to ensure compliance with the new law: (1) administrative monetary penalties (“AMPs”), (2) penal offences, and (3) a private right of action. Under the AMP regime, companies that contravene certain provisions of the amended Act may be liable for up to $10 million or two percent of worldwide turnover from the previous year, whichever is greater. For more severe violations, Law 25 introduced several new penal offences with fines of up to $25 million or four percent of worldwide turnover from the previous year, whichever is greater. Meanwhile, the private right of action allows individuals to claim punitive damages when their privacy rights are violated.
Given the seriousness of the enforcement mechanisms, businesses must ensure compliance with the new requirements brought by Law 25. The new requirements set out under Law 25 are scheduled to come into force in three increments. The first set of these privacy requirements (which include the appointment of a privacy officer and mandatory breach reporting) came into force on September 22, 2022. The second set of the requirements came into force on September 22, 2023, and the remaining requirements will come into force in September 2024.
The second set of requirements requires Québec businesses to establish their own privacy policies, including a formal complaints process and proper practices on the use and destruction of personal information. It also requires privacy impact assessments in certain circumstances. The changes also reflect a greater emphasis on transparency and require organizations to provide the highest level of confidentiality for personal information by default, subject to certain exceptions. Other notable requirements include:
- Consent requirements for minors
- Destroying and anonymizing data
- The right to be forgotten
- Ensuring individuals know that their personal information will be used for automated decision-making
2. Major Developments to Bill C-27 Are Underway
Canada’s federal privacy legislation is expected to see an overhaul soon.
On September 26, 2023, Canada’s Minister of Innovation, Science and Industry, François-Philippe Champagne (the “Minister”), proposed substantive amendments to Bill C-27, also known as the Digital Charter Implementation Act. Bill C-27 proposes the introduction of the Consumer Privacy Protection Act (the “CPPA”) and the Personal Information and Data Protection Tribunal Act (“PIDPTA”), and would establish Canada’s first legislative framework for the regulation of artificial intelligence (“AI”): the Artificial Intelligence and Data Act (the “AIDA”).
That day, the Minister made proposals that would potentially modify the language under Bill C-27, including:
- Under the CPPA, establishing privacy as a fundamental right for Canadians, reinforcing the protection of children’s privacy, and empowering the Office of the Privacy Commissioner of Canada (“OPC”) to enter into compliance agreements with organizations, potentially including financial penalties; and
- Under AIDA, revising the definition of “high-impact” AI systems to include specific classes, amending AIDA’s content to align with international frameworks, differentiating the roles and obligations for actors within the AI value chain (such as developers), introducing obligations for “general purpose” AI systems (such as ChatGPT), and clarifying the role of the proposed AI and Data Commissioner.
Currently, Bill C-27 has finished its second reading in the House of Commons and is undergoing consideration by the Standing Committee on Industry and Technology (the “INDU Committee”). The INDU Committee will continue to hear from witnesses from various technology, privacy and governmental organizations.
If Bill C-27 is passed, organizations in Ontario will be required to comply with the newly enacted legislation rather than PIPEDA, which will be effectively replaced. The Bill’s hefty administrative penalties and fines are also an incentive for businesses to invest in protecting personal information and to ensure that their processes and procedures remain in compliance.
For more information on Bill C-27, see our recent blog post, Minister of Innovation, Science and Industry Releases Proposed Amendments to Bill C-27 Regarding Artificial Intelligence and Privacy Laws.
3. Canada’s Proposed Artificial Intelligence and Data Act Is Under Debate
As part of Bill C-27, AIDA would be Canada’s first ever legislative framework for AI. However, its current state is not without its detractors.
In recent INDU meetings, several witnesses have commented on various alleged shortcomings in AIDA. These included the possibility that AIDA focuses too strongly on individual harms rather than broader collective harms, concerns about the Minister’s independence in their proposed enforcement role, and AIDA’s potential lack of applicability to the public sector.
On November 28, 2023, the Minister released a letter to the INDU Committee containing the full text of potential amendments to AIDA, intended to align with both Bill C-27’s objective and stakeholder feedback. Overall, the Minister emphasized the necessity of implementing AIDA at a “turning point in the history of AI, both in Canada and globally”, warning that delay may lead to harm to individuals and to the economy.[1]
The Minister’s letter should also be read alongside a document published by the OPC on December 7, 2023, which outlined various principles for the development, provision and use of generative AI systems.[2] These include:
- Legal Authority and Consent: Organizations should understand their legal authority, if any, to collect, use, disclose and delete personal information, and that consent should be valid and meaningful.
- Appropriate Purposes: Generative AI should only be used for purposes that a reasonable person would consider appropriate in the circumstances.
- Necessity and Proportionality: The use of generative AI should be necessary and proportionate to the intended purpose, not merely potentially useful.
- Openness: Organizations must allow individuals to understand the primary purpose and any secondary purpose for the use of the generative AI system.
- Accountability: Organizations should be compliant with privacy legislation and capable of showing it.
- Individual Access: Individuals should be capable of accessing or correcting their own information collected by generative AI through set procedures.
- Limiting Collection, Use and Disclosure: Generative AI systems should collect no more information than is necessary to fulfill their specified purposes.
- Accuracy: Personal information that is used to train generative AI should be as accurate as possible.
- Safeguards: Developers, providers and organizations using generative AI should design and/or monitor them to defend against any inappropriate uses.
It is vital for developers and providers of generative AI to conduct a review to properly gauge how to comply with the principles above. The same applies to organizations that implement generative AI for both public-facing and private uses. Importantly, the core of any AI compliance framework should be the incorporation of privacy-by-design and ethics-by-design concepts. This would allow for data protection and ethical features to be integrated into an organization’s system of engineering, practices and procedures.
4. The Privacy Commissioner Launched Its Investigation into ChatGPT
Could ChatGPT have predicted this?
On April 4, 2023, Canada’s Privacy Commissioner, Philippe Dufresne, launched a formal investigation into OpenAI, the developer of the generative AI software ChatGPT. The software, which has long generated buzz in the public sphere for its intuitiveness, has also been the source of major controversy. Notably, ChatGPT was reportedly trained on data obtained through “web scraping”, the automated extraction of data from publicly accessible sources with a human-readable output. ChatGPT has also raised concerns over the collection and use of personal information without consent.
While the investigation is still underway, it follows hot on the heels of Italy’s own ChatGPT investigation, which culminated in the Italian Data Protection Authority (“IDPA”) temporarily banning the service on March 31, 2023, making Italy the first Western country to do so. The IDPA cited the potential threat to user privacy as its main reason. Other countries, including North Korea, Iran, Russia, Cuba, Syria and China, have also restricted the use of ChatGPT. It remains to be seen what Canada’s Privacy Commissioner will determine in light of Canadians’ privacy rights and an individual’s reasonable expectation of privacy.
Overall, the ChatGPT investigation also raises two important questions: First, what are the risks inherent with an AI technology’s collection, use and disclosure of personal information? Second, how can we balance fostering technological development with responsible industry regulation? The ChatGPT investigation will be helpful in assessing the balance required to foster innovation while mitigating the risks associated with AI.
For more information on the ChatGPT investigation, see our blog post, Canadian Privacy Commissioner Launches Investigation into ChatGPT.
5. British Columbia Enacted Its New Mandatory Privacy Breach Requirements
Starting February 1, 2023, the head of each public body in British Columbia (“BC”) has been required to develop a privacy management program (“PMP”), including policies, procedures and tools that help protect personal information. Most significantly, privacy breach reporting is now mandatory.
The new requirement, which falls under the amended BC Freedom of Information and Protection of Privacy Act (“FIPPA”), includes notifying the BC Information and Privacy Commissioner of any “privacy breach” that could reasonably result in a “real risk of significant harm” to an individual, “without unreasonable delay.” Importantly, notifications must also be given to individuals who may reasonably face a “real risk of significant harm”. These notifications are intended to allow individuals to understand both how they might be impacted by the privacy breach and what steps they can take to reduce or mitigate the risk.
For more information on the new mandatory breach requirements in BC, see our blog post, New Mandatory Breach Requirements in British Columbia’s Public Sector.
Bonus: Two PIPEDA Cases
This year also saw the OPC issue two notable PIPEDA-related decisions.
The first dealt with the Agronomy Company of Canada Ltd. (“Agronomy”). In April 2020, a privacy breach occurred and personal information from 845 individuals was stolen. Agronomy was initially unaware of the breach and refused to pay the ransom that the threat actor later demanded by email. The personal information was subsequently published on the dark web in June 2020. The OPC found that Agronomy had breached its obligations under PIPEDA, notably by lacking appropriate safeguards and failing to be accountable. For instance, Agronomy had neither a comprehensive privacy policy nor a designated Privacy Officer. Agronomy has since addressed these issues by making improvements to its incident management plan.[3]
The second dealt with Home Depot of Canada Inc. (“Home Depot”), which had disclosed the complainant’s personal information to Facebook without their knowledge or consent. Although the disclosure was intended to measure the effectiveness of ads, the OPC found that Home Depot’s provision of customer data to Facebook contravened PIPEDA. Namely, Home Depot’s privacy statement was not sufficient for the purposes of implied consent, nor was it readily available or clearly explained to customers. Home Depot has since committed to accepting the OPC’s recommendations and has ceased providing data to Facebook.[4]
Conclusion
Overall, 2023 was an eventful year for privacy and cybersecurity. With 2024 now underway, ensuring that organizations have up-to-date privacy management procedures and processes to remain in compliance with evolving legislation remains a priority. Businesses are encouraged to reach out to Roland Hung in the Technology, Privacy & Data Management Group at Torkin Manes with questions and to receive business-specific recommendations.
The author would like to acknowledge Torkin Manes’ Articling Student Herman Wong for his invaluable contribution in drafting this bulletin.