
IPC Clarifies its Position on Notification and Consent for the Use of AI Tools

Torkin Manes LegalPoint
 

On February 28, 2024, the Information and Privacy Commissioner of Ontario (the “IPC”) issued its findings in Privacy Complaint PI21-00001, concluding that McMaster University’s use of artificial intelligence (“AI”) to proctor electronic examinations through the Respondus exam proctoring software failed to comply with the Freedom of Information and Protection of Privacy Act (the “Act” or “FIPPA”).[1]

Specifically, the IPC found that adequate notice was not provided when collecting personal information, and that McMaster’s contract with Respondus did not adequately protect all of the personal information being collected.

Background

The use of AI technology has risen sharply in recent years at colleges and universities across Ontario, particularly since the onset of the COVID-19 pandemic, when virtual exam proctoring became necessary.

McMaster, like many other post-secondary institutions, adopted the Respondus exam proctoring software, which comprises two programs: Respondus Lockdown and Respondus Monitor. The Respondus Lockdown browser restricts access to a student’s device during examinations, while the Respondus Monitor program monitors the student’s screen during examinations. Respondus Monitor uses AI to analyze audio and video recordings and flag activity that is suspicious or consistent with cheating.

The IPC launched the complaint against McMaster to determine whether the use of AI to proctor examinations is, in fact, authorized under the Act, framing its analysis around five key questions:

  1. Did the information collected fall within the definition of “personal information” under the Act?
  2. If so, was the collection of that personal information in compliance with the Act?
  3. Did McMaster provide notice of the collection in accordance with the Act?
  4. Was the use of the personal information in accordance with the Act?
  5. Did McMaster have reasonable contractual and oversight measures in place over the personal information collected by its service provider (in this case, Respondus)?

The IPC’s Decision

In this case, the IPC found that the information collected by Respondus, including biometric data, photo identification, and audio and video recordings, was “personal information” as defined under the Act. The IPC also confirmed that, for the collection of personal information to comply with the Act, it must be either “expressly authorized by statute or necessary to the proper administration of a lawfully authorized activity”.[2]

The IPC found that, while the collection of personal information was not expressly authorized by statute, it was necessary to the proper administration of a lawfully authorized activity and was therefore authorized. However, although the collection was necessary and authorized, McMaster failed to provide adequate notice of the collection. Additionally, for the use of personal information to be in accordance with the Act, it must be used for the purpose for which it was obtained and to which the individual consented, or for a “consistent purpose”. In this case, Respondus used some of the information collected to improve its own services, a use that was unnecessary for the proctoring of examinations and to which the students had not consented.

In its investigation, the IPC reviewed the relevant documentation and found that nowhere on the Respondus monitoring platform did students have the option to consent to Respondus using their video or audio recordings to improve its system’s performance or capabilities.

Ultimately, the IPC not only found McMaster’s use of Respondus to be non-compliant with the Act, but also concluded that the contractual measures in place between Respondus and McMaster were not sufficient to safeguard the privacy and security of the personal information collected.

Given McMaster’s breach of the Act, the IPC issued recommendations to bring the University into compliance.

Takeaways

The IPC’s decision affirms that universities may collect personal information and that such collection may be deemed appropriate, necessary and in compliance with the Act. However, institutions and businesses must remain mindful of the notice given to individuals before their information is collected, as well as the specific purpose(s) for which the information is obtained. Organizations seeking to use AI tools must be cognizant of the type of information collected, the notice given to students or customers, the option to consent to (or opt out of) the collection, the purpose of the collection and the specific use of the information, and the safeguards in place to ensure the information’s security.

Failure to (a) provide notice, (b) obtain consent, or (c) limit the use to the intended purpose can subject an organization to an IPC investigation that may result in a finding that it has failed to comply with the Act.

Businesses that are using (or planning to use) AI tools and models should be prepared to comply with applicable privacy laws and upcoming AI laws. Here are some recommendations to consider when using AI tools:

  • Build a principle- and risk-based AI compliance framework that can evolve with the technology and regulatory landscape. The framework should be built with input from both internal and external stakeholders.
  • Part of the framework should set out clear guidelines around the responsible, transparent and ethical usage of AI technology.
  • Conduct a privacy and ethics impact assessment for any new AI technology. The assessment should answer the following questions:
    • What type of personal information will the AI technology collect, use and disclose?
    • How will the personal information be used by the AI technology?
    • Will the data set lead to any biases in the output?
    • What risks are associated with the AI technology’s collection, use and disclosure of the personal information?
    • Will there be any human involvement in the decision-making?
    • How will the organization provide notice and obtain consent for the collection, use, or disclosure of personal information by the AI technology?
    • If personal information will be collected by an AI service provider, will the business implement reasonable contractual and oversight measures to protect that information?

Privacy-by-design and ethics-by-design concepts should be incorporated into, and sit at the core of, any organization’s AI compliance framework. This means that data protection and ethical features are integrated into the organization’s practices and procedures, which will likely allow the organization to adapt to changing technology and regulations.

For more information, please contact Roland Hung of Torkin Manes’ Technology and Privacy & Data Management Groups.

The author would like to acknowledge Torkin Manes Articling Student Armon Ghaeinizadeh for his assistance in drafting this bulletin.

 


[1] Freedom of Information and Protection of Privacy Act, RSO 1990, c F.31.
[2] FIPPA, s 38(2).