On April 24, 2023, the House of Commons completed the second reading of the Digital Charter Implementation Act, 2022 (“Bill C-27”). If passed, Bill C-27 will enact the Artificial Intelligence and Data Act (“AIDA”), Canada’s first private-sector artificial intelligence (“AI”) legislation.
WHAT’S AN ARTIFICIAL INTELLIGENCE SYSTEM?
Under AIDA, an artificial intelligence system (“AI System”) is broadly defined as:
“a technological system that, autonomously or partly autonomously, processes data related to human activities through the use of a genetic algorithm, a neural network, machine learning or another technique in order to generate content or make decisions, recommendations or predictions.”
WHO DOES AIDA APPLY TO?
AIDA will apply to any person[1] who, in the course of international or interprovincial commerce:
- carries out a “regulated activity”;
- is “responsible” for an AI System; or
- is “responsible” for a “high-impact system”.
A “regulated activity” means any of the following activities:
- processing or making available for use any data relating to human activities for the purpose of designing, developing or using an AI System; or
- designing, developing or making available for use an AI System or managing its operations.
A person is “responsible” for an AI System, including a high-impact system, if they design, develop, make available for use or manage the operation of the system.
HIGH-IMPACT SYSTEMS
As stated above, AIDA will apply to any person who is responsible for a high-impact system.
A “high-impact system” means an AI System that meets the criteria to be set out in AIDA’s regulations. Although those criteria have not yet been published, the Government of Canada’s AIDA Companion Document (the “Companion Document”) identifies factors that may indicate an AI System is a high-impact system.
Based on the Companion Document, some examples of high-impact systems may include but are not limited to:
- Systems intended to make decisions, recommendations or predictions relating to access to services;
- Systems that use biometric data to identify a person remotely or to make predictions about the characteristics, psychology or behaviour of individuals;
- Online content recommendation applications; and
- AI applications integrated in health and safety functions such as autonomous driving systems and systems making triage decisions in the health sector.
OBLIGATIONS UNDER AIDA
Assess whether the AI System is a High-Impact System
If Bill C-27 is passed, a person responsible for an AI System will need to assess whether the system is a high-impact system based on the criteria set out in the regulations, and maintain records describing the reasons supporting their assessment.
Establish measures to mitigate risks of harm or bias and to monitor compliance
If a person is responsible for a high-impact system, then they must, in accordance with the regulations:
- establish measures to identify, assess and mitigate the risks of harm or biased output that could result from the use of the system (the “Mitigation Measures”);
- establish measures to monitor compliance with the Mitigation Measures and the effectiveness of those measures; and
- keep records describing those measures, and any additional records required by the regulations.
Publish a notice on a publicly available website
A person who makes available for use a high-impact system must, in accordance with the regulations, publish on a publicly available website a plain-language description of the system that explains:
- how the system is intended to be used;
- the types of content the system is intended to generate;
- the types of decisions, recommendations and predictions it is intended to make;
- the Mitigation Measures in respect of the system; and
- any other information prescribed by regulation.
A person who manages the operation of a high-impact system will have a similar obligation to publish on a publicly available website a plain-language description of the system.
Notify the Minister if there is a risk of material harm
A person who is responsible for a high-impact system must, as soon as feasible and in accordance with the regulations, notify the designated Minister if the use of the system results or is likely to result in material harm.
Establish measures regarding anonymized data
A person who carries out a regulated activity and who processes or makes available for use anonymized data in the course of that activity must, in accordance with the regulations, establish measures with respect to the manner in which the data is anonymized and the use or management of anonymized data, and must maintain records describing those measures.
FINES AND OFFENCES
A person who violates any of the obligations listed above commits an offence under AIDA and will be liable to a fine of up to $10 million or 3% of the organization’s gross global revenues, whichever is greater, or, if the person is an individual, to a fine determined by the court.
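By way of illustration only, using a hypothetical figure: an organization with gross global revenues of $500 million could face a maximum fine of $15 million under this tier, because 3% of $500 million ($15 million) is greater than the $10 million alternative.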
A person who commits any of the following offences under AIDA will be liable to a fine of up to $25 million or 5% of the organization’s gross global revenues, whichever is greater, or, if the person is an individual, to a term of imprisonment of up to five years and/or a fine determined by the court:
- possessing or using personal information that was unlawfully obtained or derived for the design, development, use, or provision of AI Systems;
- knowingly or recklessly making available for use an AI System that causes serious psychological or physical harm, or substantial damage to property; or
- making available for use an AI System with intent to defraud the public and to cause substantial economic loss to an individual.
The amounts of administrative monetary penalties for persons who commit a violation of AIDA will be determined by regulation.
NEXT STEPS
We will continue to monitor Bill C-27’s progress. If the bill is passed, the obligations described above will be supplemented by regulations, which should help organizations prepare for compliance once AIDA comes into force.
Please contact Brianne Kingston or one of our lawyers in the Corporate/Commercial Practice Group if you have any questions about Bill C-27.
[1] The definition of “person” under AIDA includes any legal entity. However, AIDA does not apply to certain federal government institutions, as more particularly described in Section 3 of AIDA.