New Practice Directions Consider Artificial Intelligence in Court Submissions

July 24, 2023

Written By Stephen Burns, Sébastien Gittens and David Wainer

The use of artificial intelligence (AI) in the preparation of materials filed with the courts has been the subject of recent practice directions, with certain Canadian courts now requiring litigants to disclose any reliance on AI.

For example, the Court of King's Bench of Manitoba issued a practice direction on June 23, 2023, with respect to AI.1 Therein, the Court acknowledged that: (1) AI is rapidly developing; (2) it is impossible to "… completely and accurately predict how [AI] may develop or how to exactly define the responsible use of [AI] in court cases"2; and (3) there are concerns about the reliability and accuracy of information generated from the use of AI. To that end, the Court now requires that, where AI has been used in the preparation of materials filed with the Court, those materials indicate how the AI was used.

The Supreme Court of Yukon issued a similar practice direction on June 26, 2023, noting that cases "in other jurisdictions have arisen where it has been used for legal research or submissions in court."3 Further, like the Manitoba Court of King's Bench, the Supreme Court of Yukon noted that there are "legitimate concerns about the reliability and accuracy of the information generated from the use of artificial intelligence." Accordingly, the Supreme Court of Yukon now requires parties to advise the Court if they have relied on AI "for their legal research or submissions in any matter and in any form before the Court…". The Court specifically cited ChatGPT, a well-known AI tool, as an example of a tool whose use must be disclosed.

The ability of AI to "hallucinate" (i.e., generate incorrect information) has driven concerns about the accuracy, reliability and credibility of AI's output. These practice directions accordingly seek to ensure that courts may continue to rely on the submissions of litigants notwithstanding these limitations. Bennett Jones anticipates that additional practice directions will be issued by other Canadian courts in due course.

In the interim, these practice directions indicate that Canadian courts are monitoring the rapidly evolving nature of AI. They also reflect a general trend to regulate the use of AI. For example, the federal government has tabled Bill C-27: An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts.4 One component of this legislation is the Artificial Intelligence and Data Act (AIDA), which seeks to regulate "high impact AI systems". As the companion document released by Innovation, Science and Economic Development Canada indicates, the government will consider a list of factors in determining which AI systems would be considered "high impact" under AIDA:5

  • evidence of risks of harm to health and safety, or a risk of adverse impact on human rights, based on both the intended purpose and potential unintended consequences;
  • the severity of potential harms;
  • the scale of use;
  • the nature of harms or adverse impacts that have already taken place;
  • the extent to which, for practical or legal reasons, it is not reasonably possible to opt out of the system;
  • imbalances of economic or social circumstances, or age of impacted persons; and
  • the degree to which the risks are adequately regulated under another law.

In this light, it is possible that certain AI tools used in the preparation of court submissions could be deemed "high impact AI systems," and thus subject to AIDA.

For a detailed look into what future AI regulation may look like in Canada, please refer to our blog, Artificial Intelligence—A Companion Document Offers a New Roadmap for Future AI Regulation in Canada.

The Bennett Jones Privacy and Data Protection group is available to discuss how your organization can responsibly integrate AI into its operations.


1 Court of King's Bench of Manitoba, "Practice Direction Re: Use of Artificial Intelligence in Court Submissions" (23 June 2023).

2 Ibid.

3 Supreme Court of Yukon, "Practice Direction: Use of Artificial Intelligence Tools" (26 June 2023).

4 Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts, 1st Sess, 44th Parl, 2021 (second reading completed by the House of Commons on 24 April 2023).

5 Canada, The Artificial Intelligence and Data Act (AIDA) – Companion document (last modified 13 March 2023).

Please note that this publication presents an overview of notable legal trends and related updates. It is intended for informational purposes and not as a replacement for detailed legal advice. If you need guidance tailored to your specific circumstances, please contact one of the authors to explore how we can help you navigate your legal needs.

For permission to republish this or any other publication, contact Amrita Kochhar at kochhara@bennettjones.com.


Authors

  • Stephen D. Burns, Partner, Trademark Agent
  • J. Sébastien A. Gittens, Partner, Trademark Agent
  • David Wainer, Associate
