By: Rob Hudson, Law Librarian, Alberta Law Libraries

The effective use of prompts in legal research will support access to justice (A2J) in pro bono practice, and prompt-based tools will be adopted more widely in 2025. Developing some literacy in AI prompt engineering this year will assist lawyers and self-represented litigants alike.

Alberta Law Libraries (ALL) supports the administration of justice and the rule of law in the Province of Alberta through its legal information resources. In keeping with this year's A2J Week theme, 'Lifting Communities Through the Power of Pro Bono', ALL offers dedicated Services to the Bar and Services to the Public. Continuing legal information literacy, outreach, and education at ALL are central to the A2J goal.

A2J Week 2025 is a good time to prepare for the major changes anticipated this year in AI information retrieval and analysis, and we would like to focus on prompts. Poor use of AI prompts could lead to worse A2J outcomes in legal research. Carefully preparing, revising, or even avoiding prompts may also be advisable in databases that lack key reliability and other quality attributes; a growing debate about AI pitfalls is appearing more frequently in SLAW and other sources. Bringing critical legal research into Responsible AI (RAI), so that prompts are well understood, is a capability this blog post will discuss but cannot fully answer, as it is still early in the evolution of this legal research topic.
Prompt 1: What is prompt engineering?

The International Federation of Library Associations and Institutions defines prompt engineering as 'the crafting of requests in natural language to shape the outputs from generative AI in desired directions.' Prompt Engineer has even become a new job description in legal practice information management, as the work requires pre-planning and knowledge. The basic idea is to craft questions that will yield the best AI answers, and identifying which type of prompt a database supports before writing the question is essential.

Prompt 2: Are there different types of prompt systems?

Pure, Guided, and Hybrid prompting are used in different contexts. Pure prompts take the chat approach, using natural language to give the researcher maximum control but also unpredictable results. Guided prompting, in contrast, uses controlled inputs such as radio buttons rather than plain language, leaving less room for free expression. Hybrid prompting is a mix of the two: Lexis+ AI currently uses Hybrid prompting (a combination of user questions with system questions), while Practical Law AI takes a user-empowered, Pure-prompting approach. ALL has a database called vLex Justis where you can practice prompting with its AI tool, Vincent. Still curious? Another recommendation is to try Pure prompting with Beagle+, the ChatGPT-enabled bot at the People's Law School in BC. One issue with exploring legal research using AI prompts is that most of the tools sit behind subscription databases, so experimenting is hard without access; the adoption of prompts in open-access databases will make this much easier. CanLII has no AI prompts yet, only AI summary tools.

Prompt 3: How can prompts be formulated to get the best answer?
The suggested steps include:
- choosing a grounded database to ensure accurate legal authorities (Lexis+ AI, for example, draws only from high-quality legal and news resources);
- identifying the type of prompt search supported: Pure, Guided, or Hybrid;
- explicitly defining the style and audience for the response, since a response for a legal client differs from one for an academic project;
- being specific as to date, jurisdiction, and topic;
- repeating the request and reconciling the answers; and
- asking for sources, with citations, that can be checked.

Prompt 4: How do I learn more?

We would also love to hear from you! If you have any questions, comments, or suggestions, please contact ALL at Ask a Law Librarian.
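For readers who like to see the checklist made concrete, the Prompt 3 steps can be sketched as a small prompt template. This is an illustrative sketch only: the helper `build_prompt` and its parameters are hypothetical and are not part of any vendor product such as Lexis+ AI or Practical Law AI; it simply assembles a Pure, natural-language prompt that bakes in audience, style, jurisdiction, date, and a request for checkable citations.

```python
# Hypothetical sketch: assemble a structured natural-language prompt
# following the checklist above. Not tied to any real legal AI product.

def build_prompt(question, jurisdiction, date_range, audience, style):
    """Combine the checklist elements into one explicit prompt string."""
    parts = [
        f"Question: {question}",
        f"Jurisdiction: {jurisdiction}",   # be specific as to jurisdiction
        f"Time period: {date_range}",      # be specific as to date
        f"Audience: {audience}",           # a client differs from an academic project
        f"Style: {style}",
        "Cite every authority relied on, with citations, "
        "so the sources can be independently checked.",
    ]
    return "\n".join(parts)

prompt = build_prompt(
    question="What limitation period applies to a breach-of-contract claim?",
    jurisdiction="Alberta, Canada",
    date_range="Current law as of 2025",
    audience="Self-represented litigant",
    style="Plain language, short paragraphs",
)
print(prompt)
```

Rerunning the same structured request and reconciling the answers, as the checklist suggests, is then just a matter of submitting the identical prompt more than once and comparing what comes back.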