
Using generative AI in teaching and learning

What is generative AI?

Generative AI is a type of artificial intelligence that uses machine learning to create new types of media, including text, images, sound, and video. Many tools exist that use generative AI to create new content. Some examples include:

- Text: ChatGPT, Copilot (Secure version approved at McGill)
- Images: Bing, Adobe Firefly, Midjourney, DALL-E 3, Copilot
- Sound: Voiceify
- Video: D-ID

A secure version of Microsoft Copilot has been approved at McGill. Copilot is an AI chatbot tool integrated into Microsoft Edge and accessible from other browsers. If you want to try it, make sure you are using the Commercial Data Protection (secure) version, not the public version. Read more about using Copilot at McGill.

Generative AI tools are built upon large language models (LLMs) that have been trained on enormous corpora of information and are designed to respond to input prompts with well-structured output. They focus on predictive text (i.e., the most likely word to follow a previous word), as well as on generating media based on patterns or styles in existing datasets. While the outputs can often demonstrate seemingly sophisticated language interaction, they may at times contain inaccurate information (e.g., fictional sources). These tools are constantly improving; however, it is important to examine any generative AI output with a critical lens. The McGill Library has a page on artificial intelligence with a section on AI Literacy that discusses the importance of critically approaching AI.
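To make the idea of predictive text concrete, here is a minimal, purely illustrative Python sketch: it counts which word most often follows another in a made-up sample sentence and uses those counts to "predict" the next word. Real large language models rely on neural networks trained on enormous datasets, but the underlying principle of choosing a likely next word is the same; the sample sentence and function names below are invented for illustration only.

```python
from collections import Counter, defaultdict

# Toy illustration of "predict the most likely next word," the core idea behind
# the predictive-text behaviour described above. Real generative AI tools use
# neural networks trained on enormous corpora; this bigram counter is only a
# conceptual sketch built on a made-up sample sentence.
corpus = "the cat sat on the mat and the cat slept on the sofa".split()

# Count which word follows each word in the tiny corpus.
next_word_counts = defaultdict(Counter)
for current_word, following_word in zip(corpus, corpus[1:]):
    next_word_counts[current_word][following_word] += 1

def predict_next(word: str) -> str:
    """Return the word most frequently observed after `word` in the corpus."""
    candidates = next_word_counts.get(word)
    return candidates.most_common(1)[0][0] if candidates else "<unknown>"

print(predict_next("the"))  # -> "cat" ("cat" follows "the" twice in the sample)
```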

Ethical use of generative AI 

The development of generative AI has raised many ethical considerations that must be addressed as these tools are adopted. The 2018 Montreal Declaration for a Responsible Development of Artificial Intelligence and the 2021 UNESCO Recommendation on the Ethics of Artificial Intelligence set out a number of principles important to the adoption of these new technologies, and both documents offer an excellent framework for how organizations should approach them.

The emergence of generative AI has prompted reflection on how the concept of academic integrity is evolving. For any questions related to disciplinary procedures, contact the Office of the Dean of Students.




How can generative AI be used in teaching and learning? 

In May 2023, the Senate Subcommittee on Teaching and Learning (STL) created a working group to address the use of generative AI at McGill. The working group drafted a report with key recommendations for the use of generative AI at McGill. The report also presents principles for the use of generative AI in teaching and learning.  

Many different strategies exist for using generative AI tools in course teaching. For example, instructors can use them to create sample texts for students to analyze; create images for presentations; design entire presentation materials; and generate sample practice questions. Strategies also exist for students to use generative AI tools to support their learning. For example, students can use them to brainstorm ideas; create images to support assignments such as presentations; and summarize documents. These resources offer more ideas for creating instructional materials and supporting student learning: How could AI be used for learning and teaching? and How AI can be used meaningfully by teachers and students in 2023.

It is important to note that students need AI literacy skills to use generative AI tools responsibly. These potentially new literacy skills could include elements such as fact-checking and prompt engineering.

Avoid AI detection tools  

AI detection tools are designed to identify content that is partially or wholly generated by AI. These tools (specifically those targeting generated text) are unreliable and often inaccurate, a statement corroborated by OpenAI, the creators of ChatGPT. OpenAI states that their false positive rate is 9%, a rate similar to that of other AI detection tools. A false positive occurs when the tool identifies text as AI-generated when it was actually created by a student; at that rate, roughly 9 of every 100 students whose work is entirely their own could be wrongly flagged. False positive results misguide instructors and can create situations where students are wrongly accused of a violation they did not commit, forcing them to defend work that is rightfully theirs. Such situations can result in a negative classroom climate and create unnecessary stress and anxiety for all involved. McGill therefore discourages the use of AI detection tools.

Course outline statements 

There should be no default assumption as to whether generative AI tools may be used. McGill therefore recommends that instructors explain in their course outline what constitutes appropriate use, or non-use, of generative AI tools in the context of that course. The use or non-use of these tools should align with the learning outcomes of the course. For this reason, instructors will need to write their own context-appropriate course outline statements. Below are four external examples to draw on:

If you allow students to use generative AI tools in your course, provide guidance in your course outline for how students should acknowledge the use. Monash University provides useful examples, as well as links to APA and MLA citation guidelines. See the section entitled How students acknowledge the use of generative AI. 

AI and assessment of student learning 

Generative AI tools can be used to support many different types of assessment. As with the planning of any assessment of student learning, it is important to be intentional about the strategy. In this case, be intentional about either designing AI into your assessments or designing AI out of your assessments. This reflective process will encourage you to think through your assessments and determine if the integration of AI is appropriate (or not) for allowing students to demonstrate their learning. Designing AI into an assessment could involve having students generate the assignment with an AI tool, then critique the output, and reflect on the process. Designing AI out of the assessment could involve in-class writing exercises, where students create and share their ideas with other students in class during class time. Either way, the decision should align with students’ achievement of the desired learning outcomes. Monash University provides excellent considerations for assessment design.




While this resource is accessible worldwide, McGill University is on land which has served and continues to serve as a site of meeting and exchange amongst Indigenous peoples, including the Haudenosaunee and Anishinabeg nations. Teaching and Learning Services acknowledges and thanks the diverse Indigenous peoples whose footsteps mark this territory on which peoples of the world now gather. This land acknowledgment is shared as a starting point to provide context for further learning and action.


