GenAI in the Classroom: Considerations for Use
Are you considering using GenAI tools, such as Copilot, in your course? Below you’ll find some things to consider before you dive in.
When deciding whether to use AI tools for assignments, don't assume students already know how to use them. They need to learn to use these tools responsibly and may need your guidance to do so. To support this, add clear language about academic integrity standards to your syllabus, specifying how the AI tool may be used in the course or in a specific assignment.
Before assigning work with any AI tool, there are several important things to consider:
- Ensure that all activities align with the learning objectives of a module or lesson and that the tool will be used in a meaningful way that enhances student learning.
- Experiment with the tool before implementing it, so you can experience it and reflect on different uses for your courses.
- Include a clear explanation of the purpose for using the tool as part of each assignment description in the syllabus.
- Include clear instructions or provide links to information on how to use each tool.
- Include detailed instructions for individual tasks that provide clear parameters for how the tool may and may not be used.
- Teach students to write good prompts with clear instructions. Well-crafted prompts can help students learn to use the tool and can guide them through an activity; they are essential for attaining good results.
Ethical and Privacy Considerations
According to Bender et al. (2021), instructors incorporating GenAI tools into course design should consider the following privacy and ethical issues:
- Data privacy, ownership, authorship, copyright: Companies that develop GenAI tools (e.g. OpenAI) may require users to create accounts by providing identifiable information (e.g. an email address, Google account or phone number). Privacy policies usually state that companies can use and share this data as they wish (Caines, 2023; Wilfrid Laurier University, 2023). Carleton offers guidance on the use of third-party tools in its Adoption of Technology Enhanced Learning Resources policy. Third-party tools can be helpful for teaching and learning and can extend course functionality. However, because most third-party tools are not integrated into Carleton systems, training, technical support and troubleshooting are unavailable from the university. Instructors are strongly discouraged from adopting any third-party tool that has not been cleared for privacy and security by the university.
- Unpaid labour and the commercialization of student text: GenAI tools may be improved through interactions with the users who engage with them. Requiring students to use these tools can therefore mean providing free labour to companies that may later commercialize their products.
- Inequitable access: Several AI tools now offer paid subscription plans. Paid models are not within reach of all students and can create inequitable access for students from marginalized groups, advantaging those who can pay and disadvantaging those who cannot. On the other hand, some authors argue that AI tools can “lower the financial cost of personalized tutoring,” especially for students from equity-deserving groups who might otherwise be unable to realize their full educational potential (Chine et al., 2022, p. 366). In addition, GenAI tools may be unavailable in some countries due to government bans, censorship or other restrictions (UNESCO, 2023).
- Inherent bias and discrimination: GenAI tools can replicate and perpetuate existing biases (e.g. racist or sexist beliefs), toxic speech patterns (Bolukbasi et al., 2016; Welbl et al., 2021) and specific worldviews (Bender et al., 2021). Bias can be present in the training data, the coding, the validation process and the presentation of results. Bias and discrimination can be hard to detect because GenAI tools are complex and their technologies are often perceived as neutral.
- Lack of regulation: Currently, GenAI tools are largely unregulated, and their rapid development has prompted more than 2,700 academics and private-sector leaders to call on AI labs to pause the training of AI systems more powerful than GPT-4 (Future of Life Institute, 2023; UNESCO, 2023).
Need additional support?
If you would like help designing assessments and class activities, please request a consultation to work directly with our support staff.