- AI in Teaching at Carleton: Opportunities and Challenges
- Recommendations from Teaching and Learning Services
- Carleton’s Policy
- Academic Integrity Statement Examples
- How to Prepare Syllabus Language Regarding the Use of AI Tools
- Ethical and Privacy Considerations When Using Generative AI Tools in Teaching
A Report by the Working Group on the Use of Artificial Intelligence (AI) in Teaching and Learning at Carleton University
Convened in February 2023, the AI working group has been working to help identify opportunities and challenges related to generative AI in teaching and learning and propose recommendations and guidelines for Carleton’s teaching and learning community.
As with all technologies, instructors and students need to evaluate potential risks, including how their data will be used and stored, concerns about authorship, inherent biases, and inequitable access. There is also a risk of exposure to potentially harmful content and responses, even though organizations are attempting to reduce these risks. If instructors decide to ask students to use generative AI tools in their courses, they should be prepared to address these issues.
When considering whether to incorporate generative AI tools into a course, instructors should start by determining whether these tools align with the course learning outcomes. Learning outcomes describe the knowledge and skills students will gain from the course and should guide the types of assessments used and the approaches to them.
As with the use of any tool, instructors should be prepared to discuss with students the role of the tool in the course and how it can enhance learning. With generative AI, in particular, they may also need to work with students to evaluate the reliability and accuracy of AI tools as supports to instructional processes. These discussions should also include defining what constitutes fair use versus misuse of AI tools. In courses that include generative AI tools, instructors need to ensure that the policy for use is clearly stated in the syllabus.
Carleton University encourages teaching innovation and supports instructors who wish to try or adopt new pedagogical approaches and educational technologies. Generative AI tools are here to stay, and they open possibilities for rethinking how we design and teach our courses, including our assessment strategies. Across academic disciplines, it will be particularly important to rethink how to continue to use essays and other forms of written assessment to evaluate students’ knowledge and skills in light of generative AI. Please note that these policies will continue to evolve to keep pace with the technology.
Academic Integrity and AI
Using AI tools to generate content for assignments, and presenting it as one’s own original work, as well as copying or paraphrasing the content produced by AI tools without proper citations or the instructor’s consent, are both considered to be in violation of academic integrity. See Carleton’s policy on academic integrity for more information.
If instructors suspect that an assignment has been completed with unauthorized use of generative AI tools, they should not confront the student or engage in punitive actions. Instead, they should proceed as with any other potential allegation of academic misconduct and report it to the dean’s office.
Instructors should not rely on AI detection tools for allegations of academic misconduct. Instead, they should provide as much background information and detail as possible about the context of their course and/or discipline to allow the dean’s office to gain a better understanding of each potential case. Further, they should carefully document any problems with a student’s assignment that would violate the Academic Integrity Policy, such as missing, inaccurate, or fictitious references.
Potential signs of the use of generative AI can include:
- Absence of personal experiences, opinions, or insights (Taylor Institute, 2023).
- Generic and repetitive language (Taylor Institute, 2023).
- Inconsistent, non-existent, or invented references (Toronto Metropolitan University, n.d.).
- Student’s inability, when asked, to produce any research notes or to summarize the main points of the paper (Toronto Metropolitan University, n.d.).
Example #1: AI Tools Not Allowed
(Carleton’s Academic Integrity Policy)
Plagiarism is presenting, whether intentionally or not, the ideas, expression of ideas, or work of others as one’s own, including content generated by AI tools. Plagiarism includes reproducing or paraphrasing portions of someone else’s published or unpublished material, regardless of the source, and presenting these as one’s own without proper citation or reference to the original source. Examples of sources from which the ideas, expressions of ideas, or works of others may be drawn include but are not limited to: books, articles, papers, literary compositions and phrases, performance compositions, chemical compounds, artworks, laboratory reports, research results, calculations and the results of calculations, diagrams, constructions, computer reports, computer code/software, material on the internet, and/or conversations.
- Co-operation or Collaboration
Students shall not cooperate or collaborate on academic work when the instructor has indicated that the work is to be completed on an individual basis. Failure to follow the instructor’s directions in this regard is a violation of the standards of academic integrity. Unless otherwise indicated, students shall not cooperate or collaborate in the completion of a test or examination. Students are responsible for being aware of and demonstrating behaviour that is honest and ethical in their academic work (see www.carleton.ca/registrar). Instructors at both the graduate and undergraduate level are responsible for providing clear guidelines concerning their specific expectations of academic integrity (e.g. rules of collaboration or citation) on all course outlines, assignment and examination material.
Example #2: AI Tools Allowed
(Adapted from Mollick & Mollick, 2023; the authors gave their permission to use their language or adjust it to fit in one’s own course)
I expect you to use AI (e.g., ChatGPT and image generation tools) in this class. In fact, some assignments will require it. Learning to use AI is an emerging skill and I will provide instructions on how to use them. I am happy to meet and help you with these tools during office hours or after class.
Be aware of the limits of ChatGPT, such as the following:
- If you provide minimum-effort prompts, you will get low-quality results. You will need to refine your prompts in order to get good outcomes. This will take work.
- Do not trust anything ChatGPT says. If it gives you a number or fact, assume it is wrong unless you either know the answer or can check with another source. You will be responsible for any errors or omissions provided by the tool. It works best for topics you understand.
- AI is a tool, but one that you need to acknowledge using. Please include a paragraph at the end of any assignment that uses AI explaining what you used the AI for and what prompts you used to get the results. Failure to do so violates the academic integrity policy.
- Be thoughtful about when this tool is useful. Do not use it if it is not appropriate for the case or circumstance.
Violations of academic integrity standards include using AI tools to generate assignment-related content and submitting it as one’s own original work, as well as copying and/or paraphrasing content generated by AI tools without proper citation or without the instructor’s permission.
Based on your course needs, choose from the following three use cases:
- Generative AI tools as part of learning activities
- Indicate which assignments will require students to use AI tools.
- Indicate which AI tools are used for learning activities.
- Explain the purpose of using AI tools for specific assignments.
- Provide instructions on how to get started using the AI tools.
- Provide instructions on how to engage with the AI tools to complete learning activities.
Syllabus statement example: You will need to use ChatGPT (a generative AI tool) to complete the Paper Review assignment. You will use it to evaluate the grammar of the assigned paper and provide feedback based on ChatGPT’s results. For help, please read the instructions on how to use ChatGPT, and check the assignment instructions for how to complete the assignment.
- Generative AI tools as student learning supports or resources
- Indicate which assignments may include the use of AI tools.
- List the AI tools students may choose.
- Ask students to submit step-by-step procedures for using AI tools, including prompts.
- Include a clarifying statement such as, “Do not copy, paraphrase or translate anything from anywhere (ChatGPT included) without saying where you obtained it”.
- Provide information about how to cite AI-generated content.
Syllabus statement example: AI tools can assist you in preparing to write the final research paper. You are allowed to use AI tools to help create a draft of the paper outline, search for ideas in the literature, and edit your paper. If you choose to use an AI tool, you must submit a complete report on what you used it for and how, including all the prompts you entered into the system. You must also properly cite all AI-generated content.
- Generative AI tools not being used in a course
- Make it clear when students are not allowed to use generative AI in a course by including a statement.
- Explain why AI tools are not allowed.
Syllabus statement example: AI tools may not be used to assist in the preparation or creation of any assignment in this course. Using AI tools in any way is a violation of academic integrity standards. Since this course focuses on building your original ideas and critical thinking, using AI tools would compromise the purpose of the learning and is therefore prohibited. Contact your instructor for more information before you use any AI tools.
According to Bender et al. (2021), when incorporating generative AI tools into course design, instructors should consider the following privacy and ethical issues:
- Data privacy, ownership, authorship, copyrights: Companies that develop generative AI tools (e.g., OpenAI) may ask users to open accounts by providing identifiable information (e.g., email address, Google account, phone number). Privacy policies usually state that companies can use and share the data as they wish (Caines, 2023; Wilfrid Laurier University, 2023). Carleton has guidance on the use of third-party tools. Third-party tools can be helpful for teaching and learning and can provide extended course functionality. However, instructors need to be aware that most third-party tools are not integrated into Carleton systems, so training, technical support, and troubleshooting are unavailable from the university. Instructors are actively discouraged from adopting a third-party tool that has not been cleared for privacy and security by the university.
- Unpaid labour and the commercialization of student text: Generative AI tools may be enhanced by their interactions with the users who engage with them. Requiring students to use these tools can mean providing free labour for companies that may commercialize the tools later in their development.
- Inequitable access: Several AI tools have introduced paid subscription plans. Paid models are not within reach of all students and can create inequitable access for students from marginalized groups, creating advantages for those who can pay and disadvantages for those who cannot. On the other hand, some authors argue that AI tools can “lower the financial cost of personalized tutoring,” especially for students from equity-deserving groups who cannot otherwise realize their full educational potential (Chine et al., 2022, p. 366). In addition, generative AI tools may be unavailable in some countries due to government bans, censorship, or other restrictions (UNESCO, 2023).
- Inherent bias and discrimination: Generative AI tools can replicate and perpetuate existing biases (e.g., racist, sexist beliefs), toxic speech patterns (Bolukbasi et al., 2016; Welbl et al., 2021), or specific worldviews (Bender et al., 2021). Bias can be present in the training data, the coding, the validation process, and the presentation of the results. Bias and discrimination can be hard to detect because generative AI tools are complex and technologies are often perceived as neutral.
- Lack of regulation: Currently, generative AI tools are not regulated, and their rapid development prompted more than 2,700 academics and private-sector leaders to call on AI labs to pause the training of AI systems more powerful than GPT-4 (UNESCO, 2023; Future of Life Institute, 2023).
- Bender, E. M., Gebru, T., McMillan-Major, A., & Shmitchell, S. (2021). On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? FAccT ’21: Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency, March 2021, pp. 610–623.
- Kovanovic, V. (2022). The dawn of AI has come, and its implications for education couldn’t be more significant. The Conversation, December 14, 2022.
- Mollick, E., & Mollick, L. (2023). Why All Our Classes Suddenly Became AI Classes: Strategies for Teaching and Learning in a ChatGPT World. Harvard Business Publishing – Education.
- Monash University (2023). Generative artificial intelligence technologies and teaching and learning.
- Taylor Institute (2023). A First Response to Assessment and ChatGPT in your Courses.