
Over the past few weeks, we have held a number of roundtables and open forums with staff across the University about how we can productively engage with artificial intelligence at Sydney. This FAQ collates the key issues from those discussions and will help you decide how to approach generative AI this semester. For more information, see our collection of resources on what generative AI is and how it might be useful for teachers and students.
The policy and assessment frameworks
What is in the policy about generative AI?
The new Academic Integrity Policy 2022, which comes into force on 20 February 2023, mentions these technologies in two key places. Clause 4(9)(2)(j)(i) states that it is an academic integrity breach to inappropriately generate content using artificial intelligence to complete an assessment task. Clause 5(16)(5)(b) states that students must acknowledge any assistance provided in preparing submitted work, including the use of automated writing tools.
This means that by default, students are not permitted to use generative AI to complete assessments inappropriately. However, as a University, we want to develop students into ethical citizens and leaders, and it is clear that these technologies will form an important part of our collective future. The policy therefore allows unit coordinators to permit the appropriate use of these technologies, including in assessment. What this looks like will differ depending on the unit, discipline, and other factors. In these situations, coordinators need to make clear to students what counts as appropriate use, and students will need to acknowledge how these technologies have been used in the preparation of their assessments.
Can I use pen and paper exams?
Given the potential for generative AI to create novel text that traditional text similarity tools will not detect, there is an understandable concern that these technologies could be used to produce submitted work that does not reflect a student’s own level of competency. Invigilated tests and exams remain a viable assessment type for authenticating student proficiency, as long as they fit within the assessment and feedback framework.
However, as part of the 2023-2025 Teaching Strategy, we wish to move away from exams (which have limited authenticity and inclusivity) towards more authentic forms of assessment. We recommend using invigilated tests and exams in semester 1, 2023 as a temporary measure, if needed, to authenticate proficiency, and moving towards assessment redesign in the medium term.
What do I do about my assessments and teaching?
How do I know if my assessments are affected?
Consider trying out generative AI tools to see what sort of output they would produce for your assessment. For text-based assessments, you might find it informative to paste the assessment prompt that is usually given to students into ChatGPT – this prompt or follow-up prompts can include some background information about the assessment, the desired learning outcomes, and the question(s) themselves. Have the tool generate a few responses to the prompt, and consider their quality.
Is it just ChatGPT that I need to know about?
ChatGPT is simply the tool that has received the most attention (having gained over 100 million active users within two months of its release). However, there is a rapidly growing set of AI-enabled tools available online for generating all forms of writing, as well as multimedia artefacts including images, video, music, vocals, and more. Not all of these may be applicable to your assessment(s), but it is worthwhile building awareness of the AI tools that your students may be using. It is also worth noting that the quality of AI tools is improving rapidly, and current limitations (such as in factuality, accuracy, and the ability to reference sources) will continue to be overcome.
What do my students need to know?
We strongly recommend speaking with your students about generative AI. We also strongly recommend that you set clear guidelines about whether and how these technologies might be appropriately used to support their learning in your unit. Students also need to be aware of the Academic Integrity Policy 2022, alongside their other responsibilities as students, such as those found in the Student Charter 2020.
How do I explain generative AI to my students?
AI tools will become part of every workplace, and we want our students to be among those who master the technology. They will need to learn how to build on the work produced by AI. The article on What teachers and students should know about AI in 2023 is a good start. Additionally, we are developing a short video explainer together with an expert in large language models from the School of Computer Science at the University. This will be available soon.
What should I do about my assessments this semester? Next semester?
The article on How can I update assessments to deal with ChatGPT and other generative AI? is a good start. Although assessment types have been published in unit outlines by now, the way that you provide feedback to students, and the way students work through assessments, can still be adapted. As outlined in the article, you might, for example, provide references that students must use, or contextualise your questions around current news and local events.
As always, it is important to ensure your assessments are connected to your unit’s learning outcomes; don’t just set more difficult assessments or raise the expected standards because generative AI may be able to produce passable work. Also, try to avoid the widely suggested approach of turning every assessment into a form of ‘critique the AI response’ – this may not be the most effective way to address your learning outcomes.
How can I design an assessment that is AI-proof?
Given the linguistic and other capabilities of generative AI, it will be difficult to design an assessment that is completely ‘AI-proof’. Suggestions such as having students refer to class discussions or contemporary sources, or include details of their own environments or experiences, will help somewhat, but students could still generate text using AI and simply insert these details manually. The article linked in the question above offers alternative suggestions for approaching assessments in a context where generative AI is widely available.
Can I use one of those AI text detectors if I suspect students are using ChatGPT inappropriately?
No. There are significant unresolved privacy issues with these third party tools. If you suspect that a student may have used generative AI to complete an assessment task inappropriately, the unit coordinator should be notified and they will lodge a case with the Office of Educational Integrity. This case will then be reviewed independently, and a similar approach to investigating cases of contract cheating will be taken.
Are there equity issues around using these tools?
We cannot mandate the use of generative AI in assessments unless access for students and staff is guaranteed and provisioned by the University. ChatGPT, for example, is likely to continue to have a free access tier in addition to introducing a paid subscription model – although the free tier does not have guaranteed access and may become unavailable unexpectedly during peak times. If you wish students to optionally critique or otherwise draw on ChatGPT output as part of an assessment, you could consider generating some outputs yourself and making these available to students on Canvas so that they don’t need to do this themselves.
Working together to productively engage with generative AI in education
Where can I find out more about this?
We held two all-staff forums on this recently – University of Sydney staff can access the recording of one of these forums. We also have a set of resources available on AI and education at Sydney.
Do you have any examples of guidelines for students that I can use in my units?
The first port of call should be the Academic Integrity Policy 2022. Following this, here is a set of points that you may wish to adapt for your own contexts.
You are permitted to use generative AI to help you <insert learning activity and benefit>. For example, you may want to prompt the tool to <provide some specific examples>.
If you use these tools, you must include a short footnote explaining what you used the tool for, and the prompts that you used. Such a footnote is not included in the word count.
Don’t post confidential, private, personal, or otherwise sensitive information into these tools. Their privacy policies can be a bit murky.
Your assessment submission must not be taken directly from the output of these tools.
If you use these tools, you must be aware of their limitations, biases, and propensity for fabrication.
Ultimately, you are 100% responsible for your assessment submission.
The Syllabus Resources from the Sentient Syllabus Project also contain some text that you might consider for inspiration.
What are some examples of how academics are using AI in their teaching and assessment?
We are currently curating examples of how different academics are approaching generative AI in their teaching and assessment, and the guidelines they are providing their students.
I have an example of how I’m using generative AI in my unit
We would love to hear about this. Please email Danny Liu.
Who can I talk to for more advice?
If you’d like to discuss assessment and teaching design, please reach out to Educational Innovation via our design consultations – you can book a time that suits you: https://bit.ly/ei-consults. For matters of policy, please consult your Associate Dean Education. If you would like Educational Innovation to speak to your school, department, or Faculty, please get in touch with Danny Liu.