AI - The New Chapter in Higher Education: An Academic Facilitator’s Perspective

 

Classroom photo from the Introductory Academic Program (IAP)

By Dr James Bedford, Academic Learning Facilitator, Pro Vice-Chancellor, Education & Student Experience Portfolio

Published 23 May 2023

 

I work alongside a team of brilliant individuals called Academic Learning Facilitators (ALFs). We support students at all stages of study, and from every faculty, in their academic studies. On any given day we might be preparing or delivering a workshop, teaching in one of our many programs, consulting with students about their academic work or assignment drafts, creating an online resource, or meeting with faculty to discuss how we can work together to embed our resources in their courses. Because we talk to staff from all over the university, and have candid conversations with students about their work, we often see firsthand the trends and challenges affecting higher education.

Lately, there have been many conversations around generative AI. When it comes to ChatGPT, Bing, Jasper, Bard, and the various other iterations released each week, students are using them but are unsure how to use them appropriately. Official policy, and the principles underlying acceptable use, still seem vague.

I hear many students describe their feelings towards ChatGPT as “apprehensive”, “anxious”, and “uncertain”. One said they have found ChatGPT to be “addictive”. Another felt their ability to write might be greatly diminished by using the tool. Some are more positive and are using generative AI to develop their language and comprehension skills. Overall, there seems to be genuine confusion around what these tools mean for students and how they might use them.

In December 2022, I started using ChatGPT in the classroom to demonstrate its potential benefits. I worked with about 30 students in the Introductory Academic Program (IAP), a course required of recipients of an Australia Awards Scholarship. I sat with each student individually for 10-15 minutes, looking through their work and asking questions while providing both verbal and written feedback. Students waited their turn, talking quietly at the back, sometimes listening in on the advice I was giving. In some cases, and with the student’s permission, I would use ChatGPT to analyse one of their paragraphs. I began showing students how to use very basic prompts like “improve and clarify this passage”. The results were impressive. In addition to my own feedback on their work, I could show them an improved passage in a matter of seconds. This was useful when it was difficult to explain what exactly was wrong with a sentence at a linguistic or syntactic level, or when I didn’t have the time or energy to locate such an example elsewhere.

The next step was crucial if students were to learn something of value. I asked them to analyse the responses, or ‘improvements’, ChatGPT made. What was better or worse about the response it provided? Did it capture what they were trying to say, or did it distort it? What did it do differently? What did they notice? Running a portion of their writing through ChatGPT and evaluating the responses it provided proved to be a valuable learning experience for the students and for me. Their eyes lit up when the ideas they were struggling to articulate were suddenly clarified and improved at the click of a button. As the technology was quite new, most had never seen it before.

I was witnessing, firsthand, their genuine reactions to generative AI. It was like watching the barrier between an idea and its expression in language suddenly break down into tiny, tiny pieces.

This tool could change everything, I thought to myself. What could go wrong? 

There were times when it did go wrong. Horribly wrong. In a few cases it misconstrued a student’s original ideas or made the writing less clear and more convoluted. I made sure to explain the dangers of using this tool, including the way it can hallucinate sources and perpetuate biased, misogynistic, or racist content, not to mention its lack of up-to-date knowledge and its sheer lack of common sense. Still, ChatGPT’s capacity to provide instantaneous feedback on a piece of writing may be far too intoxicating for some students to resist. And perhaps this isn’t a bad thing.

A lot has happened in the space of generative AI since December 2022. Now that Term 1 has passed, the technology has become more advanced, and students appear to be more aware these tools exist. While more students seem to be using programs like ChatGPT, they still seem unsure what constitutes misuse. Is it plagiarism if I rewrite a response ChatGPT provides? Can I translate my ideas into English using ChatGPT? Can I have ChatGPT proofread my work? My advice to students here is to have a conversation with their course coordinator or supervisor. I might then talk with the student about academic integrity and what it looks like in practice, or discuss their writing process to ensure they have used these tools appropriately. Are they familiar with UNSW’s plagiarism policy? Do they know what counts as plagiarism? Most haven’t read the policies written for them. When I was a student, I didn’t either.

When it comes to generative AI, a good understanding of plagiarism and academic integrity policies is just the beginning. As students use these technologies more and more, it is important they understand the broader implications of doing so, in particular how generative AI might affect their learning.

In a recent seminar I gave for higher degree research (HDR) students, we discussed some of the challenges and limitations of tools like ChatGPT and Elicit, while also covering important topics like copyright and privacy. We concluded the workshop with an activity asking participants to draft UNSW’s official policy statement in response to these emerging tools. Students were split into small groups and asked to read and discuss UNSW’s current plagiarism policy before coming up with discussion points around generative AI. This was a useful activity, as it encouraged students to think more deeply about the uses and potential misuses of these tools. It also led to broader discussions about the purpose of university and students’ roles as learners and creators of knowledge. One student even commented that workshops on AI literacy should be required learning at university.

It’s vital we continue teaching AI literacy in order to clarify how generative AI can be used responsibly. We must set clear guidelines on appropriate usage and, once the dust begins to settle, create authentic assessments that cannot simply be completed with these tools. While a return to written exams and oral presentations might seem like a step backwards, perhaps it is one way to address what is to come. I am hoping to implement portfolio-based assessment in our Academic Skills Plus program, which can then be used to examine a student’s process and participation more broadly. Something that encourages students to reflect on their own learning, while also keeping a paper trail of their work across a course, will be very useful in the age of AI. I now encourage students to keep drafts of their work handy in case Turnitin’s AI detection tool flags their submission and a teacher wants to discuss it. The tool should only ever be a starting point for a conversation about a student’s work, and a student who has copies of their drafts and evidence of their process can better demonstrate the integrity of their submission.

No doubt, the way universities prepare for and respond to the swarm of generative AI tools that are coming, and that are already here, will be crucial to how we navigate the future of higher education. As these powerful technologies continue to improve and become increasingly widespread, universities will need to focus on familiarising students and faculty with their technical aspects. We also need to have conversations about the broader societal implications, including ethical considerations, the potential for job displacement, and the ways these technologies may alter communication and decision-making. A greater push towards honest and open conversations with staff and students is vital if we are to support AI literacy and better prepare our approaches to this technology.
 
