Rethinking reflective writing assessments in the face of generative AI

 


By Dr Neda Chepinchikj, Implementation Manager, English Language Support UNSW

Published 24 July 2023

 

How we redesigned a task in an Engineering course

Producing a piece of reflective writing shows that you can think critically, synthesise ideas and express your insights (Mann et al., 2009), and it showcases your language and communication skills. It is therefore frequently used as a low-stakes assessment or as a diagnostic of students’ writing skills, critical thinking and linguistic competence.

The first written assessment in a major first-year undergraduate core course in the Faculty of Engineering – Introduction to Engineering Design and Innovation (DESN1000) – is a piece of reflective writing. The task, which comes early in the course (due at the end of Week 2), is based on an activity in which students work in teams to design and build a particular device. At the end of the activity, each student is asked to reflect individually and comment on their experience of working in a team through the lens of Field’s (1997) three principles of design. This is a low-stakes assessment (worth only 5% of the overall mark), and the students receive formative feedback from a Smarthinking¹ tutor on their language and reflective writing skills through “improvement-focused comments” (Bearman et al., 2023, p. 3).

The students do not receive scores or marks, which removes the stress factor and allows them to focus more on learning and developing their skills.

At the start of 2023, the advent of ChatGPT and the increased discussions about generative AI heralded a changing landscape of feedback and assessment. The as yet unexplored terrain of generative AI caused an uproar in the higher-education sector and prompted many teachers to start thinking seriously about what this could mean for academic integrity in students’ assessments.  

Just before the start of Term 1, the convenors of DESN1000 approached me about redesigning the reflective writing task in the face of ChatGPT and its potential for misuse by the students.  

We began by testing ChatGPT’s capabilities in reflective writing. When we prompted the tool to produce a reflection supported by evidence from the source used for the task, it generated only a short, generic paragraph. Brief as this paragraph was (well under the task’s 500-word limit), it did draw on the source and offered a reasonable, albeit generic, answer to the prompt.

However, what it couldn’t do (despite being explicitly asked) was produce the reflection itself, because this needed to be based on lived experience and include the affective element of actual teamwork.

This insight was quite useful, because it meant that we didn’t need to reconceptualise the entire assessment; we only needed to make some adjustments:

  1. The task now focused on the students’ teamwork and how it affected the success of their design activity.

  2. We removed Field’s (1997) three principles of design and shifted the focus away from discussing the success of the students’ design work in light of whether or not these principles had been applied. Instead, we asked the students to describe their teamwork and its progress with reference to the five stages of group development covered in the course textbook.

  3. We emphasised the students’ feelings about participating in the activity.

  4. We asked them to reflect on their key learnings and comment on what they would have done differently.

  5. Finally, we included hyperlinks to various resources on reflective writing and Harvard referencing in the assessment brief itself.

Thus, by redirecting the focus of the task to the experiential side of teamwork and to the students’ feelings and learnings, we strove to minimise the potential for students to misuse ChatGPT while leaving more room for critical thinking and engagement with the task.

From this experience, we learned that ChatGPT cannot yet produce believable reflective pieces without raw data on real-world experiences; other scholars (e.g. Nikolic et al., 2023) have reported similar findings. The quality of ChatGPT’s outputs also depends on the prompts used to generate an answer. So, although ChatGPT could not generate a convincing piece of reflective writing for us, it may only be a matter of time before it becomes capable of realistically imitating this type of academic writing (Li et al., 2023).

 

References

  • Bearman, M., Ajjawi, R., Boud, D., Tai, J., & Dawson, P. (2023). CRADLE suggests… assessment and genAI. Centre for Research in Assessment and Digital Learning, Deakin University. DOI: 10.6084/m9.figshare.22494178.

  • Field, B.W. (1997). Introduction to the design processes (2nd edn). Monash University.

  • Li, Y., Sha, L., Yan, L., Lin, J., Raković, M., Galbraith, K., Lyons, K., Gašević, D., & Chen, G. (2023). Can large language models write reflectively? Computers and Education: Artificial Intelligence, 4, 100140.

  • Mann, K., Gordon, J., & MacLeod, A. (2009). Reflection and reflective practice in health professions education: A systematic review. Advances in Health Sciences Education, 14, 595-621.

  • Nikolic, S., Daniel, S., Haque, R., Belkina, M., Hassan, G. M., Grundy, S., Lyden, S., Neal, P., & Sandison, C. (2023). ChatGPT versus engineering education assessment: A multidisciplinary and multi-institutional benchmarking and analysis of this generative artificial intelligence tool to investigate assessment integrity. European Journal of Engineering Education, 1-56. DOI: 10.1080/03043797.2023.2213169.

Footnote

      ¹ Smarthinking is an online writing-support platform where students can receive formative feedback on their writing from writing professionals.

Acknowledgments

I would like to thank and acknowledge the DESN1000 course convenors Dr Nicholas Gilmore and Professor Ilpo Koskinen for their insights and collaboration.  

Editorial support by Laura E. Goodin.

 
