Published on 19 February 2026
Teaching is relational – but now there are three of us in the relationship
By A/Prof. Lynn Gribble and A/Prof. Kamilya Suleymenova
Ask any academic and they will tell you that the joy of teaching lies in its relational dimensions. We witness students experience ‘aha’ moments, shift their thinking, and develop the critical capacities that allow them to engage constructively with the world. Even in large classes, students often feel they know their teachers and, in most cases, want to be known by them. Academics learn names, understand aspirations, and seek to motivate students not only for academic success but for lifelong learning and career development. Teaching, at its core, is relational.
Prior to November 2022, the learning environment operated differently. We might now describe this as the ‘before GPT’ (BGPT) era. Viewed through an open systems lens, the primary relationship within the system was between student and academic. Trust was built through direct interaction. While technological change and massification had already stretched that relationship, the external forces acting upon the system were broadly understood.
For many disciplines, particularly business, the social sciences and the humanities, coursework has often been assessed through essays and reports. Thus, the known inputs into the system included the student’s own written work, sometimes supported by peers or family, and occasionally deliberate attempts to outsource writing. Such outsourcing required explicit intent to deceive. Academic misconduct occurred, but it was not the norm (see Rigby et al., 2015). Within this system, the transformation process — from student input to written output — was comparatively transparent. Written artefacts functioned as reasonable proxies for a student’s thinking and knowledge, even if imperfectly so.
The emergence of generative AI marks the ‘after GPT’ (AGPT) era. The dyadic relationship between student and academic has, in effect, become triadic. AI now sits within the learning relationship, whether invited or not.
As generative tools are embedded within everyday technologies, their use is not always motivated by an intention to mislead. Spellcheck and grammar tools are accepted as good practice; predictive text and generative drafting occupy a more contested space. The boundary between assistance and substitution is no longer clear.
For many academics, writing is not merely a means of communication but a process of thinking. If thinking is enacted through writing, then delegating writing to AI is perceived as delegating cognition itself. In AGPT, the open system has been fundamentally altered. The inputs, transformation processes and outputs are all affected. Written artefacts can no longer be assumed to reflect individual capability unless the process of their creation is observable.
This creates significant asymmetry. As yet, the norms governing the use of AI within (and outside of) higher education are unclear and subject to heated debate. Thus, when a student's writing process is unobservable, uncertainty is high for both parties. In addition, given the existing grey areas, the psychological cost of misconduct may diminish. Altogether, this means that trust becomes fragile.
Students, too, experience a shifting trust landscape. They question whether academics are using AI to design assessments or grade submissions. If AI is legitimate for one party, why not the other? The relational contract — both explicit and psychological — is unsettled.
This disruption occurs within a broader environment that is volatile, uncertain, complex and ambiguous (VUCA). While higher education frequently claims to prepare students for a VUCA world, generative AI has intensified each of these dimensions. The pace of technological development makes risk estimation difficult. This is not routine change; it is systemic uncertainty.
Academics are particularly exposed in such conditions. Educational systems are characterised by inertia. Assessment redesign requires substantial labour and navigation of quality assurance frameworks. Academic credibility itself has historically been certified through written outputs produced under BGPT conditions. Rapid change can feel destabilising not only pedagogically but professionally. Moreover, when the future skills landscape is unclear even in the medium term, curriculum decisions become high-stakes bets.
The response cannot be to abandon existing systems wholesale. Nor can it be to treat generative AI as a marginal add-on. The disruption is systemic, and so the response must also be systemic. This requires examining the world of work, industry practices and evolving workflows to reimagine higher education as an integrated system — from teaching design to knowledge acquisition to assessment.
Focusing on isolated components risks fatigue and despair. Instead, we must acknowledge uncertainty, translate it into manageable risk, make provisional assumptions, and revise those assumptions as technologies evolve. This is demanding work. It is also intellectually invigorating.
The task ahead is to preserve the relational core of teaching while openly integrating generative AI into our classrooms. We must design learning experiences that make AI visible within the relationship rather than hidden behind it, and develop assessment approaches that meaningfully evaluate knowledge, capability and judgement at scale.
As we enter our fourth AGPT academic year, the question is no longer whether AI belongs in higher education. The question is how we consciously position it within the teaching relationship so that trust, integrity and intellectual growth remain central.