Will artificial intelligence revolutionise the higher education sector for good?

Experts come together to discuss the future possibilities

When a plumber turned up to do some work at Thomas King’s home last year, he told the IT expert that he’d recently contested a parking ticket using the artificial intelligence (AI) software program ChatGPT.


“It had just been released (ChatGPT) and he didn’t have great reading and writing skills and he was able to contest a parking ticket,” Mr King said. “He was really blown away by the utility of it.”

Mr King is a Higher Education Industry Executive at Microsoft ANZ. While the seasoned technology expert acknowledges that AI programs such as ChatGPT are still maturing, he believes people are starting to see the benefits of generative AI, and that this is driving significant and rapid uptake across all levels of society.

The uptake of ChatGPT is unprecedented – the chatbot reached 100 million users within two months of launching. It’s this rapid expansion and potential impact on the higher education sector that recently brought together leading researchers and industry experts for a thought-provoking Luminaries webinar hosted by UOW.

The AI and the Future of Higher Education webinar was moderated by UOW Senior Professor Sue Bennett. She was joined by Microsoft’s Thomas King; Professor Rhona Sharpe, Director of the Centre for Teaching and Learning at Oxford University; and Professor Michael Henderson, Director of Educational Design and Innovation in the Education Faculty at Monash University.

Conversations around AI and education often default to the familiar themes of cheating and assessment. However, as the panellists highlighted, there is a more nuanced discussion to be had around the opportunities AI can provide.

The panellists agreed that AI is here to stay and that the answer isn't to ban it; instead, they suggested embracing the technology while acknowledging the challenges it presents.

Professor Rhona Sharpe has over 20 years’ experience in the digital education space and considers the use of AI tools as an extension of digital literacy.

“Digital literacies can be understood as the ability to access and use digital tools and importantly, to incorporate those skills into contextualised practices in creative and discerning ways,” she said.

“We have to prepare our graduates to use it well and wisely.”

But digital literacy is something that must be developed, and that can only happen if students have genuine access to the technology. Professor Sharpe refers to this as functional access.

“Ticking the box to say that you have access to a laptop isn’t functional if you share it with other members of your family, or if the screen is broken and if the WiFi connection isn’t very good,” Professor Sharpe said.

“We’re already seeing some inequalities emerging with access to ChatGPT Plus. Some students are paying for ChatGPT Plus, some aren’t. Some are confident in their use of language to use the prompt function, some aren’t. Understanding barriers to access is very important.”

Professor Sharpe reiterates the need for students to know how to use the tools, giving examples of AI being used in positive ways such as improving English in emails, writing code or using it as a tutor.

But at what point does using AI tools for productivity cross the line into cheating and how can this be managed?

Professor Michael Henderson is an expert in the field of digital education and said he is cautious about calls to go back to pen and paper and exam halls.

“Exams are not impervious to cheating and while exams can sometimes be a useful context for the demonstration of certain kinds of knowledge, they’re often not a great design for a lot of different purposes about trying to get students to demonstrate their learning, or for creative and critical thinking,” Professor Henderson said. “It’s unlikely to stop generative AI misuse.”

Professor Henderson is quick to acknowledge he isn’t against exams in totality, but believes universities need to move away from quick fixes and look at making big changes.

“Most institutions I am coming across, quite excitingly, are accepting the idea of AI in assessments and I think it’s a fantastic approach.”

Professor Henderson believes AI could be used in assessment design: generative AI could support and prompt learners to use established structures and genres to help with expression, or it could inspire novel thinking.

“This is exciting stuff,” Professor Henderson said. “We could be using it in ways to test ideas, to explore hypotheses, to quickly spin up and generate materials and notions, to become a sounding board.”

Professor Sharpe made a similar point: the emergence of AI doesn't change the reasons students cheat.

“Students are more likely to present work that is not their own when the assessment tasks they have been set aren’t relevant or meaningful, when the deadlines are too tight and when the stakes are too high,” Professor Sharpe said.

“The role of the educator is to encourage students to use these tools in critical and creative ways, not superficial ways that they might do without our support.”

Professor Sharpe believes AI certainly has a place in reducing inequalities in education, citing tools such as Grammarly and spellcheck as programs which support students to improve their written English. It’s the inclusion and equity aspect of AI that also excites Professor Henderson.

“It can really provide opportunities for students who are differently abled to engage,” Professor Henderson said. “But on the other hand we need to recognise that generative AI services will inevitably become subscription services.”

He says it’s important for universities to ensure students have equitable and functional access. Professor Henderson is also concerned about how generative AI could become an echo chamber.

“It does not know the difference between fact and fiction, and doesn’t know the implications of biased language and stereotyped representations. In its current form, it has very few values embedded, except for those of the programmers,” he said. “It’s something we need to be very critical of.”

How the higher education sector should respond to the emergence of AI is a complex question, but it must be tackled. Thomas King says institutions should develop broad AI strategies that don't just focus on assessment.

“It’s not just around assessment, it’s not just around teaching and learning, it’s around impact. It’s around your partnerships, it’s around the student experience you want to provide. It’s about improving your university operations and administration,” Mr King said.

Mr King encourages academics to think about how they can use AI to their advantage.

“Have you really sat down and thought how you can take advantage of this to give you some of that time back to spend with your students, to spend contemplating how to solve research challenges, to free up time for your creative expertise to be used?”

“It’s such a critical time for the sector to really show how valuable it is to society at large.”

Watch the full Luminaries seminar on AI and the Future of Higher Education here.

About Luminaries

Luminaries brings together leading UOW researchers and thought leaders for a one-hour conversation every fortnight. Join us online for this interdisciplinary series and discover how research and collaboration are tackling global social, environmental and economic challenges.