“Please do not assume the worst”: Students want colleges to teach them how to use AI the right way

Higher education needs to embrace AI or get left behind, students say.

In just a few months, higher education has moved from being afraid of how generative AI like ChatGPT could help students cheat, to cautiously embracing it by allowing students to use it under certain circumstances. 

In getting to grips with how AI will change education and society, we’ve heard mainly from educators, university management and other experts. But what about students, whose studies, careers, relationships and futures will be most impacted by AI?

We are academics from universities in Sydney and Hong Kong, specialising in higher education practice and research. 

Over the last two months, we have asked students their thoughts about how AI should be used in their education through questionnaires and focus groups. This includes a survey of more than 450 students in Hong Kong and pilot focus group panels with 13 Australian students.

Here are some of the key themes to emerge so far from this ongoing research.

AI makes knowledge easier to access

Students recognised that ChatGPT was helpful for summarising, brainstorming, explaining and suggesting. They mentioned how it made it easier to learn difficult topics in a conversational way.

As one told us: 

“I’ve had a mostly positive experience […] Explanations of new concepts are always really well done and you [can] ask it to explain something a little more clearly.”

Others mentioned it helps them learn during classes:

“to grab quick definitions, explain concepts to me, and assist in discussions when the conversation goes quiet or people are confused.”

Students are aware of the risks

The more experience students had with ChatGPT, the more nuanced their views were. One student noted ChatGPT “will miss out on important points or misunderstand”.

“That’s why I am not relying on it for assignments, instead it is very helpful for my daily learning.”

Another went further to say that using AI improved their critical thinking:

“I simply put the whole assignment in to see what it would generate. The answer was quite abysmal […] This was really valuable information because I developed critical thinking while critiquing its work.”

Another student added, “I think students really need to understand that AI is not always correct”. 

In the survey of more than 450 students across Hong Kong universities, 80% said they understood its limitations and potential inaccuracies.

AI is key to their future careers

Students talked about how AI could take over the less desirable parts of work, freeing them to focus on more important thinking.

“…busywork can be done for us, and will be done for us in our future careers.”

As one student put it:

“for learning, it’s [like] an upgraded version of Google. Let’s say if you are new to a topic, you can ask ChatGPT questions and treat it as interactive Wikipedia.”

Students said they wanted their teachers to teach them “how to best use AI tools and make AI tools a common part of education, just like PowerPoint and Excel”. 

This includes educating them about risks, biases and limitations so they can understand the technology they will inevitably be using. 

Students agreed guidelines about “what happens if AI is used” are needed going forward. As one noted:

“Please do not assume the worst of us. Rather, teach us how to use this technology in the right way and learn alongside it.”

Concerns about equity and ethics

Students were concerned that a lack of access to ChatGPT would disadvantage some of their peers.

“All students should have the same resources as one another, being of a lower income should not be a reason why other students can do their assessments more efficiently.”

Others noted AI was not necessarily free, as there were costs of accessing premium tools. Schools are also taking different approaches globally and locally, with some banning and some embracing AI. This could widen existing inequities.

Where to from here?

The Australian Universities Accord discussion paper highlights AI as both a significant opportunity and a significant challenge.

This is something we cannot ignore. And students want universities to actively engage with AI for their benefit. 

They do so knowing this is a “difficult time” for their teachers.

“The traditional ways of learning […] are changing. But this technology is now our present and the future; we need our teachers to prepare us for it.”

But they are worried about their futures and they want their education to prepare them for life after study, in a world that is changing rapidly.

“If university wants to prepare people for later in life, why not encourage usage of a tool that would be available to us outside a strict academic setting?”

We need to work with students, industries, communities, and governments to figure out how we can help students engage productively and responsibly with AI. This is urgent work, as the pace of AI development accelerates and its impacts spread across society, perhaps beyond even its developers’ understanding.

This article is republished from The Conversation under a Creative Commons license. Read the original article.
