24 January 2023

Warning: No (artificial) intelligence was used writing this column

Ross Solly

Will AI prepare students for the future or degrade their ability to think and write creatively? Image: File.

Before we begin, I want to point out that the following words are all my own. No artificial intelligence, no plagiarising.

It should be easy for regular readers to notice. If I had used AI, I’m sure the column would read much better, be more interesting and use much bigger words.

The use of AI at schools and universities is being debated around the world after it was discovered some students were using the latest technology to write their assignments. Somehow, some people think students using AI is a good thing and that banning it would be counterproductive.

Am I missing something? How can this technology be good for students? How can they possibly benefit from getting a computer to write their assignments for them?


In NSW, public schools have announced they will ban the use of AI, but private schools are indicating they will not. According to these private schools, they believe their teachers can identify work produced by AI, which is interesting, considering many global experts claim it is almost impossible to tell the difference.

Private schools also say using AI technology will lessen the workload of over-stretched teachers. Everyone knows finding enough teachers is a problem Australia-wide. But is embracing technology which takes away the need for students to use creative thinking of their own the best way to make life easier for teachers?

New York schools have banned ChatGPT, the AI chatbot most favoured by students. The education department said it had concerns about negative impacts on student learning and concerns regarding the safety and accuracy of content.


Proponents of AI technology say it prepares students for the real world and that this is the way of the future. Maybe, but do employers want employees who actually can’t write their own submissions, reports and analysis?

The technology is spreading like wildfire. The Australian version of Rolling Stone said it will use AI to write some of its articles. But don’t worry, it will clearly identify which articles have not been penned by a real person to save us the hassle of reading them.

At least that’s one subscription I can safely cancel.

And then some misguided Nick Cave fan sent the singer lyrics written by ChatGPT. The singer was not impressed, labelling the song “a grotesque mockery” and a “travesty”.

“This song sucks,” he declared.


I know some of you will declare me a Luddite, stuck in the dark ages and wishing people still walked in front of cars carrying lanterns. But I have already expressed my concern on this website about the effect social media and other new technology are having on younger generations.

And have you tried recently to get a teenager to read a book? Impossible.

So for me, ChatGPT and other forms of AI don’t need to be embraced. If this new technology means schools and universities have to return to the days of assignments being done in the classroom using pen and paper, so be it.

The pandemic has already put a generation of students behind the eight-ball. Let’s not add to it.

Tom Worthington5:15 pm 25 Jan 23

Were the words all your own? No use of a thesaurus, spellchecker, or copy editor? Those are all tools.

Words are used for a purpose. Publishers pay authors to write. If they can get a machine to write as well, and more cheaply, they will use a machine. If authors can use the AI to produce better words than the AI could on its own, then some writers will keep their jobs.

For the rest of us, who produce words as a byproduct of something else, we can fit AI into what we do, if it does an okay job.

I use a computer program to narrate slide shows. I produce a script, and the computer reads it out in a voice that sounds like me, fades in music, and times this to the slides. I use this when preparing a presentation to check the timing and flow. It also makes a handy preview. Without the software, doing this would be too much trouble.

A few decades ago I was a guest at High Table at an Oxbridge college. The person sitting next to me was a language don. I casually mentioned how soon computers would be able to do translation. The table went silent, and I was looked at in horror. Machine translation is now routine. It is not perfect, and you would not use it for a published book, let alone poetry, but for business it is okay. We will adjust to AI text generation.

It’s a matter of trust. The text looks good, but that does not mean it is correct or accurate. The danger here is that people will put AI-generated answers in documents without fact-checking because of the hype, or perhaps because they used it once and the answer was OK, so they now have misplaced trust. The AI is equivalent to that person we all know who knows nothing about the subject but is very good at bluffing and fooling people into thinking he or she is an expert.

The other factor is that if we wanted to check the answer, we can’t. We don’t know whether invalid sources were used or what weighting was placed on certain sources. So if we used some AI-generated text in a document and someone asked “where did that come from?”, all we can effectively answer is “I don’t know, the machine made it up on my behalf”.
