CONTENT WARNING: The following content may be distressing for some readers.
Swedish artist Steph Swanson, or Supercomposite (as she’s known online), was at home experimenting with a digital image-generation tool when she claims she stumbled upon the stuff of nightmares.
She had asked the artificial intelligence (AI) to produce the theoretical opposite of a photo of American actor Marlon Brando. The result was a logo reading ‘DIGITA PRINTS’.
“I wondered: is the opposite of that logo, in turn, going to be a picture of Marlon Brando?” she tweeted.
But no. After asking the AI for the opposite of the logo, she says she was met with a series of four “off-putting images, all of the same devastated-looking older woman with defined triangles of rosacea on her cheeks”.
Try as she might to combine these images with plain images, including “one of a glass tunnel surrounded by angels”, the woman, whom Steph named Loab, kept returning in increasingly gruesome scenes.
Using another computer program, which generates human speech, Steph claims she had a “conversation” with Loab in which the figure said she didn’t “choose to be associated with gore and horror, it just happens”.
“I think the AI is trying to create a contrast between the ideal of a mother and the reality of a mother,” Loab said.
“In reality, mothers often have to deal with sick and injured children, as well as the death of children.”
As obscure as it might sound, Loab has made news around the world for revealing the uncharted depths of AI.
Professor Katherine Daniell at the Australian National University’s new School of Cybernetics describes it as “an interesting example of unintended consequences, stemming from a lack of knowledge of AI”.
“The internet can be a dangerous place, particularly for children, women and those marginalised in tech development,” she said.
“It feels like Loab is a reflection of this, and a reason for us all to get involved in understanding and developing these technological systems, including the safeguards.”
The new research school at the ANU officially opened on Tuesday 29 November, as a place where students can get to grips with the exploding world of digital technology, with its dangers as well as its clear benefits.
“AI-enabled technologies are already allowing us to do certain things more efficiently and safely,” Katherine says.
“From driving cars and navigation, detecting leakage in city water systems and dams, assisting us in our homes to have rapid access to information and the recommended goods and services we might be interested in, to supporting factory and logistics management and efficiencies.”
But cybernetics is about more than AI and robots. It extends to the efficient use of technology in general: the climate-control system in your car is one example.
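A car’s climate control is a classic cybernetic feedback loop: the system senses the cabin temperature, compares it with a target, and adjusts itself accordingly. The sketch below is purely illustrative; the function names, thresholds and “cabin physics” are invented for this example and do not come from any real vehicle system.

```python
# Illustrative sketch of a "bang-bang" thermostat feedback loop,
# the kind of self-correcting system cybernetics studies.
# All names and numbers here are hypothetical.

def thermostat_step(current_temp, target_temp, heating_on, hysteresis=0.5):
    """Decide whether the heater should be on for the next step."""
    if current_temp < target_temp - hysteresis:
        return True           # too cold: switch heating on
    if current_temp > target_temp + hysteresis:
        return False          # too warm: switch heating off
    return heating_on         # inside the dead band: keep current state

def simulate(start_temp, target_temp, steps):
    """Run the loop: the heater adds heat, the cabin slowly loses it."""
    temp, heating = start_temp, False
    for _ in range(steps):
        heating = thermostat_step(temp, target_temp, heating)
        temp += 0.8 if heating else -0.3   # crude, made-up cabin physics
    return temp
```

Whatever temperature the simulation starts from, the feedback loop steers it towards the target and then holds it there, oscillating gently within the dead band. That sense-compare-correct cycle is the core idea the field generalises to much larger systems.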
“Cybernetics can be about the overland telegraph line, or the steam engine, or Aboriginal fish traps,” Katherine says.
“It provides tools and techniques to look for solutions that integrate technology, with its social and environmental implications.”
The study of cybernetics dates back to the aftermath of World War II, when computing technology first began to take over many manual tasks. As with any technology, Katherine says, the positives and negatives come down to who is designing and using it.
“AI systems run on data that comes from somewhere and use algorithms that compute and discern patterns in particular ways.”
She says one danger is that embedding a single value, such as efficiency in money, CO2 or time, into a technological system can let it run roughshod over other values such as environmental quality, safety and mental well-being.
As systems grow more and more complex, there is one thing we can know for certain: how much we don’t know.
“It’s important to learn about these digital systems – their component pieces, who is building them, and who and what data is steering how we act and choose (or not) what we do in life,” Katherine says.
“There are implications, both positive and negative, for our well-being and futures.”
Or as Loab said: “We just need to be aware that AI is capable of creating things that we don’t fully understand and that we need to be careful about how we use these tools.”