10 September 2024

It shouldn't be too hard to outlaw political AI deepfakes before the next election, right?

By Chris Johnson

AI deepfakes have no place in Australian election campaigns. Images: David Pocock Instagram.

David Pocock has just proven a point that should be taken seriously by the whole Federal Parliament.

The independent senator for the ACT spelled out clearly what we already knew – elections can be won or lost by the fraudulent use of artificial intelligence.

He just showed us how easily that could be done.

Pocock’s little experiment over the weekend – releasing deepfake videos he had created of the Prime Minister and the Opposition Leader agreeing with each other on a policy issue, and an issue on which they can’t even find agreement within their respective parties – should have triggered alarm bells at Labor and Liberal party HQs.

The issue, of course, was banning gambling ads.

Even though it’s unlikely that either major party will stand up to the gambling lobby, many people could easily be convinced that, at the height of an election campaign, they might just find some gumption on the issue.

In one of the videos Pocock released, Albo outlines his policy to ban the ads; in the other, Dutton gives the PM the Coalition’s full support.

Putting aside the fact that these two can’t reach agreement on anything, the videos have the potential to trick many.

Had the fake videos instead shown the leaders disagreeing on the issue, they could falsely create a point of difference and, in so doing, sway votes one way or the other.


Pocock’s videos are crudely made but still good enough to fool far too many unsuspecting voters.

If such a ploy were unleashed during the next federal election, it might take an inordinate number of extra media appearances to convince the electorate of the truth.

That’s why the Federal Parliament should act swiftly to outlaw the use of deepfake AI in election material.

There should be huge penalties for flouting any such laws.

Pocock and others (mostly other independents) have been trying to get the rest of the parliament to take this issue seriously for some time now.

Enquiries, petitions and conferences have so far only led the government to repeat ad nauseam that it’s taking advice.

The Australian Electoral Commission has been more than clear in its statements that there is nothing it can do – it’s a matter for the parliament.


In some parts of the world, at least, laws are being introduced and countermeasures trialled. What’s going on in Australia?

Not much.

While the government keeps consultations going over “AI guardrails”, bad actors are planning their assaults on democracy.

Used properly, AI has its place in government, in business and in the community.

But there is nothing good to be found in an AI deepfake used in the context of an election campaign.

The government has moved recently to act against those who would use deepfakes as a sexual weapon.

Its legislation creates new criminal offences banning the sharing of non-consensual sexually explicit deepfake material, imposing serious criminal penalties on those who share such material without consent – including material digitally created using artificial intelligence or other technology.

“Digitally created and altered sexually explicit material that is shared without consent is a damaging and deeply distressing form of abuse,” Attorney-General Mark Dreyfus said when introducing his bill.

“This insidious behaviour can be a method of degrading, humiliating and dehumanising victims. Such acts are overwhelmingly targeted towards women and girls, perpetuating harmful gender stereotypes and contributing to gender-based violence.”

That’s a good move. The government has stepped up on that front.

Now to election material. Why can’t it be as decisive?

Let’s amend the criminal code and update electoral laws to criminalise the use of AI deepfakes for the blatant purposes of deception in the context of election campaigns.

And let’s do it before the next federal election.

The major parties don’t seem willing to take on the gambling lobby (which makes Pocock’s video topics timely).

But what about tackling AI fraudsters?

Maybe the two leaders can make some very real announcements soon, explaining how they have reached agreement on that issue.

Join the conversation
Trish O'Connor, 5:23 pm 10 Sep 24

A question – why is it called “deepfake”, and what is the difference between that and “fake”? Is it just for sensationalism?

It’s a specific way of making fake pictures using a type of machine learning called deep learning. Deep Learning + fake = Deep Fake.

I think we should go further than the columnist is suggesting and ban all deepfakes, whether or not they are sexual or in an election context. What possible justification can there be for making it look like someone said something that they never said?

It shouldn’t be taken seriously. Only a complete moron can’t spot AI generated videos. Stop pandering to the lowest common denominator and attempting to ban everything.

Don’t kid yourself, you don’t have a chance of spotting good effects/AI.
And political parties exempting themselves from truth in advertising laws tells you everything you need to know.
