David Pocock has just proven a point that should be taken seriously by the whole Federal Parliament.
The independent senator for the ACT spelled out clearly what we already knew – elections can be won or lost by the fraudulent use of artificial intelligence.
He just showed us how easily that could be done.
Pocock’s little experiment over the weekend – releasing deepfake videos he had created of the Prime Minister and the Opposition Leader agreeing with each other on a policy issue, and one they can’t even find agreement on within their respective parties – should have triggered alarm bells at Labor and Liberal party HQs.
The issue, of course, was banning gambling ads.
Even though it’s unlikely that either major party is going to stand up to the gambling lobby, many people could easily be convinced that, at the height of an election campaign, they might just find some gumption on the issue.
In one of the videos Pocock released, Albo outlines his policy to ban the ads; in a separate video, Dutton gives the PM the Coalition’s full support.
Putting aside the fact that these two can’t reach agreement on anything, the videos have the potential to trick many.
Had the fake videos instead shown the leaders disagreeing on the issue, they could just as easily have created a false point of difference and, in so doing, swayed votes one way or the other.
Pocock’s videos are crudely made but still good enough to fool far too many unsuspecting voters.
If such a ploy were unleashed during the next federal election, it might take an inordinate number of extra media appearances to convince the electorate of the truth.
That’s why the Federal Parliament should act swiftly to outlaw the use of deepfake AI in election material.
There should be huge penalties for flouting any such laws.
Pocock and others (mostly other independents) have been trying to get the rest of the parliament to take this issue seriously for some time now.
Enquiries, petitions and conferences have so far only led the government to repeat ad nauseam that it’s taking advice.
The Australian Electoral Commission has been more than clear in its statements that there is nothing it can do – it’s a matter for the parliament.
In some parts of the world, at least, laws are being introduced and countering tactics trialled. What’s going on in Australia?
Not much.
While the government keeps consultations going over “AI guardrails”, bad actors are planning their assaults on democracy.
AI, used properly, has a place in government, in business and in the community.
But there is nothing good to be found in an AI deepfake used in the context of an election campaign.
The government has moved recently to act against those who would use deepfakes as a sexual weapon.
Its legislation creates new criminal offences to ban the sharing of non-consensual deepfake sexually explicit material.
It will impose serious criminal penalties on those who share sexually explicit material without consent. This includes material that is digitally created using artificial intelligence or other technology.
“Digitally created and altered sexually explicit material that is shared without consent is a damaging and deeply distressing form of abuse,” Attorney-General Mark Dreyfus said when introducing his bill.
“This insidious behaviour can be a method of degrading, humiliating and dehumanising victims. Such acts are overwhelmingly targeted towards women and girls, perpetuating harmful gender stereotypes and contributing to gender-based violence.”
That’s a good move. The government has stepped up on that front.
Now to election material. Why can’t it be as decisive?
Let’s amend the criminal code and update electoral laws to criminalise the use of AI deepfakes for the blatant purpose of deception in the context of election campaigns.
And let’s do it before the next federal election.
The major parties don’t seem willing to take on the gambling lobby (which makes Pocock’s video topics timely).
But what about tackling AI fraudsters?
Maybe the two leaders can make some very real announcements soon, explaining how they have reached agreement on that issue.