
AI Voice Cloning Scams Targeting Australian Businesses

Peter Nelson · 5 min read

Cybercriminals are using AI to clone voices and trick employees into authorising payments. Learn how these scams work and how to protect your team.

A finance manager at a Melbourne professional services firm receives a call. It sounds exactly like the CEO — the accent, the speech patterns, the turns of phrase. The CEO explains they are in an important meeting overseas and need an urgent payment of $85,000 processed today. Written instructions arrive by email shortly after. The finance manager processes the transfer.

The CEO never made that call. The voice was generated by AI from publicly available recordings — a podcast appearance, a video interview, a LinkedIn Live session.

This scenario is not hypothetical. AI voice cloning scams — a form of "vishing" (voice phishing) enhanced by AI-generated audio — have been documented in Australia and are increasing in frequency as the technology becomes cheaper and more accessible.


How AI Voice Cloning Works

Modern text-to-speech AI systems can clone a voice from as little as a few seconds of audio. Feed the model a clip from a public video or podcast, and it can generate new speech in that person’s voice — saying anything the attacker scripts.

The resulting audio is not perfect, but it does not need to be. It only needs to be convincing enough under the circumstances of a phone call: slightly distorted by phone audio quality, the target under time pressure, and the emotional authority of hearing a familiar voice.

Commercially available tools for voice cloning are accessible online, some for free. The barrier to entry for this type of fraud has collapsed.


The Business Email Compromise (BEC) Connection

AI voice cloning is most dangerous when combined with business email compromise — where attackers have compromised or spoofed a senior executive’s email account.

The attack sequence:

  1. Attacker researches the target organisation (LinkedIn, website, public information) to identify the CEO/CFO and the finance function
  2. Attacker clones the CEO’s voice from public recordings
  3. Attacker calls the finance manager, impersonating the CEO with the cloned voice
  4. Attacker follows up with a spoofed email appearing to come from the CEO’s address
  5. Finance manager, having heard the CEO’s voice, treats the email as legitimate confirmation

The dual channel — voice and email — is what makes this attack so effective. A single spoofed email triggers healthy scepticism. A phone call from the “CEO’s voice” followed by a confirming email bypasses that scepticism.


Red Flags to Train Your Team On

Urgency and pressure: Legitimate payment requests from senior executives follow normal processes. “This needs to happen today before close of business” is a pressure tactic.

Request to bypass normal procedures: Any request that starts with “don’t go through the usual process” should stop the transaction.

New or changed payment details: Any payment to a new account, or a change to an existing supplier’s bank details, requires verification through a separate, independently established channel.

Calling from an unusual number: Attackers often spoof caller ID. A call from the CEO’s mobile number is not verification that it is actually the CEO.

Request for secrecy: “Don’t mention this to anyone until it is done” is a classic social engineering tactic.


Procedural Controls That Stop These Attacks

Technology is less effective than procedure for stopping AI voice cloning attacks. The most reliable defence is a verification protocol that cannot be bypassed regardless of how convincing the impersonation is.

Dual authorisation for payments: Any payment above a defined threshold (say, $5,000) requires approval from two authorised signatories. One person cannot authorise alone, regardless of who instructed them.
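For teams that enforce payment rules in software rather than on paper, the dual-authorisation rule above can be expressed as a simple check. This is a minimal illustrative sketch, not a real payments API — the function name, threshold, and approver IDs are all hypothetical:

```python
# Illustrative dual-authorisation rule for outbound payments.
# The $5,000 threshold matches the example figure in the text above.
THRESHOLD_CENTS = 5_000_00  # $5,000, expressed in cents

def payment_authorised(amount_cents: int, approvers: list[str]) -> bool:
    """A payment clears only if at least one signatory approved it,
    and above the threshold, at least two *distinct* signatories."""
    distinct = len(set(approvers))
    if amount_cents < THRESHOLD_CENTS:
        return distinct >= 1
    return distinct >= 2

# One approver cannot push through an above-threshold payment,
# no matter who instructed them by phone.
print(payment_authorised(85_000_00, ["finance.manager"]))          # False
print(payment_authorised(85_000_00, ["finance.manager", "cfo"]))   # True
```

The point of the `set()` call is that the same person approving twice does not count as dual authorisation — the rule requires two different people.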

Callback verification with independently sourced numbers: If you receive a payment instruction by phone, call back using a number you sourced independently (from the company’s official website or your existing address book) — not a number provided by the caller.

Verbal code words: Some organisations establish a simple code word system — a pre-agreed word that the CEO would include in any urgent out-of-normal-process request. Cloning software cannot know this.

No payment process changes by phone: A standing rule that bank account changes for existing suppliers require written confirmation plus a verification call using independently established contact details. No exceptions.


What to Do if You Suspect a Clone Attack

If a payment has been made and fraud is suspected:

  1. Contact your bank immediately — transfers can sometimes be recalled if acted upon quickly
  2. Report to the Australian Cyber Security Centre (ACSC) via ReportCyber (cyber.gov.au)
  3. Report to the Australian Competition and Consumer Commission’s Scamwatch
  4. Contact Victoria Police if the amount is significant
  5. Preserve all evidence: call recordings (if available), emails, logs

Do not be embarrassed to report — these attacks are sophisticated and specifically engineered to bypass normal scepticism. The ACSC and law enforcement need this data to track and combat these attacks.


Education is the Best Defence

CX IT Services includes security awareness training — covering AI-enabled fraud, phishing, and social engineering — in our managed security offering for Melbourne businesses. Contact us to discuss how to train your team to recognise and respond to these emerging threats.

Free Right Fit Call

Want to Talk Through What This Means for Your Business?

Book a free 15-minute Right Fit Call. No obligation - just a straight conversation about your IT situation.

  • No lock-in contracts - ever
  • Valued at $250 - completely free
  • 4.5-star Google rated
  • Answer in 60 seconds or less

Book Your Free Right Fit Call

Takes about 2 minutes. We'll confirm if we're the right fit - or point you in the right direction.
