
Menacing messages to be targeted by AI banking tool

Jennifer Dudley-Nicholson, AAP
Commonwealth Bank will share anti-abuse technology with other financial institutions. (Erik Anderson/AAP PHOTOS)

One of Australia's major banks plans to share an artificial intelligence tool that can identify hidden threats used to harass victims through their bank accounts.

Developed by Commonwealth Bank, the tool uses machine learning and natural language processing to identify offensive, harassing and threatening messages even when perpetrators avoid profanities or other banned words.
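The bank has not published details of its model, but a minimal sketch of the general approach described here, a supervised text classifier trained on labelled transaction descriptions rather than a fixed blocklist, might look like the following. The training phrases, threshold and scikit-learn pipeline are illustrative assumptions, not the bank's implementation.

```python
# Illustrative sketch only: a simple supervised classifier that scores
# transaction descriptions for abusive intent, rather than matching a
# fixed list of banned words. The training phrases and threshold are
# made-up assumptions, not Commonwealth Bank's model or data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny hypothetical training set: 1 = abusive in context, 0 = benign.
messages = [
    "I know where you live",
    "if you don't unblock me I'm turning up",
    "have you checked the dogs today",
    "rent for march",
    "thanks for dinner",
    "splitting the electricity bill",
]
labels = [1, 1, 1, 0, 0, 0]

# Character n-grams can help catch deliberate misspellings used to dodge filters.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(),
)
model.fit(messages, labels)

def flag_transaction(description: str, threshold: float = 0.7) -> bool:
    """Return True if the description warrants review by a specialist team."""
    return model.predict_proba([description])[0][1] >= threshold
```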

The bank has pledged to share its AI model with other financial institutions in a move one executive said could stop "insidious and threatening forms of abuse" that risked going undetected.

The new tool comes three years after CommBank revealed the scope of technology-facilitated abuse in its systems and began blocking offensive descriptions attached to low-value deposits.


Human Resources group executive Sian Lewis said the institution had blocked almost one million threatening messages since introducing its automatic filter in 2020.

But she said some threats were being reworded to slip through the filter, prompting the bank to develop a better way of identifying them.

"We got our AI and data analytics team involved," she said.

"It starts to scan messages at a large scale... and we can now, as the machine continues to learn, narrow that down to get even better and better at identifying those abusive transactions."

CommBank customer vulnerability head Caroline Wall said identifying threats was difficult as many were couched in everyday language and were only threatening in context.

"Some of the most insidious and threatening forms of abuse that we see actually don't contain profanity, they contain things like 'I know where you live' or 'have you checked the dogs today' or 'if you don't unblock me, I'm turning up'," she said.

"Those kinds of things cannot be identified by a word block because they are unique to the individual experiencing the abuse."

Ms Wall said the AI tool profiled relationships between customers over one to three months and flagged "high-risk cases" to a team of specialists who could contact potential victims and offer support.
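Ms Wall's description suggests scores are aggregated per sender-recipient relationship over a window rather than judged message by message. A rough sketch of that idea follows; the window length, thresholds and the pre-computed abuse score are assumptions for illustration, not CommBank's actual logic.

```python
# Illustrative sketch: aggregate per-message abuse scores between a pair of
# customers over a rolling window, and flag the relationship for specialist
# review if the pattern looks high-risk. Window length and thresholds are
# hypothetical, not Commonwealth Bank's actual rules.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Transfer:
    sender: str
    recipient: str
    description: str
    timestamp: datetime
    abuse_score: float  # e.g. output of a classifier like the sketch above

def is_high_risk(transfers: list[Transfer], sender: str, recipient: str,
                 window_days: int = 90, min_flagged: int = 3,
                 score_threshold: float = 0.7) -> bool:
    """Flag a sender/recipient relationship when several recent transfers
    between them carry high abuse scores within the review window."""
    cutoff = datetime.now() - timedelta(days=window_days)
    flagged = [
        t for t in transfers
        if t.sender == sender and t.recipient == recipient
        and t.timestamp >= cutoff and t.abuse_score >= score_threshold
    ]
    return len(flagged) >= min_flagged
```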

The Commonwealth Bank also worked with NSW Police on a pilot to report cases of financial abuse.

Ms Lewis said code for the bank's AI tool would be made available to other financial institutions around the world for free through GitHub in a bid to "stamp out financial abuse".

The Commonwealth Bank also announced partnerships with five not-for-profit organisations to provide support for victims of family and domestic violence.

A national study into technology-facilitated abuse by Monash and RMIT Universities found half of all Australian adults would experience the issue during their lifetimes, with monitoring, threats and harassment the most common forms of abuse.
