Computers, rather than conmen, are set to be the future face of fraud as criminals turn to artificial intelligence in an effort to avoid detection.
According to the latest report from KPMG, organisations are set to do battle with so-called ‘seeker bots’: self-learning, self-replicating Artificial Intelligence that will make the criminals behind a fraud effectively invisible.
KPMG’s Profile of a Fraudster report is based on an analysis of 596 fraudsters investigated by the firm between 2011 and 2013. Based on the modus operandi of these crimes, the report predicts that the traditional fraudster (identified by KPMG as aged 36–45, holding an executive position and acting against their own organisation) will be replaced by ‘seeker bots’.
Infographic produced by KPMG
These ‘bots’ will be designed to continuously probe a company’s cyber defences for a ‘hole in the fence’, meaning that attempts to second-guess or pre-empt the tactics of human attackers will not always be worthwhile.
The KPMG report warns that, on finding a gap, the bots will analyse the potential for fraud and then launch a highly specialised ‘attack bot’ uniquely designed to suit the type of business, size, infrastructure and data set-up of the victim. The ultimate aim will be to remove assets to a virtual delivery location which can then be accessed by the fraudsters.
Taste of things to come
Hitesh Patel, UK head of forensics at KPMG, commented: “This is not science fiction, but a taste of things to come. We are already seeing highly trained hackers link up with the organised crime network. The ‘faceless’ criminal is not far away. Cyber crime is already on the rise and we expect cyber attacks and high-tech fraud to grow exponentially.”
KPMG’s report argues that, to unravel the frauds of the future, the best investigators will be those able to reduce large amounts of data to identifiable events. Yet some skills will remain as relevant tomorrow as they are today: successful defence will require the ability to operate seamlessly across borders and to share corporate intelligence, so that quick historical and geographical reach lets organisations track ‘bot behavioural patterns’ as swiftly as they emerge.
At the same time, the report reveals that the criminals behind the changing face of fraud are collaborative by nature, preferring to collude with others rather than fitting the perceived stereotype of the reclusive loner.
The data shows that the proportion of cases involving collusion rose from 32% in the 2007 survey to 61% in 2011 and 70% this year. In many cases, perpetrators were highly respected (39% of all cases analysed), regarded as sociable (35%) and/or extroverted (33%).
Patel added: “A few years ago, hackers were motivated by political objectives and seen as disruptive influences targeting computer networks to make an ideological point. Most were seen as individuals trying to make a name for themselves. However, with an ability to master Artificial Intelligence, it’s only a matter of time until the fraudsters harness the full power of technology to enrich themselves and criminal organisations. That is unless legitimate businesses take steps to defend themselves.”
He concluded: “A plausible person is no longer needed to present a stolen cheque to a bank teller. All that’s needed is a hacker who can access a protected computer network. Perhaps human features and emotions will no longer be a significant part of the profile. Instead, electronic features, signatures and behaviours may be all that a victim organisation will know of the cyber fraudster.”