Interest is growing in the use of RegTech solutions in financial crime, and this will be a major topic at FinCrime World Forum next month. Here we take a look at some of the major technologies and techniques that could be utilised, as well as some of the barriers to adoption
There now seems little debate that we are on the cusp of a regulatory technology (RegTech) boom in financial crime, particularly when it comes to onboarding and meeting Know Your Customer (KYC) requirements.
Indeed, a recent report by Juniper Research estimated that global spending on RegTech will nearly quadruple to $130 billion by 2025. This growth is expected to be driven primarily by the use of machine learning and artificial intelligence to onboard customers and to verify and authenticate identities.
Next month’s FinCrime World Forum will look at trends around RegTech, including the response of regulators, the opportunities and barriers to adoption.
To set the scene, we’ve picked out some of the key technologies, techniques and issues in the RegTech space.
Machine learning and AI
This is the big one. AI and machine learning are most effective when applied to large amounts of data and to tasks involving pattern recognition.
Given the huge datasets larger financial institutions work with on a daily basis, and their requirements to monitor multitudes of transactions and carry out KYC, it is not surprising this technology is of interest to anti-money laundering (AML) compliance teams across the world.
As Janet Bastiman, a speaker at the previous FinCrime World Forum event in March, said: “One of the things that AI does really well is that it’s very good at spotting patterns and anomalies in large amounts of data.”
This ability to screen millions of data points very quickly and find what’s correlated or what is unusual is vital.
A human might easily miss the fact that a metric for a client does not fit the typical pattern, particularly if the change is subtle.
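To make that idea concrete, here is a minimal sketch of anomaly detection on transaction data, using scikit-learn’s IsolationForest as one off-the-shelf approach. The feature names, figures and thresholds are entirely synthetic assumptions for illustration, not taken from any vendor’s product.

```python
# Illustrative sketch only: a simplified view of how a monitoring system
# might flag unusual transactions. Features and data are synthetic.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Synthetic "normal" history: [amount, hour_of_day, transactions_that_day]
history = np.column_stack([
    rng.normal(120, 30, 5000),   # typical payment amounts
    rng.normal(13, 3, 5000),     # mostly daytime activity
    rng.poisson(3, 5000),        # a few transactions per day
])

model = IsolationForest(contamination=0.01, random_state=0).fit(history)

# New activity: one routine payment, one subtle outlier a human might miss
new_activity = np.array([
    [125, 14, 3],      # looks routine
    [480, 3, 11],      # larger amount, 3am, unusually busy day
])
flags = model.predict(new_activity)   # -1 = anomalous, 1 = normal
print(flags)
```

In practice, of course, compliance teams work with far richer feature sets and combine such scores with rules and human review, but the underlying pattern-spotting step looks broadly like this.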
KYC can also become “perpetual”: if a client’s risk profile changes, technology can help identify this and ensure records are kept up to date.
Machine learning’s ability to learn and train itself is also much heralded.
There are, however, dangers with this technology that regulators across the world are beginning to grapple with.
If machine learning is trained on data that, for whatever reason, is not perfect, its effectiveness may be impacted. This can lead to problems such as “overfitting”, where the model learns a pattern that is prevalent in the training data but not in real-world data.
“AI systems are ultimately only as good as the data feeding into them - and small flaws will make decisions unreliable,” Guy Harrison, General Manager of Dow Jones Risk & Compliance, wrote for GRC World Forums last month.
Similarly, care needs to be taken to ensure the algorithms governing the parameters a machine is working with are aligned with the realities of the financial crime compliance outcome you are trying to achieve.
There are also questions about AI’s ability to understand linguistic or contextual nuances. Humans therefore need to be in the loop, and a key question over the next few years will be identifying the areas where people need to be involved.
Another issue around the use of AI and machine learning is “explainability”. Financial institutions need to be able to explain to regulators (and to themselves) how their AI functions and how it has arrived at its conclusions. Five US federal regulators last month published a public Request for Information (RfI) asking for input on ways to regulate AI, and made it clear that explainability is a key concern. The regulators said a lack of explainability can inhibit an organisation’s understanding of the “conceptual soundness” of the AI, which can increase uncertainty around its reliability and increase risk when it is used in new contexts. It can also make audit and review more challenging and pose challenges to data protection requirements.
The five regulators’ RfI also raises issues about oversight of third parties, the increased intensity of data processing and cybersecurity risks.
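One common way compliance teams approach the explainability question is to measure how much each input feature drives a model’s decisions. The sketch below uses permutation importance on a toy risk model; the feature names and data are hypothetical and synthetic, and this is only one of several possible techniques, not anything prescribed by the regulators’ RfI.

```python
# A minimal sketch of one "explainability" approach: permutation importance,
# which measures how much model performance drops when a feature is shuffled.
# Feature names and labels are hypothetical examples.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
features = ["txn_amount", "country_risk", "account_age_days", "night_activity"]

X = rng.normal(size=(2000, 4))
# Synthetic labels: risk driven mainly by amount and country risk
y = ((0.8 * X[:, 0] + 1.2 * X[:, 1] + rng.normal(scale=0.5, size=2000)) > 1).astype(int)

model = LogisticRegression().fit(X, y)
result = permutation_importance(model, X, y, n_repeats=20, random_state=0)

# Rank features by how much shuffling them hurts the model
for name, score in sorted(zip(features, result.importances_mean),
                          key=lambda pair: -pair[1]):
    print(f"{name:>18}: {score:.3f}")
```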
Behavioural biometrics
Machine learning need not be limited to raw data, though.
There is increasing excitement in the RegTech industry about how machine learning and AI can be combined with behavioural biometrics to give further insight into possible illicit activity, particularly fraud.
An individual user’s interactions with their bank can be dynamically analysed, and machine learning technology can compare them against the typical behaviour it has learned for that user. Tim Ayling, vice president, Europe, Middle East & Africa, at Revelock, last year suggested different interactions could be brought together to create a “BionicID” or footprint.
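A toy sketch of that baseline idea follows. It is not Revelock’s “BionicID” or any real product; the per-session features (typing speed, touch pressure, swipe speed) and numbers are assumptions chosen purely to illustrate comparing a new session against a user’s learned norm.

```python
# Toy sketch: learn a user's typical interaction profile, then score
# new sessions against it. All features and values are hypothetical.
import numpy as np

# Hypothetical per-session features: typing speed (keys/sec),
# average touch pressure, swipe speed (px/sec)
past_sessions = np.array([
    [4.1, 0.62, 310],
    [3.9, 0.60, 295],
    [4.3, 0.65, 330],
    [4.0, 0.61, 305],
])

baseline_mean = past_sessions.mean(axis=0)
baseline_std = past_sessions.std(axis=0)

def session_score(session):
    """Mean absolute z-score: how far this session sits from the user's norm."""
    z = np.abs((session - baseline_mean) / baseline_std)
    return z.mean()

print(session_score(np.array([4.2, 0.63, 315])))   # close to baseline -> low score
print(session_score(np.array([7.5, 0.30, 120])))   # very different -> flag for review
```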
Further down the line there may be more exciting possibilities. Becky Marriott, vice president of risk and compliance at banking services provider Tide, has previously talked about the prospect of behavioural biometrics being able to determine, for instance, whether somebody is holding their phone in a different way or putting different weight on the buttons, or even whether somebody is under duress.
Blockchain analytics
There has been an awful lot written over the past few years about the pseudonymous nature of cryptocurrencies and how this can create challenges for financial crime compliance teams, particularly when currency passes through tumblers or exchanges.
While this is of course a concern, the flipside is that if identifiers on the blockchain ledger can be de-anonymised, transactions can be traced at a micro-level.
Law enforcement agencies can, if the technology is effective, see multiple transactions without having to request or subpoena information from different banks along the way.
Amanda Wick, Chief of Legal Affairs at Chainalysis, said in March, “At the highest level, you can see the ecosystem, you can see how funds are moving…the visibility is just insane and carries huge opportunities for compliance.”
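The core mechanic being described is following funds hop by hop across a public ledger. The sketch below walks outgoing transactions from a flagged address; the addresses and amounts are invented, and real blockchain analytics tools layer clustering, exchange attribution and many other heuristics on top of this basic traversal.

```python
# Illustrative only: tracing funds by following transaction edges from a
# flagged address. Addresses and amounts are made up for the example.
from collections import deque

# Directed edges: (from_address, to_address, amount)
transactions = [
    ("addr_ransom", "addr_mixer1", 5.0),
    ("addr_mixer1", "addr_mule_a", 2.4),
    ("addr_mixer1", "addr_mule_b", 2.5),
    ("addr_mule_a", "addr_exchange", 2.4),
    ("addr_unrelated", "addr_exchange", 1.0),
]

def trace(start):
    """Breadth-first walk over outgoing transactions from a flagged address."""
    seen, queue, path = {start}, deque([start]), []
    while queue:
        current = queue.popleft()
        for src, dst, amount in transactions:
            if src == current and dst not in seen:
                path.append((src, dst, amount))
                seen.add(dst)
                queue.append(dst)
    return path

for hop in trace("addr_ransom"):
    print(hop)   # each hop the funds took, all the way to the exchange
```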
For this reason RegTech firms offering blockchain analytics services can expect to be in demand over the coming years.
Privacy enhancing technologies
Attend any forum or debate on financial crime and the chances are somebody will talk about the need for greater information-sharing and cross-sector partnership working (indeed, FinCrime World Forum has various sessions on these topics).
One of the chief barriers though has been concerns about data protection and privacy requirements. How can financial institutions routinely share data about transactions with each other, and with regulators, without infringing on the data rights of individual customers?
This dilemma is fuelling a surge of interest in privacy-enhancing technologies (PETs): RegTech that can protect the underlying personally identifiable data through encryption and other techniques, while still producing useful aggregated or anonymised outputs that can indicate which customers are likely to be involved in high-risk activity.
Homomorphic encryption is one such PET generating interest. It uses cryptographic techniques to enable third parties to process encrypted data and return an encrypted result to the data owner, while gaining no knowledge of the data or the result.
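As a rough illustration of the property that matters here, the following is a toy Paillier-style scheme with tiny, insecure parameters (real deployments use very large keys and hardened libraries). It shows a third party adding two encrypted values without ever seeing the underlying numbers.

```python
# Toy Paillier-style scheme (tiny primes, illustration only, NOT secure):
# multiplying two ciphertexts corresponds to adding the plaintexts.
import math
import random

p, q = 241, 251                      # toy primes; real keys use very large primes
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)                 # valid because the generator is n + 1

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (((pow(c, lam, n2) - 1) // n) * mu) % n

# Two parties encrypt their figures; an aggregator multiplies the ciphertexts,
# which adds the plaintexts, without learning either input.
c1, c2 = encrypt(1200), encrypt(345)
combined = (c1 * c2) % n2
print(decrypt(combined))             # 1545, computed without revealing 1200 or 345
```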
Zero-knowledge proof mechanisms are also much talked about. These vary, but they use mathematical models to create a probabilistic assessment of whether something is true. They provide small pieces of unlinkable information that, when put together, show the prover’s assertion to be overwhelmingly probable, without revealing the underlying personal data.
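One classic example is Schnorr’s identification protocol, sketched below with toy-sized numbers: the prover convinces the verifier it knows a secret without revealing it, and repeating the exchange drives a cheater’s chance of passing towards zero. Real deployments use much larger parameters and usually non-interactive variants; this is only a minimal illustration of the principle.

```python
# Minimal interactive zero-knowledge proof (Schnorr's protocol, toy numbers):
# prove knowledge of x with y = g^x mod p, without revealing x.
import random

p, q, g = 467, 233, 4        # toy group: g has prime order q modulo p

secret_x = 57                # prover's secret
public_y = pow(g, secret_x, p)

def run_protocol():
    # 1. Commitment: prover picks a random nonce and sends t = g^k
    k = random.randrange(1, q)
    t = pow(g, k, p)
    # 2. Challenge: verifier sends a random c
    c = random.randrange(1, q)
    # 3. Response: prover sends s = k + c*x (mod q); s reveals nothing about x
    s = (k + c * secret_x) % q
    # 4. Check: g^s should equal t * y^c (mod p)
    return pow(g, s, p) == (t * pow(public_y, c, p)) % p

# Each repetition makes it overwhelmingly probable the prover really knows x
print(all(run_protocol() for _ in range(20)))
```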
All eyes are currently on Transaction Monitoring Netherlands, a project involving five Dutch banks which aims to move towards collectively monitoring the banks’ combined 12 billion transactions a year. The use of PETs is being explored to make this work effectively.
Two of the banks involved, ABN Amro and Rabobank, have worked with scientific research organisation TNO to explore a multi-party computation solution that allows the banks to determine which of their clients may be involved in high-risk transactions, without accessing individual risk scores held by other banks or infringing customers’ privacy. The personal data is encrypted, split up and fed into an algorithmic system. In April the banks reported “promising” results from a test of the system on synthetic data.
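The “split up” step typically rests on secret sharing, the general idea behind many multi-party computation schemes. The sketch below shows additive secret sharing with made-up bank names and scores; it illustrates the technique in general, not the actual TMNL/TNO design.

```python
# Simplified sketch of additive secret sharing: each bank splits a client's
# risk score into random shares, so no single party ever sees an individual
# score, yet the combined total can still be computed.
import random

MODULUS = 2**61 - 1   # all arithmetic is done modulo a large number

def split_into_shares(value, n_parties):
    """Split a value into n random shares that sum to the value (mod MODULUS)."""
    shares = [random.randrange(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

# Three hypothetical banks each hold a private risk score for the same client
private_scores = {"bank_a": 12, "bank_b": 55, "bank_c": 3}
n = len(private_scores)

# Each bank splits its score and distributes one share to every party
all_shares = [split_into_shares(score, n) for score in private_scores.values()]

# Each party locally sums the shares it received (one column each)...
partial_sums = [sum(col) % MODULUS for col in zip(*all_shares)]

# ...and only the combined total is revealed, never any individual score
combined_risk = sum(partial_sums) % MODULUS
print(combined_risk)   # 70
```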
Facial biometrics and liveness detection
Expect a continued shift towards using ‘selfies’ as a means of identification when accessing banking services online. To guard against spoofing, RegTech firms are also increasingly using liveness detection, to ensure that the images used are of the real living and breathing customer. This often involves challenging the customer to move or blink.
Barriers to adoption?
So there are many possibilities for the use of RegTech to improve transaction monitoring, KYC, onboarding, partnership working and information-sharing to tackle financial crime.
However, the pace of adoption is held back by several factors.
There is currently a huge amount of choice, with many niche products on the market, and it may be that some financial institutions are holding back until RegTech solutions are more established, perhaps offering a suite of products to remove the need to invest in several solutions for different problems.
Another issue is data siloes, such as keeping Know Your Customer and onboarding data separate from transaction monitoring information. Some of the reasons for this are cultural, but there are also data protection and privacy concerns which may hold back efforts to bring data together.
The City of London Corporation in the United Kingdom in April published a report with RegTech Associates (whose co-founder Dr Sian Lewin is speaking at FinCrime World Forum) arguing for more action to quicken the pace of RegTech adoption.
Recommendations included building awareness of RegTech’s ability to scale through an independent testing and accreditation regime; regulators adopting a tech-embracing stance to advocate for improved standards; and the establishment of a “coherent and collective voice for the industry” to represent the UK RegTech sector.
The report also said financial institutions should be incentivised to upgrade from legacy technology, and that procurement processes should be streamlined.
FinCrime World Forum sessions on “Aligning RegTech to FinCrime realities”
- Tuesday 22nd June, 2:30 pm
United We Stand: The potential for integrated FinCrime teams and controls
- Tuesday 22nd June, 3:20 pm
Beyond the Banking Bubble: Leveraging technology lessons from other industries
- Tuesday 22nd June, 4:10 pm
Pushing for the RegTech Revolution: Removing the barriers to adoption in FinCrime
- Tuesday 22nd June, 5:00 pm
The Next Step with RegTech: A panel interview with the Napier team
To find out more about these sessions, click here.