In recent years there has been a lot of talk about perpetual (or continuous) Know Your Customer (KYC), the industry's term for its objective of moving away from painful and ineffective periodic review processes. Very few banks, however, have made much progress in reducing this burden.
Banks typically manage to scrape through their one-year review cycles for their high-risk customers but hardly ever complete their three- and five-year cycles for medium- and low-risk customers respectively. Often, they still depend heavily on bringing in external resources to get the work done and avoid significant fines. To achieve perpetual KYC, banks need to be able to continuously monitor and refresh three main pillars of data: customer data, external data and transaction data.
Drawing on research from Oliver Wyman's global KYC benchmarking study on Global Banking & Markets KYC, we look at how banks can reduce onboarding times and KYC refresh costs, and improve the customer experience, by leveraging better technology and automation across the three pillars of data.
Customer data is frequently stored in a variety of databases across the bank, which means there is no single source of truth for the data. Moreover, most customers fail to update their details when a person's or company's circumstances change, so much of the data held by banks is out of date. This poor data quality has serious implications; most importantly, it leads to significant inaccuracies in a customer's risk profile.
The main challenge banks face here is that, for many years, they have been running their KYC data validation and enrichment processes manually, even where they have implemented expensive homegrown or vendor-built Customer Lifecycle Management (CLM) solutions. Manual processes have serious implications for the reusability of any data collected during onboarding or during the previous round of periodic review.
External data, once collected, is therefore typically not stored in a structured form but in paper filings with written annotations. What many banks collect lacks any level of auditability, and there is typically no clear data provenance. The next round of periodic review therefore means starting from scratch and redoing the work completely.
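To make prior work reusable, each externally sourced fact would need to be captured as a structured record that carries its own provenance. Below is a minimal sketch of what such a record could look like; the field names, the registry example and the one-year freshness threshold are illustrative assumptions, not a description of any particular bank's or vendor's data model.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ExternalDataPoint:
    """One validated fact about a customer, with full provenance attached."""
    customer_id: str
    attribute: str        # e.g. "registered_address" or "beneficial_owner"
    value: str
    source_name: str      # e.g. a corporate registry or news archive
    source_url: str       # where the evidence was retrieved from
    retrieved_at: datetime
    retrieved_by: str     # the analyst or automated job that collected it

def is_reusable(point: ExternalDataPoint, max_age_days: int = 365) -> bool:
    """Prior evidence can seed the next review if it is still fresh enough."""
    age = datetime.now(timezone.utc) - point.retrieved_at
    return age.days <= max_age_days

# Evidence captured during onboarding can then be audited and reused later,
# instead of being redone from scratch at the next periodic review.
record = ExternalDataPoint(
    customer_id="C-1042",
    attribute="registered_address",
    value="1 Example Street, London",
    source_name="Companies House",
    source_url="https://example.org/evidence/C-1042",  # placeholder URL
    retrieved_at=datetime(2024, 3, 1, tzinfo=timezone.utc),
    retrieved_by="onboarding-analyst-07",
)
print(is_reusable(record))
```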
Reviewing transaction data, by contrast, cannot be done during onboarding, as for obvious reasons the customer will not yet have a transaction history with the bank at that point. The main objective here is to identify any behavioural patterns that do not match the customer's profile and risk profile: do we see any anomalies relative to their expected behaviour?
This is an important part of the process where AML and KYC really should come together. A tie-up, or horizontal approach, between the two makes sense, as a transaction monitoring alert cannot be properly reviewed without access to good-quality, validated customer data. The bank needs a good view of a customer's transactions, as well as the customer's profile and risk profile, for its operations team to assess risk effectively.
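To illustrate that join concretely, the sketch below triages a monitoring alert against validated KYC context before anyone reviews it. The data shapes, the one-year staleness test and the routing labels are all hypothetical, intended only to show why the two data sets have to meet.

```python
from dataclasses import dataclass

@dataclass
class CustomerProfile:
    customer_id: str
    risk_rating: str                  # e.g. "low", "medium" or "high"
    expected_jurisdictions: set[str]  # where activity is expected, per KYC
    last_validated_days_ago: int      # age of the last external validation

@dataclass
class TransactionAlert:
    customer_id: str
    jurisdiction: str
    amount: float

def triage_alert(alert: TransactionAlert, profile: CustomerProfile) -> str:
    """Route an AML alert using validated KYC context, not in isolation."""
    if profile.last_validated_days_ago > 365:
        # Stale KYC data: the alert cannot be meaningfully assessed yet.
        return "refresh-kyc-first"
    if alert.jurisdiction not in profile.expected_jurisdictions:
        # The activity does not match the customer's known profile.
        return "escalate"
    return "close"  # behaviour is consistent with expectations
```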
There are very few banks where KYC and AML data is successfully shared. I've also seen various instances where banks lack a consolidated view of all a customer's incoming and outgoing transactions. Even where banks do succeed here, assessing any uncovered anomalies is challenging if the bank lacks a good understanding of who their customers are and no external data validation has occurred for a number of years.
Considering the difficulties many banks are facing, it is understandable that they struggle with periodic review. Even where they complete it, it is typically more a box-ticking exercise to keep the regulator happy than a well-thought-out process that effectively helps identify risk within the bank's customer population. To most banks, a perpetual KYC process, where the bank only has to respond to detected triggers, remains a stated objective rather than a realistic goal. Such a trigger could be a change in a customer's self-reported data, an externally detected change in their circumstances, or an anomaly in their transaction behaviour.
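In practice, an event-driven process boils down to routing that small set of trigger types to the right follow-up action. A minimal sketch, assuming hypothetical trigger names and responses:

```python
from enum import Enum, auto

class Trigger(Enum):
    SELF_REPORTED_CHANGE = auto()  # the customer updates their own details
    EXTERNAL_CHANGE = auto()       # a change detected in an external source
    TRANSACTION_ANOMALY = auto()   # behaviour deviates from the profile

def handle_trigger(customer_id: str, trigger: Trigger) -> str:
    """Review a customer only when a trigger fires, not on a fixed cycle."""
    if trigger is Trigger.TRANSACTION_ANOMALY:
        return f"open targeted review for {customer_id}"
    if trigger is Trigger.EXTERNAL_CHANGE:
        return f"re-validate the affected attributes for {customer_id}"
    return f"confirm and record the self-reported update for {customer_id}"
```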
It is not all doom and gloom, however, as there are now a number of solutions on the market to make it easier to share customer data internally and ensure one golden source of customer data. Banks have also come up with innovative solutions to leverage any touchpoint they have with their customer (whether via online banking, the ATM or at the counter) to get their customers to refresh their KYC data.
Similarly, an increasing number of vendors now offer analytical solutions that not only identify rule-based AML patterns but also help build a better understanding of the customer and their risk profile. For example, if we see a customer conducting a lot of transactions within certain jurisdictions or industries, that need not be an indication of AML risk in itself, but it may well be a reason for the bank to raise the customer's risk level. It is then vitally important to connect the most up-to-date validated and enriched customer data available, so that an analyst can truly understand whether a detected transaction pattern matches the customer's known (risk) profile.
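The jurisdiction example above is essentially a simple concentration rule layered on top of validated customer data. One hypothetical form it could take is sketched below; the placeholder country codes, the 50% threshold and the rating bands are illustrative assumptions rather than a real scoring model.

```python
from collections import Counter

# Placeholder codes standing in for whatever the bank deems higher risk.
HIGHER_RISK_JURISDICTIONS = {"XX", "YY"}

def suggest_risk_rating(transactions: list[dict], current_rating: str) -> str:
    """Suggest a rating uplift when activity concentrates in risky areas.

    Heavy activity in higher-risk jurisdictions is not an AML hit by
    itself, but it can justify moving the customer up a risk band.
    """
    if not transactions:
        return current_rating
    counts = Counter(t["jurisdiction"] for t in transactions)
    risky = sum(counts[j] for j in HIGHER_RISK_JURISDICTIONS)
    if risky / len(transactions) > 0.5:
        return {"low": "medium", "medium": "high"}.get(current_rating,
                                                       current_rating)
    return current_rating

# Example: a low-risk customer whose recent activity is mostly in "XX".
txns = [{"jurisdiction": "XX"}, {"jurisdiction": "XX"}, {"jurisdiction": "GB"}]
print(suggest_risk_rating(txns, "low"))  # -> "medium"
```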
As for external data, an increasing number of solutions are now available to support a more structured, automated and reusable process for validating and enriching customer data. This is the primary reason why many banks are now coming to Arachnys. Being able to leverage external data in a much smarter and more cost-effective way is already a no-brainer for banks looking to reduce the cost of their KYC operations. The ability to reduce the burden of periodic review by reusing prior data is another key selling point. Our ability to proactively monitor external unstructured data sources for changes in a customer's profile, however, is a game-changer: it is the primary building block for achieving true event-driven perpetual KYC.