APAC’s data intensive regulatory environment is making it increasingly important for banks to adopt a strategic data framework, say Swati Kothari and Jackielou Imperial at Wolters Kluwer.
New and emerging regulations across the Asia-Pacific region are placing greater emphasis on the quality of data required for reporting. The latest wave of more granular reporting requirements is forcing financial institutions to review their data collection and management processes.
The marketplace views this regulatory focus on data quality and granularity as the signal for affected banks to adopt the strategic data frameworks they will need to meet their reporting obligations. The new requirements are challenging. However, firms that embrace long-term solutions and implement flexible frameworks to support the new reporting formats can expect significant benefits in terms of total cost of ownership, agility and flexibility, and future-proofing against unforeseen regulatory change.
Financial institutions are facing a host of enhanced data initiatives from regulators across the APAC region. These include:
- the Hong Kong Monetary Authority’s (HKMA) new Granular Data Repository (GDR) pilot,
- the Monetary Authority of Singapore’s (MAS) Data Collection Gateway,
- the Bank of Thailand’s (BOT) Credit Risk Data Set,
- the Reserve Bank of India’s (RBI) Automated Data Extraction and the Centralized Information Management System projects,
- the Australian Prudential Regulation Authority’s (APRA) new Data Collection Solution, which will replace the existing D2A system, and
- China, where the picture is even more complex, with different data granularity requirements across three separate regulators: the China Banking & Insurance Regulatory Commission (CBIRC), the People’s Bank of China (PBOC) and the State Administration of Foreign Exchange (SAFE).
Meanwhile, banks in the APAC region are busy addressing the large exposure and other capital-based reporting changes introduced by global regulations such as the Basel III reforms’ Capital Adequacy Requirements (CAR) and the Standardized Approach for Counterparty Credit Risk (SA-CCR). These regulations share a heavy calculation component that requires highly granular, consistent data.
These initiatives are raising the bar in terms of the quality, accuracy, flexibility and granularity of transaction data submitted by banks to the regulator, with financial institutions in APAC required to compile data in more detail and with more standardized analytical methods.
A case in point is the HKMA’s GDR pilot, which is expected to be finalised before the end of 2020 and introduces more detailed transaction reporting for banks. The pilot kicked off in the first quarter of 2019 with a number of selected local retail banks. It requires participating banks to complete monthly transaction reports for their Hong Kong offices, as well as any mainland China branches and subsidiaries.
The Granular Data Repository collects information related to two types of loans: residential mortgage loans (RML) and corporate loans. The reporting data grid comprises a wide variety of information, including loan static details, counterparty details, contract details, operational details, collateral details and repayment schedules. In total, around 250 fields are required for submission.
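To make the shape of such a submission concrete, the sketch below models a single RML record grouped by the grid’s categories. The field names here are illustrative assumptions only; the actual grid defines roughly 250 fields, and the HKMA’s field list is not reproduced here.

```python
from dataclasses import dataclass, field

# Illustrative sketch of a GDR-style loan record, grouped by the data
# grid's categories. All field names are hypothetical -- the real grid
# specifies around 250 fields across these groupings.
@dataclass
class GranularLoanRecord:
    # Loan static details
    loan_id: str
    loan_type: str                 # e.g. "RML" or "corporate"
    origination_date: str          # ISO 8601 date string
    # Counterparty details
    counterparty_id: str
    counterparty_sector: str
    # Contract details
    outstanding_balance: float
    currency: str
    interest_rate: float
    # Collateral details (defaulted for unsecured lending)
    collateral_value: float = 0.0
    # Repayment schedule
    repayment_dates: list = field(default_factory=list)

record = GranularLoanRecord(
    loan_id="HK-RML-000123",
    loan_type="RML",
    origination_date="2019-06-30",
    counterparty_id="CPTY-9876",
    counterparty_sector="household",
    outstanding_balance=4_500_000.0,
    currency="HKD",
    interest_rate=0.0225,
    collateral_value=6_000_000.0,
    repayment_dates=["2019-07-31", "2019-08-31"],
)
```

Grouping the fields by category in this way mirrors the grid’s structure and makes it easier to spot which source system should own each group of data elements.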
Experience from the pilot will be used to help design the final GDR, which is currently at an initial development phase. The HKMA is assessing whether the GDR data can replace certain statistical surveys it conducts to assess residential mortgage lending and other activities.
Impact on financial institutions
The heightened emphasis on data among APAC regulators affects participating financial institutions in a number of ways. The new initiatives introduce higher standards of data quality, requiring affected banks to put in place a common data repository and institute robust data governance processes. This need to get the data right accentuates the role of the chief data officer.
Certainly, achieving compliance is not trivial. For example, early indications from participants who have just started filing reports suggest that banks are struggling to populate even half of the roughly 250 data fields required for the HKMA GDR’s transaction reports, and expect it to take several months to fill the gaps in their data sets. In addition to this lack of data-readiness, banks are finding that some field requirements are not clearly defined, which is also causing issues.
It’s clear that the reporting formats introduced by these emerging data granularity requirements will create challenges for regulated institutions. To begin with, regulators have yet to finalise unambiguous reporting models that can encompass all business scenarios. With the emphasis on data quality – in terms of completeness, accuracy and timeliness – it’s unclear whether banks will have the capabilities or resources to be able to meet the requirements for providing ‘error-free’ data, as required.
The new rules will require high volumes of data output from banks’ reporting systems, and there are concerns that this may stretch both the banks and the regulators themselves. Finally, banks may need to deal with interpretation issues on the data they report to regulators.
In Australia, under pressure from industry practitioners and the wider financial community, APRA has delayed the go-live date for its new Data Collection Solution to the second half of 2020, with the exact date to be announced imminently. The regulator had been operating in an ‘information gathering’ mode and could not plan appropriately without a clearer understanding of how banks are using XBRL files.
Data framework requirements
To get on board with these data-intensive APAC regulations, affected banks will need to address a number of issues. These include ensuring they can deliver on the requirement for data quality, completeness and availability. Key to this is the capability of the front- and back-office systems generating the transaction data for the regulatory reports.
Legacy systems can be used to capture the data if the fields required to meet the granular data requirements exist in those systems’ base structures. For third-party or core banking systems, which most banks use, institutions may need to ask their vendors to add new fields where the desired fields are not readily available.
System enhancements may be necessary on a case-by-case basis. This also depends on the type of approach a bank has taken. Those taking a tactical/short-term approach may compromise by maintaining data offline; those opting for a more strategic/long-term approach may choose to embed data in upstream systems, which in most cases would require system enhancements.
The nature of the emerging regulations creates the need for banks to implement a strategic data framework that meets current requirements but also has the flexibility to deal with unforeseen future ones.
Building for the new data granularity requirements
Since many of the impending regulatory initiatives will eventually establish pull mechanisms, with data being pulled by regulators from a bank’s data repository rather than pushed to regulators by the bank, they will require a bank’s technology team to be equipped to address the prescribed platform requirements. For smaller banks with minimal IT resources, this will mean some level of reliance on system vendors to ensure compliance.
Additionally, a bank’s client-facing and KYC teams need to be prepared to gather the additional information the more granular requirements demand. Operations teams that perform back-end verifications and cross-checks also need to be trained to ensure that all information required for the mandatory reporting fields is being sourced.
There are also a number of systems challenges. The first is the need to consolidate the data elements required for reporting from multiple source systems into a common data repository. Different source systems will have different fields to capture the same data element, which can cause issues. Practitioners will need to capture and maintain mandatory data elements to ensure seamless reporting. They will also need to build in cross-validation checks, as prescribed by the regulators, to ensure error-free data compilation for onward submission.
Benefits of a strategic data framework
The implementation of a strategic data framework can yield operational benefits. As discussed above, market and regulatory shifts are driving the need for integrated, risk- and finance-enabled regulatory compliance and reporting capabilities. By underpinning these mechanisms with a unified and flexible data framework, firms can consolidate regulatory data in one place and gain the agility needed to respond to unpredictable future change. Integrating regulatory calculations with regulatory reporting brings substantial benefits: higher accuracy, a clear audit trail, fewer manual adjustments, shorter reporting cycles and lower total cost of ownership.
A strategic data framework can also be used to drive a regulatory reporting mechanism that can process large volumes of data efficiently. Firms are looking to technology solutions that offer the speed and scalability needed to process the large volumes of data required for compliance with the new data granularity requirements and other data-driven regulatory initiatives.
Key considerations for regulated financial institutions
As of today, there is no indication whether the transaction-level data sourced from banks under the incoming regulatory regime will replace the outputs of the heavy calculations needed for banks’ CAR and SA-CCR reports. Until concrete requirements are announced by the regulators, banks will continue investing in meeting existing regulatory reporting requirements. They should, however, keep an eye on any changes to the regulators’ plans, as these could mean modifying or replacing current approaches.
In the meantime, affected banks should consider implementing high-performance reporting mechanisms that leverage technologies capable of handling huge volumes of data, such as in-memory processing and grid computing, to enable faster processing to churn out the reports within regulatory timelines. They should also look at vendor solutions that support reporting for existing regulations, such as the European Central Bank’s (ECB) AnaCredit, which requires transaction-level data for loan portfolios.
Swati Kothari and Jackielou Imperial are regulatory reporting specialists and product managers, based in Singapore, for Wolters Kluwer’s Finance, Risk & Reporting business.