Our IFRS9 Technical Solution
ETL processes were set up to clean the data
When we started implementing the IFRS9 standard, we had about a year to design and deliver a working and, hopefully, extendable and robust solution. Since most of our developers are comfortable with Java and we had built other projects in this language, the decision to build the core of the application on the Java Spring framework was an easy one. There were also no obvious performance-related reasons not to. The core is responsible for loading accounts from a database and executing a stack of sub-processes on each one. Because each sub-process is responsible for only one task (data validation, PD calculation, ECL calculation, ...), the code is very readable and we can introduce a new sub-process at any time.
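The sub-process stack described above can be sketched roughly as follows. The interface and class names here are our own illustration, not the engine's actual API, and the two sample steps are toy placeholders:

```java
import java.util.List;

public class SubProcessStack {

    // One narrow responsibility per sub-process (hypothetical interface).
    interface SubProcess {
        void execute(Account account);
    }

    static class Account {
        String id;
        double ecl;
        boolean valid;
        Account(String id) { this.id = id; }
    }

    // New steps can be appended to the stack without touching existing code.
    static final List<SubProcess> STACK = List.of(
        a -> a.valid = a.id != null && !a.id.isEmpty(),   // data validation
        a -> { if (a.valid) a.ecl = 0.02 * 1000; }        // toy ECL step
    );

    static Account process(Account a) {
        for (SubProcess p : STACK) p.execute(a);
        return a;
    }

    public static void main(String[] args) {
        Account a = process(new Account("ACC-1"));
        System.out.println(a.id + " ecl=" + a.ecl);
    }
}
```

Because each step only sees the account it operates on, the order of the stack is the only coupling between sub-processes.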
Because the application depends on many different inputs from different sources, we had to set up ETL processes to clean the data and transform it into a common format before saving it to the database.
MongoDB as the ideal candidate
We chose MongoDB as our storage due to its non-restrictive nature. Input formats changed very often at the beginning of development, and MongoDB allowed us to iterate quickly and make changes without even thinking about the database. The entities in our system are structured and mostly separate, with few relations between them, so the document-based MongoDB was the ideal candidate. During development we also discovered how powerful the aggregation framework is. Most of our exports and reports are prepared using aggregations, which means fewer and smaller transformation files, and fewer long, unreadable streams in the Java-based core.
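To give a feel for aggregation-based reporting, a report pipeline might look like the sketch below. The stages are built as plain maps so the example stays self-contained; in the real application they would be expressed through the MongoDB Java driver or Spring Data, and the collection fields (`valid`, `country`, `ecl`) are hypothetical:

```java
import java.util.List;
import java.util.Map;

public class ReportPipeline {

    // Shape of a hypothetical "total ECL per country" aggregation:
    // filter valid accounts, group by country, sort descending.
    static List<Map<String, Object>> eclByCountry() {
        return List.of(
            Map.of("$match", Map.of("valid", true)),
            Map.of("$group", Map.of(
                "_id", "$country",
                "totalEcl", Map.of("$sum", "$ecl"))),
            Map.of("$sort", Map.of("totalEcl", -1))
        );
    }

    public static void main(String[] args) {
        System.out.println(eclByCountry());
    }
}
```

Pushing this grouping and summing into the database is what keeps the transformation files small and the Java core free of long streams.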
The IFRS9 engine was later chosen as the foundation for a new tool for (not only) the EBA stress test 2018. It made sense because the IFRS9 engine was already stable and extendable. A new module, Orchestrator, was built to manage the rest of the modules. It is also a Java application, and it connects to the IFRS9 engine, PDI and the RWA engine (which is very similar to the IFRS9 engine architecture-wise). The previously established direct connections between the IFRS9 engine and PDI were removed, and Orchestrator is now in charge of calling the different actions. These actions are definable using Groovy scripts, so preparing any kind of test is very easy and can be done by any user familiar with the available actions and scripting. The whole system is now called CRM.
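The action-dispatch idea behind Orchestrator can be sketched in a few lines. All names and signatures below are invented for illustration; in the real system the registered actions are what the Groovy scripts call:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class Orchestrator {

    // Hypothetical registry of named actions callable from scripts.
    private final Map<String, Function<Map<String, Object>, String>> actions = new HashMap<>();

    void register(String name, Function<Map<String, Object>, String> action) {
        actions.put(name, action);
    }

    String call(String name, Map<String, Object> params) {
        Function<Map<String, Object>, String> action = actions.get(name);
        if (action == null) throw new IllegalArgumentException("Unknown action: " + name);
        return action.apply(params);
    }

    public static void main(String[] args) {
        Orchestrator o = new Orchestrator();
        o.register("runEclCalculation", p -> "started portfolio " + p.get("portfolio"));
        System.out.println(o.call("runEclCalculation", Map.of("portfolio", "2018-Q1")));
    }
}
```

A script then only needs to know action names and parameters, not the modules behind them, which is what makes test preparation accessible to non-developers.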
The CRM system depends on models, but as opposed to our other product, Model Governance Suite, the purpose of CRM isn't to catalog them. Models are also stored in the database, together with fixed mapping attributes, and they are grouped by country into snapshots with a defined validity timeline. While CRM lacks catalog features, it compensates with model analysis features. Because the number of supported/required model types is very limited, we can anticipate very accurately what users want to do with them. The output of a model is graphed in the GUI, so any change is immediately visualized. Internal values of models can be exported to Excel, the source code of a macro model can be exported as human-readable formulas, and we are even able to translate a macro model into an Excel file, so that the output vector (and every other internal vector) is calculated solely using Excel formulas.
Macro model inputs are grouped into packages with a defined validity timeline. Moreover, packages belong to package sets. These sets are also mapped to the timeline, so CRM can contain multiple very different sets, for example with adjusted values. This is often used for sensitivity analysis, but the usage of sets is up to the user, and the system offers more possibilities than this one example.
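The package/package-set structure can be modelled roughly as follows. The names and the date-resolution logic are assumptions made for illustration, not CRM's actual data model:

```java
import java.time.LocalDate;
import java.util.List;
import java.util.Optional;

public class PackageSets {

    // A package of macro model inputs, valid over a date interval.
    record MacroPackage(String name, LocalDate validFrom, LocalDate validTo) {
        boolean coversDate(LocalDate d) {
            return !d.isBefore(validFrom) && !d.isAfter(validTo);
        }
    }

    // A set groups packages; resolving a date picks the package valid then.
    record PackageSet(String name, List<MacroPackage> packages) {
        Optional<MacroPackage> packageFor(LocalDate d) {
            return packages.stream().filter(p -> p.coversDate(d)).findFirst();
        }
    }

    public static void main(String[] args) {
        PackageSet baseline = new PackageSet("baseline", List.of(
            new MacroPackage("2017H2", LocalDate.of(2017, 7, 1), LocalDate.of(2017, 12, 31)),
            new MacroPackage("2018H1", LocalDate.of(2018, 1, 1), LocalDate.of(2018, 6, 30))));
        System.out.println(baseline.packageFor(LocalDate.of(2018, 3, 1))
            .map(MacroPackage::name).orElse("none"));
    }
}
```

A second set, say "adjusted", could carry the same timeline with shocked values, which is the sensitivity-analysis use mentioned above.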
Our goal was to make IFRS9 calculation as easy and user-friendly as possible
While the Orchestrator scripts provide a very robust stress-testing solution, our goal was also to make standard IFRS9 calculations as easy and user-friendly as possible. The GUI is based on React and connects to the IFRS9 core through a REST backend. Executing an ECL calculation is as easy as dropping a SAS dataset file with the portfolio onto the Calculation Overview page. The user immediately sees the progress on a progress bar with the different calculation steps, and when the calculation is done, reports can be downloaded from the same page. The GUI is also used to manage all the necessary repositories; attributes and even models are editable. Each change in the system is tracked by the Audit trail, which visualizes changes of data similarly to GitHub. Because the Audit trail internally works with JSONs close to the database layer, any historical state of an entity can potentially be restored quite easily.
One tricky problem was related to logs. The best scenario is that the user sends us the error message together with the log, but we didn't want to bother application users with hard-to-understand error logs. Even better, an error is usually related to a problem with the input data, so users can often fix the issue themselves. We try to catch as many common errors as possible, give them a human-readable message and pass them to the Event log, which is also available from the GUI. This benefits both sides: users don't have to contact developers with issues they can solve themselves, and developers spend less time on common, recurring problems.
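The idea can be sketched like this, with hypothetical names. The point is translating a common technical failure into a message the user can act on, while anything unexpected still surfaces to developers:

```java
import java.util.ArrayList;
import java.util.List;

public class EventLog {

    record Event(String level, String message) {}

    private final List<Event> events = new ArrayList<>();

    List<Event> events() { return events; }

    // Run one calculation step; known data-related failures become
    // human-readable Event log entries instead of raw stack traces.
    void runStep(String accountId, Runnable step) {
        try {
            step.run();
        } catch (NumberFormatException e) {
            events.add(new Event("ERROR",
                "Account " + accountId
                    + ": a numeric field could not be parsed - please check the input dataset"));
        }
        // Other exception types would propagate to the developer-facing log.
    }

    public static void main(String[] args) {
        EventLog log = new EventLog();
        log.runStep("ACC-7", () -> Double.parseDouble("n/a"));
        log.events().forEach(e -> System.out.println(e.level() + ": " + e.message()));
    }
}
```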
Application profiles turned out to be a very useful feature. The user can export the whole CRM configuration to a zip file at any time. It contains not only configuration, such as the settings of the calculation engine, but also the Event log and all repositories. This means an Application profile works as a kind of soft backup: calculation results aren't saved, but the calculation itself can be reproduced from the profile. The feature can also be used to transfer the current CRM instance to a new server in two mouse clicks.
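A soft-backup export of this kind can be sketched with the JDK's built-in zip support; the entry names and document contents below are made up for the example:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

public class ProfileExport {

    // Pack named configuration documents into a single zip archive.
    static byte[] export(Map<String, String> documents) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ZipOutputStream zip = new ZipOutputStream(bytes)) {
            for (Map.Entry<String, String> doc : documents.entrySet()) {
                zip.putNextEntry(new ZipEntry(doc.getKey()));
                zip.write(doc.getValue().getBytes(StandardCharsets.UTF_8));
                zip.closeEntry();
            }
        }
        return bytes.toByteArray();
    }

    // List archive entries, e.g. when importing a profile on a new server.
    static List<String> entryNames(byte[] zipBytes) throws IOException {
        List<String> names = new ArrayList<>();
        try (ZipInputStream zip = new ZipInputStream(new ByteArrayInputStream(zipBytes))) {
            for (ZipEntry e; (e = zip.getNextEntry()) != null; ) names.add(e.getName());
        }
        return names;
    }

    public static void main(String[] args) throws IOException {
        byte[] profile = export(Map.of(
            "settings.json", "{\"engine\":\"ifrs9\"}",
            "eventlog.json", "[]"));
        System.out.println(entryNames(profile));
    }
}
```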
The IFRS9 engine matured very quickly into a complex and powerful solution used for multiple kinds of stress tests, and it is still being extended.
Author: Luděk Novotný