IFRS9 Provisions Plausibility Engine

What did we solve?

In early 2018, the new IFRS9 methodology came into force. Our team was tasked by the Raiffeisen Bank International retail risk methodology and validation department with building a challenger implementation of a calculation engine for IFRS9 provisions. The provisions plausibility engine is used to verify the correctness of data provided by individual banks in the RBI group. It also served as a reference implementation in the initial phase of the group-wide IFRS9 project.


Besides a consistent implementation of the IFRS9 provisions calculation, an important criterion was the use of open technologies and standards, to ensure long-term support and future scalability. For a large international bank, one of the most important topics is the speed of computation on relatively common modern hardware.

How did we solve the task?

After many years of experience with various monolithic legacy systems, we decided to take an alternative approach. We designed a modular solution composed of independent components that all communicate via a unified API. The target architecture is built on modern technology and open-source software. This crucial decision gave us easy scalability, good support, and a favorable price (in fact, zero license costs).

Within the solution we have created the following components:

ETL module based on Pentaho DI. All operations in which data is transformed between different schemas, as well as aggregations, are handled in the ETL module. We already had experience with Pentaho Data Integration, which enabled us to quickly create the necessary data workflows. The solution scales dynamically according to the available computing power.
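The ETL workflows themselves live in Pentaho DI, but the kind of schema mapping and aggregation they perform can be sketched in plain Python (the record and field names below are hypothetical, not the project's actual schema):

```python
from collections import defaultdict

# Hypothetical source records, as exported from a bank's risk mart.
source_rows = [
    {"ACC_ID": "A1", "SEG": "retail", "EXPOSURE": 1000.0},
    {"ACC_ID": "A2", "SEG": "retail", "EXPOSURE": 2500.0},
    {"ACC_ID": "B1", "SEG": "sme", "EXPOSURE": 4000.0},
]

def transform(row):
    """Map one record from the source schema onto the target schema."""
    return {
        "account_id": row["ACC_ID"],
        "segment": row["SEG"],
        "exposure": row["EXPOSURE"],
    }

def aggregate(rows):
    """Aggregate total exposure per segment, as a reporting step would."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["segment"]] += row["exposure"]
    return dict(totals)

print(aggregate(transform(r) for r in source_rows))
# {'retail': 3500.0, 'sme': 4000.0}
```

Because each record is transformed independently, steps like this can be partitioned across as many worker threads as the host machine provides, which is what the dynamic scaling refers to.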

Computational engine based on JavaEE and Python. We delivered a fully customized computing engine capable of exploiting modern hardware with many CPUs and plenty of memory to process large amounts of data.
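As an illustration of how such a workload parallelizes, here is a minimal sketch in Python, assuming a simplified expected-credit-loss formula (ECL = PD × LGD × EAD); the real IFRS9 calculation involves staging, discounting, and lifetime projections:

```python
import os
from concurrent.futures import ProcessPoolExecutor

def expected_credit_loss(position):
    # Simplified one-period ECL: probability of default x loss given
    # default x exposure at default.
    return position["pd"] * position["lgd"] * position["ead"]

def portfolio_provision(positions):
    # Provision for one portfolio: sum of per-position ECLs.
    return sum(expected_credit_loss(p) for p in positions)

def compute_all(portfolios):
    # Portfolios are independent, so they can be spread across all cores.
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        return list(pool.map(portfolio_provision, portfolios))
```

Since the portfolios do not share state, throughput grows roughly with the number of cores available, which is why computation speed on common hardware was an achievable goal.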

Graphical user interface. Going beyond the technical specification, we created a modern user interface in which the user can easily check the calculation state of single portfolios, track their results, manage calculation parameters, and create or modify macroeconomic models for individual portfolios.

The user can modify any model parameter, or the models themselves, and compare their outputs in real time. The role of the graphical interface is fulfilled by a web application built with the JavaScript framework ReactJS. Data can be input by dragging and dropping a data package directly from the user's computer into the GUI, or via an automated run as part of system integration. The input data set is delivered as various exports from internal risk marts in the form of SAS data sets, which are loaded with our own open-source plugin (link) for Pentaho DI.
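The real-time model comparison can be pictured as follows; the scenario names and scaling factors below are purely illustrative, standing in for the macroeconomic models a user edits in the GUI:

```python
# Hypothetical macroeconomic scenarios: each scales a baseline PD by a
# factor that would, in practice, come from a fitted macro model.
SCENARIOS = {
    "baseline": 1.00,
    "adverse": 1.35,
}

def adjusted_pds(baseline_pds, factor):
    # Scenario-adjusted probabilities of default, capped at 1.
    return [min(pd * factor, 1.0) for pd in baseline_pds]

def compare(baseline_pds):
    """Side-by-side outputs, as shown when a user edits model parameters."""
    return {name: adjusted_pds(baseline_pds, f) for name, f in SCENARIOS.items()}

print(compare([0.01, 0.05]))
```

Editing a parameter in the GUI amounts to changing one of these inputs and immediately re-rendering both result sets side by side.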

How did it end up?

The customer received a reliable tool capable of verifying the implementation of the new calculation methodology in individual banks. The software was installed and integrated on the client's servers in a short space of time, where authorized users can access it. The tool is now in intensive use.

The data model was designed to handle large amounts of data efficiently. We chose MongoDB as the technology for storing data of different types, which proved to be a great choice. Building on the extensibility of the Pentaho DI platform, we created several custom plug-ins to extend its standard features.
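MongoDB's schemaless documents are what make storing records of different types side by side straightforward. The document shapes below are hypothetical illustrations, not the project's actual collections, together with the kind of lightweight field check a plausibility engine can apply before processing:

```python
# Hypothetical document shapes stored side by side in one MongoDB
# collection; a "type" field tells the engine how to treat each one.
docs = [
    {"type": "position", "account_id": "A1", "ead": 1000.0, "stage": 1},
    {"type": "macro_model", "name": "gdp_linear", "coefficients": [0.8, -1.2]},
    {"type": "run_result", "portfolio": "retail", "provision": 1234.5},
]

# Minimal per-type field requirements.
REQUIRED = {
    "position": {"account_id", "ead", "stage"},
    "macro_model": {"name", "coefficients"},
    "run_result": {"portfolio", "provision"},
}

def plausible(doc):
    """True if the document carries every field its type requires."""
    return REQUIRED.get(doc.get("type"), set()) <= doc.keys()

print(all(plausible(d) for d in docs))
```

Because documents of different shapes coexist in one store, adding a new record type requires no schema migration, only a new entry in the validation rules.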

We developed the solution iteratively using the Scrum methodology. At regular intervals, we presented the current state of development to the client's internal team and deployed an up-to-date version of the solution to the test environment. Based on the feedback, we adapted further development as needed. In the later stages of the project, we also participated in analyzing and consolidating the input data and in identifying and resolving data quality issues.