Data engineering and BI

CHALLENGES

While working with clients across industries, we have found that the hardest part of BI reporting, ML, or model development is often not the task itself, but the data management around it. Data engineering is frequently the most challenging part of the problem.

Cloud environments are complex

Cloud environments offer huge scalability, flexibility, and potential cost savings, but they can also be very complex to work with.

Missing data centralization

Data exists in many legacy systems across the organization and must be centralized before it can be used effectively.

Advanced data processing

Some calculations or reports are too advanced to be built with self-service BI tools such as Tableau or Power BI. These problems require custom data processing or transformation routines, developed with an emphasis on scaling to large data volumes efficiently.
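
For illustration, the sketch below (with hypothetical table and column names) shows the kind of windowed calculation that is awkward to express in a self-service BI tool but straightforward in a distributed engine such as Apache Spark: a rolling 30-day exposure per customer.

  from pyspark.sql import SparkSession, functions as F
  from pyspark.sql.window import Window

  spark = SparkSession.builder.appName("rolling-exposure").getOrCreate()

  # Hypothetical input: one row per transaction with customer_id, txn_ts, amount.
  txns = spark.read.parquet("s3://your-bucket/transactions/")

  # Rolling 30-day window per customer, defined in seconds over the event time.
  w = (Window.partitionBy("customer_id")
       .orderBy(F.col("txn_ts").cast("long"))
       .rangeBetween(-30 * 86400, 0))

  result = txns.withColumn("exposure_30d", F.sum("amount").over(w))
  result.write.mode("overwrite").parquet("s3://your-bucket/exposure-30d/")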

HOW CAN WE HELP WITH DATA ENGINEERING AND BI CHALLENGES

Our data engineers and software developers help you build a solid data architecture. We can help you with:

  • Design of your cloud architecture

  • Building your data lake or data warehouse

  • Data preparation and cleansing for your BI analytics or model development

  • Automation of data ingestion

  • Custom reports and dashboards

  • Building data transformation and calculation engines

CASE STUDIES

We have identified, developed, and successfully implemented solutions for both large and small companies, always with an emphasis on maximizing business value.

IFRS9 Provisions Plausibility Engine

In early 2018, a new methodology for the calculation of provisions for Expected Credit Losses (ECL) under IFRS 9 came into force for the banking sector. Our team was tasked with implementing a solution for the retail credit exposures of Raiffeisen Bank International (RBI). The calculation is used to check the plausibility of the results provided by each RBI subsidiary bank.

"This was a clear example of agility. CloseIT was presenting a steady development in regular iterations as the tool was progressively taking shape. The speed of development was impressive. From zero to functional IFRS 9 ECL prototype in 3 months!"
Deyan Ivanov, Head of Retail Risk Methodology & Validation at Raiffeisen Bank International

Credit Risk Stress Testing Framework

We created a unique solution for recurring credit risk stress tests at Raiffeisen Bank International, which allows users to define their own test scenarios for simulating portfolio development and calculating individual risk parameters such as RWA or ECL. The application was successfully used for the 2018 EBA stress tests and continues to be used extensively for other stress-testing needs, both internal and regulatory.

FROM OUR BLOG

Black Friday ready real-time analytics

E-commerce platforms like Shopify come with their own reporting dashboards covering tons of figures generated by the shop, but that might not be sufficient for every architecture. We looked at a case where a customer wants real-time analytics that scale seamlessly during peak periods like Black Friday.

MGS to Databricks connector

The MGS team is introducing a connector between MGS and Databricks. Teams governing the firm’s model portfolio in MGS - not only ML/AI models - get the details of model development and monitoring right at their fingertips, while teams using the Databricks platform can write code that interacts with the inventory.

Is Databricks good for complex applications?

Can we use Databricks as a runtime platform for complex applications built with a standard development model: git, deployment pipelines, and several independent modules or services?

Can you run it locally?

During the development cycle, software goes through many stages. It usually starts as source code in an IDE, different parts are exercised in unit tests, and sometimes whole components are tested. It is then packaged (compiled) and deployed to some server (staging, production, …). That is a very rough lifecycle and not applicable in all cases, but at least during my time at CloseIT, many projects followed it.

Starting AWS SAM folder structure (NodeJS)

When I started putting together my first AWS SAM project, I was confused by the project structure - as always when starting a new project with a new technology. It is easy to end up with a bloated project where code is duplicated in each Lambda function.
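
For illustration, one common way to avoid that duplication in a NodeJS SAM project (a sketch with hypothetical names, not a prescribed layout) is to keep shared code in a Lambda layer and reduce each function's folder to its own handler:

  my-sam-app/
    template.yaml                 # SAM template: the functions plus a shared layer
    layers/
      common/
        nodejs/
          node_modules/
            utils.js              # shared code; require('utils') from any handler
    src/
      create-order/
        app.js                    # one small handler per function
      get-order/
        app.js
    tests/

The layer is declared once in template.yaml and referenced from each function, so common logic lives in a single place instead of being copied into every Lambda.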