In Brno, a city at the heart of Czech software development, we launched this blog to share our thoughts.

Unreachable data in your model inventory? - Introducing the Reporting module

The MGS team is introducing the Reporting module, which enables a direct connection to tools like Power BI, Tableau, or even MS Excel.

Flattening Docker images

Docker images are like onions or ogres: they have layers, and that can sometimes be a problem. Especially with the VFS storage driver, where each layer is stored as a full copy of the filesystem. We present several ways to squash multiple image layers into one.

Caching frontend web applications

As more and more web application code lives on the client's computer, it becomes harder to push updates exactly when you want. One solution is to change resource file names after every update, so browsers are forced to fetch the new version. Webpack can make this process straightforward.
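The idea behind this kind of cache busting (what Webpack's `[contenthash]` naming does) can be sketched in a few lines. This is a minimal, hypothetical illustration in Python — the file name `app.js` and the helper are made up, not Webpack's actual implementation:

```python
import hashlib
from pathlib import Path

def cache_busted_name(path: Path) -> str:
    """Return a file name with a short content hash inserted,
    e.g. app.js -> app.3a7bd3e2.js. New content means a new name,
    so a browser can never serve a stale cached copy."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()[:8]
    return f"{path.stem}.{digest}{path.suffix}"

# Hypothetical bundle: write it, then publish it under its hashed name.
bundle = Path("app.js")
bundle.write_text("console.log('v2');")
print(cache_busted_name(bundle))
```

The HTML that references the bundle is regenerated with the new name on each build, so unchanged files keep their old names (and stay cached) while updated ones get fresh URLs.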

Audit trail handling with MongoDB

This post presents an example of how an audit trail can be implemented with the MongoDB database. In fact, we implemented exactly this system in the CRM of one of our older projects.
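As a taste of the pattern, an audit trail is typically an append-only collection of immutable change records. Here is a minimal, hypothetical sketch of the record shape, using a plain Python list in place of a MongoDB collection (with pymongo you would call `insert_one` on a collection instead; all field names here are illustrative):

```python
from datetime import datetime, timezone

audit_trail = []  # stand-in for a MongoDB collection; records are only appended

def record_change(entity_id, user, field, old_value, new_value):
    """Append one immutable change record; existing records are never updated."""
    audit_trail.append({
        "entity_id": entity_id,
        "user": user,
        "field": field,
        "old": old_value,
        "new": new_value,
        "at": datetime.now(timezone.utc).isoformat(),
    })

record_change("customer-42", "alice", "email", "a@old.example", "a@new.example")
record_change("customer-42", "bob", "phone", None, "+420123456789")

# The full history of one entity is just a filter over the trail.
history = [r for r in audit_trail if r["entity_id"] == "customer-42"]
print(len(history))  # 2
```

Because records are never modified, the trail doubles as documentation of who changed what and when — which is exactly what auditors ask for.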

Proof the feature works

There are many ways to make sure an implemented software feature works. But sometimes you need proof not only for yourself but also for your customer. What if the customer could monitor the application's functions themselves?

Solving the Unirest JSON parsing problem on Databricks

Parsing a Unirest response into a JsonNode can fail on Databricks if the top-level JSON structure is an array. We propose a workaround that works best in custom Java libraries.

Black Friday ready real-time analytics

E-commerce platforms like Shopify ship with reporting dashboards that cover tons of figures generated by the shop, but these may not be sufficient for every architecture. We look at a case where a customer wants real-time analytics that scale seamlessly for peak periods like Black Friday.

MGS to Databricks connector

The MGS team is now introducing a connector between MGS and Databricks. Teams governing the firm's portfolio in MGS - by no means only ML/AI models - get the details of model development and monitoring right at their fingertips. Teams using the Databricks platform can write code interacting with the inventory.

Is Databricks good for complex applications?

Can we use Databricks as a runtime platform for complex applications, with a standard development model built on Git, deployment pipelines, and several independent modules / services?

Automated risk model operations at scale

The MRM methodology is relatively well-defined, including the concept of model monitoring for particular model types. But how can we take these mostly manual procedures and deploy them at scale for the entire portfolio?

Should we register our CRO's napkin?

End User Computing (EUC) items exist in every larger organization and are a potential source of headaches and significant losses. So what are EUCs, and how do we save ourselves this trouble? Here's my point of view, based on a recent case of ours.

Can you run it locally?

During its development cycle, software goes through many stages. It usually starts as source code in an IDE; different parts are exercised in unit tests, and maybe whole components are tested too. It is then packaged (compiled) somehow and deployed to some server (staging, production, …). That is a very rough lifecycle that doesn't apply in all cases, but at least during my time at CloseIT, many projects followed it.