PaaS as a solution to modernize and scale multiple legacy systems into a single BI cloud solution
Our client’s backstory
Our client is one of the market leaders in sleep innovation, powering the sleeping experience with bed technology: overnight biometric data tracking and adjustable comfort features such as mattress firmness and temperature. This combination creates unique sleep experiences that improve health monitoring and the wellbeing of users.
What market need was identified?
The company's sales and operations rely on several large enterprise systems for inventory and transportation tracking, as well as consumer insights. Each of these systems, however, solves a narrow domain of problems and over time has become an isolated silo of data.
Any advanced BI would need to combine all of those silos to provide additional insight, correlating all the different bits of information.
The company started building proofs of concept (POCs) on the Microsoft Azure stack to show that all the source systems could be connected.
In time, as the team gained experience in the Azure space, the need arose to rearchitect all the systems into a consolidated, well-built platform, and to bring in DevOps expertise to support it.
With the POCs having proved the benefit of basing the platform on Microsoft Azure, the team began treating business users and groups inside the company as customers, focusing on giving them easy access to the integrated data platform.
This is where the DevOps expertise provided to the company enabled fine control of user access, quick response to any integration problems and automation of certain aspects of the platform.
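Fine-grained user access of the kind described above often comes down to a mapping from user groups to the resources and actions they may perform. The following Python sketch is purely illustrative, not the client's actual implementation; the group and resource names are invented, and in practice the groups would be resolved from Azure Active Directory membership.

```python
# Hypothetical sketch of group-based access control for platform resources.
# Group and resource names are invented for illustration only.
ACCESS_MATRIX = {
    "bi-analysts": {"sales_mart": "read", "inventory_mart": "read"},
    "data-engineers": {"sales_mart": "write", "inventory_mart": "write", "raw_zone": "write"},
}

def can_access(groups, resource, action="read"):
    """Return True if any of the user's groups grants the action on the resource."""
    order = {"read": 1, "write": 2}  # write permission implies read
    for group in groups:
        granted = ACCESS_MATRIX.get(group, {}).get(resource)
        if granted and order[granted] >= order[action]:
            return True
    return False
```

In a real platform this check would sit behind the data access layer, with `groups` supplied by the identity provider rather than hard-coded.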
On the technical side, the new platform would avoid redundancy, reduce duplicated software implementations, improve performance, automate maintenance and monitoring, and reduce cost.
One of the first challenges was to connect and integrate all the different data sources into a single platform. Each system had its own interface and organized its data in its own way.
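Integrating sources that each organize data differently usually comes down to mapping every source schema onto one common model. A minimal Python sketch of that idea follows; the field names and record shapes are hypothetical, chosen only to illustrate the pattern.

```python
# Hypothetical sketch: normalize records from two differently shaped source
# systems into one common schema. Field names are invented for illustration.

def from_inventory(rec):
    # The inventory system keys items by "sku" and calls quantity "on_hand".
    return {"item_id": rec["sku"], "quantity": rec["on_hand"], "source": "inventory"}

def from_transport(rec):
    # The transport system nests the item reference and uses "units".
    return {"item_id": rec["shipment"]["item"], "quantity": rec["units"], "source": "transport"}

def integrate(inventory_rows, transport_rows):
    """Merge both sources into one list with a uniform schema."""
    return ([from_inventory(r) for r in inventory_rows]
            + [from_transport(r) for r in transport_rows])
```

Once every source is expressed in the common schema, BI queries can correlate across systems without knowing where each record originated.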
Many of the early POCs quickly proved their benefit, and business teams started relying on them as a source for their BI and reports even before they were officially production-ready. This required precise planning of timing and system migration, given the fundamental architecture changes, so that each system could officially be released to production. Many databases needed to be cloned, cleansed, and readied for integration with the new backend.
In addition, there were multiple technological challenges: while the Azure stack provides many out-of-the-box solutions, in certain areas of automation and monitoring it was necessary to think outside that box and write custom scripts to connect different Azure products.
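Custom glue scripting of the kind described can be as simple as a health poller that turns raw checks into alerts. This is a hedged Python sketch, not the team's actual script: the check functions are injected stubs, and a real deployment would feed the resulting alerts into Azure Monitor or a similar alerting service.

```python
# Hypothetical monitoring sketch: run a set of health checks and collect
# alerts for any that fail. Checks are injected callables, which keeps
# the script testable without touching real systems.

def run_checks(checks):
    """checks: mapping of system name -> zero-argument callable returning True/False.

    Returns a list of alert messages for failed or crashing checks."""
    alerts = []
    for name, check in checks.items():
        try:
            healthy = check()
        except Exception as exc:  # a crashing probe is itself an alert
            alerts.append(f"{name}: check raised {exc!r}")
            continue
        if not healthy:
            alerts.append(f"{name}: unhealthy")
    return alerts
```

A scheduler (a cron job, an Azure Function timer, or similar) would run this periodically and forward any non-empty alert list to the on-call channel.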
How we did it
One of the first steps was to take an inventory of the multitude of existing POCs, analyze how they were connected and who was using them.
Legacy systems needed to be decommissioned and new ones correctly set up from the beginning.
Once the team was well set up in terms of roles, access to certain resources needed to be revised, and this is where synchronizing Azure Active Directory with the on-prem Windows Active Directory proved its benefits.
As the platform gained the confidence of different business teams within the company, new requests started coming in, and the team needed to be agile in order to deliver business value within short timeframes. This is where the DevOps expertise came into play, as a number of release and deployment processes were automated using Azure DevOps.
CI/CD pipelines were implemented to automate the whole flow, from the git commit through testing to deployment to non-prod environments. A Git flow was established for new projects, following best practices for pull requests and quality gates. Monitoring and alerting were implemented by default for all deployed systems. Where Azure's out-of-the-box solutions were not enough, custom scripts were written to enable full monitoring of business-critical systems and their performance.
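A quality gate in such a pipeline typically boils down to comparing build metrics against agreed thresholds and failing the run when any is breached. The Python sketch below illustrates the idea; the metric names and thresholds are hypothetical, not the team's actual gates, and in an Azure DevOps pipeline a script like this would fail the build by exiting non-zero when violations are found.

```python
# Hypothetical quality gate: pass or fail a build based on its metrics.
# Metric names and thresholds are invented for illustration.

THRESHOLDS = {
    "test_pass_rate": 1.0,   # every test must pass
    "coverage": 0.80,        # at least 80% line coverage
}

def gate(metrics, thresholds=THRESHOLDS):
    """Return a list of violations; an empty list means the gate passes."""
    return [
        f"{name}: {metrics.get(name, 0.0):.2f} < {minimum:.2f}"
        for name, minimum in thresholds.items()
        if metrics.get(name, 0.0) < minimum
    ]
```

Keeping the gate as a small, versioned script rather than ad-hoc pipeline configuration makes the quality bar reviewable through the same pull-request process it enforces.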
Klika's DevOps expertise also helped the team achieve new standards in the automation and quality of monitoring and alerting. Having previously worked in the silos of separate enterprise systems, the in-house teams also quickly benefited from the DevOps ability to triage integration problems quickly and solve them with the help of the appropriate domain experts.
After the in-house teams finished initial POCs, the Klika team worked with them side-by-side to establish the newly created data processing platform on strong foundations that allowed for easier future development.