Outstanding solution architecture is not just the result of understanding systems and how they interact; it also requires hands-on experience with the available technologies. Only technical know-how allows for well-founded effort estimates and sound judgment about the suitability of the subsystems, languages, frameworks, and interservice communication approaches available on the market. At the same time, solution architecture is never an end in itself but serves a broader business purpose. A deep understanding of the business context and the continuous involvement of all stakeholders are therefore essential.
Code quality is the foundation of outstanding systems. Bug-free performance is just as critical to customer satisfaction as features and design, and the agility of a team or company ultimately depends on the adaptability of its source code. Code quality metrics and test coverage are useful indicators for quality assurance, but ultimately it takes developers with a comprehensive, end-to-end understanding of the codebase: developers who grasp the patterns and theoretical foundations behind the available languages and frameworks, know how to apply them, and can show the way forward.
The replacement or support of individual manual processes through software systems is just the first step toward successful digitalization. True efficiency gains and cost savings come from the seamless integration of solutions across the entire organization. Achieving this requires a deep understanding of the benefits and limitations of different integration strategies, such as synchronous and asynchronous communication, event-driven architectures, and distributed transaction patterns like the two-phase commit across microservices. In most cases, security and identity and access management form the foundation of solid integration, which, to ensure long-term success, must also consider the broader context of interface documentation and potential future systems.
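As a minimal illustration of the asynchronous, event-driven style mentioned above, the following Python sketch publishes a domain event to a RabbitMQ topic exchange instead of calling downstream services synchronously. Broker address, exchange, and event fields are hypothetical; the `pika` package is assumed.

```python
# Minimal sketch of asynchronous, event-driven integration via RabbitMQ.
# All names (exchange, event type, amounts) are illustrative placeholders.
import json
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()

# A durable topic exchange decouples the producer from its consumers:
# new subscribers can be added without touching the publishing service.
channel.exchange_declare(exchange="business-events", exchange_type="topic", durable=True)

event = {"type": "invoice.created", "invoice_id": "INV-1042", "amount_eur": 99.90}
channel.basic_publish(
    exchange="business-events",
    routing_key=event["type"],
    body=json.dumps(event).encode(),
    properties=pika.BasicProperties(delivery_mode=2),  # persist the message
)
connection.close()
```

The publisher neither knows nor waits for its consumers, which is precisely the decoupling that makes organization-wide integration resilient to new and changing systems.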
The cloud paves the way for reliable, scalable, and globally distributed deployments. Managing uniform environments, such as production and staging, becomes remarkably simple with Infrastructure as Code. In most cases, adopting cloud technologies is a smart decision. However, sustainable success requires expertise: cost savings often materialize only with carefully planned infrastructures, backed by experience. Additionally, cloud providers tend to encourage lock-in effects, even at the software level. Balancing portability against development convenience demands a nuanced approach.
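To make the point about uniform environments concrete, here is a hedged AWS CDK sketch in Python: one stack definition is instantiated once per environment, so staging and production stay structurally identical. The stack, bucket, and stage names are invented for illustration.

```python
# AWS CDK (Python) sketch: a single stack class, parameterized by stage,
# keeps staging and production environments uniform by construction.
from aws_cdk import App, Stack
from aws_cdk import aws_s3 as s3
from constructs import Construct

class DataStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, *, stage: str, **kwargs):
        super().__init__(scope, construct_id, **kwargs)
        # Hypothetical bucket; only the stage suffix differs between environments.
        s3.Bucket(
            self,
            "ArtifactBucket",
            bucket_name=f"example-artifacts-{stage}",
            versioned=(stage == "prod"),  # stricter settings for production
        )

app = App()
DataStack(app, "DataStack-staging", stage="staging")
DataStack(app, "DataStack-prod", stage="prod")
app.synth()
```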
A custom ETL solution was developed for a multinational electric utility to evaluate the performance of energy trades. The contribution to this project involved transforming and linking data from a data lake and transferring it to a Snowflake data warehouse for further analysis.
The ETL system, implemented in Python and running on Azure Functions, uses a serverless architecture to ensure scalability and efficiency. This contribution helped streamline data processing, enabling the company to gain insights into the performance of energy trades and enhance decision-making. The project was managed entirely in Azure DevOps, including Azure Pipelines.
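The following is a condensed sketch of this ETL pattern, assuming the Azure Functions Python v2 programming model: a function triggered by new data-lake blobs transforms records with pandas and loads them into Snowflake. Container paths, column names, and credentials are placeholders, not the project's actual schema.

```python
# Hedged ETL sketch: blob-triggered Azure Function -> pandas transform ->
# Snowflake load. All names and credentials below are placeholders.
import io
import os
import pandas as pd
import azure.functions as func
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

app = func.FunctionApp()

@app.blob_trigger(arg_name="blob", path="datalake/trades/{name}",
                  connection="DataLakeConnection")
def load_trades(blob: func.InputStream):
    # Transform: parse the raw export and derive a simple performance metric.
    df = pd.read_csv(io.BytesIO(blob.read()))
    df["pnl"] = (df["sell_price"] - df["buy_price"]) * df["volume"]

    # Load: append the enriched rows to the warehouse table.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH", database="TRADING", schema="PUBLIC",
    )
    try:
        write_pandas(conn, df, "TRADE_PERFORMANCE", auto_create_table=True)
    finally:
        conn.close()
```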
HR administration software was developed as a modern web application specifically for managing promotion processes. Contributions were made to both the backend, built in Kotlin and Java using the Spring framework, and the frontend, a data-intensive UI implemented with Vue.js (JavaScript / TypeScript) and AG Grid and hosted with nginx. PostgreSQL was the database of choice.
Authentication was handled through Keycloak with OAuth for secure access. Additionally, Apache Superset was integrated to provide insights into current HR statistics. The system was deployed using a combination of GitLab CI/CD and Ansible for automation, and a containerized approach was chosen to ensure scalability. SonarQube was used to monitor code quality, and comprehensive end-to-end testing was ensured with Cypress.
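For illustration, a backend service typically obtains a token from Keycloak via the OAuth 2.0 client-credentials grant, as in the Python sketch below. The realm, client ID, and URLs are hypothetical, and recent Keycloak versions serve realms under `/realms/...` (older versions prefix `/auth`).

```python
# Illustrative OAuth 2.0 client-credentials flow against a Keycloak realm.
# Host, realm, client ID, and secret are placeholders.
import requests

token_url = "https://sso.example.com/realms/hr/protocol/openid-connect/token"
resp = requests.post(token_url, data={
    "grant_type": "client_credentials",
    "client_id": "hr-backend",
    "client_secret": "***",
})
resp.raise_for_status()
access_token = resp.json()["access_token"]

# The bearer token then authorizes calls to the protected backend API.
api = requests.get("https://hr.example.com/api/promotions",
                   headers={"Authorization": f"Bearer {access_token}"})
```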
Supporting the digitalization of the public sector involved providing technical expertise for large public tenders with a total volume of €800 million, covering both on-premises infrastructure and public cloud environments. This included advising on cloud strategies to ensure efficient, scalable, and secure solutions. Given the priority of cloud security in the public sector, several direct coordination meetings with the Federal Office for Information Security (BSI) were held to ensure compliance with its standards and best practices.
In parallel, a multi-cloud management system was designed and implemented to govern deployments across various cloud platforms. The solution leveraged Terraform, Kubernetes, GitLab CI/CD, and Ansible for automation, and ServiceNow to manage the ordering and provisioning of infrastructure, ensuring seamless operations across both public and on-premises environments. Part of the system was a custom on-premises cloud, for which automated Kubernetes cluster initialization (K3s) was implemented. The configuration of the underlying Red Hat Enterprise Linux (RHEL) VMs was automated with Puppet.
A platform was developed to create a social network for sharing and discussing place recommendations, such as restaurants, points of interest, clubs, and sports facilities. The goal was to foster a community-driven environment where users could interact, recommend, and discover new places based on shared interests and experiences.
The platform leveraged gRPC and GraphQL for efficient data querying and communication. AWS Aurora (PostgreSQL with PostGIS) was chosen as the database. The app was built using Flutter and Dart to provide a smooth, cross-platform user experience. AWS Cognito handled secure authentication, while AWS Lambda was used for serverless backend functions related to Cognito and Hasura. Later in the project, Hasura was replaced by a custom Go backend with a GraphQL API, providing greater flexibility and control over the platform's data management and interactions. Both the backend and the app were delivered continuously using GitLab CI/CD, and the infrastructure was maintained with CDK and CloudFormation.
The project supported the ongoing test-driven development and optimization of the central customer repository service of a leading neobroker. The service consolidates and manages the data of millions of mobile app users across the platform. A key aspect of the project involved extracting existing Know Your Customer (KYC) functionality into a dedicated microservice, ensuring seamless integration with existing systems and external providers to meet regulatory requirements.
The backend was developed in Kotlin, leveraging Spring, Hibernate, and PostgreSQL, with Flyway for database migrations. RabbitMQ enabled asynchronous messaging between services. OAuth 2.0 was implemented to secure authentication and authorization. Code quality was monitored with SonarQube; rollouts on Kubernetes were managed with ArgoCD. A small JWT library was implemented to handle secure token processing across the service landscape.
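The core of such token processing is strict claim validation before any claims are trusted. The sketch below shows that flow in Python with PyJWT for brevity (the actual library was written in Kotlin); issuer, audience, and key are hypothetical.

```python
# Token-verification flow, sketched with PyJWT; all identifiers are placeholders.
import jwt  # PyJWT

def verify_token(token: str, public_key: str) -> dict:
    """Validate signature, expiry, issuer, and audience before trusting claims."""
    return jwt.decode(
        token,
        public_key,
        algorithms=["RS256"],           # never accept the header's alg blindly
        audience="customer-repository",
        issuer="https://auth.example.com",
    )
```

Pinning the accepted algorithms, audience, and issuer on the verifying side is what prevents token-substitution and algorithm-confusion attacks across a service landscape.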
The project involved refactoring the flight control software for an autonomous kite developed by Winborne Energy, designed to generate power by flying while tethered to the ground, driving a generator. The software, running on an ESP32 microcontroller, managed critical functions such as engine power and rudder control. The primary task was to improve the structure and maintainability of the existing C code by introducing CMake for build automation and establishing header/source file separation.
To enhance collaboration and scalability, the codebase was reorganized, replacing a previously disordered structure that relied heavily on global variables. The refactoring process focused on bringing the code up to modern standards, improving readability, and making future updates more efficient. This experience highlighted the potential for further optimization through testing, providing valuable insights for future development.
A fleet and order management system was developed for a leading freight forwarder, serving as an intermediary between linehaul businesses and enterprises requiring large-scale transportation solutions. The frontend was crafted with Vue.js (JavaScript / TypeScript), hosted with nginx, and supported by a component library managed with Storybook. On the backend, a new microservice (Python with Flask, FastAPI, and Alembic) was created to manage user preferences. The microservice was deployed on AWS using a CI/CD pipeline and Terraform, including the required RDS/PostgreSQL database. The adoption of WebSockets ensured the system's responsiveness, and Cypress was used for extensive end-to-end testing.
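A condensed sketch of the user-preferences microservice pattern in FastAPI follows; the model fields, routes, and in-memory store are placeholders for the real RDS/PostgreSQL-backed schema.

```python
# Hedged FastAPI sketch of a user-preferences service; names are illustrative.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="user-preferences")

class Preferences(BaseModel):
    locale: str = "en"
    notifications_enabled: bool = True

_store: dict[str, Preferences] = {}  # stands in for the PostgreSQL table

@app.get("/users/{user_id}/preferences", response_model=Preferences)
def read_preferences(user_id: str) -> Preferences:
    # Fall back to defaults for users who have not saved preferences yet.
    return _store.get(user_id, Preferences())

@app.put("/users/{user_id}/preferences", response_model=Preferences)
def write_preferences(user_id: str, prefs: Preferences) -> Preferences:
    _store[user_id] = prefs
    return prefs
```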
The platform was integrated with the intermediary's existing marketplace for shipping orders, aiming to boost customer loyalty by enhancing the platform offerings. Amplitude was used to measure customer interactions and engagement, while Datadog and Sentry were employed to monitor performance and ensure the platform's quality and stability.
A research project developed an alternative model for food traceability across multi-step supply chains. The model enables bi-directional tracing with decentralized data storage, consumer-accessible information sharing, and mechanisms for data integrity and verification. Implemented using ASP.NET Core MVC in F#, Elasticsearch, and a React-based (JavaScript) frontend, the model was successfully simulated, confirming its objectives and demonstrating feasibility for deployment on Google Cloud using Kubernetes.
In parallel, a streaming platform was developed for a subsidiary of a major stock exchange to automate security settlement processing. The backend, designed for resilience and continuous operation, employs message-oriented middleware to handle high volumes of secure and efficient message transformation. Built using Java with Apache Camel, Spring, and ActiveMQ for messaging, the platform optimizes processing speed and reliability, essential for managing large-scale financial transactions securely.
A data pipeline was developed for a global e-commerce leader to support linear optimization models for determining ideal locations for future logistics hubs. The optimization process takes into account various constraints, such as truck schedules and demand patterns, to recommend the most efficient hub placements. The pipeline gathers data from multiple sources and seamlessly feeds it into an advanced problem solver.
Built using Python and IBM CPLEX, the solution enables data-driven decisions about logistics infrastructure. By efficiently integrating and processing diverse datasets, the pipeline optimizes hub locations, improving overall operational efficiency and scalability in the company's logistics network.
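As a toy example in the spirit of this hub-placement problem, the following docplex (the CPLEX Python API) sketch solves a minimal facility-location model; the sites, demand regions, and costs are invented for illustration.

```python
# Toy facility-location model with docplex; all data below is illustrative.
from docplex.mp.model import Model

sites = ["berlin", "hamburg", "munich"]          # candidate hub locations
regions = ["north", "south"]                     # demand regions
open_cost = {"berlin": 120, "hamburg": 100, "munich": 110}
ship_cost = {("berlin", "north"): 3, ("berlin", "south"): 5,
             ("hamburg", "north"): 2, ("hamburg", "south"): 7,
             ("munich", "north"): 8, ("munich", "south"): 2}

m = Model(name="hub-placement")
open_hub = m.binary_var_dict(sites, name="open")
assign = m.binary_var_dict(ship_cost.keys(), name="assign")

# Each region is served by exactly one hub, and only by an open hub.
for r in regions:
    m.add_constraint(m.sum(assign[s, r] for s in sites) == 1)
    for s in sites:
        m.add_constraint(assign[s, r] <= open_hub[s])

# Minimize fixed opening costs plus shipping costs.
m.minimize(m.sum(open_cost[s] * open_hub[s] for s in sites)
           + m.sum(c * assign[k] for k, c in ship_cost.items()))

solution = m.solve()
if solution:
    print([s for s in sites if open_hub[s].solution_value > 0.5])
```

The production pipeline's role was analogous to the data dictionaries above: gathering and reshaping real-world inputs such as truck schedules and demand patterns into the solver's expected form.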
A custom IT Service Management (ITSM) backend solution has been developed for a major German car manufacturer, enabling live synchronization of active frontend sessions managed by a Vue.js / JavaScript interface. The backend supports real-time IT support operations, with an efficient asynchronous employee lookup system that facilitates seamless querying across globally distributed Active Directories.
Built using C#/.NET technologies, including SignalR, Entity Framework, and ASP.NET MVC, the solution leverages WebSockets for real-time communication. Data is managed through Microsoft SQL Server, and secure access is ensured via Windows Integrated Authentication with IIS. This infrastructure continues to enhance the efficiency of IT support processes, delivering a responsive and scalable solution tailored to the manufacturer's global operations.
For a major industrial robotics manufacturer, a Single Sign-On (SSO) solution was implemented to streamline authentication for both employees and B2B clients. The objective was to provide a seamless login experience, despite the challenge of managing identities from two different sources: employee data stored in Active Directory and client data in a Microsoft SQL Server database. To address this, Active Directory Federation Services (ADFS) was used to integrate the SQL database through an external Security Token Service (STS), with SAML (Security Assertion Markup Language) ensuring compatibility. The project was deployed on Azure.
This solution unified authentication, allowing both employees and clients to access the collaboration platform with a single set of credentials. By leveraging ADFS and SAML, it provided secure, efficient, and scalable identity management, improving user experience while maintaining robust security standards.