Updated: 12/12/2025


Availability: not available until 07/01/2026
Senior Data Engineer & Database Migration Specialist
København K, Denmark
Diploma in business education
Data Engineering · Database Migration · PostgreSQL · Oracle · GCP (Google Cloud Platform) · Amazon AWS · Google BigQuery · Snowflake
My offers
Based on many years of experience in data engineering and project management I offer:
- Development work in the areas of Data Warehouse (DWH), ETL, and Big Data
- Replacement/migration of relational databases (in particular Oracle and PostgreSQL)
- Support in moving your database to cloud infrastructure (AWS, GCP)
- Performance optimization of your database queries and processes
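To illustrate the kind of performance-optimization work listed above, here is a minimal, hypothetical sketch. It uses SQLite from the Python standard library purely for demonstration; real engagements use the native tooling of Oracle/PostgreSQL (e.g. EXPLAIN plans), and the table and index names below are invented.

```python
import sqlite3

# Hypothetical illustration: how an index changes a query plan.
# Production work targets Oracle/PostgreSQL, not SQLite.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

def plan(sql):
    # Return the query plan as a single string (row[3] is the detail column).
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(total) FROM orders WHERE customer_id = 42"
before = plan(query)  # full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)   # index lookup

print(before)
print(after)
```

The same before/after comparison, applied to slow queries in the client's own database, is the starting point for most optimization work.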
Your benefits
Working with me has the following benefits:
- Optimized and accelerated data processes get your data to its destination faster and enable more timely analyses.
- The replacement/migration of your database takes place in a structured way and with the smallest possible downtime.
- Your employees' knowledge grows through my technical support and guidance.
- Better and faster database queries and processes increase customer satisfaction.
Focus topics
- ETL/DWH development
- Database migration (upgrades of existing databases; migration to other relational databases)
- Database performance optimization
Databases
- Google BigQuery
- Oracle Database (Oracle Database 12c Administrator Certified Associate, October 2016)
- PostgreSQL
- MySQL
- MariaDB
- MongoDB
Cloud
- Amazon Web Services (AWS)
- Google Cloud Platform (GCP)
Tools
- Bash
- Data Vault
- Docker
- git
- Linux
- Python
- Talend Studio
- Terraform
Languages
Danish: Good · German: Native speaker · English: Fluent
Project history
I am working for a large international group together with a project team from a project service provider. My task in the early stage was the implementation of complex KPI calculations using pandas and Jupyter notebooks. The goal was to decide whether the existing infrastructure for the KPI calculations could be replaced by a more modern infrastructure based on AWS and Snowflake. In the next project stage, we focused on the PoC implementation of the data pipelines for one KPI using AWS, Snowflake, and AWS Glue. The current project stage includes the production implementation of data pipelines for the whole product.
(Technologies used: Python, Pandas, Jupyter Notebook, Snowflake, AWS)
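To give a flavour of the pandas-based KPI work described above, here is a small sketch. The data, column names, and KPI definition are invented for illustration; the client's actual KPIs are confidential.

```python
import pandas as pd

# Invented example data; the client's actual KPI definitions are not reproduced.
orders = pd.DataFrame({
    "region":  ["EU", "EU", "US", "US", "US"],
    "revenue": [100.0, 140.0, 80.0, 120.0, 200.0],
    "units":   [10, 20, 10, 10, 20],
})

# Example KPI: average revenue per unit sold, per region.
totals = orders.groupby("region")[["revenue", "units"]].sum()
revenue_per_unit = totals["revenue"] / totals["units"]
print(revenue_per_unit)  # EU: 240/30 = 8.0, US: 400/40 = 10.0
```

In the project, calculations of this kind were first prototyped in Jupyter notebooks and then migrated into AWS/Snowflake pipelines.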
The company is a provider of charging solutions and infrastructure for electric vehicles based in Berlin, Germany. My task in the early stage of the project was to create a new data warehouse using cloud technologies to enable an aggregated and uniform view of the data from different source systems. My current tasks focus on the extension and maintenance of the data warehouse. The implementation uses Google Cloud BigQuery as the data warehouse solution, Terraform for infrastructure as code, an ELT approach with Data Vault modelling for loading the data warehouse, and Tableau as the visualization tool. The source data of approx. 10 GB is extracted primarily from MariaDB and MongoDB. As technical lead, it is my job to plan, operate, and implement the entire process for creating the data warehouse infrastructure and the loading processes. On the content side, I am supported by Product Owners and members of the Business Intelligence team.
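The Data Vault loading idea mentioned above can be sketched in plain Python. The hub/column names and the record source below are hypothetical; the production implementation runs as ELT inside BigQuery.

```python
import hashlib
from datetime import datetime, timezone

def hub_hash_key(*business_key_parts: str) -> str:
    # Normalize (trim, uppercase) and join the parts before hashing, a common
    # Data Vault convention so the same business key always yields the same hash.
    normalized = "||".join(p.strip().upper() for p in business_key_parts)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

def load_hub(existing_keys: set, source_rows: list) -> list:
    # Insert only business keys not yet present in the hub (idempotent load).
    new_rows = []
    for row in source_rows:
        hk = hub_hash_key(row["customer_id"])
        if hk not in existing_keys:
            existing_keys.add(hk)
            new_rows.append({
                "hub_customer_hk": hk,          # hypothetical hub column names
                "customer_id": row["customer_id"],
                "load_dts": datetime.now(timezone.utc),
                "record_source": "mariadb.crm",  # invented source identifier
            })
    return new_rows

rows = [{"customer_id": "C-001"}, {"customer_id": " c-001 "}, {"customer_id": "C-002"}]
inserted = load_hub(set(), rows)
print(len(inserted))  # the normalized duplicate of C-001 is skipped
```

The same pattern (hashed business keys, insert-only, idempotent loads) carries over directly to the SQL-based ELT in BigQuery.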
I worked for a large international group together with a project team from a project service provider. My task was to write database queries in Snowflake for data warehouse objects originating from different sources. Additionally, I aligned with the frontend and business teams to gather technical requirements.