Henry Hüske

I'm Henry Hüske

Senior Data Engineer / Database Migration Specialist

  • Address
  • E-mail
  • Phone
  • Languages: German, English, Danish
  • Available from March 1st, 2025

Download my portfolio as PDF

Hello! I'm Henry Hüske, a senior data engineer specializing in database migration to the cloud, with a strong background in project management support for data-related projects.

My services to you

  • Performing development tasks in the areas of Data Warehouse (DWH), ETL, and Business Intelligence, with experience in geodata handling
  • Replacement/migration of relational databases (in particular Oracle and PostgreSQL)
  • Support with transferring your database to cloud infrastructure (AWS, GCP)
  • Performance optimization of your database queries and processes

Your benefits

  • The optimization and acceleration of data processes gets your data to its destination faster and enables more timely analyses.
  • The replacement/migration of your database takes place in a structured way and with minimal downtime.
  • Your employees' knowledge grows through my technical support and guidance.
  • Better and faster database queries and processes increase customer satisfaction.

Professional Skills

Focus topics
  • ETL/DWH development
  • Database migration (upgrades of existing databases; migration to other relational databases)
  • Database performance optimization
Tools
  • Bash
  • Data Vault
  • Docker
  • git
  • Linux
  • Pandas
  • Python
  • Talend Studio
  • Terraform
Cloud
  • Amazon Web Services (AWS)
  • Google Cloud Platform (GCP)
Databases
  • BigQuery
  • Oracle DB
  • PostgreSQL
  • Snowflake
  • MariaDB
  • MySQL
  • MongoDB

Project and Work Experience

since 04/2020

Senior Data Engineer

Ubitricity - Gesellschaft für verteilte Energiesysteme mbH, Remote

The company is a Berlin-based provider of charging solutions and infrastructure for electric vehicles. My project task is to create a new data warehouse using cloud technologies to enable an aggregated and uniform view of the data from different source systems. The implementation uses Google Cloud BigQuery as the data warehouse solution, Terraform for infrastructure as code, an ELT approach with Data Vault modelling for loading the data warehouse, and Tableau as the visualization tool. The source data of approx. 10 GB is extracted primarily from MariaDB and MongoDB. Within the project, I plan, implement, and operate the entire process for building the data warehouse infrastructure and its loading processes; on the content side, I am supported by Product Owners and members of the Business Intelligence team.

Technologies used: Google Cloud, BigQuery, MariaDB, MongoDB, Terraform, Data Vault, Python
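For illustration, here is a minimal Python sketch of this kind of ELT load into a Data Vault hub in BigQuery. All connection strings, table, column, and project names are made-up placeholders, not the project's real schema.

    import hashlib

    import pandas as pd
    import sqlalchemy
    from google.cloud import bigquery

    # Extract a source table from MariaDB; connection string, table, and column
    # names are placeholders for illustration only.
    engine = sqlalchemy.create_engine("mysql+pymysql://user:password@mariadb-host/charging")
    df = pd.read_sql("SELECT charge_point_id, installed_at FROM charge_points", engine)

    # Data Vault style staging: derive a hash key from the business key for the
    # hub load and add the standard load metadata columns.
    df["hub_charge_point_hk"] = df["charge_point_id"].astype(str).map(
        lambda bk: hashlib.md5(bk.encode("utf-8")).hexdigest()
    )
    df["load_dts"] = pd.Timestamp.now(tz="UTC")
    df["record_source"] = "mariadb.charging.charge_points"

    # Load the staged frame into BigQuery; project, dataset, and table are assumptions.
    client = bigquery.Client(project="my-gcp-project")
    job = client.load_table_from_dataframe(df, "my-gcp-project.raw_vault.hub_charge_point")
    job.result()  # wait for the load job to complete

In the actual ELT setup, transformations beyond this staging step run inside BigQuery itself rather than in Python.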

08/2021 - 10/2023

Senior Data Engineer, Requirements Engineer

Remote

I worked for a large international group as part of a project team from a project service provider. My project task in the early stage was the implementation of complex KPI calculations using pandas and Jupyter notebooks. The goal was to decide whether the existing infrastructure for the KPI calculations could be replaced by a more modern infrastructure based on AWS and Snowflake. In the next project stage, we focused on the PoC implementation of the data pipelines for one KPI using AWS, Snowflake, and AWS Glue. The final project stage included the production implementation of the data pipelines for the whole product.

Technologies used: Python, Snowflake, AWS, Pandas, Jupyter Notebook
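A minimal sketch of the kind of KPI prototyping done in pandas and Jupyter notebooks; the input file, column names, and the KPI itself are hypothetical examples rather than the project's real metrics.

    import pandas as pd

    # Hypothetical input: one row per transaction with a timestamp and an amount column.
    transactions = pd.read_parquet("transactions.parquet")  # placeholder source

    # Example KPI: monthly revenue and month-over-month growth.
    monthly = (
        transactions
        .assign(month=transactions["created_at"].dt.to_period("M"))
        .groupby("month")["amount"]
        .sum()
        .rename("monthly_revenue")
        .to_frame()
    )
    monthly["mom_growth_pct"] = monthly["monthly_revenue"].pct_change() * 100

    print(monthly.tail())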

05/2021 - 08/2021

Data Analyst & Data Engineer

Remote

I worked for a large international group as part of a project team from a project service provider. My project task was to write database queries in Snowflake for data warehouse objects originating from different sources. Additionally, I aligned with the frontend and business teams to gather technical requirements.

Technologies used: Snowflake, Azure DevOps
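As a sketch of how such queries are run from Python against Snowflake, using the snowflake-connector-python package; connection parameters, table, and column names are placeholders, not the project's real objects.

    import snowflake.connector

    # Connection parameters are placeholders; in practice they come from a secrets store.
    conn = snowflake.connector.connect(
        account="my_account",
        user="my_user",
        password="my_password",
        warehouse="ANALYTICS_WH",
        database="DWH",
        schema="REPORTING",
    )

    # Illustrative query joining warehouse objects from different sources;
    # table and column names are made up for the example.
    sql = """
        SELECT c.customer_id,
               c.region,
               SUM(o.order_amount) AS total_order_amount
        FROM   dim_customer c
        JOIN   fct_orders   o ON o.customer_id = c.customer_id
        GROUP  BY c.customer_id, c.region
    """

    cur = conn.cursor()
    try:
        cur.execute(sql)
        for customer_id, region, total in cur.fetchmany(10):
            print(customer_id, region, total)
    finally:
        cur.close()
        conn.close()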

01/2021 - 04/2021

Database Engineer

Remote

I worked for a large international group as part of a project team from a project service provider. My project task was database support for Oracle DB 12.2 to check whether replication of ERP data could be realized from an Oracle database in AWS RDS (Amazon Web Services Relational Database Service) to Snowflake. The replication volume to be checked was approximately one billion transactions per day across 50+ ERP databases with an aggregated size of 42 TB.

Technologies used: Amazon Web Services, AWS RDS Oracle DB, AWS Data Migration Service, Snowflake
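To give an idea of how a replication task is set up with AWS Data Migration Service from Python via boto3, here is a hedged sketch. The ARNs, schema name, and task identifier are placeholders; a common pattern is an RDS Oracle source endpoint and an S3 target endpoint from which Snowflake ingests, and the project's exact target configuration is not spelled out here.

    import json

    import boto3

    dms = boto3.client("dms", region_name="eu-central-1")

    # Selection rule: replicate every table of a hypothetical ERP schema.
    table_mappings = {
        "rules": [
            {
                "rule-type": "selection",
                "rule-id": "1",
                "rule-name": "include-erp-schema",
                "object-locator": {"schema-name": "ERP", "table-name": "%"},
                "rule-action": "include",
            }
        ]
    }

    # The endpoint and instance ARNs below are placeholders.
    response = dms.create_replication_task(
        ReplicationTaskIdentifier="erp-oracle-replication-poc",
        SourceEndpointArn="arn:aws:dms:eu-central-1:123456789012:endpoint:SOURCE",
        TargetEndpointArn="arn:aws:dms:eu-central-1:123456789012:endpoint:TARGET",
        ReplicationInstanceArn="arn:aws:dms:eu-central-1:123456789012:rep:INSTANCE",
        MigrationType="full-load-and-cdc",  # initial full load plus ongoing change data capture
        TableMappings=json.dumps(table_mappings),
    )
    print(response["ReplicationTask"]["Status"])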

2015 - 2019

Senior Data Engineer

Avantgarde Labs GmbH, Germany (as employee)

My job included technical support for the ETL team of a large German e-commerce provider for electronic items; the ETL infrastructure consisted of 300 to 400 ETL jobs. I was responsible, on the one hand, for creating, changing, and optimizing ETL jobs using Talend Studio DI and, on the other hand, for planning ETL processes and prioritizing tasks for a team of up to 5 employees. Over the five years on the project, three different Talend versions (5.3.1, 6.2.1, 7.0.1) were used. When upgrading the Talend Studio versions, including the associated infrastructure components (Talend Administration Center, Nexus, git/SVN), I took over the technical coordination of the necessary steps together with internal and external staff. A central database of approximately 2 TB was used to combine the data from the various source systems; until 2017 it was an Oracle database in version 11g R2, after which the database was migrated to PostgreSQL 9.6 and the ETL infrastructure was moved to the cloud (Google Cloud Platform, using Cloud SQL for the database and VMs for the infrastructure components).

Technologies used: Google Cloud, Oracle DB, PostgreSQL, Talend, Docker
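As a simplified illustration of the table-level copy step in such an Oracle-to-PostgreSQL migration: the connection strings and the table name are placeholders, and a real migration additionally covers schema conversion, data type mapping, sequences, indexes, constraints, and validation.

    import pandas as pd
    import sqlalchemy

    # Placeholder connection strings for an Oracle 11g source and a
    # PostgreSQL 9.6 / Cloud SQL target.
    oracle = sqlalchemy.create_engine(
        "oracle+cx_oracle://etl_user:secret@oracle-host:1521/?service_name=DWH"
    )
    postgres = sqlalchemy.create_engine(
        "postgresql+psycopg2://etl_user:secret@cloudsql-host:5432/dwh"
    )

    # Copy one (hypothetical) table in chunks to keep memory usage bounded.
    for chunk in pd.read_sql("SELECT * FROM products", oracle, chunksize=50_000):
        chunk.to_sql("products", postgres, schema="public", if_exists="append", index=False)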

2009 - 2014

Data Engineer

Tele-Kabel-Ingenieurgesellschaft mbH, Germany (as employee)

My task was the project coordination and implementation of approx. 80 geodata migrations for supply networks (waste water, electricity, gas) for a Swiss GIS provider that performs GIS tasks for about 50 Swiss municipalities. The migrations took place from Oracle Database 9i to Oracle Database 11g R2. In addition, I carried out various customizations of Autodesk Topobase using VB.NET for the customer.

Technologies used: Oracle DB, Autodesk Topobase, VB.NET

Education

2003 - 2008

Diploma in Business Education

Technische Universität Dresden, Dresden, Germany

2005 - 2006

Study abroad

Mid Sweden University, Sundsvall, Sweden

Availability

  • Daily on-site work for customers in the Copenhagen area
  • Daily remote work for customers outside the Copenhagen area

My contact details

  • Address
  • Phone
  • E-mail