Job Overview
We are seeking a skilled Data Engineer to join our team and contribute to the development of our modern data analytics platform. As a key member of our Data Analytics & Integration team, you will play a crucial role in shaping the future of our organization by designing, implementing, and maintaining high-performance data pipelines.
Your primary responsibilities will include setting up and operating our new data analytics platform, utilizing cutting-edge technologies such as Databricks, dbt, Airflow, Terraform, and Azure Data Services. You will also be responsible for managing our existing business intelligence landscape and assisting in the transition to the new architecture.
In this role, you will work closely with cross-functional teams to translate business requirements into scalable technical solutions. Your expertise in data integration, pipeline design, cloud and on-premise architectures, and collaboration with business departments will be instrumental in driving the success of our data-driven initiatives.
Responsibilities
* Design, implement, and maintain high-performance data pipelines using Databricks, dbt, Airflow, and other relevant tools
* Collaborate with cross-functional teams to translate business requirements into scalable technical solutions
* Manage our existing business intelligence landscape and assist in the transition to the new architecture
* Work closely with stakeholders to understand their needs and develop tailored solutions
* Stay up to date with industry trends and emerging technologies to ensure the continued success of our data analytics platform
Requirements
* Bachelor's degree in Business Informatics, Data Engineering, Data Management, or a related field
* Several years of experience in data engineering, ideally with a focus on Databricks and Azure; experience with SQL Server, SSIS, SSAS, and Power BI is desirable
* Initial hands-on experience in developing and operating complex data platforms, including data integration, pipeline design, and cloud and on-premise architectures, as well as close collaboration with business departments to translate requirements into scalable technical solutions
* Strong knowledge of SQL, Python, ETL design, dbt, Airflow, Infrastructure as Code (Terraform), CI/CD (GitHub), and monitoring (e.g., Datadog, Grafana, Prometheus)
* An understanding of data governance, Unity Catalog, cost monitoring, and metadata management is an advantage
* Initial experience with Jira and Confluence is desirable
Benefits
* A hybrid data landscape combining proven on-premise technologies (SQL Server, SSIS, SSAS, Power BI) with a modern data analytics platform, currently under development, built on technologies such as Databricks, dbt, Airflow, Azure Data Services, Terraform, GitHub, Python, and Fabric
* A motivated team with principles such as trust, personal responsibility, transparency, and continuous development
* A structured operating process that distributes knowledge and ensures stability
* An environment that promotes innovation and offers space for new ideas
* An attractive compensation package aligned with your position and market conditions
* Company pension scheme and subsidies for additional retirement provisions
* Long-term working-time accounts that allow you to accumulate time off
* Subsidized company restaurant and complimentary hot and cold beverages
* Company health management and a job bike leasing program
* Free parking spaces and charging stations for electric cars