We digitize decisions with data—would you like to get involved?
We shape our customers' data-driven future with scalable, secure, and automated Azure platforms. As a Professional Data Platform Engineer, you will be responsible for complex data pipelines: from zero-ETL replication of operational data (mirroring) to Delta Lake-based lakehouses. Our vision: self-service data access for all departments, supported by DataOps and governance.
Planning, developing, and maintaining robust batch and streaming pipelines with Azure Data Factory, MS Fabric, and Databricks
Ensuring data quality, lineage, and security—using Microsoft Purview and role-based access control
Collaborating with data scientists, product owners, and customers to translate requirements into scalable data solutions
Trust-based working hours, hybrid working, and remote work possible (residence in Germany required)
2–5 years of experience in data engineering, preferably with Azure Data Services (Data Factory, Databricks, MS Fabric)
Very good SQL and Python skills; experience with Delta Lake, streaming (Event Hubs, Kafka), and data modeling
Very good German and good English skills