We digitize decisions with data—would you like to get involved?
As a Senior Data Platform Engineer, you will architect modern lakehouse platforms and enable self-service data access. You will combine Azure services (OneLake, Fabric, Databricks, Mirroring) into scalable end-to-end solutions and implement governance, security, and cost-optimization strategies. Your work lays the foundation for successful AI projects.
Tasks & Responsibilities
* Architecture of lakehouse platforms with Fabric/OneLake, Synapse, and Delta Lake; introduction of zero-ETL data integration via mirroring
* Implementation of Unity Catalog for centralized data management and fine-grained access control, as well as integration with Purview
* Building DataOps frameworks: using Delta Live Tables, serverless SQL pools, and automated monitoring tools for performance and cost management
* Development of security and compliance concepts; integration of VNet, Private Link, Key Vault, and Azure AD
* Mentoring of junior engineers, leading code reviews, assisting with architecture decisions
* Advising stakeholders on data strategy, technology selection, and business impact
What we offer you
* Flexible working: Trust-based working hours, hybrid working, and remote working possible (residence in Germany)
* State-of-the-art technology stack: Work with Fabric, Delta Lake, Databricks, Mirroring—ideal for tech-savvy juniors
* Targeted professional development: Dedicated working time for training, certifications, and mentoring
* Innovation space: Opportunity to test new tools and frameworks and develop proof-of-concepts
* Diversity & inclusion: We welcome all applicants and promote an inclusive environment; your ideas are important to us
* Leadership influence: You will shape technical strategies, establish governance processes, and drive innovation in a rapidly growing team
What you bring
* At least 5 years of experience building and operating Azure data platforms (Data Lake, Databricks, Synapse/Fabric) and managing projects
* In-depth knowledge of data architecture, distributed systems, data modeling, and performance optimization
* Experience with infrastructure as code, CI/CD, cost optimization, and DevOps
* Strong communication and leadership skills; ability to explain complex concepts clearly to business partners
* Fluent German and very good English skills
If you want to take on responsibility, enjoy working with the latest Azure technology, and value an open, learning-oriented team culture, we look forward to receiving your application!