Job Description
* Design, build, and maintain scalable data pipelines using Azure Data Services, Databricks and/or Microsoft Fabric.
* Consolidate structured/unstructured data into governed lakes and warehouses for BI and AI use cases.
* Implement robust data models and storage architectures (Star, Snowflake, Medallion).
* Ensure data integrity, quality, lineage, security, and governance across the full data lifecycle.
* [Optional] Automate workflows using Azure DevOps, GitHub Actions, or other CI/CD tools.
* Collaborate in client workshops, translating requirements into technical Azure-native solutions.
* Optimize performance and cost efficiency of the data infrastructure.
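As an illustration of the dimensional modeling named above (Star/Snowflake), a star schema keeps measures in a fact table and descriptive attributes in dimension tables joined by surrogate keys. A minimal sketch in plain Python, with purely hypothetical table contents and column names:

```python
# Star-schema sketch: a fact table referencing a dimension table by key.
# All data here is illustrative, not from any real system.

# Dimension table: product attributes keyed by surrogate key
dim_product = {
    1: {"name": "Widget", "category": "Hardware"},
    2: {"name": "Gadget", "category": "Electronics"},
}

# Fact table: one row per sale, holding a measure plus a dimension key
fact_sales = [
    {"product_key": 1, "amount": 19.99},
    {"product_key": 2, "amount": 49.50},
    {"product_key": 1, "amount": 19.99},
]

def revenue_by_category(facts, dim):
    """Join facts to the dimension and aggregate the measure."""
    totals = {}
    for row in facts:
        category = dim[row["product_key"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["amount"]
    return totals

print(revenue_by_category(fact_sales, dim_product))
```

In a warehouse the same join-and-aggregate would be expressed in SQL over dedicated fact and dimension tables; the sketch only shows the shape of the model.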
Example Use Cases You Will Work On:
* Modernize legacy ETL workflows with Azure-native services.
* Build semantic models for enterprise BI using Star/Snowflake schema.
* Design medallion-structured ingestion flows to enable batch or near-real-time analytics.
* Deliver curated, governed data products for BI and AI use cases.
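The medallion-structured ingestion mentioned above refines data in successive layers: bronze lands raw input unchanged, silver cleans and deduplicates it, gold exposes a business-ready view. A simplified, engine-agnostic sketch (in practice each layer would typically be a Delta table; field names here are illustrative):

```python
# Medallion-architecture sketch: bronze -> silver -> gold.
# Plain Python lists stand in for the storage layer.

raw_events = [
    {"id": "a1", "user": " alice ", "amount": "10"},
    {"id": "a1", "user": " alice ", "amount": "10"},  # duplicate delivery
    {"id": "b2", "user": "bob", "amount": "5"},
]

def bronze(events):
    """Land raw input as-is, preserving everything for replay."""
    return list(events)

def silver(events):
    """Trim strings, cast types, and drop duplicate ids."""
    seen, out = set(), []
    for e in events:
        if e["id"] in seen:
            continue
        seen.add(e["id"])
        out.append({"id": e["id"], "user": e["user"].strip(),
                    "amount": int(e["amount"])})
    return out

def gold(events):
    """Aggregate the cleaned data into a per-user revenue view."""
    totals = {}
    for e in events:
        totals[e["user"]] = totals.get(e["user"], 0) + e["amount"]
    return totals

print(gold(silver(bronze(raw_events))))
```

The same layering applies to batch and near-real-time flows alike; only the trigger and latency of each refinement step change.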
Qualifications
Must-Haves:
* 3+ years of hands-on project experience as a Data Engineer in the Azure ecosystem.
* Fluent in German and English.
* Advanced SQL skills, including query performance tuning.
* Strong background in dimensional data modeling and familiarity with ETL patterns such as the medallion architecture.
* Hands-on experience with Azure Data Factory, Synapse, Databricks, and ideally Microsoft Fabric.
Nice-to-Haves:
* Proficiency in Microsoft Power BI.
* Exposure to Data Mesh or domain-oriented data architectures.
* Experience with Delta Lake, Unity Catalog, or Feature Store.
Soft Skills
* Strong communication and documentation skills (German & English).
* Agile, delivery-oriented mindset.
* Collaborative and self-directed approach.
* Analytical thinking with a focus on value delivery.
* Ability to translate business requirements into technical solutions.
* Experience working in agile, cross-functional teams.
Additional Information
You will be part of a collaborative, remote-friendly team that values continuous learning and
delivering impact through modern cloud-native data solutions.