Data Engineer - AI Pipelines & DataOps

Düsseldorf
Intersnack IT KG
Data Engineer
Posting online since: 30 April
Description

We Want You to Grow With Us

High-quality, reliable data is the foundation on which every AI use case is built, and this role is responsible for making that foundation unshakeable. As our Data Engineer for AI Pipelines & DataOps, you will design and deliver the ingestion pipelines, streaming architectures, and APIs that feed Intersnack's AI and analytics systems with the data they need to perform at scale. You will report into the AI Programme and work in close collaboration with AI engineers, data architects, and business teams, with a particular focus on manufacturing environments, where edge device data presents unique ingestion challenges. At Intersnack, we build on a solid digital foundation, and your engineering work will be central to extending that foundation into the AI era.


What We Can Offer

This role offers the opportunity to work across a genuinely diverse data landscape, from manufacturing edge devices and IoT sensors to structured enterprise data systems across procurement and sales, giving you a breadth of technical challenge that few data engineering roles can match. You will have direct influence over how AI and analytics use cases are enabled across a €4.5bn business, with the autonomy to define pipeline patterns, API standards, and DataOps practices that others will build on. Collaboration is at the heart of how we work, and you will be embedded in a programme team that spans data science, AI architecture, and business enablement. Düsseldorf is the home base, with flexibility for remote working.


How You Will Spend Your Time as Our Next Data Engineer - AI Pipelines & DataOps

You will design and build the data infrastructure that powers Intersnack's AI programme, from scalable ingestion pipelines that handle both structured enterprise data and unstructured signals from manufacturing environments, to the APIs that expose data, models, and AI services for consumption across the organisation. Your work will span architecture, implementation, and operations, with a strong focus on quality, observability, and continuous improvement.

What You Will Do

* Design and implement scalable data ingestion pipelines for structured and unstructured data sources, including manufacturing systems, edge devices, and enterprise data platforms, ensuring consistent data quality from source to consumption

* Build and maintain both batch and streaming data pipelines for analytics and AI use cases, leveraging cloud-native tooling on Microsoft Azure and/or AWS

* Design and expose REST or GraphQL APIs for data assets, machine learning model endpoints, and AI services, enabling reliable, governed consumption by internal applications and analytical systems

* Implement CI/CD practices and DataOps principles across pipeline development and deployment, supporting automated testing, versioning, and release management for data infrastructure

* Ensure data quality, lineage, and observability across all pipelines, implementing monitoring and alerting that surfaces data issues before they affect AI or analytics outputs

* Support the integration of manufacturing and edge device data, including IoT and OT systems, into the central data platform, addressing the specific latency, format, and volume challenges of operational technology environments

* Collaborate with data scientists and AI engineers to design and optimise data flows that support model training, inference, and knowledge retrieval pipelines

* Apply security-by-design practices to all pipeline and API design, including access control, encryption, and protections against data leakage, in line with Intersnack's sovereignty and compliance standards

* Contribute to the AI literacy and enablement programme by supporting colleagues in understanding data pipeline health, data quality standards, and the role of reliable data in AI outcomes
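To make the "data quality from source to consumption" responsibility above concrete, here is a minimal, illustrative sketch in Python of a validation gate that an ingestion pipeline might apply to raw sensor records before loading them, routing failures to a dead-letter list. Field names and the record shape are hypothetical examples, not Intersnack's actual schema or tooling.

```python
# Illustrative sketch only: a validation gate for raw manufacturing
# sensor records. The field names below are hypothetical, not a real schema.

REQUIRED_FIELDS = {"sensor_id", "timestamp", "value"}

def validate_record(record: dict) -> tuple[bool, str]:
    """Return (ok, reason) for one raw record."""
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        return False, f"missing fields: {sorted(missing)}"
    if not isinstance(record["value"], (int, float)):
        return False, "value is not numeric"
    return True, "ok"

def partition_batch(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split a batch into loadable records and a dead-letter list,
    annotating rejected records with the rejection reason."""
    good, bad = [], []
    for r in records:
        ok, reason = validate_record(r)
        if ok:
            good.append(r)
        else:
            bad.append({**r, "_reason": reason})
    return good, bad
```

In a production pipeline this logic would typically live in a managed framework (e.g. Spark or a cloud-native service) rather than plain functions, but the pattern — validate early, quarantine rather than drop — is the same.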


Essential Skills & Experience

* Proven experience designing and implementing production-grade data pipelines using cloud-native services on Microsoft Azure (e.g., Azure Data Factory, Azure Event Hubs, Databricks) and/or AWS (e.g., AWS Glue, Kinesis, Lake Formation)

* Strong proficiency in at least one pipeline or transformation framework (e.g., Apache Spark, dbt, Apache Kafka, or equivalent) and a programming language such as Python or Scala

* Solid hands-on experience with SQL and NoSQL databases (e.g., PostgreSQL, Cosmos DB, MongoDB, or equivalent), including data modelling and query optimisation for both transactional and analytical workloads

* Experience building and maintaining streaming and batch pipelines for analytics and AI applications, with an understanding of trade-offs between the two approaches

* Ability to design and implement REST or GraphQL APIs for data and model serving, with an understanding of API versioning, documentation, and governance

* Familiarity with DataOps and CI/CD practices as applied to data engineering, including automated testing, pipeline orchestration, and infrastructure-as-code

* Experience with data quality tooling, lineage tracking, and observability frameworks (e.g., Great Expectations, OpenLineage, or equivalent)

* Working knowledge of AI and data security fundamentals, including data access controls, encryption, and risks such as data exfiltration in pipeline contexts

* Awareness of GDPR, EU AI Act, and EU data sovereignty requirements and their implications for data infrastructure design

* A strong command of spoken and written English is required; knowledge of German is considered an advantage
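As a flavour of the observability skills listed above, the following is a minimal sketch in plain Python of the kind of lightweight data-quality metric a pipeline might emit and alert on. The column names and threshold are hypothetical examples; dedicated tools such as Great Expectations provide richer versions of this idea.

```python
# Illustrative only: a null-rate metric plus a simple alerting check.
# Columns and the 5% threshold are hypothetical, not a specific tool's API.

def null_rates(rows: list[dict], columns: list[str]) -> dict[str, float]:
    """Fraction of missing (None) values per column across a batch of rows."""
    if not rows:
        return {c: 0.0 for c in columns}
    return {
        c: sum(1 for r in rows if r.get(c) is None) / len(rows)
        for c in columns
    }

def breaches(rates: dict[str, float], threshold: float = 0.05) -> list[str]:
    """Columns whose null rate exceeds the alerting threshold."""
    return [c for c, rate in rates.items() if rate > threshold]
```

Surfacing such metrics per pipeline run, and alerting when they breach a threshold, is what lets data issues be caught before they reach AI or analytics outputs.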

Valuable Experience

* Experience working with manufacturing or operational technology (OT) data environments, including IoT/edge device data ingestion and time-series data handling

* Familiarity with Microsoft Fabric, OneLake, or Azure Purview for unified data platform management

* Exposure to MLOps practices, including data versioning, feature stores, or model monitoring pipelines

* Experience with Terraform or other infrastructure-as-code tooling for scalable, repeatable data infrastructure deployment

* Background in FMCG, manufacturing, or supply chain, providing context for the operational data challenges typical in these environments

Important: Please note that a valid work and residence permit is required for non-EU applicants for this position.

About Intersnack IT
Intersnack IT KG is a member of the Pfeifer & Langen Industrie- und Handels-KG group of companies and a sister company to Intersnack Group. Established through the international harmonisation and centralisation of Intersnack Group's IT estate, we are responsible for all group-wide IT services for and within Intersnack Group. Our goal is to provide the common IT infrastructure, aligned IT services, and business solutions that Intersnack requires. Building on a solid digital foundation, Intersnack IT KG acts as a partner to all Intersnack functions, actively contributing to Intersnack's business strategy. Explore exciting career opportunities and learn more by visiting our website.
About Intersnack Group
Intersnack has become one of Europe's leading savoury snacks producers by 'creating happy snacking moments' in people's lives. Being privately owned, we operate with a long-term view and commit ourselves to a more sustainable world. Growing successfully and sustainably, we achieved a turnover of more than €4.5bn in 2024. We are now present in more than 30 countries across Europe and beyond, with 12 regional Management Units, 45 production sites, and a total workforce of approximately 15,000 people worldwide. For further company insights, please visit our website.

If you want to become part of our dynamic food industry success story, you’ll find all sorts of opportunities at Intersnack. Join our team and help us to grow and celebrate our successes together!
