Schwarz IT manages the entire digital infrastructure and all software solutions of the companies of the Schwarz Group. It is therefore responsible for the selection, provision, operation, and further development of IT infrastructures, IT platforms, and business applications. To give the business units optimal IT support for their processes, Schwarz IT gathers their requirements in consultations and works with them to develop professional, high-performance IT solutions.
Data Engineer Retail Media (m/w/d)
What you'll do
* Work in a cross-functional product team to design and implement data-centered features for Europe's largest ad network
* Help scale our data stores, data pipelines, and ETLs, which handle terabytes of data from one of the largest retail companies
* Design and implement efficient data processing workflows
* Extend our reporting platform for external customers and internal stakeholders to measure advertising performance
* Continue to develop our custom data processing pipeline and continuously look for ways to improve our technology stack as we scale
* Work with machine learning engineers and software engineers to build and integrate fully automated, scalable reporting, targeting, and ML solutions
* Work in a fully remote setup, while meeting your colleagues in person at company- and engineering-specific onsite events
What you’ll bring along
* 3+ years of professional experience working on data-intensive applications
* Fluency in Python and good knowledge of SQL
* Experience developing scalable data pipelines with Apache Spark
* Good understanding of efficient algorithms and the know-how to analyze them
* Curiosity about how databases and other data processing tools work internally
* Familiarity with git
* Ability to write testable and maintainable code that scales
* Excellent communication skills and a team player attitude
Great if you also have
* Experience with Kubernetes
* Experience with Google Cloud Platform
* Experience with Snowflake, BigQuery, Databricks, and Dataproc
* Knowledge of columnar databases and file formats like Apache Parquet
* Knowledge of "Big Data" technologies like Delta Lake
* Experience with workflow management solutions like Apache Airflow
* Affinity for data science tasks, e.g. prototyping reporting and ML solutions
* Knowledge of Dataflow / Apache Beam
We look forward to your application!
Lara Schlimgen · Referenz-Nr. 47068
Stiftsbergstraße 1 · 74172 Neckarsulm