About us
We're on a mission to enable industrial companies to establish real-time autonomous decision making in their planning and scheduling processes. While the amount of data available to industrial companies keeps growing, it is rarely used to its full potential. Our software can transform scheduling and planning processes, especially at production sites all around the world.
We work with some of the largest industrial companies in the process industry, such as BASF, Sun Chemical and more. Combined with our recent strong funding round (€4.5 million), it is time to extend our technical core team to get ready for the next scaling challenges 🚀
About the role
Data Engineers are vital in bringing our software to our customers. Working closely with Software and AI Engineers, you will implement our solution by integrating customer data sources. You will also work closely with AI Engineers to help build benchmarking and analytics pipelines for ML performance evaluation.
Something Special About the Team
We're building an A-Team, prioritizing quality, experience, and skill over quantity. By year-end, we aim to have a highly skilled team that delivers efficiently with less drama, stress, and effort than our competitors.
What You'll Achieve
* Writing and testing code that integrates data sources from customer systems into our solution.
* Analyzing customer data using pre-built tools and analytics code.
* Collaborating closely with AI Engineers on benchmarking and test dataset creation.
* Collaborating closely with AI Engineers to build MLOps tools that handle data from live instances.
* Writing code and tests for standard interfaces to our solution, in close collaboration with AI Engineers and Full Stack Developers.
* Conducting implementation projects together with Business Analysts to set new customers up with our solution.
* Operating data pipelines, including monitoring and similar tasks.
Skills You'll Need To Bring
* Advanced coding skills in Python and SQL (Microsoft SQL Server and PostgreSQL).
* Designing data/ETL pipelines and streaming architectures.
* Objectively evaluating architecture patterns and technologies in the realm of data transfer and storage.
* Writing tests and documentation for code.
* Working with common cloud services for data engineering (especially Azure, e.g. Azure Functions).
* Written and verbal communication in English (C1 level and above).
Nice To Haves
* Experience in Machine Learning and experiment analytics.
* Knowledge of infrastructure, deployment and orchestration for data pipeline solutions.
* Experience in mathematical optimization.
* Experience in data pipeline orchestration and management.
* Ability to speak German.
Compensation
Up to €75,000 annually plus significant stock options. We also offer a Deutschland Ticket for public transport and a Wellpass (eGym) membership for €10/month.
Remote Work Policy
Currently, the team meets in the office three days a week (Tuesday to Thursday). This may evolve based on team feedback and collaboration needs.
Top Notch Office
Located at Willy-Brandt-Straße 59, 20457 Hamburg, our office on the 8th floor offers panoramic city views.
Hardware
You’ll receive the latest MacBook Pro M4 Pro.
Location
akeno, Hamburg