Job Description
We are looking for a Senior Analytics Engineer to join our consumer product tribe on our journey to deliver amazing experiences.
At Delivery Hero, you’ll shape the global consumer experience for millions of users. Your work will have a direct impact on how we attract and retain customers, creating data-driven, personalised interactions that keep them coming back. As part of our Consumer Team, you’ll enhance customer satisfaction and drive global growth and profitability through innovative projects.
Our consumer product tribes build and scale the digital experience enjoyed by millions of consumers across 11 brands in 70+ countries. Our users can do a lot on our platform—order food delivery, buy groceries, find dine-in deals, discover new ad offerings from our partners, play games in our loyalty program (and more!)—and our goal is to make all this as delightful, personalised and seamless an experience as possible!
This is a hybrid role embedded within our consumer analytics team. You will act as a bridge between product analytics, data engineering, and external data teams, with the goal of shortening time to insight, improving data trustworthiness, and enabling greater self-service for analysts and end users alike. The person taking on this challenge must be a self-starter with a strong growth, get-things-done mindset. You will be expected to identify the root causes of analytics pain points, propose and drive solutions with external teams to address them, and be comfortable getting hands-on with quick fixes that unblock the business without losing sight of the longer-term vision.
This role is based in Berlin and the team is structured as cross-functional squads & tribes that focus on different aspects of our user experience.
In this role you will:
40% Data Architecture and Engineering:
- Understand data use cases, then architect and build reliable, curated data models and the associated production pipelines that enable analytical agility and experimentation
- Optimise large-scale ingestion of backend and frontend events by shaping how our data capture processes are designed, developed, and maintained
- Own the dbt + Airflow framework and data quality tooling for product analytics (see the orchestration sketch after this list)
40% Data Management:
- Partner with our data foundations team to improve our foundational datasets and data infrastructure
- Build a solid understanding of the company's data ecosystem as a whole, focusing primarily on discovery and helping our team of analysts understand and investigate the data
- Define and own data quality metrics, configure data quality tooling, investigate data quality issues, and work with upstream data owners to resolve them
- Champion high-quality data governance and querying through knowledge sharing, workflow optimisations, and the hiring and onboarding of talent
- Maintain good documentation of our data
20% Visualisation & Dashboard Maintenance:
- Create and maintain dashboards (e.g. a data quality monitoring dashboard)
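To give a flavour of the dbt + Airflow ownership described above, here is a minimal sketch of a daily DAG that builds the curated models and then gates downstream consumers on the project's data quality tests. It assumes Airflow 2.x with the BashOperator; the DAG id, paths, owner, and schedule are hypothetical placeholders, not our actual setup.

```python
# A minimal sketch, assuming Airflow 2.x and a dbt project checked out on the
# worker; the DAG id, paths, owner, and schedule below are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/opt/dbt/product_analytics"  # hypothetical project location

default_args = {
    "owner": "consumer-analytics",
    "retries": 2,
    "retry_delay": timedelta(minutes=10),
}

with DAG(
    dag_id="product_analytics_dbt",
    default_args=default_args,
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 5 * * *",  # daily, after upstream ingestion lands
    catchup=False,
) as dag:
    # Build the curated data models
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"dbt run --project-dir {DBT_DIR} --profiles-dir {DBT_DIR}",
    )

    # Gate downstream consumers on the project's data quality tests
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"dbt test --project-dir {DBT_DIR} --profiles-dir {DBT_DIR}",
    )

    dbt_run >> dbt_test
```

Splitting run and test into separate tasks keeps failures visible in the Airflow UI and lets alerting hook into the test step alone.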
Qualifications
- 4+ years of relevant analytics/data engineering experience in rapidly growing and dynamic environments
- Solid production-grade experience with workflow management tools (primarily Airflow) and config-driven data build tooling (primarily dbt) to deliver end-to-end data pipelines
- Strong skills in schema design, dimensional data modelling, SQL, and working with large datasets
- Experience enabling data quality observability, monitoring, and alerting (see the sketch after this list)
- Ability to drive and manage medium-scale project initiatives and to work with external teams on larger projects
- Basic experience with data visualisation tools (e.g. Looker, Tableau)
- You aim for clean, well-engineered solutions while keeping an eye on simplicity and pragmatism
- You are highly passionate about data, with creative problem-solving abilities and an eye for detail
- You have good English communication skills: you can explain complex technical projects to the team and summarise them for non-technical stakeholders
- You take ownership of your tasks and support the team, embracing sharing and collaboration
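To make the observability, monitoring, and alerting expectation more tangible, below is a minimal sketch of a table freshness check with a pluggable warehouse client and alert hook. The SQL uses BigQuery's TIMESTAMP_DIFF for illustration, and run_query and notify are hypothetical stand-ins rather than any specific vendor API.

```python
# A minimal sketch of the kind of freshness check behind "data quality
# observability, monitoring and alerting". The warehouse client and alert
# hook are hypothetical stand-ins, not a specific vendor API.
from dataclasses import dataclass
from typing import Callable

@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str

def freshness_check(
    run_query: Callable[[str], list[tuple]],
    table: str,
    ts_column: str,
    max_lag_hours: float,
) -> CheckResult:
    """Fail when the newest row in `table` is older than `max_lag_hours`."""
    rows = run_query(
        f"SELECT TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), MAX({ts_column}), HOUR) "
        f"FROM {table}"
    )
    lag = rows[0][0]
    return CheckResult(
        name=f"freshness:{table}",
        passed=lag is not None and lag <= max_lag_hours,
        detail=f"lag={lag}h, threshold={max_lag_hours}h",
    )

def alert_if_failed(result: CheckResult, notify: Callable[[str], None]) -> None:
    # Route failures to the team's alerting channel (e.g. a Slack webhook).
    if not result.passed:
        notify(f"[data-quality] {result.name} FAILED ({result.detail})")
```

Checks like this can run as a final task in a DAG like the one sketched earlier, turning the data quality metrics the role owns into pass/fail signals.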
Nice to have
- Familiarity with BigQuery and Google Cloud Platform (see the sketch below)
- Familiarity with data quality tooling (e.g. Great Expectations, Monte Carlo)
- Familiarity with experimentation tooling and techniques (e.g. Eppo)
- Python
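For the BigQuery familiarity above, here is a short sketch of how a check like the freshness_check sketched earlier could execute against the warehouse using the google-cloud-bigquery client; the project, dataset, and table names are hypothetical.

```python
# A minimal sketch using the google-cloud-bigquery client; the project,
# dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project

def run_query(sql: str) -> list[tuple]:
    # Execute a query and return the result rows as plain tuples.
    return [tuple(row.values()) for row in client.query(sql).result()]

rows = run_query(
    "SELECT COUNT(*) FROM `my-analytics-project.curated.orders` "
    "WHERE order_date = CURRENT_DATE()"
)
print(f"orders loaded today: {rows[0][0]}")
```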