Job Description
We are on the lookout for a Data Engineer to join the Quick Commerce team on our journey to always deliver amazing experiences.
As a Data Engineer, your primary mission will be to build and improve our large-scale data models in Google Cloud. This involves diving deep into our data platform to break down platform and pipeline complexities, identify friction points, and optimize systems for better performance and scalability. You will be part of the team directly responsible for managing the data platform’s infrastructure components, continuously monitoring and optimizing pipelines, queries, and databases to ensure high availability and performance.
Day-to-day, you will be hands-on in designing, building, and maintaining scalable ingestion pipelines and ETL processes using SQL and Python, with a strong focus on ensuring high data quality, integrity, and low latency. This role also carries responsibility for governance and standards: you will promote best coding practices and participate in standardization routines for data foundations, analytics, and data science. This extends to collaborating on data catalogs and access controls to ensure all data processes adhere to strict accuracy, privacy, security, and compliance standards.
You will work as a key collaborator, establishing Data Contracts between our numerous Domain Data Units and shaping efficient ways of working between various stakeholder groups. You will also work closely with cross-functional teams to define data requirements and implement or enhance data models specifically for our Data Mesh environment.
Responsibilities
1. Manage the data platform’s infrastructure components, continuously enhancing and monitoring them as well as optimizing pipelines, queries, and databases for high availability and performance.
2. Design, build, and maintain scalable data pipelines and ETL processes using SQL and Python, ensuring high data quality, integrity, and low latency.
3. Work with cross-functional teams to define requirements and implement/enhance data models for a Data Mesh environment.
4. Collaborate on data catalogs and access controls, ensuring all data processes adhere to accuracy, privacy, security, and compliance standards.
5. Build and deploy foundational data products for data science and analytics teams, troubleshoot data quality problems, and provide technical support for the BI toolbox.
Be part of redefining how customers experience quick commerce. You’ll help build technology that scales our non-food offerings, reaching new market segments and driving revenue growth. By innovating within our Quick Commerce Team, you’ll make Delivery Hero the go-to platform for a broad range of products, helping us grow faster and deliver more value to customers around the world.
Qualifications
1. 3+ years of experience in data or analytics engineering or a related field, with a focus on developing and maintaining large-scale data pipelines.
2. Proven ability to design and build complex data models in a Data Mesh architecture.
3. Strong programming skills in SQL and Python.
4. Good understanding of an IaC tool such as Terraform.
5. Experience with orchestration and streaming tools such as Apache Airflow and Apache Kafka.
6. Experience with at least one major cloud platform: GCP, AWS, or Azure.
Nice to Have
1. Experience implementing the Data Mesh model in a production environment, along with data lakes.
2. Experience with data visualization tools such as Tableau, Looker, or Power BI.
3. Experience in the e-commerce sector.
4. Strong analytical and problem-solving skills, with a keen eye for detail.