Your Impact
* Architect & Design: Design and model effective databases, data warehouses, and robust cloud infrastructure using cloud-native solutions such as BigQuery, Dataflow, or DBT Cloud, in collaboration with clients and internal teams.
* Build & Deploy: Create and manage reliable batch and streaming data pipelines, and deploy production-ready applications on the cloud using GCP, Azure, or AWS.
* Solve & Innovate: Translate business requirements into end-to-end technical solutions and effectively communicate your ideas to both technical and non-technical stakeholders.
* Learn & Grow: Continuously explore and apply new technologies, helping our team stay at the forefront of the data and ML landscape.
Our Tech Stack:
* Core Language: Python
* Cloud Platforms: GCP, Azure, and/or AWS
* Data Warehousing & Transformation: BigQuery, DBT
* Data Processing & Orchestration: Dataflow / Apache Beam, Airflow
* Infrastructure & Containers: Terraform, Kubernetes
* APIs & Web Frameworks: FastAPI, Flask / Connexion
Why you?
Minimum requirements:
* Foundation: A degree in Computer Science, Engineering, or a related field.
* Experience: A minimum of 2 years of relevant professional experience.
* Technical Skills: Experience with Python, SQL, and version control (Git), plus a solid understanding of ML and software engineering principles.
* Communication: Excellent verbal and written communication skills in English. You can confidently explain complex technical ideas to both technical and non-technical audiences.
* Mindset: A consultative mindset and a strong sense of ownership and initiative. You are a curious, proactive problem-solver who is always looking for new knowledge to improve your work.
Nice to have:
* Contribution to open-source projects
* Experience with Google Cloud, AWS or Azure
We're actively building a diverse team with a broad range of skills and backgrounds. This job description is a starting point; if you're passionate about this role and eager to learn, we strongly encourage you to apply.