Key Data Engineering Role
We are seeking an experienced data engineer to lead our data pipeline and integration projects.
* You will design, develop, and maintain complex data pipelines and integrations using Python, Java, Kafka, and TimescaleDB.
* You will collaborate with product managers, data architects, and developers to deliver efficient data solutions.
* You will develop and optimize data models and schemas for performance, scalability, and reliability.
* You will write and maintain intricate SQL queries to support analytics, reporting, and data transformations.
* You will implement and maintain APIs and data services for seamless data integration and accessibility.
* You will ensure the reliability, performance, and efficiency of data workflows and systems.
Your profile:
* Bachelor's degree in Computer Science, Software Engineering, or a related field.
* At least 4 years of experience in data engineering or backend software development, with a focus on data pipelines, integration, and microservices architecture.
* Strong proficiency in Python and Java, including experience with Spring Boot and data processing libraries.
* Hands-on experience with event-driven architectures and tools such as Kafka.
* Expertise in database systems, data modelling, and query optimization.
* Familiarity with time-series databases such as TimescaleDB is an advantage.
We offer a dynamic work environment, opportunities for growth and development, and a competitive compensation package.