Responsibilities
1. Design and implement scalable AI architectures in hybrid and cloud environments (Azure, AWS, GCP)
2. Integrate production systems (SCADA, MES, DCS) and IIoT components using OPC UA, MQTT, and Modbus
3. Develop secure, high-performance data pipelines from OT/edge environments to data lakes and cloud platforms (a minimal ingestion sketch follows this list)
4. Support real-time applications for monitoring, anomaly detection, early warning, and forecasting
5. Define interfaces (REST/gRPC) and workflows for machine learning model deployment and orchestration (e.g., MLflow, Kubeflow)
6. Collaborate with ML Engineers, Data Scientists, and DevOps teams to deliver end-to-end AI solutions
7. Use SQL for data modeling, transformation, and integration
8. Contribute to data governance, lineage, and compliance (e.g., GDPR, ISO/IEC 27001)
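To make the work described in items 2 and 3 concrete, here is a minimal, non-authoritative sketch of an OT-to-cloud bridge in Python. It assumes paho-mqtt (2.0 or later) and requests; the broker host, topic hierarchy, and ingest URL are hypothetical placeholders, not part of any stack named in this posting.

```python
"""Sketch: subscribe to IIoT telemetry over MQTT at the edge and forward
validated batches to a (hypothetical) cloud ingest endpoint over HTTPS."""
import json

import paho.mqtt.client as mqtt
import requests

BROKER_HOST = "edge-broker.local"              # placeholder edge broker
SENSOR_TOPIC = "plant/line1/+/telemetry"       # placeholder topic hierarchy
INGEST_URL = "https://example.invalid/ingest"  # placeholder cloud endpoint
BATCH_SIZE = 50

_buffer: list[dict] = []

def on_connect(client, userdata, flags, reason_code, properties):
    # Subscribe once the connection to the edge broker is established.
    client.subscribe(SENSOR_TOPIC, qos=1)

def on_message(client, userdata, message):
    # Validate the JSON payload, buffer it, and flush to the cloud in batches.
    try:
        reading = json.loads(message.payload)
    except json.JSONDecodeError:
        return  # drop malformed frames; a real pipeline would dead-letter them
    reading["topic"] = message.topic
    _buffer.append(reading)
    if len(_buffer) >= BATCH_SIZE:
        requests.post(INGEST_URL, json=_buffer, timeout=10)
        _buffer.clear()

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER_HOST, 1883)
client.loop_forever()
```

Batching keeps round trips to the cloud endpoint low; a production version of these pipelines would add TLS, broker authentication, local buffering, and a dead-letter path rather than silently dropping malformed frames.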
Your Profile
1. Several years of experience designing data-driven system architectures, ideally in an industrial environment
2. In-depth knowledge of industrial protocols and OT/IT integration (e.g., SCADA, OPC UA, edge devices)
3. Expertise in edge computing, network security, and infrastructure design
4. Proficiency in SQL and familiarity with BI tools such as Power BI
5. Experience with MLOps pipelines and AI lifecycle platforms (e.g., MLflow, SageMaker, Vertex AI)
6. Hands-on project experience in industrial or manufacturing settings
7. Familiarity with IoT platforms, edge AI, or SCADA/MES systems
8. Knowledge of modern data platforms (e.g., Snowflake, Delta Lake, BigQuery)
9. Experience with orchestration tools such as Apache Airflow or dbt (see the sketch after this list)
10. Practical experience deploying AI services in edge computing environments
11. Strong communication skills with the ability to translate complex technical requirements into actionable plans
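As an illustration of the orchestration experience asked for in item 9, the following is a minimal sketch of a daily TaskFlow DAG, assuming a recent Airflow 2.x installation. The DAG name, sample readings, and target table are hypothetical placeholders.

```python
"""Sketch: a daily Airflow DAG that extracts edge telemetry, applies a simple
transformation, and loads the result into a (hypothetical) warehouse table."""
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["iiot"])
def edge_telemetry_daily():

    @task
    def extract() -> list[dict]:
        # A real task would read from an edge buffer, historian, or object
        # store; a static sample stands in here.
        return [{"sensor": "temp_01", "value_c": 71.4},
                {"sensor": "temp_02", "value_c": 69.9}]

    @task
    def transform(readings: list[dict]) -> list[dict]:
        # Example transformation: add a Kelvin column for downstream models.
        return [{**r, "value_k": round(r["value_c"] + 273.15, 2)} for r in readings]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder for an INSERT into the warehouse, e.g. via a SQL hook.
        print(f"Loading {len(rows)} rows into telemetry_curated")

    load(transform(extract()))

edge_telemetry_daily()
```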
Application Process