Your Job
* Support the team lead with team capacity planning (short- and mid-term planning), e.g. estimation of new backlog items, planning of iterations and refinements, and program increment (PI) planning.
* Accountability for the consistent implementation of the solution design within the development team, e.g. use of standard technical solutions.
* Accountability for quality assurance of the refined and implemented technical solutions with respect to quality goals, e.g., scalability, maintainability, security.
* Evolving our globally used, data-intensive core business applications hosted in Azure with respect to automation, monitoring, and performance.
* Collaborating with the solution architects in the team to support the mid-term evolution of new cloud-based solutions.
* Interfacing with the central BDAP (Big Data Platform) team and related services (Cloud Center of Excellence, Networking, IT Security) to continuously improve the systems in your area of responsibility.
* Developing data workflows and tasks using, e.g., Databricks.
* Optimizing our Azure resource estate with respect to operating costs.
* Working with business units and Corporate Underwriting to plan and execute the release of new product versions.
* Ensuring stable and secure operation of our cloud-native applications, including third-level support for these systems.
* Cloud-service troubleshooting, including analysis and problem solving.
* Overseeing important operating measures in collaboration with the release manager, e.g., replacement of certificates, rollover of secrets, and monitoring of breaking Azure changes.
* Monitoring of cloud and DevOps infrastructure.
* Development and maintenance of Azure DevOps pipelines, e.g., YAML pipeline definitions.
* Creating and updating documentation for relevant components, e.g. data engineering pipelines.
Your Profile
* University degree, preferably in computer engineering / IT, or a similar qualification.
* Demonstrated experience in software design and data architecture.
* Proficient in managing compute-intensive, data-driven, and large-scale systems, with an emphasis on state-of-the-art security practices.
* Expertise in developing data engineering pipelines with Databricks.
* Strong understanding of Azure services, including but not limited to Azure Data Factory, Databricks, Azure Data Lake, Azure Blob Storage, Azure Functions/App Services, Azure SQL, SignalR, CosmosDB (Gremlin/SQL), and API Management.
* Familiarity with Data Vault modeling and the Delta Lake storage format is highly desirable.
* Flexibility and a curious, creative mindset: open to new things and able to propose innovative ideas.
* Customer focus and the willingness to understand the reinsurance business domain and to become a trusted expert partner to our users.
* Commitment to driving topics forward and a desire for life-long learning.
* Experience with agile ways of working in a highly regulated IT environment, from a DevOps perspective.