AI Safety Research Opportunities
The AI Safety Research Group invites applications for PhD positions to contribute to the development of safe artificial intelligence.
Our research focuses on understanding novel behavior in frontier models and on helping developers avoid safety and security vulnerabilities. We are seeking motivated researchers who can empirically investigate large reasoning models and address research questions related to deception and situational awareness.
-----------------------------------
Required Skills and Qualifications
* A Master's degree or equivalent in a relevant field
* Familiarity with large reasoning models and their limitations
* Strong analytical and programming skills, with proficiency in Python and statistics
* Advanced English language skills
-----------------------------------
Benefits
The PhD positions offer high-potential researchers an exceptional opportunity to conduct impactful research and work on their dissertations. In addition, the positions benefit from support structures offered by the Graduate Academy of the University of Stuttgart and the Research Focus IRIS.
-----------------------------------
Other Information
We invite applicants from all fields, though a background in computer science is an advantage. The envisaged start date is January 1, 2026, but we are open to earlier availability. Please submit your application as a single PDF file, including a cover letter, CV, certificates, and reference letters.