What you contribute
* You have a strong background in machine learning and deep learning techniques
* You are enrolled in a Master's program, e.g., in computer science, electrical engineering, mechanical engineering, optics and photonics, or a comparable field
* You are familiar with generative models
* You also enjoy learning about new topics and contributing your own ideas
What we offer
* Good public transport connections
* An open and collegial working atmosphere and intensive support
* Work at the interface between now and the future
* Collaboration with industry partners
* The opportunity to work remotely (in consultation with your manager)
* A high degree of personal responsibility and the opportunity to contribute and implement your own ideas
We value and promote the diversity of our employees' skills and therefore welcome all applications, regardless of age, gender, nationality, ethnic and social origin, religion, ideology, disability, or sexual orientation and identity. Severely disabled applicants are given preference when candidates are equally qualified. Our tasks are varied and adaptable: for applicants with disabilities, we work together to find solutions that make the best use of their abilities.
With its focus on developing key technologies that are vital for the future and enabling the commercial utilization of this work by business and industry, Fraunhofer plays a central role in the innovation process. As a pioneer and catalyst for groundbreaking developments and scientific excellence, Fraunhofer helps shape society now and in the future.
Ready for a change? Then apply now and make a difference! Once we have received your online application, you will receive an automatic confirmation of receipt. We will then get back to you as soon as possible and let you know what happens next.
If you have any questions about the position, please contact:
Chia-Wei Chen
Phone: +497216091590
Fraunhofer Institute of Optronics, System Technologies and Image Exploitation IOSB
Requisition Number: 82594
Application Deadline: