ML Alignment & Theory Scholars (MATS) is looking for a talented individual to join our London team as a Research Manager.
Overview:
* Full-time position
* Language: English
* Location: London, UK
* Salary: £30 – £45/h, depending on experience
* Closing date: until filled
Job Description
As a Research Manager, you will play a crucial role in supporting and guiding AI safety researchers, facilitating projects, and contributing to the overall success of our programme. This role offers a unique opportunity to develop your skills, make a significant impact in the field of AI safety, and work with top researchers from around the world.
Your day-to-day will involve talking to both scholars and mentors to understand the needs and direction of their projects. This may involve becoming integrated into the research team, providing feedback on papers, and ensuring that there is a plan to get from where the project is now to where it needs to be.
We are excited about candidates who can augment their work as a Research Manager by applying pre-existing expertise in one of the following domains:
* Theory: providing informed feedback to scholars on research direction, and helping MATS to assess research priorities.
* Engineering: helping scholars to become stronger research engineers, and building out the internal tooling of MATS.
* Projects: providing scholars with structure and accountability for their research, and helping MATS to build better systems and infrastructure.
* Communication: helping scholars to present their research in more compelling ways to influential audiences, and improving how MATS communicates its mission.
Responsibilities:
* Work with world-class academics & industry mentors to:
o People-manage their AI safety mentees
o Manage and support AI safety research projects
o Facilitate communication and collaboration between scholars, mentors, and other collaborators
o Organise and lead research meetings
* Work with individual junior AI safety researchers to:
o Provide guidance and feedback on research directions and writeups
o Connect them with relevant domain experts to support their research
* Contribute to the strategic planning and development of MATS:
o Spearhead internal projects
o Build and maintain the systems and infrastructure that MATS requires to run efficiently
o Provide input into strategy discussions
Role Requirements
We welcome applications from individuals with diverse backgrounds, and strongly encourage you to apply if you fit into at least one of these profiles:
* AI safety researchers looking to develop a more holistic skillset
* Professionals with research or research management experience
* Product/project managers from tech or a STEM industry
* People managers with technical or governance backgrounds
* Technical writers or science communicators
* Community builders
If you do not fit one of these profiles but think you could be a good fit, we still encourage you to apply!
Essential Qualifications and Skills
We are looking for candidates who have the following:
* At least 2 years of experience across a combination of the following:
o Technical research
o Governance or policy work
o Project management
o Research management (not necessarily technical)
o People management
o Community building
o Mentoring
* Excellent communication skills, both verbal and written
* Strong listening skills and empathy
* Strong critical thinking and problem-solving abilities
* Ability to explain complex concepts clearly and concisely
* Proactive and self-driven work ethic
* Familiarity with the basic ideas behind AI safety, and a strong alignment with our mission
Desirable Qualifications and Skills
We expect especially strong applicants to have deep experience in at least one of the following areas:
* Familiarity with AI safety concepts and research landscape
* Experience in ML engineering, software engineering, or related technical fields
* Experience in AI policy
* Background in coaching or professional development
* Entrepreneurial mindset and experience in building systems or infrastructure
* PhD or extensive academic research experience
How to Apply
To apply, please fill out the form here.
MATS is committed to fostering a diverse and inclusive work environment. We encourage applications from individuals of all backgrounds and experiences.
Join us in shaping the future of AI safety research!