Application Deadline
May 18, 2026
The Constellation Institute has announced applications for its flagship Astra Fellowship, a fully funded, in-person programme designed to accelerate research and talent development in the rapidly growing field of AI safety. Hosted at Constellation’s Berkeley research centre, the fellowship prepares the next generation of researchers to address the risks posed by advanced artificial intelligence systems. As AI systems evolve at unprecedented speed, the Astra Fellowship focuses on building capacity in AI alignment, governance, and safety research, ensuring that technical progress is matched with robust safeguards. Applications for the next cohort close on May 3rd at 11:59pm (Anywhere on Earth).

The Astra Fellowship: A Fully Funded AI Safety Research Fellowship

The Astra Fellowship is a five-month, full-time programme that provides selected fellows with financial, technical, and professional support to conduct high-impact research in AI safety. Participants work closely with leading experts and research mentors while engaging in collaborative projects at the intersection of machine learning, policy, and governance.

Key features of the programme include:

- Monthly stipend of $8,400 for all fellows
- Research funding of approximately $15,000 per month for compute resources (empirical stream)
- Visa support for international applicants
- Access to a dedicated research workspace in Berkeley
- Structured mentorship and research management support
- Career placement and incubation services

The fellowship is designed not only to support research but also to help participants transition into full-time roles in leading AI safety organisations or launch independent initiatives.
Two Specialised Research Streams

The programme is divided into two main tracks to accommodate different skill sets and interests:

Empirical Stream

This stream focuses on technical machine learning research in areas such as:

- AI alignment
- Model evaluation and testing
- Scalable oversight systems
- AI control mechanisms

Participants in this stream work closely with technical mentors from leading AI research organisations.

Strategy and Governance Stream

This track is designed for individuals interested in policy, systems thinking, and strategic analysis of AI risks. Fellows explore:

- Catastrophic risk assessment
- Governance frameworks for advanced AI systems
- Institutional and regulatory design
- Global coordination strategies

Both streams emphasise solving high-impact, underexplored problems in AI safety and ensuring responsible technological development.

Mentorship from Leading AI Institutions

A defining feature of the Astra Fellowship is its mentorship network, which includes experts from globally recognised organisations such as OpenAI, Anthropic, and Google DeepMind, as well as research institutions like Redwood Research and METR. Fellows receive weekly mentorship sessions, office hours, and direct research collaboration opportunities. This structure ensures that participants gain both technical expertise and strategic insight into frontier AI safety challenges.

Proven Impact and Career Pathways

The Astra Fellowship has already demonstrated strong outcomes. More than 80% of participants from the first cohort have transitioned into full-time AI safety roles at leading organisations, including research labs, policy institutes, and technology companies. Graduates of the programme have gone on to work in:

- Frontier AI research labs
- Policy and governance institutions
- AI evaluation and safety organisations
- Independent research initiatives

This strong placement record highlights the fellowship’s role as a key pipeline into the global AI safety ecosystem.

Who Should Apply?
The programme is open to individuals who are motivated to reduce catastrophic risks from advanced AI and eager to contribute to high-impact research. Ideal applicants may:

- Have technical or policy experience relevant to AI safety
- Be interested in empirical machine learning research or governance strategy
- Want to transition into AI safety careers or launch related organisations
- Come from technical or adjacent disciplines

Prior experience in AI safety is not required; the programme actively encourages applications from diverse academic and professional backgrounds.

Why the Astra Fellowship Matters

The Astra Fellowship reflects a growing global recognition that AI safety is a critical frontier issue in technology governance. As advanced AI systems become more capable, the risks associated with misalignment, misuse, and lack of oversight are increasingly central to policy and research agendas.

- Talent pipeline development: The programme accelerates entry into a highly specialised and rapidly expanding field.
- Bridging research and policy: Fellows work across technical and governance domains, encouraging interdisciplinary solutions.
- Concentration of expertise: Access to leading institutions such as OpenAI, Anthropic, and Google DeepMind creates high-density knowledge transfer.
- Resource-intensive research support: Compute funding enables empirical experimentation that is often inaccessible to early-career researchers.
- Career acceleration: High placement rates demonstrate strong demand for AI safety professionals across industry and academia.
- Global coordination relevance: AI safety is increasingly viewed as a cross-border issue requiring coordinated international responses.
- Incubation potential: Fellows are supported in launching independent research labs and initiatives.
- Strategic timing: The programme arrives at a critical moment, when AI capabilities are scaling rapidly and governance frameworks are still evolving.
Overall, the Astra Fellowship functions not just as a training programme, but as a strategic intervention in shaping the future of safe artificial intelligence development.

APPLY HERE

Disclaimer: Global South Opportunities (GSO) is not the organisation offering this opportunity. For any inquiries, please contact the official organisation directly. Please do not send applications or CVs to GSO, as we are unable to process them. Due to the high volume of emails we receive daily, we may not be able to respond to all inquiries. Thank you for your understanding.
Category
fellowship
Type
online
Organization / Source
globalsouthopportunities.com
Posted
April 18, 2026