So, you’re thinking about applying to be a winter fellow at GovAI. It’s a move that makes sense if you’re obsessed with how artificial intelligence is going to reshape global power dynamics, but honestly, the process is a bit of a black box from the outside. The Centre for the Governance of AI—most people just call it GovAI—has become this sort of intellectual hub in Oxford for people who don't just want to talk about "AI ethics" in a vague way, but actually want to dig into compute governance, lab oversight, and international institutions.
Being a winter fellow at GovAI isn't a vacation. It’s an intense, three-month sprint.
The program usually targets people who are either finishing up a PhD, working in policy, or maybe coming from a deep technical background and wanting to pivot into the "safety" side of things. It’s competitive. Like, really competitive. But the vibe isn't corporate. It's academic, yet urgent. You’re basically given a desk, a stipend, and access to some of the sharpest minds in the field, like Allan Dafoe or Ben Garfinkel, and told to produce something that actually matters to policymakers.
What a Winter Fellow at GovAI Actually Does All Day
If you imagine a 9-to-5 where you’re filing reports, you’ve got the wrong idea. A winter fellow at GovAI spends a massive chunk of their time reading. Reading white papers, reading obscure Chinese policy translations, reading technical specs on NVIDIA’s latest Blackwell chips.
The core of the fellowship is a research project. You don't just show up and wait for instructions. You come in with a proposal. Maybe you're looking at how export controls on semiconductor manufacturing equipment affect long-term AI development timelines. Or maybe you're digging into an open problem in AI alignment or model evaluation.
One week you might be presenting your initial findings in a "work-in-progress" seminar. This is where the magic (and the stress) happens. You’ll have people who have advised the UK government or worked with frontier labs like OpenAI and Anthropic poking holes in your logic. It’s not mean-spirited. It’s just that in AI governance, a small logical error can lead to a policy recommendation that is either useless or, worse, dangerous.
Lunch is usually communal. You'll hear debates about whether the "p(doom)" discourse is a distraction or whether we should be more worried about power concentrating in a handful of frontier companies. It's a lot. The cohort is generally diverse, though AI safety as a field has historically struggled with gender balance, something GovAI has been actively trying to address through its recruitment pipelines.
The Application Hurdles You Didn't See Coming
Most people mess up the application because they try to sound too "polished" and end up sounding like a generic consultant. GovAI doesn't want consultants. They want researchers who can think from first principles.
When you apply to be a winter fellow at GovAI, you'll likely face a work task, often a writing test or a policy analysis case study. They might ask you to summarize the trade-offs of a specific piece of legislation, like the EU AI Act, but from the perspective of a specific stakeholder. They aren't looking for you to regurgitate the news. They want to see if you can identify the second-order effects.
For example, if a government mandates a "kill switch" for large models, what does that do to open-source innovation? Does it just drive the talent to jurisdictions with no oversight? That’s the kind of nuance they’re looking for.
The Research Proposal
This is the make-or-break part.
- Be Specific. Don't say "I want to study AI safety." Say "I want to analyze how the licensing of large-scale compute clusters could be verified through on-chip hardware monitoring."
- Show Impact. Why does this matter now? If your research is only relevant in a world where we have AGI (Artificial General Intelligence) and nothing else, it might be too speculative for the policy-oriented folks.
- Evidence of Talent. They look at your past work. It doesn't have to be AI-related. If you wrote a brilliant thesis on the history of nuclear non-proliferation, that shows you can handle complex, high-stakes institutional analysis.
Life in Oxford During the Winter
Let’s be real: Oxford in the winter is gray. It’s damp. But there’s something about being in those old stone buildings, surrounded by thousands of books, that makes the high-tech talk of "compute thresholds" feel grounded in history.
As a winter fellow, you aren't just siloed in an office. GovAI is part of a broader ecosystem. You're a short walk from the former home of the Future of Humanity Institute (which closed in 2024) and various university departments. You'll spend time at a nearby café or the Eagle and Child (if it's open) arguing about scaling laws.
The stipend is decent. It’s designed to cover your living expenses in Oxford, which, if you haven't checked lately, is shockingly expensive. You won't be living like royalty, but you won't be a starving student either. Most fellows find housing in shared houses or through university-adjacent short-term lets.
Why Does This Fellowship Even Matter?
You might wonder if a three-month stint actually changes anything. It does, in at least one concrete way: the "winter fellow at GovAI" line on your CV is a strong signal to the rest of the policy world.
The alumni from this program end up in influential places. We’re talking about the UK AI Safety Institute, the US Department of Commerce, and the policy teams at the major labs. Because the field is so new, there isn't a "standard" career path. There’s no "Major in AI Governance" at most universities yet. This fellowship acts as that bridge.
It’s also about the "hidden" curriculum. You learn how to talk to technical researchers. If you’re a policy person, you need to understand what a "transformer architecture" actually is, at least at a high level, or the engineers won't take your safety proposals seriously. Conversely, if you're a technical person, you learn why you can't just "ban" a certain type of code without trampling on civil liberties or international trade laws.
Common Misconceptions About GovAI
People think you have to be a "doomer." You don't.
While GovAI grew out of concerns about long-term existential risk, the current research agenda is much broader. They care about "structural" risks—how AI might accidentally trigger a war or lead to massive economic inequality. You don't have to believe the world is ending tomorrow to be a valuable winter fellow at GovAI. You just have to believe that AI is a transformative technology that requires serious, sober institutional management.
Another misconception is that it's only for academics. While a lot of fellows have PhDs, they've taken people from journalism, law, and even former founders. What matters is the quality of your "output." Can you write a memo that a busy government minister would actually understand and find useful?
Moving Forward: How to Prepare Your Profile
If you’re serious about this, don't wait for the application window to open to start your research.
First, start a blog or a Substack. Write 1,500-word deep dives into specific AI policy niches. If you can show a track record of "thinking in public," it makes your application ten times stronger. It proves you aren't just interested in the prestige of the fellowship, but that you're actually interested in the work.
Second, get comfortable with the technical basics. You don't need to be able to code a neural network from scratch in C++, but you should know what "inference" vs. "training" means and why it matters for regulation.
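To make the training-versus-inference distinction concrete: compute thresholds in regulation, such as the EU AI Act's 10^25 FLOP trigger for "systemic risk" general-purpose models, attach to training runs, not to serving the model afterward. Here is a minimal back-of-envelope sketch using the common scaling-law rules of thumb (roughly 6·N·D FLOPs to train a model with N parameters on D tokens, and roughly 2·N FLOPs per generated token at inference). The model size and token count below are illustrative numbers, not any particular lab's figures:

```python
# Back-of-envelope compute estimates using standard approximations:
#   training  ~ 6 * params * tokens   (total FLOPs for one run)
#   inference ~ 2 * params            (FLOPs per generated token)

EU_AI_ACT_THRESHOLD = 1e25  # training-compute trigger for "systemic risk" models

def training_flops(params: float, tokens: float) -> float:
    """Approximate total FLOPs for a single training run."""
    return 6 * params * tokens

def inference_flops_per_token(params: float) -> float:
    """Approximate FLOPs needed to generate one token."""
    return 2 * params

# Illustrative: a 70B-parameter model trained on 2 trillion tokens.
train = training_flops(70e9, 2e12)       # ~8.4e23 FLOPs total
infer = inference_flops_per_token(70e9)  # ~1.4e11 FLOPs per token

over = "over" if train > EU_AI_ACT_THRESHOLD else "under"
print(f"training run: {train:.1e} FLOPs ({over} the threshold)")
print(f"inference:    {infer:.1e} FLOPs per token")
```

The asymmetry is the regulatory point: a training run is a rare, enormous, relatively visible event, while inference is cheap, continuous, and widely distributed. A rule that hinges on training compute therefore bites at a very different place than a rule about deployment.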
Lastly, network with current or former fellows. Most of them are surprisingly open to a quick Zoom call if you have specific, intelligent questions. Don't just ask "how do I get in?" Ask them about their research. Ask them what the most controversial debate in the office was during their term.
Practical Steps for Your Application
- Audit your CV: Remove the fluff. If you have experience in high-stakes environments—whether that's a lab or a political campaign—highlight the decision-making parts.
- Draft your proposal: Aim for a "Goldilocks" project. Not too broad ("I'll solve AI alignment"), not too narrow ("I'll study the font size of AI warning labels").
- Identify your "niche": Are you the "China-US relations" person? The "compute governance" person? The "human rights and bias" person? Having a clear identity helps the reviewers see where you fit in the cohort.
- Read the GovAI research blog: Go back two years. Understand the "house style" and the evolution of their thinking. If you reference their past work in your application, it shows you've done your homework.
The winter fellowship is a rare chance to step back from the daily news cycle and think deeply about the next fifty years of human history. It's hard to get, it's harder to do well, but if you're the right kind of nerd, there's nowhere else you'd rather be.