Future of Humanity Institute
The Future of Humanity Institute is part of the Faculty of Philosophy and the Oxford Martin School at the University of Oxford. Founded in 2005, its director is Nick Bostrom. The mission of FHI is described on their website:
The Future of Humanity Institute is the leading research centre looking at big-picture questions for human civilization. The last few centuries have seen tremendous change, and this century might transform the human condition in even more fundamental ways. Using the tools of mathematics, philosophy, and science, we explore the risks and opportunities that will arise from technological change, weigh ethical dilemmas, and evaluate global priorities. Our goal is to clarify the choices that will shape humanity’s long-term future.
FHI brings together a wide range of researchers, prominent in their original fields, who have chosen to focus on global questions about the progress and future of humanity, e.g.:
- Nick Bostrom: Director, philosopher, has more than 200 publications on subjects such as Artificial General Intelligence (AGI) risks, existential risk, biological cognitive enhancement, and whole brain emulation.
- Anders Sandberg: Research Fellow, computational neuroscientist, researches human enhancement and the ethics of new technologies.
- Robin Hanson: Research Associate, economist, interested in prediction markets and the future of technology, as well as many other questions involving Bayesianism, cognitive biases, technology, policy, and the Fermi Paradox.
- Toby Ord: Research Associate, philosopher, researches decision making and theoretical and practical ethics. Founder of Giving What We Can, an international society dedicated to the elimination of poverty.
- Milan Cirkovic: Research Associate, astrophysicist, interested in the anthropic principle and the Fermi Paradox.
FHI is affiliated with LessWrong and Overcoming Bias. Its past activities include holding the Global Catastrophic Risks Conference in 2008 and publishing a book, also titled Global Catastrophic Risks.