Catastrophic AGI Risk Prediction
A downloadable project
Mitchell Reynolds & Jon Menaster
We introduce a custom scenario analysis based on Open Philanthropy’s AI Worldviews Contest and estimate a 48% probability of an existential catastrophe arising from loss of control of an AGI system. Our approach offers a starting point for forecasting AGI risk with a programmatic tool. In future work we will use the importance, tractability, and neglectedness (ITN) framework to assign weights to independent variables across technical and governance research, improving forecasters’ ability to predict whether an AGI existential catastrophe will occur within a given timeframe.
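The abstract above does not specify how the programmatic tool combines its inputs. As a purely hypothetical sketch, one simple design is a weighted average over per-factor probability estimates, where the weights play the role of the ITN-style importance weights mentioned above. All factor names, probabilities, and weights below are illustrative assumptions (chosen so the toy example lands on the 48% headline figure), not the authors’ actual inputs or method.

```python
# Hypothetical sketch: aggregate independent risk-factor estimates into a
# single probability via a weighted average. Names and numbers are
# illustrative only, not taken from the paper.

factors = {
    # factor name: (probability estimate, ITN-style weight; weights sum to 1)
    "misaligned_goals":   (0.60, 0.5),
    "governance_failure": (0.40, 0.3),
    "rapid_takeoff":      (0.30, 0.2),
}

def weighted_risk(factors):
    """Return the weight-averaged probability across all factors."""
    return sum(p * w for p, w in factors.values())

print(f"{weighted_risk(factors):.0%}")  # prints "48%"
```

A real forecasting tool would likely use a richer aggregation (e.g. conditional probabilities along a causal chain rather than a flat weighted average), but this shows the basic shape of turning weighted independent variables into one headline estimate.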
Status | Released
Category | Other
Author | mitchell-reynolds
Tags | artificial-intelligence, sci-fi
Download
AI Governance Draft Submission.pdf 229 kB