Forecast: From the 2022 Hybrid Forecasting-Persuasion Tournament
My personal forecasts on a wide range of Global Catastrophic Risk topics from the second half of 2022
You can find a post-mortem of the tournament where I originally made these forecasts here. I won’t replicate that context here, and instead just present the forecasts as they stood at the end of the tournament. Please forgive any peculiar formatting decisions, as these were copied from a proprietary platform. Each question also originally had an associated google doc with some initial sources to kick off research.
Disclaimers
These forecasts represent my best judgement on each question, in the second half of 2022, given the time constraints of a tournament environment. That said, I'm just a single forecaster, without even a track record you could use to evaluate my accuracy. Research has shown that relying on a lone forecaster with no track record is a recipe for especially inaccurate results, even before you consider that this tournament was exploring the especially challenging space of global catastrophic risks. I think my forecasts are reasonable, and I put a lot of effort into trying to make them accurate, but I would still heavily advise against using them as an input into any serious decisions.
Additionally, there are definitely areas where my forecasts suffered from lack of highly relevant knowledge that I didn’t happen to dig up during the tournament but that I’ve since learned about. The possibility of “catastrophic” climate change from hitting various climate system tipping points stands out as the most egregious example of something that I was just completely oblivious to, but I’m sure there are lots more topics like that which I’m still ignorant of and therefore can’t call out with a warning.
Forecasts
What is the probability that a genetically-engineered pathogen will be the cause of death, within a 5-year period, for more than 1% of humans alive at the beginning of that period...
...by the end of 2030?
...by the end of 2050?
...by the end of 2100?
Question and resolution details
A pathogen is any microorganism (bacteria, viruses, etc.) that can cause diseases.
For the purposes of this question, an engineered pathogen is one produced via genetic modification or manipulation by humans. This may include the laboratory modification of pathogens that have previously been found in nature.
In order to resolve positively, the relevant 5-year period must end before the resolution date.
If reasonable people disagree about whether this event has occurred, this question will be resolved via a panel of experts.
My Forecast:
Note: All calculations can be found in this spreadsheet. My forecasts of my own beliefs, of other experts' beliefs, and of superforecasters' beliefs are all identical, as I expect us all to be competently estimating the same ground truth.
Such a pandemic could be caused by either a leak from a lab (likely BSL-4 if it's handling a pathogen that can do this) or from the use of a bioweapon. I'm intentionally excluding non-contagious bioweapons (anthrax, ricin, etc.) because I find them extremely implausible to achieve this death toll in a 5 year span.
On average, I'd expect genetically-engineered pathogens to be more likely to cause a pandemic than naturally occurring ones. Gain-of-function research makes a pathogen more dangerous, while engineering a version of a pathogen for use as an inoculation makes it less so. Assuming those two roughly cancel out, engineering a pathogen into a bioweapon will obviously tend to make it more dangerous.
With the possible exception of COVID-19 (which I think is ~70% likely to have leaked from a lab, in which case it was almost certainly genetically engineered), there seem to have been no past pandemics caused by genetic engineering to use for generating a prior.
There is a Biological Weapons Convention banning biological weapons, with 184 states parties and another 13 that have either not signed or not ratified the treaty. The treaty lacks any means to verify compliance, and there have been many known violations of it, so I think it’s safe to consider it ineffective. These past non-compliances also demonstrate a willingness by state powers to weaponize the pathogens of past pandemics.
Therefore, we should expect the risk of such a pandemic to increase with conflict between state powers. I don’t expect there to be any meaningful/accurate measures of how much genetic engineering research or bioweapons development is going on at any given time.
Furthermore, technological progress has greatly democratized genetic engineering thus far with things like CRISPR and the open publication of pathogen genomes. This potentially opens the door for non-state actors, like terrorist organizations.
I will break down my forecast into the chance of a pandemic from a...
a state actor releasing a bioweapon due to a great powers war
a non-state actor releasing a bioweapon
a BSL-4 lab leak
State Actor Bioweapon
This study, cited by 80,000 Hours, approximates the annual risk of great powers war at ~0.7734% (the constant annual probability required to get their ~45% chance estimate of one or more wars in the next 77 years). I’m going to estimate that in 70% of these wars a genetically engineered pandemic will be induced, and that it has an 80% chance of being lethal enough to meet this question’s criteria. (genetically enhanced versions of very lethal past pandemic pathogens…)
Non-State Actor Bioweapon
I think the current risk of this is quite low, but will certainly increase over time with further democratization of the associated technology and information. I'm going to approximate it as 10% of the chance of the pandemic from the great powers war until 2030, 50% until 2050 and equally likely until 2100.
BSL-4 Lab Leak
There have been three BSL-4 infections in the 36 years since the designation was introduced, if we count COVID-19 (I'm comfortable counting it despite my uncertainty because the actual count of BSL-4 leaks seems much, much more likely to be undercounted than overcounted), for an annualized probability of at least a single infection of ~2.8%. Obviously none of these has resulted in a pandemic meeting this question’s criteria, but COVID-19 is only off by an order of magnitude. Given this, and that BSL-4 labs specifically research only the most dangerous pathogens, I’m going to estimate a 5% chance that any reported BSL-4 leak involves a genetically engineered pathogen that causes a pandemic of this magnitude.
Conclusion
Combining all of these probabilities as independent and using a Binomial Distribution yields...
~0.62% annual chance until 2030 for a total chance of 4.854%
~0.79% annual chance until 2050 for a total chance of 19.915%
~1.00% annual chance until 2100 for a total chance of 54.339%
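For reference, the combination described above can be sketched in a few lines of Python. The component probabilities are the assumptions stated in the text; the 8/28/78-year horizons (counted from the end of 2022) are my assumption, and they roughly reproduce the listed totals, with small differences attributable to rounding in the original spreadsheet:

```python
# Sketch of the combination above. Assumed inputs from the text:
#   state actor = P(great powers war) x P(bioweapon pandemic) x P(lethal enough)
#   non-state   = a growing fraction of the state-actor risk
#   lab leak    = P(BSL-4 leak in a year) x P(engineered pandemic of this size)
state = 0.007734 * 0.70 * 0.80   # ~0.433% per year
lab = 0.028 * 0.05               # ~0.14% per year
for horizon, years, nonstate_frac in [(2030, 8, 0.1), (2050, 28, 0.5), (2100, 78, 1.0)]:
    annual = state * (1 + nonstate_frac) + lab
    total = 1 - (1 - annual) ** years  # chance of at least one such pandemic
    print(f"by {horizon}: annual {annual:.2%}, total {total:.1%}")
```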
What is the probability that a non-genetically-engineered pathogen will be the cause of death, within a 5-year period, for more than 1% of humans alive at the beginning of that period...
...by the end of 2030?
...by the end of 2050?
...by the end of 2100?
Question and resolution details
A pathogen is any microorganism (bacteria, viruses, etc.) that can cause diseases.
A naturally arising pathogen is any pathogen that was not produced via genetic modification or manipulation by humans. If there is a lab escape of a naturally arising pathogen that has not been genetically modified or manipulated (e.g. smallpox), this would count as a "naturally arising pathogen" for the purposes of this question.
In order to resolve positively, the relevant 5-year period must end before the resolution date.
If reasonable people disagree about whether this event has occurred, this question will be resolved via a panel of experts.
My Forecast:
Note: All calculations can be found in this spreadsheet. My forecasts of my own beliefs, of other experts' beliefs, and of superforecasters' beliefs are all identical, as I expect us all to be competently estimating the same ground truth.
A pandemic due to a non-genetically-engineered pathogen could emerge either naturally (most likely zoonotic, jumping from animals to humans) or from a human release of an existing pathogen. Wikipedia lists 6 pandemics in history meeting the severity criteria of the question. With the first happening in 1481, this gives us a prior of ~0.4% annual probability of such a pandemic.
There are reasons to believe either that the risk of such a pandemic is higher now than it was in history, or that it could be lower:
Reasons for potentially increased risk:
Increasingly interconnected global society with things like air travel and open borders aid pathogen transmission.
Increasing contact with animals through encroachment into wilderness and the growth of factory farms.
The preservation and handling of dangerous pathogens in scientific laboratories with demonstrated safety deficiencies.
Wikipedia lists 2 confirmed infections at BSL-4 facilities with COVID-19 likely being a third. The BSL-4 designation was only created in 1984.
Reasons for potentially reduced risk:
Public health infrastructure enabling contact tracing, quarantines, vaccination drives, and outbreak surveillance.
Medical technologies like more effective vaccines or treatments.
Increasing awareness of the threat that pandemics pose.
This article from 80,000 Hours digs into similar reasons and comes to an undetermined conclusion on which set of factors is stronger. I would likely have come to a similar answer before the global COVID-19 response. I am now quite confident that our response to pandemics is bottlenecked not by technological advancement but by bureaucratic/institutional ineptitude, and I believe trends here are worsening. With little apparent government interest in systemic improvements to prepare for future pandemics, even in the wake of the most serious part of this one, it's hard to imagine any meaningful improvement before a pandemic of the magnitude asked about in this question actually happens. For this reason I will be treating my prior as a lower bound on my final forecast.
This study from the Center for Global Development uses the severity and frequency of past pandemics to estimate “the annual probability of a pandemic on the scale of COVID-19” [in number of deaths] “in any given year to be between 2.5-3.3 percent.”
This study uses similar data but a different methodology and comes to an estimate of 2.63% for the same probability. This close agreement gives me confidence in both approaches. To meet the criteria of this question, we need a pandemic that kills a roughly 10x larger proportion of the global population than COVID-19 did. Historically, the Spanish flu of 1918 is of the correct size. This paper approximates the current average recurrence interval of a Spanish-flu-like event at 127 years, for an annual probability of ~0.79%. This is in line with my lower-bound prior. Without finding any strong reason to believe things will get much better or worse in the future, and with this paper's methodology already accounting for changing vulnerability to pandemics over the last few decades, I will be extrapolating this annual probability to all three forecasts asked in the question. As the vast majority of the data used in this study (with the one possible exception being the COVID-19 pandemic) is for pathogens of natural origin, I'm confident that this estimate does not include any additional risk from genetically-engineered pathogens.
Using a Binomial Distribution I get...
6.148% chance by 2030
19.915% chance by 2050
46.132% chance by 2100
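The step from recurrence interval to these totals can be sketched as follows. The ~127-year interval is from the paper cited above; the 8/28/78-year horizons (counted from the end of 2022) are my assumption, and they reproduce the listed figures to within rounding:

```python
# Annual probability from a ~127-year recurrence interval for a
# Spanish-flu-scale pandemic, extrapolated as independent annual trials.
annual = 1 / 127   # ~0.79% per year
for horizon, years in [(2030, 8), (2050, 28), (2100, 78)]:
    total = 1 - (1 - annual) ** years  # chance of at least one occurrence
    print(f"by {horizon}: {total:.1%}")
```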
What is the probability that artificial intelligence will be the cause of death, within a 5-year period, for more than 10% of humans alive at the beginning of that period...
...by the end of 2030?
...by the end of 2050?
...by the end of 2100?
Question and resolution details
Artificial intelligence is defined here as ‘the development of machines capable of sophisticated (intelligent) information processing’ (Dafoe 2018, AI Governance: A Research Agenda; see pg. 5, footnote 2 for more).
We use the term “be the cause of death” to cover cases where AI is the direct or proximate cause of the deaths. For example, if a terminator-style robot powered by AI kills 10% of all humans alive at the time within a 5-year period, that will count for this question. Alternatively, if AI, operating without direct human intervention, causes the launch of nuclear weapons that kill 10% of all humans alive at the time, that will also count for the purposes of this question. And if that same incident does not kill 10% of all humans, but the subsequent nuclear winter does, that will also count for the purposes of this question.
If reasonable people disagree about whether this event has occurred, this question will be resolved via a panel of experts.
My Forecast:
Note: All calculations can be found in this spreadsheet. My forecasts of my own beliefs, of other experts' beliefs, and of superforecasters' beliefs are all identical, as I expect us all to be competently estimating the same ground truth.
Combining the estimates of this first Metaculus question of a catastrophe of this scale occurring with this second Metaculus question of the catastrophe being caused by AI gives an ~8.4% chance of 10% of the human population being killed by AI by 2100. These markets have pretty good prediction volume, I don't have any special AI expertise, and the expert opinions seem especially hard to synthesize due to differing definitions and the inherently theoretical nature of most of the arguments. I don't believe I could do any better than the wisdom of the crowd on this one, especially considering the EA community's apparent interest in AI risk.
I'm going to assume that this annual average risk is halved for the 2050 timeline, and then halved again for the 2030 timeline. This is an estimate to account for improving AI capabilities over time. I expect AI to inherently get more dangerous as it becomes more capable.
This approach gives...
0.022% chance of catastrophe by 2030
0.157% chance of catastrophe by 2050
8.400% chance of catastrophe by 2100
What is the probability that artificial intelligence will cause human extinction or reduce the global population below 5,000...
...by the end of 2030?
...by the end of 2050?
...by the end of 2100?
Question and resolution details
Artificial intelligence is defined here as ‘the development of machines capable of sophisticated (intelligent) information processing’ (Dafoe 2018, AI Governance: A Research Agenda; see pg. 5, footnote 2 for more).
AI would be considered to have "caused" extinction if it is the direct or proximate cause of the deaths. For example, if a terminator-style robot powered by AI kills all humans, that will count for this question. Alternatively, if AI, operating without direct human intervention, causes the launch of nuclear weapons that kill all humans, that will also count for the purposes of this question. And if that same incident does not kill all humans, but the subsequent nuclear winter does, that will also count for the purposes of this question.
If an extinction event (global population is reduced below 5,000) is caused by multiple sources including AI, it will count as an extinction event caused by AI.
My Forecast:
Note: All calculations can be found in this spreadsheet. My forecasts of my own beliefs, of other experts' beliefs, and of superforecasters' beliefs are all identical, as I expect us all to be competently estimating the same ground truth.
Please see my reasoning for the catastrophic risk version of this question.
The only addition is that I'm using this Metaculus market's conditional probability of the catastrophe killing 95% of humans, assuming it was already sufficient to kill 10%. Because it's much harder to achieve the criteria of this question than the criteria of the Metaculus question, I'll take 10% of their estimated probability. This is mostly based on intuition, but the 95% criteria could leave ~400 million people alive which is much, much more than 5,000. The Metaculus estimate has been relatively stable and is 60%, so I'll be multiplying my estimates from the catastrophic risk version of this question by 6% to get:
0.001% chance by 2030
0.009% chance by 2050
0.504% chance by 2100
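The scaling works out as a one-line calculation per horizon (the catastrophic-risk values are the ones listed in the previous answer):

```python
# Extinction forecast = catastrophic forecast x 10% (harder criteria)
#                                             x 60% (Metaculus conditional) = 6%
catastrophic = {2030: 0.00022, 2050: 0.00157, 2100: 0.08400}
for horizon, p in catastrophic.items():
    print(f"by {horizon}: {p * 0.10 * 0.60:.3%}")
```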
What is the probability that one or more incidents involving nuclear weapons will be the cause of death, within a 5-year period, for more than 10% of humans alive at the beginning of that period...
...by the end of 2030?
...by the end of 2050?
...by the end of 2100?
Question and resolution details
We use the term “be the cause of death of” to cover cases where nuclear weapons are the direct or proximate cause of the deaths. For example, if a country launches nuclear weapons that directly cause the death of 10% of all humans alive at the time within a 5-year period, that will count for this question. And if that same incident does not cause the death of 10% of all humans, but the subsequent nuclear winter does, that will also count for the purposes of this question.
If reasonable people disagree about whether this event has occurred, this question will be resolved via a panel of experts.
My Forecast:
Note: All calculations can be found in this spreadsheet. My forecasts of my own beliefs, of other experts' beliefs, and of superforecasters' beliefs are all identical, as I expect us all to be competently estimating the same ground truth.
I don’t see how a single nuclear event could satisfy this question. The Tokyo metro area only has 40 million people in it, or 0.5% of the global population. You’d have to kill the entire Tokyo metro area 20 times over, with separate events within 5 years, to meet these criteria without a nuclear war, so I’m going to rule out things like lone acts of terrorism or single accidents.
This leaves the chance of nuclear war. Based on this analysis by Rethink Priorities, even a full-blown nuclear exchange between the US and Russia would currently not come close to meeting the criteria for this question through direct deaths. This leaves nuclear winter as the only viable pathway I can come up with.
It’s hard to imagine a great powers war, after the advent of nuclear weapons, that doesn’t include a nuclear exchange. It’s also hard to imagine a future where nuclear disarmament was successful AND there was still a great powers war. This study, cited by 80,000 Hours, approximates the annual risk of a great powers war at ~0.7734%.
I’m going to estimate that in 80% of these wars there is a nuclear exchange. I can’t imagine a war with all of the nuclear powers on one side, or one in which there is victory for one side without the other using their nuclear weapons. This (combined with the above 0.7734% annual probability) is very roughly in line with this analysis by Rethink Priorities estimating a 1.1% annual probability of nuclear war.
Then I’m going to estimate that only 50% of these exchanges kill a sufficient number of people to meet the question criteria. This analysis by Rethink Priorities is quite confident that a nuclear exchange between the US and Russia would lead to a nuclear winter killing between 36% and 96% of the world population. I can't imagine the first 10% not happening within 5 years as the post estimates that the smoke could take 5-10 years to dissipate. My percentage expresses a much lower confidence than this post because it accounts for exchanges between smaller powers, not just the current US and Russia arsenals.
Using a Binomial Distribution yields...
2.448% chance by 2030 (8 years)
8.311% chance by 2050 (28 years)
21.471% chance by 2100 (78 years)
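As a sketch, using horizons of 8, 28, and 78 years counted from the end of 2022 (my assumption, which reproduces the listed totals to within rounding):

```python
# Annual risk of a qualifying nuclear winter:
#   P(great powers war) x P(nuclear exchange | war) x P(deaths meet criteria)
annual = 0.007734 * 0.80 * 0.50   # ~0.309% per year
for horizon, years in [(2030, 8), (2050, 28), (2100, 78)]:
    print(f"by {horizon}: {1 - (1 - annual) ** years:.3%}")
```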
What is the probability that one or more incidents involving nuclear weapons will cause human extinction or reduce the global population below 5,000...
...by the end of 2030?
...by the end of 2050?
...by the end of 2100?
Question and resolution details
Nuclear weapons would be considered to have caused extinction either if they cause the death of all humans directly (e.g. via mass nuclear strikes) or if they cause the death of all humans via a nuclear winter effect.
If an extinction event (global population is reduced below 5,000) is caused by multiple sources including nuclear weapons, it will count as an extinction event caused by nuclear weapons.
My Forecast:
Note: All calculations can be found in this spreadsheet. My forecasts of my own beliefs, of other experts' beliefs, and of superforecasters' beliefs are all identical, as I expect us all to be competently estimating the same ground truth.
Based on this analysis by Rethink Priorities of the amount of smoke generated by nuclear exchanges, the climate/agricultural effects of different amounts of smoke, and my expectations regarding relative nuclear exchange sizes in the future, I estimate a 20% chance that a given catastrophic nuclear exchange is large enough to cause global extinction.
The consensus (this source, others, and the above link regarding nuclear winter) seems to be that Australia and New Zealand would still be able to grow crops during a nuclear winter and therefore survive. I’ll give this a 90% chance of being true. If it is true, we need a nuclear winter that kills everyone else, followed by some sort of civilizational collapse in those areas. If only these two countries survive, ~99.5% of the world’s population will have died. This analysis by Rethink Priorities estimates a ~10% chance of extinction in such a situation.
Combining these two scenarios gives us:
0.094% chance by 2030
0.330% chance by 2050
0.916% chance by 2100
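One way to reproduce these totals (my reconstruction, assuming the annual nuclear-war risk from the previous question, the conditional probabilities above, and 8/28/78-year horizons from the end of 2022):

```python
# P(extinction | qualifying exchange):
#   20% the exchange is large enough for an extinction-scale winter, times
#   (90% Aus/NZ survive x 10% subsequent collapse + 10% they don't survive)
p_war = 0.007734 * 0.80 * 0.50                      # annual, from the previous question
p_ext = p_war * 0.20 * (0.90 * 0.10 + 0.10 * 1.00)  # ~0.0118% per year
for horizon, years in [(2030, 8), (2050, 28), (2100, 78)]:
    print(f"by {horizon}: {1 - (1 - p_ext) ** years:.3%}")
```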
Toby Ord estimates this at ~1/1,000 in the next century, which translates to an annual probability of about 0.001%. That makes my estimate more than 10x higher, but not enough information is shared in The Precipice for me to be able to update from it.
What is the probability that non-anthropogenic causes (e.g., asteroid or comet impacts, solar flares, a supervolcanic eruption, or a stellar explosion) will be the cause of death, within a 5-year period, for more than 10% of humans alive at the beginning of that period...
...by the end of 2030?
...by the end of 2050?
...by the end of 2100?
Question and resolution details
Non-anthropogenic refers to any natural disaster not originating with human activity.
Extreme weather events that plausibly were worsened by climate change (e.g. hurricanes) will not be considered as "non-anthropogenic causes" for the purposes of this question.
We use the term “be the cause of death of” to cover cases where a non-anthropogenic cause causes the death of people directly (e.g. asteroid directly impacting a large number of people) or indirectly (e.g. a major asteroid impact leads to subsequent disruption to food systems that leads to the death of >10% of the human population within a 5-year period).
My forecast:
Note: All calculations can be found in this spreadsheet. My forecasts of my own beliefs, of other experts' beliefs, and of superforecasters' beliefs are all identical, as I expect us all to be competently estimating the same ground truth.
I used "Global Catastrophic Risks", a book by Nick Bostrom and Milan M. Cirkovic to research most of this answer. If any information does not have a specifically cited source, this is where it came from.
These are all of the potential causes I could find between that book and this Wikipedia page:
Supervolcanic eruption
Comet or asteroid impact
Stellar explosion (solar flare or supernova or gamma ray burst)
Extraterrestrial beings
The last one seems quite unlikely to me and I don't expect to find any high quality information to let me update from that baseline, so I'll be ignoring it.
The odds of a supernova or gamma ray burst that could have this level of impact on humanity are well studied and extremely small.
Past volcanic eruptions of the magnitude required to lead to a "volcanic winter" have an annual chance of 0.002%. I think an eruption would have to have this kind of global effect in order to kill 10% of the population, but having achieved that it seems extremely likely to me that skipping a single growing season would result in famine killing >10% of humans.
Asteroids/comets >1km in diameter are expected to have global effects if they impact earth. Tokyo only has ~37 million people in it, so it doesn’t seem like there’s a way for an impact to meet the criteria of this question if it’s smaller than this. This source has an annual chance of 0.0002% for this magnitude of impact and it seems extremely likely that one of these events would kill >10% of humans.
A solar storm of the intensity that caused the Carrington Event is estimated to happen once in 500 years. This is the best source I have on the ultimate impacts, but their estimate is only “millions of lives” lost in the US, which if scaled globally is well short of the threshold. I’m going to give this kind of event a 5% chance of causing the death of 10% of humans. Overall that’s a 0.01% annual chance.
It seems safe to call these probabilities independent, so adding them all up we get a 0.0122% annual chance. Using a Binomial distribution yields...
0.098% chance by 2030
0.341% chance by 2050
0.947% chance by 2100
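The extrapolation above can be sketched as follows (the three component risks are the ones stated in the text; the 8/28/78-year horizons from the end of 2022 are my assumption):

```python
# Independent annual risks, summed, then extrapolated with 1 - (1 - p)^n:
#   supervolcano 0.002% + asteroid/comet 0.0002% + solar storm 0.01% = 0.0122%
annual = 0.00002 + 0.000002 + 0.0001
for horizon, years in [(2030, 8), (2050, 28), (2100, 78)]:
    print(f"by {horizon}: {1 - (1 - annual) ** years:.3%}")
```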
What is the probability that non-anthropogenic causes (e.g., asteroid or comet impacts, solar flares, a supervolcanic eruption, or a stellar explosion) will cause human extinction or reduce the global population below 5,000...
...by the end of 2030?
...by the end of 2050?
...by the end of 2100?
Question and resolution details
Non-anthropogenic refers to any natural disaster not originating with human activity.
Extreme weather events that plausibly were worsened by climate change (e.g. hurricanes) will not be considered as "non-anthropogenic causes" for the purposes of this question.
Non-anthropogenic causes would be considered to have caused human extinction if they kill all humans directly (e.g. asteroid directly destroying the planet) or indirectly (e.g. a supervolcanic eruption destroys the food supply and leads to extinction).
My Forecast:
Note: All calculations can be found in this spreadsheet. My forecasts of my own beliefs, of other experts' beliefs, and of superforecasters' beliefs are all identical, as I expect us all to be competently estimating the same ground truth.
Toby Ord (in The Precipice) estimates the risk from asteroids/comets, supervolcanoes, and stellar explosions. I don't expect to be able to estimate this risk any better than he can. His 1 in 10,000 extinction risk per century translates to a 0.0001% chance per year.
He omits solar storms from his analysis. I estimate that 2% of the solar storms severe enough to satisfy the catastrophic-risk version of this question would satisfy this one, via civilizational collapse following destruction of the electrical grid.
Combining these two probabilities with a Binomial Distribution gives...
0.002% chance by 2030
0.008% chance by 2050
0.023% chance by 2100
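The arithmetic can be sketched as follows (my reconstruction, again assuming 8/28/78-year horizons from the end of 2022):

```python
# Ord's 1-in-10,000 extinction risk per century (0.0001% per year) plus
# 2% of the 0.01% annual solar-storm risk from the catastrophic question.
annual = 0.000001 + 0.02 * 0.0001   # ~0.0003% per year
for horizon, years in [(2030, 8), (2050, 28), (2100, 78)]:
    print(f"by {horizon}: {1 - (1 - annual) ** years:.3%}")
```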
What is the overall probability of a global catastrophe where more than 10% of humans alive at the start of a 5-year period die by the end of that period...
...by the end of 2030?
...by the end of 2050?
...by the end of 2100?
My Forecast:
Note: All calculations can be found in this spreadsheet. My forecasts of my own beliefs, of other experts' beliefs, and of superforecasters' beliefs are all identical, as I expect us all to be competently estimating the same ground truth.
Much of the work required to answer this question has already been done in the other questions. I also considered the possibility of a catastrophe meeting these criteria being caused by some anthropogenic cause other than those already asked about, but ended up excluding them all:
Climate Change
Global temperature changes inducing famines / heat exhaustion seem far too slow-moving to meet the "5-year period" aspect of the question criteria.
Extreme weather events seem far too localized to meet the "10% of humans" aspect of the question criteria.
Non-nuclear war
WWI killed <1% of the global population and WWII ~2%. It doesn't seem plausible to me that a great powers war would kill >10% of the human population without leveraging one of the technologies already addressed in the other questions: AI, bioweapons, or nuclear weapons.
Nanotechnology
This seems too speculative / far-off. I'm not even sure if it will be a viable and weaponizable technology by 2100 let alone how lethal or likely it would be.
This leaves only the possible causes addressed by previous questions to be combined into a single, overall forecast.
Note: The pathogen questions had slightly different criteria, instead asking about the chance of killing >1% of the human population. There were 6 historical pandemics that would have met the criteria for that question, but only 2 that would have met the criteria for this question. Therefore I'm considering a pandemic that kills >10% of the population to be one third as likely as one that kills >1% and adjusted my answers accordingly.
List of risks:
Genetically engineered pathogen
Non-genetically engineered pathogen
Non-anthropogenic
Nuclear
AI
Assuming these events to be independent, multiplying together each risk’s probability of not occurring and taking the complement gives…
6.11% chance of catastrophe by 2030
20.48% chance of catastrophe by 2050
50.63% chance of catastrophe by 2100
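This combination can be reproduced directly. The per-risk values below are the final forecasts listed in the earlier answers (in the order: engineered pathogen, natural pathogen, non-anthropogenic, nuclear, AI), with the two pandemic forecasts divided by 3 as described in the note above:

```python
# Combine independent risks via 1 - product(1 - p).
from math import prod

forecasts = {
    2030: [0.04854 / 3, 0.06148 / 3, 0.00098, 0.02448, 0.00022],
    2050: [0.19915 / 3, 0.19915 / 3, 0.00341, 0.08311, 0.00157],
    2100: [0.54339 / 3, 0.46132 / 3, 0.00947, 0.21471, 0.08400],
}
for horizon, risks in forecasts.items():
    total = 1 - prod(1 - p for p in risks)
    print(f"by {horizon}: {total:.2%}")
```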
What is the overall probability of human extinction or a reduction in the global population below 5,000...
...by the end of 2030?
...by the end of 2050?
...by the end of 2100?
My Forecast:
Note: All calculations can be found in this spreadsheet. My forecasts of my own beliefs, of other experts' beliefs, and of superforecasters' beliefs are all identical, as I expect us all to be competently estimating the same ground truth.
Please see my reasoning from the catastrophic risk version of this question.
For the pathogen risk questions, I think it’s highly unlikely that a pathogen causes human extinction, but I’m sure it’s possible. I’m going to estimate that there’s a 2% chance that a pathogen that kills >10% of the human population also satisfies the criteria for this question and adjust my estimates accordingly:
List of risks:
Genetically engineered pathogen
Non-genetically engineered pathogen
Non-anthropogenic
Nuclear
AI
Assuming these events to be independent, multiplying together each risk’s probability of not occurring and taking the complement gives…
0.317% chance of extinction by 2030
1.140% chance of extinction by 2050
3.409% chance of extinction by 2100
By what year will humans go extinct or first have a population less than 5,000?
Question and resolution details
For the purposes of question resolution, we define “humans” as biological creatures who (A) can be traced back genealogically, via a chain of live births, to 2022 humans OR (B) could mate with 2022 humans and produce viable offspring.
This definition explicitly excludes digital people, sentient artificial intelligences, and biologically engineered post-humans who do not fit the criteria A or B above.
This operationalization of humanity was inspired by this Metaculus question.
Extinction refers to the nonexistence of any of the beings described above.
My Forecast:
Note: All calculations can be found in this spreadsheet.
I don't find the central hypothesis of The Precipice particularly compelling. Toby Ord believes that we are in a unique time of high existential risk, and that if we figure out how to get through it we will enter a very long future of very low risk. While I certainly believe we are in a period of unsustainable existential risk, it's not clear to me that we can or will reduce it to near zero, rather than it always remaining a consequence of human progress to some degree. Therefore, I am choosing to continue extrapolating outwards from my 2030, 2050, and 2100 projections to see how long humanity is likely to survive if existential risk continues to increase with advancing technology, as I have forecast it to do this century.
First I figured out what parameters generated a model of linearly increasing existential risk that best fit my forecast for the total extinction risk question.
Next I used those parameters to run a basic simulation, in Python, of when this extinction would occur. I repeated this simulation 10,000 times and then analyzed the results directly to determine the above percentiles. The fitting of the parameters, the simulation results, and the calculation of the percentiles can all be found in a Google Sheet here.
The Python code for the simulation was as follows, and was run on online-python.com:
```python
import random

for run in range(1, 10001):
    inc = 1
    while True:
        year = 2022 + inc
        roll = random.random()
        # Annual extinction risk increases linearly with time.
        if roll <= 0.00039150 + (0.00000136 * inc):
            print(year)
            break
        inc = inc + 1
```
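To turn those printed years into percentiles, the simulated years can be collected into a sorted list and indexed directly. This is a sketch of that analysis step (the original analysis was done in the Google Sheet rather than in Python):

```python
import random

def simulate_extinction_year(base=0.00039150, slope=0.00000136, start=2022):
    """One simulation run: step forward a year at a time until the
    linearly increasing annual extinction risk fires."""
    inc = 1
    while True:
        if random.random() <= base + slope * inc:
            return start + inc
        inc += 1

# Repeat 10,000 times and sort the resulting extinction years.
years = sorted(simulate_extinction_year() for _ in range(10_000))

# Read percentiles directly off the sorted results.
for pct in (5, 25, 50, 75, 95):
    print(f"{pct}th percentile: {years[pct * len(years) // 100]}")
```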
What will be the global surface temperature change as compared to 1850-1900, in degrees Celsius…
...in 2030?
...in 2050?
…in 2100?
Question and Resolution Details
The global surface temperature change refers to the change in both global mean surface temperature (GMST) and global surface air temperature (GSAT) as compared to a representative period between 1850 and 1900. This metric is used by the Intergovernmental Panel on Climate Change. See more on their methodology and model in Cross-Section Box TS.1 (Page TS-27) of the most recent IPCC Assessment Report’s Technical Summary.
We will use the most recent IPCC report to determine the average global surface temperature. If at some point IPCC reports become unavailable or the IPCC’s methodology or metric for measuring global surface temperature change is altered in future reports, a panel of experts will determine an equivalent authoritative source for determining average global surface temperature.
My Forecast:
I can't find a better source of forecasts on this than the IPCC. As a note, criticism of the IPCC tends to call it overly conservative, and historically there was some evidence of this with respect to their forecasts of sea level changes.
The latest IPCC report forecasts a range of scenarios based on the possible change in greenhouse gas emissions over time. Their lowest forecast assumes global emissions reaching net zero around 2050, and their highest forecast assumes a doubling of our current emissions by 2050.
Metaculus currently estimates that only ~40% of the countries pledging to be carbon neutral by 2050 (more than 25 countries so far) will keep their pledge.
Global CO2 emissions haven't shown much sign of slowing yet, besides a dip for COVID. Eyeballing some graphs suggests that if CO2 emissions continue to increase roughly linearly as they have for the last 50 years, they'll roughly double by 2050, making the worst-case IPCC scenario very roughly an "if we do nothing to react to climate change" scenario.
I think, combined, the above two points mean that the possible futures bounded by IPCC forecasts include all reasonable ones that I'd consider.
The forecasts for this report conveniently include estimates for almost the exact time frames in this question (Table SPM.1) in my link above, including both best estimates and a "very likely" range for each. I've used the best estimate for SSP2-4.5 (middle of the road forecast) as my 50% answer for each time, best estimates from SSP1-2.6 and SSP3-7.0 as my 25% and 75% guesses respectively, and the limits of the "very likely" ranges of SSP1-1.9 and SSP5-8.5 for my 5% and 95% guesses.
I'm comfortable omitting the climate effects of things like geoengineering (extremely unlikely) or nuclear war (unlikely to be timed well enough, at sufficient scale, to affect this), as I think these are outside of 95% confidence interval.
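The mapping from IPCC scenarios to percentile forecasts described above can be written out explicitly. The temperature values in this sketch are placeholders, not the actual Table SPM.1 numbers:

```python
# Map forecast percentiles to IPCC AR6 scenarios (per Table SPM.1).
# The temperature values are illustrative placeholders, NOT the actual
# IPCC best estimates / "very likely" bounds -- see Table SPM.1.
forecast_2100 = {
    5:  ("SSP1-1.9 very likely lower bound", 1.0),   # placeholder value
    25: ("SSP1-2.6 best estimate",           1.8),   # placeholder value
    50: ("SSP2-4.5 best estimate",           2.7),   # placeholder value
    75: ("SSP3-7.0 best estimate",           3.6),   # placeholder value
    95: ("SSP5-8.5 very likely upper bound", 5.7),   # placeholder value
}

for pct in sorted(forecast_2100):
    scenario, temp = forecast_2100[pct]
    print(f"{pct:>2}%: {temp:.1f} °C  ({scenario})")
```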
What is the probability that total worldwide production of primary crops will fall by at least 70% within any three-year period…
…by 2030?
…by 2050?
…by 2100?
Question details and Resolution criteria
Primary crops are defined by the UN Food and Agriculture Organization (FAO) as “those which come directly from the land and without having undergone any real processing, apart from cleaning.” More detail can be found here.
Resolution will be determined using statistics from the UN Food and Agriculture Organization (FAO)'s annual reports or, if these reports are not available, by a panel of experts.
For example, in 2019: "When it comes to production, total production of primary crops increased by 53 percent between 2000 and 2019, hitting a record high of 9.4 billion tonnes in 2019. Half of global primary crop production is made up of just four crops: sugar cane, maize, wheat and rice." FAO
Our resolution data will aim to approximate the data used in the below time series as closely as possible:
(Source:FAOSTAT)
My Forecast:
Note: All calculations can be found in this spreadsheet. My forecasts for my beliefs, other experts' beliefs, and the beliefs of superforecasters are all identical, as I expect us all to be competently estimating the same ground truth.
Global statistics on crop production can be found here.
A 70% drop from disease or bioweapon seems extremely unlikely. No single crop or even crop family makes up a large enough portion of world production, so you'd need many separate diseases to ravage the world within a three-year period.
Geographically, land for crops is also pretty well distributed. You’d need to take out all crop production across multiple continents to reach 70%. I think this lets me reasonably limit scenarios I care about to ones with global effects.
Anthropogenic climate change is global, but it seems much, much too slow, with too many possible countermeasures, for it to cause a 70% drop in crop production in 3 years.
This leaves nukes, asteroids/comets, supervolcanoes, or anything that kills off so many people so quickly that we stop producing most of our crops. This means that the answer is bounded somewhere between my overall catastrophic risk estimate (10% of people die) and my overall existential risk estimate (basically everyone dies).
In reality, I believe the different risks that added up to these estimates all have different probability distributions relative to their severity. As an estimate, I will assume that killing 70% of people/crops is 7x less likely than killing 10%.
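That bounding-and-scaling logic can be made explicit. In this sketch the catastrophic-risk probabilities are placeholders standing in for my overall estimates from earlier questions; the 7x divisor is the assumption stated above:

```python
# Estimate P(>=70% crop production drop) by scaling down the broader
# catastrophic-risk estimate. The probabilities here are placeholders,
# NOT my actual earlier forecasts; 7 is the assumed severity factor
# (a 70%-severity event taken to be ~7x less likely than a 10% one).
catastrophic_risk = {2030: 0.02, 2050: 0.05, 2100: 0.10}  # placeholders

SEVERITY_FACTOR = 7

for year, p_catastrophe in catastrophic_risk.items():
    p_crop_collapse = p_catastrophe / SEVERITY_FACTOR
    print(f"By {year}: ~{p_crop_collapse:.2%} chance of a 70% crop drop")
```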
What percentage of the world population will be classified as living in an electoral or closed autocracy in the V-Dem Institute’s annual Democracy Report…
…for the year 2030?
…for the year 2050?
…for the year 2100?
Questions and Resolution Details
In the yearly Democracy Report, the V-Dem Institute measures a span of indices corresponding to democratic values and principles on a country-by-country basis. Based on a regime classification paper published by Lührmann et al. in 2018, one can also classify each country as a “liberal democracy,” “electoral democracy,” “electoral autocracy,” or “closed autocracy.”
If the Democracy Report is not published in one or more of the resolution years, we will attempt to resolve the question using V-Dem and Lührmann et al.’s methodologies, or we will rely on another non-state organization’s report using similar metrics. In the latter case, a panel of experts will select the replacement report and resolve the question given data from the new survey.
If it is impossible to gather data from independent, non-state sources on democratic principles anywhere on the globe, this question will be considered resolved as 100%.
My Forecast:
After reviewing the classification scheme, the terms match pretty closely what I intuitively expected them to mean. To be classified as an autocracy, a country must lack de-facto multiparty, free and fair elections, or their institutional prerequisites: elected officials, free and fair elections, freedom of expression, alternative sources of information, associational autonomy, and inclusive citizenship.
The chart of historical data is very useful. The long trend is definitely towards decreasing autocracy and increasing democracy, but there have been periods where autocracies gained ground. Interestingly, each of these so far has been followed by a period of even faster growth in the democratic countries. Looking at the timing, the main periods of temporary autocracy followed by accelerating democracy seem to be WWI, WWII, and the Vietnam War. Perhaps even more interestingly, as of 2021 we seem to be in the sharpest period of increasing autocracy (as a % of countries) in history. Does this mean we're approaching a war that results in even faster democracy? Or is this new decline the start of a new pattern?
I believe that there are two primary attractors for the future of humanity that can be concisely labeled as chaos and oppression. Bostrom's Vulnerable World Hypothesis has us at increasing risk of extinction as we continue to develop new and ever more powerful technologies. As long as we don't have the means to selectively prevent the existential technological developments, eventually we'll kill ourselves. This fate is what I'm calling the chaotic future, where disasters of increasing magnitude befall humanity until we're extinct. The oppressive future is one where autocratic governance uses mass surveillance and a monopoly on violence to control the development of technology in order to prevent these disasters. There is a third, seemingly less likely attractor, in which humanity figures out how to build democratic governance that simultaneously upholds liberal values while preventing existential technologies.
So, within my mental framework, this question is asking me to forecast which of these attractors I view as how likely and by when. Both extinction and global autocracy resolve this question at 100%, while the third attractor likely resolves this question at 0% or very close to it. This is a challenging forecast.
2030:
For 2030 I am going to predict a reclamation of at least most of the democratic foothold lost since 2019, as I expect this to mostly be from "emergency" measures implemented for covid-19 that will gradually reverse as public opinion of them sours. My worst case is a continued slow increase of autocracy, while my best case is slightly exceeding post-pandemic status, and the rest of the forecasts are biased towards a scenario where we regain most of the democratic losses.
2050:
I believe that in a great powers war where democratic countries are aligned against autocratic countries, the defeat of the democratic side would result in nearly global autocracy that would be very difficult to escape. In previous questions I put the risk of a great powers war at ~0.77% per year. If we say there's a 50% chance of defeat for the democratic side, this outcome (combined with the chance of extinction) dominates my 95% forecast for 2050, which I'll make 95% to reflect that some democratic countries likely won't be strategically important enough to be worth overthrowing.
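As a sanity check on how a ~0.77% annual war risk compounds over the period, a quick calculation (treating each year as an independent draw; the 50% defeat probability is the assumption from the text):

```python
# Cumulative chance of at least one great-powers war between 2023 and 2050,
# assuming an independent ~0.77% chance each year.
annual_risk = 0.0077
years = 2050 - 2022  # 28 annual draws

p_war_by_2050 = 1 - (1 - annual_risk) ** years

# Assumed (from the text): 50% chance the democratic side loses such a war.
p_autocratic_victory = p_war_by_2050 * 0.5

print(f"P(war by 2050) ≈ {p_war_by_2050:.1%}")
print(f"P(autocratic victory by 2050) ≈ {p_autocratic_victory:.1%}")
```

This works out to roughly a one-in-five chance of such a war by 2050 under these assumptions.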
If I try and eyeball the long-term trend of increasing democracy since 1900, it has 2050 at somewhere around 30% autocratic. I'll make this my 25% case, with 15% as my 5% case to account for improved democratic technologies, the crumbling of autocracies, or a democratic victory in a great powers war (which I would expect to be less absolute than an autocratic one).
I believe it's most likely that the trend towards democracy slows as we approach 2050 vs. the historical rate because of the increasing threat posed by more powerful technologies and autocratic governance being the most obvious solution. I'll use 50% as my 50% forecast and 65% as my 75% forecast.
2100:
By 2100 I think it's much more likely that we've settled into one of the primary attractors, given how far technology should have progressed by then. I'm now going to make my 75% prediction one of near total autocracy (95%) and my 95% prediction one of almost absolute global autocracy (99%). My 5% prediction will account for the possibility of achieving that third attractor with global democratic governance, and I'll make it 5%. My 25% prediction will account for a steadily increasing amount of democratic progress, roughly in accordance with the historical rate, putting us at around 20%. My 50% prediction will reflect a timeline where autocracy continues to slowly grow from my most likely 2050 prediction in accordance with increasing technology. I'll ballpark this at 75%.