---
dg-publish: true
created: 2024-07-01
modified: 2024-07-01
title: Global Catastrophic Risks
source: reader
---

Global Catastrophic Risks

@tags:: #lit✍/📰️article/highlights
@links:: cause profile, global catastrophic risks
@ref:: Global Catastrophic Risks
@author:: Probably Good

=this.file.name

Book cover of "Global Catastrophic Risks"

Reference

Notes

Quote

For example, experts estimate that there’s around a 1% chance of nuclear war every year. Though this might sound low on its own, this level of risk quickly stacks up over time. In fact, this level of risk implies a 63% probability of nuclear war happening over the next 100 years. Those are worse odds than a coin flip.
- View Highlight
- [note::The danger of low-probability catastrophic events lies not in their marginal (annual) risk, but in the cumulative risk over a given time period]
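
A quick sanity check of the 63% figure, assuming (as the article's arithmetic implies) a constant 1% annual risk that is independent from year to year:

$$P(\text{nuclear war within 100 years}) = 1 - (1 - 0.01)^{100} \approx 0.63$$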

Quote

A commonly cited (and worrying) example is that the UN Biological Weapons Convention, which prohibits the development of biological weapons, has a smaller annual budget than an average branch of McDonald’s.
- View Highlight

Quote

More broadly, 80,000 Hours estimates that only $1 billion per year (adjusted for quality) is spent on preventing the most serious pandemics globally, a surprisingly small amount relative to the scale of the threat. For context, it’s estimated that Covid-19 will have cost the global economy $12.5 trillion of damage by 2025.
- View Highlight
- [note::Wow - does this imply governments are throwing billions of dollars away by not investing in pandemic preparedness?]
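
For scale, taking the quoted figures at face value, a single Covid-scale pandemic costs on the order of twelve thousand years of current global prevention spending:

$$\frac{\$12.5\ \text{trillion}}{\$1\ \text{billion per year}} = 12{,}500\ \text{years}$$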

Quote

Although AI is touted as one of the most likely sources of existential risk in the coming decades, it’s estimated that there are only 100-200 people working full-time on AI safety efforts. This means it’s probably even more neglected than global catastrophic biological risks. Given that 20,000+ people work to produce plastic bricks for LEGO, it’s plausible that AI risk should probably absorb a fair few more workers. (We’re not saying that plastic bricks aren’t important, just that they’re probably not 100 times as important as protecting humanity.)
- View Highlight

