Defining Meta Existential Risk
!tags:: #lit✍/📰️article/highlights
!links:: artificial-intelligence, biological-technology, climate-change, existential-risk, exponential-technology, game-theory, global-catastrophic-risk, meta-existential-risk, nuclear-technology, tagged-by-ghostreader-ai
!ref:: Defining Meta Existential Risk
!author:: rhyslindmark.com
=this.file.name
Reference
=this.ref
Notes
Existential Risks are possible future events that would "wipe out" all sentient life on Earth. A massive asteroid impact, for example. Existential Risks can be contrasted with non-existential Global Catastrophic Risks, which don't wipe out all life but produce lots of instability and harm. A (smaller) asteroid impact, for example :).
- View Highlight
(Daniel Schmachtenberger defines the 1st issue as: "Rivalrous (win-lose) games multiplied by exponential technology self-terminate." And the 2nd issue as: "The need to learn how to build closed-loop systems that don’t create depletion and accumulation, don’t require continued growth, and are in harmony with the complex systems they depend on.")
- View Highlight
These risks might not be such a big deal if we were able to look at the negative impact of the exponential curve and say "oh, let's just fix it". The problem is that we can't, because we're stuck in two kinds of bad game theory problems (often called "coordination problems"):
- Our "Power" Problems are Arms Races: For powerful new tech, all of the players are incentivized to make it more powerful without thinking enough about safety. If one player thinks about safety but goes slower, then the other player "wins" the arms race. e.g. Russia and the U.S. both build more nukes, Google and Baidu create powerful AI as fast as possible (not as safe as possible), and CRISPR companies do the same. (See the payoff-matrix sketch below.)
- Negative Externalities are a Tragedy of the Commons: It is in each country's best interest to ignore/exploit the global commons because everyone else is doing the same. If a country tries to tax carbon, underfish, etc. while no one else does, then it "loses". (See the commons sketch below.)
This is the root of meta x-risk: exponential curves with negative impacts that humans need meta-coordination to stop.
- View Highlight
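A minimal payoff sketch of the arms-race dynamic (my own note, not from the article; the strategy names and payoff numbers are illustrative assumptions). It shows why "race" is a dominant strategy for both players, which makes the mutually worse (race, race) outcome the equilibrium:

```python
# Arms race as a two-player Prisoner's Dilemma.
# Strategies: "race" (build fast, skip safety) or "pause" (go slow, stay safe).
# Payoff numbers are illustrative assumptions, not data from the article.

PAYOFF = {  # PAYOFF[(my_move, their_move)] = my payoff
    ("race",  "race"):  1,  # both race: unsafe tech, lasting edge for no one
    ("race",  "pause"): 5,  # I race while you pause: I "win" the race
    ("pause", "race"):  0,  # I pause while you race: I "lose"
    ("pause", "pause"): 3,  # both pause: slower but safe, shared benefit
}

def best_response(their_move: str) -> str:
    """The move that maximizes my payoff, holding the other player's move fixed."""
    return max(("race", "pause"), key=lambda my_move: PAYOFF[(my_move, their_move)])

for theirs in ("race", "pause"):
    print(f"If they {theirs}, my best response is: {best_response(theirs)}")
# Prints "race" both times: racing dominates, so (race, race) is the
# equilibrium even though (pause, pause) leaves both players better off.
```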
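And a companion sketch for the tragedy of the commons, the n-player version of the same trap (again my own illustration with assumed numbers): each country's private gain from exploiting exceeds its own slice of the damage, so exploiting pays no matter what the others do.

```python
# Tragedy of the commons: N countries share a resource worth 10 units,
# and each exploiter destroys 1 unit of it. All numbers are assumptions.

def payoff(i_exploit: bool, n_other_exploiters: int, n_countries: int = 10) -> float:
    """My payoff given how many *other* countries exploit the commons."""
    n_exploiters = n_other_exploiters + (1 if i_exploit else 0)
    commons_share = 10.0 * (1 - n_exploiters / n_countries)  # what's left for everyone
    private_gain = 3.0 if i_exploit else 0.0                 # e.g. untaxed carbon, extra fish
    return commons_share + private_gain

for k in (0, 5, 9):  # no matter how many others exploit...
    print(f"{k} others exploit: exploit={payoff(True, k)}, restrain={payoff(False, k)}")
# Exploiting always pays +2 more: my +3 private gain beats the 1-unit hit
# my exploitation puts on my own slice of the commons; the rest of the damage
# falls on everyone else. So every country exploits and the commons is run
# down, even though all would be richer (10 vs. 3 each) if all restrained.
```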
Meta Existential Risk is the risk to humanity created by (current and future) exponential curves, their misalignment with human values and/or our Earth system, and our inability to stop them given coordination problems.
- View Highlight