Longtermism Fund

!tags:: #lit✍/📰️article/highlights
!links::
!ref:: Longtermism Fund
!author:: givingwhatwecan.org

=this.file.name


Reference

Notes

Quote

Although the exact risk of catastrophic events is uncertain, a 2022 study found that nearly 50% of AI researchers surveyed believed that the risk of advanced AI ultimately leading to humanity's extinction was at least 10%.
- No location available


dg-publish: true
created: 2024-07-01
modified: 2024-07-01
title: Longtermism Fund
source: hypothesis
