Existential Risk


= ""an existential risk is any risk that has the potential to eliminate all of humanity or, at the very least, kill large swaths of the global population, leaving the survivors without sufficient means to rebuild society to current standards of living". [1]


=History=

From Issarice:

* '''19th century–1945''' Early development

"Concerns about human extinction can be traced back to the 19th century[2], with geology unveiling a radically nonhuman past.[3] French scientist Georges Cuvier popularizes the concept of catastrophism in the early 1800s. The first near-Earth asteroid is discovered. Toward the first half of the twentieth century, chemical and biological weapons become a case of concern.


* '''1945 onwards''' Atomic Age/Anthropocene

"The nuclear holocaust becomes a theoretical scenario shortly after the beginning of this age, which starts following the detonation of the first nuclear weapon. In the 1950s, humanity enters a new age, facing not only existential risks from our natural environment, but also the possibility that we might be able to extinguish ourselves. During the 1960s, mutual assured destruction leads to the expansion of nuclear-armed submarines by both Cold War adversaries. In the same decade, the anti-nuclear movement launches, and the environmentalist movement soon adopts the cause of fighting climate change. Supervolcanoes are discovered in the early 1970s.[6] Global warming becomes widely recognized as a risk in the 1980s. The term "existential threat", beginning to spread around the 1960–80s during the Cold War, takes off in the 1990s and early 2000s.


* '''21st century''' Field of study consolidation

"Nick Bostrom introduces the term "existential risk", which emerges as a unified field of study.[2] By the early 2000s, scientists identify many other threats to human survival, including threats associated with artificial intelligence, biological weapons, nanotechnology, and high energy physics experiments. Unaligned artificial intelligence is recognized by some as the main threat within a century. Today, it is understood that the natural risks are dwarfed by the human-caused ones, turning the risk of extinction into an especially urgent issue."

(https://timelines.issarice.com/wiki/Timeline_of_existential_risk)


=Timelines=

=Typology=

==AI Risk==

From Issarice:

* '''Until 1950''' Fictional portrayals only

"Most discussion of AI safety is in the form of fictional portrayals. It warns of the risks of robots who, through either stupidity or lack of goal alignment, no longer remain under the control of humans.

* '''1950 to 2000''' Scientific speculation + fictional portrayals

"During this period, discussion of AI safety moves from merely being a topic of fiction to one that scientists who study technological trends start talking about. The era sees commentary by I. J. Good, Vernor Vinge, and Bill Joy."


* '''2000 to 2012''' Birth of AI safety organizations, close connections with transhumanism

"This period sees the creation of the Singularity Institute for Artificial Intelligence (SIAI) (which would later become the Machine Intelligence Research Institute (MIRI)) and the evolution of its mission from creating friendly AI to reducing the risk of unfriendly AI. The Future of Humanity Institute (FHI) and Global Catastrophic Risk Institute (GCRI) are also founded. AI safety work during this time is closely tied to transhumanism and has close connections with techno-utopianism. Peter Thiel and Jaan Tallinn are key funders of the early ecosystem."


* '''2013 to present''' Mainstreaming of AI safety, separation from transhumanism

"SIAI changes its name to MIRI, sells off the "Singularity" brand to Singularity University, grows considerably in size, and gets a lot of funding. Superintelligence, the book by Nick Bostrom, is released. The Future of Life Institute (FLI) and OpenAI are started, and the latter grows considerably. Other new organizations founded include the Center for the Study of Existential Risk (CSER), Leverhulme Centre for the Future of Intelligence (CFI), Center for Human-Compatible AI (CHAI), Berkeley Existential Risk Initiative (BERI), Ought, and the Center for Security and Emerging Technology (CSET). OpenAI in particular becomes quite famous and influential. Prominent individuals such as Elon Musk, Sam Altman, and Bill Gates talk about the importance of AI safety and the risks of unfriendly AI. Key funders of this ecosystem include Open Philanthropy and Elon Musk."

(https://timelines.issarice.com/wiki/Timeline_of_AI_safety)


==Nuclear Risk==

Broad timeline from Issarice:

* '''1960s''': The Cuban Missile Crisis threatens nuclear war


* '''1970s'''

"In addition to that, Zuberi notes that “by the late 1970s the defniition of proliferation changed from acquiring nuclear weapons or other explosive devices to developing a ‘nuclear explosive capability’”, and “consequently, the objective of safeguards changed from early detection of diversion of signifcant quantities of nuclear materials from peaceful to military pursuits to ‘prevention of development of nuclear explosive 4 ON NUCLEAR (DIS-)ORDER 121 capability’” (Zuberi 2003, p. 44)."


* '''1980s'''

"The decade was dominated by the Cold War superpower competition of the United States and the Soviet Union. Much of the world held its collective breath during the first years of the decade as tensions and the nuclear arms race heated up between the two rivals, leading to popular anti-nuclear protests worldwide and the nuclear freeze movement in the United States. The international community exhaled a bit in the second half of the decade as the United States and the Soviet Union earnestly sat down at the arms negotiating table and for the first time eliminated an entire category of nuclear weapons through the 1987 Intermediate-Range Nuclear Forces Treaty. The two countries also proceeded to negotiate cuts to their strategic nuclear forces, which ultimately would be realized in the landmark 1991 Strategic Arms Reduction Treaty. Although the U.S.-Soviet nuclear arms race was center stage, efforts to advance and constrain the nuclear weapons ambitions and programs of other countries played out in the wings, sometimes as part of the superpower drama. For instance, the United States shunted nonproliferation concerns aside in ignoring Pakistan’s nuclear weapons program because of that country’s role in fighting Soviet forces inside Afghanistan. Meanwhile, Iraq, North Korea, and South Africa advanced their nuclear weapons efforts in relative secrecy. In this decade, Iran began to secretly acquire uranium-enrichment-related technology from Pakistani suppliers. Taiwan’s covert nuclear weapons program, however, was squelched by U.S. pressure. Other nonproliferation gains included a joint declaration by Argentina and Brazil to pursue nuclear technology only for peaceful purposes, alleviating fears of a nuclear arms race between the two, and the conclusion of a nuclear-weapon-free zone in the South Pacific. Moreover, the NPT added 30 new states parties during the decade, including North Korea."


* '''2000s'''

"many countries began expressing a newfound interest in nuclear energy during the early 2000s."

(https://timelines.issarice.com/wiki/Timeline_of_nuclear_risk)


[[Category:Existential Risk]]
[[Category:Statistics]]
[[Category:Global Governance]]