Existential Risk Institute

This is an intriguing website I have been wading through that may be beneficial to the community.

The X-Risks Institute has put together some excellent analysis of a variety of existential risks, defined as:

Existential Risks. It's important to note that a catastrophe need not be terminal to count as an existential risk. Some existential catastrophes are survivable. The most common definition of an existential risk is an event that either results in human annihilation or brings about an irreversible decline in our quality of life. From the transhumanist perspective, both prevent us from reaching a desirable (according to certain norms) posthuman mode of being. For the sake of conceptual clarity, it's helpful to disentangle the two components of the definition as follows. Let us call an existential risk that would result in annihilation an "extinction risk," and an existential risk that would result in a permanent state of privation a "stagnation risk." With this in mind, we can identify extinction risks as red-dot events, and stagnation risks as black-dot events that are sufficiently severe. (For reasons discussed in appendix 1 of The End, this typology dodges numerous problems that make Bostrom's typologies conceptually incoherent.)

I'm not going to pretend I have any idea what Bostrom's typology refers to.

There's some pretty heavy lifting going on over there. I am under medical advice to restrict myself to a maximum of 16 oz. curls.

Check it out: http://www.risksandreligion.org/#!key-ideas/mmd78


Comments

tapu dali's picture

called risksandreligion.org.

One paragraph is called "To survive, we must go extinct."
This sounds vaguely like millennialist eschatology, but I could be wrong.


There are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns – the ones we don't know we don't know.

Meteor Man's picture

The analysis is thought-provoking, but not infallible.

The descriptive connection may be accounted for by the clear link between any number of religious systems and their belief in an existential doomsday prophecy:

Second, consider the case of religious apocalypticists, especially those who become what I call eschatological activists intent on bringing about the end of the world in an effort to hasten God's exacting of cosmic justice on the world (the ultimate theodicy). Groups of this sort don't merely want a fight, they want a fight to the death, and this makes them more likely to use catastrophic violence to accomplish their ends. Furthermore, history shows that such groups tend to arise, when they do, during periods of extreme social instability, economic uncertainty, and political chaos. This can be seen in the rise of the Islamic State — an apocalyptic group motivated by fantasies of Armageddon — in the aftermath of the US-led 2003 preemptive invasion, followed by the 2011 Syrian civil war.

The stated Mission Statement is:

Emerging technologies are introducing brand new risk scenarios that humanity has never before encountered. Some of these technologies will likely place unprecedented power in the hands of smaller groups, or even single individuals.

This situation needs to be studied with scholarly precision and scientific rigor. Not only must we examine the various tools of mass destruction — including nuclear weapons, biotechnology, synthetic biology, nanotechnology, and even artificial intelligence — but we also need to understand the various agents — such as lone wolves, ecoterrorists, and apocalyptic ideologues — who might attempt to use such tools to destroy civilization.

This is the central aim of the X-Risks Institute: to understand the unique technological and agential threats facing humanity today and in the coming centuries.

And there are constant references comparing their theory to alternative typologies.

Who knows?


"They'll say we're disturbing the peace, but there is no peace. What really bothers them is that we are disturbing the war." Howard Zinn

I hope to join you at 16 oz someday.


Peace out, tmp.

Meteor Man's picture

The key to any worthwhile self improvement program.



Meteor Man's picture

from a reliable source about the existential risk of artificial intelligence, a common sci-fi theme:

https://www.quora.com/Is-A-I-an-existential-threat-to-humanity

Speculative analysis or science fiction? Either or both?



pfiore8's picture

and maybe TOP is more light-hearted and fun? /snark

I think it doesn't matter how bad it will get. There is only one thing that does matter, imo: to pour all effort into mitigating the bad, evolving, and learning to share resources. Less really can be more.


“There are moments which are not calculable, and cannot be assessed in words; they live on in the solution of memory… ”
― Lawrence Durrell, "Justine"

Pluto's Republic's picture

…collectively. This ranges from external risks, such as an asteroid impact, to risks that come from human activity, like global warming. Another kind of risk in play is the growing influence of human superstitions and religions, which often lead to apocalyptic actors. Then there are the risks of unintended consequences from things we do, which we may not see until we are drowning in them. Attacking Iraq in 2003 is a short-scale example of the dangers posed by unintended consequences. This leads to the greatest risk humanity faces as a species: the danger that we will not evolve consciously, at a fast enough pace, to survive the social and environmental chaos we have created:

The fate of potentially billions of future generations depends on our actions this century. Just as certain intellectuals have been on the "right side of history" with respect to issues like gender equality, gay marriage, animal rights, and so on, so too must humanity position itself on the "right side of futurology" with respect to phenomena, from climate change to superintelligence, that could irreversibly compromise our potential to reach a posthuman state.

If the moral rationality of our ends fails to match the instrumental rationality of our means, then a catastrophe could be all but guaranteed.

This is a genuinely unique moment in human history. We're approaching a potentially catastrophic collision between (a) neoteric technologies that will enable humanity to manipulate and rearrange the physical world in unprecedented ways, and (b) archaic belief systems about what reality is like and, even more importantly, how it ought to be. The fact is that we're no longer irresponsible kids playing with matches — we're irresponsible kids playing with flamethrowers that could burn down the entire global village. And there's perhaps no better manifestation of our irresponsibility than our continued infatuation with faith-based beliefs in ancient revelations.... If we don't grow up as a species, as the late Christopher Hitchens liked to say, then we could be headed for grief.

Again, the passage quoted is more of an outline of talking points, useful for analyzing the risks (threats) that humans must immediately address to avoid rapid decline and likely extinction.

This brings us to the author's final risk, which is that humans have not yet evolved physically and mentally into the kind of humans the future requires. For example, humans currently cannot (or will not) fully grasp the consequences of their actions in advance. They lack the concept-generating mechanisms needed to objectify existential threats (like global warming) so they can truly comprehend them. There are clear limitations inherent in the mental machinery of humans, compounded by a reflexive, willful denial of reality. This results in a failure to take decisive action so that the next generation can survive. (Some philosophers call this "cognitive closure" or "epistemic boundedness.")

Humanity could thus be conceptually blind to any number of risks in the universe; such risks would be unknowable not necessarily because they're convoluted or esoteric, but because the computers behind our eyes simply weren't designed to understand them.

In closing, the author introduces one high-concept solution for further discussion: A global push to rapidly accelerate the advancement of science, while, at the same time, developing a qualitative superintelligence that can help us save ourselves.

Meteor Man's picture

Thx for the assist. I'm still wading through this site and History Is A Weapon (http://www.historyisaweapon.com/hiawsitemap.html) in my spare time.

