Existential Risk Institute
This is an intriguing website I have been wading through; it may be of benefit to the community.
The xrisksinstitute has put together some excellent analysis of a variety of existential risks, defined as follows:
Existential Risks. It's important to note that a catastrophe need not be terminal to count as an existential risk. Some existential catastrophes are survivable. The most common definition of an existential risk is an event that either results in human annihilation or brings about an irreversible decline in our quality of life. From the transhumanist perspective, both prevent us from reaching a desirable (according to certain norms) posthuman mode of being. For the sake of conceptual clarity, it's helpful to disentangle the two components of the definition as follows. Let us call an existential risk that would result in annihilation an "extinction risk," and an existential risk that would result in a permanent state of privation a "stagnation risk." With this in mind, we can identify extinction risks as red-dot events, and stagnation risks as black-dot events that are sufficiently severe. (For reasons discussed in appendix 1 of The End, this typology dodges numerous problems that make Bostrom's typologies conceptually incoherent.)
I'm not going to pretend I have any idea what Bostrom's typology refers to.
There's some pretty heavy lifting going on over there. I am under medical advice to restrict myself to a maximum of 16 oz. curls.
Check it out: http://www.risksandreligion.org/#!key-ideas/mmd78

Comments
This page is from a dubious website
called risksandreligion.org.
One paragraph is called "To survive, we must go extinct."
This sounds vaguely like millennialist eschatology, but I could be wrong.
There are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns – the ones we don't know we don't know.
And you could be right
The analysis is thought-provoking, but not infallible.
The descriptive URL may be explained by the clear connection between any number of religious systems and their belief in an existential doomsday prophecy:
Their stated mission statement is:
And there are constant references to their theory as compared with alternative typologies.
Who knows?
"They'll say we're disturbing the peace, but there is no peace. What really bothers them is that we are disturbing the war." Howard Zinn
I am training with 12 oz. curls.
I hope to join you at 16 oz someday.
Peace out, tmp.
Practice, practice, practice
The key to any worthwhile self-improvement program.
"They'll say we're disturbing the peace, but there is no peace. What really bothers them is that we are disturbing the war." Howard Zinn
Artificial Intelligence Threat to Mankind
from a reliable source about the existential risk of artificial intelligence, a common sci-fi theme:
https://www.quora.com/Is-A-I-an-existential-threat-to-humanity
Speculative analysis or science fiction? Either or both?
"They'll say we're disturbing the peace, but there is no peace. What really bothers them is that we are disturbing the war." Howard Zinn
interesting.
and maybe TOP is more light-hearted and fun? /snark
i think it doesn't matter how bad it will get. There is only one thing that does matter, imo: to pour all effort into mitigating the bad, evolving, and learning to share resources. less really can be more.
“There are moments which are not calculable, and cannot be assessed in words; they live on in the solution of memory… ”
― Lawrence Durrell, "Justine"
It's a roadmap or framework for assessing risks humans face
…collectively. These risks range from external ones, such as an asteroid impact, to risks that arise from human activity, like global warming. Another kind of risk in play is the growing influence of human superstitions/religions, which often lead to apocalyptic actors. Then there are the risks of unintended consequences from the things we do, which we may not see until we are drowning in them. Attacking Iraq in 2003 is a short-scale example of the dangers posed by unintended consequences. This leads to the greatest risk humans face as a species: the danger that we will not evolve consciously, at a fast enough pace, to survive the social and environmental chaos we have created:
Again, the passage quoted is more of an outline of talking points, useful for analyzing the risks (threats) that humans must immediately address to avoid rapid decline and likely extinction.
This brings us to the author's final risk: that humans have not yet evolved, physically and mentally, into the kind of humans the future requires. For example, humans currently cannot (or will not) fully grasp the consequences of their actions in advance. They lack the concept-generating mechanisms needed to objectify existential threats (like global warming) so that they can truly comprehend them. There are clear limitations inherent in the mental machinery of humans, compounded by a reflexive, willful denial of reality. The result is a failure to take the decisive action needed for the next generation to survive. (Some philosophers call this "cognitive closure" or "epistemic boundedness.")
In closing, the author introduces one high-concept solution for further discussion: a global push to rapidly accelerate the advancement of science while, at the same time, developing a qualitative superintelligence that can help us save ourselves.
Excellent follow-up
Thx for the assist. I'm still wading through this site, and History Is A Weapon (http://www.historyisaweapon.com/hiawsitemap.html), in my spare time.
"They'll say we're disturbing the peace, but there is no peace. What really bothers them is that we are disturbing the war." Howard Zinn