Talk:End civilization as we know it

When discussing threats that might end civilization as we know it, we should use that more objective phrase rather than inventing the concept of "existential risk" and applying it to a false frame like "all of humanity" or "all life" or "the biosphere", which we just don't understand.

Physical survival of at least some humans is almost inevitable in any disaster scenario, and to discuss only those where really truly every human would die out is silly. So let's elaborate phrases that are generally accepted as meaning what is actually meant when the term is used, and over time try to differentiate those from objectively definable levels of risk, rather than trying to define one level of "existential" vs. "not".

It's just less wrong to acknowledge that the "end", the "civilization", the "we" and the "know" are all fairly subjective terms and will remain so, than to pretend we know what constitutes human existence, what would make it worth living vs. not, when a new subspecies ceases to be "human", and so on. -- A legion of trolls

Hi, thanks for contributing to the wiki. Since very few people look over it (and hence have a chance to debate the changes), the process, as described at the end of Help:User_Guide, is to err on the side of including only ideas generally accepted on the blog. As it is, much of LW canon is not currently on the wiki, and things that are, are mostly stubs, so finding novel things to include is far from being the only thing left to do.
If there's a new idea you wish to share, or disagreement you have with the way things are, post to the blog first, and only include it in the wiki if it becomes generally accepted (incl. as relevant). In this instance, LW generally disagrees with the idea that discussing situations where really truly every human would die out is silly. --Your friendly Preventer of Information Services, Vladimir Nesov 22:18, 23 January 2012 (UTC)

It has to be said that it's pretty silly to expect that anyone, writing at LW / OB or not, would know in what situations every human being would die out. It's silly to reinvent an existing term to pretend it applies only to such situations, though, which is why the definition had to be corrected. Finally, it remains silly to focus only on situations where the entire human species would disappear, as that's a form of rather dangerous blinders. So, to repeat, "to discuss _only_ those where really truly every human would die out is silly", yes.
But that's not what "existential risk" means on LW, nor in general usage. For instance, last night on TV news the phrase was used regarding a country's existence, which is no more than lines on a map in some people's thinking.
Yes, extreme scenarios have great value as a rule, both to anticipate high-impact, low-probability events and to expand thinking with dystopic/utopic scenarios.
The process of posting to a blog, waiting to see whether objections arise, and then, once some vague criterion is met, compiling the result into the wiki seems onerous and unlikely to succeed. Avoiding original research is fine; that's a Wikipedia rule too. I'd say that disagreements with wiki content belong on the talk page for that wiki concept. Discussing them on blogs spreads them out among hundreds of pages; in this case, on every page where "existential risk" is used.
So I guess I'm asking other users for a second opinion on all that. If unsatisfactory, I'll probably depart.