There's now a redirect from "end civilization as we know it", since this more precise phrase is what most people really mean when they throw "existential risk" around and expect it to be understood in the scope they intend, which it usually isn't. It would be better to be able to write sentences that actually use this common parlance, like:
- "climate-fueled pests moving north could end civilization as we know it, so they are worth some risks to stop"
- "for decades nuclear confrontation threatened to end civilization as we know it; do we need to do MAD again?"
- "an overly-trusted AI, friendly or not, could make an error or simply be fed bad data and end civilization as we know it too quickly to stop, simply by approving the wrong chemical for commercial use or something like that"
Rather than use the phrase "existential risk" only for things that probably really would kill every human being, it might be more sensible to outline specific extreme scenarios, perhaps a dozen, that might end civilization as we know it by some people's definition, while recognizing that by others' definitions, All in the Family already ended it.