
AI arms race

From Lesswrongwiki

Arbital has an article about this topic.

An AI arms race is a situation in which multiple parties compete to be the first to develop advanced machine intelligence.

Humanity has some historical experience with arms races involving nuclear weapons technology. However, "Arms races and intelligence explosions" identifies a few important differences between nuclear weapons and AI technology, which may create dynamics in AI arms races that we have not seen elsewhere.

  • If an intelligence explosion occurs, this could allow the first party passing the relevant threshold to develop extremely advanced technology in years, months, or less, creating a strong winner-takes-all effect.
  • The development of AI technology carries the risk of creating unfriendly AI, potentially causing human extinction.
  • Non-military benefits from AI will make arms control seem undesirable; the fact that AI development requires only researchers and computers will make arms control difficult. On the other hand, the risks involved provide strong reasons to try, and AI systems could themselves help enforce agreements.

If the benefits of an intelligence explosion accrue to the group that created it, while the risks affect the entire world, this creates an incentive to sacrifice safety for speed. In addition to the risk of accidentally creating an unfriendly AI, there is the risk that the winner of an arms race becomes a badly-behaved human singleton.
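The incentive structure described above can be illustrated with a toy two-team race model. This is a hypothetical sketch, not a formalism from the article: assume each team picks a safety level in [0, 1], speed is one minus safety, the probability of winning is a team's share of total speed, and a disaster (whose cost falls on everyone) occurs with probability one minus the winner's safety. When the winner-takes-all prize is large relative to the shared disaster cost, the payoff-maximizing safety level collapses toward zero.

```python
def expected_payoff(my_safety, rival_safety, benefit=100.0, disaster_cost=1000.0):
    """Expected payoff for one team choosing a safety level in [0, 1].

    Hypothetical modeling assumptions:
      - development speed is 1 - safety; win probability is my share of total speed
      - only the winner collects `benefit` (winner-takes-all)
      - the winner causes a disaster with probability 1 - (winner's safety),
        and `disaster_cost` falls on everyone, winner included
    """
    my_speed = 1.0 - my_safety
    rival_speed = 1.0 - rival_safety
    total = my_speed + rival_speed
    p_win = my_speed / total if total > 0 else 0.5
    # Disaster probability depends on whoever ends up winning the race.
    p_disaster = p_win * (1.0 - my_safety) + (1.0 - p_win) * (1.0 - rival_safety)
    return p_win * benefit - p_disaster * disaster_cost


def best_response(rival_safety, benefit=100.0, disaster_cost=1000.0, grid=101):
    """Safety level maximizing expected payoff against a fixed rival choice,
    searched over an evenly spaced grid of candidate levels."""
    levels = [k / (grid - 1) for k in range(grid)]
    return max(levels, key=lambda s: expected_payoff(s, rival_safety, benefit, disaster_cost))
```

For example, against a rival at safety 0.5, raising the prize from 10 to 10000 (with the shared disaster cost held at 1000) drives the best-response safety level from well above one half down to zero, reflecting the pressure to sacrifice safety for speed when the winner captures the benefit but the risk is shared.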
