Algorithmic complexity

The Algorithmic Complexity or Kolmogorov Complexity of a set of data is the size of the shortest possible description of that data. It is an inverse measure of compressibility: the more data is needed for the shortest description, the more complex and random the data is. This is also one of the best accounts of randomness so far[1]. As the number of patterns in the data increases, there are more ways to compress it, that is, to describe it briefly, so its Kolmogorov complexity and its randomness both decrease. As the number of patterns in the data decreases, there are fewer ways to compress it or describe it briefly, so its Kolmogorov complexity and its randomness both increase. In the limiting case, there is no description of the data that takes less space than the data itself: the data is incompressible. [2]
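
Kolmogorov complexity itself is uncomputable, so in practice it can only be bounded from above. The sketch below (an illustration, not anything defined in this article) uses Python's standard zlib compressor as a crude stand-in for "shortest description": a highly patterned string compresses well, while a random string barely compresses at all.

  import os
  import zlib

  def approx_complexity(data: bytes) -> int:
      # Size in bytes of the zlib-compressed data: a rough upper bound on the
      # length of a description of `data`. True Kolmogorov complexity is
      # uncomputable, so this only illustrates the idea of compressibility.
      return len(zlib.compress(data, 9))

  patterned = b"ab" * 500          # 1000 bytes with an obvious repeating pattern
  random_like = os.urandom(1000)   # 1000 random bytes, almost surely incompressible

  print(approx_complexity(patterned))    # small: the repetition compresses well
  print(approx_complexity(random_like))  # near (or above) 1000: little compression

Running this typically shows the patterned input shrinking to a few dozen bytes while the random input stays roughly its original size, matching the intuition that fewer patterns mean higher complexity.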

More formally, the Kolmogorov complexity C(x) of a set x is the size in bits of the shortest binary program (in a fixed programming language) that prints the set x as its only output. If C(x) is equal to or greater than the size of x in bits, x is incompressible. [3]
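
Fixing a universal machine U as the reference "programming language" (the standard convention, left implicit above), this definition can be written as

  C(x) = \min \{ \lvert p \rvert : U(p) = x \}

where |p| is the length of the program p in bits, and x is incompressible when C(x) \ge \lvert x \rvert.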

This notion can be used to state many important results in the theory of computation. The most famous, perhaps, is Chaitin's incompleteness theorem, a version of Gödel’s incompleteness theorem.
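
As a rough sketch of one common formulation (not spelled out in this article): for any consistent formal system F whose theorems can be effectively enumerated, there is a constant L_F such that F proves no statement of the form

  C(x) > L_F

for any particular string x, even though C(x) > L_F holds for all but finitely many strings x.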

Blog Posts

References

  1. SIPSER, M. (1983). "A complexity theoretic approach to randomness". In Proceedings of the 15th ACM Symposium on Theory of Computing, pages 330-335. ACM, New York.
  2. FORTNOW, Lance. "Kolmogorov Complexity". Available at: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.120.4949&rep=rep1&type=pdf
  3. LI, Ming & VITÁNYI, Paul. "Algorithmic Complexity". Available at: http://homepages.cwi.nl/~paulv/papers/020608isb.pdf


See also