


References

[1] A.M. Abramov. Kolmogorov's pedagogic legacy. Russian Math. Surveys, 43(6):45–88, 1988.

[2] Y.S. Abu-Mostafa. The complexity of information extraction. IEEE Trans. Inform. Theory, 32(4):513–525, 1986.

[3] L. Adleman. Time, space, and randomness. Technical Report TM-131, MIT, Lab. Comput. Sci., March 1979.

[4] P. Adriaans and M. van Zaanen. Computational grammar induction for linguists. Grammars, 7:57–68, 2004.

[5] P. Adriaans and P.M.B. Vitanyi. Approximation of the two-part MDL code. IEEE Trans. Inform. Theory. To appear.

[6] V.N. Agafonov. Normal sequences and finite automata. Soviet Math. Dokl., 9:324–325, 1968.

[7] V.N. Agafonov. On algorithms, frequency and randomness. PhD thesis, University of Novosibirsk, Novosibirsk, 1970.

[8] G. Aggarwal, Q. Cheng, M.H. Goldwasser, M.Y. Kao, P. Moisset de Espanes, and R.T. Schweller. Complexities for generalized models of self-assembly. SIAM J. Comput., 34:1493–1515, 2005.

[9] M. Agrawal, E. Allender, and S. Rudich. Reductions in circuit complexity: an isomorphism theorem and a gap theorem. J. Comput. Syst. Sci., 57(2):127–143, 1998.

[10] P.S. Aleksandrov. A few words on A.N. Kolmogorov. Russian Math. Surveys, 38(4):5–7, 1983.

[11] V.M. Alekseev and M.V. Yakobson. Symbolic dynamics and hyperbolic dynamical systems. Physics Reports, 75:287–325, 1981.

[12] E. Allender. Some consequences of the existence of pseudorandom generators. J. Comput. System Sci., 39:101–124, 1989.

[13] E. Allender. Applications of time-bounded Kolmogorov complexity in complexity theory. In O. Watanabe, editor, Kolmogorov Complexity and Computational Complexity, pages 4–22. Springer-Verlag, Berlin, 1992.


[14] E. Allender. When worlds collide: derandomization, lower bounds and Kolmogorov complexity. In Proc. 21st Conf. Found. Software Technology Theor. Comput. Sci., volume 2245 of Lect. Notes Comput. Sci., pages 1–15, Berlin, 2001. Springer-Verlag.

[15] E. Allender, H.M. Buhrman, and M. Koucky. What can be efficiently reduced to the Kolmogorov-random strings. Annals Pure Applied Logic, 138:2–19, 2006.

[16] E. Allender, H.M. Buhrman, M. Koucky, D. van Melkebeek, and D. Ronneburger. Power from random strings. In Proc. 43rd IEEE Symp. Found. Comput. Sci., pages 669–678, New York, 2002. IEEE.

[17] E. Allender, M. Koucky, D. Ronneburger, and S. Roy. Derandomization and distinguishing complexity. In Proc. 18th IEEE Conf. Comput. Complexity, pages 209–220, 2003.

[18] E. Allender and R.S. Rubinstein. P-printable sets. SIAM J. Comput., 17:1193–1202, 1988.

[19] E. Allender and O. Watanabe. Kolmogorov complexity and degrees of tally sets. Inform. Comput., 86:160–178, 1990.

[20] A. Ambainis. Application of Kolmogorov complexity to inductive inference with limited memory. In Proc. 6th Int. Workshop Algorithmic Learning Theory, volume 997 of Lect. Notes Artif. Intell., pages 313–318, Berlin, 1995. Springer-Verlag.

[21] K. Ambos-Spies and A. Kucera. Randomness in computability theory. In P. Cholak, S. Lempp, M. Lerman, and R.A. Shore, editors, Computability Theory and Its Applications: Current Trends and Open Problems, volume 257 of Contemporary Mathematics, pages 1–14. American Math. Society, 2000.

[22] M. Anand and L. Orloci. Complexity in plant communities: the notion and quantification. J. Theor. Biol., 179:179–186, 1996.

[23] C. Ane and M.J. Sanderson. Missing the forest for the trees: Phylogenetic compression and its implications for inferring complex evolutionary histories. Systematic Biology, 54(1):146–157, 2005.

[24] D. Angluin. Algorithmic theory of information and randomness. Lecture notes postgraduate course, Edinburgh University, 1977/1978.

[25] L. Antunes and L. Fortnow. Sophistication revisited. In 30th Int. Coll. Automata, Lang., Program., volume 2719 of Lect. Notes Comput. Sci., pages 267–277, Berlin, 2003. Springer-Verlag.

[26] L. Antunes and L. Fortnow. Time-bounded universal distributions. Electr. Coll. Comput. Complexity, TR05-144, 2005.

[27] L. Antunes, L. Fortnow, A. Pinto, and A. Souto. Low-depth witnesses are easy to find. In Proc. 22nd IEEE Conf. Comput. Complexity, pages 46–51, 2007.

[28] L. Antunes, L. Fortnow, and D. van Melkebeek. Computational depth. In Proc. IEEE Conf. Comput. Complexity, pages 266–273, 2001.

[29] L. Antunes, L. Fortnow, D. van Melkebeek, and N.V. Vinodchandran. Computational depth: concepts and applications. Theor. Comput. Sci., 354:391–404, 2006.

[30] L. Antunes, L. Fortnow, and N.V. Vinodchandran. Using depth to capture average-case complexity. In Proc. 14th Int. Symp. Fundam. Comput. Theory, volume 2751 of Lect. Notes Comput. Sci., pages 303–310, Berlin, 2003. Springer-Verlag.

[31] B. Apolloni and C. Gentile. Sample size lower bounds in PAC learning by algorithmic complexity theory. Theor. Comput. Sci., 209:141–162, 1998.

[32] F. Argenti, V. Benci, P. Cerrai, A. Cordelli, S. Galatolo, and G. Menconi. Information and dynamical systems: a concrete measurement on sporadic dynamics. Chaos, Solitons and Fractals, 13:461–469, 2002.

[33] A. Arning, R. Agrawal, and P. Raghavan. A linear method for deviation detection in large databases. In Proc. 2nd Int. Conf. Knowledge Discov. Data Mining, pages 164–169. ACM-SIGKDD, 1996.

[34] V.I. Arnol'd. A few words on Andrei Nikolaevich Kolmogorov. Russian Math. Surveys, 43(6):43–44, 1988.

[35] E.A. Asarin. Individual random continuous functions. In Yu.V. Prokhorov, editor, Summaries of Reports of the First All-World Congress of the Bernouilli Society of Mathematical Statistics and Probability Theory, volume 1, page 450. Nauka, 1986. In Russian.

[36] E.A. Asarin. Some properties of Kolmogorov δ-random finite sequences. SIAM Theory Probab. Appl., 32:507–508, 1987.

[37] E.A. Asarin. On some properties of finite objects random in the algorithmic sense. Soviet Math. Dokl., 36:109–112, 1988.

[38] E.A. Asarin and A.V. Pokrovskii. Application of Kolmogorov complexity to analyzing the dynamics of controlled systems. Automat. and Telemekh., 1:25–33, 1986. In Russian.

[39] K.B. Athreya, J.M. Hitchcock, J.H. Lutz, and E. Mayordomo. Effective strong dimension in algorithmic information and computational complexity. SIAM J. Comput., 37:671–705, 2007.

[40] Y. Aumann, Y.Z. Ding, and M.O. Rabin. Everlasting security in the bounded storage model. IEEE Trans. Inform. Theory, 48(6):1668–1680, 2002.

[41] R.A. Baeza-Yates, R. Gavalda, G. Navarro, and R. Scheihing. Bounding the length of least common subsequences and forests. Theory Comput. Syst., 32(4):435–452, 1999.

[42] J.L. Balcazar and R.V. Book. Sets with small generalized Kolmogorov complexity. Acta Informatica, 23:679–688, 1986.

[43] J.L. Balcazar, H.M. Buhrman, and M. Hermo. Learnability of Kolmogorov-easy circuit expressions via queries. In P.M.B. Vitanyi, editor, Comput. Learning Theory; Proc. 2nd European Conf., volume 904 of Lect. Notes Artificial Intelligence, pages 112–124, Berlin, 1995. Springer-Verlag.

[44] J.L. Balcazar, J. Diaz, and J. Gabarro. Structural Complexity. Springer-Verlag, Berlin, 1988.

[45] J.L. Balcazar, R. Gavalda, and H.T. Siegelmann. Computational power of neural networks: a characterization in terms of Kolmogorov complexity. IEEE Trans. Inform. Theory, 43(4):1175–1183, 1997.

[46] J.L. Balcazar and U. Schoning. Logarithmic advice classes. Theor. Comput. Sci., 99:279–290, 1992.

[47] A.R. Barron and T.M. Cover. Minimum complexity density estimation. IEEE Trans. Inform. Theory, 37:1034–1054, 1991.


[48] A.R. Barron, J. Rissanen, and B. Yu. The minimum description length principle in coding and modeling. IEEE Trans. Inform. Theory, 44(6):2743–2760, 1998.

[49] J.M. Barzdins. Complexity of programs to determine whether natural numbers not greater than n belong to a recursively enumerable set. Soviet Math. Dokl., 9:1251–1254, 1968.

[50] J.M. Barzdins. On computability by probabilistic machines. Soviet Math. Dokl., 10:1464–1467, 1969.

[51] J.M. Barzdins. On the relative frequency of solution of algorithmically unsolvable mass problems. Soviet Math. Dokl., 11:459–462, 1970.

[52] J.M. Barzdins. Algorithmic information theory. In Encyclopaedia of Mathematics, volume 1, pages 140–142. D. Reidel (Kluwer Academic Publishers), 1988. Updated and annotated translation of the Soviet Mathematical Encyclopaedia.

[53] J.M. Barzdins and R.V. Freivalds. On the prediction of general recursive functions. Soviet Math. Dokl., 13:1251–1254 (1224–1228), 1972.

[54] M. Baumert, V. Baier, J. Hauersen, N. Wessel, U. Meyerfeldt, A. Schirdenwan, and A. Voss. Forecasting of life threatening arrhythmias using the compression entropy of heart rate. Methods Information Medicine, 43(2):202–206, 2004.

[55] Th. Bayes. An essay towards solving a problem in the doctrine of chances. Philos. Trans. Royal Soc., 53:376–398, 1763. Ibid., 54:298–310, 1764, R. Price, ed.

[56] R. Beigel, H.M. Buhrman, P. Fejer, L. Fortnow, P. Grabowski, L. Longpre, A. Muchnik, F. Stephan, and L. Torenvliet. Enumerations of the Kolmogorov function. Electr. Coll. Comput. Complexity, 2004. TR04-015.

[57] R. Beigel, W. Gasarch, M. Li, and L. Zhang. Addition in log2 n steps on average: a simple analysis. Theor. Comput. Sci., 191(1-2):245–248, 1998.

[58] T.C. Bell, J.G. Cleary, and I.H. Witten. Text Compression. Prentice Hall, Englewood Cliffs, 1990.

[59] A.M. Ben-Amram and Z. Galil. On pointers versus addresses. J. Assoc. Comput. Mach., 39(3):617–648, 1992.

[60] A.M. Ben-Amram and H. Petersen. Backing up in singly linked lists. In Proc. 31st ACM Symp. Theory Comput., pages 780–786, 1999.

[61] F. Benatti, T. Kruger, Ra. Siegmund-Schultze, and A. Skola. Entropy and quantum Kolmogorov complexity. Commun. Math. Phys., 265(2):437–461, 2006.

[62] D. Benedetto, E. Caglioti, and V. Loreto. Language trees and zipping. Physical Review Letters, 88(4):048702, 4 pages, 2002.

[63] C.H. Bennett. The thermodynamics of computation—a review. Int. J. Theor. Physics, 21:905–940, 1982.

[64] C.H. Bennett. Demons, engines and the second law. Scientific American, pages 108–116, Nov. 1987.

[65] C.H. Bennett. Dissipation, information, computational complexity and the definition of organization. In D. Pines, editor, Emerging Syntheses in Science, volume 1 of Santa Fe Institute Studies in the Science of Complexity, pages 297–313. Addison-Wesley, 1987.


[66] C.H. Bennett. Logical depth and physical complexity. In R. Herken, editor, The Universal Turing Machine; A Half-Century Survey, pages 227–258. Oxford University Press, 1988. In Germany: Kammerer & Unverzagt.

[67] C.H. Bennett. How to define complexity in physics, and why. In W.H. Zurek, editor, Complexity, Entropy and the Physics of Information, pages 137–148. Addison-Wesley, 1991.

[68] C.H. Bennett, P. Gacs, M. Li, P.M.B. Vitanyi, and W. Zurek. Information distance. IEEE Trans. Inform. Theory, 44(4):1407–1423, 1998.

[69] C.H. Bennett and M. Gardner. The random number omega bids fair to hold the mysteries of the universe. Scientific American, 241:20–34, November 1979.

[70] C.H. Bennett and R. Landauer. The fundamental physical limits of computation. Scientific American, 256(7):48–56, July 1985.

[71] C.H. Bennett, M. Li, and B. Ma. Chain letters and evolutionary histories. Scientific American, 288(6):76–81, 2003.

[72] C.H. Bennett and P.W. Shor. Quantum information theory. IEEE Trans. Inform. Theory, 44:2724–2742, 1998.

[73] T. Berger. Rate Distortion Theory: A Mathematical Basis for Data Compression. Prentice-Hall, Englewood Cliffs, NJ, 1971.

[74] A. Berthiaume, W. van Dam, and S. Laplante. Quantum Kolmogorov complexity. J. Comput. Syst. Sci., 63(2):201–221, 2001.

[75] W. Bialek, I. Nemenman, and N. Tishby. Predictability, complexity, and learning. Neural Computation, 13:2409–2463, 2001.

[76] L. Bienvenue, W. Merkle, and A.K. Shen. A simple proof of the Miller–Yu theorem. Fundamenta Informaticae, 83:1–4, 2008.

[77] M. Blum. On the size of machines. Inform. Contr., 11:257–265, 1967.

[78] M. Blum and O. Goldreich. Towards a computational theory of statistical tests. In Proc. 33rd IEEE Symp. Found. Comput. Sci., pages 406–416, 1992.

[79] A. Blumer, A. Ehrenfeucht, D. Haussler, and M. Warmuth. Learnability and the Vapnik-Chervonenkis dimension. J. Assoc. Comput. Mach., 36(4):929–965, 1989.

[80] D.E. Boekee, R.A. Kraak, and E. Backer. On complexity and syntactic information. IEEE Trans. Systems Man Cybernet., 12:71–79, 1982.

[81] N.N. Bogolyubov, B.V. Gnedenko, and S.L. Sobolev. Andrei Nikolaevich Kolmogorov (on his eightieth birthday). Russian Math. Surveys, 38(4):9–27, 1983.

[82] R.V. Book. On sets with small information content. In O. Watanabe, editor, Kolmogorov Complexity and Computational Complexity, pages 23–42. Springer-Verlag, Berlin, 1992.

[83] R.V. Book. On languages reducible to algorithmically random languages. SIAM J. Comput., 23:1275–1282, 1994.

[84] R.V. Book and J.H. Lutz. On languages with very high space-bounded Kolmogorov complexity. SIAM J. Comput., 22(2):395–402, 1993.

[85] R.V. Book, J.H. Lutz, and K. Wagner. An observation on probability versus randomness with applications to complexity classes. Math. Systems Theory, 27:201–209, 1994.


[86] R.V. Book and O. Watanabe. On random hard sets for NP. Inform. Comput., 125:70–76, 1996.

[87] G. Boolos. A new proof of the Godel incompleteness theorem. Notices Amer. Math. Soc., 46:388–390, 1989.

[88] E. Borel. Lecons sur la theorie des fonctions. Gauthier-Villars, Paris, 3rd edition, 1927.

[89] S. Bose, L. Rallan, and V. Vedral. Communication capacity of quantum computation. Phys. Rev. Lett., 85:5448–5451, 2000.

[90] B. Brejova. Analyzing variants of shellsort. Inform. Process. Lett., 79(5):223–228, 2001.

[91] A.A. Brudno. On the complexity of paths of dynamic systems. Russian Math. Surveys, 33:197–198, 1978. Translation of Uspekhi Mat. Nauk.

[92] A.A. Brudno. Entropy and the complexity of trajectories of a dynamical system. Trans. Mosc. Math. Soc., 44:127–151, 1983.

[93] H.M. Buhrman, L. Fortnow, and S. Laplante. Resource-bounded Kolmogorov complexity revisited. SIAM J. Comput., 31(3):887–905, 2001.

[94] H.M. Buhrman, L. Fortnow, I. Newman, and N.K. Vereshchagin. Increasing Kolmogorov complexity. In Proc. 22nd Symp. Theor. Aspects Comp. Sc., volume 3404 of Lect. Notes Comput. Sci., pages 412–421, Berlin, 2005. Springer-Verlag.

[95] H.M. Buhrman, E. Hemaspaandra, and L. Longpre. SPARSE reduces conjunctively to TALLY. SIAM J. Comput., 24(4):673–681, 1995.

[96] H.M. Buhrman, J.H. Hoepman, and P.M.B. Vitanyi. Space-efficient routing tables for almost all networks and the incompressibility method. SIAM J. Comput., 28(4):1414–1432, 1999.

[97] H.M. Buhrman, T. Jiang, M. Li, and P.M.B. Vitanyi. New applications of the incompressibility method: Part II. Theor. Comput. Sci., 235(1):59–70, 2000.

[98] H.M. Buhrman, H. Klauck, N.K. Vereshchagin, and P.M.B. Vitanyi. Individual communication complexity. J. Comput. System Sci., 73:973–985, 2007.

[99] H.M. Buhrman, S. Laplante, and P. Miltersen. New bounds for the language compression problem. In Proc. 15th IEEE Conf. Comput. Complexity, pages 126–130, 2000.

[100] H.M. Buhrman, T. Lee, and D. van Melkebeek. Language compression and pseudorandom generators. Computational Complexity, 14:247–274, 2005.

[101] H.M. Buhrman, M. Li, J.T. Tromp, and P.M.B. Vitanyi. Kolmogorov random graphs and the incompressibility method. SIAM J. Comput., 29(2):590–599, 1999.

[102] H.M. Buhrman and L. Longpre. Compressibility and resource bounded measure. SIAM J. Comput., 31(3):876–886, 2002.

[103] H.M. Buhrman and E. Mayordomo. An excursion to the Kolmogorov random strings. J. Comput. Syst. Sci., 54(3):393–399, 1997.

[104] H.M. Buhrman and P. Orponen. Random strings make hard instances. J. Comput. System Sci., 53(2):261–266, 1996.

[105] H.M. Buhrman and L. Torenvliet. Complicated complementations. In Proc. 14th IEEE Conf. Comput. Complexity, pages 227–236, 1999.


[106] H.M. Buhrman and L. Torenvliet. Randomness is hard. SIAM J. Comput., 30(5):1485–1501, 2000.

[107] H.M. Buhrman, D. van Melkebeek, K.W. Regan, D. Sivakumar, and M. Strauss. A generalization of resource-bounded measure, with application to the BPP vs EXP problem. SIAM J. Comput., 30(2):576–601, 2000.

[108] M.S. Burgin. Generalized Kolmogorov complexity and duality in computational theory. Soviet Math. Dokl., 25(3):559–564, 1982.

[109] J.-Y. Cai and J. Hartmanis. On Hausdorff and topological dimensions of the Kolmogorov complexity of the real line. J. Comput. System Sci., 49(3):605–619, 1994.

[110] J.-Y. Cai and L. Hemachandra. A note on enumerative counting. Inform. Process. Lett., 38:215–219, 1991.

[111] C. Calude. Information and Randomness: An Algorithmic Perspective. Springer-Verlag, Berlin, 1994.

[112] C. Calude. A characterization of c.e. random reals. Theor. Comput. Sci., 271(1-2):3–14, 2002.

[113] C. Calude, S. Marcus, and L. Staiger. A topological characterization of random sequences. Inform. Process. Lett., 88:245–250, 2003.

[114] C.S. Calude. Theories of Computational Complexity, chapter 4. North-Holland, Amsterdam, 1988.

[115] C.S. Calude, I. Chitescu, and L. Staiger. P. Martin-Lof tests: representability and embeddability. Revue Roumaine Math. Pures Appl., 30:719–732, 1985.

[116] C.S. Calude, P.H. Hertling, B. Khoussainov, and Y. Wang. Recursively enumerable reals and Chaitin Ω numbers. Theor. Comput. Sci., 255:125–149, 2001.

[117] C.S. Calude and H. Jurgensen. Randomness as an invariant for number representations. In H. Maurer, J. Karhumaki, and G. Rozenberg, editors, Results and Trends in Theoretical Computer Science, pages 44–66. Springer-Verlag, Berlin, 1994.

[118] R. Carnap. Logical Foundations of Probability. Univ. Chicago Press, Chicago, 1950.

[119] J. Castro and J.L. Balcazar. Simple PAC learning of simple decision lists. In Proc. 6th Int. Workshop Algorithmic Learning Theory, volume 997 of Lect. Notes Artif. Intell., pages 239–248, Berlin, 1995. Springer-Verlag.

[120] J. Castro and D. Guijarro. PACs, simple-PAC and query learning. Inform. Process. Lett., 73(1-2):11–16, 2000.

[121] C.M. Caves. Entropy and information: How much information is needed to assign a probability? In W.H. Zurek, editor, Complexity, Entropy and the Physics of Information, pages 91–115. Addison-Wesley, New York, 1991.

[122] C.M. Caves. Information, entropy and chaos. In J.J. Halliwell, J. Perez-Mercader, and W.H. Zurek, editors, Physical Origins of Time Asymmetry, pages 47–89. Cambridge Univ. Press, Cambridge, England, 1994.

[123] M. Cebrian, M. Alfonseca, and A. Ortega. Common pitfalls using normalized compression distance: what to watch out for in a compressor. Commun. Inform. Syst., 5(4):367–384, 2005.


[124] M. Cebrian, M. Alfonseca, and A. Ortega. The normalized compression distance is resistant to noise. IEEE Trans. Inform. Theory, 53(5):1895–1900, 2007.

[125] G.J. Chaitin. On the length of programs for computing finite binary sequences. J. Assoc. Comput. Mach., 13:547–569, 1966.

[126] G.J. Chaitin. On the length of programs for computing finite binary sequences: statistical considerations. J. Assoc. Comput. Mach., 16:145–159, 1969.

[127] G.J. Chaitin. On the simplicity and speed of programs for computing infinite sets of natural numbers. J. Assoc. Comput. Mach., 16:407–422, 1969.

[128] G.J. Chaitin. On the difficulty of computations. IEEE Trans. Inform. Theory, 16:5–9, 1970.

[129] G.J. Chaitin. Computational complexity and Godel's incompleteness theorem. SIGACT News, 9:11–12, 1971.

[130] G.J. Chaitin. Information-theoretic computational complexity. IEEE Trans. Inform. Theory, 20:10–15, 1974. Reprinted in T. Tymoczko, editor, New Directions in the Philosophy of Mathematics, Birkhauser, 1986.

[131] G.J. Chaitin. Information-theoretic limitations of formal systems. J. Assoc. Comput. Mach., 21:403–424, 1974.

[132] G.J. Chaitin. Randomness and mathematical proof. Scientific American, 232:47–52, May 1975.

[133] G.J. Chaitin. A theory of program size formally identical to information theory. J. Assoc. Comput. Mach., 22:329–340, 1975.

[134] G.J. Chaitin. Algorithmic entropy of sets. Comput. Math. Appl., 2:233–245, 1976.

[135] G.J. Chaitin. Information-theoretic characterizations of recursive infinite strings. Theor. Comput. Sci., 2:45–48, 1976.

[136] G.J. Chaitin. Algorithmic information theory. IBM J. Res. Develop., 21:350–359, 1977.

[137] G.J. Chaitin. Program size, oracles, and the jump operation. Osaka J. Math., 14:139–149, 1977.

[138] G.J. Chaitin. Toward a mathematical definition of "life". In R.D. Levine and M. Tribus, editors, The Maximal Entropy Formalism, pages 477–498. MIT Press, 1979.

[139] G.J. Chaitin. Algorithmic information theory. In Encyclopedia of Statistical Sciences, volume 1, pages 38–41. Wiley, 1982.

[140] G.J. Chaitin. Godel's theorem and information. Int. J. Theor. Physics, 22:941–954, 1982. Reprinted in T. Tymoczko, editor, New Directions in the Philosophy of Mathematics, Birkhauser, Boston, 1986.

[141] G.J. Chaitin. Randomness and Godel's theorem. Mondes en Developement, 14, No. 54-55:125–128, 356, 1986.

[142] G.J. Chaitin. Algorithmic Information Theory. Cambridge Univ. Press, Cambridge, England, 1987.

[143] G.J. Chaitin. Beyond Godel's proof. IBM Res. Magazine, 25:12–15, Fall 1987.

[144] G.J. Chaitin. Computing the busy beaver function. In T.M. Cover and B. Gopinath, editors, Open Problems in Communication and Computation, pages 108–112. Springer-Verlag, 1987.


[145] G.J. Chaitin. Incompleteness theorems for random reals. Advances in Applied Math., 8:119–146, 1987.

[146] G.J. Chaitin. Information, Randomness and Incompleteness—Papers on Algorithmic Information Theory. World Scientific, Singapore, 1987.

[147] G.J. Chaitin. An algebraic equation for the halting probability. In R. Herken, editor, The Universal Turing Machine; A Half-Century Survey, pages 279–284. Oxford University Press, 1988. In Germany: Kammerer & Unverzagt.

[148] G.J. Chaitin. Information-Theoretic Incompleteness. World Scientific, Singapore, 1992.

[149] G.J. Chaitin. Information-theoretic incompleteness. Applied Mathematics and Computation, 52:83–101, 1992.

[150] G.J. Chaitin. On the number of n-bit strings with maximum complexity. Applied Mathematics and Computation, 59:97–100, 1993.

[151] G.J. Chaitin. A new version of algorithmic information theory. Complexity, 1(4):55–59, 1995/1996.

[152] G.J. Chaitin and J.T. Schwartz. A note on Monte-Carlo primality tests and algorithmic information theory. Comm. Pure Applied Math., 31:521–527, 1978.

[153] D.G. Champernowne. The construction of decimals normal in the scale of ten. J. London Math. Soc., 8:254–260, 1933.

[154] M. Charikar, E. Lehman, D. Liu, R. Panigrahy, M. Prabhakaran, A. Rasala, A. Sahai, and A. Shelat. The smallest grammar. IEEE Trans. Inform. Theory, 51(7):2554–2576, 2005.

[155] N. Chater. Reconciling simplicity and likelihood principles in perceptual organization. Psychological Review, 103:566–581, 1996.

[156] N. Chater. The search for simplicity: a fundamental cognitive principle? The Quarterly J. Exper. Psych.: Sect. A, 52(2):273–302, 1999.

[157] N. Chater. Cognitive science: the logic of human learning. Nature, 407:572–573, 2000.

[158] N. Chater and P.M.B. Vitanyi. The generalized universal law of generalization. J. Math. Psychology, 47(3):346–369, 2003.

[159] N. Chater and P.M.B. Vitanyi. Simplicity: A unifying principle in cognitive science? Trends in Cognitive Sciences, 7(1):19–22, 2003.

[160] N. Chater and P.M.B. Vitanyi. "Ideal learning" of natural language: positive results about learning from positive evidence. J. Math. Psychology, 51(3):135–206, 2007.

[161] N. Chater, P.M.B. Vitanyi, and N. Steward. Universal generalization and universal inter-item confusability. Behavior and Brain Sciences, 24(4):559–660, 2001.

[162] F. Chen, J. Xu, F. Gu, X. Yu, X. Meng, and Z. Qiu. Dynamic process of information transmission complexity in human brains. Biol. Cybernetics, 83:355–366, 2000.

[163] X. Chen, B. Francia, M. Li, B. Mckinnon, and A. Seker. Shared information and program plagiarism detection. IEEE Trans. Inform. Theory, 50(7):1545–1550, 2004.

[164] X. Chen, S. Kwong, and M. Li. A compression algorithm for DNA sequences. IEEE Eng. Med. Biol. Magaz., 20(4):61–66, 2001.


[165] Q. Cheng and F. Fang. Kolmogorov random graphs only have trivial stable colorings. Inform. Process. Lett., 81(3):133–136, 2002.

[166] H. Chernoff. A measure of asymptotic efficiency for tests of a hypothesis based on the sums of observations. Ann. Math. Stat., 23:493–509, 1952.

[167] A.V. Chernov, An.A. Muchnik, A.E. Romashchenko, A.K. Shen, and N.K. Vereshchagin. Upper semi-lattice of binary strings with the relation "x is simple conditional to y". Theor. Comput. Sci., 271(1-2):69–95, 2002.

[168] I. Chlamtac and A. Farago. A new approach to the design and analysis of peer-to-peer mobile networks. Wireless Networks, 5:149–156, 1999.

[169] M. Chrobak and M. Li. k + 1 heads are better than k for PDAs. J. Comput. System Sci., 37:144–155, 1988.

[170] F.R.K. Chung, R.E. Tarjan, W.J. Paul, and R. Reischuk. Coding strings by pairs of strings. SIAM J. Algebra Discrete Math., 6(3):445–461, 1985.

[171] A. Church. On the concept of a random sequence. Bull. Amer. Math. Soc., 46:130–135, 1940.

[172] R.L. Cilibrasi. The CompLearn toolkit. www.complearn.org, 2003.

[173] R.L. Cilibrasi and P.M.B. Vitanyi. Clustering by compression. IEEE Trans. Inform. Theory, 51(4):1523–1545, 2005.

[174] R.L. Cilibrasi and P.M.B. Vitanyi. Similarity of objects and the meaning of words. In Proc. 3rd Conf. Theory Appl. Models Comput., volume 3959 of Lect. Notes Comput. Sci., pages 21–45, Berlin, 2006. Springer-Verlag.

[175] R.L. Cilibrasi and P.M.B. Vitanyi. The Google similarity distance. IEEE Trans. Knowledge and Data Engineering, 19(3):370–383, 2007.

[176] R.L. Cilibrasi, P.M.B. Vitanyi, and R. de Wolf. Algorithmic clustering of music based on string compression. Computer Music J., 28(4):49–67, 2004.

[177] J.T. Coffey and R.M. Goodman. Any code of which we cannot think is good. IEEE Trans. Inform. Theory, 36:1453–1461, 1990.

[178] J.D. Collier. Two faces of Maxwell's demon reveal the nature of irreversibility. Stud. Hist. Phil. Sci., 21(2):257–268, 1990.

[179] J.D. Collier. Information originates in symmetry breaking. Symmetry: Culture and Science, 7:247–256, 1996.

[180] T.M. Cover. Enumerative source encoding. IEEE Trans. Inform. Theory, 19:73–77, 1973.

[181] T.M. Cover. Generalization on patterns using Kolmogorov complexity. In Proc. 1st Int. Conf. Pattern Recognition, pages 551–553, 1973.

[182] T.M. Cover. On the determination of the irrationality of the mean of a random variable. Ann. Statist., 1:862–871, 1973.

[183] T.M. Cover. Universal gambling schemes and the complexity measures of Kolmogorov and Chaitin. Technical Report 12, Statistics Department, Stanford University, October 1974.

[184] T.M. Cover. Kolmogorov complexity, data compression, and inference. In J.K. Skwirzynski, editor, The Impact of Processing Techniques on Communications, pages 23–33. Martinus Nijhoff Publishers, The Hague, Netherlands, 1985.

[185] T.M. Cover, P. Gacs, and R.M. Gray. Kolmogorov's contributions to information theory and algorithmic complexity. Ann. Probab., 17:840–865, 1989.


[186] T.M. Cover and R.C. King. A convergent gambling estimate of the entropy of English. IEEE Trans. Inform. Theory, 24:413–421, 1978.

[187] T.M. Cover and J.A. Thomas. Elements of Information Theory. Wiley & Sons, New York, 1991.

[188] J.P. Crutchfield and D.P. Feldman. Statistical complexity of simple one-dimensional spin systems. Phys. Rev. E, 55(2):1239–1242, 1997.

[189] J.P. Crutchfield and K. Young. Computation at the onset of chaos. In W.H. Zurek, editor, Complexity, Entropy and the Physics of Information, pages 223–269. Addison-Wesley, New York, 1991.

[190] R.R. Cuykendall. Kolmogorov information and VLSI lower bounds. PhD thesis, University of California, Los Angeles, Dec. 1984.

[191] R.P. Daley. Complexity and randomness. In R. Rustin, editor, Computational Complexity; Courant Comput. Sci. Symp. 7, pages 113–122, New York, 1971. Algorithmics Press.

[192] R.P. Daley. An example of information and computation resource trade-off. J. Assoc. Comput. Mach., 20(4):687–695, 1973.

[193] R.P. Daley. Minimal-program complexity of sequences with restricted resources. Inform. Contr., 23:301–312, 1973.

[194] R.P. Daley. The extent and density of sequences within the minimum-program complexity hierarchies. J. Comput. System Sci., 9:151–163, 1974.

[195] R.P. Daley. Minimal-program complexity of pseudo-recursive and pseudo-random sequences. Math. Systems Theory, 9:83–94, 1975.

[196] R.P. Daley. Noncomplex sequences: characterizations and examples. J. Symbolic Logic, 41:626–638, 1976.

[197] R.P. Daley. On the inference of optimal descriptions. Theor. Comput. Sci., 4:301–319, 1977.

[198] R.P. Daley. Quantitative and qualitative information in computation. Inform. Contr., 45:236–244, 1980.

[199] R.P. Daley. The process complexity and the understanding of sequences. In Proc. Symp. Summer School MFCS, High Tatras, September 1973.

[200] J. Daugman. The importance of being random: statistical principles of iris recognition. Pattern Recognition, 36:279–291, 2003.

[201] G. Davie. Characterising the Martin-Lof random sequences using computably enumerable sets of measure one. Inform. Process. Lett., 92(3):157–160, 2004.

[202] P.C.W. Davies. Why is the physical world so comprehensible? In W.H. Zurek, editor, Complexity, Entropy and the Physics of Information, pages 61–70. Addison-Wesley, New York, 1991.

[203] L.D. Davisson. Universal noiseless encoding. IEEE Trans. Inform. Theory, 19:783–795, 1973.

[204] A.P. Dawid. Discussion of papers by Rissanen and by Wallace and Dowe. Comput. J., 42(4):323–326, 1999. Discussions and rejoinders from both camps can be found in the same issue.

[205] B. de Finetti. Probability, Induction, and Statistics. Wiley, 1972.

[206] E.D. Demaine and A. Lopez-Ortiz. A linear lower bound on index size for text retrieval. J. Algorithms, 48(1):2–15, 2003.

[207] F. Denis. Learning regular languages from simple positive examples. Machine Learning, 44(1-2):37–66, 2001.

[208] F. Denis, C. D'Halluin, and R. Gilleron. PAC learning with simple examples. In Proc. 13th Symp. Theor. Aspects Comput. Sci., volume 1046 of Lect. Notes Comput. Sci., pages 231–242, Berlin, 1996. Springer-Verlag.

[209] F. Denis and R. Gilleron. PAC learning under helpful distributions. Theor. Informatics Applications, 35:129–148, 2001.

[210] A. DeSantis, G. Markowsky, and M.N. Wegman. Learning probabilistic prediction functions. In Proc. 29th IEEE Symp. Found. Comput. Sci., pages 110–119, 1988.

[211] T.G. Dewey. Algorithmic complexity of a protein. Phys. Review E, 54(1):R39–R41, 1996.

[212] J.-E. Dies. Information et complexite. Ann. Inst. Henri Poincare, B, 12:365–390, 1976. Ibid., 14:113–118, 1978.

[213] M. Dietzfelbinger. Lower bounds on computation time for various models in computational complexity theory. PhD thesis, Dept. Comput. Sci., Univ. Illinois at Chicago, 1987.

[214] M. Dietzfelbinger. The speed of copying on one-tape off-line Turing machines. Inform. Process. Lett., 33:83–90, 1989/1990.

[215] M. Dietzfelbinger and W. Maass. The complexity of matrix transposition on one-tape off-line Turing machines with output tape. Theor. Comput. Sci., 108(2):271–290, 1993.

[216] M. Dietzfelbinger, W. Maass, and G. Schnitger. The complexity of matrix transposition on one-tape off-line Turing machines. Theor. Comput. Sci., 82(1):113–129, 1991.

[217] D. Ding and L. Yu. There is no SW-complete c.e. real. J. Symb. Logic, 69(4):1163–1170, 2004.

[218] D. Donoho. The Kolmogorov sampler. Ann. Statistics. Submitted.

[219] J.L. Doob. Kolmogorov's early work on convergence theory and foundations. Ann. Probab., 17:815–821, 1989.

[220] R. Downey and D.R. Hirschfeldt. Algorithmic Randomness and Complexity. Springer-Verlag, New York. To appear.

[221] R.G. Downey, D.R. Hirschfeldt, and G. LaForte. Randomness and reducibility. J. Comput. Syst. Sci., 68(1):96–114, 2004.

[222] R.G. Downey, D.R. Hirschfeldt, and A. Nies. Randomness, computability, and density. SIAM J. Comput., 31(4):1169–1183, 2002.

[223] R.G. Downey, D.R. Hirschfeldt, A. Nies, and F. Stephan. Trivial reals. In Proc. 7th and 8th Asian Logic Confs, pages 103–131. Singapore Univ. Press, 2003.

[224] J.C. Dubacq, B. Durand, and E. Formenti. Kolmogorov complexity and cellular automata classification. Theor. Comput. Sci., 259:271–285, 2001.

[225] B. Durand, L. Levin, and A.K. Shen. Complex tilings. In Proc. 33rd ACM Symp. Theory Comput., pages 732–739, 2001.

[226] B. Durand and S. Porrot. Comparison between the complexity of a function and the complexity of its graph. Theor. Comput. Sci., 271(1-2):37–46, 2002.

[227] B. Durand, A.K. Shen, and N.K. Vereshchagin. Descriptive complexity of computable sequences. Theor. Comput. Sci., 271(1-2):47–58, 2002.

[228] B. Durand and N.K. Vereshchagin. Kolmogorov-Loveland stochasticity for finite strings. Inform. Process. Lett., 91:263–269, 2004.

[229] P. Duris, Z. Galil, W.J. Paul, and R. Reischuk. Two nonlinear lower bounds for on-line computations. Inform. Contr., 60:1–11, 1984.

[230] E.B. Dynkin. Kolmogorov and the theory of Markov processes. Ann. Probab., 17:822–832, 1989.

[231] V.D. Dzhunushaliev. Kolmogorov algorithmic complexity and its probability interpretation in quantum gravity. Class. Quantum Grav., 15:603–612, 1998.

[232] T. Ebert, W. Merkle, and H. Vollmer. On the autoreducibility of random sequences. SIAM J. Comput., 32(6):1542–1569, 2003.

[233] P. Erdos and J. Spencer. Probabilistic Methods in Combinatorics. Academic Press, 1974.

[234] S.C. Evans and B. Barnett. Network security through conservation of complexity. In Proc. IEEE Military Communications Conf., pages 1133–1138, 2002.

[235] S.C. Evans, S.F. Bush, and J. Hershey. Information assurance through Kolmogorov complexity. In Proc. DARPA Information Survivability Conf. and Exposition, volume 2, pages 1322–1331, 2001.

[236] R. Falk and C. Konold. Making sense of randomness: implicit encoding as a basis for judgement. Psychological Rev., 104(2):301–318, 1997.

[237] M. Feder. Maximal entropy as special case of the minimum length description criterion. IEEE Trans. Inform. Theory, 32:847–849, 1986.

[238] J. Feigenbaum, L. Fortnow, S. Laplante, and A. Naik. On coherence, random-self-reducibility, and self-correction. Comput. Complexity, 7:174–191, 1998.

[239] J. Feldman. Minimization of Boolean complexity in human concept learning. Nature, 407:630–633, 2000.

[240] J. Feldman. The simplicity principle in human concept learning. Current Directions in Psychological Science, 12(6):227–232, 2003.

[241] J.H. Felker. A link between information and energy. Proc. IRE, 40:728–729, 1952. Discussion, Ibid., 52(1954), 1191.

[242] W. Feller. An Introduction to Probability Theory and Its Applications, volume 1. Wiley, 3rd edition, 1968.

[243] S. Fenner, L. Fortnow, S.A. Kurtz, and L. Li. An oracle builder's toolkit. Inform. Comput., 182(2):95–136, 2003.

[244] P. Ferragina, R. Giancarlo, V. Greco, G. Manzini, and G. Valiente. Compression-based classification of biological sequences and structures via the Universal Similarity Metric: experimental assessment. BMC Bioinformatics, 8(1):252, doi:10.1186/1471-2105-8-252, 2007.

[245] T.L. Fine. On the apparent convergence of relative frequencies and its implications. IEEE Trans. Inform. Theory, 16:251–257, 1970.

[246] T.L. Fine. Theories of Probability. Academic Press, 1973.

[247] T.L. Fine. Uniformly reasonable source encoding is often practically impossible. IEEE Trans. Inform. Theory, 21:368–373, 1975.

[248] R.A. Fisher. On the mathematical foundations of theoretical statistics. Philos. Trans. Royal Soc. London, Ser. A, 222:309–368, 1922.

[249] P. Flocchini, E. Kranakis, D. Krizanc, F.L. Luccio, and N. Santoro. Sorting multisets in anonymous rings. J. Parall. Distrib. Comput., 64(2):254–265, 2004.

[250] J. Ford. How random is a random coin toss? Physics Today, 36:40–47, April 1983.

[251] J. Ford. Chaos: solving the unsolvable, predicting the unpredictable. In M.F. Barnsley and S.G. Demko, editors, Chaotic Dynamics and Fractals. Academic Press, 1986.

[252] L. Fortnow and M. Kummer. Resource-bounded instance complexity. Theor. Comput. Sci. A, 161:123–140, 1996.

[253] L. Fortnow and S. Laplante. Circuit complexity a la Kolmogorov. Inform. Comput., 123:121–126, 1995.

[254] L. Fortnow and S. Laplante. Nearly optimal language compression using extractors. In Proc. 15th Symp. Theor. Aspects Comput. Sci., volume 1373 of Lect. Notes Comput. Sci., pages 84–93, Berlin, 1998. Springer-Verlag.

[255] L. Fortnow, T. Lee, and N.K. Vereshchagin. Kolmogorov complexity with error. In Symp. Theor. Aspects Comput. Sci., volume 3884 of Lect. Notes Comput. Sci., pages 137–148, Berlin, 2006. Springer-Verlag.

[256] L. Fortnow and J.H. Lutz. Prediction and dimension. J. Comput. Syst. Sci., 70:570–589, 2005.

[257] L. Fortnow, A. Pavan, and A.L. Selman. Distributionally hard languages. Theory Comput. Syst., 34:245–261, 2001.

[258] W.L. Fouche. Identifying randomness given by high descriptive complexity. Acta Applicandae Mathematicae, 34:313–328, 1994.

[259] W.L. Fouche. Descriptive complexity and reflective properties of combinatorial configurations. J. London Math. Soc., 54(2):199–208, 1996.

[260] R.V. Freivalds. On the running time of deterministic and nondeterministic Turing machines. Latv. Mat. Ezhegodnik, 23:158–165, 1979. In Russian.

[261] R.V. Freivalds and M. Karpinski. Lower time bounds for randomized computation. In Proc. 22nd Int. Colloq. Automata, Lang. Prog., volume 944 of Lect. Notes Comput. Sci., pages 183–195, Berlin, 1995. Springer-Verlag.

[262] B. Fu. With quasi-linear queries, EXP is not polynomial-time Turing reducible to sparse sets. SIAM J. Comput., 24(5):1082–1090, 1995.

[263] P. Gacs. On the symmetry of algorithmic information. Soviet Math. Dokl., 15:1477–1480, 1974. Correction, Ibid., 15:1480, 1974.

[264] P. Gacs. Komplexitat und Zufalligkeit. PhD thesis, Fachbereich Mathematik, J.W. Goethe Universitat, Frankfurt am Main, 1978.

[265] P. Gacs. Exact expressions for some randomness tests. Z. Math. Logik Grundl. Math., 26:385–394, 1980.

[266] P. Gacs. On the relation between descriptional complexity and algorithmic probability. Theor. Comput. Sci., 22:71–93, 1983.

[267] P. Gacs. Every sequence is reducible to a random sequence. Inform. Contr., 70:186–192, 1986. See also: A. Kucera, Measure, Π^0_1-classes and complete extensions of PA, pp. 245–259 in: H.D. Ebbinghaus, G.H. Muller, and G.E. Sacks, editors, Recursion Theory Week, Lect. Notes Math., Vol. 1141, Springer-Verlag, Berlin, 1985.

[268] P. Gacs. Randomness and probability—complexity of description. In Kotz-Johnson, editor, Encyclopedia of Statistical Sciences, volume 7, pages 551–555. Wiley, 1986.

[269] P. Gacs. Lecture notes on descriptional complexity and randomness. Technical report, Comput. Sci. Dept., Boston Univ., 1988–2007.

[270] P. Gacs. Gregory J. Chaitin, Algorithmic Information Theory. J. Symbolic Logic, 54:624–627, 1989. Book review.

[271] P. Gacs. The Boltzmann entropy and randomness tests. In Proc. 2nd IEEE Workshop Phys. Comput., pages 209–216, 1994.

[272] P. Gacs. Quantum algorithmic entropy. J. Phys. A: Math. Gen., 34:6859–6880, 2001.

[273] P. Gacs and J. Korner. Common information is far less than mutual information. Problems of Control and Inform. Theory, 2:149–162, 1973.

[274] P. Gacs, J. Tromp, and P.M.B. Vitanyi. Algorithmic statistics. IEEE Trans. Inform. Theory, 47(6):2443–2463, 2001. Correction, Ibid., 48(8):2427, 2002.

[275] H. Gaifman and M. Snir. Probabilities over rich languages, randomness and testing. J. Symbolic Logic, 47:495–548, 1982.

[276] Z. Galil, R. Kannan, and E. Szemeredi. On 3-pushdown graphs with large separators. Combinatorica, 9:9–19, 1989.

[277] Z. Galil, R. Kannan, and E. Szemeredi. On nontrivial separators for k-page graphs and simulations by nondeterministic one-tape Turing machines. J. Comput. System Sci., 38:134–149, 1989.

[278] Z. Galil and J. Seiferas. Time-space-optimal string matching. J. Comput. System Sci., 26(3):280–294, 1983.

[279] R.G. Gallager. Information Theory and Reliable Communication. Wiley, 1968.

[280] A. Gammerman and V.G. Vovk. Kolmogorov complexity: sources, theory and applications. Comput. J., 42:252–255, 1999.

[281] A. Gammerman and V.G. Vovk. Prediction algorithms and confidence measures based on algorithmic randomness theory. Theor. Comput. Sci., 287(1):209–217, 2002.

[282] Q. Gao, M. Li, and P.M.B. Vitanyi. Applying MDL to learning best model granularity. Artificial Intelligence, 121(1-2):1–29, 2000.

[283] M.R. Garey and D.S. Johnson. Computers and Intractability, a Guide to the Theory of NP-Completeness. W.H. Freeman and Co., 1979.

[284] P. Gaspard and X.-J. Wang. Sporadicity: between periodic and chaotic dynamical behaviors. Proc. Nat'l Acad. Sci. USA, 85:4591–4595, 1988.

[285] R. Gavalda. Kolmogorov Randomness and Its Applications to Structural Complexity Theory. PhD thesis, Universitat Politecnica de Catalunya, Barcelona, 1992.

[286] R. Gavalda, L. Torenvliet, O. Watanabe, and J.L. Balcazar. Generalized Kolmogorov complexity in relativized separations. In Proc. 15th Conf. Math. Found. Comput. Sci., volume 452 of Lect. Notes Comput. Sci., pages 269–276, Berlin, 1991. Springer-Verlag.

[287] R. Gavalda and O. Watanabe. On the computational complexity of small descriptions. SIAM J. Comput., 22(6):1257–1275, 1993.

[288] M. Gell-Mann. The Quark and the Jaguar. W.H. Freeman, New York, 1994.

[289] M. Gell-Mann. Remarks on simplicity and complexity. Complexity, 1(1):16–19, 1995.

[290] M. Gell-Mann and J.B. Hartle. Strong decoherence. In D.H. Feng and B.L. Hu, editors, Proc. 4th Drexel Symp. Quantum Non-Integrability – Quantum Classical Correspondence, pages 3–35. International Press, Cambridge, Mass., 1997.

[291] M. Gell-Mann and S. Lloyd. Information measures, effective complexity, and total information. Complexity, 2(1):44–52, 1996.

[292] M. Gereb-Graus and M. Li. Three one-way heads cannot do string matching. J. Comput. System Sci., 48:1–8, 1994.

[293] B.V. Gnedenko. Andrei Nikolaevich Kolmogorov (on the occasion of his seventieth birthday). Russian Math. Surveys, 28(5):5–16, 1973.

[294] S. Goel and S. Bush. Kolmogorov complexity estimates for detection of viruses in biologically inspired security systems: a comparison with traditional approaches. Complexity Journal, 9(2), 2003.

[295] E.M. Gold. Language identification in the limit. Inform. Contr., 10:447–474, 1967.

[296] A. Goldberg and M. Sipser. Compression and ranking. SIAM J. Comput., 20:524–536, 1991.

[297] O. Goldreich, S. Goldwasser, and S. Micali. How to construct random functions. J. Assoc. Comput. Mach., 33:792–807, 1986.

[298] P. Grassberger. Towards a quantitative theory of self-generated complexity. Int. J. Theor. Physics, 25(9):907–938, 1986.

[299] T.L. Griffiths and J.B. Tenenbaum. From algorithmic to subjective randomness. In Advances in Neural Information Processing Systems 16 (NIPS 2003). MIT Press, 2004.

[300] T.L. Griffiths and J.B. Tenenbaum. From mere coincidences to meaningful discoveries. Cognition, 103(2):180–226, 2007.

[301] R.I. Grigorchuk. A connection between algorithmic problems and entropy characteristics of groups. Soviet Math. Dokl., 32:356–360, 1985.

[302] S. Grigorieff and J.Y. Marion. Kolmogorov complexity and nondeterminism. Theor. Comput. Sci., 271(1-2):151–180, 2002.

[303] P. Grunwald. A minimum description length approach to grammar inference. In Lect. Notes Artif. Intell., volume 1040, pages 203–216, 1996.

[304] P.D. Grunwald. The Minimum Description Length Principle. MIT Press, 2007.

[305] P.D. Grunwald, J. Myung, and M.A. Pitt, editors. Advances in Minimum Description Length: Theory and Applications. MIT Press, 2005.

[306] P.D. Grunwald and P.M.B. Vitanyi. Kolmogorov complexity and information theory. With an interpretation in terms of questions and answers. J. Logic, Lang., Inform., 12(4):497–529, 2003.

[307] Y. Gurevich. The logic of computer science column. EATCS Bulletin, 35:71–82, June 1988.

[308] V.G. Gurzadyan. Kolmogorov complexity as a descriptor of cosmic microwave background maps. Europhysics Lett., 46(1):114–117, 1999.

[309] S. Gutmann. Using classical probability to guarantee properties of infinite quantum sequences. Phys. Rev. A, 52(5):3560–3562, 1995.

[310] U. Hahn and N. Chater. Concepts and similarity. In K. Lamberts and D. Shanks, editors, Knowledge, Concepts and Categories, pages 43–92. Psychology Press/MIT Press, 1997.

[311] U. Hahn, N. Chater, and L.B. Richardson. Similarity as transformation. Cognition, 87:1–32, 2003.

[312] D. Hammer and A.K. Shen. A strange application of Kolmogorov complexity. Theory Comput. Systems, 31(1):1–4, 1998.

[313] Y. Han and L.A. Hemaspaandra. Pseudorandom generators and the frequency of simplicity. J. Cryptology, 9(4):251–261, 1996.

[314] J.B. Hartle. Quantum pasts and utility of history. Phys. Scr., T76:67–77, 1998.

[315] J. Hartmanis. Generalized Kolmogorov complexity and the structure of feasible computations. In Proc. 24th IEEE Symp. Found. Comput. Sci., pages 439–445, 1983.

[316] J. Hartmanis and L. Hemachandra. On sparse oracles separating feasible complexity classes. Inform. Process. Lett., 28:291–295, 1988.

[317] J. Hartmanis and R.E. Stearns. On the computational complexity of algorithms. Trans. Amer. Math. Soc., 117:285–306, 1965.

[318] R. Heim. On the algorithmic foundation of information theory. IEEE Trans. Inform. Theory, 25:557–566, 1979.

[319] L. Hemachandra and S. Rudich. On the complexity of ranking. J. Comput. System Sci., 41(2):251–271, 1990.

[320] L. Hemachandra and G. Wechsung. Kolmogorov characterizations of complexity classes. Theor. Comput. Sci., 83(2):313–322, 1991.

[321] M. Hermo. Compressibility and uniform complexity. Inform. Process. Lett., 62(5):256–264, 1997.

[322] M. Hermo and E. Mayordomo. A note on polynomial size circuits with low resource-bounded Kolmogorov complexity. Math. Systems Theory, 27:247–356, 1994.

[323] P. Hertling and K. Weihrauch. Random elements in effective topological spaces with measure. Inform. Computation, 181(1):32–56, 2003.

[324] J.M. Hitchcock, M. Lopez-Valdes, and E. Mayordomo. Scaled dimension and the Kolmogorov complexity. In Proc. 29th Int. Symp. Math. Found. Comput. Sci., volume 3153 of Lect. Notes Comput. Sci., pages 476–487, Berlin, 2004. Springer-Verlag.

[325] J.M. Hitchcock and J.H. Lutz. Why computational complexity requires stricter martingales. Theory Comput. Syst., 39(2):277–296, 2006.

[326] J.M. Hitchcock and N.V. Vinodchandran. Dimension, entropy rates, and compression. J. Comput. Syst. Sci., 72(4):760–782, 2006.

[327] G. Hotz. Komplexitat als Kriterium in der Theorienbildung. Akademie der Wissenschaften und der Literatur (Mainz)/Abhandlungen Mathematisch-Naturwissenschaftliche Klasse, 1, 1988. Steiner-Verlag, Wiesbaden.

[328] T. Housel and V.A. Kanevsky. Re-engineering business processes: a complexity theory approach. Inform. Syst. Operations Res., 33(4), 1995.

[329] D.A. Huffman. A method for construction of minimum-redundancy codes. Proceedings IRE, 40:1098–1101, 1952.

[330] M. Huhne. Linear speed-up does not hold on Turing machines with tree storages. Inform. Process. Lett., 47(6):313–318, 1993.

[331] M. Huhne. On the power of several queues. Theor. Comput. Sci., 113(1):75–91, 1993.

[332] M. Hutter. New error bounds for Solomonoff prediction. J. Comput. Syst. Sci., 62(4):653–667, 2001.

[333] M. Hutter. The fastest and shortest algorithm for all well-defined problems. Int. J. Found. Comput. Sci., 13(3):431–443, 2002.

[334] M. Hutter. Convergence and loss bounds for Bayesian sequence prediction. IEEE Trans. Inform. Theory, 49(8):2061–2067, 2003.

[335] M. Hutter. On the existence and convergence of computable universal priors. In Proc. 14th Conf. Algorithmic Learn. Theory, volume 2842 of Lect. Notes Comput. Sci., pages 298–312, Berlin, 2003. Springer-Verlag.

[336] M. Hutter. Optimality of universal Bayesian sequence prediction for general loss and alphabet. J. Mach. Learn. Res., 4:971–1000, 2003.

[337] M. Hutter. Universal Artificial Intelligence: Sequential Decisions Based on Algorithmic Probability. Springer-Verlag, Berlin, 2005.

[338] M. Hutter. Sequential predictions based on algorithmic complexity. J. Comput. Syst. Sci., 72:95–117, 2006.

[339] M. Hutter. On universal prediction and Bayesian confirmation. Theor. Comput. Sci., 384(1):33–48, 2007.

[340] M. Hutter and An.A. Muchnik. On semimeasures predicting Martin-Lof random sequences. Theor. Comput. Sci., 382:247–261, 2007.

[341] D.T. Huynh. Resource-bounded Kolmogorov complexity of hard languages. In Proc. 1st IEEE Conf. Struct. Complexity Theory, volume 223 of Lect. Notes Comput. Sci., pages 184–195, Berlin, 1986. Springer-Verlag.

[342] D.T. Huynh. Effective entropies and data compression. Inform. Comput., 90(1):67–85, 1991.

[343] D.T. Huynh. The effective entropies of some extensions of context-free languages. Inform. Process. Lett., 37:165–169, 1991.

[344] D.T. Huynh. Non-uniform complexity and the randomness of certain complete languages. Theor. Comput. Sci., 96:305–324, 1992.

[345] L. Ilie, S. Yu, and K. Zhang. Word complexity and repetition in words. Int. J. Found. Comput. Sci., 15(1):41–55, 2004.

[346] R. Impagliazzo, R. Shaltiel, and A. Wigderson. Extractors and pseudorandom generators with optimal seed-length. In Proc. 32nd ACM Symp. Theory Comput., pages 1–10, 2000.

[347] K. Jacobs. Turingmaschinen und zufallige 0-1-Folgen. Selecta Mathematica, 2:141–167, 1970.

[348] A.K. Jagota and K.W. Regan. Performance of neural net heuristics for maximum clique on diverse highly compressible graphs. J. Global Optimization, 10(4):439–465, 1997.

[349] A. Jakoby, R. Reischuk, and C. Schindelhauer. Malign distributions for average case circuit complexity. Inform. Comput., 150:187–208, 1999.

[350] E.T. Jaynes. Prior probabilities. IEEE Trans. Systems Man Cybernet., 4:227–241, 1968.

[351] E.T. Jaynes. On the rationale of maximum entropy methods. Proceedings of the IEEE, 70:939–952, 1982.

[352] E.T. Jaynes. Papers on Probability, Statistics, and Statistical Physics. Kluwer Academic Publishers, 2nd edition, 1989.

[353] T. Jiang and M. Li. On the approximation of shortest common supersequences and longest common subsequences. SIAM J. Comput., 24(5):1122–1139, 1995.

[354] T. Jiang and M. Li. k one-way heads cannot do string-matching. J. Comput. Syst. Sci., 53(3):513–524, 1996.

[355] T. Jiang, M. Li, and P.M.B. Vitanyi. New applications of the incompressibility method. Comput. J., 42(4):287–293, 1999.

[356] T. Jiang, M. Li, and P.M.B. Vitanyi. Average-case analysis of algorithms using Kolmogorov complexity. J. Comput. Sci. Tech., 15(5):402–408, 2000.

[357] T. Jiang, M. Li, and P.M.B. Vitanyi. A lower bound on average-case complexity of Shellsort. J. Assoc. Comp. Mach., 47(5):905–911, 2000.

[358] T. Jiang, M. Li, and P.M.B. Vitanyi. The average-case area of Heilbronn-type triangles. Rand. Struct. Alg., 20(2):206–219, 2002.

[359] T. Jiang, J.I. Seiferas, and P.M.B. Vitanyi. Two heads are better than two tapes. J. Assoc. Comput. Mach., 44(2):237–256, 1997.

[360] D. Joseph and M. Sitharam. Kolmogorov complexity, restricted nondeterminism and generalized spectra. In C. Choffrut and T. Lengauer, editors, Proc. 7th Symp. Theor. Aspects Comput. Sci., pages 152–164, Berlin, 1990. Springer-Verlag.

[361] D.J. Juedes, J.I. Lathrop, and J.H. Lutz. Computational depth and reducibility. Theor. Comput. Sci., 132(1-2):37–70, 1994.

[362] D.W. Juedes and J.H. Lutz. Kolmogorov complexity, complexity cores, and the distribution of hardness. In O. Watanabe, editor, Kolmogorov Complexity and Computational Complexity, pages 43–65. Springer-Verlag, Berlin, 1992.

[363] D.W. Juedes and J.H. Lutz. Modeling time-bounded prefix Kolmogorov complexity. Theory Comput. Systems, 33:111–123, 2000.

[364] T. Jurdzinski and K. Lorys. Lower bound technique for length-reducing automata. Inf. Comput., 205(9):1387–1412, 2007.

[365] Y. Kalnishkan. General linear relations between different types of predictive complexity. Theor. Comput. Sci., 271(1-2):181–200, 2002.

[366] A. Kaltchenko. Algorithms for estimation of information distance with applications in bioinformatics and linguistics. In Canadian Conf. Electrical and Comput. Engin., volume 4, pages 2255–2258, 2004.

[367] B. Kalyanasundaram and G. Schnitger. The probabilistic communication complexity of set intersection. SIAM J. Discrete Math., 5(4):545–557, 1992.

[368] T. Kamae. On Kolmogorov's complexity and information. Osaka J. Math., 10:305–307, 1973.

[369] T. Kamae. Subsequences of normal sequences. Israel J. Math., 16:121–149, 1973.

[370] T. Kamae and B. Weiss. Normal numbers and selection rules. Israel J. Math., 21:101–110, 1975.

[371] M.I. Kanovich. On the decision complexity of algorithms. Soviet Math. Dokl., 10:700–701, 1969.

[372] M.I. Kanovich. Complexity of resolution of a recursively enumerable set as a criterion of its universality. Soviet Math. Dokl., 11:1224–1228, 1970.

[373] M.I. Kanovich. On the complexity of enumeration and decision of predicates. Soviet Math. Dokl., 11:17–20, 1970.

[374] M.I. Kanovich. On the decision complexity of recursively enumerable sets. Soviet Math. Dokl., 11:704–706, 1970.

[375] M.I. Kanovich. On the complexity of Boolean function minimization. Soviet Math. Dokl., 12(3):720–724, 1971.

[376] M.I. Kanovich. On the precision of a complexity criterion for nonrecursiveness and universality. Soviet Math. Dokl., 18:232–236, 1977.

[377] M.I. Kanovich. An estimate of the complexity of arithmetic incompleteness. Soviet Math. Dokl., 19:206–210, 1978.

[378] M.I. Kanovich and N.V. Petri. Some theorems on the complexity of normal algorithms and their computations. Soviet Math. Dokl., 10(1):233–234, 1969.

[379] F. Kaspar and H.G. Schuster. Easily calculable measures for the complexity of spatiotemporal patterns. Phys. Review A, 36(2):842–848, July 1987.

[380] H.P. Katseff. Complexity dips in infinite binary sequences. Inform. Contr., 38:258–263, 1978.

[381] H.P. Katseff and M. Sipser. Several results in program size complexity. Theor. Comput. Sci., 15:291–309, 1981.

[382] J. Kececioglu, M. Li, and J.T. Tromp. Reconstructing a DNA sequence from erroneous copies. Theor. Comput. Sci., 185(1):3–13, 1997.

[383] J.G. Kemeny. The use of simplicity in induction. Philos. Rev., 62:391–408, 1953.

[384] E. Keogh, S. Lonardi, and C.A. Ratanamahatana. Towards parameter-free data mining. In Proc. 10th ACM SIGKDD Int. Conf. Knowledge Discovery and Data Mining, pages 206–215, 2004.

[385] E. Keogh, S. Lonardi, C.A. Ratanamahatana, L. Wei, H.S. Lee, and J. Handley. Compression-based data mining of sequential data. Data Mining and Knowledge Discovery, 14:99–129, 2007.

[386] H.A. Keuzenkamp and M. McAleer. Simplicity, scientific inference and econometric modelling. The Economic Journal, 105:1–21, 1995.

[387] A.I. Khinchin. Mathematical Foundations of Information Theory. Dover, 1957.

[388] J.C. Kieffer and E.H. Yang. Sequential codes, lossless compression of individual sequences, and Kolmogorov complexity. IEEE Trans. Inform. Theory, 42(1):29–39, 1996.

[389] J.C. Kieffer and E.H. Yang. Grammar-based codes: a new class of universal lossless source codes. IEEE Trans. Inform. Theory, 46(3):737–754, 2000.

[390] M. Kikuchi. Kolmogorov complexity and the second incompleteness theorem. Arch. Math. Logic, 36(1):437–443, 1997.

[391] W.W. Kirchherr. Kolmogorov complexity and random graphs. Inform. Process. Lett., 41:125–130, 1992.

[392] W.W. Kirchherr, M. Li, and P.M.B. Vitanyi. The miraculous universal distribution. Math. Intelligencer, 19(4):7–15, 1997.

[393] S.R. Kirk and S. Jenkins. Information theory-based software metrics and obfuscation. Journal of Systems and Software, 72:179–186, 2004.

[394] D.E. Knuth. The Art of Computer Programming. Volume 1: Fundamental Algorithms. Addison-Wesley, 2nd edition, 1973.

[395] D.E. Knuth. The Art of Computer Programming. Volume 2: Seminumerical Algorithms, pages 163–166. Addison-Wesley, 2nd edition, 1981.

[396] Ker-I Ko. On the definition of infinite pseudo-random sequences. Theor. Comput. Sci., 48:9–34, 1986.

[397] Ker-I Ko. On the complexity of learning minimum time-bounded Turing machines. SIAM J. Comput., 20:962–986, 1991.

[398] K. Kobayashi. On the structure of one-tape nondeterministic Turing machine time hierarchy. Theor. Comput. Sci., 40:175–193, 1985.

[399] K. Kobayashi. Σ^0_n-complete properties of programs and Martin-Lof randomness. Inform. Process. Lett., 46:37–42, 1993.

[400] K. Kobayashi. On malign input distributions for algorithms. IEICE Trans. Inform. and Syst., E76-D(6):634–640, 1993.

[401] K. Kobayashi. The Kolmogorov complexity, the universal distribution and the coding theorem with generalized length functions. IEEE Trans. Inform. Theory, 43(3):816–826, 1997.

[402] A. Kocsor, A. Kertesz-Farkas, L. Kajan, and S. Pongor. Application of compression-based distance measures to protein sequence classification: a methodology study. Bioinformatics, 22(4):407–412, 2006.

[403] A.N. Kolmogorov. Grundbegriffe der Wahrscheinlichkeitsrechnung. Springer-Verlag, Berlin, 1933. English translation (by N. Morrison): Foundations of the Theory of Probability, Chelsea, 1956; 2nd Russian edition: Osnovnye Poniatiia Teorii Veroiatnostei, Nauka, 1974.

[404] A.N. Kolmogorov. On tables of random numbers. Sankhya, The Indian Journal of Statistics, Ser. A, 25:369–376, 1963.

[405] A.N. Kolmogorov. Three approaches to the quantitative definition of information. Problems Inform. Transmission, 1(1):1–7, 1965.

[406] A.N. Kolmogorov. Logical basis for information theory and probability theory. IEEE Trans. Inform. Theory, 14(5):662–664, 1968.

[407] A.N. Kolmogorov. Some theorems on algorithmic entropy and the algorithmic quantity of information. Uspekhi Mat. Nauk, 23(2):201, 1968. Meeting of the Moscow Mathematical Society.

[408] A.N. Kolmogorov. On the logical foundations of information theory and probability theory. Problems Inform. Transmission, 5:1–4, 1969.

[409] A.N. Kolmogorov. Complexity of algorithms and objective definition of randomness. Uspekhi Mat. Nauk, 29(4):155, 1974. Abstract of a talk at the Moscow Math. Soc. meeting 4/16/1974. In Russian.

[410] A.N. Kolmogorov. Combinatorial foundations of information theory and the calculus of probabilities. Russian Math. Surveys, 38(4):29–40, 1983.

[411] A.N. Kolmogorov. On logical foundations of probability theory. In K. Ito and Yu.V. Prokhorov, editors, Probability Theory and Mathematical Statistics, volume 1021 of Lect. Notes Math., pages 1–5. Springer-Verlag, Berlin, 1983.

[412] A.N. Kolmogorov. Memories of P.S. Aleksandrov. Russian Math. Surveys, 41(6):225–246, 1986.

[413] A.N. Kolmogorov. Information Theory and Theory of Algorithms, Selected Works, volume 3. Nauka, 1987. Edited by Yu.V. Prokhorov and A.N. Shiryaev. In Russian.

[414] A.N. Kolmogorov. Letters of A.N. Kolmogorov to A. Heyting. Russian Math. Surveys, 43(6):89–93, 1988.

[415] A.N. Kolmogorov and V.A. Uspensky. On the definition of an algorithm. Uspekhi Mat. Nauk, 13(4):3–28, 1958. In Russian. English translation: Amer. Math. Soc. Translat., 29:2(1963), 217–245.

[416] A.N. Kolmogorov and V.A. Uspensky. Algorithms and randomness. SIAM Theory Probab. Appl., 32:389–412, 1987. Without annoying translation errors pp. 3–53 in: Yu.V. Prokhorov and V.V. Sazonov, editors, Proc. 1st World Congress of the Bernoulli Society (Tashkent 1986), Vol. 1: Probab. Theory and Appl., VNU Science Press, Utrecht, 1987.

[417] D.K. Kondepudi. Non-equilibrium polymers, entropy, and algorithmic information. In W.H. Zurek, editor, Complexity, Entropy and the Physics of Information, pages 199–206. Addison-Wesley, New York, 1991.

[418] M. Koppel. Complexity, depth, and sophistication. Complex Systems, 1:1087–1091, 1987.

[419] M. Koppel. Structure. In R. Herken, editor, The Universal Turing Machine: A Half-Century Survey, pages 435–452. Oxford University Press, 1988. In Germany: Kammerer & Unverzagt, Hamburg.

[420] M. Koppel and H. Atlan. Program-length complexity, sophistication and induction. Memo, 1988.

[421] M. Koucky. On traversal sequences, exploration sequences and completeness of Kolmogorov random strings. PhD thesis, Rutgers, the State Univ., New Jersey, 2003.

[422] L.G. Kraft. A device for quantizing, grouping and coding amplitude modulated pulses. Master's thesis, Dept. of Electrical Engineering, M.I.T., Cambridge, Mass., 1949.

[423] E. Kranakis and D. Krizanc. Lower bounds for compact routing. In Proc. 13th Symp. Theor. Aspects Comput. Sci., volume 1046 of Lect. Notes Comput. Sci., pages 529–540, Berlin, 1996. Springer-Verlag.

[424] E. Kranakis, D. Krizanc, and F. Luccio. On recognizing a string on an anonymous ring. Theory Comput. Syst., 34(1):3–12, 2001.

[425] A. Kraskov, H. Stogbauer, R.G. Andrzejak, and P. Grassberger. Hierarchical clustering using mutual information. Europhys. Lett., 70(2):278–284, 2005.

[426] N. Krasnogor and D.A. Pelta. Measuring the similarity of protein structures by means of the universal similarity metric. Bioinformatics, 20(7):1015–1021, 2004.

[427] V. Kreinovich and L. Longpre. Unreasonable effectiveness of symmetry in physics. Int. J. Theor. Phys., 36:1549–1555, 1996.

[428] V. Kreinovich and L. Longpre. Why Kolmogorov complexity in physical equations? Int. J. Theor. Phys., 37(11):2791–2801, 1998.

[429] V. Kreinovich and R. Watson. How difficult is it to invent a nontrivial game? Cybernetics and Systems, 25:629–640, 1994.

[430] R.E. Krichevskii. Universal encoding and Kolmogorov complexity. In Proc. 5th Int. Symp. Inform. Theory, Part 1, Abstracts of Papers, pages 22–25, Moscow-Tbilisi, 1979. In Russian.

[431] R.E. Krichevskii and V.K. Trofimov. The performance of universal encoding. IEEE Trans. Inform. Theory, 27:199–207, 1981.

[432] M. Kummer. Kolmogorov complexity and instance complexity of recursively enumerable sets. SIAM J. Comput., 25(6):1123–1143, 1996.


[433] M. Kummer. On the complexity of random strings. In Proc. 13th Symp. Theor. Aspects Comput. Sci., volume 1046 of Lect. Notes Comput. Sci., pages 25–36, Berlin, 1996. Springer-Verlag.

[434] A. Kučera and T.A. Slaman. Randomness and recursive enumerability. SIAM J. Comput., 31(1):199–211, 2002.

[435] K. Lanctot, M. Li, and E.H. Yang. Estimating DNA sequence entropy. In Proc. 11th ACM-SIAM Symp. Discr. Algorithms, pages 409–418, 2000.

[436] R. Landauer. Irreversibility and heat generation in the computing process. IBM J. Res. Develop., 5:183–191, 1961.

[437] P.S. Laplace. A Philosophical Essay on Probabilities. Dover, 1952. Originally published in 1819. Translated from 6th French edition.

[438] S. Laplante and F. Magniez. Lower bounds for randomized and quantum query complexity using Kolmogorov arguments. In Proc. 19th IEEE Conf. Comput. Complexity, pages 294–304, 2004.

[439] J.I. Lathrop. Compression depth and the behavior of cellular automata. Technical Report TR96-05, Comput. Sci. Dept., Iowa State Univ., 1996.

[440] J.I. Lathrop and J.H. Lutz. Recursive computational depth. Inform. Comput., 153(2):139–172, 1999.

[441] T. Lee and A. Romashchenko. Resource bounded Kolmogorov complexity revisited. Theor. Comput. Sci., 245:386–405, 2005.

[442] A. Lempel and J. Ziv. On the complexity of finite sequences. IEEE Trans. Inform. Theory, 22:75–81, 1976.

[443] S.K. Leung-Yan-Cheong and T.M. Cover. Some equivalences between Shannon entropy and Kolmogorov complexity. IEEE Trans. Inform. Theory, 24:331–338, 1978.

[444] L.A. Levin. Some theorems on the algorithmic approach to probability theory and information theory. PhD thesis, Moscow University, 1971. In Russian.

[445] L.A. Levin. On storage capacity for algorithms. Soviet Math. Dokl., 14:1464–1466, 1973.

[446] L.A. Levin. On the notion of a random sequence. Soviet Math. Dokl., 14:1413–1416, 1973.

[447] L.A. Levin. Universal search problems. Problems Inform. Transmission, 9:265–266, 1973.

[448] L.A. Levin. Laws of information conservation (non-growth) and aspects of the foundation of probability theory. Problems Inform. Transmission, 10:206–210, 1974.

[449] L.A. Levin. On the principle of conservation of information in intuitionistic mathematics. Soviet Math. Dokl., 17:601–605, 1976.

[450] L.A. Levin. Uniform tests of randomness. Soviet Math. Dokl., 17:337, 1976.

[451] L.A. Levin. Various measures of complexity for finite objects (axiomatic description). Soviet Math. Dokl., 17:522–526, 1976.

[452] L.A. Levin. Randomness conservation inequalities; information and independence in mathematical theories. Inform. Contr., 61:15–37, 1984.

[453] L.A. Levin. One-way functions and pseudorandom generators. Combinatorica, 7:357–363, 1987.

[454] L.A. Levin. Robust measures of information. Comput. J., 42(4):284–286, 1999.


[455] L.A. Levin. Forbidden information. In Proc. 47th IEEE Symp. Found. Comput. Sci., pages 761–768, 2002.

[456] L.A. Levin and V.V. Vyugin. Invariant properties of information bulks. In Proc. 6th Symp. Math. Found. Comput. Sci., volume 53 of Lect. Notes Comput. Sci., pages 359–364, Berlin, 1977. Springer-Verlag.

[457] J.P. Lewis. Large limits to software estimation. ACM Software Eng. Notes, 26(4):54–59, 2001.

[458] M. Li. Information distance and its applications. Int. J. Found. Comput. Sci., 18(4):669–681, 2007.

[459] M. Li, J. Badger, X. Chen, S. Kwong, P. Kearney, and H. Zhang. An information-based sequence distance and its application to whole mitochondrial genome phylogeny. Bioinformatics, 17(2):149–154, 2001.

[460] M. Li, X. Chen, X. Li, B. Ma, and P.M.B. Vitányi. The similarity metric. IEEE Trans. Inform. Theory, 50(12):3250–3264, 2004.

[461] M. Li, L. Longpré, and P.M.B. Vitányi. The power of the queue. SIAM J. Comput., 21(4):697–712, 1992.

[462] M. Li, B. Ma, and L. Wang. Finding similar regions in many strings. In Proc. 31st ACM Symp. Theory Comput., pages 473–482, 1999.

[463] M. Li, J.T. Tromp, and P.M.B. Vitányi. Sharpening Occam’s razor. Inform. Process. Lett., 85(5):267–274, 2003.

[464] M. Li and P.M.B. Vitányi. Kolmogorovskaya slozhnost’ dvadsat’ let spustia. Uspekhi Mat. Nauk, 43(6):129–166, 1988. In Russian.

[465] M. Li and P.M.B. Vitányi. Tape versus queue and stacks: The lower bounds. Inform. Comput., 78:56–85, 1988.

[466] M. Li and P.M.B. Vitányi. Two decades of applied Kolmogorov complexity: In memoriam A.N. Kolmogorov 1903–1987. In Proc. 3rd IEEE Conf. Structure in Complexity Theory, pages 80–101, 1988.

[467] M. Li and P.M.B. Vitányi. Applications of Kolmogorov complexity in the theory of computation. In A.L. Selman, editor, Complexity Theory Retrospective, pages 147–203. Springer-Verlag, New York, 1990.

[468] M. Li and P.M.B. Vitányi. Kolmogorov complexity and its applications. In J. van Leeuwen, editor, Handbook of Theoretical Computer Science, chapter 4, pages 187–254. Elsevier and MIT Press, 1990.

[469] M. Li and P.M.B. Vitányi. Learning simple concepts under simple distributions. SIAM J. Comput., 20(5):911–935, 1991.

[470] M. Li and P.M.B. Vitányi. Inductive reasoning and Kolmogorov complexity. J. Comput. System Sci., 44(2):343–384, 1992.

[471] M. Li and P.M.B. Vitányi. Worst case complexity is equal to average case complexity under the universal distribution. Inform. Process. Lett., 42:145–149, 1992.

[472] M. Li and P.M.B. Vitányi. Mathematical theory of thermodynamics of computation. In IEEE Proc. Workshop Phys. Comput., pages 42–46, 1993. Complete version in prelim. proceedings, Oct., 1992.

[473] M. Li and P.M.B. Vitányi. Kolmogorov complexity arguments in combinatorics. J. Comb. Theory, Ser. A, 66(2):226–236, 1994. Erratum, Ibid., 69(1995), 183.

[474] M. Li and P.M.B. Vitányi. Statistical properties of finite sequences with high Kolmogorov complexity. Math. Systems Theory, 27:365–376, 1994.


[475] M. Li and P.M.B. Vitányi. Computational machine learning in theory and praxis. In J. van Leeuwen, editor, Computer Science Today, Recent Trends and Developments, volume 1000 of Lect. Notes Comput. Sci., pages 518–535. Springer-Verlag, Berlin, 1995.

[476] M. Li and P.M.B. Vitányi. A new approach to formal language theory by Kolmogorov complexity. SIAM J. Comput., 24(2):398–410, 1995.

[477] M. Li and P.M.B. Vitányi. Reversibility and adiabatic computation: trading time and space for energy. Proc. Royal Soc. London, Ser. A, 452:769–789, 1996.

[478] M. Li and Y. Yesha. String-matching cannot be done by 2-head 1-way deterministic finite automata. Inform. Process. Lett., 22:231–235, 1986.

[479] M. Li and Y. Yesha. New lower bounds for parallel computation. J. Assoc. Comput. Mach., 36:671–680, 1989.

[480] W. Li. On the relationship between complexity and entropy for Markov chains and regular languages. Complex Systems, 5(4):381–399, 1991.

[481] T.Y. Lin. Patterns in numerical data: practical approximations to Kolmogorov complexity. In Proc. 7th Int. Workshop New Directions in Rough Sets, Data Mining, Granular-Soft Comput., volume 1711 of Lect. Notes Artif. Intell., pages 509–513, Berlin, 1999. Springer-Verlag.

[482] S. Lloyd and J.J. Slotine. Algorithmic Lyapunov functions for stable adaptation and control. Int. J. Adapt. Contr. Sign. Proc., 1996.

[483] L. Löfgren. Explicability of sets and transfinite automata. In E. Caianiello, editor, Automata Theory, pages 251–268. Academic Press, 1966.

[484] L. Löfgren. Recognition of order and evolutionary systems. In J. Tou, editor, Computer and Information Sciences II, pages 165–175. Academic Press, 1967.

[485] L. Löfgren. Complexity of descriptions of systems. Int. J. General Systems, 3:197–214, 1977.

[486] L. Longpré. Resource bounded Kolmogorov complexity, a link between computational complexity and information theory. PhD thesis, Comput. Sci. Dept., Cornell Univ., 1986.

[487] L. Longpré. Resource bounded Kolmogorov complexity and statistical tests. In O. Watanabe, editor, Kolmogorov Complexity and Computational Complexity, pages 66–84. Springer-Verlag, Berlin, 1992.

[488] L. Longpré and S. Mocas. Symmetry of information and one-way functions. Inform. Process. Lett., 46(2):95–100, 1993.

[489] L. Longpré and O. Watanabe. On symmetry of information and polynomial time invertibility. Inform. Comput., 121(1):14–22, 1995.

[490] A. López-Ortiz. New lower bounds for element distinctness on a one-tape Turing machine. Inform. Process. Lett., 51(6):311–314, 1994.

[491] M.C. Loui. Optimal dynamic embedding of trees into arrays. SIAM J. Comput., 12:463–472, 1983.

[492] M.C. Loui. Minimizing access pointers into trees and arrays. J. Comput. System Sci., 28:359–378, 1984.

[493] M.C. Loui and D.R. Luginbuhl. The complexity of simulations between multidimensional Turing machines and random access machines. Math. Systems Theory, 25(4):293–308, 1992.


[494] M.C. Loui and D.R. Luginbuhl. Optimal on-line simulations of tree machines by random access machines. SIAM J. Comput., 21(5):959–971, 1992.

[495] D.W. Loveland. The Kleene hierarchy classification of recursively random sequences. Trans. Amer. Math. Soc., 125:497–510, 1966.

[496] D.W. Loveland. A new interpretation of von Mises’ concept of a random sequence. Z. Math. Logik und Grundlagen Math., 12:279–294, 1966.

[497] D.W. Loveland. On minimal-program complexity measures. In Proc. (1st) ACM Symp. Theory Comput., pages 61–66, 1969.

[498] D.W. Loveland. A variant of the Kolmogorov concept of complexity. Inform. Contr., 15:510–526, 1969.

[499] B. Lucier, T. Jiang, and M. Li. Average-case analysis of Quicksort and Binary Insertion Tree height using incompressibility. Inform. Process. Lett., 103:45–51, 2007.

[500] D.R. Luginbuhl. Computational complexity of random access models. PhD thesis, University of Illinois at Urbana-Champaign, 1990.

[501] J.H. Lutz. Category and measure in complexity classes. SIAM J. Comput., 19(6):1100–1131, 1990.

[502] J.H. Lutz. An upward measure separation theorem. Theor. Comput. Sci., 81:127–135, 1991.

[503] J.H. Lutz. Almost everywhere high nonuniform complexity. J. Comput. System Sci., 44:220–258, 1992.

[504] J.H. Lutz. The quantitative structure of exponential time. In Proc. 8th IEEE Conf. Structure in Complexity Theory, pages 158–175, 1993.

[505] J.H. Lutz. Gales and the constructive dimension of individual sequences. In Proc. 27th Int. Colloq. Automata, Languages, Programming, volume 1853 of Lect. Notes Comput. Sci., pages 185–194, Berlin, 2000. Springer-Verlag.

[506] J.H. Lutz. The dimensions of individual strings and sequences. Inform. Comput., 187(1):49–79, 2003.

[507] J.H. Lutz. Effective fractal dimensions. Math. Logic Quarterly, 51(1):62–72, 2004.

[508] J.H. Lutz and D.L. Schweizer. Feasible reductions to Kolmogorov-Loveland stochastic sequences. Theor. Comput. Sci., 225:185–194, 1999.

[509] W. Maass. Combinatorial lower bound arguments for deterministic and nondeterministic Turing machines. Trans. Amer. Math. Soc., 292:675–693, 1985.

[510] W. Maass, E. Szemerédi, G. Schnitger, and G. Turán. Two tapes versus one for off-line Turing machines. Comput. Complexity, 3:392–401, 1993.

[511] J. Machta. Entropy, information, and computation. Am. J. Phys., 67:1074–1077, 1999.

[512] H.G. Mairson. The program complexity of searching a table. In Proc. 24th IEEE Symp. Found. Comput. Sci., pages 40–47, 1983.

[513] K. Makarychev, Y. Makarychev, A. Romashchenko, and N.K. Vereshchagin. A new class of non-Shannon-type inequalities for entropies. Comm. in Inform. Syst., 2(2):147–166, 2002.

[514] R.N. Mantegna and H.E. Stanley. An Introduction to Econophysics: Correlations and Complexity in Finance. Cambridge University Press, 2000.

[515] G.B. Marandzhyan. On certain properties of asymptotically optimal recursive function. Izv. Akad. Nauk Armyan. SSSR, 4:3–22, 1969.


[516] A.A. Markov. On normal algorithms which compute Boolean functions. Soviet Math. Dokl., 5:922–924, 1964.

[517] A.A. Markov. On normal algorithms associated with the computation of Boolean functions and predicates. Izv. Akad. Nauk USSR Ser. Mat., 31:161–208, 1967.

[518] P. Martin-Löf. Algorithmen und zufällige Folgen. Lecture notes, University of Erlangen, 1966.

[519] P. Martin-Löf. The definition of random sequences. Inform. Contr., 9:602–619, 1966.

[520] P. Martin-Löf. On the concept of a random sequence. Theory Probability Appl., 11:177–179, 1966.

[521] P. Martin-Löf. Algorithms and randomness. Rev. Int. Statist. Inst., 37:265–272, 1969.

[522] P. Martin-Löf. The literature on von Mises’ Kollektivs revisited. Theoria, 35(1):12, 1969.

[523] P. Martin-Löf. Notes on Constructive Mathematics. Almqvist and Wiksell, 1970.

[524] P. Martin-Löf. On the notion of randomness. In A. Kino et al., editors, Intuitionism and Proof Theory, pages 73–78. North-Holland, 1970.

[525] P. Martin-Löf. Complexity oscillations in infinite binary sequences. Z. Wahrscheinlichkeitstheorie verw. Gebiete, 19:225–230, 1971.

[526] P. Martin-Löf. The notion of redundancy and its use as a quantitative measure of the discrepancy between a statistical hypothesis and a set of observational data. Scand. J. Stat., 1:3–18, 1974.

[527] P. Martin-Löf. Reply to Sverdrup’s polemical article “Tests without power”. Scand. J. Stat., 2:161–165, 1975.

[528] E. Mayordomo. A Kolmogorov complexity characterization of constructive Hausdorff dimension. Inform. Process. Lett., 84(1):247–356, 2002.

[529] J.W. McAllister. Effective complexity as a measure of information content. Philosophy of Science, 70:302–307, 2003.

[530] K. Mehlhorn. On the program-size of perfect and universal hash functions. In Proc. 23rd IEEE Symp. Found. Comput. Sci., pages 170–175, 1982.

[531] N. Merhav and M. Feder. Universal prediction of individual sequences. IEEE Trans. Inform. Theory, 38(4):1258–1270, 1992.

[532] W. Merkle. The Kolmogorov-Loveland stochastic sequences are not closed under selecting subsequences. J. Symb. Logic, 68:1362–1376, 2003.

[533] W. Merkle. The complexity of stochastic sequences. J. Comput. Syst. Sci., 74(3):350–357, 2008.

[534] W. Merkle and N. Mihailovic. On the construction of effectively random sets. J. Symb. Logic, 69(3):862–878, 2004.

[535] W. Merkle, N. Mihailovic, and T.A. Slaman. Some results on effective randomness. Theory Comput. Syst., 39:707–722, 2006.

[536] W. Merkle, J.S. Miller, A. Nies, F. Stephan, and J. Reimann. Kolmogorov-Loveland randomness and stochasticity. Ann. Pure Appl. Logic, 138(1-3):303–326, 2006.

[537] W. Merkle and J. Reimann. Selection functions that do not preserve normality. Theory Comput. Syst., 39:685–697, 2006.


[538] N.C. Metropolis, G. Reitweiser, and J. von Neumann. Statistical treatment of values of the first 2,000 decimal digits of e and π calculated on the ENIAC. In A.H. Traub, editor, John von Neumann, Collected Works, Vol. V. Macmillan, 1963.

[539] J.S. Miller. Every 2-random real is Kolmogorov random. J. Symbolic Logic, 69(3):907–913, 2004.

[540] J.S. Miller. Contrasting plain and prefix-free Kolmogorov complexity. SIAM J. Comput., 2006. Submitted.

[541] J.S. Miller. The k degrees, low for k degrees, and weakly low for k sets. Manuscript, 2007.

[542] J.S. Miller and L. Yu. Oscillations in the initial segment complexity of random reals. Manuscript, 2007.

[543] J.S. Miller and L. Yu. On initial segment complexity and degrees of randomness. Trans. Amer. Math. Soc., 360(6):3193–3210, 2008.

[544] A. Milosavljevic and J. Jurka. Discovering simple DNA sequences by the algorithmic significance method. CABIOS, 9(4):407–411, 1993.

[545] A. Milosavljevic and J. Jurka. Discovery by minimal length encoding: a case study in molecular evolution. Machine Learning, 12:69–87, 1993.

[546] P.B. Miltersen. The complexity of malign ensembles. SIAM J. Comput., 22(1):147–156, 1993.

[547] M.L. Minsky. Problems of formulation for artificial intelligence. In R.E. Bellman, editor, Mathematical Problems in the Biological Sciences, Proc. Symposia in Applied Mathematics XIV, page 43. American Mathematical Society, 1962.

[548] M.L. Minsky. Steps towards artificial intelligence. Proceedings I.R.E., pages 8–30, January 1961.

[549] J.L. Montaña and L.M. Pardo. On Kolmogorov complexity in the real Turing machine setting. Inform. Process. Lett., 67(2):81–86, 1998.

[550] C.E. Mora and H.J. Briegel. Algorithmic complexity and entanglement of quantum states. Phys. Rev. Lett., 95:200503, 2005.

[551] C.E. Mora and H.J. Briegel. Algorithmic complexity of quantum states. Int. J. Quantum Information, 4:715, 2006.

[552] C.E. Mora, H.J. Briegel, and B. Kraus. Quantum Kolmogorov complexity and its applications. arXiv:quant-ph/0610109, 2006.

[553] P. Moulin and J. Liu. Analysis of multiresolution image denoising schemes using generalized Gaussian and complexity priors. IEEE Trans. Inform. Theory, 45(3):909–919, 1999.

[554] An.A. Muchnik. Lower limits on frequencies in computable sequences and relativized a priori probability. SIAM Theory Probab. Appl., 32:513–514, 1987.

[555] An.A. Muchnik. On common information. Theor. Comput. Sci., 207:319–328, 1998.

[556] An.A. Muchnik. Conditional complexity and codes. Theor. Comput. Sci., 271(1-2):97–109, 2002.

[557] An.A. Muchnik and S.Y. Positselsky. Kolmogorov entropy in the context of computability theory. Theor. Comput. Sci., 271(1-2):15–35, 2002.

[558] An.A. Muchnik, A.L. Semenov, and V.A. Uspensky. Mathematical metaphysics of randomness. Theor. Comput. Sci., 207:263–317, 1998.


[559] An.A. Muchnik, A.K. Shen, M. Ustinov, N.K. Vereshchagin, and M.V. Vyugin. Non-reducible descriptions for conditional Kolmogorov complexity. In Proc. 3rd Conf. Theory Appl. Models Comput., volume 3959 of Lect. Notes Comput. Sci., pages 308–317, Berlin, 2006. Springer-Verlag.

[560] An.A. Muchnik and N.K. Vereshchagin. Logical operations and Kolmogorov complexity II. In Proc. 16th IEEE Conf. Comput. Complexity, pages 256–265, 2001.

[561] An.A. Muchnik and N.K. Vereshchagin. Shannon entropy versus Kolmogorov complexity. In Proc. Int. Comput. Sci. Symp. Russia (CSR), volume 2925 of Lect. Notes Comput. Sci., pages 281–291, Berlin, 2006. Springer-Verlag.

[562] S. Muggleton, A. Srinivasan, and M. Bain. Compression, significance and accuracy. In Proc. 9th Int. Workshop Machine Learning, pages 338–347, Aberdeen, Scotland, UK, 1992.

[563] D.W. Müller. Randomness and extrapolation. In Proc. 6th Berkeley Symposium, pages 1–31, 1970.

[564] M. Mundhenk. On hard instances. Theor. Comput. Sci., 242:301–311, 2000.

[565] J. Muramatsu and F. Kanaya. Distortion-complexity and rate-distortion function. IEICE Trans. Fundamentals, E77-A(8):1224–1229, 1994.

[566] I.J. Myung, V. Balasubramanian, and M.A. Pitt. Counting probability distributions: differential geometry and model selection. Proc. National Acad. Sci., 97(21):11170–11175, 2000.

[567] A. Naik, K.W. Regan, and D. Sivakumar. On quasilinear time complexity theory. Theor. Comput. Sci., 148(2):325–349, 1995.

[568] A. Nies. Computability and Randomness. Oxford University Press. To appear.

[569] S.M. Nikol’skii. Aleksandrov and Kolmogorov in Dnjepropetrovsk. Russian Math. Surveys, 38(4):41–55, 1983.

[570] S.P. Novikov. Memories of A.N. Kolmogorov. Russian Math. Surveys, 43(6):40–42, 1988.

[571] M. Nykter, N.D. Price, M. Aldana, S.A. Ramsey, S.A. Kauffman, L.E. Hood, O. Yli-Harja, and I. Shmulevich. Gene expression dynamics in the macrophage exhibit criticality. Proc. Nat. Acad. Sci. USA, 105(6):1897–1900, 2008.

[572] M. Nykter, N.D. Price, A. Larjo, T. Aho, S.A. Kauffman, O. Yli-Harja, and I. Shmulevich. Critical networks exhibit maximal information diversity in structure-dynamics relationships. Phys. Rev. Lett., 100:058702(4), 2008.

[573] Obituary. Mr. Andrei Kolmogorov—Giant of mathematics. Times, October 26, 1987.

[574] Obituary. Andrei Nikolaevich Kolmogorov. Bull. London Math. Soc., 22(1):31–100, 1990.

[575] P. Odifreddi. Classical Recursion Theory. North-Holland, 1989.

[576] P. Orponen, Ker-I Ko, U. Schöning, and O. Watanabe. Instance complexity. J. Assoc. Comput. Mach., 41:96–121, 1994.

[577] M. Patrascu and E.D. Demaine. Tight bounds for the partial-sums problem. In Proc. 15th ACM-SIAM Symp. Discrete Alg., pages 20–29, 2004.


[578] D. Pager. On the problem of finding minimal programs for tables. Inform. Contr., 14:550–554, 1969.

[579] P. Pajunen. Blind source separation using algorithmic information theory. Neurocomput., 22:35–48, 1998.

[580] R. Parekh and V. Honavar. Learning DFA from simple examples. Machine Learning, 44(1-2):9–35, 2001.

[581] E.M. Pothos and N. Chater. A simplicity principle in unsupervised human categorization. Cognitive Science, 26(3):303–343, 2002.

[582] R. Paturi. Study of certain probabilistic models of information transfer and on-line computation. PhD thesis, Penn State University, 1985.

[583] R. Paturi and J. Simon. Lower bounds on the time of probabilistic on-line simulations. In Proc. 24th IEEE Symp. Found. Comput. Sci., pages 343–350, 1983.

[584] R. Paturi, J. Simon, R.E. Newman-Wolfe, and J. Seiferas. Milking the Aanderaa argument. Inform. Comput., 88:88–104, 1990.

[585] W.J. Paul. Kolmogorov’s complexity and lower bounds. In L. Budach, editor, Proc. 2nd Int. Conf. Fund. Comput. Theory, pages 325–334. Akademie Verlag, 1979.

[586] W.J. Paul. On-line simulation of k+1 tapes by k tapes requires nonlinear time. Inform. Contr., 53:1–8, 1982.

[587] W.J. Paul. On heads versus tapes. Theor. Comput. Sci., 28:1–12, 1984.

[588] W.J. Paul, J.I. Seiferas, and J. Simon. An information theoretic approach to time bounds for on-line computation. J. Comput. Syst. Sci., 23(2):108–126, 1981.

[589] W.J. Paul and R.J. Solomonoff. Autonomous theory building systems. In P. Bock, M. Loew, and M. Richter, editors, Neural Networks and Adaptive Learning, Knowledge Processing and Its Applications Series. Elsevier Science Publishers, 1992.

[590] J. Pearl. On the connection between the complexity and credibility of inferred models. Int. J. Gen. Syst., 4:255–264, 1978.

[591] H. Petersen. Bounds for the element distinctness problem on one-tape Turing machines. Inform. Process. Lett., 81(2):75–79, 2002.

[592] H. Petersen and J.M. Robson. Efficient simulations by queue machines. In Proc. 25th Int. Coll. Automata, Lang. Programming, volume 1443 of Lect. Notes Comput. Sci., pages 884–895, Berlin, 1998. Springer-Verlag.

[593] G. Peterson. Succinct representations, random strings and complexity classes. In Proc. 21st IEEE Symp. Found. Comput. Sci., pages 86–95, 1980.

[594] N.V. Petri. Algorithms connected with predicates and Boolean functions. Soviet Math. Dokl., 10:294–297, 1969.

[595] N.V. Petri. The complexity of algorithms and their operating time. Soviet Math. Dokl., 10:547–549, 1969.

[596] J.D. Phillips, P.A. Gares, and M.C. Slattery. Agricultural soil redistribution and landscape complexity. Landscape Ecology, 14(2):197–211, 1999.

[597] J.R. Pierce and C.C. Cutler. Interplanetary communications. In F.I. Ordway, III, editor, Advances in Space Science, Vol. 1, pages 55–109, New York, 1959. Academic Press, Inc.

[598] N. Pippenger. An information-theoretic method in combinatorial theory. J. Comb. Theory, Ser. A, 23:99–104, 1977.


[599] J. Poland and M. Hutter. Asymptotics of discrete MDL for on-line prediction. IEEE Trans. Inform. Theory, 51(11):3780–3795, 2005.

[600] K.R. Popper. The Logic of Scientific Discovery. University of Toronto Press, 1959.

[601] M.B. Pour-El and J.I. Richards. Computability in Analysis and Physics. Springer-Verlag, 1989.

[602] J. Quinlan and R. Rivest. Inferring decision trees using the minimum description length principle. Inform. Comput., 80:227–248, 1989.

[603] J. Rabaey, L. Guerra, and R. Mehra. Design guidance in the power dimension. In Proc. IEEE Int. Conf. Acoust. Speech Signal Process., pages 2837–2840, 1995.

[604] R.P.N. Rao and D.H. Ballard. Dynamic model of visual recognition predicts neural response properties in the visual cortex. Neural Comput., 9(4):721–763, 1997.

[605] P.E. Rapp, C.J. Cellucci, K.E. Korslund, T.A.A. Watanabe, and M.A. Jiménez-Montaño. Effective normalization of complexity measurements for epoch length and sampling frequency. Phys. Rev. E, 64:016209, 2001.

[606] K.W. Regan. On superlinear lower bounds in complexity theory. In Proc. 10th IEEE Conf. Structure in Complexity Theory, pages 50–64, 1995.

[607] K.W. Regan and J. Wang. The quasilinear isomorphism challenge. SIGACT News, 25:106–113, September 1994.

[608] S. Reisch and G. Schnitger. Three applications of Kolmogorov complexity. In Proc. 23rd IEEE Symp. Found. Comput. Sci., pages 45–52, 1982.

[609] Zh.I. Reznikova. Animal Intelligence: From Individual to Social Cognition. Cambridge Univ. Press, 2007.

[610] Zh.I. Reznikova and B.Ya. Ryabko. Analysis of the language of ants by information-theoretical methods. Problems Inform. Transmission, 22:245–249, 1986.

[611] J.J. Rissanen. Modeling by the shortest data description. Automatica, 14:465–471, 1978.

[612] J.J. Rissanen. A universal prior for integers and estimation by minimum description length. Ann. Statist., 11:416–431, 1983.

[613] J.J. Rissanen. Stochastic complexity. J. Royal Stat. Soc., Ser. B, 49:223–239, 1987. Discussion: pages 252–265.

[614] J.J. Rissanen. Stochastic Complexity in Statistical Inquiry. World Scientific, Singapore, 1989.

[615] J.J. Rissanen. Complexity of models. In W.H. Zurek, editor, Complexity, Entropy and the Physics of Information, pages 117–125. Addison-Wesley, New York, 1991.

[616] J.J. Rissanen. Fisher information and stochastic complexity. IEEE Trans. Inform. Theory, 42(1):40–47, 1996.

[617] J.J. Rissanen. Stochastic complexity in learning. J. Comput. Syst. Sci., 55:89–95, 1997.

[618] J.J. Rissanen. Hypothesis selection and testing by the MDL principle. Comput. J., 42(4):260–269, 1999.

[619] J.J. Rissanen. Information and Complexity in Statistical Modeling. Springer-Verlag, New York, 2007.

[620] J.J. Rissanen and I. Tabus. Kolmogorov structure function in MDL theory and lossy data compression. In P. Grünwald, I.J. Myung, and M.A. Pitt, editors, Advances in Minimum Description Length: Theory and Applications, pages 245–262. MIT Press, 2005.

[621] E. Rivals and J.-P. Delahaye. Optimal representation in average using Kolmogorov complexity. Theor. Comput. Sci., 200:261–287, 1998.

[622] H. Rogers, Jr. Theory of Recursive Functions and Effective Computability. McGraw-Hill, 1967.

[623] A.E. Romashchenko, A.K. Shen, and N.K. Vereshchagin. Combinatorial interpretation of Kolmogorov complexity. Theor. Comput. Sci., 271(1-2):111–123, 2002.

[624] D. Ronneburger. Kolmogorov complexity and derandomization. PhD thesis, Rutgers, the State Univ., New Jersey, 2004.

[625] P.W.K. Rothemund and E. Winfree. The program-size complexity of self-assembled squares. In Proc. 32nd ACM Symp. Theory of Computing, pages 459–468, 2000.

[626] R. Rubinstein. Structural complexity classes of sparse sets: intractability, data compression and printability. PhD thesis, Northeastern Univ., 1988.

[627] B.Ya. Ryabko. Encoding of combinatorial sources and Hausdorff dimension. Dokl. Akad. Nauk SSSR, 27:1066–1070, 1984.

[628] B.Ya. Ryabko. Noiseless coding of combinatorial sources, Hausdorff dimension, and Kolmogorov complexity. Problems Inform. Transmission, 22:170–179, 1986.

[629] B.Ya. Ryabko. The complexity and effectiveness of prediction algorithms. J. Complexity, 10:281–295, 1994.

[630] B.Ya. Ryabko and J. Astola. Universal code as a basis for time series testing. Statistical Methodology, 3:375–397, 2006.

[631] B.Ya. Ryabko, J. Astola, and A. Gammerman. Application of Kolmogorov complexity and universal codes to identity testing and nonparametric testing of serial independence of time series. Theor. Comput. Sci., 359(1-3):440–448, 2006.

[632] B.Ya. Ryabko and V.A. Monarev. Experimental investigation of forecasting methods based on data compression algorithms. Problems Inform. Transmission, 41(2):65–69, 2005.

[633] B.Ya. Ryabko and Zh.I. Reznikova. Using Shannon entropy and Kolmogorov complexity to study the communicative system and cognitive capacities in ants. Complexity, 2(2):37–42, 1998.

[634] D. Ryabko and M. Hutter. On sequence prediction for arbitrary measures. In Proc. IEEE Int. Symp. Inform. Theory, pages 2346–2350, 2007.

[635] C.C. Santos, J. Bernardes, P.M.B. Vitányi, and L. Antunes. Clustering fetal heart rate tracings by compression. In Proc. 19th IEEE Int. Symp. Computer-Based Medical Systems, pages 685–690, 2006.

[636] R. Schack. Algorithmic information and simplicity in statistical physics. Int. J. Theor. Phys., 36:209–226, 1997.

[637] R. Schack and C.M. Caves. Chaos for Liouville probability densities. Physical Review E, 53(4):3387–3401, 1996.

[638] R. Schack and C.M. Caves. Information-theoretic characterization of quantum chaos. Physical Review E, 53(4):3257–3270, 1996.

[639] R. Schack, G.M. D’Ariano, and C.M. Caves. Hypersensitivity to perturbation in a quantum kicked top. Physical Review E, 50:972, 1994.


[640] E.D. Scheirer. Structured audio, Kolmogorov complexity, and generalized audio coding. IEEE Trans. Speech Audio Process., 9(8):914–931, 2001.

[641] J. Schmidhuber. Discovering neural nets with low Kolmogorov complexity and high generalization capability. Neural Networks, 10(5):857–873, 1997.

[642] J. Schmidhuber. Low-complexity art. Leonardo, J. Int. Soc. Arts, Sciences, and Technology, 30(2):97–103, 1997.

[643] J. Schmidhuber. Hierarchies of generalized Kolmogorov complexities and nonenumerable universal measures computable in the limit. Int. J. Found. Comput. Sci., 13(4):587–612, 2002.

[644] J. Schmidhuber. The speed prior: a new simplicity measure yielding near-optimal computable predictions. In Proc. 15th Conf. Comput. Learning Theory, pages 216–228, 2002.

[645] J. Schmidhuber. The new AI: General & sound & relevant for physics. In B. Goertzel and C. Pennachin, editors, Artificial General Intelligence, pages 175–198. Springer-Verlag, 2006.

[646] J. Schmidhuber, J. Zhao, and M. Wiering. Shifting inductive bias with success-story algorithm, adaptive Levin search, and incremental self-improvement. Machine Learning, 28(1):105–130, 1997.

[647] C.P. Schnorr. Eine Bemerkung zum Begriff der zufälligen Folge. Z. Wahrscheinlichkeitstheorie verw. Gebiete, 14:27–35, 1969.

[648] C.P. Schnorr. Klassifikation der Zufallsgesetze nach Komplexität und Ordnung. Z. Wahrscheinlichkeitstheorie verw. Gebiete, 16:1–21, 1970.

[649] C.P. Schnorr. Über die Definition von effektiven Zufallstests, I–II. Z. Wahrscheinlichkeitstheorie verw. Gebiete, 15:297–312, 313–328, 1970.

[650] C.P. Schnorr. Optimal Gödel numberings. In Proc. 1971 IFIP Congress, TA-2, pages 12–14, Ljubljana, Yugoslavia, 1971.

[651] C.P. Schnorr. A unified approach to the definition of random sequences. Math. Systems Theory, 5:246–258, 1971.

[652] C.P. Schnorr. Zufälligkeit und Wahrscheinlichkeit: Eine algorithmische Begründung der Wahrscheinlichkeitstheorie, volume 218 of Lect. Notes Math. Springer-Verlag, Berlin, 1971.

[653] C.P. Schnorr. Process complexity and effective random tests. J. Comput. System Sci., 7:376–388, 1973.

[654] C.P. Schnorr. Rekursive Funktionen und ihre Komplexität. Teubner, 1974.

[655] C.P. Schnorr. A survey of the theory of random sequences. In R.E. Butts and J. Hintikka, editors, Basic Problems in Methodology and Linguistics, pages 193–210. D. Reidel, 1977.

[656] C.P. Schnorr. A review of the theory of random sequences. In Proc. 5th Int. Congr. Logic, Meth. Phil. of Sci., London, Ontario, August 1975.

[657] C.P. Schnorr and P. Fuchs. General random sequences and learnable sequences. J. Symbolic Logic, 42:329–340, 1977.

[658] C.P. Schnorr and H. Stimm. Endliche Automaten und Zufallsfolgen. Acta Informatica, 1:345–359, 1972.

[659] C.P. Schnorr and G. Stumpe. A characterization of complexity sequences. Z. Math. Logik und Grundl. Math., 21:47–56, 1975.


[660] U. Schöning. Resolution proofs, exponential lower bounds, and Kolmogorov complexity. In Proc. Symp. Math. Foundat. Comput. Sci., volume 1295 of Lect. Notes Comput. Sci., pages 110–116, Berlin, 1997. Springer-Verlag.

[661] U. Schöning. Construction of expanders and superconcentrators using Kolmogorov complexity. Rand. Struct. Alg., 17:64–77, 2000.

[662] U. Schöning and R. Pruim. Gems of Theoretical Computer Science. Springer-Verlag, 1998.

[663] R. Schuler. A note on universal distributions for polynomial-time computable distributions. In 12th IEEE Conf. Comput. Complexity, pages 69–73, 1997.

[664] R. Schuler. Universal distributions and time bounded Kolmogorov complexity. In 16th Symp. Theor. Aspects Comput. Sci., volume 1563 of Lect. Notes Comput. Sci., pages 434–443, Berlin, 1999. Springer-Verlag.

[665] P. Schweitzer. Using the incompressibility method to obtain local lemma results for Ramsey-type problems. Inform. Process. Lett. To appear.

[666] J. Seiferas. A simplified lower bound for context-free-language recognition. Inform. Contr., 69:255–260, 1986.

[667] J. Shallit and Y. Breitbart. Automaticity: Properties of a measure of descriptional complexity. J. Comput. System Sci., 53(1):10–25, 1996.

[668] J. Shallit and M. Wang. Automatic complexity of strings. J. Automata, Lang. Combinat., 6(4):537–554, 2001.

[669] C.E. Shannon. The mathematical theory of communication. Bell System Tech. J., 27:379–423, 623–656, 1948.

[670] C.E. Shannon. A universal Turing machine with two internal states. In C.E. Shannon and J. McCarthy, editors, Automata Studies, pages 129–153. Princeton University Press, 1956.

[671] C.E. Shannon. Coding theorems for a discrete source with a fidelity criterion. In IRE National Convention Record, Part 4, pages 142–163, 1959.

[672] A.K. Shen. The frequency approach to defining a random sequence. Semiotika i Informatika, 19:14–42, 1982. In Russian.

[673] A.K. Shen. The concept of Kolmogorov (α, β)-stochasticity and its properties. Soviet Math. Dokl., 28:295–299, 1983.

[674] A.K. Shen. Algorithmic variants of the notion of entropy. Soviet Math. Dokl., 29(3):569–573, 1984.

[675] A.K. Shen. Connections between different algorithmic definitions of randomness. Soviet Math. Dokl., 38(2):316–319, 1989.

[676] A.K. Shen. Discussion on Kolmogorov complexity and statistical analysis. Comput. J., 42(4):340–342, 1999.

[677] A.K. Shen. Multisource information theory. In Proc. 3rd Conf. Theory Appl. Models Comput., volume 3959 of Lect. Notes Comput. Sci., pages 327–338, Berlin, 2006. Springer-Verlag.

[678] A.K. Shen, V.A. Uspensky, and N.K. Vereshchagin. Kolmogorov Complexity and Randomness. To appear.

[679] A.K. Shen and N.K. Vereshchagin. Logical operations and Kolmogorov complexity. Theor. Comput. Sci., 271(1-2):125–129, 2002.

[680] A.N. Shiryaev. A.N. Kolmogorov: Life and creative activities. Ann. Probab., 17:866–944, 1989. Publications of Kolmogorov: pages 945–964.


[681] M. Sipser. A complexity theoretic approach to randomness. In Proc. 15th ACM Symp. Theory Comput., pages 330–335, 1983.

[682] S.S. Skiena. Further evidence for randomness in π. Complex Systems, 1:361–366, 1987.

[683] D. Sleator, R. Tarjan, and W. Thurston. Short encodings of evolving structures. SIAM J. Discrete Math., 5:428–450, 1992.

[684] R.J. Solomonoff. The mechanization of linguistic learning. In 2nd Int. Congress on Cybernetics, pages 180–193, 1958.

[685] R.J. Solomonoff. A new method for discovering the grammars of phrase structure languages. In Information Processing, pages 285–290, Paris, 1959. Unesco.

[686] R.J. Solomonoff. A preliminary report on a general theory of inductive inference. Technical Report ZTB-138, Zator Company, Cambridge, Mass., November 1960.

[687] R.J. Solomonoff. An inductive inference code employing definitions. Technical Report ZTB-141, Rockford Research, Cambridge, Mass., April 1962.

[688] R.J. Solomonoff. A formal theory of inductive inference, part 1 and part 2. Inform. Contr., 7:1–22, 224–254, 1964.

[689] R.J. Solomonoff. Inductive inference research status, spring 1967. Technical report, Rockford Research Inst., July 1967. Distributed by Clearinghouse, US Dept. of Commerce.

[690] R.J. Solomonoff. Inductive inference theory—a unified approach to problems in pattern recognition and artificial intelligence. In 4th Int. Conf. Artificial Intelligence, pages 274–280, Tbilisi, Georgia, USSR, 1975.

[691] R.J. Solomonoff. Complexity based induction systems: comparisons and convergence theorems. Technical Report RR-329, Rockford Research, Cambridge, Mass., August 1976.

[692] R.J. Solomonoff. Complexity-based induction systems: comparisons and convergence theorems. IEEE Trans. Inform. Theory, 24:422–432, 1978.

[693] R.J. Solomonoff. Perfect training sequences and the costs of corruption—a progress report on inductive inference research. Memorandum, Oxbridge Research, P.O. Box 559, Cambridge, Mass. 02238, August 1982.

[694] R.J. Solomonoff. Optimum sequential search. Memorandum, Oxbridge Research, P.O. Box 559, Cambridge, Mass. 02238, June 1984.

[695] R.J. Solomonoff. An application of algorithmic probability to problems in artificial intelligence. In L.N. Kanal and J.F. Lemmer, editors, Uncertainty in Artificial Intelligence, pages 473–491. North-Holland, 1986.

[696] R.J. Solomonoff. The application of algorithmic probability to machine learning. Grant proposal manuscript, Oxbridge Research, P.O. Box 559, Cambridge, Mass. 02238, September 1988.

[697] R.J. Solomonoff. A system for machine learning based on algorithmic probability. In Proc. 6th Israeli Conf. on AI and Computer Vision, 1989.

[698] R.J. Solomonoff. The discovery of algorithmic probability. J. Comput. System Sci., 55:73–88, 1997.

[699] R.J. Solomonoff. Two kinds of probabilistic induction. Comput. J., 42:256–259, 1999.


[700] R.J. Solomonoff. The probability of “undefined” (non-converging) output in generating the universal probability distribution. Inform. Process. Lett., 106(6):238–240, 2008.

[701] R.M. Solovay. Lecture notes on algorithmic complexity. Unpublished, UCLA, 1975.

[702] R.M. Solovay. On random r.e. sets. In A.I. Arruda et al., editor, Non-Classical Logic, Model Theory and Computability, pages 283–307. North-Holland, 1977.

[703] D. Sow and A. Eleftheriadis. Complexity distortion theory. IEEE Trans. Inform. Theory, 49(3):604–608, 2003.

[704] L. Staiger. Complexity and entropy. In Proc. Math. Found. Comput. Sci., volume 118 of Lect. Notes Comput. Sci., pages 508–514, Berlin, 1981. Springer-Verlag.

[705] L. Staiger. Representable P. Martin-Löf tests. Kybernetika, 21:235–243, 1985.

[706] L. Staiger. Kolmogorov complexity and Hausdorff dimension. Inform. Comput., 120(2):159–194, 1993.

[707] L. Staiger. A tight upper bound on Kolmogorov complexity by Hausdorff dimension and uniformly optimal prediction. Theory Comput. Syst., 31(3):215–229, 1998.

[708] L. Staiger. The Kolmogorov complexity of real numbers. Theor. Comput. Sci., 284(2):455–466, 2002.

[709] L. Staiger. Constructive dimension equals Kolmogorov complexity. Inform. Process. Lett., 93:149–153, 2005.

[710] J. Storer. Data Compression: Method and Theory. Computer Science Press, 1988.

[711] E. Sverdrup. Tests without power. Scand. J. Stat., 2:158–160, 1975.

[712] L. Szilard. On the decrease of entropy in a thermodynamic system by the intervention of intelligent beings. Z. Phys., 53:840–856, 1929.

[713] V.M. Tikhomirov. The life and work of Andrei Nikolaevich Kolmogorov. Russian Math. Surveys, 43(6):1–39, 1988.

[714] M.R. Titchener, R. Nicolescu, L. Staiger, A. Gulliver, and U. Speidel. Deterministic complexity and entropy. Int. J. Found. Comput. Sci., 64(1-4):443–461, 2005.

[715] B.A. Trakhtenbrot. A survey of Russian approaches to perebor (brute-force-search) algorithms. Ann. Hist. Comput., 6:384–400, 1984.

[716] J.T. Tromp. Binary lambda calculus and combinatory logic. Manuscript, CWI, Amsterdam, 2004.

[717] A.M. Turing. On computable numbers with an application to the Entscheidungsproblem. Proc. London Math. Soc., Ser. 2, 42:230–265, 1936. Correction, Ibid., 43:544–546, 1937.

[718] J. Tyszkiewicz. The Kolmogorov expression complexity of logics. Inform. Comput., 135(2):113–135, 1997.

[719] J. Tyszkiewicz. A note on the Kolmogorov data complexity and nonuniform logical definitions. Inform. Process. Lett., 64(4):187–195, 1997.

[720] J. Tyszkiewicz. On the Kolmogorov expressive power of Boolean query languages. Theor. Comput. Sci., 190(2):317–361, 1998.

[721] V.A. Uspensky. Complexity and entropy: an introduction to the theory of Kolmogorov complexity. In O. Watanabe, editor, Kolmogorov Complexity and Computational Complexity, pages 85–102. Springer-Verlag, Berlin, 1992.

[722] V.A. Uspensky. Kolmogorov and mathematical logic. J. Symb. Logic, 57(2):385–412, 1992.

[723] V.A. Uspensky and A.L. Semenov. Algorithms: Main Ideas and Applications. Kluwer Academic Publishers, Dordrecht, Netherlands, 1993. Also: in Lect. Notes Comput. Sci., vol. 122, A.P. Ershov and D.E. Knuth, editors, Springer-Verlag, 1981, pp. 100–234.

[724] V.A. Uspensky, A.L. Semenov, and A.K. Shen. Can an individual sequence of zeros and ones be random? Russian Math. Surveys, 45(1):121–189, 1990.

[725] V.A. Uspensky and A.K. Shen. Relations between varieties of Kolmogorov complexities. Math. Systems Theory, 29:271–292, 1996.

[726] M.A. Ustinov. Non-approximability of the randomness deficiency function. In Proc. Int. Comput. Sci. Symp. Russia (CSR), volume 3967 of Lect. Notes Comput. Sci., pages 364–368, Berlin, 2006. Springer-Verlag.

[727] L.G. Valiant. A theory of the learnable. Comm. Assoc. Comput. Mach., 27:1134–1142, 1984.

[728] P.A. van der Helm. Simplicity versus likelihood in visual perception: from surprisals to precisals. Psychological Bull., 126(5):770–800, 2000.

[729] P.J. van Heerden. A general theory of prediction. Technical report, Polaroid Corporation, Cambridge, Mass., 1963.

[730] M. van Lambalgen. Random Sequences. PhD thesis, Universiteit van Amsterdam, Amsterdam, 1987.

[731] M. van Lambalgen. Von Mises’ definition of random sequences reconsidered. J. Symbolic Logic, 52:725–755, 1987.

[732] M. van Lambalgen. Algorithmic Information Theory. J. Symbolic Logic, 54:1389–1400, 1989.

[733] V.N. Vapnik and A.Ya. Chervonenkis. On the uniform convergence of relative frequencies of events to their probabilities. Theory of Probability and Its Applications, 16(2):264–280, 1971.

[734] J.S. Varré, J.-P. Delahaye, and E. Rivals. Transformation distances: a family of dissimilarity measures based on movements of segments. Bioinformatics, 15(3):194–202, 1999.

[735] N.K. Vereshchagin. Kolmogorov complexity conditional to large integers. Theor. Comput. Sci., 271(1-2):58–67, 2002.

[736] N.K. Vereshchagin. Kolmogorov complexity of enumerating finite sets. Inform. Process. Lett., 103(1):34–39, 2007.

[737] N.K. Vereshchagin and P.M.B. Vitányi. Kolmogorov complexity, rate distortion and denoising of individual data. IEEE Trans. Inform. Theory. Submitted; Arxiv cs.IT/0411014.

[738] N.K. Vereshchagin and P.M.B. Vitányi. Kolmogorov’s structure functions and model selection. IEEE Trans. Inform. Theory, 50(12):3265–3290, 2004.

[739] N.K. Vereshchagin and M.V. Vyugin. Independent minimum length programs to translate between given strings. Theor. Comput. Sci., 271(1-2):131–143, 2002.

[740] J. Ville. Étude Critique de la Notion de Collectif. Gauthier-Villars, Paris, 1939.


[741] P.M.B. Vitányi. On the simulation of many storage heads by one. Theor. Comput. Sci., 34:157–168, 1984.

[742] P.M.B. Vitányi. On two-tape real-time computation and queues. J. Comput. Sys. Sci., 29:303–311, 1984.

[743] P.M.B. Vitányi. An n^1.618 lower bound on the time to simulate one queue or two pushdown stores by one tape. Inform. Process. Lett., 21:147–152, 1985.

[744] P.M.B. Vitányi. Square time is optimal for the simulation of a pushdown store by an oblivious one-head tape unit. Inform. Process. Lett., 21:87–91, 1985.

[745] P.M.B. Vitányi. Andrei Nikolaevich Kolmogorov. CWI Quarterly, 1(2):3–18, June 1988. Also: Scholarpedia, p. 8546.

[746] P.M.B. Vitányi. Multiprocessor architectures and physical law. In Proc. 2nd IEEE Workshop Phys. Comput., pages 24–29. IEEE Comput. Soc. Press, 1994.

[747] P.M.B. Vitányi. Physics and the new computation. In Proc. 20th Int. Symp. Math. Found. Comput. Sci., volume 969 of Lect. Notes Comput. Sci., pages 106–128. Springer-Verlag, Berlin, 1995.

[748] P.M.B. Vitányi. Randomness. Arxiv preprint math/0110086, 1996.

[749] P.M.B. Vitányi. A discipline of evolutionary programming. Theor. Comput. Sci., 241(1-2):3–23, 2000.

[750] P.M.B. Vitányi. Quantum Kolmogorov complexity based on classical descriptions. IEEE Trans. Inform. Theory, 47(6):2464–2479, 2001. Correction, Ibid., 48(4):1000, 2002.

[751] P.M.B. Vitányi. Asshuku ni motozuita hanyou na ruijido sokuteihou. Surikagaku, 519:54–59, September 2006. Translated into Japanese by O. Watanabe; English title: Universal similarity based on compression.

[752] P.M.B. Vitányi. Meaningful information. IEEE Trans. Inform. Theory, 52(10):4617–4626, 2006.

[753] P.M.B. Vitányi. Algorithmic chaos and the incompressibility method. In E. Charpentier, A. Lesne, and N.K. Nikolski, editors, Kolmogorov’s Heritage in Mathematics, pages 301–317. Springer-Verlag, Berlin, 2007.

[754] P.M.B. Vitányi. Analysis of sorting algorithms by Kolmogorov complexity (a survey). In I. Csiszár, G.O.H. Katona, and G. Tardos, editors, Entropy, Search, Complexity, number 16 in Bolyai Society Mathematical Studies, pages 209–232. Springer-Verlag, Berlin, 2007.

[755] P.M.B. Vitányi, F.J. Balbach, R.L. Cilibrasi, and M. Li. Normalized information distance. In M. Dehmer and F. Emmert-Streib, editors, Information Theory and Statistical Learning. Springer-Verlag, New York. To appear.

[756] P.M.B. Vitányi and M. Li. Algorithmic arguments in physics of computation. In Proc. 4th Workshop Alg. Data Struct., volume 955 of Lect. Notes Comput. Sci., pages 315–333. Springer-Verlag, Berlin, 1995.

[757] P.M.B. Vitányi and M. Li. Minimum description length induction, Bayesianism, and Kolmogorov complexity. IEEE Trans. Inform. Theory, 46(2):446–464, 2000.

[758] R. von Mises. Grundlagen der Wahrscheinlichkeitsrechnung. Mathemat. Zeitsch., 5:52–99, 1919.


[759] R. von Mises. Probability, Statistics and Truth. Macmillan, 1939. Reprint: Dover, 1981.

[760] J. von Neumann. Various techniques used in connection with random digits. In A.H. Traub, editor, John von Neumann, Collected Works, volume V. Macmillan, 1963.

[761] V.G. Vovk. Algorithmic information theory and prediction problems. In M.I. Kanovich et al., editor, Complexity Problems of Mathematical Logic, pages 21–24. Kalininsk. Gos. Univ., Kalinin, 1985. In Russian.

[762] V.G. Vovk. The law of the iterated logarithm for random Kolmogorov, or chaotic, sequences. SIAM Theory Probab. Appl., 32(3):413–425, 1987.

[763] V.G. Vovk. On a randomness criterion. Soviet Math. Dokl., 35:656–660, 1987.

[764] V.G. Vovk. Prediction of stochastic sequences. Problems Inform. Transmission, 25:285–296, 1989.

[765] V.G. Vovk. Universal forecasting algorithms. Inform. Comput., 96:245–277, 1992.

[766] V.G. Vovk and A. Gammerman. Complexity approximation principle. Comput. J., 42(4):318–322, 1999.

[767] V.G. Vovk, A. Gammerman, and C. Saunders. Machine-learning applications of algorithmic randomness. In Proc. 16th Int. Conf. Machine Learning, pages 444–453, 1999.

[768] V.G. Vovk and C. Watkins. Universal portfolio selection. In Proc. 11th Conf. Comput. Learning Theory, pages 12–23. ACM Press, 1998.

[769] M.V. Vyugin. Information distance and conditional complexities. Theor. Comput. Sci., 271(1-2):145–150, 2002.

[770] M.V. Vyugin. Systems of strings with high mutual complexity. Probl. Inform. Transmission, 39(4):88–92, 2003.

[771] M.V. Vyugin and V.V. Vyugin. Predictive complexity and information. J. Comput. Syst. Sci., 70(4):539–554, 2005.

[772] V.V. Vyugin. Algorithmic entropy (complexity) of finite objects, and its application to defining randomness and quantity of information. Semiotika i Informatika, 16:14–43, 1981. In Russian. Translated into English in: Selecta Mathematica formerly Sovietica, 13:4(1994), 357–389.

[773] V.V. Vyugin. The algebra of invariant properties of binary sequences. Problems Inform. Transmission, 18:147–161, 1982.

[774] V.V. Vyugin. On nonstochastic objects. Problems Inform. Transmission, 21:3–9, 1985.

[775] V.V. Vyugin. On the defect of randomness of a finite object with respect to measures with given complexity bounds. SIAM Theory Probab. Appl., 32:508–512, 1987.

[776] V.V. Vyugin. Bayesianism: an algorithmic analysis. Inform. Comput., 127(1), 1996.

[777] V.V. Vyugin. Effective convergence in probability and an ergodic theorem for individual random sequences. SIAM Theory Probab. Appl., 42(1):39–50, 1997.

[778] V.V. Vyugin. Ergodic theorems for individual random sequences. Theor. Comput. Sci., 207(4):343–361, 1998.

[779] V.V. Vyugin. Non-stochastic infinite and finite sequences. Theor. Comput. Sci., 207(4):363–382, 1998.


[780] V.V. Vyugin. Algorithmic complexity and stochastic properties of finite binary sequences. Comput. J., 42(4):294–317, 1999.

[781] V.V. Vyugin. Most sequences are stochastic. Inform. Comput., 169(2):252–263, 2001.

[782] V.V. Vyugin. Does snooping help? Theor. Comput. Sci., 276(1-2):407–415, 2002.

[783] V.V. Vyugin. On complexity of easy predictable sequences. Inform. Comput., 178(1):241–252, 2002.

[784] V.V. Vyugin. Suboptimal measures of predictive complexity for absolute loss function. Inform. Comput., 175(2):146–157, 2002.

[785] V.V. Vyugin and V.P. Maslov. Extremal relations between additive loss functions and the Kolmogorov complexity. Probl. Inform. Transmission, 39(4):71–87, 2003.

[786] A. Wald. Sur la notion de collectif dans le calcul des probabilités. Comptes Rendus des Séances de l’Académie des Sciences, 202:1080–1083, 1936.

[787] A. Wald. Die Widerspruchsfreiheit des Kollektivbegriffes der Wahrscheinlichkeitsrechnung. Ergebnisse eines mathematischen Kolloquiums, 8:38–72, 1937.

[788] C.S. Wallace. Statistical and Inductive Inference by Minimum Message Length. Springer-Verlag, New York, 2005.

[789] C.S. Wallace and D.M. Boulton. An information measure for classification. Comput. J., 11:185–195, 1968.

[790] C.S. Wallace and D.L. Dowe. Minimum message length and Kolmogorov complexity. Comput. J., 42(4):270–283, 1999.

[791] C.S. Wallace and D.L. Dowe. Refinements of MDL and MML coding. Comput. J., 42(4):330–337, 1999.

[792] C.S. Wallace and P.R. Freeman. Estimation and inference by compact coding. J. Royal Stat. Soc., 49:240–251, 1987. Discussion: pages 252–265.

[793] X.J. Wang. Intermittent fluctuations and complexity. In W.H. Zurek, editor, Complexity, Entropy and the Physics of Information, pages 319–330. Addison-Wesley, New York, 1991.

[794] Y. Wang. The law of the iterated logarithm for p-random sequences. In Proc. 11th IEEE Conf. Structure in Complexity Theory, pages 180–189, 1996.

[795] Y. Wang. A separation of two randomness concepts. Inform. Process. Lett., 69(3):115–118, 1999.

[796] O. Watanabe. Comparison of polynomial time completeness notions. Theor. Comput. Sci., 53:249–265, 1987.

[797] O. Watanabe, editor. Kolmogorov Complexity and Computational Complexity. Springer-Verlag, Berlin, 1992.

[798] M. Wax and I. Ziskind. Detection of the number of coherent signals by the MDL principle. IEEE Trans. Acoust. Speech Signal Process., ASSP-37(8):1190–1196, 1989.

[799] S. Wehner. Analyzing worms and network traffic using compression. J. Computer Security, 15(3):303–320, 2007.

[800] H.S. White. Algorithmic complexity of points in dynamical systems. Ergodic Theory and Dynamical Systems, 13:807–830, 1993.


[801] D.G. Willis. Computational complexity and probability constructions. J. Assoc. Comput. Mach., 17:241–259, 1970.

[802] C.H. Woo. Laws and boundary conditions. In W.H. Zurek, editor, Complexity, Entropy and the Physics of Information, pages 127–135. Addison-Wesley, New York, 1991.

[803] E.H. Yang. Universal almost sure data compression for abstract alphabets and arbitrary fidelity criterions. Probl. Contr. Inform. Theory, 20(6):397–408, 1991.

[804] E.H. Yang and S.Y. Shen. Distortion program-size complexity with respect to a fidelity criterion and rate-distortion function. IEEE Trans. Inform. Theory, 39(1):288–292, 1993.

[805] L. Yu, D. Ding, and R. Downey. The Kolmogorov complexity of random reals. Ann. Pure Appl. Logic, 129:163–180, 2004.

[806] X. Zhang, Y. Hao, X. Zhu, and M. Li. Information distance from a question to an answer. In Proc. 13th ACM SIGKDD Int. Conf. Knowledge Discov. Data Mining, pages 874–883. ACM Press, 2007.

[807] I.G. Zhurbenko. The Spectral Analysis of Time Series, pages 231–236. Series in Statistics and Probability. North-Holland, 1986. Appendix II: Kolmogorov’s algorithm of the Random Number Generator.

[808] M. Zimand. On the topological size of sets of random strings. Zeitschr. f. math. Logik und Grundlagen d. Math., 32:81–88, 1986.

[809] M. Zimand. A high-low Kolmogorov complexity law equivalent to the 0-1 law. Inform. Process. Lett., 57:59–64, 1996.

[810] M. Zimand. Large sets in AC0 have many strings with low Kolmogorov complexity. Inform. Process. Lett., 62:165–170, 1997.

[811] J. Ziv. On the complexity of an individual sequence. IEEE Trans. Inform. Theory, 22:74–88, 1976.

[812] J. Ziv. Distortion-rate theory for individual sequences. IEEE Trans. Inform. Theory, 26(2):137–143, 1980.

[813] J. Ziv and A. Lempel. Compression of individual sequences via variable-rate encoding. IEEE Trans. Inform. Theory, 24:530–536, 1978.

[814] W.H. Zurek. Algorithmic randomness and physical entropy. Physical Review, Ser. A, 40(8):4731–4751, 1989.

[815] W.H. Zurek. Thermodynamic cost of computation, algorithmic complexity and the information metric. Nature, 341:119–124, 1989.

[816] W.H. Zurek. Algorithmic information content, Church–Turing thesis, physical entropy, and Maxwell’s demon. In W.H. Zurek, editor, Complexity, Entropy and the Physics of Information, pages 73–89. Addison-Wesley, New York, 1991.

[817] W.H. Zurek, editor. Complexity, Entropy and the Physics of Information. Addison-Wesley, New York, 1991.

[818] W.H. Zurek. Decoherence, chaos, quantum-classical correspondence, and the algorithmic arrow of time. Physica Scripta, T76:186–198, 1998.

[819] W.H. Zurek. Quantum discord and Maxwell’s demons. Physical Review, Ser. A, 67:012320, 2003.

[820] A.K. Zvonkin and L.A. Levin. The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms. Russian Math. Surveys, 25(6):83–124, 1970.

Index

(n)k: number of variations, 9
0′: Turing degree halting problem, 231
A∗: set of all finite sequences of elements of set A, 12
A=n: set of all words of length n in A, 550
A∞: set of one-way infinite sequences over set A, 14
A≤n: set of words of length ≤ n in A, 111
C: complexity, 106
C(x|l(x)): length-conditional C, 119
C(x;n): uniform complexity, 130
C[f(n), t(n), s(n)], 542
C+: monotone upper bound on C-complexity, 217
Cs: space-bounded version of Ct,s, 534
Ct: time-bounded version of Ct,s, 534
Ct,s: time-space-bounded C complexity, 532
Cr: complexity of r-ary strings, 115
Cφ: complexity with respect to φ, 105
Cn: normalized complexity for reals, 132
E0: information distance, 642
E1: max distance, 643
E2: reversible computation distance, 649
E3: reversible sum distance, 651
E4: sum distance, 652
E5: min distance, 674
H: entropy stochastic source, 67
I(X;Y): probabilistic mutual information, 71
I(x : y): algorithmic mutual information K(y) − K(y | x), 248
I(x; y): algorithmic mutual information K(x) + K(y) − K(x, y), 252
IC(x:y): algorithmic information in x about y, 186
K: diagonal halting set, 34
K(K(x)|x): complexity of the complexity function, 242
K(x|l(x)): length-conditional K, 204
K(x): prefix complexity, 202
K+: monotone upper bound on K-complexity, 212
Kt,s: time-space-bounded K complexity, 535


K0: halting set, 34
L(x): uniform discrete distribution on N, 23
O(f(x)): at most of order of magnitude f(x), 16
∆pi: class in polynomial hierarchy, 40
∆0n: class in arithmetic hierarchy, 47
Γx: cylinder generated by x, 14
l∗(x): optimal universal code-word length, 82
ℓ∗(x): lower bound on l∗(x), 82
ℓk(x, ε): upper bound on l∗(x), 309
Ω(f(x)): at least of order of magnitude f(x), 16
Ω: halting probability, 226, 496
Πpi: class in polynomial hierarchy, 40
Π0n: class in arithmetic hierarchy, 47
Σpi: class in polynomial hierarchy, 40
Σ0n: class in arithmetic hierarchy, 47
Θ(f(x)): of order of magnitude f(x), 16
x̄: prefix-code 1^l(x)0x for x, 13
∅: empty set, 7
∃: there exists, 8
∃∞: there exist infinitely many, 8
∀: for all, 8
∀∞: for all but finitely many, 8
λ(ω): uniform continuous distribution on [0, 1), 23
〈·〉: pairing function, 7
⌈·⌉: ceiling of a number, 8
⌊·⌋: floor of a number, 8
ln: natural logarithm, 8
log-DNF, 376, 381
log-decision list, 381
log: binary logarithm, 8
log∗ x: number of terms in l∗(x), 82
ω: infinite sequence of elements of B, 14
φ(x) < ∞: φ(x) converges, 8
φ(x) = ∞: φ(x) diverges, 8
ρ0: universal integral test, 223
σ-algebra, 20
d(A): cardinality of set A, 7
e: normalized information distance, 661
eG: normalized web distance, 668
eZ: normalized compression distance, 664
emin: normalized min distance, 674
l(x): length of string x, 13
n-cell, 691
n-string, 119, 122, 129, 130, 132, 159, 193
n(T): index of T, 30
o(f(x)): asymptotically less than f(x), 16
x: finite sequence of elements of B, 13
x∗: first shortest program for x in enumeration order, 110
xR: reverse of string x, 13
xi: ith letter of x, 13
x1:n: first n letters of x, 13
M: universal lower semicomputable continuous semimeasure, 294
Mnorm: Solomonoff measure, 303, 324, 325
Mc: universal lower semicomputable extension semimeasure, 326
m: universal lower semicomputable discrete semimeasure, 267, 268
B: basic elements, 12, 262
N: the nonnegative integers, 7
Q: the rational numbers, 7
R: the real numbers, 7
Z: the integers, 7
CD[f(n), t(n), s(n)], 544
CDs: space-bounded version of CDt,s, 534
CDt: time-bounded version of CDt,s, 534, 536
CDt,s: resource-bounded accepting complexity, 533


CU[f(n), t(n), s(n)], 547
KDt,s: K version of CDt,s, 535
KM: negative logarithm of M(x), 305
Kc: Chaitin’s conditional prefix complexity, 252
Km: monotone complexity, 305
Kt: Levin-complexity, 577–582
ict: instance complexity, 571
(n choose k): number of combinations, 9
| · |: absolute value of a number, 8

Aanderaa, S.O., 525, 528
Abel, N.H., 90
acceptable numbering, 41, 112
accepting a language, 37
Ackermann, W., 45
Adleman, L.M., 596, 599, 709
Adriaans, P., 431, 439
Agafonov, V.N., 59, 185
Agrawal, M., 529
Aho, T., 719
Aldana, M., 719
Aleksandrov, P.S., 97
Alfonseca, M., 673, 719
algorithmic complexity theory, ix, 101–257
algorithmic entropy, see entropy, algorithmic
algorithmic information theory, see information theory, algorithmic
algorithmic probability theory, 259–338
algorithmic statistic, 402
Allender, E., 529, 570, 582, 597, 600
Allison, L., 436
Alon, N., 457, 463, 526
Ane, C., 718
Andreev, A.E., 597
Andrews, J., 184
Andrzejak, R.G., 718
Angluin, D., 94, 434
Anthony, M., 434
ants, 711
Antunes, L., 586, 588, 589, 719
Archimedes, 49
Aristotle, 341, 431
arithmetic hierarchy, 46
Asarin, E.A., 173, 174
Asmis, E., 431
Astola, J., 720
asymptotic notation, 15–17
Aumann, Y., 597
average-case
  adder design, 444–445
  complexity, 290–294, 330–331, 334, 444–445, 448–451, 461–490
  Dobosiewicz sort, 485–486
  Heapsort, 476–480
  longest common subsequence, 486–489
  Quicksort, 485
  routing in networks, 473
  Shakersort, 485–486
  Shellsort, 480–486
  shortest common supersequence, 489
Avogadro’s number, 679

Bachman, P., 15
Bacon, F., 713, 722
Badger, J.H., 672, 717
Baeza-Yates, R.A., 489
Balcazar, J.L., 92, 381, 569, 597, 600
Baranyai, Zs., 468
Barendregt, H.P., 207
Barron, A.R., 436
Barzdins’s lemma, 131, 178, 180, 181, 195, 239, 245, 496, 537, 539, 594
Barzdins, J.M., 116, 178, 180, 185, 195, 434, 496, 539, 546, 596
basic element, 12, 262, 294
Bassalygo, L.A., 459
Bayes’s rule, 20, 19–20, 60, 62, 65, 94, 96, 323, 332, 333, 343–347, 349, 360, 363, 384, 398, 432, 435
Bayes, T., 60, 343, 347, 432
Bayesian reasoning, 343–347
Beame, P., 515
Becher, V., 237
Beigel, R., 525


Ben-Amram, A.M., 501, 530
Ben-David, S., 588
Benedek, G., 380, 434
Benedetto, D., 718
Benioff, P.A., 165, 633
Bennamoun, M., 719
Bennett, C.H., 183, 195, 228, 255, 570, 589, 595, 600, 626, 633, 640, 641, 658, 659, 688, 715–717, 720
Berger, T., 95
Berman, P., 5, 569
Berman–Hartmanis conjecture, 563
Bernardes, J., 719
Bernoulli process, 59, 63–65, 192, 284, 323, 324
Bernoulli, J., 59, 63
Bernstein, E., 709
Berry, G.G., 177
Berthiaume, A., 709, 722
betting, 284–287
Bienvenu, L., 231
Biggs, N., 434
binary interval, 274, 306
binomial coefficient, 9
bit: binary digit, vii
Blum, M., 550
Blumer, A., 380, 434
Bogolyubov, N.N., 97
Bollobas, B., 526
Boltzmann constant, 629, 682
Boltzmann, L., 678
Bolyai, J., 95
Bolyai, W., 95
Book, R.V., 530, 570, 597, 598, 600
Boolean formula, 376, 568
Boolean matrix rank, 445
Boppana, R., 526
Borel, E., 20, 165, 238
Borel–Cantelli lemmas, 64, 159
Boswell, J., 259, 331
Boulton, D.M., 435
bra-ket notation, 696
Brady, A.H., 46
Brebner, G., 459
Breitbart, Y., 530
Brejova, B., 486
Brewer, R.G., 720
Briegel, H.J., 722
Briley, B.E., 525
Brouwer, L.E.J., 164
Brownian computer, 633
Bruijn sequence, 528
Bruijn, N.G. de, 528
Buck, M., 672
Buhrman, H.M., 123, 468, 469, 473, 519–521, 525, 526, 529, 536, 559, 570, 577, 588, 596–598, 641, 716
Burks, A.W., 444, 525, 715

Caglioti, E., 718
Cai, J.-Y., 132, 561
Calude, C., 158, 228, 235, 236
Cantelli, F.P., 64
Cantor, D.G., 455
Cantor, G., 40
Cardano, G., 23, 331
cardinality, 7, 13
Carnap, R., 96, 332, 347
Carnot cycle, 675
Carnot, N.L.S., 675
Cartesian product, 7
Case, J., 718
Castro, J., 381
Cauchy–Schwarz inequality, 530
Caves, C.M., 625, 714, 722
Cebrian, M., 673, 719
Chaitin, G.J., 5, 91, 96, 99, 104, 123, 132, 160, 186, 192, 194, 195, 206, 207, 213, 214, 221, 224, 228, 232, 233, 237–240, 254–256, 327, 335, 435, 493, 600
Champernowne’s number, see sequence, Champernowne
Champernowne, D.G., 54, 93
characteristic function, see function, characteristic
characteristic sequence, 125, 178, 538
  of K0, 180, 245
  of a language, 492, 496, 561, 600


  of high Turing degree set, 183
  of hyperimmune set, 185
  of immune set, 184
  of not recursively enumerable set, 179, 181
  of recursively enumerable set, 178, 180, 181, 184, 239, 245, 537, 539, 546, 594
  of semirecursive set, 183
  random, 569
Chater, N., 437, 722
Chen, X., 672, 717
Chernoff bounds, 61, 167, 346, 462, 471, 567
Chernoff, H., 94
Chernov, A.V., 717
Chervonenkis, A.Ya., 434
Chitescu, I., 158
Chomsky hierarchy, 490
Chomsky, N., 332
Chor, B., 588
Chrobak, M., 500
Church random sequence, see sequence, Mises–Wald–Church random
Church’s thesis, 24, 29, 53
Church, A., 24, 35, 42, 51, 53, 93, 156
Chvatal, V., 434, 489
Cilibrasi, R.L., 670, 673, 718, 719
classification, 669
clause, 376
Clementi, A.E.F., 597
clique, 454, 457
clustering
  hierarchical, 666
CNF formula, 376
coarse-graining, 681
code
  additively optimal universal, 253
  ASCII, 74
  asymptotically optimal universal, 81, 82
  average word length, 77, 95, 199
  data-to-model, 402
  fixed-length, 74
  Hamming, 123
  instantaneous, see code, prefix
  MDL, 407
  model, 402
  Morse, 66, 73
  optimal prefix, 77
  prefix, 5, 13, 15, 68, 75, 73–90, 94, 199
  self-delimiting, 79
  Shannon–Fano, 68, 79, 87, 94, 274, 280, 327, 593, 604, 625, 668
  two-part, 107–108, 721
  uniquely decodable, 74, 77, 88
  universal, 81, 79–82, 88, 95, 278
  variable-length, 75
code sequence, 74
code word, 13, 73
Cohen, P., 98
coin-weighing problem, 455
collective, 51, 51, 53, 55, 57, 93, 143, 155, 163
combination, 9
combinatorics, 8–12, 92, 451–460
combinatory logic, 207, 255
communication complexity, 516–521
  noncommunicable string, 520
  protocol independent, 519–521
  randomized, 521
Complearn toolkit, 719
complexity
  C, 106, 103–195, 206
  Cs, 534, 542
  Ct, 534, 536–542, 546, 550–562
  Ct,s, 532, 532–536, 542
  K, see complexity, prefix
  Kt,s, 535, 546
  Kµ, 327
  r-ary strings C, 114
  CDs, 534
  CDt, 534, 550–562, 571
  CDt,s, 533, 533–536, 546


  Ct, 581
  KDs, 546
  KDt,s, 535, 546
  KM, 305, 328
  Kc, 252, 253, 254
  Km, see complexity, monotone
  Kt, 578, 577–582
  additivity of, 109, 118, 191, 197, 202, 245–248
  additivity of C, 195
  additivity of K, 248, 251
  additivity of Kc, 252
  algorithmic, 1, 66, 96, 187
  alternative approach to define it, 205
  approximation of, 125, 127
  average-case, 291, 290–294, 330–331, 334
  conditional C, 106, 119
  conditional K, 202
  continuity of, 121, 128
  expected C, 122, 188, 603–608
  expected K, 189, 246, 603–608
  extension, 216
  fluctuation of, 128
  incomputability of, 127, 193
  instance, 571, 571–577
  length-conditional C, 119, 122, 127, 129, 130, 132, 160, 162, 193
  length-conditional K, 204, 213, 217
  lower bound on C, 126
  lower bound on C(x|l(x)), 127, 129
  lower bound on K, 216
  majorant of, 261, 537
  monotone, 206, 221, 225, 305, 305–308, 328, 329, 335, 336
  monotonic upper bound on K, 212, 213, 217, 240, 241
  monotonicity on prefixes, 119, 197, 199, 220
  normalized for real numbers, 132
  number of states, 184
  of complexity function, 182, 242, 241–245, 254, 256
  of function, 116, 242
  prefix, 104, 132, 202, 197–257, 334, 432
  quantum Kolmogorov, 696–711
  relation between C and K, 215
  resource bound hierarchies, 542–545
  resource-bounded, x, 97, 531–600
  space, 37
  state–symbol product, 91, 90–92, 95, 99
  stochastic, 435
  time, 37
  time-bounded uniform, 546
  time-space-bounded, 532–536
  uniform, 130, 132, 160, 162, 180, 184, 193, 197, 206, 237, 308, 546, 547, 596
  worst-case, 291, 330
complexity class
  ∆E2, 568
  ∆pi, 40
  Πpi, 40, 597
  Σpi, 40, 597
  #P, 560, 561
  BPP, 554, 555, 597, 598
  DSPACE, 38, 545, 550, 598
  DTIME, 38, 548, 561, 563, 564, 569, 598
  E, 564, 565, 568, 570, 598
  ESPACE, 598
  EXPTIME, 569
  IC[log,poly], 573
  NE, 568
  NP, 38, 534, 555, 560, 563, 564, 568, 573, 581, 599
  NSPACE, 38, 569
  NTIME, 38
  P, 38, 560, 563, 568, 573, 581, 598
  P/log, 575


  P/poly, 560, 575
  PSPACE, 38, 560, 569
  R, 555
complexity oscillation, 98, 143, 143–147, 155, 159, 194, 198, 218, 220, 221, 224, 231
  of K, 225, 229, 232, 233
  of Km, 225, 335
compression, 343, 347, 348, 350, 362, 383, 423, 430, 433, 439, 620, 621, 664–674
  in nature, 711–714
  lossy, 405, 613
computability theory, 24–47, 92
computable majorants, 536–542
computational complexity, x, 37–40, 562–577, 598
computational depth, 586
computational learning theory, ix, 6, 370–382
concatenation, 12
context-free language, 581
convergence
  apparent, of relative frequency, 142
  recursive, of series, 152, 155, 159, 162, 229
  regulator of, 160
Cook, S.A., 599
Copolov, D.L., 436
counting method, 453
Cover, T.M., 90, 94, 95, 97, 113, 141, 214, 216, 218, 255, 324, 327, 337, 338, 366, 432, 433, 436, 439, 625, 714
crossing sequence, 442
Csiszár, I., 123, 366
Culik II, K., 483
Cutler, C.C., 715
Cuturi, M., 718
cylinder, 14, 21, 55, 144, 148, 380

D'Ariano, G.M., 722
Ďuriš, P., 512, 528
Daley, R.P., 156, 160–162, 183, 546, 547, 596
Dancik, V., 489
data
  (α, β)-stochastic, 424, 424, 439
  nonstochastic, 413, 419, 424, 425, 438, 439
  overfitting, 407
data-processing inequality, see inequality, data-processing
de la Vallée Poussin, C.J.G.N., 4
decision list, 381
decision tree, 393–397
degree of unsolvability, 44
Dekker, J.C.E., 43, 45
Delahaye, J.P., 717
Demaine, E.D., 501
DeMarrais, J., 709
denoising, 620–622
derandomization, 567–568
DeSantis, A., 433
Devroye, L., 485
Dewdney, A.K., 46
DFA, see finite automaton
Diacz, S., 237
diagonalization method, 34
Díaz, J., 92
Dietzfelbinger, M., 510, 511
dimension
  Hausdorff, 132
  topological, 132
  Vapnik–Chervonenkis, 380
Ding, D., 237
Ding, Y.Z., 598
Diophantine equation, 179, 180, 239, 256
Dirac, P., 696
distance, 646
  χ², 328
  Euclidean, 366
  Hamming, 616, 647
  Hellinger, 328, 329, 353
  information, see information distance
  Kullback–Leibler, see Kullback–Leibler divergence
  max, 643, 641–648
  min, 673


  mixed use NID approximations, 671
  normalized compression, 664, 663–667
  normalized information, 661
  normalized max, 661
  normalized min, 671, 674
  normalized sum, 661, 672, 717
  normalized web, 668, 667–671
  reversible, 649, 648–650
  sum, 651, 650–653
distortion
  Euclidean, 617, 627–628
  Hamming, 615–617, 620, 626–627
  list, 424, 615–616
distortion ball, 614
distribution
  binomial, 61, 346
  Bose–Einstein, 11, 282
  computable universal, 582–589
  conditional universal, 275
  Fermi–Dirac, 11, 281
  malign, 586
  Maxwell–Boltzmann, 11
  normal, 390, 392
  of description length, 211, 255, 277, 287, 288
  simple, 374
  uniform, 21, 69, 78, 136, 138, 368
  uniform continuous, 23
  uniform discrete, 23, 283
  universal, 6, 273, 266–302, 330–331, 344, 605
  universal time-limited, 582–589
distribution-free learning, 370–382
DNF formula, 376
Dobosiewicz, W., 486
Doerr, B., 483
Doob, J.L., 322, 337
Downey, R.G., 194, 232, 236, 237, 256
Drexler, K.E., 717
Duns Scotus, John, 63, 431
Durand, B., 125

Edmonds, J.E., 39
effective enumeration, 29
Ehrenfeucht, A., 380, 434, 490
Eleftheriadis, A., 629, 715
element-distinctness problem, 508
Elias, P., 57, 88, 95
Emanuel, K., 718
ensemble, 66, 685
entropy, 67, 65–73, 78, 80, 87, 156, 187, 188, 192, 195, 198, 199, 246, 603–608
  n-cell algorithmic, 692
  algorithmic, 692, 686–696, 720, 721
  Boltzmann, 678–685
  classical, 675–678
  coarse-grained algorithmic, 693, 694
  complexity, 687
  conditional, 70, 68–70, 246
  Gibbs, 686, 685–687, 693
  joint, 246
  of English, 113
  of Russian, 94
  physical, 720
  relation with complexity, 603–608
Epicurus, 339, 341, 343, 347, 431
Erdős, P., 94, 453–455, 457, 458, 463, 526
estimator
  best-fit, 404
  maximum likelihood, 405
  minimum description length, 406
event, 18
  certain, 18, 21
  impossible, 18, 21
  mutually independent, 20
  probability of, 18

Fano, R.M., 94
Feder, M., 437
Feigenbaum, J., 597
Feldman, J., 722
Felker, J.H., 715


Feller, W., 10–12, 22, 23, 64, 65, 92, 93, 331
Fenner, S., 597
Ferguson, T.S., 57
Fermat, P. de, 23, 179
Fermi, E., 96, 720
Ferragina, P., 665, 719
Feynman, R.P., 633
Fich, F., 516
field, 19
  Borel, 20
  Borel extension, 21
  probability, 19
Fine, T.L., 93, 96, 142, 432
Finetti, B. de, 432
finite automaton
  k-head DFA, 499
  k-pass DFA, 500
  deterministic (DFA), 341, 447, 500
  nondeterministic (NFA), 447
  sweeping two-way DFA, 501
Fisher, R.A., 83, 90, 95, 398, 437
Floyd, R.W., 476, 527
Ford, J., 721
Fortnow, L., 123, 192, 529, 559, 560, 567, 575, 576, 586, 588, 589, 597, 598
Foulser, D., 490
Fouz, M., 457, 460, 483
fractal, 132
Francia, B., 718
Fredkin gate, 631, 633
Fredkin, E., 715, 716
Fredman, M.L., 501
Freeman, P.R., 435
Freivalds, R.V., 434, 447, 524
frequency, 67
  a priori, 289
  lower, 289
  relative, 51
frequency interpretation of probability, 50, 93
Friedberg, R.A., 42, 44
Frost, S.D.W., 718
Fu, B., 570
function
  Ackermann, 45, 89, 308, 309
  additively optimal, 103, 107
  additively optimal partial recursive prefix, 202
  busy beaver, 46, 130, 185, 325
  canonical rate-distortion, 614
  characteristic, 8, 32, 371, 571
  coenumerable, see function, upper semicomputable
  complexity, 205
  composition of, 8
  computable, 36, 35–37, 310–316
  computable in the limit, 672
  consistent, 571
  convergence of, 8
  decoding, 13, 73
  distance, 646
  distortion, 613
  distortion-rate, 613
  distribution, 22
  divergence of, 8
  encoding, 73
  enumerable, see function, lower semicomputable
  factorial, 9, 17
  following shape, 407
  generalized exponential, 45
  hash, 654
  honest, 562, 564
  incomputable, see function, nonrecursive
  inverse of, 8
  lower semicomputable, 35, 135, 148, 149, 151, 222, 234, 260–262, 310–316
  many-to-one, 8
  minimum description length (MDL), 407
  monotone, 299, 300, 301
  nonrecursive, 46, 174, 186, 241
  one-to-one, 8
  pairing, 7
  parity, 524
  partial, 7, 8
  partial recursive, 29, 116
  partial recursive prefix, 200
  payoff, 285, 284–287, 319
  predicate, 29


  predictor, 58
  primitive recursive, 89
  probability density, 22
  probability mass, 22, 328
  ranking, 566
  rate-distortion, 613
  recursive, 29, 41, 46, 53, 116, 133, 331
  recursive real, 36
  regular, 300
  semicomputable, 36, 35–37, 92, 310–316, 331
  shape match, 133
  structure, see structure function
  successor, 29
  total, 8
  total recursive, see function, recursive
  unit integrable, 310–316
  universal, 103, 104, 105, 107
  universal lower semicomputable, 260
  universal partial recursive, 31
  universal upper semicomputable, 261
  upper semicomputable, 35, 174, 205, 216, 222, 328

Gödel number, 30
Gödel numbering, see numbering, acceptable
Gödel, K., 3, 33, 34, 95, 175, 177, 195, 612, 714
Gács, P., 97, 115, 141, 173, 182, 183, 185, 191, 193–195, 206, 214, 218, 219, 231, 241, 245, 251, 254–257, 288, 289, 304, 307, 324, 328, 335, 337, 370, 421, 425, 426, 433, 434, 439, 597, 599, 600, 625, 626, 658, 659, 696, 708, 711, 714, 716, 717, 721, 722
Gabarró, J., 92
Gaifman, H., 164
Galil, Z., 500, 509, 512, 528–530
Gallager, R.G., 88, 94
Gallaire, H., 527
Gammerman, A., 720
Gao, Q., 436
garbage bits, 631
Gardner, M., 228, 255
Garey, M.R., 92
Gasarch, W., 182, 183, 525, 526, 560, 672
Gavaldà, R., 489, 569, 570
Gavoille, C., 475
Gell-Mann, M., 430, 721
generalized Kolmogorov complexity, see complexity, resource-bounded
genericity, 99
Gereb-Graus, M., 500
Giancarlo, R., 665, 719
Gibbs, J.W., 685
Gill, J., 570
Gnedenko, B.V., 97
Gold, E.M., 363, 434
Goldbach conjecture, 228
Goldberg, A., 556, 560, 597
Goldreich, O., 588
Goldsmith, J., 567
Goldstine, H.H., 444, 525
Good, I.J., 432
Grünwald, P.D., 436
Graham, R., 17
graph
  expander, 459
  labeled, 461
  OR-concentrator, 459
  random, see Kolmogorov random graphs
  random directed, 457
  statistics subgraphs, 464
  tournament, see tournament
  undirected, 457
  unlabeled, see number of unlabeled graphs
Grassberger, P., 718
Gray, R.M., 97, 141, 338, 439, 625, 714
Greco, V., 665, 719
Griffiths, T.L., 722
Grimmett, G., 458
Grossman, J.W., 45
Grumbach, S., 717


Gurevich, Y., 599

Hühne, M., 510
Hadamard, J.S., 4
Hahn, E.L., 720
halting probability, 226, 225–228, 232, 233, 235, 237, 238, 255, 262, 272, 594
halting problem, 33, 33–35, 42, 182, 186, 226, 245, 249, 368, 414, 710
Hamiltonian equations, 681
Hammer, D., 192, 530, 626, 714
Hancock, T., 397
Handley, J., 667, 719
Hanson, N.R., 342
Hao, Y., 719
Harary, F., 465
Hardy, G.H., 16
Harrison, M.A., 495, 500, 527
Hartle, J.B., 721
Hartmanis, J., 92, 132, 508, 528, 550, 568, 596, 597
Haussler, D., 380, 434
Heilbronn's triangle problem, 458
Heilbronn, H.A., 458
Heim, R., 142, 337
Hemachandra, L., 561, 597, 599
Hemaspaandra, E., 570
Hennie, F.C., 525, 528, 532
Hermo, M., 600
Hertling, P., 235
Heyting, A., 164
Hilbert's tenth problem, 180
Hilbert, D., 45, 180, 612
Hirschfeldt, D.R., 194, 232, 256
Hochberg, J.E., 437
Hoeffding, W., 57
Hoepman, J.H., 469, 473, 526
Homer, 101
Honavar, V., 382
Hood, L.E., 719
Hopcroft, J.E., 527
Huang, M.-D.A., 709
Hume, D., 347
Hurewicz, W., 132
Hutter, M., 355, 358, 362, 367–369, 432, 433, 580, 582, 599
Huynh, D.T., 561, 562
Hwang, K., 525
hypotheses identification, 363–364
Håstad, J., 529

Ibarra, O.H., 500
ideal MDL, 423
immune
  bi, 562
  P/poly, 562
Impagliazzo, R., 570, 597
incompressibility method, 441–530
induction, 339
  in recursion theory, 363–366, 369–370
inductive inference, 339
  Gold paradigm, 363, 364, 434
inductive reasoning, ix, 6, 59, 96, 332, 339–432
  using M, 350–360
inequality
  data-processing, 72, 610
  information, 72
  Kraft, see Kraft inequality
inference, 82
information, 66
  algorithmic, 186
  algorithmic conditional mutual, 248
  algorithmic conditional mutual using K(c), 252
  algorithmic mutual, 189–191, 248, 247–254, 289, 608–613, 661, 714, 717
  algorithmic mutual using Kc, 252
  conservation inequalities, see inequality, information conservation

  dispersal of, 124
  in x about y, 246, 248
  maximal mutual, 653
  nongrowth, 610
  probabilistic mutual, 71, 69–72
  symmetry of probabilistic, see symmetry of probabilistic information

information distance, 602, 642, 641–674
  admissible, 646
  normalized, 660–674
information inequality, see inequality, information
information theory, 48, 65–90, 94, 186, 187
  algorithmic, 186–193, 199, 244–254, 602–629
instance complexity, see complexity, instance
instance complexity conjecture, 572
invariance theorems, see theorem, invariance
irreversible computation, 629, 630
Israeli, A., 515
Itai, A., 380, 434

Jurgenson, H., 228
Jaffe, J., 490
Jagota, A.K., 334
Jakoby, A., 334, 587
Janson, S., 458, 480, 527
Jaynes, E.T., 398, 437
Jenkins, S., 718
Jiang, T., 123, 397, 434, 458, 485, 489, 490, 500, 501, 513, 527
Jockusch, C.G., 183
Johnson, D.S., 92, 434
Johnson, Dr. Samuel, 259, 331
Jones, J.P., 239, 240, 256
Joseph, D., 575
Juedes, D.W., 536, 546, 598, 600
Jurdzinski, T., 528
Jurka, J., 437

König's infinity lemma, 133
Kahn, J., 526
Kajan, L., 718
Kalyanasundaram, B., 459
Kamae, T., 59, 181, 213
Kanaya, J., 629, 715
Kannan, R., 509, 529
Kanovich, M.I., 184
Karp, R.M., 560
Kasami, T., 528
Katseff, H.P., 133, 159, 161, 194
Kauffman, S.A., 719
Kearney, P., 672, 717
Kearns, M., 434
Kemeny, J.G., 431
Keogh, E.J., 667, 719
Kertesz-Farkas, A., 718
Keuzenkamp, H.A., 437
Keyes, R.W., 716
Keynes, J.M., 56
Khintchin, A.I., 65, 88
Khoussainov, B., 235
Kim, C.E., 500
Kim, J., 526
Kirchherr, W.W., 469
Kirk, S.R., 718
Klauck, H., 519–521, 529
Kleene, S.C., 35, 41, 42
Knopp, K., 90
Knuth, D.E., xiii, 16, 17, 92, 93, 194, 480, 485, 486, 527
Ko, K.-I., 548, 575–577, 596, 598
Kobayashi, K., 334, 514, 586
Kocsor, A., 718
Kolmogorov Axioms, 18
Kolmogorov random graphs, 461–469
Kolmogorov structure function, see structure function
Kolmogorov, A.N., 18, 49, 50, 52, 53, 55, 56, 65, 66, 73, 92–99, 103, 104, 110, 125, 142, 156, 158, 173, 192–195, 221, 256, 290, 327, 331, 332, 336, 424, 438, 439, 531, 596, 599, 721
Komlós, J., 458, 501
Koppel, M., 430, 439
Koucký, M., 521, 570, 597
Kraft inequality, 76, 76–78, 82, 88–90, 94, 199, 204, 211, 222, 223, 229, 247, 274, 647, 672
Kraft, L.G., 76, 94
Kranakis, E., 475, 476, 526, 527
Kraskov, A., 718


Krasnogor, N., 718
Kraus, B., 722
Kreinovich, V., 165
Krizanc, D., 475, 476, 527
Kučera, A., 236, 256
Kullback–Leibler divergence, 72, 283, 326, 352
Kumar, V., 671
Kummer, M., 123, 181, 560, 576, 577, 598
Kurtz, S.A., 597
Kushilevitz, E., 529
Kwong, S., 672, 717

Löfgren, L., 193
López-Ortiz, A., 508
Lambalgen, M. van, 93, 195, 225, 229, 231, 233, 335, 526
Landauer, R., 629, 633, 715, 716
Lange, K.-J., 641, 716
language compression, 550, 550–562
  optimal, 551, 558–559
  P-rankable, 560
  probabilistic, 555–558
  ranking, 551, 558–559
  with respect to C^p, 555–558
  with respect to CD^p, 551–555
Laplace, P.S., 20, 49, 60, 65, 259, 323, 331, 432
Laplante, S., 529, 559, 597, 709, 722
Larjo, A., 719
Lathrop, J.I., 600
law
  0-1, 530
  complete probabilities, 19
  excluded gambling strategy, 52, 53
  high-low Kolmogorov complexity, 530
  infinite recurrence, 58, 156, 330
  inverse weak law of large numbers, 64
  iterated logarithm, 54, 55, 58, 65, 93, 147, 156, 234, 238, 287, 329, 330
  of large numbers, 55, 63, 147, 163, 284, 287
  of probability, 284, 287, 318
  of randomness, 55, 58, 147
  of succession, 65, 323, 356
  slow growth, 594, 596
  strong law of large numbers, 64, 330
  weak law of large numbers, 59, 63, 64
learning
  log-DNF, 376–378
  log-DNF formula, 376
  log-decision list, 381
  by enumeration, 364
  decision list, 381
  decision tree, 393–397
  distribution-free, 373
  in the limit, 364
  monotone k-term DNF, 382
  simple DNF, 381
  under M, 379–380
  under m, 376–378
  under computable distributions, 373–382
Lecerf, Y., 715
Lee, S.H., 667, 719
Leeuw, K. de, 185
Leeuwenberg, E.L.J., 437
lemma
  Barzdins's, see Barzdins's lemma
  coding, 553
  honesty, 562, 564
  jamming, 503
  KC-regularity, 491
  Lovász's local, 460, 526
  pumping, 490
  switching, 521
Lengler, J., 483
Leung-Yan-Cheong, S.K., 95, 214, 216, 255
Levin, L.A., 92, 132, 163, 185, 191–195, 206, 221, 245, 251, 255, 257, 304, 306, 324, 325, 328, 331, 332, 334–338, 438, 546, 579, 589, 596, 599, 612, 613, 714


Levine, R.Y., 641, 716
Levy, M.A., 567
Lévy, P., 335, 337
Lewis II, P., 550
Li, L., 597
Li, M., 123, 195, 294, 334, 375, 380–382, 397, 400, 433–436, 457, 458, 468, 483–485, 489, 490, 495, 500, 501, 508, 509, 515, 525–527, 529, 587, 600, 626, 641, 658, 659, 672, 674, 716, 717, 720
Li, X., 672, 718
lie-game, 483
Likharev, K., 716
Lindstrom, B., 455
Lipton, R.J., 560
List, B., 625
literal, 376
Littlestone, N., 434
Littlewood, J.E., 16, 93
Liu, W., 719
Lloyd, S., 721
logical depth, 591, 589–596
  (d, b)-deep, 591, 593
  machine-independent, 595
  of Ω, 594
  shallow, 593
  stability, 595
Lonardi, S., 667, 719
longest common subsequence, 486, 486–489
Longpré, L., 165, 509, 510, 549, 570, 596, 598, 600
López-Ortiz, A., 501
Loreto, V., 718
Lorys, K., 528
loss
  logarithmic, 358, 422
  measure, 358
Loui, M.C., 513, 514
Lovász, L., 434, 460
Loveland, D.W., 122, 130, 132, 156, 160, 162, 193, 237, 493, 496, 596
Low, L.H., 436
lower bounds, 469–530
  k-PDA, 500
  k-head automaton, 499
  k-pass DFA, 500
  Boolean matrix rank, 445
  circuit depth, 521–524
  converting NFA to DFA, 447
  for Turing machines, 502–514
  in formal language theory, 490–497
  index size for text, 501
  multihead automata, 499–501
  one-tape Turing machine, 442
  online CFL recognition, 497–499
  parallel computation, 514–516
  Ramsey theory, 454–455, 460, 490, 515, 526
  routing in networks, 472–473
  singly vs doubly linked list, 501
  string-matching, 500
  sweeping two-way DFA, 501
Luby, M., 588
Luccio, F.L., 476, 527
Lucier, B., 485
Lucretius, 340
Luginbuhl, D.R., 514
Luo, Z.Q., 458
Lutz, J.H., 133, 536, 546, 570, 598, 600

Müller, M., 722
Méré, Chevalier de, 23
Ma, B., 672, 717, 718
Maass, W., 508–511, 528, 529
machine
  k-pushdown store, 508
  k-queue, 509
  k-stack, 508, 512
  k-tape, 512
  deterministic Turing, 28, 350
  Kolmogorov–Uspensky, 599
  monotone, 298, 302, 306, 333, 334, 336
  nondeterministic 1-tape Turing, 509
  nondeterministic Turing, 37


  offline Turing, 510
  one-way Turing, 502–514
  online Turing, 502, 502–514
  oracle Turing, 38
  PRAM, 514
  prefix, 201, 333, 334
  probabilistic Turing, 184, 184, 447, 554
  quantum Turing, 697, 700, 700–701
  reference monotone, 302
  reference prefix, 202
  reference quantum Turing, 704
  reference Turing, 106, 107
  Turing, 27, 24–31, 37, 40, 90–92, 442
  two-dimensional tape, 512
  universal Turing, 30, 90, 130, 192
macro state, 678, 681
Mahaney, S., 567, 569
Mairson, H.G., 530
majorant of complexity, see complexity, majorant of
malign, see distribution, malign
Mamitsuka, H., 436
Mandelbrot, B., 132
Manzini, G., 665, 719
Margolus, N., 633
Markov process, 20, 350
Markov's inequality, 141, 282, 286, 292, 356
Markov, A.A., 42, 192
Markowsky, G., 433
Martin-Löf, P., 54, 55, 98, 121, 133, 143, 156, 159, 160, 162–165, 194, 219, 221, 337
martingale, 320, 335, 337, 598
Marxen, H., 46
matching, 644
Matijasevich, Yu.V., 180, 239, 240, 256
Matthew effect, 96
maximum likelihood estimator, see estimator, maximum likelihood
Maxwell's demon, 688–691
Maxwell, J.C., 688
Mayordomo, E., 133, 600
McAleer, M., 437
McAllister, E., 437
McCarthy, J., 332
McGorry, P.D., 436
McKenzie, D.P., 436
McKenzie, P., 641, 716
McKinnon, B., 718
McMillan, B., 88
measure, 19, 21, 348
  computable, 36, 264, 300, 324
  computable continuous, 294–302, 305, 328, 359
  computable discrete, 265–290
  conditionally bounded away from zero, 359, 360
  constructive, 194
  continuous, 21
  countable, 21
  defective, 331
  discrete, 21
  discrete computable, 245
  discrete lower semicomputable, 245
  Laplace, 356
  Lebesgue, see measure, uniform
  of random sequences, 153, 229
  probability, 263
  recursive, see measure, computable
  resource-bounded, 598
  simple, 379
  Solomonoff, 303, 324, 325
  uniform, 21, 264, 300, 379, 541, 598
  universal lower semicomputable, 379
Meertens, L.G.L.T., 17, 92
Mehlhorn, K., 530
Melkebeek, D. van, 586, 588, 589
Merkle, R.C., 634, 716
Merkle, W., 59, 157, 162, 183, 231, 236
metric, 646
  Minkowski, 672


  similarity, 661, 661–663
Meyer auf der Heide, F., 515
Meyer, A.R., 132, 493
micro state, 679, 679
Mihailovic, N., 183, 236
Miller, G.A., 670
Miller, J.S., 157, 215, 230, 231, 235, 237, 238, 256
Mills, W.H., 455
Milosavljevic, A., 437
Miltersen, P.B., 334, 559, 586
minimal randomness deficiency function, 404
minimum description length
  premature termination, 416
minimum description length estimator, see estimator, minimum description length
Minsky, M., 31, 92, 97, 332, 333
Mises, R. von, 20, 50, 51, 53, 55–57, 60, 93–95, 97, 141, 143, 155, 158, 163, 165, 319, 337, 432
Mises–Wald–Church random sequence, see sequence, Mises–Wald–Church random
mistake bounds, see prediction, mistake bounds
Miyano, S., 500
Mocas, S., 549
model
  determinacy, 423
  finite set, 402
  not in model class, 418
  probability, 428, 428–429
  total recursive function, 429, 429–430, 439
model fitness, see estimator, best-fit
monomial, 376, 381
monotone k-term DNF, 382
Mooers, C., 333
Mora, C.E., 722
Moran, S., 475, 515
Moser, L., 455
Motwani, R., 460
Muchnik, A.A., 44, 158
Muchnik, An.A., 125, 158, 194, 215, 256, 290, 367–369, 426, 626, 659, 660, 714, 717
multinomial coefficient, 10, 400
multiplication rule, 19
multiplicative domination, 266
Munro, I., 476, 527
Muramatsu, J., 629, 715
Muse, S.V., 718
mutual information, see information, [types of] mutual
Myung, I.J., 436

Naik, A., 597, 598
Natarajan, B.K., 434, 550
Navarro, G., 489
NCD, see distance, normalized compression
Nelson, C.G., 500
Neumann, J. von, xiii, 50, 57, 93, 444, 525, 629, 715
Newman, I., 526
Newman-Wolfe, R., 512, 525
Newton, I., vii, 9, 23, 341, 431
NFA, see finite automaton
NGD, see distance, normalized web
Nicholson, P., 457
Nies, A., 157, 194, 215, 230, 232, 238, 256
Nisan, N., 529
normalization of semimeasure, see semimeasure, normalization
NP-complete, 39, 563, 564
NP-hard, 39
null set, 147
  Π^0_n-, 164
  µ-, 149
  constructive, 149, 151, 152, 163
  total recursive, 163
number
  Ω-like real, 234, 262
  arithmetically random real, 234, 235
  computable, 262


  lower semicomputable, see sequence, lower semicomputable
  lower semicomputable real, 324
  nonrecursive real, 228
  normal, see sequence, normal
  of Wisdom, 228
  prime, 4, 17, 32
  random real, 227, 235, 236, 324
  recursive, see number, computable
  recursive real, see sequence, recursive
  transcendental, 227, 228
number of unlabeled graphs, 465–468
numbering, acceptable, 112
NWD, see distance, normalized web
Nykter, M., 719

O'Connor, M.G., 59
O'Neil, E.J., 447
O'Neil, P.E., 447
Oberschelp, A., 46
Occam algorithm, 372
Occam's razor, vii, 63, 260, 272, 323, 341, 342, 358, 371, 431
occupancy number, 11
Ockham, William of, 63, 260, 341, 343, 347
Odifreddi, P., 41, 42, 45, 92, 182
oracle, 38, 40, 534, 535, 555–557, 562–565, 568, 569
  Baker–Gill–Solovay, 563
order
  partial, 8
  total, 8
order of magnitude symbols, 17
Oresme, N., 431
Orponen, P., 575–577, 597, 598
Ortega, A., 673, 719
Otu, H.H., 718
outcome of experiment, 18, 277, 278, 284, 358
overlap
  maximal, 643–645
  minimal, 653–657
Ozhegov, S.I., 94

P-isomorphic, 563
P-printability, 565
Pérennes, S., 475
Péter, R., 45
Pólya, G., 64
pac-learning, 371, 370–382, 434
  simple, 370, 373–382, 434
Palmer, E.M., 465
Pao, H.K., 718
paradox
  Bertrand, 340
  Richard–Berry, 1, 177
  Russell, 177
Parberry, I., 515
Parekh, R., 382
Parikh, R., 490
partition, 10
Pascal, B., 23
Patashnik, O., 17
Paterson, M., 489
Patrick, J.D., 436
Paturi, R., 512, 525
Paul, W.J., 512, 513, 524, 525, 528
Peano arithmetic, 35
Pearl, J., 434
Pednault, E.P.D., 436
Pelc, A., 483
Pelta, D.A., 718
Penrose, R., 207, 211
Pepys, S., 23
permutation, 8
perpetuum mobile
  of the first kind, 675
  of the second kind, 675
Petersen, H., 501
Peterson, G., 599
Petri, N.V., 185, 546
phase space, 679
phylogeny, 664
Pierce, J.R., 715
Pinto, A., 586
Pintz, J., 458
Pippenger, N., 455, 457, 514, 526
Pitt, L., 382


Pitt, M.A., 436
place-selection rule, 93, 161
  according to Kolmogorov, 56
  according to Kolmogorov–Loveland, 156
  according to Mises–Wald–Church, 53, 156, 162
  according to von Mises, 52, 156
  finite-state, 58
  total recursive function, 161
  Turing machine, 173
Plaxton, C.G., 484
Poland, J., 432
polynomial complexity core, 572
polynomial hierarchy, 40, 570
polynomial many-to-one reduction, see reducibility, polynomial many-to-one
polynomial Turing reduction, see reducibility, polynomial Turing
Pond, S.L.K., 718
Pongor, S., 718
Poonen, P., 484
Popper, K.R., 343, 347
Positselsky, S.Y., 125, 215
Post's problem, 44
Post, E.L., 42–44, 92, 176
Pour-El, M.B., 92
Pratt, V.R., 184, 480
predicate, 29, 533
prediction, 348–370
  mistake bounds, 364–365, 369, 434
  snooping curve, 422–423, 439
  strategy, 422
prediction error, 6, 56, 59, 351–353, 355, 357, 432
  expected using M, 329
prefix-code, see code, prefix
Price, N.D., 719
Price, R., 432
principle
  indifference, 63, 340
  insufficient reason, 340
  maximum entropy, 398, 398–400, 437
  maximum likelihood, 398, 437
  minimum description length, 96, 382, 382–399, 436, 437, 721
  minimum message length, 435
  multiple explanations, 339, 341
  pigeonhole, 441
  simplicity, 63
probabilistic communication complexity, 459
probabilistic method, 441, 450, 453, 454
probability
  a priori, see probability, prior
  algorithmic, 273
  conditional, 19, 70
  function, see function, probability [types]
  inferred, see probability, posterior
  inverse, 60
  joint, 69
  marginal, 69
  posterior, 20, 60, 64, 345, 347
  prior, 20, 60, 65, 343, 345, 347, 349, 398, 399, 432
  prior continuous, 298–302
  uniform, 290
  universal prior, 63, 94, 96, 193, 198, 255, 272, 273, 276, 283, 297, 302, 332–334, 344, 591
probability theory, 18–23, 92
Ptolemy, C., 431
pushdown automaton (PDA), 500, 528

quantum computation, 697–700
quantum interference, 699
quantum Kolmogorov complexity, see complexity, quantum Kolmogorov
quantum state, pure, 696


quantum Turing machine, see machine, quantum Turing
qubit, 697
queue, 509
Quinlan, J.R., 397, 436

Rényi, A., 455, 526
Rabin, M.O., 124, 525, 528, 598
Radó, T., 46
Ragde, P., 516, 526
Raghavan, P., 460
Rajski, C., 672
Ramsey number, 454
Ramsey, S.A., 719
random
  n-random sequence, 237
  2-random sequence, 237
  3-random sequence, 237
  Kolmogorov, 237
  strongly Chaitin, 238
random variable, 22
  (in)dependent, 22
  continuous, 22
  discrete, 22
randomness
  negative, 438
  positive, 438
randomness deficiency, 110, 120, 120–121, 124, 125, 137, 139, 193, 219, 280, 281, 318, 328, 358, 402, 408, 424, 431, 448–451, 461
  for probability models, 428
  for total recursive function models, 429
rate distortion
  algorithmic, 613–624
  probabilistic, 85–86
Ratner, M., 94
Ravela, S., 718
Razborov, A., 529
recursion theory, see computability theory
recursively uniform limit, 128
reducibility
  many-to-one, 43, 176, 184, 570
  one-to-one, 43
  polynomial many-to-one, 39, 568, 598
  polynomial truth-table, 570
  polynomial Turing, 39, 568
  self, 575
  truth-table, 570
  Turing, 43, 176, 182, 183, 232, 570
  weak truth-table, 184
Regan, K.W., 334, 530, 598
regular language, 491
Reimann, J., 59, 157
Reisch, S., 459, 525
Reischuk, R., 334, 512, 513, 528, 587
relation
  n-ary, 7
  binary, 7
  encoding, 73
relative frequency stabilization, 142
resource-bounded Kolmogorov complexity, see complexity, resource-bounded
reversible
  ballistic computer, 632–634, 715
  Boolean gate, 631, 640, 716
  circuit, 631–632, 715
  computation, 629–641, 715
  Turing machine, 637–641, 715
Reznikova, Zh.I., 722
Richards, J.I., 92
Riemann hypothesis, 228
Risi, C., 718
Rissanen, J.J., 89, 95, 382, 390, 391, 435–437, 439
Rivals, E., 717
Rivest, R., 381, 397, 436, 500
Robbins, D., 672
Robinson, R.M., 45
Rockford Research, 333
Rogers, H., Jr., 41–47, 92, 112
Rolim, J.D.P., 597
Romashchenko, A.E., 607, 626, 714, 717
Ronneburger, D., 570, 581, 597


Rooij, S. de, 715
Rosenberg, A., 500
Roth, K.F., 458
routing in networks, 469–476
routing table, 469
Roy, S., 597
Rozenberg, G., 490
Rtanamahatana, C.A., 667, 719
Rubinstein, R., 597
Rudich, S., 529, 561
rule of succession, see law, of succession
run of zeros, 118
Russell, B., 177, 341
Russo, D., 597
Ryabko, B.Ya., 132, 720, 722
Ryabko, D., 433

Sakoda, W.J., 501
sample space, 6, 18
  continuous, 18, 20
  discrete, 18
Sanderson, M.J., 718
Sankoff, D., 489
Santos, C.C., 719
SAT, 39, 560, 563, 564, 568, 573, 575, 578
satisfiable, 39
Savitch, W.J., 569
Sayood, K., 718
Schöning, U., 459, 575–577, 598, 600
Schack, R., 625, 714, 722
Schaffer, R., 527
Schay, G., 525
Scheihing, R., 489
Schindelhauer, C., 334, 587
Schmidhuber, J., 599
Schmidt, W.M., 458
Schmidt-Goettsch, K., 46
Schnitger, G., 459, 510, 511, 525
Schnorr's thesis, 164
Schnorr, C.P., 59, 115, 162–164, 194, 206, 221, 233, 235, 255, 256, 335–337
Schuler, R., 588
Schumacher, J., 720
Schweitzer, P., 460
Sedgewick, R., 527
Seiferas, J.I., 500, 512, 513, 525, 527
Seker, A., 718
self-delimiting code, see code, self-delimiting
Semenov, A.L., 158, 194, 256
semimeasure, 264, 262–265, 331, 334
  computable, 264
  computable continuous, 328
  conditional, 350
  discrete, 265, 265–290
  extension, 326
  lower semicomputable, 264
  lower semicomputable continuous, 294–302
  lower semicomputable discrete, 265–290
  maximal, see semimeasure, universal
  maximal lower semicomputable, see semimeasure, universal lower semicomputable
  normalization, 303, 324, 325, 333, 335
  reference universal lower semicomputable continuous, 297
  relative enumerable, 289
  Solomonoff normalization, 303, 302–304, 324–326
  universal, 266, 294
  universal lower semicomputable, 255, 324, 334
  universal lower semicomputable conditional, 275
  universal lower semicomputable continuous, 294, 294–298, 302, 325, 328
  universal lower semicomputable discrete, 267, 268, 273, 276, 287, 386
  universal relative enumerable, 289
sequence


  Δ^0_2-definable, 225, 232, 232, 233
  Π^0_n-random, 164
  ∞-distributed, see sequence, normal
  µ-random, see sequence, random
  k-distributed, 58
  Bernoulli, 59, 142, 163, 357
  Champernowne, 54, 59, 93, 165
  characteristic, see characteristic sequence
  computable, 262, 324
  DNA, 589
  effectively unpredictable, 238
  finite, see string
  hyperarithmetically random, 164
  incompressible, 226
  infinite, see sequence
  Kolmogorov–Loveland random, 157
  Kolmogorov–Loveland stochastic, 157, 158
  lower semicomputable, 161
  Martin-Löf random, see sequence, random
  Mises–Wald–Church random, 53, 55, 58, 156, 238, 337, 547
  Mises–Wald–Church stochastic, 156, 157, 161, 162
  nonrecursive, 231
  normal, 58, 165, 238
  pararecursive, 161
  pseudorandom, 93
  random, 55, 98, 149, 143–165, 183, 194, 198, 219, 221, 222, 224, 225, 227–229, 233, 318, 328, 335
  recursive, 47, 113, 123, 131, 161, 231, 262, 289, 327, 540
  Schnorr random, 163
  Solovay random, 160, 221, 233
  strongly Chaitin random, 255
  typical, 54
  universal recursive, 289
  von Mises random, see collective
  weakly Chaitin random, 233, 255
set
  C[f(n), t(n), s(n)], 542, 542–550, 562–570
  Q-immune, 545
  m-complete, 176, 177, 184
  r-complete (r = 1, m, T), 44
  r-hard (r = 1, m, T), 44
  CD[f(n), t(n), s(n)], 544
  CU[f(n), t(n), s(n)], 546
  arithmetic, 234
  Borel, 164, 234
  complete, 245
  continuous, 8
  countable, 8
  cylinder, see cylinder
  diagonal halting, 34, 36, 44, 88, 182
  effectively immune, 175, 182
  effectively simple, 175, 182
  empty, 7
  enumerable without repetitions, 43
  exponentially low, 564
  fractal, 132
  halting, 34, 44, 176, 177, 180, 184
  hyperarithmetic, 164
  hyperimmune, 185
  immune, 43, 182, 184
  intuitionistic measure zero, 164
  Kolmogorov, 182
  meager, 120, 179, 193
  P-printable, 565, 581
  RAND, 175, 182
  recursive, 32
  recursively enumerable, 32, 178–180
  recursively enumerable-complete, 563
  relative enumerable, 289
  semirecursive, 183


  simple, 43, 44, 45, 174, 176, 177, 181, 184
  sparse, 194, 545, 556, 559, 565, 566, 568
  tally, 566
  Turing complete, 182, 184
  weak truth-table complete, 184
Shakespeare, W., 113
Shallit, J., 530
Shaltiel, R., 597
Shamir, A., 124, 561
Shannon, C.E., 48, 65, 66, 73, 77, 87, 90, 91, 94, 95, 98, 99, 101, 192, 199, 332, 720
shape following, see function, following shape
Shell, D.L., 480, 527
Shen, A.K., 125, 157, 158, 160, 192, 194, 206, 231, 256, 288, 337, 424, 429, 439, 530, 626, 660, 714, 717
Shen, S.-Y., 629, 715
Sherman, A.T., 641, 716
Shiryaev, A.N., 97, 98
Shmulevich, I., 719
shortest common supersequence, 486, 489
shortest program, 110, 118, 202, 213, 252, 253, 277, 287, 305
  prefix complexity of, 213
similarity metric, see metric, similarity
Simon, J., 512, 525
Simons, G., 57
simple DNF, 381
Singh, B.S., 436
singly linked list, 501
Sipser, M., 133, 159, 161, 193, 194, 501, 556, 560, 596, 597
Sivakumar, D., 598
Skiena, S.S., 93
Slaman, T.A., 236, 256
Sleator, D., 483
Slotine, J.J., 721
Smith, C., 434
Smoluchowski, M. von, 57
Snell, J.L., 337
Snir, M., 164
snooping curve, see prediction, snooping curve
Sobolev, S.L., 97
Solomonoff measure, see measure, Solomonoff
Solomonoff normalization, see semimeasure, Solomonoff normalization
Solomonoff's induction theory, 348–370
Solomonoff's inductive formula, 358, 432
Solomonoff, R.J., 59, 63, 94, 96–99, 104, 192–194, 199, 255, 303, 304, 325–327, 332, 333, 335, 347, 348, 352, 357, 367, 432, 433, 599
Solovay, R.M., 99, 132, 160, 181, 184, 203, 215, 217, 220, 221, 229, 232–235, 237, 238, 241, 255–257, 289, 290, 325, 335
sophistication, see model, total recursive function, 721
sorting
  Bubblesort, 480, 484
  Dobosiewicz sort, 485–486
  Heapsort, 476–480, 527
  Insertion sort, 480, 484
  Queuesort, 484
  Quicksort, 290–294, 485
  Selection sort, 484
  Shakersort, 485–486
  Shellsort, 480–486
  Stacksort, 484
source sequence, 74
source word, 13, 73
Souto, A., 586
Sow, D.M., 629, 715
space–energy tradeoff, 641
Spencer, J.H., 94, 453, 457, 463, 483, 526
Srivastava, 671
Stögbauer, H., 718


stack, 460, 484, 508, 512
Staiger, L., 132
Stanat, D., 490
state space, 679
statistic
  algorithmic minimal sufficient, 403, 403–430
  algorithmic sufficient, 108, 403, 402–404, 406
  probabilistic, 83
  probabilistic minimal sufficient, 85
  probabilistic sufficient, 83–85
statistical inference, 82
statistical properties
  of graphs, 464–465
  of sequences, 165–174, 195
statistics
  algorithmic, 401–431
  nonprobabilistic, see statistics, algorithmic
  probabilistic, 82–85
Stearns, P., 453
Stearns, R., 508, 528, 532, 550
Stephan, F., 157, 232, 238
Steurer, D., 483
Stimm, H., 59
Stirling's formula, 17, 67, 140, 188, 468
Stirling, J., 17
stochastic complexity, 435
stochastic source, 67
Stockmeyer, L., 588
string, 12
  C-random, 220
  K-random, 220
  δ-random, 124, 173
  c-incompressible, 116, 140
  c-random, 140
  n-string, see n-string
  absolutely nonrandom, 125
  binary, 12–15
  compressibility of, 116
  cyclic shift, 162, 213
  empty, 12
  incompressibility of substring, 117
  incompressible, 123, 133, 142, 212, 218
  incompressible w.r.t. K, 212, 219
  infinite, see sequence
  length of, 13
  random, 97, 121, 140, 133–143, 194, 218–220
  reverse of, 13, 109
  self-delimiting, 13
  strongly typical, 420
  typical, 125, 402, 403–405, 419
structure function, 405, 401–431, 519, 616
  bumps, 426
Sudborough, H., 500
Suel, T., 484
sufficiency line, 406, 407, 419
sufficient statistic, see statistic, [types of] sufficient
supermartingale, 320
  universal lower semicomputable, 320, 319–321
support vector machine (SVM), 669
Svetlova, N.D., 94
symmetry of information, 447
  C, 189–192, 195, 567
  K, 249, 244–254
  Kc, 254
  algorithmic, 189, 190, 190, 195, 244–254, 257
  algorithmic resource-bounded, 548
  conditional K, 252
  for Kµ, 327
  probabilistic, 71, 327
Szemerédi, E., 458, 501, 509, 511, 529
Szilard engine, 688
Szilard, L., 688, 720
Tahi, F., 717
Tan, P.N., 671
Tang, S., 570
Tapp, A., 641, 716
Tarjan, R.E., 483, 485
Tenenbaum, J.B., 722
Terwijn, S.A., 238


test, 135, 136, 141, 194, 335
  P, 135
  Bernoulli, 142, 143
  Bose–Einstein distribution, 282
  Cauchy condensation, 90
  confidence interval of, 134
  critical region of, 134, 137
  Fermi–Dirac distribution, 281–282
  in statistics, 93, 134, 164
  integral, 223, 311
  Levin's, 231
  lower semicomputable, 138, 150
  Martin-Löf, 135
  martingale, 317
  pseudo randomness, 49
  ptime pseudorandom, 541
  randomness, 318
  recursive, 138
  reference universal for uniform distribution, 140
  sequential, 147, 194, 541
  sequential µ, 147
  sequential Bernoulli, 162, 163
  sequential for uniform distribution, 148
  sequential Martin-Löf, 147
  sequential martingale, 317
  sequential ptime, 541
  significance level of, 134
  Solovay randomness, 233
  statistical, 284
  sum, 278
  testing for randomness, 134
  universal, 136, 138, 194, 219, 220, 280, 283, 285, 334
  universal Bernoulli, 142
  universal for arbitrary computable P, 219
  universal for uniform distribution, 139, 219, 281
  universal integral, 223
  universal martingale, 317, 316–320
  universal sequential, 149, 194, 219, 223, 236, 256, 329
  universal sequential Bernoulli, 162
  universal sequential for the uniform measure, 151
  universal sequential martingale, 317
  universal sum, 278, 278–284
  universal uniform, 163
Thackeray, W.M., 319
theorem
  s-m-n, 41, 42
  basic of r.e. sets, 42
  binomial, 9
  Blum speed-up, 550
  Chinese remainder, 124
  coding, 273, 273, 277, 278, 287, 289, 301, 305–307, 334, 587, 588
  coding, continuous version, 305
  conditional coding, 275
  conversion, 643, 659
  entropy uniqueness, 87
  enumeration, 31, 42
  equality stochastic entropy and expected complexity, 187
  Fermat's last, 179, 228
  Fine, 141
  hierarchy, 47
  incompleteness, 3, 34, 35, 177, 195, 612, 714
  incompressibility, 117
  incomputability, 127
  invariance, 96, 97, 99, 104, 105, 192, 198, 202, 252
  invariance for K, 193
  invariance instance complexity, 571
  invariance quantum Kolmogorov complexity, 703
  invariance uniform complexity, 130
  invariance, for C^{t,s}, 532
  Körner–Csiszár–Marton, 654


  Kamae, 125, 181, 213
  KC-characterization, 492
  Liouville, 681
  Matijasevich, 239, 240, 256
  McMillan–Kraft, 88, 205
  Muchnik, 521, 654, 659, 660, 717
  Myhill–Nerode, 491
  noiseless coding, 77, 79, 80, 95, 199, 278, 280, 605
  Occam's razor, 372, 373, 378, 380, 381, 434
  prime number, 17, 552
  recursion, 133
  Savitch, 569
  Schnorr, 221, 221, 224, 225, 227–230, 233, 255
  second recursion, 46
  Slepian–Wolf, 654
  supermartingale convergence, 321
  symmetry of information (C-version), 190, 192
  symmetry of information (K-version), 247, 249
  time-bounded coding, 587, 588
  van Lambalgen, 231
theory
  axiomatizable, 34, 175–177, 186
  consistent, 34
  decidable, 34
  sound, 34, 175–177, 186
thermodynamics
  first law, 674, 675
  of computation, 629–641
  second law, 674
  statistical, 679
Thomas, J.A., 90, 94, 95, 324, 338, 366, 433, 439
Thurston, W., 483
Tikhomirov, V.M., 97
time–energy tradeoff, 658
Todt, G., 46
Toffoli, T., 715, 716
Torenvliet, L., 569
tournament, 452, 452–454
  ranking, 457
  transitive, 452
Trevisan, L., 597
Tromp, J.T., 5, 227, 234, 255, 288, 397, 421, 425, 426, 434, 468, 483, 525, 526, 577, 641, 714, 716
Turán, G., 511
Turing
  degree, 183
  machine, see machine, Turing
  quantum machine, see machine, quantum Turing
  thesis, 24
Turing, A.M., 24, 33, 42, 92, 192
two-part description, 403
Tyszkiewicz, J., 296, 530
Ullman, J.D., 527
undecidable statement, 3, 35, 175–177
uniform limit, 130
upper bounds
  carry sequence, 444–445
  combinatorics, 451–460
  covering families, 457
  routing in networks, 470–472
  tournaments, 452
Uspensky, V.A., 97, 98, 125, 158, 194, 206, 221, 256, 290, 336, 337, 439, 600
USSR, research in former, 193
Ustinov, M.A., 427, 660
Valiant learning model, 370–382
Valiant, L.G., 94, 370, 382, 434, 459, 559, 561
Valiente, G., 665, 719
van Dam, W., 709, 722
van der Helm, P.A., 437
van der Waerden number, 460
van Melkebeek, D., 570, 597
Vapnik, V.N., 434
variation, 9
Varré, J.S., 717
Vazirani, U., 434, 530, 709
Vazirani, V., 530, 559


Vereshchagin, N.K., 125, 194, 256, 290, 421–430, 439, 519–521, 529, 626–628, 659, 660, 714, 715, 717
Vert, J.P., 718
Ville, J., 54, 58, 93, 335, 337
Vinodchandran, N.V., 586, 588, 589
Vitányi, P.M.B., 17, 92, 123, 195, 294, 334, 375, 380–382, 400, 421–431, 433–437, 439, 457, 458, 468, 469, 473, 484, 485, 489, 495, 501, 508, 509, 513, 519–521, 525–527, 529, 587, 588, 600, 626–628, 641, 658, 659, 670, 672, 673, 711, 714–720, 722
Vivant, E., 718
Vovk, V.G., 264, 329, 330, 397, 433, 436
Vyugin, M.V., 659, 660, 717
Vyugin, V.V., 193, 221, 332, 336, 423, 429, 439, 526
Wagner, K.W., 570
Wald, A., 53, 54, 93, 165
Wallace, C.S., 382, 435, 436
Wallman, H., 132
Wang, J., 598
Wang, W.G., 600
Wang, Y., 235, 596
Warmuth, M., 380, 434, 475
Watanabe, O., 206, 422, 549, 569, 570, 575–577, 597, 598, 600
Wechsung, G., 599
Wegman, M., 433
Wehner, S., 719
Wei, L., 667, 719
Weiss, B., 59
Weiss, S., 490
Whitehead, A.N., 177
Wiering, M., 599
Wigderson, A., 515, 516, 526, 570, 597
Wiles, A., 179
Williams, J.W.J., 476, 527
Willis, D.G., 327, 332
Wolf, R. de, 718
Wong, W., 719
Wood, D., 483
word
  finite, see string
  infinite, see sequence
WordNet, 670, 719
Yamanishi, K., 397, 436
Yang, E.H., 629, 715
Yang, Q., 490
Yao, A.C.C., 480, 500, 527, 529
Yee, C.N., 436
Yesha, Y., 500, 515, 525
Yianilos, P.N., 672
Yli-Harja, O., 719
Young, P., 575
Yu, B., 436
Yu, L., 215, 230, 231, 237
Yu, S., 495
Zambella, D., 232
Zator Company, 96, 333
Zeitman, R.Z., 45
Zhang, H., 672, 717
Zhang, L., 483, 525
Zhang, X., 719
Zhao, J., 599
Zhu, X., 720
Zimand, M., 530
Ziv, J., 715
Zurek, W.H., 625, 626, 633, 658, 659, 714, 716, 717, 720, 721
Zvonkin, A.K., 92, 132, 185, 191, 193–195, 255, 324, 331, 332, 334, 337, 546, 596
Zvozil, K., 722

TEXTS IN COMPUTER SCIENCE (continued from p. ii)

Kizza, Ethical and Social Issues in the Information Age, Second Edition

Kozen, Automata and Computability

Kozen, Theory of Computation

Li and Vitányi, An Introduction to Kolmogorov Complexity and Its Applications, Third Edition

Merritt and Stix, Migrating from Pascal to C++

Munakata, Fundamentals of the New Artificial Intelligence: Neural, Evolutionary, Fuzzy and More, Second Edition

Nerode and Shore, Logic for Applications, Second Edition

Pearce, Programming and Meta-Programming in Scheme

Revesz, Introduction to Constraint Databases

Schneider, On Concurrent Programming

Skiena and Revilla, Programming Challenges: The Programming Contest Training Manual

Smith, A Recursive Introduction to the Theory of Computation

Socher-Ambrosius and Johann, Deduction Systems

Stirling, Modal and Temporal Properties of Processes

Zeigler, Objects and Systems