Péter Gács (Hungarian pronunciation: ['pe:ter 'ga:tʃ]; born May 9, 1947), professionally also known as Peter Gacs, is a Hungarian-American mathematician and computer scientist, professor, and an external member of the Hungarian Academy of Sciences. He is well known for his work in reliable computation, randomness in computing, algorithmic complexity, algorithmic probability, and information theory.
Gács has made contributions to many fields of computer science. Gács and László Lovász first brought the ellipsoid method to the attention of the international community in August 1979 by publishing proofs and some improvements of it.[3][4][5] Gács also contributed to the Sipser–Lautemann theorem.[6] His main research focus has centered on cellular automata and Kolmogorov complexity.
Work on cellular automata
His most important contribution in the domain of cellular automata, besides the GKL rule (Gács–Kurdyumov–Levin rule), is the construction of a reliable one-dimensional cellular automaton, which provided a counterexample to the positive rates conjecture.[7] The construction he offered is multi-scale and complex.[8] Later, the same technique was used for the construction of aperiodic tiling sets.[9]
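As an illustration, one common formulation of the GKL rule found in the density-classification literature can be sketched as follows (the neighbourhood offsets and directions vary between presentations, so this is a sketch of the commonly cited variant, not necessarily Gács's original formulation):

```python
def gkl_step(s):
    """One synchronous update of the GKL cellular automaton on a cyclic
    configuration s (a list of 0s and 1s).  In this common formulation,
    a cell in state 0 takes the majority vote of itself and the cells
    1 and 3 steps to its left; a cell in state 1 votes with the cells
    1 and 3 steps to its right."""
    n = len(s)
    nxt = []
    for i in range(n):
        if s[i] == 0:
            votes = s[i] + s[(i - 1) % n] + s[(i - 3) % n]
        else:
            votes = s[i] + s[(i + 1) % n] + s[(i + 3) % n]
        nxt.append(1 if votes >= 2 else 0)
    return nxt
```

Iterating `gkl_step` tends to drive most configurations toward the all-0 or all-1 fixed point matching the initial majority, which is the density-classification behaviour the rule is known for (it succeeds on most, though not all, initial configurations).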
Work on algorithmic information theory and Kolmogorov complexity
Gács authored several important papers on algorithmic information theory and Kolmogorov complexity. Together with Leonid A. Levin, he established basic properties of prefix complexity, including the formula for the complexity of pairs[10] and results on randomness deficiencies, among them a result rediscovered later and now known as the ample excess lemma.[11][12] He showed that the correspondence between complexity and a priori probability that holds for prefix complexity no longer holds for monotone complexity and continuous a priori probability.[13][14] In the related theory of algorithmic randomness, he proved that every sequence is Turing-reducible to a random one (a result now known as the Gács–Kučera theorem, as it was proved independently by Antonín Kučera).[14] Later, with coauthors, he introduced the notion of information distance and proved its connection with conditional complexity.[15][14]
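In standard notation, with K denoting prefix complexity, the pair-complexity formula and the information-distance result mentioned above are usually stated as follows (additive constants and conditioning conventions vary between presentations):

```latex
% Gács–Levin formula for the prefix complexity of a pair:
K(x, y) = K(x) + K\bigl(y \mid x, K(x)\bigr) + O(1)

% Information distance (Bennett, Gács, Li, Vitányi, Zurek),
% up to additive logarithmic terms:
E(x, y) = \max\{\, K(x \mid y),\; K(y \mid x) \,\}
```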
He was a pioneer of algorithmic statistics,[16] introduced one of the quantum versions of algorithmic complexity,[17] and studied the properties of algorithmic randomness for general spaces[18][19] and general classes of measures.[20] Some of these results are covered in his surveys of algorithmic information theory.[21][22] He also proved results on the boundary between classical and algorithmic information theory, including the seminal example (with János Körner) showing the difference between common information and mutual information.[23]
Together with Rudolf Ahlswede and Körner, he proved the blowing-up lemma.[24]
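Informally, the blowing-up lemma says that any set of sequences that is not exponentially small becomes overwhelmingly probable after a small Hamming-ball enlargement. One common form (quantifiers differ across sources; this is a hedged sketch): for every ε > 0 there exist δ > 0 and ℓ_n = o(n) such that for every product measure P^n on X^n,

```latex
% Blowing-up lemma (one common informal form):
P^n(A) \ge 2^{-n\delta}
\quad\Longrightarrow\quad
P^n\bigl(\Gamma_{\ell_n}(A)\bigr) \ge 1 - \varepsilon,
\qquad
\Gamma_{\ell}(A) = \{\, y : d_H(y, A) \le \ell \,\}
```

where d_H denotes Hamming distance.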
^ Peter Gacs. On the symmetry of algorithmic information. Doklady Akademii Nauk SSSR, 218(6):1265–1267, 1974. In Russian.
^ Peter Gacs. Exact expressions for some randomness tests. Zeitschrift für Mathematische Logik und Grundlagen der Mathematik, 26:385–394, 1980.
^ Downey, Rodney G., and Denis R. Hirschfeldt. Algorithmic randomness and complexity. Springer, 2010
^ Peter Gacs. On the relation between descriptional complexity and algorithmic probability. Theoretical Computer Science, 22:71–93, 1983. Short version in Proc. 22nd IEEE FOCS (1981), 296–303.
^ Li, Ming, and Paul Vitányi. An Introduction to Kolmogorov Complexity and Its Applications. 3rd ed. New York: Springer, 2008.
^ Charles H. Bennett, Peter Gacs, Ming Li, Paul M. B. Vitanyi, and Wojciech Zurek. Information distance. IEEE Transactions on Information Theory, 44(4):1407–1423, 1998. (Preliminary version appeared in STOC'97, arXiv:1006.3520.) According to Google Scholar, this paper had been cited 687 times as of April 2021.
^ Peter Gacs, John Tromp, and Paul M. B. Vitanyi. Algorithmic statistics. IEEE Transactions on Information Theory, 47:2443–2463, 2001. arXiv:math/0006233[math.PR]. Short version with similar title in Algorithmic Learning Theory, LNCS 1968/2000.
^ Aditi Dhagat, Peter Gacs, and Peter Winkler. On playing "twenty questions" with a liar. In Proceedings of the third annual ACM-SIAM symposium on Discrete algorithms, SODA’92, pages 16–22, Philadelphia, PA, USA, 1992. Society for Industrial and Applied Mathematics.
^ Peter Gacs. Uniform test of algorithmic randomness over a general space. Theoretical Computer Science, 341(1-3):91–137, 2005.
^ Peter Gacs, Mathieu Hoyrup, and Cristobal Rojas. Randomness on computable probability spaces – a dynamical point of view. Theory of Computing Systems, 48:465–485, 2011. doi:10.1007/s00224-010-9263-x, arXiv:0902.1939. Also appeared in STACS 2009.
^ Laurent Bienvenu, Peter Gacs, Mathieu Hoyrup, Cristobal Rojas, and Alexander Shen. Randomness with respect to a class of measures. Proceedings of the Steklov Institute of Mathematics, 274(1):34–89, 2011. In English and Russian, also in arXiv:1103.1529.
^ Peter Gacs. Lecture notes on descriptional complexity and randomness. Technical report, Boston University, Computer Science Dept., Boston, MA 02215, 2009. www.cs.bu.edu/faculty/gacs/papers/ait-notes.pdf.
^ Thomas M. Cover, Peter Gacs, and Robert M. Gray. Kolmogorov’s contributions to information theory and algorithmic complexity. The Annals of Probability, 17(3):840–865, 1989.
^ Peter Gacs and Janos Korner. Common information is far less than mutual information. Problems of Control and Inf. Th., 2:149–162, 1973.
^ Rudolf Ahlswede, Peter Gacs, and Janos Korner. Bounds on conditional probabilities with applications in multi-user communication. Z. Wahrscheinlichkeitstheorie und Verw. Gebiete, 34:157–177, 1976.