
Axiomatic Thinking II, pp. 199–234

What is the Church-Turing Thesis?

  • Udi Boker &
  • Nachum Dershowitz
  • First Online: 18 September 2022


We aim to put some order to the multiple interpretations of the Church-Turing Thesis and to the different approaches taken to prove or disprove it.

[Answer:] Rosser and its inventor proved that its beta-reduction satisfies the diamond property, and Kleene (pron. clean-ee) proved that it was equivalent to his partial recursive functions. The previous result combined with a similar one with the Turing Machine, led to the Church-Turing thesis. [Question: “What is ...?”] —Quizbowl Tournament (2004)


Peter Wegner and Dina Goldin asserted that the Church-Turing Thesis should be interpreted with respect to functions over natural numbers [ 17 ]. Yet, they claim, if one considers the thesis in its more general form, one must admit that it is wrong, owing to the reality of interactive computation. All this means is that the original thesis needs to be adapted to refer to non-function-computing uses of computation. See, for example, [ 19 ].

There are numerous other names for the various versions of the thesis in the literature. Even the terms “mathematical” and “physical” have different connotations for different authors.

In [ 13 , 33 ] it is shown that while strict containment of function sets is in general not enough to infer the presence of extra computational power, general recursion is indeed more powerful than primitive recursion, even under rigorous definitions of power comparison.
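A classic witness for this separation is Ackermann's function: it is computable by general recursion but eventually dominates every primitive recursive function of one argument. A minimal sketch in Python (our illustration, not taken from the cited papers):

```python
def ackermann(m: int, n: int) -> int:
    """Ackermann's function: total and effectively computable,
    yet not primitive recursive, since it eventually grows
    faster than any primitive recursive function."""
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))

# Values stay small for m <= 2 but explode as m grows.
print(ackermann(2, 3))  # 9
print(ackermann(3, 3))  # 61
```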

Sieg [ 49 ] in explaining Church [ 4 ]: “Calculability is explicated by that of derivability in a logic.”

Gandy’s idea of basing a formalization of computability on hereditarily finite sets, which are precisely the finite objects of set theory, was presaged by Moto-o Takahashi [ 53 ].

Another argument of Bringsjord, explicitly touted as “a case against Church’s thesis”, takes its cue from the perceived ability of humans to judge the literary quality of writings that they cannot for the life of them produce themselves [ 101 ].

Herbert Robbins, the statistician: “Nobody is going to run 100 meters in five seconds, no matter how much is invested in training and machines. The same can be said about using the brain. The human mind is no different now from what it was five thousand years ago. And when it comes to mathematics, you must realize that this is the human mind at an extreme limit of its capacity” [ 105 ].

Emil L. Post. Finite combinatory processes—formulation 1. Journal of Symbolic Logic , 1 (3): 103–105, September 1936.


Yiannis N. Moschovakis. What is an algorithm? In Björn Engquist and Wilfried Schmid, editors, Mathematics Unlimited — 2001 and Beyond , pages 919–936. Springer, Berlin, 2001.

Erwin Engeler. The Combinatory Programme . Progress in Theoretical Computer Science. Birkhäuser, Boston, 1995.

Alonzo Church. An unsolvable problem of elementary number theory. American Journal of Mathematics , 58: 345–363, 1936.


Alan M. Turing. On computable numbers, with an application to the Entscheidungsproblem. Proceedings of the London Mathematical Society , 42: 230–265, 1936–37. URL http://www.abelard.org/turpap2/tp2-ie.asp . Corrections in vol. 43 (1937), pp. 544–546. Reprinted in M. Davis (ed.), The Undecidable , Raven Press, Hewlett, NY, 1965.

Stephen C. Kleene. Lambda-definability and recursiveness. Duke Mathematical Journal , 2: 340–353, 1936.

Martin Davis. The Church-Turing thesis: Consensus and opposition. In Logical Approaches to Computational Barriers , pages 125–132, Berlin, 2006. Springer.


Andrew Hodges. Did Church and Turing have a thesis about machines? In Adam Olszewski, Jan Wolenski, and Robert Janusz, editors, Church’s Thesis After 70 Years , pages 214–224. Ontos Verlag, 2006.

Robert I. Soare. The history and concept of computability. In Handbook of Computability Theory , pages 3–36. Elsevier, 1999.

Ian Parberry. Parallel speedup of sequential machines: A defense of parallel computation thesis. SIGACT News , 18 (1): 54–67, March 1986.


Peter van Emde Boas. Machine models and simulations. In J. van Leeuwen, editor, Handbook of Theoretical Computer Science , volume A: Algorithms and Complexity, pages 1–66. North-Holland, Amsterdam, 1990. URL http://www.illc.uva.nl/Research/Reports/CT-1988-05.text.pdf .


Scott Aaronson. The toaster-enhanced Turing machine, 2012. URL https://www.scottaaronson.com/blog/?p=1121 . The Blog of Scott Aaronson.

Udi Boker and Nachum Dershowitz. Comparing computational power. Logic Journal of the IGPL , 14 (5): 633–648, 2006. URL http://nachum.org/papers/ComparingComputationalPower.pdf .

Udi Boker and Nachum Dershowitz. The Church-Turing thesis over arbitrary domains. In Arnon Avron, Nachum Dershowitz, and Alexander Rabinovich, editors, Pillars of Computer Science, Essays Dedicated to Boris (Boaz) Trakhtenbrot on the Occasion of His 85th Birthday , volume 4800 of Lecture Notes in Computer Science , pages 199–229. Springer, 2008. URL http://nachum.org/papers/ArbitraryDomains.pdf .

Martin Davis. The myth of hypercomputation. In Christof Teuscher, editor, Alan Turing: Life and Legacy of a Great Thinker , pages 195–212. Springer, 2003.

Gualtiero Piccinini. The physical Church-Turing thesis: Modest or bold? The British Journal for the Philosophy of Science , 62: 733–769, 2011.

Dina Goldin and Peter Wegner. The Church-Turing Thesis: Breaking the myth. In S. Barry Cooper, Benedikt Löwe, and Leen Torenvliet, editors, New Computational Paradigms: First Conference on Computability in Europe (CiE 2005, Amsterdam) , volume 3526 of Lecture Notes in Computer Science , pages 152–168, Berlin, June 2005.

Dina Goldin and Peter Wegner. The interactive nature of computing: Refuting the strong Church-Turing thesis. Minds and Machines , 18: 17–38, March 2008.


Manfred Broy. Computability and realizability for interactive computations. Information and Computation , 241: 277–301, 2015.

Dina Q. Goldin, Scott A. Smolka, and Peter Wegner, editors. Interactive Computation: The New Paradigm . Springer, Berlin, 2006.

Stephen C. Kleene. Turing’s analysis of computability, and major applications of it. In A Half-century Survey on The Universal Turing Machine , pages 17–54. Oxford University Press, Inc., 1988.

Janet Folina. Church’s thesis: Prelude to a proof. Philosophia Mathematica , 6 (3): 302–323, 1998.

Stewart Shapiro. Understanding Church’s thesis, again. Acta Analytica , 11: 59–77, 1993.

Stephen C. Kleene. Introduction to Metamathematics . North Holland, 1952.

László Kalmár. An argument against the plausibility of Church’s thesis. In A. Heyting, editor, Constructivity in Mathematics, Proceedings of the Colloquium Held at Amsterdam, 1957 , pages 72–80, Amsterdam, 1959. North-Holland.

Martin Davis. Why Gödel didn’t have Church’s Thesis. Information and Control , 54 (1/2): 3–24, 1982.

Joseph R. Shoenfield. Recursion Theory , volume 1 of Lecture Notes In Logic . Springer-Verlag, Heidelberg, New York, 1991.

Elliott Mendelson. Second thoughts about Church’s thesis and mathematical proofs. Journal of Philosophy , 87 (5): 225–233, 1990.

Robin Gandy. The confluence of ideas in 1936. In A Half-Century Survey on the Universal Turing Machine , pages 55–111, New York, NY, 1988. Oxford University Press, Inc. URL http://dl.acm.org/citation.cfm?id=57249.57252 .

Stewart Shapiro. Proving things about the informal. In G. Sommaruga and T. Strahm, editors, Turing’s Revolution: The Impact of his Ideas About Computability , pages 283–296. Springer, Cham, January 2015.

Harvey M. Friedman. Mathematical logic in the 20th and 21st centuries, 2000. URL http://cs.nyu.edu/pipermail/fom/2000-April/003913.html . FOM mailing list. April 27, 2000.

Saul A. Kripke. The Church-Turing “thesis” as a special corollary of Gödel’s completeness theorem. In J. Copeland, C. Posy, and O. Shagrir, editors, Computability: Turing, Gödel, Church, and Beyond , pages 77–104. MIT Press, 2013.

Udi Boker and Nachum Dershowitz. The influence of domain interpretations on computational models. Journal of Applied Mathematics and Computation , 215 (4): 1323–1339, 2009.

Henk Barendregt. The impact of the lambda calculus in logic and computer science. Bulletin of Symbolic Logic , 3 (2): 181–215, 1997. URL http://www.math.ucla.edu/~asl/bsl/0302/0302-003.ps .

Solomon Feferman. Theses for computation and recursion on concrete and abstract structures. In Turing’s Revolution: The Impact of His Ideas about Computability , pages 105–126. Springer International Publishing, 2015.

David Chalmers. A computational foundation for the study of cognition. Journal of Cognitive Science , 12 (4): 325–359, 2011. Written in 1993.

Hartley Rogers, Jr. Theory of Recursive Functions and Effective Computability . McGraw-Hill, New York, 1966.

Stephen C. Kleene. Origins of recursive function theory. Annals of the History of Computing , 3 (1): 52–67, 1981.

Wilfried Sieg. Step by recursive step: Church’s analysis of effective calculability. Bulletin of Symbolic Logic , 3: 154–180, June 1997.

Alonzo Church. Review of “On computable numbers, with an application to the Entscheidungsproblem”. The Journal of Symbolic Logic , 2 (1): 42–43, 1937.

Hao Wang. A Logical Journey. From Gödel to Philosophy . MIT Press, 1996.

Kurt Gödel. Some remarks on the undecidability results. In Solomon Feferman, John Dawson, and Stephen Kleene, editors, Kurt Gödel: Collected Works, Vol. II , pages 305–306. Oxford University Press, 1972.

Hao Wang. Reflections on Kurt Gödel . Bradford Books. MIT Press, 1990.

Hao Wang. From Mathematics to Philosophy . Kegan Paul, London, UK, 1974.

Oron Shagrir. Gödel on Turing on computability. In Adam Olszewski, Jan Wolenski, and Robert Janusz, editors, Church’s Thesis after 70 Years , pages 393–419. Ontos-Verlag, 2006.

Arnon Avron. The problematic nature of Gödel’s disjunctions and Lucas-Penrose’s theses. Semiotic Studies , 34(1): 83–108, 2020.

Jon Barwise. An introduction to first-order logic. In Jon Barwise, editor, Handbook of Mathematical Logic , chapter A.1, pages 5–46. North-Holland, 1977.

Reinhard Kahle. Is there a “Hilbert Thesis?” Studia Logica , 107: 145–165, 2019.

Wilfried Sieg. Mechanical procedures and mathematical experience. In Alexander George, editor, Mathematics and Mind , pages 71–117. Oxford University Press, 1994.

Saul A. Kripke. The origins and nature of computation, 2006. URL https://www.youtube.com/watch?v=D9SP5wj882w . Presented at the 21st International Workshop on the History and Philosophy of Science. Jerusalem, Israel.

Robin Gandy. Church’s thesis and principles for mechanisms. In J. Barwise, D. Kaplan, H. J. Keisler, P. Suppes, and A. S. Troelstra, editors, The Kleene Symposium , volume 101 of Studies in Logic and The Foundations of Mathematics , pages 123–148. North-Holland, 1980.

Wilfried Sieg. Church without dogma: Axioms for computability. In S. Barry Cooper, Benedikt Löwe, and Andrea Sorbi, editors, New Computational Paradigms: Changing Conceptions of What is Computable , pages 139–152, New York, 2007. Springer.

Moto-o Takahashi. A foundation of finite mathematics. Publications of the Research Institute for Mathematical Sciences , 12: 577–708, 1977.

Yuri Gurevich. What is an algorithm? In SOFSEM 2012: Theory and Practice of Computer Science , pages 31–42, 2012.

Yuri Gurevich. Sequential abstract state machines capture sequential algorithms. ACM Transactions on Computational Logic , 1: 77–111, 2000.

Donald E. Knuth. Algorithm and program; information and data. Communications of the ACM , 9: 654, 1968.

Emil L. Post. Absolutely unsolvable problems and relatively undecidable propositions: Account of an anticipation. In M. Davis, editor, Solvability, Provability, Definability: The Collected Works of Emil L. Post , pages 375–441. Birkhäuser, Boston, MA, 1994. Unpublished notes, 1941.

Andreĭ N. Kolmogorov. O ponyatii algoritma [On the concept of algorithm] (in Russian). Uspekhi Matematicheskikh Nauk [Russian Mathematical Surveys] , 8 (4): 1175–1176, 1953. English version in: Vladimir A. Uspensky and Alexei L. Semenov, Algorithms: Main Ideas and Applications , Kluwer, Norwell, MA, 1993, pp. 18–19.

Udi Boker and Nachum Dershowitz. Abstract effective models. In M. Fernández and I. Mackie, editors, New Developments in Computational Models: Proceedings of the First International Workshop on Developments in Computational Models (DCM 2005), Lisbon, Portugal (July 2005) , volume 135 of Electronic Notes in Theoretical Computer Science , pages 15–23, 2006.

Nachum Dershowitz and Yuri Gurevich. A natural axiomatization of computability and proof of Church’s Thesis. Bulletin of Symbolic Logic , 14 (3): 299–350, 2008. https://doi.org/10.2178/bsl/1231081370 .

Wolfgang Reisig. The computable kernel of Abstract State Machines. Theoretical Computer Science , 409: 126–136, 2008.

Udi Boker and Nachum Dershowitz. Three paths to effectiveness. In Andreas Blass, Nachum Dershowitz, and Wolfgang Reisig, editors, Fields of Logic and Computation: Essays Dedicated to Yuri Gurevich on the Occasion of His 70th Birthday , volume 6300 of Lecture Notes in Computer Science , pages 36–47, Berlin, 2010. Springer. URL http://nachum.org/papers/ThreePathsToEffectiveness.pdf .

Wilfried Sieg. Axioms for computability: Do they allow a proof of Church’s thesis? In Hector Zenil, editor, A Computable Universe. Understanding and Exploring Nature as Computation , pages 99–123. World Scientific/Imperial College Press, Singapore, 2013.

Elliott Mendelson. Introduction to Mathematical Logic . Discrete Mathematics and Its Applications. CRC Press, 5th edition, 2009.

William J. Rapaport. Philosophy of Computer Science . Online draft, 2020. URL https://cse.buffalo.edu/~rapaport/Papers/phics.pdf .

John E. Hopcroft and Jeffrey D. Ullman. Formal Languages and Their Relation to Automata . Addison-Wesley, Reading, MA, 1968.

Harry R. Lewis and Christos H. Papadimitriou. Elements of the Theory of Computation . Prentice-Hall, Englewood Cliffs, NJ, 1981.

Máté Szabó. Kalmár’s argument against the plausibility of Church’s thesis. History and Philosophy of Logic , 39 (2): 140–157, 2018.

Benjamin Wells. Is there a nonrecursive decidable equational theory? Minds and Machines , 12 (2): 301–324, 2002.

Rózsa Péter. Rekursivität und Konstruktivität. In A. Heyting, editor, Constructivity in Mathematics, Proceedings of the Colloquium Held at Amsterdam, 1957 , pages 226–233, Amsterdam, 1959. North-Holland.

Jean Porte. Quelques pseudo-paradoxes de la ‘calculabilité effective’. In Actes du 2me Congrès International de Cybernétique , pages 332–334, Namur, Belgium, 1960.

Elliott Mendelson. On some recent criticism of Church’s thesis. Notre Dame Journal of Formal Logic , IV (3): 201–205, July 1963.


Yiannis N. Moschovakis. Review of four recent papers on Church’s thesis. Journal of Symbolic Logic , 33 (3): 471–472, 1968.

Stewart Shapiro. Acceptable notation. Notre Dame Journal of Formal Logic , 23 (1): 14–20, 1982.

Michael Rescorla. Copeland and Proudfoot on computability. Studies in History and Philosophy of Science Part A , 43 (1): 199–202, 2012. Reconsidering the Dynamics of Reason: A Symposium in Honour of Michael Friedman.

B. Jack Copeland and Diane Proudfoot. Deviant encodings and Turing’s analysis of computability. Studies in History and Philosophy of Science , 41: 247–252, September 2010.

Michael Rescorla. Church’s thesis and the conceptual analysis of computability. Notre Dame Journal of Formal Logic , 48 (2): 253–280, 2007.

Michał Wroclawski. Representations of natural numbers and computability of various functions. In Florin Manea, Barnaby Martin, Daniël Paulusma, and Giuseppe Primiero, editors, Proceedings of the 15th Conference on Computability in Europe - Computing with Foresight and Industry (CiE 2019, Durham, UK) , volume 11558 of Lecture Notes in Computer Science , pages 298–309. Springer, July 2019.

Udi Boker and Nachum Dershowitz. A hypercomputational alien. Applied Mathematics and Computation , 178 (1): 44–57, 2006.

Roger Penrose. The Emperor’s New Mind: Concerning Computers, Minds, and The Laws of Physics . Oxford University Press, New York, 1989.

Roger Penrose. Shadows of the Mind: A Search for the Missing Science of Consciousness . Oxford University Press, Oxford, 1994.

Christopher Strachey. An impossible program. Computer Journal , 7 (4): 313, 1965.

John R. Lucas. Minds, machines and Gödel. Philosophy , XXXVI: 112–127, 1961. Reprinted in The Modeling of Mind , K. M. Sayre and F. J. Crosson, eds., Notre Dame Press, 1963, pp. 269–270. https://doi.org/10.1017/S0031819100057983

Arnon Avron. Mishpete Gedel u-ve‘ayat ha-yesodot shel ha-matematikah (= Gödel’s Theorems and the Problem of the Foundations of Mathematics) . Broadcast University, Ministry of Defence, Jerusalem, Israel, 1998. In Hebrew.

David Chalmers, editor. Symposium on Roger Penrose’s Shadows of the Mind , volume 2, 1995. Association for the Scientific Study of Consciousness. URL http://journalpsyche.org/files/0xaa25.pdf .

Geoffrey LaForte, Patrick J. Hayes, and Kenneth M. Ford. Why Gödel’s theorem cannot refute computationalism. Artificial Intelligence , 104 (1–2): 265–286, 1998.

Nachum Dershowitz. The four sons of Penrose. In Proceedings 12th International Conference on Logic for Programming, Artificial Intelligence, and Reasoning (LPAR) (Montego Bay, Jamaica) , volume 3835 of Lecture Notes in Computer Science , pages 125–138. Springer, December 2005.

Martin Davis. The Universal Computer: The Road from Leibniz to Turing . CRC Press, 3rd edition, 2018.

Hilary Putnam. Book review: Shadows of the Mind by Roger Penrose. Bulletin of the American Mathematical Society , 32 (3): 370–373, July 1995.

John C. Shepherdson. On the definition of computable function of a real variable. Zeitschrift für mathematische Logik und Grundlagen der Mathematik (Mathematical Logic Quarterly) , 22 (1): 391–402, 1976.

Oron Shagrir. Effective computation by humans and machines. Minds and Machines , 12: 221–240, 2002.

Wilfried Sieg and John Byrnes. An abstract model for parallel computations: Gandy’s thesis. The Monist , 82 (1): 150–164, 1999.

Olivier Bournez, Nachum Dershowitz, and Pierre Néron. An axiomatization of analog algorithms. In Computability in Europe 2016: Pursuit of the Universal (CiE, Paris, France) , volume 9709 of Lecture Notes in Computer Science , pages 215–224, Switzerland, June 2016. Springer. URL http://nachum.org/papers/AxiomatizationAnalog.pdf . Full version at https://arxiv.org/pdf/1604.04295v2.pdf .

Jack B. Copeland and Oron Shagrir. Physical computation: How general are Gandy’s principles for mechanisms? Minds and Machines , 17 (2): 217–231, 2007.

Pablo Arrighi and Gilles Dowek. The physical Church-Turing thesis and the principles of quantum theory. International Journal of Foundations of Computer Science , 23 (5): 1131–1145, 2012.

Tien D. Kieu. Quantum algorithm for Hilbert’s Tenth Problem. International Journal of Theoretical Physics , 42: 1461–1478, 2003.

Pablo Arrighi and Gilles Dowek. The principle of a finite density of information. In H. Zenil, editor, Irreducibility and Computational Equivalence , volume 2 of Emergence, Complexity and Computation , pages 127–134. Springer, Berlin, 2013.

David Beckman, Daniel Gottesman, Michael A. Nielsen, and John Preskill. Causal and localizable quantum operations. Phys. Rev. A , 64, 2001.

Benjamin Schumacher and Michael D. Westmoreland. Locality and information transfer in quantum operations. Quantum Information Processing , 4 (1): 13–34, 2005.

Apostolos Syropoulos. Hypercomputation: Computing Beyond the Church-Turing Barrier . Springer Science & Business Media, 2008.

Selmer Bringsjord and David A. Ferrucci. The narrative-based refutation of Church’s thesis. In Artificial Intelligence and Literary Creativity: Inside the Mind of BRUTUS, a Storytelling Machine , chapter 5, pages 105–148. Lawrence Erlbaum, Mahwah, NJ, 2000.

Selmer Bringsjord, Owen Kellett, Andrew Shilliday, Joshua Taylor, Bram van Heuveln, Yingrui Yang, Jeffrey Baumes, and Kyle Ross. A new Gödelian argument for hypercomputing minds based on the Busy Beaver problem. Applied Mathematics and Computation , 176 (2): 516–530, 2006.

Owen Kellett. A multi-faceted attack on the Busy Beaver problem. Master’s thesis, Rensselaer Polytechnic Institute, Troy, New York, July 2005.

Selmer Bringsjord and Michael Zenzen. Superminds: People Harness Hypercomputation, and More , volume 29 of Studies in Cognitive Systems . Kluwer Academic, Dordrecht, The Netherlands, 2003.

Warren Page. An interview with Herbert Robbins. The Two-Year College Mathematics Journal , 15 (1): 2–24, 1984.

Mario Stipčević and Çetin Kaya Koç. True random number generators. In Çetin Kaya Koç, editor, Open Problems in Mathematics and Computational Science , pages 275–315. Springer International Publishing, 2014.

G. Lee Bowie. An argument against Church’s thesis. The Journal of Philosophy , 70 (3): 66–76, 1973.

Cristian S. Calude. Algorithmic randomness, quantum physics, and incompleteness. In Proceedings of the Conference on Machines, Computations and Universality (MCU 2004) , volume 3354, pages 1–17, September 2004.

David Deutsch. Quantum theory, the Church-Turing principle and the universal quantum computer. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences , 400: 97–117, 1985.

John Gill. Computational complexity of probabilistic Turing machines. SIAM Journal on Computing , 6 (4): 675–695, 1977.

Karl de Leeuw, Edward F. Moore, Claude E. Shannon, and Norman Shapiro. Computability by probabilistic machines. In Claude E. Shannon and John McCarthy, editors, Automata Studies , volume 34 of Annals of Mathematics Studies , pages 183–212. Princeton University Press, 1956.

Yuri Gurevich. Unconstrained Church-Turing thesis cannot possibly be true. Bull. EATCS , 127, 2019. URL http://bulletin.eatcs.org/index.php/beatcs/article/view/566/565 .

B. Jack Copeland. Accelerating Turing machines. Minds and Machines , 12 (2): 281–300, 2002.

E. Mark Gold. Limiting recursion. J. Symbolic Logic , 30 (1): 28–48, 1965.

Hilary Putnam. Trial and error predicates and the solution to a problem of Mostowski. J. Symbolic Logic , 30 (1): 49–57, 1965.

Hermann Weyl. Philosophy of Mathematics and Natural Science . Princeton University Press, 1949.

Adolf Grünbaum. Messrs. Black and Taylor on temporal paradoxes. Analysis , 12 (6): 144–148, 1952. URL http://www.jstor.org/stable/3326977 .

Gábor Etesi and István Németi. Non-Turing computations via Malament-Hogarth space-times. International Journal of Theoretical Physics , 41 (2): 341–370, 2002.

Mark L. Hogarth. Non-Turing computers and non-Turing computability. In Proceedings of the Philosophy of Science Association , volume 1994, pages 126–138, 1994.

Mark L. Hogarth. Does general relativity allow an observer to view an eternity in a finite time? Foundations of Physics Letters , 5 (2): 173–181, 1992.


Itamar Pitowsky. The physical Church thesis and physical computational complexity. Iyyun: The Jerusalem Philosophical Quarterly , 39: 81–99, 1990.

Oron Shagrir and Itamar Pitowsky. Physical hypercomputation and the Church-Turing thesis. Minds and Machines , 13 (1): 87–101, 2003.

John Earman and John D. Norton. Forever is a day: Supertasks in Pitowsky and Malament-Hogarth spacetimes. Philosophy of Science , 60 (1): 22–42, 1993.

Antony Galton. The Church-Turing thesis: Still valid after all these years? Applied Mathematics and Computation , 178: 93–102, 2006.

Paolo Cotogno. Hypercomputation and the physical Church-Turing thesis. The British Journal for the Philosophy of Science , 54 (2): 181–223, 2003.

Fred G. Abramson. Effective computation over the real numbers. In 12th Annual Symposium on Switching and Automata Theory (SWAT 1971) , pages 33–37, 1971.

Jean-Claude Carréga. Théorie des corps - La règle et le compas . Hermann, Paris, 1981.

Pascal Schreck. On the mechanization of straightedge and compass constructions. Journal of Systems Science and Complexity , 32: 124–149, February 2019.

Jesper Lützen. Why was Wantzel overlooked for a century? The changing importance of an impossibility result. Historia Mathematica , 36 (4): 374–394, 2009.

Arnon Avron. Personal communication, 2020.

Hava T. Siegelmann. Neural Networks and Analog Computation: Beyond the Turing Limit . Birkhäuser, Boston, 1998.

Paul Cockshott, Lewis Mackenzie, and Greg Michaelson. Physical constraints on hypercomputation. Theoretical Computer Science , 394: 159–174, 2008.

Cristian S. Calude and Boris Pavlov. Coins, quantum measurements, and Turing’s barrier. Quantum Information Processing , 1 (1): 107–127, 2002.

Michael A. Nielsen. Computable functions, quantum measurements, and quantum dynamics. Physical Review Letters , 79 (15): 2915–2918, 1997.

Andrew Hodges. Can quantum computing solve classically unsolvable problems? arXiv, 2005. URL http://arXiv.org/abs/quant-ph/0512248 .

Warren D. Smith. Three counterexamples refuting Kieu’s plan for “quantum adiabatic hypercomputation”; and some uncomputable quantum mechanical tasks. Applied Mathematics and Computation , 178 (1): 184–193, 2006. Special Issue on Hypercomputation.

Boris Tsirelson. The quantum algorithm of Kieu does not solve the Hilbert’s tenth problem. arXiv, November 2001. URL http://arXiv.org/abs/quant-ph/0111009.

John W. Carroll. Laws of nature. In Edward N. Zalta, editor, The Stanford Encyclopedia of Philosophy . Metaphysics Research Lab, Stanford University, fall 2016 edition, 2016. URL https://plato.stanford.edu/archives/fall2016/entries/laws-of-nature .

George Kreisel. Mathematical logic: What has it done for the philosophy of mathematics? In Ralph Schoenman, editor, Bertrand Russell: Philosopher of the Century . George Allen and Unwin, London, 1967.

Robert Black. Proving Church’s Thesis. Philosophia Mathematica , 8 (3): 244–258, October 2000.


Acknowledgements

We are enormously grateful for comments and suggestions from Arnon Avron, Erwin Engeler, Oron Shagrir, Wilfried Sieg, and an anonymous reader.

Author information

Authors and Affiliations

School of Computer Science, Reichman University, Herzliya, Israel

Udi Boker

School of Computer Science, Tel Aviv University, Ramat Aviv, Israel

Nachum Dershowitz


Corresponding author

Correspondence to Udi Boker .

Editor information

Editors and Affiliations

Departamento de Matemática, Faculdade de Ciências, Universidade de Lisboa, Lisboa, Portugal

Fernando Ferreira

Carl Friedrich von Weizsäcker-Zentrum, Universität Tübingen, Tübingen, Germany

Reinhard Kahle

Department of Mathematics, ETH Zurich, Zurich, Switzerland

Giovanni Sommaruga


Copyright information

© 2022 Springer Nature Switzerland AG

About this chapter

Cite this chapter.

Boker, U., Dershowitz, N. (2022). What is the Church-Turing Thesis? In: Ferreira, F., Kahle, R., Sommaruga, G. (eds) Axiomatic Thinking II. Springer, Cham. https://doi.org/10.1007/978-3-030-77799-9_9


DOI : https://doi.org/10.1007/978-3-030-77799-9_9

Published : 18 September 2022

Publisher Name : Springer, Cham

Print ISBN : 978-3-030-77798-2

Online ISBN : 978-3-030-77799-9

eBook Packages : Mathematics and Statistics; Mathematics and Statistics (R0)


The Church-Turing Thesis: Logical Limit or Breachable Barrier?


Alonzo Church and Alan M. Turing

The Church-Turing thesis (CTT) underlies tantalizing open questions concerning the fundamental place of computing in the physical universe. For example, is every physical system computable? Is the universe essentially computational in nature? What are the implications for computer science of recent speculation about physical uncomputability? Does CTT place a fundamental logical limit on what can be computed, a computational “barrier” that cannot be broken, no matter how far and in what multitude of ways computers develop? Or could new types of hardware, based perhaps on quantum or relativistic phenomena, lead to radically new computing paradigms that do breach the Church-Turing barrier, in which the uncomputable becomes computable, in an upgraded sense of “computable”? Before addressing these questions, we first look back to the 1930s to consider how Alonzo Church and Alan Turing formulated, and sought to justify, their versions of CTT. With this necessary history under our belts, we then turn to today’s dramatically more powerful versions of CTT.

Key Insights

  • The term “Church-Turing thesis” is used today for numerous theses that diverge significantly from the one Alonzo Church and Alan Turing conceived in 1936.
  • The range of algorithmic processes studied in modern computer science far transcends the range of processes a “human computer” could possibly carry out.
  • There are at least three forms of the “physical Church-Turing thesis”—modest, bold, and super-bold—though, at the present stage of physical inquiry, it is unknown whether any of them is true.

Turing stated what we will call “Turing’s thesis” in various places and with varying degrees of rigor. The following formulation is one of his most accessible.

Turing’s thesis. “L.C.M.s [logical computing machines, Turing’s expression for Turing machines] can do anything that could be described as … ‘purely mechanical’.” 38

Turing also formulated his thesis in terms of numbers. For example, he said, “It is my contention that these operations [the operations of an L.C.M.] include all those which are used in the computation of a number.” 36 and “[T]he ‘computable numbers’ include all numbers which would naturally be regarded as computable.” 36

Church (who, like Turing, was working on the German mathematician David Hilbert’s Entscheidungsproblem ) advanced “Church’s thesis,” which he expressed in terms of definability in his lambda calculus.

Church’s thesis. “We now define the notion … of an effectively calculable function of positive integers by identifying it with the notion of a recursive function of positive integers (or of a λ-definable function of positive integers).” 5


Church chose to call this a definition. American mathematician Emil Post, on the other hand, referred to Church’s thesis as a “working hypothesis” and criticized Church for masking it in the guise of a definition. 33

Upon learning of Church’s “definition,” Turing quickly proved that λ-definability and his own concept of computability (over positive integers) are equivalent. Church’s thesis and Turing’s thesis are thus equivalent, if attention is restricted to functions of positive integers. (Turing’s thesis, more general than Church’s, also encompassed computable real numbers.) However, it is important for a computer scientist to appreciate that despite this extensional equivalence, Turing’s thesis and Church’s thesis have distinct meanings and so are different theses, since they are not intensionally equivalent. A leading difference in their meanings is that Church’s thesis contains no reference to computing machinery, whereas Turing’s thesis is expressed in terms of the “Turing machine,” as Church dubbed it in his 1937 review of Turing’s paper.
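To make λ-definability concrete, Church numerals can be mimicked with ordinary higher-order functions. The sketch below is our illustration (Python closures standing in for raw λ-terms): the numeral n is encoded as the function that applies f to x exactly n times, which makes successor and addition λ-definable.

```python
# Church numerals: n is the function applying f to x exactly n times.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by applying it to integer successor."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
```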

It is now widely understood that Turing introduced his machines with the intention of providing an idealized description of a certain human activity—numerical computation; in Turing’s day computation was carried out by rote workers called “computers,” or, sometimes, “computors”; see, for example, Turing. 37 The Church-Turing thesis is about computation as the term was used in 1936—human computation. Church’s term “effectively calculable function” was intended to refer to functions that are calculable by an idealized human computer; and, likewise, Turing’s phrase “numbers which would naturally be regarded as computable” was intended to refer to those numbers that could be churned out, digit by digit, by an idealized human computer working ceaselessly.

Here, then, is our formulation of the historical version of the Church-Turing thesis, as informed by Turing’s proof of the equivalence of his and Church’s theses:

CTT-Original (CTT-O). Every function that can be computed by the idealized human computer, which is to say, can be effectively computed, is Turing-computable.

Some mathematical logicians view CTT-O as subject ultimately to either mathematical proof or mathematical refutation, like open mathematical conjectures such as the Riemann hypothesis, while others regard CTT-O as not amenable to mathematical proof but supported by philosophical arguments and an accumulation of mathematical evidence. Few logicians today follow Church in regarding CTT-O as a definition. We subscribe to Turing’s view of the status of CTT-O, as we outline later.

In computer science today, algorithms and effective procedures are, of course, associated not primarily with humans but with machines. (Note, while some expositors might distinguish between the terms “algorithm” and “effective procedure,” we use the terms interchangeably.) Many computer science textbooks formulate the Church-Turing thesis without mentioning human computers at all; examples include the well-known books by Hopcroft and Ullman 24 and Lewis and Papadimitriou. 29 This is despite the fact that the concept of human computation was at the heart of both Turing’s and Church’s analysis of computation.

We discuss several important modern forms of the Church-Turing thesis, each going far beyond CTT-O. First, we look more closely at the algorithmic form of thesis, as stated to a first approximation by Lewis and Papadimitriou 29 : “[W]e take the Turing machine to be a precise formal equivalent of the intuitive notion of ‘algorithm’.”

The range of algorithmic processes studied in modern computer science far transcends the range of processes a Turing machine is able to carry out. The Turing machine is restricted to changing at most one bounded part at each sequential step of a computation. As Yuri Gurevich pointed out, the concept of an algorithm keeps evolving: “We have now parallel, interactive, distributed, real-time, analog, hybrid, quantum, etc. algorithms.” 22 There are enzymatic algorithms, bacterial foraging algorithms, slime-mold algorithms, and more. The Turing machine is incapable of performing the atomic steps of algorithms carried out by, say, an enzymatic system (such as selective enzyme binding) or a slime mold (such as pseudopod extension). The Turing machine is similarly unable to duplicate (as opposed to simulate) John Conway’s Game of Life, where—unlike a Turing machine—every cell updates simultaneously.
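The simultaneity that separates the Game of Life from a sequential machine is easy to see in code. In the minimal sketch below (our own illustration, on a small wrapping grid), every new cell is computed from the old grid at once; a sequential program simulates, but does not duplicate, the parallel update.

```python
# One step of Conway's Game of Life on a small wrapping grid.
# The update is simultaneous: every new cell value is computed from
# the OLD grid, so a sequential program simulates, but does not
# duplicate, the parallel update of all cells at once.
def life_step(grid):
    rows, cols = len(grid), len(grid[0])
    def live_neighbors(r, c):
        return sum(grid[(r + dr) % rows][(c + dc) % cols]
                   for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                   if (dr, dc) != (0, 0))
    # A cell lives iff it has exactly 3 live neighbors, or it is
    # already alive and has exactly 2.
    return [[1 if live_neighbors(r, c) == 3
                  or (grid[r][c] and live_neighbors(r, c) == 2) else 0
             for c in range(cols)]
            for r in range(rows)]

# A "blinker": three live cells that oscillate with period 2.
blinker = [[0, 0, 0, 0, 0],
           [0, 0, 0, 0, 0],
           [0, 1, 1, 1, 0],
           [0, 0, 0, 0, 0],
           [0, 0, 0, 0, 0]]
```

The sequential simulation gets the extensional behavior exactly right, step by step, yet at no point does it perform the simultaneous atomic update that defines the cellular automaton itself.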

A thesis aiming to limit the scope of algorithmic computability to Turing computability should thus not state that every possible algorithmic process can be performed by a Turing machine. The way to express the thesis is to say the extensional input-output function ια associated with an algorithm α is always Turing-computable; ια is simply the extensional mapping of α’s inputs to α’s outputs. The algorithm the Turing machine uses to compute ια might be very different from α itself. A question then naturally arises: If an algorithmic process need not be one a Turing machine can carry out, save in the weak sense just mentioned, then where do the boundaries of this concept lie? What indeed is an algorithm?
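The distinction between an algorithm α and its extensional function ια can be seen with two intensionally different programs that induce the same input-output mapping; the gcd example below is our own illustration, assuming positive-integer inputs.

```python
# Two intensionally different algorithms, one extensional function:
# Euclid's algorithm and naive downward search realize the same
# input-output mapping i_alpha -- here, gcd over positive integers.
def gcd_euclid(a, b):
    # Repeated division with remainder: fast.
    while b:
        a, b = b, a % b
    return a

def gcd_naive(a, b):
    # Walk down from min(a, b); the first common divisor is the gcd.
    # Slow, and nothing like Euclid's procedure -- yet extensionally
    # identical to it.
    for d in range(min(a, b), 0, -1):
        if a % d == 0 and b % d == 0:
            return d
```

A thesis stated in terms of ια cares only that these two mappings coincide; it is silent about which procedure is used to compute them.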

The dominant view in computer science is that, ontologically speaking, algorithms are abstract entities; however, there is debate about what abstract entities algorithms are. Gurevich defined the concept in terms of abstract-state machines, Yiannis Moschovakis in terms of abstract recursion, and Noson Yanofsky in terms of equivalence classes of programs, while Moshe Vardi has speculated that an algorithm is both abstract-state machine and recursor. It is also debated whether an algorithm must be physically implementable. Moschovakis and Vasilis Paschalis (among others) adopt a concept of algorithm “so wide as to admit ‘non-implementable’ algorithms,” 30 while other approaches do impose a requirement of physical implementability, even if only a very mild one. David Harel, for instance, writes: “[A]ny algorithmic problem for which we can find an algorithm that can be programmed in some programming language, any language, running on some computer, any computer, even one that has not been built yet but can be built … is also solvable by a Turing machine. This statement is one version of the so-called Church/Turing thesis.” 23

Steering between these debates—and following Harel’s suggestion that the algorithms of interest to computer science are always expressible in programming languages—we arrive at the following program-oriented formulation of the algorithmic thesis:

CTT-Algorithm (CTT-A). Every algorithm can be expressed by means of a program in some (not necessarily currently existing) Turing-equivalent programming language.

There is an option to narrow CTT-A by adding “physically implementable” before “program,” although in our view this would be to lump together two distinct computational issues that are better treated separately.

The evolving nature and open-endedness of the concept of an algorithm is matched by a corresponding open-endedness in the concept of a programming language. But this open-endedness notwithstanding, CTT-A requires that all algorithms be bounded by Turing computability.

Later in this article we examine complexity-theoretic and physical versions of the Church-Turing thesis but first turn to the question of the justification of the theses introduced so far. Are CTT-O and CTT-A correct?

Stephen Kleene—who coined the term “Church-Turing thesis”—catalogued four types of argument for CTT-O: First, the argument from non-refutation points out the thesis has never been refuted, despite sustained (and ongoing) attempts to find a counterexample (such as the attempts by László Kalmár and, more recently, by Doukas Kapantais). Second, the argument from confluence points to the fact that the various characterizations of computability, while differing in their approaches and formal details, turn out to encompass the very same class of computable functions. Four such characterizations were presented (independently) in 1936 and immediately proved to be extensionally equivalent: Turing computability, Church’s λ-definability, Kleene’s recursive functions, and Post’s finitary combinatory processes.

Third is an argument usually referred to nowadays as “Turing’s analysis.” Turing called it simply argument “I,” stating five very general and intuitive constraints—or axioms—the human computer may be assumed to satisfy: “The behavior of the computer at any moment is determined by the symbols which he is observing, and his ‘state of mind’ at that moment”; “[T]here is a bound B to the number of symbols or squares which the computer can observe at one moment”; “[E]ach of the new observed squares is within L squares of an immediately previously observed square”; “[I]n a simple operation not more than one symbol is altered”; and “[T]he number of states of mind which need be taken into account is finite.” Turing noted that reference to the computer’s states of mind can be avoided by talking instead about configurations of symbols, these being “a more definite and physical counterpart” of states of mind. 36
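Turing’s constraints are easy to make concrete. The toy machine below (a hypothetical example of ours, not Turing’s own) has a finite program table, observes a single square per step, alters at most one symbol, and moves the head at most one square, so in effect B = 1 and L = 1; this particular table increments a binary number written on the tape least-significant bit first.

```python
from collections import defaultdict

# A toy machine respecting Turing's constraints: a finite program
# table; one square observed per step; at most one symbol altered
# per step; each newly observed square within one square of the
# previous one. This hypothetical table increments a binary number
# written least-significant bit first, head starting on that bit.
PROGRAM = {
    ('carry', '1'): ('0', +1, 'carry'),  # 1 + carry = 0, carry on
    ('carry', '0'): ('1',  0, 'halt'),   # 0 + carry = 1, done
    ('carry', '_'): ('1',  0, 'halt'),   # past the number: new digit
}

def run(tape_string):
    tape = defaultdict(lambda: '_', enumerate(tape_string))
    head, state = 0, 'carry'
    while state != 'halt':
        symbol = tape[head]                      # observe one square
        symbol_to_write, move, state = PROGRAM[(state, symbol)]
        tape[head] = symbol_to_write             # alter one symbol
        head += move                             # move at most 1 square
    return ''.join(tape[i] for i in range(max(tape) + 1))
```

For example, run("11") turns the tape 11 (three, least-significant bit first) into 001 (four), one observed square and one altered symbol at a time.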

The second part of Turing’s argument I is a demonstration that each function computed by any human computer subject to these constraints is also computable by a Turing machine; it is not difficult to see that each of the computer’s steps can be mimicked by the Turing machine, either in a single step or by means of a series of steps. In short, Turing’s five axioms entail CTT-O. (Turing’s axiomatic approach to computability was in fact foreshadowed by Kurt Gödel in a suggestion to Church a year or so earlier. 15 Some more recent axiomatic approaches to computability proceed differently; for example, Erwin Engeler employs the Schönfinkel-Curry idea of “combinators” in order to axiomatize the concept of an algorithmic function.)

Fourth in this catalog of considerations supporting CTT-O are arguments from first-order logic. They are typified by a 1936 argument of Church’s and by Turing’s argument II, from Section 9 of Turing’s 1936 paper. In 2013, Saul Kripke 28 presented a reconstruction of Turing’s argument II, which goes as follows: Computation is a special form of mathematical deduction; and every mathematical deduction—and therefore every computation—can be formalized as a valid deduction in the language of first-order predicate logic with identity (a step Kripke referred to as “Hilbert’s thesis”); following Gödel’s completeness theorem, each computation is thus formalized by a provable formula of first-order logic; and every computation can therefore be carried out by the universal Turing machine. This last step regarding the universal Turing machine is secured by a theorem proved by Turing: Every provable formula of first-order logic can be proved by the universal Turing machine.

The third and fourth of these arguments provide justification for CTT-O but not for CTT-A. As Robin Gandy 20 pointed out, the third argument—Turing’s I—contains “crucial steps … where he [Turing] appeals to the fact that the calculation is being carried out by a human being.” 20 For example, Turing assumed “a human being can only write one symbol at a time,” and Gandy noted this assumption cannot be carried over to a parallel machine that “prints an arbitrary number of symbols simultaneously.” 20 In Conway’s Game of Life, for instance, there is no upper bound on the number of cells that make up the grid, yet the symbols in all the cells are updated simultaneously. Likewise, the fourth argument (Turing’s II) involves the claim that computation is a special form of formal proof, but the notion of proof is intrinsically related to what a human mathematician—and not some oracle—can prove.

It is thus perhaps not too surprising that the third and fourth arguments in this catalog seldom if ever appear in logic and computer science textbooks. The two arguments that are always given for the Church-Turing thesis (in, for example, Lewis and Papadimitriou 29 ) are confluence and non-refutation. Yet both those arguments are merely inductive, whereas the third and fourth arguments are deductive in nature.

However, a number of attempts have sought to extend Turing’s axiomatic analysis to machine computation; for example, Gandy 20 broadened Turing’s analysis in such a way that parallel computation is included, while Dershowitz and Gurevich 16 gave a more general analysis in terms of abstract state machines. We return to the topic of extending the analysis to machine computation later in this article but first address the important question of whether CTT-O is mathematically provable.

It used to be thought by mathematical logicians and others that CTT-O is not amenable to formal proof, since it is not a mathematically precise statement. This is because it pairs an informal concept—a “vague intuitive notion,” Church called it 5 —with a precise concept. However, Elliott Mendelson gave a powerful critique of this general argument; and today the view that CTT-O is formally provable seems to be gaining acceptance; see, for example, Dershowitz and Gurevich. 16 Inspired by Gandy, 20 Wilfried Sieg 35 stated that a tightened form of Turing’s argument I proves the thesis; and Kripke 28 entertained the same claim for Turing’s argument II.

Turing’s own view was that, on the contrary, his thesis is not susceptible to mathematical proof. He thought his arguments I and II, and indeed “[a]ll arguments which can be given” for the thesis, are “fundamentally, appeals to intuition, and for this reason rather unsatisfactory mathematically.” 36 Hilbert’s thesis is another example of a proposition that can be justified only by appeal to intuition, and so Kripke’s 28 tightened form of argument II, far from proving CTT-O, merely deduced it from another thesis that is also not amenable to mathematical proof.

Much the same can be said about argument I. If axioms 1–5 are formulated in precise mathematical terms, then it is certainly provable from them that computation is bounded by Turing computability; this is probably what Gandy 20 meant when he said Turing’s argument I proves a “theorem.” But the real issue is whether these axioms completely capture the concept of a computational or algorithmic process, and, so far as we see, no one has ever given a rigorous mathematical justification of that claim. The axioms may be supported by informal arguments, but the whole edifice then falls short of mathematical proof. This is most apparent when the informal arguments offered for the axioms invoke limitations in the cognitive capacities of human computers, as we point out elsewhere. 13 A justification of the second axiom may, for instance, refer to the limitations of human observation. The axioms most certainly lie beyond the scope of mathematical demonstration if their truth depends on contingent human limitations. Turing himself cheerfully appealed to cognitive limitations in the course of his analysis, saying, for example, “[j]ustification lies in the fact that the human memory is necessarily limited.” 36

In summary, our answer to “Is CTT-O mathematically provable?” is: Turing thought not and we have found no reason to disagree with him. The various historical arguments seem more than sufficient to establish CTT-O, but these arguments do indeed fall short of mathematical proof.

We next address complexity theoretic forms of the Church-Turing thesis, then turn to the question of whether CTT-A is justified in the context of physically realistic computations.

It is striking that the Turing machine holds a central place not only in computability theory but also in complexity theory, where it is viewed as a universal model for complexity classes.

In complexity theory, the time complexities of any two general and reasonable models of computation are assumed to be polynomially related. But what counts as “reasonable”? Aharonov and Vazirani 1 gloss over “reasonable” as “physically realizable in principle”; see also Bernstein and Vazirani. 3 If a computational problem’s time complexity is t in some (general and reasonable) model, then its time complexity is assumed to be poly(t) in the single-tape Turing machine model; see also Goldreich. 21 This assumption has different names in the literature; Goldreich 21 called it the Cobham-Edmonds thesis, while Yao 40 introduced the term “Extended Church-Turing thesis.” The thesis is of interest only if P ≠ NP, since otherwise it is trivial.

Quantum-computation researchers also use a variant of this thesis, as expressed in terms of probabilistic Turing machines. Bernstein and Vazirani 3 said: “[C]omputational complexity theory rests upon a modern strengthening of [the Church-Turing] thesis, which asserts that any ‘reasonable’ model of computation can be efficiently simulated on a probabilistic Turing machine.” 3

Aharonov and Vazirani 1 give the following formulation of this assumption, naming it the “Extended Church-Turing thesis”—though it is not quite the same as Yao’s earlier thesis of the same name, which did not refer to probabilistic Turing machines:

CTT-Extended (CTT-E). “[A]ny reasonable computational model can be simulated efficiently by the standard model of classical computation, namely, a probabilistic Turing machine.” 1

As is well known in computer science, Peter Shor’s quantum algorithm for prime factorization is a potential counterexample to CTT-E; the algorithm runs on a quantum computer in polynomial time and is much faster than the most efficient known “classical” algorithm for the task. But the counterexample is controversial. Some computer scientists think the quantum computer invoked is not a physically reasonable model of computation, while others think accommodating these results might require further modifications to complexity theory.
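The classical cost that Shor’s algorithm undercuts can be seen in miniature. Trial division, sketched below, takes on the order of √N steps, which is exponential in the bit-length of N, whereas Shor’s quantum algorithm runs in time polynomial in the bit-length. (The sketch is illustrative only; the best known classical algorithms, such as the general number field sieve, are much faster than trial division but still super-polynomial.)

```python
# Classical trial division: finds a factor of n in roughly sqrt(n)
# steps -- exponential in the bit-length of n. Shor's quantum
# algorithm factors in time polynomial in the bit-length, which is
# why it is a potential counterexample to CTT-E.
def trial_division(n):
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d      # smallest nontrivial factor
        d += 1
    return n              # no divisor found: n is prime
```

Doubling the bit-length of n squares the number of candidate divisors, which is the exponential blow-up at issue.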

We turn now to extensions of the Church-Turing thesis into physics.

The issue of whether every aspect of the physical world is Turing-computable was broached by several authors in the 1960s and 1970s, and the topic rose to prominence in the mid-1980s.

In 1985, Stephen Wolfram formulated a thesis he described as “a physical form of the Church-Turing hypothesis,” saying, “[U]niversal computers are as powerful in their computational capacities as any physically realizable system can be, so that they can simulate any physical system.” 39 In the same year, David Deutsch, who laid the foundations of quantum computation, independently stated a similar thesis, describing it as “the physical version of the Church-Turing principle.” 17 The thesis is now known as the Church-Turing-Deutsch thesis and the Church-Turing-Deutsch-Wolfram thesis.

Church-Turing-Deutsch-Wolfram thesis (CTDW). Every finite physical system can be simulated to any specified degree of accuracy by a universal Turing machine.

Deutsch pointed out that if “simulated” is understood as “perfectly simulated,” then the thesis is falsified by continuous classical systems, since such classical systems necessarily involve uncomputable real numbers, and went on to introduce the concept of a universal quantum computer, saying such a computer is “capable of perfectly simulating every finite, realizable physical system.” Other physical formulations were advanced by Lenore Blum et al., John Earman, Itamar Pitowsky, Marian Pour-El, and Ian Richards, among others.

We next formulate a strong version of the physical Church-Turing thesis we call the “total physical computability thesis.” (We consider some weaker versions later in the article.) By “physical system” we mean any system whose behavior is in accordance with the actual laws of physics, including non-actual and idealized systems.

Total physical computability thesis (CTT-P). Every physical aspect of the behavior of any physical system can be calculated (to any specified degree of accuracy) by a universal Turing machine.

As with CTT-E, there is also a probabilistic version of CTT-P, formulated in terms of a probabilistic Turing machine.

Arguably, the phrase “physical version of the Church-Turing thesis” is an inappropriate name for this and related theses, since CTT-O concerns a form of effective or algorithmic activity and asserts the activity is always bounded by Turing computability, while CTT-P and CTDW entail that the activity of every physical system is bounded by Turing computability; the system’s activity need not be algorithmic/effective at all. Nevertheless, in our “CTT-” nomenclature, we follow the Deutsch-Wolfram tradition throughout this article.

Is CTT-P true? Not if physical systems include systems capable of producing unboundedly many digits of a random binary sequence; Church showed such sequences are uncomputable, as we discussed elsewhere. 8 Moreover, speculation that there may be deterministic physical processes whose behavior cannot be calculated by the universal Turing machine stretches back over several decades; for a review, see Copeland. 9 In 1981, Pour-El and Richards 34 showed that a system evolving from computable initial conditions in accordance with the familiar three-dimensional wave equation is capable of exhibiting behavior that falsifies CTT-P; even today, however, it is an open question whether these initial conditions are physically possible. Earlier papers, from the 1960s, by Bruno Scarpellini, Arthur Komar, and Georg Kreisel, in effect questioned CTT-P, with Kreisel stating: “There is no evidence that even present-day quantum theory is a mechanistic, i.e., recursive theory in the sense that a recursively described system has recursive behavior.” 27 Other potential counterexamples to CTT-P have been described by a number of authors, including what are called “relativistic” machines. First introduced by Pitowsky, 32 they will be examined in the section called “Relativistic Computation.”

There are a number of theoretical countermodels to CTT-P arising from quantum mechanics. For example, in 1964, Komar 26 raised “the issue of the macroscopic distinguishability of quantum states,” asserting there is no effective procedure “for determining whether two arbitrarily given physical states can be superposed to show interference effects.” In 2012, Eisert et al. 19 showed “[T]he very natural physical problem of determining whether certain outcome sequences cannot occur in repeated quantum measurements is undecidable, even though the same problem for classical measurements is readily decidable.” This is an example of a problem that refers unboundedly to the future but not to any specific time. Other typical physical problems take the same form; Pitowsky gave as examples “Is the solar system stable?” and “Is the motion of a given system, in a known initial state, periodic?”

Cubitt et al. 14 described another such undecidability result in a 2015 Nature article, outlining their proof that “[T]he spectral gap problem is algorithmically undecidable: There cannot exist any algorithm that, given a description of the local interactions, determines whether the resultant model is gapped or gapless.” Cubitt et al. also said this is the “first undecidability result for a major physics problem that people would really try to solve.”

The spectral gap, an important determinant of a material’s properties, refers to the energy spectrum immediately above the ground-energy level of a quantum many-body system, assuming a well-defined least-energy level of the system exists; the system is said to be “gapless” if this spectrum is continuous and “gapped” if there is a well-defined next-least energy level. The spectral gap problem for a quantum many-body system is the problem of determining whether the system is gapped or gapless, given the finite matrices (at most three) describing the local interactions of the system.

In their proof, Cubitt et al. 14 encoded the halting problem in the spectral gap problem, showing the latter is at least as hard as the former. The proof involves an infinite family of two-dimensional lattices of atoms. But they pointed out their result also applies to finite systems whose size increases, saying, “Not only can the lattice size at which the system switches from gapless to gapped be arbitrarily large, the threshold at which this transition occurs is uncomputable.” Their proof offers an interesting countermodel to CTT-P, involving a physically relevant example of a finite system of increasing size. There exists no effective method for extrapolating the system’s future behavior from (complete descriptions of) its current and past states.

It is debatable whether any of these quantum models correspond to real-world quantum systems. Cubitt et al. 14 admitted the model invoked in their proof is highly artificial, saying, “Whether the results can be extended to more natural models is yet to be determined.” There is also the question of whether the spectral gap problem becomes computable when only local Hilbert spaces of realistically low dimensionality are considered. Nevertheless, these results are certainly suggestive: CTT-P cannot be taken for granted, even in a finite quantum universe.

Summarizing the current situation with respect to CTT-P, we can say, although theoretical countermodels in which CTT-P is false have been described, there is at present—so far as we know—not a shred of evidence that CTT-P is false in the actual universe. Yet it would seem most premature to assert that CTT-P is true.

Piccinini 31 has distinguished between two different types of physical versions of the Church-Turing thesis, both commonly found in the literature, describing them as “bold” and “modest” versions of the thesis, respectively. The bold and modest versions are weaker than our “super-bold” version just discussed (CTT-P). Bold versions of the thesis state, roughly, that “Any physical process can be simulated by some Turing machine.” 31 The Church-Turing-Deutsch-Wolfram thesis (CTDW) is an example, though Piccinini emphasized that the bold versions proposed by different researchers are often “logically independent of one another” and that, unlike the different formulations of CTT-O, which exhibit confluence, the different bold formulations in fact exhibit “lack of confluence.” 31

CTDW and other bold forms are too weak to rule out the uncomputability scenarios described by Cubitt et al. 14 and by Eisert et al. 19 This is because the physical processes involved in these scenarios may, so far as we know, be Turing-computable; it is possible that each process can be simulated by a Turing machine, to any required degree of accuracy, and yet the answers to certain physical questions about the processes are, in general, uncomputable. The situation is similar in the case of the universal Turing machine itself. The machine’s behavior (consisting of the physical actions of the read/write head) is always Turing-computable since it is produced by the Turing machine’s program, yet the answers to some questions about the behavior (such as whether or not the machine halts given certain inputs) are not computable.
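The contrast can be made concrete. A bounded simulator like the sketch below (our own illustration, for a toy table-driven machine) computably answers “does the machine halt within k steps?” for every fixed k, while Turing’s 1936 result shows that no program answers the unbounded question “does it halt at all?” for every machine and input.

```python
# Each step of a machine is computable: for any fixed budget k, this
# bounded simulator decides "does the machine halt within k steps?".
# What is undecidable is the unbounded question "does it halt at
# all?" -- the halting problem.
def halts_within(program, tape_string, budget, blank='_'):
    tape = dict(enumerate(tape_string))
    head, state = 0, 'start'
    for _ in range(budget):
        if state == 'halt':
            return True
        symbol = tape.get(head, blank)
        write, move, state = program[(state, symbol)]
        tape[head] = write
        head += move
    return state == 'halt'

# A machine that halts at once, and one that walks right forever.
HALTER = {('start', '_'): ('_', +1, 'halt')}
LOOPER = {('start', '_'): ('_', +1, 'start')}
```

For LOOPER, every bounded query correctly returns False, yet no finite budget certifies the answer to the unbounded question; that gap is exactly where computable step-by-step behavior and uncomputable questions about it come apart.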

[Figure: the relationships among the physical computability theses CTT-P, CTDW, and CTT-P-C.]

Nevertheless, bold forms (such as CTDW) are interesting empirical hypotheses in their own right and the world might confute them. For instance, CTDW fails in the wave-equation countermodel due to Pour-El and Richards 34 where the mapping between the wave equation’s “inputs” and “outputs” is not a Turing-computable (real) function; although, as noted earlier, the physicality of this countermodel can readily be challenged. We discuss some other potential countermodels later in the article, but turn first to what Piccinini termed “modest” versions of the thesis.

Modest versions maintain in essence that every physical computing process is Turing-computable; for two detailed formulations, see Gandy 20 and Copeland. 8 Even if CTT-P and CTDW are in general false, the behavior of the subset of physical systems that are appropriately described as computing systems may nevertheless be bounded by Turing-computability. An illustration of the difference between modest versions on the one hand and CTT-P and CTDW on the other is given by the fact that the wave-equation example is not a counter-model to the modest thesis, assuming, as seems reasonable, that the physical dynamics described by the equation do not constitute a computing process.

Here, we formulate a modest version of the physical Church-Turing thesis we call the “Physical Computation” thesis, then turn to the question of whether it is true.

This form of the thesis maintains that physical computation is bounded by Turing-computability.

Physical computation thesis (CTT-P-C). Every function computed by any physical computing system is Turing-computable.

Is CTT-P-C true? As with the stronger physical computability theses, it seems too early to say. CTT-P-C could be false only if CTT-P and CTDW turn out to be false, since each of them entails CTT-P-C (see the figure here, which outlines the relationships among CTT-P, CTDW, and CTT-P-C). If all physical computation is effective in the 1930s sense of Turing and Church, then CTT-P-C is certainly true. If, however, the door is open to a broadened sense of computation, where physical computation is not necessarily effective in the sense of being bounded by Turing-computability, then CTT-P-C makes a substantive claim.

There is, in fact, heated debate among computer scientists and philosophers about what counts as physical computation. Moreover, a number of attempts have sought to describe a broadened sense of computation in which computation is not bounded by Turing-computability; see, for example, Copeland. 6 Computing machines that compute “beyond the Turing limit” are known collectively as “hypercomputers,” a term introduced in Copeland and Proudfoot. 11 Some of the most thought-provoking examples of notional machines that compute in the broad sense are called “supertask” machines. These “Zeno computers” squeeze infinitely many computational steps into a finite span of time. Examples include accelerating machines, 7 , 12 shrinking machines, and the intriguing relativistic computers described in the next section.

Notional machines all constitute rather theoretical countermodels to CTT-P-C, so long as it is agreed that they compute in a broadened sense, but none has been shown to be physically realistic, although, as we explain, relativistic computers come close. In short, the truth or falsity of CTT-P-C remains unsettled.

Relativistic machines operate in space-time structures with the property that the entire endless lifetime of one component of the machine is included in the finite chronological past of another component, called “the observer.” The first component could thus carry out an infinite computation (such as calculating every digit of π) in what is, from the observer’s point of view, a finite timespan of, say, one hour. (Such machines are in accord with Einstein’s general theory of relativity, hence the term “relativistic.”) Examples of relativistic computation have been detailed by Pitowsky, Mark Hogarth, and Istvan Németi.

In this section we outline a relativistic machine RM consisting of a pair of communicating Turing machines, T_E and T_o, in relative motion. T_E is a universal machine, and T_o is the observer. RM is able to compute the halting function, in a broad sense of computation. Speaking of computation here seems appropriate, since RM consists of nothing but two communicating Turing machines.

Here is how RM works. When the input (m, n), asking whether the m-th Turing machine (in some enumeration of the Turing machines) halts or not when started on input n, enters T_o, T_o first prints 0 (meaning “never halts”) in its designated output cell and then transmits (m, n) to T_E. T_E simulates the computation performed by the m-th Turing machine when started on input n and sends a signal back to T_o if and only if the simulation terminates. If T_o receives a signal from T_E, T_o deletes the 0 it previously wrote in its output cell and writes 1 in its place (meaning “halts”). After one hour, T_o’s output cell shows 1 if the m-th Turing machine halts on input n and shows 0 if the m-th machine does not halt on n.
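RM’s protocol can be sketched as ordinary sequential code, with one crucial caveat: in the relativistic setup T_E gets unboundedly many steps within T_o’s finite hour, whereas the finite budget below only approximates that, so a 0 output here means “no halt observed yet” rather than “never halts.” The step parameter stands in for a hypothetical single-step simulator of the m-th machine; countdown_step is our own stand-in.

```python
# Sketch of RM's protocol. 'step' is a hypothetical single-step
# simulator: given a configuration of the m-th machine on input n,
# it returns the next configuration, or None once the machine halts.
# A real RM gives T_E unboundedly many steps inside T_o's finite
# hour; the finite budget here only approximates that.
def rm_protocol(m, n, budget, step):
    output = 0            # T_o provisionally writes 0 ("never halts")
    config = (m, n)       # T_o transmits (m, n) to T_E
    for _ in range(budget):
        config = step(config)   # T_E simulates one step of machine m
        if config is None:      # simulation terminated: T_E signals,
            output = 1          # and T_o overwrites its 0 with 1
            break
    return output

# A stand-in machine that halts once its counter reaches zero.
def countdown_step(config):
    m, n = config
    return None if n <= 0 else (m, n - 1)
```

With an unbounded budget the protocol would compute the halting function exactly; closing that gap physically is precisely what the black-hole time dilation described next is meant to achieve.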

The most physically realistic version of this setup to date is due to Németi and his collaborators in Budapest. T_E, an ordinary computer, remains on Earth, while the observer T_o travels toward and enters a slowly rotating Kerr black hole. T_o approaches the outer event horizon, a bubble-like hypersurface surrounding the black hole. Németi theorized that the closer T_o gets to the event horizon, the faster T_E’s clock runs relative to T_o due to Einsteinian gravitational time dilation, and this speeding up continues with no upper limit. T_o’s motion proceeds until, relative to a time t on T_o’s clock, the entire span of T_E’s computing is over. If any signal was emitted by T_E, the signal will have been received by T_o before time t. So T_o will fall into the black hole with 1 in its output cell if T_E halted and 0 if T_E never halted. Fortunately, T_o can escape annihilation if its trajectory is carefully chosen in advance, says Németi; the rotational forces of the Kerr hole counterbalance the gravitational forces that would otherwise “spaghettify” T_o. T_o thus emerges unscathed from the hole and goes on to use the computed value of the halting function in further computations.

Németi and colleagues emphasize that their machine is physical in the sense that it is "not in conflict with presently accepted scientific principles" and, in particular, that "the principles of quantum mechanics are not violated" [2]. They suggest humans might "even build" a relativistic computer "sometime in the future" [2]. This is, of course, highly controversial. However, our point is that Németi's theoretical countermodel, which counters not only CTT-P-C but also CTT-P and CTDW, helps underscore that the "physical version of the Church-Turing thesis" is quite independent of CTT-O, since the countermodel stands whether or not CTT-O is endorsed. We next reconsider CTT-A.

The continuing expansion of the concept of an algorithm is akin to the extension of the concept of number from integers to signed integers to rational, real, and complex numbers. Even the concept of human computation underwent an expansion; before 1936, computation was conceived of in terms of total functions, and it was Kleene in 1938 who explicitly extended the conception to also cover partial functions.
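Kleene's move from total to partial functions can be made concrete with the μ-operator (unbounded minimization), the construct through which partiality enters the partial recursive functions. The sketch below is our own illustration: the optional cutoff is added purely so that a divergent search terminates in a demonstration; the genuine μ-operator searches without bound and simply never returns on such inputs.

```python
# Unbounded search (Kleene's mu-operator): least n satisfying pred.
# The `cutoff` parameter is our addition so that a divergent search
# can be demonstrated; the real operator has no such escape hatch.

def mu(pred, cutoff=None):
    n = 0
    while cutoff is None or n < cutoff:
        if pred(n):
            return n
        n += 1
    return None  # search abandoned -- models divergence

print(mu(lambda n: n * n >= 10))         # least n with n^2 >= 10 -> 4
print(mu(lambda n: n < 0, cutoff=1000))  # no such n: "diverges" -> None
```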

Gurevich argued in 2012 that formal methods cannot capture the algorithm concept in its full generality due to the concept's open-ended nature; at best, formal methods provide treatments of "strata of algorithms" that "have matured enough to support rigorous definitions" [22]. An important question for computer science is whether CTT-A is a reasonable constraint on the growth of new strata. Perhaps not. In 1982, Jon Doyle [18] suggested that equilibrating systems with discrete spectra (such as molecules and other quantum many-body systems) illustrate a concept of effectiveness that is broader than the classical concept, saying, "[E]quilibrating can be so easily, reproducibly, and mindlessly accomplished" that we may "take the operation of equilibrating as an effective one," even if "the functions computable in principle given Turing's operations and equilibrating include non-recursive functions."

Over the years, there have been several departures from Turing’s 1936 analysis, as the needs of computer science led to a broadening of the algorithm concept. For example, Turing’s fourth axiom, which bounds the number of parts of a system that can be changed simultaneously, became irrelevant when the algorithm concept broadened to cover parallel computations. The future computational landscape might conceivably include more extensive revisions of the concept, if, for example, physicists were to discover that hardware effective in Doyle’s extended sense is a realistic possibility.

If such hardware were to be developed—hardware in which operations are effective in the sense of being “easily, reproducibly, and mindlessly accomplished” but not bounded by Turing computability—then would the appropriate response by computer scientists be to free the algorithm concept from CTT-A? Or should CTT-A remain as a constraint on algorithms, with instead two different species of computation being recognized, called, say, algorithmic computation and non-algorithmic computation? Not much rides on a word, but we note we prefer “effective computation” for computation that is bounded by Turing computability and “neo-effective computation” for computation that is effective in Doyle’s sense and not bounded by Turing computability, with “neo” indicating a new concept related to an older one.

The numerous examples of notional "hypercomputers" (see Copeland [9] for a review) prompt similar questions. Interestingly, a study of the expanding literature about the concept of an infinite-time Turing machine, introduced by Joel Hamkins and Andy Lewis in 2000, shows that a number of computer scientists are prepared to describe the infinite-time machine as computing the halting function. Perhaps this indicates the concept of computation is already in the process of bifurcating into "effective" and "neo-effective" computation.

In the computational literature the term "Church-Turing thesis" is applied to a variety of different propositions usually not equivalent to the original thesis—CTT-O; some even go far beyond anything either Church or Turing wrote. Several but not all are fundamental assumptions of computer science. Others (such as the various physical computability theses we have discussed) are important in the philosophy of computing and the philosophy of physics but are highly contentious; indeed, the label "Church-Turing thesis" should not mislead computer scientists or anyone else into thinking they are established fact or even that Church or Turing endorsed them.


Communications of the ACM, January 2019, Vol. 62 No. 1, Pages 66–74. Copyright held by the authors; publication rights licensed to ACM. © 2019 ACM, Inc.



What is the extended Church-Turing thesis and how does it relate to the study of quantum algorithms?

The extended Church-Turing thesis (ECT) is an important conjecture in the study of quantum algorithms and of the computational capabilities of quantum information. It strengthens the Church-Turing thesis, a fundamental principle of classical computer science, by adding a claim about computational efficiency.

To understand the ECT, we must first grasp the Church-Turing thesis. This thesis states that any function that can be effectively computed by an algorithm can be computed by a Turing machine. In other words, any problem that can be solved algorithmically can be solved by a Turing machine, which is a theoretical model of a classical computer.
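The Turing-machine model invoked here is simple enough to interpret in a few lines. The encoding below (a transition table mapping (state, symbol) to (state, symbol to write, head move), with blanks written as "_") and the example machine are our own choices, made purely for illustration.

```python
# A minimal Turing machine interpreter. The example machine `succ`
# computes the successor function in unary: it scans right over 1s
# and writes a 1 on the first blank cell it finds.

from collections import defaultdict

def run_tm(delta, tape, state="q0", accept="halt", max_steps=1000):
    tape = defaultdict(lambda: "_", enumerate(tape))  # blank cells = "_"
    head = 0
    for _ in range(max_steps):
        if state == accept:
            break
        state, write, move = delta[(state, tape[head])]
        tape[head] = write
        head += 1 if move == "R" else -1
    cells = [tape[i] for i in range(min(tape), max(tape) + 1)]
    return "".join(cells).strip("_")

succ = {
    ("q0", "1"): ("q0", "1", "R"),     # skip over the unary digits
    ("q0", "_"): ("halt", "1", "R"),   # append one more 1, then halt
}

print(run_tm(succ, "111"))  # "1111": 3 + 1 in unary
```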

The ECT extends this idea from computability to efficiency. It asserts that any function that can be efficiently computed by a realistic physical model of computation can also be computed, with at most polynomial overhead, by a probabilistic Turing machine. In other words, no reasonable physical device can be super-polynomially faster than a classical computer.

Quantum computation is the leading candidate counterexample to the ECT. Quantum computers exploit properties of quantum mechanics, such as superposition and entanglement, to perform certain computations with apparently fewer resources than any known classical method; every quantum algorithm that beats the best classical algorithm by a super-polynomial margin is therefore evidence against the thesis.

One notable example is Shor's algorithm, a quantum algorithm for factoring large numbers, which has the potential to break the widely used RSA encryption scheme. The best-known classical factoring algorithms run in super-polynomial (sub-exponential) time, while Shor's algorithm runs in polynomial time on a quantum computer. This showcases the potential advantage of quantum algorithms over their classical counterparts.
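The structure of Shor's algorithm can be seen classically: factoring N reduces to finding the multiplicative order r of a random a modulo N, and it is only that order-finding step which the quantum circuit accelerates. In the sketch below (our own, for illustration) `order` is brute-forced and therefore takes exponential time; Shor's quantum period-finding replaces exactly that function with a polynomial-time procedure.

```python
# Classical skeleton of Shor's algorithm. Everything here is efficient
# except `order`, which we brute-force; the quantum part of Shor's
# algorithm replaces exactly that step.

from math import gcd

def order(a, N):
    """Smallest r > 0 with a**r = 1 (mod N) -- exponential classically."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N, a):
    """Try to split N using the order of a modulo N."""
    g = gcd(a, N)
    if g > 1:
        return g                      # lucky: a shares a factor with N
    r = order(a, N)
    if r % 2 or pow(a, r // 2, N) == N - 1:
        return None                   # this choice of a fails; retry
    return gcd(pow(a, r // 2, N) - 1, N)

print(shor_factor(15, 7))  # order of 7 mod 15 is 4; gcd(7**2 - 1, 15) = 3
print(shor_factor(21, 2))  # order of 2 mod 21 is 6; gcd(2**3 - 1, 21) = 7
```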

However, it is important to note that the status of the ECT remains open. A scalable quantum computer running Shor's algorithm would refute the thesis, unless an efficient classical factoring algorithm exists; conversely, nobody has proved that factoring (or any other problem with a fast quantum algorithm) lacks an efficient classical solution.

In summary, the extended Church-Turing thesis asserts that probabilistic Turing machines can efficiently simulate any realistic model of computation. It relates to the study of quantum algorithms because quantum computers appear to outperform classical computers on certain tasks, making quantum computation the most serious challenge to the thesis; settling the ECT is thus one of the deepest questions raised by quantum computing.



Supplement to The Church-Turing Thesis

The Rise and Fall of the Entscheidungsproblem

  • A. Stating the Entscheidungsproblem
  • B.1 A "Philosophers' Stone"
  • B.2 The Consistency of Mathematics
  • C. Partial Solutions
  • D. Negative Vibes
  • E. The Church-Turing Result
  • F. Aftermath

Turing gave two formulations of what he called “the Hilbert Entscheidungsproblem ” (1936 [2004: 84, 87]):

[Is there a] general process [and a few paragraphs later, he emphasizes “general (mechanical) process”] for determining whether a given formula \(A\) of the functional calculus K is provable
[Is there a] machine [Turing machine] which, supplied with any one \(A\) of these formulae, will eventually say whether \(A\) is provable.

Given Turing’s thesis, the two formulations are equivalent.

Church stated the Entscheidungsproblem more generally:

By the Entscheidungsproblem of a system of symbolic logic is here understood the problem to find an effective method by which, given any expression Q in the notation of the system, it can be determined whether or not Q is provable in the system. (Church 1936b: 41)

While Turing and Church both formulated the Entscheidungsproblem in terms of determining whether or not a formula is provable , Hilbert and Ackermann had formulated it in terms of universal validity (“ allgemeingültigkeit ”):

The Entscheidungsproblem is solved once we know a procedure that allows us to decide, by means of finitely many operations, whether a given logical expression is universally valid or, alternatively, satisfiable. (Hilbert & Ackermann 1928: 73)

A universally valid formula of the functional calculus (e.g., \((x)(Fx \lor \neg Fx)\) or, in modern notation, \(\forall x(Fx \lor \neg Fx)\)) must contain neither free variables nor individual constants, and is such that a true assertion results no matter which replacements (of suitable adicity) are made for the formula’s predicate symbols, and no matter which objects are allocated to its variables (Hilbert & Ackermann 1928: 72–73). In the case of a satisfiable formula (e.g., \((Ex)Fx\) or, in modern notation, \(\exists xFx\)) there must be at least one way of replacing its predicate symbols and of allocating objects to its variables so that a true assertion results. Universal validity and satisfiability are related as follows: \(A\) is universally valid if and only if \(\neg A\) is not satisfiable (1928: 73). In the above quotation, therefore, Hilbert and Ackermann are presenting two different but equivalent forms of the Entscheidungsproblem , one employing the concept of universal validity (or simply validity, as we would say today) and the other the closely related concept of satisfiability. The hunt was on for what they called “a determinate, finite procedure” (1928: 17) for deciding, of any given formula \(A\) of the functional calculus, whether or not \(A\) is universally valid, or, equivalently, for deciding whether or not \(\neg A\) is satisfiable.
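The duality Hilbert and Ackermann state (A is universally valid if and only if ¬A is not satisfiable) is easy to exercise in the propositional fragment, for which truth tables do give a decision method; as the rest of this supplement recounts, no such method exists for the full functional calculus. The encoding of formulas as Python predicates over tuples of truth values is our own, purely for illustration.

```python
# Truth-table decision method for propositional formulas, encoded as
# predicates over an assignment tuple. Exhibits the duality:
# valid(A)  <=>  not satisfiable(not A).

from itertools import product

def satisfiable(phi, nvars):
    return any(phi(v) for v in product([False, True], repeat=nvars))

def valid(phi, nvars):
    # A is universally valid iff ¬A is not satisfiable.
    return not satisfiable(lambda v: not phi(v), nvars)

excluded_middle = lambda v: v[0] or not v[0]   # F ∨ ¬F
contradiction = lambda v: v[0] and not v[0]    # F ∧ ¬F

print(valid(excluded_middle, 1))      # True
print(satisfiable(contradiction, 1))  # False
```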

Turing’s and Church’s provability formulation of the Entscheidungsproblem and the Hilbert-Ackermann formulation in terms of validity are in fact logically equivalent, as Church noted in 1936 (1936b: 41). This equivalence is a consequence of Gödel’s proof that (where \(A\) is any formula of the functional calculus) if \(A\) is universally valid then \(A\) is provable in the calculus. (The proof was presented originally in Gödel’s 1929 doctoral dissertation and then published as Gödel 1930.) Nevertheless, the provability version of the Entscheidungsproblem is arguably superior, since it asks a question about a specific axiom system, as do the allied problems of consistency and completeness. In modern treatments, the problems of consistency, completeness and decidability for an axiom system lie at the heart of proof theory, the area of modern logic founded by Hilbert.

In 1928, Hilbert and Ackermann were certainly aware of the provability formulation (1928: 86, and see below) but they gave the validity formulation the starring role. It was von Neumann who emphasized the provability formulation, calling the process of proof the “real heart of mathematics” (von Neumann 1927: 10).

B. Why the problem mattered

Why was it that the Entscheidungsproblem was regarded as “the main problem of mathematical logic” and “the most general problem of mathematics”? There were fundamentally two reasons.

In his turn-of-the-century Paris lecture, Hilbert explained the idea of axiomatizing a subject-matter and using provability from the axioms as the touchstone of truth:

When we are engaged in investigating the foundations of a science, we must set up a system of axioms which contains an exact and complete description of the relations subsisting between the elementary ideas of that science…. [N]o statement within the realm of the science whose foundation we are testing is held to be correct unless it can be derived from those axioms by means of a finite number of logical steps. (Hilbert 1900: 264 [trans. 1902: 447])

He also famously said in his lecture that there is “no ignorabimus ” in mathematics—there is nothing that we shall not know (Hilbert 1900: 262). It was a phrase he would often use to express his “conviction of the solvability of every mathematical problem” (Hilbert 1900: 262 [trans. 1902: 445]). Thirty years later he reiterated the same standpoint, in a valedictory lecture in Königsberg:

[I]n my opinion, there is no such thing as an unsolvable problem. On the contrary, instead of foolish ignorabimus, our slogan is: We must know, We will know. (Hilbert 1930b: 963)

It was Hilbert’s great invention, proof theory, that would, he thought, supply the basis for answering all “meaningful questions”:

[O]n the basis of the proof theory I have outlined, every fundamental question can be answered in a way that is unambiguous and mathematically precise. (Hilbert 1930a: 8, 9)

However, proving a statement in an axiom system can be a tricky matter. Suppose the mathematician fails to discover a derivation of the statement from the axioms, what follows then? The failure could be because the statement is indeed not provable—but, on the other hand, there might be a way to prove it that escaped the mathematician’s attention. Success in finding a proof brings certainty, whereas failure leaves us uncertain. That is why a decision method is so desirable. The method enables the mathematician to determine whether or not the statement is provable. Thinking up proofs is a serendipitous activity that involves, as Behmann put it, “inventive insight”, “creative thought”, and searching “in every direction of the compass” (Behmann 1921: 2–3 [trans. 2015: 174]). Whereas a decision method is a purely mechanical process that is guaranteed to produce a definite answer.

Herbrand wrote of the “present great ambition, the solution of the Entscheidungsproblem ”, saying this “would provide a general method in mathematics” and enable us to “to decide with certainty whether a proposition is true in a given theory” (Herbrand 1930b: 254–255). In his 1921 lecture, Behmann had referred to this desired general method as a “philosophers’ stone” (“ Stein der Weisen ”), by means of which mathematicians “would be able to solve every problem posed ”—or even “delegate the work of proving mathematical theorems to mathematical laborers” (Behmann 1921: 3–4 [trans. 2015: 175], emphasis added). Turing’s mentor Newman, familiar of course with the work of the Hilbert group, also referred to the solution of the Entscheidungsproblem as a philosophers’ stone. Hilbert and Ackermann themselves said, in 1928:

Solving this general Entscheidungsproblem [for the second-order functional calculus] would … enable us to decide on the provability or unprovability of any mathematical proposition , at least in principle. (Hilbert & Ackermann 1928: 86, emphasis added)

Schönfinkel went even further. He was another member of the Göttingen group who praised “Leibniz’s bold conjectures” (Schönfinkel 192?: 2). (For an excellent biographical article on Schönfinkel, see Wolfram 2021.) In an early draft of what would become Bernays and Schönfinkel (1928), Schönfinkel wrote of “the great Entscheidungsproblem of mathematical logic” which, thanks to Hilbert, mathematicians now had “the courage and the boldness to dare to touch”. He described the Entscheidungsproblem as “the problem of ‘solving all problems’”—not just all mathematical problems but all problems. Because once there is a method for solving all mathematical problems:

thereafter everything else is indeed lightweight, once this “Gordian knot” is undone (since “the world is written in mathematical characters”). (Schönfinkel 192?: 1–2)

The second reason the Entscheidungsproblem was regarded as so important was its connection with the quest for proofs of the consistency of mathematical systems. A system is said to be inconsistent if there is some statement \(A\) such that both \(A\) and \(\neg A\) are provable from the axioms. The system is consistent if (and only if) there is no statement for which this sorry situation is the case.

By the early twentieth century, mathematics—until then a “paragon of reliability and truth”, Hilbert said—was in trouble (Hilbert 1926: 170, trans. in van Heijenoort 1967: 375). Its reputation had been “lost due to the paradoxes of set theory” (Hilbert 1922: 160). Hilbert explained:

[C]ontradictions appeared, sporadically at first, then ever more severely and ominously…. In particular, a contradiction discovered by Zermelo and Russell [“Russell’s paradox”] had, when it became known, a downright catastrophic effect in the world of mathematics. (Hilbert 1926: 169 [trans. 1967: 375])

Herbrand takes up the story:

One must take great care … Set theory has produced famous examples [of inconsistencies] … But there is nothing to show us that similar issues do not arise for the theories that seem to us the most finished, like arithmetic. Could it be that arithmetic is inconsistent? (Herbrand 1930b: 250–251)

“[P]roofs of consistency would be very useful to dispel lingering doubts”, wrote Herbrand (1930b: 251). Hilbert put it even more forcefully:

[T]he situation in which we presently find ourselves with respect to the paradoxes is in the long run intolerable. (Hilbert 1926: 170 [trans. 1967: 375])

Proving “the consistency of the arithmetic axioms” is “urgent”, he said—“a burning question” (Hilbert 1926: 179 [trans. 1967: 383]; Hilbert 1930a: 3). Hilbert’s overarching concern was to “re-establish mathematics’ old reputation for incontestable truth” (Hilbert 1922: 160), and he was “convinced that it must be possible to find a direct proof of the consistency of the arithmetical axioms” (Hilbert 1900: 265).

A solution to the Entscheidungsproblem would furnish a route to establishing consistency: “By means of the decision method, issues of consistency would also be able to be resolved” (Hilbert & Ackermann 1928: 76). This is because the decision method could be used to establish whether or not \(A\) and \(\neg A\) are both provable—so gaining a definite answer to the question “Is the system consistent?”. Hilbert believed that, following the discovery of the decision method, arithmetic and analysis would be proved consistent, thereby “establish[ing] that mathematical propositions are indeed incontestable and absolute truths” (Hilbert 1922: 162). Mathematics’ reputation would be regained.

By the time Turing and Church engaged with the Entscheidungsproblem , a number of decision methods were known for parts of the functional calculus.

Besides the previously mentioned Hilbert-Bernays decision method for the propositional calculus (and Peirce's much less well-known method), and the Löwenheim method for the monadic fragment of the functional calculus (later improved by Behmann in his 1922 paper, which additionally proved that the monadic fragment of the second-order functional calculus is decidable), there were various decision methods that succeeded provided the functional calculus was restricted in one way or another. Bernays and Schönfinkel (1928) showed there is a decision method when formulae containing at most two individual variables are considered. At Cambridge, Ramsey devised a method that worked provided existential quantifiers were omitted from the calculus, and any universal quantifiers were stacked one after another at the very beginning of a formula (with no negation sign preceding any of them, and with their scope extending to the end of the formula). Such formulae are interesting, Ramsey pointed out, since they represent "general laws" (Ramsey 1930: 272; Langford 1926b: 115–116). Ackermann, Bernays, Schönfinkel, Gödel, Kalmár, and Herbrand devised methods for other fragments of the calculus, in which only certain patterns of quantifiers were permitted. Gödel's 1933 paper on the Entscheidungsproblem gave a summary of some of the developments.

Despite all this attention to the decision problem, no method had been found for the full functional calculus. But according to Hilbert it was just a matter of time. He and Ackermann emphasized that “finding a general decision process is an as yet unsolved and difficult problem” (1928: 77). They exuded optimism, writing buoyantly of moving “closer to the solution of the Entscheidungsproblem ” (1928: 81).

Even in the 1920s, however, negative opinion on the solvability of the Entscheidungsproblem was building. Behmann had pointed out in his 1921 lecture that, once the “philosophers’ stone” was found, “the entirety of mathematics would indeed be transformed into one enormous triviality ” (Behmann 1921: 5 [trans. 2015: 175]). Some found this consequence highly unpalatable. In 1927, Hilbert’s student Weyl insisted that “such a philosopher’s stone has not been discovered and never will be discovered”, and a good job too, Weyl thought, since if it were, “Mathematics would thereby be trivialized” (Weyl 1927: 20 [trans. Weyl 1949: 24]). Then a year later, in 1928, Hardy announced to the Cambridge mathematicians, assembled at his Rouse Ball Lecture, that he expected no “system of rules which enable us to say whether any given formula [is] demonstrable or not” (Hardy 1929: 16). “[T]his is very fortunate”, he continued, since if there were such a system,

we should have a mechanical set of rules for the solution of all mathematical problems, and our activities as mathematicians would come to an end. (ibid.)

Hardy explained that his description of Hilbert’s ideas was “based upon that of v. Neumann, a pupil of Hilbert’s”, saying that he found von Neumann’s exposition “sharper and more sympathetic than Hilbert’s own” (Hardy 1929: 13–14). Hardy was referring to von Neumann’s “Zur Hilbertschen Beweistheorie” ( On Hilbert’s Proof Theory ), published in 1927 but completed by the middle of 1925. Von Neumann said:

So it seems that there is no way to find the general decision criterion for whether a given normal formula [i.e., a well-formed formula with no free variables] is provable. (von Neumann 1927: 11, emphasis added)
The day that undecidability lets up, mathematics in its current sense would cease to exist; into its place would step a perfectly mechanical rule, by means of which anyone could decide, of any given proposition, whether this can be proved or not. (von Neumann 1927: 12)

But how to prove that there is no general decision criterion? Von Neumann confessed he did not know:

At present, of course, we cannot demonstrate this. Moreover, no clue whatsoever exists how such an undecidability proof would have to be conducted. (von Neumann 1927: 11)

Nor did he find a proof. With a hint of resignation he said in 1931 that the Entscheidungsproblem was “far too difficult” (1931: 121). Four years later, in 1935, he visited Cambridge for a term (arriving in April, a few weeks after Newman had lectured on the Entscheidungsproblem ) and he struck up an acquaintance with a young mathematician who admired his work (Copeland & Fan 2023). The young man was of course Alan Turing, who within a few months would devise his groundbreaking proof that—just as von Neumann had hypothesized—“It is generally undecidable whether a given normal formula is provable or not” (von Neumann 1927: 12). Was there discussion, during the spring of 1935, between von Neumann and Turing (the latter already primed by Newman’s lecture) about the Entscheidungsproblem —which was, after all, the main problem of mathematical logic? Did von Neumann perhaps play a role in Turing’s decision to take on this fundamental problem and prove it unsolvable? These are tantalizing questions, and we may never know for sure.

Gödel’s famous incompleteness theorems of 1931 placed unexpected new obstacles in the way of Hilbert’s desired consistency proof for arithmetic (Gödel 1931). Suspicion also began to build that Gödel’s incompleteness results might further imply the unsolvability of the Entscheidungsproblem . Herbrand said cautiously:

[A]lthough at present the possibility of a solution to the Entscheidungsproblem seems unlikely, its impossibility has not yet been proved. (Herbrand 1931a: 56)

Carnap, who took a close interest in Hilbert's Göttingen group and had worked with Behmann (Schiemer, Zach, & Reck 2017), wrote the following about the Hilbertian idea of a "definite criterion of validity" or "decision procedure for mathematics", using which "we could so to speak calculate the truth or falsity of every given proposition":

[B]ased on Gödel’s latest results, the search for a definite criterion of validity for the entire system of mathematics appears hopeless. (Carnap 1935: 163–4)

But, as Herbrand noted, nobody had proved the Entscheidungsproblem to be unsolvable. That was where Turing and Church entered the story. Newman later summed up matters as they had appeared at the time, before Church and Turing published their transformational papers in 1936:

A first blow was dealt [to the “Hilbert decision-programme”] by Gödel’s incompleteness theorem (1931), which made it clear that truth or falsehood of \(A\) could not be equated to provability of \(A\) or not-\(A\) in any finitely based logic, chosen once for all; but there still remained in principle the possibility of finding a mechanical process for deciding whether \(A\), or \(\neg A\), or neither, was formally provable in a given system. (Newman 1955: 256)

Turing summed up his show-stopping result in his usual terse way: he had shown, he said, that “the Hilbert Entscheidungsproblem can have no solution” (Turing 1936 [2004: 84]). Church, also not one to waste words, compressed his proof into a paper barely two pages long, and concluded pithily:

The general case of the Entscheidungsproblem of the engere Funktionenkalkül is unsolvable. (Church 1936b: 41)

In establishing this, Turing and Church showed moreover that there can be no decision method for the second-order functional calculus (where quantification is permitted over not just individuals but also over properties and relations), since the second-order calculus contains the first-order calculus as a fragment. The same applies to every other mathematical system containing the first-order calculus, such as arithmetic.

However, it is one of the great ironies of the history of logic that, all along, Gödel’s first incompleteness theorem of 1931 did in fact suffice to establish the undecidability of the functional calculus—although this was certainly not apparent at the time. More than three decades passed after the publication of Gödel’s paper before this corollary of his theorem was noted, by Davis (Davis 1965: 109; Kleene 1986: 136).

Naturally, this unnoticed implication of Gödel’s theorem does not diminish Turing’s and Church’s great achievement, which at the time broke new ground. However, in recent times their work is sometimes regarded as amounting to merely a smallish development of Gödel’s previously published results, on which Church’s and Turing’s work was supposedly based. This is particularly true of Turing. Turing, it is said, “merely reformulated Gödel’s work in an elegant way” (Schmidhuber 2012: 1638) and “recast” Gödel’s findings “in the guise of the Halting Problem” (Dawson 2006: 133). (For further discussion of these and similar views, see Copeland & Fan 2022.) In this connection, it is worth remembering the words of Kleene, who worked closely with Church and played a leading part in the development of computability theory in the 1930s. Kleene noted that

One sometimes encounters statements asserting that Gödel’s work laid the foundation for Church’s and Turing’s results

and commented:

Whether or not one judges that Church would have proceeded from his thesis to these [undecidability] results without his having been exposed to Gödel numbering, it seems clear that Turing in [“On Computable Numbers”] had his own train of thought, quite unalloyed by any input from Gödel. One is impressed by this in reading Turing [“On Computable Numbers”] in detail. (Kleene 1987: 491–2)

What became of the Entscheidungsproblem after the Church-Turing negative result?

In letters written in the wake of the result, Turing and Newman discussed the idea Newman had presented in his 1935 lecture, of a machine able to decide mathematical problems. Turing wrote, "I think you take a much more radically Hilbertian attitude about mathematics than I do", responding to Newman's statement that

If all this whole formal outfit is not about finding proofs which can be checked on a machine it’s difficult to know what it is about. (Turing c.1940 [2004: 215])

Turing noted that he saw no essential difference between a proof-checking machine and a proof-finding machine. He challenged Newman:

When you say “on a machine” do you have in mind that there is (or should be or could be, but has not been actually described anywhere) some fixed machine on which proofs are to be checked, and that the formal outfit is, as it were about this machine?

Turing called this an “extreme Hilbertian” position, and said:

If you take this attitude … there is little more to be said: we simply have to get used to the technique of this machine and resign ourselves to the fact that there are some problems to which we can never get the answer.

Turing rejected this attitude to mathematics, because, he said, “there is a fairly definite idea of a true formula which is quite different from the idea of a provable one”—mathematicians’ judgements about whether formulae are true can outrun the pronouncements of the Hilbertian machine. Turing continued in his letter:

If you think of various machines I don’t see your difficulty. One imagines different machines allowing different sets of proofs, and by choosing a suitable machine one can approximate “truth” by “provability” better than with a less suitable machine … (Turing c.1940 [2004: 215])

In Turing’s opinion, that was how the debate on the Entscheidungsproblem had panned out. The Hilbertians had wanted a single decision method for the whole of mathematics—a single computing machine or algorithm—whereas Turing showed there can be no such thing; but he emphasized that there are, nevertheless, different decision methods (machines) for different areas of mathematics (see further Copeland & Shagrir 2013). In place of the one great unsolvable decision problem, there are many lesser, but often solvable, decision problems.

In the long aftermath of the Church-Turing result, as those rough pioneering days gave way to modern computer science, Turing’s opinion became the mainstream view: Today, computer science makes use of a multiplicity of algorithms for deciding different parts of the functional calculus and other mathematical systems.
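Turing's point that there are workable decision methods for restricted systems, even though none exists for mathematics as a whole, can be illustrated concretely. Propositional logic, unlike the full functional calculus, is decidable by brute-force truth tables. The sketch below is our own illustration (the nested-tuple formula representation is an arbitrary choice, not anything from the historical sources):

```python
from itertools import product

def atoms(formula):
    """Collect the atomic proposition names occurring in a formula."""
    if isinstance(formula, str):
        return {formula}
    op, *args = formula
    return set().union(*(atoms(a) for a in args))

def evaluate(formula, valuation):
    """Evaluate a formula built from atoms and the connectives
    ('not', f), ('and', f, g), ('or', f, g), ('implies', f, g)."""
    if isinstance(formula, str):
        return valuation[formula]
    op, *args = formula
    if op == 'not':
        return not evaluate(args[0], valuation)
    if op == 'and':
        return evaluate(args[0], valuation) and evaluate(args[1], valuation)
    if op == 'or':
        return evaluate(args[0], valuation) or evaluate(args[1], valuation)
    if op == 'implies':
        return (not evaluate(args[0], valuation)) or evaluate(args[1], valuation)
    raise ValueError(f"unknown connective: {op}")

def is_tautology(formula):
    """Decide validity by checking every truth assignment."""
    names = sorted(atoms(formula))
    return all(evaluate(formula, dict(zip(names, vals)))
               for vals in product([False, True], repeat=len(names)))

# Peirce's law ((p -> q) -> p) -> p is valid; p -> q alone is not.
peirce = ('implies', ('implies', ('implies', 'p', 'q'), 'p'), 'p')
assert is_tautology(peirce)
assert not is_tautology(('implies', 'p', 'q'))
```

This is exactly the kind of "lesser, but often solvable, decision problem" Turing had in mind: the method is exhaustive and exponential in the number of atoms, but it always terminates with a correct verdict.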

Copyright © 2023 by B. Jack Copeland < jack . copeland @ canterbury . ac . nz >



The Stanford Encyclopedia of Philosophy is copyright © 2024 by The Metaphysics Research Lab , Department of Philosophy, Stanford University

Library of Congress Catalog Data: ISSN 1095-5054


Church-Turing Thesis

The Church-Turing thesis (formerly commonly known simply as Church's thesis) says that any real-world computation can be translated into an equivalent computation involving a Turing machine . In Church's original formulation (Church 1935, 1936), the thesis says that real-world calculation can be done using the lambda calculus , which is equivalent to using general recursive functions .
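The equivalence between the lambda calculus and other models can be glimpsed even inside an ordinary programming language. The sketch below, our own illustration rather than Church's notation, encodes natural numbers as Church numerals in Python lambdas and checks that the lambda-term definitions of successor and addition agree with ordinary arithmetic:

```python
# A Church numeral n is a function taking f and x to f applied n times to x.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by counting how many times it applies f."""
    return n(lambda k: k + 1)(0)

two   = succ(succ(zero))
three = succ(two)
assert to_int(two) == 2
assert to_int(add(two)(three)) == 5
```

Nothing here is specific to Python: the same terms make sense in the pure lambda calculus, which is the point of the equivalence results Church and Kleene established.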

The Church-Turing thesis encompasses more kinds of computations than those originally envisioned, such as those involving cellular automata , combinators , register machines , and substitution systems . It also applies to other kinds of computations found in theoretical computer science such as quantum computing and probabilistic computing.
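Model-equivalence claims of this kind are usually established by simulation, and a Turing machine needs remarkably little machinery to simulate. The following minimal single-tape simulator is our own illustrative sketch (the transition-table encoding and the binary-increment machine are assumptions of the example, not a standard construction); it increments a binary number:

```python
def run_tm(transitions, tape, state='start', blank='_', max_steps=10_000):
    """Simulate a one-tape Turing machine.
    transitions maps (state, symbol) -> (new_state, written_symbol, move),
    with move in {-1, +1}; the machine halts on a missing transition."""
    tape = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        symbol = tape.get(head, blank)
        if (state, symbol) not in transitions:
            break  # no applicable rule: the machine halts
        state, tape[head], move = transitions[(state, symbol)]
        head += move
    else:
        raise RuntimeError('step budget exhausted')
    cells = [tape.get(i, blank) for i in range(min(tape), max(tape) + 1)]
    return ''.join(cells).strip(blank)

# Binary increment: scan right to the end of the input, then carry leftward.
inc = {
    ('start', '0'): ('start', '0', +1),
    ('start', '1'): ('start', '1', +1),
    ('start', '_'): ('carry', '_', -1),
    ('carry', '1'): ('carry', '0', -1),
    ('carry', '0'): ('done',  '1', -1),
    ('carry', '_'): ('done',  '1', -1),
}
assert run_tm(inc, '1011') == '1100'   # 11 + 1 = 12
assert run_tm(inc, '111') == '1000'    # 7 + 1 = 8
```

Simulating a register machine, cellular automaton, or combinator reduction on such a machine (and vice versa) is tedious but routine, which is why these equivalence proofs are regarded as strong evidence for the thesis.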

There are conflicting points of view about the Church-Turing thesis. One holds that it can be proven; the other, that it serves as a definition of computation. There has never been a proof, but the evidence for its validity comes from the fact that every realistic model of computation discovered so far has been shown to be equivalent to the Turing machine model. A device that could answer questions beyond those a Turing machine can answer would be called an oracle .

Some computational models are more efficient than others, in computation time and memory, for particular tasks. For example, it is suspected that quantum computers can perform many common tasks with lower time complexity than modern computers, in the sense that for large enough instances of these problems a quantum computer would solve them faster than an ordinary computer. In contrast, there exist questions, such as the halting problem , which an ordinary computer cannot answer, and according to the Church-Turing thesis no other computational device can answer them either.
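The unsolvability of the halting problem rests on a diagonal argument that can be sketched in code. Suppose, hypothetically, a total function `halts(program, input)` existed; then a "troublemaker" program that does the opposite of whatever `halts` predicts about it would refute any verdict. The names and the one-line encoding below are ours, purely illustrative:

```python
def troublemaker_actually_halts(claimed):
    """Hypothetical construction: troublemaker(p) loops forever if
    halts(p, p) says 'halts', and returns immediately if it says 'loops'.
    Given halts' claimed verdict on troublemaker run on itself, return
    what troublemaker would actually do: the opposite."""
    return not claimed

# Whichever answer halts gives about troublemaker, the actual behavior
# differs, so no total, always-correct halts function can exist.
assert all(troublemaker_actually_halts(v) != v for v in (True, False))
```

The check is deliberately trivial: its point is that both possible verdicts lead to contradiction, which is the entire content of the diagonalization.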

The Church-Turing thesis has been extended to a proposition about the processes in the natural world by Stephen Wolfram in his principle of computational equivalence (Wolfram 2002), which also claims that there are only a small number of intermediate levels of computing power before a system is universal and that most natural systems are universal.

This entry contributed by Todd Rowland


Cite this as:

Rowland, Todd . "Church-Turing Thesis." From MathWorld --A Wolfram Web Resource, created by Eric W. Weisstein . https://mathworld.wolfram.com/Church-TuringThesis.html


Quantum Physics

Title: The Quantum-Extended Church-Turing Thesis in Quantum Field Theory

Abstract: The quantum-Extended Church-Turing thesis has been explored in many physical theories including general relativity but lacks exploration in quantum field theories such as quantum electrodynamics. Through construction of a computational model whose gate set mimics the interactions of QED, we demonstrate that one of the defining features of quantum field theory, particle creation and annihilation, is not likely to violate the quantum-Extended Church-Turing thesis. Through this computational model, it is shown that particle creation is likely only another form of quantum parallelism. However, whether or not the quantum-Extended Church-Turing thesis will hold for all computational devices in quantum field theories is still not known. For example, we briefly examine certain interactions in quantum electrodynamics which may create multi-qubit gates. These gates may have exponential complexity at the cost of being exponentially weak. This may in turn allow for computational advantage over traditional gate sets such as Clifford+T.


