## 1. The computational hypothesis

For centuries, science had one major formal modeling language: differential equations. Developing them took over 350 years, moving from intuitive, informal arguments to precise, formal proofs only at the end of the 19th century. Differential equations provided accurate numerical predictions and concise ways to state theories in many realms. There were always lacunae: the study of nonlinear dynamics and chaos theory, for example, began back in the 19th century. But something interesting and radical happened in the 1930s and 1940s: the idea of computation as a modeling language was developed. This was not the instrumental use of computation, for example, as a way of calculating approximate solutions to differential equations. The radical idea was the use of computation as a formal way to model process and “how to” knowledge, also known as *procedural* knowledge. Computation came to be seen as a language in which scientific theories could be expressed. Artificial Intelligence (AI) was the first field founded on this idea, in 1956.

The fundamental hypothesis of Artificial Intelligence is that computation is a useful way to model minds. What kind of computation? That remains an open question, although many constraints are becoming clearer. This hypothesis does not rule out explanations using differential equations, as computational models can contain them. The crucial point is that the language of computation is richer than the language of differential equations. Computation as a formal modeling language for cognition is a revolutionary notion, and subsequent progress in the field has proven its value, as discussed below.

The second field to adopt this idea was Cognitive Science, in 1978. There were (and are) many fields that study minds, each bringing valuable tools and perspectives. What was lacking was a common language into which their ideas could be recast, so that their insights could be combined into shared theories. Cognitive Science was founded on the idea that computation would be that common language. Computation provides new tools for exploring theories of cognition, creating a new form of simulation that can be used both to explain existing data and to predict new findings. One need only look at the early proceedings, and the first issue of *Cognitive Science*, to see this.

These two fields are probably not the last to make this intellectual bet. In current biology, there are signs of the same thinking emerging: traditional differential-equation models are being replaced with computational models, which provide more perspicuous accounts of phenomena in genetic regulatory networks and transcription processes.
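The contrast can be made concrete with a toy example. Below is a minimal sketch of a Boolean network, a classic computational (rather than differential-equation) model of genetic regulation; the three genes and their regulatory logic here are invented purely for illustration, not drawn from any real network.

```python
# A toy Boolean-network model of a three-gene regulatory network.
# Each gene is either ON (True) or OFF (False), and its next state is a
# logical function of the current states -- procedural knowledge expressed
# directly as computation.

def step(state):
    """One synchronous update of the (hypothetical) regulatory network.

    Invented regulatory logic for illustration:
      a is activated by c;
      b is activated by a;
      c is activated by a but repressed by b.
    """
    a, b, c = state["a"], state["b"], state["c"]
    return {
        "a": c,
        "b": a,
        "c": a and not b,
    }

def trajectory(state, steps):
    """Return the sequence of states visited, starting from `state`."""
    states = [state]
    for _ in range(steps):
        state = step(state)
        states.append(state)
    return states

if __name__ == "__main__":
    start = {"a": True, "b": False, "c": False}
    for s in trajectory(start, 4):
        print(s)
```

From the starting state above, this particular network falls into a two-state cycle, the kind of qualitative behavior (attractors, cycles) that such computational models are used to make perspicuous.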