2.6 The Principle of Computational Equivalence
Summary
- The Principle of Computational Equivalence and the Principle of Computational Irreducibility (covered in Lesson 2.2) are the two main ideas proposed by Stephen Wolfram and described in his book, A New Kind of Science (2002).
- The Principle of Computational Equivalence is the more general of the two principles and implies the Principle of Computational Irreducibility.
- The Principle of Computational Equivalence states that once a system exhibits complex behavior, it is computationally as sophisticated as any other system in the universe; in that sense, all such systems are computationally equivalent.
- Following this principle, cellular automata such as the Rule 110 cellular automaton, the human brain, and even the evolution of weather systems are all computationally equivalent.
- Observing how simple programs such as the Rule 30 and Rule 110 cellular automata evolve (simple rules that produce very complex behavior) can offer insight into how rules govern systems in nature (see the minimal simulation sketched after this list).
- The computer revolution was launched by the discovery of the universal computer, which showed that a single piece of hardware can run any computation you want simply by changing its software.
- Computational irreducibility can be seen as a way to reconcile determinism and free will: the underlying rules are fixed, but their outcomes cannot be known in advance; you only learn them by watching the rules unfold.
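
To make the idea of "watching simple rules unfold" concrete, here is a minimal Python sketch of an elementary cellular automaton run with Rule 30 and Rule 110. It follows the standard Wolfram rule encoding (the rule number's binary digits give the new cell value for each of the eight possible three-cell neighborhoods); the wrap-around edges and the text-based output are simplifying assumptions for illustration, not part of Wolfram's presentation.

```python
def step(cells, rule):
    """Apply one update of an elementary cellular automaton.

    `cells` is a list of 0/1 values; the row wraps around at the edges.
    `rule` is the Wolfram rule number (0-255).
    """
    n = len(cells)
    new = []
    for i in range(n):
        left = cells[(i - 1) % n]
        center = cells[i]
        right = cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right  # value 0-7
        new.append((rule >> neighborhood) & 1)  # look up that bit of the rule
    return new


def run(rule, width=79, steps=30):
    """Print the evolution of `rule` starting from a single live cell."""
    cells = [0] * width
    cells[width // 2] = 1  # one live cell in the middle of the row
    for _ in range(steps):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells, rule)


if __name__ == "__main__":
    print("Rule 30:")
    run(30)
    print("\nRule 110:")
    run(110)
```

Running the script prints each generation as a row of `#` and `.` characters, so the complex patterns produced by these very simple rules can be seen directly as they unfold.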