Exploring the similarities and differences between the processes of computation in animal brains and digital computing

From Analog to Digital Computing: Does the Human Brain Have a Chance to Become a Turing Machine?

The formal description of the Universal Turing Machine is the abstract basis for modern computation. It is based on the manipulation of integers and logic symbols. In this contribution to the discourse on the computer-brain analogy, we discuss how analog computing as performed by mammalian neurons is similar to and different from digital computing as performed by Universal Turing Machines. We begin by considering that reality is a constant dialog between discontinuous and continuous worlds. Computing can be digital or analog, but it is usually a mixture of the two. Computers are essentially digital devices, yet analog devices can simulate many phenomena efficiently. In fact, all physical computation must be implemented in the real world, and is therefore analog to some degree, even when based on abstract logic. Artificial neural networks, modeled on the mammalian neuronal network, are digitally implemented but behave like analog devices. Analog constructions are able to compute by implementing a variety of feedback and feedforward circuits. Digital algorithms, on the other hand, allow for recursive processing that can generate unique emergent properties. We briefly illustrate how cortical neurons can make predictions and integrate signals in an analog fashion. We conclude that brains do not function as digital computers. However, we speculate that the recent implementation of writing in the human brain could be a digital path that slowly evolves the brain into a Turing machine.
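To illustrate the abstract's point that digitally implemented networks can behave like analog models, the sketch below simulates a leaky integrate-and-fire neuron, a standard analog model of how a cortical neuron integrates its inputs over time, using discrete digital steps. The function name and all parameter values are illustrative assumptions, not taken from the article.

```python
def simulate_lif(inputs, dt=1e-3, tau=0.02, v_thresh=1.0, v_reset=0.0):
    """Digitally simulate a leaky integrate-and-fire neuron: the membrane
    potential v continuously leaks toward rest while integrating input
    current, and a spike is emitted whenever v crosses the threshold.
    Parameter values are illustrative, not from the source article."""
    v = 0.0
    spike_times = []
    for step, current in enumerate(inputs):
        # Forward-Euler step of the analog dynamics dv/dt = (-v + current) / tau
        v += dt * (-v + current) / tau
        if v >= v_thresh:
            spike_times.append(step * dt)  # record when the neuron fired
            v = v_reset                    # reset the potential after a spike
    return spike_times

# A constant supra-threshold input makes the model fire at regular intervals,
# showing continuous (analog-like) integration realized in discrete steps.
print(simulate_lif([1.5] * 200))
```

The underlying model is continuous (a differential equation), while its execution here is digital: a small instance of the dialog between continuous and discontinuous worlds that the abstract describes.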

This essay examines the similarities and differences between the processes of computation in animal brains and digital computing. It does so by examining the fundamental properties of the Universal Turing Machine (UTM), the abstract foundation of modern digital computing. In this context, we attempt to distance 18th-century mechanical automata from modern machines by recognizing that when computing allows recursion, it changes the effects of determinism. Recursive computation is deterministic, but not always predictable: while it is possible to write an algorithm that computes the decimal digits of π, for large enough n the finite sequence of digits following the nth digit cannot (yet) be predicted without actually computing it, as the sketch below illustrates.
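A minimal sketch of the π example follows, using Gibbons' unbounded spigot algorithm (one well-known way to stream the decimal digits of π; the original text does not name a specific algorithm). Every digit is produced by a fixed deterministic recursion, yet there is no known shortcut to the digits beyond position n other than running the computation that far.

```python
def pi_digits():
    """Stream the decimal digits of pi using Gibbons' unbounded spigot
    algorithm: a deterministic recursion over six integer state variables."""
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while True:
        if 4 * q + r - t < n * t:
            # The next digit is now certain: emit it and rescale the state.
            yield n
            q, r, n = 10 * q, 10 * (r - n * t), (10 * (3 * q + r)) // t - 10 * n
        else:
            # Not enough information yet: consume one more term of the series.
            q, r, t, k, n, l = (q * k, (2 * q + r) * l, t * l, k + 1,
                                (q * (7 * k + 2) + r * l) // (t * l), l + 2)

# The first digits of pi: fully determined, yet obtainable only by computing them.
gen = pi_digits()
print([next(gen) for _ in range(15)])  # [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7, 9]
```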

Source:
https://www.frontiersin.org/articles/10.3389/fevo.2022.796413/full