The cognitive brain (neocortex) performs computational tasks that far exceed the capabilities of conventional computer systems. Furthermore, it is remarkably fast, given its slow biological components, and extremely energy efficient.
As a model of neocortical operation, a broad class of spiking neural networks encodes information in precise spike timing relationships across multiple parallel lines. These temporal neural networks process information as a wavefront of spikes that sweeps forward through the network. At the outputs, results are encoded in the same temporal format. The time it takes to compute a result is the result.
Recently proposed race logic performs computation using a feedforward network of AND gates, OR gates, and delays. Information is encoded as the times at which logic transitions take place, i.e., when a logic level changes from 0 to 1. Computation begins by initializing all logic levels to 0. Input values are then encoded as 0-to-1 transition times, and these transitions sweep forward through the logic network. Output transition times encode the results. The time it takes to compute a result is the result.
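Under this encoding, an AND gate outputs the later of its input transition times (max), an OR gate the earlier (min), and a delay element adds a constant. A minimal software sketch of that semantics (the function names and the use of infinity for a line that never transitions are illustrative, not from the talk):

```python
import math

NEVER = math.inf  # a line that never transitions from 0 to 1

def and_gate(*t):
    # output rises only once the last input has risen
    return max(t)

def or_gate(*t):
    # output rises as soon as the first input rises
    return min(t)

def delay(t, d):
    # a delay element shifts the transition time forward by d
    return t + d

# A tiny feedforward network with inputs transitioning at t=2 and t=5:
a, b = 2, 5
out = or_gate(delay(and_gate(a, b), 1), delay(a, 4))
```

Here `and_gate(2, 5)` yields 5, delayed by 1 gives 6; `delay(a, 4)` also gives 6; the OR of the two transitions at time 6, which is the computed result.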
A very broad family of temporal neural networks and a generalized form of race logic are isomorphic. One uses spike times for encoding information, the other uses logic transition times.
This means that we can design neural networks with widely used spiking neuron models and then implement them directly in hardware using conventional logic gates and design techniques. In the process, the speed and energy efficiency advantages are retained. Compared with conventional best-effort binary designs, this yields speedups of 2-3X and reductions in dynamic energy consumption of 10X or more. Furthermore, we can design certain other non-brain-like functions with similar improvements. Examples are sorting and graph long/short path problems.
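For instance, a single-source shortest path computation can be read off as the arrival times of a spike wavefront: each edge is a delay element, and a node fires the first time a spike reaches it. A software sketch of that temporal view (the graph and the function name are illustrative assumptions, not from the talk):

```python
import heapq

def wavefront_arrival(graph, source):
    """Simulate a spike wavefront through a weighted graph.
    Each edge (v, d) is a delay of d; a node 'fires' at the time
    the first spike arrives. The arrival times are exactly the
    single-source shortest path lengths from the source."""
    arrival = {source: 0}
    frontier = [(0, source)]          # (arrival time, node)
    while frontier:
        t, u = heapq.heappop(frontier)
        if t > arrival.get(u, float('inf')):
            continue                  # a spike already arrived earlier
        for v, d in graph.get(u, []):
            if t + d < arrival.get(v, float('inf')):
                arrival[v] = t + d
                heapq.heappush(frontier, (t + d, v))
    return arrival

g = {'a': [('b', 2), ('c', 5)], 'b': [('c', 1)]}
times = wavefront_arrival(g, 'a')
```

The spike reaches `b` at time 2 and `c` at time 3 (via `b`), so the time at which each output fires is itself the path-length result.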
The key principle is that time is used as a resource for communication and computation. Time is the ultimate physical resource. Time is freely available everywhere; it consumes no power and takes up no space. By using time as a physical implementation resource, we can use less of other physical resources, dynamic energy being the most significant.
Both temporal neural networks and generalized race logic are examples of spacetime computing paradigms. Spacetime computing, in turn, can be characterized by a unary algebra, that operates on unary encoded information using simple primitives: max, min, increment, inhibit. These primitives are complete with respect to the set functions computable by temporal neural networks and generalized race logic. So, if one accepts that temporal neural networks model the brain's operation, then the brain is a unary computer. And, by using generalized race logic we may eventually be able to achieve brain-like capabilities with conventional digital hardware.
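One concrete (illustrative) reading of those four primitives operates directly on event times, with infinity standing for an event that never occurs; note how max and min correspond to the AND and OR gates of race logic:

```python
import math

NEVER = math.inf  # an event that never occurs

def t_max(a, b):
    # later of two events (the AND gate of race logic)
    return max(a, b)

def t_min(a, b):
    # earlier of two events (the OR gate of race logic)
    return min(a, b)

def increment(a, d=1):
    # fixed delay applied to an event
    return a + d

def inhibit(a, b):
    """Pass event b only if the inhibiting event a has not arrived
    first; otherwise the output never fires. (A common reading of
    the inhibit primitive; details may differ from the talk.)"""
    return b if b < a else NEVER

# min/max together form a 2-input sorting network on event times:
x, y = 7, 3
lo, hi = t_min(x, y), t_max(x, y)
```

Because min and max sort two arrival times, networks of these primitives can sort many values in a single temporal sweep, which is one source of the efficiency gains described above.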
James E. Smith is Professor Emeritus in the Department of Electrical and Computer Engineering at the University of Wisconsin-Madison. He received his PhD from the University of Illinois in 1976. He then joined the faculty of the University of Wisconsin-Madison, teaching and conducting research, first in fault-tolerant computing and then in computer architecture. He has been involved in a number of computer research and development projects both as a faculty member at Wisconsin and in industry.
Prof. Smith made a number of early contributions to the development of superscalar processors. These contributions include basic mechanisms for dynamic branch prediction and implementing precise traps. He has also studied vector processor architectures and worked on the development of innovative microarchitecture paradigms. He received the 1999 ACM/IEEE Eckert-Mauchly Award for these contributions. Currently, he is studying new computing paradigms at home along the Clark Fork near Missoula, Montana.