Many things are cyclical. Some things are a one-way street. Digitization covers some things that shouldn't be reversed, and some that should and will.

I started work early enough to have used an analog computer. Analog computers use physical analogs of things to simulate them. For example, you can simulate heat flow through a wall by using a battery to provide a voltage as an analog of the temperature difference and a resistor as an analog of the wall's insulation. If you want a better result, you can simulate the heat capacity of the wall with a capacitor. A well-designed analog will produce a useful result.

The best thing about analogs is that in some cases they are infinitely fast. Imagine writing a computer simulation of the convection currents in a glass of water. You could build a supercomputer to simulate every atom's behavior digitally. Your program could include local sources of heat and take account of viscosity, chemical reactions among the impurities, and everything else you can think of. You might decide to account for the movement of the earth and the Coriolis forces it would generate as the currents make the water move. If you want ridiculously precise results, you could simulate the effects of every planet in the solar system on atomic movements. You could account for magnetic forces, electrostatic ones and so on. By now, your biggest supercomputer would only be able to simulate the glass of water for a few microseconds before it is replaced by an upgrade. You can do it, but it isn't ideal.

The analog alternative is to pour a glass of water and watch it. Every atom, every subatomic particle in that glass, will instantaneously and continually account for every physical interaction with every passing photon and every other particle in the universe, taking full account of space-time geography and the distance of each particle.
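The battery-resistor-capacitor analogy above can be sketched digitally too, which makes the mapping explicit. Here is a minimal sketch of that RC analog in Python; the component values are hypothetical, chosen only for illustration:

```python
# Electrical analog of heat flow through a wall (a sketch; all values hypothetical).
# The mapping between domains:
#   temperature difference (K) -> voltage (V)      [the battery]
#   heat flow (W)              -> current (A)
#   insulation (K/W)           -> resistance (ohm) [the resistor]
#   heat capacity (J/K)        -> capacitance (F)  [the capacitor]

def simulate_wall(delta_t, r_thermal, c_thermal, duration, steps=1000):
    """Step the RC analog forward in time: the 'capacitor' (the wall's heat
    capacity) charges toward the driving 'voltage' (temperature difference)."""
    dt = duration / steps
    wall_temp = 0.0  # wall starts at the cold-side temperature
    for _ in range(steps):
        heat_flow = (delta_t - wall_temp) / r_thermal   # Ohm's law analog: I = V / R
        wall_temp += heat_flow * dt / c_thermal         # capacitor charging: dV = I*dt/C
    return wall_temp

# Hypothetical numbers: 20 K across the wall, R = 2 K/W, C = 100 J/K, 10 minutes.
print(simulate_wall(20.0, 2.0, 100.0, duration=600.0))
```

The point of the analogy is that the real circuit settles to this answer continuously and immediately, while the digital version has to grind through every timestep.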
It would work pretty well; it would be a good analog, even though it's probably a glass of different water from a different tap. It will give you a continuous model, at almost zero cost, that works perfectly and greatly outperforms the digital one. Analog wins.
If you want to add 2+2, an analog computer will give you a result of roughly 4. The next time, it will still be roughly 4, but slightly different. A digital one will always give precisely 4, unless you've messed up badly somewhere. Digital wins.
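The contrast can be made concrete with a toy model. This sketch stands in for an analog adder by summing two "voltages" with a little Gaussian noise (the noise level is a made-up stand-in for component tolerance and drift):

```python
import random

def analog_add(a, b, noise=0.01):
    """Toy analog adder: the sum of two 'voltages', plus a little random
    noise standing in for component tolerance and thermal drift
    (the noise figure is hypothetical)."""
    return a + b + random.gauss(0, noise)

def digital_add(a, b):
    """Digital adder: exact, and identical on every run."""
    return a + b

print(analog_add(2, 2))   # roughly 4, and slightly different each time
print(digital_add(2, 2))  # always exactly 4
```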
It is obvious that digital has some advantages, and analog does too. Analog is less reproducible, liable to drift, not always transparent, and has many other faults that eventually led to it being replaced for most purposes by digital computing. Yet the truth remains that, if you want to simulate the behavior of water, a glass of water has more processing power than all the digital computers ever built put together.
Digital and analog processing are both used in nature. In vision, the retina sends an essentially digital stream of data to the brain. In IT, pretty much all communication is done digitally, as is storage of data. It is far easier to repair the degradation that occurs over time or during transmission that way. If a signal level has shrunk slightly, it will still be clear whether it is a 1 or a 0, so it can be corrected: reset to the right level and re-transmitted or stored. For an analog signal, degradation simply accumulates until the signal disappears. Digital wins in most of IT.
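The regeneration step described above is just a threshold decision, which a few lines of Python can illustrate (the signal levels here are invented for the example):

```python
def regenerate(levels, threshold=0.5):
    """Digital repeater sketch: each degraded level is still unambiguously
    a 1 or a 0, so it can be snapped back to a clean value at every hop."""
    return [1.0 if v > threshold else 0.0 for v in levels]

# A transmitted or stored signal whose levels have sagged and picked up noise:
degraded = [0.81, 0.12, 0.74, 0.05, 0.93]
print(regenerate(degraded))  # [1.0, 0.0, 1.0, 0.0, 1.0] -- fully restored

# An analog signal has no reference levels to snap back to, so each hop's
# attenuation and noise simply accumulate until the signal is lost.
```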
But back to analog. Much of the processing in many electronic circuits and systems is done in the analog domain before digital takes over for transmission or computation. Even computer motherboards, graphics cards, fans and power supplies contain resistors and capacitors, and even a transformer can be thought of as an analog device. So analog processing and devices are still with us, just hiding behind the scenes.
I think analog computing will make a comeback, albeit in certain niches. Imagine a typical number-crunching problem for supercomputers, such as simulating heat and force transfer. Imagine making an actual analog of it using some futuristic putty and exposing that putty to actual forces and heat. If there were nano-sensors embedded throughout, you could measure the transfer of forces and heat directly and not have to calculate it. Again the speed advantage of analog would return. Now suppose a hybrid machine with some such analogs and some digital programming too. Those bits best left to digital could be done digitally, while wherever a real analog could be made, it could shortcut the number-crunching requirements tremendously. The overall speed might be dramatically improved without sacrificing integrity. Furthermore, the old problem of drift faced by analog systems could be reduced or almost eliminated by frequent cross-referencing and calibration as the system runs.
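The cross-referencing idea can be sketched as a feedback loop: let a (simulated, hypothetical) analog stage run at full speed while a slow digital reference periodically measures its accumulated drift and subtracts it. Everything here — the drift rate, noise level, and calibration interval — is an invented illustration, not a design:

```python
import random

def read_analog(true_value, drift):
    """Hypothetical analog sensor: fast, but its output wanders as drift accumulates."""
    return true_value + drift + random.gauss(0, 0.001)

def run_hybrid(true_value, steps, calibrate_every=10):
    """Sketch of the hybrid scheme: every few steps a digital reference
    computes the exact value, and the measured offset is subtracted from
    subsequent analog readings."""
    drift = 0.0
    offset = 0.0  # correction learned at the last calibration
    readings = []
    for step in range(steps):
        drift += 0.01  # drift accumulates every step
        if step % calibrate_every == 0:
            offset = read_analog(true_value, drift) - true_value
        readings.append(read_analog(true_value, drift) - offset)
    return readings

vals = run_hybrid(true_value=5.0, steps=100)
print(max(abs(v - 5.0) for v in vals))  # error stays bounded instead of growing
```

Without the calibration step the error would grow to about 1.0 over the run; with it, the error is capped at roughly the drift accumulated between calibrations.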
Finally, analog may well have a powerful place in consciousness and AI realization. Many people believe AI would be best done using adaptive analog neurons. Until today I was one of them. However, I am starting to doubt that, and this second look at analog has made me realize a bit more about consciousness techniques, so I will divert from this piece forthwith to write more on conscious computing.