Not really. FPGAs are fundamentally digital and pretty much give you a bunch of logic gates to work with ("Field-Programmable Gate Array"). The author's proposed architecture would instead provide an array of components that perform analog operations, such as summing, multiplication, and integration or differentiation, over analog voltages.
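For a rough sense of what such an array would compute, here's a toy simulation (the block names, the gain, and all values are made up for illustration) of two of those analog operations, a summing/scaling junction feeding an integrator, which is the classic analog-computer wiring for solving dx/dt = -k*x:

```python
import math

# Toy simulation of two hypothetical analog blocks: a summing/scaling
# junction and an integrator, wired in a loop to solve dx/dt = -k * x.
# (Block names and constants are illustrative, not from any real part;
# a real analog array would do this continuously, not in time steps.)

def simulate(k=2.0, x0=1.0, dt=1e-4, t_end=1.0):
    x = x0
    for _ in range(int(t_end / dt)):
        # summing/scaling block: produces the derivative signal -k*x
        dx = -k * x
        # integrator block: accumulates the derivative over time
        x += dx * dt
    return x

print(simulate(), math.exp(-2.0))  # both come out ≈ 0.135
```

On real analog hardware the "time step" disappears: the integrator is a capacitor accumulating charge, so the solution just evolves in real time instead of being iterated.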
FPGAs are fundamentally analog, depending on whether 'fundamental' means what was in the designer's head or what was actually fabricated. You're thinking about them, and using them, as if they were digital.
Adrian Thompson at Sussex University used a genetic algorithm to auto-design FPGA circuits in the mid-90s. Since no one told the GA that FPGAs were supposed to be logic circuits, it happily used the FPGA as an analog machine.
Even an Intel i7 chip is an analog machine that approximately implements the i7 computer design. They throw away the ones that don't approximate it up to tolerance.
Obviously. But good luck implementing a human-comprehensible analog differential-equation solver on one without the help of a genetic algorithm, and one that doesn't depend (as Adrian Thompson's circuit did) on the temperature, on the quirks of that specific board, and on the effects of components that weren't even physically wired to it.
The difference isn't that FPGAs don't operate on analog voltages deep down (who said they don't?). The difference is in the set of tools and tolerances they give you, and in that respect an FPGA is an analog coprocessor only in the sense that a car can, technically, be used as a sailboat.
I think it's an interesting reminder that the world we live in is wholly analog, even though it includes systems with discrete sets of equilibrium states. Using digital devices to perform analog functions looks like hacking rather than something one would normally be advised to do. The reason is that while such devices may all behave identically, as designed, in the digital domain, their analog characteristics can vary wildly between versions and even between individual specimens, as well as with, say, temperature, and in a completely unspecified way. Any attempt at serious analog design based on them would seem impractical.
Maybe it was the wrong term. My point was that we have them and they are used, but they are not cost-efficient in general. So saying that they will be the future of computing is a bit ridiculous imo.