Hacker News

> It should be intuitive that analog is more efficient for representing analog abstractions (neural networks, which inherently copy analog structures).

That's not necessarily true; again, see the DARPA project you listed.

> Digital hardware adds an additional abstraction layer built out of components that do not behave like the desired structures (neurons) and are basically running simulations.

The analog hardware doesn't necessarily behave like the desired components either (for example, due to manufacturing variation). With digital, you get to choose your accuracy, and with much greater control.
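To make that concrete, here's a minimal sketch (Python, illustrative only; `quantize` is my own hypothetical helper, not any library's API): in a digital design, precision is a parameter you pick, and the worst-case error is bounded by construction rather than by device physics.

```python
def quantize(x, bits):
    """Round x to a fixed-point grid with `bits` fractional bits."""
    scale = 2 ** bits
    return round(x * scale) / scale

x = 0.7154
# Precision is a free design choice: more bits, smaller guaranteed error.
for bits in (4, 8, 16):
    err = abs(x - quantize(x, bits))
    assert err <= 2 ** -(bits + 1)  # rounding error bound holds by construction
```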

> Digital is more efficient for some problems in the sense that it's the only way to achieve precise computational results, but even when it comes to mathematical arithmetic, the circuitry to add two analog signals is much faster and simpler (two wires and a resistor), as is the design of a multiplier (one transistor).

Unless you don't care about noise at all, your design will be much more complicated than a single transistor. Either way, it's actually much easier to do this sort of "approximate computing" in the digital domain.
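As an illustration of digital approximate computing, consider a truncated multiplier: drop the low-order bits of each operand before multiplying. The hardware gets cheaper, and unlike analog noise, the error is deterministic and bounded. A hypothetical sketch (Python; the function and names are mine, not from any specific design):

```python
def approx_mul(a, b, drop_bits=4):
    """Approximate integer multiply: zero the low `drop_bits` bits of each
    operand first. Cheaper in hardware, with a deterministic, bounded error --
    unlike analog noise, which is random and drifts with temperature."""
    mask = ~((1 << drop_bits) - 1)
    return (a & mask) * (b & mask)

exact = 1000 * 1000            # 1000000
approx = approx_mul(1000, 1000)  # operands truncated to 992 before multiplying
rel_err = (exact - approx) / exact  # under 2% here, and bounded by design
```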

> It's definitely a complex subject, but I don't see why you would find it so difficult to believe -- the most intelligent machines we know of are analog and decentralized (like human brains).

A bird has feathers, yet our most efficient airplanes don't use them; there's no reason at all to believe that just because the brain is analog, our designs should be as well.


