> Combine this with Andrew Ng, whose own student Adam Coates matched Google's cat detector network with 64 consumer-grade GPUs, and we are in for interesting times...
Adam Coates has done far more impressive things than that. The "cat detector" was an uninteresting piece of research and not very impressive at all: its ImageNet results were poor, and nets trained on one or two GPUs were doing much more interesting things.
In Hinton's Coursera class, which was very good, he made a strong point about GPUs being the way to go (at least for now) in building deep learning networks.
Why are they still using GPUs? Maybe hardware companies should start making AI-targeted processors instead. Could a dedicated processor boost performance tenfold?
GPUs currently outperform distributed computing for deep-learning training, but it will be interesting to see how MPI and CUDA will be combined in the future (people are already working on that, but it's still early):
http://on-demand.gputechconf.com/gtc/2014/presentations/S465...
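The MPI+CUDA combination usually means data-parallel training: each GPU computes gradients on its own minibatch shard, then an allreduce averages them so every worker applies the same update. A minimal pure-Python sketch of just that averaging step (function and variable names are illustrative, not from the linked talk; a real setup would call MPI_Allreduce, ideally CUDA-aware so buffers stay on the GPU):

```python
# Sketch of the gradient-averaging (allreduce) step in data-parallel
# training. In a real MPI+CUDA setup each worker's gradients live on
# its GPU and MPI_Allreduce performs this reduction across nodes.

def allreduce_average(worker_grads):
    """Average per-parameter gradients across all workers."""
    n_workers = len(worker_grads)
    n_params = len(worker_grads[0])
    return [
        sum(grads[i] for grads in worker_grads) / n_workers
        for i in range(n_params)
    ]

# Two workers computed gradients for the same 3 parameters:
grads = [
    [0.25, -0.5, 1.0],  # worker 0
    [0.75, -0.5, 1.0],  # worker 1
]
avg = allreduce_average(grads)
print(avg)  # [0.5, -0.5, 1.0] -- identical on every worker
```

The communication cost of this step is what makes the single-node multi-GPU setups discussed above competitive with much larger distributed clusters.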
Combine this with Andrew Ng, whose own student Adam Coates matched Google's cat detector network with 64 consumer-grade GPUs, and we are in for interesting times...
http://www.stanford.edu/~acoates/papers/CoatesHuvalWangWuNgC...