Hacker News

The software runs on Nvidia hardware. Not sure if all their servers are fitted with one or more GPU cards...

I think it is more likely that this is just research software.



Deep neural networks are usually trained on GPUs, which provide a huge speed-up and are much cheaper per unit of compute. However, once trained, they can easily be run on CPUs for inference. I don't see why they wouldn't have access to GPUs in production, though.
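To make the train-on-GPU/run-on-CPU point concrete, here's a hypothetical numpy sketch: the "model" is a tiny logistic regression standing in for a real network, and the expensive training loop is the part you'd put on a GPU. Everything here (the data, the training loop, the `predict` helper) is invented for illustration; the takeaway is just that the trained parameters are plain arrays that any CPU can apply cheaply.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Training" phase -- the stand-in for the expensive GPU part.
# A real DL framework would run this loop on the GPU.
X = rng.normal(size=(200, 4)).astype(np.float32)
true_w = np.array([1.0, -2.0, 0.5, 3.0], dtype=np.float32)
y = (X @ true_w > 0).astype(np.float32)   # labels: sign of a linear rule

w = np.zeros(4, dtype=np.float32)
for _ in range(500):                       # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w)))    # sigmoid activations
    w -= 0.1 * (X.T @ (p - y)) / len(y)   # logistic-loss gradient step

# "Inference" phase: a handful of multiply-adds, trivially CPU-friendly.
def predict(x, w=w):
    return 1.0 / (1.0 + np.exp(-(x @ w)))

train_acc = np.mean((predict(X) > 0.5) == (y > 0.5))
```

The asymmetry is the whole story: training touches the data thousands of times, inference touches each input once, so shipping only the trained weights to CPU-only servers is a perfectly reasonable production setup.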


Well, it depends: you can certainly have some servers with Nvidia hardware without fitting all of them with it. (Amazon offers GPGPU instances on EC2, for example.)


Amazon only offers very costly Tesla and Quadro cards on their GPGPU instances. At scale, if you don't need the extra memory on the Tesla cards, what you want are commodity NVIDIA GeForce cards, which cost a fraction as much.


Standard cards are handicapped for double-precision operations, though (1/8 speed, afaik).


You don't need double precision for typical DL tasks, such as training a convnet.


Double precision is overrated :) (at least for ML)
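A quick numpy sketch of why (illustrative numbers, not a benchmark, and the problem itself is made up): solve the same small least-squares fit in float64 and float32 and compare the recovered weights. For well-conditioned problems of the kind ML optimizers chew on, the single-precision answer is practically indistinguishable, which is why the throttled FP64 units on GeForce cards rarely matter for deep learning.

```python
import numpy as np

rng = np.random.default_rng(42)

# A well-conditioned toy regression problem (hypothetical data).
X = rng.normal(size=(1000, 8))
w_true = rng.normal(size=8)
y = X @ w_true + 0.01 * rng.normal(size=1000)

# Same fit in double and in single precision.
w64, *_ = np.linalg.lstsq(X, y, rcond=None)
w32, *_ = np.linalg.lstsq(X.astype(np.float32),
                          y.astype(np.float32), rcond=None)

# Worst-case disagreement between the two solutions.
max_gap = np.max(np.abs(w64 - w32.astype(np.float64)))
```

The gap is tiny next to the noise already in the labels, so the extra FP64 precision buys nothing here. (Ill-conditioned scientific computing, e.g. some linear solvers, is a different story; that's where Tesla-class FP64 throughput earns its price.)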



