31st October 2018, 23:54 | #53504
nevcairiel
Registered Developer
 
Quote: Originally Posted by LigH
I already wondered ... Neural Networks in a GPU implementation?! Want!
Wut? Most Neural Networks rely on GPUs to run at any reasonable speed, because GPUs are actually pretty good at that: the workload can be parallelized really well. Not to mention newer hardware with specialized units built just for that task, e.g. Tensor Cores.
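To illustrate the point, a minimal sketch (assuming PyTorch is installed and a CUDA-capable GPU is present; the exact numbers depend entirely on your hardware): the very same convolution, run on CPU and then on GPU.

Code:
import time
import torch

x = torch.randn(1, 3, 1080, 1920)            # one 1080p RGB frame
conv = torch.nn.Conv2d(3, 64, kernel_size=3, padding=1)

def bench(device):
    m, inp = conv.to(device), x.to(device)
    if device == "cuda":
        torch.cuda.synchronize()              # keep the timing honest
    t0 = time.perf_counter()
    with torch.no_grad():
        m(inp)
    if device == "cuda":
        torch.cuda.synchronize()
    return time.perf_counter() - t0

print("cpu :", bench("cpu"))
if torch.cuda.is_available():
    print("cuda:", bench("cuda"))

On most machines the GPU run is one or two orders of magnitude faster, which is the whole reason these networks live on GPUs in the first place.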

We had NNEDI3 in madVR for a while, a Neural Network based scaler. But it was removed because it was ultimately decided that the added complexity in madVR was not worth it (it was the only component to use OpenCL, IIRC), and other algorithms could outright replace it at higher speed with comparable or better quality.
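For anyone wondering what a "Neural Network based scaler" even does: here is a toy sketch of the general idea only, not NNEDI3's actual architecture or weights. The network below is untrained, so its output is garbage until you train it; it just shows the structure of predicting each inserted scanline from a small window of known pixels.

Code:
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyLineDoubler(nn.Module):
    """Predicts each inserted scanline from the lines above and below it."""
    def __init__(self, taps=4):
        super().__init__()
        self.taps = taps
        self.net = nn.Sequential(              # tiny MLP, hypothetical size
            nn.Linear(2 * (2 * taps + 1), 16), nn.ReLU(),
            nn.Linear(16, 1),
        )

    def forward(self, img):                    # img: (H, W) grayscale, 0..1
        t = self.taps
        padded = F.pad(img, (t, t))            # pad columns for the window
        # windows[r, c] = the 2t+1 horizontal neighbours at row r, column c
        windows = padded.unfold(1, 2 * t + 1, 1)      # (H, W, 2t+1)
        above, below = windows[:-1], windows[1:]      # rows r and r+1
        feats = torch.cat([above, below], dim=-1)     # (H-1, W, 2*(2t+1))
        new_rows = self.net(feats).squeeze(-1)        # one pixel per slot
        out = torch.empty(2 * img.shape[0] - 1, img.shape[1])
        out[0::2] = img                        # keep the original lines
        out[1::2] = new_rows                   # insert the predicted ones
        return out

print(ToyLineDoubler()(torch.rand(270, 480)).shape)   # torch.Size([539, 480])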

Neural Networks can be really powerful, and who knows, maybe one will return to madVR some day. But the real challenge with Neural Networks is training them: you need absolutely massive compute power. All those fancy NVIDIA demos you see for image processing are trained on supercomputers, large clusters with hundreds if not thousands of NVIDIA GPUs. Of course, no mere mortal has those kinds of resources to train such a network.
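To see why training is the expensive part: every training step runs a forward pass and a backward pass over a batch, repeated for millions of batches over a huge dataset, while playback inference is a single forward pass per frame. A minimal sketch (PyTorch again; the model and the random "dataset" are just stand-ins, not anything NVIDIA actually uses):

Code:
import torch
import torch.nn as nn

# Stand-in "enhancement" model and fake data; real setups are far bigger.
model = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
                      nn.Conv2d(64, 3, 3, padding=1))
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

for step in range(100):                 # real trainings run millions of steps
    degraded = torch.rand(8, 3, 64, 64) # fake low-quality input batch
    clean = torch.rand(8, 3, 64, 64)    # fake high-quality targets
    loss = loss_fn(model(degraded), clean)
    opt.zero_grad()
    loss.backward()                     # the backward pass: the extra cost
    opt.step()                          # that inference never pays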
__________________
LAV Filters - open-source ffmpeg-based media splitter and decoders

Last edited by nevcairiel; 1st November 2018 at 00:04.