GPU not being used in Renders.
[email protected]
Posts: 17
My GPU (GTX 1080) isn't being used in certain renders. Everything's up to date, and it seems random as to when it does and doesn't work. I've attached a snippet of what the log says when I try to render with exclusively the GPU on instances where it doesn't work.
Any help would be appreciated. Thank you!
Attachment: D3D GPU ERR.txt (7K)
Comments
There isn't enough memory for the scenes that fail: when VRAM runs out, Iray by default falls back to the CPU (you can turn that fallback off, in which case the render will simply stop).
How would I fix the issue of there not being enough memory, or the memory I do have not being read? It can happen between different renders in the same scene with the same elements. And I've done much more expensive renders that have worked fine.
Restarting DS can help; if a scene sometimes fails and sometimes works, it may be just marginal and depend on accumulated memory leaks/fragmentation. Otherwise, textures are usually the main consumer of memory (though high-density mesh, through high SubD levels or from strand-based/dForce hair for example, can sometimes be to blame), so resizing the textures may help (manually, or with a tool like Scene Optimiser).
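To see why resizing textures helps so much, here's a rough back-of-the-envelope sketch. The assumptions are mine, not Iray's documented behavior: textures held uncompressed at 4 bytes per pixel (RGBA), plus roughly one-third extra for mipmaps. Halving a texture's dimensions cuts its footprint to a quarter:

```python
# Rough VRAM estimate for an uncompressed texture.
# Assumptions (hypothetical, Iray's real layout may differ):
#   - 4 bytes per pixel (8-bit RGBA)
#   - mipmap chain adds ~1/3 on top of the base image
def texture_vram_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mipmaps else base

# A single 4096x4096 texture vs the same texture resized to 2048x2048:
full = texture_vram_bytes(4096, 4096)
half = texture_vram_bytes(2048, 2048)

print(f"4K texture: {full / 2**20:.0f} MiB")  # ~85 MiB
print(f"2K texture: {half / 2**20:.0f} MiB")  # ~21 MiB
```

A character with a dozen 4K maps (diffuse, normal, roughness, etc.) can therefore approach a gigabyte of VRAM on its own, which is why a few figures plus the environment can overflow an 8GB card even though no single asset looks large.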
Is there any way I could allocate more memory from my GPU? This card's got 8GB at the ready, and I don't think I've ever seen it fully utilized.
How do you know that it has never been fully utilized?
OK, the memory utilization is fine, my mistake. The utilization percentage is not; it fluctuates around 5-15%. I'm looking at the Windows 10 Task Manager to see all this.
If the GPU's utilization is only 5-15% during a render, the GPU is not being used for rendering, most likely because your scene requires more VRAM than is available. The rendering will then be done by the CPU, very slowly, unless CPU fallback has been disabled.