4.15 does not USE GPU! render refuses to render without CPU then leaves GPU idle


Comments

  • dragoneyes002 said:

    That is pretty much what I said: when it drops the GPU to run on the CPU, the GPU becomes an expensive form of extra RAM.

     That is not what is happening.

    The RAM is just being allocated; it is not contributing to the process in any way.

  • DrunkMonkeyProductions said:

    dragoneyes002 said:

    That is pretty much what I said: when it drops the GPU to run on the CPU, the GPU becomes an expensive form of extra RAM.

     That is not what is happening.

    The RAM is just being allocated; it is not contributing to the process in any way.

    That would be even worse.

  • jnwggs Posts: 89

    DrunkMonkeyProductions said:

    dragoneyes002 said:

    That is pretty much what I said: when it drops the GPU to run on the CPU, the GPU becomes an expensive form of extra RAM.

     That is not what is happening.

    The RAM is just being allocated; it is not contributing to the process in any way.

    Is this an issue with the Iray driver, or an issue with Daz Studio? That is the question, I think...

  • PerttiA Posts: 10,024

    jnwggs said:

    DrunkMonkeyProductions said:

    dragoneyes002 said:

    That is pretty much what I said: when it drops the GPU to run on the CPU, the GPU becomes an expensive form of extra RAM.

     That is not what is happening.

    The RAM is just being allocated; it is not contributing to the process in any way.

    Is this an issue with the Iray driver, or an issue with Daz Studio? That is the question, I think...

    Is it an issue at all?

    During my tests, DS said 3.8 GiB was used for the geometry before informing me that there was not enough VRAM to continue with the GPU. During the rest of the rendering, GPU-Z reported 5 GB of VRAM in use while the CPU was doing the rendering. Did the used VRAM matter? Not at all.

    If the OP wants to render scenes on the GPU, he/she must get the memory usage of the scene down to what fits in the available VRAM on the GPU; otherwise the only option is rendering slowly with the CPU.

  • That seems to not be true, PerttiA. Here, after changing to Win10 '20H2', I can show that even while using the CPU the GPU was running: the very thing I want to happen, but it's not consistent. And in the example below you will not be telling me that I had to REDUCE the geometry to get it! The reason I took that particular screenshot was that the CUDA usage was running at about 50% while I was away from the computer for a while, and after I moved the mouse it suddenly ran back up; not sure what was happening there. Getting back to our issue: you can see both the CPU and GPU running in tandem, exactly what I want to happen, and this scene has 4 characters, 4 cars, 4 city blocks, dozens of props, hundreds of materials, atmosphere and the environment. I'm thinking I'll reopen the scene and take a shot of the scene info just to be sure it exceeds the VRAM. Yup, 8.509 GB before it even tries to render. I'd say we've exceeded the VRAM pretty easily in this example.

    [Attachments: cuda taking a break.jpg, alienbattlesceneinfo.jpg (screenshots)]
  • Mart1n71 Posts: 129

    I notice in your first pic you use a few hairs from OOT. While you say the polygon count is not that high, have you looked at the texture memory usage in your log file? A lot of OOT hairs use 8K maps which, once unpacked into your GPU VRAM, use up a lot of space. Leony Hair, for example, uses only 22.6 MB of VRAM for geometry, but over twenty times that amount of memory for 17 separate texture maps, at 467.5 MB. Assuming each of the five sets of hair uses a similar set of textures, that's almost 2.5 GB of VRAM used just on hair; add in the texture files of all the characters you have, plus the scenery, and it seems you are simply running out of room. Reducing the 8K textures down to 4K brought the texture VRAM usage down to 270 MB, and that was only four out of the 17 textures that hair product uses.

    Daz has tried to fit the scene into your GPU one item at a time, allocating as much space as each item needs, until it comes across an item that will no longer fit. The whole scene must be on the card for the GPU to participate in the render; if it doesn't fit, Daz will look to drop to CPU rendering. As you have turned off CPU fallback, the CPU cannot render the scene because you have told it not to, so nothing renders. If you turn CPU fallback on again and the scene does not fit onto your GPU, it will still render, but the GPU will take no part in the render process. Your photos through the side of your case show low VRAM usage because the VRAM space is just being allocated (so it can't be used by another program) but is not actually being used.

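    To put rough numbers on the unpacking cost described above: if a map sits in VRAM as plain 8-bit RGBA (an assumption for illustration; Iray's actual internal layout may differ), its footprint is just width × height × 4 bytes. A minimal Python sketch:

    ```python
    # Rough per-map VRAM estimate, assuming uncompressed 8-bit RGBA in VRAM.
    # Iray's real storage format may differ; this is only a ballpark.
    def texture_mib(side_px: int, channels: int = 4, bytes_per_channel: int = 1) -> float:
        return side_px * side_px * channels * bytes_per_channel / 2**20

    print(f"8K map: {texture_mib(8192):.0f} MiB")  # ~256 MiB
    print(f"4K map: {texture_mib(4096):.0f} MiB")  # ~64 MiB, a quarter of the 8K cost
    ```

    Even if the real per-map numbers are smaller, the scaling holds, which is why a few 8K maps per item add up quickly.
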
  • jnwggs Posts: 89
    edited March 2021

    PerttiA said:

    Is it an issue at all?

    During my tests, DS said 3.8 GiB was used for the geometry before informing me that there was not enough VRAM to continue with the GPU. During the rest of the rendering, GPU-Z reported 5 GB of VRAM in use while the CPU was doing the rendering. Did the used VRAM matter? Not at all.

    If the OP wants to render scenes on the GPU, he/she must get the memory usage of the scene down to what fits in the available VRAM on the GPU; otherwise the only option is rendering slowly with the CPU.

    Yep, I think it is. I just ran 6 close-up renders of a G8F face, trying different angles. One after another they ran fine and fast using the GPU. I cleared the VRAM after each try, and I closed the render window and created a new one. On the 7th render, it decided to use the CPU instead and everything bogged down. I cancelled, saved the view, and then restarted Daz, and this time the exact same view rendered in seconds with the GPU. I should be able to clear the VRAM using the hotkeys meant for that every time, not just most of the time while once in a while it fails miserably. And I shouldn't have to shut down Daz Studio and restart it like that.

    Post edited by jnwggs on
  • PerttiA Posts: 10,024

    dragoneyes002 said:

    That seems to not be true, PerttiA. Here, after changing to Win10 '20H2', I can show that even while using the CPU the GPU was running: the very thing I want to happen, but it's not consistent. And in the example below you will not be telling me that I had to REDUCE the geometry to get it! The reason I took that particular screenshot was that the CUDA usage was running at about 50% while I was away from the computer for a while, and after I moved the mouse it suddenly ran back up; not sure what was happening there. Getting back to our issue: you can see both the CPU and GPU running in tandem, exactly what I want to happen, and this scene has 4 characters, 4 cars, 4 city blocks, dozens of props, hundreds of materials, atmosphere and the environment. I'm thinking I'll reopen the scene and take a shot of the scene info just to be sure it exceeds the VRAM. Yup, 8.509 GB before it even tries to render. I'd say we've exceeded the VRAM pretty easily in this example.

    You keep showing us screenshots when we are asking for the log file (a text file, not a screenshot of part of it), where we could see what's happening and explain it to you as well.

  • PerttiA Posts: 10,024

    jnwggs said:

    PerttiA said:

    Is it an issue at all?

    During my tests, DS said 3.8 GiB was used for the geometry before informing me that there was not enough VRAM to continue with the GPU. During the rest of the rendering, GPU-Z reported 5 GB of VRAM in use while the CPU was doing the rendering. Did the used VRAM matter? Not at all.

    If the OP wants to render scenes on the GPU, he/she must get the memory usage of the scene down to what fits in the available VRAM on the GPU; otherwise the only option is rendering slowly with the CPU.

    Yep, I think it is. I just ran 6 close-up renders of a G8F face, trying different angles. One after another they ran fine and fast using the GPU. I cleared the VRAM after each try, and I closed the render window and created a new one. On the 7th render, it decided to use the CPU instead and everything bogged down. I cancelled, saved the view, and then restarted Daz, and this time the exact same view rendered in seconds with the GPU. I should be able to clear the VRAM using the hotkeys meant for that every time, not just most of the time while once in a while it fails miserably. And I shouldn't have to shut down Daz Studio and restart it like that.

    In my tests, the amount of VRAM still reserved after one render has been done and the render window saved/closed:

    1. The base load from the OS = 200 MB
    2. The base load from DS = 170 MB
    3. The base load from the scene = XX (52 to 470 MB in my tests)
    4. Something quite consistent = ~2140 MB

    The last one doesn't seem to grow with subsequent renders, nor does it increase the amount of VRAM used during subsequent renders.

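    Treating those figures as a budget makes the headroom question concrete. A back-of-envelope sketch; the constants are the measured values reported in the post above, not universal numbers, and your card and OS will differ:

    ```python
    # VRAM left for geometry + textures after the fixed costs listed above.
    # 200 MB (OS) + 170 MB (DS) + ~2140 MB (consistent overhead) are one
    # user's measurements, not constants of the software.
    def scene_budget_mib(card_mib: int, os_mib: int = 200, ds_mib: int = 170,
                         overhead_mib: int = 2140) -> int:
        return card_mib - (os_mib + ds_mib + overhead_mib)

    print(scene_budget_mib(10 * 1024))  # a 10 GB RTX 3080 leaves roughly 7730 MiB
    ```
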
  • Leony and Alice hair were wonking out; both created warnings in the log files. As soon as I removed them, the scene stayed within the VRAM available. I think there might be a loop occurring if the hairs have multiple maps to choose from.

  • Mart1n71 Posts: 129

    For what it's worth, I loaded the first five G8F characters in my library into a scene and fitted them with the first five OOT hairs in my library, including Alice and Leony hair. I got the same error warnings in my log file that you did, but the scene rendered OK (RTX 3090). Using MSI Afterburner to monitor VRAM usage, those five characters with five hairs used just over 9 GB of VRAM. That's without any clothing, without any environment such as the water, the shoreline, trees etc., without any props, and without any lighting other than the sun/sky setup. Looking at the quality of rocks and bushes in your background, I seriously doubt you can get those items into the scene under 1 GB. The simple truth is the scene is too big for your card to hold. It will need to render on CPU (except you disabled that option), or you will need to do some optimisation. Reducing all the 8K image files in the OOT hairs to 4K saved me over a gig of VRAM; I'm sure that if you dropped all the texture files by half using a scene optimizer you would fit the scene on your card with no apparent loss in quality, except maybe the boat.

  • PerttiA Posts: 10,024

    Mart1n71 said:

    For what it's worth, I loaded the first five G8F characters in my library into a scene and fitted them with the first five OOT hairs in my library, including Alice and Leony hair. I got the same error warnings in my log file that you did, but the scene rendered OK (RTX 3090). Using MSI Afterburner to monitor VRAM usage, those five characters with five hairs used just over 9 GB of VRAM. That's without any clothing, without any environment such as the water, the shoreline, trees etc., without any props, and without any lighting other than the sun/sky setup. Looking at the quality of rocks and bushes in your background, I seriously doubt you can get those items into the scene under 1 GB. The simple truth is the scene is too big for your card to hold. It will need to render on CPU (except you disabled that option), or you will need to do some optimisation. Reducing all the 8K image files in the OOT hairs to 4K saved me over a gig of VRAM; I'm sure that if you dropped all the texture files by half using a scene optimizer you would fit the scene on your card with no apparent loss in quality, except maybe the boat.

    OS, DS and the base load of a scene are probably taking over 1 GB to start with, so the available VRAM would be less than 9 GB for the geometry, textures, frame buffer and work space, with the last two taking around 1.3 to 1.8 GiB in my tests.

    Reducing 8K textures and maps to 4K reduces their memory usage to one quarter of the original (memory scales with pixel count, and a 4096×4096 map has a quarter of the pixels of an 8192×8192 one).

  • Mart1n71 Posts: 129

    PerttiA said:

    OS, DS and the base load of a scene are probably taking over 1 GB to start with, so the available VRAM would be less than 9 GB for the geometry, textures, frame buffer and work space, with the last two taking around 1.3 to 1.8 GiB in my tests.

    Reducing 8K textures and maps to 4K reduces their memory usage to one quarter of the original (memory scales with pixel count, and a 4096×4096 map has a quarter of the pixels of an 8192×8192 one).

    Exactly. My test was on a fresh OS restart with only DS, one Firefox tab and background apps open. MSI Afterburner reported 1048 MB in use or allocated prior to render, and 9162 MB during render. The only optimization I did was reduce the 8K maps to 4K (which I do as a matter of course for OOT hairs, because I just don't need an 8K bump map in a 2K or 4K Daz render), after which Afterburner reported 8126 MB in use. Out of curiosity I just ran the scene optimizer on everything again, and VRAM went from 1236 MB pre-render to 6136 MB during render, saving around 3 GB of VRAM compared to the original. I still get the same error messages on loading the hairs, but I think these are related to OOT's specific hair shader and may add to load times; once loaded they perform fine.

    So the answer to the OP's original question is that the scene is too big (especially texture usage) to fit onto a 3080, and the CPU fallback option was unchecked, resulting in a blank render. The solutions are to either A) optimize the scene to reduce memory usage, or B) check the CPU fallback option and render via CPU. The OP may or may not know that just because the CPU fallback option is checked does not mean the CPU will be used all the time, as it has been unchecked in the Devices pane. The CPU is only used as a fallback if GPU rendering fails, for instance when the scene doesn't fit.

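    For anyone without a scene-optimizer product, the 8K-to-4K reduction described above can be approximated outside DS with a few lines of Python and Pillow. This is a generic sketch, not what any particular Daz product does internally; the folder path is hypothetical, and you should run it on a COPY of the textures since it overwrites files in place:

    ```python
    from pathlib import Path
    from PIL import Image  # pip install Pillow

    def downscale_textures(folder: str, max_side: int = 4096) -> None:
        """Resize any JPEG texture larger than max_side, overwriting in place."""
        for path in Path(folder).rglob("*.jpg"):
            with Image.open(path) as img:
                if max(img.size) > max_side:
                    scale = max_side / max(img.size)
                    new_size = (round(img.width * scale), round(img.height * scale))
                    img.resize(new_size, Image.LANCZOS).save(path, quality=92)

    downscale_textures(r"C:\textures_copy")  # hypothetical path: a COPY of the maps
    ```

    PNG and TIFF maps would need their own glob patterns and save options; the point is only that halving each side quarters the memory.
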
  • Mart1n71 said:

    PerttiA said:

    OS, DS and the base load of a scene are probably taking over 1 GB to start with, so the available VRAM would be less than 9 GB for the geometry, textures, frame buffer and work space, with the last two taking around 1.3 to 1.8 GiB in my tests.

    Reducing 8K textures and maps to 4K reduces their memory usage to one quarter of the original (memory scales with pixel count, and a 4096×4096 map has a quarter of the pixels of an 8192×8192 one).

    Exactly. My test was on a fresh OS restart with only DS, one Firefox tab and background apps open. MSI Afterburner reported 1048 MB in use or allocated prior to render, and 9162 MB during render. The only optimization I did was reduce the 8K maps to 4K (which I do as a matter of course for OOT hairs, because I just don't need an 8K bump map in a 2K or 4K Daz render), after which Afterburner reported 8126 MB in use. Out of curiosity I just ran the scene optimizer on everything again, and VRAM went from 1236 MB pre-render to 6136 MB during render, saving around 3 GB of VRAM compared to the original. I still get the same error messages on loading the hairs, but I think these are related to OOT's specific hair shader and may add to load times; once loaded they perform fine.

    So the answer to the OP's original question is that the scene is too big (especially texture usage) to fit onto a 3080, and the CPU fallback option was unchecked, resulting in a blank render. The solutions are to either A) optimize the scene to reduce memory usage, or B) check the CPU fallback option and render via CPU. The OP may or may not know that just because the CPU fallback option is checked does not mean the CPU will be used all the time, as it has been unchecked in the Devices pane. The CPU is only used as a fallback if GPU rendering fails, for instance when the scene doesn't fit.

    The answer I used was removing the two OOT hairs and replacing them with other hair; the scene dropped below the card's VRAM limit. The warnings may be attached to materials, but it's unlikely to be because 'they are 8K' maps. That was the original problem: the scene wasn't what should have exceeded the limit. A simple change of hair (which the program is having some issue with and warns you of) kept the scene below the limit. What the change didn't do is seen later in the last scene I posted, 'AlienBattle' above, which definitely exceeds the card's limit and rendered with BOTH the CPU and the GPU in tandem <<<< this is the way it should NORMALLY run, but that is not the case. The same person keeps posting conflicting (((opinions))): 'that's how it works', 'it doesn't do that', 'I never said it wouldn't', saying that once it exceeds the GPU VRAM it leaves the GPU behind and relies only on the CPU, which would preclude the CPU and GPU working in tandem. EXCEPT in the 'AlienBattle' scene the geometry alone counts around 9 million faces before all other usage, and the render ran BOTH the GPU and CPU; the CUDA usage showed the GPU was active with the CPU run up. Just to show there is a problem, I re-rendered the same scene and it fell back to CPU only. The change: 1 plane, 2 faces/two quads, 1 480p material.

  • Mart1n71 Posts: 129

    dragoneyes002 said:

    The answer I used was removing the two OOT hairs and replacing them with other hair; the scene dropped below the card's VRAM limit. The warnings may be attached to materials, but it's unlikely to be because 'they are 8K' maps.

    And I'm willing to bet these new hairs do not contain six 8K maps like Leony hair does. The errors are not related to the 8K maps. OOT uses his/her own shader system, not Iray Uber Base, so over time that shader may no longer be 100% compatible with new versions of Iray, hence the error messages. I loaded Leony hair, changed the material shaders from OOT Hairblender Shader to Iray Uber Base, copied all the maps over and saved it as a new hair figure. Upon loading the new hair, there were zero errors, almost instant loading times, the same texture files used, and the same amount of VRAM used to render. The warnings were only due to shader compatibility and were unrelated to your original issue.

    That was the original problem: the scene wasn't what should have exceeded the limit. A simple change of hair (which the program is having some issue with and warns you of) kept the scene below the limit. What the change didn't do is seen later in the last scene I posted, 'AlienBattle' above, which definitely exceeds the card's limit and rendered with BOTH the CPU and the GPU in tandem <<<< this is the way it should NORMALLY run, but that is not the case.

    Incorrect. If it was using both, it didn't exceed the card's available VRAM. (See below regarding the BS shown in the Scene Info tab.)

    The same person keeps posting conflicting (((opinions))): 'that's how it works', 'it doesn't do that', 'I never said it wouldn't', saying that once it exceeds the GPU VRAM it leaves the GPU behind and relies only on the CPU, which would preclude the CPU and GPU working in tandem.

    This is correct. Continuing to use the GPU even when the scene does not fit is called out-of-core rendering, and unless there has been a very recent change, Iray does not support it (if it suddenly did, the Daz Studio Discussion page would be on fire about it). Resources on your GPU will already have been allocated, so they will show as used by Windows' resource manager and won't be cleared fully until the program they were allocated to has closed completely. Whether those resources are actually being used for anything is another matter. It's like a seat reserved on a train or plane: it can't be used and is listed as taken even if no one is sitting in it. (A small monitoring sketch after this post shows how to watch that allocation from outside DS.)

    EXCEPT in the 'AlienBattle' scene the geometry alone counts around 9 million faces before all other usage, and the render ran BOTH the GPU and CPU; the CUDA usage showed the GPU was active with the CPU run up.

    In that case, the scene DID fit on the card. 9 million faces shown in the Scene Info tab means nothing for VRAM usage. I just loaded EIGHT G8Fs into a scene, removed all their textures and set their SubD levels to 4. My scene was showing over 36.5 million faces, around four times what you were using, yet it only used 8.1 GB of VRAM on render, including 2 GB already being used by my system for other things.

    Just to show there is a problem, I re-rendered the same scene and it fell back to CPU only. The change: 1 plane, 2 faces/two quads, 1 480p material.

    Stopping a render does not always clear the VRAM completely. If you have a scene that barely fits, then after using Iray preview mode for a while, or doing a couple of test renders, even without adding anything, Iray may eventually kick the scene down to CPU only. This has happened to me many times with my old 1080 Ti: after a couple of test renders with a scene reporting 10-11 GB, it would eventually fail. Closing Daz, waiting for the background processes to finish, restarting Daz and reloading the scene, suddenly it only takes up 7-8 GB of VRAM.

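    The allocated-versus-used distinction in the post above can be watched from outside DS. A minimal polling sketch with the NVML Python bindings (pip install nvidia-ml-py); note NVML reports memory that is reserved, the "booked seat", not memory a kernel is actively reading:

    ```python
    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
    try:
        while True:  # Ctrl+C to stop
            mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
            print(f"reserved {mem.used / 2**20:8.0f} MiB "
                  f"of {mem.total / 2**20:8.0f} MiB")
            time.sleep(2)  # poll while DS loads the scene and renders
    finally:
        pynvml.nvmlShutdown()
    ```

    Running this while loading a scene, rendering, and closing the render window makes it easy to see how much VRAM each step reserves and when it is actually released.
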
  • Richard Haseltine said: On not being able to start a new instance until the old one has finished closing: it is possible, but it requires that the instance have a new name; see Daz Studio Pro 4.12 - instances

    On that page there is a link to a dse file. I tried it and like it, but I would like to modify it. I have never worked with dse files, but I have some programming experience. I would like to edit or make my own version of that dse script. Is there source code for it, or an explanation of how to make it?

  • Richard Haseltine Posts: 100,747

    heinzerbrew_f94794efff said:

    Richard Haseltine said: On not being able to start a new instance until the old one has finished closing: it is possible, but it requires that the instance have a new name; see Daz Studio Pro 4.12 - instances

    On that page there is a link to a dse file. I tried it and like it, but I would like to modify it. I have never worked with dse files, but I have some programming experience. I would like to edit or make my own version of that dse script. Is there source code for it, or an explanation of how to make it?

    .dse files are not editable; they are encrypted. If you want a modified version, I'm afraid you would have to write it yourself.

  • Richard Haseltine said:

    .dse files are not editable; they are encrypted. If you want a modified version, I'm afraid you would have to write it yourself.

    Thanks for the reply, I figured out that much. That is why my question was, "Is there source code for that or an explanation of how to make it?" I looked at the API, and I don't think this will be trivial for me to figure out. Sadly, I don't have several days to learn how to make this myself just to save a few button clicks.

    Also, do you know if there is a way to prevent the missing-asset alerts that stop the scene loading? I want to load some old scenes and walk away while they load, and come back to a fully loaded scene. I don't care if it is missing materials or other stuff. Currently it stops to let me know stuff is missing, then stops again to tell me the stuff that was missing didn't load, and then I still have to click something to get it to finish loading.

    Also, thanks for being a great resource on this forum.

  • Maverick Posts: 16

    Hi everybody
    I just bumped into this thread while searching to solve my problem. As many above have described too, my brand new PC (i9 10900 / 48 GB of RAM) with a GeForce RTX 3060 Ti (about a month old) is rendering purely on CPU and completely ignoring the RTX 3060 Ti. As I'm always very cautious with new versions of software (and I read quite a bit about users of DS 4.14 having various issues), I'm still running DAZ Studio 4.12.1.117. However, it really puzzles me why DS isn't using my RTX 3060 Ti at all. I just made a most simple scene, with only 3 spotlights and a cube (DAZ primitive), and neither the Iray preview nor the render makes use of the GPU at all, while the CPU runs at about 90% usage. With this scene it is impossible that the dedicated 8 GB of VRAM is full (described above as a reason for a CPU fallback).
    So I assume there is either something wrong with my DS settings or there is a heavy incompatibility between my computer and DS.

    Although I read through the full thread above (to make sure I'm not asking a question that was already answered), I couldn't find any clue that would help me get my RTX 3060 Ti used in DS renders.
    Is there anything fundamental I have forgotten?

    Thanks for any help on this

    [Attachment: IRAY-Preview on CPU.jpg (screenshot)]
  • PerttiA Posts: 10,024
    edited March 2021

    Maverick said:

    Hi everybody
    I just bumped into this thread while searching to solve my problem. As many above have described too, my brand new PC (i9 10900 / 48 GB of RAM) with a GeForce RTX 3060 Ti (about a month old) is rendering purely on CPU and completely ignoring the RTX 3060 Ti. As I'm always very cautious with new versions of software (and I read quite a bit about users of DS 4.14 having various issues), I'm still running DAZ Studio 4.12.1.117. However, it really puzzles me why DS isn't using my RTX 3060 Ti at all. I just made a most simple scene, with only 3 spotlights and a cube (DAZ primitive), and neither the Iray preview nor the render makes use of the GPU at all, while the CPU runs at about 90% usage. With this scene it is impossible that the dedicated 8 GB of VRAM is full (described above as a reason for a CPU fallback).
    So I assume there is either something wrong with my DS settings or there is a heavy incompatibility between my computer and DS.

    Although I read through the full thread above (to make sure I'm not asking a question that was already answered), I couldn't find any clue that would help me get my RTX 3060 Ti used in DS renders.
    Is there anything fundamental I have forgotten?

    Thanks for any help on this

    DS 4.12 and Iray rendering does not work with your card; you need to use DS 4.14.0.8 or newer.

    https://www.daz3d.com/forums/discussion/comment/6200861/#Comment_6200861

    4.14.0.8 (November 10, 2020)

    • NVIDIA Iray
      • Integrated Iray 2020.1.1 (334300.4226); see this thread for more detail
        • REQUIRES: NVIDIA Driver 451.48 (or newer) on Windows; see NVIDIA Driver Downloads
          • NVIDIA recommends installing Studio Drivers
        • Adds support for Ampere GPUs (SM 8.0 / GA100)
    Post edited by PerttiA on
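
    A quick way to verify the driver requirement above without opening DS is to ask NVML for the installed driver version and compare it to 451.48. A hedged Python sketch (pip install nvidia-ml-py):

    ```python
    import pynvml

    MIN_DRIVER = (451, 48)  # minimum required by the Iray build in DS 4.14.0.8

    pynvml.nvmlInit()
    try:
        version = pynvml.nvmlSystemGetDriverVersion()
        if isinstance(version, bytes):  # older bindings return bytes
            version = version.decode()
        installed = tuple(int(part) for part in version.split("."))
        ok = installed >= MIN_DRIVER
        print(f"driver {version}:", "meets the 451.48 requirement" if ok
              else "too old for Ampere with this Iray build")
    finally:
        pynvml.nvmlShutdown()
    ```
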
  • Richard Haseltine Posts: 100,747

    heinzerbrew_f94794efff said:

    Richard Haseltine said:

    .dse files are not editable; they are encrypted. If you want a modified version, I'm afraid you would have to write it yourself.

    Thanks for the reply, I figured out that much. That is why my question was, "Is there source code for that or an explanation of how to make it?" I looked at the API, and I don't think this will be trivial for me to figure out. Sadly, I don't have several days to learn how to make this myself just to save a few button clicks.

    I'm not sure; I think it's mostly UI (various DzWidgets) plus, if it launches the new version of DS rather than providing a command line, DzProcess: http://docs.daz3d.com/doku.php/public/software/dazstudio/4/referenceguide/scripting/api_reference/object_index/process_dz (a rough external-launcher sketch follows this post).

    Also, do you know if there is a way to prevent the missing-asset alerts that stop the scene loading? I want to load some old scenes and walk away while they load, and come back to a fully loaded scene. I don't care if it is missing materials or other stuff. Currently it stops to let me know stuff is missing, then stops again to tell me the stuff that was missing didn't load, and then I still have to click something to get it to finish loading.

    Also, thanks for being a great resource on this forum.

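    For the scripted-launch side of that answer, the instances page linked earlier describes starting additional copies of DS with distinct instance names. A rough external sketch in Python; the install path is hypothetical, and the -instanceName argument is the one that page documents, so check it against your DS version:

    ```python
    import subprocess

    # Hypothetical install path; adjust to your machine.
    DS_EXE = r"C:\Program Files\DAZ 3D\DAZStudio4\DAZStudio.exe"

    def launch_instance(name: str) -> subprocess.Popen:
        """Start another DS instance; each running copy needs a unique name."""
        return subprocess.Popen([DS_EXE, "-instanceName", name])

    launch_instance("second")  # runs alongside the default instance
    ```
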
  • Maverick Posts: 16

    PerttiA said:

    DS 4.12 and Iray rendering does not work with your card; you need to use DS 4.14.0.8 or newer.

    https://www.daz3d.com/forums/discussion/comment/6200861/#Comment_6200861

    4.14.0.8 (November 10, 2020)

    • NVIDIA Iray
      • Integrated Iray 2020.1.1 (334300.4226); see this thread for more detail
        • REQUIRES: NVIDIA Driver 451.48 (or newer) on Windows; see NVIDIA Driver Downloads
          • NVIDIA recommends installing Studio Drivers
        • Adds support for Ampere GPUs (SM 8.0 / GA100)

    Oh, wow! Thank you for this information. I always thought Iray utilized OpenCL and it did not matter what GPU actually processed the code. I remember with my old machine (i7 2600 with a GeForce GT 545) that Iray rendering on the GPU stopped as soon as DS needed OpenCL 1.2 to work and my old GPU only ran OpenCL 1.1. Furthermore, I thought OpenCL worked like a kind of HAL between the software and the actual hardware beneath, and that the graphics card driver was the binding element between OpenCL and the GPU. However, it looks like OpenCL and its version are not the only relevant numbers in whether Iray works or not...

    Thanks again for the help.

  • zaselim_08eb9236 Posts: 64
    edited June 2021

    I am having the same issue even with only a single character; it was fine before 4.15, but after updating to 4.15 I am getting this issue. I have rendered heavier scenes before without any issues, and on GeForce drivers instead of Studio drivers, but now I can't render even a single character, even on Studio drivers.

    Right now I am trying to render my single character (without any other assets like background, foreground or environment). When I use the Iray viewport it uses my GPU, but when I click on Render (same resolution) it doesn't render, because I turned off CPU rendering. Do I need to downgrade my drivers to 451.48? That version is not available on NVIDIA's site; 452.06 is the oldest there, and I am using the latest 462.59.

    Post edited by zaselim_08eb9236 on
  • Richard Haseltine Posts: 100,747

    What is your GPU?

  • zaselim_08eb9236 Posts: 64
    edited June 2021

    Richard Haseltine said:

    What is your GPU?

    Oh sorry, forgot to mention: i7 6700K, GTX 1070, Corsair Vengeance 32 GB 3200 MHz DDR4, ADATA XPG Spectrix 2 TB M.2.

    Edit: So after trying different things, and after like 5 to 6 hours, the render is running now. What I did was open another scene (a lot bigger scene, about 2 GB of scene file size, which I rendered last year on 4.10/11/12, can't recall which one) and start the render, which by the way was working fine. Then I opened the scene (a single posed character) I wanted to render, and it started rendering. I believe it is some kind of Daz bug happening in 4.15; I just hope it gets fixed soon.

    Post edited by zaselim_08eb9236 on
  • Cenobite Posts: 206

    Never have an issue because I only do one render at a time. I watch my GPU work because it will change temp settings in use without looking at a profile; I have mine set to a colour scheme which accurately gauges temp readings and then displays them on my fans and my card. I have the CPU temps and GPU colour-range temps set up so I just look at my PC while rendering to tell if it's finished; it will cool back down to red when done.

    Lately though, the driver company has been doing some updates which keep changing my temp schemes, which is annoying because I always have to reset them. It's like they are trying to make you start the exe for click bait to get stuff to work again now; shifty IT programmers doing dumb crap to the hard code.

  • zaselim_08eb9236 Posts: 64
    edited June 2021

    Cenobite said:

    Never have an issue because I only do one render at a time. I watch my GPU work because it will change temp settings in use without looking at a profile; I have mine set to a colour scheme which accurately gauges temp readings and then displays them on my fans and my card. I have the CPU temps and GPU colour-range temps set up so I just look at my PC while rendering to tell if it's finished; it will cool back down to red when done.

    Lately though, the driver company has been doing some updates which keep changing my temp schemes, which is annoying because I always have to reset them. It's like they are trying to make you start the exe for click bait to get stuff to work again now; shifty IT programmers doing dumb crap to the hard code.

    I also do the same, one render at a time, and I have also set up my CPU/GPU temps with colors. But mostly my GPU/CPU doesn't go above 55°C (GPU) / 65°C (CPU) overall (gaming/rendering).

    I don't use the CPU in rendering, so it remains idle, and the GPU stays at 55°C mostly, going up and down a couple of degrees but staying under 60°C. While gaming my CPU never tops 65 to 67°C; on average it's 65°C, with the GPU at 55 to 56°C. I have a habit of checking them after every session, lol.

    Post edited by zaselim_08eb9236 on
  • I used to have the same problem, but I've found the fix for my case.

    Make sure to turn on Hardware-accelerated GPU scheduling; this fixed the issue for me.

    As you can see in the screenshot of Task Manager, my GPU usage when rendering maxed out quite well.

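    For reference, that toggle lives in Windows Settings > System > Display > Graphics settings, and it is backed by a registry value (changing it requires a reboot). A read-only Python sketch to check its state; HwSchMode 2 = on, 1 = off, on Windows 10 2004 or later:

    ```python
    import winreg  # Windows-only standard-library module

    KEY_PATH = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
        try:
            value, _ = winreg.QueryValueEx(key, "HwSchMode")
            print("Hardware-accelerated GPU scheduling is",
                  "ON" if value == 2 else "OFF")
        except FileNotFoundError:
            # Value absent: the feature was never toggled or is unsupported.
            print("HwSchMode not set")
    ```
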
  • Hi, I read that the Iray preview causes it. Solved by setting up the scene as I need and saving, then turning Daz off and on, loading the scene (it must not be in the Iray preview) and running the render; the render then goes to the GPU. (I know it's annoying, but it helps.)

  • ALMS Digital Posts: 5
    edited July 2021

    kevinso2001 said:

    I used to have the same problem, but I've found the fix for my case.

    Make sure to turn on Hardware-accelerated GPU scheduling; this fixed the issue for me.

    As you can see in the screenshot of Task Manager, my GPU usage when rendering maxed out quite well.

    +1 to this simple statement. Hardware-accelerated GPU scheduling was off by default, and I just had to turn it on.

    Post edited by ALMS Digital on