DS fails to de-allocate GPU memory

(Rendering in Iray)

Using both the current release and the beta of Daz Studio, I frequently have to restart DS because it fails to de-allocate GPU memory. There are two different ways I run into this problem.

The first is when I am working on a scene that fits in GPU memory. While working on the scene I often do test renders, and it seems that after a few of them I run out of memory and the render falls back to CPU only. I close my test renders after each test. In the beginning, the memory is returned once the test render is closed: in terms of percentage of usage, 70% (as an example) might be used with the render window open, but when it is closed usage drops back to a baseline of 8%. Over time the baseline seems to increase; whether it depends on time or on the number of renders I'm not sure, but I believe it is the number of renders. Eventually, the memory is no longer recovered at all and stays stuck around 80 or 90 percent.

The other situation is when the scene is too big to begin with. I think the memory might be recovered sometimes, but it seems like it almost never is if the scene exceeds GPU memory.

Regardless of how I get into this de-allocation problem, once it happens the GPU is no longer used during rendering until I restart DS.

I'm trying to find out if this is a known bug, and if there is something I can do to avoid it. I have had this problem since I began GPU rendering about a year ago. I can provide logs and system specs if needed. Also, if it's not a known bug, is it worth contacting Daz about it?

Comments

  • The issue with memory not being freed up after it is exhausted is a known issue. Check to see if you are still getting the cumulative effect in the latest public beta, which includes an Iray update, and if you are please report it.

  • Kevin Sanderson Posts: 1,643

    As Richard said, it's a known issue. In another recent thread a user was having memory problems rendering an image sequence, with rendering dropping back to CPU after the first image rendered on the GPU (that was before the current beta was released, though). The suggestions were to use an Nvidia driver that has worked for some, 388.59, and to try not using OptiX, as it has reportedly been having issues.

  • Yes, it is still a problem for me with the current beta. The beta "seems" to use more memory to render the same scene. I will try turning off OptiX, and if that doesn't work I guess I will contact support. I'm not sure I want to mess with drivers right now; I would hate to break a different program just to fix this one.

    I'm not sure if the cumulative effect happened on both or not; it seems a little random in general. I will probably need to get more scientific about documenting my problems. Currently, I'm working from a fading memory and frustration.

  • Toonces Posts: 919

    Yeah, the same happens to me. I've just gotten used to closing/reopening Daz to clear the memory.

    It's a bummer that the beta seems to use more. I'm often right at the limit of my 8GB cards, so if the beta uses more, I might have to wait until the 1180s are released before upgrading Daz.

  • fastbike1 Posts: 4,077
    edited July 2018

    This is not a problem for everybody, so it likely has something/more to do with an individual setup than with a generic Studio error.

  • ebergerly Posts: 3,255

    Yeah, it's unfortunate, considering the solution is probably fairly simple for NVIDIA to implement. Especially since they're the ones who developed CUDA in the first place. 

    Please, NVIDIA, just add some "cudaFree()" statements as needed to free up memory. And it would be nice if they provided a user button to do it on demand. Or maybe it's something they expect the Iray user to implement. 

  • The memory creep does happen on the beta as well for me. I canceled the first two renders early, and the GPU memory was cleared down to 8%; after the third, it only went down to 14%. The differences between renders were camera changes, plus two morph changes before the third render. For this test, there didn't seem to be anything in the logs indicating a problem.

    I still need to contact Daz and try downgrading my driver.

  • In addition, I have tried downgrading the driver to 388.59 and turning off OptiX.

    Here is the response from support. It basically states that it's Nvidia's problem. There is nothing like being the potato in a game of tech support hot potato. Regardless of who is right, the potato usually gets dropped.

    Hi Christopher,

    I spoke to our Devs about this. They said that they will look into this further, however, if it happens with the NVIDIA Iray render engine, this is an issue that NVIDIA has to fix. 

    They did ask that you test this when the next Iray build is implemented in Daz Studio. (It will be released with a new build of Daz Studio.) You will be able to see this in the Change Log on daz3d.com. Here is a link:

    http://docs.daz3d.com/doku.php/public/software/dazstudio/4/change_log

    Please note that we do not have an ETA on when the Iray render engine will be updated and we do not know what General Release of Daz Studio this will be implemented in.


  • ebergerly Posts: 3,255
    edited August 2018

    Yes, as I said, it is up to NVIDIA to implement something in the CUDA code to de-allocate memory. It is standard practice with just about every piece of software ever developed since the beginning of time. You need memory to run your software, so you grab memory (RAM or VRAM) using an "allocate" command, then when you're done you "de-allocate" so others can use it. In CUDA the command is "cudaFree". 

    Now I'm guessing a wrinkle is determining when to free the memory. You don't want to do it too soon, since it took a whole lot of work to load the memory in the first place, so if it's possible you'll need it later you don't want to get rid of it. Which is why I suggested maybe a user button you can press to do a "cudaFree" operation. 
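    For anyone curious what that cycle looks like in practice, here is a minimal, generic CUDA sketch (a toy example of my own, not Iray's actual code; it assumes the CUDA toolkit is installed and is built with nvcc):

    -----------------

    /* The allocate / de-allocate cycle: grab VRAM with cudaMalloc, hand it back with cudaFree. */
    #include <cuda_runtime.h>
    #include <stdio.h>

    static void reportFree(const char *label)
    {
        size_t freeB = 0, totalB = 0;
        cudaMemGetInfo(&freeB, &totalB);              /* how much VRAM is free right now? */
        printf("%s: %zu MiB free of %zu MiB\n", label, freeB >> 20, totalB >> 20);
    }

    int main(void)
    {
        reportFree("start");

        float *d_scene = NULL;
        cudaMalloc((void **)&d_scene, 1024u << 20);   /* "allocate": take 1 GiB of VRAM */
        reportFree("after cudaMalloc");

        /* ... a renderer would do its work with d_scene here ... */

        cudaFree(d_scene);                            /* "de-allocate": give it back */
        reportFree("after cudaFree");
        return 0;
    }

    -----------------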

  • fastbike1 Posts: 4,077

    Cancelling the renders won't really help unless you close the associated render window(s).

    The memory creep does happen on the beta as well for me. I canceled the first two renders early, and the GPU memory was cleared down to 8%; after the third, it only went down to 14%. The differences between renders were camera changes, plus two morph changes before the third render. For this test, there didn't seem to be anything in the logs indicating a problem.

    I still need to contact Daz and try downgrading my driver.


  •  

    Cancelling the renders won't really help unless you close the associated render window(s).

    In my first post, I state the following: "I close my test renders after each test." Also, in the blockquote you made, I state that initially the memory is cleared; it wasn't until later that it stopped being cleared. It seems obvious to me that I must have been closing the windows even if I didn't explicitly state that I did. The whole problem is that when I close the window the memory isn't recovered. What is the point of your statement? Are you telling me that closing my windows will fix the problem of my VRAM not being released when I close my windows? Well, OK, I tried that; it didn't work.

    This is not a problem for everybody, so it likely has something/more to do with an individual setup than with a generic Studio error.

    Obviously, it isn't a problem for everybody. What is the point of stating this? The problem I have is not in my imagination; it occurs for other people with different graphics cards and in other software using Iray. Support believes it is an issue for Nvidia to solve.

    If you discover a solution to my problem other than waiting for Nvidia to update Iray and then waiting for Daz to add the update (which is support's solution), then I would be glad to hear it.


  • ebergerly said:

    Yes, as I said, it is up to NVIDIA to implement something in the CUDA code to de-allocate memory. It is standard practice with just about every piece of software ever developed since the beginning of time. You need memory to run your software, so you grab memory (RAM or VRAM) using an "allocate" command, then when you're done you "de-allocate" so others can use it. In CUDA the command is "cudaFree". 

    Now I'm guessing a wrinkle is determining when to free the memory. You don't want to do it too soon, since it took a whole lot of work to load the memory in the first place, so if it's possible you'll need it later you don't want to get rid of it. Which is why I suggested maybe a user button you can press to do a "cudaFree" operation. 

    From my research, some people believe that some aspect of Iray is "hanging", and that because Iray doesn't stop completely, or stops incorrectly, the memory is never freed, which is essentially the same thing you are saying. Maybe they could run a cudaFree before starting a render.

    I'm not sure that Daz could implement a cudaFree button, but it sure would be a nice feature if it were possible. I suspect that doing so would create a memory access violation.

  • ebergerly Posts: 3,255

    I'm not sure that Daz could implement a cudaFree button, but it sure would be a nice feature if it were possible. I suspect that doing so would create a memory access violation.

    Yeah, I think you're right. Iray would need Mr. CUDA to 'fess up and divulge where in VRAM it had stored everything so that Iray could clear it out. So CUDA would have to make the memory location pointers accessible. Though that might be relatively easy since Iray and CUDA are from the same company and presumably they could work out a deal. 

    Anyone can go in today and write a very simple C/CUDA code to free memory (even I've done it), but you need more info about where and how much for the particular app that's using it.
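    To make that concrete, here is another minimal toy sketch (again mine, not Daz or Iray code): a separate "cleaner" program can release everything it allocated itself, for example via cudaDeviceReset(), but it has no pointers into another application's allocations, so it cannot hand back the VRAM that Daz Studio is holding.

    -----------------

    /* A separate process can only free its OWN allocations. cudaDeviceReset()
       tears down this process's CUDA context and releases its VRAM, but it
       cannot touch memory held by another running application (e.g. DS/Iray). */
    #include <cuda_runtime.h>
    #include <stdio.h>

    int main(void)
    {
        size_t freeB = 0, totalB = 0;
        cudaMemGetInfo(&freeB, &totalB);
        printf("before: %zu MiB free\n", freeB >> 20);

        void *p = NULL;
        cudaMalloc(&p, 512u << 20);      /* this process grabs 512 MiB           */
        cudaDeviceReset();               /* releases OUR context and allocations */

        cudaMemGetInfo(&freeB, &totalB); /* other apps' VRAM usage is unchanged  */
        printf("after:  %zu MiB free\n", freeB >> 20);
        return 0;
    }

    -----------------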


  • charles Posts: 810

    On Windows 10, open Task Manager and check GPU memory under the Performance tab. If it is high and needs to be flushed, close Daz Studio. Check the process list; if Daz Studio is still running, wait for it to finish or End Task on it. Check the Performance tab again and see whether the memory is released. Otherwise you may have OTHER programs using too much of it, like Dropbox. It's best to have two video cards: one for rendering and a cheap one for display and other GPU junk tasks.
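    If you'd rather read the same numbers outside Task Manager, here is a rough sketch using Nvidia's NVML library (assuming you have the CUDA toolkit's nvml.h and can link against NVML; this is not part of Daz Studio). It prints overall VRAM usage and which compute processes are holding memory:

    -----------------

    /* Query GPU 0's VRAM usage and list the compute processes holding it.
       Build: link against NVML (nvml.lib on Windows, -lnvidia-ml on Linux). */
    #include <nvml.h>
    #include <stdio.h>

    int main(void)
    {
        if (nvmlInit() != NVML_SUCCESS) return 1;

        nvmlDevice_t dev;
        nvmlDeviceGetHandleByIndex(0, &dev);            /* first GPU */

        nvmlMemory_t mem;
        nvmlDeviceGetMemoryInfo(dev, &mem);
        printf("VRAM used: %llu MiB of %llu MiB\n", mem.used >> 20, mem.total >> 20);

        /* Processes holding VRAM through CUDA (rendering shows up here; plain
           display contexts are listed by nvmlDeviceGetGraphicsRunningProcesses). */
        nvmlProcessInfo_t procs[32];
        unsigned int count = 32;
        if (nvmlDeviceGetComputeRunningProcesses(dev, &count, procs) == NVML_SUCCESS)
            for (unsigned int i = 0; i < count; ++i)
                printf("pid %u: %llu MiB\n", procs[i].pid, procs[i].usedGpuMemory >> 20);

        nvmlShutdown();
        return 0;
    }

    -----------------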


  • Some of this is well beyond me and I don't want to appear to be piling on.  I am running W10 and have Task Manager open almost 100% of the time.  I also have Notepad++ giving me pretty constant feedback (I think it is someone's script that lets me get on-the-fly DAZ Studio help log updates).  I experience both of the scenarios described at the outset of this thread when rendering.  I believe I have the latest versions of everything I can.  And I have an RTX 2070 that does nothing but rendering and a GTX 660 Ti that supports my monitors and helps render smaller scenes.

    And, as of 9/9/20, I experience de-allocation errors for both cards ALL THE TIME.  If it's a setup thing, I sure would like to know what others are doing/have done to avoid this problem.  For me, at least, it is not a little inconvenience.  (The whole save, exit, watch Task Manager to ensure everything is closed properly, restart, reload, render process is a bit tedious.)

    Just saying.

  • I have this all the time. 4.12 is unusable to me because of the way the new software affects my memory and drops to CPU.

    I downgraded to 4.11, where Iray uses less memory and I can use my work, but dForce hair and strand-based hair don't work in 4.11.

    In 4.12, even successive renders of simple scenes would get worse until it dropped to CPU.


  • gioloi Posts: 57
    edited August 2021

    Re-opening this topic just to say that in 4.15 the issue hasn't been solved yet.

    I have an Nvidia 1080 with 8 GB of VRAM and work with Windows 10 Home.

    When I load a scene for the first time, the free GPU memory is 6783.3 MB. I can render smoothly with the GPU.

    After the render is complete and I close the render window, the free GPU memory is 4783.7 MB, that is, 2 GB (!!!) less. In such conditions, if I try to re-render the scene it goes to CPU. Even if I close the scene with "New", the free memory only comes back up to 5106.6 MB.

    The only solution is to close and re-open DS every time. Extremely annoying indeed.

  • If Nvidia doesn't put in a 'cudaFree()' command at the end of rendering, you'd have thought that DS would, because DS knows when rendering is finished - whether cancelled or completed normally. I have done similar things in some of my programs when I know I'm dealing with DLLs that fail to free memory after use, and actively triggered the DLL's memory-freeing method once I know the job has been done. I have never heard of trying to free memory when there is none to free being a problem, as it should be one of the safe commands...
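    As a small illustration of that last point, here is a generic CUDA sketch (hypothetical, not DS code) of a cleanup routine that is safe to trigger even when there is nothing left to free, since cudaFree(NULL) is documented as a no-op:

    -----------------

    /* Cleanup that can be triggered at the end of every render (or cancel)
       without worrying about whether anything is actually left to free. */
    #include <cuda_runtime.h>
    #include <stdio.h>

    static float *d_buf = NULL;              /* hypothetical scene buffer */

    static void releaseSceneBuffer(void)
    {
        cudaError_t err = cudaFree(d_buf);   /* no-op if d_buf is already NULL */
        if (err != cudaSuccess)
            fprintf(stderr, "cudaFree: %s\n", cudaGetErrorString(err));
        d_buf = NULL;                        /* guard against a double free */
    }

    int main(void)
    {
        cudaMalloc((void **)&d_buf, 256u << 20);  /* take 256 MiB of VRAM */
        /* ... render ... */
        releaseSceneBuffer();                     /* job done: hand it back */
        releaseSceneBuffer();                     /* calling again is harmless */
        return 0;
    }

    -----------------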

    Regards,

    Richard

  • PerttiA Posts: 10,013
    edited August 2021

    gioloi said:

    Re-opening this topic just to say that in 4.15 the issue hasn't been solved yet.

    I have an Nvidia 1080 with 8 GB of VRAM and work with Windows 10 Home.

    When I load a scene for the first time, the free GPU memory is 6783.3 MB. I can render smoothly with the GPU.

    After the render is complete and I close the render window, the free GPU memory is 4783.7 MB, that is, 2 GB (!!!) less. In such conditions, if I try to re-render the scene it goes to CPU. Even if I close the scene with "New", the free memory only comes back up to 5106.6 MB.

    The only solution is to close and re-open DS every time. Extremely annoying indeed.

    Maybe what you are experiencing is related to your non-RTX GPU and/or the driver version.

    I have an RTX 2070 Super using 456.38 drivers, and although the VRAM is not released when the render is complete and the render window is closed, it doesn't prevent me from doing subsequent renders over and over again.

  • gioloi Posts: 57
    edited August 2021

    PerttiA said:

    Maybe what you are experiencing is related to your non-RTX GPU and/or the driver version.

    I have an RTX 2070 Super using 456.38 drivers, and although the VRAM is not released when the render is complete and the render window is closed, it doesn't prevent me from doing subsequent renders over and over again.

    Well, it doesn't always happen. When the render is small enough not to need almost all of the free memory, I can make multiple GPU renders with no problems.
    On the other hand, memory-hungry renders that can nonetheless be executed by the GPU on first load fail on subsequent attempts and force me to close DS.

    I keep the driver updated, and honestly I'm not eager to buy an RTX card right now, given their current street prices.

  • gioloi Posts: 57

    richardandtracy said:

    If Nvidia doesn't put in a 'cudaFree()' command at the end of rendering, you'd have thought that DS would, because DS knows when rendering is finished - whether cancelled or completed normally.

    It would be nice if there were an app that forces GPU memory to be freed when launched. I looked for one, but there doesn't seem to be anything suitable.

  • PerttiA Posts: 10,013

    gioloi said:

    PerttiA said:

    Maybe what you are experiencing is related to your non-RTX GPU and/or the driver version.

    I have an RTX 2070 Super using 456.38 drivers, and although the VRAM is not released when the render is complete and the render window is closed, it doesn't prevent me from doing subsequent renders over and over again.

    Well, it doesn't always happen. When the render is small enough not to need almost all of the free memory, I can make multiple GPU renders with no problems.
    On the other hand, memory-hungry renders that can nonetheless be executed by the GPU on first load fail on subsequent attempts and force me to close DS.

    I keep the driver updated, and honestly I'm not eager to buy an RTX card right now, given their current street prices.

    I'm not talking about small renders, but ones which use 4-5 GB of VRAM, which is about the maximum an 8GB card can render with the OS+DS+scene base load, still leaving some "working space" on the GPU (~700 MB).

  • gioloi Posts: 57

    PerttiA said: I'm not talking about small renders, but ones which use 4-5 GB of VRAM, which is about the maximum an 8GB card can render with the OS+DS+scene base load, still leaving some "working space" on the GPU (~700 MB).

    The one I'm specifically talking about leaves about 250 MB of free GPU RAM. The first time, GPU rendering works; from the second on, the CPU has to take over.

  • PerttiA Posts: 10,013

    gioloi said:

    PerttiA said: I'm not talking about small renders, but ones which use 4-5 GB of VRAM, which is about the maximum an 8GB card can render with the OS+DS+scene base load, still leaving some "working space" on the GPU (~700 MB).

    The one I'm specifically talking about leaves about 250 MB of free GPU RAM. The first time, GPU rendering works; from the second on, the CPU has to take over.

    What does your log say? 

  • gioloi Posts: 57

    This is puzzling.
    I loaded the 'critical' scene that led DS to CPU-render and, even though the free GPU RAM was around 4.8 GB, it rendered GPU-only.
    I re-tried once more, and again it rendered with the GPU.
    Clearly things are more complicated than I believed. For sure, the issue isn't related to free RAM - at least, not ONLY to free RAM.
    I'll take a look at the log next time the issue pops up.

  • If anyone wants it, here is what you need to put in a bat file to start a new Daz instance with your scene. When you run the bat file, it will ask for the name of the file you want to load. Just put the name, not the extension, so scene.duf would just be scene.

    -----------------

    @echo off
    SET /P name="Enter Name: "
    "C:\Program Files\DAZ 3D\DAZStudio4 Public Build\DAZStudio.exe" -instanceName B# -cleanOnLaunch 0 "C:\Users\chris\Documents\DAZ 3D\Studio\My Library\Scenes\%name%.duf"

    ----------

    Things to change:

    you must change the path to your Daz Studio

    you must change the path to your library

    you can change the instance name if you want.

    Note: this is a separate instance of Daz, so it uses a different log file and any settings you change won't save to your real Daz.

    Tip: a bat file is just a plain text file with the extension changed from .txt to .bat


  • Seven193 Posts: 1,068
    edited September 2021

    I don't even have to render something to notice a drop in GPU memory.  I use Iray preview, and more often than not it starts to sputter when the memory starts to run out.  The screen flickers to grayscale, then renders in color, then it flickers again, until I switch it off.  It does this without me even touching the keyboard or mouse.
