Graphics Card

Will Iray work with an Nvidia GeForce RTX 3050 card?

Comments

  • frank0314 Posts: 13,910
    edited December 2021

    I looked at several different review sites, and it has pretty poor reviews: it doesn't handle graphically intense situations very well, and reviewers called it "abysmal at ray tracing" and "horrid."

  • With only 4GB of memory it would be able to handle only pretty simple scenes; anything with a lot of data would drop to CPU (if you allow that) or stop.

  • Thanks for this reply and the one in the New Users forum, Richard. I had no idea that I could use the CPU. That's great news.

  • Leana Posts: 11,388

    You can definitely render with Iray using the CPU. It will be much slower than when using a compatible GPU though.

  • nazmy2_a263cf7a4b said:

    Thanks for this reply and the one in the New Users forum, Richard. I had no idea that I could use the CPU. That's great news.

    Yes, you can render with Iray on an RTX 3050, but it's going to be extremely limiting. If you haven't done so already, familiarize yourself with the Daz Studio software and determine your expectations regarding the scenes you plan on rendering and how fast you need to render them. I would discourage buying a 3050 for Iray, even for a beginner, simply because it only has 4 GB of VRAM. As a minimum for starting out, I would go with either a 3060 (8-12 GB VRAM depending on the model) or a 3070 (8 GB VRAM with more CUDA cores).

    You can also render Iray scenes on the CPU, but the drawback is that most of the system's CPU resources will be tied up, so you really won't be able to use the system for anything else while rendering. Also, depending on what you're rendering, it could take hours to render a scene that would otherwise take 20-30 minutes on a GPU with sufficient VRAM.

  • You guys have given me a lot to think about. Thanks for all your input.

  • In case I do get a GPU, is there anything special I have to do to let Daz know to use it instead of the CPU?

  • FSMCDesigns Posts: 12,722

    nazmy2_a263cf7a4b said:

    In case I do get a GPU, is there anything special I have to do to let Daz know to use it instead of the CPU?

    That is completely up to the scene: if your scene uses more resources than the GPU can handle, it will drop to the CPU. I have an RTX 2080 Ti and I still have scenes drop to the CPU from time to time.

  • Leana Posts: 11,388

    nazmy2_a263cf7a4b said:

    In case I do get a GPU, is there anything special I have to do to let Daz know to use it instead of the CPU?

    You configure which devices to use in the render settings in DS.
    If you select your GPU as a render device there, it will be used provided the scene fits in the GPU's VRAM. If the scene doesn't fit, the CPU will be used instead, or the render will stop if you have disabled fallback to CPU.
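
    The device-selection behavior Leana describes can be sketched as a toy model (plain Python, not actual Daz Studio code; all names and sizes here are made up for illustration):

```python
def pick_render_device(scene_size_gb, gpu_vram_gb, gpu_selected, allow_cpu_fallback):
    """Toy decision logic: which device would the render run on?"""
    if gpu_selected and scene_size_gb <= gpu_vram_gb:
        return "GPU"      # scene fits in VRAM: fast path
    if allow_cpu_fallback:
        return "CPU"      # scene too big (or GPU not selected): slow path
    return "stopped"      # fallback disabled: render does not run

print(pick_render_device(3.5, 4, True, True))   # GPU
print(pick_render_device(6.0, 4, True, True))   # CPU
print(pick_render_device(6.0, 4, True, False))  # stopped
```

    In DS itself this is just checkboxes in the render settings, as Leana says; the sketch only shows the fits-or-falls-back logic.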

  • Thanks Michael and Leana. I got it now.

  • nonesuch00 Posts: 18,032

    If you can afford, and can actually find, a 3060 Ti, 3070, 3070 Ti, 3080, 3080 Ti, or 3090, those will cost more than the laptops you are looking at. A laptop's throttled, low-power 3050 or 3050 Ti will be slow, but it will still do hardware ray tracing faster than any GTX GPU, which can't ray trace in hardware at all.

    Bottom line: the multiple hardware review tests I saw rated the 3050 Ti and 3050 GPUs as about equivalent to the 2060-class cards, which people were gaga about before September 2020.

  • Thank you nonesuch00.

  • UHF Posts: 512

    I recommend that you get 8 GB to start, and even then you will struggle with VRAM. I have 12 GB, and some new sets literally use that much right out of the download.

    The following product can really help if you want more than a basic character/scene. (Really, if it's in the background, no one will notice the 8K bump map...)

    https://www.daz3d.com/scene-optimizer

     

    Not to frighten you, but... also consider that your main PC can bottleneck all of this. Daz loads a compressed version of a scene into system RAM, but to render it, it decompresses that data (multiply its size by pi and add a random number) and loads the result into your video card. Even with a beefy video card, your PC's RAM can hold you back. (I've run out, and my PC has 64 GB... and my video card has 12 GB.)
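
    As a rough illustration of UHF's point about scenes ballooning: uncompressed texture memory is just width × height × channels × bytes per channel, and it adds up fast. A quick sketch with assumed numbers (8-bit RGBA maps, no mipmaps or texture compression; the map count is made up for illustration):

```python
def texture_vram_bytes(width, height, channels=4, bytes_per_channel=1):
    """Uncompressed size of one texture as uploaded to the GPU."""
    return width * height * channels * bytes_per_channel

# A single character can easily carry a dozen 4K maps (diffuse, normal,
# bump, specular...); the count of 12 is an assumption for illustration.
total = 12 * texture_vram_bytes(4096, 4096)
print(f"{total / 2**30:.2f} GB")  # 0.75 GB for one character's textures alone
```

    And that's before geometry, and before the working memory the renderer itself needs.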

  • Thanks for the input UHF.

  • nazmy2_a263cf7a4b said:

    Thanks for the input UHF.

    Really, due to how poorly optimized Daz and its assets are, anything less than a 3090 with its 24 GB of VRAM isn't enough to render realistically. Of course, if you have 8-10 GB of VRAM you'll be OK for a while, but there comes a time when you're doing yourself a disservice in the long run. I use a 3070 with 8 GB of VRAM and I hate my life, but of course I can't do anything about it, because a 3090 costs 200% of MSRP in my region, assuming there is stock to begin with.

  • I've decided to wait until I can afford a better GPU. You guys have really helped me.

  • axicecal said:

    nazmy2_a263cf7a4b said:

    Thanks for the input UHF.

    Really, due to how poorly optimized Daz and its assets are, anything less than a 3090 with its 24 GB of VRAM isn't enough to render realistically. Of course, if you have 8-10 GB of VRAM you'll be OK for a while, but there comes a time when you're doing yourself a disservice in the long run. I use a 3070 with 8 GB of VRAM and I hate my life, but of course I can't do anything about it, because a 3090 costs 200% of MSRP in my region, assuming there is stock to begin with.

    It depends on what you're trying to render and how much time you're willing to put into optimizing a scene to minimize resource usage. Even with a 3090, it's quite easy to fill up 24 GB of VRAM when working with bigger scenes (assuming you don't run out of RAM beforehand, which can happen on systems running 128 GB of RAM or less alongside a 3090). However, I do agree that some of the assets are STUPIDLY "optimized."

    A few weeks ago I loaded a character I had just purchased here and tried a test render, and it nearly fried my laptop. Originally I thought it was something with my laptop, since all I had loaded in the scene was the character itself and one of Orestes's HDRIs that I've used quite frequently in the past. When I went back into Daz to see what happened, I found out that the character's default settings had the display SubD level at 4 and the render SubD level at 5. It was definitely a "wtf?" moment, and a promise to myself never to buy products from that particular PA again until I get my new rig... I wonder if even an RTX 3080 could render a character at that SubD level without struggling, never mind it being in a full scene with an HDRI, environment, and props...
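
    For context on why SubD 5 is so punishing: each Catmull-Clark subdivision level roughly quadruples the face count, so geometry grows as 4^level. A quick sketch (the 16k base count is an assumed ballpark for a Genesis-class figure, not an exact mesh count):

```python
def subd_face_count(base_faces, level):
    """Each subdivision level roughly quadruples the face count."""
    return base_faces * 4 ** level

base = 16_000  # assumed base quad count for a Genesis-class figure
for level in (1, 3, 4, 5):
    print(f"SubD {level}: {subd_face_count(base, level):,} faces")
# SubD 5 turns a 16k base mesh into roughly 16 million faces
```

    Going from the typical render SubD of 2 or 3 up to 5 is a 16x-64x jump in geometry, which is why one character at those settings can choke a GPU.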

  • nazmy2_a263cf7a4b said:

    I've decided to wait until I can afford a better GPU. You guys have really helped me.

    You're going to be waiting a very long time (at least a year). People said I was silly for spending $12k on an A6000 and a 3090 (they were each around $6k) back in June-July of this year, and guess what? You can no longer find those cards anywhere, and you won't have any chance of getting one in the foreseeable future unless crypto crashes hard and miners start selling off their used cards. While this certainly wasn't an option for many, I think if you ever have the opportunity to get one and can take the hit to your wallet, do so. I suspected this might happen, but I kept kicking the can down the road and simply wrote it off as a production issue caused by the pandemic and increased demand from crypto miners. I assumed that once the pandemic calmed, production would ramp up enough to at least make it possible to get the two cards. When I got the call that I could get my hands on a Kingpin, which was exactly the type of card I needed for rendering, I pulled the trigger, and the rest is history.

    Anyway, I think what's happening is that a portion of these cards is being diverted somewhere in the supply chain by both scalpers and crypto miners, which exacerbates the already low supply of the cards and of the raw materials used to make them.

  • Thanks magog_a4eb71ab. You've given me a lot to think about.

  • Why does everyone keep saying the RTX 3050 only has 4GB of VRAM? Am I missing something? I don't think they make any with less than 8GB.
