Graphics Card
robertfroh
Posts: 2
Hello,
Does this graphics card work well with DAZ?
Nvidia RTX A2000 12GB
https://www.nvidia.com/en-us/design-visualization/rtx-a2000/
Thank you,
Robert
Comments
Yes. Whether it's ideal is another matter.
What is the ideal card for Daz Studio to render scenes that need more than 20GB of memory?
If the actual scene data sent to Iray is 20GB (the scene's size in memory is not a reliable indicator, as working on a scene and rendering it have very different requirements), then you would quite possibly need two 24GB cards linked via NVLink (assuming much of the scene data is materials), as the various overheads would quite possibly not leave enough room on a single 24GB card. You'd also need plenty of system memory.
You're into professional studio territory with this requirement. If you consistently need to render scenes larger than 20GB then chances are 24GB isn't going to cut it.
The cheapest way is 2 x RTX 3090s with NVLink but the power consumption and heat generation will be horrid. The 4090 does not support NVLink. The professional way would be to use an Nvidia workstation card - the A6000. This is not cheap but uses significantly lower power and has all the commercial benefits of a long warranty and support and will likely last a lot longer than two gaming cards being thrashed to within an inch of their lives for a long period. You'll need to pair this with 128GB of system RAM to be able to use it to its maximum potential.
Perhaps you could consider optimising your scenes as a first step? I suspect I could cut your scene VRAM usage in half without impacting the image in any meaningful way.
Start with an RTX 3060 12GB (not the Ti version), add 32GB of RAM, and the rest you can choose quite freely.
Yep, absolutely. No need to spend 1k (US Dollars? Euros? Vietnamese Dong?!) at this point. A 12GB RTX 3060 is a great starting point.
What I recommend is that you leave your existing 1660Ti in place and add the 3060 into another PCI-E slot. It doesn't matter if it's in a slower lane as we're not interested in squeezing high fps for gaming out of it. That 3060 will sit there, all on its own not connected to any monitor, and Daz Studio will recognise it. This means that your 3060 will not have to drive Windows processes and lose a reasonable chunk of VRAM doing it. You'll get 99.99% of that 12GB just for your renders.
Tell Studio to use it by selecting the Advanced tab in the Render panel and checking the photoreal mode and interactive mode boxes.
If your power supply is less than 800W or so, I would be inclined to upgrade to at least that if you decide to do the above.
You should have enough cash left over for these from the store:
https://www.daz3d.com/scene-optimizer
https://www.daz3d.com/instancify
Both are currently on sale and can make a huge difference to what you can squeeze into a scene. These should be built into Studio, imo.
This is also useful, but needs a little care to ensure you don't drastically change the lighting and mood, particularly in indoor scenes:
https://www.daz3d.com/camera-view-optimizer
Using these tools it is very unlikely you will max out a 3060 unless you're trying really, really hard. If you consistently do run out of VRAM, even with good optimisation, that's the time to start thinking about a 24GB GPU. Those, however, are not 1k upgrades in anyone's currency!
Separate piece on VRAM usage to avoid Wall of Text (tm).
Back at around DS 4.15, Iray would report the final VRAM usage to Studio and you'd get that number in the Render progress box. Nvidia removed that info, so Studio can't report it. Bit of a shame, but nothing can be done about it.
Estimating VRAM usage in advance is a bit of a mug's game, but there are some useful things to know. Firstly, increasing the subdivision level of a mesh by 1 quadruples the number of polygons in that mesh: one square becomes 4, becomes 16, becomes 64, and so on. Think about whether you need that level of detail on mid- and far-distance characters or objects, for example.
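Just to put numbers on how fast that grows, here's a rough back-of-the-envelope sketch (the base quad count is a made-up example, not something Studio reports):

```python
# Rough sketch: how a mesh's quad count grows with subdivision.
# base_quads is a made-up example figure; real assets vary widely.
def subdivided_quads(base_quads: int, subd_level: int) -> int:
    """Each subdivision level splits every quad into four."""
    return base_quads * (4 ** subd_level)

base = 16_000  # hypothetical base-resolution figure mesh
for level in range(4):
    print(f"SubD {level}: {subdivided_quads(base, level):,} quads")
# SubD 0: 16,000 / SubD 1: 64,000 / SubD 2: 256,000 / SubD 3: 1,024,000
```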
A 4K map, commonly used in store products, will use about 64MB of VRAM. It's not a precise calculation because Studio can and does compress the textures according to parameters that you set in the Advanced tab of the Render panel. Bear in mind, then, that a character will likely have a 4K diffuse map, a 4K normal map, glossy maps, dual lobe maps... and so on. Lots of maps. Same for hair, outfits, scenes and props. This all adds up very, very quickly, but the final amount is impossible to calculate in advance.
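If you want a feel for where that ~64MB figure comes from, here's a back-of-the-envelope estimate assuming an uncompressed 8-bit RGBA map; Iray's real usage will differ once mipmaps and Studio's texture compression come into play:

```python
# Back-of-the-envelope VRAM estimate for uncompressed 8-bit RGBA textures.
# Real Iray usage differs (compression, mipmaps, format conversions).
def texture_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel / (1024 ** 2)

print(texture_mib(4096, 4096))  # 64.0 -> one 4K map is roughly 64MB

# A hypothetical character surface with several 4K maps adds up fast:
maps = ["diffuse", "normal", "glossy", "dual lobe 1", "dual lobe 2", "translucency"]
print(len(maps) * texture_mib(4096, 4096))  # ~384MB, before hair, clothes or props
```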
For distant objects, use the Scene Optimizer recommended above to halve the size of the maps and even remove some of them. Bump map on a distant vase? Not much point!
With a 3060 I would start with the Texture Compression medium threshold at 4096 and the high threshold at 8192. This applies no compression to maps under 4K, medium compression to 4K-8K maps, and high compression to maps larger than 8K. If you're still struggling, try 2048 and 4096 as the thresholds.
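In other words, the two thresholds just bucket each map by its resolution. A small sketch of that logic as I understand it - purely illustrative, not Studio's actual internals, and the behaviour at exactly 8192 is my guess:

```python
# Illustrative only: bucketing a map by resolution against the two thresholds.
def compression_bucket(map_size_px: int,
                       medium_threshold: int = 4096,
                       high_threshold: int = 8192) -> str:
    if map_size_px < medium_threshold:
        return "none"    # smaller than the medium threshold: left alone
    if map_size_px <= high_threshold:
        return "medium"  # 4K up to 8K: medium compression
    return "high"        # larger than 8K: high compression

for size in (2048, 4096, 8192, 16384):
    print(f"{size}px -> {compression_bucket(size)}")
```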
The only way to tell exactly what VRAM the GPU is using is via a third party tool. GPU-Z is very useful for this.
https://www.techpowerup.com/gpuz/
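If you're comfortable with a command line, nvidia-smi (installed with the Nvidia driver) reports the same numbers. A minimal Python polling sketch, assuming nvidia-smi is on your PATH:

```python
# Minimal sketch: poll GPU memory use via nvidia-smi while a render runs.
# Assumes the Nvidia driver's nvidia-smi tool is on the PATH.
import subprocess
import time

def vram_usage() -> str:
    return subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total", "--format=csv,noheader"],
        text=True,
    ).strip()

for _ in range(5):            # sample a few times, e.g. while Iray is working
    print(vram_usage())       # e.g. "10432 MiB, 12288 MiB"
    time.sleep(10)
```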
The upcoming RTX 4000 SFF looks to be an excellent solution for a lot of DAZers.
If they actually retail for MSRP this card looks superb. Small, 70W, not the fastest rendering device but really good value for money. I do think a 3060 is a really good place to start for people new to rendering as they are relatively affordable and there is still a requirement to understand optimisation, lighting limitations and so on.
If, and it's a big if because I think miners are going to pile in on this one, they are obtainable at or close to MSRP, this is probably going to be the go-to card for a lot of people.
Hadn't seen any coverage of this one. Thanks for the heads up!
Thanks for your feedback. All of you!
"Perhaps you could consider optimising your scenes as a first step?"
Yes, there is always room for optimising. However, last year I upgraded my system with a workstation I had to build myself, because if I go to a PC store the only feedback I get is "Why do you need that to play games?" or "You want that to play games, right?", so I found out pretty quickly that store-bought PCs are not for me. So I built my workstations (and rigs for the cards). This one is a 12-core i9 with 254GB of RAM, but I kept my GTX 1080 Ti cards since they are mainly for Iray, and on the other project I am working on I will be using the CPU to render.

What I am noticing is that with the same set of cards in my old PC (an i7 with 32GB of RAM), some scenes had to be optimised to avoid running out of memory: replace the 16K HDRI with an 8K or 1K one, use instances, and so on. Now, with the i9 and 254GB of RAM, the same scenes render; it might take a while, but they render and do not crash. I am all about R&D, so for a UHD 4K render I have scenes that go up to 40GB to 50GB on the workstation, though in the Daz app it is around 30GB. For such an image to reach completion with a high sample rate takes about 10 to 12 hours, and that also depends on whether it is in daylight conditions or at night with low-light areas, which Iray has a lot more difficulty completing. Note that this work is being done mainly by the CPU; the five cards share the work intermittently between them, while the CPU is at 90-100% across the render.
So I am wondering whether a nice set of cards, or even a single card, would speed up the render time and workflow, and how much of an improvement that would be...
With NVLink on, would two 20GB cards give Daz Studio 40GB of memory?
I also question the durability of RTX gaming cards; many say they are dead after two years, and that this is factual. I wonder if the A series lasts the same?
The A series cards (formerly called Quadro) are designed for stability and durability.
Whitemagus - If your optimised scenes are still using ~30GB in Daz Studio, you really are into commercial territory here. I do have a suspicion that the memory requirement would be less when transferred to a GPU, and you might be able to get away with a 24GB GPU.
However...
If you absolutely, positively, 100% know that you need >24GB of VRAM, then it's serious money time. Nvidia's A6000 is the obvious choice. We have two of them, but we're a small games company and we can write them off against tax over time - the investment for a private individual is considerable, even for one. Ours are the older Ampere architecture models and the upcoming Lovelace cards appear to be far superior. However, you could buy quite a nice second-hand car for the price of just one.
I would always go with one card rather than two if possible. It might seem like stating the obvious, but you are twice as likely to have a GPU failure with two cards than with one. Nvidia are also dropping NVLink from here on in so it's not something I would invest in.
The A-series Nvidia cards are 'parts-binned'. This means every discrete component on the board has been checked for tolerances, by hand, which is why they're 4 to 5 times more expensive than the RTX gaming equivalent. Ours have a 5-year warranty and I expect they will last 15 years; they will be outdated and obsolete long before they fail. Workstation cards are designed to be run 24/7 in a server environment, they have a much lower TDP than gaming cards, and we use a very sympathetic fan profile for them. When they're first fired up, the fans kick in at 100% to slow down the temperature increase. When they're being shut down, the fans back off and let the temperature drop slowly.
Nvidia's workstation cards are pretty bulletproof. They're also very, very expensive. Be absolutely certain you need one before you invest.
As a rough guess I estimate that your 10-12 hour renders on the CPU would drop to 1-2 hours on the A6000s we have. They would be even quicker on the Lovelace architecture.
Serialchiller - no infinite wisdom here! You will find that there are very widely differing and strongly held opinions in the rendering world about what to do and what not, but the Daz community is generally united in recommending the RTX 3060 12GB for an initial build. If you can't do it on a 3060 you're not trying hard enough.
Or you're us, trying to plough through animations for dozens of characters in a reasonable timescale...
If and when you feel you need to move beyond what the 3060 can do, gracefully retire your 1660, slot the 3060 where it used to sit and add your newer, better (and more expensive!) upgrade. From a tech perspective I would find it hard to recommend an RTX 3090 for rendering. They just run too hot. The 4090 is better but is double your budget. Online rumours of its flammability have not helped its reputation either but, properly fitted, they're a safer bet than the furnace that is the 3090.
For now, though, as you get used to the software and start making your art you won't be disappointed with the 3060. It's not the fastest rendering device on the planet but it's the best bang for your buck/euro/dong available right now.
Timberwolf, thanks for the feedback; I see great value in it.
Yes, the A-series (Quadros) are incredibly expensive, and with the RTX gaming cards performing badly on durability, it becomes a serious dilemma which card to pick these days, especially on a very low budget. It is sad, I must say.
And there is still, like you mentioned, the change of tech that will render the cards obsolete in a few short years.
However, it is good to know that with two A6000s it would shorten the render time 10x...
Says who? No such experience on these forums.
Hi Everyone,
Sorry to jump on this, but I just wanted some advice on my current system. I'm looking to upgrade to help me render bigger scenes and also potentially delve into animation.
I'm not looking to break the bank, but hoping to upgrade so I can render images, ideally with more than two people in a living room.
Below are my current PC specs
I'm looking at the RAM - 32GB PCS Pro DDR4 3200MHz (2x16GB) & an 8GB Nvidia GeForce RTX 3070.
Is this a good option?
Is there anything else you think I should do instead, or should I give up because I'm poor?
Appreciate the support in advance
Daniel
Do not waste your money on the RTX 3070; it has only 8GB of VRAM, and VRAM is 10,000 times more important than minor differences in rendering speed.
The RTX 3060 12GB (not the Ti version) is the best budget card, and if you want to go faster, the next step up is either an RTX 3080 or the new RTX 40xx series.
Thank you so much for the advice. If I got the 16GB Nvidia GF 4080, would I need to upgrade anything else? I don't want the computer to not work because the graphics card is too powerful.
Are my current RAM and power supply enough?
I don't want to spend the money on the 12GB RTX 3060 and have it not really improve much. Would it enable me to render more than two Gen 8.1 models and a fully designed set?
Below is a log message I got trying to render two 8.1 models and only a few items for the background. I tried to hide body parts that weren't visible, but it still came back with the error.
Apologies for the questions; I'm very green when it comes to system specs. I can use the systems, I just don't understand the technical area. :S
2023-04-14 23:13:49.079 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Available GPU memory has increased since out-of-memory failure. Re-enabling CUDA device 0 (NVIDIA GeForce GTX 1660 SUPER)
2023-04-14 23:13:49.081 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Initializing OptiX for CUDA device 0
2023-04-14 23:13:49.192 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Rendering with 1 device(s):
2023-04-14 23:13:49.193 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : CUDA device 0 (NVIDIA GeForce GTX 1660 SUPER)
2023-04-14 23:13:49.193 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Rendering...
2023-04-14 23:13:49.193 Iray [INFO] - IRAY:RENDER :: 1.8 IRAY rend progr: CUDA device 0 (NVIDIA GeForce GTX 1660 SUPER): Processing scene...
2023-04-14 23:13:49.193 Iray (Scene Access) : Retrieving geometry
2023-04-14 23:13:49.194 Iray (Scene Access) : Retrieving device geometry
2023-04-14 23:13:49.527 Iray (Scene Access) : Retrieving volumes
2023-04-14 23:13:49.527 Iray (Scene Access) : Retrieving textures
2023-04-14 23:13:53.518 Iray (Scene Access) : Retrieving device textures
2023-04-14 23:13:53.895 Iray (Scene Access) : Retrieving flags
2023-04-14 23:13:53.895 Iray (Scene Access) : Retrieving device flags
2023-04-14 23:13:53.895 Iray (Scene Access) : Retrieving lights
2023-04-14 23:13:53.895 Iray [INFO] - IRAY:RENDER :: 1.3 IRAY rend info : Importing lights for motion time 0
2023-04-14 23:13:53.895 Iray [INFO] - IRAY:RENDER :: 1.3 IRAY rend info : Initializing light hierarchy.
2023-04-14 23:13:53.940 Iray [INFO] - IRAY:RENDER :: 1.3 IRAY rend info : Light hierarchy initialization took 0.045 s
2023-04-14 23:13:53.942 Iray (Scene Access) : Retrieving device lights
2023-04-14 23:13:53.942 Iray (Scene Access) : Retrieving section objects
2023-04-14 23:13:53.942 Iray (Scene Access) : Retrieving device section objects
2023-04-14 23:13:53.942 Iray (Scene Access) : Retrieving materials
2023-04-14 23:13:53.980 Iray (Scene Access) : Compiling custom code
2023-04-14 23:13:53.997 Iray (Scene Access) : Retrieving environment
2023-04-14 23:13:53.998 Iray (Scene Access) : Retrieving device environment
2023-04-14 23:13:53.999 Iray (Scene Access) : Retrieving backplate
2023-04-14 23:13:54.000 Iray (Scene Access) : Retrieving device backplate
2023-04-14 23:13:54.000 Iray (Scene Access) : Retrieving portals
2023-04-14 23:13:54.000 Iray (Scene Access) : Retrieving decals
2023-04-14 23:13:54.000 Iray (Scene Access) : Retrieving device decals
2023-04-14 23:13:54.000 Iray (Scene Access) : Retrieving motion transform data
2023-04-14 23:13:54.000 Iray (Scene Access) : Retrieving device motion transform data
2023-04-14 23:13:54.001 Iray (Scene Access) : Retrieving lens data
2023-04-14 23:13:54.001 Iray (Scene Access) : Retrieving device lens data
2023-04-14 23:13:54.034 Iray [INFO] - IRAY:RENDER :: 1.3 IRAY rend info : JIT-linking wavefront kernel in 0.022 s
2023-04-14 23:13:54.034 Iray [INFO] - IRAY:RENDER :: 1.8 IRAY rend info : CUDA device 0 (NVIDIA GeForce GTX 1660 SUPER): Scene processed in 4.841s
2023-04-14 23:13:54.034 [WARNING] :: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(369): Iray [ERROR] - IRAY:RENDER :: 1.8 IRAY rend error: CUDA device 0 (NVIDIA GeForce GTX 1660 SUPER): Not enough memory for kernel launches (0.000 B (128.855 MiB) required, 0.000 B available). Cannot allocate framebuffer.
2023-04-14 23:13:54.035 [WARNING] :: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(369): Iray [ERROR] - IRAY:RENDER :: 1.8 IRAY rend error: CUDA device 0 (NVIDIA GeForce GTX 1660 SUPER): Failed to setup device frame buffer
2023-04-14 23:13:54.035 [WARNING] :: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(369): Iray [ERROR] - IRAY:RENDER :: 1.8 IRAY rend error: CUDA device 0 (NVIDIA GeForce GTX 1660 SUPER): Device failed while rendering
2023-04-14 23:13:54.035 [WARNING] :: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(369): Iray [WARNING] - IRAY:RENDER :: 1.8 IRAY rend warn : CUDA device 0 (NVIDIA GeForce GTX 1660 SUPER) ran out of memory and is temporarily unavailable for rendering.
2023-04-14 23:13:54.070 [WARNING] :: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(369): Iray [WARNING] - IRAY:RENDER :: 1.8 IRAY rend warn : All available GPUs failed.
2023-04-14 23:13:54.070 [WARNING] :: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(369): Iray [ERROR] - IRAY:RENDER :: 1.8 IRAY rend error: Fallback to CPU not allowed.
2023-04-14 23:13:54.070 [WARNING] :: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(369): Iray [ERROR] - IRAY:RENDER :: 1.8 IRAY rend error: All workers failed: aborting render
2023-04-14 23:13:54.071 [WARNING] :: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(369): Iray [ERROR] - IRAY:RENDER :: 1.8 IRAY rend error: CUDA device 0 (NVIDIA GeForce GTX 1660 SUPER): [Guidance sync] Failed slave device (remaining 0, done 0).
2023-04-14 23:13:54.071 [ERROR] Iray :: Internal rendering error.
Thanks again for the support in advance.
This was rendered on an RTX 3060 12GB in 5 minutes and 5 seconds. One version of the scene even had two more G8 figures and a Dog 8, and the scene still rendered on the GPU with no problems.
Edit: No optimization was done on this scene; the whole town is there, unhidden, and I noticed there was still one character not on camera that was also not hidden.
Rendering this scene on W7 + DS 4.15 used about 6GB of VRAM, which would have been too much for an 8GB card due to the baseload from the OS, DS, the scene, and the needed working space. On W10, the baseload is around 3.5GB to 4GB; on W7, about one GB less.
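To spell out the arithmetic behind "too much for an 8GB card" - a rough sketch using the ballpark figures from this post, on the assumption that the baseload sits on top of the scene's ~6GB:

```python
# Rough VRAM budget check using the ballpark figures discussed above.
card_vram_gb = 8.0
baseload_gb = 3.5   # W10 estimate for OS + DS + working space; roughly 1GB less on W7
scene_gb = 6.0      # what this particular scene needed to render

headroom_gb = card_vram_gb - (baseload_gb + scene_gb)
print(f"Headroom: {headroom_gb:+.1f} GB")  # negative -> Iray drops the GPU
```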
Indigo - A lot depends on how well you optimise your scenes, i.e. removing unnecessary detail and reducing overly large texture maps. A 3060 will still require some effort on your part for larger, more complex scenes, but there is very little it won't do.
As to hardware, I plugged your system into PCPartPicker and, assuming you keep your 1660 installed to drive the monitors, I got the following power consumption estimates:
1) With a 3060, your PC would need about 510W. Your current PSU is too close for comfort in my book and would need an upgrade. 700W would do it safely.
2) With a 4080, your PC would pull about 660W peak. 800W would probably be ok, but I'd get an 850W PSU for this.
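For anyone wanting to sanity-check their own build, the logic is just summing peak component draws and leaving headroom. A hypothetical sketch - the wattage figures are illustrative placeholders, not measurements of Daniel's system; use something like PCPartPicker for real estimates:

```python
# Hypothetical PSU sizing sketch: sum peak draws, then leave headroom.
# All wattages are illustrative placeholders, not measured figures.
components_watts = {
    "CPU (peak)": 150,
    "GTX 1660 Super (display card)": 125,
    "RTX 3060 (render card)": 170,
    "Drives, fans, board, RAM": 65,
}

peak = sum(components_watts.values())  # ~510W with these placeholder figures
suggested = peak * 1.35                # ~35% headroom for transient spikes and ageing
print(f"Estimated peak draw: {peak} W")
print(f"Suggested PSU size:  {suggested:.0f} W or more")
```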
If you go for a 3060 your estimate of requiring 32GB is spot on. If you decide to go for the 4080 you'll want to double it to 64GB.
The cheapest option, by far, is the RTX 3060 and you will be forced, on occasion, into practising good optimisation. The really complex scenes you can see in the Gallery are often pieced together in Photoshop from separate renders anyway and that's another set of skills that would be worth learning. The rest of this thread answers any questions about why you should keep your 1660 and why the 3060 is still the best value-for-money card for hobbyist renderers.
The answer to any question along the lines of "I'm new to Daz and my 6GB/8GB GTX card isn't doing anything" is always 32GB of RAM and an RTX 3060...
I have met a miner and he experienced it himself; he even tried to buy my cards. If you go to other forums, this issue is all over the place... (I think it is the memory not handling the heat... I'm not sure if this is the exact problem, but they do break.)
Also, I do not know if this issue has been fixed in the most recent models, but I have never seen anything pointing that way yet.
The problem is the mining, not the cards.
Thank you so much for your responses.
I've looked at a few options, but unfortunately my case can't fit the 4080. Please let me know what you think of the option below.
RAM - 64GB PCS Pro DDR4 3200MHz (2x32GB)
Graphics card - 12GB Nvidia GF RTX 4070 Ti
Power - 850W upgrade
I know some say the Ti is more expensive than the regular 4070, but it's the only option with the company I have the PC with.
@perttiA - Impressive render; thank you so much for the advice.
Again, thank you so much in advance for the support.
I will humbly suggest that if your scene is 20GB, you are doing something via brute force that may require a bit of finesse :) I've rendered an entire medieval battle in slightly more than half that. That is A LOT of VRAM to require for a scene. Look for 4K and 8K textures that are not near the camera. Look for places where you can use instances without the viewer even noticing. That is just a huge amount of VRAM for an amateur scene...
@indigo-dan These specs look much better than what you previously posted, and will let you run Daz Studio for a longer time period before you will need to consider your next upgrade.
The added benefit of the 4070Ti over the regular 4070, is that it has 1,792 more CUDA cores, so it will render your Daz scenes faster.
Make sure the new Power Supply you get has the proper connectors for the RTX 4070Ti. Ideally, you want a 600W PCIe Gen 5.0 12VHPWR Type-4 PSU Power Cable, otherwise you can use 2x PCIe 8-pin cables via the adapter that comes with the RTX 4070Ti.
Also, make sure your case has adequate ventilation and keep an eye on your GPU and CPU temperatures. You can do this in the "Sensors" tab of TechPowerUp GPU-Z: (https://www.techpowerup.com/download/techpowerup-gpu-z/).
Absolute legend, thank you. I will look into the ventilation and speak to the company to ensure I have enough space; I might ask if I need an extra fan for the system.
Appreciate the advice ☺️