What causes an Iray render to take the most time?
Toobis
Posts: 966
in The Commons
What I mean, overall, is: is it perhaps the lighting, or the characters' skin shaders (with HD skin possibly playing a factor), or some other aspect of a typical Iray image that causes the longest render times? What is the most 'taxing' thing for the average system to render in Iray?
Comments
For me, it's complex lighting environments. It's not the quantity of light (more light = good, IMO). It's the complexity that the simulated light has to deal with.
This means... multi-mirror rooms, or mirror-like surfaces (high reflectivity), which bounce light off of each other into infinity. This is a problem.
Layers upon layers of translucence. So, hairs with multiple panels of translucency, on top of multiple layers of lace fabric, for instance. Opacity map on top of opacity map is a problem.
Needlessly complex skin... normals upon bumps upon displacements at a distance where it absolutely does not matter.
Trying to do too much in-engine. If you want to light it for effect, then fine, but do it at full light, and adjust the brightness and contrast in post to make it dark or dramatic. The worst thing is to under-light a scene and force the engine to struggle to generate enough photons to get good sampling/convergence. Dark scenes are just hard to do in the engine, and it's better to avoid dark altogether. Light it like Hollywood would shoot the scene, really. They are good at shooting things quickly and efficiently, so study their art.
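(As a rough illustration of the "grade it in post" approach, here is a minimal Python sketch using Pillow; the brightness and contrast factors and the file names are just placeholders, not recommended values.)

```python
# Minimal sketch: render at full light, then darken and grade in post.
# Assumes Pillow is installed; "render.png" is a placeholder filename.
from PIL import Image, ImageEnhance

img = Image.open("render.png")

# Pull brightness down and push contrast up to fake a darker, moodier scene
# instead of forcing the render engine to converge a genuinely dark one.
img = ImageEnhance.Brightness(img).enhance(0.55)  # < 1.0 darkens
img = ImageEnhance.Contrast(img).enhance(1.25)    # > 1.0 adds punch

img.save("render_dark.png")
```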
It takes longer for Iray to render a lit cube in an enclosed space than it does to render a clothed HD figure with hair against nothing but an HDRI.
The longer it takes rays to terminate, the longer it will take for the render to finish. A character set against an HDRI alone where light bounces off the character and then goes off to infinity renders a lot faster than the same character just as brightly lit in an enclosed space. The more surfaces are in a scene, the more surfaces rays have to bounce between like a pinball hitting bumpers before they reach their final destination. The surfaces themselves also can give rays more work to do.
Two words: Volumetric Lighting
With Iray it's not the lighting that causes the issue with long render times. I would say it's the normal and bump maps that are the biggest users of GPU resources; normal maps on average are two times or more larger in size than the textures. And all of those texture maps have to load into the GPU before rendering can start, which is why your render may fall back to CPU if you can't fit all the texture maps into the GPU's memory.
Textures, normal maps, and bump maps are your biggest GPU resource hogs.
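(To put rough numbers on that, here's a quick back-of-the-envelope Python sketch. Iray's actual in-memory formats and compression are different, so treat these as order-of-magnitude figures only.)

```python
# Rough back-of-envelope sketch of how quickly texture maps eat VRAM.
# Iray's real in-memory formats and compression differ, so these are
# order-of-magnitude estimates, not measurements.

def texture_mb(width, height, channels=3, bytes_per_channel=1):
    """Uncompressed size of one texture map in megabytes."""
    return width * height * channels * bytes_per_channel / (1024 ** 2)

# A typical character surface with 4K diffuse, normal, bump, and roughness maps:
maps_per_surface = 4
per_map = texture_mb(4096, 4096)   # roughly 48 MB each, uncompressed
surfaces = 10                      # face, torso, limbs, eyes, mouth, etc.

print(f"One 4K map: {per_map:.0f} MB")
print(f"One figure: {maps_per_surface * per_map * surfaces:.0f} MB")
# Halving resolution to 2K cuts each map to a quarter of the size.
print(f"Same figure at 2K: {maps_per_surface * texture_mb(2048, 2048) * surfaces:.0f} MB")
```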
I think there really isn't a one-sentence answer to these questions besides "all of the above."
Toobis, I'd be wary of any answers that say that it's just one thing which causes slow renders.
For my rendering, the biggest factors are (a) complex lighting set-ups, (b) complex geometry of the rendered objects, and (c) highly reflective or translucent surfaces. I'm sure other users have different experiences. For example, I never use HD character details because I generally have the camera too far away for the HD detail to make a meaningful difference, and I'm largely concerned with the shape of objects, not fine texture detail.
I seem to get the most hit to render times when I add a lot of reflective surfaces, metal, water, glass, etc. And anything that's reflective AND partially translucent ...yeah, that really ups the render times for me.
There are a couple of things with volumetrics. One is that, yeah, an atmospheric volume forces rays to make a lot of stops. Another is that it's typically used in relatively low light situations, and it will take a lot more time to get a decent render with that little light.
Here are a couple of frames from an animation I worked on a few weeks ago. The background is an HDRI, and the scene contains 5 atmospheric props. One frame is from the animation itself; it took 25 seconds to render and was good enough for the little animation, which was mostly a proof-of-concept type deal. The other was rendered to 95% and took around 10 minutes without denoising. I did a scene with this same set a long time ago, also using atmospheric props while using the actual set as background, and that took overnight to render while still requiring denoising, albeit on a slower GPU and at 1200p rather than 720.
Letting Iray work for a billion hours removing noise when it looked good enough 3 hours (days) ago.
Sometimes (especially with animations) 300-600 iterations is all you need, even less if you use a denoising technique.
Setting the maximum bounces can help too. You'll usually need at least 3-5 to deal with eyeballs and standard stuff, but 7 or so is usually all you need (test in the viewport first). The "Auto" setting doesn't really seem to know that people can't see a mirror reflected 90 times.
Image size.
Glossy, glassy, translucent, and transparent surfaces; sub-surface scattering; rendering on the CPU; a lower-spec graphics card; a non-NVIDIA card, or an NVIDIA card that doesn't fit the criteria.
Oh yes, and while I'm thinking about it: 50 pixels hidden away somewhere that just won't converge - even though the rest converged two NVIDIA generations ago.
This is an animation I made; it took 7 days from start to finish, about 4 days of which went to rendering out 22 scenes. I was getting around 12 seconds a frame at 30 frames per second at 1920 x 1080, so the finished result is in 1080 HD.
The only reason I was able to get these render times was that I removed all the normal maps and reduced the 4K texture maps to 2K. And yes, I was using volumetric lighting and fog props, and water planes with a strong shine as well, which made no difference to my render times.
It's a fun watch. Click to play the 3-minute short film. Best viewed in 1080 HD.
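(For anyone wanting to try the same 4K-to-2K trick, here is a minimal Python sketch using Pillow; the folder name is a placeholder, and you should only run it on copies of your texture maps, never the originals.)

```python
# Sketch of batch-downscaling texture maps from 4K to 2K with Pillow.
# The folder path is a placeholder; work on copies, not your original content.
from pathlib import Path
from PIL import Image

texture_dir = Path("my_scene_textures")  # hypothetical folder of copied maps

for path in texture_dir.glob("*.jpg"):
    img = Image.open(path)
    if max(img.size) > 2048:
        img.thumbnail((2048, 2048), Image.LANCZOS)  # resizes in place, keeps aspect ratio
        img.save(path, quality=92)
        print(f"Downscaled {path.name} to {img.size}")
```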
Wow... nice. What scenery is that - the mountains and woods? And the water animation, and the great fluffy clouds?
For me, I've noticed that rendering the same scene without quitting the session increases render times. I could render the scene once and it takes 8 minutes; then I adjust some posing, and the next time I hit render it takes 14 minutes. I find it's best to kill the session and start Studio up again, and then it renders faster. It is a buggy piece of software.
Yes, this! I do everything in Iray preview mode, and then when it comes time to render, it can take forever. If I quit Studio, restart, and reopen the scene (which also takes about 15 minutes just on its own), it at least renders faster. Rendering has become MUCH slower with every new update. I have a 1080 Ti, and a simple render of a portrait with hair and no clothes used to take maybe 5 minutes; now it can take an hour!
Thank you very much
This film was a kitbash of about 5 different environment sets.
King's Pass https://www.daz3d.com/kings-pass
Viking Village Bundle that I converted to Iray https://www.daz3d.com/viking-village-bundle
Iray Worlds SkyDome https://www.daz3d.com/iray-worlds-skydome
High Peaks Skydome and HDRI https://www.daz3d.com/high-peaks-skydome-and-hdri
For the Iray clouds I used "Above the Clouds for Iray: Nimbostratus", which I hand-keyframed using the morph dials as needed along the timeline. https://www.daz3d.com/above-the-clouds-for-iray-nimbostratus
For the animated water I have a custom ground plane that I made myself; I added some wave morph dials to it and wave bump maps for a good reflection and water-ripple effect. I use this a lot in other animations I have made as well.
I also used the bubble array in the Rigged Water Iray 2 set by Sickleyield for some added water splashes under the plane during take-off. https://www.daz3d.com/rigged-water-iray-2
That's it, except for the character, who was a Genesis 8 Female called Cami by Sassy at Renderosity.
I also find that closing Daz Studio completely before rendering, even for the first render, can help decrease render time. I usually set the scene up, save the scene, close out DS, then relaunch DS to render the scene. That seems to help decrease render times.
Noise.
Iray mostly renders an image by shooting rays from the camera, basically reverse-engineering the scene compared to how light works in reality. It shoots those rays in random directions within the defined area that will be your final render; they scatter around until they find a source of light, and then Iray computes the final color of those rays by tracing back the bounces and applying the results to the pixels from which the rays originated. That's why it's easier to render outdoor scenes compared to indoor ones. Outdoors, a lot of rays will bounce towards the sky, where they are quickly terminated and their color added; indoors, those rays have to bounce quite a few times before they find a light source outside the window or a lamp.
The main contributing factor to the direction a ray bounces is the surface's roughness. Less roughness means more defined bounces, such as on mirrors, where light is reflected almost perfectly symmetrically. Rough materials scatter those rays in all directions. For example, there is a reflection of you on a plaster wall too when you look at one, but it's blurred way beyond recognition because the information that makes up your reflection is scattered in all directions. On some materials, such as wood with a varnish finish, there's a more defined reflection, but still quite blurry. Most interiors are made of mostly rough materials such as walls and fabric, or half-rough materials like varnished wood, with some glass and other highly reflective materials here and there.
Now, the problem often arises when you have a scene designed in such a way that Iray shoots a lot of rays at a particular surface, and most of those rays bounce around too many times to contribute meaningfully to the color of a pixel. That's because with each bounce, light loses intensity. The more bounces you have before finding a light source, the dimmer the final contribution will be; often so dim, in fact, that you might as well terminate the ray anyway. So you have to shoot rays at that surface over and over again before one finds a light source and you can add its color to the pixel, and most of the time that contribution will be very dim. But if you shoot enough rays at it and average the results, you should eventually obtain the true color of that pixel. Usually the rays that find a light source in the first 6-7 bounces dominate the final contribution to the color of a pixel. You can see this yourself by going to the Optimization section of the Render Settings and changing Max Path Length from -1 (which means no limit) to 1, then 2, 3, and so on, and watching how your scene lights up with each new bounce added.
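(For the curious, here is a toy Python sketch of the loop described above. It is not Iray's actual algorithm, just the general shape of a backward path tracer: the contribution shrinks with each bounce, a max path length caps the work, and averaging more samples converges on the pixel's true value. The probabilities are made up for the example.)

```python
import random

# Toy sketch of the backward path-tracing loop described above.
MAX_PATH_LENGTH = 7        # like Render Settings > Optimization > Max Path Length
LIGHT_HIT_CHANCE = 0.1     # stand-in for "how easy is it to find a light"
SURFACE_REFLECTANCE = 0.7  # energy kept per bounce; the rest is absorbed

def trace_one_sample():
    throughput = 1.0
    for bounce in range(MAX_PATH_LENGTH):
        if random.random() < LIGHT_HIT_CHANCE:  # ray reached a light source
            return throughput                   # emission, dimmed by earlier bounces
        throughput *= SURFACE_REFLECTANCE       # each bounce loses intensity
    return 0.0                                  # path terminated, no light found

def render_pixel(samples):
    # Average many random samples; more samples means less noise.
    return sum(trace_one_sample() for _ in range(samples)) / samples

for n in (16, 256, 4096):
    print(f"{n:5d} samples -> pixel value {render_pixel(n):.3f}")
```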
When rendering, sometimes a bounce is so particularly lucky that, for example, it hits a surface such as a glossy wooden floor and bounces straight into the middle of the emitting surface of a lamp sitting behind the camera. The contribution is so bright that it shows up as a glaring point in your render, and very few other rays find a light source for that pixel within the first few bounces to average it back down. Those are called hot pixels, and most renderers have tools built in to prevent them. They're relatively easy to deal with because they stand out and can be easily filtered out.
But what often happens is that Iray is shooting rays at a particular area of the image, most rays find nothing, but enough find something to display the true dimness of that area when averaged over and over again. And then, maybe one in a thousand rays for that pixel (just as an example; I have no idea what the real numbers are) hits that sweet spot of a direct bounce onto a bright light source, and the average brightness of that pixel suddenly shoots up. Then it gets cancelled out slowly as rays with more bounces average it down again. But then it happens again, and suddenly the pixel is a little brighter once more. The next pixel over, however, isn't so lucky; in fact, it's rather unlucky and never gets enough of those hot rays to display the true brightness, which should be a little higher than it currently is. A few pixels over, it's again too bright (too many hot rays), while another gets too few.
If you ran a render long enough, the law of averages would cancel everything out, but we don't have infinite compute power. And in a case where the vast majority of rays don't find a light source, and the few that do vary wildly in brightness, you'll get noise: lucky and unlucky pixels mixed together, displaying uneven brightness. Dealing with noise is basically the 'holy grail' of ray-traced render engines. There are many ways to do it: brute force by rendering more samples, adding more light sources so the rays can find them more easily, clamping samples that are too hot, AI denoising of the final image, and many more technical methods.
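(A tiny Python illustration of the "hot ray" problem and why clamping helps; all the numbers are invented for the example.)

```python
# One very bright sample can skew a pixel's average for a long time;
# clamping outliers trades a little accuracy for much less visible noise.

samples = [0.02] * 999 + [50.0]    # 999 dim samples plus one "hot" lucky path

plain_average = sum(samples) / len(samples)

CLAMP = 2.0                        # cap the contribution of any single sample
clamped_average = sum(min(s, CLAMP) for s in samples) / len(samples)

print(f"Plain average:   {plain_average:.4f}")   # ~0.07, dominated by one ray
print(f"Clamped average: {clamped_average:.4f}") # ~0.022, closer to the true dim value
```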
As for the compute cost of the rays themselves, from my tests I can say that bump and normal maps, as well as displacement maps, need more calculation. More so than SSS, in my opinion.
Thanks people I'll have to consider a lot of these things. So much to think about.
THIS!!!!!!!!!!!! I have been using Daz for 4 years now and this just taught me something new! I am forever searching for answers on speeding up my renders, as I almost exclusively do indoor stuff and it is very hit or miss how long it takes and what overall quality I get in the final image. I do use Photoshop to remove noise, but that takes a lot of the clarity and definition away. I want a nice balance between realism and render time. I still have little to no idea what half the settings in the Iray editor do. The Path Length always confused me (being -1 by default); I had no idea I could control how long it hovers on a pixel before moving on. Thank you so much! I just reached 21% convergence in less than 5 minutes on a scene that previously was taking at least an hour to get that far! I have a character standing in front of a three-sided mirror (a nightmare to render but an awesome effect!) and I started with 7 as the max path length. All of my lights are in the ceiling (the Mini Apartment product), I changed them to point lights (I do not use ghost lights anymore, the characters look dull and flat), I have upped intensity and this and that... thinking that it just needs more and more and more light to render faster (as that is always EVERYONE'S solution :/). So far, just in the short time I have taken to write this, it has jumped to 31%!!!!! I can't believe no one has ever brought up this setting before! I will have to up the max path length, as the image in the mirror is darker than I would like for now, but you might have just saved me countless hours of rendering my indoor night scenes! Thank you, sooooo much!
Wonder if that's a RAM-related thing (not your rig personally [or mine, or anyone's])...
Sometimes what I'll do is just let the thing render overnight (check for restart-required updates!) if it's something that I know will take a while.
For my style of art (pinups), it's hair.
If there is not enough RAM, DS could be hitting virtual memory the second time.
I haven't noticed such behaviour on DS 4.15, W7Ultimate, 64GB and 3060 (12GB)
The culprits are always the same; they increase calculations and render times: transparency, reflections, refraction, and volumes.
There are a few other things I've found to be performance killers. Curved emissive surfaces are one. Unless you really need to see that fluorescent tube glowing, turn off the emission on that surface and replace it with either a ghost light or a point light/spotlight. Same for round bulbs. I've worked with several scenes where simply replacing the curved emitters did wonders.
Using an HDRI for the outside of an interior scene wastes a lot of CPU time while the HDRI tries in vain to penetrate the walls. Yes, I know it actually works the opposite way, but that's the idea: the camera rays are working like mad to get to that obscured light source. I have used HDRIs where I made parts of the ceiling and walls semi-transparent to let light in, though. You can use stock photos applied to planes, or simply as background images, to be seen through the windows. This does somewhat limit the viewing angles.
For outdoor scenes, lately I've been using one HDRI and one ghost light (an emissive transparent plane) as rim or fill. Get the free ghostlight shader on ShareCG.com. 4K renders can be done in seconds with this.
Lastly, trying to get rid of noise or fireflies by cranking up the quality is a losing game, IMO. Better to reduce the quality a little and use the mcjdenoise utility. It's great!
I know the release notes make a lot of claims for 'Guided Sampling', and for having it on permanently. Guiding light through an opening is one of them, and NVIDIA seems to believe in it enough to deprecate 'Light Portals'.
Resolution and certain hairs for me
I think it's funny how a hair can take up more storage than a neighborhood