Ah great. Size -100% flips the thing upside down but using either size X or Z at -100% does the trick - BUT only the environment sphere is flipped, the HDRI stays the same. Remember, the HDRI for the light and the env-sphere must be flipped.
When you rotate the UE2, both HDRI and env-map move in sync, as we would expect.
I know. There is an element I didn't post yet. The IBL Transformer really transforms the IBL.
See for yourself.
You've lost me there - I can't see anything that makes me think the IBL Transformer transforms the IBL. But then again it's first thing in the morning here - this UE2 thing's been buzzing around in my head all night and I couldn't sleep properly.
Plus I haven't had my wake-up coffee yet, so my brain's not yet in gear...
The IBL transformer does indeed transform the IBL.
If you picture the IBL as a UV mapped (light emitting) sphere it should roughly look similar to UE's environment sphere when it comes to how the image is mapped on the sphere. Of course this isn't the case, which is why we're here in the first place.
However, the real problem with the IBL isn't that it's rotated in relation to the environment sphere but that the mapping is wrong. The "poles" of the IBL sphere should be at the top and bottom of the sphere, were it mapped correctly for use with equirectangular images, but they are actually NOT. See the attached image for reference. The lighting (red from the top, turquoise from below) cast on the sphere primitive in the image should match how the UE environment sphere looks, but it clearly doesn't. The poles are way off from their proper locations.
The IBL transformer fixes this issue and you get a properly mapped IBL sphere which truly matches the environment sphere and how an equirectangular image actually should light the scene.
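As a concrete reference for where the poles ought to end up, here is a minimal Python sketch of the equirectangular (lat/long) mapping; the y-up axis convention is my assumption for illustration, not necessarily what UE2 uses internally:

```python
import math

def dir_to_equirect_uv(x, y, z):
    """Map a unit direction onto an equirectangular image:
    u follows the azimuth, v follows the elevation.
    Assumes y is 'up'; DS/3Delight conventions may differ."""
    azimuth = math.atan2(x, -z)                     # -pi .. pi around 'up'
    elevation = math.asin(max(-1.0, min(1.0, y)))   # -pi/2 .. pi/2
    u = 0.5 + azimuth / (2.0 * math.pi)             # 0 .. 1 across the width
    v = 0.5 - elevation / math.pi                   # 0 = top row, 1 = bottom row
    return u, v

# With a correct mapping the poles land on the top/bottom edges:
print(dir_to_equirect_uv(0, 1, 0))    # straight up   -> (0.5, 0.0)
print(dir_to_equirect_uv(0, -1, 0))   # straight down -> (0.5, 1.0)
```

If the poles of the lit sphere don't correspond to the top and bottom rows of the image, the mapping is something other than this.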
Okay, I've had a day off, and I've just had my Monday morning coffee. I can now say "Ah! I see!". The red and cyan spots in your example make it crystal clear (even to me! ;o) ) that the poles of the IBL aren't opposite each other, so the IBL mapping is wrong, and not just rotated as I had thought.
With the six coloured lights I used for all my tests I missed the fact that they weren't properly aligned (Edit (11 Sep): or maybe they were? See post 70) - I was just concerned with the right coloured illumination coming from the right quadrant. (Edit: But in my post 36 test I compared an IBL Transformer render with a UE2 X/Y-Rotate = -90 render and they were identical, which they would only be if the lighting was identical. That really puzzles me.)
So when I applied X-Rotate = -90 and Y-Rotate = -90 to the UE2 the lights appeared to illuminate my cube correctly and I thought that was enough. But as Horo pointed out earlier, I really needed to check the sphere in detail - which I clearly didn't do. From your 'back right view' the error appears to be a bit under 45 degrees from this viewpoint - is that the 35 degrees that Horo was initially talking about?
And of course this would explain why the IBL Transformer is so named!
Edit: I assume you didn't rotate the UE2 for your examples, so I guess that your IBL TIF looked like the first attached JPG? (Or if you rotated UE2 -90 X and -90 Y it looked like the second?)
Looking at your two renders both poles seem to be tilted about 20° towards the back... I'd guess their positions to be very roughly 180° azimuth, ±70°elevation? I'm stumped trying to come up with a mapping that would give your results using either of these images...
...unless...compare the attached lat/long, angular, and mirrorball mappings (edit: these were generated from my vertical cross image in post #50, and aren't flipped horizontally)
If you plug a lat/long map into something that expects an angular map the poles will be off by 45 degrees.
But if you plug a lat/long map into something expecting a mirrorball mapping - maybe 30 degrees (edit: measurements in GIMP put it at about 26-27°)? Could that be it? If so I'd expect the east/west lights to be out by about the same amount as the pole lights (up/down)
I get more and more confused. I did that top-bottom test as well and I can confirm that the light is not coming exactly from top and bottom. Maybe around latitudes ±20°, but that's just guesswork.
It is hard to believe that the environment map is mapped correctly on the sphere but the HDRI on the invisible one is not. Why would anybody make one right and the other wrong? After all, you can reuse the code - just call the function twice; no need to reinvent the wheel.
I rather think that the light-gathering algorithm is not up to the task. If you use small lights like the ones 3dcheapskate used in post #61 (I used similar), the algorithm doesn't gather them correctly. There are far fewer issues if larger lights are used, but then you cannot make a moderately precise measurement anymore.
There must be a reason why the light sphere ought to be a specular convolved one - the blurred light sources get wider, making it simpler to gather them. I also miss some quality control over how many light sources are generated from the HDRI. With the Monte Carlo algorithm you usually have a percentage control; with the median-cut one, the number of rectangles, each holding a light source.
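To illustrate the "blurred light sources get wider" point: a specular convolution weights the surrounding image with a cos^n lobe, so the smaller the Phong exponent n, the wider the effective light source. A rough Python sketch (the exact lobe shape UE2/3Delight uses is my assumption):

```python
import math

def phong_lobe(angle_deg, exponent):
    """Relative convolution weight at a given angular distance from
    the lobe axis. Exponent 1 is roughly a diffuse (cosine) convolve;
    large exponents approach an unblurred specular lookup."""
    return max(0.0, math.cos(math.radians(angle_deg))) ** exponent

# 60 degrees off-axis: a diffuse convolve still picks up half the weight,
# while a tight specular lobe picks up essentially nothing.
print(round(phong_lobe(60, 1), 3))    # 0.5
print(phong_lobe(60, 50) < 1e-10)     # True
```

The wider the lobe, the more a small bright dot is smeared over the sphere, which makes it easier for a crude sampler to find.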
In my experience, it is very difficult to measure the offsets in UE2 with enough accuracy to take countermeasures. I also found, to my frustration, that the results are inconsistent, depending on the HDRI used. UE2 came out about two years ago but there was never any update. On omnifreaker's website there's nothing new. Does that disinterest - also from DAZ 3D - mean that it isn't worth elaborating on UE2, since it is FUBAR anyway (Fooled Up Beyond Any Repair)?
Well Spooky this was reported back in the DS3 days...yes it has been broken all that time...I hope the new bug report doesn't get ignored like the old one did.
We never had this kind of detail and DS 3 was from before I was responsible for Software Bugs.
Let's just forget about the Environment Sphere. I'm pretty sure it's at least 18° off just from the UV mapping. But there is more than that for sure.
From what I've seen, the IBL Transformer makes a new sphere with default UV mapping, scale set to -1, and size 10000 if I'm not mistaken. The old EnvSphere is deleted, I guess. So no need to worry about finding out how much off it is - just make a preset with a sphere like that one. Just having scale -1 is enough to get it displayed correctly (we see the inside of the sphere, not the outside. Neat trick here).
For the environment map, we should do a tdlmake -envlatl -mirrorx to get it right. Then we just have to find the correct rotations.
Once we get that, making a preset with a new sphere and the correct orientations of the UE light and EnvSphere is pretty trivial.
@Horo: We can't display the HDR image directly because something is missing in the OpenGL preview: the ability to display .hdr and .tiff pictures (possibly OpenEXR too).
That's why you must also convert your HDR to a JPG or PNG file and map it to see it. But a feature request on that point would be good.
Well Spooky this was reported back in the DS3 days...yes it has been broken all that time...I hope the new bug report doesn't get ignored like the old one did.
We never had this kind of detail and DS 3 was from before I was responsible for Software Bugs.
It was put on the Studio Basecamp which is now closed.
@Takeo.Kensei - if Z scale is set to -100% and the environment map offset in azimuth by +90° in relation to the HDRI, I had a direction match with the HDRI. None of the images mirrored. Then, the sphere can be rotated, light and shadows follow.
Yes, DS has no tone-mapper to display the HDRI, I understand that. I think it is possible to map an LDRI around it. But I doubt the sphere as such could be made visible. Maybe whatever image is applied as light, HDRI or LDRI, lights are generated from it and those aren't displayed.
Still busy bumbling around with my own ideas, based on the bits of the ongoing discussion that I can understand:
My next simple test:
1) Created three TIF images each with a pair of lights directly opposite each other, so I can test each of the three cardinal direction pairs (north-south, east-west, up-down). Yes - JPG images on the screenshot, but I used properly prepared TIFs for the test!
2) Applied each TIF to the UE2's Light > Basic > Color, with Ambient Only mode and no rotations on the UE2 itself (the spot colours of the TIFs are in accordance with 4.jpg in the first image of post 50)
3) I did four renders for each TIF, one from each of the four fixed cameras that would show both colours.
4) I compared the four render PNGs in GIMP (using horizontal flip and 90° rotation to align the images first) by using 'Difference' mode, copied the result (which looked like a black circle) and tested whether it was totally black using the 'Fuzzy Select' tool and adjusting the tolerance.
Result: Within a tolerance of about 2% (fuzzy select threshold = 5) the four render PNGs were identical. This was true for each of the three TIFs I used.
Conclusions:
a) My 'could it be a mirrorball mapping' thought is clearly incorrect - the illumination for all 12 renders appears to be coming from the correct directions, with no offset.
b) Horo's observation that the offset results seem to be different for different images seems valid - that's the only way that I can explain how this test indicates that there is no offset, when it's been fairly conclusively proved by other people in earlier posts that there is one! I guess this is where the 'gathering algorithm' stuff comes into it?
c) This would explain why my results were identical to the IBL Transformer results in post 36. My test images do not seem to produce this pole offset...
(It's difficult to judge from the attached screenshot, but this is such a simple test that anybody can do it for themselves. Bear in mind that the size of the spots may have a major effect...)
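For anyone who'd rather script step 4 than click through GIMP, a rough NumPy equivalent of the Difference-mode check might look like this (synthetic stand-in data; the tolerance of 5 out of 255 mirrors the roughly 2% threshold used above, and the flip/rotate alignment would be np.fliplr / np.rot90 on the arrays before comparing):

```python
import numpy as np

def renders_match(img_a, img_b, tolerance=5):
    """True if no pixel channel differs by more than `tolerance`
    (out of 255) - a scripted version of GIMP's 'Difference' mode
    followed by fuzzy-selecting on black."""
    diff = np.abs(img_a.astype(int) - img_b.astype(int))
    return bool(diff.max() <= tolerance)

a = np.full((4, 4, 3), 100, dtype=np.uint8)   # stand-in "render"
b = a.copy()
b[0, 0, 0] += 4                                # within tolerance
print(renders_match(a, b))                     # True
b[1, 1, 1] += 20                               # beyond tolerance
print(renders_match(a, b))                     # False
```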
I'm happy to report that I managed to get the light to match the backdrop. Luckily, the shadows are now about 90° off. Oh, I can get the shadows right if I accept that the light illuminates the object from the wrong side. You've got to see this as an artistic feature you can capitalize on. No need to get mad (I hope I'll be able to convince myself eventually - still struggling).
I don't really get it. If illumination is correct, you should have correct shadows. Can you post examples please?
I haven't begun new tests yet. Have to think a bit first. I believe that one source of the problem is a different coordinate system.
It seems to me that the DS coordinate system is right-handed and 3Delight's must be left-handed. We must take that into account when preparing the preview and HDR pictures. If you want more info: http://www.fundza.com/rib/example4/example4.html
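A tiny Python check of what right-handed vs left-handed means in practice, assuming a "Scale 1 1 -1" style z-flip: negating one axis flips the handedness, which shows up directly in the cross product:

```python
def cross(a, b):
    """Cross product with the right-hand rule: x cross y = z."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def flip_z(v):
    """A z-negating scale; its determinant is -1, so it turns a
    right-handed frame into a left-handed one."""
    return (v[0], v[1], -v[2])

x_axis, y_axis = (1, 0, 0), (0, 1, 0)
print(cross(x_axis, y_axis))   # (0, 0, 1) in the right-handed frame
# The same product evaluated in the flipped frame comes out negated:
print(flip_z(cross(flip_z(x_axis), flip_z(y_axis))))   # (0, 0, -1)
```

So any direction prepared in one convention ends up pointing to the wrong z-side in the other, which is exactly the kind of mismatch that would mirror an environment map.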
I see what you mean. It appears to be a problem of sampling the lights. If the dynamic range is low, much environment light comes into the scene that can brighten up some of the shadows and we or the renderer or both get a bit confused.
I multiplied the specular convolved HDRI by itself to boost the dynamic range and now the shadows may be considered correct. The first image shows the environmap in the mirror ball and the position of the sun. The UE2 was rotated Y=127° so the light direction matches the environment and the reflection spheres.
For the second example, I created a median-cut representation of the same HDRI and used that map to light the scene. Obviously, either UE2 or the renderer got confused. The direction of the light is correct, but there are quite a few shadows that cannot be accounted for. This should not concern us too much, though, since we do not know what sort of lights are created from the HDRI. The median-cut variant used assumes point light sources (as Bryce uses), but UE2 might create area light sources (as Bryce does when the HDRI is True Ambience optimised). Nevertheless, an interesting test, though useless.
And thanks for the left/right hand link. I'll consider that as well.
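For reference, the median-cut representation mentioned above can be sketched in a few lines of Python: keep splitting the luminance map so both halves carry roughly equal energy, then put one light per final rectangle. This is a toy version under my own assumptions, not Bryce's or UE2's actual implementation:

```python
import numpy as np

def median_cut_lights(lum, n_splits):
    """Toy median cut on a 2D luminance array: each pass splits every
    region across its longer axis so both halves hold roughly equal
    energy; returns one (row, col, energy) light per final region."""
    regions = [(0, lum.shape[0], 0, lum.shape[1])]
    for _ in range(n_splits):
        new_regions = []
        for r0, r1, c0, c1 in regions:
            block = lum[r0:r1, c0:c1]
            if (r1 - r0) >= (c1 - c0) and (r1 - r0) > 1:
                cum = np.cumsum(block.sum(axis=1))      # energy per row
                k = int(np.searchsorted(cum, cum[-1] / 2)) + 1
                k = min(max(k, 1), r1 - r0 - 1)
                new_regions += [(r0, r0 + k, c0, c1), (r0 + k, r1, c0, c1)]
            elif (c1 - c0) > 1:
                cum = np.cumsum(block.sum(axis=0))      # energy per column
                k = int(np.searchsorted(cum, cum[-1] / 2)) + 1
                k = min(max(k, 1), c1 - c0 - 1)
                new_regions += [(r0, r1, c0, c0 + k), (r0, r1, c0 + k, c1)]
            else:
                new_regions.append((r0, r1, c0, c1))    # cannot split further
        regions = new_regions
    return [((r0 + r1) / 2, (c0 + c1) / 2, float(lum[r0:r1, c0:c1].sum()))
            for r0, r1, c0, c1 in regions]
```

On a uniform 8x8 map, two split passes yield four lights of equal energy at the quadrant centres; on a real HDRI the rectangles crowd around the bright spots, which is where the number-of-rectangles quality control comes in.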
I don't really get it. If illumination is correct, you should have correct shadows. Can you post examples please?
I haven't begun new tests yet. Have to think a bit first. I believe that one source of the problem is a different coordinate system.
It seems to me that the DS coordinate system is right-handed and 3Delight's must be left-handed. We must take that into account when preparing the preview and HDR pictures. If you want more info: http://www.fundza.com/rib/example4/example4.html
This is actually a bug in DS. I would say that this is probably the only (serious) bug in all this. DS flips the coordinate system of every light (not only the UberEnv light). This can easily be seen when you export a scene to a RIB file; it will look something like this:
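(A hypothetical reconstruction of the relevant part of such an export, based on the description; the real file contains much more and the exact parameters are elided:)

```
Scale 1 1 -1            # flips z: the world becomes right-handed
WorldBegin
    TransformBegin
        Identity        # resets to the default left-handed system,
                        # so the light's z-axis ends up flipped
        LightSource "uberEnvironment2" 1   # parameters elided
    TransformEnd
    # geometry follows here, still in the "Scale 1 1 -1" world system
WorldEnd
```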
The "Scale 1 1 -1" sets up a right-handed coordinate system in which all objects are added (the "world" coordinate system). But when a light source is added, the "Identity" resets it back to the default left-handed coordinate system. As a result, all light sources have a flipped z-coordinate.

So what should one do when creating a light shader in this situation? Either build the bug into the light shader, with the risk that the light shader stops working when the bug is fixed one day; or wait until the bug gets fixed (which might take a long time, depending on the circumstances); or build the light shader according to some de-facto standard, like other light shaders are made. This last option seems to be what omnifreaker has done (I probably would have done the same). But the de-facto standard in this case would be to use the same coordinate system for environment maps as all other shader functions (environment and indirectdiffuse) use, and this means using the latitude of the latitude/longitude map for the z-direction. A bit inconvenient, because most maps you will find on the internet use the latitude for the y-direction (as the previous posts pointed out), but perhaps not that inconvenient, because corrections have to be made anyway (because of the coordinate-system flipping bug).

I think I will file a bug report for this; perhaps they will fix it, now that more people seem to have noticed it :-)
Edit: I assume you didn't rotate the UE2 for your examples, so I guess that your IBL TIF looked like the first attached JPG? (Or if you rotated UE2 -90 X and -90 Y it looked like the second?)
Looking at your two renders both poles seem to be tilted about 20° towards the back... I'd guess their positions to be very roughly 180° azimuth, ±70°elevation? I'm stumped trying to come up with a mapping that would give your results using either of these images...
My image map was effectively the same as your second image map, and yes the poles for the lighting were tilted to the back (if I remember correctly).
I haven't really done much experimentation since, and I could well be off here or there. A while back I was trying to fix the lighting problem (much the same way you were) by trying to match the environment sphere with the HDRI by rotating the former, but after this test it became evident that that wouldn't be enough - which I guess is the main thing you can walk away with from my little experiment. Shortly after, I was directed to the IBL Transformer, which seemed to fix the problem, so I just couldn't be bothered with it anymore.
For the tests, the same HDRI with a prominent light source (a rather bright sun) was used, it was shifted and flipped, but it is essentially the same. The scenes (DS and Bryce) were never changed, shaders, light intensities or whatever parameters were left untouched, only the HDRI was replaced.
In DS UE2, the light is offset by 127.5°; the shadows are correctly cast in the opposite direction from the light. However, there's another strong shadow about 30° away that cannot be accounted for. The shape of the shadows is wrong.
If the HDRI is flipped, we get the same result and we would certainly expect that since the sun is in the centre, mirroring the HDRI does not affect its position. The shadows are a wee bit darker. This is most probably due to the ambient light - though comparatively weak - that plays its part.
If the sun is moved right by 90°, it illuminates the objects from the wrong side and the shadow appears in yet another direction. The shape of the shadow somehow resembles what we would expect to see, but much too small.
Finally, flipping the shifted HDRI so that the sun shines from the left, the objects get even less light and also from the opposite direction. All shadows have disappeared.
I repeated the same tests in Bryce, which needs only the HDRI as light source; it can be tone-mapped to be used as backdrop as well. For DS UE2 I had to use two additional spheres, one within the other: one for the backdrop and one to create the reflection on the mirror ball. However, I did delete them at one time just to see whether the light changes. It did not.
Also in the Bryce examples, only the HDRI was exchanged. The light shines from the expected direction, the shadows are cast away from the sun and their shape is correct.
Image Based Light is a means to render with natural light. This cannot be done in DS UE2. Using the same HDRI with just the prominent light source moved gives completely different results. At least these tests explain why I got different results using different HDRIs. DS UE2 does not behave consistently.
I doubt this issue can be explained with the left and right handed coordinate system. I strongly suspect the lights are not correctly derived from the HDRI, the algorithm used is either not up to the task or was wrongly implemented. Start with any of the four HDRIs and rotate UE2 with the environment and reflection spheres, the result is always the same, rotation works correctly.
This is actually a bug in DS. I would say that this is probably the only (serious) bug in all this. DS flips the coordinate system of every light (not only the uberenv light). This can easily be seen when you export a scene to a rib file; it will look somehow like this:
Smart. Thanks for the heads-up. I didn't think about having a look at a RIB export. That may explain things. I'll try to set up some tests ASAP. It would be good to settle all this. One thing I have in mind is that omnifreaker must not have known about this. Otherwise he could always have created an option in the shader to work around the bug and disable it once the bug is cleared.
@Horo: we don't have a clear demonstration yet, but having a mess in the coordinate system can completely mess up your results. I'll try to come up with something.
That would be very helpful. I haven't figured out how to export as RIB, I can't even export uncompressed DUF, though it should be possible. As for the coordinates: I'm probably a bit at a loss here as well. A panorama is usually mapped on the outside of a sphere, if the camera is inside, it is mirrored and this can be corrected if Z is made negative - or by mirroring the panorama. I'm probably missing more than I'm aware of.
Using a mirrored panorama image is the answer, as you have mentioned already.
The light-source TIFF auto-converted from the HDR, and the diffuse background image (PNG or JPG), should both be mirrored if I use DAZ UE2.
I have used the Osomaki IBL Transformer from the start, so I thought a mirrored image was the default.
It has an option to use either a mirrored or a normal image.
When I check the TIFF and JPG (from the Parameters pane and the Surfaces pane), if the image has not been mirrored, I simply change the option setting. It auto-makes another mirrored TIFF, then applies it as the light source.
After sending a bug report to Mantis about the axis problem, I thought DAZ would never correct it.
There has been no confirmation from anyone official, just some other users who confirmed it and added more details in clear English; I sometimes checked them.
(Thanks Szark and Tempest!)
If DAZ corrects it in DS 5, thanks to Horo who found and presented this problem, that seems reasonable.
Well, I'm still at it. The environment sphere was the easy part to get right, thanks to millighost's remark that we have Y and Z swapped. I deleted omnifreaker's environment sphere and replaced it with one I made in Wings3D and exported as a Wavefront OBJ with Y and Z swapped. The environment panorama can be mapped on it without mirroring or offsetting.
I'm more concerned by the light itself. I was successful in remapping the HDRI in 2 stages. I got the shadows right in azimuth and length (matching the sun's elevation). I processed an indoor HDRI the same way and compared the result with Bryce renders. I was quite pleased with the result, though the method is not elegant.
But - now I have a colour HDRI of the sort 3dcheapskate used for his tests. My method shows flaws. Whatever I do, I can make any 4 faces in a row of the cube in the centre get the correct colour. Two opposite faces always get mixed colours. My setup has a white cube in the world centre and 6 cameras looking at any one of the cube faces. The HDRI ought to illuminate each cube face with another colour and no colour mixing ought to occur.
That is what really puzzles me most now - there is a known pole offset problem, but it sometimes doesn't show up (e.g. in my tests) !
So there must be something different between what Tempest/Horo are doing, and what I'm doing, for these colour-spots test:
- I'm still using an older version, DAZ Studio 4.5.1.6 Pro (64-bit), on a Windows 7 system. What version are you folks using?
- The colour spots image I'm using is created from an LDRI JPG source.
I'm using DS 4.6.0.18 Pro 64-bit on Win 7 64-bit. I don't think the version of DS is an issue; UE2 already came with DS 3 - which I still have installed. I could, theoretically, set up the same scene in DS 3 and check it.
The HDRI I'm using is a true HDRI with a moderate dynamic range of 8741:1. The colour dots are smaller than yours (3dcheapskate). Mapping of the HDRI on the light sphere is not conventional, there's no doubt about that. I'm still uncertain whether the light generation, or how it is interpreted by the 3Delight renderer, is flawed.
I recall being told by a trustworthy source that if a surface is perfectly horizontal (like the top/bottom faces of a cube primitive on loading) then DAZ Studio gets the lighting wrong. I remember doing some checks and confirming that the problem occurred at roughly an elevation >88° or < -88° (correction (19 Sep 6am UTC): the angles I got from my checks were actually ±89.91°) ...
(I'm fairly sure the thread/post was here on the DAZ forums, but it may have been on Renderosity - I'm still searching for it, and I'll post the link here when I find it...)
Edit: maybe it was this thread? ... can't check now as I have to rush off now... I'll check later. Edit 2 (19Sep 6amUTC): Just confirmed - it was that thread I was thinking of, millighost's reply to my point (2) in post #14. But re-reading that whole thread, I'm not sure if that bug could cause the problem Horo's seeing?
If what you say has some truth in it, it would explain why an HDRI of a landscape or room appears to create the correct light and an artificially made one doesn't. I have long suspected that something is wrong if a light source is at the zenith (from when I made my tests with just one single light). I abandoned that suspicion when I rotated the UE2 sphere so that a side light got to the top and the top was correctly lit. On second thought, that test doesn't negate what you suggest.
If this ±88° issue is a real one, then the light creation is wrong. Obviously, in the spherical projection, only the equator is strictly correct, and distortions are introduced the closer to the poles we get. This has to be accounted for when the lights are generated - with whatever algorithm. My pole lights are rather small, so I've got to make another HDRI with larger pole lights.
The light dots had a diameter of about 15°. So I remade the HDRI with the top and bottom light at 35°, the other 4 at 17.5°. It looked a bit promising but not completely. Blue is the issue.
I multiplied the colours according to the inverse of the eye response curve where 30% red, 59% green and 11% of blue make white. So I multiplied red by 3.3, green by 1.69 and blue by 9.1, still using the HDRI with the double sized light on top and bottom.
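The multipliers above are just the reciprocals of those eye-response weights:

```python
# Inverse-luminance multipliers from the 30/59/11 weights used above:
weights = {"red": 0.30, "green": 0.59, "blue": 0.11}
for colour, w in weights.items():
    print(f"{colour}: x{1.0 / w:.2f}")
# red: x3.33, green: x1.69, blue: x9.09 (matching 3.3 / 1.69 / 9.1 to rounding)
```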
I got all colours right - except blue itself was much too strong. However, cyan came out cyan, not green, and pink came out pink, not red. Does UE2 have a colour issue, or 3Delight?
Of course, wrong colour mixing can still be due to some mapping offsets. But having 5 cube faces out of 6 right makes me really wonder.
@3dcheapskate - it finally dawned on me that you mentioned that straight lines seem to be a problem and I exchanged the cube by a sphere. Looks quite good, I had the impression, there was no light from top and bottom but with the colour mixing, who could say for sure?
So I used an HDRI with only the top and bottom lights and I got very strange results: From front, the sphere stays black, from the back I have the sphere coloured above with the light above and below with the one from below. From left, the left half of the sphere has the colour from the top on the upper side and on the lower side the one from below. Looking at it from the right side gives the same result, only "the moon" is now waxing where it was waning before. From above, I see on half of the sphere the light and from bottom again half from below.
None of my tests has shown any consistency whatsoever. There is a reason why the HDRI must be a specular convolved one. This gives nice ambient light and covers up that it absolutely doesn't work. Image based light is for rendering with natural light, and this can only be done if the HDRI is high resolution, high dynamic range, the lights generated accordingly and the render engine up to the task. By the way, I also used just LDRIs for all the tests and I got similar results.
It is possible to make the lights and shadows look correct - for one particular HDRI. What appears to work for one HDRI doesn't necessarily work for another.
Here I put a cube in a scene and applied Horo's image to UberEnv2 (the dark "background" in the envmap was not completely black, which washed out the colors a bit, so I blackened it in GIMP). Without any rotations applied, I think it basically works as expected: cyan is on the front, yellow right and green top. For the render on the right, I removed all the colored dots, leaving only the cyan and magenta "poles". Only the front and back sides are lit, except for some hardly visible pixels on the other sides (only the front side is shown).

The colors are slightly wrong because the render was gamma encoded, which might be one aspect of your trouble: in order to use a lat/long image, UberEnvironment2 copies the image as-is into 3Delight, but since 3Delight only uses linear images I should have converted them to linear space (which I did not do) or rendered with gamma=1 (which I did not do either). This is generally not necessary with HDRI, because HDRI does not normally use gamma correction, but there will likely be some color errors when someone uses an HDRI generated by e.g. adding two sRGB JPEGs.
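The linear-space conversion mentioned here is the standard sRGB decode; a small Python sketch of what "converting to linear space" would mean for an LDR-derived map:

```python
def srgb_to_linear(c):
    """Standard sRGB decode (IEC 61966-2-1) for a 0..1 channel value:
    maps a gamma-encoded channel back to linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

# Mid-grey in sRGB is only about 21% linear light, so sampling a
# gamma-encoded map as if it were linear skews the colour balance:
print(round(srgb_to_linear(0.5), 3))   # 0.214
```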
The HDRI I'm using is a true HDRI with a moderate dynamic range of 8741:1. The colour dots are smaller than yours (3dcheapskate). Mapping of the HDRI on the light sphere is not conventional, there's no doubt about that. I'm still uncertain whether the light generation, or how it is interpreted by the 3Delight renderer, is flawed.
Note that there is no "light generation" with UberEnv2; lighting is based purely on sampling the image. 3Delight can do something like determining the direction of the strongest light for an environment map, but UberEnv2 does not use this feature, as far as I know. Also, make sure that your environment map really is a lat/long map; at least for me, the reason for some strange color mixes is often that the textures look like lat/long maps but are not tagged as such. The tdl files that 3Delight uses are simply TIFFs (Tagged Image File Format). 3Delight looks at the tags to determine whether it is a lat/long map. I use the tiffdump utility to show the tags (because I do not know of anything better). The most interesting tags for textures are 33302 and 33303; make sure it says LatLong. The output looks like this (only longer):
If there is something else in the 33302 tag, 3Delight assumes some other mapping (angular mapping, I think, but I am not sure), and of course the results will be different. This is basically what tdlmake -envlatl does differently from the tdlmake that DAZ Studio calls automatically when optimizing textures: it writes those tags to the image.
@millighost - thank you for dropping in here, your insights are very much appreciated.
The image you took is an LDRI JPG, not an HDRI. I can always make any of my test HDRIs available for free download if there's interest. However, I've also noticed that there is essentially no difference whether an HDRI or an LDRI is used for the colour test. Never mind washed-out colours; as long as they are correct, I don't care at this stage. I have no idea where in 3Delight I could switch gamma on or off. The HDRIs I use are always linear; they are true 96-bit TIFFs (or RGBE Radiance HDRs).
And, of course, I use truly spherically projected images, not just some 2:1 aspect-ratio things. The TIFF tags 33302 and 33303 are neither standard nor registered private tags and can only have a relevance in a particular program. As for valid TIFF tags, see www.digitalpreservation.gov/formats/content/tiff_tags.shtml. There is no program I know of that creates these tags. I don't know what a TDL file is; as you say, it's 3Delight's variant of TIFF. How can one extract it?
UE2 takes a 96-bit TIFF either compressed or not for the light. It wants it in the spherical projection and maps it around an invisible sphere. How exactly it is mapped on the sphere, I don't know, and this is part of all these tedious experiments to find out. If the spherical HDRI panorama is not correctly mapped on the sphere, we get funny colours and I think we all absolutely agree here.
Whether point or area light sources are created from the loaded HDRI is not relevant, but it must be known by the render engine in order to interpret what it finds out from the feeler rays it sends out.
Creating distinctive light sources from the HDRI loaded has the advantage that there is a way to control quality and speed by the number of lights created - or the light source resolution; this pixel is bright, the one next to it dark. Each bright pixel has the same brightness. The nearer those pixels are together, the brighter the light source is considered.
The disadvantage is that there are no parts that fade out or mix with adjacent colours. We can see this in Bryce because this results in shadow banding, even with 4096 light sources. If 3Delight scans the HDRI backdrop, it must be able to interpret the brightness of each pixel in the HDRI. This is very time consuming but should give the best result. If only a coarse scan is made and the rest around it guessed, speed increases and accuracy drops. But accuracy only drops if the HDRI is high resolution as mandatory for RNL (rendering with natural light), resolution doesn't drop noticeably if a specular convolved HDRI is used, the less the smaller Phong exponent used and least at exponent 1, which results in a diffuse convolved HDRI.
What I want is to render with natural light in DS like I can in Bryce and Octane (and in Carrara, though I performed only preliminary tests, being the Carrara dummy I am). I could make RNL work in DS for 2 HDRIs up to now. Using the same method doesn't work with the colour cube. What roughly works with the colour cube doesn't work with an outdoor and indoor HDRI.
That's why I say UE2 (or UE2 with 3Delight) doesn't work consistently. It may also well be that I want something out of UE2 for which it was never made. All tests I have now performed left me baffled, each one worked in Bryce.
To say something nice, using an uniformly white HDRI (dynamic range 1:1) gives correct results using a sphere to be lit by it or a cube. Moving down intensity results in an even uniform grey, brightness can be nicely controlled. No complaints here.
By the way, I'm using AsTiffTagViewer if I'm not just looking at the hex dump of the IFD.
Comments
With the six coloured lights I used for all my tests I missed the fact that they weren't properly aligned (Edit (11 Sep): or maybe they were? See post 70) - I was just concerned with the right coloured illumination coming from the right quadrant. (Edit: But in my post 36 test I compared an IBL Transformer render with a UE2 X/Y-Rotate=-90 render and they were identical, which they would only be if the lighting was identical. That really puzzles me.)
So when I applied X-Rotate = -90 and Y-Rotate = -90 to the UE2 the lights appeared to illuminate my cube correctly and I thought that was enough. But as Horo pointed out earlier, I really needed to check the sphere in detail - which I clearly didn't do. From your 'back right view' the error appears to be a bit under 45 degrees from this viewpoint - is that the 35 degrees that Horo was initially talking about?
And of course this would explain why the IBL Transformer is so named!
Edit: I assume you didn't rotate the UE2 for your examples, so I guess that your IBL TIF looked like the first attached JPG? (Or if you rotated UE2 -90 X and -90 Y it looked like the second?)
Looking at your two renders, both poles seem to be tilted about 20° towards the back... I'd guess their positions to be very roughly 180° azimuth, ±70° elevation? I'm stumped trying to come up with a mapping that would give your results using either of these images...
...unless...compare the attached lat/long, angular, and mirrorball mappings (edit: these were generated from my vertical cross image in post #50, and aren't flipped horizontally)
If you plug a lat/long map into something that expects an angular map the poles will be off by 45 degrees.
But if you plug a lat/long map into something expecting a mirrorball mapping - maybe 30 degrees (edit: measurements in GIMP put it at about 26-27°)? Could that be it? If so I'd expect the east/west lights to be out by about the same amount as the pole lights (up/down)
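For reference, the lat/long lookup everyone expects is simple enough to sketch. This is my own illustration of the standard equirectangular convention, not UE2's actual code, and the y-up axis choice is an assumption:

```python
import math

def latlong_uv(d):
    """Map a unit direction (x, y, z), y-up, to (u, v) in a lat/long
    (equirectangular) image: u is the azimuth, v is the polar angle."""
    x, y, z = d
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)
    v = math.acos(max(-1.0, min(1.0, y))) / math.pi  # 0 at top pole, 1 at bottom
    return u, v

# The poles must land on the top and bottom rows of the image:
print(latlong_uv((0.0, 1.0, 0.0))[1])   # 0.0  (straight up -> top row)
print(latlong_uv((0.0, -1.0, 0.0))[1])  # 1.0  (straight down -> bottom row)
```

Any mapping that puts those two directions anywhere other than the top and bottom edges (as the angular or mirrorball interpretations would) produces exactly the kind of tilted poles seen in the renders.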
I get more and more confused. I did that top-bottom test as well and I can confirm that the light is not coming exactly from the top and bottom - maybe around latitudes ±20°, but that's just guesswork.
It is hard to believe that the environment map is mapped correctly on the sphere but the HDRI isn't on the invisible one. Why would anybody make one right and the other wrong? After all, you can reuse the code and just call the function twice - no need to reinvent the wheel.
I rather think that the light gathering algorithm is not up to the task. If you use small lights like the ones 3dcheapskate used in post #61 (I used similar), the algorithm doesn't gather them correctly. There are far fewer issues if larger lights are used, but then you cannot make a moderately precise measurement anymore.
There must be a reason why the light sphere ought to be a specular convolved one - the blurred light sources get wider, making it simpler to gather them. I also miss some quality control: how many light sources are generated from the HDRI? With the Monte Carlo algorithm, you usually have a percent control; with the median-cut one, the number of rectangles, each holding a light source.
In my experience, it is very difficult to measure the offsets in UE2 with enough accuracy to be able to take countermeasures. I also found to my frustration that the results are inconsistent, depending on the HDRI used. UE2 came out about 2 years ago but there was never any update. On omnifreaker's website there's nothing new. Does that disinterest - also from DAZ 3D - mean that it isn't worth elaborating on UE2 since it is FUBAR anyway (Fooled Up Beyond Any Repair)?
Very interesting thread. When you guys get this nailed down, can one of you file a bug report please. :)
Well Spooky this was reported back in the DS3 days...yes it has been broken all that time...I hope the new bug report doesn't get ignored like the old one did.
Let's just forget about the Environment Sphere. I'm pretty sure it is at least 18° off just from the UV mapping. But there is more than that for sure.
From what I've seen, the IBL Transformer makes a new sphere with default UV mapping, scale -1 and size 10000, if I'm not mistaken. The old Envsphere is deleted, I guess. So no need to worry about finding how far off it is - just make a preset with a sphere like that one. Having Scale -1 is enough to get it displayed correctly (we see the inside of the sphere, not the outside - neat trick here).
For the environment map, we should do a tdlmake -envlatl -mirrorx to get it right. Then we just have to find the correct rotations.
Once we get that, making a preset with a new sphere and correct orientations of the UE light and Envsphere is pretty trivial
@Horo: We can't display the HDR image directly because something is missing in the OpenGL preview: the ability to display .hdr and .tiff pictures (and eventually OpenEXR too).
That's why you must also convert your HDR to a JPG or PNG file and map it to see it. But a feature request would be good on that point.
It was put on the Studio Basecamp which is now closed.
@Takeo.Kensei - if Z scale is set to -100% and the environment map offset in azimuth by +90° in relation to the HDRI, I had a direction match with the HDRI. None of the images mirrored. Then, the sphere can be rotated, light and shadows follow.
Yes, DS has no tone-mapper to display the HDRI, I understand that. I think it is possible to map an LDRI around it. But I doubt the sphere as such could be made visible. Maybe whatever image is applied as light, HDRI or LDRI, lights are generated from it and those aren't displayed.
Still busy bumbling around with my own ideas, based on the bits of the ongoing discussion that I can understand:
My next simple test:
1) Created three TIF images each with a pair of lights directly opposite each other, so I can test each of the three cardinal direction pairs (north-south, east-west, up-down). Yes - JPG images on the screenshot, but I used properly prepared TIFs for the test!
2) Applied each TIF to the UE2's Light > Basic > Color, with Ambient Only mode and no rotations on the UE2 itself (the spot colours of the TIFs are in accordance with 4.jpg in the first image of post 50)
3) I did four renders for each TIF, one from each of the four fixed cameras that would show both colours.
4) I compared the four render PNGs in GIMP (using horizontal flip and 90deg rotate to align the images first) by using 'Difference' mode, copied the result (which looked like a black circle) and tested whether it was totally black using the 'Fuzzy Select' tool, adjusting the tolerance.
Result: Within a tolerance of about 2% (fuzzy select threshold = 5) the four render PNGs were identical. This was true for each of the three TIFs I used.
Conclusions:
a) My 'could it be a mirrorball mapping' thought is clearly incorrect - the illumination for all 12 renders appears to be coming from the correct directions, with no offset.
b) Horo's observation that the offset results seem to be different for different images seems valid - that's the only way that I can explain how this test indicates that there is no offset, when it's been fairly conclusively proved by other people in earlier posts that there is one! I guess this is where the 'gathering algorithm' stuff comes into it?
c) This would explain why my results were identical to the IBL Transformer results in post 36. My test images do not seem to produce this pole offset...
(It's difficult to judge from the attached screenshot, but this is such a simple test that anybody can do it for themselves. Bear in mind that the size of the spots may have a major effect...)
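For anyone who wants to repeat this without GIMP, step 4's comparison can also be scripted. A rough NumPy sketch of the same check - the tolerance of 5 mirrors the fuzzy-select threshold above; the array shapes and alignment parameters are my assumptions:

```python
import numpy as np

def renders_match(a, b, flip=True, rot90=0, tol=5):
    """Align render b to render a (horizontal flip and/or 90-degree
    rotations, as done in GIMP), then check that the 'Difference'
    result is near-black, i.e. every channel differs by at most tol."""
    if flip:
        b = b[:, ::-1]                      # horizontal flip
    if rot90:
        b = np.rot90(b, rot90)              # 90-degree rotations
    diff = np.abs(a.astype(np.int16) - b.astype(np.int16))
    return int(diff.max()) <= tol

# Synthetic check: an image and its mirror image match after flipping back.
img = np.arange(48, dtype=np.uint8).reshape(4, 4, 3)
print(renders_match(img, img[:, ::-1]))     # True
```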
I'm happy to report that I managed to get the light match the backdrop. Luckily, the shadows are now about 90° off. Oh, I can get the shadows right if I accept that the light illuminates the object from the wrong side. You've got to see this as an artistic feature on which you can capitalize. No need to get mad (I hope I'll be able to convince myself eventually, still struggling).
I don't really get it. If illumination is correct, you should have correct shadows. Can you post examples please?
I haven't begun new tests yet - I have to think a bit first. I believe that one source of the problem is a difference in coordinate systems.
It seems to me that the DS coordinate system is right-handed and 3Delight's must be left-handed. We must take that into account when preparing the preview and HDR pictures. If you want more info: http://www.fundza.com/rib/example4/example4.html
I see what you mean. It appears to be a problem of sampling the lights. If the dynamic range is low, much environment light comes into the scene that can brighten up some of the shadows and we or the renderer or both get a bit confused.
I multiplied the specular convolved HDRI by itself to boost the dynamic range and now the shadows may be considered correct. The first image shows the environment map in the mirror ball and the position of the sun. The UE2 was rotated Y=127° so the light direction matches the environment and the reflection spheres.
For the second example, I created a median-cut representation of the same HDRI and used that map to light the scene. Obviously, either UE2 or the renderer got confused. The direction of the light is correct, but there are quite a few shadows that cannot be accounted for. But this should not concern us too much since we do not know what sort of lights are created from the HDRI. The median-cut variant used assumes point light sources (as Bryce uses) but UE2 might create area light sources (as Bryce does when the HDRI is true ambiance optimised). Nevertheless, an interesting test, though useless.
And thanks for the left/right hand link. I'll consider that as well.
This is actually a bug in DS. I would say that this is probably the only (serious) bug in all this. DS flips the coordinate system of every light (not only the uberenv light). This can easily be seen when you export a scene to a rib file; it will look somehow like this:
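(The RIB snippet referred to here didn't survive the thread scrape; a hand-written sketch of the structure being described - shader name and parameters are placeholders, not the actual DS export - would be:)

```
# world transform: DS flips Z to get a right-handed system
Scale 1 1 -1
WorldBegin
  AttributeBegin
    Identity                          # light transform reset: back to left-handed
    LightSource "uberenvironment2" 1  # placeholder shader name/params
  AttributeEnd
  # scene geometry follows, in the right-handed world space
WorldEnd
```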
The "Scale 1 1 -1" sets up a right-handed coordinate system in which all objects are added (the "world" coordinate system). But when a light source is added, the "Identity" resets it back to the default left-handed coordinate system. As a result, all light sources have a flipped z-coordinate. So what should one do when creating a light shader in this situation? Either build the bug into the light shader, with the risk that the light shader stops working when the bug is fixed one day; or wait until the bug gets fixed (which might take a long time depending on the circumstances); or build the light shader according to some de facto standard, like other light shaders are made. This last option seems to be what has been done by omnifreaker (I probably would have done the same). But the de facto standard in this case would be to use the same coordinate system for environment maps as all other shader functions (environment and indirectdiffuse) use, and this means to use the latitude of the latitude/longitude map for the z-direction. A bit inconvenient, because most maps you will find on the internet use the latitude for the y-direction (as the previous posts pointed out), but perhaps not that inconvenient, because corrections have to be made anyway (because of the coordinate-system flipping bug).
I think I will file a bug report for this; perhaps they will fix it, now that more people seem to have noticed it :-)
My image map was effectively the same as your second image map, and yes the poles for the lighting were tilted to the back (if I remember correctly).
I haven't really done much experimentation since, and I could well be off here or there. A while back I was trying to fix the lighting problem (much the same way you were) by trying to match the environment sphere with the HDRI by rotating the former, but after this test it became evident that it wouldn't be enough - which I guess is the main thing you can take away from my little experiment. Shortly after, I was directed to the IBL Transformer, which seemed to fix the problem, so I just couldn't be bothered with it anymore.
For the tests, the same HDRI with a prominent light source (a rather bright sun) was used, it was shifted and flipped, but it is essentially the same. The scenes (DS and Bryce) were never changed, shaders, light intensities or whatever parameters were left untouched, only the HDRI was replaced.
In DS UE2, the light is offset by 127.5°, the shadows are correctly cast in the opposite direction of the light. However, there's another strong shadow about 30° apart that cannot be accounted for. The form of the shadows is wrong.
If the HDRI is flipped, we get the same result and we would certainly expect that since the sun is in the centre, mirroring the HDRI does not affect its position. The shadows are a wee bit darker. This is most probably due to the ambient light - though comparatively weak - that plays its part.
If the sun is moved right by 90°, it illuminates the objects from the wrong side and the shadow appears in yet another direction. The shape of the shadow somehow resembles what we would expect to see, but much too small.
Finally, flipping the shifted HDRI so that the sun shines from the left, the objects get even less light and also from the opposite direction. All shadows have disappeared.
I repeated the same tests in Bryce, which needs only the HDRI as light source; it can be tone-mapped to be used as backdrop as well. For DS UE2 I had to use two additional spheres, one within the other: one for the backdrop and one to create the reflection on the mirror ball. However, I did delete them at one time just to see whether the light changes. It did not.
Also in the Bryce examples, only the HDRI was exchanged. The light shines from the expected direction, the shadows are cast away from the sun and their shape is correct.
Image Based Light is a means to render with natural light. This cannot be done in DS UE2. Using the same HDRI with just the prominent light source moved gives completely different results. At least these tests explain why I got different results using different HDRIs. DS UE2 does not behave consistently.
I doubt this issue can be explained with the left and right handed coordinate system. I strongly suspect the lights are not correctly derived from the HDRI, the algorithm used is either not up to the task or was wrongly implemented. Start with any of the four HDRIs and rotate UE2 with the environment and reflection spheres, the result is always the same, rotation works correctly.
Smart. Thanks for the heads-up. I didn't think of having a look at a RIB export. That may explain things. I'll try to set up some tests asap. It would be good to settle all this. One thing I have in mind is that omnifreaker must not have known about this. Otherwise you could always create an option in the shader to work around the bug, and disable it when the bug is cleared.
@Horo: we don't have a clear demonstration yet, but a mess in the coordinate system can completely mess up your results. I'll try to come up with something.
That would be very helpful. I haven't figured out how to export as RIB, I can't even export uncompressed DUF, though it should be possible. As for the coordinates: I'm probably a bit at a loss here as well. A panorama is usually mapped on the outside of a sphere, if the camera is inside, it is mirrored and this can be corrected if Z is made negative - or by mirroring the panorama. I'm probably missing more than I'm aware of.
Using a mirrored panorama image is the answer, as you have mentioned already.
The light source TIFF auto-converted from the HDR, and the diffuse background image (PNG or JPG), should both be mirrored if I use DAZ UE2.
I have used Osomaki's IBL Transformer from the start, so I thought a mirrored image was the default. It has an option to use a mirrored image or the normal one.
When I check the TIFF and JPG (from the Parameters pane and Surfaces pane), if they haven't been mirrored, I simply change the option setting. It automatically makes another mirrored TIFF, then applies it as the light source.
After sending a bug report to Mantis about the axis problem, I thought DAZ would never correct it. There has been no confirmation from anyone official, just from some other users who confirmed it and added more details in clear English; I sometimes checked them. (Thanks Szark and Tempest!)
If DAZ corrects it in DS 5 based on the findings of Horo, who found and introduced this problem, that would seem reasonable.
Well, I'm still at it. The environment sphere was the easy part to get right thanks to millighost's remark that we have Y and Z swapped. I deleted omnifreakers environment sphere and replaced it by one I made in Wings3D and exported as Wavefront OBJ with Y and Z swapped. The environment panorama can be mapped on it without mirroring or offsetting.
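For anyone who'd rather not re-export from Wings3D, the Y/Z swap can also be done directly on the OBJ text. A minimal sketch (my own illustration; it handles only 'v' vertex lines and leaves normals and UVs alone):

```python
def swap_yz_obj(lines):
    """Swap the Y and Z coordinates of every vertex ('v') line in a
    Wavefront OBJ file, leaving all other lines untouched."""
    out = []
    for line in lines:
        parts = line.split()
        if parts and parts[0] == "v" and len(parts) >= 4:
            x, y, z = parts[1], parts[2], parts[3]
            out.append(" ".join(["v", x, z, y] + parts[4:]))
        else:
            out.append(line.rstrip("\n"))
    return out

print(swap_yz_obj(["v 1.0 2.0 3.0", "f 1 2 3"]))
# ['v 1.0 3.0 2.0', 'f 1 2 3']
```

Note that swapping two axes is a reflection, so it also mirrors the mesh (and flips the face winding) - which, going by the description above, is exactly what lets the panorama map onto the inside without mirroring the image.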
I'm more concerned by the light itself. I was successful in remapping the HDRI in 2 stages. I got the shadows right in azimuth and length (matching the sun's elevation). I processed an indoor HDRI the same way and compared the result with Bryce renders. I was quite pleased with the result, though the method is not elegant.
But - now I have a colour HDRI of the sort 3dcheapskate used for his tests. My method shows flaws. Whatever I do, I can make any 4 faces in a row of the cube in the centre get the correct colour. Two opposite faces always get mixed colours. My setup has a white cube in the world centre and 6 cameras looking at any one of the cube faces. The HDRI ought to illuminate each cube face with another colour and no colour mixing ought to occur.
millighost - when you mentioned in post 74 about filing a bug report, are you refering to a bug report about this whole UE2 problem, or a bug report about DS flipping the coordinate systems of all lights?
kitakoredaz - glad to see that you found this thread! Yes, it's very disappointing that DAZ have known about this for at least two years, but have done nothing to fix it. Hopefully DAZ_Spooky will make sure that it isn't forgotten about this time!
The colour-mixing on two opposite faces seems to correspond with Tempest!'s comments and example renders in post #59. However, my attempts in post 70 to reproduce the misaligned poles failed completely - my poles seemed perfectly aligned.
That is what really puzzles me most now - there is a known pole offset problem, but it sometimes doesn't show up (e.g. in my tests) !
So there must be something different between what Tempest/Horo are doing, and what I'm doing, for these colour-spots test:
- I'm still using an older version, DAZ Studio 4.5.1.6 Pro (64-bit), on a Windows 7 system. What version are you folks using?
- The colour spots image I'm using is created from an LDRI JPG source.
I'm using DS 4.6.0.18 Pro 64-bit on Win 7 64-bit. I don't think the version of DS is an issue; UE2 already came with DS 3 - which I still have installed. I could, theoretically, set up the same scene in DS 3 and check it.
The HDRI I'm using is a true HDRI with a moderate dynamic range of 8741:1. The colour dots are smaller than yours (3dcheapskate). Mapping of the HDRI on the light sphere is not conventional, there's no doubt about that. I'm still uncertain whether the light generation, or how it is interpreted by the 3Delight renderer, is flawed.
There may be yet another problem involved here!
I recall being told by a trustworthy source that if a surface is perfectly horizontal (like the top/bottom faces of a cube primitive on loading) then DAZ Studio gets the lighting wrong. I remember doing some checks and confirming that the problem occurred at roughly an elevation >88° or <-88° (Correction (19 Sep, 6am UTC): the angles I got from my checks were actually ±89.91°)...
(I'm fairly sure the thread/post was here on the DAZ forums, but it may have been on Renderosity - I'm still searching for it, and I'll post the link here when I find it...)
Edit: maybe it was this thread? ... can't check as I have to rush off now... I'll check later.
Edit 2 (19Sep 6amUTC): Just confirmed - it was that thread I was thinking of, millighost's reply to my point (2) in post #14. But re-reading that whole thread, I'm not sure if that bug could cause the problem Horo's seeing?
If what you say has some truth in it, it would explain why an HDRI of a landscape or room appears to create the correct light and an artificially made one doesn't. I have long suspected that something is wrong if a light source is in the zenith (when I made my tests with just 1 single light). I abandoned that suspicion when I rotated the UE2 sphere so that a side light got on top and the top was correctly lit. On second thought, that test doesn't negate what you suggest.
If this +/-88° issue is a real one, then the light creation is wrong. Obviously, in the spherical projection, only the equator is strictly correct and distortions are introduced the higher up to the poles we get. This has to be accounted for when the lights are generated - with whatever algorithm. My pole-lights are rather small, I've got to make another HDRI then with larger pole lights.
The light dots had a diameter of about 15°. So I remade the HDRI with the top and bottom light at 35°, the other 4 at 17.5°. It looked a bit promising but not completely. Blue is the issue.
I multiplied the colours according to the inverse of the eye response curve where 30% red, 59% green and 11% of blue make white. So I multiplied red by 3.3, green by 1.69 and blue by 9.1, still using the HDRI with the double sized light on top and bottom.
I got all colours right - except blue itself was much too strong. However, cyan came out cyan, not green, and pink came out pink, not red. Does UE2 have a colour issue, or 3Delight?
Of course, wrong colour mixing can still be due to some mapping offsets. But having 5 cube faces out of 6 right makes me really wonder.
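The multipliers used above are just the reciprocals of the classic 30/59/11 luma weights; a quick sanity check:

```python
# Classic luma weights: white = 0.30 R + 0.59 G + 0.11 B
weights = {"red": 0.30, "green": 0.59, "blue": 0.11}

# Multiplying each channel by 1/weight equalises their contribution:
for name, w in weights.items():
    print(f"{name}: multiply by {1.0 / w:.2f}")
# red: multiply by 3.33
# green: multiply by 1.69
# blue: multiply by 9.09
```

That 9.09 for blue matches the "9.1" quoted above, and explains why any error in the blue channel gets amplified the most.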
@3dcheapskate - it finally dawned on me that you mentioned that straight lines seem to be a problem, so I exchanged the cube for a sphere. It looks quite good; I had the impression there was no light from the top and bottom, but with the colour mixing, who could say for sure?
So I used an HDRI with only the top and bottom lights and I got very strange results: From front, the sphere stays black, from the back I have the sphere coloured above with the light above and below with the one from below. From left, the left half of the sphere has the colour from the top on the upper side and on the lower side the one from below. Looking at it from the right side gives the same result, only "the moon" is now waxing where it was waning before. From above, I see on half of the sphere the light and from bottom again half from below.
None of my tests has shown any consistency whatsoever. There is a reason why the HDRI must be a specular convolved one. This gives nice ambient light and covers up that it absolutely doesn't work. Image based light is for rendering with natural light, and this can only be done if the HDRI is high resolution, high dynamic range, the lights generated accordingly and the render engine up to the task. By the way, I also used just LDRIs for all the tests and I got similar results.
It is possible to make the lights and shadows looking correct. For one particular HDRI. What appears to work for one HDRI doesn't necessarily work for another.
Here I put a cube in a scene and applied Horo's image to the uberenv2 (the dark "background" in the envmap was not completely black, which washed out the colors a bit, so I blackened it in GIMP). Without any rotations applied, I think it basically works as expected: cyan is on the front, yellow right and green top. For the render on the right, I removed all the colored dots, leaving only the cyan and magenta "poles". Only the front and back sides are lit, except for some hardly visible pixels on the other sides (only the front side is shown). The colors are slightly wrong because the render was gamma encoded, which might be one aspect of your trouble: in order to use a lat/long image, uberenvironment2 copies the image as-is into 3delight, but since 3delight only uses linear images I should have converted them to linear space (which I did not do) or rendered with gamma=1 (which I did not do either). This is generally not necessary with HDRI, because HDRI normally does not use gamma correction, but there will likely be some color errors when someone uses an HDRI generated by e.g. adding two sRGB JPEGs.
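The linear-space conversion mentioned here is easy to sketch. This assumes the standard sRGB transfer function (the thread doesn't say which encoding the source JPEGs actually used, so it is illustrative only):

```python
def srgb_to_linear(c):
    """Convert one sRGB channel value in [0, 1] to linear light,
    using the standard piecewise sRGB decoding curve."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# Mid-grey in sRGB is much darker in linear light:
print(round(srgb_to_linear(0.5), 4))   # ~0.214
```

Feeding gamma-encoded pixels straight into a linear renderer over-weights the mid-tones, which is consistent with the "slightly wrong" colors described above.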
Note that there is no "light generation" with uberenv2; lighting is based purely on sampling the image. 3delight can do something like determining the direction of the strongest light for an environment map, but uberenv2 does not use this feature, as far as I know.
Also, make sure that your environment map is really a lat/long map; at least for me, the reason for some strange color mixes is often that the textures look like lat/long maps but are not tagged as such. The tdl files that 3delight uses are simply TIFF (Tagged Image File Format) files. 3delight looks at the tags to determine if it is a lat/long map. I use the tiffdump utility to show the tags (because I do not know of anything better). The most interesting tags for textures are 33302 and 33303; make sure it says LatLong. The output looks like this (only longer):
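(The sample tiffdump output didn't survive the thread scrape; going by the description, the listing would presumably resemble something like the following - the layout and values are illustrative only, the key point being the 33302/33303 entries:)

```
env.tdl:
Directory 0: offset 8 (0x8) next 0 (0)
ImageWidth (256) SHORT (3) 1<1024>
ImageLength (257) SHORT (3) 1<512>
...
33302 (0x8206) ASCII (2) <LatLong>
```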
If there is something else in the 33302 tag, 3delight assumes some other mapping (I think angular mapping, but I am not sure), and of course the results will be different. This is basically what tdlmake -envlatl does differently from the tdlmake that DAZ Studio calls automatically when optimizing textures: it writes those tags to the image.
@millighost - thank you for dropping in here, your insights are very much appreciated.
The image you took is an LDRI JPG, not an HDRI. I can always make available any of my test HDRIs for free download if there's an interest. However, I've also noticed that there is essentially no difference whether an HDRI or an LDRI is used for the colour test. Never mind washed-out colours, as long as they are correct, I don't care at this stage. I have no idea where in 3Delight I could switch on or off gamma. The HDRIs I use are always linear, they are true 96-bit TIFFs (or RGBE radiance HDRs).
And, of course, I use truly spherically projected images, not just some 2:1 aspect ratio things. The TIFF tags 33302 and 33303 are neither standard nor registered private tags and can only have relevance in a particular program. As for valid TIFF tags, see www.digitalpreservation.gov/formats/content/tiff_tags.shtml. There is no program I know of that creates these tags. I don't know what a TDL file is. As you say, it's 3Delight's variant of TIFF. How can one extract it?
UE2 takes a 96-bit TIFF either compressed or not for the light. It wants it in the spherical projection and maps it around an invisible sphere. How exactly it is mapped on the sphere, I don't know, and this is part of all these tedious experiments to find out. If the spherical HDRI panorama is not correctly mapped on the sphere, we get funny colours and I think we all absolutely agree here.
Whether point or area light sources are created from the loaded HDRI or not is not relevant in itself, but it must be known by the render engine in order to interpret what it finds out from the feeler rays it sends out.
Creating distinctive light sources from the HDRI loaded has the advantage that there is a way to control quality and speed by the number of lights created - or the light source resolution; this pixel is bright, the one next to it dark. Each bright pixel has the same brightness. The nearer those pixels are together, the brighter the light source is considered.
The disadvantage is that there are no parts that fade out or mix with adjacent colours. We can see this in Bryce because it results in shadow banding, even with 4096 light sources. If 3Delight scans the HDRI backdrop, it must be able to interpret the brightness of each pixel in the HDRI. This is very time-consuming but should give the best result. If only a coarse scan is made and the rest around it guessed, speed increases and accuracy drops. But accuracy only drops if the HDRI is high resolution, as is mandatory for RNL (rendering with natural light); accuracy doesn't drop noticeably if a specular convolved HDRI is used - the less so the smaller the Phong exponent used, and least at exponent 1, which results in a diffuse convolved HDRI.
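As an illustration of the "this pixel is bright, the one next to it dark" idea - emphatically not UE2's or Bryce's actual algorithm - a naive brightest-texel light extraction from a lat/long HDR might look like this:

```python
import numpy as np

def brightest_lights(hdr, n=4):
    """Pick the n brightest texels of a lat/long HDR array (h, w, 3) as
    light directions, weighting each row by its solid angle: rows near
    the poles cover much less of the sky than rows near the equator."""
    h, w, _ = hdr.shape
    lum = hdr @ np.array([0.30, 0.59, 0.11])      # per-texel luminance
    pol = np.pi * (np.arange(h) + 0.5) / h        # polar angle of each row
    lum = lum * np.sin(pol)[:, None]              # solid-angle weighting
    rows, cols = np.unravel_index(np.argsort(lum, axis=None)[-n:], (h, w))
    dirs = []
    for r, c in zip(rows, cols):
        theta = np.pi * (r + 0.5) / h             # polar angle
        phi = 2.0 * np.pi * (c + 0.5) / w         # azimuth
        dirs.append((np.sin(theta) * np.sin(phi),
                     np.cos(theta),
                     -np.sin(theta) * np.cos(phi)))
    return dirs

hdr = np.zeros((8, 16, 3))
hdr[1, 4] = [10.0, 10.0, 10.0]                    # one bright texel, upper half
print(brightest_lights(hdr, n=1))                 # one direction with y > 0
```

A real implementation (median-cut, importance sampling) would of course merge neighbouring bright texels into single lights instead of treating each texel independently, which is exactly the fade-out/mixing behaviour described as missing above.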
What I want is to render with natural light in DS like I can in Bryce and Octane (and in Carrara, though I performed only preliminary tests, being the Carrara dummy I am). I could make RNL work in DS for 2 HDRIs up to now. Using the same method doesn't work with the colour cube. What roughly works with the colour cube doesn't work with an outdoor and indoor HDRI.
That's why I say UE2 (or UE2 with 3Delight) doesn't work consistently. It may also well be that I want something out of UE2 for which it was never made. All tests I have now performed left me baffled, each one worked in Bryce.
To say something nice: using a uniformly white HDRI (dynamic range 1:1) gives correct results, whether a sphere or a cube is lit by it. Turning down the intensity results in an even, uniform grey; brightness can be nicely controlled. No complaints here.
By the way, I'm using AsTiffTagViewer if I'm not just looking at the hex dump of the IFD.
@Millighost : sent you a PM. Did you get it?
No! Thank you for asking. I've checked my PM inbox, there's nothing.