Can you render normal maps?

So, I'm wondering if there is some way (by plugin or custom shader or similar) to render a normal map for a scene.
Something like this:
http://www.alkemi-games.com/a-game-of-tricks/spritesheet_diff_norm/
Like a normal map based on distance from camera or something.
Comments
I'm not an expert on normal maps, but I believe you need specific software to create/render normal maps. I don't know of any way to do so within DS, if that's the software with which you're trying to accomplish it. Perhaps someone else knows a way to do it in DS.
Difference to the camera would be a depth map or Z-buffer map - which you can certainly do fairly easily with Shader Mixer or with store products. A normal map is a map that is coloured to show the direction in which the surface normal is pointing, relative to the object axes or some other reference - since the normals are available to Shader Mixer I'm sure one could be made, though I'm not sure how useful it would be - after all, DS would have to be able to generate the normals in the first place so you wouldn't necessarily gain anything by creating a normal map.
So Normals aren't used in/by Daz3D?
I've used one once; it was very subtle and added just a tiny bit of detail to the object. I hope to get more experience with normal maps, though.
A normal map is just a way to tell the renderer which way a part of a model should be treated as facing, the way a bump map does but more directly - it allows the faking of detail, as the surface is lit according to the way the map says it is facing, not the way it's really facing. DS does support normal maps, but you were apparently asking about creating them. Turning the existing elements that determine the shading normal (the direction the surface is treated as facing) into a set of normal maps won't gain you anything for still images - normal maps aren't intrinsically better than other methods given the same inputs. They might be useful if you were exporting to another application, or possibly if you baked them once and then used them in all the frames of an animation. When people speak of baking a normal map they are usually going from a high-resolution mesh with lots of real detail down to a low-resolution mesh with the normal map faking the detail.
Let me ask the question a bit differently:
I want to press render in Daz3D to create an image. This image will be your average render.
Then I want to press render again, and get an image that is a normal map for the first image.
So I'm not trying to get a normal map to fit a figure or prop. I'm trying to generate a 2d sprite from a daz figure, and then make a normal map for it.
See the bottom part of this if you still don't get what I mean (http://www.alkemi-games.com/a-game-of-tricks/). It's a bit of a lengthy read, but it explains normal maps fairly well.
Hope I manage to get the issue across here.
DS handles normals and tangent space kind of funny, I spent a while playing with them recently while learning my way around.
In your case, what you want is a bent normal shader as a render pass. This renders the scene's normals as color, as you'd expect. It looks like it should be possible in Daz Studio. I've little experience with writing shaders, but here's a RenderMan example shader: http://renderman.pixar.com/view/image-based-lighting-code1
It may be really simple to do: I see a shader brick that calculates surface normals and outputs a vector; I assume the only thing to do from there is convert that to RGB and plug it into the diffuse of a simple material. Maybe someone with more experience can help out with that.
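The "convert the normal vector to RGB" step described here is just a linear remap from [-1, 1] to [0, 255] per component. A minimal Python sketch of that math (the function name is mine, not a DS or Shader Mixer API):

```python
import math

def normal_to_rgb(nx, ny, nz):
    """Encode a surface normal as a normal-map color (vector -> RGB)."""
    # Normalize first, in case the input isn't exactly unit length.
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    nx, ny, nz = nx / length, ny / length, nz / length
    # Remap each component from [-1, 1] to [0, 255].
    to_byte = lambda c: round((c + 1.0) / 2.0 * 255)
    return (to_byte(nx), to_byte(ny), to_byte(nz))

# A normal pointing straight at the viewer gives the familiar flat
# normal-map blue: (128, 128, 255).
print(normal_to_rgb(0.0, 0.0, 1.0))
```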
That's not ideal in this situation, Crazy Bump (or any 2d normal map creator) will use the colors and "guess" the shape. It's better to create from a 3d source if you have it, which we do.
Oh I hear you but the OP asked to make a normal map from a sprite, a sprite being a flat 2d Image on a flat plane. I know Blender can make normal maps from a 3D source like ZBrush but from what I can gather from the OP's post this is not what is wanted.
I believe the goal is something like this, but for posed DAZ characters:
http://snakehillgames.com/spritelamp/
Interesting, and I have no idea if my suggestions would work... I do still images only.
I think you could do that with Shader Mixer, you can get the current surface normal from a variable brick, split it into its components and process them to make a colour which you would then apply. I suspect it would be best done as a camera shader, so that you kept the normal generated by the surface shaders.
I got curious, so I did this, but found that we'd need geometry normals, not Surface Normal (which ignores edge smoothness, AFAIK). An NTransform (convert from object to camera) > Normalize > Diffuse slot *almost* works, but still needs something; it looks like this.
I think the better route is to work from that Renderman example in the Shader Builder, I don't have time to figure all that out today.
This is what I've been playing around with so far. It works very well if the scene is set up with 5 different light sources, and 5 renders are used to put together a normal map. It's a lot of work though, for something that should be possible with a single render.
Will try to look at the renderman shader when I get home from work, but I'm a rookie with shaders as well. All help is much appreciated.
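For reference, here is a rough Python sketch of how a multi-light combination like the one described above could work for a single pixel. The direction names, the weighting, and the whole recipe are my assumptions about this kind of workflow, not necessarily the exact five-light setup used here:

```python
def lit_renders_to_normal(left, right, bottom, top, front):
    """Estimate one normal-map pixel from five directionally lit renders.

    Inputs are 0..1 brightness values of the same pixel in each render.
    Opposing light pairs approximate the x/y slope of the surface; the
    front-lit render approximates z.
    """
    x = right - left      # brighter from the right -> surface faces right
    y = top - bottom      # brighter from above -> surface faces up
    z = front             # front-lit brightness ~ how much it faces the camera
    r = round((x + 1) / 2 * 255)   # remap [-1, 1] to 0..255
    g = round((y + 1) / 2 * 255)
    b = round(z * 255)             # z is already 0..1
    return (r, g, b)

# A flat, camera-facing pixel: equally side-lit, fully front-lit.
print(lit_renders_to_normal(0.5, 0.5, 0.5, 0.5, 1.0))
```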
you can certainly do it in Carrara
And, as I found, Octane standalone can as well, but the price may be a bit steep for you.
I love normal maps and use them a lot in place of displacement maps, which really should be named pokethru maps. ;) I used Crazy Bump for a while, but now it is nDo2 that rules. Why try to force DS to do things it is not made for?
The reason for using DS would be that it has access to the actual normals - using Crazy Bump, B2M or (as far as I know) nDo2 takes an image and tries to work out where the geometry is from the colours - a process that is inevitably prone to error in at least some cases.
True, every method has its pros and cons. Normal maps from actual geometry, such as from ZBrush, are superior - if you have a steady hand. That was my preferred workflow before. But more and more I find myself using the pen and other tools in Photoshop together with programs like C4D and Modo that show the result of a height map in real time, with OpenGL. And you can work with layers. When you are satisfied, convert with nDo2, which works inside Photoshop. After a while you learn the knack of how to manually construct your normal maps.
I have also tried from the finished heightmap, instead of converting directly take it into ZBrush and "freeze" the offset (Apply DispMap). Then enhance in ZBrush with smooth, pinch etc. and make the Normal map from actual geometry. The result can be very good but mostly I judge it not worth the trouble. But it is an option.
I've been trying to understand this, but I just can't figure out what the bent normal map is supposed to do. As far as I can gather, "Bent normals show the average unoccluded direction. That is to say, given a uniform ambient environment, the bent normal is the direction along which most light reaches a point. i.e., if you were standing at the bottom of a deep canyon, your bent normal would point straight up out of the canyon." Maybe I am missing something basic here...
Thanks for all the suggestions for various programs, but I'd rather hoped I/we could get this to work in DS.
I think I'm too tired to figure this out tonight, but the Shader Mixer brick Normal Map has an interesting description regarding bent normals:
Takes an image in “normal map” format (which uses the RGB color channels to provide XYZ vector information for each point) and provides bent normals to a Displacement brick network. This brick can be used for vector displacement of a surface or just shading, depending on which parameters of a DS Default Displacement brick it is connected to.
I'm sorry, I totally led you astray here. I always thought that a "bent normals" render was the term for rendering a scene's normals instead of the usual render types. Turns out it's part of a technique for calculating occlusion: http://renderman.pixar.com/view/image-based-lighting
OK, I took another shot at this, but I'm missing something in how the Normal Map brick works. It seems to lack a Z channel, so I looked up the equation for making one from X/Y [ x=r, y=g, z=1-sqrt(r*r+g*g) ], which works - BUT: the X and Y normal channels seem to lack any information on the negative side. I baked a normal map of this pot prop (top) to compare against the output of this shader.
Late late late edit: went back to it after realizing I forgot some math, and it works now.
Sweet. Good job throttlekitty! Thanks a lot for your help. It's highly appreciated.
I tried to reproduce your result. I get what you have in your first image, line 2, where the negative information seems to be lacking. What was the problem causing that? My first thought was that the normal map had a min/max from 0 to 1, instead of -1 to 1. I tried to fix that with an x*2 - 1 operation, but then the result just became all weird.
By the way, I see from your screenshot that you've done this in your shader, but the z channel should be z = sqrt(1 - (x² + y²)). Just thought I'd point that out, in case someone wonders. I don't know why DAZ seems to miss the z channel, but this article I stumbled across while trying to find that formula mentions that, since the z channel can be calculated, you could use that "free" grey channel for something else, e.g. specular/height/emissive/variation etc.: http://oliverm-h.blogspot.no/2013/05/normal-maps-replacing-blue-channel.html
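To make the difference between the two formulas concrete, here is a small Python sketch of the z-channel reconstruction (function name is mine; x and y are the decoded vector components in [-1, 1]):

```python
import math

def reconstruct_z(x, y):
    """Rebuild the blue channel of a normal map from red and green.

    For a unit normal, x*x + y*y + z*z == 1, so z = sqrt(1 - (x*x + y*y)).
    The max() guards against tiny negative values from rounding.
    """
    return math.sqrt(max(0.0, 1.0 - (x * x + y * y)))

# The two formulas in the thread only agree at the extremes. For a
# 3-4-5-style normal (x = 0.6, y = 0):
x, y = 0.6, 0.0
print(round(math.sqrt(1 - (x * x + y * y)), 3))  # correct formula -> 0.8
print(round(1 - math.sqrt(x * x + y * y), 3))    # earlier attempt -> 0.4
```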
I don't know, it doesn't really make sense to me either :P
I got a few other questions too now:
1. Do you use some sort of self illumination on the bowl to make it unshaded? I couldn't figure out how to apply self illumination in DAZ.
2. Your shader screenshot doesn't show brick (10). Did you delete that, or is it off the screen somewhere?
3. Is it possible to switch off the anti-aliasing in the default shader so that a better/more accurate normal map is generated?
I'm not quite sure why you are working from a Normal Map brick - surely you want the actual normal vector, which you can get from a variable brick.
Turns out the Z channel is there, it's just "weaker" by a factor of two, so there's actually no need for calculating it this way.
1. Using the default material, be sure that ambient/gloss/spec aren't contributing!
2. I got annoyed with the work pane resizing on me, so brick (10) was somewhere offscreen.
3. I don't think so. I assume you're concerned about the borders of your renders? If it was for a character, I'd try to render opacity as a pass and use that as a mask to trim all the other passes. For the normals pass, include a flat plane (aimed at the camera) that also has the shader, so there's no pixel blending against non-normals colors.
http://www.daz3d.com/forums/viewreply/387824/
@Richard, I did discover that block later, and have been using that instead. The output seems identical regardless, but I didn't examine that scientifically. I think I can get double use from the Normal Map brick: getting surface normals and a user-input normal map (if present) at the same time. I'll work this up into a more feature-ful product, there's a kink or two to work out first.
After some PMs with throttlekitty, we figured this out. This isn't quite the same shader as the last one you sent me though throttlekitty.
Just to clarify, this is done in Shader Mixer in DAZ3D.
I'll try to describe this a bit:
1. First, use a Variable root brick to get the normalized normal vector (Nn on the brick). This gives you a unit normal vector for the currently evaluated point, so each of its x, y and z components will be in the range -1 to 1.
2. Split this vector into its x, y and z components.
3. Since each component is in the range -1 to 1, we need to adjust these: a normal map expresses -1 to 1 (XYZ vector) as 0 to 255 (RGB color). All the binary operations do is add 1 and divide by 2, which, if you think about it, gives us components from 0 to 1 instead of -1 to 1.
4. You may also have noticed that the z component has an extra binary operation. The z axis is facing the wrong way and needs to be flipped around, which is simply done by multiplying by -1.
5. Now we just need to gather the x, y and z components (which are now in the range 0 to 1) back up and put them into the diffuse color. The color is automatically scaled from [0, 1] to [0, 255] when transferred to the diffuse color.
6. Also, make sure that glossiness and specular strength are set to 0%. Make sure the ambient color is white and has a strength of 100%.
7. Remember to set the background color to 127, 127, 255 (HEX #7f7fff) before you render.
Enjoy normal maps for your sprites (or whatever).
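The per-pixel math in steps 1-5 above can be sketched in a few lines of Python. The camera-space sign convention (a camera-facing normal being (0, 0, -1)) and the use of truncation rather than rounding are my assumptions, chosen so a straight-on pixel lands on the 127, 127, 255 background color from step 7:

```python
def shade_normal_pixel(nx, ny, nz):
    """Per-pixel version of steps 1-5, given Nn = (nx, ny, nz).

    Flip z (step 4), remap each component from -1..1 to 0..1 (step 3),
    then scale to 0..255 as the diffuse color (step 5).
    """
    nz = nz * -1.0                        # step 4: flip the z axis
    remap = lambda c: (c + 1.0) / 2.0     # step 3: add 1, divide by 2
    return tuple(int(remap(c) * 255) for c in (nx, ny, nz))

# With the camera looking down +z (the usual RenderMan convention), a
# surface facing the camera has Nn = (0, 0, -1), which encodes as the
# flat-normal background color from step 7:
print(shade_normal_pixel(0.0, 0.0, -1.0))  # (127, 127, 255)
```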
so you apply this shader to all surfaces in the scene or incorporate this as a camera?
You apply it to all the surfaces, yes.
Also, make sure you switch off any lights you have in the scene.
Thanks!
Awesome, thanks for the breakdown, inge! I tend to get frazzled in mathematics, your explanation makes a lot more sense than what I was trying to do with the vector>rgb conversion.
I had also wondered if one could apply this to a camera, it would be easier to kitbash normal maps from props that way, if one were so inclined.