Yours,
Morten
mlyr...@hotmail.com spake the secret code
<1172344623.4...@j27g2000cwj.googlegroups.com> thusly:
>luminosity and cube-map reflection - but I can't even seem to find a
>good example on multiple-light Phong shading.
It's just the sum of the Phong contribution for each light.
Phong shading is still a "local illumination model" unlike raytracing
or radiosity. You compute the contribution of each light source and
add them up.
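A minimal sketch of that in HLSL (names such as NUM_LIGHTS, LightPos, LightColor, Ka/Kd/Ks, N, V and worldPos are placeholders, not taken from any particular effect file):

// Sketch only: accumulate the Phong term of each light on top of the ambient base.
float3 color = Ka;                                        // ambient
for (int i = 0; i < NUM_LIGHTS; i++)
{
    float3 L = normalize(LightPos[i] - worldPos);         // direction to light i
    float3 R = normalize(2.0f * dot(N, L) * N - L);       // reflection of L about the normal
    color += LightColor[i] *
             (Kd * saturate(dot(N, L)) +                  // diffuse
              Ks * pow(saturate(dot(R, V)), Shininess));  // specular
}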
--
"The Direct3D Graphics Pipeline" -- DirectX 9 draft available for download
<http://www.xmission.com/~legalize/book/download/index.html>
Legalize Adulthood! <http://blogs.xmission.com/legalize/>
>>luminosity and cube-map reflection - but I can't even seem to find a
>>good example on multiple-light Phong shading.
> It's just the sum of the Phong contribution for each light.
> Phong shading is still a "local illumination model" unlike raytracing
> or radiosity. You compute the contribution of each light source and
> add them up.
It could be noted that this only applies to physical intensity (power, in watts).
Many lighting formulas are related to perceptual intensity instead.
For example, on my computer/monitor, the relationship between the
"greyscale" (that number in the range 0..1 or 0..255 for bytes)
and physical power is roughly:
power = greyscale^(1.6..2.6)
For example, full-white lines covering 50% of the area on a black background
look about as bright as a 3/4 greyscale covering the full area.
And 3/4-white lines covering 50% of the area over full black look about as
bright as a 1/2 greyscale covering the full area.
Note that on CRTs it may make a difference whether you use vertical or
horizontal lines, if the lines are just a single pixel wide (low-pass filtering, ...).
So if the lighting formula is designed to look good on screen,
and we assume power = greyscale^2, and we simply add two greyscales
g1 and g2 that each look correct on screen, the resulting physical power is
(g1+g2)^2, but it should be power(g1) + power(g2) = g1^2 + g2^2.
I personally prefer to compute everything in terms of physical power.
But this has the disadvantage of requiring a final presentation pass
to adjust the power values to the display's luminosity function,
and it also reduces the numeric precision.
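To make that concrete (a sketch, assuming a display gamma of exactly 2.0, with g1 and g2 being greyscales that each look correct on their own):

// Add in physical-power space, then convert back for presentation.
float p1 = g1 * g1;        // perceptual greyscale -> physical power
float p2 = g2 * g2;
float p  = p1 + p2;        // physical powers add linearly
float g  = sqrt(p);        // power -> greyscale for the display
// Simply computing g1 + g2 instead would be displayed as a power of
// (g1 + g2)^2, brighter than the physically correct g1^2 + g2^2.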
Regards,
Jan Bruns
Thank you very much for your response.
Do you have any other advice for the actual implementation? The
examples I have found seem to calculate some values (light direction
or something) in the vertex shader, and then do the actual Phong
shading (obviously) in the pixel shader. Do I need to calculate this
direction for each light and pass it to the pixel shader? I am so
inexperienced with shaders that I am afraid I might create a horribly
inefficient shader. Could you or someone out there describe the "best"
way to implement this?
Yours,
Morten
"Jan Bruns" <testzugan...@arcor.de> spake the secret code
<45e0b75e$0$23138$9b4e...@newsspool1.arcor-online.net> thusly:
>For example, on my computer/monitor, the relationship between the
>"greyscale" (that number in the range 0..1 or 0..255 for bytes)
>and physical power is roughly:
>
>power = greyscale^(1.6..2.6)
You're talking about the non-linear response of CRTs here, which is
what gamma correction is for.
Lighting is still computed in a linear space.
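If the correction is applied in the shader rather than through the hardware gamma ramp, the final step might look roughly like this (a sketch; the function name and the 2.2 exponent are only typical assumptions, displays vary):

// Encode a linear-space lighting result for a roughly gamma-2.2 display.
float4 PresentGamma(float4 linearColor)
{
    return float4(pow(linearColor.rgb, 1.0 / 2.2), linearColor.a);
}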
mlyr...@hotmail.com spake the secret code
<1172355838....@a75g2000cwd.googlegroups.com> thusly:
>Do you have any other advice for the actual implementation? The
>examples I have found seem to calculate some values (light direction
>or something) in the vertex shader, and then do the actual Phong
>shading (obviously) in the pixel shader.
In order to do true Phong shading, the light must be computed
per-pixel and not per-vertex. That's why Phong shading is expensive.
Gouraud shading only computes lighting at the vertices and
interpolates it across primitives, which is why it's faster and has
been around in hardware much longer than Phong shading.
>Do I need to calculate this
>direction for each light and pass it to the pixel shader?
Yep.
>I am so
>inexperienced with shaders that I am afraid I might create a horribly
>inefficient shader. Could you or someone out there describe the "best"
>way to implement this?
What you've described sounds like you are on the right track.
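One way to lay that out (a sketch with made-up names, assuming two lights): the vertex shader writes out one unnormalized light direction per light plus the view direction, and the pixel shader normalizes them and sums the Phong terms.

struct VS_OUT
{
    float4 ClipPos   : POSITION0;
    float3 Normal    : TEXCOORD0;
    float3 LightDir0 : TEXCOORD1;   // unnormalized direction to light 0
    float3 LightDir1 : TEXCOORD2;   // unnormalized direction to light 1
    float3 ViewDir   : TEXCOORD3;
};
// Vertex shader: LightDir0 = LightPosition1 - worldPos.xyz, and so on.
// Pixel shader: normalize Normal, ViewDir and each LightDir, then add the
// diffuse and specular term of each light onto the ambient base color.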
>>For example, on my computer/monitor, the relationship between the
>>"greyscale" (that number in the range 0..1 or 0..255 for bytes)
>>and physical power is roughly:
>>power = greyscale^(1.6..2.6)
> You're talking about the non-linear response of CRTs here,
Yes, that example was related to the specifics of displays.
> which is what gamma correction is for.
Yes, that's what a gamma ramp can be used for, if there's hardware support.
But this has drawbacks:
a D3D gamma ramp can only store 256 entries per channel.
If you put a non-linear function into it, there will be loss of information.
To compensate for p = x^2, 63 of the 256 greyscales go unused, and
the lowest greyscale apart from 0 ends up at about 6%.
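(Rough check of those numbers, my arithmetic: with ramp(x) = x^(1/2), the smallest
nonzero input 1/255 already maps to (1/255)^(1/2) ~= 0.063, i.e. output code ~16,
so codes 1..15 can never appear; adding up all such gaps along the curve leaves
several dozen of the 256 output codes unreachable, consistent with the figures above.)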
> Lighting is still computed in a linear space.
That's what I recommended, but as I also described, there are good
reasons and ways to avoid that, and there are lighting formulas that do so.
I wouldn't have noted it if I believed that Morten's problem was
only not knowing how to add light power. That wasn't obvious from
his question.
Regards,
Jan Bruns
Ok, so I'll try again. This is what I've come up with thanks to your
advice. However, I can't seem to make the multiple-light code work.
The lights seem to cancel each other out. What's wrong? I use FX
Composer, and if I move a sphere between the two lights - they're
gone. Can anyone please help? I'm basically guessing here.
float4x4 WorldViewProjectionMatrix : WorldViewProjection;
float4x4 WorldMatrix : World;
float4x4 WorldInverseTransposeMatrix : WorldInverseTranspose;
float4x4 ViewInverseMatrix : ViewInverse;
float3 LightPosition1: Position
<
    string UIName = "Light Position 1";
    string Object = "PointLight";
    string Space = "World";
>;
float3 LightPosition2: Position
<
    string UIName = "Light Position 2";
    string Object = "PointLight";
    string Space = "World";
>;
int NumLights = 2;
float4 AmbientColor: Ambient
<
    string UIName = "Ambient Color";
    string UIWidget = "Color";
> = float4( 0.37, 0.37, 0.37, 1.00 );
float4 DiffuseColor: Diffuse
<
    string UIName = "Diffuse Color";
    string UIWidget = "Color";
> = float4( 0.89, 0.89, 0.89, 1.00 );
float4 SpecularColor: Specular
<
    string UIName = "Specular Color";
    string UIWidget = "Color";
> = float4( 0.49, 0.49, 0.49, 1.00 );
float Glossiness: SpecularPower
<
    string UIName = "Glossiness";
    string UIWidget = "Numeric";
    float UIMin = 1.00;
    float UIMax = 100.00;
> = float( 25.00 );
texture BaseTexture
<
    string ResourceName = "C:\\Programfiler\\ATI Research Inc\\RenderMonkey 1.62\\Examples\\Media\\Textures\\Fieldstone.tga";
>;
sampler2D BaseMap = sampler_state
{
    Texture = (BaseTexture);
    ADDRESSU = WRAP;
    ADDRESSV = WRAP;
    MINFILTER = LINEAR;
    MAGFILTER = LINEAR;
    MIPFILTER = LINEAR;
};
struct VertexInput
{
    float4 Position : POSITION0;
    float2 TextureCoord : TEXCOORD0;
    float3 Normal : NORMAL0;
};
struct VertexOutput
{
    float4 Position: POSITION0;
    float3 VertexPosition: TEXCOORD0;
    float2 TextureCoord: TEXCOORD1;
    float3 ViewDirection: TEXCOORD2;
    float3 Normal: TEXCOORD3;
};
struct PixelInput
{
    float3 VertexPosition: TEXCOORD0;
    float2 TextureCoord: TEXCOORD1;
    float3 ViewDirection: TEXCOORD2;
    float3 Normal: TEXCOORD3;
};
VertexOutput VertexShaderGeneric(VertexInput input)
{
    VertexOutput output;
    output.VertexPosition = mul(input.Position, WorldMatrix);
    output.Position = mul(float4(input.Position.xyz, 1.0), WorldViewProjectionMatrix);
    output.TextureCoord = input.TextureCoord;
    output.ViewDirection = normalize(ViewInverseMatrix[3].xyz - output.VertexPosition);
    output.Normal = mul(input.Normal, WorldInverseTransposeMatrix);
    return(output);
}
float4 PixelShaderBaseTextureOnly(PixelInput input): COLOR0
{
    float3 normalizedNormal = normalize(input.Normal);
    //float4 baseColor = tex2D(BaseMap, input.TextureCoord);
    // Simulate texture for simplicity
    float4 baseColor = {0.5, 0.5, 0.5, 1};
    float4 lighting = (AmbientColor * baseColor);
    // for (int i = 0; i < NumLights; i++)
    {
        float3 lightDirection = normalize(LightPosition1 - input.VertexPosition);
        float lightDotProduct = dot(normalizedNormal, lightDirection);
        float3 reflectionVector = normalize(((2.0f * normalizedNormal) * (lightDotProduct)) - lightDirection);
        float reflectionDotProduct = max(0.0f, dot(reflectionVector, input.ViewDirection));
        lighting += ((DiffuseColor * lightDotProduct * baseColor) +
                     (SpecularColor * pow(reflectionDotProduct, Glossiness)));
    }
    {
        float3 lightDirection = normalize(LightPosition2 - input.VertexPosition);
        float lightDotProduct = dot(normalizedNormal, lightDirection);
        float3 reflectionVector = normalize(((2.0f * normalizedNormal) * (lightDotProduct)) - lightDirection);
        float reflectionDotProduct = max(0.0f, dot(reflectionVector, input.ViewDirection));
        lighting += ((DiffuseColor * lightDotProduct * baseColor) +
                     (SpecularColor * pow(reflectionDotProduct, Glossiness)));
    }
    return(lighting);
}
technique BaseTextureOnly
{
    pass P0
    {
        VertexShader = compile vs_2_0 VertexShaderGeneric();
        PixelShader = compile ps_2_0 PixelShaderBaseTextureOnly();
    }
}
Yours,
Morten
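One possible reading of the cancellation symptom (a guess, not a verified fix): the diffuse term above uses the raw dot product, so for a point facing away from one light the negative N.L subtracts from the other light's contribution, which is exactly the situation on a sphere sitting between two lights. A sketch of the two per-light blocks folded back into the commented-out loop, reusing the names from the shader above, with the dot product clamped (LightPositions is a made-up local array standing in for LightPosition1/2):

// Sketch only: clamp N.L so a light behind the surface contributes nothing
// instead of subtracting from the other light's contribution.
float3 LightPositions[2] = { LightPosition1, LightPosition2 };
for (int i = 0; i < 2; i++)
{
    float3 lightDirection = normalize(LightPositions[i] - input.VertexPosition);
    float lightDotProduct = saturate(dot(normalizedNormal, lightDirection));
    float3 reflectionVector = normalize(2.0f * normalizedNormal * lightDotProduct - lightDirection);
    float reflectionDotProduct = max(0.0f, dot(reflectionVector, input.ViewDirection));
    lighting += DiffuseColor * lightDotProduct * baseColor
              + SpecularColor * pow(reflectionDotProduct, Glossiness);
}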