I'm implementing reflection/environment mapping using a cubemap. It works for the most part, but when I move my camera around, the same area stays reflected on the object. I am calculating the eye vector and normal vector in view space, similar to the example in OpenGL Shading Language, 3rd Edition (p. 311, Section 10.4).

Why does the reflection stay as if it is from the same view?

In which space should I be doing my calculations?

How the render looks: (screenshot: environment mapping gone awry)

Snippets of the HLSL code:

Vertex shader:

normal = mul( (float3x3)mvMatrix, input.normal ); // normal into view space
normal = normalize(normal);
viewPos = mul( mvMatrix, float4(input.pos, 1) );  // position into view space

Pixel shader:

viewPos.w = 0;
V = normalize(viewPos);     // view-space eye vector (camera at origin)
normal.w = 0;
normal = normalize(normal);

float4 reflectVec = normalize(reflect(V, normal));          // view-space reflection
float4 reflectColor = tex.Sample(samLinear, reflectVec.xyz );
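For context on the space question: if the cubemap's faces are defined in world space, it expects a world-space lookup vector, while the code above samples with a view-space one. A minimal sketch of rotating the reflection vector back into world space before sampling, assuming a hypothetical invViewMatrix constant (the inverse of the view matrix, not present in the original code):

```hlsl
// Assumed to be provided alongside mvMatrix in a constant buffer;
// only the 3x3 rotation part is needed for a direction vector.
float4x4 invViewMatrix;

float3 V = normalize(viewPos.xyz);
float3 N = normalize(normal.xyz);
float3 reflectView  = reflect(V, N);                          // view space
float3 reflectWorld = mul((float3x3)invViewMatrix, reflectView); // world space
float4 reflectColor = tex.Sample(samLinear, reflectWorld);
```

This is only a sketch of one common arrangement; whether it applies depends on which space the cubemap was authored in.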
