
Accurately passing the player's relative position into AGAL
<p>I'm trying to develop a fragment shader that fades to 0 where the face normals are perpendicular to the direction of the player 'camera'. (This is for spherical planet atmospheres; I want them to fade out at their outer extent.)</p>

<p>I have the game set up so that the player is always at 'universe position' (0, 0, 0) and all objects have their transform.matrix3D translated around the player as it moves.</p>

<p>I should point out that I have multiple shaders working fine, some involving mixing textures, interpolating between two models and specular shading; THIS problem, though, has me beat.</p>

<p>I've been thinking that the fragment shader needs to know the direction to the player (so that it can take the dot product of the current face normal and the player direction). But it also needs to have the model's 'current' vertex position (i.e. the vertex output position the shader is currently drawing) added onto the inverse player camera direction; that way the direction to the camera from that model surface location will be correct.</p>

<p>Well, apparently not. (I could explain everything I've tried, but I can sense people ignoring this question already...) Can anybody tell me HOW I can correctly calculate the direction to the player, given that I also need to include (I THINK) the fact that the model's vertex positions are 'offset' from the model's central position? This has been doing my head in!</p>

<p>Thanks.</p>

<p><strong>EDIT</strong></p>

<p>Here's the relevant AS3 code followed by the AGAL:</p>

<pre><code>// invMatrix is a cloned and inverted copy of the Planet object's transform.matrix3D.
// .deltaTransformVector ONLY uses the matrix3D's rotation to rotate a Vector3D.
var tempVector:Vector3D = invMatrix.deltaTransformVector(transform.matrix3D.position);
tempVector.normalize();
context3D.setProgramConstantsFromVector(Context3DProgramType.VERTEX, 5,
    Vector.&lt;Number&gt;([-tempVector.x, -tempVector.y, -tempVector.z, 0]));
context3D.setProgramConstantsFromMatrix(Context3DProgramType.VERTEX, 7, rotMatrix, true);
// ... other constants set here ...
context3D.setVertexBufferAt(3, mesh.normalsBuffer, 0, Context3DVertexBufferFormat.FLOAT_3);
</code></pre>

<p>In the vertex shader:</p>

<pre><code>"m44 v3, va3, vc7 \n" + // the passed rotMatrix is used to rotate the face normals
"dp3 v4, vc5, va3 \n" + // put the dot product of the normalised, rotated camera position and the rotated normal in v4
</code></pre>

<p>Then, in the fragment shader:</p>

<pre><code>"mul ft0, ft0, v4 \n" + // multiply the sampled texture in ft0 by the passed dot product in v4
</code></pre>

<p>But this is all making for some weird rendering behaviour, where half of the atmosphere appears drawn, depending on your position relative to the planet. Baffling. Any help well appreciated.</p>
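<p>To make the per-vertex math I'm describing concrete, here's a small CPU-side sketch. (Python is used purely for illustration, and <code>fade_factor</code> and the sample vectors are placeholder names of mine, not anything from the code above.) The key idea is that with the camera fixed at the origin, the direction from a transformed vertex back to the camera is just the negated, normalised world-space vertex position, computed per vertex rather than once per model:</p>

```python
# Illustrative sketch only: per-vertex fade factor when the camera sits at
# the world origin, as in the setup described above.
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def fade_factor(world_vertex, world_normal):
    """world_vertex: vertex position after the model's full transform;
    world_normal: face normal after the model's rotation only.
    The camera is at (0, 0, 0), so the direction from the vertex to the
    camera is simply the negated, normalized vertex position."""
    to_camera = normalize(tuple(-c for c in world_vertex))
    return max(0.0, dot(to_camera, normalize(world_normal)))

# A normal facing the camera gives ~1; a normal perpendicular to the
# view direction gives ~0 (the atmosphere's outer rim).
print(fade_factor((0, 0, -10), (0, 0, 1)))  # -> 1.0 (facing camera)
print(fade_factor((0, 0, -10), (1, 0, 0)))  # -> 0.0 (perpendicular)
```

<p>Note that this computes a separate <code>to_camera</code> for every vertex; a single direction computed from the planet's centre position (as my constant <code>vc5</code> above does) would only be correct for vertices at the centre itself, which is what I suspect I'm getting wrong.</p>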
 
