Sunday, 14 June 2009

Light Pre Pass Rendering

I've been looking at light pre-pass rendering as described here and have started playing around with it. The basic concept is:
  • On your z pre-pass, render the mesh normals and depth into a render target.
  • Then, using this render target, you obtain the normal of a surface and reconstruct its world position ( pos = mul( float4( In.Tex.xy, depth, 1.0 ), matInvViewProj ) ) and can render the lights into the back buffer (see the sketch below).
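Spelled out a bit more, the reconstruction boils down to something like this minimal sketch (D3D9-style HLSL; the function and variable names other than matInvViewProj are just placeholders, and it assumes the depth stored is post-projection z/w). The remap from texture coordinates to clip space and the divide by w are the details the one-liner above glosses over:

// Sketch: reconstruct world position from the depth stored in the pre-pass buffer.
// matInvViewProj is the inverse of the combined view-projection matrix.
float4x4 matInvViewProj;

float3 ReconstructWorldPos( float2 texCoord, float depth )
{
    // Texture coords (0..1) to clip space (-1..1), with y flipped for D3D.
    float4 clipPos = float4( texCoord.x * 2.0f - 1.0f,
                             ( 1.0f - texCoord.y ) * 2.0f - 1.0f,
                             depth,
                             1.0f );

    float4 worldPos = mul( clipPos, matInvViewProj );
    return worldPos.xyz / worldPos.w;   // undo the perspective divide
}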

This is as far as I have got. After this, the idea is to render the scene again, applying the objects' material data (albedo/texture colour, reflections, etc.) to the back buffer, which already has the light data in it. The advantage over deferred rendering is that you don't need a huge G-Buffer, which is good on platforms like the Xbox 360 where there is a fixed frame buffer size.
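I haven't written that second pass yet, but roughly it should amount to sampling the light buffer at the pixel's screen position and modulating it by the material, something like this sketch (sampler and variable names are placeholders, not actual code):

// Sketch of the second geometry pass: modulate the accumulated light buffer
// by the object's material.
sampler2D lightBuffer;
sampler2D albedoTex;

float4 ps_material( float2 uv : TEXCOORD0, float4 screenPos : TEXCOORD1 ) : COLOR0
{
    // Project this pixel into the light buffer (screenPos is the clip-space
    // position passed down from the vertex shader).
    float2 lightUV = screenPos.xy / screenPos.w * float2( 0.5f, -0.5f ) + 0.5f;

    float4 light  = tex2D( lightBuffer, lightUV );
    float4 albedo = tex2D( albedoTex, uv );

    return albedo * light;   // specular/reflection terms would be added here too
}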

The back buffer containing a single directional light:

The pre-pass buffer RGB channels (normals):

The pre-pass buffer alpha channel (depth):

Wednesday, 18 February 2009

Android

I got myself a G1 a couple of months back (it's a great phone, btw) and lately I've been experimenting with the Android SDK.
So far I've managed to make a few apps (most of them direct tutorials from the Android docs, which are also great). I've also made a simple 2D space shooting game; the SDK is so easy to pick up that it only took me a day to make the game! I'll post some pics and whatnot when I get time. Once it's finished I might even turn it into a 3D game; I've already taken a look at the OpenGL ES stuff and started making a simple engine. If it's good enough I'll put it on the Android Market too.

Sunday, 8 February 2009

Metaballs

It's been a while since I've posted; I've been busy with work, etc. Anyway, after playing GoW2 and the Killzone 2 demo (which has also got me thinking about developing another deferred shading engine at work; I don't think it would ever get used, but it would be a cool feature to add to our engine), I was wondering how the blood effect was created. I did some digging around and found these things called metaballs (or isosurfaces). There's a great introduction article on GameDev, and from it I was able to implement a simple metaball system in RenderMonkey.
Using the equations in the article, you basically iterate through each metaball in the pixel shader and return the sum of the effect each metaball has on the pixel. It's pretty cool stuff and not too hard to implement:
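For what it's worth, the shader works out to something along these lines (a sketch with made-up positions, radii and threshold, not the article's or my exact values):

// Sketch: 2D metaballs evaluated per pixel. Each ball contributes r^2 / d^2,
// and pixels where the summed field passes a threshold are "inside" the surface.
float4 ps_main( float2 uv : TEXCOORD0 ) : COLOR0
{
    // Hard-coded metaballs for illustration: xy = centre, z = radius
    float3 balls[3] = { float3( 0.3f, 0.5f, 0.10f ),
                        float3( 0.6f, 0.4f, 0.15f ),
                        float3( 0.5f, 0.7f, 0.08f ) };

    // Sum the contribution of each metaball
    float field = 0.0f;
    for ( int i = 0; i < 3; ++i )
    {
        float2 d = uv - balls[i].xy;
        field += ( balls[i].z * balls[i].z ) / ( dot( d, d ) + 1e-5f );
    }

    // Threshold the field to get the isosurface
    float threshold = 1.0f;
    return ( field >= threshold ) ? float4( 1, 0, 0, 1 ) : float4( 0, 0, 0, 1 );
}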

Thursday, 24 April 2008

Deferred Shading Engine

My deferred shading engine is almost complete. I've been reading the two articles from GPU Gems 2 & 3 on the subject and started work on it earlier this year. It's created with C++ and DirectX 9. Mesh material data is stored in the G-Buffer (G for geometry). I chose to use four 64-bit render targets to construct the buffer. The following data is stored (a sketch of one possible packing follows the list):
  • World space position.
  • World space normals.
  • Diffuse colour.
  • Specular colour.
  • Specular power.
  • Specular intensity.
  • Material ID.
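One plausible way to pack all of that into four RGBA16F targets looks something like the sketch below; the channel assignments here are illustrative rather than the engine's actual layout, and the globals are placeholders:

// Sketch: geometry pass writing the G-Buffer to four 64-bit (RGBA16F) MRTs.
sampler2D diffuseMap;
float     materialID;
float     specPower;
float     specIntensity;
float3    specColour;

struct PS_GBUFFER_OUT
{
    float4 PosMatID  : COLOR0; // xyz = world position,  w = material ID
    float4 NormPower : COLOR1; // xyz = world normal,    w = specular power
    float4 Diffuse   : COLOR2; // rgb = diffuse colour
    float4 Specular  : COLOR3; // rgb = specular colour, w = specular intensity
};

PS_GBUFFER_OUT ps_geometry( float3 worldPos  : TEXCOORD0,
                            float3 worldNorm : TEXCOORD1,
                            float2 uv        : TEXCOORD2 )
{
    PS_GBUFFER_OUT o;
    o.PosMatID  = float4( worldPos, materialID );
    o.NormPower = float4( normalize( worldNorm ), specPower );
    o.Diffuse   = float4( tex2D( diffuseMap, uv ).rgb, 1.0f );
    o.Specular  = float4( specColour, specIntensity );
    return o;
}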
Below are the four textures that make up the G-Buffer:

World space positions
World space normals

Diffuse colour

Specular colour

There is also support for normal mapping and parallax occlusion mapping with offset limiting. When normal mapping is applied, the render target texture that stores the normals looks dramatically different.
World space normals when normal mapping is enabled
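Getting a normal-mapped normal into the G-Buffer usually amounts to something like this sketch (the TBN construction and names here are placeholders, not my engine's exact code):

// Sketch: transform a tangent-space normal map sample into world space
// before writing it into the G-Buffer's normal target.
sampler2D normalMap;

float3 WorldSpaceNormal( float2 uv, float3 worldTangent, float3 worldBinormal, float3 worldNormal )
{
    // Unpack from [0,1] texture range to [-1,1]
    float3 tangentNormal = tex2D( normalMap, uv ).xyz * 2.0f - 1.0f;

    // Rotate from tangent space into world space using the TBN basis
    float3x3 TBN = float3x3( worldTangent, worldBinormal, worldNormal );
    return normalize( mul( tangentNormal, TBN ) );
}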

The material ID will eventually be used as the z coordinate in a volume texture which stores lighting results, meaning the engine will be able to apply different lighting models without the use of branching.
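In other words, the light pass would end up doing a 3D lookup along these lines; the x/y parameterisation (N.L, N.H) below is only a guess at how the table might be laid out:

// Sketch: material ID as the z coordinate of a volume texture storing
// pre-computed lighting responses, so no branching per lighting model.
sampler3D lightingLUT;

float3 EvaluateLighting( float NdotL, float NdotH, float materialID, float numMaterials )
{
    float3 lutCoord = float3( NdotL, NdotH, materialID / numMaterials );
    return tex3D( lightingLUT, lutCoord ).rgb;
}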

Lighting in deferred shading is handled in a completely different way from a traditional rendering engine. A light volume is projected onto the screen: for a directional light this is a full-screen quad, for a point light a sphere, and for a spot light a cone. This makes the directional light the most expensive light type to draw, as it affects every pixel in the G-Buffer. Spot and point lights are very cheap to draw as the light volume culls out the vast majority of the G-Buffer. It is possible to render 50-100 point lights in a deferred shading engine! Below is a view of how the lights and light volumes are rendered, captured with the help of NVIDIA's PerfHUD.

A directional light drawn with a full screen quad.


A point light drawn with a sphere light volume.

A spotlight drawn with a cone light volume.
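For a rough idea of what one of these light-volume passes looks like, here is a sketch of a point light pixel shader, run over the pixels covered by the projected sphere and blended additively into the light accumulation target (the sampler names and the simple linear attenuation are placeholders):

// Sketch: point light pass reading the G-Buffer per covered pixel.
sampler2D gbufPosition;   // world position (xyz)
sampler2D gbufNormal;     // world normal (xyz)
sampler2D gbufDiffuse;    // diffuse colour

float3 lightPos;
float3 lightColour;
float  lightRadius;

float4 ps_pointLight( float2 screenUV : TEXCOORD0 ) : COLOR0
{
    float3 pos    = tex2D( gbufPosition, screenUV ).xyz;
    float3 normal = normalize( tex2D( gbufNormal, screenUV ).xyz );
    float3 albedo = tex2D( gbufDiffuse, screenUV ).rgb;

    float3 toLight = lightPos - pos;
    float  dist    = length( toLight );
    float3 L       = toLight / dist;

    // Simple linear falloff towards the edge of the light volume
    float atten = saturate( 1.0f - dist / lightRadius );
    float NdotL = saturate( dot( normal, L ) );

    return float4( albedo * lightColour * NdotL * atten, 1.0f );
}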

The scene is also rendered with HDR lighting. This is simple to achieve as the data in the G-Buffer is already stored in 64-bit render targets, so after lighting is applied one simply creates a Gaussian-blurred version of the scene, combines it with the original, and then applies some tone mapping.
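The final combine is essentially a one-liner per pixel; here's a sketch with a basic Reinhard-style operator standing in for whatever tone mapping is actually used, and made-up parameter names:

// Sketch: combine the lit HDR scene with its Gaussian-blurred version (bloom),
// then tone map down to the displayable range.
sampler2D hdrScene;
sampler2D blurredScene;
float     bloomStrength;   // e.g. 0.5
float     exposure;        // e.g. 1.0

float4 ps_toneMap( float2 uv : TEXCOORD0 ) : COLOR0
{
    float3 hdr   = tex2D( hdrScene, uv ).rgb;
    float3 bloom = tex2D( blurredScene, uv ).rgb;

    float3 colour = ( hdr + bloom * bloomStrength ) * exposure;

    // Reinhard tone mapping: maps [0, inf) into [0, 1)
    colour = colour / ( 1.0f + colour );

    return float4( colour, 1.0f );
}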

There is also a particle engine in there which is simulated in the GPU's pixel shader; it is implemented in a similar way to the GPU-based particle engine I created in DirectX 10.

Final scene.



Final scene.

Thursday, 27 March 2008

Almost Done

Got some proper colour and size variables in there now, so they can be changed! I've also changed the velocity texture to a 64-bit one (R16G16B16A16). I've tried the engine with a 1024 x 1024 texture, which equates to 1,048,576 particles (it runs, but not smoothly), and a 768 x 768 texture (589,824 particles), which runs fairly smoothly. It's not limited to power-of-two texture sizes; those are just the numbers I keep choosing out of habit. It's all coming together, here's a video:

Wednesday, 26 March 2008

Pixel Shader Particle Engine

I've finally got my pixel shader particle engine working. It's nowhere near finished, but it's getting there. The position and velocity data for each particle are stored in 128-bit (R32G32B32A32) floating point textures, though I may change the velocity one to 64-bit as 128-bit is overkill. To get some randomness into the particles, on initialisation I create a 128 x 128, 256 x 256, etc. pixel texture and fill it with random floating point values between -0.5 and 0.5. Particle velocity is then calculated by adding a set velocity value to a velocity variance multiplied by the value stored in the random texture:

final velocity = initial velocity + (velocity variance * random value)
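In shader terms the initialisation pass works out to something like this sketch (sampler and variable names are placeholders):

// Sketch: initialising the velocity texture. Each pixel corresponds to one
// particle; the random texture holds values in [-0.5, 0.5].
sampler2D randomTex;
float3    initialVelocity;
float3    velocityVariance;

float4 ps_initVelocity( float2 uv : TEXCOORD0 ) : COLOR0
{
    float3 rand     = tex2D( randomTex, uv ).xyz;
    float3 velocity = initialVelocity + velocityVariance * rand;
    return float4( velocity, 1.0f );
}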

As I am storing particle velocity and position in a texture, the particle engine is not stateless, meaning that I can implement (and have implemented) varying acceleration. At the minute there is just a bit of gravity and no other linear force in there, but eventually I should be able to achieve effects such as vortexes. I could also add some sort of collision detection if desired, entirely on the GPU.
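The per-frame simulation is where that state lives; a sketch of the integration with a simple gravity term (it assumes ping-ponging between read and write copies of the textures, and the names are made up):

// Sketch: per-frame simulation passes, rendered as full-screen quads into the
// "write" copies of the velocity and position textures.
sampler2D velocityTex;   // last frame's velocities
sampler2D positionTex;   // last frame's positions
float     deltaTime;
float3    gravity;       // e.g. float3( 0, -9.8, 0 )

// Pass 1: integrate acceleration into velocity
float4 ps_updateVelocity( float2 uv : TEXCOORD0 ) : COLOR0
{
    float3 velocity = tex2D( velocityTex, uv ).xyz;
    velocity += gravity * deltaTime;
    return float4( velocity, 1.0f );
}

// Pass 2: integrate velocity into position
float4 ps_updatePosition( float2 uv : TEXCOORD0 ) : COLOR0
{
    float3 position = tex2D( positionTex, uv ).xyz;
    float3 velocity = tex2D( velocityTex, uv ).xyz;
    position += velocity * deltaTime;
    return float4( position, 1.0f );
}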

I'm using vertex texture fetching to get the position data in the effect that presents the particles to the screen. This involves creating a list of verts (one for each particle) and then using each vert's texture coordinate to sample the position texture. I still need to get different colours and sizes in there, but that's just a matter of adding another two parameters to the vertex declaration and filling them in.
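The fetch itself needs vs_3_0 and tex2Dlod; roughly like this sketch (names and the fixed point size are placeholders):

// Sketch: vertex shader (vs_3_0) that fetches a particle's position from the
// position texture using the vert's texture coordinate.
sampler2D positionTex;
float4x4  matWorldViewProj;

struct VS_OUT
{
    float4 Pos  : POSITION;
    float  Size : PSIZE;       // point sprite size
};

VS_OUT vs_particle( float2 particleUV : TEXCOORD0 )
{
    VS_OUT o;

    // Vertex texture fetch: must use tex2Dlod with an explicit mip level
    float3 worldPos = tex2Dlod( positionTex, float4( particleUV, 0.0f, 0.0f ) ).xyz;

    o.Pos  = mul( float4( worldPos, 1.0f ), matWorldViewProj );
    o.Size = 4.0f;   // fixed for now; per-particle colour/size would come from extra vertex parameters
    return o;
}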

Below is a picture of it in action. I'm storing particle data in a 512 x 512 pixel texture, meaning there are a total of 262,144 particles on screen. Quite a few more than my CPU-based engine!


Tuesday, 11 March 2008

Solar Flux

Here is an early video of the game we are working on for our team project module. It is a space shooter, where the player can play from one of three camera angles. I've been working on the effects and shaders; at the minute we have the basic stuff in, like normal mapping and various specular lighting models: Phong (& Blinn), Ashikhmin, etc. We also have HDR lighting in there, motion blur, various colour and film grain filters, radial blur, bloom (for computers that can't handle the HDR), and I've just managed to get DoF in there too, but it's not demonstrated in this video. I'll hopefully soon have some variance shadow mapping in there as well.