Thursday, 24 April 2008

Deferred Shading Engine

My deferred shading engine is almost complete. I've been reading the two articles from GPU Gems 2 & 3 on the subject and I started work on it earlier this year. It's written in C++ and DirectX 9. Mesh material data is stored in the G-Buffer (G for geometry). I chose to use four 64-bit render targets to construct the buffer. The following data is stored:
  • World space position.
  • World space normals.
  • Diffuse colour.
  • Specular colour.
  • Specular power.
  • Specular intensity.
  • Material ID.
Below are the four textures that make up the G-Buffer:

World space positions
World space normals

Diffuse colour

Specular colour

There is also support for normal mapping and parallax occlusion mapping with offset limiting. When normal mapping is applied the render target texture that stores the normals looks dramatically different.
World space normals when normal mapping is enabled

The material ID will eventually be used as the z coordinate in a volume texture that stores lighting results, which means the engine will be able to apply different lighting models without the use of branching.

Lighting in deferred shading is handled in a completely different way than in a traditional rendering engine. A light volume is projected onto the screen: for a directional light this is a full-screen quad, for a point light a sphere, and for a spot light a cone. This makes the directional light the most expensive light type to draw, as it affects every pixel in the G-Buffer. Spot and point lights are very cheap to draw, as the light volume culls out the vast majority of the G-Buffer. It is possible to render 50-100 point lights in a deferred shading engine! Below is a view of how the lights and light volumes are rendered, captured with the help of NVIDIA's PerfHUD.

A directional light drawn with a full screen quad.


A point light drawn with a sphere light volume.

A spotlight drawn with a cone light volume.

The scene is also rendered with HDR lighting. This is simple to achieve as the data in the G-Buffer is already stored in 64-bit render targets, so after lighting is applied one simply creates a Gaussian-blurred version of the scene, combines it with the original, and then applies some tone mapping.

There is also a particle engine in there, simulated in the GPU's pixel shader; it is implemented in a similar way to the GPU-based particle engine I created in DirectX 10.

Final scene.



Final scene.

Thursday, 27 March 2008

Almost Done

Got some proper colour and size variables in there now so they can be changed! I've also changed the velocity texture to a 64-bit one (R16G16B16A16). I've tried the engine with a 1024 x 1024 texture, which equates to 1,048,576 particles; it runs, but not smoothly. A 768 x 768 texture (589,824 particles) runs fairly smoothly. It's not limited to power-of-two texture sizes; those are just the numbers I keep choosing out of habit. It's all coming together, here's a video:

Wednesday, 26 March 2008

Pixel Shader Particle Engine

I've finally got my pixel shader particle engine working. It's nowhere near finished, but it's getting there. The position and velocity data for each particle are stored in 128-bit (R32G32B32A32) floating point textures, though I may change the velocity one to 64-bit as 128-bit is overkill. To get some randomness into the particles, on initialisation I create a 128 x 128 (or 256 x 256, etc.) pixel texture and fill it with random floating point values between -0.5 and 0.5. Particle velocity is then calculated by adding a set velocity value to a velocity variance multiplied by the value stored in the random texture:

final velocity = initial velocity + (velocity variance * random value)

As I am storing particle velocity and position in a texture, the particle engine is not stateless, meaning that I can implement (and have implemented) varying acceleration. At the minute there is just a bit of gravity (or any other linear force) in there, but eventually I should be able to achieve such effects as vortices. I could also add some sort of collision detection if desired, entirely on the GPU.

I'm using vertex texture fetching to get the position data in the effect that presents the particles to the screen. This involves creating a list of verts (one for each particle), then using these verts as texture coordinates to sample the position texture. I still need to get different colours and sizes in there, but that's just a matter of adding another two parameters to the vertex declaration and filling them in.

Below is a picture of it in action. I'm storing particle data in a 512 x 512 pixel texture, meaning there is a total of 262,144 particles on screen. Quite a few more than my CPU-based engine!


Tuesday, 11 March 2008

Solar Flux

Here is an early video of the game we are working on for our team project module. It is a space shooter, where the player can play from one of three camera angles. I've been working on the effects and shaders; at this minute we have the basic stuff in, like normal mapping and various specular lighting models: Phong (& Blinn), Ashikhmin etc. We also have HDR lighting in there, motion blur, various colour and film grain filters, radial blur, bloom (for computers that can't handle the HDR), and I've just managed to get DoF in there too, but it's not demonstrated in this video. I'll hopefully soon have some variance shadow mapping in there too.

Monday, 3 March 2008

3D Modeling with Blender

Recently I've been teaching myself to use the 3D modelling application known as Blender. It's a free, open-source alternative to the likes of 3D Studio Max and Softimage XSI. I have never done any 3D modelling before, so this is my first attempt. Anyway, I've made a little scene, here it is:

Tuesday, 19 February 2008

Videos of work

During my time at Canalside Studios I developed a CPU-based particle engine in XNA; here is an old video of it in action:

The engine was used in three titles whilst I was working there; here is an early video of one of them, Hexothermic:


Another game I worked on at Canalside is G3:


Finally, here are two videos of CPU-based particle engines I have created in DirectX 10 for my final year dissertation. The first uses inheritance to create different emitter types and processes the particle rotation in a slightly different way (it has to convert from Euler angles to world space vectors), resulting in poorer performance; it can only render 8,000 particles on screen at 60 FPS:



And here is a CPU-based particle engine built using templates. No angles are needed; velocity is used directly as the direction in which the particles fire, and it is able to render 10,000-12,000 particles on screen at 60 FPS:

Dissertation

For my final year we have to produce a dissertation; it can be on anything that interests you. For mine I have chosen to develop two particle engines and discuss the advantages and disadvantages of both. Both engines will be developed in C++ using the DirectX 10 SDK. I have chosen DirectX 10 as previously I have only used DirectX 9, and I want to familiarise myself with DirectX 10 and explore some of the new features available.

My first engine, which is already running, though not finalised, is a CPU-based particle engine. All particle data is processed on the CPU, after which it is uploaded to the GPU as a point list. The vertex shader simply passes the particle data through to the geometry shader. The GS creates a quad (as point sprites are no longer available in DX10) based upon the size and position of the particle. Finally, the particle colour is passed to the pixel shader, where the quad is textured and presented on screen. I have used templates to build an emitter at runtime.

My second engine, which is still under development, is a GPU-based particle engine. Acceleration, velocity, position etc. are processed in the pixel shader, where all data is stored in a floating point texture. The data is then passed to (virtually) the same shader used to render the previous particle engine; the only difference is that the particle's position is not passed to the shader via a vertex buffer, it is passed via a vertex texture.

My first post!

Hi. I have created this blog to share my ideas, developments and work with anyone wishing to read. If you wish to leave any feedback or general comments please do so. I will eventually get round to adding some code examples and discussing what I am developing at the minute.

I am currently studying the final year of a Computer Games Programming course at the University of Huddersfield. I have experience with Microsoft's DirectX 9 and 10, and XNA. I know how to program in Java, C# and C++ (my favourite).

My favourite area of games development is rendering. I enjoy developing shaders, effects and strange rendering techniques. I feel it is a very visual aspect of programming; you can see your results almost instantly (providing it compiles).

My placement year was spent at Canalside Studios (www.canalsidestudios.com). During my time there I learnt a lot, and we managed to obtain a contract with Microsoft to publish one of our titles on Xbox Live Arcade!