Wednesday, June 30, 2010

Planet Water Shader Tests & OGE

First of all, I haven't posted in a while: I was sick, then I got my new dev machine and was busy setting it up (and playing games on it), plus other real-life stuff.

So, water for this small planet has been difficult for me; I've tried a couple of approaches and have yet to get something I like.

What I want:
  • reflections, refraction, transparency, specular, normal mapping
  • under water fog
  • fog color changes based on depth and distance
Here's what I've got so far (no reflection or refraction):

Video of planet water on YouTube

Reflection and Refraction

There is no reflection or refraction on the water because I don't know yet how I will manage that on a curved surface. I think what I will try first is only doing water reflection and refraction when you're close enough to the water that it appears more flat, so I can use conventional techniques.

Underwater Fog

To support underwater I applied a fog to the pixel shader for the terrain based on the terrain depth under water. I have an outward-facing sphere with a shader for transparency, normal mapping and specular, and an inward-facing sphere with a shader for when you are underwater looking up at the surface from below.

Unfortunately, the fog shader on the terrain does not line up perfectly with the underwater inward-facing sphere, so you see cracks at the water edge where there is no fog. I think I can only fix this issue with post-processing.

New Game Dev Machine

I got a new developer machine with an i7 processor and a Radeon 5770. I tried to choose a middle-of-the-road graphics card. You can see the new game dev machine specs and discussion on


Lastly, I've spent some time working with OGE, which stands for "Open Game Engine". I've created a small RakNet plugin for it and have been trying to get a little client/server action going. This is a potential game engine that I might adopt for MyFirstPlanet, at least in the development stages, to do all the audio, graphics, user interface, networking, gameplay, physics, and scripting that I would otherwise have to build myself. It supports OGRE as a graphics engine plugin, so I shouldn't have to rewrite any of my graphics code.

I'll try to post again when I have a video to show of the network tests.  It will mostly just be a couple clients on a server flying ships around in space over a planet.

Wednesday, May 5, 2010

Planet Terrain Shader With Atmosphere

Before you read this post, if you have not watched Ysaneya's latest tech demo video go watch it now! It is amazing!

In the previous posts I described and published my latest atmosphere shader and I got some useful feedback and requests for information, which I thought I should include here:

- The atmosphere shader is applied to an inverted sphere that is drawn before the planet. This means that if you have a single atmosphere shell and a planet with no atmosphere shader added to it, then you will just see the atmosphere effect behind the planet. I describe how I've worked the atmosphere shader code into my terrain shader later in this post.

- In the atmosphere shader the camera and light should be in object space, as should the vertices.

- The atmosphere is still not fully opaque when on the sunny side. It is opaque (not see through) at the horizon and gets slightly transparent as it approaches the sun so some stars still show through which may not be desired. I plan on addressing this later when I add other planets and moons. I'm concerned that if I make the atmosphere too opaque then you won't see the moon or planet (or large space ships!) out there, and if I make the atmosphere too transparent then you'll see too many stars on the day side. I may just end up darkening the skybox as the user gets on the daylight side of the planet, but I haven't gotten there yet.

Now, about the planet terrain. I create an 8-way blend look-up table based on terrain slope and height. When the planet is first created I render to texture a medium-resolution texture (currently 512x512) that blends 8 textures based on this look-up table. When viewing the planet from far away I display this medium-resolution texture; then, when the camera approaches the surface, I blend between it and a shader that actually blends the 8 textures per frame, which is slower but looks better close up.

To add the atmosphere effect to my terrain, in the vertex shader I basically calculate the amount of atmosphere fog like this:
// atmosphere fog / haze attenuation
float3 camToPos = -;
float camToPosDist = length(camToPos);
float visibilityDistance = min(AtmosphereHeight,VisibilityDistance);
oAttenuation = saturate((camToPosDist - VisibilityDistance)/ (AtmosphereHeight + visibilityDistance));

position - The vertex position in object space
camPos - The camera position in object space
AtmosphereHeight - Atmosphere sphere radius minus terrain sphere radius.
VisibilityDistance - the distance at which fog starts: below it, fog = 0; beyond it, fog ramps up toward 1
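To make the ramp concrete, here is the same attenuation math restaged as standalone C++ (a sketch for testing only; saturate is just a clamp to 0..1, and the numbers in the note below are made-up examples, not values from the project):

```cpp
#include <algorithm>

// Clamp to 0..1, like HLSL's saturate().
static float saturate(float x) { return std::min(1.0f, std::max(0.0f, x)); }

// Same math as the vertex shader snippet above: fog is 0 until the
// camera-to-vertex distance exceeds VisibilityDistance, then ramps
// linearly toward 1.
float fogAttenuation(float camToPosDist,
                     float atmosphereHeight,
                     float visibilityDistance)
{
    float vd = std::min(atmosphereHeight, visibilityDistance);
    return saturate((camToPosDist - visibilityDistance) /
                    (atmosphereHeight + vd));
}
```

For example, with AtmosphereHeight = 176 and VisibilityDistance = 100, a vertex 100 units away gets no fog and the fog reaches 1 at 376 units.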

In my fragment shader I do this:
// gradient0 is a look up table like atmosphere gradient but two pixels high. 
// Top row is atmosphere gradient and bottom row is sun color gradient
float4 sunColor = tex2D(gradient0,float2(uv2,1));
float4 atmosphereColor = tex2D(gradient0,float2(uv2,0));

// how transparent is the atmosphere?
float transparency = min(AtmosphereTransparency,attenuation); 

// get diffuse amount from 8 way texture blend and apply lighting with normals
// or shadow map/light map

// get atmosphere contribution    
float3 atmosphereAmt = (atmosphereColor * sunColor * (transparency + (transparency * shadow)));

// add in atmosphere
outColor.rgb = (diffuse.rgb * sunColor * (1.0 - transparency)) + atmosphereAmt;

I use a constant called AtmosphereTransparency to make sure that the atmosphere contribution is no more than a certain amount. I use 0.3 because it looks OK, but on a really dense atmosphere you might want a higher amount.
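Restaged on the CPU for a single color channel (a sketch with placeholder values, not the real shader constants), the clamp behaves like this:

```cpp
#include <algorithm>

// One channel of the terrain combine: the atmosphere term is capped at
// atmosphereTransparency, so even a fully fogged pixel keeps
// (1 - cap) of its sunlit diffuse color.
float combineChannel(float diffuse, float sunColor, float atmosphereColor,
                     float attenuation, float shadow,
                     float atmosphereTransparency)
{
    float t = std::min(atmosphereTransparency, attenuation);
    float atmosphereAmt = atmosphereColor * sunColor * (t + t * shadow);
    return diffuse * sunColor * (1.0f - t) + atmosphereAmt;
}
```

With every input at 1.0 and the cap at 0.3, a fully fogged, fully lit pixel comes out at 0.7 + 0.6 = 1.3, so the atmosphere can push a pixel past 1.0 on the sunny side.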

Here's the full terrain shader you see in the videos when the camera is on the surface - this is the one I use in Ogre3D, not the FX Composer version.

void main_vp( float4 position : POSITION,
float2 uv       : TEXCOORD0,

out float4 oPosition : POSITION,
out float2 oUV1    : TEXCOORD0,
out float3 oLightDir : TEXCOORD1,
out float  oBlendAmt : TEXCOORD2,
out float  oUV2    : TEXCOORD3,
out float  oAttenuation: TEXCOORD4,

uniform float4x4 worldViewProjMatrix,
uniform float4 camPos,
uniform float4 lightPos,
uniform float AtmosphereHeight,
uniform float VisibilityDistance )
{
oPosition = mul(worldViewProjMatrix, position);
oUV1 = uv;

// directional light
oLightDir = normalize(;

// dot product of position and light in range 0..1
float posLength = length(;
float3 normal = / posLength;
oUV2 = (dot(oLightDir, normal) + 1.0) * 0.5;

// atmosphere fog / haze attenuation
float3 camToPos = -;
float camToPosDist = length(camToPos);
float visibilityDistance = min(AtmosphereHeight,VisibilityDistance);
oAttenuation = saturate((camToPosDist - VisibilityDistance)/ (AtmosphereHeight + visibilityDistance));

// start blending between low res texture and high at 512 units out
float blendStart = 512.0;
float blendEnd = 0.0;
float blendDistance = blendStart - blendEnd;

oBlendAmt = max(0.0, camToPosDist - blendEnd);
oBlendAmt = min(1.0, oBlendAmt / blendDistance);
}

And here is the fragment shader:
sampler diffTex0 : register(s0);
sampler diffTex1 : register(s1);
sampler diffTex2 : register(s2);
sampler diffTex3 : register(s3);
sampler diffTex4 : register(s4);
sampler diffTex5 : register(s5);
sampler diffTex6 : register(s6);
sampler diffTex7 : register(s7);
sampler blend0 : register(s8);
sampler blend1 : register(s9);
sampler diffuse0 : register(s10);
sampler gradient0 : register(s11);

float4 getBlendedSample8(in float4 weights0, in float4 weights1, in float2 diffUV)
{
// use w,x,y,z order because PNG uses pixel format A8R8G8B8
return tex2D(diffTex0, diffUV)  * weights0.w +
tex2D(diffTex1, diffUV)  * weights0.x +
tex2D(diffTex2, diffUV)  * weights0.y +
tex2D(diffTex3, diffUV)  * weights0.z +
tex2D(diffTex4, diffUV)  * weights1.w +
tex2D(diffTex5, diffUV)  * weights1.x +
tex2D(diffTex6, diffUV)  * weights1.y +
tex2D(diffTex7, diffUV)  * weights1.z;
}

void main_fp(
float2 uv : TEXCOORD0,
float3 lightDir : TEXCOORD1,
float  blendAmt : TEXCOORD2,
float  uv2 : TEXCOORD3, 
float  attenuation : TEXCOORD4,

uniform float AtmosphereTransparency,
uniform float tileFactor,

out float4 outColor : COLOR )
{
float4 sunColor = tex2D(gradient0,float2(uv2,1));
float4 atmosphereColor = tex2D(gradient0,float2(uv2,0));

// how transparent is the atmosphere?
float transparency = min(AtmosphereTransparency,attenuation); 

// get the texture color and lit value
float4 diffuse = tex2D(diffuse0, uv);
float shadow = diffuse.a;

// make ambient dark on side where light is
// and light on the side where it is dark
float3 ambient = sunColor * (1.0 - uv2);

float4 blended =  getBlendedSample8(tex2D(blend0, uv), tex2D(blend1, uv), uv * tileFactor);
diffuse = lerp(blended, diffuse, blendAmt);

// now shade based on the normal
diffuse.rgb = ((1.0 - shadow) * diffuse.rgb * ambient) + (diffuse.rgb * shadow);

// get atmosphere contribution    
float3 atmosphereAmt = (atmosphereColor * sunColor * (transparency + (transparency * shadow)));    

// add in atmosphere
outColor.rgb = (diffuse.rgb * sunColor * (1.0 - transparency)) + atmosphereAmt;
outColor.a = 1.0;
}

Wednesday, April 28, 2010

Atmosphere Shader Update And Tree/Grass Test

I've gone and made the atmosphere shader more complex, sorry. The basic idea is still pretty straightforward, but I added Mie scattering to the pixel shader and added extra code to fix issues with the atmosphere being too transparent when near the surface of the planet (stars were showing through).

Also, I worked on pre-calculated/static shadow maps for the terrain and used those with the grass and trees. JohnJ created Paged Geometry, which is built for 2D heightmaps, but I semi-adapted it so that each face of the planet cube has its own paged geometry instance. Here's a video of that:

Note: the atmosphere shader in this video is the old atmosphere shader so the atmosphere is still too transparent on the sunny side of the planet

It has issues: the billboard trees don't display right when viewed top-down, and they don't fade out based on height like they should. There is also an abrupt transition from one paged geometry instance to the next when you cross over from one cube planet face to another. I may just have to roll my own simplified version.

And now, here is the NVIDIA FX Composer shader for the current atmosphere shader I'm using:

// outer atmosphere radius
float AtmosphereRadius <
string UIName = "Atmosphere Radius";
string UIWidget = "Slider";
float UIMin = 0.0;
float UIMax = 10000.0;
float UIStep = 1.0;
> = {1200.0f};

// planet surface radius
float SurfaceRadius <
string UIName = "Surface Radius";
string UIWidget = "Slider";
float UIMin = 0.0;
float UIMax = 10000.0;
float UIStep = 1.0;
> = {1024.0f};

// this is the sun position/direction
float4 gLamp0DirPos : POSITION < // or direction, if W==0
string Object = "Light0";
string UIName = "Lamp 0 Position/Direction";
string Space = (LIGHT_COORDS);
> = {10.0f,10.0f,10.0f,1.0};

// this is the atmosphere 2d gradient
texture gTex <
string ResourceName = "AtmosphereGradient";
string ResourceType = "2D";
string UIName = "Gradient Texture";
>;
sampler2D gTexSampler = sampler_state {
Texture = <gTex>;
AddressU = Clamp;
AddressV = Clamp;
};

// this is for setting where the horizon should fall on the sphere
float StretchAmt <
string UIName = "Stretch Amount";
string UIWidget = "Slider";
float UIMin = 0.0;
float UIMax = 1.0;
float UIStep = 0.01;
> = {0.25f};

// this is for mie scattering
float Atmosphere_G <
string UIName = "Atmosphere G";
string UIWidget = "Slider";
float UIMin = -1.0;
float UIMax = -0.5;
float UIStep = 0.001;
> = {-0.95f};

float4x4 WorldViewProj : WorldViewProjection;
float4x4 ViewIXf : ViewInverse;
float4x4 WorldXf : World;

void main2VS(
float3 pos : POSITION,
uniform float4 lightPos,

out float4 oPosition: POSITION,
out float2 oUV: TEXCOORD0,
out float oAlpha: TEXCOORD1,
out float3 oCamToPos: TEXCOORD2,
out float3 oLightDir :TEXCOORD3 )
{
float4 Po = float4(,1);
float4 Pw = mul(Po,WorldXf);
float3 position =;
float4 camPos = float4(ViewIXf[3].xyz,1);

oPosition = mul(Po, WorldViewProj);

float radius = length(position);
float radius2 = radius * radius;
float camHeight = length(;
float3 camToPos = position -;
float farDist = length(camToPos);

float3 lightDir = normalize(;
float3 normal = normalize(position);

float3 rayDir = camToPos / farDist;
float camHeight2 = camHeight * camHeight;

// Calculate the closest intersection of the ray with the outer atmosphere
float B = 2.0 * dot(, rayDir);
float C = camHeight2 - radius2;
float det = max(0.0, B*B - 4.0 * C);
float nearDist = 0.5 * (-B - sqrt(det));
float3 nearPos = + (rayDir * nearDist);
float3 nearNormal = normalize(nearPos);

// get dot products we need
float lc = dot(lightDir, / camHeight);
float ln = dot(lightDir, normal);
float lnn = dot(lightDir, nearNormal);

// get distance to surface horizon
float altitude = camHeight - SurfaceRadius;
float horizonDist = sqrt((altitude*altitude) + (2.0 * SurfaceRadius * altitude));
float maxDot = horizonDist / camHeight;

// get distance to atmosphere horizon - use max(0,...) because we can go into the atmosphere
altitude = max(0,camHeight - AtmosphereRadius);
horizonDist = sqrt((altitude*altitude) + (2.0 * AtmosphereRadius * altitude));

// without this, the shift between inside and outside atmosphere is jarring
float tweakAmount = 0.1;
float minDot = max(tweakAmount,horizonDist / camHeight);

// scale minDot from 0 to -1 as we enter the atmosphere
float minDot2 = ((camHeight - SurfaceRadius) * (1.0 / (AtmosphereRadius - SurfaceRadius))) - (1.0 - tweakAmount);
minDot = min(minDot, minDot2);

// get dot product of the vertex we're looking out
float posDot = dot(camToPos / farDist, / camHeight) - minDot;

// calculate the height from surface in range 0..1
float height = posDot * (1.0 / (maxDot - minDot));

// push the horizon back based on artistic taste
ln = max(0,ln + StretchAmt);
lnn = max(0,lnn + StretchAmt);

// the front color is the sum of the near and far normals
float brightness = saturate(ln + (lnn * lc));

// use "saturate(lc + 1.0 + StretchAmt)" to make more of the sunset side color be used when behind the planet
oUV.x = brightness * saturate(lc + 1.0 + StretchAmt);
oUV.y = height;

// as the camera gets lower in the atmosphere artificially increase the height
// so that the alpha value gets raised and multiply the increase amount
// by the dot product of the light and the vertex normal so that
// vertices closer to the sun are less transparent than vertices far from the sun.
height -= min(0.0,minDot2 + (ln * minDot2));
oAlpha = height * brightness;

// normalised camera to position ray
oCamToPos = -rayDir;

oLightDir = normalize( -;
}

float4 mainBPS(
float2 uv : TEXCOORD0,
float alpha : TEXCOORD1,
float3 camToPos : TEXCOORD2,
float3 lightDir :TEXCOORD3,
uniform sampler2D TexSampler
) : COLOR {

const float fExposure = 1.5;
float g = Atmosphere_G;
float g2 = g * g;

// atmosphere color
float4 diffuse = tex2D(TexSampler,uv);

// sun outer color - could maybe use the atmosphere color instead
float4 diffuse2 = tex2D(TexSampler,float2(min(0.5,uv.x),1));

// this is equivalent to, but faster than, fCos = dot(normalize(,normalize(camToPos));
float fCos = dot(,camToPos) * rsqrt( dot(, * dot(camToPos,camToPos));
float fCos2 = fCos * fCos;

// apply alpha to atmosphere
float4 diffuseColor = diffuse * alpha;

// sun glow color
float fMiePhase = 1.5 * ((1.0 - g2) / (2.0 + g2)) * (1.0 + fCos2) /(1.0 + g2 - 2.0*g*fCos);
float4 mieColor = diffuse2 * fMiePhase * alpha;

// use exponential falloff because mie color is in high dynamic range
// boost diffuse color near horizon because it gets desaturated by falloff
return 1.0 - exp((diffuseColor * (1.0 + uv.y) + mieColor) * -fExposure);
}

technique technique1 {
pass p0 {
ZEnable = false;
ZWriteEnable = false;
CullMode = CCW;
AlphaBlendEnable = true;
SrcBlend = One ;
DestBlend = InvSrcAlpha;
VertexShader = compile vs_3_0 main2VS(gLamp0DirPos);
PixelShader = compile ps_3_0 mainBPS(gTexSampler);
}
}
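The ray/sphere intersection in the vertex shader above can be sanity-checked in isolation. Here it is as standalone C++ (my own restaging for testing, not engine code); because rayDir is normalized, the quadratic's leading coefficient is 1 and drops out:

```cpp
#include <algorithm>
#include <cmath>

// Distance along a unit-length ray from the camera to the nearest
// intersection with a sphere of the given radius centered at the origin.
// Same quadratic as the shader: t^2 + B*t + C = 0, near root taken.
float nearIntersection(float cx, float cy, float cz,
                       float dx, float dy, float dz,
                       float radius)
{
    float B = 2.0f * (cx * dx + cy * dy + cz * dz);
    float C = (cx * cx + cy * cy + cz * cz) - radius * radius;
    float det = std::max(0.0f, B * B - 4.0f * C);
    return 0.5f * (-B - std::sqrt(det));
}
```

Using the slider defaults above, a camera at (0, 0, 2000) looking down the -z axis should hit the 1200-radius atmosphere shell at distance 800 and the 1024-radius surface at 976.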

Possible Improvements
- pre-calculate camera height, camera height squared and similar values. A lot of the vertex shader code is the same for every vertex, including the horizon distance, the camera- and light-only calculations, and parts of the sphere intersection calculations.
- simplify the Mie scattering equation and remove the HDR falloff
- when the camera is inside the atmosphere we no longer need to calculate the nearest atmosphere intersection, so the program should switch to a simpler version of the shader once inside.

Known Issues
- the transition from outer atmosphere to inner atmosphere is not smooth. I suspect this is because my atmosphere is an unrealistic size, so I had to add a tweak amount to move the transition point a bit inside the atmosphere, to where only the inside of the sphere is visible to the camera. At the point where the camera is at the same height as the atmosphere, it can still see the outside of the atmosphere shell.
- when on the surface of the planet looking back at the horizon it looks more like a painted shell than a spherical haze
- when you get really close to the edge of the atmosphere on the side facing the sun there is an artifact that appears that I haven't fixed yet.
- there are banding issues in the sky gradient when on the side of the planet facing the sun that I haven't solved yet and I think they're related to the color choices in my gradient, but I'm not sure.

If you have any suggestions, improvements or bug fixes please let me know in the comments!

Friday, April 23, 2010

Sphere to Cube Mapping

The following unit cube to unit sphere mapping is nice because the resulting sphere vertices are distributed somewhat evenly:

Math Proofs: Mapping a Cube to a Sphere, by Phil

Here's the C++ version, where x,y,z are the cube coordinates and sx,sy,sz are the sphere coordinates:

sx = x * sqrtf(1.0f - y * y * 0.5f - z * z * 0.5f + y * y * z * z / 3.0f);

sy = y * sqrtf(1.0f - z * z * 0.5f - x * x * 0.5f + z * z * x * x / 3.0f);

sz = z * sqrtf(1.0f - x * x * 0.5f - y * y * 0.5f + x * x * y * y / 3.0f);
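Wrapped in a function for testing (same math as above), a face center like (0, 0, 1) maps to itself and the corner (1, 1, 1) lands at (1/√3, 1/√3, 1/√3), i.e. on the unit sphere:

```cpp
#include <cmath>

// Map a point on the unit cube [-1,1]^3 to the unit sphere.
void cubeToSphere(float x, float y, float z,
                  float& sx, float& sy, float& sz)
{
    sx = x * std::sqrt(1.0f - y*y*0.5f - z*z*0.5f + y*y*z*z/3.0f);
    sy = y * std::sqrt(1.0f - z*z*0.5f - x*x*0.5f + z*z*x*x/3.0f);
    sz = z * std::sqrt(1.0f - x*x*0.5f - y*y*0.5f + x*x*y*y/3.0f);
}
```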

Recently, I've been working on the reverse mapping (from unit sphere to unit cube) and have come up with this solution:

First determine the cube face the sphere point projects to. This step is simple - just find the component of the sphere vector with the greatest length.

Next, for each face, take the remaining cube vector components denoted as s and t and solve for them using these equations, which are based on the remaining sphere vector components denoted as a and b:

s = sqrt(-sqrt((2 a^2-2 b^2+3)^2-24 a^2)+2 a^2-2 b^2+3)/sqrt(2)
t = sqrt(-sqrt((2 a^2-2 b^2+3)^2-24 a^2)-2 a^2+2 b^2+3)/sqrt(2)

You should see that the inner square root is used in both equations, so only compute that part once. (Note the +3 inside the squared term: expanding the forward mapping gives (2a^2-2b^2+3)^2 - 24a^2 under the inner root, which is what the code below computes.)

Here's the final function with the equations thrown in, checks for 0.0 and -0.0, and the code to properly set the sign of the cube component - it should be equal to the sign of the sphere component.

void cubizePoint(Vector3& position)
{
    double x, y, z;
    x = position.x;
    y = position.y;
    z = position.z;

    double fx, fy, fz;
    fx = fabs(x);
    fy = fabs(y);
    fz = fabs(z);

    const double inverseSqrt2 = 0.70710678118654752;

    if (fy >= fx && fy >= fz) {
        double a2 = x * x * 2.0;
        double b2 = z * z * 2.0;
        double inner = -a2 + b2 - 3.0;
        double innersqrt = -sqrt((inner * inner) - 12.0 * a2);

        if (x == 0.0 || x == -0.0) {
            position.x = 0.0;
        }
        else {
            position.x = sqrt(innersqrt + a2 - b2 + 3.0) * inverseSqrt2;
        }

        if (z == 0.0 || z == -0.0) {
            position.z = 0.0;
        }
        else {
            position.z = sqrt(innersqrt - a2 + b2 + 3.0) * inverseSqrt2;
        }

        if (position.x > 1.0) position.x = 1.0;
        if (position.z > 1.0) position.z = 1.0;

        if (x < 0) position.x = -position.x;
        if (z < 0) position.z = -position.z;

        if (y > 0) {
            // top face
            position.y = 1.0;
        }
        else {
            // bottom face
            position.y = -1.0;
        }
    }
    else if (fx >= fy && fx >= fz) {
        double a2 = y * y * 2.0;
        double b2 = z * z * 2.0;
        double inner = -a2 + b2 - 3.0;
        double innersqrt = -sqrt((inner * inner) - 12.0 * a2);

        if (y == 0.0 || y == -0.0) {
            position.y = 0.0;
        }
        else {
            position.y = sqrt(innersqrt + a2 - b2 + 3.0) * inverseSqrt2;
        }

        if (z == 0.0 || z == -0.0) {
            position.z = 0.0;
        }
        else {
            position.z = sqrt(innersqrt - a2 + b2 + 3.0) * inverseSqrt2;
        }

        if (position.y > 1.0) position.y = 1.0;
        if (position.z > 1.0) position.z = 1.0;

        if (y < 0) position.y = -position.y;
        if (z < 0) position.z = -position.z;

        if (x > 0) {
            // right face
            position.x = 1.0;
        }
        else {
            // left face
            position.x = -1.0;
        }
    }
    else {
        double a2 = x * x * 2.0;
        double b2 = y * y * 2.0;
        double inner = -a2 + b2 - 3.0;
        double innersqrt = -sqrt((inner * inner) - 12.0 * a2);

        if (x == 0.0 || x == -0.0) {
            position.x = 0.0;
        }
        else {
            position.x = sqrt(innersqrt + a2 - b2 + 3.0) * inverseSqrt2;
        }

        if (y == 0.0 || y == -0.0) {
            position.y = 0.0;
        }
        else {
            position.y = sqrt(innersqrt - a2 + b2 + 3.0) * inverseSqrt2;
        }

        if (position.x > 1.0) position.x = 1.0;
        if (position.y > 1.0) position.y = 1.0;

        if (x < 0) position.x = -position.x;
        if (y < 0) position.y = -position.y;

        if (z > 0) {
            // front face
            position.z = 1.0;
        }
        else {
            // back face
            position.z = -1.0;
        }
    }
}
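To convince myself the equations work, here is a self-contained round-trip sketch for the top face only (y = +1): push a cube point through the forward mapping restricted to that face, then recover it with the s/t equations (my own test harness, not code from the planet project):

```cpp
#include <cmath>

// Forward mapping for the top cube face (y = +1): the general
// cube-to-sphere formula with y*y = 1 substituted in.
void cubeTopToSphere(double x, double z, double& a, double& b)
{
    a = x * std::sqrt(0.5 - z * z / 6.0);
    b = z * std::sqrt(0.5 - x * x / 6.0);
}

// Inverse: recover the cube-face coordinates (magnitudes only; the sign
// is restored from the sign of the sphere components, as in cubizePoint).
void sphereToCubeTop(double a, double b, double& s, double& t)
{
    double a2 = a * a * 2.0;
    double b2 = b * b * 2.0;
    double inner = -a2 + b2 - 3.0;
    double innersqrt = -std::sqrt(inner * inner - 12.0 * a2);
    s = std::sqrt(innersqrt + a2 - b2 + 3.0) / std::sqrt(2.0);
    t = std::sqrt(innersqrt - a2 + b2 + 3.0) / std::sqrt(2.0);
}
```

Feeding (x, z) = (0.7, 0.3) through both recovers (0.7, 0.3) to within floating-point error.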

I posted a stackoverflow question and got a lot of help from there and from mathoverflow too. I used to get the equations for s and t.

My thanks to gmatt and Leonid Kovalev!

Wednesday, March 17, 2010

Tri-Planar Texturing On A Sphere And Spacescape Release!

tri-planar normals on a sphere

Applying an animated normal map to a sphere is a problem: you have stretching issues, and if you're using a cube-to-sphere mapping you have UV direction issues. The only solutions I know of are to do Perlin noise in a shader, use a 3D repeating noise texture(?), or do tri-planar texturing with a 2D texture map, which is what I've done here.

I used the code from this GPU Gems 3 article and added a second bump map and animated them by updating the vertex uv coords and the results are decent.

Now there are no visible seams on the sphere and the bump map is pretty evenly spread across the surface. When it animates you also don't see seams from opposing UV directions, because of the way tri-planar texturing blends the UV maps. This blending does mean, however, that where the cube edges meet on the sphere you get most of the stretching and the bump map is more random. This side effect is OK for water because I want the bumps to look random.

I still have to deal with a bunch of issues, including the fps hit this shader brings with it. Instead of 2 normal map texture look-ups per pixel there are now 6: one for each of the three axes, for each of the two normal maps. On top of this I have texture look-ups for the atmosphere color, sun color and the water depth, and I haven't even tried adding reflection or refraction.
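The blend weights behind those look-ups can be sketched like this (my own summary of the usual tri-planar scheme, not the exact GPU Gems 3 code): each axis-aligned projection is weighted by how strongly the surface normal faces it, normalized so the weights sum to 1.

```cpp
#include <cmath>

// Tri-planar blend weights from a surface normal: each of the three
// planar projections contributes in proportion to |normal component|.
void triplanarWeights(float nx, float ny, float nz,
                      float& wx, float& wy, float& wz)
{
    float ax = std::fabs(nx), ay = std::fabs(ny), az = std::fabs(nz);
    float sum = ax + ay + az;
    wx = ax / sum; // YZ-plane projection
    wy = ay / sum; // XZ-plane projection
    wz = az / sum; // XY-plane projection
}
```

A normal pointing straight along an axis uses only one projection, while the cube-corner direction splits evenly at 1/3 each, which is where the blending (and the apparent randomness mentioned above) is strongest.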

I've pasted the FX Composer .fx HLSL file at the bottom of this post if you want to try it out.

Also, I have released the Spacescape skybox tool and uploaded the source code to!

The tool is rather complex and so I plan on writing some tutorials & tips to help answer some questions that have been coming up.

Here's an intro video:

Here's the tri-planar sphere.fx file code for NVidia FX Composer. NOTE: this is not my water shader, just the tri-planar stuff and it has a slider called OFFSET that you can drag to see how the bump map animates.

Wednesday, February 17, 2010

Spacescape - Now With Billboards

spacescape image with billboard sprites
(click to enlarge)

Just a minor update - I've added billboard sprites to the mix, whereas before I was just using point sprites. I also added an extra line to the noise shader like this:

noiseSum = pow(noiseSum, 1.0 / powerAmount);

This allows me to control the slope of the noise gradient so it is more steep or more gradual, which is something I needed for masks.

The image above really taxes my system (3-4 fps). It has 12 layers: one ridged noise mask for the brightest stars, another for the middle level of stars, two more ridged noise haze layers, and the rest are mixes of billboard and point stars.

For kicks I threw the generated skybox into the planet app, and it's definitely a bit over the top and low-res/blurry. Still, it is "fun".

(click images to enlarge)
planet with generated skybox

planet with generated skybox 2

As usual, I uploaded these images and more to the My First Planet gallery.

I re-implemented writeToFile() in the plugin so I can save to files. I also added a writeToMaterial() function to the plugin so I can preview the skybox in ogre.

One thing I've noticed is that the resolution of skybox really affects the brightness of small stars. When I save a 2048x2048 skybox out the single pixel point stars are crisp, but get lost when imported into a program with a low screen resolution. So you kinda have to take that into account when making the skyboxes. It might look great at the resolution on your monitor, but not at the resolution on someone else's monitor. This is where the writeToMaterial() function will be useful so you can preview the skybox at different resolutions to see if it works OK at low resolutions.

I might mess with the mipmap generation for lower levels so that stars don't lose their brightness as quickly when the high resolution image is scaled down.

Here's a sample generated skybox (click for 1024x1024 version - open in new window):

And here's a funny little screenshot of what a skybox layered sphere looks like from the outside (not the same as the one above):
outside skybox layer sphere

Monday, February 15, 2010

Introducing Spacescape! A Space Skybox Creator.

Spacescape Alpha

Spacescape is a tool that uses Ogre to create/generate space scene skyboxes. The way it works is a camera sits inside a number of user-defined layers/shells. Each layer can be a bunch of point sprites (for stars) or a noise field (for nebulas/haze etc.). You can define how many stars you want and the color based on distance. In the screenshots I'm posting there are 7 layers: 3 are regular blue/white stars, 1 is a few red stars, 2 are ridged fbm noise of different colours, and the last is regular fbm noise used as a mask to make stars cluster by darkening regions of space.

This has been a really fun tool to make and I still have a lot left to do, like the GUI. At present it reads in XML config files, or you can use the plugin API to manipulate the skybox via code. Did I mention this will be an Ogre plugin too? I plan on releasing the plugin and tool as soon as I get a GUI and finish refactoring the save-to-file code.

How it works

Making a bunch of point sprites is easy. I use an Ogre::ManualObject for that with pointSprites enabled on the material.

For making the haze I create an inverted unit sphere and use a GLSL shader on it that implements fbm and ridged fbm noise using Perlin's improved noise (not simplex noise). Lintfordpickle at has a good explanation with code samples of how fbm and ridged fbm noise work.

To create the skyboxes I just create a camera with the FOV set to 90 degrees and render the six images of the skybox to texture. I have it set up to render the mipmap levels to texture too, so if a user wants to save a DDS file with mipmaps they can, or they can just save out the six individual images. With the Ogre plugin it will also be possible to skip saving the file and just tell the plugin to create a skybox from a given config file and use it in your own application.
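The 90-degree FOV is what makes this work: with a square aspect ratio, the half-width of the image plane at unit distance is tan(fov/2), which is exactly 1 at 90 degrees, so the six renders tile the cube with no gaps or overlap. A quick check:

```cpp
#include <cmath>

// Half-extent of the image plane at distance 1 for a given FOV with a
// square aspect ratio. Exactly 1.0 means one render covers one face of
// the unit cube centered on the camera.
float halfExtentAtUnitDistance(float fovDegrees)
{
    const float kPi = 3.14159265358979323846f;
    return std::tan(fovDegrees * 0.5f * kPi / 180.0f);
}
```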

Here's some more images (click to enlarge - you'll need to in order to see the stars better):

A note about dithering:
I've had a tough time figuring out what to do about gradient banding with dark colors in the space backgrounds. Without some kind of dithering or noise the banding is very obvious; you can see it clearly in all the videos and screenshots I posted prior to this post. The skybox I was using before was generated with Blender and spiced up in Photoshop, but there was no dithering on the haze so it's really ugly. To get around this in Spacescape, I provide the option to dither a noise layer by a controllable amount, and this seems to make the banding issue go away.
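A minimal sketch of the idea (my own illustration, not Spacescape's actual code): perturb each channel by up to about one 8-bit step of random noise before quantizing, so neighboring pixels in a smooth gradient round to a mix of adjacent values instead of a hard band.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdlib>

// Quantize a 0..1 channel value to 8 bits, optionally adding dither
// noise first. ditherAmount is in units of one 8-bit step (e.g. 0.5-1.0).
unsigned char quantizeChannel(float value, float ditherAmount)
{
    float noise = ((std::rand() / (float)RAND_MAX) - 0.5f)
                  * ditherAmount / 255.0f;
    float v = std::min(1.0f, std::max(0.0f, value + noise));
    return (unsigned char)std::lround(v * 255.0f);
}
```

With ditherAmount = 0 this is plain rounding; with it set near 1, a slow ramp dithers into a mix of two adjacent 8-bit values instead of banding.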

The plan is to use this plugin in MyFirstPlanet to let users design their own space skyboxes easily too.

Hopefully, I'll have a demo and some source code posted in the next week or so.

Saturday, February 6, 2010

New YouTube Channel For Planet Dev Videos

I started the petrocket YouTube channel and am posting dev videos there for now. I like Vimeo, but YouTube now allows HD videos and I think it has a larger developer audience.

Here's another quick video I uploaded to test the HD quality. Sorry for the jerky camera movement.

Sunday, January 24, 2010

Simple Flexible Atmosphere Shaders

Uploaded a video of the effect

First, why no posting since September? Well, in October my wife went into pre-term labor so she was on bed rest and that meant I had to become a mom till our son was born late November. Then in early December it was holiday madness and new baby madness!

Slowly, after the holidays, I started work on the planet again. I've mostly been converting it to an Ogre plugin and refactoring it. I hope to do a separate blog post about the plugin architecture later, but for now this post is about my work in NVidia FX Composer 2.5 on some flexible atmosphere shaders.

If you don't know, my earlier shaders were based entirely on Sean O'Neil's free atmosphere shaders. Now, Sean does things the right way, which is why his work looks so good! His shaders represent the atmosphere of the Earth very accurately. The one thing that prevented me from cutting and pasting them into my project was that they rely on your atmosphere radius having a certain ratio to the surface radius (an Earth-sized ratio). My atmospheres are much exaggerated, so it didn't really work, and I was kind of lost in all the math.

Side note: If you're ever struggling with this stuff I highly recommend you contact him, because he's very kind and helpful.

So I started work on a shader that isn't mathematically correct; it just uses a look up table for the blend colors based on two things:
- the dot product between the camera and the light
- the dot products between the light and the two sphere intersections of a ray from the camera to the vertex being drawn

Here's the 32x16 look up table for the atmosphere shell.

On the horizontal axis values on the left are used when the point on the atmosphere is pointing away from the sun and values on the right are used when the point on the atmosphere is pointing toward the sun. The vertical axis values are for blending between points near the surface of the planet (bottom) and points at the edge of the atmosphere (top).

So here is a side view of the planet that shows the gradient well.

To calculate the color to use on the horizontal axis I use this:

float StretchAmount = 0.5;
ln = dot(lightDir, farVertexPosition);
lnn = dot(lightDir, nearVertexPosition);
lc = dot(lightDir, cameraPosition);

ln = max(0,ln + StretchAmount);
lnn = max(0,lnn + StretchAmount);

uv.x = saturate(ln + (lnn * lc)) * saturate(lc + 1.0 + StretchAmount);

So when the camera is between the planet and the sun, ln will be 0, lnn will be 1 and lc will be 1, so you get uv.x = 1 (the blue color of the atmosphere). When the camera is on the back side of the planet, ln will be 1 and lnn will be 0, so uv.x would also be 1.0 from the back, except that the saturate(lc + 1.0 + StretchAmount) term scales the value based on lc, which would be -1: -1 + 1.0 + 0.5 (the stretch amount) = 0.5, so uv.x ends up being 0.5 and you see the reddish colour.
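Those two cases can be checked with a CPU restaging of the calculation (a sketch; saturate is a 0..1 clamp):

```cpp
#include <algorithm>

// Clamp to 0..1, like HLSL's saturate().
static float saturate(float x) { return std::min(1.0f, std::max(0.0f, x)); }

// uv.x from the three dot products: ln = light . far intersection normal,
// lnn = light . near intersection normal, lc = light . camera direction.
float gradientU(float ln, float lnn, float lc, float stretchAmount)
{
    ln  = std::max(0.0f, ln  + stretchAmount);
    lnn = std::max(0.0f, lnn + stretchAmount);
    return saturate(ln + (lnn * lc)) * saturate(lc + 1.0f + stretchAmount);
}
```

gradientU(0, 1, 1, 0.5f) returns 1.0 for the sun side and gradientU(1, 0, -1, 0.5f) returns 0.5 for the back side, matching the walkthrough above.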

The StretchAmount is used to determine where the sunset should begin.

I also multiply the color by the height so it fades out as the atmosphere gets thinner, and multiply by another amount so values at the back of the sphere fade out. This value is calculated by:

float StretchAmt2 = 0.5;

float ln2 = max(0,ln + StretchAmount + StretchAmt2);
float lnn2 = max(0,lnn + StretchAmount + StretchAmt2);
float alpha = saturate(ln2 + (lnn2 * lc));

float finalAlpha = height * alpha;

I wanted to have the alpha fade out after the point at which the sunset starts, and this allows for that. Notice it is basically identical to the uv.x calculation; it just offsets the ln and lnn values so that I can control when to start fading out the atmosphere. If StretchAmt2 is 0, the atmosphere will be faded out by the horizon; the value of 0.5 used in the pictures lets the atmosphere fade out about a quarter of the way around the back of the planet.

The surface shader uses a smaller 32x2 gradient map.

The top row corresponds to the atmosphere color and the bottom row corresponds to the sun color. Horizontal values correspond to the dot product between the surface position and the sun position.

Currently the pixel shader color output comes from this:

outColor = (diffuse * sunColor) + (atmosphereColor * min(AtmosphereTransparency,attenuation)) * sunColor;

The AtmosphereTransparency is how transparent I want the atmosphere to be, and presently I'm using 0.4 based on personal preference.

Lastly, the attenuation factor is based on the distance from the camera to the vertex being drawn and the code to calculate it is this:

float vd = min(AtmosphereHeight,VisibilityDistance);
outAttenuation = saturate((cameraToVertexDistance - VisibilityDistance)/ (AtmosphereHeight + vd));

VisibilityDistance is the minimum visibility distance, so you can adjust it for personal taste. A large visibility distance gives the impression of a very thin atmosphere, and a small visibility distance a very dense one.

Here's a boring picture from the surface:

Next step is to put these shaders into Ogre and see how they look. Until then, don't forget to read the NVidia GPU Guide and ATI GPU Guides every night before you go to bed.