Thursday, August 20, 2015

Volumetric Clouds

A while back I wrote a system to create and render a volumetric nebula for universe mode. Later I realized that a modified version of that effect also made a good debris/dust cloud for ship explosions. Then I used a more colorful version for teleporter graphics in ground gameplay mode. This week I decided to try this approach for rendering volumetric puffy clouds in infinite tiled terrain mode, and it works! Well, almost: they don't quite look like "normal" clouds from a distance, but they look great up close. Plus, the player can fly through them.

Before I show you some pretty screenshots, I need to get some technical details out of the way. I'll try not to make it too bad for the nontechnical readers.

The volume cloud system I'm using isn't actually something I thought up myself. I saw it explained in a video that I found on YouTube through a Google search a few years ago. Unfortunately, I can't seem to find that video again, so I can't link it here. The video described how to create a nebula out of a number of randomly oriented 2D slices containing partially transparent plasma textures created in Photoshop. Since I needed one unique nebula per galaxy, in a game with an infinite number of galaxies, I had to do something a bit different. Instead of manually creating an infinite number of textures in Photoshop (which I don't own), 3DWorld generates the color and alpha (transparency) values directly on the GPU, using random 3D noise textures to create ridged Perlin noise. The nebula's location in space is used as the seed that determines the starting position and step direction/distance into the noise texture, so that each nebula is unique.
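To give a sense of how position-based seeding can work, here is a minimal sketch in C++ (my own reconstruction, not 3DWorld's actual code; the function names, constants, and hash are all illustrative):

```cpp
#include <cstdint>
#include <cassert>

struct vec3 {float x, y, z;};

// Hash the nebula's world-space position into a deterministic integer seed.
// This quantizes the coordinates and mixes them with large primes, a common
// spatial hashing trick; any good integer hash would do here.
uint32_t hash_position(vec3 const &pos) {
    uint32_t h = uint32_t(int32_t(pos.x*100.0f))*73856093u
               ^ uint32_t(int32_t(pos.y*100.0f))*19349663u
               ^ uint32_t(int32_t(pos.z*100.0f))*83492791u;
    h ^= h >> 16; h *= 0x45d9f3bu; h ^= h >> 16; // avalanche the bits
    return h;
}

// Map the hash bits to a noise texture starting offset in [0,1)^3, so that
// every position in space gets its own unique slice of the noise.
vec3 noise_start_offset(vec3 const &pos) {
    uint32_t h = hash_position(pos);
    return vec3{(h & 1023u)/1024.0f, ((h >> 10) & 1023u)/1024.0f, ((h >> 20) & 1023u)/1024.0f};
}
```

The same seed can also drive the step direction/distance; the key property is that the same position always produces the same nebula.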

Each particle cloud (nebula, etc.) consists of 13 2D quads (squares) that all intersect at their centers. Each quad can be viewed from either direction, so it's actually more like 26 2D billboards. The quad planes are oriented in uniformly distributed directions by taking all {x,y,z} values in {-1,0,1} and folding out the symmetric ones. This gives the following 13 directions before normalization: (1,0,0), (0,1,0), (0,0,1), (1,1,0), (1,0,1), (0,1,1), (-1,1,0), (-1,0,1), (0,-1,1), (1,1,1), (1,-1,-1), (-1,1,-1), (-1,-1,1).
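The fold-out step can be sketched like this (not 3DWorld's actual code; the canonical representative chosen here, first nonzero component positive, differs from the list above, but it spans the same 13 planes, since v and -v define the same quad plane):

```cpp
#include <vector>
#include <cassert>

struct ivec3 {int x, y, z;};

// Enumerate all nonzero {x,y,z} vectors with components in {-1,0,1},
// keeping only one of each antipodal pair.
std::vector<ivec3> quad_plane_dirs() {
    std::vector<ivec3> dirs;
    for (int x = -1; x <= 1; ++x)
    for (int y = -1; y <= 1; ++y)
    for (int z = -1; z <= 1; ++z) {
        if (x == 0 && y == 0 && z == 0) continue; // skip the zero vector
        // Drop any vector whose first nonzero component is negative;
        // its negation is already in the list (this is the "fold").
        if (x < 0 || (x == 0 && (y < 0 || (y == 0 && z < 0)))) continue;
        dirs.push_back(ivec3{x, y, z});
    }
    return dirs; // 13 directions; normalize before use
}
```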

The corners of the quads are attenuated to an alpha of zero (fully transparent) so that they appear as circles with a smooth falloff from opaque to transparent at their outer edges. In addition, quads that are nearly parallel to the view direction are faded to transparent to avoid ugly artifacts from looking at them edge-on. Then the color and alpha channels are modulated by the noise functions to give an interesting, cloud-like effect. The fragment shader evaluates the noise function at a position corresponding to the coordinate of each pixel on the quad in 3D world space. Therefore, the cloud looks correct and has proper parallax when the viewer/camera moves around and through it. That is, as long as the view doesn't pass directly through the center where the quads all intersect, which is a singularity where every quad is viewed edge-on and rendered invisible, causing the cloud to disappear. I guess that's what it's like to be in the eye of the storm.
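The two attenuation terms can be sketched as follows, written as plain C++ for clarity (in 3DWorld this logic lives in the shaders, and the exact falloff curves here are my guesses):

```cpp
#include <cmath>
#include <algorithm>
#include <cassert>

// radius01: the fragment's distance from the quad center, normalized so
// the quad edge is at 1.0. cos_view: cosine of the angle between the quad
// normal and the view direction (1 = face-on, 0 = edge-on).
float cloud_quad_alpha(float radius01, float cos_view) {
    // Radial falloff: opaque at the center, fully transparent at the edge,
    // so the square quad reads as a soft-edged circle.
    float radial = std::max(0.0f, 1.0f - radius01*radius01);
    // Edge-on fade: quads nearly parallel to the view direction vanish
    // instead of showing up as hard lines.
    float facing = std::fabs(cos_view);
    return radial*facing; // final alpha is further modulated by the noise
}
```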

Here are some of the fun things I can render using this system.

Yes, it can draw a nebula, what a surprise! This system was initially used for drawing nebulae in universe mode. I've shown this image in a previous blog post, and here it is again.

Closeup view of a procedurally generated nebula in universe mode. This was in a previous blog post.
A nebula is drawn using ridged noise (just Perlin noise with some extra math). Okay, if you're curious, here is the math that converts a Perlin noise value 'v' into ridged noise:
v = 2.0*v - 1.0; // map [0,1] range to [-1,1]
v = 1.0 - abs(v); // ridged noise
v = v*v; // square it

Yes, exciting, isn't it? The nebula shader uses two random noise color channels, where the colors themselves are also randomly selected (I think it was pink and orange in the screenshot), plus one alpha channel. The noise is mostly high frequency and the bounding shape is spherical.

Who creates a space game without ship explosions? What is left after a big ship's explosion, anyway? A cloud of glowing dust and particles! This can be drawn with the same technique. Just use lower frequency noise to make it more irregular, make it even more ridged and wispy, use different colors for the interior vs. exterior of the volume, and you get this:

Volumetric debris and dust cloud from blue/white and red/orange ship explosions in universe mode.

That's enough particle clouds in universe mode. It looks good, but I don't want to overuse the effect. Where else can it be used? How about using it for the new teleporters in 3rd person shooter ("ground") mode? Since a teleporter doesn't exist in the real world I can make it look however I want. I decided to make it very bright so that it stands out, adds color to the dull grays of my office building scene, and looks like something that is definitely not natural. The colors and noise are animated as well so that it's clear to the player this thing isn't just a static decoration. It draws attention - this glowing object does something. Teleporters use 4 high frequency ridged Perlin noise channels in the shader: red, blue, orange, and alpha.

Closeup of a teleporter gameplay entity. The dynamic, animated, colored cloud casts light on the ground.
Here is a short video of the teleporter in action, where you can see the animated colored volume and how it actually moves the player to a different location in the scene. Note that teleporters don't just work with the player: they work with enemies, items, projectiles, particles, and any other type of dynamic game object. It's fun to throw grenades into the teleporter trying to hit someone you can't see on the other side. Just make sure you count to 5 before walking into it yourself. Sorry, I didn't record a video of this (yet).

The obvious use for a volumetric particle cloud rendering system is for rendering ... clouds. There are a ton of competing ways to draw clouds in games. Sometimes clouds are meant to be viewed from below, for example when the player can walk or drive on the ground. Sometimes clouds are meant to be viewed from above or inside, for example in a flight simulator or space game. In 3DWorld, there's really no constraint on where the player can go. It's a game engine, not a game, so it could be used for a ground-based first person shooter or a flight simulator. These clouds need to look correct, with a good frame rate, when viewed from any location. Well, except from their exact centers (again).

Clouds are drawn in a base color of white but are modulated to match the lighting conditions so that they're red-orange during sunrise/sunset and dark gray at night. Their brightness decreases toward their center to simulate self-shadowing inside the cloud volume. I haven't yet tried to add light scattering to these clouds. In addition, they become opaque near the center where the noise value has less of an effect. They use regular Perlin noise rather than ridged noise for a more puffy and natural appearance, and have a mixture of high and low frequencies. The noise function offset varies slowly so that clouds change shape over time. Here are some clouds viewed from the ground below in infinite tiled terrain mode.
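The per-fragment cloud shading described above can be sketched roughly like this (my reconstruction, not 3DWorld's code; the 0.5 brightness floor and the blend weights are illustrative):

```cpp
#include <algorithm>
#include <cmath>
#include <cassert>

struct color4 {float r, g, b, a;};

// light_color: the current sun/sky color (white at noon, red-orange at
// sunset, dark gray at night), which scales the cloud's white base color.
// dist01: normalized distance from the cloud center (0 = center, 1 = edge).
color4 shade_cloud(color4 const &light_color, float dist01, float noise_alpha) {
    // Darken toward the center to fake self-shadowing inside the volume.
    float shade = 0.5f + 0.5f*dist01;
    // Near the center the noise has less effect, so the core reads as opaque.
    float alpha = std::min(1.0f, noise_alpha + (1.0f - dist01));
    return color4{light_color.r*shade, light_color.g*shade, light_color.b*shade, alpha};
}
```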

View of volumetric procedural cloud puffs from below. These clouds slowly change over time.
Tiled terrain mode actually draws three cloud layers (listed in the order in which I added them):
  • 2D procedural noise cloud plane. The terrain and grass shaders ray cast into this layer for soft cloud shadows. I also have slow God-rays that ray march through the cloud density function. Density depends on weather conditions and atmosphere. Slowly animated/scrolling.
  • Static textured upper cloud layer. Provides a more interesting background than pure blue.
  • Puffy volumetric clouds. The new mode that's the subject of this post. Doesn't cast shadows yet - unclear how the terrain and grass shader can ray cast into these. Animated, but more slowly.
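The soft shadow cast by the first (2D plane) layer can be sketched as a single ray cast (illustrative only; 3DWorld's shaders use the real procedural density function, and the god rays march it repeatedly):

```cpp
#include <algorithm>
#include <cassert>

struct vec3 {float x, y, z;};

// Placeholder standing in for the procedural 2D cloud density function.
float cloud_density(float, float) {return 0.25f;}

// Cast a ray from the surface point toward the sun, intersect it with the
// cloud plane at height cloud_z, and sample the density there.
float cloud_shadow(vec3 const &pos, vec3 const &sun_dir, float cloud_z) {
    if (sun_dir.z <= 0.0f) return 0.0f; // sun below the horizon: in shadow
    float t = (cloud_z - pos.z)/sun_dir.z; // ray/plane intersection distance
    float density = cloud_density(pos.x + t*sun_dir.x, pos.y + t*sun_dir.y);
    return 1.0f - std::min(1.0f, density); // 1 = fully lit, 0 = fully shadowed
}
```

The difficulty with the third (volumetric) layer is exactly that there is no single plane to intersect, which is why those clouds don't cast shadows yet.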
The sky looks a bit cluttered with all three cloud layers enabled now. Also, it's odd that the first layer casts shadows, but the second layer (which is denser and more apparent) doesn't cast shadows. Maybe I'll remove the first cloud plane layer later. Or maybe the set of enabled layers should be determined by atmosphere and weather conditions. For example, use a large number of puffy gray volume clouds when it's rainy, but only sparse/high 2D cloud cover when it's sunny.

This wireframe view of clouds from above shows how they are composed of multiple 2D quad billboards in various orientations arranged around a common center point, similar to flower petals.

Clouds drawn in wire frame mode. The structure of the 13 intersecting planes can be seen.
Here is a short video of the player flying through the cloudy sky over some islands. Since the clouds are rendered in 3D, they appear as real volumes when flown through. Most games seem to use 2D cloud billboards or skybox background images that are only meant to be viewed from a distance.

So far I've come up with four different uses of this particle cloud rendering technology. I wonder what else I can use it for? Rocket explosions? Plasma balls? Smoke effects? Insect swarms? Actually this might work well for smoke, and could be a good replacement for the existing billboard cloud system I use for smoke in 3DWorld. I guess we'll just have to wait and see.

Wednesday, August 12, 2015

Indirect Lighting for Player Controlled Lights

I'm continuing to work on improving the dynamic lighting in 3DWorld. This post is a short update that builds on the previous two posts (indirect lighting and dynamic lighting + triggers):
Indirect Lighting
Lighting and Triggers

I was reading another blog post from The Witness, and it gave me an idea. The contrast between the lit and unlit areas of the basement scene is too high. All of the lighting comes from the spotlights' direct illumination; the indirect reflected light is completely missing. This is why the orange balls look black on the sides facing away from the lights. The scene doesn't look very real.

Let me review how the indirect lighting in 3DWorld works. The scene is divided into a 3D grid of light volumes in {x, y, z} that are uploaded to the GPU and used in the fragment shader to individually light each pixel. During the offline preprocessing phase, each light source emits millions of rays, each of which is traced through the scene using multiple CPU threads. Each ray's weighted RGB (red, green, and blue) light contribution is added to each grid cell that it passes through. This means that the fragment shader can query any point in space within the scene bounds to get the indirect lighting contribution. This uses more memory per volume than lightmaps and therefore the lighting is stored at a coarser granularity. But, it has the advantage that dynamic objects (such as the orange balls) that weren't part of the original static scene can be correctly lit. This approach may also be simpler to implement and more efficient to compute. I haven't implemented lightmaps so I don't know for sure.
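The accumulation step can be sketched as follows. This uses a simplified fixed-step march for clarity (a proper 3D DDA traversal would visit each cell exactly once), and the grid/ray types are illustrative, not 3DWorld's actual data structures:

```cpp
#include <vector>
#include <cstddef>
#include <cassert>

// A 3D grid of light volumes; each cell accumulates weighted RGB light.
struct LightGrid {
    int nx, ny, nz;
    std::vector<float> rgb; // 3 floats per cell
    LightGrid(int x, int y, int z) : nx(x), ny(y), nz(z), rgb(3*x*y*z, 0.0f) {}
    void add(int i, int j, int k, float r, float g, float b) {
        size_t idx = 3*((size_t(k)*ny + j)*nx + i);
        rgb[idx] += r; rgb[idx+1] += g; rgb[idx+2] += b;
    }
};

// March a ray (origin and direction in grid coordinates) and deposit its
// weighted color into every cell it passes through.
void deposit_ray(LightGrid &grid, float ox, float oy, float oz,
                 float dx, float dy, float dz, float len,
                 float r, float g, float b) {
    float const step = 0.5f; // half-cell steps; a DDA would be exact
    for (float t = 0.0f; t < len; t += step) {
        int i = int(ox + t*dx), j = int(oy + t*dy), k = int(oz + t*dz);
        if (i < 0 || j < 0 || k < 0 || i >= grid.nx || j >= grid.ny || k >= grid.nz) break;
        grid.add(i, j, k, r, g, b);
    }
}
```

With millions of rays per light, each weighted by the light's color and attenuated per bounce, the sum in each cell converges to the indirect lighting that the fragment shader later samples.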

Okay, back to the problem. Dynamic lights can be turned on and off when their triggers (light switches) are activated, so the indirect lighting isn't constant. It can't be baked into the (single) global lighting volume of the scene. The indirect lighting can't even be stored per-trigger, because it needs to be removed when an individual light is destroyed. What is needed is per-light source volumes that are generated on the fly when needed and merged into the final lighting solution whenever a light's intensity (or enabled state) changes. Since light triggering is infrequent, most game frames have the same set of enabled lights. It makes sense to merge in new lighting values only when they change, and re-upload the merged data to the GPU sparsely. This avoids having to read from multiple 3D lighting textures on the GPU each frame - I haven't actually measured that, but I assume it would significantly hurt the frame rate. The 4-5ms of CPU time spent updating the lighting every few seconds is negligible.
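The merge step can be sketched like this (the structure is illustrative, not 3DWorld's actual code):

```cpp
#include <vector>
#include <algorithm>
#include <cstddef>
#include <cassert>

// Each light keeps its own precomputed indirect lighting volume.
struct LightVolume {
    bool  enabled;
    float intensity;
    std::vector<float> data; // per-cell RGB, same layout as the merged grid
};

// Sum all enabled volumes into the final grid. This runs only when a
// light's enabled state or intensity changes (i.e. rarely), after which
// the merged result is re-uploaded to the GPU 3D texture.
void merge_lighting(std::vector<LightVolume> const &lights, std::vector<float> &merged) {
    std::fill(merged.begin(), merged.end(), 0.0f);
    for (LightVolume const &lv : lights) {
        if (!lv.enabled) continue; // skip lights that are switched off
        for (size_t i = 0; i < merged.size(); ++i) {merged[i] += lv.intensity*lv.data[i];}
    }
}
```

Destroying a light then just means marking its volume disabled and re-merging, with no need to subtract anything from a baked global solution.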

So what does it look like? Here is a screenshot of the direct + indirect lighting effects on the basement spotlight scene with some orange balls in motion.

Direct + Indirect lighting + Shadows in the basement spotlight scene.

The biggest difference is the reflection of the spotlight hitting the ceiling and floor near the light on the back wall. The sides of the balls facing away from the lights aren't completely black any more. Much better. Unfortunately, now there is a 6 second freeze when the player first turns on the lights as the CPU computes 5 million rays (1M per light source) with 4 bounces each. That really ruins gameplay. Who wants to sit there waiting for the lighting to be computed in the middle of playing the game? It takes longer than loading the scene at the beginning!

One solution is to compute the lighting of each light source once in a preprocessing pass, then write it to disk for later reuse. I modified the scene file reader to accept filenames attached to each light for caching indirect lighting on disk. This works well, reducing lighting computation time from 6s to a few milliseconds.

However, now I'm stuck with multiple 8MB files on disk, one per light source. These files together take up more disk space than the rest of the scene files combined. They need to be compressed. Fortunately, they're easy to compress. The RGB color data is mostly zeros, and 32-bit floating-point numbers have more precision than I need. 8-bit unsigned integers would work just fine - they get converted to 8-bit in the GPU texture later anyway. The first thing I did was to remove most of those zeros. Since these are small local lights, their radius of influence is pretty small. In addition, their light is confined to this one room in the basement. I first filter the lighting values so that any value smaller than 0.1% is clamped to 0. Then I compute the smallest bounding cube that contains all of the nonzero values. This provides a 100-200x reduction in file size and memory usage. The 8MB files are now only 40-80KB. The reduction is enough that it doesn't seem necessary to do the 32-bit => 8-bit data compression.
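The two reductions can be sketched in one pass over the volume (my reconstruction; the flat x-major layout and names are illustrative):

```cpp
#include <vector>
#include <algorithm>
#include <cstddef>
#include <cassert>

struct bbox {int x0, y0, z0, x1, y1, z1;}; // inclusive cell bounds

// Clamp near-zero lighting values to exactly zero, then compute the
// smallest bounding box containing all remaining nonzero cells. Only the
// values inside this box need to be written to disk.
bbox compress_volume(std::vector<float> &vals, int nx, int ny, int nz, float thresh) {
    bbox b{nx, ny, nz, -1, -1, -1}; // start with an empty (inverted) box
    for (int z = 0; z < nz; ++z)
    for (int y = 0; y < ny; ++y)
    for (int x = 0; x < nx; ++x) {
        float &v = vals[(size_t(z)*ny + y)*nx + x];
        if (v < thresh) {v = 0.0f; continue;} // clamp small values to zero
        b.x0 = std::min(b.x0, x); b.x1 = std::max(b.x1, x);
        b.y0 = std::min(b.y0, y); b.y1 = std::max(b.y1, y);
        b.z0 = std::min(b.z0, z); b.z1 = std::max(b.z1, z);
    }
    return b;
}
```

Since a small local light touches only a small fraction of the scene's cells, the box is tiny compared to the full grid, which is where the 100-200x reduction comes from.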

Here are some screenshots comparing the effect of the different lighting components. In my opinion, the new combined direct + indirect lighting looks much better than direct only. [Ignore the frame rate on the lower left corner - I froze the scene update so the framerate counter wasn't updating. It normally runs at over 200 FPS.]

Uniform lighting shows the base material colors and textures. Some crates were added to provide more interesting shadows.

No lighting. A few emissive objects are visible (light switch and sky visible through the window).
Direct lighting + shadows only. The spotlights themselves are lit by a separate small light. Similar to the previous blog post.

Indirect lighting only. Most of the direct light hits the floor and ceiling near the back wall, reflecting light onto the wall.

Direct and indirect lighting combined form a more realistic global lighting solution for the scene.

Sunday, August 2, 2015

Destruction Video

Here is a test video of me shooting out windows, benches, lights, and exploding tanks.