Sunday, August 28, 2016

Procedural Universe Rendering

I recently installed a new version of Microsoft Visual Studio on my home machine where I develop 3DWorld. The upgrade from MSVS 2010 to MSVS 2015 had been delayed until I found a good deal on the latest version, which sells for hundreds of dollars. I managed to get a used copy on amazon.com for a fraction of the retail price, and it seems to work just fine.

Overall it only took me a few hours to get 3DWorld building and running with the new compiler. There were various minor fixes for syntax errors and warnings, and I had to rebuild some of the dependencies. However, the upgrade did require me to spend a lot of time setting up my universe mode scenes, for about the fourth time in the history of 3DWorld. My universe scenes were all invalidated and had to be reconstructed because the planets were different types and in different places. I had to re-place the ships and space stations and change various parameters.

The problem is that the built-in random number generator values changed again. It seems like every version of Visual Studio gives me different values from rand(). Normally I wouldn't use the system rand() because it's slow, poor quality, and varies across compilers/OSes. I have my own custom random number generator, which I've been using since I switched to MSVS 2010 about 5 years ago, that solves all three of these issues. I thought that was the last time I would have to deal with the universe random seeds problem. I guess not.
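For reference, here's a minimal sketch of the kind of generator I mean - a small xorshift-style RNG that produces the same sequence on every compiler and OS. This isn't 3DWorld's actual generator, and the names are hypothetical; it just illustrates the approach:

    // Minimal sketch of a fast, deterministic RNG (not 3DWorld's actual code).
    // A 32-bit xorshift generator: the same seed always produces the same
    // sequence, regardless of compiler or OS, unlike the standard rand().
    #include <cstdint>

    class rand_gen_t { // hypothetical name
        uint32_t state;
    public:
        explicit rand_gen_t(uint32_t seed=12345) : state(seed ? seed : 1) {} // state must be nonzero
        uint32_t rand_uint() {
            state ^= state << 13;
            state ^= state >> 17;
            state ^= state << 5;
            return state;
        }
        float rand_float() {return rand_uint()*(1.0f/4294967296.0f);} // uniform in [0,1)
    };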

Unfortunately, I missed a call to rand() that was used to precompute a table of Gaussian-distributed random numbers, which avoids generating Gaussian values on the fly. My custom random number generator was still being used to select a random entry from the Gaussian table, but the entries themselves were all different. This distribution was used to select the temperature and radius of each system's star. The star radius affected the planet orbits, and the star temperature affected the planet types and environments. All of the galaxy and system locations were the same, but within a solar system everything was different.
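The fix was to build the table with the custom generator as well, so the table contents no longer depend on the compiler's rand(). A simplified sketch of the pattern (with hypothetical names, not the actual 3DWorld code):

    #include <algorithm>
    #include <cmath>
    #include <vector>

    std::vector<float> gauss_table;

    // Precompute a table of Gaussian-distributed values using the custom RNG.
    void build_gauss_table(rand_gen_t &rgen, unsigned size=1024) {
        gauss_table.resize(size);
        for (unsigned i = 0; i < size; ++i) { // Box-Muller: two uniforms => one Gaussian
            float const u1(std::max(rgen.rand_float(), 1.0E-6f)), u2(rgen.rand_float());
            gauss_table[i] = std::sqrt(-2.0f*std::log(u1))*std::cos(2.0f*3.14159265f*u2);
        }
    }
    // At runtime, select a random table entry rather than computing a Gaussian on the fly.
    float rand_gaussian(rand_gen_t &rgen) {return gauss_table[rgen.rand_uint() % gauss_table.size()];}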

I fixed the problem and added a random seed config file parameter. This made it easy to regenerate the current system until I found one I liked, rather than having to fly around the galaxy looking for a suitable starting system for the player. I was looking for a seed that would give me a yellow to white star, an asteroid belt, and at least one of each type of interesting planet (Terran/Earth-like habitable, gas giant, ice planet, volcanic planet, ringed planet, etc.) In the process I came across some interesting and beautiful planets, such as the gas giant in the screenshot below that looks like Jupiter.
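The idea is simple: the seed comes from the config file and feeds the system generator, so the same seed always reproduces the same system. Something along these lines (the parameter name and function are hypothetical, not the actual 3DWorld code):

    // Set from a config file line such as "universe_seed 42" (hypothetical syntax).
    unsigned universe_seed(42);

    void gen_player_system() {
        rand_gen_t rgen(universe_seed); // same seed => same star, planets, and orbits
        // ... generate star temperature/radius, planet types, asteroid belts, etc.
    }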

Closeup of a procedural gas giant that looks like Jupiter, including small elliptical "storms".

Shadows

I settled on a system that had some interesting shadow effects, so I thought I would take some screenshots of the different types of objects that cast and receive shadows. Here is an image of a planet with a moon that is in the middle of the system's asteroid belt. I don't know if this actually happens in real solar systems, but it certainly makes for interesting gameplay. It's fun to watch the ships fly around the planet trying to (or failing to) avoid colliding with the asteroids. In this screenshot, I've positioned my ship so that the star is behind me and I'm in the shadow of the moon, looking at the asteroid belt and the planet, which is right in the middle of the asteroids.

Moon and planet casting shadows on an asteroid belt.

The small asteroids in the near field are fully shadowed and black, and the asteroids further away show a dark cone of shadow extending toward the planet in the center of the image. Some of the shadowed asteroids are difficult to see because they blend in with the black universe background, but you can definitely see shadowed asteroids contrasted against the planet. The shadow cone eventually fades out because the moon occludes less and less of the star's light as the distance between the asteroids and the moon increases. This is similar to how, on Earth, shadows from nearby objects are much sharper than shadows from distant objects. Also note that the moon doesn't actually cast a shadow on the planet in its current position.

Here is a nice blue ocean planet that has a ring of asteroids around it. The ring casts a thin shadow near the equator of the planet. This can be seen as a thin dark line a bit below the center of the planet. This shadow is ray traced through the procedural ring density function in the fragment shader on the GPU to determine the amount of light that is blocked. The sun is behind my ship and a bit to the right. You can also see that the planet shadows the asteroid belt on the back left side. I found another planet where the moon should cast shadows on the rings, so I'll have to implement that in the code next.
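Here's a rough C++-style sketch of the ring shadow logic (the real version runs in a GLSL fragment shader, and the density function is noise-based rather than the constant band used here):

    #include <algorithm>
    #include <cmath>

    struct vec3 {float x, y, z;};
    vec3 operator+(vec3 a, vec3 b) {return {a.x+b.x, a.y+b.y, a.z+b.z};}
    vec3 operator*(vec3 a, float s) {return {a.x*s, a.y*s, a.z*s};}

    // Stand-in for the procedural ring density function (the real one uses noise).
    float ring_density(vec3 const &p) {
        float const r(std::sqrt(p.x*p.x + p.y*p.y)); // radius in the ring plane
        return ((r > 1.2f && r < 2.0f) ? 0.5f : 0.0f); // nonzero inside the ring band
    }

    // March from the shaded point toward the star, accumulating ring opacity;
    // returns the fraction of light that gets through (1.0 = unshadowed).
    float ring_shadow_atten(vec3 pos, vec3 const &to_star, float step_len, int num_steps=16) {
        float blocked(0.0f);
        for (int i = 0; i < num_steps && blocked < 1.0f; ++i) {
            pos = pos + to_star*step_len;
            blocked += ring_density(pos)*step_len; // optical depth accumulation
        }
        return 1.0f - std::min(blocked, 1.0f);
    }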

Beautiful blue planet with asteroid belt rings. The rings cast a faint shadow on the planet and the planet casts a soft shadow on the rings.

I was lucky enough to find a rare occurrence of a moon casting a shadow on a planet - a solar eclipse! However, the relative sizes and distances between the star, moon, and planet in 3DWorld aren't to scale with real distances, so it may not represent a physically correct eclipse. I don't see these very often, and the previous planet configuration (MSVS 2010) didn't have one of these in any nearby star systems. The moon slowly revolves around the planet with an orbital period of around an hour, and after a few minutes the shadow no longer intersects the planet.

Rare occurrence of a moon casting a soft analytical shadow on a planet. The planet also reflects light onto the moon.

Note that the shadow has a physically correct umbra and penumbra. This is computed in the fragment shader when rendering the planet. The amount of light reaching the planet is calculated as one minus the fraction of the sun disk that is occluded by the moon. The sun is modeled as a circular/disk light source and the moon is modeled as a sphere projecting into a circle along the light vector. You can find the math for such a calculation here.
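A sketch of that calculation in C++ (the function and variable names are mine; the real shader works with angular radii computed from the star and moon sizes and distances):

    #include <algorithm>
    #include <cmath>

    float const PI(3.14159265f);

    // Area of intersection of two circles with radii r1 and r2 whose centers are
    // distance d apart (the standard circle-circle "lens" area formula).
    float circle_intersect_area(float r1, float r2, float d) {
        if (d >= r1 + r2) return 0.0f; // no overlap
        if (d <= std::fabs(r1 - r2)) {float const r(std::min(r1, r2)); return PI*r*r;} // full overlap
        float const d1((d*d + r1*r1 - r2*r2)/(2.0f*d)), d2(d - d1);
        return r1*r1*std::acos(d1/r1) - d1*std::sqrt(r1*r1 - d1*d1)
             + r2*r2*std::acos(d2/r2) - d2*std::sqrt(r2*r2 - d2*d2);
    }

    // sun_r and moon_r are the angular radii of the sun's disk and the moon's
    // projected disk as seen from the shaded point; sep is their angular separation.
    // Returns 1.0 in full light, 0.0 in the umbra, and a smooth falloff in the penumbra.
    float calc_light_fraction(float sun_r, float moon_r, float sep) {
        return 1.0f - circle_intersect_area(sun_r, moon_r, sep)/(PI*sun_r*sun_r);
    }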

Bonus video of asteroid bowling! Here is a video of a planet plowing through the asteroid field at 100x speed, with a moon trailing behind it. I fixed the asteroid belt placement after recording this video.



Nebulae

Nebula rendering is not new to 3DWorld. I've shown images of 3DWorld's nebulae in previous posts such as this one. I recently went back and reworked the shader code that determines the color and transparency of each pixel in the nebula. I made a total of three changes, sketched in code after the list:
  1. Added an octave of low frequency 3D Perlin noise to modulate the density/transparency of the nebula to give it a more random, nonuniform shape rather than looking like a large sphere.
  2. Increased the exponent of the noise from 2.0 to a per-nebula random value between 2.0 and 4.0 to produce stronger contrast between light and dark areas (wispy fingers).
  3. Switched to additive blending for nebulae with high noise exponents, to model emissive gas rather than colored occluding material and give them a brighter appearance.
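Here's a C++-style sketch of the updated density/alpha computation (the real code is GLSL, uses proper multi-octave Perlin noise, and has different constants; everything here is illustrative):

    #include <algorithm>
    #include <cmath>

    // Cheap hash-based placeholder for real 3D Perlin noise; returns [0,1).
    float noise3d(float x, float y, float z) {
        float const v(std::sin(x*12.9898f + y*78.233f + z*37.719f)*43758.5453f);
        return (v - std::floor(v));
    }

    // noise_exp is chosen randomly per nebula in [2.0, 4.0] (change 2).
    float nebula_alpha(float x, float y, float z, float radius, float noise_exp) {
        float density(noise3d(4.0f*x, 4.0f*y, 4.0f*z)); // high frequency detail noise
        density *= noise3d(0.5f*x, 0.5f*y, 0.5f*z); // change 1: low frequency octave breaks up the sphere
        density  = std::pow(density, noise_exp); // change 2: higher exponents produce wispy fingers
        float const dist(std::sqrt(x*x + y*y + z*z));
        return density*std::max(0.0f, (1.0f - dist/radius)); // fade to zero at the bounding radius
    }
    // Change 3: nebulae with high noise exponents are drawn with additive
    // blending, e.g. glBlendFunc(GL_SRC_ALPHA, GL_ONE), to model emissive gas.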
Here are some nebula screenshots. They show the evolution of nebula rendering as I applied my changes to the algorithm. The first two show the original algorithm, the middle two show changes 1 and 2, and the last four images show the final code. The stars in these screenshots are in front of, inside, and behind the nebula.











Keep in mind that nebulae are volumetric objects computed using 3D noise, not just 2D images. They are drawn with 13 crossed billboards, allowing the player to fly in and around them with minimal rendering artifacts. I got the idea from this video.

That's it for nebulae. I'll add some more images if I change the algorithm again in the future. Sorry, I haven't created any nebula videos. The fine color gradients just look horrible after video compression, which ruins the wispy, transparent effect.

Saturday, August 6, 2016

Indirect Lighting for Dynamic Objects

This is a follow-up post to my indirect lighting post of last year. I decided that I wanted moving objects such as doors to also influence indirect lighting in the scene. This is more difficult than handling light sources that can be switched on and off by the player. Moving objects have more than on/open and off/closed states - they pass through all the intermediate positions representing partially open states. Storing only two states isn't enough, and linearly interpolating between them doesn't work for all cases, because the lighting changes continuously as the object moves. Consider a moving object that starts entirely to the left side of an opening through which light can pass, then moves entirely to the right. At both extremes it blocks no light, but at the midpoint of its path it blocks the entire opening, resulting in a dark room. This condition can't be produced by interpolating between the two endpoint states, which both have the same lighting solution (fully lit).

These types of moving objects are called platforms in 3DWorld. They're named after the platforms used for doors and elevators in the Forge map editor for Marathon, a game I played in college long ago. 3DWorld platforms can move in any direction, and can be used for doors, elevators, crushers, machines, etc. Custom triggers can be attached to platforms to control them. These triggers can be activated by the player, or can be proximity sensors triggered by the player or smiley AIs. The example door shown in the images and video below is activated by four player-controlled switches placed on the walls by the door. I even made the switches an emissive yellow color so that they can easily be seen in the dark.

Back to lighting. I briefly considered storing precomputed lighting values for several intermediate points along the platform's motion. There are some problems with this approach. One issue is that a small number of precomputed points doesn't provide a very accurate interpolation across the lighting values as the platform moves. A large number of points takes too much CPU time to compute and too much disk space to store. Also, the number of blocks of saved lighting data increases exponentially as multiple interacting platforms are added. For example, if the scene contains two adjacent doors A and B, they may interact with each other. Door A might block most of the light reaching door B. If they're both in series along the same hallway, light won't reach the end of the hallway unless both doors are open. This is difficult to automatically detect just by looking at the geometry of the doors and the hallway. We instead need to store a minimum of four lighting states: {A and B closed, A and B open, A open B closed, A closed B open}. If there are three doors, we need 8 states; in general, N interacting doors require 2^N states. It quickly gets out of control as the data scales exponentially with the number of doors/platforms.

This problem is similar to the one discussed at the end of this blog post for the game "The Witness". I remember reading about the exponential combination problem on their blog somewhere, but I can't seem to find it now. However, their indirect lighting system is entirely different from the one used in 3DWorld, so the trade-offs are also somewhat different.

My second idea was to cache the rays intersecting any possible position of each platform, and sort out which rays are blocked at runtime, based on the current door position(s). The platform is expanded to cover the union of its possible positions by extending it in a line between its start and end points. This proxy object is added to the bounding volume hierarchy prior to ray tracing. Then, when computing indirect lighting, any ray that could hit the platform in any of its possible positions will hit this proxy geometry. All rays intersecting the proxy are terminated (no longer propagate) and stored in a file on disk. This process is only done once, after which the file is loaded and its data reused. At the end, the proxy is removed and replaced with the actual platform in its initial position. All saved rays are re-cast, and any rays not intersecting the platform's initial position add reflected indirect light to the scene. This additional light "L" represents the initial/nominal lighting of the scene, and is saved to the precomputed indirect lighting file for future use.
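The proxy construction itself is simple. Here's a sketch (cube_t is modeled after 3DWorld's axis-aligned cube type, but the details here are assumptions):

    #include <algorithm>

    struct cube_t {
        float d[3][2]; // {x,y,z} x {lo,hi}
        void union_with_cube(cube_t const &c) {
            for (unsigned i = 0; i < 3; ++i) {
                d[i][0] = std::min(d[i][0], c.d[i][0]);
                d[i][1] = std::max(d[i][1], c.d[i][1]);
            }
        }
    };

    // The proxy covers every position the platform can occupy along its path, so
    // any ray that could ever hit the platform will hit the proxy during tracing.
    cube_t calc_platform_proxy(cube_t const &at_start, cube_t const &at_end) {
        cube_t proxy(at_start);
        proxy.union_with_cube(at_end); // assumes linear motion between the endpoints
        return proxy;
    }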

When the platform moves, the rays need to be re-evaluated to determine which ones are blocked by the platform in its updated position. The simplest approach is to remove the contribution of "L" from the scene and recompute it using the new platform position. While this works, and is simple, it's not a very good solution. Every ray would need to be re-cast every frame the platform is moving. This kills the frame rate, and makes the game unplayable. Clearly, an incremental approach is needed.

The key observation is that the platform moves slowly relative to the frame time, so the lighting changes incrementally. A door doesn't open or close in a single frame. If it takes one second to move across its path, and the game is running at 60 FPS (Frames Per Second), we can spread the lighting update across all 60 frames to get a nice smooth framerate. The trick is to determine which rays change state from blocked to unblocked (or vice versa) between the previous and current frames. This can be done by testing each saved ray against the platform's bounding volume, which is very easy to parallelize across multiple threads. In most cases, the vast majority of rays are either blocked or unblocked in both frames. Only a small fraction of rays will change state, and only these rays need to be re-cast to update the lighting values.
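A sketch of the per-frame reclassification, reusing the cube_t from the sketch above (the ray structure, names, and OpenMP usage are assumptions, not the actual 3DWorld code):

    #include <algorithm>
    #include <cmath>
    #include <vector>

    struct ray_t {float start[3], end[3]; /* plus color/weight, etc. */};

    // Standard ray-box slab test over the segment from start to end.
    bool ray_hits_cube(ray_t const &r, cube_t const &c) {
        float tmin(0.0f), tmax(1.0f);
        for (unsigned i = 0; i < 3; ++i) {
            float const dir(r.end[i] - r.start[i]);
            if (std::fabs(dir) < 1.0E-9f) { // parallel to this slab
                if (r.start[i] < c.d[i][0] || r.start[i] > c.d[i][1]) return false;
                continue;
            }
            float t0((c.d[i][0] - r.start[i])/dir), t1((c.d[i][1] - r.start[i])/dir);
            if (t0 > t1) std::swap(t0, t1);
            tmin = std::max(tmin, t0); tmax = std::min(tmax, t1);
            if (tmin > tmax) return false;
        }
        return true;
    }

    // Find rays whose blocked state differs between the platform's previous and
    // current positions; only these need to be re-cast this frame.
    void find_changed_rays(std::vector<ray_t> const &rays, cube_t const &prev_bv,
                           cube_t const &cur_bv, std::vector<unsigned> &changed)
    {
        #pragma omp parallel for schedule(static)
        for (int i = 0; i < (int)rays.size(); ++i) {
            if (ray_hits_cube(rays[i], prev_bv) != ray_hits_cube(rays[i], cur_bv)) {
                #pragma omp critical
                changed.push_back((unsigned)i);
            }
        }
    }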

[Note that I'm ignoring rays that intersect the platform at different points in the previous and current frames, even though the reflected lighting will change. In practice the error introduced by this is insignificant compared to the magnitude of the transmitted rays, especially if the platform is a dark, non-reflective color. I'm also ignoring rays that reflect off the same platform multiple times, as again their contribution to the full lighting solution should be negligible. Light rays lose their energy quickly when reflecting off multiple diffuse objects.]

Rays that were previously blocked but become unblocked this frame can be transmitted through the scene, and recursively reflected off other objects as they are in the precomputed ray tracing phase. If the same random seeds are used as in the precomputation phase, the rays will be exactly the same, and the lighting will look as if these rays were never blocked in the first place. Any rays that newly become blocked have their weights/colors negated so that they remove light from the scene during ray tracing. The platform is temporarily removed from the bounding volume hierarchy, and ray tracing proceeds as usual with the negative rays. This will cancel out the light that was added when these rays were included in the lighting solution earlier. When the platform moves back to its original position, everything happens in reverse, where all rays have weights negated from what they were in the forward motion of the platform. Therefore, the lighting solution will converge to the original/nominal value once the platform comes to rest. In reality there is a small amount of floating-point error, and maybe some non-determinism from using multiple threads without locking or atomic operations. But, after dozens of door open/close cycles, I can't see any visual difference in the lighting.
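Putting it together, the per-frame update looks something like this (negate_ray_weight() and cast_ray_recursive() are hypothetical stand-ins for the real ray tracing code):

    void negate_ray_weight(ray_t &r);        // flips the sign of the ray's color/weight
    void cast_ray_recursive(ray_t const &r); // recursive diffuse ray tracing step

    // For each changed ray: if it's now blocked, re-cast it with a negated weight
    // to subtract the light it previously added; if it's now unblocked, re-cast it
    // normally. Using the same random seeds as the precompute phase makes the
    // reflected paths identical, so the add/subtract pairs cancel exactly.
    void update_platform_lighting(std::vector<ray_t> const &rays,
                                  std::vector<unsigned> const &changed, cube_t const &cur_bv)
    {
        for (unsigned ix : changed) {
            ray_t ray(rays[ix]);
            if (ray_hits_cube(ray, cur_bv)) {negate_ray_weight(ray);} // newly blocked: remove light
            cast_ray_recursive(ray); // newly unblocked rays add light as in the precompute
        }
    }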

Okay, that's enough text. How about some images? I don't really have anything too exciting to show this time. Here is a screenshot of the basement, with the basement door open. The only light source is the sky and indirect sunlight coming in through the door. Sorry the image is so dark. The door is very small compared to the enormous room, so it doesn't get very bright in here. At least it's realistic lighting for such a room.

Basement with door open, letting the outside indirect light in.

And here is the same viewpoint with the basement door closed.

Basement with door closed, blocking most of the outside indirect light. A small amount of light is leaking from the door.

The basement should be completely black, except for the tiny emissive yellow door switches. The small amount of leaked light on the right side of the door is due to the way the 3D light volume texture is sampled in the fragment shader. Lighting is linearly interpolated across voxels (3D texture pixels), which produces a smooth transition from light to dark across thin objects such as the door. Since the walls are at least one lighting voxel thick, they properly block all of the light.
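A tiny example of why this happens: with hardware linear filtering, a sample taken between a lit voxel and a dark voxel blends the two, so a door thinner than one voxel can never produce a fully dark sample just behind it.

    // GL_LINEAR behavior in one dimension: samples between voxel centers blend them.
    float lerp_sample(float lit_voxel, float dark_voxel, float t) {
        return lit_voxel + t*(dark_voxel - lit_voxel);
    }
    // e.g. lerp_sample(1.0, 0.0, 0.75) == 0.25: a point just inside the closed
    // door still receives 25% of the outside light from interpolation.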

Here is a view from the outside looking into the basement, with the door in the process of closing. The basement is partially lit in this case, where the right side of the basement is slightly brighter than the left side because the door is open on the right.

Closeup of the basement door half-way closed, seen from the outside looking in.

It's easier to see the smooth transition in a video. Lighting is updated incrementally each frame the door is moving. As long as the door moves slowly enough, only a small number of rays need to be recomputed per frame. Lighting updates have a minimal impact on frame rate. This particular door has a total of 96K intersecting light rays and moves over the course of 1.6 seconds, taking an average of only 1.3ms of realtime with 8 threads across 4 CPU cores (0.9ms for ray tracing and 0.4ms for GPU texture update).




I'll hopefully add some more dynamic lighting platforms later, once I get the system properly tuned. This same solution should be general enough that it works for a wide variety of platforms.

The next step is to make this system work with fixed position static light sources such as room lights. It would be interesting to see a closet light that can be turned on and off, so that when the closet door is open and the light is on it indirectly lights the adjacent room. After that, I could try to make this work with dynamic point light sources, such as explosion effects. Of course, I haven't even gotten the regular static indirect lighting working in this case, so it could take significant effort.