Juicier Teleporters & Filmic Color Space
Classic 2D pixel art is timeless, but modern games add flourishes like dynamic lighting, shading, particle systems, and post-processing effects such as chromatic aberration and bloom. During the creation of Bite the Bullet, some of our senior developers held a round table on how they achieved the layered lighting effect for the teleporters used in-game.
I have to say that seeing them in action brings the process to a whole new level. For demonstration's sake, we'll keep the reference images as single still captures, and kick off with a gif animation of one of my favorite teleporters.
In this example, the glow stems from an emissive material with a ton of effervescence cranked up on it. It looks "just okay," but it does provide a sweet sci-fi sense of luminescence that any neo-futuristic teleporter would have.
Yet, there is a way to make it look better.
The difference is that in the former image, the highly saturated color caps out as a solid cyan. In the latter, it starts turning white, which is how light naturally behaves as its intensity increases. The former's limitation comes down to how Unity's rendering pipeline handles color by default. Our focus here is the 'color space' of the rendered output.
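To make that concrete without leaving the keyboard: the full ACES/filmic transform is more involved than this, but Krzysztof Narkowicz's popular one-line approximation of the ACES curve is enough to show the shape of the behavior. The snippet below is a standalone illustration of ours, not code from the game:

```csharp
// Standalone sketch: compare plain clamping with a popular approximation of the
// ACES filmic curve (Krzysztof Narkowicz) for a very bright, saturated cyan.
using System;

static class FilmicSketch
{
    // Narkowicz's ACES approximation: maps HDR linear values into [0, 1] with a filmic rolloff.
    static float AcesFilmic(float x) =>
        Math.Clamp((x * (2.51f * x + 0.03f)) / (x * (2.43f * x + 0.59f) + 0.14f), 0f, 1f);

    static void Main()
    {
        // A hot emissive cyan in linear HDR: a little red, lots of green and blue.
        float r = 0.2f, g = 4f, b = 4f;

        // Default behavior: channels simply clamp, so the pixel stays a flat, saturated cyan.
        Console.WriteLine($"Clamped: ({Math.Min(r, 1f)}, {Math.Min(g, 1f)}, {Math.Min(b, 1f)})");

        // Filmic behavior: the hot channels roll off smoothly while the dim channel is
        // lifted relatively more, so the result is less saturated -- i.e. closer to white.
        Console.WriteLine($"Filmic:  ({AcesFilmic(r):F2}, {AcesFilmic(g):F2}, {AcesFilmic(b):F2})");
    }
}
```

The clamped result stays at roughly (0.2, 1, 1), while the filmic result comes out around (0.30, 0.97, 0.97): noticeably less saturated, which on screen reads as the glow burning toward white.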
Introducing: Filmic Color Space
Filmic color space accurately represents the behavior of 'lightness.' If you use Blender 2.79 or later, you may already be familiar with it, as Filmic is the default color space from that version onward. In Unity, you can achieve the same look by using a tone mapper in the post-processing stack.
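With the Post Processing Stack v2 package, that usually just means adding a Color Grading effect and setting its Tonemapping mode to ACES in the inspector. If you prefer to do it from script, a minimal sketch looks like the following (the class name and setup are our own illustration, and the camera still needs a Post-process Layer whose mask covers this object for the volume to take effect):

```csharp
// Minimal sketch assuming the Post Processing Stack v2 package
// (UnityEngine.Rendering.PostProcessing). The camera also needs a
// PostProcessLayer component whose layer mask includes this object.
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

public class EnableFilmicTonemapping : MonoBehaviour
{
    void Start()
    {
        // Build a profile whose Color Grading effect uses the ACES ("filmic") tonemapper.
        var profile = ScriptableObject.CreateInstance<PostProcessProfile>();
        var grading = profile.AddSettings<ColorGrading>();
        grading.enabled.Override(true);
        grading.gradingMode.Override(GradingMode.HighDefinitionRange);
        grading.tonemapper.Override(Tonemapper.ACES);

        // Apply it globally via a post-process volume on this object.
        var volume = gameObject.AddComponent<PostProcessVolume>();
        volume.isGlobal = true;
        volume.priority = 100f;
        volume.profile = profile;
    }
}
```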
Filmic Blender was created by Troy Sobotka, an industry professional and camera enthusiast who saw the problem with Blender's default color space and wanted to fix it, so he coded Filmic Blender: a color management configuration similar to ACES, the industry standard for VFX.
Please note, this affects the color of everything. For example, comparing the floor colors in the above two images, you'll notice that one is slightly darker. With this adjustment to a new tone mapper, the effect on all colors is substantial. If you want your art to look in the game view the way it was drawn, you need to make this adjustment early in the asset creation process, so all colors are selected and applied as planned.
In our game, all of the VFX were made in the previous linear color space. To achieve filmic-looking effects, we added bright white layering, and the contrast between colors was chosen in the context of the linear color space. When viewed in filmic, they'd look extremely overblown.
Similarly, much of the environment would look too dark because it was made under the assumption that it would be viewed in a linear color space - the same color space the artists used when producing the art. In the same way, if you were composing a scene in Blender using linear color space and changed it to filmic color space, you would have to redo all of your lighting to accommodate this. You could choose not to change anything to match, but the overall look you get would be very different from what you intended.
The Bottom Line
Filmic color space will accurately and dynamically represent light behavior in a way that looks like HDR, but moving away from linear color space means that the color you see for a PNG asset in Windows Photo Viewer is not what you get in-game. Filmic color space is more realistic due to its higher dynamic range and translates to a more cinematic look. Even if viewed on a non-HDR display, the effect is still conveyed.
If you compare linear to filmic, linear usually feels washed out. But it's not as simple as contrast or saturation; as you see with the light example above, the behavior of color itself changes, especially as you get into extremely bright and saturated colors.
Effective lighting can give a simple, unfinished space a sense of life and ambiance. In pixel art games, light can provide a level of polish that elevates the entire scene, giving it a modern feel without losing the purity and detail.
Lighting doesn't necessarily mean using light source objects in your engine. We prefer to use options like overlays and shaders to achieve a dynamically lit look despite not having dynamic light sources present. One of the most interesting takeaways from our filmic roundtable was that, for us, a better understanding and command of lighting may be the best way to provide effective visual feedback to the player once everything else is already standardized. This is likely one of the reasons people think UE4 looks better than Unity by default: UE4 applies a filmic (ACES) tone mapper out of the box, while Unity's default output is plain linear/sRGB.
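To illustrate what we mean by "overlays" (a made-up sketch, not the actual setup from Bite the Bullet): a second sprite layered over the base art, typically with an additive material assigned in the inspector, pulsed from script, reads as a dynamic light without a single real-time light source in the scene.

```csharp
// Hypothetical "fake light" overlay: an extra sprite rendered on top of the base
// art (e.g. with an additive material assigned in the inspector) whose tint is
// pulsed from script. No real-time light sources involved.
using UnityEngine;

public class FakeGlowOverlay : MonoBehaviour
{
    [SerializeField] SpriteRenderer overlay;        // the overlay sprite sitting above the base art
    [SerializeField] Color glowColor = Color.cyan;  // base tint of the fake light
    [SerializeField] float pulseSpeed = 2f;         // how fast the glow "breathes"

    void Update()
    {
        // Oscillate between a dim and a bright version of the glow color.
        float t = (Mathf.Sin(Time.time * pulseSpeed) + 1f) * 0.5f;
        overlay.color = Color.Lerp(glowColor * 0.4f, glowColor, t);
    }
}
```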
Your Filmic Workflow
Applying filmic color space effectively in your Unity project requires adjusting your workflow and planning your visuals around a specific color space, on both the art and development side.
There is a particular workflow for creating art assets for use in a filmic color space context, such that what the artist sees on their canvas is what you get in the game, 1 to 1. If Unity's filmic tone mapper is accurate, then using a filmic LUT profile in an image editor that supports it should achieve that. For example, the graphics editor Krita has LUT management through something called OpenColorIO, which takes an OCIO profile to determine your viewing color space.
Take this Film Emulsion-Like Camera tool for Blender, so awesomely shared by Sobotka on GitHub:
https://github.com/sobotka/filmic-blender/blob/master/config.ocio
Once configured, you will then be viewing your canvas in the filmic color space, so what you see is what you get in-game.
The cost/drawback is that all of the artists involved would need a workflow set up to be compatible with the different color space. E.g., if their editor of choice doesn't support it, that's a problem. You can simply choose to have all art made in standard linear color space, but none of it will look exactly as drawn when rendered in the filmic color space.
Here's a tip when making VFX, assuming you are staying in linear/sRGB color space:
Make the first iteration in linear color space, switch to filmic, and take a snapshot of it. Switch back to linear and try to recreate the filmic look using layering and adjusting colors. You can never 100% recreate it, but you can get close in the 50-70% range.
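If you end up doing part of that adjustment procedurally rather than purely by hand, the core of the trick is simply pushing the brightest areas toward white yourself. A hypothetical helper sketching that idea (our own illustration, not a tool we ship):

```csharp
// Hypothetical helper illustrating the "bright white layering" idea in code:
// blend a color toward white as its brightness nears the top of the range,
// roughly imitating the filmic rolloff while staying in ordinary sRGB/linear output.
using UnityEngine;

public static class FakeFilmic
{
    public static Color WhitenHighlights(Color c, float threshold = 0.7f)
    {
        // Perceived brightness using simple luma weights.
        float luma = 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b;

        // Below the threshold, leave the color alone; above it, blend toward white.
        float t = Mathf.InverseLerp(threshold, 1f, luma);
        return Color.Lerp(c, Color.white, t * t); // squared for a softer onset
    }
}
```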
As an example below:
Then, after:
Filmic for reference:
It's close! It took a second iteration, around twice as much time in total, but it gets strikingly close to our desired effect. You have to apply a whole other layer of problem-solving and creative decisions to reach the point that filmic reaches by default, and that's only if you're faking lighting like we are with our emissive materials. Additionally, if you animate or tween VFX (which is often the case with our projects), you'll have more layers than normal to animate, and there will be a performance hit from more VFX-related objects rendering. If you are using actual real-time lighting, your mileage may vary in achieving a filmic look and feel while remaining in linear color space.
Our 2D projects pretty much never use real-time lighting, for performance reasons. There are some cost considerations to achieve this level of juicy lighting, but for many, it's a toll happily paid.
If you have a lot of dev-side VFX, especially complex ones like particle systems, taking on the art cost of using filmic would be worth it. Otherwise, if flashy VFX are rare and/or simple, it's probably better to just fake the filmic look on the dev side with tricks. The main time cost in our consideration above is the art workflow and the existing assets that were already created in a different color space.
Adjusting your art toolchain for the first time, and then getting familiar with the adjusted workflow, can take some time in and of itself. Most artists have a personal preference for one or two programs (Photoshop, GIMP, Aseprite, GrafX2, GraphicsGale, etc.), but will adopt a specialist set of brushes or programs for a given project because of a unique feature. For example, some of our pixel artists prefer to animate pixel art in Aseprite simply because of features like onion-skinning (not to mention the simplicity of importing sprite sheets into Unity using .ase files).