

      A Guide to Cinemachine

      Cinemachine is a suite of camera tools for Unity that gives you triple-A-quality controls for every camera in your project. The easy-to-install plugin lets you add functionality to cameras you already have, or create new ones with varying behaviors.


      Making An Authentic Metal Soundtrack: Coffee Crisis Case Study


      How do you achieve such an authentic sound?


           I have compiled a variety of tools over the years. A couple to highlight in this particular soundtrack would be Superior Drummer 2.0 by Toontrack, a virtual drum kit that draws from a massive library of raw drum samples recorded in professional studios, and a fantastic amp simulator by Peavey called Revalver IV. Revalver IV provides a wide array of amplifiers, cabinets, microphones, and effects to choose from, giving you complete tone control at your fingertips. These tools, combined with modern production techniques, can produce rather astounding results.

           Here we see Superior Drummer 2.0 with the microphone channels routed out into separate tracks within Cubase 5.


           The arrows here point out bus channels that control the output of groups of different channels. For example, the first arrow is the master fader for all the mic channels for the kick drum. There are two kick microphones routed to this fader in this example.


           Here we see the Virtual Mix Rack by Slate Digital as a plugin inserted on the kick master channel.


           This is Revalver IV by Peavey. This is one instance that is active on the Left Guitar channel in Cubase 5.

            A quick look at some of the amplifier models available in Peavey's Revalver IV.

      Tell me more about your creative process for composing music in a game.


           The first question asked is always about the type of experience we are trying to create for the consumer. Next comes developing an actionable plan, understanding the tools at our disposal, and beginning to lay the foundation of ideas, melodies, and structure in the case of a soundtrack. When it comes to Coffee Crisis, the idea of using abnormal and rapidly shifting sounds throughout pairs well with the concept of the alien invasion. Keeping in mind the experience you want to create for the consumer will always help you decide whether the content you're creating is content you should use.

       

      What software tools do you use for composition?


           The products I use are as follows:


                - Superior Drummer 2.0 by Toontrack

                - Omnisphere and Trilogy by Spectrasonics

                - Plugins by Waves

                - Plugins by FabFilter

                - Plugins by Slate Digital

                 - Revalver IV by Peavey

            Here we see two plugins by FabFilter that I often use. On the left is the Pro-Q; on the right, the Pro-L.


           Here are two plugins by Slate Digital that I frequently use on my master fader. On top is the Virtual Tape Machines plugin and below is the Virtual Bus Compressor model FG-Grey.

       

      Walk me through the other tools in your production process.


           One tool worth talking about is automation. Every modern Digital Audio Workstation, otherwise referred to as a DAW, has some form of automation. What it allows for is change over time within the project you are creating. Change over time is an essential component of an effective production, whether it is music, a film, a book, or a game.
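Conceptually, a DAW automation lane is just a function from time to a parameter value, usually drawn as breakpoints with linear ramps between them. A minimal sketch in Python (the breakpoints here are a hypothetical volume fade, not taken from the Coffee Crisis sessions):

```python
def automate(breakpoints, t):
    """Piecewise-linear automation: breakpoints are (time, value) pairs,
    sorted by time. Returns the interpolated parameter value at time t."""
    if t <= breakpoints[0][0]:
        return breakpoints[0][1]
    for (t0, v0), (t1, v1) in zip(breakpoints, breakpoints[1:]):
        if t0 <= t <= t1:
            # Linear ramp between the two surrounding breakpoints.
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    return breakpoints[-1][1]

# Hypothetical fade-out: unity gain at 0 s, silence by 4 s.
fade_out = [(0.0, 1.0), (4.0, 0.0)]
print(automate(fade_out, 2.0))  # → 0.5, halfway through the fade
```

Real DAWs offer curved ramps and many automatable targets (volume, pan, plugin parameters), but the "value changing over time" idea is the same.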


      What do you think brings a video game soundtrack to the next level?


           The secret to a successful soundtrack is having a distinct identity without detracting from the experience itself. Finding the balance between bland and overbearing can take time, but it is always worth the effort.


      How do you balance a soundtrack & sound design when composing music for a game?


           Similar to an earlier response, it is all about focusing on the kind of experience you are trying to create. I would also add that it's never a bad idea to get outside opinions during the process. A fresh pair of ears always has the opportunity to reveal things about the soundtrack.


      How do you know Mitch Foster?


           Mitch and I first met in high school as a result of our mutual interest in playing guitar. Consequently, a band was formed that continued for many years. The soundtrack for Coffee Crisis is primarily re-imagined versions of a variety of songs that we had initially composed in the years following high school, a fun fact that few know.

       

      How can we contact Fist First Records for more information?


           Anyone can contact Fist First Records as follows:


           Email: fistfirstrecords@gmail.com


           Facebook: www.facebook.com/fistfirstrecords/

      Juicier Teleporters & Filmic Color Space


      Classic 2D pixel art is timeless, but modern games add the flourishes of dynamic lighting, shaders, particle systems, and post-processing effects like chromatic aberration and bloom. During the creation of Bite the Bullet, some of our senior developers had an immersive round table on how they achieved the layered lighting effect for the teleporters used in-game.

      I have to say that seeing them in action brings the process to a whole new level. For demonstration's sake, we'll keep the reference images as single still captures, and kick off with a gif animation of one of my favorite teleporters.

       

      Let’s examine the problem with the teleporter example below, with the glow emanating from the center:
       
      Exhibit A: Linear Teleportation
       

      In this example, the glow stems from an emissive material with the intensity cranked way up. It looks “just okay,” but does provide a sweet sci-fi sense of luminescence that any neo-futuristic teleporter would have.

       

      Yet, there is a way to make it look better.

       Exhibit B: Filmic Teleportation 

       

      The difference is that in the former image, the highly saturated color caps out as solid cyan. In the latter, it starts turning white, which is how light naturally behaves as its intensity increases. The former's restriction fundamentally depends on how Unity's rendering pipeline handles color by default. Our focus here is the 'color space' of the rendered output.

       

      Introducing: Filmic Color Space

       

      Filmic color space accurately represents the behavior of 'lightness.' If you use Blender 2.79 or later, you may be familiar with it, as the default color space starting at that version is Filmic. In Unity, you can achieve the same look by using a tone mapper in the post-processing stack.
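The cyan-caps-out-versus-rolls-to-white behavior can be illustrated numerically with a commonly used filmic curve (Krzysztof Narkowicz's ACES approximation, an assumption here for illustration, not necessarily the exact curve Unity's tone mapper ships):

```python
def aces_tonemap(x):
    # Narkowicz's ACES filmic fit: maps HDR intensity [0, inf) into [0, 1],
    # compressing bright values toward white instead of hard-clipping them.
    return min(max((x * (2.51 * x + 0.03)) / (x * (2.43 * x + 0.59) + 0.14), 0.0), 1.0)

def clamp01(x):
    # What a plain linear pipeline effectively does: hard clip at 1.0.
    return min(max(x, 0.0), 1.0)

def saturation(rgb):
    # 0 = gray/white, 1 = fully saturated.
    return (max(rgb) - min(rgb)) / max(rgb)

hdr_glow = (0.5, 4.0, 4.0)  # hypothetical over-bright cyan emissive color
linear_out = tuple(clamp01(c) for c in hdr_glow)       # caps out at solid cyan
filmic_out = tuple(aces_tonemap(c) for c in hdr_glow)  # rolls off toward white

print(saturation(linear_out) > saturation(filmic_out))  # → True
```

The clamped result stays a hard (0.5, 1, 1) cyan, while the filmic curve lifts the red channel and compresses the others near 1.0, desaturating the glow toward white just as the second screenshot shows.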

       

       Filmic Blender was created by Troy Sobotka, an industry professional and camera enthusiast who saw the problem with Blender's default color space and wanted to fix it. So he coded Filmic Blender, a color management configuration similar to ACES, the industry standard for VFX.

       

      Please note, this affects the color of everything. For example, comparing the floor colors in the two images above, you’ll notice that one is slightly darker. Switching to a new tone mapper has a substantial effect on all colors. If you want your art to look in the game view as it was drawn, you need to make this adjustment early in the asset creation process, so all colors are selected and applied as planned.

       

      In our game, all of the VFX were made in the previous, linear color space. To achieve filmic-looking effects, we added bright white layering, and the contrast between colors was chosen in the context of the linear color space. Viewed in filmic, they'd look extremely overblown.

       

      Similarly, much of the environment would look too dark, because it was made under the assumption it would be viewed in a linear color space - the same color space the artists used when producing the art. In the same way, if you were composing a scene in Blender using linear color space and switched it to filmic, you would have to redo all of your lighting to accommodate the change. You could choose not to change anything, but the overall look would be very different from what you intended.

       

      The Bottom Line

       

      Filmic color space will accurately and dynamically represent light behavior in a way that looks like HDR, but the removal of linear color space means that the color you see for a PNG asset in Windows Photo Viewer is not what you get in-game. Filmic color space is more realistic due to its higher dynamic range and translates to a more cinematic look. Even on a non-HDR display, the effect still comes through.

      If you compare linear to filmic, linear usually feels washed out in comparison. But it's not as simple as contrast or saturation. As you saw with the light example above, the behavior of color itself changes, especially as you get into extremely bright and saturated colors.

      Effective lighting can give a simple, unfinished space a sense of life and ambiance. In pixel art games, light can provide a level of polish that elevates the entire scene, giving it a modern feel without losing the purity and the details.

      Lighting doesn’t necessarily mean using light source objects in your engine. We prefer options like overlays and shaders to achieve a dynamically lit look without dynamic light sources present. One of the most interesting takeaways from our filmic round table was that, for us, a better understanding of and command over lighting may be the best way to provide the most effective visual feedback to a player once everything else is standardized. This is likely one of the reasons people think UE4 looks better than Unity by default: UE4 uses filmic by default, while Unity uses linear color space, aka sRGB.

       

      Your Filmic Workflow

       

      Applying filmic color space effectively in your Unity project requires adjusting your workflow and planning visuals around a specific color space on both the art and development sides.

       

      There is a particular workflow for creating art assets for use in a filmic color space context, such that what the artist sees on their canvas is what you get in the game, 1 to 1. If Unity's filmic tone mapper is accurate, then using a filmic LUT profile in an image editor that supports it should achieve that. For example, the graphics editor Krita has a LUT management system called OpenColorIO, which takes an OCIO profile to determine your viewing color space.

       

      Take this film-emulsion-like camera rendering tool for Blender, so awesomely shared by Sobotka on GitHub:

       

      https://github.com/sobotka/filmic-blender/blob/master/config.ocio

       

      Once configured, you will then be viewing your canvas in the filmic color space, so what you see is what you get in-game. 

       

      The cost/drawback is that every artist involved needs a workflow compatible with using a different color space - e.g., if their editor of choice doesn't support it, that's a problem. You can simply have all art made in the standard linear color space, but none of it will look exactly as drawn when rendered in the filmic color space.

       

      Here's a tip when making VFX, assuming you are staying in linear/sRGB color space:

       

      Make the first iteration in linear color space, switch to filmic, and take a snapshot of it. Switch back to linear and try to recreate the filmic look by layering and adjusting colors. You can never recreate it 100%, but you can get 50-70% of the way there.
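The "bright white layering" half of this trick can be sketched with a screen blend, which lifts dark channels toward white without hard clipping. The layer values below are illustrative assumptions, not our actual asset settings:

```python
def screen_blend(base, layer):
    # Screen blending: result = 1 - (1 - a)(1 - b).
    # Brightens the base without ever exceeding 1.0 per channel.
    return tuple(1.0 - (1.0 - b) * (1.0 - l) for b, l in zip(base, layer))

core = (0.0, 1.0, 1.0)        # saturated cyan glow, as drawn in linear space
white_glow = (0.6, 0.6, 0.6)  # hypothetical extra white layer over the hot center
faux_filmic = screen_blend(core, white_glow)
print(faux_filmic)  # red channel lifted toward white, imitating filmic roll-off
```

Stacking one or two such layers over the brightest parts of the effect is what gets you into that 50-70% range of the true filmic look.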

       

      As an example below:

       Exhibit C: Our teleporter in linear
       

      Then, after:

       
      Exhibit D:  Faux-Filmic

       

      Filmic for reference:

       
      Exhibit E: True Filmic

       

      It’s close! It took a second iteration, around twice as much time in total, but it gets strikingly close to our desired effect. You have to apply a whole other layer of problem-solving and creative decisions to reach the point filmic gives you by default, and that's only if you're faking lighting like we are with our emissive materials. Additionally, if you animate or tween VFX (often the case in our projects), you’ll have more layers than normal to animate, and there will be a performance hit from more VFX-related objects rendering. If you are using actual real-time lighting, your mileage may vary in achieving a filmic look and feel while remaining in linear color space.

       

      Our 2D projects pretty much never use real-time lighting, for performance reasons. There are some cost considerations in achieving this level of juicy lighting, but for many, it’s a toll happily paid.

       

      If you have a lot of dev-side VFX, especially complex ones like particle systems, the added art cost of using filmic is worth it. Otherwise, if flashy VFX are rare and/or simple, it's probably better to fake the filmic look on the dev side with tricks. The main time cost in our consideration above lies in the art workflow and in existing assets already created in a different color space.

       

      Adjusting your art toolchain the first time, and then getting familiar with the adjusted workflow, can take some time in and of itself. Most artists have a personal preference for one or two programs (Photoshop, GIMP, Aseprite, GrafX2, GraphicsGale, etc.), but will adopt a specialist set of brushes or programs for a given project due to a unique feature. For example, some of our pixel artists prefer to animate pixel art in Aseprite simply because of features like onion-skinning (not to mention the simplicity of importing sprite sheets into Unity using .ase files).

       

      Sound Design for Video Games: A Primer

      The goal of a sound engineer is to work with the design and animation teams to create audio that fits the virtual world. Let's analyze some audio samples I created for Bite the Bullet, and how they support the game’s feel, theme, and universe. This blog will teach you basic sound design skills and theory.
