
      Game Development — Mega Cat Studios

      A Guide to Cinemachine
      Cinemachine is a suite of camera tools for Unity that gives you triple-A-quality control over every camera in your project. The easy-to-install plugin lets you add functionality to cameras you already have, or create new ones with varying behaviors.
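      To make that concrete, here is a minimal sketch of driving a camera with Cinemachine, assuming the Cinemachine package is installed and the main camera has a CinemachineBrain component; the class and field names below are illustrative, not taken from the guide itself.

```csharp
using UnityEngine;
using Cinemachine;

// Minimal sketch: create a virtual camera at runtime that follows and frames a target.
// Assumes the Cinemachine package is installed and the main camera has a CinemachineBrain.
public class FollowCameraSetup : MonoBehaviour
{
    [SerializeField] private Transform target;   // e.g. the player

    private void Start()
    {
        var vcam = new GameObject("CM vcam").AddComponent<CinemachineVirtualCamera>();
        vcam.Follow = target;   // the virtual camera tracks this transform
        vcam.LookAt = target;   // and keeps it framed
        vcam.Priority = 10;     // the highest-priority vcam drives the main camera
    }
}
```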

      Read more

      Making An Authentic Metal Soundtrack: Coffee Crisis Case Study

      How do you achieve such an authentic sound?


      I have compiled a variety of tools over the years. A couple to highlight in this particular soundtrack would be Superior Drummer 2.0 by Toontrack, a virtual drum kit that draws its samples from a massive library of raw drum audio recorded in professional studios, as well as a fantastic amp simulator made by Peavey called Revalver IV. Revalver IV provides a wide array of amplifiers, cabinets, microphones, and effects to select from, giving you complete tone control at your fingertips. These tools, combined with modern production techniques, can produce rather astounding results.

      Here we see Superior Drummer 2.0 with the microphone channels routed out into separate tracks within Cubase 5.


      The arrows here point out bus channels that control the output of groups of different channels. For example, the first arrow is the master fader for all the mic channels for the kick drum. There are two kick microphones routed to this fader in this example.


      Here we see the Virtual Mix Rack by Slate Digital as a plugin inserted on the kick master channel.


      This is Revalver IV by Peavey, with one instance active on the Left Guitar channel in Cubase 5.

      A quick look at some of the amplifier models available in Peavey's Revalver IV.

      Tell me more about your creative process for composing music in a game.


      The first question asked is always about the type of experience we are trying to create for the consumer. Next comes developing an actionable plan, understanding the tools at our disposal, and beginning to lay the foundation of ideas, melodies, and structure in the case of a soundtrack. When it comes to Coffee Crisis, the idea of using abnormal and rapidly shifting sounds throughout pairs well with the concept of the alien invasion. Keeping in mind the experience you want to create for the consumer will always help you decide whether the content you're creating is content you should use.

       

      What software tools do you use for composition?


      An itemized list of the products I use is as follows:


      - Superior Drummer 2.0 by Toontrack

      - Omnisphere and Trilogy by Spectrasonics

      - Plugins by Waves

      - Plugins by FabFilter

      - Plugins by Slate Digital

      - Revalver IV by Peavey

      Here we see two plugins I often use by FabFilter. On the left is the Pro-Q. On the right, the Pro-L.


      Here are two plugins by Slate Digital that I frequently use on my master fader. On top is the Virtual Tape Machines plugin and below is the Virtual Bus Compressor model FG-Grey.

       

      Walk me through the other tools in your production process.


      One tool to talk about would be automation. Every modern Digital Audio Workstation, otherwise referred to as a DAW, has some form of automation to use. What this allows for is change over time within the project you are creating. Change over time is an essential component of an effective production, whether it is music, a film, a book, or a game.


      What do you think brings a video game soundtrack to the next level?


      The secret to a successful soundtrack is the ability to have a distinct identity without detracting from the experience itself. Finding the balance between bland and overbearing can take time, but it is always worth the effort.


      How do you balance a soundtrack & sound design when composing music for a game?


      Similar to an earlier response, it is all about focusing on the kind of experience you are trying to create. I would also add that it's never a bad idea to get outside opinions during the process. A fresh pair of ears always has the opportunity to reveal things about the soundtrack.


      How do you know Mitch Foster?


      Mitch and I first met in high school as a result of our mutual interest in playing guitar. Consequently, a band was formed that continued for many years. The soundtrack for Coffee Crisis is primarily re-imagined versions of a variety of songs that we had initially composed in the years following high school, a fun fact that few know.

       

      How can we contact Fist First Records for more information?


      Anyone can contact Fist First Records as follows:


      Email: fistfirstrecords@gmail.com


      Facebook: www.facebook.com/fistfirstrecords/

      Juicier Teleporters & Filmic Color Space

      Classic 2D pixel art is timeless, but modern games add the flourishes of dynamic lighting, shaders, particle systems, and post-processing effects like chromatic aberration and bloom. During the creation of Bite the Bullet, some of our senior developers held a round table on how they achieved the layered lighting effect for the teleporters used in-game.

      I have to say that seeing them in action takes the process to a whole new level. For demonstration's sake, we’ll keep the reference images as single still captures, and kick things off with a gif animation of one of my favorite teleporters.

      Let’s examine the problem with the teleporter example below:
      With the glow emanating from the center:
      Exhibit A: Linear Teleportation

      In this example, the glow stems from an emissive material with the emission cranked way up. It looks “just okay,” but it does provide a sweet sci-fi sense of luminescence that any neo-futuristic teleporter would have.
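      As a rough illustration of that kind of setup (not the exact material used in Bite the Bullet), here is a hedged sketch of pushing a material's emission into HDR range, assuming Unity's built-in Standard shader; the component and field names are hypothetical.

```csharp
using UnityEngine;

// Hypothetical sketch: push a material's emission into HDR range so a teleporter
// glow reads as self-lit. Assumes the built-in Standard shader, which exposes the
// _EMISSION keyword and the _EmissionColor property.
public class TeleporterGlow : MonoBehaviour
{
    [SerializeField] private Renderer glowRenderer;
    [SerializeField] private Color glowColor = Color.cyan;
    [SerializeField] private float intensity = 4f;   // values > 1 enter HDR territory

    private void Start()
    {
        // .material instantiates a copy, so other renderers sharing the asset are unaffected.
        var mat = glowRenderer.material;
        mat.EnableKeyword("_EMISSION");
        mat.SetColor("_EmissionColor", glowColor * intensity);
    }
}
```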

       

      Yet, there is a way to make it look better.

      Exhibit B: Filmic Teleportation

       

      The difference is that in the former image, the highly saturated color caps out as a solid cyan. In the latter, it starts turning white, which is how light naturally behaves as its intensity increases. The former's restriction fundamentally comes down to how Unity's rendering pipeline handles color by default. Our focus here is the 'color space' of the rendering output.

       

      Introducing: Filmic Color Space

       

      Filmic color space accurately represents the behavior of 'lightness.' If you use Blender 2.79 or later, you may be familiar with it, since Filmic is the default color space starting with that version. In Unity, you can achieve the same look by using a tone mapper in the post-processing stack.
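      As one possible sketch of what that looks like with the Post-Processing Stack v2 package (not necessarily the exact setup used on Bite the Bullet), the snippet below spawns a global volume with an ACES tone mapper at runtime; in practice you would normally configure this in a PostProcessProfile asset instead.

```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;   // Post-Processing Stack v2

// Hedged sketch: enable a filmic-style (ACES) tone mapper via a global
// post-process volume created at runtime. Assumes the camera has a
// PostProcessLayer whose mask includes this GameObject's layer.
public class EnableFilmicTonemapping : MonoBehaviour
{
    private PostProcessVolume volume;

    private void Start()
    {
        var tonemapping = ScriptableObject.CreateInstance<Tonemapping>();
        tonemapping.enabled.Override(true);
        tonemapping.tonemapper.Override(Tonemapper.ACES);   // filmic-style response curve

        // QuickVolume builds a temporary global volume holding the effect.
        volume = PostProcessManager.instance.QuickVolume(gameObject.layer, 100f, tonemapping);
    }

    private void OnDestroy()
    {
        // Clean up the runtime volume, its profile, and its GameObject.
        RuntimeUtilities.DestroyVolume(volume, true, true);
    }
}
```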

       

      Filmic Blender was created by Troy Sobotka, an industry professional and camera enthusiast who saw the problem with the default color space in Blender and wanted to fix it. So he coded Filmic Blender - a color management configuration similar to ACES, which is the industry standard for VFX.

       

      Please note, this affects the color of everything. For example, comparing the floor colors in the two images above, you’ll notice that one is slightly darker. With this switch to a new tone mapper, the effect on all colors is substantial. If you want your art to look in the game view exactly as it was drawn, you need to make this adjustment early in the asset creation process, so all colors are selected and applied as planned.

       

      In our game, all of the VFX were made in the previous linear color space. To achieve filmic-looking effects, we added bright white layering, and the contrast between colors was chosen in the context of the linear color space. When viewed in filmic, they'd look extremely overblown.

       

      Similarly, much of the environment would look too dark because it was made under the assumption it would be viewed in a linear color space - the same color space the artists used when producing the art. In the same way, if you were composing a scene in Blender using linear color space and changed it to filmic color space, you would have to redo all of your lighting to accommodate this. You could choose not to change anything to match, but the overall look you get would be very different from what you intended.

       

      The Bottom Line

       

      Filmic color space will accurately and dynamically represent light behavior in a way that looks like HDR, but the removal of linear color space means that the color you see for a PNG asset in Windows Photo Viewer is not what you get in-game. Filmic color space is more realistic due to its higher dynamic range and translates to a more cinematic look. Even if viewed on a non-HDR display, the effect is still conveyed.

      If you compare linear to filmic, linear usually feels washed out in comparison. But it's not as simple as contrast or saturation. As you can see with the light example above, the behavior of color itself changes, especially as you get into extremely bright and saturated colors.

      Effective lighting can give a simple and unfinished space a sense of life and ambiance. In pixel art games, light can provide a level of polish that elevates the entire scene, giving it a modern feel without losing the purity and detail.

      Lighting doesn’t necessarily mean using light source objects in your engine. We prefer to use options like overlays and shaders to achieve a dynamically lit look despite not having dynamic light sources present. One of the most interesting takeaways from our filmic roundtable was that, for us, a better understanding and command of lighting may be the best way to provide effective visual feedback to a player once everything else is standardized. This is likely one of the reasons why people think UE4 looks better than Unity by default: UE4 uses a filmic tone mapper out of the box, and Unity uses linear color space by default, aka sRGB.
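      As a loose illustration of the overlay idea (our actual effects are more involved, and the names below are hypothetical), a glow sprite layered over the scene can simply be pulsed over time to suggest a light source without any real-time lights:

```csharp
using UnityEngine;

// Hypothetical sketch: fake a dynamically lit look by pulsing an overlay
// sprite's alpha instead of using a real-time light source.
public class FakeGlowOverlay : MonoBehaviour
{
    [SerializeField] private SpriteRenderer overlay;   // glow sprite layered over the scene
    [SerializeField] private float speed = 2f;
    [SerializeField] private float minAlpha = 0.4f;
    [SerializeField] private float maxAlpha = 0.9f;

    private void Update()
    {
        // Oscillate between min and max alpha; no Light components involved.
        float t = (Mathf.Sin(Time.time * speed) + 1f) * 0.5f;
        Color c = overlay.color;
        c.a = Mathf.Lerp(minAlpha, maxAlpha, t);
        overlay.color = c;
    }
}
```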

       

      Your Filmic Workflow

       

      Applying filmic color space effectively in your Unity project requires an adjustment to your workflow, and some planning of visuals on both the art and development side around a specific color space.

       

      There is a particular workflow for creating art assets for use in a filmic color space context, such that what the artist sees on their canvas is what you get in the game, 1 to 1. If Unity's filmic tone mapper is accurate, then using a filmic LUT profile in an image editor that supports it can achieve that. For example, the graphics editor Krita has a LUT management system through OpenColorIO, which takes an OCIO profile to determine your viewing color space.

       

      Take this film emulsion-like camera tool for Blender, so awesomely shared by Sobotka on GitHub:

       

      https://github.com/sobotka/filmic-blender/blob/master/config.ocio

       

      Once configured, you will then be viewing your canvas in the filmic color space, so what you see is what you get in-game.

       

      The cost/drawback is that all involved artists would need to have a workflow set up in a way that's compatible with using a different color space. E.g., if their editor of choice doesn't support it, that's a problem. You can simply choose to have all art made in standard linear color space, but none of it will look exactly as drawn when rendered within the filmic color space.

       

      Here's a tip when making VFX, assuming you are staying in linear/sRGB color space:

       

      Make the first iteration in linear color space, switch to filmic, and take a snapshot of it. Switch back to linear and try to recreate the filmic look using layering and adjusting colors. You can never recreate it 100%, but you can get close, in the 50-70% range.

       

      As an example below:

      Exhibit C: Our teleporter in linear

      Then, after:

      Exhibit D: Faux-Filmic

       

      Filmic for reference:

      Exhibit E: True Filmic

       

      It’s close! It took a second iteration, around twice as much time in total, but it gets strikingly close to our desired effect. You have to use a whole other layer of problem-solving and creative decisions to get to the same point that filmic reaches by default, and that's only if you're faking lighting like we are with our emissive materials. Additionally, if you animate or tween VFX (which is often the case with our projects), you’ll have more layers than normal to animate, and there will be a performance hit from more VFX-related objects rendering. If you are using actual real-time lighting, your mileage may vary in achieving a filmic color space look and feel while remaining in linear color space.

       

      Our 2D projects pretty much never use real-time lighting, for performance reasons. There are some cost considerations to achieve this level of juicy lighting, but for many, it’s a toll happily paid.

       

      If you have a lot of dev-side VFX, especially with high complexity like particle systems, adding the art cost of using filmic would be worth it. Otherwise, if flashy VFX are rare and/or simple, it would probably be better to just fake the filmic look on the dev side with tricks. The main time cost in our consideration above is the art workflow and the existing assets that were already created in a different color space.

       

      Adjusting your art toolchain the first time, and then getting familiar with the adjusted workflow, can take some time in and of itself. Most artists have a personal preference for one or two programs (Photoshop, GIMP, Aseprite, GrafX2, GraphicsGale, etc.), but will adopt a specialist set of brushes or programs for a given project due to a unique feature. For example, some of our pixel artists prefer to animate pixel art in Aseprite simply because of features like onion-skinning (not to mention the simplicity of importing sprite sheets into Unity using .ase files).

       

      OBS Settings for Recording Game Footage
      An easy guide to getting the perfect OBS settings for recording game footage for trailers or streaming.

      Read more

      Bite the Bullet - Dev Log 1

      Run & Gun, and...eat? Saw through flesh in a new-fashioned way or drill through enemies with fast-paced gunslinging in Bite the Bullet. This game combines some of our favorite game elements ever, with fun platforming, skill shots, power-ups, and fast gameplay. Also, cannibalism.

      Story

      In the 2Xth century, urbanization and pollution caused food and resource shortages. Humanity was able to adapt through technology, devising biologically implanted nodes which allowed them to consume and metabolize any material, living or inorganic.


      This necessity sparked a trend in biological manipulation, and soon mankind split itself into two species – the Celestials, who embraced bio-mods and expanded across the stars from their new lunar home; and the Ghouls, descendants of humans who never utilized the consumption nodes and remained behind on earth, poisoned and twisted by the conditions there.


      The strife between these species has caused mistrust, suspicion, even war, and their planetary conflict has attracted the attention, and ire, of a being that is beyond all mortal Flesh.


      Now, two champions - half-caste offspring with parents from both species - will attempt to save humanity’s future by fighting through its past on a desolate and perilous world.

      Features

      Bite the Bullet is a biopunk run and gun action platformer featuring:

      • The choice to destroy or consume enemies, turning their biological matter into many upgrades.
      • 10 levels infested with bladed drones, flame turrets, lightning geckos, blimp rats, and helmet squids.
      • Turn enemies into defenses or weapons, like the Gurtha swarm blowgun, lightning gecko on a stick, or turtle shell plate armor.
      • Save up all that consumed flesh and metal in the bio-meter to activate Zombro mode, and pound your foes into smoking ash and bone dust.

      Characters

      Vill and Dart

      Two soldiers of the Lunarian forces who have a shared Purebred and Ghoul ancestry. They manage to keep their Ghoul genes hidden but still carry the bio-implants which enable them to consume all manner of organic and inorganic material.


      Garands

      Most Ghouls are content with eking out a meager existence on the husk of a planet Earth has become. Others intend to take out their human cousins by force, joining ranks under the Ghoul revolutionary leaders.


      Lightning Geckos

      Reptiles that have evolved bio-electric survival mechanisms. They are often hunted by Ghouls as both a source of power and for designer leather boots.


      28s

      Sometimes, consumption nodes decay over time and infect other organs of the host. These speedy Ghouls have significant brain damage, which affects their higher cognitive functions.


      Dire Puffer Fish

      Before the great lunar migration, many animals were developed for sports activities. Mechanically altered pufferfish were one of them, as they provided a deadly challenge to a typically peaceful activity.


      Sappys

      The Purebreds left plenty of explosives behind on their journey to the Moon. Sappys intend to make this refuse into the instrument of the Purebreds' demise.


      Mega Mind

      The consumption nodes were created in a bioware lab which was the bleeding edge of technology for its time. Long abandoned, the dispossessed bio-material and implants in the facility begin to cohere. Now, a central, seething mass looms at the core of the laboratory, and its tendrils can be felt throughout the building like a pulsing nervous system.


      Chunks

      These large Ghouls have several bio-modifications which provide them with superhuman strength, allowing them to wield suppressive-fire weapons which are normally mounted on vehicles.


      Blimp Rats

      Inflatable sacs in these creatures enable them to float into low hanging trees to gather fruit and leaves. Breathing in the gases stored in these sacs can cause nausea and hallucinations, making these animals short on natural predators.


      Eltons

      Hand-picked by Ghoul forces for leg strength and bone density, these individuals were given special implants that allow them to control tiny muscles, allowing them to stabilize and withstand the recoil of their mighty weapons.


      Hamster Squirrel

      Mutant rodents with acidic saliva, teeth as hard as granite, and a dangerous nether region. If they were not so volatile, some say that they could be domesticated…


      Buckshots

      Internal filtration membranes keep these Ghouls safe from the harmful effects of their drug of choice, but do little to protect their enemies from the spread of their deadly weapons.


      Here are some updates from the last two weeks:


      Early Ham-Ham implementation: https://gyazo.com/86f0f62dd2460b52b0a423cd3764f47c

      Ghoul Character: https://gyazo.com/852336a8db44009c8c8ce66a0d898dfd

      Implementing rockets: https://gyazo.com/489bb48cd932bdca397bed9dee592a1b

      Kill cam: https://gyazo.com/d37b717d2f0ba6ae493e040fefdefb0a

      Critter cannon: https://gyazo.com/760e9934598ebee44e35cc1b72aa4646

       

       

      Wow, we've come a long way! See more and wishlist Bite the Bullet on Steam!

      Want even more updates? Join our Discord!