Dook's VR Design Tips - Lessons learned ~200 hours into the matrix

Hey friend, I'm dook, a programmer/designer working on Another Reigny Day, a VR castle defense game for the HTC Vive. Designing games for VR can be a daunting task since the medium is so new and always shifting, so we've compiled a small list of problems we've had and the solutions we've found for them. Quick disclaimer: I don't intend this article to be an authority on how to do things in VR; these are just things we've picked up along the way that have worked for us. VR design is such a volatile and subjective thing that everything in this post could be completely wrong or outdated with new hardware. Anyway, with that out of the way, here are some things we learned, in no particular order.


      For the love of God, don't make the player use the grip buttons too much

(on the Vive wands anyway... most of the time...)


The grip buttons on the Vive have to be the worst part of the system. They're just downright awkward to press, or at least I think so. Anecdotally, about every second VR user I've talked to didn't like using them, though some people don't seem to have an issue. Even if it only affects a small percentage of players, you should always design VR with accessibility in mind.


But there are some use cases where the grip buttons are necessary. Objects that have to be used while held (e.g. guns) come to mind: you can't use the trigger for the grab action, since there'd be no way to fire the gun. Using the grip buttons to toggle grabbing and the trigger for using the item is a pretty common solution, and I think you'd be hard-pressed to find users who have a problem with it, since the press is so infrequent. But set it to toggle, not hold.
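As a rough sketch of that scheme in Unity C# (the input checks and the grab/fire methods are hypothetical stand-ins for whatever input and interaction system you're using):

using UnityEngine;

// Minimal sketch: a grip press toggles holding, and the trigger uses the
// held item. GetGripDown/GetTriggerDown are placeholders for your input API.
public class ToggleGrab : MonoBehaviour
{
    private bool isHolding;

    void Update()
    {
        if (GetGripDown())
        {
            // Toggle instead of requiring the grip to be held down.
            isHolding = !isHolding;
            if (isHolding) Grab();
            else Release();
        }

        if (isHolding && GetTriggerDown())
        {
            Fire(); // e.g. shoot the held gun
        }
    }

    // Hypothetical hooks - wire these up to your own input/interaction code.
    bool GetGripDown() { return false; }
    bool GetTriggerDown() { return false; }
    void Grab() { }
    void Release() { }
    void Fire() { }
}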


      Locomotion options


Following the trend of accessibility, locomotion options are essential to making sure your players have a good experience, whatever their tolerance for motion sickness. It might be a little extra work, but players appreciate this kind of thing. Try to get all the standard ones in there if you can: teleport, trackpad, and arm swinger. Dash teleport is also a bit nicer than regular fade teleport, since it gives the player clear feedback about where they're moving without causing motion sickness. When play-testing Reigny, we often had players accidentally teleport downstairs and get confused about where they were, since we only used fade teleport.
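For reference, dash teleport can be as simple as sliding the rig to the target over a fraction of a second instead of fading. A minimal sketch, assuming the rig transform and target position come from whatever teleporter you already have:

using System.Collections;
using UnityEngine;

// Minimal dash-teleport sketch: move the play-area rig to the target over a
// short, fixed duration so the player sees exactly where they're going.
public class DashTeleport : MonoBehaviour
{
    public Transform rig;           // the play-area root, not the camera
    public float dashTime = 0.15f;  // kept short to avoid motion sickness

    public void DashTo(Vector3 target)
    {
        StartCoroutine(Dash(target));
    }

    IEnumerator Dash(Vector3 target)
    {
        Vector3 start = rig.position;
        for (float t = 0f; t < 1f; t += Time.deltaTime / dashTime)
        {
            rig.position = Vector3.Lerp(start, target, t);
            yield return null;
        }
        rig.position = target;
    }
}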


Don't cover the player's field of view entirely

There's a flying monkey enemy that flings poop at the player's face and covers the player's field of view with poo if it hits. This immediately pulled a lot of players out of the experience and left them wondering what had just happened. We mostly fixed it with better sign-posting of what was happening: we slowed down the monkey, added a trail renderer to the projectile, and, most importantly, made the sides of the poop texture transparent so players could still glimpse what was going on around them. Accounting also handles this well in its courtroom scene, which gets you to reach toward your face by covering half of your view with black and putting a low-pass filter on the audio.


      Use tracked objects

Held objects that are 'tracked' feel way nicer than objects that are simply made children of the controller. Tracked objects have the benefit of interacting properly with the environment (knocking stuff over) as well as reacting when they collide with it (like a wall). It's a really simple thing that adds a lot to immersion.
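One common way to implement this (a sketch, not necessarily how Reigny does it) is to leave the object un-parented and chase the controller with Rigidbody velocities every physics step, so collisions can still push back:

using UnityEngine;

// Sketch of a velocity-tracked held object: instead of parenting to the
// controller, chase it with velocities so the object still collides with
// walls and knocks things over.
[RequireComponent(typeof(Rigidbody))]
public class TrackedHeldObject : MonoBehaviour
{
    public Transform controller; // the hand/controller transform to follow
    private Rigidbody rb;

    void Awake() { rb = GetComponent<Rigidbody>(); }

    void FixedUpdate()
    {
        if (controller == null) return;

        // Velocity needed to close the position gap in one physics step.
        rb.velocity = (controller.position - rb.position) / Time.fixedDeltaTime;

        // Angular velocity from the rotation delta.
        Quaternion delta = controller.rotation * Quaternion.Inverse(rb.rotation);
        delta.ToAngleAxis(out float angle, out Vector3 axis);
        if (angle > 180f) angle -= 360f;
        if (Mathf.Abs(angle) > 0.01f)
        {
            rb.angularVelocity = axis.normalized * (angle * Mathf.Deg2Rad / Time.fixedDeltaTime);
        }
    }
}

In practice you'll want some tuning on top of this (capping the maximum velocity, for instance) so fast hand movements don't launch objects through geometry.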
      Juice it!


Even though VR design is still pretty new and always changing, it's important to adhere to standard game design principles, especially making sure things are properly juiced. Sound and haptics are like half of what makes VR fun, and they're easy to neglect. Something that added a ridiculous amount of permanence in Another Reigny Day was simply adding body parts and blood to the field. The talk on juice by Jan Willem Nijman (of Vlambeer fame) is one of the best out there, and so much of it can be applied to VR (maybe not the screen shake parts though, lol).

      Accessibility

Thinking about accessibility is important for pretty much all games, and especially important in VR, since you'll get a wide range of people playing your game. For example, how will kids and short people interact with things? A great way to test this is to play your game on your knees and see if everything still works. Some ways to deal with this are tables with adjustable heights, or an option to scale the environment down to half size.

Another important question to ask is whether the player can play the game sitting down or standing in place. Adding the ability to turn on the spot is great for players with standing configs or only front-facing tracking.
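Snap turning is one easy way to provide that. A minimal sketch - you'd call Turn from whatever input binding you use:

using UnityEngine;

// Minimal snap-turn sketch: rotate the rig in fixed increments around the
// player's head so seated or front-facing players can turn in place.
public class SnapTurn : MonoBehaviour
{
    public Transform rig;        // the play-area root
    public Transform head;       // the HMD camera
    public float turnAngle = 45f;

    public void Turn(int direction) // -1 = left, +1 = right
    {
        // Rotate around the head's position rather than the rig origin,
        // so the player doesn't swing sideways across the room.
        rig.RotateAround(head.position, Vector3.up, turnAngle * direction);
    }
}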


Ergonomics are also worth keeping in mind when designing for VR. Experiences that have the player looking down or up for too long can cause neck strain, and having the player hold their arms out straight for too long can also be uncomfortable (e.g. using a bow and arrow for extended periods). Picking things up off the floor sucks just as much in VR as it does in real life, and should be avoided if possible. To fix some of these things in Reigny, we added the bow 'Recall' feature, which flings the bow from anywhere on the map into the player's hand so they don't have to bend down. We also made using the bow for long periods optional, with an array of other options for dealing with enemies.
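The recall itself can be as simple as flying the bow to the hand over a short duration. A rough sketch (not Reigny's actual code):

using System.Collections;
using UnityEngine;

// Sketch of a 'Recall' feature: fling an item from anywhere on the map
// into the player's hand so they never have to bend down for it.
public class RecallItem : MonoBehaviour
{
    public Rigidbody item;          // e.g. the bow
    public Transform hand;          // the controller transform to fly to
    public float flightTime = 0.3f;

    public void Recall()
    {
        StartCoroutine(FlyToHand());
    }

    IEnumerator FlyToHand()
    {
        item.isKinematic = true; // suspend physics during the flight
        Vector3 start = item.position;
        for (float t = 0f; t < 1f; t += Time.deltaTime / flightTime)
        {
            item.position = Vector3.Lerp(start, hand.position, t);
            yield return null;
        }
        item.position = hand.position;
        item.isKinematic = false;
    }
}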

Use audio to give cues about what's happening in the game world

Audio can be a great tool for alerting players to what's happening around them without them having to look, and it's especially effective with 3D audio in VR. With so much sensory input, it's really easy for players to miss things happening around them. It's important to create sound cues for things like enemies that might sneak up behind the player (footsteps work well with 3D audio) or machines, which can sound off about whether or not they're working properly. One of the issues we found when people played Another Reigny Day was that they were constantly surprised by enemies coming up behind them and attacking, which happens when the enemies break down the gate and storm the castle. The root cause was players not knowing whether the gate had been broken. To help fix this, we added a loud audio cue for when enemies are attacking the gate, as well as a loud crash for when it's been broken down.
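In Unity terms, the important part is making the cue fully 3D so its direction is audible. A minimal sketch of the gate cue - the clips and the AudioSource sitting on the gate are assumptions, not our exact setup:

using UnityEngine;

// Minimal sketch: play a fully-3D audio cue at the gate's position so
// players can hear where the attack is happening without looking.
public class GateAudioCue : MonoBehaviour
{
    public AudioSource source;   // an AudioSource placed on the gate
    public AudioClip attackClip; // looping 'enemies hitting the gate' sound
    public AudioClip breakClip;  // one-shot crash for when it breaks

    void Awake()
    {
        source.spatialBlend = 1f; // 1 = fully 3D, so direction is audible
    }

    public void OnGateAttacked()
    {
        source.clip = attackClip;
        source.loop = true;
        source.Play();
    }

    public void OnGateBroken()
    {
        source.loop = false;
        source.Stop();
        source.PlayOneShot(breakClip);
    }
}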
That's it for now! If you'd like to read further into VR design, I highly recommend searching through the Google I/O 2016 and 2017 talks on VR for anything that sounds interesting. GDC and VRDC also have some really good talks about what studios have learned developing for VR. I think it's also important to try out other people's games to build up a shared design vocabulary and get inspiration.

      Finally, I'm including a short video demonstrating the most unique way to pause an HTC Vive Medieval Low Poly Action game in 2018:

      Be the Fetus

      Later skater


You can follow me on Twitter @duckpizza for more VR shenanigans.

Improving the Mixer interactivity authentication experience on PC

      We recently finished our second game with Mixer integration and wanted to share what might be a helpful resource for other devs in the future. It's short and sweet!


For the PC platform, users who want to enable Mixer interactivity in your game must first authenticate with a short key: the game shows the user a key, the user visits a website, and then enters the key there. By default, the Mixer Interactivity for Unity plugin makes this process very time-consuming by simply showing the key and the URL to the user, without any shortcuts to speed things up.

      Our solution:
      1. Put the authentication key in a box that will copy the key to the clipboard when selected or clicked.
2. Put the URL to Mixer's authentication page into a button that, when selected or clicked, will open the URL in the user's default browser.

Copying to the clipboard is a platform-dependent operation. Thankfully, Unity's TextEditor class handles that for us. Here's some sample code that copies a string to the user's clipboard:

public void CopyToClipboard(string str)
{
    // TextEditor gives us a platform-independent way to reach the clipboard.
    TextEditor editor = new TextEditor
    {
        text = str
    };
    editor.SelectAll();
    editor.Copy(); // copies the current selection to the system clipboard
}
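As an aside, Unity versions that expose the clipboard directly let you skip the TextEditor workaround entirely:

using UnityEngine;

public static class Clipboard
{
    // On Unity versions where GUIUtility.systemCopyBuffer is public,
    // this one-liner replaces the TextEditor approach above.
    public static void Copy(string str)
    {
        GUIUtility.systemCopyBuffer = str;
    }
}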

      Opening a URL is also simple:

public void OpenURL(string url)
{
    // Opens the URL in the user's default browser on desktop platforms.
    Application.OpenURL(url);
}

In summary, have the user click/select the key to copy it to the clipboard (don't do so automatically - you don't want to forcefully overwrite the user's clipboard), and then select the link to the authentication page. This makes the process of enabling interactivity very smooth for the user, giving them a good first impression of Mixer Interactivity.
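Putting the two together, the UI hookup might look something like this. The button fields, shortCode, and authUrl are hypothetical placeholders, not the plugin's API:

using UnityEngine;
using UnityEngine.UI;

// Sketch of the UI hookup: one button copies the short key on click,
// another opens the authentication page in the default browser.
public class MixerAuthUI : MonoBehaviour
{
    public Button keyButton;       // displays the short key; click to copy
    public Button openSiteButton;  // opens the authentication page
    public string shortCode;       // the key provided by Mixer
    public string authUrl;         // the authentication page URL

    void Start()
    {
        keyButton.onClick.AddListener(() => CopyToClipboard(shortCode));
        openSiteButton.onClick.AddListener(() => Application.OpenURL(authUrl));
    }

    void CopyToClipboard(string str)
    {
        TextEditor editor = new TextEditor { text = str };
        editor.SelectAll();
        editor.Copy();
    }
}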

Unity Chroma SDK: Using Particle Systems to Make Beautiful Chroma Animations

      Important note for devs: Make sure Razer Synapse is up to date, otherwise some SDK features will not work.

      For starters, here's a handy link to Unity Chroma SDK.

One of the most powerful features of the Unity Chroma SDK is the ability to record the view of a camera in the scene and convert it to a chroma animation. A lot of creative opportunities lie within this feature, since chroma animations can therefore be made from a camera that sees particle systems, canvases, sprites, meshes with materials, and so on.

For our game Log Jammers, I found that particle systems achieved the type of effect we were looking for with far less development time than expected. What follows is the process for using the camera capture feature of the Unity Chroma SDK.

      First, set up a camera in the scene which aims at a particle system. Make sure the orientation of the two is such that the camera will see the particle system.



      Open the “Chroma Particle Window” which is dedicated to this Chroma SDK feature. This is located in Unity’s toolbar if you have imported the Chroma SDK package.

      Next, drag and drop the following references into their corresponding fields in the particle window:

      1. Your target chroma animation object. Make this by placing the Chroma SDK animation component of your choice on a GameObject.
      2. The camera you just configured to point at your particle system.

With the particle window still open, select your particle system in the scene, and press the “Simulate” button in the scene view. You should see the particles of your system in the preview window.

To start recording the camera view, press the “Start” button. I recommend timing your capture to start and stop around your particle system beginning and ending, so the transitions in and out of the chroma animation are smooth. Note that capturing appends to the end of any previously saved/recorded frames in the animation, so if you don’t like a capture, make sure to press the “Reset” button to clear it before recording again.

This workflow makes for nice-looking chroma animations in little time. Below you can see examples of the captured chroma animations in the hardware emulator.


      The Chroma Hardware Emulator

      To test all of our fancy new chroma effects on Razer hardware that I didn’t have available, I used the ChromaEmulator tool.

This tool shows its emulation of Chroma effects in real time for all types of Chroma hardware, including Chroma Link. It works with Unity without extra effort: just fire it up, select the hardware you want to emulate, and select Show for each. Some examples:

Examples: keyboard, number pad, and laptop.

Here's a helpful link to information for devs on Chroma Link (it explains the significance of the color channels).

Emission Map Guide for Artists

Emission maps describe how much bloom/glow should be emitted from each part of a sprite/texture.


      Emission maps are grayscale: black means no emission for that pixel, white means full intensity emission. To start, here’s an example of a sprite with its corresponding emission map:

We don’t want the entire building sprite to emit bloom/glow, so we create an emission map in which everything is black except the windows of the building. The result is that only the windows emit a glow. Most lighting effects should be achieved through programmed lighting and emission, so most art will not need lighting drawn into it.


Example table describing the relationship between a sprite, its emission map, and the emitted bloom:

Sprite pixel color      | Emission map pixel color | Final emitted bloom
White (full intensity)  | White (full intensity)   | White, full intensity
White (full intensity)  | Black (no intensity)     | No emission
White (full intensity)  | Grey (half intensity)    | White, half intensity
Yellow (full intensity) | Grey (half intensity)    | Yellow, half intensity
Yellow (full intensity) | Black (no intensity)     | No emission
Grey (half intensity)   | White (full intensity)   | Grey, full intensity



Unity Bloom Post Processing & Emission Maps

      (For Devs)


1. Import this Unity package - it will allow us to use post-processing such as bloom.
2. In the file manager view within Unity: Right-click -> Create -> Post-Processing Profile.
3. Select the newly-made profile and view its contents in the inspector.
4. Check the boxes to enable bloom, chromatic aberration, and vignette. Copy these settings for now:
5. Select your main camera in the scene's hierarchy and add the script called 'Post Processing Behaviour'. For its 'Profile' element in the inspector, assign the post-processing profile you created earlier.
6. The result should look a bit like this:

If you want direct control over how a specific sprite behaves with regard to bloom/emission, you need to make a Sprite Pixel-Lit material just for it, with an emission map.

1. Download the shaders from this GitHub repository: https://github.com/traggett/UnitySpriteShaders
2. Put the folder in your Assets folder. It contains the shaders for the materials we'll be using.

In the file manager again: Right-click -> Create -> Material. Select the material and, at the top of the inspector, select Shader -> Sprite (Pixel Lit).

      Configure the material as follows:

1. Set 'Blend Mode' to Standard Alpha.
2. Check the 'Emission' box. This enables you to control the sprite's bloom output via an emission map.
3. With emission checked, a box labeled 'Emission' appears under it. This is where you place the artist-provided emission map for the sprite whose bloom you want to define. An emission map will be black and white; see below for an example.
4. The color box is where you set the hue and intensity of the bloom/glow. I recommend setting it to pure white, which means no hue shift and maximum emission. You can use this to control the bloom intensity for a sprite without an emission map, but an emission map controls which pixels emit and which don't.
5. Once that's done, select your sprite's game object and set the material on its sprite renderer to the new material you made.

For emission maps made for sprite sheets that are sliced (such as animations), the emission map will automatically be sliced/indexed in the same way as the referenced sprite sheet.

      Some notes about emission:

• Unless you want all of a sprite to emit bloom, you should use an emission map. Without one, the intensity of bloom from a given pixel is based on the intensity of that pixel's RGB color. As a result, brightly colored sprites will emit a lot of bloom, which may be undesirable.
• An emission map doesn't decide the final color - it just describes how much bloom emission happens per pixel. The hue part of the emission material's configuration will affect the color; when left white, each pixel's own color is used as the emission color.
• The final emission intensity appears to be (RGB intensity of the pixel * corresponding pixel intensity in the emission map), where the emission map intensity runs from 0 to 1 - see the sketch below.
• Example emission map and base. Here, only the windows will emit any bloom.
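A worked version of that formula, assuming straight per-channel multiplication:

using UnityEngine;

// Sketch of the per-pixel rule described above:
// emitted color = sprite pixel color * emission map intensity (0..1).
public static class EmissionMath
{
    public static Color Emitted(Color spritePixel, float mapIntensity)
    {
        return spritePixel * mapIntensity;
    }
}

// Example: a full-intensity yellow pixel (1, 1, 0) under a half-grey map
// value of 0.5 emits (0.5, 0.5, 0) - yellow at half intensity.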


      Keep your paw on the pulse of all our unity tricks by joining our Discord!