
      Unity - Undefined Script Order of Execution Bugs


When a bug consistently reproduces for one person but never for another, or appears in a build but not in the editor (or vice versa), it may stem from an undefined script execution order. These bugs arise easily in ordinary Unity usage, the cause is subtle, and their inconsistency makes them a pain to track down and fix. This document explains how these bugs happen, why they occur so inconsistently, how to fix them, and how to design your code so there's no room for them to manifest in the first place. To start, here's a simplified example of a script execution order bug, shown below:


using UnityEngine;
using UnityEngine.UI;

class CharacterUI : MonoBehaviour
{
    public Image activeSkill;      // set in inspector
    public Skills characterSkills; // set in inspector

    void Start()
    {
        // Assumes Skills has already loaded skillsKnown -- but has it?
        activeSkill.sprite = characterSkills.skillsKnown.primarySkill.sprite;
    }
}

class Skills : MonoBehaviour
{
    public SkillData skillsKnown; // initialized from Resources.Load

    void Start()
    {
        skillsKnown = Resources.Load<SkillData>("Data/Skills"); // illustrative path
    }
}


One tester may always, consistently, get a null reference exception in CharacterUI's Start(), resulting in a broken-looking UI, while for the developer and several other testers the bug may absolutely never happen. The bug may also never happen for anyone in-editor, yet happen for some people in builds.


      Why does this happen?

First, let's establish that CharacterUI's Start() depends on Skills running its Start() first; otherwise, when it accesses the data within characterSkills.skillsKnown, skillsKnown is still null. With that in mind, what actually defines the order in which the two Start() methods run? Nothing: the order of execution for Start() between these two classes is completely undefined. Because it's undefined, if these two objects are created at the same time at scene startup, Unity determines the order arbitrarily, and it can vary between editor sessions and builds, and per machine! For some people the bug may always happen in the editor and never in a build, for others always in the build and never in the editor, and for others still it may never happen at all. It all depends on whether, for a given machine and build/editor session, Unity happens to run Skills before CharacterUI or CharacterUI before Skills. While we work through the example, keep in mind that in an actual game codebase, the classes involved in such a bug will be more numerous and complex.


      Solutions

There are a few solutions available for our contrived example. One is changing Skills to initialize in Awake(), which always runs before any other component's Start(). But what if, given your case's current logic, both classes need to use Awake(), or both must use Start() due to dependencies on other classes? If both use Awake(), you run into the same issue, since the order between the two Awake() calls is also undefined. If both must use Start(), you have exactly the example's undefined-order problem.
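For instance, here's what the Awake() version of Skills could look like (a minimal sketch; the Resources path is illustrative):

class Skills : MonoBehaviour
{
    public SkillData skillsKnown;

    void Awake()
    {
        // All Awake() calls complete before any component's Start() runs,
        // so CharacterUI's Start() can safely read skillsKnown.
        skillsKnown = Resources.Load<SkillData>("Data/Skills"); // illustrative path
    }
}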


The general solution is to explicitly define the scripts' execution order. You can do this in the project settings (Edit -> Project Settings -> Script Execution Order), but it's a pain to manage there (and gets out of hand once you have hundreds of classes), so you can instead use an attribute on the class, which looks like [DefaultExecutionOrder(150)]. Below, the attribute is applied to both classes to fix the bug.


using UnityEngine;
using UnityEngine.UI;

[DefaultExecutionOrder(10)]
class CharacterUI : MonoBehaviour
{
    public Image activeSkill;      // set in inspector
    public Skills characterSkills; // set in inspector

    void Start()
    {
        activeSkill.sprite = characterSkills.skillsKnown.primarySkill.sprite;
    }
}

[DefaultExecutionOrder(-5)]
class Skills : MonoBehaviour
{
    public SkillData skillsKnown; // initialized from Resources.Load

    void Start()
    {
        // -5 < 10, so this Start() now always runs before CharacterUI's.
        skillsKnown = Resources.Load<SkillData>("Data/Skills"); // illustrative path
    }
}


The lower the order value, the earlier that script's event methods like Start() execute relative to other MonoBehaviours. Now that the execution order of the Start() calls is defined, Skills' Start() will always run before CharacterUI's Start(). Note that this execution order affects Start(), Awake(), Update() (and the other update variants like FixedUpdate() and LateUpdate()), as well as OnEnable() and OnDisable(). For example, Skills' OnEnable() would run before CharacterUI's OnEnable().


Note: if only one class had its order defined, such as CharacterUI's, the bug could still occur, since Skills' order relative to it would remain undefined.


      Preferred Solution - Avoiding this Problem By Design

The above solution of using the DefaultExecutionOrder attribute is fine if the damage is already done and the code can't be refactored. The ideal solution, however, is to design your code so this issue has no room to occur in the first place.


At a design level, the solution is to avoid using Start() or Awake() for anything that depends on another game object's state. Instead, have dedicated code in another class responsible for initializing your objects and wiring them together, rather than having individual objects cross-reference each other. As a red flag: if your Start() or Awake() methods must run in a very particular order across separate objects for things to function properly, those classes should be redesigned so they are initialized explicitly, by hand, from another class. The reasoning is that if their initialization order is so important that they can't function without it, that order deserves to be explicitly defined line-by-line in one location, not spread throughout the codebase via the DefaultExecutionOrder attribute. Let's look at an example.


For a contrived example, imagine classes A, B, C, D, and E, which all depend on each other in different ways in their Start() and Awake() methods. To understand the order in which they initialize using the DefaultExecutionOrder attribute, you'd have to visit each class, note its order number, sort those numbers lowest to highest, then separately account for the fact that all the Awake()s run before all the Start()s, and that some classes may have one but not the other. There is a much clearer way: introduce one simple class that takes references to each involved class and explicitly initializes them in a manually defined order, passing their dependencies as arguments.


      e.Initialize();
      d.Initialize(e);
      c.Initialize(d, e);
b.Initialize(e, c, d);
      a.Initialize(b, c);


Now the order of initialization is immediately clear to a developer just by reading it, the dependencies between classes are explicit, and, most importantly, there's no room for undefined execution order bugs because you have defined the initialization order by hand.
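Concretely, a minimal sketch of such an initializer might look like the following (the class and field names are illustrative, not from any specific API):

using UnityEngine;

class GameInitializer : MonoBehaviour
{
    // References assigned in the inspector; names are illustrative.
    public SystemA a;
    public SystemB b;
    public SystemC c;
    public SystemD d;
    public SystemE e;

    void Awake()
    {
        // The one place where initialization order is defined, line by line.
        e.Initialize();
        d.Initialize(e);
        c.Initialize(d, e);
        b.Initialize(e, c, d);
        a.Initialize(b, c);
    }
}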


Note that when defining initialization orders by hand, you may encounter cyclic dependencies. For example, if system A requires B to be initialized first, B requires C, and C requires A, there is no valid order in which to initialize them. Resolving cyclic dependencies can be complicated and requires some sort of refactoring, so it's outside the scope of this document; "resolving cyclic dependencies" is the phrase to search for when researching solutions.
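To make the cycle concrete, here it is in the same fragment style as above; no ordering of these three lines can work:

a.Initialize(b); // A requires B to be initialized first
b.Initialize(c); // B requires C to be initialized first
c.Initialize(a); // C requires A to be initialized first: a cycle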

A Guide to Cinemachine

Cinemachine is a suite of camera tools for Unity, giving you triple-A-quality control over every camera in your project. The easy-to-install plugin lets you add functionality to cameras you already have, or create new ones with varying behaviors.

      Unity Chroma SDK: Using Particle Systems to Make Beautiful Chroma Animations


      Important note for devs: Make sure Razer Synapse is up to date, otherwise some SDK features will not work.

      For starters, here's a handy link to Unity Chroma SDK.

One of the most powerful features of the Unity Chroma SDK is the ability to record the view of a camera in the scene and convert it to a chroma animation. A lot of creative opportunities lie within this feature, as chroma animations can therefore be made from a camera that sees particle systems, canvases, sprites, meshes with materials, and so on.

For our game Log Jammers, I found that particle systems achieved the type of effect we were looking for with far less development time than expected. What follows is the process for using the camera-capture feature of the Unity Chroma SDK.

First, set up a camera in the scene aimed at a particle system, oriented so that the camera will see the particles.



Open the “Chroma Particle Window”, which is dedicated to this Chroma SDK feature. It appears in Unity's menu bar once you have imported the Chroma SDK package.

      Next, drag and drop the following references into their corresponding fields in the particle window:

      1. Your target chroma animation object. Make this by placing the Chroma SDK animation component of your choice on a GameObject.
      2. The camera you just configured to point at your particle system.

      With the particle window still open, select your particle system in the scene, and press the “Simulate” button in the scene view. You should see the particles of your system in the preview window.


To start recording the camera view, press the “Start” button. I recommend timing your capture to start and stop around the beginning and end of your particle system, so the transitions in and out of the chroma animation are smooth. Note that capturing appends to any previously saved/recorded frames in the animation, so if you don't like a capture, press the “Reset” button to clear it before recording again.

This workflow makes for nice-looking chroma animations in little time. Below you can see examples of the captured chroma animations in the hardware emulator.


      The Chroma Hardware Emulator

      To test all of our fancy new chroma effects on Razer hardware that I didn’t have available, I used the ChromaEmulator tool.

This tool shows its emulation of Chroma effects in real time for all types of Chroma hardware, including ChromaLink, and it works with Unity without extra effort. Just fire it up, select the hardware you want to emulate, and click Show for each.

Example emulators: keyboard, number pad, and laptop.

      Here's a helpful link to information for devs on chroma-link (explains the significance of the color channels).


      Unity Bloom Post Processing & Emission Maps


      (For Devs)


1. Import this Unity package; it will allow us to use post-processing effects such as bloom.
2. In the file manager view within Unity: Right-click -> Create -> Post-Processing Profile.
3. Select the newly made profile and view its contents in the inspector.
4. Check the boxes next to bloom, chromatic aberration, and vignette to enable them. Copy these settings for now.
5. Select your main camera in the scene's hierarchy and add the script called 'Post Processing Behaviour'. For its 'Profile' element in the inspector, assign the post-processing profile you created earlier.
6. The result should look a bit like this.
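If you'd rather assign the profile from code instead of the inspector, here's a minimal sketch (this assumes the v1 post-processing stack imported above; the Resources path is illustrative):

using UnityEngine;
using UnityEngine.PostProcessing;

class PostProcessingSetup : MonoBehaviour
{
    void Awake()
    {
        // Add the post-processing behaviour to the main camera and assign
        // a profile loaded from a Resources folder (illustrative path).
        var behaviour = Camera.main.gameObject.AddComponent<PostProcessingBehaviour>();
        behaviour.profile = Resources.Load<PostProcessingProfile>("MainProfile");
    }
}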

If you want direct control over how a specific sprite behaves with regard to bloom/emission, you need to make a Sprite Pixel-Lit material just for it, with an emission map.

1. Download the shaders from this GitHub repository: https://github.com/traggett/UnitySpriteShaders
2. Put the downloaded folder in your Assets folder. It contains the shaders for the materials we'll be using.

In the file manager again: Right-click -> Create -> Material. Select the new material, and at the top of the inspector choose Shader -> Sprite (Pixel Lit).

      Configure the material as follows:

1. Set 'Blend Mode' to Standard Alpha.
2. Check the 'Emission' box. This enables you to control the sprite's bloom output via an emission map.
3. With emission checked, a texture slot labeled 'Emission' appears beneath it. This is where you place the artist-provided emission map for the sprite whose bloom you want to define. An emission map is black and white; see below for an example.
4. The color box sets the hue and intensity of the bloom/glow. I recommend setting it to pure white, which means no hue shift and maximum emission. You can use this color to control a sprite's bloom intensity without an emission map, but an emission map controls which pixels emit and which don't.
5. Once that's done, select your sprite's game object and set the material of its Sprite Renderer to the new material you made (a runtime version of this step is sketched below).
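For reference, here's what that last step looks like at runtime, as a minimal sketch (the '_EmissionColor' property name is an assumption about this shader; check the shader's properties before relying on it):

using UnityEngine;

class EmissiveSpriteSetup : MonoBehaviour
{
    public Material emissiveMaterial; // the Sprite (Pixel Lit) material configured above

    void Start()
    {
        var spriteRenderer = GetComponent<SpriteRenderer>();
        spriteRenderer.material = emissiveMaterial;
        // "_EmissionColor" is an assumed property name; verify it in the shader.
        spriteRenderer.material.SetColor("_EmissionColor", Color.white);
    }
}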

For emission maps made for sliced sprite sheets (such as animations), the emission map is automatically sliced/indexed in the same way as the sprite sheet it accompanies.

      Some notes about emission:

• Unless you want all of a sprite to emit bloom, you should use an emission map. Without one, the intensity of bloom from a given pixel is based on the intensity of that pixel's RGB color, so brightly colored sprites will emit a lot of bloom, which may be undesirable.
• An emission map doesn't decide the final color; it just describes how much bloom emission happens per pixel. The hue in the material's emission color affects the color; when left white, each pixel's own color is used as its emission color.
• The final emission intensity appears to be (RGB intensity of the pixel × that pixel's intensity in the emission map), where the emission-map intensity ranges from 0 to 1. For example, a pixel at 0.8 RGB intensity under an emission-map value of 0.5 would emit at 0.4.
• Example emission map and base sprite below. Here, only the windows will emit any bloom.


Keep your paw on the pulse of all our Unity tricks by joining our Discord!