Unity’s (not so) new event system

In version 4.6 Unity introduced a powerful new event system which takes care of how input is routed through the game. It establishes a clear order in which elements get to handle input before others.

Motivation

To be honest, the event system has been around for quite some time, but I only recently got the chance to really dive into it. The problem in our project was that multiple systems used input to perform their actions, but not every system was active all of the time. Or one was active, but only if the input wasn’t consumed by a higher-priority system.

Although the systems were independent in general, having to share their input intermingled them a bit too much in my opinion. So instead of using the input directly, I tried to fully utilize the new event system to separate the systems again.

One system should only get input events that are meant for it, so it doesn’t have to check whether they are really valid. On the other hand, a system can easily deactivate itself by deactivating its event handlers.

How does it work

If you are completely unfamiliar with Unity’s new event system, there are a few sources available that give an introduction and some insights.

The system consists of four main parts:

  • Input Modules
  • EventSystem
  • Raycasters
  • Input Handlers

Event generation

First of all, the input events have to be generated somehow. This is done by the input modules.

The input modules are specific to the kind of input that is available on the device. By default there are two:

The standalone input module handles standard mouse input, which you will most often use in Windows, Mac or Linux standalone builds of your game.

The touch input module is primarily meant for mobile devices. It generates input events from the user’s touch input.
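
If you ever need to set this up from code instead of the editor, a minimal sketch could look like the following (EventSystemBootstrap is just an illustrative name):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Minimal sketch: make sure an EventSystem with the two default input modules exists.
// Normally these components are simply added in the editor (GameObject > UI > Event System).
public class EventSystemBootstrap : MonoBehaviour
{
    void Awake()
    {
        if (EventSystem.current != null)
        {
            return; // an event system is already present in the scene
        }

        var go = new GameObject("EventSystem");
        go.AddComponent<EventSystem>();
        go.AddComponent<StandaloneInputModule>(); // mouse/keyboard input on standalone platforms
        go.AddComponent<TouchInputModule>();      // touch input on mobile devices
    }
}
```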

Event delegation

The second step is to decide where the event should go. The party responsible for this is the EventSystem script. There should be only one in each scene, and it is informed about the events that are generated by the input modules.

To do its job, the event system utilizes the raycasters in the scene. Their only job is to shoot rays through the scene, originating at the pointer/finger position of an event, e.g. a tap. They deliver a list of hit objects sorted by distance along the ray, so the closest hits come first in the list.

Those hit results are used to decide which objects the events are sent to. The first hit object that wants to consume the event gets it.
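
To make that delegation step a bit more tangible, here is a small sketch that runs the same kind of raycast query manually, for example to debug which object would receive an event; PointerProbe is an illustrative name:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

// Sketch for debugging the delegation step: ask the EventSystem to run all raycasters
// for the current pointer position and log the sorted hit list. "PointerProbe" is just
// an illustrative name, not part of Unity.
public class PointerProbe : MonoBehaviour
{
    void Update()
    {
        if (Input.GetMouseButtonDown(0) && EventSystem.current != null)
        {
            var pointerData = new PointerEventData(EventSystem.current)
            {
                position = Input.mousePosition
            };

            var results = new List<RaycastResult>();
            EventSystem.current.RaycastAll(pointerData, results);

            // The first entry is the closest hit, i.e. the object that is asked first
            // whether it wants to consume the event.
            foreach (var result in results)
            {
                Debug.Log("Hit: " + result.gameObject.name);
            }
        }
    }
}
```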

Event consumption


If an object wants to handle an event, it needs two things:

  • A component that can be detected by a raycaster (e.g. a Graphic for the GraphicRaycaster or a collider for the PhysicsRaycaster/Physics2DRaycaster)
  • An implementation of the interface that corresponds to the event

For each event there is a specific interface that has to be implemented, e.g.:

  • IPointerClickHandler
  • IBeginDragHandler
  • IPointerEnterHandler

A full overview of supported events can be found at http://docs.unity3d.com/Manual/SupportedEvents.html
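
As a simplified example of such a handler (ClickLogger is an illustrative name, not part of Unity):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Simplified example of an input handler ("ClickLogger" is an illustrative name).
// The same GameObject also needs something a raycaster can hit: a Graphic for the
// GraphicRaycaster or a collider for the PhysicsRaycaster/Physics2DRaycaster.
public class ClickLogger : MonoBehaviour, IPointerClickHandler, IPointerEnterHandler, IPointerExitHandler
{
    public void OnPointerClick(PointerEventData eventData)
    {
        Debug.Log("Clicked " + gameObject.name + " at " + eventData.position);
    }

    public void OnPointerEnter(PointerEventData eventData)
    {
        Debug.Log("Pointer entered " + gameObject.name);
    }

    public void OnPointerExit(PointerEventData eventData)
    {
        Debug.Log("Pointer left " + gameObject.name);
    }
}
```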

Use cases

The most common use case for the event system is the UI when using the new Unity UI. This works out of the box; an event system is even created automatically when a canvas is added to the scene, if one doesn’t exist yet.

Using the same system for UI and in-game

With the physics raycaster and the physics 2D raycaster you can use it for in-game interactions in just the same, easy way. Using the same event system for UI and in-game input has the advantage that it solves ordering issues.

In the past you may have checked your in-game input for “IsOnUI” before handling it. This isn’t necessary anymore, as the input events are intercepted by the UI if it lies above an in-game object. The in-game object only gets the input if it was really hit.
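
For comparison, this is roughly what such an old-style guard could have looked like; with a raycaster on the camera and handlers on the in-game objects it is no longer needed (LegacyInputCheck is an illustrative name):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Sketch of the old-style guard mentioned above ("LegacyInputCheck" is an illustrative name).
// With a PhysicsRaycaster on the camera and event handlers on the in-game objects this check
// becomes unnecessary, because the UI consumes the event before it reaches the game.
public class LegacyInputCheck : MonoBehaviour
{
    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            // The manual "is the pointer over the UI?" check the event system makes obsolete:
            if (EventSystem.current != null && EventSystem.current.IsPointerOverGameObject())
            {
                return; // pointer is over a UI element, ignore the click for in-game logic
            }

            // ... handle the in-game click here ...
        }
    }
}
```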

Easy deactivation of interactions

To execute actions from the game we already have a nice system in place where we just invoke a public method of a data context. All we needed was a trigger that initiates the invocation. So we created some generic input handlers (a sketch follows below the list):

  • DragHandler which provides Unity event triggers for BeginDrag, Drag and EndDrag
  • HoverHandler with Unity event triggers for PointerEnter and PointerExit
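
A sketch of how such a generic handler might look (the exact names and events of our actual implementation may differ): it simply forwards the interface callbacks to UnityEvents, so the reaction can be wired up in the inspector.

```csharp
using UnityEngine;
using UnityEngine.Events;
using UnityEngine.EventSystems;

// Sketch of a generic drag handler: it only forwards the interface callbacks to
// UnityEvents, so the actual reaction can be hooked up in the inspector.
public class DragHandler : MonoBehaviour, IBeginDragHandler, IDragHandler, IEndDragHandler
{
    [System.Serializable]
    public class DragEvent : UnityEvent<PointerEventData> { }

    public DragEvent BeginDrag = new DragEvent();
    public DragEvent Drag = new DragEvent();
    public DragEvent EndDrag = new DragEvent();

    public void OnBeginDrag(PointerEventData eventData)
    {
        BeginDrag.Invoke(eventData);
    }

    public void OnDrag(PointerEventData eventData)
    {
        Drag.Invoke(eventData);
    }

    public void OnEndDrag(PointerEventData eventData)
    {
        EndDrag.Invoke(eventData);
    }
}
```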

If we want to disable interactions with some in-game objects, we now only have to disable their input handlers. Another possibility is to disable their colliders, so the physics raycaster no longer finds them. There is even a Unity default layer called Ignore Raycast (ID: 2).

Disabling the interaction like this allows pretty good cooperation with other features, as the input events are correctly forwarded to other interactable objects behind the disabled one.

A practical example: when not in the edit mode for a building, you can’t move it via drag. Moving the camera by dragging should be possible nonetheless. By disabling the drag handler on the building, dragging the camera is possible even when the drag starts above the building. If the handler were still active and the dragging were disabled in a different way, the event would still be consumed by the building and the camera couldn’t be dragged above it.
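
A sketch of that deactivation, assuming the DragHandler component from the sketch above sits on the building object (the class and field names are illustrative):

```csharp
using UnityEngine;

// Sketch of the deactivation described above, assuming the DragHandler component from the
// previous sketch sits on the building object ("building" and "SetEditMode" are illustrative).
public class BuildingEditMode : MonoBehaviour
{
    public GameObject building;

    public void SetEditMode(bool editing)
    {
        // Outside of edit mode the drag handler is disabled, so the building no longer
        // consumes drag events and dragging the camera keeps working, even when the
        // drag starts above the building.
        var dragHandler = building.GetComponent<DragHandler>();
        if (dragHandler != null)
        {
            dragHandler.enabled = editing;
        }
    }
}
```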


In general this kind of deactivation makes the features pretty independent from each other, as the event system decides on the delegation order and chooses different paths when handlers are disabled.

Conclusion

I’m pretty content with the new system Unity provided to handle input, and I’m glad they didn’t limit it to the UI. Using it for the whole game makes sure that the order in which events are delegated is correct, and it removes the need to check whether an event should be consumed at all (e.g. whether it happened above the UI).

In the end it makes the features of the game that require user interaction much more independent from each other. The features can disable their interactions by simply disabling their event handlers, and the input is correctly delegated to the next object by the event system.

As Unity only covers the basic input events (enter/exit, down/up/click, beginDrag/drag/endDrag), I see a need to extend the event system a bit with custom events, e.g. for pinching or long presses. There seem to be some places to add custom logic, like custom messages or the possibility to add your own input modules. The whole event system is open source, which should make it easier to find the right point to start.
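
To illustrate the idea, here is a rough sketch of one possible extension point: a custom event interface that a custom input module could dispatch via ExecuteEvents (ILongPressHandler and the helper class are illustrative names, and the actual long-press detection is not shown):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Rough sketch of one possible extension point: a custom event interface that a custom
// input module (or any other script) could dispatch via ExecuteEvents. The long-press
// detection itself is not shown; "ILongPressHandler" is an illustrative name.
public interface ILongPressHandler : IEventSystemHandler
{
    void OnLongPress(PointerEventData eventData);
}

public static class LongPressEvents
{
    // Sends the custom event to the target object, just like the built-in events are sent.
    public static void Execute(GameObject target, PointerEventData eventData)
    {
        ExecuteEvents.Execute<ILongPressHandler>(
            target,
            eventData,
            (handler, data) => handler.OnLongPress((PointerEventData)data));
    }
}
```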

I hope to find the time for it soon, which would definitely be worth a post of its own 🙂

  • MrGrievous

    Thank you!

  • Evgeni Petrov

    Hello, what about handling an input event but allowing it to propagate further?

    • Hi Evgeni,
      Yeah, those are tricky cases. Right now I have already created a custom InputModule for our projects which works with gestures and a bit differently from the Unity way. So I can’t tell you how it is done with the standard Unity InputModules.
      In previous projects I could often solve those issues by using a different input event (e.g. Press instead of Release) and/or rearranging the UI hierarchy. But it really depends on the specific use case.
      If you like I can send you the current version of our InputModule, but it’s still in development and not meant to be released yet.