Introduction

The Unity Input System package (2019+) provides a new way to map input actions from a wide range of devices to any sort of application interaction. Communication between the controls and the input map consists of a sender and a receiver. The path used by the sender must match a device or action path defined in the project's InputActions asset.


The GameDriver XR Simulated Input plugin allows tests to be written for projects that utilize the Input System package. Tests can be executed as if a physical device were present, allowing for automated test execution independent of the physical device. Simulated devices are created by the GameDriver agent, which processes input commands passed into the project using the GameDriver API client.


Build Prerequisites

Before getting started with the GameDriver XR Simulated Input plugin, it is essential to have the following present in your Unity project.

  • Unity Input System package v1.3.0 or later

  • Add the GameDriver Agent to an empty object in the first loaded scene in your project. When in Play mode, you should see an additional component added to this object automatically. That is the XRSimulatedInput plugin.

  • The scripting define symbol GDIO_UNITY_NEW_INPUT_IL2CPP needs to be present for standalone IL2CPP builds. This can be added in the Project Settings > Player > Script Compilation section (Scripting Define Symbols), or from an editor script as shown in the sketch after this list.

  • For standalone builds using the Input System Package with GameDriver, we recommend setting the “Development Build” flag. In some isolated cases, IL2CPP builds have experienced issues when not using this build setting.
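
If you prefer to set this define symbol from an editor script rather than through the Project Settings UI, a minimal sketch is shown below. This is an illustration only; the class and menu item names are arbitrary, and it assumes the standalone build target.

#if UNITY_EDITOR
using UnityEditor;

// Adds the GDIO_UNITY_NEW_INPUT_IL2CPP scripting define symbol for standalone builds.
public static class GameDriverDefines
{
    [MenuItem("Tools/GameDriver/Add IL2CPP Define Symbol")]
    public static void AddDefine()
    {
        const string symbol = "GDIO_UNITY_NEW_INPUT_IL2CPP";
        string defines = PlayerSettings.GetScriptingDefineSymbolsForGroup(BuildTargetGroup.Standalone);
        if (!defines.Contains(symbol))
        {
            PlayerSettings.SetScriptingDefineSymbolsForGroup(BuildTargetGroup.Standalone, defines + ";" + symbol);
        }
    }
}
#endif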

Quick Start

To work with simulated XR devices using GameDriver, follow these simple steps. Note: when the appendName argument is set to true, the prefix GDIO is automatically prepended to the simulated device name and must be included in the paths passed to input commands.

  1. Add device(s) by first creating your simulated devices. The device name can be anything appropriate for the device you want to simulate. For example:


    1. Create a simulated HMD:

api.CreateInputDevice("Unity.XR.Oculus.Input.OculusHMD", "OculusHMD", new string[] { "GDIOHMD" });


    2. Create simulated controller(s):

api.CreateInputDevice("Unity.XR.Oculus.Input.OculusTouchController", "OculusLeftHand", new string[] { "LeftHand" }, true);
api.CreateInputDevice("Unity.XR.Oculus.Input.OculusTouchController", "OculusRightHand", new string[] { "RightHand" }, true);


    3. To ensure that the controllers are actively tracked, use the following commands:

api.IntegerInputEvent("GDIOOculusLeftHand/trackingState", 63, 1);
api.IntegerInputEvent("GDIOOculusLeftHand/trackingState", 63, 1);

 

  2. Utilize these devices by calling input commands, passing in the desired action path. Examples below:

    1. To update the HMD position, use the ‘centerEyePosition’ as shown below:

api.Vector3InputEvent("GDIOOculusHMD/centerEyePosition", new Vector3(0, 0, -5), 100);


    2. To update the controller position:

api.Vector3InputEvent("GDIOOculusLeftHand/devicePosition", new Vector3(0, 0, -5), 100);


    3. To update the controller rotation. Note: once you have finished updating the rotation, add another call to reset it to (0, 0, 0) to avoid unwanted offsets later on.

api.QuaternionInputEvent("GDIOOculusLeftHand/deviceRotation", api.EulerToQuat(-10f, 0f, 0f), 100);


    4. To press a button. The /gripPressed path below can be replaced with the path of any other button as required.

api.ButtonPress("GDIOOculusLeftHand/gripPressed", 100, 1f);


    5. To move the joystick on the x or y-axis:

api.Vector2InputEvent("GDIOOculusLeftHand/Primary2DAxis", new Vector2(0f, -1f), 100);


Understanding Input System Device Layouts

Device paths in the map are written as <Device>/inputControlName and are case-sensitive. For example, the following are unique paths:


/deviceName/inputControlName

/DeviceName/InputControlName


Paths in the map can contain the following symbols:

  • Name

  • <LayoutName>

  • {usageOrTagName}

  • #(displayName)


For example, if the HMD position is mapped to be received as:

    <XRHMD>/centerEyePosition


The command that sends it can target a specific device by using the device name:

    /GDIOHMD/centerEyePosition


Or all devices of that layout type, using:

    <XRHMD>/centerEyePosition


<XRHMD> means that the device in question was created having an XRHMD as a base layout. The map will look for any device of this type, and check the inputControl of that name. If the map is requesting something like:


    <XRController>{LeftHand}/devicePosition


This is looking for any device of type XRController, but only those tagged LeftHand. In this case, the tag must be passed to the device when it is created, or the map won't identify it when receiving the command.
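
For example, a controller created with the LeftHand tag can then be addressed through the layout and tag rather than by name. This is a minimal sketch using the Oculus layout and control names from the Quick Start section; the position value is illustrative:

// Create a left-hand controller and tag it so that <XRController>{LeftHand} paths resolve to it.
api.CreateInputDevice("OculusTouchController", "GDIOOculusLeftHand", new string[] { "LeftHand" });

// Send a position event addressed by base layout and tag instead of by device name.
api.Vector3InputEvent("<XRController>{LeftHand}/devicePosition", new Vector3(0f, 1f, 0.5f), 100);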


Simulated Device Creation

The GameDriver Agent allows you to add simulated devices for testing interactions and inputs without the need to plug in a physical device during automated testing. CreateInputDevice is used to create a generic device, and CreateInputDeviceFromDescription is used to create a specific device from a JSON definition exported from the Unity editor. However, simulated device usage is the same for both methods.


Creating Generic devices


Device creation using CreateInputDevice works as follows:

api.CreateInputDevice("LayoutName", "CustomName", new string[] { "tags" }, false);


Note: the LayoutName and CustomName fields need to be different. For example, when creating a Meta Quest 2 HMD, we can use the following:

api.CreateInputDevice("OculusHMD", "GDIOOculusHMD");


However, for the controllers, it is important that the tag matches what the map requested.

api.CreateInputDevice("OculusTouchController", "GDIOOculusLeftHand", new string[] { "LeftHand" });
api.CreateInputDevice("OculusTouchController", "GDIOOculusRightHand", new string[] { "RightHand" });


Creating Specific Device Layouts


Device creation using CreateInputDeviceFromDescription works by first exporting the device definition from the Unity Editor using the Input Debugger (Window menu > Analysis > Input Debugger). You can find most devices under Layouts > Specific Devices.


Paste the resulting layout into a .json file (e.g. PlayStation.json for a PlayStation controller), and save it in a directory accessible from your test class.


To load the layout, use the following commands. The path to the PlayStation.json file is relative to the build directory of the test; here it references a file three directory levels up, in the root of the test folder:

string device = api.LoadDeviceDescription($"../../../PlayStation.json");
api.CreateInputDeviceFromDescription(device, "GDIOGamepad", new string[] { "Gamepad" }, true, "Gamepad", 60);


Working with Simulated devices


Input paths can be obtained using the GameDriver command:


public string MapInputControlPathsUsed(InputMapOutputTypes outputType, int timeout = 30)


Some inputs, such as the LeftHand position, are mapped to more than one path, meaning any of them can be used. Some paths are specific to certain devices; Meta Quest, for example, doesn't have a /pointerPosition path.
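
As a sketch, this command can be called at the start of a test to dump the paths the map actually uses. The InputMapOutputTypes value shown here is an assumption; check the enum in your GameDriver client for the available output types:

// InputMapOutputTypes.Simple is assumed here - substitute the member your client version exposes.
string paths = api.MapInputControlPathsUsed(InputMapOutputTypes.Simple);
Console.WriteLine(paths);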


 


Looking at the paths, it's easier to see how to send events using simulated inputs.


    api.Vector3InputEvent("<XRHMD>/centerEyePosition", val, frame);


or more specifically


    api.Vector3InputEvent("GDIOHMD/centerEyePosition", val, frame);


where GDIOHMD is the device name. For the hands, the event could target all controllers:


    ("<XRController>/devicePosition", val, frame)


or for only controllers with the tag LeftHand


    ("<XRController>{LeftHand}/devicePosition", val, frame)


or  ("GDIOLeftHand/devicePosition", val, frame)


where GDIOLeftHand is the device name. The {} part indicates a tag that must be assigned to the device when it is created.


Code example: the /trackingState control is used to enable tracking of the hands. In this example the value must be set to 63 (all tracking state flags); the required value may differ depending on the implementation.
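
The calls below repeat the tracking setup from the Quick Start section, assuming the controllers were created with the names used earlier in this document and the GDIO prefix appended:

// Enable tracking on both simulated controllers (63 sets all tracking state flags).
api.IntegerInputEvent("GDIOOculusLeftHand/trackingState", 63, 1);
api.IntegerInputEvent("GDIOOculusRightHand/trackingState", 63, 1);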


Supported Commands


The GameDriver API supports the following commands for use with the Unity Input System:

  • AxisPress
  • ButtonPress
  • IntegerInputEvent
  • Vector3InputEvent
  • Vector2InputEvent
  • QuaternionInputEvent


Details on the use of these commands can be found in the API reference.
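
As an illustrative sketch only, the snippet below strings together several of these commands using the device names and input paths from the Quick Start section. AxisPress is omitted; its parameters are covered in the API reference. The position and rotation values are placeholders:

// Position the HMD and left controller, push the joystick forward, then press the grip.
api.Vector3InputEvent("GDIOOculusHMD/centerEyePosition", new Vector3(0f, 1.6f, 0f), 100);
api.Vector3InputEvent("GDIOOculusLeftHand/devicePosition", new Vector3(-0.2f, 1.4f, 0.3f), 100);
api.QuaternionInputEvent("GDIOOculusLeftHand/deviceRotation", api.EulerToQuat(-10f, 0f, 0f), 100);
api.Vector2InputEvent("GDIOOculusLeftHand/primary2DAxis", new Vector2(0f, 1f), 100);
api.ButtonPress("GDIOOculusLeftHand/gripPressed", 100, 1f);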



Usage Examples


A common question is how to interact with an object in a scene using the XR controller. To click an object in world space, you need to aim the controller at the object you intend to interact with. This can be done in two ways: using the built-in UnityEngine.Transform functions, or using the XR device inputs. For example, you can set the initial position of the objects representing the HMD and controllers by manipulating the UnityEngine.Transform.position property directly, replacing the paths to match the objects representing your XR Rig and controllers, and the Vector3(0,0,0) values with wherever you want to position the objects.

// Set the XR Rig to 0, 0, 0
api.SetObjectFieldValue("//*[@name='XR Rig']/fn:component('UnityEngine.Transform')", "position", new Vector3(0, 0, 0));

// Set the hands to a good position relative to the HMD - uses the localPosition since it is a child of the Camera and XR Rig. These coordinates are examples, and should be adjusted to suit your needs.
api.SetObjectFieldValue("//*[@name='LeftHand Controller']/fn:component('UnityEngine.Transform')", "localPosition", new Vector3(-0.24f, -0.1f, 0.24f));
api.SetObjectFieldValue("//*[@name='RightHand Controller']/fn:component('UnityEngine.Transform')", "localPosition", new Vector3(0.24f, -0.1f, 0.24f));


Alternatively, you can set the HMD and Controller positions using the simulated device path:

api.Vector3InputEvent("GDIOOculusHMD/CenterEyePosition", new Vector3(0, 0, 0), 1);
api.Vector3InputEvent("GDIOOculusLeftHand/DevicePosition", new Vector3(-0.24f, -0.1f, 0.24f), 1);
api.Vector3InputEvent("GDIOOculusRightHand/DevicePosition", new Vector3(0.24f, -0.1f, 0.24f), 1);


Then, capture the position of the object you want to interact with and call the built-in Transform.LookAt method to point the controller at the object:

Vector3 button = api.GetObjectPosition("//*[@name='ButtonPath']");
api.CallMethod("//*[@name='LeftHandPath']/fn:component('UnityEngine.Transform')", "LookAt", new object[] { new Vector3(button.x, button.y, button.z) });


You can also use the /deviceRotation input here, although the correct rotation value can be tricky to determine and may require some trial and error in the Unity Editor to obtain. For example:

api.QuaternionInputEvent("GDIOOculusRightHand/DeviceRotation", new Quaternion(0.2142892f, -0.1563346f, -0.350345f, 0.8982751f), 1);

Lastly, send whichever button press your app uses to click on the object. For example, on a Meta Quest 2 we can use the /gripPressed input path, as follows:

api.ButtonPress("GDIOOculusLeftHand/gripPressed", (ulong)(api.GetLastFPS() * 3), 0f);

The Recorder can give you a good idea of what inputs and positions are used in your project.