So, now that we understand a bit about how the XR system works under the hood, let's look at the core XR APIs we'll need to use to create VR apps on mobile devices. These core APIs of Unity reside in the UnityEngine.XR namespace. We're not going to cover everything in the XR namespace, because much of it is meant for room-scale experiences, but you should still read through all of the information in the XR section of the documentation to get a more complete understanding of VR beyond just mobile. All of the methods in the XR namespace are static methods, since there is only one XR system in your app. Most of the XR API functionality you'll use throughout the course is part of the InputTracking class in the XR namespace. The InputTracking class has the functionality you need to get information about everything the VR system is tracking. It consists mainly of a bunch of static methods, but there are a few events as well. The first two methods are the most convenient ones to use if you're not using a Tracked Pose Driver, but they offer the least flexibility. GetLocalPosition and GetLocalRotation let you query the XR system for the location and orientation of a tracked object. They accept an argument from the XRNode enumeration, which lists every possible type of tracked object currently known. While they're convenient, these methods are not the recommended way to get tracking information. For example, there's no way to get error information if you ask for tracking information for a type of object the XR session doesn't support. The recommended practice is to use the GetNodeStates method, which provides much more information about each tracked node. GetNodeStates lets you get a complete list of all the tracked objects known to the VR system. You can iterate through the list and use the methods of the XRNodeState class to get a much richer set of tracking information about each one.
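As a quick illustration, here's a minimal sketch of the GetNodeStates pattern just described. The class and log format are my own; the InputTracking and XRNodeState calls are the ones discussed above.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class TrackedNodeLogger : MonoBehaviour
{
    // Reuse one list each frame so we don't allocate garbage.
    private readonly List<XRNodeState> nodeStates = new List<XRNodeState>();

    void Update()
    {
        // Fills the list with every node the XR system currently knows about.
        InputTracking.GetNodeStates(nodeStates);

        foreach (XRNodeState state in nodeStates)
        {
            // Unlike GetLocalPosition, TryGetPosition tells you when the
            // data simply isn't available by returning false.
            if (state.TryGetPosition(out Vector3 position))
            {
                Debug.Log($"{state.nodeType} at {position}, tracked: {state.tracked}");
            }
        }
    }
}
```

Notice that the XRNode type comes back on each state via the nodeType property, so you can pick out just the controllers or the head node from the full list.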
They include methods to retrieve the velocity and acceleration of the tracked object, and methods to get the position and orientation that are safer than GetLocalPosition and GetLocalRotation because they give you some hints about the reliability of the information. You can't access these directly as properties; you have to use one of the TryGet methods. This is because the information may not be reliable, so if you plan to use this info, you need to code your app so it will still work if it's not available. You would need to do that anyway if you want to provide the best user experience. The InputTracking class also offers four events, and you should always use them in your VR apps. nodeAdded and nodeRemoved tell you when the XR session detects a new controller or headset, and you'll generally only get one of these events at the beginning of the XR session for each tracked device. It may take a while to get the first nodeAdded event for a controller while the user tries to find it and make it start sending information. The trackingAcquired and trackingLost events are much more likely to occur throughout your XR session, whenever something prevents a controller or headset from sending tracking information in a timely manner. In mobile VR, because the controller sends tracking information over Bluetooth and there's no optical tracking that could get interrupted, you're unlikely to get these events often, but it is still a good practice to deal with them when they do occur. When tracking is lost, you should avoid updating that controller's position directly from the API until you've verified what's going on. You could just leave it where it was the last time you were sure about it, but if you want to do a really good job, you should keep track of the controller's velocity and assume it's still moving at the same speed until you get tracking info again.
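Subscribing to those four events looks something like the sketch below. The handler names and log messages are placeholders of my own; the event names on InputTracking are the ones just mentioned.

```csharp
using UnityEngine;
using UnityEngine.XR;

public class TrackingEventHandler : MonoBehaviour
{
    void OnEnable()
    {
        InputTracking.nodeAdded += OnNodeAdded;
        InputTracking.nodeRemoved += OnNodeRemoved;
        InputTracking.trackingAcquired += OnTrackingAcquired;
        InputTracking.trackingLost += OnTrackingLost;
    }

    void OnDisable()
    {
        // Always unsubscribe so disabled objects don't leave dangling handlers.
        InputTracking.nodeAdded -= OnNodeAdded;
        InputTracking.nodeRemoved -= OnNodeRemoved;
        InputTracking.trackingAcquired -= OnTrackingAcquired;
        InputTracking.trackingLost -= OnTrackingLost;
    }

    private void OnNodeAdded(XRNodeState state)
    {
        Debug.Log($"Node added: {state.nodeType}");
    }

    private void OnNodeRemoved(XRNodeState state)
    {
        Debug.Log($"Node removed: {state.nodeType}");
    }

    private void OnTrackingAcquired(XRNodeState state)
    {
        Debug.Log($"Tracking acquired: {state.nodeType}");
    }

    private void OnTrackingLost(XRNodeState state)
    {
        // Stop trusting the API's position for this node; hold the last
        // known pose or extrapolate from its last known velocity instead.
        Debug.Log($"Tracking lost: {state.nodeType}");
    }
}
```

Each handler receives the XRNodeState of the affected node, so you can tell exactly which device the event refers to.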
When you get the trackingLost event, you should also set a timer and implement a graceful fade-out of the lost controller instead of hiding it immediately. This is less immersion-breaking for the user, and if you don't get a trackingAcquired signal again within a few seconds, you should probably put up a notification of some sort so that the user knows you've lost tracking. It's far more serious if the headset loses tracking than if a controller loses tracking. In that case, you probably want to fade to black and put up a text message asking the user to remove the headset. There are a couple more things to note about mobile VR controller tracking as opposed to room-scale tracking. If you've ever used a room-scale VR system, it's astonishing how accurately it knows where the controllers are and how quickly it responds to their movements. You may be in for a bit of disappointment when you use a mobile VR controller, because currently the controller is only tracked in terms of its rotation, or orientation. In Unity, you do get position information about where the controller is in space, but that information is provided by what's called an arm model. The arm model assumes that you have an arm and are holding the controller in your hand. By doing so, it can calculate where the controller should be in space using inverse kinematics. While the results are impressive, they still fall far short of the accuracy of a room-scale system. To get the best results, you should ensure that you start each mobile VR session by recentering the arm model using the vendor's approved method. In both cases, holding down the home button on the controller until the circle fills will reset the arm model for the correct forward direction and arm length.
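The timer-plus-fade idea can be sketched with a coroutine like the one below. The fade duration, warning delay, and the assumption that the controller's material exposes a standard color with an alpha channel are all mine, not part of any particular SDK.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR;

public class ControllerFadeOut : MonoBehaviour
{
    // Hypothetical tunables: how long the fade takes, and how long to
    // wait after losing tracking before warning the user.
    public float fadeDuration = 1.0f;
    public float warnAfterSeconds = 3.0f;
    public Renderer controllerRenderer;  // assumed to use a material with a color/alpha

    void OnEnable()  { InputTracking.trackingLost += OnTrackingLost; }
    void OnDisable() { InputTracking.trackingLost -= OnTrackingLost; }

    private void OnTrackingLost(XRNodeState state)
    {
        if (state.nodeType == XRNode.RightHand)
        {
            StartCoroutine(FadeOutAndWarn());
        }
    }

    private IEnumerator FadeOutAndWarn()
    {
        // Gradually fade the controller model instead of hiding it abruptly.
        Color color = controllerRenderer.material.color;
        for (float t = 0f; t < fadeDuration; t += Time.deltaTime)
        {
            color.a = 1f - (t / fadeDuration);
            controllerRenderer.material.color = color;
            yield return null;
        }

        // If tracking still hasn't come back after a few seconds, notify the user.
        yield return new WaitForSeconds(warnAfterSeconds - fadeDuration);
        Debug.Log("Controller tracking lost -- show a user-facing notification here.");
    }
}
```

In a real app you would also stop this coroutine from a trackingAcquired handler and fade the controller back in, rather than leaving it hidden.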