A particularly useful aspect of eye tracking in the CCK is that it is a headset-agnostic feature. At runtime, the headset running the CCK build detects the available hardware and enables the appropriate interface. Gaze detection then works either from where the user's eyes are actually looking or, as a fallback, from where the headset's main player camera is pointing.
The resulting ray can be used in two ways: we can compare where the projection is pointing against the direction to an object's origin, or we can use Unity's internal physics and cast for a collider. Any object imported into a CCK project as a prop is viable as an eye tracking target.
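The first of these two approaches, comparing the gaze ray's direction against the direction to a prop's origin, reduces to an angle test. A minimal, engine-agnostic sketch (the function and parameter names are illustrative, not the CCK's actual API) might look like:

```python
import math


def is_gazing_at(origin, direction, target, max_angle_deg=5.0):
    """Return True if the gaze ray (origin, direction) points within
    max_angle_deg of the target position. Vectors are (x, y, z) tuples.
    The threshold angle is a hypothetical tuning parameter."""
    # Vector from the gaze origin to the target's origin.
    to_target = tuple(t - o for t, o in zip(target, origin))
    dist = math.sqrt(sum(c * c for c in to_target))
    if dist == 0.0:
        return True  # gaze origin sits exactly on the target

    # Angle between the gaze direction and the direction to the target.
    mag = math.sqrt(sum(c * c for c in direction))
    dot = sum(d * t for d, t in zip(direction, to_target))
    cos_angle = max(-1.0, min(1.0, dot / (mag * dist)))
    return math.degrees(math.acos(cos_angle)) <= max_angle_deg
```

The collider approach instead hands the ray to the physics engine (e.g. `Physics.Raycast` in Unity), which is more precise for large or oddly shaped props but requires each target to carry a collider.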
Once the eye tracking node is active, every frame calls into the appropriate hardware interfacing scripts for the headset in use. On Oculus, we call OVR_Eyegaze (Oculus plugin); on PICO, we use the PXR_EyeTracking class's GetCombineEyeGazeVector (PICO Integration package). The node offers a combination of settings defining how the completion state should be treated, all of which revolve around how long the user has looked at the prop.
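Those completion settings all reduce to a dwell-time check accumulated across frames. A minimal sketch of that logic (class and parameter names are illustrative assumptions, not the CCK's actual settings) could look like:

```python
class DwellTimer:
    """Accumulates how long the user's gaze has rested on a prop and
    reports completion once a dwell threshold is reached. Whether looking
    away resets progress is one of the hypothetical per-node settings."""

    def __init__(self, required_seconds, reset_on_look_away=True):
        self.required = required_seconds
        self.reset_on_look_away = reset_on_look_away
        self.elapsed = 0.0

    def update(self, gazing, delta_time):
        """Call once per frame; returns True once the dwell is complete."""
        if gazing:
            self.elapsed += delta_time
        elif self.reset_on_look_away:
            self.elapsed = 0.0  # progress is lost when the gaze leaves
        return self.elapsed >= self.required
```

Usage follows the per-frame pattern described above: feed the timer the current frame's gaze-hit result and delta time, and treat a True return as the node's completion state.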
The result is an accessible, generalized way to use a very valuable telemetry mechanic, particularly effective in learning scenarios. If you ever want to check whether a user's attention was directed appropriately at a particular object or effect within your virtual reality experience, eye tracking is the way to do it.
![](https://static.wixstatic.com/media/d49b76_edca011c42f343cb96c159198e9fac07~mv2.png/v1/fill/w_980,h_405,al_c,q_90,usm_0.66_1.00_0.01,enc_auto/d49b76_edca011c42f343cb96c159198e9fac07~mv2.png)