Eye Tracking - SkillsVR

Luke McIntosh

A particularly useful aspect of eye tracking in the CCK is that it is a headset-agnostic feature. At runtime, the headset running the CCK build detects the available hardware and responds appropriately: eye tracking either functions based on where the user's eyes are actually looking, or falls back to where the player's main headset camera is pointing.
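The runtime fallback described above can be sketched as a simple dispatch. This is an illustrative sketch only, not the CCK's actual API; the function and field names here are assumptions.

```python
# Hypothetical sketch of the CCK's headset-agnostic fallback: if the
# detected headset exposes eye-tracking hardware, use the combined eye
# gaze; otherwise fall back to the player's main-camera forward vector.
def select_gaze_source(headset: dict) -> str:
    if headset.get("has_eye_tracking"):
        return "eye_gaze"      # real eye-tracking hardware available
    return "head_camera"       # fall back to head/camera direction

print(select_gaze_source({"has_eye_tracking": True}))   # eye_gaze
print(select_gaze_source({"has_eye_tracking": False}))  # head_camera
```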



The resulting gaze ray can be used in two ways: we can compare where the projection is pointing against an object's origin, or we can use Unity's internal physics and cast for a collider. Any object imported into a CCK project as a prop is viable as an eye-tracking target.
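The first approach, comparing the ray against an object's origin, amounts to an angle check between the gaze direction and the direction to the target. Here is a minimal Python sketch of that check; the function name and the 5-degree threshold are illustrative assumptions, not CCK values.

```python
import math

def looking_at(gaze_origin, gaze_dir, target_pos, max_angle_deg=5.0):
    """Return True if the gaze ray points within max_angle_deg of the
    target's origin. Vectors are (x, y, z) tuples."""
    to_target = [t - o for t, o in zip(target_pos, gaze_origin)]

    def normalize(v):
        mag = math.sqrt(sum(c * c for c in v))
        return [c / mag for c in v]

    d = normalize(list(gaze_dir))
    t = normalize(to_target)
    # Clamp the dot product to guard against floating-point drift.
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(d, t))))
    return math.degrees(math.acos(dot)) <= max_angle_deg

print(looking_at((0, 0, 0), (0, 0, 1), (0, 0, 5)))  # True: dead ahead
print(looking_at((0, 0, 0), (0, 0, 1), (5, 0, 5)))  # False: 45 degrees off
```

The second approach, a physics raycast against a collider, is stricter: it requires the prop to have a collider but respects occlusion, whereas the origin-angle check can "see" through walls.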


Once the eye tracking node is active, every frame calls into the appropriate hardware-interfacing scripts depending on which kind of headset we have. On Oculus, we call OVR_Eyegaze (Oculus plugin); on PICO, we use the PXR_EyeTracking class to GetCombineEyeGazeVector (PICO Integration package). The node offers a combination of settings that define how the completion state should be treated, all of which revolve around how long the user looked at the props.
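The dwell-based completion settings can be modeled as a per-frame timer: the target completes once the gaze has rested on it for long enough. This is a sketch of one possible policy (resetting on look-away), assuming a hypothetical `DwellTracker` class, not the node's actual settings or API.

```python
class DwellTracker:
    """Illustrative dwell-time completion tracker: the target counts as
    complete once gaze has rested on it for `required` seconds."""

    def __init__(self, required=2.0):
        self.required = required
        self.elapsed = 0.0
        self.complete = False

    def update(self, gazing: bool, dt: float) -> bool:
        """Call once per frame with whether the prop is gazed at and the
        frame's delta time. Returns the current completion state."""
        if gazing:
            self.elapsed += dt
            if self.elapsed >= self.required:
                self.complete = True
        else:
            # One possible policy: looking away resets the timer.
            self.elapsed = 0.0
        return self.complete

tracker = DwellTracker(required=1.0)
print(tracker.update(True, 0.5))   # False: only 0.5s of dwell so far
print(tracker.update(True, 0.5))   # True: 1.0s reached, completion fires
```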


The result is an accessible, generalized way to make use of a very useful telemetry mechanic, one that is particularly effective in learning scenarios. If you ever want to check whether your user's attention was given to a particular object or effect within your virtual reality scene, eye tracking is the way to do it.


Simple example of setting up an eye tracking scene in the CCK




    ©2019 by Luke McIntosh. Respects to the band 'Caladan Brood'; they own the image featured here.
