SightLab VR Pro FAQ
What headsets and hardware is this compatible with?
SightLab VR Pro works with the following headsets:
For eye tracking:
Varjo Aero, XR-3, XR-4, VR-3
Meta Quest Pro
Vive Focus 3
Vive Pro Eye
HP Omnicept
Pupil Labs (includes Vive Cosmos, Vive Pro, original Vive)
StarVR One
Tobii Pro
For headsets without eye tracking, you can use head position with the following:
Meta headsets (including the Quest, Quest 2, Quest 3 and Quest Pro with Oculus Link)
All SteamVR compatible headsets (Pimax, Valve Index and much more)
Windows Mixed Reality headsets
Pico headsets (head position only, connected via SteamVR link)
Vive XR Elite
Desktop mode
For additional hardware supported, see the Vizard Supported Devices page.
Can I use SightLab VR standalone?
SightLab VR Pro requires WorldViz Vizard software for its full functionality, as SightLab VR Pro is an add-on to our Vizard software engine. You can, however, still use software such as ShadowPlay to stream via the cloud to a device running in standalone mode.
Can I import 3D models in any usual format (obj, fbx, for example) for data collection?
You can import 3D models in the usual formats; .glTF, .obj, .fbx, and other common formats will work. Results may vary depending on how the textures are packed (the most consistent format is .glTF). For more information see this page.
We developed some applications in Unity 3D. Can we use them for eye-tracking?
Currently, SightLab only runs on the Vizard platform, but you can import assets from Unity by using their fbx exporter. Here's an article on doing that.
Can your software export data in CSV format? What other export formats are available?
Yes. The data files are exported in .csv and .txt formats.
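Because the exports are plain .csv files, they are easy to post-process with standard Python. Below is a minimal sketch using the standard library's csv module; the column names (timestamp, object, fixation_time) are hypothetical and will depend on your template and headset, so check the header row of your own export first.

```python
import csv
import io

# Hypothetical excerpt of a SightLab .csv export; the actual column
# names depend on your template and headset.
sample = io.StringIO(
    "timestamp,object,fixation_time\n"
    "0.011,tree,0.25\n"
    "0.262,bench,0.40\n"
)

# Sum one numeric column across all rows.
total = 0.0
for row in csv.DictReader(sample):
    total += float(row["fixation_time"])

print(f"total fixation time: {total:.2f}s")  # total fixation time: 0.65s
```

For a real session, replace the io.StringIO sample with `open('your_data_file.csv')`.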
What data is collected in the main template?
Number of fixations, time to first fixation, average fixation time, total fixation time, fixation timeline, gaze intersect position, fixation spheres to show amount of fixation time, time stamps, head position, pupil diameter, as well as the ability to add custom flags such as interactions. Additional data is available with custom code and varies per eye tracked headset (such as “eye openness” for the Vive Pro Eye). To add more data points see the data logger example.
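To make the metric names above concrete, here is an illustrative computation of a few of them from a list of fixation events. The (start time, duration) event format is an assumption for the sketch; in practice SightLab computes these metrics for you in the main template.

```python
# Each fixation as (start_time_s, duration_s); values are made up.
fixations = [(0.5, 0.20), (1.1, 0.35), (2.4, 0.15)]

num_fixations = len(fixations)              # number of fixations
time_to_first = fixations[0][0]             # time to first fixation
total_time = sum(d for _, d in fixations)   # total fixation time
average_time = total_time / num_fixations   # average fixation time

print(num_fixations, time_to_first, round(total_time, 2), round(average_time, 4))
# 3 0.5 0.7 0.2333
```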
How many users does the multi-user edition of SightLab VR Pro support?
Currently, the Multi-User version of SightLab is set up to support up to 5 users. If you wish to have more users in your application contact sales@worldviz.com and we may be able to customize to your needs.
Are there differences in functionality between the single and multi-user version?
The multi-user version has all the same functionality as the single user, but with the ability to have multiple participants in a session interacting together.
The Server and Client are not connecting in the Multi-User Script
Restarting the machine should fix this issue, as the operating system may have locked some resources, or stale network connections from a previous script execution may not have been properly cleaned up. Also, make sure you see the print statement "After Server Continue" and that the timer HUD is showing before starting the session.
How do I update the SRAnipal driver if using the Vive Pro Eye or the Vive Focus 3?
See this document on how to update your SRAnipal driver.
Not able to unpack the exe due to it being detected as a virus
You may need to temporarily disable real-time protection or add an exclusion in Windows Defender (make sure to turn it back on afterwards).
I don't have space to add all of the objects I'm trying to collect data on
See the Example Script - gaze_time_subnodes or the code attributes page for ways to add a large list of objects when the GUI doesn't allow you to select all of them.
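As a hedged illustration of the idea, you can build a large object list programmatically instead of clicking each item in the GUI. The naming pattern below is invented, and the exact call for handing the list to SightLab is not shown here; see the gaze_time_subnodes example for the actual API.

```python
# Generate many object names in code rather than selecting them one by
# one in the GUI. The 'tree_NN' naming scheme is a hypothetical example.
object_names = [f'tree_{i:02d}' for i in range(40)] + ['bench', 'fountain']

print(len(object_names))  # 42 objects, more than a GUI list might comfortably hold
# This list would then be passed to SightLab in code, as shown in the
# gaze_time_subnodes example script.
```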
Where can I get assets to build a scene with?
See this page for all the places to get assets.
During Session Replay the position of my environment or regions of interest change.
If this happens, you can use the flag sightlab.is_GLOBAL = 0
You can also see the example "gaze_time_subnodes" for how to get past this by setting the list of objects in code.
I'm getting a "Permission Denied" error when trying to run SightLab
This usually happens if your participant ID is either left blank or is the same as an earlier one that is still open in a spreadsheet editor program.
I'm not seeing my data file when I load the Session Replay
This most likely means there is an underscore in the participant ID name. Remove the underscore to see the file; underscores cause this issue.
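If you want to guard against this up front, a small helper (not part of SightLab, just an illustration) can strip underscores and stray whitespace from a participant ID before you start a session:

```python
def sanitize_participant_id(raw_id: str) -> str:
    """Remove underscores (which break Session Replay file lookup)
    and surrounding whitespace from a participant ID."""
    return raw_id.replace('_', '').strip()

print(sanitize_participant_id('sub_01 '))  # sub01
```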
Poor framerate on a 3D model scene
You can use various optimization methods (see this page). This is a simple couple of lines that could be tried:
env = sightlab.objects[0] #If using the GUI, add this after the experiment starts in the sightLabExperiment function
env.optimize(viz.OPT_REMOVE_REDUNDANT_NODES)
env.hint(viz.VBO_HINT)
Additionally, you can open the model in Inspector and choose the textures tab on the bottom right, then shift click to select all textures, right click and compress them.
How do I install this on a Mac?
Vizard (the software SightLab is built on) is Windows-based, but you can use "Parallels" to run Windows applications on your Mac. Development and basic testing with Vizard can be done on a VMware partition on a Mac, but for testing with VR peripherals like headsets, a PC is typically the better and more functional solution.
I need a Python library, but it requires a version of Python higher than 3.8
Search Google for the version of the library that supports Python 3.8, then install that version using the Package Manager: https://www.worldviz.com/post/tech-tip-using-the-vizard-package-manager-to-get-python-2-7-compatible-versions-of-python-libraries
I am not hearing audio from a Windows audio player
You may need to play the sound directly in SightLab/Vizard using audio = viz.addAudio('someFile.wav')
My textures appear to be flickering/ z-fighting
# Add this code to offset the texture in the z axis
z_offset_value = -1.1  # Adjust as needed
yourTexture.zoffset(z_offset_value)
Autocomplete isn't working
You either need to copy your script into the SightLab root folder, or sometimes you need to uninstall Vizard, delete the Vizard folder from Program Files, and reinstall (though you would then need to reinstall additional Python libraries).
I want to sample the data at a higher rate than the display
By default, the data (eye tracking, etc.) is tied to the refresh rate of the headset. You can disable this with the command viz.vsync(0): https://docs.worldviz.com/vizard/latest/#commands/viz/vsync.htm (note this should increase the rate above the display rate, but may not always reach the highest sample rate the device allows).
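After disabling vsync, you can verify the effective sample rate by inspecting consecutive timestamps in the exported data. The timestamps below are made up (4 ms apart, i.e. ~250 Hz); real values come from your own .csv log:

```python
# Consecutive sample timestamps in seconds (illustrative values).
timestamps = [0.000, 0.004, 0.008, 0.012, 0.016]

# Mean interval between samples, then invert to get a rate in Hz.
deltas = [b - a for a, b in zip(timestamps, timestamps[1:])]
rate_hz = 1.0 / (sum(deltas) / len(deltas))

print(round(rate_hz))  # 250
```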
For answers to all technical Vizard questions, please see the Vizard 7 Documentation FAQ or the additional Vizard FAQ page here.