SightLab VR Documentation
The SightLab package is a versatile and accessible tool for creating VR experiments using 3D models and 360-degree media. It is tailored for both non-programmers and researchers seeking advanced functionality. SightLab includes an intuitive, GUI-based experiment generator for easy setup and operation, an interactive session replay mode for in-depth after-action reviews, and a comprehensive library of templates and examples that extend its functionality. This application runs as an add-on to Vizard.
SightLab Multi-User allows you to run multiple users in the full SightLab VR experiment. If participants are on separate networks, you will need to connect via a VPN using third-party software such as TeamViewer (see instructions).
Video Tutorial
Tutorial on how to use the single-user SightLab VR
Webinar Presentation
Presentation from Alex Dimov of BIOPAC on using SightLab
- Release Notes
Changelog:
Version 1.10.0 4/15/24
Updated AI agent example
Added education application example (with optional AI add-on)
Full body avatars for Multi-User conversation interactions
Added navigation for server side view
Added foot tracking
Fixed issue with DataLogger where custom data and subsequent-trial timestamps did not start at 0 (also for the Omnicept and Visual Search examples; a normalization sketch follows this version's notes)
Added STIM file example for Video Player
Added instructions example for non-GUI to add start and end instructions easier
Updated custom settings code
Added Vive Tracker examples for object tracking and body tracking
Added screen capture to session replay
Updated Visual Search
Updated Audio recording replay to load saved file
Updated transcribe audio
Added empty option for avatar head
Ability to toggle off controllers for multi-user
Added Omnicept 360 video example
New avatars from Reallusion and Avaturn included
Rating v2 for Multi User
Rating choice attribute accessible with Rating v2
Updated Ratingv2 example
Updated Spatial Accuracy Eye Tracker Test to add headers for errors file
Fixed issue with Walk the Plank example
Fixed issue with smooth pursuit example
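For illustration, here is a minimal pandas sketch of the timestamp re-basing described in the DataLogger fix above; this is not SightLab's actual DataLogger code, and the file and column names are hypothetical:
```python
# Hypothetical sketch: re-base each trial's timestamps so every trial starts at 0.
# "tracking_data.csv", "trial" and "timestamp" are assumed names, not SightLab's
# actual file or column names.
import pandas as pd

df = pd.read_csv("tracking_data.csv")
# Subtract each trial's first timestamp from all of that trial's rows
df["timestamp"] -= df.groupby("trial")["timestamp"].transform("first")
df.to_csv("tracking_data_normalized.csv", index=False)
```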
Version 1.9.9 3/7/24
Updated support for mixed reality eye tracking (object identification and heatmaps)
Intelligent AI agent example (connects with OpenAI, Anthropic, ElevenLabs, and more)
Added walk path example for reviewing where a user had navigated
Updated heatmap visualizer for all trials
iPhone spatial video example
Added augmented reality to Biofeedback ball example
Added augmented reality to 3D Tablet menu
Updated Video Player
Updated face tracking module to include HTC face tracking
Fixed bug in face tracker saving example
Fixed issue with shadows on regions of interest
Adjusted documentation on how to automatically hide regions of interest
New ground plane model
Updated Seaborn Visualization example
Updated plotly visualization example
Added adjustment for brightness intensity on 360 videos
Added measuring tape tool option
Added ability to save transcript of audio recording
Updated Visual Search template
Added triggerDown and triggerUp events to vizconnect
Version 1.9.8 2/5/24
Added OpenXR hand tracking module for any supported HMD (Quest 3, Quest Pro, Focus 3, etc.)
Example for OpenXR hand tracking added
OpenXR based hardware option for Vive Focus 3
OpenXR based hardware option for HP Omnicept
Hardware options for generic OpenXR HMD (with and without eye tracking)
Face_tracker_data module for automatically collecting and saving facial expression data for any script using Quest Pro (can be adjusted to use Vive Face Tracker)
Updated Model Viewer script to send model names to AcqKnowledge, collect dwell time, and use the timer and rating GUI
Examples for generating walk paths, gaze paths, and visualizations using plotly (a walk path sketch follows this version's notes)
Updated Walk the Plank example
Updated driving to allow going up and down elevations
V2 rating module that can add strings of text as well as numerical selections
Updated collision module
Teleport to specific locations example
Avatar body example with seated, standing or tracked avatars
Updated Video Player
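As a rough sketch of the plotly walk path idea mentioned above (not the bundled example itself), the following assumes a CSV of logged head positions with hypothetical x, y, z columns:
```python
# Minimal sketch: draw a 3D walk path from logged head positions with plotly.
# "head_positions.csv" and its column names are assumptions for illustration.
import pandas as pd
import plotly.graph_objects as go

df = pd.read_csv("head_positions.csv")
fig = go.Figure(go.Scatter3d(
    x=df["x"], y=df["y"], z=df["z"],
    mode="lines+markers", marker=dict(size=2)))
fig.update_layout(title="Walk path")
fig.show()
```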
Version 1.9.7 1/5/24
Added mouse lock toggle on the semicolon (“;”) key (desktop only)
Added csv experiment_data files for Multi User
Added heatmap_visualizor example for viewing a static heatmap of all trials, by condition, or for individual multi-user clients (a heatmap sketch follows this version's notes)
Added ability to take screen captures (use ‘/’ key or change in settings, saved in screenCaptures folder, video recordings still in “recordings” folder)
Added stronger_springs module to allow tossing objects when using physics (Tossing_Objects example)
Added ability to toggle HUD overlays (default ‘h’ key for hud and console, ‘i’ key for gaze point in mirrored view, can be adjusted in settings)
Added Desktop, Meta or SteamVR dropdown for Session Replay Server playback
Added collision module for walking over elevated surfaces
Flashlight module for adding a flashlight that can be turned on/off (R thumbstick down or ‘m’ button)
Updated HP Reverb Omnicept example for saving sensor data (heart rate, cognitive load, eye openness)
Updated Visual Search examples for easier ability to use config file to change objects
Updated Pigeon Hunt example
Updated Virtual Screen examples
New models
Sky_day (daytime skydome background with moving clouds)
Sky_night (nighttime skydome background with moving clouds)
Ramp (for testing collisions and gravity)
Rocky Cavern (outdoor rock structures at various elevations, day and night versions)
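For a sense of what a static heatmap involves, here is a hedged numpy/matplotlib sketch (the shipped heatmap_visualizor example may work differently); the input file and column names are hypothetical:
```python
# Minimal sketch: bin 2D gaze points into a static heatmap.
# "gaze_points.csv" and its "x"/"y" columns are illustrative assumptions.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("gaze_points.csv")
heat, xedges, yedges = np.histogram2d(df["x"], df["y"], bins=64)
plt.imshow(heat.T, origin="lower", cmap="hot",
           extent=[xedges[0], xedges[-1], yedges[0], yedges[-1]])
plt.colorbar(label="gaze samples")
plt.title("Gaze heatmap (all trials)")
plt.show()
```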
Version 1.9.6 12/22/23
Ability to choose custom location for environment and media resources in GUI (single user only for now)
Multi-User version now supports more simultaneous connections over separate networks (up to 5 by default, but more can be added on request)
Fixed model swapping in the Model Viewer (swapping and changing models) for Multi-User
Added client connected status to multi user 360
Added VR menu to work for multi user
Ability to bring up a modifiable menu that is oriented to the participant’s viewpoint
New presets for SteamVR, Vive, Vive Pro, and Vive Pro 2
Adjusted Varjo Preset
Fixed replay having two left controllers
Fixed bug with "revert to default settings" in GUI
Version 1.9.5 12/11/23
New Heatmap visualizer
Updated data logger (now added to main SightLab VR script)
Data logger collects 6DOF for head by default
Data logger also works for 360 videos
Added client joined status to main window of server for multi-user
Updated gaze_time_subnodes example
Added “whiteRoom” model
Added "Jeep" model
Added “Direction Arrow” object for use also as a dummy node
Added scripts in Adjusting_Gaze_Data_Post_Session to convert .txt files to .csv with proper columns and measure fixations (a conversion sketch follows this version's notes)
Added movement limitation example (for keeping a user from going to certain spots in a scene)
Updated Video_Player example
Added GUI based Mixed Reality example
Added biofeedback ball example showing how to get data from AcqKnowledge into SightLab
Updated Regions of Interest
Updated session replay playback of virtual screens (in 2D screen example)
Added second callback for gaze based functions
For Session Replay viewed in a headset, added dropdown for “Desktop, SteamVR or Meta”
Added Quest 3 to mixed reality and hand tracking demos
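A minimal sketch of the .txt-to-.csv conversion idea above (the bundled Adjusting_Gaze_Data_Post_Session scripts may differ); the column layout is assumed:
```python
# Minimal sketch: convert a whitespace-delimited tracking .txt export into a
# columned .csv. The file name and column names are illustrative assumptions.
import pandas as pd

cols = ["time", "x", "y", "z", "yaw", "pitch", "roll"]
df = pd.read_csv("tracking_data.txt", sep=r"\s+", names=cols)
df.to_csv("tracking_data.csv", index=False)
```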
Version 1.9.4 11/5/23
Updated BIOPAC markers to ‘stim’
Added BIOPAC markers as a checkbox for GUI version
Added server and client latency test for multi-user
Updated data rate for multi user and 360 version
Added a flashlight object
ToggleLights example
Updated Sample Code
Changed “Choice_GUI” examples to “RatingScale_GUI”
Added option in settings to change which headset is used if viewing the Session Replay in a headset
Hand Grabbing Demo now unlinks all grabbed objects when using “Reset” sphere
STIM File session replay example
NewProjects folder to store saved projects
Version 1.9.3 10/26/23
Support for Quest 3
Fixed video compression issue
Updated Video Screen (2D/3D screen) example
Added multi-user version
Fixed Driving Example
Added additional mixed reality examples
Updated hand tracking grabbing example
Updated avatar workflow and example
Updated how regions are saved with 360 videos
Fixed data saving error with Vive Pro Eye and multi-user
Version 1.9.2 10/7/23
Added automatic compressing of video recordings (need to install moviepy library)
Updated rating module
Updated instructions module
Updated body_tracking module
Improved data logger
Added selector/highlight tool
Added vizconnect for Headsets using PPT
Fixed position of Meta Quest Pro controllers
Put key to continue gaze path in settings
Updated/New Examples:
Updated Visual Search examples
Updated 3D menu/tablet
Added FaceTrackingData
Updated Mirror Demo (added more facial tracking points)
Updated data logger example
Updated EyeTracker tests example
New rating example
Added Analyze Post Process Example
Added example to convert x,y,z to video pixels
Updated OpenXR Examples
Added grabbing with physics example for Meta Quest Pro
Updated Mirror demo with facial expressions
Updated Mixed Reality XR_Model_Viewer
Updated Mixed Reality screen 3D
Added new model to objects (spaceship)
Updated documentation
Added page on saving an experiment data file
Added page on data visualizations using additional python libraries
Added pages for new examples
Version 1.9.1 9/5/23
Updated navigation for Meta Quest Pro and Vive Focus (RH Arc Teleport, LH + RH B and A buttons to move forward/backward/left/right)
ExampleScripts Updates:
Added Varjo to Mixed Reality Examples
Updated STIM file example
Updated Driving Example
Fixed path in Avatar example
Added Facial expressions example
Updated Mirror demo
Added Simple Visual Search example
Model and Resource Updates:
Stimulus environment brighter
Updated complete_scene model
Version 1.9 8/7/23
Support for new hardware:
Meta Quest Pro Eye tracking (Requires Vizard 7.5 or higher)
Eye tracking, hand tracking, face tracking, body tracking (upper half) and mixed reality
Varjo (Requires Vizard 7.5 or higher)
Eye tracking, hand tracking, and mixed reality (and greenscreen masking)
Vive Focus 3 (Requires Vizard 7.5 or higher)
OpenXR devices
Added ability to use a STIM file for modifying independent variables
Updated scan path
Added scan path settings to adjust frequency and saccade range
Added gaze time threshold to settings (also accessible in code)
Added ability to have custom events for starting and stopping the trial (a sketch follows this version's notes)
Added custom key for the default start and stop
Removed “Press Spacebar to Start” text
Included a python file to compress the recorded videos
Added option for left-right stereo video and 180 video
Included Assets:
Updated “Stimulus.osgb” model
Added Office Photogrammetry model
Added Modern Office with skyline
Added Street model
Added Meta Quest Pro controller models
Added flag for global position
Fixed dual info panels with session replay in headset
Fixed rotation of 360 monoscopic videos and images
Fixed issue with timer starting while instructions are displayed
Fixed how rating works so custom text can be set within the experiment
Changed default data rate output to milliseconds (can be modified)
Example scripts can now be run from within the "Example Scripts" folder
New and Updated Example Scripts and Additional Plugins:
Added Model Collection Viewer
Added Media Collection Viewer
Fixations and Saccades Template
Data Logger
Added Glaucoma example
Added Mirror example
Added Mixed Reality Examples
Added example for iterating through a list of all nodes in a model
Added Visual Search example
Updated Driving example
Pigeon Hunt example
Added example for running experiment on a timer
Removed “Calibration Check” and added “EyeTracker Tests” examples
Updated Audio Recording Example
Added example for adding ROIs to a moving point in a 360 video
Added resetting of position example ('r' key, but can be modified in vizconnect)
Added Walk the Plank
Added example for measuring fixations and saccades using angular distance
Multi-User
Added saving of head position
Fixed key error with fixation spheres
Added show_gaze_path flag
Added STIM file and rating GUI examples
Fixed error with scan paths in the replay
Fixed issue with lip flapping not working
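As a hedged illustration of the custom start/stop events added in this version: SightLab exposes its own hooks, so the sketch below only uses core Vizard viztask calls and should be treated as illustrative rather than as SightLab's actual trial API.
```python
# Minimal sketch: start a trial on a custom key and stop it on a timer,
# using standard Vizard viztask calls. Not SightLab's actual trial API.
import viz
import viztask

def experiment():
    yield viztask.waitKeyDown('s')   # custom start event instead of spacebar
    print('Trial started')
    yield viztask.waitTime(30)       # custom stop event: a 30-second timer
    print('Trial ended')

viz.go()
viztask.schedule(experiment)
```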
Version 1.8.3 3/6/23
Fixed issue with Meta headsets and SightLabVR_360
Added ability to change fixation threshold in settings for 360 videos
Version 1.8.2 2/15/23
Fixed issue with installer
Removed resources shortcut that was causing error (resources are found under "utils")
Updated how instructions work (see documentation)
Updated examples for "set_flag", "gaze based interactions", "omnicept" and "instructions"
Moved "calibration check" into the utils folder
Version 1.8.1 2/7/23
Renamed "Build" to "Inspector"
Added headlight toggle to settings
Added HUD toggle to settings
Added fadeQuad toggle to settings
Added console text toggle to settings
Added comments to settings file
Added comment/instructions to main scripts
Updated Documentation
Version 1.8 1/4/23
Updated heatmap integration
Reorganized file structure with easier-to-modify scripts (note: this version may need a few adjustments to run earlier data scripts; see the section on tracking_data_replay)
Renamed “FixationRegions” to “RegionsofInterest”
Added mouselock to settings file for desktop
Moved the 3D gaze path display in the main session into settings and set it to "off" by default
Put vizconnect dictionary into settings.py
Updated all example scripts
Fixed issue with instructions for multi-user 360
Fixed bug with multi-user 360 related to regions of interest
Documentation updates
Version 1.7.1 11/23/22
Fixed issue with multiple trials not working for multi-user
Fixed issue with importing multi-user causing error
Updated barchart example
Updated driving example
Fixed gaze point showing in front of view when using instructions with 360 video
Fixed lighting on complete_scene model
Added new example for gaze based interactions with multiple users
Updated gaze based interaction example
Added example for using 2D video
Updated 3D tablet example
Updated Rating GUI example
Ability to change mirror window size in the settings.py file
Fixed issue with scrub not working for multi-user playback
Fixed bug with instructions for single user
Data files using date and time now load with the most recent on top
Version 1.7 11/2/22
Added Session Replay for multiple users
Fixed issue with moduleSetup not working for importing SightLab
Fixed bug with fixation region being seen in middle of the scene
Added instructions flag and text to settings for adding instruction text (single user only)
Added "calibration check" file to verify eye tracking is working optimally
Improved heatmap functionality
Added support for sequences of videos
Version 1.6.2 10/22/22
Added StandInAvatar to resources for easier way to add avatars to scenes
Took out some unnecessary print statements
Version 1.6.1 10/7/22
Toggling of sending events to Biopac Acqknowledge now in "settings.py" file
Changed video fixation regions to use settings.py
Added one more environment model "complete_scene"
Fixed issue with 360 video using Biopac and 360 multi-user video
Version 1.6 8/5/22
Added easier way to add regions of interest to 360 videos (without having to go into the code)
Fixed issue where left hand model was not being saved
Version 1.5 7/7/22
Can now quit the session at any time without pressing spacebar, and a data file will still be saved (single user)
Added saving of head position to tracking data file
Version 1.4 6/27/22
Added ability to run multiple trials without having to close and reopen the application
Updated Omnicept navigation
Multi-User Update: Updated Multi-User SightLab to include the full features of SightLab Pro
Version 1.3 1/25/22
Added full support for HP Omnicept
Added example scripts
Disabled lighting on fixation spheres
Fixed issue with Oculus Quest simulated eye tracker
Added Vive Cosmos controller for avatar hand choice
Version 1.2 9/16/21
Added a Pupil Diameter overlay for the Session Replay
Fixed a bug where setting an object as “grabbable” was not staying saved
Fixed a bug where the timestamp on the session replay was not synchronized
Added the ability to move the offset of a tracked object in session replay
Added the ability to choose the starting time in the session replay
Added a hardware dropdown chooser for the heatmap script
Added the ability to also hide the controllers in session replay
Added no controller (empty) option
Added ability to swap out model for both left and right hands
Version 1.1 7/1/21
Oculus Simulated eye tracker mode
Added ability to add custom flags to events
Load tracking files to view 3D gaze paths from previously run simulations
Workflow for adding additional tracking parameters
Added collection of fixations to the tracking data file (flag column)
Added spherical video as a main script type
Updates to documentation
Added vizfx to examples
Re-organization of file structure
Added timer and fixations for mirrored display
Time now starts at 0 for tracking data
Scene doesn’t show until recording starts
Pupil Diameter added to tracking file for Vive Pro Eye
Added video recording
Added intersect that is only visible for the mirrored, desktop view
Version 1.0 5/25/21
- Initial Setup
- Getting Assets
- Scene setup
- Running a Session
- Additional Experiment Set Up Options
Using a STIM file to modify conditions (independent variables); a minimal sketch follows this list
Adding an origin reset
Setting custom flags
Experiment control (using a timer or other event to start/stop experiment)
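A minimal sketch of the STIM file idea, assuming a simple CSV layout; SightLab's actual STIM format may differ, and the file and column names here are hypothetical:
```python
# Minimal sketch: read per-trial independent variables from a CSV "STIM" file.
# "conditions_stim.csv" and its columns are illustrative assumptions.
import csv

with open("conditions_stim.csv", newline="") as f:
    trials = list(csv.DictReader(f))  # e.g. columns: trial, environment, target

for t in trials:
    print(t["trial"], t["environment"], t["target"])
```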
- Additional Options/Examples
The Visual Search examples show some more advanced experiment setups:
VisualSearch1_SingleObject: modifying a condition, measuring a variable and saving it to a file, adding instructions, using gaze-based interactions, and saving additional tracking data and visualizing it with bar charts and histograms
VisualSearch2_MultipleObjects: adding a Likert rating, using 2 independent variables with 3 levels, and adding a boxplot visualization
VisualSearch3_Art_Gallery: adding more conditions and swapping textures for a visual search task.
Mixed Reality
Facial Tracking
Hand Tracking
Upper Body Tracking
- Visualizing and Analyzing the Data
- Supported Hardware
- Extending Vizard Scripts