Quick Start
Here's an overview of how to run a quick demonstration of both SightLab and SightLab Multi-User. First, make sure you've set permissions, installed Vizard, and completed the initial setup according to this page.
Click "Inspector" to open the builder tool (if wanting to use a new model or scene)
2. Load an environment, either from the built-in environments (utils-resources-environments) or by dragging and dropping one of your own into the utils-resources-environments folder (see here for more places to get models). Sometimes you may need to adjust the scale and starting point; see this page for how to do that.
3. Add objects of interest to the scene by going to File- Add (either add some from the utils-resources-objects folder or download your own to that folder).
Click the transform node (the top gear icon in the hierarchy tree on the object) and use the move, scale, and rotate tools to place and scale your object(s). Either click and drag the transform handles or use the transform fields.
4. Note that objects should come in with a "Group Node" attached to them; if not, choose "Insert Above- Group Node". This group node is what is used to target the object as an "object of interest" (you can also right-click to add this to objects already in the scene).
5. Choose File- Add and add a Region of Interest (in the utils-resources-objects folder) to see how you can target specific areas of the scene (follow the steps on this page).
6. Save this scene to the environments folder (you can also adjust lighting and additional options in Inspector if necessary).
7. Configure Options
Choose options according to which version you are running. For 3D models, choose the environment, press "Configure", then choose which objects to collect fixations on, set objects to visible, or add them as grabbable. If using BIOPAC, start up AcqKnowledge and toggle the BIOPAC markers checkbox. Press "Continue" when done.
Add participant information (optional)
Run trials
The default key to start and stop trials is the spacebar (this can be changed); press Escape to exit. Real-time dwell time will be printed out, and events will be sent to AcqKnowledge if using BIOPAC.
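The real-time dwell time readout described above boils down to accumulating time while the gaze ray stays on an object of interest. Here is a minimal, hedged sketch in plain Python; all names (DwellTimer, update) are hypothetical and SightLab's internal implementation may differ:

```python
class DwellTimer:
    """Accumulates per-object dwell time from a stream of gaze samples.
    Hypothetical sketch; SightLab's own implementation may differ."""

    def __init__(self):
        self.dwell = {}          # object name -> accumulated seconds
        self._last_object = None
        self._last_time = None

    def update(self, gazed_object, timestamp):
        """Call once per frame with the name of the currently gazed
        object (or None) and the sample timestamp in seconds."""
        if self._last_object is not None and self._last_time is not None:
            elapsed = timestamp - self._last_time
            self.dwell[self._last_object] = (
                self.dwell.get(self._last_object, 0.0) + elapsed)
        self._last_object = gazed_object
        self._last_time = timestamp

# Example: four frames at 0.1 s intervals, gaze on "cube" then "sphere"
timer = DwellTimer()
timer.update("cube", 0.0)
timer.update("cube", 0.1)
timer.update("sphere", 0.2)
timer.update(None, 0.3)
print(timer.dwell)
```

In a real Vizard script this update would be driven once per rendered frame, with the gazed object coming from an eye-tracker intersection test.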
Afterwards you can review the local data files in the data folder and the video recording in the recordings folder (if recording was enabled). An interactive Session Replay is available for the single-user version. If using BIOPAC, you can view the synchronized physiological markers along with the saved SightLab events and video.
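The exact layout of the files in the data folder is not specified here, but they can be post-processed with ordinary Python. The sketch below assumes a hypothetical CSV layout (column names `timestamp`, `object`, `dwell` are illustrative only, not SightLab's actual format):

```python
import csv
import io

# Hypothetical per-trial gaze log; SightLab's actual columns may differ.
sample_csv = """timestamp,object,dwell
0.0,cube,0.00
1.0,cube,0.85
2.0,sphere,0.40
"""

def final_dwell_per_object(fileobj):
    """Return the last reported cumulative dwell value per object."""
    totals = {}
    for row in csv.DictReader(fileobj):
        totals[row["object"]] = float(row["dwell"])  # keep latest value
    return totals

print(final_dwell_per_object(io.StringIO(sample_csv)))
```

In practice you would open the real file from the data folder instead of the inline string.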
8. Run the Session Replay to view analytics such as heatmaps, scan paths, dwell time, user interactions, and more. Use the spacebar to start and stop the replay.
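Conceptually, a gaze heatmap like the one shown in Session Replay is just 2D binning of gaze points. A minimal sketch with made-up data (no plotting library, and not SightLab's actual code):

```python
def gaze_heatmap(points, width, height, bins=4):
    """Bin (x, y) gaze points into a bins x bins grid of counts.
    Points outside [0, width) x [0, height) are ignored."""
    grid = [[0] * bins for _ in range(bins)]
    for x, y in points:
        if 0 <= x < width and 0 <= y < height:
            col = int(x / width * bins)
            row = int(y / height * bins)
            grid[row][col] += 1
    return grid

# Cluster of gaze points near the top-left of a 100x100 view,
# plus one stray point near the bottom-right
points = [(10, 10), (12, 15), (90, 90), (11, 12)]
for row in gaze_heatmap(points, 100, 100):
    print(row)
```

A visualization layer would then map the counts to colors and blend the grid over the scene or video frame.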
9. Run a 360 video session by adding media to the resources/media folder and running according to the instructions here (works for both single and multi-user).
Note that areas of interest can be added to 360 videos as well.
10. For additional functionality, see the documentation and the Example Scripts page (the ExampleScripts folder holds all of these examples).
11. See how new avatars can be added here.
Some additional functionality of note:
Supports a large collection of VR hardware and devices
Collect, record, and utilize additional data (some depending on hardware support), such as: face tracking, hand tracking, full-body tracking, eye tracking metrics (fixations, dwell time, saccades, pupil diameter, eye openness if supported, etc.), physiological measurements, fNIRS, EEG, and much more
Add instructions, rating scales for user feedback and other common experiment tools
Support for Mixed Reality
Multi-user support, both local and remote
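Among the eye tracking metrics listed above, fixations and saccades are commonly separated with a velocity threshold (the I-VT approach). The sketch below is a generic illustration of that idea, not SightLab's actual classifier; the 30 deg/s threshold and all names are assumptions:

```python
import math

def classify_samples(samples, velocity_threshold=30.0):
    """Label each gaze sample 'fixation' or 'saccade' using a simple
    velocity threshold (I-VT). samples: list of (t_seconds, x_deg, y_deg).
    Threshold is in degrees per second; 30 deg/s is a common default."""
    labels = ["fixation"]  # first sample has no velocity; assume fixation
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        velocity = math.hypot(x1 - x0, y1 - y0) / dt if dt > 0 else 0.0
        labels.append("saccade" if velocity > velocity_threshold else "fixation")
    return labels

# Steady gaze, then a rapid 5-degree jump between two samples
samples = [(0.00, 0.0, 0.0), (0.01, 0.1, 0.0), (0.02, 5.0, 0.0), (0.03, 5.1, 0.0)]
print(classify_samples(samples))
```

Real classifiers typically add smoothing and minimum-duration rules on top of the raw threshold.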
Template and example library with a large selection of samples to build from or run as-is:
Visual Search templates
Virtual Screen
Driving Simulations
Stimulus presentation (3D models, 360 media, 2D media)
Gaze based interactions and biofeedback from physiological data
Additional visual analytics with integrations for pandas, Matplotlib, Plotly, and more
Virtual Menus
STIM files for experiment control of independent variables or other manipulations
Phobia and exposure paradigms
Virtual Mirror
and much more
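To illustrate the STIM-file idea from the list above: a condition file lets you vary independent variables per trial without editing the experiment script. The format below is purely hypothetical (SightLab's actual STIM syntax may differ); it only shows the general pattern of mapping trials to conditions and stimuli:

```python
import csv
import io

# Hypothetical STIM file: one trial per line, with its condition and stimulus.
# SightLab's actual STIM format may differ; this only illustrates the idea.
stim_text = """trial,condition,stimulus
1,control,neutral_room.osgb
2,exposure,spider_room.osgb
3,control,neutral_room.osgb
"""

def load_stim(fileobj):
    """Parse a simple comma-separated STIM file into a list of trial dicts."""
    return list(csv.DictReader(fileobj))

trials = load_stim(io.StringIO(stim_text))
for trial in trials:
    print(trial["trial"], trial["condition"], trial["stimulus"])
```

The experiment loop would then iterate over `trials`, loading each stimulus and logging the condition alongside the collected gaze data.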