https://www.mdpi.com/2079-9292/12/13/2776
The data acquisition system consisted of a touchscreen device (ASUS), two 27-inch monitors (DELL), a screen-mounted eye tracker (Tobii Pro, Tobii), two RGB-D cameras (RealSense, Intel), and a desktop computer (Intel i7, 32 GB RAM). All cameras were mounted on custom frames with fixed geometric relationships between them. The touchscreen device served as a visualisation tool, displaying drawing trajectories in real time. During the experiment, the two cameras and the eye tracker captured the participant's hand motion and gaze data via a tailor-made Python script (Python 3.6). The hand skeleton data were estimated using MediaPipe Hands (Google).
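MediaPipe Hands returns 21 landmarks per detected hand, each with normalised x/y/z coordinates. A minimal sketch of turning such output into a per-hand feature vector is shown below; the `Landmark` stub and the helper name are illustrative (in the real pipeline the landmark list would come from MediaPipe's `Hands().process(rgb_frame).multi_hand_landmarks`, which this sketch does not call so that it stays self-contained).

```python
# Sketch: flattening MediaPipe-Hands-style landmarks into a feature vector.
# A small dataclass stands in for MediaPipe's landmark objects, which expose
# the same .x/.y/.z attributes (normalised image coordinates plus depth).
from dataclasses import dataclass
from typing import List

NUM_HAND_LANDMARKS = 21  # size of the MediaPipe Hands skeleton


@dataclass
class Landmark:
    x: float
    y: float
    z: float


def flatten_hand(landmarks: List[Landmark]) -> List[float]:
    """Flatten one hand's landmarks into [x0, y0, z0, x1, y1, z1, ...]."""
    if len(landmarks) != NUM_HAND_LANDMARKS:
        raise ValueError(f"expected {NUM_HAND_LANDMARKS} landmarks, got {len(landmarks)}")
    return [c for lm in landmarks for c in (lm.x, lm.y, lm.z)]


# Example with a dummy hand: 21 landmarks yield a 63-dimensional vector.
hand = [Landmark(i / 21, i / 21, 0.0) for i in range(NUM_HAND_LANDMARKS)]
vec = flatten_hand(hand)
```

Such a fixed-length vector (63 values per hand per frame) is a common intermediate representation before further motion analysis.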
To ensure that the multi-sensor data streams are time-synchronised, we use the Lab Streaming Layer (LSL) API (https://labstreaminglayer.readthedocs.io/info/intro.html).
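LSL stamps every sample of every stream on a shared local clock, so recordings from the cameras and the eye tracker can afterwards be paired by timestamp. The sketch below shows one such alignment step with a nearest-neighbour match; the stream contents and sampling rates are synthetic placeholders, not values from the paper.

```python
# Sketch: aligning two timestamped streams (e.g. gaze and camera frames)
# after LSL acquisition, by pairing each sample in a reference stream with
# the nearest-in-time sample of the other stream.
from bisect import bisect_left
from typing import List, Tuple


def nearest_sample(ts: List[float], t: float) -> int:
    """Index of the timestamp in the sorted list `ts` closest to time `t`."""
    i = bisect_left(ts, t)
    if i == 0:
        return 0
    if i == len(ts):
        return len(ts) - 1
    # Pick whichever neighbour is closer in time.
    return i - 1 if t - ts[i - 1] <= ts[i] - t else i


def align(ref_ts: List[float], other_ts: List[float]) -> List[Tuple[int, int]]:
    """Pair each reference sample with its nearest neighbour in the other stream."""
    return [(i, nearest_sample(other_ts, t)) for i, t in enumerate(ref_ts)]


# Synthetic example: a faster stream aligned against a slower one.
gaze_ts = [0.000, 0.016, 0.033]   # e.g. ~60 Hz eye-tracker timestamps
frame_ts = [0.001, 0.034]         # e.g. ~30 Hz camera frame timestamps
pairs = align(gaze_ts, frame_ts)  # → [(0, 0), (1, 0), (2, 1)]
```

In practice the timestamps would come from LSL inlets rather than hard-coded lists; the matching logic is the same.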