Setup of the Tobii Eye Tracker 4C for collecting on-screen gaze information for usability testing purposes. The Tobii Eye Tracker 4C aims to provide an immersive experience without a headset. With this product, nothing stands between the screen and the immersive experience, so our clinicians can work without interference from the device. This repository includes functions for the setup of the eye-tracking routines, namely: (i) calibration of the eye tracker; (ii) finding eye positions; and (iii) validation of the eye-tracker calibration settings. It contains functions for working with the new Tobii Pro SDK for Python, along with essential eye-tracking routines, in a TobiiHelper
class. The repository is part of the work done by SIPg, an ISR-Lisboa research group, and M-ITI, two R&D Units of LARSyS. The project also involves the collaborative effort of INESC-ID. Both ISR-Lisboa and INESC-ID are Associate Laboratories of IST, ULisboa.
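As an illustration of step (iii), calibration quality is commonly assessed as the angular error between recorded gaze samples and the known target positions. The sketch below is only illustrative: the function names, screen dimensions, and viewing distance are our own assumptions, not values taken from the Tobii Pro SDK or this repository.

```python
import math

# Illustrative defaults; a real setup would read these from the
# display and the eye-tracker geometry rather than hard-coding them.
SCREEN_W_MM = 510.0
SCREEN_H_MM = 287.0
VIEWING_DISTANCE_MM = 600.0

def angular_error_deg(gaze_xy, target_xy):
    """Approximate angular error (in degrees of visual angle) between a
    gaze sample and its calibration target, both given in Tobii's
    normalized Active Display Coordinate System (0.0 to 1.0)."""
    dx_mm = (gaze_xy[0] - target_xy[0]) * SCREEN_W_MM
    dy_mm = (gaze_xy[1] - target_xy[1]) * SCREEN_H_MM
    offset_mm = math.hypot(dx_mm, dy_mm)
    return math.degrees(math.atan2(offset_mm, VIEWING_DISTANCE_MM))

def validate_calibration(samples, target, threshold_deg=1.0):
    """Mean angular error over gaze samples for one target; the
    calibration is accepted when the mean stays under the threshold."""
    errors = [angular_error_deg(s, target) for s in samples]
    mean_err = sum(errors) / len(errors)
    return mean_err, mean_err <= threshold_deg
```

Because the SDK reports gaze in normalized display coordinates, the physical screen size and viewing distance are needed to express the error in degrees of visual angle.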
We kindly ask that scientific works and studies making use of this repository cite it in their associated publications. Similarly, we ask that open-source and closed-source works using the repository inform us of that use.
You can cite our work using the following BibTeX entry:
@article{CALISTO2021102607,
  title = {Introduction of human-centric AI assistant to aid radiologists for multimodal breast image classification},
  journal = {International Journal of Human-Computer Studies},
  volume = {150},
  pages = {102607},
  year = {2021},
  issn = {1071-5819},
  doi = {https://doi.org/10.1016/j.ijhcs.2021.102607},
  url = {https://www.sciencedirect.com/science/article/pii/S1071581921000252},
  author = {Francisco Maria Calisto and Carlos Santiago and Nuno Nunes and Jacinto C. Nascimento},
  keywords = {Human-computer interaction, Artificial intelligence, Healthcare, Medical imaging, Breast cancer},
  abstract = {In this research, we take an HCI perspective on the opportunities provided by AI techniques in medical imaging, focusing on workflow efficiency and quality, preventing errors and variability of diagnosis in Breast Cancer. Starting from a holistic understanding of the clinical context, we developed BreastScreening to support Multimodality and integrate AI techniques (using a deep neural network to support automatic and reliable classification) in the medical diagnosis workflow. This was assessed by using a significant number of clinical settings and radiologists. Here we present: i) user study findings of 45 physicians comprising nine clinical institutions; ii) list of design recommendations for visualization to support breast screening radiomics; iii) evaluation results of a proof-of-concept BreastScreening prototype for two conditions Current (without AI assistant) and AI-Assisted; and iv) evidence from the impact of a Multimodality and AI-Assisted strategy in diagnosing and severity classification of lesions. The above strategies will allow us to conclude about the behaviour of clinicians when an AI module is present in a diagnostic system. This behaviour will have a direct impact in the clinicians workflow that is thoroughly addressed herein. Our results show a high level of acceptance of AI techniques from radiologists and point to a significant reduction of cognitive workload and improvement in diagnosis execution.}
}
The following list shows the dependencies for this project. Please install and build the recommended versions on your machine.

List of dependencies for this project:

- Setuptools (>= v40.8)
- Tobii Research (>= v1.6)
Tobii’s consumer eye trackers are primarily intended for personal interaction use, not for analytical purposes. Any application that stores or transfers eye-tracking data must have a special license from Tobii (Read more). Please apply for a license here.
The instructions are as follows. We assume that you already have working knowledge of Git and GitHub; if not, please follow this support information. For any support needs, just open a New issue.
To clone this repository, follow the guidelines below. It is as easy as that.
1.1. Clone the repository by typing the command:
git clone https://github.com/mida-project/eye-tracker-setup.git
1.2. Get inside the repository directory:
cd eye-tracker-setup/
1.3. For the installation and running of the source code, follow the next steps.
The installation guidelines are as follows. Please be sure to follow them correctly.
2.1. Run the following commands to install the library using pip:

pip install -U pip setuptools
pip install tobii-research

If pip is not on your path, invoke it through the interpreter instead:

python -m pip install -U pip setuptools
python -m pip install tobii-research
2.2. Follow the next step.
The running guidelines are as follows. Please be sure to follow them correctly.
3.1. Run the sample using the following command:
python2 src/core/main.py
3.2. Enjoy our source code!
You can also run a notebook to view some of our models' chart plots. For this goal, we use the well-known Jupyter Notebook web application. To run Jupyter Notebook, just follow these steps.
4.1. Get inside our project directory:
cd eye-tracker-setup/src/notebooks/
4.2. Run the Jupyter Notebook application by typing:
jupyter notebook
If you have any questions regarding Jupyter Notebook, just follow their Documentation. You can also ask for help from the Community.
To find out how to apply the Upgrade Key to a Tobii Eye Tracker 4C, follow the Tobii Pro Upgrade Key – User Instructions document. Nevertheless, the Tobii Pro SDK Python API Documentation page is of chief importance to this repository, as well as their Examples page for Python. For the first configurations, please follow both the Python - Getting started and Python - Step-by-step guide pages, or follow the steps presented here. For any questions regarding the eye-tracking topic, just follow the StackOverflow tag for that purpose.
This work is also based on, and owes much to, the tobii_pro_wrapper repository, developed by Olivia Guayasamin (oguayasa), whom we would like to thank. That repository shows pretty much everything we need to connect to a Tobii eye tracker, calibrate the eye tracker, get gaze, eye, and time-synchronization data from the device, and convert between the Tobii coordinate system units.
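One such conversion maps gaze points from Tobii's normalized Active Display Coordinate System (ADCS) to screen pixels. A minimal sketch is shown below; the function name and default resolution are our own illustrative choices, not part of the SDK or the wrapper.

```python
import math

def adcs_to_pixels(gaze_point, screen_resolution=(1920, 1080)):
    """Convert a gaze point in Tobii's Active Display Coordinate System
    (normalized: (0, 0) is the top-left and (1, 1) the bottom-right of
    the display) to screen pixel coordinates. Returns None for invalid
    samples, since Tobii reports NaN when an eye is not found."""
    x, y = gaze_point
    if math.isnan(x) or math.isnan(y):
        return None
    w, h = screen_resolution
    # Clamp off-screen samples to the display edges before scaling.
    px = min(int(min(max(x, 0.0), 1.0) * w), w - 1)
    py = min(int(min(max(y, 0.0), 1.0) * h), h - 1)
    return px, py
```

Keeping the clamping and NaN handling in one place makes downstream gaze-visualization code simpler, since every sample it receives is either a valid pixel pair or None.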