This repository contains descriptions, code and data related to the real-time multi-echo functional magnetic resonance imaging (rt-me-fMRI) project conducted at the Electrical Engineering department of the Eindhoven University of Technology. Project outputs include:
- A dataset and related publication:
- A methods publication:
Below we provide more information and instructions regarding:
- The dataset summary
- How to download the data
- How to explore the data
- How to reproduce results and figures from the publications
- Relevant software tools
- How to cite this work
- How to contribute
The rt-me-fMRI dataset is a multi-echo fMRI dataset (N=28 healthy participants) with four task-based and two resting state runs that were collected, curated and made available to the research community. Its main purpose is to advance the development of methods for real-time multi-echo functional magnetic resonance imaging analysis, with applications in real-time quality control, adaptive paradigms, and neurofeedback, although the variety of experimental task paradigms supports a multitude of use cases. Tasks include finger tapping, emotional face and shape matching, imagined finger tapping and imagined emotion processing. This figure summarises the collected data:
The full data description is available as an F1000 data article.
Several depictions of the data tree can be viewed here.
The rt-me-fMRI dataset is available for reuse for the purpose of scientific research or education in the field of functional magnetic resonance imaging. If you wish to use the data, you have to agree to the terms of a Data Use Agreement when downloading the data.
The dataset itself can be downloaded from DataverseNL via this link.
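For programmatic access, the generic Dataverse data access API can be used once you have accepted the Data Use Agreement. The sketch below is illustrative only and is not part of this repository; the persistent identifier is a placeholder (use the DOI shown on the dataset's DataverseNL page), and an API token is only needed if file access is restricted to registered users.

```python
# Illustrative sketch: bulk-download the dataset as a zip archive via the
# generic Dataverse data access API. The DOI below is a PLACEHOLDER; replace it
# with the persistent identifier listed on the dataset's DataverseNL page.
import requests

SERVER = "https://dataverse.nl"
PERSISTENT_ID = "doi:10.0000/PLACEHOLDER"  # placeholder, not the real DOI
API_TOKEN = None  # set to your DataverseNL API token if access is restricted

headers = {"X-Dataverse-key": API_TOKEN} if API_TOKEN else {}
url = f"{SERVER}/api/access/dataset/:persistentId/?persistentId={PERSISTENT_ID}"

# Stream the archive to disk so the (large) dataset is not held in memory
with requests.get(url, headers=headers, stream=True, timeout=60) as resp:
    resp.raise_for_status()
    with open("rt-me-fmri.zip", "wb") as f:
        for chunk in resp.iter_content(chunk_size=1 << 20):
            f.write(chunk)
```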
The dataset was collected, processed and shared in accordance with the European Union's General Data Protection Regulation (GDPR), as approved by Data Protection Officers at the research institution. These conditions prioritise personal data privacy while adhering to the FAIR data principles ("findable, accessible, interoperable, reusable"). Procedures included de-identifying brain images (e.g. removing personally identifiable information from image filenames and metadata, and removing facial features from T1-weighted images), converting the data to BIDS format, employing a Data Use Agreement, and keeping participants fully informed about each of these steps and the associated risks and benefits.
Much of the work that went into this administrative process has been documented as part of the output of the Open Brain Consent Working Group, accessible here.
To explore the dataset's derivative measures interactively, visit this web application. It was built with Python using the Plotly Dash framework. The open source code base is available at this repository.
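For orientation, the snippet below is a minimal sketch of how a Plotly Dash app of this kind is typically structured; it is not the actual application code (see the linked repository for that), and the file and column names are purely illustrative.

```python
# Minimal illustrative Dash app: select a subject and plot a derivative measure.
# "derivative_measures.csv", "subject", "run" and "tsnr" are hypothetical names.
import pandas as pd
import plotly.express as px
from dash import Dash, dcc, html, Input, Output

df = pd.read_csv("derivative_measures.csv")  # hypothetical summary file

app = Dash(__name__)
app.layout = html.Div([
    dcc.Dropdown(
        id="subject",
        options=[{"label": s, "value": s} for s in sorted(df["subject"].unique())],
    ),
    dcc.Graph(id="measure-plot"),
])

@app.callback(Output("measure-plot", "figure"), Input("subject", "value"))
def update_plot(subject):
    # Filter to the selected subject (or show all subjects) and plot per run
    data = df if subject is None else df[df["subject"] == subject]
    return px.box(data, x="run", y="tsnr")

if __name__ == "__main__":
    app.run(debug=True)
```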
The data preparation process is documented here. This includes code to convert neuroimaging, physiological and other data to BIDS format.
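As a quick sanity check after conversion, multi-echo BOLD runs in BIDS are distinguished by the echo-&lt;index&gt; entity in their filenames. The sketch below only illustrates that layout; the actual task names and file organisation are documented in the data article and the preparation code linked above.

```python
# Illustrative sketch: list multi-echo BOLD files in a local copy of the BIDS
# dataset. The path and example filename pattern are illustrative only;
# consult the data article for the actual entity values used in rt-me-fMRI.
from pathlib import Path

bids_root = Path("rt-me-fmri")  # assumed local path to the downloaded dataset
for bold in sorted(bids_root.glob("sub-*/func/*_echo-*_bold.nii.gz")):
    print(bold.relative_to(bids_root))

# Expected pattern (illustrative):
# sub-001/func/sub-001_task-fingerTapping_echo-1_bold.nii.gz
```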
After preprocessing and quality checking of the data (see a full description in the data article), the data were processed and analysed as described in the methods article. Because of data storage limitations, these derivative data are not shared together with the rt-me-fMRI dataset. However, code and instructions are provided to allow these derivative data to be reproduced, and to subsequently generate the summary data underlying both the results of the methods paper and the Dash application:
- Code and instructions to generate derivative data
- Code and instructions to generate summary data for figures and the Dash application
The following notebooks contain code and descriptions that allow figures from the data and methods articles to be reproduced:
All (pre)processing and major data analysis steps for both the data article and methods article were done using the open source MATLAB-based fMRwhy toolbox (v0.0.1; https://github.com/jsheunis/fMRwhy), which was developed over the course of this project. fMRwhy has the following conditional dependencies:
- SPM12 (r7771; https://github.com/spm/spm12/releases/tag/r7771; Friston et al., 2007)
- bids-matlab (v0.0.1; https://github.com/jsheunis/bids-matlab/releases/tag/fv0.0.1)
- Anatomy Toolbox (v3.0; Eickhoff et al., 2005)
- dicm2nii (v0.2 from a forked repository; https://github.com/jsheunis/dicm2nii/releases/tag/v0.2)
- TAPAS PhysIO (v3.2.0; https://github.com/translationalneuromodeling/tapas/releases/tag/v3.2.0; Kasper et al., 2017)
- Raincloud plots (v1.1; https://github.com/RainCloudPlots/RainCloudPlots/releases/tag/v1.1; Allen et al., 2019).
Papers, book chapters, books, posters, oral presentations, and all other presentations of results derived from the rt-me-fMRI dataset should acknowledge the origin of the data as follows:
Data were provided (in part) by the Electrical Engineering Department, Eindhoven University of Technology, The Netherlands, and Kempenhaeghe Epilepsy Center, Heeze, The Netherlands.
In addition, please use the following citation when referring to the dataset:
Heunis S, Breeuwer M, Caballero-Gaudes C et al. rt-me-fMRI: a task and resting state dataset for real-time, multi-echo fMRI methods development and validation [version 1; peer review: 1 approved, 1 approved with reservations]. F1000Research 2021, 10:70 (https://doi.org/10.12688/f1000research.29988.1)
And the following citation when referring to the methods article:
Heunis, S., Breeuwer, M., Caballero-Gaudes, C., Hellrung, L., Huijbers, W., Jansen, J.F., Lamerichs, R., Zinger, S., Aldenkamp, A.P., 2020. The effects of multi-echo fMRI combination and rapid T2*-mapping on offline and real-time BOLD sensitivity. bioRxiv 2020.12.08.416768. https://doi.org/10.1101/2020.12.08.416768
Feedback and future contributions are very welcome. If you have any comments, questions or suggestions about the dataset or derivative measures, please create an issue in this repository.