The Virtual Brain

EduCases for The Virtual Brain.

Learning by doing.

  • LEARN: An automated pipeline for constructing personalized virtual brains from multimodal neuroimaging data

    • Model construction

    Related publication

    An automated pipeline for constructing personalized virtual brains from multimodal neuroimaging data, published in NeuroImage, August 2015, by Michael Schirner, Simon Rothmeier, Viktor K. Jirsa, Anthony Randal McIntosh, and Petra Ritter

    doi: 10.1016/j.neuroimage.2015.03.055


    Large amounts of multimodal neuroimaging data are acquired every year worldwide. In order to extract high-dimensional information for computational neuroscience applications, standardized data fusion and efficient reduction into integrative data structures are required.

    Such self-consistent multimodal data sets can be used for computational brain modeling to constrain models with individually measurable features of the brain, as is done with The Virtual Brain (TVB). TVB is a simulation platform that uses empirical structural and functional data to build full brain models of individual humans.

    For convenient model construction, we developed a processing pipeline for structural, functional and diffusion-weighted magnetic resonance imaging (MRI) and optionally electroencephalography (EEG) data.

    The pipeline combines several state-of-the-art neuroinformatics tools to generate subject-specific cortical and subcortical parcellations, surface tessellations, structural and functional connectomes, lead field matrices, electrical source activity estimates, and region-wise aggregated blood oxygen level dependent (BOLD) functional MRI (fMRI) time series.

    The output files of the pipeline can be directly uploaded to TVB to create and simulate individualized large-scale network models that incorporate intra- and intercortical interaction on the basis of cortical surface triangulations and white matter tractography.
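    As a hedged sketch of this upload step, the snippet below packages a structural connectome as a connectivity ZIP of the kind TVB's interface can import. The file names (`weights.txt`, `tract_lengths.txt`, `centres.txt`) follow TVB's documented connectivity format, but the authoritative layout should be checked against the TVB documentation; the matrices and labels here are illustrative placeholders, not pipeline output.

    ```python
    import io
    import zipfile
    import numpy as np

    def pack_connectivity(weights, tract_lengths, labels, positions):
        """Bundle SC weights, tract lengths, and region centres into an
        in-memory ZIP resembling TVB's connectivity import format."""
        buf = io.BytesIO()
        with zipfile.ZipFile(buf, "w") as zf:
            # Write the two region-by-region matrices as plain-text files.
            for name, mat in (("weights.txt", weights),
                              ("tract_lengths.txt", tract_lengths)):
                s = io.StringIO()
                np.savetxt(s, mat, fmt="%.6f")
                zf.writestr(name, s.getvalue())
            # One line per region: label followed by x, y, z coordinates.
            centres = "\n".join(
                f"{lab} {x:.3f} {y:.3f} {z:.3f}"
                for lab, (x, y, z) in zip(labels, positions))
            zf.writestr("centres.txt", centres)
        return buf.getvalue()

    # Toy two-region example (hypothetical values).
    W = np.array([[0.0, 0.2], [0.2, 0.0]])
    L = np.array([[0.0, 35.0], [35.0, 0.0]])
    blob = pack_connectivity(W, L, ["lh-A", "rh-A"],
                             [(-40.0, 0.0, 10.0), (40.0, 0.0, 10.0)])
    ```

    Writing to an in-memory buffer keeps the sketch side-effect free; in practice the bytes would be written to a `.zip` file and uploaded through the TVB web interface.
    
    
    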

    We detail the pitfalls of the individual processing streams and discuss ways of validation. With the pipeline we also introduce novel ways of estimating the transmission strengths of fiber tracts in whole-brain structural connectivity (SC) networks and compare the outcomes of different tractography or parcellation approaches.
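    One plausible weighting scheme for such transmission strengths is sketched below: raw streamline counts between parcellated regions are normalized by the summed sizes of the connected regions, so that large regions do not dominate simply by attracting more streamlines. This normalization is an illustrative assumption, not necessarily the scheme the pipeline uses.

    ```python
    import numpy as np

    def sc_weights(streamline_counts, region_areas):
        """Normalize pairwise streamline counts by the summed areas of the
        two connected regions, yielding a symmetric SC weight matrix."""
        counts = np.asarray(streamline_counts, dtype=float)
        areas = np.asarray(region_areas, dtype=float)
        norm = areas[:, None] + areas[None, :]   # pairwise area sums
        weights = counts / norm
        np.fill_diagonal(weights, 0.0)           # no self-connections
        return weights

    # Toy three-region example (hypothetical counts and areas).
    counts = np.array([[0, 120, 30],
                       [120, 0, 60],
                       [30, 60, 0]])
    areas = np.array([400.0, 200.0, 200.0])
    W = sc_weights(counts, areas)                # W[0, 1] == 120 / 600 == 0.2
    ```
    
    
    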

    We tested the functionality of the pipeline on 50 multimodal data sets. In order to quantify the robustness of the connectome extraction part of the pipeline, we computed several metrics that quantify its rescan reliability and compared them to other tractography approaches.
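    A minimal illustration of one such reliability metric (not necessarily one of the paper's exact measures): the Pearson correlation between the off-diagonal entries of the SC matrices extracted from a scan and a rescan of the same subject.

    ```python
    import numpy as np

    def connectome_reliability(sc_a, sc_b):
        """Pearson correlation of the upper-triangular (off-diagonal)
        entries of two structural connectivity matrices."""
        iu = np.triu_indices_from(sc_a, k=1)
        return np.corrcoef(sc_a[iu], sc_b[iu])[0, 1]

    # Synthetic scan/rescan pair: the rescan is the scan plus small noise.
    rng = np.random.default_rng(0)
    scan = rng.random((10, 10))
    scan = (scan + scan.T) / 2                       # symmetrize
    rescan = scan + 0.01 * rng.standard_normal(scan.shape)
    rescan = (rescan + rescan.T) / 2
    r = connectome_reliability(scan, rescan)         # close to 1 for a reliable extraction
    ```
    
    
    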

    Together with the pipeline we present several principles to guide future efforts to standardize brain model construction. The code of the pipeline and the fully processed data sets are made available to the public via The Virtual Brain website and via GitHub.

    Furthermore, the pipeline can be used directly with High Performance Computing (HPC) resources on the Neuroscience Gateway Portal through a convenient web interface.

    More resources for this lesson

    GitHub repository with code