Human Connectome Project informatics: quality control, database services, and data visualization

Neuroimage. 2013 Oct 15;80:202-19. doi: 10.1016/j.neuroimage.2013.05.077. Epub 2013 May 24.

Abstract

The Human Connectome Project (HCP) has developed protocols, standard operating and quality control procedures, and a suite of informatics tools to enable high-throughput data collection, data sharing, automated data processing and analysis, and data mining and visualization. Quality control procedures include methods to maintain data collection consistency over time, to measure head motion, and to establish quantitative modality-specific overall quality assessments. Database services developed as customizations of the XNAT imaging informatics platform support both internal daily operations and open-access data sharing. The Connectome Workbench visualization environment enables user interaction with HCP data and is increasingly integrated with the HCP's database services. Here we describe the current state of these procedures and tools and their application in the ongoing HCP study.
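As a rough illustration of how XNAT-based database services of the kind described above expose data, the sketch below builds resource paths following XNAT's REST URL conventions (projects, subjects, experiments). The server name and all identifiers are illustrative assumptions for the example, not values taken from the paper:

```python
# Minimal sketch of XNAT REST URL construction (identifiers are hypothetical).
BASE_URL = "https://db.example.org"  # assumed XNAT server; not from the paper


def experiment_url(project: str, subject: str, experiment: str) -> str:
    """Build the REST path for one imaging session, per XNAT's
    /data/projects/{p}/subjects/{s}/experiments/{e} hierarchy."""
    return (f"{BASE_URL}/data/projects/{project}"
            f"/subjects/{subject}/experiments/{experiment}")


# Example: path to a hypothetical subject's MR session.
url = experiment_url("HCP_Example", "100001", "100001_MR")
print(url)
```

In a real client, such a URL would be fetched with authenticated HTTP requests (XNAT returns JSON or XML listings); the point here is only the hierarchical project/subject/experiment addressing scheme that XNAT-based services share.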

Publication types

  • Research Support, N.I.H., Extramural
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Brain / anatomy & histology*
  • Brain / physiology*
  • Computational Biology / methods*
  • Computational Biology / standards
  • Connectome / methods*
  • Connectome / standards
  • Data Mining / methods*
  • Data Mining / standards
  • Database Management Systems / standards
  • Databases, Factual*
  • Humans
  • Information Storage and Retrieval / methods
  • Information Storage and Retrieval / standards
  • Models, Anatomic
  • Models, Neurological
  • Nerve Net / anatomy & histology
  • Nerve Net / physiology
  • Quality Control
  • User-Computer Interface*