Abstract
Humans are remarkably efficient at recognizing objects, yet understanding how the brain performs object recognition has been challenging. Our understanding has advanced substantially in recent years with the development of multivariate pattern analysis, or brain decoding, methods. Most state-of-the-art decoding procedures use the mean signal activation to extract object category information, which overlooks temporal variability in the signals. Here, we studied category-related information in 30 mathematically distinct features of the electroencephalography (EEG) signal (the largest set evaluated to date) across three independent and highly varied datasets using multivariate pattern analyses. While the event-related potential (ERP) components N1 and P2a were among the most informative features, the informative original signal samples and wavelet coefficients, down-sampled through principal component analysis, outperformed them. Informative features showed more pronounced effects in the theta frequency band, which has been shown to support feed-forward processing of visual information. Correlational analyses showed that the features providing the most information about object categories predicted participants' performance (reaction time) more accurately than the less informative features. These results provide researchers with new avenues to study how the brain encodes object category information and how we can read out object categories to study the temporal dynamics of the neural code.
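The decoding pipeline described above (extract per-trial features, down-sample them with principal component analysis, then classify object category) can be sketched as follows. This is a minimal illustration using scikit-learn on synthetic data, not the paper's actual datasets or feature set; the dimensions, number of categories, and effect size are all hypothetical placeholders.

```python
# Hedged sketch of feature-based EEG category decoding:
# PCA down-sampling followed by a linear classifier, scored
# with cross-validation. All data here are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_trials, n_features = 200, 512          # e.g. flattened wavelet coefficients
y = rng.integers(0, 4, size=n_trials)    # four hypothetical object categories
X = rng.normal(size=(n_trials, n_features))
X[:, :10] += 0.8 * y[:, None]            # inject category-related signal

# PCA reduces the high-dimensional feature vector before classification,
# mirroring the down-sampling step described in the abstract.
clf = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean decoding accuracy: {scores.mean():.2f} (chance = 0.25)")
```

Cross-validated accuracy above chance (0.25 for four categories) would indicate that the chosen feature carries category-related information; in the paper, this kind of decoding score is what allows different features to be ranked against one another.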
Competing Interest Statement
The authors have declared no competing interest.