A quick and easy way to estimate entropy and mutual information for neuroscience

Mickael Zbili, Sylvain Rama
doi: https://doi.org/10.1101/2020.08.04.236174
Mickael Zbili
1. Lyon Neuroscience Research Center, INSERM U1028-CNRS UMR 5292-Université Claude Bernard Lyon 1 - Lyon, France
Sylvain Rama
2. Laboratory of Synaptic Imaging, Department of Clinical and Experimental Epilepsy, UCL Queen Square Institute of Neurology, University College London - London, United Kingdom
  • For correspondence: s.rama@ucl.ac.uk, rama.sylvain@gmail.com

Abstract

Calculations of the entropy of a signal or the mutual information between two variables are valuable analytical tools in the field of neuroscience. They can be applied to all types of data, capture nonlinear interactions and are model independent. Yet the limited size and number of recordings one can collect in a series of experiments make these calculations highly prone to sampling bias. Mathematical methods to overcome this so-called “sampling disaster” exist, but they require significant expertise as well as substantial time and computational resources. There is therefore a need for a simple, unbiased and computationally efficient tool for estimating entropy and mutual information. In this paper, we propose that entropy-encoding compression algorithms widely used in text and image compression fulfill these requirements. By simply saving a signal in the PNG picture format and measuring the size of the resulting file on the hard drive, we can estimate how its entropy changes across conditions. Furthermore, with some simple modifications of the PNG file, we can also estimate how the mutual information between a stimulus and the observed responses evolves across conditions. We first demonstrate the applicability of this method using white-noise-like signals. Then, although the method can be used under all kinds of experimental conditions, we provide examples of its application to patch-clamp recordings, the detection of place cells and histological data. Although this method does not give an absolute value of entropy or mutual information, it is mathematically correct, and its simplicity and wide availability make it a powerful tool for estimating both across experimental conditions.

Keywords

  • Entropy
  • Mutual Information
  • PNG
  • DEFLATE
  • Rastergram
  • Lossless compression
  • Place Fields
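
To make the approach concrete, below is a minimal sketch in Python. It assumes only NumPy and Pillow, and uses an in-memory buffer as a stand-in for measuring the file size on the hard drive. This is not the authors' pipeline (their code is in the PNG-Entropy repository linked in the footnotes). In particular, the mutual-information proxy shown, S(X) + S(Y) - S(X,Y) with each entropy term replaced by a compressed size and the joint term obtained by stacking the two signals into a single image, is only one plausible reading of the “simple modifications of the PNG file” mentioned in the abstract.

import io

import numpy as np
from PIL import Image

def png_bytes(arr):
    """Encode a 2-D array as an 8-bit grayscale PNG; return its size in bytes.
    PNG relies on lossless DEFLATE compression, so a less predictable
    (higher-entropy) signal should yield a larger file."""
    buf = io.BytesIO()  # in-memory stand-in for the file on the hard drive
    Image.fromarray(arr.astype(np.uint8)).save(buf, format="PNG", optimize=True)
    return buf.getbuffer().nbytes

def mi_proxy(x, y):
    """Compression-based proxy for the mutual information between x and y:
    S(x) + S(y) - S(x, y), with the joint term approximated by compressing
    the two arrays stacked into one image (an assumption, see above)."""
    return png_bytes(x) + png_bytes(y) - png_bytes(np.vstack([x, y]))

rng = np.random.default_rng(0)
noise = rng.integers(0, 256, size=(32, 512))   # white-noise-like signal
flat = np.zeros((32, 512))                     # constant, minimal-entropy signal
print(png_bytes(noise), png_bytes(flat))       # the noisy signal should compress far less

copy = noise.copy()                            # response identical to the stimulus
other = rng.integers(0, 256, size=(32, 512))   # response independent of the stimulus
print(mi_proxy(noise, copy), mi_proxy(noise, other))  # the copy should give a larger proxy

As in the paper, only the relative changes of these byte counts across conditions are meaningful; the numbers are not absolute entropy or mutual information values.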

Competing Interest Statement

The authors have declared no competing interest.

Footnotes

  • Version 3 of this preprint has been peer-reviewed and recommended by Peer Community In Circuit Neuroscience (https://doi.org/10.24072/pci.cneuro.100001)

  • https://github.com/Sylvain-Deposit/PNG-Entropy

Copyright 
The copyright holder for this preprint is the author/funder, who has granted bioRxiv a license to display the preprint in perpetuity. It is made available under a CC-BY-NC 4.0 International license.
Posted April 08, 2021.
Subject Area

  • Neuroscience