RT Journal Article
SR Electronic
T1 Incorporating knowledge of plates in batch normalization improves generalization of deep learning for microscopy images
JF bioRxiv
FD Cold Spring Harbor Laboratory
SP 2022.10.14.512286
DO 10.1101/2022.10.14.512286
A1 Lin, Alexander
A1 Lu, Alex X.
YR 2022
UL http://biorxiv.org/content/early/2022/10/18/2022.10.14.512286.abstract
AB Data collected by high-throughput microscopy experiments are affected by batch effects, stemming from slight technical differences between experimental batches. Batch effects significantly impede machine learning efforts, as models learn spurious technical variation that does not generalize. We introduce batch effects normalization (BEN), a simple method for correcting batch effects that can be applied to any neural network with batch normalization (BN) layers. BEN aligns the concept of a “batch” in biological experiments with that of a “batch” in deep learning. During each training step, data points forming the deep learning batch are always sampled from the same experimental batch. This small tweak turns the batch normalization layers into an estimate of the shared batch effects between images, allowing these technical effects to be standardized out during training and inference. We demonstrate that BEN yields dramatic performance boosts in both supervised and unsupervised learning, leading to state-of-the-art performance on the RxRx1-Wilds benchmark.
Competing Interest Statement: The authors have declared no competing interest.
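The abstract describes the core of BEN: each deep learning mini-batch is drawn entirely from a single experimental batch, so that BN statistics reflect that batch's shared technical effects. A minimal sketch of such a batch sampler is below; the function name `ben_batches` and its interface are illustrative assumptions, not the authors' implementation.

```python
import random
from collections import defaultdict

def ben_batches(experimental_batch_ids, batch_size, seed=0):
    """Group dataset indices into mini-batches, each drawn from one
    experimental batch (a sketch of BEN-style sampling, not the
    authors' code).

    experimental_batch_ids: per-sample experimental batch label,
    e.g. the plate each image came from.
    """
    rng = random.Random(seed)

    # Bucket dataset indices by their experimental batch.
    groups = defaultdict(list)
    for idx, batch_id in enumerate(experimental_batch_ids):
        groups[batch_id].append(idx)

    # Within each experimental batch, shuffle and chunk into
    # mini-batches; every mini-batch is therefore homogeneous.
    batches = []
    for idxs in groups.values():
        rng.shuffle(idxs)
        for i in range(0, len(idxs), batch_size):
            batches.append(idxs[i:i + batch_size])

    # Shuffle the order in which mini-batches are visited, so
    # training still interleaves experimental batches across steps.
    rng.shuffle(batches)
    return batches
```

In a training loop, the index lists returned here would feed a data loader (e.g. as a custom batch sampler), so BN layers normalize each forward pass with statistics computed from a single experimental batch.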