PT - JOURNAL ARTICLE
AU - Alexander Rives
AU - Joshua Meier
AU - Tom Sercu
AU - Siddharth Goyal
AU - Zeming Lin
AU - Jason Liu
AU - Demi Guo
AU - Myle Ott
AU - C. Lawrence Zitnick
AU - Jerry Ma
AU - Rob Fergus
TI - Biological structure and function emerge from scaling unsupervised learning to 250 million protein sequences
AID - 10.1101/622803
DP - 2020 Jan 01
TA - bioRxiv
PG - 622803
4099 - http://biorxiv.org/content/early/2020/12/15/622803.short
4100 - http://biorxiv.org/content/early/2020/12/15/622803.full
AB - In the field of artificial intelligence, a combination of scale in data and model capacity enabled by unsupervised learning has led to major advances in representation learning and statistical generation. In the life sciences, the anticipated growth of sequencing promises unprecedented data on natural sequence diversity. Protein language modeling at the scale of evolution is a logical step toward predictive and generative artificial intelligence for biology. To this end we use unsupervised learning to train a deep contextual language model on 86 billion amino acids across 250 million protein sequences spanning evolutionary diversity. The resulting model contains information about biological properties in its representations. The representations are learned from sequence data alone. The learned representation space has a multi-scale organization reflecting structure from the level of biochemical properties of amino acids to remote homology of proteins. Information about secondary and tertiary structure is encoded in the representations and can be identified by linear projections. Representation learning produces features that generalize across a range of applications, enabling state-of-the-art supervised prediction of mutational effect and secondary structure, and improving state-of-the-art features for long-range contact prediction. Competing Interest Statement: The authors have declared no competing interest.