PT  - JOURNAL ARTICLE
AU  - Yang, Ziyue
AU  - Milas, Katarina A.
AU  - White, Andrew D.
TI  - Now What Sequence? Pre-trained Ensembles for Bayesian Optimization of Protein Sequences
AID - 10.1101/2022.08.05.502972
DP  - 2022 Jan 01
TA  - bioRxiv
PG  - 2022.08.05.502972
4099 - http://biorxiv.org/content/early/2022/08/06/2022.08.05.502972.short
4100 - http://biorxiv.org/content/early/2022/08/06/2022.08.05.502972.full
AB  - Pre-trained models have been transformative in natural language, computer vision, and now protein sequences by enabling accuracy with few training examples. We show how to use pre-trained sequence models in Bayesian optimization to design new protein sequences with minimal labels (i.e., few experiments). Pre-trained models give good predictive accuracy at low data, and Bayesian optimization guides the choice of which sequences to test. Pre-trained sequence models also obviate the common requirement of finite pools; any sequence can be considered. We show that significantly fewer labeled sequences are required for many sequence design tasks, including creating novel peptide inhibitors with AlphaFold. This work should enable calibrated predictions with few examples and iterative design with low data (1-50). Competing Interest Statement: The authors have declared no competing interest.