TY - JOUR
T1 - TranDTA: Prediction of Drug–Target Binding Affinity Using Transformer Representations
JF - bioRxiv
DO - 10.1101/2021.09.30.462610
SP - 2021.09.30.462610
AU - Mahsa Saadat
AU - Armin Behjati
AU - Fatemeh Zare-Mirakabad
AU - Sajjad Gharaghani
Y1 - 2021/01/01
UR - http://biorxiv.org/content/early/2021/10/01/2021.09.30.462610.abstract
N2 - Drug discovery is generally difficult and expensive, and its success rate is low. One of the essential steps in the early stages of drug discovery and drug repurposing is identifying drug–target interactions. Although several methods use binary classification to predict whether an interaction between a drug and its target exists, it is more informative, and more challenging, to predict the strength of the binding between a drug and its target. Binding affinity indicates the strength of a drug–target pair interaction. In this regard, several computational methods have been developed to predict drug–target binding affinity. With the advent of deep learning methods, the accuracy of binding affinity prediction is improving. However, the input representation of these models strongly affects their results. Early models use only the sequences of the molecules, while later models focus on their structures. Although the more recent models predict binding affinity more accurately than the earlier ones, they require more data and resources for training. In this study, we present a method that uses a pre-trained transformer to represent the protein as model input. Although the pre-trained transformer extracts a feature vector from the protein sequence, its layers and attention heads can learn structural information, so the extracted feature vector encodes both the sequence and structural properties of the protein. Therefore, our method can also be run with modest computational resources (memory, CPU, and GPU). The results show that our model achieves competitive performance with state-of-the-art models. The data and trained model are available at http://bioinformatics.aut.ac.ir/TranDTA/. Competing Interest Statement: The authors have declared no competing interest.
ER -