Adera: A drug repurposing workflow for neuro-immunological investigations using neural networks.

Lazarczyk, M.; Mickael, M. E.; Sacharczuk, M.

2022-07-14 bioinformatics
10.1101/2022.07.14.500072 bioRxiv

Drug repurposing in the context of neuro-immunological (NI) investigations is still in its early stages. Drug repurposing is an important approach that bypasses lengthy drug-discovery procedures by finding new uses for known medications. Neuro-immunological diseases such as Alzheimer's disease, Parkinson's disease, multiple sclerosis, and depression involve pathologies that result from interactions between the central nervous system and the immune system. However, repurposing medications is hindered by the vast amount of information that must be mined. To address the need to repurpose known medications for neuro-immunological diseases, we built a deep neural network named Adera to perform drug repurposing. The model uses two deep learning networks. The first network is an encoder whose main task is to embed text into matrices. For the second network, we explored two different loss functions, binary cross-entropy and mean squared error (MSE), and investigated the effect of ten different network architectures with each loss function. Our results show that for the binary cross-entropy loss function, the best architecture consists of two convolutional layers and achieves a loss of less than 0.001. In the case of the MSE loss function, a shallow network using the aReLU activation achieved an accuracy of over 98% and a loss of 0.001. Additionally, Adera was able to predict various drug repurposing targets in agreement with the Drug Repurposing Hub. These results establish the ability of Adera to repurpose, with high accuracy, drug candidates that can shorten the drug development cycle. The software can be downloaded from https://github.com/michel-phylo/ADERA1.
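The abstract compares two loss functions for the second network, binary cross-entropy and MSE. As a minimal sketch of what is being compared (the toy predictions below are illustrative, not taken from the paper), the two losses can be written directly in numpy:

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-7):
    """BCE loss; predictions are clipped away from 0/1 to keep log() finite."""
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

def mean_squared_error(y_true, y_pred):
    """MSE loss: average squared difference between labels and predictions."""
    return np.mean((y_true - y_pred) ** 2)

# Illustrative labels and near-correct predictions (hypothetical values).
y_true = np.array([1.0, 0.0, 1.0, 0.0])
y_pred = np.array([0.99, 0.02, 0.97, 0.01])

print(binary_cross_entropy(y_true, y_pred))  # ≈ 0.0177
print(mean_squared_error(y_true, y_pred))    # ≈ 0.0004
```

Note that even for the same near-perfect predictions the two losses sit on different scales, which is why the abstract reports the sub-0.001 thresholds per loss function rather than comparing the raw values across them.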

Matching journals

The top 11 journals account for 50% of the predicted probability mass.

Rank  Journal                                                            Papers in training set  Percentile  Probability
1     Bioinformatics                                                     1061                    Top 3%      7.1%
2     Computational and Structural Biotechnology Journal                 216                     Top 0.4%    6.7%
3     Artificial Intelligence in the Life Sciences                       11                      Top 0.1%    6.3%
4     BMC Bioinformatics                                                 383                     Top 2%      4.8%
5     Briefings in Bioinformatics                                        326                     Top 1%      4.8%
6     Bioinformatics Advances                                            184                     Top 0.7%    4.8%
7     Computers in Biology and Medicine                                  120                     Top 0.5%    4.8%
8     Scientific Reports                                                 3102                    Top 38%     3.5%
9     PLOS ONE                                                           4510                    Top 40%     3.5%
10    Frontiers in Pharmacology                                          100                     Top 1.0%    3.5%
11    Journal of Cheminformatics                                         25                      Top 0.2%    3.5%
----- 50% of probability mass above -----
12    Journal of Chemical Information and Modeling                       207                     Top 1%      3.5%
13    Patterns                                                           70                      Top 0.3%    2.8%
14    PLOS Computational Biology                                         1633                    Top 14%     2.1%
15    ImmunoInformatics                                                  11                      Top 0.1%    2.1%
16    Pharmaceutics                                                      21                      Top 0.2%    1.8%
17    Molecules                                                          37                      Top 0.9%    1.7%
18    Metabolites                                                        50                      Top 0.5%    1.7%
19    iScience                                                           1063                    Top 16%     1.6%
20    npj Systems Biology and Applications                               99                      Top 1%      1.6%
21    GigaScience                                                        172                     Top 2%      1.5%
22    IEEE/ACM Transactions on Computational Biology and Bioinformatics  32                      Top 0.3%    1.5%
23    Biology Methods and Protocols                                      53                      Top 1%      1.2%
24    Nucleic Acids Research                                             1128                    Top 14%     1.2%
25    IEEE Journal of Biomedical and Health Informatics                  34                      Top 1%      1.2%
26    Pharmaceuticals                                                    33                      Top 1%      0.9%
27    BMC Medical Genomics                                               36                      Top 0.9%    0.9%
28    BMC Genomics                                                       328                     Top 5%      0.8%
29    Neuroinformatics                                                   40                      Top 0.9%    0.8%
30    International Journal of Molecular Sciences                        453                     Top 16%     0.7%
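The "top 11 journals account for 50% of the predicted probability mass" claim can be checked directly from the listed percentages. A short sketch (the probabilities below are transcribed from the table above; they are rounded to one decimal, so the cutoff is approximate):

```python
# Predicted probabilities (%) for the 30 ranked journals, in table order.
probs = [7.1, 6.7, 6.3, 4.8, 4.8, 4.8, 4.8, 3.5, 3.5, 3.5, 3.5,
         3.5, 2.8, 2.1, 2.1, 1.8, 1.7, 1.7, 1.6, 1.6, 1.5, 1.5,
         1.2, 1.2, 1.2, 0.9, 0.9, 0.8, 0.8, 0.7]

def journals_for_mass(probs, target=50.0):
    """Count how many top-ranked journals are needed to reach `target`% mass."""
    total = 0.0
    for i, p in enumerate(probs, start=1):
        total += p
        if total >= target:
            return i
    return len(probs)

print(journals_for_mass(probs))  # → 11
```

The running total passes 50% between rank 10 (49.8%) and rank 11 (53.3%), which matches where the divider sits in the table.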