A clinical specific BERT developed with huge size of Japanese clinical narrative
2020-07-09
health informatics
Title + abstract only
View on medRxiv

Abstract: Generalized language models that pre-trained with a large corpus have achieved great performance on natural language tasks. While many pre-trained transformers for English are published, few models are available for Japanese text, especially in clinical medicine. In this work, we demonstrate a development of a clinical specific BERT model with a huge size of Japanese clinical narrative and evaluated it on the NTCIR-13 MedWeb that has pseudo-Twitter messages about medical concerns with eight labe...

Predicted journal destinations

Rank  Journal                                                  Probability  Percentile  Training papers
 1    Journal of Biomedical Informatics                        14.5%        Top 0.2%      37
 2    Journal of Medical Internet Research                      8.9%        Top 0.9%      81
 3    Journal of the American Medical Informatics Association   8.9%        Top 2%        53
 4    Scientific Reports                                        8.9%        Top 21%      701
 5    PLOS Digital Health                                       7.7%        Top 2%        88
 6    JAMIA Open                                                6.4%        Top 2%        35
 7    BMC Medical Informatics and Decision Making               6.4%        Top 1%        36
 8    npj Digital Medicine                                      5.0%        Top 5%        85
 9    PLOS ONE                                                  5.0%        Top 82%     1737
10    Computers in Biology and Medicine                         4.1%        Top 1%        39
11    Patterns                                                  3.8%        #1            15
12    International Journal of Medical Informatics              3.8%        Top 1%        25
13    JMIR Medical Informatics                                  1.9%        Top 2%        16
14    Communications Medicine                                   1.3%        Top 2%        63
15    Heliyon                                                   1.2%        Top 6%        57
16    BMC Medical Research Methodology                          1.2%        Top 5%        41
17    JMIR Formative Research                                   0.9%        Top 5%        31
18    Journal of Personalized Medicine                          0.9%        Top 0.7%      17
19    Frontiers in Digital Health                               0.7%        Top 4%        18
20    Frontiers in Public Health                                0.7%        Top 33%      135
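The abstract describes evaluation on NTCIR-13 MedWeb, where each pseudo-Twitter message carries eight binary symptom labels. As a rough illustration of how such multi-label output is typically scored, here is a minimal sketch of exact-match accuracy and micro-averaged F1, two metrics commonly reported for multi-label tasks; the label names and toy data below are invented for illustration and are not from the paper:

```python
# Multi-label evaluation sketch for a MedWeb-style task: each message has
# eight binary labels. Label names below are illustrative placeholders.
from typing import List

LABELS = ["influenza", "diarrhea", "hayfever", "cough",
          "headache", "fever", "runnynose", "cold"]

def exact_match(y_true: List[List[int]], y_pred: List[List[int]]) -> float:
    """Fraction of messages whose full 8-label vector is predicted exactly."""
    hits = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    return hits / len(y_true)

def micro_f1(y_true: List[List[int]], y_pred: List[List[int]]) -> float:
    """Micro-averaged F1: pool true/false positives over all labels."""
    tp = fp = fn = 0
    for t, p in zip(y_true, y_pred):
        for ti, pi in zip(t, p):
            if pi == 1 and ti == 1:
                tp += 1
            elif pi == 1 and ti == 0:
                fp += 1
            elif pi == 0 and ti == 1:
                fn += 1
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Toy example: two messages, eight labels each.
y_true = [[1, 0, 0, 1, 0, 1, 0, 0],
          [0, 0, 1, 0, 0, 0, 1, 0]]
y_pred = [[1, 0, 0, 1, 0, 1, 0, 0],   # exact match
          [0, 0, 1, 0, 1, 0, 0, 0]]   # one false positive, one false negative

print(exact_match(y_true, y_pred))  # 0.5
print(micro_f1(y_true, y_pred))     # 0.8
```

Exact match is strict (one wrong label fails the whole message), while micro-F1 gives partial credit per label, which is why both are usually reported together.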