
A clinical specific BERT developed with huge size of Japanese clinical narrative

2020-07-09 · health informatics
View on medRxiv

Generalized language models pre-trained on a large corpus have achieved strong performance on natural language tasks. While many pre-trained transformers for English have been published, few models are available for Japanese text, especially in clinical medicine. In this work, we describe the development of a clinical-domain BERT model pre-trained on a large corpus of Japanese clinical narratives and evaluate it on NTCIR-13 MedWeb, which contains pseudo-Twitter messages about medical concerns with eight labels...

Predicted journal destinations