Search Articles


Search Results (1 to 10 of 53)



Extracting Family History of Patients From Clinical Narratives: Exploring an End-to-End Solution With Deep Learning Models

We denoted the BERT-based NER model as BERT-ner, the BERT-based family member attribute (ie, family role, side of family, negation, living status) classification module as BERT-cls, and the relation extraction module as BERT-rel. Figure 3 illustrates the fine-tuning

Xi Yang, Hansi Zhang, Xing He, Jiang Bian, Yonghui Wu

JMIR Med Inform 2020;8(12):e22982


Depression Risk Prediction for Chinese Microblogs via Deep-Learning Methods: Content Analysis

Here, we further investigated three deep-learning methods with pretrained language representation models, BERT, robustly optimized BERT pretraining approach (RoBERTa) [18], and generalized autoregressive pretraining for language understanding (XLNET) [19],

Xiaofeng Wang, Shuai Chen, Tao Li, Wanting Li, Yejie Zhou, Jie Zheng, Qingcai Chen, Jun Yan, Buzhou Tang

JMIR Med Inform 2020;8(7):e17958


Building a Pharmacogenomics Knowledge Model Toward Precision Medicine: Case Study in Melanoma

The BERT–CRF model [13] and the multilingual BERT model [14] were trained on different languages, such as Portuguese, and the F1 score was ultimately improved. Today, the BERT model has also been applied in biomedical research.

Hongyu Kang, Jiao Li, Meng Wu, Liu Shen, Li Hou

JMIR Med Inform 2020;8(10):e20291


Using Character-Level and Entity-Level Representations to Enhance Bidirectional Encoder Representation From Transformers-Based Clinical Semantic Textual Similarity Model: ClinicalSTS Modeling Study

The system is based on bidirectional encoder representation from transformers (BERT) [18] and includes 2 other types of representations besides BERT: (1) character-level representation to tackle the out-of-vocabulary (OOV) problem in natural language processing

Ying Xiong, Shuai Chen, Qingcai Chen, Jun Yan, Buzhou Tang

JMIR Med Inform 2020;8(12):e23357
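
The last result above mentions using character-level representations to handle out-of-vocabulary (OOV) tokens. As a generic illustration only (not the authors' model), a fastText-style fallback builds a vector for an unseen word from its character n-grams; the function names and the hashed toy embedding table below are purely hypothetical:

```python
import hashlib

def char_ngrams(word, n_min=3, n_max=5):
    """Character n-grams of a word, with <...> boundary markers (fastText-style)."""
    w = f"<{word}>"
    grams = []
    for n in range(n_min, n_max + 1):
        grams.extend(w[i:i + n] for i in range(len(w) - n + 1))
    return grams

def embed(word, vocab_vectors, dim=8):
    """Return the word's vector if in-vocabulary; otherwise average
    per-n-gram vectors so an OOV word still gets a representation."""
    if word in vocab_vectors:
        return vocab_vectors[word]
    vecs = []
    for g in char_ngrams(word):
        # deterministic pseudo-random vector per n-gram: a toy stand-in
        # for a learned character-embedding table
        h = int(hashlib.md5(g.encode()).hexdigest(), 16)
        vecs.append([((h >> (4 * i)) % 256) / 255.0 for i in range(dim)])
    return [sum(component) / len(vecs) for component in zip(*vecs)]
```

In a real model the n-gram vectors would be trained jointly with the rest of the network; the hashing here only makes the sketch self-contained and deterministic.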