Published on 12.09.19 in Vol 7, No 3 (2019): Jul-Sep
Works citing "Fine-Tuning Bidirectional Encoder Representations From Transformers (BERT)–Based Models on Large-Scale Electronic Health Record Notes: An Empirical Study"
According to Crossref, the following articles are citing this article (DOI 10.2196/14830):
(Note that this is only a small subset of citations.)
- Xu D, Gopale M, Zhang J, Brown K, Begoli E, Bethard S. Unified Medical Language System resources improve sieve-based generation and Bidirectional Encoder Representations from Transformers (BERT)–based ranking for concept normalization. Journal of the American Medical Informatics Association 2020.
- Li L, Wang P, Yan J, Wang Y, Li S, Jiang J, Sun Z, Tang B, Chang T, Wang S, Liu Y. Real-world data medical knowledge graph: construction and applications. Artificial Intelligence in Medicine 2020;103:101817.
- Jim HSL, Hoogland AI, Brownstein NC, Barata A, Dicker AP, Knoop H, Gonzalez BD, Perkins R, Rollison D, Gilbert SM, Nanda R, Berglund A, Mitchell R, Johnstone PAS. Innovations in research and clinical care using patient‐generated health data. CA: A Cancer Journal for Clinicians 2020;70(3):182.