

Published on September 12, 2019, in Vol 7, No 3 (2019): Jul-Sep


Works citing "Fine-Tuning Bidirectional Encoder Representations From Transformers (BERT)–Based Models on Large-Scale Electronic Health Record Notes: An Empirical Study"

According to Crossref, the following articles cite this article (DOI 10.2196/14830). Note that this is only a small subset of the citing works.

  1. Xu D, Gopale M, Zhang J, Brown K, Begoli E, Bethard S. Unified Medical Language System resources improve sieve-based generation and Bidirectional Encoder Representations from Transformers (BERT)–based ranking for concept normalization. Journal of the American Medical Informatics Association 2020.
  2. Li L, Wang P, Yan J, Wang Y, Li S, Jiang J, Sun Z, Tang B, Chang T, Wang S, Liu Y. Real-world data medical knowledge graph: construction and applications. Artificial Intelligence in Medicine 2020;103:101817
  3. Jim HSL, Hoogland AI, Brownstein NC, Barata A, Dicker AP, Knoop H, Gonzalez BD, Perkins R, Rollison D, Gilbert SM, Nanda R, Berglund A, Mitchell R, Johnstone PAS. Innovations in research and clinical care using patient-generated health data. CA: A Cancer Journal for Clinicians 2020;70(3):182