Published in Vol 8, No 12 (2020): December

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/23357, first published .
Using Character-Level and Entity-Level Representations to Enhance Bidirectional Encoder Representation From Transformers-Based Clinical Semantic Textual Similarity Model: ClinicalSTS Modeling Study

Journals

  1. Coghill J, Reis H. Hey BERT! Meet the Databases: Explorations of Bidirectional Encoder Representation from Transformers Model Use in Database Search Algorithms. Journal of Electronic Resources in Medical Libraries 2021;18(2-3):112
  2. Park W, Siddiqui I, Chakraborty C, Qureshi N, Shin D. Scarcity-aware spam detection technique for big data ecosystem. Pattern Recognition Letters 2022;157:67
  3. Kawazoe Y, Shimamoto K, Shibata D, Shinohara E, Kawaguchi H, Yamamoto T. Impact of a Clinical Text–Based Fall Prediction Model on Preventing Extended Hospital Stays for Elderly Inpatients: Model Development and Performance Evaluation. JMIR Medical Informatics 2022;10(7):e37913
  4. Kalyan K, Rajasekharan A, Sangeetha S. AMMU: A survey of transformer-based biomedical pretrained language models. Journal of Biomedical Informatics 2022;126:103982
  5. Lyu D, Wang X, Chen Y, Wang F. Language model and its interpretability in biomedicine: A scoping review. iScience 2024;27(4):109334