Published in Vol 8, No 4 (2020): April

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/17787.
Modified Bidirectional Encoder Representations From Transformers Extractive Summarization Model for Hospital Information Systems Based on Character-Level Tokens (AlphaBERT): Development and Performance Evaluation

