Published in Vol 8, No 5 (2020): May

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/17644.
Document-Level Biomedical Relation Extraction Leveraging Pretrained Self-Attention Structure and Entity Replacement: Algorithm and Pretreatment Method Validation Study

Authors of this article:

Xiaofeng Liu1; Jianye Fan1; Shoubin Dong1
