References
- Yu X, Rong W, Liu J, et al. LSTM-based end-to-end framework for biomedical event extraction [J]. IEEE/ACM Transactions on Computational Biology and Bioinformatics, 2019, 17(6): 2029-2039.
- Yang H, Chen Y, Liu K, et al. Multi-Turn and Multi-Granularity Reader for Document-Level Event Extraction [J]. ACM Transactions on Asian and Low-Resource Language Information Processing, 2022, 22(2): 1-16.
- Xu R, Liu T, Li L, et al. Document-level event extraction via heterogeneous graph-based interaction model with a tracker [J]. arXiv preprint arXiv:2105.14924, 2021.
- Devlin J, Chang M W, Lee K, et al. BERT: Pre-training of deep bidirectional transformers for language understanding [J]. arXiv preprint arXiv:1810.04805, 2018.
- LI Xiangyang, ZHANG Huan, ZHOU Xiaohua. Chinese clinical named entity recognition with variant neural structures based on BERT methods [J]. Journal of Biomedical Informatics, 2020, 107: 103422.
- LI Ni, GUAN Huanmei, YANG Piao, et al. BERT-IDCNN-CRF for named entity recognition in Chinese [J]. Journal of Shandong University (Natural Science), 2020, 55(1): 102-109.
- YUAN Jian, ZHANG Haibo. Chinese Entity Recognition Model of Multi-granularity Fusion Embedded [J]. Journal of Chinese Computer Systems, 2022, 43(4): 741-746.
- YANG Zhenyu, ZHANG Denghui. A complex long sentence intent classification method combining BERT and two-layer LSTM [J]. Computer Applications and Software, 2021, 38(12): 207-212.