
1. chinaXiv:202110.00071 [pdf]


何沧平; 许涛
Subjects: Computer Science >> Natural Language Understanding and Machine Translation


Submitted: 2021-10-20 · Hits: 1695 · Downloads: 181 · Comments: 0

2. chinaXiv:202010.00060 [pdf]


刘宜佳; 纪斌; 余杰; 谭郁松; 马俊; 吴庆波
Subjects: Computer Science >> Natural Language Understanding and Machine Translation

The ICD-9 term standardization task aims to standardize the colloquial terms that doctors record in medical records into the standard terms defined in the ninth revision of the International Classification of Diseases (ICD-9). In this paper, we first propose a BERT and Text Similarity Based Method (BTSBM), which combines a BERT classification model with a text-similarity algorithm: 1) an N-gram algorithm generates a Candidate Standard Term Set (CSTS) for each colloquial term, which serves as the training and test data for the next step; 2) a BERT classification model selects the correct standard term. In this BTSBM approach, if a larger CSTS is used as the test set, the training set must also remain large. However, each CSTS contains only one positive sample, so enlarging it causes a severe imbalance between positive and negative samples, which significantly degrades system performance. If the test set is instead kept relatively small, the CSTS accuracy (CSTSA) drops sharply, which imposes a very low upper bound on system performance. To address these problems, we then propose an optimized term standardization method called the Advanced BERT and Text Similarity Based Method (ABTSBM), which 1) uses a large-scale initial CSTS to maintain a high CSTSA and thus a high upper bound on system performance; 2) denoises the CSTS according to body structure, mitigating the positive-negative imbalance without lowering CSTSA; and 3) introduces the focal loss function to further balance positive and negative samples. Experiments show that ABTSBM achieves a precision of 83.5%, 0.6 percentage points higher than BTSBM, at a 26.7% lower computational cost.
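The focal loss mentioned in step 3 down-weights easy examples so that the scarce positives in each CSTS carry more of the gradient. A minimal pure-Python sketch of the binary form (the `gamma` and `alpha` defaults are common choices from the focal-loss literature, not values from this paper):

```python
import math

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss for a single prediction.

    p: predicted probability of the positive class (0 < p < 1)
    y: true label, 1 (positive) or 0 (negative)
    gamma: focusing parameter; larger values shrink the loss of
           well-classified examples faster
    alpha: class weight that counteracts the one-positive-per-CSTS
           imbalance described in the abstract
    """
    p_t = p if y == 1 else 1.0 - p          # probability of the true class
    alpha_t = alpha if y == 1 else 1.0 - alpha
    # (1 - p_t)^gamma is near 0 for easy examples, near 1 for hard ones.
    return -alpha_t * (1.0 - p_t) ** gamma * math.log(p_t)
```

With `gamma=2`, a confident correct prediction (`p_t = 0.9`) is down-weighted by a factor of 100 relative to plain cross-entropy, while a hard misclassified example (`p_t = 0.1`) keeps most of its loss.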

Submitted: 2020-10-27 · Hits: 11264 · Downloads: 1312 · Comments: 0

3. chinaXiv:201910.00076 [pdf]

Masked Sentence Model based on BERT for Move Recognition in Medical Scientific Abstracts

Yu, Gaihong; Zhang, Zhixiong; Liu, Huan ; Ding, Liangping
Subjects: Computer Science >> Natural Language Understanding and Machine Translation

Purpose: Move recognition in scientific abstracts is an NLP task of classifying the sentences of an abstract into different types of language units. To improve performance on this task, we propose a novel move-recognition model that outperforms the BERT-Base method.

Design: Prevalent BERT-based models for sentence classification often classify sentences without considering their context. Inspired by BERT's Masked Language Model (MLM), we propose a novel Masked Sentence Model that integrates both the content and the contextual information of sentences for move recognition. Experiments are conducted on the benchmark dataset PubMed 20K RCT in three steps, and our model is then compared with HSLN-RNN, BERT-Base, and SciBERT on the same dataset.

Findings: The F1 score of our model exceeds that of BERT-Base and SciBERT by 4.96% and 4.34% respectively, demonstrating the feasibility and effectiveness of the model, and our result comes closest to the current state-of-the-art achieved by HSLN-RNN.

Research limitations: The sequential features of move labels are not considered, which may be one reason why HSLN-RNN performs better. Our model is also restricted to biomedical English literature, since it is fine-tuned on data from PubMed, a typical biomedical database.

Practical implications: The proposed model is simpler and more effective at identifying move structure in scientific abstracts, and is worth applying in text-classification experiments that need to capture the contextual features of sentences.

Originality: The study proposes a Masked Sentence Model based on BERT that accounts for the contextual features of the sentences in an abstract in a new way; the performance of the classification model is significantly improved by rebuilding the input layer without changing the structure of the neural network.
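One plausible way to realize "rebuilding the input layer" is to pair each sentence with the abstract context in which that sentence is masked out. The sketch below is an illustrative reading of the idea, not the authors' implementation; the `[MASK]` placeholder and the (sentence, context) pairing scheme are assumptions:

```python
MASK = "[MASK]"

def build_masked_inputs(sentences):
    """For each sentence in an abstract, pair it with a context string
    in which that sentence is replaced by a [MASK] placeholder, so a
    downstream classifier sees both the sentence's own content and its
    surrounding context."""
    inputs = []
    for i, sent in enumerate(sentences):
        context = " ".join(MASK if j == i else s
                           for j, s in enumerate(sentences))
        inputs.append((sent, context))
    return inputs
```

Each resulting pair could then be fed to a BERT-style encoder as a two-segment input (sentence as segment A, masked abstract as segment B), keeping the network architecture itself unchanged.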

Submitted: 2019-10-29 · Hits: 56448 · Downloads: 2152 · Comments: 0

4. chinaXiv:201905.00012 [pdf]

Transfer Learning for Scientific Data Chain Extraction in Small Chemical Corpus with BERT-CRF Model

Na Pang; Li Qian; Weimin Lyu; Jin-Dong Yang
Subjects: Computer Science >> Natural Language Understanding and Machine Translation

Computational chemistry has developed rapidly in recent years thanks to growth and breakthroughs in AI. Progress in natural language processing lets researchers extract more fine-grained knowledge from publications to stimulate development in computational chemistry. Whereas existing work and corpora on chemical entity extraction have been restricted to the biomedical or life-science fields rather than chemistry itself, we build a new corpus in the chemical-bond field annotated for 7 types of entities: compound, solvent, method, bond, reaction, pKa and pKa value. This paper presents a novel BERT-CRF model that builds scientific chemical data chains by extracting these 7 chemical entities and their relations from publications, and we further propose a joint model that extracts the entities and relations simultaneously. Experimental results on our Chemical Special Corpus demonstrate state-of-the-art and competitive NER performance.
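In a BERT-CRF tagger, BERT supplies per-token emission scores and a CRF layer decodes the most likely tag sequence jointly, rather than picking each token's tag independently. A minimal pure-Python Viterbi decoder illustrates that decoding step (the tag names and dict-based score format are illustrative assumptions, not the authors' implementation):

```python
def viterbi_decode(emissions, transitions, tags):
    """Return the highest-scoring tag sequence.

    emissions: list of {tag: score} dicts, one per token
               (in BERT-CRF these come from the encoder)
    transitions: {(prev_tag, tag): score} learned by the CRF layer;
                 missing pairs default to 0.0
    tags: the tag inventory, e.g. ["O", "B-BOND", "I-BOND"]
    """
    scores = dict(emissions[0])   # best score of any path ending in each tag
    backptr = []                  # per step, the best predecessor of each tag
    for emit in emissions[1:]:
        step, new_scores = {}, {}
        for t in tags:
            prev, best = max(
                ((p, scores[p] + transitions.get((p, t), 0.0)) for p in tags),
                key=lambda x: x[1])
            new_scores[t] = best + emit[t]
            step[t] = prev
        scores = new_scores
        backptr.append(step)
    # Trace back from the best final tag to recover the full path.
    best_tag = max(scores, key=scores.get)
    path = [best_tag]
    for step in reversed(backptr):
        best_tag = step[best_tag]
        path.append(best_tag)
    return list(reversed(path))
```

Because transition scores are shared across the sequence, the CRF can learn constraints such as "I-BOND must follow B-BOND", which per-token softmax classification cannot enforce.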

Submitted: 2019-05-12 · Hits: 23882 · Downloads: 1590 · Comments: 0

[Page 1 of 1 · 4 records]