Journal articles

Improving the CONTES method for normalizing biomedical text entities with concepts from an ontology with (almost) no training data at BLAH5

Abstract: Entity normalization, called entity linking in the general domain, is an information extraction task that aims to annotate words or expressions in raw text with semantic references, such as concepts of an ontology. An ontology consists minimally of a formally organized vocabulary or hierarchy of terms that captures the knowledge of a domain. Presently, machine-learning methods, often coupled with distributional representations, achieve good performance. However, they require large training datasets, which are not always available, especially for tasks in specialized domains. CONTES (CONcept-TErm System) is a supervised method that addresses entity normalization with ontology concepts using small training datasets. CONTES has several limitations: it does not scale well to very large ontologies, it tends to overgeneralize predictions, and it lacks valid representations for out-of-vocabulary words. Here, we assess different methods to reduce the dimensionality of the ontology representation. We also propose a parameter calibration to make predictions more accurate, and a dedicated method to address out-of-vocabulary words.
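The abstract describes the CONTES pipeline only at a high level, but its core idea (a supervised mapping from term embeddings to ontology concept vectors, plus a dimensionality reduction of the concept space) can be illustrated with a short sketch. The following Python example is one plausible reading of that pipeline, not the authors' actual implementation: the array sizes, the random placeholder data, the truncated SVD for reducing the concept representation, the least-squares projection, and the cosine-similarity prediction step are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 60 annotated mentions, 100-dim word embeddings,
# an ontology of 300 concepts whose representation is reduced to 50 dimensions.
n_train, dim_emb, n_concepts, k = 60, 100, 300, 50

# Term side: each mention represented by an embedding, e.g. the average of its
# word vectors (random placeholders here stand in for real embeddings).
X = rng.normal(size=(n_train, dim_emb))

# Ontology side: each concept as a binary vector over the ontology, marking the
# concept itself and (here, randomly chosen) ancestors.
concept_matrix = (rng.random((n_concepts, n_concepts)) < 0.05).astype(float)
np.fill_diagonal(concept_matrix, 1.0)

# Dimensionality reduction of the concept representation with a truncated SVD,
# one possible way to address the scaling issue mentioned in the abstract.
U, s, Vt = np.linalg.svd(concept_matrix, full_matrices=False)
concept_reduced = concept_matrix @ Vt[:k].T          # shape (n_concepts, k)

# Each training mention is annotated with one concept; build the target matrix.
labels = rng.integers(0, n_concepts, size=n_train)
Y = concept_reduced[labels]                          # shape (n_train, k)

# Supervised step: a linear map from embedding space to the (reduced) concept
# space, fitted by ordinary least squares.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)            # shape (dim_emb, k)

def normalize(v):
    # Row-wise L2 normalization for cosine similarity.
    return v / (np.linalg.norm(v, axis=-1, keepdims=True) + 1e-12)

# Prediction: project a new mention and pick the closest concept by cosine similarity.
new_mention = rng.normal(size=(1, dim_emb))          # placeholder embedding of an unseen mention
scores = normalize(new_mention @ W) @ normalize(concept_reduced).T
print("predicted concept index:", int(scores.argmax()))
```

With real data, the placeholder arrays would be replaced by word embeddings for the annotated mentions and by ancestor-indicator vectors derived from the ontology hierarchy; the choice of the reduced dimension k and of the similarity measure are among the parameters the abstract proposes to calibrate.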
Document type: Journal articles

https://hal.inrae.fr/hal-02947689
Contributor: Migration Prodinra
Submitted on: Thursday, September 24, 2020 - 9:29:15 AM
Last modification on: Thursday, June 3, 2021 - 3:38:29 AM



Citation

Arnaud Ferré, Mouhamadou Ba, Robert Bossy. Improving the CONTES method for normalizing biomedical text entities with concepts from an ontology with (almost) no training data at BLAH5. Genomics & Informatics, 2019, 17 (2), pp.e20. ⟨10.5808/GI.2019.17.2.e20⟩. ⟨hal-02947689⟩
