Conference paper

Learning Embeddings from Free-text Triage Notes using Pretrained Transformer Models

Abstract: The advent of transformer models has allowed tremendous progress in the Natural Language Processing (NLP) domain. Pretrained transformers have successfully delivered state-of-the-art performance on a myriad of NLP tasks. This study presents an application of transformers to learn contextual embeddings from free-text triage notes, which are routinely recorded at the emergency department. A large-scale retrospective cohort of more than 260K triage notes was provided by the University Hospital of Amiens-Picardy in France. We utilize a set of Bidirectional Encoder Representations from Transformers (BERT) models for the French language. The quality of the embeddings is empirically examined using a set of clustering models. In this regard, we provide a comparative analysis of popular models including CamemBERT, FlauBERT, and mBART. The study can be regarded as an addition to the ongoing contributions applying the BERT approach in the healthcare context.
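The clustering-based evaluation described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the random vectors stand in for note embeddings that would, in practice, be obtained from a French pretrained transformer (e.g. mean-pooling CamemBERT's last hidden states over each triage note, 768 dimensions for the base model), and KMeans with the silhouette score is one plausible choice of clustering model and quality metric, since the paper does not specify them here.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Stand-in for transformer embeddings: in practice each row would be
# one triage note's embedding, e.g. the mean of CamemBERT's last
# hidden states over the note's tokens (768-d for camembert-base).
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(1000, 768))

# Cluster the embeddings and score the resulting partition; a higher
# silhouette suggests better-separated, more informative embeddings.
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0)
labels = kmeans.fit_predict(embeddings)
score = silhouette_score(embeddings, labels)
print(f"silhouette score: {score:.3f}")
```

Repeating this measurement on embeddings produced by different pretrained models (CamemBERT, FlauBERT, mBART) gives one concrete way to compare them, in the spirit of the paper's comparative analysis.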
Contributor: Louise DESSAIVRE
Submitted on: Thursday, May 5, 2022 - 14:44:01
Last modified on: Sunday, August 21, 2022 - 13:38:20

Emilien Arnaud, Mahmoud Elbattah, Maxime Gignon, Gilles Dequen. Learning Embeddings from Free-text Triage Notes using Pretrained Transformer Models. HEALTHINF: PROCEEDINGS OF THE 15TH INTERNATIONAL JOINT CONFERENCE ON BIOMEDICAL ENGINEERING SYSTEMS AND TECHNOLOGIES - VOL 5: HEALTHINF, Feb 2022, Lisbonne, Portugal. pp.835-841, ⟨10.5220/0011012800003123⟩. ⟨hal-03659989⟩
