Implementation of specialised attention mechanisms: ICD-10 classification of Gastrointestinal discharge summaries in English, Spanish and Swedish


Bibliographic Details
Published in: Journal of Biomedical Informatics, Vol. 130, p. 104050
Main Authors: Blanco, Alberto; Remmer, Sonja; Pérez, Alicia; Dalianis, Hercules; Casillas, Arantza
Format: Journal Article
Language: English
Published: United States: Elsevier Inc., 01-06-2022
Description
Summary:
• PlaBERT: a multi-label text classifier with per-label attention.
• Electronic Health Record automatic coding in English, Spanish and Swedish.
• The attention mechanism computes a specific attention importance for each input token and label pair.
• The study focuses on 157 ICD-10 diagnostic terms within Chapter XI – Diseases of the Digestive System.
• PlaBERT outputs the computed attention importance for each token and label and allows its visualisation.

Multi-label classification according to the International Classification of Diseases (ICD) is an extreme multi-label classification task that aims to categorise health records according to a set of relevant ICD codes. We implemented PlaBERT, a new multi-label text classification head with per-label attention, on top of a BERT model. The model is assessed on Electronic Health Records containing discharge summaries in three languages: English, Spanish, and Swedish. The study focuses on 157 diagnostic codes from the ICD. We additionally measure the labelling noise to estimate the consistency of the gold standard. Our specialised attention mechanism computes attention weights for each input token and label pair, obtaining the specific relevance of every word with respect to each ICD code. The PlaBERT model outputs the computed attention importance for each token and label, allowing for visualisation. Our best results are 40.65, 38.36, and 41.13 F1-score points on the English, Spanish and Swedish datasets, respectively, for the 157 gastrointestinal codes. Moreover, precision is the metric that improves most owing to the attention mechanism, with gains of 44.63, 40.93, and 12.92 points for the Spanish, Swedish and English datasets, respectively.
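As a rough illustration of the per-label attention head described in the summary, the sketch below shows one way such a mechanism can be built on top of BERT token embeddings in PyTorch. All names, dimensions and design choices here are assumptions for illustration, not the authors' implementation: one query vector is learned per ICD code, each label attends separately over the input tokens, and the head returns both the per-label logits and the token-label attention weights that enable visualisation.

import torch
import torch.nn as nn

class PerLabelAttentionHead(nn.Module):
    """Hypothetical per-label attention head (not the PlaBERT source code)."""

    def __init__(self, hidden_size: int = 768, num_labels: int = 157):
        super().__init__()
        # One learned query vector per label: attention is specific to every
        # (token, label) pair rather than shared across labels.
        self.label_queries = nn.Parameter(torch.randn(num_labels, hidden_size))
        # Shared scorer applied to each label-specific document representation.
        self.scorer = nn.Linear(hidden_size, 1)

    def forward(self, token_embeddings, attention_mask):
        # token_embeddings: (batch, seq_len, hidden) from BERT's last layer
        # attention_mask:   (batch, seq_len), 1 for real tokens, 0 for padding
        scores = torch.einsum("bsh,lh->bls", token_embeddings, self.label_queries)
        scores = scores.masked_fill(attention_mask.unsqueeze(1) == 0, float("-inf"))
        attn = torch.softmax(scores, dim=-1)          # (batch, num_labels, seq_len)
        # Label-specific document representations from the attended tokens.
        label_repr = torch.einsum("bls,bsh->blh", attn, token_embeddings)
        logits = self.scorer(label_repr).squeeze(-1)  # (batch, num_labels)
        return logits, attn                           # attn supports visualisation

In a multi-label setting such logits would typically be trained with an independent binary decision per code (e.g. a sigmoid with binary cross-entropy loss), and the returned attention tensor can be rendered as a heat map of token importance per ICD code.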
ISSN: 1532-0464
EISSN: 1532-0480
DOI: 10.1016/j.jbi.2022.104050