Search Results - "Strötgen, Jannik"
1. Multilingual and cross-domain temporal tagging
Published in Language Resources and Evaluation (01-06-2013)
“…Extraction and normalization of temporal expressions from documents are important steps towards deep text understanding and a prerequisite for many NLP tasks…”
Journal Article
2. CLIN-X: pre-trained language models and a study on cross-task transfer for concept extraction in the clinical domain
Published in Bioinformatics (13-06-2022)
“…Motivation: The field of natural language processing (NLP) has recently seen a large change toward using pre-trained language models for solving almost…”
Journal Article
3. Time and information retrieval: Introduction to the special issue
Published in Information Processing & Management (01-11-2015)
Journal Article
4. Enriched Attention for Robust Relation Extraction
Published 22-04-2021
“…The performance of relation extraction models has increased considerably with the rise of neural networks. However, a key issue of neural relation extraction…”
Journal Article
5. Discourse-Aware In-Context Learning for Temporal Expression Normalization
Published 11-04-2024
“…Temporal expression (TE) normalization is a well-studied problem. However, the predominately used rule-based systems are highly restricted to specific…”
Journal Article
6. TADA: Efficient Task-Agnostic Domain Adaptation for Transformers
Published 22-05-2023
“…Intermediate training of pre-trained transformer-based language models on domain-specific data leads to substantial gains for downstream tasks. To increase…”
Journal Article
7. Better Call SAUL: Fluent and Consistent Language Model Editing with Generation Regularization
Published 03-10-2024
“…To ensure large language models contain up-to-date knowledge, they need to be updated regularly. However, model editing is challenging as it might also affect…”
Journal Article
8. Learn it or Leave it: Module Composition and Pruning for Continual Learning
Published 26-06-2024
“…In real-world environments, continual learning is essential for machine learning models, as they need to acquire new knowledge incrementally without forgetting…”
Journal Article
9. Rehearsal-Free Modular and Compositional Continual Learning for Language Models
Published 31-03-2024
“…Continual learning aims at incrementally acquiring new knowledge while not forgetting existing knowledge. To overcome catastrophic forgetting, methods are…”
Journal Article
10. GradSim: Gradient-Based Language Grouping for Effective Multilingual Training
Published 23-10-2023
“…Most languages of the world pose low-resource challenges to natural language processing models. With multilingual training, knowledge can be shared among…”
Journal Article
11. Boosting Transformers for Job Expression Extraction and Classification in a Low-Resource Setting
Published 17-09-2021
“…In this paper, we explore possible improvements of transformer models in a low-resource setting. In particular, we present our approaches to tackle the first…”
Journal Article
12. NLNDE at SemEval-2023 Task 12: Adaptive Pretraining and Source Language Selection for Low-Resource Multilingual Sentiment Analysis
Published 28-04-2023
“…This paper describes our system developed for the SemEval-2023 Task 12 "Sentiment Analysis for Low-resource African Languages using Twitter Dataset". Sentiment…”
Journal Article
13. Multilingual Normalization of Temporal Expressions with Masked Language Models
Published 20-05-2022
“…The detection and normalization of temporal expressions is an important task and preprocessing step for many applications. However, prior work on normalization…”
Journal Article
14. CLIN-X: pre-trained language models and a study on cross-task transfer for concept extraction in the clinical domain
Published 20-05-2022
“…The field of natural language processing (NLP) has recently seen a large change towards using pre-trained language models for solving almost any task. Despite…”
Journal Article
15. NLNDE: The Neither-Language-Nor-Domain-Experts' Way of Spanish Medical Document De-Identification
Published 02-07-2020
“…Natural language processing has huge potential in the medical domain which recently led to a lot of research in this field. However, a prerequisite of secure…”
Journal Article
16. NLNDE: Enhancing Neural Sequence Taggers with Attention and Noisy Channel for Robust Pharmacological Entity Detection
Published 02-07-2020
“…Named entity recognition has been extensively studied on English news texts. However, the transfer to other domains and languages is still a challenging…”
Journal Article
17. Closing the Gap: Joint De-Identification and Concept Extraction in the Clinical Domain
Published 19-05-2020
“…Exploiting natural language processing in the clinical domain requires de-identification, i.e., anonymization of personal information in texts. However,…”
Journal Article
18. On the Choice of Auxiliary Languages for Improved Sequence Tagging
Published 19-05-2020
“…Recent work showed that embeddings from related languages can improve the performance of sequence tagging, even for monolingual models. In this analysis paper,…”
Journal Article
19. To Share or not to Share: Predicting Sets of Sources for Model Transfer Learning
Published 16-04-2021
“…In low-resource settings, model transfer can help to overcome a lack of labeled data for many tasks and domains. However, predicting useful transfer sources is…”
Journal Article
20. FAME: Feature-Based Adversarial Meta-Embeddings for Robust Input Representations
Published 23-10-2020
“…Combining several embeddings typically improves performance in downstream tasks as different embeddings encode different information. It has been shown that…”
Journal Article