Living Lab Evaluation for Life and Social Sciences Search Platforms -- LiLAS at CLEF 2021
Format: Journal Article
Language: English
Published: 05-10-2023
Summary: Meta-evaluation studies of system performance in controlled offline evaluation campaigns, like TREC and CLEF, show a need for innovation in evaluating IR systems. The field of academic search is no exception. This might be related to the fact that relevance in academic search is multilayered, and therefore the aspect of user-centric evaluation is becoming more and more important. The Living Labs for Academic Search (LiLAS) lab aims to strengthen the concept of user-centric living labs for the domain of academic search by allowing participants to evaluate their retrieval approaches in two real-world academic search systems from the life sciences and the social sciences. To this end, we provide participants with metadata on the systems' content as well as candidate lists, with the task of ranking the most relevant candidates to the top. Using the STELLA infrastructure, participants can easily integrate their approaches into the real-world systems, and different approaches can be compared at the same time.
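The ranking task mentioned in the summary can be illustrated with a minimal sketch: given a query and a candidate list, produce the candidate ids ordered by estimated relevance. The field names and the lexical-overlap scoring below are hypothetical illustrations only, not part of the actual LiLAS or STELLA API.

```python
# Hypothetical sketch of a pre-computed ranking submission: score each
# candidate against the query text and return ids from most to least relevant.
# Field names ("id", "title", "abstract") and the overlap heuristic are
# assumptions for illustration, not the LiLAS/STELLA data format.

def overlap_score(query_terms: set, candidate_text: str) -> float:
    """Fraction of candidate terms that also occur in the query."""
    candidate_terms = set(candidate_text.lower().split())
    if not candidate_terms:
        return 0.0
    return len(query_terms & candidate_terms) / len(candidate_terms)

def rank_candidates(query_text: str, candidates: list) -> list:
    """Return candidate ids ordered from most to least relevant."""
    query_terms = set(query_text.lower().split())
    scored = [
        (overlap_score(query_terms, c["title"] + " " + c.get("abstract", "")), c["id"])
        for c in candidates
    ]
    return [doc_id for _, doc_id in sorted(scored, reverse=True)]

if __name__ == "__main__":
    candidates = [
        {"id": "doc1", "title": "Living labs for academic search evaluation"},
        {"id": "doc2", "title": "Protein folding in the life sciences"},
    ]
    print(rank_candidates("living labs evaluation", candidates))
```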
DOI: 10.48550/arxiv.2310.03859