STELLA - Infrastructures for Living Labs

Profile & Description

The STELLA project aims to create an evaluation infrastructure for evaluating search and recommendation services within productive web-based search systems with real users. STELLA provides an integrated e-Research environment that allows researchers in the fields of information retrieval and recommendation to conduct studies with real users in real environments. These experimental set-ups differ considerably from classical TREC studies, which are carried out offline, and from user studies, which are limited to laboratory experiments. They thus give researchers access to an evaluation method that was previously reserved for industrial research and the operators of large online platforms.
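A common mechanism behind such online experiments is interleaving: the result list of an experimental system is merged with that of the production baseline, and user clicks are credited to the system that contributed the clicked document. The following Python sketch illustrates this idea with team-draft interleaving and click-based credit. It is a minimal illustration only; the function names, parameters and document identifiers are hypothetical and not part of the STELLA specification.

import random

def team_draft_interleave(run_a, run_b, k=10):
    """Merge two rankings into one result list (team-draft interleaving) and
    remember which system ("A" or "B") contributed each document."""
    interleaved, team = [], {}
    candidates = set(run_a) | set(run_b)
    while len(interleaved) < k and candidates - set(interleaved):
        # In each round a coin flip decides which system places its document first.
        order = ["A", "B"] if random.random() < 0.5 else ["B", "A"]
        for system in order:
            run = run_a if system == "A" else run_b
            doc = next((d for d in run if d not in team), None)
            if doc is not None and len(interleaved) < k:
                interleaved.append(doc)
                team[doc] = system
    return interleaved, team

def credit_clicks(clicked, team):
    """Credit every click to the system that contributed the clicked document."""
    wins = {"A": 0, "B": 0}
    for doc in clicked:
        if doc in team:
            wins[team[doc]] += 1
    return wins

# Hypothetical rankings from a production baseline and an experimental system.
baseline = ["d1", "d2", "d3", "d4", "d5"]
experimental = ["d3", "d6", "d1", "d7"]
ranking, team = team_draft_interleave(baseline, experimental, k=6)
clicks = ["d3", "d6"]  # clicks observed in a real user session
print(ranking, credit_clicks(clicks, team))

In a living lab, such click credits are aggregated over many user sessions before deciding whether the experimental system outperforms the baseline.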

Project
STELLA - Infrastructures for Living Labs

Duration
2018 - 2021
Funded by
Further Reading: General & Publications

If you are interested in background knowledge and would like to learn more about the STELLA - Infrastructures for Living Labs project and its progress, feel free to study the following additional information.

General
Project Website
Get all available information on the STELLA project.
Publications

2020

How to Measure the Reproducibility of System-oriented IR Experiments.
In: J. Huang, Y. Chang, X. Cheng, J. Kamps, V. Murdock, J.-R. Wen and Y. Liu, editors, SIGIR, pages 349-358. ACM, 2020.
Timo Breuer, Nicola Ferro, Norbert Fuhr, Maria Maistro, Tetsuya Sakai, Philipp Schaer and Ian Soboroff.
Relations Between Relevance Assessments, Bibliometrics and Altmetrics.
In: Proceedings of the 10th International Workshop on Bibliometric-enhanced Information Retrieval co-located with 42nd European Conference on Information Retrieval, BIR@ECIR 2020, Lisbon, Portugal, April 14th, 2020 [online only], pages 101-112. 2020.
Timo Breuer, Philipp Schaer and Dirk Tunger.
Reproducible Online Search Experiments.
In: Advances in Information Retrieval - 42nd European Conference on IR Research, ECIR 2020, Lisbon, Portugal, April 14-17, 2020, Proceedings, Part II, pages 597-601. 2020.
Timo Breuer.
Editorial.
Datenbank-Spektrum, 20(1):1-3, 2020.
Philipp Schaer, Klaus Berberich and Theo Härder.
Living Labs for Academic Search at CLEF 2020.
In: J. M. Jose, E. Yilmaz, J. Magalhães, P. Castells, N. Ferro, M. J. Silva and F. Martins, editors, Advances in Information Retrieval, pages 580-586. Springer International Publishing, Cham, 2020.
Philipp Schaer, Johann Schaible and Bernd Müller.
Evaluation Infrastructures for Academic Shared Tasks.
Datenbank-Spektrum, 20(1):29-36, 2020.
Johann Schaible, Timo Breuer, Narges Tavakolpoursaleh, Bernd Müller, Benjamin Wolff and Philipp Schaer.

2019

Dockerizing Automatic Routing Runs for The Open-Source IR Replicability Challenge (OSIRRC 2019).
In: R. Clancy, N. Ferro, C. Hauff, J. Lin, T. Sakai and Z. Z. Wu, editors, OSIRRC@SIGIR, volume 2409, series CEUR Workshop Proceedings, pages 31-35. CEUR-WS.org, 2019.
Timo Breuer and Philipp Schaer.
Replicability and Reproducibility of Automatic Routing Runs.
In: L. Cappellato, N. Ferro, D. E. Losada and H. Müller, editors, CLEF (Working Notes), volume 2380, series CEUR Workshop Proceedings. CEUR-WS.org, 2019.
Timo Breuer and Philipp Schaer.
STELLA: Towards a Framework for the Reproducibility of Online Search Experiments.
In: R. Clancy, N. Ferro, C. Hauff, J. Lin, T. Sakai and Z. Z. Wu, editors, OSIRRC@SIGIR, volume 2409, series CEUR Workshop Proceedings, pages 8-11. CEUR-WS.org, 2019.
Timo Breuer, Philipp Schaer, Narges Tavakolpoursaleh, Johann Schaible, Benjamin Wolff and Bernd Müller.