Measuring and Facilitating Data Repeatability in Web Science

Document Type

Text/Journal Article

Additional Information

Date

2019

Journal Title

Datenbank-Spektrum

Journal ISSN

1610-1995

Publisher

Springer

Abstract

Accessible and reusable datasets are a necessity for repeatable research. This requirement poses a problem particularly for web science, since scraped data comes in various formats and can change due to the dynamic character of the web. Further, the use of web data is typically restricted by copyright protection or privacy regulations, which hinder the publication of datasets. To alleviate these problems and reach what we define as "partial data repeatability", we present a process that consists of multiple components. Researchers need to distribute only a scraper, not the data itself, to comply with legal limitations. If a dataset is re-scraped for repeatability after some time, the integrity of the different versions can be checked based on fingerprints. Moreover, fingerprints are sufficient to identify which parts of the data have changed and by how much. We evaluate an implementation of this process with a dataset of 250 million online comments collected from five different news discussion platforms. We re-scraped the dataset after one year and show that less than ten percent of the data had actually changed. These experiments demonstrate that providing a scraper and fingerprints enables recreating a dataset and supports the repeatability of web science experiments.
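The abstract does not specify how the fingerprints are constructed; as a minimal sketch of the idea, one could hash each scraped item (here keyed by a hypothetical comment ID) and compare the fingerprint maps of the original scrape and the re-scrape to classify items as added, removed, changed, or unchanged:

```python
import hashlib

def fingerprint(text: str) -> str:
    """Return a SHA-256 fingerprint of one scraped item."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def compare(old: dict, new: dict) -> dict:
    """Compare two fingerprint maps {item_id: fingerprint} and
    classify every item ID into one of four categories."""
    old_ids, new_ids = set(old), set(new)
    common = old_ids & new_ids
    changed = {i for i in common if old[i] != new[i]}
    return {
        "removed": old_ids - new_ids,   # present only in the original scrape
        "added": new_ids - old_ids,     # present only in the re-scrape
        "changed": changed,             # same ID, different fingerprint
        "unchanged": common - changed,  # identical in both scrapes
    }

# Hypothetical example: fingerprints from an original scrape (v1)
# and a re-scrape one year later (v2).
v1 = {"c1": fingerprint("Great article!"),
      "c2": fingerprint("I disagree.")}
v2 = {"c1": fingerprint("Great article!"),
      "c2": fingerprint("[deleted]"),
      "c3": fingerprint("New comment")}

diff = compare(v1, v2)
```

Note that only the fingerprints, not the comment texts, need to be published alongside the scraper, which is what allows integrity checking without distributing copyright- or privacy-restricted content.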

Description

Risch, Julian; Krestel, Ralf (2019): Measuring and Facilitating Data Repeatability in Web Science. Datenbank-Spektrum: Vol. 19, No. 2. DOI: 10.1007/s13222-019-00316-9. Springer. PISSN: 1610-1995. pp. 117-126
