SOTAB: The WDC Schema.org Table Annotation Benchmark
Korini, Keti; Peeters, Ralph; Bizer, Christian
Further URL: https://ceur-ws.org/Vol-3320/paper1.pdf
URN: urn:nbn:de:bsz:180-madoc-638689
|
Dokumenttyp:
|
Konferenzveröffentlichung
|
Erscheinungsjahr:
|
2022
|
Buchtitel:
|
SemTab 2022 : Proceedings of the Semantic Web Challenge on Tabular Data to Knowledge Graph Matching, co-located with the 21st International semantic Web Conference (ISWC 2022), virtual conference, October 23-27, 2022
|
Titel einer Zeitschrift oder einer Reihe:
|
CEUR Workshop Proceedings
|
Band/Volume:
|
3320
|
Seitenbereich:
|
14-19
|
Veranstaltungstitel:
|
SemTab 2022, Semantic Web Challenge on Tabular Data to Knowledge Graph Matching
|
Veranstaltungsort:
|
Online
|
Veranstaltungsdatum:
|
23.-27.10.2022
|
Herausgeber:
|
Efthymiou, Vasilis
;
Jiménez-Ruiz, Ernesto
;
Chen, Jiaoyan
;
Cutrona, Vincenzo
;
Hassanzadeh, Oktie
;
Sequeda, Juan
;
Srinivas, Kavitha
;
Abdelmageed, Nora
;
Hulsebos, Madelon
|
Ort der Veröffentlichung:
|
Aachen, Germany
|
Verlag:
|
RWTH Aachen
|
ISSN:
|
1613-0073
|
Verwandte URLs:
|
|
Publication language: English
Institution: School of Business Informatics and Mathematics > Information Systems V: Web-based Systems (Bizer 2012-)
Pre-existing license: Creative Commons Attribution 4.0 International (CC BY 4.0)
Subject area: 004 Computer Science
Keywords (English): table annotation, column type annotation, column property annotation, schema.org
Abstract: Understanding the semantics of table elements is a prerequisite for many data integration and data discovery tasks. Table annotation is the task of labeling table elements with terms from a given vocabulary. This paper presents the WDC Schema.org Table Annotation Benchmark (SOTAB) for comparing the performance of table annotation systems. SOTAB covers the column type annotation (CTA) and column property annotation (CPA) tasks. SOTAB provides ∼50,000 annotated tables for each of the tasks, containing Schema.org data from different websites. The tables cover 17 different types of entities such as movie, event, local business, recipe, job posting, or product. The tables stem from the WDC Schema.org Table Corpus, which was created by extracting Schema.org annotations from the Common Crawl. Consequently, the labels used for annotating columns in SOTAB are part of the Schema.org vocabulary. The benchmark covers 91 types for CTA and 176 properties for CPA, distributed across textual, numerical, and date/time columns. The tables are split into fixed training, validation, and test sets. The test sets are further divided into subsets focusing on specific challenges, such as columns with missing values or different value formats, in order to allow a more fine-grained comparison of annotation systems. The evaluation of SOTAB using Doduo and TURL shows that the benchmark is difficult to solve for current state-of-the-art systems.
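To make the two annotation tasks concrete, the following is a minimal sketch of a CTA/CPA example in the spirit of the abstract. The column names, sample values, and the dictionary representation of the gold labels are assumptions for illustration only and do not reflect SOTAB's actual file layout.

```python
# Hypothetical illustration of the CTA and CPA tasks described in the abstract.
# Table contents, column names, and label representation are assumed, not taken
# from the actual SOTAB distribution.
import pandas as pd

# A small table as it might be extracted from a LocalBusiness web page.
table = pd.DataFrame({
    "col_0": ["Luigi's Trattoria", "Green Leaf Cafe"],        # business names
    "col_1": ["+1 415 555 0100", "+1 415 555 0199"],          # phone numbers
    "col_2": ["Mon-Fri 11:00-22:00", "Sat-Sun 09:00-18:00"],  # opening hours
})

# Column type annotation (CTA): label each column with a Schema.org term.
cta_gold = {
    "col_0": "schema:name",
    "col_1": "schema:telephone",
    "col_2": "schema:openingHours",
}

# Column property annotation (CPA): label the relation between the subject
# column and each other column with a Schema.org property.
cpa_gold = {
    ("col_0", "col_1"): "schema:telephone",
    ("col_0", "col_2"): "schema:openingHours",
}

# An annotation system would predict labels like these; evaluation compares
# predictions against the gold labels.
for col, label in cta_gold.items():
    print(f"{col}: {label}  (sample value: {table[col].iloc[0]!r})")
```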
This entry is part of the university bibliography.
The document is provided by the publication server of the University of Mannheim Library.