A smart data annotation tool for multi-sensor activity recognition


Diete, Alexander; Sztyler, Timo; Stuckenschmidt, Heiner



DOI: https://doi.org/10.1109/PERCOMW.2017.7917542
URL: http://ieeexplore.ieee.org/document/7917542/
Additional URL: http://publications.wim.uni-mannheim.de/informatik...
Document Type: Conference or workshop publication
Year of publication: 2017
Book title: 2017 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops)
Page range: 111-116
Conference title: ARDUOUS '17: 1st International Workshop on Annotation of useR Data for UbiquitOUs Systems, affiliated with IEEE PerCom 2017
Location of the conference venue: Kona, Big Island, HI
Date of the conference: March 17, 2017
Editor: Yordanova, Kristina
Place of publication: Piscataway, NJ
Publishing house: IEEE Computer Society
ISBN: 978-1-5090-4339-2, 978-1-5090-4338-5
Publication language: English
Institution: School of Business Informatics and Mathematics > Practical Computer Science II: Artificial Intelligence (Stuckenschmidt 2009-)
Subject: 004 Computer science, internet
Abstract: Annotating multimodal data sets is often a time-consuming and challenging task, as many approaches require accurate labeling. This applies in particular to video recordings, where frame-exact labels are often required. For this purpose, we created an annotation tool that enables the annotation of data sets consisting of video and inertial sensor data. In contrast to most existing approaches, however, we focus on semi-supervised labeling support by utilizing different sensor types to infer labels for the whole data set. Hence, after a small set of instances has been labeled, our system is able to provide labeling recommendations, which in turn makes learning image features more feasible by vastly speeding up the labeling time for single frames. We rely on the inertial sensors of a wristband to support the labeling of the video recordings. To this end, we apply template matching in the context of dynamic time warping to identify the time intervals of certain actions. To investigate the feasibility of our approach, we focus on a real-world scenario: we gathered a data set describing an order-picking scenario at a logistics company. In this context, we focus mainly on the picking process, as selecting the correct items can be prone to errors. Preliminary results show that we are able to identify 69% of the time intervals containing grabbing motions.
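
To illustrate the kind of matching the abstract describes, the following is a minimal sketch of DTW-based template matching on a one-dimensional inertial signal. It is not the paper's implementation: the function names, the sliding-window step, and the distance threshold are illustrative assumptions, and a real pipeline would operate on multi-axis accelerometer data from the wristband with a tuned threshold.

    # Sketch of template matching via dynamic time warping (DTW).
    # Assumptions: a 1-D signal, a labeled template segment, and an
    # ad-hoc distance threshold; all names here are hypothetical.
    import numpy as np

    def dtw_distance(a, b):
        """Classic DTW distance between two 1-D sequences."""
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                # Extend the cheapest of the three predecessor paths.
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    def find_action_intervals(signal, template, threshold, step=10):
        """Slide a template-sized window over the sensor stream and
        report (start, end) intervals whose DTW distance to the
        labeled template falls below the threshold."""
        w = len(template)
        hits = []
        for start in range(0, len(signal) - w + 1, step):
            if dtw_distance(signal[start:start + w], template) < threshold:
                hits.append((start, start + w))
        return hits

Flagged intervals would then be surfaced as labeling recommendations for the corresponding video frames, so the annotator confirms or rejects candidate segments instead of scanning the whole recording frame by frame.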




This entry is part of the university bibliography.



