This challenge focuses on the problem of annotating everyday activities (cooking) captured with cameras and sensors. For this purpose we are using the CMU Kitchen Dataset. The challenge has two strands:

  • Manual annotation: demonstrate your manual annotation software or toolset on the subset of this dataset described in Section 1.1 of the challenge instructions. We will compare performance using several metrics, including inter-rater reliability (percentage agreement, Cohen’s Kappa, intraclass correlation, Krippendorff’s Alpha, Pearson’s correlation coefficient, Spearman’s Rho), the time taken for the annotation process, and the learning curve (see the agreement sketch after this list).
  • Semi-supervised or automated annotation: build a solution, or demonstrate your existing solution, for annotating the data from this dataset using any of the data types listed in Section 1.1 of the challenge instructions. We will compare performance using standard machine learning evaluation metrics (accuracy, precision, F1-score; see the scoring sketch after this list). We will also ask you to indicate the speed and efficiency of the method used and the resources required to run it.
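
The agreement metrics listed for the manual strand can all be computed with standard open-source tooling. The following Python sketch (using scikit-learn, SciPy and the krippendorff package) shows one possible way to compare two annotators' label sequences once they have been aligned on a common time base; the function and variable names are illustrative and are not part of the challenge materials.

    import numpy as np
    import krippendorff                      # pip install krippendorff
    from scipy.stats import pearsonr, spearmanr
    from sklearn.metrics import cohen_kappa_score

    def agreement_report(rater_a, rater_b, label_set):
        """rater_a, rater_b: equal-length sequences of categorical labels,
        already aligned on a common time base; label_set: all possible labels."""
        codes = {lab: i for i, lab in enumerate(label_set)}
        a = np.array([codes[x] for x in rater_a])
        b = np.array([codes[x] for x in rater_b])

        # Percentage agreement: fraction of aligned frames with identical labels.
        pct = float(np.mean(a == b))

        # Cohen's Kappa corrects raw agreement for chance agreement.
        kappa = cohen_kappa_score(a, b)

        # Krippendorff's Alpha over the two raters, treating labels as nominal.
        alpha = krippendorff.alpha(reliability_data=np.vstack([a, b]),
                                   level_of_measurement="nominal")

        # Pearson's r and Spearman's Rho are only meaningful once labels are
        # mapped to a numeric scale; here we simply use the integer codes.
        # Intraclass correlation could be added analogously
        # (e.g. with pingouin.intraclass_corr).
        r, _ = pearsonr(a, b)
        rho, _ = spearmanr(a, b)

        return {"percent_agreement": pct, "cohen_kappa": kappa,
                "krippendorff_alpha": alpha, "pearson_r": r, "spearman_rho": rho}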
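
For the semi-supervised/automated strand, the listed evaluation metrics correspond to the standard scikit-learn implementations. The sketch below is intended for orientation only and is not the official scoring script; it assumes frame-level predicted labels aligned with the provided ground-truth annotation, and all names are illustrative.

    from sklearn.metrics import accuracy_score, precision_score, f1_score

    def score_predictions(y_true, y_pred):
        """y_true: ground-truth labels; y_pred: predictions on the same time base."""
        return {
            "accuracy": accuracy_score(y_true, y_pred),
            # Macro averaging weights every activity class equally, which matters
            # because some cooking actions occur far more often than others.
            "precision": precision_score(y_true, y_pred, average="macro",
                                         zero_division=0),
            "f1": f1_score(y_true, y_pred, average="macro", zero_division=0),
        }

Macro averaging is shown here only as one reasonable default that weights every activity class equally; the authoritative evaluation details are those given in the challenge instructions.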

For both tasks we provide our own data dictionary for you to use, which is more detailed than the existing CMU annotation and has been validated against a series of metrics and against a causal model.

The detailed description of the challenge and the instructions can be downloaded from here.

Dataset description

The CMU Kitchen Dataset is a subset of the CMU Multi-Modal Activity Database (CMU-MMAC), which ‘contains multimodal measures of the human activity of subjects performing the tasks involved in cooking and food preparation. The CMU-MMAC database was collected in Carnegie Mellon’s Motion Capture Lab. A kitchen was built and to date twenty-five subjects have been recorded cooking five different recipes: brownies, pizza, sandwich, salad, and scrambled eggs.’

For this challenge, we ask you not to use the audio feed, the motion capture data, or the BodyMedia device data. You may use the video, the eWatch wearable data, the IMU data, and the RFID data.

A detailed description of the available data sources, their resolution, and their format can be found here.

Data Sources

CMU Kitchen dataset

Annotation provided by the University of Rostock

The annotation schema used in the provided annotation

A startup script that downloads the relevant parts of the dataset

Outcomes

A small prize will be offered to the winner in each category. Selected participants will be invited to present their results at the next ARDUOUS workshop. The most highly scored entry overall across both strands will also be granted a fee waiver to publish their results in the MDPI journal Sensors.

Deadlines

The submission deadline is 15th November 2021.

Submission Instructions

Submit the results and artefacts of the challenge [here], using the format described in the challenge instructions. After submitting your results and artefacts, please send us an email with your contact details. See “Contact” for our email addresses.

Contact

For any questions, contact Kristina Yordanova (kristina.yordanova@uni-rostock.de), Emma Tonkin (e.l.tonkin@bristol.ac.uk), Teodor Stoev (teodor.stoev@uni-rostock.de) or Greg Tourte (g.j.l.tourte@bristol.ac.uk).