2018

This page describes how to enter the challenge and will be used to capture the results.

Note: This page has been updated with a new challenge category and more detailed submission instructions.

Context

We are pleased to announce the next iteration of the Runtime Verification Challenge. From 2014 to 2016 this ran as a competition comparing RV tools, principally on runtime performance, in three tracks based on the programming language of the application. Following a hiatus in 2017 (when the RV-CuBES workshop was held instead), we are announcing a more foundational RV Challenge for 2018. Modern software and cyber-physical systems require runtime verification, yet the burgeoning collection of RV technologies remains comparatively untested due to a dearth of benchmarks for objectively and comparatively evaluating their performance. This is not for lack of effort; it is due to a glaring gap in our understanding of what such benchmarks would need to look like, and of what exactly we need to measure. Therefore, in 2018 we will host an RV Benchmark Competition to take the first major steps towards filling this gap.

Challenge Categories

The challenge will consist of two categories:

  • The MTL category. This sets a single format for traces and specifications and aims to allow objective comparison of most RV tools.
  • The Open category. This is a very flexible category that accommodates benchmarks of many different forms and allows benchmarks to be submitted with minimal work.

Submissions to either category must provide a benchmark package consisting of trace data, specification information, and an oracle (i.e. what the result should be), as well as a brief overview paper describing the submission.

For full details of the submission structure and how to submit, please refer to the rules document.

Evaluation

As explained in the rules document, submissions will be judged by an independent panel of experts and awards will be made in a number of categories. Evaluation will take place during the 18th International Conference on Runtime Verification.

Timeline
Note that the timeline has been significantly revised with respect to the original timeline.

Initial submissions must be made by 31st October, and final (fixed) versions must be submitted by 5th November.

All submitters will be invited to present during a special session at the 18th International Conference on Runtime Verification (Nov 11-13). However, attendance at this event is not a requirement for taking part in the challenge.

Results

The results of the challenge will be announced here.

Post-Proceedings

We plan a post-proceedings volume or journal paper summarising the results of the challenge, and all participants will be invited to contribute.

Organisers

Giles Reger, University of Manchester (Chair)
Kristin Yvonne Rozier, Iowa State University
Volker Stolz, Western Norway University of Applied Sciences

Contact Giles for queries relating to rules and submission.