The Video Browser Showdown (VBS) is an international video content search competition that evaluates the state of the art of interactive video retrieval systems. It has been held annually as a special event at the International Conference on MultiMedia Modeling (MMM) since 2012. It aims at pushing research on large-scale video retrieval systems that are effective, fast, and easy to use for content search scenarios that are truly relevant in practice (e.g., known-item search in an ever-increasing video archive, as is nowadays ubiquitous in many domains of our digital world).

The VBS consists of an expert session and a novice session. In the expert session, the developers of the systems themselves (i.e., the researchers) try to solve different types of content search queries (KIS and AVS) that are issued in an ad-hoc manner for a moderately large dataset (e.g., 600 – 1000 hours of content; VBS2018 used the IACC.3 dataset in collaboration with TRECVID). Although the dataset itself is available to the researchers several months before the actual competition, the queries are unknown in advance and issued on-site. In the novice session, volunteers from the MMM conference audience (without help from the experts) are required to solve another set of tasks. This should ensure that the interactive video retrieval tools improve not only in terms of retrieval performance but also in terms of usability (i.e., ease of use).

There are different types of queries:

  • Known-Item Search (KIS): a single video clip (20 seconds long) is randomly selected from the dataset and presented visually via projector on-site. The participants need to find exactly the single instance presented. Another variation of this task is textual KIS, where instead of a visual presentation, the searched segment is described only by text given by the moderator (and presented as text via the projector).
  • Ad-hoc Video Search (AVS): here, a rather general description covering many shots is presented by the moderator (e.g., “Find all shots showing cars in front of trees”) and the participants need to find as many correct examples (instances) matching the description as possible.

Each query has a time limit (e.g., 5-7 minutes) and is rewarded on success with a score that depends on several factors: the required search time, the number of false submissions (which are penalized), and the number of different instances found for AVS tasks. For the latter case it is also considered how many different ‘ranges’ were submitted for an AVS task. For example, many different but temporally close shots in the same video count much less than several different shots from different videos.
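The scoring scheme above can be illustrated with a small sketch. Note that the function names, the linear time decay, the fixed penalty per false submission, and the square-root weighting of per-video ranges are all illustrative assumptions, not the official VBS formula:

```python
def kis_score(elapsed_s, limit_s, wrong_submissions,
              max_score=100.0, wrong_penalty=10.0):
    """Sketch of a KIS score: decays linearly with search time and is
    reduced by a fixed penalty per false submission (illustrative values)."""
    time_factor = max(0.0, 1.0 - elapsed_s / limit_s)
    score = max_score * time_factor - wrong_penalty * wrong_submissions
    return max(0.0, score)


def avs_range_weight(shots_per_video):
    """Sketch of AVS weighting: submissions are grouped into per-video
    'ranges'; extra shots within the same video contribute with
    diminishing returns, so hits spread across different videos
    are worth more (hypothetical square-root weighting)."""
    return sum(count ** 0.5 for count in shots_per_video.values())


# A fast, correct KIS answer scores highest; time and mistakes reduce it.
print(kis_score(elapsed_s=60, limit_s=300, wrong_submissions=1))  # 70.0

# Four shots from one video weigh less than four shots from four videos.
print(avs_range_weight({"v1": 4}))                                # 2.0
print(avs_range_weight({"v1": 1, "v2": 1, "v3": 1, "v4": 1}))     # 4.0
```

The two example calls at the end mirror the range rule described above: the same number of correct AVS shots is worth twice as much when spread over four different videos.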


The next Video Browser Showdown (VBS2019) will be held at MMM 2019 on January 9, 2019 in Thessaloniki, Greece.

See the VBS2019 Call for Papers and the submission guidelines here.