This website is about the iMotion project, an Intelligent Multimodal Augmented Video Motion Retrieval System. Here you can find more information on the project, the partners, and the publications.
As in the previous two years, we participated in the Video Browser Showdown, which was held in January in Reykjavik, Iceland. For the 2017 edition of the competition, the dataset comprised over 4’000 videos with a total duration of over 600 hours, more than twice the size of the previous year’s dataset. The 2017 version of the IMOTION system, however, had no problems dealing with this large number of videos. We are happy to announce that we won this year’s competition!
Over the past two years, we have built several applications for content-based retrieval. Some parts of our software stack have reached a point at which we think they might find uses beyond our own research. This is why we decided to open-source our database back-end ADAM and our retrieval engine Cineast. While we will continue to develop these programs, we encourage everybody interested in (multimedia) retrieval to play with them, give feedback, or even contribute to the code base.
This year we entered not only the IMOTION system, an improved version of our previous contestant, but also a second system called iAutoMotion. The latter is a first stab at fully autonomous video retrieval: it requires almost no user interaction and instead collects all the information needed for querying from a webcam. The only thing the user has to do is tell it when to start and stop recording. Both systems used the same retrieval back-end, which handled the 250 hours of video, segmented into roughly 320’000 shots. We completed the competition with greatly increased room for improvement compared to last year, meaning some things did not quite work as planned. While the systems performed mostly as designed, some of the design choices, although they looked good during initial testing, turned out not to be beneficial in a competitive setting…
Our other contributions during the conference were more successful. Our entry in the special poster session on processing ambiguous multi-modal queries was well received overall, and our demo of Cineast – the retrieval logic of the IMOTION system stack – won the Best Demo Award. All of our contributions can be found in the Publications section.
We recently published a dataset we had been using internally for quite some time. The Open Short Video Collection, in its initial version, consists of 200 Creative Commons-licensed videos covering a wide range of content. We also released a technical report detailing the nature of the dataset, which is available for download.
Early this year, the iMotion team participated in the 4th Video Search Showcase with a first prototype of an integrated video retrieval system. We are happy to announce that we finished in 2nd place out of 9 teams, only 10 points behind the winning team. See the image below for more information on our system.