The digitization of large quantities of analogue data, together with the massive production of born-digital documents over many years, now provides large volumes of varied multimedia data (images, maps, text, video, multi-sensor data, etc.). An important feature of these data is that they are cross-domain: they may have been acquired under very different conditions, with different acquisition systems, at different times and from different points of view (e.g. a 1962 postcard of the Arc de Triomphe vs. a recent street-view acquisition of the same monument by mobile mapping). These data represent an extremely rich heritage that can be exploited in a wide variety of fields, from the social sciences and humanities (SSH) to land use and territorial policies, including smart cities, urban planning, tourism, creative media and entertainment.
In terms of computer science research, these data raise challenging problems related to the diversity and volume of media across time, the variety of content descriptors (potentially including the temporal dimension), the veracity of the data, and the different user needs for engaging with this rich material and extracting value from it. These challenges are reflected in research topics such as multimodal and mixed-media search, automatic content analysis, multimedia linking and recommendation, and big data analysis and visualisation, where scientific bottlenecks may be exacerbated by the time dimension, which also gives rise to topics of interest such as multimodal time series analysis.
The objective of the second edition of this workshop is to present and discuss the latest and most significant trends in the analysis, structuring and understanding of multimedia content dedicated to the valorization of heritage, with emphasis on unlocking and providing access to the big data of the past.
Valerie Gouet-Brunet  (IGN/LaSTIG, France)
Liming Chen   (Centrale Lyon/LIRIS, France)
Xu-Cheng Yin  (University of Science and Technology Beijing, China)
Ronak Kosti (Friedrich-Alexander-Universität Erlangen-Nürnberg, Germany)
Margarita Khokhlova (IGN/LaSTIG, Centrale Lyon/LIRIS, France)
To be finalized
The objective of this workshop is to present and discuss the latest and most significant trends in the analysis, structuring and understanding of multimedia content dedicated to the valorization of heritage, with emphasis on unlocking and providing access to the big data of the past. We welcome research contributions related to, but not limited to, the following topics:
● Multimedia and cross-domain data interlinking and recommendation
● Dating and geolocalization of historical data
● Mixed media data access and indexing
● Deep learning in adverse conditions (transfer learning, learning with side information, etc.)
● Multi-modal time series analysis, evolution modelling
● Multi-modal and multi-temporal data rendering
● HCI / interfaces for large-scale datasets
● Smart digitization of massive quantities of data
● Benchmarking, open data movement
Paper Submission: Monday 29 June → Thursday 30 July 2020
Author Notification: Monday 27 July → Wednesday 26 August 2020
Camera-Ready Submission: Friday 7 August → Wednesday 2 September 2020
Workshop Date: To be decided (12 or 16 October 2020)
All submissions must be original work not under review at any other workshop, conference, or journal. The workshop will accept papers describing completed work as well as work in progress. One submission format is accepted: full paper, which must follow the formatting guidelines of the main conference, ACM MM 2020. Full papers should be from 4 to 8 pages (plus additional pages for references), encoded as PDF and using the ACM Article Template.
Paper submissions must conform to the "double-blind" review policy. All papers will be peer-reviewed by experts in the field, and each will receive at least two reviews. Acceptance will be based on relevance to the workshop, scientific novelty, and technical quality. Depending on the number, maturity and topics of the accepted submissions, the work will be presented in oral or poster sessions. The workshop papers will be published in the ACM Digital Library.
To be published later
Any questions? Please contact us!