LOMM: Latest Object Memory Management for Temporally Consistent Video Instance Segmentation

¹DGIST, ²Stanford University

Key Mechanism

Demo

Abstract

In this paper, we introduce Latest Object Memory (LOM), a system for robustly tracking and continuously updating the latest states of objects by explicitly modeling their presence across video frames. LOM enables consistent tracking and accurate identity management across frames, enhancing both performance and reliability throughout the video segmentation process. Building upon LOM, we present Latest Object Memory Management (LOMM) for temporally consistent video instance segmentation, which significantly improves long-term instance tracking. Moreover, we introduce Decoupled Object Association (DOA), a strategy that handles newly appearing and already existing objects separately. By leveraging our memory system, DOA accurately assigns object indices, improving matching accuracy and ensuring stable identity consistency, even in dynamic scenes where objects frequently appear and disappear. Extensive experiments and ablation studies demonstrate the superiority of our method over prior approaches, setting a new state of the art in video instance segmentation. Notably, our LOMM achieves an AP score of 54.0 on YouTube-VIS 2022, a dataset known for its challenging long videos.

Method

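The abstract outlines two components: a Latest Object Memory that keeps the most recent state of each object, and a Decoupled Object Association that treats newly appearing and already existing objects separately. Below is a minimal sketch of how such a memory update and decoupled association could look. All names (`LatestObjectMemory`, `associate`, the similarity and presence thresholds, and the greedy matching) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of latest-object memory with decoupled association.
# Names and the greedy matching scheme are assumptions for illustration only.
import torch
import torch.nn.functional as F


class LatestObjectMemory:
    """Keeps the most recent embedding of every tracked object identity."""

    def __init__(self, sim_threshold: float = 0.5, presence_threshold: float = 0.5):
        self.sim_threshold = sim_threshold
        self.presence_threshold = presence_threshold
        self.embeddings: list[torch.Tensor] = []  # one latest embedding per identity

    def associate(self, frame_embeds: torch.Tensor, presence: torch.Tensor) -> list[int]:
        """Assign an identity index to each detection in the current frame.

        frame_embeds: (N, D) per-object embeddings from the current frame.
        presence:     (N,)   probability that each detection is a real object.
        Returns one identity index per detection (-1 if the detection is dropped).
        """
        ids = [-1] * frame_embeds.shape[0]
        unmatched = []

        # 1) Existing objects: match current detections to stored memory by
        #    cosine similarity (greedy assignment here for brevity).
        if self.embeddings:
            mem = torch.stack(self.embeddings)                                    # (M, D)
            sim = F.cosine_similarity(frame_embeds[:, None], mem[None], dim=-1)   # (N, M)
            for i in range(frame_embeds.shape[0]):
                j = int(sim[i].argmax())
                if sim[i, j] > self.sim_threshold:
                    ids[i] = j
                    sim[:, j] = -1.0  # each stored identity is used at most once
                else:
                    unmatched.append(i)
        else:
            unmatched = list(range(frame_embeds.shape[0]))

        # 2) Newly appearing objects: unmatched detections that are confidently
        #    present open a fresh identity slot.
        for i in unmatched:
            if presence[i] > self.presence_threshold:
                ids[i] = len(self.embeddings)
                self.embeddings.append(frame_embeds[i].detach())

        # 3) Latest-state update: overwrite the stored embedding of every matched
        #    identity with its current-frame embedding, so the memory always
        #    reflects each object's most recent appearance.
        for i, j in enumerate(ids):
            if 0 <= j < len(self.embeddings) and presence[i] > self.presence_threshold:
                self.embeddings[j] = frame_embeds[i].detach()
        return ids
```

In this sketch, already existing identities are resolved first against the stored memory, unmatched but confidently present detections open new identity slots, and each matched identity's stored embedding is overwritten with its current-frame state, mirroring the "latest state" idea described in the abstract. Per-frame use would be a single call such as `ids = memory.associate(embeds, presence)`.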

Performance


Comparison on the YTVIS19, YTVIS21, and YTVIS22 datasets.