A fast cube-based video shot retrieval using 3D moment-preserving technique

Author(s):  
Wei-Ka Huang ◽  
Chi-Han Chung ◽  
Shyi-Chyi Cheng ◽  
Jun-Wei Hsieh


Author(s):  
XIAN WU ◽  
JIANHUANG LAI ◽  
PONG C. YUEN

This paper proposes a novel approach for video-shot transition detection using spatio-temporal saliency. Temporal and spatial information are combined to generate a saliency map, and features are derived from the change in saliency. Considering the context of shot changes, a statistical detector is constructed that determines all types of shot transitions simultaneously, under the same framework, by minimizing the detection-error probability. Evaluation on videos of various content types demonstrates that the proposed approach outperforms a more recent method and two publicly available systems, namely VideoAnnex and VCM.
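As a rough illustration of this kind of pipeline (not the authors' detector), the sketch below combines a spectral-residual spatial saliency map with a temporal frame-difference map and flags frames whose combined saliency changes sharply between consecutive frames. The saliency model, the equal weighting of the two cues, and the fixed threshold are illustrative assumptions; the paper instead uses a statistical detector that minimizes the detection-error probability.

```python
# Minimal sketch of saliency-change-based shot transition detection.
# Assumes grayscale frames supplied as 2D NumPy uint8 arrays; the spectral-
# residual saliency, 0.5/0.5 weighting, and threshold are illustrative only.
import numpy as np

def spatial_saliency(frame):
    """Spectral-residual spatial saliency map for one grayscale frame."""
    f = np.fft.fft2(frame.astype(np.float64))
    log_amp = np.log1p(np.abs(f))
    phase = np.angle(f)
    # Spectral residual = log amplitude minus its local average (circular 3x3 mean).
    kernel = np.ones((3, 3)) / 9.0
    smoothed = np.real(np.fft.ifft2(np.fft.fft2(log_amp) * np.fft.fft2(kernel, log_amp.shape)))
    residual = log_amp - smoothed
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    return sal / (sal.max() + 1e-8)

def detect_transitions(frames, threshold=0.35):
    """Flag frame indices whose combined spatio-temporal saliency change exceeds a threshold."""
    boundaries = []
    prev_frame, prev_map = None, None
    for i, frame in enumerate(frames):
        s_map = spatial_saliency(frame)
        if prev_frame is not None:
            temporal = np.abs(frame.astype(np.float64) - prev_frame) / 255.0
            combined = 0.5 * s_map + 0.5 * temporal  # spatio-temporal saliency map
            change = np.mean(np.abs(combined - prev_map)) if prev_map is not None else 0.0
            if change > threshold:
                boundaries.append(i)
            prev_map = combined
        prev_frame = frame.astype(np.float64)
    return boundaries
```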


Author(s):  
Nandini H. M. ◽  
Chethan H. K. ◽  
Rashmi B. S.

Shot boundary detection (SBD) in videos is one of the most fundamental tasks in content-based video retrieval and analysis. To this end, an efficient approach to detecting abrupt and gradual transitions in videos is presented. The proposed method detects shot boundaries by extracting a block-based mean probability binary weight (MPBW) histogram from normalized Kirsch magnitude frames as an amalgamation of local and global features. Abrupt transitions are detected by utilizing the distance between consecutive MPBW histograms and employing an adaptive threshold. In a subsequent step, the coefficient of mean deviation and variance statistical measure is applied to the MPBW histograms to detect gradual transitions. Experiments were conducted on the TRECVID 2001 and 2007 datasets to analyse and validate the proposed method. Experimental results show significant improvement of the proposed SBD approach over some state-of-the-art algorithms in terms of recall, precision, and F1-score.
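The sketch below illustrates the overall shape of such a pipeline (Kirsch magnitude, block-based histograms, histogram distance, adaptive threshold) for abrupt cuts only. It does not reproduce the MPBW weighting itself; a plain per-block histogram of the normalized Kirsch magnitude stands in for it, and the mean-plus-k-sigma threshold rule is an illustrative assumption rather than the paper's adaptive threshold.

```python
# Illustrative abrupt-cut detector: Kirsch magnitude -> block histograms ->
# histogram distance -> adaptive threshold. The MPBW feature is approximated
# by a plain block-based histogram; thresholding rule is an assumption.
import numpy as np
import cv2

def kirsch_kernels():
    """Generate the eight 3x3 Kirsch compass kernels by rotating the border values."""
    border = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    values = np.array([5, 5, 5, -3, -3, -3, -3, -3], dtype=np.float32)
    kernels = []
    for shift in range(8):
        k = np.zeros((3, 3), dtype=np.float32)
        for (r, c), v in zip(border, np.roll(values, shift)):
            k[r, c] = v
        kernels.append(k)
    return kernels

def kirsch_magnitude(gray):
    """Maximum response over the eight Kirsch directions, normalized to [0, 1]."""
    responses = [cv2.filter2D(gray.astype(np.float32), -1, k) for k in kirsch_kernels()]
    mag = np.max(np.abs(np.stack(responses)), axis=0)
    return mag / (mag.max() + 1e-8)

def block_histogram(mag, blocks=4, bins=16):
    """Concatenate per-block histograms of the Kirsch magnitude (stand-in for MPBW)."""
    h, w = mag.shape
    feats = []
    for by in range(blocks):
        for bx in range(blocks):
            patch = mag[by * h // blocks:(by + 1) * h // blocks,
                        bx * w // blocks:(bx + 1) * w // blocks]
            hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0), density=True)
            feats.append(hist)
    return np.concatenate(feats)

def detect_abrupt(gray_frames, k=3.0):
    """Flag abrupt cuts where the consecutive histogram distance exceeds mean + k * std."""
    hists = [block_histogram(kirsch_magnitude(f)) for f in gray_frames]
    dists = np.array([np.abs(a - b).sum() for a, b in zip(hists, hists[1:])])
    threshold = dists.mean() + k * dists.std()  # simple adaptive threshold
    return [i + 1 for i, d in enumerate(dists) if d > threshold]
```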


2017 ◽  
Vol 17 (2) ◽  
pp. 245-261 ◽  
Author(s):  
Kathleen M Ryan

Popular culture has critiqued ‘vertical video syndrome’, or video shot on smartphones in portrait rather than landscape orientation, as aesthetically unpleasing and something to be avoided. Yet the design of smartphones seems to encourage shooting vertical video. This article examines the aesthetic desirability of vertical video through applied media aesthetics. It traces the history of horizontal film and television orientations, as well as the image-centric orientation model found in still photography. It argues that vertical video, rather than being a syndrome to avoid, takes advantage of the technological innovations and embodied pleasures offered by the smartphone to rupture established visual paradigms and create a new visual aesthetic for phone-based moving images.


2001 ◽  
Vol 01 (03) ◽  
pp. 507-526 ◽  
Author(s):  
TONG LIN ◽  
HONG-JIANG ZHANG ◽  
QING-YUN SHI

In this paper, we present a novel scheme for video content representation that exploits spatio-temporal information. A pseudo-object-based shot representation containing richer semantics is proposed to measure shot similarity, and a force-competition approach is proposed to group shots into scenes based on content coherence between shots. Two color-object content descriptors, Dominant Color Histograms (DCH) and Spatial Structure Histograms (SSH), are introduced. To represent temporal content variations, a shot is segmented into several subshots of coherent content, and the shot similarity measure is formulated in terms of subshot similarity, which serves shot retrieval. With this shot representation, scene structure can be extracted by analyzing the splitting and merging force competitions at each shot boundary. Experimental results on real-world sports videos show that the proposed approach to video shot retrieval achieves the best performance on average recall (AR) and average normalized modified retrieval rank (ANMRR), and experiments on MPEG-7 test videos show that the proposed scene extraction algorithm achieves promising results.
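The sketch below conveys the subshot-based similarity idea only: each shot is split greedily into content-coherent subshots, each subshot is described by a dominant-color descriptor (approximated here by keeping the top bins of a coarse hue histogram), and two shots are compared by best-matching their subshot descriptors. The segmentation rule, the number of dominant colors, and the matching scheme are illustrative assumptions, not the paper's DCH/SSH formulation or its force-competition scene grouping.

```python
# Illustrative subshot-based shot similarity. The dominant color histogram is
# approximated by a coarse hue histogram with only the top bins retained;
# drift threshold and matching scheme are assumptions for the sketch.
import numpy as np
import cv2

def dominant_color_histogram(frame_bgr, bins=8, top=4):
    """Coarse hue histogram keeping only the `top` dominant bins (DCH stand-in)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    hist, _ = np.histogram(hsv[..., 0], bins=bins, range=(0, 180))
    hist = hist.astype(np.float64) / max(hist.sum(), 1)
    mask = np.zeros_like(hist)
    mask[np.argsort(hist)[-top:]] = 1.0  # keep dominant bins only
    return hist * mask

def split_subshots(frames, max_drift=0.3):
    """Greedy segmentation of a shot into subshots of coherent color content."""
    subshots, current = [], [frames[0]]
    ref = dominant_color_histogram(frames[0])
    for f in frames[1:]:
        h = dominant_color_histogram(f)
        if np.abs(h - ref).sum() > max_drift:  # content drifted: start a new subshot
            subshots.append(current)
            current, ref = [f], h
        else:
            current.append(f)
    subshots.append(current)
    # Represent each subshot by the mean descriptor of its frames.
    return [np.mean([dominant_color_histogram(f) for f in s], axis=0) for s in subshots]

def shot_similarity(frames_a, frames_b):
    """Shot similarity as the average best match between subshot descriptors."""
    subs_a, subs_b = split_subshots(frames_a), split_subshots(frames_b)
    sims = [max(1.0 - 0.5 * np.abs(a - b).sum() for b in subs_b) for a in subs_a]
    return float(np.mean(sims))
```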


Author(s):  
Nikos Nikolaidis ◽  
Costas Cotsaces ◽  
Zuzana Cernekova ◽  
Ioannis Pitas
