Recognition and tracking of object positions in image sequences using multiple-viewpoint television broadcast images

2007 ◽  
Vol 38 (11) ◽  
pp. 64-79
Author(s):  
Kazunori Sakaki ◽  
Kenichiro Yokota ◽  
Keiji Ohno ◽  
Kohei Suzuki ◽  
Hiroshi Arisawa
Sensors ◽  
2021 ◽  
Vol 21 (11) ◽  
pp. 3722
Author(s):  
Byeongkeun Kang ◽  
Yeejin Lee

Motion in videos refers to the pattern of apparent movement of objects, surfaces, and edges across image sequences, caused by the relative motion between a camera and a scene. Motion, like scene appearance, is an essential feature for estimating a driver’s visual attention allocation in computer vision. However, although driver attention prediction models focusing on scene appearance have been well studied, the role of motion as a crucial factor in attention estimation has not been thoroughly examined in the literature. In this work, we therefore investigate the usefulness of motion information in estimating a driver’s visual attention. To analyze its effectiveness, we develop a deep neural network framework that predicts attention locations and attention levels from optical flow maps, which represent the movement of content in videos. We validate the proposed motion-based prediction model by comparing its performance to that of current state-of-the-art prediction models that use RGB frames. The experimental results on a real-world dataset confirm our hypothesis that motion contributes to prediction accuracy, and that there is a margin for further accuracy improvement through the use of motion features.
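The abstract above describes feeding optical flow maps, rather than RGB frames, into an attention model. As a minimal sketch of that input representation (the paper's actual preprocessing and architecture are not given here; the function name and normalisation choice are illustrative assumptions), a dense flow field of per-pixel displacements can be reduced to a motion-magnitude map:

```python
import numpy as np

def flow_to_magnitude_map(flow):
    """Convert a dense optical-flow field of shape (H, W, 2), holding
    per-pixel (dx, dy) displacements, into a single-channel motion-
    magnitude map normalised to [0, 1] -- one plausible input
    representation for a motion-based attention model.
    (Illustrative helper; not the authors' exact pipeline.)"""
    mag = np.sqrt(flow[..., 0] ** 2 + flow[..., 1] ** 2)
    peak = mag.max()
    return mag / peak if peak > 0 else mag

# Toy example: a 4x4 field in which only the top-left pixel moves.
flow = np.zeros((4, 4, 2), dtype=np.float32)
flow[0, 0] = (3.0, 4.0)            # displacement of 5 pixels
mag = flow_to_magnitude_map(flow)  # mag[0, 0] == 1.0, rest 0.0
```

In practice the flow field itself would come from a dense optical-flow estimator (e.g. a Farneback- or deep-learning-based method) applied to consecutive frames.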


2020 ◽  
Vol 6 (1) ◽  
Author(s):  
Jens Ziegle ◽  
Alfredo Illanes ◽  
Axel Boese ◽  
Michael Friebe

Abstract
During thermal ablation of a target tissue, information about temperature is crucial for decision making in a successful therapy. An observable temporal and spatial temperature propagation would give visual feedback on irreversible cell damage in the target tissue. Potential temperature features in ultrasound (US) B-Mode image sequences during radiofrequency (RF) ablation of ex-vivo porcine liver were found and analysed. These features could help to detect the transition between reversible and irreversible damage of the ablated target tissue. Experimental RF ablations of ex-vivo porcine liver were imaged with US B-Mode imaging, and image sequences were recorded. Temperature was simultaneously measured within the liver tissue around a bipolar RF needle electrode. In the B-Mode images, regions of interest (ROIs) around the centres of the measurement spots were analysed in post-processing by comparing the average gray-level (AVGL) against temperature. The pole of maximum energy in the time-frequency domain of the AVGL changes was investigated in relation to the measured temperatures. Frequency shifts of the pole were observed that could be related to transitions between the states of tissue damage.
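The AVGL feature described above is simply the mean pixel intensity inside an ROI, tracked over the frames of the B-Mode sequence. A minimal sketch (the function name, ROI convention, and array layout are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def avgl_series(frames, roi):
    """Average gray-level (AVGL) within a rectangular ROI for each
    frame of a B-Mode sequence.  `frames` has shape (T, H, W); `roi`
    is (row, col, height, width).  The resulting time series is the
    kind of signal compared against the measured temperature.
    (Illustrative helper; not the authors' exact implementation.)"""
    r, c, h, w = roi
    return frames[:, r:r + h, c:c + w].mean(axis=(1, 2))

# Toy sequence of three uniform 8x8 frames with rising brightness.
frames = np.stack([np.full((8, 8), v, dtype=np.float32)
                   for v in (10.0, 20.0, 30.0)])
series = avgl_series(frames, (2, 2, 4, 4))  # one AVGL value per frame
```

The time-frequency analysis in the paper would then be applied to the changes of this series, not shown here.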


2020 ◽  
Vol 6 (3) ◽  
pp. 501-504
Author(s):  
Dennis Schmidt ◽  
Andreas Rausch ◽  
Thomas Schanze

Abstract
The Institute of Virology at the Philipps-Universität Marburg is currently researching possible drugs to combat the Marburg virus. This involves classifying cell structures based on fluorescence microscopic image sequences. Conventionally, the membranes of cells must be marked for better analysis, which is time-consuming. In this work, an approach is presented to identify cell structures in images in which subviral particles are marked. It could be shown that there is a correlation between the distribution of subviral particles in an infected cell and the position of the cell’s structures. The segmentation is performed with a Mask R-CNN algorithm, as presented in this work. The model (a region-based convolutional neural network) is applied to enable robust and fast recognition of cell structures. Furthermore, the network architecture is described. The proposed method is tested on data evaluated by experts. The results show high potential and demonstrate that the method is suitable.
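Mask R-CNN produces one binary mask and one confidence score per detected instance. A common post-processing step, sketched below under stated assumptions (the function name, threshold value, and merging rule are illustrative, not from the paper), merges those per-instance masks into a single labelled segmentation image:

```python
import numpy as np

def masks_to_label_image(masks, scores, score_thresh=0.5):
    """Merge per-instance binary masks of shape (N, H, W), such as a
    Mask R-CNN head outputs, into one labelled image: 0 is background
    and every kept instance receives a unique positive id.  Instances
    below `score_thresh` are dropped; where masks overlap, the
    higher-scoring instance wins because it is drawn last.
    (Illustrative post-processing; not the authors' exact method.)"""
    _, h, w = masks.shape
    label_img = np.zeros((h, w), dtype=np.int32)
    next_id = 1
    for i in np.argsort(scores):          # low scores drawn first
        if scores[i] < score_thresh:
            continue
        label_img[masks[i].astype(bool)] = next_id
        label_img[masks[i].astype(bool)] = next_id
        next_id += 1
    return label_img

# Two overlapping toy instances on a 4x4 grid.
masks = np.zeros((2, 4, 4), dtype=np.uint8)
masks[0, 0:2, :] = 1                      # instance A, score 0.9
masks[1, 1:3, :] = 1                      # instance B, score 0.6
labels = masks_to_label_image(masks, np.array([0.9, 0.6]))
```

In the overlap row the higher-scoring instance A overwrites B, so the final image contains two distinct labels plus background.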

