camera array
Recently Published Documents

TOTAL DOCUMENTS: 213 (FIVE YEARS: 54)
H-INDEX: 13 (FIVE YEARS: 4)

2022
Author(s): Nusrat Mehajabin, Peizhi Yan, Supreet Kaur, Jingxiang Song, Mahsa T. Pourazad, ...
Keyword(s):

Author(s): Anqi Zhu, Lin Zhang, Juntao Chen, Yicong Zhou

The panorama stitching system is an indispensable module in surveillance and space exploration. Such a system enables the viewer to understand the surroundings instantly by aligning the surrounding images on a plane and fusing them naturally. The bottleneck of existing systems mainly lies in the alignment and the naturalness of transitions between adjacent images. When facing dynamic foregrounds, they may produce outputs with misaligned semantic objects, which is highly noticeable to human perception. We solve three key issues in the existing workflow that affect its efficiency and the quality of the obtained panoramic video, and present Pedestrian360, a panoramic video system based on a structured camera array (a spatial surround-view camera system). First, to obtain a geometrically aligned 360° view in the horizontal direction, we build a unified multi-camera coordinate system via a novel refinement approach that jointly optimizes camera poses. Second, to eliminate the brightness and color differences between images taken by different cameras, we design a photometric alignment approach that introduces a bias into the baseline linear adjustment model and solves it with two-step least squares. Third, considering that the human visual system is more sensitive to high-level semantic objects, such as pedestrians and vehicles, we integrate the results of instance segmentation into the dynamic-programming framework of the seam-cutting step. To our knowledge, we are the first to introduce instance segmentation to the seam-cutting problem, which ensures the integrity of salient objects in a panorama. Specifically, in our surveillance-oriented system, we choose the most significant target, pedestrians, as the seam-avoidance target, which accounts for the name Pedestrian360. To validate the effectiveness and efficiency of Pedestrian360, we established a large-scale dataset composed of videos with pedestrians in five scenes. The test results on this dataset demonstrate the superiority of Pedestrian360 over its competitors. Experimental results show that Pedestrian360 can stitch videos at 12 to 26 fps, depending on the number of objects in the scene and how frequently they move. To make our reported results reproducible, the relevant code and collected data are publicly available at https://cslinzhang.github.io/Pedestrian360-Homepage/.
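The abstract describes integrating instance segmentation into a dynamic-programming seam-cutting step so that seams avoid pedestrians. Below is a minimal sketch of that idea, not the authors' implementation: a seam-carving-style DP over the overlap region, with a large penalty added on pixels covered by a pedestrian instance mask. The function name, the penalty weight, and the energy definition are illustrative assumptions.

```python
import numpy as np

def find_seam(cost, person_mask, person_penalty=1e4):
    """Find a vertical seam through the overlap of two adjacent images.

    cost        : (H, W) per-pixel colour/gradient difference map.
    person_mask : (H, W) boolean mask of pedestrian instances.

    A large penalty on masked pixels steers the seam around pedestrians,
    so no person is split between the two source images.  Simplified
    sketch only, not the Pedestrian360 implementation.
    """
    energy = cost + person_penalty * person_mask.astype(cost.dtype)
    h, w = energy.shape
    acc = energy.copy()                       # accumulated cost table
    back = np.zeros((h, w), dtype=np.int64)   # backtracking pointers
    for y in range(1, h):
        for x in range(w):
            lo, hi = max(0, x - 1), min(w, x + 2)
            best = lo + int(np.argmin(acc[y - 1, lo:hi]))
            back[y, x] = best
            acc[y, x] = energy[y, x] + acc[y - 1, best]
    # Backtrack from the cheapest cell in the last row.
    seam = np.empty(h, dtype=np.int64)
    seam[-1] = int(np.argmin(acc[-1]))
    for y in range(h - 1, 0, -1):
        seam[y - 1] = back[y, seam[y]]
    return seam                               # seam[y] = cut column at row y

# Toy usage: a 4x5 overlap whose centre column is occupied by a person.
cost = np.ones((4, 5), dtype=np.float32)
mask = np.zeros((4, 5), dtype=bool)
mask[:, 2] = True
print(find_seam(cost, mask))                  # the seam stays away from column 2
```

In practice the cost map would combine color and gradient differences between the two aligned images, and the mask would come from an off-the-shelf instance-segmentation network.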


2021
Vol 243, pp. 106067
Author(s): Matthew R. Baker, Kresimir Williams, H.G. Greene, Casey Greufe, Heather Lopes, ...

2021
Author(s): Eric Thomson, Mark Harfouche, Pavan Konda, Catherine W Seitz, Kanghyun Kim, ...

The dynamics of living organisms are organized across many spatial scales, yet existing cost-effective imaging systems can measure only a subset of these scales at once. Here, we have created a scalable multi-camera array microscope (MCAM) that enables comprehensive high-resolution recording across multiple spatial scales simultaneously, ranging from cellular structures to large-group behavioral dynamics. By collecting data from up to 96 cameras, we computationally generate gigapixel-scale images and movies with near-cellular resolution and 5 μm sensitivity over hundreds of square centimeters. This allows us to observe the behavior and fine anatomical features of numerous freely moving model organisms on multiple spatial scales, including larval zebrafish, fruit flies, nematodes, carpenter ants, and slime mold. The MCAM architecture allows stereoscopic tracking of the z-position of organisms using the overlapping fields of view of adjacent cameras. Further, we demonstrate the ability to acquire dual-color fluorescence video of multiple freely moving zebrafish, recording neural activity via ratiometric calcium imaging. Overall, the MCAM provides a powerful platform for investigating cellular and behavioral processes across a wide range of spatial scales, without the bottlenecks imposed by single-camera image acquisition systems.
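The abstract notes that the overlapping fields of view of adjacent cameras allow stereoscopic tracking of an organism's z-position. As a rough illustration of the geometry only (the MCAM's actual tracking pipeline is not described here), the sketch below applies standard pinhole-stereo triangulation, z = f·B/d, with entirely made-up camera parameters.

```python
import numpy as np

def z_from_adjacent_cameras(x_left_px, x_right_px, baseline_mm, focal_length_px):
    """Estimate depth from the horizontal positions of the same organism
    seen by two adjacent, rectified cameras with overlapping fields of view.

    Standard pinhole-stereo triangulation: z = f * B / d, with the focal
    length expressed in pixels.  Illustrative only; the parameters used
    below are placeholders, not the MCAM's real optics.
    """
    disparity_px = np.asarray(x_left_px, dtype=float) - np.asarray(x_right_px, dtype=float)
    if np.any(disparity_px <= 0):
        raise ValueError("non-positive disparity: images not rectified or "
                         "target outside the overlap region")
    return focal_length_px * baseline_mm / disparity_px

# Toy usage with placeholder numbers: the same zebrafish appears
# 1350 px apart in two neighbouring sensors.
print(z_from_adjacent_cameras(x_left_px=2800, x_right_px=1450,
                              baseline_mm=13.5, focal_length_px=20000))  # prints 200.0 (mm)
```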


2021
Vol 11 (18), pp. 8464
Author(s): Adam L. Kaczmarek, Bernhard Blaschitz

This paper presents research on 3D scanning using a camera array consisting of up to five adjacent cameras. Such an array makes it possible to produce a disparity map with higher precision than a stereo camera, while preserving the advantages of a stereo camera, such as the ability to operate over a wide range of distances and in highly illuminated areas. In an outdoor environment, the array is a competitive alternative to other 3D imaging equipment such as structured-light 3D scanners or Light Detection and Ranging (LIDAR) sensors. The considered kind of array is called an Equal Baseline Camera Array (EBCA). This paper presents a novel approach to calibrating the array based on the use of self-calibration methods. It also introduces a testbed that makes it possible to develop new algorithms for obtaining 3D data from images taken by the array; the testbed has been released as open source. Moreover, the paper shows new results of using these arrays with different stereo matching algorithms, including an algorithm based on a convolutional neural network and deep learning.
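As a sketch of how disparity maps from such an array might be combined (the paper's own merging algorithm is not spelled out in this abstract), the following assumes an EBCA whose side views have already been rectified, and rotated for the vertical pairs, so that the central image acts as the left image of every pair; it matches each pair with OpenCV's semi-global block matcher and takes a per-pixel median.

```python
import cv2
import numpy as np

def ebca_disparity(center_gray, side_grays, num_disparities=64, block_size=5):
    """Merge disparity maps from an Equal Baseline Camera Array (EBCA).

    center_gray : 8-bit grayscale image from the central (reference) camera.
    side_grays  : iterable of 8-bit grayscale side images, each rectified
                  (and rotated, for vertical pairs) so the central image
                  plays the role of the left image of the pair.

    A minimal sketch, not the paper's algorithm: each pair is matched with
    OpenCV's semi-global block matcher and the per-pixel median of the
    valid estimates is returned.
    """
    matcher = cv2.StereoSGBM_create(minDisparity=0,
                                    numDisparities=num_disparities,
                                    blockSize=block_size)
    maps = []
    for side in side_grays:
        # StereoSGBM returns fixed-point disparities scaled by 16;
        # unmatched pixels come back negative.
        disp = matcher.compute(center_gray, side).astype(np.float32) / 16.0
        disp[disp < 0] = np.nan
        maps.append(disp)
    # Because all baselines are equal, every pair predicts the same
    # disparity for a given depth; the median rejects isolated bad matches.
    return np.nanmedian(np.stack(maps), axis=0)
```

The per-pixel median is one simple way to exploit the equal baselines; a weighted fusion or the paper's own merging scheme could be substituted without changing the overall structure.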


Author(s): Erekle Chakhvashvili, Bastian Siegmann, Juliane Bendig, Uwe Rascher
Keyword(s):

Author(s): Norbert Meidinger, Robert Andritschke, Konrad Dennerl, Valentin Emberger, Tanja Eraerds, ...
Keyword(s):

2021
Author(s): Esteban Vera, Felipe Guzmán, Pablo Meza
Keyword(s):

Author(s): Mykola Dreval, Christian Brandt, Jonathan Schilling, H Thomsen, Aleksey Beletskii, ...
Keyword(s): X Ray
