A high-speed vision system with in-pixel programmable ADCs and PEs for real-time visual sensing

Author(s): S. Kagami, T. Komuro, M. Ishikawa
Author(s): Chauncey F. Graetzel, Steven N. Fry, Felix Beyeler, Yu Sun, Bradley J. Nelson

2011, Vol 23 (1), pp. 53-65
Author(s): Yao-Dong Wang, Idaku Ishii, Takeshi Takaki, Kenji Tajima, ...

This paper introduces a high-speed vision system called IDP Express, which can execute real-time image processing and High-Frame-Rate (HFR) video recording simultaneously. In IDP Express, 512×512 pixel images from two camera heads and the processed results from a dedicated FPGA (Field Programmable Gate Array) board are transferred to standard PC memory at a rate of 1000 fps or more. Owing to this simultaneous HFR video processing and recording, IDP Express can be used as an intelligent video logging system for long-term analysis of high-speed phenomena. In this paper, a real-time abnormal behavior detection algorithm was implemented on IDP Express to capture HFR videos of the crucial moments of unpredictable abnormal behaviors in high-speed periodic motions. Several experiments were performed on a high-speed slider machine operating repetitively at 15 Hz, and videos of the abnormal behaviors were automatically recorded, verifying the effectiveness of our intelligent HFR video logging system.
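As a rough illustration of this kind of event-triggered HFR logging, the sketch below compares each incoming frame against a per-phase template of the periodic motion and dumps a rolling frame buffer when the deviation exceeds a threshold. It is a minimal sketch under our own assumptions, not the algorithm implemented on IDP Express; the class name, templating scheme, and buffer handling are hypothetical.

```python
# Hypothetical sketch: trigger HFR video logging when a frame deviates
# from a learned per-phase template of the periodic motion.
from collections import deque
import numpy as np

class PeriodicAnomalyLogger:
    def __init__(self, period_frames, threshold, buffer_len=2000):
        self.period = period_frames           # frames per machine cycle
        self.threshold = threshold            # allowed deviation from the template
        self.templates = [None] * period_frames
        self.ring = deque(maxlen=buffer_len)  # rolling buffer of recent frames
        self.phase = 0

    def learn(self, frame):
        """Accumulate a per-phase reference image during normal operation."""
        t = self.templates[self.phase]
        self.templates[self.phase] = frame.astype(np.float32) if t is None \
            else 0.9 * t + 0.1 * frame
        self.phase = (self.phase + 1) % self.period

    def check(self, frame):
        """Return True (abnormal) when the frame differs too much from its phase template."""
        self.ring.append(frame)
        ref = self.templates[self.phase]
        self.phase = (self.phase + 1) % self.period
        score = np.mean(np.abs(frame.astype(np.float32) - ref))
        return score > self.threshold

    def dump(self, path):
        """Write the buffered frames around the abnormal event to disk."""
        np.save(path, np.stack(self.ring))
```

For a 15 Hz cycle captured at 1000 fps, `period_frames` would be roughly 67, and the threshold would be tuned on footage of normal operation.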


Sensors, 2021, Vol 21 (2), pp. 663
Author(s): Yuji Yamakawa, Yutaro Matsui, Masatoshi Ishikawa

In this research, we focused on human-robot collaboration. There were two goals: (1) to develop and evaluate a real-time human-robot collaborative system, and (2) to achieve concrete tasks, such as collaborative peg-in-hole, using the developed system. We proposed an algorithm for visual sensing and robot hand control to perform collaborative motion, and we analyzed the stability of the collaborative system and the so-called collaborative error caused by image processing and latency. We achieved collaborative motion with the developed system and evaluated the collaborative error on the basis of the analysis results. Moreover, we aimed to realize a collaborative peg-in-hole task, which requires a system with high speed and high accuracy. To achieve this goal, we analyzed the conditions required for performing the collaborative peg-in-hole task from the viewpoints of geometry, force, and posture. Finally, we present the experimental results and data of the collaborative peg-in-hole task and examine the effectiveness of our collaborative system.
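To make the latency-induced collaborative error concrete, the following toy simulation (our own assumption, not the authors' controller or analysis) moves a target at constant speed, feeds the robot hand position measurements delayed by a fixed number of frames, and reports the residual tracking error; the error grows with both the sensing latency and the target speed.

```python
def simulate(latency_frames=3, dt=0.001, gain=0.5, v_target=0.2, steps=2000):
    """Toy loop: the hand tracks a delayed measurement of a moving target."""
    target, hand = 0.0, 0.0
    history = [0.0] * latency_frames          # pending (delayed) measurements
    for _ in range(steps):
        target += v_target * dt               # target moves at constant speed [m]
        history.append(target)
        measured = history.pop(0)             # camera reading from 'latency_frames' ago
        hand += gain * (measured - hand)      # proportional hand-position update
    return target - hand                      # residual "collaborative error" [m]

# Longer latency -> larger steady-state error for the same target speed.
print(simulate(latency_frames=3), simulate(latency_frames=30))
```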


2018, Vol 25 (6), pp. 758-762
Author(s): Yuting Hu, Zhiling Long, Ghassan AlRegib

2013, Vol 25 (4), pp. 586-595
Author(s): Motofumi Kobatake, Tadayoshi Aoyama, Takeshi Takaki, Idaku Ishii

In this paper, we propose a novel concept of real-time microscopic particle image velocimetry (PIV) for apparent high-speed microchannel flows in lab-on-a-chip (LOC) devices. We introduce a frame-straddling dual-camera high-speed vision system that synchronizes two camera inputs for the same camera view with a sub-microsecond time delay. In order to improve the upper and lower limits of measurable velocity in microchannel flow observation, we designed an improved gradient-based optical flow algorithm that adaptively selects a pair of images at the optimal frame-straddling time between the two camera inputs based on the amplitude of the estimated optical flow. This avoids the large image displacement between frames that often causes serious errors in optical flow estimation. Our method is implemented in software on a frame-straddling dual-camera high-speed vision platform that captures real-time video, processes 512 × 512 pixel images at 2000 fps for the two camera heads, and controls the frame-straddling time delay between them from 0 to 0.25 ms in 9.9 ns steps. Our microscopic PIV system with frame-straddling dual-camera high-speed vision simultaneously estimates the velocity distribution of high-speed microchannel flow at 1 × 10^8 pixels/s or more. Results of experiments using real microscopic flows in microchannels thousands of µm wide on LOCs verify the performance of the real-time microscopic PIV system we developed.
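The adaptive frame-straddling idea can be sketched as feedback on the inter-camera delay: estimate the optical flow between the two synchronized images, then shorten or lengthen the delay so the observed displacement stays in a range where gradient-based flow estimation remains reliable. The sketch below uses OpenCV's Farneback flow as a stand-in for the authors' improved gradient-based algorithm; the target displacement, clipping range, and function names are assumptions.

```python
import cv2
import numpy as np

TARGET_DISP = 3.0                       # desired median displacement [pixels] (assumed)
MIN_DELAY, MAX_DELAY = 9.9e-9, 0.25e-3  # frame-straddling delay range [s]

def update_delay(img_a, img_b, delay):
    """img_a, img_b: grayscale frames from the two camera heads, captured 'delay' s apart.
    Estimate the flow between them, convert it to velocity, and rescale the delay."""
    flow = cv2.calcOpticalFlowFarneback(img_a, img_b, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    velocity = flow / delay                              # flow field in pixels per second
    disp = float(np.median(np.linalg.norm(flow, axis=2)))
    if disp > 1e-6:
        delay *= TARGET_DISP / disp                      # large motion -> shorter delay
    return float(np.clip(delay, MIN_DELAY, MAX_DELAY)), velocity
```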


2018, Vol 30 (1), pp. 117-127
Author(s): Xianwu Jiang, Qingyi Gu, Tadayoshi Aoyama, Takeshi Takaki, Idaku Ishii, ...

In this study, we develop a real-time high-frame-rate vision system with frame-by-frame automatic exposure (AE) control that can simultaneously synthesize multiple images with different exposure times into a high-dynamic-range (HDR) image for scenarios with dynamic changes in illumination. By accelerating the video capture and processing for time-division multithread AE control at the millisecond level, the proposed system can virtually function as multiple AE cameras with different exposure times. The system captures color HDR images of 512 × 512 pixels in real time at 500 fps by synthesizing four 8-bit color images with different exposure times at consecutive frames, captured at an interval of 2 ms, with pixel-level parallel processing accelerated by a GPU (Graphics Processing Unit) board. Several experimental results for scenarios with large changes in illumination are presented to confirm the performance of the proposed system for real-time HDR imaging.
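For reference, multi-exposure fusion of this kind can be written as a per-pixel weighted average of the exposures scaled to a common radiance, as in the minimal CPU sketch below. It assumes a linear camera response and a simple hat-shaped weighting, and is not the GPU pipeline used in the proposed system.

```python
# Minimal CPU sketch of multi-exposure HDR fusion (linear camera response assumed).
import numpy as np

def merge_hdr(frames, exposures):
    """frames: list of 8-bit images captured at the given exposure times [s]."""
    acc = np.zeros(frames[0].shape, np.float64)
    wsum = np.zeros(frames[0].shape, np.float64)
    for img, t in zip(frames, exposures):
        x = img.astype(np.float64) / 255.0
        w = 1.0 - np.abs(2.0 * x - 1.0)     # hat weight: trust mid-range pixels
        acc += w * (x / t)                  # scale each exposure to relative radiance
        wsum += w
    return acc / np.maximum(wsum, 1e-6)     # HDR radiance map
```

In the paper's setting, `frames` would be the four consecutive 2 ms-interval captures with different exposure times, fused once per output frame.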


2013, Vol 52 (31), pp. 7530
Author(s): Peng Cheng, Sissy M. Jhiang, Chia-Hsiang Menq

2005, Vol 17 (2), pp. 121-129
Author(s): Yoshihiro Watanabe, Takashi Komuro, Shingo Kagami, Masatoshi Ishikawa

Real-time image processing at high frame rates could play an important role in various visual measurements. Such image processing can be realized by a high-speed vision system that images at high frame rates and executes appropriate algorithms at high speed. We introduce a vision chip for high-speed vision and propose a multi-target tracking algorithm that utilizes the chip's unique features. We describe two visual measurement applications: target counting and rotation measurement. Both achieve excellent measurement precision and high flexibility because of the achievable high-frame-rate visual observation. Experimental results show the advantages of vision chips compared with conventional vision systems.
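Although the paper's multi-target tracking runs on the vision chip itself, the underlying idea carries over to a software sketch: at kilohertz frame rates the inter-frame displacement of each target is tiny, so every blob in the current frame can simply be assigned to the nearest centroid from the previous frame. The following illustration (the threshold value, function names, and greedy matching are our assumptions) shows that idea with OpenCV connected components.

```python
# Illustrative sketch, not the on-chip implementation: nearest-centroid
# association of blobs between consecutive high-frame-rate frames.
import cv2
import numpy as np

def track(prev_centroids, frame, thresh=128):
    """prev_centroids: {track_id: (x, y)}; frame: 8-bit grayscale image."""
    _, binary = cv2.threshold(frame, thresh, 255, cv2.THRESH_BINARY)
    _, _, _, centroids = cv2.connectedComponentsWithStats(binary)
    detections = centroids[1:]                    # drop the background component
    matches = {}
    for tid, c in prev_centroids.items():
        if len(detections) == 0:
            break
        d = np.linalg.norm(detections - np.asarray(c), axis=1)
        j = int(np.argmin(d))
        matches[tid] = tuple(detections[j])       # assign nearest blob to this track
        detections = np.delete(detections, j, axis=0)
    return matches
```

In practice, unmatched detections would spawn new tracks and tracks unmatched for several frames would be dropped; target counting and rotation measurement then follow from the track set.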

