High-Speed Human Arm Projection Mapping with Skin Deformation

2021 ◽  
Vol 11 (9) ◽  
pp. 3753 ◽  
Author(s):  
Hao-Lun Peng ◽  
Yoshihiro Watanabe

Dynamic projection mapping that follows a moving object's position and shape is fundamental for augmented reality that reflects changes on a target surface. For instance, augmenting the human arm surface via dynamic projection mapping can enhance applications in fashion, user interfaces, prototyping, education, medical assistance, and other fields. For such applications, however, conventional methods neglect skin deformation and have high latency between motion and projection, causing noticeable misalignment between the target arm surface and projected images. These problems degrade the user experience and limit the development of more applications. We propose a system for high-speed dynamic projection mapping onto a rapidly moving human arm with realistic skin deformation. With the developed system, the user does not perceive any misalignment between the arm surface and projected images. First, we combine a state-of-the-art parametric deformable surface model with efficient regression-based accuracy compensation to represent skin deformation. Through this compensation, we modify the texture coordinates to achieve fast and accurate image generation for projection mapping based on joint tracking. Second, we develop a high-speed system that provides a latency between motion and projection below 10 ms, which is generally imperceptible to human vision. Compared with conventional methods, the proposed system provides more realistic experiences and increases the applicability of dynamic projection mapping.
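
The regression-based texture-coordinate compensation described above can be pictured with a short sketch. This is not the authors' implementation; the function names, the ridge-regression form, and the data shapes are assumptions used only to illustrate learning per-vertex UV offsets from tracked joint parameters.

```python
import numpy as np

def fit_uv_compensation(joint_params, uv_residuals, reg=1e-3):
    """Fit a ridge regression mapping tracked joint parameters (e.g. elbow and
    wrist angles) to per-vertex texture-coordinate (UV) offsets measured between
    the parametric surface model and the observed skin during calibration.

    joint_params : (N, J) array of joint parameters for N calibration frames
    uv_residuals : (N, V*2) array of flattened UV correction targets
    """
    X = np.hstack([joint_params, np.ones((joint_params.shape[0], 1))])  # add bias column
    A = X.T @ X + reg * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ uv_residuals)        # (J+1, V*2) weight matrix

def compensate_uv(base_uv, joint_params, W):
    """Apply the learned offset to the model's base texture coordinates
    for one tracked frame; cost is a single matrix-vector product."""
    x = np.append(joint_params, 1.0)                      # append bias term
    return base_uv + (x @ W).reshape(base_uv.shape)

# Toy usage with random data standing in for calibration captures.
rng = np.random.default_rng(0)
joints = rng.normal(size=(200, 4))                        # 4 joint parameters
residuals = rng.normal(scale=0.01, size=(200, 500 * 2))   # 500 mesh vertices
W = fit_uv_compensation(joints, residuals)
corrected_uv = compensate_uv(np.zeros((500, 2)), joints[0], W)
```

A linear model of this kind keeps the per-frame correction to a single matrix-vector product, which is consistent with the sub-10 ms motion-to-projection budget the abstract targets.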


2016 ◽  
Vol 25 (4) ◽  
pp. 299-321 ◽  
Author(s):  
Tomohiro Sueishi ◽  
Hiromasa Oku ◽  
Masatoshi Ishikawa

Dynamic projection mapping (DPM) is a type of projection-based augmented reality that aligns projected content with a moving physical object. To adjust the projection to the fast motion of moving objects, DPM requires high-speed visual feedback. One option for reducing the temporal delay of the projection to imperceptible levels is to use mirror-based high-speed optical-axis controllers. However, using such controllers for capturing visual feedback requires sufficient illumination of the moving object, which leads to a trade-off between tracking stability and the quality of the projection content. In this article, we propose a system that combines mirror-based high-speed tracking with a retroreflective background. The proposed tracking technique observes the silhouette of the target object under episcopic illumination and is robust against illumination changes. It also maintains high-speed, accident-avoidant tracking by performing background subtraction in an active vision system and employing an adaptive-window technique. This allows us to create DPM with an imperceptible temporal delay, high tracking stability, and high visual quality. We analyze the proposed system regarding the visual quality of the retroreflective background, the tracking stability under illumination and disturbance conditions, and the visual consistency relative to delay in the presence of pose estimation. In addition, we demonstrate application scenarios for the proposed DPM system.
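
A minimal sketch of the silhouette-and-adaptive-window idea is given below. It is hypothetical and not the authors' code: the threshold value, the window margin, and the function name are assumptions, and the sketch only illustrates background subtraction against a bright retroreflective background followed by re-centering the search window each frame.

```python
import numpy as np

def track_silhouette(frame, window, bg_threshold=200, margin=1.3):
    """Track a dark target silhouette against a bright retroreflective background.

    frame  : (H, W) grayscale image; the retroreflected light makes the
             background bright, so the target appears as a dark silhouette.
    window : (x0, y0, x1, y1) current search window.
    Returns the target centroid and an updated window for the next frame.
    """
    x0, y0, x1, y1 = window
    roi = frame[y0:y1, x0:x1]
    mask = roi < bg_threshold                    # background subtraction by threshold
    ys, xs = np.nonzero(mask)
    if xs.size == 0:                             # target lost: keep the old window
        return None, window
    cx, cy = xs.mean() + x0, ys.mean() + y0      # centroid in full-image coordinates
    # Adaptive window: enlarge the silhouette's bounding box by a margin so the
    # target stays inside the search window in the next high-speed frame.
    w = (xs.max() - xs.min() + 1) * margin
    h = (ys.max() - ys.min() + 1) * margin
    H, W = frame.shape
    new_window = (max(int(cx - w / 2), 0), max(int(cy - h / 2), 0),
                  min(int(cx + w / 2), W), min(int(cy + h / 2), H))
    return (cx, cy), new_window

# Toy usage: a dark 20x20 square on a bright synthetic background.
frame = np.full((480, 640), 255, dtype=np.uint8)
frame[100:120, 200:220] = 30
centroid, window = track_silhouette(frame, (0, 0, 640, 480))
```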


Author(s):  
Sadam Fujioka

This paper describes an interactive art installation titled "drop." It is the first artwork using the Waterdrop Projection-Mapping (WPM) system, which animates levitating waterdrops. With this artwork, the anno lab team infuses physical characteristics into computer graphics and materializes them as tangible pixels. WPM consists of a waterdrop generator and an ultra-high-speed projector. The team uses the ultra-high-speed projector to cast stroboscopic spotlights mapped onto the waterdrops, creating the optical illusion that each waterdrop is animated individually. This is a new technique for showing computer animation by animating levitating waterdrops, and it opens a new horizon for creating animations with tangible pixels that the viewer can physically touch.
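
The stroboscopic mapping can be illustrated with a small timing sketch. It is not the anno lab system; the frame rate, release height, and simple free-fall model are assumptions used only to show how a projector frame index maps to a predicted drop position.

```python
G = 9.81     # gravitational acceleration, m/s^2
FPS = 1000   # assumed ultra-high-speed projector frame rate

def drop_position(release_height, t):
    """Height of a drop released from rest at release_height after t seconds,
    assuming simple free fall (no air resistance)."""
    return release_height - 0.5 * G * t * t

def spotlight_schedule(release_height, duration, fps=FPS):
    """Yield (frame_index, height) pairs telling the projector where to flash
    its spotlight on each frame while the drop is falling."""
    for i in range(int(duration * fps)):
        t = i / fps
        yield i, drop_position(release_height, t)

# Toy usage: schedule the first 50 ms of flashes for a drop released at 1.5 m.
for frame, y in spotlight_schedule(release_height=1.5, duration=0.05):
    print(f"frame {frame}: flash spotlight at height {y:.4f} m")
```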


2017 ◽  
Vol 2017 ◽  
pp. 1-10 ◽  
Author(s):  
Naoki Hashimoto ◽  
Ryo Koizumi ◽  
Daisuke Kobayashi

We propose a dynamic projection mapping system with effective machine learning and high-speed edge-based object tracking using a single IR camera. The machine-learning techniques are used for precise 3D initial posture estimation from 2D IR images as a detection process. After detection, we apply an edge-based tracking process for real-time image projection. In this paper, we implement our proposal and demonstrate dynamic projection mapping. In addition, we evaluate the performance of our proposal through a comparison with a Kinect-based tracking system.
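
The detection-then-tracking pipeline can be summarized in a short sketch. The object names (camera, detector, edge_tracker, projector, renderer) are hypothetical placeholders, not the paper's API; the sketch only shows the control flow of falling back to learned detection whenever edge-based tracking fails.

```python
def run_projection_mapping(camera, detector, edge_tracker, projector, renderer):
    """Detection-then-tracking loop: a learned detector provides the initial
    3D posture from a single IR frame, then a cheaper edge-based tracker
    refines the pose frame to frame for low-latency projection."""
    pose = None
    while True:
        ir_frame = camera.read()                    # single IR camera input
        if pose is None:
            # Detection: regress the initial 3D posture from the 2D IR image.
            pose = detector.estimate_pose(ir_frame)
            if pose is None:
                continue                            # nothing detected yet
        else:
            # Tracking: align the model's projected edges with image edges.
            pose, ok = edge_tracker.refine(ir_frame, pose)
            if not ok:
                pose = None                         # tracking lost: re-detect
                continue
        projector.show(renderer.render(pose))       # project content for this pose
```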


Author(s):  
Toshika Fegade ◽  
Yogesh Kurle ◽  
Sagar Nikale ◽  
Praful Kalpund

Robotics is a field concerned with the "intelligent connection of perception to action". The most common manufacturing robot is the robotic arm, built with varying degrees of freedom. Today, these humanoid robots perform many functions to assist humans in different undertakings, such as space missions and the driving and monitoring of high-speed vehicles. They are called semi-humanoids because they resemble the upper part of the human body.

The idea of this paper is to change the way a robotic arm is controlled. It provides a way to move beyond old-fashioned remote controls and presents an intuitive technique for implementing a semi-humanoid gesture-controlled robot. The system includes two robot arms closely modeled on the human arm (five fingers each), which increases the sensitivity of the system. It uses motion sensors: flex sensors and an accelerometer (of the kind used in mobile phones to detect tilting motion). The system design is divided into three parts: the robotic arm, real-time video, and the platform.

The prime aim of the design is that the robotic arm and platform begin moving as soon as the operator makes a hand or leg gesture. The robotic arm is synchronized with the operator's hand gestures (postures), and the platform is controlled by the operator's leg gestures. The robot and the gesture device are connected wirelessly via RF. The wireless communication enables the user to interact with the robot in an effortless way.
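
A minimal sketch of the gesture-to-command mapping is shown below. It is not the paper's firmware; the ADC range, deadband, and packet layout are assumptions used only to illustrate mapping flex-sensor readings to finger-servo angles and accelerometer tilt to platform motion before sending a packet over the RF link.

```python
def flex_to_servo_angle(raw, raw_min=200, raw_max=800, angle_min=0, angle_max=180):
    """Linearly map a raw flex-sensor ADC reading to a finger-servo angle (degrees)."""
    raw = min(max(raw, raw_min), raw_max)            # clamp to the sensor's range
    return angle_min + (raw - raw_min) * (angle_max - angle_min) / (raw_max - raw_min)

def tilt_to_drive(ax, ay, deadband=0.15):
    """Map normalized accelerometer tilt in [-1, 1] to platform drive commands."""
    forward = 0 if abs(ay) < deadband else (1 if ay > 0 else -1)
    turn = 0 if abs(ax) < deadband else (1 if ax > 0 else -1)
    return forward, turn

def build_packet(flex_readings, ax, ay):
    """Assemble one command packet (5 finger angles + 2 drive bytes) for the RF link."""
    angles = [int(flex_to_servo_angle(r)) for r in flex_readings]
    forward, turn = tilt_to_drive(ax, ay)
    return bytes(angles + [forward & 0xFF, turn & 0xFF])

# Toy usage: five flex readings plus a forward-tilt gesture.
packet = build_packet([300, 450, 600, 750, 500], ax=0.1, ay=0.6)
```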


2021 ◽  
Author(s):  
Sora Hisaichi ◽  
Kiwamu Sumino ◽  
Kunihiro Ueda ◽  
Hidenori Kasebe ◽  
Tohru Yamashita ◽  
...  
