Enhancing Sensitivity Using Electrostatic Spring in Coupling Mode-Localized MEMS Accelerometer

Author(s): Z. Wang, X. Y. Xiong, K. F. Wang, W. H. Yang, Z. T. Li, ...
2020, Vol. 23 (7), pp. 25-33

Author(s): Luciane Agnoletti dos Santos Pedotti, Ricardo Mazza Zago, Mateus Giesbrecht, Fabiano Fruett
Sensors, 2021, Vol. 21 (4), pp. 1390

Author(s): Tomasz Ursel, Michał Olinski

This article aims to develop a system capable of estimating the displacement of a moving object using relatively cheap and easy-to-apply sensors. There is a growing need for such systems, not only for robots but also, for instance, for pedestrian navigation. In this paper, the theory behind this idea is presented, including data post-processing algorithms for a MEMS accelerometer and an optical flow sensor (OFS), as well as the complementary filter developed for sensor fusion. In addition, a vital part of the accelerometer's algorithm, zero-velocity state detection, is implemented. It is based on analysis of the acceleration signal and subsequent acceleration symmetrization, greatly improving the obtained displacement. A test stand with a linear guide and a motor, enabling a specified linear motion to be imposed, is built. The results of testing both sensors suggest that the displacement estimated by each of them is highly accurate. Fusing the sensors' data gives even better outcomes, especially in cases with external disturbance of the OFS. The comparative evaluation of the estimated linear displacements, in each case related to encoder data, confirms that the algorithms operate correctly and proves the usefulness of the chosen sensors for developing a linear displacement measuring system.
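As an editorial illustration of the fusion scheme described in the abstract above, the sketch below combines an accelerometer-derived displacement (double integration with a crude zero-velocity reset) and the optical flow sensor's cumulative displacement through a first-order complementary filter. This is a minimal sketch, not the authors' implementation: the function name, sampling period, filter weight, and zero-velocity threshold are all assumed for the example.

```python
import numpy as np

def fuse_displacement(acc, ofs_disp, dt=0.01, alpha=0.98, zv_thresh=0.05):
    """Complementary-filter fusion of accelerometer- and OFS-based displacement.

    acc       : 1-D array of linear acceleration [m/s^2], gravity already removed
    ofs_disp  : 1-D array of cumulative displacement from the optical flow sensor [m]
    dt, alpha, zv_thresh : sampling period, filter weight, and zero-velocity
                           threshold -- all values assumed for this illustration.
    """
    vel = 0.0
    prev = 0.0
    fused = np.zeros(len(acc))

    for k in range(len(acc)):
        # Crude stand-in for the zero-velocity state detection mentioned in the
        # abstract: near-zero acceleration is treated as standstill and the
        # integrated velocity is reset to limit drift.
        if abs(acc[k]) < zv_thresh:
            vel = 0.0
        else:
            vel += acc[k] * dt

        # Complementary filter: the accelerometer path supplies the
        # high-frequency content, while the optical flow sensor anchors the
        # low-frequency, drift-free part of the displacement estimate.
        prev = alpha * (prev + vel * dt) + (1.0 - alpha) * ofs_disp[k]
        fused[k] = prev

    return fused

# Illustrative usage with pre-recorded signals sampled at 100 Hz:
# disp = fuse_displacement(acc_samples, ofs_samples, dt=0.01)
```

Choosing alpha close to 1 lets the accelerometer path dominate the short-term dynamics, while the OFS reading slowly corrects the low-frequency drift that double integration accumulates.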


Author(s): Abhiraj Basavanna, Matthias Dienger, Jan Rockstroh, Steffen Keller, Alfons Dehe
2021, pp. 1-1

Author(s): Pramod Martha, Naveen Kadayinti, V. Seena

Author(s): Adam F. Werner, Jamie C. Gorman

Objective: This study examines visual, auditory, and the combination of both (bimodal) coupling modes in the performance of a two-person perceptual-motor task, in which one person provides the perceptual inputs and the other the motor inputs.
Background: Parking a plane or landing a helicopter on a mountain top requires one person to provide motor inputs while another person provides perceptual inputs. Perceptual inputs are communicated either visually, auditorily, or through both cues.
Methods: One participant drove a remote-controlled car around an obstacle and through a target, while another participant provided auditory, visual, or bimodal cues for steering and acceleration. Difficulty was manipulated using target size. Performance (trial time, path variability), cue rate, and spatial ability were measured.
Results: Visual coupling outperformed auditory coupling. Bimodal performance was best in the most difficult task condition but also high in the easiest condition. Cue rate predicted performance in all coupling modes. Drivers with lower spatial ability required a faster auditory cue rate, whereas drivers with higher ability performed best with a lower rate.
Conclusion: Visual cues result in better performance when only one coupling mode is available. As predicted by multiple resource theory, when both cues are available, performance depends more on auditory cueing. In particular, drivers must be able to transform auditory cues into spatial actions.
Application: Spotters should be trained to provide an appropriate cue rate to match the spatial ability of the driver or pilot. Auditory cues can enhance visual communication when the interpersonal task is visual with spatial outputs.

