Top view
Recently Published Documents

Total documents: 285 (five years: 93)
H-index: 15 (five years: 3)

2021, Vol 26 (3)
Author(s): Pavlo Ihorovych Krysenko, Maksym Olehovych Zoziuk, Oleksandr Ivanovych Yurikov, Dmytro Volodymyrovych Koroliuk, Yurii Ivanovych Yakymenko

An analytical model for creating flat Chladni figures is presented. The model uses the standing-wave equation under the simplest boundary conditions together with the Fourier transform. Top-view images are shown for different frequencies. The practical significance of the results for further work on creating Chladni figures from standing waves of different physical natures is discussed.
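
The abstract does not give the authors' exact formulation, but the classic square-plate superposition of two standing-wave modes reproduces the kind of top-view nodal patterns it describes. The sketch below is a minimal illustration under that textbook assumption; the mode numbers and plate size are illustrative choices, not values from the paper.

```python
# Minimal sketch: Chladni-style nodal pattern for a unit square plate using the classic
# standing-wave superposition u = cos(n*pi*x)cos(m*pi*y) + cos(m*pi*x)cos(n*pi*y).
# The mode numbers (n, m) are illustrative, not taken from the paper.
import numpy as np
import matplotlib.pyplot as plt

def chladni_pattern(n, m, size=500):
    """Return the standing-wave amplitude on a unit square sampled on a size x size grid."""
    x = np.linspace(0.0, 1.0, size)
    X, Y = np.meshgrid(x, x)
    return (np.cos(n * np.pi * X) * np.cos(m * np.pi * Y)
            + np.cos(m * np.pi * X) * np.cos(n * np.pi * Y))

u = chladni_pattern(n=5, m=2)
# Sand accumulates where the plate does not move, i.e. along the nodal lines u ~ 0,
# so the top view of the figure is simply the zero-level set of the amplitude field.
plt.contour(u, levels=[0.0], colors="black")
plt.gca().set_aspect("equal")
plt.axis("off")
plt.title("Chladni-style nodal lines, mode (5, 2)")
plt.show()
```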


2021, Vol 7 (12), pp. 270
Author(s): Daniel Tøttrup, Stinus Lykke Skovgaard, Jonas le Fevre Sejersen, Rui Pimentel de Figueiredo

In this work we present a novel end-to-end solution for tracking objects (i.e., vessels) in dynamic maritime environments using video streams from aerial drones. Our method relies on deep features, learned from realistic simulation data, for robust object detection, segmentation, and tracking. Furthermore, we propose the use of rotated bounding-box representations, computed from pixel-level object segmentation, which improve tracking accuracy by reducing erroneous data associations when combined with the appearance-based features. A thorough set of experiments in a realistic shipyard simulation environment demonstrates that our method can accurately and quickly detect and track dynamic objects seen from a top view.
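
As an illustration of how a rotated bounding box can be obtained from a pixel-level segmentation mask, the sketch below uses OpenCV's minimum-area rectangle. This is a common choice, but the abstract does not state which routine the authors use, so treat it as an assumption.

```python
# Minimal sketch: deriving a rotated bounding box from a binary segmentation mask,
# one common way to realise the rotated-box representation described in the abstract.
import cv2
import numpy as np

def rotated_box_from_mask(mask: np.ndarray):
    """Return ((cx, cy), (w, h), angle) of the minimum-area rotated rectangle
    enclosing the largest connected component of a binary mask (uint8, 0/255)."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)   # keep the dominant blob (e.g. one vessel)
    return cv2.minAreaRect(largest)                # tight rotated box, unlike an axis-aligned one

# Example: a synthetic diagonal "vessel" blob
mask = np.zeros((200, 200), dtype=np.uint8)
cv2.line(mask, (30, 160), (170, 40), 255, thickness=15)
print(rotated_box_from_mask(mask))  # centre, (width, height), rotation angle in degrees
```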


2021
Author(s): Linh Nguyen Viet, Tuan Nguyen Dinh, Duc Tran Minh, Hoang Nguyen Viet, Quoc Long Tran

Author(s): Gelayol Golcarenarenji, Ignacio Martinez-Alpiste, Qi Wang, Jose Maria Alcaraz-Calero

Abstract: Telescopic cranes are powerful lifting facilities employed in construction, transportation, manufacturing, and other industries. Because the ground workforce cannot be fully aware of their surroundings during crane operations on busy and complex sites, accidents and even fatalities cannot always be avoided. Hence, deploying an automatic and accurate top-view human detection solution would significantly improve the health and safety of the workforce on such industrial operational sites. The proposed method (CraneNet) is a new machine-learning-empowered solution that increases the visibility of a crane operator in complex industrial operational environments while addressing the challenges of top-view human detection on a resource-constrained small-form-factor PC, to meet the space constraint in the operator's cabin. CraneNet consists of four modified ResBlock-D modules to fulfil the real-time requirements. To increase the accuracy on small humans at high altitudes, which is crucial for this use case, a Path Aggregation Network (PAN) was designed and added to the architecture. This enhances the structure of CraneNet by adding a bottom-up path that propagates low-level information. Furthermore, three output layers were employed in CraneNet to further improve the accuracy on small objects. Spatial Pyramid Pooling (SPP) was integrated at the end of the backbone stage, which enlarges the backbone's receptive field and thereby increases the accuracy. CraneNet achieves 92.59% accuracy at 19 FPS on a portable device. The proposed machine learning model has been trained on the Stanford Drone Dataset and VisDrone 2019 to further show the efficacy of the smart crane approach. Consequently, the proposed system is able to detect people in complex industrial operational areas at distances of up to 50 meters between the camera and the person. The system is also applicable to the detection of other objects from an overhead camera.
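
For readers unfamiliar with the SPP block mentioned above, a minimal PyTorch sketch follows. The parallel pooling kernel sizes (5, 9, 13) are the usual YOLO-style choice and are assumed here; they are not confirmed as the published CraneNet configuration.

```python
# Minimal sketch of a Spatial Pyramid Pooling (SPP) block of the kind placed at the end of a
# detector backbone to enlarge its receptive field without changing the feature-map size.
import torch
import torch.nn as nn

class SPPBlock(nn.Module):
    def __init__(self, channels, pool_sizes=(5, 9, 13)):
        super().__init__()
        # Parallel max-pool branches with stride 1 keep the spatial resolution unchanged.
        self.pools = nn.ModuleList(
            nn.MaxPool2d(kernel_size=k, stride=1, padding=k // 2) for k in pool_sizes
        )
        # 1x1 convolution fuses the original features with the pooled branches.
        self.fuse = nn.Conv2d(channels * (len(pool_sizes) + 1), channels, kernel_size=1)

    def forward(self, x):
        branches = [x] + [pool(x) for pool in self.pools]
        return self.fuse(torch.cat(branches, dim=1))

# Example: a feature map from a backbone stage keeps its shape after SPP.
features = torch.randn(1, 256, 20, 20)
print(SPPBlock(256)(features).shape)  # torch.Size([1, 256, 20, 20])
```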


Processes, 2021, Vol 9 (10), pp. 1791
Author(s): Murendeni I. Nemufulwi, Hendrik C. Swart, Gugu H. Mhlongo

Development of gas sensors with improved sensing characteristics, including sensitivity, selectivity, and stability, is now possible owing to the tunable surface chemistry of the sensitive layers as well as their favorable transport properties. Herein, zinc ferrite (ZnFe2O4) nanoparticles (NPs) were produced using a microwave-assisted hydrothermal method. ZnFe2O4 NP sensing layer films of different thicknesses were deposited on interdigitated alumina substrates at volumes of 1.0, 1.5, 2.0, and 2.5 µL using a simple and inexpensive drop-casting technique. Successful deposition of the ZnFe2O4 NP-based active sensing layer films onto the alumina substrates was confirmed by X-ray diffraction and atomic force microscopy. Top-view and cross-sectional scanning electron microscope observations revealed inter-agglomerate pores within the sensing layers. The ZnFe2O4 NP sensing layer produced at a volume of 2 µL exhibited a high response of 33 to 40 ppm of propanol, as well as rapid response and recovery times of 11 and 59 s, respectively, at an operating temperature of 120 °C. Furthermore, all sensors demonstrated a good response towards propanol, higher than their responses to ethanol, methanol, carbon dioxide, carbon monoxide, and methane. The results indicate that the developed fabrication strategy is an inexpensive way to enhance the sensing response without sacrificing other sensing characteristics. The produced ZnFe2O4 NP-based active sensing layers can be used for the detection of volatile organic compounds in alcoholic beverages for quality checks in the food sector.
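
The response value and response/recovery times quoted above are typically extracted from a resistance-time trace. The sketch below uses the conventional definitions (response S = Ra/Rg for a reducing gas on an n-type oxide, 90% rise and fall times); these definitions are assumptions, as the abstract does not state the paper's exact formulas.

```python
# Minimal sketch of how gas-sensing figures of merit like those quoted above are commonly
# extracted from a resistance-time trace. Definitions are conventional assumptions.
import numpy as np

def sensor_metrics(t, resistance, on_idx, off_idx):
    """t, resistance: 1-D arrays; on_idx/off_idx: sample indices where the gas is switched on/off."""
    Ra = resistance[on_idx - 1]              # baseline resistance in air just before exposure
    Rg = resistance[on_idx:off_idx].min()    # minimum resistance during gas exposure
    response = Ra / Rg                       # assumed n-type response to a reducing gas

    # Response time: time to cover 90 % of the resistance drop after the gas is switched on.
    target_on = Ra - 0.9 * (Ra - Rg)
    i_resp = on_idx + np.argmax(resistance[on_idx:off_idx] <= target_on)
    # Recovery time: time to regain 90 % of the drop after the gas is switched off.
    target_off = Rg + 0.9 * (Ra - Rg)
    i_rec = off_idx + np.argmax(resistance[off_idx:] >= target_off)

    return response, t[i_resp] - t[on_idx], t[i_rec] - t[off_idx]
```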


2021, Vol 111 (5)
Author(s): Daniel T.P. Fong, Marabelle Li-wen Heng, Jing Wen Pan, Yi Yan Lim, Pei-Yueng Lee, ...

Background: Hallux valgus is a progressive foot deformity that commonly affects middle-aged women. The aim of this study was to develop a novel method using only top-view photographs to assess hallux valgus severity.
Methods: A top-view digital photograph was taken of each foot of 70 female participants. Two straight lines were drawn along the medial edge of the great toe and forefoot, and the included angle (termed the bunion angle) was measured using a free software program. Each foot was also assessed by a clinician using the Manchester scale as no (grade 1), mild (grade 2), moderate (grade 3), or severe (grade 4) deformity.
Results: The mean bunion angles of the 140 feet were 6.7°, 13.5°, and 16.2° for Manchester grades 1, 2, and 3, respectively (no foot was in grade 4). Reliability was excellent for both intrarater (intraclass correlation coefficient [ICC] = 0.93–0.95) and interrater (ICC = 0.90) assessments. Receiver operating characteristic curves determined the optimal bunion angle cutoff value for screening hallux valgus to be 9°, which gives 89.2% sensitivity and 74.2% specificity.
Conclusions: The bunion angle is a reliable, clinician-free method that can potentially be integrated into a smartphone app for easy and inexpensive self-assessment of hallux valgus.
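
A minimal sketch of the bunion-angle calculation follows: the included angle between a line along the medial edge of the great toe and a line along the medial forefoot, each defined here by two points picked on the photograph. The helper function and coordinate values are illustrative, not part of the study's software.

```python
# Minimal sketch: included angle between two lines, each given by two picked (x, y) photo points.
import numpy as np

def included_angle(p1, p2, q1, q2):
    """Angle in degrees between line p1->p2 and line q1->q2."""
    v = np.asarray(p2, float) - np.asarray(p1, float)
    w = np.asarray(q2, float) - np.asarray(q1, float)
    cos_theta = np.dot(v, w) / (np.linalg.norm(v) * np.linalg.norm(w))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Two points along the medial edge of the great toe, two along the medial forefoot (pixel coords).
toe_line = ((120, 40), (132, 150))
forefoot_line = ((132, 150), (128, 320))
angle = included_angle(*toe_line, *forefoot_line)
print(f"bunion angle ~ {angle:.1f} deg; the study screened values above 9 deg as hallux valgus")
```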

