Shift-rotation absolute measurement method for irregular aperture optical surfaces based on deep learning

2021 ◽  
pp. 105020
Author(s):  
Lili Yang ◽  
Jiantai Dou ◽  
Zhongming Yang ◽  
Zhaojun Liu
2021 ◽  
Author(s):  
Jingwei Yang ◽  
Yikang Wang ◽  
Chong Li ◽  
Wei Han ◽  
Weiwei Liu ◽  
...  

Background: Pronuclear assessment appears able to distinguish good from poor embryos at the zygote stage, but clinical studies have yielded contradictory results. This may stem from the reliance on coarse qualitative assessment of dynamically developing pronuclei. Here we aimed to establish a quantitative pronuclear measurement method by applying deep learning, trained on expert experience, to large annotated datasets. Methods: 13,419 verified hand-annotated two-pronuclei (2PN) images were used for deep learning, and the corresponding errors were recorded through manual checking for subsequent parameter adjustment. We analysed 790 embryos with 52,479 PN images from 155 patients, relating pronuclear area to preimplantation genetic testing (PGT) results. An exponential fitting equation was established, and its key coefficient β1 was extracted from the model for quantitative analysis of pronuclear (PN) annotation and automatic recognition. Findings: Based on the female-derived PN coefficient β1, the chromosomal normality rate of blastocysts with the largest PN area was much higher than that of blastocysts with the smallest PN area (58.06% vs. 45.16%; OR = 1.68 [1.07-2.64]; P = 0.031). After adjusting coefficient β1 by removing the first three frames, whose outlier PN areas showed high variance, similar but stronger evidence was obtained for β1 at 12 hours and at 14 hours post-insemination. These discrepancies arose from female probands in the PGT-SR subgroup and smaller chromosomal errors. Conclusion(s): The results suggest that detailed analysis of embryo images could improve our understanding of developmental biology. Funding: None
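The quantitative step described above fits an exponential curve to the measured PN area over time and extracts the key coefficient β1. The sketch below illustrates such a fit on synthetic data, assuming a hypothetical model area(t) = β0·exp(β1·t) with a log-linear least-squares fit; the paper's exact equation and data are not given here.

```python
import numpy as np

# Hypothetical exponential growth model for pronuclear (PN) area:
#   area(t) = beta0 * exp(beta1 * t)
# where t is hours post-insemination. beta1 is the growth
# coefficient extracted for quantitative analysis.
def pn_area_model(t, beta0, beta1):
    return beta0 * np.exp(beta1 * t)

# Synthetic PN-area measurements standing in for the annotated
# time-lapse frames (units and values are illustrative only).
rng = np.random.default_rng(0)
t_hours = np.linspace(6, 18, 25)
true_beta0, true_beta1 = 300.0, 0.05
areas = pn_area_model(t_hours, true_beta0, true_beta1)
areas = areas + rng.normal(0.0, 5.0, size=t_hours.size)  # noise

# Log-linearise and fit: log(area) = log(beta0) + beta1 * t,
# so the slope of a degree-1 polynomial fit estimates beta1.
beta1_hat, log_beta0_hat = np.polyfit(t_hours, np.log(areas), 1)
```

In practice, frames with outlier PN areas (e.g., the first frames after PN appearance) would be excluded before fitting, mirroring the adjustment the abstract describes.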


2018 ◽  
Vol 426 ◽  
pp. 589-597 ◽  
Author(s):  
Jinyu Du ◽  
Zhongming Yang ◽  
Zhaojun Liu ◽  
Guobin Fan

2020 ◽  
Vol 49 (6) ◽  
pp. 20200023
Author(s):  
张钊 Zhao Zhang ◽  
韩博文 Bowen Han ◽  
于浩天 Haotian Yu ◽  
张毅 Yi Zhang ◽  
郑东亮 Dongliang Zheng ◽  
...  

Sensors ◽  
2018 ◽  
Vol 18 (8) ◽  
pp. 2484 ◽  
Author(s):  
Weixing Zhang ◽  
Chandi Witharana ◽  
Weidong Li ◽  
Chuanrong Zhang ◽  
Xiaojiang Li ◽  
...  

Traditional methods of detecting and mapping utility poles are inefficient and costly because they demand visual interpretation of high-quality data sources or intensive field inspection. The advent of deep learning for object detection provides an opportunity to detect utility poles from side-view optical images. In this study, we propose a deep learning-based method for automatically mapping roadside utility poles with crossarms (UPCs) from Google Street View (GSV) images. The method combines a state-of-the-art DL object detection algorithm (RetinaNet) with a modified brute-force line-of-bearing (LOB) measurement method, where an LOB is the ray from the sensor location (the GSV mobile platform) towards the location of the target (here, a UPC), to estimate the locations of detected roadside UPCs from GSV. Experimental results indicate that: (1) both the average precision (AP) and the overall accuracy (OA) are around 0.78 when the intersection-over-union (IoU) threshold is greater than 0.3, based on testing 500 GSV images containing a total of 937 objects; and (2) around 2.6%, 47%, and 79% of the estimated utility pole locations fall within 1 m, 5 m, and 10 m buffer zones, respectively, around the reference locations. In general, this study indicates that even against a complex background, most utility poles can be detected with DL, and the LOB measurement method can estimate the locations of most UPCs.
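The LOB idea can be illustrated by intersecting two bearing rays observed from different GSV capture points on a local planar grid. This is a minimal geometric sketch, not the paper's implementation; the function name, coordinate convention, and example positions are assumptions.

```python
import math

def lob_intersection(p1, brg1, p2, brg2):
    """Estimate a target location as the intersection of two
    line-of-bearing (LOB) rays. Bearings are in degrees clockwise
    from north; positions are (x_east, y_north) in metres on a
    local planar grid. Returns None for (near-)parallel rays."""
    # Unit direction vectors for each bearing ray.
    d1 = (math.sin(math.radians(brg1)), math.cos(math.radians(brg1)))
    d2 = (math.sin(math.radians(brg2)), math.cos(math.radians(brg2)))
    # Solve p1 + t1*d1 == p2 + t2*d2 for t1 (2x2 cross products).
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None  # parallel bearings give no unique fix
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# Two capture points observing the same pole at (30, 40):
fix = lob_intersection(
    (0.0, 0.0), math.degrees(math.atan2(30.0, 40.0)),
    (60.0, 0.0), math.degrees(math.atan2(-30.0, 40.0)),
)
```

A brute-force variant, as the abstract suggests, would search over many detected bearings from successive GSV panoramas and keep the intersection most consistent with all observations.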


2013 ◽  
Vol 20 (5) ◽  
pp. 374-377 ◽  
Author(s):  
Weihong Song ◽  
Fan Wu ◽  
Xi Hou ◽  
Wenchuan Zhao ◽  
Yongjian Wan

2019 ◽  
Vol 58 (1) ◽  
pp. 28-33
Author(s):  
Hideaki MAEHARA ◽  
Momoyo NAGASE ◽  
Michihiro KUCHI ◽  
Toshihisa SUZUKI ◽  
Kenji TAIRA
