Flower Bud Detection Based on Saliency Map and SURF Feature-Points

2015 ◽  
Vol 740 ◽  
pp. 656-659
Author(s):  
You Jun Yue ◽  
Xiang Li ◽  
Hui Zhao

The flower bud ratio is an important criterion, and a key difficulty, in realizing automatic flower grading; however, detecting flower buds with computer vision remains a challenge. This paper presents a flower bud detection model as a solution: a saliency map is computed with the spectral residual method and SURF feature points are extracted, so that a SURF histogram can be obtained quickly and unambiguously; the flower bud detection model is then built with an SVM. The experiments indicate that the model distinguishes buds from blooms with good results and provides an effective way to measure the bud ratio in an automatic flower grading system.
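
As a rough illustration of the pipeline described above, the sketch below computes a spectral residual saliency map, extracts SURF descriptors only inside the salient region, quantizes them into a bag-of-words SURF histogram, and feeds that histogram to an SVM. It assumes opencv-contrib-python (for the saliency and SURF modules) and scikit-learn; the codebook size, saliency threshold, and SVM settings are illustrative choices, not the authors' parameters.

```python
# Minimal sketch of the described pipeline: spectral-residual saliency -> SURF
# descriptors in salient regions -> bag-of-words histogram -> SVM (bud vs. bloom).
# Requires opencv-contrib-python (saliency and xfeatures2d modules) and scikit-learn.
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def surf_histogram(image_bgr, codebook, saliency_thresh=0.3):
    """Build a SURF bag-of-words histogram restricted to salient regions."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)

    # Spectral-residual saliency map, thresholded into a binary mask.
    sal = cv2.saliency.StaticSaliencySpectralResidual_create()
    _, sal_map = sal.computeSaliency(image_bgr)
    mask = (sal_map >= saliency_thresh * sal_map.max()).astype(np.uint8) * 255

    # SURF keypoints/descriptors only inside the salient mask.
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    _, descriptors = surf.detectAndCompute(gray, mask)
    if descriptors is None:
        return np.zeros(codebook.n_clusters)

    # Quantize descriptors against the codebook and normalize the histogram.
    words = codebook.predict(descriptors)
    hist, _ = np.histogram(words, bins=np.arange(codebook.n_clusters + 1))
    return hist / max(hist.sum(), 1)

# Assumed training workflow: fit a codebook on pooled SURF descriptors,
# then train a linear SVM on per-image histograms with bud/bloom labels.
# codebook = KMeans(n_clusters=64).fit(all_training_descriptors)
# clf = SVC(kernel="linear").fit(train_histograms, train_labels)
# label = clf.predict([surf_histogram(test_image, codebook)])
```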

2020 ◽  
Author(s):  
Eunjeong Park ◽  
Kijeong Lee ◽  
Taehwa Han ◽  
Hyo Suk Nam

BACKGROUND Subtle abnormal motor signs are indications of serious neurological diseases. Although neurological deficits require fast initiation of treatment within a restricted time, it is difficult for nonspecialists to detect and objectively assess the symptoms. In the clinical environment, diagnoses and decisions are based on clinical grading methods, including the National Institutes of Health Stroke Scale (NIHSS) score and the Medical Research Council (MRC) score, which have been used to measure motor weakness. Objective grading in various environments is necessary for consistent agreement among patients, caregivers, paramedics, and medical staff, to facilitate rapid diagnoses and dispatches to appropriate medical centers. OBJECTIVE In this study, we aimed to develop an autonomous grading system for stroke patients. We investigated the feasibility of our new system to assess motor weakness and grade NIHSS and MRC scores of 4 limbs, similar to the clinical examinations performed by medical staff. METHODS We implemented an automatic grading system composed of a measuring unit with wearable sensors and a grading unit with optimized machine learning. Inertial sensors were attached to measure subtle weakness caused by paralysis of the upper and lower limbs. We collected 60 instances of data with kinematic features of motor disorders from neurological examinations and demographic information of stroke patients with NIHSS 0 or 1 and MRC 7, 8, or 9 grades in a stroke unit. Training data with 240 instances were generated using the synthetic minority oversampling technique (SMOTE) to compensate for the class imbalance and the small amount of training data. We trained 2 representative machine learning algorithms, an ensemble and a support vector machine (SVM), to implement auto-NIHSS and auto-MRC grading. The algorithms were optimized with 5-fold cross-validation, and hyperparameters were searched by Bayesian optimization over 30 trials. The trained models were tested on the 60 original hold-out instances to evaluate accuracy, sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). RESULTS The proposed system graded NIHSS scores with an accuracy of 83.3% and an AUC of 0.912 using the optimized ensemble algorithm, and with an accuracy of 80.0% and an AUC of 0.860 using the optimized SVM algorithm. The auto-MRC grading achieved an accuracy of 76.7% and a mean AUC of 0.870 with SVM classification and an accuracy of 78.3% and a mean AUC of 0.877 with ensemble classification. CONCLUSIONS The automatic grading system quantifies proximal weakness in real time and assesses symptoms through automatic grading. The pilot outcomes demonstrated the feasibility of remote monitoring of motor weakness caused by stroke. The system can facilitate consistent grading with instant assessment and, by sharing the auto-NIHSS and auto-MRC scores between prehospital and hospital responders as an objective observation, expedite dispatch to appropriate hospitals and the initiation of treatment.
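
The training procedure described in the methods can be sketched as follows, under the assumption of a Python stack (imbalanced-learn, scikit-learn, scikit-optimize): SMOTE oversampling of the original instances, then an SVM whose hyperparameters are searched with Bayesian optimization over 30 trials using 5-fold cross-validation. Feature extraction from the wearable inertial sensors is assumed to happen elsewhere, and the parameter ranges below are illustrative rather than the authors'.

```python
# Sketch of the grading-unit training step: SMOTE oversampling, then an SVM
# tuned by Bayesian optimization (30 trials, 5-fold cross-validation).
from imblearn.over_sampling import SMOTE
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, roc_auc_score
from skopt import BayesSearchCV

def train_auto_grader(X_train, y_train):
    """Oversample minority grades with SMOTE and tune an SVM by Bayes search."""
    X_res, y_res = SMOTE(random_state=0).fit_resample(X_train, y_train)

    search = BayesSearchCV(
        SVC(probability=True),
        {"C": (1e-2, 1e3, "log-uniform"), "gamma": (1e-4, 1e1, "log-uniform")},
        n_iter=30,          # 30 optimization trials, as in the study
        cv=5,               # 5-fold cross-validation
        random_state=0,
    )
    search.fit(X_res, y_res)
    return search.best_estimator_

# Evaluation on the held-out original instances (assumed split):
# model = train_auto_grader(X_train, y_train)
# y_pred = model.predict(X_holdout)
# prob = model.predict_proba(X_holdout)[:, 1]
# print(accuracy_score(y_holdout, y_pred), roc_auc_score(y_holdout, prob))
```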


Author(s):  
Abhishek Basu ◽  
Susmita Talukdar

In this paper, a saliency and phase congruency based digital image watermarking scheme is proposed. The proposed technique embeds data in the least significant bits (LSBs) through adaptive replacement. More information is embedded into the less perceptible areas of the original image, which are determined by combining a spectral residual saliency map with a phase congruency map. The positions of pixels with low perceptibility mark the regions that matter least for visibility and are therefore best suited for data hiding, so any modification within these regions is less perceptible to an observer. The model thus identifies the areas within an image that have high data hiding capacity. The superiority of the algorithm is tested in terms of imperceptibility, robustness, and data hiding capacity.
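
A minimal sketch of the embedding idea follows: pixels with the lowest combined saliency/phase-congruency response are treated as least perceptible, and watermark bits replace their least significant bits first. Computing the spectral residual saliency and phase congruency maps is assumed to be done separately; the `perceptibility` argument below stands for their normalized combination, and the adaptive aspects of the authors' replacement rule are not reproduced.

```python
# LSB embedding guided by a perceptibility map: the least perceptible pixels
# receive the watermark bits first, so the modification is hard to notice.
import numpy as np

def embed_lsb(cover_gray, watermark_bits, perceptibility):
    """Embed bits into the LSBs of the least-perceptible pixels."""
    stego = cover_gray.astype(np.uint8).copy()
    flat = stego.ravel()

    # Order pixel indices from least to most perceptible.
    order = np.argsort(perceptibility.ravel())
    targets = order[:len(watermark_bits)]

    # Replace the LSB of each target pixel with a watermark bit.
    bits = np.asarray(watermark_bits, dtype=np.uint8) & 1
    flat[targets] = (flat[targets] & 0xFE) | bits
    return stego

def extract_lsb(stego_gray, perceptibility, n_bits):
    """Recover the bits from the same pixel ordering."""
    order = np.argsort(perceptibility.ravel())
    return stego_gray.ravel()[order[:n_bits]] & 1
```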


Sensors ◽  
2019 ◽  
Vol 19 (11) ◽  
pp. 2553 ◽  
Author(s):  
Jingwen Cui ◽  
Jianping Zhang ◽  
Guiling Sun ◽  
Bowen Zheng

Based on computer vision technology, this paper proposes a method for identifying and locating crops so that they can be successfully captured during automatic picking. The method combines the YOLOv3 algorithm, implemented in the DarkNet framework, with a point cloud image coordinate matching step. First, RGB (red, green, blue) images and depth images are obtained with a Kinect v2 depth camera. Second, the YOLOv3 algorithm identifies the various types of target crops in the RGB images and determines the feature points of the target crops. Finally, the 3D coordinates of the feature points are displayed on the point cloud images. Compared with other methods, this approach identifies crops with high accuracy and small positioning error, which lays a good foundation for the subsequent harvesting of crops using mechanical arms.
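
The coordinate-matching step can be sketched as follows: the center of a YOLOv3 bounding box in the RGB image is looked up in a registered depth image and back-projected into 3D camera coordinates with the pinhole model. The Kinect v2 intrinsics below are nominal values, and the function names are illustrative assumptions rather than the authors' code.

```python
# Map a 2D YOLOv3 detection to a 3D point using the registered depth image
# and a pinhole camera model (approximate Kinect v2 depth-camera intrinsics).
import numpy as np

FX, FY = 365.0, 365.0   # focal lengths (pixels), nominal values
CX, CY = 256.0, 212.0   # principal point, nominal values

def box_center_to_3d(box, depth_image_mm):
    """Map a detection box (x1, y1, x2, y2) to a 3D point (meters) in the camera frame."""
    x1, y1, x2, y2 = box
    u, v = int((x1 + x2) / 2), int((y1 + y2) / 2)   # feature point = box center

    z = depth_image_mm[v, u] / 1000.0               # depth in meters
    if z == 0:                                      # no depth reading at this pixel
        return None

    # Pinhole back-projection: pixel (u, v) at depth z -> camera coordinates.
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.array([x, y, z])

# Usage (assumed): for each crop detected by YOLOv3,
# point = box_center_to_3d(detection_box, registered_depth)
# and the resulting point can then be shown on / matched to the point cloud.
```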


2012 ◽  
Vol 17 (2) ◽  
pp. 167 ◽  
Author(s):  
María Lucía Gutiérrez ◽  
Johana Guevara ◽  
Luis Alejandro Barrera

During embryological limb formation, mesenchymal cells condense and differentiate into chondrocytes in a process known as chondrogenesis. These chondrocytes synthesize glycosaminoglycans (GAGs) and thus play an important role in this process. A simplified system for in vitro chondrogenesis using adult mesenchymal stromal cells (MSCs) has been demonstrated; this differentiation potential is usually assessed by histological staining. Objective. To establish a semi-automatic grading system for histochemical stains and immunohistochemistry assays. Materials and methods. For chondrogenesis, cells were cultured for three weeks as aggregates in inducing media. Total GAGs were measured with the dimethylmethylene blue (DMB) method. For histological analyses, aggregates were stained with Alcian blue to detect total GAGs, and immunohistochemistry (IHC) for aggrecan was performed. Semi-automatic grades for all slides were obtained after ImageJ analysis. Results. MSCs cultured as aggregates in chondrogenic differentiation media had similar protein concentrations at all time points, suggesting that cellularity remained homogeneous during culture. Total GAG content was higher for aggregates cultured in chondrogenic media than in complete media. The same trend was observed for the Alcian blue stain grades assigned by a blinded observer and by analysis with the ImageJ software. Aggrecan IHC showed a decreasing tendency over time for aggregates in chondrogenic media, both for the blinded observer and for the ImageJ evaluation. Conclusion. We developed a functional system for semi-automatic slide grading and corroborated the results by biochemical analysis, with comparable outcomes. To our knowledge, this is the first report to evaluate in vitro chondrogenesis stains using this methodology. The procedure might be useful for other applications in biology and the medical sciences. Key words: mesenchymal stromal cells, in vitro chondrogenesis, glycosaminoglycans, ImageJ
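
The grading itself was performed in ImageJ; purely as a hedged illustration, the sketch below shows an analogous quantification in Python with OpenCV, in which the Alcian blue-stained area is segmented by an HSV threshold and the stained-area fraction is mapped to an ordinal grade. The threshold values and grade cut-offs are assumptions, not the authors' protocol.

```python
# Analogous stain quantification: estimate the fraction of tissue area stained
# blue (Alcian blue) and map that fraction to an ordinal grade.
import cv2
import numpy as np

def stain_fraction(image_bgr):
    """Fraction of the tissue area covered by blue (Alcian blue) staining."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    # Blue hue range with minimum saturation/value (illustrative thresholds).
    stained = cv2.inRange(hsv, (90, 40, 40), (140, 255, 255))
    # Tissue mask: everything that is not near-white background.
    tissue = cv2.inRange(hsv, (0, 0, 0), (180, 255, 240))
    return float(np.count_nonzero(stained)) / max(np.count_nonzero(tissue), 1)

def grade(fraction, cutoffs=(0.1, 0.3, 0.6)):
    """Map a stained-area fraction to an ordinal grade (0-3)."""
    return int(np.searchsorted(cutoffs, fraction))
```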

