Constructing energy spectrum of inorganic scintillator based on plastic scintillator by different kernel functions of SVM learning algorithm and TSC data mapping

2020 ◽  
Vol 15 (01) ◽  
pp. P01028-P01028
Author(s):  
Khalil Moshkbar-Bakhshayesh
Sensors ◽  
2020 ◽  
Vol 20 (22) ◽  
pp. 6671
Author(s):  
Sharif Hossain ◽  
Christopher W.K. Chow ◽  
Guna A. Hewa ◽  
David Cook ◽  
Martin Harris

The spectral fingerprint of drinking water from a water treatment plant (WTP) is characterised by a number of light-absorbing substances, including organic matter, nitrate, disinfectant, and particles or turbidity. Detection of the disinfectant (monochloramine) can be better achieved by separating its spectrum from the combined spectra. This paper has two main focuses: (i) the separation of the monochloramine spectrum from the combined spectra and (ii) assessment of a machine learning algorithm for real-time detection of monochloramine. A support vector regression (SVR) model was developed using multi-wavelength ultraviolet-visible (UV-Vis) absorbance spectra and online amperometric monochloramine residual measurements. The performance of the SVR model was evaluated using four different kernel functions. Results show that (i) particles or turbidity in the water have a significant effect on UV-Vis spectral measurements, and improved modelling accuracy is achieved by using particle-compensated spectra; (ii) modelling performance is further improved by compensating the spectra for natural organic matter (NOM) and nitrate (NO3); and (iii) the choice of kernel function greatly affects SVR performance, with the radial basis function (RBF) emerging as the best-performing kernel. The outcomes of this research suggest that disinfectant residual (monochloramine) can be measured in real time using the SVR algorithm with a precision of ±0.1 mg L−1.
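As a rough sketch of the kind of kernel comparison described above, the snippet below fits an SVR with each of four kernels and scores them on held-out data. The synthetic features stand in for the multi-wavelength absorbance spectra (which are not reproduced here), and all parameter values are illustrative, not the paper's.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
# Synthetic stand-in for multi-wavelength absorbance spectra:
# each row is a "spectrum"; the target mimics a residual concentration (mg/L).
X = rng.uniform(0.0, 1.0, size=(200, 10))
y = 2.0 * np.exp(-3.0 * X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0, 0.02, 200)

X_train, X_test = X[:150], X[150:]
y_train, y_test = y[:150], y[150:]

# Evaluate the same four kernel families compared in the paper.
scores = {}
for kernel in ("linear", "poly", "rbf", "sigmoid"):
    model = SVR(kernel=kernel, C=10.0, epsilon=0.01).fit(X_train, y_train)
    scores[kernel] = r2_score(y_test, model.predict(X_test))

best = max(scores, key=scores.get)
print(best, round(scores[best], 3))
```

With real spectra, the compensation steps (particles, NOM, nitrate) would be applied to X before fitting.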


2014 ◽  
Vol 2014 ◽  
pp. 1-7 ◽  
Author(s):  
Bin Li ◽  
Xuewen Rong ◽  
Yibin Li

Predicting (classifying) robot execution failures is a difficult learning problem, owing to partially corrupted or incomplete measurements and to prediction techniques ill-suited to problems with few learning samples. How to predict robot execution failures from little (incomplete) or erroneous data therefore deserves more attention in robotics. To improve prediction accuracy, this paper proposes a novel kernel extreme learning machine (KELM) algorithm that uses particle swarm optimization to tune the kernel-function parameters of the network, called the AKELM learning algorithm. Simulation results on the robot execution failures datasets show that, by optimizing the kernel parameters, the proposed algorithm achieves good generalization performance and outperforms KELM and the other approaches in terms of classification accuracy. Simulation results on other benchmark problems also demonstrate the efficiency and effectiveness of the proposed algorithm.
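A minimal sketch of the underlying KELM classifier follows: output weights are obtained in closed form from a regularized kernel matrix. The toy data, the RBF kernel choice, and the fixed gamma/C values are illustrative assumptions; the paper's contribution is tuning those kernel parameters with particle swarm optimization, which is omitted here for brevity.

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kelm_train(X, T, gamma, C):
    # Kernel ELM: output weights beta solve (I/C + Omega) beta = T,
    # where Omega is the kernel matrix over the training set.
    omega = rbf_kernel(X, X, gamma)
    return np.linalg.solve(np.eye(len(X)) / C + omega, T)

def kelm_predict(X_new, X_train, beta, gamma):
    return rbf_kernel(X_new, X_train, gamma) @ beta

# Toy two-class problem (one-hot targets), standing in for the
# robot execution-failure data, which is not reproduced here.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1.0, 0.4, size=(40, 2)),
               rng.normal(+1.0, 0.4, size=(40, 2))])
T = np.zeros((80, 2)); T[:40, 0] = 1.0; T[40:, 1] = 1.0

beta = kelm_train(X, T, gamma=1.0, C=100.0)
pred = kelm_predict(X, X, beta, gamma=1.0).argmax(axis=1)
acc = (pred == T.argmax(axis=1)).mean()
print(acc)
```

In the AKELM setting, `gamma` and `C` would be the particle positions optimized by PSO against validation accuracy.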


2013 ◽  
Vol 11 (2) ◽  
pp. 2273-2278
Author(s):  
Sangeetha Rajendran ◽  
B. Kalpana

Classification based on supervised learning theory is one of the most significant tasks frequently accomplished by so-called intelligent systems. Contrary to traditional classification techniques, which are used to validate or contradict a predefined hypothesis, kernel-based classifiers offer the possibility of framing new hypotheses using statistical learning theory (Sangeetha and Kalpana, 2010). The Support Vector Machine (SVM) is a standard kernel-based learning algorithm that improves its learning ability through experience. It is a highly accurate, robust, and optimal kernel-based classification technique well suited to many real-time applications. In this paper, kernel functions related to Hilbert spaces and Banach spaces are explained. Experiments are carried out on benchmark multiclass datasets taken from the UCI Machine Learning Repository, and performance is compared using various metrics: number of support vectors, support vector percentage, training time, and accuracy.
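The evaluation protocol described above can be sketched as follows: fit an SVM per kernel on a multiclass benchmark and collect the four metrics. Iris stands in here for the UCI multiclass datasets used in the paper; the kernel set and split are assumptions.

```python
import time
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Iris stands in for the UCI multiclass benchmarks.
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

rows = []
for kernel in ("linear", "poly", "rbf", "sigmoid"):
    clf = SVC(kernel=kernel)
    t0 = time.perf_counter()
    clf.fit(X_tr, y_tr)
    elapsed = time.perf_counter() - t0
    n_sv = int(clf.n_support_.sum())  # total support vectors across classes
    rows.append((kernel, n_sv, 100.0 * n_sv / len(X_tr), elapsed,
                 clf.score(X_te, y_te)))

for kernel, n_sv, sv_pct, elapsed, acc in rows:
    print(f"{kernel:8s} SV={n_sv:3d} ({sv_pct:5.1f}%) "
          f"time={elapsed:.4f}s acc={acc:.3f}")
```

A smaller support-vector count at comparable accuracy generally indicates a sparser, cheaper decision function.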


Sensors ◽  
2020 ◽  
Vol 20 (13) ◽  
pp. 3685
Author(s):  
Jiantao Yang ◽  
Yuehong Yin

Estimating the joint torques of the lower limbs in human gait is a highly challenging task, and one of great significance in developing high-level controllers for lower-limb exoskeletons. This paper presents a dependent Gaussian process (DGP)-based learning algorithm for joint-torque estimation with measurements from wearable smart shoes. The DGP was established to perform data fusion, and serves as the mathematical foundation for exploring the correlations between joint kinematics and joint torques that are embedded deeply in the data. As joint kinematics are used in the training phase rather than in the prediction process, the DGP model can realize accurate predictions in outdoor activities using only the smart shoe, which is low-cost, nonintrusive for human gait, and comfortable for wearers. The design methodology of dynamic, task-specific kernel functions is presented in accordance with prior knowledge of the measured signals. The designed composite kernel functions can model multiple features at different scales and cope with the temporal evolution of human gait. The statistical nature of the proposed DGP model and the composite kernel functions offer superior flexibility for time-varying gait-pattern learning, and enable accurate joint-torque estimation. Experiments were conducted with five subjects, and the results showed that it is possible to estimate joint torques at both trained and untrained speed levels. Comparisons were made between the proposed DGP and standard Gaussian process (GP) models: clear improvements were achieved, with all DGP r2 values higher than those of the GP.
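To illustrate the composite-kernel idea in its simplest form, the sketch below runs single-output GP regression with an RBF-plus-periodic kernel on a synthetic gait-like signal (a periodic component plus a slow trend). This is a deliberate simplification: the paper's model is a *dependent* (multi-output) GP, and all kernel hyperparameters and data here are assumptions.

```python
import numpy as np

def rbf(t1, t2, length):
    # Smooth-trend component of the composite kernel.
    return np.exp(-0.5 * (t1[:, None] - t2[None, :]) ** 2 / length ** 2)

def periodic(t1, t2, period, length):
    # Standard periodic kernel: captures the cyclic structure of gait.
    d = np.pi * np.abs(t1[:, None] - t2[None, :]) / period
    return np.exp(-2.0 * np.sin(d) ** 2 / length ** 2)

def k(t1, t2):
    # Composite kernel = smooth trend + gait-cycle periodicity.
    return rbf(t1, t2, length=1.0) + periodic(t1, t2, period=1.0, length=0.5)

rng = np.random.default_rng(2)
t_train = np.sort(rng.uniform(0, 4, 60))
y_train = np.sin(2 * np.pi * t_train) + 0.1 * t_train + rng.normal(0, 0.05, 60)

noise = 0.05 ** 2
K = k(t_train, t_train) + noise * np.eye(60)
alpha = np.linalg.solve(K, y_train)

t_test = np.linspace(0, 4, 200)
mean = k(t_test, t_train) @ alpha  # GP posterior mean

true = np.sin(2 * np.pi * t_test) + 0.1 * t_test
rmse = np.sqrt(np.mean((mean - true) ** 2))
print(round(rmse, 4))
```

The additive structure lets each kernel term absorb one feature scale; a DGP extends this by sharing such structure across correlated outputs (here, multiple joint torques).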


1968 ◽  
Vol 46 (10) ◽  
pp. S461-S465 ◽  
Author(s):  
J. A. M. Bleeker ◽  
J. J. Burger ◽  
A. J. M. Deerenberg ◽  
A. Scheepmaker ◽  
B. N. Swanenburg ◽  
...  

Two balloon flights with identical X-ray detectors were carried out in the summer of 1966, one from De Bilt, the Netherlands (geomagnetic latitude 53 °N), and the other from Taiyomura, Japan (geomagnetic latitude 25 °N). The detector consists of a NaI(Tl) crystal, 12.5 mm thick and 50 mm in diameter, surrounded by an effective collimator-shield and a plastic scintillator guard counter. An incorporated rotating disk enables the separation of "forward" X rays from the cosmic-ray-induced background. The results of the two flights are in very good agreement with each other. In view of the rather large difference in geomagnetic latitude between the flights, this agreement supports the celestial origin of the primary X rays observed. The energy spectrum between 20 and 180 keV can be expressed by a power law:[Formula: see text]


2013 ◽  
Vol 20 (3) ◽  
pp. 130 ◽  
Author(s):  
Celso Antonio Alves Kaestner

This work presents kernel functions that can be used in conjunction with the Support Vector Machine (SVM) learning algorithm to solve the automatic text classification task. First, the Vector Space Model for text processing is presented: according to this model, each text is represented as a vector in a high-dimensional space. Extensions and alternative models are then derived, and some preprocessing procedures are discussed. The SVM learning algorithm, widely employed for text classification, is outlined: its decision procedure is obtained as the solution of an optimization problem. The "kernel trick", which allows the algorithm to be applied to non-linearly separable cases, is presented, along with some kernel functions currently used in text applications. Finally, text classification experiments employing the SVM classifier are conducted, illustrating some text preprocessing techniques and the presented kernel functions.
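The Vector Space Model pipeline described above can be sketched in a few lines: documents become TF-IDF weighted vectors, and a linear-kernel SVM separates the classes. The tiny two-class corpus is a made-up illustration; real experiments would use a benchmark corpus.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import SVC

# Tiny illustrative corpus: 0 = sport, 1 = computing.
docs = [
    "the match ended with a late goal", "the striker scored twice",
    "the league title race tightened", "fans cheered the winning team",
    "the kernel schedules processes", "memory pages are swapped to disk",
    "the compiler optimizes loops", "threads share the process heap",
]
labels = [0, 0, 0, 0, 1, 1, 1, 1]

# Vector Space Model: each document becomes a TF-IDF weighted vector.
vec = TfidfVectorizer()
X = vec.fit_transform(docs)

clf = SVC(kernel="linear").fit(X, labels)
test = vec.transform(["the team scored a goal", "the scheduler runs threads"])
print(clf.predict(test))
```

In high-dimensional TF-IDF spaces, the linear kernel is often sufficient, which is why it is a common baseline for text classification.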


Author(s):  
Intisar Shadeed Al-Mejibli ◽  
Jwan K. Alwan ◽  
Dhafar Hamed Abd

Currently, the support vector machine (SVM) is regarded as one of the supervised machine learning algorithms that provide data analysis for classification and regression. The technique is applied in many fields, such as bioinformatics, face recognition, text and hypertext categorization, generalized predictive control, and many other areas. The performance of an SVM is affected by several parameters used in the training phase, and the parameter settings can have a profound impact on the resulting classifier. This paper investigates SVM performance based on the value of the gamma parameter for the kernels used. It studies the impact of the gamma value on SVM classification efficiency using different kernels on datasets with various characteristics. The SVM classifier was implemented in Python. The kernel functions investigated are polynomial, radial basis function (RBF), and sigmoid. The UC Irvine Machine Learning Repository is the source of all the datasets used. Overall, the results show an uneven effect of the three kernels on classification accuracy across the datasets. Changing the gamma value, taking the dataset into consideration, influences the polynomial and sigmoid kernels, while the performance of the RBF kernel is more stable: its accuracy changes only slightly with different gamma values.
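A gamma-sensitivity study of the kind described can be sketched as below: sweep gamma for each kernel and record the spread of cross-validated accuracy. The wine dataset, the gamma grid, and the spread-as-stability measure are illustrative assumptions, not the paper's exact protocol.

```python
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Wine stands in for the UCI datasets used in the paper.
X, y = load_wine(return_X_y=True)

results = {}
for kernel in ("poly", "rbf", "sigmoid"):
    accs = []
    for gamma in (0.001, 0.01, 0.1, 1.0):
        clf = make_pipeline(StandardScaler(), SVC(kernel=kernel, gamma=gamma))
        accs.append(cross_val_score(clf, X, y, cv=5).mean())
    # Spread of accuracy across gamma values: a rough stability measure.
    results[kernel] = max(accs) - min(accs)

for kernel, spread in sorted(results.items(), key=lambda kv: kv[1]):
    print(f"{kernel:8s} accuracy spread over gamma: {spread:.3f}")
```

A small spread means the kernel's accuracy is insensitive to gamma on that dataset, matching the stability claim made for the RBF kernel.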


Destructive earthquakes usually cause enormous casualties, so analyses are made to reduce their harmful and lasting impacts. Because casualty figures are a decisive measure of an earthquake's severity, the development of a rational casualty-prediction model has become an important research topic; owing to the complexity and limitations of existing prediction methods, a more accurate prediction model based on grey correlation theory and BP neural networks has been discussed. Earthquakes can be analyzed using various techniques, mainly predictive methods that marshal the calculated time and magnitude of a potential event. This has been the topic of many studies, and various methods have been tried using input variables such as temperature, seismic movements, and especially variable climatic conditions, as well as the relation between recorded seismo-acoustic data and an occurring abnormal seismic process (ASP). However, it is difficult to predict all parameters of an earthquake (location, time, and magnitude) from this information. The model described here differs from others in that, with the help of prediction commands, most of the patterns and domains are identified in order to explore the activity of serious earthquakes. We use pre-existing data collected around the planet, and retrieve records in which an earthquake reaches or exceeds a magnitude of eight on the Richter scale. The two main areas of work are data exploration and data mapping: counting occurrences of earthquakes in different magnitude ranges, assessing earthquake severity, and mapping, which is crucial for identifying highly affected areas based on magnitude and on the correlation between depth and magnitude.
Based on the above explorations, we make the following predictions: magnitude based on depth, magnitude based on latitude and longitude, and depth based on latitude and longitude. The primary algorithms used here are machine learning algorithms, i.e., linear regression and k-means clustering. First, we make all the predictions via linear regression, and then form clusters of earthquakes that belong to the same subdivision of magnitude or depth. Keywords: data exploration and data mapping.
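The regression-then-clustering workflow can be sketched as follows, using a synthetic catalogue in place of the real earthquake records (which are not reproduced here); the depth-magnitude trend is an invented illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.cluster import KMeans

# Synthetic catalogue standing in for the real records:
# latitude, longitude, depth (km), and magnitude with a weak depth trend.
rng = np.random.default_rng(3)
n = 300
lat = rng.uniform(-60, 60, n)
lon = rng.uniform(-180, 180, n)
depth = rng.uniform(0, 700, n)
mag = 4.5 + 0.002 * depth + rng.normal(0, 0.3, n)

# Prediction 1: magnitude based on depth, via linear regression.
reg = LinearRegression().fit(depth.reshape(-1, 1), mag)

# Clustering: group events by depth and magnitude subdivisions.
km = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = km.fit_predict(np.column_stack([depth, mag]))

print(round(reg.coef_[0], 4), sorted(set(labels)))
```

The other predictions (magnitude or depth from latitude and longitude) follow the same pattern with `np.column_stack([lat, lon])` as the feature matrix.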


Author(s):  
P Priyanga ◽  
N C Naveen

This article describes how healthcare organizations are growing rapidly and are potential beneficiaries of the data that is generated and gathered. From hospitals to clinics, data and analytics can be a very powerful tool for improving patient care and satisfaction efficiently. In developing countries, cardiovascular diseases have a huge impact on death rates, and this is expected to continue through the end of 2020 in spite of the best clinical practices. Current machine learning (ML) algorithms are adapted to estimate heart-disease risk in middle-aged patients. Hence, to predict heart diseases, a detailed analysis is made in this research work by taking into account the angiographic heart-disease status (i.e., ≥ 50% diameter narrowing). Deep Neural Network (DNN), Extreme Learning Machine (ELM), K-Nearest Neighbor (KNN), and Support Vector Machine (SVM) learning algorithms (the latter with linear and polynomial kernel functions) are considered in this work. The accuracy and results of these algorithms are analyzed by comparing their effectiveness.
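A comparison of this kind can be sketched as below for the KNN and SVM variants mentioned (DNN and ELM omitted for brevity). The breast-cancer dataset stands in for the angiographic heart-disease records, and the hyperparameters are assumptions.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Binary medical-diagnosis data standing in for the heart-disease records.
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "SVM (linear)": SVC(kernel="linear"),
    "SVM (poly)": SVC(kernel="poly", degree=3, coef0=1),
}
accuracy = {}
for name, model in models.items():
    # Scaling matters for both KNN distances and SVM kernels.
    clf = make_pipeline(StandardScaler(), model)
    accuracy[name] = clf.fit(X_tr, y_tr).score(X_te, y_te)

for name, acc in accuracy.items():
    print(f"{name:14s} accuracy = {acc:.3f}")
```

For a clinical comparison, accuracy alone is usually supplemented by sensitivity and specificity, since false negatives carry higher cost.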

