DIRECT AND INDIRECT CLASSIFICATION OF HIGH FREQUENCY LNA GAIN PERFORMANCE – A COMPARISON BETWEEN SVMS AND MLPS

2014 ◽  
pp. 24-31
Author(s):  
Peter C. Hung ◽  
Seán F. McLoone ◽  
Ronan Farrell

The task of determining low noise amplifier (LNA) high-frequency performance in functional testing is as challenging as designing the circuit itself, due to the difficulties associated with bringing high frequency signals off-chip. One possible strategy for circumventing these difficulties is to inferentially estimate the high frequency performance measures from measurements taken at lower, more accessible, frequencies. This paper investigates the effectiveness of this strategy for classifying the high frequency gain of the amplifier, a key LNA performance parameter. Indirect Multilayer Perceptron (MLP) and direct Support Vector Machine (SVM) classification strategies are considered. Extensive Monte-Carlo simulations show promising results with both methods, with the indirect MLP classifiers marginally outperforming the SVMs.
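For illustration, a minimal scikit-learn sketch of the two strategies compared in the abstract, using synthetic stand-ins for the low-frequency measurements and the high-frequency gain label; the actual Monte-Carlo circuit data, feature set, and pass/fail threshold are not given here and are assumed.

```python
# Hypothetical sketch of the indirect-vs-direct comparison described above.
# Feature vectors stand in for low-frequency measurements; the binary label
# stands in for the high-frequency gain class (real data comes from
# Monte-Carlo circuit simulations, not this synthetic surrogate).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 6))                      # surrogate low-frequency measurements
hf_gain = X @ rng.normal(size=6) + 0.1 * rng.normal(size=1000)  # surrogate HF gain
threshold = np.median(hf_gain)                      # assumed pass/fail gain specification
y = (hf_gain > threshold).astype(int)

X_tr, X_te, g_tr, g_te, y_tr, y_te = train_test_split(X, hf_gain, y, random_state=0)

# Indirect strategy: regress the high-frequency gain with an MLP, then threshold it.
mlp = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0).fit(X_tr, g_tr)
y_mlp = (mlp.predict(X_te) > threshold).astype(int)

# Direct strategy: classify pass/fail directly with an SVM.
svm = SVC(kernel="rbf").fit(X_tr, y_tr)
y_svm = svm.predict(X_te)

print("indirect MLP accuracy:", accuracy_score(y_te, y_mlp))
print("direct SVM accuracy:  ", accuracy_score(y_te, y_svm))
```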

2005 ◽  
Vol 40 (3) ◽  
pp. 726-735 ◽  
Author(s):  
Kwangseok Han ◽  
J. Gil ◽  
Seong-Sik Song ◽  
Jeonghu Han ◽  
Hyungcheol Shin ◽  
...  

2021 ◽  
Author(s):  
Leonie Lampe ◽  
Sebastian Niehaus ◽  
Hans-Jürgen Huppertz ◽  
Alberto Merola ◽  
Janis Reinelt ◽  
...  

Abstract
Importance: The entry of artificial intelligence into medicine is pending. Several methods have been used for prediction from structured neuroimaging data, yet they have not been compared in this context.
Objective: Multi-class prediction is key for building computational aid systems for differential diagnosis. We compared support vector machines, random forests, gradient boosting, and deep feed-forward neural networks for the classification of different neurodegenerative syndromes based on structural magnetic resonance imaging.
Design, Setting, and Participants: Atlas-based volumetry was performed on multi-centric T1-weighted MRI data from 940 subjects, i.e. 124 healthy controls and 816 patients with ten different neurodegenerative diseases, yielding a multi-diagnostic classification task with eleven classes.
Interventions: n.a.
Main Outcomes and Measures: Cohen's kappa, accuracy, and F1-score were used to assess model performance.
Results: Overall, the neural network produced both the best performance measures and the most robust results. The smaller classes, however, were better classified by either the ensemble learning methods or the support vector machine, although performance measures for small classes were comparatively low, as expected. Diseases with regionally specific and pronounced atrophy patterns were generally better classified than diseases with widespread and rather weak atrophy.
Conclusions and Relevance: Our study underlines the need for larger data sets and calls for careful consideration of which machine learning methods best suit the type of data and the classification task.
Trial Registration: n.a.
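A hedged scikit-learn sketch of this kind of model comparison, using a synthetic eleven-class dataset in place of the atlas-based volumetry features (which are not available here); the four model families and the three reported metrics follow the abstract, while the feature count and hyperparameters are assumptions.

```python
# Illustrative comparison of SVM, random forest, gradient boosting, and a
# feed-forward neural network on a synthetic 11-class problem standing in
# for the volumetric MRI features described above.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import cohen_kappa_score, accuracy_score, f1_score

# Synthetic stand-in: 940 samples, 11 classes, 50 surrogate volumetric features.
X, y = make_classification(n_samples=940, n_features=50, n_informative=30,
                           n_classes=11, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "SVM": SVC(kernel="rbf"),
    "Random forest": RandomForestClassifier(n_estimators=300, random_state=0),
    "Gradient boosting": GradientBoostingClassifier(random_state=0),
    "Feed-forward NN": MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=1000,
                                     random_state=0),
}

for name, model in models.items():
    y_pred = model.fit(X_tr, y_tr).predict(X_te)
    print(f"{name:18s} kappa={cohen_kappa_score(y_te, y_pred):.2f} "
          f"acc={accuracy_score(y_te, y_pred):.2f} "
          f"F1={f1_score(y_te, y_pred, average='macro'):.2f}")
```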


Sensors ◽  
2021 ◽  
Vol 21 (24) ◽  
pp. 8476
Author(s):  
Yuxuan Tang ◽  
Yulang Feng ◽  
He Hu ◽  
Cheng Fang ◽  
Hao Deng ◽  
...  

This paper presents a wideband low-noise amplifier (LNA) front-end with noise and distortion cancellation for high-frequency ultrasound transducers. The LNA employs a resistive shunt-feedback structure with a feedforward noise-canceling technique to accomplish both wideband impedance matching and low noise performance. A complementary CMOS topology was also developed to cancel out the second-order harmonic distortion and enhance the amplifier linearity. A high-frequency ultrasound (HFUS) and photoacoustic (PA) imaging front-end, including the proposed LNA and a variable gain amplifier (VGA), was designed and fabricated in a 180 nm CMOS process. At 80 MHz, the front-end achieves an input-referred noise density of 1.36 nV/√Hz, an input return loss (S11) of better than −16 dB, a voltage gain of 37 dB, and a total harmonic distortion (THD) of −55 dBc while dissipating 37 mW, leading to a noise efficiency factor (NEF) of 2.66.
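For context, a short back-of-the-envelope sketch relating the quoted figures of merit; the 100 MHz noise-integration bandwidth is purely an assumption for illustration, since the abstract quotes only the spot noise density at 80 MHz.

```python
# Quick interpretation of the reported LNA front-end figures of merit.
import math

en = 1.36e-9      # input-referred noise density, V/sqrt(Hz)
s11_db = -16.0    # input return loss, dB
gain_db = 37.0    # voltage gain, dB

# S11 of -16 dB means only about 2.5% of the incident power is reflected.
reflected_fraction = 10 ** (s11_db / 10)

# 37 dB of voltage gain is roughly a factor of 71 in amplitude.
gain_lin = 10 ** (gain_db / 20)

# Integrated input-referred noise over an ASSUMED 100 MHz bandwidth.
bw = 100e6
v_noise_rms = en * math.sqrt(bw)

print(f"reflected power fraction: {reflected_fraction:.3f}")
print(f"linear voltage gain:      {gain_lin:.1f}")
print(f"integrated noise (rms):   {v_noise_rms * 1e6:.1f} uV over {bw / 1e6:.0f} MHz")
```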

