Margin classification
Recently Published Documents

TOTAL DOCUMENTS: 45 (FIVE YEARS: 2)
H-INDEX: 10 (FIVE YEARS: 0)



2021
Author(s): Renan Motta Goulart, Carlos Cristiano Hasenclever Borges, Raul Fonseca Neto


IEEE Access ◽ 2020 ◽ Vol 8 ◽ pp. 96649-96660
Author(s): Chenao Weng, Hai Wang, Ke Li, M. N. S. Swamy


2019 ◽ Vol 2019 ◽ pp. 1-17
Author(s): Yanfeng Peng, Junhang Chen, Yanfei Liu, Junsheng Cheng, Yu Yang, ...

An adaptive sparsest narrow-band decomposition (ASNBD) method is proposed based on matching pursuit (MP) and empirical mode decomposition (EMD). ASNBD extracts local narrow-band (LNB) components through an optimization process. First, an optimal filter is designed: its parameter vector is obtained by optimization, and the objective function is a regularized singular local linear operator, so each extracted component is constrained to be an LNB signal. A component is then generated by filtering the original signal with the optimized filter. Compared with MP, ASNBD offers both clearer physical meaning and greater adaptivity. Because no interpolation functions are required, drawbacks of EMD such as end effects and mode mixing are reduced. For fault diagnosis of roller bearings, raw signals are first decomposed by ASNBD, appropriate features of the decomposed components are then selected using the distance evaluation technique (DET), and finally the different faults are recognized with maximum margin classification based on flexible convex hulls (MMC-FCH). Comparisons between EMD and ASNBD show that the proposed method performs better in noise resistance, accuracy, orthogonality, and extraction of the fault features of roller bearings.
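
The workflow above (decomposition, feature extraction, DET selection, MMC-FCH classification) can be outlined in code. The sketch below is only a minimal illustration of that pipeline shape under simplifying assumptions: the band-splitting decomposition and the RMS/kurtosis features are plain stand-ins chosen for this example, not the paper's ASNBD optimization, DET ranking, or MMC-FCH classifier.

```python
import numpy as np

def narrowband_decompose(signal, n_bands=4):
    """Stand-in decomposition: split the spectrum into equal-width bands and
    reconstruct one component per band. The real ASNBD instead optimizes a
    filter whose objective is a regularized singular local linear operator."""
    spectrum = np.fft.rfft(signal)
    edges = np.linspace(0, len(spectrum), n_bands + 1, dtype=int)
    components = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        masked = np.zeros_like(spectrum)
        masked[lo:hi] = spectrum[lo:hi]
        components.append(np.fft.irfft(masked, n=len(signal)))
    return components

def extract_features(components):
    """Example per-component statistics (RMS and kurtosis); a full system would
    pass such features to DET for selection and then to an MMC-FCH classifier."""
    feats = []
    for c in components:
        rms = np.sqrt(np.mean(c ** 2))
        kurtosis = np.mean((c - c.mean()) ** 4) / (c.std() ** 4 + 1e-12)
        feats.extend([rms, kurtosis])
    return np.array(feats)

# Toy "vibration" record: two tones plus noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 2048, endpoint=False)
record = (np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 400 * t)
          + 0.1 * rng.standard_normal(t.size))
features = extract_features(narrowband_decompose(record))
print(features.shape)  # (8,) -- one feature vector per record
```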



2018 ◽ Vol 36 (7) ◽ pp. 704-709
Author(s): Kenneth R. Gundle, Lisa Kafchinski, Sanjay Gupta, Anthony M. Griffin, Brendan C. Dickson, ...

Purpose To compare the ability of margin classification systems to determine the risk of local recurrence (LR) after soft tissue sarcoma (STS) resection. Methods Two thousand two hundred seventeen patients with nonmetastatic extremity and truncal STS treated with surgical resection and multidisciplinary consideration of perioperative radiotherapy were retrospectively reviewed. Margins were coded by the residual tumor (R) classification (in which microscopic tumor at the inked margin defines R1), the R+1mm classification (in which microscopic tumor within 1 mm of ink defines R1), and the Toronto Margin Context Classification (TMCC; in which positive margins are separated into planned close but positive at critical structures, positive after whoops re-excision, and inadvertent positive margins). Multivariate competing-risk regression models were created. Results By the R classification, LR rates at 10-year follow-up were 8%, 21%, and 44% for R0, R1, and R2, respectively. The R+1mm classification increased the number of R1 margins (726 v 278; P < .001) but lowered the LR rate associated with R1 margins without changing that of R0: the 10-year LR rate was 8% (range, 7% to 10%) for R0 and 12% (10% to 15%) for R1. The TMCC also showed distinct LR rates among its tiers (P < .001). LR rates for positive margins on critical structures were not different from R0 at 10 years (11% v 8%; P = .18), whereas inadvertent positive margins had high LR (5-year, 28% [95% CI, 19% to 37%]; 10-year, 35% [95% CI, 25% to 46%]; P < .001). Conclusion The R classification identified three distinct risk levels for LR in STS. The R+1mm classification reduced LR differences between R1 and R0, suggesting that a negative but < 1-mm margin may be adequate with multidisciplinary treatment. The TMCC provides additional stratification of positive margins that may aid in surgical planning and patient education.
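
The difference between the two microscopic-margin rules compared above can be made concrete in a few lines. This is only an illustration of the definitions given in the abstract (tumor at the inked margin versus within 1 mm of ink), assuming each specimen is summarized by its closest margin distance in millimetres and a flag for gross residual disease; it is not code or data from the study.

```python
def r_classification(margin_mm: float, gross_residual: bool = False) -> str:
    """R classification: R1 only when microscopic tumor reaches the inked margin."""
    if gross_residual:
        return "R2"
    return "R1" if margin_mm <= 0 else "R0"

def r_plus_1mm(margin_mm: float, gross_residual: bool = False) -> str:
    """R+1mm classification: R1 when microscopic tumor lies within 1 mm of ink."""
    if gross_residual:
        return "R2"
    return "R1" if margin_mm < 1.0 else "R0"

# A 0.5 mm clear margin is R0 under the R classification but R1 under R+1mm,
# which is how the R+1mm rule produced many more R1 cases (726 v 278).
print(r_classification(0.5), r_plus_1mm(0.5))  # R0 R1
```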



2017 ◽ Vol 29 (11) ◽ pp. 3078-3093
Author(s): Liangzhi Chen, Haizhang Zhang

Support vector machines (SVMs), which maximize the margin from patterns to the separating hyperplane subject to correct classification, have achieved remarkable success in machine learning. Margin error bounds based on Hilbert spaces have been introduced in the literature to justify the strategy of maximizing the margin in SVMs. Recently, there has been much interest in developing Banach space methods for machine learning, and large margin classification in Banach spaces is a focus of such attempts. In this letter, we establish a margin error bound for the SVM on reproducing kernel Banach spaces, thus supplying statistical justification for large-margin classification in Banach spaces.
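
For reference, the margin maximization that the letter generalizes to Banach spaces is, in the standard Hilbert-space setting, the textbook hard-margin SVM problem below, stated for training pairs $(x_i, y_i)$ with $y_i \in \{-1, +1\}$ and feature map $\phi$; this is background, not the bound established in the paper.

```latex
% Textbook hard-margin SVM: maximize the margin subject to correct classification.
\begin{aligned}
\min_{w,\,b}\quad & \tfrac{1}{2}\,\lVert w \rVert^{2} \\
\text{s.t.}\quad  & y_i\bigl(\langle w, \phi(x_i)\rangle + b\bigr) \ge 1,
                    \qquad i = 1,\dots,n,
\end{aligned}
% so that the geometric margin of the separating hyperplane is 1 / \lVert w \rVert.
```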



2017 ◽ Vol 7 (1)
Author(s): Ehsan Adeli, Guorong Wu, Behrouz Saghafi, Le An, Feng Shi, ...

