minimum margin
Recently Published Documents

TOTAL DOCUMENTS: 23 (FIVE YEARS: 10)
H-INDEX: 5 (FIVE YEARS: 1)

Author(s): Nan Cao, Teng Zhang, Hai Jin

Partial multi-label learning deals with the setting in which the ground-truth labels are not directly available but hidden in a candidate label set. Because irrelevant labels are also present, vanilla multi-label learning methods are easily misled and fail to generalize well on unseen data, so enabling them to discard the noisy labels is the core problem of partial multi-label learning. In this paper, we propose the Partial Multi-Label Optimal margin Distribution Machine (PML-ODM), which distinguishes the noisy labels by explicitly optimizing the distribution of the ranking margin, and exhibits better generalization performance than minimum-margin-based counterparts. In addition, we propose a novel feature prototype representation to further enhance the disambiguation ability, and non-linear kernels can be applied to improve generalization on linearly inseparable data. Extensive experiments on real-world data sets validate the superiority of the proposed method.
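The "margin distribution" idea behind PML-ODM can be illustrated with a small sketch (a generic illustration, not the authors' implementation): for a linear multi-label scorer, the ranking margin of an example is the score gap between a candidate label and a non-candidate label, and a distribution-aware objective rewards a large margin mean and a small margin variance rather than only a large minimum margin.

```python
# Illustrative sketch of ranking margins for a linear multi-label scorer
# f_k(x) = w_k . x  (weights, data and the trade-off lam are hypothetical).

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def ranking_margins(x, W, candidates, non_candidates):
    """All pairwise score gaps f_c(x) - f_n(x) for one example."""
    scores = [dot(w, x) for w in W]
    return [scores[c] - scores[n] for c in candidates for n in non_candidates]

def margin_stats(margins):
    mean = sum(margins) / len(margins)
    var = sum((m - mean) ** 2 for m in margins) / len(margins)
    return mean, var

# toy example: 3 labels, 2-dimensional features
W = [[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]]   # one weight vector per label
x = [2.0, 1.0]
ms = ranking_margins(x, W, candidates=[0], non_candidates=[1, 2])
mean, var = margin_stats(ms)

# a distribution-aware objective favours large mean AND small variance,
# e.g. minimise  -mean + lam * var  (lam is a trade-off hyperparameter)
lam = 0.1
objective = -mean + lam * var
```

Minimum-margin methods would look only at `min(ms)`; the distribution view uses all of `ms`.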


2021
Author(s): Gillian Ecclestone

In radiation therapy treatment planning, margins are added to the tumour volume to ensure that the correct radiation dose is delivered to the tumour in the presence of geometrical uncertainties. The van Herk margin formula (VHMF) was developed to calculate the minimum margin on the target that provides full coverage by 95% of the prescribed dose for 90% of the population. However, this formula is based on an idealized dose-profile model that is not realistic for lung radiotherapy. The purpose of this study was to investigate the validity of the VHMF for lung radiotherapy using accurate dose calculation algorithms and respiratory motion modeling. Ultimately, the VHMF ensured sufficient target coverage, with the exception of small lesions in soft tissue; however, the derived PTV margins were larger than necessary. A novel planning approach using the VHMF was tested, indicating the need to account for tumour motion trajectory and plan conformity.
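For readers unfamiliar with the formula under test, the widely quoted simplified form of the van Herk margin recipe can be sketched as follows (a generic illustration with example numbers; the study evaluates this recipe against realistic lung dose calculations):

```python
# Simplified van Herk margin recipe: M = 2.5 * Sigma + 0.7 * sigma,
# where Sigma is the combined systematic (preparation) SD and sigma the
# combined random (execution) SD, both in mm. Under idealised dose-profile
# assumptions this covers the CTV with the 95% isodose for ~90% of patients.

def van_herk_margin(sigma_systematic_mm, sigma_random_mm):
    return 2.5 * sigma_systematic_mm + 0.7 * sigma_random_mm

# example: 3 mm systematic and 4 mm random uncertainty
m = van_herk_margin(3.0, 4.0)   # 2.5*3 + 0.7*4 = 10.3 mm
```

The abstract's point is precisely that the ideal dose-profile assumptions behind these coefficients break down in lung tissue.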


2020
pp. 110-115
Author(s): В.К. Румб

Strength calculations in the design and creation of modern machinery are among the main criteria for the quality and competitiveness of engineering products. An integral part of a fatigue (endurance) strength calculation is the assessment of the minimum allowable safety margin. The existing assessment of this margin, based on data on part failures, is largely subjective. A method is proposed for determining the minimum allowable safety factor. Its fundamental difference from existing approaches is that the safety factor is calculated taking into account the scatter of the part's strength characteristics and of the stresses acting in it, for a given probability that no fatigue failure occurs. This removes many of the conventions involved in predicting the strength reliability of parts and creates the prerequisites for design solutions that are optimal in strength and mass.
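The idea of deriving a safety factor from the scatter of strength and stress at a given reliability can be illustrated with the classical stress-strength interference model (a sketch under normality assumptions, not the author's exact derivation; the coefficients of variation and target reliability below are example values):

```python
# Stress-strength interference sketch: strength S and stress L are assumed
# normal with coefficients of variation v_S and v_L. The safety factor is
# n = mu_S / mu_L, and the reliability index is
#     beta = (n - 1) / sqrt(n^2 * v_S^2 + v_L^2) = Phi^{-1}(R),
# which solved for n gives the minimum allowable safety factor.
from math import sqrt
from statistics import NormalDist

def min_safety_factor(v_strength, v_stress, reliability):
    z = NormalDist().inv_cdf(reliability)
    a = 1.0 - z * z * v_strength ** 2          # must stay positive
    return (1.0 + z * sqrt(v_strength ** 2 + v_stress ** 2
                           - z * z * v_strength ** 2 * v_stress ** 2)) / a

# example: 8% scatter in strength, 12% in stress, 99.9% target reliability
n_min = min_safety_factor(0.08, 0.12, 0.999)
```

The larger the scatter or the required reliability, the larger the minimum safety factor, which is the qualitative point of the abstract.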


2020
Vol 97
pp. 107012
Author(s): Xin Wei, Hui Wang, Bryan Scotney, Huan Wan

2019
Vol 19 (03)
pp. 1950016
Author(s): N. Senniangiri, J. Bensam Raj, J. Sunil

In practice, lubricants are used to minimize the friction and wear of frictional surfaces. The disposal of mineral-based lubricating oil poses environmental issues and has forced the development of biodegradable lubricating agents. The simultaneous mono-dispersion of metallic and metal-oxide nanomaterials into lubricating agents may concurrently yield superior thermo-physical and rheological characteristics. This paper presents an experimental and theoretical investigation of the dynamic viscosity enhancement of flat-platelet-textured graphene/NiO-coconut oil hybrid nanofluids. The results reveal that the dynamic viscosity enhancement of the hybrid nanofluids increases with nanomaterial concentration and decreases with temperature. At low hybrid nanomaterial concentrations, the collision probability and dynamic contact between the mono-dispersed nanomaterials are lower, since there are enough interfacial gaps to overcome the superficial surface energy. High nanomaterial concentrations promote the formation of lamellar-composite agglomerated particles and enhance the dynamic viscosity of the base fluid. Further, a theoretical correlation based on an artificial neural network (ANN) is recommended to estimate the dynamic viscosity of the hybrid nanofluid with a minimum margin of deviation.
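As a rough illustration of how a concentration- and temperature-dependent viscosity correlation can be fitted to measurements (synthetic numbers and a hypothetical functional form, not the paper's data or its ANN):

```python
# Least-squares fit of a two-parameter viscosity correlation
#     mu_nf / mu_bf = 1 + a * phi + b * phi / T
# via the 2x2 normal equations (pure stdlib, no numpy).

def fit_correlation(phis, temps, ratios):
    # design columns: x1 = phi, x2 = phi / T ; target y = ratio - 1
    rows = [(p, p / t, r - 1.0) for p, t, r in zip(phis, temps, ratios)]
    s11 = sum(x1 * x1 for x1, _, _ in rows)
    s12 = sum(x1 * x2 for x1, x2, _ in rows)
    s22 = sum(x2 * x2 for _, x2, _ in rows)
    b1 = sum(x1 * y for x1, _, y in rows)
    b2 = sum(x2 * y for _, x2, y in rows)
    det = s11 * s22 - s12 * s12
    return ((b1 * s22 - b2 * s12) / det,    # a
            (b2 * s11 - b1 * s12) / det)    # b

phis = [0.001, 0.004, 0.002, 0.003]         # volume fractions (synthetic)
temps = [300.0, 300.0, 400.0, 400.0]        # kelvin (synthetic)
ratios = [1.0 + 20.0 * p + 500.0 * p / t    # generated with a=20, b=500
          for p, t in zip(phis, temps)]
a, b = fit_correlation(phis, temps, ratios)  # recovers a ≈ 20, b ≈ 500
```

The paper instead trains an ANN for this mapping; a closed-form correlation like the above is the simplest baseline for the same task.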


2019
Vol 19 (02)
pp. 1950011
Author(s): M. Muthuraj, J. Bensam Raj, J. Sunil

In the past decades, considerable efforts have been made to develop energy-efficient and eco-friendly convective heat transfer and lubricating agents, driven by growing energy demands, precision manufacturing, miniaturization and sustainability concerns. In this study, graphene-sunflower oil nanofluids of different concentrations were prepared, and their thermal conductivity was experimentally investigated and compared with correlations from similar studies in the literature. The morphology of the graphene nanoplatelets was characterized by X-ray diffraction (XRD) and scanning electron microscopy (SEM). The results show that the thermal conductivity of the nanofluid was enhanced with temperature and nanoparticle weight fraction. Nanoconvection at high temperatures, the reduced meandering mobility of graphene nanoplatelets, and the high kinematic viscosity of the graphene nanofluids at low temperatures were identified as the key factors for the thermal conductivity enhancement. Further, a concentration- and temperature-dependent theoretical correlation for estimating the thermal conductivity of graphene nanofluids was proposed using the backpropagation algorithm of an artificial neural network (ANN), with a minimum margin of deviation.
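A minimal backpropagation sketch in the spirit of the correlation described above (one hidden tanh layer mapping temperature and weight fraction to a conductivity ratio; the network size, learning rate and synthetic pre-scaled data are illustrative assumptions, not the paper's):

```python
# Tiny one-hidden-layer network trained by backpropagation (pure stdlib).
import math
import random

random.seed(0)

HIDDEN = 4
w1 = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(HIDDEN)]
b1 = [0.0] * HIDDEN
w2 = [random.uniform(-0.5, 0.5) for _ in range(HIDDEN)]
b2 = 0.0

def forward(x):
    h = [math.tanh(sum(w * xi for w, xi in zip(ws, x)) + b)
         for ws, b in zip(w1, b1)]
    return h, sum(w * hi for w, hi in zip(w2, h)) + b2

def train_step(x, target, lr=0.05):
    global b2
    h, y = forward(x)
    err = y - target                       # dL/dy for L = 0.5 * (y - t)^2
    for j in range(HIDDEN):                # backprop through both layers
        grad_h = err * w2[j] * (1.0 - h[j] ** 2)   # uses pre-update w2[j]
        w2[j] -= lr * err * h[j]
        for i in range(2):
            w1[j][i] -= lr * grad_h * x[i]
        b1[j] -= lr * grad_h
    b2 -= lr * err
    return 0.5 * err ** 2

# synthetic "measurements": ratio grows with temperature and weight fraction
# (inputs pre-scaled to [0, 1], as is usual before ANN training)
data = [((t, p), 1.0 + 0.2 * t + 0.5 * p)
        for t in (0.0, 0.5, 1.0) for p in (0.0, 0.25, 0.5)]

losses = []
for epoch in range(500):
    losses.append(sum(train_step(x, y) for x, y in data))
# the fitted network now acts as the temperature/concentration correlation
```

The training loss falls by orders of magnitude on this near-linear toy target; the paper's network is fitted to measured conductivities instead.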


Author(s): Soumya K. Manna, Venketesh N. Dubey

Abstract A Kinect-sensor-based basketball game is developed for delivering post-stroke exercises in conjunction with a newly developed elbow exoskeleton. A few engaging features, such as audio-visual feedback and scoring, have been added to the game platform to enhance patients' engagement during exercises. After a game session, a performance score is calculated from the patient's reachable points and reaching time as an indicator of their current condition. During exercises, joint parameters are measured using the motion-capture capability of the Kinect sensor. The measurement accuracy of the Kinect sensor was validated in two comparative studies in which two healthy subjects moved the elbow joint in front of the Kinect sensor while wearing the developed elbow exoskeleton. In the first study, the joint information collected from the Kinect sensor was compared with the exoskeleton-based sensor. In the second study, the upper arm and forearm lengths measured by Kinect were compared with standard anthropometric data. The measurement errors between Kinect and the exoskeleton turned out to be within an acceptable range: for joint angle, 1% for subject 1 and 0.44% for subject 2; for joint torque, 5.55% and 3.58% for subjects 1 and 2, respectively. The average errors of the Kinect measurements relative to the anthropometric data of the two subjects are 16.52% for upper arm length and 9.87% for forearm length. This shows that the Kinect sensor can measure joint movement with a minimal margin of error.
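The percent-error comparison described above reduces to a simple calculation (the reference and measured values below are illustrative, not the study's recordings):

```python
# Percent error of a measurement against a reference value.

def percent_error(measured, reference):
    return abs(measured - reference) / abs(reference) * 100.0

# e.g. a 90-degree reference elbow flexion read as 89.1 degrees by Kinect
err = percent_error(89.1, 90.0)   # = 1.0 %
```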


Author(s): Ke Ma, Qianqian Xu, Zhiyong Yang, Xiaochun Cao

In the absence of prior knowledge, ordinal embedding methods obtain new representations for items in a low-dimensional Euclidean space via a set of quadruple-wise comparisons. These ordinal comparisons often come from human annotators, and classical approaches succeed only when sufficiently many are available. However, collecting a large number of labeled comparisons is known to be hard, and most existing work pays little attention to generalization ability with insufficient samples. Meanwhile, recent progress in large margin theory shows that, rather than just maximizing the minimum margin, the margin mean and variance, which characterize the margin distribution, are more crucial to overall generalization performance. To address the issue of insufficient training samples, we propose a margin distribution learning paradigm for ordinal embedding, entitled Distributional Margin based Ordinal Embedding (DMOE). Specifically, we first define the margin for the ordinal embedding problem. Secondly, we formulate a concise objective function which avoids maximizing the margin mean and minimizing the margin variance directly but exhibits a similar effect. Moreover, an Augmented Lagrange Multiplier based algorithm is customized to seek the optimal solution of DMOE effectively. Experimental studies on both simulated and real-world datasets show the effectiveness of the proposed algorithm.
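The margin notion for quadruple-wise comparisons can be sketched as follows (a simplified illustration of the general idea; the paper's exact margin definition and optimization differ):

```python
# For a quadruple (i, j, k, l) asserting "i is closer to j than k is to l",
# a natural margin is the squared-distance gap, and a distribution-aware
# objective trades margin mean against margin variance.

def sq_dist(u, v):
    return sum((a - b) ** 2 for a, b in zip(u, v))

def quadruple_margins(X, quads):
    return [sq_dist(X[k], X[l]) - sq_dist(X[i], X[j]) for i, j, k, l in quads]

def distribution_objective(margins, lam=0.1):
    mean = sum(margins) / len(margins)
    var = sum((m - mean) ** 2 for m in margins) / len(margins)
    return -mean + lam * var    # minimise: favour large mean, small variance

# toy 1-D embedding of four items and two annotator judgements
X = [[0.0], [1.0], [3.0], [6.0]]
quads = [(0, 1, 2, 3), (1, 2, 0, 3)]   # e.g. (0,1,2,3): d(0,1) < d(2,3)
ms = quadruple_margins(X, quads)       # [8.0, 32.0]
obj = distribution_objective(ms)
```

An embedding method would adjust `X` to minimize such an objective; a minimum-margin method would instead look only at `min(ms)`.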


2019
Vol ahead-of-print (ahead-of-print)
Author(s): György Walter

Purpose The purpose of this paper is to assess whether project finance loans were priced appropriately for their risk before the 2008-2009 crisis, and what lessons can be learned under different market circumstances. Design/methodology/approach A literature review presents the structure of project financing, how banks are motivated to apply risk-adjusted price calculations for loans to create shareholder value, and how risk measurement differs for project loans. The authors adapt a general model for risk-adjusted pricing to project loans. Based on empirical parameters, assuming different margins and leverages, the authors estimate the implied maximum probability of default at which project loans could still produce value added for lenders, and compare these maximum probabilities of default with reference points. Findings The authors conclude that by 2006-2007 several projects were very unlikely to produce any value added for shareholders and did not reach the minimum margin. The market and regulatory circumstances of 2016-2017 have significantly increased required margin levels and must shift lenders to a more conservative pricing and leverage policy. Research limitations/implications Though the presented model is general, the simulation focuses on the European banking market. Practical implications Under strong market competition, banks tend to underestimate risk, underprice loans and loosen risk parameters. The crisis pushed banks back to a more conservative approach; however, the danger of returning to a loose project-loan policy is real. The simulation shows how required prices are influenced by different market circumstances. Originality/value The paper adapts the risk-adjusted pricing methodology of standard loans to a new segment, project financing, and gives insight into the risk-pricing characteristics of project loans. Several valuable conclusions are drawn on how the market environment and project phases affect risk-adjusted pricing and the ability to produce value added for shareholders.
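The value-added condition behind the implied maximum probability of default can be sketched in a simplified single-period form (parameter names and example numbers are illustrative assumptions, not the authors' calibration):

```python
# A loan adds value when its margin covers expected loss plus the hurdle
# return on allocated capital:
#     margin >= PD * LGD + hurdle * capital_ratio
# Solving for PD gives the implied maximum probability of default at which
# the loan still creates value for the lender.

def implied_max_pd(margin, lgd, capital_ratio, hurdle):
    return (margin - hurdle * capital_ratio) / lgd

# example: 180 bp margin, 45% LGD, 8% capital allocated at a 12% hurdle
pd_max = implied_max_pd(0.018, 0.45, 0.08, 0.12)   # ≈ 1.87% per annum
```

Comparing such implied maxima with historical project-default rates is, in essence, the paper's test of whether pre-crisis margins were adequate.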

