Mathematical Tools for the Quantitative Definition of a Design Space

Author(s):  
Amanda Rogers ◽  
Marianthi G. Ierapetritou


Energies ◽
2020 ◽  
Vol 13 (13) ◽  
pp. 3366
Author(s):  
Daniel Suchet ◽  
Adrien Jeantet ◽  
Thomas Elghozi ◽  
Zacharie Jehl

The lack of a systematic definition of intermittency in the power sector blurs the use of this term in the public debate: the same power source can be described as stable or intermittent depending on the standpoint of the authors. This work proposes a quantitative definition of intermittency adapted to the power sector, linked to the nature of the source rather than to the current state of the energy mix or the predictability of production. A quantitative indicator is devised, discussed, and graphically depicted. A case study based on 2018 production data in France is presented and then developed further to evaluate the impact of two methods often considered to reduce intermittency: aggregation and complementarity between wind and solar production.
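The abstract does not give the formula of the indicator it devises. As a purely illustrative, hypothetical stand-in, the sketch below computes statistics of capacity-normalized hourly power ramps from a production time series, one simple way to characterize source-intrinsic variability independently of the rest of the mix; the function name and toy data are invented.

```python
import numpy as np

def ramp_statistics(production_mw, capacity_mw):
    """Hypothetical intermittency proxy: distribution of hourly power ramps,
    normalized by installed capacity so the result reflects the source itself
    rather than fleet size. NOT the indicator devised in the paper."""
    p = np.asarray(production_mw, dtype=float) / capacity_mw  # capacity-factor series
    ramps = np.diff(p)                                        # hour-to-hour changes
    return {
        "mean_abs_ramp": float(np.mean(np.abs(ramps))),
        "p95_abs_ramp": float(np.percentile(np.abs(ramps), 95)),
        "max_abs_ramp": float(np.max(np.abs(ramps))),
    }

# Toy hourly series (one day) for a 10 MW plant, purely for illustration
hourly = [0, 0, 0, 1, 3, 6, 8, 9, 9, 8, 7, 6, 5, 6, 7, 8, 7, 5, 3, 1, 0, 0, 0, 0]
print(ramp_statistics(hourly, capacity_mw=10.0))
```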


2021 ◽  
Author(s):  
William F. Quintero-Restrepo ◽  
Brian K. Smith ◽  
Junfeng Ma

Abstract The efficient creation of 3D CAD platforms can be achieved by optimizing their design process. The research presented in this article showcases a method for achieving such efficiency improvement, based on the DMADV Six Sigma approach. In the Define step, the scope and design space are established. In the Measure step, the platforms to be improved are initially evaluated with the help of a metrics framework for 3D CAD platforms. The Analyze step identifies and optimizes the systems model of the process, based on the architecture and the multiple objectives required for the improvement; the optimization method, based on evolutionary algorithms, identifies the best improvement alternatives for the next step. In the Design step, the improvement alternatives are planned and executed. In the final Verify step, the improved process is evaluated against its previous status, again with the help of the metrics framework for 3D CAD platforms. The method is explained with an example case of a 3D CAD platform for creating metallic boxes for electric machinery.
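The abstract does not specify which evolutionary algorithm is used. The following minimal sketch illustrates the kind of search involved: two hypothetical process variables evolved against two competing objectives, scalarized with a weighted sum (a simplification of true Pareto-based multi-objective search); all names and numbers are invented.

```python
import random

# Hypothetical process variables (not from the paper): hours spent on two design
# tasks, each bounded between 1 and 40. Two competing objectives: total effort
# and an illustrative "error" score that drops as more time is invested.
BOUNDS = [(1.0, 40.0), (1.0, 40.0)]

def objectives(x):
    effort = x[0] + x[1]
    error = 100.0 / (1.0 + x[0]) + 80.0 / (1.0 + x[1])
    return effort, error

def scalarize(x, w=0.5):
    # Weighted sum of the two objectives; a stand-in for Pareto ranking.
    effort, error = objectives(x)
    return w * effort + (1 - w) * error

def evolve(pop_size=20, generations=100, mutation=2.0):
    pop = [[random.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=scalarize)                 # keep the fitter half as parents
        parents = pop[: pop_size // 2]
        children = []
        for p in parents:                       # one mutated child per parent
            child = [min(max(v + random.gauss(0, mutation), lo), hi)
                     for v, (lo, hi) in zip(p, BOUNDS)]
            children.append(child)
        pop = parents + children
    return min(pop, key=scalarize)

best = evolve()
print("best allocation:", best, "objectives:", objectives(best))
```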


2021 ◽  
Vol 9 ◽  
Author(s):  
Ted Sichelman

Many scholars have employed the term “entropy” in the context of law and legal systems to roughly refer to the amount of “uncertainty” present in a given law, doctrine, or legal system. Just a few of these scholars have attempted to formulate a quantitative definition of legal entropy, and none have provided a precise formula usable across a variety of legal contexts. Here, relying upon Claude Shannon's definition of entropy in the context of information theory, I provide a quantitative formalization of entropy in delineating, interpreting, and applying the law. In addition to offering a precise quantification of uncertainty and the information content of the law, the approach offered here provides other benefits. For example, it offers a more comprehensive account of the uses and limits of “modularity” in the law—namely, using the terminology of Henry Smith, the use of legal “boundaries” (be they spatial or intangible) that “economize on information costs” by “hiding” classes of information “behind” those boundaries. In general, much of the “work” performed by the legal system is to reduce legal entropy by delineating, interpreting, and applying the law, a process that can in principle be quantified.
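For readers unfamiliar with the Shannon definition the article builds on, a minimal worked sketch: entropy is H = -Σ p·log2(p) over the probabilities of mutually exclusive outcomes, here applied to a hypothetical distribution over possible interpretations of a legal rule (the probabilities are illustrative, not taken from the article).

```python
from math import log2

def shannon_entropy(probabilities):
    """H = -sum(p * log2(p)) in bits; zero-probability outcomes contribute nothing."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# Hypothetical example: a doctrine with four plausible interpretations.
# A uniform distribution is maximally uncertain; a skewed one carries less entropy.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
print(shannon_entropy([0.85, 0.05, 0.05, 0.05]))  # ~0.85 bits
```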


Author(s):  
Kevin N. Otto ◽  
Erik K. Antonsson

Abstract The Taguchi method of product design is an experimental approximation to minimizing the expected value of target variance for certain classes of problems. Taguchi’s method is extended to designs which involve variables each of which has a range of values all of which must be satisfied (necessity), and designs which involve variables each of which has a range of values any of which might be used (possibility). Tuning parameters, as a part of the design process, are also introduced into Taguchi’s method. The method is also extended to solve design problems with constraints, invoking the methods of constrained optimization. Finally, the Taguchi method uses a factorial method to search the design space, with a confined definition of an optimal solution. This is compared with other methods of searching the design space and their definition of an optimal solution.
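As a small illustration of the quantity the abstract says Taguchi's method approximately minimizes, the sketch below estimates the expected quadratic loss about a target, which decomposes into variance plus squared bias; the loss coefficient and sample values are invented, not drawn from the paper.

```python
import statistics

def expected_quadratic_loss(samples, target, k=1.0):
    """Taguchi-style expected loss E[k*(y - target)^2] = k*(variance + bias^2),
    estimated from a sample of the performance variable y."""
    mean = statistics.fmean(samples)
    var = statistics.pvariance(samples)
    bias = mean - target
    return k * (var + bias ** 2)

# Illustrative data: two candidate designs with similar means but different spread.
design_a = [9.8, 10.1, 10.0, 9.9, 10.2]
design_b = [9.0, 11.0, 10.5, 9.5, 10.0]
print(expected_quadratic_loss(design_a, target=10.0))
print(expected_quadratic_loss(design_b, target=10.0))
```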


2018 ◽  
Vol 2018 ◽  
pp. 1-9
Author(s):  
E. L. Rex ◽  
J. Werle ◽  
B. C. Burkart ◽  
J. R. MacKenzie ◽  
K. D. Johnston ◽  
...  

Geometry of the patella (kneecap) remains poorly understood yet is highly relevant to performing the correct patellar cut to reduce pain and to improve function and satisfaction after knee replacement surgery. Although studies routinely refer to “parallel to the anterior surface” and “the patellar horizon,” a quantitative definition of these is lacking and significant variability exists between observers for this irregularly-shaped bone. A 2D-3D shape analysis technique was developed to determine the optimal device configuration for contacting the patellar surface. Axial and sagittal pseudo-X-rays were created from 18 computed tomography (CT) scans of cadaveric knees. Four expert surgeons reviewed three repetitions of the X-rays in randomized order, marking their desired cut plane and their estimate of the anterior surface. These 2D results were related back to the 3D model to create the desired plane. There was considerable variability in perceptions, with intra- and intersurgeon repeatability (standard deviations) ranging from 1.3° to 2.4°. The best configuration of contact points to achieve the desired cutting plane was three pegs centred on the patellar surface, two superior and one inferior, forming a 16 mm equilateral triangle. This configuration achieved predicted cut planes within 1° of the surgeon ranges on all 18 patellae. Implementing this, as was done in a subsequent prototype surgical device, should help improve the success and satisfaction of knee replacement surgery.
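To make concrete how three peg contacts define a resection plane, a minimal geometric sketch: the plane through three contact points has the normal given by the cross product of two edge vectors, and the deviation from a desired cut plane is the angle between normals. The coordinates below are invented for illustration and are not patellar data from the study.

```python
import numpy as np

def plane_from_points(p1, p2, p3):
    """Return (unit normal, point on plane) for the plane through three contact points."""
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    normal = np.cross(p2 - p1, p3 - p1)
    return normal / np.linalg.norm(normal), p1

def angle_between_planes(n1, n2):
    """Angle in degrees between two planes given their unit normals."""
    cos = abs(float(np.dot(n1, n2)))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

# Illustrative peg contacts forming a 16 mm equilateral triangle (two superior, one inferior)
a = (-8.0, 5.0, 0.2)
b = (8.0, 5.0, -0.1)
c = (0.0, 5.0 - 16.0 * 3 ** 0.5 / 2, 0.0)
n_pegs, _ = plane_from_points(a, b, c)
n_desired = np.array([0.0, 0.0, 1.0])  # hypothetical surgeon-desired cut normal
print("deviation from desired cut plane:", angle_between_planes(n_pegs, n_desired), "deg")
```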


2020 ◽  
Vol 21 (Supplement_1) ◽  
Author(s):  
P Bartko ◽  
H Arfsten ◽  
G Heitzinger ◽  
N Pavo ◽  
A Toma ◽  
...  

Abstract Background: Diverging guideline definitions for the quantitative assessment of severe secondary mitral regurgitation (sMR) reflect the missing link between the sMR spectrum and mortality and have introduced a source of uncertainty and continuing debate. Objectives: The current study aimed to define improved risk thresholds, specifically tailored to the complex nature of sMR, that provide a unifying solution to the ongoing guideline controversy. Methods: We enrolled 423 heart failure patients under guideline-directed medical therapy and assessed sMR by effective regurgitant orifice area (EROA), regurgitant volume (RegVol) and regurgitant fraction (RegFrac). Results: Measures of sMR severity were consistently associated with 5-year mortality, with a hazard ratio (HR) per 1-SD increase of 1.42 (95% CI 1.25-1.63, P < 0.001) for EROA, 1.37 (95% CI 1.20-1.56, P < 0.001) for RegVol and 1.50 (95% CI 1.30-1.73, P < 0.001) for RegFrac. Results remained statistically significant after bootstrap- or clinical confounder-based adjustment. Spline-curve analyses (Figure 1A-C) showed a linearly increasing risk, enabling stratification into low risk (EROA < 20 mm² and RegVol < 30 ml), intermediate risk (EROA 20-30 mm² and RegVol 30-45 ml) and high risk (EROA ≥ 30 mm² and RegVol ≥ 45 ml). In the intermediate-risk group, a RegFrac ≥ 50%, as an indicator of hemodynamically severe sMR, was associated with poor outcome (P = 0.017). A unifying concept based on combined assessment of EROA, RegVol and RegFrac (Figure 1D) showed significantly better discrimination than the currently established algorithms (Table 1). Conclusions: Risk-based thresholds tailored to the pathophysiological concept of sMR provide a unifying solution to the ongoing guideline controversy. An algorithm based on the combined assessment of the unifying cut-offs for EROA, RegVol and RegFrac improves risk prediction compared with currently established grading.

Table 1. Definitions of severe sMR compared by Cox regression, ROC and IDI analyses.

Definition of severe sMR   | Cox HR (95% CI)   | P-value | ROC AUC | P-value for comparison | IDI  | P-value
Unifying concept           | 3.76 (2.71-5.23)  | <0.001  | 0.63    | –                      | –    | –
ACC/AHA definition         | 3.20 (2.14-4.78)  | <0.001  | 0.57    | <0.001                 | 0.06 | <0.001
ESC/EACTS definition       | 1.52 (1.10-2.09)  | 0.01    | 0.55    | <0.001                 | 0.13 | <0.001
ACC/ASE expert consensus   | 1.89 (1.40-2.56)  | <0.001  | 0.59    | 0.04                   | 0.08 | <0.001

Comparison of the unifying concept with the ACC/AHA, ESC/EACTS and ACC/ASE expert consensus definitions of sMR by Cox regression, ROC and IDI demonstrated the most powerful prediction by the unifying concept, with a significantly higher ROC area under the curve and better discriminatory power by IDI.

Figure 1 A-D (Abstract P1764): spline-curve analyses of mortality risk for EROA, RegVol and RegFrac (A-C) and the unifying assessment concept (D).
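A minimal sketch of the stratification logic using the cut-offs quoted in the abstract (EROA 20 and 30 mm², RegVol 30 and 45 ml, RegFrac 50%); the authors' full unifying algorithm (Figure 1D) is more detailed than this illustration, and the boundary handling here is a simplification.

```python
def classify_smr(eroa_mm2, regvol_ml, regfrac_pct):
    """Illustrative risk stratification based on the cut-offs quoted in the abstract;
    not a reproduction of the authors' unifying algorithm (Figure 1D)."""
    if eroa_mm2 >= 30 and regvol_ml >= 45:
        return "high risk"
    if eroa_mm2 < 20 and regvol_ml < 30:
        return "low risk"
    # Intermediate zone: a regurgitant fraction >= 50% flags hemodynamically severe sMR
    if regfrac_pct >= 50:
        return "intermediate risk, hemodynamically severe"
    return "intermediate risk"

print(classify_smr(eroa_mm2=25, regvol_ml=38, regfrac_pct=55))
```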


2014 ◽  
Vol 15 (2) ◽  
pp. 157-160 ◽  
Author(s):  
Alison Van Eenennaam ◽  
Holly Neibergs ◽  
Christopher Seabury ◽  
Jeremy Taylor ◽  
Zeping Wang ◽  
...  

Abstract The Bovine Respiratory Disease Coordinated Agricultural Project (BRD CAP) is a 5-year project funded by the United States Department of Agriculture (USDA), with an overriding objective to use the tools of modern genomics to identify cattle that are less susceptible to BRD. To do this, two large genome-wide association studies (GWAS) were conducted using a case:control design on preweaned Holstein dairy heifers and beef feedlot cattle. A health scoring system was used to identify BRD cases and controls. Heritability estimates for BRD susceptibility ranged from 19-21% in dairy calves to 29.2% in beef cattle when numerical scores were used as a semi-quantitative definition of BRD. A GWAS analysis conducted on the dairy calf data showed that single nucleotide polymorphism (SNP) effects explained 20% of the variation in BRD incidence and 17-20% of the variation in clinical signs. These results represent a preliminary analysis of ongoing work to identify loci associated with BRD. Future work includes validation of the chromosomal regions and SNPs that have been identified as important for BRD susceptibility, fine mapping of chromosomes to identify causal SNPs, and integration of predictive markers for BRD susceptibility into genetic tests and national cattle genetic evaluations.


Pharmaceutics ◽  
2018 ◽  
Vol 10 (3) ◽  
pp. 104 ◽  
Author(s):  
Leena Peltonen

Drug nanocrystals are nanosized solid drug particles whose most important application is improving the solubility of poorly soluble drug materials. Drug nanocrystals can be produced by many different techniques, but the most widely used are various media milling techniques, in which the particle size of bulk drug material is reduced to the nanometer scale with the aid of milling beads. Utilization of the Quality by Design (QbD) approach in nanomilling improves process understanding of the system, and recently the number of studies using the QbD approach in nanomilling has increased. In the QbD approach, quality is built into the products and processes throughout the whole production chain. Definition of the Critical Quality Attributes (CQAs) determines the targeted final product properties. The CQAs are ensured by setting Critical Process Parameters (CPPs), which include not only process parameters but also input variables, such as stabilizer amount or the solid-state form of the drug. Finally, the Design Space determines the limits within which the CPPs should stay in order to reach the CQAs. This review discusses the milling process and process variables (CPPs), their impact on product properties (CQAs), and the challenges of the QbD approach in nanomilling studies.
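A minimal sketch of the CPP/design-space relationship described above: hypothetical CPP ranges for a nanomilling process and a check that a proposed parameter set lies inside them. The parameter names and limits are invented for illustration; in a real QbD study they would come from experiments linking CPPs to the CQAs.

```python
# Hypothetical CPP ranges defining a design space for a nanomilling process.
# Names and limits are illustrative only, not taken from any study.
DESIGN_SPACE = {
    "milling_time_min": (30.0, 180.0),
    "bead_to_drug_ratio": (5.0, 20.0),
    "stabilizer_pct_of_drug": (10.0, 50.0),
}

def within_design_space(cpps):
    """Return (ok, violations) for a proposed set of critical process parameters."""
    violations = {name: value for name, value in cpps.items()
                  if not (DESIGN_SPACE[name][0] <= value <= DESIGN_SPACE[name][1])}
    return len(violations) == 0, violations

proposed = {"milling_time_min": 90.0, "bead_to_drug_ratio": 25.0, "stabilizer_pct_of_drug": 30.0}
print(within_design_space(proposed))  # bead_to_drug_ratio falls outside the illustrative range
```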


1996 ◽  
Vol 113-114 ◽  
pp. 349-378
Author(s):  
Marc Bourdeau ◽  
Jean-Pierre Tubach

Abstract We describe a statistical methodology that provides a quantitative definition of the concept of phonological space, which has long been used as a purely qualitative description of languages. We apply this description to five corpora: four of French and one of Polish. We obtain Cartesian planes in which the speakers are represented as points, and we show that the five corpora can be discriminated. We end by pointing to applications to the learning and loss of phonological capabilities.
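The abstract does not detail the statistical method. As an illustrative sketch of the general idea of representing speakers as points in a Cartesian plane, here is a plain principal-component projection of per-speaker feature vectors; the features and data are invented and this is not the authors' methodology.

```python
import numpy as np

def project_to_plane(features):
    """Project per-speaker feature vectors onto their first two principal components."""
    X = np.asarray(features, dtype=float)
    X = X - X.mean(axis=0)                        # center the data
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[:2].T                           # speaker coordinates in the plane

# Invented example: 4 speakers x 3 phonological/acoustic features each.
speakers = [[1.0, 2.0, 0.5],
            [1.1, 2.1, 0.4],
            [3.0, 0.5, 2.0],
            [2.9, 0.6, 2.1]]
print(project_to_plane(speakers))
```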

