A priori system-level interconnect prediction

Author(s):  
Dirk Stroobandt
Keyword(s):  
A Priori

Author(s):  
N. Ashwin Bharadwaj ◽  
James T. Allison ◽  
Randy H. Ewoldt

Rheological material properties are high-dimensional function-valued quantities, such as frequency-dependent viscoelastic moduli or non-Newtonian shear viscosity. Here we describe a process to model and optimize design targets for such rheological material functions. For linear viscoelastic systems, we demonstrate that one can avoid specific a priori assumptions of spring-dashpot topology by writing governing equations in terms of a time-dependent relaxation modulus function. Our approach embraces rheological design freedom, connecting system-level performance to optimal material functions that transcend specific material classes or structure. This technique is therefore material agnostic, applying to any material class including polymers, colloids, metals, composites, or any rheologically complex material. These early-stage design targets allow for broadly creative ideation of possible material solutions, which can then be used for either material-specific selection or later-stage design of novel materials.
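The governing relation described above — stress as a hereditary integral over a time-dependent relaxation modulus G(t) — can be sketched numerically without committing to any spring-dashpot topology. The following is a minimal illustration of Boltzmann superposition; the single-exponential modulus and all numerical values are invented for the example, not taken from the paper:

```python
import numpy as np

def trapz(y, x):
    """Trapezoidal rule, kept local so nothing but G(t) is assumed."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def stress_response(G, strain_rate, t):
    """Boltzmann superposition for linear viscoelasticity:
    sigma(t) = integral_0^t G(t - s) (dgamma/ds) ds.
    G can be *any* relaxation-modulus function of time; no spring-dashpot
    topology is assumed."""
    return np.array([trapz(G(ti - t[: i + 1]) * strain_rate(t[: i + 1]),
                           t[: i + 1]) for i, ti in enumerate(t)])

# One candidate design target: a single-exponential modulus (illustrative
# values only).
G0, tau = 1000.0, 0.5                    # plateau modulus (Pa), relaxation time (s)
G = lambda u: G0 * np.exp(-u / tau)
rate = lambda s: np.full_like(s, 0.01)   # constant shear rate (1/s)
t = np.linspace(0.0, 5.0, 501)
sigma = stress_response(G, rate, t)
# start-up of steady shear: sigma -> G0 * tau * rate = 5 Pa
```

Because the solver touches only the function G(t), swapping in a different candidate modulus (a stretched exponential, a power law, a tabulated target) changes nothing else, which is the material-agnostic point.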


2020 ◽  
Vol 36 (3) ◽  
pp. 204-216
Author(s):  
Christiaan Vis ◽  
Leah Bührmann ◽  
Heleen Riper ◽  
Hans C. Ossebaard

Objectives: Traditionally, health technology assessment (HTA) focuses on assessing the impact of pharmaceutical technologies on health and care. Resources are scarce, and policy makers aim to achieve effective, accessible health care. eHealth innovations are increasingly integrated into all healthcare domains, yet how eHealth is assessed prior to its implementation in care practices is unclear. To support evidence-informed policy making, this study aimed to identify frameworks and methods for assessing eHealth's impact on health care.
Methods: The scientific literature in five bibliographic databases was systematically reviewed. Articles were included if the study was conducted in a clinical setting, used an HTA framework, and assessed an eHealth service. A systematic qualitative narrative approach was applied for analysis and reporting.
Results: Twenty-one HTA frameworks were identified in twenty-three articles. All frameworks addressed outcomes related to the technical performance and functionalities of the eHealth service under assessment. The majority also addressed costs (n = 19), clinical outcomes (n = 14), organizational aspects (n = 15), and system-level aspects (n = 13). Most frameworks can be classified as dimensional (n = 13), followed by staged (n = 3), hybrid (n = 3), and business modeling frameworks (n = 2). Six frameworks specified assessment outcomes and methods.
Conclusions: HTA frameworks are available for a priori impact assessment of eHealth services. The frameworks vary in assessment outcomes, methods, and specificity, and their demonstrated applicability in practice is limited. Recommendations include standardizing (i) the reporting of eHealth service characteristics and (ii) the specification of assessment outcomes and methods, following a stepped approach tailored to the functional characteristics of eHealth services. Standardization might improve the quality and comparability of eHealth technology assessments.


2010 ◽  
Vol 365 (1545) ◽  
pp. 1387-1395 ◽  
Author(s):  
Kyungrock Paik ◽  
Praveen Kumar

Mother Nature has left amazingly regular geomorphic patterns on the Earth's surface. These patterns are often explained as having arisen from some optimal behaviour of natural processes. However, there is little agreement on what is being optimized. As a result, a number of alternatives have been proposed, often with little a priori justification, on the argument that successful predictions will lend a posteriori support to the hypothesized optimality principle. Given that maximum entropy production is an optimality principle attempting to predict microscopic behaviour from a macroscopic characterization, this paper reviews similar approaches, comparing and contrasting them to enable synthesis. While assumptions of optimal behaviour approach a system from a macroscopic viewpoint, process-based formulations attempt to resolve the mechanistic details whose interactions give rise to system-level functions. Using observed optimality trends may help simplify problem formulation at the appropriate scale of interest. For such an approach to be successful, however, we suggest that optimality principles be formulated at the broader level of the environmental system, i.e. incorporating the dynamic nature of environmental variables and the complex feedback mechanisms between fluvial and non-fluvial processes.


2006 ◽  
Vol 16 (01) ◽  
pp. 63-79 ◽  
Author(s):  
Mourad Elhadef ◽  
Kaouther Abrougui ◽  
Shantanu Das ◽  
Amiya Nayak

In this paper, we present a system-level fault identification algorithm, based on a parallel genetic algorithm, for diagnosing faulty nodes in large heterogeneous systems. The algorithm assumes a probabilistic model in which individual nodes fail with an a priori probability p. The assumptions concerning test outcomes are those of the PMC model: fault-free testers always give correct test outcomes, while faulty testers are totally unpredictable. The parallel diagnosis algorithm was implemented and simulated on randomly generated large systems. The proposed parallelization is intended to speed up the evolutionary diagnosis approach, reducing computation time by evolving several sub-populations in parallel. Simulation results show that the parallel diagnosis did improve the efficiency of the evolutionary approach, allowing faster diagnosis of faulty situations and making it a viable alternative to existing diagnosis techniques. Moreover, the evolutionary approach still provides good results even when extreme, non-diagnosable faulty situations are considered.
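The evolutionary diagnosis idea can be sketched as follows: encode a candidate fault set as a bitstring and score it by its consistency with the PMC syndrome plus a log-prior that reflects the a priori fault probability p. Everything below is illustrative — a serial GA on an invented 8-node ring topology, not the authors' parallel multi-population implementation:

```python
import math
import random

random.seed(0)

def make_syndrome(tests, actual):
    """PMC model: a fault-free tester reports the tested node's true state;
    a faulty tester's outcome is arbitrary (modelled here as random)."""
    return {(u, v): (actual[v] if not actual[u] else random.randint(0, 1))
            for (u, v) in tests}

def fitness(candidate, tests, syndrome, p):
    """Syndrome consistency under the PMC model, plus a scaled log-prior:
    with p < 0.5 each extra fault is a priori unlikely, so smaller
    consistent fault sets score higher."""
    ok = sum(1 for (u, v) in tests
             if candidate[u] or syndrome[(u, v)] == candidate[v])
    return ok + 0.1 * sum(candidate) * math.log(p / (1.0 - p))

def diagnose(n, tests, syndrome, p=0.2, pop_size=40, gens=200):
    """Elitist GA over fault-set bitstrings (serial sketch; the paper
    evolves several sub-populations in parallel)."""
    pop = [[random.random() < p for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda c: fitness(c, tests, syndrome, p), reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n)        # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(n)             # one-bit mutation
            child[i] = not child[i]
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda c: fitness(c, tests, syndrome, p))

# Tiny demo: 8 nodes in a ring, each testing its successor; node 3 is faulty.
n = 8
tests = [(i, (i + 1) % n) for i in range(n)]
actual = [i == 3 for i in range(n)]
syndrome = make_syndrome(tests, actual)
best = diagnose(n, tests, syndrome)
```

The log-prior term matters: without it, any superset of the true fault set that remains syndrome-consistent (a classic PMC ambiguity, since a faulty tester's outcomes constrain nothing) would score just as well as the true set.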


Author(s):  
Marco Gero Fernández ◽  
Jitesh H. Panchal ◽  
Janet K. Allen ◽  
Farrokh Mistree

Often, design problems are coupled and their concurrent resolution by interacting stakeholders is required. The ensuing interactions are characterized predominantly by degree of interdependence and level of cooperation. Since tradeoffs, made within and among sub-systems, inherently contribute to system level performance, bridging the associated gaps is crucial. With this in mind, effective collaboration, centered on continued communication, concise coordination, and non-biased achievement of system level objectives, is becoming increasingly important. Thus far, research in distributed and decentralized decision-making has focused primarily on conflict resolution. Game theoretic protocols and negotiation tactics have been used extensively as a means of making the required tradeoffs, often in a manner that emphasizes the maximization of stakeholder (personal) payoff over system level performance. More importantly, virtually all of the currently instantiated mechanisms are based upon the a priori assumption of the existence of solutions that are acceptable to all interacting parties. No explicit consideration has been given thus far to ensuring the convergence of stakeholder design activities leading up to the coupled decision and the associated determination of values for uncoupled and coupled design parameters. Consequently, unnecessary and costly iteration is likely to result from mismatched objectives. In this paper, we advocate moving beyond strategic collaboration towards co-design. We present an alternative coordination mechanism, centered on sharing key pieces of information throughout the process of determining a solution to a coupled system. Specifically, we focus on (1) establishing and assessing collaborative design spaces, (2) identifying and exploring regions of acceptable performance, and (3) preserving stakeholder dominion over design sub-system resolution throughout the duration of a given design process. 
The fundamental goal is to establish a consistent framework for goal-oriented collaboration that (1) more accurately represents the mechanics underlying product development and (2) helps interacting stakeholders achieve their respective objectives in light of system-level priorities. This is accomplished via improved utilization of shared resources and avoidance of unnecessary reductions in design freedom. Comparative performance of the proposed method is established using a simple example involving the resolution of a tradeoff with respect to a system of non-linear equations.
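The kind of tradeoff in the closing example can be sketched generically: two stakeholders hold private objectives over design variables that are coupled through a shared nonlinear relation, and sweeping a weight traces out a region of acceptable performance instead of imposing a single solution. The objectives, the coupling constraint, and the weighted-sum scalarization below are all invented for illustration; they are not the coordination mechanism of the paper:

```python
import numpy as np

# Shared nonlinear coupling: feasible designs satisfy x^2 + y^2 = 4,
# parametrized by an angle so the feasible set is explicit.
theta = np.linspace(0.0, 2.0 * np.pi, 200001)
x, y = 2.0 * np.cos(theta), 2.0 * np.sin(theta)

f1 = (x - 1.0) ** 2    # stakeholder A's objective: x near 1
f2 = (y - 2.0) ** 2    # stakeholder B's objective: y near 2

def compromise(w):
    """Weighted-sum tradeoff over the shared feasible set (0 < w < 1)."""
    i = np.argmin(w * f1 + (1.0 - w) * f2)
    return float(x[i]), float(y[i])

# Sweeping the weight traces a set of mutually feasible compromises,
# each respecting the coupling, rather than one side's payoff maximum.
pts = [compromise(w) for w in (0.2, 0.5, 0.8)]
```

Sharing the explicit feasible set, as here, is the point of contrast with purely game-theoretic protocols: every candidate each stakeholder evaluates already satisfies the coupled system, so iteration over mutually infeasible proposals is avoided.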


Author(s):  
D. E. Luzzi ◽  
L. D. Marks ◽  
M. I. Buckett

As the HREM becomes increasingly used for the study of dynamic localized phenomena, the development of techniques to recover the desired information from a real image is important. Often, the important features scatter only weakly compared with the matrix material, in addition to being masked by statistical and amorphous noise. The desired information will usually involve accurate knowledge of the position and intensity of the contrast. In order to decipher the desired information from a complex image, cross-correlation (xcf) techniques can be utilized. Unlike other image processing methods which rely on data massaging (e.g. high/low-pass filtering or Fourier filtering), the cross-correlation method is a rigorous data reduction technique with no a priori assumptions. We have examined basic cross-correlation procedures using images of discrete Gaussian peaks and have developed an iterative procedure to greatly enhance the capabilities of these techniques when the contrast from the peaks overlaps.
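The basic step — locating a weak peak by cross-correlating the image with a model peak — can be sketched as below. This is a minimal illustration with a synthetic noisy image and a Gaussian template (the image size, peak parameters, and noise level are invented), not the authors' iterative procedure:

```python
import numpy as np

rng = np.random.default_rng(1)

def gaussian(shape, center, sigma, amp):
    """Discrete 2-D Gaussian peak."""
    yy, xx = np.indices(shape)
    r2 = (yy - center[0]) ** 2 + (xx - center[1]) ** 2
    return amp * np.exp(-r2 / (2.0 * sigma ** 2))

# A weak peak at (row, col) = (40, 90), with per-pixel noise comparable
# to the signal amplitude.
img = gaussian((128, 128), (40, 90), 3.0, 1.0) + rng.normal(0.0, 0.5, (128, 128))

# Circular cross-correlation with a model peak, computed via FFT.
template = gaussian((128, 128), (64, 64), 3.0, 1.0)
xcf = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(template))).real
shift = np.unravel_index(np.argmax(xcf), xcf.shape)

# Undo the template's centre offset to recover the peak position.
peak = (int(shift[0] + 64) % 128, int(shift[1] + 64) % 128)
```

The correlation sums the signal over the whole template footprint while the noise adds incoherently, which is why the xcf maximum stands out even when no single pixel of the peak exceeds the noise.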


Author(s):  
H.S. von Harrach ◽  
D.E. Jesson ◽  
S.J. Pennycook

Phase contrast TEM has been the leading technique for high resolution imaging of materials for many years, whilst STEM has been the principal method for high-resolution microanalysis. However, it was demonstrated many years ago that low-angle dark-field STEM imaging is a priori capable of almost 50% higher point resolution than coherent bright-field imaging (i.e. phase contrast TEM or STEM). This advantage was not exploited until Pennycook developed the high-angle annular dark-field (ADF) technique, which can provide an incoherent image showing both high image resolution and atomic number contrast. This paper describes the design and first results of a 300 kV field-emission STEM (VG Microscopes HB603U) which has improved ADF STEM image resolution towards the 1 angstrom target. The instrument uses a cold field-emission gun, generating a 300 kV beam of up to 1 μA from an 11-stage accelerator. The beam is focussed on to the specimen by two condensers and a condenser-objective lens with a spherical aberration coefficient of 1.0 mm.


2019 ◽  
Vol 4 (5) ◽  
pp. 878-892
Author(s):  
Joseph A. Napoli ◽  
Linda D. Vallino

Purpose: The 2 most commonly used operations to treat velopharyngeal inadequacy (VPI) are the superiorly based pharyngeal flap and sphincter pharyngoplasty, both of which may result in hyponasal speech and airway obstruction. The purpose of this article is to (a) describe the bilateral buccal flap revision palatoplasty (BBFRP) as an alternative technique to manage VPI while minimizing these risks and (b) conduct a systematic review of the evidence of BBFRP on speech and other clinical outcomes. A report comparing the speech of a child with hypernasality before and after BBFRP is presented.
Method: A review of databases was conducted for studies of buccal flaps to treat VPI. Using the principles of a systematic review, the articles were read, and data were abstracted for study characteristics that were developed a priori. With respect to the case report, speech and instrumental data from a child with repaired cleft lip and palate and hypernasal speech were collected and analyzed before and after surgery.
Results: Eight articles were included in the analysis. The results were positive, and the evidence is in favor of BBFRP in improving velopharyngeal function while minimizing the risk of hyponasal speech and obstructive sleep apnea. Before surgery, the child's speech was characterized by moderate hypernasality; after surgery, it was judged to be within normal limits.
Conclusion: Based on clinical experience and results from the systematic review, there is sufficient evidence that the buccal flap is effective in improving resonance and minimizing obstructive sleep apnea. We recommend BBFRP as another approach in selected patients to manage VPI.
Supplemental Material: https://doi.org/10.23641/asha.9919352


Addiction ◽  
1997 ◽  
Vol 92 (12) ◽  
pp. 1671-1698 ◽  
Author(s):  
Project Match Research Group
Keyword(s):  
A Priori ◽  
