Guiding Target Synthesis with Statistical Modeling Tools: A Case Study in Organocatalysis

2022 ◽  
Author(s):  
Isaiah O. Betinol ◽  
Yutao Kuang ◽  
Jolene P. Reid


2021 ◽  
Vol 13 (14) ◽  
pp. 7990
Author(s):  
Suman Paneru ◽  
Forough Foroutan Jahromi ◽  
Mohsen Hatami ◽  
Wilfred Roudebush ◽  
Idris Jeelani

Traditional energy analysis in Building Information Modeling (BIM) accounts only for the energy requirements of building operations during a portion of the occupancy phase of the building’s life cycle and, as such, cannot quantify the true impact of buildings on the environment. Specifically, typical energy analysis in BIM does not account for the energy associated with resource formation, recycling, and demolition. A comprehensive method is therefore required to analyze the true environmental impact of buildings. Emergy analysis offers a holistic approach to accounting for the environmental cost of the activities involved in building construction and operation across all life cycle phases, from resource formation to demolition. As such, integrating emergy analysis with BIM can yield a holistic sustainability performance tool. This study therefore aimed to develop a comprehensive framework for integrating emergy analysis with existing Building Information Modeling tools. The proposed framework was validated using a case study involving a test building element, an 8′ × 8′ composite wall. The case study demonstrated the successful integration of emergy analysis with Revit® 2021 using Revit’s built-in features and external tools such as MS Excel. The framework developed in this study will help accurately determine the environmental cost of buildings, supporting the selection of environmentally friendly building materials and systems. In addition, integrating emergy into BIM will allow a comparison of built environment alternatives, enabling designers to make sustainable decisions during the design phase.
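The core arithmetic of an emergy tally is a sum of material quantities multiplied by unit emergy values (transformities, in solar emjoules, seJ). A minimal sketch of that step is shown below; the material names and seJ/kg figures are illustrative placeholders, not data from the study, and a real workflow would pull the quantities from a Revit schedule export.

```python
# Hypothetical sketch: emergy of a building element from a BIM quantity
# takeoff. Material names and unit emergy values (seJ per kg) are
# invented placeholders, not the study's actual data.

takeoff = {            # quantities exported from the BIM model, kg
    "concrete": 450.0,
    "steel": 35.0,
    "insulation": 12.0,
}

uev = {                # unit emergy values, seJ/kg (illustrative)
    "concrete": 1.8e12,
    "steel": 6.9e12,
    "insulation": 2.5e12,
}

# Emergy of the element = sum over materials of quantity * unit emergy value
total_emergy = sum(takeoff[m] * uev[m] for m in takeoff)  # seJ
print(f"Total emergy: {total_emergy:.3e} seJ")
```

Because the calculation is a simple weighted sum, it maps naturally onto a spreadsheet joined to a BIM quantity schedule, which is consistent with the Excel-based integration the abstract describes.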


2021 ◽  
Author(s):  
Henrik Singmann ◽  
Gregory Edward Cox ◽  
David Kellen ◽  
Suyog Chandramouli ◽  
Clintin Davis-Stober ◽  
...  

Statistical modeling is generally meant to describe patterns in data in service of the broader scientific goal of developing theories to explain those patterns. Statistical models support meaningful inferences when they are built so as to align model parameters with potential causal mechanisms and how those mechanisms manifest in data. When statistical models are instead based on assumptions chosen by default, attempts to draw inferences can be uninformative or even paradoxical—in essence, the tail is trying to wag the dog. These issues are illustrated by van Doorn et al. (in press) in the context of using Bayes factors to identify effects and interactions in linear mixed models. We show that the problems identified in their applications can be circumvented by using priors over inherently meaningful units instead of default priors on standardized scales. This case study illustrates how researchers must directly engage with a number of substantive issues in order to support meaningful inferences, of which we highlight two. The first is the problem of coordination, which requires a researcher to specify how the theoretical constructs postulated by a model are functionally related to observable variables. The second is the problem of generalization, which requires a researcher to consider how a model may represent theoretical constructs shared across similar but non-identical situations, along with the fact that model comparison metrics such as Bayes factors do not directly address this form of generalization. For statistical modeling to serve the goals of science, models cannot be based on default assumptions; they should instead be based on an understanding of their coordination function and of how they represent causal mechanisms that may be expected to generalize to related scenarios.
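The sensitivity of Bayes factors to the prior scale can be made concrete with a toy calculation (not from the article itself). For a normal mean with known σ, testing H0: μ = 0 against H1: μ ~ N(0, τ²), the Bayes factor has a closed form, and the same data can favor H0 under a wide "default" prior yet favor H1 under a prior scaled to theoretically meaningful units. All numbers below are illustrative.

```python
import math

def normal_pdf(x, mean, sd):
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def bf01(ybar, n, sigma, tau):
    """Bayes factor BF01 for H0: mu = 0 vs H1: mu ~ N(0, tau^2),
    with y_i ~ N(mu, sigma^2) and sigma known. The sample mean is
    ybar ~ N(mu, sigma^2 / n), so both marginal likelihoods are normal."""
    se = sigma / math.sqrt(n)
    p0 = normal_pdf(ybar, 0.0, se)                          # marginal under H0
    p1 = normal_pdf(ybar, 0.0, math.sqrt(se**2 + tau**2))   # marginal under H1
    return p0 / p1

# Same data, two priors: a wide default prior vs one matched to the
# unit in which the effect size is theoretically meaningful.
ybar, n, sigma = 0.3, 50, 1.0
print(bf01(ybar, n, sigma, tau=10.0))   # BF01 > 1: wide prior favors H0
print(bf01(ybar, n, sigma, tau=0.3))    # BF01 < 1: meaningful-unit prior favors H1
```

The flip arises because a very diffuse prior under H1 spreads its predictive mass thinly, so even a clearly non-zero sample mean looks more probable under the point null—an instance of the default-assumption pitfall the abstract describes.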


2018 ◽  
Vol 77 (3) ◽  
Author(s):  
Maryam Ghazanfari Shabankareh ◽  
Hakimeh Amanipoor ◽  
Sedigheh Battaleb-Looie ◽  
Javad Dravishi Khatooni

Author(s):  
Marco Vitali ◽  
Roberta Spallone ◽  
Francesco Carota

This chapter develops some considerations on the heuristic potential of parametric digital modeling as a tool for analyzing and interpreting architectural heritage. Observing that parametric thinking in architecture can be recognized almost from its origins, new parametric modeling software makes it possible to verify the design criteria of the past. Building on previous studies of Baroque vaulted atria, this chapter develops, using parametric modeling tools, a veritable vocabulary of shapes and their possible combinations, suggested by the architectural literature of the time and by a survey of about seventy atria in Turin. The method has been tested on the case study of the lunette dome in the atrium of Palazzo Carignano.


2020 ◽  
Vol 231 (8) ◽  
Author(s):  
Costas A. Varotsos ◽  
Vladimir F. Krapivin ◽  
Ferdenant A. Mkrtchyan ◽  
Suren A. Gevorkyan ◽  
Tengfei Cui

Author(s):  
Komandur S. Sunder Raj

Surface condensers for power plant applications are generally specified and designed following turbine–condenser optimization studies. The turbine manufacturer provides turbine-generator performance data (the thermal kit) at the very outset of plant design, when the condenser is usually a black box and little is known about its design. The turbine-generator guarantee is then based on a specified condenser pressure that may or may not be attainable once the condenser is actually specified and designed. The condenser pressure used for the turbine performance guarantee might assume a single-pressure condenser, while the actual design might be a multi-pressure condenser. To properly predict and monitor performance and conduct diagnostics on a multi-pressure condenser, it is important to understand the design basis and to develop an accurate model using performance modeling tools. The paper presents a multi-pressure condenser case study for a 600 MWe nuclear power plant. It discusses the design basis used, the interface between the turbine and condenser, the use of a performance modeling tool for predicting performance, determining capacity losses attributable to the condenser, and conducting diagnostics.
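The distinction between single- and multi-pressure behavior can be sketched with the standard single-zone condenser relations: for each shell, with NTU = UA/(ṁ·cp), the terminal temperature difference is TTD = rise / (e^NTU − 1) and the shell condenses at T_sat = T_out + TTD. The sketch below (all numbers invented; a real analysis would use an HEI-style performance tool) shows how, with cooling water flowing through shells in series, each successive shell settles at a higher saturation temperature, hence a higher pressure.

```python
import math

def shell_saturation_temps(t_cw_in, m_dot, cp, shell_duties, UA_per_shell):
    """Sketch of a series-flow multi-pressure condenser: cooling water
    passes through the shells in series, and each shell condenses at its
    own saturation temperature. Per shell, with NTU = UA / (m_dot * cp):
        rise = Q / (m_dot * cp)
        TTD  = rise / (exp(NTU) - 1)
        T_sat = T_out + TTD
    Units: degC, kg/s, kJ/(kg K), kW, kW/K. All example values invented."""
    temps = []
    t_in = t_cw_in
    for q, ua in zip(shell_duties, UA_per_shell):
        rise = q / (m_dot * cp)            # cooling water temperature rise
        t_out = t_in + rise
        ntu = ua / (m_dot * cp)
        ttd = rise / (math.exp(ntu) - 1.0) # terminal temperature difference
        temps.append(t_out + ttd)          # shell saturation temperature
        t_in = t_out                       # water feeds the next shell
    return temps

temps = shell_saturation_temps(
    t_cw_in=20.0, m_dot=10000.0, cp=4.19,
    shell_duties=[200e3] * 3, UA_per_shell=[6.0e4] * 3)
print([round(t, 2) for t in temps])  # rising saturation temperature per shell
```

A turbine guarantee stated at one backpressure implicitly averages over these per-shell saturation pressures, which is why a single-pressure assumption can misstate the attainable performance of a multi-pressure design.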


Author(s):  
Marco Aurisicchio ◽  
Rob Bracewell ◽  
Gareth Armstrong

Understanding product functions is a key aspect of the work undertaken by engineers involved in complex system design. The support offered to these engineers by existing modeling tools such as the function tree and the function structure is limited because they are not intuitive and do not scale well to deal with real-world engineering problems. A research collaboration between two universities and a major power system company in the aerospace domain has allowed the authors to further develop a method for function analysis known as the function analysis diagram that was already in use by line engineers. The capability to generate and edit these diagrams was implemented in the Decision Rationale editor, a software tool for capturing design rationale. This article presents the intended benefits of the method and justifies them using an engineering case study. The results of the research have shown that the function analysis diagram method has a simple notation, permits the modeling of product functions together with structure, allows the generation of rich and accurate descriptions of product functionality, is useful to work with variant and adaptive design tasks, and can coexist with other functional modeling methods.
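The notation's core idea—structural elements as boxes, with labeled arrows for the function one element performs on another—can be approximated as a labeled graph. The sketch below is a minimal data structure of my own devising; the components and functions are invented examples, not taken from the Decision Rationale editor or the study's case.

```python
# Minimal sketch of a function analysis diagram (FAD) as a labeled graph:
# boxes are structural elements, arrows carry the function one element
# performs on another. All element and function names are invented.

class FunctionAnalysisDiagram:
    def __init__(self):
        self.relations = []  # (source element, function, target element)

    def add_function(self, source, function, target):
        self.relations.append((source, function, target))

    def functions_of(self, element):
        """All functions a structural element performs on other elements."""
        return [(f, t) for s, f, t in self.relations if s == element]

fad = FunctionAnalysisDiagram()
fad.add_function("shaft", "transmits torque to", "gear")
fad.add_function("bearing", "locates radially", "shaft")
fad.add_function("housing", "supports", "bearing")
print(fad.functions_of("shaft"))  # [('transmits torque to', 'gear')]
```

Keeping structure (the elements) and function (the edge labels) in one model is what lets the diagram describe product functionality together with the components that deliver it, which the abstract identifies as a key advantage over function trees.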

