A multi-species modeling framework for describing supersonic jet-induced cratering in a granular bed: Cratering on Titan case study

2019 ◽  
Vol 118 ◽  
pp. 205-241 ◽  
Author(s):  
Kaushik Balakrishnan ◽  
Josette Bellan


2021 ◽  
pp. 153450842199877
Author(s):  
Wilhelmina van Dijk ◽  
A. Corinne Huggins-Manley ◽  
Nicholas A. Gage ◽  
Holly B. Lane ◽  
Michael Coyne

In reading intervention research, implementation fidelity is assumed to be positively related to student outcomes, but the methods used to measure fidelity are often treated as an afterthought. Fidelity has been conceptualized and measured in many different ways, suggesting a lack of construct validity. One aspect of construct validity is the fidelity index of a measure. This methodological case study examined how different decisions in fidelity indices influence the relative rank ordering of individuals on the construct of interest and shape our perception of the relation between the construct and intervention outcomes. Data for this study came from a large State-funded project to implement multi-tiered systems of support for early reading instruction. Analyses were conducted to determine whether different fidelity indices are stable in the relative rank ordering of participants and whether fidelity indices of dosage and adherence data influence researcher decisions on model building within a multilevel modeling framework. Results indicated that the fidelity indices resulted in different relations to outcomes, with the most commonly used indices for both dosage and adherence being the worst performing. The choice of index should receive considerable thought during the design phase of an intervention study.
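To make the indexing problem concrete, the following minimal sketch contrasts two ways a "dosage" fidelity index could be computed from weekly session counts. The data and both index definitions are invented for illustration; they are not the indices examined in the study.

```python
# Hedged sketch: two hypothetical "dosage" fidelity indices.
# Data and definitions are illustrative, not those of the study.

def dosage_total(sessions):
    """Dosage as the raw count of delivered sessions."""
    return sum(sessions)

def dosage_threshold(sessions, minimum=3):
    """Dosage as the proportion of weeks meeting a minimum session count."""
    return sum(1 for s in sessions if s >= minimum) / len(sessions)

# Weekly session counts for two hypothetical interventionists.
weekly = {
    "A": [7, 7, 0, 0],   # bursty delivery
    "B": [3, 3, 3, 3],   # steady delivery
}

# The two indices reverse the rank ordering of A and B.
totals = {k: dosage_total(v) for k, v in weekly.items()}     # A: 14, B: 12
props = {k: dosage_threshold(v) for k, v in weekly.items()}  # A: 0.5, B: 1.0
```

Which interventionist ranks higher depends entirely on the index chosen, and that choice in turn changes the apparent fidelity-outcome relation, which is the instability the study probes.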


2010 ◽  
Vol 64 (3) ◽  
Author(s):  
Michal Kvasnica ◽  
Martin Herceg ◽  
Ľuboš Čirka ◽  
Miroslav Fikar

Abstract. This paper presents a case study of model predictive control (MPC) applied to a continuous stirred tank reactor (CSTR). It is proposed to approximate the nonlinear behavior of the plant by several local linear models, enabling a piecewise affine (PWA) description of the model used to predict and optimize the future evolution of the reactor. The main advantage of the PWA model over traditional approaches based on a single linearization is a significant increase in model accuracy, which leads to better control quality. It is also illustrated that, by adopting the PWA modeling framework, the MPC strategy can be implemented using significantly less computational power than nonlinear MPC setups require.
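The core idea, replacing one nonlinearity with several local affine laws selected by operating region, can be sketched in a few lines. The stand-in nonlinearity and breakpoints below are invented for illustration; they are not the CSTR model from the paper.

```python
# Hedged sketch of a piecewise affine (PWA) approximation: a nonlinear
# static map is replaced by local affine models, one per operating region.

def pwa_fit(f, breakpoints):
    """Build affine segments (x0, x1, slope, intercept) interpolating f."""
    segs = []
    for x0, x1 in zip(breakpoints, breakpoints[1:]):
        slope = (f(x1) - f(x0)) / (x1 - x0)
        segs.append((x0, x1, slope, f(x0) - slope * x0))
    return segs

def pwa_eval(segs, x):
    """Evaluate the PWA model: select the active region, apply its affine law."""
    for x0, x1, a, b in segs:
        if x0 <= x <= x1:
            return a * x + b
    raise ValueError("x outside modeled range")

f = lambda x: x ** 2              # stand-in nonlinearity
segs = pwa_fit(f, [0, 1, 2, 3])   # three operating regions
print(pwa_eval(segs, 1.5))        # 2.5, vs. the true value 2.25
```

In a PWA-based MPC, each region's affine law keeps the online optimization a (mixed-integer) linear/quadratic program rather than a general nonlinear one, which is where the computational savings come from.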


2020 ◽  
Vol 1 ◽  
pp. 1-23
Author(s):  
Majid Hojati ◽  
Colin Robertson

Abstract. With new forms of digital spatial data driving new applications for monitoring and understanding environmental change, there are growing demands on traditional GIS tools for spatial data storage, management, and processing. Discrete Global Grid Systems (DGGS) are methods to tessellate the globe into multi-resolution grids; they represent a global spatial fabric capable of storing heterogeneous spatial data and offer improved performance in data access, retrieval, and analysis. While DGGS-based GIS may hold potential for next-generation big-data GIS platforms, few studies have tried to implement them as a framework for operational spatial analysis. Cellular Automata (CA) are a classic dynamic modeling framework that has been used with the traditional raster data model for various environmental modeling tasks, such as wildfire spread and urban expansion. The main objectives of this paper are to (i) investigate the possibility of using DGGS for running dynamic spatial analysis, (ii) evaluate CA as a generic data model for modeling dynamic phenomena within a DGGS data model, and (iii) evaluate an in-database approach to CA modelling. To do so, a case study in wildfire spread modelling is developed. Results demonstrate that using a DGGS data model not only provides the ability to integrate different data sources, but also provides a framework for spatial analysis without geometry-based computation. This results in a simplified architecture and a common spatial fabric to support development of a wide array of spatial algorithms. While considerable work remains to be done, CA modelling within a DGGS-based GIS is a robust and flexible modelling framework for big-data GIS analysis in an environmental monitoring context.
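A minimal CA fire-spread step can illustrate the kind of neighbourhood update the paper runs over DGGS cells. In this sketch a plain square grid with 4-neighbours stands in for the DGGS tessellation, and the spread rule is deliberately simplistic; neither is the model from the paper.

```python
# Hedged sketch of one cellular-automaton fire-spread step. A square grid
# stands in for DGGS cells; the ignition rule is deliberately minimal.

UNBURNED, BURNING, BURNED = 0, 1, 2

def step(grid):
    """One CA step: burning cells burn out; unburned cells with at least
    one burning 4-neighbour ignite."""
    rows, cols = len(grid), len(grid[0])
    nxt = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == BURNING:
                nxt[r][c] = BURNED
            elif grid[r][c] == UNBURNED:
                neigh = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
                if any(0 <= i < rows and 0 <= j < cols
                       and grid[i][j] == BURNING for i, j in neigh):
                    nxt[r][c] = BURNING
    return nxt

grid = [[0, 0, 0],
        [0, 1, 0],   # single ignition in the centre
        [0, 0, 0]]
grid = step(grid)    # centre burns out, its four neighbours ignite
```

In a DGGS setting the same transition rule would be expressed over cell identifiers and their neighbour relations rather than array indices, which is what allows the in-database formulation the paper evaluates.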


2011 ◽  
Vol 308-310 ◽  
pp. 538-541
Author(s):  
Yuan Chen

An effort is made to describe a computer-aided conceptual design system. A novel Function-Action-Behavior-Mechanism (FABM) modeling framework is proposed to realize the mapping from the overall function to a principle solution according to customer requirements. Expansion and modification rules for the demanded behavior are developed to extend the innovation of the principle solution. A case study on pan-mechanism design for a cooking robot is presented to show how to implement intelligent reasoning based on the FABM model.
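The flavour of such a mapping can be sketched as a chained lookup from a required function through behaviors to candidate mechanisms. All entries below are invented for illustration and are not taken from the paper's knowledge base.

```python
# Hedged sketch of an FABM-style chained lookup. The tables are toy
# examples invented for illustration.

FUNCTION_TO_BEHAVIOR = {"stir ingredients": ["rotate pan", "shake pan"]}
BEHAVIOR_TO_MECHANISM = {"rotate pan": ["geared motor drive"],
                         "shake pan": ["crank-slider linkage"]}

def candidate_mechanisms(function):
    """Collect every mechanism reachable from the function via a behavior."""
    mechs = []
    for behavior in FUNCTION_TO_BEHAVIOR.get(function, []):
        mechs.extend(BEHAVIOR_TO_MECHANISM.get(behavior, []))
    return mechs

print(candidate_mechanisms("stir ingredients"))
# ['geared motor drive', 'crank-slider linkage']
```

Expansion and modification rules would then grow or rewrite these tables, which is how the framework extends the space of principle solutions.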


Author(s):  
Ayda Saidane ◽  
Nicolas Guelfi

The quality of software systems depends strongly on their architecture. For this reason, taking non-functional requirements into account at the architecture level is crucial for the success of the software development process. Early architecture model validation facilitates the detection and correction of design errors. In this research, the authors are interested in security-critical systems, which require a reliable validation process. Existing security-testing approaches do not yet provide an appropriate compromise between software quality and development cost while satisfying certification and audit requirements through automated and documented validation activities. In this chapter, the authors propose a novel test-driven, architecture model-based security engineering approach for resilient systems. It consists of a test-driven security modeling framework and a test-based validation approach. The assessment of security requirement satisfaction is based on the analysis of test traces. Throughout this study, the authors illustrate the approach using a client-server architecture case study.
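Test-trace analysis of this kind can be illustrated with a minimal sketch: a recorded event trace is checked against one security requirement. The requirement and event names below are invented for illustration, not taken from the chapter.

```python
# Hedged sketch of test-trace analysis: checking an invented security
# requirement ("no resource access before authentication") against event
# traces recorded during a test run.

def satisfies_auth_before_access(trace):
    """True iff every 'access' event is preceded by a 'login' event."""
    authenticated = False
    for event in trace:
        if event == "login":
            authenticated = True
        elif event == "access" and not authenticated:
            return False
    return True

print(satisfies_auth_before_access(["login", "access", "access"]))  # True
print(satisfies_auth_before_access(["access", "login"]))            # False
```

In the approach described above, such checks would be generated from the security model and run automatically, producing the documented evidence that certification and audit procedures require.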


2021 ◽  
Author(s):  
Marek Suchánek ◽  
Herwig Mannaert ◽  
Peter Uhnák ◽  
Robert Pergl

Normalized Systems (NS) theory describes how to design and develop evolvable systems. It is applied in practice to generate enterprise information systems using NS Expanders from models of NS Elements. As there are various well-established modelling languages, the possibility to (re-)use them to create NS applications is desirable. This paper presents a mapping between the NS metamodel and the Ecore metamodel as a representative of essential structural modelling. The mapping is the basis of a transformation tool built on the Eclipse Modeling Framework and the NS Java libraries. Both the mapping and the tool are demonstrated in a concise case study that nevertheless covers all essential Ecore constructs. During the work, several interesting similarities between the two metamodels were found and are described, e.g., their meta-circularity and the ability to specify data types using references to Java classes. Still, there are significant differences between the metamodels that prevent some constructs from being mapped. The resulting information loss upon transformation is mitigated by incorporating additional options that serve as key-value annotations. The results are ready to be used on any Ecore model to create an NS model that can be expanded into an NS application.
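The shape of such a metamodel-to-metamodel mapping, including the key-value annotations used to avoid information loss, can be sketched with toy structures. The dictionaries below merely stand in for Ecore's EClass/EAttribute and for NS Elements; the real EMF and NS APIs differ.

```python
# Hedged sketch of a metamodel-to-metamodel mapping in the spirit of the
# paper. Toy dictionaries stand in for Ecore EClass/EAttribute and for NS
# data elements; names and structure are invented for illustration.

def map_eclass(eclass):
    """Map a toy EClass to a toy NS data element. Details without a direct
    NS counterpart are preserved as key-value options."""
    element = {
        "name": eclass["name"],
        "fields": [{"name": a["name"], "type": a["type"]}
                   for a in eclass["attributes"]],
        "options": {},
    }
    if eclass.get("abstract"):
        # No direct counterpart in this sketch: keep it as an annotation
        # so the transformation loses no information.
        element["options"]["ecore.abstract"] = "true"
    return element

person = {"name": "Person", "abstract": False,
          "attributes": [{"name": "age", "type": "EInt"}]}
print(map_eclass(person)["fields"][0]["type"])  # EInt
```

The annotation trick mirrors the paper's mitigation: constructs that cannot be mapped structurally still survive the round trip as options on the target element.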


2019 ◽  
Vol 6 (4) ◽  
pp. 307-318 ◽  
Author(s):  
Nathan C. Lindstedt

Sociologists frequently use language as data in their research, through methodologies including open-ended surveys, in-depth interviews, and content analyses. Unfortunately, the ability of researchers to analyze the growing amount of such data declines as the costs and time associated with the research process increase. Topic modeling is a computer-assisted technique that can help social scientists address these data challenges. Despite the central role of language in sociological research, to date the field has largely overlooked the promise of automated text analysis in favor of more familiar and more traditional methods. This article provides an overview of a topic modeling framework especially suited to social scientific research. By way of a case study using abstracts from the social movement studies literature, a short tutorial from data preparation through data analysis is given for the method of structural topic modeling. This example demonstrates how text analytics can be applied to research in sociology and encourages academics to consider such methods not merely as novel tools, but as useful supplements that can work alongside and enhance existing methodologies.
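Structural topic modeling itself is usually run with dedicated statistical software (e.g., the R `stm` package), but the data-preparation step common to all topic models can be sketched briefly: turning raw abstracts into a document-term matrix. The stopword list and example texts below are invented for illustration.

```python
# Hedged sketch of topic-model data preparation: building a document-term
# matrix from raw text. The model fitting itself is out of scope here.

from collections import Counter

STOPWORDS = {"the", "of", "and", "a", "in", "to"}  # illustrative list

def tokenize(text):
    """Lowercase, strip punctuation, drop stopwords."""
    words = "".join(ch if ch.isalnum() else " " for ch in text.lower()).split()
    return [w for w in words if w not in STOPWORDS]

def doc_term_matrix(docs):
    """Rows = documents, columns = sorted vocabulary, cells = term counts."""
    counts = [Counter(tokenize(d)) for d in docs]
    vocab = sorted(set().union(*counts))
    return vocab, [[c[w] for w in vocab] for c in counts]

abstracts = ["Protest movements and framing.",
             "Framing in social movements."]
vocab, dtm = doc_term_matrix(abstracts)
print(vocab)  # ['framing', 'movements', 'protest', 'social']
```

A structural topic model then fits topics to such a matrix while letting document-level covariates (e.g., publication year) influence topic prevalence, which is what distinguishes it from plain LDA.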

