compute model
Recently Published Documents


TOTAL DOCUMENTS: 20 (five years: 8)
H-INDEX: 3 (five years: 1)

2021 ◽  
Vol 6 (1) ◽  
pp. 36
Author(s):  
Deddy Kurniawansyah

The purpose of this study is to examine (1) how the stages of the production process for the “Sale Pisang Product” affect the determination of the selling price for Micro, Small, and Medium Enterprises in Banyuwangi, and (2) how the selling price computation model for the “Banana Sale Product” is able to determine a competitive selling price for Micro, Small, and Medium Enterprises in Banyuwangi. This study used survey, exploratory qualitative, and action research methods. Data were analysed using the triangulation method. The results show that the production cost of the “Sale Pisang Product” is still computed with a traditional method, so the resulting product cost is distorted and the selling price becomes uncompetitive. In contrast, a product cost computation model using Time Driven Activity Based Costing provides more accurate and informative input for Micro, Small, and Medium Enterprise actors when making decisions such as setting the selling price. The Time Driven Activity Based Costing method is able to reduce costs and increase the net income of each “sale pisang” variant. This study is expected to contribute to a competitive selling price for the “Sale Pisang Product” and to make the economic competitiveness of Kab. Banyuwangi more dynamic, based on the potential of natural resources and local wisdom.
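
As a rough illustration of the costing logic referred to above, the sketch below walks through a Time-Driven Activity-Based Costing calculation in Python: a capacity cost rate is derived from total cost and practical capacity, and each activity is costed by the time it consumes. All activity names, times, and cost figures are hypothetical and are not taken from the study.

```python
# Minimal, illustrative Time-Driven Activity-Based Costing (TDABC) sketch;
# every number and activity name below is hypothetical, not from the study.

# Step 1: capacity cost rate = cost of capacity supplied / practical capacity
total_department_cost = 12_000_000   # e.g. monthly overhead (hypothetical)
practical_capacity_minutes = 9_600   # available productive minutes per month

capacity_cost_rate = total_department_cost / practical_capacity_minutes

# Step 2: cost each activity by the time it consumes per batch of product
activities = {            # minutes per batch of "sale pisang" (hypothetical)
    "peeling_slicing": 30,
    "drying": 90,
    "frying": 45,
    "packaging": 20,
}

unit_overhead_cost = sum(
    capacity_cost_rate * minutes for minutes in activities.values()
)

direct_material_cost = 50_000         # hypothetical per batch
product_cost = direct_material_cost + unit_overhead_cost

# Step 3: derive a selling price from the computed cost, e.g. cost-plus markup
selling_price = product_cost * 1.25   # 25% markup (hypothetical)
print(f"product cost per batch: {product_cost:,.0f}; price: {selling_price:,.0f}")
```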


2021 ◽  
Vol 1 (3) ◽  
Author(s):  
Yumeng Wu ◽  
George Chiu

Abstract This paper proposes an improved model of the height profile for drop-on-demand printing of ultraviolet-curable ink. Unlike the previous model, the proposed model propagates volume and covered area based on the height difference between adjacent drops. The height profile is then calculated from the propagated volume and area. Measurements of two-drop and three-drop patterns are used to experimentally compute the model parameters. The parameters are used to predict and validate the height profiles of four and more drops in a straight line. Using the same root-mean-square (RMS) error as the benchmark, this model achieves a 5.9% RMS height profile error on four-drop lines. This represents a more than 60% reduction from the graph-based model and an improvement over our previous effort.
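
For context, the benchmark quoted above is a root-mean-square error between measured and predicted height profiles. A minimal sketch of that calculation follows; the arrays and the normalization by peak height are illustrative assumptions, not data or conventions from the paper.

```python
import numpy as np

# Illustrative RMS height-profile error between a measured and a predicted
# profile sampled at the same lateral positions; the values below are
# placeholders, not data from the paper.
measured = np.array([0.0, 4.8, 9.5, 10.1, 9.6, 4.9, 0.0])   # e.g. microns
predicted = np.array([0.0, 5.1, 9.2, 10.4, 9.3, 5.2, 0.0])

rms_error = np.sqrt(np.mean((predicted - measured) ** 2))

# One possible normalization: express the error relative to peak height
rms_error_pct = 100.0 * rms_error / measured.max()
print(f"RMS error: {rms_error:.3f} um ({rms_error_pct:.1f}% of peak height)")
```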


2021 ◽  
Vol 251 ◽  
pp. 02070
Author(s):  
Matthew Feickert ◽  
Lukas Heinrich ◽  
Giordon Stark ◽  
Ben Galewsky

In High Energy Physics, facilities that provide High Performance Computing environments offer an opportunity to efficiently perform the statistical inference required for analysis of data from the Large Hadron Collider, but they can pose problems with orchestration and efficient scheduling. The compute architectures at these facilities do not easily support the Python compute model, and the configuration and scheduling of batch jobs for physics often requires expertise in multiple job scheduling services. The combination of the pure-Python libraries pyhf and funcX reduces the common problem in HEP analyses of performing statistical inference with binned models, which would traditionally take multiple hours and bespoke scheduling, to an on-demand (fitting) “function as a service” that can scalably execute across workers in just a few minutes, offering reduced time to insight and inference. We demonstrate execution of a scalable workflow using funcX to simultaneously fit 125 signal hypotheses from a published ATLAS search for new physics using pyhf, with a wall time of under 3 minutes. We additionally show performance comparisons for other physics analyses with openly published probability models and argue for a blueprint of fitting-as-a-service systems at HPC centers.
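
A minimal sketch of the kind of single-fit task such a service would dispatch per signal hypothesis is shown below, using the public pyhf API (assuming a recent pyhf release); the bin counts are invented and do not correspond to the ATLAS analysis, and the funcX dispatch layer is omitted.

```python
import pyhf

# One inference task of the kind a fitting "function as a service" would run;
# the counts below are invented, not taken from the published ATLAS search.
model = pyhf.simplemodels.uncorrelated_background(
    signal=[12.0, 11.0],
    bkg=[50.0, 52.0],
    bkg_uncertainty=[3.0, 7.0],
)

# Observed main-measurement counts plus the auxiliary data for the constraints
data = [51.0, 48.0] + model.config.auxdata

# CLs hypothesis test for the nominal signal strength mu = 1
cls_obs, cls_exp = pyhf.infer.hypotest(
    1.0, data, model, test_stat="qtilde", return_expected=True
)
print(f"observed CLs = {float(cls_obs):.4f}, expected CLs = {float(cls_exp):.4f}")
```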


Author(s):  
Vikas Radhakrishna Deulgaonkar ◽  
M.S. Kulkarni ◽  
S.S. Khedkar ◽  
S.U. Kharosekar ◽  
V.U. Sadavarte

Self-weight and durability analysis of a non-air-conditioned sleeper bus has been carried out in the present work. Automotive industry standards (052 and 119) are used to freeze the bus dimensions. Generative surface design is used to prepare the compute model. For the self-weight analysis, the behaviour of the bus superstructure is simulated under load on the cant and waist rails. Bump analysis is carried out considering total failure of the suspension system. The behaviour of the bus during a bump is simulated for two situations: first, bump force is applied to the front-left wheel suspension location with all other suspension locations fixed; and second, force is applied to the two front wheel suspension locations with the two rear wheel suspension locations fixed. The behaviour of the bus under torsional load is simulated for two cases: in the first, force is applied upward at the left front suspension location and downward at the right suspension location while the rear wheel suspension points are fixed; in the second, force is applied upward at the left front suspension location while the second force is applied at the right rear suspension location. Braking and double-lane-change load conditions are simulated with a braking efficiency of 80% and a lateral load of magnitude 0.4 g. The durability of the bus is evaluated based on the outcomes of the braking, bump, torsional, and double-lane-change road-load situations. The stress and deflection magnitudes are in good agreement with the results available in the literature.


2020 ◽  
Vol 17 (8) ◽  
pp. 3581-3585
Author(s):  
M. S. Roobini ◽  
Selvasurya Sampathkumar ◽  
Shaik Khadar Basha ◽  
Anitha Ponraj

In the last decade, cloud computing has transformed the way in which we build applications. The boom in cloud computing helped develop new software designs and architectures, allowing developers to focus more on the business logic than on the infrastructure. In the FaaS (function as a service) compute model, developers concentrate only on the application code, and the rest of the factors are taken care of by the cloud provider. Here we present a serverless architecture for a web application built using AWS services and provide a detailed analysis of the Lambda function and microservice software design implemented using these AWS services.
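
As a minimal sketch of the FaaS pattern described above, the following Python handler shows the shape of a single AWS Lambda function behind an API Gateway proxy integration; the route, payload fields, and response body are illustrative and are not taken from the paper's application.

```python
import json

# Minimal AWS Lambda handler sketch for one microservice endpoint exposed
# through an API Gateway proxy integration; the payload fields and response
# body are illustrative placeholders.
def lambda_handler(event, context):
    # The proxy integration delivers the HTTP request body as a JSON string
    body = json.loads(event.get("body") or "{}")
    name = body.get("name", "world")

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```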


This paper presents the development of automatic control over batch-type production in the paint industry. The automation technique for paint manufacturing is applied with the help of a Programmable Logic Controller. In this work, the analytical compute model for optimizing the functions of the volume processing plant is capable of handling all possible decision variables in the setup. The synthesized function is used in a leading paint manufacturing plant. This plant is a multi-product, batch processing plant with a wide variety of products competing for various process apparatus at the manufacturing site. The plant under consideration works in accordance with the principle of "order-based" production. Therefore, this application can be classified as short-term scheduling of a real-case multi-product batch plant.


Author(s):  
Jesper N. Wulff

Researchers who model fractional dependent variables often need to consider whether their data were generated by a two-part process. Two-part models are ideal for modeling two-part processes because they allow us to model the participation and magnitude decisions separately. While community-contributed commands currently facilitate estimation of two-part models, no specialized command exists for fitting two-part models with process dependency. In this article, I describe generalized two-part fractional regression, which allows for dependency between the model's parts. I show how this model can be fit using the community-contributed cmp command (Roodman, 2011, Stata Journal 11: 159–206). I use a data example on the financial leverage of firms to illustrate how cmp can be used to fit a generalized two-part fractional regression. Furthermore, I show how to obtain predicted values of the fractional dependent variable and marginal effects that are useful for model interpretation. Finally, I show how to compute model fit statistics and perform the RESET test, which are useful for model evaluation.
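
For readers outside Stata, the sketch below shows a plain (independent) two-part fractional model in Python with statsmodels: a probit for participation and a fractional logit for the magnitude. It is only an illustration of the two-part idea on simulated data; it does not implement the generalized model with dependency between the parts that the article fits with cmp.

```python
import numpy as np
import statsmodels.api as sm

# Two-part model for a fractional outcome y in [0, 1] (e.g. leverage):
# Part 1: probit for participation (y > 0)
# Part 2: fractional logit (binomial GLM, logit link) for magnitude where y > 0
# The parts are fit independently here, unlike the article's generalized model.
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=(n, 2))
X = sm.add_constant(x)

# Simulated fractional outcome with a mass point at zero (hypothetical data)
latent = X @ np.array([0.2, 0.8, -0.5]) + rng.normal(size=n)
y = np.where(latent > 0, 1 / (1 + np.exp(-latent)), 0.0)

# Part 1: participation decision
participate = (y > 0).astype(float)
part1 = sm.Probit(participate, X).fit(disp=False)

# Part 2: magnitude decision, positive observations only
mask = y > 0
part2 = sm.GLM(y[mask], X[mask], family=sm.families.Binomial()).fit()

# Unconditional prediction E[y | x] = P(y > 0 | x) * E[y | y > 0, x]
pred = part1.predict(X) * part2.predict(X)
print(pred[:5])
```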


2019 ◽  
Author(s):  
Mahmoud Khairy ◽  
Mengchi Zhang ◽  
Roland Green ◽  
Simon David Hammond ◽  
Robert J. Hoekstra ◽  
...  

2018 ◽  
Vol 16 (05) ◽  
pp. 1850023 ◽  
Author(s):  
Keerthi S. Shetty ◽  
Annappa B

Many biochemical events involve multistep reactions. One of the most important biological processes that involves multistep reactions is the transcriptional process. Models for multistep reactions necessarily need multiple states, and it is a challenge to compute model parameters that best agree with experimental data. Therefore, the aim of this work is to design a multistep promoter model which accurately characterizes transcriptional bursting and is consistent with observed data. To address this issue, we develop a model for promoters with several OFF states and a single ON state using the Erlang distribution. To explore the combined effects of model and data, we combine a Monte Carlo extension of Expectation Maximization (MCEM) with the delay Stochastic Simulation Algorithm (DSSA) and call the resultant algorithm delay Bursty MCEM. We apply this algorithm to time-series data of the endogenous mouse glutaminase promoter to validate the model assumptions and infer the kinetic parameters. Our results show that with multiple OFF states, we are able to infer and produce a model which is more consistent with experimental data. Our results also show that delay Bursty MCEM inference is more efficient.
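
The Erlang assumption behind the multi-OFF-state promoter can be illustrated in a few lines: an OFF period composed of k sequential exponential steps has an Erlang (integer-shape gamma) duration. The sketch below samples such dwell times in Python; the number of OFF states and the transition rate are placeholders, and the delay Bursty MCEM inference itself is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

# An OFF period made of k sequential exponential sub-steps, each with rate
# lam, has an Erlang(k, lam) duration, i.e. a gamma distribution with integer
# shape k. The values below are placeholders, not inferred parameters.
k_off_states = 3      # number of sequential OFF states (hypothetical)
lam = 0.5             # rate of each OFF-to-next-state transition, per minute

# Direct sampling via the gamma equivalence
dwell_gamma = rng.gamma(shape=k_off_states, scale=1.0 / lam, size=100_000)

# Equivalent sampling as a sum of exponentials, which is what a stochastic
# simulation of the multistep promoter would effectively produce
dwell_sum = rng.exponential(scale=1.0 / lam, size=(100_000, k_off_states)).sum(axis=1)

print(dwell_gamma.mean(), dwell_sum.mean())   # both approach k/lam = 6 minutes
```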


2018 ◽  
Author(s):  
Matthias Morzfeld ◽  
Bruce A. Buffett

Abstract. We consider a stochastic differential equation model for Earth's axial magnetic dipole field. The model's parameters are estimated using diverse and independent data sources that had previously been treated separately. The result is a numerical model that is informed by the full paleomagnetic record on kyr to Myr time scales and whose outputs match data of Earth's dipole in a precisely defined feature-based sense. Specifically, we compute model parameters and associated uncertainties that lead to model outputs matching spectral data of Earth's axial magnetic dipole field, but our approach also reveals difficulties with simultaneously matching spectral data and reversal rates. This could be due to model deficiencies or inaccuracies in the limited amount of data. More generally, the approach we describe can be seen as an example of an effective strategy for combining diverse data sets that is particularly useful when the amount of data is limited.
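
The abstract does not state the specific form of the stochastic differential equation, so the sketch below integrates a generic scalar double-well SDE with the Euler-Maruyama scheme to illustrate how such a model can produce occasional sign reversals; the drift, noise amplitude, and time step are illustrative choices only.

```python
import numpy as np

rng = np.random.default_rng(2)

# Generic Euler-Maruyama integration of a scalar SDE
#   dx = f(x) dt + g(x) dW
# with a double-well drift, so the state occasionally flips sign, loosely
# mimicking dipole reversals. These choices are illustrative, not the
# authors' model.
def drift(x):
    return x - x**3          # double-well potential V(x) = x^4/4 - x^2/2

def diffusion(x):
    return 0.7               # constant noise amplitude (illustrative)

dt = 1e-2
n_steps = 200_000
x = np.empty(n_steps)
x[0] = 1.0                   # start near one stable polarity

for i in range(1, n_steps):
    dW = rng.normal(0.0, np.sqrt(dt))
    x[i] = x[i - 1] + drift(x[i - 1]) * dt + diffusion(x[i - 1]) * dW

# Count sign changes as a crude "reversal rate" diagnostic
reversals = np.count_nonzero(np.diff(np.sign(x)) != 0)
print(f"simulated reversals: {reversals}")
```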

