iteration model
Recently Published Documents

TOTAL DOCUMENTS: 29 (FIVE YEARS: 8)
H-INDEX: 5 (FIVE YEARS: 1)

Author(s):  
Hussein Alsteif ◽  
Murat Akkaya

Real-time prediction of hour-based order entry has been lacking in the literature. Compared with previous research on supply chain problems, our proposed approach overcomes the constraints of operations management at longer time scales, such as weekly and monthly horizons, by developing a novel iteration model. We performed experiments on 100 products with high cumulative volume over time. Using three different datasets, our proposed model proved efficient at forecasting skewed, noisy demand signals in supply chains.
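
The abstract does not describe the internals of the proposed iteration model, so the sketch below only illustrates the general setup it targets: a rolling, hour-by-hour, one-step-ahead forecast of noisy order counts, here using simple exponential smoothing on synthetic data. The function name, smoothing parameter, and data are assumptions for illustration, not the authors' method.

```python
# Illustrative sketch only: a generic rolling (iterative) one-step-ahead forecast
# for hourly demand using simple exponential smoothing. All names and parameters
# are hypothetical, not the paper's iteration model.
import numpy as np

def rolling_hourly_forecast(demand, alpha=0.3):
    """Iteratively forecast each hour from the previous forecast and observation."""
    forecasts = np.empty_like(demand, dtype=float)
    forecasts[0] = demand[0]  # initialize with the first observation
    for t in range(1, len(demand)):
        # Update step: blend the last observation with the last forecast.
        forecasts[t] = alpha * demand[t - 1] + (1 - alpha) * forecasts[t - 1]
    return forecasts

# Example on noisy, skewed hourly order counts (synthetic data, one week of hours)
rng = np.random.default_rng(0)
hourly_orders = rng.lognormal(mean=2.0, sigma=0.8, size=168)
preds = rolling_hourly_forecast(hourly_orders)
mae = np.mean(np.abs(preds[1:] - hourly_orders[1:]))
print(f"MAE over the week: {mae:.2f} orders/hour")
```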


2021 ◽  
pp. 004051752110471
Author(s):  
Yujuan Wang ◽  
Wengang Li ◽  
Jun Wang

In order to facilitate the design of a hybrid filament before spinning, a K-M (Kubelka-Munk) iteration model was proposed, based on the calculation method for the reflectance of a translucent object and intended to be used in conjunction with a fabric model that reflects the arrangement order of the monofilaments. The model can therefore calculate not only the color of each point on the fabric surface but also the mixed color of the fabric. Twenty fabrics were woven with five blending ratios of black and white monofilaments, four levels of multifilament fineness, and three fabric weave types. The relationship between the gray distribution of all points on the fabric surface captured by the camera in a DigiEye colorimeter and that calculated by the K-M iteration model was analyzed, and the color difference between the mixed color of the fabric measured by the Datacolor spectrophotometer and that calculated by the K-M iteration model was computed. The results show that the intersection distance and Pearson correlation coefficient between the gray histogram of the photographed fabric image and that of the calculated fabric image were 0.79 and 0.89, respectively. The average color difference obtained by the K-M iteration model was 0.92 Color Measurement Committee (2:1) units, the best result among the compared models. By examining the fabric structure parameters that cause the lightness difference, it was concluded that the calculated lightness was smaller than the measured lightness for fabrics with a longer float length, smaller multifilament fineness, and a larger black monofilament blending ratio.
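
As background for how a Kubelka-Munk calculation combines the contributions of black and white monofilaments, the sketch below applies the standard K-M relations (K/S from reflectance, additive K/S mixing, and the inverse transform back to reflectance) to a blend at a single wavelength. The reflectance values and blending ratios are assumed for illustration; this is not the authors' full iteration model, which also accounts for the fabric structure and monofilament arrangement.

```python
# Minimal Kubelka-Munk sketch (not the authors' full iteration model): standard K-M
# relations used to estimate the reflectance of a black/white monofilament blend
# from its blending ratio. Reflectance values are illustrative assumptions.
import math

def k_over_s(reflectance):
    """K/S from reflectance (0 < R <= 1) via the Kubelka-Munk equation."""
    return (1.0 - reflectance) ** 2 / (2.0 * reflectance)

def reflectance_from_ks(ks):
    """Invert K/S back to reflectance."""
    return 1.0 + ks - math.sqrt(ks ** 2 + 2.0 * ks)

def blend_reflectance(r_black, r_white, black_ratio):
    """Additive K/S mixing weighted by the black monofilament blending ratio."""
    ks_mix = black_ratio * k_over_s(r_black) + (1.0 - black_ratio) * k_over_s(r_white)
    return reflectance_from_ks(ks_mix)

# Example: assumed reflectances of 0.05 (black) and 0.80 (white) at one wavelength
for ratio in (0.0, 0.25, 0.5, 0.75, 1.0):
    r_mix = blend_reflectance(0.05, 0.80, ratio)
    print(f"black ratio {ratio:.2f} -> blend reflectance {r_mix:.3f}")
```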


PLoS ONE ◽  
2021 ◽  
Vol 16 (9) ◽  
pp. e0256633
Author(s):  
Catherine S. Jarnevich ◽  
Pairsa N. Belamaric ◽  
Kent Fricke ◽  
Mike Houts ◽  
Liza Rossi ◽  
...  

Habitat loss from land-use change is one of the top causes of declines in wildlife species of concern. As such, it is critical to assess and reassess habitat suitability as land cover and anthropogenic features change, both for monitoring and for developing current information to inform management decisions. However, there are obstacles that must be overcome to develop consistent assessments through time. A range-wide lek habitat suitability model for the lesser prairie-chicken (Tympanuchus pallidicinctus), currently under review by the U.S. Fish and Wildlife Service for potential listing under the Endangered Species Act, was published in 2016. This model was based on lek data from 2002 to 2012, land cover data ranging from 2001 to 2013, and anthropogenic features from circa 2011, and has been used to help guide lesser prairie-chicken management and anthropogenic development actions. We created a second iteration model based on new lek surveys (2015 to 2019) and updated predictors (2016 land cover and cleaned/updated anthropogenic data) to evaluate changes in lek suitability and to quantify current range-wide habitat suitability. Only three of 11 predictor variables were directly comparable between the iterations, making it difficult to assess whether predicted changes resulted from changes in model inputs or from actual landscape change. The second iteration model showed a positive relationship with land cover and a negative relationship with anthropogenic features similar to the first iteration, but exhibited more variation among candidate models. Range-wide, more suitable habitat was predicted in the second iteration. The Shinnery Oak Ecoregion, however, exhibited a loss in predicted suitable habitat that could be due to predictor source changes. Iterated models such as this are important to ensure current information is being used in conservation and development decisions.


Author(s):  
Sarah Welby ◽  
Mickael Cargnel ◽  
Claude Saegerman

Introduction: Despite eradication and control measures applied across Europe, bovine tuberculosis (bTB) remains a constant threat. In Belgium, after several years of bTB disease freedom status, routine movement testing as currently practiced has proved inadequate to detect some sporadic breakdown herds. The aim of this study was to strike a balance between the cost and effectiveness of different surveillance system components and to identify sustainable alternatives for early detection and substantiation of freedom from bTB, while maintaining acceptance among the different animal health stakeholders. Methods: A stochastic iteration model was built to simulate, first, the expected performance of the current surveillance system in terms of sensitivity and specificity of detection. These results were then descriptively compared with observed field results. Secondly, the cost and effectiveness of simulated alternative surveillance components were quantified. To measure the impact of key assumptions (i.e., regarding diagnostic tests and true prevalence), a sensitivity analysis was performed. Results: Discrepancies between the predicted and observed performance of bTB surveillance in Belgium were observed. The simulated alternatives revealed that targeted IFN-γ testing, as well as serological testing with an antibody ELISA, directed at risk herds would improve the overall cost-effectiveness of the Belgian bTB surveillance system. The sensitivity analysis showed that the results remained stable when key assumptions were modified. Discussion: The performance of the current bTB surveillance system in Belgium was questionable. This exercise highlighted that not only sensitivity but also specificity is a key driver of surveillance performance. The quantitative and participative conceptual framework proved to be a useful tool for evidence-based decision making regarding future tuberculosis surveillance in Belgium, as required by international standards.
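
As a rough illustration of what a stochastic iteration model of surveillance sensitivity and specificity can look like, the sketch below runs a simple Monte Carlo simulation of herd-level detection from assumed animal-level test characteristics and a within-herd design prevalence. All parameter values (herd size, test sensitivity/specificity, prevalence) are placeholders, not the values used in the study.

```python
# Hedged sketch: a simple stochastic (Monte Carlo) simulation of herd-level
# surveillance sensitivity and specificity. This is NOT the authors' model; all
# parameter values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)

N_ITER = 10_000                   # simulation iterations
HERD_SIZE = 100                   # animals tested per herd (assumed)
TEST_SE, TEST_SP = 0.80, 0.995    # assumed animal-level sensitivity / specificity
WITHIN_HERD_PREV = 0.05           # assumed within-herd design prevalence

def herd_positive(infected):
    """Return True if at least one animal in the herd tests positive."""
    n_infected = rng.binomial(HERD_SIZE, WITHIN_HERD_PREV) if infected else 0
    true_pos = rng.binomial(n_infected, TEST_SE)
    false_pos = rng.binomial(HERD_SIZE - n_infected, 1.0 - TEST_SP)
    return (true_pos + false_pos) > 0

herd_se = np.mean([herd_positive(True) for _ in range(N_ITER)])
herd_sp = 1.0 - np.mean([herd_positive(False) for _ in range(N_ITER)])
print(f"Simulated herd-level sensitivity: {herd_se:.3f}")
print(f"Simulated herd-level specificity: {herd_sp:.3f}")
```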


2019 ◽  
Vol 9 (18) ◽  
pp. 3667
Author(s):  
Jianlin Jiang ◽  
Jianguo Chen ◽  
Rongyue Zheng ◽  
Yan Zhou

The implementation process of construction projects is an iterative process of continuous modification and improvement among participant organizations. Traditional workflow analysis methods for a single organization are not suitable for the analysis of such implementation processes. Therefore, an interorganizational workflow analysis method based on organizational roles and their collaborative relationships is required. In this study, a role-based interorganizational workflow model for participant organizations is developed, assuming a loosely coupled form of interoperability among temporary multi-organizations. The Fuzzy Analytic Hierarchy Process (FAHP) is applied to determine the parameters of the correlation between interorganizational workflows, which include downstream sensitivity and the probability of change. Furthermore, according to the workflow interactions between organizations, an analysis model of interorganizational workflow is developed, using the Design Iteration Model as a reference, to analyze the time performance of participant organizations. Additionally, two forms of interorganizational workflow are compared and analyzed. Some suggestions are put forward to improve interorganizational workflow management and to reduce both the total time taken to complete the workflow processing of each organization (T) and the total time spent on the interorganizational workflow process (effort, E). This research may help strengthen interorganizational workflow management and enrich workflow modeling theory.
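
For readers unfamiliar with the Design Iteration Model referenced above, the sketch below shows one common formulation of design iteration, a work transformation (rework) matrix, in which total effort is the geometric series u0 + W u0 + W^2 u0 + ... = (I - W)^(-1) u0. The three organizations and the rework fractions are made-up values, not parameters from the study or from its FAHP step.

```python
# Hedged sketch of a design-iteration (work transformation) calculation: a rework
# matrix W gives the fraction of work each organization's output sends back to the
# others, and total effort follows from the geometric series. The matrix below is a
# made-up example for three participant organizations, not data from the study.
import numpy as np

# W[i, j]: fraction of organization j's completed work that triggers rework at i
W = np.array([
    [0.00, 0.20, 0.10],
    [0.15, 0.00, 0.25],
    [0.05, 0.10, 0.00],
])
u0 = np.ones(3)  # initial work packages, one unit per organization

# Closed form of u0 + W u0 + W^2 u0 + ... (valid when the spectral radius of W is
# below 1, i.e. the iteration converges)
u_total = np.linalg.solve(np.eye(3) - W, u0)
print("Total work per organization (initial work = 1):", np.round(u_total, 3))
print("Total effort E =", round(u_total.sum(), 3))
```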


Sensors ◽  
2019 ◽  
Vol 19 (14) ◽  
pp. 3122
Author(s):  
Qiming Wang ◽  
Tao Sun ◽  
Zhichao Lyu ◽  
Dawei Gao

As a crucial factor in monitoring the internal state of an engine, cylinder pressure is mainly used to monitor combustion efficiency, to detect engine faults, and to compute engine dynamics. Although intrusive cylinder pressure sensors have been greatly improved, they have been criticized by researchers for high cost, low reliability, and short life due to the severe working environment. Therefore, aiming at a low-cost, real-time, non-invasive, and high-accuracy solution, this paper presents a cylinder pressure identification method, also called a virtual cylinder pressure sensor, involving a Frequency-Amplitude Modulated Fourier Series (FAMFS) and an Extended-Kalman-Filter-optimized (EKF) engine model. The paper establishes an iterative speed model based on combustion theory and the law of energy conservation. An efficiency coefficient is used to represent the operating state of the engine from fuel to motion. The iterative speed model is associated with the throttle opening value and the crankshaft load. The EKF is used to estimate the optimal output of this iteration model, and that optimal output is used to compute, cycle by cycle, the frequency and amplitude of the cylinder pressure. A standard engine working cycle, identified by a 24th-order Fourier series, is determined. Using the frequency and amplitude obtained from the iteration model to modulate the Fourier series yields a complete pressure model. A commercial engine (EA211) provided by the China FAW Group corporate R&D center was used to verify the method. Test results show that this novel method possesses high accuracy and real-time capability, with a speed error below 9.6% and a cumulative cylinder pressure error below 1.8% when the A/F ratio coefficient is set to 0.85, and a speed error below 1.7% with a cumulative cylinder pressure error no more than 1.4% when it is set to 0.95. Thus, the novel method's accuracy and feasibility are verified.
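
A minimal sketch of the frequency-amplitude modulated Fourier series idea follows: a baseline pressure shape over one working cycle is captured by a truncated 24th-order Fourier series, and each cycle's trace is reconstructed by modulating that series in frequency (from the cycle frequency implied by engine speed) and in amplitude. The baseline shape, modulation factors, and example speed are synthetic placeholders, not the EA211 measurements or the paper's EKF-optimized model.

```python
# Hedged sketch of the FAMFS idea with synthetic placeholders.
import numpy as np

ORDER = 24
theta = np.linspace(0.0, 2.0 * np.pi, 720, endpoint=False)   # crank angle, one cycle

# Synthetic baseline pressure shape (peak near top dead center) standing in for a
# measured standard working cycle
baseline = 1.0 + 4.0 * np.exp(-((theta - np.pi) ** 2) / 0.1)

# Real Fourier coefficients of the baseline shape up to the 24th order
a0 = baseline.mean()
a = [2.0 * np.mean(baseline * np.cos(k * theta)) for k in range(1, ORDER + 1)]
b = [2.0 * np.mean(baseline * np.sin(k * theta)) for k in range(1, ORDER + 1)]

def pressure_vs_time(t, cycle_freq_hz, amp_factor=1.0):
    """Reconstruct pressure at times t: frequency modulation maps time to crank
    angle via the cycle frequency; amplitude modulation scales the trace."""
    th = 2.0 * np.pi * cycle_freq_hz * t
    series = a0 + sum(a[k - 1] * np.cos(k * th) + b[k - 1] * np.sin(k * th)
                      for k in range(1, ORDER + 1))
    return amp_factor * series

# Example: a cycle at 25 Hz (roughly 3000 rpm for a four-stroke engine) under a
# 15% higher load amplitude
t = np.linspace(0.0, 1.0 / 25.0, 720, endpoint=False)
p = pressure_vs_time(t, cycle_freq_hz=25.0, amp_factor=1.15)
print(f"Reconstructed peak pressure (arbitrary units): {p.max():.2f}")
```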


2018 ◽  
Vol 10 (6) ◽  
pp. 47
Author(s):  
Yan Bao ◽  
Frank Heilig ◽  
Chuo-Hsuan Lee ◽  
Edward J. Lusk

Bao, Lee, Heilig, and Lusk (2018) documented and illustrated the small-sample-size bias in Benford screening of datasets for Non-Conformity. However, their sampling plan tested only a few random sample-bundles from a core set of data that was clearly Conforming to the Benford first-digit profile. We extended their study using the same core datasets and DSS, the Newcomb Benford Decision Support Systems Profiler [NBDSSP], to create an expanded set of random samples from their core sample. Specifically, we took repeated random samples in blocks of 10, down to 5% of their core set of data in increments of 5%, and finished with random samples of 1%, 0.5% & 20, thus creating 221 sample-bundles. This arm focuses on the False Positive Signaling Error [FPSE], i.e., believing that the sampled dataset is Non-Conforming when it, in fact, comes from a Conforming set of data. The second arm used the Hill Lottery dataset, argued and tested as Non-Conforming; we used the same iteration model noted above to create a test of the False Negative Signaling Error [FNSE], i.e., the case in which the NBDSSP fails to detect Non-Conformity in the sampled datasets, incorrectly suggesting that the dataset is Conforming. We find that there is a dramatic point in the sliding sampling scale, at about 120 sampled points, where the FPSE first appears, i.e., where a Conforming state of nature is incorrectly flagged as Non-Conforming. Further, we find it very unlikely that the FNSE manifests itself for the Hill dataset. This demonstrates clearly that small datasets are indeed likely to create the FPSE, and that there should be little concern that Hill-type datasets will fail to be indicated as Non-Conforming. We offer a discussion of these results with implications for audits in the Big-Data context, where the audit In-charge may find it necessary to partition the datasets of the client.
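
For context on the kind of screening involved, the sketch below implements a basic Benford first-digit screen, not the NBDSSP itself: it computes the mean absolute deviation (MAD) of observed first-digit proportions from the Benford profile and flags Non-Conformity against a fixed cutoff (about 0.015, a commonly cited threshold). Applying it to shrinking random samples from an exactly Conforming synthetic population illustrates how small samples can trigger the false positive signal discussed above.

```python
# Hedged sketch (not the authors' NBDSSP): a MAD-based Benford first-digit screen
# applied to random sub-samples of decreasing size from a Benford-conforming
# population. The cutoff and the synthetic population are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)
BENFORD = np.log10(1.0 + 1.0 / np.arange(1, 10))   # expected first-digit proportions
MAD_CUTOFF = 0.015                                  # common "Non-Conformity" threshold

def first_digit(x):
    """First significant digit of positive numbers."""
    x = np.asarray(x, dtype=float)
    return (x / 10.0 ** np.floor(np.log10(x))).astype(int)

def benford_mad(values):
    """Mean absolute deviation of observed digit proportions from Benford's law."""
    counts = np.bincount(first_digit(values), minlength=10)[1:10]
    return np.mean(np.abs(counts / len(values) - BENFORD))

# Exactly Benford-conforming population: log-uniform significands
population = 10.0 ** rng.uniform(0.0, 3.0, 50_000)

for n in (5000, 1000, 500, 120, 50):
    flags = sum(benford_mad(rng.choice(population, n, replace=False)) > MAD_CUTOFF
                for _ in range(200))
    print(f"n = {n:5d}: flagged Non-Conforming in {flags}/200 random samples")
```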

