Analytical model for estimating the production and composition of gas resulted through gasification

2021 ◽  
Vol 286 ◽  
pp. 01011
Author(s):  
Lucian Mihăescu ◽  
Ionel Pîșă ◽  
Iulia Simion ◽  
Gabriel Paul Negreanu

Several models have been developed over the years for the modelling of gasification processes. Gasification calculation models of very high complexity, however, entail practical complications. Therefore, simpler mathematical representations of gasification characteristics and process behaviour are required as a first step in addressing such systems. This preliminary computational simplicity is needed from two perspectives: first, the pre-sizing of gasification installations, and second, the estimation of experimental or functional results. For such purposes, an adequate simplified model should be defined; to validate its results, more complex calculation models will subsequently be necessary. The model proposed in this paper addresses gasification with distributed air supply, considering an overall co-current flow of air and fuel. Previous successful investigations conducted by the present research team are taken into account in the model definition stages. Thus, the work presented here provides useful advances in the field of mathematical modeling of gasification processes. The originality of the model consists in its easy computational accessibility, which enables technological optimizations such as varying the excess air and the fuel composition.

Author(s):  
Michael Blondin ◽  
Javier Esparza ◽  
Stefan Jaax ◽  
Philipp J. Meyer

Population protocols are a well-established model of computation by anonymous, identical finite-state agents. A protocol is well-specified if, from every initial configuration, all fair executions of the protocol reach a common consensus. The central verification question for population protocols is the well-specification problem: deciding if a given protocol is well-specified. Esparza et al. have recently shown that this problem is decidable, but with very high complexity: it is at least as hard as the Petri net reachability problem, which is EXPSPACE-hard, and for which only algorithms of non-primitive recursive complexity are currently known. In this paper we introduce the class WS³ of well-specified strongly-silent protocols and we prove that it is suitable for automatic verification. More precisely, we show that WS³ has the same computational power as general well-specified protocols, and captures standard protocols from the literature. Moreover, we show that the membership and correctness problems for WS³ reduce to solving boolean combinations of linear constraints over ℕ. This allowed us to develop the first software able to automatically prove correctness for all of the infinitely many possible inputs.
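As an illustration of the model of computation, the sketch below simulates a standard four-state majority protocol (a running example in this line of work) under a uniformly random scheduler. The state names, transition rules, and scheduler are illustrative, not taken from the paper's formalism:

```python
import random

# Four-state majority protocol: Y/N are "active" yes/no agents,
# y/n are "passive" followers. Output: Y, y -> yes; N, n -> no.
RULES = {
    ("Y", "N"): ("y", "n"),   # opposing actives cancel each other out
    ("Y", "n"): ("Y", "y"),   # a surviving active converts passive opponents
    ("N", "y"): ("N", "n"),
    ("y", "n"): ("y", "y"),   # tie-breaking rule among passives
}
OUTPUT = {"Y": True, "y": True, "N": False, "n": False}

def step(config, rng):
    """One pairwise interaction between two randomly chosen agents."""
    i, j = rng.sample(range(len(config)), 2)
    a, b = config[i], config[j]
    if (a, b) in RULES:
        config[i], config[j] = RULES[(a, b)]
    elif (b, a) in RULES:
        config[j], config[i] = RULES[(b, a)]

def simulate(n_yes, n_no, steps=50_000, seed=1):
    """Run the protocol from an initial configuration and report outputs."""
    rng = random.Random(seed)
    config = ["Y"] * n_yes + ["N"] * n_no
    for _ in range(steps):
        step(config, rng)
    return [OUTPUT[s] for s in config]
```

With a clear yes-majority, e.g. `simulate(6, 3)`, every agent eventually outputs `True`; the protocol is silent in the sense of the paper once no rule is enabled.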


1998 ◽  
Vol 23 (6) ◽  
pp. 781-784 ◽  
Author(s):  
M. SALOM ◽  
J. E. AROCA ◽  
V. CHOVER ◽  
R. ALONSO ◽  
R. VILAR

We present 19 ray lengthenings in 14 patients performed with a small external fixator. In six cases the thumb was lengthened; in the other 13 cases, other digital rays. The most frequent reason for lengthening was an amputation sustained in a work accident. All the lengthenings were done by an osteotomy and subsequent gradual distraction with a small external fixator. The mean lengthening achieved was 20 mm (range, 0–32). An iliac-crest graft was needed in nine cases, a corrective osteotomy in five cases and a deepening of the web in the six cases of thumb lengthening. In five cases an additional technique was needed to achieve bony consolidation. We have analysed the functional results and the ability to perform activities of daily living and resume employment. Although most of the patients had multiple injuries, the results have been very favourable, achieving a very high level of patient satisfaction.


SINERGI ◽  
2020 ◽  
Vol 24 (3) ◽  
pp. 237
Author(s):  
Hadi Pranoto ◽  
Andi Adriansyah ◽  
Dafit Feriyanto ◽  
Abdi Wahab ◽  
Supaat Zakaria

In 2015, there were 55 deaths from 6,231 accident cases that occurred in Jakarta. A serious problem in Indonesia is the absence of a dedicated safety device in both commercial and personal vehicles, together with the very high complexity of human behaviour on highways. Consequently, many traffic accidents are caused by driver negligence, such as driving while drunk, tired, or drowsy, or at excessive speed. Innovative devices are therefore needed that can manage vehicle speed while also detecting how tired or sleepy the driver is. This paper proposes a concept for improving safety engineering by developing a device, named SLIFA, that can control the speed and safety level of trucks and buses. The proposed device captures the driver's condition by monitoring the eyes, the mouth opening, and the heart rate. These conditions are measured on a particular scale to determine the driver's fatigue level. Performance tests have been carried out on a truck and a bus with torque values of 122 Nm and 112 Nm and power values of 339 HP and 329 HP, respectively, at a minimum speed of 62 km/h. At a top speed of 70 km/h, the torque and power of the truck are 135 Nm and 370 HP, with an average fuel consumption of 3.43 liters/km before SLIFA installation and 4.2 liters/km after. SLIFA can be considered functionally eligible, with a before-to-after fuel-consumption ratio of about 81 percent.


There is a problematic difference between a human's interactions with natural environments and with computer environments. The negatives of this difference are particularly painful in the design of software-intensive systems, whose success rate is unpredictable and extremely low. The root cause of this state of affairs is the very high complexity with which designers have to deal. What we call "complexity" is a characteristic of estimations that is discovered in the interactions of humans with perceived essences. Designers therefore need means that will help them interact with the environments of their activity during real-time work. This chapter tries to show that one possible direction for mastering the complexity is bound up with the possibility for designers to create conditions of interaction that are similar to the conditions of natural interactions. In this case, both types of interactions will be intertwined and coordinated in search of simplifying the complexity that arises.


2012 ◽  
Vol 2012 ◽  
pp. 1-17
Author(s):  
Mingmin Zhu ◽  
Sanyang Liu

Learning Bayesian network (BN) structure from data is a typical NP-hard problem, and most existing algorithms have very high complexity when the number of variables is large. To address this problem, we present an algorithm for learning BN structures that integrates a decomposition-based approach with a scoring-function-based approach. First, the proposed algorithm decomposes the moral graph of the BN into its maximal prime subgraphs. It then orients the local edges in each subgraph by a K2-score greedy search. The last step combines the directed subgraphs to obtain the final BN structure. Theoretical and experimental results show that our algorithm can efficiently and accurately identify complex network structures from small data sets.
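The K2-scoring step of such a pipeline can be sketched as follows. This is a minimal, hypothetical illustration (the data layout, variable names, and two-parent cap are assumptions, not the paper's implementation): the K2 metric scores a child variable against a candidate parent set, and a greedy loop keeps adding whichever parent most improves the score:

```python
from math import lgamma
from collections import defaultdict

def k2_score(data, child, parents, arity):
    """Log K2 score of `child` given a candidate parent set.

    data: list of dicts mapping variable name -> discrete value.
    arity: dict mapping variable name -> number of states.
    """
    # counts[parent_config][child_value] = N_ijk
    counts = defaultdict(lambda: defaultdict(int))
    for row in data:
        counts[tuple(row[p] for p in parents)][row[child]] += 1
    r = arity[child]
    score = 0.0
    for child_counts in counts.values():
        n_ij = sum(child_counts.values())
        score += lgamma(r) - lgamma(n_ij + r)       # (r-1)! / (N_ij + r - 1)!
        for n_ijk in child_counts.values():
            score += lgamma(n_ijk + 1)              # product of N_ijk!
    return score

def greedy_parents(data, child, candidates, arity, max_parents=2):
    """K2-style greedy search: add the parent with the best score gain."""
    parents, best = [], k2_score(data, child, [], arity)
    while len(parents) < max_parents:
        gains = [(k2_score(data, child, parents + [c], arity), c)
                 for c in candidates if c not in parents]
        if not gains:
            break
        top, c = max(gains)
        if top <= best:         # no candidate improves the score: stop
            break
        parents.append(c)
        best = top
    return parents

# Tiny synthetic check: B copies A while C is uninformative, so A is chosen.
data = [{"A": i % 2, "B": i % 2, "C": 0} for i in range(40)]
arity = {"A": 2, "B": 2, "C": 1}
```

On this synthetic data `greedy_parents(data, "B", ["A", "C"], arity)` returns `["A"]`: the deterministic dependence on A raises the K2 score, while adding the constant C leaves it unchanged.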


1994 ◽  
Vol 04 (04) ◽  
pp. 423-455
Author(s):  
KEITH D. McCROAN ◽  
R. C. LACHER

A theory is introduced relating extrinsic colorings of complementary regions of an embedded graph to certain intrinsic colorings of the edges of the graph, called color cycles, that satisfy a certain self-consistency condition. A region coloring is lifted to an edge coloring satisfying the cycle condition, and a dual construction builds a region coloring from any color cycle and any embedding of the graph. Both constructs are canonical, and the constructions are information-conservative in the sense that lifting an arbitrary region coloring to a color cycle and then reconstructing a region coloring from the cycle, using the same embedding, results in the original region coloring. The theory is motivated by, and provides the proof of correctness of, new scan-conversion algorithms that are useful in settings where region boundaries have very high complexity. These algorithms have been implemented and provide useful display functionality previously unavailable on certain raster devices.


2021 ◽  
Vol 9 ◽  
Author(s):  
Tanya Strydom ◽  
Giulio V. Dalla Riva ◽  
Timothée Poisot

Quantifying the complexity of ecological networks has remained elusive. Primarily, complexity has been defined on the basis of the structural (or behavioural) complexity of the system. These definitions ignore the notion of “physical complexity,” which can measure the amount of information contained in an ecological network, and how difficult it would be to compress. We present relative rank deficiency and SVD entropy as measures of “external” and “internal” complexity, respectively. Using bipartite ecological networks, we find that they all show a very high, almost maximal, physical complexity. Pollination networks, in particular, are more complex when compared to other types of interactions. In addition, we find that SVD entropy relates to other structural measures of complexity (nestedness, connectance, and spectral radius), but does not inform about the resilience of a network when using simulated extinction cascades, which has previously been reported for structural measures of complexity. We argue that SVD entropy provides a fundamentally more “correct” measure of network complexity and should be added to the toolkit of descriptors of ecological networks moving forward.
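SVD entropy as described can be computed from the singular-value spectrum of a network's incidence matrix. The sketch below assumes the common Pielou-normalised definition (Shannon entropy of the normalised singular values, divided by the log of their number); the paper's exact normalisation may differ, and the tiny hand-rolled eigensolver is adequate only for small demo matrices:

```python
from math import log, sqrt

def gram(A):
    """Aᵀ·A of a matrix given as a list of rows."""
    n = len(A[0])
    return [[sum(row[i] * row[j] for row in A) for j in range(n)]
            for i in range(n)]

def eigenvalues_sym(M, iters=500):
    """Eigenvalues of a small symmetric PSD matrix via power iteration
    with deflation -- fine for the tiny matrices used here."""
    n = len(M)
    M = [row[:] for row in M]
    vals = []
    for k in range(n):
        v = [0.1] * n
        v[k] = 1.0                       # vary the start vector per round
        for _ in range(iters):
            w = [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
            norm = sqrt(sum(x * x for x in w))
            if norm < 1e-12:             # remaining matrix is numerically zero
                return vals + [0.0] * (n - len(vals))
            v = [x / norm for x in w]
        lam = sum(v[i] * sum(M[i][j] * v[j] for j in range(n)) for i in range(n))
        vals.append(lam)
        for i in range(n):               # deflate: M -= lam * v * vᵀ
            for j in range(n):
                M[i][j] -= lam * v[i] * v[j]
    return vals

def svd_entropy(A):
    """Pielou-normalised entropy of the singular values of incidence matrix A."""
    if len(A) < len(A[0]):               # transpose so the Gram matrix is small
        A = [list(col) for col in zip(*A)]
    sigmas = [sqrt(max(lam, 0.0)) for lam in eigenvalues_sym(gram(A))]
    total = sum(sigmas)
    ps = [s / total for s in sigmas if s / total > 1e-12]
    if len(ps) <= 1:
        return 0.0                       # a rank-one spectrum carries no entropy
    return -sum(p * log(p) for p in ps) / log(len(sigmas))
```

An identity incidence matrix (every singular value equal) gives the maximal entropy 1, while a rank-one, fully "compressible" matrix gives 0, matching the intuition of physical complexity in the abstract.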


Author(s):  
PATRICE FRISON ◽  
FRANÇOIS CHAROT ◽  
ERIC GAUTRIN ◽  
DOMINIQUE LAVENIER ◽  
PATRICE QUINTON ◽  
...  

Advances in VLSI technology make it possible to realize systems of very high complexity in a small volume of hardware. In many application fields, it is necessary to implement certain algorithms, or even complete information processing systems, directly in silicon. Application domains likely to benefit include signal processing and scientific computing. In this paper, we consider several steps which we believe to be essential in the design path of a special-purpose architecture, and we present methodologies for achieving design requirements. These solutions are based on experience gathered in the Parallel VLSI Architecture group at IRISA.


Author(s):  
R. Piepereit ◽  
A. Schilling ◽  
N. Alam ◽  
M. Wewetzer ◽  
M. Pries ◽  
...  

Especially in the field of numerical simulations, such as flow and acoustic simulations, interest in using virtual 3D models to optimize urban systems is increasing. The few instances in which such simulations have already been carried out in practice involved an extremely high manual, and therefore uneconomical, effort to process the models. The different ways in which models are captured in Geographic Information Systems (GIS) and Computer Aided Engineering (CAE) further increase the already very high complexity of the processing. To obtain virtual 3D models suitable for simulation, we developed a tool for automatic processing with the goal of establishing ties between the worlds of GIS and CAE. In this paper we introduce a way to use Coons surfaces for the automatic processing of building models in LoD2, and we investigate ways to simplify LoD3 models in order to remove information that is unnecessary for a numerical simulation.
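A bilinearly blended Coons patch, the classical construction behind the surfaces mentioned above, fills the interior of four boundary curves. The sketch below is a generic textbook formulation, not the authors' tool:

```python
def coons_point(c_bottom, c_top, c_left, c_right, u, v):
    """Evaluate a bilinearly blended Coons patch at (u, v) in [0, 1]^2.

    The four arguments are boundary curves returning (x, y, z) points:
    c_bottom(u) and c_top(u) run left to right, c_left(v) and c_right(v)
    run bottom to top, and they must agree at the four corners.
    """
    def lerp(p, q, t):
        return tuple((1 - t) * a + t * b for a, b in zip(p, q))

    # Two ruled surfaces, each blending one pair of opposite boundaries.
    ruled_uv = lerp(c_bottom(u), c_top(u), v)
    ruled_vu = lerp(c_left(v), c_right(v), u)
    # Bilinear interpolation of the four corner points.
    corners = lerp(lerp(c_bottom(0), c_bottom(1), u),
                   lerp(c_top(0), c_top(1), u), v)
    # Coons formula: sum of the ruled surfaces minus the bilinear part.
    return tuple(a + b - c for a, b, c in zip(ruled_uv, ruled_vu, corners))
```

By construction the patch reproduces each boundary exactly (e.g. at v = 0 it returns c_bottom(u)), which is why such surfaces can stitch GIS building facades into a watertight CAE-ready model.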

