Rainfall Infiltration Modeling: A Review

Water ◽  
2018 ◽  
Vol 10 (12) ◽  
pp. 1873 ◽  
Author(s):  
Renato Morbidelli ◽  
Corrado Corradini ◽  
Carla Saltalippi ◽  
Alessia Flammini ◽  
Jacopo Dari ◽  
...  

Infiltration of water into soil is a key process in various fields, including hydrology, hydraulic works, agriculture, and the transport of pollutants. Because it depends on rainfall and soil characteristics as well as on initial and boundary conditions, which can be very complex, an exhaustive understanding of infiltration and its mathematical representation can be challenging. Over the last decades, significant research effort has been expended to enhance the seminal contributions of Green, Ampt, Horton, Philip, Brutsaert, Parlange, and many other scientists. This review paper retraces some important milestones that led to the definition of basic mathematical models, both at the local and field scales. Some open problems, especially those involving the vertical and horizontal inhomogeneity of soils, are explored. Finally, rainfall infiltration modeling over surfaces with significant slopes is also discussed.
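As a concrete instance of the classical models the review revisits, Horton's equation describes an infiltration capacity that decays exponentially from an initial to a steady rate. The Python sketch below evaluates it; the parameter values are illustrative only, not taken from the paper.

```python
import numpy as np

def horton_infiltration(t, f0=75.0, fc=10.0, k=2.0):
    """Horton infiltration capacity f(t) = fc + (f0 - fc) * exp(-k * t).

    f0: initial capacity (mm/h), fc: steady final capacity (mm/h),
    k: decay constant (1/h). Parameter values are illustrative only.
    """
    return fc + (f0 - fc) * np.exp(-k * t)

print(horton_infiltration(np.linspace(0.0, 3.0, 7)))  # capacity over 3 h
```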

2021 ◽  
Vol 17, Issue 3 ◽
Author(s):  
Thomas Place ◽  
Marc Zeitoun

The dot-depth hierarchy of Brzozowski and Cohen classifies the star-free languages of finite words. By a theorem of McNaughton and Papert, these are also the first-order definable languages. The dot-depth rose to prominence following the work of Thomas, who proved an exact correspondence with the quantifier alternation hierarchy of first-order logic: each level in the dot-depth hierarchy consists of all languages that can be defined with a prescribed number of quantifier blocks. One of the most famous open problems in automata theory is to settle whether the membership problem is decidable for each level: is it possible to decide whether an input regular language belongs to a given level? Despite a significant research effort, membership by itself has only been solved for low levels. A recent breakthrough was achieved by replacing membership with a more general problem: separation. Given two input languages, one has to decide whether there exists a third language in the investigated level that contains the first language and is disjoint from the second. The motivation is that (1) while more difficult, separation is more rewarding; (2) it provides a more convenient framework; and (3) all recent membership algorithms are reductions to separation for lower levels. We present a separation algorithm for dot-depth two. While this is our most prominent application, our result is more general. We consider a family of hierarchies that includes the dot-depth: concatenation hierarchies. They are built via a generic construction process. One first chooses an initial class, the basis, which is the lowest level in the hierarchy. Further levels are built by applying generic operations. Our main theorem states that for any concatenation hierarchy whose basis is finite, separation is decidable for level one. In the special case of the dot-depth, this can be lifted to level two using previously known results.
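Formally, restating the definition above: given a class C and input regular languages L1 and L2, C-separation asks whether some K ∈ C satisfies L1 ⊆ K and K ∩ L2 = ∅. Membership is the special case in which L2 is the complement of L1, since a language belongs to C exactly when it separates itself from its own complement.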


1987 ◽  
Vol 92 ◽  
pp. 3-21
Author(s):  
George W. Collins

In this paper I shall examine the use and misuse of some astronomical terminology as it is commonly found in the literature. The incorrect usage of common terms, and sometimes the terms themselves, can lead to confusion for the reader and may well indicate misconceptions on the part of the authors. A basic definition of the Be phenomenon is suggested, and other stellar characteristics whose interpretation may change when applied to non-spherical stars are discussed. Special attention is paid to a number of terms whose semantic nature is misleading when applied to the phenomena they are intended to represent. The use of model-dependent terms is discussed and some comments are offered which are intended to improve the clarity of communication within the subject.


2018 ◽  
Vol 42 (4) ◽  
pp. 361 ◽  
Author(s):  
Richard Olley ◽  
Andrea Morales

Objective: Dementia is one of the most common illnesses worldwide and one of the most important causes of disability in older people. Dementia currently affects over 35 million people around the globe, a number expected to increase to 65.7 million by 2030. Early detection, diagnosis and treatment to control the principal behavioural symptoms may help reduce these numbers and delay progression to the more advanced and dangerous stages of the disorder, with a resultant increase in quality of life for those affected. The main goal of the present systematic literature review was to examine contemporary evidence relating to non-pharmacological therapy in the treatment of dementia.

Methods: To achieve the study goal, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement was used.

Results: This study identified the five most common behaviours in patients with dementia as aggression, wandering, agitation, apathy and sleep disturbances. Two non-pharmacological therapies were the most studied treatments: music therapy and aromatherapy. Ten other non-pharmacological therapies were also identified, but these lack a sufficient evidence base.

Conclusion: Although all the therapies identified could be used as part of the treatment of behavioural symptoms, there is insufficient evidence relating to the indications, appropriate use and effectiveness of these therapies for each behavioural treatment. Thus, the present study has demonstrated a significant research gap.

What is known about the topic? Despite the widespread use of many different types of therapies, there is limited evidence regarding the efficacy of non-pharmaceutical therapies deployed in the management of behaviours of concern manifested by some people who suffer from dementia in all its forms.

What does this paper add? This systematic review examines contemporary evidence from the literature to determine whether there is an evidence base available that would underpin the use of these therapies. This report on a PRISMA systematic review of the available literature demonstrates that only two therapies have some evidence to underpin their use, and that a significant research gap exists.

What are the implications for practitioners? Significant research effort is required to determine the efficacy of many of the therapies that are currently deployed; many of them lack an evidence base at this time.


Author(s):  
Dang Thi Thu Hien ◽  
Hoang Xuan Huan ◽  
Le Xuan Minh Hoang

Radial Basis Function (RBF) neural networks are widely applied in multivariate function regression. However, selecting the number of hidden-layer neurons and defining suitable centres so as to produce a good regression network remain open problems studied by many researchers. This article proposes using equally spaced grid nodes as the centres of the hidden layer. The authors then use the k-nearest-neighbour method to define the value of the regression function at each centre, and train the network with an interpolation RBF training algorithm for equally spaced nodes. Experiments show the outstanding efficiency of the resulting regression function when the training data contain Gaussian white noise.
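A minimal sketch of the scheme described above, assuming one input dimension and Gaussian basis functions (the paper's exact training algorithm is not reproduced here): equally spaced grid nodes serve as centres, k-nearest neighbours estimate the target at each centre, and a plain interpolation solve fits the weights.

```python
import numpy as np

# Noisy 1-D training data (Gaussian white noise), illustrative only.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2.0 * np.pi, 200)
y = np.sin(x) + rng.normal(0.0, 0.1, x.shape)

# Equally spaced grid nodes as RBF centres.
centres = np.linspace(0.0, 2.0 * np.pi, 15)

# k-nearest-neighbour estimate of the regression value at each centre.
k = 5
targets = np.array([y[np.argsort(np.abs(x - c))[:k]].mean() for c in centres])

sigma = centres[1] - centres[0]  # width tied to grid spacing

def phi(a, b):
    # Gaussian basis: pairwise kernel matrix between points a and centres b.
    return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2.0 * sigma**2))

# Interpolation RBF network: solve Phi w = targets exactly at the centres.
w = np.linalg.solve(phi(centres, centres), targets)

query = np.array([0.5, 1.5, 3.0])
print(phi(query, centres) @ w)   # smoothed estimates of sin(query)
```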


Author(s):  
Lew Gordeev ◽  
Edward Hermann Haeusler

We upgrade [3] to a complete proof of the conjecture NP = PSPACE, known as one of the fundamental open problems in the mathematical theory of computational complexity; this proof is based on [2]. Since minimal propositional logic is known to be PSPACE-complete, while PSPACE is known to include NP, it suffices to show that every valid purely implicational formula ρ has a proof whose weight (= total number of symbols) and the time complexity of whose verification are both polynomial in the weight of ρ. As in [3], we use a proof-theoretic approach. Recall that in [3] we considered any valid ρ in question that had (by the definition of validity) a "short" tree-like proof π in the Hudelmaier-style cut-free sequent calculus for minimal logic. The "shortness" means that the height of π and the total weight of the different formulas occurring in it are both polynomial in the weight of ρ. However, the size (= total number of nodes), and hence also the weight, of π could be exponential in that of ρ. To overcome this trouble we embedded π into Prawitz's proof system of natural deductions, which contains single formulas instead of sequents. As in π, the height and the total weight of different formulas of the resulting tree-like natural deduction ∂1 were polynomial, although the size of ∂1 could still be exponential, in the weight of ρ. In our next, crucial move, ∂1 was deterministically compressed into a "small", although multipremise, dag-like deduction ∂ whose horizontal levels contained only mutually different formulas, which made the whole weight polynomial in that of ρ. However, ∂ required a more complicated verification of the underlying provability of ρ. In this paper we present a nondeterministic compression of ∂ into a desired standard dag-like deduction ∂0 that deterministically proves ρ in time and space polynomial in the weight of ρ. Together with [3] this completes the proof of NP = PSPACE. Natural deductions are essential for our proof. Tree-to-dag horizontal compression of π merging equal sequents, instead of formulas, is possible but not sufficient, since the total number of different sequents in π might be exponential in the weight of ρ, even assuming that all formulas occurring in sequents are subformulas of ρ. On the other hand, we need Hudelmaier's cut-free sequent calculus in order to control both the height and the total weight of different formulas of the initial tree-like proof π, since standard Prawitz normalization, although providing natural deductions with the subformula property, does not preserve polynomial heights. It is not yet clear whether we can omit references to π even in the proof of the weaker result NP = coNP.


Author(s):  
Gopalakrishnan T.R. Nair ◽  
Selvarani R

As object-oriented programming languages and development methodologies moved forward, significant research effort was spent on defining specific approaches and building models for quality based on object-oriented measurements. Software metrics research and practice have helped build an empirical basis for software engineering. Software developers require objective and valid measurement schemes for evaluating and improving product quality from the initial stages of development. Measuring the structural design properties of a software system, such as coupling, inheritance, cohesion, and complexity, is a promising approach that can lead to early quality assessment. Class code and class diagrams are the key artifacts in the development of object-oriented (OO) software; they constitute the backbone of OO development and provide a solid foundation for the design and development of software, with a great influence over the system that is implemented. This chapter presents an elaborate survey of existing relevant work on class code and class diagram metrics. A critical review of the existing work is carried out in order to identify the lessons learnt about how these studies are performed and reported, facilitating the development of an empirical body of knowledge. Classical approaches based on statistics alone do not provide managers and developers with a decision-support scheme for risk assessment and cost reduction. One of the future challenges is to use software metrics in a way that creatively addresses and handles the key objectives of risk assessment and the estimation of external quality factors of the software.
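As a hedged illustration of code-level design measurement (a simple sketch of ours, not the specific metric suite surveyed in the chapter), the Python snippet below parses a class and reports two elementary structural indicators: the number of methods and a crude count of branch points.

```python
import ast

source = """
class Account:
    def __init__(self, owner):
        self.owner = owner
        self.balance = 0

    def deposit(self, amount):
        self.balance += amount

    def withdraw(self, amount):
        if amount <= self.balance:
            self.balance -= amount
"""

tree = ast.parse(source)
for node in ast.walk(tree):
    if isinstance(node, ast.ClassDef):
        methods = [n for n in node.body if isinstance(n, ast.FunctionDef)]
        # Branch points (if/for/while) as a crude complexity indicator.
        branches = sum(isinstance(n, (ast.If, ast.For, ast.While))
                       for n in ast.walk(node))
        print(f"{node.name}: methods={len(methods)}, branches={branches}")
```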


Sensors ◽  
2018 ◽  
Vol 18 (12) ◽  
pp. 4313 ◽  
Author(s):  
Xunjia Zheng ◽  
Di Zhang ◽  
Hongbo Gao ◽  
Zhiguo Zhao ◽  
Heye Huang ◽  
...  

Over the past decades, significant research effort has been dedicated to the development of intelligent vehicles and V2X systems. This paper proposes a road traffic risk assessment method for road traffic accident prevention in intelligent vehicles. The method is based on a Hidden Markov Model (HMM) and is applied to the prediction of steering angle status to (1) evaluate the probabilities of the steering angle lying in each independent interval and (2) calculate the road traffic risk in different analysis regions. According to the model, the road traffic risk is quantified and presented directly in visual form by a time-varying risk map, to ensure the accuracy of assessment and prediction. Experimental results are presented and show the effectiveness of the assessment strategies.
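A minimal sketch of the HMM machinery involved, assuming three illustrative steering-angle intervals as hidden states (the transition and observation matrices below are invented for illustration, not taken from the paper): the forward pass yields the probability of each interval given the observations, from which a one-step-ahead prediction follows.

```python
import numpy as np

# Discretised steering-angle intervals as hidden states (illustrative).
A = np.array([[0.8, 0.2, 0.0],    # state transition probabilities
              [0.1, 0.8, 0.1],
              [0.0, 0.2, 0.8]])
B = np.array([[0.7, 0.2, 0.1],    # observation likelihoods per state
              [0.2, 0.6, 0.2],
              [0.1, 0.2, 0.7]])
pi = np.array([1/3, 1/3, 1/3])    # uniform prior over intervals

def forward_filter(obs):
    """HMM forward pass: P(state_t | obs_1..t), normalised at each step."""
    alpha = pi * B[:, obs[0]]
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        alpha /= alpha.sum()
    return alpha

belief = forward_filter([0, 0, 1, 1, 2])   # observed interval indices
predicted = belief @ A                     # one-step-ahead interval probabilities
print(predicted)
```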


Author(s):  
Chris D. Nugent ◽  
Dewar D. Finlay ◽  
Mark P. Donnelly ◽  
Norman D. Black

Electrical forces generated by the heart are transmitted to the skin through the body’s tissues. These forces can be recorded on the body’s surface and are represented as an electrocardiogram (ECG). The ECG can be used to detect many cardiac abnormalities. Traditionally, ECG classification algorithms have used rule-based techniques in an effort to model the thought and reasoning process of the human expert. However, the definition of an ultimate rule set for cardiac diagnosis has remained somewhat elusive, and much research effort has been directed at data-driven techniques. Neural networks have emerged as a strong contender, as the highly non-linear and chaotic nature of the ECG makes it a well-suited application for this technique. This study presents an overview of the application of neural networks in the field of ECG classification and, in addition, some preliminary results of adaptations of conventional neural classifiers. From this work, it is possible to highlight issues that will affect the acceptance of this technique and to identify challenges for the future. These challenges lie in the intelligent processing of the larger amounts of ECG information that may be generated by recording techniques such as body surface potential mapping.
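As a hedged sketch of the data-driven approach described (synthetic feature vectors and labels stand in for real per-beat ECG measurements; this is not the study's classifier), a small feed-forward network can be trained as a binary beat classifier:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for ECG feature vectors (e.g. per-beat measurements).
rng = np.random.default_rng(1)
X = rng.normal(size=(400, 12))
y = rng.integers(0, 2, size=400)   # 0 = normal, 1 = abnormal (illustrative)

# One hidden layer of 32 units; a real study would tune this architecture.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=1)
clf.fit(X, y)
print(clf.predict(X[:5]))
```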


2007 ◽  
Vol 35 (4) ◽  
pp. 336-360 ◽  
Author(s):  
G. Dimitriadis ◽  
G. A. Vio

The identification of nonlinear dynamic systems is becoming a necessary part of vibration testing, and significant research effort is devoted to it. However, as current methodologies are still not suitable for the identification of general nonlinear systems, the subject is rarely introduced to undergraduate students. In this paper, recent progress in developing an expert approach to the identification of nonlinear systems is used to demonstrate the subject within the context of an undergraduate course, or as an introductory tool for postgraduate students. The demonstration is based around a software package implementing an expert system designed to systematically apply a wide range of identification approaches to the system under investigation. It is shown that the software can be used to demonstrate the need for identification of nonlinear systems, the complexity of the procedure, the possibility of failure, and the good chances of success when enough physical information about the system is available.
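One identification approach of the kind such a package might apply is a restoring-force least-squares fit; this specific example is ours, not the software's. Given synthetic Duffing-oscillator data, the damping and the linear and cubic stiffness coefficients are recovered by linear regression:

```python
import numpy as np

# Synthetic response of a Duffing oscillator: m*a + c*v + k*x + k3*x^3 = f(t).
rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 2000)
x = np.sin(2 * t) + 0.3 * np.sin(5 * t)        # assumed measured displacement
v = np.gradient(x, t)                          # velocity by differentiation
a = np.gradient(v, t)                          # acceleration
m, c, k, k3 = 1.0, 0.1, 4.0, 0.5               # "true" parameters (illustrative)
f = m * a + c * v + k * x + k3 * x**3 + rng.normal(0.0, 0.01, t.shape)

# Identify c, k, k3 from f - m*a = c*v + k*x + k3*x^3 (m assumed known).
Theta = np.column_stack([v, x, x**3])
coef, *_ = np.linalg.lstsq(Theta, f - m * a, rcond=None)
print(coef)   # approximately [0.1, 4.0, 0.5]
```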


2018 ◽  
Vol 7 (2.8) ◽  
pp. 436
Author(s):  
Prakhar Agarwal ◽  
Shivani Jain

The Semantic Web is an extension of the existing web that allows well-defined expression of the meaning of information, understandable by both computers and people. This paper is a review study of the Semantic Web. The Semantic Web is a development project recommended by the W3C (World Wide Web Consortium) that focuses on enhancing information search by keeping facts in structured form using the eXtensible Markup Language (XML), marked up in such a way that systems can understand it. To make the development of the Semantic Web feasible, a new international standard for exchanging ontologies was developed, called the OWL Web Ontology Language. XML alone provides tags and stores data in a hierarchy without its meaning, which is why a computer cannot process the data semantically; on the Semantic Web, a user can provide definitions so that the computer can better recognize the meaning of data and display information accordingly. A crux of the Semantic Web is that it works on the definition of ontologies, which are responsible for the re-usability and sharing of information. The Semantic Web provides a shared language that stores data in an open-ended linking of distinct databases holding data about real-world objects. RDF is the common language of the Semantic Web, responsible for collecting data on the web and assembling databases from diverse sources, while SPARQL links those databases to unify documents. Thus, the Semantic Web is a well-structured web of data that relates all the data present on the web and interprets it to provide exactly the display requested by the end user.
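A minimal illustration of the RDF and SPARQL roles described above, using the Python rdflib library; the example namespace and triples are invented for illustration, not part of any standard ontology.

```python
from rdflib import Graph, Namespace, RDF

# Build a tiny RDF graph (namespace and triples invented for illustration).
EX = Namespace("http://example.org/")
g = Graph()
g.add((EX.alice, RDF.type, EX.Person))
g.add((EX.alice, EX.knows, EX.bob))
g.add((EX.bob, RDF.type, EX.Person))

# SPARQL query: which Person does alice know?
q = """
PREFIX ex: <http://example.org/>
SELECT ?who WHERE { ex:alice ex:knows ?who . ?who a ex:Person . }
"""
for row in g.query(q):
    print(row.who)   # -> http://example.org/bob
```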

