Model parsimony
Recently Published Documents


TOTAL DOCUMENTS: 9 (five years: 3)

H-INDEX: 5 (five years: 1)

2020
Author(s): Elnaz Azmi, Uwe Ehret, Steven V. Weijs, Benjamin L. Ruddell, Rui A. P. Perdigão

Abstract. One of the main objectives of the scientific enterprise is the development of parsimonious yet well-performing models for all natural phenomena and systems. In the 21st century, scientists usually represent their models, hypotheses, and experimental observations using digital computers. Measuring performance and parsimony for computer models is therefore a key theoretical and practical challenge for 21st century science. The basic dimensions of computer model parsimony are descriptive complexity, i.e. the length of the model itself, and computational complexity, i.e. the model's effort to produce output. Descriptive complexity is related to inference quality and generality, and Occam's razor advocates minimizing it; computational complexity is a practical and economic concern wherever computing resources are limited. Both complexities measure facets of the phenomenological or natural complexity of the process or system being observed, analysed, and modelled. This paper presents a practical technique for measuring the computational complexity of a digital dynamical model and its performance, bit by bit. Computational complexity is measured by the average number of memory visits per simulation time step, in bits, and model performance is expressed by its inverse, information loss, measured by the conditional entropy of observations given the corresponding model predictions, also in bits. We demonstrate this technique by applying it to a variety of watershed models representing a wide diversity of modelling strategies, including artificial neural network, auto-regressive, simple and more advanced process-based models, and both approximate and exact restatements of the experimental observations. Comparing the models revealed that the auto-regressive model offers a favourable trade-off, with high performance and low computational complexity, whereas the neural network and high-time-frequency conceptual bucket models offer an unfavourable trade-off, with low performance and high computational complexity. We conclude that the bit-by-bit approach is a practical way of evaluating models in terms of performance and computational complexity, both in the universal unit of bits, which can also be used to express the other main aspect of model parsimony, description length.
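The performance measure described above, the conditional entropy of observations given the model predictions in bits, can be estimated directly from binned paired samples. The following is a minimal sketch of such an estimate; the bin count, variable names, and synthetic data are illustrative assumptions and not part of the published method.

```python
import numpy as np

def conditional_entropy_bits(obs, pred, bins=32):
    """Estimate H(obs | pred) in bits from paired samples.

    obs, pred: 1-D arrays of observed and modelled values.
    bins: number of equal-width bins used to discretise both series
          (an illustrative choice, not prescribed by the paper).
    """
    obs = np.asarray(obs, dtype=float)
    pred = np.asarray(pred, dtype=float)

    # Joint histogram -> joint probability p(obs, pred)
    joint, _, _ = np.histogram2d(obs, pred, bins=bins)
    p_joint = joint / joint.sum()

    # Marginal over the prediction axis, p(pred)
    p_pred = p_joint.sum(axis=0)

    # H(obs | pred) = H(obs, pred) - H(pred), all in bits
    nz_joint = p_joint > 0
    h_joint = -np.sum(p_joint[nz_joint] * np.log2(p_joint[nz_joint]))
    nz_pred = p_pred > 0
    h_pred = -np.sum(p_pred[nz_pred] * np.log2(p_pred[nz_pred]))
    return h_joint - h_pred

# Toy usage: a noisy linear "model" of synthetic observations
rng = np.random.default_rng(0)
q_obs = rng.gamma(shape=2.0, scale=1.0, size=10_000)
q_sim = q_obs + rng.normal(scale=0.3, size=q_obs.size)
print(f"H(obs | sim) = {conditional_entropy_bits(q_obs, q_sim):.2f} bits")
```

The complementary computational-complexity measure counts memory visits per simulation time step, which the paper likewise expresses in bits; that instrumentation is platform-specific and not sketched here.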


PLoS ONE, 2020, Vol. 15 (2), pp. e0228535
Author(s): Sun-Young Kim, Matthew Bechle, Steve Hankey, Lianne Sheppard, Adam A. Szpiro, ...

Proceedings, 2018, Vol. 2 (11), pp. 582
Author(s): Sanghyun Kim

This work presents a framework for efficiently representing multiple partial blockages in a single pipeline in the frequency domain. Blockage-detection studies calibrate the location and size of partial blockages from the pressure variation each blockage induces. Because the exact analytical expressions for multiple partial blockages are complicated, an alternative formula is proposed for a reservoir-pipeline-valve system. Its validity was checked by comparing the impedance distributions it produces with those of existing approaches, and the new formula was further evaluated in terms of model parsimony.
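The abstract does not reproduce the alternative formula itself. As a point of reference, the sketch below uses the conventional frictionless transfer-matrix (field-matrix) formulation for a reservoir-pipeline-valve system, modelling each partial blockage as a short reach of reduced area and cascading segment matrices to obtain the hydraulic impedance at the downstream valve over a range of frequencies. The wave speed, segment lengths, and cross-sectional areas are illustrative assumptions, and this is the standard formulation rather than the alternative formula proposed in the paper.

```python
import numpy as np

g = 9.81      # gravitational acceleration [m/s^2]
a = 1000.0    # pressure-wave speed [m/s] (assumed)

def field_matrix(omega, length, area):
    """2x2 frictionless field matrix of one pipe segment, relating the
    (discharge, head) oscillations at its two ends."""
    mu = 1j * omega / a              # propagation constant (frictionless)
    zc = a / (g * area)              # characteristic impedance
    return np.array([[np.cosh(mu * length), -np.sinh(mu * length) / zc],
                     [-zc * np.sinh(mu * length), np.cosh(mu * length)]])

def valve_impedance(omega, segments):
    """Hydraulic impedance h/q at the downstream valve for pipe segments
    (length, area) in series, with a constant-head reservoir upstream
    (head oscillation = 0 at the upstream end)."""
    m = np.eye(2, dtype=complex)
    for length, area in segments:    # cascade from reservoir to valve
        m = field_matrix(omega, length, area) @ m
    # Upstream state: unit discharge oscillation, zero head oscillation.
    q_valve, h_valve = m @ np.array([1.0, 0.0])
    return h_valve / q_valve

# Intact pipe vs. the same pipe with two partial blockages (reduced area).
intact  = [(1000.0, 0.05)]
blocked = [(300.0, 0.05), (20.0, 0.02),   # first blockage
           (400.0, 0.05), (20.0, 0.03),   # second blockage
           (260.0, 0.05)]

freqs = np.linspace(0.01, 2.0, 400)       # Hz
for name, segs in [("intact", intact), ("blocked", blocked)]:
    z = [abs(valve_impedance(2 * np.pi * f, segs)) for f in freqs]
    print(f"{name}: peak |Z| = {max(z):.1f} at {freqs[int(np.argmax(z))]:.3f} Hz")
```

Introducing blockages shifts and splits the resonance peaks of |Z| relative to the intact pipeline, and such impedance signatures are what frequency-domain blockage-detection methods exploit.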


Ecohydrology, 2011, Vol. 5 (1), pp. 121-142
Author(s): Erkan Istanbulluoglu, Tiejun Wang, David A. Wedin
