Recent advances in bedform research and application: from process-based to machine learning

Author(s):  
Sanjay Giri ◽  
Amin Shakya ◽  
Mohamed Nabi ◽  
Suleyman Naqshband ◽  
Toshiki Iwasaki ◽  
...  

Evolution and transition of bedforms in lowland rivers are micro-scale morphological processes that influence river management decisions. This work builds upon our past efforts, which include physics-based modelling, physical experiments and machine learning (ML) approaches to predict bedform features and states as well as the associated flow resistance. We revisit our past work on developing and applying numerical models, from simple to sophisticated, starting with a multi-scale shallow-water model with a dual-grid technique. The model incorporates an adjustment of the local bed shear stress by a slope effect and an additional term that influences bedform features. Furthermore, we review our work on a vertical two-dimensional model with a free-surface flow condition. We explore the effects of different sediment transport approaches, such as equilibrium transport with bed slope correction and non-equilibrium transport with pick-up and deposition. We revisit a sophisticated three-dimensional large eddy simulation (LES) model with an improved sediment transport approach that includes sliding, rolling and jumping in a Lagrangian framework. Finally, we discuss bedform states and transitions, which are studied using laboratory experiments as well as a theory-guided data science approach that combines logical reasoning about physical phenomena with large amounts of data. A theoretical evaluation of the parameters that influence bedform development is carried out, followed by classification of bedform type using a neural network model.

In the second part, we focus on practical applications and discuss large-scale numerical models that are applied in river engineering and management practice. Such models are found to have noticeable inaccuracies and uncertainties associated with various physical and non-physical causes. A key physical problem of these large-scale numerical models is the prediction of the evolution and transition of micro-scale bedforms and the associated flow resistance. The evolution and transition of bedforms during the rising and falling stages of a flood wave have a noticeable impact on morphology and flow levels in lowland alluvial rivers. The interaction between flow and micro-scale bedforms cannot be treated in a physics-based manner in large-scale numerical models because the resolution of these models is incompatible with the scale of the morphological changes; the dynamics of bedforms and the corresponding changes in flow resistance are therefore not captured. As a way forward, we propose a hybrid approach: the CFD models mentioned above are used to generate large amounts of data that complement field and laboratory observations, the reliability of these data is analysed, and a ML model is developed on this basis. The CFD models can replicate bedform evolution and transition processes, as well as the associated flow resistance, in a physics-based manner under steady and varying flow conditions. The hybrid approach of combining CFD and ML models can offer a better prediction of flow resistance, which can be coupled with large-scale numerical models to improve their performance. This research is in progress.
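A neural-network bedform classifier of the kind mentioned above can be sketched as follows. This is purely illustrative: the two input features (Froude number and Shields stress), the synthetic labelling rule, and all numeric values are assumptions for the example, not the authors' data or model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: rows are (Froude number, Shields stress);
# label 0 = ripples, 1 = dunes. The decision rule below is synthetic,
# chosen only so the example has learnable structure.
X = rng.uniform([0.1, 0.02], [0.8, 0.6], size=(200, 2))
y = (0.5 * X[:, 0] + X[:, 1] > 0.35).astype(float)

# One-hidden-layer classifier trained by plain gradient descent.
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)             # hidden activations
    p = sigmoid(h @ W2 + b2).ravel()     # predicted probability of "dune"
    g = (p - y) / len(y)                 # cross-entropy gradient at the output
    gW2 = h.T @ g[:, None]; gb2 = g.sum(keepdims=True)
    gh = (g[:, None] @ W2.T) * (1.0 - h ** 2)
    gW1 = X.T @ gh; gb1 = gh.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

accuracy = np.mean((p > 0.5) == y)
```

In a hybrid CFD/ML workflow, the training rows would instead come from CFD runs and field or laboratory observations, after the reliability screening described above.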

2021 ◽  
Author(s):  
Aurore Lafond ◽  
Maurice Ringer ◽  
Florian Le Blay ◽  
Jiaxu Liu ◽  
Ekaterina Millan ◽  
...  

Abstract Abnormal surface pressure is typically the first indicator of a number of problematic events, including kicks, losses, washouts and stuck pipe. These events account for 60–70% of all drilling-related nonproductive time, so their early and accurate detection has the potential to save the industry billions of dollars. Detecting these events today requires an expert user watching multiple curves, which can be costly and subject to human error. The solution presented in this paper aims to augment traditional models with new machine learning techniques, which detect these events automatically and help monitor the drilling well. Today’s real-time monitoring systems employ complex physical models to estimate surface standpipe pressure while drilling. These require many inputs and are difficult to calibrate. Machine learning is an alternative method to predict pump pressure, but on its own it needs significant labelled training data, which is often lacking in the drilling world. The new system combines these approaches: a machine learning framework enables automated learning, while the physical models compensate for any gaps in the training data. The system uses only standard surface measurements, is fully automated, and is continuously retrained while drilling to ensure the most accurate pressure prediction. In addition, a stochastic (Bayesian) machine learning technique is used, which yields not only a prediction of the pressure but also the uncertainty and confidence of that prediction. Lastly, the new system includes a data quality control workflow. It discards periods of low data quality from the pressure anomaly detection and enables smarter real-time event analysis. The new system has been tested on historical wells using a new test and validation framework. The framework runs the system automatically on large volumes of both historical and simulated data, to enable cross-referencing the results with observations. In this paper, we show the results of the automated test framework as well as the capabilities of the new system in two specific case studies, one on land and another offshore. Moreover, large-scale statistics highlight the reliability and efficiency of this new detection workflow. The new system builds on the trend in our industry to better capture and utilize digital data for optimizing drilling.
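The Bayesian prediction-with-uncertainty idea described above can be sketched with a conjugate Bayesian linear regression. Everything here is a stand-in: the data are synthetic, the single input (pump flow rate), the prior and noise precisions, and all values are assumptions, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic stand-in for the drilling data: standpipe pressure vs pump rate.
flow = rng.uniform(500.0, 2000.0, 50)                     # pump rate (assumed units)
pressure = 10.0 + 0.08 * flow + rng.normal(0.0, 5.0, 50)  # synthetic pressure

X = np.column_stack([np.ones_like(flow), flow])  # design matrix with intercept
alpha, beta = 1e-6, 1.0 / 25.0  # prior precision, noise precision (assumed known)

# Conjugate Gaussian posterior over the regression weights.
S = np.linalg.inv(alpha * np.eye(2) + beta * X.T @ X)
m = beta * S @ X.T @ pressure

# Predictive distribution at a new pump rate: a mean AND an uncertainty,
# which is what allows confidence-aware anomaly flagging.
x_new = np.array([1.0, 1200.0])
mean = float(x_new @ m)
std = float(np.sqrt(1.0 / beta + x_new @ S @ x_new))
```

An observed pressure falling many predictive standard deviations from `mean` would then be a candidate anomaly, with `std` quantifying how much confidence to place in that flag.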


Author(s):  
P. Alison Paprica ◽  
Frank Sullivan ◽  
Yin Aphinyanaphongs ◽  
Garth Gibson

Many health systems and research institutes are interested in supplementing their traditional analyses of linked data with machine learning (ML) and other artificial intelligence (AI) methods and tools. However, the availability of individuals who have the required skills to develop and/or implement ML/AI is a constraint, as there is high demand for ML/AI talent in many sectors. The three organizations presenting are all actively involved in training and capacity building for ML/AI broadly, and each has a focus on, and/or discrete initiatives for, particular trainees. P. Alison Paprica, Vector Institute for artificial intelligence, Institute for Clinical Evaluative Sciences, University of Toronto, Canada. Alison is VP, Health Strategy and Partnerships at Vector, responsible for health strategy and also playing a lead role in “1000AIMs” – a Vector-led initiative in support of the Province of Ontario’s $30 million investment to increase the number of AI-related master’s program graduates to 1,000 per year within five years. Frank Sullivan, University of St Andrews, Scotland. Frank is a family physician and an associate director of HDRUK@Scotland. Health Data Research UK (https://hdruk.ac.uk/) has recently provided funding to six sites across the UK to address challenging healthcare issues through use of data science. A 50 PhD student Doctoral Training Scheme in AI has also been announced. Each site works in close partnership with National Health Service bodies and the public to translate research findings into benefits for patients and populations. Yin Aphinyanaphongs – INTREPID NYU clinical training program for incoming clinical fellows. Yin is the Director of the Clinical Informatics Training Program at NYU Langone Health. He is deeply interested in the intersection of computer science and health care, and as a physician and a scientist, he has a unique perspective on how to train medical professionals for a data-driven world. One version of this teaching process is demonstrated in the INTREPID clinical training program, in which Yin teaches clinicians to work with large-scale data within the R environment and to generate hypotheses and insights. The session will begin with three brief presentations, followed by a facilitated session in which all participants share their insights about the essential skills and competencies required for different kinds of ML/AI application and contributions. Live polling and voting will be used at the end of the session to capture participants’ views on the key learnings and take-away points. The intended outputs and outcomes of the session are: (i) participants will have a better understanding of the skills and competencies required for individuals to contribute to AI applications in health in various ways; (ii) participants will gain knowledge about different options for capacity building, from targeted enhancement of the skills of clinical fellows, to producing large numbers of applied master’s graduates, to doctoral-level training. After the session, the co-leads will work together to create a resource that summarizes the learnings from the session and make them public (through publication in a peer-reviewed journal and/or through the IPDLN website).


2021 ◽  
Author(s):  
Stefan Hergarten

Abstract. Modeling glacial landform evolution is more challenging than modeling fluvial landform evolution. While several numerical models of large-scale fluvial erosion are available, there are only a few models of glacial erosion, and their application over long time spans requires a high numerical effort. In this paper, a simple formulation of glacial erosion which is similar to the fluvial stream-power model is presented. The model reproduces the occurrence of overdeepenings, hanging valleys, and steps at confluences at least qualitatively. Beyond this, it allows for a seamless coupling to fluvial erosion and sediment transport. The recently published direct numerical scheme for fluvial erosion and sediment transport can be applied to the entire domain, where the numerical effort is only moderately higher than for a purely fluvial system. Simulations over several million years on lattices of several million nodes can be performed on standard PCs. An open-source implementation is freely available as a part of the landform evolution model OpenLEM.
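The fluvial backbone that the glacial formulation is designed to couple with is the detachment-limited stream-power law, dz/dt = U − K A^m S^n. A minimal 1-D sketch with an upwind slope discretization is given below; the drainage-area profile, parameter values, and grid are illustrative assumptions, not the paper's scheme (which uses a more efficient direct solver).

```python
import numpy as np

# 1-D sketch of the detachment-limited stream-power law dz/dt = U - K A^m S^n.
# All parameter values and the drainage-area profile are illustrative.
n_nodes, dx, dt = 100, 1000.0, 1000.0   # grid spacing (m), time step (yr)
U, K, m, n = 1e-3, 1e-5, 0.5, 1.0       # uplift rate, erodibility, exponents
x = np.arange(n_nodes) * dx
A = 1.0e6 + 1.0e4 * x                   # drainage area grows downstream (assumed)
z = np.linspace(1000.0, 0.0, n_nodes)   # initial profile; outlet fixed at z = 0

for _ in range(5000):                   # 5 Myr of evolution
    S = np.zeros(n_nodes)
    S[:-1] = np.maximum((z[:-1] - z[1:]) / dx, 0.0)  # downstream (upwind) slope
    dz = (U - K * A ** m * S ** n) * dt
    dz[-1] = 0.0                        # outlet: fixed base level
    z += dz
```

The glacial erosion law proposed in the paper keeps a similar structure, which is what makes a seamless coupling to this fluvial model possible.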


2021 ◽  
pp. 1-10
Author(s):  
Lei Shu ◽  
Kun Huang ◽  
Wenhao Jiang ◽  
Wenming Wu ◽  
Hongling Liu

Machine learning tasks that use real-world data directly are prone to poor generalization, since such data are usually high-dimensional and limited in quantity. By learning low-dimensional representations of high-dimensional data, feature selection can retain the features that are useful for machine learning tasks, and these features can then be used to train machine learning models effectively. Feature selection from high-dimensional data is therefore a challenge. To address this issue, in this paper a hybrid approach consisting of an autoencoder and Bayesian methods is proposed for novel feature selection. Firstly, Bayesian methods are embedded in the proposed autoencoder as a special hidden layer, in order to increase the precision of selecting non-redundant features. Then, the other hidden layers of the autoencoder are used for non-redundant feature selection. Finally, the proposed method is compared with mainstream feature selection approaches and outperforms them. We find that combining autoencoders with probabilistic correction methods is more effective for feature selection than stacking architectures or adding constraints to autoencoders. We also demonstrate that stacked autoencoders are more suitable for large-scale feature selection, whereas sparse autoencoders are beneficial when selecting a smaller number of features. The proposed method thus provides a theoretical reference for analysing the optimality of feature selection.
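The general idea of autoencoder-based feature selection can be sketched as follows. This is a simplified stand-in, not the paper's method: it uses a plain linear autoencoder and ranks features by encoder weight magnitude, omitting the Bayesian hidden layer; the data are synthetic.

```python
import numpy as np

# Simplified stand-in: train a linear autoencoder, then score each input
# feature by its encoder weight magnitudes and keep the top-scoring ones.
rng = np.random.default_rng(2)
n, d, k = 300, 10, 3
X = np.empty((n, d))
X[:, :5] = rng.normal(size=(n, k)) @ rng.normal(size=(k, 5))  # structured features
X[:, 5:] = rng.normal(0.0, 0.01, (n, 5))                      # near-constant noise
X -= X.mean(axis=0)

W_enc = rng.normal(0.0, 0.1, (d, k))
W_dec = rng.normal(0.0, 0.1, (k, d))
lr = 0.01
for _ in range(3000):
    H = X @ W_enc              # low-dimensional code
    E = H @ W_dec - X          # reconstruction error
    gW_dec = H.T @ E / n
    gW_enc = X.T @ (E @ W_dec.T) / n
    W_enc -= lr * gW_enc
    W_dec -= lr * gW_dec

scores = np.abs(W_enc).sum(axis=1)        # per-feature importance
selected = np.argsort(scores)[::-1][:5]   # indices of the retained features
```

Features that carry shared structure end up with large encoder weights, while near-constant noise features contribute little to reconstruction and score low; the paper's Bayesian layer refines this ranking step probabilistically.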


2021 ◽  
Vol 3 ◽  
Author(s):  
Anastassia Lauterbach

Discussions around Covid-19 apps and models demonstrated that the primary challenges for AI and data science concern governance and ethics. Personal information was involved in building data sets. It was unclear how this information could be utilized in large-scale models to provide predictions and insights while observing privacy requirements. Most people expected a lot from technology but were unwilling to sacrifice part of their privacy for building it. Conversely, regulators and policy makers require AI and data science practitioners to ensure optimal public health and national security while avoiding these privacy-related struggles. Their choices vary widely from country to country and are driven more by cultural aspects, and less by machine learning capabilities. The question is whether current ways of designing technology and working with data sets are sustainable and lead to a good outcome for individuals and their communities. At the same time, Covid-19 made it obvious that economies and societies cannot succeed without far-reaching digital policies, touching every aspect of how we provide and receive education, live, and work. Most regions, businesses and individuals struggled to benefit from the competitive capabilities modern data technologies could bring. This opinion paper suggests how Germany and Europe can rethink their digital policy while recognizing the value of data, introducing Data IDs for consumers and businesses, committing to support innovation in decentralized data technologies, introducing the concept of Data Trusts, and making education about data compulsory from an early school age. It also discusses the advantages of data tokens in shaping a new ecosystem for decentralized data exchange. Furthermore, it emphasizes the necessity of developing and promoting technologies that work with small data sets and handle data in compliance with privacy regulations, keeping in mind the environmental costs of big data and large-scale machine learning models. Finally, it calls for innovation to be an integral part of any data scientist's job.


2020 ◽  
Vol 12 (4) ◽  
pp. 2775-2786
Author(s):  
Bram C. van Prooijen ◽  
Marion F. S. Tissier ◽  
Floris P. de Wit ◽  
Stuart G. Pearson ◽  
Laura B. Brakenhoff ◽  
...  

Abstract. A large-scale field campaign was carried out on the ebb-tidal delta (ETD) of Ameland Inlet, a basin of the Wadden Sea in the Netherlands, as well as on three transects along the Dutch lower shoreface. The data have been obtained over the years 2017–2018. The most intensive campaign at the ETD of Ameland Inlet was in September 2017. With this campaign, as part of KustGenese2.0 (Coastal Genesis 2.0) and SEAWAD, we aim to gain new knowledge on the processes driving sediment transport and benthic species distribution in such a dynamic environment. These new insights will ultimately help the development of optimal strategies to nourish the Dutch coastal zone in order to prevent coastal erosion and keep up with sea level rise. The dataset obtained from the field campaign consists of (i) single- and multi-beam bathymetry; (ii) pressure, water velocity, wave statistics, turbidity, conductivity, temperature, and bedform morphology on the shoal; (iii) pressure and velocity at six back-barrier locations; (iv) bed composition and macrobenthic species from box cores and vibrocores; (v) discharge measurements through the inlet; (vi) depth and velocity from X-band radar; and (vii) meteorological data. The combination of all these measurements at the same time makes this dataset unique and enables us to investigate the interactions between sediment transport, hydrodynamics, morphology and the benthic ecosystem in more detail. The data provide opportunities to calibrate numerical models to a high level of detail. Furthermore, the open-source datasets can be used for system comparison studies. The data are publicly available at 4TU Centre for Research Data at https://doi.org/10.4121/collection:seawad (Delft University of Technology et al., 2019) and https://doi.org/10.4121/collection:kustgenese2 (Rijkswaterstaat and Deltares, 2019). The datasets are published in netCDF format and follow conventions for CF (Climate and Forecast) metadata. 
The http://data.4tu.nl (last access: 11 November 2020) site provides keyword search options and maps showing the geographical position of the data.


Author(s):  
Pankaj Khurana ◽  
Rajeev Varshney

The rise in the volume, variety and complexity of data in healthcare has made it a fertile bed for artificial intelligence (AI) and machine learning (ML). Several types of AI are already being employed by healthcare providers and life sciences companies. This review summarises the classical machine learning cycle, different machine learning algorithms, different data analytical approaches, and successful implementations in haematology. Although there are many instances where AI has been found to be a great tool that can augment the clinician’s ability to provide better health outcomes, implementation factors need to be put in place to ensure large-scale acceptance and popularity.
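The classical machine learning cycle mentioned above can be sketched in a few lines: split the data, fit a model on the training portion, and evaluate on the held-out portion. The data here are synthetic stand-ins, not clinical measurements, and ordinary least squares is used only as the simplest possible model.

```python
import numpy as np

# Minimal sketch of the classical ML cycle: split, fit, evaluate.
rng = np.random.default_rng(3)
X = rng.normal(size=(200, 4))
true_w = np.array([1.5, -2.0, 0.5, 0.0])
y = X @ true_w + rng.normal(0.0, 0.1, 200)   # synthetic target

idx = rng.permutation(200)                   # shuffle before splitting
train, test = idx[:150], idx[150:]

w, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)   # fit (OLS)
mse = np.mean((X[test] @ w - y[test]) ** 2)               # evaluate on held-out data
```

In a clinical setting, the evaluation step would additionally involve external validation cohorts and calibration checks, which is part of the implementation work the review highlights.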


2021 ◽  
Vol 2021 ◽  
pp. 1-11
Author(s):  
R. Kabilan ◽  
V. Chandran ◽  
J. Yogapriya ◽  
Alagar Karthick ◽  
Priyesh P. Gandhi ◽  
...  

One of the biggest challenges is ensuring the large-scale integration of photovoltaic systems into buildings. This work presents power prediction for a building-integrated photovoltaic system across the building’s various orientations, based on machine learning data science tools. The proposed prediction methodology comprises a data quality stage, a machine learning algorithm, a weather clustering assessment, and an accuracy assessment. The results showed that applying linear regression coefficients to the forecast outputs of the developed photovoltaic power generation neural network improved the forecast of PV power generation. The final model produced accurate forecasts, exhibiting a root mean square error of 4.42% for NN, 16.86% for QSVM, and 8.76% for TREE. The results are presented for building façade and roof applications such as the flat roof, south façade, east façade, and west façade.
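The post-processing step described above, fitting linear regression coefficients to raw forecast outputs and scoring the result as a root mean square error percentage, can be sketched as follows. All numbers are synthetic, and normalizing the RMSE by the observed range is one common convention, assumed here rather than taken from the paper.

```python
import numpy as np

# Illustrative sketch: correct a biased forecast with least-squares linear
# regression, then report the error as an RMSE percentage.
rng = np.random.default_rng(4)
observed = rng.uniform(0.0, 5.0, 100)                            # PV power (kW, assumed)
raw_forecast = 0.8 * observed + 0.3 + rng.normal(0.0, 0.2, 100)  # biased forecast

# Linear correction: observed ~ a * raw_forecast + b.
A = np.column_stack([raw_forecast, np.ones_like(raw_forecast)])
(a, b), *_ = np.linalg.lstsq(A, observed, rcond=None)
corrected = a * raw_forecast + b

def rmse_pct(pred, obs):
    """Root mean square error as a percentage of the observed range."""
    return 100.0 * np.sqrt(np.mean((pred - obs) ** 2)) / (obs.max() - obs.min())

before, after = rmse_pct(raw_forecast, observed), rmse_pct(corrected, observed)
```

The correction removes the systematic scale and offset error, so `after` is smaller than `before`; only the random scatter of the forecast remains.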


2013 ◽  
Vol 37 ◽  
pp. 19-25 ◽  
Author(s):  
K. Blanckaert ◽  
G. Constantinescu ◽  
W. Uijttewaal ◽  
Q. Chen

Abstract. Curved river reaches were investigated as an example of river configurations where three-dimensional processes prevail. Similar processes occur, for example, in confluences and bifurcations, or near hydraulic structures such as bridge piers and abutments. Some important processes were investigated in detail in the laboratory, simulated numerically by means of eddy-resolving techniques, and finally parameterized in long-term and large-scale morphodynamic models. Investigated flow processes include secondary flow, large-scale coherent turbulence structures, shear layers, and flow separation at the convex inner bank. Secondary flow causes a redistribution of the flow and a transverse inclination of the riverbed, which favour erosion of the outer bank and meander migration. Secondary flow generates vertical velocities that impinge on the riverbed and are known to increase the erosive capacity of the flow. Large-scale turbulent coherent structures also increase the sediment entrainment and transport capacity. Neither process is accounted for in sediment transport formulae, which leads to an underestimation of the bend scour and the erosion of the outer bank. Eddy-resolving numerical models are computationally too expensive to be implemented in long-term and large-scale morphodynamic models, but they provide insight into the flow processes and broaden the investigated parameter space. Results from laboratory experiments and eddy-resolving numerical models were the basis for the development of a new parameterization of secondary flow effects without curvature restrictions, which is applicable in long-term and large-scale morphodynamic models. They also led to the development of a new engineering technique to modify the flow and the bed morphology by means of an air-bubble screen. The rising air bubbles generate secondary flow, which redistributes the patterns of flow, boundary shear stress and sediment transport.
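As background to the parameterization discussed above, the classical linear model of curvature-induced secondary flow (the kind of mild-curvature formulation that the new parameterization generalizes) expresses the deviation angle of the near-bed shear stress from the depth-averaged flow direction as a function of the depth-to-curvature ratio. The coefficient range below is a commonly quoted textbook value, not a result of this paper:

```latex
% Classical linear parameterization of secondary-flow effects:
% the near-bed shear stress deviates from the depth-averaged flow
% direction by an angle delta growing with depth-to-radius ratio.
\tan\delta \;\approx\; A \, \frac{h}{R},
\qquad A \approx 7\text{--}11,
```

where h is the local flow depth, R the radius of curvature of the streamline, and A a coefficient that depends on bed roughness. Such linear relations break down in sharp bends, which is the restriction the new parameterization removes.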

