modelling problem
Recently Published Documents


TOTAL DOCUMENTS: 72 (five years: 18)
H-INDEX: 10 (five years: 1)

Author(s):  
I. Baraffe ◽  
T. Constantino ◽  
J. Clarke ◽  
A. Le Saux ◽  
T. Goffrey ◽  
...  

Author(s):  
Yasemin Saglam Kaya

This study examines pre-service mathematics teachers' perceptions of mathematical modelling activities. Participants were 23 pre-service mathematics teachers who undertook a 12-hour course on mathematical modelling, through which their perceptions were elicited. Afterwards, each participant found or developed a modelling problem and explained why they evaluated it as a model-eliciting activity (MEA). The MEAs found or developed by the participants were examined against the principles for developing MEAs. Results showed that many participants treated having more than one solution, being grounded in a real-life context, and suitability for group work as the defining criteria of an MEA. The participants did not focus on the model documentation principle. Based on this result, faculty members can support pre-service teachers by using activities in modelling education that address this principle.


2021 ◽  
Vol 2056 (1) ◽  
pp. 012053
Author(s):  
E A Buzaeva ◽  
D A Evsevichev ◽  
O V Maksimova

Abstract This work is dedicated to the study of the structural characteristics and parameters of advanced electroluminescent structures. The main difficulties in this area are the high complexity and slowness of the design calculations. This problem can be addressed by developing a mathematical model of the system based on equivalent circuits. The first step in developing such a mathematical model is its formulation.
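The abstract does not specify the equivalent circuit, but a minimal sketch of the kind of formulation it points to, assuming a hypothetical series resistance feeding a parallel resistor-capacitor branch as a stand-in for the electroluminescent layer (all component values are placeholders), could look like this:

```python
import numpy as np

# Hypothetical equivalent circuit for an electroluminescent structure:
# a sinusoidal source drives a series resistance Rs feeding a parallel
# Rp || C branch (the luminescent layer). Component values are placeholders.
Rs, Rp, C = 50.0, 5e4, 10e-9         # ohm, ohm, farad
f, V0 = 1e3, 100.0                   # drive frequency (Hz) and amplitude (V)

dt = 5e-8                            # time step (s), small relative to Rs*C
t = np.arange(0.0, 2e-3, dt)
v_src = V0 * np.sin(2 * np.pi * f * t)

v_c = np.zeros_like(t)               # voltage across the parallel branch
for n in range(len(t) - 1):
    i_in = (v_src[n] - v_c[n]) / Rs          # current through Rs
    dv = (i_in - v_c[n] / Rp) / C            # C dVc/dt = i_in - Vc/Rp
    v_c[n + 1] = v_c[n] + dt * dv            # explicit Euler step

i_total = (v_src - v_c) / Rs         # source current, e.g. for impedance analysis
print(f"peak layer voltage: {v_c.max():.1f} V, peak current: {i_total.max()*1e3:.2f} mA")
```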


2021 ◽  
Author(s):  
Arka Dyuti Sarkar ◽  
Mads Huuse

The North Viking Graben (NVG) is part of the mature North Sea Basin petroleum province and is designated as a major carbon storage basin for NW Europe. It has been extensively drilled over five decades, with an abundance of well and seismic data in the public domain. As such, it serves as an excellent setting to demonstrate the efficacy of a proprietary seismic-data-led approach to modelling subsurface temperatures, using a state-of-the-art full-waveform-inversion velocity model covering the entire NVG. In the forward modelling problem, an empirical velocity-to-thermal-conductivity transform is used in conjunction with a predefined heat flow to predict subsurface temperature. The predefined heat flow parameters are set based on the range of values from previous studies in the area. Abundant well data with bottom-hole temperature (BHT) records provide calibration of the results. In the inverse modelling problem, BHTs together with the velocity-derived thermal conductivity are used to solve a 1D steady-state approximation of Fourier's law for heat flow. In this way heat flow is interpolated over the 12,000 km² model area at kilometre-scale lateral resolution, highlighting lateral variability in comparison to traditional point-based heat flow datasets. This heat flow is used to condition a final iterative loop of forward modelling to produce a temperature model that best represents the subsurface temperature. Calibration against 139 exploration wells indicates that the predicted temperatures are on average only 0.6 °C warmer than the recorded values, with a root-mean-square error of 5 °C. The BHT for the recently completed Northern Lights carbon capture and sequestration (CCS) well 31/5-7 (Eos) is modelled at 97 °C, within 6 °C of the recorded BHT. This highlights the applicability of the workflow not only to enhancing petroleum systems modelling but also to the energy transition and to fundamental scientific purposes.
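As a rough illustration of the forward and inverse problems described above, the sketch below implements a layered 1D steady-state Fourier's-law column in Python. The layer thicknesses, conductivities, heat flow, and surface temperature are hypothetical placeholders rather than values from the NVG model, and the empirical velocity-to-conductivity transform itself is omitted.

```python
import numpy as np

# Layered 1-D steady-state Fourier's law: q = -k dT/dz (heat flow constant with depth).
# Conductivities k_i would come from a velocity-to-conductivity transform in practice.

def forward_temperature(dz, k, q, t_surface):
    """Forward problem: temperature at the base of each layer for a given heat flow q."""
    thermal_resistance = np.cumsum(dz / k)          # cumulative sum of dz_i / k_i
    return t_surface + q * thermal_resistance

def invert_heat_flow(t_bht, z_bht, dz, k, t_surface):
    """Inverse problem: heat flow from a bottom-hole temperature at depth z_bht."""
    depth = np.cumsum(dz)
    mask = depth <= z_bht                            # layers above the BHT depth
    resistance = np.sum(dz[mask] / k[mask])
    return (t_bht - t_surface) / resistance

# Hypothetical three-layer column: values illustrate the workflow, not the NVG model.
dz = np.array([800.0, 1200.0, 1000.0])   # layer thicknesses (m)
k = np.array([1.5, 2.2, 2.8])            # thermal conductivities (W m^-1 K^-1)
q = 0.060                                # heat flow (W m^-2, i.e. 60 mW m^-2)

print(forward_temperature(dz, k, q, t_surface=4.0))                      # predicted T per layer base
print(invert_heat_flow(t_bht=97.0, z_bht=3000.0, dz=dz, k=k, t_surface=4.0))  # recovered heat flow
```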


2021 ◽  
Vol 35 (70) ◽  
pp. 840-876
Author(s):  
Milton Rosa ◽  
Daniel Clark Orey

Abstract An Ethnomathematics-based curriculum helps students demonstrate consistent mathematical processes as they reason, solve problems, communicate ideas, and choose appropriate representations through the development of daily mathematical practices. It also recognizes connections with the Science, Technology, Engineering, and Mathematics (STEM) disciplines. Our pedagogical work in relation to STEM Education is based on the Trivium Curriculum for mathematics and on ethnomodelling, which provide communicative, analytical, material, and technological tools for developing the emic, etic, and dialogic approaches necessary for elaborating school curricula. STEM Education facilitates pedagogical action that connects ethnomathematics, mathematical modelling, problem-solving, critical judgment, and making sense of mathematical and non-mathematical environments, which involves distinct ways of thinking, reasoning, and developing mathematical knowledge in distinct sociocultural contexts. The ethnomathematical perspective for STEM Education proposed here provides a transformative pedagogy that exposes its power to turn students into critical and reflective citizens, enabling them to transform society in a glocalized world.


Quantum ◽  
2021 ◽  
Vol 5 ◽  
pp. 417
Author(s):  
Ryan Sweke ◽  
Jean-Pierre Seifert ◽  
Dominik Hangleiter ◽  
Jens Eisert

Here we study the comparative power of classical and quantum learners for generative modelling within the Probably Approximately Correct (PAC) framework. More specifically we consider the following task: Given samples from some unknown discrete probability distribution, output with high probability an efficient algorithm for generating new samples from a good approximation of the original distribution. Our primary result is the explicit construction of a class of discrete probability distributions which, under the decisional Diffie-Hellman assumption, is provably not efficiently PAC learnable by a classical generative modelling algorithm, but for which we construct an efficient quantum learner. This class of distributions therefore provides a concrete example of a generative modelling problem for which quantum learners exhibit a provable advantage over classical learning algorithms. In addition, we discuss techniques for proving classical generative modelling hardness results, as well as the relationship between the PAC learnability of Boolean functions and the PAC learnability of discrete probability distributions.
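As a minimal illustration of the task definition only (not of the paper's DDH-based construction or its quantum learner), the following sketch shows a toy classical generative learner: given samples from an unknown discrete distribution, it outputs a sampler for an approximation, here judged by total variation distance.

```python
import numpy as np

rng = np.random.default_rng(0)

# The "unknown" discrete distribution, hidden from the learner, which only sees samples.
# 16 outcomes is a toy size; the paper's hard instances are built from the decisional
# Diffie-Hellman assumption and are far larger.
n_outcomes = 16
true_p = rng.dirichlet(np.ones(n_outcomes))

def sample_oracle(m):
    """The learner's only access to the target distribution: m i.i.d. samples."""
    return rng.choice(n_outcomes, size=m, p=true_p)

def learn_generator(samples):
    """Toy classical learner: empirical frequencies define the output generator."""
    p_hat = np.bincount(samples, minlength=n_outcomes) / len(samples)
    return (lambda m: rng.choice(n_outcomes, size=m, p=p_hat)), p_hat

generator, p_hat = learn_generator(sample_oracle(10_000))
new_samples = generator(5)                              # the learner's output is itself a sampler
tv = 0.5 * np.abs(true_p - p_hat).sum()                 # quality of the approximation
print(new_samples, f"total variation distance: {tv:.4f}")
```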


2021 ◽  
Author(s):  
Sebastian Drost ◽  
Fabian Netzel ◽  
Andreas Wytzisk-Ahrens ◽  
Christoph Mudersbach

The application of deep learning methods to rainfall-runoff modelling has advanced greatly in recent years. In particular, long short-term memory (LSTM) networks have gained increased attention for time-series prediction. The architecture of this special kind of recurrent neural network is optimised for learning long-term dependencies from large time-series datasets, and several studies have demonstrated the applicability of LSTM networks to rainfall-runoff prediction, showing that they are capable of outperforming other types of neural networks (Hu et al., 2018).

Understanding the impact of land-cover changes on rainfall-runoff dynamics is an important task. Such a hydrological modelling problem is typically solved with process-based models by varying model parameters related to land-cover changes at different points in time. Kratzert et al. (2019) proposed an adaptation of the standard LSTM architecture, called Entity-Aware LSTM (EA-LSTM), which takes static catchment attributes as input features to address the regional modelling problem and provides a promising approach for similar use cases. Our contribution therefore analyses the suitability of the EA-LSTM for assessing the effect of land-cover changes.

In different experimental setups, we train standard LSTM and EA-LSTM networks for multiple small subbasins associated with the Wupper region in Germany. Gridded daily precipitation data from the REGNIE dataset (Rauthe et al., 2013), provided by the German Weather Service (DWD), is used as model input to predict the daily discharge of each subbasin. For training the EA-LSTM we use land cover information from the European CORINE Land Cover (CLC) inventory as static input features. The CLC inventory includes Europe-wide time series of land cover in 44 classes as well as land cover changes for different time periods (Büttner, 2014); the percentage share of each land cover class within a subbasin serves as a static input feature. To evaluate the impact of land cover data on rainfall-runoff prediction, we compare the results of the EA-LSTM with those of the standard LSTM using different statistical measures as well as the Nash–Sutcliffe efficiency (NSE).

In addition, we test the ability of the EA-LSTM to outperform physical process-based models. For this purpose, we use existing calibrated hydrological models within the Wupper basin to simulate discharge for each subbasin. Performance metrics of the calibrated models are then used as benchmarks for assessing the performance of the EA-LSTM.

References

Büttner, G. (2014). CORINE Land Cover and Land Cover Change Products. In Manakos & M. Braun (Eds.), Land Use and Land Cover Mapping in Europe (Vol. 18, pp. 55–74). Springer Netherlands. https://doi.org/10.1007/978-94-007-7969-3_5

Hu, C., Wu, Q., Li, H., Jian, S., Li, N., & Lou, Z. (2018). Deep Learning with a Long Short-Term Memory Networks Approach for Rainfall-Runoff Simulation. Water, 10(11), 1543. https://doi.org/10.3390/w10111543

Kratzert, F., Klotz, D., Shalev, G., Klambauer, G., Hochreiter, S., & Nearing, G. (2019). Towards learning universal, regional, and local hydrological behaviors via machine learning applied to large-sample datasets. Hydrology and Earth System Sciences, 23(12), 5089–5110. https://doi.org/10.5194/hess-23-5089-2019

Rauthe, M., Steiner, H., Riediger, U., Mazurkiewicz, A., & Gratzki, A. (2013). A Central European precipitation climatology – Part I: Generation and validation of a high-resolution gridded daily data set (HYRAS). Meteorologische Zeitschrift, 22(3), 235–256. https://doi.org/10.1127/0941-2948/2013/0436
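For readers unfamiliar with the architecture, here is a minimal PyTorch sketch of an EA-LSTM-style model in the spirit of Kratzert et al. (2019), in which the input gate is computed from static catchment attributes only; the layer sizes, single dynamic forcing, and regression head are illustrative placeholders, not the configuration used in this study.

```python
import torch
import torch.nn as nn

class EALSTM(nn.Module):
    """Sketch of an Entity-Aware LSTM in the spirit of Kratzert et al. (2019):
    the input gate is computed from static catchment attributes only, while the
    other gates see the dynamic forcing (here: daily precipitation)."""

    def __init__(self, dyn_size, stat_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        self.input_gate = nn.Linear(stat_size, hidden_size)                  # i: static features only
        self.dyn_gates = nn.Linear(dyn_size + hidden_size, 3 * hidden_size)  # f, g, o: dynamic inputs
        self.head = nn.Linear(hidden_size, 1)                                # discharge regression head

    def forward(self, x_dyn, x_stat):
        # x_dyn: (batch, seq_len, dyn_size), x_stat: (batch, stat_size)
        batch, seq_len, _ = x_dyn.shape
        h = x_dyn.new_zeros(batch, self.hidden_size)
        c = x_dyn.new_zeros(batch, self.hidden_size)
        i = torch.sigmoid(self.input_gate(x_stat))                           # constant over the sequence
        for t in range(seq_len):
            f, g, o = self.dyn_gates(torch.cat([x_dyn[:, t], h], dim=-1)).chunk(3, dim=-1)
            c = torch.sigmoid(f) * c + i * torch.tanh(g)
            h = torch.sigmoid(o) * torch.tanh(c)
        return self.head(h)                                                  # predicted daily discharge

# Placeholder shapes: 1 dynamic input (e.g. REGNIE precipitation), 44 static features (CLC class shares).
model = EALSTM(dyn_size=1, stat_size=44, hidden_size=64)
q_hat = model(torch.randn(8, 365, 1), torch.rand(8, 44))                     # -> (8, 1)
```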


2020 ◽  
Author(s):  
Louis Falissard ◽  
Claire Morgand ◽  
Walid Ghosn ◽  
Claire Imbaud ◽  
Karim Bounebache ◽  
...  

BACKGROUND The recognition of medical entities from natural language is a ubiquitous problem in the medical field, with applications ranging from medical act coding to the analysis of electronic health data for public health. It is, however, a complex task that usually requires human expert intervention, making it expensive and time consuming. Recent advances in artificial intelligence, specifically the rise of deep learning methods, have enabled computers to make efficient decisions on a number of complex problems, a notable example being neural sequence models and their powerful applications in natural language processing. They do, however, require a considerable amount of data to learn from, which is typically their main limiting factor. The CépiDc stores an exhaustive database of death certificates at the French national scale, amounting to several million natural language examples, each provided with its associated human-coded medical entities, available to the machine learning practitioner. OBJECTIVE This article investigates the application of deep neural sequence models to the problem of recognizing medical entities from natural language. METHODS The investigated dataset comprises every French death certificate from 2011 to 2016, containing information such as the subject's age and gender and the chain of events leading to his or her death, both in French and encoded as ICD-10 medical entities, for a total of around 3 million observations. The task of automatically recognizing ICD-10 medical entities from the French natural-language chain of events is formulated as a type of predictive modelling problem known as a sequence-to-sequence modelling problem. A deep neural network model known as the Transformer is then slightly adapted and fit to the dataset. Its performance is assessed on an external dataset and compared with the current state of the art. Confidence intervals for the reported metrics are computed via bootstrap. RESULTS The proposed approach resulted in a test F-measure of .952 [.946, .957], which constitutes a significant improvement on the current state of the art and its previously reported 82.5 F-measure assessed on a comparable dataset. Such an improvement opens a whole field of new applications, from nosologist-level automated coding to temporal harmonization of death statistics. CONCLUSIONS This article shows that deep artificial neural networks can learn complex relationships between natural language and medical entities directly from voluminous datasets, without any explicit prior knowledge. Although not entirely free from mistakes, the derived model constitutes a powerful tool for automated coding of medical entities from medical language, with promising potential applications.
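As an illustration only, a minimal PyTorch sketch of a Transformer-based sequence-to-sequence coder of the kind described, with placeholder vocabulary sizes, depths, and dimensions rather than the authors' configuration, could look as follows; tokenization, inference-time decoding, and training are omitted.

```python
import torch
import torch.nn as nn

class CertificateCoder(nn.Module):
    """Hypothetical sequence-to-sequence model: tokenised French certificate text in,
    ICD-10 code tokens out. Vocabulary sizes, dimensions, and depths are placeholders."""

    def __init__(self, src_vocab=30_000, tgt_vocab=8_000, d_model=256, max_len=512):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, d_model)
        self.tgt_emb = nn.Embedding(tgt_vocab, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)          # learned positional embeddings
        self.transformer = nn.Transformer(d_model=d_model, nhead=8,
                                          num_encoder_layers=4, num_decoder_layers=4,
                                          batch_first=True)
        self.out = nn.Linear(d_model, tgt_vocab)

    def _embed(self, emb, tokens):
        positions = torch.arange(tokens.size(1), device=tokens.device)
        return emb(tokens) + self.pos_emb(positions)

    def forward(self, src_tokens, tgt_tokens):
        # Causal mask: each ICD-10 position may only attend to previously emitted codes.
        t = tgt_tokens.size(1)
        causal = torch.triu(torch.full((t, t), float("-inf"), device=tgt_tokens.device), diagonal=1)
        hidden = self.transformer(self._embed(self.src_emb, src_tokens),
                                  self._embed(self.tgt_emb, tgt_tokens),
                                  tgt_mask=causal)
        return self.out(hidden)                                # logits over the ICD-10 vocabulary

# Toy batch: 8 certificates of 40 text tokens, decoded against 10 code tokens.
model = CertificateCoder()
logits = model(torch.randint(0, 30_000, (8, 40)), torch.randint(0, 8_000, (8, 10)))  # -> (8, 10, 8000)
```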


2020 ◽  
Vol 32 (2) ◽  
Author(s):  
Pakiso Joseph Khomokhoana ◽  
Liezel Nel

Many novice programmers fail to comprehend source code and its related concepts in the same way that their instructors do. As emphasised in the Decoding the Disciplines (DtDs) framework, each discipline (including Computer Science) has its own unique set of mental operations. However, instructors often take certain important mental operations for granted and do not explain these 'hidden' steps explicitly when modelling problem solutions. A clear understanding of the underlying cognitive processes and related support strategies employed by experts during source code comprehension (SCC) could ultimately be utilised to help novice programmers to better execute the cognitive processes necessary to efficiently comprehend source code. Positioned within Step 2 of the DtDs framework, this study employed decoding interviews and observations, followed by narrative data analysis, to identify the underlying cognitive processes and related support (though often 'hidden') strategies utilised by a select group of experienced programming instructors during an SCC task. The insights gained were then used to formulate a set of important cognitive-related support strategies for efficient SCC. Programming instructors are encouraged to continuously emphasise strategies like these when modelling their expert ways of thinking regarding efficient SCC more explicitly to their novice students.

