Assessment of Ore Grade Estimation Methods for Structurally Controlled Vein Deposits - A Review

2021 ◽  
Vol 21 (1) ◽  
pp. 31-44
Author(s):  
C. A. Abuntori ◽  
S. Al-Hassan ◽  
D. Mireku-Gyimah

Resource estimation techniques have advanced over the years, improving the quality of resource estimates. Classical estimation methods are now used less in ore grade estimation than geostatistics (kriging), which has proved to provide more accurate estimates through its ability to account for the geology of the deposit and to assess estimation error. Geostatistics has therefore been held to be superior to the classical methods of estimation. However, the complexity of applying geostatistics to resource estimation, its time-consuming nature, its susceptibility to errors introduced by human interference, and the difficulty of applying it to deposits with few data points or to complicated deposits paved the way for Artificial Intelligence (AI) techniques to be applied in ore grade estimation. AI techniques have been employed on diverse ore deposit types over the past two decades and have proven to provide results comparable to or better than those estimated with kriging. This research aimed to review and compare the kriging methods and AI techniques most commonly used in ore grade estimation of complex structurally controlled vein deposits. The review showed that AI techniques outperformed kriging methods in ore grade estimation of vein deposits.

Keywords: Artificial Intelligence, Neural Networks, Geostatistics, Kriging, Mineral Resource, Grade

2020 ◽  
Vol 34 (04) ◽  
pp. 5545-5552
Author(s):  
Mohammad Rostami ◽  
Soheil Kolouri ◽  
Praveen Pilly ◽  
James McClelland

After learning a concept, humans can continually generalize it to new domains by observing only a few labeled instances, without interfering with previously learned knowledge. In contrast, learning concepts efficiently in a continual learning setting remains an open challenge for current Artificial Intelligence algorithms, as persistent model retraining is necessary. Inspired by the Parallel Distributed Processing and Complementary Learning Systems theories, we develop a computational model that can efficiently expand its previously learned concepts to new domains using a few labeled samples. We couple the new form of a concept to its past learned forms in an embedding space for effective continual learning. In doing so, a generative distribution is learned that is shared across tasks in the embedding space and models the abstract concepts. This procedure enables the model to generate pseudo-data points that replay past experience to tackle catastrophic forgetting.


2021 ◽  
Author(s):  
C. Purna Chand ◽  
M.M. Ali ◽  
B. Himasri ◽  
Mark A Bourassa ◽  
Yangxing Zheng

Precise prediction of a cyclone track, along with wind speed, pressure, landfall point, and the time of crossing land, is essential for disaster management and mitigation, including evacuation. In this paper, we use an artificial neural network (ANN) approach to estimate these cyclone parameters. The parameters are obtained from the International Best Track Archive for Climate Stewardship (IBTrACS) of the National Oceanic and Atmospheric Administration (NOAA). Since an ANN benefits from a large number of data points, each cyclone is divided into different segments. We use past information to predict the cyclone's geophysical parameters, and the predicted values are compared with the observations.
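The segmentation step described in this abstract can be illustrated with a minimal sketch: turning each cyclone's time-ordered records into (past window → next step) training pairs so an ANN has many samples per storm. The function name, window length, and feature layout below are assumptions for illustration, not the authors' code.

```python
import numpy as np

def segment_track(track, window=4):
    """Turn one cyclone's time-ordered parameter records (rows = time
    steps; columns = e.g. latitude, longitude, wind speed, pressure)
    into supervised (past-window -> next-step) training pairs."""
    track = np.asarray(track, dtype=float)
    X, y = [], []
    for t in range(window, len(track)):
        X.append(track[t - window:t].ravel())  # flattened past window
        y.append(track[t])                     # next observation to predict
    return np.array(X), np.array(y)
```

Under these assumptions, a 10-step track with 4 parameters yields 6 training pairs, each pairing 16 past values with the next 4-parameter observation.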


Author(s):  
Mahesh K. Joshi ◽  
J.R. Klein

The world of work has been impacted by technology. Work is different from what it was in the past because of digital innovation. Labor market opportunities are becoming polarized between high-end and low-end skilled jobs. Migration and its effects on employment have become a sensitive political issue. From Buffalo to Beijing, public debates are raging about the future of work. Developments like artificial intelligence and machine intelligence are contributing to productivity, efficiency, safety, and convenience, but they are also having an impact on jobs, skills, wages, and the nature of work. The “undiscovered country” of the workplace today is the combination of the changing landscape of work itself and the availability of only ill-fitting tools, platforms, and knowledge for training in the requirements, skills, and structure of this new age.


2020 ◽  
Vol 114 ◽  
pp. 242-245
Author(s):  
Jootaek Lee

The term Artificial Intelligence (AI) has changed since it was first coined by John McCarthy in 1956. AI, whose roots some trace to Kurt Gödel's unprovable computational statements of 1931, is now often called deep learning or machine learning. AI is defined as a computer machine with the ability to make predictions about the future and solve complex tasks using algorithms. AI algorithms are enhanced and become effective with big data capturing the present and the past, while still necessarily reflecting human biases in their models and equations. AI is also capable of making choices like humans, mirroring human reasoning. AI can help robots efficiently repeat the same labor-intensive procedures in factories and can analyze historic and present data efficiently through deep learning, natural language processing, and anomaly detection. Thus, AI covers a spectrum: augmented intelligence relating to prediction, autonomous intelligence relating to decision making, automated intelligence for labor robots, and assisted intelligence for data analysis.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Zhan-Ning Liu ◽  
Xiao-Yan Yu ◽  
Li-Feng Jia ◽  
Yuan-Sheng Wang ◽  
Yu-Chen Song ◽  
...  

In order to study the influence of distance weight on ore-grade estimation, inverse distance weighting (IDW) is used to estimate the Ni and MgO grades of serpentinite ore based on a three-dimensional ore body model and related block models. The Manhattan distance, Euclidean distance, Chebyshev distance, and multiple forms of the Minkowski distance are used to calculate the distance weights in IDW. Results show that using the Minkowski distance for the distance weight calculation is feasible, and the relationship between the estimation results and the distance weight is characterized. The study expands the distance weight calculation methods available in IDW and provides a new way to improve estimation accuracy; researchers can choose different weight calculation methods according to their needs. In this study, estimation is best when the power of the Minkowski distance is 3 for a 10 m × 10 m × 10 m block model, and best when the power is 9 for a 20 m × 20 m × 20 m block model.
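As a rough sketch of the idea, the Minkowski distance of order p generalizes the Manhattan (p = 1) and Euclidean (p = 2) metrics and approaches the Chebyshev metric for large p, and it can be dropped into the IDW weight calculation. The function below is an illustration under assumed parameter names; the paper's block-model workflow is not reproduced.

```python
import numpy as np

def minkowski_idw(sample_xyz, sample_grades, query_xyz, p=3, power=2):
    """Inverse-distance-weighted grade estimate at query_xyz, with the
    distance metric taken as the Minkowski distance of order p."""
    sample_xyz = np.asarray(sample_xyz, dtype=float)
    grades = np.asarray(sample_grades, dtype=float)
    diffs = np.abs(sample_xyz - np.asarray(query_xyz, dtype=float))
    d = np.sum(diffs ** p, axis=1) ** (1.0 / p)  # Minkowski distance of order p
    if np.any(d == 0):                           # query coincides with a sample
        return float(grades[d == 0][0])
    w = 1.0 / d ** power                         # inverse-distance weights
    return float(np.sum(w * grades) / np.sum(w))
```

Varying p changes how strongly off-axis samples are discounted relative to on-axis ones, which is what allows the estimation error to depend on the chosen order of the Minkowski distance.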


IEEE Access ◽  
2021 ◽  
pp. 1-1
Author(s):  
Hongbin Ma ◽  
Shuyuan Yang ◽  
Guangjun He ◽  
Ruowu Wu ◽  
Xiaojun Hao ◽  
...  

2021 ◽  
pp. 1420326X2199241
Author(s):  
Hanlin Li ◽  
Dan Wu ◽  
Yanping Yuan ◽  
Lijun Zuo

Over the past 30 years, tubular daylight guide systems (TDGSs) have become one of the most popular ways to bring outdoor natural light into interior spaces in building design. However, TDGSs are not widely used because of the lack of methods to evaluate their suitability. This study therefore summarizes the daylight performance metrics of TDGSs and presents estimation methods based on field measurements, simulation, and empirical formulae. The study focuses on the daylight performance and potential energy savings of TDGSs and should help building designers create healthy, comfortable, and energy-saving indoor environments.


Author(s):  
Gabrielle Samuel ◽  
Jenn Chubb ◽  
Gemma Derrick

The governance of ethically acceptable research in higher education institutions has been under scrutiny for the past half-century. More recently, decision makers have also required researchers to acknowledge the societal impact of their research and to anticipate and respond to the ethical dimensions of that impact through responsible research and innovation principles. Using artificial intelligence population health research in the United Kingdom and Canada as a case study, we combine a mapping study of journal publications with 18 interviews with researchers to explore how the ethical dimensions associated with this societal impact are incorporated into research agendas. Researchers separated the ethical responsibility for their research from its societal impact. We discuss the implications for both researchers and actors across the Ethics Ecosystem.


2021 ◽  
pp. 1-8
Author(s):  
Edith Brown Weiss

Today, it is evident that we are part of a planetary trust. Conserving our planet represents a public good, global as well as local. The threats to future generations resulting from human activities make applying the normative framework of a planetary trust even more urgent than in past decades. Initially, the planetary trust focused primarily on threats to the natural system of our human environment, such as pollution and natural resource degradation, and on threats to cultural heritage. Now, we face a heightened threat of nuclear war, cyber wars, threats from gene drives that can cause inheritable changes to genes, potential threats from other new technologies such as artificial intelligence, and possible pandemics. In this context, it is proposed that in a kaleidoscopic world we must engage all actors to cooperate with the shared goal of caring for and maintaining planet Earth in trust for present and future generations.


2020 ◽  
pp. 000370282097751
Author(s):  
Xin Wang ◽  
Xia Chen

Many spectra have a polynomial-like baseline. Iterative polynomial fitting (IPF) is one of the most popular methods for baseline correction of such spectra. However, the baseline estimated by IPF may contain substantial error when the spectrum contains significantly strong peaks or has strong peaks located at its endpoints. First, IPF uses a temporary baseline estimated from the current spectrum to identify peak data points. If the current spectrum contains strong peaks, the temporary baseline deviates substantially from the true baseline, so some good baseline data points may be mistakenly identified as peak data points and artificially re-assigned a low value. Second, if a strong peak is located at an endpoint of the spectrum, the endpoint region of the estimated baseline may have significant error due to overfitting. This study proposes a search-algorithm-based baseline correction method (SA) that compresses the raw spectrum into a dataset with a small number of data points and then converts peak removal into a search problem, in the artificial intelligence (AI) sense, of minimizing an objective function by deleting peak data points. First, the raw spectrum is smoothed with a moving average to reduce noise and then divided into dozens of unequally spaced sections on the basis of Chebyshev nodes. The minimum point of each section is then collected to form a dataset from which peaks are removed by the search algorithm. SA uses the mean absolute error (MAE) as the objective function because of its sensitivity to overfitting and its rapid calculation. The baseline correction performance of SA is compared with those of three other methods: the Lieber and Mahadevan–Jansen method, the adaptive iteratively reweighted penalized least squares method, and the improved asymmetric least squares method. Simulated and real FTIR and Raman spectra with polynomial-like baselines are employed in the experiments. Results show that for these spectra, the baseline estimated by SA has smaller error than those estimated by the three other methods.
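The compression step this abstract describes (moving-average smoothing, then unequally spaced sections derived from Chebyshev nodes, then one minimum per section) can be sketched as below. This is an illustrative reconstruction with assumed names and defaults, not the authors' implementation; the subsequent search over which points to delete is omitted.

```python
import numpy as np

def compress_spectrum(x, y, n_sections=20, window=5):
    """Smooth a spectrum with a moving average, split its x-axis at
    Chebyshev nodes (sections are denser near the endpoints), and keep
    each section's minimum as a candidate baseline point."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    y_s = np.convolve(y, np.ones(window) / window, mode="same")  # moving average
    # Chebyshev nodes mapped from [-1, 1] onto [x.min(), x.max()]
    k = np.arange(n_sections + 1)
    nodes = np.cos(np.pi * k / n_sections)
    edges = np.sort((x.min() + x.max()) / 2 + (x.max() - x.min()) / 2 * nodes)
    xs, ys = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (x >= lo) & (x <= hi)
        if mask.any():                   # skip sections with no data points
            i = np.argmin(y_s[mask])     # section minimum avoids peak tops
            xs.append(x[mask][i])
            ys.append(y_s[mask][i])
    return np.array(xs), np.array(ys)
```

Taking the minimum of each section biases the retained points toward the baseline rather than the peaks, so the later MAE-minimizing search has far fewer candidate points to delete.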

