A Method for Extracting Some Key Terrain Features from Shaded Relief of Digital Terrain Models

2020 ◽  
Vol 12 (17) ◽  
pp. 2809
Author(s):  
Meirman Syzdykbayev ◽  
Bobak Karimi ◽  
Hassan A. Karimi

Detection of terrain features (ridges, spurs, cliffs, and peaks) is a basic research topic in digital elevation model (DEM) analysis and is essential for learning about factors that influence terrain surfaces, such as geologic structures and geomorphologic processes. Detecting terrain features based on general geomorphometry is challenging and carries a high degree of uncertainty, mostly due to the variety of factors controlling surface evolution in different regions. Currently, there are different computational techniques for obtaining detailed information about terrain features through DEM analysis. One of the most common techniques is to numerically identify or classify terrain elements, where regional topologies of the land surface are constructed using DEMs or combinations of DEM derivatives. The main drawbacks of these techniques are that they cannot differentiate between ridges, spurs, and cliffs, or that they produce a high rate of false positives when detecting spur lines. In this paper, we propose a new method for automatically detecting terrain features such as ridges, spurs, cliffs, and peaks using shaded relief, by controlling the altitude and azimuth of illumination sources on both smooth and rough surfaces. In our proposed method, we apply edge detection filters, selected according to the azimuth angle, to the shaded relief to identify specific terrain features. Results show that the proposed method performs similarly to, and in some cases (when detecting spurs) better than, current terrain feature detection methods such as geomorphon, curvature, and probabilistic methods.
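The abstract's pipeline (render shaded relief under a chosen illumination, then apply an azimuth-dependent edge filter) can be sketched in numpy. This is a minimal illustration using the standard Lambertian hillshade formulation and a simple gradient projection; the paper's exact filters are not given, and all function names here are hypothetical.

```python
import numpy as np

def hillshade(dem, azimuth_deg=315.0, altitude_deg=45.0, cellsize=1.0):
    """Lambertian shaded relief of a DEM grid (a standard formulation,
    not necessarily the paper's exact implementation)."""
    zenith = np.radians(90.0 - altitude_deg)
    azimuth = np.radians(azimuth_deg)
    dz_dy, dz_dx = np.gradient(dem, cellsize)      # row and column gradients
    slope = np.arctan(np.hypot(dz_dx, dz_dy))
    aspect = np.arctan2(dz_dy, -dz_dx)
    shade = (np.cos(zenith) * np.cos(slope)
             + np.sin(zenith) * np.sin(slope) * np.cos(azimuth - aspect))
    return np.clip(shade, 0.0, 1.0)

def directional_edge_response(shade, azimuth_deg=315.0):
    """Project the shading gradient onto the illumination direction;
    strong responses mark abrupt brightness changes, i.e. candidate
    ridge, spur, or cliff lines facing that azimuth."""
    gy, gx = np.gradient(shade)
    az = np.radians(azimuth_deg)
    return gx * np.sin(az) - gy * np.cos(az)
```

Running the detector under several azimuths and intersecting the responses is one plausible way to separate features (e.g. spurs vs. ridges) by their orientation.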

2001 ◽  
Vol 4 (2) ◽  
pp. 75-91 ◽  
Author(s):  
Xiaotong Wang ◽  
Chih-Chen Chang ◽  
Lichu Fan

The recent advances in detecting and locating damage in bridges by different kinds of non-destructive testing and evaluation (NDT&E) methods are reviewed. From the application point of view, classifications of general bridge components and their damage types are presented. The relationships among damage, bridge components, and NDT&E techniques are summarized. Many useful Web sources on NDT&E techniques in bridge damage detection are given. It is concluded that: (1) vibration-based damage detection methods are successful to a certain extent, especially when the overall damage is significant, and low-frequency vibration can identify those areas where more detailed local inspection should be concentrated; (2) robust identification techniques able to locate damage from realistic measured data sets still seem a long way from reality, and basic research remains necessary in the meantime; (3) the rapid development of computer technology and digital signal processing (DSP) techniques greatly impacts conventional NDT techniques, especially in control, data processing, and data display, as well as in simulation and modeling; (4) most of the NDT&E techniques introduced in this paper have their own practical commercial systems, but combining theoretical, experimental, and engineering achievements remains a challenging task when establishing the relationship between the unknown quantities and the measured signal parameters; specialised instruments have shown great advantages over general-purpose ones for certain tasks; (5) in bridge damage detection, a problem usually requires the application of different NDT&E techniques; two or more independent techniques are needed to enable confidence in the results.


2016 ◽  
Vol 8 (2) ◽  
pp. 254-273 ◽  
Author(s):  
Chounghyun Seong ◽  
Venkataramana Sridhar

The Chesapeake Bay (CB) Watershed is undergoing changes in climate, hydrology, and land use. The assessment of hydroclimatic impacts is important for both water quantity and quality management. This study evaluated the hydroclimatic changes using Coupled Model Intercomparison Project Phase 5 (CMIP5) data, which provided statistically downscaled daily precipitation and temperature. An increase of 3.0 to 5.2 °C in temperature was projected for 2070–2099 relative to the baseline period of 1970–1999. Precipitation projections, however, showed only a modest increase, averaging 5.2% and 8.4% for 2070–2099 under the two scenarios. The northern part of the CB Watershed was expected to be wetter and warmer than the southern region. The average changes in flow for 2070–2099 were projected to be between −12% and 6% and between −22% and 5% under the two scenarios, respectively. Minimum changes in winter and the greatest flow reductions in fall, with a high degree of variability among the ensemble members, were expected. A greater decrease in flows was projected in the northern region of the CB Watershed. Despite the wetter projections at the end of the century and uncertainties in our evapotranspiration (ET) estimation, reductions in land surface runoff were partly attributed to increased ET.


2002 ◽  
Vol 5 (04) ◽  
pp. 302-310
Author(s):  
Herman G. Acuna ◽  
D.R. Harrell

Summary The application of probabilistic methods has led to inconsistent interpretations of how such methods should be applied while still complying with reserves certification guidelines. The objective of this paper is to present and discuss some pitfalls commonly encountered in the application of probabilistic methods to evaluate reserves. Several regulatory guidelines that should be followed during the generation of recoverable hydrocarbon distributions are discussed. An example is also given to illustrate the evolution of reserves categories as a function of probability. Most of the conflicting reserves interpretations can be attributed to the constraints of regulatory bodies [e.g., the U.S. Securities and Exchange Commission (SEC)] and the current SPE/World Petroleum Congresses (WPC) reserves definitions, in which reserves categories are expressed in terms of the probabilities of being achieved. For example, proved reserves are defined as those hydrocarbon volumes with at least a 90% probability of being equaled or exceeded (P90). Unfortunately, these definitions alone fall short as guidance on how to derive the distributions from which these percentiles will be calculated. This may lead to distributions that do not comply with the remaining guidelines. While a P90 can be calculated from a noncomplying distribution, proved reserves may not be assigned at this percentile level. Introduction In 1997, new reserves definitions were drafted and introduced by SPE and WPC. For the first time, these reserves definitions included some language to address the increased interest in probabilistic analysis for estimating hydrocarbon reserves. Proved reserves were defined, in part, as those volumes of recoverable hydrocarbons with " . . . a high degree of confidence that the quantities will be recovered. 
If probabilistic methods are used, there should be at least a 90% probability that the quantities actually recovered will equal or exceed the estimate."1 The interpretation of this definition may be that satisfying the P90 criterion is sufficient to define proved reserves. We will discuss later in this paper why defining proved reserves as the P90 of any distribution is not always appropriate. Also, the definitions do not specify at what level the evaluator should apply the P90 test (i.e., is it at the field level or the total portfolio level?). These points are further clarified in the 2001 update of the SPE/WPC definitions.2 Probable reserves were then described in the SPE/WPC definitions as those recoverable hydrocarbon volumes that " . . . are more likely than not to be recoverable. In this context, when probabilistic methods are used, there should be at least a 50% probability that the quantities actually recovered will equal or exceed the sum of estimated proved plus probable reserves."1 Possible reserves were defined as those recoverable hydrocarbon volumes that " . . . are less likely to be recoverable than probable reserves. In this context, when probabilistic methods are used, there should be at least a 10% probability that the quantities actually recovered will equal or exceed the sum of estimated proved plus probable plus possible reserves."1 The SEC does not recognize probable and possible reserves. The SEC's guidelines for reporting proved reserves are set forth in its Regulation S-X, Rule 4-10 and subsequent clarifying bulletins. In Regulation S-X, Rule 4-10, there are no guidelines for the interpretation of probabilistic analysis. The regulation defines proved reserves as those recoverable hydrocarbon volumes with " . . . reasonable certainty to be recoverable in future years from known reservoirs . . 
."3 Both the SPE/WPC and SEC proved reserves definitions have several other requirements, usually applicable to deterministic methods, that may conflict with probabilistic analysis if not properly incorporated. Evaluators of reserves should exercise caution when using probabilistic methods to ensure compliance with the reserves definitions adopted by the SEC and SPE/WPC. Caution is required because there are certain situations in which indiscriminate application of probabilistic methods may produce results that are inconsistent with the reserves definitions. For example, the SEC definition of proved reserves does not explicitly recognize the use of the probabilistic method and in no way allows for the probabilistic method to be used in such a manner as to violate any term of that definition. In this paper, we will first present a short definition of probabilistic analysis and the risks and benefits of using this technique. Next, we will address some significant shortcomings in the current reserves definitions and then present some examples of how some of these shortcomings can be addressed in the evaluation of reserves. Discussion of Probabilistic Analysis of Reserves The probabilistic analysis of reserves relies on the use of probabilistic techniques to estimate the uncertainty of the recoverable hydrocarbon volumes. At their core, probabilistic methods are used to collect, organize, evaluate, present, and summarize data. These methods provide the tools to analyze large amounts of representative data so that the significance of the data's variability and dependability can be measured and understood. Probabilistic analysis should be considered an important tool for internal analysis, allowing companies to understand and rank their hydrocarbon reserves and resources and the associated risks. 
This method provides the tools to identify the upside and the downside hydrocarbon potential to better organize the company's portfolio and to allocate capital and manpower resources more efficiently. However, it should be understood that the objectives of a hydrocarbon-property ranking study and an SPE/WPC or SEC reserves reporting evaluation might be different. For example, companies may have their own guidelines to group and analyze hydrocarbon assets to allocate company resources or for property acquisitions. These company guidelines may vary from project to project or from year to year (depending on pricing assumptions) and may be different from those guidelines provided in the SPE/ WPC and SEC definitions. It then becomes the primary challenge of the evaluator to reconcile both evaluations.
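The percentile conventions discussed above can be illustrated with a small Monte Carlo sketch. Note the direction of the definition: "at least a 90% probability that the quantities actually recovered will equal or exceed the estimate" means P90 is the 10th percentile of the recoverable-volume distribution. The volumetric model and all numbers below are hypothetical, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical volumetric model: recoverable volume as the product of
# independent lognormal inputs (productive area and net pay) times a
# fixed recovery yield. Units and parameters are illustrative only.
area = rng.lognormal(mean=np.log(500.0), sigma=0.3, size=n)   # acres
net_pay = rng.lognormal(mean=np.log(50.0), sigma=0.2, size=n) # ft
yield_per_acre_ft = 0.5                                       # MSTB/acre-ft
volumes = area * net_pay * yield_per_acre_ft                  # MSTB

# P90 = value exceeded with 90% probability = 10th percentile, and so on.
p90 = np.percentile(volumes, 10)   # candidate proved
p50 = np.percentile(volumes, 50)   # candidate proved + probable
p10 = np.percentile(volumes, 90)   # candidate proved + probable + possible
```

As the paper stresses, computing these percentiles is mechanical; whether the underlying distribution complies with the remaining definition requirements is the substantive question.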


1988 ◽  
Vol 34 (8) ◽  
pp. 993-997 ◽  
Author(s):  
H. Verreault ◽  
M. Lafond ◽  
A. Asselin ◽  
G. Banville ◽  
G. Bellemare

Escherichia coli TB1 was transformed with pUC9 containing fragmented DNA (4–10 kilobases (kb)) from Corynebacterium sepedonicum. The resulting genomic bank was screened by a dot blot assay to identify clones specifically hybridizing to C. sepedonicum DNA and not to the DNA of several other Gram-positive and Gram-negative bacteria. Two clones (III24 and III31) were selected because of their ability to strongly hybridize to C. sepedonicum DNA and weakly hybridize to the DNA of C. michiganense, Erwinia carotovora, Agrobacterium tumefaciens, Bacillus subtilis, Pseudomonas solanacearum, Micrococcus luteus, and Arthrobacter globiformis. These two clones were also specific for C. sepedonicum DNA when tested against the DNA from 30 isolates of soil bacteria. Restriction enzyme analysis showed that the two clones carry inserts of 8 kb (III24) and 4 kb (III31). On the basis of restriction enzyme patterns, one clone (III24) does not correspond to plasmid pCL 50, a cryptic plasmid found in several C. sepedonicum isolates. Because purified III24 and III31 DNA can be used to detect approximately 1 ng of C. sepedonicum genomic DNA, the two clones can complement serological or biological detection methods. This could be useful, especially when a high degree of specificity is required for detection or identification of this plant pathogen.


Author(s):  
Charles Tomlin ◽  
Shelley Gammon ◽  
Charles Morris ◽  
Charlotte O'Brien

We have developed an innovative methodology to link maternal siblings within the 2000–2005 England and Wales Birth Registration data to form a Pregnancy Spine, a unification of all births to each unique mother. Key challenges were blocking and cluster resolution. To optimise geographic blocking, Internal Migration data were incorporated to map the likely geographic movement of mothers between births. Following probabilistic linkage, sibling clusters were modelled as a graph and their structure optimised using community detection methods. Childhood statistics data relating to child date of birth were incorporated to evaluate accuracy and remove false links. Our development has resulted in a new blocking and cluster resolution method. We developed new ways to assess sibling group accuracy, beyond traditional classifier metrics, and to infer error rates. We applied our method to registration data used in earlier studies for quality assurance of our methods. Using this, and other maternal sibling composition statistics, we present results showing that a high degree of accuracy was obtained for standard and new evaluation metrics. These methods will benefit other linkage projects involving unknown cluster sizes, multiple datasets, or longitudinal linkage over longer time periods. To this Spine, researchers can append and link other data sources to answer questions about maternal and child health outcomes.
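The abstract does not specify its community-detection algorithm, but the core cluster-resolution step (turning accepted pairwise probabilistic links into maternal sibling groups, i.e. connected components) can be sketched with a union-find structure. All record identifiers and the threshold below are hypothetical.

```python
class UnionFind:
    """Minimal disjoint-set structure for merging linked birth records."""
    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[rb] = ra

def resolve_clusters(scored_links, threshold=0.9):
    """Accept pairwise links whose match weight clears the threshold and
    merge them into sibling groups; unlinked records become singletons."""
    uf = UnionFind()
    for a, b, score in scored_links:
        if score >= threshold:
            uf.union(a, b)
    groups = {}
    for rec in {r for a, b, _ in scored_links for r in (a, b)}:
        groups.setdefault(uf.find(rec), set()).add(rec)
    return list(groups.values())
```

In practice the paper goes further, using graph structure and child-DOB consistency to break apart over-merged clusters; this sketch covers only the initial merge.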


2018 ◽  
Vol 6 (4) ◽  
pp. 11-21
Author(s):  
Paweł Rutkiewicz ◽  
Ireneusz Malik

The aim of this study was to show how natural elements of river valley relief, such as changes in the width of the valley bottom, landforms occurring in the valley bottom, and differences in the height of valley terraces, favoured the location of dams partitioning the valley bottom and creating water reservoirs for the requirements of historic metallurgical centres. The research was carried out based on DEM analysis from LiDAR data. Features were chosen in river basins with a rich metallurgical legacy. Analysis of the location of the former ironworks was carried out using Surfer 12 software. Five centres were selected because they are the only centres suitable for research that have survived to this day. Using the shaded relief models and contour-coloured maps, absolute differences in height between valley levels and other relief forms occurring in the valley were analysed, as were the distribution of individual terrain forms in the designated part of the valley and changes in the width of the valley bottom, in the context of the location of former metallurgical centres. On the basis of the contours of the former water reservoir visible in the valley relief, and using a surface area measurement tool (Surfer software), the area that the reservoir could cover was measured. The results show that convenient geomorphological conditions, which facilitated damming of the valley, were used in the placement of selected weirs and metallurgical ponds. Natural narrowings of the valley bottom, or dunes and hills directly adjacent to the valley floor, were utilised during the construction of the dams. The rivers on which the furnace ponds were constructed are relatively small watercourses, so the weirs created by the constructors are not impressive: their height is generally in the range of about 2 to 3 metres and their length from about 120 to 300 metres. Nevertheless, they were effective in allowing sufficient water retention and the creation of furnace ponds with measured areas from about 4.5 ha to about 25 ha.
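Surfer's surface-area tool is used as a black box in the study; the underlying planimetric measurement of a digitised reservoir outline can be sketched with the shoelace formula. The outline coordinates below are hypothetical, chosen only so the result lands at the low end of the reported 4.5–25 ha range.

```python
def polygon_area_ha(vertices):
    """Shoelace formula: planimetric area of a simple polygon whose
    vertices are (x, y) pairs in metres, returned in hectares."""
    n = len(vertices)
    twice_area = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]   # wrap around to close the polygon
        twice_area += x1 * y2 - x2 * y1
    return abs(twice_area) / 2.0 / 10_000.0   # m^2 -> ha

# A hypothetical rectangular 300 m x 150 m reservoir outline:
outline = [(0, 0), (300, 0), (300, 150), (0, 150)]
area = polygon_area_ha(outline)   # 4.5 ha
```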


2019 ◽  
Vol 16 (4) ◽  
pp. 275-282
Author(s):  
Yan Xu ◽  
Yingxi Yang ◽  
Zu Wang ◽  
Yuanhai Shao

In vivo, one of the most efficient biological mechanisms for expanding the genetic code and regulating cellular physiology is protein post-translational modification (PTM). Because PTM can provide very useful information for both basic research and drug development, identification of PTM sites in proteins has become a very important topic in bioinformatics. Lysine residues in proteins can be subjected to many types of PTMs, such as acetylation, succinylation, methylation, and propionylation, among others. To deal with the huge number of protein sequences, the present study is devoted to developing computational techniques that can predict the multiple K-type modifications of any uncharacterized protein in a timely and effective manner. In this work, we proposed a method that handles acetylation and succinylation prediction in a multilabel learning framework. Three feature constructions, including sequence and physicochemical properties, were applied. The multilabel learning algorithm RankSVM was applied to PTM prediction for the first time. In 10-fold cross-validation, the predictor with physicochemical property encoding achieved an accuracy of 73.86% and an absolute-true rate of 64.70%, better than the other feature constructions. We compared it with other multilabel algorithms and the existing predictor iPTM-Lys, and our predictor outperformed these methods. We also analysed the acetylated and succinylated peptides to illustrate the results.
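The two evaluation figures quoted above come from standard multilabel metrics: "absolute-true" is the exact-match rate over label sets, and "accuracy" is the mean per-instance Jaccard overlap. A minimal sketch (the labelled sites below are hypothetical, not the paper's data):

```python
def absolute_true(y_true, y_pred):
    """Fraction of instances whose predicted label set exactly matches the truth."""
    return sum(set(t) == set(p) for t, p in zip(y_true, y_pred)) / len(y_true)

def multilabel_accuracy(y_true, y_pred):
    """Mean per-instance Jaccard overlap |T & P| / |T | P| of label sets."""
    total = 0.0
    for t, p in zip(y_true, y_pred):
        t, p = set(t), set(p)
        total += len(t & p) / len(t | p) if (t | p) else 1.0
    return total / len(y_true)

# Hypothetical lysine sites labelled with acetylation ("ace") and/or
# succinylation ("suc"):
y_true = [{"ace"}, {"ace", "suc"}, {"suc"}, {"ace"}]
y_pred = [{"ace"}, {"ace"}, {"suc"}, {"suc"}]
```

By construction, absolute-true can never exceed this accuracy, which is why the paper reports the two figures separately (73.86% vs. 64.70%).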


Author(s):  
Koji Ochiai ◽  
Naohiro Motozawa ◽  
Motoki Terada ◽  
Takaaki Horinouchi ◽  
Tomohiro Masuda ◽  
...  

Cell culturing is a basic experimental technique in cell biology and medical science. However, culturing high-quality cells with a high degree of reproducibility relies heavily on expert skills and tacit knowledge, and it is not straightforward to scale the production process due to the education bottleneck. Although many automated culture systems have been developed and a few have succeeded in mass production environments, very few robots are permissive of frequent protocol changes, which are often required in basic research environments. LabDroid is a general-purpose humanoid robot with two arms that performs experiments using the same tools as humans. Combining our newly developed AI software with LabDroid, we developed a variable scheduling system that continuously produces subcultures of cell lines without human intervention. The system periodically observes the cells on plates with a microscope, predicts the cell growth curve by processing cell images, and decides the best times for passage. We have succeeded in developing a system that maintains the cultures of two HEK293A cell plates with no human intervention for 192 h.
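The scheduling idea described here (observe confluence, fit a growth curve, solve for the best passage time) can be sketched with a simple log-linear fit. The exponential model, threshold, and observation values below are all hypothetical illustrations, not the authors' actual AI software.

```python
import numpy as np

def predict_passage_time(hours, confluence, threshold=0.8):
    """Fit exponential growth confluence(t) = c0 * exp(r * t) by least
    squares on log-confluence, then solve for the time at which the
    passage threshold is reached."""
    r, log_c0 = np.polyfit(hours, np.log(confluence), 1)
    return (np.log(threshold) - log_c0) / r

# Hypothetical microscope observations (fraction of plate covered):
hours = np.array([0.0, 12.0, 24.0, 36.0])
confluence = np.array([0.10, 0.16, 0.25, 0.40])
t_passage = predict_passage_time(hours, confluence)
```

A scheduler built on such a prediction would queue the robot's passage task shortly before `t_passage`, re-fitting as each new observation arrives.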


2015 ◽  
Vol 6 ◽  
pp. 1082-1090 ◽  
Author(s):  
Oleksandr V Dobrovolskiy ◽  
Maksym Kompaniiets ◽  
Roland Sachser ◽  
Fabrizio Porrati ◽  
Christian Gspan ◽  
...  

Controlling magnetic properties on the nanometer-scale is essential for basic research in micro-magnetism and spin-dependent transport, as well as for various applications such as magnetic recording, imaging and sensing. This has been accomplished to a very high degree by means of layered heterostructures in the vertical dimension. Here we present a complementary approach that allows for a controlled tuning of the magnetic properties of Co/Pt heterostructures on the lateral mesoscale. By means of in situ post-processing of Pt- and Co-based nano-stripes prepared by focused electron beam induced deposition (FEBID) we are able to locally tune their coercive field and remanent magnetization. Whereas single Co-FEBID nano-stripes show no hysteresis, we find hard-magnetic behavior for post-processed Co/Pt nano-stripes with coercive fields up to 850 Oe. We attribute the observed effects to the locally controlled formation of the CoPt L10 phase, whose presence has been revealed by transmission electron microscopy.

