A survey of current trends in near-surface electrical and electromagnetic methods

Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. G249-G260 ◽  
Author(s):  
Esben Auken ◽  
Louise Pellerin ◽  
Niels B. Christensen ◽  
Kurt Sørensen

Electrical and electromagnetic (E&EM) methods for near-surface investigations have undergone rapid improvements over the past few decades. Besides the traditional applications in groundwater investigations, natural-resource exploration, and geological mapping, a number of new applications have appeared. These include hazardous-waste characterization studies, precision-agriculture applications, archeological surveys, and geotechnical investigations. The inclusion of microprocessors in survey instruments, development of new interpretation algorithms, and easy access to powerful computers have supported innovation throughout the geophysical community, and the E&EM community is no exception. Most notable is the development of continuous-measurement systems that generate large, dense data sets efficiently. These have contributed significantly to the usefulness of E&EM methods by allowing measurements over wide areas without sacrificing lateral resolution. The availability of these luxuriant data sets in turn spurred development of interpretation algorithms, including laterally constrained 1D inversion as well as innovative 2D and 3D inversion methods. Taken together, these developments can be expected to improve the resolution and usefulness of E&EM methods and permit them to be applied economically. The trend is clearly toward dense surveying over larger areas, followed by highly automated, post-acquisition processing and interpretation to provide improved resolution of the shallow subsurface in a cost-effective manner.
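The laterally constrained 1D inversion mentioned in this abstract can be illustrated with a minimal sketch: adjacent 1D models along a profile are inverted jointly, with a regularization term penalizing parameter jumps between neighbouring soundings. The sketch below assumes a hypothetical linear forward operator in place of a real 1D EM or DC response; the constraint weight, model sizes, and synthetic data are all invented for illustration, not the authors' implementation.

```python
# Minimal sketch of laterally constrained 1D inversion (LCI), assuming a
# hypothetical linear forward operator G in place of a real 1D EM response.
import numpy as np
from scipy.optimize import least_squares

n_soundings, n_layers, n_data = 20, 4, 6
rng = np.random.default_rng(0)

G = rng.normal(size=(n_data, n_layers))          # stand-in sensitivity matrix
m_true = np.log10(np.linspace(20, 200, n_soundings))[:, None] * np.ones(n_layers)
d_obs = np.array([G @ m for m in m_true]) + 0.05 * rng.normal(size=(n_soundings, n_data))

lam = 1.0                                        # lateral constraint weight (assumed)

def residuals(m_flat):
    m = m_flat.reshape(n_soundings, n_layers)
    data_res = (np.einsum("ij,kj->ki", G, m) - d_obs).ravel()   # per-sounding data misfit
    lateral_res = np.sqrt(lam) * np.diff(m, axis=0).ravel()     # smoothness between neighbours
    return np.concatenate([data_res, lateral_res])

m0 = np.full(n_soundings * n_layers, np.log10(50.0))            # homogeneous starting model
result = least_squares(residuals, m0)
models = result.x.reshape(n_soundings, n_layers)
print("recovered section shape:", models.shape)
```

The lateral residuals couple neighbouring soundings, so the recovered section varies smoothly along the profile even where individual soundings are noisy.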

Author(s):  
L Mohana Tirumala ◽  
S. Srinivasa Rao

Privacy preservation in data mining and publishing plays a major role in today's networked world. It is important to preserve the privacy of the vital information contained in a data set. This can be achieved through a k-anonymization solution for classification. Along with privacy preservation through anonymization, producing optimized data sets in a cost-effective manner is of equal importance. In this paper, a Top-Down Refinement algorithm is proposed that yields optimal results in a cost-effective manner. Bayesian classification is also proposed to predict class-membership probabilities for data tuples whose class labels are unknown.
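As a hedged illustration of the classification step described above, the sketch below trains a naive Bayes classifier on a small categorical table of the kind a k-anonymized data set would yield, then predicts class-membership probabilities for an unlabeled tuple. The attribute names, generalized values, and labels are invented, and scikit-learn's CategoricalNB stands in for the paper's Bayesian classifier.

```python
# Sketch: naive Bayes class-membership probabilities on a (hypothetical)
# k-anonymized table with generalized categorical attributes.
import numpy as np
from sklearn.preprocessing import OrdinalEncoder
from sklearn.naive_bayes import CategoricalNB

# Generalized quasi-identifiers (age band, region) plus a class label.
X_raw = np.array([
    ["20-30", "North"], ["20-30", "North"], ["30-40", "South"],
    ["30-40", "South"], ["40-50", "North"], ["40-50", "South"],
])
y = np.array([0, 0, 1, 1, 1, 0])                 # invented class labels

enc = OrdinalEncoder()                           # encode categories as integers
X = enc.fit_transform(X_raw)

clf = CategoricalNB().fit(X, y)

# Tuple whose class label is unknown: predict P(class | attributes).
unknown = enc.transform([["30-40", "North"]])
print(clf.predict_proba(unknown))                # class-membership probabilities
```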


Geophysics ◽  
2001 ◽  
Vol 66 (6) ◽  
pp. 1761-1773 ◽  
Author(s):  
Roman Spitzer ◽  
Alan G. Green ◽  
Frank O. Nitsche

By appropriately decimating a comprehensive shallow 3‐D seismic reflection data set recorded across unconsolidated sediments in northern Switzerland, we have investigated the potential and limitations of four different source‐receiver acquisition patterns. For the original survey, more than 12 000 shots and 18 000 receivers deployed on a [Formula: see text] grid resulted in common midpoint (CMP) data with an average fold of ∼40 across a [Formula: see text] area. A principal goal of our investigation was to determine an acquisition strategy capable of producing reliable subsurface images in a more efficient and cost‐effective manner. Field efforts for the four tested acquisition strategies were approximately 50%, 50%, 25%, and 20% of the original effort. All four data subsets were subjected to a common processing sequence. Static corrections, top‐mute functions, and stacking velocities were estimated individually for each subset. Because shallow reflections were difficult to discern on shot and CMP gathers generated with the lowest density acquisition pattern (20% field effort) such that dependable top‐mute functions could not be estimated, data resulting from this acquisition pattern were not processed to completion. Of the three fully processed data subsets, two (50% field effort and 25% field effort) yielded 3‐D migrated images comparable to that derived from the entire data set, whereas the third (50% field effort) resulted in good‐quality images only in the shallow subsurface because of a lack of far‐offset data. On the basis of these results, we concluded that all geological objectives associated with our particular study site, which included mapping complex lithological units and their intervening shallow dipping boundaries, would have been achieved by conducting a 3‐D seismic reflection survey that was 75% less expensive than the original one.
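The decimation experiment described above can be mimicked on synthetic geometry by computing common-midpoint fold for a full shot/receiver layout and for a subsampled one. For brevity the sketch uses a single 2D line rather than the 3D grid of the actual survey, and the spacings and decimation factor are arbitrary stand-ins.

```python
# Sketch: CMP fold for a full vs. decimated acquisition geometry
# (synthetic 2D line; not the actual Swiss 3D survey parameters).
import numpy as np

def cmp_fold(shots, receivers, bin_size=5.0):
    """Bin source-receiver midpoints and count traces per bin (the fold)."""
    mids = (shots[:, None] + receivers[None, :]) / 2.0   # all source-receiver midpoints
    bins = np.round(mids.ravel() / bin_size).astype(int)
    return np.bincount(bins - bins.min())

shots = np.arange(0.0, 1000.0, 10.0)        # full effort: a shot every 10 m
receivers = np.arange(0.0, 1000.0, 5.0)     # a receiver every 5 m

full_fold = cmp_fold(shots, receivers)
decimated_fold = cmp_fold(shots[::2], receivers)   # ~50% field effort: skip every other shot

print("mean fold, full effort:", full_fold.mean())
print("mean fold, decimated:  ", decimated_fold.mean())
```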


HortScience ◽  
2019 ◽  
Vol 54 (6) ◽  
pp. 976-981 ◽  
Author(s):  
Jean Carlos Bettoni ◽  
Aike Anneliese Kretzschmar ◽  
Remi Bonnart ◽  
Ashley Shepherd ◽  
Gayle M. Volk

The availability of and easy access to diverse Vitis species are prerequisites for advances in breeding programs. Plant genebanks usually maintain collections of Vitis taxa as field collections that are vulnerable to biotic and abiotic stresses. Cryopreservation has been considered an ideal method of preserving these collections as safety back-ups in a cost-effective manner. We report a droplet vitrification method used to cryopreserve 12 Vitis species (Vitis vinifera cvs. 'Chardonnay' and 'Riesling', V. actinifolia, V. aestivalis, V. jacquemontii, V. flexuosa, V. palmata, V. riparia, V. rupestris, V. sylvestris, V. ficifolia, V. treleasi, and V. ×novae angeliae) using shoot tips excised from plants grown in vitro. Our results demonstrated wide applicability of this technique, with regrowth levels of at least 43% for 13 genotypes representing 12 Vitis species. We demonstrated that the droplet vitrification procedure can be successfully replicated by technical staff, thus suggesting that this method is ready for implementation.


2021 ◽  
Vol 14 (11) ◽  
pp. 2519-2532
Author(s):  
Fatemeh Nargesian ◽  
Abolfazl Asudeh ◽  
H. V. Jagadish

Data scientists often develop data sets for analysis by drawing upon sources of data available to them. A major challenge is to ensure that the data set used for analysis has an appropriate representation of relevant (demographic) groups: it meets desired distribution requirements. Whether data is collected through some experiment or obtained from some data provider, the data from any single source may not meet the desired distribution requirements. Therefore, a union of data from multiple sources is often required. In this paper, we study how to acquire such data in the most cost-effective manner, for typical cost functions observed in practice. We present an optimal solution for binary groups when the underlying distributions of data sources are known and all data sources have equal costs. For the generic case with unequal costs, we design an approximation algorithm that performs well in practice. When the underlying distributions are unknown, we develop an exploration-exploitation-based strategy with a reward function that captures the cost and approximations of group distributions in each data source. Besides theoretical analysis, we conduct comprehensive experiments that confirm the effectiveness of our algorithms.
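The exploration-exploitation strategy for unknown source distributions can be pictured as a bandit-style loop: track empirical group proportions per source and query the source whose estimated benefit toward the unmet distribution requirement, discounted by its cost, is highest, with an optimism bonus for rarely sampled sources. The sources, costs, targets, and reward definition below are illustrative assumptions, not the paper's exact algorithm.

```python
# Sketch: UCB-style acquisition of records from multiple sources until
# per-group count targets are met (illustrative; not the paper's algorithm).
import math
import random

random.seed(1)

# Hypothetical sources: unknown probability of yielding a minority-group record, and a cost per query.
true_p = {"A": 0.10, "B": 0.45, "C": 0.30}
cost = {"A": 1.0, "B": 3.0, "C": 2.0}

target = {0: 50, 1: 50}                 # required counts for group 0 (majority) and group 1 (minority)
counts = {0: 0, 1: 0}
pulls = {s: 1 for s in true_p}          # start at 1 to avoid division by zero
hits = {s: 0.5 for s in true_p}         # optimistic prior on minority yield

t = 0
while any(counts[g] < target[g] for g in target):
    t += 1
    need_minority = counts[1] < target[1]

    def score(s):
        p_hat = hits[s] / pulls[s]                            # estimated minority yield of source s
        bonus = math.sqrt(2 * math.log(t + 1) / pulls[s])     # exploration term
        benefit = p_hat if need_minority else 1 - p_hat       # value toward the unmet group
        return (benefit + bonus) / cost[s]                    # reward per unit cost

    s = max(true_p, key=score)
    group = 1 if random.random() < true_p[s] else 0           # simulated record from source s
    pulls[s] += 1
    hits[s] += group
    counts[group] += 1

print("queries issued:", t, "group counts:", counts, "per-source pulls:", pulls)
```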


Author(s):  
Sunil Nagpal ◽  
Mohammed Monzoorul Haque ◽  
Sharmila S. Mande

Motivation: 16S rRNA gene amplicon-based sequencing has significantly expanded the scope of metagenomics research by enabling microbial community analyses in a cost-effective manner. The possibility to infer the functional potential of a microbiome through amplicon-sequencing-derived taxonomic abundance profiles has further strengthened the utility of 16S sequencing. In fact, a surge in 'inferred function metagenomic analysis' has recently taken place, wherein most 16S microbiome studies include inferred functional insights in addition to taxonomic characterization. Tools like PICRUSt, Tax4Fun, Vikodak and iVikodak have significantly eased the process of inferring the functional potential of a microbiome from a taxonomic abundance profile. A platform that can host inferred-function 'metagenomic studies' with the comprehensive metadata-driven search utilities of a typical database, coupled with on-the-fly comparative analytics between studies of interest, would be a major improvement to the state of the art. ReFDash represents an effort in the proposed direction. Methods: This work introduces ReFDash - a Repository of Functional Dashboards. ReFDash, developed as a significant extension of iVikodak (a function inference tool), provides three broad unique offerings in the inferred-function space: (i) a platform that hosts a database of inferred-function data that is continuously updated using public 16S metagenomic studies; (ii) a tool to search studies of interest and compare up to three metagenomic environments on the fly; and (iii) a community initiative wherein users can contribute their own inferred-function data to the platform. ReFDash therefore provides a first-of-its-kind community-driven framework for scientific collaboration, data analytics, and sharing in this area of microbiome research. Results: Overall, the ReFDash database is aimed at compiling a global ensemble of 16S-derived functional metagenomics projects. ReFDash currently hosts close to 50 ready-to-use, re-analyzable functional dashboards representing data from approximately 18,000 microbiome samples sourced from various published studies. Each entry also provides direct downloadable links to the associated taxonomic files and metadata employed for analysis. Conclusion: The vision behind ReFDash is the creation of a framework wherein users can not only analyze their microbiome datasets in functional terms, but also contribute towards building an information base by submitting their functional analyses to the ReFDash database. The ReFDash web server may be freely accessed at https://web.rniapps.net/iVikodak/refdash/


2003 ◽  
Vol 1819 (1) ◽  
pp. 294-295 ◽  
Author(s):  
Kerry J. McManus ◽  
John B. Metcalf

A set of deterioration models is required to manage local government authority (LGA) pavements in a cost-effective manner; yet, most existing deterioration models have been derived for the major highways of the State Road Authority system. LGA pavements are different in terms of pavement life, the effect of the environment, loading, and expectations of performance with respect to riding quality. There is a greater emphasis on sustained light routine maintenance in LGAs. There is a need to develop models that more closely represent LGA pavements so they can be used to forecast the deterioration of the asset and to provide better guidance for rehabilitation planning. Existing pavement deterioration models, such as Highway Development Management-III (HDM-III), were examined for application to the Australian LGA pavement set. In general, such models were too complex for use in LGAs, and they also used roughness as a performance measure. Roughness is not commonly measured in LGA pavement networks. Other studies have shown that HDM-III could overestimate the deterioration of lightly loaded pavements. For this study, data on deterioration of several LGA pavements were collected and analyzed. Visual assessment data on pavement condition were captured in a “snapshot” survey of pavements of different ages. Thus, the data represent an age cross-section sample. Little or no correlation was found between any of the performance indicators and age when the full data sets were used for the analyses. Some correlation was apparent with the averaged data for each age. Even then, some of the trendlines observed indicated performance improving with age, contrary to normal expectations. It appears that factors besides age have a significant influence on the behavior of LGA pavements.
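The contrast reported above between raw and age-averaged correlations can be reproduced on synthetic data: correlate a condition index with age on the full cross-sectional sample, then on per-age means. The numbers below are random stand-ins, not the LGA survey data.

```python
# Sketch: correlation of a pavement condition index with age, on the full
# cross-sectional sample vs. age-averaged values (synthetic stand-in data).
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
age = rng.integers(1, 30, size=300)
# A weak deterioration trend buried in large site-to-site scatter.
condition = 100 - 0.3 * age + rng.normal(0, 15, size=300)

df = pd.DataFrame({"age": age, "condition": condition})

r_full = df["age"].corr(df["condition"])                    # full data set
by_age = df.groupby("age")["condition"].mean().reset_index()
r_avg = by_age["age"].corr(by_age["condition"])             # averaged per age

print(f"correlation, full data:     {r_full:.2f}")
print(f"correlation, age-averaged:  {r_avg:.2f}")
```

Averaging suppresses the within-age scatter, so the age-averaged correlation is typically much stronger than the raw one, mirroring the pattern described in the abstract.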


Geophysics ◽  
2021 ◽  
pp. 1-65
Author(s):  
Hemin Yuan ◽  
Majken C. Looms ◽  
Lars Nielsen

The characterization of shallow subsurface formations is essential for geological mapping and interpretation, reservoir characterization, and prospecting related to mining/quarrying. To analyze elastic and electromagnetic properties, we characterize near-surface chalk formations deposited on a shallow seabed during the Late Cretaceous–Early Paleogene (Maastrichtian-Danian). Electromagnetic and elastic properties, both of which are related to mineralogy, porosity, and water saturation, are combined to characterize the physical properties of chalk formations. We also perform rock physics modeling of elastic velocities and permittivity and analyze their relationships. We then use measured ground penetrating radar and P-wave velocity field data to determine the key model parameters, which are essential for the validity of the models and can be used to evaluate the consolidation degree of the rocks. Based on the models, a scheme is developed to estimate the porosity and water saturation by combining the two rock physics templates. The predictions are consistent with previous findings. Our templates facilitate fast mapping of near-surface porosity and saturation distributions and represent an efficient and cost-effective method for near-surface hydrological, environmental, and petrophysical studies. In the current formulation, the method is only applicable to a rock type (chalk) comprising a single mineral (pure calcite). It is possible to tailor the formulation to include more than one mineral; however, this will increase the uncertainty of the results.
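A minimal sketch of the template-combination idea: for a single-mineral (calcite) rock, model permittivity with the CRIM mixing law and P-wave velocity with a generic time-average relation, then grid-search porosity and water saturation to match a measured (Vp, permittivity) pair. The constituent constants, the time-average velocity model, and the "measurements" are generic textbook stand-ins, not the calibrated templates of the paper.

```python
# Sketch: joint porosity/saturation estimation from P-wave velocity and
# permittivity using two simple rock physics relations (generic stand-ins).
import numpy as np

# Assumed constituent properties (calcite matrix, water, air).
EPS_CALCITE, EPS_WATER, EPS_AIR = 8.0, 81.0, 1.0          # relative permittivity
V_CALCITE, V_WATER, V_AIR = 6600.0, 1500.0, 340.0         # P-wave velocity, m/s

def crim_permittivity(phi, sw):
    """CRIM mixing law: volume-weighted average of the square roots of permittivity."""
    root = (1 - phi) * np.sqrt(EPS_CALCITE) + phi * sw * np.sqrt(EPS_WATER) \
           + phi * (1 - sw) * np.sqrt(EPS_AIR)
    return root ** 2

def time_average_vp(phi, sw):
    """Wyllie-style time average (crude for partial saturation; illustration only)."""
    slowness = (1 - phi) / V_CALCITE + phi * sw / V_WATER + phi * (1 - sw) / V_AIR
    return 1.0 / slowness

# Hypothetical field measurements (GPR-derived permittivity, seismic-derived Vp).
eps_obs, vp_obs = 12.0, 3200.0

phi_grid = np.linspace(0.05, 0.5, 200)
sw_grid = np.linspace(0.0, 1.0, 200)
PHI, SW = np.meshgrid(phi_grid, sw_grid)

# Combined, normalized misfit against both templates.
misfit = ((crim_permittivity(PHI, SW) - eps_obs) / eps_obs) ** 2 \
         + ((time_average_vp(PHI, SW) - vp_obs) / vp_obs) ** 2
i, j = np.unravel_index(np.argmin(misfit), misfit.shape)
print(f"estimated porosity ~ {PHI[i, j]:.2f}, water saturation ~ {SW[i, j]:.2f}")
```

Because permittivity is far more sensitive to water content than velocity is, combining the two templates resolves porosity and saturation jointly where either measurement alone would be ambiguous.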


Author(s):  
W.J. Parker ◽  
N.M. Shadbolt ◽  
D.I. Gray

Three levels of planning can be distinguished in grassland farming: strategic, tactical and operational. The purpose of strategic planning is to achieve a sustainable long-term fit of the farm business with its physical, social and financial environment. In pastoral farming, this essentially means developing plans that maximise and best match pasture growth with animal demand, while generating sufficient income to maintain or enhance farm resources and improvements, and attain personal and financial goals. Strategic plans relate to the whole farm business and are focused on the means to achieve future needs. They should be routinely (at least annually) reviewed and monitored for effectiveness through key performance indicators (e.g., Economic Farm Surplus) that enable progress toward goals to be measured in a timely and cost-effective manner. Failure to link strategy with control is likely to result in unfulfilled plans. Keywords: management, performance


Author(s):  
Cristian Cocconcelli ◽  
Bongsuk Park ◽  
Jian Zou ◽  
George Lopp ◽  
Reynaldo Roque

Reflective cracking is frequently reported as the most common distress affecting resurfaced pavements. An asphalt rubber membrane interlayer (ARMI) approach has been traditionally used in Florida to mitigate reflective cracking. However, recent field evidence has raised doubts about the effectiveness of the ARMI when placed near the surface, indicating questionable benefits to reflective cracking and increased instability rutting potential. The main purpose of this research was to develop guidelines for an effective alternative to the ARMI for mitigation of near-surface reflective cracking in overlays on asphalt pavement. Fourteen interlayer mixtures, covering three aggregate types widely used in Florida, and two nominal maximum aggregate sizes (NMAS) were designed according to key characteristics identified for mitigation of reflective cracking, that is, sufficient gradation coarseness and high asphalt content. The dominant aggregate size range—interstitial component (DASR-IC) model was used for the design of all mixture gradations. A composite specimen interface cracking (CSIC) test was employed to evaluate reflective cracking performance of interlayer systems. In addition, asphalt pavement analyzer (APA) tests were performed to determine whether the interlayer mixtures had sufficient rutting resistance. The results indicated that interlayer mixtures designed with lower compaction effort, reduced design air voids, and coarser gradation led to more cost-effective fracture-tolerant and shear-resistant (FTSR) interlayers. Therefore, preliminary design guidelines including minimum effective film thickness and maximum DASR porosity requirements were proposed for 9.5-mm NMAS (35 µm and 50%) and 4.75-mm NMAS FTSR mixtures (20 µm and 60%) to mitigate near-surface reflective cracking.

