Computers in Geology - 25 Years of Progress

Published By Oxford University Press

9780195085938, 9780197560525

Author(s):  
C. John Mann

The nuclear waste programs of the United States and other countries have forced geologists to think specifically about probabilities of natural events, because the legal requirements to license repositories mandate a probabilistic standard (US EPA, 1985). In addition, uncertainties associated with these probabilities and the predicted performance of a geologic repository must be stated clearly in quantitative terms, as far as possible. Geoscientists rarely have thought in terms of stochasticity or clearly stated uncertainties for their results. All scientists are taught to acknowledge uncertainty and to specify the quantitative uncertainty in each derived or measured value, but this has seldom been done in geology. Thus, the nuclear waste disposal program is forcing us to do now what we should have been doing all along: acknowledge in quantitative terms what uncertainty is associated with each quantity that is employed, whether deterministically or probabilistically. Uncertainty is a simple concept ostensibly understood to mean that which is indeterminate, not certain, containing doubt, indefinite, problematical, not reliable, or dubious. However, uncertainty in a scientific sense demonstrates a complexity which often is unappreciated. Some types of uncertainty are difficult to handle, if they must be quantified, and a completely satisfactory treatment may be impossible. Initially, only uncertainty associated with measurement was quantified. The Gaussian, or normal, probability density function (pdf) was recognized by Carl Friedrich Gauss as he studied errors in his measurements two centuries ago and developed a theory of errors still being used today. This was the only type of uncertainty that scientists acknowledged until Heisenberg stated his famous uncertainty principle in 1928. As information theory evolved during and after World War II, major advances were made in semantic uncertainty.
Today, two major types of uncertainty are generally recognized (Klir and Folger, 1988): ambiguity or nonspecificity and vagueness or fuzziness. These can be subdivided further into seven types having various measures of uncertainty based on probability theory, set theory, fuzzy-set theory, and possibility theory.
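The measurement uncertainty that Gauss's theory of errors addresses can be made concrete with a small propagation-of-error sketch. The function, quantities, and values below are invented for illustration (they are not from the chapter); for a product of two independently measured values with Gaussian errors, relative uncertainties add in quadrature:

```python
import math

# Hypothetical example: a derived quantity q = a * b, where a and b are
# measured values with (Gaussian) standard uncertainties sigma_a, sigma_b.
# For independent errors, first-order propagation gives
#   sigma_q / q = sqrt((sigma_a / a)**2 + (sigma_b / b)**2)

def propagate_product(a, sigma_a, b, sigma_b):
    """Return (q, sigma_q) for q = a * b with independent Gaussian errors."""
    q = a * b
    rel = math.sqrt((sigma_a / a) ** 2 + (sigma_b / b) ** 2)
    return q, q * rel

# e.g. a bed thickness (m) times a bulk density (t/m^3):
q, sq = propagate_product(12.0, 0.5, 2.40, 0.05)
print(f"q = {q:.2f} +/- {sq:.2f}")
```

Stating every derived value in this "value plus-or-minus uncertainty" form is exactly the discipline the chapter argues geology has too long avoided.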


Author(s):  
Donald B. McIntyre

Elementary crystallography is an ideal context for introducing students to mathematical geology. Students meet crystallography early because rocks are made of crystalline minerals. Moreover, morphological crystallography is largely the study of lines and planes in real three-dimensional space, and visualizing the relationships is excellent training for other aspects of geology; many algorithms learned in crystallography (e.g., rotation of arrays) apply also to structural geology and plate tectonics. Sets of lines and planes should be treated as entities, and crystallography is an ideal environment for introducing what Sylvester (1884) called "Universal Algebra or the Algebra of multiple quantity." In modern terminology, we need SIMD (Single Instruction, Multiple Data) or even MIMD. This approach, initiated by W.H. Bond in 1946, dispels the mysticism unnecessarily associated with Miller indices and the reciprocal lattice; edges and face-normals are vectors in the same space. The growth of mathematical notation has been haphazard, new symbols often being introduced before the full significance of the functions they represent had been understood (Cajori, 1951; McIntyre, 1991b). Iverson introduced a consistent notation in 1960 (e.g., Iverson, 1960, 1962, 1980). His language, greatly extended in the executable form called J (Iverson, 1993), is used here. For information on its availability as shareware, see the Appendix. Publications suitable as tutorials in J are available (e.g., Iverson, 1991; McIntyre, 1991a, b, 1992a, b, c, 1993). Crystals are periodic structures consisting of unit cells (parallelepipeds) repeated by translation along axes parallel to the cell edges. These edges define the crystallographic axes. In a crystal of cubic symmetry they are orthogonal and equal in length (Cartesian). Those of a triclinic crystal, on the other hand, are unequal in length and not at right angles. The triclinic system is the general case; others are special cases.
The formal description of a crystal gives prominent place to the lengths of the axes (a, b, and c) and the interaxial angles (α, β, and γ). A canonical form groups these values into a 2 × 3 table (matrix), the first row being the lengths and the second the angles.
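The 2 × 3 table can be put straight to work. The chapter itself uses J; the sketch below is Python with invented cell parameters, computing the triclinic unit-cell volume from the standard formula V = abc·sqrt(1 − cos²α − cos²β − cos²γ + 2·cosα·cosβ·cosγ):

```python
import math

# The cell parameters held as the 2 x 3 table described above:
# first row lengths (a, b, c), second row angles (alpha, beta, gamma)
# in degrees.  Values are invented for illustration.
cell = [[8.0, 9.0, 10.0],        # a, b, c (e.g. in angstroms)
        [95.0, 100.0, 105.0]]    # alpha, beta, gamma

def cell_volume(cell):
    """Volume of the (general, triclinic) unit cell."""
    a, b, c = cell[0]
    ca, cb, cg = (math.cos(math.radians(x)) for x in cell[1])
    return a * b * c * math.sqrt(
        1 - ca ** 2 - cb ** 2 - cg ** 2 + 2 * ca * cb * cg)

print(round(cell_volume(cell), 2))
```

For a cubic cell (all angles 90°, equal lengths) the square root collapses to 1 and the formula reduces to a³, illustrating the abstract's point that the special systems are degenerate cases of the triclinic one.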


Author(s):  
E. H. Timothy Whitten

Statisticians have demonstrated the inappropriateness of percentage data for petrological purposes except when transformations (e.g., log-ratios) are used to avoid inherent closure. Use of open variables for chemical data (perhaps weight per unit volume, g/100cc) would avoid this problem and permit traditional petrological work to be undertaken. Virtually all compositional data used in petrology are expressed as percentages (e.g., SiO2 wt%, muscovite vol%, Si wt%) or parts per million. Geologists depend on percentage and ppm data for studies of petrogenesis, spatial variability, etc. However, for over four decades, statisticians and mathematical geologists have given dire warnings about the dangers of drawing conclusions from percentage (or ratio) data. In consequence, petrological literature abounds with disclaimers about possible adverse effects that closure constraints (stemming from use of percentage data) may have. The abundant warnings have given little help to geologists for two reasons. First, the precise impact of closure on petrologic analyses and conclusions has been unclear or abstract. Second, a practical and realistic way of avoiding closure in petrology has not been apparent. Problem avoidance might involve either (a) applying statistical or mathematical transformations to standard percentage data to escape the inherent closure constraints that compromise petrological conclusions, or (b) using meaningful petrological variables that are free of closure constraints so that traditional thinking and data manipulation can be used without problem. Transformation has been advocated for geological work by Aitchison (e.g., 1982); this approach, which presents considerable geological difficulties, is briefly reviewed here. No attention appears to have been given to the simple approach of using closure-free variables, which is the main subject of this paper. Closed data are compositional data that have a constant sum.
Open data can have any value and do not have the constant-sum constraint. Standard rock chemical analyses are closed because the oxides (or elements, etc.), expressed as percentages, sum to 100; in consequence, at least one negative correlation between the variables must exist (Chayes, 1960). The problem of closure is obvious in two-variable systems. In a quartz-feldspar rock, for example, if quartz percentage increases, feldspar must decrease, so there is inherent negative correlation between the components.
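The closure effect, and the log-ratio escape route advocated by Aitchison, can be demonstrated numerically. In this sketch (data simulated, not from the paper) three independent open variables are closed to percentages, which induces a marked negative correlation between the two major components even though the open values are uncorrelated; the additive log-ratio transform then reopens the data:

```python
import math
import random

# Three OPEN variables (say, g/100cc of quartz, feldspar, mica) drawn
# independently, then "closed" so each row sums to 100 percent.
random.seed(0)
open_data = [[random.uniform(20, 40), random.uniform(20, 40),
              random.uniform(1, 10)] for _ in range(200)]
closed = [[100 * x / sum(row) for x in row] for row in open_data]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

q_open, f_open = [r[0] for r in open_data], [r[1] for r in open_data]
q_closed, f_closed = [r[0] for r in closed], [r[1] for r in closed]
print("open r   =", round(pearson(q_open, f_open), 2))     # near zero
print("closed r =", round(pearson(q_closed, f_closed), 2)) # strongly negative

# Aitchison's additive log-ratio (alr) with the third part as divisor
# removes the constant-sum constraint:
alr = [[math.log(r[0] / r[2]), math.log(r[1] / r[2])] for r in closed]
```

The induced negative correlation is exactly the Chayes (1960) effect described above; the alr values can take any real value and so are free of the constraint.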


Author(s):  
D. Gill ◽  
M. Levinger

An information management and mapping system, combining a series of interactive computer programs for stratigraphic, lithofacies, paleogeographic, and structural analysis with a comprehensive database on subsurface geology, produces contour maps of quantitative variables (structure maps, isopach maps, and maps of lithofacies parameters), detailed lithologic and stratigraphic logs, and printouts of lithofacies parameters for all levels of the lithostratigraphic subdivision. Users communicate by means of simple, on-screen, menu-driven dialogues controlled by FORTRAN programs. The system runs on DEC MicroVAX II computers operating under VMS. This information management and mapping system for subsurface stratigraphic analysis is an integration of a comprehensive database on the subsurface geology of Israel and a series of computer programs for stratigraphic, lithofacies, paleogeographic, and structural analysis. Development of the system, referred to as "ATLAS-RELIANT," was sponsored by OEIL [Israel Oil Exploration (Investment) Ltd.]. The system serves primarily as a storage and retrieval facility for information on the subsurface geology of Israel. Users can obtain printouts of lithologic and stratigraphic logs, contour maps, and value maps. The system originally was developed to run on a CDC machine under the NOS/BE operating system. Later OEIL expanded the database to include many additional items of information [inventory of cores and petrophysical logs, results of production tests, results of petrophysical analyses, geochemical analyses of recovered fluids (water samples and hydrocarbons), and results of quantitative analyses of petrophysical logs] and the system was modified to run on DEC MicroVAX II computers under the VMS operating system (Shertok, 1969).
Among other things, the ATLAS-RELIANT system was instrumental in the regional stratigraphic analysis of the subsurface geology of Israel performed by OEIL during 1968-1988 (OEIL, 1966; Cohen et al., 1990). The database, dubbed "ATLAS," is about 16 MB in size and contains information on 320 petroleum exploration and development boreholes, 50 deep water wells, and 100 columnar sections of outcrops.


Author(s):  
H. A. F. Chaves

Characteristic analysis is well known in mineral resources appraisal and has proved useful for petroleum exploration. It also can be used to integrate geological data in sedimentary basin analysis and hydrocarbon assessment, considering geological relationships and uncertainties that result from lack of basic geological knowledge. A generalization of characteristic analysis, using fuzzy-set theory and fuzzy logic, may prove better for quantification of geologic analogues and also for description of reservoir and sedimentary facies. Characteristic analysis is a discrete multivariate procedure for combining and interpreting data; Botbol (1971) originally proposed its application to geology, geochemistry, and geophysics. It has been applied mainly in the search for poorly exposed or concealed mineral deposits by exploring joint occurrences or absences of mineralogical, lithological, and structural attributes (McCammon et al., 1981). It forms part of a systematic approach to resource appraisal and integration of generalized and specific geological knowledge (Chaves, 1988, 1989; Chaves and Lewis, 1989). The technique usually requires some form of discrete sampling to be applicable—generally a spatial discretization of maps into cells or regular grids (Melo, 1988). Characteristic analysis attempts to determine the joint occurrences of various attributes that are favorable for, related to, or indicative of the occurrence of the desired phenomenon or target. In geological applications, the target usually is an economic accumulation of energy or mineral resources.
Applying characteristic analysis requires the following steps: 1) the studied area is sampled using a regular square or rectangular grid of cells; 2) in each cell the favorabilities of the variables are expressed in binary or ternary form; 3) a model is chosen that indicates the cells that include the target (Sinding-Larsen et al., 1979); and 4) a combined favorability map of the area is produced that points out possible new targets. The favorability of individual variables is expressed either in binary form—assigning a value of +1 to favorable and a value of 0 to unfavorable or unevaluated variables—or in ternary form if the two states represented by 0 are distinguishable—the value +1 again means favorable, the value −1 means unfavorable, and the value 0 means unevaluated.
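A toy numeric sketch of steps 2 through 4 may help (cell names, ternary values, and the simple weighting scheme below are invented for illustration; Botbol's actual procedure is more elaborate). Cells known to contain the target define per-variable weights, and every cell then receives a combined favorability between −1 and +1:

```python
# Each cell carries ternary favorabilities (+1 favorable, -1 unfavorable,
# 0 unevaluated) for three hypothetical variables.
cells = {
    "A1": [+1, +1, 0],   # known target cell
    "A2": [+1, 0, +1],   # known target cell
    "B1": [+1, -1, 0],
    "B2": [-1, -1, +1],
}
targets = ["A1", "A2"]
nvars = 3

# Weight each variable by its mean value over the known target cells.
weights = [sum(cells[t][j] for t in targets) / len(targets)
           for j in range(nvars)]

# Combined favorability: weighted sum, scaled to the maximum attainable,
# so scores fall in [-1, +1].
scale = sum(abs(w) for w in weights)
favorability = {name: sum(w * v for w, v in zip(weights, vals)) / scale
                for name, vals in cells.items()}
print(favorability)
```

Cells scoring close to the known target cells (here B1 scores higher than B2) are the "possible new targets" a combined favorability map would highlight.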


Author(s):  
Hans Wackernagel ◽  
Henri Sanguinetti

In geochemical prospecting for gold a major difficulty is that many values are below the chemical detection limit. Tracers for gold thus play an important role in the evaluation of multivariate geochemical data. In this case study we apply geostatistical methods presented in Wackernagel (1988) to multielement exploration data from a prospect near Limoges, France. The analysis relies upon a metallogenetic model by Bonnemaison and Marcoux (1987, 1990) describing auriferous mineralization in shear zones of the Limousin. The aim of geochemical exploration is to find deposits of raw materials. What is a deposit? It is a geological anomaly which has a significant average content of a given raw material and enough spatial extension to have economic value. The geological body defined by an anomaly is generally buried at a specific depth and may be detectable at the surface through indices. These indices, which we shall call superficial anomalies, occur in three ways: at isolated locations, along faults, and as dispersion halos. These two definitions of the word "anomaly" correspond to a vision of the geological phenomenon in its full continuity. Yet in exploration geochemistry only a discrete perception of the phenomenon is possible through samples taken along a regularly meshed grid. A superficial anomaly thus can be captured by one or several samples, or it can escape the geochemist entirely when it lies between the nodes of the mesh. A geochemical anomaly, in the strict sense, only exists at the nodes of the sampling grid and we shall distinguish between: a pointwise anomaly defined on a single sample, and a groupwise anomaly defined on several neighboring samples. This distinction is important both upstream, for the geological interpretation of geochemical measurements, and downstream, at the level of geostatistical manipulation of the data. It will condition an exploration strategy on the basis of the data representations used in this case study.
A pointwise anomaly, i.e., a high, isolated value of the material being sought, will correspond either to a geological phenomenon of limited extent or to a well hidden deposit.
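The pointwise/groupwise distinction can be sketched with a toy classifier on a one-dimensional sampling transect (the values and threshold below are invented; real work would use the geostatistical tools of Wackernagel, 1988, not a fixed cutoff):

```python
# Classify samples along a 1-D transect: a pointwise anomaly is a single
# high value with no anomalous neighbours; a groupwise anomaly is a run
# of neighbouring high values.
values = [2, 3, 2, 40, 3, 2, 35, 38, 41, 3, 2]   # e.g. hypothetical ppb Au
threshold = 20                                    # assumed cutoff

anom = [v > threshold for v in values]

def classify(i):
    if not anom[i]:
        return "background"
    left = anom[i - 1] if i > 0 else False
    right = anom[i + 1] if i < len(anom) - 1 else False
    return "groupwise" if (left or right) else "pointwise"

labels = [classify(i) for i in range(len(values))]
print(labels)
```

The isolated high value (index 3) is flagged pointwise, the run of three neighbours (indices 6 to 8) groupwise, matching the distinction drawn in the text.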


Author(s):  
F.P. Agterberg ◽  
G.F. Bonham-Carter

During the past few years, we have developed a method of weights of evidence modeling for mineral potential mapping (cf. Agterberg, 1989; Bonham-Carter et al., 1990). In this paper, weights of evidence modeling and logistic regression are applied to occurrences of hydrothermal vents on the ocean floor, East Pacific Rise near 21° N. For comparison, logistic regression is also applied to occurrences of gold deposits in Meguma Terrane, Nova Scotia. The volcanic, tectonic, and hydrothermal processes along the central axis of the East Pacific Rise at 21° N were originally studied by Ballard et al. (1981). Their maps were previously taken as the starting point for a pilot project on estimation of the probability of occurrence of polymetallic massive sulfide deposits on the ocean floor (Agterberg and Franklin, 1987). In the earlier work, presence or absence of deposits in relatively large square cells was related to explanatory variables quantified for small square cells (pixels) by means of stepwise multiple regression and logistic regression. In this paper, weights of evidence modeling and weighted logistic regression are applied to the same maps, but a geographic information system (Intera-TYDAC SPANS, 1991) was used to create polygons for combinations of maps. These polygons can be classified by combining the classes of each input map. Probabilities estimated for the resulting "unique conditions" can be classified and displayed. The vents are correlated with only a few patterns and it is relatively easy to interpret the weights and final probability maps in terms of the underlying volcanic, tectonic, and hydrothermal processes. The vents are situated along the central axis of the rise together with the youngest volcanics. They occur at approximately the same depth below sea level, tend to be associated with pillow flows rather than sheet flows, and with the absence of fissures, which are more prominent in older volcanics.
Unlike weights of evidence modeling, weighted logistic regression (cf. Agterberg, 1992, for discussion of the algorithm) can be applied when the explanatory variables are not conditionally independent. This method was previously applied by Reddy et al. (1991) to volcanogenic massive sulfide deposits in the Snow Lake area of Manitoba.
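The core of weights of evidence modeling is a pair of log-likelihood ratios computed from simple cell counts. A minimal sketch with invented counts (not data from the East Pacific Rise or Meguma studies):

```python
import math

# B = evidence pattern present, D = deposit (or vent) present.
# All counts below are hypothetical unit-cell tallies.
n_total = 1000      # total unit cells in the study area
n_D = 20            # cells containing a deposit
n_B = 150           # cells where the evidence pattern is present
n_BD = 12           # cells with both pattern and deposit

p_B_given_D = n_BD / n_D
p_B_given_notD = (n_B - n_BD) / (n_total - n_D)

# W+ applies where the pattern is present, W- where it is absent.
W_plus = math.log(p_B_given_D / p_B_given_notD)
W_minus = math.log((1 - p_B_given_D) / (1 - p_B_given_notD))

print(f"W+ = {W_plus:.3f}, W- = {W_minus:.3f}")
```

A positive W+ together with a negative W− indicates the pattern is a useful positive predictor; summing the applicable weights over several patterns (under the conditional-independence assumption the paragraph above mentions) updates the prior log-odds of a deposit in each unique condition.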


Author(s):  
Václav Němec

Friends and associates of Daniel F. Merriam have prepared this volume in Dan's honor to commemorate his 65th birthday and mark the 25th anniversary of the International Association for Mathematical Geology. This compendium is in the tradition of the Festschriften issued by European universities and scholarly organizations to honor an individual who has bequeathed an exceptional legacy to his students, associates, and his discipline. Certainly Dan has made such an impact on geology, and particularly mathematical geology. It is a great privilege for me to write the introduction to this Festschrift. The editors are to be congratulated for their idea to collect and to publish so many representative scientific articles written by famous authors of several generations. Dan Merriam is the most famous mathematical geologist in the world. This statement will probably provoke some criticism against an over-glorification of Dan. Some readers will have their own candidates (including themselves) for such a top position. I would like to offer testimony that the statement is correct and far from an ad hoc judgment only for this solemn occasion. It may be of interest to describe how I became acquainted with Dan. In my opinion this will show how thin and delicate was the original tissue of invisible ties which helped to build up the first contacts among Western and Eastern colleagues in the completely new discipline of mathematical geology. The role of Dan Merriam in opening and increasing these contacts has been very active indeed. In the fall of 1964 I was on a family visit in the United States. This was—after the coup of Prague in 1948—my first travel to the free Western world. With some experience in computerized evaluation of ore deposits, I was curious to see the application of computers in geology and to meet colleagues who had experience with introducing statistical methods into regular estimation of ore reserves. I had very useful contacts in Colorado and in Arizona.
In Tucson I visited the real birthplace of the APCOM symposia.


Author(s):  
R.G. Craig

Reconstructions of past climates and other applications of global models require specification of landforms and geomorphic systems as boundary conditions. As general circulation models become more sophisticated and comprehensive and the range of applications for reconstructions grows, there will be an increasing demand for valid geomorphic boundary conditions throughout the Phanerozoic. Geomorphologists have not yet developed the tools and expertise needed to produce reconstructions, so a major gap in understanding of global change now exists. A strategy to fill that gap is presented here. If geomorphology is the study of the form of the land, and if the form of the land can be described by a set of numbers (digital elevation models), then whither geomorphology? Do we become numericists, mathematicians, and statisticians? If geomorphology is the study of the processes shaping the land, and if the form is completely explained by a set of processes and an initial condition (i.e., landform at some earlier time), then two "knowns," current form and current processes, are essential. But if processes themselves change through time and there is an infinite set of initial conditions, one set for each point in time, then the job becomes overwhelmingly complex. As the geomorphic community has become painfully aware of the difficulties of deep reconstructions, we have withdrawn into the Quaternary, a period during which many simplifying assumptions can be made which allow solution of geomorphic problems. Hence the Geological Society of America lumps "Quaternary Geology" and "Geomorphology" into one division. We have become so comfortable with the notion that geomorphology and Quaternary geology are synonymous that we have lost sight of the goals of founders of the science such as William Morris Davis and John Wesley Powell who strove to unearth the landforms of the distant past.
Of course there is plenty to keep us busy in the good ol’ Quaternary; but we shouldn't ignore the enormous challenges and opportunities that await those who would reconstruct landforms of earlier times.


Author(s):  
Ute Christina Herzfeld

"Fractals" and "chaos" have become increasingly popular in geology; however, the use of "fractal" methods is mostly limited to simple cases of selfsimilarity, often taken as the prototype of a scaling property if not mistaken as equivalent to a fractal as such. Here; a few principles of fractal and chaos theory are clarified, an overview of geoscience applications is given, and possible pitfalls are discussed. An example from seafloor topography relates fractal dimension, self-similarity, and multifractal cascade scaling to traditional geostatistical and statistical concepts. While the seafloor has neither self-similar nor cascade scaling behavior, methods developed in the course of "fractal analysis" provide ways to quantitatively describe variability in spatial structures across scales arid yield geologically meaningful results. Upon hearing the slogan "the appleman reigns between order and chaos" in the early 1980's and seeing colorful computer-generated pictures, one was simply fascinated by the strangely beautiful figure of the "appleman" that, when viewed through a magnifying glass, has lots of parts that, are smaller, and smaller, and smaller applemen. The "appleman" is the recurrent feature of the Mandelbrot set, a self-similar fractal, and in a certain sense, the universal fractal (e.g., see Peitgen and Saupe, 1988, p. 195 ff.). Soon the realm of the appleman expanded, made possible by increasing availability of fast, cheap computer power and increasingly sophisticated computer graphics. In its first phase of popularity, when the Bremen working group traveled with their computer graphics display seeking public recognition through exhibits in the foyers of savings banks, the fractal was generally considered to be a contribution to modern art (Peitgen and Richter, The Beauty of Fractals, 1986). 
While the very title of Mandelbrot's famous book, The Fractal Geometry of Nature (1983), proclaims the discovery of the proper geometry to describe nature, long hidden by principles of Euclidean geometry, the "fractal" did not appeal to Earth scientists for well over two decades after its rediscovery by Mandelbrot (1964, 1965, 1967, 1974, 1975).
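The box-counting estimate of fractal dimension, the workhorse behind many of the "fractal" analyses the chapter reviews, can be sketched in a few lines. A plain line segment is used as the test set (an assumption made here purely for illustration), so the slope should come out near the true dimension of 1:

```python
import math

# Points on the line y = x over [0, 1]; a one-dimensional set.
points = [(i / 10000, i / 10000) for i in range(10001)]

def box_count(points, eps):
    """Number of eps-sized grid boxes needed to cover the point set."""
    return len({(int(x / eps), int(y / eps)) for x, y in points})

scales = [2 ** -k for k in range(2, 8)]          # box sizes 1/4 ... 1/128
counts = [box_count(points, e) for e in scales]

# The box-counting dimension is the slope of log N(eps) against
# log(1/eps), here fitted by ordinary least squares.
xs = [math.log(1 / e) for e in scales]
ys = [math.log(n) for n in counts]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
print(round(slope, 2))   # close to 1 for a one-dimensional set
```

For a genuinely self-similar set the same slope would be non-integer; the chapter's caution is that a straight log-log fit like this one says nothing by itself about whether the underlying set really is self-similar.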

