SEISMIC DATA ENHANCEMENT—A CASE HISTORY
The theory behind many of the methods the geophysicist can control to improve record quality, multiple-seismometer techniques among them, is well known; its application, however, has not been fully exploited. An example of the reduction of theory to practice in one area characterized by poor records is presented. It comprises a series of analytical tests designed to discover the cause of the poor records, to examine the effect of each variable on the signal-to-noise ratio, and to evaluate the solutions predicted by theory. The tests showed that the poor record quality was attributable chiefly to relatively strong surface and near-surface waves propagating outward from the shot. Wavelength filtering by means of suitable shot and seismometer patterns, together with compositing through data-processing methods, greatly improved record quality and permitted magnetic recording of reflected signals over a broad frequency range. Within the allotted time, the tests established that the quality of the data would meet clearly specified standards of performance. Experience has shown that better seismic data are generally obtained when techniques are designed around the particular character of the signal and noise, determined from simple tests, rather than around general assumptions.
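
To make the two mechanisms named above concrete, the following minimal Python sketch (an illustration only, not part of the original tests) evaluates the standard response of a uniform in-line pattern of n equally weighted seismometers spaced d apart, |sin(n*pi*d/L) / (n*sin(pi*d/L))| for a wave of horizontal wavelength L, together with the sqrt(n) signal-to-noise gain expected when n records with coherent signal and uncorrelated random noise are composited. The pattern size, spacing, and example wavelengths are assumed values chosen for illustration and do not come from the field tests described.

import math

def pattern_response(n: int, d: float, wavelength: float) -> float:
    """Amplitude response of a uniform in-line pattern of n equally weighted
    seismometers spaced d apart, for a horizontally travelling wave of the
    given wavelength: the array factor |sin(n*pi*d/L) / (n*sin(pi*d/L))|."""
    x = math.pi * d / wavelength
    den = n * math.sin(x)
    if abs(den) < 1e-12:  # wavelength is an exact multiple of the spacing
        return 1.0
    return abs(math.sin(n * x) / den)

def compositing_gain(n: int) -> float:
    """Expected signal-to-noise improvement from summing n records whose signal
    adds coherently while the noise is random and uncorrelated: sqrt(n)."""
    return math.sqrt(n)

if __name__ == "__main__":
    n, d = 12, 10.0  # assumed: 12 geophones at 10-m spacing (120-m pattern)
    for wavelength in (55.0, 120.0, 2000.0):  # assumed noise vs. reflection wavelengths
        r = pattern_response(n, d, wavelength)
        print(f"wavelength {wavelength:7.1f} m -> relative amplitude {r:.3f}")
    # Short-wavelength shot-generated waves are strongly attenuated (a wavelength
    # equal to the pattern length, 120 m, falls in a null), while a near-vertical
    # reflection with a very long apparent wavelength passes almost unchanged.
    print(f"compositing 16 records -> expected S/N gain ~ {compositing_gain(16):.1f}x")

The notch at a wavelength equal to the pattern length and the near-unity response at very long apparent wavelengths illustrate why a pattern sized to the measured noise wavelengths can suppress the shot-generated surface waves while leaving reflected signals essentially untouched, and why compositing then further improves the signal-to-noise ratio.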