Seismic signal processing—A new millennium perspective
In the introduction to his comprehensive SEG textbook, Seismic Data Processing, Oz Yilmaz selects deconvolution, common‐midpoint stacking, and migration as the three principal processes applied during routine seismic processing. Since Yilmaz's tome was first published in 1987, a vast number of papers and conference presentations have addressed virtually every aspect of seismic processing. Even so, I think it is still accurate to say that these same three processes dominate the processing flows applied to the vast majority of seismic data now, at the beginning of the twenty‐first century.

This is not to say that important progress has not been made in many aspects of seismic processing, or that much more sophisticated processing flows are not now applied to some datasets. But it is a great tribute to the real pioneers of our profession that the very same, or similar, algorithms they invented still form the backbone of everyday processing done around the world today. These are the people who advanced our ideas of seismic processing from examining raw analog records in the field to creating crisp computer‐generated images of the subsurface with processes such as deconvolution, stack, and migration.

In fact, there are times when it seems that the last great geophysicist was Carl Friedrich Gauss, because the method he published back in 1823 of minimizing the sum of the squared errors seems to be used almost everywhere one looks in seismic processing, from deconvolution to migration.
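For readers who want that ubiquity spelled out in symbols, the least-squares principle can be written in one line; the notation below is a generic sketch of my own (d for observed data, G for a forward or convolution operator, m for the unknown model or filter coefficients), not the formulation of any particular processing algorithm:

\[
\hat{m} \;=\; \arg\min_{m}\,\lVert d - G m \rVert_2^{2}
\qquad\Longrightarrow\qquad
\left( G^{\mathsf T} G \right) \hat{m} \;=\; G^{\mathsf T} d .
\]

Whether the unknowns are the coefficients of a Wiener deconvolution filter or the reflectivity sought by a least-squares formulation of migration, essentially these same normal equations lie underneath.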