Geophysicists rely on seismic data to understand the Earth’s subsurface. The data recorded at a seismic receiver contains two types of information convolved into a single signal: information about the source of the signal (source effects) and information about the subsurface features it passed through on the way to the receiver (path effects). Current methods for separating the two rely on assumptions that may not hold: extracting source effects requires assumptions about the path, and extracting path effects requires assumptions about the source.
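The convolutional model above can be illustrated with a toy example. This is a sketch with made-up numbers, not data or code from the paper; `source` stands for the unknown source wavelet and `path` for the subsurface impulse response to one receiver.

```python
import numpy as np

# Toy illustration of the convolutional model (all values are made up).
source = np.array([0.0, 1.0, -0.5, 0.2])   # unknown source signature
path = np.array([1.0, 0.0, 0.3])           # unknown path response (Green's function)

# The receiver measures only the convolution of the two:
record = np.convolve(source, path)

# From `record` alone, many (source, path) pairs fit the data equally well;
# e.g. scaling the source by c and the path by 1/c leaves `record` unchanged:
same_record = np.convolve(2.0 * source, path / 2.0)
print(np.allclose(record, same_record))   # True
```

This ambiguity is exactly why conventional deconvolution of a single recording needs outside assumptions about either the source or the path.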
In a recent paper, ERL researcher Pawan Bharadwaj, in collaboration with EAPS research scientist Aimé Fournier and Professor of Applied Mathematics Laurent Demanet, introduced a new mathematical method, “Focused Blind Deconvolution,” that can extract source and path information without relying on such assumptions. Instead, the method compares data from the same source recorded at multiple receivers and solves an optimization problem that identifies similarities and differences among the recordings. Similarities among the signals can be attributed to the source, while dissimilarities indicate path effects.
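One consequence of a shared source, which multichannel methods of this kind can exploit, is that two receivers' recordings are linked by a cross-channel identity. The sketch below (with illustrative numbers, not the paper's actual algorithm or data) shows that convolving each recording with the *other* receiver's path yields the same signal, because both equal the source convolved with both paths:

```python
import numpy as np

# Two receivers record the same unknown source through different paths
# (toy values, purely illustrative).
source = np.array([0.0, 1.0, -0.5, 0.2])
path_a = np.array([1.0, 0.0, 0.3])
path_b = np.array([0.0, 0.8, 0.0, 0.4])

record_a = np.convolve(source, path_a)   # receiver A's data
record_b = np.convolve(source, path_b)   # receiver B's data

# Since convolution is commutative and associative, both sides below
# equal source * path_a * path_b -- a relation between the recordings
# that no longer depends on knowing the source wavelet.
lhs = np.convolve(record_a, path_b)
rhs = np.convolve(record_b, path_a)
print(np.allclose(lhs, rhs))   # True
```

Relations like this are what make it possible, in principle, to constrain the paths from multichannel data alone; the paper's focusing criterion addresses the remaining ambiguities.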
Because it does not require these assumptions, the method could provide more accurate information both to earthquake scientists (who are generally interested in the source) and to energy-industry scientists (who are generally interested in the path). The researchers demonstrate the method by applying it to the Marmousi model, a standard test model in geophysics.
Bharadwaj is a postdoctoral associate in ERL and the Department of Mathematics, and Fournier is a Research Scientist and Principal Investigator in ERL and the Department of Earth, Atmospheric and Planetary Sciences. Prof. Demanet is the director of ERL and a joint faculty member in both departments.
Video: “Focused Blind Deconvolution,” created by the researchers for the 2018 SEG Annual Meeting.
Story image credit: Petr Brož (Czech Academy of Sciences), CC BY-SA 4.0.