Seismic sensor networks collect data that are automatically processed to detect a variety of sources, such as earthquakes, underground explosions, volcanic eruptions, induced microfractures, road usage, and footsteps. Better automatic detection of seismic activity will enable more accurate prediction of future events. Typical automated processing of seismic sensor data produces many false signal detections that are not associated with events of interest. This is partly because detection parameters are set as sensitively as possible to avoid missing real signals, with the accepted cost of many false detections that can in turn produce false events. The problem is exacerbated by noise conditions at each sensor that vary dynamically, so parameters that work well during one period may work poorly during another. The quality of automatic signal detections from a sensor network depends on the detector trigger level (TL) at each sensor. The largely manual process of identifying effective TLs is painstaking and does not guarantee optimal configuration settings, yet superior automatic detection of signals, and ultimately of events, depends closely on these parameters.
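The trade-off a fixed trigger level imposes can be illustrated with a toy amplitude-threshold detector. This is a hypothetical sketch, not Sandia's detector: the detector, thresholds, and synthetic data below are illustrative assumptions only.

```python
import random

def detect(samples, tl):
    """Return indices where absolute amplitude exceeds the trigger level (TL)."""
    return [i for i, s in enumerate(samples) if abs(s) > tl]

random.seed(0)
# Background noise with one genuine "event of interest" injected at index 500.
trace = [random.gauss(0.0, 1.0) for _ in range(1000)]
trace[500] = 6.0  # signal well above the noise floor

sensitive = detect(trace, tl=2.0)  # low TL: catches the event, plus many false triggers
strict = detect(trace, tl=8.0)     # high TL: no false triggers, but the event is missed

assert 500 in sensitive and len(sensitive) > 1  # true detection buried among false ones
assert 500 not in strict                        # event missed entirely
```

No single fixed TL eliminates both failure modes here, which is why operators bias toward the sensitive setting and accept the false detections.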
Researchers at Sandia have developed a model wherein sensor parameters are dynamically adjusted to balance missing signals from events of interest against detecting false signals. After a stabilization period, the Dynamic Detector Tuning (DDT) system adapts in near real time to changing conditions, automatically tuning each signal detector toward detecting only signals from events of interest. Our work focuses on reducing false signal detections early in the seismic signal processing pipeline, which leads to fewer false events and significantly reduces analyst time and effort.
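The adaptation idea can be sketched as a feedback rule: nudge a sensor's TL down when its neighbors detect a signal it missed, and up when it reports a detection its neighbors do not corroborate. The update rule, step size, and quorum below are illustrative assumptions, not the published DDT algorithm.

```python
def update_tl(tl, detected, neighbor_detections, step=0.5, quorum=0.5,
              tl_min=0.5, tl_max=10.0):
    """Adjust one sensor's trigger level based on neighborhood (dis)agreement."""
    agree = sum(neighbor_detections) / len(neighbor_detections)
    if not detected and agree >= quorum:
        tl -= step  # neighbors saw a signal we missed: become more sensitive
    elif detected and agree < quorum:
        tl += step  # uncorroborated detection, likely false: become less sensitive
    return min(max(tl, tl_min), tl_max)  # keep TL within an operating range

tl = 3.0
tl = update_tl(tl, detected=False, neighbor_detections=[True, True, True])   # -> 2.5
tl = update_tl(tl, detected=True, neighbor_detections=[False, False, True])  # -> 3.0
```

Applying such a rule continuously at each sensor yields per-sensor TLs that track local noise conditions rather than a single network-wide setting.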
Sandia’s Dynamic Detector Tuning system provides an important new method for automatically tuning detector TLs across a network of sensors, applicable both to boosting the performance of existing sensors and to deploying new ones. This work demonstrates a dynamic TL tuning methodology for signal detector settings that assumes nothing about the underlying waveforms. DDT uses agreement (or lack thereof) within a neighborhood of seismic sensors to adapt a custom TL for each sensor in near real time, improving signal detection quality. Fewer false and missed detections lead to superior signal association, event detection, and event location later in the seismic signal processing pipeline. Specific results demonstrated that DDT reduces the number of false detections by 18% and the number of missed detections by 11% when compared with optimal fixed TLs for all sensors.