Friday, 4 July 2025

Trend Detection in Time Series Using the Minimizing Sets Method: Applications in Space Geodynamics and Beyond | Chapter 8 | Physical Science: New Insights and Developments Vol. 1

A significant obstacle to achieving the accuracy required of modern computing complexes and systems is the presence of gross errors (outliers) in measurement time series, which must be detected at the preprocessing stage. Proper detection and removal of outliers from measurement data is a necessary step in any scientific application aiming at the most accurate final result. This chapter addresses the challenge of detecting trends in time series data contaminated with outliers, a common issue in fields such as space geodynamics, geodesy, and other measurement-driven sciences. The proposed solution builds on the author’s previously developed Minimizing Sets (MS) method, which iteratively constructs a trend by maximizing the usable portion of the data and minimizing the influence of outliers. The method is extended beyond power polynomials to include trigonometric functions with fixed frequencies and harmonic functions with unknown parameters, increasing its applicability across diverse physical systems. Simulations demonstrate that the approach yields accurate trend approximations and robust outlier detection without relying on arbitrary thresholds. The technique shows promise for high-precision scientific applications requiring reliable preprocessing of noisy measurement data.
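To make the general idea concrete, the sketch below illustrates iterative, outlier-resistant trend fitting with a basis of power polynomials plus fixed-frequency harmonics. It is a simplified illustration under assumed details (least-squares refitting, a MAD-based residual scale, a 3-sigma cut), not the author's exact Minimizing Sets algorithm; the function names, the synthetic data, and all parameter values are hypothetical.

```python
import numpy as np

def design_matrix(t, degree=2, periods=()):
    """Basis of power polynomials plus harmonics with fixed (known) periods."""
    cols = [t**k for k in range(degree + 1)]
    for p in periods:
        cols.append(np.cos(2.0 * np.pi * t / p))
        cols.append(np.sin(2.0 * np.pi * t / p))
    return np.column_stack(cols)

def robust_trend(t, y, degree=2, periods=(), n_sigma=3.0, max_iter=20):
    """Iteratively refit the trend, discarding points whose residuals exceed
    n_sigma times a robust (MAD-based) scale estimate.  A simplified sketch of
    outlier-resistant trend detection, NOT the exact MS method of the chapter."""
    A = design_matrix(t, degree, periods)
    mask = np.ones(len(t), dtype=bool)  # points currently treated as usable
    coeffs = np.zeros(A.shape[1])
    for _ in range(max_iter):
        coeffs, *_ = np.linalg.lstsq(A[mask], y[mask], rcond=None)
        resid = y - A @ coeffs
        med = np.median(resid[mask])
        # robust residual scale via the median absolute deviation
        scale = 1.4826 * np.median(np.abs(resid[mask] - med))
        new_mask = np.abs(resid - med) <= n_sigma * scale
        if np.array_equal(new_mask, mask):
            break  # converged: the usable set no longer changes
        mask = new_mask
    return coeffs, mask

# synthetic series: linear drift + one harmonic + noise + injected gross errors
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 400)
y = 1.0 + 0.3 * t + 0.8 * np.sin(2.0 * np.pi * t / 2.5) + rng.normal(0.0, 0.1, t.size)
y[[50, 200, 333]] += 5.0  # outliers
coeffs, mask = robust_trend(t, y, degree=1, periods=(2.5,))
```

The converged `mask` marks the data points kept as usable, while its complement flags the detected outliers; the cut is driven by the data's own residual scatter rather than an externally imposed fixed threshold, echoing the chapter's goal of avoiding arbitrary thresholds.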

 

Author(s) Details

Igor V. Bezmenov
Russian Metrological Institute of Technical Physics and Radio Engineering, Mendeleevo, Moscow Region, Russia.

 

Please see the book here: https://doi.org/10.9734/bpi/psniad/v1/5621
