Detrending your time-series might be a game-changer
Detrending a signal before computing its Fourier transform is a common practice, especially when dealing with time-series.
In this post, I want to show both mathematically and visually how detrending your signal affects its Fourier transform.
All images by author.
This post is the fourth in my Fourier transform for time-series series: I use very simple examples and a few mathematical formulas to explain various concepts of the Fourier transform. You don’t need to read them in the order below; I’d rather recommend going back and forth between the articles.
Check out the previous posts here:
- Review how the convolution relates to the Fourier transform and how fast it can be computed:
- Deepen your understanding of convolution using image examples:
- Understand how the Fourier transform can be visually interpreted using a vector-visual approach:
In this post, we are going to explore two kinds of detrending: we’ll call them ‘constant’ and ‘linear’ detrending.
The end goal of this post is to make you understand what constant and linear detrending are, why we use them, and how they affect the Fourier transform of the signal.
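Before diving in, here is a minimal sketch of what the two operations do, using plain NumPy (the `detrend` helper and the example signal are my own illustration, not code from this series): constant detrending subtracts the signal’s mean, while linear detrending subtracts the least-squares line fitted to the signal.

```python
import numpy as np

def detrend(x, kind="constant"):
    """Remove a 'constant' (mean) or 'linear' (least-squares line) trend."""
    x = np.asarray(x, dtype=float)
    if kind == "constant":
        return x - x.mean()
    # Linear: fit x ~ a*n + b over the sample index n, subtract the fitted line
    n = np.arange(len(x))
    a, b = np.polyfit(n, x, 1)
    return x - (a * n + b)

# Example: a 5 Hz sine wave riding on an offset plus a linear ramp
t = np.linspace(0, 1, 500, endpoint=False)
signal = np.sin(2 * np.pi * 5 * t) + 3.0 + 2.0 * t

const_dt = detrend(signal, "constant")  # mean removed -> zero-frequency bin vanishes
lin_dt = detrend(signal, "linear")      # mean and slope removed
```

SciPy offers the same two modes as `scipy.signal.detrend(x, type="constant")` and `type="linear"`; the sketch above just makes the arithmetic explicit.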
This post originally appeared on TechToday.