The Nyquist theorem (also known as the Nyquist–Shannon sampling theorem) is a fundamental principle in signal processing that defines the minimum sampling rate required to capture a continuous signal without losing information. According to the theorem, a bandlimited signal can be perfectly reconstructed from its samples only if the sampling rate exceeds twice the highest frequency present in the signal; in practice this is usually stated as "at least twice," since real signals rarely have energy exactly at the band edge. This minimum rate is known as the Nyquist rate.
For example, if a signal contains frequencies up to 1,000 Hz, it should be sampled at a minimum of 2,000 samples per second. Failing to meet this requirement leads to a phenomenon called aliasing, in which frequencies above half the sampling rate fold back and are misrepresented as lower frequencies, producing a distorted signal. A 1,200 Hz tone sampled at 2,000 Hz, for instance, is indistinguishable from an 800 Hz tone, since the aliased frequency appears at |2,000 − 1,200| = 800 Hz.
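The folding effect described above can be demonstrated numerically. The following sketch (using NumPy; the specific frequencies are illustrative choices, not from the original text) samples a 1,200 Hz sine wave at 2,000 Hz and inspects the spectrum of the result: the dominant peak lands at 800 Hz, not 1,200 Hz, because 1,200 Hz exceeds the 1,000 Hz Nyquist frequency.

```python
import numpy as np

fs = 2000        # sampling rate in Hz (illustrative choice)
f_true = 1200    # tone frequency, above the Nyquist frequency of fs/2 = 1000 Hz
n = fs           # one second of samples, giving 1 Hz frequency resolution

# Sample the continuous-time sine at discrete instants t = k / fs
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f_true * t)

# Locate the dominant frequency in the sampled signal via the FFT
spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(n, d=1 / fs)
f_observed = freqs[np.argmax(spectrum)]

# The 1200 Hz tone folds back to |fs - f_true| = 800 Hz
print(f_observed)  # → 800.0
```

Running the same experiment with `fs = 4000` (comfortably above twice `f_true`) would place the spectral peak at the true 1,200 Hz, illustrating why meeting the Nyquist rate matters.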