L^1 norm
The L^1 norm, also known as the Manhattan norm or taxicab norm, measures the length of a vector by summing the absolute values of its components; the distance it induces between two points is the sum of the absolute differences of their coordinates, like the distance traveled along a rectangular street grid. This norm is widely used in statistics and machine learning, where it quantifies differences between vectors and appears in optimization objectives such as least absolute deviations regression and L^1 regularization.
In mathematical terms, if a vector is represented as x = (x_1, x_2, \ldots, x_n), the L^1 norm is expressed as ||x||_1 = |x_1| + |x_2| + \ldots + |x_n|. Because each component contributes only linearly, the L^1 norm is less sensitive to outliers than the L^2 norm, which squares each component and therefore lets large values dominate the total. For example, for x = (3, -4, 1) the L^1 norm is |3| + |-4| + |1| = 8.
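The formula above translates directly into code. The sketch below computes the L^1 norm by hand and checks it against NumPy's `numpy.linalg.norm` with `ord=1`; the helper name `l1_norm` is an assumption for illustration, not part of any library.

```python
import numpy as np

def l1_norm(x):
    # Hypothetical helper: sum of absolute values of the components,
    # i.e. ||x||_1 = |x_1| + |x_2| + ... + |x_n|.
    return sum(abs(xi) for xi in x)

x = [3, -4, 1]
manual = l1_norm(x)                   # |3| + |-4| + |1| = 8
via_numpy = np.linalg.norm(x, ord=1)  # NumPy's built-in L^1 norm

print(manual, via_numpy)
```

Compare this with the L^2 norm of the same vector, sqrt(9 + 16 + 1) ≈ 5.10: replacing the component -4 with -40 multiplies the L^1 norm by roughly 5.5 but the L^2 norm by almost 8, illustrating how squaring amplifies the influence of a single large component.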