Big Ω
Big Ω (Omega) notation is a mathematical concept used in computer science to describe a lower bound on an algorithm's running time. It guarantees that, for sufficiently large inputs, the algorithm takes at least an amount of time proportional to some function of the input size. This helps in understanding the efficiency of algorithms, especially when comparing different approaches, as the sketch below illustrates.
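As a minimal illustration (assuming a plain Python list; the counter is added purely for demonstration), linear search must examine every element when the target is absent, so its worst-case running time is Ω(n), while its best case, finding the target at the first position, is Ω(1):

```python
def linear_search(items, target):
    """Return (index of target or -1, number of comparisons made)."""
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

# Worst case: the target is absent, so all n elements must be compared.
index, count = linear_search(list(range(1000)), -1)
print(index, count)  # -1 1000 -> at least n = 1000 comparisons
```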
In formal terms, an algorithm with running time T(n) is said to be in Ω(f(n)) if there exist positive constants c and n₀ such that T(n) ≥ c * f(n) for all n ≥ n₀. Where Big O bounds the running time from above, Big Ω bounds it from below, which is why it is particularly useful for analyzing the best-case scenario of an algorithm's performance.
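To make the definition concrete, here is a minimal sketch that checks it numerically for an assumed running-time function T(n) = n²/2 - 3n, which is in Ω(n²). The witnesses c = 1/4 and n₀ = 12 are one valid choice picked for illustration, not the only ones:

```python
def T(n):
    """Hypothetical running-time function, chosen for illustration."""
    return n * n / 2 - 3 * n

def f(n):
    """The candidate lower-bound function: f(n) = n^2."""
    return n * n

c = 0.25   # witness constant
n0 = 12    # threshold: T(n) - c*f(n) = n*(n/4 - 3) >= 0 once n >= 12

# Spot-check T(n) >= c * f(n) for a sample of n >= n0.
for n in range(n0, 10_001):
    assert T(n) >= c * f(n), f"definition violated at n = {n}"

print("T(n) >= 0.25 * n^2 for all sampled n >= 12, so T(n) is in Omega(n^2)")
```

The same algebra works by hand: T(n) - c * f(n) = n²/4 - 3n = n(n/4 - 3), which is nonnegative exactly when n ≥ 12, matching the chosen n₀.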