"O(n)" is a notation used in computer science to describe the time complexity of an algorithm. It indicates that the time it takes to complete the algorithm increases linearly with the size of the input data, denoted as n. For example, if an algorithm processes each item in a list, doubling the list size will roughly double the time required to complete the task.
This concept is part of a broader classification known as Big O notation, which describes how an algorithm's resource requirements grow as the input grows and makes it possible to compare the efficiency of different approaches. Understanding time complexity, including O(n), is crucial for developers who want to optimize code and improve performance, especially when dealing with large datasets.
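As a rough illustration of why this matters for large datasets, here is a hypothetical comparison of two ways to detect duplicates in a list; the function names are my own, chosen only for the example:

```python
def has_duplicates_quadratic(items):
    """Compares every pair of elements: roughly n * n checks -- O(n^2)."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_linear(items):
    """Tracks previously seen elements in a set: one pass over the input -- O(n)."""
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

For a small list the difference is negligible, but for a list of a million items the quadratic version performs on the order of a trillion comparisons while the linear version performs about a million, which is why classifying algorithms by their growth rate is so useful in practice.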