A memory leak occurs when a program mismanages memory allocations so that memory it no longer needs is never released. A common cause is retaining references to objects the program no longer uses, which prevents the runtime or garbage collector from reclaiming that memory. As the leak accumulates, available memory shrinks, leading to slower performance or even crashes.
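As a concrete illustration, here is a minimal Python sketch of a reference-style leak; the function and cache names are hypothetical, chosen only for this example. Because the module-level list keeps a reference to every result, the garbage collector can never reclaim them:

```python
# Hypothetical example: a module-level cache that grows without bound.
_request_cache = []  # lives for the entire program run

def handle_request(payload):
    result = payload.upper()        # stand-in for real work
    _request_cache.append(result)   # reference retained forever: the leak
    return result

# Each call adds another object the program no longer needs:
for i in range(1_000_000):
    handle_request(f"request-{i}")
```

Even though each `result` is only needed briefly, the lingering reference in `_request_cache` keeps it alive, so memory use grows with every call.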
Memory leaks are particularly problematic in long-running applications, such as servers or mobile apps, where even a small leak can gradually degrade performance or exhaust available memory. Developers use memory-profiling and leak-detection tools to find and fix leaks so that programs run efficiently and use resources effectively.
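As one example of such tooling, the sketch below uses Python's standard-library `tracemalloc` module to compare memory snapshots taken before and after a workload; the leaky loop is a stand-in for real application code. Comparing snapshots points to the lines allocating the most unreleased memory:

```python
import tracemalloc

tracemalloc.start()
before = tracemalloc.take_snapshot()

# Simulated leak: objects accumulate and are never released.
leaky = []
for i in range(100_000):
    leaky.append("x" * 100)

after = tracemalloc.take_snapshot()

# Report the top allocation sites, which should point at the leaky line.
for stat in after.compare_to(before, "lineno")[:3]:
    print(stat)
```

In a real application, snapshots would bracket a request cycle or background task; allocation sites that keep growing between snapshots are the usual leak suspects.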