Apparent magnitude is a measure of how bright a celestial object appears from Earth. It is a logarithmic scale on which a difference of 5 magnitudes corresponds to a factor of exactly 100 in brightness. For example, a star with an apparent magnitude of 1 appears 100 times brighter than a star with an apparent magnitude of 6.
This scale helps astronomers compare the brightness of different objects, such as stars, planets, and galaxies. The lower the number, the brighter the object appears; negative values indicate extremely bright objects, like the Sun or the Moon.
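The relationship above can be written as a simple formula: a magnitude difference of Δm corresponds to a brightness ratio of 100^(Δm/5), so one magnitude is a factor of about 2.512. A minimal sketch in Python (the function name `brightness_ratio` is chosen here for illustration):

```python
def brightness_ratio(m1: float, m2: float) -> float:
    """Return how many times brighter an object of apparent magnitude m1
    appears compared to one of magnitude m2 (lower magnitude = brighter)."""
    # A difference of 5 magnitudes is a factor of 100 in brightness,
    # so each single magnitude is a factor of 100 ** (1/5) ≈ 2.512.
    return 100 ** ((m2 - m1) / 5)

# Example from the text: a magnitude-1 star vs. a magnitude-6 star.
print(brightness_ratio(1, 6))  # → 100.0

# One magnitude of difference is roughly a factor of 2.512.
print(round(brightness_ratio(0, 1), 3))  # → 2.512
```

Note that the ratio depends only on the *difference* in magnitudes, which is what makes the logarithmic scale convenient for comparing objects of wildly different brightness.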