During a 1994 blackout, L.A. residents called 911 when they saw the Milky Way for the first time

For those who live in urban areas, it’s shocking to think that older cities like London and New York were once almost entirely dark after nightfall. Even younger Los Angeles was camouflaged between the imposing San Gabriel Mountains and the yawning Pacific Ocean — until 1875.

Until the 1870s, light was synonymous with fire. But the advent of electric lighting meant humans could harness virtually unlimited stores of bright light. Los Angeles was among the first cities to introduce tower lights, erected on poles high above street level in the 1880s. Still, most cities turned off these lights during a full moon, considering “a twilight level of illumination to be the ideal,” according to Urban Lighting, Light Pollution and Society.

Others saw electric lighting as a replacement for sunlight, one fully under human control. In 1885, officials in Paris even considered a scheme to erect a single central tower packed with so many high-wattage lamps that it would create an artificial sun as evening approached.

Such displays, the Eiffel Tower and New York City’s “Great White Way” among them, were symbols of a city’s status and growth. At one point, Hannibal, Missouri, claimed it was “the best-lighted city in the world.”

Tower lighting would ultimately fall out of favor. The technology cast deep shadows around any building over two stories. As cities grew vertically and automobile use became universal, they expanded their borders, installing shorter but far more numerous streetlights to aid driving safety. Even suburban areas buzzed with power. The moon was now useless as a light source.

That era shaped the system of public lighting we know today, for better and for worse.

The sun sets behind a hazy, unlit San Fernando Valley following the Northridge earthquake on January 17, 1994. (AP Photo/Mark J. Terrill)

According to the International Dark-Sky Association, Los Angeles’s sky glow is visible from an airplane 200 miles away. Because of places like this, a full two-thirds of Americans, living under orange domes of artificial light, have lost the ability to see the Milky Way.

So, other than aesthetics, who cares? For starters, it makes astronomers’ lives harder. Cities like Flagstaff, Arizona, home to the Lowell Observatory, made some of the earliest civic efforts to control light pollution in the mid-twentieth century; Flagstaff was named the first International Dark Sky City in 2001. Animal species like sea turtles rely on dark beaches to lay their eggs, and light pollution has disrupted their breeding instincts; migrating birds and insects are similarly affected. Human circadian rhythms have been drastically altered since the Industrial Revolution and the invention of electric light. Studies show that people revert to more natural sleep routines when they taper their use of lights and screens along with the dimming evening sky. They’re healthier, too: a recent study suggests that brightly lit neighborhoods correlate with higher rates of breast cancer.

Even if none of these reasons mattered, most of a city’s artificial light is wasted anyway. Sky glow is the result of light directed upward instead of where it is most useful: on streets and in homes. It is unnecessary, merely “ornamental” light.

True, there are some tradeoffs to limiting light pollution. Without electric light, we wouldn’t be able to enjoy outdoor baseball games on summer nights, when the moths bump against the lamps and the whole city seems to sigh contentedly. We wouldn’t have Las Vegas, where the Luxor Hotel’s Sky Beam can be seen from space.