Scientists solve the ‘Faint Young Sun Paradox’

Powerful solar eruptions that took place four billion years ago, when the sun was only about 75 percent as bright as it is today, may have warmed the planet enough to allow simple molecules to combine into complex compounds such as RNA and DNA, a new study reveals.

When the first organisms emerged, the sun was so weak that the Earth “should have been an icy ball,” study lead author Vladimir Airapetian, a solar scientist at the Goddard Space Flight Center in Greenbelt, Maryland, explained in a statement. “Instead,” he noted, “geological evidence says it was a warm globe with liquid water. We call this the Faint Young Sun Paradox.”

Now, thanks to observations made by NASA’s planet-hunting Kepler space telescope, Airapetian and his colleagues believe they have an explanation for this apparent paradox. They found stars that are roughly the same age as the sun was when life first emerged on Earth, and these young stars were far more active than their older counterparts, Space.com reported Monday.

These stars, which are only a few million years old (much younger than our 4.6-billion-year-old sun), were found to produce clouds of superheated plasma called coronal mass ejections (CMEs) and bursts of radiation in the form of solar flares far more frequently than older stars do. If our sun were this active during its youth, it would have had a dramatic warming effect on the Earth.

The young sun was much weaker than today’s, and scientists wondered how it could have kickstarted life on Earth.

Changes to atmospheric chemistry may have made the difference

In fact, as Airapetian’s team reported in the latest edition of the journal Nature Geoscience, the sun currently produces a “superflare,” a rare and enormous solar eruption, only about once every century. Younger stars, meanwhile, produce up to 10 such events each day, the Kepler data revealed, and their flares are stronger than the sun’s.

Furthermore, NASA said, Earth’s current magnetic field is far stronger than it was billions of years ago. Today’s magnetic field prevents many energetic solar particles from reaching the surface, but this wasn’t the case during Earth’s infancy. The study authors believe that particles from this space weather would have traveled down the magnetic field lines, colliding with nitrogen molecules in the atmosphere and triggering chemical changes.

The molecular nitrogen content of the early Earth’s atmosphere was higher than it is now (up to 90 percent of the atmosphere was nitrogen, versus 78 percent today), NASA scientists explained. As particles from solar activity slammed into these molecules, the impacts would have broken them apart into individual nitrogen atoms, which in turn collided with carbon dioxide and split those molecules into carbon monoxide and oxygen.

The now free-floating nitrogen and oxygen atoms would have combined to form nitrous oxide (N2O), a potent greenhouse gas that would have warmed the planet significantly. Even if the atmosphere contained less than one percent as much N2O as carbon dioxide, the agency explained, that would have been enough to warm the planet so that liquid water could exist on the surface, and it may have provided enough energy to drive the formation of the complex molecules that went on to seed life.
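In rough schematic terms, the chain described above can be summarized as follows (a simplified net summary of the article’s description, not the detailed reaction network modeled in the study):

N2 + energetic particle → N + N (solar particles split molecular nitrogen)
N + CO2 → N + CO + O (collisions with nitrogen atoms break carbon dioxide apart)
2 N + O → N2O (net result: free nitrogen and oxygen combine into nitrous oxide)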

“Our new research shows that solar storms could have been central to warming Earth,” Airapetian told Space.com. “Changing the atmosphere’s chemistry turns out to have made all the difference for life on Earth.”

—–

Image credit: NASA