Early Stars Not Quite the Monsters First Thought

The first stars coalesced, just a few hundred million years after the Universe burst into existence, from the primordial hydrogen and helium synthesised in the first few minutes after the birth of our Universe.

All stars are born when a cool clump of hydrogen and helium gas condenses under its own gravity. As it does so, the central regions of the clump heat up, eventually reaching the millions of Kelvin required for nuclear fusion to occur. Once this critical point is reached the star fuses hydrogen into helium and begins to shine brightly. Stars then grow in mass by absorbing more material onto their surfaces from the surrounding nebula. Eventually their increased energy output disperses the surrounding cocoon to the point where the star’s gravity is insufficient to attract any more material, and the star ceases growing.

In the current era of the Universe this process is aided by heavier elements such as carbon and oxygen, which help cool the surrounding gas and so allow it to fall onto the star – if the gas were too hot it would have too much energy and escape off into space. The first generation of stars had no heavy elements to aid their formation, as such elements are only produced in the final stages of a massive star’s life and so had yet to be made. Thus the first stars would have had a much shorter time to absorb matter from their surroundings before it escaped their clutches.

This might suggest that the first stars should be much smaller than those currently populating the Universe, but this is not the case. As the density of material in the early Universe was much higher, they had more material within their grasp to begin with, and so all predictions produce stars that are much more massive than the current average – which is only a fraction of the mass of Sol.

It had previously been thought that the first generation of stars would have been the most massive of all, containing several hundred solar masses of material, similar to the most massive stars currently known – such as those within the Large Magellanic Cloud’s R136 open cluster. This new study casts doubt on this view of the first stellar newborns.

Simulations conducted at NASA’s Jet Propulsion Laboratory in California by a team of researchers led by Takashi Hosokawa indicate that such first-generation stars would be ‘only’ a few tens of times as massive as Sol.

Whilst such stars are still significantly more massive than the vast majority of stars in today’s Universe, this discovery removes a significant problem for cosmology.

If the first generation of stars were as massive as previously predicted, they would have produced a particular chemical signature of elements during their lives that would be detectably different from those of conventional-mass stars, since higher mass equates to a higher core temperature. Combine this with the unique blend of elements available to produce the first stars (~75% hydrogen, ~25% helium, with tiny traces of lithium, deuterium and beryllium) and there should be a ‘bar code’ of particular abundances of various heavy elements (as fusion reaction rates are easily altered by starting conditions) in the oldest of today’s stars that could only be attributed to the existence of such monstrous precursor stars. As yet, no such signature has been detected.

Whilst this proved a problem for the old model of first-generation stars, the revised masses produced by this study remove it altogether – with no such high-mass stars, the elemental fingerprint would never have existed, easily explaining the lack of detection – there is nothing to detect!

This fingerprint of elements would have been generated by the stars ending their lives as an as-yet-undetected variety of supernova. The new mass predictions would cause the first stars to detonate in a manner akin to their more recent brethren, allowing the lack of unusual supernova detections to be explained satisfactorily as well.

As we have seen, the masses of the first stars are now thought to be much lower than first expected – but why would they be less massive than previously predicted?

The answer lies in the surroundings of the stars. The new simulations indicate that the region directly around a forming first-generation star was heated to temperatures of up to 50,000 Kelvin, around 8½ times the surface temperature of Sol. This extremely high temperature would have allowed the surrounding gas to disperse much more quickly than previously thought; the growing star thus had less time to absorb material, leading to a lower final mass.
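That temperature comparison is straightforward to check. A quick sketch, assuming the Sun's surface (effective) temperature to be roughly 5,772 K (the IAU nominal value; the article's "8½ times" is a rounding of the resulting ratio):

```python
# Sanity check of the "8½ times the Sun's surface temperature" comparison.
T_SUN_K = 5772      # assumed solar surface temperature (IAU nominal value, kelvin)
T_GAS_K = 50_000    # gas temperature around a forming first star, from the study

ratio = T_GAS_K / T_SUN_K
print(f"{ratio:.1f} times the Sun's surface temperature")  # about 8.7
```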

The region around one of the first stars as it was forming. Image credit: NASA/JPL-Caltech/Kyoto Univ.

The simulation produced a star of just 43 solar masses, a far cry from the 1,000 solar mass superstars of early predictions (though I should add that as predictions have become more advanced, the mass of the first-generation stars has been falling continually, and this is the latest study in a long line to reduce the value further).

You can read more here