September 2003

Head lice key to clothing history
Posted: Monday, September 29, 2003
By Kat Arney

An evolutionary comparison of human head and body lice has shed light on the history of clothing.

A team based in Germany has worked out that humans probably first began wearing clothes 72,000 years ago - give or take 40,000 years.

Their calculation is based on the principle that as species evolve, populations inhabiting different environments gradually change to suit them and become distinct.

While head lice live solely on the human scalp, body lice prefer to inhabit those areas covered by clothing.

Molecular clock

Dr Mark Stoneking's team, from the Max Planck Institute for Evolutionary Anthropology in Leipzig, worked out when the two organisms began to diverge and became distinct species.

They compared DNA sequences from both types of lice, arriving at their result by counting the number of DNA mutations.

"DNA mutations occur at a roughly constant rate over time, so if you know what that rate is, you can use the number of mutations between head and body lice to estimate when body lice arose," explained Dr Stoneking.

They also analysed the DNA from chimpanzee lice, and found that the human and chimp bugs became separate species around 5.5 million years ago - not far off the time their hosts' lineages are thought to have diverged.
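The molecular-clock arithmetic behind this kind of estimate can be sketched in a few lines. The numbers below are illustrative stand-ins, not the study's data: a known divergence (the human-chimp lice split) calibrates the mutation rate, which then converts the head-body louse differences into a date.

```python
# Minimal molecular-clock sketch. All counts are hypothetical round numbers
# chosen so the dates match the figures quoted in the article.

def mutation_rate(num_differences, divergence_time_years):
    """Calibrate substitutions per year from a divergence of known age.
    The factor of 2 accounts for mutations accumulating on both lineages."""
    return num_differences / (2 * divergence_time_years)

def divergence_time(num_differences, rate):
    """Invert the clock: differences divided by twice the per-year rate."""
    return num_differences / (2 * rate)

# Hypothetical calibration: suppose human and chimp lice differ at 550 sites
# and their hosts split about 5.5 million years ago.
rate = mutation_rate(550, 5.5e6)

# If head and body lice differ at, say, 7 comparable sites:
print(round(divergence_time(7, rate)))  # ~70,000 years
```

The large error bars quoted in the article follow naturally from this method: small uncertainties in the calibration date or the mutation count scale directly into the final estimate.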

Margin of error

The calculation that clothing appeared about 72,000 years ago points towards it being a relatively new invention, given that Homo sapiens has probably been around for less than 200,000 years.

"A large time window is inevitable with any molecular clock approach to dating, but even if you take the extremes of the range, the result still associates clothing specifically with modern humans," said Dr Stoneking.

Given that Homo sapiens are generally believed to have expanded out of Africa about 100,000 years ago, perhaps clothing was invented to cope with the cooler climes of their new habitats.

Dr Stoneking's next project is the DNA analysis of pubic lice, using the molecular clock to work out when humans lost the majority of their body hair.

His research is published in Current Biology.



The churning grounds of the chimp genome
Posted: Thursday, September 25, 2003
by Laura Spinney

Dresden, Germany - Preliminary analyses of the sequenced chimp genome suggest that it contains many more duplications than the human genome. One school of thought suggests these areas of duplication might drive structural variation in the primate genome.
Whereas these areas of duplication were once thought to be devoid of useful information, they could perhaps explain the differences between human and chimp, according to Evan Eichler of Case Western Reserve University in Cleveland, Ohio, one of the main proponents of this school and the researcher behind the genome comparison.

The fact that the chimp has more duplications than us could reflect the genetic bottleneck that humans went through about 200,000 years ago, coinciding with their exodus from Africa and diffusion throughout Europe and Asia, he suggested at the European Life Sciences Organization Meeting.

Lacking that bottleneck, chimps have maintained a greater genetic diversity for longer and that may have provided more opportunity for duplications to become fixed in their genome. What is not in doubt, says Eichler, is that these duplication sites are "hotspots for rapid genomic change between human and chimp".

"Imagine them as churning grounds, or blenders in the genome," he says. "There is constant change, duplication, deletion and inversion at a high frequency. And unless that load becomes too high, so that it's negative to the organism as a whole and fitness is reduced, or a change occurs that is selectively advantageous, it will continue to do this churning."

The duplications can occur within chromosomes or between chromosomes, and whereas Eichler used to think that interchromosomal duplications were the most important hotspots, he has now "shifted his guns a little bit".

Duplications within chromosomes are more randomly distributed than duplications between chromosomes, which tend to cluster in certain regions. Intrachromosomal duplications also tend to occur in the genetically active euchromatic portions of the chromosome, as opposed to the less active heterochromatic parts.

"Transcription activity is higher in those regions of the euchromatin than in even unique sequences of the genome," says Eichler. "So we think those are going to be the most important for new genes."

Wolfgang Enard of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, is excited about this kind of large-scale genome comparison. Last year, he and his colleagues carried out their own comparison of mRNA and protein expression patterns in blood and postmortem brain tissue taken from humans and chimps, using microarrays, to see whether the differences in body and mind that set us apart from our nearest relatives lie in patterns of gene expression.

As far as the brain is concerned, Enard says they found significantly more differences in gene expression in the regions of the human genome where Eichler finds duplications. Once the chimp's own duplications have been fully analyzed, he adds, it will be possible to say which of the gene expression differences are associated with human-specific duplications, and which with chimp-specific duplications.


© Elsevier Limited 2003


Neanderthals, Cro-Magnon hunted same prey
Posted: Monday, September 22, 2003
Source: University of Washington

Bones from French cave show Neanderthals, Cro-Magnon hunted same prey

A 50,000-year record of mammals consumed by early humans in southwestern France indicates there was no major difference in the prey hunted by Neanderthal and Cro-Magnon, according to a new study.
The paper, published online in the Journal of Archaeological Science, counters the idea proposed by some scientists that Cro-Magnon, who were physically similar to modern man, supplanted Neanderthals because they were more skilled hunters as a result of some evolutionary physical or mental advantage.

"This study suggests Cro-Magnon were not superior in getting food from the landscape," said lead author Donald Grayson, a University of Washington professor of archaeology. "We could detect no difference in diet, the animals they were hunting and the way they were hunting across this period of time, aside from those caused by climate change.

"So the takeover by Cro-Magnon does not seem to be related to hunting capability. There is no significant difference in large mammal use from Neanderthals to Cro-Magnon in this part of the world. The idea that Neanderthals were big, dumb brutes is hard for some people to drop. Cro-Magnon created the first cave art, but late Neanderthals made body ornaments, so the depth of cognitive difference between the two just is not clear."

The study also resurrects a nearly 50-year-old theory first proposed by Finnish paleontologist Björn Kurtén that modern humans played a role in the extinction of giant cave bears in Europe. Cro-Magnon may have been the original "apartment hunters" and displaced the bears by competing with them for the same caves the animals used for winter den sites.

Grayson and his colleague, Francoise Delpech, a French paleontologist at the Institut de Prehistoire et de Geologie du Quaternaire at the University of Bordeaux, examined the fossil record left in Grotte XVI, a cave above the Ceou River, near its confluence with the Dordogne River. The cave has a rich, dated archaeological sequence that extends from about 65,000 to about 12,000 years ago, spanning the time when Neanderthals flourished and died off and when Cro-Magnon moved into the region. Neanderthals disappeared from southwestern France around 35,000 years ago, although they survived longer in southern Spain and central Europe.

The researchers were most interested in the transition from the Middle to Upper Paleolithic, or Middle to Late Stone Age.

Neanderthals occupied Grotte XVI as far back as 65,000 years ago, perhaps longer. Between 40,000 and 35,000 years ago, people in France, including at Grotte XVI, began making stone tools more like those later fashioned by Cro-Magnon. However, the human remains found with these tools at several sites were Neanderthal, not Cro-Magnon. At Grotte XVI, similar tools but no human remains were found from this time period, and people assumed to be Cro-Magnon did not occupy the cave until about 30,000 years ago.

The researchers examined more than 7,200 bones and teeth from large hoofed mammals recovered from the cave. Ungulates such as reindeer, red deer, roe deer, horses and chamois were the most common prey and the mainstay of humans in this part of the world, according to Grayson.

He and Delpech found a remarkable dietary similarity over time. Throughout the 50,000-year record, each bone and tooth assemblage, regardless of the time period or the size of the sample involved, contained eight or nine species of ungulates, indicating that Neanderthals and Cro-Magnon both hunted a wide variety of game.

The only difference the researchers found was in the relative abundance of species, particularly reindeer, uncovered at the various levels in Grotte XVI. At the oldest dated level in the cave, reindeer remains accounted for 26 percent of the total. Red deer were the most common prey at this time, accounting for nearly 34 percent of the bones and teeth. However, as summer temperatures began to drop in southwestern France, the reindeer numbers increased and became the prey of choice. By around 30,000 years ago, when Cro-Magnon moved into the region, reindeer accounted for 52 percent of the bones and teeth. And by around 12,500 years ago, during the last ice age, reindeer remains accounted for 94 percent of bones and teeth found in Grotte XVI.

Grayson and Delpech also looked at the cut marks left on bones to analyze how humans were butchering their food. They found little difference except, surprisingly, at the uppermost level, which corresponds to the last ice age.

"It is possible that because it was so cold, people were hard up for food," Grayson said. "The bones were very heavily butchered, which might be a sign of food stress. However, if this had occurred earlier during Neanderthal times, people would have said this is a sure sign that Neanderthals did not have the fine hand-eye coordination to do fine butchering."

In examining the Grotte XVI record, the researchers also found a sharp drop in the number of cave bears from Neanderthal to Cro-Magnon times.

"Cave bears and humans may have been competing for the same living space and this may have led to their extinction," Grayson said. He added that it is not clear if the decline and eventual extinction of the bears was driven by an increase in the number of humans or increased human residence times in caves, or both.

"If we can understand the extinction of any animal from the past, such as the cave bear, it gives us a piece of evidence showing the importance of habitat to animals. The cave bear is one of the icons of the late Pleistocene Epoch, similar to the saber tooth cats and mammoths in North America. If further study supports Kurtén's argument, we finally may be in a position to confirm a human role in the extinction of a large Pleistocene mammal on a Northern Hemisphere continent."


Earliest Modern Humans Found in Romanian Cave
Posted: Monday, September 22, 2003
By Maggie Fox, Health and Science Correspondent

The jawbone of a cave-man living in what is now Romania is the oldest fossil from an early modern human to be found in Europe, U.S. researchers said on Monday.

Primitive features such as heavy bone and tooth structure also support the controversial idea that Cro-Magnons and Neanderthals may have interbred, the researchers said.

The jawbone, found in the southwestern Carpathian Mountains of Romania, was carbon-dated to between 34,000 and 36,000 years ago, said Erik Trinkaus of Washington University in St. Louis, who led the study.

That makes it "the oldest definite early modern human specimen in Europe and provides perspectives on the emergence and evolution of early modern humans in the northwestern Old World," Trinkaus and colleagues wrote in their report, published in the Proceedings of the National Academy of Sciences.

The jawbone was found in 2002 in Pestera cu Oase, which means "cave with bones."

"The jawbone is the oldest directly dated modern human fossil," Trinkaus, a leading expert on early humans, said in a telephone interview.

"Taken together, the material is the first that securely documents what modern humans looked like when they spread into Europe. Although we call them 'modern humans,' they were not fully modern in the sense that we think of living people," he added.

"They are all dirty and smelly and all that sort of stuff. The basic facial shape would have been like ours but from the cheeks on down they would have looked very large."

The jawbone is similar to those of other early modern humans found in Africa, the Middle East and later in Europe. But the molars are unusually big and proportioned in a way that makes them look different -- almost Neanderthal, said Trinkaus.

Trinkaus is a leading proponent of the controversial theory that early modern humans and Neanderthals interbred to some extent. The two subspecies of Homo sapiens lived side-by-side in Europe for thousands of years and evidence suggests some trade or other contact.

"The specimens suggest that there have been clear changes in human anatomy since then," said Trinkaus.

"The bones are also fully compatible with the blending of modern human and Neanderthal populations," he said.


The Dawn Of Time
Posted: Wednesday, September 17, 2003
Astronomers have discovered an object whose make-up gives an insight into the formation of the Universe. Marcus Chown reports on the discovery of the 'genesis star'

17 September 2003, Independent UK

In archaeological terms, it's like finding a Pharaoh surviving in modern-day London. Astronomers have discovered a star hiding in the Milky Way that has survived from a time before the advent of galaxies and even before many of the atoms making up today's universe. Crucially, this "genesis star" may have a "memory" of the very first objects to form from the debris of the Big Bang - super-massive stars, thought to have blazed briefly in the dawn of time. "The lost generation of stars blew up and vanished around 13 billion years ago," says Timothy Beers of Michigan State University in East Lansing. "They're the missing jigsaw piece in cosmic history."

Astronomers believe the universe's earliest stars formed in small gas clouds, containing only a few dozen or a few hundred stars. Later, these clusters coalesced to form giant galaxies like the Milky Way which dominate today's universe. So some ancient stars should be mixed in with the galaxy's other stars.

Such ancient stars stand out because of their extremely low levels of heavy atoms like calcium and iron. Whereas nature's two lightest atoms, hydrogen and helium, were forged in the fireball of the Big Bang, heavier atoms have been cooked up since that time by nuclear reactions inside stars. When very massive stars explode, they enrich the gas of the interstellar medium with such atoms. Since this gas provides the raw material for new stars, each successive generation of stars has a higher average abundance of heavy atoms. The oldest stars will therefore have extremely low abundances of heavy atoms. This should be apparent in the light they emit, since each atom in nature has a unique spectral "fingerprint", emitting or absorbing light at characteristic colours, or wavelengths.

Finding stars with the all-important spectral fingerprint among the 200 billion or so in our Milky Way involves sifting painstakingly through millions upon millions of individual stars. In the early 1990s, astronomers from the University of Hamburg and the European Southern Observatory studied 4 million stars across the sky. They then looked at promising candidates in detail using some of the world's largest telescopes. "One star jumped out at us," says Beers. "It went by the rather uninspiring name of HE 0107-5240."

At the end of 2001, Beers and his colleagues spent a mammoth six hours collecting light from the star with the Very Large Telescope on Cerro Paranal in Chile. "What we got stunned us," says Beers.

HE 0107-5240 had 200,000 times less iron than the Sun - 200 times less than any known star. What's more, it had hardly any heavy atoms - just seven different kinds compared with 25 to 30 in old stars with 1000 times less iron than the Sun. It had to be the most ancient star ever found. According to the best estimates, it was 13.5 billion years old, meaning it formed only around 200 million years after the Big Bang. "HE 0107-5240 was truly the genesis star," says Beers.
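Astronomers usually quote iron deficits like these on a logarithmic scale, [Fe/H]: the base-10 logarithm of a star's iron-to-hydrogen ratio relative to the Sun's. A quick sketch using the round numbers quoted above shows why HE 0107-5240 stood out:

```python
import math

def fe_h(iron_ratio_to_sun):
    """Logarithmic metallicity: [Fe/H] = log10 of the iron abundance
    relative to the Sun (the Sun itself has [Fe/H] = 0)."""
    return math.log10(iron_ratio_to_sun)

# HE 0107-5240: roughly 200,000 times less iron than the Sun
print(round(fe_h(1 / 200_000), 1))  # -5.3

# A typical previously known metal-poor star, about 1000 times less iron:
print(round(fe_h(1 / 1000), 1))     # -3.0
```

On this scale, the gap between [Fe/H] of about -3 and about -5.3 is the factor of 200 the article mentions.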

One of the most significant features of the star's spectrum is the total lack of any atoms heavier than nickel or iron. Atoms heavier than iron are built up by the repeated capture of particles called neutrons, either during the late stages of stars' lives or during the catastrophic explosions in which massive stars die. "The genesis star must therefore have come from a time before the onset of significant neutron-capture processes, which produce many of the heavier atoms in today's Universe," says Beers.

By contrast, many ancient stars with 1000 times less iron - estimated to be about 13 billion years old - do contain atoms heavier than iron. "We can therefore say that neutron-capture processes, which made many of the atoms in the periodic table, probably got going some time between 13.5 and 13 billion years ago," says Beers.

Not everything about the genesis star makes sense. "The star has 10,000 times as much carbon relative to iron as the Sun," says Beers. "In fact, it has the highest carbon abundance of any known star."

Finding a star with such low levels of most heavy atoms and an enhanced level of carbon is like finding a 120-year-old with skin as soft as a six-month-old baby's.

Beers believes the enhanced carbon reflects atom-building processes going on in the very first stars, whose explosions provided the raw material for the genesis star. If he is right, the heavy atom abundances in the genesis star provide a "window" on the very first stars - short-lived, super-massive stars that blew up and disappeared 13.5 billion years ago. The genesis star is therefore not only like a Pharaoh that has survived to modern times but one who also remembers the last Neanderthal.

The first stars formed from the hydrogen and helium of the Big Bang and were born with no heavy atoms at all. "The atoms they produced during their short lifetimes would be preserved in stars like the genesis star," says Beers.

If Beers is right, the genesis stars may tell us about the lost generation of stars. "It's been known for a decade or so that the more deficient a star is in heavy atoms the more carbon it has. My bet is that the very first carbon in the Universe was made by some nuclear process different from the one that makes it in today's universe and it's this carbon we see in these primitive stars."

Beers believes we may even be able to learn about the masses of the lost generation of stars from whether or not the genesis star has certain heavy atoms in it - such as europium, thorium and uranium, made in the inferno of a supernova explosion. If the first stars were as massive as some theorists suspect - hundreds of times bigger than the Sun - these atoms would be sucked back into the supernova relic - possibly a black hole - and never escape. The genesis star should therefore show no sign of them. If, on the other hand, the first stars were less massive, these atoms would have escaped and the genesis star will show evidence of their presence.

The key to utilising information from the genesis star is to observe more stars. "Discussions about a single star have a limited scope," says Abraham Loeb of Harvard University. "One would prefer to have many more examples before drawing statistical conclusions."

Only a few tens of thousands of stars have been examined in detail. Of all these, only one star - the genesis star - has been found with an iron abundance of 200,000 times less than the Sun. "The statistics are small but, even if only one in 50,000 stars in our Milky Way has an ultra-low abundance of heavy atoms like the genesis star, we are talking about as many as a million stars of a similar nature yet to be found," says Beers.

The key to finding them is going to be observing tens of millions of stars. A great place to look is the American-Japanese Sloan Digital Sky Survey, in New Mexico. "We're confident of finding stars with 10,000 to 100,000 times less iron than the Sun, or even lower," says Beers.

For most astronomers, the sexy objects being found by the Sloan survey are quasars - the ultra-bright cores of newborn galaxies - and stars are the poor relatives. However, quasars look like stars and, as a result, can be mistaken for stars with hardly any heavy atoms. "About 10 per cent of the quasar candidates are not quasars but the kind of stars we're looking for," says Beers. "We just have to sit back and let the quasar hunters do the job for us!"

Marcus Chown is the author of 'The Universe Next Door', published by Headline, £15.99



UCLA Astronomers Detect Plasma At Black Hole
Posted: Monday, September 8, 2003
Source: University Of California - Los Angeles

UCLA astronomers report they have detected remarkably stormy conditions in the hot plasma being pulled into the monstrous black hole residing at the center of our Milky Way galaxy, 26,000 light years away. This detection of the hot plasma is the first in an infrared wavelength, where most of the disturbed plasma's energy is emitted, and was made using the 10-meter Keck II Telescope at the W.M. Keck Observatory in Hawaii.

Plasma is a hot, ionized, gas-like matter -- a fourth state of matter, distinct from solids, liquids and gases -- believed to make up more than 99 percent of the visible universe, including the stars, galaxies and the vast majority of the solar system.

"Previous observations at radio and X-ray wavelengths suggested that the black hole is dining on a calm stream of plasma that experiences glitches only 2 percent of the time," said Andrea Ghez, professor of physics and astronomy at UCLA, who headed the research team. "Our infrared detection shows for the first time that the black hole's meal is more like the Grand Rapids, in which energetic glitches from shocked gas are occurring almost continually."

"I see this as a real breakthrough," said Mark Morris, a UCLA professor of physics and astronomy, who worked with Ghez. "It's a big leap, not just an incremental advance. The infrared is precisely where we need to look to learn what the black hole is eating. In the infrared, you see it all. The black hole's dirty laundry is hanging right there for us to see. We're peering deep down inside this tumultuous region."

"One of the big mysteries in studies of the black hole at the center of our galaxy is why the surrounding gas is emitting so little light compared to black holes at the center of other galaxies," Ghez said. "We now have a completely new and continuously open window to study the material that is falling onto the black hole at the center of the Milky Way."

For the past two years, Ghez and her colleagues have used adaptive optics at the Keck Observatory to get high-resolution images at wavelengths between the short near-infrared, where stars dominate, and the mid-infrared, where dust dominates.

"There's a history of false detections of this source in the infrared," Ghez said. "At short wavelengths, it's challenging because there are so many stars. In the mid-infrared, it's difficult because there is so much dust at the center of the galaxy. Our observation was successful because it was made between these two problematic regimes with an adaptive optics system. This type of observation only became possible last year."

"We are highly confident in our detection," Ghez added. "We have a bright source at exactly the right spot, right on the black hole, and with properties that are unlike the stars around it; the source emits much more strongly at long wavelengths than the stars, and the source doesn't move, while the stars move at huge velocities. What's exciting and important is not just that we detected the plasma, but that it varies dramatically in intensity from week-to-week, day-to-day, and even within a single hour. It's as if we have been watching the black hole breathing."

Black holes are collapsed stars so dense that nothing can escape their gravitational pull, not even light. Black holes cannot be seen directly, but their influence on nearby stars is visible, and provides a signature, Ghez said. The black hole, with a mass more than three million times that of our sun, is in the constellation of Sagittarius.

Since 1995, Ghez has been using the W.M. Keck Observatory's 10-meter Keck I Telescope atop Mauna Kea in Hawaii -- the world's largest optical and infrared telescope -- to study the galactic center and the movement of 200 nearby stars. She has made measurements using a technique she refined called infrared speckle interferometry, and for the last few years, has used adaptive optics, an even more sophisticated technique, which enables her to see the region more clearly.

"The Keck Observatory is one of the best facilities in the world for this research," Ghez said.

The astronomers know the location of the black hole so precisely "that it's like someone in Los Angeles who can identify where someone in Boston is standing to within the width of her hand, if you scale it out to 26,000 light years," Ghez said. The galactic center is located due south in the summer sky.
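The analogy implies a remarkably small angle. As a rough sanity check (every figure below is assumed for illustration, not taken from the study: LA to Boston about 4,200 km, a hand width about 10 cm), the corresponding angular precision and the linear size it projects to at the galactic centre can be computed directly:

```python
import math

# Assumed illustrative figures, not values from the research.
HAND_M = 0.10        # hand width, metres
LA_BOSTON_M = 4.2e6  # LA-to-Boston distance, metres
LY_M = 9.46e15       # metres per light year
AU_M = 1.496e11      # metres per astronomical unit (Earth-Sun distance)

# Small-angle approximation: angle = size / distance.
angle_rad = HAND_M / LA_BOSTON_M

# Convert to milliarcseconds, the unit astronomers use for such angles.
mas = math.degrees(angle_rad) * 3600 * 1000

# The same angle projected out to the galactic centre, 26,000 light years away.
size_at_gc_au = angle_rad * 26_000 * LY_M / AU_M

print(f"~{mas:.0f} milliarcseconds, ~{size_at_gc_au:.0f} AU at the galactic centre")
```

Under these assumptions, the analogy corresponds to pinning the black hole's position down to a few milliarcseconds, or a region a few tens of astronomical units across at that distance.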

The black hole at the center of our galaxy came into existence billions of years ago, perhaps as very massive stars collapsed at the end of their life cycles and coalesced into a single, supermassive object.

For decades, the emission at the galactic center could be detected only in radio wavelengths, which do not reveal the variations in intensity. "The radio is partially opaque," Morris said. The emission was detected for the first time recently in X-ray wavelengths, but it is important to now have the detection between these two wavelength extremes, where details of the plasma can be seen. In the X-ray, activity can be seen only about 5 percent of the time, while in the infrared, it can be seen continually, Morris said.

The astronomers are learning what is causing gas to emit radiation as it approaches and enters the black hole. Ghez and her colleagues will continue to study the supermassive black hole at a variety of near infrared wavelengths.

Ghez's co-authors include Morris; UCLA physics and astronomy professor Eric Becklin, who identified the center of the Milky Way in 1968; California Institute of Technology research scientist Keith Matthews, and UCLA graduate student Shelley Wright.


The research is federally funded by an individual grant from the National Science Foundation, by the National Science Foundation's Center for Adaptive Optics, and by the Packard Foundation. It has been submitted for publication to the Astrophysical Journal Letters. Ghez also will present her findings Sept. 24 in an invited talk at the 4th Cologne-Bonn-Zermatt Symposium on The Dense Interstellar Medium in Galaxies in Zermatt, Switzerland.


