Culture

Neanderthals and Homo sapiens used identical Nubian technology

image: The view from Shukbah Cave

Image: 
Amos Frumkin

Long held in a private collection, the newly analysed tooth of an approximately 9-year-old Neanderthal child marks the hominin's southernmost known range. Analysis of the associated archaeological assemblage suggests Neanderthals used Nubian Levallois technology, previously thought to be restricted to Homo sapiens.

With a high concentration of cave sites harbouring evidence of past populations and their behaviour, the Levant is a major centre for human origins research. For over a century, archaeological excavations in the Levant have produced human fossils and stone tool assemblages that reveal landscapes inhabited by both Neanderthals and Homo sapiens, making this region a potential mixing ground between populations. Distinguishing these populations by stone tool assemblages alone is difficult, but one technology, the distinct Nubian Levallois method, is argued to have been produced only by Homo sapiens.

In a new study published in Scientific Reports, researchers from the Max Planck Institute for the Science of Human History teamed up with international partners to re-examine the fossil and archaeological record of Shukbah Cave. Their findings extend the southernmost known range of Neanderthals and suggest that our now-extinct relatives made use of a technology previously argued to be a trademark of modern humans. This study marks the first time the lone human tooth from the site has been studied in detail, in combination with a major comparative study examining the stone tool assemblage.

"Sites where hominin fossils are directly associated with stone tool assemblages remain a rarity - but the study of both fossils and tools is critical for understanding hominin occupations of Shukbah Cave and the larger region," says lead author Dr Jimbob Blinkhorn, formerly of Royal Holloway, University of London and now with the Pan-African Evolution Research Group (Max Planck Institute for the Science of Human History).

Shukbah Cave was first excavated in the spring of 1928 by Dorothy Garrod, who reported a rich assemblage of animal bones and Mousterian-style stone tools cemented in breccia deposits, often concentrated in well-marked hearths. She also identified a large, unique human molar. However, the specimen was kept in a private collection for most of the 20th century, precluding comparative studies using modern methods. The recent re-identification of the tooth at the Natural History Museum in London has led to new detailed work on the Shukbah collections.

"Professor Garrod immediately saw how distinctive this tooth was. We've examined the size, shape and both the external and internal 3D structure of the tooth, and compared that to Holocene and Pleistocene Homo sapiens and Neanderthal specimens. This has enabled us to clearly characterise the tooth as belonging to an approximately 9 year old Neanderthal child," says Dr. Clément Zanolli, from Université de Bordeaux. "Shukbah marks the southernmost extent of the Neanderthal range known to date," adds Zanolli.

Although Homo sapiens and Neanderthals shared the use of a wide suite of stone tool technologies, Nubian Levallois technology has recently been argued to have been exclusively used by Homo sapiens. The argument has been made particularly in southwest Asia, where Nubian Levallois tools have been used to track human dispersals in the absence of fossils.

"Illustrations of the stone tool collections from Shukbah hinted at the presence of Nubian Levallois technology so we revisited the collections to investigate further. In the end, we identified many more artefacts produced using the Nubian Levallois methods than we had anticipated," says Blinkhorn. "This is the first time they've been found in direct association with Neanderthal fossils, which suggests we can't make a simple link between this technology and Homo sapiens."

"Southwest Asia is a dynamic region in terms of hominin demography, behaviour and environmental change, and may be particularly important to examine interactions between Neanderthals and Homo sapiens," adds Prof Simon Blockley, of Royal Holloway, University of London. "This study highlights the geographic range of Neanderthal populations and their behavioural flexibility, but also issues a timely note of caution that there are no straightforward links between particular hominins and specific stone tool technologies."

"Up to now we have no direct evidence of a Neanderthal presence in Africa," said Prof Chris Stringer of the Natural History Museum. "But the southerly location of Shukbah, only about 400 km from Cairo, should remind us that they may have even dispersed into Africa at times."

Credit: 
Max Planck Institute of Geoanthropology

The comet that killed the dinosaurs

image: A comet plunging through Earth's atmosphere.

Image: 
Gerd Altmann/Pixabay

It was several miles wide and forever changed history when it crashed into Earth about 66 million years ago.

The Chicxulub impactor, as it's known, left behind a crater off the coast of Mexico that spans 93 miles and runs 12 miles deep. Its devastating impact brought the reign of the dinosaurs to an abrupt and calamitous end, triggering a mass extinction that also killed off almost three-quarters of the plant and animal species then living on Earth.

The enduring puzzle has been where the asteroid or comet that set off the destruction originated, and how it came to strike Earth. Now a pair of Harvard researchers believe they have the answer.

In a study published in Scientific Reports, Avi Loeb, Frank B. Baird Jr. Professor of Science at Harvard, and Amir Siraj '21, an astrophysics concentrator, put forth a new theory that could explain the origin and journey of this catastrophic object and others like it.

Using statistical analysis and gravitational simulations, Loeb and Siraj show that a significant fraction of long-period comets originating from the Oort cloud, a sphere of debris at the edge of the solar system, can be bumped off course by Jupiter's gravitational field and sent close to the sun, whose tidal force breaks the comets apart. That increases the rate of impactors like Chicxulub (pronounced Chicks-uh-lub), because the resulting fragments cross Earth's orbit and hit the planet about once every 250 to 730 million years.

"Basically, Jupiter acts as a kind of pinball machine," said Siraj, who is also co-president of Harvard Students for the Exploration and Development of Space and is pursuing a master's degree at the New England Conservatory of Music. "Jupiter kicks these incoming long-period comets into orbits that bring them very close to the sun."

It's because of this that long-period comets - which take more than 200 years to orbit the sun - can become sun grazers, he said.

"When you have these sun grazers, it's not so much the melting that goes on, which is a pretty small fraction relative to the total mass, but the comet is so close to the sun that the part that's closer to the sun feels a stronger gravitational pull than the part that is farther from the sun, causing a tidal force" he said. "You get what's called a tidal disruption event and so these large comets that come really close to the sun break up into smaller comets. And basically, on their way out, there's a statistical chance that these smaller comets hit the Earth."

The calculations from Loeb and Siraj's theory increase the chances of long-period comets impacting Earth by a factor of about 10, and show that about 20 percent of long-period comets become sun grazers. That finding falls in line with research from other astronomers.

The pair claim that their new rate of impact is consistent with the age of Chicxulub, providing a satisfactory explanation for its origin and that of other impactors like it.

"Our paper provides a basis for explaining the occurrence of this event," Loeb said. "We are suggesting that, in fact, if you break up an object as it comes close to the sun, it could give rise to the appropriate event rate and also the kind of impact that killed the dinosaurs."

Loeb and Siraj's hypothesis might also explain the makeup of many of these impactors.

"Our hypothesis predicts that other Chicxulub-size craters on Earth are more likely to correspond to an impactor with a primitive (carbonaceous chondrite) composition than expected from the conventional main-belt asteroids," the researchers wrote in the paper.

This is important because a popular theory on the origin of Chicxulub claims the impactor was a fragment of a much larger asteroid from the main belt, the asteroid population between the orbits of Mars and Jupiter. Only about a tenth of all main-belt asteroids have a carbonaceous chondrite composition, while it's assumed most long-period comets have it. Evidence found at the Chicxulub crater and at other similar craters suggests the impactors had a carbonaceous chondrite composition.

This includes an object that hit about 2 billion years ago and left the Vredefort crater in South Africa, which is the largest confirmed crater in Earth's history, and the impactor that left the Zhamanshin crater in Kazakhstan, which is the largest confirmed crater within the last million years.

The researchers say the compositional evidence supports their model, and that the estimated ages of the impacts are consistent with their calculated impact rates, both for Chicxulub-sized tidally disrupted comets and for smaller ones like the impactor that made the Zhamanshin crater. If produced the same way, they say, such smaller fragments would strike Earth once every 250,000 to 730,000 years.
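
Recurrence intervals like these translate into impact probabilities if impacts are treated as a Poisson process. The short sketch below illustrates that arithmetic only; it is a generic calculation, not code from the paper, and the 100-million-year window is an arbitrary example.

```python
import math

def prob_at_least_one(rate_per_year: float, window_years: float) -> float:
    """P(at least one impact in a window) for a Poisson process with the given rate."""
    return 1.0 - math.exp(-rate_per_year * window_years)

# One Chicxulub-scale impact every 250 to 730 million years (the study's range):
for interval_years in (250e6, 730e6):
    p = prob_at_least_one(1.0 / interval_years, 100e6)
    print(f"once per {interval_years / 1e6:.0f} Myr -> P(>=1 impact in 100 Myr) = {p:.2f}")
```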

Loeb and Siraj say their hypothesis can be tested by further studying these craters, others like them, and even ones on the surface of the moon to determine the composition of the impactors. Space missions sampling comets can also help.

Beyond composition, the new Vera Rubin Observatory in Chile may be able to see the tidal disruption of long-period comets after it becomes operational next year.

"We should see smaller fragments coming to Earth more frequently from the Oort cloud," Loeb said. "I hope that we can test the theory by having more data on long-period comets, get better statistics, and perhaps see evidence for some fragments."

Loeb said understanding this is not just crucial to solving a mystery of Earth's history but could prove pivotal if such an event were to threaten the planet again.

"It must have been an amazing sight, but we don't want to see that side," he said.

Credit: 
Harvard University

Membrane building blocks play decisive role in controlling cell growth

image: A pollen tube growing out of a pollen grain, with certain building blocks fluorescently labelled. In green: PIP5K2, the enzyme responsible for producing lipid nanodomains in the cell membrane. In magenta: the actin cytoskeleton (the cell's "bones"), which is regulated by the lipid nanodomains.

Image: 
Marta Fratini

Lipids are the building blocks of a cell's envelope - the cell membrane. In addition to their structural function, some lipids also play a regulatory role and decisively influence cell growth. This has been investigated in a new study by scientists at Martin Luther University Halle-Wittenberg (MLU). The impact of the lipids depends on how they are distributed over the plasma membrane. The study was published in "The Plant Cell".

If plant cells want to move, they need to grow. One notable example of this is the pollen tube. When pollen lands on a flower, the pollen tube grows directionally into the female reproductive organs. This allows the male gametes to be delivered, so fertilisation can occur. The pollen tube is special in that it is made up of a single cell that continues to extend and, in extreme cases, can become several centimetres long. "This makes pollen tubes an exciting object for research on directional growth processes," says Professor Ingo Heilmann, head of the Department of Plant Biochemistry at MLU.

For the current study, Heilmann's team focused on the phospholipids of pollen tubes, which, as the main component of the plasma membrane, are responsible for separating the cell's interior from its surroundings. "Lipids are generally known to have this structuring function," says Dr Marta Fratini, first author of the study. It has only recently come to light that some phospholipids can also regulate cellular processes. The scientists from Halle have now been able to show that a specific phospholipid called phosphatidylinositol 4,5-bisphosphate ("PIP2") can control various aspects of cell growth in pollen tubes - depending on its position at the plasma membrane. They did this by labelling the lipid with a fluorescent marker. "We found it is either distributed diffusely over the entire tip of the pollen tube without a recognisable pattern, or is concentrated in small dynamic nanodomains," Fratini explains. One can imagine a group of people on a square: either individuals remain 1.5 metres apart as currently prescribed, or they form small groups.

It appears that different enzymes are responsible for the varying distribution of PIP2. "Plant cells have several enzymes that can produce this one phospholipid," explains Heilmann. Like the lipids, some of these enzymes are widely distributed over the membrane and others are concentrated in nanodomains, as shown by the current study. Depending on which of the enzymes the researchers artificially increased, either the cytoskeleton - a structure important for directed growth - stabilised and the pollen tube swelled at the tip, or more pectin - an important building material for plant cell walls - was secreted. This made the cell branch out at the tip. To make sure that the distribution of the lipids was indeed responsible for these growth effects, the biochemists artificially changed the arrangement of the enzymes at the plasma membrane - from clusters to a wide scattering or vice versa. In doing so, they were able to control the respective effects on cell growth.

"As far as I know, our study is the first to trace the regulatory function of a lipid back to its spatial distribution in the membrane," says Heilmann. Further research is now needed to clarify exactly how the membrane nanodomains assemble and how the distribution of PIP2 at the membrane can have such varying effects.

Credit: 
Martin-Luther-Universität Halle-Wittenberg

Commuters are inhaling unacceptably high levels of carcinogens

image: Driver exercising extreme caution in his car.

Image: 
Stan Lim/UCR

A new study finds that California's commuters are likely inhaling chemicals at levels that increase the risk for cancer and birth defects.

As with most chemicals, the poison is in the amount. Under a certain threshold of exposure, even known carcinogens are not likely to cause cancer. Once you cross that threshold, the risk for disease increases.

Governmental agencies tend to regulate that threshold in workplaces. However, private spaces such as the interior of our cars and living rooms are less studied and less regulated.

Benzene and formaldehyde -- both used in automobile manufacturing -- are known to cause cancer at or above certain levels of exposure and are Prop. 65-listed chemicals.

New UC Riverside research shows that the average commuter in California exceeds the threshold for exposure, breathing in unacceptably high levels of both chemicals.

Both benzene and formaldehyde are carcinogens, and benzene carries the additional risk of reproductive and developmental toxicity.

"These chemicals are very volatile, moving easily from plastics and textiles to the air that you breathe," said David Volz, UCR professor of environmental toxicology.

The study, published in the journal Environment International, calculated the daily dose of benzene and formaldehyde being inhaled by drivers with commutes of at least 20 minutes per day.
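
The release does not reproduce the underlying arithmetic, but inhalation-exposure estimates of this kind typically multiply an air concentration by a breathing rate and the exposure time. The sketch below shows the form of such a calculation with invented numbers; none of the values are measurements from the study.

```python
def daily_inhaled_dose_ug(conc_ug_m3: float, breathing_m3_per_hr: float, minutes: float) -> float:
    """Inhaled dose (ug/day) = air concentration x breathing rate x daily exposure time."""
    return conc_ug_m3 * breathing_m3_per_hr * (minutes / 60.0)

# Hypothetical illustration only -- not values from the study:
benzene_in_cabin = 4.0   # ug/m3, assumed in-cabin concentration
breathing_rate = 0.66    # m3/hour, a typical adult resting rate
commute_minutes = 30     # the average commute length cited in the study

print(f"{daily_inhaled_dose_ug(benzene_in_cabin, breathing_rate, commute_minutes):.2f} ug/day")
```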

It found that up to 90% of the population in Los Angeles, San Diego, Orange, Santa Clara, and Alameda counties have at least a 10% chance of exceeding the cancer risk threshold from inhaling these chemicals, based on 30-minute average commute times.

"Of course, there is a range of exposure that depends on how long you're in the car, and how much of the compounds your car is emitting," said Aalekhya Reddam, a graduate student in the Volz laboratory, and lead author of the study.

Previously, Volz and Reddam studied commuter exposure to a flame retardant called TDCIPP or chlorinated tris, and found that longer commute times increased exposure to that carcinogen as well.

They set out on this study wanting to understand the risk of that compound relative to other chemicals introduced during car manufacturing.

Reddam advises commuters to keep the windows open during their rides if possible. "At least with some air flow, you'd be diluting the concentration of these chemicals inside your car," she said.

Benzene is used to produce synthetic fibers, and formaldehyde is a binder in plastics. "There should be alternatives to these chemicals to achieve the same goals during vehicle manufacturing," Volz said. "If so, these should be used."

Credit: 
University of California - Riverside

Clues for improving sleep in visually impaired athletes

Tsukuba, Japan - Sleep is very important for athletes, and sleep loss can affect physical performance and cognitive ability. But now, researchers from the University of Tsukuba have determined the prevalence of sleep disorders in visually impaired athletes, as well as specific risk factors associated with lower sleep quality.

In a study published last November in Sleep Medicine, researchers from the University of Tsukuba conducted a survey of 99 visually impaired athletes in Japan and analyzed data from 81 respondents. They found that approximately one-third of the respondents had sleep disorders. Further, higher levels of stress regarding interpersonal relationships in competition activities and a later wake-up time were associated with sleep disorders.

As sleep disorders are known to be especially common in visually impaired individuals, athletes who are visually impaired may be especially at risk. However, the factors associated with sleep quality in visually impaired athletes are unknown. Therefore, the researchers at the University of Tsukuba aimed to address this in their study.

To examine the prevalence and risk factors associated with sleep disorders in visually impaired athletes, the researchers collected data regarding the severity of vision loss, sleep quality, lifestyle habits, competition activities, and psychological distress in visually impaired athletes who were active in Paralympic sports events in Japan, such as marathon running, goalball, swimming, blind soccer, and judo.

"Our data indicated that roughly a third of the study participants had sleep disorders," explains Professor Takeda. "This prevalence was similar to that found in sighted athletes, including other para-athletes."

"Our analysis revealed that stress associated with interpersonal relationships in competition activities was independently related to sleep disorders in the study participants," explains Professor Takeda, "as was a late wake-up time."

Given that interpersonal stressors arising from competition activities were associated with sleep disorders, new strategies for improving these relationships could help improve sleep quality in visually impaired athletes. Further, strategies to help athletes adjust their wake-up time may have a positive effect on sleep quality.

Credit: 
University of Tsukuba

Research highlights ways to protect astronaut cardiovascular health from space radiation

Space: the final frontier. What's stopping us from exploring it? Well, lots of things, but one of the major issues is space radiation, and the effects it can have on astronaut health during long voyages. A new review in the open-access journal Frontiers in Cardiovascular Medicine explores what we know about the ways that space radiation can negatively affect cardiovascular health, and discusses methods to protect astronauts. These include radioprotective drugs and antioxidant treatments, some of which are more common than you might think.

Space is incredibly inhospitable. Outside of low Earth orbit, astronauts are bombarded with radiation, including galactic cosmic rays and 'proton storms' released by the sun. This radiation is harmful to the human body, damaging proteins and DNA, and is one of the major reasons that we haven't yet been able to send anyone to Mars, or beyond.

These issues inspired Dr Jesper Hjortnaes of the Leiden University Medical Center in the Netherlands to investigate what we know about the harmful effects of space radiation. "If we want to see human long distance space travel, we need to understand the impact of space-induced disease and how to protect our bodies from it," said Hjortnaes. However, Hjortnaes has an interest in a specific aspect of space radiation: its cardiovascular effects.

You may be surprised to learn that aside from the illnesses we typically associate with radiation, such as cancer, it can also have serious effects on the cardiovascular system. Suffering from cardiovascular illness would be catastrophic for crew members on long-haul space missions, and so it's important to identify what the risks are, and how to reduce them.

Hjortnaes and colleagues reviewed the evidence to establish what we know about the cardiovascular risks of space radiation. Much of what we know comes from studying people who have received radiation therapy for cancer, where cardiovascular disease is a common side-effect, or from mouse studies of radiation exposure.

So, what are the effects? Radiation can cause myocardial remodeling, where the structure of the heart begins to change, and tough, fibrous tissue grows to replace healthy muscle, potentially leading to heart failure. Other effects include atherosclerosis in blood vessels, which can cause stroke or heart attack. Radiation exerts its effects by causing inflammation, oxidative stress, cell death and DNA damage.

Researchers have also investigated potential ways to protect astronauts. These include drugs that an astronaut could take to protect themselves from space radiation, and antioxidants. Interestingly, an antioxidant diet, including dairy products, green vegetables such as spinach, and antioxidant supplements such as vitamin C, has potential in protecting astronauts from the damaging reactive oxygen molecules produced during radiation exposure.

Overall, the review revealed that so far, research has only scratched the surface of space radiation and the best methods to protect astronauts from it. There is little conclusive evidence of radiation-induced cardiovascular disease in astronauts themselves, as so few of them have ever gone further than low Earth orbit, and mouse studies aren't an exact match for humans.

These limitations prompted Hjortnaes and colleagues, who develop human cardiac tissue in the laboratory, to call for further research and for new research methods, such as organ-on-a-chip testing technologies.

"We need to develop human-based tissue platforms, such as heart-on-a-chip systems, that can simulate real human disease, outside of the human body, to unravel the mechanisms at play in space radiation-induced cardiovascular disease," said Hjortnaes.

Credit: 
Frontiers

Disease epidemic possibly caused population collapse in Central Africa 1600-1400 years ago

A new study published in the journal Science Advances shows that Bantu-speaking communities in the Congo rainforest underwent a major population collapse from 1600 to 1400 years ago, probably due to a prolonged disease epidemic, and that significant resettlement did not restart until around 1000 years ago. These findings revise the population history of no fewer than seven present-day African countries (Cameroon, Central African Republic, Democratic Republic of the Congo, Republic of the Congo, Gabon, Equatorial Guinea, and Angola) and challenge the commonly held belief that the settlement of Central Africa by Bantu-speaking communities was a continuous process from about 4000 years ago until the start of the transatlantic slave trade.

Ongoing debates about decolonization, restitution of African cultural heritage and antiracism have also renewed interest in the European colonization of Central Africa, even though it was a relatively short period in the long and eventful history of the region. Modern humans lived in the savannas of Central Africa several tens of thousands of years before they emerged in Europe. In the Congo rainforest, too, our ancestors overcame many challenges long before the first European expedition traversed it, as this recently published study again shows.

Unique interdisciplinary research method

As part of a cross-disciplinary research project examining the interconnections between human migration, language spread, climate change and early farming in pre-colonial Central Africa, the current study combines a comprehensive analysis of all available archeological radiocarbon dates, as a proxy for human activity and demographic fluctuation, with an analysis of the diversity and distribution of pottery styles, as a proxy for socio-economic development. These well-dated archeological records were then compared with genetic and linguistic evidence to gain new insights into the ancient settlement history of Bantu-speaking populations in the Congo rainforest.

According to archeologist Dirk Seidensticker (UGent), one of the two lead authors, the multi-proxy approach developed in this study is unique both in terms of empirical evidence and scientific method, in that it uses 1149 radiocarbon dates linked to 115 pottery styles recovered from 726 sites throughout the Congo rainforest and adjacent areas: "We are the first to integrate these three types of archeological datasets on such a large scale and for such a long period and to demonstrate that throughout Central Africa two periods of more intense human activity (~800 BCE to 400 CE and ~1000 to 1900 CE) are separated by a widespread population collapse between 400 and 600 CE. Doing so, we could clearly delineate the periods commonly known as the Early Iron Age and Late Iron Age, each of them characterized by distinct pottery styles which first underwent a widespread expansion phase followed by a regionalization phase with many more local pottery styles. Pottery being one of the few material items of cultural heritage that has survived the ravages of time, this is an important step forward for the archeology of Central Africa."
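
Pooling radiocarbon dates as a demographic proxy is usually done with a summed probability distribution (SPD): each calibrated date contributes a probability density over calendar time, and troughs in the sum are read, cautiously, as periods of reduced human activity. The toy sketch below conveys the idea with Gaussian approximations and invented dates; the study's actual pipeline uses proper calibration curves and far more data.

```python
import numpy as np

# Hypothetical uncalibrated dates (years BP) and 1-sigma errors, for illustration only:
dates_bp = np.array([2400, 2300, 1900, 1800, 1750, 900, 800])
errors = np.array([40, 50, 35, 45, 30, 40, 50])

timeline = np.arange(3000, 0, -1)  # years BP, from 3000 BP to the present
spd = np.zeros(timeline.shape, dtype=float)
for mu, sigma in zip(dates_bp, errors):
    pdf = np.exp(-0.5 * ((timeline - mu) / sigma) ** 2)
    spd += pdf / pdf.sum()  # each date contributes one unit of probability

# Coarse view: probability mass per 500-year bin (troughs suggest less activity).
for start in range(3000, 0, -500):
    in_bin = (timeline <= start) & (timeline > start - 500)
    print(f"{start}-{start - 500} BP: {spd[in_bin].sum():.2f}")
```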

New insights on the controversial Bantu Expansion

The initial spread of Bantu-speaking people from their homeland on the border between Nigeria and Cameroon towards eastern and southern Africa starting some 4000 years ago is unique in the world due to its magnitude, rapid pace, and adaptation to multiple ecozones. This spread had a momentous impact on the continent's linguistic, demographic, and cultural landscape. The Bantu languages constitute Africa's largest language family: about 1 out of 3 Africans speak one or several Bantu languages.

Historical linguist and Africanist Koen Bostoen (UGent) is excited by these new insights, which urge us to rethink the Bantu Expansion, one of the most controversial issues in African history: "Africa's colonization by Bantu speech communities is usually seen as a single, long-term and continuous macro-event. We tend to see today's Bantu speakers as direct descendants from those who originally settled the rainforest some 2700 years ago. Likewise, we think that current-day Bantu languages developed directly from the ancestral languages of those first settlers. However, our results show that this initial wave of Bantu-speaking Early Iron Age communities had largely vanished from the entire Congo rainforest region by 600 CE. The Bantu languages of this area may thus be almost 1000 years younger than previously thought. Scientifically speaking, this introduces new challenges for our use of linguistic data to reconstruct Africa's history. More generally, our study shows that African societies faced serious catastrophes long before the transatlantic slave trade and European colonization and had the resilience to overcome them. This is hopeful."

A prolonged epidemic as the cause of population collapse?

Paleobotanist and tropical forest ecologist Wannes Hubau (UGent & RMCA Tervuren), the other lead author, highlights that the drastic population collapse around 400-600 CE coincided with wetter climatic conditions across the region and may therefore have been promoted by a prolonged disease epidemic: "We note the broad coincidence between the sharp demographic decline in the Congo rainforest and the Justinian Plague (541-750 CE), which is regarded as one of the factors leading to the fall of both the Roman Empire and the Aksumite Empire in Ethiopia. It may have killed up to 100 million people in Asia, Europe, and Africa. We have no firm evidence that the population collapse observed in our archeological data is really due to a persistent vector-borne disease. However, the bacterium Yersinia pestis, which caused the Justinian Plague, has a long-standing presence in Central Africa. One particular strain, still found today in DRC, Zambia, Kenya and Uganda, has prevailed in Central Africa for at least 300 years and is the oldest living strain closely related to the lineage that caused the Black Death in 14th century Europe. We therefore consider a prolonged pandemic of plague to be a plausible hypothesis for the observed supra-regional population decline in 5th-6th century Central Africa."

Credit: 
Ghent University

Birds can 'read' the Earth's magnetic signature well enough to get back on course

image: Map: Eurasian reed warbler breeding range (green) in Europe and variation in the geomagnetic signature (total magnetic intensity, magnetic inclination and magnetic declination). The natural migratory direction from the study site (white dot) towards Africa during autumn is shown as black arrow. The expected compensatory direction from the simulated site (black star) is shown as white arrow.
Circular diagrams: Left: orientation of birds experiencing the natural magnetic field at the study site in Austria. Right: orientation of birds experiencing the simulated magnetic field of a site in Russia while still being at the study site in Austria. Arrows depict the respective mean group direction. Black dots show the orientation of the individual birds tested.

Image: 
Paper authors

Birdwatchers get very excited when a 'rare' migratory bird makes landfall having been blown off-course and flown beyond its normal range. But these are rare for a reason; most birds that have made the journey before are able to correct for large displacements and find their final destination.

Now, new research by an international team shows for the first time how birds displaced in this way are able to navigate back to their migratory route, and gives us an insight into how they accomplish this feat.

Writing in Current Biology, the team from Bangor and Keele Universities describe how reed warblers can navigate from a 'magnetic position' beyond what they have experienced in their normal migration route, back towards that correct route.

Different parts of the Earth have a distinct 'geomagnetic signature' according to their location. This is a combination of the strength of the geomagnetic field; the magnetic inclination, or the dip angle between the magnetic field lines and the horizon; and the magnetic declination, or the angle between the directions to the geographic and magnetic North poles.
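
All three quantities follow directly from the local magnetic field vector. As a concrete aside (standard geomagnetism, not code from the study), they can be derived from the field's north, east and downward components like this:

```python
import math

def geomagnetic_signature(north_nt: float, east_nt: float, down_nt: float):
    """Total intensity, inclination and declination from a local field vector (nanotesla)."""
    horizontal = math.hypot(north_nt, east_nt)
    total_intensity = math.hypot(horizontal, down_nt)
    inclination = math.degrees(math.atan2(down_nt, horizontal))  # dip below the horizon
    declination = math.degrees(math.atan2(east_nt, north_nt))    # offset from geographic north
    return total_intensity, inclination, declination

# Hypothetical mid-latitude field vector, for illustration:
print(geomagnetic_signature(20000.0, 1500.0, 43000.0))
```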

Adult birds already familiar with their migration route, and its general magnetic signatures, were held in captivity for a short period, during which they were exposed to a simulation of the Earth's magnetic signature at a location thousands of miles beyond their natural migratory corridor, before being released back into the wild.

Despite remaining physically located at their capture site and experiencing all other sensory cues about their location, including starlight and the sights, smells and sounds of their actual surroundings, the birds still showed the urge to begin their journey as though they were in the location suggested by the magnetic signal they were experiencing.

They oriented themselves to fly in a direction which would lead them 'back' to their migratory path from the location suggested to them by the magnetic signals they were experiencing.
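
Group headings in experiments like this are conventionally summarized with circular statistics, as in the mean group directions drawn in the figure. Here is a minimal, generic sketch of a circular mean (not the study's analysis code):

```python
import math

def circular_mean(headings_deg):
    """Mean direction of compass headings via unit-vector averaging.
    Also returns the mean vector length r (0 = scattered, 1 = perfectly aligned)."""
    north = sum(math.cos(math.radians(h)) for h in headings_deg)
    east = sum(math.sin(math.radians(h)) for h in headings_deg)
    mean_deg = math.degrees(math.atan2(east, north)) % 360.0
    r = math.hypot(north, east) / len(headings_deg)
    return mean_deg, r

# Hypothetical headings (degrees clockwise from north) for one test group:
print(circular_mean([200, 215, 190, 225, 205]))
```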

This shows that the Earth's magnetic field is the key factor in guiding reed warblers when they are blown off course.

"The overriding impulse was to respond to the magnetic information they were receiving," explained Richard Holland of Bangor University's School of Natural Sciences.

"What our current work shows is that birds are able to sense that they are beyond the bounds of the magnetic fields that are familiar to them from their year-round movements, and are able to extrapolate their position sufficiently from the signals. This fascinating ability enables birds to navigate towards their normal migration route."

Dr Dmitry Kishkinev of Keele University's School of Life Sciences explained:

"What these birds are achieving is "true navigation". In other words, they are able to return to a known goal after displacement to a completely unknown location without relying on familiar surroundings, cues that emanate from the destination, or information collected during the outward journey."

Florian Packmor of Bangor University added:
"We have already shown that the reed warblers use the same magnetic cues experienced within their natural range, but this study shows that they can extrapolate what they understand about how the magnetic field varies in space far beyond any previous experience they have had."

But questions remain about whether the birds have an accurate 'map' or are just using a 'rule of thumb' measurement to judge the general direction of travel needed to get back on course.

The Eurasian reed warbler was selected for the research, but the findings could probably be applied to other migrating songbirds.

Credit: 
Bangor University

Variations in sensitivity of serological tests among individuals infected with SARS-CoV-2

What The Study Did: This observational study investigated the sensitivity of antibody tests to detect previous SARS-CoV-2 infection using existing clinical data across the University of California Health system.

Authors: Atul J. Butte, M.D., Ph.D., of the University of California, San Francisco, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2021.0337)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Green tea compound aids tumor-suppressing, DNA-repairing protein

image: A compound found in green tea stabilizes an anti-cancer protein known as the "guardian of the genome."

Image: 
Rensselaer Polytechnic Institute

TROY, N.Y. -- An antioxidant found in green tea may increase levels of p53, a natural anti-cancer protein, known as the "guardian of the genome" for its ability to repair DNA damage or destroy cancerous cells. Published today in Nature Communications, a study of the direct interaction between p53 and the green tea compound, epigallocatechin gallate (EGCG), points to a new target for cancer drug discovery.

"Both p53 and EGCG molecules are extremely interesting. Mutations in p53 are found in over 50% of human cancer, while EGCG is the major anti-oxidant in green tea, a popular beverage worldwide," said Chunyu Wang, corresponding author and a professor of biological sciences at Rensselaer Polytechnic Institute. "Now we find that there is a previously unknown, direct interaction between the two, which points to a new path for developing anti-cancer drugs. Our work helps to explain how EGCG is able to boost p53's anti-cancer activity, opening the door to developing drugs with EGCG-like compounds."

Wang, a member of the Rensselaer Center for Biotechnology and Interdisciplinary Studies, is an expert in using nuclear magnetic resonance spectroscopy to study specific mechanisms in Alzheimer's disease and cancer, including p53, which he described as "arguably the most important protein in human cancer."

P53 has several well-known anti-cancer functions, including halting cell growth to allow for DNA repair, activating DNA repair, and initiating programmed cell death -- called apoptosis -- if DNA damage cannot be repaired. One end of the protein, known as the N-terminal domain, has a flexible shape, and therefore, can potentially serve several functions depending on its interaction with multiple molecules.

EGCG is a natural antioxidant, which means it helps to undo the near-constant damage caused by oxygen metabolism. Found in abundance in green tea, EGCG is also packaged as an herbal supplement.

Wang's team found that the interaction between EGCG and p53 preserves the protein from degradation. Typically, after being produced within the body, p53 is quickly degraded when the N-terminal domain interacts with a protein called MDM2. This regular cycle of production and degradation holds p53 levels at a low constant.

"Both EGCG and MDM2 bind at the same place on p53, the N-terminal domain, so EGCG competes with MDM2," said Wang. "When EGCG binds with p53, the protein is not being degraded through MDM2, so the level of p53 will increase with the direct interaction with EGCG, and that means there is more p53 for anti-cancer function. This is a very important interaction."

"By developing an understanding of the molecular-level mechanisms that control key biochemical interactions linked to devastating illnesses such as cancer and Alzheimer's disease, Chunyu's research is laying the groundwork for new and successful therapies," said Curt Breneman, dean of the Rensselaer School of Science.

"EGCG Binds Intrinsically Disordered N-Terminal Domain of p53 and Disrupts p53-MDM2 Interaction" was published with support from multiple grants from the National Institutes of Health. At Rensselaer, Wang was joined in the research by Lauren Gandy, Weihua Jin, Lufeng Yan, Xinyue Liu, and Yuanyuan Xiao. First author Jing Zhao is a former member of Wang's lab, now on the faculty at China Agricultural University in Beijing, China. Co-first author Alan Blaney is an M.D.-Ph.D. student at Upstate Medical University. Researchers also contributed from SUNY Upstate Medical Center; the University of Massachusetts, Amherst; New York University; the State University of New York at Binghamton; NYU Shanghai; and Merck Research Laboratories.

Credit: 
Rensselaer Polytechnic Institute

Assessing brain capillaries in COVID-19

What The Study Did: This case series analyzes brains from autopsies of patients who died of COVID-19 as confirmed by nucleic acid test and with severe pulmonary pathology.

Authors: David W. Nauen, M.D., Ph.D., of Johns Hopkins University in Baltimore, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamaneurol.2020.0225)

Editor's Note: The article includes conflicts of interest disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Lemurs show there's no single formula for lasting love

image: These distant primate cousins of humans are among the few mammal species in which male-female partners stick together year after year.

Image: 
David Haring, Duke Lemur Center

DURHAM, N.C. -- Humans aren't the only mammals that form long-term bonds with a single, special mate -- some bats, wolves, beavers, foxes and other animals do, too. But new research suggests the brain circuitry that makes love last in some species may not be the same in others.

The study, appearing Feb. 12 in the journal Scientific Reports, compares monogamous and promiscuous species within a closely related group of lemurs, distant primate cousins of humans from the island of Madagascar.

Red-bellied lemurs and mongoose lemurs are among the few species in the lemur family tree in which male-female partners stick together year after year, working together to raise their young and defend their territory.

Once bonded, pairs spend much of their waking hours grooming each other or huddled side by side, often with their tails wrapped around each other's bodies. Males and females of these species spend a third of a lifetime with the same mate. The same cannot be said of their closest relatives, who change partners often.

To biologists, monogamy is something of a mystery. That's in part because it's rare in many animal groups. While around 90% of bird species practice some form of fidelity to one partner, only 3% to 5% of mammals do. The vast majority of the roughly 6,500 known species of mammals have open relationships, so to speak.

"It's an uncommon arrangement," said lead author Nicholas Grebe, a postdoctoral associate in professor Christine Drea's lab at Duke University.

Which raises a question: what makes some species biologically inclined to pair up for the long haul while others play the field?

Studies over the last 30 years in rodents point to two hormones released during mating, oxytocin and vasopressin, suggesting that the key to lasting love may lie in differences in how they act on the brain.

Some of the first clues came from influential research on prairie voles, small mouse-like mammals that, unlike most rodents, mate for life. When researchers compared the brains of monogamous prairie voles with their promiscuous counterparts, montane voles and meadow voles, they found that prairie voles had more "docking sites" for these hormones, particularly in parts of the brain's reward system.

Since these "cuddle chemicals" were found to enhance male-female bonds in voles, researchers have long wondered if they might work the same way in humans.

That's why the Duke-led team turned to lemurs. Despite being our most distant primate relatives, lemurs are a closer genetic match to humans than voles are.

The researchers used an imaging technique called autoradiography to map binding sites for oxytocin and vasopressin in the brains of 12 lemurs that had died of natural causes at the Duke Lemur Center.

The animals represented seven species: monogamous red-bellied and mongoose lemurs along with five promiscuous species in the same genus.

"They're really the only comparable natural experiment to look for biological signatures of monogamy in primates," Grebe said.

Comparing the brain imaging results in lemurs with previous results in voles and monkeys revealed some noticeable differences in the density and distribution of hormone receptors. In other words, oxytocin and vasopressin appear to act on different parts of the brain in lemurs -- which means they may also have different effects, depending on their target cell's location.

But within lemurs, the researchers were surprised to find few consistent differences between monogamous species and promiscuous ones.

"We don't see evidence of a pair-bond circuit" akin to that found in rodent brains, Grebe said.

As a next step, the team is looking at how lemur couples behave toward each other if the actions of oxytocin are blocked, by feeding them an antagonist that temporarily prevents oxytocin from binding to its receptors in the brain.

So what can lemurs teach us about love? The authors say their findings caution against drawing simple conclusions based on rodent experiments about how human social behaviors came to be.

Oxytocin may be the "potion of devotion" for voles, but it may be the combined actions and interactions of multiple brain chemicals, along with ecological factors, that create long-lasting bonds in lemurs and other primates, including humans, Grebe said.

"There are probably a number of different ways through which monogamy is instantiated within the brain, and it depends on what animals we're looking at," Grebe said. "There's more going on than we originally thought."

Credit: 
Duke University

ACC urges COVID-19 vaccine prioritization for highest risk heart disease patients

COVID-19 vaccine allocation should prioritize patients with advanced cardiovascular disease (CVD) over those with well-managed CVD, according to an American College of Cardiology (ACC) health policy statement published in the Journal of the American College of Cardiology (JACC). All CVD patients face a higher risk of COVID-19 complications and should receive the vaccine quickly, but the recommendations in this paper serve to guide clinicians in prioritizing their most vulnerable patients within the larger CVD group, while considering disparities in COVID-19 outcomes among different racial/ethnic groups and socioeconomic levels.

"A coherent vaccine allocation strategy will consider the exposure risks and clinical risks of given individuals and populations," said Thomas M. Maddox, MD MSc, professor of cardiology at Washington University School of Medicine in St. Louis and co-chair of the health policy statement. "In addition, it will take into account those demographic populations that, for a variety of reasons, have additional risks that lead to higher rates of COVID-19 infection and severe health outcomes."

As of January 2021, there were almost 99 million COVID-19 cases and over 2 million deaths caused by the coronavirus worldwide. With the quick development of multiple vaccines, the Centers for Disease Control and Prevention (CDC) issued phased recommendations for which populations should get vaccinated first. Under Phase 1c of the CDC guidance, all patients aged 16-64 with medical conditions that increase the risk for severe COVID-19 infection should receive the vaccine. Although the guidance states that heart conditions, hypertension, diabetes, obesity and smoking are examples of such high-risk medical conditions, it is silent on the varying levels of risk among the variety of CVD conditions that cardiovascular clinicians manage.

In response, the writing group developed a policy statement that provides overall considerations of both exposure and clinical risk needed for vaccine allocation efforts. It presents the specific evidence and risk considerations related to CVD and COVID-19, and proposes a tiered schema of CVD risk to incorporate into vaccine allocation decisions. In addition, this policy statement highlights the large disparities in COVID-19 and CVD outcomes among racial and ethnic groups and different socioeconomic status levels and calls for consideration of these disparities in allocation decisions.

"Our proposed vaccine allocation schema outlines key CVD clinical risk considerations within the broader context of key overall risk considerations including exposure, disparities, health care access, advanced age and multimorbidity," Maddox said. "Patients' risk categorization is determined by highest tier in which meet one or more of its criteria."

For example, under the proposed vaccine allocation schema, patients with poorly controlled hypertension, insulin-dependent diabetes, or diabetes with microvascular and/or macrovascular complications as a result of poor glycemic control should be considered higher risk than patients who are medically optimized. Similarly, patients with morbid obesity should be considered higher risk than patients who are overweight.
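
The "highest tier wins" rule lends itself to a compact illustration. The tiers and criteria in the sketch below are invented stand-ins, not the ACC's actual schema:

```python
# Hypothetical tiers, ordered from highest priority to lowest -- illustration only.
TIERS = [
    ("tier 1", {"advanced_cvd", "frailty", "multimorbidity"}),
    ("tier 2", {"poorly_controlled_hypertension", "insulin_dependent_diabetes", "morbid_obesity"}),
    ("tier 3", {"well_managed_cvd", "overweight"}),
]

def categorize(conditions: set) -> str:
    """Assign the highest tier in which the patient meets one or more criteria."""
    for name, criteria in TIERS:
        if conditions & criteria:
            return name
    return "no CVD-based prioritization"

print(categorize({"overweight", "insulin_dependent_diabetes"}))  # -> tier 2
```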

Patients with severe medical conditions, such as advanced CVD, may require long-term stays in nursing homes or rehabilitation centers, which increases their risk of COVID-19 exposure. Data show that the clinical risk for severe COVID-19 infection is associated with both advanced age and preexisting medical conditions, especially when two or more co-occur. In addition to multimorbidity, studies have found adverse effects of frailty in patients with COVID-19. The CDC's phased vaccine allocation recommendations prioritize patients of advanced age, in accordance with the CVD-related risk associated with advanced age. However, this policy statement urges that older patients with multiple comorbidities, including CVD conditions and/or frailty, be prioritized for COVID-19 vaccination.

"We hope that this document can be used to guide COVID-19 vaccine allocation and patient outreach in the context of prolonged demand-supply mismatch as we enter Phase 1c," Maddox said.

Credit: 
American College of Cardiology

New synthetic peptides could attenuate atherosclerosis

Research over the last 20 years has shown that atherosclerosis is a chronic inflammatory condition of the arterial blood vessel wall. Soluble mediators such as cytokines and chemokines are pivotal players in this disease, promoting vascular inflammation. However, the development of anti-inflammatory therapeutics directed against such mediators that could prevent atherosclerosis has proven difficult, despite promising clinical studies in the recent past.

Previous anti-inflammatory therapeutic strategies to prevent atherosclerosis, heart attacks, strokes, rheumatoid arthritis and other inflammatory diseases have mainly been based on antibodies and small molecule drugs. The Munich-based research team has now designed and chemically synthesized short chains of amino acids - i.e. peptides - that function like a minimized soluble chemokine receptor. In animal models, these peptides can block atherosclerosis.

Researchers design a new class of peptides against atherosclerosis

Chemokines orchestrate the migration of immune cells in our bodies. They are key players in inflammatory diseases, including atherosclerosis; and this is why they are of great interest to biomedical researchers.

The peptides designed and synthesized by the Munich researchers mimic certain chemokine receptors and are able to specifically inhibit chemokine mechanisms that promote atherosclerosis, whereas chemokine mechanisms that control important physiological processes in the body are not affected - one could say they are "spared".

Previous studies have shown the effectiveness of therapeutics related to cytokines and chemokines. However, these drugs not only interfered with the effect of these mediators on atherosclerosis, but also suppressed their beneficial effects, for example those related to the host defense against infections.

"The mini-CXCR4 mimics we have developed are able to selectively differentiate between two different chemokines that target the same receptor - in this case between the atypical chemokine MIF and the classical chemokine CXCL12. This enables them to specifically block pathways underlying atherosclerosis," explained Aphrodite Kapurniotu, Professor for Peptide Biochemistry at TUM.

Peptide therapeutics are suitable and inexpensive

"Peptide-based therapeutics are often considered less stable, as peptides may get rapidly degraded in the body by enzymes called proteases. However, we can apply various state-of-the-art approaches of peptide chemistry to improve the stability of peptides, for example by introducing unnatural amino acids into the peptide sequence" Prof. Kapurniotu added.

"So far, our approach was validated only in an animal model of atherosclerosis, but future clinical applications seem possible, in particular also due to the fact that peptide-based therapeutics are substantially less expensive than antibodies," said Prof. Jürgen Bernhagen from the Institute for Stroke and Dementia Research (ISD) at the LMU University Hospital.

Therapeutic potential for inflammatory diseases

The Munich researchers view these results as a "proof-of-principle" of their approach. In fact, their findings show that concepts based on mini-chemokine-receptor mimics are feasible and suggest that this kind of concept could potentially be applied to other chemokines as well.

Thus, the new molecular concept could bear therapeutic potential for atherosclerosis and other inflammatory diseases.

Credit: 
Technical University of Munich (TUM)

Drone-based photogrammetry: A reliable and low-cost method for estimating plant biomass

image: The experimental tallgrass prairie at the Morton Arboretum from above. This orthomosaic is constructed from almost 600 overlapping drone images.

Image: 
Lane Scher

Remote sensing technology has become a vital tool for scientists over the past several decades for monitoring changes in land use, ice cover, and vegetation across the globe. Satellite imagery, however, is typically available at only coarse resolutions, allowing only for the analysis of broad trends over large areas. Remote-controlled drones are an increasingly affordable alternative for researchers working at finer scales in ecology and agriculture, but laser-based technologies used to estimate plant productivity and biomass, such as light detection and ranging (LiDAR), remain prohibitively expensive.

In research presented in a recently published issue of Applications in Plant Sciences, researchers used low-cost remote sensing technology to produce multispectral vegetation indices and 3D photomosaics of the vegetation in a tallgrass prairie, comparing aerial estimates of biomass with direct measurements taken in the field. Based on their results, photogrammetry is a reliable way to estimate biomass, landcover, and ecosystem productivity, with several potential cost-saving implications for conservation and agricultural science.

Researchers carried out remote sensing data collection on a tallgrass prairie restoration experiment at the Morton Arboretum in Lisle, Illinois.

"The restoration experiment is designed to determine whether or not phylogenetic diversity and functional diversity affect restoration outcomes," said senior author Andrew Hipp, the Plant Systematist and Herbarium Director at the Morton Arboretum.

To accomplish this goal, Hipp and his research team drew from a total of 127 prairie species, each of which was planted as a monoculture in 4 m2 plots. These species were also mixed in various combinations in multispecies plots of the same size, spanning high, medium, and low phylogenetic diversity crossed with high and low trait diversity.

Lead author Lane Scher, a community ecologist at Duke University, saw the experiment's design as a unique opportunity to test the precision of photogrammetry-based estimates of biomass between monocultures and multispecies plots.

The researchers used a DJI Phantom 4 drone equipped with a standard camera as well as a multispectral sensor that would allow them to calculate various vegetation indices -- metrics based on ratios of light wavelengths that indicate the relative health and productivity of plant ecosystems.
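
Vegetation indices of this kind are simple band ratios. For instance, the widely used normalized difference vegetation index (NDVI) contrasts near-infrared and red reflectance; the sketch below is a generic illustration, not the authors' processing chain:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red); values near 1 indicate dense, healthy vegetation."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / np.maximum(nir + red, 1e-9)  # guard against division by zero

# Hypothetical 2x2 reflectance rasters, for illustration:
nir_band = np.array([[0.50, 0.45], [0.30, 0.55]])
red_band = np.array([[0.08, 0.10], [0.20, 0.06]])
print(ndvi(nir_band, red_band).round(2))
```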

By stitching together close to 600 overlapping images, the researchers created a set of mosaics of the study site, allowing them to calculate the height of the vegetation in each plot. By extrapolation, they were then able to estimate the total biomass of the tallgrass species.
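
In photogrammetry workflows, per-plot canopy volume is typically obtained by differencing a digital surface model (the top of the vegetation) against a digital terrain model (the bare ground) and summing over pixels. A sketch of that arithmetic on hypothetical rasters, not the authors' code:

```python
import numpy as np

def canopy_volume_m3(dsm: np.ndarray, dtm: np.ndarray, pixel_area_m2: float) -> float:
    """Sum per-pixel canopy height (surface minus terrain) times pixel area."""
    heights = np.clip(dsm - dtm, 0.0, None)  # treat negative differences as noise
    return float(heights.sum() * pixel_area_m2)

# Hypothetical 4 m2 plot at 10 cm resolution (20 x 20 pixels), for illustration:
rng = np.random.default_rng(0)
dtm = np.full((20, 20), 100.0)                    # flat ground at 100 m elevation
dsm = dtm + rng.uniform(0.2, 1.2, size=(20, 20))  # vegetation 0.2-1.2 m tall
print(f"{canopy_volume_m3(dsm, dtm, pixel_area_m2=0.01):.2f} m^3")
```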

Analytical comparisons between the aerial and field-based measurements indicate that estimates of biomass derived from photogrammetry explained up to 47% of the variation in biomass for multispecies plots, a promising result for such a low-cost method.

"Of the metrics we used, volume was the best predictor of productivity, which is great news, because it's also the least expensive to measure," said Scher. Although the researchers used a multispectral sensor to obtain imagery for different wavelengths of light to calculate vegetation indices, their results suggest that a simple RGB camera is all that's needed to reliably estimate biomass.

These methods might not be as applicable to monocultures, however. The explanatory power of their model, which accounted for 47% of the variation in multispecies plots, dropped to 34% in monocultures, a trend that wasn't entirely unexpected by the team.

"In monocultures, you typically have only one layer of vegetation," said Scher. Plants of the same species and variety typically have the same growth form, with the result that most of the leaves compete for space in a single, crowded layer.

"In multispecies plots, however, vegetation can be more evenly spaced vertically," said Scher. Since 3D photogrammetry calculates volume as everything between the soil surface and the top layer of vegetation, plots with plants evenly distributed in height will likely give the most accurate estimates.

While future comparisons with similar LiDAR measurements will be useful in further constraining the accuracy of photogrammetry for the estimation of biomass, the present study outlines a simple, fast, and affordable method to reliably assess vegetation productivity at fine-scale resolutions across large study areas.

Credit: 
Botanical Society of America