Bioactive natural compounds for the fight against cancer

Through a balanced diet, we consume considerable quantities of phytoalexins every day - in a natural and healthy way. Phytoalexins (from the Greek phyton = "plant" and alexein = "to ward off") are phytochemicals that plants produce as an immune response to certain stimuli in order to maintain their own health. Numerous scientific studies have already shown that these bioactive natural products also have a health-promoting effect on humans. However, investigating their mechanisms of action in detail requires simple access to the individual phytoalexins, which until now could be obtained only inefficiently and with the use of toxic substances.

Dr. Philipp Ciesielski and Prof. Peter Metz from the Chair of Organic Chemistry I at TU Dresden have now presented a novel and extremely efficient synthesis of phytoalexins in the renowned journal Nature Communications. A decisive innovation is, in particular, the short, few-step synthesis of the phytoalexins Glyceollin I and Glyceollin II, which soybean plants produce as part of their immune response. These two natural compounds are characterized by a broad spectrum of bioactivities, including antitumour activity as well as health-promoting antioxidant and anticholesterolemic effects against lifestyle diseases.

The previous syntheses of Glyceollin I and II use large amounts of the highly toxic and expensive oxidizing agent osmium tetroxide, as well as large amounts of a comparatively expensive auxiliary as a ligand in the key step. The newly presented synthesis route, by contrast, dispenses with osmium tetroxide entirely and at the same time proves to be much more efficient.

"Our synthesis pathway to various phytoalexins now allows easier access to these substances. This is an important basis for further investigations into the biological activity of these natural compounds and may well form the basis for their further development as therapeutics. The path we have described to the basic structure of phytoalexins can also be used by other research groups in the synthesis of related natural and active compounds," describes Prof. Peter Metz the importance of his publication.

Credit: 
Technische Universität Dresden

Sledge dogs are closely related to 9,500-year-old 'ancient dog'

Dogs play an important role in human life all over the world - whether as a family member or as a working animal. But where the dog comes from and how old various groups of dogs are is still a bit of a mystery.

Now, light has been shed on the origin of the sledge dog. In a new study published in SCIENCE, researchers from the Faculty of Health and Medical Sciences, University of Copenhagen, show that the sledge dog is both older and has adapted to the Arctic much earlier than thought. The research was conducted in collaboration with the University of Greenland and the Institute of Evolutionary Biology, Barcelona.

'We have extracted DNA from a 9,500-year-old dog from the Siberian island of Zhokhov, which the dog is named after. Based on that DNA we have sequenced the oldest complete dog genome to date, and the results show an extremely early diversification of dogs into types of sledge dogs', says one of the two first authors of the study, PhD student Mikkel Sinding of the Globe Institute.

Until now, it has been the common belief that the 9,500-year-old Siberian dog, Zhokhov, was a kind of ancient dog - one of the earliest domesticated dogs and a version of the common origin of all dogs. But according to the new study, modern sledge dogs such as the Siberian Husky, the Alaskan Malamute and the Greenland sledge dog share the major part of their genome with Zhokhov.

'This means that modern sledge dogs and Zhokhov had the same common origin in Siberia more than 9,500 years ago. Until now, we have thought that sledge dogs were only 2,000-3,000 years old', says the other first author, Associate Professor Shyam Gopalakrishnan of the Globe Institute.

The Original Sledge Dog

To learn more about the origins of the sledge dog, the researchers also sequenced the genomes of a 33,000-year-old Siberian wolf and ten modern Greenlandic sledge dogs. They compared these genomes to the genomes of dogs and wolves from around the world.

'We can see that the modern sledge dogs have most of their genomes in common with Zhokhov. So, they are more closely related to this ancient dog than to other dogs and wolves. But not just that - we can see traces of crossbreeding with wolves such as the 33,000-year-old Siberian wolf - but not with modern wolves. It further emphasises that the origin of the modern sledge dog goes back much further than we had thought', says Mikkel Sinding.

The modern sledge dogs have more genetic overlap with other modern dog breeds than Zhokhov has, but the studies do not show where or when this mixing occurred. Nevertheless, among modern sledge dogs, the Greenland sledge dog stands out with the least overlap with other dogs, meaning that the Greenland sledge dog is probably the most original sledge dog in the world.

Common Features with Inuit and Polar Bears

In addition to advancing the common understanding of the origin of sledge dogs, the new study also teaches the researchers more about the differences between sledge dogs and other dogs. Sledge dogs do not have the same genetic adaptations to a sugar and starch rich diet that other dogs have. On the other hand, they have adaptations to high-fat diets, with mechanisms that are similar to those described for polar bears and Arctic people.

'This emphasises that sledge dogs and Arctic people have worked and adapted together for more than 9,500 years. We can also see that they have adaptations that are probably linked to improved oxygen uptake, which makes sense in relation to sledding and gives the sledding tradition ancient roots', says Shyam Gopalakrishnan.

Credit: 
University of Copenhagen - The Faculty of Health and Medical Sciences

Study looks at the impact of gender bias in helmet regulations for lacrosse players

According to a new study, high school girls' lacrosse players who may, but are not required to, wear flexible headgear are at a higher risk of getting a concussion from a stick or ball impact than boys' lacrosse players, who are required to wear a hard shell helmet with a full face mask.

Faculty at the Colorado School of Public Health at the University of Colorado Anschutz Medical Campus and the University of Colorado Denver published the study in Injury Epidemiology. The study examined whether girls' lacrosse players are at a greater risk for concussions than boys' lacrosse players due to differences in helmet regulations.

"As youth sports begin to restart across the country, the results of our study provide a reminder to parents about the risk of concussion for girls in lacrosse compared to their male counterparts," said Dawn Comstock, PhD, professor of epidemiology at the Colorado School of Public Health, and lead author of the paper.

Boys' lacrosse is a full contact sport that allows body and stick checking, which mandates hard shell helmets with full face masks. Girls' lacrosse, which prohibits body checking and whose sphere rule is supposed to prevent stick checking to the head, allows optional flexible headgear with or without integrated eye protection.

"The disproportionate guidelines around protective equipment for different genders are outdated. Our study's results indicate it shouldn't be a debate whether the required boys' lacrosse hard shell helmets should also be mandated in girls' lacrosse," said Sarah K. Fields, JD, PhD, professor of communication at the University of Colorado Denver and study co-author.

In this study, researchers used lacrosse concussion data from the National High School Sports-Related Injury Surveillance Study to determine whether girls' lacrosse players were at increased risk of concussion from stick or ball contact due to differences in helmet regulations, calculating the attributable risk and attributable risk percent (AR%) for concussions resulting from ball or stick impacts.

The researchers looked at gender comparison by injury mechanism. The key findings include:

In girls' lacrosse, stick or ball contact was the most common mechanism of concussion, accounting for 72.7 percent of all concussions, while athlete-athlete contact accounted for 19.8 percent.

In boys' lacrosse, stick or ball contact accounted for 23.5 percent of all concussions while athlete to athlete contact accounted for 66.4 percent.

The rate of concussions from stick or ball contact was significantly higher in girls than boys, with girls being 2.6 times as likely to sustain a concussion from stick or ball contact.

An estimated 45 percent of all girls' lacrosse concussions could have been prevented if girls wore the helmet mandated in boys' lacrosse.

The study found no evidence to indicate the hard shell helmet with full face mask currently mandated in boys' lacrosse would not provide similar protection from stick and ball strikes in girls' lacrosse.
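For readers who want to see how these numbers fit together, the short sketch below (our own illustration in Python, not the study's code) reproduces the roughly 45 percent estimate from two figures reported above: the 2.6-fold rate ratio and the 72.7 percent share of girls' concussions caused by stick or ball contact.

```python
# Illustrative back-of-envelope calculation, not the study's actual analysis.
# Inputs are the two figures reported in the findings above.
rate_ratio = 2.6          # girls' vs. boys' rate of stick/ball concussions
stick_ball_share = 0.727  # share of all girls' concussions from stick/ball contact

# Attributable risk percent (AR%): the fraction of stick/ball concussions in
# girls that would be avoided if their rate matched the helmeted boys' rate.
ar_percent = (rate_ratio - 1) / rate_ratio

# Applied to all girls' concussions, since stick/ball contact causes 72.7% of them:
preventable = ar_percent * stick_ball_share

print(f"AR% among stick/ball concussions: {ar_percent:.1%}")              # ~61.5%
print(f"Share of all girls' concussions preventable: {preventable:.1%}")  # ~44.7%, i.e. ~45%
```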

Comstock added, "Bottom line, girls playing lacrosse are sustaining concussions that could have been prevented if gender bias did not prohibit them from wearing the very helmet required for boys playing lacrosse."

Credit: 
University of Colorado Anschutz Medical Campus

For children with cleft lip and palate, no major psychological impact of repeated surgeries

June 25, 2020 - Children born with cleft lip and cleft palate (CLP) commonly undergo multiple surgical procedures between infancy and adolescence. By the time they reach their teens, patients with CLP who have had more total surgeries do not show increased psychosocial problems.

However, an increased number of surgeries between 8 and 10 years of age may predict increased anxiety and depression in adolescence, reports a study in the July issue of Plastic and Reconstructive Surgery®, the official medical journal of the American Society of Plastic Surgeons (ASPS).

ASPS member surgeon Justine C. Lee, MD, PhD, of UCLA is senior author of the new study. She comments, "In conjunction with our previous work that identified the 8-to-10 age range as a critical at-risk time period for poor psychosocial functioning, we now find that teenagers who had more surgeries during that age range report worse long-term psychosocial functioning."

The researchers identified a group of 55 teens who had undergone surgery for CLP. Like many children with CLP, they underwent multiple reconstructive surgeries to address their appearance, feeding, hearing, speech, and other functions.

The patients with CLP, along with a comparison group of 14 adolescents without CLP, completed standard assessments of anger, anxiety, and depressive symptoms. Relationships between these psychosocial outcomes and the number of surgeries were assessed. From age 0 to 14, the patients with CLP had an average of six procedures per patient.

Overall, there was no significant difference in psychosocial outcomes for teens with and without CLP. For all three psychosocial outcomes assessed - anger, anxiety, and depression - there was no significant association with total number of surgeries.

The study also looked at the impact of the number of surgeries by age group. More than half of all surgeries were done from birth to age 7 - most commonly procedures to close the cleft lip and/or palate. In older age groups, the most common surgeries were bone grafts to augment the bone under the gums (alveolar bone grafts), which must be done before the permanent teeth start to come in.

In this analysis, a greater number of surgeries between ages 8 and 10 was linked to higher scores for both anxiety and depressive symptoms. Higher scores for anger were also associated with increased anxiety and depression. In all other age groups, the number of surgeries was unrelated to psychosocial outcomes.

Children with CLP undergo multiple procedures, surgical and otherwise, from infancy into adolescence. Addressing mental health is an important goal of modern, multidisciplinary CLP care. But in contrast to functional outcomes like feeding and speech, few studies have monitored the long-term psychosocial outcomes of treatment for CLP.

The total number of procedures does not appear to affect psychosocial outcomes in patients with CLP by the time they are teens, the results suggest. The researchers write, "While these conclusions may be considered positively in that patients with CLP have not been negatively impacted by more surgery, the negative aspect is that they have not demonstrated any benefit by having more surgery either."

The findings add to a previous study by Dr. Lee's group, also published in Plastic and Reconstructive Surgery®, which found that the 8- to 10-year age range is an "at-risk period for psychosocial distress in children with craniofacial anomalies," including CLP. Dr. Lee and colleagues conclude, "The significant association between multiple surgeries and psychosocial functioning suggests a need to develop strategies for modifying timing or consolidating procedures during that age range."

Credit: 
Wolters Kluwer Health

Novel radiotracer advantageous for imaging of neuroendocrine tumor patients

image: Comparison of whole-body maximum-intensity projections in 6 representative patients (patients 7, 8, 11, 14, 27, and 29 from left to right). Physiologic uptake is seen at the pituitary gland, salivary glands, thyroid, adrenal glands, spleen (splenectomy in patients 7 and 8), and bowel on 68Ga-DOTATATE maximum-intensity projections (top). However, these normal organs show no or only very mild uptake on 68Ga-DOTA-JR11 maximum-intensity projections (bottom). In addition, 68Ga-DOTA-JR11 depicts more liver lesions than 68Ga-DOTATATE, with lower liver background.

Image: 
Images created by Wenjia Zhu and Li Huo, et al., Peking Union Medical College Hospital, Beijing, China.

Reston, VA--For neuroendocrine cancer patients with liver metastases, a new radiopharmaceutical, 68Ga-DOTA-JR11, has shown excellent imaging performance in tumor detection, staging and restaging, providing important information to guide treatment. In a head-to-head comparison of two somatostatin receptor (SSTR) imaging agents, 68Ga-DOTA-JR11 PET/CT performed better than 68Ga-DOTATATE PET/CT in detecting liver metastases, with a better tumor-to-background ratio, according to research published in the June issue of The Journal of Nuclear Medicine.

SSTRs--the key target for imaging and peptide receptor radionuclide therapy in patients with neuroendocrine tumors--typically are imaged using 68Ga-labeled peptides, which are agonists that bind to SSTRs to elicit a response. However, newly developed peptide antagonists, which recognize and then block SSTRs, have shown more favorable pharmacokinetics, better image contrast, higher tumor uptake, and longer residence times in recent studies.

"With antagonists, we now have an alternative to agonists," stated Wenjia Zhu, MD, nuclear medicine physician, at Peking Union Medical College Hospital in Bejing, China. "However, there is still not much evidence about the performance of PET/CT imaging with SSTR antagonists. Hence, we designed this prospective study to compare 68Ga-DOTATATE and 68Ga-DOTA-JR11 PET/CT in patients with metastatic, well-differentiated neuroendocrine tumors."

The study included 31 patients, each imaged on two consecutive days: patients received an intravenous injection of 68Ga-DOTATATE on the first day and of 68Ga-DOTA-JR11 on the second. Whole-body time-of-flight PET/CT scans were performed 40-60 minutes after each injection on the same scanner. Upon completion, physiologic normal-organ uptake, lesion numbers, and lesion uptake were compared between the 68Ga-DOTATATE and 68Ga-DOTA-JR11 PET/CT images.

The physiologic normal-organ uptake of the spleen, renal cortex, adrenal glands, pituitary gland, stomach wall, normal liver parenchyma, small intestine, pancreas, and bone marrow was significantly lower on 68Ga-DOTA-JR11 PET/CT than on 68Ga-DOTATATE PET/CT. 68Ga-DOTA-JR11 detected significantly more liver lesions than 68Ga-DOTATATE; however, 68Ga-DOTATATE detected more bone lesions than 68Ga-DOTA-JR11. While the two radiopharmaceuticals showed similar lesion uptake for primary tumors and lymph node metastases on both patient-based and lesion-based comparisons, the target-to-background ratio of liver lesions was significantly higher on 68Ga-DOTA-JR11.

"What we've learned from this study is that peptides matter," noted Zhu. "For patients with different metastatic patterns, different peptides (DOTA-JR11 versus DOTATATE) should be used. In liver-dominant disease, 68Ga-DOTA-JR11 may be a better choice in tumor staging and restaging compared to 68Ga-DOTATATE. It may also change the treatment strategy, especially when partial resection or local therapy for liver metastasis is considered. In bone-dominant disease, we should probably stick to agonists, as 68Ga-DOTA-JR11 may underestimate tumor burden. We expect more extensive theranostic application of antagonists in the near future."

Credit: 
Society of Nuclear Medicine and Molecular Imaging

New approach drives bacteria to produce potential antibiotic, antiparasitic compounds

image: Researchers used a variety of techniques, including genome mining, to identify bacteria that produce defensive compounds in response to hormone exposure. Their approach will help in the discovery of new antibiotics and other medically useful molecules.

Image: 
Graphic by Julie McMahon

CHAMPAIGN, Ill. -- Researchers have developed a method to spur the production of new antibiotic or antiparasitic compounds hiding in the genomes of actinobacteria, which are the source of drugs such as actinomycin and streptomycin and are known to harbor other untapped chemical riches. The scientists report their findings in the journal eLife.

The researchers wanted to overcome a decades-old problem that confronts those hoping to study and make use of the countless antibiotic, antifungal and antiparasitic compounds that bacteria can produce, said Satish Nair, a University of Illinois at Urbana-Champaign professor of biochemistry who led the research.

"In laboratory conditions, bacteria don't make the number of molecules they have the capability of making," he said. "And that's because many are regulated by small-molecule hormones that aren't produced unless the bacteria are under threat."

Nair and his colleagues wanted to determine how such hormones influence the production of antibiotics in actinobacteria. By exposing their bacteria to the right hormone or combination of hormones, the researchers hope to spur the microbes to produce new compounds that are medically useful.

The team focused on avenolide, a hormone that is more chemically stable than one used in earlier studies of bacterial hormones. Avenolide regulates the production of an antiparasitic compound known as avermectin in a soil microbe. A chemically modified version of this compound, ivermectin, is used as a treatment for river blindness, a fly-borne disease that blinded millions of people, mostly in sub-Saharan Africa, before the drug was developed.

For the new study, chemistry graduate student Iti Kapoor developed a more streamlined process for synthesizing avenolide in the lab than was previously available. This allowed the team to study the hormone's interactions with its receptor both inside and outside bacterial cells.

"Using a method called X-ray crystallography, Iti and biochemistry graduate student Philip Olivares were able to determine how the hormone binds to its receptor and how the receptor binds to the DNA in the absence of hormones," Nair said. "Typically, these receptors sit on the genome and they basically act as brakes."

The researchers discovered that when the hormone binds to it, the receptor loses its ability to cling to DNA. This turns off the brakes, allowing the organism to churn out defensive compounds like antibiotics.

Knowing which regions of the receptor are involved in binding to the hormone and to the DNA enabled the team to scan the genomes of dozens of actinobacteria for sequences with the right traits to bind their receptor or similar receptors. This process, called genome mining, allowed the team to identify 90 actinobacteria that appear to be regulated by avenolide or other hormones in the same class.

"Our long-term project is to take those 90 bacteria, grow them up in the laboratory, add chemically synthesized hormones to them and see what new molecules are being produced," Nair said. "The beauty of our approach is that we can now get the bacteria to produce large quantities of molecules that normally we would not be able to make in the lab."

Some of these new compounds are likely to have medical relevance, he said.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Racial disparities in surgery rates for esophageal cancer

PHILADELPHIA - Black patients with esophageal cancer are at a higher risk of death compared to white patients. Although many reasons have been suggested for this, few have given physicians actionable information. A new study from the Sidney Kimmel Cancer Center (SKCC) - Jefferson Health points to a different reason: Black patients were less likely to receive surgery for treatable disease, which may have contributed to their higher rates of death.

The results were published in the Journal of Gastrointestinal Surgery.

"National guidelines suggest that early-stage esophageal cancer should be treated with surgery because data shows that it offers patients the best chances of survival, rather than chemotherapy alone," says senior author Nathaniel Evans, MD, Director of the Division of Thoracic Surgery at Thomas Jefferson University, and Chief of Cancer Services, Center City Division at the SKCC. "Our data show that Black patients are not having surgery for early-stage disease, which may contribute to higher rates of death. With this data, we can now begin to educate patients and providers to change practice."

A total of 60,041 patients drawn from the National Cancer Database were included in the analysis, of whom 4,402 were Black and 55,639 were white, treated across 1,334 hospitals around the country. To ensure an unbiased comparison, Black and white patients were matched 1:1 by demographics, comorbidities, and tumor characteristics. The final matched dataset included 5,858 patients.
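As a rough illustration of the 1:1 matching idea (not the study's actual methodology; the real analysis used National Cancer Database records and richer matching criteria, and the field names below are invented), a minimal exact-matching sketch in Python might look like this:

```python
from collections import defaultdict

# Minimal sketch of 1:1 exact matching on shared characteristics - the general
# idea behind the study's matched comparison. Fields here are hypothetical.
def match_one_to_one(group_a, group_b, keys):
    pool = defaultdict(list)
    for patient in group_b:
        pool[tuple(patient[k] for k in keys)].append(patient)
    pairs = []
    for patient in group_a:
        bucket = pool[tuple(patient[k] for k in keys)]
        if bucket:                     # each patient is used in at most one pair
            pairs.append((patient, bucket.pop()))
    return pairs

# Hypothetical records, for illustration only
black = [{"age_band": "60-69", "stage": "II", "comorbidities": 0}]
white = [{"age_band": "60-69", "stage": "II", "comorbidities": 0},
         {"age_band": "70-79", "stage": "III", "comorbidities": 1}]
print(len(match_one_to_one(black, white, ["age_band", "stage", "comorbidities"])))  # 1 matched pair
```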

The analysis, led by first author Samantha L. Savitch, a senior medical student and researcher in the Department of Surgery, and colleagues showed that rates of surgery were significantly lower - 25-40% lower - for Black patients with esophageal cancer in stages I to III. In addition, the researchers noted that the chances of receiving surgery decreased as the age of Black patients increased, and also decreased if the patients were receiving radiation therapy. Black patients were more likely to receive surgery if they were treated at a hospital more than 5 miles from their homes.

The findings also suggested that patients who were diagnosed with a type of esophageal cancer called squamous cell carcinoma, which is more common in Black patients, were less likely to receive surgery. All this despite clear evidence that surgical resection is the best chance for survival in patients with esophageal cancer.

"Although the data doesn't give us a reason for the observations we're seeing, it does show us areas where we can take action," says Dr. Evans. "Even when we control for socioeconomic status, insurance status, location, and comorbid conditions, the disparity still persists, it is quite profound. This highlights the need to educate Black patients and their healthcare providers on the importance of surgery in the treatment esophageal cancer."

"One way we are addressing this is by developing a Multidisciplinary GI Cancer group," says Dr. Evans. "We review esophageal cancer patients and ensure their treatment plans are tailored to the individual patent and follow established guidelines."

"This important study is part of a much larger effort at the Sidney Kimmel Cancer Center to understand and mitigate cancer disparities," says Karen Knudsen, PhD, EVP of Oncology Services and Enterprise Director of SKCC. "This goal is central to our mission to improve the lives cancer patients and their families, regardless of geography, gender, or demographic. We are thankful to Dr. Evans and the entire research team for raising awareness about this critical national issue."

Credit: 
Thomas Jefferson University

The Lancet Psychiatry: First UK-wide study describes brain complications in some patients with severe COVID-19

A study of 153 patients treated in UK hospitals during the acute phase of the COVID-19 pandemic describes a range of neurological and psychiatric complications that may be linked to the disease and is published today in The Lancet Psychiatry journal.

All of the patients in the study were selected by expert doctors and therefore likely represent the most severe cases. It is not possible to draw conclusions about the total proportion of COVID-19 patients likely to be affected based on this study, and in light of these findings further research is now needed, the authors say.

Researchers say their report offers the first detailed snapshot of the breadth of neurological complications in COVID-19 patients and should help to direct future research to establish the mechanisms of such complications so that potential treatments can be developed.

Dr Benedict Michael, lead author of the study, from the University of Liverpool, said: "There have been growing reports of an association between COVID-19 infection and possible neurological or psychiatric complications, but until now these have typically been limited to studies of ten patients or fewer. Ours is the first nation-wide study of neurological complications associated with COVID-19, but it is important to note that it is focused on cases that are severe enough to require hospitalisation." [1]

To investigate the breadth of COVID-19 complications that affect the brain, researchers set up a secure, UK-wide online network for specialist doctors to report details of specific cases. These portals were hosted by professional bodies representing specialists in neurology, stroke, psychiatry and intensive care. Data was collected between 2 April and 26 April 2020, during the exponential phase of the pandemic.

Professor Sarah Pett, co-author of the study, from University College London, UK, said: "This data represents an important snapshot of the brain-related complications of COVID-19 in hospitalised patients. It is critically important that we continue to collect this information to really understand this virus fully. We also need to understand brain-complications in people in the community who have COVID-19 but were not sick enough to be hospitalised. Our study provides the foundations for larger, hospital and community-based studies. These studies will help inform on the frequency of these brain complications, who's most at risk of getting them, and ultimately how best to treat." [1]

Some 153 cases were reported during the study period, of which full clinical details were available for 125 patients. The study included patients with confirmed COVID-19 infection by PCR test (114 people), probable infection as diagnosed from chest X-rays or CT scans (6 people), and possible infection, where patients had symptoms consistent with disease but diagnostic tests were either negative or not done (5 people).

The most common brain complication observed was stroke, which was reported in 77 of 125 patients. Of these, 57 patients had a stroke caused by a blood clot in the brain, known as an ischaemic stroke, nine patients had a stroke caused by a brain haemorrhage, and one patient had a stroke caused by inflammation in the blood vessels of the brain. Age data were available for 74 of the patients who experienced a stroke, and the majority of those were over 60 years of age (82%, 61/74).

Thirty-nine patients showed signs of confusion or changes in behaviour reflecting an altered mental state. Of these, nine patients had unspecified brain dysfunction, known as encephalopathy, and seven patients had inflammation of the brain, medically termed encephalitis. Long-term follow-up studies to assess the duration and severity of these complications are needed.

The remaining 23 patients with an altered mental state were diagnosed with psychiatric conditions, of which the vast majority were determined as new diagnoses by the notifying psychiatrist (92%, 21/23). Although most psychiatric diagnoses were determined as new by the notifying psychiatrist or neuropsychiatrist, the researchers say they cannot exclude the possibility that these were undiagnosed before the patient developed COVID-19.

The 23 patients with psychiatric diagnoses included ten patients with a new-onset psychosis and six patients with a dementia-like syndrome. Seven patients had signs of a mood disorder, including depression and anxiety (7/23).

Age information was available for 37 of the 39 patients with an altered mental state; of those, around half were under 60 years of age (49%, 18/37).

The researchers say the high proportion of younger patients diagnosed with psychiatric conditions after showing signs of an altered mental state could be because these patients may be more likely to be referred to a psychiatrist or other specialist doctor, whereas confusion or behaviour changes in older patients may be more likely to be attributed to delirium and not investigated further. Detailed long-term studies are needed in order to confirm if there is any link between COVID-19 infection and the onset of psychiatric or neuropsychiatric complications in younger patients. Such studies should include comparison of the immune response in affected patients and those not affected, as well as investigation of genetic factors that might underpin the development of disease, the researchers say.

Dr Benedict Michael, one of the lead authors of the study, from the University of Liverpool, said: "Our study is an important early step towards defining neurological complications in COVID-19 patients, which will help with health policy planning as well as informing the immediate next steps in COVID-19 neuroscience research. We now need detailed studies to understand the possible biological mechanisms underlying these complications so that we can explore potential treatments." [1]

Credit: 
The Lancet

Smart phones are empowering women worldwide

By giving women access to information they otherwise wouldn't have, mobile phones are transforming lives. Putting smart phones in women's hands could be a powerful tool to support sustainable development goals in the developing world, according to researchers from McGill University, University of Oxford and Bocconi University.

The study, published in the Proceedings of the National Academy of Sciences, covers 209 countries between 1993 and 2017, and shows that access to mobile phones is associated with multiple indicators linked to global social development, such as good health, gender equality, and poverty reduction. The link between mobile phone access and female empowerment is stronger in less- and least-developed countries.

Survey of women in Sub-Saharan Africa

In an effort to better understand how mobile phones empower women, the authors also conducted an individual-level analysis of 100,000 women from Angola, Burundi, Ethiopia, Malawi, Tanzania, Uganda, and Zimbabwe between 2015 and 2017. Though these sub-Saharan countries show slow fertility decline, and infant and maternal mortality rates remain high, the adoption of mobile phones is spreading fast.

Results indicate that, other things being equal, women who own a mobile phone have a 1% higher probability of being involved in decision-making about contraception, a 2% higher likelihood of using modern contraceptive methods, and a 3% higher likelihood of knowing where to get tested for HIV, compared with women who do not own a phone. These effects are sizeable: they are comparable to, if not bigger than, the effects of living in an urban rather than a rural area. Similar effects are estimated for overall decision-making power within the household.

According to the researchers, improved knowledge and enhanced decision-making power are the likely pathways through which the macro-level results emerge. The analysis of individual data also confirms that the effects are stronger in poorer and more isolated areas.

Digital divides in the developing world

Still, despite the proliferation of mobile networks, the researchers acknowledge that digital divides by gender and socioeconomic strata persist in the developing world. Women are less likely to own mobile phones on their own, use them less often when they have access, and have poorer information and communications technology skills compared to men, creating second-level (skill-related) digital divides on top of first-level (access-related) ones.

"Our results suggest that deploying mobile-phone technology might serve to complement the role of other development processes such as educational expansion and economic growth rather than a replacement for it," says Luca Maria Pesando, a professor in the Department of Sociology and Centre on Population Dynamics at McGill University.

Credit: 
McGill University

Researchers find best way to treat children with sickle cell anemia in sub-Saharan Africa

image: Nurse Susan Murungi working with one of the children enrolled in NOHARM.

Image: 
Indiana University School of Medicine

INDIANAPOLIS, CINCINNATI, and KAMPALA, Uganda - A team of international researchers has learned that dose escalation of hydroxyurea treatment for children in Uganda with sickle cell anemia is more effective than, and has side effects similar to, a lower fixed dose of the same drug.

The study, known as NOHARM MTD (Novel use Of Hydroxyurea in an African Region with Malaria - Maximum Tolerated Dose), focused on children in Uganda, but the results could impact use of hydroxyurea worldwide, including the United States and Europe. The findings were published in the June 25 issue of the New England Journal of Medicine.

This clinical research milestone removes a major barrier to broadly expand the use of hydroxyurea in low-resource regions like sub-Saharan Africa, according to the physicians who led the study at Makerere University in Kampala, Uganda, Cincinnati Children's Hospital Medical Center, and the Indiana University School of Medicine.

For this study, 187 children with sickle cell anemia living in Uganda received hydroxyurea. About half received a fixed dose of 20 mg per kilogram of body weight per day. The other half received an escalating dose, which started at 25 mg per kilogram of body weight per day and increased up to 35 mg per kilogram per day, if tolerated. Doctors evaluated the children every 2-3 months for laboratory and clinical benefits, as well as potential side effects.

While the study team was planning to keep the children on these separate treatment arms for two years, an independent data review panel changed the course about 18 months into the study, due to clear benefits of the higher dose.

"Our study's data safety and monitoring board noted a highly significant difference between the treatment groups, with the children on escalated dosing having superior clinical results, but the same number of side effects, so at their recommendation we halted the trial and moved all of the children to that escalated dosing strategy," said Robert Opoka, who oversaw the study at Makerere University and Mulago Hospital in Uganda.

Sickle cell anemia is a life-threatening blood disorder that distorts red blood cells into a sickle or crescent shape and leads to anemia, recurrent pain, organ damage, and early death. It is also a serious global health issue: in the United States, about 100,000 individuals are affected, but worldwide more than 300,000 children are born with the disease each year, with more than 80% born in sub-Saharan Africa. Yet most therapeutic developments for sickle cell have not been available to children in Africa, including hydroxyurea, which is FDA-approved and effectively reduces the acute and chronic disease manifestations.

Earlier studies in sub-Saharan Africa showed hydroxyurea to be safe, feasible to use, and effective for treating sickle cell anemia, according to Dr. Russell Ware, a hematologist at Cincinnati Children's who led those studies and is senior investigator on the current NEJM paper. Ware said the drug boosts fetal hemoglobin, which reduces sickling of the red blood cells and improves anemia, lowers pain and other sickle-related events, and reduces clinical interventions such as transfusions and hospitalizations.

According to lead study co-investigator Dr. Chandy John, a physician-scientist at the Ryan White Center for Pediatric Infectious Diseases and Global Health at the IU School of Medicine, the optimal dosing and monitoring plan for hydroxyurea was unknown when the study started. Determining that plan was crucial, particularly for low-resource countries like Uganda.

John said the new study confirms that the dose escalation regimen is better than fixed dose, but expanding hydroxyurea treatment will require affordable drug costs, education of healthcare providers, and an increased drug supply. In addition, newborn screening for sickle cell anemia is needed to help identify those who will benefit from treatment as early as possible.

"There will be some additional costs associated with screening and increasing access to the drug, but they will be more than offset by benefits to patients," John explained. "Our data make it clear that children on a higher daily dose had substantially better clinical outcomes, with fewer adverse events."

"The study shows clearly that the optimized dosing strategy for hydroxyurea, though it requires more effort than a fixed-dose treatment regimen, results in far better outcomes for children with sickle cell anemia," Ware concluded. "We think this study can substantially benefit children with sickle cell anemia in Africa and throughout the world, by choosing the optimal hydroxyurea dose to decrease the disease complications."

"In Africa and other low-resource settings, children with sickle cell anemia have been a neglected population" said Opoka. "We are glad that our work together as African and US researchers has resulted in findings that will improve the health of all children with sickle cell anemia."

Credit: 
Indiana University

Control over work-life boundaries creates crucial buffer to manage after-hours work stress

image: Workers with greater boundary control over their work and personal lives were better at creating a stress buffer to prevent them from falling into a negative rumination trap, says a new study co-written by a trio of University of Illinois at Urbana-Champaign experts who study occupational stress and employee well-being. From left, labor and employment relations professors YoungAh Park and Yihao Liu, and graduate student Lucille Headrick.

Image: 
Photos and photo collage by L. Brian Stauffer

CHAMPAIGN, Ill. -- When work intrudes after hours in the form of pings and buzzes from smartphone alerts, it can cause spikes of stress that lead to a host of adverse effects for workers, including negative work rumination, poor affect and insomnia.

But according to research co-written by a team of researchers at the University of Illinois at Urbana-Champaign who study occupational stress and employee well-being, those who have greater "boundary control" over their work and personal lives were better at creating a stress buffer that helped protect them from falling into a negative-rumination trap.

Information and communication technologies such as smartphones and tablets enable employees to work anywhere and anytime, thereby blurring work and nonwork boundaries. But that convenience comes at the expense of increased stress and mental health woes for workers unless they have control over the boundaries between work and nonwork life, said YoungAh Park, a professor of labor and employment relations at Illinois.

"Most people simply can't work without a smartphone, tablet or laptop computer," she said. "These technologies are so ubiquitous and convenient that it can lead some people to think that employees have to be always on or always available. Clearly, this kind of after-hours intrusion into the home or personal life domain is unhealthy, and our research shows that an always-on mentality has a big downside in the form of increased job stress."

In the study, Park and co-authors surveyed more than 500 full-time public school teachers in grades K-6 to measure their off-the-clock work intrusion via technologies on a weekly basis for five consecutive weeks.

"We asked about their weekly work intrusion involving technology, specifically their after-hours work - whether they were expected to respond to work-related messages and emails immediately, and whether they were contacted about work-related issues after hours," she said.

The researchers found that teachers' adoption of technological boundary tactics such as keeping work email alerts turned off on smartphones was related to lower perceptions of the weekly work intrusion.

The study builds on recent scholarship on how coping with off-hours occupational demands is becoming an increasingly important issue for workers, said Yihao Liu, a professor of labor and employment relations at Illinois and a co-author of the study.

"Managing your work-life balance through boundary control is not only helpful for you and your family, it also could be a benefit for your co-workers, because they also have to potentially read and respond to the back-and-forth messages that people are sending after the workday is done," he said. "Setting a good boundary between work and regular life is going to help more people and more stakeholders. Overall, it's critical that individuals manage their work-life boundaries for their own health and well-being, but also for their own productivity and their colleagues' productivity."

Moreover, the researchers found that teachers' boundary control softened the work intrusion-negative rumination link and that this boundary control was an important mechanism by which two "border keepers" - principals, who effectively functioned as supervisors in the study; and parents, who could be thought of as clientele - can affect teachers' weekly stress experiences.

In other words, the weekly strain symptoms involving work intrusion can be alleviated by a supervisor who supports employees' work-life balance, Park said. Or conversely, it can be aggravated by clientele who expect employees to be always accessible and available.

"A really important point around the sense of boundary control is that stakeholders can influence employees' control," she said. "Our study suggests that school principals can play a positive role in that their support for work-life balance was associated with the teachers' greater sense of boundary control. When you have supportive leaders who model behaviors for work-life balance and work effectively with employees to creatively solve work-life conflicts, that translates into less stress for teachers through boundary control."

Although the study only included elementary school teachers in its sample, the findings about drawing clear boundaries after work ought to apply to most workers, especially now that more are working remotely due to the COVID-19 pandemic, the researchers said.

"Our initial motivation was to study teachers because we tend to assume that their work and nonwork lives are separate and distinct," Park said. "Teachers have set schedules in a physical building, along with discrete blocks of free time over the weekends. But even with this working population, we found that after-hours work intrusion via technology can be really stressful for them. So although this finding is particular to teachers, a class of employees who we tend to assume have clear work-life boundaries, it's now an issue for everyone who is electronically tethered to their work after regular hours."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Hubble sees cosmic flapping 'bat shadow'

video: In 2017, NASA's Hubble Space Telescope captured an image of a huge wing-shaped shadow cast by a fledgling star's unseen, planet-forming disk. The young star, called HBC 672, is casting the shadow across a more distant cloud in a star-forming region--like a fly wandering into the beam of a flashlight shining on a wall.

Now, after observing the shadow again, astronomers report that they see the giant shadow flapping its "wings"!

Watch on YouTube: https://www.youtube.com/watch?v=SVb0V4tin64

Download in HD: https://svs.gsfc.nasa.gov/13638

En español: https://youtu.be/5Z6P8-r0wTk

Image: 
NASA's Goddard Space Flight Center

Sometimes nicknames turn out to be closer to reality than you might imagine.

NASA's Hubble Space Telescope captured a striking image of a fledgling star's unseen, planet-forming disk casting a huge shadow across a more distant cloud in a star-forming region--like a fly wandering into the beam of a flashlight shining on a wall.

The young star is called HBC 672, and the shadow feature was nicknamed the "Bat Shadow" because it resembles a pair of wings. The nickname turned out to be surprisingly appropriate: Now, the team reports that they see the Bat Shadow flapping!

"The shadow moves. It's flapping like the wings of a bird!" described lead author Klaus Pontoppidan, an astronomer at the Space Telescope Science Institute (STScI) in Baltimore, Maryland. The phenomenon may be caused by a planet pulling on the disk and warping it. The team witnessed the flapping over 404 days.

But what created the Bat Shadow in the first place?

"You have a star that is surrounded by a disk, and the disk is not like Saturn's rings--it's not flat. It's puffed up. And so that means that if the light from the star goes straight up, it can continue straight up--it's not blocked by anything. But if it tries to go along the plane of the disk, it doesn't get out, and it casts a shadow," explained Pontoppidan.

He suggests imagining a lamp with a shade that casts a shadow on the wall. In this case, the lightbulb is the star, the lampshade is the disk, and the cloud is the wall. Based on the shadow's shape, the disk must be flared, with an angle that increases with distance--like bell-bottom pants, or a trumpet.

The disk--a circling structure of gas, dust, and rock--might be roughly saddle-shaped, with two peaks and two dips, which would explain the "flapping" of the shadow. The team speculates that a planet is embedded in the disk, with an orbit inclined to the disk's plane. This planet would be the cause of the doubly warped shape of the orbiting disk and the resulting movement in its shadow.

"If there were just a simple bump in the disk, we'd expect both sides of the shadow to tilt in opposite directions, like airplane wings during a turn," said team member Colette Salyk of Vassar College in Poughkeepsie, New York.

The shadow, extending from the star across the surrounding cloud, is so large--about 200 times the length of our solar system--that light doesn't travel instantaneously across it. In fact, the time it takes for the light to travel from the star out to the perceivable edge of the shadow is about 40 to 45 days. Pontoppidan and his team calculate that a planet warping the disk would take no fewer than 180 days to orbit its star. They estimate that this planet would be about the same distance from its star as Earth is from the Sun.
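As a rough consistency check (our own arithmetic, under the assumption that "the length of our solar system" means roughly the diameter of Pluto's orbit, about 80 astronomical units), the quoted light-travel time can be reproduced like this:

```python
# Hedged back-of-envelope check of the shadow's size. The solar-system
# length below is our assumption, not a figure from the article.
AU_PER_LIGHT_DAY = 173.14        # astronomical units light travels in one day
SOLAR_SYSTEM_LENGTH_AU = 80.0    # ~diameter of Pluto's orbit (assumed)

shadow_full_extent_au = 200 * SOLAR_SYSTEM_LENGTH_AU   # ~16,000 AU across
star_to_edge_au = shadow_full_extent_au / 2            # shadow extends both ways from the star
light_travel_days = star_to_edge_au / AU_PER_LIGHT_DAY

print(f"Star-to-edge distance: {star_to_edge_au:,.0f} AU")
print(f"Light travel time: {light_travel_days:.0f} days")  # ~46 days, close to the quoted 40-45
```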

If not a planet, an alternative explanation for the shadow motion is a lower-mass stellar companion orbiting HBC 672 outside the plane of the disk, causing HBC 672 to "wobble" relative to its shadowing disk. But Pontoppidan and his team doubt this is the case, based on the thickness of the disk. There is also no current evidence for a binary companion.

The disk is too small and too distant to be seen, even by Hubble. The star HBC 672 resides in a stellar nursery called the Serpens Nebula, about 1,400 light-years away. It is only one or two million years old, which is young in cosmic terms.

This finding was serendipitous. The first image of the Bat Shadow was taken by another team. Later, the image was slated for use in NASA's Universe of Learning, a program that creates materials and experiences to enable learners to explore the universe for themselves. The goal was to illustrate how shadows can convey information about phenomena invisible to us. However, the original team only observed the Bat Shadow in one light filter, which did not provide enough data for the color image desired by NASA's Universe of Learning.

To get the color image, Pontoppidan and his team had to observe the shadow in additional filters. When they combined the old and new images, the shadow appeared to have moved. At first, they thought the problem was in the image processing, but they quickly realized the images were properly aligned and the phenomenon was real.

Credit: 
NASA/Goddard Space Flight Center

The Lancet Child & Adolescent Health: First Europe-wide study of children confirms COVID-19 predominantly causes mild disease in children and fatalities are very rare

Children with COVID-19 generally experience a mild disease and fatalities are very rare, according to a study of 582 patients from across Europe published today in The Lancet Child & Adolescent Health journal.

The study, which included children and adolescents aged from 3 days up to 18 years old, found that although the majority were admitted to hospital (62%, 363/582), fewer than one in ten patients required treatment in intensive care (8%, 48/582).

The researchers note that their study only involved patients who had sought medical help and been tested for COVID-19, and so milder cases would not have been included. They advise against extrapolating the numbers observed in their study to the wider population. However, they say their findings should be taken into consideration when planning for demand on intensive care services as the pandemic progresses.

Dr Marc Tebruegge, lead author from the UCL Great Ormond Street Institute of Child Health in London, UK, said: "Our study provides the most comprehensive overview of COVID-19 in children and adolescents to date. We were reassured to observe that the case fatality rate in our cohort was very low and it is likely to be substantially lower still, given that many children with mild disease would not have been brought to medical attention and therefore not included in this study. Overall, the vast majority of children and young people experience only mild disease. Nevertheless, a notable number of children do develop severe disease and require intensive care support, and this should be accounted for when planning and prioritising healthcare resources as the pandemic progresses." [1]

The study was carried out over a 3.5 week period from 1st to 24th April 2020, during the initial peak of the European COVID-19 pandemic. It involved 82 specialist healthcare institutions across 25 European countries. All of the 582 patients included in the study were confirmed to be infected with the SARS-CoV-2 virus by a PCR test. Only a quarter (25%, 145/582) had pre-existing medical conditions. This contrasts with adult studies where the proportion of patients with co-morbidities is typically far higher, but likely reflects that children have fewer chronic medical problems than adults overall in the general population, the authors say.

The researchers found that the most common symptom reported was fever (65%, 379/582). Around half of the patients had signs of upper respiratory tract infection (54%, 313/582) and a quarter had evidence of pneumonia (25%, 143/582). Gastrointestinal symptoms were reported in around a quarter of the children (22%, 128/582), 40 of whom did not have any respiratory symptoms. Some 92 children, most of whom were tested due to close contact with a known COVID-19 case, had no symptoms at all (16%, 92/582).

The vast majority of patients did not require oxygen or any other support to help them breathe at any stage (87%, 507/582). Only 25 children needed mechanical ventilation (4%, 25/582), but when they did need it, that support was typically required for a prolonged period, often for a week or more (range 1-34 days).

The number of patients receiving antiviral or immunomodulatory therapies was too low to draw conclusions about the efficacy of any of the treatments used. The authors say robust clinical trial data are urgently needed to help doctors make decisions regarding the best treatment strategy for children under their care.

Dr Florian Götzinger, from Wilhelminenspital in Vienna, Austria, said: "Although COVID-19 affects children less severely than adults overall, our study shows that there are severe cases in all age groups. Those who have pre-existing health issues and children under one month of age were more likely to be admitted to intensive care. Well-designed, randomised controlled studies on antiviral and immunomodulatory drugs in children are needed to enable evidence-based decisions regarding treatment for children with severe COVID-19." [1]

Twenty-nine children were found to be infected with one or more additional respiratory viruses, such as cold or flu viruses, at the same time as SARS-CoV-2. Of these, 24% required intensive care (7/29), compared with 7% of children in whom no additional viruses were detected (41/553).

Dr Begoña Santiago-Garcia, one of the lead authors from University Hospital Gregorio Marañón in Madrid, Spain, said: "This is the first study of children with COVID-19 to include data from multiple countries and multiple centres. Of note, we found that children in whom additional viruses were detected in the respiratory tract at the same time as SARS-CoV-2 were more likely to be admitted to intensive care. This could have important implications for the upcoming winter season, when cold and flu infections will be more common." [1]

Four patients died during the study period, two of whom had pre-existing medical conditions. All of the patients who died were older than 10 years of age. However, the overwhelming majority of patients were alive when the study closed (99%, 578/582) with only 25 (4%) still experiencing symptoms or needing support for their breathing.

At the time the study was conducted, testing capacity in many European countries was lower than demand, and so many children with COVID-19 and mild symptoms would not have been tested or diagnosed. Different countries were using different criteria to screen for the SARS-CoV-2 virus. Some were screening all children admitted to hospital while others were more selective in which patients were offered a test. This lack of standardisation makes it difficult to generalise the findings to the wider population, the authors say, but the true case fatality rate in children is likely substantially lower than that observed in this study (0.69%, 4/582).

Credit: 
The Lancet

Monster black hole found in the early universe

video: Astronomers have discovered the second most distant quasar ever found, using the international Gemini Observatory and Cerro Tololo Inter-American Observatory (CTIO). It is also the first quasar to receive an indigenous Hawaiian name, Pōniuāʻena.

Image: 
International Gemini Observatory/NOIRLab/NSF/AURA/Pete Marenfeld, ESA/Hubble, NASA, M. Kornmesser. A Special Thanks to A Hua He Inoa and the 'Imiloa Astronomy Center of Hawai‘i Music: zero-project -- The Lower Dungeons (zero-project.gr).

Astronomers have discovered the second most distant quasar ever found, using the international Gemini Observatory and Cerro Tololo Inter-American Observatory (CTIO), Programs of NSF's NOIRLab. It is also the first quasar to receive an indigenous Hawaiian name, Pōniuāʻena. The quasar contains a monster black hole, twice the mass of the black hole in the only other quasar found at the same epoch, challenging the current theories of supermassive black hole formation and growth in the early Universe.

After more than a decade of searching for the first quasars, a team of astronomers used NOIRLab's Gemini Observatory and CTIO to discover the most massive quasar known in the early Universe -- detected from a time only 700 million years after the Big Bang [1]. Quasars are the most energetic objects in the Universe, powered by their supermassive black holes, and since their discovery astronomers have been keen to determine when they first appeared in our cosmic history.

Systematic searches for these objects have led to the discovery of the most distant quasar (J1342+0928) in 2018 and now the second most distant, J1007+2115 [2]. The A Hua He Inoa program named J1007+2115 Pōniuāʻena, meaning "unseen spinning source of creation, surrounded with brilliance" in the Hawaiian language [3]. The supermassive black hole powering Pōniuāʻena is 1.5 billion times more massive than our Sun.

"Pōniuāʻena is the most distant object known in the Universe hosting a black hole exceeding one billion solar masses" said Jinyi Yang, a Postdoctoral Research Associate at the Steward Observatory of the University of Arizona.

For a black hole of this size to form this early in the Universe, it would need to start as a 10,000 solar mass "seed" black hole about 100 million years after the Big Bang, rather than growing from a much smaller black hole formed by the collapse of a single star.
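The scale of that requirement can be illustrated with a back-of-envelope growth model. Under the common assumption of continuous accretion at the Eddington limit with roughly 10% radiative efficiency, a black hole's mass grows exponentially with an e-folding (Salpeter) time of about 50 million years. The sketch below uses those assumed parameters (ours, for illustration, not values from the study itself):

```python
import math

# Back-of-envelope Eddington-limited growth. Assumptions (ours, for
# illustration): continuous accretion at the Eddington limit with ~10%
# radiative efficiency, i.e. an e-folding "Salpeter" time of ~50 Myr.
T_SALPETER_MYR = 50.0

def mass_after(seed_mass_msun: float, t_start_myr: float, t_end_myr: float) -> float:
    """Black hole mass (in solar masses) after exponential Eddington growth."""
    return seed_mass_msun * math.exp((t_end_myr - t_start_myr) / T_SALPETER_MYR)

# A 10,000 solar-mass seed at ~100 Myr after the Big Bang, observed at ~700 Myr:
print(f"{mass_after(1e4, 100, 700):.1e}")  # ~1.6e9, comparable to the
                                           # measured 1.5 billion solar masses

# A stellar-mass seed (~100 solar masses) on the same clock falls far short:
print(f"{mass_after(1e2, 100, 700):.1e}")  # ~1.6e7 solar masses
```

Even this optimistic, uninterrupted growth scenario only just reaches Pōniuāʻena's measured mass from a 10,000 solar mass seed, which is why the discovery strains models in which seeds form from single collapsing stars.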

"How can the Universe produce such a massive black hole so early in its history?" wondered Xiaohui Fan, Regents' professor and associate department head of the Department of Astronomy at the University of Arizona. "This discovery presents the biggest challenge yet for the theory of black hole formation and growth in the early Universe."

Current theory suggests that at the beginning of the Universe, following the Big Bang, atoms were too distant from one another to interact and form stars and galaxies. The birth of stars and galaxies as we know them happened during the Epoch of Reionization, beginning about 400 million years after the Big Bang. The discovery of quasars like Pōniuāʻena, deep into the reionization epoch, is a big step towards understanding this process of reionization and the formation of early supermassive black holes and massive galaxies. Pōniuāʻena has placed new and important constraints on the evolution of the matter between galaxies (the intergalactic medium) during the reionization epoch.

The search for distant quasars began with the research team combing through large-area surveys such as the DECaLS imaging survey, which uses the Dark Energy Camera (DECam) on the Víctor M. Blanco 4-meter Telescope located at CTIO in Chile. The team uncovered a possible quasar in the data, and in 2019 they observed it with telescopes including the Gemini North telescope and the W. M. Keck Observatory, both on Maunakea on Hawai'i Island. Gemini's GNIRS instrument confirmed the existence of Pōniuāʻena.

"Observations with Gemini were critical for obtaining high-quality near-infrared spectra which provided us with the measurement of the black hole's astounding mass," said Feige Wang, a NASA NHFP fellow at the Steward Observatory of the University of Arizona.

In honor of its discovery from Maunakea, this quasar was given the Hawaiian name Pōniuāʻena. The name was created by thirty Hawaiian immersion school teachers during a workshop led by the A Hua He Inoa group, a Hawaiian naming program led by the 'Imiloa Astronomy Center of Hawai'i. Pōniuāʻena is the first quasar to receive an indigenous name.

"In addition to the teamwork of the telescopes of NOIRLab that made this discovery possible, it is exciting to see the collaboration of science and culture in local communities, highlighted by this new name," said Chris Davis, Program Officer at the National Science Foundation.

"I am extremely grateful to be a part of this educational experience -- it is a rare learning opportunity," said Kauʻi Kaina, a High School Hawaiian Immersion Teacher from Kahuku, Oʻahu who was involved in the naming workshop. "Today it is relevant to apply these cultural values in order to further the wellbeing of the Hawaiian language beyond ordinary contexts, such as in school, but also so that the language lives throughout the Universe."

Credit: 
Association of Universities for Research in Astronomy (AURA)

MicroCT reveals detailed head morphology of the arthropod Leanchoilia illecebrosa

image: (A) YKLP11423, 5mm-long juvenile, ventral view; (B) YKLP11422, 7mm-long juvenile, ventral view.

Image: 
Javier Ortega-Hernández and Yu Liu

The Chengjiang biota in the Yunnan Province of China is one of the most species-rich and best-preserved fossiliferous deposits of the early Cambrian (ca. 518 million years old), including numerous arthropod species. However, several Chengjiang arthropods have an unfamiliar morphology, are extremely rare, or are incompletely preserved, leaving many of these species poorly known, phylogenetically problematic, or both, and hindering their contribution towards reconstructing the evolution of this major animal group.

Javier Ortega-Hernández, Assistant Professor in the Department of Organismic and Evolutionary Biology and Curator of Invertebrate Paleontology in the Museum of Comparative Zoology at Harvard University, and Yu Liu, Professor of Paleobiology at Yunnan University, China, have collaborated for years on the study of Chengjiang arthropods and their evolutionary significance. Their latest paper in Current Biology shows with unprecedented clarity the head morphology of the species Leanchoilia illecebrosa - a member of Megacheira, a major extinct group characterized by distinctively raptorial great appendages. Ortega-Hernández and Liu's reexamination of Leanchoilia demonstrates the presence of a labrum - a flap-like structure overlying the mouth opening in most modern arthropods - and offers renewed support to the hypothesis that megacheirans are distant relatives of modern chelicerates (e.g. horseshoe crabs, scorpions and spiders).

Ortega-Hernández and Liu used microCT, a technique that uses X-rays to visualize features not easily observable on the surface of the fossils, to study the organization of the head in small juveniles of Leanchoilia illecebrosa. With microCT they were able to resolve the head in greater detail than ever before and to discover features that overturn previously held hypotheses.

"The biggest surprise came when studying structures close to the mouth," said Ortega-Hernández. "Until now, the very existence of a labrum in megacheirans, and its position relative to the mouth, have been the source of heated debate. In living arthropods the labrum is considered an important feature of the head because of its precise origin during embryonic development. The 3D data on Leanchoilia allowed us to show for the first time and with great clarity that this animal indeed had a labrum. This is a useful discovery because researchers have argued with each other whether a labrum was present or not in this and other closely related species, which has prompted very different interpretations about their evolution and affinities."

The paper is the fifth in a series of publications from the ongoing collaboration between the research groups led by Ortega-Hernández and Liu. This study, along with others in Current Biology (v29:1, 2019), BMC Evolutionary Biology (v19, 2019, and v20, 2020), and Geological Magazine (March 27, 2020), is part of a programme of study, and often restudy, of exceptional arthropod fossils from the early Cambrian (ca. 518 million years ago), using microCT to reveal details of the preserved anatomy that are completely inaccessible with conventional preparation tools.

"With microCT we can discern between the iron-rich fossils and the iron-depleted rock matrix to produce highly detailed and informative virtual models in 3D that reveal their affinities, ecology and evolutionary significance," said Ortega-Hernández. "Although each publication is a bit different and tells a distinct story for the early evolution of arthropods, they all follow the same overall goal and structure, and use similar techniques and methodology."

"We have several ongoing projects as part of this collaboration, including many new and exciting species, as well as re-descriptions of some old favorites," said Ortega-Hernández. "There are certainly a few pleasant surprises, and we expect that this collaboration will continue yielding high-quality morphological information for several years, as we have only started to scratch the surface." The ongoing project is partially funded by the Harvard China Fund.

Credit: 
Harvard University, Department of Organismic and Evolutionary Biology