
AGS COVID-19 brief offers roadmap to government action for assisted living facilities

image: Founded in 1942, the American Geriatrics Society (AGS) is a nationwide, not-for-profit society of geriatrics healthcare professionals that has--for more than 75 years--worked to improve the health, independence, and quality of life of older people. Our nearly 6,000 members include geriatricians, geriatric nurses, social workers, family practitioners, physician assistants, pharmacists, and internists. The Society provides leadership to healthcare professionals, policymakers, and the public by implementing and advocating for programs in patient care, research, professional and public education, and public policy. For more information, visit AmericanGeriatrics.org.

Image: 
(C) 2020, American Geriatrics Society

In a policy brief published today in its namesake journal (DOI: 10.1111/jgs.16510), the American Geriatrics Society (AGS) offered a roadmap to guide federal, state, and local governments addressing COVID-19 concerns in an important but oft-overlooked arena: Assisted living facilities (ALFs). The brief, which joins an earlier statement on COVID-19 care in nursing homes (DOI: 10.1111/jgs.16477), outlines recommendations based on the latest research and guidance, encompassing actions on resource needs, patient transfers, priorities for public health, and opportunities to better empower health workers on the frontlines of COVID-19 care.

"As we've already learned, outbreaks impacting older people are a foreseeable consequence of this pandemic, even with experts working as valiantly as they are," notes AGS President-Elect Annie Medina-Walpole, MD, AGSF. "We hope this brief can help policymakers, advocates, and clinicians look at but also beyond the circumstances we can control--and those we can't--to prioritize the innovation, collaboration, and compassion that can put key patients and public health first. That's a cardinal direction for planning in crisis and in calm, regardless of where we may live as we age."

Nationwide, more than 800,000 people live in more than 28,000 ALFs, which employ more than 450,000 individuals. More than 80 percent of all ALF residents are 75 years old or older, and these individuals live with increased susceptibility to the serious complications of COVID-19, including respiratory failure and death. Given the wide variety of structures and staffing among ALFs, most are not as well resourced as other settings to respond to COVID-19. Though some elements of nursing home guidance could be adopted by ALFs, many may struggle to implement best practices in the absence of more targeted recommendations.

In reviewing existing research and recommendations, the AGS suggested orienting expertise towards several focal points where tangible action can make a difference:

The AGS called for President Trump to exercise his full authority under the Defense Production Act so the U.S. can move quickly to increase production and distribution of important supplies. These include personal protective equipment (PPE) and COVID-19 tests and related laboratory equipment, but also supplies for symptom management and end-of-life care.

The AGS reinforced the importance of robust COVID-19 testing and contact tracing to prevent the further spread of the disease, not only among ALF residents but also staff. Experts' estimates of the U.S. need for testing range widely, from 750,000 tests per week to more than 22 million tests per day. Contact tracing--the aggressive and resource-intensive process of tracking whom infected individuals may have seen or encountered while contagious--also will be vital as restrictions start to loosen.

The AGS reinforced the importance of carefully considering transfers between ALFs, hospitals/emergency departments, and other care settings. AGS experts noted, for example, that the first and best option for individuals who test positive for COVID-19 remains to stay at home, in quarantine, unless symptoms are so serious that care is available only in a hospital. For ALFs, decisions to transfer a symptomatic or COVID-19-positive resident should consider the person's goals of care and be guided by a clinician who can work with the individual's primary care provider to manage care in place, if possible. Residents who test positive for COVID-19 and are discharged from other settings (like hospitals or skilled nursing facilities) should not return to an ALF unless the facility can safely isolate the patient from other residents and has adequate infection control protocols and PPE for staff and community members.

AGS experts also emphasized that state, county, and local health departments must do all they can to advance infection control. This includes engaging with ALFs to limit the spread of COVID-19. This also includes providing technical assistance for screening, obtaining testing for residents and staff, providing guidance on advanced hygiene practices, and communicating about and supporting physical distancing, among a variety of important priorities.

The AGS also reinforced the importance of supporting health professionals, our nation's frontline defense for treating and preventing the spread of COVID-19. The AGS encouraged Congress to advance paid family, medical, and sick leave for the whole health workforce, for example, while also enhancing COVID-19 screening and training to protect staff availability. As we continue to learn and grow from this emergency, the AGS also urged Congress to provide educational and grant opportunities for direct care workers, who play a critical role in ALFs. In addition to being physically and emotionally demanding, jobs in aging services are complex, requiring training and experience caring for older adults. At present, these workforce needs go unrecognized in pay scales, reimbursement rates, and state regulations.

The AGS policy brief is available with free access from the Journal of the American Geriatrics Society, which also issued its own call for COVID-19 scholarship to undergo expedited review and publication. AGS updates, including work on clinical practice, public policy, and public and professional education, will be posted to the Society's COVID-19 information hub at https://www.americangeriatrics.org/covid19.

Credit: 
American Geriatrics Society

Study shows immunotherapy prior to surgery may help destroy high-risk breast cancer

image: A new study led by Dr. Lajos Pusztai of Yale Cancer Center shows immunotherapy prior to surgery may help destroy high-risk breast cancer.

Image: 
Yale Cancer Center

New Haven, Conn. -- A new study led by Yale Cancer Center (YCC) researchers shows women with high-risk HER2-negative breast cancer treated before surgery with immunotherapy, plus a PARP inhibitor with chemotherapy, have a higher rate of complete eradication of cancer from the breast and lymph nodes compared to chemotherapy alone. The findings, part of the I-SPY clinical trial, were presented today at the American Association for Cancer Research (AACR) virtual annual meeting.

"The results provide further evidence for the clinical value of immunotherapy in early stage breast cancer and suggest new avenues to use these drugs, particularly in estrogen receptor (ER)-positive/HER2-negative breast cancers," said Lajos Pusztai, M.D., Professor of Medicine (Medical Oncology) and Director of Breast Cancer Translational Research at YCC. Pusztai presented the results of the study today during a plenary session at the AACR meeting.

Physicians treat some women with HER2-negative breast cancer with chemotherapy before surgery, hoping to shrink the tumor and to guide treatment after the operation. In a subgroup of women, this pre-surgical treatment destroys any evidence of the tumor, achieving what is called "pathologic complete response" (pCR), a condition that typically heralds increased overall survival.

Investigators in the I-SPY 2 clinical trial now report that for women with HER2-negative breast cancer who are treated before surgery, the average pCR rate rises from 22% among those given standard-of-care chemotherapy to 37% in those who received the immunotherapy drug durvalumab, plus the PARP inhibitor drug olaparib, in addition to chemotherapy.

Durvalumab is a checkpoint inhibitor immunotherapy, engineered to unleash immune system T cells against tumors by blocking PD-L1, a protein that binds the PD-1 receptor on the surface of T cells. PARP inhibitor drugs such as olaparib aim to impair the ability of cancer cells to repair DNA damage caused by chemotherapy.

Overall, 73 patients in the experimental arm were given durvalumab, olaparib, and paclitaxel chemotherapy followed by doxorubicin/cyclophosphamide chemotherapy, while 229 patients in the control arm received the standard treatment of paclitaxel plus doxorubicin/cyclophosphamide. Researchers analyzed results for all HER2-negative patients, as well as for triple-negative (TNBC) and ER-positive subsets. Women with triple-negative cancer who received the combination treatment saw a pCR rate of 47%, compared with 27% for those given the standard chemotherapy. Patients with ER-positive/HER2-negative cancer in the experimental arm experienced a pCR rate of 28%, versus 14% for those in the control arm. Patients in the experimental arm, however, were also more likely to experience grade 3 serious adverse events--58% in the experimental arm compared to 41% in the control arm.

Immune-rich cancers showed higher pCR rates in all subtypes and in both treatment arms, but the investigators discovered biomarkers that potentially could identify patients who are most likely to benefit from treatment with durvalumab and olaparib. Among ER-positive/HER2-negative cancers, the MammaPrint ultra-high subset benefited the most from the combination, with a pCR rate of 64%. In TNBC, tumors with a low CD3/CD8 ratio, a high macrophage/T-cell MHC class II ratio, and high proliferation appear to have benefited preferentially from adding durvalumab and olaparib to paclitaxel.

I-SPY (Investigation of Serial Studies to Predict Your Therapeutic Response with Imaging and Molecular Analysis) 2 is a multicenter phase 2 trial to evaluate novel agents as pre-surgical therapy for breast cancer. The study is a collaboration among 20 U.S. cancer research centers, the U.S. Food and Drug Administration and the Foundation for the National Institutes of Health Cancer Biomarkers Consortium. Lead support for I-SPY 2 came from the Quantum Leap Healthcare Collaborative.

Credit: 
Yale University

Connecting the dots between heart disease, potential for worse COVID-19 outcomes

ROCHESTER, Minn. -- People with certain heart diseases may be more susceptible to worse outcomes with COVID-19, but the reason why has remained unknown. New research from Mayo Clinic indicates that in patients with one specific type of heart disease, obstructive hypertrophic cardiomyopathy (HCM), the heart increases production of the ACE2 RNA transcript and the translated ACE2 protein.

Normally, this pathological response at the cellular level might be the heart's attempt to compensate for changes caused by disease. Unfortunately, SARS-CoV-2, the virus that causes COVID-19, hijacks these ACE2 receptors on the membrane of cells and uses them to get inside the cells. The virus not only gains entry through ACE2, but it also takes this protein with it, removing a protective signaling pathway that normally counters the negative impact of the hormone angiotensin II. This hormone increases blood pressure and leads to fluid retention.

Over the course of a nearly 20-year study published in Mayo Clinic Proceedings, researchers analyzed frozen samples of heart muscle tissue from 106 patients who had surgery for obstructive hypertrophic cardiomyopathy. The control group comprised heart tissue from 39 healthy donor hearts.

"Of all the RNA transcripts in the entire human genome, our research revealed that the single most upregulated RNA transcript in the heart muscle was ACE2. In fact, we confirmed a fivefold increase in ACE2 protein levels in the heart muscle of these patients with obstructive HCM," says Michael Ackerman, M.D., Ph.D., a genetic cardiologist at Mayo Clinic. "This could connect the dots and potentially explain why patients with certain heart diseases might fare worse with COVID-19."

Dr. Ackerman is director of the Windland Smith Rice Sudden Death Genomics Laboratory at Mayo Clinic and senior author on the study. This study involved national and international investigators.

The next step is to look for elevated ACE2 levels in other conditions by analyzing available heart tissue from patients who died of hypertension and other heart diseases. Lung tissue from COVID-19 victims also could be analyzed to see if ACE2 levels are higher than in normal lung tissue.

"This discovery provides another reason for patients taking angiotensin-converting enzyme (ACE) inhibitors and angiotensin II receptor blockers to stay on their heart medications, as recommended by all major cardiac societies," says Dr. Ackerman. "Removing these medications in a patient whose heart has elevated protein levels of ACE2 could cause even more tissue damage."

Credit: 
Mayo Clinic

Study analyzes contamination in drug manufacturing plants

CAMBRIDGE, MA -- Over the past few decades, there have been a handful of incidents in which processes for making protein drugs became contaminated with viruses at manufacturing plants. These were all discovered before the drugs reached patients, but many of the incidents led to costly cleanups and, in one instance, a drug shortage.

A new study from an MIT-led consortium has analyzed 18 of these incidents, most of which had not been publicly reported until now. The report offers insight into the most common sources of viral contamination and makes several recommendations to help companies avoid such incidents in the future.

While the study focused on biopharmaceuticals (protein drugs produced by living cells), the findings could also help biotech companies to create safety guidelines for the manufacture of new gene therapies and cell-based therapies, many of which are now in development and could face similar contamination risks.

"As the biotech industry starts to think about manufacturing these really exciting new products, which are highly effective and even in some cases curative, we want to make sure that the viral safety aspects of manufacturing them are considered," says Stacy Springs, senior director of programs for MIT's Center for Biomedical Innovation (CBI).

Springs is the senior author of the study, which appears today in Nature Biotechnology. Paul Barone, co-director of the CBI's Biomanufacturing Program and director of the Consortium on Adventitious Agent Contamination in Biomanufacturing (CAACB), is the lead author. The other authors from CBI are Michael Wiebe and James Leung.

Sharing information

Many therapeutic proteins are produced using recombinant DNA technology, which allows bacterial, yeast, or mammalian cells to be engineered to produce a desired protein. While this practice has a strong safety record, there is a risk that the cultured mammalian cells can be infected with viruses. The CAACB, which performed the study, was launched in 2010 following a well-publicized contamination incident at a Genzyme manufacturing plant in Boston. The plant had to shut down for about 10 months when some of its production processes became infected with a virus in 2009.

When such incidents occur, drug companies aren't required to make them public unless the incident affects their ability to provide the drug. The CBI team assembled a group of 20 companies that were willing to share information on such incidents, on the condition that the data would be released anonymously.

"We thought it would be very valuable to have industry share their experience of viral contamination, since most companies have had none of these incidents if they're lucky, or maybe one or two at the most," Springs says. "All of that knowledge about how they discovered and managed the event, identified the virus and its source, disinfected and restarted the production facility, and took action to prevent a recurrence was all siloed within individual companies."

The study, which focused on protein drugs produced by mammalian cells, revealed 18 viral contamination incidents since 1985. These occurred at nine of the 20 biopharmaceutical companies that reported data. In 12 of the incidents, the infected cells were Chinese hamster ovary (CHO) cells, which are commonly used to produce protein drugs. The other incidents involved human or nonhuman primate cells.

The viruses that were found in the human and nonhuman primate cells included herpesvirus; human adenovirus, which causes the common cold; and reovirus, which can cause mild gastroenteritis. These viruses may have spread from workers at the plants, the researchers suggest.

In many cases, contamination incidents were first detected because cells were dying or didn't look healthy. In two cases, the cells looked normal but the viral contamination was detected by required safety testing. The most commonly used test takes at least two weeks to yield results, so the contaminating virus can spread further through the manufacturing process before it is detected.

Some companies also use a faster test based on polymerase chain reaction (PCR) technology, but this test has to be customized to look for specific DNA sequences, so it works best when the manufacturers know of specific viruses that are most likely to be found in their manufacturing processes.

New technology

Many of the CAACB member companies are exploring new technologies to inactivate or remove viruses from cell culture media before use, and from products during purification. Additionally, companies are developing rapid virus detection systems that are both sensitive and able to detect a broad spectrum of viruses.

CBI researchers are also working on several technologies that could enable more rapid tests for viral contamination. Much of this research is taking place within a new interdisciplinary research group at the Singapore-MIT Alliance for Science and Technology (SMART), called the Critical Analytics for Manufacturing Personalized Medicines. Led by Krystyn Van Vliet, MIT associate provost and a professor of biological engineering and materials science and engineering, this group, which includes several other MIT faculty members from across departments, is working on about half a dozen technologies to more rapidly detect viruses and other microbes.

"I think that there's a lot of potential for technology development to ameliorate some of the challenges we see," Barone says.

Another strategy that the report recommends, and that some companies are already using, is to reduce or eliminate the use of cell growth medium components that are derived from animal products such as bovine serum. When that isn't possible, another strategy is to perform virus removal or inactivation processes on media before use, which can prevent viruses from entering and contaminating manufacturing processes. Some companies are using a pasteurization-like process called high temperature short time (HTST) treatment, while others use ultraviolet light or nanofiltration.

The researchers hope that their study will also help guide manufacturers of new gene- and cell-therapy products. These therapies, which make use of genes or cells to either replace defective cells or produce a therapeutic molecule within the body, could face similar safety challenges as biopharmaceuticals, the researchers say, as they are often grown in media containing bovine serum or human serum.

"Having done this sharing of information in a systematic way, I think we can accelerate the dissemination of information on best practices, not only within the protein manufacturing industry but also the new industry of cell-based modalities," says James Leung.

Credit: 
Massachusetts Institute of Technology

New findings suggest laws of nature not as constant as previously thought

Those looking forward to a day when science's Grand Unifying Theory of Everything could be worn on a t-shirt may have to wait a little longer as astrophysicists continue to find hints that one of the cosmological constants is not so constant after all.

In a paper published in the prestigious journal Science Advances, scientists from UNSW Sydney reported that four new measurements of light emitted from a quasar 13 billion light years away reaffirm past studies that have measured tiny variations in the fine structure constant.

UNSW Science's Professor John Webb says the fine structure constant is a measure of electromagnetism - one of the four fundamental forces in nature (the others are gravity, weak nuclear force and strong nuclear force).

"The fine structure constant is the quantity that physicists use as a measure of the strength of the electromagnetic force," Professor Webb says.

"It's a dimensionless number and it involves the speed of light, something called Planck's constant and the electron charge, and it's a ratio of those things. And it's the number that physicists use to measure the strength of the electromagnetic force."

The electromagnetic force keeps electrons whizzing around a nucleus in every atom of the universe - without it, all matter would fly apart. Up until recently, it was believed to be an unchanging force throughout time and space. But over the last two decades, Professor Webb has noticed anomalies in the fine structure constant whereby electromagnetic force measured in one particular direction of the universe seems ever so slightly different.

"We found a hint that that number of the fine structure constant was different in certain regions of the universe. Not just as a function of time, but actually also in direction in the universe, which is really quite odd if it's correct...but that's what we found."

LOOKING FOR CLUES

Ever the sceptic, when Professor Webb first came across these early signs of slightly weaker and stronger measurements of the electromagnetic force, he thought it could be a fault of the equipment, or of his calculations or some other error that had led to the unusual readings. It was while looking at some of the most distant quasars - massive celestial bodies emitting exceptionally high energy - at the edges of the universe that these anomalies were first observed using the world's most powerful telescopes.

"The most distant quasars that we know of are about 12 to 13 billion light years from us," Professor Webb says.

"So if you can study the light in detail from distant quasars, you're studying the properties of the universe as it was when it was in its infancy, only a billion years old. The universe then was very, very different. No galaxies existed, the early stars had formed but there was certainly not the same population of stars that we see today. And there were no planets."

He says that in the current study, the team looked at one such quasar that enabled them to probe back to when the universe was only a billion years old, something that had never been done before. The team made four measurements of the fine structure constant along the one line of sight to this quasar. Individually, the four measurements didn't provide any conclusive answer as to whether or not there were perceptible changes in the electromagnetic force. However, when combined with many other measurements between us and distant quasars made by other scientists and unrelated to this study, the differences in the fine structure constant became evident.
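
A quick way to see how individually inconclusive measurements can combine into an evident signal is an inverse-variance weighted mean, the standard tool for pooling independent estimates. The sketch below is illustrative only, with invented numbers; it is not the team's actual analysis pipeline.

# A minimal sketch: pooling hypothetical measurements of da/a (the
# fractional change in the fine structure constant). All values are
# invented for illustration; the real study combines many quasar
# absorption measurements.
import numpy as np

delta_alpha = np.array([-0.5e-5, 1.2e-5, -1.0e-5, 0.3e-5])  # assumed values
sigma = np.array([1.5e-5, 1.4e-5, 1.6e-5, 1.2e-5])          # assumed 1-sigma errors

weights = 1.0 / sigma**2
combined = np.sum(weights * delta_alpha) / np.sum(weights)
combined_err = np.sqrt(1.0 / np.sum(weights))  # shrinks as more data are pooled

print(f"combined da/a = {combined:.2e} +/- {combined_err:.2e}")

Each toy measurement alone is consistent with zero, but the pooled uncertainty falls roughly as the square root of the number of measurements, which is how a subtle variation can emerge only from the combined data set.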

A WEIRD UNIVERSE

"And it seems to be supporting this idea that there could be a directionality in the universe, which is very weird indeed," Professor Webb says.

"So the universe may not be isotropic in its laws of physics - one that is the same, statistically, in all directions. But in fact, there could be some direction or preferred direction in the universe where the laws of physics change, but not in the perpendicular direction. In other words, the universe in some sense, has a dipole structure to it.

"In one particular direction, we can look back 12 billion light years and measure electromagnetism when the universe was very young. Putting all the data together, electromagnetism seems to gradually increase the further we look, while towards the opposite direction, it gradually decreases. In other directions in the cosmos, the fine structure constant remains just that - constant. These new very distant measurements have pushed our observations further than has ever been reached before."

In other words, in what was thought to be an arbitrarily random spread of galaxies, quasars, black holes, stars, gas clouds and planets - with life flourishing in at least one tiny niche of it - the universe suddenly appears to have the equivalent of a north and a south. Professor Webb is still open to the idea that somehow these measurements made at different stages using different technologies and from different locations on Earth are actually a massive coincidence.

"This is something that is taken very seriously and is regarded, quite correctly with scepticism, even by me, even though I did the first work on it with my students. But it's something you've got to test because it's possible we do live in a weird universe."

But adding to the side of the argument that says these findings are more than just coincidence, a team in the US, working completely independently and unknown to Professor Webb's group, made observations about X-rays that seemed to align with the idea that the universe has some sort of directionality.

"I didn't know anything about this paper until it appeared in the literature," he says.

"And they're not testing the laws of physics, they're testing the properties, the X-ray properties of galaxies and clusters of galaxies and cosmological distances from Earth. They also found that the properties of the universe in this sense are not isotropic and there's a preferred direction. And lo and behold, their direction coincides with ours."

LIFE, THE UNIVERSE, AND EVERYTHING

While still wanting to see more rigorous testing of ideas that electromagnetism may fluctuate in certain areas of the universe to give it a form of directionality, Professor Webb says if these findings continue to be confirmed, they may help explain why our universe is the way it is, and why there is life in it at all.

"For a long time, it has been thought that the laws of nature appear perfectly tuned to set the conditions for life to flourish. The strength of the electromagnetic force is one of those quantities. If it were only a few per cent different to the value we measure on Earth, the chemical evolution of the universe would be completely different and life may never have got going. It raises a tantalising question: does this 'Goldilocks' situation, where fundamental physical quantities like the fine structure constant are 'just right' to favour our existence, apply throughout the entire universe?"

If there is a directionality in the universe, Professor Webb argues, and if electromagnetism is shown to be very slightly different in certain regions of the cosmos, the most fundamental concepts underpinning much of modern physics will need revision.

"Our standard model of cosmology is based on an isotropic universe, one that is the same, statistically, in all directions," he says.

"That standard model itself is built upon Einstein's theory of gravity, which itself explicitly assumes constancy of the laws of Nature. If such fundamental principles turn out to be only good approximations, the doors are open to some very exciting, new ideas in physics."

Professor Webb's team believe this is the first step towards a far larger study exploring many directions in the universe, using data coming from new instruments on the world's largest telescopes. New technologies are now emerging to provide higher quality data, and new artificial intelligence analysis methods will help to automate measurements and carry them out more rapidly and with greater precision.

Credit: 
University of New South Wales

Using cloud-precipitation relationship to estimate cloud water path of mature tropical cyclones

image: Schematic diagram of the relationship between cloud and precipitation.

Image: 
Shuang Luo

The cloud water path of mature tropical cyclones can be estimated by a notable sigmoid function of near-surface rain rate, according to Prof. Yunfei Fu, a professor at the School of Earth and Space Sciences, University of Science and Technology of China, and one of the authors of a recently published study in Advances in Atmospheric Sciences.

"It is known that clouds play a significant role in the climate system due to their ability to modify the global radiation balance and atmospheric water cycle. Furthermore, cloud development, to a certain extent, will produce precipitation, which is closely related to many aspects of human life. The cloud microphysical processes that affect precipitation are complex, and whether cloud parameters can be used as indicators for precipitation, or vice versa, requires further study," says the first author of the study, Dr. Shuang Luo. "We analyzed the potential correlation between cloud microphysical properties and precipitation, to deepen our understanding of the evolution of cloud to rain."

Luo and Fu combined time- and space-synchronized precipitation and spectral data obtained by the Precipitation Radar and the Visible and Infrared Scanner onboard the TRMM satellite, thereby overcoming the limitation of previous studies, in which precipitation properties and cloud parameters were not synchronized.

In order to investigate the relationship between cloud water and precipitation intensity in mature typhoon systems, the team obtained 25 collocated satellite overpasses of mature typhoon cases in the Northwest Pacific Ocean from 1998 to 2012 (144,515 precipitation pixels in total). The results show that the cloud water path exhibits an oblique S-type change with increasing near-surface rain rate and ultimately tends toward saturation. In addition, the cloud water path and near-surface rain rate of mature typhoon systems with different precipitation types, precipitation cloud phases, and vertical depths of precipitation can be fitted by a notable sigmoid function, which may be useful for estimating the cloud water path and parameterizing precipitation in models.
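
The article names a sigmoid (S-shaped) function but not its coefficients, so the sketch below fits a generic logistic form to stand-in data; the functional form, parameter values, and units here are assumptions for illustration, not the paper's published fit.

# A minimal sketch: fitting a logistic (sigmoid) relation between
# near-surface rain rate (mm/h) and cloud water path (CWP), mimicking
# the saturating behavior described in the study. The synthetic data
# and parameters are placeholders, not the published results.
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(rain_rate, cwp_max, steepness, midpoint):
    # cwp_max: saturation value; midpoint: rain rate at half saturation
    return cwp_max / (1.0 + np.exp(-steepness * (rain_rate - midpoint)))

rng = np.random.default_rng(0)
rain = np.linspace(0.1, 30.0, 50)                       # mm/h
cwp = sigmoid(rain, 2.0, 0.3, 8.0) + rng.normal(0, 0.05, rain.size)

params, _ = curve_fit(sigmoid, rain, cwp, p0=[1.0, 0.1, 5.0])
print("fitted cwp_max, steepness, midpoint:", params)

In the study itself, such fits would be made against the collocated TRMM pixels, stratified by precipitation type, cloud phase, and vertical depth of precipitation.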

"These newly derived relations certainly provide a new way to estimate the cloud water path of mature typhoon systems in the Northwest Pacific Ocean," Dr. Luo believes. "To better capture information inside the clouds, we need to obtain not only the cloud microphysical properties near the cloud tops, but also the profiles of cloud parameters inside the clouds, which is essential to analyzing the correlation between cloud and precipitation profiles." They plan to conduct further research along these lines in the future.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Scientists develop stable luminescent composite material based on perovskite nanocrystals

image: ChemNanoMat, Stable luminescent composite microspheres based on porous silica with embedded CsPbBr3 perovskite nanocrystals

Image: 
ChemNanoMat

An international team of scientists that includes researchers from ITMO University has developed a new composite material based on perovskite nanocrystals for the purpose of creating miniature light sources with improved output capacity. The introduction of perovskite nanocrystals into porous glass microparticles made it possible to increase their operating time almost threefold, and the subsequent coating of these particles with polymers increased the stability of their optical properties underwater, which is especially important for creating light sources for use in biological media. The results have been published in ChemNanoMat.

Perovskite nanocrystals are some of the most researched objects in modern materials science. They have excellent optical properties, such as the purity and brightness of emitted light, which makes them appealing for use in modern laser systems. At the same time, perovskites are unstable in air, in contact with water, and under intense illumination. This is why improving the stability of perovskite nanocrystals is one of the key tasks facing the scientific community.

An international team of scientists that includes researchers from ITMO University, the Ioffe Institute, and City University of Hong Kong studied various conditions for introducing perovskite nanocrystals into porous spheres of silicon dioxide, which can act both as protective matrices and as optical resonators for spontaneous amplification of the luminescence signal. Their research identified the optimal parameters for manufacturing a perovskite nanocrystal-based luminescent material whose emission intensity stayed at 85% of the original, significantly higher than that of the same nanocrystals without a protective matrix. Such composite materials also remained stable under intense UV radiation, which can be used as a light pumping source when designing laser systems.

"Our next step had to do with the development of a protective layer for such light-emitting microspheres with perovskite nanocrystals for the purposes of moving them into hydrous solutions," says Elena Ushakova, an associate professor at ITMO's Faculty of Photonics and Optical Information Technology. "In order to do this, we used the layer-by-layer technique of depositing alternating layers of oppositely charged materials on the microspheres' surface. The resulting luminescent spheres can be dispersed in water while retaining their optical properties, which is important from the standpoint of their further application as light sources in biological tissues."

Credit: 
ITMO University

Honey bees could help monitor fertility loss in insects due to climate change

video: New research from the University of British Columbia and North Carolina State University could help scientists track how climate change is impacting bee reproduction.

Image: 
UBC Media Relations

New research from the University of British Columbia and North Carolina State University could help scientists track how climate change is impacting the birds and the bees... of honey bees.

Heat can kill sperm cells across the animal kingdom, yet there are few ways to monitor the impact of heat on pollinators like honey bees, who are vital to ecosystems and agriculture around the world.

In a study published in Nature Sustainability, researchers used a technique called mass spectrometry to analyse sperm stored in honey bee queens and found five proteins that are activated when the queens are exposed to extreme temperatures.

The proteins could be used as a tool to monitor heat stress in queen bees, and serve as a bellwether for wider insect fertility losses due to climate change.

"Just like cholesterol levels are used to indicate the risk of heart disease in humans, these proteins could indicate whether a queen bee has experienced heat stress," said lead author Alison McAfee, a biochemist at the Michael Smith Labs at UBC and postdoc at NC State. "If we start to see patterns of heat shock emerging among bees, that's when we really need to start worrying about other insects."

Although honey bees are quite resilient compared to other non-social insects, they are a useful proxy because they are managed by humans all over the world and are easy to sample.

The researchers were particularly interested in queen bees because their reproductive capacity is directly linked to the productivity of a colony. If the sperm stored by a queen is damaged, she can "fail" when she no longer has enough live sperm to produce enough drones and worker bees to maintain a colony.

"We wanted to find out what 'safe' temperatures are for queen bees and explore two potential routes to heat exposure: during routine shipping and inside colonies," said McAfee. "This information is really important for beekeepers, who often have no way to tell what condition the queens they receive are in. That can have a really dramatic impact on their quality and quality of their colonies."

First, McAfee established the threshold for queen "failure" and how much heat queens could withstand by exposing them to a range of temperatures and durations.

"Our data suggests that temperatures between 15 to 38 degrees Celsius are safe for queens," said McAfee. "Above 38 degrees, the percentage of live sperm dropped to or below the level we see in failed queens compared to healthy queens, which is an 11.5 per cent decrease from the normal 90 per cent."

The researchers then placed temperature loggers in seven domestic queen shipments sent by ground and one sent by air. They found that one package experienced a temperature spike to 38 degrees Celsius, while another dropped to four degrees Celsius.

"These findings can help create better guidelines for safe queen bee transportation and help buyers and sellers track the quality of queens," said co-author Leonard Foster, a professor at the Michael Smith Labs at UBC.

While bee colonies are generally thought to be good at regulating the temperature inside hives, the researchers wanted to know how much the temperature actually fluctuated. They recorded the temperatures in three hives in August in El Centro, California, when the ambient temperature in the shade below each hive reached up to 45 degrees Celsius.

They found that in all three hives, the temperatures at the two outermost frames spiked upwards of 40 degrees Celsius for two to five hours, while in two of the hives, temperatures exceeded 38 degrees Celsius one or two frames closer to the core.

"This tells us that a colony's ability to thermoregulate begins to break down in extreme heat, and queens can be vulnerable to heat stress even inside the hive," said co-author Jeff Pettis, an independent research consultant and former USDA-ARS scientist.

Having established these key parameters, the researchers will continue to refine the use of the protein signature to monitor heat stress among queen bees.

"Proteins can change quite easily, so we want to figure out how long these signatures last and how that might affect our ability to detect these heat stress events," said McAfee. "I also want to figure out if we can identify similar markers for cold and pesticide exposure, so we can make more evidence-based management decisions. If we can use the same markers as part of a wider biomonitoring program, then that's twice as useful."

Credit: 
University of British Columbia

New metasurface laser produces world's first super-chiral light

image: An artistic impression of the metasurface laser to produce super-chiral twisted light with OAM up to 100.

Image: 
Wits University

Researchers have demonstrated the world's first metasurface laser that produces "super-chiral light": light with ultra-high angular momentum. The light from this laser can be used as a type of "optical spanner" to twist objects or for encoding information in optical communications.

"Because light can carry angular momentum, it means that this can be transferred to matter. The more angular momentum light carries, the more it can transfer. So you can think of light as an 'optical spanner'," Professor Andrew Forbes from the School of Physics at the University of the Witwatersrand (Wits) in Johannesburg, South Africa, who led the research. "Instead of using a physical spanner to twist things (like screwing nuts), you can now shine light on the nut and it will tighten itself."

The new laser produces high-purity "twisted light" not observed from lasers before, including the highest angular momentum reported from a laser. Simultaneously, the researchers developed a nano-structured metasurface that has the largest phase gradient ever produced and allows for high-power operation in a compact design. The implication is a world-first laser for producing exotic states of twisted structured light, on demand.

Nature Photonics today published online the research that was done as a collaboration between Wits and the Council for Scientific and Industrial Research (CSIR) in South Africa, Harvard University (USA), the National University of Singapore (Singapore), Vrije Universiteit Brussel (Belgium) and CNST - Fondazione Istituto Italiano di Tecnologia Via Giovanni Pascoli (Italy).

In their paper titled: High-purity orbital angular momentum states from a visible metasurface laser, the researchers demonstrate a new laser to produce any desired chiral state of light, with full control over both angular momentum (AM) components of light, the spin (polarisation) and orbital angular momentum (OAM) of light.

The laser design is made possible by the complete control offered by the new nanometer-sized (1000 times smaller than the width of a human hair) metasurface - designed by the Harvard group - within the laser. The metasurface is made up of many tiny rods of nanomaterial, which alter the light as it passes through. The light passes through the metasurface many times, receiving a new twist every time it does so.

"What makes it special is that to the light, the material has properties impossible to find in Nature, and so is called a "metamaterial" - a make-believe material. Because the structures were so small they appear only on the surface to make a metasurface."

The result is the generation of new forms of chiral light not observed from lasers until now, and complete control of light's chirality at the source, closing an open challenge.

"There is a strong drive at the moment to try and control chiral matter with twisted light, and for this to work you need light with a very high twist: super-chiral light," says Forbes. Various industries and research fields require super-chiral light to improve their processes, including the food, computer and biomedical industries.

"We can use this type of light to drive gears optically where physical mechanical systems would not work, such as in micro-fluidic systems to drive flow," says Forbes. "Using this example, the goal is to perform medicine on a chip rather than in a large lab, and is popularly called Lab-on-a-Chip. Because everything is small, light is used for the control: to move things around and sort things, such as good and bad cells. Twisted light is used to drive micro-gears to get the flow going, and to mimic centrifuges with light."

The chiral challenge

"Chirality" is a term often used in chemistry to describe compounds that are found as mirror images of one another. These compounds have a "handedness" and can be thought of as either left- or right-handed. For example, lemon and orange flavours are the same chemical compound, but only differ in their "handedness".

Light is also chiral but has two forms of angular momentum: the spin (polarization) and the OAM. Spin AM is similar to planets spinning around their own axes, while OAM is similar to planets orbiting the Sun.
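
In quantum terms (a standard textbook relation, not stated in the article), each photon carries angular momentum along the beam axis of

J_z = (\ell + s)\hbar

where s = \pm 1 is the spin from left- or right-circular polarisation and the integer \ell is the OAM index set by the twisted phase front. "OAM up to 100" thus means each photon carries up to 100\hbar of orbital angular momentum, far more than the single \hbar available from polarisation alone.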

"Controlling light's chirality at the source is a challenging task and highly topical because of the many applications that require it, from optical control of chiral matter, to metrology, to communications," says Forbes. "Complete chiral control implies control of the full angular momentum of light, polarisation and OAM."

Because of design restrictions and implementation impediments, only a very small subset of chiral states has been produced to date. Ingenious schemes have been devised to control the helicity (the combination of spin and linear motion) of OAM beams but they too remain restricted to this symmetric set of modes. It was not possible to write down some desired chiral state of light and have a laser produce it, until now.

Metasurface laser

The laser used a metasurface to imbue light with ultra-high angular momentum, giving it an unprecedented "twist" in its phase while also controlling the polarisation. Through arbitrary angular momentum control, the standard spin-orbit symmetry could be broken, making this the first laser to produce full angular momentum control of light at the source.

The metasurface was built from carefully crafted nanostructures to produce the desired effect, and is the most extreme OAM structure so far fabricated, with the highest phase gradient yet reported. The nanometre resolution of the metasurface made possible a high-quality vortex with low loss and a high damage threshold, making the laser possible.

The result was a laser that could lase on OAM states of 10 and 100 simultaneously, the highest reported AM from a laser to date. In the special case where the metasurface is set to produce symmetric states, the laser produces all prior OAM states reported from custom structured-light lasers.

Going forward

"What we find particularly exciting is that our approach lends itself to many laser architectures. For instance, we could increase the gain volume and metasurface size to produce a bulk laser for high-power, or we could shrink the system down onto a chip using a monolithic metasurface design," says Forbes.

"In both cases the lasing mode would be controlled by the pump's polarisation, requiring no intra-cavity elements other than the metasurface itself. Our work represents an important step towards merging the research in bulk lasers with that of on-chip devices."

Credit: 
University of the Witwatersrand

A new explanation for the origins of human fatherhood

Chestnut Hill, Mass. (4/27/2020) - Humans differ from other primates in the types and amounts of care that males provide for their offspring. The precise timing of the emergence of human "fatherhood" is unknown, but a new theory proposes that it emerged from a need for partnership in response to changing ecological conditions, U.S. and French researchers report today in the Proceedings of the National Academy of Sciences.

The new theory was developed using the tools of economics and knowledge of the economic and reproductive behavior of human foragers. The theory focuses on the benefits of a "fit" between exclusive partners that enabled the strengths of males and females to provide for one another and their offspring, according to researchers from Boston College, Chapman University, the University of New Mexico, and the University of Toulouse in France.

Scientists have long tried to explain how human fatherhood emerged. Paternal care - those investments in offspring made by a biological father - is rare among mammals but widespread across modern human subsistence societies. Much of men's parental investment consists of provisioning relatively helpless children with food for prolonged periods of time - for as long as two decades among modern hunter-gatherers. This is a sharp break with other great apes, whose observed mating systems do not encourage paternal provisioning.

That paternal provisioning arose in humans seems remarkable and puzzling, and explanations for it have revolved around a discussion of two groups of males dubbed "Dads" and "Cads".

With promiscuous mating, a would-be Dad who provides food for a mate and their joint offspring without seeking additional mates risks being outcompeted in terms of biological fitness by a Cad, who focuses only on promiscuous mating instead of investing in offspring. Such a competitive disadvantage creates a formidable barrier for Dads to emerge when Cads abound.

An oft-invoked explanation for the evolution of paternal provisioning in humans is that ancestral females started mating preferentially with males who provided them with food, in exchange for female sexual fidelity. This explanation is insufficient for several reasons, the researchers write.

Instead, the team of anthropologists and economists argues that ecological change would have sufficed to trigger the spread of Dads, even in the face of female sexual infidelity, according to the report, "Paternal provisioning results from ecological change."

The key force in the theory of paternal provisioning is complementarities - in essence the cooperation between females and males, as well as between males. Complementarities are synergistic effects that increase per-capita benefits, which may arise from dividing labor and/or pooling resources. The path to complementarities began roughly five to eight million years ago, with a gradual drying in Africa, and a progressively greater need to rely on nutritious, diverse, spatially dispersed and relatively hard-to-obtain foods, including animal products.

In response to ecological change, ancestral hominins adapted in various ways, including efficient bipedal locomotion, dietary flexibility, and an ability to thrive in diverse environments, facilitated by tool use. Complementarities between males and females would have resulted from the nutrients that each sex specialized in acquiring: protein and fat acquired by males paired well with carbohydrates acquired by females.

Complementarities between males would have resulted from higher returns from hunting in groups instead of in isolation, and from food sharing to lower starvation risk. Dietary reliance on animal products is thus a key feature underlying these complementarities between and within sexes.

These complementarities would have led to a substantial increase in the impact of food provided by a Dad on the survival of his mate's offspring.

Using evolutionary game theory, the authors show that this impact can lead Dads to gain a fitness advantage over Cads, although Cads may still co-exist with Dads under certain conditions. If sons inherit their biological father's traits, then over time Dads will increase in number in a population. Theoretically connecting the evolution of paternal provisioning to ecological change allows the authors to make novel predictions about the paleontological and archeological record.
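
The qualitative logic of that result can be illustrated with a toy replicator-dynamics sketch. This is an illustration only, not the authors' model: all fitness numbers below are invented, and the paper's complementarities, mating dynamics, and paternity uncertainty are omitted.

# A toy replicator-dynamics sketch of the Dads-vs-Cads argument.
# Dads convert provisioning into offspring survival; Cads convert
# promiscuity into extra matings. Traits are inherited father-to-son.

def next_fraction(p, survival_gain, mating_gain):
    # p: current fraction of Dads among males.
    w_dad = 1.0 + survival_gain   # fitness of a provisioning Dad
    w_cad = 1.0 + mating_gain     # fitness of a promiscuous Cad
    return p * w_dad / (p * w_dad + (1.0 - p) * w_cad)

p = 0.01  # Dads start rare
for generation in range(200):
    # Ecological change raises the survival impact of provisioning
    # (survival_gain) above the Cads' mating advantage (mating_gain).
    p = next_fraction(p, survival_gain=0.6, mating_gain=0.5)

print(f"fraction of Dads after 200 generations: {p:.3f}")  # approaches 1.0

Once the survival benefit a Dad confers exceeds the Cads' mating advantage, Dads spread through the population even from a rare start, while a smaller benefit leaves Cads dominant, which is the barrier the ecological-change argument overcomes.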

Credit: 
Boston College

Scientists unveil how general anesthesia works

image: The large size of the calyx of Held allows scientists to visualize and manipulate the synapse. The electrodes touching the neurons can be used to evoke and record electrical signals. The frequency of electrical signals stimulated ranged from once every five seconds (0.2Hz) up to 200 times per second (200Hz). The higher the frequency, the stronger the effect of isoflurane on reducing postsynaptic action potentials, lowering the fidelity of the synapse.

Image: 
OIST

Hailed as one of the most important medical advances, the discovery of general anesthetics - compounds which induce unconsciousness, prevent control of movement and block pain - helped transform dangerous and traumatic operations into safe and routine surgery. But despite their importance, scientists still don't understand exactly how general anesthetics work.

Now, in a study published this week in the Journal of Neuroscience, researchers from the Okinawa Institute of Science and Technology Graduate University (OIST) and Nagoya University have revealed how a commonly used general anesthetic called isoflurane weakens the transmission of electrical signals between neurons, at junctions called synapses.

"Importantly, we found that isoflurane did not block the transmission of all electrical signals equally; the anesthetic had the strongest effect on higher frequency impulses that are required for functions such as cognition or movement, whilst it had minimal effect on low frequency impulses that control life-supporting functions, such as breathing," said Professor Tomoyuki Takahashi, who leads the Cellular and Molecular Synaptic Function (CMSF) Unit at OIST. "This explains how isoflurane is able to cause anesthesia, by preferentially blocking the high frequency signals."

At synapses, signals are sent by presynaptic neurons and received by postsynaptic neurons. At most synapses, communication occurs via chemical messengers - or neurotransmitters.

When an electrical nerve impulse, or action potential, arrives at the end of the presynaptic neuron, this causes synaptic vesicles - tiny membrane 'packets' that contain neurotransmitters - to fuse with the terminal membrane, releasing the neurotransmitters into the gap between neurons. When enough neurotransmitters are sensed by the postsynaptic neuron, this triggers a new action potential in the post-synaptic neuron.

The CMSF unit used rat brain slices to study a giant synapse called the calyx of Held. The scientists induced electrical signals at different frequencies and then detected the action potentials generated in the postsynaptic neuron. They found that as they increased the frequency of electrical signals, isoflurane had a stronger effect on blocking transmission.

To corroborate his unit's findings, Takahashi reached out to Dr. Takayuki Yamashita, a researcher from Nagoya University who conducted experiments on synapses, called cortico-cortical synapses, in the brains of living mice.

Yamashita found that the anesthetic affected cortico-cortical synapses in a similar way to the calyx of Held. When the mice were anesthetized using isoflurane, high frequency transmission was strongly reduced whilst there was less effect on low frequency transmission.

"These experiments both confirmed how isoflurane acts as a general anesthetic," said Takahashi. "But we wanted to understand what underlying mechanisms isoflurane targets to weaken synapses in this frequency-dependent manner."

Tracking down the targets

Through further experiments, the researchers found that isoflurane reduced the amount of neurotransmitter released, both by lowering the probability of vesicle release and by reducing the maximum number of vesicles able to be released at a time.

The scientists therefore examined whether isoflurane affected calcium ion channels, which are key in the process of vesicle release. When action potentials arrive at the presynaptic terminal, calcium ion channels in the membrane open, allowing calcium ions to flood in. Synaptic vesicles then detect this rise in calcium, and they fuse with the membrane. The researchers found that isoflurane lowered calcium influx by blocking calcium ion channels, which in turn reduced the probability of vesicle release.

"However, this mechanism alone could not explain how isoflurane reduces the number of releasable vesicles, or the frequency-dependent nature of isoflurane's effect," said Takahashi.

The scientists hypothesized that isoflurane could reduce the number of releasable vesicles by either directly blocking the process of vesicle release by exocytosis, or by indirectly blocking vesicle recycling, where vesicles are reformed by endocytosis and then refilled with neurotransmitter, ready to be released again.

By electrically measuring the changes in the surface area of the presynaptic terminal membrane, which is increased by exocytosis and decreased by endocytosis, the scientists concluded that isoflurane only affected vesicle release by exocytosis, likely by blocking exocytic machinery.

"Crucially, we found that this block only had a major effect on high frequency signals, suggesting that this block on exocytic machinery is the key to isoflurane's anesthetizing effect," said Takahashi.

The scientists proposed that high frequency action potentials trigger such a massive influx of calcium into the presynaptic terminal that isoflurane cannot effectively reduce the calcium concentration. Synaptic strength is therefore weakened predominantly by the direct block of exocytic machinery rather than a reduced probability of vesicle release.

Meanwhile, low frequency impulses trigger less exocytosis, so isoflurane's block on exocytic machinery has little effect. And although isoflurane effectively reduces calcium entry into the presynaptic terminal, the resulting drop in the probability of vesicle release is not, by itself, powerful enough to block postsynaptic action potentials at the calyx of Held, and has only a minor effect in cortico-cortical synapses. Low frequency transmission is therefore maintained.

Overall, the series of experiments provides compelling evidence for how isoflurane weakens synapses to induce anesthesia.

"Now that we have established techniques of manipulating and deciphering presynaptic mechanisms, we are ready to apply these techniques to tougher questions, such as presynaptic mechanisms underlying symptoms of neurodegenerative diseases," said Takahashi. "That will be our next challenge."

Credit: 
Okinawa Institute of Science and Technology (OIST) Graduate University

Coupled magnetic materials show interesting properties for quantum applications

image: Researchers at Argonne have found a new platform for coherent information transduction with magnons in an exchange-coupled magnetic thin film bilayer. The results show new insights in both fundamental physics and device potentials for spintronics and quantum applications.

Image: 
Argonne National Laboratory

Like fans that blow in sync, certain magnetic materials can exhibit interesting energetic properties.

In order to find new ways to transmit and process information, scientists have begun to explore the behavior of electronic and magnetic spins, specifically their resonant excitations, as information carriers. In some cases, researchers have identified new phenomena that could eventually inform the creation of new devices for spintronic and quantum applications.

In a new study led by the U.S. Department of Energy’s (DOE) Argonne National Laboratory, researchers have uncovered a novel way in which the excitations of magnetic spins in two different thin films can be strongly coupled to each other through their common interface. This dynamic coupling represents one kind of hybrid system that is attracting increasing attention from scientists interested in quantum information systems.

“One way to think about it is as though you have two pairs of masses attached to springs,” said Argonne postdoctoral researcher and first author Yi Li. “We know that each mass connected to a spring will oscillate periodically when it’s hit from the outside. But if we connect the two masses with a third spring, then the oscillation of one mass will also trigger the oscillation of the other mass, which can be used to exchange information between the springs. The role of the third spring here is played by the interfacial exchange coupling between the two magnetic layers.”

With some smart engineering, researchers can set the free oscillation frequencies of the two layers of magnetic spins — the “masses” — to be identical, which is where they couple most readily. In addition, the researchers showed that the two systems can be “strongly” coupled, a state that is important for maintaining coherence and may inspire applications in quantum information.
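
Li's mass-spring picture maps directly onto a standard two-resonator model: sweeping one oscillator's frequency through the other's produces an avoided crossing whose minimum splitting is set by the coupling strength. A minimal sketch follows; the GHz scale is typical of ferromagnetic resonance, but all values are assumed rather than taken from the study.

```python
# Two coupled resonators: hybridized mode frequencies and avoided crossing.
import numpy as np

def normal_modes(f1, f2, g):
    """Eigenfrequencies of two resonators with coupling g (illustrative)."""
    h = np.array([[f1, g],
                  [g, f2]])
    return np.linalg.eigvalsh(h)   # returned in ascending order

g = 0.1  # coupling strength in GHz (assumed)
for f1 in (4.8, 5.0, 5.2):         # sweep layer 1 through layer 2's 5.0 GHz
    lo, hi = normal_modes(f1, 5.0, g)
    print(f"f1={f1:.1f} GHz -> modes at {lo:.3f} and {hi:.3f} GHz "
          f"(splitting {hi - lo:.3f})")
```

In this picture, "strong" coupling means the on-resonance splitting 2g exceeds the linewidths of the individual resonances, so the hybridized modes stay spectrally resolved and the two systems can exchange energy coherently.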

Beyond the strong-coupling state itself, the researchers found an additional effect in the magnetic bilayer that influences the coherence of its excitations: one layer can pump energy, in the form of a spin current, into the other. This exchange of energy between the two layers is one of the most notable and intriguing behaviors of the new dynamic coupling.

According to University of Illinois materials scientist and study author Axel Hoffmann, each layer has a particular length of time over which the magnetization dynamics will usually independently persist. However, with the introduction of the spin current pushing spins in a particular direction, there can be enough energy transferred so that the magnetization dynamics last substantially longer in one of the layers.

“We knew that a rigid kind of coupling existed, but the fact is that the other dynamic coupling is also important — and important enough so that we can’t neglect it,” Hoffmann said. “For quantum information systems, the name of the game is to take some excitation and to manipulate it in some way or transfer it to another excitation, and that’s pretty much at the heart of what we’re doing here.”

“There is an intrinsic magnetic interaction that couples these two layers,” Li added. “We can apply a magnetic field, and then we can determine whether these two layers are pumping in phase or out of phase. Such controlled interactions are in principle what people are doing for quantum information processing.”

According to Hoffmann, the experiment started with the identification of two magnetic systems that the researchers knew were coupled together. By making the coupling as strong as possible relative to the individual excitations in the material, the researchers were able to see in detail how the spin-pumping energy transfer came about.

Credit: 
DOE/Argonne National Laboratory

European countries face a costly 23% increase in fragility fractures by 2030

image: This infographic shows the clinical, societal, and cost burden associated with fragility fractures in six European countries: France, Germany, Italy, Spain, Sweden and the UK. It also shows how effective management of fracture patients, including through fracture liaison services, could improve outcomes and reduce costs.

Image: 
International Osteoporosis Foundation (IOF)

A new study provides an overview and comparison of the burden and management of fragility fractures due to osteoporosis in the five largest countries in Europe (France, Germany, Italy, Spain, and the United Kingdom) as well as Sweden. The publication 'Fragility fractures in Europe: burden, management and opportunities' has been authored by an International Osteoporosis Foundation (IOF) steering committee in cooperation with experts from national societies.

Osteoporosis is a chronic condition in which bone mass and strength decrease causing an increased risk of fractures. Fragility fractures are a major cause of disability and early death in older adults, with one in three women and one in five men aged fifty and above sustaining a fracture in their remaining lifetime.

The authors find that in 2017 an estimated 2.7 million fragility fractures in the six countries resulted in an associated annual cost of €37.5 billion. By 2030, the number of annual fragility fractures is expected to increase by 23 per cent, to 3.3 million, with projected costs of approximately €47.4 billion.
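
As a quick sanity check on the quoted projections (a back-of-envelope computation, not taken from the paper):

```python
# Back-of-envelope check of the fracture and cost projections quoted above.
fractures_2017 = 2.7e6
fractures_2030 = fractures_2017 * 1.23        # +23% by 2030
cost_2017, cost_2030 = 37.5e9, 47.4e9         # EUR

print(f"{fractures_2030 / 1e6:.1f} million fractures by 2030")   # ~3.3 million
print(f"cost growth: {cost_2030 / cost_2017 - 1:.0%}")           # ~26%
print(f"implied average cost: EUR {cost_2017 / fractures_2017:,.0f} per fracture")
```

Costs are thus projected to grow slightly faster than fracture counts (about 26% versus 23%), implying a modest rise in the average cost per fracture.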

The burden of fragility fractures exceeds that of many other chronic diseases. An estimated 1.02 million quality-adjusted life years (QALYs) were lost in 2017 due to fracture, and disability-adjusted life years (DALYs) are currently estimated at 21 per 1,000 individuals aged 50 years or more, higher than the estimates for stroke or chronic obstructive pulmonary disease. Fractures also cause lost productivity: sick days taken in 2017 by non-retired individuals in the six countries totalled 7.6 million.

Impairment due to fragility fractures, which includes pain, immobility and fear of falling, can make even simple daily activities such as dressing and washing difficult. As a result, the burden on informal caregivers such as family members may be significant: average annual hours of care by relatives after a hip fracture are found to be as high as 744 and 652 hours per year per 1,000 individuals in Spain and Italy, respectively. Another major and costly consequence of fragility fractures is the long-term impact on independence, which may require individuals to move into long-term care (LTC) facilities. The percentage of patients moving into LTC after fracture varies from 2.1% at ages 50-60 years to 35.3% at ages 90 and above.

As well as quantifying the heavy burden of fragility fractures on patients, their families and national healthcare systems, the study also identifies a massive treatment gap in all countries, based on the percentage of eligible individuals not receiving medication. The smallest treatment gap is in the UK (64% of eligible women and 43% of eligible men untreated), and the largest is in Germany, where only 20% of eligible men and 22% of eligible women receive a pharmacological intervention to prevent future fractures.

Lead author, Professor John A. Kanis, IOF Honorary President, commented: "With timely identification and treatment of the underlying condition, fragility fractures in individuals at high risk are largely preventable. However, we have found that the percentage of eligible individuals not receiving osteoporosis medication is unacceptably high and estimated to be, on average, 73% for women and 63% for men. Of further concern, the treatment gap has seen a marked increase in the past decade, increasing by approximately 17% since 2010."

Given that a first fracture is a warning sign of future fractures, post-fracture care to treat osteoporosis is critically important and the key to preventing a cycle of recurring fractures, pain and long-term disability. Nevertheless, the proportion of fracture patients starting treatment is low: in France, Sweden and Spain, 85%, 84% and 72% of fracture patients, respectively, remained untreated one year after their fracture.

Experts have shown that post-fracture care models such as Fracture Liaison Services (FLSs) are cost-effective care delivery models with the potential to increase the number of high-risk patients being treated, improve adherence to treatment, and reduce the risk of re-fracture. However, FLS coverage is suboptimal in the six European countries. The authors estimate that if FLSs were expanded to reach all fracture patients in the six countries, 19,262 additional fractures would be avoided every year and fracture-related costs would be reduced by €285.5 million annually.
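
The FLS estimate can be cross-checked against the 2017 totals (again a back-of-envelope computation, not from the paper):

```python
# Implied saving per fracture avoided by expanded FLS coverage.
savings = 285.5e6        # EUR saved per year
avoided = 19_262         # fractures avoided per year
print(f"EUR {savings / avoided:,.0f} per avoided fracture")   # ~EUR 14,800
```

That figure is close to the roughly EUR 13,900 average cost per fracture implied by the 2017 totals, suggesting the two estimates are internally consistent.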

IOF President and co-author Professor Cyrus Cooper, stated: "In Europe, the lifetime risk of sustaining a life-threatening hip fracture is similar to the lifetime risk of a stroke. Yet as this study so clearly reveals, osteoporosis and fracture prevention remain markedly under-prioritized in health care policy across Europe. IOF calls on health authorities to take action and respond to the escalating fragility fracture crisis by prioritising care standards and funding to support the implementation of Fracture Liaison Services."

Credit: 
International Osteoporosis Foundation

The cause of the red coloration in stalagmites

image: The Red Chamber inside the Goikoetxe cave, where the study of the stalagmites was conducted.

Image: 
G.E. ADES

As a general rule, the colour red in geological and archaeological samples tends to be due to the presence of various iron oxides that dye minerals and rocks a very deep colour. However, "the studies conducted in this piece of work and published in the journal Quaternary International have shown that in the case of the stalagmites in the Goikoetxe Cave, located in the Basque Country's Urdaibai Biosphere Reserve, the red colouration is due to the presence of organic substances produced by the decomposition of the vegetation cover in the soils on top of the cave", said Virginia Martínez-Pillado, researcher in the UPV/EHU's Department of Mineralogy and Petrology and lead author of this work. "All along the Cantabrian seaboard there is a lot of precipitation and vegetation, and that is why a large vegetation cover forms on top of the cave; when this vegetation degrades, these organic substances, mainly humic and fulvic acids, are dragged into the cave by rainwater and dye the speleothems red when they are incorporated into the calcite that forms them," explained Martínez-Pillado.

The authors combined various analytical techniques to study these stalagmites of different colours, including uranium-series radiometric dating, petrography, X-ray fluorescence, spectroscopy and ultraviolet luminescence. "To conduct the analyses, two spectroscopic techniques known as Raman and FTIR were used; their purpose is to identify various types of molecules and compounds. These techniques were the ones that established the presence of organic compounds produced by the degradation of plant matter inside the stalagmites," said the lead author of the work.

In addition "we used ultraviolet light to take photographs of the stalagmites themselves, and the response of the calcite in the presence of ultraviolet radiation was found to be cyclical. So, inside the red stalagmites, the red ones only," stressed the researcher, "certain cycles have appeared in which organic substances have been incorporated to a lesser or greater degree; at the moments where there is increased organic incorporation there seems to have been a much denser vegetation. It is probably because precipitation was higher, although this is something that would still have to be explored in further depth," said Martínez-Pillado. What is clear, however, is that "the vegetation cover and soil production is much more intense at certain moments than in others throughout the mid-Holocene, between 7,000 and 5,000 years ago, and is closely connected with the climate conditions outside the cave".

These cycles, detected and linked to the striking red colour, have allowed the researchers to embark on new research designed to reconstruct the climate evolution of the Cantabrian seaboard during this period. "Until now, it was not known that this red colour could point to climate changes or could be used as a record to determine those changes," said Martínez-Pillado.

Credit: 
University of the Basque Country

Beta cells from stem cells

image: The staining shows CD177-derived beta-like cells stained with antibodies against insulin (green) and against the beta cell transcription factor MAFA (red).

Image: 
©Helmholtz Zentrum München

The loss of insulin-secreting beta cells by autoimmune destruction leads to type 1 diabetes. Clinical islet cell transplantation has the potential to cure diabetes, but donor pancreases are rare. In a new study, a group of researchers developed an improved pluripotent stem cell differentiation protocol to generate beta cells in vitro with superior glucose response and insulin secretion. This is a major step towards beta cell replacement therapy.

Human pluripotent stem cells (either human embryonic stem cells or induced pluripotent stem cells) can self-renew indefinitely and differentiate into every cell type of the human body. Hence, pluripotent stem cells are an optimal source for generating specialized cell types for cell replacement therapy, e.g. beta cells for diabetic patients. However, current in vitro beta cell differentiation protocols are very complex due to the high number of differentiation steps: the process requires almost 20 signaling proteins and small molecules to regulate the growth and differentiation of the cells, and lasts for more than four weeks. Within this multi-step process, not all cells differentiate into the target cell type; some take wrong differentiation paths. This can lead to a highly heterogeneous cell population containing beta cells that are not completely functional. A group of researchers at Helmholtz Zentrum München, the German Center for Diabetes Research (DZD), the Technical University of Munich (TUM) and Miltenyi Biotec therefore set out to improve the quality of stem cell-derived beta cells.

CD177 quality control

The researchers developed an approach to enrich the stem cell culture with highly specialized pancreas progenitors which might lead to a more targeted differentiation into beta cells. "From developmental biology we knew that pancreatic progenitors are already specified at the endoderm stage - the first step of differentiation. We needed to find out if this was true also for human pluripotent stem cell differentiation," explains Prof. Heiko Lickert, Director at the Institute of Diabetes and Regeneration Research at Helmholtz Zentrum München, Professor of Beta Cell Biology at TUM School of Medicine and member of the Research Coordination Board of the German Center for Diabetes Research (DZD).

To investigate this, the researchers looked for a way to better control the quality of the endoderm and its differentiation into specified pancreas progenitors. In cooperation with Sebastian Knöbel's group at Miltenyi Biotec, they identified a monoclonal antibody against the surface marker CD177, which labels a subpopulation of the endoderm that efficiently and homogeneously differentiates into specified pancreatic progenitors. CD177 can therefore function as a quality control. "With CD177 we can already see at an early stage whether the cells are on the right differentiation track. This can help save lots of time, effort and money," says Lickert.

Enriching the stem cell culture with CD177 at the endoderm stage increases the generation of specified pancreatic progenitors. Ultimately, this leads to more mature and more functional beta cells that respond better to glucose and show improved insulin secretion patterns.
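
The logic of marker-based enrichment is easy to see in a toy calculation. The numbers below are entirely hypothetical (the study reports no such percentages); the point is only that sorting on a predictive marker at an early stage lifts the yield of the whole culture.

```python
# Toy model of CD177-based enrichment at the endoderm stage
# (all percentages are hypothetical, for illustration only).

def progenitor_yield(frac_pos, eff_pos, eff_neg, sort_on_marker):
    """Fraction of the culture ending up as specified pancreatic progenitors."""
    if sort_on_marker:               # keep only marker-positive cells
        return eff_pos
    return frac_pos * eff_pos + (1 - frac_pos) * eff_neg

# Assumed: 40% of endoderm cells are CD177+; 80% of those differentiate
# correctly, versus 20% of CD177- cells.
unsorted = progenitor_yield(0.4, 0.8, 0.2, sort_on_marker=False)   # 0.44
enriched = progenitor_yield(0.4, 0.8, 0.2, sort_on_marker=True)    # 0.80
print(f"unsorted: {unsorted:.0%}, CD177-enriched: {enriched:.0%}")
```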

Cell replacement therapy, disease modelling and drug screening

Current beta cell differentiation protocols generate very heterogeneous cell populations that contain not only beta cells but also remaining pancreatic progenitors and cell types from other lineages. Purification based on CD177 will not only improve the homogeneity and quality of the generated beta cells but also increase their clinical safety, as residual pluripotent stem cells are removed. This is a crucial step towards the clinical translation of stem cell-derived beta cell replacement therapy for patients with type 1 diabetes.

Furthermore, because beta cells generated with the CD177 protocol are more similar to beta cells in the human body, the protocol will help establish disease-modeling systems that mimic the human pancreas. In addition, a differentiation protocol giving rise to functional beta cells is of great interest for drug-screening approaches.

Credit: 
Helmholtz Munich (Helmholtz Zentrum München Deutsches Forschungszentrum für Gesundheit und Umwelt (GmbH))