Tech

Researchers call for worldwide biosurveillance network to protect against disease

The emergence of COVID-19 is a powerful reminder of how unchecked wildlife trade can lead to the spillover of viruses between wildlife and humans. Because wholesale bans on trade can affect community livelihoods and food security, the pandemic underscores the need for widespread pathogen screening and monitoring to better understand, predict and contain outbreaks in wildlife and humans.

To date, global biosurveillance has consisted of centralized efforts led by governmental and specialized health agencies. A group of authors--including eight researchers from San Diego Zoo Global--writing in the journal Science this week offer an efficient approach that may be more resilient to fluctuations in government support and could be utilized even in remote areas.

Given the stakes for global public health, the team of scientists recommends a "decentralized" disease surveillance system, enabled by modern pathogen-detection methods, that builds in-country capacity to address challenges. Utilizing portable molecular screening that is both cost-effective and relatively easy to use, this network would take a fundamentally more proactive approach to wildlife screening, they write.

"The COVID-19 crisis has shown that the international wildlife trade is a global system in need of greater oversight," said Elizabeth Oneita Davis, Ph.D., conservation social scientist in Community Engagement at San Diego Zoo Global, who was one of the authors. "However, ill-conceived measures such as 'blanket bans' could affect millions of people and drive these activities deeper underground, further impeding our efforts to understand and reduce demand for wildlife."

The network should expand monitoring beyond human disease outbreaks to encompass a broader understanding of pathogens and evaluate their spillover risk (of spreading from wildlife to humans or vice versa), they write. To this end, surveillance focal points should include wildlife markets and farms, as well as free-ranging populations of "high-risk" wildlife.

"Since the H1N1 outbreak of 2009, which spurred governmental responses such as PREDICT to begin active virus hunting in zoonotic hotspots, genomic technologies have transformed radically," said Mrinalini Erkenswick Watsa, Ph.D., lead author and conservation geneticist on San Diego Zoo Global's Population Sustainability team. "Sequencing the genome of a virus is now feasible on miniature sequencers, directly at the point of sample collection. Today, we can more directly and powerfully survey wildlife health, identify areas of high spillover potential and contribute to minimizing those behaviors, to keep human and wildlife populations safe," she said.

Key to this approach is the creation of a pathogen database to provide early warnings of spillover potential, and assist in containment and development of therapeutic treatments.

"A decentralized approach to biosurveillance would more readily address wildlife and ecosystem health, and therefore conservation as a whole," said Steven V. Kubiski, DVM, Ph.D., a veterinary pathologist on San Diego Zoo Global's Disease Investigations team, who co-authored the perspective piece. "The ability to test multiple populations is just the beginning--a centralized location for deposition, analysis and reporting would add even more value, and could serve as an open-access resource."

The authors note that beyond endangering human health, emerging infectious diseases can imperil wildlife populations that have not evolved resistance to unfamiliar pathogens.

Additionally, the authors call for an internationally recognized standard for wildlife trade, the risks of which they call the "largest unmet challenge" for infectious disease surveillance. Despite the known risks, little monitoring takes place in wildlife markets like the one believed to be the original source of the SARS-CoV-2 virus.

"Decentralized pathogen screening in wildlife lends itself not only to early detection of pathogen spillover into humans, but helps conservation veterinarians and disease experts understand the natural host-pathogen relationship, allowing us to better conserve wildlife populations and save species," said Caroline Moore, DVM, Ph.D., Steel Endowed Pathology Fellow and veterinary toxicologist on San Diego Zoo Global's Disease Investigations team, who was among the co-authors.

"The proposed disease surveillance model will help us inventory naturally occurring pathogens in different taxa across the globe, enabling us to track future changes in viruses and ecosystem health that are relevant to both humans and wildlife populations," added Carmel Witte, Ph.D., wildlife epidemiologist on San Diego Zoo Global's Disease Investigations team.

The authors point out the value of biobanking efforts, including those of San Diego Zoo Global's Frozen Zoo®, in assisting the worldwide surveillance effort.

This decentralized system is consistent with the collaborative, holistic disease mitigation strategy of the One Health approach, used by the Centers for Disease Control and Prevention. This approach seeks to decrease the threat of disease through the conservation of nature and ecosystem function, accounting for domestic animals and all other human-related factors.

Credit: 
San Diego Zoo Wildlife Alliance

A new nanoconjugate blocks acute myeloid leukemia tumor cells without harming healthy ones

image: The nanoparticle targets only leukemic cells and therefore would reduce the severe adverse effects of current treatments. The receptor for this nanoparticle is expressed in 20 types of cancer and associated with a poor prognosis, so this drug could open a new therapeutic pathway for other tumors.

Image: 
CIBER-BBN

Acute myeloid leukemia (AML) is a heterogeneous disease whose usual treatment is very aggressive and produces severe side effects in patients. In the search for a new and more effective drug, researchers from the CIBER for Bioengineering, Biomaterials and Nanomedicine (CIBER-BBN) have demonstrated the efficacy of a new nanoconjugate, designed in house, that blocks the dissemination of leukemic cells in animal models of acute myeloid leukemia.

The results of this research--with the participation of the CIBER-BBN groups at the Institut de Recerca de l'Hospital de la Santa Creu i Sant Pau (Ramón Mangues, Isolda Casanova and Víctor Pallarès), the Institut de Biotecnologia i Biomedicina (IBB) of the Universitat Autònoma de Barcelona (Antonio Villaverde and Esther Vázquez) and the Josep Carreras Leukaemia Research Institute--have been published in the Journal of Hematology & Oncology. Most of the experimental work was performed on the nanotoxicology and protein production platforms of the ICTS "NANBIOSIS" of CIBER-BBN.

The researchers have developed a nanomedicine that is specifically targeted to the tumor cells without damaging normal cells. This new protein nanoparticle is bound to a toxin, named Auristatin, which is between 10 and 100 times more potent than the drugs typically used in the clinic. According to CIBER-BBN group leader Ramón Mangues, "we have designed a nanoconjugate that is targeted only to the cells that express in their membrane a receptor called CXCR4, which is overexpressed in leukemic cells. Thus, this nanoparticle can only enter and deliver the toxin into the cells that express this receptor".

CXCR4 is overexpressed in a large proportion of leukemic cells in patients with poor-prognosis or refractory disease, so the nanoconjugate could have a major clinical impact on these AML patients. It is also worth pointing out that the CXCR4 receptor is overexpressed in more than 20 different cancer types, and its expression is associated with a poor prognosis. Therefore, this nanodrug could be evaluated in the near future as a possible treatment in other highly prevalent tumor types.

Blocks the spread of leukemic cells in mice without toxicity

The research team has demonstrated that the nanoconjugate internalizes into leukemic cells through the CXCR4 receptor and kills them. In addition, they have demonstrated the capacity of this nanoparticle to block the dissemination of leukemic cells in a mouse model without associated toxicity or adverse effects. Thanks to its targeting of leukemic cells, it could help AML patients who cannot be treated with current drugs because of their high toxicity, such as elderly patients or patients with other unfavorable characteristics that exclude conventional treatment.

Ramón Mangues explains that "the novel nanoparticle could be used to treat patients who have developed resistance to drugs or who have relapsed, since their leukemic cells would likely have high expression of the CXCR4 receptor. Hence, there is a wide range of patients who could benefit from this new treatment, which could have a major clinical impact if its effectiveness were confirmed in further clinical trials".

The intellectual property of this nanomedicine has been licensed to the biotech SME Nanoligent, whose aim is to continue securing public and private funding to complete the preclinical development needed to enter clinical trials in acute myeloid leukemia, before the drug is tested in other cancer types.

Credit: 
Universitat Autonoma de Barcelona

Researcher reconstructs skull of two-million-year-old giant dormouse

image: Artist's impression of the giant dormouse (left) and its nearest living relative the garden dormouse (right).

Image: 
James Sadler, University of York.

A PhD student has produced the first digital reconstruction of the skull of a gigantic dormouse, which roamed the island of Sicily around two million years ago.

In a new study, the student from Hull York Medical School has digitally pieced together fossilised fragments from five giant dormouse skulls to reconstruct the first known complete skull of the species.

The researchers estimate that the enormous long-extinct rodent was roughly the size of a cat, making it the largest species of dormouse ever identified.

The digitally reconstructed skull is 10 cm long - the length of the entire body and tail of many types of modern dormouse.

PhD student Jesse Hennekam said: "Having only a few fossilised pieces of broken skulls available made it difficult to study this fascinating animal accurately. This new reconstruction gives us a much better understanding of what the giant dormouse may have looked like and how it may have lived."

The enormous prehistoric dormouse is an example of island gigantism - a biological phenomenon in which the body size of an animal isolated on an island increases dramatically.

The palaeontological record shows that many weird and wonderful creatures once roamed the Italian islands. Alongside the giant dormouse, Sicily was also home to giant swans, giant owls and dwarf elephants.

Jesse's PhD supervisor, Dr Philip Cox from the Department of Archaeology at the University of York and Hull York Medical School, said: "While island dwarfism is relatively well understood, as animals on an island with limited resources may need to shrink to survive, the causes of gigantism are less obvious.

"Perhaps, with fewer terrestrial predators, larger animals are able to survive as there is less need for hiding in small spaces, or it could be a case of co-evolution with predatory birds where rodents get bigger to make them less vulnerable to being scooped up in talons."

Jesse spotted the fossilised fragments of skull during a research visit to the Palermo Museum in Italy, where a segment of rock from the floor of a small cave, discovered during the construction of a motorway in northwest Sicily in the 1970s, was on display.

"I noticed what I thought were fragments of skull from an extinct species embedded in one of the cave floor segments", Jesse said. "We arranged for the segment to be sent to Basel, Switzerland for microCT scanning and the resulting scans revealed five fragmented skulls of giant dormice present within the rock."

The reconstruction is likely to play an important role in future research directed at improving understanding of why some small animals evolve larger body sizes on islands, the researchers say.

"The reconstructed skull gives us a better sense of whether the giant dormouse would have looked similar to its normal-sized counterparts or whether its physical appearance would have been influenced by adaptations to a specific environment", Jesse explains.

"For example, if we look at the largest living rodent - the capybara - we can see that it has expanded in size on a different trajectory to other species in the same family."

Jesse is also using biomechanical modelling to understand the feeding habits of the giant dormouse.

"At that size, it is possible that it may have had a very different diet to its smaller relatives," he adds.

Credit: 
University of York

Lightning data have more uses than previously thought

image: The lightning strokes light up the night sky with various shapes.

Image: 
Zhichao Wang

Lightning is a spectacular natural phenomenon and the fascination with thunderstorms and lightning is a long-standing one. Today, lightning is known to be closely associated with the electrification and discharge of thunderstorms. Different types of thunderstorms correspond to different lightning characteristics and charge structures. But what are the characteristics of lightning in different types of thunderstorms?

To address this question, scientists characterized the lightning activity and charge structure of a supercell over North China using a lightning network, S-band Doppler radar, X-band dual-polarization radar, and ground observations, in a study recently published in Advances in Atmospheric Sciences.

A team from the Institute of Atmospheric Physics at the Chinese Academy of Sciences found that the supercell was accompanied by severe hailfall, while the lightning frequency showed markedly different characteristics before and after the hailfall.

The results showed that +CG (positive cloud-to-ground) lightning accounted for a high percentage of CG lightning, especially during the hailfall stage. The charge structure of the thunderstorm converted from an inversion type to a normal tripolar pattern.

Based on the retrieval of hydrometeor particles from X-band radar data, it was found that graupel, hailstones and ice crystals were the main charged particles in the convective region, while snow, ice crystals and graupel were the main charged particles in the stratiform region.

"We also found that lightning data can serve as an indicator for hazardous weather phenomena. The radar detection range is restricted due to the 'shelter effect' of mountains and buildings. In such cases, lightning data could fix the problem," explains Dr. Dongxia Liu, lead author of the study.

In addition, the team also suggest that lightning data could improve short-term forecasting, with the study providing a reference for the use of lightning data in numerical weather models.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Breast cancer cells can reprogram immune cells to assist in metastasis

image: A blue tumor organoid surrounded by red NK cells.

Image: 
Isaac Chan, M.D.

Natural killer (NK) cells, a type of immune cell, are known to limit metastasis by inducing the death of cancer cells. But metastases still form in patients, so there must be ways for cancer cells to escape. Using a novel cell culture method developed by lead author Isaac Chan, M.D., Ph.D., a medical oncology fellow at Johns Hopkins working in the laboratory of Andrew Ewald, Ph.D., the researchers studied the interactions between NK cells and invasive breast cancer cells in the laboratory in real time. They discovered that metastatic breast cancer cells can reprogram NK cells so that they stop killing cancer cells and, instead, assist in metastasis.

This work, published July 9 in the Journal of Cell Biology, also reports new immunotherapy strategies that reverse this reprogramming process in mouse models of breast cancer metastasis.

“Metastatic disease is the main driver of breast cancer deaths, and we need a deeper understanding of how and why it occurs,” says Chan. “Our research has identified a new strategy for cancer cells to co-opt the immune system. If we could prevent or reverse natural killer cell reprogramming in patients, it could be a new way to stop metastasis and reduce breast cancer mortality.”

“Our study showed that NK cells selectively target the cells that initiate the metastatic process and also how the cancer cells trick the immune system into helping them,” says Ewald, senior study author, co-director of the Cancer Invasion and Metastasis Program at the Johns Hopkins Kimmel Cancer Center and professor of cell biology at the Johns Hopkins University School of Medicine. “This study also highlights the power of multidisciplinary cancer research. This project brought together medical oncology, cell biology, immunology and biomedical engineering to understand metastasis. We were able to move rapidly into immunology and immunotherapy research because of an exciting collaboration with Elizabeth Jaffee.”

Elizabeth Jaffee, M.D., is deputy director of the Johns Hopkins Kimmel Cancer Center; co-director of the Skip Viragh Center for Pancreas Cancer Clinical Research and Patient Care; associate director of the Bloomberg~Kimmel Institute for Cancer Immunotherapy; and the Dana and Albert “Cubby” Broccoli Professor of Oncology.

Using molecular profiling and computational analyses developed by Joel Bader, Ph.D., professor of biomedical engineering at the Johns Hopkins Institute for Basic Biomedical Sciences and Institute of Genetic Medicine, and Hildur Knútsdóttir, a fellow in Bader’s lab, Ewald said, they were able to map every suspected molecular interaction between immune cells and cancer cells — and identify the ones that were likely regulating this communication.

“As predicted, when we blocked these inhibitory signals, the NK cells continued to be the "good guys" and kept clearing out cancer cells,” Ewald says. “We’re excited this approach could be used to prevent metastases from forming, and we’re also testing whether this same approach could be used to reactivate an immune response to an existing metastasis.”

The investigators say the process may also apply to other cancer types. Immunotherapies that target NK cells could also potentially be used together with existing immunotherapies that stimulate T cells to fight cancer.

Credit: 
Johns Hopkins Medicine

Shining light into the dark

Curtin University researchers have discovered a new way to more accurately analyse microscopic samples by essentially making them 'glow in the dark', through the use of chemically luminescent molecules.

Lead researcher Dr Yan Vogel from the School of Molecular and Life Sciences said current methods of microscopic imaging rely on fluorescence, which means a light needs to be shining on the sample while it is being analysed. While this method is effective, it also has some drawbacks.

"Most biological cells and chemicals generally do not like exposure to light because it can destroy things - similar to how certain plastics lose their colours after prolonged sun exposure, or how our skin can get sunburnt," Dr Vogel said.

"The light that shines on the samples is often too damaging for the living specimens and can be too invasive, interfering with the biochemical process and potentially limiting the study and scientists' understanding of the living organisms.

"Noting this, we set out to find a different way to analyse samples, to see if the process could successfully be completed without using any external lights shining on the sample."

The research team successfully found a way to use chemical stimuli to essentially make user-selected areas of the samples 'glow in the dark,' allowing them to be analysed without adding any potentially damaging external light.

Research co-author, Curtin University ARC Future Fellow Dr Simone Ciampi said that up until now, exciting a dye with chemical stimuli, instead of using high energy light, was not technically viable.

"Before discovering our new method, two-dimensional control of chemical energy conversion into light energy was an unmet challenge, mainly due to technical limitations," Dr Ciampi said.

"There are few tools available that allow scientists to trigger transient chemical changes at a specific microscopic site. Of the tools that are available, such as photoacids and photolabile protecting groups, direct light input or physical probes are needed to activate them, which are intrusive to the specimen.

"Our new method, however, uses only external light shining on the back of an electrode to generate localised and microscopic oxidative hot-spots on the opposite side of the electrode.

"Basically, the light shines on an opaque substrate, while the other side of the sample in contact with the specimen does not have any exposure to the external light at all. The brief light exposure activates the chemicals and makes the sample 'glow in the dark'.

"This ultimately addresses two of the major drawbacks of the fluorescence method - namely the interference of the light potentially over-exciting the samples, and the risk of damaging light-sensitive specimens."

Credit: 
Curtin University

Intimate partner violence, history of childhood abuse worsen trauma symptoms for new moms

image: With her colleagues, graduate student Patricia Cintora examined how women with infants up to 18 months old respond to intimate partner violence.

Image: 
Photo by L. Brian Stauffer

CHAMPAIGN, Ill. -- A study assessed the interaction of new and old relationship traumas among women three to 18 months after the birth of their child - one of the most challenging periods of their lives. The study found that new experiences of sexual, emotional and physical abuse at the hands of a romantic partner during this period are associated with increasing symptoms of trauma such as anxiety, depression, self-harm and sleep disorders. It also found that having experienced abuse in childhood appears to worsen the impact of current abuse on those symptoms.

Published in the Journal of Traumatic Stress, the research points to postnatal medical screenings as a potential point of intervention, giving health practitioners the opportunity to help young mothers recognize the signs of abuse and take steps to protect themselves and their children from harm.

The research suggests that recent episodes of relationship trauma can exacerbate a woman's mental health problems above and beyond symptoms tied to childhood experiences of maltreatment, the researchers said. It also indicates that interventions at this time of life may help alleviate a woman's symptoms, despite her personal history.

Studies have shown that intimate partner violence sometimes increases after parents bring a newborn into the home, said Patricia Cintora, a graduate student in the neuroscience program at the University of Illinois at Urbana-Champaign who led the research with U. of I. psychology professor Heidemarie Kaiser Laurent.

"In addition to the physical changes of pregnancy, there are a lot of emotional, social and economic changes that come along with parenthood that may cause stress or magnify prior stressors that feed into intimate partner violence," Cintora said. "Rather than focusing on specific categories of abuse that fall under the umbrella of IPV, we decided to look at the total number of experiences and severity."

The researchers followed 85 low-income women after the birth of a child. The women checked in at three, six, 12 and 18 months postpartum. They answered questions from standardized checklists designed to assess their trauma symptoms, history of childhood maltreatment and exposure to - or perpetration of - intimate partner violence.

"We found that the higher their scores for experiencing intimate partner violence, the more symptoms they reported," Cintora said. "We also saw that relative changes in their experience over time also had an important effect on their symptoms."

Women who experienced childhood maltreatment also tended to report higher levels of traumatic stress in response to recent episodes of IPV, she said.

"This work is important because it highlights both the harms of worsening postpartum relationship dynamics - even before they reach clinically recognized abuse thresholds - and the opportunity to beneficially impact women's health during this critical time," Laurent said.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Looking at linkers helps to join the dots

image: In the solvent-dominated regime, the dots are capped with long oleic acid molecules, which hamper the flow of electricity. After the transition, these are replaced by linker molecules, allowing the dots to conduct electricity efficiently. From left to right is the solvent-dominated regime, the transition regime and the linker-dominated regime.

Image: 
© 2020 Ahmad R. Kirmani

A better understanding of the science that underpins well-known techniques for developing quantum dots--tiny semiconducting nanocrystals--can help reduce the guesswork of current practices as materials scientists use them to make better solar panels and digital displays.

Just billionths of a meter across, quantum dots are routinely prepared in solution and coated or sprayed as an ink to create a thin electrically conducting film that is used to make devices. "But finding the best way to do this has been a matter of trial and error," says material scientist Ahmad R. Kirmani. Now, with colleagues at KAUST and the University of Toronto, Canada, he has revealed why certain well-known techniques can dramatically improve the film's performance.

Quantum dots absorb and emit different wavelengths of light depending on their size. This means they can be tuned to be highly efficient absorbers in solar panels, or to emit different colors for a display, just by making the crystals bigger or smaller.
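This size dependence comes from quantum confinement. In the simplest effective-mass picture, often called the Brus approximation and offered here only as an illustrative sketch, the effective band gap of a dot of radius $r$ grows as the dot shrinks:

```latex
E_g(r) \;\approx\; E_g^{\mathrm{bulk}}
+ \frac{\hbar^2 \pi^2}{2 r^2}\left(\frac{1}{m_e^*} + \frac{1}{m_h^*}\right)
- \frac{1.8\, e^2}{4 \pi \varepsilon \varepsilon_0 r}
```

Here $m_e^*$ and $m_h^*$ are the effective electron and hole masses and $\varepsilon$ is the dielectric constant of the material. The confinement term scales as $1/r^2$, so smaller dots have wider gaps and absorb and emit bluer light, which is why growth must be halted at exactly the right size.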

The dots are commonly grown from lead and sulfur in solution. Because the dots' properties depend on their size, their growth must be halted at the right point, which is done by adding special molecules to cap their growth. Engineers often use molecules of oleic acid, each with 18 carbon atoms, which attach to the crystal's surface, like hairs, blocking growth.

This creates a solution of dots suitable for coating to create a film. Yet, this film is not good at conducting electricity because the long acid molecules hamper the flow of electrons between nanocrystals. So engineers add shorter molecules. These "linkers" only have around two carbon atoms per molecule. The linkers replace the long capping molecules, increasing conductance. "The method has been used for a couple of decades, but nobody had investigated exactly what happens," says Kirmani.

To find out, Kirmani's team used a microbalance to monitor the exchange of oleic acid for linkers during the transition. They measured the spacing between the dots by scattering X-rays from them, and they also recorded the film's changing thickness, density and optical absorption characteristics.

Rather than seeing a smooth change in the film's properties, they saw a sudden jump--marking a phase transition. When roughly all the acid molecules have been displaced by linkers, the dots abruptly come close together, and the conductivity shoots up.

Kirmani hopes other teams will be inspired to investigate further, possibly by arresting the transition process somewhere midway and introducing various molecules to the dot surface to see what novel features emerge. "There is a lot of potential in taking this understanding to new paradigms for new technologies," he says.

Credit: 
King Abdullah University of Science & Technology (KAUST)

Study: More than half of US students experience summer learning losses five years in a row

Washington, July 9, 2020--In a large national study published today that followed U.S. students across five summers between grades 1 and 6, a little more than half (52 percent) experienced learning losses in all five summers. Students in this group lost an average of 39 percent of their total school-year gains during each summer. The study appeared in the American Educational Research Journal, a peer-reviewed publication of the American Educational Research Association.

"Many children in the U.S. have not physically attended a school since early March because of the Covid-19 pandemic, and some have likened the period we're in now to an unusually long summer," said study author Allison Atteberry, an assistant professor at the University of Colorado--Boulder. "Because our results highlight that achievement disparities disproportionately widen during the summer, this is deeply concerning."

"Teachers nationwide are likely wondering how different their classes will be in the coming fall," Atteberry said. "To the extent that student learning loss plays a larger-than-usual role this year, we would anticipate that teachers will encounter even greater variability in students' jumping-off points when they return in fall 2020."

For the study, Atteberry and her co-author, Andrew J. McEachin, a researcher at the RAND Corporation, a nonprofit research organization, used a database from NWEA that includes more than 200 million test scores for nearly 18 million students in 7,500 school districts across all 50 states from 2008 through 2016.

The authors found that although some students learn more than others during the school year, most are moving in the same direction--that is, making learning gains--while school is in session. The same cannot be said for summers, when more than half of students exhibit learning losses year after year.

Twice as many students exhibit five years of consecutive summer losses--as opposed to no change or gains--as one would expect by chance, according to the authors.

The pattern is so strong that even if all differences in learning rates between students during the school year could be entirely eliminated, students would still end up with very different achievement rates due to the summer period alone.

"Our results highlight that achievement disparities disproportionately widen during summer periods, and presumably the 'longer summer' brought on by Covid-19 would allow this to happen to an even greater extent," said Atteberry. "Summer learning loss is just one example of how the current crisis will likely exacerbate outcome inequality."

Among the students studied, depending on grade, the average student loses between 17 and 28 percent of school-year gains in English language arts during the following summer. In math, the average student loses between 25 and 34 percent of each school-year gain during the following summer.

However, Atteberry and McEachin focus their attention not on average patterns of summer learning loss, but rather on the dramatic variability around those means from one student to another.

"For instance, in grade 2 math, at the high end of the distribution, students accrue an additional 32 percent of their school-year gains during the following summer," said Atteberry. "At the other end of the distribution, however, students can lose nearly 90 percent of what they have gained in the preceding school year."

"This remarkable variability in summer learning rates appears to be an important contributor to widening achievement disparities during the school-age years," Atteberry said. "Because summer losses tend to accumulate for the same students over time, consecutive losses add up to a sizeable impact on where students end up in the achievement distribution."
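The compounding effect described above can be illustrated with a toy calculation. The numbers below are hypothetical except for the 39 percent average summer loss rate reported for the consecutive-loss group; the yearly gain of 10 scale-score points is an assumed round figure.

```python
# Toy model of how consecutive summer learning losses compound.
# Assumption (hypothetical): a student gains 10 scale-score points each
# school year. From the study: students with five consecutive summer
# losses gave back an average of 39% of each year's gain over the summer.

def cumulative_score(years: int, yearly_gain: float = 10.0,
                     summer_loss_rate: float = 0.39) -> float:
    """Net score after `years` school-year/summer cycles."""
    score = 0.0
    for _ in range(years):
        score += yearly_gain                      # school-year growth
        score -= summer_loss_rate * yearly_gain   # summer setback
    return score

with_loss = cumulative_score(5)                           # about 30.5 points
without_loss = cumulative_score(5, summer_loss_rate=0.0)  # 50.0 points
```

Under these assumed numbers, five consecutive summers erase 19.5 points, nearly two full school years' worth of gains, which is the "sizeable impact on where students end up" that Atteberry describes.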

Atteberry noted that more research is needed to better understand what accounts for most of the summer variation across students. Prior research, including a 2018 study published in Sociology of Education, has found that race/ethnicity and socioeconomic status predict summer learning but, together, account for only up to 4 percent of the variance in summer learning rates.

Policy leaders across the United States have experimented with different approaches, including extending the school year and running summer bridge programs, to address concerns with summer learning losses. These need to be further assessed for effectiveness, said Atteberry.

Researchers have pointed to gaps in resources such as family income, parental time availability, and parenting skill and expectations as potential drivers of outcome inequality. Many of these resource differences are likely exacerbated by summer break when, for some families, work schedules come into greater conflict with reduced childcare.

"Our results suggest that we should look beyond just schooling solutions to address out-of-school learning disparities," Atteberry said. "Many social policies other than public education touch on these crucial resource inequalities and thus could help reduce summer learning disparities."

Credit: 
American Educational Research Association

Cosmic cataclysm allows precise test of general relativity

image: The MAGIC telescope system at the Roque de los Muchachos Observatory, La Palma, Canary Islands, Spain

Image: 
Giovanni Ceribella/MAGIC Collaboration

In 2019, the MAGIC telescopes detected the first gamma-ray burst (GRB) at very high energies. This was the most intense gamma radiation ever obtained from such a cosmic object. But the GRB data have more to offer: with further analyses, the MAGIC scientists have now confirmed that the speed of light is constant in vacuum and not dependent on energy. So, like many other tests, the GRB data also corroborate Einstein's theory of general relativity. The study has now been published in Physical Review Letters.

Einstein's general relativity (GR) is a beautiful theory which explains how mass and energy interact with space-time, creating a phenomenon commonly known as gravity. GR has been tested and retested in many physical situations and over many different scales, and, postulating that the speed of light is constant, it has always predicted the experimental results outstandingly well. Nevertheless, physicists suspect that GR is not the most fundamental theory, and that there might exist an underlying quantum mechanical description of gravity, referred to as quantum gravity (QG). Some QG theories predict that the speed of light might be energy dependent. This hypothetical phenomenon is called Lorentz invariance violation (LIV). Its effects are thought to be too tiny to measure unless they accumulate over a very long time. So how to achieve that? One solution is to use signals from astronomical sources of gamma rays. Gamma-ray bursts (GRBs) are powerful and far-away cosmic explosions which emit highly variable, extremely energetic signals. They are thus excellent laboratories for experimental tests of QG. Higher-energy photons are expected to be more strongly influenced by QG effects, and GRBs supply plenty of them; because these photons travel billions of years before reaching Earth, any energy-dependent delay has time to accumulate.
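At leading order, the effect being searched for can be summarized in one line: a photon of energy E traveling a distance D arrives later by roughly (E/E_QG)·D/c. A hedged sketch follows; the numbers and the linear-order formula are illustrative assumptions, not the MAGIC analysis, which also accounts for cosmological redshift:

```python
# Hedged sketch of a linear-order LIV time delay, dt ~ (E / E_QG) * D / c.
# E_QG is assumed near the Planck energy; the distance ignores redshift.
C = 2.998e8         # speed of light, m/s
MPC = 3.086e22      # one megaparsec, m
E_QG_GEV = 1.22e19  # assumed quantum-gravity energy scale (~Planck), GeV

def liv_delay_s(photon_energy_gev, distance_mpc, e_qg_gev=E_QG_GEV):
    """Approximate extra travel time (s) of a high-energy photon
    relative to a low-energy one, at linear order in E/E_QG."""
    return (photon_energy_gev / e_qg_gev) * (distance_mpc * MPC / C)

# A 1 TeV (1000 GeV) photon from ~1000 Mpc away accrues a delay of
# several seconds for these assumed numbers, which is why sharp TeV
# light curves make such sensitive probes.
print(liv_delay_s(1000.0, 1000.0))
```

The delay grows linearly with photon energy here, which is why the TeV photons MAGIC records are so much more constraining than the GeV photons seen by satellites.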

GRBs are detected on a daily basis with satellite-borne detectors, which observe large portions of the sky, but at lower energies than ground-based telescopes like MAGIC. On January 14, 2019, the MAGIC telescope system detected the first GRB in the domain of teraelectronvolt energies (TeV, 1,000 billion times more energetic than visible light), hence recording by far the most energetic photons ever observed from such an object. Multiple analyses were performed to study the nature of this object and its very high energy radiation.

Tomislav Terzić, a researcher from the University of Rijeka, says: "No LIV study was ever performed on GRB data in the TeV energy range, simply because there was no such data up to now. For over twenty years we were anticipating that such observation could increase the sensitivity to the LIV effects, but we couldn't tell by how much until seeing the final results of our analysis. It was a very exciting period."

Naturally, the MAGIC scientists wanted to use this unique observation to hunt for effects of QG. At the very beginning, however, they faced an obstacle: the signal recorded with the MAGIC telescopes decayed monotonically with time. While this was an interesting finding for astrophysicists studying GRBs, it was not favorable for LIV testing. Daniel Kerszberg, a researcher at IFAE in Barcelona, said: "When comparing the arrival times of two gamma rays of different energies, one assumes they were emitted instantaneously from the source. However, our knowledge of processes in astronomical objects is still not precise enough to pinpoint the emission time of any given photon." Traditionally, astrophysicists rely on recognizable variations of the signal to constrain the emission time of photons. A monotonically changing signal lacks those features. So the researchers used a theoretical model which describes the expected gamma-ray emission before the MAGIC telescopes started observing. The model includes a fast rise of the flux, the peak emission and a monotonic decay like that observed by MAGIC. This provided the scientists with a handle to actually hunt for LIV.

A careful analysis then revealed no energy-dependent time delay in arrival times of gamma rays. Einstein still seems to hold the line. "This however does not mean that the MAGIC team was left empty handed", said Giacomo D'Amico, a researcher at Max Planck Institute for Physics in Munich; "we were able to set strong constraints on the QG energy scale". The limits set in this study are comparable to the best available limits obtained using GRB observations with satellite detectors or using ground-based observations of active galactic nuclei.

Cedric Perennes, postdoctoral researcher at the University of Padova, added: "We were all very happy and feel privileged to be in the position to perform the first study on Lorentz invariance violation ever on GRB data in TeV energy range, and to crack the door open for future studies!"

In contrast to previous works, this was the first such test ever performed on a GRB signal at TeV energies. With this seminal study, the MAGIC team thus set a foothold for future research and even more stringent tests of Einstein's theory in the 21st century. Oscar Blanch, spokesperson of the MAGIC collaboration, concluded: "This time, we observed a relatively nearby GRB. We hope to soon catch brighter and more distant events, which would enable even more sensitive tests."

Credit: 
Max Planck Institute for Physics

How fear transforms into anxiety

A deadly coronavirus pandemic, economic instability and civil unrest menace the mental well-being of millions. Understanding how, in vulnerable people, fear from such frightening events evolves into lifelong anxiety is critical for healing.

A University of New Mexico research team led by Elaine L. Bearer, MD, PhD, the Harvey Family Professor in Pathology, and graduate student Taylor W. Uselman has identified for the first time brain-wide neural correlates of the transition from fear to anxiety.

"Until now, psychiatrists had little information about what goes on in the brain after a fearful experience, and why some people don't easily recover and remain anxious, for even as long as the rest of their lives," Bearer says.

Life-threatening fear frequently leads to post-traumatic stress disorder (PTSD). The team's goal is to shed light on the brain's response to fear and why, in some cases, it can lead to prolonged anxiety states like PTSD.

While not feasible in human subjects, fear can be provoked in rodents by exposure to a scary smell, such as a product commonly used to protect our barbecues from mouse nesting. This smell simulates a predator odor and scares mice away.

The UNM team used this trick to witness how the brain responds to scary events and discover how brain activity evolves from a scary feeling to anxiety.

In a paper published this week in the journal NeuroImage, they report a correlation of behavior with brain activity by watching behavior and capturing magnetic resonance images before, during and after exposure to non-scary and scary smells.

They created vulnerability to anxiety by manipulating the serotonin transporter (SERT), which is the major target of psychoactive drugs, like cocaine, and antidepressants, like Prozac. Deletion of the SERT gene (SERT-KO) produces vulnerability to anxiety, and thus provides a unique model to learn how frightening experiences morph into anxiety.

The UNM researchers compared behavior and brain activity in normal versus SERT-KO to identify the neural correlates of anxiety - those regions active in anxious SERT-KOs and not in normal subjects.

To highlight active neurons, they used manganese, a non-toxic ion that lights up active neurons in magnetic resonance images. Computational analyses of these brain-wide images yielded maps of activity throughout the brain before, immediately after and long after brief exposure to the scary smell.

They identified differences in neural activity in 45 sub-regions throughout the brain. Some regions were activated by the scary smell, and some only came on later. Vulnerability to anxiety correlated with much more activity in many more regions.

The function of some of these regions, including the amygdala and hypothalamus, is at least partly understood, but others, such as the reward circuitry, were not previously known to be involved in anxiety.

In anxiety, the coordination between regions was altered. This may represent a brain-wide signature of anxiety: a dis-coordination between brain regions of the kind often experienced when we are frightened or anxious.

"We now know that brain activity in anxiety is not the same as in an acute fear response," Bearer says. "With anxiety, neural activity is elevated across many specific regions of the brain, and normal coordination between regions is lost."

What does this mean in the time of COVID? The time lag for resilient or anxious outcomes suggests that early containment of fearful responses to surges in cases, protests and economic recession may reduce the likelihood of progression to anxiety.

The involvement of serotonin also suggests pharmacologic targets that could help in reducing the likelihood of anxiety. Meditation, music, poetry, exercise and other stress-reducing activities that engage the reward circuitry will also likely help. Early interventions will have lasting benefits.

Credit: 
University of New Mexico Health Sciences Center

The restoration of forests with active Rapid 'Ohi'a Death infections may be possible

image: 'Ohi'a (Metrosideros polymorpha) seedling in an unweeded plot.

Image: 
Stephanie Yelenik, Ph.D. Research Ecologist USGS, Pacific Island Ecosystem Research Center

Hilo, Hawai'i - For the first time, researchers have shown that native 'ohi'a seedlings can survive for at least a year in areas that have active mortality from Rapid 'ohi'a Death, or ROD, a fungal disease that is devastating to this dominant and culturally important tree in Hawaiian forests. This information can be useful to land managers and homeowners as they prioritize conservation actions.

The study, published recently in Restoration Ecology, was authored by scientists from the U.S. Geological Survey and Hawai'i Cooperative Studies Unit at University of Hawai'i Hilo.

"'Ohi'a is a keystone species in Hawaiian forests, and ROD has the potential to cause major ecosystem disturbances that will negatively impact water supply, cultural traditions, natural resources and quality of life," said USGS Director Jim Reilly. "This innovative research provides a glimmer of hope for native 'ohi'a tree restoration in Hawaii by indicating that successful planting of 'ohi'a could be possible in ROD-affected forests if the native species' seedlings are protected."

The study also highlights specific best practices for maximizing seedling survival, noted Stephanie Yelenik, an ecologist with the USGS and lead author of the study.

"We found that 'ohi'a seedlings planted into a forest heavily affected by ROD have a high probability of survival for the first year," said Yelenik. "While that one-year survival of seedlings is great news, this species lives centuries and there's currently no treatment once the tree becomes infected. Because 'ohi'a grow slowly, a dead tree is a gap in the canopy for a long time, and one of Hawai'i's many quick-growing invaders can take over the gaps caused by dead trees."

Still, she added, "these results provide the first, but still early, evidence that planting 'ohi'a may be used as a restoration tool in forests that are threatened by ROD."

ROD is a newly emerging disease caused by fungal pathogens. Since the disease was first discovered, ROD has killed at least 1 million 'ohi'a -- a tree that is foundational to the Hawaiian landscape and culture. Its loss, study authors emphasized, is detrimental to habitat for endangered birds and plants, ecosystem processes and the cultural heritage of Hawaiians. There remains no treatment for infected trees, so slowing the spread of ROD remains important to protect native Hawaiian forests.

"The 'ohi'a tree is the dominant tree in Hawaiian forests, the first to colonize new lava flows and often forms the bulk of what makes up the forests," said Yelenik. "While planting 'ohi'a may not be necessary in forests where 'ohi'a seedlings naturally occur, we thought planting might be a good tool for managers who want to maintain a native 'ohi'a canopy in forests dominated by invasive plants in the understory. We wanted to test that idea."

To test how 'ohi'a seedlings would fare in areas where ROD is prevalent, researchers planted seedlings directly underneath adult 'ohi'a that had tested positive for Ceratocystis huliohia or Ceratocystis lukuohia, the fungal pathogens that cause ROD. They monitored survival in plots that were fenced to keep animals out and weeded of invasive tree and shrub seedlings; other seedlings were planted in areas with no fencing or weeding. Seedlings that died were brought back to the lab, where researchers tested them for Ceratocystis DNA and for evidence of viable fungal spores.

The results were clear: none of the dead seedlings tested positive for Ceratocystis, indicating that 'ohi'a can survive in ROD-affected forests if protected from wild pigs and goats and invasive trees and shrubs. In addition, seedlings were six times more likely to die in plots where weeds were allowed to grow than in areas where weeds were cleared around the 'ohi'a seedlings. In terms of prioritizing management, this means that controlling invasive plants and animals has a greater impact on survival in the tree's first year than does exposure to ROD.

"The current study provides clues about the epidemiology of the disease and the dynamics of ROD pathogens across the landscape," said Lisa Keith, research plant pathologist for the USDA Agricultural Research Service. "This information can help to slow the spread of the disease through effective management strategies. While surrounding disease pressure may be high, results suggest that 'ohi'a seedlings have the capacity to thrive in areas devastated by ROD."

Results of the study indicated that 'ohi'a seedlings are not infected with ROD through the soil. "Those results are encouraging for two reasons," said J.B. Friday, extension forester with the University of Hawai'i Cooperative Extension Unit. "First, it means that even in forests with invasive trees and shrubs, 'ohi'a may possibly be re-established. And second, it means that in our high-elevation, pristine native forests, natural 'ohi'a regeneration could be possible, even in forests hit by ROD, if those areas are protected."

Yelenik cautioned that longer-term studies of 'ohi'a seedling survival in ROD-affected forests are still needed, but these early results demonstrate that active planting could successfully help maintain native 'ohi'a forests. Survival of 'ohi'a seedlings in ROD-affected forests is good news, but the author noted that "protecting 'ohi'a from infection remains the primary tool in the fight against ROD."

Credit: 
U.S. Geological Survey

AI enables efficiencies in quantum information processing

image: In a robust tomography scheme with machine learning, noisy tomography measurements are fed to the convolutional neural network, which makes predictions of intermediate t-matrices as the outputs. At the end, the predicted matrices are inverted to reconstruct the pure density matrices for the given noisy measurements.

Image: 
U.S. Army image

ABERDEEN PROVING GROUND, Md. -- A new machine learning framework could pave the way for small, mobile quantum networks.

Researchers from the U.S. Army Combat Capabilities Development Command's Army Research Laboratory and Tulane University combined machine learning with quantum information science, or QIS, using photon measurements to reconstruct the quantum state of an unknown system.

QIS is a rapidly advancing field that exploits the unique properties of microscopic quantum systems, such as single particles of light or individual atoms, to achieve powerful applications in communication, computing and sensing, which are either impossible or less efficient under conventional means.

"We wanted to apply machine learning to problems in QIS, as machine learning systems are capable of making predictions based on example data sets without explicit programming for the given task," said Dr. Brian Kirby, a scientist at the Army's corporate research laboratory. "Machine learning has excelled in recent years in fields such as computer vision, where a machine learning algorithm trained on large sets of pre-classified images can then correctly classify new images that it has never seen before."

For example, banks often employ machine learning systems to read the handwriting on checks, despite the program never having seen that particular handwriting before. This image classification is similar to reconstructing quantum states from measurement data, researchers said.

"In image recognition, the machine learning algorithms try to decide if something is a car or a bike," said Tulane University researcher Dr. Sanjaya Lohani. "Machine learning systems can be just as effective at looking for particular features in measurement data that imply what sort of state it came from. In both cases, the input data can be considered as a two-dimensional array, and the ML system attempts to pick out particular features in the array."

To characterize an unknown quantum system, the research team used quantum state tomography, or QST. They prepared and measured identical copies of an unknown quantum system, then used a complicated computational process to determine the quantum state most consistent with the measurement results. As quantum technologies mature, however, researchers will need to develop alternative methods to process the classical information associated with quantum information protocols.
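For intuition, here is the classical baseline in its simplest setting: a hedged single-qubit sketch, not the team's neural network, which handles larger and noisier problems. Given ideal Pauli expectation values, linear inversion reconstructs the density matrix directly:

```python
# Single-qubit quantum state tomography by linear inversion:
#   rho = (I + <X> sigma_x + <Y> sigma_y + <Z> sigma_z) / 2
# Illustrative only; real tomography works from noisy estimates of
# <X>, <Y>, <Z> built up over many measurements of identical copies.
import numpy as np

I2 = np.eye(2, dtype=complex)
SX = np.array([[0, 1], [1, 0]], dtype=complex)
SY = np.array([[0, -1j], [1j, 0]], dtype=complex)
SZ = np.array([[1, 0], [0, -1]], dtype=complex)

def reconstruct_qubit(ex, ey, ez):
    """Density matrix consistent with ideal Pauli expectation values."""
    return 0.5 * (I2 + ex * SX + ey * SY + ez * SZ)

# Expectations (0, 0, 1) recover the pure state |0><0|.
rho = reconstruct_qubit(0.0, 0.0, 1.0)
print(np.allclose(rho, [[1, 0], [0, 0]]))  # True
```

The machine learning approach in the article replaces this kind of inversion with a network trained on example measurement sets, which is what lets it tolerate noisy or missing data.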

"This field often overlooks the classical information processing needed to operate quantum information systems," said Tulane University Professor Ryan Glasser. "As research and capabilities are now maturing to the point that real-world implementations are within sight, these are the sorts of engineering problems that we need to solve."

The researchers recently implemented a system that reconstructed quantum states as accurately as standard, more computationally intensive methods--in several cases, outperforming those methods.

"Once we realized we could match the performance of existing systems in pristine simulations, we wanted to see if we could build in resilience to common errors by training the system to expect them," said Tulane University researcher Onur Danaci.

To do this, the team simulated familiar sources of error in measurement, such as misaligned optical elements, and used these to train the machine learning system. The researchers further tested their system when measurements were not just noisy but completely missing. Notably, the team outperformed conventional state reconstruction methods in each situation while requiring fewer computational resources, Danaci said.

Because the team can front-load all its expensive computation into the training process, the actual reconstruction requires relatively modest resources.

In the future, the researchers hope to deploy these pre-trained, portable quantum systems on small field devices, such as drones or vehicles, where space for hardware is limited, Kirby said.

Credit: 
U.S. Army Research Laboratory

What happens when food first touches your tongue

COLUMBUS, Ohio - A new study might explain why humans register some tastes more quickly than others, potentially due to each flavor's molecular size.

The research, published last month in the journal PLOS Computational Biology, also offers an explanation of why humans register taste more quickly when food or drink moves over the tongue rapidly than when it is held steadily in the mouth.

The findings indicate that both the speed with which food and drink move in our mouth and the size of the molecules in the food that we consume affect our ability to taste.

"Our tongue has papillae on it that act like a sea of kelp in an ocean," said Kai Zhao, lead author of the paper and an associate professor of otolaryngology at The Ohio State University College of Medicine. "Those papillae - the small bumps that contain taste buds on the human tongue - move and sway as food or drink flow past them."

The human tongue has four kinds of papillae; three of those contain taste buds. (The fourth kind is the most numerous on the tongue, and functions primarily as a way to increase friction.)

For this study, the researchers modeled the way flavors move around the papillae in the tongue, using a range of salty and sweet stimuli. The researchers also built a computer model that simulated previous studies around taste perception.

The model considered the human tongue as a porous surface, with the spaces between the papillae acting like the holes of a sponge. Then the researchers simulated what would happen if they passed a range of salty and sweet flavors over that surface, first quickly, in one intense rush, then slowly.

They found that passing flavors over the tongue quickly caused them to penetrate the gaps between papillae sooner, which would register flavor more quickly.

And their findings could explain why taste buds were quicker to register salty compounds, which have a small molecular size, than compounds with a large molecular size, such as sweet flavors.

"Smaller molecules may diffuse quicker, and we think this could be the reason they move through the papillae gaps more quickly," Zhao said.
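Zhao's intuition matches a textbook relation: by Stokes-Einstein, a molecule's diffusion coefficient scales inversely with its radius. A hedged sketch with illustrative values; the paper's actual model is a porous-medium simulation of the tongue surface, not this formula:

```python
# Stokes-Einstein diffusivity of a sphere: D = kT / (6 * pi * eta * r).
# The radii, temperature and viscosity below are illustrative assumptions.
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def stokes_einstein_d(radius_m, temp_k=310.0, viscosity_pa_s=1.0e-3):
    """Diffusion coefficient (m^2/s) of a small sphere in a fluid."""
    return K_B * temp_k / (6.0 * math.pi * viscosity_pa_s * radius_m)

# Halving the molecular radius doubles the diffusion coefficient,
# so smaller taste molecules reach the papillae gaps sooner.
print(stokes_einstein_d(0.25e-9) / stokes_einstein_d(0.50e-9))  # 2.0
```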

This study focused on the early stages of taste - what happens before taste buds have even registered a flavor. Compared with the other senses - sight and sound, for example - taste operates on a sort of time-delay. We hear a sound almost as soon as it is emitted; it takes our taste buds a little longer to register flavor.

"That early response is changed depending on how the molecules of what we are consuming interact with the tongue's surface," Zhao said. "It is a complex process."

Prior to this study, scientists knew that if they dropped a flavored solution onto a person's tongue, the intensity of that solution's taste would increase over time. But they did not know why that happened.

Zhao said scientists assumed the increase in flavor had something to do with papillae, so for this study, his lab focused on studying the mechanics of how papillae work.

"Our taste buds are important," he said. "They help us figure out what food to eat, how much food to eat, and how to balance the body's nutritional needs with its energy needs."

Taste buds also help humans avoid poisonous substances, can help identify edible and nutritious foods, and contribute to the cravings humans feel for things like ice cream and potato chips.

Zhao said his lab decided to focus on the early stages of taste because it is connected to so many other public health issues, including nutrition and obesity.

Credit: 
Ohio State University

Study of giant ant heads using simple models may aid bio-inspired designs

image: Soldier ants have giant heads with pincers that help them defend the colony.

Image: 
Photo by Alex Wild.

Researchers use a variety of modeling approaches to study form and function. By applying a basic biomechanical model to body form and center-of-mass stability in ants, new research identifies the benefits of “simple models,” which the authors hope can be used for bio-inspired designs.

“Most organisms are constrained in their shape and size because they are juggling different needs such as the ability to fly, forage for food, and reproduce,” said Andrew Suarez, a professor of entomology and the head of the Department of Evolution, Ecology, and Behavior at the University of Illinois at Urbana-Champaign.

“Ants are unique because they live in colonies and divide their responsibilities. Therefore, they don’t have the body constraints that other insects do.”

“Ants have a wide range of head sizes relative to their body,” said Philip Anderson, an assistant professor of evolution, ecology, and behavior. “Some ants have such extremely large heads that even though they look like their heads should pitch forward, they don’t. To study their body design, I created a simple mathematical model to locate their center of balance.”

The researchers, both affiliated with the Beckman Institute for Advanced Science and Technology, created a basic model of an ant body by treating it as a series of connected ellipsoids. They used ant body measurements from antweb.org, which has collections of ant pictures put together by the California Academy of Sciences.
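The idea can be sketched in a few lines: with uniform density, each ellipsoid's mass is proportional to its volume, (4/3)πabc, so the body's center of mass is a volume-weighted average of segment centers. A minimal sketch in which the segment dimensions are invented for illustration, not antweb.org measurements:

```python
# Model an ant as ellipsoid segments (head, thorax, gaster) along one axis
# and locate the whole-body center of mass. Uniform density is assumed, so
# the constant factor (4/3)*pi cancels out of the weighted average.
def com_of_ellipsoids(segments):
    """segments: (x_center, a, b, c) per ellipsoid; returns COM x-position."""
    total = sum(a * b * c for _, a, b, c in segments)
    return sum(x * a * b * c for x, a, b, c in segments) / total

# Invented dimensions in mm: (center x, semi-axes a, b, c).
ant = [
    (0.0, 1.0, 0.8, 0.8),  # head
    (2.0, 1.2, 0.6, 0.6),  # thorax
    (4.5, 1.5, 1.0, 1.0),  # gaster
]
print(com_of_ellipsoids(ant))  # ~2.96, between the thorax and gaster
```

Enlarging the head's semi-axes shifts this number forward, which is how such a model can ask whether the center of mass stays over the legs as head size grows.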

The study “‘Simple’ biomechanical model for ants reveals how correlated evolution among body segments minimizes variation in center of mass as heads get larger” was published in Integrative and Comparative Biology.

“We found that the ants maintain a center of balance over where their legs are,” Anderson said. “These models have helped us understand how these unusual forms of ants have evolved and how the rest of their body compensates for it.”

“The worker ants are like hopeful monsters. They can play with their body form and produce more variation than other insects. With these models we can see that although they have these exaggerated forms, they are not breaking the laws of physics,” Suarez said.

Even though the researchers have focused on simplifying the model as much as possible, there are some limitations. “Treating the ant bodies like ellipsoids doesn’t accurately represent their actual shape,” Anderson said. “Additionally, I assume that every part of the body has the same density, but our co-author Michael Rivera has shown that the head is a lot denser than the abdomen, which changes the calculations.”

The researchers are hopeful that such simple models can be used for applications that use bio-inspired designs. “What happens if you need to add weight to the front of a machine? Is it enough to add weight to the back or are there other ways to compensate? Using such models, we can look to nature for solutions to these issues,” Suarez said.

The study was funded by the National Science Foundation.

The researchers took part in the symposium Melding Modeling and Morphology: Integrating Approaches to Understand the Evolution of Form and Function at the Society for Integrative and Comparative Biology annual meeting this past January in Austin, Texas. Lindsay Waldrop and Jonathan Rader organized the symposium.

 

The study “’Simple’ biomechanical model for ants reveals how correlated evolution among body segments minimizes variation in center of mass as heads get larger” can be found at https://doi.org/10.1093/icb/icaa027.

Journal

Integrative and Comparative Biology

DOI

10.1093/icb/icaa027

Credit: 
Beckman Institute for Advanced Science and Technology