Tech

How the brain senses smell

Rovereto, Italy, 29th July 2020 - An Italian-American study conducted by researchers at the IIT-Istituto Italiano di Tecnologia (Italian Institute of Technology) in Rovereto (Italy) and Harvard University in Boston (USA) explains for the first time the mechanisms our brain uses to recognize specific smells. The study, published in Nature, sheds new light on the brain processes involved in handling the continuous flow of information arriving from our senses, in particular the sense of smell. Building on this result, researchers can begin working toward an artificial sense of smell that could one day be transferred to robots and other intelligent machines.

For sight and hearing, scientists understand quite well the mechanisms that let us distinguish two colors or two notes. That knowledge has been distilled into relatively well-established theories: we know which wavelength a neon light must emit to appear red and which frequency an electronic keyboard must produce for us to hear a G. The same cannot be said for odors: we are not able to say how a molecule smells just by looking at its chemical structure.
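To make the contrast concrete, the stimulus-to-percept mapping for pitch can be written down in a few lines (and for color, a wavelength range can simply be quoted). The snippet below is a standard equal-temperament illustration, not something taken from the study.

```python
# Minimal illustration (not from the study): the physics behind "hearing a G"
# is a one-line formula; nothing comparable exists for mapping a molecule's
# chemical structure to its smell.

def note_frequency(midi_note: int, a4_hz: float = 440.0) -> float:
    """Equal-temperament frequency of a MIDI note (A4 = MIDI note 69)."""
    return a4_hz * 2 ** ((midi_note - 69) / 12)

if __name__ == "__main__":
    print(f"G4 is about {note_frequency(67):.0f} Hz")   # ~392 Hz
    print("red light: roughly 620-750 nm wavelength")
```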

Imagine that you are at a restaurant, having a sorbet with a pleasant citrus scent. If you are a chef, you could use the olfactory information to work out whether it was made with lemon or lime, two fruits that produce odorous molecules which are perceptually similar but chemically different. If you are not interested in the recipe, you may simply notice that it is a citrus sorbet rather than a coffee sorbet. In other words, you discard the small chemical differences between lemon and lime smells and generalize to the single category of citrus.

Flexibly choosing between olfactory discrimination and generalization according to our experience and goals is no simple task. Until now, scientists did not know the code our brain uses to recognize a specific odor.

The new study answers this question for the first time. The team of Italian-American scientists, coordinated by Bob Datta of Harvard Medical School in Boston, identified the tricks used by the brain to discriminate and generalize odorous molecules having chemical structures with various levels of similarity. The team includes two Italian research groups from the Center for Neuroscience and Cognitive Systems at IIT in Rovereto: the team of Giuliano Iurilli, who returned to Italy thanks to the Armenise Harvard Foundation, and the team of Stefano Panzeri, coordinator of the Center.

Iurilli, one of the project's main authors, acted as a "bridge" between Italy and the United States. After a PhD in Neuroscience and Robotics at IIT, he moved to Harvard in 2012, where he studied the brain mechanisms underlying the perception of odors and their effects on our daily behavior. In 2018, he won the Career Development Award from the Armenise Harvard Foundation, a million-dollar, five-year grant that promotes basic biomedical research. Thanks to these funds, Iurilli returned to Italy and started his neurophysiology laboratory at IIT in Rovereto. He brought with him a fundamental piece of research carried out in Boston and contributed to completing it with the Armenise Harvard grant. Bob Datta, lead author of the study, is also an Armenise Harvard awardee, having won the Foundation's HMS Junior Faculty Grant in 2010 and 2016.

"We developed ad hoc analysis methods - Giuliano Iurilli explains - and we saw that the sensory neurons in the nose capture the odorous molecules, analyzing them almost like chromatographs, machines that precisely describe the chemical differences between molecules."

However, this precision applies only to the first "door" of odor recognition, namely the nose. Something changes when the information collected and processed by the nose arrives in a more central structure of the brain, called the olfactory cortex. At this stage, the odors' descriptions begin to "get personal", and they no longer respect chemical differences.

This is where memory comes into play. The new similarities described in the olfactory cortex are less about the chemistry and more about the subject's previous experiences. For example, if our nose detects an ethanol molecule and an octene-3-yl acetate molecule, our olfactory cortex simply registers the scent of a gentian schnapps if we have drunk one before.

"We found out that this happens because past experiences modify the way the neurons of the olfactory cortex exchange the chemical information they received from the nose - Giuliano Iurilli concludes - Now we can start thinking concretely about how to build an artificial brain that does the same thing in a robot".

In fact, this discovery could have a great impact on the fields of robotics and artificial intelligence, through the realization of an artificial olfactory system. Such an intelligent system might be able to evaluate the safety of an environment or recognize an object by quickly sniffing its volatile molecules, just like a human being.

Link to the article: https://www.nature.com/articles/s41586-020-2451-1

Credit: 
Istituto Italiano di Tecnologia - IIT

Music training may not make children smarter after all

Music training does not have a positive impact on children's cognitive skills, such as memory, and academic achievement, such as maths, reading or writing, according to a study published in Memory & Cognition.

Previous trials examining a potential causal link between music training and improved cognitive and academic performance have reached inconsistent conclusions, with some suggesting such a link and others finding little effect.

Researchers Giovanni Sala at Fujita Health University, Japan and Fernand Gobet at the London School of Economics and Political Science, UK examined existing experimental evidence regarding the impact of music training on children's non-music cognitive skills and academic achievement.

The authors re-analyzed data from 54 previous studies conducted between 1986 and 2019, including a total of 6,984 children. They found that music training appeared to be ineffective at enhancing cognitive or academic skills, regardless of the type of skill (such as verbal, non-verbal, speed-related and so on), participants' age, and duration of music training.

When comparing the individual studies included in their meta-analysis, the authors found that studies with high-quality designs, such as those that used a group of active controls - children who did not learn music but instead learned a different skill, such as dance or sports - showed no effect of music education on cognitive or academic performance. Small effects were found in studies that did not include controls, or that did not randomize participants into control groups (which received different or no training) and intervention groups (which received music training).
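As background on what such a re-analysis involves, a meta-analysis pools each study's effect size with a weight that reflects its precision. The sketch below shows that pooling step with invented numbers; it is not the authors' analysis code.

```python
# Hypothetical sketch of the pooling step at the heart of a meta-analysis.
# The effect sizes and standard errors below are invented for illustration;
# they are NOT data from Sala and Gobet's re-analysis.
import math

studies = [
    # (effect size g, standard error)
    (0.30, 0.15),   # small study without active controls (invented)
    (0.05, 0.08),   # larger study with active controls (invented)
    (0.00, 0.10),   # randomized study with active controls (invented)
]

weights = [1 / se ** 2 for _, se in studies]          # inverse-variance weights
pooled = sum(w * g for (g, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"pooled effect g = {pooled:.2f} +/- {1.96 * pooled_se:.2f} (95% CI)")
# Precise, well-controlled studies get the largest weights and pull the pooled
# effect toward zero, mirroring the pattern the authors describe.
```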

Giovanni Sala, the lead author said: "Our study shows that the common idea that 'music makes children smarter' is incorrect. On the practical side, this means that teaching music with the sole intent of enhancing a child's cognitive or academic skills may be pointless. While the brain can be trained in such a way that if you play music, you get better at music, these benefits do not generalize in such a way that if you learn music, you also get better at maths. Researchers' optimism about the benefits of music training appears to be unjustified and may stem from misinterpretation of previous empirical data."

Fernand Gobet, the corresponding author, added: "Music training may nonetheless be beneficial for children, for example by improving social skills or self-esteem. Certain elements of music instruction, such as arithmetical music notation, could be used to facilitate learning in other disciplines."

The authors caution that too few studies have been conducted to reach a definitive conclusion about possible positive effects of music education on non-academic or cognitive characteristics. Alternative potential avenues involving music activities may be worth exploring.

Credit: 
Springer

ADHD services map reveals major gaps in care, failing the vulnerable

Researchers are calling for urgent action after creating a map that identifies gaps in services for adults with ADHD across the UK, gaps that leave vulnerable people struggling to access vital support and treatment.

Research led by the University of Exeter, published in the British Journal of Psychiatry Open, has produced the first national map of ADHD service provision, based on a survey completed by more than 2,600 respondents. The NIHR-funded research found huge variation in available care, patchy provision of dedicated ADHD services, and variation in the services reported by people with ADHD, service users and health workers. Now, researchers are calling for urgent action to address the inequalities in services that the map has revealed, amid concerns that people in some areas may struggle to access care. Of the 44 dedicated services identified, only 12 (27 per cent) provided the full range of services recommended by the National Institute for Health and Care Excellence (NICE).

Previous research has found that without the right treatment and support, people with ADHD are at higher risk of poor health, unemployment, difficulties in education, going to prison, and being involved in a car accident. While children with ADHD are generally well supported in the NHS, provision is far more variable for adults, with earlier studies highlighting the risk of many slipping through the net during transition into adulthood.

Lead author Dr Anna Price, of the University of Exeter Medical School, said: "We know that getting the right support and medication is absolutely critical for people with ADHD to make the most of their lives, and to avoid complications that can really set them on a downward spiral. The NHS is meant to provide equal access to care - yet our research has revealed significant gaps in services across the UK. This is likely to hit the most vulnerable the hardest, who may be getting no support at all - we must address this as an urgent priority."

The researchers worked with colleagues at the Universities of Nottingham and Warwick and at King's College London, as well as the UK Adult ADHD Network (UKAAN), AADD-UK, the Royal College of Psychiatrists, and the NIHR Clinical Research Network South West. They surveyed 2,686 people who were either affected by ADHD, worked in ADHD health care, or commissioned services. Respondents were asked to identify health services they knew of that support adults with ADHD. Of 294 unique services identified across the UK, 44 were dedicated ADHD services and 99 were generic mental health services. The research revealed a disparity in how services are labelled. The best provision appeared to come from services specifically labelled as ADHD or neurodevelopmental services - yet only 12 of these dedicated services provided the full range of treatment recommended by NICE for adults with ADHD. Only around half of the dedicated services (55%) and a minority of other services (7%) were reported by all groups surveyed, highlighting a lack of awareness of available support among both healthcare workers and service users.

Professor Tamsin Ford, who was involved in the study at the University of Exeter Medical School, said: "Our research has revealed significant knowledge gaps among both service users and healthcare workers including GPs, which must mean people are being let down. Our map is the first step in addressing that gap. Demand is clear - a pilot version of our map was viewed over 35,000 times, but much more needs to be done to provide people with the services to which they are entitled. It doesn't really matter whether a service is specialist, or treats many different types of mental health condition. What matters is prompt access to support and ongoing medication for adults with ADHD who need it."

The full paper is entitled: 'Mapping UK mental health services for adults with Attention-Deficit/Hyperactivity Disorder; survey findings, with an analysis of differences in reporting between stakeholder groups'

Credit: 
University of Exeter

Decline of bees, other pollinators threatens US crop yields

image: A bumble bee pollinating a blueberry bush.

Image: 
Winfree lab

Crop yields for apples, cherries and blueberries across the United States are being reduced by a lack of pollinators, according to Rutgers-led research, the most comprehensive study of its kind to date.

Most of the world's crops depend on honeybees and wild bees for pollination, so declines in both managed and wild bee populations raise concerns about food security, notes the study in the journal Proceedings of the Royal Society B: Biological Sciences.

"We found that many crops are pollination-limited, meaning crop production would be higher if crop flowers received more pollination. We also found that honey bees and wild bees provided similar amounts of pollination overall," said senior author Rachael Winfree, a professor in the Department of Ecology, Evolution, and Natural Resources in the School of Environmental and Biological Sciences at Rutgers University-New Brunswick. "Managing habitat for native bee species and/or stocking more honey bees would boost pollination levels and could increase crop production."

Pollination by wild and managed insects is critical for most crops, including those providing essential micronutrients, and is essential for food security, the study notes. In the U.S., the production of crops that depend on pollinators generates more than $50 billion a year. According to recent evidence, European honey bees (Apis mellifera) and some native wild bee species are in decline.

At 131 farms across the United States and in British Columbia, Canada, scientists collected data on insect pollination of crop flowers and yield for apples, highbush blueberries, sweet cherries, tart cherries, almond, watermelon and pumpkin. Of those, apples, sweet cherries, tart cherries and blueberries showed evidence of being limited by pollination, indicating that yields are currently lower than they would be with full pollination. Wild bees and honey bees provided similar amounts of pollination for most crops.

The annual production value of wild pollinators for all seven crops was an estimated $1.5 billion-plus in the U.S. The value of wild bee pollination for all pollinator-dependent crops would be much greater.

"Our findings show that pollinator declines could translate directly into decreased yields for most of the crops studied," the study says. The findings suggest that adopting practices that conserve or augment wild bees, such as enhancing wildflowers and using managed pollinators other than honey bees, is likely to boost yields. Increasing investment in honey bee colonies is another alternative.

James Reilly, a research associate in Winfree's lab, led the study, which used data collected by researchers at many universities and was part of The Integrated Crop Pollination Project funded by the USDA-NIFA Specialty Crop Research Initiative.

Credit: 
Rutgers University

How Salt Lake's buildings affect its climate future

Anyone who's lived or worked in old buildings knows that their heating and cooling systems can't compare to the efficiency, insulation and consistency of those in new buildings. But the quirks of old buildings' climate control systems aren't just seasonal annoyances--they could shape the future of cities' energy use in a warming climate.

With warmer temperatures in both the summer and winter, we'll need less natural gas to heat buildings and more electricity to cool them--but what's the balance between those two effects? University of Utah researchers including Daniel Mendoza, a research assistant professor in the Department of Atmospheric Sciences and a visiting assistant professor in the Department of City & Metropolitan Planning, used hyper-localized climate models and building projections to find out. The answer, they write, is that buildings' energy use in the future varies wildly, depending on the climate scenario, and that local building policy now could have a big impact on energy use in the future.

The results are published in World.

Modeling the future

Climate models come in various scales, from global to hyper-local. For the purposes of this study, Mendoza and his colleagues chose a hyper-local model focused on Salt Lake County, which includes Salt Lake City and its suburbs.

"Using localized climate model output results is critical because climatic conditions are a very important input variable in building energy models," Mendoza says. "These conditions dictate how much energy will be required for heating and cooling which are a large component of a building's energy budget."

Next, the team built a model of how changes in air temperature would affect the energy usage of buildings. They included the five commercial building types most common in the county: large office buildings, small office buildings, primary schools, full-service restaurants and high-rise multi-family apartment buildings. Some buildings proved more challenging to model than others.

"It was after realizing that restaurants are really complicated conditioning environments, that is, you have a fridge/freezer right next to an oven, when we understood how challenging it is to model HVAC demands for these buildings," Mendoza says.

They also looked at building energy standards, which are determined largely by the age of the building. Then, they put in the possible composition of building types that might be present in Salt Lake County by 2040, based on projections by the Wasatch Front Regional Council.

"We expect multi-family apartment buildings to be the fastest-growing building type to accommodate our growing population," Mendoza says. The projections show apartment buildings growing from just under 20% of commercial square footage in 2012 to almost 40% by 2040.

Less heating, more cooling

It's not surprising that, with annual average temperatures in Salt Lake County expected to rise between 1.6 and 4.3 °F (0.9 and 2.3 °C) by 2040, less natural gas will be needed for heating in the winter and more electricity will be needed for cooling in summer.

But the researchers found substantial variability in energy use according to building type. Small and large office buildings saw reduced natural gas usage of up to 75% and 30%, respectively, in the 2040 projection. Those types of buildings are projected to comprise a quarter of Salt Lake County's commercial buildings, so the reduction is substantial.

But it is offset by the increased demand for cooling--up to 30% more electricity needed by schools and restaurants and 20% more by high-rise apartments and office buildings, which together comprise more than half of all commercial buildings.
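One way to see how these per-building figures roll up into a county-wide number is a simple stock-weighted sum. The sketch below reuses the percentage changes quoted above but invents the floor-area shares, so it is illustrative only and not the study's calculation.

```python
# Back-of-the-envelope sketch of how per-building-type changes combine into a
# county-wide figure. The 2040 floor-area shares below are invented
# placeholders, not the study's numbers; only the percentage changes quoted in
# the article are reused.

stock_2040 = {            # hypothetical share of commercial floor area in 2040
    "apartments": 0.40,
    "large_office": 0.15,
    "small_office": 0.10,
    "schools": 0.20,
    "restaurants": 0.15,
}
cooling_change = {        # change in electricity for cooling, from the article
    "apartments": +0.20,
    "large_office": +0.20,
    "small_office": +0.20,
    "schools": +0.30,
    "restaurants": +0.30,
}

net = sum(stock_2040[b] * cooling_change[b] for b in stock_2040)
print(f"stock-weighted change in cooling electricity: {net:+.0%}")
```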

Still a chance to choose

Mendoza acknowledges that projections of building types aren't set in stone. "Accelerated population growth could modify building type distribution," he says. "Faster than expected warming could also change predictions considerably." Given the anticipated demand for cooling electricity, Mendoza says, Salt Lake County could choose to generate that electricity through renewable sources, reducing the fossil fuel emissions that underlie the anticipated warming.

Credit: 
University of Utah

Black phosphorus future in 3D analysis, molecular fingerprinting

image: Schematics of an on-chip mid-infrared system based on black phosphorus-silicon hybrid platform. The passive silicon photonic layer serves to guide the mid-infrared light while black phosphorus plays an active role in light emission, modulation and detection.

Image: 
Bowei Dong and Li Huang

WASHINGTON, July 28, 2020 -- Many compact systems using mid-infrared technology continue to face compatibility issues when integrating with conventional electronics. Black phosphorus has garnered attention for overcoming these challenges thanks to a wide variety of uses in photonic circuits.

Research published in Applied Physics Reviews, by AIP Publishing, highlights the material's potential for emerging devices ranging from medical imaging to environment monitoring.

Scientists from the National University of Singapore reviewed the scientific work conducted so far looking into using black phosphorus for next-generation optoelectronics chips. In the paper, the group assesses progress in different components of the chips, from light detection to laser emission.

"Extending the wavelength from near-infrared to mid-infrared enables more diversified functions beyond communication and computing," said author Kah-Wee Ang. "Sensing is one of the most important potential applications in mid-infrared, as it serves to connect the real world we live in to the virtual system on chip."

Black phosphorus achieves its promising versatility through the various ways it can be manipulated as a 2D material. These features make it attractive for the field of optoelectronics, in which information conveyed using conventional electron-based chips is combined with emerging technology that uses photons to transmit information.

Going beyond thermal imaging uses, mid-infrared technology may be applied to identifying molecular "fingerprints" or using unique features of the mid-infrared wavelengths to analyze 3D structures and motion to distinguish human-made objects from natural ones.

"If we could realize a compact mid-infrared system, we may be able to actualize applications, such as health monitoring and toxic gas detection, with a small chip in a hand-held device," Ang said.

By modifying the number of layers, applying a vertical electric field and introducing chemical doping with relative ease, the material can efficiently tune electron energy levels to a device's desired needs. This precise tuning could be instrumental in the electro-optic modulation that would be required for faster computing and data communication, as well as weak signal detection and spectrum analysis.

Despite its promise, widespread production of atom-thick layers of black phosphorus remains challenging.

"We often rely on exfoliation by tape to obtain thin-film black phosphorus, which is not a fully repeatable process," Ang said. "Large-scale growth, if achieved, would be a breakthrough to advance black phosphorus-based technology."

Ang hopes the review helps cement black phosphorus as an essential material in next-generation optoelectronics devices in the coming years and looks to continue working toward high-performance and compact circuit prototypes.

Credit: 
American Institute of Physics

A blood test for Alzheimer's? Markers for tau take us a step closer

CHICAGO, JULY 28, 2020 -- A simple blood test for Alzheimer's would be a great advance for individuals with -- and at risk for -- the disease, families, doctors and researchers.

At the Alzheimer's Association International Conference® (AAIC®) 2020, scientists reported results of multiple studies on advances in blood "tests" for abnormal versions of the tau protein, one of which may be able to detect changes in the brain 20 years before dementia symptoms occur. In particular, the reports focus on a specific form of tau known as p-tau217, which seems to be the most specific to Alzheimer's and the earliest to show measurable changes.

Changes in brain proteins amyloid and tau, and their formation into clumps known as plaques and tangles, respectively, are defining physical features of Alzheimer's disease in the brain. Buildup of tau tangles is thought to correlate closely with cognitive decline. In these newly reported results, blood/plasma levels of p-tau 217, one of the forms of tau found in tangles, also seem to correlate closely with buildup of amyloid.

Currently, the brain changes that occur before Alzheimer's dementia symptoms appear can only be reliably assessed by positron-emission tomography (PET) scans and by measuring amyloid and tau proteins in cerebrospinal fluid (CSF). These methods are expensive and invasive. And, too often, they are unavailable because they are not covered by insurance, are difficult to access, or both.

"There is an urgent need for simple, inexpensive, non-invasive and easily available diagnostic tools for Alzheimer's. New testing technologies could also support drug development in many ways. For example, by helping identify the right people for clinical trials, and by tracking the impact of therapies being tested," said Maria C. Carrillo, Ph.D., Alzheimer's Association chief science officer. "The possibility of early detection and being able to intervene with a treatment before significant damage to the brain from Alzheimer's disease would be game changing for individuals, families and our healthcare system."

A blood test, for example, will enable interpretation and understanding of Alzheimer's progression in much larger, more diverse and more robust populations.

"While these new reports are encouraging, these are early results, and we do not yet know how long it will be until these tests are available for clinical use. They need to be tested in long-term, large-scale studies, such as Alzheimer's clinical trials," Carrillo added. "In addition, we need to continue research to refine and verify the tests that are the current state-of-the-art -- including cerebrospinal fluid and PET imaging biomarkers."

Blood P-tau217 Detects Alzheimer's Disease (i.e., Both Plaques and Tangles) with High Accuracy

As reported at AAIC 2020, an international team of researchers have identified a highly accurate, blood-based biomarker for the detection of Alzheimer's disease by measuring levels of p-tau217 in blood, and validated the finding in multiple, diverse populations. The scientists found that, "the diagnostic precision of blood p-tau217 was as high as established diagnostic methods, including positron emission tomography (PET) imaging and cerebrospinal fluid biomarkers, which are invasive, costly and less available."

The research team was led by Oskar Hansson, M.D., Ph.D., from Lund University, Sweden in coordination with Sebastian Palmqvist, M.D., Ph.D., and Shorena Janelidze, Ph.D. from Lund, Eric Reiman, M.D., from Banner Alzheimer's Institute, USA, Jeffrey Dage, Ph.D., from Eli Lilly, USA, and other research colleagues. The Lund University researchers presented the results at AAIC, and they were also published online.

They studied three different cohorts comprising more than 1,400 cases, including a large clinic-based study from Sweden (the BioFINDER-2 study), a cohort with neuropathological confirmation of Alzheimer's (the Arizona Study of Aging and Neurodegenerative Disorders), and a large kindred with genetically caused Alzheimer's (the Colombian autosomal-dominant Alzheimer's registry). They analyzed other current experimental biomarkers (p-tau217, p-tau181, Aβ42/40 and neurofilament light chain) in both blood and cerebrospinal fluid, and also performed PET imaging for tau and amyloid pathology.

The main finding of the study was that blood p-tau217 could distinguish Alzheimer's from other neurodegenerative disorders with diagnostic accuracy between 89 and 98 percent. In this study, the p-tau217 assessment was more accurate for Alzheimer's than blood-based tests for p-tau181, neurofilament light or the amyloid beta 42/40 ratio, as well as magnetic resonance imaging (MRI). In fact, according to the researchers, performance was similar to that of significantly more costly methods, such as PET imaging and cerebrospinal fluid biomarkers.

The researchers also found that p-tau217 analyzed in blood collected during life could detect tau brain changes measured in brain tissue analyzed after death. These tau brain changes are thought to be related to amyloid plaque accumulation. P-tau217 distinguished persons who had plaques and tangles from those without Alzheimer's pathology with 89% accuracy, distinguished those with plaques and more extensive tangles with 98% accuracy, and predicted the outcome of tau PET imaging with 93% accuracy.

The p-tau217 levels were increased about seven-fold in Alzheimer's, and, in individuals with a gene causing Alzheimer's, the levels began to increase as early as 20 years before the onset of cognitive impairment. "This test, once verified and confirmed, opens the possibility of early diagnosis of Alzheimer's before the dementia stage, which is very important for clinical trials evaluating novel therapies that might stop or slow down the disease process," Hansson said.

Blood Amyloid and P-tau are Precise Markers of Brain Amyloidosis, Tauopathy

To advance research on a blood test for Alzheimer's disease, Suzanne Schindler, M.D., Ph.D., of Washington University School of Medicine in St. Louis and colleagues evaluated the performance of a variety of amyloid and tau measures in blood.

Using mass spectrometry, the scientists mapped the blood plasma tau protein and compared the results to CSF and PET imaging measures. Compared to the better-known tau form p-tau181, they found that p-tau217 was more closely linked to the buildup of amyloid plaques in the brain as measured by a PET scan.

Additionally, their findings suggest that measuring levels of several different forms of p-tau in blood over time may enable clinicians and researchers to track the stages of Alzheimer's progression in people living with the disease.

According to the researchers, a blood test for Alzheimer's disease that incorporates both amyloid and tau measures may allow earlier and more accurate dementia diagnoses not only in research participants but also in clinic patients.

The scientists launched the Study to Evaluate Amyloid in Blood and Imaging Related to Dementia (SEABIRD) to develop and validate Alzheimer's blood biomarkers in a cohort that is more diverse and representative of the greater St. Louis region. SEABIRD will enroll more than 1,100 individuals including diversity in race, socioeconomic status, medical history and cognitive status.

Plasma P-tau217 is Comparable to P-tau181 for Distinguishing Between Alzheimer's and Frontotemporal Lobar Degeneration

Recent studies have shown that p-tau181 is more than three times as high in people with Alzheimer's compared to healthy elderly people or people with a neurodegenerative disease known as frontotemporal lobar degeneration (FTLD). At AAIC 2020, Elisabeth Thijssen, M.Sc., and Adam L. Boxer, M.D., Ph.D., of the UCSF Memory and Aging Center and colleagues reported a comparison of p-tau181 to a related form of tau called p-tau217 to determine which form can best identify people with Alzheimer's.

The retrospective study included 617 participants: 119 healthy controls, 74 Alzheimer's cases (biomarker-confirmed) and 294 with FTLD. In this study group, plasma p-tau181 was increased three-fold in people with Alzheimer's compared to controls and FTLD. The increase in plasma p-tau217 was even greater: five-fold in Alzheimer's compared to healthy controls and four-fold relative to FTLD. The plasma comparison results mirrored the findings of tau PET imaging in the brain. P-tau181 had 91% accuracy and p-tau217 had 96% accuracy in predicting whether a person had a tau-positive brain scan.

According to the researchers, the study shows that both p-tau217 and p-tau181 measured in blood are elevated in Alzheimer's, and that measurements closely correspond to "gold standard" PET scan results. These blood tests are likely to be useful for diagnosing Alzheimer's and as monitoring tools in clinical trials to measure treatment effects of new Alzheimer's therapies.

Credit: 
Alzheimer's Association

The mystery of the less deadly mosquito nets

Research published in Nature Communications shows that insecticide-treated mosquito nets, the mainstay in the global battle against malaria, are not providing the protection they once did - and scientists say that's a cause for serious concern in tropical and subtropical countries around the globe.

Long-Lasting Insecticidal Nets, or LLINs, are credited with having saved 6.8 million lives from 2000 to 2015.

"While an untreated net stops mosquitoes from biting you while you sleep - providing valuable protection - these nets are treated with a long-life insecticide that actually kills mosquitoes that come in contact with them," said Dr Stephan Karl, a malaria researcher from James Cook University's Australian Institute of Tropical Health and Medicine, and the Papua New Guinea Institute of Medical Research.

"LLINs add a community-level protective effect by massively decreasing the overall number of mosquitoes. In other words, even people not directly using these nets benefit by their being present in the communities," Dr Karl said.

The introduction of LLINs in Papua New Guinea in 2006 led to a significant drop in malaria cases, but the rate of infections has since bounced back - from less than 1% in 2013-2014 to 7.1% in 2016-2017.

"The nets are really a frontline defence - in Papua New Guinea they are the only tools used at present in the national campaign against the mosquitoes that can carry malaria," said co-author Dr Moses Laman, Deputy Director at the PNG Institute of Medical Research.

"Malaria kills around half a million people worldwide each year, so any suggestion that the nets are not working is cause for grave concern."

When researchers in Papua New Guinea, Australia and the UK investigated, their search took an unexpected twist.

"Early on it was believed the rise in cases was due to shortages in anti-malarial drugs," said co-author Tim Freeman, country manager of Rotarians Against Malaria. "But after the drug supply was restored, cases continued to climb." Rotarians Against Malaria assists Papua New Guinea's National Department of Health with net distribution across the country.

Other possible explanations were investigated. Were mosquitoes building resistance to the insecticide, or avoiding the insecticide by feeding more on animals and humans outdoors? Were people getting bitten more often because greater access to electricity enabled them to stay up later?

"We can certainly rule out insecticide resistance as our studies have shown over and over again that there is currently no insecticide resistance in the malaria mosquitoes in PNG," Dr Karl said.

"Each of the remaining factors could contribute to an increasing rate of infections to some extent, but the rapid rise in cases ¬- to almost pre-control levels - indicated that we were still missing a major cause."

The mosquito nets themselves were not an obvious culprit, since their insecticide content is tested regularly in pre-delivery inspections.

The LLINs used in Papua New Guinea are all made by a single company (Vestergaard) to specifications set by the World Health Organisation. The model (PermaNet 2.0) is widely used - in 2014 it accounted for the largest market share of LLINs globally.

It was when the researchers tested the nets' performance at knocking down and killing mosquitoes that the problem was revealed.

"With new nets manufactured up to 2012, the percentage of mosquitoes killed was always close to 100%," Dr Karl said.

"Using the same standard tests with new nets produced from 2013 to 2019, the kill rate dropped to an average of 40%, with some of the nets not killing any mosquitoes at all.

"That's an alarming loss of efficacy in critical protective equipment.

"This also calls into question the regulations and standards governing the quality of LLINs worldwide, if such nets are still considered acceptable."

All nets tested appeared to have the same amount of insecticide, which poses the question of how nets with the same insecticide levels could be less lethal to mosquitoes.

The authors agree that the answer most likely lies in changes to the manufacture of the nets. "We hope to work with the manufacturer to investigate further," Dr Karl said.

In the meantime, the researchers urge that LLINs be tested for their ability to kill mosquitoes - not just their insecticide content.

They have notified the World Health Organisation and the manufacturer of their findings.

Credit: 
James Cook University

New studies reveal inside of central energy release region in solar eruption

Prof. LIN Jun from the Yunnan Observatories of the Chinese Academy of Sciences, collaborating with Prof. CHEN Bin of the New Jersey Institute of Technology, conducted radio observations of the magnetic field distribution and the characteristics of relativistic electron acceleration in the current sheet of solar flares.

The related research results were published in the journal Nature Astronomy on July 27, 2020.

Solar eruptions are the most violent energy-release processes in the solar system and are usually accompanied by solar flares and coronal mass ejections (CMEs). In the standard flare model, the large-scale current sheet formed by magnetic reconnection is considered the core engine driving the rapid release of magnetic energy and the acceleration of particles.

However, owing to the lack of observations of the magnetic field and high-energy particles near the current sheet, key questions such as where and how energy is released and particles are accelerated in solar flares have remained open.

Prof. CHEN Bin and colleagues analyzed the microwave radiation near the current sheet in an X-class flare event on September 10, 2017, using data from the Expanded Owens Valley Solar Array (EOVSA) together with numerical experiments based on the Lin-Forbes model developed by Prof. LIN Jun and colleagues.

The Lin-Forbes model is a theoretical solar eruption model that quantitatively describes the overall evolution of the magnetic field structure and its physical relation to magnetic reconnection during the eruptive process. It is often used by researchers in the solar physics community to interpret observations, reveal the corresponding physical scenario and understand the underlying physics.

The research group found that the magnetic field in the current sheet shows a local maximum at the X-point of magnetic reconnection, and a local minimum in the region between the bottom of the current sheet and the top of the flare loop (also known as the magnetic bottle).

The microwave energy spectrum shows that the acceleration or accumulation of more than 99% of the relativistic electrons likely occurs in the magnetic bottle region at the top of the flare loop, rather than near the reconnecting X-point.

These results not only provide direct observational evidence for solving the problem of particle acceleration in the solar eruptive process, but also confirm the Lin-Forbes model.

The finding is the fruit of good international academic collaboration.

Credit: 
Chinese Academy of Sciences Headquarters

Winning the digital transformation race: three emerging approaches for leading transition

New research from Professor Feng Li, Chair of Information Management at City's Business School, has outlined three new approaches that digital innovators can take to reduce the risk of failure and seize competitive advantage in the industry.

With the coronavirus pandemic forcing many organisations to operate remotely, adoption of the latest secure technologies has taken on greater importance for many industries – presenting great opportunities for providers of these technologies, but also great challenges of meeting demand, staying ahead of competition and surviving in a fast-moving environment.

Professor Li interviewed senior leaders at eight global digital champions including Amazon, VMWare, Slack, Alibaba and Baidu to find out what their strategies were for innovation.

The findings can be summarised into three main approaches that are emerging:

Innovation by experimentation: a continual process of developing ideas on a small scale without high upfront investment, and then leveraging and rapidly scaling up those that turn out to be successful.

Radical transformation through incremental approaches: breaking up large-scale projects into strategic investments that can be measured at each stage allows companies to radically innovate across several projects at once, in small steps. This method also mitigates the risk of a single large project failing and allows businesses to judge which projects will yield the highest returns or the biggest impact on investment.

Dynamic sustainable advantages through portfolios of temporary advantages: due to the fast-paced nature of the digital economy, competitive advantages are often short-lived. Implementing a strategy for successive and incremental temporary advantages can yield significant long-term gains.

All three strategies use elements of diversification and portfolio management to mitigate costs of failure, as is often seen in investor portfolios.

Professor Li said the nature of digital innovation lent itself to a highly dynamic approach.

“Digital technology is a highly volatile, fast-paced sector,” he said.

“It is important for companies in the field to recognise that competitive advantages are short-lived, and that there is no ‘end-point’ for innovation. Throwing all your weight behind one project as a start-to-end activity is highly risky and serves little long-term benefit even if successful.

“Sustainability can only be achieved by continuously reinventing the wheel while seeking new investment opportunities.

“The coronavirus pandemic has both challenged and opened doors of opportunity for traditionally non-digital organisations to innovate methods of banking, education and even living room gym classes.

“This is placing added pressure on industry incumbents to stay ahead of new disruptors, putting further emphasis on the need to have new irons in the fire and the ability to change direction quickly and efficiently between innovations.”

Read Professor Li’s new paper "Leading Digital Transformation: Three Emerging Approaches for Managing the Transition", published by the International Journal of Operations and Production Management (IJOPM).

Journal

International Journal of Operations & Production Management

DOI

10.1108/IJOPM-04-2020-0202

Credit: 
City St George’s, University of London

Eavesdropping on trout building their nests

video: Geophones have been used to record the seismic vibrations caused by fish as they dig their spawning pits. Translated into audio, the signals are clearly distinguishable from a river or an airplane.

Image: 
M. Dietze/GFZ

Steelhead trout (Oncorhynchus mykiss) stir up the sediment of the river bed when building their spawning pits, thus influencing the composition of the river bed and the transport of sediment. Until now, this process could only be studied visually, irregularly and with great effort in the natural environment of the fish. Now, researchers led by Michael Dietze of the GFZ German Research Centre for Geosciences in Potsdam have used seismic sensors (geophones) to analyze the trout's nest-building process in detail. The study was published in the journal Earth Surface Processes and Landforms.

To lay their eggs, trout use their caudal fins to dig pits up to three metres long on each side and ten centimetres deep into the river bed. The aim of the researchers was to locate these spawning pits and to analyze the chronological sequence of the construction process. To this end, the researchers set up a network of seismic stations on a 150-meter section of the Mashel River in the US state of Washington. The geophones embedded in the earth are highly sensitive and detect the slightest vibrations in the ground. Small stones moved by the fish caused short frequency pulses in the range of 20 to 100 hertz and could be distinguished from background frequencies of flowing water, raindrops and even the pulses of passing airplanes. "The same signal arrives at each of the stations slightly delayed. This enabled us to determine where the seismic wave was generated," says Dietze, first author of the study.
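A minimal sketch of the two signal-processing steps implied here - isolating the 20 to 100 hertz digging pulses and estimating the arrival-time delay of the same pulse between two stations - is given below. It is not the authors' code; the sampling rate and the synthetic pulse are invented for illustration.

```python
# Minimal sketch (not the authors' code) of the two processing steps described
# above: isolate the 20-100 Hz digging pulses, then estimate the arrival-time
# delay of the same pulse between two geophone stations by cross-correlation.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def digging_band(trace, fs):
    """Band-pass a geophone trace to the 20-100 Hz band of the digging pulses."""
    sos = butter(4, [20, 100], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, trace)

def arrival_delay(trace_a, trace_b, fs):
    """Delay (s) of trace_a relative to trace_b, estimated by cross-correlation."""
    xc = np.correlate(trace_a, trace_b, mode="full")
    lag = np.argmax(xc) - (len(trace_b) - 1)
    return lag / fs

if __name__ == "__main__":
    fs = 500.0                                    # Hz, hypothetical sampling rate
    t = np.arange(0, 2, 1 / fs)
    pulse = np.exp(-((t - 1.0) ** 2) / 1e-4) * np.sin(2 * np.pi * 60 * t)
    a = digging_band(pulse + 0.05 * np.random.randn(t.size), fs)
    b = digging_band(np.roll(pulse, 5) + 0.05 * np.random.randn(t.size), fs)
    print(f"estimated delay: {arrival_delay(b, a, fs) * 1000:.0f} ms")  # ~10 ms
    # Delays between several station pairs, plus the seismic wave speed in the
    # river bed, let the source of the pulses (the spawning pit) be located.
```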

The researchers listened in on the construction of four spawning pits for almost four weeks, from the end of April to the end of May. The geophones revealed that the trout were mostly busy building their nests on eleven days of the measurement period. The fish typically started at sunrise and remained active until around midday, followed by another period of activity in the early evening. The trout dug in the sediment for between one and twenty minutes, typically at two- to three-minute intervals with 50 to 100 tail strokes. Each digging bout was followed by a break of about the same length.

"Normally, the nest-building behaviour of the trout was recorded only very irregularly, at most weekly. We can now resolve this to the millisecond. In the future, we want to extend the method to the behaviour of other species, for example animals that dig along the banks and destabilize them," explains Dietze. The new measurement method might support fish and behavioural biology and provide a more accurate picture of the biotic and abiotic contribution of sediment transport in rivers. "Fish can move as much sediment as a normal spring flood. The biological component can therefore play a very important role," said Dietze.

Credit: 
GFZ GeoForschungsZentrum Potsdam, Helmholtz Centre

Higher BPA levels linked to more asthma symptoms in children

Children in low-income neighborhoods in Baltimore tended to have more asthma symptoms when levels of the synthetic chemical BPA (Bisphenol A) in their urine were elevated, according to a study from researchers at the Johns Hopkins Bloomberg School of Public Health and School of Medicine.

While some products, including baby bottles, no longer contain BPA, exposures to BPA remain almost universal, and there are still concerns that, especially in childhood, those exposures might have a health impact.

Boys with elevated BPA were found to be at higher risk for having more asthma symptoms, the study found. The researchers found no statistically significant link between BPA levels and asthma symptoms among the girls in the study. The researchers also found that higher levels of two common chemicals closely related to BPA--BPS and BPF--were not consistently associated with more asthma symptoms. Like BPA, BPS and BPF are found in many consumer products, including food cans and beverage bottles.

For their analysis, the researchers examined clinical data and urine samples, taken at three-month intervals over a year, from 148 predominantly Black children in Baltimore. They found consistent links between higher BPA levels in urine and measures of recent asthma severity.

The study, published July 28 in the Journal of Allergy and Clinical Immunology, is thought to be the first to examine children's environmental exposures to BPA, BPS, and BPF and their associations with asthma severity.

"Our findings suggest that additional studies are needed to examine this BPA-asthma link, given the high burden of pediatric asthma and widespread exposure to BPA in the United States," says lead author Lesliam Quirós-Alcalá, PhD, assistant professor in the Department of Environmental Health and Engineering at the Bloomberg School. "This is especially important given that Black Americans have higher asthma rates than whites and also, according to CDC data, have higher exposure to these chemicals than whites."

BPA is a chemical building block used to make polycarbonate plastic as well as some epoxies. Produced at a rate of about 7 million tons per year worldwide, it can leach from polycarbonate bottles into the liquids they contain, and from epoxies that line cans of soup and other food items. A 2011 study found that eating soup from cans lined with BPA-containing epoxy caused study participants' BPA levels to rise by a factor of almost 20.

BPA can activate estrogen receptors on cells, which suggests that it may have hormone-like effects--disrupting human biology even at very small exposure levels. Animal studies have found evidence that the chemical can have pro-inflammatory effects. Epidemiological studies have found that people with higher BPA levels in their urine are more likely to have cardiovascular disease, diabetes, asthma, and some other conditions. Children are in principle more vulnerable, to the extent that they use BPA-containing products more often than adults do. Due to consumer concerns, companies stopped making BPA-containing baby bottles and sippy cups more than a decade ago, and have largely switched to non-BPA can epoxies.

BPS and BPF are close chemical relatives, or analogs, of BPA, and are found, for example, in can-linings and thermal-printer receipts--often as replacements for BPA. They too can interact with estrogen receptors, although very little is known about their health impacts at current exposure levels.

In the new study, Quirós-Alcalá and colleagues examined the link between BPA and asthma. More than 25 million Americans, including about one out of twelve children, have this airway inflammatory disorder.

While prior studies in children have linked higher BPA levels to a greater likelihood of developing asthma, the researchers here looked for a link between BPA exposure and the extent of symptoms in established asthma--or asthma "morbidity," as epidemiologists call it.

To do this, they analyzed clinical data, as well as stored urine samples, from the Mouse Allergen and Asthma Cohort Study (MAACS), which was conducted from 2007 to 2010 in Baltimore and covered 148 asthmatic children between 5 and 17. The study included 85 boys and 63 girls. Most of the children (91 percent) were Black, and most (69 percent) came from households with annual incomes below $35,000. Each child in the study was evaluated by doctors every three months for a year, and at these visits the child's caregiver filled out a questionnaire about the child's recent asthma symptoms and medical care.

Quirós-Alcalá and her colleagues found BPA in every urine sample taken during the study, with a mean concentration of 3.6 nanograms per milliliter--consistent with one study of low-income minority children in the U.S., but several times higher than levels measured in other groups.

The children in the study varied greatly in their urine BPA levels, and the researchers found that a ten-times-greater level of BPA was associated with a 40 percent increased chance of having had "coughing, wheezing, or chest tightness" in the prior two weeks, along with an 84 percent and a 112 percent increased chance, respectively, of reporting an acute care or an emergency-room visit in the prior three months.
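For readers unused to this phrasing, "a ten-times-greater level was associated with a 40 percent increased chance" is the kind of statement a logistic regression on log10-transformed exposure yields. The snippet below is an illustrative translation of that statement, not the authors' statistical model.

```python
# Illustrative translation (not the authors' model) of "a ten-times-greater
# BPA level was associated with a 40 percent increased chance of symptoms".
# Read as a logistic regression with log10(BPA) as the exposure, that
# corresponds to an odds ratio of 1.4 per log10-unit.
import math

or_per_log10 = 1.40                       # 40% higher odds per 10-fold increase
beta = math.log(or_per_log10)             # coefficient on log10(BPA)

def odds_multiplier(fold_change: float) -> float:
    """Odds multiplier for a given fold-change in BPA concentration."""
    return math.exp(beta * math.log10(fold_change))

print(f"10x BPA  -> odds x {odds_multiplier(10):.2f}")   # 1.40
print(f"100x BPA -> odds x {odds_multiplier(100):.2f}")  # 1.96
```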

When the researchers analyzed the children by sex, they found that these associations remained statistically significant only for the boys.

The analysis also showed that BPS and BPF levels in urine of the 148 children were much lower on average than those for BPA, and in some urine samples were not found at all. Higher BPS or BPF levels were not consistently associated with more asthma morbidity.

This was an associational study and does not prove that BPA exposures caused health effects, but it suggests that more conclusive studies of cause and effect should be done, the researchers say.

"If these findings are confirmed in future studies, then avoiding or limiting contact with BPA sources may be advisable for families who have children with asthma," Quirós-Alcalá says.

Credit: 
Johns Hopkins Bloomberg School of Public Health

Discovery will allow more sophisticated work at nanoscale

image: Researchers led by a University of Houston engineer have reported a new way to stimulate fluid flow at nanoscale by using a small increase in temperature or voltage.

Image: 
ACS Applied Nano Materials

The movement of fluids through small capillaries and channels is crucial for processes ranging from blood flow through the brain to power generation and electronic cooling systems, but that movement often stops when the channel is smaller than 10 nanometers.

Researchers led by a University of Houston engineer have reported a new understanding of the process and why some fluids stagnate in these tiny channels, as well as a new way to stimulate the fluid flow by using a small increase in temperature or voltage to promote mass and ion transport.

The work, published in ACS Applied Nano Materials, explores the movement of fluids with lower surface tension, in which the bonds between molecules break apart when the fluid is forced into narrow channels, halting the transport process known as capillary wicking. The research was also featured on the journal's cover.

Hadi Ghasemi, Cullen Associate Professor of Mechanical Engineering at UH and corresponding author for the paper, said this capillary force drives liquid flow in small channels and is the critical mechanism for mass transport in nature and technology - that is, in situations ranging from blood flow in the human brain to the movement of water and nutrients from soil to plant roots and leaves, as well as in industrial processes.

But differences in the surface tension of some fluids cause the wicking process - and therefore the movement of the fluid - to stop when those channels are smaller than 10 nanometers, he said. The researchers reported that it is possible to prompt continued flow by manipulating the surface tension through small stimuli, such as raising the temperature or applying a small voltage.

Ghasemi said raising the temperature even slightly can activate movement by changing the surface tension, a mechanism the researchers dubbed "nanogates." Depending on the liquid, raising the temperature by just 2 to 3 degrees Celsius is enough to mobilize the fluid.

"The surface tension can be changed through different variables," he said. "The simplest one is temperature. If you change temperature of the fluid, you can activate this fluid flow again." The process can be fine-tuned to move the fluid, or just specific ions within it, offering promise for more sophisticated work at nanoscale.

"The surface tension nanogates promise platforms to govern nanoscale functionality of a wide spectrum of systems, and applications can be foreseen in drug delivery, energy conversion, power generation, seawater desalination, and ionic separation," the researchers wrote.

Credit: 
University of Houston

NASA's Terra Satellite finds wind shear weakening Tropical Storm Douglas

image: On July 28 at 5 a.m. EDT (0900 UTC), the MODIS instrument that flies aboard NASA's Terra satellite gathered infrared data on Douglas that confirmed wind shear was adversely affecting the storm. Persistent south to southwest vertical wind shear showed strongest storms (yellow) pushed north and northeast of the center where cloud top temperatures are as cold as minus 50 degrees Fahrenheit (minus 45.5 Celsius).

Image: 
NASA/NRL

Former Hurricane Douglas has encountered strong wind shear after passing the Hawaiian Islands and has now weakened to a tropical storm. NASA's Terra satellite provided infrared data showing that the strongest storms were displaced from the center as the system weakened.

Warnings in Effect for Douglas on July 28

On July 28, NOAA's Central Pacific Hurricane Center (CPHC) continued posting warnings and watches. A Tropical Storm Warning is in effect for portions of the Papahanaumokuakea Marine National Monument from Nihoa to French Frigate Shoals to Maro Reef. A Tropical Storm Watch is in effect for portions of the Papahanaumokuakea National Marine Monument from Maro Reef to Lisianski.

NASA's Terra Satellite Reveals Effects of Wind Shear 

NASA's Terra satellite uses infrared light to analyze the strength of storms by providing temperature information about the system's clouds. The strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

On July 28 at 5 a.m. EDT (0900 UTC), the Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Terra satellite gathered infrared data on Douglas that confirmed wind shear was adversely affecting the storm. Persistent south to southwest vertical wind shear showed strongest storms were pushed north and northeast of the center. Those storms had cloud top temperatures as cold as minus 50 degrees Fahrenheit (minus 45.5 Celsius). Satellite imagery also shows the low-level circulation center became exposed.

The wind shear and the displacement of the strongest storms have led to a rapid weakening trend over the past 24 hours.

Wind Shear Affecting Douglas

The shape of a tropical cyclone provides forecasters with an idea of its organization and strength. When outside winds batter a storm, it can change the storm's shape and push much of the associated clouds and rain to one side of it. That is what wind shear does.

In general, wind shear is a measure of how the speed and direction of winds change with altitude. Tropical cyclones are like rotating cylinders of winds, and each level needs to be stacked on top of the others vertically for the storm to maintain strength or intensify. Wind shear occurs when winds at different levels of the atmosphere push against the rotating cylinder, weakening the rotation by pushing it apart at different levels.
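In practice, forecasters often quantify this as the vector difference between winds at a low and an upper level of the atmosphere. The sketch below illustrates that calculation with invented wind values and the common 850/200-hPa deep-layer convention; it is not taken from the CPHC analysis of Douglas.

```python
# Hypothetical illustration of the definition above: shear as the vector
# difference between winds at a lower and an upper level. The 850 hPa / 200 hPa
# pair is a common deep-layer convention for tropical cyclones; the wind values
# below are invented, not taken from the CPHC analysis of Douglas.
import math

def wind_vector(speed_kt: float, direction_from_deg: float):
    """u (east) and v (north) components from speed and meteorological direction."""
    rad = math.radians(direction_from_deg)
    return -speed_kt * math.sin(rad), -speed_kt * math.cos(rad)

u_low, v_low = wind_vector(10, 180)    # 850 hPa wind from the south (invented)
u_hi, v_hi = wind_vector(35, 225)      # 200 hPa wind from the southwest (invented)

shear_kt = math.hypot(u_hi - u_low, v_hi - v_low)
print(f"deep-layer shear ~ {shear_kt:.0f} kt")
# Shear of roughly 20 kt or more is generally hostile to a tropical cyclone;
# here the shear vector points toward the northeast, matching the displacement
# of Douglas's strongest storms north and northeast of its center.
```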

Status of Tropical Storm Douglas on July 28, 2020

At 8 a.m. EDT (2 a.m. HST/1200 UTC), the center of Tropical Storm Douglas was located near latitude 23.8 degrees north and longitude 165.7 degrees west. That is about 40 miles (60 km) east of French Frigate Shoals, and about 525 miles (845 km) west-northwest of Honolulu, Hawaii.

Douglas was moving toward the west-northwest near 18 mph (30 kph), and this general motion is expected to continue the next couple of days. Maximum sustained winds were near 50 mph (85 kph) with higher gusts. The estimated minimum central pressure was 1001 millibars.

Forecast for Douglas

NOAA's CPHC said weakening is forecast over the next couple of days, and Douglas is expected to dissipate by Thursday, July 30. Interests elsewhere in the Papahanaumokuakea Marine National Monument should monitor the progress of this system.

NASA Researches Tropical Cyclones

Hurricanes/tropical cyclones are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

For more than five decades, NASA has used the vantage point of space to understand and explore our home planet, improve lives and safeguard our future. NASA brings together technology, science, and unique global Earth observations to provide societal benefits and strengthen our nation. Advancing knowledge of our home planet contributes directly to America's leadership in space and scientific exploration.

For updated forecasts, visit: http://www.nhc.noaa.gov

Credit: 
NASA/Goddard Space Flight Center

Microbiologists clarify relationship between microbial diversity and soil carbon storage

image: UMass Amherst microbiologists recently released results of their investigation to understand the role of microbial diversity in soil carbon efficiency.

Image: 
UMass Amherst/Luiz A. Domeignoz-Horta

AMHERST, Mass. - In what they believe is the first study of its kind, researchers led by postdoctoral researcher Luiz A. Domeignoz-Horta and senior author Kristen DeAngelis at the University of Massachusetts Amherst report that shifts in the diversity of soil microbial communities can change the soil's ability to sequester carbon, where it usually helps to regulate climate.

They also found that the positive effect of diversity on carbon use efficiency - which plays a central role in that storage - is neutralized in dry conditions. Carbon use efficiency refers to the balance between the carbon assimilated into microbial products and the carbon lost to the atmosphere as CO2, where it contributes to climate warming, DeAngelis explains. Among other benefits, soil carbon makes soil healthy by holding water and helping plants grow.

She and colleagues addressed these questions because, as they point out, "empirical evidence for the response of soil carbon cycling to the combined effects of warming, drought and diversity loss is scarce." To explore further, they experimentally manipulated microbial communities while varying factors such as community species composition, temperature and soil moisture. Details appear in Nature Communications.

In addition to first author Domeignoz-Horta and others at UMass Amherst, the team includes Serita Frey at the University of New Hampshire and Jerry Melillo at the Ecosystems Center, Woods Hole, Mass.

They point out that carbon in the soil is regulated in part by the rate and efficiency with which the microbes living there can use fresh plant foods and other parts of soil organic matter to grow. DeAngelis says some "soil carbon pools" can "stick around for decades and turn over very slowly. These are ones we really want to have because they help soil stay spongy to absorb water and help bind and release nutrients for plant growth."

"Diversity is interesting, not just in microbiology but in all organisms, including humans," DeAngelis says. "It's controlled by a lot of different factors, and it seems that more diverse systems tend to work more efficiently and to tolerate stress better. We wanted to understand the role of microbial diversity in soil carbon efficiency."

She adds, "Replicating diversity is tricky, which is why we used a model system soil. Luiz extracted microbes from soil, made serial dilutions of microbe concentrations in a buffer and inoculated the soil to get variation in diversity." They let the five different microbial mixes grow for 120 days. In addition to other tests, they used a new method based on a heavy, stable isotope of water known as 18O-H2O. It allowed them to trace the oxygen and track new growth over time in the different diversity, soil moisture and temperature conditions.

"One interesting thing we found is that we do see that more diverse communities are more efficient. The microbes grow more than in less diverse communities, but that increase in growth with diversity is lost when they are stressed for water. This suggests that there's a limit to the stress resilience with high diversity," she adds.

The authors point out, "Results indicate that the diversity by ecosystem-function relationship can be impaired under non-favorable conditions in soils, and that to understand changes in soil carbon cycling we need to account for the multiple facets of global changes."

DeAngelis adds, "We were a little surprised at how our approach worked so well. I'm really interested in the temperature/moisture efficiencies and Luiz is more interested in the diversity angle. It was a combination of the two that was the most interesting result."

Credit: 
University of Massachusetts Amherst