
New perspectives to treat neuropsychiatric diseases

Researchers at the Institute of Biology, Eötvös Loránd University (ELTE), Budapest, Hungary, studied the major types of neurons of the prefrontal cortex of the brain in an international collaboration. The research team has identified molecular differences in neurons that may support drug development for the treatment of psychiatric disorders such as schizophrenia or depression.

It has long been known that the prefrontal cortex in the frontal lobe is responsible for nervous system processes related to decision-making, planning, and other central constructive functions. Dysfunctions of this part of the brain play a role in several psychiatric illnesses and can also cause neuropsychiatric conditions such as autism, schizophrenia, or depression.

The proper functioning of the prefrontal cortex is largely based on the balanced communication between its two major types of neurons, the excitatory pyramidal cells and the inhibitory interneurons.

The researchers at the Institute of Biology, Faculty of Science, Eötvös Loránd University (ELTE), led by Gábor Juhász, James Eberwine (University of Pennsylvania) and Tamás Bártfai (Scripps Research), spent several years studying the messenger RNA (mRNA) sets of these two types of cortical neurons using so-called single-cell sequencing.

The mRNAs transmit genetic information from DNA to ribosomes, the sites of protein synthesis. Researchers at ELTE mapped these molecules in excitatory pyramidal cells and inhibitory interneurons. The essence of single-cell sequencing is that, by examining cells individually, it is possible to identify and quantify each cell's mRNAs and thus estimate the type and amount of proteins that the cell synthesizes at a given time.

"Single-cell mRNA sequencing has been used primarily for neuronal classification in the previous nervous systems studies, whereas in the present work, we have focused specifically on the mRNA set of the two, functionally clearly distinguishable neuronal cell types. We have looked especially for mRNAs encoding cell surface proteins. This was of particular importance, since cell surface proteins such as neurotransmitter receptors and ion channels play essential role in the adequate communication between neurons, and on the other hand, these proteins might be available for pharmaceutical drugs" said Gábor Juhász, one of the leaders of the research.

Out of the more than 19,000 different mouse mRNAs detected, the scientists looked for mRNAs that are preferentially expressed in one of the examined types of neurons and that encode proteins located on the surface of the cells. A large number of cell type-specific mRNAs were identified that were present in at least tenfold higher amounts in one of the cell types than in the other, so it can be assumed that the encoded cell surface proteins are also present in higher amounts. This is an important step towards an accurate picture of the cell surface proteins that could be used to treat certain psychiatric disorders by selectively influencing the function of the studied cell types. These findings will also help to better understand the mechanism of action of clinically used nervous system drugs.
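In practice, a screen like this reduces to a fold-change comparison of average expression between the two cell types. The sketch below illustrates the idea in Python with hypothetical file names and column labels; it is not the team's actual analysis pipeline.

```python
import pandas as pd

# Hypothetical per-cell mRNA count table: rows are genes, columns are individual cells.
counts = pd.read_csv("single_cell_counts.csv", index_col="gene")
# Hypothetical cell-type labels ("pyramidal" or "interneuron") for each cell.
cell_types = pd.read_csv("cell_labels.csv", index_col="cell")["type"]

pyramidal_cells = cell_types[cell_types == "pyramidal"].index
interneuron_cells = cell_types[cell_types == "interneuron"].index

# Mean expression per gene in each cell type; a small pseudocount avoids division by zero.
mean_pyr = counts[pyramidal_cells].mean(axis=1) + 1e-3
mean_int = counts[interneuron_cells].mean(axis=1) + 1e-3
fold_change = mean_pyr / mean_int

# Keep genes enriched at least tenfold in one cell type, mirroring the criterion above.
pyramidal_enriched = fold_change[fold_change >= 10].index
interneuron_enriched = fold_change[fold_change <= 0.1].index

print(len(pyramidal_enriched), "pyramidal-enriched mRNAs")
print(len(interneuron_enriched), "interneuron-enriched mRNAs")
```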

Credit: 
Eötvös Loránd University

NASA examines Hurricane Delta's early morning structure

image: The Waning Gibbous moon (65% illumination) was enough to see both the tropospheric waves as well as the central dense overcast and long-ranging feeder bands in Hurricane Delta on Oct. 8 at 4:05 a.m. EDT.

Image: 
UWM/SSEC/CIMSS/William Straka III

The NASA-NOAA Suomi NPP satellite provided two nighttime views of Hurricane Delta as it moved toward the U.S. Gulf Coast. A moonlit image and an infrared image revealed the extent and organization of the intensifying hurricane.

Satellite Imagery Shows Delta's Extent

On Oct. 8 at 4:05 a.m. EDT (0805 UTC), the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard NASA-NOAA's Suomi NPP satellite captured infrared and nighttime imagery of Hurricane Delta as it moved through the Gulf of Mexico.

One hour before Suomi NPP passed overhead, Delta had winds of 100 mph, making it a Category 2 storm on the Saffir-Simpson Hurricane Wind Scale. "The imagery showed enhanced convection near the center of circulation; though the actual circulation was covered by a central dense overcast (CDO) feature, the curved bands beyond that, extending all the way to the U.S., can also be seen," noted William Straka III, researcher at the University of Wisconsin-Madison's Space Science and Engineering Center, Cooperative Institute for Meteorological Satellite Studies, Wisconsin.

"The Waning Gibbous moon (65% illumination) was enough to see both the tropospheric waves as well as the CDO and long ranging feeder bands (of thunderstorms."

On Oct. 8 at 3:35 a.m. EDT (0735 UTC), NASA's Aqua satellite analyzed Delta using the Atmospheric Infrared Sounder or AIRS instrument. AIRS found cloud top temperatures as cold as or colder than 210 Kelvin (minus 81 degrees Fahrenheit, or minus 63.1 degrees Celsius). NASA research has shown that cloud top temperatures that cold indicate strong storms with the capability to create heavy rain. The eye was obscured by high clouds and the central dense overcast.
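For readers keeping track of the units, the Kelvin figure converts to Celsius and Fahrenheit in the usual way:

```latex
T_{\mathrm{C}} = T_{\mathrm{K}} - 273.15, \qquad T_{\mathrm{F}} = \tfrac{9}{5}\,T_{\mathrm{C}} + 32
```

so 210 Kelvin corresponds to about minus 63.1 degrees Celsius, or roughly minus 81.7 degrees Fahrenheit.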

NASA provides all of this data to tropical cyclone meteorologists so they can incorporate it in their forecasts.

NHC forecaster Jack Beven noted, "Satellite imagery shows that Delta is better organized this morning, with the center well embedded in a cold central dense overcast and a hint of an eye developing in the overcast."

Watches and Warnings on Oct. 8

NOAA's National Hurricane Center (NHC) has issued various warnings and watches for Delta's approach to the U.S. mainland.

A Storm Surge Warning is in effect for High Island, Texas to Ocean Springs, Mississippi including Calcasieu Lake, Vermilion Bay, Lake Pontchartrain, Lake Maurepas, and Lake Borgne.

A Hurricane Warning is in effect from High Island, Texas to Morgan City, Louisiana. A Tropical Storm Warning is in effect from west of High Island to San Luis Pass, Texas and from east of Morgan City, Louisiana to the mouth of the Pearl River, including New Orleans. A Hurricane Warning is also in effect for Lake Pontchartrain and Lake Maurepas.

A Tropical Storm Watch is in effect from east of the mouth of the Pearl River to Bay St. Louis, Mississippi.

Delta's Status on Oct. 8

At 11 a.m. EDT (1500 UTC), the center of Hurricane Delta was located near latitude 24.0 degrees north and longitude 92.7 degrees west. That is about 400 miles (645 km) south of Cameron, Louisiana. Delta was moving toward the northwest near 14 mph (22 kph), and this motion with a reduction in forward speed is expected today.  Reports from NOAA and Air Force Reserve Hurricane Hunter aircraft indicate that maximum sustained winds have increased to near 105 mph (165 kph) with higher gusts.  The latest minimum central pressure reported by the Hurricane Hunter aircraft is 968 millibars.

Delta's Forecast

NHC expects a turn toward the north by late tonight, followed by a north-northeastward motion by Friday afternoon or Friday night. Additional strengthening is forecast, and Delta is expected to become a major hurricane again by tonight. On the forecast track, the center of Delta will move over the central Gulf of Mexico today and over the northwestern Gulf of Mexico on Friday. Some weakening is possible as Delta approaches the northern Gulf coast on Friday, with rapid weakening expected after the center moves inland within the hurricane warning area Friday afternoon or Friday night.

NASA Researches Earth from Space

For more than five decades, NASA has used the vantage point of space to understand and explore our home planet, improve lives and safeguard our future. NASA brings together technology, science, and unique global Earth observations to provide societal benefits and strengthen our nation. Advancing knowledge of our home planet contributes directly to America's leadership in space and scientific exploration.

Credit: 
NASA/Goddard Space Flight Center

Survey shows broad bipartisan support for a stronger focus on science

ARLINGTON, VA -- A recent survey commissioned by Research!America on behalf of a working group formed to assess America's commitment to science shows overwhelming support for science across political parties. A strong majority of Americans agree that "the COVID-19 pandemic is a disruptive event and requires urgent refocusing of America's commitment to science." On a bipartisan basis, Americans:

Believe science benefits them (88%);

Would pay $1 more a week in taxes to support scientific research (66%);

Believe America should maintain its global leadership in science (89%);

View basic scientific research that advances the frontiers of knowledge as necessary and believe it should be supported by the federal government (77%);

Support incentives for private sector investment in science and technology (76%);

Express concern about the number of children without home internet access (64%); and

Agree the U.S. is at a critical point for committing to a major new initiative to assure health, security and prosperity for the nation (77%).

Science is seen as crucial to addressing urgent and important concerns such as economic growth, climate change, safe drinking water, ensuring the food supply and ending COVID-19 and other diseases. There was striking agreement across racial and ethnic groups.

Of concern is that those ages 18-29 appear to see science as less consequential to our nation's future. They are less likely to say they support a greater share of the U.S. GDP going to research and development (69% compared to 79% of all adults) and less likely to agree the U.S. should be a global leader in scientific research (74% compared to 89% of all adults). Surprisingly, this group also expresses less interest in federal incentives for STEM education (58% compared to 70% of all adults).

Among all adults, the bipartisan support for U.S. global leadership in science and for increasing the percentage of our GDP spent on science stands in sharp contrast to the disturbing slippage in our nation's global scientific leadership. According to a report released this month by the American Academy of Arts and Sciences and the Baker Institute for Public Policy, the U.S. has fallen to tenth place among OECD nations in the share of GDP invested in research and development.

Congress has begun to address our global standing in science and technology with legislation and in pointed reports. This includes the Endless Frontiers Act and the America LEADS Act both introduced in the Senate as well as two House reports released by the Foreign Affairs Committee and Select Committee on Intelligence.

"Americans are facing major societal challenges - health, environment, food, and energy - that are now existential, due in large part to decades of underinvestment in STEM and STEM education," said Keith Yamamoto, PhD, Vice Chancellor for Science Policy and Strategy at University of California, San Francisco, a co-chair of the working group. "The survey indicates that Americans are aware of the urgency of addressing these challenges, and that science is essential to succeed."

"The level of bipartisan public consensus in this survey shows that support for science is much more than an agreement; it's a mandate to elected officials to do more. It's time for a national refocus on science so we may address the issues top of mind for Americans and live up to our full potential as a science-strong nation," said Mary Woolley, Research!America President and CEO and a co-chair of the working group.

"All elected leaders should take note of the high expectations and enormous support for science held by the American public," said Sudip Parikh, PhD, CEO of the American Association for the Advancement of Science, and also a working group co-chair. "Now is the time to summon all our resolve to assure that America leads the way in solving the challenges facing us and the rest of the planet."

Additional survey findings include:

A strong majority across parties believes it is important for elected officials to listen to scientists (80%);

A strong majority also believes it is important for scientists to talk to elected officials (81%) and the public (82%);

66% of Americans say that climate change is impacting their own health, an increase of 10 percentage points from when asked the same question just eight months ago in January 2020;

A strong majority agrees that basic research funded by the federal government is important to private sector innovation (76%); and

Americans see a positive future for STEM fields, with 83% saying they would strongly or somewhat recommend that their child, family member, or other young person enter a STEM field.

The survey results can be viewed here: http://www.researchamerica.org/bipartisan-support

Credit: 
Research!America

High-capacity tape for the era of big data

video: Millimeter waves irradiate epsilon iron oxide, reversing its magnetic states representing binary states 1 or 0.

Image: 
© 2020 Ohkoshi et al.

It may seem odd to some that in the year 2020, magnetic tape is being discussed as a storage medium for digital data. After all, it has not been common in home computing since the 1980s. Surely the only relevant mediums today are solid state drives and Blu-ray discs? However, in data centers everywhere, at universities, banks, internet service providers or government offices, you will find that digital tapes are not only common, but essential.

Though they are slower to access than other storage devices, such as hard disk drives and solid state memory, digital tapes have very high storage densities. More information can be kept on a tape than on other devices of similar size, and tapes can be more cost-effective too. So for data-intensive applications such as archives, backups and anything covered by the broad term big data, they are extremely important. And as demand for these applications increases, so does the demand for high-capacity digital tapes.

Professor Shin-ichi Ohkoshi from the Department of Chemistry at the University of Tokyo and his team have developed a magnetic material which, together with a special process to access it, can offer greater storage densities than ever. The robust nature of the material means that the data would last for longer than with other mediums, and the novel process operates at low power. As an added bonus, this system would also be very cheap to run.

"Our new magnetic material is called epsilon iron oxide, it is particularly suitable for long-term digital storage," said Ohkoshi. "When data is written to it, the magnetic states that represent bits become resistant to external stray magnetic fields that might otherwise interfere with the data. We say it has a strong magnetic anisotropy. Of course, this feature also means that it is harder to write the data in the first place; however, we have a novel approach to that part of the process too."

The recording process relies on high-frequency millimeter waves in the region of 30-300 gigahertz, or billions of cycles per second. These high frequency waves are directed at strips of epsilon iron oxide, which is an excellent absorber of such waves. When an external magnetic field is applied, the epsilon iron oxide allows its magnetic direction, which represents either a binary 1 or 0, to flip in the presence of the high-frequency waves. Once the tape has passed by the recording head where this takes place, the data is then locked into the tape until it is overwritten.

"This is how we overcome what is called in the data science field 'the magnetic recording trilemma,'" said Project Assistant Professor Marie Yoshikiyo, from Ohkoshi's laboratory. "The trilemma describes how, to increase storage density, you need smaller magnetic particles, but the smaller particles come with greater instability and the data can easily be lost. So we had to use more stable magnetic materials and produce an entirely new way to write to them. What surprised me was that this process could also be power efficient too."

Epsilon iron oxide may also find uses beyond magnetic recording tape. The frequencies it absorbs well for recording purposes are also the frequencies that are intended for use in next-generation cellular communication technologies beyond 5G. So in the not too distant future when you are accessing a website on your 6G smartphone, both it and the data center behind the website may very well be making use of epsilon iron oxide.

"We knew early on that millimeter waves should theoretically be capable of flipping magnetic poles in epsilon iron oxide. But since it's a newly observed phenomenon, we had to try various methods before finding one that worked," said Ohkoshi. "Although the experiments were very difficult and challenging, the sight of the first successful signals was incredibly moving. I anticipate we will see magnetic tapes based on our new technology with 10 times the current capacities within five to 10 years."

Credit: 
University of Tokyo

Double jeopardy for ecologically rare birds and terrestrial mammals

Common assumptions notwithstanding, rare species can play unique and essential ecological roles. After studying two databases that together cover all known terrestrial mammals and birds worldwide, scientists from the CNRS, the Foundation for Biodiversity Research (FRB), Université Grenoble Alpes, and the University of Montpellier[1] have demonstrated that, though these species are found on all continents, they are more threatened by human pressures than ecologically common species and will also be more impacted by future climate change. Thus they are in double jeopardy. The researchers' findings, published in Nature Communications (October 8, 2020), show that conservation programmes must account for the ecological rarity of species.

It has long been thought that rare species contribute little to the functioning of ecosystems. Yet recent studies have discredited that idea: rarity is a matter not only of the abundance or geographical range of a species, but also of the distinctiveness of its ecological functions. Because these functionally distinct species are irreplaceable, it is essential we understand their ecological characteristics, map their distributions, and evaluate how vulnerable they are to current and future threats.

Using two databases that collect information on the world's terrestrial mammals (4,654 species) and birds (9,287 species), scientists from the FRB's Centre de Synthèse et d'Analyse de la Biodiversité (CESAB), CNRS research laboratories, Université Grenoble Alpes, the University of Montpellier, and partner institutes divided the earth's surface into 50 × 50 km squares and determined the number of ecologically rare species within each. They showed that ecological rarity among mammals is concentrated in the tropics and the southern hemisphere, with peaks on Indonesian islands, in Madagascar, and in Costa Rica. Species concerned are mostly nocturnal frugivores, like bats and lemurs, and insectivores, such as small rodents. Ecologically rare bird species are mainly found in tropical and subtropical mountainous regions, especially in New Guinea, Indonesia, the Andes, and Central America. The birds in question are essentially frugivorous or nectarivorous, hummingbirds being an example. For birds and terrestrial mammals alike, islands are hotspots of ecological rarity.

The researchers also ranked these species according to their IUCN Red List status[2] and found that they make up the bulk of the threatened categories: 71% of ecologically rare mammals are classed as threatened (versus 2% of ecologically common mammals), as are 44.2% of ecologically rare birds (versus 0.5% of ecologically common birds). For each species, they determined (i) anthropogenic pressure exerted; (ii) human development indexes (HDIs) of host countries; and (iii) exposure to armed conflicts. The last two of these elements shape conservation policies. The scientists observed that human activity had a greater impact on ecologically rare mammals and birds than on more common species, and that these rare species were found in countries of every kind of profile, irrespective of HDI or the prevalence of warfare.[3] They used models to demonstrate that ecologically rare species will be the greatest victims of climate change, many of them facing extinction within 40 years.
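Those within-group shares are straightforward to reproduce from a species table that pairs an ecological-rarity flag with an IUCN Red List category. The Python sketch below uses made-up file and column names purely for illustration; it is not the authors' code or data.

```python
import pandas as pd

# Hypothetical species table: one row per species, with a flag marking ecological rarity
# (functional distinctiveness plus restricted abundance/range) and an IUCN Red List category.
species = pd.read_csv("mammal_species.csv")

THREATENED = {"CR", "EN", "VU"}  # Critically Endangered, Endangered, Vulnerable
species["threatened"] = species["iucn_category"].isin(THREATENED)

# Percentage of threatened species within the ecologically rare and common groups,
# the comparison behind the 71% versus 2% figure quoted above.
share = species.groupby("ecologically_rare")["threatened"].mean() * 100
print(share.round(1))
```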

This profiling of ecologically rare species makes it clear that current conservation efforts, even in zones already protected, are insufficient. Conservation strategies still too often ignore functional distinctiveness and focus instead on population sizes. But it is essential to take this distinctiveness into account, letting this knowledge guide steps taken to protect these rare species. As they are necessary for healthy ecosystems, a true paradigm shift in conservation policy is needed to ensure their survival.

For more information, here are snapshots of ecologically rare species: https://www.cnrs.fr/sites/default/files/press_info/2020-10/Exemples_Especes_EcoRare_VANG.pdf

Credit: 
CNRS

Pollinator monitoring more than pays for itself

image: Mining bee (Andrena nitida) on dandelion

Image: 
Credit Nadine Mitschunas

Monitoring schemes to count bees and other pollinating insects provide excellent value for money, and could help save species and protect UK food security, researchers have found.

The study, led by the University of Reading and the UK Centre for Ecology & Hydrology, found that the costs of running nationwide monitoring schemes are more than 70 times lower than the value of pollination services to the UK economy, and that such schemes provide high-quality scientific data at a much lower cost than individual research projects.

Dr Tom Breeze, an ecological economist at the University of Reading, led the research. He said: "Pollinating insects are the unsung, unpaid heroes of British farming, but we know many species need help. Pollinators are vital for our food security and natural ecosystems but are under threat from many factors including habitat loss and climate change.

"Our analysis shows that large-scale and long-term pollinator monitoring schemes can be cost-effective and add tremendous value to food security and wider scientific research."

Monitoring insect pollinators, such as wild bees and hoverflies, is vital to understanding where and why they are declining in order to better target efforts to protect them. National pollinator strategies for England, Scotland, Wales and Ireland have highlighted monitoring as a priority for government action, but need to compete with other funding needs.

This new study, published in the Journal of Applied Ecology, has demonstrated that a well-designed monitoring scheme provides excellent value for money, compared with traditional research funding models.

Senior author Dr Claire Carvell of the UK Centre for Ecology & Hydrology, who is coordinator of the UK Pollinator Monitoring and Research Partnership, said: "Monitoring bees and other pollinators is challenging due to the sheer number of species involved, the difficulty in identifying them and the shortage of specialist skills needed.

"This research has been instrumental in helping design a world-leading Pollinator Monitoring Scheme (PoMS) for the UK which combines both professional and volunteer involvement, and generates data for research and policy that we can't really accomplish with standard research grants. And at a lower cost too."

Rebecca Pow MP, Minister for Pollinators at Defra, said: "This year, we have seen an increased appreciation for nature in England in response to the Coronavirus pandemic and the nation building back greener.

"The UK Pollinator Monitoring Scheme (PoMS) is an excellent way to inspire people to take action to protect our pollinators. Our pollinators may be small, but they play a key role in our ecosystem, and this scheme creates world-leading evidence helping us to better understand their status.

"It's tremendously positive to see the UK government and devolved administrations, research institutions and the public working together to understand these essential and precious creatures."

Researchers looked at the costs compared with the monetary benefits of monitoring schemes. Working with a range of scientists and experts in wildlife management, the team calculated how much it would cost to run various different types of monitoring scheme for 10 years. These ranged from schemes where all the work was handled by professional research staff, to schemes where members of the public work with scientists to collect the data.

The costs calculated ranged from £6,000/year for a volunteer scheme collecting counts of insects visiting flowers, to £300,000/year for a scheme involving both volunteers and professionals to collect and process the data, £900,000/year for a professional scheme using more intensive sampling to collect species-level pollinator data, and up to £2.7M/year for a professional scheme to monitor the pollination and yields of the UK's crops for shortfalls.

Using information from field research, the team explored the impact of pollinator losses on the yields of insect-pollinated crops grown in the UK. These include apples, berries, beans, oilseed rape and tomatoes. This potential loss of yield, given a 30% decline in pollinator numbers over 10 years, was estimated at over £188M per year, 70 times the cost of even the most expensive monitoring scheme to track and respond to declines.
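The headline "more than 70 times" figure follows directly from those numbers. A quick back-of-the-envelope check, using the scheme costs quoted above, looks like this:

```python
# Benefit-cost comparison from the figures quoted in the article (GBP per year).
scheme_costs = {
    "volunteer flower-insect counts": 6_000,
    "volunteer plus professional scheme": 300_000,
    "professional species-level scheme": 900_000,
    "professional crop pollination/yield scheme": 2_700_000,
}
yield_loss = 188_000_000  # estimated annual crop value at risk from a 30% pollinator decline

for scheme, cost in scheme_costs.items():
    print(f"{scheme}: value protected is ~{yield_loss / cost:,.0f} times the annual cost")
```

Even the most expensive scheme costs roughly 70 times less than the estimated annual loss it is designed to detect.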

The team also compared the monitoring scheme costs with the costs of carrying out a series of separate studies answering different questions. This was done by surveying pollinator researchers from across Europe about how they would design studies to answer eight key research questions, such as the impact of climate change and whether current farmland conservation measures are working.

The cost of implementing all these studies was compared with the costs of a pollinator monitoring scheme to show the value added by monitoring to UK science.

The study demonstrated that well-designed monitoring schemes provide excellent value for money, providing data to answer these eight big research questions for at least 33% less cost than traditional research funding models.

Volunteers already contribute to our understanding of the status of wildlife by recording species that they see and sharing this information. There are a number of established, professionally-coordinated monitoring schemes involving volunteers, for example the UK Butterfly Monitoring Scheme and the Breeding Bird Survey.

Additionally there are well-established wildlife recording schemes such as the Bees, Wasps and Ants Recording Society and the Hoverfly Recording Scheme. By combining the strengths of professionals and often highly-skilled volunteers, these citizen science approaches are invaluable in providing information at scales that would not otherwise be practical.

Professor Helen Roy of the UK Centre for Ecology & Hydrology said: "It is incredibly exciting to consider the benefits to people and nature achieved through the implementation of the Pollinator Monitoring Scheme. Volunteers can play a critical role in gathering much needed data from across Britain while increasing everyone's understanding of these important and much loved insects."

The authors conclude that pollinator monitoring is an excellent investment that supports UK food security and environmental science, and helps conserve iconic British species.

Credit: 
University of Reading

New algorithm sharpens focus of world's most powerful microscopes

image: A composite image of the enzyme lactase showing how cryo-EM's resolution has improved dramatically in recent years. Older images to the left, more recent to the right.

Image: 
Veronica Falconieri/National Cancer Institute

We've all seen that moment in a cop TV show where a detective is reviewing grainy, low-resolution security footage, spots a person of interest on the tape, and nonchalantly asks a CSI technician to "enhance that." A few keyboard clicks later, and voila - they've got a perfect, clear picture of the suspect's face. This, of course, does not work in the real world, as many film critics and people on the internet like to point out.

However, real-life scientists have recently developed a true "enhance" tool: one that improves the resolution and accuracy of powerful microscopes that are used to reveal insights into biology and medicine.

In a study published in Nature Methods, a multi-institutional team led by Tom Terwilliger from the New Mexico Consortium and including researchers from Lawrence Berkeley National Laboratory (Berkeley Lab) demonstrates how a new computer algorithm improves the quality of the 3D molecular structure maps generated with cryo-electron microscopy (cryo-EM).

For decades, these cryo-EM maps - generated by taking many microscopy images and applying image-processing software - have been a crucial tool for researchers seeking to learn how the molecules within animals, plants, microbes, and viruses function. And in recent years, cryo-EM technology has advanced to the point that it can produce structures with atomic-level resolution for many types of molecules. Yet in some situations, even the most sophisticated cryo-EM methods still generate maps with lower resolution and greater uncertainty than required to tease out the details of complex chemical reactions.

"In biology, we gain so much by knowing a molecule's structure," said study co-author Paul Adams, Director of the Molecular Biophysics & Integrated Bioimaging Division at Berkeley Lab. "The improvements we see with this algorithm will make it easier for researchers to determine atomistic structural models from electron cryo-microscopy data. This is particularly consequential for modeling very important biological molecules, such as those involved in transcribing and translating the genetic code, which are often only seen in lower-resolution maps due to their large and complex multi-unit structures."

The algorithm sharpens molecular maps by filtering the data based on existing knowledge of what molecules look like and how to best estimate and remove noise (unwanted and irrelevant data) in microscopy data. An approach with the same theoretical basis was previously used to improve structure maps generated from X-ray crystallography, and scientists have proposed its use in cryo-EM before. But, according to Adams, no one had been able to show definitive evidence that it worked for cryo-EM until now.

The team - composed of scientists from New Mexico Consortium, Los Alamos National Laboratory, Baylor College of Medicine, Cambridge University, and Berkeley Lab - first applied the algorithm to a publicly available map of the human protein apoferritin that is known to have 3.1-angstrom resolution (an angstrom is equal to a 10-billionth of a meter; for reference, the diameter of a carbon atom is estimated to be 2 angstroms). Then, they compared their enhanced version to another publicly available apoferritin reference map with 1.8-angstrom resolution, and found improved correlation between the two.
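Map-to-map correlation of this kind can be computed directly from two aligned density volumes. The following is a minimal sketch, assuming both maps sit on the same voxel grid and using the mrcfile library to read standard MRC-format cryo-EM maps; the file names are placeholders rather than the deposited EMDB entries, and this is not the Phenix implementation.

```python
import numpy as np
import mrcfile  # reader for MRC-format cryo-EM density maps


def map_correlation(path_a, path_b):
    """Pearson correlation between two aligned density maps on the same voxel grid."""
    with mrcfile.open(path_a) as a, mrcfile.open(path_b) as b:
        va = a.data.astype(np.float64).ravel()
        vb = b.data.astype(np.float64).ravel()
    va -= va.mean()
    vb -= vb.mean()
    return float(np.dot(va, vb) / (np.linalg.norm(va) * np.linalg.norm(vb)))


# e.g. the sharpened 3.1-angstrom apoferritin map against the 1.8-angstrom reference map
print(map_correlation("apoferritin_sharpened.mrc", "apoferritin_reference.mrc"))
```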

Next, the team used their approach on 104 map datasets from the Electron Microscopy Data Bank. For a large proportion of these map sets, the algorithm improved the correlation between the experimental map and the known atomic structure, and increased the visibility of details.

The authors note that the clear benefits of the algorithm in revealing important details in the data, combined with its ease of use - it is an automated analysis that can be performed on a laptop processor - will likely make it a standard part of the cryo-EM workflow moving forward. In fact, Adams has already added the algorithm's source code to the Phenix software suite, a popular package for automated macromolecular structure solution for which he leads the development team.

This research was part of Berkeley Lab's continued efforts to advance the capabilities of cryo-EM technology and to pioneer its use for basic science discoveries. Many of the breakthrough inventions that enabled the development of cryo-EM and later pushed it to its exceptional current resolution have involved Berkeley Lab scientists.

Credit: 
DOE/Lawrence Berkeley National Laboratory

Turning a hot spot into a cold spot: Fano-shaped local-field responses probed by a quantum dot

image: (a) Schematics of the QD-loaded nanoantenna excited by a polarization-controlled light beam. (b) Simulated spectral dispersions and spatial distributions of the local-field responses under x-polarized and y-polarized excitation. (c,d) Simulated spectral dispersions of local-field responses under elliptically polarized excitation. The spectra exhibit Fano lineshapes with tunable Fano asymmetry parameter q and nearly vanishing Fano dips. Local-field distributions show that at the Fano dips the hot spot at the nanogap can be turned into a cold spot.

Image: 
by Juan Xia, Jianwei Tang, Fanglin Bao, Yongcheng Sun, Maodong Fang, Guanjun Cao, Julian Evans, and Sailing He

Optical nanoantennas can convert propagating light to local fields. The local-field responses can be engineered to exhibit nontrivial features in spatial, spectral and temporal domains. Local-field interferences play a key role in the engineering of the local-field responses. By controlling the local-field interferences, researchers have demonstrated local-field responses with various spatial distributions, spectral dispersions and temporal dynamics. Different degrees of freedom of the excitation light have been used to control the local-field interferences, such as the polarization, the beam shape and beam position, and the incidence direction. Despite the remarkable progress, achieving fully controllable local-field interferences remains a major challenge. A fully controllable local-field interference should be controllable between a constructive interference and a complete destructive interference. This would bring unprecedented benefit for the engineering of the local-field responses.

In a new paper published in Light: Science & Applications, a team of scientists from China, led by Professor Sailing He from Zhejiang University and Professor Jianwei Tang from Huazhong University of Science and Technology, has experimentally demonstrated that, based on a fully controllable local-field interference designed in the nanogap of a nanoantenna, a local-field hot spot can be turned into a cold spot, and the spectral dispersion of the local-field response can exhibit dynamically tunable Fano lineshapes with nearly vanishing Fano dips. By simply controlling the excitation polarization, the Fano asymmetry parameter q can be tuned from negative to positive values, and correspondingly, the Fano dip can be tuned across a broad wavelength range. At the Fano dips, the local-field intensity is strongly suppressed by up to ~50-fold.
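For reference, the Fano lineshape mentioned here has the standard form, with q the asymmetry parameter and epsilon the reduced detuning from the resonance frequency:

```latex
I(\epsilon) \propto \frac{(q+\epsilon)^2}{1+\epsilon^2}, \qquad \epsilon = \frac{2(\omega-\omega_0)}{\Gamma}
```

The intensity vanishes at epsilon = -q, so tuning q from negative to positive values moves the Fano dip from one side of the resonance to the other, which is the behaviour the polarization control exploits.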

The nanoantenna is an asymmetric dimer of colloidal gold nanorods, with a nanogap between the nanorods. The local-field response in the nanogap has the following features: first, the local field can be excited by both orthogonal polarizations; second, the local-field polarization has a negligible dependence on the excitation polarization; third, the local-field response is resonant for one excitation polarization but nonresonant for the orthogonal one. The first two features make the local-field interferences fully controllable. The third feature further enables Fano-shaped local-field responses.

For experimental study of the local-field responses, it is crucial to probe the local fields at specified spatial and spectral positions. The scientists use a single quantum dot as a tiny sensor to probe the local-field spectrum in the nanogap of the nanoantenna. When the quantum dot is placed in the local field, it is excited by the local field, and its photoluminescence intensity can reveal the local-field response through comparison with its photoluminescence intensity when excited directly by the incident light.

A superb fabrication technique is needed to build such a tiny nanoantenna and place the tiny quantum dot sensor into the nanogap. The scientists use the sharp tip of an atomic force microscope (AFM) to do this job, pushing nanoparticles together on a glass substrate.

The scientists summarized the relevance of their work:

"Turning a local-field hot spot into a cold spot significantly expands the dynamic range for local-field engineering. The demonstrated low-background and dynamically tuneable Fano-shaped local-field responses can contribute as design elements to the toolbox for spatial, spectral and temporal local-field engineering."

"More importantly, the low background and high tunability of the Fano lineshapes indicate that local-field interferences can be made fully controllable. Since the local-field interferences play a key role in the spatial, spectral and temporal engineering of the local-field responses, this encouraging conclusion may further inspire diverse designs of local-field responses with novel spatial distributions, spectral dispersions and temporal dynamics, which may find application in nanoscopy, spectroscopy, nano-optical quantum control and nanolithography."

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

Comeback of drug-resistant neglected tropical disease tracked through genomic surveillance

Genome sequencing has shed light on the re-emergence of the bacterium that causes yaws, a neglected tropical disease of the skin, bones and joints. The re-emergence followed a mass drug administration (MDA) campaign that aimed to eliminate the disease in Papua New Guinea.

Researchers at the Wellcome Sanger Institute, the London School of Hygiene & Tropical Medicine, the Fight AIDS and Infectious Diseases Foundation, Spain, the University of Washington and the University of Papua New Guinea report their findings today in The Lancet Microbe. The results will influence the global elimination strategy for this disease.

Yaws, caused by the bacterium Treponema pallidum subspecies pertenue (TPP), can cause chronic disfigurement and disability. Most commonly affecting children, infection with the bacteria results in stigmatising and debilitating ulcers. Despite global efforts, yaws remains common in tropical areas in some of the world's poorest countries, affecting millions of people*. The World Health Organisation (WHO) is currently carrying out campaigns to eradicate yaws using Mass Drug Administration (MDA) of the antibiotic, azithromycin.

An MDA campaign on Lihir Island, Papua New Guinea, in 2013 reached 83 per cent of the population (15-18,000 people) and was initially successful, dramatically reducing the incidence of the disease**. But after two years, cases of the disease started increasing. Molecular testing showed the bacteria were of a single type; however, it was unclear if the re-emergence had a single source, or several. In addition, a small proportion of the bacteria were found to be resistant to azithromycin, the first time any such resistance had been seen.

In this new study, Sanger researchers sequenced the genomes of bacteria from 20 swab samples taken during the follow up of the MDA campaign in Lihir. The aim was to further understand the re-emergence following the MDA and inform future strategies.

Comparing the DNA sequences of the TPP bacteria, the team constructed phylogenetic 'family' trees to map their evolution. DNA sequences change over time at a roughly constant rate as organisms evolve, so it is possible to determine the relatedness of individual samples and relate that back to the patient and possible routes of infection. The team found that, rather than being due to a single source, missed case, or reintroduction, the re-emergence of cases after MDA was caused by at least three distinct TPP lineages. The most likely explanation is that these stemmed from latent infections in people without symptoms who did not receive the treatment. This has important implications for disease control, leading the researchers to recommend strategies that maximise MDA population coverage and so reduce the number of people who are missed by treatment and may harbour latent infections. They also recommend intensive post-MDA surveillance to detect "the last yaws cases" and to control onward transmission of new infections.
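As a rough illustration of the tree-building step, a distance-based phylogeny can be produced from aligned sequences with a few lines of Biopython. This is a generic neighbour-joining sketch with placeholder file names, not the whole-genome variant-calling and phylogenetics workflow the study actually used.

```python
from Bio import AlignIO, Phylo
from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

# Hypothetical multiple-sequence alignment of TPP samples (already aligned, FASTA format).
alignment = AlignIO.read("tpp_samples_aligned.fasta", "fasta")

# Pairwise distances from sequence identity, then a neighbour-joining tree.
distances = DistanceCalculator("identity").get_distance(alignment)
tree = DistanceTreeConstructor().nj(distances)

# Samples that cluster together share a recent common ancestor, hinting at transmission links.
Phylo.draw_ascii(tree)
```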

Of the three lineages linked to the re-emergence, one became resistant to azithromycin during the study. To study this antibiotic resistance, the team combined the genomic data with epidemiological data about the movement and interactions of people in the region. This showed that resistance to azithromycin can evolve and spread rapidly in TPP. The authors recommend careful monitoring following an MDA to enable the rapid detection of azithromycin resistance that could compromise MDA campaigns. They also suggest considering alternative treatments for cases detected post-MDA, particularly where diagnostic testing for resistance is unavailable, since resistance is subsequently more likely to be present in the bacterial population.

Dr Mathew Beale, first author of the study from the Wellcome Sanger Institute, said: "Even though an impressive 83 per cent of the population were reached with antibiotic treatment initially, there were several instances of re-emergence of the bacteria. The development of antibiotic resistance is worrying, but the fact that it only occurred once after a mass drug administration is positive news. We still need to be very concerned about resistance, but it may be possible to manage it. Our results have big implications for how yaws elimination campaigns are run - we can recommend high treatment coverage initially and there needs to be careful surveillance and follow up to detect and swiftly treat any re-emergence, to prevent the bacteria spreading."

Dr Michael Marks, senior author of the paper at the London School of Hygiene & Tropical Medicine, said: "Yaws is one of many neglected tropical diseases, which combined affect around one billion people - usually in low- and middle-income countries. They are often diseases of poverty, affecting rural communities without access to sanitation and basic healthcare. WHO has laid out a new roadmap for the control, elimination and, in the case of yaws, eradication of these conditions over the coming decade."***

Dr Oriol Mitjà, senior author of the paper at Lihir Medical Center, Papua New Guinea and the Fight AIDS and Infectious Diseases Foundation, said: "While yaws doesn't kill, it causes significant disease in thousands of children around the world. Yet it is easily treatable. Following up on treatment trials is vital - we need to know where and why the bacteria are re-emerging. The results from this study are influencing the next steps and approaches we take to eradicate this disease."

Professor Nicholas Thomson, senior author of the paper based at the Wellcome Sanger Institute and the London School of Hygiene & Tropical Medicine, said: "The resolution afforded by genome sequencing to track agents of infectious diseases is vital for our efforts to control and, eventually, to eliminate them. It can help us understand how and where bacteria or other pathogens are spreading between people, as well as how they respond genetically to our attempts to control them. Hence, monitoring for drug resistance is also essential to defeat these diseases. The information can help target interventions and treatments so they have the most effect."

Credit: 
Wellcome Trust Sanger Institute

The Lancet Planetary Health: Restricting supermarket promotions of high-sugar food and drinks reduces sales without reducing store profits

Restricting the promotion and merchandising of unhealthy foods and beverages leads to a reduction in their sales, presenting an opportunity to improve people's diets, according to a randomised controlled trial of 20 stores in remote regions of Australia published in The Lancet Planetary Health journal.

Merchandising in food retail outlets, such as price promotions, end-of-aisle displays and placement at eye level, can be effective in stimulating sales. This has been used to promote sales of healthier food, but rarely tested to discourage people from buying unhealthy food.

In the study, restricting merchandising of unhealthy products reduced the purchase of sugar-sweetened drinks and confectionery, resulting in a 2.8% reduction in the total amount of free sugar purchased from food and drinks [1], the equivalent of 1.8 tons of sugar across all of the intervention stores. Free sugars include added sugars in foods and drinks, plus the natural sugar in honey, syrups and fruit juices.

There were especially large reductions in free sugar purchased from soft drinks and confectionery. However, despite these reductions, business performance was unaffected.

Julie Brimblecombe, of Monash University, Australia, co-joint first author of the study, said: "Price promotions and marketing tactics, such as where products are placed on shelves, are frequently used to stimulate sales. Our novel study is the first to show that limiting these activities can also have an effect on sales, in particular, of unhealthy food and drinks. This strategy has important health implications and is an opportunity to improve diets and reduce associated non-communicable diseases. It also offers a way for supermarkets to position themselves as responsible retailers, which could potentially strengthen customer loyalty without damaging business performance." [2]

In the pragmatic randomised controlled trial, 20 stores in First Nations communities in remote regions of Australia were assigned either to an intervention group, which implemented the Healthy Stores 2020 strategy, or to a control group, which carried out usual retail practice. The strategy was tested for 12 weeks, and weekly sales figures were compared.

The Healthy Stores 2020 strategy asked shop owners to implement seven changes targeting high-sugar foods and beverages, including sugar-sweetened drinks, confectionery, sweet biscuits and table sugar. As well as restricting price promotions and reducing shelf space dedicated to targeted products, the changes included: no misleading promotional activity; removing end-of-aisle and counter displays of unhealthy food and drinks; reducing refrigerator space for targeted beverages and replacing them with healthier alternatives; moving large soft drinks (>600 ml) out of fridges to a different location; and more visible marketing, such as floor stickers promoting water as a healthy choice and indicating the amount of sugar in soft drinks. However, because of store directors' concerns about potential adverse impacts on business outcomes, 50% of the stores involved (those with food retail competition close by) did not have to remove bottles of soft drinks above 600 ml from their fridges.

Stores that implemented the restrictions reduced added sugars purchased in all food and drink by 2.8%. Additionally, the reduction in free sugar purchased from sugar sweetened beverages (including soft drinks, fruit juices and energy drinks) was 6.8% and was especially large for carbonated sugar sweetened (soft drinks) at 13.4%. The reduction in free sugar purchased from confectionery was 7.5%. [1]

The researchers say that the large overall reduction in soft drink sales was due to the removal of larger soft drink bottles from store fridges and customers choosing smaller soft drinks or lower-sugar alternatives, while the reduction in confectionery sales was due to limiting the amount of confectionery on the shelves and relocating it away from high-traffic areas. The study reports that the strategy resulted in a reduction in soft drink purchases equivalent to 440 ml per person per week in the communities that removed the larger-sized bottles from the fridges.

Sales of table sugar and sweet biscuits were not statistically significantly impacted. The researchers were not surprised by this result and suggest it is because these items are less likely to be displayed in prime locations within the stores and so less likely to be impulse purchases.

Emma McMahon, of Menzies School of Health Research, Australia and co-joint first author of the study, said: "We anticipated that the Healthy Stores strategy would work best on impulse purchases such as confectionery, whereas staple items such as table sugar are bought as a matter of routine. Sweet biscuits are more likely to be an impulse purchase, but they are less likely to be displayed in prime locations like confectionery and so, there is less opportunity for people to pick them up randomly. A different strategy for biscuits and items like table sugar should be explored to stimulate change in those buying behaviours." [2]

The Healthy Stores 2020 strategy was co-designed and conducted in partnership with the Arnhem Land Progress Aboriginal Corporation (ALPA).

Khia De Silva, of The Arnhem Land Progress Aboriginal Corporation (ALPA), said: "Improving the health and quality of life of our customers is a key part of ALPA's mission and has been since the 1980s. Our Board of Directors continue to push for bold new strategies that encourage healthy food and drink choices. Co-designing Healthy Stores 2020 with community leaders, remote retailers and researchers was the strength of this trial - this collaboration challenged each other to identify factors that would challenge or support each strategy's success. ALPA are proud to contribute to evidence-based research that can be used by other retailers." [2]

First Nations communities experience high rates of diet-related diseases, such as type 2 diabetes and cardiovascular disease, and have limited access to services. In some communities, however, they have sovereignty over their community stores, with the power to initiate change, and have shown leadership in supporting healthier food. In these communities, in-store strategies have previously shown promise in promoting sales of healthy food and drinks.

The researchers recognise that there may be factors unique to remote communities that contributed to the success of the Healthy Stores 2020 strategy. Households in remote Australia may not have a fridge, so shoppers would choose a cold drink from the store's fridge rather than an off-the-shelf bottle to refrigerate at home. Children are particularly susceptible to merchandising, and because children in First Nations communities are able to make in-store purchases from a young age, this may have produced a larger effect from the strategy than would be possible in non-remote supermarkets.

The researchers also note that remote stores may not have the same constraints on restricting merchandising as non-remote stores and supermarkets. They have, however, observed more promotional activity on unhealthy food in non-remote stores and supermarkets than in the remote stores. This means there may be greater room for improvement in non-remote stores and supermarkets, if they are bold enough to put such strategies in place.

While Healthy Stores 2020 is likely relevant to non-remote store settings, the researchers state that retailers in more competitive environments may not be prepared to make the restrictions in merchandising even though the Healthy Stores 2020 study showed no impact on business performance. They say Healthy Stores 2020 should give retailers and governments the confidence to put steps in place to restrict merchandising of unhealthy food and drinks for the health of populations. More studies examining long-term sales with strategies like Healthy Stores 2020 are needed.

Credit: 
The Lancet

Phosphorus deficit may disrupt regional food supply chains

Phosphorus is essential in agriculture to maintain high production levels, and it is applied as a fertilizer. Some world regions are experiencing high population growth rates, which means more phosphorus will be needed to produce the increasing amount of food required in the coming decades. A new study - Global phosphorus supply chain dynamics: Assessing regional impact to 2050 - published in the scientific journal Global Food Security and undertaken at Stockholm University, Sweden, the University of Iceland, and the Blekinge Institute of Technology, Sweden, shows that the world regions with high population growth rates are also the regions with the highest deficit in phosphorus supply. The study also quantifies the environmental impact of a business-as-usual scenario in the phosphorus supply chain to 2050 and identifies alarming rates of pollution and greenhouse gas emissions associated with the phosphorus supply.

Almost all of our phosphate fertilizers come from the mining and processing of phosphate rock (PR), and only a handful of countries produce and export this mineral. Losses along the phosphorus (P) supply chain have been estimated in the literature at around 80-90%. At the same time, global population growth is expected to push food demand up by more than 50% by 2050, particularly in Latin America and the Caribbean (LAC), South Asia (SA) and Sub-Saharan Africa (SSA). Despite being a vital resource in food production, P is also a key pollutant in water bodies, where it can cause eutrophication. Processing PR is also an energy-intensive process, which uses significant quantities of water and produces large quantities of phosphogypsum (PG), a toxic and radioactive byproduct.
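Those loss estimates matter because they multiply the amount of rock that must be mined for every unit of phosphorus that actually reaches food. As a simple illustration, taking the 80-90% loss figure at face value:

```latex
P_{\text{mined}} = \frac{P_{\text{in food}}}{1 - \text{loss fraction}}, \qquad \frac{1}{1-0.8} = 5, \qquad \frac{1}{1-0.9} = 10
```

so roughly 5 to 10 tonnes of phosphorus must enter the supply chain from phosphate rock for every tonne that ends up in food.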

"Most of the focus in the literature has been on the sufficiency of the global phosphorus reserves. However, demand for phosphorus is unequal across regions so it was important to assess which regions require more phosphorus and what will that mean in terms of food security. Another valuable contribution of this study is that we quantified the negative environmental and climate impacts of the phosphorus supply chain at global and regional level. Our results indicate yet again the necessity of closing the loop when it comes to phosphorus and on reducing its usage through more sustainable farming practices" says Claudiu Eduard Nedelciu, researcher at the Department of Physical Geography and main author of the study.

The study, which is part of a larger European research project, Adaptation to a new Economic Reality (adaptecon.com), found that LAC, SA and East and Southeast Asia (ESEA) will lead the increase in P consumption in the coming decades. Surprisingly, SSA did not account for a significant increase in the P requirement to 2050, despite having the highest population growth over the period. This is due to the historically low levels of fertilizer application in the region, but it poses serious questions about food security, as this part of the world is home to most of the world's undernourished people. All the regions leading the increase in P requirement were also highly dependent on phosphate imports and thus vulnerable to price spikes and supply disruptions.

"Hunger will increase in the parts of the world where phosphorous is lacking, unless actions are taken by governments and international institutions to secure imports", aid Prof. Kristin Vala Ragnasdottir from the University of Iceland, co-author in the study.

Perhaps the most striking results were related to the impact of the P supply chain on the environment and climate. The amount of P reaching water bodies will more than triple in North Africa and Western Asia and will double in South Asia and Latin America and the Caribbean. This trend will be driven by P runoff from agricultural land and it is an optimistic scenario, as it assumes that the 2030 wastewater treatment targets of the Sustainable Development Goals (SDGs) will be achieved in all world regions. At current P runoff rates and without ambitious prevention measures, more coastal areas and inland water bodies are likely to be subject to eutrophication.

"Not only the efficient use of Phosphorous in agriculture but wise management of Phosphorus resources along the supply chain, including environmental effects, will be major challenges for the coming decades" said Prof. Peter Schlyter of the Blekinge Technology Institute and co-author in the study.

The climate impact resulting from the mining and processing of PR will double by 2050 compared to 2000, while phosphogypsum production will reach 500 million tons per year if no technological improvements are made. Production of phosphogypsum raises serious questions with regard to its safe disposal and management, owing to its toxicity and radioactivity. On the other hand, phosphogypsum can be a rich source of P in the future, if technological advancements allow the safe recycling of phosphorus.

Credit: 
Stockholm University

RUDN University soil scientists: Green suburbs can be more harmful than city centers

image: A team of soil scientists from RUDN University confirmed that traditional approaches to urban soil pollution monitoring ignore actual risks for urban residents because they don't take into consideration the barrier function of the soil. The team used Moscow as an example to show that not only polluted downtown districts but also recreational parks and forest zones can pose a threat to people. This is due to the fact that the barrier functions of the soil are weaker in green suburbs, making it unable to withstand even the slightest pollution.

Image: 
RUDN University

A team of soil scientists from RUDN University confirmed that traditional approaches to urban soil pollution monitoring ignore actual risks for urban residents because they don't take into consideration the barrier function of the soil. The team used Moscow as an example to show that not only polluted downtown districts but also recreational parks and forest zones can pose a threat to people. This is due to the fact that the barrier functions of the soil are weaker in green suburbs, making it unable to withstand even the slightest pollution. The results of the study were published in the Journal of Environmental Quality.

Industrial soil pollution with heavy metals poses a threat to human health. From the soil, harmful substances get into the water, dust, and plants. The intensity of these processes depends on the properties of the soil, namely its organic content, acidity, and texture. For example, clay and loam soils act as a geochemical barrier: they retain harmful substances and don't let them spread. However, traditional approaches to ecological monitoring overlook this barrier role and assess risks based only on the concentration of contaminating agents. Using these methods, one can miss potentially harmful zones or overestimate the danger. A team of soil scientists from RUDN University developed the first map of Moscow that takes into account not only the level of heavy metal pollution but also the barrier function of the soils.

The experiment covered nine administrative districts of Moscow with a total area of over 1,000 km2. The main sources of contamination across this territory were industrial facilities and automobiles. The researchers took soil samples from 224 points in public spaces, residential areas, and industrial zones. The samples were dried and ground, and their texture, organic matter content, and acidity were measured. Based on these parameters, the team calculated the ability of each sample to retain pollutants. Then, using spectral analysis, the researchers measured the concentrations of heavy metals: nickel, cadmium, manganese, lead, copper, zinc, arsenic, and mercury. After that, individual concentration and retention maps were developed and integrated into one general map.

In over 30% of the samples, heavy metal concentrations exceeded the norms of the Russian Agency for Health and Consumer Rights (Rospotrebnadzor). For example, the maximum permissible amount of copper in 1 kg of soil is 33 to 132 mg (depending on soil type), and of arsenic, 2 to 10 mg; in some cases, these norms were exceeded 2 to 5 times. The most polluted soils were the ones taken from public places downtown. However, the loamy, alkaline soil typical of central Moscow has a high barrier activity index, which means it can retain the pollution. At the same time, more sandy and acidic topsoils have weaker barrier functions, and although the concentration of harmful substances in them is lower, they cannot withstand the pollution. The team concluded that in order to effectively assess the ecological situation in the city, one has to take into account both the level of pollution and the ability of the soil to stop it. Traditional monitoring approaches based solely on concentration levels do not give an adequate risk profile.

"We developed pollution level maps and the maps of soils as geochemical barriers, and they turned out not to match each other. In some cases, the ability of the soils to bind down heavy metals compensates for high pollution levels. On the other hand, in some green zones topsoils are unable to contain even the smallest amounts of pollutants. Our results show how important it is to see the bigger picture and take both factors into consideration," said Olga Romzaykina, a researcher at the Research Laboratory "Smart Technologies for Sustainable Urban Development under Global Change", RUDN University

Credit: 
RUDN University

Molecular swarm rearranges surface structures atom by atom

image: Much like a zipper, carbene molecules cooperate on a gold surface to join two rows of atoms into one row, resulting - step by step - in a new surface structure.

Image: 
Saeed Amirjalayer

The surface of metals plays a key role in many technologically relevant areas, such as catalysis, sensor technology and battery research. For example, the large-scale production of many chemical compounds takes place on metal surfaces, whose atomic structure determines if and how molecules react with one another. At the same time, the surface structure of a metal influences its electronic properties. This is particularly important for the efficiency of electronic components in batteries. Researchers worldwide are therefore working intensively on developing new kinds of methods to tailor the structure of metal surfaces at the atomic level.

A team of researchers at the University of Münster, consisting of physicists and chemists and led by Dr. Saeed Amirjalayer, has now developed a molecular tool which makes it possible to change the structure of a metal surface at the atomic level. Using computer simulations, it was possible to predict that individual molecules - so-called N-heterocyclic carbenes - restructure the surface in a manner similar to a zipper. During the process, at least two carbene molecules cooperate to rearrange the structure of the surface atom by atom. As part of the study, the researchers experimentally confirmed this "zipper-type" mechanism, in which the carbene molecules work together on the gold surface to join two rows of gold atoms into one. The results of the work have been published in the journal "Angewandte Chemie International Edition".

In earlier studies, the researchers from Münster had shown the high stability and mobility of carbene molecules on the gold surface. However, no specific change in the surface structure induced by the molecules had previously been demonstrated. In their latest study, the researchers proved for the first time that the structure of a gold surface is modified very precisely as a result of cooperation between the carbene molecules. "The carbene molecules behave like a molecular swarm - in other words, they work together as a group to change the long-range structure of the surface," Saeed Amirjalayer explains. "Based on the 'zipper' principle, the surface atoms are systematically rearranged, and, after this process, the molecules can be removed from the surface."

The new method makes it possible to develop materials with specific chemical and physical properties - entirely without macroscopic tools. "In industrial applications, macroscopic tools such as presses or rollers are often used," Amirjalayer continues. "In biology, these tasks are undertaken by certain molecules. Our work shows a promising class of synthesized molecules which uses a similar approach to modify the surface." The team of researchers hopes that their method will be used in the future to develop, for example, new types of electrodes or to optimize chemical reactions on surfaces.

Credit: 
University of Münster

Climate change could mean fewer sunny days for hot regions banking on solar power

image: Hot, arid regions may see greater fluctuations in sunlight as the climate changes, the researchers reported. They used satellite data and climate model outputs to evaluate the intermittency of solar radiation and the reliability of photovoltaic energy under future climate conditions. They found that arid areas (pink) were more likely to experience a decrease in average solar radiation -- and thus the reliability of solar power -- in January (top) and July (bottom).

Image: 
Image courtesy of Jun Yin, Nanjing University of Information Science and Technology

While solar power is a leading form of renewable energy, new research suggests that changes to regional climates brought on by global warming could make areas currently considered ideal for solar power production less viable in the future.

Princeton-based researchers recently published, in the journal Nature Communications, the first study to assess the day-to-day reliability of solar energy under climate change. The team used satellite data and climate models to project how sunlight reaching the ground would be affected as warmer global temperatures alter the dynamics and consistency of Earth's atmosphere.

Their study found that higher surface temperatures -- and the resulting increase in the amount of moisture, aerosols and particulates in the atmosphere -- may result in an overall decrease in solar radiation and an uptick in the number of cloudy days. Hot, arid regions such as the Middle East and the American Southwest -- considered among the highest potential producers of solar energy -- were most susceptible to greater fluctuations in sunlight, the researchers found.

"Our results could help in designing better solar power plants and optimizing storage while also avoiding the expansion of solar power capacity in areas where sunlight intermittency under future climate conditions may be too high to make solar reliable," said corresponding author Amilcare Porporato, Princeton's Thomas J. Wu '94 Professor of Civil and Environmental Engineering and the Princeton Environmental Institute (PEI). The research was supported by the Carbon Mitigation Initiative based in PEI.

"To use an academic metaphor, in terms of solar power, semiarid places are now like students who get an A nearly every day," Porporato said. "Now, climate change is disturbing the usual dynamics of the atmosphere and the regularity of the solar radiation reaching the planet's surface. We tried to quantify how much more often those A's could become B's, or even C's, as a result."

Existing research on how solar energy will fare in this irregular future has largely focused on average levels of sunlight, said first author Jun Yin, a researcher at Nanjing University of Information Science and Technology who worked on the paper at Princeton as a postdoctoral research associate with Porporato.

"The novelty of our approach was to point out that in some places there is going to be more uncertainty in day-to-day variability," Yin said. He and Porporato previously reported that climate models underestimate the cooling effect of the daily cloud cycle. They worked on the most recent paper with co-author Annalisa Molini, an associate professor of civil infrastructure and environmental engineering at Khalifa University in the United Arab Emirates.

The researchers' findings were based on probabilistic calculations similar to those used to determine the risk of flooding or drought. The reduced reliability of solar energy is related to the increased variability of atmospheric moisture and aerosols in some arid regions. Warmer air holds more moisture and is more turbulent, which favors the formation of clouds and keeps particles in suspension longer, Porporato said.

"Then there is the issue of soils drying, which may be even more important," Porporato said. As temperatures and atmospheric turbulence increase in arid regions such as the Middle East, dry soils potentially lead to greater amounts of dust and atmospheric aerosols that would diminish solar radiation. These trends are in fact already detectable in observations from climate-observation networks, Porporato said.

For the American Southwest, the researchers' findings were less consistent. Some models showed more solar radiation and lower intermittency in the future, while others showed less solar radiation and higher intermittency. These results illustrate the challenge of trying to predict the reliability of solar energy in an uncertain future, Yin said.

"We hope that policymakers and people in the energy industry can take advantage of this information to more efficiently design and manage photovoltaic facilities," Yin said.

"Our paper helps identify efficient solutions for different locations where intermittency could occur, but at an acceptable level," he said. "A variety of technologies such as power storage, or power-operation policies such as smart curtailment, load shaping or geographical dispersion, are promising solutions."

To follow up on their work, the researchers plan to examine climate persistency -- specifically, the number of consecutive sunny or cloudy days -- which is important for solar power. They also are exploring how clouds could affect the effectiveness of tree planting as a climate mitigation strategy. Trees absorb not only carbon dioxide but also solar energy, which would raise surface temperatures. A resulting increase in cloud coverage could change current estimates of how effective trees would be in reducing atmospheric carbon.
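Persistency of this kind can be summarized very simply; as a toy example (assuming only a boolean series marking cloudy days, not any data from the study), a short Python snippet could count the longest streak of consecutive cloudy days as follows.

def longest_cloudy_run(cloudy_days):
    """Length of the longest streak of consecutive cloudy days (True values)."""
    longest = current = 0
    for is_cloudy in cloudy_days:
        current = current + 1 if is_cloudy else 0
        longest = max(longest, current)
    return longest

# Example: a two-week stretch containing a four-day cloudy spell.
print(longest_cloudy_run([False, True, True, False, False, True, True, True, True,
                          False, False, True, False, False]))  # -> 4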

Credit: 
Princeton University

Mammals share gene pathways that allow zebrafish to grow new eyes

Working with fish, birds and mice, Johns Hopkins Medicine researchers report new evidence that some animals' natural capacity to regrow neurons is not missing, but is instead inactivated in mammals. Specifically, the researchers found that some genetic pathways that allow many fish and other cold-blooded animals to repair specialized eye neurons after injury remain present in mammals as well, but are turned off, blocking regeneration and healing.

A description of the study, published online by the journal Science on Oct. 1 (science.sciencemag.org/lookup/doi/10.1126/science.abb8598), offers a better understanding of how genes that control regeneration are conserved across species, as well as how they function. This may help scientists develop ways to grow cells that are lost due to hereditary blindness and other neurodegenerative diseases.

"Our research overall indicates that the potential for regeneration is there in mammals, including humans, but some evolutionary pressure has turned it off," says Seth Blackshaw, Ph.D., professor of neuroscience at the Johns Hopkins University School of Medicine. "In fact, regeneration seems to be the default status, and the loss of that ability happened at multiple points on the evolutionary tree," he says.

For the study, Blackshaw's team focused on supportive cells in the back of the eye. In zebrafish, a standard laboratory model whose genome has been well defined, these cells, known as Müller glia, respond to injury and repair the light-sensitive retina by growing new neurons, the signal-carrying cells of the central nervous system. In addition to regrowing eye tissue, zebrafish's regenerative abilities extend to other body parts, including fins, tails and some internal organs.

The retina is a good testing ground for mapping genetic activity, explains Blackshaw, because it contains structures common to other cells in the nervous system. In previous studies, moreover, scientists have found that the genetic networks in the retina are well conserved across species, so comparisons among fish, birds, mice and even humans are possible.

For the new experiments, the Johns Hopkins researchers created retinal injuries in zebrafish, chickens and mice. Then they used high-powered microscopes and a previously developed gene mapping tool to observe how the supportive Müller glia cells responded.

Blackshaw said the team was surprised to find, immediately after the injury, that the cells in each of the three species behaved the same way: They entered an "active state" characterized by the activation of specific genes, some of which control inflammation.

This active state, says Blackshaw, primarily helps to contain the injury and send signals to immune system cells to combat foreign invaders such as bacteria, or to clean up broken tissue.

Beyond that step, however, the species' responses diverged.

In zebrafish, active Müller glia began turning on a network of transcription factors that control which genes are 'on' and 'off.' In the current experiment, the NFI transcription factors activated genes that are linked to cell maturity, sending the Müller glia cells back in developmental time to a more primitive state, which then allows them to develop into many different cell types. The Müller glia then "differentiated" into new cells to replace the ones lost to injury.

In contrast, the research team saw that chickens with damaged retinas activate only some of the transcription factor 'gene control switches' that are turned on in zebrafish. Thus, chickens have much less capability to create new Müller glia and other neurons in the eye following injury.

Finally, the researchers looked at the injury response in mice. Mice share the vast majority of their DNA with humans, and their eyes are similar to human eyes. The researchers found that injured Müller glia in mice remained in the first "active" state for several days, much longer than the eight to 12 hours that zebrafish are in this state, and yet never acquired the ability to make new neurons.

Müller glia in all three species also express high levels of nuclear factor I (NFI) transcription factors, but rapidly turn them off following injury. In mice, however, the NFI genes are turned back on soon thereafter, and actively block the Müller glia from generating neurons.

The researchers found, to their surprise, they say, that the same genes that allowed the zebrafish cells to regenerate were "primed and ready to go" in the mouse eye, but that the "on" transcription factor was never activated. Instead, the NFI factors actively block the cells' regenerative potential.

Blackshaw suspects that animals with a higher potential to develop disease in brain and other neurological tissue may have lost this capability over evolutionary time to help protect and stabilize other brain cells. "For example, we know that certain viruses, bacteria and even parasites can infect the brain. It could be disastrous if infected brain cells were allowed to grow and spread the infection through the nervous system," says Blackshaw.

Now equipped with a more detailed map of the cellular response to neuronal injury and regrowth, scientists may be able to find a way to activate the regenerative capabilities hidden in human DNA, Blackshaw says.

Credit: 
Johns Hopkins Medicine